adesso Blog
15.04.2024 By Patrick Kübler
Why our customers start with production reporting when it comes to IIoT
Every day, employees struggle with manual reporting processes that lead to high personnel costs, limited scope for process optimisation and quality deficiencies. Despite the crucial importance of KPIs for management, manual reporting processes are still widespread in production. In this blog post, I explain why companies in the IIoT sector are starting with production reporting.
Read more
06.03.2024 By Stefan Klempnauer
Jsonnet as an accelerator for metadata-driven data pipelines
Metadata-driven data pipelines are a game changer for data processing in companies. These pipelines use metadata to dynamically update processes instead of manually revising each step every time a data source changes. As with data pipelines, metadata maintenance can be a bottleneck in the maintenance and further development of a pipeline framework. In this blog post, I use practical examples to show how the Jsonnet template language makes it easier to maintain metadata.
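The core idea the post describes can be sketched briefly: a single template expands per-source metadata into full pipeline configurations, so adding a data source means adding one metadata entry rather than hand-writing a new pipeline step. The post uses Jsonnet for this; the sketch below illustrates the same templating idea in plain Python, and all field names (`name`, `path`, `format`, `target_table`) are invented for illustration.

```python
def pipeline_config(source: dict) -> dict:
    """Expand one metadata entry into a complete pipeline step config."""
    return {
        "name": f"load_{source['name']}",
        "input": source["path"],
        # Default to CSV when the metadata does not specify a format.
        "format": source.get("format", "csv"),
        "target_table": f"staging.{source['name']}",
    }

# The metadata is the single source of truth; configs are generated from it.
sources = [
    {"name": "orders", "path": "/data/orders", "format": "parquet"},
    {"name": "customers", "path": "/data/customers"},
]

configs = [pipeline_config(s) for s in sources]
```

Jsonnet applies the same principle declaratively: shared defaults and per-source overrides live in one place, and the full configuration is rendered from them.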
Read more
22.02.2024 By Azza Baatout and Marc Mezger
Prefect - Workflow orchestration for AI and data engineering projects
Workflow orchestration and workflow engines are crucial components in modern data processing and software development, especially in the field of artificial intelligence (AI). These technologies make it possible to efficiently manage and coordinate various tasks and processes within complex data pipelines. In this blog post, we present Prefect, an intuitive tool for orchestrating workflows in AI development.
Read more
12.02.2024 By Siver Rajab
Snowflake: the advanced data management solution
In the ever-evolving world of data analytics and data management, Snowflake plays a prominent role in shaping the industry. This blog post looks at the development of Snowflake and why it is considered a ground-breaking solution for businesses.
Read more
26.12.2023 By Mykola Zubok
Data Evolution Chronicles: Tracing the Path from Warehouse to Lakehouse Architecture - Part 1
Before the introduction of computers, companies used account books, inventory lists and intuition to record key figures. The data warehouse for static reports emerged at the end of the 1980s. Digitalisation brought new challenges: big data overwhelmed traditional data warehouses, so companies introduced data lakes, which in turn reshaped the architecture and concepts of analytics systems. The first part of this blog post covers their development, the reasons for their emergence and the problems they solved.
Read more
20.12.2023 By Mykola Zubok
Data Evolution Chronicles: Tracing the Path from Warehouse to Lakehouse Architecture - Part 2
In Part 1 of the blog post, we looked at the basics of data warehouses and data lakes to understand their benefits and limitations. This part is about the evolution to a hybrid solution - the data lakehouse. From the two-tier architecture to the emergence of the lakehouse concept, this part takes us through the evolution of data structures. Learn how the lakehouse combines the strengths of data lakes and warehouses to address today's data-driven challenges.
Read more
21.11.2023 By Sebastian Dienst
Data governance – more interesting than you might think
In an increasingly digitalised world, the systematic collection, interpretation and use of data is becoming a factor in success. If a company fails to do this, it will lose a key tool needed to stay competitive and innovate in the age of digitalisation. In my blog post, I explain why data governance is important for companies and how adesso can support them in this area.
Read more
04.09.2023 By Mike Deecke
The ingenious data lake
A data lake, also known as a data platform, is an ingenious and highly effective solution for addressing the complex challenges that come with storing, managing and analysing vast quantities of data. It provides an advanced infrastructure that enables organisations to store data in its native form and be flexible with how they process it later. In my blog post, I will go through the advantages it brings to the table.
Read more
17.08.2023 By Mykola Zubok
Open table formats in the modern data analytics landscape
Apache Hudi, Iceberg and Delta Lake have proven to be powerful tools that are revolutionising the way modern data platforms handle data management and analysis. In my blog post, I will discuss the core features of these open table formats and highlight their respective strengths and applications.
Read more