The times given in the conference program of TDWI München digital correspond to Central European Time (CET).
By clicking on "EVENT MERKEN" within the lecture descriptions you can arrange your own schedule. You can view your schedule at any time using the icon in the upper right corner.
In distributed data mesh architectures, the collection, enrichment, and distribution of metadata plays a central role in enabling fluid and secure work on and with data products across the organization and beyond. This talk gives insights into Zalando's Metadata Mesh and shows how metadata makes it possible to work with data efficiently across cloud providers and technologies.
Target Audience: Data Engineers, Principals, Architects, Team Leads, Executives
Prerequisites: Basic knowledge of data architectures
Level: Advanced
Extended Abstract:
In heterogeneous data mesh architectures, vast amounts of metadata accumulate: information about schemas, formats, descriptions, ownership, usage, quality, or classifications is produced decentrally and must be processed and passed on to enable fluid and secure work with data. A major focus here is the decentralized metadata platform, which integrates and provides metadata across technologies and organizations. Not only distributed data catalogs but also various tools, e.g. for data quality measurement, must be connected to it. The talk gives insights into Zalando's current and future data landscape.
For the past five years, Sebastian Herold has been driving the evolution of Zalando's data landscape into an efficient, scalable, and secure data mesh of data products, working closely with users and cloud providers.
Before that, Sebastian spent seven years building the data platform of Immobilienscout24.
The definition of a well-designed, stable & scalable data architecture has changed due to the increasing complexity & volume of a modern system landscape. To keep up with the latest technology, this session will provide an overview of the components of a modern data architecture. By outlining the benefits of a well-designed & structured data architecture, this session will provide clear guidelines for the current data management challenges in the area of technology, processes & organization.
Target Audience: Decision Maker, CTO, CIO, Head of BI, Data Engineer, IT
Prerequisites: Basic knowledge in technical architecture & data management
Level: Basic
Extended Abstract:
The goal of a modern data architecture is to provide organizations with the ability to effectively manage and utilize their data in a manner that supports their business goals and objectives. This session will focus on two emerging modern data architectures: Data Mesh & Data Fabric.
Data Mesh is a decentralized approach to data management that focuses on empowering teams to own and manage their own data. In a data mesh architecture, data is managed by a set of microservices, each of which is responsible for a specific set of data. This approach allows teams to work with their data in a more flexible and autonomous manner, while still maintaining a high level of data governance and security.
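The decentralized-ownership idea described above can be sketched in a few lines: each team registers and serves its own data product, while a thin central layer enforces a single governance rule. This is a toy illustration under assumed names, not a reference implementation of data mesh.

```python
# Toy sketch: team-owned data products plus one central governance check.
class DataProduct:
    def __init__(self, name, owner, rows):
        self.name, self.owner, self.rows = name, owner, rows

    def query(self):
        # The owning team decides how its data is served; here, a plain copy.
        return list(self.rows)

class MeshRegistry:
    def __init__(self):
        self._products = {}

    def register(self, product):
        # Central governance: every product must declare an owner.
        if not product.owner:
            raise ValueError("governance: every data product needs an owner")
        self._products[product.name] = product

    def discover(self, name):
        return self._products[name]

mesh = MeshRegistry()
mesh.register(DataProduct("sales", owner="team-commerce",
                          rows=[{"sku": "A1", "qty": 3}]))
print(mesh.discover("sales").query())  # prints [{'sku': 'A1', 'qty': 3}]
```

The split mirrors the text: autonomy lives in `DataProduct` (each team's own serving logic), while governance and discoverability live in the shared registry.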
Data Fabric refers to a unified data architecture that enables data to be accessible and usable across different systems, applications, and teams. It is a flexible and scalable solution that helps to overcome data silos and enables a single view of data across an organization. A data fabric is designed to adapt to changing data needs over time and to provide a seamless solution for data management and processing.
In my current position as the Head of Data Intelligence at Camelot, I serve as a Managing Consultant and Architect, guiding clients in the selection of optimal data management tools for their modern data architectures. My focus extends to realizing the principles of data mesh across organizational, process, and technological dimensions. I am committed to delivering comprehensive solutions and ensuring that organizations harness the full potential of their data assets. I have cultivated expertise in data integration platforms, including SAP Datasphere, Informatica Intelligent Data Management Cloud (IDMC), and Denodo.
Data mesh is an outstanding approach to building and managing modern data management solutions, but it is hard to implement in practice without implementation patterns. Data mesh is certainly an organizational approach at its core, but experience has shown that democratization also weakens the technical governance of such solutions. In this talk, we will show how implementation patterns can keep a data mesh under control without limiting the flexibility of the individual data products.
Target Audience: Data Engineer, Project Leader, Decision Makers, CDO, Product Owner
Prerequisites: Basic knowledge, experience in DWH, Data Lake, Lakehouse concepts
Level: Basic
Gregor Zeiler has been working in various functions in business intelligence consulting for thirty years. In the course of his professional activities, he has gained extensive project experience in many industries and on the basis of a wide range of technologies. Numerous publications and lectures accompany his professional activities. As CEO at biGENIUS AG, he pursues his passion for optimizing processes in the development of data analytics solutions.
Daniel Zimmermann is Product Owner for the Generator part of biGENIUS. He started his career as an ERP Consultant and Developer and, after 10 years, switched to the Business Intelligence field, where he has worked for the past 15 years. In his current role, he uses his expertise from multiple BI projects across different industries to design and program the blueprints that are used to automate the creation of Data Warehouse solutions with a metadata-driven approach.
At SWICA we started our movement towards data mesh in 2021. At TDWI 2022 we presented our latest platform, which basically consists of resources and patterns to hold and manage decentralized compute and storage services for our data. In this year's session we are prepared to show working platform services like discovery, publishing, and compliance, demonstrate our user interfaces, and recap best practices from the implementation phase.
Target Audience: Platform Engineers, Data Engineers, Data Analysts, Project Leaders, Decision Makers
Prerequisites: Basic understanding of the data mesh concept, data warehouse architectures and the challenges of diverse analytical use cases from multiple lines of business
Level: Expert
Extended Abstract:
Along with 'Version 1' we realized that we needed to refactor some components of our mesh. We will share our lessons learned and best practices with you and hope to answer a couple of questions that you might be asking — or haven't thought of yet — e.g., orchestration, provisioning, CI/CD, and much more.
SWICA
2019 - 2021: BUCHI LABORTECHNIK AG
2018 - 2019: Informatec Ltd.
2008 - 2018: Frenzel GmbH
15 years of BI industry experience as a project manager, analyst, team lead and solution architect. Closely following new concepts and technologies, aiming for practical application in the enterprise world.
Data mesh is taking the data world by storm, even though the pioneering architecture concept is not yet practicable in its pure form for most IT departments. At various stages of decentralization, however, the approach can already contribute to significantly more efficient data use in companies today.
Using real-world examples, Jens Kröhnert presents the most important decision criteria for different data mesh variants, starting from full, governed, and hybrid variants.
Target Audience: Data Engineers, Data Architects, Project Leaders, Decision Makers
Prerequisites: Basic knowledge
Level: Basic
Jens Kröhnert has many years of experience in planning and implementing digitalization projects. As an innovation expert, he always keeps an eye on the latest technologies and developments for ORAYLIS.
For large organizations, the challenges of enterprise-level data management are manifold: leftovers of various integration/decentralization exercises and sourcing initiatives, system components of all kinds (DWHs, ODSs, data lakes, and mixed forms), and a complex mesh of central and decentralized governance and ownership. Data Mesh and Data Platform promise to solve problems of complexity, heterogeneity, and efficiency. Supported by the Data Management & Analytics Community (DMAC), we studied the status quo and plans in ten large companies.
Target Audience: CDO/CIO, Senior Analytics Management, Senior Data Management (especially in larger organizations)
Prerequisites: none (general awareness of enterprise-level, cross-solution data management challenges)
Level: Advanced
Robert Winter, University of St. Gallen (HSG), Switzerland, is a full professor of business & information systems engineering and director of HSG's Institute of Information Management. He is also founding director of HSG's Executive MBA program in Business Engineering. His main research interests are design science research methodology and enterprise-level IS management topics such as architectural coordination, governance of digital platforms, governance of enterprise transformation, and enterprise-level data management. For more than ten years, Robert Winter and Stephan Aier have organized HSG's Data Management & Analytics Community (DMAC), a cooperation format for large companies to discuss good practices of enterprise-level data and analytics management.