Note: You can find the current TDWI conference here!

CONFERENCE PROGRAM OF 2021

Please note:
This page displays only the English-language sessions of TDWI München digital. You can find all conference sessions, including the German-language ones, here.

The times given in the conference program of TDWI München digital correspond to Central European Time (CET).

By clicking "EVENT MERKEN" ("save event") within the lecture descriptions, you can put together your own schedule. You can view your schedule at any time using the icon in the upper right corner.

Complex Event Processing with Kafka Streams

Kafka and Kafka Streams have emerged as the de facto industry standard for scalable, high-volume, low-latency real-time data processing. Complex event processing is concerned with detecting patterns in raw events and transforming them into higher-level events that are more relevant to the business. This talk shows how to implement scalable, fault-tolerant, and declarative complex event processing based on Kafka Streams.

Target Audience: Data Engineers, Data Scientists, Data Architects
Prerequisites: Basic understanding of real time data processing and Apache Kafka.
Level: Advanced

Extended Abstract:

Kafka and Kafka Streams have emerged as the de facto open-source standard for real-time, low-latency, high-throughput data integration in a large variety of industries. Besides providing traditional operations such as grouping, aggregations, and table-table joins, Kafka Streams treats streams of data as first-class citizens and offers some unique features geared at real-time data wrangling, such as aggregation of data across different types of time windows, real-time data enrichment, and joins between streams of data. While both types of operations have emerged from the ambit of relational databases, the complex event processing community has focused its attention on declarative, pattern-based recognition of business-relevant events from lower-level, raw events. Examples of operations supported by complex event processing are sequences, negation, conjunction, disjunction, and repetition. This talk shows how sequences, negation, and regular expressions on event streams can be implemented in a scalable and fault-tolerant manner based upon the Kafka Streams DSL and the Kafka Streams Processor API. The provided examples are relevant to use cases such as detection of shoplifting, security systems, operations of production lines, and online shopping.
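To make the idea of sequence and negation patterns concrete, here is a minimal sketch of the detection logic for one of the use cases mentioned above (shoplifting: a "pickup" followed by an "exit" with no "checkout" in between). This is plain Python, not the speaker's implementation; in Kafka Streams, the per-customer state below would live in a fault-tolerant state store managed by the Processor API, and the event types and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Event:
    customer: str
    kind: str  # "pickup", "checkout", or "exit" (hypothetical event types)

def detect_shoplifting(events):
    """CEP pattern: sequence(pickup, exit) with negation of checkout in between.

    The dict of per-customer state stands in for a Kafka Streams state store
    keyed by customer; each alert is a derived, higher-level business event.
    """
    picked_up = {}  # customer -> True once a pickup has been observed
    alerts = []
    for e in events:
        if e.kind == "pickup":
            picked_up[e.customer] = True
        elif e.kind == "checkout":
            # negation: a checkout between pickup and exit cancels the pattern
            picked_up.pop(e.customer, None)
        elif e.kind == "exit":
            if picked_up.pop(e.customer, None):
                alerts.append(e.customer)  # emit higher-level event
    return alerts

stream = [
    Event("alice", "pickup"),
    Event("bob", "pickup"),
    Event("alice", "checkout"),
    Event("alice", "exit"),
    Event("bob", "exit"),
]
print(detect_shoplifting(stream))  # -> ['bob']
```

In a real Kafka Streams topology, the stream would be repartitioned by customer so that all events for one customer reach the same task, making the stateful pattern matching both scalable and fault-tolerant.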

Benedikt is a Senior Solutions Architect at Confluent and has always been passionate about data and algorithms. He holds a Diploma and a PhD in Computer Science from the University of Munich, where he fell in love with rule-based data integration, and he has authored many papers and given talks at international conferences. At Confluent, Benedikt has turned his attention to distributed real-time data processing with Apache Kafka and Kafka Streams. Over the last few years, he has advised a large number of high-profile customers on their event streaming journey. In his spare time, Benedikt loves to spend time with his family and to try out any kind of new sport.
Benedikt Linse
14:50 - 15:30
Talk: Di 1.4
