Launched in 2022, the Current conference hosted by Confluent has established itself as one of the leading conferences in the data streaming space. Originally an evolution of Kafka Summit, it has broadened in scope to reflect Confluent’s wider product portfolio, most notably Apache Flink.
The Decodable team was there in force this year, with several talks and a bustling booth.
The conference began for me with what is now a regular tradition at Kafka Summit/Current—the 5k run. It’s something that was inspired by a similar community effort at Oracle OpenWorld several years ago, and it’s a fun way to begin the day. Fun, that is, if you don’t mind running in the pitch black in 25°C at 90% humidity … 🥵
After a much-needed shower it was onto the more serious matters of the day—beginning with the opening keynote. You can see a full recording of it online (along with day two’s keynote here).
Jay Kreps opened proceedings with a somewhat familiar refrain on where the data streaming platform sits (at the heart of a company’s architecture, of course!), and evolved the idea to embrace what else but AI.
He made the case that a data streaming platform was the natural foundation on which to build AI, and specifically called out the lakehouse as being unsuitable as a foundation.
The answer to these problems of a lakehouse? I wonder if you can guess…
By taking a “shift left” approach you solve <span style="text-decoration: line-through">world hunger</span> the problems of a lakehouse by using a data streaming platform on which to build your AI systems instead.
Unlike Kafka Summit London in March, at which Confluent announced Tableflow, there weren’t any particularly big product announcements at Current that caught my eye. Some things, such as client-side field-level encryption and Flink support going GA in the upcoming release of Confluent Platform, are important, but they’re not headline-grabbing. The most interesting product news for me was what we already knew from the announcement a week before Current: Confluent acquiring WarpStream to plug a BYOC gap in Confluent’s offering. Jay Kreps talked about this in the keynote, and there was some interesting technical content from WarpStream’s Richard Artoul.
Decodable at Current 2024
The expo hall was a lively place with food trucks, meetup hubs—and exhibitors! There were the usual faces, some missing—and several new ones. At the Decodable booth we had a bunch of interest in our managed Flink offering and in particular the new lineage feature.
Our swag went down a storm too, in particular the cool stickers :)
Our speakers were kept busy delivering four talks, covering topics including Flink UDFs, time in Flink, data contracts, and troubleshooting Flink SQL.
You can find the talks online here, with recordings to follow:
- Data Contracts In Practice With Debezium and Apache Flink (Gunnar Morling)
- The Joy of JARs (and Other Flink SQL Troubleshooting Tales) (Robin Moffatt)
- So you want to write a User-Defined Function for Flink? (Hans-Peter Grahsl)
- Timing is Everything: Understanding Event-Time Processing in Flink SQL (Sharon Xie)
So speaking of talks…
Conferences are not just about keynotes and swag—the reason many of us are there is to share knowledge by speaking and listening, and there were some really good talks at Current this year. Now obviously the Decodable ones were excellent 😁 but here are just a few of the other stand-out sessions that we attended:
The slides and recordings are put online by Confluent after the event so keep an eye on their website.
- Elijah Meeks gave an excellent talk on Visualization in Motion: How to Create Effective Data Visualization with Real-Time Data. It was both interesting as a subject—and something I’d love to see more discussion of at conferences like Current—and expertly delivered in a really engaging manner
- Sanath Shetty and Ashish Vijaywargiya from JPMorgan talked about Modernizing Systems with Kafka Connect. Connect is a technology close to my heart, and I really enjoyed the real-world validation of its success at scale in a large organization. They run over 1000 connector instances and use Connect to migrate from old Sybase systems: replicating to SQL Server, capturing changes with Debezium on Kafka Connect, and writing to more modern RDBMSs using the Kafka Connect JDBC Sink.
- Joseph Thaidigsman and Thomas Thornton shared their experiences from adopting Debezium and Kafka Connect at Slack. With 1400 Connect tasks running the Debezium connector for Vitess, they operate at an impressive scale, using Debezium as part of the real-time data streaming pipeline that keeps their data lake up to date. It’s great to see adoption stories like this, and you should definitely check out the recording of this talk once it is available.
- Leonard Xu, Apache Flink PMC Member and Flink CDC Lead from Alibaba Cloud, provided a deep dive into the fundamental design and implementation of Flink CDC. He discussed many important aspects of building real-time data pipelines, including the challenging problem of schema evolution: synchronizing upstream schema changes with downstream systems.
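As an aside, for anyone curious what a change-data-capture-to-RDBMS pipeline like the ones described above looks like in practice on the sink side: the speakers didn’t share their exact configurations, but a minimal Kafka Connect JDBC Sink setup might look something like this. The topic, database, and connector names here are purely hypothetical placeholders.

```json
{
  "name": "jdbc-sink-orders",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "sqlserver.dbo.orders",
    "connection.url": "jdbc:postgresql://target-db:5432/orders",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "delete.enabled": "true",
    "auto.create": "true"
  }
}
```

With `insert.mode` set to `upsert` and the primary key taken from the record key, the sink can apply Debezium-style create/update/delete events idempotently to the target table, which is what makes this pattern attractive for database migrations.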
Current in 2025
Current 2025 will be in Bangalore (March), London (May 20-21), and New Orleans (October 29-30).