

What we’re about
New to Data Streaming? Confluent Developer is full of free resources to put you on the right path!
Get involved with an upcoming event. First-time speakers are welcome!
Interested in speaking or providing a venue? See our rules of engagement, then get in touch ([community@confluent.io](mailto:community@confluent.io)) or, to become a local champion for this group, see our Meetup in a Box Program.
This meetup is for anyone interested in Data Streaming. You can be totally new to the space or an accomplished Data Streaming Engineer - you are welcome regardless! At Data Streaming events, talks will relate to Confluent, Apache Kafka®, Apache Flink®, stream processing, security, microservices, cloud topics, Kafka Streams, Apache Iceberg®, AI-related topics, and anything else adjacent to the world of data streaming!
This group is part of the Confluent Community: Find all our programs and spaces here
- Join over 45k members in our online Slack community
- See upcoming meetups in-person and online and find recordings of past events!
- Meet our Community Catalysts (MVPs), nominate someone ideal or find out how to become one yourself!
- If you’re here, you should probably know about Current, the premier data streaming conference; find out where it is going next!
Important:
Our group goal is to provide the opportunity for participants to learn, communicate, contribute and collaborate. Confluent’s Community Code of Conduct governs how we all participate and behave.
Apache Kafka, Kafka®, Apache Iceberg, Iceberg®, Apache Flink, Flink®, Apache®, the squirrel logo, and the Apache feather logo are either registered trademarks or trademarks of The Apache Software Foundation.
Upcoming events
IN-PERSON: Apache Kafka® x Apache Flink® Meetup
Autodesk, 1 Windmill Lane, 2nd Floor, D02 F206, Dublin, Ireland
Join us for an Apache Kafka® x Apache Flink® meetup on Wednesday, October 1st from 6:00pm in Dublin, hosted by Autodesk!
📍Venue:
Autodesk
1 Windmill Lane, 2nd Floor, Dublin D02 F206
NOTE: If you RSVP here, you don't also need to RSVP at Dublin Apache Kafka®.
🗓 Agenda:
- 6:00pm: Doors open
- 6:00pm – 6:30pm: Food, Drinks & Networking
- 6:30pm - 7:15pm: Nitish Tiwari, Founder & CEO, Parseable
- 7:15pm - 8:00pm: John Watson, Principal Engineer, Autodesk & Petr Novicky, Principal Engineer, Autodesk
- 8:00pm - 9:00pm: Additional Q&A and Networking
💡 First Speaker:
Nitish Tiwari, Founder & CEO, Parseable
Title of Talk:
Making sense of Kafka metrics with Agentic design
Abstract:
In this talk we will look at how to track and export Kafka metrics from a production Kafka cluster to an observability system like Parseable. We'll then deep-dive into the metrics data, its implications, and more. Finally, we'll look at an LLM-based, agentic-style workflow to see how to predict metrics data points, set up relevant alerts, and create actionable insights from this metrics data.
Bio:
Nitish is the Founder and CEO of Parseable Inc. At Parseable, Nitish and the team are building the next generation of infrastructure and tooling for observability. Parseable is one of the fastest purpose-built, full-stack observability platforms. With more than 15 years of experience in the software industry, Nitish has previously worked at data infrastructure organisations like MinIO and DataStax.

💡 Second Speakers:
John Watson, Principal Engineer, Autodesk & Petr Novicky, Principal Engineer, Autodesk
Title of Talk:
Lessons from the Journey: Evolving Autodesk’s Streaming Platform
Abstract:
Building a streaming platform at scale requires constant evolution. In this session, we'll dive deeper into the technical journey of building a reliable data streaming platform at Autodesk capable of processing billions of events per day. We will share some real-world lessons learnt while using Apache Flink and Kafka.
You'll learn about our key architectural shifts, including:
- Decomposed Flink jobs: We'll show you how we broke a monolithic Flink job into a more flexible model, writing materialized entities to a Kafka backbone for decoupled consumption.
- Platform as a Product: We'll discuss how we're building a self-service platform with tools to give development teams more control over their data pipelines.
- KDS to embedded Debezium migration: We'll explain why we migrated our change data capture strategy from Kinesis Data Streams (KDS) to Debezium embedded in Flink, sharing the benefits and trade-offs we encountered.
Join us to learn practical strategies for building and evolving a resilient streaming platform.
Bios:
John is a Principal Engineer at Autodesk and a key member of the data streaming and processing platform team. With a deep passion for data streaming and backend development, he is currently building a highly scalable and reliable data streaming platform to handle complex data challenges.
Petr is a Principal Engineer at Autodesk and is part of the data streaming and processing platform team. As a passionate backend software engineer, he thrives in the dynamic world of technology, adeptly utilizing a variety of programming languages. His current focus is on leveraging Apache Flink for real-time data processing. With a commitment to continuous learning, Petr enjoys tackling complex problems and creating innovative, scalable solutions across diverse domains.
***
DISCLAIMER
We are unable to cater for any attendees under the age of 18.
If you would like to speak or host our next event, please let us know! community@confluent.io

Apache Kafka® x Apache Flink® x Apache Iceberg Meetup
Rue Gilt Groupe Ireland, 9, 10 Fenian St, Dublin, Ireland
Join us for an Apache Kafka® x Apache Flink® x Apache Iceberg meetup on Tuesday, October 7th from 6:00pm in Dublin, hosted by Rue Gilt Groupe and supported by Confluent!
📍Venue:
Rue Gilt Groupe Ireland, 9, 10 Fenian St, Dublin, Ireland
1st Floor

🗓 Agenda:
- 6:00pm: Doors open
- 6:00pm – 6:30pm: Food, Drinks & Networking
- 6:30pm - 7:00pm: PPJ, Director, Solutions Engineering DSP
- 7:00pm - 7:20pm: Douglas Temple - Principal Engineer (Platform), RGG
- 7:20pm - 7:50pm: Tim Berglund, VP DevRel, Confluent
- 8:00pm - 9:00pm: Additional Q&A and Networking
💡 Speaker One:
PPJ, Director, Solutions Engineering DSP
Title of Talk:
End-to-End Kafka & Flink: Oracle, Flink, VSCode & Beyond
Abstract:
Join us for a fast-paced, demo-focused session showcasing the latest in Kafka integrations. We’ll explore how Confluent Cloud (CC) connects with the new Oracle connector, delve into real-time processing with Flink, and demonstrate how developers can streamline workflows using our custom VSCode plugin. If time permits, we’ll also preview an exciting integration with Claude via the Model Context Protocol (MCP). Whether you’re building pipelines or enhancing dev productivity, this talk highlights practical tools to supercharge your Kafka ecosystem.
Bio:
During a 30-year career, Peter has been lucky enough to have worked worldwide in diverse IT and software industry sectors. A passionate contributor to several well-known and established open source projects (in his spare time), Peter firmly believes in utilising the right tools and technology to meet the data and analytical challenges of business today while ensuring his customers derive the very best ROI.
With the convergence of robotics, analytics, artificial intelligence, and IoT on the cusp of this fourth industrial revolution, Peter firmly believes there has never been a more exciting time to work in data streaming and advanced analytics.
💡 Speaker Two:
Douglas Temple, Principal Engineer (Platform), RGG
Title of Talk:
Our ongoing journey from batch to event-driven systems in e-commerce
Abstract:
We present a lightning intermission talk exploring our gradual progress towards integrating event-driven systems into an existing batch-oriented architecture. We'll outline some of the challenges, missteps, and wins that we've experienced along this path, and the prospects for our future use of real-time streaming with Apache Kafka on Confluent Cloud.

💡 Speaker Three:
Tim Berglund, VP DevRel, Confluent
Title of Talk:
Building Agents with MCP and Data Streams
Abstract:
LLMs are incredibly powerful, but they have two problems: they only know what they read on the Internet, and they can't actually do anything; they can only chat with you. If you want to build agentic applications that have access to the immediate, non-public context of your business, and you want your agents to be able to take actions in the world, you'll probably need some help from the Model Context Protocol, or MCP.
And that "business context" increasingly exists in the form of real-time streaming data, often in Kafka topics. Once you're asking your microservices to interpret natural-language prompts, then deputizing them to take actions on your behalf (this is what an agent is!), you can't afford for them to be acting on out-of-date context. They need to remain deeply connected to the events that matter to your business.
In this talk, we'll get a solid overview of MCP itself, then see how you can use it to build practical multi-agent architectures powered by real-time streaming data. We'll see what's possible when we stop thinking about AI as an external chatbot and start treating it as part of our streaming architecture. Agents are here, and they are powered by streams.
Bio:
Tim serves as the VP of Developer Relations at Confluent, where he and his team work to make streaming data and its emerging toolset accessible to all developers. He is a regular speaker at conferences and a presence on YouTube, explaining complex technology topics in an accessible way. He lives with his wife and stepdaughter in Mountain View, CA, USA. He has three grown children, three step-children, and four grandchildren.

***
Past events
