data Artisans, the stream processing team behind Apache Flink, today unveiled data Artisans Streaming Ledger at the Flink Forward conference in Berlin. Streaming Ledger, according to data Artisans, “extends the scope of stream processing with fast, serializable ACID transactions directly on streaming data.”
This is significant because, until now, performing serializable transactions across streaming data without losing data consistency was considered impossible.
If data Artisans are right about Streaming Ledger, that’s not only good news for them, it’s good news for developers and system architects struggling to manage streaming data within their applications.
How Streaming Ledger fits into a data streaming architecture
Streaming Ledger is essentially a new component within data Artisans’ existing data streaming architecture, which includes Apache Flink.
Stephan Ewen, co-founder and CTO at data Artisans, said that “guaranteeing serializable ACID transactions is the crown discipline of data management.” He also claimed that Streaming Ledger does “something that even some large established databases fail to provide. We are very proud to have come up with a way to solve this problem for real time data streams, and make it fast and easy to use.”
How Streaming Ledger works
It’s not easy for streaming technologies to process event streams that read and modify shared state spread across multiple tables. That’s why offering the transactional guarantees of a relational database over streaming data is so tough (okay, just about impossible).
Streaming Ledger, however, works by isolating tables from concurrent changes while they are being modified within a transaction. This helps ensure that consistency is maintained across your data, just as you would expect from a robust relational database.
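To make that guarantee concrete, here is a minimal Python sketch of the kind of behaviour described: each stream event atomically reads and updates several tables, and a cross-table invariant is never violated. Note this is not the Streaming Ledger API itself, just an illustration of serializable, multi-table transactions; the table names and events are invented for the example.

```python
# Minimal sketch (not the Streaming Ledger API): each event is applied
# as an all-or-nothing transaction touching two "tables", so no observer
# can ever see a half-applied update.

accounts = {"alice": 100, "bob": 50}   # table 1: account balances
ledger = []                            # table 2: committed transfer entries

def transfer(event):
    """Apply one transfer event as an all-or-nothing transaction."""
    src, dst, amount = event["from"], event["to"], event["amount"]
    if accounts.get(src, 0) < amount:  # consistency check
        return False                   # abort: neither table is modified
    # Both table updates happen together, so the money never appears to
    # have left one account without arriving in the other.
    accounts[src] -= amount
    accounts[dst] += amount
    ledger.append((src, dst, amount))
    return True

# A stream of events, processed one transaction at a time
# (serial execution is trivially serializable).
stream = [
    {"from": "alice", "to": "bob", "amount": 30},
    {"from": "bob", "to": "alice", "amount": 200},  # aborts: insufficient funds
]

results = [transfer(e) for e in stream]
assert sum(accounts.values()) == 150  # invariant: total balance conserved
```

A real stream processor has to deliver the same effect while executing transactions concurrently and at scale, which is what makes the problem hard.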
data Artisans have also produced a white paper that details how Streaming Ledger works, as well as why you might want to use it. You need to provide your details to gain access, but you can find it here.