Unlocks AI Applications by Simplifying Real-Time Streaming and Infrastructure
What do you like best about the product?
Decodable is a tool that unlocks a lot of use cases for us that were previously not achievable for our team. With our AI pipelines, we require customer updates to take effect immediately. For example, if they change a document (PDF, DOCX), we need it to be updated and reindexed in our vector databases so that the new information can be surfaced in AI pipelines. It would be impossible to perform these updates in a timely manner if we relied on batch jobs.
Decodable is very effective at monitoring these updates (through Change Data Capture) and running them through a custom pipeline for us. For use cases that only need to move data from point A to point B, Decodable is far simpler than a hand-rolled solution and doesn't need much code maintenance.
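To give a rough idea of what that point-A-to-point-B move looks like under the hood, here is a minimal PyFlink sketch of a CDC-to-stream pipeline. This is generic Flink, not Decodable's actual API, and the hostnames, credentials, table names, and topic are placeholders:

from pyflink.table import EnvironmentSettings, TableEnvironment

# Streaming Table API environment; Decodable manages the equivalent of this for us.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# CDC source: every insert/update/delete on the documents table arrives as a change event.
# Hostname, credentials, and table names are placeholders.
t_env.execute_sql("""
    CREATE TABLE documents_cdc (
        id BIGINT,
        title STRING,
        body STRING,
        updated_at TIMESTAMP(3),
        PRIMARY KEY (id) NOT ENFORCED
    ) WITH (
        'connector' = 'postgres-cdc',
        'hostname' = 'db.example.internal',
        'port' = '5432',
        'username' = 'cdc_user',
        'password' = '***',
        'database-name' = 'app',
        'schema-name' = 'public',
        'table-name' = 'documents'
    )
""")

# Sink: an upsert stream of changed documents, keyed by id (here a Kafka topic).
t_env.execute_sql("""
    CREATE TABLE documents_out (
        id BIGINT,
        title STRING,
        body STRING,
        updated_at TIMESTAMP(3),
        PRIMARY KEY (id) NOT ENFORCED
    ) WITH (
        'connector' = 'upsert-kafka',
        'topic' = 'documents-changes',
        'properties.bootstrap.servers' = 'kafka.example.internal:9092',
        'key.format' = 'json',
        'value.format' = 'json'
    )
""")

# The "point A to point B" part: forward every change, no business logic required.
t_env.execute_sql("INSERT INTO documents_out SELECT * FROM documents_cdc")

In practice Decodable hosts, runs, and monitors the equivalent of this for us, so the only part we really own is the final SELECT.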
Real-time streaming will always take effort and require different thinking. The Decodable team has done a very good job of abstracting a huge amount of the complexity, allowing us to focus on our business logic. The support has been great all the way, which is important for a team like ours that wasn't used to Flink.
When it comes to simplifying infrastructure for streaming needs, this is where Decodable really shines. We don't have to worry about keeping Kafka and Flink infrastructure up, which can be a lot of work.
What do you dislike about the product?
The Decodable team moves fast, but as one of the earlier adopters we certainly had to adjust to some of the limitations. For example, we used Java Flink even though our team was more comfortable in Python; a month after that decision, the PyFlink beta was rolled out. We also had to do some extra work around CI/CD. At the pace the Decodable team moves, though, I am confident they will have a significantly more refined product in the next few months. The support team has been super responsive for us. If only our timeline had been one to two months later, we would have had a much easier time.
Similarly, the process around updating pipelines isn't fully refined yet; it requires a bit of extra work on our end to write helper scripts.
Moving from batch to streaming also requires a mindset change that takes getting used to. For example, joins are significantly more complicated. This isn't the fault of the Decodable team, though; it's just a fact of life that streaming is a bit more complicated.
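To make the join point concrete, here is a small illustrative sketch (generic Flink SQL via PyFlink, with made-up table names): unlike a batch join over complete tables, a streaming join has to be bounded in event time so the engine knows how long to keep state for unmatched rows.

from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# In batch, `orders JOIN shipments ON order_id` runs once over complete tables.
# In streaming, both sides are unbounded, so the join has to be time-bounded
# (an interval join); otherwise the engine would buffer state forever.
# The datagen tables below are placeholders standing in for real streams.
t_env.execute_sql("""
    CREATE TABLE orders (
        order_id BIGINT,
        order_time TIMESTAMP(3),
        WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND
    ) WITH ('connector' = 'datagen')
""")
t_env.execute_sql("""
    CREATE TABLE shipments (
        order_id BIGINT,
        ship_time TIMESTAMP(3),
        WATERMARK FOR ship_time AS ship_time - INTERVAL '5' SECOND
    ) WITH ('connector' = 'datagen')
""")

# Interval join: only pair a shipment with an order placed in the preceding 4 hours.
result = t_env.sql_query("""
    SELECT o.order_id, o.order_time, s.ship_time
    FROM orders o JOIN shipments s
      ON o.order_id = s.order_id
     AND s.ship_time BETWEEN o.order_time AND o.order_time + INTERVAL '4' HOUR
""")
result.execute().print()  # runs until cancelled, since the inputs are unbounded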
What problems is the product solving and how is that benefiting you?
Data teams need to move data from transactional databases to data warehouses. In our case, we additionally need to move the records to our vector databases for AI applications. There are two ways to do this: a batch job or a streaming pipeline. Batch jobs are a lot simpler, at the expense of the data only being up to date each time the batch runs (every few hours or daily). For the AI use cases in our organization, we needed the data to be consistently up to date.
Decodable simplifies the complexity of streaming pipelines by handling the change data capture (CDC) stream and the infrastructure to move the data from point A to point B in real time (through Kafka and Flink). This allows us to always provide up-to-date data in our AI applications because we are able to index the data in our vector databases in real time. Without Decodable, we'd need to manage too many things ourselves rather than focusing on the business logic.
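As a rough sketch of what the receiving end looks like, assuming change events land on a Kafka topic: the embedding function and vector-store client below (embed, VectorIndex) are hypothetical placeholders, not any specific library's API.

import json
from kafka import KafkaConsumer  # kafka-python package

# Hypothetical stand-ins for whatever embedding model and vector database
# the application actually uses; these are not a specific library's API.
def embed(text: str) -> list[float]:
    """Return an embedding vector for the given text (placeholder)."""
    raise NotImplementedError

class VectorIndex:
    """Placeholder vector-store client with upsert/delete by document id."""
    def upsert(self, doc_id: str, vector: list[float], metadata: dict) -> None: ...
    def delete(self, doc_id: str) -> None: ...

index = VectorIndex()

# Each message is one CDC change event for a document; field names are illustrative.
consumer = KafkaConsumer(
    "documents-changes",
    bootstrap_servers="kafka.example.internal:9092",
    value_deserializer=lambda v: json.loads(v) if v else None,
)

for message in consumer:
    event = message.value
    if event is None:
        # A tombstone means the row was deleted upstream; drop it from the index too.
        if message.key:
            index.delete(message.key.decode())
        continue
    # Re-embed the changed document and upsert it, so AI queries see it immediately.
    index.upsert(
        doc_id=str(event["id"]),
        vector=embed(event["body"]),
        metadata={"title": event["title"], "updated_at": event["updated_at"]},
    )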