Hello, Daft blog readers: we are moving the blog to its new home on our new domain at https://www.daft.ai/blog, where you will have access to the full version of this blog post.
We look forward to your continued support as we build the future of multimodal and AI data together!
Multimodal Data and AI
The AI boom has created a massive bottleneck: processing multimodal data at scale is broken. Every company building AI applications needs to process huge volumes of text, images, documents, and video, but they're stuck using tools designed for web clicks and bank transactions.
We Raised Our Series A
We've raised $30M in total funding ($20M Series A led by Felicis, plus strategic investment from Microsoft's M12 Ventures and Citi) to build the data infrastructure the AI era actually needs.
We’re Hiring!
The team behind Daft has built data and AI systems at Databricks, AWS, Nvidia, Pinecone, GitHub Copilot, and more.
With this round of funding, we are continuing to grow our team. Join the best engineers you'll ever work with here at Eventual. We are hiring for roles across:
Come solve hard technical problems at the frontier of systems, data and AI with us.
Early Access to Eventual Cloud
We're opening early access to Eventual Cloud today—the first production-ready platform built from scratch for multimodal AI workloads.
Built on our open-source Daft engine, Eventual Cloud runs in your cloud with enterprise-grade security and reliability.
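Because Eventual Cloud runs the same open-source Daft engine, multimodal work is expressed as ordinary DataFrame code. Here is a minimal sketch of what that looks like, assuming Daft's Python API with the url.download(), image.decode(), and image.resize() expressions; the table contents and URLs below are placeholders, not real data.

```python
import daft

# A tiny multimodal pipeline: a table of captions and image URLs,
# where images are downloaded, decoded, and resized as regular columns.
df = daft.from_pydict({
    "caption": ["a red bicycle", "a mountain lake"],
    "image_url": [
        "https://example.com/bike.jpg",  # placeholder URLs
        "https://example.com/lake.jpg",
    ],
})

df = (
    df
    # Fetch the raw bytes behind each URL (on_error="null" keeps failed
    # downloads as nulls instead of raising).
    .with_column("image_bytes", daft.col("image_url").url.download(on_error="null"))
    # Decode the bytes into Daft's image type.
    .with_column("image", daft.col("image_bytes").image.decode())
    # Produce a 64x64 thumbnail column alongside the original.
    .with_column("thumbnail", daft.col("image").image.resize(64, 64))
)

df.show()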
The AI era needs data infrastructure built for AI workloads.
Not adapted. Not retrofitted.
Built for it.