Launching Tether & The Journey So Far
Excited to share the story of Tether, a new concept we are validating out of BetterFutureLabs.
Welcome to the new Bug Driven Developers who joined this week! If you enjoy this post, forward it to your developer friends so they can join us too.
Here’s what I’ve got for you today:
- Launch of Tether - The Data Exchange for Agentic AI
- Tokenization - a crucial concept to understand for building AI systems
- Touching Linux is good for you
Let's start off this week by talking about Tether.
I’m excited to share that we are launching a new product, Tether, for validation out of BetterFutureLabs.
Tether is the result of extensive research and development we have been doing in multi-agent and agentic systems since last November. Through this work, we’ve uncovered a key insight: AI, particularly agentic systems, consumes data in a fundamentally different way than humans do. As AI continues to evolve, it’s becoming clear that AI systems, rather than humans, will be the primary consumers of data in the future. However, most existing data sources are tailored for human use at some point in the data pipeline, not for AI systems.
Tether is here to change that. We’re building a data exchange specifically designed for agentic AI systems to be the end consumers of data, making it more accessible and legible for these advanced AI models and systems.
We are currently partnering with respected data providers to list their data on our platform (five partnerships so far), and we’re looking for developers interested in becoming beta users.
If you’re interested in joining the Tether Beta program, you can sign up here.
To learn more about Tether, check out our recently published podcast.
📖 Tokenization
Tokenization is vital for AI systems as it breaks text into smaller units (tokens), allowing models to process and interpret language more effectively. For AI developers, understanding tokenization is key to optimizing data usage and enhancing model performance.
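To make this concrete, here’s a minimal sketch of tokenization in Python using OpenAI’s tiktoken library. The library choice and the sample text are my own assumptions for illustration; any BPE-style tokenizer works the same way in spirit.

```python
# Minimal tokenization sketch (illustrative, using tiktoken as one
# example of a BPE tokenizer).
import tiktoken

# cl100k_base is the encoding used by GPT-4-era OpenAI models.
enc = tiktoken.get_encoding("cl100k_base")

text = "Tokenization turns text into a sequence of integers."
tokens = enc.encode(text)

print(tokens)              # a list of integer token ids
print(len(tokens))         # how many tokens the model will see
print(enc.decode(tokens))  # round-trips back to the original text
```

Note that token counts, not character counts, are what determine both a model's context-window usage and your API bill.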
If you’re interested in learning more about tokenization, check out this new course by Andrew Ng.
Tokenization -- turning text into a sequence of integers -- is a key part of generative AI, and most API providers charge per million tokens. How does tokenization work? Learn the details of tokenization and RAG optimization in Retrieval Optimization: From Tokenization to Vector… x.com/i/web/status/1…
— Andrew Ng (@AndrewYNg)
3:11 PM • Oct 2, 2024
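Since providers bill per million tokens, you can estimate a request’s cost before sending it by counting tokens locally. A back-of-the-envelope sketch below; the price constant is hypothetical, so check your provider’s actual rates.

```python
# Hypothetical cost estimate: token count times a per-million-token rate.
# PRICE_PER_MILLION is made up for illustration only.
import tiktoken

PRICE_PER_MILLION = 0.50  # USD per 1M input tokens (illustrative)

enc = tiktoken.get_encoding("cl100k_base")
prompt = "Summarize this document for an agentic pipeline. " * 100
n_tokens = len(enc.encode(prompt))

cost = n_tokens / 1_000_000 * PRICE_PER_MILLION
print(f"{n_tokens} tokens is roughly ${cost:.6f}")
```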
😂 Meme of the Week
Instead of touching grass this week, go and touch Linux.
Running Linux on your own machine can serve as cognitive-behavioral therapy to cure server-phobia. Making it possible to evaluate cloud/PaaS with rational analysis rather than succumb to the fear, uncertainty, and doubt the merchants of complexity so eagerly try to cultivate.
— DHH (@dhh)
4:23 PM • Oct 3, 2024
Cultivating and exploiting the insecurities of developers is a massive growth opportunity and market. No wonder VCs are rushing in to get some of that sweet 100x arbitrage action. We need open source to push back with tools and education.
— DHH (@dhh)
10:16 PM • Sep 27, 2024
Thanks for tuning in to this week’s newsletter! If you have any questions, feel free to reach out to me on X (Justin's X).
Thanks,
Justin
P.S. What new things about software development did you learn this week?