A Crypto Project Just Trained an AI Model From Scratch
31 days ago · VirtualBacon (@VirtualBacon)
YouTube · 3 min 29 sec
Note: AI-generated summary based on third-party content. Not financial advice.
Quick Insights

Investors should focus on Bittensor (TAO) as it establishes itself as the clear leader in the Decentralized AI (DeAI) sector by successfully training a 72-billion parameter model from scratch. The successful launch of the Covenant 72B model on Subnet 3 proves that decentralized networks can now outperform established benchmarks like Meta’s Llama 2. Look to accumulate TAO as a high-conviction infrastructure play, as it is transitioning from speculative "vaporware" to a platform with verifiable technical output. While the technology currently competes with models from two years ago, the closing gap with state-of-the-art AI suggests a long-term bullish trend for the DeAI asset class. For secondary exposure, monitor Nous Research and their Psyche project, though Bittensor remains the preferred trade due to its superior permissionless coordination.

Detailed Analysis

Bittensor (TAO)

Bittensor has achieved a significant milestone in the Decentralized AI (DeAI) space by successfully training a Large Language Model (LLM) from scratch on its network.
• The model, codenamed Covenant 72B, was trained on Subnet 3.
• Technical Achievement: This was a "pre-training" run, meaning the model was built from raw data rather than fine-tuned from an existing model.
• Scale: At 72 billion parameters, it is a substantial and powerful entry in the open-source AI field.
• Permissionless Innovation: Unlike previous decentralized training attempts (such as those by Nous Research), which required "whitelisting" or pre-screening of participants, this Bittensor project was completely permissionless; anyone could contribute computing power to the training process.
• Performance: The resulting model proved more competitive than Meta's Llama 2 70B, which served as the benchmark for this trial.

Takeaways

• Proof of Concept: This event shows that decentralized, distributed computing can coordinate to build complex AI models without a central data center.
• Efficiency vs. Centralization: While tech giants like Meta or Google control all of their hardware, Bittensor's "Subnet" model demonstrates that global coordination of individual "rigs" can achieve comparable results.
• Investment Horizon: The technology currently competes with models from roughly two years ago (e.g., Llama 2). It is not yet at the level of state-of-the-art models like GPT-4 or Claude Opus, but the gap is closing.
• Sector Leadership: Bittensor is positioning itself as the primary infrastructure for "Real AI" in crypto, moving away from the "scam" reputation of many other AI-labeled tokens.


Decentralized AI (DeAI) Sector

• The transcript highlights a shift from "vaporware" to tangible results at the crypto-AI intersection.
• Key Challenges Overcome:
  • Coordination: The main hurdle for decentralized AI is getting machines around the world to communicate and run the same code efficiently.
  • Competition: Previous decentralized models (like Consilience by Nous Research) were less competitive with industry standards; the new Bittensor-based model suggests decentralized methods are becoming more effective.

Takeaways

• Bullish Sentiment: AI is expected to be a "big wave" that lasts a long time in the crypto market.
• Risk Factor: Investors should remain cautious, as the speaker notes that "most crypto AI projects are scams." Focus should be on projects with verifiable technical output (like model training).
• Benchmark Tracking: Watch for decentralized projects that can compete with Meta's Llama series, the current "gold standard" for open-source AI that crypto projects are trying to disrupt.


Nous Research / Psyche

• Mentioned as a "full AI research team" and a popular project within the crypto space.
• They previously launched a decentralized training network called Psyche and a 40-billion-parameter model called Consilience.

Takeaways

• Secondary Opportunity: While Bittensor is the current focus, Nous Research remains a key player to watch in the decentralized AI research space.
• Comparative Analysis: Their previous models required "pre-selection" of participants, making them less decentralized than the new permissionless methods being pioneered on Bittensor.

Video Description
A Crypto Project Just Trained an AI Model From Scratch. A decentralized compute network trained a 72B parameter model from zero, competing with Meta's LLaMA 3. #AI #Crypto #Altcoins #CryptoInvesting #DecentralizedAI #Shorts
About VirtualBacon
By @VirtualBacon

I'm Dennis, a crypto angel investor with 100+ startups in our portfolio. On this channel I share my views on market trends and ...