Is Crypto X AI No Longer Popular? A Quick Look at High-Potential Narrative Directions You Might Have Overlooked

Deep Tide TechFlow
2024-07-23 22:44:17
The initial Web3-AI craze was mainly focused on some impractical value propositions, and now the emphasis should shift to building feasible solutions.

Author: Crypto, Distilled

Compiled by: Shenchao TechFlow

Crypto and AI: Have We Reached the End?

In 2023, Web3-AI briefly became a hot topic.

But now, the space is crowded with imitators and heavily funded projects with no real utility.

Here are the misconceptions to avoid and the key points to focus on.

Overview

The CEO of IntoTheBlock @jrdothoughts recently shared his insights in an article.

He discussed:

a. Core challenges of Web3-AI

b. Overhyped trends

c. High-potential trends

I have distilled each point for you! Let's take a look:

Market Status

The current Web3-AI market is overhyped and overfunded.

Many projects are disconnected from the actual needs of the AI industry.

This disconnect has created confusion but also opportunities for the insightful.

(Thanks to @coinbase)

Core Challenges

The gap between Web2 and Web3 AI is widening, primarily for three reasons:

  1. Limited AI research talent

  2. Restricted infrastructure

  3. Insufficient models, data, and computing resources

Generative AI Fundamentals

Generative AI relies on three key elements: models, data, and computing resources.

Currently, there are no major models optimized for Web3 infrastructure.

Initial funding supported some Web3 projects that were disconnected from AI realities.

Overrated Trends

Despite the hype, not all Web3-AI trends are worth paying attention to.

Here are some trends that @jrdothoughts considers to be the most overrated:

a. Decentralized GPU networks

b. ZK-AI models

c. Proof of inference (Thanks to @ModulusLabs)

Decentralized GPU Networks

These networks promise to democratize AI training.

But the reality is that training large models on decentralized infrastructure is slow and impractical.

This trend has yet to deliver on its lofty promises.
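To see why, consider a rough back-of-envelope estimate (all numbers below are illustrative assumptions, not benchmarks): each training step requires every worker to exchange a full set of gradients, and consumer internet links are orders of magnitude slower than datacenter interconnects.

```python
# Back-of-envelope estimate (not a benchmark): why synchronizing gradients for a
# large model over consumer internet links is slow compared to a datacenter
# interconnect. All numbers are illustrative assumptions.

def sync_time_seconds(params: float, bytes_per_grad: float, link_gbps: float) -> float:
    """Time to move one full set of gradients over a link of the given speed."""
    payload_bits = params * bytes_per_grad * 8          # gradient payload in bits
    return payload_bits / (link_gbps * 1e9)             # seconds per sync step

PARAMS = 7e9            # assume a 7B-parameter model
BYTES_PER_GRAD = 2      # fp16 gradients

datacenter = sync_time_seconds(PARAMS, BYTES_PER_GRAD, link_gbps=400)   # InfiniBand/NVLink-class link
home_link = sync_time_seconds(PARAMS, BYTES_PER_GRAD, link_gbps=0.1)    # ~100 Mbit/s consumer uplink

print(f"Per-step gradient sync, datacenter link: ~{datacenter:.2f} s")
print(f"Per-step gradient sync, consumer link:   ~{home_link / 3600:.1f} hours")
```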

Zero-Knowledge AI Models

Zero-knowledge AI models seem attractive in terms of privacy protection.

However, they are computationally expensive and difficult to interpret.

This makes them less practical for large-scale applications.

(Thanks to @oraprotocol)

Text from the image:

However, this approach is far from practical, especially for the use cases described by Vitalik. Some examples:

  • The zkML framework EZKL takes about 80 minutes to generate a proof for a 1M-parameter nanoGPT model.

  • According to data from Modulus Labs, zkML overhead is on the order of 1000 times that of pure computation, with the latest report indicating about 1000x.

  • According to EZKL's benchmarks, RISC Zero takes an average of 173 seconds to prove a random forest classification task.

Proof of Inference

Proof-of-inference frameworks provide cryptographic proof that a given model actually produced a given output.

However, @jrdothoughts believes these solutions address non-existent problems.

As a result, their real-world applications are limited.
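For intuition only, here is a toy Python sketch of the underlying idea: committing to a (model, input, output) record so a third party can later check that a claimed output matches the commitment. Real proof-of-inference systems replace this plain hash with far heavier cryptography (zero-knowledge or optimistic proofs), which is exactly the overhead being questioned; all identifiers below are hypothetical.

```python
# Toy hash commitment over an inference record. This is only for intuition;
# it does NOT prove the model was actually run, which is what real
# proof-of-inference systems attempt with much costlier machinery.

import hashlib
import json

def commit_inference(model_id: str, model_hash: str, prompt: str, output: str) -> str:
    """Hash a canonical JSON encoding of the inference record."""
    record = json.dumps(
        {"model_id": model_id, "model_hash": model_hash, "prompt": prompt, "output": output},
        sort_keys=True,
    )
    return hashlib.sha256(record.encode()).hexdigest()

commitment = commit_inference(
    model_id="example-small-model",   # hypothetical identifier
    model_hash="abc123",              # placeholder for a real weights hash
    prompt="What is the capital of France?",
    output="Paris",
)
print(commitment)
```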

High-Potential Trends

While some trends are overhyped, others show significant potential.

Here are some underrated trends that may offer real opportunities:

a. AI agents with wallets

b. Cryptocurrency funding for AI

c. Small foundational models

d. Synthetic data generation

AI Agents with Wallets

Imagine AI agents having financial capabilities through cryptocurrency.

These agents could hire other agents or stake funds to ensure quality.

Another interesting application is "predictive agents," as mentioned by @vitalikbuterin.
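As a purely illustrative sketch of that flow, the toy Python below models two hypothetical agents with in-memory balances: one posts a stake as a quality guarantee, the other pays it for a subtask. A real implementation would use on-chain wallets and signed transactions rather than these placeholder objects.

```python
# Hypothetical in-memory agents illustrating "AI agents with wallets":
# paying another agent for a subtask and staking funds that can be slashed.

from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    balance: float            # denominated in some token, e.g. stablecoin units
    staked: float = 0.0

    def pay(self, other: "Agent", amount: float) -> None:
        """Hire another agent by transferring funds for a subtask."""
        assert self.balance >= amount, "insufficient funds"
        self.balance -= amount
        other.balance += amount

    def stake(self, amount: float) -> None:
        """Lock funds as a quality guarantee; a verifier could slash this."""
        assert self.balance >= amount, "insufficient funds"
        self.balance -= amount
        self.staked += amount

    def slash(self, fraction: float) -> float:
        """Burn part of the stake if the delivered work is rejected."""
        penalty = self.staked * fraction
        self.staked -= penalty
        return penalty

planner = Agent("planner", balance=100.0)
researcher = Agent("researcher", balance=10.0)

researcher.stake(5.0)            # researcher posts a bond before taking the job
planner.pay(researcher, 20.0)    # planner pays for a research subtask
print(planner, researcher, sep="\n")
```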

Cryptocurrency Funding for AI

Generative AI projects often face funding shortages.

Efficient capital formation methods in cryptocurrency, such as airdrops and incentives, provide crucial funding support for open-source AI projects.

These methods help drive innovation.

(Thanks to @oraprotocol)

Small Foundational Models

Small foundational models, such as Microsoft's Phi model, demonstrate the idea that less is more.

Models with 1B-5B parameters are crucial for decentralized AI, providing powerful edge AI solutions.

(Source: @microsoft)
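A minimal sketch of what this looks like in practice, assuming the Hugging Face transformers library and the microsoft/phi-2 checkpoint (~2.7B parameters) as an example; any similarly sized open model would do:

```python
# Run a small foundational model locally. Assumes `transformers` (and
# `accelerate` for device_map) are installed; model choice is illustrative.

from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/phi-2",     # small enough for a single consumer GPU or CPU
    device_map="auto",           # requires `accelerate`; drop for plain CPU use
)

prompt = "Explain in one sentence why small language models matter for edge devices."
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```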

Synthetic Data Generation

Data scarcity is one of the main obstacles to AI development.

Synthetic data generated by foundational models can effectively supplement real-world datasets.
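As a hedged illustration of the idea, the sketch below prompts a small generative model (again assuming transformers and phi-2; the prompt template is made up) to emit labeled examples that could supplement a scarce real dataset. In practice, such samples would be filtered and deduplicated before being used for training.

```python
# Generate a small batch of synthetic labeled examples with a small model.
# Model name and prompt template are illustrative assumptions.

from transformers import pipeline

generator = pipeline("text-generation", model="microsoft/phi-2", device_map="auto")

TEMPLATE = (
    "Write one short customer-support message and label it POSITIVE or NEGATIVE.\n"
    "Format: <label> | <message>\nExample:"
)

synthetic_rows = []
for _ in range(5):                                   # generate a small batch
    out = generator(TEMPLATE, max_new_tokens=40, do_sample=True, temperature=0.9)
    synthetic_rows.append(out[0]["generated_text"].split("Example:")[-1].strip())

for row in synthetic_rows:
    print(row)
```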

Overcoming Hype

The initial Web3-AI craze primarily focused on some unrealistic value propositions.

@jrdothoughts believes that the focus should now shift to building practical solutions.

As attention shifts, the AI field remains full of opportunities waiting for keen eyes to discover.

This article is for educational purposes only and is not financial advice. Special thanks to @jrdothoughts for the valuable insights provided.
