AI Triggers GPU Shortage: How Can Blockchain Alleviate Machine Learning Bottlenecks?

Foresight News
2023-11-20 13:53:29
Blockchain provides a bridge to lower-cost GPU computing by enabling distributed access to models and creating a cheaper model marketplace through crypto incentives.

Original Title: The Rise of AI and GPU Shortages: How Blockchain Alleviates Machine Learning Bottlenecks

Original Author: Tommy Eastman

Original Translation: Frank, Foresight News


With the development of artificial intelligence and the increasing demand for GPUs, the machine learning industry is facing issues related to GPU costs and accessibility. Let’s explore how blockchain technology provides solutions.

GPU Industry

In the past year, AI-based applications and integrations have seen tremendous growth. OpenAI's ChatGPT became the fastest-growing application in history, reaching 100 million monthly active users just two months after its launch. In comparison, TikTok took 9 months, and Instagram took 18 months to reach the same milestone.

The demand for AI has greatly impacted the value and availability of graphics processing units (GPUs). GPUs are processing units optimized for performing parallel computations, handling many data points simultaneously, making them useful for machine learning, video editing, and gaming applications. Due to their versatility in the AI space, the market demand for GPUs has increased.

GPUs are designed and sold by only a handful of companies, a concentration that shows up as delays across the manufacturing supply chain. They have also been closely tied to the blockchain industry since the 2017 bull market, with Ethereum proof-of-work miners buying up nearly all available GPUs in 2018. Although Ethereum has since transitioned to proof of stake, the explosive growth of AI has again made blockchain technology a useful answer to common problems such as acquiring GPUs, covering training costs, and distributing inference.

Machine Learning Processes and Bottlenecks

Machine learning is a vast and rapidly evolving industry. Building and serving models typically involves several stages, each with its own bottlenecks.

1. Base Model Training

Base model training involves acquiring a large dataset (e.g., Wikipedia) and training an initial base model that serves as a general-purpose model or as the starting point for later fine-tuning. The resulting model uses learned patterns and relationships to predict the next item in a sequence.

For example, image generation models are trained to associate image patterns with corresponding text, so when given a text input, they generate images based on these learned patterns. Similarly, for text, the model predicts the next word in a text string based on previous words and context.
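To make the next-item objective concrete, here is a minimal, hypothetical Python sketch of a toy character-level model trained to predict the token that follows at every position. The model, data, and hyperparameters are placeholders for illustration only, not anything from the original article; real base model training runs the same idea at vastly larger scale.

```python
# Toy sketch of next-token prediction: inputs are all tokens except the last,
# targets are the same sequence shifted by one position.
import torch
import torch.nn as nn

text = "blockchain provides a bridge to lower-cost gpu computing"
vocab = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocab)}
ids = torch.tensor([stoi[ch] for ch in text])

class TinyLM(nn.Module):
    def __init__(self, vocab_size, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h)  # logits for the next token at every position

model = TinyLM(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

x, y = ids[:-1].unsqueeze(0), ids[1:].unsqueeze(0)
for step in range(200):
    logits = model(x)
    loss = nn.functional.cross_entropy(logits.reshape(-1, len(vocab)), y.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```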

Training base models is expensive in terms of labor, infrastructure, time, and effort, and the current supply chain makes it difficult to obtain state-of-the-art NVIDIA GPUs, even for companies with ample funding.

For instance, the iterative training of OpenAI's GPT-3 lasted several months, consuming millions of dollars just in energy costs. Therefore, training base models remains a prohibitively expensive endeavor that only a few private enterprises can undertake.

2. Fine-tuning

Fine-tuning is notably less resource-intensive than base model training: it optimizes an existing model for a specific task (for example, teaching a language model a new dialect), and it can significantly improve a base model's performance on that task.
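As a rough illustration of the difference in scale, the sketch below continues training a small open-source model on a handful of task-specific examples instead of training from scratch. It is a hypothetical example using the Hugging Face transformers library; the model name and the example data are placeholders, not anything referenced in the article.

```python
# Hypothetical fine-tuning sketch: start from pretrained weights and train
# briefly on task-specific text with a small learning rate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder open-source base model
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

task_examples = [
    "Q: What does a GPU accelerate? A: Parallel computation.",
    "Q: What is fine-tuning? A: Adapting a base model to a task.",
]

opt = torch.optim.AdamW(model.parameters(), lr=5e-5)  # small LR preserves base knowledge
model.train()
for epoch in range(3):
    for text in task_examples:
        batch = tok(text, return_tensors="pt")
        # labels == input_ids: the model learns to predict each next token
        out = model(**batch, labels=batch["input_ids"])
        out.loss.backward()
        opt.step()
        opt.zero_grad()
```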

While GPU scarcity affects all three areas, fine-tuning is the least impacted. However, fine-tuning relies entirely on open-source base models. If private companies decide to stop open-sourcing their models, community models will quickly fall behind the state-of-the-art (SOTA) models.

3. Inference

Inference, accessing a trained model, is the final step in the process: for example, getting an answer to a question from ChatGPT, or generating an image from a user prompt with Stable Diffusion. Every such query requires GPU resources, and the computational demands of inference, especially GPU expenditure, are escalating rapidly.
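A minimal sketch of what a single inference query looks like, assuming a small open-source model served with the Hugging Face transformers library (the model name and prompt are placeholders): every call like this runs the model forward on a GPU, which is why query volume translates directly into GPU demand.

```python
# Hypothetical inference sketch: one user query, one forward pass on the GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any hosted model plays the same role
device = "cuda" if torch.cuda.is_available() else "cpu"
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

prompt = "How can blockchain lower GPU costs?"
inputs = tok(prompt, return_tensors="pt").to(device)
with torch.no_grad():  # no gradients are needed at inference time
    output_ids = model.generate(**inputs, max_new_tokens=50)
print(tok.decode(output_ids[0], skip_special_tokens=True))
```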

Inference serves both end users and developers who integrate models into their applications, and it is what makes a model economically viable. It is also the step through which AI systems reach society at large, as the rapid adoption of tools like ChatGPT shows.

The scarcity of GPUs has driven inference costs up rapidly. Although inference has lower baseline requirements than base model training, the scale at which companies deploy applications demands an astonishing amount of GPU capacity for querying models. As the diversity of models grows (through fine-tuning and new base model development), the diversity of applications will grow as well, driving a sharp rise in GPU demand from inference.

Blockchain Provides Solutions to Machine Learning Bottlenecks

In the past, GPUs were used to mine Ethereum and other proof-of-work tokens. Now, blockchain presents a unique opportunity to provide access to GPUs and coordinate around the bottlenecks in the GPU space, especially for machine learning.

Crypto Incentives

Large-scale GPU deployments require significant upfront capital, which keeps all but the largest corporations out of the field. Blockchain incentives let GPU owners profit from spare computing power, creating a cheaper and more accessible market for users.

Distributed Access

Anyone can provide or consume compute, host models, and query models, in stark contrast to the gated access and closed betas common in the traditional space.

One important capability blockchain can bring to machine learning is distributed access. Traditionally, machine learning has required large data centers, because foundation model training (FMT) has not yet been accomplished at scale on non-clustered GPUs; distributed protocols are attempting to change that. If they succeed, it will open the floodgates for FMT.

Market Coordination

Blockchain marketplaces help coordinate GPU procurement, letting individuals and companies that own GPUs find renters rather than leaving hardware idle. Income from otherwise idle GPUs helps offset the upfront cost of purchasing them, allowing more entities to participate in GPU hosting.

Foundry's Commitment to Responsible AI

The blockchain machine learning space is an emerging industry with very few projects on mainnet. Currently, Foundry is supporting the Bittensor AI project as well as Akash, both of which have proven to be meaningful ways to advance distributed AI.

Bittensor

Bittensor is a decentralized, permissionless computing network that makes models easier to access and creates a cheaper model marketplace through crypto incentives: anyone can host models, and user prompts are matched with the highest-ranked models for a given modality.
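The routing idea described above, matching a prompt to the highest-ranked host for a given modality, can be illustrated with the simplified Python sketch below. This is not Bittensor's actual API or incentive mechanism, just a hypothetical toy version of the matching step; in the real network, ranks come from the protocol rather than being hard-coded.

```python
# Hypothetical illustration: hosts register models per modality, and a query
# is routed to the highest-ranked host for that modality.
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    modality: str   # e.g. "text" or "image"
    rank: float     # score assigned by the network's incentive mechanism

hosts = [
    Host("miner-a", "text", 0.72),
    Host("miner-b", "text", 0.91),
    Host("miner-c", "image", 0.64),
]

def route(prompt: str, modality: str) -> Host:
    """Send the prompt to the highest-ranked host for the requested modality."""
    candidates = [h for h in hosts if h.modality == modality]
    return max(candidates, key=lambda h: h.rank)

print(route("Explain proof of stake.", "text").name)  # -> "miner-b"
```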

Bittensor has evolved into one of the largest AI projects in the crypto space, creating a large-scale computing inference network using blockchain, which recently launched subnets to incentivize different modalities, including image generation, prediction markets, and more.

Foundry validates and mines on the network and runs proof-of-authority nodes to ensure consensus.

Akash

Akash is a general-purpose compute marketplace that makes large-scale GPU access easier, enables more base models to be trained, and reduces GPU costs.

Akash recently launched their GPU marketplace, aiming to reduce the entry capital barrier, lower GPU computing costs, and increase accessibility, with base model training programs developing on Akash. Foundry is providing GPU computing for the network and collaborating with the team to develop features.

What’s Next?

As machine learning continues to integrate into businesses, the demand for GPUs will continue to soar, triggering ongoing supply chain issues in the machine learning space. Blockchain technology provides a bridge to access lower-cost GPUs by allowing distributed access to models and creating a cheaper model market with crypto incentives.
