Founders Fund: Will AI be the next Crypto?
Author: John Luttig
Compiled by: Fan Yang
ChatGPT has recently made its way into family group chats, and many people in tech and investing find the buzz reminiscent of last year's cryptocurrency boom.
A potentially disruptive technology edging toward the mainstream always carries an air of mystery; a thousand people hold a thousand imaginings and interpretations of it, and these converge into the cultural currents behind the technology.
If AI truly realizes its fire-and-electricity potential, it will work like a bicycle for society, raising its efficiency and speed, and like a psychedelic for the mind, opening new neural pathways, widening perspectives, and enabling thoughts never before considered and actions never before dared, before ultimately blending into daily life.
Today I am sharing an article by John Luttig, an investor at Founders Fund, the venture capital firm founded by Peter Thiel. It continues yesterday's piece, “The Future of AI: Self-Assembly, Nature-Inspired, and Biological Fusion”. If you don't have time for the full text, here are a few takeaways—every viewpoint has its limits, and we can revisit them as time goes on:
1. The AI field has not been corrupted by financialization; its long-horizon, research-driven nature lets it avoid Crypto's pitfalls. Artificial intelligence promises productivity, not easy money.
2. The value capture of large models may resemble the semiconductor industry, with a few winners emerging in each geopolitical region. Superpowers will also rarely use each other's models.
3. The current ideology of AI resembles a blank canvas; ideally, at the infrastructure level, it should be neutral, while at the application level, it can have clearer directional guidance. The development of AI needs to find a mission that aligns with social progress.
4. Foundational models will be controlled by a handful of tech giants; the opportunities in AI lie in the application layer, especially where it meets industries that demand specialized knowledge and command the largest markets. Once a technology is abundant, it becomes hard to capture value from the technology itself; value will accrue to those who build Systems of Intelligence.
A nuclear winter seems to have settled over the tech world: software companies, SPACs, fintech, and cryptocurrencies are all frozen over. Artificial intelligence may be the only sector still keeping warm.
Peak-AI indicators are everywhere. Deep-dive essays from venture firms and Twitter brainstorms about AI are at all-time highs. The loyal fans of the cryptocurrency boom have migrated to the new playground of AI. In this circle, MBAs may soon outnumber the geeks.
Just a year ago, crypto had a similar hype bubble. Can AI avoid the same kind of bust? The crypto industry offers a cautionary tale of over-speculation, with four failings worth examining:
Capital: The sources of capital for cryptocurrencies are disconnected from traditional venture capital, distorting project valuations and liquidity timelines.
Mission: Founders have abandoned the decentralized founding mission of cryptocurrency.
People: Token-flipping speculators drowned out the idealistic builders.
Value Creation: Founders created more value for token holders than for ordinary users.
Both the AI and crypto industries are overheated, but that doesn't mean they will collapse the same way. If the AI industry develops more prudently, it can escape the collapse of confidence that follows over-inflated expectations.
Capital
In the early 2010s, no institutional investors spent time on cryptocurrencies. Often they couldn't: venture LP agreements restricted investments in assets like crypto. By 2015, a few venture firms had worked through the LPA restrictions on cryptocurrencies.
In 2017, institutional crypto investment changed: every venture firm began seriously hunting for deals, most startups had a crypto strategy, and dedicated crypto funds emerged. By 2020, the loose funding environment let funds deploy excess capital, adding rocket fuel to an already speculation-prone category.
Two capital-related issues arose: capital misalignment with social progress, and the short-termism of VCs.
Constant capital, non-constant progress: countless crypto-specific funds provided a steady stream of capital into the ecosystem. It looked like a sensible division of labor: generalist venture firms could tune out the crypto noise, and crypto founders could raise from more specialized investors.
As market enthusiasm rose, the rapid liquidity of cryptocurrency funds meant that specialized funds quickly expanded to billions of dollars in assets under management, comparable to the scale of general venture capital funds.
But when the music stops, crypto-specific venture firms cannot weigh crypto deals against the opportunity cost of startups in other industries. That opportunity cost seems trivial, yet it acts as a governor: if industry B is progressing faster than industry A, more incremental risk capital should flow to industry B.
A constant inflow of capital, set against non-constant (and likely declining) progress in user adoption and business success, decouples crypto startup valuations from those of traditional tech startups. Investors rationalize the prices: "Token X is better than token Y, which is valued at $10 billion, so investing at $1 billion is no problem."
VC short-termism: because crypto is a financialized technology, the lure of quick wealth altered the mindset of founders and venture firms alike.
Venture firms typically wait more than ten years for returns. But when investors see they can get liquidity on token allocations in under three years, short-termism seduces them against historical norms: firms opt for short lockups and push for token sales long before anything resembling an IPO.
The economic incentives are simply not aimed at building and funding real products over a long time horizon. If you've already cashed out and walked away, why keep building?
Artificial intelligence avoids these capital-related challenges.
Generalist funds keep artificial intelligence on the right track. Dedicated AI funds will continue to exist, but they won't dominate the industry the way crypto funds did. Technology-specific funds tend to concentrate on infrastructure companies, which are highly oligopolistic—as of 2023, there won't be many new VC-backed AI infrastructure winners. In today's AI boom, the real opportunity for venture firms lies in specific applications.
When a technology becomes ubiquitous, dedicated funds for it stop making sense: no one would raise a specialized fund to invest in companies that use databases. Outside the infrastructure realm, "AI company" will stop being a meaningful label (just as no one today calls themselves an internet company to stand out). A tech company will simply be seen as one that uses machine learning as part of its toolbox.
What kind of venture capital firms will perform well in the AI bull market? Likely generalist funds that see the big picture: buyer psychology, distribution strategy, compounding advantages.
With generalist investors at the helm, AI should not suffer from runaway capital. Machine-learning-driven products interact deeply with the real economy—customers, competitors, and investors provide regular reality checks.
AI is structurally long-term: some early employees will cash out in secondary sales, but artificial intelligence has no built-in pump-and-dump playbook. Its nature is quite the opposite.
Unlike cryptocurrencies, artificial intelligence lacks an inherent liquidity mechanism.
Despite the hype, the commercialization of artificial intelligence is still early—real, lasting businesses take time to mature.
In a high-interest-rate environment, only businesses with real operations can exit through public markets or mergers and acquisitions.
Profiting from artificial intelligence will very likely take longer than it did for previous generations of software and internet companies.
The notion of an AGI (Artificial General Intelligence) that could be infinitely valuable may lead people to overestimate the call-option value of AI assets. But the current macro and micro capital environment should keep bubbles in check.
Mission
Entrepreneurs have tried to create digital currencies since the 1990s, and for years the attempts failed. It wasn't until 2008 that Bitcoin showed proof-of-work plus a blockchain to be the first satisfactory solution.
The Bitcoin white paper made its use case clear: circumvent traditional financial institutions. Crypto's mission is anarcho-capitalist and revolutionary, and its founder was the technology's best evangelist. As Satoshi Nakamoto put it:
"We are required to trust that central banks will not devalue currency, but the history of fiat currency is full of breaches of that trust. We are required to trust that banks hold our money and transfer it electronically, but they lend out money in wave after wave of credit bubbles with almost no reserves. We have to hand over our privacy to them…"
The ideology of Crypto drove early applications. Decentralized currency proved to have use cases, whether legal or illegal: anonymously buying pizza, purchasing drugs on Silk Road, escaping the fiat system of tyrannical governments. By exchanging cash for Bitcoin, you could escape the oversight of central banks.
Later, this mission became more ambiguous. The decentralized purpose of cryptocurrency was lost: activities became centralized, transactions were tracked, and KYC and AML provisions increased. Perhaps cryptocurrency must integrate with the financial system to gain popularity, but if blockchain is not decentralized, what is it really doing?
As Crypto evolved into Web3, the scope of its mission quietly shifted. Crypto began as decentralized currency, then morphed into the decentralization of everything: builders turned to blockchain-based social networks, decentralized gaming, and NFT ticketing. Crypto is digital currency, but people felt it needed to be more.
Even now that the crypto industry has been corrupted, its original anti-establishment mission can still get you high. That mission lives on in the founding white papers, in forum discussions, in architectures, and in applications.
If decentralization is the founding mission of Crypto—what is the corresponding mission of artificial intelligence? Hollywood does not paint a positive picture; in every dystopian sci-fi movie, AI makes an appearance: HAL 9000 in "2001: A Space Odyssey," Skynet in "The Terminator," Samantha in "Her," and Ava in "Ex Machina."
Beyond the movies, people tell credible dystopian AI stories about the further entrenchment of wokeism or, worse, AI replacing human jobs. Artificial intelligence could become a symbiotic partner to human creativity and thought, but the optimistic stories still need to be told.
The positive mission of artificial intelligence is not as specific or clear as that of cryptocurrency, or at least not yet.
No established founding mission: the technology of AI does not itself encode a sense of mission, unlike blockchain, which is trustless and therefore anti-establishment by design. That may be a good thing: artificial intelligence is a blank ideological canvas that can be shaped by human will.
Many people are nonetheless excited about the mission of artificial intelligence. What exactly are they excited about? The ideology behind AI is some form of post-scarcity or abundance—ranging from prosperity-expanding capitalism to outright communism. In AI's early modern era (the mid-2010s), the mission was effective altruism: build AGI, then donate the profits to society.
The stated missions of many AI companies echo the failed socialist experiments of the twentieth century, scaring off conservatives and libertarians. But there is a clear difference: generating abundance through technology has better precedents than European socialism, which redistributed wealth through political power.
Centralized control: control over foundational AI models seems likely to centralize. Open-source models may have a future, but the models from large private companies perform far better. Even crypto ended up far more centralized than its vision called for.
My best mental model for AI centralization is the semiconductor industry: there are many trailing-edge chip suppliers, but few (valuable) leading-edge ones.
If we think about the history of politics, centralization sounds dangerous, but decentralizing a technology is not necessarily a good thing—if AI becomes omnipotent, you wouldn't want everyone to have nuclear launch codes.
Mission neutrality: most industries split into applications and infrastructure. In software, for example, cloud providers supply the infrastructure while software companies build the applications. Bitcoin is monolithic—the infrastructure is the application—which means the technology's mission is defined at the infrastructure level.
Most enabling technologies are relatively mission-less: cloud infrastructure, semiconductors, mobile communications, and even the internet. Perhaps AI should be mission-neutral at the infrastructure level, while the AI mission will be defined at the application level. DeepMind's stated mission seems to point in the right neutral direction: "Solve intelligence, then use that to solve everything else."
Foundational models should power a wide range of application layer tasks—political, social, and economic. Just as AWS should not censor developers except in the most extreme cases, large models should avoid censoring AI applications.
Unlike cloud infrastructure, foundational models carry training biases. But they also offer a fine-tuning lever that restores flexibility of mission at the application layer: developers can generate Fox-weighted or NYT-weighted results. The political biases in today's large models are a problem that needs a long-term solution.
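To make that fine-tuning lever concrete, here is a minimal sketch—my illustration, not the author's—of continuing to train a small open model on an outlet-specific corpus so that downstream generations inherit that outlet's tone. The model name, file path, and hyperparameters are assumptions for illustration.

```python
# Minimal sketch of the application-layer "fine-tuning lever" (illustrative only).
# Assumes Hugging Face transformers + datasets and a hypothetical corpus file
# "outlet_articles.txt" containing one article per line from a single outlet.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base_model = "gpt2"  # small stand-in for a foundational model
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base_model)

# Hypothetical outlet-specific corpus used to tilt the model's voice.
corpus = load_dataset("text", data_files={"train": "outlet_articles.txt"})
tokenized = corpus["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="outlet-weighted-model",
        num_train_epochs=1,
        per_device_train_batch_size=4,
    ),
    train_dataset=tokenized,
    # Causal-LM objective: the collator copies inputs to labels (mlm=False).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # the resulting checkpoint generates "outlet-weighted" text
```

An application could expose several such checkpoints side by side and let users choose which editorial lens to read through, keeping the base model itself neutral.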
The mission of artificial intelligence still needs refinement, and who controls it is a topic worth public discussion. But in a world of centralized infrastructure, the best mission may be neutrality.
People
The pioneers of cryptocurrency were libertarians and anarchists, but as the ecosystem matured, a disorderly crowd of scammers flooded in and drew regulatory scrutiny.
Cultural drift: during the 2008-2016 era, only tech experts and cryptocurrency enthusiasts bought and built cryptocurrencies.
Crypto's initial applications required its original sin: a tokenized architecture. The corrosive pull of financialization ultimately overwhelmed the technology's utility.
Years later, scammers rode the coattails of ICOs. Easy money corrupted even the technological purists: crypto founders could mint millions in liquid tokens within one to two years, with no users and no oversight.
The returns from cryptocurrency became a superstition—a culture-driven positive feedback loop. Hype-driven folklore propelled retail adoption. WAGMI, Bitcoin's rainbow chart, supercycle theory, standing on the "right" side. Anyone could profit from the technological revolution!
During the loose monetary ZIRP (Zero Interest-Rate Policy) period of 2020 and 2021, the superstitious retail wave created a $3 trillion crypto market cap. Unfortunately, most people got burned.
Regulatory creep: ruled by chaos and short-termism, crypto failed to win regulatory support. Some crypto companies struggled to earn institutional trust, while others worked to undermine it—and the companies closest to regulators eroded that trust the most.
The crypto crowd was too anti-establishment to win over regulators, yet too disorganized internally to complete the anti-fiat mission. In 2023, buying Bitcoin requires centralized onboarding with your SSN and passport details, and every transaction can be tracked by intelligence agencies. Regulators have led the crypto industry straight into their trap.
When the bubble burst, the SEC, FinCEN, and OFAC piled on relentlessly. Given how deeply entrenched centralized financial institutions are, regulatory support may never have been attainable. Either way, the centralized side has won.
Artificial intelligence has a purer talent arc, but it also faces the risk of regulatory scrutiny.
Cultural scaling challenges: AI's formative years were steeped in research and academic publishing, which kept technologists, not financiers, in control.
In 2023, AI must absorb a large influx of tech tourists—thin GPT-wrapper startups, MBA Twitter threads on AI trends, LinkedIn profiles flipping from #crypto to #AI. Real technologists will join the build-out, but filtering out the negative human capital is hard.
Undoubtedly, more builders are needed to apply and produce the latest technology, but rapidly expanding the number of people can lead to indigestion. Most great products are built by a very small number of highly skilled individuals.
The best cultural control mechanism is accountability for technologists: crypto's tokenization amplified the voices of scammers, while in artificial intelligence there is nothing to trade—participants have to build or apply the core technology. Artificial intelligence promises productivity, not easy money.
Engineers without machine-learning backgrounds can still contribute here. You don't need a database PhD to use a database—we have already climbed the ladder of abstraction. As foundational models improve rapidly, it gets ever easier for generalist engineers to build AI applications. The limiting factor will be founders and core employees with end-market expertise and distribution advantages.
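As an illustration of that ladder of abstraction—a hypothetical sketch, not something from the article—a generalist engineer can ship an AI feature with a few API calls and no ML background. It assumes the openai Python client (v1+), an OPENAI_API_KEY in the environment, and an illustrative model name and prompt.

```python
# Hypothetical example: a help-desk feature built on a hosted foundational model.
# No ML expertise required—just an API call (the model name is an assumption).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_ticket(ticket_text: str) -> str:
    """Condense a raw customer ticket into a one-line summary for triage."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any hosted foundational model would do
        messages=[
            {"role": "system", "content": "Summarize the support ticket in one sentence."},
            {"role": "user", "content": ticket_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarize_ticket("My March invoice was charged twice and support hasn't replied."))
```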
Regulatory risks: as with any technology's expansion, people inevitably impose political burdens on it. AI is no exception, especially considering it involves many end consumers. Depending on whom you ask, current AI models are either too "woke" or not woke enough.
The trope of AGI replacing human labor predisposes regulators against it. AI safety researchers are working to head off the human-versus-machine showdowns of sci-fi movies, but as models improve, regulators will put up strict guardrails.
AI's strategic importance means regulators will draw hard geopolitical boundaries. We won't be powering our applications with Chinese models anytime soon—even the AI-driven TikTok has drawn heavy scrutiny from both parties.
AI's research-first origins and longer-horizon business models should keep it from being overrun by tech tourists. But political controversy and regulation will slow things down.
Value Creation
In cryptocurrency, there has been serious debate about whether there is a single Web3 use case that creates user value. Let's define the net value created as:
Net value of Crypto = Value of activities the world can conduct with crypto - Value of activities the world can conduct without crypto
It is still unclear whether this total is positive, zero, or negative. The first half of the equation certainly has positive components—such as people using Bitcoin to escape the fiat system of tyrannical governments.
But there are also plenty of negative components: engineers' time spent on crypto instead of other high-return technical work; holdings lost on FTX, BlockFi deposits gone, altcoins down 90%.
Crypto leaders claim we are still in the infrastructure phase. Infrastructure is necessary but not sufficient for creating value—sustained usage matters more than trading volume.
Developers innovating at the application layer face release delays or active regulatory resistance.
Successful crypto infrastructure startups mostly provide infrastructure for other crypto infrastructure companies and for traders. Leverage within the crypto ecosystem has ballooned: trading leverage, crypto-denominated lending, crypto-backed reserve policies.
Web3 lingers in a value-neutral, self-referential infrastructure phase. There are exciting research frontiers, like zero-knowledge proofs, that could unlock new growth. But across most of the commercial stack, user adoption has stagnated.
Artificial intelligence, by contrast, is already being used to do valuable things, judging by its track record.
Big tech has been leveraging it for a decade: product recommendations, news feeds, spam filtering, ad personalization. Within days of ChatGPT's launch, countless concrete use cases appeared.
Perhaps ChatGPT and generative AI will not evolve into powerful, reliable systems. Journalists (ironically, they may be the first to be replaced) find it superficial and overhyped. But it is hard to ignore its practical applications. Copilot accelerates programming, Jasper simplifies copywriting, Midjourney and DALL-E exhibit extraordinary artistry, and ChatGPT provides analysis and answers questions.
The negative effects of AI are harder to pin down. The TikTok algorithm may be the best example of a generation's wasted time, though social media consumption would likely still be high in a world without AI.
How will value be captured in AI? Consider two layers.
Infrastructure Oligopolies.
If you believe in sustained returns from scaling compute, data, and parameters, the large-model race should play out as an oligopoly: scaling models in the 2020s is a game with a billion-dollar ante. The value capture of large models may resemble that of semiconductors, with a few winners emerging in each geopolitical region.
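The "sustained returns" assumption has a published shape worth keeping in mind (an external reference, not part of the original article): neural scaling laws model loss as a power law in parameters and training data, for example the Chinchilla-style form

```latex
% Chinchilla-style scaling law (Hoffmann et al., 2022), illustrative form:
% N = parameter count, D = training tokens, E = irreducible loss,
% A, B, \alpha, \beta = fitted constants.
L(N, D) \;\approx\; E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```

As long as the fitted exponents stay positive, more parameters and more data keep buying lower loss—which is precisely the bet behind the billion-dollar ante.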
Application Proliferation.
Unlike infrastructure, value capture at the application layer will be decentralized. It is already late to start a new foundational-model company, but the application layer remains under-covered by venture firms. Many billion-dollar applications will be built on top of large models.
If foundational models improve too quickly, the thinnest GPT-wrapper companies are in trouble. But applications with proprietary domain-specific training data, unique distribution, and complex integrations will endure.
Some startups position themselves as "full-stack," spanning application and infrastructure by building custom models to power domain-specific products. That will be hard: fine-tuning an excellent general model is easier than recreating one from scratch. To borrow an analogy from today's tech world, few software companies should build their own custom hardware.
A superintelligent AGI could create negative value—in the words of my boss Peter Thiel, it could be a "Zeus throwing lightning at people." That tail risk makes AI safety work worthwhile. But alignment has worked in other domains: financial regulation aligns capital (itself, in some sense, a form of AI) with humanity.
Productizing AI systems is a nascent endeavor, but in net-value terms it already seems to dwarf what cryptocurrency has created.
Conclusion
Can the artificial intelligence industry hold on to the confidence that the crypto industry lost? It needs to get these things right:
Capital: AI has a long capital feedback cycle, which helps control bubbles. But applications must be productized to justify sustained capital inflows.
Mission: The ideology of artificial intelligence is still a blank canvas, and the technology itself is less inherently corruptible. But it needs a positive—or at least neutral, apolitical—mission to overturn the dystopian pessimism.
People: The industry will get its share of scammers, but AI's leaders cannot let them take control of the ecosystem.
Value Creation: AI adoption seems promising, but transitioning from interesting toys to reliable tools is not easy.
In the long run, value creation should dominate, and artificial intelligence seems to be winning effortlessly. Businesses and consumers benefit from this technology, even in its nascent stage.
People are operating within Gartner's hype cycle framework: what goes up must come down.
The early shape of the Gartner curve is right, but the Plateau of Productivity is misleading: it varies widely by industry. Some plateaus go to zero—private cloud computing, for example—while others end up above the Peak of Inflated Expectations.
Crypto may get only the small plateau that Gartner's curve implies. If I had to guess, AI's plateau will far exceed its hype peak.