
In the AI era, how can Web3 enterprises compete with traditional artificial intelligence giants?

ChainCatcher · 2024/05/12 12:37

This article is not blindly optimistic advocacy, but a profound reflection on the challenges of reality and the opportunities of the future.

Original Title: "Flipping the AI coin"

Original Author: Gagra Ventures

Original Translation: ChainCatcher

Editor's Note: Looking past the technology's halo, the author sees the many obstacles, such as capital and hardware, that Web3 projects face in advancing AI development. Although Web3 set out to break centralization and realize decentralized ideals, in practice it is often swayed by market narratives and token incentives, drifting from that original purpose.

ChainCatcher's translation of the original text is as follows:

Calls for the integration of AI and Web3 are growing louder, but this is not another starry-eyed venture capital piece. We are optimistic about merging the two technologies, yet the text below is a call to action; without one, that optimism will not materialize.

Why? Because developing and running the best AI models requires massive capital expenditure; the most advanced hardware is often hard to come by; and the work demands very specific domain research. Crowdsourcing these resources through crypto incentives, as most Web3 AI projects are doing, is not enough to counter the hundreds of billions of dollars invested by the large companies that control AI development. Given the hardware constraints, this may be the first major software paradigm that smart and creative engineers outside existing organizations cannot break into.

Software is "eating the world" at an ever-faster pace and will soon grow exponentially as artificial intelligence accelerates. For now, this entire "cake" is flowing to the tech giants, while end users, governments and large enterprises included, are increasingly beholden to their power.

Misaligned Incentives

All of this is happening at the worst possible time: 90% of decentralized network participants are busy chasing the "golden egg" of easy, narrative-driven fiat profits.

Developers follow our industry's investors, rather than the other way around. This manifests in many forms, from open acknowledgment to subtler subconscious motives, but narratives, and the markets they form, drive many decisions in Web3. As in classic reflexive bubbles, participants are too focused on the inner world to notice the outer one, except where it helps push this cycle's narrative further. And AI is obviously the biggest narrative, since it is itself in a stage of explosive development.

We have interacted with dozens of teams at the intersection of artificial intelligence and cryptocurrency, and can confirm that many of them are highly capable, mission-driven, and passionate builders. But human nature is such that in the face of temptation, we often succumb to it, and then rationalize these choices afterwards.

The lure of quick liquidity has always been the crypto industry's curse; by now it has delayed years of development and valuable adoption. It has turned even the most faithful crypto believers toward "pumping tokens," the rationale being that token holders might thereby fare better.

The low sophistication of institutional and retail capital lets builders make claims detached from reality while still benefiting from valuations as if those claims had already been realized. The result is entrenched moral hazard and capital destruction, and few such strategies work in the long run. Necessity is the mother of all invention; when necessity disappears, so does the invention.

The timing could not be worse. While the smartest tech entrepreneurs, state actors, and large enterprises race to secure their share of the AI revolution, crypto founders and investors opt for the "quick 10x." In our view, that is the real opportunity cost.

Overview of Web3 AI Prospects

Given the aforementioned incentive mechanisms, Web3 AI projects can actually be classified as:

  • Reasonable (which can also be further divided into realists and idealists)
  • Semi-reasonable
  • Fraudulent

In essence, we believe builders should clearly understand how to keep pace with their Web2 competitors and know which arenas are winnable and which are delusional, even if the delusional ones can still be marketed to venture firms and the public.

Our goal is to be competitive here and now. Otherwise the pace of AI development may leave Web3 behind, and the world will leap instead to a "Web4" split between Western corporate AI and Chinese state AI. Those who cannot become competitive in time and count on distributed technology catching up over a longer horizon are overly optimistic and not to be taken seriously.

Obviously this is only a very rough taxonomy; even in the "fraudulent" category there are at least a few serious teams (and perhaps more who are merely dreamers). But this article is a call to action, so we are not aiming for objectivity; rather, we are calling on readers to feel a sense of urgency.

Reasonable:

The few founders building "AI on-chain" middleware understand that it is currently impractical, if not impossible, to train or run inference on cutting-edge models in a decentralized manner. Finding a way to connect the best centralized models to the on-chain environment, so those models can benefit from sophisticated automation, is therefore a good first step for them. Hardware-isolated TEEs (Trusted Execution Environments) that can host API access points, bidirectional oracles (for indexing on-chain and off-chain data in both directions), and coprocessor architectures that give agents a verifiable off-chain compute environment currently look like the best solutions.
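As an illustration of the coprocessor pattern described above, here is a minimal sketch (our own, not from the article; all function names are invented): the model call happens off-chain, and only a compact hash commitment to the input/output pair would be stored on-chain for later verification.

```python
# Sketch of the off-chain "coprocessor" pattern: compute off-chain,
# commit a digest on-chain, verify later. Illustrative only.
import hashlib
import json

def run_offchain(prompt: str) -> str:
    # Stand-in for a call to a centralized model behind a TEE-hosted API.
    return f"answer to: {prompt}"

def commitment(prompt: str, output: str) -> str:
    # The hash binds the (input, output) pair; a contract would store
    # only this 32-byte digest, not the data itself.
    payload = json.dumps({"in": prompt, "out": output}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def verify(prompt: str, claimed_output: str, onchain_digest: str) -> bool:
    # Anyone holding the raw pair can recompute and check the digest.
    return commitment(prompt, claimed_output) == onchain_digest

out = run_offchain("rebalance portfolio?")
digest = commitment("rebalance portfolio?", out)
print(verify("rebalance portfolio?", out, digest))        # True
print(verify("rebalance portfolio?", "tampered", digest)) # False
```

This only guarantees integrity of a claimed input/output pair; trusting that the output actually came from the stated model is exactly what the TEE or proof system is for.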

There is also a coprocessor architecture that uses zero-knowledge proofs (ZKPs) to snapshot state changes rather than verify full computations, which we also consider feasible in the medium term.

A more idealistic approach to the same problem tries to verify off-chain inference so that it matches on-chain computation in its trust assumptions.

We believe the goal should be a unified operating environment in which AI performs both on-chain and off-chain tasks. However, most proponents of verifiable inference talk about "trusting model weights" and other thorny goals that will only become relevant in several years, if ever. Founders in this camp have recently begun exploring alternatives to verifying inference, though initially everything was ZKP-based. While many smart teams are working on zero-knowledge machine learning (ZKML), they are betting that cryptographic optimization will outpace the complexity and compute requirements of AI models, which we consider too risky a bet. Hence we think they are not yet ready to compete, though some recent developments are interesting and should not be ignored.

Semi-Reasonable:

Consumer applications that wrap closed- and open-source models (e.g., Stable Diffusion or Midjourney for image generation). Some of these teams lead the market and have won real users, so it would be unfair to lump them all in with the frauds; but only a few are thinking deeply about how to develop their underlying models in a decentralized way and innovate on incentive design. There are some interesting governance/ownership designs on the token side. Most projects in this category, however, have not tackled decentralized training and inference of large models. Today it is impossible to train a foundation model in a reasonable time frame without tightly interconnected hardware clusters, and given the level of competition, "reasonable time" is the key constraint.

There have been some promising research results recently; in theory, methods such as "differential data flow" may eventually extend to distributed computing networks and raise their capacity (as network capabilities catch up to data-flow requirements). Competitive model training, however, still demands communication among localized clusters rather than single distributed devices, plus cutting-edge compute (retail GPUs are becoming ever less competitive).
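To see why single distributed devices lose to tightly connected clusters, a back-of-the-envelope look at data-parallel gradient synchronization helps: every training step must exchange all gradients, and exchange time scales with model size over link bandwidth. The model size and bandwidth figures below are illustrative assumptions, not benchmarks.

```python
# Rough cost of syncing gradients once per step with a ring all-reduce.
# All numbers are illustrative assumptions, not measurements.

def allreduce_seconds(params: float, bytes_per_grad: int,
                      workers: int, link_bw_bytes: float) -> float:
    payload = params * bytes_per_grad
    # A ring all-reduce moves roughly 2*(w-1)/w of the payload per link.
    return 2 * (workers - 1) / workers * payload / link_bw_bytes

params = 7e9            # a hypothetical 7B-parameter model
home_internet = 125e6   # ~1 Gbit/s consumer uplink, in bytes/s
datacenter = 50e9       # ~400 Gbit/s interconnect-class, in bytes/s

slow = allreduce_seconds(params, 2, 64, home_internet)
fast = allreduce_seconds(params, 2, 64, datacenter)
print(f"per-step sync: {slow:.0f}s over home links vs {fast:.2f}s in-cluster")
```

Under these assumptions a single synchronization step takes minutes over consumer links versus well under a second inside a cluster, which is the bandwidth bottleneck the text refers to.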

Research has also recently advanced localized inference through model-size reduction (one of the two paths to decentralization), but no existing Web3 protocol is yet exploiting it.

The problem of decentralized training and inference leads logically to the last and most important of the three camps, and therefore the one that triggers us most emotionally.

Fraudulent:

Infrastructure applications focused mainly on distributed servers, offering bare hardware or decentralized environments for model training and hosting. Some software-infrastructure projects push protocols like federated learning (decentralized model training), or combine software and hardware into platforms where people can train and deploy decentralized models end to end. Most lack the sophistication needed to actually solve the problem, and the naive "token incentives + market tailwind" idea prevails. None of the solutions we see in public or private markets offers anything meaningfully competitive here and now. Some may evolve into viable (but niche) products, but what we need now is fresh, competitive solutions, and those can only come from designs that attack the bottlenecks of distributed computing. In training, speed is a major problem, but so are the verifiability of completed work and the coordination of training workloads, which compound the bandwidth bottleneck.

To matter, we need a competitive set of truly decentralized foundation models, and those require decentralized training and inference. Losing AI could negate everything achieved since Ethereum's emergence as the "decentralized world computer." If computers become AI and the AI is centralized, the world computer becomes nothing but a dystopian version of itself.

Training and inference are the core of AI innovation. While the rest of the AI world moves toward ever tighter architectures, Web3 needs orthogonal solutions to compete, because the feasibility of competing head-on is becoming ever lower.

Scale of the Issue

Everything comes down to compute. The more you invest in training and inference, the better the results. Yes, there are tweaks and optimizations here and there, and compute itself is not homogeneous; there are now all sorts of novel approaches to overcoming the bottlenecks of traditional von Neumann processing units. But it all boils down to how many matrix multiplications you can perform over how large a block of memory, and how fast.
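The "matrix multiplications versus memory" trade-off above is the classic roofline argument, which can be sketched in a few lines. The accelerator figures below are hypothetical round numbers, not any real chip's spec:

```python
# Roofline sketch: a matmul's arithmetic intensity (FLOPs per byte moved)
# decides whether a chip is limited by its compute rate or its memory.
# Hardware numbers here are illustrative, not a real device.

def matmul_stats(m: int, k: int, n: int, bytes_per_elem: int = 2):
    flops = 2 * m * k * n                              # multiply-accumulates
    bytes_moved = bytes_per_elem * (m*k + k*n + m*n)   # read A, B; write C
    return flops, bytes_moved, flops / bytes_moved     # arithmetic intensity

# Hypothetical accelerator: 1e15 FLOP/s peak, 3e12 B/s memory bandwidth.
PEAK_FLOPS, PEAK_BW = 1e15, 3e12
ridge = PEAK_FLOPS / PEAK_BW  # intensity above which compute is the limit

flops, nbytes, intensity = matmul_stats(4096, 4096, 4096)
bound = "compute-bound" if intensity > ridge else "memory-bound"
print(f"intensity={intensity:.0f} FLOP/B, ridge={ridge:.0f} -> {bound}")
```

Large square matmuls land well above the ridge point, which is why AI workloads reward ever more raw compute, exactly the scaling dynamic the article describes.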

This is why the so-called hyperscalers are investing so heavily in data centers, each hoping to own the full stack, with AI models at the top and the hardware powering them at the bottom: OpenAI (models) + Microsoft (compute), Anthropic (models) + AWS (compute), Google (both), and Meta (increasingly, via its own data-center build-out). There are more nuances, dynamics, and stakeholders, but we will not list them all. Taken together, the hyperscalers are pouring unprecedented billions into data-center construction and building synergies between their compute and AI products, expecting outsized returns as AI spreads through the global economy.

Just look at the expected levels of build-out from these four companies this year alone:

NVIDIA CEO Jensen Huang has said that a total of $1 trillion will be invested in AI acceleration in the coming years. He recently doubled that prediction to $2 trillion, reportedly on account of the interest he is seeing from sovereign players.

Analysts at Altimeter predict that global AI-related data-center spending will exceed $160 billion in 2024 and $200 billion in 2025.

Now compare those numbers with the incentives Web3 offers independent data-center operators to expand capital expenditure on the latest AI hardware:

The total market value of all decentralized physical infrastructure (DePIN) projects currently sits at roughly $40 billion, composed mainly of relatively illiquid, speculative tokens. Essentially, these networks' market caps are an upper-bound estimate of the total capital expenditure their stakeholders will contribute, since the tokens are what incentivizes the build-out. That said, the market cap already issued is of little use here, since it has already been emitted.

So let's assume that over the next 3-5 years an additional $80 billion (twice the current value) of private and public DePIN token capital comes to market as incentives, and that these tokens go 100% to AI use cases. Even dividing that estimate roughly by 3 (years) and comparing its dollar value with the cash the hyperscalers are investing in 2024 alone, it is clear that slapping token incentives on a pile of "decentralized GPU network" projects is not enough.
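The rough division in the paragraph above can be restated as code; the figures are the article's own estimates, not measurements:

```python
# The article's back-of-the-envelope comparison, restated.
# $80B of assumed new DePIN token incentives spread over ~3 years,
# versus the ~$160B AI data-center spend Altimeter estimates for 2024.

depin_new_tokens_usd = 80e9        # assumed new DePIN token capital
years = 3
annual_depin_incentive = depin_new_tokens_usd / years

hyperscaler_ai_capex_2024 = 160e9  # Altimeter's 2024 estimate

ratio = hyperscaler_ai_capex_2024 / annual_depin_incentive
print(f"DePIN incentive: ~${annual_depin_incentive/1e9:.1f}B/yr; "
      f"hyperscalers outspend that roughly {ratio:.0f}x in 2024 alone")
```

Even under the generous assumption that every incentive dollar buys AI hardware, the annualized DePIN figure is outspent several times over in a single year.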

Moreover, billions of dollars of investor demand are needed to absorb these tokens, since the networks' operators must sell large quantities of mined tokens to cover significant capital and operating costs. Still more money is needed to push token prices up and incentivize enough further build-out to surpass the hyperscalers.

Anyone with a close understanding of how Web3 servers operate today might also note that much of this "decentralized physical infrastructure" actually runs on those same hyperscalers' clouds. Of course, surging demand for GPUs and other AI-specific hardware is driving new supply, which should eventually make cloud rental, or outright purchase, cheaper. At least, that is the expectation.

But note that NVIDIA now has to prioritize among customers demanding its latest-generation GPUs, and it has begun competing with the largest cloud providers on their own turf, offering AI platform services to enterprises already locked into those hyperscale clouds. This will ultimately either push it to gradually stand up data centers of its own over time (eroding the fat margins it currently enjoys, so unlikely) or to significantly restrict its AI hardware sales to its network of cloud providers.

Moreover, NVIDIA's competitors launching additional AI-specific hardware mostly rely on the same chip maker NVIDIA does: TSMC. Essentially all AI hardware companies are therefore competing for TSMC capacity, and TSMC likewise must prioritize certain customers. Samsung, and potentially Intel (which is trying to re-enter cutting-edge fabrication quickly to produce chips for its own hardware), may absorb some extra demand, but TSMC produces most AI-related chips today, and scaling and calibrating cutting-edge fabrication (3 nm and 2 nm) takes years.

Finally, China is largely cut off from the latest generation of AI hardware by U.S. restrictions on NVIDIA and TSMC. Unlike Web3, Chinese companies do have their own competitive models, notably the LLMs of companies like Baidu and Alibaba, which need large numbers of previous-generation devices to run.

For one or more of the above reasons, there is a non-trivial risk that, as the AI race intensifies and takes precedence over the cloud business, the hyperscalers restrict outside access to their AI hardware: essentially keeping all AI-related cloud capacity for themselves, no longer offering it to others, while also absorbing all the latest hardware. Either way, everyone else, sovereign nations included, faces far greater competition for the remaining compute supply, while leftover consumer-grade GPUs grow less and less competitive.

Obviously this is an extreme scenario, but if hardware bottlenecks persist, the biggest players will lock up supply even at steep premiums. That would leave decentralized operators, the secondary data centers and retail hardware owners who make up the majority of Web3 DePIN providers, squeezed out of the competition.

The Other Side of the Coin

While crypto founders are still dreaming, the AI giants are watching crypto closely. Government pressure and competition may push them toward crypto to avoid being shut down or heavily regulated.

The founder of Stability AI recently resigned in order to "decentralize" his company, one of the earliest public hints. He had made no secret of his plan to launch a token after a successful public listing, which somewhat gives away the real motivation behind the anticipated move.

Similarly, while Sam Altman has no operational role in Worldcoin, its token undoubtedly trades like a proxy for OpenAI. Whether there is a path to connecting the internet-money token project with the AI research project, only time will tell, but the Worldcoin team also seems to recognize that the market is testing this hypothesis.

For us, it is very telling to watch the AI giants explore different paths to decentralization. What we see, once again, is that Web3 has not produced meaningful solutions of its own. "Governance tokens" are mostly a joke at this point; today, only tokens that explicitly avoid direct ties between asset holders and the development and operation of their networks, such as BTC and ETH, are truly decentralized.

The same incentives that slow technological development also distort the design of governance for crypto networks. Startup teams slap a "governance token" on their product, hoping to find a path during the build-out, only to end up mired in "governance theater" around resource allocation.

Conclusion

The AI race is on, and everyone takes it very seriously. We can find no hole in the big tech companies' logic for scaling compute: more compute means better AI; better AI means lower costs, new revenue, and greater market share. To us, this means the bubble is justified, yet all the frauds will still be washed out in the inevitable shake-out ahead.

Centralized big-company AI dominates the field, and startups struggle to keep up. The Web3 field, though late, is joining the race too. But the market has rewarded crypto AI projects far too richly compared with their Web2 counterparts, shifting founders' focus from product delivery to pumping token prices at a critical moment, and that window is closing fast. So far, no innovation has found a way around scaling compute in order to compete.

Meanwhile, a credible open-source movement has emerged around consumer-facing models, initially driven by a few centralized players (e.g., Meta, Stability AI) choosing openness to compete with larger closed-source rivals. Now the community is catching up and putting pressure on the leading AI companies. That pressure will keep shaping closed-source AI development, but it will not be decisive until open source truly catches up. This is another major opportunity for Web3, but only if it solves decentralized model training and inference.

So while an opening for "classic" disruptors appears to exist on the surface, the reality is anything but. AI is bound tightly to compute, and without breakthrough innovation over the next 3-5 years, the critical window that will determine who controls and steers AI's development, that will not change.

The compute market itself, though pulled by demand, is constrained by structural factors such as chip manufacturing and economies of scale that prevent new suppliers from springing up everywhere.

We remain optimistic about human ingenuity and believe there are enough smart, honorable people to attempt solutions that keep AI beneficial to the free world rather than controlled top-down by corporations or governments. But the odds look slim, at best a coin flip, and Web3 founders are busy flipping coins for financial upside rather than real-world impact.

