
At first glance, AI and Web3 appear to be independent technologies, each built on fundamentally different principles and serving different functions. On closer inspection, however, the two have an opportunity to balance each other's trade-offs: their distinct strengths can complement and reinforce one another. Balaji Srinivasan elaborated on this idea of complementary capabilities at the SuperAI conference, inspiring a detailed comparison of how these technologies interact.
Crypto tokens emerged bottom-up from the decentralized efforts of anonymous cypherpunks and have evolved over a decade through the collaborative work of many independent entities around the world. Artificial intelligence, by contrast, has been developed top-down and is dominated by a handful of tech giants. These companies set the industry's pace and dynamics, and the barrier to entry is determined more by resource intensity than by technical complexity.
The two technologies are also fundamentally different in nature. Essentially, tokens are deterministic systems that produce immutable results, with the predictability of a hash function or a zero-knowledge proof. This stands in sharp contrast to the probabilistic, often unpredictable nature of artificial intelligence.
Similarly, cryptography excels at verification, ensuring the authenticity and security of transactions and enabling trustless processes and systems, whereas artificial intelligence excels at generation, creating rich digital content. In producing this digital abundance, however, ensuring content provenance and preventing impersonation become real challenges.
Fortunately, tokens offer the counterpart to digital abundance: digital scarcity. They provide relatively mature tools that can be generalized to AI technology to secure content provenance and prevent impersonation.
A significant advantage of tokens is their ability to attract large amounts of hardware and capital into coordinated networks that serve specific goals. This capability is particularly beneficial for AI, which consumes enormous amounts of compute. Mobilizing underutilized resources to provide cheaper computing power can significantly improve the economics of AI.
By contrasting these two technologies, we can not only appreciate their individual contributions but also see how together they can forge new paths for technology and the economy. Each technology can compensate for the other's shortcomings, creating a more integrated and innovative future. In this blog post, we explore the emerging AI x Web3 industry map, focusing on the emerging verticals at the intersection of these technologies.
Source: IOSG Ventures
2.1 Computing Network
The industry map opens with computing networks, which attempt to solve the problem of constrained GPU supply and to reduce computing costs in different ways. The following categories are worth attention:
- Heterogeneous GPU interoperability: A highly ambitious attempt, with high technical risk and uncertainty, but one that, if successful, could produce results of real scale and impact by making all computing resources interchangeable. Essentially, the idea is to build compilers and other prerequisites so that any hardware resource can be plugged in on the supply side, while on the demand side the non-uniformity of the hardware is completely abstracted away, so that a compute request can be routed to any resource in the network. If this vision succeeds, it would reduce the current dependence on the CUDA stack that dominates AI development. Despite its promise, many experts remain highly skeptical of the approach's feasibility; a minimal sketch of the routing idea appears after this list.
- High-performance GPU aggregation: Integrates the world's most sought-after GPUs into a distributed, permissionless network, without worrying about interoperability across heterogeneous GPU resources.
- Commodity consumer-grade GPU aggregation: Aggregates lower-performance GPUs found in consumer devices, the most underutilized resources on the supply side. It caters to users willing to trade performance and speed for cheaper, longer training runs.
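To make the abstraction idea concrete, here is a minimal sketch of demand-side routing over heterogeneous hardware: backends register behind a single interface, and a request is routed to the cheapest resource that fits. The `Backend` and `route` names are illustrative assumptions, not any specific project's API, and the compiler layer that would lower workloads to each device is assumed away.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical illustration: heterogeneous backends exposed behind one run()
# interface, regardless of the underlying vendor stack (CUDA, ROCm, Metal, ...).
@dataclass
class Backend:
    name: str                      # e.g. "cuda-a100", "rocm-mi300", "metal-m2"
    free_memory_gb: float
    price_per_hour: float
    run: Callable[[bytes], bytes]  # compiled workload in, result out

def route(request_mem_gb: float, backends: List[Backend]) -> Backend:
    """Pick the cheapest backend that can fit the request.

    The demand side never sees vendor differences: a compiler layer is assumed
    to have already lowered the workload to each backend's native format.
    """
    candidates = [b for b in backends if b.free_memory_gb >= request_mem_gb]
    if not candidates:
        raise RuntimeError("no backend with enough free memory")
    return min(candidates, key=lambda b: b.price_per_hour)

# Usage: two very different devices behind one interface.
backends = [
    Backend("cuda-a100", 80.0, 2.10, run=lambda payload: payload),
    Backend("consumer-rtx", 12.0, 0.25, run=lambda payload: payload),
]
chosen = route(request_mem_gb=8.0, backends=backends)
print(chosen.name)  # -> consumer-rtx (cheapest that fits)
```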
2.2 Training and Inference
Computing networks mainly serve two functions: training and inference. Demand for these networks comes from both Web 2.0 and Web 3.0 projects. In the Web 3.0 world, projects like Bittensor use computing resources to fine-tune models. On the inference side, Web 3.0 projects emphasize the verifiability of the process. This focus has spawned verifiable inference as a market vertical, with projects exploring how to integrate AI inference into smart contracts while preserving the principle of decentralization.
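For intuition, the sketch below shows a heavily simplified commit-and-verify flow: the model is committed to by hash, each inference is attested against that commitment, and a verifier checks the attestation. Real verifiable-inference systems replace the bare digest with a zero-knowledge or optimistic fraud proof that the computation was actually performed; all names here are hypothetical.

```python
import hashlib

# Simplified illustration only: a bare hash commitment stands in for the proof
# that production systems would generate cryptographically.
def commit(model_weights: bytes) -> str:
    """Publish a commitment to the model so users know which model serves them."""
    return hashlib.sha256(model_weights).hexdigest()

def attest_inference(model_weights: bytes, prompt: str, output: str) -> dict:
    """Node side: bind the output to the committed model and the exact input."""
    return {
        "model_commitment": commit(model_weights),
        "io_digest": hashlib.sha256((prompt + output).encode()).hexdigest(),
        "output": output,
    }

def verify(attestation: dict, expected_commitment: str, prompt: str) -> bool:
    """Contract-side check: does the attestation reference the expected model,
    and is the digest consistent with the claimed prompt/output pair?
    (A real system would also prove the model actually computed the output.)"""
    recomputed = hashlib.sha256((prompt + attestation["output"]).encode()).hexdigest()
    return (attestation["model_commitment"] == expected_commitment
            and attestation["io_digest"] == recomputed)

weights = b"...serialized model weights..."
attestation = attest_inference(weights, "2+2?", "4")
print(verify(attestation, commit(weights), "2+2?"))  # True
```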
2.3 Agent Platforms
Next come agent platforms. The map outlines the core problems that startups in this category need to solve:
- Agent interoperability, discovery, and communication: agents can discover and communicate with each other.
- Agent cluster construction and management: agents can form clusters and manage other agents.
- Ownership and markets for AI agents: providing ownership of, and marketplaces for, AI agents.
These capabilities underline the importance of flexible, modular systems that can be seamlessly integrated into a variety of blockchain and AI applications. AI agents have the potential to revolutionize how we interact with the internet, and we believe agents will lean on infrastructure to support their operations. We envision AI agents relying on infrastructure in the following ways (a minimal registry sketch follows this list):
- Access real-time web data using distributed crawling networks
- Use DeFi rails for agent-to-agent payments
- Post an economic deposit, not only to be slashed in case of misconduct but also to improve the agent's discoverability (i.e., using the deposit as an economic signal during discovery)
- Use consensus to decide which events should trigger slashing
- Adopt open interoperability standards and agent frameworks to support building composable collectives
- Evaluate past performance against immutable data history and select the right agent collective in real time
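Below is a minimal, hypothetical sketch of how one of these deposits might serve both roles at once: a slashable bond and a ranking signal during discovery. The `Registry` interface is an illustrative assumption, not an existing protocol, and the consensus that decides slashing is out of scope.

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    agent_id: str
    deposit: float                       # economic stake backing the agent
    skills: set = field(default_factory=set)

class Registry:
    """Toy agent registry: the deposit is both a bond and a discovery signal."""

    def __init__(self) -> None:
        self.agents: dict[str, Agent] = {}

    def register(self, agent: Agent, min_deposit: float = 1.0) -> None:
        if agent.deposit < min_deposit:
            raise ValueError("deposit below minimum bond")
        self.agents[agent.agent_id] = agent

    def discover(self, skill: str) -> list[Agent]:
        """Rank matching agents by deposit: more stake, more visibility."""
        matches = [a for a in self.agents.values() if skill in a.skills]
        return sorted(matches, key=lambda a: a.deposit, reverse=True)

    def slash(self, agent_id: str, fraction: float) -> None:
        """Apply a penalty decided elsewhere (e.g. by consensus over misconduct)."""
        self.agents[agent_id].deposit *= (1.0 - fraction)

registry = Registry()
registry.register(Agent("scraper-1", deposit=10.0, skills={"crawl"}))
registry.register(Agent("scraper-2", deposit=3.0, skills={"crawl"}))
print([a.agent_id for a in registry.discover("crawl")])  # ['scraper-1', 'scraper-2']
registry.slash("scraper-1", 0.5)  # misconduct halves the bond and the ranking signal
```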
Source: IOSG Ventures
2.4 Data Layer
Data is a core component of the AI x Web3 convergence. It is a strategic asset in AI competition and, together with compute, constitutes a key resource. Yet this category is often overlooked, as most of the industry's attention is focused on the compute layer. In fact, crypto primitives offer many interesting directions for value creation in data acquisition, chiefly along two high-level directions:
- Access to public internet data
- Access to protected data
Access to public internet data: This direction aims to build distributed crawler networks capable of crawling the entire internet within days, acquiring massive datasets or accessing very specific internet data in real time. Crawling large swathes of the internet, however, demands substantial network capacity: at least a few hundred nodes are needed before meaningful work can begin. Fortunately, Grass, a distributed crawler-node network, already has more than 2 million nodes actively sharing internet bandwidth with the network, with the goal of crawling the entire internet. This demonstrates the great potential of economic incentives for attracting valuable resources.
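For intuition, here is a minimal sketch of fanning a crawl frontier out across contributor nodes that each share a slice of bandwidth. Node behavior is simulated, and this is not Grass's actual protocol; it only illustrates the fan-out pattern.

```python
import asyncio
import random

async def node_fetch(node_id: int, url: str) -> tuple[str, str]:
    """Simulated contributor node: in reality this would use the node's bandwidth."""
    await asyncio.sleep(random.uniform(0.01, 0.05))  # fake network latency
    return url, f"<html>body of {url} via node {node_id}</html>"

async def crawl(frontier: list[str], num_nodes: int) -> dict[str, str]:
    queue: asyncio.Queue = asyncio.Queue()
    for url in frontier:
        queue.put_nowait(url)
    results: dict[str, str] = {}

    async def worker(node_id: int) -> None:
        # Each node drains the shared frontier until no work remains.
        while True:
            try:
                url = queue.get_nowait()
            except asyncio.QueueEmpty:
                return
            fetched_url, body = await node_fetch(node_id, url)
            results[fetched_url] = body

    await asyncio.gather(*(worker(i) for i in range(num_nodes)))
    return results

frontier = [f"https://example.com/page/{i}" for i in range(20)]
pages = asyncio.run(crawl(frontier, num_nodes=5))
print(len(pages), "pages fetched")  # 20 pages fetched
```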
Although Grass levels the playing field for public data, there remains the challenge of unlocking latent value in data, namely access to proprietary datasets. A large amount of data is still kept private because of its sensitive nature. Many startups are leveraging cryptographic tools that let AI developers build and fine-tune large language models on proprietary datasets while keeping the sensitive information private.
Technologies such as federated learning, differential privacy, trusted execution environments, fully homomorphic encryption, and multi-party computation offer different levels of privacy protection with different trade-offs. Bagel's research article (https://blog.bagel.net/p/with-great-data-comes-great-responsibility-d67) provides an excellent overview of these techniques. They not only protect data privacy during machine learning but also enable comprehensive privacy-preserving AI at the compute level.
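As a taste of one item on that list, the sketch below shows the core differential-privacy move used in DP-SGD-style training: clip each example's gradient to bound its influence, then add Gaussian noise to the aggregate. The parameter values are illustrative and not calibrated to a formal (epsilon, delta) privacy budget.

```python
import numpy as np

def privatize_gradients(per_example_grads: np.ndarray,
                        clip_norm: float = 1.0,
                        noise_multiplier: float = 1.1) -> np.ndarray:
    # 1. Clip: no single example can move the model by more than clip_norm.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    clipped = per_example_grads * scale

    # 2. Aggregate and add noise proportional to the clipping bound, so the
    #    released update reveals little about any individual example.
    summed = clipped.sum(axis=0)
    noise = np.random.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)
    return (summed + noise) / len(per_example_grads)

grads = np.random.randn(32, 10)  # 32 examples, 10-dimensional gradients
update = privatize_gradients(grads)
print(update.shape)              # (10,)
```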
2.5 Data and Model Provenance
Data and model provenance technologies aim to establish processes that assure users they are interacting with the intended models and data, and to provide guarantees of authenticity and origin. Watermarking is one example of a model-provenance technique: it embeds a signature directly into the machine learning algorithm, more specifically into the model weights, so that at retrieval time one can verify whether an inference came from the intended model.
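As a toy illustration of the idea (not a production watermarking scheme), the sketch below encodes a signature into the signs of a secret subset of weights and later reads it back to check provenance.

```python
import numpy as np

def embed_watermark(weights: np.ndarray, key_indices: np.ndarray,
                    signature_bits: np.ndarray, strength: float = 1e-3) -> np.ndarray:
    """Nudge a secret subset of weights so their signs encode the signature."""
    marked = weights.copy()
    signs = np.where(signature_bits == 1, 1.0, -1.0)  # bit 1 -> positive sign
    marked[key_indices] = signs * (np.abs(marked[key_indices]) + strength)
    return marked

def verify_watermark(weights: np.ndarray, key_indices: np.ndarray,
                     signature_bits: np.ndarray) -> float:
    """Fraction of signature bits recovered; 1.0 means a full match."""
    recovered = (weights[key_indices] > 0).astype(int)
    return float((recovered == signature_bits).mean())

rng = np.random.default_rng(0)
w = rng.standard_normal(1_000)
idx = rng.choice(1_000, size=64, replace=False)  # secret key: which weights
bits = rng.integers(0, 2, size=64)               # signature to embed
w_marked = embed_watermark(w, idx, bits)
print(verify_watermark(w_marked, idx, bits))     # 1.0
print(verify_watermark(w, idx, bits))            # ~0.5 on unmarked weights
```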
2.6 Applications
On the application side, the design possibilities are endless. In the industry map above, we list some particularly exciting examples of AI technology applied in the Web 3.0 space. Since most of these use cases are self-explanatory, we will not comment on them individually. It is worth noting, however, that the intersection of AI and Web 3.0 has the potential to reshape many verticals, as these new primitives give developers more freedom to create innovative use cases and optimize existing ones.
Summary
The convergence of AI x Web3 holds prospects full of innovation and potential. By leveraging each technology's unique strengths, we can address a range of challenges and open up new technological paths. As we explore this emerging industry, the synergy between AI and Web3 can drive progress, reshaping our future digital experiences and the way we interact on the web.
The fusion of digital scarcity and digital abundance, the mobilization of underutilized resources for computational efficiency, and the establishment of secure, privacy-preserving data practices will define the next era of technological evolution.
However, we must recognize that this industry is still in its infancy, and the current landscape may become obsolete in short order. The rapid pace of innovation means that today's cutting-edge solutions could soon be displaced by new breakthroughs. Nevertheless, the fundamental building blocks explored here, such as computing networks, agent platforms, and data protocols, highlight the vast possibilities of converging artificial intelligence with Web 3.0.