
Seeing from first principles how SCP and AO change the on-chain world


Reprinted from chaincatcher

01/09/2025

Author: Shijun Jun, PermaDAO

  • 1. From Bitcoin to Ethereum: how do we find the optimal path to break through the limits of throughput and application scenarios?
  • 2. Starting from first principles, how do we cut through the market's memes and find blockchain's most essential, basic need?
  • 3. What is so special about the disruptive design principle of SCP and AO (Actor Oriented), separating storage from computation, that it can let Web3 fully unleash itself?
  • 4. Will a deterministic program running on immutable data always produce a unique, reliable result?
  • 5. Under this narrative, why can SCP and AO (Actor Oriented) become all-round "hexagonal warriors" with unlimited performance, trustworthy data, and composability?

The gloom behind the "Web3" boom after the US election


[Data source: DefiLlama]

Out of the spotlight, Ethereum, the second-largest cryptocurrency in the market, has seen its TVL continue a sluggish trend ever since peaking at its all-time high in 2021.

In the third quarter of 2024, Ethereum's decentralized finance (DeFi) revenue even fell to $261 million, its lowest level since the fourth quarter of 2020.

At first glance there appear to be occasional spikes, but the overall trend points to a slowdown in DeFi activity on the Ethereum network.

In addition, some brand-new public chains dedicated to trading scenarios have emerged in the market. Hyperliquid, which has been very popular recently, is an order-book trading chain whose metrics have grown rapidly: it entered the top 50 by market value in just two weeks, and its estimated annualized revenue among all public chains trails only Ethereum, Solana, and TRON. This indirectly reflects the weakness of Ethereum and of traditional AMM-based DeFi.


[Data source: Compound trading volume]


[Data source: Uniswap trading volume]

DeFi was once the core highlight of the Ethereum ecosystem, but its revenue has fallen sharply due to reduced transaction fees and user activity.

Against this backdrop, the author tries to work out the reasons for the difficulties currently facing Ethereum, and the blockchain industry as a whole, and how to break out of them.

Coincidentally, with the success of SpaceX's fifth test launch, SpaceX has become a rising star in commercial aerospace. Looking back at SpaceX's development path, it got to where it is today by relying on a key methodology: first principles. (Tip: the concept of first principles was first proposed by the ancient Greek philosopher Aristotle some 2,300 years ago, who put it this way: "In the exploration of every system there exists a first principle, the most basic proposition or assumption, which cannot be omitted or deleted, nor violated.")

So let us also use the method of first principles to peel away the fog layer by layer and explore the most essential "atoms" of the blockchain industry, re-examining from a fundamental perspective the difficulties and opportunities this industry currently faces.

Is Web3’s “cloud service” a regression or the future?

When the concept of AO (Actor Oriented) was introduced, it attracted widespread attention. Against the backdrop of the many homogenized EVM-series public chains, AO, as a disruptive architectural design, has shown a unique appeal.

This is not just a theoretical idea, but a team is putting it into practice.

As mentioned above, the greatest value of blockchain is recording digital value; from this perspective, it is an open and transparent global public ledger. Based on this essence, the first principle of blockchain can be taken to be "storage".

AO is implemented on the Storage-based Consensus Paradigm (SCP): as long as storage is immutable, the results are guaranteed to be consistent no matter where the computation is performed. Thus was born the AO global computer, built to realize the interconnection and collaboration of large-scale parallel computers.

Looking back at 2024, one of the most eye-catching events in Web3 was the explosion of the inscription ecosystem, which can be seen as an early practice of the storage-compute separation model. For example, the etching technique used by the Runes protocol allows small amounts of data to be embedded in Bitcoin transactions. That data does not affect the transaction's main function, but serves as additional information constituting a clearly verifiable, unspendable output.
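As a rough illustration of how a small payload can ride along in a Bitcoin transaction, the sketch below assembles a minimal OP_RETURN output script, the classic way to attach provably unspendable data to a transaction. This is a simplification for intuition only; the actual Runes runestone encoding is richer than a bare payload:

```python
# Minimal sketch: embedding a small data payload in a Bitcoin-style
# OP_RETURN output. Such an output is provably unspendable, so the
# payload is pure annotation riding along with the transaction.
# (Illustrative only; real inscription encodings are more involved.)

OP_RETURN = 0x6a  # opcode marking an unspendable, data-carrying output

def op_return_script(payload: bytes) -> bytes:
    """Build a script that records `payload` on chain as extra data."""
    if len(payload) > 75:  # single-byte push limit; keeps the sketch simple
        raise ValueError("payload too large for a direct push")
    return bytes([OP_RETURN, len(payload)]) + payload

script = op_return_script(b"hello, inscriptions")
print(script.hex())  # 6a1368656c6c6f2c20696e736372697074696f6e73
```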

In its early days, some technology observers questioned the security of Bitcoin inscriptions, fearing they could become a potential entry point for cyberattacks.

However, over the past two years inscriptions have stored their data entirely on chain, and no blockchain fork has occurred. This stability once again confirms that as long as the stored data is not tampered with, the consistency and security of the results can be guaranteed no matter where the computation is performed.
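The claim that a deterministic program over immutable data yields one unique result can be made concrete in a few lines. The sketch below is a simplified model (the message format and state function are invented for illustration, not any particular protocol): any two replays of the same immutable log necessarily agree on the final state hash.

```python
import hashlib
import json

# An immutable, ordered message log (stand-in for data stored on chain).
LOG = [
    {"op": "credit", "account": "alice", "amount": 10},
    {"op": "credit", "account": "bob", "amount": 5},
    {"op": "transfer", "src": "alice", "dst": "bob", "amount": 3},
]

def apply(state: dict, msg: dict) -> dict:
    """Deterministic state-transition function: no randomness, no clock."""
    balances = dict(state)
    if msg["op"] == "credit":
        balances[msg["account"]] = balances.get(msg["account"], 0) + msg["amount"]
    elif msg["op"] == "transfer":
        balances[msg["src"]] -= msg["amount"]
        balances[msg["dst"]] = balances.get(msg["dst"], 0) + msg["amount"]
    return balances

def replay(log) -> str:
    """Recompute final state from the log and fingerprint it."""
    state = {}
    for msg in log:
        state = apply(state, msg)
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

# Any two machines replaying the same log reach the same state hash.
assert replay(LOG) == replay(list(LOG))
print(replay(LOG))
```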

You may notice that this sounds almost the same as a traditional cloud service. For example:

In terms of computing resource management, an "Actor" in the AO architecture is an independent computing entity, and each computing unit can run its own environment. Isn't this just like the microservices and Docker of traditional cloud platforms? Likewise, where traditional cloud services rely on S3 or NFS for storage, AO relies on Arweave.

However, it would not be accurate to dismiss AO as merely reheating old leftovers. Although AO borrows some design ideas from traditional cloud services, its core lies in combining decentralized storage with distributed computing. As a decentralized storage network, Arweave is fundamentally different from traditional centralized storage, and this decentralization gives Web3 data greater security and censorship resistance.

More importantly, the combination of AO and Arweave is not a simple stacking of technologies; it creates a new paradigm. This paradigm combines the performance advantages of distributed computing with the credibility of decentralized storage, providing a solid foundation for the innovation and development of Web3 applications. Specifically, the combination shows up in two main ways:

1. The storage system achieves a completely decentralized design while relying on a distributed architecture to ensure performance.

2. This combination not only solves some core challenges in the Web3 field (such as storage security and openness), but also provides a technical foundation for possible unlimited innovations and combinations in the future.

The following sections explore AO's concept and architectural design in depth, analyze how it addresses the difficulties facing existing public chains such as Ethereum, and show how it ultimately brings new development opportunities to Web3.

Looking at the current dilemma and shackles of Web3 from an "atomic" perspective

Ever since Ethereum emerged carrying smart contracts, it has been the undisputed king.

Some may ask: isn't there still Bitcoin? An important point to note is that Bitcoin was created as an alternative to traditional currency, intended to be a decentralized digital cash system. Ethereum is not just a cryptocurrency; it is also a platform for creating and running smart contracts and decentralized applications (DApps).

In general, Bitcoin is a digital substitute for traditional currency; its higher price does not imply higher value. Ethereum is more like an open-source platform with an expected richness of applications, and in today's terms its value better represents the open world of Web3.

Therefore, since 2017 many projects have tried to challenge Ethereum, but very few have persisted to the end. Meanwhile, Ethereum's performance has long been criticized, and what followed was the growth of Layer 2. Behind Layer 2's seemingly prosperous facade lies a helpless struggle in a difficult situation: as competition intensifies, a series of problems have gradually been exposed and become serious shackles on the development of Web3:

Performance is capped and the user experience is poor


[Data source: DeFiLlama]


[Data source: L2 BEAT]

Recently, more and more people believe that Ethereum’s expansion plan Layer 2 has failed.

Initially, L2 was an important continuation of the Ethereum subculture within Ethereum's scaling roadmap, and many people were needed to support the L2 development path. L2 was expected to reduce gas fees and increase throughput, and thereby drive growth in users and transactions. Yet despite the reduction in gas fees, the expected growth in users never materialized.

So is L2 really to blame for the failure of the scaling plan? It is fairly obvious that L2 is just a scapegoat. It bears some responsibility, but the main responsibility lies with Ethereum itself; more fundamentally, this is the inevitable result of the underlying design problems of most current Web3 chains.

Let's explain this from an "atomic" perspective. L2 itself takes on the computing function, while the blockchain's essential "storage" is undertaken by Ethereum; and to obtain sufficient security, L2 must also use Ethereum to store data and reach consensus on it.

Ethereum itself, however, is designed to prevent execution from running into infinite loops that would bring the whole platform to a halt, so any given smart contract execution is limited to a finite number of computation steps.
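Concretely, Ethereum enforces this bound with gas metering: every operation consumes gas, and execution aborts when the budget runs out. The toy interpreter below is a minimal sketch of metered execution (not the EVM; the names and costs are invented for illustration):

```python
class OutOfGas(Exception):
    pass

def run(program, gas_limit: int):
    """Execute a stream of (cost, operation) steps under a gas budget.

    Toy model of metered execution: each step burns gas, and the run
    aborts as soon as the budget is exhausted, so no call can loop forever.
    """
    gas = gas_limit
    for cost, op in program:
        if gas < cost:
            raise OutOfGas(f"halted: needed {cost}, only {gas} left")
        gas -= cost
        op()

# An "infinite" loop is just a very long program; the meter stops it.
counter = {"n": 0}
endless = ((1, lambda: counter.__setitem__("n", counter["n"] + 1))
           for _ in iter(int, 1))  # infinite supply of cheap steps

try:
    run(endless, gas_limit=1_000)
except OutOfGas as e:
    print(e, "| steps executed:", counter["n"])  # steps executed: 1000
```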

The result is that L2, designed in the expectation of unlimited performance, is in fact shackled by the ceiling of the main chain.

This weakest-link effect means that L2 has a ceiling.

Readers who want the detailed mechanism can read more in: "From Traditional DeFi to AgentFi: Exploring the Future of Decentralized Finance".

Gameplay is very limited, making it hard to generate real appeal

What Ethereum is most proud of is its prosperous application-layer ecosystem, home to all kinds of DApps.

But is there really a scene of a hundred flowers blooming behind the prosperity?

The author believes this is clearly not the case. Behind Ethereum's prosperous application ecosystem lies a monotonous landscape in which financialization is severe and non-financial applications are far from mature.

Let’s take a look at the more prosperous application sectors on Ethereum:

[Figure: the major application sectors on Ethereum]

First of all, although concepts such as NFT, DeFi, GameFi, and SocialFi are meaningful explorations in financial innovation, such products are not yet a good fit for the general public. Web2 was able to develop so rapidly because its functionality was close enough to people's daily lives.

Compared with financial products and services, ordinary users are more concerned about messaging, social networking, video, e-commerce and other functions.

Secondly, from a competitive perspective, credit lending is a very common and widespread product in traditional finance, but in DeFi this type of product is still rare, mainly because an effective on-chain credit system is currently lacking.

The construction of the credit system needs to allow users to truly own their own online profiles and social graphs, and to be able to span different applications.

Only when this decentralized information can be stored and transmitted at zero cost can it be possible to build a powerful personal information graph of Web3 and a set of Web3 applications based on the credit system.

With this, we have once again clarified a key point: L2's failure to attract enough users is not its own fault, because L2 was never the core driving force. The real way to break through the shackles of Web3's dilemma is to innovate in application scenarios that attract users.

But the current situation is like a highway during a holiday: constrained by transaction performance, no matter how many innovative ideas there are, they are difficult to put into practice.

The essence of blockchain itself is "storage". When storage and computation are coupled together, the design is no longer "atomic" enough, and under such an insufficiently essential design, performance inevitably has a critical point.

Some views define the essence of blockchain as a trading platform or a currency system, or emphasize its transparency and anonymity. But such views ignore blockchain's fundamental nature as a data structure and its broader application potential. Blockchain is not just for financial transactions; its technical architecture allows it to be applied across many industries, such as supply chain management, medical records, and even copyright management. The essence of blockchain therefore lies in its capability as a storage system: not only can it store data securely, it also guarantees data integrity and transparency through a distributed consensus mechanism. Once a data block is added to the chain, it is almost impossible to change or delete.

Atomized infrastructure: AO makes unlimited performance possible


[Data source: L2 TPS]

The basic architecture of blockchain faces an obvious bottleneck: limited block space. Like a fixed-size ledger, every transaction and every piece of data must be recorded in a block, so Ethereum and other blockchains are subject to block size limits that force transactions to compete for space. This raises the key questions: Can we break through this limit? Must block space be limited? Is there a way to make the system truly infinitely scalable?

Although Ethereum's L2 route has had some success in scaling performance, it can only be called half a success: L2 has raised throughput by several orders of magnitude, which may let an individual project hold up under transaction peaks, but for the many L2s that inherit their storage and consensus security from the main chain, this degree of improvement is far from enough.

It is worth noting that L2 TPS cannot be raised without bound; it is limited mainly by factors such as data availability, settlement speed, verification cost, network bandwidth, and contract complexity. Although rollups optimize L1's storage and computation requirements through compression and verification, data still needs to be submitted to and verified on L1, so they remain bound by L1's bandwidth and block time. Meanwhile, computational overhead such as generating zero-knowledge proofs, node performance bottlenecks, and the execution requirements of complex contracts also cap L2 scaling.
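A back-of-envelope calculation makes the data-bandwidth ceiling concrete. All parameter values below are illustrative assumptions rather than measurements of any specific network:

```python
# Back-of-envelope ceiling on rollup TPS imposed by L1 data bandwidth.
# All figures below are illustrative assumptions, not measured values.

l1_data_per_block = 3 * 128 * 1024   # bytes of data space per L1 block (assumed)
l1_block_time_s = 12                  # seconds per L1 block (assumed)
bytes_per_compressed_tx = 16          # bytes per rollup tx after compression (assumed)

max_tps = l1_data_per_block / bytes_per_compressed_tx / l1_block_time_s
print(f"TPS ceiling ~ {max_tps:,.0f}")  # ~ 2,048 under these assumptions
```

Whatever the exact parameters, the shape of the bound is the same: the L1 data space available per unit of time, divided by the bytes each compressed transaction consumes, is a hard ceiling that no amount of off-chain computation can lift.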


[Data source: suiscan TPS]

At present, the real challenge for Web3 lies in insufficient throughput and insufficient applications, which make it hard to attract new users; Web3 risks losing its influence.

In short, raising throughput is the key to a bright future for Web3; realizing a network that can scale without limit at high throughput is the Web3 vision. Sui, for example, uses deterministic parallel processing to pre-order transactions and avoid conflicts, making the system more predictable and scalable; this lets Sui process over 10,000 transactions per second (TPS). Sui's architecture also allows network throughput to grow as more validator nodes are added, in theory achieving unlimited expansion, while the Narwhal and Tusk protocols reduce latency so the system can process transactions efficiently in parallel, overcoming the scaling bottlenecks of traditional Layer 2 solutions.

The AO we are discussing is based on the same line of thinking: although the emphasis differs, both are building systems that can scale.

Web3 needs new infrastructure that is grounded in first principles and takes storage as its core. Just as Elon Musk disrupted the rocket-launch and electric-vehicle industries by radically redesigning those complex technologies from first principles, AO's design is similar: it decouples computation from storage, abandons the traditional blockchain framework, builds a future-oriented Web3 storage foundation, and pushes Web3 toward the vision of decentralized cloud services.

The Storage-based Consensus Paradigm (SCP)

Before introducing AO, we must first talk about the relatively new SCP design paradigm.

SCP may be unfamiliar to most people, but everyone is surely familiar with Bitcoin's inscriptions. Loosely speaking, the design idea behind inscriptions is, to some extent, one that treats storage as the "atomic" unit, even if it deviates from that idea in places.

Interestingly, Vitalik has also expressed interest in Ethereum serving as a kind of Web3 "paper tape", and the SCP paradigm is exactly this type of idea.

In Ethereum's model, computation is performed by full nodes and the results are then stored globally and made available for queries. This creates a problem: although Ethereum is a "world computer", it is a single-threaded program in which all steps can only happen one after another, which is obviously inefficient. It is also "excellent soil for MEV": transaction signatures enter Ethereum's mempool and are propagated publicly before miners sort them and produce blocks. Although this process may take only 12 seconds, during that short window the transaction's content is exposed to countless "hunters", who can quickly intercept and simulate it, and even reverse-engineer possible trading strategies. For more details about MEV, please read: "The MEV Pattern One Year After the Merger of Ethereum".
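The exposure window described above can be modeled in a few lines. In the toy model below (hypothetical names and a deliberately simplified fee-ordering rule, not any real client's mempool logic), the block producer orders pending transactions by fee, so an observer who sees a pending trade can place a copy of it ahead simply by paying one unit more:

```python
# Toy model of the public-mempool exposure window behind MEV.
# A pending transaction is visible before inclusion, so an observer can
# submit a higher-fee transaction that gets ordered ahead of it.

mempool = []

def broadcast(sender: str, action: str, fee: int):
    mempool.append({"sender": sender, "action": action, "fee": fee})

broadcast("alice", "buy TOKEN", fee=10)        # victim's trade, now public

# A watcher sees Alice's pending trade and reacts before the next block.
victim = mempool[-1]
broadcast("searcher", victim["action"], fee=victim["fee"] + 1)

# The producer orders by fee, so the searcher's copy executes first.
block = sorted(mempool, key=lambda tx: -tx["fee"])
print([tx["sender"] for tx in block])  # ['searcher', 'alice']
```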

The idea of SCP, on the other hand, is to separate computing from storage. That may sound a bit abstract, so let's take a Web2 scenario as an example.

In Web2, chatting and online shopping often produce sudden traffic peaks at particular times, yet a single computer's hardware can hardly carry such a load. For this reason, clever engineers proposed distributed computing: handing the computation over to multiple machines, which afterwards synchronize and store their respective computation states. In this way capacity can be scaled elastically to cope with traffic at different times.

Similarly, SCP can be regarded as a design that distributes computation across computing nodes. The difference is that SCP's storage is not a database such as MySQL or PostgreSQL, but the main network of a blockchain.

In short, SCP uses blockchain to store state results and other data, thereby ensuring the credibility of the stored data and implementing a high-performance network layered with the underlying blockchain.

More specifically, the blockchain is used in SCP only for data storage, while the off-chain client/server is responsible for performing all computations and storing all generated state. Such an architectural design significantly improves performance and scalability, but under an architecture where computing and storage are separated, can we truly guarantee the integrity and security of data?

To put it simply, the blockchain is mainly used to store data, and the actual computing work is done by servers off the chain. This new system design has an important feature: it no longer uses the complex node consensus mechanism of traditional blockchains, but places all consensus processes off-chain.

What are the benefits of doing this? Because there is no need for a complicated consensus process, each server only needs to focus on processing its own computing tasks. This allows the system to handle an almost unlimited number of transactions while also being cheaper to run.

Although this design is somewhat similar to the currently popular Rollup expansion solution, its goal is greater: it is not only used to solve the blockchain expansion problem, but also to provide a new path for the transformation from Web2 to Web3.


Having said all that, what are the advantages of SCP? SCP works by decoupling computation and storage. This design not only improves the flexibility and composability of the system, but also lowers the development threshold and effectively solves the performance limitations of traditional blockchain while ensuring the credibility of the data. Such innovation makes SCP an efficient and scalable infrastructure, empowering future decentralized ecosystems.

1. Composability: SCP moves computation off chain, which keeps the essence of the blockchain unpolluted and preserves its "atomic" property. With computation off chain and the blockchain bearing only the storage function, any smart contract can be executed, and migrating applications onto SCP becomes extremely simple, which matters a great deal.

2. Low development barriers: off-chain computation means developers can use any language, whether C++, Python, or Rust, with no need to write Solidity specifically for the EVM; the only cost for developers may be the API cost of interacting with the chain.

3. No performance limits: off-chain computation puts computing power directly on a par with traditional applications. The performance ceiling depends on the hardware of the computing servers, and elastic scaling of traditional computing resources is a very mature technology, so, setting machine costs aside, computing power is effectively unlimited.

4. Trustworthy data: since the basic "storage" function is borne by the blockchain, all data is tamper-proof and traceable, and any node that doubts a state result can pull the data and recompute it. The blockchain thus lends the data its credibility.

Bitcoin proposed the PoW solution to the "Byzantine Generals Problem". This was Satoshi Nakamoto's approach to breaking the conventional thinking in the environment at that time, which made Bitcoin possible.

Similarly, facing the computation of smart contracts, we start from first principles. It may look like a solution that goes against common sense, but when we boldly move the computation function off chain and return the blockchain to its essence, we suddenly find, looking back, that while storage consensus is satisfied, we have also gained open-source data and trustworthy auditability, and achieved performance as excellent as Web2's. This is SCP.

The combination of SCP and AO: getting rid of the shackles

After talking so much, AO is finally here.

First, AO's design adopts a pattern called the Actor Model, best known from the Erlang programming language.

At the same time, AO's architecture and technology are based on the SCP paradigm, which separates the computing layer from the storage layer, making the storage layer permanently decentralized, while the computing layer maintains the traditional computing layer model.

AO's computing resources are similar to traditional computing models, but it adds a permanent storage layer to make the computing process traceable and decentralized.
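Under the stated assumptions, the shape of this design fits in a short sketch: each "process" below is an isolated actor with private state and a mailbox, no memory is shared and no locks exist, and every handled message is appended to a permanent log standing in for Arweave, so the whole history remains replayable. The class and field names are this sketch's own, not AO's API:

```python
from collections import deque

PERMANENT_LOG = []  # stand-in for Arweave: append-only, never rewritten

class Process:
    """An actor: private state, a mailbox, and a deterministic handler."""

    def __init__(self, pid: str):
        self.pid = pid
        self.state = {"count": 0}
        self.mailbox = deque()

    def send(self, msg: dict):
        self.mailbox.append(msg)

    def step(self):
        # No locks needed: each actor only ever touches its own state.
        while self.mailbox:
            msg = self.mailbox.popleft()
            self.state["count"] += msg.get("inc", 0)
            PERMANENT_LOG.append({"to": self.pid, "msg": msg})  # traceable

# Processes run independently; adding more of them scales horizontally.
a, b = Process("a"), Process("b")
a.send({"inc": 2}); b.send({"inc": 5}); a.send({"inc": 1})
a.step(); b.step()
print(a.state, b.state, len(PERMANENT_LOG))  # {'count': 3} {'count': 5} 3
```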

Having read this far, you may be wondering: which main chain does AO use as its storage layer?

Obviously, Bitcoin and Ethereum cannot serve as that storage layer; the reasons have been discussed above, and readers will easily understand them. In AO, data storage and the final verifiability of computation are handled by Arweave.

So among so many decentralized storage tracks, why choose Arweave?

The choice of Arweave as the storage layer rests mainly on the following consideration: Arweave is a decentralized network focused on the permanent storage of data. Its positioning is that of "a global hard drive that never loses data", distinct from Bitcoin's "global ledger" and Ethereum's "global computer".

For more technical details about Arweave, please refer to: "Understanding Arweave: Key Infrastructure of Web3"

Next, let's focus on AO's principles and technology, and see how AO achieves unlimited computation.


[Data Source: How ao Messenger Works | Manual]

The core of AO is to build an infinitely scalable computing layer that is not tied to any particular environment. AO's nodes cooperate on the basis of protocols and communication mechanisms, so that each node can provide optimal service while avoiding wasteful competition.

First, let's look at AO's basic architecture. AO consists of two basic units, processes and messages, together with scheduling units (SUs), computing units (CUs), and messenger units (MUs); a sketch of how a message travels through these units follows the list below:

  • Process: the computing unit of a node in the network, used for data computation and message handling. Each contract, for example, can be a process.

  • Message: processes interact through messages. Each message is a piece of data conforming to the ANS-104 standard, which the whole of AO must follow.

  • Scheduling unit (SU): responsible for numbering a process's messages so that the process can be ordered, and for uploading the messages to Arweave.

  • Computing unit (CU): the state node in AO, responsible for performing computation tasks and returning the computed results and signatures to the SU, ensuring the correctness and verifiability of the results.

  • Messenger unit (MU): the routing component within a node, responsible for delivering users' messages to SUs and for verifying the integrity of signed data.
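To see how these units fit together, the sketch promised above walks messages through the pipeline: an MU checks integrity and forwards, an SU assigns a sequence number and persists the message to a stand-in Arweave log, and a CU computes the new state and returns a signed attestation. All interfaces here are this sketch's simplification of the flow, not AO's actual APIs:

```python
import hashlib

ARWEAVE = []  # stand-in for the permanent storage layer

def mu_forward(msg: dict) -> dict:
    """Messenger unit (MU): check integrity, then hand the message on."""
    assert msg.get("signature"), "reject unsigned messages"
    return msg

def su_schedule(msg: dict, seq: int) -> dict:
    """Scheduling unit (SU): assign an order number and persist the message."""
    ordered = {"seq": seq, **msg}
    ARWEAVE.append(ordered)  # the message log becomes permanent storage
    return ordered

def cu_compute(state: int, ordered: dict) -> tuple[int, str]:
    """Computing unit (CU): apply the message and attest to the new state."""
    new_state = state + ordered["body"]["inc"]
    attestation = hashlib.sha256(f"{ordered['seq']}:{new_state}".encode()).hexdigest()
    return new_state, attestation

state, seq = 0, 0
for body in ({"inc": 4}, {"inc": 7}):
    msg = mu_forward({"body": body, "signature": "sig"})
    ordered = su_schedule(msg, seq); seq += 1
    state, proof = cu_compute(state, ordered)

print(state, len(ARWEAVE), proof[:16])  # 11 2 <attestation prefix>
```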

It is worth noting that AO has no shared state, only holographic state. AO's consensus is produced by a game: since the state generated by every computation is uploaded to Arweave, the verifiability of the data is guaranteed. When users doubt a particular piece of data, they can ask one or more nodes to recompute it from the data on Arweave; if the results are inconsistent, the dishonest nodes are punished.

Innovations in AO Architecture: Storage and Holographic States

The innovation of the AO architecture lies in its data storage and verification mechanism, which replaces redundant calculations and limited block space in traditional blockchains by utilizing decentralized storage (Arweave) and holographic states.

1. Holographic state: In the AO architecture, the "holographic state" generated by each calculation will be uploaded to the decentralized storage network (Arweave). This "holographic state" is not just a simple record of transaction data, it contains the complete status and related data of each calculation. This means that every calculation and result is permanently recorded and can be verified at any time. As a "data snapshot", the holographic state provides a distributed and decentralized data storage solution for the entire network.

2. Storage verification: In this mode, data verification no longer relies on each node to repeatedly calculate all transactions, but confirms the validity of transactions by storing and comparing the data uploaded to Arweave. When the calculation results produced by a node are inconsistent with the data stored on Arweave, users or other nodes can initiate verification requests. At this point, the network recalculates the data and checks the stored records in Arweave. If the calculation results are inconsistent, nodes will be punished to ensure the integrity of the network.

3. Break through block space limitations: The block space of traditional blockchains is limited by storage, and each block can only contain limited transactions. In the AO architecture, data is no longer directly stored in blocks, but uploaded to a decentralized storage network (such as Arweave). This means that the storage and verification of the blockchain network no longer depends on the size of the block space, but is shared and expanded through decentralized storage. The capacity of the blockchain system is therefore no longer directly limited by the block size.

The block space limit of blockchain is not unbreakable. The AO architecture changes the data storage and verification methods of traditional blockchains by relying on decentralized storage and holographic states, thereby providing the possibility for unlimited expansion.

Does consensus have to rely on redundant computation?

Not necessarily. A consensus mechanism does not have to rely on redundant computation; it can be implemented in a variety of ways. Solutions that rely on storage rather than redundant computation are feasible in certain scenarios, especially when data integrity and consistency can be guaranteed through storage verification.

In AO's architecture, storage becomes the substitute for redundant computation. By uploading computation results to the decentralized storage network (here, Arweave), the system ensures that the data cannot be tampered with; and because state is uploaded holographically, any node can check the computation results at any time, ensuring the consistency and accuracy of the data. This approach relies on the reliability of data storage rather than on every node repeating the computation.
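A minimal sketch of storage-based verification, under the same stand-in assumptions as above: the claimed result of a computation is checked by replaying the log stored on "Arweave", and a claim that disagrees with the replay marks its node for punishment. Function names are invented for illustration:

```python
import hashlib
import json

def state_hash(state: dict) -> str:
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

def replay(log) -> dict:
    """Recompute state from the permanently stored message log."""
    state = {"total": 0}
    for msg in log:
        state["total"] += msg["inc"]
    return state

def challenge(claimed_hash: str, stored_log) -> bool:
    """Anyone can verify a claim by replaying storage; no network-wide
    redundant re-execution is needed unless a dispute arises."""
    return claimed_hash == state_hash(replay(stored_log))

log = [{"inc": 1}, {"inc": 2}, {"inc": 3}]
honest = state_hash({"total": 6})
dishonest = state_hash({"total": 7})

print(challenge(honest, log))     # True  -> claim accepted
print(challenge(dishonest, log))  # False -> node gets punished
```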

Let us look at the difference between AO and ETH through a table:

[Table: a comparison of AO and Ethereum]

It is not difficult to see that AO's core characteristics can be summarized in two points:

1. Large-scale parallel computing: supports numerous processes running in parallel, significantly improving computing capabilities.

2. Minimize trust dependence: There is no need to trust any single node, and all calculation results can be reproduced and traced infinitely.

How does AO break the dilemma of public chains led by Ethereum?

Regarding the two major dilemmas facing Ethereum, performance constraints and insufficient applications, the author believes these are precisely where AO's strengths lie, for the following reasons:

1. AO is designed on the SCP paradigm with computation and storage separated, so in performance it is no longer comparable to Ethereum's single-process computation: AO can flexibly scale out computing resources on demand, and Arweave's holographic storage of the message log lets AO guarantee consensus by reproducing computation results, making it no less secure than Ethereum or Bitcoin.

2. The message-passing parallel architecture means AO processes never compete for "locks". Anyone with Web2 development experience knows that high-performance services try hard to avoid lock contention, because it carries a huge cost. AO avoids lock contention between processes in the same way, through messages, which lets its scalability grow to any size.

3. AO's modular architecture: the modularity is reflected in the separation of the CU, SU, and MU, which allows AO to use any virtual machine, sequencer, and so on. This makes migrating and developing DApps from different chains extremely convenient and cheap, and, combined with Arweave's efficient storage, lets DApps built on AO implement richer gameplay; character maps, for example, can at least be implemented easily on AO.

4. The support of modular architecture enables Web3 to adapt to the policy requirements of different countries and regions. Although the core concepts of Web3 are decentralization and deregulation, it is inevitable that different policies in various countries have a profound impact on the development and promotion of Web3. The flexible modular combination can be adapted according to the policies of different regions, thus ensuring the robustness and sustainable development of Web3 applications to a certain extent.

Conclusion

The separation of computing and storage is a great idea, and it is also a systematic design based on first principles.

As a narrative direction similar to "decentralized cloud services", it not only provides a good implementation scenario, but also provides a broader space for imagination when combined with AI.

In fact, only by truly understanding the basic needs of Web3 can we get rid of the difficulties and shackles caused by path dependency.

The combination of SCP and AO offers a brand-new idea: it inherits all of SCP's characteristics, no longer deploys smart contracts on chain, and instead stores tamper-proof, traceable data on chain, achieving data credibility that anyone can verify.

Of course, no absolutely perfect path exists yet, and AO is still at a nascent stage of development. How to keep Web3 from being over-financialized, create enough application scenarios, and bring richer possibilities to the future is still the exam AO must pass on its road to success; whether it can hand in a satisfactory answer remains to be tested by the market and by time.

The combination of SCP and AO is a development paradigm full of potential. Although its concept has not yet been widely recognized in the market, AO is expected to play an important role in the Web3 field in the future and even promote the further development of Web3.
