
Dialogue with EdgeX founder Davy: How can decentralized edge computing networks make AI Agents accessible to everyone?


Reprinted from ChainCatcher

02/11/2025

Author: Grapefruit, ChainCatcher

During the Spring Festival, DeepSeek's stunning debut put AI back in the public spotlight, shaking the industry with its outstanding performance and low development cost. How to reduce the operating costs of AI models, improve their efficiency, and make them more widely accessible is now becoming the central narrative of the AI industry.

As early as last year, EdgeX, a decentralized edge AI computing network, began working to lower the threshold for running AI and committed itself to building a basic network that connects users with AI. Its strength lies in its distributed computing infrastructure: the operating resources required by AI Agents are provided by users, which promotes the implementation and development of decentralized edge computing power.

EdgeX is fully committed to building a decentralized AI infrastructure platform that integrates distributed computing resources with an AI scheduling and management system, forming an efficient, secure, and transparent decentralized computing network in which various AI models can run seamlessly in a distributed environment, promoting the widespread implementation and application of AI technology in edge scenarios.

Simply put, the EdgeX network uses a decentralized computing framework to pool the computing power, storage, and bandwidth contributed by participants into a global edge computing network, greatly reducing computing costs and allowing any AI model to run seamlessly and efficiently on edge devices.

In an interview with ChainCatcher, EdgeX founder Davy repeatedly emphasized that EdgeX is not only a technology platform but also the practice of an idea: by integrating Web3 and AI technologies, the team hopes to advance decentralized technology so that AI can truly reach every user.

At present, EdgeX has launched its hardware product, the XR7 AI Dual Gateway, which has been delivered and well received in the Korean market. Users can join the network by purchasing the hardware and deploying it as a node, contributing computing power and earning early rewards. The first-phase beta of the EdgeX APP has also launched in South Korea, where users can take part in the test network.

It is also worth mentioning that Davy revealed to ChainCatcher that EdgeX has secured early support from two well-known domestic Web3 funds and is in in-depth negotiations with a number of traditional and Web3 investors in North America on key issues such as the lead investment. The specific progress and results of the financing are expected to be announced gradually between the second and third quarters of this year.

The story behind EdgeX creation

1. ChainCatcher: As the founder of EdgeX, can you share your experience in the Web3 and AI industries and what led you to launch an AI infrastructure project? What are your main responsibilities in the EdgeX project?

Davy: I have been deeply involved in the data center and cloud industries since 2015. I have worked with many Web3 companies to build nodes and provide comprehensive infrastructure support, as well as technical support for the matching engines of leading CEXs. In addition, I have taken part in the entire process of several Silicon Valley projects from product conception to listing on exchanges, and have been deeply involved in key links such as infrastructure blueprint planning, product R&D, daily operations, and marketing.

In the field of AI, I was exposed to machine learning early on, especially in storage and computing, and have worked with many data scientists to develop and design AI applications. Later, I participated in several large-model projects in Silicon Valley as an advisor, focusing on the optimization of multimodal models and the fine-tuning and efficient training of vertical-domain models.

In 2024, I observed that the AI field is undergoing a shift from centralization to decentralization, which coincides with the evolution from Web2 to Web3. As AI's demand for distributed computing power grows by the day, edge computing, as a key technology, can effectively meet this demand. Based on this, I decided to combine the strengths and experience of Web3 and AI, launch the EdgeX project, and focus on building decentralized AI infrastructure.

Within the EdgeX project, I am mainly responsible for three areas: the technical architecture, including the design of an intelligent computing power scheduling system that ensures computing resources are coordinated efficiently; the construction of the EdgeX computing network's technical facilities, which provide stable and reliable underlying support for AI applications; and the optimization of AI applications, understanding the needs of different industries and tailoring AI solutions for vertical fields.

2. ChainCatcher: What is the product positioning and vision pursued by EdgeX? What pain points do you want to solve in the current market?

Davy: EdgeX is committed to building a decentralized AI infrastructure platform that integrates distributed computing power and intelligent task management to promote the implementation and application of AI technology in edge scenarios. By building an efficient, secure and transparent computing power platform, it allows AI models to operate seamlessly in a distributed environment, providing strong underlying technical support for decentralized applications and various scenarios.

At present, the AI industry faces many challenges: centralized computing power remains expensive, data privacy and security are worrying, and support for edge scenarios is even weaker. Specifically, traditional AI models rely heavily on costly centralized cloud computing resources, which keeps computing costs high and limits the pace of innovation for small teams and developers; centralized data storage models are like ticking time bombs, constantly threatening data security and privacy; and the efficiency of most AI models running on edge devices is even less satisfactory. EdgeX was created to solve these problems.

Simply put, EdgeX uses distributed computing power to significantly reduce computing costs, opening the door to innovation for small teams and developers, while its distributed computing network lets any AI model run seamlessly and efficiently on edge devices, filling the gap in edge-device AI. At the same time, its decentralized infrastructure provides a higher level of protection for data privacy and security.

EdgeX is not only a technology platform but also the practice of an idea. We hope to promote the development of decentralized technology by integrating Web3 and AI, so that AI can truly benefit every user. At the same time, we are committed to providing developers and enterprises with more innovative solutions and jointly building an open, shared, and prosperous AI ecosystem.

3. ChainCatcher: Can you introduce the composition of the EdgeX team? And what are its unique advantages in the field of AI and Web3 convergence?

Davy: EdgeX is a diverse, international team of outstanding talent in technology R&D, marketing operations, and brand promotion, with members based in Silicon Valley, Singapore, Taipei, and other international cities. This globally distributed layout allows the team to quickly capture global market demand and find the best partners and resources.

The core members of the team have held key positions at top global technology companies such as Amazon, Alibaba Cloud, and Tencent. They not only have the experience to take projects from 0 to 1, but also extensive industry resources that help projects land effectively on a global scale.

On the technical side, the EdgeX team has deep experience at the frontiers of both AI and Web3, especially in key areas such as large models, multimodal technologies, and decentralized computing power scheduling. We also have an in-depth understanding of core Web3 modules such as token economics and smart contract design, and can handle the various challenges of integrating AI and Web3 with confidence. Beyond that, the EdgeX team is also strong at commercialization, with members experienced in marketing, user growth, supply chain management, and other fields.

In addition, we are supported by a team of top Web3 advisors who provide valuable guidance on product technology, token economics design, market expansion, and strategic planning, giving EdgeX solid support for rapid development.

4. ChainCatcher: According to EdgeX's official roadmap, the project planned to complete its seed round between the second and fourth quarters of 2024. How is this round of financing progressing?

Davy: So far, the EdgeX project has secured support from two well-known Web3 funds in China. At the same time, we are in in-depth negotiations with a number of traditional and Web3 investors in North America, discussing cooperation on key issues such as the lead investment.

According to the established plan, EdgeX expects to officially announce the specific progress and results of this round of financing between Q2 and Q3 of 2025.

EdgeX product features and advantages

5. ChainCatcher: What are the operating mechanism, core components, and main functions of the EdgeX network?

Davy: The EdgeX network uses a decentralized computing framework to gather the computing power, storage and bandwidth contributed by participants to form a global edge computing network. Users purchase and deploy EdgeX hardware nodes to participate. After these nodes are connected to the network and complete tasks, they will generate a proof of work (PoW) and receive token rewards.

On top of this, EdgeX has designed an intelligent task scheduling system. For example, when an AI model needs to run on edge devices, the task is split and assigned to different nodes to execute, keeping the whole network running efficiently while maintaining low latency and high concurrency.
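
To make the scheduling idea concrete, here is a minimal, purely illustrative Python sketch (the node names, scoring formula, and load model are assumptions rather than EdgeX's actual scheduler): each subtask of a split AI task is greedily assigned to the node with the best combined latency/load score.

```python
from dataclasses import dataclass

@dataclass
class EdgeNode:
    node_id: str
    latency_ms: float   # measured round-trip latency to the requesting user
    load: float         # 0.0 (idle) .. 1.0 (saturated)

def schedule(subtasks, nodes):
    """Greedily assign each subtask to the node with the lowest latency/load score."""
    assignments = {}
    for task in subtasks:
        # Weight load heavily so work spreads across nodes (weights are assumptions).
        best = min(nodes, key=lambda n: n.latency_ms + 100 * n.load)
        assignments[task] = best.node_id
        best.load = min(1.0, best.load + 0.3)   # naive load bump per assigned subtask
    return assignments

nodes = [EdgeNode("seoul-01", 12, 0.2), EdgeNode("sg-02", 45, 0.1), EdgeNode("taipei-03", 30, 0.6)]
print(schedule(["tokenize", "embed", "decode"], nodes))
# e.g. {'tokenize': 'seoul-01', 'embed': 'sg-02', 'decode': 'seoul-01'}
```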

The core components of the EdgeX network include: hardware nodes, EdgeX exclusive operating system, and AI-Agent system:

Hardware nodes not only support AI model inference, but also provide resources such as storage and bandwidth;

The EdgeX operating system runs on the hardware node, providing optimized computing power for edge scenarios.

The AI-Agent system's core functions enable distributed AI scheduling: it can perform data analysis and inference locally and call on high-performance nodes when necessary to improve how well tasks are completed.

In addition, the EdgeX network combines decentralized protocols and distributed storage systems to ensure data security and network stability.

All EdgeX components jointly build a decentralized, efficient and secure computing ecosystem, providing better infrastructure support for AI inference and other scenarios.

6. ChainCatcher: Compared with the decentralized computing power DePIN projects such as Aethir, io.net, Gradient Network, Theta, etc. on the market, what is unique about EdgeX?

Davy: First of all, most decentralized computing networks on the market lean toward general-purpose computing, while EdgeX focuses on the deep integration of edge computing power and AI. It pays particular attention to optimizing AI inference tasks and resource scheduling, aims to serve specific AI application scenarios precisely, and has unique advantages in meeting specific computing power needs.

Secondly, unlike large-scale distributed networks that rely on centralized data centers, EdgeX emphasizes the autonomous computing capabilities of edge nodes, which is the key to adapting to AI inference tasks. Through an intelligent task scheduling system, EdgeX can accurately assign AI tasks to the most suitable edge nodes, thereby significantly reducing latency and improving real-time performance.

In terms of product design, EdgeX offers an integrated software and hardware solution and has launched its own hardware nodes, whereas most similar projects focus mainly on software platforms; this is an important feature distinguishing EdgeX from other computing power projects. The EdgeX hardware node runs an exclusive operating system that is deeply optimized for edge computing and AI scenarios, a design that not only greatly improves computing efficiency but also gives users a more stable and efficient solution.

In terms of token economics, EdgeX combines proof of work and proof of resources to motivate contributors to provide efficient computing power and storage resources. This mechanism ensures the reasonable allocation of network resources and effectively avoids resource waste.
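
EdgeX has not published its exact reward formula, so purely as an assumed illustration, a per-epoch payout could blend a node's share of completed work (proof of work) with its share of committed storage and bandwidth (proof of resources), for example:

```python
def epoch_reward(work_units, storage_gb_hours, bandwidth_gb,
                 epoch_pool=1_000.0, w_work=0.6, w_resource=0.4,
                 total_work=10_000, total_resource=50_000):
    """Split an epoch's EX pool between proof of work (completed AI tasks) and
    proof of resources (storage/bandwidth committed). All weights are assumptions."""
    resource_units = storage_gb_hours + bandwidth_gb
    work_share = w_work * (work_units / total_work)
    resource_share = w_resource * (resource_units / total_resource)
    return epoch_pool * (work_share + resource_share)

# A node that completed 500 work units and committed 2,000 resource units this epoch:
print(round(epoch_reward(500, 1_500, 500), 2))   # 46.0 EX under the assumed weights
```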

In terms of application scenarios, EdgeX covers a wider range: it supports not only general decentralized computing power needs but also multimodal AI tasks, lightweight inference on edge devices, and IoT applications. This diversity of technical coverage and practical applications means EdgeX is not limited to one type of task or service, but shows strong versatility and flexibility.

From this point of view, EdgeX is not just a general decentralized computing power network, but an innovative platform focused on edge computing and AI tasks, one that can bring more possibilities to the integration of AI and Web3.

7. ChainCatcher: What specific application scenarios does EdgeX serve, and what products have already been implemented?

Davy: Currently, EdgeX has achieved deep integration and real-time connection between users' physical devices and AI Agents. Users can easily interact with the AI Agent through their physical devices, turning the Agent into a personal intelligent assistant: no longer just a virtual entity, but a smart device that can accurately understand, continuously learn from, and execute user instructions. EdgeX's Agent can provide localized decision support and, relying on EdgeX's distributed network, flexibly obtain the computing power and storage it needs to meet complex computing demands.

Application scenarios include:

Smart Home: As a home assistant, EdgeX's Agent interconnects with IoT devices in the home; supported by edge computing power, it analyzes user habits in real time, intelligently adjusts air conditioning, lighting, and other devices, and protects data privacy.

Industrial Automation: In factories or on production lines, EdgeX supports edge AI-Agents in equipment monitoring, fault prediction, and process optimization, reducing delays and improving production efficiency.

Multimodal AI Services: The EdgeX network supports multimodal data processing, including images, video, and voice. In the medical field, for example, an Agent can process patient data at the edge and provide doctors with real-time auxiliary diagnostic advice.

Education and Training: Through the EdgeX network, AI-Agent becomes a learning assistant for students, providing personalized tutoring while protecting data privacy.

Virtual Assistants and Games: In gaming or virtual reality applications, the Agent uses EdgeX's distributed computing power to provide real-time environment generation and character interaction.

To date, EdgeX has launched a series of physical products covering hardware nodes and AI Agent devices closely tied to users. These products take advantage of the EdgeX network to ensure computing power and storage are allocated and used efficiently, giving users a smooth, unimpeded interactive experience with their smart devices.

8. ChainCatcher: As a new decentralized AI computing network, what measures does EdgeX have to attract and retain developers?

Davy: EdgeX is committed to building a vibrant developer ecosystem. We not only hope developers will use our platform, but also that they will find a sense of belonging and long-term opportunities here. EdgeX has already implemented a number of measures to help developers get started quickly and gain long-term benefits and opportunities on the platform.

At the technical level, EdgeX provides comprehensive development tools and documentation, including a developer-friendly SDK, API interfaces, and support for multiple programming languages, supplemented by detailed technical documents and step-by-step tutorials to help developers get started quickly.

In terms of incentives, developers on EdgeX can earn $EX token rewards by building high-quality applications, optimizing network performance, or providing computing resources. EdgeX has also launched a revenue-sharing model: when developers deploy applications on the EdgeX network, they directly receive a share of the usage fees paid by users.
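
As a simple illustration of how such a revenue share might be computed (the percentages below are hypothetical, not EdgeX's published terms):

```python
def split_usage_fee(fee_ex, dev_share=0.70, node_share=0.25, protocol_share=0.05):
    """Split a user's usage fee (in EX) between the app developer, the edge nodes
    that served the requests, and the protocol treasury. Shares are assumptions."""
    assert abs(dev_share + node_share + protocol_share - 1.0) < 1e-9
    return {
        "developer": fee_ex * dev_share,
        "nodes": fee_ex * node_share,
        "protocol": fee_ex * protocol_share,
    }

print(split_usage_fee(100.0))   # {'developer': 70.0, 'nodes': 25.0, 'protocol': 5.0}
```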

In terms of community construction, EdgeX has created an open developer community that encourages exchange of experiences and sharing of ideas. The core technical team will directly participate in the community to provide technical guidance and support to ensure that developer problems are promptly responded to and resolved.

In addition, EdgeX also plans rich growth opportunities for developers, such as holding hackathons and developer competitions regularly to provide a display platform. At the same time, EdgeX will help developers expand their user base through its global cooperation network and allow creativity to reach a wider market.

Application scenarios and early user rewards of the ecosystem governance token EX

9. ChainCatcher: EdgeX has released the economic model for its governance token EX on its official website. What role does EX play in the EdgeX network? What are the incentive policies for early users?

Davy: EX tokens are the core force driving the operation of the EdgeX network.

As a governance token, EX gives holders the right to participate in network decision-making, including voting on the direction of network development, protocol upgrades, and resource allocation. This decentralized governance model increases network transparency and encourages the community to participate more actively in building the EdgeX ecosystem.

At the same time, EX is the main medium of economic activity within the network. In the EdgeX ecosystem, users pay resource call fees, such as for computing power, storage, and bandwidth services, in EX. Node operators that provide resources receive EX rewards through proof of work or proof of resource. This mechanism encourages more nodes to join the network and ensures resources are used efficiently.

For early users, EdgeX has introduced a variety of incentives: users who deploy hardware nodes early enjoy higher EX mining rewards; early developers who publish high-quality applications on the EdgeX network or optimize network performance receive additional allocations from the reward pool; and an airdrop dedicated to early users is also planned to help them integrate quickly into the EdgeX ecosystem.

Overall, EX tokens are not only an incentive tool, but also an ecological connector, which closely connects users, developers and node operators, and jointly promotes the growth and prosperity of the EdgeX network. For early users, they can not only get economic returns, but also become an important member of the ecosystem by participating in network governance and share the dividends of EdgeX's development.

10. ChainCatcher: How is EdgeX's product development progressing? In what ways can users participate?

Davy: EdgeX's hardware product, the XR7 AI Dual Gateway, has been delivered and well received in the Korean market, marking an important step in its global rollout and a meaningful verification of the actual performance and application value of the EdgeX network.

At the same time, the beta version of the EdgeX APP has begun its first-phase trial in South Korea, focusing on testing network stability and user experience and laying the groundwork for subsequent expansion into the global market.

In the field of AI Agent, EdgeX's development team is committed to continuous optimization of model parameters in order to achieve significant performance improvements and make the user experience smoother, such as faster response speeds and more precise task processing capabilities.

Regarding how users participate in the EdgeX network, Korean users can currently join the network by deploying the XR7 AI Dual Gateway hardware node, contribute resources and complete tasks to obtain token rewards. In addition, users can also participate in the trial of the APP beta version, experience the service and provide feedback.

11. ChainCatcher: EdgeX once revealed that it is discussing cooperation details with Eliza, a star project of the AI Agent track. What are the specifics of the cooperation? What key role does EdgeX play in AI Agent applications? And how does edge computing optimize the performance and efficiency of AI Agents?

Davy: As a representative product of the AI Agent track, Eliza's smooth interaction and user experience fit very well with EdgeX's decentralized computing power network. EdgeX is working to integrate a white-label version of Eliza into its network, aiming to improve Eliza's service efficiency and user experience through this cooperation and achieve rapid response and real-time interaction. The specific cooperation plans between the two parties are still being refined.

In AI Agent application scenarios, EdgeX provides underlying computing power support and optimization. Through EdgeX, the computing tasks of AI Agents such as Eliza can be smoothly offloaded to distributed edge nodes for processing. This brings Eliza closer to the user's network location, reducing latency, while EdgeX's intelligent scheduling mechanism dynamically allocates tasks to the optimal nodes, improving resource utilization and the operating efficiency of the whole system.

EdgeX's edge computing framework optimizes AI Agent performance in the following three respects, bringing it to a new level in speed, intelligence, and user experience (a simple sketch follows the list):

Low latency: The task can be completed at edge nodes near the user, and there is no need to remotely transmit it to the cloud, which greatly reduces data transmission time and improves interaction fluency.

Intelligent scheduling: EdgeX can analyze the status of each node in real time and dynamically adjust the allocation of tasks according to actual conditions to ensure the rational use of resources and effectively prevent node overloading.

Distributed computing power collaboration: When a single node cannot cope with complex tasks, EdgeX's distributed architecture can quickly call multiple nodes for collaborative processing, which not only ensures the smooth completion of tasks but also improves overall efficiency.
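
The second and third points can be pictured with a toy dispatcher (assumed logic, not EdgeX's code): prefer the nearest node that is not overloaded, and when no single node has enough spare capacity, split the task across several nodes.

```python
def dispatch(task_cost, nodes, overload_threshold=0.8):
    """nodes: list of (node_id, latency_ms, load, free_capacity).
    Prefer the nearest non-overloaded node; otherwise split the task across nodes."""
    healthy = sorted((n for n in nodes if n[2] < overload_threshold), key=lambda n: n[1])
    for node_id, _, _, capacity in healthy:
        if capacity >= task_cost:
            return [(node_id, task_cost)]        # one nearby node handles the whole task
    plan, remaining = [], task_cost
    for node_id, _, _, capacity in healthy:      # collaborative fallback across nodes
        take = min(capacity, remaining)
        if take > 0:
            plan.append((node_id, take))
            remaining -= take
        if remaining <= 0:
            break
    return plan

nodes = [("seoul-01", 12, 0.5, 4), ("taipei-03", 30, 0.3, 6), ("sg-02", 45, 0.9, 10)]
print(dispatch(8, nodes))   # [('seoul-01', 4), ('taipei-03', 4)]
```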

How to measure the reliability of an AI infrastructure?

12. ChainCatcher: What qualities does an AI Agent infrastructure need in order to be recognized by the market? And as an entrepreneur in the DePIN and AI tracks, what advice would you give users on how to measure the reliability of a decentralized edge computing AI network?

Davy: A successful AI Agent infrastructure needs the following core qualities, which also apply when measuring the reliability of a decentralized edge computing AI network:

1. Evaluate the performance of the network: First, high performance and low latency are the cornerstones of user experience and system practicality. When users interact with an AI Agent, the most basic requirement is a fast response; if task processing is too slow, not only does the user experience suffer greatly, but the practicality of the whole system comes into question. The second is scalability and flexibility: excellent infrastructure should expand flexibly as user needs grow and support diverse application scenarios.

For a decentralized computing power network, users can examine the intelligence of its task allocation, the efficiency of its computing power scheduling, its response speed and processing capability, whether it can dynamically allocate computing power according to task complexity, and whether it supports diverse task types. For example, EdgeX can accurately assign tasks to nodes near the user, improving response speed, reducing latency, and meeting real-time requirements; it can also dynamically allocate computing power according to task complexity, easily handling multimodal tasks such as images, video, and voice, and adapting to scenarios from smart homes to industrial applications and even medical assistance.

2. Security and privacy protection: As data privacy and security issues become increasingly prominent, users' security requirements for infrastructure are becoming ever stricter. Users should check whether the AI network in question uses reliable encryption protocols and data storage mechanisms to protect data privacy.

3. Developer ecosystem and user community: A strong developer ecosystem and user community are the key forces driving the sustainable development of infrastructure. For decentralized AI networks, users should pay attention to whether there is strong developer support, whether new features are continuously launched and existing services optimized, and how active the user community and ecosystem building are.

To measure the reliability of decentralized edge computing AI networks, users should also consider the following two dimensions:

Node stability and participation: The reliability of a network depends to a large extent on the stability and distribution of its nodes. If nodes are too concentrated or unstable, the network can hardly be considered reliable.

Actual user experience: This is the most intuitive measure. Users can gauge a network's reliability by actually deploying nodes or running applications, checking whether they run into technical problems and whether responses meet expectations.

In summary, an AI Agent infrastructure or decentralized edge computing AI network that earns market recognition should offer high performance, scalability, and security, along with a strong developer ecosystem and user community; its reliability can then be further measured by node stability and participation and by the actual user experience.

13. ChainCatcher: What do you think about the future development of AI Agents? In the integration of encryption technology and AI Agent, what specific scenarios are you particularly optimistic about?

Davy: I think AI Agents will develop toward intelligence, personalization, and collaboration. An AI Agent will no longer be just a simple task assistant; it will become a multimodal agent that actively learns and adapts to user needs, integrates deeply into people's lives and work, handles complex tasks, and even brings an emotional dimension to interaction.

From a technical perspective, decentralization and edge computing will be important development trends. Traditional centralized AI architectures hit bottlenecks when dealing with large-scale personalized demand, whereas distributed networks can provide more flexible computing power and storage support, bringing AI Agents closer to users. In addition, multi-agent collaboration will become the norm: by introducing collaboration mechanisms, different AI Agents can share information and divide tasks to achieve more complex goals. For example, in a smart city, AI Agents across transportation, energy, security, and other fields can work together to provide an overall optimization plan for urban management.

Regarding application scenarios for the integration of encryption technology and AI, I am particularly optimistic about the following:

1. Personalized services and privacy protection: When an AI Agent provides personalized services, encryption technology protects users' sensitive data. For example, in healthcare, an AI Agent can offer users personalized health advice while ensuring that private medical data is never leaked.

2. Distributed collaboration and incentive mechanism : In a decentralized network, multiple AI agents can achieve trustworthy collaboration and division of labor through blockchain technology. Encryption technology can support transparent settlement and incentive allocation after tasks are completed through smart contracts.

3. Decentralized market and AI service transactions : Build a decentralized AI service market, allowing users to directly interact with AI Agent and pay fees, which is suitable for education, consulting, design and other fields.

4. Multi-party computation and federated learning: During AI model training, encryption technology enables secure data sharing between different parties. For example, multiple organizations can jointly train an AI model without exposing their raw data, improving model performance while protecting data privacy.
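
The core idea can be sketched with a generic federated-averaging loop (standard FedAvg on synthetic data, not a specific EdgeX or partner implementation): each organization trains locally, and only model weights, never raw data, are exchanged.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One organization's local training of a linear model; its raw data never leaves it."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)    # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_average(updates, sizes):
    """Aggregate local models weighted by each party's dataset size (FedAvg)."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(updates, sizes))

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
parties = []
for _ in range(3):                               # three organizations with private data
    X = rng.normal(size=(50, 3))
    parties.append((X, X @ true_w + rng.normal(scale=0.1, size=50)))

global_w = np.zeros(3)
for _ in range(10):                              # communication rounds: only weights move
    updates = [local_update(global_w, X, y) for X, y in parties]
    global_w = federated_average(updates, [len(y) for _, y in parties])

print(np.round(global_w, 2))                     # approaches the true coefficients [1, -2, 0.5]
```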
