Dialogue with EdgeX Founder Davy: How does a decentralized edge computing network promote the popularization of AI Agents?

ChainCatcher Selection
2025-02-11 16:56:33
EdgeX has received support from two well-known Web3 capital firms and is expected to announce specific financing details in Q2 or Q3.

Author: Grapefruit, ChainCatcher

During the Spring Festival, DeepSeek's stunning debut once again pushed AI into the public spotlight, shaking the AI industry with its strong performance and remarkably low development cost. How to reduce the operating costs of AI models, improve their efficiency, and bring them to a wider audience has now become the dominant narrative of the AI industry's development.

As early as last year, the decentralized edge AI computing network EdgeX began working to lower the barriers to running AI, aiming to build a foundational network that connects users and AI. Its strength lies in its distributed computing infrastructure: the resources AI Agents need to run are contributed by users, which advances the adoption and development of decentralized edge computing.

EdgeX is fully committed to building a decentralized AI infrastructure platform that integrates distributed computing resources with an AI scheduling and management system. The goal is an efficient, secure, and transparent decentralized computing network on which a wide range of AI models can run seamlessly in a distributed environment, promoting the implementation and application of AI technology in edge scenarios.

In simple terms, the EdgeX network gathers computing power, storage, and bandwidth contributed by participants through a decentralized computing framework, forming a global edge computing network that significantly reduces computing costs while allowing any AI model to operate efficiently and seamlessly on edge devices.

Davy, the founder of EdgeX, emphasized repeatedly in an interview with ChainCatcher that EdgeX is not just a technology platform but the practice of a philosophy: "We hope to promote the development of decentralized technology by integrating Web3 and AI, allowing AI to truly connect closely with every user."

Currently, EdgeX has launched its hardware product, the XR7 AI Dual Gateway, which has been delivered in the South Korean market to wide acclaim. Users can join the network and contribute computing power by purchasing the device and deploying it as a hardware node, earning early rewards. Meanwhile, the beta version of the EdgeX APP has begun its first phase of trials in South Korea, allowing users to take part in testing the network.

It is worth mentioning that founder Davy also revealed to ChainCatcher that EdgeX has successfully obtained early support from two well-known domestic Web3 capital firms and is actively engaging in in-depth discussions with several traditional and Web3 capital firms in North America to explore deep cooperation on key matters such as leading investments. It is expected that specific progress and results of the financing will be gradually announced to the public between the second and third quarters of this year.

The Story Behind the Creation of EdgeX

1. ChainCatcher: As the founder of EdgeX, can you share your experiences in the Web3 and AI industries, as well as the opportunity that led to the launch of the AI infrastructure project? What are your main responsibilities in the EdgeX project?

Davy: Since 2015, I have been deeply involved in the data center and cloud industry, collaborating with several Web3 companies to build nodes and provide comprehensive infrastructure support, as well as technical support for leading CEX matching engines. Additionally, I have participated in several Silicon Valley projects from product conception to successful listing on exchanges, deeply engaging in key aspects such as infrastructure blueprint planning, product development, daily operations, and marketing.

In the AI field, I was exposed to machine learning technology early on, especially in storage and computing, and collaborated with several data scientists to develop and design AI applications. Subsequently, I participated in several large model projects in Silicon Valley as a consultant, focusing on optimizing multimodal models, fine-tuning vertical domain models, and efficient training.

In 2024, I observed that the AI field is undergoing a transformation from centralization to decentralization, which coincides with the evolution from Web2 to Web3. As the demand for distributed computing power in AI continues to grow, edge computing, as a key technology, can effectively meet this demand. Based on this, I decided to integrate the advantageous technologies and experiences of Web3 and AI to launch the EdgeX project, focusing on building a decentralized AI infrastructure.

Currently, in the EdgeX project, I am mainly responsible for the technical architecture design of EdgeX, designing an intelligent computing power scheduling system to ensure efficient collaboration of computing resources; building the technical infrastructure of the EdgeX computing network to provide stable and reliable support for AI applications; and optimizing AI applications, gaining insights into the needs of various industries, and customizing AI solutions for vertical domains.

2. ChainCatcher: What is the product positioning of EdgeX and what vision do you pursue? What pain points in the current market do you aim to address?

Davy: EdgeX is committed to building a decentralized AI infrastructure platform that integrates distributed computing and intelligent task management to promote the implementation and application of AI technology in edge scenarios. It constructs an efficient, secure, and transparent computing platform that allows AI models to run seamlessly in a distributed environment, providing strong underlying technical support for decentralized applications and various scenarios.

Currently, the AI industry faces numerous challenges: centralized computing costs remain high, data privacy and security are concerning, and support for edge scenarios is lacking. Specifically, traditional AI models heavily rely on expensive centralized cloud computing resources, leading to high computing costs that limit the innovation pace of small teams and developers; centralized data storage models act like ticking time bombs, constantly threatening data security and privacy; and the operational efficiency of most AI models on edge devices is far from satisfactory. EdgeX aims to address these issues.

In simple terms, EdgeX significantly reduces computing costs through distributed computing, opening the door to innovation for small teams and developers; its distributed computing network lets any AI model run efficiently and seamlessly on edge devices, filling the gap in AI operation at the edge. At the same time, its decentralized infrastructure provides a higher level of protection for data privacy and security.

EdgeX is not just a technology platform, but a practice of a philosophy. We hope to promote the development of decentralized technology by integrating Web3 and AI technologies, making AI truly benefit every user. At the same time, we are also committed to providing more innovative solutions for developers and enterprises, jointly building an open, shared, prosperous AI ecosystem.

3. ChainCatcher: Can you introduce the composition of the EdgeX team? What unique advantages do you have in the field of AI and Web3 integration?

Davy: The EdgeX team is a diverse and international team composed of outstanding elites in global technology research and development, market operations, and brand promotion. Members are spread across international cities such as Silicon Valley, Singapore, and Taipei. This global distributed layout allows us to quickly capture global market demands and find the best partners and resources.

Core team members have held key positions in top global technology companies such as Amazon, Alibaba Cloud, and Tencent, possessing the ability to successfully drive projects from 0 to 1, as well as extensive industry resources to support effective implementation of projects globally.

In the technical field, the EdgeX team has deep expertise in both AI and Web3. Particularly in key areas such as large models, multimodal technology, and decentralized computing power scheduling. At the same time, we have a deep understanding and mastery of core modules such as Web3 token economics and smart contract design, enabling us to confidently address various challenges in the integration of AI and Web3 technologies. Moreover, the EdgeX team excels in commercialization, with members having rich experience in market promotion, user growth, and supply chain management.

Additionally, we have the support of a top Web3 advisory team, which will provide valuable guidance in product technology, token economics design, market expansion, and strategic planning, providing a solid backing for the rapid development of EdgeX.

4. ChainCatcher: According to EdgeX's official roadmap, the project plans to complete seed round financing between the second and fourth quarters of 2024. What is the current progress of this round of financing?

Davy: As of now, the EdgeX project has successfully obtained support from two well-known domestic Web3 capital firms. At the same time, we are actively engaging in in-depth discussions with several traditional and Web3 capital firms in North America to explore cooperation on key matters such as leading investments.

According to the established plan, EdgeX expects to officially announce the specific progress and results of this round of financing between Q2 and Q3 of 2025.

Features and Advantages of EdgeX Products

5. ChainCatcher: What are the specific operating mechanisms, core components, and main functions of the EdgeX network?

Davy: The EdgeX network forms a global edge computing network through a decentralized computing framework that gathers computing power, storage, and bandwidth contributed by participants. Users purchase and deploy EdgeX hardware nodes to participate, and these nodes generate proof of work (PoW) after completing tasks to earn token rewards.

In this process, EdgeX has designed an intelligent task scheduling system. For example, if an AI model needs to run on an edge device, this task will be split and assigned to different nodes for execution, ensuring the entire network operates efficiently while maintaining low latency and high concurrency.
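
To make this scheduling idea concrete, here is a minimal Python sketch of splitting one inference job into sub-tasks and greedily assigning them to nodes using a simple latency-and-capacity score. It is illustrative only: the node fields, the scoring weights, and the load accounting are assumptions made for this sketch, not EdgeX's actual scheduler.

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: str
    latency_ms: float      # measured round-trip latency to the requesting user
    free_capacity: float   # fraction of compute currently idle, 0.0 - 1.0

def split_task(total_work: int, num_chunks: int) -> list[int]:
    """Split an inference job into roughly equal sub-tasks (units of work)."""
    base, rem = divmod(total_work, num_chunks)
    return [base + (1 if i < rem else 0) for i in range(num_chunks)]

def assign_chunks(chunks: list[int], nodes: list[Node]) -> dict[str, int]:
    """Greedily place each sub-task on the best-scoring node.

    The score favors high idle capacity and low latency; the 0.01 weight
    and the 0.1 load decrement are arbitrary illustrative choices.
    """
    assignment: dict[str, int] = {}
    for work in sorted(chunks, reverse=True):          # place big chunks first
        best = max(nodes, key=lambda n: n.free_capacity - 0.01 * n.latency_ms)
        assignment[best.node_id] = assignment.get(best.node_id, 0) + work
        best.free_capacity -= 0.1                      # crude load accounting
    return assignment

if __name__ == "__main__":
    nodes = [Node("edge-kr-01", 12.0, 0.80),
             Node("edge-kr-02", 35.0, 0.90),
             Node("edge-sg-01", 80.0, 0.95)]
    print(assign_chunks(split_task(total_work=100, num_chunks=4), nodes))
```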

The core components of the EdgeX network are the hardware nodes, the EdgeX proprietary operating system, and the AI-Agent system:

  • Hardware nodes not only run AI model inference but also contribute storage and bandwidth resources;
  • The EdgeX operating system runs on the hardware nodes, providing computing power optimized for edge scenarios;
  • The AI-Agent system performs distributed AI scheduling, completing data analysis and inference locally and calling on high-performance nodes when necessary to finish tasks.

Additionally, the EdgeX network combines decentralized protocols and distributed storage systems to ensure data security and network stability.

The various components of EdgeX work together to build a decentralized, efficient, and secure computing ecosystem, providing better infrastructure support for AI inference and other AI applications.
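
The local-first behavior of the AI-Agent component mentioned above, completing inference on the device and escalating only when necessary, can be sketched roughly as follows. The confidence threshold and the two inference stubs are hypothetical stand-ins, not EdgeX APIs.

```python
import random

LOCAL_CONFIDENCE_THRESHOLD = 0.75   # assumed cutoff, not an EdgeX constant

def run_local_inference(prompt: str) -> tuple[str, float]:
    """Stand-in for a lightweight model on the edge device.

    Returns (answer, confidence); the confidence here is random for
    illustration, whereas a real agent would use the model's own score.
    """
    return f"local answer to: {prompt}", random.uniform(0.5, 1.0)

def run_remote_inference(prompt: str) -> str:
    """Stand-in for dispatching the task to a high-performance node."""
    return f"remote answer to: {prompt}"

def answer(prompt: str) -> str:
    """Local-first execution: escalate only when the edge result is weak."""
    result, confidence = run_local_inference(prompt)
    if confidence >= LOCAL_CONFIDENCE_THRESHOLD:
        return result                      # stay on-device: low latency, data stays local
    return run_remote_inference(prompt)    # fall back to a stronger node in the network

if __name__ == "__main__":
    print(answer("adjust the living-room temperature to 22 degrees"))
```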

6. ChainCatcher: How does EdgeX differ from other decentralized computing DePIN projects on the market, such as Aethir, io.net, Gradient Network, and Theta?

Davy: First of all, most of the current decentralized computing networks on the market tend to focus on general computing, while EdgeX focuses on the deep integration of edge computing and AI. It particularly emphasizes the optimization of AI inference tasks and resource scheduling, aiming to precisely serve various specific AI application scenarios, thus having unique advantages in meeting specific computing power demands.

Secondly, unlike large-scale distributed networks that rely on centralized data centers, EdgeX emphasizes the autonomous computing capabilities of edge nodes, which is key to its adaptation to AI inference tasks. Through an intelligent task scheduling system, EdgeX can accurately assign AI tasks to the most suitable edge nodes, significantly reducing latency and improving real-time performance.

In terms of product design, EdgeX combines a software and hardware integrated solution, having launched its own hardware nodes, while most similar computing projects mainly focus on software platforms, which is an important distinguishing feature from other computing projects. EdgeX hardware nodes are equipped with an exclusive operating system and have been deeply optimized for edge computing and AI scenarios. This design not only significantly enhances computing efficiency but also provides users with a more stable and efficient solution.

In terms of token economics, EdgeX combines proof of work and proof of resource mechanisms to incentivize contributors to provide efficient computing and storage resources. This mechanism ensures the rational allocation of network resources and effectively avoids resource waste.
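
As a rough illustration of how a blended proof-of-work and proof-of-resource scheme might allocate rewards, the sketch below scores each node on completed tasks plus contributed resources and splits an epoch's token pool pro rata. All weights and field names are assumptions made for this example; the interview does not disclose EdgeX's actual parameters.

```python
from dataclasses import dataclass

# Illustrative weights only; EdgeX's real PoW/PoR parameters are not public here.
WORK_WEIGHT, RESOURCE_WEIGHT = 0.6, 0.4

@dataclass
class NodeReport:
    node_id: str
    tasks_completed: int     # proof-of-work side: verified inference tasks
    uptime_hours: float      # proof-of-resource side: availability
    storage_gb: float        # contributed storage
    bandwidth_gb: float      # bandwidth served

def node_score(r: NodeReport) -> float:
    """Blend a work score and a resource score into a single number."""
    work = float(r.tasks_completed)
    resource = 0.5 * r.uptime_hours + 0.3 * r.storage_gb + 0.2 * r.bandwidth_gb
    return WORK_WEIGHT * work + RESOURCE_WEIGHT * resource

def distribute_epoch_rewards(reports: list[NodeReport],
                             epoch_pool: float) -> dict[str, float]:
    """Split the epoch's token pool pro rata by each node's blended score."""
    total = sum(node_score(r) for r in reports) or 1.0
    return {r.node_id: epoch_pool * node_score(r) / total for r in reports}

if __name__ == "__main__":
    reports = [NodeReport("edge-kr-01", 120, 24.0, 500.0, 80.0),
               NodeReport("edge-kr-02", 40, 20.0, 1000.0, 200.0)]
    print(distribute_epoch_rewards(reports, epoch_pool=10_000.0))
```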

In terms of application scenarios, EdgeX has a broader range of applications, supporting not only general decentralized computing needs but also focusing on multimodal AI tasks, lightweight inference on edge devices, and IoT scenario applications. This diversified technical coverage and practical application allow EdgeX to not be limited to a specific type of task or service, showcasing strong versatility and flexibility.

In this regard, EdgeX is not only a general decentralized computing network but also an innovative platform focused on edge computing and AI tasks. It can bring more possibilities for the integration of AI and Web3.

7. ChainCatcher: What specific application scenarios and products has EdgeX implemented?

Davy: Currently, EdgeX has successfully achieved deep integration and real-time connection between users' physical devices and AI Agents. Users can easily interact with AI Agents through their physical devices, making the Agent a personal intelligent assistant. This allows the Agent to not just be a virtual entity, but a smart device that can accurately understand, continuously learn, and execute user commands. EdgeX's Agent can provide localized decision support and flexibly obtain the necessary computing and storage resources through the EdgeX distributed network to meet various complex computing needs.

Application scenarios include:

Smart Home: EdgeX's Agent acts as a home assistant, interconnecting with IoT devices in the home, such as intelligently adjusting air conditioning and lighting based on real-time analysis of user habits while protecting data privacy.

Industrial Automation: In factories or production lines, EdgeX supports edge AI Agents to complete equipment monitoring, fault prediction, and process optimization, reducing latency and improving production efficiency.

Multimodal AI Services: The EdgeX network can support multimodal data processing, including images, videos, and voice. For example, in the medical field, the Agent processes patient data at the edge to provide real-time diagnostic suggestions to doctors.

Education and Training: Through the EdgeX network, AI Agents become learning assistants for students, providing personalized tutoring while protecting data privacy.

Virtual Assistants and Gaming: In gaming or virtual reality applications, Agents utilize EdgeX's distributed computing to provide real-time environment generation and character interaction support.

As of now, EdgeX has successfully launched a series of physical products, including hardware nodes and AI Agent devices closely tied to users. These products leverage the advantages of the EdgeX network to ensure efficient configuration and utilization of computing and storage resources, thus achieving smooth and seamless interaction between users and intelligent devices.

8. ChainCatcher: As a new decentralized AI computing network, what measures does EdgeX take to attract and retain developers?

Davy: EdgeX is committed to creating a vibrant and thriving developer ecosystem, not only hoping developers will use our platform but also expecting them to find a sense of belonging and long-term development opportunities here. Currently, EdgeX has implemented multiple initiatives to help developers quickly get started on the EdgeX platform and gain long-term benefits and development opportunities.

On the technical side, EdgeX provides comprehensive development tools and detailed documentation support, equipped with developer-friendly SDKs, API interfaces, and support for multiple programming languages, along with detailed technical documentation and step-by-step tutorials to assist developers in getting started quickly.

In terms of incentive mechanisms, developers on EdgeX can earn $EX tokens by developing high-quality applications, optimizing network performance, or providing computing resources. Additionally, EdgeX has launched a revenue-sharing model, allowing developers to directly earn revenue shares from user payments for applications deployed on the EdgeX network.
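
A minimal sketch of what such a revenue-sharing split could look like is shown below; the ratios and party names are purely illustrative assumptions, not EdgeX's published terms.

```python
# Illustrative split only; EdgeX's actual revenue-sharing ratios are not public here.
DEVELOPER_SHARE, NODE_SHARE, PROTOCOL_SHARE = 0.70, 0.25, 0.05

def split_payment(amount_ex: float) -> dict[str, float]:
    """Divide a user's EX payment for an application among the parties."""
    assert abs(DEVELOPER_SHARE + NODE_SHARE + PROTOCOL_SHARE - 1.0) < 1e-9
    return {
        "developer": amount_ex * DEVELOPER_SHARE,        # the app's author
        "node_operators": amount_ex * NODE_SHARE,        # nodes that served the request
        "protocol_treasury": amount_ex * PROTOCOL_SHARE, # ecosystem fund
    }

print(split_payment(100.0))
```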

In community building, EdgeX has created an open developer community that encourages experience exchange and idea sharing. The core technical team actively participates in the community, providing technical guidance and support to ensure developers' questions are addressed and resolved promptly.

Furthermore, EdgeX plans rich growth opportunities for developers, such as regularly hosting hackathons and developer competitions to provide a platform for showcasing. At the same time, EdgeX will help developers expand their user base through its global partnership network, allowing their ideas to reach a broader market.

Application Scenarios of the Governance Token EX and Early User Rewards

9. ChainCatcher: EdgeX has released the economic model of the governance token EX on its official website. What role does EX play in the EdgeX network? What incentive policies are in place for early users?

Davy: The EX token is the core driving force behind the operation of the EdgeX network.

As a governance token, EX grants holders the right to participate in network decision-making, including voting on key matters such as network development direction, protocol upgrades, and resource allocation. This decentralized governance model promotes transparency in the network and encourages more active community participation in the construction of the EdgeX ecosystem.

At the same time, EX is the main medium for economic activities within the network. In the EdgeX ecosystem, users need to use EX to pay for resource invocation fees, such as computing power, storage, and bandwidth services. Node operators earn EX rewards after providing resources through proof of work (PoW) or proof of resource (PoR). This mechanism incentivizes more nodes to participate in network operations, ensuring efficient utilization of resources.

For early users, EdgeX has launched various incentive measures: Users who deploy hardware nodes early can enjoy higher EX token mining rewards; additionally, early developers who publish high-quality applications or optimize network performance on the EdgeX network can receive allocations from an additional reward pool; there are also plans to launch exclusive token airdrop events for early users to help them quickly integrate into the EdgeX ecosystem.

Overall, the EX token is not only an incentive tool but also an ecosystem connector that tightly links users, developers, and node operators, jointly promoting the growth and prosperity of the EdgeX network. Early users can not only gain economic returns but also participate in network governance, becoming important members of the ecosystem and sharing in the dividends of EdgeX's development.

10. ChainCatcher: How is the product development progress of EdgeX? What ways can users participate?

Davy: The EdgeX hardware product XR7 AI Dual Gateway has been successfully delivered in the South Korean market and has received widespread acclaim, marking an important step in global promotion and serving as a significant validation of the actual performance and application value of the EdgeX network.

At the same time, the beta version of the EdgeX APP has launched the first phase of trials in South Korea, focusing on testing network stability and user experience to lay the groundwork for subsequent global market expansion.

In the AI Agent field, EdgeX's development team is committed to continuously optimizing model parameters to achieve significant performance improvements, making user experiences smoother, including faster response times and more precise task handling capabilities.

As for how users can participate in the EdgeX network, currently, South Korean users can join the network by deploying XR7 AI Dual Gateway hardware nodes, contributing resources and completing tasks to earn token rewards. Additionally, users can participate in testing the beta version of the APP to experience the service and provide feedback.

11. ChainCatcher: EdgeX has revealed that it is discussing cooperation details with the leading AI Agent product Eliza. What are the specific cooperation details? What key roles does EdgeX play in the application of AI Agents? How does it optimize the performance and efficiency of AI Agents through edge computing?

Davy: As a representative product in the AI Agent field, Eliza's smooth interaction capabilities and user experience align very well with EdgeX's decentralized computing network. EdgeX is committed to integrating Eliza's white-label version into its network, aiming to enhance Eliza's service efficiency and user experience through this collaboration, achieving rapid response and real-time interaction. The specific cooperation plan between the two parties is still being refined.

In the application scenarios of AI Agents, EdgeX provides the underlying computing support and optimization. Through EdgeX, the computing tasks of AI Agents like Eliza can be smoothly transferred to distributed edge nodes for processing. This model allows Eliza to be closer to the user's network location, thereby reducing latency. At the same time, EdgeX's intelligent scheduling mechanism can dynamically assign tasks to the optimal nodes, enhancing the overall resource utilization and operational efficiency of the system.

EdgeX's edge computing framework optimizes the performance of AI Agents in the following three aspects, achieving new heights in speed, intelligence, and user experience.

Low Latency: Tasks can be completed at edge nodes near the user without needing to be transmitted to the cloud, significantly reducing data transmission time and improving interaction smoothness.

Intelligent Scheduling: EdgeX can analyze the status of each node in real-time and dynamically adjust task assignments based on actual conditions, ensuring rational resource utilization and effectively preventing node overload.

Distributed Computing Collaboration: When a single node cannot handle complex tasks, EdgeX's distributed architecture can quickly call multiple nodes to collaborate, ensuring task completion while enhancing overall efficiency.
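
The overload-aware side of this scheduling can be illustrated with a short sketch: prefer the lowest-latency node whose utilization is under a cap, and degrade gracefully when every node is busy. The threshold value and the load-update rule are assumptions made for illustration, not EdgeX's actual policy.

```python
from dataclasses import dataclass, field

OVERLOAD_THRESHOLD = 0.85   # assumed utilization cap, not an EdgeX constant

@dataclass
class NodeStatus:
    node_id: str
    latency_ms: float
    utilization: float                       # 0.0 (idle) .. 1.0 (saturated)
    queue: list[str] = field(default_factory=list)

def pick_node(nodes: list[NodeStatus]) -> NodeStatus:
    """Prefer the lowest-latency node that is not overloaded."""
    eligible = [n for n in nodes if n.utilization < OVERLOAD_THRESHOLD]
    pool = eligible or nodes                 # degrade gracefully if all are busy
    return min(pool, key=lambda n: n.latency_ms)

def dispatch(task_id: str, nodes: list[NodeStatus]) -> str:
    """Send a task to the chosen node and update its (crudely modeled) load."""
    node = pick_node(nodes)
    node.queue.append(task_id)
    node.utilization = min(1.0, node.utilization + 0.1)
    return node.node_id

if __name__ == "__main__":
    nodes = [NodeStatus("edge-a", 10.0, 0.9), NodeStatus("edge-b", 25.0, 0.4)]
    print([dispatch(f"task-{i}", nodes) for i in range(5)])
```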

How to Measure the Reliability of an AI Infrastructure?

12. ChainCatcher: What qualities must an AI Agent infrastructure possess to gain market recognition? As an entrepreneur in the DePIN and AI fields, what advice do you have for users on how to measure the reliability of a decentralized edge computing AI network?

Davy: The successful construction of an AI Agent infrastructure must revolve around the following core qualities, which are also applicable for measuring the reliability of a decentralized edge computing AI network:

1. Assessing Network Performance: First is high performance and low latency, which are the cornerstones of user experience and system practicality. Users expect quick responses when using AI Agents; if task processing speeds are too slow, not only will user experience suffer, but the overall practicality of the system will also be questioned. Secondly, scalability and flexibility are essential; an excellent infrastructure should be able to flexibly expand as user demands grow and support diverse application scenarios.

For a decentralized computing network, users can evaluate how intelligently it allocates tasks, how efficiently it schedules computing power, its response speed and processing capabilities, whether it can dynamically allocate computing power according to task complexity, and whether it supports diverse application scenarios. For example, EdgeX can accurately assign tasks to nodes near users, improving response speed and reducing latency to meet real-time needs; it can dynamically allocate computing power according to task complexity, handling multimodal tasks such as images, video, and voice; and it adapts to scenarios ranging from smart homes to industrial applications and even medical assistance.

2. Security and Privacy Protection: As data privacy and security issues become increasingly prominent, users have stricter security requirements for infrastructures. Users should examine whether the corresponding AI network employs reliable encryption protocols and data storage mechanisms to protect data privacy.

3. Developer Ecosystem and User Community: A strong developer ecosystem and user community are key driving forces for the continuous development of infrastructure. For decentralized AI networks, users should pay attention to whether there is strong developer support, whether new features or optimizations of existing services are continuously launched, and how active the user community and ecosystem building are.

To measure the reliability of a decentralized edge computing AI network, users should also consider the following two dimensions:

Node Stability and Participation: The reliability of a network largely depends on the stability and distribution of its nodes. If nodes are too centralized or unstable, the network can hardly be considered reliable.

Actual User Experience: This is the most intuitive measure. Users can experience network reliability by actually deploying nodes or running applications, such as whether they encounter technical issues and whether responses meet standards.

In summary, an AI Agent infrastructure or decentralized edge computing AI network that gains market recognition should possess qualities such as high performance, scalability, security, a strong developer ecosystem, and user community, and further measure its reliability through node stability and participation as well as actual user experience.

13. ChainCatcher: What are your views on the future development of AI Agents? In the integration of cryptography and AI Agents, which specific scenarios do you particularly favor?

Davy: I believe the future of AI Agents will develop towards intelligence, personalization, and collaboration. AI Agents will no longer just be simple task assistants; they will become multimodal intelligent entities that actively learn and adapt to user needs, deeply integrating into people's lives and work, handling complex tasks, and providing emotional experiences in interactions.

From a technical perspective, decentralization and edge computing will be important development trends. Traditional centralized AI architectures face bottlenecks when handling large-scale personalized demands, while distributed networks can provide more flexible computing and storage support, allowing AI Agents to be closer to users. Additionally, multi-Agent collaboration will become the norm; by introducing collaboration mechanisms, different AI Agents can share information and divide tasks to achieve more complex goals. For example, in a smart city, AI Agents in transportation, energy, and security can work together to provide overall optimization solutions for city management.

Regarding the integration of cryptography and AI applications, I particularly favor:

1. Personalized Services and Privacy Protection: When AI Agents provide personalized services, they can use encryption technology to protect sensitive user data. For example, in the healthcare field, AI Agents can provide personalized health advice while ensuring the privacy of medical data is not compromised.

2. Distributed Collaboration and Incentive Mechanisms: In decentralized networks, multiple AI Agents can achieve trusted collaboration and division of labor through blockchain technology. Cryptography can support transparent settlement and incentive distribution after task completion through smart contracts.

3. Decentralized Market and AI Service Transactions: Building a decentralized AI service market allows users to interact directly with AI Agents and pay fees, applicable in fields such as education, consulting, and design.

4. Multi-party Computation and Federated Learning: During AI model training, encryption technology can enable secure sharing of data between different parties. For instance, multiple organizations can jointly train AI models without exposing their original data, thus improving model performance while protecting data privacy.
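
As a concrete example of the federated-learning pattern described here, the sketch below uses a simple linear model and standard FedAvg-style weighting (not any EdgeX-specific protocol): several hypothetical organizations train jointly while only model weights, never raw data, leave each party's machine.

```python
import numpy as np

def local_update(weights: np.ndarray, data: np.ndarray, labels: np.ndarray,
                 lr: float = 0.1) -> np.ndarray:
    """One local gradient step (linear model, squared loss) on private data.

    Each party runs this on its own machine; only the updated weights,
    never the raw data, are sent back to the aggregator.
    """
    preds = data @ weights
    grad = data.T @ (preds - labels) / len(labels)
    return weights - lr * grad

def federated_average(updates: list[np.ndarray], sizes: list[int]) -> np.ndarray:
    """Aggregate local updates weighted by each party's dataset size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(updates, sizes))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    global_w = np.zeros(3)
    # Three hypothetical organizations, each holding private data.
    parties = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(3)]
    for _ in range(20):                                  # communication rounds
        updates = [local_update(global_w, X, y) for X, y in parties]
        global_w = federated_average(updates, [len(y) for _, y in parties])
    print(global_w)
```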
