AIOZ W3AI: Revolutionizing Web3 AI DePIN Computing

A Comprehensive Web3 Solution for Datasets, Modeling, Inference, and Training Powered via a Decentralized Physical Infrastructure Network (DePIN). Empowering Users to Earn Through Dataset and AI Model Creation and Contribution.

AIOZ Network, Mar 2024
CONTENTS

1 Executive Summary
2 Introduction
   2.1 AIOZ Network: Building a Decentralized Future
   2.2 A Two-Tiered Approach
3 Addressing the AI Market Opportunity
4 AIOZ W3AI Overview
   4.1 The Cornerstone: A Thriving Decentralized Network (AIOZ DePIN)
   4.2 Unleashing AI Potential with the AIOZ Web3 Infrastructure
   4.3 The W3AI Marketplace: A Decentralized Hub for Innovation
5 AIOZ W3AI Architecture
   5.1 Overview: A Decentralized Foundation for AI
      5.1.1 W3AI Platform: Unleashing User Potential
      5.1.2 W3AI Infrastructure: The Powerhouse Beneath
      5.1.3 Collaboration and Innovation
   5.2 AI-Optimized Routing Architecture
   5.3 AI Management for Task Assignment
   5.4 AIOZ W3AI Computing Workflow
      5.4.1 Design Philosophy: Balancing Security and Performance
      5.4.2 A Secure and Streamlined Workflow
   5.5 Democratizing AI Through a Decentralized Marketplace
      5.5.1 Key Stakeholders and Their Contributions
      5.5.2 Unlocking Monetization Opportunities
      5.5.3 Empowering Collaboration, Guaranteeing Privacy
      5.5.4 User-Friendly AI for All
   5.6 AIOZ Token: Fueling the Decentralized AI Marketplace
6 Conclusion
1 Executive Summary
The transformative power of AI is undeniable, impacting everything from healthcare and finance to media
and entertainment. As AI applications continue to proliferate, the demand for scalable, efficient, and
cost-effective AI computing solutions is paramount. However, traditional centralized AI infrastructure
often faces limitations in scalability, cost, and privacy, hindering broader adoption.
AIOZ W3AI addresses these challenges by offering a groundbreaking decentralized AI computing infrastructure built on top of the AIOZ Network’s DePIN (Decentralized Physical Infrastructure Network). Through a two-tiered architecture, AIOZ W3AI unlocks the power of a global network of
human-powered nodes:
• Web3 Infrastructure: Provides a scalable and cost-effective foundation for distributed AI computation, leveraging innovative communication protocols for efficient resource allocation and
user privacy protection.
• Web3 AI Platform: Empowers developers and businesses to access diverse AI solutions, from
traditional models like classification and NLP to cutting-edge capabilities like LLMs and Generative AI. Users can contribute models and datasets, conduct training and inference, and even
monetize their contributions through a decentralized marketplace.
This whitepaper explores the innovative architecture of AIOZ W3AI, showcasing its capabilities as a
decentralized infrastructure and a platform for diverse AI solutions. We delve into the benefits of decentralization, highlighting how AIOZ W3AI paves the way for a more democratic and accessible future of
AI.
2 Introduction

2.1 AIOZ Network: Building a Decentralized Future
AIOZ Network is a Layer-1 blockchain with full Ethereum and Cosmos interoperability that lays the
foundation for a decentralized AI ecosystem.
The AIOZ Network architecture addresses the limitations of centralized systems by leveraging the collective power of the AIOZ DePIN. Thousands of human-powered AIOZ DePIN nodes collectively provide the computational resources that fuel various decentralized functionalities, including:
• Decentralized Storage: Secure and efficient storage of data for AI applications, eliminating
reliance on centralized servers and mitigating the risk of data breaches.
• Decentralized Content Delivery: Robust infrastructure for delivering AI models and datasets
efficiently, ensuring seamless access and scalability for resource-intensive AI tasks.
• Decentralized AI Computing: The crucial pillar of AIOZ Network, enabling distributed AI training and inference tasks in a secure and cost-effective manner. This breaks away from the limitations of centralized computing, fostering greater accessibility and fostering trust for users.
Through these functionalities, AIOZ Network creates a robust foundation upon which AIOZ W3AI
builds its innovative two-tiered architecture for decentralized AI computing, unlocking the full potential
of AI in the Web3 space.
2.2 A Two-Tiered Approach
AIOZ W3AI addresses the limitations of centralized AI infrastructure and unlocks the potential of decentralized AI through its unique two-tiered architecture:
1. DePIN Infrastructure:
Built on top of the AIOZ Network’s DePIN, this robust infrastructure provides the foundation for distributed AI computation.
Multigraph topology ensures efficient communication routes between AIOZ DePINs, minimizing computing costs and maximizing processing speed.
This decentralized approach significantly reduces the risk of server bottlenecks and enhances user privacy
by eliminating single points of control.
2. Web3 AI Platform:
W3AI sits atop the infrastructure, enabling developers and businesses to access the power of decentralized
AI in a user-friendly way.
W3AI offers a wide range of AI solutions, including traditional architectures like classification, detection,
and NLP, as well as cutting-edge capabilities like LLMs and Generative AI.
By enabling users to contribute AI models and datasets, conduct training and inference tasks, and even monetize their contributions through a decentralized, collaborative, incentivized AI marketplace, W3AI combines a robust infrastructure with a comprehensive platform, fostering a more democratic and accessible landscape for AI development and utilization.
Figure 1: The W3AI Decentralized Computing Infrastructure powered by a network of AIOZ Nodes.
Purple areas indicate storage node distribution, and blue areas represent computing node distribution.
Red areas encrypt data and transmit it to blue areas for decentralized AI task execution.
3 Addressing the AI Market Opportunity
AIOZ W3AI emerges at the forefront of a rapidly evolving landscape, capitalizing on three key market
trends that unlock vast potential for decentralized AI computing:
1. The Data Deluge and the Rise of AI:
The global datasphere is expanding rapidly, fuelled by the proliferation of connected devices, sensors, and online activities, and is expected to reach 175 zettabytes by 2025.
Businesses in all corners of industry increasingly recognize AI’s transformative power for tasks like data
analysis, predictive modeling, and personalized experiences.
However, traditional centralized cloud computing solutions struggle to handle advanced AI’s sheer volume and processing demands, leading to scalability limitations, high costs, and potential privacy concerns.
2. The Edge Computing Revolution:
The rise of edge computing brings processing power closer to the data source, enabling faster response
times and reduced latency, crucial for real-time AI applications.
3. The Need for a Secure and Accessible AI Ecosystem:
Existing AI solutions often raise concerns regarding data privacy and security as users relinquish control
of their data to centralized providers.
Additionally, the barrier to entry for accessing powerful AI resources can be high, hindering innovation
and limiting participation for smaller players and individuals.
By addressing these critical market trends and challenges, AIOZ W3AI unlocks a burgeoning opportunity
for:
• Businesses: Leverage decentralized AI for improved efficiency, cost savings, and innovation
across diverse applications.
• Developers: Access a secure and scalable platform to build and deploy AI models with greater
flexibility and control.
• Data owners: Maintain control and monetize their data through secure and transparent AI marketplaces.
4 AIOZ W3AI Overview
Figure 2: Overview of AIOZ Network’s Web3 Infrastructure for Storage and Compute.
4.1 The Cornerstone: A Thriving Decentralized Network (AIOZ DePIN)
The foundation of AIOZ W3AI lies in its vast, robust network of distributed, human-powered edge AIOZ DePIN nodes worldwide. Each node contributes its computing resources, including storage, CPU, and GPU power, to form a decentralized powerhouse.
This approach unlocks several advantages:
• Unmatched Scalability and Availability: Tasks are no longer bottlenecked by centralized servers.
The distributed nature of AIOZ DePIN ensures efficient processing and high availability, even
for demanding workloads.
• Enhanced Security and Privacy: Data is never stored in a single location, minimizing risks
associated with centralized breaches. Additionally, AIOZ leverages blockchain technology to
further safeguard user privacy, ensuring users retain complete control over their data.
4.2 Unleashing AI Potential with the AIOZ Web3 Infrastructure
AIOZ W3AI leverages the power of the AIOZ DePIN through its comprehensive Web3 infrastructure,
offering a suite of functionalities:
• AIOZ W3S (Web3 Storage) Infrastructure: Built on AIOZ Network’s decentralized content
delivery network (dCDN), W3S revolutionizes how digital assets are stored and distributed.
The AIOZ W3S ensures efficient, secure, and globally accessible content delivery, empowering
media giants and individual creators alike.
• AIOZ W3AI (Web3 AI) Infrastructure: This comprehensive solution unlocks the true potential of distributed AI processing:
– W3AI Inference: Brings AI closer to the data source by running trained models on edge
AIOZ DePIN. This enables real-time, location-aware AI insights with significantly lower
latency compared to centralized solutions.
– W3AI Training: Embraces federated learning combined with privacy-preserving techniques such as Homomorphic Encryption to train and improve AI models collaboratively. This allows diverse participants to contribute their data and expertise, fostering a continuous cycle of AI
advancement.
4.3 The W3AI Marketplace: A Decentralized Hub for Innovation
AIOZ W3AI goes beyond just infrastructure; it fosters a thriving marketplace for AI assets and expertise.
This decentralized AI marketplace empowers users and organizations to:
• Contribute AI Datasets and Models: Share valuable data (e.g., a medical imaging dataset) and
pre-trained models with the community, accelerating AI development.
• Monetize Your Work: Earn rewards for contributions and leverage the marketplace to sell access
to your AI assets.
• Build and Deploy AI dApps: Develop and deploy innovative AI applications directly on the
platform, fostering a vibrant ecosystem of decentralized AI solutions.
This democratized approach offers significant advantages:
• For Developers and Researchers: Gain access to a vast pool of computing resources for complex
AI tasks, all while contributing to the collective intelligence of the network.
• For Businesses: Enjoy significant cost savings compared to centralized cloud solutions. The
platform’s scalability and security enable businesses of all sizes – from large enterprises to agile
startups – to leverage powerful AI capabilities.
• For Individuals: AIOZ W3AI empowers individuals to contribute their unused computing resources and participate in the AI revolution. The user-friendly platform allows even those without extensive technical knowledge to benefit from AI.
5 AIOZ W3AI Architecture
The AIOZ W3AI Architecture underpins a secure and collaborative environment for AI development.
It utilizes advanced privacy-preserving techniques alongside Decentralized Federated Learning (DFL)
to safeguard data throughout the AI model creation process. This innovative architecture empowers a
diverse community, including AI Creators, casual users, and AIOZ DePIN, to actively participate in a
thriving marketplace for AI models and datasets. By fostering secure data exchange and incentivizing
contributions, the W3AI Architecture facilitates a future where AI innovation is accessible and driven by
collective participation.
5.1 Overview: A Decentralized Foundation for AI
Figure 3: AIOZ W3AI Architecture: A two-tiered approach.
As illustrated in Figure 3, the AIOZ W3AI architecture forms the robust and decentralized foundation
for a thriving AI ecosystem. This two-tiered architecture seamlessly integrates user-facing functionalities
with the underlying distributed network, unlocking the true potential of distributed AI processing.
5.1.1 W3AI Platform: Unleashing User Potential
The W3AI Platform provides a comprehensive suite of tools and resources designed to empower users:
• Decentralized Apps & Marketplace: This interactive layer forms the heart of user engagement.
It encompasses:
– AIOZ AI dApp Store: This open and decentralized marketplace allows users to discover,
utilize, and even publish AI applications. Operating on Web3 principles, Spaces ensures
transparency and equitable access to AI technologies.
– AI Model & Dataset Marketplace: This decentralized exchange platform facilitates the
trading or sharing of AI models and datasets. This marketplace fosters a collaborative AI
development environment by incentivizing the creation and sharing of valuable resources.
• Decentralized Training: Building and Refining AI Models: The training block focuses on
building and refining AI models using distributed computational resources. Key functionalities
include:
– Decentralized Federated Learning: This innovative system enables the collaborative training of AI models across numerous edge AIOZ DePIN while preserving data privacy and
security. This approach enhances model performance and aligns with the privacy-centric
ethos of Web3.
– Model & Dataset Encryption: Rigorous encryption protocols safeguard intellectual property and user privacy during training. This ensures data integrity and security across the
decentralized network.
• Decentralized Inference: The inference block handles the operational execution of AI models:
– Edge AIOZ DePIN AI Inference: Leveraging the distributed nature of edge computing, this
component enables efficient and rapid AI inferences. Processing data closer to the source
minimizes latency and optimizes resource usage.
– Model & Dataset Repository: This secure and decentralized repository stores AI models
and datasets, ensuring their ready availability for inference processes across the network.
5.1.2 W3AI Infrastructure: The Powerhouse Beneath
The underlying foundation of this architecture is the W3AI Infrastructure, which provides the platform’s
essential computational resources and network capabilities. This tier consists of the distributed network
that powers the entire system:
• Edge AI Computes: These edge AI compute nodes, contributed by users, form the backbone of the network. Each node contributes its computing resources, such as storage, CPU, and GPU power, to the platform.
• AIOZ DePIN: By leveraging the collective power of AIOZ DePIN, the W3AI platform achieves unmatched scalability, security, and cost-effectiveness for AI tasks.
5.1.3 Collaboration and Innovation
By combining these two tiers, the AIOZ W3AI architecture fosters a collaborative environment where
users can contribute resources, access powerful AI capabilities, and participate in the development of a
decentralized AI future.
This architecture serves as a comprehensive blueprint for a future where a scalable, efficient, and user-governed AI platform thrives on the strengths of the AIOZ Network. By integrating decentralized applications, training, and inference layers upon a robust infrastructure, W3AI stands as a paradigm shift in the way AI services are delivered and consumed.
5.2 AI-Optimized Routing Architecture
Figure 4 illustrates how the AIOZ W3AI architecture leverages the inherent flexibility of edge AIOZ
DePIN for storage, delivery, and computing tasks.
The key to this efficiency lies in its AI-optimized routing architecture. This intelligent system goes
beyond traditional methods by dynamically optimizing the "Storing-Delivering-Computing Triangle" for each task.
This ensures that all edge AIOZ DePIN benefit and that the overall solution remains efficient and cost-effective.
AI-Powered Task Management: Automating Resource Allocation
The W3AI Task Manager acts as the central conductor of this dynamic routing system. It eliminates
the need for manual configuration by automatically identifying the most suitable AIOZ DePIN for each
computing task. Here’s how it works:
Figure 4: AIOZ W3AI Architecture.
• Signal Topology Broadcast: Instead of pre-defined rankings or AIOZ DePIN types, the Task
Manager broadcasts a signal topology. This signal gathers information about potential AIOZ
DePIN candidates, including their roles and communication methods. The AI then analyzes this
data to select the optimal set of AIOZ DePIN for the task.
• AIOZ DePIN Selection: The system identifies a list of edge AIOZ DePIN with confidence scores
based on the received data and powerful AI algorithms. These AIOZ DePIN selections are
deemed the most proficient for handling the specific computing task at hand.
Dynamic AIOZ DePIN Roles: Adapting to Task Requirements
AIOZ W3AI goes beyond static computing roles. The AI management system dynamically defines three
key roles for each task, ensuring optimal efficiency:
• Storage: DePIN Storage handles the secure storage and delivery of encoded data required for the
computing tasks. They ensure data security during transfer and create duplicates near computing
DePINs for faster processing.
• Computing: Based on DePINs’ confidence scores calculated by the AI manager, specific DePIN
Computing is authorized to connect and contribute computing resources for task execution. During each communication round, the manager assesses the progress and re-calibrates the topology
as needed.
• Accumulation: The AI Task Manager also considers task-specific accumulation on certain AIOZ
DePINs. This allows users to contribute computing containers (standardized units of software)
with varied architectures but identical inputs and outputs. This collaborative approach fosters a
collective effort to enhance overall task performance.
Benefits of AI-Optimized Routing
This AI-powered approach offers several advantages:
• Enhanced Efficiency: By dynamically selecting the optimal AIOZ DePIN for each task, the
system maximizes resource utilization and minimizes processing time.
• Improved Scalability: The architecture can seamlessly adapt to handle increasing computing
demands by leveraging the vast network of AIOZ DePINs.
• Cost-Effectiveness: Tasks are distributed across AIOZ DePINs, eliminating the need for expensive centralized resources and ultimately reducing user costs.
• Enhanced Security: Data remains fragmented and encrypted throughout the network, minimizing security risks associated with centralized storage.
The AIOZ W3AI platform fosters a dynamic and efficient ecosystem for distributed AI processing by
utilizing an AI-optimized routing architecture.
5.3 AI Management for Task Assignment
The W3AI Task Manager integrates atop the AIOZ network’s DePIN to handle task assignments for
computing, utilizing generated topologies encoded in a multigraph.
This topology assesses the computing devices’ capabilities to dispatch tasks and can disconnect or reconnect connections, ensuring optimal process speed or readiness of computed results, if necessary.
Our research on this technology has been conducted, and our findings¹ were published at the ICCV 2023 conference.
Impact Factors: The W3AI Task Manager utilizes the following inputs to generate a multigraph topology, facilitating the handling of computing tasks:
• Available Storage $S_A$
• Geometric Distance $G_D$
• Computing Resources (CPU/GPU/RAM) $R_C$, $R_G$, $R_R$
• Network Traffic Capacity $T_C$
• Latency $L$
• Task size and minimum/maximum requirements $S_{max}$, $S_{min}$
• Chance of Resource Explosion in computing $C_E$
• Previous percentage of finished tasks in each role $P$
• Size of Data used for computing $S_D$
• Complexity of tasks $C_T$
• Effectiveness of previously computed Topologies $E_T$
• Task Tagging $T_T$
• Price $P$
• Fairness $F$
• Availability $A$
• Inputs and Outputs required for the considered tasks $I$, $O$
• User Priority $U$
Overall Objective Function. All mentioned inputs are treated as a sample input set, extracted as features,
and processed through an AI engine to produce confidence scores for each candidate, which are then used
to generate a multigraph containing topologies for n subsequent communication rounds.
The AI model is trained by AIOZ DePIN under a decentralized, federated learning concept, and its
objective function is:
$$\min_{w \in \mathbb{R}^d} \sum_{i=1}^{N} p_i \, \mathbb{E}_{\xi_i}\!\left[ L_i(w, \xi_i) \right] \qquad (1)$$

where $L_i(w, \xi_i)$ is the loss of the model parameterized by the weight $w \in \mathbb{R}^d$, $\xi_i$ is an input sample drawn from encrypted data at silo $i$, and the coefficient $p_i > 0$ specifies the relative importance of each AIOZ DePIN.

¹ https://openaccess.thecvf.com/content/ICCV2023/papers/Do_Reducing_Training_Time_in_Cross-Silo_Federated_Learning_Using_Multigraph_Topology_ICCV_2023_paper.pdf
To optimize Eq. 1, we employ a decentralized periodic averaging stochastic gradient descent (DPASGD),
which updates the weight of each AIOZ DePIN i in each training round as follows:
$$w_i(k+1) = \begin{cases} \sum_{j \in \mathcal{N}_i^{+} \cup \{i\}} A_{i,j}\, w_j(k), & \text{if } k \equiv 0 \pmod{u+1},\\[6pt] w_i(k) - \alpha_k \dfrac{1}{b} \sum_{h=1}^{b} \nabla L_i\!\left(w_i(k), \xi_i^{(h)}(k)\right), & \text{otherwise}, \end{cases} \qquad (2)$$

where $b$ is the batch size, $i$ and $j$ denote AIOZ DePINs, $u$ is the number of updates, $\alpha_k > 0$ is a potentially varying learning rate at the $k$-th round, $A \in \mathbb{R}^{N \times N}$ is a consensus matrix with non-negative weights, and $\mathcal{N}_i^{+}$ is the set of in-neighbors that AIOZ DePIN $i$ has connections to.
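To make the alternation in Eq. 2 concrete, here is a minimal NumPy sketch of one DPASGD round; the gradient callable, consensus matrix, and in-neighbor sets are placeholders standing in for the real network, not a production implementation.

```python
import numpy as np

def dpasgd_round(w, k, u, alpha_k, batches, grad_fn, A, in_neighbors):
    """One DPASGD update for every AIOZ DePIN i (sketch of Eq. 2).

    w            : dict {i: weight vector of DePIN i}
    k            : current communication/training round index
    u            : number of local updates between averaging rounds
    alpha_k      : learning rate for round k
    batches      : dict {i: list of b samples drawn from silo i}
    grad_fn      : callable(w_i, sample) -> gradient of L_i (placeholder)
    A            : (N, N) consensus matrix with non-negative weights
    in_neighbors : dict {i: list of in-neighbor indices, i.e. N_i^+}
    """
    new_w = {}
    for i, w_i in w.items():
        if k % (u + 1) == 0:
            # Consensus step: weighted average over in-neighbors and self.
            peers = list(in_neighbors[i]) + [i]
            new_w[i] = sum(A[i, j] * w[j] for j in peers)
        else:
            # Local step: mini-batch SGD on the silo's own (encrypted) data.
            b = len(batches[i])
            grad = sum(grad_fn(w_i, xi) for xi in batches[i]) / b
            new_w[i] = w_i - alpha_k * grad
    return new_w
```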
Confidence Score Computing. The confidence score $S$ and the multigraph topology $M$ are computed as:

$$S, M = f(w, S_A, G_D, R_C, R_G, R_R, T_C, L, S_{max}, S_{min}, C_E, P, S_D, C_T, E_T, T_T, P, F, A, I, O, U) \qquad (3)$$

The confidence score plays an essential role in guiding the construction of the multigraph and helps the W3AI Task Manager identify strongly- and weakly-connected edges.
AIOZ DePINs that receive multiple weakly-connected edges in consecutive rounds may be treated as isolated AIOZ DePIN; they will have fewer chances of joining AIOZ computing networks and will receive fewer rewards.
Algorithm 1: Multigraph Construction.
Input: Overlay G_o = (V, E_o); maximum number of edges between two AIOZ DePINs t.
Output: Multigraph G_m = (V, E_m); list L of edge counts between AIOZ DePIN pairs.

// Reliability computation for the overlay
D_o ← establish a list holding the reliability of each AIOZ DePIN pair.
foreach edge e(i, j) ∈ E_o do
    d(i, j) ← compute the reliability of the AIOZ DePIN pair in the overlay using Eq. 3.
    Append the computed score d(i, j) to D_o.
// Multigraph establishment
d_min ← compute the smallest score, min(D_o).
E_m ← establish the multiset containing all edges.
L[|V|, |V|] ← initialize an all-zero list tracking the number of edges between each AIOZ DePIN pair.
foreach edge e(i, j) ∈ E_o do
    n(i, j) ← min(t, round(d(i, j) / d_min)), the number of edges for the (i, j) pair.
    E_t ← establish the edge set for e(i, j), marking each edge with a connection status: 1 labels a strong-connected edge, 0 a weak-connected edge.
    Append one strong-connected edge e(i, j) = 1 to E_t.
    repeat (n(i, j) − 1) times:
        Append a weak-connected edge e(i, j) = 0 to E_t.
    Append the edge set E_t to the multiset E_m.
    Record the number of edges for the (i, j) pair in the track list: L[i, j] ← n(i, j).
return Multigraph G_m = (V, E_m); track list L.
Multigraph Conduction and Training process. In Algorithm 1, we describe our proposed algorithm to
generate the multigraph Gm with multiple edges between AIOZ DePIN. The algorithm takes the overlay
Go as input. Then, we establish multiple edges that indicate different statuses (strongly-connected or
weakly-connected) based on the computed score S. We assume that AIOZ DePIN pairs with a low confidence score S deserve to contain more weakly-connected edges, hence potentially becoming isolated AIOZ DePIN.
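As a rough illustration of Algorithm 1, the sketch below turns the per-pair score from Eq. 3 into a number of parallel edges, with one strong-connected edge and the remainder weak-connected; the `score_fn` callable is an assumed stand-in for the AI-computed reliability score.

```python
def build_multigraph(overlay_edges, score_fn, t):
    """Sketch of Algorithm 1 (Multigraph Construction).

    overlay_edges : iterable of (i, j) pairs from the overlay G_o
    score_fn      : callable(i, j) -> score d(i, j) from Eq. 3 (placeholder)
    t             : maximum number of edges between two AIOZ DePINs
    Returns the multigraph edge sets E_m and the edge-count table L.
    """
    # Reliability computation for the overlay.
    d = {(i, j): score_fn(i, j) for (i, j) in overlay_edges}
    d_min = min(d.values())

    E_m = []   # multiset of per-pair edge sets
    L = {}     # number of edges per (i, j) pair

    for (i, j) in overlay_edges:
        # d >= d_min, so each pair keeps at least one edge.
        n_ij = min(t, round(d[(i, j)] / d_min))
        # 1 marks a strong-connected edge, 0 a weak-connected edge.
        E_t = [1] + [0] * (n_ij - 1)
        E_m.append(((i, j), E_t))
        L[(i, j)] = n_ij
    return E_m, L
```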
Algorithm 2: Multigraph Parsing.
Input: Multigraph G_m = (V, E_m); list L of edge counts between AIOZ DePIN pairs.
Output: List of multigraph states S = {G_m^s = (V, E_m^s)}.

s_max ← compute the maximum number of distinct states in G_m using the least common multiple (LCM).
L̄ ← establish a dynamic list that tracks the number of edges between AIOZ DePIN pairs as the graph state changes; initialize it with the input edge list L.
Ē_m^s ← establish the list of all possible extracted states.
// States of multigraph establishment
for state s = 0 to s_max do
    E_t ← establish a temporary edge set.
    foreach edge e(i, j) ∈ E_m do
        if L̄[i, j] = L[i, j] then
            Append a strong-connected edge e(i, j) = 1 to the edge set E_t.
        else
            Append a weak-connected edge e(i, j) = 0 to the edge set E_t.
        if L̄[i, j] = 1 then
            Reset the dynamic list entry from the input L: L̄[i, j] ← L[i, j].
        else
            Reduce the corresponding number of edges: L̄[i, j] ← L̄[i, j] − 1.
    Append the edge set E_t to Ē_m^s.
return the list of multigraph states S = {G_m^s = (V, E_m^s)} built from the extracted states Ē_m^s.
In Algorithm 2, we parse the multigraph G_m into multiple graph states G_m^s. Graph states are essential for identifying the connection status of AIOZ DePIN in a specific communication round to perform model aggregation. In each graph state, our goal is to identify and remove the isolated AIOZ DePIN to preserve speed and training performance.
To parse the multigraph into graph states, we first identify the maximum number of states in a multigraph,
denoted as smax , by using the least common multiple (LCM). We then parse the multigraph into smax
states and use the regularly updated score S to generate different training flows. These training flows play
an essential role in managing the contributions of AIOZ DePIN.
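A minimal sketch of Algorithm 2 follows: the least common multiple of the edge counts fixes the number of states, and a per-pair counter decides whether a pair is strong-connected (active) or weak-connected (skipped) in each communication round. It assumes the `E_m` and `L` structures produced by the previous sketch.

```python
from functools import reduce
from math import lcm

def parse_multigraph(E_m, L):
    """Sketch of Algorithm 2 (Multigraph Parsing).

    E_m : list of ((i, j), edge_list) produced by build_multigraph
    L   : dict {(i, j): number of edges between the pair}
    Returns one edge set per graph state: 1 = strong edge used in this
    round, 0 = weak edge (the connection is skipped).
    """
    # Maximum number of distinct states via the least common multiple.
    s_max = reduce(lcm, L.values(), 1)

    L_bar = dict(L)          # dynamic per-pair edge counter
    states = []
    for _ in range(s_max):
        E_t = {}
        for (pair, _edges) in E_m:
            # A pair is strong-connected only when its counter is full.
            E_t[pair] = 1 if L_bar[pair] == L[pair] else 0
            if L_bar[pair] == 1:
                L_bar[pair] = L[pair]   # reset the counter from L
            else:
                L_bar[pair] -= 1        # consume one weak edge
        states.append(E_t)
    return states
```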
Loss. In practice, we observe that the early stages of federated learning mostly produce poor accumulated models. Different from other works that deal with the non-independent and identically distributed (non-IID) problem by optimizing the accumulation step whenever AIOZ DePINs transmit their models, we directly decrease the effect of divergence impact factors during the learning process of each AIOZ DePIN.
To achieve that, we reduce the distance between the distribution of the accumulated weights $\theta_i^b$ at AIOZ DePIN $i$ in the backbone network, which contains information from other AIOZ DePINs known as divergence factors, and its $i$-th weights $\theta_i^s$ in the sub-network, which only contains knowledge learned from encrypted data. When the distributions between AIOZ DePINs have been synchronized at an acceptable rate, we lower the effectiveness of the sub-network and focus more on the score prediction task. Note that the sub-network has the same architecture as the backbone but stores the weights from the $(k-1)$-th communication round. The loss responsible for this optimization is the Contrastive Divergence Loss, which was developed and published at the ICRA 2024 conference (https://arxiv.org/pdf/2303.06305.pdf). The Contrastive Divergence Loss is defined as:

$$L_{cd} = \beta L_{cd+} + (1-\beta) L_{cd-} = \beta H(\theta_i^b, \theta_i^s) + (1-\beta) H(\theta_i^s, \theta_i^b) \qquad (4)$$

where $L_{cd+}$ is the positive contrastive divergence term and $L_{cd-}$ is the negative regularizer term; $H$ is the Kullback-Leibler divergence loss function:
$$H(\hat{y}, y) = \sum f(\hat{y}) \log \frac{f(\hat{y})}{f(y)} \qquad (5)$$

where $\hat{y}$ is the predicted representation and $y$ is the dynamic soft label.
Considering $L_{cd+}$ in Equation 4 as a Bayesian statistical inference task, our goal is to estimate the model parameters $\theta^{b*}$ by minimizing the Kullback-Leibler divergence $H(\theta_i^b, \theta_i^s)$ between the measured regression probability distribution of the observed AIOZ DePIN, $P_0(x|\theta_i^s)$, and the accumulated model $P(x|\theta_i^b)$. Hence, we can assume that the model distribution has the form $P(x|\theta_i^b) = e^{-E(x,\theta_i^b)}/Z(\theta_i^b)$, where $Z(\theta_i^b)$ is the normalization term. However, evaluating the normalization term $Z(\theta_i^b)$ is not trivial, which leads to the risk of getting stuck in a local minimum. We use samples obtained through a Markov Chain Monte Carlo (MCMC) procedure with a specific initialization strategy to deal with this problem. Additionally, $L_{cd+}$ can be expressed under the SGD algorithm on an AIOZ DePIN by setting:
$$L_{cd+} = -\sum_{x} P_0(x|\theta_i^s)\, \frac{\partial E(x;\theta_i^b)}{\partial \theta_i^b} + \sum_{x} Q_{\theta_i^b}(x|\theta_i^s)\, \frac{\partial E(x;\theta_i^b)}{\partial \theta_i^b} \qquad (6)$$

where $Q_{\theta_i^b}(x|\theta_i^s)$ is the measured probability distribution on the samples obtained by initializing the chain at $P_0(x|\theta_i^s)$ and running the Markov chain forward for a defined number of steps.
Considering the $L_{cd-}$ regularizer in Equation 4 as a Bayesian statistical inference task, we can calculate $L_{cd-}$ as in Equation 6; however, the roles of $\theta^s$ and $\theta^b$ are inverted:

$$L_{cd-} = -\sum_{x} P_0(x|\theta_i^b)\, \frac{\partial E(x;\theta_i^s)}{\partial \theta_i^s} + \sum_{x} Q_{\theta_i^s}(x|\theta_i^b)\, \frac{\partial E(x;\theta_i^s)}{\partial \theta_i^s} \qquad (7)$$
We note that although Equation 6 and Equation 7 share the same structure, the key difference is that while the weight $\theta_i^b$ of the backbone is updated by the accumulation process, the weight $\theta_i^s$ of the sub-network is not. This leads to different convergence behavior of the contrastive divergence in $L_{cd+}$ and $L_{cd-}$. The negative regularizer term $L_{cd-}$ will converge to the state $\theta_i^{s*}$ provided $\frac{\partial E}{\partial \theta_i^s}$ is bounded:

$$g(x, \theta_i^s) = \frac{\partial E(x;\theta_i^s)}{\partial \theta_i^s} - \sum_{x} P_0(x|(\theta_i^b, \theta_i^s))\, \frac{\partial E(x;\theta_i^s)}{\partial \theta_i^s} \qquad (8)$$

and

$$(\theta_i^s - \theta_i^{s*}) \cdot \left( \sum_{x} P_0(x)\, g(x, \theta_i^s) - \sum_{x',x} P_0(x')\, K^{m}_{\theta_i^s}(x', x)\, g(x, \theta_i^{s*}) \right) \geq k_1 \left|\theta_i^s - \theta_i^{s*}\right|^2 \qquad (9)$$

for any constraint $k_1$; $K^{m}_{\theta_i^s}$ is the transition kernel. Note that the negative regularizer term $L_{cd-}$ is only used in training models on AIOZ DePIN. Thus, it does not contribute to the accumulation process of federated training.
To learn the confidence score $S$, we need a regression loss. We use the mean absolute error (MAE) to compute the loss for predicting the confidence score in each AIOZ DePIN. Note that we only use features from the backbone for predicting the score:

$$L_{mae} = \mathrm{MAE}(\theta_i^b, \hat{\xi}_i) \qquad (10)$$

where $\hat{\xi}_i$ is the ground-truth score of the data sample $\xi_i$ collected from silo $i$.
To make sure the model of any $i$-th AIOZ DePIN can be learned and is also robust to non-IID problems, we combine the Contrastive Divergence Loss $L_{cd}$ and the mean absolute error $L_{mae}$. The loss computed in each communication round at each AIOZ DePIN, before applying the accumulation process, is:

$$L_i = L_{mae} + L_{cd} \qquad (11)$$
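The per-silo loss of Eq. 11 can be sketched as follows, treating $H$ as the KL divergence of Eq. 5 applied to softmax-normalized backbone and sub-network outputs; the feature extraction and ground-truth score pipeline are placeholders rather than the published implementation.

```python
import numpy as np

def kl_div(p_logits, q_logits):
    """H(p, q) from Eq. 5 over softmax-normalized outputs."""
    p = np.exp(p_logits - p_logits.max()); p /= p.sum()
    q = np.exp(q_logits - q_logits.max()); q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def silo_loss(backbone_out, subnet_out, pred_score, gt_score, beta=0.5):
    """Sketch of L_i = L_mae + L_cd (Eq. 11) for one AIOZ DePIN.

    backbone_out : logits from the backbone (weights theta_i^b)
    subnet_out   : logits from the sub-network (weights theta_i^s)
    pred_score   : confidence score predicted from backbone features
    gt_score     : ground-truth score of the data sample
    """
    # Contrastive Divergence Loss (Eq. 4): weighted two-way KL divergence.
    l_cd = beta * kl_div(backbone_out, subnet_out) \
        + (1.0 - beta) * kl_div(subnet_out, backbone_out)
    # Regression loss on the confidence score (Eq. 10).
    l_mae = abs(pred_score - gt_score)
    return l_mae + l_cd
```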
Unlike other infrastructures that tend to explicitly define the role of each AIOZ DePIN, W3AI offers real-time flexibility in roles, enabling resource optimization to balance the Storing-Delivering-Computing Triangle. However, certain considerations may arise, such as the potential instability of the trained AI, its robustness, and its performance.
Figure 5: Transmission route from the multigraph topology that assigns tasks to edge AIOZ DePIN. Green represents AIOZ DePIN with active connections, while blue indicates those whose connections are skipped due to low confidence.
Enterprise customers, in particular, have stricter SLAs, making it crucial to control the AI to stay within
specified limits.
5.4 AIOZ W3AI Computing Workflow
The AIOZ W3AI Computing workflow is a carefully designed system built to tackle the pressing challenges of maintaining privacy in AI computations. Grounded in a strong design philosophy, it incorporates state-of-the-art technologies to provide a secure, effective, and collaborative setting.
5.4.1 Design Philosophy: Balancing Security and Performance
The core design philosophy revolves around the harmonious integration of two key technologies:
• Homomorphic Encryption (HE): This innovative technology allows computations to be performed directly on encrypted data, eliminating the need to expose sensitive raw information.
• Decentralized Federated Learning (DFL): DFL optimizes and secures the transmission of models during the learning process, further enhancing privacy and security.
This combined approach safeguards user data and model privacy while simultaneously optimizing processing speed and performance throughout the entire model training process.
5.4.2 A Secure and Streamlined Workflow
The AIOZ W3AI Computing Workflow ensures a smooth and privacy-centric experience for users involved in machine learning tasks. Here’s a breakdown of the key stages:
1. Task Initialization:
• Users submit their computing tasks and deposit a fee to compensate the AIOZ DePIN that
contributes resources.
• Using built-in homomorphic encryption capabilities, data and model containers are securely encrypted locally on the user's AIOZ DePIN.
• A local switching key is created for decryption purposes.
2. Data and Container Distribution:
• The W3AI Manager leverages network topology to assign encrypted data and containers to
suitable storage and computing AIOZ DePINs.
• During training, storage AIOZ DePINs provide data to computing AIOZ DePINs for task
execution.
• The W3AI Manager monitors AIOZ DePIN performance and distributes rewards based on
their contribution in each communication round.
3. Computation Phase:
• The computing task can involve either model inference (generating predictions based on
trained models) or model training (improving a model’s capabilities).
• Outputs can be encrypted results or encrypted models, depending on the task.
• The computation process stops once completed or when allocated rewards are exhausted.
4. Result Delivery and Decryption:
• Encrypted results or models are securely delivered back to the user.
• The local switching key is used to decrypt the results or models using the AIOZ DePIN’s
decryption and authorization functionalities.
• Any unused rewards are returned to their respective owners by the W3AI Task Manager.
This meticulous workflow ensures comprehensive data privacy and security within the AIOZ Network.
By combining cutting-edge technologies with a well-designed architecture, AIOZ W3AI empowers users
to participate in collaborative AI development while safeguarding their sensitive information.
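To show how these four stages might be sequenced from the task owner's side, here is a purely hypothetical client-side sketch; every helper (`w3ai.submit_task`, `he.encrypt`, and so on) is an assumed placeholder rather than an actual W3AI SDK call.

```python
def run_w3ai_task(data, model_container, fee, w3ai, he):
    """Hypothetical client-side view of the W3AI computing workflow.

    w3ai : placeholder handle to the W3AI Task Manager (assumed API)
    he   : placeholder homomorphic-encryption helper (assumed API)
    """
    # 1. Task initialization: encrypt locally, create a switching key,
    #    and deposit the fee that compensates contributing DePINs.
    enc_data, enc_model = he.encrypt(data), he.encrypt(model_container)
    switching_key = he.create_switching_key()
    task = w3ai.submit_task(enc_data, enc_model, deposit=fee)

    # 2. Data and container distribution: the Task Manager assigns
    #    encrypted payloads to storage and computing DePINs.
    w3ai.distribute(task)

    # 3. Computation phase: training or inference runs on encrypted
    #    inputs until completion or until the deposited rewards run out.
    encrypted_result = w3ai.wait_for_result(task)

    # 4. Result delivery and decryption: only the task owner can decrypt;
    #    unused rewards are refunded by the W3AI Task Manager.
    return he.decrypt(encrypted_result, switching_key)
```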
5.5 Democratizing AI Through a Decentralized Marketplace
The AIOZ W3AI platform introduces a revolutionary Decentralized AI Marketplace, poised to transform
how we develop, share, and benefit from artificial intelligence.
This platform empowers a diverse community, from seasoned developers to everyday users, to actively
shape the future of AI while being rewarded for their contributions.
Figure 6: AIOZ W3AI: Decentralized AI Marketplace.
5.5.1 Key Stakeholders and Their Contributions
1. Developers and AI Experts:
• Model Creation and Publication: Accomplished developers and AI experts can create and
publish a wide range of AI models on our marketplace. These models cover an array of
functions, from image recognition to natural language processing, greatly expanding the
AI solutions available to users.
2. Casual Users and Crowdsourcing Participants:
• Autonomous AI Training Workflow: Casual users, even those without extensive AI knowledge, can actively participate in the AI model creation process through an intuitive UI/UX
interaction. Our platform offers a simplified AI training workflow, allowing users to:
– Upload Training Data: Users can easily upload their training data.
– Label Data: The platform assists users in labeling their data, streamlining the process.
– Select AI Model: Users can choose their preferred AI model type from a user-friendly
list, such as a classification model.
– Train and Create New AI Model: With a few simple clicks, users initiate the training
process, creating a new AI model.
• Access and Monetize AI Models: Once the AI model is created, casual users can utilize it for
their own needs, whether it’s for personal projects or business applications. Additionally,
they have the option to monetize their newly created AI models by offering them on the
marketplace for other users to access, further contributing to the collaborative ecosystem.
• Privacy-Preserving Data Contributions: Casual users can contribute privacy-protected personal data, facilitating the enrichment of AI training datasets without compromising individual privacy. This invaluable crowd-sourced data aids in the training and enhancement of
AI models.
• Labeling and Annotation Tasks: Our platform offers users the opportunity to engage in
labeling tasks similar to popular crowdsourcing platforms. By providing annotations and
labels to datasets, users play a pivotal role in refining the accuracy and efficacy of AI models.
• AI Model Evaluation: Casual users are encouraged to participate in evaluating AI models, providing invaluable feedback, ratings, and usability assessments. Their input ensures
continuous improvement, maintaining the marketplace’s high standards of quality.
5.5.2 Unlocking Monetization Opportunities
1. Contributing Data and Tasks: Users are rewarded with tokens for their contributions, including privacy-protected data, labeling tasks, and AI model evaluations. Each contribution
positively impacts the AI ecosystem and ensures fair compensation for their valuable efforts.
2. Model Publication and Usage: Developers have the chance to monetize their expertise by
setting pricing and licensing terms for their AI models published on our marketplace. Users
seeking AI solutions can access these models by paying the stipulated fees, generating income
for the model creators.
5.5.3 Empowering Collaboration, Guaranteeing Privacy
Our incentivized AI marketplace fosters collaboration while safeguarding user data and privacy. Users
retain full control over their data, which stays on their devices, mitigating the risk of data breaches or
unauthorized access.
5.5.4 User-Friendly AI for All
We’re committed to making AI accessible to everyone. Our autonomous AI training workflow ensures
that even individuals without deep AI expertise can actively participate in AI model creation and benefit
from their contributions.
This inclusive approach underscores our dedication to democratizing AI contributions and innovations.
AIOZ W3AI’s Decentralized AI Marketplace serves as a thriving hub for collaboration, enabling both
casual users and experts to contribute, earn rewards, and actively participate in the evolution of AI technologies.
Our goal is to shape the future of AI, where every contributor counts, and every idea matters.
5.6 AIOZ Token: Fueling the Decentralized AI Marketplace
The AIOZ token serves as the lifeblood of the W3AI ecosystem, incentivizing various stakeholders and
facilitating seamless economic interactions.
Let’s explore how AIOZ flows through the different layers of the W3AI architecture (Figure 8):
• Rewarding AIOZ:
– AI Creators: By publishing AI models, datasets, and AI Playgrounds on the Marketplace,
AI Creators earn AIOZ. This rewards them for their innovation and contributions to the
ecosystem's knowledge base. The potential reward for an AI Creator ($R_{AIC}$) can be represented by a formula that considers several factors (a worked sketch with example numbers follows later in this section):

$$R_{AIC} = \alpha \cdot f(C) \cdot g(D) \cdot (1 - CR)$$
Figure 7: The collaborative network of stakeholders driving advancement in AI through the AIOZ W3AI
Marketplace.
Here, α is a platform-defined coefficient that influences the overall reward pool allocated
to AI Creators. The function f (C) represents the complexity of the model, potentially
incorporating factors like the number of parameters, training data size, and computational
requirements. Similarly, g(D) represents the value of the dataset, considering its size,
quality, and potential reusability. (CR remains the platform-defined commission rate).
– W3AI Users: Developers and casual enthusiasts can earn AIOZ by publishing applications
within the W3AI Platform Spaces. This mechanism fosters a collaborative environment
where users can create and monetize their AI-powered applications. Similar to AI Creators,
user rewards ($R_U$) can be a function of the application's complexity ($AC$), user adoption ($UA$), and the commission rate:

$$R_U = \beta \cdot h(AC, UA) \cdot (1 - CR)$$
β is another platform-defined coefficient influencing the reward pool allocated to users.
The function $h(AC, UA)$ considers both the application's complexity (potentially similar
to f (C) for AI models) and user adoption metrics like active users, app downloads, and
generated revenue.
– AIOZ DePINs: These individuals contribute valuable computing resources (storage, delivery, and compute power) to the network. In return, they are compensated with AIOZ for their contributions. AIOZ DePIN rewards ($R_{NR}$) can be based on a combination of factors:

$$R_{NR} = (w_{RC} \cdot RC + w_{TCR} \cdot TCR) \cdot RR$$
This formula incorporates weighted factors ($w_{RC}$ and $w_{TCR}$) to reflect the relative importance of the resource contribution ($RC$) and the task completion rate ($TCR$) on AIOZ DePIN rewards. The platform might adjust these weights dynamically based on network needs. $RR$ remains the predefined reward rate per unit resource.
– Crowdworkers: The W3AI ecosystem also leverages crowdworkers who support various
tasks such as data annotation, fine-tuning AI models, and reviewing computing results.
These contributions are crucial for ensuring the quality and performance of AI models
within the marketplace. Crowdworkers are compensated with AIOZ token for their efforts.
Figure 8: The accompanying diagram elegantly maps out the token flow within the W3AI ecosystem,
illustrating the interplay between AI Creators, Users, and the AIOZ DePIN. The flow of AIOZ tokens is
depicted through various pathways, highlighting the processes of reward distribution, token circulation,
and commission-based transactions. This visual representation clarifies the underlying economic model
and the value exchange that drives the ecosystem.
• Spending AIOZ token:
– AI Model & Dataset Marketplace: Users leverage AIOZ token to unlock and access AI
models, datasets, and other resources offered by AI Creators within the Marketplace. A
portion of these transaction fees are directed towards the AIOZ Network Operations and
AIOZ Treasury, ensuring the platform’s ongoing maintenance and growth. Another portion
is burned, helping to regulate token supply and mitigate inflation. The burning mechanism
can be represented by a function $B$ that considers a set burning rate ($BR$) and a dynamic factor based on transaction volume ($TV$):

$$B(TV) = BR \cdot TV^{\gamma}$$
Here, γ is an exponent that determines the impact of transaction volume on the burning
rate. A higher γ value signifies a more aggressive burning strategy as transaction volume
increases.
– AI Playground and APIs: Users who wish to experiment with AI models or utilize AI
functionalities through the AI Playground or API incur fees in AIOZ token. This revenue
stream helps sustain the platform’s development.
– Community Participation: Staking the AIOZ token allows users to participate in governance initiatives and potentially earn rewards through airdrops. This fosters a strong and
engaged community around the W3AI platform.
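As a purely illustrative numeric sketch of the reward and burn formulas above, with invented coefficient values and the functions f, g already evaluated (none of these numbers are actual platform parameters):

```python
def creator_reward(alpha, model_complexity, dataset_value, commission_rate):
    """R_AIC = alpha * f(C) * g(D) * (1 - CR), with f(C) and g(D) pre-evaluated."""
    return alpha * model_complexity * dataset_value * (1.0 - commission_rate)

def depin_reward(w_rc, resource_contribution, w_tcr, task_completion_rate, reward_rate):
    """R_NR = (w_RC * RC + w_TCR * TCR) * RR."""
    return (w_rc * resource_contribution + w_tcr * task_completion_rate) * reward_rate

def tokens_burned(burning_rate, transaction_volume, gamma):
    """B(TV) = BR * TV^gamma."""
    return burning_rate * transaction_volume ** gamma

# Made-up example: one creator reward, one DePIN reward, and the burn
# triggered by 10,000 AIOZ of marketplace transaction volume.
print(creator_reward(alpha=0.1, model_complexity=8.0, dataset_value=5.0,
                     commission_rate=0.2))                       # 3.2
print(depin_reward(w_rc=0.6, resource_contribution=100.0,
                   w_tcr=0.4, task_completion_rate=90.0,
                   reward_rate=0.05))                            # 4.8
print(tokens_burned(burning_rate=0.001, transaction_volume=10_000.0,
                    gamma=1.2))                                  # ~63.1
```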
A Sustainable Economic Cycle
This meticulously designed token flow loop incentivizes innovation, rewards participation, and fuels the
continuous development of the AIOZ W3AI ecosystem.
W3AI empowers AI Creators to monetize their work, attracts users by providing them with reward opportunities, and compensates AIOZ DePINs for their critical contributions. This system ensures long-term sustainability and growth by strategically burning tokens and allocating funds to the Treasury.
6 Conclusion
The AIOZ W3AI architecture stands at the forefront of a paradigm shift in AI development. It fosters a
collaborative environment where innovation thrives, data privacy is paramount, and users of all levels can
benefit from the power of artificial intelligence.
This white paper has explored the intricate workings of the W3AI architecture, delving into its secure
workflow powered by FHE and DFL. We have examined the token flow within the ecosystem, highlighting how the AIOZ token incentivizes participation and fuels continuous growth.
Looking ahead, the W3AI platform is poised for significant progress in 2024. The upcoming launch of
the AI Model Marketplace promises a user-centric platform for creators to share their work and users to
discover powerful AI tools.
This marketplace, along with functionalities for monetization through APIs and user profiles, will further
empower the AIOZ W3AI ecosystem.
The AIOZ W3AI ecosystem represents a vision for the future of AI; a future where users have control
over their data, creators are fairly compensated for their work, and the collective intelligence of a global
community drives groundbreaking advancements.
By fostering a decentralized and secure environment, AIOZ W3AI empowers individuals and organizations to unlock the true potential of AI and contribute to a more innovative and intelligent world.
Looking Ahead: A Call to Action
We invite you to join our vibrant community of developers, researchers, and AI enthusiasts. Explore the
possibilities of the W3AI platform, contribute your talents, and be a part of shaping the decentralized
future of AI.