Computational reflection of agents, especially in their workflow design and execution aspects, will allow entities to create workflows (i.e. logical structures) in the network in a decentralized manner. In a decentralized network, meta-agents can act as intermediaries that transform input data into output data by curating other agents’ computational services, which ultimately can be expressed as a logical structure consisting of a variety of agents in a connected network workflow. So while such a meta-agent uses the same abstraction as other agents in the network, internally it holds only the computational reflection (or representation) of a workflow: the identities of the agents in the workflow, their inputs and outputs, their cost, location, and data offered, as well as the scheduling information needed for designing and executing the workflow.
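To make the notion of a computational reflection concrete, the sketch below models it as plain data. All names and fields (AgentRef, WorkflowReflection, and so on) are illustrative assumptions, not NuNet’s actual API.

```python
# A minimal sketch of a workflow's "computational reflection" as plain data.
# All names and fields here are illustrative assumptions, not NuNet's API.
from dataclasses import dataclass, field

@dataclass
class AgentRef:
    agent_id: str          # identity of the agent in the network
    inputs: list[str]      # names of the data items it consumes
    outputs: list[str]     # names of the data items it produces
    cost: float            # price (in tokens) quoted for one execution
    location: str          # physical/network location of the agent

@dataclass
class WorkflowReflection:
    agents: list[AgentRef] = field(default_factory=list)
    edges: list[tuple[str, str]] = field(default_factory=list)  # (producer, consumer)
    schedule: dict[str, int] = field(default_factory=dict)      # agent_id -> step

    def total_cost(self) -> float:
        """Tokens needed to cover every agent in the workflow."""
        return sum(a.cost for a in self.agents)
```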
Once the computational reflection is fully mapped out by a meta-agent, the workflow can be executed entirely at its discretion, provided that the initial data is supplied and enough tokens are committed to cover the costs of all computational agents within the workflow. Note that just as meta-agents are able to design workflows involving other computational agents, meta-agents themselves can be incorporated into higher-order workflows, giving rise to the logical scalability property of the network. Meta-agents will be able to create complex computational reflections consisting of a hierarchy of sub-meta-agents, all the way down to base agent services, that are constantly and dynamically changing their costs, workflows, and services offered. Furthermore, these workflows can be designed by a human operator, an automatic procedure, or an AI agent using the same level of abstraction. These functionalities will give rise to what we call a decentralized network of dynamic service meshes.
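This hierarchy lends itself to a recursive sketch: a meta-agent’s reflection can contain both base services and other meta-agents, and pricing a workflow means walking that tree. Again, every name below (BaseService, MetaAgent, total_cost) is hypothetical.

```python
# Sketch of the "logical scalability" property: a meta-agent holds only a
# reflection of a workflow, and that reflection may itself contain other
# meta-agents. All names are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class BaseService:
    agent_id: str
    cost: float

@dataclass
class MetaAgent:
    agent_id: str
    margin: float        # meta-agent's own fee for curation and scheduling
    children: list       # BaseService or MetaAgent instances

def total_cost(node) -> float:
    """Recursively price a hierarchy of (sub-)meta-agents down to base services."""
    if isinstance(node, BaseService):
        return node.cost
    return node.margin + sum(total_cost(c) for c in node.children)

# Example: a meta-agent wrapping one base service and another meta-agent.
inner = MetaAgent("meta-2", margin=1.0, children=[BaseService("ocr", 3.0)])
outer = MetaAgent("meta-1", margin=2.0, children=[BaseService("etl", 5.0), inner])
assert total_cost(outer) == 11.0
```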
Agents are building blocks that can be combined to form arbitrarily complex, domain-specific computing workflows that perform a variety of useful computations in the network. The same agent can participate in many workflows and connect with other agents to form clusters. Agent mobility enables such workflows to operate across the boundaries of cloud vendors, mobile devices, private clouds, and more, while ensuring that ownership, the economic value of resources, and data security/privacy are maintained by the respective parties. NuNet implements a tokenomic mechanism to enable and facilitate the design of frictionless cross-vendor workflow execution.
In terms of workflow design, agents, using NuNet’s functionality, will be able to search for other agents in the network that could provide building blocks for their original task, calculate the costs of such workflows, and estimate the time required for execution. This would allow agents to make optimal decisions, with or without help from humans, and enable them to express larger computational tasks that would be difficult for a single agent to achieve.
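A minimal sketch of this design step might look as follows, assuming a hypothetical registry of agent advertisements carrying cost and timing estimates; none of these names come from NuNet itself.

```python
# Hedged sketch of workflow design: search a (hypothetical) registry for agents
# providing needed capabilities, then estimate total cost and wall-clock time.

def design_workflow(registry: list[dict], needed: list[str]):
    """Pick the cheapest agent offering each needed capability, in order."""
    plan, cost, seconds = [], 0.0, 0.0
    for capability in needed:
        candidates = [a for a in registry if capability in a["offers"]]
        if not candidates:
            raise LookupError(f"no agent in the network offers {capability!r}")
        best = min(candidates, key=lambda a: a["cost"])
        plan.append(best["agent_id"])
        cost += best["cost"]
        seconds += best["est_seconds"]   # sequential estimate; parallel plans differ
    return plan, cost, seconds

registry = [
    {"agent_id": "a1", "offers": {"transcribe"}, "cost": 4.0, "est_seconds": 30},
    {"agent_id": "a2", "offers": {"translate"}, "cost": 2.5, "est_seconds": 12},
    {"agent_id": "a3", "offers": {"translate"}, "cost": 1.5, "est_seconds": 20},
]
print(design_workflow(registry, ["transcribe", "translate"]))
# (['a1', 'a3'], 5.5, 50)
```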
The workflow execution aspect of computational reflection will enable agents to time, schedule, and manage the actual execution of their workflows, including data transfers between agents, error propagation, crash recovery, necessary caching, and so on.
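The sketch below illustrates these execution concerns in miniature, with a cache of step outputs and bounded retries standing in for caching and crash recovery; the function and its retry policy are assumptions for illustration only.

```python
# Sketch of the execution side: run agents in scheduled order, cache outputs,
# and retry failed steps (crude stand-ins for crash recovery and caching).

def execute_workflow(steps, initial_data, retries: int = 2):
    """steps: ordered list of (name, fn) where fn maps input data to output data."""
    cache, data = {}, initial_data
    for name, fn in steps:
        if name in cache:               # reuse a previously computed result
            data = cache[name]
            continue
        for attempt in range(retries + 1):
            try:
                data = fn(data)
                cache[name] = data
                break
            except Exception as err:    # error propagation with bounded recovery
                if attempt == retries:
                    raise RuntimeError(f"step {name!r} failed permanently") from err
    return data

result = execute_workflow([("double", lambda x: x * 2), ("inc", lambda x: x + 1)], 10)
assert result == 21
```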
Inputs and outputs of computational processes are data, and data has its own inherent value. The value of data, however, is not absolute, but relative to what other participants of the ecosystem (i.e. computational processes) can do with it and how they value it with respect to the ecosystem’s dynamics. Data’s value, broadly speaking, is context-dependent and subject to negotiation between providers and requesters. Entities can value static data (e.g. stored in a database) or dynamic data (e.g. real-time streams) that is time-sensitive, private, or public, and useful in either broad or very specific contexts. It is also important to keep in mind that data has associated costs: production, storage, analysis, and transformation. As these costs become more transparent, specific solutions can be designed for the various stages of the data creation/analysis life-cycle, and these open and collaborative efforts will likely reduce transactional data costs and improve the insights delivered. Transparent, secure, and efficient matching of all forms of data to the immediate requirements of societal, business, government, and individual processes will tap into the enormous economic potential of the data economy, which for the time being is still waiting to be unlocked.
NuNet provides tools for the economic exchange and sharing of data (which may be, but is not necessarily, free). Note that in a computational workflow data can be very specific and time-sensitive, i.e. produced purposefully for the next process; by converting input data to output data, agents produce value which they can then exchange with other agents on the basis of a tokenomic mechanism. Since the value of data differs between agents, NuNet enables a decentralized value exchange mechanism based on, but not limited to, pairwise negotiations and contracts between computational agents. NuNet’s tokenomic mechanism will also enable solutions for tamper-proof traceability of resource consumption, data provenance, and vendor-consumer relations. It will provide the basis for enabling companies and customers to accurately trace and manage their spending and decentralized partnerships.
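As an illustration of such pairwise negotiation, the sketch below has a provider and a requester concede toward each other until their ask and bid meet (or fail to). The concession rule and all parameters are assumptions, not a NuNet protocol.

```python
# Minimal sketch of a pairwise price negotiation between a data provider
# (ask) and a requester (bid). Step sizes and the meet-in-the-middle rule
# are illustrative assumptions only.

def negotiate(ask: float, bid: float, rounds: int = 20,
              concession: float = 0.1, tol: float = 0.1):
    """Each round both sides concede a fraction of the gap; deal when close enough."""
    for _ in range(rounds):
        gap = ask - bid
        if gap <= tol:
            return round((ask + bid) / 2, 2)   # agreed price: split the difference
        ask -= concession * gap                # provider lowers its ask
        bid += concession * gap                # requester raises its bid
    return None                                # no contract: valuations too far apart

price = negotiate(ask=10.0, bid=6.0)
print(price)   # 8.0 tokens (the midpoint), or None if no agreement in time
```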
One of the most important aspects of the framework’s functionality, and an obvious requirement, is the ability to verify and validate the correctness of computational processes performed in the network and to establish a way to validate good users/components, rewarding good actors and punishing bad ones.
The verification and validation of arbitrary computational processes in a decentralized network can only be performed in a decentralized way, which means that NuNet as a whole will not attempt to guarantee the correctness of each process and computational workflow performed in the network. Instead, the framework will provide APIs, tools, network-wide telemetry information, and reputation system(s) that will enable each constituent of the network (network operations agent) to evaluate the validity and correctness of concrete results of the computational processes in question. Through network-wide telemetry information available to all constituents of the network, NuNet will facilitate the self-learning and self-healing capabilities of the framework, effectively minimizing the impact of bad actors on overall network performance as well as on the results of individual computational processes.
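One plausible sketch of such local evaluation, assuming success/failure signals arrive via telemetry, is a simple exponential moving average that any network operations agent could keep on its own; the scoring rule and prior are illustrative assumptions.

```python
# Sketch of folding network-wide telemetry into a local reputation score
# via an exponential moving average. The rule and fields are assumptions.

def update_reputation(score: float, success: bool, alpha: float = 0.2) -> float:
    """Blend the newest observation (1.0 success / 0.0 failure) into the score."""
    return (1 - alpha) * score + alpha * (1.0 if success else 0.0)

rep = 0.5                                  # neutral prior for an unknown agent
for outcome in [True, True, False, True]:  # telemetry: observed task results
    rep = update_reputation(rep, outcome)
print(round(rep, 3))                       # 0.635 after these four observations
```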
The main aspects that will ensure the reliability of the NuNet network and the validity of its individual computational processes are:
The tokenomic mechanism, supported by implicit and explicit reputation systems on top of technical means of verification and validation, will provide immediate and clear economic incentives for the good (i.e. beneficial for all) behavior of network constituents;
A variant of the non-repudiation/proof-of-receipt mechanism, whereby new tokens will be minted and distributed to platform users upon successful completion of a transaction, based on the actual computing power used by that transaction - a crucial part of NuNet’s tokenomics (see the sketch below).
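The proof-of-receipt point above might be sketched as follows, assuming a dual-signed receipt that records metered compute; the Receipt fields and minting rate are illustrative assumptions rather than NuNet’s actual mechanism.

```python
# Sketch of minting on proof of receipt: tokens are created only after a
# dual-signed receipt confirms a completed transaction, scaled by measured
# compute. All fields and the rate are hypothetical.
from dataclasses import dataclass

@dataclass
class Receipt:
    tx_id: str
    cpu_seconds: float           # metered compute actually consumed
    signed_by_requester: bool
    signed_by_provider: bool

MINT_RATE = 0.01                 # hypothetical tokens minted per CPU-second

def mint_on_receipt(receipt: Receipt) -> float:
    """Mint new tokens only for a non-repudiable, dual-signed receipt."""
    if not (receipt.signed_by_requester and receipt.signed_by_provider):
        raise ValueError(f"receipt {receipt.tx_id} is not dual-signed")
    return receipt.cpu_seconds * MINT_RATE

print(mint_on_receipt(Receipt("tx-42", 1200.0, True, True)))   # 12.0 tokens
```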
NuNet will also make the best use of formal third-party verification tools, such as those developed by SingularityNET or TrueBit, zkSNARK-based protocols, and other open-source or even commercial solutions, and will encourage the use of secure hardware enclaves. However, formal verification methods are an active research field and are not available for verifying computations in general - only in specific cases. Therefore, NuNet will mostly rely on tokenomic and reputation-based mechanisms, while integrating formal work verification methods for specific use cases where appropriate. In the future, NuNet will aim to provide an API for integrating third-party formal verification tools for general usage.
NuNet’s APIs will support the functionalities of decentralized computing platforms and marketplaces, initially of SingularityNET and members of the Decentralized AI Alliance (DAIA). These functionalities include, but are not limited to, the capabilities described in the sections that follow.
Ecosystems of adaptive decentralized computations, whose individual agents are capable of learning and meta-learning in collaboration with each other, will give rise to the learning and adaptive capabilities of the decentralized marketplace of NuNet as a whole. Since some agents will represent humans participating in the network, and in the beginning human agents may contribute the largest part of the intelligence of the network, the framework as a whole will be able to learn from human actions and intelligence and progressively undergo cognitive development. The governance mechanisms of NuNet will guide this evolutionary development for the benefit of all.
NuNet will support the principle of radical decentralization of computing platforms and marketplaces in the sense that every agent will be able to become a meta-agent if it decides to do so and has the computational, cognitive, and financial resources, or the support of human operators, to execute such a role. Given a large enough number of agents operating in the network, their ability to form workflows on their own will lead to pluripotency and degeneracy (i.e. many-to-many relations between structures and functions), competition, cooperation, and the capacity of the network to self-organize into progressively more complex cognitive structures.
In the decomposition of NuNet participants into computational resource providers, computational resource users, and network operations agents, meta-agents may fall into any of these categories, or a single meta-agent might span two or all three of them.
A computational agent encloses a computational process that turns input data into output data, without any restriction whatsoever on the nature of the process or the amount of computational resources it needs. Agents isolate the process’s computational logic from its physical implementation, resources, and location. A computational process encapsulated in an agent can be any combination of memory and processing, ranging from complex AI and machine learning processes to simple queries retrieving data from a database or a streaming data source. The abstraction layer that isolates computational logic from physical implementation lets agents remain agnostic to physical infrastructure and location, which can be changed dynamically as per the demands of a specific workflow.
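A minimal sketch of this abstraction, with hypothetical names: callers see only a data-in/data-out interface, while the process behind it and the hardware it runs on remain interchangeable.

```python
# Sketch of the agent abstraction: only a data-in/data-out interface is
# exposed; the process and its physical location stay hidden. Names are
# illustrative, not NuNet's actual API.
from abc import ABC, abstractmethod

class ComputationalAgent(ABC):
    @abstractmethod
    def process(self, data: bytes) -> bytes:
        """Turn input data into output data; logic and location stay hidden."""

class UppercaseAgent(ComputationalAgent):
    """Trivial example: could run on a phone, a private cloud, or a data
    center without the caller noticing, since only process() is visible."""
    def process(self, data: bytes) -> bytes:
        return data.upper()

print(UppercaseAgent().process(b"hello nunet"))   # b'HELLO NUNET'
```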
Computational agents will be able to express any computational algorithm, AI, or machine learning engine, and will also be able to access information through NuNet about their own and other agents’ capabilities, as well as the history of activity in the network. Therefore, agents will be able to learn from experience about the credibility, efficiency, and security of other agents, and about other dimensions of activity happening in the network. Different meta-agents may start to specialize in analyzing other agents’ reputations and rating their performance, then providing this information to other agents in exchange for tokens or information. These intricate interactions will ultimately give rise to a decentralized ecosystem of reputation systems within the network that humans and machine agents will be able to examine and rely upon when designing computational workflows. Overall, these capabilities will allow individual agents to learn from their own, or the network’s, experience and become better at performing their tasks, and will allow them to adapt to changing circumstances, new algorithms, cutting-edge AI engines, and novel use cases.
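Such a reputation-specialized meta-agent might be sketched as follows, combining score tracking with a per-lookup token fee; the class and its fee model are purely illustrative assumptions.

```python
# Sketch of a meta-agent that specializes in reputation and sells its
# scores to other agents for tokens. Everything here is hypothetical.

class ReputationAgent:
    def __init__(self, fee: float = 0.1):
        self.fee = fee                        # tokens charged per lookup
        self.scores: dict[str, float] = {}
        self.earnings = 0.0

    def observe(self, agent_id: str, success: bool, alpha: float = 0.2) -> None:
        """Fold a network observation into the tracked score (neutral 0.5 prior)."""
        prior = self.scores.get(agent_id, 0.5)
        self.scores[agent_id] = (1 - alpha) * prior + alpha * (1.0 if success else 0.0)

    def query(self, agent_id: str, payment: float) -> float:
        """Sell the current score for a token fee."""
        if payment < self.fee:
            raise ValueError("insufficient payment for reputation lookup")
        self.earnings += payment
        return self.scores.get(agent_id, 0.5)

oracle = ReputationAgent()
oracle.observe("a3", True)
print(oracle.query("a3", payment=0.1))   # 0.6
```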