
AI, Tokenized: Exploring the Initial Model Offering with ORA Protocol


As AI emerges as the cornerstone of new market narratives heading into 2024, the idea of AI-driven asset tokenization has charted new avenues for creative exploration. BeInCrypto sat down with ORA Protocol founder Kartin Wong to discuss the Initial Model Offering (IMO), a novel concept that has attracted considerable attention and has the potential to become a new industry standard.

ORA Protocol (formerly known as HyperOracle) is a verifiable oracle protocol that brings AI and complex computation on-chain. By integrating rich data sources and computing power, ORA expands the capabilities of smart contracts, allowing developers to innovate without limits. ORA's solutions are trusted by entities including Compound, the Ethereum Foundation, Optimism, and Polygon.

A new era in AI development

Historically, AI development has been constrained by a monolithic approach to monetization, relying heavily on proprietary models and subscription-based services. This model limits the exchange of ideas and creates significant financial and resource barriers for developers. Wong's vision for ORA addresses the critical challenges facing AI innovators – primarily the monetization and accessibility of AI models.


The biggest issue with AI models today is that many are not yet powerful enough to solve the problems they are intended to solve. For the technology to be widely adopted, it must be high-end, and as ChatGPT demonstrates, developing a high-performance language model requires hundreds of millions of dollars in investment. Most AI companies, even leaders in their respective fields, struggle with insufficient funding, and this lack of capital is a major obstacle to bringing their products to market.

Recognizing the need for a new model that aligns open-source collaboration with financial viability, ORA introduced the concept of the "IMO," which stands for Initial Model Offering. The premise is simple: if tokenization can apply to everything, AI models are no exception and can be offered as tokenized assets. This allows creators to recoup their investment and profit directly from their development work.

For example, after investing $5M in a high-quality AI model, a creator can open-source it to grow its capabilities. By tokenizing the model and offering the tokens for sale to the public, either immediately or in bulk, the creator can anticipate the token's value increasing. A successful launch then returns significant cash flow to the model's creator.

From the user's point of view, such an arrangement looks very attractive. Individuals who believe in a particular AI model can invest in the corresponding token; if the model generates revenue from its use, investors receive a share of those profits.

“For example, I put an AI model on-chain, and every time a smart contract or user wants to use it, they pay me, say, 0.01 ETH. After a week, about 10,000 people have called it on-chain, so the protocol's income is now 100 ETH per week. This ETH goes to the model's token holders in proportion to their investment.”
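The arithmetic in the quote above can be sketched as follows. The fee, call count, and holder balances are taken from the interview or invented for illustration; this is not ORA's actual contract logic, just the pro-rata revenue split it describes.

```python
# Illustrative revenue-sharing arithmetic for the quoted example.
# Holder balances are hypothetical.

FEE_PER_CALL_ETH = 0.01
CALLS_PER_WEEK = 10_000

weekly_revenue = FEE_PER_CALL_ETH * CALLS_PER_WEEK  # 100 ETH per week

# Pro-rata split among hypothetical token holders (balances in tokens)
holders = {"alice": 600_000, "bob": 300_000, "carol": 100_000}
total_supply = sum(holders.values())

payouts = {name: weekly_revenue * bal / total_supply
           for name, bal in holders.items()}
print(payouts)  # alice: 60 ETH, bob: 30 ETH, carol: 10 ETH
```

A holder with 60% of the supply receives 60% of the week's protocol income, which is the distribution rule the quote outlines.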

The founder of ORA Protocol stresses the importance of educating people about IMOs and their impact on AI project funding. When a model offering succeeds and the token's value rises significantly, it attracts attention, driving interest and investment in future IMOs and increasing liquidity in the AI sector.

Understanding the paradigm shift

Kartin admits that the move to a fully decentralized and permissionless ecosystem is fraught with technical challenges. These include ensuring model integrity, maintaining performance standards, and achieving true decentralization without compromising user trust or model quality.

ORA tackles these bottlenecks with two ERC standards, ERC-7641 and ERC-7007, together with its on-chain AI Oracle. Let's look at each in turn.

ERC-7641 is an ERC-20-compatible token standard. To start an IMO, developers bind the model to an ERC-7641 asset and set the token parameters in the smart contract. Investors buy these tokens, gaining ownership of the AI model in proportion to their stake. ERC-7641 encodes profit-distribution rules in the contract, enabling automatic revenue sharing based on token holdings.

ERC-7007 is a token standard for AI-generated content that guarantees authenticity and traceability. It records AI-generated content metadata on the blockchain and uses smart contracts for automatic verification. Developers can apply technologies such as zkML or opML to verify that the AI-generated content (AIGC) for a given NFT truly came from a specific machine-learning model and inputs.

ORA's on-chain AI Oracle validates and runs AI models on the blockchain, ensuring that the deployment and operation of AI models happen entirely on-chain and enhancing the transparency and accuracy of these workflows.
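To make the ERC-7007 idea concrete, here is a minimal sketch of binding AI-generated content to the model and prompt that produced it via a commitment, then verifying it later. The hashing scheme, function names, and in-memory registry are invented for illustration; the actual standard defines Solidity interfaces and uses zkML/opML proofs rather than plain hashes.

```python
# Hypothetical sketch of ERC-7007-style content provenance:
# commit to (model, prompt, output) at mint time, verify on demand.
import hashlib

def commit(model_id: str, prompt: str, output: bytes) -> str:
    """Commitment stored in the NFT's metadata at mint time."""
    return hashlib.sha256(model_id.encode() + prompt.encode() + output).hexdigest()

# Stand-in for on-chain storage: token id -> commitment
registry: dict[int, str] = {}

def mint(token_id: int, model_id: str, prompt: str, output: bytes) -> None:
    registry[token_id] = commit(model_id, prompt, output)

def verify(token_id: int, model_id: str, prompt: str, output: bytes) -> bool:
    """True only if this exact (model, prompt, output) triple was committed."""
    return registry.get(token_id) == commit(model_id, prompt, output)

mint(1, "example-model-v1", "a cat in space", b"<image bytes>")
assert verify(1, "example-model-v1", "a cat in space", b"<image bytes>")
# Claiming the same content came from a different model fails verification:
assert not verify(1, "other-model", "a cat in space", b"<image bytes>")
```

The point is traceability: anyone can check that a given NFT's content matches the model and inputs its creator claimed, without trusting the creator's word.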

photo 2024 04 25 11 07 11

ORA further integrates Optimistic Machine Learning (opML) technology. AI models often represent critical competitive advantages, and fully disclosing them can undermine their business value. opML can use zero-knowledge proofs or similar cryptographic techniques to validate a model's results without disclosing the model's details, preserving its integrity and efficiency while protecting its confidentiality and uniqueness.
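The "optimistic" part of opML follows the pattern used by optimistic rollups: a result is accepted by default, and anyone may recompute and challenge it within a dispute window. The toy sketch below illustrates only that pattern; the class names are invented, and real opML resolves disputes with interactive fraud proofs rather than full recomputation.

```python
# Toy illustration of optimistic verification: claims are accepted
# unless a challenger shows the reference computation disagrees.
from dataclasses import dataclass

@dataclass
class Claim:
    input_hash: str
    claimed_output: str
    challenged: bool = False  # set True if a challenge succeeds

class OptimisticOracle:
    def __init__(self, reference_model):
        self.reference_model = reference_model  # only run when challenged
        self.claims: list[Claim] = []

    def submit(self, input_hash: str, claimed_output: str) -> int:
        """Post a result on-chain; accepted optimistically."""
        self.claims.append(Claim(input_hash, claimed_output))
        return len(self.claims) - 1

    def challenge(self, claim_id: int, raw_input: str) -> bool:
        """Recompute and compare; returns True if the claim was fraudulent."""
        claim = self.claims[claim_id]
        claim.challenged = claim.claimed_output != self.reference_model(raw_input)
        return claim.challenged

oracle = OptimisticOracle(reference_model=lambda x: x.upper())
honest = oracle.submit("0xabc", "HELLO")
assert oracle.challenge(honest, "hello") is False   # honest claim survives
fraud = oracle.submit("0xdef", "wrong")
assert oracle.challenge(fraud, "world") is True     # fraudulent claim caught
```

Because the heavy computation runs only when someone disputes a result, the honest path stays cheap, which is what makes this approach practical for large models.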

However, Kartin recognizes that the protocol still has a long way to go before it achieves its primary goal.

“The main problem with tokenizing AI models is that we want it to be permissionless and completely decentralized. We spent 18 months on R&D to make something real. It's revolutionary. But the problem is that we're the only ones who can do it, because we created the opML library. So currently, if you want to tokenize an AI model, you have to go through the ORA protocol, and then we'll share the Docker image with all the nodes, and so on. Eventually it will be completely decentralized and permissionless, but right now it's not completely permissionless.”

Ethical implications and regulatory alignment

Tokenization of AI models raises significant ethical questions, particularly regarding misuse and liability. ORA Protocol addresses these challenges through a two-pronged strategy focused on ethical guidelines and regulatory compliance. Kartin emphasizes the importance of creating a governance framework compatible with international standards, ensuring that the potential of tokenized AI models does not outrun community norms and regulatory requirements.

“The biggest ethical challenge with IMOs is, in fact, not the IMO itself. It is actually the AI Oracle. Once you put an AI model on the blockchain, it cannot be taken down, and anyone can use it. So it raises some concerns – for example, people can generate unethical content with an AI model. But the blockchain itself has similar risks. Some people use it to scam money, and you have no way to stop them. Once you do it on Bitcoin or Ethereum, it is completely decentralized. Then you have no control over it.”

The founder of ORA said that when an AI company conducts an IMO, it must ensure full compliance for the tokenization of its models, especially under regulations in the United States. The approach makes every AI company, regardless of its global location, responsible for compliance, since laws and regulations differ from country to country. Because the primary beneficiary of an IMO is the AI company itself, it falls to them to navigate and comply with their respective country's specific legal framework.

For example, Singapore's regulations on AI and tokenization may differ significantly from those in the US, requiring companies in Singapore to independently manage and mitigate any ethical concerns related to their AI models. This could include implementing governance mechanisms in tokens or blacklisting in smart contracts to prevent unethical use of an AI model. This principle of local responsibility and compliance is universally applicable to all jurisdictions.

A new industry standard

The current entanglement of cryptocurrency narratives with the AI sector stems from blockchain's impact on AI production. This integration is particularly important because blockchain technology can address two critical challenges in the AI industry. The first is liquidity, which Initial Model Offerings (IMOs) have begun to tackle by providing a platform for funding AI models. The second is transparency: IMOs must not be perceived as opaque or unreliable, and blockchain technology can guarantee the authenticity of these offerings, helping to address issues of trust.

Kartin strongly believes that all industries using AI in their production processes will benefit from tokenized AI models, especially through what he calls the “on-chain controller.” This concept addresses a common issue in production AI systems – not the fear of AI turning on humans, but the frequent occurrence of operational failures. A large language model (LLM) detects and resolves such failures: if a user suspects a bug, they can report the issue to this LLM, which evaluates whether the original AI model is malfunctioning.

The most effective place for such a controller is on the blockchain. Despite the high cost, the main advantage is that this supervisory AI model runs around the clock, ensuring constant performance monitoring of AI models deployed on local devices. Every AI model running locally can therefore benefit from being connected to a tokenized, on-chain AI model acting as a perpetual controller.

In his closing thoughts, Kartin noted that the excitement surrounding the AI crypto space will continue as long as there are new developments and discoveries. Given these dynamics, the momentum in the AI sector is likely to be long-term, driven by both new developments and integration with blockchain.

“AI is here to stay, and what we're doing here is solving a pressing problem for all AI companies in the world. So this brings a ton of value to crypto and also a ton of value to all AI companies. I think it will be the narrative of the year, the full cycle, IMO. I think after this year, tokenizing AI models will 100% become the industry standard.”

Disclaimer

In compliance with Trust Project guidelines, this opinion piece presents the views of the author and may not necessarily reflect the views of BeInCrypto. BeInCrypto is committed to transparent reporting and maintaining the highest journalistic standards. Readers are advised to independently verify information and consult with professionals before making decisions based on this content.
