AGI (Artificial Guarding Intelligence) uses onchain AI to detect unexpected behaviors in onchain smart contracts.
Any whitehat developer can initiate a request for the onchain AI (Llama 2) to evaluate whether malicious behavior has occurred in a smart contract (e.g., a hacker setting "month" to 13). If the contract is judged to have been hacked, the whitehat can claim a bounty from a funding pool as a reward.
The onchain AI thus serves as a neutral judge with a verifiable arbitration process.
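The flow above can be sketched as follows. This is a hypothetical illustration, not the project's actual contract code: all names (`ContractState`, `FundingPool`, `ai_judge`, `claim_bounty`) are invented for this sketch, and the onchain LLM call is stubbed with a simple invariant check on the "month" example from the source.

```python
from dataclasses import dataclass


@dataclass
class ContractState:
    """Minimal stand-in for observed smart-contract state."""
    month: int  # expected invariant: 1 <= month <= 12


@dataclass
class FundingPool:
    """Bounty pool the whitehat claims from on a successful ruling."""
    balance: int


def ai_judge(state: ContractState) -> bool:
    """Stub for the onchain AI judge (Llama 2 in the source).
    Here it simply checks the invariant; the real system would run
    verifiable onchain inference over the contract's behavior."""
    return not (1 <= state.month <= 12)


def claim_bounty(state: ContractState, pool: FundingPool, bounty: int) -> int:
    """Whitehat-initiated arbitration: pay out only if the AI judge
    rules that the contract was hacked and the pool can cover it."""
    if ai_judge(state) and pool.balance >= bounty:
        pool.balance -= bounty
        return bounty  # amount paid to the whitehat
    return 0


# Example: an attacker set month to 13, violating the invariant,
# so the whitehat's claim succeeds and the pool is debited.
pool = FundingPool(balance=1000)
paid = claim_bounty(ContractState(month=13), pool, bounty=100)
```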