Etherscan launches AI-powered Code Reader

On June 19, Etherscan, a blockchain explorer and analytics platform, unveiled its new ‘Code Reader’ tool, which uses AI to fetch and analyze the source code of any given contract address. Once a user enters a contract address, the Code Reader passes the retrieved source files to OpenAI’s large language model (LLM), which generates a response offering insight into how the contract’s code works.

Code Reader’s use cases include obtaining AI-generated explanations of a contract’s code, retrieving complete lists of an Ethereum smart contract’s functions, and understanding how the underlying contract interacts with decentralized applications (dApps). “After retrieving the contract files, you can select a particular source code file to analyze. Plus, you can alter the source code within the UI before sending it to the AI,” the developers stated.
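Etherscan has not published the Code Reader’s internals, but the flow it describes (fetch a contract’s verified source code, then send it to an OpenAI model with a question) can be sketched in a few lines of Python. The sketch below uses Etherscan’s public getsourcecode endpoint and OpenAI’s chat completions API; the API keys, model choice, prompt, and example address are illustrative assumptions, not the tool’s actual configuration.

```python
import json
import urllib.parse
import urllib.request

# Placeholder credentials: substitute your own Etherscan and OpenAI API keys.
ETHERSCAN_API_KEY = "YOUR_ETHERSCAN_KEY"
OPENAI_API_KEY = "YOUR_OPENAI_KEY"

def fetch_contract_source(address: str) -> str:
    """Fetch a contract's verified source code via Etherscan's public API."""
    params = urllib.parse.urlencode({
        "module": "contract",
        "action": "getsourcecode",
        "address": address,
        "apikey": ETHERSCAN_API_KEY,
    })
    with urllib.request.urlopen(f"https://api.etherscan.io/api?{params}") as resp:
        data = json.load(resp)
    return data["result"][0]["SourceCode"]

def explain_source(source: str) -> str:
    """Ask an OpenAI chat model to explain what the contract does."""
    payload = json.dumps({
        "model": "gpt-3.5-turbo",  # assumed model; Etherscan hasn't said which it uses
        "messages": [
            {"role": "system", "content": "You explain Solidity smart contracts."},
            {"role": "user", "content": f"Explain what this contract does:\n\n{source[:12000]}"},
        ],
    }).encode()
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {OPENAI_API_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Example: the well-known USDC proxy contract on Ethereum mainnet.
    src = fetch_contract_source("0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48")
    print(explain_source(src))
```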

Challenges of Large AI Models in Decentralized Computing Networks

Amid the AI surge, some specialists have expressed doubts about the practicality of existing AI systems. According to a recent study published by the Singaporean venture capital firm Foresight Ventures, “computing power resources will be the next great battleground in the next decade”. Despite growing demand for training large AI models on decentralized, distributed computing power networks, the researchers say current approaches face major obstacles, such as complex data synchronization, network optimization, and data privacy and security issues.

In one example, the Foresight researchers point out that training a large model with 175 billion parameters in single-precision floating-point representation would require approximately 700 gigabytes of storage. Distributed training, however, requires these parameters to be transmitted and updated between computing nodes regularly. With 100 computing nodes, each needing a full copy of the parameters at every unit step, the model would need to transmit 70 terabytes of data per second, far exceeding the capacity of most networks.
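The figures above follow from straightforward arithmetic, which a short Python sanity check reproduces (note the per-second reading assumes one full parameter exchange per second, an assumption not stated explicitly in the report):

```python
# Sanity check of the Foresight Ventures figures cited above.

PARAMS = 175e9        # 175 billion parameters
BYTES_PER_PARAM = 4   # single-precision (FP32) floats
NODES = 100           # computing nodes in the hypothetical cluster

# Storage needed for one full copy of the parameters.
model_size_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"Parameter size: {model_size_gb:,.0f} GB")   # 700 GB

# Traffic if every node receives a full parameter copy each unit step
# (read as one step per second to match the report's per-second figure).
traffic_tb_per_step = model_size_gb * NODES / 1e3
print(f"Traffic per step: {traffic_tb_per_step:,.0f} TB")  # 70 TB
```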
