Updated November 11 at 6am UTC: Two quotes previously attributed to Edward Snowden in the last paragraph have been corrected to Illia Polosukhin.
Near Protocol announced ambitious plans to build the world’s largest open-source artificial intelligence model on the first day of the Redacted conference in Bangkok, Thailand. The planned 1.4-trillion-parameter model would be roughly 3.5 times larger than Meta’s current open-source Llama model.
The model will be built through competitive, crowdsourced research and development by thousands of contributors in the new Near AI Research hub, where participants can start by training a small 500-million-parameter model from today, November 10. Participation is open to anyone.
Near Protocol’s ambitious AI model
The project will grow in size and sophistication across seven models, with only the best contributors moving on to develop progressively larger and more complex models. The models will be monetized and privacy-protected using an encrypted trusted execution environment, which rewards contributors and allows continuous updates as the technology advances.
Related: Critical bug that could crash all nodes on the network approaches a patch
Near Protocol co-founder Illia Polosukhin told Cointelegraph at the Redacted conference in Bangkok that expensive training and computing will be funded through token sales.
“It costs about $160 million, so obviously it’s a lot of money, but it’s actually money that can be raised in cryptocurrencies,” he said. Polosukhin added:
“Token holders then receive rewards from all the inferences that occur when this model is used. That means we have a business model, we have a way to monetize it, and we have a way to raise funds. There’s a way to do that, and there’s a way to loop this around, so people can actually reinvest into the next model.”
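The loop Polosukhin describes (token sales fund training, and inference revenue flows back to token holders, who can reinvest in the next model) can be illustrated with a minimal sketch. The $160 million training cost comes from the article; everything else below, including the names, token amounts and pro-rata payout rule, is a hypothetical illustration rather than Near’s actual tokenomics.

```python
# Hypothetical sketch of a token-funded training run whose inference fees
# are paid back to token holders pro rata. Illustrative only; not Near's
# actual mechanism or parameters.

from dataclasses import dataclass, field


@dataclass
class ModelTreasury:
    training_cost: float                                      # e.g. ~$160M per the article
    raised: float = 0.0
    holders: dict[str, float] = field(default_factory=dict)   # address -> tokens held

    def buy_tokens(self, address: str, amount: float) -> None:
        """Token sale: contributions fund the training run."""
        self.raised += amount
        self.holders[address] = self.holders.get(address, 0.0) + amount

    def fully_funded(self) -> bool:
        return self.raised >= self.training_cost

    def distribute_inference_fee(self, fee: float) -> dict[str, float]:
        """Split inference revenue across holders in proportion to their tokens."""
        total = sum(self.holders.values())
        return {addr: fee * bal / total for addr, bal in self.holders.items()}


# Usage: two hypothetical backers fund a run, then split an inference fee.
treasury = ModelTreasury(training_cost=160e6)
treasury.buy_tokens("alice", 100e6)
treasury.buy_tokens("bob", 60e6)
print(treasury.fully_funded())                  # True
print(treasury.distribute_inference_fee(1000))  # {'alice': 625.0, 'bob': 375.0}
```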
Near is one of the few crypto projects capable of realizing such an ambitious endeavor. Polosukhin was one of the authors of the groundbreaking Transformer research paper that led to ChatGPT, and co-founder Alex Skidanov worked at OpenAI before ChatGPT, the era-defining model released in late 2022.
Skidanov, who is currently the head of Near AI, acknowledged that this is a large undertaking with significant hurdles to overcome.
Decentralized AI tackles privacy issues
To train such a large model, the project would normally require “tens of thousands of GPUs in one location,” which is not ideal. Using a distributed computing network instead “will require new technologies that don’t exist today, because all the distributed training techniques we have require very fast interconnections.” However, new research from DeepMind suggests it is possible, he added.
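A rough back-of-the-envelope calculation shows why interconnect speed is the bottleneck Skidanov points to. The 1.4-trillion-parameter figure comes from the article; the bytes-per-parameter and bandwidth numbers below are illustrative assumptions, and the reference to communication-efficient methods of the kind DeepMind has published (such as DiLoCo-style infrequent synchronization) is an interpretation, not a detail confirmed by Near.

```python
# Back-of-the-envelope: why naive distributed training of a 1.4T-parameter
# model needs very fast interconnects. All figures are illustrative assumptions.

PARAMS = 1.4e12                          # model size from the article (parameters)
BYTES_PER_PARAM = 2                      # assumed fp16/bf16 gradients
SYNC_BYTES = PARAMS * BYTES_PER_PARAM    # bytes exchanged per full gradient sync (~2.8 TB)

LINK_GBPS = {                            # assumed effective per-node bandwidths
    "datacenter InfiniBand/NVLink": 400,
    "typical internet link": 1,
}

for name, gbps in LINK_GBPS.items():
    seconds = SYNC_BYTES * 8 / (gbps * 1e9)
    print(f"{name}: ~{seconds / 60:.1f} minutes per naive full gradient sync")

# Approximate output: ~0.9 minutes on a datacenter link vs. ~373 minutes over a
# typical internet link, which is why naive synchronization across distributed
# nodes is impractical and why communication-reducing techniques are needed.
```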
Polosukhin said he hasn’t yet spoken to existing projects like the Artificial Superintelligence Alliance, but would like to see if there are synergies. Whatever happens, he said, decentralized AI technology needs to win.
“This is probably the most important technology now and probably in the future. And the reality is that when AI is controlled by one company, we are effectively doing everything that company says,” he explained, adding:
“If all AI and virtually all of the economy are run by one company, then there is no decentralization. So if you have AI that follows the same principles, then philosophically Web3 is still appropriate. It’s like the only way to be.”
Conference guest speaker Edward Snowden drove that message home with a chilling depiction of centralized AI turning the world into a vast surveillance state.
He also spoke about the need for civil rights on the internet and the need to recognize that “there are legitimate limits to the power to regulate, and the only way to maintain digital sovereignty is to create our own systems that are enforced by mathematics.”
Magazine: Asian crypto traders profit from Trump’s victory, China’s 2025 CBDC deadline: Asia Express