
A Stanford AI research lab has selected the decentralized cloud computing platform Theta EdgeCloud to support its work on large language models (LLMs).
The decentralized cloud solution could help meet the significant computing demands of AI. On April 17, Theta Labs announced that Stanford's AI research team would use Theta (THETA) EdgeCloud to advance its work on large language models. The lab, led by Assistant Professor Ellen Vitercik, plans to use the platform for research on discrete optimization and algorithmic reasoning with LLMs.
Stanford joins an increasing number of academic institutions adopting the decentralized platform for their research endeavors. Theta Labs notes that other institutions using EdgeCloud include Seoul National University, Korea University, the University of Oregon, Michigan State University, and many others.
Big Tech and Decentralized Services Compete for AI Computing Power
By comparison, Amazon recently announced plans to invest $11 billion in data centers in Indiana, while Google is expanding globally, committing $1.1 billion to its data center in Finland and $2 billion to a new facility in Malaysia.
Big tech's approach is not the only one competing for AI workloads, however. Unlike traditional centralized LLM services, Theta EdgeCloud operates as a decentralized cloud computing platform. Its infrastructure is distributed across many locations, reducing dependence on centralized data centers for computing resources.
The platform uses blockchain technology to reward smaller GPU providers with a share of the revenue generated from end users. This model allows Theta to keep capital expenditures low and scale faster, ultimately offering users a more cost-effective infrastructure.
The Theta Network is a blockchain protocol originally designed for decentralized video streaming. It has since evolved to offer decentralized cloud computing infrastructure, with a particular focus on AI applications.