
Decentralizing AI: Balancing Innovation and Accessibility in the Quest for Sustainable Computing Power

By DailyCoin · 2024/06/05 15:58

The accessible use of AI is one of the most impactful changes of our generation, on the same level as the computer chip, the internet, and the search engine in bringing more data, computing, and knowledge to individuals. However, there’s a design question that neither the public nor the private sector has settled: Should AI be centralized or decentralized (or in what combination), and what does that imply for the functionality, sustainability, and economics of the AI economy?

It’s also important to take this question in context. We are not (yet) talking about thinking machines, legal or ethical boundaries, or the extent to which AI can or should replace human activity. We are just talking about a fancy and flexible logic layer that can process a lot of data and needs tremendous amounts of computing power to do it. 

What Is Decentralized AI? 

At its core, the definition of decentralized AI exists on two levels: compute and data.


Let’s start with hardware. In a centralized system, a static set of data sources, models, and processors performs tasks defined by a user. If demand is light, processing power goes unused. If demand is heavy, performance drops off as system limits are reached. Gains in model effectiveness and efficiency are limited to what you can build or buy, and acquiring clean data for your sole use can be both expensive and slow, especially if you’re only going to use it once.
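
To make the trade-off concrete, here is a minimal toy simulation of a fixed-capacity system under fluctuating demand. This is a sketch, not from the article: the capacity, demand range, and units are all made-up illustrative values.

```python
# Toy model of a fixed-capacity cluster facing fluctuating demand.
# All numbers (capacity, demand range, units) are hypothetical.
import random

random.seed(42)

CAPACITY = 100     # petaFLOP-hours available per hour (made up)
HOURS = 24 * 7     # one simulated week

idle_total = 0.0   # capacity paid for but left unused
unmet_total = 0.0  # demand that exceeded the system limit

for _ in range(HOURS):
    demand = random.uniform(20, 180)   # hypothetical hourly demand
    served = min(demand, CAPACITY)
    idle_total += CAPACITY - served    # wasted when demand is light
    unmet_total += demand - served     # dropped or queued at the peaks

print(f"idle capacity: {idle_total:,.0f} petaFLOP-hours")
print(f"unmet demand:  {unmet_total:,.0f} petaFLOP-hours")
```

A fixed system pays for both failure modes at once: idle hardware in quiet hours and degraded service at the peaks.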


Enter decentralized AI. The system exists across multiple nodes, each of which can be scaled up or down as needed. Have spare capacity? Rent it out to offset your costs. Need spare capacity for a few hours? No problem. You can also use models and data sets on an ad-hoc basis. This arrangement not only allows projects with limited resources to play at a higher level, but it also encourages continuous cost reductions through competition. It’s clear, then, that hardware should be decentralized; if the value of shared, elastic compute were ever in question, Amazon Web Services, Microsoft Azure, and Google Cloud Platform wouldn’t be the monsters they are today.
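
As a rough sketch of how renting out spare capacity could work, here is a toy marketplace that matches a job to the cheapest available offers first. The node names, prices, and greedy matching rule are all assumptions for illustration, not any real protocol.

```python
# Toy compute marketplace: operators list spare capacity, and a job
# buys the cheapest slices first. Names, prices, and units are hypothetical.
from dataclasses import dataclass

@dataclass
class Offer:
    node: str
    capacity: float  # petaFLOP-hours on offer
    price: float     # credits per petaFLOP-hour

def match(offers: list[Offer], demand: float) -> list[tuple[str, float, float]]:
    """Greedily fill `demand` from the cheapest offers first."""
    plan = []
    for offer in sorted(offers, key=lambda o: o.price):
        if demand <= 0:
            break
        take = min(offer.capacity, demand)
        plan.append((offer.node, take, take * offer.price))
        demand -= take
    return plan

offers = [
    Offer("lab-cluster", 40, 1.2),   # a lab renting out idle hardware
    Offer("data-center", 200, 2.0),  # a big provider: pricier, but deep
    Offer("gpu-coop", 25, 0.9),      # a small co-op undercutting on price
]

for node, amount, cost in match(offers, demand=80):
    print(f"{node}: {amount:.0f} petaFLOP-hours for {cost:.2f} credits")
```

The point of the design is the price competition: the small co-op gets paid for capacity that would otherwise sit idle, and the job only falls back to the expensive provider for the remainder.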


We’ll get to the governance side of things in a moment, so keep it in the back of your mind.

The New Frontier

Let’s get back to our fancy and flexible piece of math. What has its impact been on the world? Again, we’re confining ourselves to compute considerations, setting aside increased productivity, the redefinition or replacement of workplace roles, and the occasional better-than-human results on narrow use cases. The most fundamental consequence of the increasing scope and complexity of AI is that demand for computing power has grown exponentially.

According to Stanford’s 2024 AI Index report, the training compute required for a notable model in 2003 was about one petaFLOP, and it remained around that level for almost ten years. In 2013, no ten-thousand-petaFLOP models were recorded. In 2023, only a handful of models were below ten thousand petaFLOPs, and several were above ten billion (with Gemini Ultra, Google’s model, approaching the 100-billion-petaFLOP boundary). NVIDIA, which focused on optimizing its processors for AI applications, has seen its stock price rise nearly 8x in the past year. This is not a simple increase; it’s an explosion. The so-called “frontier” cases that push the state of the art (and attract the majority of the funds) are also significantly more expensive in terms of computing costs.
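
Taking the article’s endpoints at face value (roughly one petaFLOP in 2003 versus about ten billion petaFLOPs for frontier models in 2023), a quick back-of-the-envelope calculation shows what that growth means per year:

```python
# Back-of-the-envelope growth check using the article's own endpoints.
import math

start, end = 1.0, 1e10   # petaFLOPs of training compute, 2003 vs 2023
years = 2023 - 2003

orders = math.log10(end / start)       # orders of magnitude spanned
annual = (end / start) ** (1 / years)  # implied compound annual growth

print(f"{orders:.0f} orders of magnitude over {years} years")
print(f"≈ {annual:.1f}x more compute each year")  # ≈ 3.2x
```

Ten orders of magnitude in twenty years works out to demand roughly tripling every year, a pace no fixed infrastructure plan can track.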

How do we manage an escalation that is so necessary to propel human performance to new heights, yet so hard to sustain?

Governance, not Dominance 

It’s tempting to endorse the continued scaling of behemoths like Amazon Web Services, Microsoft Azure, and Google Cloud Platform. After all, their expansive infrastructure and robust service offerings provide a seemingly easy solution to the soaring demands for AI computing power. These platforms can scale operations seamlessly, offer global reach, and bring extensive R&D budgets to bear, driving down costs through economies of scale. However, this centralization brings risks: it fosters dependency on a few key players, creates potential monopolies, and may stifle innovation in the broader tech ecosystem by overshadowing smaller, agile companies that could introduce disruptive new technologies.

Alternatively, envisioning a decentralized AI ecosystem presents a compelling counterproposal. In this model, giants like Google, Amazon, and Microsoft would still play a crucial role, acting as the backbone of a global AI marketplace while enabling a network of smaller entities to provide specialized services and innovation. This approach would not only reduce the load on the core systems but also allow for local solutions that are more responsive to specific regional needs. It would encourage a more competitive environment, driving technological advancement and reducing costs through a diverse array of contributors.

It would also permit the most efficient use of resources: a lack of capacity stifles innovation, while excess capacity, built up as each player tries to create its own closed system, wastes energy and burdens the environment.

At SingularityNET, we have been spearheading decentralized AI with Hyperon, our core decentralized AGI research program, and have accelerated projects such as HyperCycle and NuNet, both focused on distributed compute.

It comes down to governance, to creating a system that allows an ecosystem to flourish and therefore grow to meet human demands we have not yet anticipated.

The challenge of the next decade will be managing AI’s exponential demand for computing power in a way that promotes both innovation and accessibility. Decentralizing AI infrastructure is a key part of the solution, offering a more sustainable, flexible, and resilient model for future growth. This shift would democratize the development of AI, allowing more players to contribute to the ecosystem and ensuring that advancements are shared more broadly, not just in primary consuming markets but across the whole world. As we stand on this frontier, it’s crucial for every stakeholder, from tech giants to individual developers, to consider how they can contribute to a decentralized future.



