Space: The final frontier for data processing


The space race is back – but unlike the battle between the political superpowers in the 60s, this one is between tech corporations looking to explore data processing potential above the clouds.

Google and Elon Musk’s SpaceX were both quick off the mark following Starcloud’s announcement that it had launched its Starcloud-1 satellite containing an Nvidia H100 GPU to facilitate AI processing in space. Starcloud, under its previous name of Lumen Orbit, first proposed the viability of such a data center last year. It’s a modest proposal at the moment; a single computer is a far cry from the banks of machines in a data center.

The announcement, however, prompted plenty of activity. Elon Musk didn’t offer any details about a proposed SpaceX launch, although in a post on X he said, in response to the Starcloud announcement, “Simply scaling up Starlink V3 satellites, which have high-speed laser links, would work. SpaceX will be doing this.”

Google’s announcement on Tuesday, however, highlighted its planned Project Suncatcher that, the company said in a press release, would equip “solar-powered satellite constellations with TPUs and free-space optical links to one day scale machine learning compute in space.”

The proposed Suncatcher launch is based on Google’s own research paper predicting the viability of an AI processing infrastructure in space. The paper has yet to be peer reviewed, so it’s not clear whether its claim that a solar-powered processing infrastructure is viable will hold up, nor whether its assumption that satellite launch costs will come down in the future will prove true. The company is certainly looking far ahead; the white paper proposes that Project Suncatcher will become financially viable in the mid-2030s.

Both Google’s and SpaceX’s plans are, at the moment, vague in terms of what can be achieved and by when. According to Peter Judge, an analyst with data center research company Uptime Institute, “there’s still a long way to go before data centers as we think of them at the moment are launched into space. There are a lot of extreme tech issues that need to be resolved first.”

There are, however, a couple of reasons why data centers in space are being considered. There are plenty of reports about how the growth of AI processing is driving up power consumption within data centers; the World Economic Forum has estimated that the power required to handle AI is increasing at a rate of between 26% and 36% annually. It is therefore not surprising that organizations are looking at other options.

But an even more pressing reason for orbiting data centers is to handle the amount of data that is being produced by existing satellites, Judge said. “Essentially, satellites are gathering a lot more data than can be sent to earth, because downlinks are a bottleneck,” he noted. “With AI capacity in orbit, they could potentially analyze more of this data, extract more useful information, and send insights back to earth. My overall feeling is that any more data processing in space is going to be driven by space processing needs.”

And China may already be ahead of the game. Last year, Guoxing Aerospace launched 12 satellites, forming a space-based computing network dubbed the Three-Body Computing Constellation. When completed, it will contain 2,800 satellites, all handling the orchestration and processing of data, taking edge computing to a new dimension.

This article originally appeared on Network World.