That’s still not much by data-center standards. Like, 7000 terabytes is a lot of storage for one person, but it barely even registers against a modern data center.
Also, 2800 desktops networked together isn’t really a supercomputer or a data center.
Such a network is interesting as a scientific tool for gathering and processing data, certainly, but it’s neither a data center nor a supercomputer.
But being accurate with the headline would make it less clickbaity. 😏 Honestly, this article is scant on details.
Data centers don’t usually have an “X-ray polarization detector for picking up brief cosmic phenomena.” Like you said, it seems more like a scientific tool than an actual “data center.”
Imagine the latency on a data center in space. An uplink/downlink round trip every time your server gets an inference request? Lol.
I could see it being fine for longer-running asynchronous requests, but only if the cost/benefit made any sense at all, and if the servers had any resources worth talking about.
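Back of the envelope on that latency point: the article doesn’t say what orbit this thing is in, so the altitudes below are my assumptions (a Starlink-ish LEO shell vs. geostationary), and this is only the best-case speed-of-light delay, ignoring routing, queuing, and ground-station handoffs:

```python
# Rough propagation-delay estimate for a satellite "data center".
# Orbit altitudes are assumptions, not from the article.
C_KM_S = 299_792.458  # speed of light in vacuum, km/s

def round_trip_ms(altitude_km: float) -> float:
    """Best-case ground<->satellite round trip (straight overhead, no processing time)."""
    return 2 * altitude_km / C_KM_S * 1000

leo = round_trip_ms(550)     # assumed low Earth orbit altitude
geo = round_trip_ms(35_786)  # geostationary altitude

print(f"LEO: ~{leo:.1f} ms round trip, GEO: ~{geo:.0f} ms round trip")
```

So LEO adds single-digit milliseconds at best and GEO is pushing a quarter second each way-and-back, before you do any actual compute. Fine for batch jobs, rough for interactive serving.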