
Your Home Is Now a Data Center: How the AI Energy Crisis Is Turning Living Rooms Into Server Farms


The next time you take a hot shower, there's a small but growing chance the water was heated not by your gas boiler, but by a cluster of servers crunching AI inference tasks in a box attached to your wall. That is not a thought experiment. That is 2026.

The artificial intelligence industry is in the middle of what can only be described as a full-blown energy emergency. The demand for compute has grown so aggressively that the traditional playbook of "find land, build a data center, plug into the grid" is no longer working fast enough. Grids can't keep up. Planning approvals take years. Transformers are on backorder. Power utilities are overwhelmed. And so the tech industry, with all its characteristic boldness (and occasional absurdity), has started looking at the rest of the physical world and asking: can we put a GPU in that?

The answers it is coming up with are genuinely fascinating.

The Scale of the Problem

Before we get to hot water and backyard server boxes, it helps to understand just how hungry AI has become.

According to the International Energy Agency (IEA), electricity demand from data centers soared by 17% in 2025, well outpacing growth in global electricity demand of just 3%. Electricity consumption from AI-focused data centers grew even faster, surging 50% in that same year. And this is not expected to slow down. The IEA's updated projections see electricity consumption from data centers roughly doubling from 485 TWh in 2025 to 950 TWh in 2030, accounting for around 3% of global electricity demand by that date.

To put that in perspective, by one estimate, the energy consumption of data centers could approach 1,050 TWh by 2026, which, if data centers were a country, would make them the fifth largest energy consumer in the world, between Japan and Russia.

At the company level, the investment numbers are staggering. The capital expenditure of just five large technology companies surged to more than $400 billion in 2025 and is set to increase by a further 75% in 2026. That level of spending is now larger than global investment in oil and natural gas production.

The hardware inside data centers is also becoming exponentially more power-hungry. An individual server rack within an advanced data center is only the size of a large refrigerator, but by 2027 it could have peak power demand equivalent to that of 65 households. Between 2020 and 2025, the power density of AI servers increased by 11 times, and by 2027 it is set to see a further fourfold increase.
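To get a feel for what those multipliers mean, here is a rough back-of-envelope check using only the figures quoted above, plus one assumption that is not from the article: that an average household draws roughly 1.5 kW.

```python
# Back-of-envelope check of the rack power figures quoted above.
# The per-household draw is an assumption, not a figure from the article.

avg_household_kw = 1.5        # assumed average household draw, kW
households_per_rack = 65      # article: rack peak demand by 2027

rack_peak_kw = households_per_rack * avg_household_kw
print(f"Implied rack peak: ~{rack_peak_kw:.0f} kW")  # roughly 100 kW

# The density multipliers compound:
density_2020_to_2025 = 11     # article: 11x increase, 2020-2025
density_2025_to_2027 = 4      # article: a further fourfold increase
total_factor = density_2020_to_2025 * density_2025_to_2027
print(f"Cumulative density increase 2020-2027: {total_factor}x")  # 44x
```

In other words, a single refrigerator-sized rack approaching 100 kW, roughly 44 times denser than its 2020 equivalent.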

This is the context in which companies are now showing up at people's front doors with unusual propositions.

The Hot Water Play: Heata and British Gas

Let us start in the United Kingdom, where a startup called Heata has come up with an idea so elegantly clever it almost sounds like a joke until you realize it actually works.

Heata has created a "virtual data centre," a network of servers distributed in people's homes. Each server is attached to the home's hot water cylinder, and as they process data, the heat they generate is transferred into the water. This reduces the energy needed to heat water in the home, cutting household bills and reducing carbon emissions in the process.

The concept flips the traditional data center problem on its head. In a normal data center, heat is the enemy. Enormous energy is spent on cooling systems just to keep the servers from melting. Heata essentially relocates that waste heat into a place where heat is actually valuable: your hot water tank.

Each Heata unit can provide up to 4 kWh of hot water per day, and Heata pays for the electricity the unit uses, meaning the household pays less to heat their water. According to the company, this technology is expected to save households up to £340 per year when offsetting electrically heated hot water, and up to £120 per year when offsetting gas heating.
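Those savings figures check out on a napkin. The sketch below uses the article's 4 kWh/day figure plus assumed round-number UK energy prices (roughly £0.25/kWh for electricity, £0.07/kWh for gas, and an 85% boiler efficiency, none of which come from the article):

```python
# Rough sanity check of Heata's claimed household savings.
# Energy prices and boiler efficiency are assumptions, not article figures.

kwh_per_day = 4                    # article: up to 4 kWh of hot water per day
annual_kwh = kwh_per_day * 365     # heating energy offset per year (1,460 kWh)

electric_price = 0.25              # assumed £/kWh for electricity
gas_price = 0.07                   # assumed £/kWh for gas
boiler_efficiency = 0.85           # assumed gas boiler efficiency

electric_saving = annual_kwh * electric_price
gas_saving = (annual_kwh / boiler_efficiency) * gas_price

print(f"Electric heating offset: ~£{electric_saving:.0f}/yr")  # ~£365
print(f"Gas heating offset:      ~£{gas_saving:.0f}/yr")       # ~£120
```

Both results land close to the company's "up to £340" and "up to £120" claims, which suggests those numbers are simple price-times-energy arithmetic rather than marketing invention.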

British Gas came on board for a trial in early 2025. As part of a three-month trial, 10 Heata units were installed in the homes of British Gas employees, with the energy provider's computing workloads processed on those units. In effect, British Gas supplied its employees with free hot water as a byproduct of its own cloud compute.

Heata claims its approach reduces energy use by 80% compared to traditional data centers, and can simultaneously help people in fuel poverty. That is a remarkable dual benefit: cheaper AI compute and cheaper home energy, both from the same device sitting next to your boiler.

Heata co-founder Chris Jordan described the logic plainly: "Waste heat is a big problem for data centers, leading to significant energy costs for cooling. Yet heat is valuable. On the other side of the coin, you have an energy crisis and people struggling to heat their homes. Our unique technology brings those two things together."

It is worth noting that the system does have one practical constraint. It requires a hot water cylinder, so it cannot be used in homes with combi boilers. That rules out a significant portion of UK homes. But for those with the right setup, it is a genuine win-win.

The GPU Box on Your Wall: Span's XFRA

If Heata is clever and subtle, what California-based Span is proposing is considerably more dramatic.

Span, a startup that originally launched with smart electrical panels designed to help homeowners save money on their electricity bills, has partnered with Nvidia to develop small, fractional data centers called XFRA nodes that can be installed on the exterior walls of residential homes and small businesses.

These are not small gadgets. Each XFRA node contains Dell PowerEdge servers with 16 Nvidia RTX Pro 6000 Blackwell GPUs, 4 AMD EPYC CPUs, and 3 TB of RAM, connected to a 24-port gigabit switch. This is serious professional hardware, the kind you would normally find in a hyperscale cloud facility, being mounted on the side of a house in a suburban neighborhood.

Span collaborated with Nvidia to use liquid-cooled Nvidia RTX PRO 6000 Blackwell Server Edition GPUs. These require no fans, so there is no noise. That is a critical detail. One of the biggest community complaints about data centers is the constant industrial humming. The fanless, liquid-cooled design means a neighbor would likely have no idea the unit was there.

The financial deal being offered to homeowners is genuinely attractive. The company charges a flat monthly fee of about $150. In return, it essentially pays the host's electricity and internet bills. The computing power generated from the nodes is distributed to customers like hyperscalers and AI companies. Span's Chief Revenue Officer went further, suggesting that homeowners in places where XFRA nodes provide the most value could receive discounted electricity, up to and including free electricity and free internet access.

The reason Span can offer this deal comes down to a simple observation about how homes use electricity. Residential homes operate at an average of just 40% of their peak power capacity, leaving significant untapped headroom that can be utilized for compute. The XFRA node taps that headroom without disrupting normal household activity.

Span plans a 100-home pilot in 2026 and aims to scale to 80,000 nodes nationwide starting in 2027, delivering more than 1 gigawatt of distributed compute capacity. For context, that is equivalent to a large traditional data center, but distributed across tens of thousands of homes in ordinary neighborhoods.
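The fleet numbers imply a per-node power draw, which can be derived directly from the figures quoted above. The per-GPU wattage used in the cross-check below is an assumption, not a figure from the article:

```python
# Implied per-node power for Span's stated fleet targets.

fleet_nodes = 80_000          # article: planned nodes nationwide
fleet_capacity_mw = 1_000     # article: "more than 1 gigawatt"

per_node_kw = fleet_capacity_mw * 1_000 / fleet_nodes
print(f"Implied per-node draw: ~{per_node_kw:.1f} kW")  # ~12.5 kW

# Cross-check against the stated hardware: 16 GPUs per node,
# at an ASSUMED ~600 W per GPU (not an article figure).
gpu_count = 16
assumed_gpu_watts = 600
gpu_only_kw = gpu_count * assumed_gpu_watts / 1_000
print(f"GPUs alone: ~{gpu_only_kw:.1f} kW")  # ~9.6 kW
```

A node in the low-teens of kilowatts is also consistent with the 8,000-units-per-100-MW comparison Span makes below, and it is well within the kind of headroom a 40%-utilized residential panel leaves unused.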

Span says it can install 8,000 units six times faster and for roughly a fifth of the cost of building a large, centralized 100-megawatt data center. That speed-to-power advantage is the whole point. The bottleneck for AI expansion is not money or hardware; it is grid access and permitting timelines. As of early 2025, the Center for Strategic and International Studies noted delays of up to 7 years in the nation's largest data center markets. Span's approach sidesteps all of that.

Major homebuilder PulteGroup is already testing the concept. "There is certainly opportunity, as SPAN can provide homeowners with access to innovative technology and potential income generation that can help offset monthly energy costs," the company said. "On a larger scale, if the technology proves out, it might also keep local infrastructure from being overburdened, which could keep land open for other uses, such as building homes."

The Experts Are Not Entirely Sold

Not everyone is enthusiastic about the idea of suburban compute nodes.

Ari Peskoe, director of the Electricity Law Initiative at Harvard Law School, called the homeowner subsidy "fascinating," but raised a practical concern about clustering. "If there's a block that has several homes with these devices, maxing out compute and energy would force a lot of power to that local area," he said. Local distribution networks were not designed for sustained heavy industrial loads dropped into residential streets.

There are also broader systemic concerns. A Utah State University physics professor warned that efforts to modestly reduce the ecological harms of data center power usage could actually exacerbate the overall problem. In a preliminary analysis, he calculated that only 30 to 40 percent of homes may be suitable for mini data centers, given electrical integration constraints, the need for stable internet, and the question of whether occupants are willing to have the technology installed at all.

Security researchers have flagged physical theft and side-channel attacks as potential risks when high-value GPU hardware sits in unguarded residential locations. These are real concerns that Span will need to address before scaling to tens of thousands of homes.

The Bigger Picture: Compute by Any Means Necessary

Heata and Span are not isolated experiments. They are part of a broader, industry-wide scramble for power that is pushing the AI sector into increasingly unconventional territory.

The most dramatic example involves a nuclear plant that is synonymous with disaster. The owner of the shuttered Three Mile Island nuclear power plant announced plans to restart the reactor under a 20-year agreement that calls for Microsoft to purchase the plant's power to supply its data centers with carbon-free energy. Constellation Energy described the agreement with Microsoft as the largest power purchase agreement the nuclear plant operator has ever signed, with the plant expected to come back online in 2028. Constellation will invest $1.6 billion to restart the plant, covering nuclear fuel, the turbine, the generator, the main power transformer, and cooling and control systems.

Let that sink in. A reactor that shut down because it was economically unviable is being brought back to life, at a cost of over a billion dollars, because AI companies need more electricity.

Beyond nuclear, tech companies are hunting for power wherever they can find it: buying up decommissioned factories and old cryptocurrency mining facilities because they already have high-voltage grid connections, mapping deep geothermal zones near volcanic activity, and building solar arrays in the middle of deserts. The through-line is desperation for reliable, large-scale power, accessed faster than the normal permitting process allows.

The grid itself is beginning to feel the strain everywhere, not just near the data centers. In the PJM electricity market stretching from Illinois to North Carolina, data centers accounted for an estimated $9.3 billion price increase in the 2025-26 capacity market. As a result, the average residential bill is expected to rise by $18 a month in western Maryland and $16 a month in Ohio. Ordinary people who have never used an AI tool in their lives are already subsidizing the compute boom through their electricity bills.

This is precisely why the distributed, residential-compute model is so conceptually interesting. If AI companies are going to consume neighborhood electricity regardless, at least Span's approach gives the homeowner a direct financial benefit from hosting the load, rather than simply absorbing a higher bill.

What This Means Going Forward

We are watching the physical infrastructure of AI expand beyond the boundaries of the traditional data center in real time. The home, historically a private space insulated from industrial activity, is being quietly recruited into the global compute grid.

For homeowners, the deals on offer right now look generous. Free hot water, paid electricity bills, free internet. The question is what the long-term implications look like as the technology scales. Who is liable if a node malfunctions and causes a problem? How does resale value of a home change when there is a GPU cluster mounted to its exterior? What happens to the agreement if the AI company encounters financial difficulty?

These questions are not reasons to dismiss the technology. They are the normal growing pains of any infrastructure shift. The residential electricity grid, the mobile phone tower, the rooftop solar panel: all of these faced similar skepticism before becoming normalized features of everyday life.

The interesting thing about this particular moment is that the financial pressure to make distributed compute work is enormous and growing. The IEA projects that electricity consumption from AI-focused data centers will triple by 2030. That trajectory guarantees more innovation, more unusual partnerships, and more proposals that will sound absurd until they are suddenly everywhere.

Kenya and the broader African continent are not yet at the center of this story, but the dynamics playing out in the UK and the US are instructive. As local AI infrastructure grows and electricity reliability remains a challenge in many regions, the principle of distributing compute closer to where power exists, and compensating hosts for that privilege, could prove even more relevant here than in the markets where it is currently being tested.

The era of "compute by any means necessary" is producing some genuinely creative engineering. Some of it will fail. Some of it will reshape how we think about homes, energy, and infrastructure. Either way, it is worth paying close attention.
