Bone Dry

I'm looking out my window right now at Mt. Crested Butte and something is wrong. It's mid-February and I can see rock. Not the little streaks of granite that poke through in a normal year and give the mountain its character, the jagged lines that make the Butte look like a rooster's comb from certain angles. I mean whole faces of exposed mountain where snow should be six feet deep. The peak looks like it's been stripped. Like someone reached down and peeled the winter right off of it.

I moved to Crested Butte because it's the last honest town in Colorado. I don't mean that as a bumper sticker. I mean it literally. Elk Avenue still has the Secret Stash and the Wooden Nickel and a post office where the fellow behind the counter knows your dog's name. There are more mountain bikes than BMWs. People came here because they wanted to be somewhere that hadn't been optimized, that hadn't been smoothed into another Aspen or another Telluride with a Gucci store where the feed shop used to be. Crested Butte is 8,900 feet of altitude and attitude, the wildflower capital of Colorado in July. In February, it's supposed to be buried. It's not buried this year. Not even close.

Colorado's snowpack is sitting at 52% of median. That number doesn't do it justice, so let me translate. It's the worst measured winter in nearly forty years. Since SNOTEL stations started keeping statewide records in 1987, there has never been less snow on the ground in Colorado in early February than there is right now. Thirty-eight percent of monitoring sites are at record lows. The state climatologist, Russ Schumacher, says the statewide snowpack has been sitting at the zeroth percentile of the SNOTEL record since mid-January. The zeroth percentile. That's not a ranking. That's a floor. It means every single year we've ever measured was better than this one. Every one.

This isn't a bad snow year. This is what happens when you have the third-warmest November and the warmest December in 130 years of records, back to back, and then January decides to keep the streak going. The precipitation has been closer to normal, which is the cruel part. It's been falling as rain. Rain in the Rockies in January. I stood on my porch on a Tuesday last month in a flannel shirt, no jacket, watching water run down the street at 8,900 feet, and I thought: this isn't right. This isn't how it works up here.

Now hold that picture in your head. The bare rock, the brown where white should be, the reservoirs already below normal, the streamflow forecasts looking grim for every major river basin in the state. The Colorado River, the Gunnison, the Arkansas, the Rio Grande. Hold all of it. Because while the mountains are running out of water, Denver is building something that's going to need a whole lot of it.

CoreSite, a Denver-based data center company, is constructing a campus in the Elyria-Swansea neighborhood that will eventually cover 600,000 square feet across three buildings. When it's fully operational, it will consume up to 805,000 gallons of water per day to cool its systems. That's the daily indoor water use of 16,100 people. Eighty percent of it lost to evaporation. Gone. Into the air above a city that sits in a state where the snowpack is at the zeroth percentile. And that's one facility. QTS Realty is building a 160-megawatt hyperscale facility in Aurora that could consume as much power as 176,000 homes. Across the Front Range, Denver projects data center capacity to grow 200%. Xcel Energy says it will need $22 billion in new infrastructure just to keep the lights on.
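That 16,100-person figure is worth checking yourself. Here's the back-of-the-envelope math, assuming the commonly cited planning figure of roughly 50 gallons of indoor water use per person per day; the per-person number is my assumption, not something from a permit filing:

```python
# Back-of-the-envelope check on the cooling-water figures above.
# Assumes ~50 gallons/person/day of indoor use, a commonly cited
# planning figure (my assumption, not from any filing).

GALLONS_PER_DAY = 805_000      # reported maximum draw for the campus
INDOOR_USE_PER_PERSON = 50     # gallons/person/day (assumed)
EVAPORATIVE_LOSS = 0.80        # share lost to evaporation (reported)
GALLONS_PER_ACRE_FOOT = 325_851

people_equivalent = GALLONS_PER_DAY / INDOOR_USE_PER_PERSON
evaporated_daily = GALLONS_PER_DAY * EVAPORATIVE_LOSS
evaporated_yearly_af = evaporated_daily * 365 / GALLONS_PER_ACRE_FOOT

print(f"People-equivalent:   {people_equivalent:,.0f}")              # 16,100
print(f"Evaporated per day:  {evaporated_daily:,.0f} gallons")       # 644,000
print(f"Evaporated per year: {evaporated_yearly_af:,.0f} acre-feet") # ~721
```

Seven hundred acre-feet a year, from one campus, that never comes back as wastewater.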

Let me say this plainly. We are building cathedrals of computation in the desert and baptizing them with water we don't have.

Two-thirds of the data centers built or under construction since 2022 are in water-stressed regions. The American Southwest, already fighting over every acre-foot in the Colorado River Compact, is projected to see data center water consumption balloon to nearly 90,000 acre-feet by 2035 when you count the power plants feeding them. That's not a rounding error. That's a river. And it's being routed through cooling towers so that someone in New York can ask a chatbot what to make for dinner.

Look, I'm not anti-AI. I've spent the last few years helping universities, organizations, and anyone who'll listen understand what this technology actually means. I believe in it. I've seen it change lives in real time. But I also live in a place where water isn't abstract. Water is the East River flowing past my house. Water is whether the wildflowers come back in July. Water is whether my neighbors who ranch can keep their cattle alive through the summer. Water is not a resource you trade for inference.

And here's the thing that keeps me up at night, the thing that makes me stare at the bare mountain and think: the current model isn't just environmentally reckless. It's architecturally wrong. We've built the AI economy on an assumption that intelligence has to be centralized. That every query, every interaction, every moment of machine reasoning needs to phone home to a massive facility humming with GPUs and drowning in coolant. We've taken the most distributed technology since the printing press and run it through a bottleneck shaped like a power plant. It's the mainframe era all over again, just with better marketing.

The large language model, the LLM, was a necessary first chapter. I'll give it that. You needed massive scale to prove the concept, to show the world that language could be a computational substrate, that reasoning could emerge from pattern recognition at sufficient depth. It was spectacular. It was important. And it was always going to be a transitional architecture. The mainframe was spectacular too. Nobody lives there anymore.

The next chapter is already being written, and it doesn't need 805,000 gallons a day. Small language models, local language models, task-specific models running on hardware you already own. Microsoft's Phi-3.5-Mini matches GPT-3.5 performance at 98% less computational power. Mistral 7B fits on a laptop and posts benchmark numbers, roughly 60% on MMLU and over 80% on HellaSwag, that would have seemed impossible for a model that size two years ago. Llama 3.2's 1B model runs on an iPhone at 30 tokens per second in about 650 megabytes of RAM. Gartner predicts that by 2027, organizations will use small, task-specific models three times more than general-purpose LLMs. This isn't a prediction I'm making from my porch. It's happening now.
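None of this requires exotic tooling, either. Here's a minimal sketch of local inference using the Ollama Python client; the model tag and the prompt are my examples, and it assumes the Ollama daemon is installed and running on the machine:

```python
# Local inference: the model runs on this machine, and nothing in the
# exchange leaves it. Assumes the Ollama daemon is running and the
# `ollama` package is installed (pip install ollama).
import ollama

response = ollama.chat(
    model="llama3.2:1b",  # example tag: a small quantized model, runs offline
    messages=[
        {"role": "user", "content": "Draft a two-line summary of today's standup."}
    ],
)
print(response["message"]["content"])
```

No API key, no round trip to a data center, no water in the equation.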

But here's where most people stop, and here's where the real argument begins. A small model without a data strategy is just a smaller version of the same mistake. You don't solve the problem by shrinking the cathedral. You solve it by rethinking what you're worshipping.

The organizations that will win the next decade of AI aren't the ones with the biggest models or the fattest GPU clusters. They're the ones who understand their own data. Who've done the unglamorous, tedious, career-making work of cleaning it, structuring it, governing it, and knowing what questions it can actually answer. A 7-billion-parameter model fine-tuned on your institution's actual data, running on your own infrastructure, answering questions specific to your students, your patients, your customers, will outperform a trillion-parameter model guessing from the general internet every single time. And it'll do it without drinking a swimming pool of water every day to stay cool.
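What does "fine-tuned on your institution's actual data" look like in practice? A sketch using Hugging Face's PEFT library for LoRA adapters; the base model and hyperparameters are illustrative assumptions, and the actual training loop on your data is omitted:

```python
# Sketch: attach LoRA adapters to a small open model so it can be
# fine-tuned on local data with hardware you own. Base model and
# hyperparameters are illustrative, not recommendations.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

BASE = "mistralai/Mistral-7B-v0.1"  # example 7B base model
tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(BASE)

# LoRA trains a few million adapter weights instead of all 7 billion,
# which is what makes this feasible outside a rented GPU cluster.
lora = LoraConfig(
    r=8,                                  # adapter rank
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # typically well under 1% of the weights
```

The library doesn't matter. What matters is that the whole loop, data in, adapter out, stays inside your walls.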

This is the argument I keep making and the one I think the industry doesn't want to hear: the future of AI is local. Local models. Local data. Local inference. Local accountability. Not because decentralization is trendy. Because centralization doesn't scale when the resources you need to scale it are running out. Water is running out. Power grids are buckling. And the environmental cost of asking a 175-billion-parameter model to summarize a meeting is a moral question we've decided to ignore because the invoice doesn't itemize the river.

Seventy-five percent of enterprise-managed data is now created and processed outside traditional data centers. The data is already local. The intelligence should be too. Federated learning lets you train across distributed devices without centralizing sensitive information. Quantization compresses models 4 to 8 times with minimal accuracy loss. WebGPU is making it possible to run inference inside a browser tab with no server round trip, no data leaving the device, no cooling tower in sight. The technology is there. What's missing is the strategy.
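The quantization claim, by the way, is just arithmetic. Here's the memory math for a 7-billion-parameter model, counting raw weight storage only; runtime overhead like the KV cache comes on top:

```python
# Why quantization makes local inference practical: weight-storage
# math for a 7B-parameter model at different precisions.
PARAMS = 7_000_000_000

for bits in (16, 8, 4):
    gigabytes = PARAMS * bits / 8 / 1e9
    print(f"{bits:>2}-bit weights: {gigabytes:5.1f} GB")

# 16-bit: 14.0 GB -> server-class GPU territory
#  8-bit:  7.0 GB -> fits a consumer GPU
#  4-bit:  3.5 GB -> fits a laptop, approaching phone territory
```

Dropping from 16-bit to 4-bit weights is the 4x compression; more aggressive schemes get you toward 8x.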

Data strategy isn't sexy. I know that. It doesn't get you a keynote at CES. Nobody raises a Series B on the promise of better metadata governance. But it's the foundation under everything. Without it, you're just burning water and electricity to generate confident-sounding nonsense at scale. With it, you're building something that actually works, that knows what it knows and knows what it doesn't, that serves the specific humans in front of it instead of performing general intelligence for an audience of venture capitalists.

I think about this every morning when I look at the mountain. Crested Butte teaches you something if you live here long enough: resources aren't infinite, and the people who treat them like they are eventually leave. The developers who over-built, the water rights speculators, the guys who thought they could just take more. The mountain doesn't argue with them. It just outlasts them. The people who stay are the ones who learned to work with what's actually here. Who understood that constraint isn't the enemy of creativity. It's the source of it.

The AI industry is standing at a fork. One path leads to more data centers, more water, more power, more centralization, more of everything until the everything runs out. The other path leads to smaller models, smarter data, local inference, and the humility to admit that not every question requires a billion-dollar facility to answer. One path looks at the bare mountain and builds a bigger pipeline. The other looks at it and asks: what if we just needed less?

I know which path I'm on. I can see it from my window. It starts where the snow used to be and leads downhill, past the ranch gates and the river banks and the town that refuses to be optimized, toward a future where intelligence lives where the people are, runs on what's available, and doesn't ask a mountain to go thirsty so a server can stay cool.

The snowpack might recover. We've got two months before the typical peak, and stranger things have happened. But even the optimists will tell you: there has never been a year when Colorado was this far behind and made it all the way back. The mountain remembers that, even if the market doesn't.

Build local. Think small. Protect the water. The mountain is watching.

Ready to build a local AI strategy? Let's talk or join the conversation in Discord.