Photo by Cole Keister

Reclaiming the Human in Tech

Beneath the headlines about AI breakthroughs, a paradox is emerging. Technology races ahead while the human foundations it depends on quietly erode. But growing movements are building something different. Their approaches challenge Silicon Valley's obsession with scale, revealing what innovation looks like when humans come first.

San Francisco – December 9, 2025

Silicon Valley has perfected the art of acceleration. Faster models, more data, quicker deployments. Yet somewhere in this relentless pursuit of scale and speed, a paradox emerged: technology is racing ahead while the human foundations it depends on are eroding.

The Speed-Scale Doom Loop

The numbers tell a seductive story. AI models double in capability every few months. Billion-dollar valuations are reached faster than ever. Automation promises to handle everything from customer service to creative work. The logic seems unassailable: move fast, scale big, optimize everything.

But beneath this narrative of progress lies the risk of “model collapse.” This occurs when AI systems are trained on data that increasingly consists of AI-generated output rather than real human input, causing the models to degrade. The irony? Systems like ChatGPT depend heavily on human-nurtured platforms such as Wikipedia and Reddit, built largely by people volunteering their knowledge and conversation. And if those human voices fade, the quality of generative AI fades with them.

If AI answers replace the need to visit forums, ask questions, or share experiences, who will create the next generation of training data? What happens when the well runs dry?

The Counter-Narrative Emerges

Meanwhile, another perspective is gaining traction: not anti-tech, but intentionally human-centered.

An emerging school of thought often described as “Slow AI” is questioning the cult of acceleration. The issue isn’t model size; companies like Microsoft build task-specific models all the time. The issue is intention. Slow AI asks different questions: What if we prioritized understanding over speed? What if we built AI systems that augment human capability rather than replace it? What if smaller, community-governed models could serve specific needs better than billion-parameter giants?

The contrast is striking. Major tech companies build models of all sizes, from specialized to massive. But grassroots communities are building with different intentions: open governance, community ownership, and human agency rather than market capture. While venture capital chases the next unicorn, volunteer-driven projects are creating infrastructure designed to stay in their communities’ hands rather than under corporate control.

One example is the Human Internet, a loose coalition of developers, researchers, and community organizers exploring ways to rebuild digital spaces around meaningful interaction rather than constant optimization. Their focus is on systems that encourage deliberation, trust, and shared governance, elements often missing from mainstream platforms.

A second example is research initiatives like the Alignment Research Center, which prioritize understanding over deployment speed. Rather than racing to release the next model, they focus on interpretability: understanding how AI systems work before scaling them up, even when that means moving more slowly than competitors.

The Critical Role of Human Connection

Stepping back from the acceleration cycle highlights a key insight: human connection becomes the scarcest resource. In a world where AI can generate content instantly, the bottleneck isn’t information; it’s meaning. It’s the trust that comes from knowing a human took time to craft a response. It’s the serendipity of unexpected conversations. It’s the richness of context that only lived experience provides.

This creates a profound value inversion. What Silicon Valley once optimized away (friction, slowness, and human involvement) suddenly becomes precious. The very inefficiencies that technology aimed to eliminate turn out to be where meaning lives.

During our Multimedia Lab for Journalists program in August 2025, editors from KQED, the Bay Area’s main public media organization, shared their most effective strategy for surviving the digital age: in-person events. While the media industry races to automate content production with AI, KQED found its strongest asset isn’t efficiency; it’s the live, face-to-face gatherings that keep its community engaged and loyal.

Giants and Gardeners

Two approaches now define technology’s future. On one side: the giants, pursuing ever-greater scale with models that consume massive resources and centralize power. On the other: the gardeners, cultivating small plots of digital space where human agency flourishes. In reality, the distinction is messier than a simple binary; what matters is the underlying philosophy.

While companies like OpenAI and Anthropic have built massive proprietary models, the alternative isn’t simply “open source”; that term hides important distinctions. There are open-weight models (Meta’s Llama releases the model parameters but not the training infrastructure), open-source projects (sharing code but not data), and fully open systems. Switzerland’s Apertus LLM represents the fullest version of openness: open source, open weight, and open data, a transparent alternative to black-box systems.

Meanwhile, platforms like Hugging Face have created ecosystems where developers can share and adapt models, and gardeners are building alternative infrastructure such as Mastodon’s federated social network, LAION’s open datasets, and projects like LocalAI that let people run AI locally. The key difference isn’t size; it’s whether technology can be community-governed, locally adapted, and purpose-built rather than locked behind proprietary walls.

Both paths will coexist, but the gardeners challenge a core assumption of Silicon Valley: not all value increases with scale. Some value is inherently local, contextual, and human-sized. Some problems are better solved by communities instead of algorithms, by conversation rather than computation, by presence rather than efficiency.

Where Technology Meets Humanity

What makes this moment significant isn’t just the emergence of counter-movements, but what they reveal about our deeper needs. We’re rediscovering that innovation isn’t just about building faster tools; it’s about creating spaces where human creativity, curiosity, and connection can thrive.

The goldmine isn’t just in the technology we build. It’s in the humanity we preserve, cultivate, and bring to the table. In Silicon Valley’s race to connect everyone, maybe the real breakthrough is remembering how.