The year is 2025, and design, once a respected field where innovation and empathy intersected, has become a battleground. Economic precarity, accelerated by AI automation and corporate cost-cutting, has hollowed out opportunities for emerging designers. Meanwhile, political polarization and social fragmentation have left creatives scrambling to reconcile their values with client demands that often prioritize profit over people. This is the era of extractive design, in which human labor, creativity, and ethics are mined for efficiency, then discarded.
What is Extractive Design?
Extractive design is a system that treats creativity as a resource to be exploited. It prioritizes efficiency and profit over people, leading to:
- Exploited labor: Freelancers and gig workers face stagnant wages, unsafe conditions, and unpaid work. AI tools like Canva and Adobe’s AutoDesigner scrape portfolios for training data, leaving designers uncompensated for their contributions.
- Homogenized creativity: Algorithms churn out “good enough” designs, erasing cultural nuance and human touch. The result? A world where everything looks the same, and creativity feels like a commodity.
- Environmental harm: Overproduction and waste are baked into the system, with little accountability for sustainability. Fast-fashion campaigns and disposable tech are just the tip of the iceberg.
In 2025, extractive design is everywhere—from AI-generated logos to fast-fashion campaigns. But what does it look like in practice?

How Extractive Design Happens
1. AI and the Gig Economy
AI tools have turned design into a gig economy side hustle. Freelancers face:
- Flat fees for unlimited revisions: Clients demand endless tweaks for a single flat rate, leaving designers overworked and underpaid.
- AI undercutting pricing: Why hire a human when an algorithm can do it cheaper? Platforms like Fiverr and Upwork are flooded with AI-generated work, driving prices down.
- Unpaid labor: AI training data is often scraped from portfolios and crowdsourced platforms without consent.
2. Ethical Washing and Performative Allyship
Brands are masters of smoke and mirrors. They deploy social justice aesthetics—diverse stock photos, rainbow logos, and carbon-neutral badges—while ignoring systemic inequities. The World Benchmarking Alliance’s 2024 Social Benchmark assessed 2,000 of the world’s most influential companies and found:
- 90% scored below 50% on basic societal expectations for human rights, decent work, and ethical conduct.
- 30% scored between 0 and 2 out of 20, showing near-total disregard for workers in their supply chains.
It’s not just hypocritical—it’s dangerous. When brands co-opt social justice for profit, they undermine real progress.
3. Algorithmic Censorship and Erasure
AI content moderators, trained on biased datasets, silence marginalized voices under the guise of “brand safety.” Tools like Adobe’s Ethical AI Filter scrub protest imagery and slang from visuals, effectively gentrifying creativity.
This isn’t just about aesthetics—it’s about power. When algorithms decide what’s “acceptable,” they erase the voices of those already fighting to be heard.

How to Resist Extractive Design
1. Unionize and Organize
- Join cross-disciplinary coalitions like the Human Artistry Campaign, which lobbies for fair pay laws and copyright protections against AI data scraping.
- Advocate for transparency in AI training data. If your work is being used to train algorithms, you deserve compensation.
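One concrete, if partial, step toward that transparency is controlling what crawlers may take from your own portfolio site. A robots.txt file can refuse known AI-training crawlers; the user-agent tokens below are published by OpenAI (GPTBot), Common Crawl (CCBot), and Google (Google-Extended). Note that compliance is voluntary on the crawler's side, new AI crawlers appear regularly, and this is a sketch rather than an exhaustive or guaranteed defense.

```text
# robots.txt, served at the root of your portfolio site.
# Opts out of crawlers known to collect AI training data.

# OpenAI's training-data crawler
User-agent: GPTBot
Disallow: /

# Common Crawl, whose corpus is widely used for AI training
User-agent: CCBot
Disallow: /

# Google's opt-out token for AI training (separate from Search indexing)
User-agent: Google-Extended
Disallow: /
```

Pairing a technical opt-out like this with collective advocacy matters: individual files are easy to ignore, but documented, widespread refusal strengthens the case for compensation and consent requirements in law.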
2. Design for Degrowth
- Create modular, repairable products that challenge overproduction. Think furniture that lasts a lifetime, not fast-fashion campaigns.
- Use tools like the Ethical OS Toolkit to anticipate risks and prioritize sustainability. Design shouldn’t cost the earth—literally.
3. Open-Source Resistance
- Contribute to crowdsourced tools like the AI Incident Database, which tracks harms caused by AI systems. Knowledge is power, and sharing it is resistance.
- Share ethical design frameworks to hold corporations accountable. If we don’t define ethical design, they’ll define it for us.
4. Community Design and Mutual Aid
- Decentralized collectives: Groups like Designers for Digital Sanctuary are bypassing corporate gatekeepers altogether, creating open-source tools and resources for ethical design.
- Mutual aid networks: Platforms like Worker Info Exchange empower gig workers to audit algorithmic management systems, demanding transparency from companies like Uber and Amazon.
- Crowdsourced accountability: Initiatives like the AI Incident Database track harms caused by AI systems, providing evidence for lawsuits and policy reform.
Why This Matters
Extractive design isn’t just bad for workers—it’s bad for creativity, communities, and the planet. By resisting exploitation, embracing community-driven models, and reclaiming creativity, we can transform design from a tool of extraction into a force for repair.
Final Thought
The truth is, design in 2025 is a mirror. It reflects who we are as a society: anxious, polarized, and desperate for meaning. But it’s also a bridge. For every startup selling AI-generated “empathy,” there’s a collective prototyping open-source tools for disability justice. For every brand reducing identity to a checkbox, there’s a designer laboring to translate subcultures without sterilizing them.
So here’s the question I’m sitting with—the one I’ll leave you with:
What will design demand of us next?