The old argument is dead

For a long time the advice was: specialize. Pick one language, go deep, become the expert. That made sense when writing code was the bottleneck. If it took a week to build a feature, the person who could do it in three days had a real edge.

That world is over. I was an early user when ChatGPT launched. Within the first few weeks, I made my first $1,000 just by using it the generalist way: not deep expertise in one thing, but the ability to connect problems across domains with a tool that amplified whatever skill I pointed it at. Design briefs, code scaffolding, data cleaning, client proposals. The people who benefited most from AI were not the specialists. They were the generalists who knew enough about enough to aim it at the right problems.

Now I live inside Claude Code and Codex most of my working day. The code itself is no longer the hard part. What is hard is knowing what to build, how systems connect, where the failure points are, and how to keep something running reliably after the first version ships.

I have built scrapers in Python, APIs in Node, frontends in React and Next.js, automation workflows in n8n, data pipelines that land in Postgres and BigQuery, and design work that I would do for free because I genuinely enjoy it. None of that breadth was planned. It accumulated because every project demanded a different tool, and I learned each one by shipping something that depended on it.

That is the generalist advantage. Not knowing everything. Knowing enough about everything to see how the pieces fit.

What AI changed

When I started freelancing three years ago, clients hired me because I could write the code. Now they hire me because I can architect the system. The code is the easy part. Claude Code writes a working scraper in 10 minutes. What it does not do:

  • Decide whether the project needs a scraper, an API call, or a data vendor
  • Design the schema normalization layer so output from 12 different sources looks identical
  • Figure out that the bottleneck is not the scraper but the delivery pipeline that chokes on 50,000 rows
  • Notice that the county website changed its URL structure at 2 AM and the monitoring should have caught it
  • Know that the client's CRM webhook has a 30-second timeout and you need to batch the write-back
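Take that last point. If the CRM's webhook cuts connections at 30 seconds, one giant write-back fails mid-request and you lose the whole batch. A minimal sketch of the batching pattern in Python, standard library only; the webhook URL, payload shape, and batch size here are placeholders, not any particular CRM's contract:

```python
import json
import time
import urllib.request


def chunked(rows: list[dict], size: int) -> list[list[dict]]:
    """Split rows into fixed-size batches so no single request runs long."""
    return [rows[i : i + size] for i in range(0, len(rows), size)]


def push_to_crm(rows: list[dict], webhook_url: str, batch_size: int = 200) -> None:
    """POST rows to a webhook in batches instead of one oversized request."""
    for batch in chunked(rows, batch_size):
        # Hypothetical payload shape; a real CRM dictates its own.
        req = urllib.request.Request(
            webhook_url,
            data=json.dumps({"records": batch}).encode(),
            headers={"Content-Type": "application/json"},
        )
        # Each call carries a few hundred rows and stays under the cutoff.
        with urllib.request.urlopen(req, timeout=25):
            pass
        time.sleep(0.5)  # small gap between batches, easy on the receiver
```

With 50,000 rows and a batch size of 200 that is 250 small requests, none of which can individually hit the timeout, and a failure mid-run loses one batch instead of everything.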

These are architecture decisions. They require understanding how databases work, how APIs fail, how browsers render, how proxies route, how CRMs ingest, how cron jobs break, and how clients actually use the data downstream. No single-language specialist knows all of that. A generalist who has built across the stack does.

The myth of the T-shaped developer

The industry talks about "T-shaped" developers: deep in one area, broad awareness of others. That model assumes the deep expertise is what gets hired.

In my experience it is the opposite. What gets hired is the ability to look at a client's problem and say: "You need a Python scraper that feeds into a normalization script, writes to Postgres, triggers an n8n workflow that pushes to your Google Sheet and sends a Slack notification. I can build all of that. Here is the timeline."

If I were deep in Python and shallow in everything else, I would build the scraper and hand it off. The client would need a second person for the database, a third for the automation, a fourth for the integration. Four people, four handoffs, four communication gaps. Or one generalist who sees the whole picture and builds it end to end.

What I actually use day to day

This is not a theoretical argument. Here is what a typical week looks like:

Python for scraping, data processing, and anything that touches raw data extraction. Playwright and requests for the actual scraping. pandas or polars for data cleaning when the volume warrants it.
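The unglamorous part of that scraping work is retry logic: sources fail intermittently, and a fetch that gives up on the first error produces gappy data. A sketch of the exponential-backoff pattern, standard library only for the sake of a self-contained example (real scraping code would sit on requests or Playwright, and these function names are mine, not from any particular project):

```python
import time
import urllib.error
import urllib.request


def backoff_delays(retries: int, base: float = 1.0) -> list[float]:
    """Exponential schedule: base, 2*base, 4*base, ... one delay per retry."""
    return [base * (2 ** attempt) for attempt in range(retries)]


def fetch(url: str, retries: int = 3, timeout: float = 15.0) -> bytes:
    """Fetch a page, retrying transient failures with growing delays."""
    delays = backoff_delays(retries)
    for attempt, delay in enumerate(delays):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except urllib.error.URLError:
            if attempt == len(delays) - 1:
                raise  # out of retries: surface the error to monitoring
            time.sleep(delay)
```

The raise on the final attempt matters as much as the retries: a scraper that swallows its last error is exactly how a 2 AM URL-structure change goes unnoticed.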

TypeScript and Next.js for anything web-facing. This site, client dashboards, API endpoints. React for interactive components. The ScrapeBase API at scrapebase.io runs on this stack.

n8n for workflow automation when the logic is "if this, then that" and the client wants to see it visually. Webhooks, conditional routing, scheduled triggers. Faster to build and easier for clients to understand than custom code.

SQL (Postgres, BigQuery) for anything that stores or queries data. Every recurring pipeline I build writes to a database, not a CSV.
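The reason a recurring pipeline writes to a database rather than a CSV is idempotency: the same run can execute twice without duplicating rows. A sketch of the upsert pattern, using sqlite3 as a stand-in for Postgres so the example runs anywhere (the table and columns are invented for illustration; Postgres's `ON CONFLICT` syntax is nearly identical):

```python
import sqlite3


def init_db(conn: sqlite3.Connection) -> None:
    """Create the target table; source_id is the scraper's natural key."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS listings (
               source_id  TEXT PRIMARY KEY,
               price      REAL,
               scraped_at TEXT
           )"""
    )


def upsert_rows(conn: sqlite3.Connection, rows: list[dict]) -> None:
    """Insert new rows, update existing ones; reruns never duplicate."""
    conn.executemany(
        """INSERT INTO listings (source_id, price, scraped_at)
           VALUES (:source_id, :price, :scraped_at)
           ON CONFLICT(source_id) DO UPDATE SET
               price      = excluded.price,
               scraped_at = excluded.scraped_at""",
        rows,
    )
    conn.commit()
```

Run the same batch twice and the row count stays flat; only price and scraped_at move. That property is what lets a cron job rerun after a partial failure without anyone cleaning up duplicates by hand.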

Claude Code and Codex for first drafts of everything. I write the architecture, the AI writes the implementation, I review and fix. My output per day roughly doubled when I started working this way. The skill is not typing faster. The skill is reviewing faster and catching what the AI gets wrong.

Design tools for when a project needs a frontend that does not look like a developer built it. I started with design before I started coding, and it is still the thing I can do for hours without needing a reason.

None of these make me an expert in any single one. All of them together make me the person who can deliver a complete system, not just a component.

The generalist premium

There is a pricing dynamic that works in the generalist's favor. A specialist charges for one piece. A generalist charges for the whole system. The client who needs a scraper, a pipeline, a database, and a dashboard has two options:

Option A: Hire 3-4 specialists. Coordinate handoffs. Manage communication. Accept that nobody owns the full picture. Total cost is the sum of the parts plus the coordination overhead.

Option B: Hire one generalist who builds and owns the entire thing. One point of contact. One person who knows why the scraper writes to this table, why the schema looks like this, and why the delivery goes to that webhook. Total cost is usually less than Option A, and the result is more coherent.

The premium is not in any single skill. The premium is in eliminating handoffs.

When specialization wins

I am not arguing that specialization is wrong. If you are building a machine learning model, you want a specialist. If you are optimizing a database that handles 10 million queries per day, you want a specialist. If you are doing anything at the frontier of a single domain, depth beats breadth.

But most of the projects I work on are not at the frontier. They are at the intersection. A scraping project is also a data engineering project is also an automation project is also a delivery project. The frontier is where the pieces connect, not where any single piece goes deepest.

The bottom line

The generalist advantage in 2026 is that AI has commoditized the code. What it has not commoditized is the judgment to choose the right architecture, the experience to predict where it will break, and the breadth to build the whole thing without handoffs.

I started with design. I moved to development. My brother pulled me into scraping. I learned databases because projects needed them. I learned automation because clients asked for it. I learned to build with AI because it made me faster. None of it was a career plan. All of it was "this project needs this, so I learned it."

Three years and 200+ projects later, the breadth is the product. Not any single skill inside it.