@lingodotdev
Which AI can actually be trusted to write code unsupervised?
A tweet claiming the Node.js creator declared the era of humans writing code over drew 41.12% support, 33.64% confrontation, and 25.24% neutral responses, igniting debate.
Real-time analysis of public opinion and engagement
Community concerns and opposing viewpoints
Many replies push back hard on the claim that AI will replace developers, calling it hype and even “ridiculous,” with sarcasm and dismissive jokes.
Several commenters praise AI as a productivity boost—examples like “80% of the code” being generated—but stress the final 20% requires human fixes for production readiness.
Writers emphasize that AI can write code but can’t decide what’s worth building or fully understand problem context; humans remain the architects of purpose.
People point to vulnerabilities, inefficient generated code, and the need for careful human review before deployment.
Many predict new job shapes—prompt engineering, AI training, and reviewer roles—where humans shift from typing syntax to specifying and validating outcomes.
Some are hopeful about progress; others say it’s “not gonna happen” and that developers aren’t going anywhere for years.
Opus and Claude Code get credit for working with existing codebases but are criticized for failing to create robust projects from scratch.
The tone mixes skepticism, humor, and defensiveness—frequent one-liners, insults, and eye-rolls underscore a strong demand for human oversight.
Which AI can actually be trusted to write code unsupervised?
The replies have shown me that tools like Claude Code are changing the game, but they still need a human touch to understand the problem, not just write the code. This shift will likely create new jobs in areas like prompt engineering and AI training. Humans will focus on what to build, not just how to build it.
"Wow! So who is going to prompt the AI now?" "95% of the same people who were writing code before." "Oh"
Community members who agree with this perspective
The work shifts to intent → constraints → systems → outcomes, not faster typing.
Users point to concrete experience (hundreds of AI-driven commits, tools like CaffeineAI) as evidence the transition is accelerating faster than expected.
Replies include predictions of significant layoffs (figures like “250k” were invoked), people feeling forced into management roles, and critiques of corporations prioritizing profit over employment. Some see poetic irony in engineers becoming the architects of the models that replace them.
Ryan Dahl’s name is invoked as a signal that this is not mere hype but a real paradigm shift.
New roles center on writing clear requirements, system design, editing AI output, and coordinating agents. There’s even talk of a new, machine-oriented programming language and IDEs optimized for orchestration rather than keystroke-level authoring.
Some celebrate new opportunities (entrepreneurial wins, faster progress), while others call for a focus on safety, regulation, and critical domains (health, voting systems, anti-fraud, etc.) before automating everything.
Either way, the practical nature of software work is changing, and skills will need to follow.
It's interesting that the computer science people are putting themselves out of work first. The developers of AI have put a special focus on coding and math. Other professions will fall just as quickly as the focus shifts.
ryan dahl spent a decade trying to fix node with deno and then just decided to replace the developers instead. the ultimate refactor
there is a grief to accept w this https://t.co/m5Kg83n4qj