by cortesoft
35 subcomments
- I feel like there are (at least) three main critiques of AI, and I wish we could debate them separately, because I think they each have different resolutions.
The first is the fear of job loss, and I feel like this is the most straightforward to deal with. Personally, I think the solution should be to share the productivity of AI with society at large, in particular since AI owes most of its abilities to training on the works of society. The easiest way would be a straight tax on AI usage, and using that tax to pay a universal basic income. There are obviously a ton of variations on this idea, but I think the general premise of sharing the gains with everyone is sound. I don’t think many would complain if they lost their job but kept their income.
The other two critiques are trickier. The first of these is the environmental impact of AI, and the response is difficult. Doing the work to make AI more efficient, and continuing to develop cleaner energy sources, is paramount. Taxes and efficiency requirements might be a start. We have the technology to produce energy sustainably, but it is expensive. That has to be non-negotiable if massive energy usage for AI is to continue.
The last is the REAL conversation, and I don’t know the answer. How do we handle AI doing creative work? How do we treat AI creative work? How much creative work do we feel comfortable handing over to AI?
I guess there is another issue, related to the last one, which is how we deal with the ability to use AI to mislead and commit fraud at scale. How do we deal with not being able to tell what was actually said or done by a human and what is AI pretending to be human? How do we avoid and mitigate the ability for AI to generate a massive amount of custom content that is used to mislead and defraud people? So much of our current mitigation strategy relies on the assumption that it takes a lot of effort and time to do certain things that can now be done instantly, thousands of times over.
by rescripting
0 subcomment
- The AI CEOs have been screaming for years now about how AI is scary, you should be afraid of it and it’s going to take your job.
“Mythos is too dangerous to release.”
“OpenAI offers a bounty if you can get ChatGPT to teach you how to do a bioterrorism.”
“Agentic agents will replace entire categories of jobs. They’ll just be like, gone”
This is all signaling to their customers; no, not you on the $20/month plan, but the governments and corporations of the world who have deep pockets, fat to trim, and borders to defend and expand.
It’s no surprise that people don’t like AI. It’s not for people.
- This was evident everywhere except within the AI industry itself. The rhetoric from many of the industry’s top leaders has been “this technology will eliminate millions of jobs, fundamentally reshape countless other jobs, and automate the use of lethal force, but we’re going to develop it anyway”. Many of the current economic woes, including mass layoffs, have been blamed on AI by the very executives conducting said layoffs. In addition, the major AI companies have shamelessly stolen intellectual property to train their models and shoveled AI down everyone’s throats. Is it any wonder that the general public hates AI? The AI industry isn’t exactly doing its best to appear likable.
- Are they? I heard a presentation from some pro-AI people on Friday to the large company I work at. They said they surveyed people at an AI conference and 93% of people were excited about it.
This was said with a straight face like “people love puppies!”.
No self awareness at all.
by deepsquirrelnet
0 subcomment
- > In a provocative GitHub post, machine-learning engineer Han-Chung Lee argued that even rosy internal numbers that do show AI-assisted productivity gains are suspect, as they’re produced to hit adoption targets no one can effectively audit.
Isn't this fundamentally what MBAs do with their time? Keep going with this analysis, because it goes much deeper... In my experience, BI is often a house of cards. A lot of times it's just narrative crafting, just like we're all encouraged to do when we write our resumes.
Can you embellish a story? Can you invent a convincing political narrative? As far as I can tell, that's the fundamental unit of US corporation.
by nayroclade
1 subcomments
- Bear in mind, in the same survey this article is talking about, nothing and nobody had an overall positive rating amongst those polled. So yeah, AI is unpopular, but it's just one more thing that people hate amongst a broader cultural movement of generalised hate.
- > Even within tech and coding, one of the areas where AI is reported to have the most promise, there’s the question of whether the productivity gains reported can be trusted.
I wish articles like this would at least acknowledge the massive adoption AI has among programmers. It's not comparable to stuff like helping you write the occasional email, which I presume is the baseline for most people outside tech. Making it sound like a minor tool that some people are still just experimenting with completely misses the impact it has already had on software development.
- To a lot of people AI is just image and text generation. And yes, these uses alone aren't worth the time, money, and energy.
But there are a lot of areas where AI is helping that people don't see, like in medicine. Drug development, cancer research and early detection, CT and MRI analysis, just to name a few. These use cases are vastly more important but rarely get discussed. It's important to know that AI isn't this one singular thing, or else we risk throwing the baby out with the bathwater.
- It should not take more than one brain cell to realize that in an era when employment is already perceived as precarious, you are not going to earn any public favors by telling people you are taking their jobs and making them obsolete. Doubly not so when you offer no alternative path towards building personal wealth. Triply not so when you address none of the economic problems people face like housing or healthcare costs but make others, like social cohesion and energy prices worse.
If the industry continues to gleefully ignore public discontent over AI's impact on society, I imagine the resulting backlash could make post-Chernobyl anti-nuclear sentiment look tame.
- ChatGPT has a billion users, so surely not all of the public hates it.
by mark_l_watson
0 subcomment
- So much of the public hates AI, at least the non-tech people I talk with. Good to see so much common sense among the general public.
While I find a Gemini Ultra subscription worthwhile for myself, most of the value is in the fun and entertainment of interacting with a strong API in AntiGravity (usually use Claude models), Gemini App, NotebookLM, etc. It is intellectually interesting and fun.
Can I justify the cost to society for data centers, possibility of US government bailing out the AI tech giants, etc.?
No I can't. I think the Chinese are skunking us. Building cheaper AI is the winning strategy. GLM-5.1 and Deepseek v4 are amazingly effective for much lower inference costs.
- Them: look how cool we are, stealing your data and making everybody redundant.
The people: ??
Investors: Tell us more.
- Gee, I wonder why? Could it be because they promised to improve our lives but instead we are losing our jobs? Or maybe because there is an insane shortage of electronics for the sake of AI data centers? No, I think it must be the fact that this tech consumes more power than an average city. Actually, it must be the fact that we have autonomous killing drones now. Or maybe it’s the misinformation slop? Nah, it must be the mass stealing of intellectual property.
I’m honestly baffled. What’s there not to like?
- I don’t think the public hates ai. I think AI needs a lot of money so it loudly only pursued the light-bendingly rich by leveraging the only two emotions they have:
1) greed: you will be able to fire all your employees
2) fear: if you don’t buy it someone else will and that is too dangerous for you
Of course normal people found this incredibly off putting.
by bwhiting2356
1 subcomments
- This is a problem of misaligned incentives that echoes other waves of new technology. The arrival of the washing machine was not resisted, because it directly benefited people who could now move up to higher value and less difficult work. AI doesn't seem to be playing out that way.
- I think there is conflation here.
Data centres popping up near you probably mean higher electricity prices, poor air quality, and water problems.
Sam Altman is a massive penis, with a gift for saying the wrong thing at the wrong time.
The two things that link them are "rich" people imposing their will on everyone else, publicly.
- It really isn’t, and we by and large don’t. The New Republic seems to be falsely equating “active Mastodon posters” with “the public”; it’s just not true outside of some very specific and insular bubbles.
- I think the truth is in fact asymmetric on this front.
People, especially many SWEs, like generating with AI, or more tellingly, wouldn't want to give it up in their work.
On the other hand, people generally hate consuming the product of gen AI.
Consumer experience = mostly negative
Producer experience = mostly positive
- It's unbelievable and frustrating that _we basically built our own demise._
We built the most meritocratic and accessible career path possible. If you knew how to code, and you invested in your craft (or didn't!), you were more-or-less guaranteed multiple amazing, well-paying career paths anywhere in the world.
Yet, a cohort of us decided "what if we built this thing that literally does our job? what could possibly go wrong?"
Yeah, this is gatekeeping, but the medical and legal industries have perfected that, and our industry doesn't even require advanced degrees to climb the ladder! (John Ternus only has a bachelor's in mechanical engineering!)
Why did we Eric-Andre-meme ourselves?
- Not really surprising. I would guess this goes beyond just the AI and jobs issue. Your average person sees AI all over the place in contexts they didn’t ask for it but can’t escape. Social media is covered with AI garbage (e.g., AI generated videos). Podcasts are being flooded with AI garbage that are pretty overt grabs for ad impressions where quality is … not important. Appliances and consumer devices are getting AI that nobody asked for. And of course, our world of tech stuff where the selling point is more or less leaning hard into FOMO (“Everybody’s doing it - don’t you want to be an 100x developer and not get left behind?”).
It’s easy to fixate on the OpenAI and Anthropic-level companies, but the real inescapable flood of AI garbage is coming from the downstream companies building on the core AI providers. Communities like HN have some role to play here. Maybe some peer pressure on AI founders to, maybe, not make the world a worse place?
- "Naturally, violence is never an answer, nor is it a politically effective tactic."
I am not condoning violence, but claiming it is not a politically effective tactic is disingenuous. I get that columnists are trying to cover their asses, but still.
by raffael_de
0 subcomment
- Most readers seem to conclude that the AI industry should have done better marketing, when the truth is that they believed they wouldn't even have to consider public opinion because of how powerful their technology is.
- I think the top three people who deserve the hate are Amodei, Hinton, and Bengio. They are the pioneers of the field, and they could have been more objective and laid out exactly where AI does well, where AI falls short, and how we can build a bright future. But oh, no! Instead, they keep telling people that AI will bring us doomsday. AI will displace half of all jobs yesterday. AI will be so dangerous that only government can control it. AI will be so malicious that we should make sure it says the right words in the right tone and gives the "right" information. So: a fear monger, a commie, and a woke. It's not hard to imagine why the public would turn against AI.
- It has destroyed art, it has destroyed public trust with fabricated videos, it has caused skyrocketing prices in components so stuff like Valve's console cannot get made, and it's enriched freaks like Sam Altman.
The fact that AI acolytes are positively giddy about the above is just icing on the cake.
by Legend2440
8 subcomments
- I am very concerned by the rise of political violence in the US, and I especially don't like how much support it gets on social media. Burning down a warehouse or shooting a politician does not make you a hero.
- “Naturally, violence is never an answer, nor is it a politically effective tactic. But you also cannot ignore how the tone-deaf public messaging of the AI industry has helped to contribute to this reaction.”
And yet, as the will of the people is ignored to the benefit of but few, violence will become the answer.
by insane_dreamer
0 subcomment
- If the promise of AI is true, it will turn out to be the most socially and economically destructive force in the history of mankind. And yet, we're all being rushed headlong downstream without a clue of what's going to happen next, pushed by companies and investors driven by FOMO of not capturing the gains.
- I find it interesting how public opinion on AI is diametrically opposed in China to that in the west https://www.semafor.com/article/02/11/2026/trust-in-tech-com...
by IAmGraydon
0 subcomment
- AI is a useful tool and would likely be far more accepted if it wasn’t pushed and hyped to the point of becoming what amounts to a global mass delusion.
by giancarlostoro
0 subcomment
- Some people's reactions to someone using AI are really unwarranted. And in other cases there's no AI used at all: people blindly assume something is AI and proceed to write it off as slop, when it was just slightly low-quality authentic content they might not even have commented on otherwise. We've all seen low-quality videos on YouTube we didn't freak out about.
- The group that really hate AI are the media and journalists, which makes perfect sense given what generative AI is doing to those industries.
As it stands though the whole "the public hates AI" is about as credible as that phase from a decade ago where random tweets were used to justify any position they wanted to.
- AI has automated my favorite part of the job: coding.
Gone is all the experience in clean code, good idioms, etc. All replaced by easily generated shitty code that can be removed and generated again as we please, until it works. No thought about the quality of code itself. Some companies are straight up forcing programmers to live in Claude Code and never even see the code, just write the spec.
It’s disgusting. And the worst part is that you can’t opt-out. If you give even the slightest hint that you don’t like AI you’re seen as a Luddite and you’ll be put next in line for the upcoming layoff.
- Post scarcity or death. We're about to face off with the great filter in the next few decades. Buckle up!
by devindotcom
1 subcomments
- certainly people are finding everyday uses... but a lot of those uses are necessitated by enshittification of search and other commonplace tools. so although I think many see the usefulness of the technology here and there, their experience of it is one of being forced to adopt a thing they never asked for by companies with few or no sincere or articulable values.
billions use windows and gmail but have a poor opinion of microsoft and google both for obvious reasons. I expect the same will be true of AI platforms and the usual suspects behind them.
- Tbh I don't care if vibe-coded software or generated art gets produced; I think the general public will eventually accept it, or decide when it's worth using/consuming those kinds of products.
What I really hate is agentic customer support, sales, etc. When you have to use them, you realize how stupid the workflows, tool calls, MCP, and all that glued-together garbage are; it's all just there to cut costs rather than churn.
PS: Ironically, I'm working on an "agentic platform" for our product suite and its backend services. I simply don't feel confident about the product I'm building, but I guess it pays my bills for the moment.
- All this, so people like us can do our jobs just a little bit easier, which wasn't that hard to begin with, and in fact was quite comfortable all things considered, for employers who are promising to lay us off, for productivity gains that aren't even measurable.
Think back on a time when you and a teammate (or teammates) spent hours or days debating different technological or architectural options and their trade-offs. How much nuance and detail went into those discussions. We used to take pride in our ability to make careful and measured tradeoffs. And yet with this tech all that is thrown out the window.
- Some perspective ... I really do not see 'the public hating AI' outside of a very specific demographic (17-30 year old artsy types, generally left-leaning). Average everyday people in my area either don't care about AI at all, or like it, using it as a better search engine.
The situation might be different in the States, but I'd wager Joe Sixpack, bass fisherman in Montana, couldn't care less about GPT-5.5 or whatever Musk is up to these days.
- Aren't these types of incidents only expected to rise as inequality and economic challenges grow? What are hungry, bored, lonely, and neglected people going to do? This isn't a surprise: we have neglected American health, wellbeing, and happiness, and now we're telling people "AI is coming, trust us, it's different this time," while for most of them life gets worse as the lives of AI company shareholders and employees get better.
I'm ashamed that we don't care more about human dignity. I care about human dignity and wonder if I'm an outlier? Even a tiny pledge and affirmation "Hey, we see you, we are working to bring relief and guaranteed dignity to your lives by doing xyz" would help. Instead when I ask for peace in war[edit: and basic income, anything that is an essential part of dignity[edit 2: and I hear its not possible right now while that isn't said of AI investments] I hear unaccountable leadership dodging the responsibility [of their constituents] and accelerating conflict while their friends' pockets get thicker.
- The AI loves it, though!
- AI is precise. People are not. AI calculates things. People manipulate them to their benefit. AI precision demands people's accountability. Some people feel threatened by that, fearing that their shady games will no longer be working. Deceivers cannot stand seeing their own reflection in the mirror, so they project their own pathological traits onto it. The whole thing turns into aggression. Psychology 101, inspired by the works of Carl Jung.
- > If Altman, Amodei, and their Big Tech peers want to rebuild public trust and create a genuine technology that benefits the public, then the path forward isn’t another white paper or postulating about the existential risks of their technology. It’s sustained, verifiable action: genuine transparency about what their products can do, a willingness to accept meaningful regulation and responsibility even at financial cost, and real democratic input from communities on the growth of data centers.
They need to accept far more than that. They need to accept that they may not be able to "create a genuine technology that benefits the public" at all, and that they therefore may be required to stop completely and totally dissolve all their operations if it turns out that is what is best.
- TNR - DNR
- Weird to have these threads and then fifteen minutes later there will be a 350+ comment, 500+ votes thread about some 200 USD/month AI subscription service which is now the I Have Seen The Light moment and My Beautiful Side Projects Are Finally Materializing.
This is creative destruction in a whole new sense. Just chugging through genuine (or human) creativity, then training on human prompting, then finally ascending near the cluster of Anthropic/AWS nuclear power plants. And people pay for the pleasure.
by forgetfreeman
0 subcomment
- "Naturally, violence is never an answer, nor is it a politically effective tactic." Abhorrent under normal circumstances certainly but declaring the primary drivers of both the workers rights and civil rights movements ineffective is laughable. Power cedes nothing without violence.
- 1. Constant scaremongering about a job "bloodbath". Amodei is the worst culprit by far.
2. Flooding social media with obviously fake AI content.
3. Only billionaires benefiting from it, and gloating about it.
by franktankbank
0 subcomment
- I mean come on. The public hates whatever face the vc puts on.
by AndrewKemendo
1 subcomments
- Welcome to the end of another AI hype cycle
Anyone who was in AI before 2022 can tell you about the last cycle, which ran from roughly 2012 to 2018, when the metaverse failed but we got TensorFlow, PyTorch, and GPGPUs.
The cool thing is that every hype cycle generates a lot of really good new AI tech and integrations that persist. This time we got GPTs, diffusion, and Gaussian splatting.
I think this previous cycle will be seen as the penultimate with the next one permanently improving with no scale back.
We’ll be fine. We have survived every winter
by 1vuio0pswjnm7
0 subcomment
- [dead]
by aifactory5
0 subcomment
- [dead]
by CoherenceDaddy
0 subcomment
- [dead]
by mschuster91
0 subcomment
- [dead]
by navvyeanand
5 subcomments
- [flagged]
- I did mention this a few days ago here and HN seemed to be in denial, but the reality is that most people don’t like it. Sure, they might use it, but only because they’re after shortcuts; they still have a negative sentiment towards it, and “AI” is used as a term to discredit something rather than praise it: “oh look, that’s AI, yeah whatever”, “yeah, AI, fake and g..” I’ve heard it many times, online and offline.
The only people who still look positively at AI are either the ones working on it or building something with it, or the ones profiting from it, kind of like crypto a few years ago. And just as crypto is now mostly immediately associated with scams, I imagine something similar will soon be associated with AI.
Even tech people who are not directly in the AI industry hate AI, due to all the chip shortages and prices increasing across the hardware board, from gamers to sysadmins to hobbyists. I mean, an RPi now costs almost as much as a fully fledged NUC did a few years ago.
Edit: to add, did AI improve the average person’s life? Nope. When it wasn’t increasing costs or tracking and violating their privacy, it flooded the internet with slop, or with frustrating, useless AI chat support. From the average person’s perspective, it added nothing to their quality of life: it didn’t make things cheaper, it didn’t improve their travel, it didn’t magically make them teleport, and so on. Instead, AI was used for all kinds of hostile purposes against the average person. Even from a technical perspective, have we seen any breakthrough in tech given that AI is a “superior” assistant? Nope. Software is shittier and buggier now, SaaS prices are even rising (probably to pay for AI tokens), software developers are saying coding isn’t fun anymore, hardware designs didn’t improve, and government processes still have the same bureaucratic system, plus AI. Unlike when automation was introduced decades ago, when people did notice an improvement in their quality of life.
- All people see is job losses and increased costs, based on articles and tweet-type things. Whether due to AI or not, that is what they are seeing. It is like MAGA news; all they see is "AI eliminates jobs and AI increases your electric bill".
Nothing at this point will make people believe AI is good for the masses.
What will it take for people to like AI? I say they would need to see real money month after month, covering more than inflation, not the dumb tax deductions Trump harps on. Maybe 1,000 USD per month from AI, adjusted for inflation yearly, would end this trend.
Why a payment ? All they see is the wealth of the top 1% increasing almost exponentially where they are struggling to pay their 'fixed' expenses.
In reality, since 2008 the rich have been cashing in while workers have footed the bill. That is the big issue.
- For a good reason
by very_good_man
0 subcomment
- The Journalism Industry Is Discovering That The Public Hates It
- Downsides of AI: massive increase in RAM prices, a worsening housing crisis as data centers are built, massive energy usage, children having trouble learning at school, and spreading misinformation being easier than ever.
Upsides of AI: I can ask it if my farts are caused by the celery I ate earlier