Something tells me that this ain't gonna work. Kids and teens are more inventive than these companies probably realize.
Teen in love with chatbot killed himself – can the chatbot be held responsible?
I'd like to believe that most actual people want to protect kids.
It's easy to write off corporations and forget that they are founded by real people and employ real people... some with kids of their own, or with nieces or nephews, and some of them probably do really care.
Not saying character.ai is driven by that, but I imagine the times they've been in the news were genuinely hard times to be working there...
Your kid is on the fucking computer all day building an unhealthy relationship with essentially a computer game character. Step the fuck in. These companies absolutely have to make liability clear here. It's an 18+ product, watch your kids.
https://news.ycombinator.com/item?id=45733618
On a similar note, I was completing my application for YC Startup School / the Co-Founder Matching program, and when listing possible startup ideas I explicitly stated that I'm not interested in pursuing AI ideas at the moment: AI features are fine, but not as the main pitch.
It feels like, at least for me, the bubble has popped. I've also talked recently about how the bubble might pop through a legal-liability collapse in the courts. https://news.ycombinator.com/item?id=45727060
Add to this the fact that "AI" was always a vague folk category of software: it gets used for robotics, NLP, and fake images. I just don't think it's a real taxon.
As with the crypto buzz of the last cycle, the reputable parties will exit and stop associating with the label, while the grifters and free-associating mercenaries will remain.
Even if you are completely selfish, being in the "AI" space isn't even hugely beneficial, at least in my experience: customers come in with huge expectations and non-huge budgets. And even if you sell your soul to implement a chatbot that will replace 911 operators, the major players have already done so (or decided not to), and you're left with small companies that want to be able to fire 5 employees and will pay you 3 months of one employee's salary if you can get it done by vibe-code-completing their vibe-coded prototype within a 2-3 deadline.
> [Dr. Nina Vasan] said the company should work with child psychologists and psychiatrists to understand how suddenly losing access to A.I. companions would affect young users.
I hope there's a more gradual transition here for those users. AI companions are often far more available than other people, so it's easy to talk to them more and grow more attached. This restriction may end up being a net negative for the affected users.
https://news.ycombinator.com/item?id=44723418
It is also highly compatible with the internet, both in terms of technical/performance scalability and utility scalability: you can use it for just about any information-verification need in any kind of application.