- Can people please not post links with vague titles like this? I had to click through and read half the article to even figure out what this was about, and I wasn’t interested.
- In most of the world, for the past few decades, there has been little thought given to who should get a university education. It has been taken as a given that after high school you should aim for university. I studied software engineering at the most prestigious university in my country, and of the 100+ students in my cohort only a few (myself excluded) had any real interest in academic work or any desire to pursue it. Most of us were just coasting: passing exams and writing mediocre papers with no expectation that anyone would ever read them after graduation.
I think that university-level and other kinds of formal education should be separated. Universities should host fewer students and be able to offer them higher rewards for genuinely meaningful work. I believe the current flood of mediocre papers (let's admit it: low quality in content, if sometimes polished in presentation) will force us to rebuild the education system.
- Note the following comment by Jerry Ling: "The effect goes away if you search properly using the original submission date instead of the most recent submission date. By using most recent submission date, your analysis is biased because we’re so close to the beginning of 2026 so ofc we will see a peak that’s just people who have recently modified their submission."
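The bias Jerry Ling describes can be illustrated with a toy sketch (the records below are invented for illustration, not real arXiv data): counting papers by their *latest* version date piles recently revised old papers onto the most recent months, while counting by the *original* (v1) date does not.

```python
from collections import Counter

# Hypothetical papers: (original v1 month, latest revision month)
papers = [
    ("2023-04", "2025-12"),  # old paper, recently revised
    ("2024-07", "2025-11"),  # old paper, recently revised
    ("2025-11", "2025-11"),  # genuinely new submission
    ("2025-12", "2025-12"),  # genuinely new submission
]

# Counting by latest revision date inflates the most recent months.
by_latest = Counter(latest for _, latest in papers)
# Counting by original submission date spreads the same papers out.
by_original = Counter(orig for orig, _ in papers)

print("by latest date:  ", dict(by_latest))    # apparent end-of-year peak
print("by original date:", dict(by_original))  # peak largely disappears
```

Here the "peak" near the end of 2025 is twice as large under the latest-date count, even though only two papers are actually new.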
- Well… it is happening. You can't put spilled milk back in the bottle. What you can do is add future requirements that try to stop this behaviour.
E.g. the submission form could include a mandatory field: “I hereby confirm that I wrote this paper personally.” The terms and conditions would note that violating this rule can lead to a temporary or permanent ban. In a world where research success is measured in Web of Science points, this could slow the rise of LLM-generated papers.
- I assume hep = high-energy physics in this context. PI = principal investigator, i.e. the professor who holds the grant.
Peer review has never really been blind and I suspect PIs will reject papers from "outsiders" even if they are higher quality. This already happens to some extent today when the stakes are lower.
- > submission numbers in the last couple months have nearly doubled with respect to the stable numbers of previous years
This is showing up (no pun intended) on HN as well. The number of submissions and the number of submitters, which had traditionally been surprisingly stable, fluctuating within a fixed range for well over 10 years, have recently been reaching all-time highs. Not double, though... yet.
- Reform idea:
We should decouple the publishing of papers from academic careers completely.
Papers would no longer generate any reputation or money for their authors. To achieve that, we must anonymize authorship.
All scientists get some (paid) time to write papers — if they want. What they write and if they publish it is not known to anybody. They are trusted to write something of value in that time.
Universities can come up with other ways of judging which professors to hire: interviews, test teaching, or a non-public application essay describing the candidate's past research and discoveries.
- In a normal and sane world, a scientist is a nerd about their field. They are highly interested in new thoughts and insights. When a new paper in their field is published, they try hard to find the time to read it. The reason is: every paper is written by enthusiasts who want to add something of value, new insights, to the discussion. Proving or disproving theories, adding puzzle pieces to the general picture.
That is the normal situation, which is the foundation of the progression of civilisation.
But some people install incentive systems to sabotage this. They are sabotaging civilisation itself.
by general_reveal
0 subcomments
- “And further, by these, my son, be admonished: of making many books there is no end; and much study is a weariness of the flesh.”
- Ecclesiastes 12:12 (KJV)
I suppose we’re entering TURBO mode for ‘of making many books there is no end’.
by 8organicbits
0 subcomments
- > when AI agents started being able to write papers indistinguishable in quality from [...]
Given that arXiv lacks peer review, I'm not clear what quality bar is being referenced here.
- This title should have been editorialised. It reads like a headline from the Daily Mirror.
- What's happening? I hate click bait titles like these.
by pavel_lishin
0 subcomments
- Apparently "hep-th" stands for "High Energy Physics - Theory".
- There are many really excellent papers out there - the kind which will save you hours/months of work (or even make things that were previously inviable to build viable).
That said, it is amazing how terrible a lot of papers are; people are pressured to publish and therefore seem to get into weird ruts trying to do what they think will be published, rather than what is intellectually interesting...
- The shilling for AI continues. How much $$$ do the big tech companies pay Columbia? Oh yeah, and what exactly did Columbia agree to do to get the trmp admin to leave them alone? All speculation of course, but the circumstantial picture stinks.
- One thing I have been guilty of, even though I am an AI maximalist, is asking the question: "If AI is so good, why don't we see X". Where X might be (in the context of vibe coding) the next redis, nginx, sqlite, or even linux.
But I really have to remember, we are at the leading edge here. Things take time. There is an opening (generation) and a closing (discernment). Perhaps AI will first generate a huge amount of noise and then whittle it down to the useful signal.
If that view is correct, then this is solid evidence of the amplification of possibility. People will decry the increase of noise, perhaps feeling swamped by it. But the next phase will be separating the wheat from the chaff. It is only in that second phase that we will really know the potential impact.
- I like AI, and I use Codex and ChatGPT like most people do, but I have to say that I am pretty tired of low-effort crap taking over everything, particularly YouTube.
There have always been content mills, but there was still some cost to producing the low-effort "Top 10" or "Iceberg Examination" videos. Now I will turn on a video about any topic, watch it for three minutes, immediately get a kind of uncanny vibe, and then the AI voice will make a pronunciation mistake (e.g. confusing "wind", the weather effect, with "wind", as in winding a spring), or the script starts getting redundant in ways that are common with AI.
And I suspect these kinds of videos will become more common as time goes on. The cost of producing them is getting close to free, meaning it doesn't take much to turn a profit, even if views per video are relatively low.
If AI has taught me anything, it's that there still is no substitute for effort. I'm sure AI is used in plenty of places where I don't notice it, because the people who used it still put in effort to make a good product. There are people who don't just make a prompt like "make me a fifteen minute video about Chris Chan" and "generate me a thumbnail with Chris Chan with the caption 'he's gone too far'", and instead will use AI as a tool to make something neat.
Genuine effort is hard, and rare, and these AI videos can give the facsimile of something that prior to 2023 was high effort. I hate it.
by hmokiguess
2 subcomments
- I think this is solid proof that the bedrock of academia is deeply motivated by money and still defaults to optimizing where it impacts its bottom line. If professors can get more grants and more publications in less time with less spending, of course they are going to be doing that. This isn't just because of AI, but also because of how this system is designed in the first place.
- "THIS happened to submissions about high-energy theory to arXiv, and it will leave you speechless!"
- Who's spending money to write bots to comment on obscure (to me) websites and why?
by NooneAtAll3
1 subcomment
- Clickbait title
what would be a better one?
- I think the long term impact of this will be to strengthen the importance of social ties in academic publishing. As it is there are so many papers published in many fields that people tend to filter for papers published by big names and major institutions. But the inevitable torrent of AI slop will overwhelm anyone who is looking for any gems coming from outsiders. I suspect the net effect will be to make it even more important that you join a big name institution in order to be taken seriously.
by bitbytebane
1 subcomment
- STOP CITING YOUTUBERS AS A CREDIBLE SOURCE OF ANYTHING.
- Noise is going to be the biggest issue for many fields in the coming years. It's a losing battle, like arguing with a conspiracy-minded relative: you can slowly and clearly address and disprove one conspiracy theory, but by the time you do, they are deep into eight new ones.
- Website's down. What was it about?
- What is?
by lloydatkinson
1 subcomment
- Isn't there a rule about vague titles like this?
- What is happening?
- Convenience dictates that we will be drowning in slop for as long as it remains convenient to rank academics by number of publications. Publish or perish?
- Honestly, this is good. We were already in a completely unsustainable system. Nobody had an alternative. We still don’t have one but at least now it’s not just merely unsustainable— it is completely fucked in half.
This kind of pattern is gonna get repeated in a lot of sectors when previous practices that were merely unsustainable become unsustained.
by ModernMech
0 subcomments
- I mean... I dunno I wish the AI could write my papers. I ask it to and it's just bad. The research models return research that doesn't look anything like the research I do on my own -- half of it is wrong, the rest is shallow, and it's hardly comprehensive despite having access to everything (it will fail to find things unless you specifically prompt for them, and even then if the signal is too low it'll be wrong about it). So I can't even trust it to do something as simple as a literature review.
Insofar as most research is awful, it's true that the AI is producing research that looks and sounds like most of it out there today. But common-case research is not what propels society forward. If we try to automate research with the mediocrity machine, we'll just get mediocre research.
- If someone mentions Sabine Hossenfelder and it isn't to expose her as a rage-bait intellectual dark web grifter, then it puts that person in a suspect light.
- Tl;dr: "It's happening" refers to AI and the like writing papers and coming up with theories, as in this recent Sabine Hossenfelder video: https://youtu.be/JvgaZ_myFE4?t=72
- [dead]