Obviously the answer to testing and grading is to do it in the classroom. If a computer is required, it can't connect to the internet.
Caught with a cellphone, you fail the test. Caught twice you fail the class.
The non-story beatings will continue until morale and common sense improve.
Now, I'm thinking that was pretty much the only way they could think of to ensure kids were doing things themselves.
I know it was a rough transition for my nephew, though, and I don't know that I would have handled it very well either. I'm not sure what would be a better option, though, given how much of a disservice such easy access to a mental crutch is.
Banish tech in schools (including cell phones, except during computer classes) but allow it at home
I.e., in high school only allow paper and pencil/pen
Go back to written exams (handwriting based)
Be lenient on spelling and grammar
Allow digital tutoring and AI assistants only when they aren't the primary work, i.e., for homework, not in-class work
Bring back oral exams (in a limited way)
Encourage study groups in school, but don't allow digital tech in those groups in class or in libraries; allow it only outside of campus or in computer labs
Give up iPads and Chromebooks and Pearson etc
Now, we are social animals, and we grew to value these things in their own right. Societies valued strength and bravery as virtues, but I guess ultimately because having brave, strong soldiers made for more food and babies.
So over time, we tamed beasts and built tools, and most of these virtues kind of faded away. In our world of prosperity and machine power on tap, strength and bravery are not really extolled so much anymore. We work out because it makes us healthy and attractive, not because our societies demand this. We're happy to replace the hard work with a prosthetic.
Intelligence, all these millennia, was the outlier. The thing separating us from the animals. It was so inconceivable that it might be replaced that it is very deeply ingrained in us.
But if suddenly we don't need it? Or at least 95% of the population doesn't? Is it "ok" to lose it, like engineers of today don't rely on strength like blacksmiths used to? Maybe. Maybe it's ok that in 100 years we will all let our brains rot, occasionally doing a crossword as a workout. It feels sad, but maybe only in the way the decline of swordsmanship felt to a Napoleonic veteran. The world moved on and we don't care anymore.
We lost so many skills that were once so key: the average person can't farm, can't forage, can't start a fire or ride a horse. And maybe it's ok. Or, who knows, maybe not.
I just tutored my nephew through his college intro to stats course. Not only are calculators allowed, but they had a course web app so that all they did was select a dataset, select columns from those datasets, and enter some parameters. They were expected to be able to pick the right technique in the app, select the right things, and interpret the results. Because of the time savings, they covered far more techniques than we did in my day; they weren't spending so much time doing arithmetic.
Despite lots of cries about "who will know how to make calculators?", this transition to calculators (and computers) being allowed was unavoidable because that's how these subjects would be applied later on in students' careers. The same is true of AI, so students need to learn to use it effectively (e.g., not blindly accepting AI answers as truth). It will be difficult for the teachers to make their lesson plans deeper, but I think that's where we're headed.
Another lesson we can draw from the adoption of calculators is that not all kids could afford calculators, so schools sometimes needed to provide them. Schools might need to provide access to AI as well. Maybe you are required to use the school's version and it logs every student's usage as the modern version of "show your work"? And it could intentionally spit out bad answers occasionally to test that students are examining the output. There's a lot to figure out, but we can find inspiration in past transitions.
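The gateway idea above can be sketched concretely. This is a hypothetical illustration, not any real product: every names here (`make_school_ai`, `model_fn`, `flawed_fn`) is invented. Every exchange is logged per student as the modern "show your work", and a small fraction of answers are deliberately flawed so teachers can check whether students scrutinize the output.

```python
import json
import random
import time

def make_school_ai(model_fn, log_path, flaw_rate=0.05, flawed_fn=None, rng=None):
    """Wrap an AI backend so every student exchange is logged, and a small
    fraction of answers are deliberately flawed to test whether students
    actually examine the output. `model_fn` and `flawed_fn` are whatever
    callables produce a normal and an intentionally bad answer."""
    rng = rng or random.Random()

    def ask(student_id, prompt):
        # Decide up front whether this answer should be a planted flaw.
        flawed = flawed_fn is not None and rng.random() < flaw_rate
        answer = flawed_fn(prompt) if flawed else model_fn(prompt)
        # Append one JSON line per exchange: the modern "show your work".
        with open(log_path, "a", encoding="utf-8") as f:
            f.write(json.dumps({
                "ts": time.time(),
                "student": student_id,
                "prompt": prompt,
                "answer": answer,
                "deliberately_flawed": flawed,  # visible to teachers, not students
            }) + "\n")
        return answer

    return ask
```

A teacher could then filter the log for exchanges flagged `deliberately_flawed` and check whether the student's submitted work caught the planted error.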
The company was tutoring English Literature as one of its subjects.
They were generating English Literature exam problems - for their users - using the ChatGPT web UI.
They would upload the marking spec, and say: "Give me an excerpt from something that might be on this syllabus, and an appropriate question about it".
Naturally, their users, the high school students, were often getting hallucinated excerpts from hallucinated works by real, existing authors.
I think the kids will be fine - it'll be their world, at some point, and that world will look a lot different to now. Maybe that's too optimistic!
I would hope, in that world, LLM literacy amongst adults has increased.
Because I feel really, really bad for all the kids who are beating themselves up about getting badly marked by ChatGPT (I assume) on an imaginary excerpt of an imaginary Wordsworth poem by their functionally imaginary tutor.
It makes me laugh, and reminds me of one of my favourite jokes, about the inflatable boy who - being of a rebellious nature - takes a safety pin to the inflatable school. Chaos ensues. Afterwards, the inflatable boy's inflatable teacher says:
"You've let me down; you've let the school down, but worst of all, you've let yourself down."
I guess I'm suspicious of the linked article. Call me full of hot air, but is it actually a safety pin? Or is it just designed to look really good on an application for an inflatable college?
It's part of the job of education to instill some common culture. (Which common culture varies, but not all that much outside political topics.) For students, questions about that culture are new issues. LLMs have digested a huge amount of existing material on it. LLMs are thus really good at things students are graded upon.
This gives students the impression that LLMs are very smart. Which probably says more about educational practice than LLMs. The big problem is not cheating. It's that the areas schools cover are ones where LLMs are really good.
There's no easy fix for this.
Rather than framing this as destroying education, it should be interpreted as proof that these tasks were always shallow. AI is still much worse than humans at important things; why not focus on those things instead?
The school systems are clearly not keeping up. Any kid who isn't doing project-oriented creative work, aided by an LLM as needed, is not preparing for the world they will likely inherit.
If you're going to say "but in a working environment you use a computer", then teach them how to use text processing and spreadsheets in the computer room, a thing that doesn't happen in most schools today, btw.
LLMs can be amazing [^0] as an assistive technology, but using them as a "do it for me" button is just way too easy, so that's how they are de facto used.
I believe it will take about 5-10 years for us to fully comprehend how damaging unplanned remote classrooms and unchecked LLM use in the classroom was. Like heroin, it will be extremely hard to undo our dependence on them by that point. I'm pretty scared for how our students will fare on the global scale in the coming years.
[^0] I strongly believe that 60% of the value of LLMs can be realized by learning how to use a search engine properly. Probably more. Nonetheless, I've fully embraced my accidentally-acquired curmudgeon identity and know that I'm in the minority about this.
[^1] You won't believe how many people leave their laptops unlocked and their screen's contents visible for everyone to see. Committing identity theft has to be easier than ever these days. This basic infosec principle seems to be something we've lost since the great WFH migration.
https://www.nbcnews.com/tech/tech-news/new-york-city-public-...
Then reversed the ban
https://www.nbcnews.com/tech/chatgpt-ban-dropped-new-york-ci...
A quite possible future: you're surrounded by dead-eyed humans with AI implants who mindlessly repeat whatever the chatbot tells them.
It's probably either that or ban it and do everything in-person, which might have to be the stopgap solution.
This is what this article reminded me of. The student writes how her classmates use help from AI as if she cannot decide for herself to do the work on her own if she cares about learning. She writes as if she is devoid of agency.
The Atlantic published a post on reddit about this article, titled "I’m a High Schooler. AI Is Demolishing My Education." [1] And yet, it is the other students that the author primarily focuses on. Why does other students' cheating demolish _her_ education?
[0] - https://gutenberg.org/cache/epub/1480/pg1480.txt
[1] - https://www.reddit.com/r/ArtificialInteligence/comments/1n7o...
My point is that education has to be aligned with the actual world outside.
Everyone uses AI now, for all sorts of tasks. And if they don't now, they will in the next few years. Trying to exclude AI from education is not only pointless, it's doing the kids a disservice: AI is going to be a large part of their future, so it needs to be a large part of their education.
If we follow the implied course of TFA we'd reduce AI use in schools and go back to old-skool teaching methods. Then that cohort of kids would get their first job and on day one they'd be handed an AI and told "this is the job, get on with it". Like with my ex-gf, everything they were taught would be useless because the basic foundation is different.
I know education is not entirely vocational, but if it moves too far from the world of work that everyone actually spends most of their time in, then it gets too theoretical and academic. AI is part of it, education needs to change.
I don't think this is necessarily wrong, but over the years I have seen many high-achieving senior students writing about, or being interviewed about, topics where they represent less the community they are a member of than the opinions that support those who give them praise, support, and opportunities.
I don't think it should reflect poorly on a student that does that, but I also don't think you can draw significant conclusions from their stated opinions. Most people like this have not yet found their own voice; what you hear is often the voice that they think they are supposed to have. For many, tertiary education is as much about finding that voice as it is about studying specific fields.
Measuring what is best for students is an incredibly complex task, not least because 'best' can mean different things to different people, and often the wellbeing of the student is not considered high enough. There is science here, but given the importance of the field, way less than there should be. Changing education for the better is extremely difficult when the science conflicts with public opinion. There are forces at play that know that their only path to success is through swaying public opinion because the science is against them. The science of education can be laborious, slow, and full of difficult to express nuance. It is also the only sure process by which we can find out what actually works.
So by all means follow the argument that it makes, but don't mistake the source as being representative. The author expresses their love for debating and development. I imagine that they would respect the sentiment that the work should stand on what was said, rather than who said it.
[as a final thought]
It would actually be an interesting research project to find articles like this written on contentious issues over the years and locating the writers to get their opinions on them with the benefit of hindsight.
Of course, they could still use AI to help them with homework, but people were already copying the homework from their mates. And if they just copy and don't learn, that would be surfaced during the exams.
Similarly for the debate club - why are teams allowed to have any technology in the hall in the first place?
Education is supposed to be difficult - that's how we learn!! Teachers seem to pander more and more to students who complain that "This is too difficult". As if easy learning was ever a thing!
But this has already been the case. We have all been chasing numbers for so long. Nobody gives a damn about actually learning.
I started learning after I got my first job. Started focusing on literature, arts, and languages a lot more after I started working. AI only amplifies this to the next level.
There are certain aspects, like discipline and punctuality, which I can agree with. But the education system has not been about education for a long time. Sure, premium institutions had something going on. But maybe that is what will be taken over by AI as well?
In the workplace, we're using AI anyway.
I'm not sure this direction is suitable for kids; we still learn to do calculations even though we have calculators (which is needed in some cases, but for complex math we opt for tools).
I hope that how we educate changes, forced by AI, improving in ways that would have helped people like me. I worry that might mean lessened access for all, if it requires the cost to go up.
Phones shouldn't be in the classroom, and devices used in the classroom shouldn't have any access to AI.
Students shouldn't really have homework anyway so I think it's completely reasonable to just have kids doing work on pen and paper in the class for the most part.
This reminds me of type 1 vs type 2 fun. Type 1 fun is fun in the moment; drinks with friends. Type 2 isn’t fun in the moment but is fun in retrospect. Generally people choose type 1 if given a choice but type 2 I find is the most rewarding. It’s what you’ll talk about with your friends at the bar. I know it’s very much old man, well I guess this high schooler is too, yelling at clouds but I do worry what the elimination of challenge does to our ability to learn and form relationships. I’d expect there to be a sweet spot. Obviously too much challenge and people shut down.
AI will, like previous technologies, enable some of us to become more productive. In fact, it raises the bar on productivity, since an experienced programmer can now create much more code. (An inexperienced one can create much more mess, so you might not see it in aggregate statistics).
When it comes to the classroom, we should do the same. We raise the bar so that, in fact, you cannot do anything without using AI. Much as you would run out of time if you didn't have a spreadsheet in a stats course 20 years ago, or pandas 10 years ago. The new tech enables more work to get done in the form of learning more high-level things, while relegating lower-level things to just building blocks that can be understood in the same way we understand reference texts, i.e., "I've seen the principle once, and I can find it again if I get to that level of abstraction".
Teaching needs to change. Perhaps the thing to do is have an Oxford tutorial rather than traditional class. For those who didn't attend, a tutorial is basically two students and a professor in a room, talking. You can't hide. You can prepare however you like, and you should spend quite a lot of hours if you're sparring with a politics or math professor. But once you're in the room, it becomes painfully obvious if you are unprepared. This is a way to get accountability.
At the moment, we have a high school testing system that is a factory. Every test is designed to be easily marked: multiple choice, short answer, or short essay. It encourages superficial learning when you know you can dance around the important topics and just pick up the easy points, as well as simply avoiding silly errors. You can also win by simply learning the likely questions and aping the answers.
Have a weekly small-group session with an expert, and they can find your limits. Yes, it will cost money.
The only thing I'll say that's good is it might lead to less homework, which I always thought was poorly designed and mostly busywork.
We have great traction with universities in USA and Australia. The flywheel that we've constructed means that students are being prepared for industry + research in a Post-AI world, and professors can see exactly how students are using AI tooling. Our findings are that knowledge of how students are using AI goes a long way to helping institutions adapt.
Keen to chat and share our findings - reach out at hamish(at)kurnell.ai !
the montessori and sudbury school model always seemed closest to what was necessary, although now I wonder if even those are cracking at the seams with outsourced thinking
regardless, I think a re-evaluation of the point is absolutely necessary.
self-motivated children are rare and require a specific environment and support system to thrive, but will always be there to escape the more obvious return to serfs working on fiefs, unless born into capital themselves
That was one of my frustrations with "prep" school: An artificial sense of urgency that does not, in any way, reflect how one leads a happy, healthy, and successful life; nor does one need a sense of urgency in academics to grow into an adult who makes a positive contribution to society.
> Some students may use these tools to develop their understanding or explore topics more deeply, ... can also be used as a study aid
I think the same can be said about internet searches. Altavista came around when I was in high school; and I lost all motivation to memorize arcane facts. The same can also be said about books and libraries.
Instead, it's important to realize that a lot of topics taught in schools have to do with someone's agenda and opinion about what's important to know, and even political agendas; and then accept that many lessons from school are forgotten.
> Student assessments should be focused on tasks that are not easily delegated to technology: oral exams ... or personalized writing assignments ... Portfolio-based or presentational grading
Those are all time-consuming, but they miss a bigger point: what's the real point of grades anyway?
Perhaps it's time to focus on quality instead of quantity in education?
It's pretty fucking dire. I think we're failing an entire generation of kids, and the ramifications of this are going to be real bad in 5-10 years. I've heard similar stories from friends of mine who are teachers.
Teachers can also use them to mark homework.
They are a boon as much as they are a bane.
It's awful, but I think we'll see it happen, sadly.
I've seen the same commentary about:
Spellcheck
Typed material
Computer art programs
Calculators
Is this what The Atlantic has come down to, publishing a complain-y piece by the class president?
EDIT: For anyone struggling with my criticism of the article: I very much agree that there is a problem with AI in education. Her suggestion, which is "maybe more oral exams and fewer essays?", I'm sure has never been considered by teachers around the world *rolls eyes*.
As for how to tackle this, I think the only solution is to accept the fact that AI isn't going anywhere and integrate it into the class. Show kids in class how to use AI properly, compare what different AI models say, and compare what they say to what scholars and authors have written, and to what kids in the past have written in their essays.
You don't have to fight AI to instill critical thinking in kids. You can embrace it to teach them its limitations.