I keep hearing this at work but so far no one has explained what “learning ai” actually means. It seems to just be nonsense like those people selling prompt recipes or claiming to be prompt engineers.

No one needs training in prompting AI. I could understand if they meant a deeper layer of integrating the tech with systems, but all they ever mean is typing things into a text box.

reply
I suspect that, in practice, what many enthusiastic advocates mean by “learning AI” is actually “learning to need AI”.

In other words, the aim is to get kids used to using AI as soon as possible, so that they do not learn the skills to function without depending on it.

reply
If you’re smart AI saves you time getting to something you could probably achieve anyway. If you’re… not smart… then it will be a necessary crutch for you to get through life.

I can see the angle for making sure kids start using it before they develop the skills to become independent of it.

reply
You absolutely need prompting skills to use AI usefully. You need to know how to eliminate sycophancy, how to ask for and check primary sources, and how to use follow-up questions.

I've been using AI for some legal issues, and it's been incredibly good at searching for case law and summarising the key implications of various statutes - much more efficient than web search, with direct links to the primary sources it finds.

I'm still the one gaming out "What if...?" and "Does that mean..?" scenarios and making sure the answers are grounded in the relevant statutes, and aren't mistakes or hallucinations.

It's not so much a prompting problem as a critical thinking and verbal reasoning problem.

reply
Learning those prompting skills was very useful for you, but in the context of schools it's a lot more difficult to make the investment worth it.

Schools are slow; by the time the teachers get around to teaching the sophisticated techniques you use today, those techniques will be obsolete, and the new AI models will require a completely different style of prompts.

As for critical thinking and reasoning, those are even harder to teach. How can teachers teach what they don't know?

reply
> It's not so much a prompting problem as a critical thinking and verbal reasoning problem.

And that means you have to learn without AI to understand when the AI is wrong. This is just like how it's dangerous to use a calculator without knowing math, since you won't spot when you've entered something wrong.

reply
As someone who sells AI... You'd be shocked at how bad people are at using AI.

My 6 year old kid who watches me is a better prompter.

reply
Especially since kids these days aren't even very good at using computers:

http://www.coding2learn.org/blog/2013/07/29/kids-cant-use-co...

It seems to me that if someone can read and think critically, they can RTFM and get much better much quicker at computers and AI than people who spent all their time tapping an iPad to watch the next video.

reply
I'd think really the only AI skill you need is the ability to think independently and be able to verify the results you are getting or spot when something is wrong in the response.

It would take a few sessions at most to take someone from 10 years ago and get them fully up to speed with AI tools, since the tools have essentially zero learning curve.

reply
I think exercises where a student is given pre-generated AI output and told to identify as many issues or mistakes as possible might be sensible. I'm not sure how long creating such an exercise would take, or what tools or sources should be used to verify the output, but it might be a helpful exercise.
reply
Similar to Google and Wikipedia lessons back in my day.
reply
You also need to understand the limits of AI and that it has limits that a human that gives you usually correct and authoritative answers does not have.

I think it comes easily to the sort of people who comment here. Most people have a very vague understanding of computers in general.

reply
Are these supposed to be the "skilled" prompts? This just reads as a basic conversation, not as particularly well-written or well-defined prompts. So far, everything I've seen about prompting "skills" has just come down to being able to articulate yourself and think critically a bit.
reply
I’m not sure anything was clarified. Nothing about that conversation is special or unique?
reply
I don't think they need to learn 'AI workflows' (whatever that means). But I think it makes sense to use LLMs as a resource.

I've used them when studying new languages (human languages not programming languages) and ML algorithms and they've been really useful.

Learning to check the citations it gives you is a useful skill too. I wish many adults were more sceptical about the things they are told.

reply
It's true that you can use LLMs as a learning resource and to unblock you. But students just aren't. They are using them as a way to avoid thinking, avoid research, and just spit out an answer they can paste into their homework.
reply
They should at least require handwritten work, the kids will still be AI-stupid but will at least be able to write.
reply
You remember better when you write, too.
reply
I assume "AI workflows" means knowing how to split up a task to create a chain of agents that can complete a specific task reliably.

A bit like software development.
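As a toy sketch of what that could mean, here's the pattern in Python. Everything here is illustrative: `call_model` is a stand-in stub (a real chain would call an actual model API), and the two "agents" and their prompts are made up for the example.

```python
# Minimal sketch of a "chain of agents": decompose a task into narrow
# sub-tasks and pipe each step's output into the next step's prompt.
# call_model() is a stub; a real version would call an LLM API.

def call_model(prompt: str) -> str:
    # Stub: echo the prompt so the chain's structure stays visible.
    return f"[model answer to: {prompt}]"

def summarize(text: str) -> str:
    # Agent 1: condense the raw input.
    return call_model(f"Summarize the following:\n{text}")

def extract_action_items(summary: str) -> str:
    # Agent 2: works only from agent 1's output, not the raw input.
    return call_model(f"List the action items in this summary:\n{summary}")

def run_chain(document: str) -> str:
    # The "workflow" is just composition: each agent owns one sub-task.
    return extract_action_items(summarize(document))

print(run_chain("Meeting notes: ship the release, fix the login bug."))
```

The point of splitting it up is that each step can be tested and swapped independently, which is exactly the software-development analogy above.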

reply
The problem is that the task you've defined, "split up a task to create a chain of agents", has changed dramatically in just the last six months, never mind the last two years.

You're wasting effort and teaching an obsolete technology if you try to make primary/secondary education too topical. Students can learn how to decompose a task and how to think critically without ever touching a Large Language Model.

reply
AI “workflows” share the same addictive characteristics as web surfing and online media, which can be counterproductive. In this regard, we do need some serious learning at all levels of the workplace. Otherwise we will become addicted to the slot machines.

Addiction is a much harder problem than distraction.

reply
Had a buddy who works at a prestigious university teaching film history tell me their big boss is basically forcing all classes including his ones on film history to incorporate AI education in some way. So silly.
reply
It's not FOMO. The line-level people actually educating the children don't give a crap about the technology; they will generally make the best of whatever resources they have and procure wisely. Like everything else in government, it's an administrative racket, and all the suppliers fan the flames because they make money. Ain't no different than how your local building or environmental inspector finds himself screwing people who are doing nothing wrong and approving absurd stuff, because that's what the rules say he must do, rules that big business ghost-wrote and paid to have the government adopt.

Kids are using crappy subscription education services for homework and doing all their reading on screens (and educators are toiling away to work with these systems) because the people who make money off the services and screens paid to have the incentives distorted such that buying their products is the least shitty option.

reply
> I don't see the advantage of learning 'AI workflows'.

This would be just the modern version of the "computer class" back in the day, when we learned to use Word, Excel, etc. Just another tool among others that is helpful to learn, but it should be limited to that specific class.

Though the actually sad thing I've learned from friends with kids is that the modern "computer class" does not actually teach kids to use computers much these days.

reply
Yeah I'd be happier if they learned how an Apollo computer worked (even though it has virtually no relevance) than how to use Excel.
reply
This reminds me of Harvey Cragon's intro to computer architecture textbook...

When it introduces Harvard vs. von Neumann architectures, it doesn't invent some dumb RISC computer to illustrate the difference... No... it makes you learn the actual von Neumann machine! Also Konrad Zuse's Z machines.

Cragon's argument is that students will not learn the concept of engineering trade-offs, if presented with a clean "textbook" architecture.

I hated MIX for various reasons, it's sort of in-between simple and kludgy.

[0] Cragon was a professor at the University of Texas at Austin ca. 1980, and also the architect of TI's ASC in the 1960s.

reply
>I don't see the advantage of learning 'AI workflows'.

Eventually everything that can be learned from a book will be done much better by machines, so for humans to have any chance of being employable they'll need to develop the soft skill of working with intelligent machines.

reply
Just as "there is no royal road to mathematics", no AI can do your learning for you. The need for memorization of essential math identities (like multiplication tables and use of fractions) or rules of grammar (like verb conjugation or use of anaphora) will never be enhanced by AI. There is an essential role for good old fashioned rote learning that can't be avoided. To pretend AI will not impede that learning is a fool's errand, literally.
reply
I do not see the point of either of your examples of rote learning. What do you lose if you do not know them? You will pick up enough of the multiplication tables through doing maths, and native speakers of a language will conjugate correctly without memorising (you do need to memorise if learning a foreign language). Anaphora is a technique that cannot really be rote-learned, and most people who try to use it do so badly and just sound repetitive.
reply
> You will pick up enough of multiplication tables through doing maths

You will not do maths casually until you have memorized enough multiplication to make it not torture. You will not pick up multiplication from using a calculator any more than you will pick up programming from using a computer.

> native speakers of a language will conjugate correctly without memorising

They do not. They have memorized, through massive, constant, and forced practice, and now they conjugate correctly. The alternative of consulting a computer every time they need to speak is not a realistic one.

reply
If AI is still too stupid to show people how to work with it, to notice what they lack, and to anticipate their needs, it can't have become that indispensably useful.

The entire point of AI is to accommodate the user. AI doesn't do anything that people can't do, and it is worse at most of those things, but it is a lot faster at some of them (basically, looking things up). The point of AI is a natural-language UI.

Teaching people how to use AI is just teaching people enough about the world to give them something to ask AI for.

reply
Luddite move.

Buddy, AI is here to stay. You remind me of my 2nd grade teacher who said 'you won't always have a calculator in your pocket'.

reply
And s/he was right. Most students who were brought up with calculators in math class cannot do basic math without one today. When shopping in groceries, they have no idea if one product costs more than another by weight. They're easy to bamboozle with the simplest misrepresentations of numbers. Is one choice of product really better than another, fractionally, or corrected for a shifted baseline? They don't know and can't use basic algebra to find out.

This is bad -- an F grade for the education system that let them slide by without learning an essential skill. The Chinese aren't this lazy. And if we persist in not learning this, America's future will regress to us asking them, "Do you want fries with that?"

reply
That is poor teaching. My kids were almost always allowed calculators (always after the age of 8 or 9) and they can do all that and a lot more (my older daughter is an electronics engineer, in R & D).

For one thing you do not need to do much arithmetic to do algebra, for another estimating and getting a feel for numbers is not the same skill as learning a bunch of arithmetic techniques. No one is going to do long division while shopping.

reply
Um... there are always exceptions.

I can keep enough digits in my working memory to do long division in the grocery aisle.

I also compulsively factor the numbers on license plates.

reply
AI is important, but we don't know what skills will be relevant in 10+ years to harness it (I can't imagine prompt engineering will stay much the same). Anyway, would a typical teacher be ahead of the curve on what pedagogical tack to take here, even if it were appropriate?

The best thing to do is to set the kids up to learn the most important thing - which is how to teach oneself. If a kid can read about something, and then understand what was important from the reading, and then write about it, and then know where to go next they will be well served in the AI world.

reply
AI is here to stay. But learning to copy-paste homework into a chatbot is not really a skill one needs to learn.
reply