Jensen Huang says kids shouldn't learn to code — they should leave it up to AI.

At the recent World Government Summit in Dubai, Nvidia CEO Jensen Huang made a counterintuitive break with tech-leader wisdom by saying that programming is no longer a vital skill due to the AI revolution. But this isn't the first time a tech exec has predicted the death of coding.
My take is that he might be right: by the time today's kids become adults we may have AGI, and we'll either be enslaved or have much less work to do (for better or worse).
But AI as it is now relies on input from humans. When left to take its own output as input, it goes full Alabama (sorry, Alabamites) with its output pretty quickly. Currently it works as a tool in tandem with a human who knows what they're doing. If we don't make a leap beyond this current iteration of AI, then he'll be very, very wrong.
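For what it's worth, you can see a toy version of this "feeding on its own output" degradation without any LLM at all. The sketch below is entirely illustrative (nothing here is a real training setup): the "model" is just a Gaussian fitted by sample mean and standard deviation, retrained each generation on samples it generated itself. With small samples, the estimates drift and the spread tends to decay.

```python
# Toy illustration of a model degrading when trained on its own output:
# fit a Gaussian to data, sample from the fit, refit on those samples, repeat.
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: the real "human" data.
data = rng.normal(loc=0.0, scale=1.0, size=50)

for gen in range(10):
    mu, sigma = data.mean(), data.std()              # "train" on the current data
    print(f"gen {gen:2d}: mean={mu:+.3f}  std={sigma:.3f}")
    data = rng.normal(loc=mu, scale=sigma, size=50)  # next generation trains on its own output
```

Each generation compounds the estimation error of the previous one, which is the basic mechanism behind what's sometimes called "model collapse".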
Okay, but what I'm saying is that AGI isn't the logical progression of anything we currently have, so there's no reason to assume it will be here in one generation.
I'd tend to agree. I said we may have that, in which case he might have a point. But if we don't, he'll be wrong, because current LLMs aren't (I think, at least) going to get past their limitations, and they can't create anything close to original content if left to feed on their own output.
I don't think it's easy to say what the situation will be in 15-20 years. The current generation of AI is moving ridiculously fast. Can we sidestep to AGI? I don't know the answer; people doing more work in this area probably have a better idea. I just know that on this subject it's best not to write anything off.
> The current generation of AI is moving ridiculously fast.
You're missing my point. My point is that the current "AI" has nothing to do with AGI. It's an extension of mathematical and computer-science theory that has existed for decades. There is no logical link between today's machine-learning models and true AGI; to call it AI at all is actually quite misleading.
Why would we plan for something if we have no idea what the time horizon is? It's like saying "we may have a Mars colony in the next generation, so we don't need to teach kids geography."
Well, I think this is the point being made quite a bit in this thread. It's general business-level hyperbole, really, just to get a headline and attention (and it seems to have worked). No one really knows at what point all of our jobs will be taken over.
My point is that, in general, the current AI models and diffusion techniques are moving forward at quite a rate. But I did specify that AGI would be a sidestep off the current rails. I think there's now weight (and money) behind AI, and that pushes AGI research forward too: things moving faster in one lane can pull investment into other lanes and areas of research. AI is the buzzword every company wants a piece of.
I'm not as confident as Mr Nvidia is, but with this kind of money behind it, AGI does have a shot at happening.
In terms of advice about training for software development, though: what I think is certain is that the current LLMs and offshoots of these techniques will keep developing, that better frameworks for businesses to train them on their own material will become commonplace, and that one of the biggest consultancy growth areas will be producing private models for software (and other) companies (a rough sketch of what that can look like in practice is below).
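As a concrete (and heavily simplified) example of the "private model on your own material" idea: today this is often done with retrieval over internal documents rather than full retraining. The sketch below is purely illustrative; the documents and the `retrieve` helper are hypothetical, and TF-IDF from scikit-learn stands in for a real embedding model. In practice the retrieved text would be handed to an LLM as context.

```python
# Minimal sketch of retrieval over private company documents.
# TF-IDF + cosine similarity stand in for a real embedding model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical internal documents.
docs = [
    "Deploys to production require two approvals and a green CI run.",
    "The billing service owns all invoice generation and retries.",
    "On-call rotations are scheduled quarterly in the ops calendar.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(docs)

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = vectorizer.transform([query])
    scores = cosine_similarity(q, doc_matrix)[0]
    top = scores.argsort()[::-1][:k]
    return [docs[i] for i in top]

print(retrieve("Who owns invoice generation?"))
# -> ["The billing service owns all invoice generation and retries."]
```

The design point is that the company's material never has to leave its own infrastructure; only the retrieval layer (and, in a fuller setup, a privately hosted model) touches it.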
The net effect is that companies will just want fewer (better) engineers who use the AI to produce more. So even without AGI, I think demand for software developers and engineers is going to be lower. Is it as favourable an industry to train for now as it was for previous generations? Quite possibly not.