Why ChatGPT isn’t conscious – but future AI systems might be
We still have no real idea how consciousness develops in humans, so how can we even begin to create it?
We don't have to. We create an artificial approximation.
We don't need to mimic our brains at all. We just need a system that responds with the correct outputs to the inputs we give it.
Artificial intelligence if you will.
Humans aren't all that.
How do you approximate something we do not understand?
How do we know when we have created it, when we do not understand what it is?
If we aren't all that, won't anything we create be less than all that as well?
We could engineer artificial flight without having a precise understanding of natural flight.
I think we don't need to understand how consciousness develops (unless you want to recreate exactly that developmental process). But we do need to be able to define what it is, so that we know when to check the "done" box. Wait, no. This, too, can be an iterative process.
So we need some idea of what it is and what it isn't. We tinker around. We check whether the result resembles what we intended. We refine our goals and processes, and try again. This will probably lead to a co-evolution of understanding and results. A profound understanding isn't necessary (albeit very helpful) to get good results.
Also, maybe, there can be different kinds of consciousness. Maybe ours is just one of many possible. So clinging to our version might not be very helpful in the long run. Just like we don't try to recreate our eye when making cameras.
Cool.
What is it?
Entropy, my friend.
How does that relate?