Facebook: "announces mind reading headset to animate imaginary body parts"
People: "Nice try, CIA!", "That's big gender propaganda!", "I'm not going to connect my brain to the internet!", "Not guilty, your honor. Facebook made me do it via the headset."
... oh, that's interesting. Creation of a phantom "limb" with a brain-computer interface? I wonder how much control there is. Does it just wiggle? Is it purely binary up/down? Can they control the angle?
I actually have a set of LED eyes that I control with puppetry. Last I looked at BCIs, they were woefully incapable of what I wanted, but maybe I should look at this again...
Edit: The Twitter link shows a video which completely invalidates my previous comment. The ears do seem to be fluidly controllable.
Previous comment:
I would assume it's just two states (ears up and ears down) that get switched between. Most VRChat avatars I have seen do exactly this, but through pressing a button rather than mind control.
Even something as simple as this adds a lot of immersion! There are probably specific faces to go with the ears as well.
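For anyone curious, a button-driven two-state toggle like that is about the simplest possible avatar control. A minimal sketch (the class and animation names here are made up for illustration, not VRChat's actual parameter API):

```python
# Hypothetical two-state ear toggle, the kind most VRChat avatars use.
# A BCI would just fire on_button_press() instead of a physical button.
class EarController:
    def __init__(self):
        self.up = True  # start with ears up

    def on_button_press(self):
        """Flip between the two poses and return the new animation name."""
        self.up = not self.up
        return self.current_animation()

    def current_animation(self):
        return "ears_up" if self.up else "ears_down"
```

The point is there's no continuous angle anywhere: one boolean, two poses, which is exactly what the video seems to go beyond.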
I'm hella curious how EEG stuff works now, as opposed to the cheap piece of crap I had in the late 90's/early 2000's that worked on the same principle back when the tech was in its infancy. The thing I had let you set 3 inputs, and even just recording the right "thoughts" to trigger them was a PITA, let alone getting it to work while actually using it in a game or something.
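For context on what those old trigger-style devices were roughly doing: consumer EEG gear of that era typically thresholded band power (e.g. alpha vs. beta activity) to fire a small number of discrete inputs. A rough sketch of that idea, with an assumed sample rate and invented thresholds (real devices would calibrate these per user):

```python
import numpy as np

FS = 256  # sample rate in Hz (assumed for illustration)

def band_power(window, fs, lo, hi):
    """Mean spectral power of a 1-D EEG sample window in the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def classify(window, ratio_hi=2.0, ratio_lo=0.5):
    """Map one window to one of three discrete inputs via the alpha/beta ratio.

    Thresholds here are made up; the "input_N" names are placeholders for
    whatever actions the user recorded.
    """
    alpha = band_power(window, FS, 8, 12)   # band associated with relaxation
    beta = band_power(window, FS, 13, 30)   # band associated with concentration
    ratio = alpha / (beta + 1e-12)
    if ratio > ratio_hi:
        return "input_1"  # relaxed state
    if ratio < ratio_lo:
        return "input_2"  # concentrating state
    return "input_3"      # neutral
```

A scheme like this only ever yields a few coarse states, which is why fluid, proportional control of something like ears would be a real step up.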