Once we can get brain-computer interfaces implanted to link our brains with a computer or phone, we probably won't receive that information as pictures appearing in front of our eyes.
The information would probably be delivered raw into our minds. We'd just know what time it is, what the weather will be, who's calling us and what the news is saying, without having to consciously process the information.
Maybe we aphantasics will get a version with mental voice narration instead of mental images, much like blind people have today for visual media and information.
Maybe, but if the interface is designed for a brain that works “normally”, the way a thought “works” (or is served up) might be incompatible with ours.
For example, consider the prevalence of the phrase “close your eyes and picture X”, particularly in meditation and other relaxation techniques. I am incapable of doing this exercise (and for most of my life I was very confused by the phrase). Most people, from what I’ve been able to gather, can't even imagine a mind without that inner picture, so why would the designers make the interface accommodate a deficit they don't know exists?