It's true that we don't know that much.
In fact, I'll bring this out for you. Facebook did a study called the emotional contagion study. It altered its algorithm so that some users saw more positive news and other users saw more negative news, to see what impact that had on individuals' behaviour. The ones who got more positive news were more positive in their posts, and the ones who got more negative news posted more negatively.
It was only because the study was published that it created such an outcry. You realize, then, the power these companies might have to affect public discourse.
This lack of transparency will only increase as we migrate from a phone world to the world of the digital personal assistant, a world that perhaps one or two of these data-opolies could very much control: Google with its Home, and Amazon with its Alexa.
Now you're going to have orders of magnitude more data and greater interaction with the digital assistant, in the home, in the car, on the phone and elsewhere. There's going to be very little transparency in how that digital assistant recommends the products and services it provides: what it features, what it says, what it does and the like.
We're really moving into an unexplored terrain.