I would add, too, that the most recent major change has been the rise of recommendation and sorting algorithms that curate our information ecosystem.
We have moved away from an environment in which most of the information we consumed was curated by humans. Even if we didn't necessarily have access to the rooms where it happened, those processes were documented. They were understandable.
We're now in a situation where that curation is done in a way that is not knowable to the consumer and, in many cases, not knowable even to the people who operate the platforms. These are artificially intelligent, machine-learning algorithms, and they frequently make decisions based on data, or proxy data, that may be inaccurate, that may be discriminatory and that may, in some cases, lead people who have already begun consuming conspiracy- or disinformation-adjacent content down rabbit holes.
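To make that rabbit-hole dynamic concrete, here is a minimal, purely hypothetical sketch, not any platform's actual code: a naive recommender that treats past engagement as a proxy for interest. Because showing an item increases future engagement with that item, the proxy signal feeds back on itself and a slight initial tilt compounds over time.

```python
# Hypothetical illustration of an engagement-maximizing feedback loop.
# Topic names and numbers are invented for demonstration only.

from collections import Counter

def recommend(engagement: Counter) -> str:
    """Pick the topic with the highest observed engagement.
    Engagement is only a proxy for interest; the proxy itself
    may be noisy, inaccurate or discriminatory."""
    return engagement.most_common(1)[0][0]

def simulate(rounds: int = 10) -> None:
    # Start with a mild tilt toward conspiracy-adjacent content.
    engagement = Counter({"news": 5, "sports": 5, "conspiracy": 6})
    for i in range(rounds):
        pick = recommend(engagement)
        # Showing a topic increases future engagement with it,
        # so the initial tilt is amplified round after round.
        engagement[pick] += 1
        print(f"round {i + 1}: recommended {pick}, counts {dict(engagement)}")

if __name__ == "__main__":
    simulate()
```

Run this and the recommender serves the conspiracy-adjacent topic every round, even though the user's initial preferences were nearly uniform; that self-reinforcement, in a vastly simplified form, is the rabbit-hole mechanism described above.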
In an even broader sense, it alienates us from our information ecosystem, because we don't know how these decisions are being made. We know from our own research, and from research done elsewhere, that this is not inevitable: people can take more control over their information diet, and people who self-curate their media diet are more resilient to disinformation.
That's the latest major addition to our digital media literacy approach. It's something we're constantly iterating on as the environment changes and new research emerges.