Imagine walking into a local hardware store for its grand opening. As the door closes behind you, you look around and realize that almost all of the products in the store seem uncannily familiar. Those that aren’t pique your interest. As the store manager greets you, it’s as if he knew you were going shopping that day, and he directs you to the products and services you didn’t know you needed or wanted. In fact, you’re surprised to learn that he knew your dishwasher needed to be repaired; you didn’t even know it was broken.
You come to find out that the store was designed for you – the only customer – and its entire inventory was assembled based on statistically driven predictions about the stores you’ve visited, the items you’ve purchased, the magazines you’ve subscribed to, and the conversations you’ve had with friends and family. But you didn’t realize anyone was even paying attention.
While the hardware store example seems relatively innocuous, the reality is that something like this is happening in the vast majority of Americans’ online experiences, from simply trying to catch up on the news to socializing with friends.
As technology has evolved and made life easier for the American people, so too have consumers’ concerns grown over privacy and how their data is being collected, shared, and used by third parties. That’s where my hardware store analogy might start to make a little more sense. As you use your smartphone, smart speaker, or computer to click, share, surf, download, purchase, post, or watch online, internet platforms are learning about you, and as they do, they use artificial intelligence to gorge on your data and create an engineered, just-for-you experience based on statistical predictions about what these platforms think you’ll want to see or do next. When we’re on our smartphones or on the internet, we’re constantly being measured and fed engineered content designed to keep us engaged on the platform.
As internet platforms deliver this customized online experience, users can become ensnared in what one technology industry observer has dubbed a “filter bubble” – the online version of the hardware store that was constructed just for you. These filter bubbles can be created on social media platforms like Facebook and Instagram, search engines like Google and Yahoo, or even entertainment platforms like Netflix and Hulu.
Now don’t get me wrong, a personalized experience on these platforms isn’t necessarily a bad thing. For example, I like when Netflix suggests which program I might like to watch next. If I haven’t checked Twitter in a while, it can be helpful to see the top tweets I’ve missed, which are curated based on the content with which I’m most likely to engage.
Other times, though, I might want to opt out of that filter bubble and experience the platform without viewing content that has been selected based on my personal data and behavior. Think of this opt-out experience as a competing hardware store that’s been built for the entire community, where a wide range of products and services exist, not just the ones that were strategically selected and placed in front of you.
For example, Twitter currently allows its users to toggle between a customized newsfeed – based on what it thinks you’ll like to see – and another newsfeed that’s a chronological view of content as it’s being posted. Here’s another example: in a filter bubble experience, if you and I each searched the same term – “shopping,” for example – we’d likely be delivered vastly different results. In a filter bubble-free experience, we should be delivered the same results, no matter who we are.
Like a customer who might enter one of my imaginary hardware stores, I think consumers should have the option to choose between a personalized, filtered view of content and a view that is filter-free. Or, at the very least, consumers should know which “store” or experience they’re about to enter. That’s why I recently introduced the bipartisan Filter Bubble Transparency Act in the Senate, which would give consumers more control over their digital experience and provide them with more transparency about what they’re seeing online.
My bill is pretty simple and straightforward. It would require large-scale internet platforms to let consumers know when they’re being given a personalized user experience designed by artificial intelligence (filter bubble transparency), and it would give users the option to escape the filter bubble and consume information that has not been engineered specifically for the user to see (consumer control). That’s it.
I strongly support a light-touch approach to internet regulation that allows the free market to flourish. The internet would not have grown the way it has if it had been weighed down with heavy-handed government regulations. But in order for free markets to work effectively, consumers need as much information as possible – including a better understanding of how internet platforms use artificial intelligence and complex filters to shape the information users receive, and that’s exactly what my bill aims to achieve.