While the implications of how social media can affect public opinion have yet to be studied in great detail, startling revelations from the 2016 U.S. presidential election are bringing to light some of the very fears privacy advocates have been warning about for decades.

Recent reports detail how Facebook uncovered more than $100,000 worth of propaganda ads bought by various Russian troll farms. These ads are said to have been viewed by at least 10 million people, most of whom reside in a handful of swing states.

Facebook’s freakishly feverish reach

With more than 2 billion active users, Facebook is unquestionably the world’s largest and most comprehensive social media platform. More than that, it’s quite possibly the world’s most dangerous propaganda machine.

Facebook was created in a college dorm room in 2004, and its premise has always been simple: get users to willingly hand over their personal information in exchange for inside access to other people’s lives.

With Facebook’s robust ad platform, advertisers have almost unlimited options when it comes to targeting a potential audience (including, as investigative reporters have shown, the ability to target audiences with racially charged categories). It’s never been easier to influence others online, and as the events leading up to and immediately following the 2016 U.S. election prove, anyone with the know-how and enough money can use these highly targeted ads to effectively change the world.
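To make that concrete, here’s a rough sketch in Python of what such a hyper-targeted ad buy looks like conceptually. The field names and numbers are invented for illustration, not Facebook’s actual Marketing API, but the real platform exposes equivalent knobs for geography, demographics, and interests.

# Hypothetical ad-buy spec; field names are illustrative, not
# Facebook's real Marketing API, but the same knobs exist there.
swing_state_ad = {
    "creative": "divisive_issue_ad_v3.png",  # the image or video shown
    "budget_usd": 1500,                      # small budgets still reach thousands
    "targeting": {
        "geo": {
            "country": "US",
            # narrow the audience to a few swing-state cities
            "cities": ["Milwaukee, WI", "Detroit, MI", "Philadelphia, PA"],
        },
        "age_range": (35, 65),
        # interest categories can be stacked almost without limit
        "interests": ["border security", "gun rights"],
    },
}

# Real platforms show buyers a live audience estimate for any
# targeting combination before money is spent (mocked here).
def estimated_reach(spec):
    return 120_000  # placeholder figure, purely illustrative

print(f"Estimated audience: {estimated_reach(swing_state_ad):,}")

The point isn’t the syntax; it’s that anyone with a credit card can dial an audience down to a city, an age band, and an ideology in a few lines.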

The devil is in the detail

When initially asked about Facebook’s role in influencing the election back in November, Mark Zuckerberg was quick to dismiss the accusations, saying, “That’s a pretty crazy idea.” Only it wasn’t. Facebook was willingly spreading anonymous ads, and new evidence now suggests the company not only knew about these Russian ads ahead of time but was purposely trying to bury them.

In fact, it’s only due to increasing pressure from Congress that Facebook has decided to put more safeguards in place at all. In an effort to counteract these disastrous ads (or possibly just to save face), Facebook has announced that all political ads must now be more transparent, meaning untraceable ads will no longer be allowed. They’ve also announced the hiring of some 1,000 new employees to vet future ads.

On a similar note, it appears that Facebook has also engaged in some ad buying of their own to help dispel negative sentiment regarding their role in the election.

Don’t fall for it—Facebook has always been aware of their influence

Facebook’s feigned surprise at their role in the election is unsettling: the company has always known what their platform is capable of and how it can be used. Antonio García Martínez, a former Facebook ads employee, says they’ve always been aware of their reach. Speaking to The Guardian last December, García Martínez said:

“It’s crazy that Zuckerberg says there’s no way Facebook can influence the election when there’s a whole sales force in Washington DC that does nothing but convince advertisers that they can. We used to joke that we could sell the whole election to the highest bidder.”

More than that, they’ve been testing their platform’s limits for years. Back in 2012, Facebook conducted a series of tests to see if they could effectively influence their users’ moods. Spoiler alert: They can.

By purposely manipulating users’ news feeds, Facebook was able to influence people’s emotions. And while there have been countless studies observing the effects Facebook has on users, this was the first known study that actually sought to manipulate them.
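As a toy illustration of the mechanism (this is not Facebook’s code; the posts, scores, and drop rate are invented), a feed filter only needs to silently omit a share of posts with a given sentiment to skew what a user sees:

# Toy sketch of sentiment-based feed suppression, the kind of
# manipulation the 2012 emotional-contagion experiment describes.
import random

posts = [
    {"text": "Had a wonderful day at the lake!", "sentiment": 0.8},
    {"text": "So frustrated with everything lately.", "sentiment": -0.6},
    {"text": "New coffee place opened downtown.", "sentiment": 0.2},
]

def filtered_feed(posts, suppress="negative", drop_rate=0.5):
    """Randomly drop a share of posts whose sentiment matches the
    suppressed polarity; the surviving feed skews the other way."""
    feed = []
    for post in posts:
        is_negative = post["sentiment"] < 0
        if is_negative == (suppress == "negative") and random.random() < drop_rate:
            continue  # silently omitted; the user never knows
        feed.append(post)
    return feed

print(filtered_feed(posts))

The user is never told that anything was hidden, which is exactly why the study drew so much criticism when it came to light.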

Pay to play (for more access to user information)

What’s perhaps even more alarming is that companies like Facebook actually send out employees to teach advertisers how to use these insights to better target their ads. This quid pro quo approach between campaigns and tech companies has been happening for years but is only now being viewed as a potential red flag.

It makes sense: if tech lobbyists can successfully latch onto a political campaign, they can work hand in hand with it to raise awareness and build influence. Then, when the campaign is in power, they can use that relationship to help shape (or dismantle) the specific rules governing their organization.

With this in mind, companies like Facebook and Google send their own employees to work with various large- and small-scale campaigns, teaching them how to use the software, how to target ads, and even how to analyze the results. This, in turn, gives the campaign the power to reach a far larger audience than it ever could through traditional grassroots movements.

Donald Trump’s own digital director even said as much when he declared the campaign’s designated Facebook employee to be its MVP.

Not just Facebook

Now, with news surfacing about Twitter’s political involvement, it comes as no surprise that sites like YouTube are also uncovering troves of suspected fake ads of their own.

In a statement to Recode, Google (which owns YouTube) reiterated their commitment to getting to the bottom of the matter:

“We have a set of strict ads policies including limits on political ad targeting and prohibitions on targeting based on race and religion. We are taking a deeper look to investigate attempts to abuse our systems, working with researchers and other companies, and will provide assistance to ongoing inquiries.”

Why you should always use a VPN

While some of the ads may have been impossible to avoid, a large number could have been sidestepped had more people used a VPN. Because the vast majority of these ads were targeted at a small number of U.S. cities, masking your location by taking on an IP address in another area of your choice would have largely kept you from ever seeing them.
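If you want to see the location advertisers see, a quick query to a public IP-geolocation service shows the city your traffic appears to come from. The sketch below uses ipinfo.io’s free JSON endpoint and assumes Python’s requests library; run it with your VPN off and then on, and the reported city should change.

# Check the location your IP address reveals to ad targeters.
import requests

resp = requests.get("https://ipinfo.io/json", timeout=5)
info = resp.json()
print(f"Apparent IP:       {info.get('ip')}")
print(f"Apparent location: {info.get('city')}, {info.get('region')}, {info.get('country')}")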

Targeted ads are everywhere these days, but keeping your VPN connected at all times is a simple way to add an extra layer of privacy to an otherwise all-too-open network.

Execs from Facebook and Twitter are scheduled to testify before Congress on November 1. Google, which is still in the early stages of digging through their vast database, is likely to testify shortly after.