In its 14 years of existence, Facebook has done as much to sow discontent and shred privacy as it has to connect people.

Those things are not unrelated, as the company’s recent series of crises has shown. At the heart of it all is Facebook’s power to persuade, whether it wants you to buy a pair of shoes, back a political candidate or spread misinformation.

The company has failed to reckon with that responsibility, choosing business as usual over tough reforms that may mean lower profits. Facebook needs a watchdog.

Counting 68 percent of Americans as users, Facebook is an almost indispensable platform for community interaction.

But along with their friends’ photos and local events, those users are also exposed to propaganda and hate.

And with each click, the company gobbles up more private information to add to its unprecedented trove of valuable data.

From Russian trolls to data leaks, we’ve seen how disruptive and destructive the bad side of Facebook can be.

The company, however, has shown little interest in dealing with it. That’s a bad combination.

Facebook owns a tremendous amount of information, and if it were using that information solely to sell jeans and airline tickets, that might not be so bad. Other forces, however, have realized Facebook’s power and have wielded it freely.

Authoritarian governments have used the site to spread propaganda and target minority groups. Hate organizations have used it for recruitment. And most famously, accounts connected to the Russian government used it to spread misinformation, divide Americans and influence the 2016 presidential election.

Facebook’s response has been, to say the least, underwhelming – and dishonest.

We found out in March that Facebook allowed a voter-profiling firm to acquire information on millions of Facebook accounts, then took no steps to alert users. The company at first denied the report.

And earlier this month, The New York Times reported that Facebook engineers had discovered suspicious activity linked to Russians as early as spring 2016.

Soon, the company recognized how far-reaching and organized the Russian operation was – it was a “five-alarm fire,” one employee said.

But all along, rather than face up to the problem, Facebook has closed ranks. CEO and founder Mark Zuckerberg publicly downplayed the influence of Russian accounts, in direct contrast to what the company knew.

Facebook hired a lobbying firm to push negative stories about its critics, slinging exactly the kind of “fake news” it said it was trying to eradicate from the site.

When confronted with the story, Facebook again initially denied it.

But on the night before Thanksgiving, with few people paying attention, the company released a statement confirming the Times report.

Facebook has tried to paint the errors as growing pains, but they seem more the result of how Facebook does business. Sure, Facebook wants to “bring the world closer together,” but it also wants to sell us stuff – and if there’s a conflict between the two goals, the latter always wins out.

Unfortunately, short of a mass exodus of users, there is no way to hold the company accountable.

That needs to change.

There must be oversight of Facebook and other tech companies that have positioned themselves to play such an outsized role in our lives, although lawmakers must be careful not to stifle innovation or speech.

That’s a challenge, and one that newly empowered House Democrats should take up in Congress.