Mark Zuckerberg confronted a series of tough, pointed questions from members of the European Parliament today, as the Facebook chief executive was asked to provide answers about the social media platform’s data collection practices.

The Silicon Valley executive once again apologized for not doing enough to protect user data, an apology he has offered repeatedly to government leaders and users since the Cambridge Analytica scandal broke.

“We didn’t take a broad enough view of our responsibility and that was a mistake and I am sorry for it,” Zuckerberg told EU lawmakers, pledging to keep Facebook’s users safe and to make changes to the platform to prevent the spread of fake news and misinformation.

Zuckerberg had sought to avoid the spectacle of publicly testifying before European lawmakers, after nearly 10 hours of grilling last month by members of Congress. But in the end, he agreed to have the testimony live streamed.

Similar themes arose in the legislative hearings, including questions today about perceived political bias and whether or not Facebook holds a monopoly over social media.

“I really think we have a big problem here,” said Guy Verhofstadt, chair of the Alliance of Liberals and Democrats for Europe, speaking of Facebook’s dominance in the social media sphere.

British MEP Nigel Farage credited Facebook for the Brexit vote in the UK, Donald Trump’s election and the Italian elections, saying such platforms “allowed people to get around the back of mainstream media.” But he said changes to Facebook’s algorithms this January have harmed right-of-center voices, resulting in “a very substantial drop” in views and engagement.

“I’m not generally somebody who calls for legislation on the international stage, but I’m beginning to wonder if we need a social media bill of rights to protect free speech,” Farage said.

Zuckerberg addressed what he called the “high level” questions raised by regulators, offering assurances that Facebook is working to combat harmful content and to avoid a repeat of the interference seen in the 2016 U.S. presidential election, when Russian propagandists used the platform to spread misinformation.

“Let me be clear. The bottom line is hate speech, bullying, terror or violence — all of this has no place on our services,” Zuckerberg said. “In order to execute that, we need to upgrade and do a better job of executing our policies.”

Zuckerberg said Facebook’s artificial intelligence systems flag 99% of content linked to al Qaeda and the Islamic State of Iraq and Syria (ISIS), which is removed before anyone in the community reports it. Facebook also has hired 3,000 people to respond quickly to users who may be at risk of harming themselves.

In an effort to combat “fake news,” Zuckerberg said Facebook has attacked the economic incentives for spammers to traffic in sensational clickbait, and it is combating phony accounts used to spread bad information, removing some 580 million fake accounts in the first quarter of the year.

Facebook also is working with third-party fact-checkers around the world to identify provably false news shared by Facebook users and to append more information so readers can get what Zuckerberg described as a “more well-rounded” understanding of an event or issue.

Zuckerberg sought to assure European lawmakers that Facebook would comply with the General Data Protection Regulation, which gives residents of the European Union greater control over their personal data and takes effect on Friday.

The Facebook co-founder also rejected claims of liberal bias, restating his view of Facebook as a platform where people across the political spectrum can exchange ideas.

“I will commit to you today, we have never and will not make decisions about what content is allowed, and how we do ranking, on the basis of political orientation,” Zuckerberg said. “That’s an important philosophic point for me.”

Zuckerberg did not address a question about whether Facebook shares information with its messaging application, WhatsApp, pledging to provide that answer later in a written response.