UK home secretary Amber Rudd is expected to call on Silicon Valley executives attending a meeting in San Francisco to play their part in countering extremism.
She is to attend the inaugural meeting of the Global Internet Forum to Counter Terrorism, which was set up in June 2017 by Facebook, Microsoft, Twitter and YouTube.
Representatives from the tech industry, government and non-governmental organisations (NGOs) are expected to share information and best practices about how to counter the threat of terrorist content online.
The meeting is expected to be attended by representatives of the founding companies, around 20 other tech firms and NGOs, and government representatives from the US, Australia, Canada, the European Union (EU) and the UK.
In the wake of the Westminster terror attack on 22 March 2017, Rudd began a crusade against end-to-end encryption, but she appeared to back down after a meeting with representatives of Facebook, Google, Twitter and Microsoft to discuss ways to ensure that security officers get the data they need in the future.
However, Rudd is expected to use the forum meeting in San Francisco to push for increased efforts by service providers to remove extremist content from their platforms.
“Terrorists and extremists have sought to misuse your platforms to spread their hateful messages,” Rudd is expected to tell the tech executives, according to Reuters.
She is expected to say that the forum is “a crucial way to start turning the tide” and that the responsibility for tackling the extremist threat at every level lies with both governments and industry.
“We have a shared interest: we want to protect our citizens and keep the free and open internet we all love. Today’s meeting of the forum is the next step towards achieving these goals,” she is expected to say.
After the London Bridge attack on 3 June 2017, UK prime minister Theresa May called for closer regulation of the internet to “deprive the extremists of their safe spaces online”.
The UK government has come under fire for seeking even greater powers of intrusion after passing the Investigatory Powers Act (IP Act) in December 2016, legislation that many already consider too intrusive.
Balancing censorship and safety
Announcing the Global Internet Forum to Counter Terrorism on 26 June 2017, the founding organisations said the initiative was aimed at helping them make their services hostile to terrorists and violent extremists.
Recognising that the spread of terrorism and violent extremism is a pressing global problem and a critical challenge, the founders said: “We believe that by working together, sharing the best technological and operational elements of our individual efforts, we can have a greater impact on the threat of terrorist content online.”
The forum, they said, builds on initiatives including the EU Internet Forum and the Shared Industry Hash Database; discussions with the UK and other governments; and the conclusions of the recent G7 and European Council meetings.
Despite these commitments, most US tech firms have rejected calls to allow governments access to encrypted services, saying they need to balance the demands of state security with the freedoms of democratic society.
The tech giants have been criticised for failing to take real action and accused of promising co-operation in an effort to stall plans by the UK, the US and the EU to introduce legislation to force them to make it easier to identify and locate users, reports the Guardian.
According to the forum, the inaugural meeting on 1 August 2017 will be used to formalise goals for collaboration and to identify, with smaller companies, the specific areas of support needed as part of the forum’s workplan.
“Our mission is to substantially disrupt terrorists’ ability to use the internet in furthering their causes, while also respecting human rights. We believe that the best approach to tackling online terrorism is to collaborate with each other and with others outside the private sector, including civil society and government,” the forum said.
Claire Stead, online safety expert at security firm Smoothwall, said social media companies must collaborate with partners that have the capability to monitor for illegal or inappropriate behaviour and remove any concerning comments, accounts and pages.
“All stakeholders in the industry must work closely together and be honest and open with each other. It’s the only way to solve this problem,” she said.
However, Stead said freedom of speech must remain key. “It is important to ensure that it is used in a way that benefits everyone; enabling the police and law enforcement agencies to identify potential threats and keep the public safe, while not imposing themselves on innocent parties,” she said.
Monitoring should therefore be intent-based rather than event-based, she added, assessing search behaviours in combination and determining whether they have negative connotations and need to be monitored.
“If censorship on mainstream social sites emerges, it is highly likely that extremists will move to different platforms or the dark web. I doubt it will ever be eradicated completely, but platforms that facilitate this behaviour could and should be doing more to discourage and identify it to protect the public from harm,” she said.