By Ed Macnair, Founder and CEO, CensorNet
From this month, local authorities – along with schools, prisons and NHS trusts – are required to take specific action to stop people being drawn into terrorism. The new rules, laid out in the Anti-Radicalisation Law that forms part of the Counter Terrorism and Security Act, state that ‘a specified authority must, in the exercise of its functions, have due regard to the need to prevent people from being drawn into terrorism.’
Specified authorities will be expected to ensure those accessing the internet are safe from terrorist and extremist material that could lead to radicalisation. Councils will need to demonstrate they are protecting citizens – in particular children – from being drawn into terrorism by establishing appropriate levels of internet filtering and putting safeguarding policies in place to identify those who may be at risk, and intervening as appropriate.
A change for the better
Local councils will be required to make checks on the use of their public buildings, their internet filters and any unregulated out-of-school settings, including after-school clubs and groups, supplementary schools and tuition centres that support home education.
On the launch of the Act, Security minister John Hayes said: ‘We have seen all too starkly and tragically the dangers of radicalisation and the devastating impact it can have on individuals, families and communities. The new duty is about protecting people from the poisonous and pernicious influence of extremist ideas that are used to legitimise terrorism.’
Whilst the new Anti-Radicalisation Law is a welcome one, the sad truth is that content from extremist groups attempting to radicalise people online has always been just a click or two away. There is an increasing trend for these groups to push their messages over social media and cloud applications such as Twitter, Facebook and YouTube. In fact, the problem caused by popular social media sites has got far worse since YouTube replaced its recommendations column with an ‘Auto Play’ feature, in which the next video in the list plays automatically, meaning a ‘safe’ video could be followed by something far more unsavoury.
Local authorities are already faced with a constant battle to manage and police large, distributed networks. The problem is exacerbated by the numerous entry points now available through the likes of Bring Your Own Device (BYOD), which make it increasingly difficult for staff to oversee and monitor disparate online activity across a growing number of different devices.
A cleaner internet
In complying with the new Anti-Radicalisation Law, the Government is calling for local authorities to ensure that publicly-owned venues and resources do not provide a platform for extremists and are not used to disseminate extremist views.
The Act clearly states that local authorities should consider ensuring that publicly available IT equipment uses web filtering solutions to limit access to terrorist and extremist material and take steps to ensure children attending out of school settings are ‘properly safeguarded’.
Many local authorities already use specialist internet filtering tools as a means of restricting access to harmful content; however, many rely on outdated solutions that cannot properly monitor cloud application use, as today’s needs demand. They therefore need to ensure that their core systems can track and block all the modern vectors extremists use to get their message across.
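The baseline approach described here – category-based URL filtering – can be sketched in simplified form. The category names, domains and lookup table below are purely illustrative; a real filtering product resolves categories against a vendor-maintained database rather than a hard-coded dictionary.

```python
from urllib.parse import urlparse

# Categories an authority might choose to block (illustrative only).
BLOCKED_CATEGORIES = {"extremism", "hate"}

# Stand-in for a vendor-maintained URL categorisation database.
URL_CATEGORIES = {
    "example-extremist-site.com": "extremism",
    "example-news-site.com": "news",
}

def is_blocked(url: str) -> bool:
    """Return True if the URL's domain falls into a blocked category."""
    domain = urlparse(url).netloc
    category = URL_CATEGORIES.get(domain, "uncategorised")
    return category in BLOCKED_CATEGORIES

print(is_blocked("http://example-extremist-site.com/page"))   # blocked
print(is_blocked("http://example-news-site.com/story"))       # allowed
```

This site-level check is exactly what older solutions stop at; the article’s point is that it says nothing about what is posted or viewed within an allowed site.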
Because of this, local authorities should go a step further and ensure they have a solution that not only knows which websites are being visited, but can also track, at a more granular level, what content is being accessed or posted. Ideally, it should be able to automatically monitor for inappropriate phrases related to issues such as terrorism or radicalisation within the comments posted underneath a seemingly innocuous YouTube video or a link shared on Facebook.
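At its simplest, the granular monitoring described above amounts to matching posted text against a watch-list of phrases. The sketch below assumes a made-up watch-list and function name; real solutions use curated, regularly updated term lists combined with context analysis rather than bare substring matching.

```python
# Hypothetical watch-list of phrases a monitoring tool might flag.
WATCH_PHRASES = ["join the fight", "martyrdom operation", "holy war"]

def flag_content(text: str) -> list[str]:
    """Return the watch-list phrases found in a piece of user-posted content."""
    lowered = text.lower()
    return [phrase for phrase in WATCH_PHRASES if phrase in lowered]

comments = [
    "Great video, thanks for sharing!",
    "Brothers, join the fight before it is too late.",
]
for comment in comments:
    hits = flag_content(comment)
    if hits:
        print(f"FLAGGED: {comment!r} -> {hits}")
```

Flagged items would feed a safeguarding workflow for human review rather than triggering automatic action, since plain phrase matching inevitably produces false positives.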
Our duty is clear
Radicalisation is simply another category on the growing list of the web’s darker corners – alongside pornography, sites promoting criminal skills and hate crime – from which it is our duty to protect the next impressionable generation. Local authorities need to have safeguarding policies in place and must protect staff and citizens using their public networks from extremist material when they access the internet.
The problem is very real: the government’s Counter Terrorism Internet Referral Unit (CTIRU), set up in 2010, has removed more than 49,000 pieces of content that ‘encourages or glorifies acts of terrorism’, 30,000 of them since December 2013.
To adhere to the Government’s new Anti-Radicalisation Law, every local authority should implement robust web security and content filtering technology to protect users from inappropriate messages and content. Only by combining traditional web filtering with the tracking of social media activity and content across all devices can local councils gain an early-warning system that highlights anyone who may be susceptible to radical messages.