Author: Alan Mackenzie, e-safety adviser

As part of your school’s overall safeguarding strategy, it’s likely you will be using one or more technical safeguarding tools to protect students and adults from accessing inappropriate and illegal content. Web filters are the most obvious example, but there are also tools such as firewalls, anti-virus software and others.

In the beginning…
In September 2013, Ofsted released the first ever e-safety inspection framework. Whilst there were understandable concerns that this put more pressure on schools, looking at it from a positive angle it was good to see a common, standardised approach with very clear expectations with regard to e-safety. Unfortunately, after a number of updates and clarifications this framework was removed, but all the main points made their way into the Common Inspection Framework, which was released in June 2015 ready for September that year.

One of the key points in the original framework was in regard to the use of web filters in schools. This specifically related to a small-scale study that was carried out in 2009 by Ofsted which showed that filters were being used to ‘lock down’ rather than manage web access. There was one standout paragraph in the study in relation to this: “Pupils in the schools that had ‘managed’ systems had better knowledge and understanding of how to stay safe than those in schools with ‘locked down’ systems. Pupils were more vulnerable overall when schools used locked down systems because they were not given enough opportunities to learn how to assess and manage risk for themselves.” (Ofsted, The safe use of new technologies. Ref: 090231. The National Archives, p. 5).

Whilst this was quite a small study, pragmatically it was well known that filtering could be an enormous frustration – particularly for teaching staff trying to access classroom resources and for students trying to access curriculum content, only to find they were blocked. This goes back to the days when the majority of schools received their filtering as part of a local authority or broadband consortium-managed service. Differentiating filtering levels for potentially hundreds of thousands of users across multiple age groups was an almost impossible task for a whole host of reasons, but primarily it was a fine balance between what was needed for teaching and learning, the limitations of the filter and the network setup within individual schools.

A wider range of options
But things have improved significantly since then: technical solutions such as NetSupport DNA have revolutionised filtering; the standard local authority/RBC filtered broadband setup is no longer as common as it was; and schools are able to manage their own services locally far more easily than ever before, without the need for technical knowledge. Added to this, the tools themselves have become much more intelligent and easier to use – and there is now a range of options for schools to effectively ‘manage’ their web access.

Later, ‘online safety’ was specifically mentioned in the updated and clarified DfE statutory guidance, Keeping Children Safe in Education. There were lots of changes within this document, but in the context of this article there was one really big addition – filtering and ‘monitoring’.

Initially, this one additional word caused a lot of confusion: does monitoring mean technical monitoring or over-the-shoulder type monitoring? And what are you monitoring?

The simple answer to the first question is both; the requirement is that each school should risk-assess its own individual circumstances. To give a very simplistic example, if you’re in a school that has a single room of computer screens that the teacher can physically see and monitor, then it’s likely that physical monitoring would suffice. However, it’s more likely you’re in a school with a wide spread of screens, perhaps using mobile technology such as iPads. Under these circumstances, physical monitoring would probably not suffice and you would need to consider technical monitoring.

But what are we monitoring for?

Much of the current agenda comes under the Prevent duty, but we know it covers far more than this. The risks are normally categorised into three specific areas:

• content
• contact
• conduct

So how do filtering and monitoring fit into these categories?

If you think about the way in which an internet filter works, it is predominantly about content: specifically, managing and preventing deliberate or inadvertent access to inappropriate or illegal material.

But this is where a filter has always fallen short; it only covers web access. If you think about all the things you do on a computer or other device, web access forms only a very small part. And the concerns around filtering go further than that. For example, on more than one occasion when carrying out an audit of filtering logs, I have found students who had attempted to access sites on self-harm, suicide, incest and much more. The filter had done its job by blocking access, but blocking access had done nothing to safeguard the student because, in the majority of these particular schools, nobody knew that access was being attempted. Nobody was looking at the logs; the ‘technical’ aspect wasn’t included in the school’s overall safeguarding strategy.
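For the more technically minded, the kind of log audit described above can be sketched in a few lines of code. This is a minimal illustration only: the log format (user, category, action columns) and the category labels are hypothetical assumptions, since every filtering product exports its logs in its own schema.

```python
# Minimal sketch of a filter-log audit: surface blocked attempts in
# high-risk categories so a safeguarding lead can follow them up.
# The CSV schema and category names below are hypothetical examples,
# not those of any particular filtering product.
import csv
import io
from collections import Counter

HIGH_RISK = {"self-harm", "suicide"}  # assumed category labels

def audit(log_text):
    """Count blocked high-risk attempts per (user, category) pair."""
    hits = Counter()
    for row in csv.DictReader(io.StringIO(log_text)):
        if row["action"] == "blocked" and row["category"] in HIGH_RISK:
            hits[(row["user"], row["category"])] += 1
    return hits

sample = """user,category,action
student42,self-harm,blocked
student42,games,blocked
student17,suicide,blocked
"""

for (user, category), count in sorted(audit(sample).items()):
    print(f"{user}: {count} blocked attempt(s) in '{category}'")
```

The point is not the code itself but the process: someone in school needs to be responsible for regularly reviewing what the filter has blocked, so that a blocked attempt becomes a safeguarding conversation rather than a silent log entry.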

Taking it further…
Over the last few years, tools such as NetSupport DNA have transformed monitoring and are now a necessary and important part of any safeguarding strategy. DNA’s combination of proactive and reactive tools (such as its unique keyword and phrase monitoring tool, keyword cloud, review features and much more) gives schools a whole range of options and the flexibility to manage everything that is happening on devices – not just the web.

Although ‘monitoring’ can be seen as an extra burden, it isn’t. Monitoring is a far more effective tool for managing content, contact and conduct, and for supporting the school’s overall safeguarding and child protection strategy.

Watch our short video to find out more…