Social media networks have made our lives more connected than ever before.
Facebook alone has 1.4 billion active users and Twitter attracts over 236 million people who regularly create, share and comment on each other’s ideas.
Yet, as filtered and finessed as our online world may seem, social media has also given rise to some of the darker aspects of our society.
Whilst criminals are often seen as operating in hidden forums, dark web listings, private chat groups or peer-to-peer networks, social media has become another tool in their arsenal.
In fact, Flickr, Facebook, Twitter and numerous other social media sites are regularly used by those who create, distribute and collect child sexual abuse (CSA) images and videos.
And while most CSA content is created illegally by networks of paedophiles and abusers, lately we are witnessing an increasing number of images and videos being generated by children themselves.
Learning to share is a hard lesson for children, but it seems social media has accelerated that learning curve.
Openness has become the norm and privacy a concern for another time.
Every day, thousands of selfies, food snaps, and family photos are shared on Instagram, Snapchat, Tumblr and other forums and websites.
However, whilst these spaces give the illusion of being your own plot of land in the teeming mass of the internet, once that material is released into the wild it is not only archived by you and the social network you share it on; it is also accessible to the rest of the world.
As a consequence, it’s become much easier for predators to trick young people into sharing explicit images of themselves and falling prey to an overly sexualised culture.
Tackling the taboo
Compared to just a few years ago, social networks are doing a much better job of taking down explicit photos and videos.
Facebook, for example, has revamped its community standards in its takedown policy. It now includes a separate section on “dangerous organisations” and gives more details about what types of nudity it allows to be posted.
It actively encourages its members to report posts that they believe violate its rules.
In addition, a trend of citizen policing has emerged, with members of the public actively collaborating with social networks and law enforcement to monitor and report suspicious criminal activity and child abuse cases.
Nonetheless, the sheer volume of CSA material being circulated online means that manual monitoring is not an option.
On average over 1.8 billion photos are shared online each day. While many of these are non-pertinent to CSA investigations, law enforcers must still sift through masses of content to uncover hidden material that can help to identify a victim and their link to criminal communities.
In addition, authorities must also trawl through masses of content available on the dark net.
While the subject is still very much a taboo, awareness is growing. Legislation is being implemented to protect children, and citizens the world over are being urged to take heed of what’s happening around them – in the workplace, in their online communities and in their neighbourhoods.
Conquering the root cause
With the proliferation of digital media, investigators and social networks face a huge challenge in trying to prevent illegal content from being shared on these services. As soon as one case is closed, another opens up.
More CSA cases emerge every day and each case could contain new or unidentified victims. The challenge facing investigators today is much more widespread than ever anticipated. Facebook alone sees over 350 million photos uploaded by users each day.
With millions of images and videos circulating online, manually reviewing and analysing these digital files is an almost impossible task.
However, technology can augment our visual capabilities, using image hashing to automatically separate pertinent (illegal) from non-pertinent material.
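The core idea can be sketched very simply: compute a digital fingerprint of each file and compare it against a database of fingerprints of already-classified material. The sketch below is a minimal, hypothetical illustration using a plain cryptographic hash (SHA-256), which only matches byte-identical copies; production systems such as Microsoft’s PhotoDNA instead use robust perceptual hashes that survive resizing and re-encoding. The `KNOWN_HASHES` database here is invented for illustration.

```python
import hashlib

# Hypothetical database of hashes of previously classified material,
# in practice supplied and maintained by law enforcement.
# (This example entry is simply the SHA-256 digest of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Check an uploaded file against the database of known hashes."""
    return file_hash(data) in KNOWN_HASHES
```

Because lookups in a hash set are effectively constant-time, this kind of matching scales to the billions of files shared daily, flagging known material automatically so that human reviewers only see genuinely new content.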
In fact, advanced image recognition technology has proven to be the most useful tool to aid investigators in connecting the dots between digital files and criminal actors when solving cases.
By tackling the root of the problem, the image itself, we can ensure that those responsible for creating and sharing this material are found.
More importantly, by finding and assessing the material we can find and help victims of abuse.
Sourced by Christian Berg, CEO NetClean