MORE than 150 grooming crimes have been recorded by police in Sussex in the last two years, data obtained by the NSPCC has revealed.

There were 153 offences of sexual communication with a child recorded from April 2017 to April 2019 in Sussex.

In England and Wales there were 4,373 offences of sexual communication with a child recorded in the year to April 2019 compared with 3,217 in the previous year. The offence came into force on April 3, 2017, following an NSPCC campaign.

The data, obtained under Freedom of Information laws from 43 police forces in England and Wales, also revealed that, where the victim's age was provided, one in five was aged 11 or younger.

In England and Wales, the number of recorded instances involving Instagram, which is owned by Facebook, more than doubled in 2018/19 compared with the previous year.

Overall in the last two years, Facebook-owned apps (Facebook, Messenger, Instagram, WhatsApp) and Snapchat were used in more than 70 per cent of the instances where police recorded the communication method. Instagram was used in more than a quarter of them.

The Government has indicated it will publish the Online Harms Bill early next year, following the NSPCC’s Wild West Web campaign. The proposals would introduce independent regulation of social networks, with tough sanctions if they fail to keep children safe on their platforms.

The NSPCC believes it is now crucial that Boris Johnson’s Government publicly commits to drawing up these Online Harms laws and, as a matter of urgency, implements robust regulation forcing tech firms to protect children.

Peter Wanless, NSPCC Chief Executive, said: “It’s now clearer than ever that Government has no time to lose in getting tough on these tech firms.

“Despite the huge amount of pressure that social networks have come under to put basic protections in place, children are being groomed and abused on their platforms every single day. These figures are yet more evidence that social networks simply won’t act unless they are forced to by law. The Government needs to stand firm and bring in regulation without delay.”

Freya* was 12 when, while she was staying at a friend’s house, a stranger bombarded her Instagram account with sexual messages and videos.

Her mum Pippa* told the NSPCC: “She was quiet and seemed on edge when she came home the next day. I noticed her shaking and knew there was something wrong so encouraged her to tell me what the problem was.

“When she showed me the messages, I just felt sick. It was such a violation and he was so persistent. He knew she was 12, but he kept bombarding her with texts and explicit videos and images. Freya* didn’t even understand what she was looking at. There were pages and pages of messages, he just didn’t give up.

“Our children should be safe in their bedrooms, but they’re not. They should be safe from messages from strangers if their accounts are on private, but they’re not.”

The NSPCC’s Wild West Web campaign is calling for social media regulation to require platforms to:

• Take proactive action to identify and prevent grooming on their sites by:

- Using Artificial Intelligence to detect suspicious behaviour

- Sharing data with other platforms to better understand the methods offenders use and flag suspicious accounts

- Turning off friend suggestion algorithms for children and young people, as they make it easier for groomers to identify and target children

• Design young people’s accounts with the highest privacy settings by default, such as geo-locators turned off, contact details kept private and unsearchable, and livestreaming limited to contacts only.

The charity wants to see tough sanctions for tech firms that fail to protect their young users – including steep fines for companies, boardroom bans for directors, and a new criminal offence for platforms that commit gross breaches of the duty of care.