
The algorithms that suggest content to your kids aren’t neutral; protecting their online safety shouldn’t be either

Search engines have given us easy access to ever more, and seemingly ever-fresh, sources of content that can be just as detrimental as cyberbullying, if not more so. Of course, search engines, like most technologies, are not inherently bad, but they aren’t neutral either, which means there’s good reason to be cautious.

Paradoxically, while search engine algorithms represent one of the greatest areas of progress in computing, some of the most concerning issues are rooted in them too. As search engines grew in popularity, their design evolved around deep learning, location data and ever more processing power. This combination has made them more powerful, making it easier for users to find the content they request. However, it has also increased the opportunity for unwanted or harmful content to appear, or to be requested, and potentially disturb the user.

Early search engines
The first web search engines were built in the 1990s, and many of their key developments took place in that era. However, modern search engines, like Google, are now largely self-optimised, with algorithms tuned in real time and daily improvements to the user experience to suit today’s user.

Over time, search engines have become more sophisticated and now fit into our pockets via our smartphones. Users, many of them children, have the entire web at their disposal at all times. This means accessing or receiving inappropriate or harmful content is highly likely. The internet is an expansive place that gives various groups and communities the opportunity to meet and scale their influence – for better or worse.

Not your neighborhood library
It’s alarmingly easy for a child, older minor or even an adult to stumble upon, be served or deliberately seek out harmful content. This is an ever-present reality that becomes all the more concerning given how efficient search engines have become, and how inappropriate or harmful content proliferates on social media, online forums, websites, and digital ads.

To better serve users, the internet rapidly evolved with the introduction of predictive search and monetisation; as a result, the algorithms leveraged by search engines began not only to locate content but, crucially, to suggest it as well. Large social platforms and search companies employ these developments not only to drive profit via ads, for example, but also to ‘feed’ users content that has the potential to (artificially) broaden their interests. In this manner, search behavior informs a user’s ‘for you’ or ‘suggested’ pages. This can be particularly problematic for children and young adults, whose interests and personalities may not have fully formed.
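
To make that feedback loop concrete, here is a deliberately naive sketch of behavior-based suggestion, assuming nothing more than a recent search history and a small catalogue of items tagged by topic. Real recommender systems are vastly more complex and proprietary; the function names, data and rankings below are purely illustrative.

```python
# A deliberately simplified sketch of behavior-based suggestion:
# score each candidate item by how often its topic appears in the
# user's recent searches. Real recommender systems use embeddings,
# engagement signals and A/B-tested ranking models; everything here
# is illustrative only.
from collections import Counter

def suggest(recent_searches: list[str], catalogue: dict[str, str], top_n: int = 3) -> list[str]:
    """Rank catalogue items (title -> topic) by how often each topic shows up in recent searches."""
    topic_counts = Counter()
    for query in recent_searches:
        for topic in set(catalogue.values()):
            if topic in query.lower():
                topic_counts[topic] += 1
    # Items whose topic matches the user's dominant interests rise to the top
    ranked = sorted(catalogue, key=lambda title: topic_counts[catalogue[title]], reverse=True)
    return ranked[:top_n]

history = ["what i eat in a day", "low calorie diet tips", "diet workout plan"]
items = {
    "Healthy meal prep": "diet",
    "Extreme weight loss challenge": "diet",
    "Guitar lesson #1": "music",
}
print(suggest(history, items))
```

Even in this toy version, a handful of diet-related searches is enough to push diet content to the top of every suggestion list, which is exactly the narrowing effect described above.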

When search gets personal
Parents and educators need to be aware of the dangers awaiting minors online and be informed enough to help them. To highlight how directly behavior-based search shapes the results a user is shown, consider how easily a “What I eat in a day” video may lead to a pro-ana (pro-anorexia) online forum, a thinspiration (thin inspiration) message board, or even a thread full of self-harm tips or other explicit content.

Algorithms work tirelessly to bring users the content they might enjoy and interact with. Even though large social media platforms endeavor to protect their users, particularly following high-profile investigations into what technology platforms know about their products’ effects on children’s mental health, these efforts often lag behind technological developments.

A toolbox of prevention
It is natural that kids want to spend time on the internet, but they should never be wholly unsupervised. A great tool to help you keep tabs on your child’s behavior online is a trusted parental control solution, like ESET’s Internet Security for Advanced Protection, which ensures secure web browsing for kids. In addition to letting you limit how long your child can access certain apps and websites, it can also block specific content types and URLs on PCs and mobile devices alike.

A good quality parental control and internet security solution categorises websites so that categories that are unsafe or inappropriate for children can be blocked. Safe search features can also filter search engine results, so parents can rest assured that search engines will not suggest inappropriate content to kids. When selecting an internet safety solution for their family, parents should ensure it also allows them to manually blacklist websites and apps, so they can make the call about what content is appropriate. The same applies to whitelisting appropriate resources.
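
For readers who want a sense of how those layers interact, here is a minimal conceptual sketch of category-based blocking with manual allow and deny lists on top. The categories, domains and precedence rules are assumptions for illustration only and do not reflect any particular product’s implementation.

```python
# A minimal, conceptual sketch of category-based filtering with manual
# allow/deny lists layered on top. The categories, domains and rules are
# illustrative only, not any vendor's actual database or API.
BLOCKED_CATEGORIES = {"adult", "violence", "self-harm"}

# Tiny stand-in for the categorised website database a real solution maintains
SITE_CATEGORIES = {
    "example-videos.com": "entertainment",
    "example-forum.net": "self-harm",
}

ALLOWLIST = {"school-portal.example.org"}   # always permitted (parent's call)
DENYLIST = {"example-videos.com"}           # always blocked (parent's call)

def is_allowed(domain: str) -> bool:
    """Manual lists win; otherwise fall back to category rules; unknown sites pass."""
    if domain in ALLOWLIST:
        return True
    if domain in DENYLIST:
        return False
    category = SITE_CATEGORIES.get(domain)
    return category not in BLOCKED_CATEGORIES

for site in ["school-portal.example.org", "example-forum.net", "example-videos.com"]:
    print(site, "->", "allowed" if is_allowed(site) else "blocked")
```

The point of the ordering is that a parent’s explicit decisions (the allow and deny lists) always override the automatic category verdict, which mirrors the manual blacklisting and whitelisting described above.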

Whether or not you as a parent start using a parental control package, an even more important task remains: educating yourself about the content that is on the web and having regular conversations with your children about the online and offline world. Talking to your children is one of the best tools you can give them to protect themselves. Education on any subject should start in the family, and that is especially true for personal and private topics and our online presence.

Children and minors deserve to be treated with respect and educated about the choices we make about or for them. Talking to them about their online behavior may make them feel like we are invading their privacy, so be sensitive and make sure they feel heard and understood.
