Good Digital Parenting
Blog | Sept. 23, 2019

Protecting Vulnerable Users on Instagram

Program Coordinator, Family Online Safety Institute

With a few weeks of the back-to-school season under our belts, it is easier to step back and examine our daily routines. Are our children making it to school on time? Are they eating a balanced diet? Are they spending too much time online? Are those interactions healthy? These questions can be difficult to answer on your own, so it helps to include your teen in the conversation. Ask for their honest opinion about their tech use, and have them reflect on their experiences so far. If either of you notices that screen time has crept too high, know that platforms continue to roll out features to help families carefully curate their online experience. Instagram in particular offers time management, comment controls, and content restrictions for vulnerable users who may need help creating a healthier experience.

Instagram encourages interactions that are positive and inspiring, and its Activity Dashboard helps ensure your teen's time is spent intentionally. The dashboard shows your teen (and every user) how much time they have spent on Instagram each day and each week. The Daily Reminder lets users decide how long they want to spend on Instagram each day and sends an alert once that self-imposed limit is reached. The Mute Push Notifications feature silences notifications for a set period. Together, these three in-app features of the Activity Dashboard can help your teen learn to manage screen time by showing them exactly how much time they spend on Instagram. It may well be the first time they quantify their use.

Controlling comments will contribute to a more positive experience on Instagram if your teen struggles to manage what is written on their posts. Users can enable an automatic filter that removes comments that threaten the appearance, character, health, or wellbeing of the recipient. Enabling a manual filter in addition lets your teen enter their own list of words or phrases; any comment containing one of them is removed. Artificial intelligence also notifies commenters when a message may be perceived as offensive: after a user submits such a comment, a pop-up asks, "Are you sure you want to post this?" This prompt lets users rethink unkind comments and undo the post before the recipient receives a notification. The AI compares new comments to ones that have previously been reported, with the goal of making Instagram a supportive place.

When put into practice, these filtering features could go a long way toward reducing the upsetting interactions your child receives on Instagram. Taking preventative measures, such as blocking certain comments or disabling comments altogether, frees your teen's attention from unwanted interactions so they can focus on sharing what makes them happy.

For ongoing issues with bullying, Instagram's Restrict feature aims to limit interaction entirely. If your teen restricts an account, comments from that person are visible only to the commenter, not to your teen or their other followers. A restricted user also cannot see whether your child is active on Instagram or whether their direct messages have been read. This feature protects users from undesired interactions without the other person knowing their comments are hidden. It can be difficult for parents to recognize online bullying, which is why we've created an Understanding Cyberbullying resource with tips and examples.

Lastly, Instagram protects vulnerable users by heavily restricting content containing self-harm or suicide imagery. The platform does not allow graphic images of self-harm (such as cutting) and removes them when they are reported. It also does not allow fictional depictions of graphic self-harm or suicide in drawings, memes, films, or comics, nor imagery that shows associated materials or methods. Additionally, Instagram does not surface non-graphic images (such as healed scars) through its search, hashtag, or Explore pages. Users can still upload non-graphic content, but it will not be recommended to others through discovery. This content remains allowed so as not to stigmatize or isolate users who post non-graphic images while seeking support from their online community.

For those who do search self-harm-related phrases, a pop-up message shows resources and organizations that help people in need and encourages the user to reach out to a friend, helpline, or other trusted person. The platform also protects vulnerable users by placing a blurred sensitivity screen over potentially triggering non-graphic images so they are not immediately visible. Combined with the removal of graphic content and the restriction on promoting non-graphic images, these measures let all users navigate Instagram without the stress of encountering harmful content unexpectedly in their feeds or on the Explore page.

Being aware of social media's many options for curating the user experience will not only help your teen avoid potentially unwanted interactions; it should also inspire them to use the platform to its fullest promise of creating intentional, positive interactions. These options mean no user has to endure bullying or see upsetting content. If your teen sees someone else who is struggling, encourage them to share the options from the platform's help center and to report egregious behavior themselves; it takes only one report for Instagram to review content. FOSI's Navigating Social Media resource is an extra tool for parents looking to understand their child's relationship with social media.