DCU seeks public’s help to tackle ‘subtle’ cyberbullying [Irish Examiner, By Sarah Slater, 08/09/2014]

Researchers at Dublin City University (DCU) need the public's help to develop the new system, known as Uonevu — Swahili for bullying.

The majority of bullying behaviour online involves implicit and metaphorical use of language, often including negative stereotypes. This is much more difficult to detect than explicitly offensive language and content using profanities or other obviously derogatory words.

The collected information will be used to build an anti-cyberbullying system capable of automatically identifying subtle, non-explicit forms of bullying language that are widespread online but typically difficult to detect.

Dr James Norman O’Higgins, director of the Anti-Bullying Centre at DCU, said 53% of young people surveyed have been upset by cyberbullying.

The study was carried out on a group of 2,700 students aged between 12 and 16 late last year.

Language used by some young adolescent girls, the study found, includes: “slapper”, “slut”, “go kill yourself”, and “go cut yourself with glass”.

This new system, developed by the Centre for Global Intelligent Content, can then be used across multiple languages and social media channels.

The research is backed by the National Anti-Bullying Research and Resource Centre at DCU.

The project enables the public to contribute social stereotypes or language such as “blondes are stupid” or “French people are arrogant” via www.cngl.ie/cyberbullying.

Dr Johannes Leveling, project leader at DCU, explained: “Cyberbullying is a complex social problem that requires equally complex technology solutions to combat it.

“As a first step, we’re seeking to leverage crowdsourcing or ‘people power’ to enable us to quickly build a body of stereotypes (subtle language used).

“We can then use those stereotypes to train systems to automatically recognise more negative stereotypes and detect subtle, non-literal forms of bullying.”
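The approach Dr Leveling describes — collecting stereotype phrases from the crowd and using them to recognise similar language — can be illustrated with a toy sketch. This is purely illustrative and is not the Uonevu system: the phrase list comes from the article's own examples, and all function names and the simple word-overlap scoring are hypothetical assumptions.

```python
# Toy illustration of the crowdsourcing idea: stereotype phrases
# contributed by the public (examples from the article) supply a
# vocabulary, and messages reusing that vocabulary are scored.
# All names and the scoring scheme are hypothetical, not Uonevu's.
crowd_stereotypes = ["blondes are stupid", "French people are arrogant"]

def stereotype_vocabulary(phrases):
    """Collect content words from crowdsourced stereotype phrases."""
    vocab = set()
    for phrase in phrases:
        vocab.update(phrase.lower().split())
    # Drop very common function words so matches are meaningful.
    return vocab - {"are", "is", "the", "a", "people"}

def overlap_score(message, vocab):
    """Fraction of the message's words found in the stereotype vocabulary."""
    words = message.lower().split()
    return sum(w in vocab for w in words) / max(len(words), 1)

vocab = stereotype_vocabulary(crowd_stereotypes)
# 2 of 5 words ("blondes", "stupid") match the vocabulary.
print(overlap_score("all blondes are so stupid", vocab))  # 0.4
```

A real system would of course go well beyond word overlap — the article indicates the crowdsourced stereotypes are used as training data — but the sketch shows why gathering subtle phrases, rather than only slurs, changes what can be flagged.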

Dr Leveling pointed out that traditional approaches to cyberbullying prevention have used a blacklist to filter and block words or phrases that are unequivocally offensive.
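The blacklist approach he describes is easy to sketch, and its limitation is equally easy to see. The word list and function below are hypothetical stand-ins, using the explicit terms quoted in the article; they illustrate the traditional technique, not any DCU software.

```python
# Naive blacklist filter of the kind the article says traditional
# tools rely on. The word list and names are illustrative only.
import re

BLACKLIST = {"slapper", "slut"}  # explicit terms quoted in the article

def contains_explicit_abuse(message: str) -> bool:
    """Return True if the message contains any blacklisted word."""
    words = re.findall(r"[a-z']+", message.lower())
    return any(word in BLACKLIST for word in words)

# An explicitly abusive message is caught...
print(contains_explicit_abuse("you slut"))  # True
# ...but a subtle taunt contains no blacklisted word and passes through.
print(contains_explicit_abuse("are you wearing your blouse today?"))  # False
```

As the second call shows, a message can carry a bullying implication without containing a single filterable word — exactly the gap the researchers are trying to close.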

“Subtle forms of bullying and harassment, while widespread and equally damaging, are much more difficult for computer systems to identify.

“For example, a Facebook post that asks, ‘are you wearing your blouse today?’ does not contain anything that would traditionally be tagged as offensive. However, to a 15-year-old boy receiving the message, this post could imply that he is effeminate or gay. It is this subtle form of bullying that Centre for Global Intelligent Content researchers are aiming to detect,” he added.