Facebook has come under fire after leaked documents revealed the social media site has been targeting potentially vulnerable children.
The allegations suggest the company is gathering information on young people who “need a confidence boost” to facilitate predatory advertising practices.
Confidential documents obtained by The Australian reportedly show how Facebook can exploit the moods and insecurities of teenagers using the platform for the benefit of advertisers.
By monitoring posts, the newspaper said Facebook could determine when users as young as 14 feel “defeated”, “overwhelmed”, “stressed”, “anxious”, “nervous”, “stupid”, “silly”, “useless” or a “failure”.
This information, which Facebook calls “sentiment analysis”, could be used by advertisers to target young teenagers when they are potentially more vulnerable.
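The report does not describe how such analysis is implemented, but the kind of keyword-based mood detection it alleges could look something like the following minimal sketch. The function name and the word list (taken from the terms quoted above) are purely illustrative, not Facebook's actual system.

```python
# Illustrative sketch only: naive keyword matching of the kind alleged
# in the report. Names and logic are hypothetical, not Facebook's code.
MOOD_KEYWORDS = {"defeated", "overwhelmed", "stressed", "anxious",
                 "nervous", "stupid", "silly", "useless", "failure"}

def flags_vulnerable_mood(post: str) -> bool:
    """Return True if the post contains any of the tracked mood words."""
    words = {w.strip(".,!?\"'").lower() for w in post.split()}
    return not words.isdisjoint(MOOD_KEYWORDS)

print(flags_vulnerable_mood("I feel so useless today"))   # True
print(flags_vulnerable_mood("Great day at the beach!"))   # False
```

Real sentiment analysis is far more sophisticated than word matching, but the principle of scanning posts for emotional signals is the same.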
The document was reported to have been put together by two Australian Facebook executives, and includes information on when young users are most likely to feel certain emotions.
The company said claims made about targeting site users were "misleading" but has since issued an apology and said an investigation would be launched into the matter.
In a statement, Facebook said: "The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.
"Facebook has an established process to review the research we perform. This research did not follow that process, and we are reviewing the details to correct the oversight."
A spokesman told The Independent no known studies of a similar nature had taken place in the UK and that British parents had "no reason to feel concerned".
Facebook, which alongside Google dominates the online advertising industry, has been at the centre of internet privacy concerns in recent years, with many suspecting the company of being capable of using personal data in this way.
In 2012, the company faced a fierce backlash after it conducted an experiment on nearly 700,000 unsuspecting users.
By using an algorithm to determine whether a post was negative or positive, Facebook was able to alter which status updates appeared on an individual user’s news feed.
The aim was to determine whether the selected group’s mood could be influenced, with users becoming sadder the more negative posts they saw.
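The mechanism described above, scoring posts as positive or negative and then skewing which ones appear in a feed, can be sketched roughly as follows. The scoring scheme, word lists, and sample posts here are invented for illustration and bear no relation to Facebook's actual algorithm.

```python
# Illustrative sketch only: crude word-count sentiment scoring and feed
# filtering, loosely modelling the experiment described above.
POSITIVE = {"happy", "great", "love"}
NEGATIVE = {"sad", "awful", "hate"}

def sentiment(post: str) -> int:
    """Positive score for upbeat posts, negative for downbeat ones."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def filtered_feed(posts, suppress_positive=True):
    """Return the feed with positive (or negative) posts withheld."""
    if suppress_positive:
        return [p for p in posts if sentiment(p) <= 0]
    return [p for p in posts if sentiment(p) >= 0]

feed = ["I love this!", "Feeling sad today.", "Just ate lunch."]
print(filtered_feed(feed))  # ['Feeling sad today.', 'Just ate lunch.']
```

Suppressing one polarity of post, as in the sketch, is what let the experimenters test whether the remaining feed shifted users' own moods.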
The results were published in a scientific journal but the company was criticised for playing with people’s emotions for commercial gain.