Archive for: May, 2017

Report: Facebook ‘Flooded’ with Revenge Porn, ‘Sextortion’

Facebook assessed nearly 54,000 cases of revenge porn and “sextortion” on the platform in a single month, according to a report by the Guardian.

“Figures shared with staff reveal that in January Facebook had to disable more than 14,000 accounts related to these types of sexual abuse – and 33 of the cases reviewed involved children,” the Guardian reported on Monday. “The company relies on users to report most abusive content, meaning the real scale of the problem could be much greater.”

“Sexual policy is the one where moderators make most mistakes,” one source told the newspaper. “It is very complex.”

This sentiment was mirrored by Facebook’s head of global policy management, Monika Bickert, who also claimed that the situation was “complex.”

“We constantly review and improve our policies,” said Bickert. “These are complex areas but we are determined to get it right.”

In its report, the Guardian added that “One slide showed that in January moderators alerted senior managers to 51,300 potential cases of revenge pornography, which it defines as attempts to use intimate imagery to shame, humiliate or gain revenge against an individual.”

“In addition, Facebook escalated 2,450 cases of potential sextortion – which it defines as attempts to extort money, or other imagery, from an individual,” the report continued. “This led to a total of 14,130 accounts being disabled. Sixteen cases were taken on by Facebook’s internal investigations teams.”

Besides revenge porn and sextortion, the leaked Facebook documents also revealed which phrases and sentences are banned and which are allowed.

“Moderate displays of sexuality, open-mouthed kissing, [and] clothed simulated sex and pixelated sexual activity,” are all allowed on the platform, while phrases like “I’m gonna f*ck you,” “I’m gonna eat that p*ssy,” and “Hello ladies, wanna suck my c*ck?” are also accepted.

If users go into more detail or start to become aggressive, however, then Facebook advises its moderators to implement sanctions.

In April, Facebook introduced an anti-revenge porn program that will be able to detect previously-flagged images and stop users from posting them.

“If someone tries to share a photo that Facebook has previously taken down, that person will see a pop-up saying the photo violates Facebook’s policies and that Facebook will not allow the person to share that particular photo on Facebook, Messenger or Instagram,” explained Tech Crunch last month. “Facebook has also partnered with a handful of organizations, like the Cyber Civil Rights Initiative and the Revenge Porn Helpline, to offer support to people who are victims of revenge porn.”

In March, a revenge porn victim’s lawyer claimed micro-blogging site Tumblr had “chosen to ignore” revenge porn images posted on the platform after it allegedly took three weeks to get a video of the then-17-year-old victim removed from the site.

“In my opinion, Tumblr has chosen to ignore valid legal demands because they earn more money using victims’ photographs as clickbait than they do protecting minors,” said lawyer Daniel Szalkiewicz, whose firm specializes in Internet defamation and revenge porn cases.

Upon investigation, Breitbart Tech also discovered numerous accounts on Tumblr that specialized in trying to find and shame the victims seen in revenge porn, many of which had been up for weeks without deletion.

Charlie Nash is a reporter for Breitbart Tech. You can follow him on Twitter @MrNashington or like his page at Facebook.


It’s Going To Be Especially Illegal To Share Revenge Porn In the Military

Partisanship may be rife on Capitol Hill, but there’s at least one issue that Republicans and Democrats can agree on. After a scandal in the Marines involving explicit images being distributed without consent, the House passed a bill banning nonconsensual nude photo sharing in the military.

Earlier this year, the military was rocked by news that a Marine Facebook group was sharing compromising photos and videos of female service members without their knowledge or consent. At least two dozen female members were identified in the photos, according to reports. The group, “Marines United,” included more than 30,000 active-duty or retired Marines, Navy corpsmen, and British Royal Marines. Some members shared links to photos on the group page.

Although the original group was deleted after it was busted, others have reportedly formed and allegedly continued sharing nude photos. If the members thought that their activities would go undetected in the new groups, perhaps they shouldn’t have used the names “Marines United 2.0” and “Marines United 3.0” for their exploits.

The Marine photo-sharing scandal led to a Senate hearing, as well as a Pentagon investigation. Marine Corps Commandant Gen. Robert Neller testified before the Senate Armed Services Committee that the Marines did not condone the behavior exhibited on the page and promised to take action.

The Protecting the Rights of IndiViduals Against Technological Exploitation Act (PRIVATE Act) would prohibit service members from sharing intimate images of others without consent, even if the individual gave consent to create the images, according to NBC News. Those who break the law could be tried by court-martial.

U.S. Navy regulations were also updated in April in order to make revenge porn in the Marines and Navy illegal. Any sharing of intimate images without the subject’s consent intended to humiliate, harass, or threaten is now banned.

The bill passed the House 418-0, a rare unanimous vote. It now heads to the Senate; if passed there, it would go to President Donald Trump’s desk to be signed into law.

Although service members make a great sacrifice in dedicating their lives to the safety of the United States, they are not above the law. Hopefully this problem, along with sexual assault in the military, will diminish as the public becomes more aware of both. No one should be subjected to photos of themselves being passed around without their consent, and certainly not the men and women who fight for our freedom.


Why Social Media’s Fight Against Revenge Porn Should Matter to Everyone

Although the Internet has connected people to one another with relative ease, there are certainly plenty of cons to balance out the pros of this technology. People of all ages are quick to share and post photos and videos of themselves through social media websites and apps, texting, email and more, which can lead to problems down the road. What happens when you are no longer on good terms with someone you once shared intimate photographs with, or your photos fall into the hands of someone who was never meant to see them? Even worse, what happens if someone takes intimate images of you without your knowledge and posts or shares them? We explore how big a problem revenge porn is, what some social media sites are doing about it and how you can protect yourself and your loved ones.

A growing problem in the U.S. and beyond

The phenomenon of non-consensual intimate image sharing or non-consensual pornography, commonly referred to as revenge porn, is one that has grown over the years as it has become easier and easier to share and mass distribute images online. According to a memo published by the Data & Society Research Institute in December 2016, one in 25 Americans has been a victim of non-consensual image sharing. Whether perpetrated by someone the victim knows or strangers who have acquired their data through hacking or other means, this type of image sharing is a huge problem and has the potential to destroy victims’ lives. Just this week, the House passed legislation to ban non-consensual sharing of nude photos in the military, following a massive scandal within the Marine Corps that involved hundreds of Marines sharing explicit photos of female Marines in private groups online. Although many instances of non-consensual pornography involve media taken without the subject’s consent, plenty involve photos taken and sent by the victims themselves.

In a 2013 survey conducted by the Cyber Civil Rights Initiative (CCRI), 61% of the 1,606 people questioned admitted to taking a nude photograph or video of themselves and sending it to someone else, and 23% of the total respondents said they’d been victims of non-consensual pornography. Although this is just a small sample of people, and it only included adults ages 18 and older, these kinds of numbers combined with the numerous scandals and lawsuits in recent years indicate that plenty of people are not only taking and sharing these kinds of photos and videos, but many are being harassed or targeted with them. Not all instances of non-consensual pornography are malicious — as the CCRI notes on its website — making the term revenge porn somewhat misleading. But regardless of the motivations, the ease with which images can be uploaded and shared across social media platforms is problematic when it comes to getting them taken down and punishing the perpetrator. Law enforcement has been slow to catch up with technology as a whole, as victims of all kinds of online harassment have discovered when it comes to reporting the crimes against them.

What are social media sites doing to fight revenge porn?

While the law plays catch-up, some social media sites are taking matters into their own hands, since it’s on these platforms that a substantial amount of revenge porn occurs. This past spring, Facebook announced that it was taking steps to combat the sharing and posting of non-consensual intimate images. According to Antigone Davis, Head of Global Safety at Facebook, the site will utilize a combination of photo matching technology and trained members of its community standards team to not only help remove images flagged as revenge porn, but also prevent them from being re-shared or re-uploaded. In the event someone tries to share or upload a photo that Facebook has taken down, a pop-up will appear telling them that it’s a violation of the site’s policies. It will not be able to be shared on Facebook, Facebook Messenger or Instagram. In many cases, the account which posted the photo in the first place will also be deactivated.
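Facebook has not published the internals of its photo-matching system, but the general idea described above — fingerprinting removed images and checking every new upload against those fingerprints — can be sketched in a few lines. The class and method names below are hypothetical, and a real system would use a perceptual hash (such as Microsoft’s PhotoDNA) that survives resizing and re-encoding rather than the exact SHA-256 digest used here purely for illustration:

```python
import hashlib


class PhotoBlocklist:
    """Hypothetical sketch of hash-based photo matching, not Facebook's actual system."""

    def __init__(self):
        self._hashes = set()

    def _fingerprint(self, image_bytes: bytes) -> str:
        # Real systems use perceptual hashes that tolerate re-encoding;
        # an exact SHA-256 digest only catches byte-identical copies.
        return hashlib.sha256(image_bytes).hexdigest()

    def block(self, image_bytes: bytes) -> None:
        # Called when moderators take an image down.
        self._hashes.add(self._fingerprint(image_bytes))

    def is_blocked(self, image_bytes: bytes) -> bool:
        # Called at upload time; a match would trigger the policy pop-up.
        return self._fingerprint(image_bytes) in self._hashes


blocklist = PhotoBlocklist()
blocklist.block(b"removed-image-bytes")
print(blocklist.is_blocked(b"removed-image-bytes"))  # True
print(blocklist.is_blocked(b"new-image-bytes"))      # False
```

The trade-off hinted at in the article lives in `_fingerprint`: an exact digest never produces false positives but misses trivially altered copies, which is why trained reviewers still sit alongside the matching technology.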

Of course, it’s still largely up to victims to report instances of non-consensual pornography, but the fact that Facebook is taking significant steps to find a solution to the problem of photos being re-uploaded or posted is notable. CCRI worked with Facebook to create a guide on its site that helps people report images on a variety of websites, including other social media sites, image hosts like Flickr and even Google search results. There is also a Facebook-specific guide detailing how users can report images and strengthen their privacy settings. Other social media sites, like Twitter and Reddit, have implemented strict rules regarding revenge porn to try to combat it, but it may be that extra tools like those Facebook is deploying are necessary.

What can you do to protect yourself and your family?

Although this won’t help in the event someone takes photos or video of you without your knowledge or consent, thinking twice before you snap and share is wise. Remember that even people you think are trustworthy could betray you, whether intentionally as a form of revenge or simply through carelessness, such as passing along a photo to their friends. In a world where even non-photographic data can be used to blackmail and harass people, and scammers and identity thieves lurk around every corner, being extra cautious about what you post and share is not a bad thing. Reviewing your security and privacy settings, as well as using strong passwords, can go a long way. If you do become a victim, there are resources like the CCRI which offer legal advice, psychological support and help with getting content removed.

Parents and guardians should also remember that this is not an issue faced only by adults. Though up-to-date figures are difficult to come by, sharing sexual images is relatively common among teenagers these days, made easy by apps like Snapchat which “delete” photos and videos within a specific time limit.


The Philippines’ role in Facebook revenge porn, ‘sextortion’

Facebook’s leaked documents revealing cases of revenge porn and “sextortion” could implicate the Philippines as well, since the country has 40 million active users, an information technology (IT) specialist said on Thursday.

Revenge porn displays “sexually explicit photos/videos” of victims on the Internet without their permission, while sextortion uses sexual images to blackmail the victim in exchange for sex or money.

“The possibility that some accounts from the Philippines were involved with these acts is high, since Filipinos are very active in social media, and that our country is sadly known in the porn industry,” IT specialist Darrel Jed Costales said in an interview with The Manila Times.

In 2016 alone, the Philippine National Police-Anti-Cybercrime Group (PNP-ACG) conducted 40 police operations, arresting 150 people involved in extortion, cybersex operations and violations of the anti-photo and video voyeurism law.

Facebook documents leaked to the London-based Guardian revealed that the site had to assess 54,000 potential cases of revenge porn and sextortion – and 33 of the cases reviewed involved children. This led the social networking site to disable 14,000 accounts worldwide in January, The Guardian reported.

The newspaper revealed moderators were told to allow videos of abortions “to remain on Facebook as long as they do not contain nudity,” while video records of violent deaths do not have “to be deleted because they can help create awareness of issues such as mental illness.”

It said “handmade” art that shows nudity and sexual activity can remain on the site, but digital art showing sexual activity cannot.

The Guardian also revealed that non-sexual physical abuse and bullying of children need not be deleted “unless there is a sadistic or celebratory element.”

Facebook earlier disclosed that it had only 4,500 content moderators for its 1.94 billion users, which is why moderators have only seconds to decide what to delete; the website has become “too big, too quickly,” the report said.

Because of this, Facebook promised to hire more than 3,000 additional people to review content. But some organizations still find the situation disturbing.

As reported by BBC News, the British charity the National Society for the Prevention of Cruelty to Children (NSPCC) expressed its frustration with how Facebook works, describing the revelations as “alarming to say the least.”

“It needs to do more than hire an extra 3,000 moderators. Facebook, and other social media companies, need to be independently regulated and fined when they fail to keep children safe,” the organization said.

“Any suspicious sites that promote child pornography should be banned,” Adventist Development and Relief Agency (ADRA) International representative Geraldine Maleon Gutierrez also told The Manila Times.

She said Facebook should implement stricter plans and responsible actions for the controversy, adding that Facebook administrators “should be responsible enough in any posts that will endanger children from all forms and abuse, and have a mechanism that will help the authorities to track down and punish abusers.”

Gutierrez added that users have to be responsible and vigilant as well “in reporting any forms of abuse such as child pornography, revenge porn and other acts that will perpetuate abuses on children.”

“Users of social media sites should not share revenge porn; instead they should alert the authorities and concerned agencies to help the victims. Users should also be aware of the laws regarding children’s rights, such as Republic Act 7610 [An Act providing for stronger deterrence and special protection against child abuse, exploitation and discrimination, and for other purposes], so they can respond in a manner that protects the victim’s identity,” Gutierrez explained.

The Guardian said Facebook refused to comment on the figures in the documents but insisted that it “constantly reviews and improves its policies.”

“We get things wrong, and we’re constantly working to make sure that happens less often. We put a lot of detailed thought into trying to find right answers, even when there aren’t any,” wrote Monika Bickert, head of global policy management at Facebook, in her column in response to the issue.

She said, “I hope that readers will understand that we take our role extremely seriously. For many of us on the team within Facebook, safety is a passion that predates our work at the company.”
