{"id":41855,"date":"2023-08-10T11:30:16","date_gmt":"2023-08-10T11:30:16","guid":{"rendered":"https:\/\/pieroselvaggio.com\/?p=41855"},"modified":"2023-08-10T11:30:16","modified_gmt":"2023-08-10T11:30:16","slug":"amid-sextortions-rise-computer-scientists-tap-a-i-to-identify-risky-apps","status":"publish","type":"post","link":"https:\/\/pieroselvaggio.com\/2023\/08\/10\/amid-sextortions-rise-computer-scientists-tap-a-i-to-identify-risky-apps\/","title":{"rendered":"Amid Sextortion\u2019s Rise, Computer Scientists Tap A.I. to Identify Risky Apps"},"content":{"rendered":"

Almost weekly, Brian Levine, a computer scientist at the University of Massachusetts Amherst, is asked the same question by his 14-year-old daughter: Can I download this app?<\/p>\n

Mr. Levine responds by scanning hundreds of customer reviews in the App Store for allegations of harassment or child sexual abuse. The manual and arbitrary process has made him wonder why more resources aren\u2019t available to help parents make quick decisions about apps.<\/p>\n

Over the past two years, Mr. Levine has sought to help parents by designing a computational model that assesses customers\u2019 reviews of social apps. Using artificial intelligence to evaluate the context of reviews with words such as \u201cchild porn\u201d or \u201cpedo,\u201d he and a team of researchers have built a searchable website called the App Danger Project, which provides clear guidance on the safety of social networking apps.<\/p>\n

The website tallies user reviews about sexual predators and provides safety assessments of apps with negative reviews. It lists reviews that mention sexual abuse. Though the team didn\u2019t follow up with reviewers to verify their claims, it read each one and excluded those that didn\u2019t highlight child-safety concerns.<\/p>\n

\u201cThere are reviews out there that talk about the type of dangerous behavior that occurs, but those reviews are drowned out,\u201d Mr. Levine said. \u201cYou can\u2019t find them.\u201d<\/p>\n

Predators are increasingly weaponizing apps and online services to collect explicit images. Last year, law enforcement received 7,000 reports of children and teenagers who were coerced into sending nude images and then blackmailed for photographs or money. The F.B.I. declined to say how many of those reports were credible. The incidents, which are called sextortion, more than doubled during the pandemic.<\/p>\n

Because Apple\u2019s and Google\u2019s app stores don\u2019t offer keyword searches, Mr. Levine said, it can be difficult for parents to find warnings of inappropriate sexual conduct. He envisions the App Danger Project, which is free, complementing other services that vet products\u2019 suitability for children, like Common Sense Media, by identifying apps that aren\u2019t doing enough to police users. He doesn\u2019t plan to profit off the site but is encouraging donations to the University of Massachusetts to offset its costs.<\/p>\n

Mr. Levine and a dozen computer scientists investigated the number of reviews that warned of child sexual abuse across more than 550 social networking apps distributed by Apple and Google. They found that a fifth of those apps had two or more complaints of child sexual abuse material and that 81 offerings across the App and Play stores had seven or more of those types of reviews.<\/p>\n


Their investigation builds on previous reports of apps with complaints of unwanted sexual interactions. In 2019, The New York Times detailed how predators treat video games and social media platforms as hunting grounds. A separate report that year by The Washington Post found thousands of complaints across six apps, leading to Apple\u2019s removal of the apps Monkey, ChatLive and Chat for Strangers.<\/p>\n

Apple and Google have a financial interest in distributing apps. The tech giants, which take up to 30 percent of app store sales, helped three apps with multiple user reports of sexual abuse generate $30 million in sales last year: Hoop, MeetMe and Whisper, according to Sensor Tower, a market research firm.<\/p>\n

In more than a dozen criminal cases, the Justice Department has described those apps as tools that were used to ask children for sexual images or meetings \u2014 Hoop in Minnesota; MeetMe in California, Kentucky and Iowa; and Whisper in Illinois, Texas and Ohio.<\/p>\n

Mr. Levine said Apple and Google should provide parents with more information about the risks posed by some apps and better police those with a track record of abuse.<\/p>\n

\u201cWe\u2019re not saying that every app with reviews that say child predators are on it should get kicked off, but if they have the technology to check this, why are some of these problematic apps still in the stores?\u201d asked Hany Farid, a computer scientist at the University of California, Berkeley, who worked with Mr. Levine on the App Danger Project.<\/p>\n

Apple and Google said they regularly scan user reviews of apps with their own computational models and investigate allegations of child sexual abuse. When apps violate their policies, they are removed. Apps have age ratings to help parents and children, and software allows parents to veto downloads. The companies also offer app developers tools to police child sexual material.<\/p>\n

Dan Jackson, a spokesman for Google, said the company had investigated the apps listed by the App Danger Project and hadn\u2019t found evidence of child sexual abuse material.<\/p>\n

\u201cWhile user reviews do play an important role as a signal to trigger further investigation, allegations from reviews are not reliable enough on their own,\u201d he said.<\/p>\n

Apple also investigated the apps listed by the App Danger Project and removed 10 that violated its rules for distribution. It declined to provide a list of those apps or the reasons it took action.<\/p>\n

\u201cOur App Review team works 24\/7 to carefully review every new app and app update to ensure it meets Apple\u2019s standards,\u201d a spokesman said in a statement.<\/p>\n

The App Danger Project said it had found a significant number of reviews suggesting that Hoop, a social networking app, was unsafe for children; for example, it found that 176 of 32,000 reviews since 2019 included reports of sexual abuse.<\/p>\n

\u201cThere is an abundance of sexual predators on here who spam people with links to join dating sites, as well as people named \u2018Read my picture,\u2019\u201d says a review pulled from the App Store. \u201cIt has a picture of a little child and says to go to their site for child porn.\u201d<\/p>\n

Hoop, which is under new management, has a new content moderation system to strengthen user safety, said Liath Ariche, Hoop\u2019s chief executive, adding that the researchers spotlighted how the original founders struggled to deal with bots and malicious users. \u201cThe situation has drastically improved,\u201d the chief executive said.<\/p>\n

The Meet Group, which owns MeetMe, said it didn\u2019t tolerate abuse or exploitation of minors and used artificial intelligence tools to detect predators and report them to law enforcement. It reports inappropriate or suspicious activity to the authorities, including a 2019 episode in which a man from Raleigh, N.C., solicited child pornography.<\/p>\n

Whisper didn\u2019t respond to requests for comment.<\/p>\n

Sgt. Sean Pierce, who leads the San Jose Police Department\u2019s task force on internet crimes against children, said some app developers avoided investigating complaints about sextortion to reduce their legal liability. The law says they don\u2019t have to report criminal activity unless they find it, he said.<\/p>\n

\u201cIt\u2019s more the fault of the apps than the app store because the apps are the ones doing this,\u201d said Sergeant Pierce, who offers presentations at San Jose schools through a program called the Vigilant Parent Initiative. Part of the challenge, he said, is that many apps connect strangers for anonymous conversations, making it hard for law enforcement to verify complaints.<\/p>\n

Apple and Google make hundreds of reports annually to the U.S. clearinghouse for child sexual abuse but don\u2019t specify whether any of those reports are related to apps.<\/p>\n

Whisper is among the social media apps that Mr. Levine\u2019s team found had multiple reviews mentioning sexual exploitation. After downloading the app, a high school student received a message in 2018 from a stranger who offered to contribute to a school robotics fund-raiser in exchange for a topless photograph. After she sent a picture, the stranger threatened to send it to her family unless she provided more images.<\/p>\n

The teenager\u2019s family reported the incident to local law enforcement, according to a report by the Mascoutah Police Department in Illinois, which later arrested a local man, Joshua Breckel. He was sentenced to 35 years in jail for extortion and child pornography. Though Whisper wasn\u2019t found responsible, it was named alongside a half dozen apps as the primary tools he used to collect images from victims ranging in age from 10 to 15.<\/p>\n

Chris Hoell, a former federal prosecutor in the Southern District of Illinois who worked on the Breckel case, said the App Danger Project\u2019s comprehensive evaluation of reviews could help parents protect their children from issues on apps such as Whisper.<\/p>\n

\u201cThis is like an aggressively spreading, treatment-resistant tumor,\u201d said Mr. Hoell, who now has a private practice in St. Louis. \u201cWe need more tools.\u201d<\/p>\n

Tripp Mickle<\/span> covers technology from San Francisco, including Apple and other companies. Previously, he spent eight years at The Wall Street Journal reporting on Apple, Google, bourbon and beer.<\/span><\/p>\n

Source: Read Full Article<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"

Almost weekly, Brian Levine, a computer scientist at the University of Massachusetts Amherst, is asked…<\/p>\n","protected":false},"author":4,"featured_media":41854,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[13],"tags":[],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/posts\/41855"}],"collection":[{"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/comments?post=41855"}],"version-history":[{"count":0,"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/posts\/41855\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/media\/41854"}],"wp:attachment":[{"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/media?parent=41855"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/categories?post=41855"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/pieroselvaggio.com\/wp-json\/wp\/v2\/tags?post=41855"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}