NEW YORK, United States — Lily Garcia, an office manager living in Orange County, Calif., joined the clothing resale platform Depop three years ago to make some extra money selling items cluttering up her closet.
Garcia, now 25, heard about Depop from friends. The London-based start-up was already wildly popular among young fashion lovers, who flocked to the platform’s mix of peer-to-peer commerce and features cribbed from Instagram. A fan of vintage and fast fashion, Garcia quickly found a community of like-minded shoppers.
Pretty soon, though, Garcia began receiving sexually suggestive messages.
“Inquiries that I thought were supposed to be about my items were instead about my feet, my body, or if I could wear the clothes a few more times before I sold it,” Garcia told BoF. “Pretty soon, a lot of messages were just awkward, uncomfortable conversations.”
When Garcia would investigate her harassers’ profiles, she noticed they didn’t have items listed for sale and hadn’t left reviews, indicating they likely hadn’t bought anything off the platform either. They had joined Depop, she surmised, simply to prowl.
“Kids are using Depop to buy clothes and make money, but there are tons of these creeps on it trying to get their kick,” Garcia said. “I’ve sold to minors and I bet their parents for sure don’t know what type of messaging they’re getting. And that’s really scary.”
Since its founding in 2011, Depop has been hailed as one of the most successful start-ups in the fashion resale space. Users have sold over $500 million worth of merchandise, and the company has raised over $100 million in funding. The company boasts 16 million registered users and estimates 90 percent of them are under the age of 26.
Depop’s popularity with Gen Z can be explained by its interface; its home feed, explore tab and user profiles closely resemble Instagram’s. Stories about young users becoming successful entrepreneurs from their bedrooms helped build the platform’s popularity.
Like Instagram, Depop also has a popular private messaging function, which is meant for users to communicate about products. Depop users send about 20 million messages a month.
But as the app’s user base has swelled, so have complaints about inappropriate and even predatory behaviour. As Depop, which has 219 employees, struggles to police millions of users across multiple continents, the start-up is reckoning with dangers facing its young users that have plagued the internet since the early 1990s.
In interviews with nearly a dozen Depop users, BoF heard numerous reports of women being inundated with sexual messages. The targets include minors, who told BoF they had been asked for nude photos, for personal information and to perform sexually suggestive acts, such as wearing clothes before buyers purchased them. In some cases the harassment had gone on for years, and although the company has kicked accounts off the platform, some users feel Depop’s response has been inadequate.
Dominic Rose, Depop’s chief operating officer, said the company is aware of the issue, and that the start-up has a “zero-tolerance approach” for abusive behaviour and other forms of harassment. He said Depop is working to address the problem and is developing a tool that would enable the company “to detect abuse and harassment faster,” but declined to share specifics. Depop also employs a community experience team to monitor complaints and kick users off the platform.
“It's something that we take extremely seriously,” Rose said. “Like any online platform, we've got to be constantly vigilant about how we keep our community safe on Depop. We have clear terms of service, but of course we also know that there will always be people who try and break these rules and the responsibility is on us to make sure that those rules are clear, that they're up to date, and that we enforce them consistently.”
Members of Depop’s community experience team include Depop buyers and sellers, who were recruited by the start-up. A press representative for Depop said members of the team “take part in a rigorous and consistent training program led by Depop to handle all inquiries including inappropriate or predatory behaviour, harassment, and bullying.”
Depop allows users 13 and up to join its social shopping platform. It processes payments through PayPal, where the age requirement is 18, but many users sign up through their parents’ PayPal accounts. Depop’s Community Guidelines state that it does not tolerate harassment.
Depop’s strategy of hiring community members to monitor platform issues mirrors that of other social networks, which have relied on users or third-party contractors to respond to complaints. Child safety experts say these measures often fail to address the core problem of internet predators.
“You need people with technical expertise and a professional understanding in sexual predation ... 10 hours of training is not enough to handle something like this,” said Josh Golin, executive director of the Campaign for a Commercial Free Childhood, a Washington D.C.-based nonprofit.
Meanwhile, the flow of inappropriate messages continues at Depop. A Reddit forum devoted to the platform contains numerous posts from users discussing traumatic encounters. There’s even an Instagram account, Depop Drama, dedicated to troubling messages users have received (“Hi there. I know this may sound weird but would you be able to put the bottoms on and urinate in them? I’ll pay anywhere up to 50 pounds for the set if you do!” one recent post read). The account has 186,000 followers.
Some Depop users say they navigate the platform knowing it’s “crawling with creeps,” as one user put it.
***
Depop is hardly the first digital marketplace to come face to face with the dark side of the internet. TikTok, the viral short-video platform, was described in February as a “hunting ground” for child predators by the National Society for the Prevention of Cruelty to Children, a UK-based organization. That month, TikTok was also fined $5.7 million by the US Federal Trade Commission for illegally collecting children’s data; it’s since debuted a safer, child-friendly app that does not allow young users to livestream with each other. TikTok declined to comment.
YouTube has also faced criticism for not addressing predatory behaviour. Users who wrote sexually suggestive comments on videos depicting children were allowed to operate on the service for years. In March, a vlogger published an investigation into how the site’s algorithm had created a “wormhole into a soft-core pedophile ring.” Brands like Disney and Epic Games pulled ads, and the Google-owned video giant agreed to close the comments section on videos featuring children and said it would ban some users. YouTube did not respond to a request for comment.
The ability to send messages to anyone — as opposed to friends — is a common thread across these services. So is a hands-off approach, at least initially, by tech companies that are primarily concerned with growth.
“Today everyone’s business model seems to be to build a site and get as many users as possible, and unfortunately, I don’t think most sites take child safety seriously,” said Golin. “Kids should never be exposed to these things in the first place — not have to report them once it happens.”
Emily Brougham, a 20-year-old Depop user from London who says she receives messages with inappropriate content two to three times a week, believes she’s a target on Depop because, like many users, she models in the clothing and lingerie that she sells.
“I could post my clothes flat, on a bed, but that doesn’t look as good,” she said.
Brougham said her photos are taken with the intention of selling clothes, not to attract unwarranted sexual behaviour. Nikol Kantardzhieva, a university student in London, said she doesn’t post provocative photos and is still inundated with solicitations.
“My body is what some may deem as 'curvy' so it's difficult to wear any form of clothing without looking a certain way,” she said. “There have been instances where I have had to delete my items for sale as it would constantly receive the same messages from men.”
Martha Kirby, a child safety online policy manager with the National Society for the Prevention of Cruelty to Children, said Depop is precisely the type of platform that attracts pedophiles.
“Perpetrators migrate to wherever there are sites with young people,” she said. “It’s quite common for this to happen on newer platforms.”
Some minors BoF spoke with said they ignore solicitations on Depop, but that the experience still affects them.
“Receiving unsolicited sexual attention can be really harmful for teenage girls,” said Alice, a 17-year-old Depop user from Northamptonshire (BoF is withholding the last names of minors). “I know that feeling violated like that sparks feelings of insecurity or self-hatred.”
Others admitted to accepting the requests, agreeing to wear products before selling them, or continuing to interact with predatory users in order to nab a sale.
Golin, of the CCFC, noted that although some teenagers might treat these as casual interactions that won’t leave the web, many aren’t fully aware of what their actions mean or what feeding into sexual fetishes entails. And even fully internet-savvy teens can fall into traps.
“We should not pretend 16-year-olds are fully aware of the predation of adults,” he said. “This is why we have laws about age, content and what consent means.”
Delaney Love, a user who's sold over 700 items on Depop, admitted there are aspects of the app that make her feel uneasy.
“It's easy to block people but at the same time, you can encounter those really weird people that will easily just find you,” Love said. “People write that they live in the US, for example. Mine says I live in my town of Alabama. It’s not a big town.”
Garcia, the Depop user from California, said she successfully blocked users who sent her inappropriate messages. The messages would stop, but days later she’d still see those accounts liking her photos.
Rose, Depop’s COO, said users should both block and report inappropriate exchanges, because reported accounts lead Depop to take “immediate action, which does include banning users who fail to live up to our terms.”
Some users, though, say their reports of problematic accounts have gone unanswered.
“I'm pretty sure I've reported a few, and I didn't hear back from Depop,” said Alice. “I also tweeted them, but still no reply. I'm unsure as to why they seem to be silent on this, because it's a common and pressing issue.”
Rose said Depop is “aware that we don't necessarily catch every single instance of misuse on the platform” but that the company will “continue to invest in our methods to better identify and remove those bad actors from our community as well as improving the education and awareness of misuse amongst the community.”
***
In order to create a safe marketplace for young users, child safety advocates say a first step is for Depop to disable its open messaging function (TikTok took this approach when it debuted its kids’ app earlier this year).
Users believe Depop also needs tougher restrictions on joining the platform, such as screening prospective members to verify they are there to shop or sell.
Installing such barriers could change the very nature of Depop, Rose said, and dampen the democratic, free-for-all vibe users have come to love.
“In terms of ability to sign up, we're an open community and I suppose that the idea of transparency is very key and core to us,” Rose said. “We see ourselves as a social marketplace, and we believe that transparency, the ability to see other participants in the community, is actually core to enabling that social marketplace to thrive and to succeed.”
But experts believe such severe measures are necessary when tackling a major issue like internet safety.
“There's always going to be a bit of balance in safety and privacy and the openness platforms allow,” said Kirby of the NSPCC. “We always recommend that messaging apps have privacy settings so that young people should only interact with their friends. It doesn’t mean the child can’t peel back the layers if they want to, but that is their choice.”
Additional reporting by Sophie Soar.