Elon Musk has said he could halt child trafficking on Twitter, the social media company he acquired for $44 billion.
Less than a month ago, he spoke about the problem and declared it his top priority.
However, there is little evidence that Twitter has been acting aggressively on the issue since he took over.
Musk has not invested much in removing child exploitation content from Twitter, according to interviews with four former employees, one current employee, internal company records, and people working toward the same objective.
Musk has turned the conversation about online safety into a sustained attempt to discredit Twitter’s former executives.
He is also wielding his ownership as part of a social movement against far-left to center-left values, which he calls “the woke mind virus.”
The shift came after he embraced far-right online rhetoric, which frequently makes exaggerated allegations of child sexual abuse.
“It is a crime that they refused to take action on child exploitation for years,” Musk tweeted on Friday.
His remark came in response to the resignation, via letter, of a member of Twitter’s Trust and Safety Council who focused on child abuse concerns.
Jack Dorsey, the previous CEO, countered, “This is false.”
Under Musk’s new leadership, Twitter announced that it suspended more accounts for child sexual abuse content in November than in previous months.
It credited the suspensions to new partnerships with unnamed organizations and new “detection and enforcement measures.”
However, the company now has limited resources to combat online child sexual abuse:
Although Twitter’s headcount is still in flux, internal documents obtained by NBC News and CNBC show that only 25 of the roughly 1,600 people still working at the company had titles connected to “Trust and Safety.”
That total includes more than 100 people Musk approved to work at Twitter from Tesla, SpaceX, The Boring Company, and various investors and advisers.
A former employee who focused on child safety said a small team at Twitter is still working on the problem.
However, most of the team’s product managers and engineers have left.
The former employee requested anonymity out of concern for reprisals.
Twitter’s workforce grew to more than 7,500 by the end of 2021.
Former employees say layoffs were likely even if Musk hadn’t bought the business.
Child safety groups
Under the current management, Twitter has scaled back its commitments to external organizations that support child safety.
The company’s Trust and Safety Council, which included 12 organizations that advised Twitter on its efforts to address child sexual exploitation, was disbanded on Monday.
Little has changed under Musk’s leadership, according to the National Center for Missing & Exploited Children (NCMEC), an agency the US government has charged with monitoring allegations of child sexual abuse content online.
Gavin Portnoy, a spokesperson for NCMEC, commented on the organization’s unified CSAM reporting system:
“Despite the rhetoric and some of what we’ve seen people posting online, their CyberTipline numbers are almost identical to what they were prior to Musk coming on board.”
The absence of Twitter from the organization’s annual social media roundtable was another shift noted by Portnoy.
“The previous person was one of the folks who resigned,” he said.
According to Portnoy, Twitter declined when asked whether it wanted to send a replacement.
Twitter reported 86,666 instances of CSAM discovered on the service in 2021; however, Portnoy believes the actual number may be higher.
“We’ve always felt that there should have been more reports coming out of Twitter, no matter how you cut it, and just given the sheer number of users that are there,” he said.
Child sexual exploitation content continues to plague Twitter, though the problem affects most social media sites.
Earlier this year, Twitter advertisers departed after discovering that their ads frequently appeared next to such content.
Last year, a child sexual abuse victim and their mother sued the company, alleging that it failed to act quickly after being told that a video of the child was circulating on the platform.
Finding and removing child abuse content requires automated detection technology, internal specialist teams, and outside contractors.
According to Twitter’s rules, the prohibited content includes:
“Imagery and videos, referred to as child pornography, but also written solicitation and other material that promotes child sexual exploitation.”
According to people with knowledge of the situation and internal records, Twitter’s engineering staff, which included numerous employees and leaders who worked on trust and safety features and improvements to the platform, was reduced by more than half through layoffs, firings, and resignations.
Ella Irwin, Twitter’s current head of Trust and Safety, said Musk also laid off contractors as the company turned to automation for its moderation needs.
“You tend to think more bodies equals more safety,” said Portnoy.
“So, I mean, that is disheartening.”
It remains unclear exactly how many Twitter employees are still working on child safety issues.