Gaige Grosskreutz wasn’t even out of the hospital when his phone started blowing up. Shot point-blank in the arm with an AR-15, he was the only person to survive a triple shooting at a protest condemning the shooting of Jacob Blake by Kenosha police.
Weeks later, the messages haven’t stopped. And while some are encouraging, most are ugly, even threatening. In some corners of the internet, Grosskreutz, 26, has become the target of angry white supremacists who think he and others who support Black Lives Matter should be stopped by any means necessary — including homicide.
His family and friends — people who had never protested in Kenosha — got frightening messages, too. The online harassment made its way into their neighborhoods, with strangers showing up at their homes to find out “what really happened” the night Grosskreutz was shot.
“And that’s the thing that affects me, seeing the people that I care about be upset for me, scared for me,” Grosskreutz said. “I just don’t understand the need to target people who weren’t even there.”
Kyle Rittenhouse, 17, who considered himself part of a militia, has been charged with five felonies for wounding Grosskreutz and killing two other men, Joseph Rosenbaum and Anthony Huber.
Huber’s girlfriend, Hannah Gittings, also has received online threats, according to her friend Danielle Rasmussen, who sponsored an online fundraiser for her. On the fundraising site, people have donated $5 to gain access and leave a nasty message, then gotten the money refunded, she said. They’ve posted laughing emojis in reaction to posts about Huber’s death and sent mocking texts to Rasmussen’s husband.
“They’re doing that instead of being part of the solution,” she said. “Holding people accountable and doing the right thing, sometimes you have to have tough skin.”
Along with shining the spotlight on Wisconsin, a crucial state in the upcoming presidential election, the shootings have laid bare the extent of online harassment and its effects. It’s a problem that makes victims of violence unwitting pawns in ideological arguments, forcing them to delete their social media accounts, change their phone numbers and even move. It’s almost impossible to stop, experts say, due to the combination of ineffective criminal laws, ignorant police agencies and an unregulated internet.
And it’s nothing new. Over the past decade, every time an incident of police brutality, a mass shooting or a high-profile crime occurs, online attacks follow — not just for surviving victims, but for their families, their attorneys and the journalists who cover their stories.
“It’s such a challenging time that we’re living through,” said Jessie Daniels, a sociology professor at Hunter College in New York. “On the one hand, people are using social media to galvanize people against white supremacists, in support of Black Lives Matter and to point out the brutality of police killings. At the same time, those very tools people are using for social justice can be turned on them in very pernicious ways.”
Sandy Hook was not a hoax
On Dec. 14, 2012, Leonard “Lenny” Pozner’s 6-year-old son, Noah, died at Sandy Hook Elementary School in Newtown, Conn. Twenty children and six adults were fatally shot that day.
“My life prior to that tragedy was a completely different life compared to everything that happened after that,” Pozner told the Journal Sentinel recently. “This is my new life: I am a parent of a murdered child who is part of an internet conspiracy.”
Before, Pozner was a father of three who worked in information technology and sometimes tuned in as Alex Jones spouted outlandish theories about 9/11 and the Kennedy assassination. After, Pozner was left to parent only his two girls while being stalked and terrorized by people accusing him of being an anti-gun “crisis actor” who never had a son — and worse.
Pozner quickly learned there was little he or anyone in law enforcement could do to stop them.
He contacted Jones, who publicly claimed the mass shooting at Sandy Hook was a hoax, to no avail. The grieving father then joined a Facebook group of conspiracy theorists, making himself available to answer their questions. He changed at least one woman’s mind, he said. She had young children and just couldn’t fathom they could be shot at school. After she left the group, he said, she became the target of online harassment, too.
Law enforcement often made the problem worse, Pozner said. Although he pleaded with authorities not to include his address in complaints, which are public record, they often did. As a result, he’s had to move over and over again. In one case, Pozner outed a stranger who filed baseless child abuse complaints against him, only to have a detective threaten him for harassing his tormentor, he said. Noah’s mother had to move as well.
Federal and state authorities were no better. In response to a records request, a state attorney general provided Pozner’s entire complaint — without blacking out his personal information as the law allows — to a conspiracy theorist who then posted it online.
Nearly a decade after his son’s death, Pozner continues to live in hiding.
“There is nothing there to protect you when it comes to the internet unless you’re willing to fight like hell,” he said. “Most people will just give up. I didn’t think I had the option to. I was on a one-way track: It was just keep fighting this or die.”
History of online harassment
Experts mark 2013, the same year Pozner’s harassers came out in force, as the beginning of coordinated online harassment and disinformation campaigns. These efforts were — and still are — bolstered by algorithms that elevate “content that’s hot, that’s hate-filled, that makes people angry and gets lots of reactions,” according to Daniels.
Perhaps the best known early example of such a campaign was Gamergate, in which female video game developers were not only vilified online but driven from their jobs and forced to flee their homes.
“If Facebook and Twitter had really taken a hard stance against bigotry and harassment in the wake of Gamergate, if they had learned their lesson, then we would be in a very different position now,” said Whitney Phillips, an assistant professor of communications at Syracuse University. “But instead, they did nothing. They continued incentivizing or at least tolerating this kind of behavior.”
Coordinated antagonism based in identity continued, she said, and ramped up during Donald Trump’s 2016 presidential campaign.
“And when Trump was elected, of course, he normalized them,” she said.
Daniels, author of the books “Cyber Racism” and “White Lies,” takes the argument a step further.
“White supremacists have felt so empowered lately because they’re getting their actions and their statements validated from the highest office in the land,” she said. “And that’s pretty intoxicating.”
Foreign governments also are actively engaged in disinformation campaigns designed to fuel divisions and elevate white supremacy, Daniels said. The problem is exacerbated when people who don’t necessarily espouse neo-Nazi beliefs repost content about being sick of partisan bickering or not trusting the media.
“To have a politics of social justice, it relies on … some sort of shared belief that there’s truth and there’s stuff that’s not true,” Daniels said. “The president has been an expert at fueling the idea that there is no shared belief, which erodes the ground beneath human rights and social justice.”
Counteracting online threats
Because of her work, Phillips also has been threatened online. Police are of little help for several reasons, she said. One problem is that it takes time to track down the source of anonymous threats, especially if there are hundreds or thousands of them coming in. If a police department has limited resources, such threats often aren’t a priority.
Another issue is the legal definition of what constitutes a threat. The difference is subtle. Saying, for example, “I hope someone comes to your house and kills you” is protected speech under the First Amendment and can’t be prosecuted. But saying, “I’m going to come to your house and kill you” may be a crime.
“The kinds of utterances that actually would be actionable by law enforcement is really small compared to the kind of harassment that people receive,” Phillips said. “A lot of it is more diffuse and nebulous, and equally scary and equally threatening. But the law doesn’t see it as such.”
Society has been slow to realize online interactions can have real-world consequences, she said. It’s not enough to turn off the computer or log off social media.
“For too many years, and still some people believe — which is shocking, but they do — that there is somehow something fundamentally different between the offline world and the online world, and if something happens online, it’s not really real,” she said.
Online hatred morphed into murder in 2017 in Charlottesville, Va., when a member of a Facebook group of white supremacists, James Fields, intentionally drove his car into a group of protesters, killing 32-year-old Heather Heyer.
Heyer was demonstrating in opposition to a “Unite the Right” rally organized by hate groups. In the days after Heyer’s death, Facebook was roundly criticized for being slow to remove a post promoting the event. The following year, Facebook CEO Mark Zuckerberg told Congress hate groups are not allowed on Facebook.
But after the Aug. 25 triple shooting in Kenosha, Facebook received similar backlash for failing to quickly remove an event called “Armed Citizens to Protect our Lives and Property.” The invitation, which was linked to a self-styled militia group known as the Kenosha Guard, was reposted by Jones’ alt-right website, Infowars.
“Any patriots willing to take up arms and defend the City tonight from the evil thugs?” one of the group’s posts read. “No doubt they are currently planning on the next part of the City to burn tonight!”
Other posts encouraged people to “lock and load,” and to slash protesters’ tires, put sugar in their gas tanks and mark their cars with paintballs so they could be easily followed.
When people reported those posts, Facebook replied they did not violate community standards. Grosskreutz said he received the same response when he reported threats to Facebook.
“The platform companies really have to take responsibility for creating these tools that make us all so vulnerable,” Daniels said. “They have really thrown gasoline on the fire of white supremacy in the United States and globally.”
Facebook did not respond to an email request for comment.
During a company meeting, first reported by BuzzFeed, Zuckerberg described the shootings as “really deeply troubling.” In a video of his remarks, which was publicly posted later, he called the failure to remove the Kenosha Guard posting until after Huber and Rosenbaum were killed “largely an operational mistake.”
In the video, Zuckerberg also said the posting violated a policy against dangerous organizations.
“This page and the militia, the Kenosha Guard page and event, violated this new policy we put in place a couple weeks ago against — that included QAnon and other militia groups that we worried could be trying to organize violence now, in this volatile period and especially as we get closer to the election — and after the election — when I think there’s a significant risk of civil unrest as well,” he said.
In the years since his son was killed at Sandy Hook, Pozner has filed numerous lawsuits. Perhaps his most important victory, he said, came when Jones admitted, under oath, that Noah was a real person who died in his classroom as a result of a mass shooting.
“I don’t need him to go away,” Pozner said of Jones and Infowars. “He has every right to scream with his last breath, as long as he’s not talking about me and pouring salt on my wound.”
Pozner also has won financial settlements against Jones and others who have publicly accused him of lying about his son’s death or encouraged others to harass him. One woman who threatened him, Lucy Richards, was convicted of a federal crime and sentenced to five months in prison.
Pozner has made it his life’s work to help others whose lives are disrupted by online threats. In 2014, he founded the HONR Network, a nonprofit that works to remove harmful posts and to improve platforms’ anti-harassment policies. Its members also advocate for more government regulation of the internet.
“I never considered that it was a choice to not do what I’m doing,” Pozner said. “My response doesn’t have a decision process with it. This is just the only way I could have responded.”
Working with about 300 volunteers worldwide, the network has gotten hundreds of thousands of pieces of content taken down, he said. In recent months, the group has assisted Maatje Benassi, a U.S. Army reservist falsely accused of starting the coronavirus pandemic. A single YouTube channel featured 4,000 videos publicizing the lie, which has led to death threats against Benassi and her family. HONR worked with YouTube to get the videos removed.
Due to the nonprofit’s efforts, Facebook and YouTube have updated their policies to better protect victims of violence, Pozner said.
Twitter, he said, has been less cooperative.
“They’re not like the other companies,” he said. “The volume of content they have to deal with is greater, but Twitter is responsible for the misinformation, hate and probably a lot of crimes that go on because that’s how the ideas spread.”
Twitter did not respond to an email request for comment.
The platform’s online help center says this: “You can report Tweets, profiles, or Direct Messages directly to us. Twitter may take action on the threatening Tweet, Direct Message, and/or the responsible account.
“However, if someone has Tweeted or messaged a violent threat that you feel is credible or you fear for your own or someone else’s physical safety, you may want to contact your local law enforcement agency.”
The push for government regulation
The only way to stop the barrage of online hate, experts agree, is more government regulation.
Pozner likes to use this analogy, inspired by a 2015 story in the Detroit News:
At the turn of the 20th century, America’s newest technology was the automobile. People started buying cars without knowing how to use them safely. They drove wherever they wanted, in every direction, at every speed. They parked on lawns. People were dying in crashes. Children playing in the street were routinely hit by cars and killed.
Eventually, the government stepped in. Cities set speed limits and started doing traffic control, but that wasn’t enough. They then put up stop signs and traffic lights, painted crosswalks and designated no-parking zones. Finally, authorities set up rules of the road and required people to pass safety tests and get licenses in order to drive. Now, entire government agencies are dedicated to automobile safety.
In Pozner’s view, a similar evolution needs to take place when it comes to the internet.
“I don’t think this is going away, this hate,” he said. “I think it’s only going to get worse until the government steps in.”
Rory Linnane of the Journal Sentinel staff contributed to this report.
If you are being harassed or threatened online, contact the HONR network at https://www.honrnetwork.org/report-online-abuse.
Contact Gina Barton at (414) 224-2125 or firstname.lastname@example.org. Follow her on Twitter at @writerbarton.