
Addressing the Alt-Right Pipeline

It’s no secret your social media algorithm knows you better than you know yourself. What you click, what you ignore, and how long you stare at a post feed algorithms a slew of information that they collect, store, and use to keep you on the app for as long as possible. While this feature has been great for business, the notorious algorithm has faced much scrutiny in recent years over how this information is used, usually in the context of targeted ads. But beyond advertising, the core function of the algorithm has now been hijacked by political extremists to find their perfect targets.

While in the past the support of close friends and family helped kids weather the worst stages of their development, in the digital age social media and online forums provide that support for those at their worst. For the alt-right movement, it is now easier than ever to lure in a vulnerable populace looking for validation. Insecurity, self-doubt, and rejection are normal feelings faced while coming of age, but they are now the key traits the alt-right uses to steer people toward its online content. What may begin as an interest in self-help quickly devolves into homophobia, racism, sexism, and antisemitism under the guise of empowerment.

While extremists have been using the internet to expand their following for decades, social media algorithms now do the heavy lifting, spoon-feeding this content directly to the individuals most likely to interact with it and fall down the alt-right pipeline.

The Culture

The alt-right pipeline is the path people follow as they engage with online content, forums, and peers that take them further toward the right-wing political extreme. Once in this pipeline, values are heavily distorted from reality to overcompensate for active progress in society. The distortion is clearest through the lens of gender norms, where what those outside the pipeline would call “toxic masculinity” is the encouraged form of traditional behavior. The pipeline glamorizes and idealizes the rigid gender norms of the 1950s and pairs them with strong opposition to higher education and academic institutions (opting instead for conspiracies and unregulated forums to develop its version of the facts), all within one of the strongest examples of a mass echo chamber.

Those who fall down the pipeline are often described as having been red-pilled, a reference to the film The Matrix, in which taking a red pill awakens the hero to the true nature of reality. In the pipeline, the term casts those who fall victim to the distortion of reality as having finally found the “truth,” positioning them as more aware than everyone else and building up ideas of entitlement and superiority. Red-pill is now viewed as a common dog whistle because it presents bigoted, extremist, unsupported worldviews as the “truth.”

The pipeline, at its core, is made by men and for men. In a modern society that is evolving past rigid gender roles and pushing toward equality, some men irrationally fear that their privilege is under threat. “Alpha Males” build strong online personas or real-life personalities around the idea of traditional roles: men are the strong providers, women must be subservient to men, and in today’s “unnatural progressive society,” men must remind women of their place in life. The term originates from the idea that wolf packs are led by an alpha, and that because this is found in nature, it is the natural dynamic for human relationships. Unsurprisingly, the study that popularized this concept was disproved soon after, but it remains a common misconception.

The overall movement is a reactionary response to losing power, coupled with severe denial that what is being lost is privilege; it pushes the idea that men are now the ones being oppressed and that feminism is the enemy. In reality, more women than men are now college educated, it is more common for women to support themselves economically without men, and with roughly half of marriages ending in divorce, women hold higher standards in a partner. The rhetoric of the looming threat of ending up a spinster loses its power when women aren’t dependent on men and are statistically less likely than men to end up single: a Pew Research study found that “men are now more likely than women to be unpartnered, which wasn’t the case 30 years ago.” Women’s growing independence from men added fuel to the alt-right fire, and the incel was born. Incels are best described as men who are celibate, though not by choice, and those who self-identify blame feminism, women, and the left for their inability to find love. Despite the obvious absurdity of the incel movement, it has real, dangerous impacts: hate crimes against women, attempted shootings, and shootings. It has now prompted federal investigations under the name of “incel terrorism.”

While it is a male-dominated echo chamber, this isn’t to say women don’t fall for the same tricks. Internalized misogyny describes the phenomenon of women holding the same misogynistic beliefs pushed by society. This is seen in “pick me” culture, where women claim they aren’t like other women; they are better than that. Common sentiments run along the lines of “I’m not like other girls, I like ____,” where the blank is any predominantly masculine interest: sports, working out, grilling, eating meat, even the color blue. The thought process boils down to this: misogyny tells women they are weak, dumb, and helpless, and a woman who does not see herself that way feels the need to separate herself from the negative traits the patriarchy assigns to femininity. The ultimate goal is male approval, lifting yourself up by putting other women down. The average insecure woman who grew up in our patriarchal society can find a home within the pipeline: she gets the male validation of being “one of the good ones,” and men get a woman to point to when accused of misogyny, as in, “if I hate women, why do I have female friends?”

The Content 

Where do this content and discourse take place? Well-known platforms like Twitter, YouTube, and TikTok are common entry points to the pipeline. Certain genres of content, while not inherently problematic, can quickly become breeding grounds for alt-right rhetoric. Fitness, gaming, meme, and finance content can go from an innocent interest to a source of malicious, politically charged, and unopposed propaganda. In fact, most “Alpha Male” podcasts build their audiences by promoting content in those four genres, and they have had high rates of success drawing in the perfect captive audience.

From these more common platforms, Twitch, Reddit, and other forums or streaming platforms offer those who take the bait a deeper dive and more people with similar views to validate one another, until they have officially fallen victim to the pipeline.

The ideology at play here is built on the premise of building yourself up by putting others down. Some of the most viral and popular content involves “owning” feminists, leftists, or activists; watching them get “taken down” will always capture the young, insecure male demographic. Videos like “Ben Shapiro RIPS Pro-Choice Student At Berkeley” and “Ben Shapiro’s Best Moments – OWNING SJWs and Liberals” gain hundreds of thousands of views on YouTube, and comments like “I love watching uppity college kids get owned in front of a crowd” and “I’m proud of her, she made it through the whole debate without a snack break!” (both found in the comment sections of Turning Point USA videos) reveal an obvious pattern.

What it essentially comes down to is this: while not everyone who engages with this style of conservative content is an alt-right extremist, everyone who is an alt-right extremist engages with this style of conservative content. That is particularly problematic when the content creators know it, refuse to condemn that support, and continue to profit off of and feed into the pipeline.

The Challenge

The problem now lies largely in the hands of social media platforms. I acknowledge that no single platform can fully control what users post; it can only do its best to crack down on content that breaks community guidelines. But that is exactly what these platforms are failing to do.

Twitter’s community guidelines clearly state, “You may not promote violence against, threaten, or harass other people on the basis of race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease.” TikTok prohibits the posting of “Content that praises, promotes, glorifies, or supports violent acts or extremist organizations or individuals”; “Content that encourages participation in, or intends to recruit individuals to, violent extremist organizations”; and “Content with names, symbols, logos, flags, slogans, uniforms, gestures, salutes, illustrations, portraits, songs, music, lyrics, or other objects meant to represent violent extremist organizations or individuals.” But anyone who has spent time on either of these platforms can attest to how loosely these guidelines are enforced.

More than just leaving this content up, these platforms actively profit from the users engaging with it, incentivizing the algorithms not only to leave the content in place but to present it to those most susceptible to falling victim to the pipeline. Over 70 percent of videos watched on YouTube come from YouTube’s recommendations rather than the search bar. The data collected and used to recommend videos includes click-through rates, watch duration, watch history, and user engagement. With those signals, it is easy for a social media algorithm to identify who is likely to become an alt-right extremist. Instead of suppressing content that could provoke violence or hate speech, the algorithm caters to this audience, serving whatever it takes to keep them on the app, even if it violates community guidelines.
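To make the incentive concrete, here is a deliberately simplified toy model of engagement-based ranking. This is a hypothetical sketch, not YouTube’s or any platform’s actual system (real recommenders are proprietary and vastly more complex); the field names and weights are invented for illustration. The point it demonstrates: a score built purely from engagement signals contains no term for safety or accuracy, so whatever maximizes clicks and watch time rises to the top by construction.

```python
# Toy sketch of engagement-driven ranking (hypothetical weights and
# field names; illustrative only, not any real platform's algorithm).

def user_affinity(user, video):
    """Overlap between a user's watch-history topics and the video's topics."""
    watched = set(user["history_topics"])
    return len(watched & set(video["topics"])) / max(len(video["topics"]), 1)

def engagement_score(video, user):
    """Score a candidate video using only engagement signals.
    Note there is no term here for safety, accuracy, or guideline
    compliance -- only for what keeps the user on the app."""
    return (
        2.0 * video["click_through_rate"]    # how often it gets clicked
        + 1.5 * video["avg_watch_fraction"]  # share of the video watched
        + 1.0 * user_affinity(user, video)   # match to past watch history
    )

def recommend(user, candidates, k=3):
    """Return the top-k candidates ranked by engagement score alone."""
    return sorted(candidates,
                  key=lambda v: engagement_score(v, user),
                  reverse=True)[:k]
```

Under this kind of objective, a video that provokes strong reactions and long watch sessions outranks a milder one regardless of its content, which is exactly the dynamic described above.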

Often these sites will make a show of banning top public figures in the space to demonstrate they are dedicated to preventing this type of content on their platforms. This recently happened with “Alpha Male” podcaster Andrew Tate, who was banned from almost every mainstream social media site after he continually pushed alt-right conspiracies and misogynistic messages to his audience. While banning such creators is good, it seems performative at best when it happens only after they gain media attention, and no further action is taken to keep similar creators from gaining popularity again. Neither Andrew Tate nor his message is unique or special, and thousands of accounts run by men push the same ideas or sell courses that profit from selling these ideas to young men.

Whether it lies in the algorithms social media platforms use or in the platforms’ direct intentions, it’s time for these sites to stop being complicit in the alt-right pipeline. It stands to benefit all people, Republicans especially, if right-wing extremism is denounced and tamed rather than sitting at the forefront of what conservatism is.

Feature Photo Source: Fast Company
