In May, RAND released the results of a study investigating the prevalence of extremism among veterans in the United States. By surveying a nationally representative sample of veterans, RAND found that, overall, veterans do not support domestic extremist groups or key domestic extremist ideas more than the U.S. population writ large. Although the report notes that veterans were indeed overrepresented among those charged with crimes related to the January 6 insurrection, other commentary continues to suggest that the high level of participation by those with military experience in that event was perhaps an outlier.
As scholars of contemporary antigovernment extremism, we think this report is important. In general, empirical data about extremism among military and veteran communities (as well as first responders) has been woefully lacking for years. But we’re worried that some people might mistakenly interpret the report as saying that there is no need to be concerned about extremism among our service members. Further research is imperative to validate and contextualize the report’s findings.
Data from START and the Program on Extremism suggests that 13-17% of all those arrested for participating in the insurrection had military experience. This is alarming because these numbers far outpace the 6.4% of the overall U.S. population who are veterans. Questions clearly linger about how common support for extremist groups and ideas really is among members of the military and veterans. Surveys are effective ways to measure the prevalence of a characteristic like group membership or beliefs, but surveys have several shortcomings that matter in this context.
First is what researchers call social desirability bias. With many sensitive topics, survey respondents might believe that they are “supposed” to answer a question in a certain way. For example, if a teacher asks students whether they have ever cheated on an exam, students might be more likely to say that they have never cheated even if they have, because they know that teachers want to hear that students don’t cheat. When asked whether they support white supremacists or the Proud Boys, those participating in this survey may similarly have been more likely to underreport their level of support for these groups based on a belief that such support would be looked on unfavorably. It’s difficult to know when social desirability bias is present in responses to survey questions, however, or how different those responses would be if this form of bias weren’t present. This might lead us to think of the rates of support identified in RAND’s survey as a low-end estimate.
Second is the use of questions where respondents are given a choice between different answers (in this case, the use of Likert-like scales that allow respondents to choose their answers from a set of categories along a single continuum). Uniform responses are a critical part of survey efficiency, a necessary feature for trying to determine the prevalence of a thing. But these scales rule out nuance. Consider the first question: “What is your opinion of Antifa?” A respondent might wonder what “Antifa” means in this question. Does it refer to all antifascists? Does it refer to a movement of militant antifascists? Does it refer to an alleged highly organized group that engages in violent extremism? Different respondents may be imagining different antifas when answering this question. Additionally, a respondent might think something like “I support efforts to oppose fascism and white supremacy, including the use of protests and demonstrations, but I do not support the use of violence.” It’s difficult to anticipate which category among the five offered levels of (un)favorability this respondent would choose.
To address these limitations of surveys and attempt to more fully unpack these dynamics, we are conducting in-depth interviews with service members to better understand how extremists exploit their peers and, most importantly, what they believe we should be doing about this problem. This study is ongoing, but our preliminary data indicate that veterans believe the risk of radicalization is generally heightened for people who joined the service after 9/11. Only 26% of the RAND survey respondents joined after this date, so it may be the case that the survey underrepresents service members for whom extremist recruitment is most likely to be a problem. We might, as a result, also be underestimating the continued risk of service members’ support for extremist groups and violence in future years.
To be clear, even if survey data underestimates these numbers, the vast majority of service members are not extremists. The problem is that extremists seem to be increasingly taking advantage of the knowledge and experiences our veterans have to offer, and we should be doing more to make sure that does not happen. Our preliminary interviewees stress the need for enhanced, functional services to ensure that people leaving the military are able to find jobs and to have support as they acclimate to civilian life.
Interviewees also commonly detail how military reporting structures are not conducive to reports from active service members who may witness signs of extremism in their units. They say that there are justified fears of repercussions when such reports must be made to unit leaders, rather than to a dedicated third party. An outside contact for these reports would also alleviate some interviewees’ concerns that unit leaders sometimes have disincentives for addressing problems that could impact their unit size or perceived leadership effectiveness. The reality is that the presence of extremist elements negatively impacts operational security and effectiveness, something with which both interviewees and past research agree.
The RAND researchers told NPR that this study is an early step towards understanding the prevalence of extremism among veterans and that more research is needed. We agree. This survey offers us important initial insights, but we need other forms of research — particularly insights from qualitative methodologies that allow more back-and-forth between researchers and those they study — to help us truly understand this issue. One example: having an open-ended conversation with respondents would let us ask those who said they believe the Great Replacement theory what action they believe is required to mitigate the perceived threat of other racial groups. This added information would provide much needed context about the likelihood of future violence that is justified by believing white people are intentionally being supplanted by racial others.
The RAND survey also provides some findings that call for follow-up research. For example, respondents from the Marine Corps expressed higher levels of support on almost all questions than those who served in other branches. Would that finding hold for those currently serving? Is there something about the Corps that makes Marines more susceptible to extremism, or are those who are more susceptible to extremism more likely to choose the Corps over other branches?
Ultimately, RAND’s findings regarding the prevalence of support for extremism among veterans are important. But as the insurrection showed us, democracy-threatening outlier events can occur. Identifying baseline rates of support for extremism among veterans is important, but so too is identifying why those rates of support may not match rates of participation in outlier events like January 6. Why did so many veterans participate in that attack on our democracy? This survey is an important step forward, but stakeholders should not treat it as evidence that the problem of extremism among veterans (or active-duty service members, for that matter) does not exist.
To learn more about our ongoing study, or if you are a veteran who wants to participate in an anonymous interview about extremist exploitation of our service members, please contact Dr. Amy Cooter at email@example.com.