Thursday, January 26, 2023

Politics as a vocation?

 In November 1943, a NORC survey asked "If you had a son just getting out of school, would you like to see him go into politics as a life work?"  This question has been repeated in a number of surveys since then, most recently in 2015 (sometimes with minor variations in wording).*  Confidence in government and political leaders has declined over that time, so you might expect a decline in the number who want their hypothetical son to go into politics.  However:



it seems to have increased--it definitely hasn't declined.  (I show the ratio of yes to no answers because the proportion of "don't knows" has tended to decline).  The highest single value was in March 1965 (36% yes and 54% no), when Lyndon Johnson had won a landslide victory and had a 70% approval rating, but March 2010 was almost as high (36% yes and 55% no), although Barack Obama's approval rating had fallen to about 50% and the Tea Party movement was getting underway.
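
For concreteness, here's the arithmetic behind the ratio for the two readings just mentioned (a minimal sketch in Python; the percentages are the ones quoted above, and other years would be computed the same way):

```python
# Yes/no ratio for two of the readings quoted above; the ratio discounts the
# shrinking share of "don't know" answers over time.
surveys = {
    "March 1965": (36, 54),   # (% yes, % no)
    "March 2010": (36, 55),
}

for date, (yes, no) in surveys.items():
    print(f"{date}: yes/no ratio = {yes / no:.2f}")
```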

Why did interest in having a hypothetical child go into politics increase when approval of politics and politicians was declining?  One possibility is increased educational levels--more educated people may be more likely to think that someone like them could accomplish something in politics.  Another is that growing ideological distance between the parties and increased focus on national rather than local politics have given people a sense that more is at stake--that politics is potentially about accomplishing important things or stopping harmful things, not just dealing with routine everyday affairs.  

I hope to come back to this--I have a feeling that there is something important here.    






*More recent surveys usually ask about "son or daughter."  Some have asked randomly selected halves of the sample about a son or a daughter--there is little or no difference in responses.    

Friday, January 20, 2023

Amalgam

A few years ago, I noted that public opinion towards Martin Luther King was not all that favorable when he was alive.  Some news stories on MLK Day made the same point.  Some of them mentioned a survey I didn't discuss in my post, which was taken a couple of weeks after his assassination.  It asked respondents for their reaction on hearing the news:  sadness, anger, shame, fear, or "felt he brought it on himself."  Overall, 36% said sadness, 5% anger, 15% shame, 9% fear, and 30% that he brought it on himself (the rest didn't know).  But of course responses differed by race:

                   White    Black
Sadness             30%      74%
Anger                4%      11%
Shame               16%       8%
Fear                10%       3%
Brought it on       35%       3%

I would put the first three in a group, since they all involve some form of distress.  Among whites, 50% were distressed and 35% unsympathetic; among blacks, it was 93% to 3% (fear is hard to classify).  The survey also had a number of questions about whether you felt particular things after King's murder.  One was "did you feel sorry for his wife and children?"  Among blacks, 83% said they felt that strongly, 6% fairly strongly, 5% that it crossed their mind, and 6% that it didn't occur to them.  Among whites, it was 41%, 30%, 16%, and 12%.  So by both measures, about 30% of whites were not particularly upset. 

 There was an even larger difference on "did you think about the many tragic things that have happened to negroes and how this was just another one of them?"  Only 15% of whites said they felt this strongly, and 44% that it didn't occur to them; among blacks, 56% said they felt it strongly and 13% that it didn't occur to them.  

The survey that included this question was what NORC called an "amalgam":  that is, it had questions on a variety of topics.  While looking at it, I saw something else that seemed interesting.  One question asked people who they would like to see elected president in November:  the choices were Johnson, Robert Kennedy, Eugene McCarthy, Nixon, Reagan, and Rockefeller.  There was also a question about what the United States should do in Vietnam:  "begin to withdraw," "carry on its present level of fighting," or "increase the strength of its attacks against North Vietnam."  The results, going from most to least hawkish supporters:

               withdraw   same   increase
Wallace           29%      11%      54%
Reagan            35%       8%      58%
Johnson           25%      26%      40%
Nixon             34%      16%      41%
Rockefeller       41%      15%      33%
McCarthy          48%      17%      30%
Kennedy           52%      12%      28%

Total             39%      15%      37%

The general ranking is unsurprising, but they weren't as differentiated as I expected:  e.g., 35% of Reagan supporters thought that we should start withdrawing, and 30% of McCarthy supporters thought we should increase attacks on North Vietnam.  Overall, only 15% said we should keep on as before--even among Johnson supporters, only 26% did.  So it seems like there was a lot of generalized discontent--escalate or get out, but just do something different.  

[Data from the Roper Center for Public Opinion Research]

Sunday, January 15, 2023

Double or nothing?

 A few days ago, the Washington Post had a story titled "Survey finds ‘classical fascist’ antisemitic views widespread in U.S."  Moreover, it suggested that anti-semitic views were becoming more common, although it cautioned:  "It is difficult to assess whether antisemitic views have increased over time, given changes in the survey’s response options as well as how respondents were sampled."  I looked at the report on the survey, which was sponsored by the Anti-Defamation League and carried out by NORC, and found that it was more confident in claiming an increase in anti-semitic views--specifically an increase between 2019 and 2022.  The key figure:



The survey was the fifth in a series going back to 1964, and its measure of antisemitism was based on questions that had been included in all five surveys.  The figure suggests that anti-semitic views generally declined from 1964 to 2019, and then rose sharply between 2019 and 2022.  There have been other stories on the survey in the last few days, and many of them emphasize this  apparent change:  for example, Reuters has a story called "Americans' belief in antisemitic conspiracies, tropes doubles since 2019, ADL survey shows."  The "doubled" is based on a comparison of the number who believe six or more of the anti-semitic opinions:  11% in 2019 and 20% in 2022.   Of course, that's an arbitrary cutoff, but no matter how you look at it, there seems to be a large increase:  average agreement with the statements shown in the figure was 17% in 2019 and 29% in 2022.

A large change in the last three years seemed unlikely to me--opinions on this sort of thing don't usually change rapidly.   However, NORC is a well-regarded survey organization, so you can't just dismiss the survey.  After looking at the report, I have a hypothesis.  The 1964, 1981, and 1992 surveys were conducted in person or over the phone; the 2019 and 2022 surveys were online.  But there was a change between 2019 and 2022:  "for the current survey, researchers opted to remove the 'Unsure/Don't Know' option for anti-Jewish tropes..."  The figure shows the percent agreeing out of all respondents, not just those who had an opinion.  For example, in 1964, 48% agreed that "Jews in business go out of their way to hire other Jews," 33% disagreed, and 19% had no opinion.  I couldn't find the "no opinion" rate for the 2019 survey--it doesn't seem to be archived--but my guess is that it was higher than in previous surveys.  With an interviewer, you are asked whether you agree or disagree--you have to volunteer a "no opinion."  Most people probably also want to cooperate with the interviewer and don't want to seem ignorant, so overall there is some push towards giving an answer.  In contrast, with an online survey that includes a "no opinion" box, it's on the same footing as any other answer.  So my hypothesis is that they had a large number of "don't knows" in 2019 and that a lot (maybe all) of the apparent increase in anti-semitic sentiments between 2019 and 2022 was a result of the elimination of that option.  
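
To see how much mechanical difference this could make, here's a back-of-the-envelope calculation (a minimal sketch in Python; the 35% "don't know" rate is purely hypothetical, since the 2019 rate isn't archived, while the 17% figure is the 2019 average agreement mentioned above):

```python
# Illustrative only: how dropping the "don't know" option can inflate percent-agree
# when agreement is reported out of all respondents.
agree_2019 = 0.17        # average agreement reported in 2019 (out of everyone)
dont_know_2019 = 0.35    # HYPOTHETICAL share with no opinion in 2019 (not archived)
disagree_2019 = 1 - agree_2019 - dont_know_2019

# If the no-opinion group is forced to choose and splits the same way as
# everyone else, reported agreement rises even with no real change in opinion:
agree_if_forced = agree_2019 / (agree_2019 + disagree_2019)
print(f"implied agreement with no 'don't know' option: {agree_if_forced:.0%}")
# roughly 26%, compared with the 29% actually reported in 2022
```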

I looked for other survey questions that could shed light on changes in anti-semitism.  There weren't many, but since 1964, the ANES has sometimes included a "feeling thermometer" for Jews, and the GSS has also included it a number of times.  The mean scores (on a 0-100 scale, with higher meaning more favorable):  


There was no clear trend between 1964 and 2008, but the 2016 score was the highest ever, and 2020 set a new record.  So this question suggests a decline in antisemitism in recent years.  

Despite the difficulty of interpretation resulting from the change in response options, the 2019 and 2022 data provide valuable information--the ADL ought to deposit them in one of the data archives.  

[Some data from the Roper Center for Public Opinion Research]







Tuesday, January 10, 2023

Men are from Williams, women are from UNC-Greensboro

I've had a couple of posts on support for free speech among college students.  I mentioned that it differs by ideology (more support among conservatives) and selectivity (more support at colleges with lower admission rates).  But there's another important factor I haven't mentioned yet:  gender.  The average rating for eight cases, with higher numbers indicating more support for their right to speak, is 21.0 for men and 18 for women.  The average for men is about the same as the average for students at Williams College, which ranks 5th out of 208 colleges and universities surveyed, while the average for women is about the same as for students at UNC-Greensboro, which ranks 193rd.  (The survey also gave options for non-binary, genderfluid, agender, unsure, and prefer not to say, which I combined into one category--they averaged 19.4, just about halfway in between the values for men and women.)  In terms of ideology, the average for men is about the same as the average for people who call themselves "very conservative" and the average for women is just a little higher than the average for people who say they "haven't thought about" their political views--the least tolerant group.  That is, the gender difference is big.

The General Social Survey has questions on whether certain kinds of people should be allowed to speak--support is somewhat lower among women, but the gap is not very large.  The GSS questions are yes/no, and the questions in the college student survey (sponsored by the Foundation for Individual Rights and Expression) have four categories:  definitely should be allowed to speak, probably should, probably should not, and definitely should not.  To make them comparable, I collapse the FIRE questions into definitely or probably should vs. definitely or probably should not (a sketch of this collapsing appears after the table).  The percent saying that each type should be allowed to speak:

                    Men    Women    Difference
GSS
homosexual          78%     79%         -1
Communist           69%     61%          8
racist              66%     57%          9
militarist          65%     63%          2
anti-religion       76%     70%          6

FIRE
anti-transgender    42%     15%         27
ban abortion        57%     29%         28
anti-BLM            44%     15%         29
stolen election     43%     24%         19
repeal 2A           64%     57%          7
immigrants          76%     76%          0
white racism        65%     67%         -2
religion            67%     59%          8
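
As promised above, here is what the collapsing amounts to--a minimal sketch in Python, with made-up counts purely for illustration:

```python
# Collapse FIRE's four response options into allow vs. not allow,
# matching the yes/no format of the GSS items.  Counts are invented.
responses = {
    "definitely should": 120,
    "probably should": 200,
    "probably should not": 150,
    "definitely should not": 80,
}

allow = responses["definitely should"] + responses["probably should"]
not_allow = responses["probably should not"] + responses["definitely should not"]
pct_allow = allow / (allow + not_allow)
print(f"percent saying the speaker should be allowed: {pct_allow:.0%}")
```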

The table shows that the gender gap is a lot larger for the FIRE questions than the GSS questions.  The gaps are bigger for the right-wing speakers in the FIRE survey--however, even though women are more liberal than men, they are a bit less likely to support the left-wing speakers.   Another point is that levels of support for all speakers are relatively low in the FIRE survey compared to the GSS.  Although the questions are different, it's hard to argue that the FIRE examples are more extreme.  For example, you would expect support for the right to speak to be higher for someone who wants to work within the Constitution than for someone who wants to overthrow it, but the percent in favor of allowing a speaker who says "the second amendment should be repealed so that all guns can be confiscated" (FIRE) is slightly lower than the percent for someone who "advocates doing away with elections and letting the military run the country" (GSS).  

I think that these two points are connected.  The GSS questions ask about someone who "wanted to make a speech in your community," and presumably people interpret "your community" as their town.  The FIRE questions ask about someone speaking on their college campus.  A town is basically just a collection of people who choose to live in the same place, but colleges decide who gets to be part of the community and also proclaim some official values or mission.  As a result, there's a tendency to think of speaking on campus as representing a kind of endorsement--not necessarily coming out in favor of a point of view, but at least saying it deserves to be considered.  Women tend to be more concerned about giving offense than men, so they are more likely to reject a speaker who might offend a significant part of the community.  


Monday, January 2, 2023

Lost in translation, part 2

 In October, I had a post on miscommunication of research results.  After Andrew Gelman discussed it in his blog, it went on to become my most-viewed post (by a 2:1 margin).  This post gives a few more thoughts on the issue.  

First, a point raised by some comments on Andrew's post.  I quoted a paper that said "the prevalence of psychotic symptoms among mass murderers is much higher than that in the general population (11% v. approximately 0.3-1%)" and added "That is, people with psychotic symptoms were between 10 and 30 times more likely to commit mass murder than people without psychotic symptoms."   How did I get  "between 10 and 30 times more likely"?  The calculation was:  11 is about 10, 10 divided by 1 is 10, and 10 divided by .3 is 33.3, which is about 30.   So I should have said something like "roughly between 10 and 30".  To do the calculation properly, the formula is (p*(1-q))/(q*(1-p)), where p is the prevalence of psychotic symptoms among mass murderers and q is the prevalence in the general public, which gives 12.2 if q is 1% and 41.1 if it's 0.3%.  I didn't bother doing the exact calculation because by the standards of things I usually study, even 10 times more likely is an extremely large difference:  my question was how that came to be described as "dispel[ling] the myth that having a severe psychiatric illness is predictive of who will perpetrate mass murder."
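
For concreteness, here is that formula applied to the numbers from the paper (a minimal sketch in Python; p and q are the prevalences just described):

```python
# Odds ratio comparing the prevalence of psychotic symptoms among mass
# murderers (p) to the prevalence in the general population (q).
def odds_ratio(p, q):
    return (p * (1 - q)) / (q * (1 - p))

p = 0.11                   # prevalence among mass murderers, from the paper
for q in (0.01, 0.003):    # the paper's range for the general population
    print(f"q = {q:.1%}: odds ratio = {odds_ratio(p, q):.1f}")
# prints roughly 12.2 and 41.1, the figures given above
```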

On to the main point:  I suggested that "sometimes a focus on making sure that people don't draw the wrong conclusions comes at the expense of explaining what the research actually found," and that this seemed to be particularly common in medicine and public health.  I mentioned two other cases in which the conclusions reported in news stories were very different from the actual research findings.  That leads to another question:  why were these studies being discussed in the media?  The one on mass shootings was about an issue that had been in the news, but the other two were just about papers that had been published recently.  Reporters generally write about events--unusual or unexpected things.   With science, this means discoveries or breakthroughs, like the recent fusion experiment.  But in the social sciences, discoveries or breakthroughs are rare (or maybe nonexistent)--if there is progress, it's a slow increase in understanding.  Journalists seem to have figured this out, so although economists, political scientists, or sociologists are sometimes quoted, there are very few stories that try to summarize a specific piece of research in economics, political science, or sociology.  But in medicine and health, there are sometimes discoveries or breakthroughs, so journalists have the sense that a particular study might be news.  Reporters don't have the time or expertise to read journals and go to conferences and decide for themselves, so universities or journals publicize research in the hope that it will attract media attention.  But that often involves exaggerating the contribution, or making the conclusions sound more startling, or emphasizing something that wasn't the primary focus but which people are interested in.

My conclusion is that journalists should make distinctions between the parts of health and medicine in which discoveries might occur--e.g., a new vaccine--and those which are more like social science.  For example, there have been enough cases of mass murder that we know something about what kinds of people are more likely to commit it, and no new study is likely to show that everything we think we know is wrong or to reveal that beneath the complex surface there is really one underlying cause.  On a topic like this, journalists should ask researchers to give commentary, but shouldn't be looking for dramatic new discoveries.