I didn't intend to post again this soon, but I read a story in the New York Times and saw this passage: "In the United States, surveys point to declining civics understanding among adults [which leads] to weaker social discourse and faith in public institutions." I don't think that there has been a general decline in civics understanding, or that lack of civics understanding in the public is a major source of the problems with our political culture, so I wanted to check their evidence.
On clicking the link, I found it led to a legitimate survey sponsored by the Annenberg Center at the University of Pennsylvania, and the report was called "Americans’ Civics Knowledge Drops on First Amendment and Branches of Government." So far, that seems to support the statement in the Times. On reading further, I saw that the drop was relative to the previous year (2021), and it was dramatic--e.g., when asked what rights were guaranteed by the First Amendment, 20% named freedom of the press, down from 50% in 2021. Going back further, 42% mentioned freedom of the press in 2020, and 14% in 2017. So either we've had a big decline or a small increase in knowledge, depending on your starting year. Something is wrong with these numbers--you might get a large increase in knowledge on issues that suddenly come into the news (e.g., knowing where Ukraine is located), but the First Amendment is something that people learn in school, if they learn it at all. So you're not going to get large changes from year to year--you could get large changes over a long period of time, but they would involve the accumulation of small changes in the same direction.
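As a rough sanity check (purely back-of-the-envelope--the per-wave sample size of 1,000 below is my assumption, not a figure from the reports), the 2021-to-2022 gap is far too large to be ordinary sampling error, so it has to reflect either a real change or a change in how the survey was run:

```python
from math import sqrt

def two_prop_diff_ci(p1, n1, p2, n2, z=1.96):
    """Approximate 95% confidence interval for the difference of two proportions."""
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    return diff - z * se, diff + z * se

# Hypothetical sample sizes of 1,000 per wave -- the real surveys may differ.
lo, hi = two_prop_diff_ci(0.50, 1000, 0.20, 1000)
print(f"2021 vs. 2022 gap: 30 points, 95% CI about ({lo:.2f}, {hi:.2f})")
# Output is roughly (0.26, 0.34): the gap is many standard errors from zero,
# so it isn't sampling noise -- it reflects either a real change or a
# change in procedure.
```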
What explains the differences between the years? With open-ended questions, the number of people who give substantive answers is affected by how much encouragement they get from the interviewer--e.g., if someone says "I don't know," whether the interviewer says something like "just your best guess is OK." This is particularly relevant to the First Amendment question, since multiple answers are possible. Suppose someone answers "freedom of speech" and then pauses: the interviewer could move on to the next question, or could ask "anything else?" So my guess is that the exact instructions given to the interviewers changed over the years (or possibly the way they were paid changed in a way that altered their incentives--like an hourly rate versus payment per completed interview). The site has a report on sampling and weighting, but nothing on the exact instructions, so I can't check.
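To see how much leverage that kind of interviewer behavior could have, here is a toy simulation (every parameter is invented for illustration; none of it comes from the Annenberg reports) in which the population's underlying knowledge never changes and only the rate of probing does:

```python
import random

def simulate_wave(n, p_spontaneous, p_if_probed, probe_rate, seed=0):
    """Share mentioning 'freedom of the press' in one simulated survey wave.

    p_spontaneous: share who volunteer it without prompting
    p_if_probed:   additional share who mention it only after 'anything else?'
    probe_rate:    how often interviewers actually ask 'anything else?'
    All parameters are invented for illustration.
    """
    rng = random.Random(seed)
    mentions = 0
    for _ in range(n):
        if rng.random() < p_spontaneous:
            mentions += 1
        elif rng.random() < probe_rate and rng.random() < p_if_probed:
            mentions += 1
    return mentions / n

# Same underlying knowledge in both "years"; only interviewer behavior changes.
print(simulate_wave(10_000, 0.18, 0.45, probe_rate=0.9))  # roughly 0.5
print(simulate_wave(10_000, 0.18, 0.45, probe_rate=0.1))  # roughly 0.2
```

Nothing about the respondents differs between the two runs; changing how often interviewers follow up is enough, under these made-up numbers, to move the estimate from about 20% to about 50%.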
You would think that someone involved in the project would notice that the numbers looked strange and check whether the apparent changes in knowledge actually reflected some change in the survey procedures. But they just presented them as straightforward changes in knowledge: for example, the 2020 survey report was titled "Amid Pandemic and Protests, Civics Survey Finds Americans Know More of Their Rights." I'm not saying that it's impossible that there were large year-to-year increases and declines in knowledge--just that they would be unusual enough to deserve close examination before concluding that they happened.