Sunstein believes that interpreting these findings will "tell us a lot about current political controversies." Here's his own interpretation:
The first [explanation] is that if you know a lot about politics, you are more likely to be emotionally invested in what you believe. Efforts to undermine or dislodge those beliefs might well upset you and therefore backfire. The second explanation is that if you have a lot of political knowledge, you are more likely to think you know what is really true, and it will be pretty hard for people to convince you otherwise.
The implication -- that the less you know, the more open you are to persuasion -- may not be comforting to everyone. It could simply mean that the ignorant are more easily persuaded about anything. What we want, of course, is a golden mean: a public not so easily persuaded, yet not as defensive against persuasion as partisans are today.

That said, the report leaves me wondering about the worth of the "political knowledge" identified in the survey. The one example Sunstein gives is the number of terms a president may serve. The survey's questions and answers may show that some subjects know how government works, but they don't necessarily tell us how those subjects think the world works. That is, they may not touch on beliefs about human nature or ideological opinions about what is right and wrong in politics. I suspect that people choose to trust Sarah Palin or President Obama or unidentified "non-partisan experts" based on how they think the world works, or should work, rather than on the nuts and bolts of the Constitution. The same people will trust or distrust the ACA based on presumptions, favorable or unfavorable, about the motives of its authors and supporters -- whether they mean to help people or simply want to control them.

Civics 101 knowledge is too objective to serve as a relevant measure of political bias. The authors of the survey may have expected or hoped to find that stubborn bias was rooted in some kind of stupidity, but they may have mistaken the kind of stupidity in play, and they certainly used the wrong measurements to capture it. Let them put together a survey that solicits people's beliefs about human nature, their assumptions about power, and the like, and they might get less confusing correlations and a better measure of both what people actually know and what they think they know.