Political scientists are summoned to the spectacles

"In politics and in life, ignorance is not a virtue. It's not cool to not know what you're talking about." -- President Barack Obama, commencement speech at Rutgers University, quoted by Maxwell Tani, Business Insider, May 15, 2016.
 

The political season has provoked such anxiety that people are turning to political science for perspective. We've found two recent reviews of political science research: one on California groundwater, the other on the irrationality of voters.
The first comes from the UC Davis Center for Watershed Sciences. It is a review of recent thinking about how the groundwater sustainability agencies mandated by the Sustainable Groundwater Management Act ought to be governed. For those of us who have participated in meetings on this topic in the past, this is perhaps not such an open question, because we have observed how the largest landowners, municipal pumpers and irrigation districts control the public processes for their own interests. These interests include entirely unsustainable groundwater pumping for as long as a superior governmental authority -- state or federal -- or the courts will tolerate it.
To whom are these political science studies directed? Are they trying to establish criteria for judging the functionality of these new local agencies, to give the state accredited support to stop the race to empty and collapse the aquifers? Is it possible for the state to establish even some simulacrum of a public interest in groundwater, or in water in general, given the history of the domination of the Public Trust by private special interests subsidized by the very government that is supposed to be enforcing regulation?
Admittedly, it is hard to see from the middle of the smoggiest valley in America, so we have to ask whether there is actually a political will in California strong enough to stop agribusiness, including powerful national finance, insurance and real estate special interests, from emptying and collapsing the aquifers beneath their California nuts, fruits and vines. Nearby examples of the FIRE sector include the Trinitas hedge fund near Oakdale; the John Hancock Agricultural Investment Group near Los Banos; and landowner interests like the heirs and assigns of the Miller & Lux Cattle Co.'s 1.25-million-acre holdings. Large landholdings play the role in the state economy that deep, internal waves play in the ocean.
Whenever water is mentioned, the word "stakeholder" is not far behind. Should its modest, folksy connotation of common participation be taken seriously? And should these FIRE entities be confused with social science's mythological yeoman farmer merely because of the normal attrition of meaning that words suffer once they enter the American political-economic coliseum?
 
***
 
The second piece reviews recent political science research on the problem that some of the electorate have with fact and reason, at least in the opinion of the researchers. Their alarming results --  people are not always amenable to reason, particularly in political affairs, especially right now -- appear to bode ill for our democracy.
Our first question is: Aren't those people deemed not amenable to reason -- as these political scientists define it -- who gather in large numbers under banners like "Tea Party" and attend Donald Trump's "Make America Great Again" presidential campaign rallies, exercising the same democratic rights that supporters of Bernie Sanders and groups like Black Lives Matter do?
This is one of those years the Founders warned us against. It is a year for the factions. And the volume of these factions is being cranked up to a fever pitch by the media, which has discovered a huge audience of angry people. The outer wings of the two centrist parties are forming powerful presences on the streets, in fairgrounds, auditoriums and other large venues throughout the country.
Meanwhile, Republican Party regulars stare at the Trump campaign headlight on the freight train bearing down on them. Democratic Party regulars are fundraising, collecting IOUs for slaughters past with promises of more to come.
We think Bernie Sanders is speaking reason. Our acquaintances downtown, their pistols carelessly concealed, think Trump speaks reason. In fact, on some issues the two candidates seem to agree at times. The faction of the Right, however, seems to have its candidate in the better position for the nomination, his opponents having fled.
Ultimately, we think that at the core of right-wing opinion lies the greatest sickness in American culture -- racism -- which began with the wars against the Indians and the institution of slavery. It is a constantly developing and adaptive political theory that relies on race alone for its principle of consistency and its essential rationale.
They cultivate their spirits in the optimism of billionaires.
The Left prefers a class analysis, despite the profound warning of Thorstein Veblen that the optimism of billionaires obstructs the development of class consciousness among the people.
None of the political scientists seem to have picked up on Sheldon Wolin's extremely fruitful (at least for a Californian) concepts of democracy as essentially a "fugitive" impulse in the people, who are most of the time successfully coerced and manipulated by the grand public/private partnership for imperialism, economic growth of billionaires and the exploitation of workers and the environment.  When too threatened, segments of the more suppressed muster the energy to resist, as we are seeing in movements like Black Lives Matter and the youth in the Bernie Sanders campaign.
 Rather than the optimism of the billionaires, liberals continue to cultivate honesty, love, courage and humor. They prefer reading Don Quixote to listening to Rush Limbaugh. Sensitive to themselves and to others and with the ability to take a good joke, as E. M. Forster put it in Two Cheers for Democracy (1951), they carry on, in fashion and out of fashion.
--blj
5-15-16
California Water Blog
SGMA and the Challenge of Groundwater Management Sustainability
Posted on May 15, 2016 by UC Davis Center for Watershed Sciences
Bill Blomquist
https://californiawaterblog.com/2016/05/15/sgma-and-the-challenge-of-gro...
It isn’t just the groundwater that has to be sustainable; it’s the management too.
That’s why the title of this post shifts from the more familiar “sustainable groundwater management” to “groundwater management sustainability.” This perspective doesn’t come from the world of hydrologic or climate or environmental science, but from political science and other disciplines focused on human institutions and behavior.
Along with the development and use of information, the greatest challenge in groundwater sustainability is governance and decision making. Over several decades we have learned that governance is at least as important in environmental management and protection as are science and data. That’s saying a lot, because science and data are critical. Understanding institutions for governing human interactions with the environment and with other humans is at that critical level or greater.
The challenge of governance and decision-making involves dealing with scales and levels, the legal and regulatory framework, and multiple publics and values; also, supporting and institutionalizing innovation, adaptation, and learning. Plainly this is complicated, and like the challenge of information, it is unlikely to be solved once and for all.
The treatment of institutions in policy analysis versus political science
Institutions come up a lot in policy discussions. Policy discussions generally follow this form: “What should we do about X?” Fill in the “X” with the relevant policy topic—groundwater sustainability, mass transit, international terrorism, etc.—and the policy discussion ensues, with varying voices and perspectives advocating one solution or another.
Institutions generally appear in these kinds of policy discussions as prescriptions. They are advocated for their predicted beneficial effects in remedying the illness – the “X” — that’s under discussion.
Got a groundwater problem? Take some private property rights plus a market and call me in the morning, or take a public trust doctrine plus a regulatory agency, or take a comprehensive watershed management authority, or take a public participation process plus some citizen science… etc.
These institutional prescriptions are usually available only in ideal form: well-defined property rights and a well-functioning market, agencies selflessly pursuing the public interest, and so on.
Political science discussions are different. In political science the most important question isn’t, “What should we do about X?” It’s “Who gets to decide what we’re going to do about X, and how?” That leads us into the world of governance and decision making, with its messiness of competing interests, conflicting incentives, rhetorical framing of issues, ideological predispositions, and seemingly endless blocks, countermoves, and end-runs.
To some extent this is a distinction between thinking about institutions as ideal types versus thinking about institutions as problematic human creations. The latter approach involves lifting the hood and applying some diagnostics. That means knowing what to look for and what questions to ask once the hood is up.
Institutional diagnostics—questions to consider if you’re trying to make the management sustainable
To implement the Sustainable Groundwater Management Act (SGMA), people in 127 groundwater basins across California are developing groundwater sustainability agencies (GSAs) that will be required to develop, adopt, and implement groundwater sustainability plans (GSPs). That’s a governance challenge of the first order, and it involves creating new institutions or adapting existing ones for new purposes.
I strongly recommend the recently published report by Kiparsky et al (2016) on criteria for evaluating GSAs: scale, human capacity, funding, authority, independence, participation, representation, accountability, and transparency. Because those are good criteria for evaluating GSAs, they are also good criteria to take into account when designing them.
The authors of that report have done an excellent job of defining and explaining their criteria and their significance. I will add here a few thoughts about one of the criteria they cover—representation—and then add another criterion that is especially important for the notion of “management sustainability.”
Representation is critically important to governance and decision-making. If a GSA will be a new entity, for instance, what will its governance structure look like? Will the members of its governing board represent districts, represent specific constituencies, or serve at large? And if an existing entity is going to assume the responsibilities and gain the authority of a GSA, are there any changes to its internal representation and decision-making structures and processes that should be made?
Either way, will communities or stakeholders within the basin be represented equally or proportionally? If proportionally, relative to what? Are all stakeholders equal—or, rather, should they all count equally when it comes to making groundwater management decisions? Do some have more weight because of a judgment about their stake in groundwater management? Pumpers, for example, could reasonably be said to have site-specific investments and dependence on the resource to a greater extent than others. On the other hand, if pumpers have primary control over decision-making about an overdrafted groundwater basin, will it always be a situation of "the diet starts tomorrow?" Last but not least, how can the composition of the governing body be adjusted if and when the constellation of interests and uses changes?
That brings me to the criterion I’d like to add in designing GSAs and GSPs: adaptability.
An observed fact from the messy world of governance and decision-making is this: no matter how carefully and well we design institutions, we won’t get everything right the first time. Also, conditions will change in ways that alter the fit between what we put in place at one time and what comes along to confront us later.
That will surely be true for the GSAs being designed in the basins starting SGMA implementation now. Despite everyone’s best efforts in constructing these governance structures, there will be errors and surprises. It will be essential to build in processes for modification. People engaged in the hard work of groundwater management in overdrafted basins (Porse 2015) will need to make rules not only for changing groundwater use and managing the basin; they’ll need to make rules for how and when the rules themselves can be changed.
Creating a decision-making body for groundwater management isn’t just solving a problem; it’s writing a constitution. Writing a constitution is an intricate task, where decisions about one element are often linked to and affect other elements. Long-standing and relatively successful constitutions—the sustainable ones, we might say—are the ones that can be adjusted when needed.
SGMA implementation will be hard, in ways that have already been predicted and discussed (Moran and Cravens 2015, Moran and Wendell 2015). By equating it to constitution making, I don’t mean to make it sound even harder. But I think it helps to conceive of it that way. It helps us think beyond the immediate circumstances of the moment and consider the ways in which we are designing a decision making process that will have to address circumstances well beyond this moment. And that, in turn, is likely to make us think now about how we want to be able to adjust then.
As Californians design GSAs and GSPs, they are designing institutions for governance and decision-making. That’s a somewhat different, and I hope useful, way of thinking about the task that lies ahead. Thinking about it that way may help encourage everyone to think about the sustainability of the management as well as of the groundwater.
Bill Blomquist is a Professor of Political Science at Indiana University-Purdue University Indianapolis (IUPUI) in Indianapolis, Indiana.  His areas of research are water resource management, institutions, and the policy making process.
Further Reading
Kiparsky, Michael, Dave Owen, Nell Green Nylen, Juliet Christian-Smith, Barbara Cosens, Holly Doremus, Andrew Fisher, and Anita Milman (2016) Designing Effective Groundwater Sustainability Agencies: Criteria for Evaluation of Local Governance Options. Berkeley, CA: Wheeler Water Institute, UC Berkeley School of Law
Moran, Tara and Amanda Cravens (2015) California’s Sustainable Groundwater Management Act of 2014: Recommendations for Preventing and Resolving Groundwater Conflicts. Stanford, CA: Water in the West Program, Stanford University Woods Institute for the Environment
Moran, Tara and Dan Wendell (2015) The Sustainable Groundwater Management Act of 2014: Challenges and Opportunities for Implementation. Stanford, CA: Water in the West Program, Stanford University Woods Institute for the Environment
Porse, Erik (2015) “The Hard Work of Sustainable Groundwater Management.” California WaterBlog. Posted August 13, 2015.
 
7-11-10
Boston Globe
How facts backfire
Researchers discover a surprising threat to democracy: our brains
By Joe Keohane
http://archive.boston.com/bostonglobe/ideas/articles/2010/07/11/how_fact...
“Area Man Passionate Defender Of What He Imagines Constitution To Be,” read a recent Onion headline. Like the best satire, this nasty little gem elicits a laugh, which is then promptly muffled by the queasy feeling of recognition. The last five decades of political science have definitively established that most modern-day Americans lack even a basic understanding of how their country works. In 1996, Princeton University’s Larry M. Bartels argued, “the political ignorance of the American voter is one of the best documented data in political science.”
On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. He led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)
Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”
What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.
This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.
New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.
For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.
It’s unclear what is driving the behavior — it could range from simple defensiveness, to people working harder to defend their initial beliefs — but as Nyhan dryly put it, “It’s hard to be optimistic about the effectiveness of fact-checking.”
It would be reassuring to think that political scientists and psychologists have come up with a way to counter this problem, but that would be getting ahead of ourselves. The persistence of political misperceptions remains a young field of inquiry. “It’s very much up in the air,” says Nyhan.
But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.
There are also some cases where directness works. Kuklinski’s welfare study suggested that people will actually update their beliefs if you hit them “between the eyes” with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answer to reflect the correct fact.
Kuklinski’s study, however, involved people getting information directly from researchers in a highly interactive way. When Nyhan attempted to deliver the correction in a more real-world fashion, via a news article, it backfired. Even if people do accept the new information, it might not stick over the long term, or it may just have no effect on their opinions. In 2007 John Sides of George Washington University and Jack Citrin of the University of California at Berkeley studied whether providing misled people with correct information about the proportion of immigrants in the US population would affect their views on immigration. It did not.
And if you harbor the notion — popular on both sides of the aisle — that the solution is more education and a higher level of political sophistication in voters overall, well, that’s a start, but not the solution. A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong. Taber and Lodge found this alarming, because engaged, sophisticated thinkers are “the very folks on whom democratic theory relies most heavily.”
In an ideal world, citizens would be able to maintain constant vigilance, monitoring both the information they receive and the way their brains are processing it. But keeping atop the news takes time and effort. And relentless self-questioning, as centuries of philosophers have shown, can be exhausting. Our brains are designed to create cognitive shortcuts — inference, intuition, and so forth — to avoid precisely that sort of discomfort while coping with the rush of information we receive on a daily basis. Without those shortcuts, few things would ever get done. Unfortunately, with them, we’re easily suckered by political falsehoods.
Nyhan ultimately recommends a supply-side approach. Instead of focusing on citizens and consumers of misinformation, he suggests looking at the sources. If you increase the “reputational costs” of peddling bad info, he suggests, you might discourage people from doing it so often. “So if you go on ‘Meet the Press’ and you get hammered for saying something misleading,” he says, “you’d think twice before you go and do it again.”
Unfortunately, this shame-based solution may be as implausible as it is sensible. Fast-talking political pundits have ascended to the realm of highly lucrative popular entertainment, while professional fact-checking operations languish in the dungeons of wonkery. Getting a politician or pundit to argue straight-faced that George W. Bush ordered 9/11, or that Barack Obama is the culmination of a five-decade plot by the government of Kenya to destroy the United States — that’s easy. Getting him to register shame? That isn’t.
Joe Keohane is a writer in New York.