Jon Krosnick

From Wikipedia, the free encyclopedia
Jon Alexander Krosnick
Born: January 5, 1959, Philadelphia, PA
Known for:
  • Studies in the psychology of political behavior
  • Studies in attitude formation and behavior
  • Survey methodology research

Jon Alexander Krosnick (born January 5, 1959) is a professor of Political Science, Communication, and (by courtesy) Psychology, and director of the Political Psychology Research Group (PPRG) at Stanford University. He is also the Frederic O. Glover Professor in Humanities and Social Sciences and a senior fellow at the Woods Institute for the Environment. A major focus of his research has been questionnaire design and survey research methods. Krosnick has studied the psychology of attitudes, voter choice behavior, and attitudes toward global warming. He was also a co-principal investigator of the American National Election Studies, the nation's longest-running and most comprehensive academic research project exploring voter decision-making.[1] Krosnick consults for several organizations and has testified in court proceedings.

Krosnick's work focuses on the design and methodology of questionnaires and surveys, and he has served as a consultant to government, academia, and industry on these issues. He was a principal investigator of the American National Election Studies from 2005 to 2009, along with Arthur Lupia of the University of Michigan.[1] He was a member of the National Election Study Ad Hoc Committee on Survey Mode, which compiled a report for the National Election Study Board of Overseers on the pros and cons of moving from face-to-face to telephone interviews.[1] He has also studied the psychology of attitudes and researched how voters make up their minds and how campaigns influence them.[1] He has conducted research on American attitudes toward global warming, on how negativity in campaigns affects turnout, and on ballot order effects. He has also served as a television commentator on election nights.[1]

Personal life

Krosnick was born on January 5, 1959, in Philadelphia, Pennsylvania, to Evelyn Rieber Krosnick, an opera singer and music educator, and Arthur Krosnick, a physician who specialized in diabetes research and loved opera.[1] The couple were patrons of George Nakashima and collected furniture he designed for their home in Bucks County, Pennsylvania.[1] Jon Krosnick has a sister, Jody Arlyn, who is a surgeon.[1] He became interested in music at an early age, beginning piano lessons at age 6 and attending a music camp at Interlochen at age 9, where he first encountered jazz drummer Peter Erskine, who would later become a musical influence on him. Krosnick continued playing percussion instruments from elementary school on, eventually becoming a member of the electric jazz band Charged Particles.[1]

Krosnick attended the Lawrenceville School in Lawrenceville, New Jersey, graduating in 1976.[1] He graduated magna cum laude from Harvard University in 1980 with a B.A. in Psychology,[1] then received an M.A. in 1983 and a PhD in Social Psychology in 1986 from the University of Michigan, Ann Arbor.[1] On June 1, 1986, Krosnick married Catherine Ann Heaney.[1] He joined the departments of psychology and political science at Ohio State University, Columbus, as a lecturer in 1985, became an assistant professor in 1986, and was promoted to associate professor in 1991.[1] He later became a full professor, was a member of the Ohio State University (OSU) political psychology program, and co-directed the OSU summer institute in political psychology. In 2004, Krosnick became a professor at Stanford, where his wife also accepted a faculty position. The couple have a daughter who attends Stanford as an undergraduate. Jon and Catherine now live in Portola Valley, near Stanford.[1]

Work in survey methodology

Krosnick and colleagues compared Internet surveys, telephone surveys, and face-to-face (FTF) surveys of probability samples and found that people provide socially desirable answers more often in telephone surveys than in the other two modes. They also found that face-to-face respondents answer more accurately than telephone respondents. Because face-to-face interviews are costly, Krosnick conducted a study that provided computers and an Internet connection to a set of randomly sampled people and invited them to answer survey questions online over the course of a year. Even after subtracting those who refused to participate, this method is known to produce samples that proportionately reflect the population counts of various groups.[1][2]

Opt-in surveys

Krosnick has published studies questioning the use of Internet opt-in surveys. Such surveys do not yield a random sample because participants are a self-selected group. Along with David Yeager, Krosnick concluded that such surveys produced results that diverged from those of traditional surveys, even after statistical adjustments intended to cancel out the effects of their non-random nature.[1] Another study found that opt-in surveys could not be used to compare how a group's behavior or attitudes changed over time, or how their responses to different issues related to one another.[1] Krosnick and Yeager applied the same procedure to weight the raw data demographically so that the surveys were equally representative in terms of gender, age, race, and so on. They then calculated the average error for the surveys on 13 additional measures of "secondary demographics" and other non-demographic factors.[1] The responses of opt-in Internet surveys differed from those of traditional surveys. Krosnick reached similar conclusions using two surveys collected for the U.S. Census Bureau, one a traditional poll and the other an Internet opt-in survey.[1]
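The demographic weighting step described above can be sketched as simple cell weighting (post-stratification): each respondent is weighted by the population share of their demographic cell divided by that cell's sample share. This is an illustrative toy, not Krosnick and Yeager's actual procedure; the cells, population shares, and outcome variable are all invented:

```python
from collections import Counter

# Hypothetical population shares for gender x age weighting cells.
population_share = {
    ("F", "18-44"): 0.24, ("F", "45+"): 0.27,
    ("M", "18-44"): 0.23, ("M", "45+"): 0.26,
}

# Toy sample: opt-in panels often over-represent some groups.
sample = [("F", "18-44")] * 40 + [("F", "45+")] * 20 + \
         [("M", "18-44")] * 30 + [("M", "45+")] * 10

counts = Counter(sample)
n = len(sample)

# Weight = population share / sample share for the respondent's cell.
weights = {cell: population_share[cell] / (counts[cell] / n)
           for cell in counts}

# A weighted estimate: a toy 0/1 outcome per respondent.
responses = [1 if cell[0] == "F" else 0 for cell in sample]
weighted_mean = sum(weights[c] * r for c, r in zip(sample, responses)) / \
                sum(weights[c] for c in sample)
print(round(weighted_mean, 3))
```

Here the unweighted mean would be 0.60 because women are over-sampled; weighting pulls the estimate toward the population composition. Yeager and Krosnick's point is that for opt-in samples even this adjustment leaves residual error on measures outside the weighting variables.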

National Aviation Operations Monitoring Service

The National Aviation Operations Monitoring Service (NAOMS) was an $11.5 million research and development project in which NASA used survey methods to measure aviation safety.[1] The program was created in response to the goal set by the White House Commission on Aviation Safety and Security in 1996 of reducing the risk of air travel accidents by 80 percent over the following 10 years.[1] Krosnick was the lead consultant in developing and implementing the NAOMS survey methodology.[1]

While plane crashes remain rare, NAOMS sought to identify and reduce accident precursors and potential safety issues by regularly surveying commercial pilots, general aviation pilots, ground and flight crew members, and air traffic controllers.[1] The project was designed to provide broad, long-term measures of trends and to gauge the effects of new technologies and aviation safety policies.[1] It implemented a survey with an 80 percent response rate, interviewing a random sample of pilots about safety incidents.[1]

In 2004, NAOMS researchers finished collecting data on the first cohort of pilots, having conducted about 24,000 interviews. To some observers, preliminary findings suggested that some safety-related problems were occurring at astonishingly high rates, in some cases as much as four times the rates previously reported by the FAA. The FAA was “extremely unhappy” with the results and called for the program to be shut down; NASA soon cancelled it.[1] The House Committee on Science and Technology's Subcommittee on Investigations and Oversight later investigated the FAA's role in the ending of NAOMS. Chairman Brad Miller (D-NC) stated that the subcommittee found the FAA did not support NAOMS.[1] In 2006, Associated Press reporter Rita Beamish filed a Freedom of Information Act request for the NAOMS data. For 14 months, NASA rejected the request.[1]

In a final denial letter to the AP, Thomas Luedtke, a senior NASA official, indicated the data would not be released because the findings could damage public confidence in airlines and affect airline profits. Luedtke acknowledged that the NAOMS results "present a comprehensive picture of certain aspects of the U.S. commercial aviation industry."[1] Significant public criticism of NASA's refusal to release the data and its handling of NAOMS prompted Congress to launch an investigation into the matter.[1] Members of Congress from both parties were highly critical of NASA's handling of the affair and demanded that NASA release the NAOMS results. During an oversight hearing, NASA's administrator, Michael D. Griffin, testified that Luedtke's reasoning was a mistake and that NASA would release the data. However, Griffin cast doubt on the reliability of the NAOMS data, cautioning that it had never been validated and warning that "there may be reason to question the validity of the methodology."[1] At the end of 2007, Griffin released some of the NAOMS data.

Many disputed Griffin's criticism and defended NAOMS. The NAOMS survey methods had been extensively peer reviewed and were adapted from proven methods that Krosnick and others had used in similar contexts in published scientific studies.[1] NAOMS had also been thoroughly reviewed by internal and external experts.[1] The International Federation of Professional and Technical Engineers (IFPTE) sent a letter to Congressman Bart Gordon, then chairman of the House Committee on Science, Space and Technology, stating that “there was no valid scientific basis for the Administrator's technical criticism of the NAOMS project.”[1] In a 2004 report, the National Academy of Sciences officially recommended that “NASA should combine NAOMS methodology and resources with the ASRS program data to identify aviation safety trends."[1] After thorough review, the Office of Management and Budget, which reviews all federal survey projects to ensure they are optimally designed, approved NAOMS.[1] The union representing the majority of commercial pilots in the United States deemed NAOMS “tremendously valuable.”[1] In 2009, the Government Accountability Office investigated the NAOMS survey methodology and found that “the project was planned and developed in accordance with generally accepted principles of survey planning and design...[and] as a research and development project, NAOMS was a successful proof of concept.”[1]

Work in political psychology

Studies in voter turnout

Among his work in political psychology, Krosnick has studied the psychology behind voter turnout. In 2008, Krosnick published "Why do people vote? A psychological analysis of the causes of voter turnout," which identified several factors that increase or depress voter turnout during elections, among them age, race, residential mobility, and marital status.[3] It also showed that, contrary to popular belief, an increased sense of diversity within communities actually discouraged people from voting.[4] The report further identified the most effective methods candidates could use to increase turnout. Of common campaign practices, Krosnick's study found canvassing to be the most effective way to increase turnout, whereas practices such as phone calls to people's homes seemed to have no effect at all. The study also found that involving people in civic service made them more likely to vote in coming elections.[5]

Krosnick later traveled to Washington to present studies on voting psychology at the annual meeting of the American Political Science Association.[1] This study was conducted through the National Election Studies (NES), which has been funded by the National Science Foundation for the past 30 years and has involved researchers from Princeton, Northwestern, and the University of Chicago.[1] It spanned a 16-year period and involved more than 5,000 Americans in face-to-face interviews over the course of four elections. The resulting analysis of voter turnout was part of a larger study drawing on NES data from seven presidential elections and more than 25,000 respondents. These studies revealed a new way of thinking about voter decision-making that, according to Krosnick, was more consistent with psychological theory than the reigning theories in political science at the time.[1]

One result of the study indicated that higher turnout occurred when one candidate was disliked to the point of being seen as a threat while the other was perceived as a hero. Subjects who liked both candidates were not as likely to vote, even if they liked one significantly more than the other; the same held for subjects who disliked both candidates, because in either case voters would be comparably happy or unhappy with either outcome.[1] The studies also indicated that mudslinging in political campaigns effectively increased turnout, provided candidates vilified their opponents tastefully without tarnishing their own image. They further showed that if people liked or disliked a candidate at first encounter, their opinion was difficult to change later.[1] In fact, Krosnick's studies show that people become more resistant to changing their views as they learn more about a candidate. At the start of a campaign, most candidates are viewed in a mildly positive light; after they present their positions, impressions solidify, and information gained earlier in the campaign tends to have a greater impact. Krosnick calls this the "asymmetrical" model of voting behavior.[1] It suggests that the prevailing marketing strategy for campaigns - saving money to advertise more heavily at the end of a campaign - is misguided.[1]

Ballot order studies

Krosnick and a colleague, analyzing data from an Ohio election, concluded that the candidate listed first on a ballot received roughly 2% more votes in half of the races they studied.[1][6] The effect was stronger in races where voters had no clear a priori choice.[6] While the effect had been posited for more than a century, the study produced systematic evidence for it.[7] Krosnick's testimony to this effect led a court to invalidate an election in Compton, California. The effect has carried over to other areas as well.[1]

Krosnick and others conducted a study of the 2000 U.S. presidential election in Ohio, California, and North Dakota and found that candidates gained votes when listed first on the ballot rather than later.[1] Krosnick hypothesized that the effect may arise from voters who, feeling compelled to cast a vote, choose the first option on the list.[7] He believes Bush benefited from this effect in the 2000 presidential election in Florida,[1] and that the exit poll of the 2004 U.S. presidential election was skewed toward the Democratic candidate, John Kerry, because he was listed first on the questionnaire.[8]
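Ballot-order studies of this kind exploit the rotation of name order across precincts, comparing a candidate's vote share where they are listed first with where they are not. A minimal sketch of that comparison, with invented precinct data rather than the Ohio figures:

```python
# Toy precinct-level data: (vote share for candidate A, A listed first?).
precincts = [
    (0.52, True), (0.51, True), (0.53, True), (0.50, True),
    (0.50, False), (0.49, False), (0.51, False), (0.48, False),
]

first = [share for share, listed_first in precincts if listed_first]
later = [share for share, listed_first in precincts if not listed_first]

# Name-order effect: mean share when listed first minus mean share otherwise.
effect = sum(first) / len(first) - sum(later) / len(later)
print(f"{effect * 100:.1f} percentage points")
```

Because Ohio rotates ballot order across precincts, the "listed first" group is effectively randomized, so a simple difference in means like this (plus significance testing) estimates the causal effect.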

Patient Protection and Affordable Care Act

In 2010 and 2012, Krosnick conducted national surveys to explore Americans' understanding of the Patient Protection and Affordable Care Act, better known as Obamacare.[1] More than 2,600 participants were asked 18 questions about whether a given provision was in the bill, and how certain they were of their answers. None of the participants answered all of the questions correctly,[1] and only 14% answered a majority of questions correctly with high certainty.[1] Besides not knowing about provisions that were in the bill, participants also had trouble identifying provisions that were not in it at all. For instance, only 17% of survey-takers were confident that the bill did not contain death panels, only 11% recognized that there was no provision giving illegal immigrants free healthcare, and all but 14% thought the bill required smokers to pay $1,000 per year.[1]

Krosnick's study further found that the more accurately people understood the bill, the more likely they were to support it. In fact, a majority of respondents favored nine of 12 provisions in the legislation.[1] The only three components not supported by an American majority were: "U.S. citizens without health insurance have to pay fines if they don't have specific reasons," "New fees for companies that make drugs," and "New fees for health insurance companies."[1] The research team concluded that if everyone in America knew enough to answer all the questions correctly, the approval rating of Obamacare would rise from 32% to 70%.[1]

Work in climate change

Krosnick has both conducted surveys on global warming and analyzed previous ones, some as part of his work at Stanford's Woods Institute for the Environment. His 2007 survey found that most Americans accepted global warming but, by a two-thirds majority, were not convinced that significant efforts were needed to stop it. In Krosnick's view, scientists found this lack of public concern a problem. He considered the media's practice of giving equal coverage to both sides of the debate, rather than coverage proportional to how strongly each view was represented among experts, a prime reason the public did not believe scientists were united on the issue. He has also analyzed a 2006 poll by ABC News, TIME, and Stanford, which showed the public had grown more concerned about global warming over the previous decade, with more than two-thirds believing in unsettled weather patterns caused by human activity. Krosnick believes that not acting now will cost the world more in the future.[1]

Studies in public belief and trust

Starting in 2008, polls began to show a decline in the percentage of Americans who believed there was solid evidence for global warming and considered it a serious problem,[1] from 80% in 2008 to 75% in late 2009.[1] In response, Krosnick conducted surveys and drew his own conclusions about this supposed dip in public belief.

Krosnick, who has run polls on public attitudes toward global warming since 2006, conducted a 2010 survey of 1,000 Americans using the same questions as in previous years, along with new questions about recent and relevant controversies.[1] One concerned the 2009 hacking of the email archive of the Climate Research Unit at the University of East Anglia; the retrieved emails supposedly revealed extensive data manipulation in climate research.[1] Krosnick's surveys showed that, of the 32% of subjects who were aware of this controversy, 9% believed it indicated that climate scientists should not be trusted. A subsequent controversy involved the IPCC's fourth assessment report on climate change: of the 13% of subjects who knew about it, 54% believed it indicated that climate scientists were untrustworthy.[1]

As for the apparent skepticism among Americans toward global warming, Krosnick believed the dip was not actually the result of declining public belief in global warming, but of the survey questions themselves. For instance, one of the central questions in the Pew Research Group's survey was, “From what you’ve read and heard, is there solid evidence that the average temperature on earth has been getting warmer over the past few decades, or not?” Krosnick argued that the question's wording taints its intent and results:[1] it does not ask for the respondent's personal beliefs about global warming, but about what they have heard or read. Krosnick also critiqued a question used in repeated Gallup surveys: "Thinking about what is said in the news, in your view, is the seriousness of global warming generally exaggerated, generally correct, or generally underestimated?" The share of respondents answering "generally exaggerated" rose from 30% to 48% between 2006 and 2010, but Krosnick noted that, given the question's wording, this increase could reflect either a change in views on global warming or a change in the media.[1]

Since 2009, Krosnick's findings have diverged from those of other organizations. As of 2012, Gallup and Pew polls report that the share of Americans who believe in global warming hovers around 50%, whereas Krosnick's latest poll suggests 83%. His poll also indicated that, among believers, a majority thought fossil fuels and human activities are factors in the phenomenon. Krosnick also asked two questions: "What is the most important problem facing this country today?" and "What will be the most important problem facing the world in the future if nothing is done to stop it?" In response to the first, respondents ranked the economy first and global warming last; for the second, the results were reversed.[citation needed] His surveys have also indicated that 85% of Americans accept the idea of global warming and endorse steps to address it, even if higher costs are necessary. Krosnick has acknowledged that such high levels of agreement are rare on major questions of foreign policy, but the key division among the public lies in trust of the scientists who study climate change.[1]

Advocacy and climate change

In 2012, Krosnick conducted another study, prompted by a recent small dip in public belief in climate change. A national survey revealed that low-income and less-educated respondents were willing to trust a scientist who presented evidence for global warming, until that same scientist urged listeners to pressure their government toward greener policies. At that point, viewers immediately became suspicious of the scientist's motives and, by extension, of the science they had presented.[1]

To reach this conclusion, Krosnick recruited a national sample of 793 Americans and split them into three groups, each viewing one of three videos: a scientist talking about the science of climate change, the same video with an added appeal to demand action from political representatives, or, as a control, a video about making meatloaf.[1] After viewing its video, each group filled out a survey on attitudes toward global warming.[1]

Krosnick found that subjects who watched the scientist discuss climate change alone gave the same results as the group that watched the meatloaf video.[1] But in the group that saw the scientist make a political appeal after his discussion, trust in the scientist fell 16 percentage points, from 48% to 32%. Belief in the scientist's accuracy fell from 47% to 36%, overall trust in all scientists fell from 60% to 52%, belief that government should "do a lot" to stop climate change fell from 62% to 49%, and belief that humans caused climate change fell from 81% to 67%.[1]

These changes occurred only in a cohort of 548 respondents who either had incomes below $50,000 or no more than a high school diploma; more educated or wealthier respondents showed no significant reaction.[1]

Climate change and voting

Krosnick has also combined his work on global warming and voter choice in two studies. The first was based on data collected from randomly selected households before and after the 2008 election: the surveys asked voters' opinions of McCain's and Obama's climate change policies before the election, and after it asked whom they had voted for. He then conducted a similar study of climate change and the 2010 congressional elections. Both studies implied that Democrats who vigorously pursued green goals garnered more votes than Democrats who remained silent, and that Republicans who took "not-green" positions won fewer votes than Republicans who stayed silent.[1] The studies reflect the growing concern over climate change in America and the ways those concerns affect elections.

Krosnick also authored a study identifying a subset of single-issue voters who could be induced to turn out if candidates appealed to them on climate change. Essentially, Krosnick argued that by speaking on climate change, candidates could enhance turnout and attract voters, especially in a political climate where neither candidate is a clear winner on the most significant issues.[1]

Work in attitude research

Krosnick has investigated in detail how attitudes are formed and how they relate to survey responses. He has modeled affect, the emotional component that influences attitudes, within a framework for long-term memory modeled on a computer's short-term random-access memory and longer-term disk storage.[9] Long-term memory is posited to consist of interconnected nodes, and Krosnick models affect as tags attached to a node - say, for a political candidate - weighting it and influencing other nodes through its connections.[10] The well-informed and politically savvy are expected to have more developed networks of such nodes.[11]
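The node-and-tag model can be caricatured in a few lines of code: each node in the network carries an affect tag, and evaluating an object sums its own affect with link-weighted affect spreading from connected nodes. This is a loose illustration of the idea with invented nodes and numbers, not Krosnick's actual model:

```python
# Each node carries an affect tag in [-1, 1]; links carry association strengths.
affect = {"candidate": 0.2, "party": 0.5, "tax_policy": -0.4, "scandal": -0.8}
links = {"candidate": {"party": 0.9, "tax_policy": 0.6, "scandal": 0.3}}

def evaluate(node):
    """One-step spreading activation: own affect plus link-weighted neighbors."""
    total = affect[node]
    for neighbor, strength in links.get(node, {}).items():
        total += strength * affect[neighbor]
    return total

print(round(evaluate("candidate"), 3))
```

In this sketch, a well-informed voter would simply have more nodes and denser links, so more affect tags feed into the candidate evaluation, consistent with the claim that the politically savvy have richer network structures.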

Krosnick has also researched attitude strength, which he regards as a subjective element,[12] one possible measure being the attachment to a topic that a respondent expresses in a self-report survey. Using the statistical technique of factor analysis, he showed that this form of attitude strength has four distinct dimensions: the polarized positive or negative intensity (valence) of attitudes, the ease of retrieving the associated memories (accessibility), the personal beliefs driving the attitudes, and the degree of thought given to the subject.[13]
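Factor analysis recovers such dimensions from the correlation structure of survey items: items driven by the same latent dimension correlate strongly with one another and only weakly with items driven by other dimensions. A toy sketch of that pattern, with two synthetic dimensions standing in for the four Krosnick found (all data invented):

```python
import random
random.seed(0)

# Synthetic self-report data: "intensity" drives items 0-1,
# "accessibility" drives items 2-3, plus item-specific noise.
n = 500
rows = []
for _ in range(n):
    intensity = random.gauss(0, 1)
    access = random.gauss(0, 1)
    noise = lambda: random.gauss(0, 0.5)
    rows.append([intensity + noise(), intensity + noise(),
                 access + noise(), access + noise()])

def corr(i, j):
    """Pearson correlation between survey items i and j."""
    xi = [r[i] for r in rows]; xj = [r[j] for r in rows]
    mi = sum(xi) / n; mj = sum(xj) / n
    cov = sum((a - mi) * (b - mj) for a, b in zip(xi, xj)) / n
    vi = sum((a - mi) ** 2 for a in xi) / n
    vj = sum((b - mj) ** 2 for b in xj) / n
    return cov / (vi * vj) ** 0.5

# Same-dimension items correlate strongly; cross-dimension items barely at all.
# Factor analysis exploits exactly this block structure to recover dimensions.
print(round(corr(0, 1), 2), round(corr(0, 2), 2))
```

A factor-analysis routine applied to such data would report two factors, one loading on items 0-1 and one on items 2-3, just as Krosnick's analysis reported four distinct dimensions of attitude strength.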

On the practical question of how attitudes affect survey results, Krosnick, in line with other studies, has looked separately at well-informed subjects aware of political issues and at ill-informed or unmotivated respondents. In joint research with colleagues, he found that knowledgeable subjects used different organized patterns of thought (schemas) and knowledge-processing strategies than naïve or undermotivated subjects.[14] Counterintuitively, in certain circumstances the experts were easier to prime with specific appeals or political advertisements.[15] The other group tended to give more evasive answers that avoided the question,[16] especially when they did not consider the issue relevant.[17] Resulting biases included a tendency to settle on the midpoint of a scale with an odd number of divisions, greater susceptibility to leading questions, and answering most questions with the same number on a scale, especially toward the end of the survey - a form of satisficing.[18] Together, these increased the likelihood and magnitude of measurement error for such respondents.[19]
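Some of the satisficing patterns described above, such as straight-lining and midpoint selection, can be screened for mechanically in raw response data. A minimal sketch with invented 7-point-scale responses (illustrative screens, not Krosnick's actual measures):

```python
# Toy response grids: each respondent answers seven 1-7 scale items.
grids = {
    "engaged":       [4, 2, 6, 5, 1, 7, 3],
    "straightliner": [4, 4, 4, 4, 4, 4, 4],  # same point every time
    "midpointer":    [4, 4, 3, 4, 4, 5, 4],  # clings to the scale midpoint
}

MIDPOINT = 4  # midpoint of a 7-point scale

def flags(responses):
    """Simple satisficing screens: straight-lining and midpoint reliance."""
    straightlining = len(set(responses)) == 1
    midpoint_rate = sum(r == MIDPOINT for r in responses) / len(responses)
    return straightlining, midpoint_rate

for name, resp in grids.items():
    print(name, flags(resp))
```

Flagged respondents are not necessarily discarded; such screens are typically used to quantify how much satisficing contributes to measurement error, which is the effect Krosnick documented.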

Studies of racism

Between 2008 and 2012, Krosnick helped develop surveys with the AP Poll to measure racial attitudes in the U.S. The surveys revealed that both implicit and explicit racism had increased in America since Obama's election in 2008: 51% of Americans expressed explicit anti-black attitudes, compared with 48% in 2008, and the share holding implicit anti-black attitudes jumped from 49% to 56%.[1] Many black Americans have also reported perceiving antagonism since Obama took office.[1] The percentage of non-Hispanic whites expressing anti-Hispanic attitudes rose from 52% in 2011 to 57% in 2012.[1] The researchers estimated that these attitudes would cost Obama 2% of the popular vote in the 2012 election.[1]

In the survey, conducted online, respondents were shown a picture of a black, Hispanic, or white male before a neutral image and were then asked to rate their feelings toward the neutral image. Those feelings were taken as a measure of implicit racial attitudes toward the prior image. Responses were correlated with age, partisanship, and views of Obama.[1]

Work as an expert witness

Krosnick frequently works as an expert witness. For instance, he was hired by the lawyers for Tyson Foods in a case in which the company was accused of polluting the Illinois River watershed.[20] In another case, Krosnick worked on behalf of Philip Morris when the company was sued for engaging "in deceptive practices designed to mislead the public regarding the harmful and addictive properties of cigarette smoking."[21]

Academic programs

Among the academic programs Krosnick directs at Stanford are the Political Psychology Research Group, which focuses on the study of public and political issues[1] such as global warming,[1] and the Summer Institute in Political Psychology.[1] The Summer Institute began as an annual program at Ohio State University in 1991 under the direction of Margaret Hermann and moved to Stanford's campus in 2005. Today the program offers a three-week training experience for up to 60 participants.[1]

Awards and recognition

  • Bausch and Lomb Science Award, 1976.
  • National Institute of Mental Health Graduate Training Fellowship, 1982.
  • Phillip Brickman Memorial Prize for Research in Social Psychology, 1984.[1]
  • Pi Sigma Alpha Award for the Best Paper Presented at the 1983 Midwest Political Science Association Annual Meeting, 1984.[1]
  • Elected Departmental Associate, Department of Psychology, University of Michigan, recognizing outstanding academic achievement, 1984.
  • Invited Guest Editor, Social Cognition (special issue on political psychology, Vol. 8, No. 1, May), 1990.
  • Brittingham Visiting Scholar, University of Wisconsin, 1993.
  • Erik H. Erikson Early Career Award for Excellence and Creativity in the Field of Political Psychology, International Society of Political Psychology, 1995.[1]
  • Fellow, Center for Advanced Study in the Behavioral Sciences, Stanford, California, 1996–1997.[1]
  • Elected Fellow, American Psychological Association, 1998.
  • Elected Fellow, Society for Personality and Social Psychology, 1998.
  • Elected Fellow, American Psychological Society, 1998.
  • Appointed University Fellow, Resources for the Future, Washington, DC, 2001.
  • Prize for the Best Paper Presented at the 2002 Annual Meeting of the American Political Science Association, Section on Elections, Public Opinion, and Voting Behavior, 2003.[1]
  • Elected Fellow, American Academy of Arts and Sciences, 2009.[1]
  • Elected Fellow, American Association for the Advancement of Science, 2010.[1]

Books

  • Weisberg, H.; J. A. Krosnick, and B. Bowen (1989). Introduction to survey research and data analysis. Chicago: Scott, Foresman. 
  • Jon A. Krosnick, ed. (1990). "Thinking about politics: Comparisons of experts and novices by Jon A. Krosnick". Social Cognition 8 (1). New York, NY: Guilford Press. doi:10.1521/soco.1990.8.1.1. 
  • Petty, R. E.; J. A. Krosnick (1995). Attitude strength: Antecedents and consequences. Hillsdale, NJ: Erlbaum. 
  • Weisberg, H.; Krosnick, J. A.; Bowen, B. D. (1996). An Introduction to Survey Research, Polling, and Data Analysis (3 ed.). Thousand Oaks, CA: Sage. ISBN 0-8039-7401-9. 
  • Carson, R. T.; M. B. Conaway, W. Hanemann, J. A. Krosnick, R. C. Mitchell, and S. Presser (2004). Valuing oil spill prevention: A case study of California’s central coast. Dordrecht, The Netherlands: Kluwer Academic Publishers. 
  • Krosnick, J. A.; L. R. Fabrigar (2006). The handbook of questionnaire design. New York: Oxford University Press. 
  • Krosnick, Jon; Pasek, Josh (2010). "Optimizing survey questionnaire design in political science: Insights from psychology". In Leighley, Jan E. The Oxford Handbook of American Elections and Political Behavior. Oxford University Press. doi:10.1093/oxfordhb/9780199235476.003.0003. ISBN 978-0-19-923547-6. 
  • Callegaro, M.; R. Baker, J. Bethlehem, A. Göritz, J. A. Krosnick, and P. J. Lavrakas (2013). Online panel research: A data quality perspective. New York: John Wiley and Sons. 

Notes

  1. Bradburn et al. 1999.
  2. Atkeson 2010, pp. 15–16.
  3. Harder 2008, pp. 531–535.
  4. Harder 2008, pp. 534–535.
  5. Harder 2008, p. 540.
  6. Stewart et al. 2008.
  7. Price 2008.
  8. Sproul 2007, pp. 18–19.
  9. Steenbergen & Lodge 2006, p. 128.
  10. Steenbergen & Lodge 2006, p. 160.
  11. Steenbergen & Lodge 2006, p. 143.
  12. Weisberg & Greene 2006, p. 99.
  13. Weisberg & Greene 2006, p. 100.
  14. Althaus 2010, p. 100.
  15. Althaus 2010, p. 21.
  16. Althaus 2010, p. 66.
  17. Althaus 2010, p. 153.
  18. Althaus 2010, pp. 161–164.
  19. Althaus 2010, p. 37.
  20. http://www.oag.state.ok.us/oagweb.nsf/0/7db11b73010bff99862572b4006f60fb/$FILE/Complaint.pdf
  21. http://legacy.library.ucsf.edu/tid/wir20d00/pdf;jsessionid=1A005135AA27AC5B7B46ECA9D5665A09.tobacco03

This article is issued from Wikipedia. The text is available under the Creative Commons Attribution/Share Alike; additional terms may apply for the media files.