
FOSTERING TRUST IN

THE ADVICE OF
EXPERTS
Alexia Gaudeul
CollEcons PhD colloquium
Georg-August-Universität, Göttingen
Thursday, April 25, 2019
Outline
1. Focus of this presentation and definition of terms
2. The context: A technocratic society.
3. The diagnosis: Lack of trust in experts
4. The causes: Why do people not trust in experts?
5. The solutions: What can be done to promote trust in experts?
6. Own work: Nudges, robo-advisers and feedback systems.
7. Summary and Conclusion
Focus of this presentation
• There is no “view from nowhere” (Nagel, 1989, Sugden, 2018)
• I am a behavioral economist, with good knowledge of decision-making research
and the literature on judge-adviser systems.
• I will deal mainly with advice in economic decision making
• I will speak mainly about behavioral issues in the expert–advisee relationship.
Definition of terms
• Expert: Someone with extensive knowledge, experience and skill in
solving problems in a specific domain.
• Trust: Belief in knowledge, honesty, fairness and benevolence of
someone or of an institution.
• In our context, trust is shown by seeking information from someone, listening to them, and following their advice.
CONTEXT: A
TECHNOCRATIC
SOCIETY
1. A technocratic society
2. Why do we need experts?
3. What do experts provide?
A technocratic society
• Prominent role of experts in a technocratic society (Saint-Simon, 1817)
• Decision-makers selected on the basis of their expertise in specific areas of
decision-making.
• Belief in the merit of…
• applying scientific method to administration and policy making.
• finding technological solutions to societal problems.

• Related to:
• Bureaucratic state (Max Weber, 1922)
• Managerial society (James Burnham, 1941)
• Post-industrial economy (Daniel Bell, 1974, Alain Touraine, 1971).
Why do we need experts?
• Experts needed in many areas because of
• complexity of the operation of modern States and large companies,
• complicated technology,
• lack of experience with infrequent, remote, specialized decisions.

• What people want from experts (Stehr and Grundmann, 2011)


• Reduction in uncertainty about the future.
• Reduction in complexity of problems
• Advice that is transparent and easy to understand.
• Simple and direct solutions.
• Reassurance, legitimization and support (Dalal & Bonaccio, 2010).
Demand ↔ Offer
What decision-makers want ↔ What advisers offer
Certainty ↔ Uncertainty
Control over decisions ↔ Delegation of decisions
Simple solutions ↔ Complex plans of action
Understandable advice ↔ Specialized technical knowledge
Personally adapted solutions ↔ General recommendations
Confirmation of beliefs ↔ New information
Support for own decisions ↔ Alternative possibilities
Quick results ↔ Long-term plans
Why do we need trust in experts?
• Many claims made by experts are unverifiable by those outside their
field of expertise.
• Mistakes and lies by experts may not be found out and punished.
• One may never convincingly prove the expert lied (credence goods, Dulleck &
Kerschbamer, 2006; Dulleck, Kerschbamer, & Sutter, 2011, Balafoutas et al.,
2013).
• For an abuse to be exposed, we need to trust that “someone” is keeping
experts accountable.
• “Quis custodiet ipsos custodes?” (Juvenal, c.110, O'Neill, 2002)
• Other experts, media, police, auditors, regulatory authorities…
DIAGNOSIS: LOW
TRUST IN EXPERTS
1. Unpopular rule by experts
2. Low trust in a range of experts
3. Rise of populist parties
4. Decision-makers often do not seek or follow expert advice
Rule by experts is not popular
• Rule by experts is not popular (Pew Research Center, October, 2017)
• However, still preferred to dictatorship or military rule…
Low trust in institutions and their
leaders.
• 2018 Edelman Trust Barometer for Germany
• 40% of Germans trust Government, Media, Business, NGOs.
• Lowest ratings for journalists, government officials and regulators, business
leaders.
• Only technical and academic experts are more likely to be trusted than “a
person like yourself”.
Rise of populist parties and
disinformation.
• Rise of populist parties
• Trump in the US, AfD in Germany, National Front in France, Northern League
and Five Star Movement in Italy, UK Independence Party in the UK.
• Populists have low trust in institutions
• Pit “the popular will” against the rule of “the cosmopolitan liberal elite”
(Canovan, 1999, Oliver & Rahn, 2016).
• “The people of this country have had enough of experts” (Conservative MP
Michael Gove during Brexit campaign)
• Diffusion of conspiracy theories (Sunstein & Vermeule, 2009).
Distrust of science
• Spread of distrust in scientific recommendations
• On climate change
• On vaccination
• On genetically modified organisms
• On nutrition
• On evolution

(Van der Linden & Lewandowsky, 2015, Nichols, 2017, Rutjens et al. 2018)
WHY DO PEOPLE NOT
TRUST IN EXPERTS?
1. Failures of regulatory institutions
2. Ineffective policy advice
3. Biased personal advice
4. Crisis in the sciences
Failures of regulatory institutions
• 2008-2009: Financial crisis (Crotty, 2009, Roth, 2009)
• Bad regulatory oversight, under-estimated systemic risk, biased credit rating
agencies and auditors (Sikka, 2009, White, 2010)
• 2010-2014: Greek Financial Crisis
• Loss of trust in European institutions (Roth, 2009)
• 2011-2015: Scandal about plagiarism in doctoral dissertations (Guttenberg).
• 2013: Scandal about fraud in organ transplants.
• 2018: Scandal about diesel emissions tests (VW)
Other failures of expertise
• Economic policy seen to benefit large banks, large corporations and the
wealthy (Pew Research Center, 2015)
• Misleading financial advice (Mullainathan, Noeth & Schoar, 2012)
• Deceptive and confusing offers (Gabaix & Laibson, 2006, Ellison & Ellison,
2009, Gaudeul & Sugden, 2012, Heidhues et al., 2016)
• Replication crisis in the social sciences (Open Science Collaboration, 2015,
Camerer et al, 2018)
CLASSIFICATION OF
CAUSES
1. Wrong advice, ex-post or ex-ante.
2. Consensual or conflicting advice
3. Biased or neutral advice
4. Inapplicable and unwelcome advice
Wrong advice, ex-post or ex-ante.
• Even correct advice can be wrong ex-post → disappointment (Gul, 1991).
• Failure to educate people about uncertainty (Tauritz, 2012)
• E.g. financial advice on investments, house buying, borrowing.
• Overconfident experts (Angner, 2006)
• Failure to anticipate the extent of uncertainty in predictions: calibration (a numerical illustration follows this list).
• Failure to anticipate issues
• Wrong models, wrong assumptions, misdirected research (Colander et al, Ch. 13
in Lanteri & Vromen, 2014)
• e.g. diesel engines, nuclear plants, insecticides, asbestos, opioid prescriptions,
lead in paint, etc...
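As a numerical illustration of what calibration means here (the numbers below are invented, not from the talk): an expert is well calibrated when events to which she assigns probability p in fact occur with relative frequency close to p.

```latex
% Invented numbers for illustration: an expert attaches 90% confidence
% to each of 50 forecasts, but only 30 of them turn out to be correct.
\[
  \text{hit rate} = \frac{30}{50} = 0.6 \;<\; 0.9 = \text{stated confidence},
\]
% so the expert is overconfident: stated confidence exceeds the observed hit rate.
```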
Wrong advice, ex-post or ex-ante.
[Figure: probability distributions over outcome values, comparing an over-confident prediction and a biased prediction with the correct ex-ante predictive distribution; the realized outcome lies outside the predicted range (“unpredicted result”). Axes: probability vs. values.]
Consensual or conflicting advice
• Problem when experts conflict
• Generates uncertainty and thus anxiety (Carleton, 2016).
• E.g. “wicked problems” without clear solutions, such as immigration, climate
protection, diets, fitness, addictions.
• Problem when experts agree
• Herding (Banerjee, 1992).
• This leads to lack of responsiveness to new information & slow response to
crises and citizens’ concerns.
Biased advice
• Biased, self-serving advice
• E.g. medical doctors sponsored by pharmaceutical firms (Thompson, 1993),
• E.g. financial products biased towards high-commission products (Oehler and
Kohlert, 2009, Mullainathan et al., 2012, Anagol & Sarkar, 2017)
• Perception of bias leads people to discount advice (Choo, 1964).
• This leads to even more biased advice (Chung & Harbaugh, 2018).
• e.g. warnings of catastrophe in the case of Brexit (“Project Fear”)...
Biased advice
• Neutral advice may be impossible
• Because experts differ from most people (Sapienza & Zingales, 2013)
• Because experts may be paid by different people than those they advise → conflicts of interest.
Inapplicable and unwelcome advice
• Unhelpful and unrealistic advice that is difficult to follow
• E.g. drive slower, stop using plastic, do not look at your investments…
• E.g. nutrition (Kearney & McElhone, 1999), medication adherence (Brown &
Bussell, 2011), education of kids, vaccination, politics, climate change.
• Unwelcome, unsolicited advice (Fitzsimons & Lehmann, 2004, Sugden,
2016)
→ Information avoidance (Golman, Hagmann and Loewenstein, 2017)
SOLUTIONS TO A
“WICKED” PROBLEM
What can be done to promote trust in
experts?
• A “wicked problem” (Rittel & Webber, 1973)
• Whichever way you try to solve the issue, you make it worse in another way.
• Many conflicting objectives:
1. Project authority and confidence, and yet make people aware of uncertainty.
2. Be neutral and yet have experience.
3. Give personal advice and yet reach many.
4. Change behavior and yet respect the preferences of the advisee.
Project confidence or explain
uncertainty?
• People trust confident advisers more than those who admit uncertainty
• Sniezek & Buckley, 1995; Sniezek & Van Swol, 2001; Peterson & Pitz, 1988; Zarnoth & Sniezek, 1997
• Therefore, experts who admit uncertainty are not followed.

• However, advisers who project confidence and are wrong are punished
more than those who are less confident.
• Tenney et al, 2007; Tenney, Spellman & MacCoun, 2008
Project confidence or explain
uncertainty?
• Possible solutions:
• Track adviser accuracy over time (MacCoun, 2015)
• Educate advisees about uncertainty (Tauritz, 2012)
• Why not following the advice does not necessarily mean doing badly.
• Why following advice does not necessarily lead to doing well.
• Better representation and explanation of uncertainty (example next slide).
Better representation of uncertainty.
Promote neutral advice?
• Advisers often face conflicts of interests.
• Making people aware of biases does not help, as:
• Advisers do not recognize their own bias (Sah, 2012)
• Advisees do not correct for bias (De Meza et al, 2010, Ismayilov & Potters, 2013)
• Disclosing bias makes advisers feel allowed to be even more biased
(Loewenstein et al, 2011)
• Solutions:
• Seek second opinions? (Sah & Loewenstein, 2015)
• Aggregate advice from different sources? (a toy sketch follows below)

→ own project with Gangl and Kulke.
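As a toy illustration of the aggregation idea (the numbers are invented and this is not the design of the project), a simple average of several advisers' estimates dilutes any single adviser's bias:

```python
# Toy sketch: combining point estimates from several (possibly biased) advisers.
# The figures are invented for illustration only.
from statistics import mean

def aggregate_advice(estimates):
    """Combine advisers' point estimates by taking the unweighted mean."""
    return mean(estimates)

# Suppose the true value is 100 and three advisers lean in different directions.
advice = [90.0, 105.0, 112.0]
print(aggregate_advice(advice))  # 102.33..., closer to 100 than either 90 or 112
```

Weighting advisers by their past accuracy, or discarding those with known conflicts of interest, are natural refinements of the same idea.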


Give advice in person and yet remain
accessible.
• In-person advice is more likely to be taken up.
• E.g. behavioral counseling to quit smoking (Orleans et al, 1991), promoting
adherence to long-term therapies (Sabate, World Health Organization, 2003).
• However, giving advice in person is time-consuming.
• So personal advice is available only to the rich, or for problems that can be
solved quickly.
• It is not available to the poor, and it is very costly for behavioral issues.

• Possible solution: Robo-advisers with automated advice

→ own project with Crosetto and Giannetti


Change or respect preferences?
• Should advice address the wrong beliefs and biases of the advisee?
• Such advice should still be accepted, but people are often overconfident and overvalue their
own initial judgments (Gardner & Berry, 1995; Harvey & Fischer, 1997).
• Solutions:
• Explain reasons for the advice? (Tzioti et al, 2014)
• Explain why intuition is wrong?
• Example: Diversifying is not always best ex-post…
• Example: The winner's curse: why one should not bid what one believes the good is worth.
• Give advice gradually, minimize difference between advice and what the person
already believes?
• Nudges → influence action without explaining reasons.

→ own work with Kaczmarek.


OWN WORK AND
PROJECTS
1. Do nudges respect people’s preferences?
2. Robo-advisers for financial advice
3. Feedback systems and their impact on experts
Nudges and social preferences
• Nudges are a tool to convey advice in a way that is “respectful of people’s
own preferences”
• Orient people towards the “correct” decision without forbidding alternatives
(Camerer et al, 2003, Sunstein & Thaler, 2003).
• A cost-effective alternative to giving information or training people (Webb &
Sheeran, 2006).
• However, are nudges only a temporary barrier to one’s inclinations?
• Nudges, even applied over long periods, do not seem to change preferences
(Allcott & Rogers, 2014).
• In Gaudeul & Kaczmarek (2018, forthcoming), we elicit attitudes towards the nudge and the motivation to follow it, and exploit attrition in our subject pool as a measure of motivation.
Nudges and social preferences
• In one treatment, we “nudge” decision-makers to make a donation.
• Those who promise to donate in this treatment are less likely to translate this promise into actual donations.
• Overall, the total amount donated is therefore not different between treatments.
• This means that “nudges” may be self-defeating.

Robo-advisers for behavioral issues
• Algorithm delivering “optimal” advice given elicited preferences (a minimal sketch follows the examples below)
• E.g. elicit opinions on a range of social and economic issues, return the closest candidate in elections.
• E.g. elicit risk aversion, patience, goals, etc., return investment recommendations.
• E.g. analyze past purchases and tastes, return possible purchases.
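A minimal, hypothetical sketch (in Python) of the kind of rule such an adviser might apply; the elicitation questions, thresholds and asset classes are illustrative assumptions, not the design used in the project:

```python
# Hypothetical sketch of a rule-based robo-adviser: it maps two elicited
# preferences to a stock/bond allocation. All thresholds are illustrative only.

def elicit_preferences(risk_aversion, horizon_years):
    """Package the answers to two simple elicitation questions.

    risk_aversion: self-reported scale from 1 (risk-seeking) to 10 (very averse).
    horizon_years: intended investment horizon in years.
    """
    return {"risk_aversion": risk_aversion, "horizon_years": horizon_years}

def recommend_allocation(prefs):
    """Return a stock/bond split given the elicited preferences."""
    # Heuristic starting point: longer horizons allow a larger stock share.
    stock_share = min(0.9, 0.3 + 0.03 * prefs["horizon_years"])
    # Scale the stock share down as risk aversion rises.
    stock_share *= (11 - prefs["risk_aversion"]) / 10
    return {"stocks": round(stock_share, 2), "bonds": round(1 - stock_share, 2)}

prefs = elicit_preferences(risk_aversion=7, horizon_years=20)
print(recommend_allocation(prefs))  # {'stocks': 0.36, 'bonds': 0.64}
```

The point is only that the mapping from stated preferences to a recommendation is explicit and auditable; whether people then follow such advice is the behavioral question raised below.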

• Issue: “Algorithm aversion” (Önkal et al, 2009)
• People do not use algorithms as much as they should.
• They avoid robo-advisers more often than human advisers (Dietvorst, Simmons & Massey, 2015).
• But they use them more if they can get some control over their decisions (Dietvorst, Simmons & Massey, 2016).
Robo-advisers for behavioral issues
• Own project
• €10,000 under Think Forward Initiative funded by ING Bank, with Giannetti and
Crosetto.
• Investigate a possible form of the Dunning-Kruger effect (Kruger & Dunning, 1999).
• The more you need advice, the less you think you need it (Dunning, 2018, Yan & Dunning, 2017, Dunning & Cone, 2018, Kramer, 2016, Calcagno & Monticone, 2015).
• Indeed, “individuals with higher incomes, educational attainment, and levels of
financial literacy are most likely to receive financial advice.” (Collins, 2012)
Robo-advisers for behavioral issues
• Extend this finding to the use of (robo-)advisers for behavioral issues in
financial investment decisions.
• Offer robo-advisers as a way to commit to an investment strategy.
• Identify those most in need of a way to commit.
• Look at whether those people indeed use the robo-adviser.
• Look at whether soft commitments can enhance use of commitment devices
(Bryan, Karlan & Nelson, 2010).
Who takes advice?
[Figure: individuals ordered along the horizontal axis, with level of expertise (accuracy) on the vertical axis. Curves show actual accuracy, beliefs about accuracy, and the accuracy of the adviser; regions mark those who seek advice and need it versus those who do not seek advice and need it.]
The Dunning-Kruger effect cannot explain why people with low accuracy do not seek advice.
Who takes advice?
[Figure: as above, with an additional curve for the belief about the accuracy of the adviser.]
People with low accuracy who do not seek advice must also believe the adviser has low accuracy.
Psychological impact of feedback
systems
• Feedback systems are omnipresent on the Internet, for products as well
as for services.
• They aggregate the experience of users on what matters to them.
• How does feedback impact advisers?
• Project with Katharina Gangl and Louisa Kulke
• Emotional impact of feedback (anger, happiness, sadness, …).
• Effect on subsequent performance of adviser.
• Effect of shame & embarrassment.
SUMMARY AND
CONCLUSION
Summary
• Experts need to learn to recognize and consider their limitations and the
limitations of the recipients of their advice.
• Experts must also be aware of the need to maintain a balance between
conflicting goals:
• be neutral and be involved.
• be precise and convey uncertainty.
• suggest improvements, but consider the goals and limitations of the advisee.
• personalize advice, but also automate it.

Summary
• Own research questions…
• how far "nudges" respect people's preferences.
• whether advice can follow algorithms and still feel personal.
• whether performance feedback can discipline experts.

Conclusion
• Confidence in experts is important in a liberal democracy.
• But a degree of distrust may be needed to discipline the experts (MacCoun,
2015).
• Towards a more populist form of democracy.
• Participation from outside the scientific community in scientific debates.
• It is necessary to further define the rules and methods of this more public debate.
• Need for better demarcation between...
• scientific evidence.
• social implications.
• political solutions.

References
1. Allcott, H., & Rogers, T. (2014). The Short-Run and Long-Run Effects of Behavioral Interventions: Experimental Evidence from Energy Conservation. American Economic Review, 104(10), 3003–3037.
2. Anagol, S., Cole, S., & Sarkar, S. (2017). Understanding the advice of commissions-motivated agents: Evidence from the Indian life insurance market. Review of Economics and Statistics, 99(1), 1-15.
3. Angner, E. (2006). Economists as experts: Overconfidence in theory and practice. Journal of Economic Methodology, 13(1), 1-24.
4. Balafoutas, L., Beck, A., Kerschbamer, R., & Sutter, M. (2013). What Drives Taxi Drivers? A Field Experiment on Fraud in a Market for Credence Goods. The Review of Economic Studies, 80(3), 876–891. https://doi.org/10.1093/restud/rds049
5. Banerjee, A. V. (1992). A simple model of herd behavior. The Quarterly Journal of Economics, 107(3), 797-817.
6. Bell, D. (1974). The Coming of Post-Industrial Society. New York: Harper Colophon Books.
7. Braun, P. A., & Yaniv, I. (1992). A case study of expert judgment: Economists' probabilities versus base-rate model forecasts. Journal of Behavioral Decision Making, 5, 217-231.
8. Brown, M. T., & Bussell, J. K. (2011). Medication Adherence: WHO Cares? Mayo Clinic Proceedings, 86(4), 304–314.
9. Bryan, G., Karlan, D., & Nelson, S. (2010). Commitment devices. Annual Review of Economics, 2(1), 671–698.
10. Burnham, J. (1941). The Managerial Revolution: What is Happening in the World. New York: John Day Co.
11. Calcagno, R., & Monticone, C. (2015). Financial literacy and the demand for financial advice. Journal of Banking & Finance, 50, 363-380.
12. Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T. H., Huber, J., Johannesson, M., ... & Altmejd, A. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2(9), 637.
13. Camerer, C., Issacharoff, S., Loewenstein, G., O'Donoghue, T., & Rabin, M. (2003). Regulation for Conservatives: Behavioral Economics and the Case for "Asymmetric Paternalism". Retrieved from https://papers.ssrn.com/abstract=399501
14. Canovan, M. (1999). Trust the people! Populism and the two faces of democracy. Political Studies, 47(1), 2-16.
15. Carleton, R. N. (2016). Into the unknown: A review and synthesis of contemporary models involving uncertainty. Journal of Anxiety Disorders, 39, 30–43.
16. Choo, T. H. (1964). Communicator credibility and communication discrepancy as determinants of opinion change. The Journal of Social Psychology, 64(1), 65-76.
17. Chung, W., & Harbaugh, R. (2018). Biased recommendations from biased and unbiased experts. Journal of Economics & Management Strategy, 0(0). https://doi.org/10.1111/jems.12293
18. Colander, D., Follmer, H., Haas, A., Goldberg, M., Juselius, K., Kirman, A., ... & Sloth, B. (2014). The financial crisis and the systemic failure of academic economics. In A. Lanteri & J. Vromen (Eds.), The Economics of Economists: Institutional Setting, Individual Incentives, and Future Prospects (pp. 344-360). Cambridge: Cambridge University Press.
19. Collins, J. M. (2012). Financial advice: A substitute for financial literacy? Financial Services Review, 21(4).
20. Crotty, J. (2009). Structural causes of the global financial crisis: A critical assessment of the 'new financial architecture'. Cambridge Journal of Economics, 33(4), 563-580.
References
21. Dalal, R. S., & Bonaccio, S. (2010). What types of advice do decision-makers prefer? Organizational Behavior and Human Decision Processes, 112(1), 11-23.
22. De Meza, D., Irlenbusch, B., & Reyniers, D. (2010). Disclosure, Trust and Persuasion in Insurance Markets. IZA working paper 5060.
23. Dietvorst, B. J., Simmons, J. P., & Massey, C. (2015). Algorithm aversion: People erroneously avoid algorithms after seeing them err. Journal of Experimental Psychology: General, 144(1), 114–126.
24. Dietvorst, B. J., Simmons, J. P., & Massey, C. (2016). Overcoming Algorithm Aversion: People Will Use Imperfect Algorithms If They Can (Even Slightly) Modify Them. Management Science, 64(3), 1155–1170.
25. Dulleck, U., & Kerschbamer, R. (2006). On Doctors, Mechanics, and Computer Specialists: The Economics of Credence Goods. Journal of Economic Literature, 44(1), 5–42.
26. Dulleck, U., Kerschbamer, R., & Sutter, M. (2011). The Economics of Credence Goods: An Experiment on the Role of Liability, Verifiability, Reputation, and Competition. American Economic Review, 101(2), 526–555. https://doi.org/10.1257/aer.101.2.526
27. Dunning, D. (2018). Gullible to Ourselves. 20th Sydney Symposium of Social Psychology, Visegrad, Hungary. https://www.sydneysymposium.unsw.edu.au/2018/chapters/DunningSSSP2018.pdf
28. Dunning, D., & Cone, J. (2018). The Cassandra quandary: How flawed expertise prevents people from recognizing superior knowledge among their peers. Manuscript under review, University of Michigan.
29. Edelman Trust Barometer for Germany (2018). https://www.edelman.com/trust-barometer
30. Edwards (2014). How effective is foreign aid? Retrieved 21 December 2018, from https://www.weforum.org/agenda/2014/11/how-effective-is-foreign-aid/
31. Ellison, G., & Ellison, S. F. (2009). Search, Obfuscation, and Price Elasticities on the Internet. Econometrica, 77(2), 427–452.
32. Fitzsimons, G. J., & Lehmann, D. R. (2004). Reactance to recommendations: When unsolicited advice yields contrary responses. Marketing Science, 23(1), 82-94.
33. Furnham, A., & Boo, H. C. (2011). A literature review of the anchoring effect. The Journal of Socio-Economics, 40(1), 35-42.
34. Gabaix, X., & Laibson, D. (2006). Shrouded attributes, consumer myopia, and information suppression in competitive markets. Quarterly Journal of Economics, 121, 505-540.
35. Gardner, D. H., & Berry, D. C. (1995). The effect of different forms of advice on the control of a simulated complex system. Applied Cognitive Psychology, 9, 555–579.
36. Gaudeul, A., & Kaczmarek, M. C. (2018). Going along with the default does not mean going on with it: Attrition in a charity giving experiment. Forthcoming, Behavioral Public Policy.
37. Gaudeul, A., & Sugden, R. (2012). Spurious Complexity and Common Standards in Markets for Consumer Goods. Economica, 79(314), 209–225.
38. Gino, F. (2008). Do we listen to advice just because we paid for it? The impact of advice cost on its use. Organizational Behavior and Human Decision Processes, 107(2), 234–245.
39. Golman, R., Hagmann, D., & Loewenstein, G. (2017). Information Avoidance. Journal of Economic Literature, 55(1), 96–135.
40. Gul, F. (1991). A Theory of Disappointment Aversion. Econometrica, 59(3), 667–686. https://doi.org/10.2307/2938223
References
41. Harvey, N., & Fischer, I. (1997). Taking advice: Accepting help, improving judgment, and sharing responsibility. Organizational Behavior and Human Decision Processes, 7, 117–134.
42. Heidhues, P., Kőszegi, B., & Murooka, T. (2016). Inferior products and profitable deception. The Review of Economic Studies, 84(1), 323-356.
43. Henrion, M., & Fischhoff, B. (1986). Assessing uncertainty in physical constants. American Journal of Physics, 54, 791-798.
44. Hirshleifer, D., & Hong Teoh, S. (2003). Herd behaviour and cascading in capital markets: A review and synthesis. European Financial Management, 9(1), 25-66.
45. Ismayilov, H., & Potters, J. (2013). Disclosing advisor's interests neither hurts nor helps. Journal of Economic Behavior & Organization, 93, 314-320.
46. Kearney, J. M., & McElhone, S. (1999). Perceived barriers in trying to eat healthier – results of a pan-EU consumer attitudinal survey. British Journal of Nutrition, 81(S1), S133-S137.
47. Kramer, M. M. (2016). Financial literacy, confidence and financial advice seeking. Journal of Economic Behavior & Organization, 131, 198–217.
48. Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121.
49. Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: The state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under Uncertainty: Heuristics and Biases. Cambridge University Press.
50. Lin, S. W., & Bier, V. M. (2008). A study of expert overconfidence. Reliability Engineering & System Safety, 93, 711-721.
51. Loewenstein, G., Cain, D. M., & Sah, S. (2011). The limits of transparency: Pitfalls and potential of disclosing conflicts of interest. American Economic Review, 101(3), 423-28.
52. MacCoun, R. J. (2015). The Epistemic Contract: Fostering Appropriate Public Trust in Experts. In B. H. Bornstein & A. J. Tomkins (Eds.), Motivating Cooperation and Compliance with Authority: The Roles of Institutional Trust and Confidence. New York: Springer.
53. Morris, S. (2001). Political correctness. Journal of Political Economy, 109(2), 231-265.
54. Mullainathan, S., Noeth, M., & Schoar, A. (2012). The market for financial advice: An audit study (No. w17929). National Bureau of Economic Research.
55. Mullainathan, S., Noeth, M., & Schoar, A. (2012). The Market for Financial Advice: An Audit Study. NBER working paper 17929.
56. Nagel, T. (1989). The View from Nowhere. Oxford University Press.
57. Nichols, T. (2017). The Death of Expertise: The Campaign against Established Knowledge and Why it Matters. Oxford University Press.
58. Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175.
59. O'Neill, O. (2002). A Question of Trust: The BBC Reith Lectures. Cambridge University Press.
60. Oehler, A., & Kohlert, D. (2009). Financial Advice Giving and Taking: Where Are the Markets' Self-Healing Powers and a Functioning Legal Framework When We Need Them? Journal of Consumer Policy, 32, 91–116.
References
61. Oliver, J. E., & Rahn, W. M. (2016). Rise of the Trumpenvolk: Populism in the 2016 Election. The ANNALS of the American Academy of Political and Social Science, 667(1), 189-206.
62. Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716.
63. Orleans, C. T., Schoenbach, V. J., Wagner, E. H., Quade, D., Salmon, M. A., Pearson, D. C., ... & Kaplan, B. H. (1991). Self-help quit smoking interventions: Effects of self-help materials, social support instructions, and telephone counseling. Journal of Consulting and Clinical Psychology, 59(3), 439.
64. Peterson, D. K., & Pitz, G. F. (1988). Confidence, uncertainty, and the use of information. Journal of Experimental Psychology: Learning, Memory, and Cognition, 14, 85–92.
65. Pew Research Center (2015). "Most Say Government Policies Since Recession Have Done Little to Help Middle Class, Poor". http://www.people-press.org/2015/03/04/most-say-government-policies-since-recession-have-done-little-to-help-middle-class-poor/
66. Pew Research Center (2017). "Globally, Broad Support for Representative and Direct Democracy". http://www.pewglobal.org/2017/10/16/globally-broad-support-for-representative-and-direct-democracy/
67. Rittel, H. W., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4(2), 155-169.
68. Roth, F. (2009). The effect of the financial crisis on systemic trust. Intereconomics, 44(4), 203-208.
69. Rutjens, B. T., Sutton, R. M., & van der Lee, R. (2018). Not all skepticism is equal: Exploring the ideological antecedents of science acceptance and rejection. Personality and Social Psychology Bulletin, 44(3), 384-405.
70. Sabaté, E., World Health Organization, UNAIDS (2003). Adherence to Long-term Therapies: Evidence for Action. World Health Organization.
71. Sah, S. (2012). Conflicts of Interest and Your Physician: Psychological Processes That Cause Unexpected Changes in Behavior. The Journal of Law, Medicine & Ethics, 40(3), 482–487.
72. Sah, S., & Loewenstein, G. (2015). Conflicted advice and second opinions: Benefits, but unintended consequences. Organizational Behavior and Human Decision Processes, 130, 89-107.
73. Saint-Simon, H. (1817). "Declaration of Principles". In L'Industrie, vol. II.
74. Sapienza, P., & Zingales, L. (2013). Economic Experts versus Average Americans. American Economic Review, 103(3), 636–642.
75. Sikka, P. (2009). Financial crisis and the silence of the auditors. Accounting, Organizations and Society, 34(6-7), 868-873.
76. Sniezek, J. A., & Buckley, B. (1995). Cueing and cognitive conflict in judge-advisor decision making. Organizational Behavior and Human Decision Processes, 62, 159–174.
77. Sniezek, J. A., & Van Swol, L. M. (2001). Trust and expertise in a judge advisor system. Organizational Behavior and Human Decision Processes, 84, 288–307.
78. Sniezek, J. A., Schrah, G. E., & Dalal, R. S. (2004). Improving judgement with prepaid expert advice. Journal of Behavioral Decision Making, 17(3), 173–190.
79. Stehr, N., & Grundmann, R. (2011). Experts: The Knowledge and Power of Expertise. London: Routledge.
80. Sugden, R. (2018). The Community of Advantage. Oxford University Press (Chapter 2).
References
81. Sugden, R. (2016). Do people really want to be nudged towards healthy lifestyles? International Review of Economics, 1–11. https://doi.org/10.1007/s12232-016-0264-1
82. Sunstein, C. R., & Thaler, R. H. (2003). Libertarian paternalism is not an oxymoron. The University of Chicago Law Review, 1159-1202.
83. Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. Journal of Political Philosophy, 17(2), 202-227.
84. Tauritz, R. (2012). How to handle knowledge uncertainty: Learning and teaching in times of accelerating change. In Learning for Sustainability in Times of Accelerating Change (pp. 299-316).
85. Tenney, E. R., MacCoun, R. J., Spellman, B. A., & Hastie, R. (2007). Calibration trumps confidence as a basis for witness credibility. Psychological Science, 18, 46-50.
86. Tenney, E. R., Spellman, B. A., & MacCoun, R. J. (2008). The benefits of knowing what you know (and what you don't): Fact-finders rely on others who are well calibrated. Journal of Experimental Social Psychology, 44, 1368-1375.
87. Thompson, D. F. (1993). Understanding financial conflicts of interest. New England Journal of Medicine, 329, 573-573.
88. Touraine, A. (1971). The Post-Industrial Society. Tomorrow's Social History: Classes, Conflicts and Culture in the Programmed Society. New York: Random House.
89. Turner, S. (2001). What is the Problem with Experts? Social Studies of Science, 31(1), 123-149.
90. Tzioti, S. C., Wierenga, B., & van Osselaer, S. M. J. (2014). The Effect of Intuitive Advice Justification on Advice Taking. Journal of Behavioral Decision Making, 27(1), 66–77. https://doi.org/10.1002/bdm.1790
91. Van der Linden, S., & Lewandowsky, S. (2015). How to combat distrust of science. Scientific American. http://www.scientificamerican.com/article/how-to-combat-distrust-of-science
92. Webb, T. L., & Sheeran, P. (2006). Does changing behavioral intentions engender behavior change? A meta-analysis of the experimental evidence. Psychological Bulletin, 132(2), 249–268.
93. Weber, M. (1922 [1968]). Economy and Society: An Outline of Interpretive Sociology. New York: Bedminster Press, pp. 956–958, 999–1001.
94. White, L. J. (2010). Markets: The credit rating agencies. Journal of Economic Perspectives, 24(2), 211-26.
95. Yan, H., & Dunning, D. (2017). On the failure to ask for advice: Behavioral implications of the Dunning-Kruger effect. Unpublished manuscript, University of Michigan.
96. Yaniv, I., & Kleinberger, E. (2000). Advice taking in decision making: Egocentric discounting and reputation formation. Organizational Behavior and Human Decision Processes, 83, 260–281.
97. Zarnoth, P., & Sniezek, J. A. (1997). The social influence of confidence in group decision making. Journal of Experimental Social Psychology, 33, 345–366.
98. Önkal, D., Goodwin, P., Thomson, M., Gönül, S., & Pollock, A. (2009). The relative influence of advice from human experts and statistical methods on forecast adjustments. Journal of Behavioral Decision Making, 22(4), 390–409.
