Politics and International Relations Forum


Views: 2571 | Replies: 20

Past Winners of the Nobel Prize in Economics and Their Achievements

1#
Posted 2005-8-3 17:27:26
[2002] Daniel Kahneman, "for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty"; and Vernon L. Smith, "for having established laboratory experiments as a tool in empirical economic analysis, especially in the study of alternative market mechanisms".
[2001] George A. Akerlof, A. Michael Spence and Joseph E. Stiglitz, for their analyses of markets with asymmetric information.
[2000] James J. Heckman and Daniel L. McFadden, for their contributions to microeconometrics: they developed theory and methods widely used in the empirical analysis of individual and household behavior.
[1999] Robert A. Mundell, for his analysis of monetary and fiscal policy under different exchange-rate regimes and of optimum currency areas.
[1998] Amartya Sen, for his contributions to several major problems of welfare economics, including social choice theory, the definition of welfare and poverty measures, and the study of deprivation.
[1997] Robert C. Merton and Myron S. Scholes. Scholes derived the famous Black-Scholes option-pricing formula, which has become a standard way of thinking for financial institutions designing new financial products; Merton weakened the assumptions underlying the formula and generalized it in many directions.
[1996] James A. Mirrlees and William Vickrey. Mirrlees made major contributions to the theory of information economics, especially the theory of economic incentives under asymmetric information; Vickrey made major contributions to information economics, incentive theory and game theory.
[1995] Robert Lucas, for developing and applying the theory of rational expectations in macroeconomic research, deepening our understanding of economic policy and offering original insights into business-cycle theory.
[1994] John F. Nash, John C. Harsanyi and Reinhard Selten, for their pioneering contributions to the equilibrium analysis of non-cooperative games, which profoundly influenced game theory and economics.
[1993] Douglass C. North and Robert W. Fogel. North built a theory of institutional change encompassing the theories of property rights, the state and ideology; Fogel reinterpreted past economic development using new economic-history theory and quantitative tools.
[1992] Gary S. Becker, for extending microeconomic theory to the analysis of human interaction, including non-market behavior.
[1991] Ronald H. Coase, for his discovery and clarification of the significance of transaction costs and property rights for the institutional structure and functioning of the economy.
[1990] Merton M. Miller, Harry M. Markowitz and William F. Sharpe, for their pioneering work in the theory of financial economics.
[1989] Trygve Haavelmo, for establishing the foundational guiding principles of modern econometrics.
[1988] Maurice Allais, for his pioneering contributions to the theory of markets and the efficient utilization of resources, and for his rigorous reformulation of general equilibrium theory.
[1987] Robert M. Solow, for his contributions to the theory of economic growth; he showed that long-run growth depends mainly on technological progress rather than on inputs of capital and labor.
[1986] James M. Buchanan, Jr., for combining the analysis of political decision-making with economic theory, extending economic analysis to the choice of social and political rules.
[1985] Franco Modigliani, for first proposing the life-cycle hypothesis of saving, which has been widely applied in studies of household and corporate saving.
[1984] Richard Stone, the father of national income accounting, for his fundamental contributions to the development of systems of national accounts, which greatly improved the basis for empirical economic analysis.
[1983] Gerard Debreu, for generalizing the theory of Pareto optimality and establishing existence theorems for economic and social equilibrium with interrelated goods.
[1982] George J. Stigler, for his seminal contributions on industrial structure, the functioning of markets, and the causes and effects of public regulation.
[1981] James Tobin, for elaborating and extending Keynesian theory and macroeconomic models of fiscal and monetary policy, and for important contributions to the analysis of financial markets and their relation to spending decisions, employment, production and prices.
[1980] Lawrence R. Klein, for building econometric models of economic systems grounded in economic theory and estimated empirically from real-world data.
[1979] W. Arthur Lewis and Theodore W. Schultz, for their pioneering research into economic development, with particular attention to the problems facing developing countries.
[1978] Herbert A. Simon, for his research into decision-making processes within economic organizations; his theory of decision procedures is recognized as a pioneering account of how firms actually make decisions.
[1977] Bertil Ohlin and James E. Meade, for their pathbreaking research on the theory of international trade and international capital movements.
[1976] Milton Friedman, for founding monetarism and proposing the permanent income hypothesis.
[1975] Leonid Vitaliyevich Kantorovich and Tjalling C. Koopmans, for their contributions to the theory of the optimal allocation of resources. Kantorovich created the essentials of linear programming in 1939; Koopmans successfully applied mathematical statistics to econometrics.
[1974] Friedrich August von Hayek and Gunnar Myrdal, for their penetrating work on the theory of money and economic fluctuations and their analysis of the interdependence of economic, social and institutional phenomena.
[1973] Wassily Leontief, for developing the input-output method and applying it to many important economic problems.
[1972] John R. Hicks and Kenneth J. Arrow, for their deep work on economic equilibrium theory and welfare theory.
[1971] Simon Kuznets, for his major contributions to the study of population trends and of the relation between population structure, economic growth and income distribution.
[1970] Paul A. Samuelson, for developing mathematical and dynamic economic theory and raising the level of analysis in economic science; his research touched nearly every field of economics.
[1969] Ragnar Frisch and Jan Tinbergen, for developing dynamic models for the analysis of economic processes. Frisch was a founder of econometrics; Tinbergen is the father of econometric model-building.
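The 1997 entry above mentions the Black-Scholes option-pricing formula. As a minimal sketch (the parameter values below are arbitrary illustrations, not anything from the prize material), the European call price C = S*N(d1) - K*exp(-rT)*N(d2) can be computed with only the standard library:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S: float, K: float, r: float, sigma: float, T: float) -> float:
    """Black-Scholes price of a European call.
    S: spot price, K: strike, r: risk-free rate, sigma: volatility, T: years to expiry."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Illustrative (assumed) inputs: an at-the-money one-year call
price = black_scholes_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0)
```

With these inputs the price comes out near 10.45; raising the strike lowers the call value, as the formula requires.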
2#
Posted 2005-8-3 23:18:14
Note: this author has been banned or deleted; the post was automatically hidden.
3#
Posted 2005-12-9 01:21:29
Mod, try to get onto this list yourself someday. Keep it up!
4#
Posted 2005-12-9 03:54:26
In more detail:
Press Release: The Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel 2002

9 October 2002

The Royal Swedish Academy of Sciences has decided that the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel, 2002, will be shared between

Daniel Kahneman
Princeton University, USA

"for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty"

and

Vernon L. Smith
George Mason University, USA

"for having established laboratory experiments as a tool in empirical economic analysis, especially in the study of alternative market mechanisms".

Psychological and experimental economics

Traditionally, much of economic research has relied on the assumption of a "homo oeconomicus" motivated by self-interest and capable of rational decision-making. Economics has also been widely considered a non-experimental science, relying on observation of real-world economies rather than controlled laboratory experiments. Nowadays, however, a growing body of research is devoted to modifying and testing basic economic assumptions; moreover, economic research relies increasingly on data collected in the lab rather than in the field. This research has its roots in two distinct, but currently converging, areas: the analysis of human judgment and decision-making by cognitive psychologists, and the empirical testing of predictions from economic theory by experimental economists. This year's laureates are the pioneers in these two research areas.

Daniel Kahneman has integrated insights from psychology into economics, thereby laying the foundation for a new field of research. Kahneman's main findings concern decision-making under uncertainty, where he has demonstrated how human decisions may systematically depart from those predicted by standard economic theory. Together with Amos Tversky (deceased in 1996), he has formulated prospect theory as an alternative that better accounts for observed behavior. Kahneman has also discovered how human judgment may take heuristic shortcuts that systematically depart from basic principles of probability. His work has inspired a new generation of researchers in economics and finance to enrich economic theory using insights from cognitive psychology into intrinsic human motivation.

Vernon Smith has laid the foundation for the field of experimental economics. He has developed an array of experimental methods, setting standards for what constitutes a reliable laboratory experiment in economics. In his own experimental work, he has demonstrated the importance of alternative market institutions, e.g., how the revenue expected by a seller depends on the choice of auction method. Smith has also spearheaded "wind-tunnel tests", where trials of new, alternative market designs – e.g., when deregulating electricity markets – are carried out in the lab before being implemented in practice. His work has been instrumental in establishing experiments as an essential tool in empirical economic analysis.
--------------------------------------------------------------------------------
Daniel Kahneman, born 1934 (68 years) in Tel Aviv, Israel (US and Israeli citizen). PhD from University of California at Berkeley in 1961. Since 1993, Eugene Higgins Professor of Psychology and Professor of Public Affairs at Princeton University, NJ, USA.
<a href="http://www.princeton.edu/~psych/PsychSite/fac_kahneman.html" target="_blank">www.princeton.edu/~psych/PsychSite/fac_kahneman.html</a>

Vernon L. Smith, born 1927 (75 years) in Wichita, KS, USA (US citizen). PhD from Harvard University in 1955. Since 2001, Professor of Economics and Law at George Mason University, VA, USA.
<a href="http://www.gmu.edu/departments/economics/facultybios/smith.html" target="_blank">www.gmu.edu/departments/economics/facultybios/smith.html</a>

The Prize amount: SEK 10 million, will be shared equally among the Laureates.
Contact persons: Katarina Werner, Information assistant, phone +46 8 673 95 29, katarina@kva.se, and Eva Krutmeijer, Head of information, phone +46 8 673 95 95, +46 709 84 66 38, evak@kva.se

<a href="http://finance.sina.com.cn" target="_blank">http://finance.sina.com.cn</a> 9 October 2002, 22:24, Sina Finance
At 15:30 local time on 8 October in Stockholm (21:30 Beijing time), the Royal Swedish Academy of Sciences announced that this year's Prize in Economic Sciences is awarded to Daniel Kahneman of Princeton University (who holds both US and Israeli citizenship) and Vernon L. Smith of George Mason University.

Economics has traditionally been regarded as a non-experimental science, with most research resting on assumptions that matter greatly for decision-making. Today, however, a growing number of researchers use experimental methods to study economics, modifying and testing basic economic assumptions, so that economic research increasingly relies on experiments and data collection and thereby becomes more credible. Most of this research is rooted in two distinct but now converging fields: cognitive psychologists' analysis of human judgment and decision-making, and experimental economists' testing of predictions from economic theory. This year's laureates are the pioneers of these two fields.

Daniel Kahneman integrated insights from psychology into the study of economics, laying the foundation for a new field of research. His main contribution lies in his findings on human judgment and decision-making under uncertainty: he showed how human decisions can systematically depart from the predictions of standard economic theory. His discoveries have inspired a new generation of economists to enrich economic theory with insights from cognitive psychology.

Vernon Smith laid the foundation for experimental economics. He developed a set of experimental research methods and established standards for reliable economics experiments. He used experiments to demonstrate the importance of alternative market mechanisms, and he pioneered "wind-tunnel tests" of new, alternative market designs. His work has been instrumental in establishing experiments as a tool of empirical economic analysis.

The economics prize is not one of the five prize fields named in Nobel's will; it was established by the Bank of Sweden in 1968 in memory of Alfred Nobel, and its full name is the "Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel". Its selection criteria are the same as for the other prizes, and the winners are chosen by the Royal Swedish Academy of Sciences. It was first awarded in 1969, jointly to the Norwegian Ragnar Frisch and the Dutchman Jan Tinbergen; American economists such as Samuelson and Friedman have also won it. Last year's prize went to three Americans, George Akerlof, Michael Spence and Joseph Stiglitz, for their outstanding contributions to modern information economics.
5#
Posted 2005-12-9 03:55:29
Press Release - The 2001 Sveriges Riksbank (Bank of Sweden) Prize in Economic Sciences in Memory of Alfred Nobel
10 October 2001

The Royal Swedish Academy of Sciences has decided to award the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel, 2001, jointly to

George A. Akerlof
University of California at Berkeley, USA,

A. Michael Spence
Stanford University, USA, and

Joseph E. Stiglitz
Columbia University, USA

"for their analyses of markets with asymmetric information".
Markets with asymmetric information

Many markets are characterized by asymmetric information: actors on one side of the market have much better information than those on the other. Borrowers know more than lenders about their repayment prospects, managers and boards know more than shareholders about the firm's profitability, and prospective clients know more than insurance companies about their accident risk. During the 1970s, this year's Laureates laid the foundation for a general theory of markets with asymmetric information. Applications have been abundant, ranging from traditional agricultural markets to modern financial markets. The Laureates' contributions form the core of modern information economics.

George Akerlof demonstrated how a market where sellers have more information than buyers about product quality can contract into an adverse selection of low-quality products. He also pointed out that informational problems are commonplace and important. Akerlof's pioneering contribution thus showed how asymmetric information of borrowers and lenders may explain skyrocketing borrowing rates on local Third World markets; but it also dealt with the difficulties for the elderly to find individual medical insurance and with labour-market discrimination of minorities.

Michael Spence identified an important form of adjustment by individual market participants, where the better informed take costly actions in an attempt to improve on their market outcome by credibly transmitting information to the poorly informed. Spence showed when such signaling will actually work. While his own research emphasized education as a productivity signal in job markets, subsequent research has suggested many other applications, e.g., how firms may use dividends to signal their profitability to agents in the stock market.

Joseph Stiglitz clarified the opposite type of market adjustment, where poorly informed agents extract information from the better informed, such as the screening performed by insurance companies dividing customers into risk classes by offering a menu of contracts where higher deductibles can be exchanged for significantly lower premiums. In a number of contributions about different markets, Stiglitz has shown that asymmetric information can provide the key to understanding many observed market phenomena, including unemployment and credit rationing.
--------------------------------------------------------------------------------
George A. Akerlof, 61 years, born 1940 in New Haven, Connecticut (US citizen). PhD from MIT 1966. Has held professorships at Indian Statistical Institute and London School of Economics. Since 1980 Goldman Professor of Economics at the University of California at Berkeley.
<a href="http://emlab.berkeley.edu/users/akerlof/index.html" target="_blank">http://emlab.berkeley.edu/users/akerlof/index.html</a>

A. Michael Spence, 58 years, born 1943 in Montclair, New Jersey (US citizen). PhD from Harvard 1972. Has held professorships at Harvard and the Graduate School of Business, Stanford, and has also been Dean at both these universities.
<a href="http://gobi.stanford.edu/facultybios/bio.asp?ID=156" target="_blank">http://gobi.stanford.edu/facultybios/bio.asp?ID=156</a>

Joseph E. Stiglitz, 58 years, born 1943 in Gary, Indiana (US citizen). PhD from MIT 1967. Has held professorships at Yale, Princeton, Oxford and Stanford, and has been the Chief Economist of the World Bank. Since this year, Professor of Economics, Business and International Affairs at Columbia University.

The Prize amount: SEK 10 million, will be shared equally among the Laureates.

Press Officer: Eva Krutmeijer, phone +46 8 673 95 95, +46 709 84 66 38, evak@kva.se
Information for the Public: The 2001 Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel

For more than two decades, the theory of markets with asymmetric information has been a vital and lively field of economic research. Today, models with imperfect information are indispensable instruments in the researcher's toolbox. Countless applications extend from traditional agricultural markets in developing countries to modern financial markets in developed economies. The foundations for this theory were established in the 1970s by three researchers: George Akerlof, Michael Spence and Joseph Stiglitz. They receive the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel, 2001, "for their analyses of markets with asymmetric information".

Markets with Asymmetric Information

Why are interest rates often excessively high on local lending markets in Third World countries? Why do people who want to buy a good used car turn to a dealer rather than a private seller? Why does a firm pay dividends even if they are taxed more heavily than capital gains? Why is it advantageous for insurance companies to offer clients a menu of contracts where higher deductibles can be exchanged for lower premiums? Why do rich landowners not bear the entire harvest risk in contracts with poor tenants? These questions exemplify familiar – but seemingly different – phenomena, each of which has posed a challenge to economic theory. This year's Laureates proposed a common explanation and extended the theory by augmenting it with the realistic assumption of asymmetric information: agents on one side of the market have much better information than those on the other side. Borrowers know more than the lender about their repayment prospects; the seller knows more than buyers about the quality of his car; the CEO and the board know more than the shareholders about the profitability of the firm; policyholders know more than the insurance company about their accident risk; and tenants know more than the landowner about their work effort and harvesting conditions.

More specifically, Akerlof showed that informational asymmetries can give rise to adverse selection on markets. Due to imperfect information on the part of lenders or prospective car buyers, borrowers with weak repayment prospects or sellers of low-quality cars crowd out everyone else from the market. Spence demonstrated that under certain conditions, well-informed agents can improve their market outcome by signaling their private information to poorly informed agents. The management of a firm can thus incur the additional tax cost of dividends to signal high profitability. Stiglitz showed that an uninformed agent can sometimes capture the information of a better-informed agent through screening, for example by providing choices from a menu of contracts for a particular transaction. Insurance companies are thus able to divide their clients into risk classes by offering different policies, where lower premiums can be exchanged for a higher deductible.
George Akerlof

Akerlof's 1970 essay, "The Market for Lemons", is the single most important study in the literature on economics of information. It has the typical features of a truly seminal contribution – it addresses a simple but profound and universal idea, with numerous implications and widespread applications.

Here Akerlof introduces the first formal analysis of markets with the informational problem known as adverse selection. He analyses a market for a good where the seller has more information than the buyer regarding the quality of the product. This is exemplified by the market for used cars; "a lemon" – a colloquialism for a defective old car – is now a well-known metaphor in economists' theoretical vocabulary. Akerlof shows that hypothetically, the information problem can either cause an entire market to collapse or contract it into an adverse selection of low-quality products.

Akerlof also pointed to the prevalence and importance of similar information asymmetries, especially in developing economies. One of his illustrative examples of adverse selection is drawn from credit markets in India in the 1960s, where local lenders charged interest rates that were twice as high as the rates in large cities. A middleman who borrows money in town and then lends it in the countryside, but does not know the borrowers' creditworthiness, risks attracting borrowers with poor repayment prospects, thereby becoming liable to heavy losses. Other examples in Akerlof's article include difficulties for the elderly to acquire individual health insurance and discrimination of minorities on the labor market.

A key insight in his "lemons paper" is that economic agents may have strong incentives to offset the adverse effects of information problems on market efficiency. Akerlof argues that many market institutions may be regarded as emerging from attempts to resolve problems due to asymmetric information. One such example is guarantees from car dealers; others include brands, chain stores, franchising and different types of contracts.

A timely example might further illustrate the idea that asymmetric information can generate adverse selection. At first, firms in a new sector – such as today's IT sector – might seem identical to an uninformed bystander, while some "insiders" may have better information about the future profitability of such firms. Firms with lower than average profitability will therefore be overvalued and more inclined to finance new projects by issuing their own shares than high-profitability firms which are undervalued by the market. As a result, low-profitability firms tend to grow more rapidly and the stock market will initially be dominated by "lemons". When uninformed investors eventually discover their mistake, share prices fall – the IT bubble bursts.
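The unraveling logic behind adverse selection can be sketched numerically. In this toy version (all numbers are assumptions, loosely in the spirit of the used-car example, not taken from the paper), car quality q is uniform on [0, 2000], owners sell only when the price covers their quality, and buyers value a car at 1.5 times its quality, so they bid the value of the *average* quality on offer. Each round of bidding shrinks the market:

```python
def market_unraveling(p0: float = 1500.0, rounds: int = 30) -> float:
    """Iterate the buyers' bid in a stylized lemons market.
    At price p, only owners with quality q <= p offer their cars,
    so the average quality on offer is min(p, 2000) / 2; buyers then
    bid 1.5 times that average. The bid contracts toward zero."""
    p = p0
    for _ in range(rounds):
        avg_quality = min(p, 2000.0) / 2.0   # only q <= p is for sale
        p = 1.5 * avg_quality                # buyers' value of the average car
    return p

final_price = market_unraveling()   # collapses toward zero
```

Each round multiplies the price by 0.75, so even a generous starting bid of 1500 decays to (nearly) nothing: the market for good cars disappears.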
Apart from his research on asymmetric information, Akerlof has developed economic theory with insights from sociology and social anthropology. His most noteworthy contributions in this genre concern efficiency on labor markets. Akerlof points out that emotions such as reciprocity towards an employer or fairness towards colleagues can prompt wages to be set so high as to induce unemployment. He has also examined how social conventions such as the caste system may have unfavorable effects on economic efficiency. As a result of these studies, Akerlof's research is also well known and influential in other social sciences.
Michael Spence

Spence asked how better informed individuals on a market can credibly transmit, "signal", their information to less informed individuals, so as to avoid some of the problems associated with adverse selection. Signaling requires economic agents to take observable and costly measures to convince other agents of their ability or, more generally, of the value or quality of their products. Spence's contribution was to develop and formalize this idea as well as to demonstrate and analyze its implications.

Spence's pioneering essay from 1973 (based on his PhD thesis) deals with education as a signal of productivity on the labor market. A fundamental insight is that signaling cannot succeed unless the signaling cost differs sufficiently among the "senders", i.e., job applicants. An employer cannot distinguish the more productive applicants from those who are less productive unless the former find it sufficiently less costly to acquire an education that the latter choose a lower level of education. Spence also pointed to the possibility of different "expectations-based" equilibria for education and wages, where, e.g., men and whites receive a higher wage than women and blacks with the same productivity.
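The condition that signaling costs must differ enough across types can be made concrete in a hedged two-type sketch (the wages and per-unit education costs below are invented numbers, not from the 1973 paper). A separating education level e* must be cheap enough that the productive type signals, yet expensive enough that the other type does not mimic:

```python
def separating_range(w_low: float, w_high: float,
                     c_high_type: float, c_low_type: float):
    """Education levels e* supporting a separating equilibrium in a
    stylized two-type Spence model, where c_high_type < c_low_type
    (the productive type signals more cheaply).
    High type signals:   w_high - c_high_type * e >= w_low
    Low type abstains:   w_high - c_low_type  * e <= w_low
    => (w_high - w_low) / c_low_type <= e* <= (w_high - w_low) / c_high_type."""
    gain = w_high - w_low
    return gain / c_low_type, gain / c_high_type

# Assumed numbers: wage premium 20,000; education costs 2,000 vs 5,000 per year
lo, hi = separating_range(w_low=30_000, w_high=50_000,
                          c_high_type=2_000, c_low_type=5_000)
# any e* between lo (4 years) and hi (10 years) separates the two types
```

If the two cost parameters converge, the interval collapses and no education level can separate the types, which is exactly the "costs must differ sufficiently" condition in the paragraph above.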
Subsequent research contains numerous applications which extend this theory and confirm the importance of signaling on different markets. This covers phenomena such as costly advertising or far-reaching guarantees as signals of productivity, aggressive price cuts as signals of market strength, delaying tactics in wage offers as a signal of bargaining power, financing by debt rather than by issuing new shares as a signal of profitability, and recession-generating monetary policy as a signal of uncompromising commitment to reduce stubbornly high inflation.

An early example in the literature concerns dividends. Why do firms pay dividends to their shareholders, knowing full well that they are subject to higher taxes (through double taxation) than capital gains? Retaining the profits within the firm would appear as a cheaper way to favor the shareholders through the capital gains of a higher share price. One possible answer is that dividends can act as a signal for favorable prospects. Firms with "insider information" about high profitability pay dividends because the market interprets this as good news and therefore pays a higher price for the share. The higher share price compensates shareholders for the extra tax they pay on the dividends.

In addition to his research on signaling, Spence was a forerunner in applying the results and insights of the 1996 economics laureates, Vickrey and Mirrlees, to the analysis of insurance markets. During the period 1975-1985, he was one of the pioneers in the wave of game-theory inspired work that clarified many aspects of strategic market behavior within the so-called new theory of industrial organization.
Joseph Stiglitz

One of Stiglitz's classical papers, coauthored with Michael Rothschild, formally demonstrated how information problems can be dealt with on insurance markets where the companies do not have information on the risk situation of individual clients. This work is an obvious complement to Akerlof's and Spence's analyses by examining what actions uninformed agents can take on a market with asymmetric information. Rothschild and Stiglitz show that the insurance company (the uninformed party) can give its clients (the informed party) effective incentives to "reveal" information on their risk situation through so-called screening. In an equilibrium with screening, insurance companies distinguish between different risk classes among their policyholders by offering them a choice from a menu of alternative contracts where lower premiums can be exchanged for higher deductibles.
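A minimal sketch of such a screening menu follows (all numbers are invented, and the clients are treated as risk-neutral for simplicity; the actual Rothschild-Stiglitz analysis works with risk-averse expected utility). Each client picks the contract minimizing premium plus expected out-of-pocket loss, and the menu sorts the risk types by itself:

```python
def choose_contract(p_accident: float, loss: float, contracts):
    """Pick the (premium, deductible) contract minimizing a risk-neutral
    client's expected outlay: premium + p_accident * min(deductible, loss)."""
    def expected_cost(c):
        premium, deductible = c
        return premium + p_accident * min(deductible, loss)
    return min(contracts, key=expected_cost)

# Assumed menu: full coverage at a high premium vs. high deductible, low premium
menu = [(300, 0), (100, 1000)]

high_risk = choose_contract(p_accident=0.4, loss=5000, contracts=menu)
low_risk = choose_contract(p_accident=0.1, loss=5000, contracts=menu)
# high_risk -> (300, 0): full coverage; low_risk -> (100, 1000): high deductible
```

The high-risk client self-selects into full coverage while the low-risk client accepts the deductible for a lower premium, so the insurer learns each client's risk class from the choice itself.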
Stiglitz and his numerous coauthors have time and again substantiated that economic models may be quite misleading if they disregard informational asymmetries. Their common message has been that in the perspective of asymmetric information, many markets take on a completely different guise, as do the conclusions regarding appropriate forms of public-sector regulation. Stiglitz has analyzed the implications of asymmetric information in many different contexts, varying from unemployment to the design of an optimal tax system. Several of his essays have become important stepping stones for further research.

One example is Stiglitz's work with Andrew Weiss on credit markets with asymmetric information. Stiglitz and Weiss show that in order to reduce losses from bad loans, it may be optimal for banks to ration the volume of loans instead of raising the lending rate. Since credit rationing is so common, these insights were important steps towards a more realistic theory of credit markets. They have also had a substantial impact in the domains of corporate finance, monetary theory and macroeconomics.
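The logic that a higher lending rate can worsen the applicant pool enough to lower the bank's expected return can be sketched as follows (the borrower types and payoffs are assumptions for illustration, with a Stiglitz-Weiss flavor rather than their model verbatim). With limited liability, a borrower repays only on success and applies only when the loan is profitable for them; raising the rate drives out the safe type first:

```python
def expected_bank_return(r: float, borrowers) -> float:
    """Gross expected repayment per unit lent at interest rate r.
    Each borrower is (success_prob, project_payoff): they repay 1 + r on
    success, nothing on failure, and apply only if p * (payoff - (1 + r)) > 0."""
    applicants = [(p, y) for (p, y) in borrowers if p * (y - (1 + r)) > 0]
    if not applicants:
        return 0.0
    return sum(p * (1 + r) for p, _ in applicants) / len(applicants)

# Assumed pool: one safe borrower (sure payoff 1.2), one risky (2.0 w.p. 0.8)
pool = [(1.0, 1.2), (0.8, 2.0)]
low = expected_bank_return(0.15, pool)    # both types apply
high = expected_bank_return(0.25, pool)   # safe borrower drops out
```

Here `low > high`: raising the rate from 15% to 25% leaves only the risky applicant and lowers the bank's expected return, so the bank prefers to ration credit at the lower rate rather than clear the market with a higher one.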
In collaboration with Sanford Grossman, Stiglitz analyzed efficiency on financial markets. Their key result is known as the Grossman-Stiglitz paradox: if a market were informationally efficient, i.e., all relevant information is reflected in market prices, then no single agent would have sufficient incentive to acquire the information on which prices are based.

Stiglitz is also one of the founders of modern development economics. He has shown that asymmetric information and economic incentives are not merely academic abstractions, but highly concrete phenomena with far-reaching explanatory value in the analysis of institutions and market conditions in developing economies. One of his first studies of information problems dealt with sharecropping, an ancient, though still common, form of contracting.

A sharecropping contract stipulates that the harvest should be divided between a landowner and his tenant in fixed shares (usually half each). Since the landowner is usually richer than the tenant, it would seem advantageous to both parties to let the landowner bear the entire risk. But such a contract would not give the tenant strong enough incentives to cultivate the land efficiently. Considering the landowner's inferior information about harvest conditions and the tenant's work effort, sharecropping is in fact the optimal solution for both parties.

Joseph Stiglitz's many contributions have transformed the way economists think about the working of markets. Together with the fundamental contributions by George Akerlof and Michael Spence, they make up the core of the modern economics of information.
The Laureates

George Akerlof
Economics Department
University of California
549 Evans Hall #3880
Berkeley, CA 94720-3880
USA
<a href="http://emlab.berkeley.edu/users/akerlof/index.html" target="_blank">http://emlab.berkeley.edu/users/akerlof/index.html</a>
PhD from MIT 1966. Has held professorships at Indian Statistical Institute and London School of Economics. Since 1980 Goldman Professor of Economics at the University of California at Berkeley.

Michael Spence
Stanford Business School
518 Memorial Way
Stanford University
Stanford, CA 94305-5015
USA
<a href="http://gobi.stanford.edu/facultybios/bio.asp?ID=156" target="_blank">http://gobi.stanford.edu/facultybios/bio.asp?ID=156</a>
PhD from Harvard 1972. Has held professorships at Harvard and the Graduate School of Business, Stanford, and has also been Dean at both these universities.

Joseph Stiglitz
Economics Department
Columbia University
1022 International Affairs Building
420 West 118th Street
New York, NY 10027
USA
PhD from MIT 1967. Has held professorships at Yale, Princeton, Oxford and Stanford, and has been the Chief Economist of the World Bank. Since this year, Professor of Economics, Business and International Affairs at Columbia University.
6#
Posted 2005-12-9 03:57:31
Presentation Speech - The Sveriges Riksbank (Bank of Sweden) Prize in Economic Sciences in Memory of Alfred Nobel
KUNGL. VETENSKAPSAKADEMIEN / THE ROYAL SWEDISH ACADEMY OF SCIENCES

October 11, 2000

The Royal Swedish Academy of Sciences has decided that the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel, 2000, will be shared between

James J. Heckman
University of Chicago, USA, and

Daniel L. McFadden
University of California, Berkeley, USA.

In the field of microeconometrics, each of the laureates has developed theory and methods that are widely used in the statistical analysis of individual and household behavior, within economics as well as other social sciences.

Citation of the Academy:
"to James Heckman for his development of theory and methods for analyzing selective samples and to Daniel McFadden for his development of theory and methods for analyzing discrete choice."

Microeconometrics - on the boundary between economics and statistics - is a methodology for studying micro data, i.e., economic information about large groups of individuals, households, or firms. Greater availability of micro data and increasingly powerful computers have enabled empirical studies of many new issues. For example, what determines whether an individual decides to work and, if so, how many hours? How do economic incentives affect choices of education, occupation, and place of residence? What are the effects of different educational programs on income and employment? James Heckman and Daniel McFadden have resolved fundamental problems that arise in the statistical analysis of micro data. The methods they have developed have solid foundations in economic theory, but have evolved in close interplay with applied research on important social problems. They are now standard tools, not only among economists but also among other social scientists.

Available micro data often entail selective samples. Data on wages, for instance, cannot be sampled randomly if only individuals with certain characteristics - unobservable to the researcher - choose to work or engage in education. If such selection is not taken into account, statistical estimation of economic relationships yields biased results. Heckman has developed statistical methods of handling selective samples in an appropriate way. He has also proposed tools for solving closely related problems with individual differences unobserved by the researcher; such problems are common, e.g. when evaluating social programs or estimating how the duration of unemployment affects chances of getting a job. Heckman is also a leader of applied research in these areas.
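The selection bias described above can be demonstrated with a toy simulation (all numbers and the selection rule are assumptions for illustration; this shows the problem Heckman's estimator corrects, not his estimator itself). Wages follow a known linear model, but a wage is observed only when an unobserved error term also drives the decision to work, so ordinary least squares on the observed sample understates the true slope:

```python
import random

def simulate_selection_bias(n: int = 20000, seed: int = 0):
    """Latent wage: w = 1 + 2*x + e, with e ~ N(0, 1).
    The wage is observed only when e > -x, i.e. participation is
    correlated with the error, which biases OLS on the observed sample."""
    rng = random.Random(seed)
    full, observed = [], []
    for _ in range(n):
        x = rng.uniform(0.0, 3.0)
        e = rng.gauss(0.0, 1.0)
        w = 1.0 + 2.0 * x + e
        full.append((x, w))
        if e > -x:                      # selective observation rule
            observed.append((x, w))

    def ols_slope(pairs):
        mx = sum(x for x, _ in pairs) / len(pairs)
        my = sum(y for _, y in pairs) / len(pairs)
        sxy = sum((x - mx) * (y - my) for x, y in pairs)
        sxx = sum((x - mx) ** 2 for x, _ in pairs)
        return sxy / sxx

    return ols_slope(full), ols_slope(observed)

slope_full, slope_observed = simulate_selection_bias()
# slope_full recovers roughly 2; slope_observed is noticeably smaller
```

On the full sample OLS recovers the true slope of 2, while on the self-selected sample it is biased downward, which is exactly the "biased results" the paragraph warns about.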
Micro data often reflect discrete choice. For instance, data regarding individuals' occupation or place of residence reflect choices they have made among a limited number of alternatives. Prior to McFadden's contributions, empirical studies of such choices lacked a foundation in economic theory. Evolving from a new theory of discrete choice, the statistical methods developed by McFadden have transformed empirical research. His methods are readily applicable. For example, they prevail in models of transports and are used to evaluate changes in communication systems. Examples of McFadden's extensive applications of his own methods include the design of the San Francisco BART system, as well as investments in phone service and housing for the elderly.
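At the heart of McFadden's discrete-choice methods is the conditional logit model, in which choice probabilities are a softmax over the systematic utilities of the alternatives. A minimal sketch (the commute modes and utility values below are invented for illustration):

```python
from math import exp

def logit_probabilities(utilities: dict) -> dict:
    """Conditional-logit choice probabilities:
    P(alt) = exp(u_alt) / sum over all alternatives of exp(u)."""
    m = max(utilities.values())                      # stabilize the exponentials
    weights = {alt: exp(u - m) for alt, u in utilities.items()}
    total = sum(weights.values())
    return {alt: w / total for alt, w in weights.items()}

# Assumed systematic utilities for three commute modes
probs = logit_probabilities({"car": 1.0, "bus": 0.5, "train": 0.2})
```

The probabilities sum to one and rank the alternatives by utility, which is the property that lets estimated utilities from observed choices feed directly into forecasts such as the transport-demand applications mentioned above.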
<br>
<br>
<br>
<br>***********************************************************************************
<br>James J. Heckman (US citizen), 56, was born in Chicago, IL in 1944. Since 1995 he has been the Henry Schultz Distinguished Service Professor of Economics at the University of Chicago.
<br>
<br>Daniel L. McFadden (US citizen), 63, was born in Raleigh, NC in 1937. Since 1990 he has held the E. Morris Cox Chair in Economics at the University of California, Berkeley.
<br>
<br>The Prize amount, SEK 9 million, will be shared equally between the Laureates.
<br>
<br>
<br>
<br>
<br>--------------------------------------------------------------------------------
<br>
<br>
<br>Xinhua News Agency, Stockholm, October 11 (reporter Wu Ping). The Royal Swedish Academy of Sciences announced here on the 11th that the American economists James Heckman and Daniel McFadden have won the 2000 Nobel Prize in Economics for their outstanding contributions to the field of microeconometrics.
<br>
<br>The Academy said that the laureates' principal contribution to microeconometrics is the theory and methods they developed in the 1970s, which are now widely used for the statistical analysis of individual and household behavior. Heckman developed theory and methods for analyzing data from selective samples, while McFadden developed theory and methods for analyzing individuals' discrete choices. They will share the prize of 9 million Swedish kronor (about 1 million US dollars).
<br>
<br>The Academy explained that microeconometrics, which lies on the boundary between economics and statistics, is a methodology for studying microdata. As more microdata have become available and computers have grown more powerful, economists have been able to study many new questions empirically, such as what determines whether people work and for how many hours, how economic incentives affect choices of education, occupation, and place of residence, and what effects different educational programs have on income and employment. All of these questions can be analyzed with the theory and methods developed by Heckman and McFadden.
<br>
<br>In the Academy's view, Heckman and McFadden have resolved fundamental problems that arise in the statistical analysis of microdata. The methods they developed not only have solid foundations in economic theory but have also strongly influenced applied research on major social problems. These methods have become "standard tools" for economists and other social scientists.
<br>
<br>The Academy further noted that available microdata often come from selective samples; wage data, for example, cannot be obtained by random sampling. If such selection is not taken into account, statistical estimates of economic relationships will be biased. Heckman developed methods for handling selective samples in an appropriate way and proposed methods for solving closely related problems. He is also a leader in applied research in these areas.
<br>
<br>The Academy said that microdata can also reflect individuals' discrete choices; data on occupation or place of residence, for example, reflect how people choose among a limited set of alternatives. Earlier empirical studies of such choices lacked a foundation in economic theory. Drawing on a new theory of discrete choice, McFadden developed statistical methods for analyzing such choices. These methods have transformed empirical research and have found wide practical application in fields such as transportation and communications.
<br>
<br>Heckman was born in Chicago in 1944 and is now a professor of economics at the University of Chicago. McFadden was born in Raleigh in 1937 and now works at the University of California.
<br>The Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel 2000
<br>  
<br>
<br>
7#
Posted on 2005-12-9 03:58:17
James Heckman and Daniel McFadden have each developed theory and methods that are widely used in the statistical analysis of individual and household behavior, within economics as well as other social sciences.
<br>
<br>Microeconometrics and Microdata
<br>
<br>Microeconometrics is an interface between economics and statistics. It encompasses economic theory and statistical methods used to analyze microdata, i.e., economic information about individuals, households and firms. Microdata appear as cross-section data which refer to conditions at the same point in time, or as longitudinal data (panel data) which refer to the same observational units over a succession of years. During the last three decades, the field of microeconometrics has expanded rapidly due to the creation of large databases containing microdata.
<br>
<br>Greater availability of microdata and increasingly powerful computers have opened up entirely new possibilities of empirically testing microeconomic theory. Researchers have been able to examine many new issues at the individual level. For example: what factors determine whether an individual decides to work and, if so, how many hours? How do economic incentives affect individual choices regarding education, occupation or place of residence? What are the effects of different labor-market and educational programs on an individual's income and employment?
<br>
<br>The use of microdata has also given rise to new statistical problems, owing primarily to the limitations inherent in such (non-experimental) data. Since the researcher can only observe certain variables for particular individuals or households, a sample might not be random and thereby not representative. Even when samples are representative, some characteristics that affect individuals' behavior remain unobservable, which makes it difficult, or impossible, to explain some of the variation among individuals.
<br>
<br>This year's laureates have each shown how one can resolve some fundamental statistical problems associated with the analysis of microdata. James Heckman's and Daniel McFadden's methodological contributions share a solid foundation in economic theory. They emerged in close interaction with applied empirical studies, where new databases served as a definitive prerequisite. The microeconometric methods developed by Heckman and McFadden are now part of the standard tool kit, not only of economists, but also of other social scientists.
<br>
<br>James J. Heckman
<br>
<br>James Heckman has made many significant contributions to microeconometric theory and methodology, with different kinds of selection problems as a common denominator. He developed his methodological contributions in conjunction with applied empirical research, particularly in labor economics. Heckman's analysis of selection problems in microeconometric research has had profound implications for applied research in economics as well as in other social sciences.
<br>
<br>Selection Bias and Self-selection
<br>
<br>Selection problems are legion in microeconometric studies. They can arise when a sample available to researchers does not randomly represent the underlying population. Selective samples may be the result of rules governing collection of data or the outcome of economic agents' own behavior. The latter situation is known as self-selection. For example, wages and working hours can only be observed in the case of individuals who have chosen to work; the earnings of university graduates can only be observed for those who have completed their university education, etc. The absence of information regarding the wage an individual would earn, had he or she chosen otherwise, creates problems in many empirical studies.
<br>
<br>The problem of selection bias may be illustrated by the following figure, where w denotes an individual's wage and x is a factor that affects this wage, such as the individual's level of education. Each point in the figure represents individuals with the same education and wage levels in a large and representative sample of the population. The solid line shows the statistical (and true) relationship that we would estimate if we could indeed observe wages and education for all these individuals. Now assume - in accordance with economic theory - that only those individuals whose market wages exceed some threshold value (the reservation wage) choose to work. If this is the case, individuals with relatively high wages and relatively long education will be overrepresented in the sample we actually observe: the dark points in the figure. This selective sample creates a problem of selection bias in the sense that we will estimate the relation between wage and education given by the dashed line in the figure. We thus find a relationship weaker than the true one, thereby underestimating the effect of education on wages.
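<br>The mechanism in the figure is easy to reproduce numerically. The following sketch (illustrative Python with invented numbers, not part of the press release) simulates a true linear wage-education relation, then keeps only individuals whose wage exceeds a reservation wage, and compares the slope estimated on the full sample with the slope estimated on the self-selected one:

```python
import numpy as np

# Hypothetical simulation of selection bias: wages depend linearly on
# education, but we only observe people whose wage exceeds a reservation wage.
rng = np.random.default_rng(0)
n = 100_000
educ = rng.uniform(0, 10, n)                     # years of education (x)
wage = 1.0 + 0.5 * educ + rng.normal(0, 1, n)    # true relation: slope 0.5

full_slope = np.polyfit(educ, wage, 1)[0]        # slope if everyone were observed

observed = wage > 3.0                            # only workers enter the sample
sel_slope = np.polyfit(educ[observed], wage[observed], 1)[0]

print(f"full-sample slope {full_slope:.3f}, selected-sample slope {sel_slope:.3f}")
```

On the truncated sample the fitted line is flatter than the true one, just as the dashed line in the figure understates the effect of education on wages.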
<br>
<br>
<br>
<br>
<br>
<br>Heckman's Contributions
<br>
<br>Heckman's methodological breakthroughs regarding self-selection took place in the mid-1970s. They are closely related to his studies of individuals' decisions about their labor-force participation and hours worked. As we observe variations in hours of work solely among those who have chosen to work, we could - again - encounter samples tainted by self-selection. In an article on the labor supply of married women, published in 1974, Heckman devised an econometric method to handle such self-selection problems. This study is an excellent illustration of how microeconomic theory can be combined with microeconometric methods to clarify an important research topic.
<br>
<br>In subsequent work, Heckman proposed yet another method for handling self-selection: the well-known Heckman correction (the two-stage method, Heckman's lambda or the Heckit method). This method has had a vast impact because it is so easy to apply. Suppose that a researcher - as in the example above - wants to estimate a wage relation using individual data, but only has access to wage observations for those who work. The Heckman correction takes place in two stages. First, the researcher formulates a model, based on economic theory, for the probability of working. Statistical estimation of the model yields results that can be used to predict this probability for each individual. In the second stage, the researcher corrects for self-selection by incorporating these predicted individual probabilities as an additional explanatory variable, along with education, age, etc. The wage relation can then be estimated in a statistically appropriate way.
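<br>The two stages described above can be sketched in a few lines. This is a minimal illustration on simulated data (all coefficients, variable names and the data-generating process are invented for the example): a probit model of the decision to work is fit by maximum likelihood, the inverse Mills ratio is computed for the workers, and including it as an extra regressor removes the self-selection bias in the wage equation:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Minimal sketch of the two-stage Heckman correction on simulated data.
rng = np.random.default_rng(1)
n = 50_000
x = rng.normal(0, 1, n)                       # e.g. education (affects wage and work)
z = rng.normal(0, 1, n)                       # variable affecting only participation
u = rng.normal(0, 1, n)                       # selection-equation error
eps = 0.8 * u + 0.6 * rng.normal(0, 1, n)     # wage error, correlated with u

wage = 1.0 + 0.5 * x + eps                    # true wage relation: slope 0.5
works = (0.3 + 0.5 * x + 1.0 * z + u) > 0     # self-selection into work

# Naive OLS on workers only: biased slope.
W = np.column_stack([np.ones(n), x])
naive = np.linalg.lstsq(W[works], wage[works], rcond=None)[0]

# Stage 1: probit for the probability of working, fit by maximum likelihood.
S = np.column_stack([np.ones(n), x, z])
def negll(g):
    p = norm.cdf(S @ g).clip(1e-10, 1 - 1e-10)
    return -(np.log(p)[works].sum() + np.log(1 - p)[~works].sum())
g = minimize(negll, np.zeros(3)).x

# Inverse Mills ratio for workers: lambda = phi(index) / Phi(index).
idx = (S @ g)[works]
lam = norm.pdf(idx) / norm.cdf(idx)

# Stage 2: OLS of wage on x plus lambda corrects for self-selection.
H = np.column_stack([np.ones(works.sum()), x[works], lam])
heckit = np.linalg.lstsq(H, wage[works], rcond=None)[0]

print(f"naive slope {naive[1]:.3f}, Heckit slope {heckit[1]:.3f} (true 0.5)")
```

The naive regression understates the education effect, while the corrected second stage recovers a slope close to the true value.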
<br>
<br>Heckman's achievements have generated a large number of empirical applications in economics as well as in other social sciences. The original method has subsequently been generalized, by Heckman and by others.
<br>
<br>Duration Models
<br>
<br>Duration models have a long tradition in the engineering and medical sciences. They are frequently used by social scientists, such as demographers, to study mortality, fertility and migration. Economists apply them, for instance, to examine the effects of the duration of unemployment on the probability of getting a job. A common problem in such studies is that individuals with poor labor-market prospects might be overrepresented among those who remain unemployed. Such selection bias gives rise to problems similar to those encountered in self-selected samples: when the sample of unemployed at a given point in time is affected by unobserved individual characteristics, we may obtain misleading estimates of "duration dependence" in unemployment. In collaboration with Burton Singer, Heckman has developed econometric methods for resolving such problems. Today, this methodology is widely used throughout the social sciences.
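<br>The selection problem described here can be illustrated with a toy simulation (numbers invented for the example): mix two groups whose individual exit rates are constant, and the aggregate exit rate still falls with duration, because those with good prospects leave unemployment first:

```python
import numpy as np

# Mixing two groups with *constant* per-period exit rates produces an
# aggregate exit rate that falls with duration - spurious "duration
# dependence" of the kind the text describes. Numbers are illustrative.
rng = np.random.default_rng(2)
n = 200_000
good = rng.random(n) < 0.5
hazard = np.where(good, 0.5, 0.1)   # constant individual exit probabilities
spells = rng.geometric(hazard)      # unemployment durations in periods

def exit_rate(t):
    # Share of those still unemployed at the start of period t who exit in t.
    at_risk = spells >= t
    return (spells[at_risk] == t).mean()

print(exit_rate(1), exit_rate(5), exit_rate(10))
```

Each individual's chance of leaving unemployment never changes, yet the observed exit rate declines steadily as the high-hazard group is filtered out of the at-risk pool.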
<br>
<br>Evaluation of Active Labor Market Programs
<br>
<br>Along with the spread of active labor-market policy - such as labor-market training or employment subsidies - in many countries, there is a growing need to evaluate these programs. The classical approach is to determine how participation in a specific program affects individual earnings or employment, compared to a situation where the individual did not participate. Since the same individual cannot be observed in two situations simultaneously, information about non-participation has to be used, thereby - once again - giving rise to selection problems. Heckman is the world's leading researcher on microeconometric evaluation of labor-market programs. In collaboration with various colleagues, he has extensively analyzed the properties of alternative non-experimental evaluation methods and has explored their relation to experimental methods. Heckman has also offered numerous empirical results of his own. Even though results vary a great deal across programs and participants, the results are often quite pessimistic: many programs have only had small positive - and sometimes negative - effects for the participants and do not meet the criterion of social efficiency.
<br>
<br>Daniel L. McFadden
<br>
<br>Daniel McFadden's most significant contribution is his development of the economic theory and econometric methodology for analysis of discrete choice, i.e., choice among a finite set of decision alternatives. A recurring theme in McFadden's research is his ability to combine economic theory, statistical methods and empirical applications, where his ultimate goal has often been a desire to resolve social problems.
<br>
<br>Discrete Choice Analysis
<br>
<br>Microdata often reflect discrete choices. In a database, information about individuals' occupation, place of residence, or travel mode reflects the choices they have made among a limited number of alternatives. In economic theory, traditional demand analysis presupposes that individual choice be represented by a continuous variable, thereby rendering it inappropriate for studying discrete choice behavior. Prior to McFadden's prizewinning achievements, empirical studies of such choices lacked a foundation in economic theory.
<br>
<br>McFadden's Contributions
<br>
<br>McFadden's theory of discrete choice emanates from microeconomic theory, according to which each individual chooses a specific alternative that maximizes his utility. However, as the researcher cannot observe all the factors affecting individual choices, he perceives a random variation across individuals with the same observed characteristics. On the basis of his new theory, McFadden developed microeconometric models that can be used, for example, to predict the share of a population that will choose different alternatives.
<br>
<br>McFadden's seminal contribution is his development of so-called conditional logit analysis in 1974. In order to describe this model, suppose that each individual in a population faces a number (say, J) of alternatives. Let X denote the characteristics associated with each alternative and Z the characteristics of the individuals that the researcher can observe in his data. In a study of the choice of travel mode, for instance, where the alternatives may be car, bus or subway, X would then include information about time and costs, while Z might cover data on age, income and education. But differences among individuals and alternatives other than X and Z, although unobservable to the researcher, also determine an individual's utility-maximizing choice. Such characteristics are represented by random "error terms". McFadden assumed that these random errors have a specific statistical distribution (termed an extreme value distribution) in the population. Under these conditions (plus some technical assumptions), he demonstrated that the probability that individual i will choose alternative j can be written as:
<br>
<br>P(i chooses j) = exp(X_j β + Z_i γ_j) / Σ_{k=1..J} exp(X_k β + Z_i γ_k)
<br>
<br>In this so-called multinomial logit model, e is the base of the natural logarithm, while β and γ are (vectors of) parameters. In his database, the researcher can observe the variables X and Z, as well as the alternative the individual in fact chooses. As a result, he is able to estimate the parameters β and γ using well-known statistical methods. Even though logit models had been around for some time, McFadden's derivation of the model was entirely new and was immediately recognized as a fundamental breakthrough.
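<br>The choice probabilities of this model are straightforward to compute. Here is a small numerical sketch with made-up alternatives and parameter values (the modes, characteristics and coefficients are invented, not taken from McFadden's studies):

```python
import numpy as np

# Conditional logit choice probabilities for one individual choosing among
# three travel modes. X holds alternative characteristics (time, cost),
# z an individual characteristic (income); all numbers are illustrative.
X = np.array([[30.0, 4.0],     # car:    minutes, dollars
              [45.0, 1.5],     # bus
              [40.0, 2.0]])    # subway
beta = np.array([-0.05, -0.3])         # assumed tastes for time and cost
gamma = np.array([0.02, 0.0, -0.01])   # assumed alternative-specific income effects
z = 50.0                               # the individual's income

v = X @ beta + z * gamma               # systematic utility of each alternative
p = np.exp(v) / np.exp(v).sum()        # multinomial logit probabilities

print(dict(zip(["car", "bus", "subway"], p.round(3))))
```

The probabilities sum to one, and the alternative with the highest systematic utility (here the car, given these invented numbers) receives the largest share.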
<br>
<br>Such models are highly useful and are routinely applied in studies of urban travel demand. They can thus be used in traffic planning to examine the effects of policy measures as well as other social and/or environmental changes. For example, these models can explain how changes in price, improved accessibility or shifts in the demographic composition of the population affect the shares of travel using alternative means of transportation. The models are also relevant in numerous other areas, such as in studies of the choice of dwelling, place of residence, and education. McFadden has applied his own methods to analyze a number of social issues, such as the demand for residential energy, telephone services and housing for the elderly.
<br>
<br>Methodological Elaboration
<br>
<br>Conditional logit models have the peculiar property that the relative probabilities of choosing between two alternatives, say, travel by bus or car, are independent of the price and quality of other transportation options. This property - called independence of irrelevant alternatives (IIA) - is unrealistic in certain applications. McFadden not only devised statistical tests to ascertain whether IIA is satisfied, but also introduced more general models, such as the so-called nested logit model. Here, it is assumed that individuals' choices can be ordered in a specific sequence. For instance, when studying decisions regarding place of residence and type of housing, an individual is assumed to begin by choosing the location and then the type of dwelling.
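<br>The IIA property is easy to verify numerically. In this sketch (utilities are invented numbers), making the third alternative far more attractive changes all the probability levels but leaves the bus/car odds exactly unchanged:

```python
import numpy as np

# Demonstrates independence of irrelevant alternatives (IIA) in the plain
# conditional logit: the odds between two alternatives do not depend on a
# third alternative's attributes. Utilities are illustrative.
def logit_probs(v):
    e = np.exp(v - v.max())   # numerically stabilized softmax
    return e / e.sum()

v = np.array([1.0, 0.5, 0.2])    # car, bus, subway
p = logit_probs(v)

v2 = v.copy()
v2[2] = 2.0                       # subway becomes much more attractive
p2 = logit_probs(v2)

print(p[1] / p[0], p2[1] / p2[0])   # bus/car odds before and after
```

This invariance is exactly what is unrealistic in applications such as transport choice (improving the subway should plausibly draw relatively more riders from the bus than from the car), which motivates the nested logit generalization described above.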
<br>
<br>Even with these generalizations, the models are sensitive to the specific assumptions about the distribution of unobserved characteristics in the population. Over the last decade, McFadden has elaborated on simulation models (the method of simulated moments) for statistical estimation of discrete choice models allowing much more general assumptions. Increasingly powerful computers have enhanced the practical applicability of these numerical methods. As a result, individuals' discrete choices can now be portrayed with greater realism and their decisions predicted more accurately.
<br>
<br>Other Contributions
<br>
<br>In addition to discrete choice analysis, McFadden has made influential contributions in several other fields. In the 1960s, he devised econometric methods to assess production technologies and examine the factors behind firms' demand for capital and labor. During the 1990s, McFadden contributed to environmental economics, in particular to the literature on contingent-valuation methods for estimating the value of natural resources. A key example is his study of welfare losses due to the environmental damage along the Alaskan coast caused by the oil spill from the tanker Exxon Valdez in 1989. This study provides yet another example of McFadden's masterly skill in integrating economic theory and econometric methodology in empirical studies of important social problems.
<br>
<br>
<br>
<br>********************************************************************************
<br>James J. Heckman was born in Chicago, IL in 1944. After completing his undergraduate education at Colorado College, having majored in Mathematics, he attended Princeton University for graduate studies in Economics and received his Ph.D. there in 1971. Since then, Heckman has held professorships at Columbia University and Yale University. Since 1995, he has been the Henry Schultz Distinguished Service Professor of Economics at the University of Chicago.
<br>
<br>James Heckman
<br>Department of Economics
<br>University of Chicago
<br>1126 East 59th Street
<br>Chicago, IL 60637
<br>USA
<br><a href="http://lily.src.uchicago.edu/" target="_blank">http://lily.src.uchicago.edu/</a>
<br>
<br>
<br>Daniel L. McFadden was born in Raleigh, NC in 1937. He attended the University of Minnesota, where he received both his undergraduate degree, with a major in Physics, and, after postgraduate studies in Economics, his Ph.D. in 1962. McFadden has held professorships at the University of Pittsburgh, Yale University and MIT. Since 1990, he has been the E. Morris Cox Professor of Economics at the University of California, Berkeley.
<br>
<br>Daniel McFadden
<br>Department of Economics
<br>University of California
<br>Berkeley, CA 94720
<br>USA
<br><a href="http://emlab.berkeley.edu/users/mcfadden/index.html" target="_blank">http://emlab.berkeley.edu/users/mcfadden/index.html</a>
<br>
8#
Posted on 2005-12-9 03:59:15
<br>Presentation Speech - The Sveriges Riksbank (Bank of Sweden) Prize in Economic Sciences in Memory of Alfred Nobel
<br>
<br>KUNGL. VETENSKAPSAKADEMIEN (THE ROYAL SWEDISH ACADEMY OF SCIENCES)
<br>
<br>13 October 1999
<br>
<br>
<br>The Royal Swedish Academy of Sciences awarded the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel, 1999, to
<br>
<br>Professor Robert A. Mundell, Columbia University, New York, USA
<br>
<br>for his analysis of monetary and fiscal policy under different exchange rate regimes and his analysis of optimum currency areas.
<br>
<br>Economic policy exchange rates and capital mobility
<br>
<br>Robert Mundell has established the foundation for the theory which dominates practical policy considerations of monetary and fiscal policy in open economies. His work on monetary dynamics and optimum currency areas has inspired generations of researchers. Although dating back several decades, Mundell's contributions remain outstanding and constitute the core of teaching in international macroeconomics.
<br>
<br>Mundell's research has had such a far-reaching and lasting impact because it combines formal - but still accessible - analysis, intuitive interpretation and results with immediate policy applications. Above all, Mundell chose his problems with uncommon - almost prophetic - accuracy in terms of predicting the future development of international monetary arrangements and capital markets. Mundell's contributions serve as a superb reminder of the significance of basic research. At a given point in time academic achievements might appear rather esoteric; not long afterwards, however, they may take on great practical importance.
<br>
<br>
<br>
<br>
<br>
<br>*************************************************************************************
<br>
<br>How are the effects of monetary and fiscal policy related to the integration of international capital markets? How do these effects depend on whether a country fixes the value of its currency or allows it to float freely? Should a country even have a currency of its own? By posing and answering questions such as these, Robert Mundell has reshaped macroeconomic theory for open economies. His most important contributions were made in the 1960s. During the latter half of that decade, Mundell was among the intellectual leaders in the creative research environment at the University of Chicago. Many of his students from this period have become successful researchers in the same field, building on Mundell's foundational work.
<br>
<br>Mundell's scientific contributions are original. Yet they quickly transformed the research in international macroeconomics and attracted increasing attention in the practically oriented discussion of stabilization policy and exchange rate systems. A sojourn at the research department of the International Monetary Fund, 1961-1963, apparently stimulated Mundell's choice of research problems; it also gave his research additional leverage among economic policymakers.
<br>
<br>
<br>The Effects of Stabilization Policy
<br>In several papers published in the early 1960s - reprinted in his book International Economics (1968) - Robert Mundell developed his analysis of monetary and fiscal policy, so-called stabilization policy, in open economies.
<br>
<br>The Mundell-Fleming Model
<br>A pioneering article (1963) addresses the short-run effects of monetary and fiscal policy in an open economy. The analysis is simple, but the conclusions are numerous, robust and clear. Mundell introduced foreign trade and capital movements into the so-called IS-LM model of a closed economy, initially developed by the 1972 economics laureate Sir John Hicks. This allowed him to show that the effects of stabilization policy hinge on the degree of international capital mobility. In particular, he demonstrated the far-reaching importance of the exchange rate regime: under a floating exchange rate, monetary policy becomes powerful and fiscal policy powerless, whereas the opposite is true under a fixed exchange rate.
<br>
<br>In the interesting special case with high capital mobility, foreign and domestic interest rates coincide (given that the exchange rate is expected to be constant). Under a fixed exchange rate, the central bank must intervene on the currency market in order to satisfy the public's demand for foreign currency at this exchange rate. As a result, the central bank loses control of the money supply, which then passively adjusts to the demand for money (domestic liquidity). Attempts to implement independent national monetary policy by means of so-called open market operations are futile because neither the interest rate nor the exchange rate can be affected. However, increased government expenditures, or other fiscal policy measures, can raise national income and the level of domestic activity, thereby escaping the impediments of rising interest rates or a stronger exchange rate.
<br>
<br>A floating exchange rate is determined by the market since the central bank refrains from currency intervention. Fiscal policy now becomes powerless. Under unchanged monetary policy, increased government expenditures give rise to a greater demand for money and tendencies towards higher interest rates. Capital inflows strengthen the exchange rate to the point where lower net exports eliminate the entire expansive effect of higher government expenditures. Under floating exchange rates, however, monetary policy becomes a powerful tool for influencing economic activity. Expansion of the money supply tends to promote lower interest rates, resulting in capital outflows and a weaker exchange rate, which in turn expand the economy through increased net exports.
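<br>The contrast drawn in the last two paragraphs can be caricatured with a toy linear version of the model under perfect capital mobility (all coefficients are invented for illustration; this is a sketch, not the original formulation):

```python
# Toy linear Mundell-Fleming sketch with perfect capital mobility (r = r*).
# IS: Y = A + G - b*r + x*e   (e: exchange rate, higher e = weaker currency)
# LM: M = k*Y - h*r
# All parameter values are illustrative.
r_star, A, b, x, k, h = 0.05, 100.0, 50.0, 20.0, 0.5, 100.0

def floating(G, M):
    # e adjusts; with M given, the LM curve pins down Y: Y = (M + h*r*)/k.
    Y = (M + h * r_star) / k
    e = (Y - A - G + b * r_star) / x   # exchange rate that clears the IS curve
    return Y, e

def fixed(G, e_bar):
    # The central bank lets M adjust; the IS curve pins down Y at the peg.
    Y = A + G - b * r_star + x * e_bar
    M = k * Y - h * r_star
    return Y, M

Y0, _ = floating(G=20.0, M=80.0)
Y1, _ = floating(G=40.0, M=80.0)   # fiscal expansion under floating rates
Y2, _ = fixed(G=20.0, e_bar=1.0)
Y3, _ = fixed(G=40.0, e_bar=1.0)   # fiscal expansion under a fixed rate

print(Y1 - Y0, Y3 - Y2)
```

In this linear sketch the fiscal expansion leaves output unchanged under floating rates (the stronger exchange rate crowds out net exports one for one), while under the fixed rate the same expansion raises output by its full amount, matching Mundell's conclusion.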
<br>
<br>Floating exchange rates and high capital mobility accurately describe the present monetary regime in many countries. But in the early 1960s, an analysis of their consequences must have seemed like an academic curiosity. Almost all countries were linked together by fixed exchange rates within the so-called Bretton Woods System. International capital movements were highly curtailed, in particular by extensive capital and exchange rate controls. During the 1950s, however, Mundell's own country - Canada - had allowed its currency to float against the US dollar and had begun to ease restrictions. His far-sighted analysis became increasingly relevant over the next ten years, as international capital markets opened up and the Bretton Woods System broke down.
<br>
<br>Marcus Fleming (who died in 1976) was Deputy Director of the research department of the International Monetary Fund for many years; he was already a member of this department during the period of Mundell's affiliation. At approximately the same time as Mundell, Fleming presented similar research on stabilization policy in open economies. As a result, today's textbooks refer to the Mundell-Fleming Model. In terms of depth, range and analytical power, however, Mundell's contribution predominates.
<br>
<br>The original Mundell-Fleming Model undoubtedly had its limitations. For instance, as in all macroeconomic analysis at the time, it makes highly simplified assumptions about expectations in financial markets and assumes price rigidity in the short run. These shortcomings have been remedied by later researchers, who have shown that gradual price adjustment and rational expectations can be incorporated into the analysis without significantly changing the results.
<br>
<br>Monetary Dynamics
<br>In contrast to his colleagues during this period, Mundell's research did not stop at short-run analysis. Monetary dynamics is a key theme in several significant articles. He emphasized differences in the speed of adjustment on goods and asset markets (called the principle of effective market classification). Later on, these differences were highlighted by his own students and others to show how the exchange rate can temporarily "overshoot" in the wake of certain disturbances.
<br>
<br>An important problem concerned deficits and surpluses in the balance of payments. In the postwar period, research on these imbalances had been based on static models and emphasized real economic factors and flows in foreign trade. Inspired by David Hume's classic mechanism for international price adjustment, which focused on monetary factors and stock variables, Mundell formulated dynamic models to describe how prolonged imbalances could arise and be eliminated. He demonstrated that an economy will adjust gradually over time as the money holdings of the private sector (and thereby its wealth) change in response to surpluses or deficits. Under fixed exchange rates, for example, when capital movements are sluggish, an expansive monetary policy will reduce interest rates and raise domestic demand. The subsequent balance of payments deficit will generate monetary outflows, which in turn lower demand until the balance of payments returns to equilibrium. This approach, which was adopted by a number of researchers, became known as the monetary approach to the balance of payments. For a long time it was regarded as a kind of long-run benchmark for analyzing stabilization policy in open economies. Insights from this analysis have frequently been applied in practical economic policymaking - particularly by IMF economists.
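<br>The adjustment mechanism just described can be sketched as a one-line difference equation (parameters invented for illustration): after an expansionary open-market operation, payments deficits drain the excess money stock until holdings return to their desired long-run level:

```python
# Stylized simulation of the monetary approach to the balance of payments
# under a fixed exchange rate. An excess money supply leaks out through
# balance-of-payments deficits until money demand and supply are equal.
# All numbers are illustrative.
M_desired = 100.0   # long-run money demand (k * P * Y)
alpha = 0.3         # speed at which monetary imbalances flow abroad
M = 130.0           # money stock after an expansionary open-market operation

path = []
for t in range(30):
    bop = alpha * (M_desired - M)   # negative: a balance-of-payments deficit
    M += bop                        # the deficit drains the money stock
    path.append(M)

print(round(path[0], 2), round(path[-1], 4))
```

The money stock falls monotonically back toward its long-run level, and the deficit disappears with it, mirroring the self-correcting dynamics Mundell formalized.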
<br>
<br>Prior to another of Mundell's contributions, the theory of stabilization policy had not only been static, it had also assumed that all economic policy in a country is coordinated and assembled in a single hand. By contrast, Mundell used a simple dynamic model to examine how each of the two instruments, monetary and fiscal policy, should be directed towards either of two objectives, external and internal balance, in order to bring the economy closer to these objectives over time. This implies that each of two different authorities - the government and the central bank - is given responsibility for its own stabilization policy instrument. Mundell's conclusion was straightforward: to prevent the economy from becoming unstable, the linkage has to accord with the relative efficiency of the instruments. In his model, monetary policy is linked to external balance and fiscal policy to internal balance. Mundell's primary concern was not decentralization itself. But by explaining the conditions for decentralization, he anticipated the idea which, long afterwards, has become generally accepted, i.e., that the central bank should be given independent responsibility for price stability.
<br>
<br>Mundell's contributions on dynamics proved to be a watershed for research in international macroeconomics. They introduced a meaningful dynamic approach, based on a clear-cut distinction between stock and flow variables, as well as an analysis of their interaction during the adjustment of an economy to a stable long-run situation. Mundell's work also initiated the necessary rapprochement between Keynesian short-run analysis and classical long-run analysis. Subsequent researchers have extended Mundell's findings. The models have been extended to incorporate forward-looking decisions of households and firms, additional types of financial assets and richer dynamic adjustments of prices and the current account. Despite these modifications, most of Mundell's results stand up.
<br>
<br>The short-run and long-run analyses carried out by Mundell arrive at the same fundamental conclusion regarding the conditions for monetary policy. With (i) free capital mobility, monetary policy can be oriented towards either (ii) an external objective - such as the exchange rate - or (iii) an internal (domestic) objective - such as the price level - but not both at the same time. This incompatible trinity has become self-evident for academic economists; today, this insight is also shared by the majority of participants in the practical debate.
<br>
<br>
<br>Optimum Currency Areas
<br>
<br>As already indicated, fixed exchange rates predominated in the early 1960s. A few researchers did in fact discuss the advantages and disadvantages of a floating exchange rate. But a national currency was considered a must. The question Mundell posed in his article on "optimum currency areas" (1961) therefore seemed radical: when is it advantageous for a number of regions to relinquish their monetary sovereignty in favor of a common currency?
<br>
<br>Mundell's article briefly mentions the advantages of a common currency, such as lower transaction costs in trade and less uncertainty about relative prices. The disadvantages are described in greater detail. The major drawback is the difficulty of maintaining employment when changes in demand or other "asymmetric shocks" require a reduction in real wages in a particular region. Mundell emphasized the importance of high labor mobility in order to offset such disturbances. He characterized an optimum currency area as a set of regions among which the propensity to migrate is high enough to ensure full employment when one of the regions faces an asymmetric shock. Other researchers extended the theory and identified additional criteria, such as capital mobility, regional specialization and a common tax and transfer system. The way Mundell originally formulated the problem has nevertheless continued to influence generations of economists.
<br>
<br>Mundell's considerations of several decades ago seem highly relevant today. Due to increasing capital mobility in the world economy, regimes with a temporarily fixed, but adjustable, exchange rate have become more fragile; such regimes are also being called into question. Many observers view a currency union or a floating exchange rate - the two cases Mundell's article dealt with - as the most relevant alternatives. Needless to say, Mundell's analysis has also attracted attention in connection with the common European currency. Researchers who have examined the economic advantages and disadvantages of EMU have adopted the idea of an optimum currency area as an obvious starting point. Indeed, one of the key issues in this context is labor mobility in response to asymmetric shocks.
<br>
<br>
<br>Other Contributions
<br>
<br>Mundell has made other contributions to macroeconomic theory. He has shown, for example, that higher inflation can induce investors to lower their cash balances in favor of increased real capital formation. As a result, even expected inflation might have a real economic effect - which has come to be known as the Mundell-Tobin effect. Mundell has also made lasting contributions to international trade theory. He has clarified how the international mobility of labor and capital tends to equalize commodity prices among countries, even if foreign trade is limited by trade barriers. This may be regarded as the mirror image of the well-known Heckscher-Ohlin-Samuelson result that free trade of goods tends to bring about equalization of the rewards to labor and capital among countries, even if international capital movements and migration are limited. These results provide a clear prediction: trade barriers stimulate international mobility of labor and capital, whereas barriers to migration and capital movements stimulate commodity trade.
<br>
<br>
<br>
<br>
<br>*************************************************************************************
<br>
<br>Further Reading
<br>
<br>
<br>Additional background information
<br>
<br>
<br>Mundell, R.A. (1961), "A Theory of Optimum Currency Areas", American Economic Review 51: 657-665.
<br>
<br>
<br>Mundell, R.A. (1963), "Capital Mobility and Stabilization Policy under Fixed and Flexible Exchange Rates", Canadian Journal of Economics and Political Science 29: 475-485.
<br>
<br>
<br>Mundell, R.A. (1968), International Economics (New York: Macmillan).
<br>
<br>
<br>
<br>************************************************************************************
<br>
<br>
<br>Robert A. Mundell was born in Canada in 1932. After completing his undergraduate education at the University of British Columbia, he began his postgraduate studies at the University of Washington and continued them at M.I.T. and the London School of Economics. Mundell received his Ph.D. from M.I.T. in 1956 with a thesis on international capital movements. After holding several professorships, he has been affiliated with Columbia University in New York since 1974.
<br>
<br>Professor Robert A. Mundell
<br>Economics Department
<br>Columbia University
<br>1022 International Affairs Building
<br>420 West 118th Street
<br>New York, NY 10027
<br>USA
<br>
<br>
<br>The amount of the Prize Award is SEK 7,900,000.
<br>
9#
Posted on 2005-12-9 03:59:53
Presentation Speech - The Sveriges Riksbank (Bank of Sweden) Prize in Economic Sciences in Memory of Alfred Nobel
<br>
<br>KUNGL. VETENSKAPSAKADEMIEN - THE ROYAL SWEDISH ACADEMY OF SCIENCES
<br>
<br>14 October 1998
<br>
<br>
<br>The Royal Swedish Academy of Sciences has decided to award the 1998 Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel, to
<br>
<br>
<br>
<br>Professor Amartya Sen, Trinity College, Cambridge, U.K. (citizen of India)
<br>
<br>for his contributions to welfare economics.
<br>
<br>
<br>Social Choice, Welfare Distributions, and Poverty
<br>
<br>Amartya Sen has made several key contributions to the research on fundamental problems in welfare economics. His contributions range from the axiomatic theory of social choice, through definitions of welfare and poverty indexes, to empirical studies of famine. They are tied closely together by a general interest in distributional issues and a particular interest in the most impoverished members of society. Sen has clarified the conditions which permit aggregation of individual values into collective decisions, and the conditions which permit rules for collective decision making that are consistent with a sphere of rights for the individual. By analyzing the available information about different individuals' welfare when collective decisions are made, he has improved the theoretical foundation for comparing different distributions of society's welfare and defined new, and more satisfactory, indexes of poverty. In empirical studies, Sen's applications of his theoretical approach have enhanced our understanding of the economic mechanisms underlying famines.
<br>
<br>************************************************************************************
<br>
<br>Can the values which individual members of society attach to different alternatives be aggregated into values for society as a whole, in a way that is both fair and theoretically sound? Is the majority principle a workable decision rule? How should income inequality be measured? When and how can we compare the distribution of welfare in different societies? How should we best determine whether poverty is on the decline? What are the factors that trigger famines? By answering questions such as these, Amartya Sen has made a number of noteworthy contributions to central fields of economic science and opened up new fields of study for subsequent generations of researchers. By combining tools from economics and philosophy, he has restored an ethical dimension to the discussion of vital economic problems.
<br>
<br>Individual Values and Collective Decisions
<br>
<br>When there is general agreement, the choices made by society are uncontroversial. When opinions differ, the problem is to find methods for bringing together different opinions in decisions which concern everyone. The theory of social choice is preoccupied precisely with this link between individual values and collective choice. Fundamental questions are whether - and, if so, in what way - preferences for society as a whole can be consistently derived from the preferences of its members. The answers are crucial for the feasibility of ranking, or otherwise evaluating, different social states and thereby constructing meaningful measures of social welfare.
<br>
<br>Majority rule
<br>Majority voting is perhaps the most common rule for making collective decisions. A long time ago, this rule was found to have serious deficiencies, in addition to the fact that it may allow a majority to suppress a minority. In some situations it may pay off to vote strategically (i.e. not to vote for the preferred alternative), or to manipulate the order in which different alternatives are voted upon. Voting between pairs of alternatives sometimes fails to produce a clear result in a group. A majority may thus prefer alternative a to alternative b, whereas a (second) majority prefers b to c; meanwhile, a (third) majority prefers c to a. In the wake of this kind of "intransitivity", the decision rule cannot select an alternative that is unambiguously best for any majority. In collaboration with Prasanta Pattanaik, Amartya Sen has specified the general conditions that eliminate such intransitivities of majority rule.
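<br>
<br>The cycle described above can be checked mechanically. The sketch below is illustrative: the three voter preference profiles are hypothetical, chosen to produce the intransitivity, and are not taken from the text.

```python
# Three voters ranking alternatives a, b, c (most preferred first).
# These preference profiles are hypothetical, chosen to produce a cycle.
profiles = [
    ["a", "b", "c"],  # voter 1: a > b > c
    ["b", "c", "a"],  # voter 2: b > c > a
    ["c", "a", "b"],  # voter 3: c > a > b
]

def majority_prefers(x, y, profiles):
    """True if a strict majority of voters rank x above y."""
    wins = sum(1 for p in profiles if p.index(x) < p.index(y))
    return wins > len(profiles) / 2

# Each pairwise contest is won 2-1, yielding the cycle a > b > c > a,
# so pairwise majority voting selects no unambiguously best alternative.
assert majority_prefers("a", "b", profiles)
assert majority_prefers("b", "c", profiles)
assert majority_prefers("c", "a", profiles)
```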
<br>
<br>In the early 1950s, such problems associated with rules for collective choice motivated economics laureate Kenneth Arrow (1972) to examine possible rules for aggregating individual preferences (values, votes), where majority rule was only one of many alternatives. His surprising but fundamental result was that no aggregation (decision) rule exists that fulfills five conditions (axioms), each of which appears very reasonable on its own.
<br>
<br>This so-called impossibility theorem seemed to be an insurmountable obstacle to progress in the normative branch of economics for a long time. How could individual preferences be aggregated and different social states evaluated in a theoretically satisfactory way? Sen's contributions from the mid-1960s onwards were instrumental in alleviating this pessimism. His work not only enriched the principles of social choice theory; it also opened up new and important fields of study. Sen's monograph Collective Choice and Social Welfare from 1970 was particularly influential and inspired many researchers to renew their interest in basic welfare issues. Its style, interspersing formally and philosophically oriented chapters, gave the economic analysis of normative problems a new dimension. In the book as well as many separate articles, Sen treated problems such as majority rule, individual rights, and the availability of information about individual welfare.
<br>
<br>Individual rights
<br>A self-evident prerequisite for a collective decision-making rule is that it should be "non-dictatorial"; that is, it should not reflect the values of any single individual. A minimal requirement for protecting individual rights is that the rule should respect the individual preferences of at least some people in at least some dimension, for instance regarding their personal sphere. Sen pointed to a fundamental dilemma by showing that no collective decision rule can fulfill such a minimal requirement on individual rights and the other axioms in Arrow's impossibility theorem. This finding initiated an extensive scientific discussion about the extent to which a collective decision rule can be made consistent with a sphere of individual rights.
<br>
<br>Information about the welfare of individuals
<br>Traditionally, the theory of social choice had only assumed that every individual can rank different alternatives, without assuming anything about interpersonal comparability. This assumption certainly avoided the difficult question of whether the utility individuals attach to different alternatives can really be compared. Unfortunately, it also precluded saying anything worthwhile about inequality. Sen initiated an entirely new field in the theory of social choice, by showing how different assumptions regarding interpersonal comparability affect the possibility of finding a consistent, non-dictatorial rule for collective decisions. He also demonstrated the implicit assumptions made when applying principles proposed by moral philosophy to evaluate different alternatives for society. The utilitarian principle, for instance, appeals to the sum of all individuals' utility when evaluating a specific social state; this assumes that differences in the utility of alternative social states can be compared across individuals. The principle formulated by the American philosopher John Rawls - that the social state should be evaluated only with reference to the individual who is worst off - assumes that the utility level of each individual can be compared to the utility of every other individual. Later developments in social choice rely, to a large extent, on Sen's analysis of the information about, and interpersonal comparability of, individual utilities.
<br>
<br>Indexes of Welfare and Poverty
<br>
<br>In order to compare distributions of welfare in different countries, or to study changes in the distribution within a given country, some kind of index is required that measures differences in welfare or income. The construction of such indexes is an important application of the theory of social choice, in the sense that inequality indexes are closely linked to welfare functions representing the values of society. Serge Kolm, Anthony Atkinson and - somewhat later - Amartya Sen were the first to derive substantial results in this area. Around 1970, they clarified the relation between the so-called Lorenz curve (which describes the income distribution), the so-called Gini coefficient (which measures the degree of income inequality), and society's ordering of different income distributions. Sen later made valuable contributions by defining poverty indexes and other welfare indicators.
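<br>
<br>As a concrete illustration of the inequality measure mentioned above, the Gini coefficient can be computed directly from a list of incomes as half the mean absolute difference divided by the mean (a standard, equivalent definition). The O(n²) loop below is a minimal sketch, not an efficient implementation:

```python
def gini(incomes):
    """Gini coefficient: 0 for perfect equality, approaching 1 for
    extreme inequality. Uses the mean-absolute-difference definition."""
    n = len(incomes)
    mean = sum(incomes) / n
    mad = sum(abs(x - y) for x in incomes for y in incomes) / (n * n)
    return mad / (2 * mean)

assert gini([1, 1, 1, 1]) == 0.0               # perfect equality
assert abs(gini([0, 0, 0, 1]) - 0.75) < 1e-12  # one person holds everything
```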
<br>
<br>Poverty indexes
<br>A common measure of poverty in a society is the share of the population, H, with incomes below a certain predetermined poverty line. But the theoretical foundation for this kind of measure was unclear. It also ignored the degree of poverty among the poor; even a significant boost in the income of the poorest groups in society does not affect H as long as their incomes do not cross the poverty line. To remedy these deficiencies, Sen postulated five reasonable axioms from which he derived a poverty index: P = H · [I + (1 - I) · G]. Here, G is the Gini coefficient and I is a measure (between 0 and 1) of the distribution of income, both computed only for the individuals below the poverty line. Relying on his earlier analysis of information about the welfare of single individuals, Sen clarified when the index can and should be applied; comparisons can, for example, be made even when data are problematic, which is often the case in poor countries, where poverty indexes have their most important applications. Sen's poverty index has since been applied extensively; researchers who have proposed alternative indexes have retained three of the axioms he postulated.
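<br>
<br>Sen's index can be sketched in a few lines, interpreting I as the average income-gap ratio among the poor and G as the Gini coefficient computed over the poor only, as described above. The helper function and the sample numbers are illustrative assumptions:

```python
def gini(values):
    """Gini coefficient via mean absolute differences."""
    n = len(values)
    mean = sum(values) / n
    mad = sum(abs(x - y) for x in values for y in values) / (n * n)
    return mad / (2 * mean) if mean > 0 else 0.0

def sen_index(incomes, z):
    """Sen poverty index P = H * (I + (1 - I) * G), with H the head-count
    ratio, I the average income-gap ratio among the poor, and G the Gini
    coefficient of incomes below the poverty line z."""
    poor = [y for y in incomes if y < z]
    if not poor:
        return 0.0
    H = len(poor) / len(incomes)
    I = sum((z - y) / z for y in poor) / len(poor)
    G = gini(poor)
    return H * (I + (1 - I) * G)

# Hypothetical incomes with poverty line 40: three of four people are poor,
# so H = 0.75, I = 0.5, G = 2/9, giving P = 11/24.
assert abs(sen_index([10, 20, 30, 100], 40) - 11 / 24) < 1e-12
assert sen_index([50, 60], 40) == 0.0  # nobody below the line
```

Note that when all of the poor have equal incomes, G = 0 and the index reduces to P = H · I, the head count weighted by the depth of poverty.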
<br>
<br>Welfare indicators
<br>A problem when comparing the welfare of different societies is that many commonly used indicators, such as income per capita, only take average conditions into account. Sen has developed alternatives, which also encompass the income distribution. A specific alternative - which, like the poverty index, he derived from a number of axioms - is to use the measure y · (1 - G), where y is income per capita and G is the Gini coefficient.
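<br>
<br>A minimal sketch of this distribution-adjusted indicator follows, reusing the same mean-absolute-difference Gini; the sample incomes are hypothetical:

```python
def gini(values):
    """Gini coefficient via mean absolute differences."""
    n = len(values)
    mean = sum(values) / n
    mad = sum(abs(x - y) for x in values for y in values) / (n * n)
    return mad / (2 * mean) if mean > 0 else 0.0

def sen_welfare(incomes):
    """Sen's welfare indicator y * (1 - G): per-capita income
    discounted by inequality."""
    y = sum(incomes) / len(incomes)
    return y * (1 - gini(incomes))

# Same mean income, different distributions: the more equal
# society scores higher.
assert sen_welfare([5, 5]) == 5.0
assert abs(sen_welfare([0, 10]) - 2.5) < 1e-12
```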
<br>
<br>Sen has emphasized that what creates welfare is not goods as such, but the activity for which they are acquired. According to this view, income is significant because of the opportunities it creates. But the actual opportunities - or capabilities, as Sen calls them - also depend on a number of other factors, such as health; these factors should also be considered when measuring welfare. Alternative welfare indicators, such as the UN's Human Development Index, are constructed precisely in this spirit.
<br>
<br>Amartya Sen has pointed out that all well-founded ethical principles presuppose equality among individuals in some respect. But as the ability to exploit equal opportunity varies across individuals, the distribution problem can never be fully solved; equality in some dimension necessarily implies inequality in others. In which dimension we advocate equality and in which dimensions we have to accept inequality obviously depends on how we evaluate the different dimensions of welfare. In analogy with his approach to welfare measurement, Sen maintains that capabilities of individuals constitute the principal dimension in which we should strive for equality. At the same time, he observes a problem with this ethical principle, namely that individuals make decisions which determine their capabilities at a later stage.
<br>
<br>Welfare of the Poorest
<br>
<br>In his very first articles Sen analyzed the choice of production technology in developing countries. Indeed, almost all of Sen's works deal with development economics, as they are often devoted to the welfare of the poorest people in society. He has also studied actual famines, in a way quite in line with his theoretical approach to welfare measurement.
<br>
<br>Analysis of famine
<br>Sen's best-known work in this area is his book from 1981: Poverty and Famines: An Essay on Entitlement and Deprivation. Here, he challenges the common view that a shortage of food is the most important (sometimes the only) explanation for famine. On the basis of a careful study of a number of such catastrophes in India, Bangladesh, and Saharan countries, from the 1940s onwards, he found other explanatory factors. He argues that several observed phenomena cannot in fact be explained by a shortage of food alone, e.g. that famines have occurred even when the supply of food was not significantly lower than during previous years (without famines), or that famine-stricken areas have sometimes exported food.
<br>
<br>Sen shows that a profound understanding of famine requires a thorough analysis of how various social and economic factors influence different groups in society and determine their actual opportunities. For example, part of his explanation for the Bangladesh famine of 1974 is that flooding throughout the country that year significantly raised food prices, while work opportunities for agricultural workers declined drastically as one of the crops could not be harvested. Due to these factors, the real incomes of agricultural workers declined so much that this group was disproportionately stricken by starvation.
<br>
<br>Later works by Sen (summarized in Hunger and Public Action, a 1989 book written with Jean Drèze) discuss - in a similar spirit - how to prevent famine, or how to limit the effects of famine once it has occurred. Even though a few critics have questioned the validity of some empirical results in Poverty and Famines, the book is undoubtedly a key contribution to development economics. With its emphasis on distributional issues and poverty, the book fits squarely with the common theme of Amartya Sen's research.
<br>
<br>
<br>
<br>************************************************************************************
<br>
<br>  
<br>Further Reading
<br>
<br>Additional background material
<br>Sen, A. K., 1970, Collective Choice and Social Welfare, San Francisco: Holden-Day; also London: Oliver and Boyd (reprinted Amsterdam: North-Holland).
<br>Sen, A. K., 1973, On Economic Inequality, Oxford: Clarendon Press.
<br>Sen, A. K., 1981, Poverty and Famines: An Essay on Entitlement and Deprivation, Oxford: Clarendon Press.
<br>
<br>
<br>******************************************************************************
<br>Amartya Sen was born in Bengal in 1933 and is a citizen of India. He received his doctorate from the University of Cambridge, U.K. in 1959 and has held professorships in India, the U.K. and the U.S. In 1998 he left his professorships in economics and philosophy at Harvard University to become Master of Trinity College, Cambridge, U.K.
<br>
<br>Professor Amartya Sen
<br>Trinity College
<br>Cambridge, CB2 1TQ, U.K.
<br>
<br>
10#
Posted on 2005-12-9 04:00:32
Presentation Speech - The Sveriges Riksbank (Bank of Sweden) Prize in Economic Sciences in Memory of Alfred Nobel
<br>
<br>KUNGL. VETENSKAPSAKADEMIEN - THE ROYAL SWEDISH ACADEMY OF SCIENCES
<br>
<br>14 October 1997
<br>
<br>
<br>The Royal Swedish Academy of Sciences has decided to award the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel, 1997, to
<br>
<br>Professor Robert C. Merton, Harvard University, Cambridge, USA and
<br>Professor Myron S. Scholes, Stanford University, Stanford, USA
<br>
<br>for a new method to determine the value of derivatives.
<br>
<br>Robert C. Merton and Myron S. Scholes have, in collaboration with the late Fischer Black, developed a pioneering formula for the valuation of stock options. Their methodology has paved the way for economic valuations in many areas. It has also generated new types of financial instruments and facilitated more efficient risk management in society.
<br>
<br>************************************************************************************
<br>
<br>In a modern market economy it is essential that firms and households are able to select an appropriate level of risk in their transactions. This takes place on financial markets which redistribute risks towards those agents who are willing and able to assume them. Markets for options and other so-called derivatives are important in the sense that agents who anticipate future revenues or payments can ensure a profit above a certain level or insure themselves against a loss above a certain level. (Due to their design, options allow for hedging against one-sided risk - options give the right, but not the obligation, to buy or sell a certain security in the future at a prespecified price.) A prerequisite for efficient management of risk, however, is that such instruments are correctly valued, or priced. A new method to determine the value of derivatives stands out among the foremost contributions to economic sciences over the last 25 years.
<br>
<br>This year's laureates, Robert Merton and Myron Scholes, developed this method in close collaboration with Fischer Black, who died in his mid-fifties in 1995. These three scholars worked on the same problem: option valuation. In 1973, Black and Scholes published what has come to be known as the Black-Scholes formula. Thousands of traders and investors now use this formula every day to value stock options in markets throughout the world. Robert Merton devised another method to derive the formula that turned out to have very wide applicability; he also generalized the formula in many directions.
<br>
<br>Black, Merton and Scholes thus laid the foundation for the rapid growth of markets for derivatives in the last ten years. Their method has more general applicability, however, and has created new areas of research - inside as well as outside of financial economics. A similar method may be used to value insurance contracts and guarantees, or the flexibility of physical investment projects.
<br>
<br>
<br>The problem
<br>
<br>Attempts to value derivatives have a long history. As far back as 1900, the French mathematician Louis Bachelier reported one of the earliest attempts in his doctoral dissertation, although the formula he derived was flawed in several ways. Subsequent researchers handled the movements of stock prices and interest rates more successfully. But all of these attempts suffered from the same fundamental shortcoming: risk premia were not dealt with in a correct way.
<br>
<br>The value of an option to buy or sell a share depends on the uncertain development of the share price to the date of maturity. It is therefore natural to suppose - as did earlier researchers - that valuation of an option requires taking a stance on which risk premium to use, in the same way as one has to determine which risk premium to use when calculating present values in the evaluation of a future physical investment project with uncertain returns. Assigning a risk premium is difficult, however, in that the correct risk premium depends on the investor's attitude towards risk. Whereas the attitude towards risk can be strictly defined in theory, it is hard or impossible to observe in reality.
<br>
<br>The method
<br>
<br>Black, Merton and Scholes made a vital contribution by showing that it is in fact not necessary to use any risk premium when valuing an option. This does not mean that the risk premium disappears; instead it is already included in the stock price.
<br>
<br>The idea behind their valuation method can be illustrated as follows:
<br>Consider a so-called European call option that gives the right to buy one share in a certain firm at a strike price of $50, three months from now. The value of this option obviously depends not only on the strike price, but also on today's stock price: the higher the stock price today, the greater the probability that it will exceed $50 in three months, in which case it pays to exercise the option. As a simple example, let us assume that if the stock price goes up by $2 today, the option goes up by $1. Assume also that an investor owns a number of shares in the firm in question and wants to lower the risk of changes in the stock price. He can actually eliminate that risk completely, by selling (writing) two options for every share that he owns. Since the portfolio thus created is risk-free, the capital he has invested must pay exactly the same return as the risk-free market interest rate on a three-month treasury bill. If this were not the case, arbitrage trading would begin to eliminate the possibility of making a risk-free profit. As the time to maturity approaches, however, and the stock price changes, the relation between the option price and the share price also changes. Therefore, to maintain a risk-free option-stock portfolio, the investor has to make gradual changes in its composition.
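<br>
<br>The riskless-portfolio argument can be checked with the numbers in the example. Assuming, as in the text, that the option price moves $1 when the stock moves $2 (a local "delta" of 0.5; the price levels below are hypothetical), holding one share and writing two options leaves the portfolio value unchanged by a small price move in either direction:

```python
delta = 0.5          # option price change per $1 stock move ($1 per $2, as in the text)
s0, c0 = 50.0, 4.0   # hypothetical current stock and option prices

def portfolio_value(stock_price, option_price):
    """One share held long, two call options written (sold)."""
    return stock_price - 2 * option_price

base = portfolio_value(s0, c0)
for ds in (-2.0, 2.0):   # stock moves $2 down or up
    dc = delta * ds      # option moves $1 in the same direction
    # The hedged portfolio's value is unchanged: it is locally risk-free.
    assert portfolio_value(s0 + ds, c0 + dc) == base
```

Because the hedge ratio itself changes as the stock price and the time to maturity change, the position must be rebalanced continually, which is exactly the "gradual changes in its composition" the text describes.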
<br>
<br>One can use this argument, along with some technical assumptions, to write down a partial differential equation. The solution to this equation is precisely the Black-Scholes formula.
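<br>
<br>The resulting formula for a European call, c = S·N(d1) - K·e^(-rT)·N(d2), is short enough to write down directly; the numerical inputs in the sanity checks below are hypothetical:

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes value of a European call: spot S, strike K,
    time to maturity T (in years), risk-free rate r, volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Sanity checks: the call is worth less than the stock but more than its
# discounted intrinsic value, and higher volatility raises its price.
c = black_scholes_call(55.0, 50.0, 0.25, 0.05, 0.2)
assert 55.0 - 50.0 * exp(-0.05 * 0.25) < c < 55.0
assert black_scholes_call(50.0, 50.0, 0.25, 0.05, 0.3) > \
       black_scholes_call(50.0, 50.0, 0.25, 0.05, 0.1)
```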