
Plagiarism is defined in dictionaries as the "wrongful appropriation," "close imitation," or "purloining and publication" of another author's "language, thoughts, ideas, or expressions," and the representation of them as one's own original work,[1][2] but the notion remains problematic with nebulous boundaries.[3][4][5][6]

The modern concept of plagiarism as immoral and originality as an ideal emerged in Europe only in the 18th century, particularly with the Romantic movement, while in the previous centuries authors and artists were encouraged to "copy the masters as closely as possible" and avoid "unnecessary invention."[7][8][9][10][11][12]

The new morals of the 18th century have been institutionalized and enforced prominently in academia and journalism, where plagiarism is now considered academic dishonesty and a breach of journalistic ethics, subject to sanctions such as expulsion and other severe career damage. Not so in the arts, which have not only persisted in their long-established tradition of copying as a fundamental practice of the creative process,[12][13][14] but, with the boom of the modernist and postmodern movements in the 20th century, have elevated this practice into the central and representative artistic device.[12][15][16]

Plagiarism remains tolerated by 21st century artists.[13][14]

Plagiarism is not a crime per se but is disapproved of more on the grounds of moral offence,[7][17] and cases of plagiarism can involve liability for copyright infringement.
Contents
1 Etymology and history
2 Legal aspects
3 In academia and journalism
   3.1 Journalism
   3.2 Sanctions for student plagiarism
   3.3 Self-plagiarism
      3.3.1 The concept of self-plagiarism
      3.3.2 Self-plagiarism and codes of ethics
      3.3.3 Factors that justify reuse
   3.4 Organizational publications
4 In the arts
   4.1 Plagiarism and the history of art
   4.2 Praise of artistic plagiarism
5 In other contexts
   5.1 Plagiarism on the Internet
   5.2 As a practical issue
6 See also
7 Notes
8 References
9 Further reading
10 External links

Etymology and history

Twentieth-century dictionaries define plagiarism as the "wrongful appropriation," "close imitation," or "purloining and publication" of another author's "language, thoughts, ideas, or expressions," and the representation of them as one's own original work,[1][2] but the notion remains problematic with nebulous boundaries.[3][4][5][6] There is no rigorous and precise distinction between imitation, stylistic plagiarism, copy, replica and forgery.[3][4][5][6] In the 1st century, the use of the Latin word plagiarius (literally kidnapper) to denote someone stealing someone else's work was pioneered by the Roman poet Martial, who complained that another poet had "kidnapped his verses." This use of the word was introduced into English in 1601 by the dramatist Ben Jonson, to describe as a plagiary someone guilty of literary theft.[7][18] The derived form plagiarism was introduced into English around 1620.[19] The Latin plagiarius, "kidnapper", and plagium, "kidnapping", have the root plaga ("snare", "net"), based on the Indo-European root *-plak, "to weave" (seen for instance in Greek plekein, Bulgarian pleta, and Latin plectere, all meaning "to weave").

The modern concept of plagiarism as immoral and originality as an ideal emerged in Europe only in the 18th century, particularly with the Romantic movement.[7][11][12] For centuries before, not only was literature considered "publica materies," a common property from which anybody could borrow at will, but authors and artists were actually encouraged to "copy the masters as closely as possible," and the closer the copy, the finer the work was considered.[7][8][13][20][21] This was the same in literature, music, painting and sculpture. In some cases, for a writer to invent their own plots was reproached as presumptuous.[7] This held true in Shakespeare's time as well, when similarity to an admired classical work was appreciated all the more, and the ideal was to avoid "unnecessary invention."[7][9][10] The modern ideals of originality and opposition to plagiarism appeared in the 18th century, in the context of the economic and political history of the book trade, which would prove exemplary and influential for the subsequent broader introduction of capitalism.[22] Originality, which traditionally had been deemed impossible, was turned into an obligation by the emerging ideology of individualism.[10][13] In 1755 the word made it into Johnson's influential A Dictionary of the English Language, appearing in the entry for copier ("One that imitates; a plagiary; an imitator. Without invention a painter is but a copier, and a poet but a plagiary of others.") and in its own entry, denoting both "a thief in literature; one who steals the thoughts or writings of another" and "the crime of literary theft."[7][23] Later in the 18th century, the Romantic movement completed the transformation of the previous ideas about literature, developing the Romantic myth of artistic inspiration, which believes in the "individualised, inimitable act of literary creation" and in the ideology of the "creation from nothingness" of a text as an "autonomous object produced by an individual genius."[5][12][21][24][25][26][27] Despite the new morals of the 18th century, and their current enforcement in the ethical codes of academia and journalism, the arts by contrast have not only persisted in their long-established tradition of copying as a fundamental practice of the creative process,[12][13][14] but, with the boom of the modernist and postmodern movements, this practice has spread and intensified to an unprecedented degree, to the point that it has become the central and representative artistic device of these movements.[15][16][12] Plagiarism remains tolerated by 21st century artists.[13][14]

Legal aspects

Though plagiarism in some contexts is considered theft or stealing, the concept does not exist in a legal sense. "Plagiarism" is not mentioned in any current statute, either criminal or civil.[14][17] Some cases may be treated as unfair competition or a violation of the doctrine of moral rights.[17] The increased availability of intellectual property due to a rise in technology has furthered the debate as to whether copyright offences are criminal.[citation needed] In short, people are asked to use the guideline, "...if you did not write it yourself, you must give credit."[28][unreliable source?] Plagiarism is not the same as copyright infringement. While both terms may apply to a particular act, they are different concepts. Copyright infringement is a violation of the rights of a copyright holder, when material restricted by copyright is used without consent. On the other hand, the moral concept of plagiarism is concerned with the unearned increment to the plagiarizing author's reputation that is achieved through false claims of authorship.

In academia and journalism

Within academia, plagiarism by students, professors, or researchers is considered academic dishonesty or academic fraud, and offenders are subject to academic censure, up to and including expulsion. In journalism, plagiarism is considered a breach of journalistic ethics, and reporters caught plagiarizing typically face disciplinary measures ranging from suspension to termination of employment. Some individuals caught plagiarizing in academic or journalistic contexts claim that they plagiarized unintentionally, by failing to include quotations or give the appropriate citation. While plagiarism in scholarship and journalism has a centuries-old history, the development of the Internet, where articles appear as electronic text, has made the physical act of copying the work of others much easier. For professors and researchers, plagiarism is punished by sanctions ranging from suspension to termination, along with the loss of credibility and integrity.[29][30] Charges of plagiarism against students and professors are typically heard by internal disciplinary committees, which students and professors have agreed to be bound by.[31]

Journalism

Since journalism's main currency is public trust, a reporter's failure to honestly acknowledge their sources undercuts a newspaper or television news show's integrity and undermines its credibility. Journalists accused of plagiarism are often suspended from their reporting tasks while the charges are being investigated by the news organization.[32] The ease with which electronic text can be reproduced from online sources has lured a number of reporters into acts of plagiarism: journalists have been caught "copying-and-pasting" articles and text from a number of websites.[citation needed]

Sanctions for student plagiarism

In the academic world, plagiarism by students is a very serious offense that can result in punishments such as a failing grade on the particular assignment (typically at the high school level) or for the course (typically at the college or university level).[citation needed] For cases of repeated plagiarism, or for cases in which a student commits severe plagiarism (e.g., submitting a copied piece of writing as original work), a student may be suspended or expelled.[33] In many universities, academic degrees or awards may be revoked as a penalty for plagiarism. A plagiarism tariff has been devised for UK higher education institutions in an attempt to encourage some standardization across the sector.[34]

Students may feel pressured to complete papers well and quickly, and with the accessibility of new technology (the Internet) students can plagiarize by copying and pasting information from other sources. This is often easily detected by teachers, for several reasons. First, students' choices of sources are frequently unoriginal; instructors may receive the same passage copied from a popular source from several students. Second, it is often easy to tell whether a student used his or her own "voice". Third, students may choose sources which are inappropriate, inaccurate, or off-topic. Fourth, lecturers may insist that submitted work is first submitted to an online plagiarism detector.[35] There has been increasing recognition that some plagiarism occurs because students are unaware of acceptable writing practices, or may even have developed writing practices considered unacceptable in higher education as part of their prior education.[36] This has led to a call for a greater emphasis on helping students learn about plagiarism as part of a holistic approach suggested by MacDonald and Carroll (2006).[36] As a consequence, consideration has now been given to the best ways to help students learn about plagiarism, with suggestions by Carroll (2006) that students should be allowed to experiment, and by the Joint Information Systems Committee (a body advising higher education in the UK) that students should be able to develop their understanding of plagiarism through making mistakes, which means that they may need to produce some unacceptable writing and receive feedback on it before understanding that it is unacceptable.[36] When considering how best to help students learn about plagiarism, "recognition of individual learner differences" is important.[36] A large amount of the research into plagiarism and learner differences has taken place in the context of students studying overseas.[36] However, while it might be useful to understand the range of reasons suggested in this research, Carroll (2008), writing in a UK context, suggests that understandings of plagiarism are likely to be just as varied amongst domestic students.[36] There is little academic research into the frequency of plagiarism in high schools; much of the research has investigated plagiarism at the post-secondary level.[37] Of the forms of cheating (including plagiarism, inventing data, and cheating during an exam), students admit to plagiarism more than any other.[citation needed]

A 2005 Duke University study of 18,000 high school students found that 58% had plagiarized at least once.[38][39] However, this figure decreases considerably when students are asked about the frequency of "serious" plagiarism (such as copying most of an assignment or purchasing a complete paper from a website). Recent use of plagiarism detection software (see below) gives a more accurate picture of this activity's prevalence.

Self-plagiarism

Self-plagiarism (also known as "recycling fraud"[40]) is the reuse of significant, identical, or nearly identical portions of one's own work without acknowledging that one is doing so or without citing the original work. Articles of this nature are often referred to as duplicate or multiple publication.

In addition to the ethical issue, this can be illegal if copyright of the prior work has been transferred to another entity. Typically, self-plagiarism is only considered a serious ethical issue in settings where a publication is asserted to consist of new material, such as in academic publishing or educational assignments.[41] It does not apply (except in the legal sense) to public-interest texts, such as social, professional, and cultural opinions usually published in newspapers and magazines. In academic fields, self-plagiarism occurs when an author reuses portions of their own published and copyrighted work in subsequent publications without attributing the previous publication.[42] Identifying self-plagiarism is often difficult because limited reuse of material is both legally accepted (as fair use) and ethically accepted.[43] It is common for university researchers to rephrase and republish their own work, tailoring it for different academic journals and newspaper articles, to disseminate their work to the widest possible interested public. However, these researchers also obey limits: if half an article is the same as a previous one, it will usually be rejected. One of the functions of the process of peer review in academic writing is to prevent this type of "recycling".

The concept of self-plagiarism

The concept of "self-plagiarism" has been challenged as self-contradictory or an oxymoron.[44] For example, Stephanie J. Bird[45] argues that self-plagiarism is a misnomer, since by definition plagiarism concerns the use of others' material. However, the phrase is used to refer to specific forms of potentially unethical publication. Bird identifies the ethical issues sometimes called "self-plagiarism" as those of "dual or redundant publication." She also notes that in an educational context, "self-plagiarism" may refer to the case of a student who resubmits "the same essay for credit in two different courses." As David B. Resnik clarifies, "Self-plagiarism involves dishonesty but not intellectual theft."[46] According to Patrick M. Scanlon,[47] "self-plagiarism" is a term with some specialized currency. Most prominently, it is used in discussions of research and publishing integrity in biomedicine, where heavy publish-or-perish demands have led to a rash of duplicate and "salami-slicing" publication, the reporting of a single study's results in "least publishable units" within multiple articles (Blancett, Flanagin, & Young, 1995; Jefferson, 1998; Kassirer & Angell, 1995; Lowe, 2003; McCarthy, 1993; Schein & Paladugu, 2001; Wheeler, 1989). Roig (2002) offers a useful classification system including four types of self-plagiarism: duplicate publication of an article in more than one journal; partitioning of one study into multiple publications, often called salami-slicing; text recycling; and copyright infringement.

Self-plagiarism and codes of ethics

Some academic journals have codes of ethics that specifically refer to self-plagiarism; an example is the Journal of International Business Studies.[48] Some professional organizations, like the Association for Computing Machinery (ACM), have created policies that deal specifically with self-plagiarism.[49] Other organisations do not make specific reference to self-plagiarism. The American Political Science Association (APSA) has published a code of ethics which describes plagiarism as "deliberate appropriation of the works of others represented as one's own." It does not make any reference to self-plagiarism. It does say that when a thesis or dissertation is published "in whole or in part", the author is "not ordinarily under an ethical obligation to acknowledge its origins."[50] The American Society for Public Administration (ASPA) has published a code of ethics which says its members are committed to: "Ensure that others receive credit for their work and contributions," but it does not make any reference to self-plagiarism.[51]

Factors that justify reuse

Pamela Samuelson in 1994 identified several factors which excuse reuse of one's previously published work without the culpability of self-plagiarism.[43] She relates each of these factors specifically to the ethical issue of self-plagiarism, as distinct from the legal issue of fair use of copyright, which she deals with separately. Among other factors which may excuse reuse of previously published material, Samuelson lists the following:
1. The previous work needs to be restated in order to lay the groundwork for a new contribution in the second work.
2. Portions of the previous work must be repeated in order to deal with new evidence or arguments.
3. The audience for each work is so different that publishing the same work in different places was necessary to get the message out.
4. The author thinks they said it so well the first time that it makes no sense to say it differently a second time.
Samuelson states she has relied on the "different audience" rationale when attempting to bridge interdisciplinary communities. She refers to writing for different legal and technical communities, saying: "there are often paragraphs or sequences of paragraphs that can be bodily lifted from one article to the other. And, in truth, I lift them." She refers to her own practice of converting "a technical article into a law review article with relatively few changes, adding footnotes and one substantive section" for a different audience.[43]

Samuelson describes misrepresentation as the basis of self-plagiarism. She seems less concerned about reuse of descriptive materials than of ideas and analytical content.[43] She also states: "Although it seems not to have been raised in any of the self-plagiarism cases, copyright law's fair use defense would likely provide a shield against many potential publisher claims of copyright infringement against authors who reused portions of their previous works."[43]

Organizational publications

Plagiarism is presumably not an issue when organizations issue collective unsigned works, since they do not assign credit for originality to particular people. For example, the American Historical Association's "Statement on Standards of Professional Conduct" (2005) regarding textbooks and reference books states that, since textbooks and encyclopedias are summaries of other scholars' work, they are not bound by the same exacting standards of attribution as original research and may be allowed a greater "extent of dependence" on other works.[52] However, even such a book does not make use of words, phrases, or paragraphs from another text or follow too closely the other text's arrangement and organization, and the authors of such texts are also expected to "acknowledge the sources of recent or distinctive findings and interpretations, those not yet a part of the common understanding of the profession."[52] Within an organization, in its own working documents, standards are looser but not non-existent. If someone helped with a report, they may expect to be credited. If a paragraph comes from a law report, a citation is expected to be written down. Technical manuals routinely copy facts from other manuals without attribution, because they assume a common spirit of scientific endeavor (as evidenced, for example, in free and open source software projects) in which scientists freely share their work. The Microsoft Manual of Style for Technical Publications, Third Edition (2003) by Microsoft does not even mention plagiarism, nor does Science and Technical Writing: A Manual of Style, Second Edition (2000) by Philip Rubens. The line between permissible literary and impermissible source code plagiarism, though, is apparently quite fine. As with any technical field, computer programming makes use of what others have contributed to the general knowledge.

In the arts

L.H.O.O.Q. (1919), one of Marcel Duchamp's readymades.

Plagiarism and the history of art

Through all of the history of literature and of the arts in general, works of art are to a large extent repetitions of the tradition; to the entire history of artistic creativity belong plagiarism, literary theft, appropriation, incorporation, retelling, rewriting, recapitulation, revision, reprise, thematic variation, ironic retake, parody, imitation, stylistic theft, pastiches, collages, and deliberate assemblages.[3][12][13][14][53] There is no rigorous and precise distinction between practices like imitation, stylistic plagiarism, copy, replica and forgery.[3][4][5][6]

These appropriation procedures are the main axis of a literate culture, in which the tradition of the canonic past is being constantly rewritten.[53] These appropriation procedures, vital to the whole history of art, have gained more and more importance since the beginning of the 20th century, with the boom of the modernist and postmodern movements; in modernist and postmodernist art, appropriation has been heightened as the central and representative device.[12][15][16]

Praise of artistic plagiarism

A famous passage of Laurence Sterne's 1767 Tristram Shandy condemns plagiarism by resorting to plagiarism.[54][55] Oliver Goldsmith commented:[56]

Sterne's Writings, in which it is clearly shewn, that he, whose manner and style were so long thought original, was, in fact, the most unhesitating plagiarist who ever cribbed from his predecessors in order to garnish his own pages. It must be owned, at the same time, that Sterne selects the materials of his mosaic work with so much art, places them so well, and polishes them so highly, that in most cases we are disposed to pardon the want of originality, in consideration of the exquisite talent with which the borrowed materials are wrought up into the new form.

On December 6, 2006, Thomas Pynchon joined a campaign by many other major authors to clear Ian McEwan of plagiarism charges by sending a typed letter to his British publisher, which was published in the Daily Telegraph.[57] Playwright Wilson Mizner said, "If you copy from one author, it's plagiarism. If you copy from two, it's research."[58] American author Jonathan Lethem delivered a passionate defense of the use of plagiarism in art in his 2007 essay "The ecstasy of influence: A plagiarism" in Harper's Magazine. He wrote: "The kernel, the soul (let us go further and say the substance, the bulk, the actual and valuable material of all human utterances) is plagiarism" and "Don't pirate my editions; do plunder my visions. The name of the game is Give All. You, reader, are welcome to my stories. They were never mine in the first place, but I gave them to you."[59]

In other contexts

Plagiarism on the Internet

Content scraping is copying and pasting from websites[60] and blogs.[61] Free online tools are becoming available to help identify plagiarism,[62][63] and there is a range of approaches that attempt to limit online copying, such as disabling right-clicking and placing warning banners regarding copyrights on web pages. Instances of plagiarism that involve copyright violation may be addressed by the rightful content owners sending a DMCA removal notice to the offending site owner, or to the ISP that is hosting the offending site. Detecting plagiarism even with detection tools can still be difficult, as plagiarism is often held to be not only the mere copying of text, but also the presentation of another's ideas as one's own, regardless of the specific words or constructs used to express that idea. However, many so-called plagiarism detection services can only detect blatant word-for-word copies of text.
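To illustrate why such services mainly catch verbatim copying, here is a minimal sketch (in Python) of word-for-word matching based on overlapping word n-grams. The function names, the 5-word window, and the 0.2 threshold are illustrative assumptions, not the method of any particular detection service; a paraphrase that preserves the ideas but changes the wording would score close to zero.

```python
import re

def ngrams(text, n=5):
    """Lowercase the text, keep only letters and apostrophes, and return its set of word n-grams."""
    words = re.findall(r"[a-z']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, source, n=5):
    """Jaccard similarity (0.0 to 1.0) between the n-gram sets of two texts."""
    a, b = ngrams(submission, n), ngrams(source, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Illustrative use: flag a submission whose 5-gram overlap with a source exceeds a threshold.
source = "Plagiarism is the wrongful appropriation of another author's language, thoughts, ideas, or expressions."
submission = "Plagiarism is the wrongful appropriation of another author's language, presented as one's own work."
score = overlap_score(submission, source)
print(f"5-gram overlap: {score:.2f}", "-> possible verbatim copying" if score > 0.2 else "-> below threshold")
```

Because only exact word sequences are compared, this kind of check misses the broader case described above, in which another's ideas are restated in different words.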

What is Plagiarism? By S.E. Van Bramer, Widener University, 1995.

Introduction

Because students often are confused about what is and is not plagiarism, I have prepared this handout to help you understand what is acceptable. There are some gray areas, and if you have any questions, ask your instructor. Plagiarism is very serious and it can be grounds for failure in a course. So ask first. Another important point is that as you progress in your education the standards become higher. As a college student you are expected to have your own ideas: to read information and explain it in your own words. If you complete an assignment by copying material, you are not showing that you understand something, only that you can repeat what the textbook says. This does not show that you understand.

Definition

Let's start with a definition: Plagiarize \ˈplā-jə-ˌrīz\ vb -rized; -rizing vt [plagiary]: to steal and pass off (the ideas or words of another) as one's own: use (a created production) without crediting the source. vi: to commit literary theft: present as new and original an idea or product derived from an existing source. plagiarizer n. FROM: Webster's New Collegiate Dictionary, 9th ed. (Springfield, MA: Merriam, 1981), p. 870.

What to do

Now what does this mean for you?

1. First, it is unacceptable to copy something out of a book, newspaper, journal or any other printed source. The most blatant example of this is to directly copy something word for word. It does not matter if it is only a phrase. If it is not yours, either do not use it or place it in quotes and reference it. There are different methods for doing this. The important thing is that the reader can tell what is yours and what is someone else's.
   a. For short quotes, use quotation marks in the sentence. An example is "CFC's: These substances are also of concern in connection with the destruction of stratospheric ozone" [Bunce, N. Environmental Chemistry (Winnipeg: Wuerz, 1994), p. 19].
   b. For longer quotes it is appropriate to indent the entire passage: "Chlorofluorocarbons, CFCs: These substances are also of concern in connection with the destruction of stratospheric ozone (Chapter 2). Like N2O, they have no tropospheric sinks, but are infrared absorbers. Up to 1984, the tropospheric concentrations of three of the major commercial CFCs..." [Bunce, N. Environmental Chemistry (Winnipeg: Wuerz, 1994), p. 19]
2. Another reason to use references is to show where you get information from. When you state a fact, unless it is "general knowledge," you should say where it comes from. Otherwise, a careful reader will have no way to verify your statement. It may be subjective to decide what is "general knowledge," but keep in mind who your audience is. As an example, what is your reaction to the statement: Wetlands emit 150 million tons of methane each year [Bunce, N. Environmental Chemistry (Winnipeg: Wuerz, 1994), p. 18]. Without the reference, why should you believe me?
3. The above examples may seem obvious. If you use something word for word it MUST be acknowledged. Things start to get a bit gray when you paraphrase. There is one simple solution to this dilemma: DO NOT PARAPHRASE! Only use someone else's writing when it serves a purpose, when you want to quote precisely what they wrote. If this is not your goal, USE YOUR OWN WORDS.
   a. This avoids any ambiguity about who wrote it. After all, you do not want someone to accuse you of plagiarism.
   b. You need to learn how to write in your own style. You may be influenced by authors that you find clear and easy to understand, but your writing needs to be YOUR writing. Mimicking someone else is not a productive exercise. You just learn to cut and paste.
   c. An instructor who is reading or grading your work is interested in YOUR understanding of an idea. I am not interested in your ability to copy explanations from the textbook. I know that the author of the book understands it, which is why I picked the textbook. I need to know if YOU understand it.
   d. Understanding and learning is more than just replaying something you have heard. Writing is a valuable exercise that tests your ability to explain a topic. I often think I understand something, until I try to write it out. This is an important part of learning.

Malnutrition is the condition that results from eating an unbalanced diet in which certain nutrients are lacking, in excess (too high an intake), or in the wrong proportions.[1][2] A number of different nutrition disorders may arise, depending on which nutrients are under- or overabundant in the diet. The World Health Organization cites malnutrition as the greatest single threat to the world's public health.[3]

Improving nutrition is widely regarded as the most effective form of aid.[3][4] Emergency measures include providing deficient micronutrients through fortified sachet powders, such as peanut butter, or directly through supplements.[5][6] The famine relief model increasingly used by aid groups calls for giving cash or cash vouchers to the hungry to pay local farmers, instead of buying food from donor countries, as is often required by law, since the latter wastes money on transport costs.[7][8] There are various methods used to gauge the degree of malnutrition, including the Gomez Classification, which grades malnutrition as 1st, 2nd or 3rd degree according to a person's weight as a percentage of normal body weight.
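As a concrete illustration of how the Gomez approach works, the sketch below (in Python) computes a child's weight as a percentage of the reference median weight for their age and maps it to a grade. The 90/75/60% cutoffs are the commonly cited Gomez thresholds; the function name, the reference value in the example, and the exact labels are illustrative assumptions, not a clinical tool.

```python
def gomez_grade(weight_kg, median_weight_for_age_kg):
    """Classify malnutrition by weight-for-age as a percentage of the reference median.
    Cutoffs (90/75/60%) follow the commonly cited Gomez thresholds; illustrative only."""
    pct = 100.0 * weight_kg / median_weight_for_age_kg
    if pct >= 90:
        return pct, "normal"
    elif pct >= 75:
        return pct, "1st degree (mild)"
    elif pct >= 60:
        return pct, "2nd degree (moderate)"
    else:
        return pct, "3rd degree (severe)"

# Example: a child weighing 8.5 kg against an assumed reference median of 12.0 kg for that age.
pct, grade = gomez_grade(8.5, 12.0)
print(f"{pct:.0f}% of median weight-for-age: {grade}")  # about 71% -> 2nd degree (moderate)
```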

Long-term measures include investing in modern agricultural inputs, such as fertilizers and irrigation, in places that lack them; such measures largely eradicated hunger in the developed world.[9] However, World Bank strictures restrict government subsidies for farmers, and the spread of fertilizer use is hampered by some environmental groups.[10][11]
Contents
1 Effects
   1.1 Mortality
   1.2 Causes
   1.3 Psychological
   1.4 Cancer
   1.5 Metabolic syndrome
   1.6 Hyponatremia
   1.7 Overeating vs. hunger
2 Causes
   2.1 Poverty and food prices
   2.2 Dietary practices
   2.3 Agricultural productivity
   2.4 Future threats
3 Management
   3.1 Emergency measures
   3.2 Long-term measures
4 Epidemiology
   4.1 Middle East
   4.2 South Asia
   4.3 United States
5 See also
   5.1 Organizations
6 References
7 External links

Effects

Mortality

According to Jean Ziegler (the United Nations Special Rapporteur on the Right to Food from 2000 to March 2008), mortality due to malnutrition accounted for 58% of the total mortality in 2006: "In the world, approximately 62 million people, all causes of death combined, die each year. One in twelve people worldwide is malnourished.[12] In 2006, more than 36 million died of hunger or diseases due to deficiencies in micronutrients".[13] According to the World Health Organization, malnutrition is by far the biggest contributor to child mortality, present in half of all cases.[3] Six million children die of hunger every year.[14] Underweight births and intrauterine growth restrictions cause 2.2 million child deaths a year. Poor or non-existent breastfeeding causes another 1.4 million. Other deficiencies, such as lack of vitamin A or zinc, account for 1 million. Malnutrition in the first two years is irreversible. Malnourished children grow up with worse health and lower educational achievements. Their own children also tend to be smaller. Malnutrition was previously seen as something that exacerbates the problems of diseases such as measles, pneumonia and diarrhea, but malnutrition actually causes diseases as well, and can be fatal in its own right.[3]

Causes

Malnutrition increases the risk of infection and infectious disease; for example, it is a major risk factor in the onset of active tuberculosis.[15] In communities or areas that lack access to safe drinking water, these additional health risks present a critical problem. Lower energy and impaired function of the brain also represent the downward spiral of malnutrition, as victims are less able to perform the tasks they need to in order to acquire food, earn an income, or gain an education.

Nutrient | Deficiency | Excess
Food energy | Starvation, marasmus | Obesity, diabetes mellitus, cardiovascular disease
Simple carbohydrates | none | Diabetes mellitus, obesity
Complex carbohydrates | none | Obesity
Saturated fat | Low sex hormone levels[16] | Cardiovascular disease
Trans fat | none | Cardiovascular disease
Unsaturated fat | none | Obesity
Fat | Malabsorption of fat-soluble vitamins, rabbit starvation (if protein intake is high) | Cardiovascular disease (claimed by some)
Omega-3 fats | Cardiovascular disease | Bleeding, hemorrhages
Omega-6 fats | none | Cardiovascular disease, cancer
Cholesterol | none | Cardiovascular disease
Protein | Kwashiorkor | Rabbit starvation
Sodium | Hyponatremia | Hypernatremia, hypertension
Iron | Iron deficiency: anemia | Cirrhosis, heart disease
Iodine | Iodine deficiency: goiter, hypothyroidism | Iodine toxicity (goiter, hypothyroidism)
Vitamin A | Vitamin A deficiency: xerophthalmia and night blindness, low testosterone levels | Hypervitaminosis A (cirrhosis, hair loss)
Vitamin B1 | Beri-beri | (not listed)
Vitamin B2 | Ariboflavinosis: cracking of skin and corneal ulceration | (not listed)
Vitamin B3 | Pellagra | Dyspepsia, cardiac arrhythmias, birth defects
Vitamin B12 | Pernicious anemia | (not listed)
Vitamin C | Scurvy | Diarrhea causing dehydration
Vitamin D | Rickets | Hypervitaminosis D (dehydration, vomiting, constipation)
Vitamin E | Nervous disorders | Hypervitaminosis E (anticoagulant: excessive bleeding)
Vitamin K | Vitamin K deficiency: haemorrhage | (not listed)
Calcium | Osteoporosis, tetany, carpopedal spasm, laryngospasm, cardiac arrhythmias | Fatigue, depression, confusion, anorexia, nausea, vomiting, constipation, pancreatitis, increased urination
Magnesium | Magnesium deficiency: hypertension | Weakness, nausea, vomiting, impaired breathing, and hypotension
Potassium | Hypokalemia, cardiac arrhythmias | Hyperkalemia, palpitations
Boron | Boron deficiency | (not listed)
Manganese | Manganese deficiency | (not listed)

Psychological

Malnutrition, in the form of iodine deficiency, is "the most common preventable cause of mental impairment worldwide."[17] Even moderate iodine deficiency, especially in pregnant women and infants, lowers intelligence by 10 to 15 I.Q. points, shaving incalculable potential off a nation's development.[17]

The most visible and severe effects, disabling goiters, cretinism and dwarfism, affect a tiny minority, usually in mountain villages. But 16 percent of the world's people have at least mild goiter, a swollen thyroid gland in the neck.[17] Research indicates that improving the awareness of nutritious meal choices and establishing long-term habits of healthy eating has a positive effect on cognitive and spatial memory capacity, potentially increasing a student's potential to process and retain academic information. Some organizations have begun working with teachers, policymakers, and managed foodservice contractors to mandate improved nutritional content and increased nutritional resources in school cafeterias from primary to university level institutions. Health and nutrition have been proven to have close links with overall educational success.[18] Currently less than 10% of American college students report that they eat the recommended five servings of fruit and vegetables daily.[19] Better nutrition has been shown to have an impact on both cognitive and spatial memory performance; a study showed those with higher blood sugar levels performed better on certain memory tests.[20] In another study, those who consumed yogurt performed better on thinking tasks when compared to those who consumed caffeine-free diet soda or confections.[21] Nutritional deficiencies have been shown to have a negative effect on learning behavior in mice as far back as 1951.[22] "Better learning performance is associated with diet-induced effects on learning and memory ability".[23]

The "nutrition-learning nexus" demonstrates the correlation between diet and learning and has application in a higher education setting. "We find that better nourished children perform significantly better in school, partly because they enter school earlier and thus have more time to learn but mostly because of greater learning productivity per year of schooling."[24] 91% of college students feel that they are in good health while only 7% eat their recommended daily allowance of fruits and vegetables.[19] Nutritional education is an effective and workable model in a higher education setting.[25][26] More "engaged" learning models that encompass nutrition is an idea that is picking up steam at all levels of the learning cycle.[27] There is limited research available that directly links a student's Grade Point Average (G.P.A.) to their overall nutritional health. Additional substantive data is needed to prove that overall intellectual health is closely linked to a person's diet, rather than just another correlation fallacy. Nutritional supplement treatment may be appropriate for major depression, bipolar disorder, schizophrenia, and obsessive compulsive disorder, the four most common mental disorders in developed countries.[28] Supplements that have been studied most for mood elevation and stabilization include eicosapentaenoic acid and docosahexaenoic acid (each of which are an omega-3 fatty acid contained in fish oil, but not in flaxseed oil), vitamin B12, folic acid, and inositol. [edit]Cancer Cancer is now common in developing countries. According a study by the International Agency for Research on Cancer, "In the developing world, cancers of the liver, stomach and esophagus were more common, often linked to consumption of carcinogenic preserved foods, such as smoked or salted food, and parasitic infections that attack organs." Lung cancer rates are rising rapidly in poorer nations because of increased use of tobacco. Developed countries "tended to have cancers linked to affluence or a 'Western lifestyle' cancers of the colon, rectum, breast and prostate that can be caused by obesity, lack of exercise, diet and age."[29] [edit]Metabolic

syndrome

Several lines of evidence indicate lifestyle-induced hyperinsulinemia and reduced insulin function (i.e. insulin resistance) as a decisive factor in many disease states. For example, hyperinsulinemia and insulin resistance are strongly linked to chronic inflammation, which in turn is strongly linked to a variety of adverse developments such as arterial microinjuries and clot formation (i.e. heart disease) and exaggerated cell division (i.e. cancer). Hyperinsulinemia and insulin resistance (the so-called metabolic syndrome) are characterized by a combination of abdominal obesity, elevated blood sugar, elevated blood pressure, elevated blood triglycerides, and reduced HDL cholesterol. The negative impact of hyperinsulinemia on prostaglandin PGE1/PGE2 balance may be significant. The state of obesity clearly contributes to insulin resistance, which in turn can cause type 2 diabetes. Virtually all obese and most type 2 diabetic individuals have marked insulin resistance. Although the association between overweight and insulin resistance is clear, the exact (likely multifarious) causes of insulin resistance remain less clear. Importantly, it has been demonstrated that appropriate exercise, more regular food intake and reducing glycemic load (see below) all can reverse insulin resistance in overweight individuals (and thereby lower blood sugar levels in those who have type 2 diabetes). Obesity can unfavourably alter hormonal and metabolic status via resistance to the hormone leptin, and a vicious cycle may occur in which insulin/leptin resistance and obesity aggravate one another. The vicious cycle is putatively fuelled by continuously high insulin/leptin stimulation and fat storage, as a result of high intake of strongly insulin/leptin stimulating foods and energy. Both insulin and leptin normally function as satiety signals to the hypothalamus in the brain; however, insulin/leptin resistance may reduce this signal and therefore allow continued overfeeding despite large body fat stores. In addition, reduced leptin signalling to the brain may reduce leptin's normal effect to maintain an appropriately high metabolic rate. There is a debate about how and to what extent different dietary factors such as intake of processed carbohydrates, total protein, fat, and carbohydrate intake, intake of saturated and trans fatty acids, and low intake of vitamins/minerals contribute to the development of insulin and leptin resistance. In any case, analogous to the way modern man-made pollution may potentially overwhelm the environment's ability to maintain homeostasis, the recent explosive introduction of high glycemic index and processed foods into the human diet may potentially overwhelm the body's ability to maintain homeostasis and health (as evidenced by the metabolic syndrome epidemic).[citation needed]

Hyponatremia

Excess water intake, without replenishment of sodium and potassium salts, leads to hyponatremia, which can further lead to water intoxication at more dangerous levels. A well-publicized case occurred in 2007, when Jennifer Strange died while participating in a water-drinking contest.[30] More usually, the condition occurs in long-distance endurance events (such as marathon or triathlon competition and training) and causes gradual mental dulling, headache, drowsiness, weakness, and confusion; extreme cases may result in coma, convulsions, and death. The primary damage comes from swelling of the brain, caused by increased osmosis as blood salinity decreases. Effective fluid replacement techniques include water aid stations during running and cycling races, trainers providing water during team games such as soccer, and devices such as CamelBaks, which make it easy for a person to drink water while on the move.

Overeating vs. hunger

Although much of the focus regarding malnutrition centers on undernourishment, overeating is also a form of malnutrition. Overeating is much more common in the United States,[31] where for the majority of people access to food is not an issue. The issue in these developed countries is choosing the right kind of food. Fast food is consumed more per capita in the United States than in any other country, and the reason for this mass consumption is its affordability and accessibility. Oftentimes fast food, low in cost and nutrition, is high in calories and heavily promoted. When these eating habits are combined with increasingly urbanized, automated, and more sedentary lifestyles, it becomes clear why gaining weight is difficult to avoid.[32] However, overeating is also a problem in countries where hunger and poverty persist. In China, consumption of high-fat foods has increased while consumption of rice and other goods has decreased.[33] Overeating leads to many diseases, such as heart disease and diabetes, that may result in death.

Causes

Major causes of malnutrition include poverty and food prices, dietary practices and agricultural productivity, with many individual cases being a mixture of several factors. Malnutrition can also be a consequence of other health issues such as diarrheal disease or chronic illness,[34] especially the HIV/AIDS pandemic.[35] Clinical malnutrition, such as in cachexia, is also a major burden in developed countries.

Poverty and food prices

As much as food shortages may be a contributing factor to malnutrition in countries that lack technology, the FAO (Food and Agriculture Organization) has estimated that eighty percent of malnourished children living in the developing world live in countries that produce food surpluses.[33] The economist Amartya Sen observed that, in recent decades, famine has always been a problem of food distribution and/or poverty, as there has been sufficient food to feed the whole population of the world. He states that malnutrition and famine were more related to problems of food distribution and purchasing power.[36] It is argued that commodity speculators are increasing the cost of food: as the real estate bubble in the United States was collapsing, it is said that trillions of dollars moved to investments in food and primary commodities, causing the 2007-2008 food price crisis.[37] The use of biofuels as a replacement for traditional fuels may leave less food available for nutrition and raises the price of food.[38] The United Nations special rapporteur on the right to food, Jean Ziegler, proposes that agricultural waste, such as corn cobs and banana leaves, rather than crops themselves, be used as fuel.[39]

Dietary practices

A lack of breastfeeding can lead to malnutrition in infants and children. One possible reason for this in the developing world may be that the average family thinks bottle feeding is better.[40] The WHO says mothers abandon breastfeeding because they do not know how to get their baby to latch on properly or suffer pain and discomfort.[41] Deriving too much of one's diet from a single source, such as eating almost exclusively corn or rice, can also cause malnutrition. This may stem either from a lack of education about proper nutrition or from only having access to a single food source. Many tend to think of malnutrition only in terms of hunger; however, overeating is also a contributing factor. Many parts of the world have access to a surplus of non-nutritious food, in addition to increasingly sedentary lifestyles. In turn, this has created a universal epidemic of obesity. Yale psychologist Kelly Brownell calls this a "toxic food environment" in which fat- and sugar-laden foods have taken precedence over healthy, nutritious foods. Obesity occurs not only in developed countries; problems are also occurring in developing countries in areas where income is on the rise.[33]

Agricultural productivity

Food shortages can be caused by a lack of farming skills, such as crop rotation, or by a lack of technology or resources needed for the higher yields found in modern agriculture, such as nitrogen fertilizers, pesticides and irrigation. As a result of widespread poverty, farmers cannot afford, or governments cannot provide, the technology. The World Bank and some wealthy donor countries also press nations that depend on aid to cut or eliminate subsidized agricultural inputs such as fertilizer, in the name of free market policies, even as the United States and Europe extensively subsidized their own farmers.[10][42] Many, if not most, farmers cannot afford fertilizer at market prices, leading to low agricultural production and wages and high, unaffordable food prices.[10]

An 18-month-old Afghan girl, weighing approximately 14 pounds, is treated by a US Army medical team in Paktya province.

Reasons for the unavailability of fertilizer include moves to stop supplying fertilizer on environmental grounds, cited by the Green Revolution pioneer Norman Borlaug as the obstacle to feeding Africa.[11]

Future threats

There are a number of potential disruptions to global food supply that could cause widespread malnutrition. Climate change is of great importance to food security, with 95% of all malnourished peoples living in the relatively stable climate region of the sub-tropics and tropics. According to the latest IPCC reports, temperature increases in these regions are "very likely."[43] Even small changes in temperatures can lead to increased frequency of extreme weather conditions.[43] Many of these have a great impact on agricultural production and hence nutrition. For example, the 1998-2001 central Asian drought brought about an 80% livestock loss and a 50% reduction in wheat and barley crops in Iran.[44] Similar figures were present in other nations. An increase in extreme weather such as drought in regions such as Sub-Saharan Africa would have even greater consequences in terms of malnutrition. Even without an increase in extreme weather events, a simple increase in temperature reduces the productiveness of many crop species, also decreasing food security in these regions.[43][45] Colony collapse disorder is a phenomenon where bees are dying in large numbers.[46]

Since many agricultural crops worldwide are pollinated by bees, this represents a serious threat to the supply of food.[47] An epidemic of stem rust on wheat caused by race Ug99 is currently spreading across Africa and into Asia and, it is feared, could wipe out more than 80% of the world's wheat crops.[48][49]

Management

Main articles: Ready-to-Use Therapeutic food and famine relief

Fighting malnutrition, mostly through fortifying foods with micronutrients (vitamins and minerals), improves lives at a lower cost and in a shorter time than other forms of aid, according to the World Bank.[50] The Copenhagen Consensus, which looked at a variety of development proposals, ranked micronutrient supplements as number one.[4][51] However, roughly $300m of aid goes to basic nutrition each year, less than $2 for each child below two in the 20 worst affected countries.[3] In contrast, HIV/AIDS, which causes fewer deaths than child malnutrition, received $2.2 billion, or $67 per person with HIV in all countries.[3]

Emergency measures

Micronutrients can be obtained through fortifying foods.[4] Fortified foods such as peanut butter sachets (see Plumpy'Nut) and Spirulina have revolutionized emergency feeding in humanitarian emergencies because they can be eaten directly from the packet, do not require refrigeration or mixing with scarce clean water, can be stored for years and, vitally, can be absorbed by extremely ill children.[5]

The United Nations World Food Conference of 1974 declared Spirulina "the best food for the future", and its ready harvest every 24 hours makes it a potent tool to eradicate malnutrition. Additionally, supplements, such as vitamin A capsules or zinc tablets to cure diarrhea in children, are used.[6] There is a growing realization among aid groups that giving cash or cash vouchers instead of food is a cheaper, faster, and more efficient way to deliver help to the hungry, particularly in areas where food is available but unaffordable.[7]

The UN's World Food Program, the biggest non-governmental distributor of food, announced that it will begin distributing cash and vouchers instead of food in some areas, which Josette Sheeran, the WFP's executive director, described as a "revolution" in food aid.[7][8] The aid agency Concern Worldwide is piloting a method through a mobile phone operator, Safaricom, which runs a money transfer program that allows cash to be sent from one part of the country to another.[7] However, for people in a drought living a long way from markets and with limited access to them, delivering food may be the most appropriate way to help.[7] Fred Cuny stated that "the chances of saving lives at the outset of a relief operation are greatly reduced when food is imported. By the time it arrives in the country and gets to people, many will have died."[52] US law, which requires buying food at home rather than where the hungry live, is inefficient because approximately half of what is spent goes for transport.[51] Fred Cuny further pointed out that "studies of every recent famine have shown that food was available in-country, though not always in the immediate food deficit area" and "even though by local standards the prices are too high for the poor to purchase it, it would usually be cheaper for a donor to buy the hoarded food at the inflated price than to import it from abroad."[53] Ethiopia has been pioneering a program that has now become part of the World Bank's prescribed recipe for coping with a food crisis and has been seen by aid organizations as a model of how best to help hungry nations. Through the country's main food assistance program, the Productive Safety Net Program, Ethiopia has been giving rural residents who are chronically short of food a chance to work for food or cash. Foreign aid organizations like the World Food Program were then able to buy food locally from surplus areas to distribute in areas with a shortage of food.[54]

Ethiopia is not alone in pioneering such programs: Brazil has established a recycling program for organic waste that benefits farmers, the urban poor, and the city in general. City residents separate organic waste from their garbage, bag it, and then exchange it for fresh fruit and vegetables from local farmers. As a result, this reduces the country's waste and the urban poor get a steady supply of nutritious food.[32]

Long-term measures

Main article: food security

The effort to bring modern agricultural techniques found in the West, such as nitrogen fertilizers and pesticides, to Asia, called the Green Revolution, resulted in decreases in malnutrition similar to those seen earlier in Western nations. This was possible because of existing infrastructure and institutions that are in short supply in Africa, such as a system of roads or public seed companies that made seeds available.[55] Investments in agriculture, such as subsidized fertilizers and seeds, increase food harvests and reduce food prices.[10][56] For example, in the case of Malawi, almost five million of its 13 million people used to need emergency food aid. However, after the government changed policy and subsidies for fertilizer and seed were introduced against World Bank strictures, farmers produced record-breaking corn harvests as production leaped to 3.4 million in 2007 from 1.2 million in 2005, making Malawi a major food exporter.[10] This lowered food prices and increased wages for farm workers.[10] Proponents of investing in agriculture include Jeffrey Sachs, who has championed the idea that wealthy countries should invest in fertilizer and seed for Africa's farmers.[9][10] Breast-feeding education helps: breastfeeding in the first two years and exclusive breastfeeding in the first six months could save 1.3 million children's lives.[57] In the longer term, firms are trying to fortify everyday foods with micronutrients that can be sold to consumers, such as wheat flour for Beladi bread in Egypt or fish sauce in Vietnam, and the iodization of salt.[5] Restricting population size is a proposed solution. Thomas Malthus argued that population growth could be controlled by natural disasters and voluntary limits through moral restraint.[58] Robert Chapman suggests that intervention through government policies is a necessary ingredient of curtailing global population growth.[59]

However, there are many who believe that the world has more than enough

resources to sustain its population. Instead, these theorists point to unequal distribution of resources and under- or unutilized arable land as the cause for malnutrition problems.[60][61] For example, Amaryta Sen advocates that, no matter how a famine is caused, methods of breaking it call for a large supply of food in the public distribution system. This applies not only to organizing rationing and control, but also to undertaking work programmes and other methods of increasing purchasing power for those hit by shifts in exchange entitlements in a general inflationary situation.[36] One suggested policy framework to resolve access issues is termed food sovereignty, the right of peoples to define their own food, agriculture, livestock, and fisheries systems in contrast to having food largely subjected to international market forces. Food First is one of the primary think tanks working to build support for food sovereignty. Neoliberals advocate for an increasing role of the free market. Another possible long term solution would be to increase access to

health facilities to rural parts of the world. These facilities could monitor undernourished children, act as supplemental food distribution centers, and provide education on dietary needs. These types of facilities have already proven very successful in countries such as Peru and Ghana.[62][63] New technology in agricultural production also has great potential to combat under nutrition.[64] By improving agricultural yields, farmers could reduce poverty by increasing income as well as open up area for diversification of crops for household use. The World Bank itself claims to be part of the solution to malnutrition, asserting that the best way for countries to succeed in breaking the cycle of poverty and malnutrition is to build export-led economies that will give them the financial means to buy foodstuffs on the world market. When aiming to prevent rather than treat overeating, which is also a form of malnutrition, starting in the school environment would be the perfect place as this is where the education children receive today will help them choose healthier foods during childhood, as well as into adulthood. As seen in Singapore, if we increase nutrition in school lunch programs and physical activity for children and teachers, obesity can be reduced by almost 3050%.[33] [edit]Epidemiology

[Map: Disability-adjusted life years for nutritional deficiencies per 100,000 inhabitants in 2002. Nutritional deficiencies included: protein-energy malnutrition, iodine deficiency, vitamin A deficiency, and iron-deficiency anaemia.[65]]

See also: Global Hunger Index

There were 925 million undernourished people in the world in 2010, an increase of 80 million since 1990,[66][67] despite the fact that the world already produces enough food to feed everyone (6 billion people) and could feed double that number (12 billion people).[68]

Undernourished people in the world (millions):[69]
Year: 1990, 1995, 2005, 2008
Millions: 843, 788, 848, 923

Percentage of people in the developing world who are undernourished:[70][71]
Year: 1970, 1980, 1990, 2005, 2007
Percentage: 37%, 28%, 20%, 16%, 17%

[Map: Percentage of population affected by undernutrition by country, according to United Nations statistics.]

Number of undernourished people (millions) in 2001–2003. According to the FAO, the following countries had 5 million or more undernourished people:[1]

Country: Number of undernourished (millions)
India: 217.05
China: 154.0
Bangladesh: 43.45
Democratic Republic of Congo: 37.0
Pakistan: 35.2
Ethiopia: 31.5
Tanzania: 16.1
Philippines: 15.2
Brazil: 14.4
Indonesia: 13.8
Vietnam: 13.8
Thailand: 13.4
Nigeria: 11.5
Kenya: 9.7
Sudan: 8.8
Mozambique: 8.3
North Korea: 7.9
Yemen: 7.1
Madagascar: 7.1
Colombia: 5.9
Zimbabwe: 5.7
Mexico: 5.1
Zambia: 5.1
Angola: 5.0

Note: This table measures "undernourishment" as defined by the FAO, and represents the number of people consuming (on average for the years 2001 to 2003) less than the minimum amount of food energy (measured in kilocalories per capita per day) necessary for the average person to stay in good health while performing light physical activity. It is a conservative indicator that does not take into account the extra needs of people performing strenuous physical activity, nor seasonal variations in food consumption or other sources of variability such as inter-individual differences in energy requirements. Malnutrition and undernourishment are cumulative or average situations, not the result of a single day's food intake (or lack thereof). This table does not represent the number of people who "went to bed hungry today."

Various scales of analysis also have to be considered in order to determine the sociopolitical causes of malnutrition. For example, the population of a community may be at risk if it lacks health-related services, but on a smaller scale certain households or individuals may be at even higher risk due to differences in income levels, access to land, or levels of education.[72] Also within the household, there may be differences in levels of malnutrition between men and women, and these differences have been shown to vary significantly from one region to another, with problem areas showing relative deprivation of women.[73] Children and the elderly tend to be especially susceptible. Approximately 27 percent of children under 5 in the developing world are malnourished, and in these developing countries malnutrition claims about half of the 10 million deaths each year of children under 5.
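To make the FAO undernourishment measure described in the note above more concrete, the following Python sketch estimates the share of a population whose average daily energy intake falls below a minimum dietary energy requirement. This is only an illustration: the 1,800 kcal threshold, the sample intake figures, and the function name are assumptions chosen for demonstration, not values or methods taken from the FAO.

# Illustrative sketch only: estimates the share of a population counted as
# "undernourished" in the FAO sense, i.e. people whose average daily energy
# intake falls below a minimum dietary energy requirement. The threshold and
# the sample data are hypothetical and chosen purely for demonstration.

MIN_DIETARY_ENERGY_KCAL = 1800  # assumed illustrative threshold (kcal per capita per day)

def undernourishment_prevalence(daily_intakes_kcal, threshold=MIN_DIETARY_ENERGY_KCAL):
    """Return the fraction of people whose average daily intake is below the threshold."""
    if not daily_intakes_kcal:
        raise ValueError("intake list must not be empty")
    below = sum(1 for kcal in daily_intakes_kcal if kcal < threshold)
    return below / len(daily_intakes_kcal)

# Hypothetical average intakes (kcal per capita per day), averaged over a
# multi-year period as in the 2001-2003 estimates cited above.
sample = [2500, 1700, 2100, 1600, 1950, 2300, 1400, 2050]
print(f"Estimated undernourishment prevalence: {undernourishment_prevalence(sample):.0%}")

Run on this hypothetical sample, the sketch reports 3 of 8 people (about 38%) below the assumed threshold. In practice, FAO prevalence estimates are derived from national food supply data and assumed distributions of consumption rather than from individual intake records like these.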

[edit]Middle East

Malnutrition rates in Iraq had risen from 19% before the US-led invasion to a national average of 28% four years later.[74]

[edit]South Asia

According to the Global Hunger Index, South Asia has the highest child malnutrition rate of the world's regions.[75] India contributes to about 5.6 million child deaths every year, more than half the world's total.[76] The 2006 report mentioned that "the low status of women in South Asian countries and their lack of nutritional knowledge are important determinants of high prevalence of underweight children in the region" and was concerned that South Asia has "inadequate feeding and caring practices for young children".[76] Half of children in India are underweight,[77] one of the highest rates in the world and nearly double the rate of Sub-Saharan Africa.[78] Research on overcoming persistent under-nutrition published by the Institute of Development Studies argues that the co-existence of India as an 'economic powerhouse' and home to one-third of the world's under-nourished children reflects a failure of the governance of nutrition: "A poor capacity to deliver the right services at the right time to the right populations, an inability to respond to citizens' needs and weak accountability are all features of weak nutrition governance."[79] The research suggests that to make under-nutrition history in India, the governance of nutrition needs to be strengthened and new research needs to focus on the politics and governance of nutrition. At the current rate of progress, the MDG 1 target for nutrition will only be reached in 2042, with severe consequences for human wellbeing and economic growth.[79]

[edit]United States

Childhood malnutrition is generally thought of as being limited to developing countries, but although most malnutrition occurs there, it is also an ongoing presence in developed nations. For example, in the United States of America, one out of every six children is at risk of hunger.[citation needed] A study based on 2005–2007 data from the U.S. Census Bureau and the Agriculture Department shows that an estimated 3.5 million children under the age of five are at risk of hunger in the United States.[80] In developed countries, this persistent hunger problem is not due to a lack of food or food programs, but is largely due to underutilization of existing programs designed to address the issue, such as food stamps or school meals. Many citizens of rich countries such as the United States of America attach stigmas to food programs or otherwise discourage their use. In the USA, only 60% of those eligible for the food stamp program actually receive benefits.[81] The U.S. Department of Agriculture reported that in 2003, only 1 out of 200 U.S. households with children became so severely food insecure that any of the children went hungry even once during the year. A substantially larger proportion of these same households (3.8 percent) had adult members who were hungry at least one day during the year because of their households' inability to afford enough food.
