Loka Alert 9:1 (January 28, 2002)
PLEASE FORWARD WIDELY WHERE APPROPRIATE
TRUST US, WE'RE EXPERTS: HOW INDUSTRY MANIPULATES SCIENCE AND GAMBLES WITH YOUR FUTURE
Dear Friends and Colleagues,
This is one in an occasional series on the democratic politics of research, science, and technology issued free of charge by the
nonprofit Loka Institute. TO BE ADDED TO THE LOKA ALERT E-MAIL LIST, or to reply to this post, please send a message to
Loka@Loka.org. TO BE REMOVED from the Loka Alert E-mail list, send an E-mail with no
subject or message text to firstname.lastname@example.org. (If
that fails, just notify us at Loka@Loka.org). IF YOU ENJOY LOKA
ALERTS, PLEASE INVITE INTERESTED FRIENDS & COLLEAGUES TO SUBSCRIBE.
In their thought-provoking book, Trust Us, We're Experts: How Industry Manipulates Science and Gambles With Your Future, Sheldon Rampton and John Stauber discuss how the commercialization of the science system has compromised the long-held "contract" between science and society. We're using the January 15, 2002 paperback release of the book to introduce Loka Alert readers to their insightful and comprehensive analysis.
The featured text comes from Chapter 8, "The Best Science Money Can Buy." This chapter gives a fascinating portrayal of the influence of military, industry, and university agendas on the direction of America's science and technology enterprise. Considering recent threats to U.S. national security and the technology-rich countermeasures proposed in response, the chapter provides a much-needed, albeit disturbing, account of the social and political shaping of science and technology. We plan to take up these themes at our upcoming April event on Technology and Terrorism in New York City (see Loka updates below).
If you are interested in purchasing a copy of Trust Us, We're Experts, please visit the following link for information: http://www.prwatch.org/books/experts.html. If you have any questions or comments for the authors, please send in your responses to: email@example.com.
Cheers to all,
(I) TRUST US, WE'RE EXPERTS: HOW INDUSTRY MANIPULATES SCIENCE AND GAMBLES WITH YOUR FUTURE
(II) LOKA INSTITUTE UPDATES................................(3/4 page)
(III) INTERNSHIPS AT THE LOKA INSTITUTE.....................(1/3 page)
(IV) ABOUT THE LOKA INSTITUTE..............................(1/3 page)
TRUST US, WE'RE EXPERTS: HOW INDUSTRY MANIPULATES SCIENCE AND GAMBLES WITH YOUR FUTURE
THE BEST SCIENCE MONEY CAN BUY
Science has a face, a house, and a price; it is important to ask who is doing science, in what institutional context, and at what cost. Understanding such things can give us insight into why scientific tools are sharp for certain types of problems and dull for others.
According to historian Stephen Mason, science has its historical roots in two primary sources: "Firstly, the technical tradition, in which practical experiences and skills were handed on and developed from one generation to another; and secondly, the spiritual tradition, in which human aspirations and ideas were passed on and augmented." The technical tradition is the basis for the claim that science provides useful ways of manipulating the material world. The spiritual tradition is the basis for the claim that science can _explain_ the world in "objective," unbiased terms. Sometimes, however, these two traditions are at odds.
Modern science considers itself "scientific" because it adheres to a certain methodology. It uses quantitative methods and measurable phenomena; its data is empirically derived and verifiable by others through experiments that can be reproduced; and, finally, its practitioners are impartial. Whereas ideological thinkers promulgate dogmas and defend them in the face of evidence to the contrary, scientists work with "hypotheses" that they modify whenever the evidence dictates.
The standard description of the scientific method makes it sound like an almost machinelike process for sifting and separating truth from error. The method is typically described as involving the following steps:
1. Observe and describe some phenomenon.
2. Form a hypothesis to explain the phenomenon and its relationship to other known facts, usually through some kind of mathematical formula.
3. Use the hypothesis to make predictions.
4. Test those predictions by experiments or further observations to see if they are correct.
5. If not, reject or revise the hypothesis.
"Recognizing that personal and cultural beliefs influence both our perceptions and our interpretations of natural phenomena, we aim through the use of standard procedures and criteria to minimize those influences when developing a theory," explains University of Rochester physics professor Frank Wolfs. "The scientific method attempts to minimize the influence of bias or prejudice in the experimenter when testing a hypothesis or a theory." One way to minimize the influence of bias is to have several independent experimenters test the hypothesis. If it survives the hurdle of multiple experiments, it may rise to the level of an accepted theory, but the scientific method requires that the hypothesis be ruled out or modified if its predictions are incompatible with experimental tests. In science, Wolfs says, "experiment is supreme."
Experience shows, however, that this commonly accepted description of the scientific method is often a myth. Not only is it a myth, it is a fairly recent myth, first elaborated in the late 1800s by statistician Karl Pearson. Copernicus did not use the scientific method described above, nor did Sir Isaac Newton or Charles Darwin. The French philosopher and mathematician René Descartes is often credited with ushering in the age of scientific inquiry with his "Discourse on the Method of Rightly Conducting the Reason and Seeking the Truth in the Sciences," but the method of Descartes bears little relation to the steps described above. The molecular structure of benzene was first hypothesized not in a laboratory but in a dream. Many theories do not originate through some laborious process of formulating and modifying a hypothesis, but through sudden moments of inspiration. The actual thought processes of scientists are richer, more complex, and less machinelike in their inevitability than the standard model suggests. Science is a human endeavor, and real-world scientists approach their work with a combination of imagination, creativity, speculation, prior knowledge, library research, perseverance, and, in some cases, blind luck—the same combination of intellectual resources, in short, that scientists and nonscientists alike use in trying to solve problems.
The myth of a universal scientific method glosses over many far-from-pristine realities about the way scientists work in the real world.
There is no mention, for example, of the time that a modern researcher spends writing grant proposals; coddling department heads,
corporate donors, and government bureaucrats; or engaging in any of the other activities that are necessary to obtain research funding.
Although the scientific method acknowledges the possibility of bias on the part of an individual scientist, it does not provide a way of countering the effects of system-wide bias. "In a field where there is active experimentation and open communication among members of the scientific community, the biases of individuals or groups may cancel out, because experimental tests are repeated by different scientists who may have different biases," Wolfs states. But what if the scientists repeating the experiments all share the same bias? Against bias of that system-wide kind, repetition offers no protection.
The standard description of the scientific method also tends to idealize the degree to which scientists are even capable of
accurately observing and measuring the phenomena they study. "Anyone who has done much research knows only too well that he never seems to
be able himself to reproduce the beautiful curves and straight lines that appear in published texts and papers," admits British biologist
Gordon D. Hunter. "In fact, scientists who would be most insulted if I accused them of cheating usually select their best results only,
not the typical ones, for publication; and some slightly less rigorous in their approach will find reasons for rejecting an
inconvenient result. I well remember when my colleague David Vaird and I were working with a famous Nobel Prize winner (Sir Hans Krebs
himself) on bovine ketosis. The results from four cows were perfect. . . ."
The idea that all scientific experiments are replicated to keep the process honest is also something of a myth. In reality, the number of findings from one scientist that get checked by others is quite small. Most scientists are too busy, research funds are too limited, and the pressure to produce new work is too great for this type of review to occur very often. What occurs instead is a system of "peer review," in which panels of experts are convened to pass judgment on the work of other researchers. Peer review is used mainly in two situations: during the grant approval process to decide which research should get funding, and after the research has been completed to determine whether the results should be accepted for publication in a scientific journal.
Like the myth of the scientific method, peer review is also a fairly new phenomenon. It began as an occasional, ad hoc practice during the middle of the nineteenth century but did not really become established until World War I, when the federal government began supporting scientists through the National Research Council. As government support for science increased, it became necessary to develop a formal system for deciding which projects should receive funding.
In some ways, the system of peer review functions like the antithesis of the scientific method described above. Whereas the scientific method assumes that "experiment is supreme" and purports to eliminate bias, peer review deliberately _imposes_ the bias of peer reviewers on the scientific process, both before and after experiments are conducted. This does not necessarily mean that peer review is a bad thing. In some ways, it is a necessary response to the empiricist limitations of the scientific method as it is commonly defined. However, peer review can also institutionalize conflicts of interest and a certain amount of dogmatism. In 1994, the General Accounting Office of the U.S. Congress studied the use of peer review in government scientific grants and found that reviewers often know applicants and tend to give preferential treatment to the ones they know. Women and minorities have charged that the system constitutes an "old boys' network" in science. The system also stacks the deck in favor of older, established scientists and against younger, more independent researchers. The process itself creates multiple opportunities for conflict of interest. Peer reviewers are often anonymous, which means that they do not have to face the researchers whose work they judge. Moreover, the realities of science in today's specialized world mean that peer reviewers are often either colleagues or competitors of the scientist whose work they review. In fact, observes science historian Horace Freeland Judson, "the persons most qualified to judge the worth of a scientist's grant proposal or the merit of a submitted research paper are precisely those who are the scientist's closest competitors."
"The problem with peer review is that we have good evidence on its deficiencies and poor evidence on its benefits," the _British Medical Journal_ observed in 1997. "We know that it is expensive, slow, prone to bias, open to abuse, possibly anti-innovatory, and unable to detect fraud. We also know that the published papers that emerge from the process are often grossly deficient."
In theory, the process of peer review offers protection against scientific errors and bias. In reality, it has proven incapable of filtering out the influence of government and corporate funders, whose biases often affect research outcomes.
DOES MONEY MATTER?
Many of the factors that bias scientific results are considerably more subtle than outright bribery or fraud. "There is distortion that causes publication bias in little ways, and scientists just don't understand that they have been influenced," says Drummond Rennie, deputy editor of the _Journal of the American Medical Association_. "There's influence everywhere, on people who would steadfastly deny it." Scientists can be naive about politics and other external factors shaping their work and become indignant at the suggestion that their results are shaped by their funding. But science does not occur in a vacuum. In studying animal populations, biologists use the term "selection pressure" to describe the influence that environmental conditions exert upon the survival of certain genetic traits over others. Within the population of scientists, a similar type of selection pressure occurs as industry and government support, combined with the vicissitudes of political fashion, determine which careers flourish and which languish. As David Ozonoff of the Boston University School of Medicine has observed, "One can think of an idea almost as one thinks of a living organism. It has to be continually nourished with the resources that permit it to grow and reproduce. In a hostile environment that denies it the material necessities, scientific ideas tend to languish and die."
Like other human institutions, the development of the scientific enterprise has seen both advances and reversals and is exquisitely sensitive to the larger social environment in which it exists. Germany, for example, was a world leader in science in the nineteenth and early twentieth centuries but went into scientific decline with the rise of fascism. Under the Nazis, scientists were seen as too "cosmopolitan," and the idea of a culturally rooted "German science" transformed applied scientists into "folk practitioners," elevated astrology at the expense of astronomy, and impoverished the country's previously renowned institutions for the study of theoretical physics. Something similar happened in Soviet Russia when previously accepted theories in astronomy, chemistry, medicine, psychology, and anthropology were criticized on the grounds that they conflicted with the principles of Marxist materialism. The most notorious example in the Soviet case was the rise of Lysenkoism, which rejected the theories of Mendelian genetics with catastrophic results for Russian agriculture. In the United States, political and social movements have also given rise to a number of dubious scientific trends, including the "creation science" of Christian fundamentalists as well as such movements as parapsychology and Scientology.
The most dramatic trend influencing the direction of science during the past century, however, has been its increasing dependence on funding from government and industry. Unlike the "gentleman scientists" of the nineteenth century who enjoyed financial independence that allowed them to explore their personal scientific interests with considerable freedom, today's mainstream scientists are engaged in expensive research that requires the support of wealthy funders. A number of factors have contributed to this reality, from the rise of big government to the militarization of scientific research to the emergence of transnational corporations as important patrons of research.
The Second World War marked a watershed in the development of these trends, with the demands of wartime production, military intelligence, and political mobilization serving as precursors to the "military-industrial complex" that emerged during the Cold War in the 1950s. World War II also inaugurated the era of what has become known as "big science." Previously, scientists for the most part had been people who worked alone or with a handful of assistants, pursuing the inquiries that fit their interests and curiosity. It was a less rigorous approach to science than we expect today, but it also allowed more creativity and independence. Physicist Percy Bridgman, whose major work was done before the advent of "big science," recalled that in those days he "felt free to pursue other lines of interest, whether experiment, or theory, or fundamental criticism. . . . Another great advantage of working on a small scale is that one gives no hostage to one's own past. If I wake up in the morning with a new idea, the utilization of which involves scrapping elaborate preparations already made, I am free to scrap what I have done and start off on the new and better line. This would not be possible without crippling loss of morale if one were working on a large scale with a complex organization." When World War II made large-scale, applied research a priority, Bridgman said, "the older men, who had previously worked on their own problems in their own laboratories, put up with this as a patriotic necessity, to be tolerated only while they must, and to be escaped from as soon as decent. But the younger men . . . had never experienced independent work and did not know what it was like."
The Manhattan Project took "big science" to unprecedented new levels. In the process it also radically transformed the assumptions and social practices of science itself, as military considerations forced scientists to work under conditions of strict censorship. "The Manhattan Project was secret," observe Stephen Hilgartner, Richard Bell, and Rory O'Connor in _Nukespeak_, their study of atomic-age thinking and rhetoric. "Its cities were built in secret, its research was done in secret, its scientists traveled under assumed names, its funds were concealed from Congress, and its existence was systematically kept out of the media. . . . Compartmentalization, or the restriction of knowledge about various aspects of the project to the `compartments' in which the knowledge was being developed, was central to this strategy. . . . Press censorship complemented compartmentalization." President Truman described the development of the atom bomb as "the greatest achievement of organized science in history." It was also the greatest regimentation of science in history, and spawned the need for further regimentation and more secrecy.
Prior to the development of the atomic bomb, the scientific community believed with few exceptions that its work was beneficial to humanity. "Earlier uses of science for the development of new and deadlier weapons had, upon occasion, brought forth critical comments by individual scientists; here and there, uncommonly reflective scientists had raised some doubts about the generalized philosophy of progress shared by most of the scientific community, but it was only in the aftermath of Hiroshima that large numbers of scientists were moved to reflect in sustained ways on the moral issues raised by their own activities," notes historian Lewis Coser.
Even before the bombing of Japan, a group of atomic scientists had tried unsuccessfully to persuade the U.S. government against its use. In its aftermath, they began to publish the _Bulletin of the Atomic Scientists_, which campaigned for civilian control of atomic energy. Some of its members called for scientists to abstain from military work altogether. In the 1950s, however, the Red Scare and McCarthyism were brought to bear against scientists who raised these sorts of questions. "Furthermore, as more and more scientific research began to be sponsored by the government, many scientists considered it `dangerous' to take stands on public issues," Coser notes. By 1961, some 80 percent of all U.S. funds for research and development were being provided directly or indirectly by the military or by two U.S. agencies with strong military connections, the Atomic Energy Commission and the National Aeronautics and Space Administration.
The terrifying potential of the new weaponry became a pretext for permanently institutionalizing the policy of secrecy and "need-to-know" classification of scientific information that had begun with the Manhattan Project. In 1947, the Atomic Energy Commission expanded its policy of secrecy beyond matters of direct military significance by imposing secrecy in regard to public relations or "embarrassment" issues as well as issues of legal liability. When a deputy medical director at the Manhattan Project tried to declassify reports describing World War II experiments that involved injecting plutonium into human beings, AEC officials turned down the request, noting that "the coldly scientific manner in which the results are tabulated and discussed would have a very poor effect on the public."
Alvin Weinberg, director of the Oak Ridge National Laboratory from 1955 to 1973, bluntly laid out the assumptions of atomic-age science. In order to avert catastrophe, he argued, society needed "a military priesthood which guards against inadvertent use of nuclear weapons, which maintains what a priori seems to be a precarious balance between readiness to go to war and vigilance against human errors that would precipitate war." He did not mean the word "priesthood" lightly or loosely. "No government has lasted continuously for 1,000 years: only the Catholic Church has survived more or less continuously for 2,000 years or so," he said. "Our commitment to nuclear energy is assumed to last in perpetuity—can we think of a national entity that possesses the resiliency to remain alive for even a single half-life of plutonium-239? A permanent cadre of experts that will retain its continuity over immensely long times hardly seems feasible if the cadre is a national body. . . . The Catholic Church is the best example of what I have in mind: a central authority that proclaims and to a degree enforces doctrine, maintains its own long-term social stability, and has connections to every country's own Catholic Church."
The idea of a "central authority" that "proclaims and enforces doctrine" runs contrary, of course, to the spirit of intellectual freedom and scientific inquiry that led Galileo to defy the Catholic Church in his defense of Copernican astronomy. Weinberg's comments show how much the practice and philosophy of science had changed under the pressures of government bureaucracy and military secrecy. Instead of a process for asking questions, it had become a dogma, a set of answers imposed by what was becoming a de facto state religion.
FROM MILITARY SECRETS TO TRADE SECRETS
"The expansion of university research in the 1950s was largely the result of support from the military," wrote Dorothy Nelkin in her 1984 book _Science as Intellectual Property_. The last quarter of the twentieth century, however, saw the commercialization of big science, as the rise of the so-called "knowledge-based" industries—computers, telecommunications, and biotechnology—prompted a wide variety of corporate research initiatives. In 1970, federal government funding for research and development totaled $14.9 billion, compared to $10.4 billion from industry. By 1997, government expenditures were $62.7 billion, compared to $133.3 billion from industry. After adjusting for inflation, government spending had barely risen, while business spending more than tripled. Much of this increase, moreover, took place through corporate partnerships with universities and other academic institutions, blurring the traditional line between private and public research. In 1980, industrial funding made up only 3.8 percent of the total research budget for U.S. universities. "Seldom controversial, it provided contacts and financial benefits usually only to individual faculty members, and on the whole it did not divert them from university responsibilities," Nelkin noted. However, declining public funding in many areas of research "left many faculty and university administrators receptive to, indeed, eager for industrial support, and inevitably less critical of the implications for the ownership and control of research."
First reluctantly and then eagerly, universities began to collaborate with industry in fields such as biotechnology, agriculture, chemistry, mining, energy, and computer science. "It is now accepted practice for scientists and institutions to profit directly from the results of academic research through various types of commercial ventures," Nelkin observed in her 1984 book.
In 1999, the Department of Plant and Microbial Biology at the University of California–Berkeley signed an unprecedented five-year, $25 million agreement with the Novartis biotech firm of Switzerland. In exchange for the funding, the university promised that Novartis would have first bid on a third of the research discoveries developed by the department. "The Berkeley agreement has inspired other major American research universities to seek similar agreements with industry," noted the National Center for Public Policy and Higher Education. But although the deal was popular with the department that received the money, it drew a different reaction from many of the professors in other departments. A survey conducted by the chairman of the university's College of Natural Resources showed that two-thirds of the faculty in that college disagreed with the terms of the contract.
"We fear that in our public university, a professor's ability to attract private investment will be more important than academic
qualifications, taking away the incentives for scientists to be socially responsible," stated professors Miguel Altieri and Andrew
Paul Gutierrez in a letter to the university's alumni magazine. Altieri's academic career has been devoted to the study
of "biological control"—the discipline of controlling agricultural pests through means other than pesticides. He noted bitterly that
while money from Novartis was pouring in, university funding for biological control research had been eliminated. "For more than 40
years we trained leaders in the world about biological control . . . A whole theory was established here, because pesticides cause major
environmental problems," Altieri said. Another researcher, UC–Berkeley anthropologist Laura Nader, said the Novartis contract "sent
a chill especially over younger, untenured faculty. Word gets around early . . . over the proper relationship between researchers and
industry in a university setting. A siege mentality sets in, reminiscent of the McCarthy period and the so-called Red Scare,
except then it was government which could be called to account. . . ."
Just as military funding for research carried with it a set of obligations that had nothing to do with the pursuit of knowledge, corporate funding has transformed scientific and engineering knowledge into commodities in the new "information economy," giving rise to an elaborate web of interlocking directorates between corporate and academic boardrooms. By the end of the 1990s, the ivory tower of academia had become "Enterprise U," as schools sought to cash in with licensing and merchandising of school logos and an endless variety of university–industry partnerships and "technology transfers," from business-funded research parks to fee-for-service work such as drug trials carried out on university campuses. Professors, particularly in high-tech fields, were not only allowed but encouraged to moonlight as entrepreneurs in start-up businesses that attempted to convert their laboratory discoveries into commercial products. Just as science had earlier become a handmaiden to the military, now it was becoming a servant of Wall Street.
"We're adopting a business instead of an economic model," said chemist Brian M. Tissue of Virginia Polytechnic Institute and State University. "The rationale is collaborations are good because they bring in money. People say we can have better facilities and more students, and it's a win-win situation, but it's not. There can be benefits, but you're not training students anymore; you're bringing them in to work a contract. The emphasis shifts from what's good for the student to the bottom line."
"More and more we see the career trajectories of scholars, especially of scientists, rise and fall not in relation to their intellectually-judged peer standing, but rather in relation to their skill at selling themselves to those, especially in the biomedical field, who have large sums of money to spend on a well-marketed promise of commercial viability," observed Martin Michaelson, an attorney who has represented Harvard University and a variety of other leading institutions of higher education. "It is a kind of gold rush," Michaelson said at a 1999 symposium sponsored by the American Association for the Advancement of Science. "More and more we see incentives to hoard, not disseminate, new knowledge; to suppress, not publish, research results; to titillate prospective buyers, rather than to make full disclosure to academic colleagues. And we see today, more than ever before, new science first—generally, very carefully, and thinly—described in the fine print of initial public offerings and SEC filings, rather than in the traditional, fuller loci of academic communication."
Industry–academic entanglements can take many forms, some of which are not directly related to funding for specific research. Increasingly, scientists are being asked to sit on the boards of directors of for-profit companies, a service that requires relatively little time but can pay very well—often in excess of $50,000 per year. Other private-sector perks may include gifts to researchers of lab equipment or cash, or generous payment for speeches, travel, and consulting.
Corporate funding creates a culture of secrecy that can be as chilling to free academic inquiry as funding from the military. Instead of government censorship, we hear the language of commerce: nondisclosure agreements, patent rights, intellectual property rights, intellectual capital. Businesses frequently require scientists to keep "proprietary information" under wraps so that competitors can't horn in on their trade secrets. "If we could not maintain secrecy, research would be of little value," argued the late Arthur Bueche, vice president for research at General Electric. "Research properly leads to patents that protect ideas, but were it not for secrecy, it would be difficult to create a favorable patent position."
In 1994 and 1995, researchers led by David Blumenthal at the Massachusetts General Hospital surveyed more than 3,000 academic researchers involved in the life sciences and found that 64 percent of their respondents reported having some sort of financial relationship with industry. They also found that scientists with industry relationships were more likely to delay or withhold publication of their data. Their study, published in the _Journal of the American Medical Association_, found that during the three years prior to the survey, 20 percent of researchers reported delaying publication of their research results for more than six months. The reasons cited for delaying publication included the desire to file patent applications based on their discoveries and a desire by some researchers to "slow the dissemination of undesired results." The practice of withholding publication or refusing to share data with other scientists was particularly common among biotechnology researchers.
"It used to be that if you published you could ask about results, reagents—now you have these confidentiality agreements," said Nobel Prize–winning biochemist Paul Berg, a professor of biochemistry at Stanford University. "Sometimes if you accept a grant from a company, you have to include a proviso that you won't distribute anything except with its okay. It has a negative impact on science."
In 1996, Steven Rosenberg, chief of surgery at the U.S. National Cancer Institute, observed that secrecy in research "is under-appreciated, and it's holding back medical cancer research—it's holding back my research."
The problem of secrecy in science is particularly troubling when it involves conflicts of interest between a company's marketing objectives and the public's right to know. When research results are not to a sponsor's liking, the company may use heavy-handed tactics to suppress them—even if doing so comes at the expense of public health and the common good.
ABOUT THE AUTHORS
John Stauber is the founder and director of the non-profit Center for Media and Democracy. He and Sheldon Rampton write and edit for the Center's quarterly, PR Watch: Public Interest Reporting on the PR/Public Affairs Industry. They are the authors of Toxic Sludge Is Good for You! and Mad Cow U.S.A. They both live in Madison, Wisconsin.
(II) LOKA INSTITUTE UPDATES
**ADVOCACY EFFORTS – This past summer, Loka coordinated an advocacy effort to ensure that community groups and community-based research would be eligible under the Congressional bill to create a Math and Science Partnerships program at the National Science Foundation. Thanks to the efforts of many of our partners from all over the country, our work paid off. Congress has allocated $160 million annually to the program, and the recently released guidelines http://www.nsf.gov/pubs/2002/nsf02061/nsf02061.html#ELIG explicitly include community-based organizations in the list of potential partners that school and university P.I.'s are encouraged to collaborate with. These guidelines go well beyond the original scope of the partnerships that Congress envisioned, in which businesses and professional scientific societies were the only non-academic partners.
**LIVING KNOWLEDGE COMMUNITY-BASED RESEARCH DATABASE – The Living Knowledge Community-Based Research Database is now live! The Database is a free, publicly-accessible resource for science shops, community-based organizations, universities and funders world-wide. A project of Living Knowledge: An International Science Shop Network (funded by the European Commission), the Database is an interactive "information warehouse", providing users with resources and tools related to community-based research. Enter information about your organization and projects today. Go to www.Loka.org and click on the Living Knowledge logo.
**WEB SITE UPDATES - Check out the Loka Institute website www.Loka.org for future dates and details of our 5th Annual 2002 Community Research Network Conference in Chicago, IL and our "Technology in an Age of Terrorism" public forum to be held in New York City this Spring.
**ARTICLE SUBMISSION – We are now accepting article submissions for our electronic newsletter, _Loka Alert_. If you would like to submit a 2-10 page article concerning science, technology and society, please e-mail us for guidelines at Loka@loka.org.
(III) LOKA INSTITUTE INTERNSHIPS
The Loka Institute has openings for Summer semester volunteers, graduate and undergraduate student interns, and work-study students. Interns' responsibilities include updating our Web page; managing email lists and listservs; conducting background research on issues concerning science, technology, and society; and helping with administrative work. Interns committing to a semester or more will have the opportunity to integrate independent research into their internship experience.
Candidates should be self-motivated and able to work as part of a team as well as independently. General knowledge of and comfort with computers are needed, and experience in Web page maintenance is preferable. Undergraduate students, graduate students, and recent graduates are welcome to apply. Loka is able to provide interns with an expense stipend of $35 per day for volunteering (or $700 per month full-time-equivalent).
If you are interested in working with us to promote a democratic politics of science and technology, please send a resume and a succinct cover letter explaining your interest and your dates of availability.
(IV) ABOUT THE LOKA INSTITUTE
The Loka Institute is a nonprofit organization dedicated to making research, science and technology more responsive to democratically-decided social and environmental concerns. In doing so, we focus our efforts on the following areas:
** Critical assessments: evaluation of science and technology policies and decision-making processes; evaluation of citizen participation and the social and environmental impact of science and technology policies at all levels.
** Education: building citizen and community capacity to have an effective voice in decision-making processes at the local, regional, national and international levels.
** Dissemination and advocacy: creating avenues for citizen participation in research, science and technology processes and policies.
TO FIND OUT MORE ABOUT our current activities and projects, to participate in our on-line discussion groups, to download or order publications, or to help, please visit our Web page: http://www.Loka.org. Or contact us via E-mail at Loka@Loka.org or by telephone at 413-559-5860.
Support Loka with an on-line secure-site donation: Donate Now!
The Loka Institute