THE SCIENTIST VOLUME 7, No:3 February 8, 1993
(Copyright, The Scientist, Inc.)
===============================================================
Articles published in THE SCIENTIST reflect the views of their authors and not the official views of the publication, its editorial staff, or its ownership.
================================================================

*** THE NEXT ISSUE OF THE SCIENTIST WILL APPEAR ON ***
*** FEBRUARY 22, 1993 ***

THE SCIENTIST CONTENTS PAGE
(Page numbers correspond to the printed edition of THE SCIENTIST)

NSF'S FUTURE DIRECTION: The National Science Foundation and the National Science Board are wrestling over the practical implementation of the recommendations contained in an NSB commission report on the agency's future, a task complicated by disagreement over the document's interpretation, over the board's role in shaping an NSF science and technology policy, and over the agency's uncertain influence in the new administration. Page 1 of newspaper

SCIENTISTS AND INDUSTRY: With few exceptions, the outlook for the hiring of scientists in industry is only marginally optimistic and generally tied to the sluggish recovery of the economy, analysts say. Other factors--such as the changing world political map, a new administration, and the shifting perception by companies of how much basic research they can support--cloud the picture further. Page 1 of newspaper

MOVING TOWARD THE MAINSTREAM: As his Biosphere 2 project encounters technical difficulties and controversy, Texas billionaire environmental philanthropist Edward Bass appears to be moving his attention and largess closer to more conventional scientific research by financing the establishment of the Yale University Institute for Biospheric Studies. Page 1 of newspaper

An interview with Edward Bass. Page 9 of newspaper

AAAS ADDS STRENGTH: This year's meeting of the American Association for the Advancement of Science, being held in Boston this week, will be a "stronger" one, according to AAAS officials, because of enhanced quality-control measures used in selecting topics and speakers; it will also continue a trend of examining controversial issues, such as science and religion and AIDS. Page 3 of newspaper

AAAS meeting highlights. Page 3 of newspaper

SCIENTIFIC COMMUNICATION: The constant perusal of scientific literature is fundamental to the research process, but keeping up with the flood of books and papers is a daunting chore for the busy researcher. According to Nobelist Joshua Lederberg, electronic publishing--as long as it preserves the salient virtues of print publishing--provides a way for scientists to keep up on their reading of current research literature. Page 10 of newspaper

COMMENTARY: Eugene Garfield, publisher of The Scientist, discusses the publication's recent move into electronic publishing (it's now on NSFnet) and the ways in which readers can benefit. Page 12 of newspaper

INTERFACING: Joint investigations between chemical engineers and biologists are beginning to bear fruit, and the greater challenge today, say those collaborators, is to get the two scientific cultures to interact. Page 15 of newspaper

HOT PAPERS: An astrophysicist discusses solar neutrinos. Page 16 of newspaper

REAGENT KIT BUILDING: With the Human Genome Project in high gear and biotechnology laboratories busier than ever, the need for molecular biology reagent kits--prepackaged chemicals and the directions for using them to perform a myriad of biological tests--is intense, and manufacturers are rushing to fill it. (Also see Reagent Kit Directory on page 31.) Page 19 of newspaper

SOCIAL GRACES: Whereas good scientific credentials used to be enough to determine hiring decisions, with today's emphasis on interdisciplinary investigations and teamwork, high-technology companies are increasingly looking at applicants' interpersonal skills as well. Page 21 of newspaper

START-UP FUNDS: The American Association for Clinical Chemistry provides much-needed seed grants for clinical chemists, scientists who relate the chemical composition of tissue and body fluids to different illnesses. Page 22 of newspaper

MIT METEOROLOGIST EDWARD LORENZ has been named the first recipient of the Roger Revelle Medal, presented by the American Geophysical Union. Page 23 of newspaper

NOTEBOOK Page 4 of newspaper
CARTOON Page 4 of newspaper
LETTERS Page 12 of newspaper
CROSSWORD Page 13 of newspaper
OBITUARIES Page 23 of newspaper
REAGENTS AND REAGENT KITS DIRECTORY Page 31 of newspaper

===================================

U.S. Economic Woes Expected To Limit Job Opportunities For Scientists In Industry

Slowdowns in corporate R&D are foreseen, which, experts reason, will cut employment potential for researchers in 1993

BY SUSAN L-J DICKINSON
(Page 1 of newspaper)

The 1993 employment outlook for scientists in industry is marginally optimistic, at best, according to various economic indicators and industry experts. With the exception of a few sectors that, as a result of sustained consumer demand, appear relatively recession-proof, such as the pharmaceutical and biotechnology industries, hiring will be tied to the sluggish recovery of the economy as a whole, they say.

But it is more than fallible macroeconomic forecasts that cloud the picture of industrial R&D in the near future, analysts say.
Other factors include: a new, untested Democratic administration; the world's changing political map; and a shifting perception of just how much basic research, independent of market demands, corporations can afford to finance. All of these ingredients contribute to a puzzle that many analysts and industry insiders are finding difficult to put together.

And though the cyclical swings in the economy and the resultant ebb and flow in the demand for new employees are to be expected, some industry insiders are predicting that, with this recession, the nature of corporate R&D may have changed forever. "I am not optimistic that corporate basic R&D is going to make a big comeback," says Trueman Parish, director of special projects development at Eastman Chemical Co. of Kingsport, Tenn. "Even if we have a very robust economic turnaround, the overall expenditures for a corporation might go back to the same level, but they will have a very different character. There has been a real shift toward emphasis on business organizations and market-driven responses to market needs, vs. just developing new technology in a vacuum and hoping there is a market out there."

To be sure, outside of the aerospace and defense industries, companies are doing some hiring. And certain technical specialties, such as hydrology and other environment-related disciplines, and industry sectors, such as biotechnology, should see significant expansion throughout 1993, industry observers predict. But Tom Parent, manager of professional recruiting for General Electric Co.'s corporate R&D division in Schenectady, N.Y., is one of many recruiting managers being inundated with applications from an increasing number of frustrated scientists. "These are good people who would like to contribute," he says, "but the jobs are just not there."
Indeed, a July 1992 survey of 141 of the 257 R&D company members of the Washington, D.C.-based Industrial Research Institute (IRI) resulted in the institute's prediction that "1993 will see the recession continuing for industrial R&D in the United States." The survey found that 28 percent of responding companies expected to decrease their total R&D expenditures from 1992 levels (vs. 19 percent in 1991 and 16 percent in 1990); only 10 percent expected to increase hiring of new graduates, while 40 percent expected a decrease; and 84 percent expected their R&D professional staffing level to remain unchanged or to drop from 1992 levels.

And Columbus, Ohio-based Battelle Memorial Institute, an organization that develops, commercializes, and manages technologies, forecasts--based on information compiled from the National Science Foundation, federal budget documents, and other sources--that 1993 industrial support of R&D will be $83 billion. This represents an increase of just 2.4 percent from 1992, a rise that shrinks significantly when juxtaposed with the expected 1993 inflation rate in excess of 2 percent.

Eastman's Parish, one of the authors of the IRI report, says he wasn't surprised by the institute's results. "R&D is about the last place companies are willing to go back and expand after a recession," he says. But the appearance of a downswing is exacerbated, Parish says, when compared with the boom years of the 1980s, when most R&D companies saw an annual growth rate in the 5 percent to 6 percent range. "Compared to that, the current 1 percent growth in budgets is depressing, and doesn't create the demand for many people."

Through an examination of surveys and an informal sampling of some R&D leaders in a variety of industry areas, The Scientist has attempted to gain a better picture of the employment outlook for scientists in industry over the coming year.

Chemical and Engineering News (Oct. 19, 1992, page 24) has called 1992 "perhaps the worst year in a decade to be looking for a job in chemistry or chemical engineering." Indeed, an annual survey conducted by the American Chemical Society has found that the unemployment rate among chemists has risen dramatically since 1990, that the average length of unemployment increased during 1992, and that more Ph.D. chemistry graduates are taking postdoctoral appointments rather than finding jobs. The ACS prediction for its members in 1993 is that the outlook for Ph.D. chemists is "bleak." Industry is hiring some chemists, albeit at a reduced level (Edward R. Silverman, The Scientist, Jan. 25, 1993, page 21). And, because the fortunes of chemical companies are tied so closely to the economy as a whole, ACS predicts that employment of chemists overall will improve, but slowly, throughout 1993.

At Dow Chemical Co. in Midland, Mich., a hiring freeze has been in place for all of Dow North America since late last year. And Dennis Guthrie, manager of Ph.D. recruiting for Dow, reports that he expects to hire fewer than 30 scientists for corporate R&D in 1993--less than half the number of positions filled during an average year.

E.I. Du Pont de Nemours & Co. Inc. of Wilmington, Del., has been downsizing over the past couple of years, too. According to Parry M. Norling, planning director for corporate R&D at Du Pont, most of this has happened through the use of incentives for early retirement. And while Norling projects that hiring there will be up 20 percent to 30 percent over the coming year, he also says that Du Pont's research budget is tightening (normally $1.3 billion, it is set at a little over $1.2 billion for 1993, he says). Norling reports that strategic approaches to R&D at Du Pont are changing as a result. "We're looking for more efficient and effective ways to get the job done," he says.
Whereas in the past, for instance, Du Pont might have had three people working on the safety of and regulations regarding the use of lasers, now the firm is trying to spread the talents of one employee more effectively. "We're doing a better job of leveraging our support work across the company," he says.

Aerospace, defense, and, to some extent, electronics firms have had to institute aggressive layoff and voluntary-separation policies in response to the dramatic decline in the defense industry, and that trend will not change in the foreseeable future, industry experts predict. Even large corporations like General Electric, whose operations comprise 13 different businesses, are facing some rough times. Parent reports that, while to date GE has not had to actually lay anyone off, it has been downsizing through attrition: "We've been in a nonexpansion mode for the last four to five years." Parent adds that in 1993 he expects to hire only 50 scientists within corporate R&D, all of whom will be replacement hires.

The situation for earth scientists employed in the petroleum and gas industries is expected to continue to decline as well. Nick Claudy, manager of human resources for the American Geological Institute of Alexandria, Va., reports that these firms have been forced to lay off technical staff members over the past couple of years as existing sources dry up and competition from abroad has increased (Edward R. Silverman, The Scientist, Dec. 7, 1992, page 23). The American Association of Petroleum Geologists (AAPG) in Tulsa, Okla., reports that employment prospects in the mining and minerals industry are also pretty flat.

But there is some good news on the horizon for earth scientists. With the current trend toward environmental awareness, both associations report a bright outlook for hydrologists, hydrogeologists, and scientists experienced in the areas of toxic waste and waste management.
Oil companies need people to help them develop "cleaner" methods of drilling and are looking for more efficient ways to extract oil from existing wells, Claudy says. And Larry Nation, spokesman for AAPG, anticipates that many petroleum geologists will return to school for retraining in geology and hydrology as the demand for environmental consultants skyrockets.

Another industry sector that appears to be weathering the economic downturn intact is the pharmaceutical business. The Washington, D.C.-based Pharmaceutical Manufacturers Association reports that R&D spending by its approximately 105 member companies increased in 1992 to $8.8 billion in the U.S. and $10.9 billion worldwide (up from $7.7 billion and $9.6 billion, respectively, in 1991).

But while research budgets and hiring needs are expected to continue to increase moderately throughout the pharmaceutical industry, some external factors could hurt this market sector in the near future. One, says Ann Price, a scientific recruiter who works in the San Francisco Bay Area, is the drug-price-capping legislation that has been under discussion in Congress. "If drug companies are worried about how drug prices will be regulated and are not confident regarding their lobbyists' impact on a new administration, they will tighten their belts," she predicts.

And Erwin Posner, president of the Professional Advancement and Placement Institute, a science recruiting firm in Southfield, Mich., says that the large number of prescription drugs slated to come off patent--and therefore become open to competition from generic equivalents--in the next few years also poses a problem for the large pharmaceutical firms. "They are tightening up on their research budgets now because they are concerned about how their income is going to support the continuing research," he says.
Posner has been surprised over the past year to have some of his traditional corporate pharmaceutical clients tell him that they have a hiring freeze--even though, judging by their current sales, they appear to be doing well.

The one area of optimism for corporate R&D appears to be biotechnology. Though Price stresses that, owing to the nature of the science, fortunes and the demand for staff can change literally overnight, she has not witnessed any kind of slowdown in demand or opportunities for scientists in biotech; nor does she expect to over the coming year. The biotechnology industry has experienced some major setbacks in the past year and a half, but most of them have been attributable to regulatory agency complications rather than to the troubled U.S. economy.

The most recent example has been Malvern, Pa.-based Centocor Inc.'s January 18 announcement that it was stopping clinical trials of its antisepsis drug Centoxin because of problems. Last year, the U.S. Food and Drug Administration rejected Centocor's request for approval of the drug, citing deficiencies in the company's data. A subsequent trial was halted after an unexpectedly high number of patient deaths. The news caused the company's stock to plunge 62 percent and raised questions about the health of the entire industry.

But analysts disagree about the consequences of these setbacks, and many companies are moving ahead with research and clinical trials of their products. At Amgen Inc. of Thousand Oaks, Calif., for example, Chris Giffin, associate manager of research affairs, anticipates 25 percent to 30 percent growth in employees for his company in 1993. (Amgen currently employs 600 to 700 scientists.) "We're in the human therapeutic business, developing treatments for diseases which currently have no adequate therapy; these kinds of things are not really influenced by swings in the economy," he says.
Indeed, the success of Amgen's therapeutics Epogen and Neupogen is funding the company's current expansion. And the unprecedented number of biotech products expected to hit the market from several different companies over the next few years should help the outlook of this industry as a whole.

Both Price and Giffin report that, as the biotech industry has matured, there has been a broadening in the types of scientists whom firms have been seeking. Whereas 10 or 15 years ago companies were primarily hiring just molecular biologists, now they also are recruiting various types of cellular biologists, physiologists, neurobiologists, and organic and synthetic chemists, they say.

But whatever the demand for scientists in any specific area, recruiting managers across virtually all industry sectors are reporting increasing competition for the jobs that are available. Norling at Du Pont reports that currently, for about 200 openings in corporate R&D, his company has roughly 2,000 active candidates. "Our pipeline is full right now," he says, "and the quality of candidates is excellent." Meanwhile, Dow's Guthrie reports an increase in students who want on-campus interviews with the company, as well as in older applicants who already have industry experience.

At GE, Parent is happy to reflect on the positive side of such a tight job market: It allows him to "cherry-pick" the best people for his company. Suzanne Simala at Eli Lilly & Co. in Indianapolis also reports that the steady increase in the ratio of applicants to openings means that her company can be more choosy in whom it hires: In 1992, 40 percent of all scientists hired by Lilly had prior work experience in their fields.

The recession also has caused industry executives to reassess the goals of their research.
At Du Pont, Norling says, management is "spending more time looking at how long-range research ties into the long-term business outlook, and we are focusing research more on fundamental technologies that cut across a number of areas that will be in market demand."

At GE, too, "the charge increasingly is to get R&D projects out into a business division," says Parent. He points out that this objective has real ramifications for the type of scientists GE is seeking to hire. "We are more interested in looking for a person who is interested in taking what has been done in the lab and putting it into business to become more competitive, vs. the person we would have hired a few years ago, who is interested in developing basic new technology."

What other traits on a scientist's resume might strike the right chord with corporate recruiters? Du Pont's Norling says that, however tough the times, his company is specifically concerned with recruiting "new blood" into the organization, to ensure that Du Pont keeps abreast of newer technologies and techniques, such as innovative methods for making polymers. With this in mind, he says that Du Pont "still desperately needs to bring polymer chemists, chemical engineers, and biologists on board."

Betty Devinney, employment manager for Eastman Chemical Co., reports that changing political and economic geography is having a direct impact on whom she hires. "Our market is no longer a domestic one, it's global: Fifty percent of our sales are international, and we are in the process of moving our manufacturing sites closer to our customers." As a result, while technical fit is still the first thing she looks for in a job applicant, she is increasingly concerned with finding scientists who can help Eastman's multinational effort succeed.
Willingness to relocate overseas, international residency, having worked or traveled abroad, and bi- or multilingual skills--"anything that signifies appreciation or understanding of other cultures"--all raise her interest in a scientific candidate.

John Oxton, manager of corporate Ph.D. recruitment for Armonk, N.Y.-based IBM Corp., cites the litany of characteristics he looks for in a candidate: "technical expertise, a track record of proven accomplishments, motivation, leadership and interpersonal skills, initiative, independence, the ability to generate ideas and communicate abilities, and raw intellectual power."

Susan L-J Dickinson is a freelance writer based in Philadelphia.

(The Scientist, Vol:7, #3, February 8, 1993)
(Copyright, The Scientist, Inc.)

================================

CORPORATE R&D: THE CLINTON INFLUENCE
(Page 7 of newspaper)

Although the specifics of President Clinton's first budget will not become known until late winter or early spring, Albert H. Teich, the American Association for the Advancement of Science's director of science and policy programs, says that there are a number of specific tools Clinton could use to help brighten the outlook for corporate R&D:

* An R&D tax credit. This would provide a financial incentive for corporations to invest in research and development.

* Technology extension services, modeled after the current United States Department of Agriculture program. These would send government-employed technical experts out into the field to provide assistance and advice to small businesses that don't have their own R&D capabilities.

* Consortia, formed by industrial firms in a given sector and with government support (modeled on the Sematech consortium of semiconductor manufacturers). These would help corporations advance their common interests by sharing work at the precompetitive level of underlying technologies.

* A revision of antitrust legislation.
Joint research is already allowed under current antitrust law, but Teich suggests that Clinton might expand this practice to manufacturing.

* Government programs to facilitate the redeployment of highly trained scientists displaced from the aerospace and defense industries.

Finally, Teich points out that "all of the things that would improve the climate for industrial research would improve the employment possibilities for scientists."

--S.L-JD.

(The Scientist, Vol:7, #3, February 8, 1993)
(Copyright, The Scientist, Inc.)

================================

NSF Still Wrestling With Science Board Over Recommendations For Agency Future

NSB report's ambiguity and a lack of consensus on implementation hamper foundation's strategic planning

BY BARTON REPPERT
(Page 1 of newspaper)

In the face of congressional pressures and a climate of unease among university-based researchers, the National Science Foundation and its oversight body, the National Science Board, are wrestling to develop a plan for implementing policy recommendations set forth in a report by a special NSB commission on the agency's future. Their efforts are being complicated by lack of consensus about how to interpret the commission's findings in areas such as the relationship of basic and applied science, grant allocations, science education, and NSF's role in developing a national science and technology policy.

There has also been disagreement between NSB and NSF director Walter E. Massey over the role the science board should play in shaping specific agency policy initiatives. At press time, the picture was clouded further by the announcement that Massey will leave the agency in April. The announcement throws into question the status of these policy issues, as well as the disposition of a plan being prepared by NSF that attempts to outline specific steps toward implementation of the NSB report.
Meanwhile, several Washington-based science policy-watchers outside the government say they have mixed views about the NSB commission's report and its value in helping to shape longer-range strategy and goals for the NSF. Bruce L.R. Smith, a science policy analyst at the Brookings Institution, says about the report: "It's like the Dead Sea Scrolls, written in some kind of strange code. You have to know who said what. Obviously there was a lot of internal head-knocking. And they came out with this strange and interesting document."

Many NSB members attending a meeting last month of the science board's executive committee told Massey of mounting worries among NSF grantees concerning the possible impact the current drive to redefine the foundation's mission and scope may have on their research fields and career futures. "Believe me, we're generating intense insecurity," warned Thomas B. Day, NSB's vice chairman and president of San Diego State University. Another science board member, Peter H. Raven, director of the Missouri Botanical Garden in St. Louis, said during the January 15 meeting in Philadelphia that researchers are worried over the possibility of "some kind of murky move that will make it difficult for them to get money." Raven added that any indication NSF may trim back support for investigator-initiated basic research "sends thousands of people out to the windowsills, ready to jump."

At the meeting, Massey distributed to NSB members copies of a draft "strategic vision" document that sketches more specific long-range aims for NSF. He said a final version of the document is not expected to be publicly released until after a meeting of the full science board, scheduled for February 11-12. Massey also said that he and his staff are trying to develop an "implementation plan" to state specifically how the recommendations of the special commission will be carried out.

Mary E. Hanson, an NSF public affairs officer, says that the "strategic vision" document is the result of a long-range planning process that included the formation of the NSB commission, and that it seeks to incorporate the commission's findings. According to Hanson, the document presents five "strategic themes" for the foundation: "intellectual integration," involving "the synthesis of knowledge and skills occurring throughout research and education"; "organizational integration," referring to "cooperative arrangements and partnerships with other agencies, industry, or local governments"; "investing in people" to help develop a scientifically literate citizenry; adaptability; and accountability.

Massey and James J. Duderstadt, NSB chairman and president of the University of Michigan, have strongly praised the report by the board's Commission on the Future of the NSF, cochaired by William H. Danforth, chancellor of Washington University in St. Louis, and Robert W. Galvin, chairman of the executive committee of Motorola Inc. of Schaumburg, Ill. A number of science policy observers, however, say that on key points--such as grant allocations, the role of the agency in national science policy and in advancing U.S. competitiveness, and the scope of NSF support for science--the panel adopted language that many in the U.S. scientific community may view as troublingly ambiguous.

In its report, released on November 20, the 15-member commission concludes that NSF should have two goals in allocating its resources: "One is to support first-rate research at many points on the frontiers of knowledge, identified and defined by the best researchers. The second goal is a balanced allocation of resources in strategic research areas in response to scientific opportunities to meet national goals."

The report says "the commission urges that the role of the NSF be further clarified within an overall national policy, the goal of which should be to maintain the premier position of U.S.
science while regaining America's lead in the commercialization of technology." At the same time, it cautions that NSF can play only an indirect role in bolstering U.S. competitiveness in world markets. "The universities and the NSF should complement rather than replace the roles of those engaged in technology development," the panel says. "Redirecting the NSF's activities from research and education would have little or no effect on the U.S. competitive position in the near term, but would severely restrict prospects for the long term."

While omitting discussion of any specific budget figures, the report says "the NSF will find it difficult to respond to these new challenges without an increase in resources, for the budget of NSF already is inadequate to support its present responsibilities and programs." It recommends that the size of NSF grants be examined because "many believe that on average, NSF individual research grants are too small."

The commission urges that "the [National Science] Board and those involved in the planning resist any pressures to strip the NSF of its full spectrum of research goals and linkage mechanisms, from engineering research centers, to computer networks, to pure science and mathematics. The great strength of American science and of American universities is the absence of rigid cultural barriers between science and engineering and between pure research and its applications." (See page 8 for more excerpts from the report.)

Marye Anne Fox, an NSB member who also served on the special commission and attended the January 15 meeting, says she believes that "by and large" the scientific community is "very favorably impressed" with the commission report. "What they're worried about is ... how it's going to be implemented," she says, adding that "we're going to keep our eyes open on this implementation business." Fox, who holds the M. June and J. Virgil Waggoner Regents Chair in Chemistry at the University of Texas, Austin, indicates some concern over whether the science board will have a chance to vote on the Massey implementation plan before it is put into effect. "If you go back to the [1950] charter of NSF, it's NSB that's supposed to be doing the policy-making. So I would assume that if the implementation represents a shift in policy, it would have to be put up for a vote," she says.

The special commission was formed last summer amid stepped-up efforts in Congress--particularly by the Senate Appropriations Committee's subcommittee on independent agencies, chaired by Sen. Barbara A. Mikulski (D-Md.), which controls NSF funding--to channel the foundation more toward "directed" or "strategic" research and involvement with technology transfer. In its report accompanying a fiscal 1993 appropriations bill, the subcommittee said that, while recognizing NSF's role in establishing U.S. leadership in basic research, it believes that "the new world order requires the foundation to take a more activist role in transferring the results of basic research from the academic community into the marketplace." NSF, the subcommittee said, should "play the key role in making the nation's academic research infrastructure more accessible to those endeavoring to build America's technology base and improve U.S. economic competitiveness."

The Mikulski subcommittee set minimum levels for NSF's support of manufacturing science and technology, high-performance computing, and interdisciplinary research on the environment. This get-tough message to NSF was dramatically underscored when the Senate and House appropriations committees rebuffed the Bush administration's request for an 18 percent increase in the NSF research budget--and instead appropriated $1.859 billion, down $13 million from fiscal 1992.
In December, Massey sought to interpret the NSB special commission's recommendations in a politically favorable light when he sent to key members of Congress a letter saying he views the November 20 report "as an important step in validating NSF's efforts to strengthen the links between academic or curiosity-driven research and strategic research focused on issues of national concern.

"More importantly, I feel the report provides a strong policy basis for an accelerated expansion of programs that link scientific and engineering research to key technologies in such areas as materials, biotechnology, electronics, and information and communication."

Detailing NSF's "current plan" for the 1993 fiscal year, Massey's December 21 letter said that with regard to various "strategic research" areas:

* Manufacturing research and education will increase by 18.7 percent to a level of $104.4 million. "We will devote particular attention to working closely with the Defense Department in defense conversion efforts," Massey wrote.

* The advanced materials and processing program will increase by 14.2 percent to $303.6 million.

* Biotechnology research funded by NSF will increase by 9.3 percent to $190.2 million.

* High-performance computing and communications will increase by 12.5 percent to $225.1 million.

* The global change research program is slated to increase by 15.2 percent over last year's level. "This represents about 30 percent of the original increase requested and will require us to forego a number of planned international activities," the letter says.

A Senate Appropriations Committee staff member, speaking on condition of anonymity, says that from the committee's point of view, "there's some progress that's been made." He says some in the research community overreacted because they "misunderstood the language and intent of the committee," which wasn't meant to suggest that "all research must be applied research."
Instead, the subcommittee believes that "you can have basic research that is strategic in nature," he says. The committee aide says NSF is "making progress towards providing a vision, that is articulable, that does say we can focus basic research, based upon strategic areas--recognizing that we don't have enough money to fund every good idea, every good intention." On the other side of the Capitol, the House Science, Space, and Technology Committee's science subcommittee, chaired by Rep. Rick Boucher (D-Va.), is expected to review the commission's report and recommendations as part of its NSF reauthorization hearings, which a subcommittee aide says could get under way as soon as late February. The Brookings Institution's Smith, who previously worked at the White House Office of Science and Technology Policy, says that, as he views the situation, Massey "made an effort, under pressure from Mikulski and the Senate [appropriations] committee, to try to move NSF toward a prominent role in technology transfer. And the scientific community kind of went bananas, in a semi-hysterical fashion. I think [moving toward a technology-transfer role is] dead in the water." Commenting on the report's brevity--11 pages--and its lack of detailed explanation for the recommendations presented, Smith notes that with only three months to complete the study, the commission was "under a very tight deadline." Nevertheless, he adds, "we deserve a little more profound wrestling with the issues and a clearer charting of the future. I think in that sense the report was a disappointment." 
Erich Bloch, who served as NSF director in 1984-90 and is currently a distinguished fellow at the privately funded Council on Competitiveness, says the commission's report is "a confirmation, in my opinion, of what NSF has been doing over the past few years"--programs balanced between basic and applied research, between individual grants and support for groups and centers, and also between research and education. According to Bloch, "the report points out that NSF has to become more proactive--more proactive with regard to its relationship to industry.... NSF has to reach out more and involve industry members more in its deliberations, be it on the [science] board, in advisory committees, or you name it." Kathy Ream, director of the American Chemical Society's department of government relations and science policy, says the commission's report is "not the be-all and end-all on what the foundation should be doing." She observes that, despite Massey's attempt to interpret the report in a favorable light, "I don't see the commission's report as being in the thrust that the Senate Appropriations Committee seems to want the foundation to go. I think we're going to still have certain battles up there." H. Guyford Stever, who served as NSF director in 1972-76 and is now a commissioner with the Carnegie Commission on Science, Technology, and Government, comments about the report: "I think the [NSB] special commission did a good job, although they didn't go overwhelmingly in one direction or the other. My personal feeling is that the National Science Foundation's best effort is heavily loaded on the basic research side, but applied research is fine if it's done in the kind of climate that they're best suited to, which is American universities." Stever indicates that he believes the NSB commission should have taken more time in order to do a more comprehensive study, saying: "I personally think it's much too important to get this right than try to rush it." 
Charles Chambers, executive director of the American Institute of Biological Sciences, calls the report "certainly intelligent and reasonable." He says that some of the report's ambiguous language or other shortcomings may have been inevitable in view of the circumstances under which it was drafted. "When you have such a broad range of people of different views, operating on a very, very short time frame, in the midst of a national election, you find people perhaps pointing with pride rather than marching ahead," Chambers says. "And I think it's a document which will, of course, take on more character and more flavor as the new administration settles in, [and as] NSF decides where it wants to be and how it wants to do what it's doing." Barton Reppert is a freelance science writer based in Gaithersburg, Md. (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ AMPLIFICATION (Page 5 of newspaper) The article "Advance Planning Is The Key To Avoiding And Surviving Layoffs, Career Experts Say" (Ricki Lewis, The Scientist, Jan. 11, 1993, page 20) mentioned the Young Scientists Network (YSN), an electronic mail-based organization. The electronic mail address for information on subscribing to YSN, which is available through the Internet, is: ysn-adm@zoyd.ee.washington.edu (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ REPORT ON NSF FUTURE (Page 8 of newspaper) Following are excerpts from the report, "A Foundation for the 21st Century: A Progressive Framework for the National Science Foundation," prepared by the National Science Board's Commission on the Future of the National Science Foundation. The report was released on Nov. 20, 1992. "Despite having only about three percent of the total federal R&D budget, the NSF has for over 40 years played an essential role in the scientific primacy of the United States.... 
"An important national priority is to improve the relative industrial strength of the United States. The National Science Foundation can make contributions to economic success, but developing a plan to do so must begin with an understanding of the system and the reasons for failure of some industries in world markets. "Failures in the market place have not been the result of slow transfer of academic science to industry. In fact, American firms have been the first to commercialize virtually all innovative products, but have lost market share to competitors with shorter product cycles, lower costs, and superior quality. "... The universities and the NSF should complement rather than replace the roles of those engaged in technology development. Redirecting the NSF's activities from research and education would have little or no effect on the U.S. competitive position in the near term, but would severely restrict prospects for the long term. . . . "The United States should have a stronger and more coherent policy wherein science and engineering can contribute more fully to America's strength. "The [National Science] Board is encouraged to work with the president, his science adviser, and the Federal Coordinating Council on Science, Engineering and Technology to assess the health of science and engineering broadly and to generate a stronger policy into which the NSF mission fits.... "Society's voice is welcome and needed.... In accepting society's support, the scientific community naturally assumes an obligation to be both responsive to national needs voiced by society as well as the intellectual priorities solely initiated by the scientist or engineer.... "The commission strongly supports the initiation of proposals by investigators and selection of those to be funded by merit review carried out by experts. This method has proved to be the best way of tapping into the creativity of research scientists and engineers.... 
"The board and foundation's key role in the support of research in science and engineering should be strongly reaffirmed. "The NSB and the NSF should encourage interdisciplinary work and cooperation among sectors. Nature knows nothing about disciplinary boundaries.... "Many believe that on average, NSF individual research grants are too small. Examination of separate fields and wide consultation within the community would help in understanding the issues.... "The foundation should more aggressively lead in communicating the `case' for science and engineering, which deserve a high priority in the mind of public officials and citizens alike.... "The NSF should both set an example and work with others in fostering international cooperation.... "A major priority for the NSB and the NSF should continue to be education in science and engineering. NSF's support of education has a cascading influence. The foundation should be at the leading edge of ever-emerging improvements in curricula and methodologies of teaching and training for research.... "NSF should continue to support shared, common-use facilities that cannot be built and maintained by individual institutions.... "The commission urges that the role of the NSF be further clarified within an overall national policy, the goal of which should be to maintain the premier position of U.S. science while regaining America's lead in the commercialization of technology. "... The NSB, in helping to develop a national science and technology policy, should move quickly to propose a role for the NSF based on its past mission and a vision of what is needed today.... The plan should include a response to the recommendations of this commission.... "The NSF will find it difficult to respond to these new challenges without an increase in resources, for the budget of the NSF already is inadequate to support its present responsibilities and programs.... "... All roads need not lead just to the public treasury. 
We have one additional suggestion--expanded contributions by business to complement public funding for selected science, engineering and technology programs." (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ Billionaire Bass Moves Beyond Biosphere 2 BY SCOTT VEGGEBERG (Page 1 of Newspaper) Texas billionaire Edward Bass, best known for his bankrolling of Biosphere 2, now appears to be focusing his largess a bit closer to the scientific mainstream. The Biosphere 2 project, an experiment set up near Tucson, Ariz., attempting to establish a sealed, self-sustaining, three-acre module with eight humans inside, has taken a beating from the press and scientists over its lack of scientific credibility, not to mention allegations that the project is essentially the work of a survivalist cult. The scientific shortcomings of Biosphere 2 (B2) are being addressed in the wake of an analysis conducted last year at the behest of Bass by Smithsonian Institution ecologist Tom Lovejoy and other scientists. And, in another move toward greater scientific credibility for projects he funds, Bass has provided $20 million to form the Yale University Institute for Biospheric Studies (YIBS). In November, Yale officials announced the formation of seven on- campus centers for ecological research and the intent to hire seven new faculty members. According to YIBS director Leo Buss, although Bass is chairman of his institute's external advisory board, YIBS and B2 have "no more to do with each other than with the number of other projects Mr. Bass supports" (see story on page 9). On the B2 front, last July Lovejoy released a report of his team's analysis of the problems with the project. The report said, "Although good potential exists for [Biosphere 2] to be a top-grade scientific effort, several related factors have held back scientific development." 
Those factors included the lack of a "well-developed, written scientific plan," which has resulted in an "ad hoc mix of scientific initiatives of varying quality." Lovejoy also cited an "overconcern with proprietary information," which has "impeded the flow of scientific information and interaction." Other problems mentioned in the report include possible "embellishments" of data. The report recommended that B2's managers create a position of scientific director and begin publishing and discussing their work more openly. Lovejoy and his team of B2 analysts were optimistic, however, that the $150 million project could be set straight: "We believe that, given an adequate scientific reorientation, Biosphere 2 can fulfill its vision and become an important and unique contributor to scientific knowledge." Specifically, the report said B2 could make important contributions in the fields of biogeochemical cycling, the ecology of closed ecological systems, and restoration ecology. Yale emeritus botany professor Arthur Galston, an expert in space biology and a longtime consultant to NASA on space-based farming, has in the past been critical of the Arizona project, calling it "unregulated and unscientific." But even he acknowledges that some credible findings may yet emerge from the B2 experiment. "Tom Lovejoy will certainly help them shape up," he says. "If they mend their ways and have a proper scientific organization, they can contribute not only to basic science but to NASA's Controlled Ecological Life Support System [CELSS] mission-oriented projects." Galston and colleagues reported in the July/August 1992 issue of BioScience (42[7]:490-535) on NASA- sponsored research that aims to develop "bioregenerative life- support systems to produce, process, and recycle biomass." Biosphere To Shape Up "We want to assure people that there is serious science going on here at Biosphere 2," says B2 spokesman Chris Helms. 
As for the position of scientific director, which was announced via advertisements in Science, "we are very close to some finalists for that position," he says, but at press time no candidate had been selected. Publications of results from B2 are in production and one has already appeared in print, Helms says. But aside from the credibility issue, another problem the B2 managers have to deal with is the declining oxygen levels inside B2. For reasons as yet unexplained, what should be a 21 percent level has declined to less than 15 percent oxygen, roughly equivalent to that available at 12,500 feet elevation. Scientists are puzzling over this gradually dwindling oxygen supply, and some believe it may be getting bound up in the soil. But Roy Walford, a University of California, Los Angeles, pathologist known for his research on aging and the only trained scientist inside the B2 structure, says it is not a bane but a bit of a boon. He says he is using the declining oxygen levels as an opportunity to conduct research on a phenomenon in high-altitude acclimation he is observing and will report on in future papers. But he and fellow biospherians have been suffering breathing problems, insomnia, and fatigue. "When those things get too much for us, then I'll call it a halt," he says. And in fact, on January 13, project managers decided to ameliorate the atmospheric conditions inside B2 by pumping in oxygen to achieve a 19 percent level, says Helms. However, no further additions of oxygen are anticipated from now until the biospherians' scheduled emergence on September 26. Walford has already published a paper in the Proceedings of the National Academy of Sciences that takes advantage of one of B2's other major problems--the inability of the closed ecosystem to supply 2,500 calories per day worth of food to the biospherians, as was targeted at the time B2 was closed in September 1991. 
The title of the paper sums up his findings: "The calorically restricted, low-fat, nutrient dense diet in Biosphere 2 significantly lowers blood glucose, total leukocyte count, cholesterol, and blood pressure in humans" (R.L. Walford, et al., PNAS, 89:11533-37, 1992). Speaking by telephone from inside B2, Walford says, "We just kind of, by serendipity, find ourselves in a situation we've been studying for years in animals" (R. Weindruch and R.L. Walford, The Retardation of Aging and Disease by Dietary Restriction, Springfield, Ill., Charles C Thomas, 1988). He says having what amounts to rigorously sequestered human subjects for such a long period provides a unique setting by ensuring that everyone receives the same amounts of food and that no one is cheating on the diet. In the summary of the paper, he writes that the B2 caloric restriction demonstrates that "radical and possibly beneficial changes in physiological risk factors can be produced in normal affluent individuals in Western countries quickly and reproducibly by dietary manipulation." The biospherians have all lost quite a bit of weight--some of them more than 50 pounds--and according to press accounts they appear quite thin. As with many of the goings-on at B2, this paper has attracted attention from the popular press. Speaking of Walford's study, Neil Stone, a professor of medicine at Chicago's Northwestern University and chairman of the American Heart Association's nutrition committee, told the Associated Press in December that the research hasn't shown that the diet can increase life span. "He didn't show any of that," Stone told AP. "It's one thing to say that I think this diet does this in animals. It's another thing to show it in humans." Meanwhile, at YIBS, director Buss says the Bass donations were fortuitous, coming at a time when Yale was endeavoring to grow substantially in the environmental science area. 
To choose the centers that would be part of YIBS, environmental science faculty were invited to submit innovative proposals on what areas Yale should focus its attentions. "The process of deciding [on] these centers has brought together faculty from across departments and schools that was unprecedented," says Buss. "We found as a rule that the most exciting areas were ones that span more than one academic unit." That is, interdisciplinary proposals fared the best, he says. "The center's leaders are among the most prestigious members of Yale's faculty. Without exception, they are noted for their ability to encourage interdisciplinary research and promote synergism." YIBS Projects The Center for the Study of Global Change will examine variations in energy released from the Earth's interior and from the sun, fluctuations in ice caps, sea temperatures and levels, and atmospheric gases. The Center for Computational Ecology will study "artificial life," which involves computer simulations of how living organisms evolve and interact. The Center for Earth Observation will gather and analyze satellite data to monitor storms, forest damage, ozone depletion, and so forth, and act as a library for such data. Another center, called ECOSAVE, will study the ecology and systematics of animals on the verge of extinction. The Center for Molecular Ecology and Systematics will utilize Yale's Peabody Museum and its vast plant and animal collections to genetically analyze population patterns and to classify species. The Center for Human Ecology, Environment and Infectious Diseases will chart the impact of social and environmental changes, such as deforestation, on infectious human diseases. The Center for Biological Transformation will concentrate on ways to utilize bacteria to detoxify chemical wastes. Bass's donation is intended only as seed money to get the centers off the ground, says Buss, who adds, "We are aggressively pursuing ongoing funding." 
Other grants are starting to materialize, such as $389,000 from the General Reinsurance Corp., headquartered in Stamford, Conn., to the Center for Biological Transformation for studies on bacteria that consume toxic waste. (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ EDWARD BASS ON BIOSPHERE STUDIES (Page 9 of Newspaper) Edward Bass is 46 years old and--as heir to a fortune that originated with his great-uncle Sid Richardson, a Texas oil wildcatter--is one of the wealthiest men in the United States. But although many of his siblings' interests run more toward luxury and high-stakes business ventures, he prefers a down-to- earth lifestyle. He drives his own car, flies commercial airlines, and doesn't spend a lot of money on fancy houses, says his publicist, Terrell Lamb. He is also one of the largest benefactors of environmental research in the world, not all of which attracts as much interest--and derision--as Biosphere 2. Besides funding Biosphere 2 and the Yale Institute for Biospheric Studies, Bass is on the board of the World Wildlife Fund (WWF); the New York Botanical Garden; the African Wildlife Foundation; and the Jane Goodall Institute for Wildlife Research, Education and Conservation. In a July 1992 article in Vanity Fair, Bruce Bunting, vice president of WWF, said of Bass, "The common denominator in all his projects is that Ed Bass wants to build a better world." To learn more about what motivates Bass's environmental largess, The Scientist conducted the following interview. Bass shares at least one trait with the rest of his family, however--a reluctance to speak and appear in public. These questions, therefore, were submitted in writing through his publicist, as were the responses. Q How do you account for your extraordinary interest in the environment? A I consider I was lucky when I was growing up. 
My father had a great love for the out-of-doors and for wildlife, and he exposed me to that. I also was educated at a time when ecological issues were beginning to be focused upon. The growing awareness of humankind's role in the biosphere, a role that can be detrimental, has developed in the last 20 or 30 years as an area of considerable scientific interest and significance. Q Why do you invest so much in environmental projects, and what satisfaction do you derive from this investment? A I've invested a great deal in ecologically oriented ventures, not only in monetary terms, but also in terms of my own time, efforts, and energies. I've done this because I think there is a great future in it. I see it as good business, and I see it as responsible stewardship of the resources of wealth. What I enjoy most about the projects I'm involved with around the world is the opportunity to go out and work in the field. For example, I give a significant chunk of my time every year to work in the field with the World Wildlife Fund, principally in Asia. This includes everything from participating in program development and negotiations on a diplomatic level to expeditions in the most remote reaches of Bhutan and Nepal. I have been quite involved in programs for the conservation of the Greater Asian one-horned rhino in the lowlands of Nepal, and in the creation of a unique, pioneering Conservation Trust Fund for Bhutan. Q Besides being a 1968 graduate of Yale, why did you choose this university to create an Institute for Biospheric Studies? A I am convinced that Yale has extraordinary potential among universities to take the lead in advancing understanding of the biosphere and in refining creative approaches to environmental issues in this country and around the world. 
Yale has a great strength in the basic sciences which inform these matters, and it is unique in having a professional school devoted to natural resources management and environmental studies; a magnificent natural history collection in the Peabody Museum; a law school that has been a pioneer in environmental law; an international studies program that will focus increasingly on worldwide environmental concerns; and a growing interest in the economics, politics, ethics, and public health issues relating to the environment. Q What are your plans for improving the scientific merits of Biosphere 2? A With any new scientific endeavor, gaining scientific credibility is a very definite, step-by-step process. Since we first conceived of Biosphere 2, I've been confident it would have scientific significance, and I think that as we progress, we will be increasingly acknowledged for our scientific interest. Also, we now have an independent, outside scientific advisory committee, chaired by Dr. Thomas Lovejoy of the Smithsonian. The committee reviews all pertinent aspects of the project, and their reports are made available to the scientific community. --S.V. (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ AAAS Gathering In Boston BY RON KAUFMAN (Page 3 of the newspaper) The 1993 meeting of the American Association for the Advancement of Science--which AAAS officials foresee as being a "stronger" meeting than those of past years--convenes this week in Boston. The six-day gathering, February 11-16, will feature 22 symposia, including more than 200 lecture sessions, and 1,200 speakers. Especially noteworthy to some attendees may be sessions focusing on such volatile subjects as "Science and Religion," "Confronting AIDS," and "Examining and Reforming the Economic System." 
Treatment of controversial topics like these, in addition to the hard-core science matters traditionally within the AAAS meeting domain, furthers a trend established during recent years. Last year's symposia included one session on the crisis in health care in the United States; another dealt with the problem of world hunger. The 1988 meeting featured speeches from 14 Soviet scientists about research challenges in a communist system. However, the agenda of topics for the 1993 meeting, which is to take place at the Hynes Convention Center, was chosen in a manner different from that of previous years. In the past, symposia proposals from individual members were submitted to the AAAS program office, which--with the advice of whomever the office wished to consult--set the meeting docket. For the 1993 convention, symposia topics were chosen by a 14-member program committee, chaired by current AAAS president F. Sherwood Rowland. This group reviewed all suggestions for scientific merit and the status of the proposed speakers in their respective fields. "One of the feelings that the board of directors had, as well as the program office, was that the system [of choosing symposia] needed some revamping," says Rowland, a chemistry professor at the University of California, Irvine. He says that AAAS members were complaining that some symposia were weak, with low scientific interest and speakers considered on the fringe of that particular scientific area. This year, Rowland says, "there is much more quality control of symposia selections. With that in mind, we think this will be a stronger program." More than 4,000 scientists are expected to attend the event. In addition, some 600 journalists, many in Boston to attend simultaneous annual meetings of the National Association of Science Writers and the International Science Writers Association, are expected at the AAAS gathering. 
Linda Wilson, president of Radcliffe College in Cambridge, Mass., will give the keynote address, replacing original lecturer Jean Mayer, the former chancellor of Tufts University, who died January 1. Wilson's speech is entitled, "The Scientific Community at a Crossroads: Discovery in a Political and Cultural Context." The conference, subtitled "Science and Education for the Future," will also offer programs geared toward students and teachers interested in science, says meeting director Robin Woo. For example, on Saturday, February 13, newly designated Surgeon General Joycelyn Elders will give the keynote address for a "Student Caucus," a feature intended to--for the first time--give students a forum so their voices can be heard by the members of AAAS. Although many of the symposia will be of interest to nonscientists--students, teachers, journalists, and so forth-- four special seminars are aimed at a scientifically sophisticated audience. These include: "Mapping the Human Brain," "Protein Kinases and Phosphatases," "Human Obesity," and "Teaching Ethics in Science and Engineering." (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ AT A GLANCE (Page 3 of Newspaper) 1993 AAAS Meeting: Science And Education For The Future More than 4,000 scientists are expected at the annual meeting of the American Association for the Advancement of Science, to be held in Boston February 11-16. Meeting highlights: * Three days of symposia entitled "Future Chemistry: From Carbon to Silicon," February 14-16. Nano-engineering will be the focus of lectures by 20 different speakers. * A special one-day symposium, "Science Education Reform in America," Saturday, February 13. Among the sessions is a detailed discussion of AAAS's Project 2061, a proposal for science education reform. 
* A one-day workshop, "Regulated Gene Expression and Chromosome Structure," presented by AAAS and the American Society of Cell Biology, Saturday, February 13. * On Saturday at 2:30 P.M., a symposium entitled "Biological Science in the Public Domain," featuring a lecture by Harvard University evolutionary biologist Stephen Jay Gould. * Lecture by former Office of Science and Technology Policy director Allan Bromley, "The Theory and Practice of Science Advising in the United States and Abroad," Monday, February 15, at 1:15 P.M. (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ NOTEBOOK (Page 4 of newspaper) First, The Bad News Academicians who believe that the way to start the day off right is sitting down with a newspaper and a cup of coffee might want to avoid a new publication by the American Association of University Professors until the afternoon. The College and University Fiscal Crisis Update, which debuted in January, is filled with the latest in bad budget news from campuses around the United States, and is punctuated by comprehensive national lists of funding and job cutback statistics. Highlights (or lowlights) of the January issue include reports that Louisiana has been forced to shave $45 million in state funding from its colleges and universities; the University of Wisconsin system's $10.2 million budget reduction will necessitate hiring freezes, elimination of some classes, and student support services cuts; and Florida State University is experiencing a "brain drain," caused by $18.9 million in cuts over the past two years. For more information about the AAUP newsletter, contact Iris Molotsky at (202) 737-5900, Ext. 3009. The Spring Of Her Discontent In 1962, zoologist Rachel Carson published Silent Spring, which spawned, many believe, the environmental movement as it's known today. 
One of the first two women ever hired by the United States Fish and Wildlife Service, Carson, fighting cancer herself, took on the chemical companies, the Department of Agriculture, and agribusiness by exposing in the book the dangers of DDT and other pesticides and herbicides to the environment. She died April 14, 1964, and as part of its "The American Experience" series, PBS examines the impact of Carson's book and her struggle. Included in the show will be readings from the book by actress Meryl Streep and interviews with foes and friends, including then-Secretary of the Interior Stewart L. Udall, one of the early champions of her controversial work. Check local listings for broadcast and possible rebroadcast times. An Ounce Of Prevention The National Safety Council's First Aid Institute has developed a Bloodborne Pathogens Training Program to ensure the safety of employees who may be exposed to HIV or hepatitis B infection. Included on the list of occupations considered at risk are workers at clinical and diagnostic labs. The program, which can be taught by noncertified instructors, satisfies OSHA guidelines. An instructor training package, which includes a manual, slides, and a video, enables employers to train staff on-site. For more information, call Bill Markowski, technical marketing specialist, National Safety Council, (708) 775-2105. Just One More Bowl The bowl season didn't entirely end last month. The third annual National Science Bowl will be held April 18-19 in Washington, D.C. Teams of high school students from all 50 states--more than 15,000--are expected to participate in the academic competition, aimed at heightening interest in mathematics, engineering, technology, and science. Regional competitions will produce the 44 teams that will qualify for the national finals in Washington. 
The Washington competition is cosponsored by the Department of Energy and the Cray Research Foundation; prizes will include international and domestic science trips, computer hardware and software, teacher scholarships, and a linkup to the National Education Supercomputer, located at DOE's Lawrence Livermore National Laboratory in Livermore, Calif. For information, contact the National Science Bowl Coordinator, Office of University and Science Education, DOE, Washington, D.C. 20585; (202) 586-8949.

The Key To A Breezy Presentation

The National Center for Atmospheric Research offers a library of videos, color slides, black-and-white photos, transparencies, and computer simulation prints on weather, climate change, and related topics. The visual aids can be used to enhance presentations, articles, and classroom lectures. For a free catalog or more information, contact NCAR Information Services, P.O. Box 3000, Boulder, Colo. 80307-3000; (303) 497-8600 or (303) 497-8606.

Fulbright Season Opens

Competition for the 1994-1995 Fulbright Scholar Awards for U.S. faculty and professionals begins March 1. Some 1,000 grants will be awarded for research, research and lecturing, or university lecturing in nearly 135 countries. Specific openings exist in a wide range of the sciences, social sciences, humanities, and arts. Eligibility requirements are U.S. citizenship and a Ph.D. or comparable professional qualifications. University or college teaching experience is expected for lecturing awards. The deadline for applications is August 1. For application materials and information, contact the Council for International Exchange of Scholars, 3007 Tilden St., N.W., Suite 5M, Box NEWS, Washington, D.C. 20008-3009; (202) 686-7877.

Maybe It Will Start A Trend

While the price of all things medical seems to be going up, the cost to subscribers of the National Library of Medicine's MEDLINE and other NLM online databases has gone down as much as 40 percent.
Beginning in January, the price of access to most of the databases was reduced from approximately $30 per hour of connect time to about $18 for most subscribers. High-volume users, such as hospitals and academic centers, will also have a new discount plan available to them. Commercial database vendors whose systems supply compact disk products containing MEDLINE data will pay the cost of reproducing and handling database tapes, but will no longer have to pay user fees. For information and a price schedule, contact the National Library of Medicine, National Institutes of Health, Bethesda, Md. 20894; (301) 496-6308.

================================
OPINION

Communication As The Root Of Scientific Progress
(Page 10 of newspaper)

Editor's Note: The thorough and timely review of scientific literature pertaining to a researcher's chosen specialty is fundamental to the process of science, says Nobel Prize-winning geneticist Joshua Lederberg. However, says Lederberg--former president of Rockefeller University and now University Professor at that institution--keeping up with the steady, potentially overwhelming flow of significant published documents can be a daunting chore for the diligent, conscientious researcher. In October 1991, Lederberg offered his views on this matter--and his suggestions for ways in which a melding of print and electronic publishing processes promises great progress--in a lecture at the Sixth International Conference of the International Federation of Science Editors in Woods Hole, Mass. Following is an edited version of his lecture. The event took place after Lederberg left his post as Rockefeller's president and resumed his focus on laboratory experimentation.
BY JOSHUA LEDERBERG

I don't do very much editorial work these days; I'm back working in the laboratory after a lapse of 12 years, and that has kept me very busy trying to reacquaint myself with the literature of my own field. So I will offer you the perspective of a scientific reader. Now, some people tell me that's a vanishing species. For anyone to say that, even with some sense of irony, is an atrocity.

One of my main functions within my own laboratory group is to try to be its principal reader. If something goes on in the world outside and none of us has heard about it for two or three weeks, I'm the one who feels responsible. I want to be alert to events that might have a very important bearing on the way we think about our own research, our planning, the data coming in, and the sources of error.

The Act Of Publication

Let me begin with a few truisms, just to be sure that we are operating on a common ground of reverence for the publication process. Publication is, to start with, just that: publication. It converts private to public knowledge, in the service of registering a private claim of original authorship--in science, of discovery. Above all, the act of publication is an inscription under oath, a testimony. It is accepted as valid until firm evidence to the contrary is presented. There is an extremely high standard of accountability for what is published under a given person's name. Just look at the daily headlines. Publication is the essential ingredient that makes scientific work responsible, in the sense that one cannot readily retreat from assertions that have been signed, delivered to the printer, and made available to thousands.
These publicly asserted claims also play an extremely important role in the allocation of resources--the ability of different scientists to survive in the competition with other legitimate claims for expenditures, for support of laboratories, for positions at their institutions, for space in the journals, for the attraction of students and collaborators. Both author and audience benefit from the successful assertion of those claims: One doesn't have to spend an inordinate amount of time reexamining every detail of an individual's output if that person has established credibility through prior publication and exposure.

Publication also results in a repository, constructing the tradition of science. Up to this point, publication can hardly be anonymous if it is to perform the functions I have just indicated. But as time goes by, we see the reassimilation of the content of scientific work; as it settles in and survives the criticism it should have had at the early stages of the process, it becomes the common tradition, the unquestioned shared wisdom--often becoming anonymous by obliteration.

The literature is also a forum. It's a gladiatorial arena for competing claims, for resolving discrepancies in data or interpretation. There used to be oral duels, and we revel in stories like Louis Pasteur's confrontation with Felix Pouchet that finally put spontaneous generation to rest in 1864. Today, our battles are more often fought out in print, which is indeed appropriate because the testimony then becomes available to the universe, not simply to the immediate onlookers. Despite the opportunity for very broad dissemination, there is the paradox, nevertheless, that broadcast restricts individuals' access to feedback. The publication system, at least in principle, should allow a dialectic to appear in more symmetrical terms, so that anyone with something purposeful to say has a way to get into the system.
If the literature is a forum, it is also a rumen, a place for the digestion and assimilation of a variety of inputs whereby scientific claims go through a period of seasoning, modification, modulation. Even the truths look different five or 10 years later, regardless of explicit criticisms. We can expect a process of reinterpretation, a post-historical reexamination of the meaning of their terms. And now I need only remind you of the term imprimatur (a wonderful metaphor): the imprinted witness that an article, having appeared in a refereed journal, has survived a critical process--a conspiracy, if you like, of the editors and the publishers and the referees--and that something has appeared that is worthy of the shared interest and precious attention of the community.

The Work Of A Reading Scientist

Reading the scientific literature has been my primary vocation for 50 years. May I tell you what I do as a reading scientist today?

Books play a diminishing role. Today they are mostly for targeted reference. In the scientific domain, we rarely have the leisure now to read a book from cover to cover. A few biographies command attention. I just finished Carl Djerassi's life story, The Pill, Pygmy Chimps and Degas' Horse (New York, Basic Books, 1992); another of that genre was François Jacob's revelation of the development of his scientific work, The Statue Within (New York, Basic Books, 1988). These are, obviously, not very contributory to the details of how to do my next experiment, but they tell me a lot about the scientific personality, providing object lessons and models for emulation. Rarely, I do see a work that compels total ingestion--for example, Physiology of the Bacterial Cell (Sunderland, Mass., Sinauer Associates, 1990) by Frederick Neidhardt, John Ingraham, and Moselio Schaechter. This is such a magnificent synthesis, at a fairly elementary level of exposition, that I really marveled at the deliberation and distillation that went into the telling.
Wonderful books like that are rare. In printed form, they surely will be the survivors of any electronic revolution. At an intermediate level of indispensability as books in print format are the Annual Reviews. They are reference works for whatever you have to look up; but they also give a chance to browse through an enormous literature with some coherence. Compare the Annual Review of Genetics with current issues of the journal Genetics. Even if I had the time to read every article in that journal, I wouldn't have the background to be able to place each one of them in the appropriate context of what comes through--and I regard this as my home discipline!

People will also spend varying amounts of their time and energy trying to understand what is going on in science beyond the window of their own specific work in their research and teaching. There are about a dozen journals that I subscribe to, and maybe seven or eight of them that I do scan from cover to cover: Nature, Science, Proceedings of the National Academy of Sciences, The Journal of Bacteriology, Microbiological Reviews, Genetics, Biochemistry--those are the very general ones. I pick up a "hot paper" now and then from The Scientist, and I look at The Sciences, New Scientist, American Scientist, and Scientific American for general scientific culture.

I'm talking about a textual sampling, not immersion. You couldn't read every article in a critical and detailed fashion, in just the publications I have listed, within the number of hours there are in the week. What you can do, within a couple of hours a day, is scan that range of material and try to pick out those things that might be of interest. To follow the structure of argument just in one's own specialty, you must go to the detail of trying to check the numbers on the graphs and see if they match the authors' assertions--an arduous task.
We are well served by those kinds of journals in terms of maintaining a general currency about what is going on in the field, and they match very well the energy and interest and intellectual acuity that our scientific readers are able to put into the process. I see no occasion for those to be altered. Most scientists are very grateful for them: They provide the material that thousands of scientists will share as common currency, to carry in their briefcases and read on the airplanes and the commuter rides, with all the convenience of the present print format.

Information Retrieval

My main problem is: How do you reacquire, retrace, that intellectual traffic? What do you do with all of your marginal notes, and how do you synthesize a coherent system of what you've read? Well, to try to deal with this on a current basis, I get the weekly Current Contents on Diskette with all of its embellishments. I eagerly await the five or six diskettes that have to be loaded every week, and I am sometimes impatient about how long it takes to load them and get going with that week's literature. My stored profiles work out reasonably well, but they have to be embellished from time to time. You discover new keys, other notations that authors insist on amid changing fads and idiosyncrasies of language. I can warrant that my profiles recover, on a current basis, about 90 percent of what I have read or would want to read. God help me if I lose my notes on the rest!

Then, how to keep up with what's closest to my immediate specialty? Acquiring two or three papers a day is not hard. And, even with a fairly detailed critical examination, down to checking the points on the graphs and so on, reading them as they come in is entirely doable. My problem is the arithmetic of accumulation. After a decade, I've got about 10,000 papers that I have to keep track of--the texts, my marginal notes, and so on. And here my system has absolutely broken down!
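That arithmetic of accumulation is easy to verify with a short sketch. The two-to-three-papers-a-day pace and the ten-year horizon are Lederberg's own figures; the function name below is purely illustrative:

```python
# Back-of-the-envelope check of the "arithmetic of accumulation":
# a steady reading pace of a few papers a day, kept up for a
# decade, yields an archive on the order of 10,000 papers.

def papers_accumulated(per_day: float, years: int) -> int:
    """Total papers collected at a steady daily reading rate."""
    return round(per_day * 365 * years)

low = papers_accumulated(2, 10)   # two papers a day for ten years
high = papers_accumulated(3, 10)  # three papers a day for ten years
print(f"{low:,} to {high:,} papers")  # 7,300 to 10,950 papers
```

Even at the low end of that range, hand-filing and hand-indexing thousands of annotated reprints is exactly the bookkeeping burden described here.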
A technological fix is on the way: document scanners that can store page images and digitized text on searchable media. One or a few CD-ROMs will take care of the storage. But what a lot of bother for information--yes, full text--that I should be able to acquire electronically in the first place. This is all the more urgent for specialty journals and references to be searched on demand. Within a given discipline, there are usually one or two journals that specialists must see. There may be only a couple of hundred people who have such a level of interest that they will look at every article. There are the journals of broad appeal, and then a very flat distribution of the other sources. For my part, of an additional 30 articles a month, perhaps half come from about 15 journals; you can probably extrapolate to the rest using Bradford's Law, which states that a relatively small number of journals accounts for the great majority of significant publications in any field. Thus, about 90 percent of the most significant articles will come from about 35 journals, and then there is a gradual asymptote out to the vanishing returns from the total coverage that the system is going to offer. In spite of our most systematic efforts, an important article does pop up serendipitously from an obscure place.

So each of us faces the task of selective retrieval from a cosmic domain of stuff that every other eager beaver in the world has been busily putting into the repository. Our present technology affords the researcher an approximation of recovery with reasonable confidence. Keeping track of what you have accumulated on pieces of paper is the frustration. That's not your bedside reading, which is well served by the print-on-paper version. The next step, to integrate all that into your own private library of useful knowledge, is simply not achievable with last year's technology.

Information Overload?
The fact is that scientific literature has inherently grown beyond the scope of any hundred people to understand and explore in any depth; this is built into the growth of knowledge, and past improvements in communication and storage aren't going to alter it. What are the consequences? For one thing, the difficulty of assessing the literature reinforces all the other drives toward specialization. The ambitions of scientists have changed, to focus on ever-narrower targets. It's just too much hard work to master an interdisciplinary area on top of all the other institutional obstacles. Never mind the intellectual and conceptual problems. Never mind the problem of getting funding or moral and fiscal support--just getting hold of the necessary expertise and information is obstacle enough! But that impediment is in principle remediable.

At the same time, are we drowning in information, inundated by the numbers of journals? You know, when you come to any specific issue, when there is some important, special fact that you would like to know all about, it's another matter altogether. My usual experience in asking a new question tells me the odds are that the exquisitely detailed work needed to take the next step has just never been done. So here, far from being drowned, I have a great deficit of specific and detailed knowledge of exactly what happens in such and such a system with such and such reagents, and so forth. Our systems for acquisition of that kind of material are not perfect, but they are getting a lot better with devices like keyword searching, related-article linking, and full abstract searching, which is just about what the technology does offer today. Accordingly, I can feel reasonably confident that explicit matters of factual detail--such as whether somebody has done a particular experiment before--can be retrieved, though often only with a lot of effort. Much more difficult: has anybody else had a good idea that would be pertinent to my search?
Those keys are so much more difficult to catalog. Often it takes a great creative act to recognize that a concept developed in one context really is pertinent to another. So there will never be a guarantee that those can all be acquired. But there is at least the hope of finding such ideas in the literature, and it is a very important hope to try to preserve.

Adaptations

There are different adaptations to the information flood, and more and more we see what I can only describe as a scandal: scientific literature is not always taken seriously anymore. In polls of scientists, many will say that the primary source of their information about scientific work in their field is not the published literature. It's word of mouth, telephone networks, attendance at meetings, and so on. Perhaps that can't be helped--people have got to do what they've got to do. But I find those kinds of sources so unreliable! I feel very uncomfortable when the only place I have heard about something is by word of mouth. If I can't pin it down, if I can't hold its source accountable by saying it was in a published item, I can't look at it in detail, ruminate about it, think through what second-order reactions I would have. I don't know whether my colleagues share that passion for the literate mode. They may feel that they have no alternative except to pick up what's on the rumor mill, but I think great mistakes can be promulgated in that fashion.

The telephone is a wonderful instrument, but when I try to use it to get information, the people who have what I am looking for are all pretty busy. I hate to impose on their time, and, if I do, there is usually a round of telephone tag--three or four attempts to catch somebody before I actually do get hold of them for the information. If it's a reference, I am delighted. If it's an attribution, it cannot be pinned down more definitely than: "You know, this is what I think."
I don't feel that I have made a great advance over what I had before. Not taking the literature seriously reinforces the trend that sees libraries, in desperation, canceling subscriptions to journals that they don't see being much read locally. And it doesn't make any sense to have a local copy of a serial in which perhaps one in a hundred titles will ever be examined by anybody in that institution. Some of these journals, de facto, are approaching the point at which they might as well print only one copy, send it to the National Library of Medicine or some other repository, and let it redistribute reprints by interlibrary loan. Here, however, the economics just doesn't add up. The fundamental problem lies in trying to foist an inappropriate number of vehicles on an outmoded mechanism of dissemination--and that would fall of its own weight. You can see what I'm leading to: go from 1,000 to one to zero print copies.

Meanwhile, the libraries are in a great dilemma, trying to figure out exactly what to do. They get a fight from the faculty: from what a librarian hears when he or she wants to drop a journal, you would think that every professor was reading every issue of every journal in the library. Libraries could get any number of technical aids for the operational procedures by which they make sensible decisions about acquisition priorities, but the choice still puts them in a very tough spot. Besides the budgetary crunch, the libraries are also running out of space. The older stuff is deteriorating, anyhow. Maybe ink on paper was not a totally bad idea, for that reason alone, provided one clean copy remains available; unfortunately, things don't always work out that way.

One direction things could take, if we don't reform the system of scientific communication, is that invisible colleges will take over as the principal but unreliable routes of communication.
Archival copies of material will eventually be sent in to some repository, but there will be a limbo of material that doesn't know whether it is going to hell or heaven for four or five years--while it is still cooking and unaccountably available--on a basis far from equitable. So, in due course, there has to be a wholehearted exploitation of the new technologies, and I don't have to plead for it. It's happening, because electronic networks are becoming more and more available to people working in a variety of fields. A couple of dozen fields now operate with a routine exchange of preprints.

The central problem facing the journal has been a radical change in the economics and technology of printing, without an adequate recognition of the essential value added in the print-publishing process. From Gutenberg's time until mechanized and computerized composition, publishing's added value was in providing the capital and the entrepreneurship and the organization to facilitate a process whose product was an expensive and precious printed article. It was characterized by rather high capital investment in the initial composition of any material. Once something was composed, there was a rather low variable cost for further dissemination. We had a market system for determining what was worthy of that degree of capital investment. Well, today the capital investment in the printing technology is almost zero.

Editorial Value

The important value added--an intellectual and aesthetic rather than economic value--is the editorial process, including issues of selection and, subsequently, of editorial work and improvement. And, finally, we have that very precious imprimatur. When something comes out in a journal of high repute (to make a circular argument), that's a journal worth my time and worth my attention.
If it is just thrown up in the air without having undergone that kind of editorial review, it will not have been refined in terms of the presentation, and perhaps even the substance, of the argument, and it won't have the imprimatur of other people, whose judgment I trust, attesting that it's worth reading and can be relied upon for accountability. Whether the article then gets into print is almost an irrelevancy at this point. Any of a variety of communication media could follow on that editorial process. What should be promoted is the marriage of that editorial role, on one hand, with, on the other, a production role that uses electronic technologies rather than print.

And that's where the spontaneous bulletin boards don't quite make it. They quickly get filled up with obscenities, literal and otherwise, for lack of that sort of control. I don't mind the obscenities as long as I don't have to plow through them, but I'd like a truth-in-advertising framework that tells me what's worth reading. I'd like to know that x, y, or z editorial committee has been established as a guide to what is worth capturing the priority of my attention. I think it will be the societies that provide the most likely framework for the organization of those editorial functions. This won't make any money to start with. But the economics and the technology will converge with the social necessities for this kind of improvement.

Technically, we don't need much more than what we now have. There are a few problems about transmitting graphics and formatting manuscripts. Some standards have to be established, along with some minor fixes, especially on the graphics. But we are basically right there. Machines with gigabyte storage and ever-smaller 25-megahertz processors are routine today. You will find them by the hundreds in the laboratories and the libraries and so on, with the cost per unit of capability halving every couple of years.
So in 10 years, today's so-called supercomputer will certainly be available in every institution, and to a large degree in every laboratory. Communication links won't grow quite as fast as that, but if you consider the bandwidth of a package of CD-ROMs, you have a variety of technologies for all the communication we need. So those are not limiting factors, either. They are not very expensive. The machinery, the social framework, the decisions involved, the wetware, the distribution channels, the marketing, and so on, really are all that stand in the way.

There are not the same kinds of profit incentives that drive paper publishing, so I think the not-for-profit institutions will start taking over. Perhaps the for-profit publishing houses will provide the essential technical services--because they can have the economy of scale, the organization, the hardware, and so on--under contract to the societies, which would provide the other elements of the equation. That partnership could be a very productive one for the entire scientific community.

One feature of that kind of system, to which we have only a crude approximation today, is feedback, dialectic. It shouldn't take a federal case for reactions to a paper to be elicited from the scientific community--and not just on the rumor network, but some place where everybody else can see them. This is the bulletin-board system of commentary, and it would complement what the fixed board of editors would have to say. If there is a good dialectical system and the critical community has an opportunity to express its views, even ex post facto, that's how the scientific process works at its best. Here the economics and the technology of dialectic give a great edge to the electronic systems over the printed ones, if for no other reason than their ability to provide propinquity.
I mean, if an article has been printed and then, a little later on, I write a critical reaction to it (even in the rare case that the journal accepts that sort of commentary and further dialogue), the article and the reaction do not adjoin one another on the shelves. It's a nuisance trying to find them. Let's say I wrote something six months ago and one of my friends writes a blistering critique sometime after that. How are the two of them going to be brought together? That kind of reshuffling of the units is very hard with printed paper. It's trivial, of course, to do it with electronic media via networks of linkage between material and commentary. That potential for reaggregation stands just after mechanized search and tempo of availability as the greatest advantage that these new kinds of media can offer.

Sharing Information

Let me make one further comment about global access, something very dear to my heart. There was a remark in my letter of invitation to this conference: "You may feel like you are in a flood, but people in the Third World are in a real drought. They never get the journals that you complain of getting too many of." The economics of sharing will shift dramatically with these media. For a trivial marginal cost you can provide 100 CDs a year, which would far exceed the total volume of publication that people in developing countries could ever hope to get in any other way. There is no other way in the world that we can duplicate all the paper libraries that we now have as a privileged treasure.

Another feature of globality that electronic systems will offer is built-in translation aids. I am not talking about the nirvana of automated, perfectly smooth translation. Most of us here have a smattering of two or three foreign languages; a few of you are great linguists. But when I am reading an article in German, in which I am fairly fluent, wouldn't I love to have a built-in dictionary to help out when I run into a phrase that I don't understand?
I'll take the risks of that crude translation. It may come out with some of the ridiculous puns for which machine translations are notorious. Again, this becomes trivially easy in terms of its marginal cost, and it will greatly extend the global accessibility of the literature to a wide variety of people whose command of the current international standard, English, may not be perfect. Fortunately, current systems of scientific communication, like these proceedings and the forum for opinion in The Scientist, will sustain the debate that may bring about the necessary reforms.

================================
THE SCIENTIST ON THE INTERNET
(Page 14 of newspaper)

The Scientist is now available on-line via NSFnet, as files of about 3,000 lines in length. To gain electronic access to this valuable biweekly information source, follow these instructions:

Type: ftp nnsc.nsf.net
Use the login "anonymous"
Use your username@bitnet address as the password
Type: cd the-scientist

Files will be added every two weeks, with file names corresponding to the date of publication. To access the Nov. 9, 1992, issue, for example, type: get the-scientist-921109; to access the Nov. 23, 1992, issue: get the-scientist-921123. Jan. 11, 1993, is available as the-scientist-930111; Jan. 25, 1993, is available as the-scientist-930125; and so forth. For a full listing of issues of The Scientist in the NSFnet directory, type: ls * or type: get index-the-scientist

================================
COMMENTARY by Eugene Garfield
(Page 12 of newspaper)

Electronic Publishing Extends Reach Of Scientists And Of The Scientist

In his essay on page 10 of this issue, Nobel laureate Joshua Lederberg describes the many potential benefits to the science community of electronic publishing.
I certainly concur in advocating its use, as witness my recent decision to mount The Scientist on NSFnet and the Internet. Originally planned primarily to facilitate access to large mainframe computer programs and data files, NSFnet has evolved into a major communications network and a splendid means of disseminating the valuable information our publication presents. Last year I attended a conference on networking at which I had the pleasure of meeting Vinton Cerf, vice president of the Corporation for National Research Initiatives and one of the gurus of networking. Cerf, in turn, introduced me to Corrine Carroll of the NSF Network Service Center in Washington, D.C., who eventually obtained all the necessary clearances for making The Scientist available on NSFnet.

It has been suggested that electronic availability might undermine The Scientist's popularity as a print publication. Although some readers on a tight budget may very well switch to the electronic version in order to avoid the annual subscription fee, I don't foresee that happening on a large scale. At present, only text is included on NSFnet; photographs and other graphics, cartoons, crossword puzzles, display and classified advertising--that is, all of the other valuable components responsible, along with the text, for the publication's increasing acceptance--are omitted.

So why have we gone electronic? It seems to me that The Scientist online serves a different function. First, it overcomes the inherent delay of the postal system. This may be insignificant to many readers, but it might be of real value to those, including science journalists, who can never receive their information early enough. The significance is even greater for scientists overseas, where the delays and cost of postage are much higher. While our European colleagues may have to pay local communication charges to use the Internet, they can access the file once and then redistribute the information locally over their own internal networks.
In order to access The Scientist electronically, a PC and modem are required. Transfer of files is quite rapid. There are two basic modes of access. Either you can transfer the entire file of each issue into your local PC (using file transfer protocol, or "ftp"), or you can request that the NSF info server send one or more issues to your Internet mailbox. In the former case, you are online only as long as it takes to download the contents of each issue. Once you receive an issue, you can browse the file as you would any other ASCII text--scanning the contents page and then skipping to the full text of each article. Even if you are not a hacker and do not spend a lot of your time at the console, you might consider using this facility as a simple method for forwarding electronic copies of articles to interested colleagues. Among other virtues, this eliminates a trip to the copying machine and the delay in using the mail. You could send a fax, but that isn't very convenient unless you can transmit directly from your PC. In any case, regular readers of The Scientist should feel free to post these files onto their bulletin boards. Any reader who wishes to comment directly concerning material published in The Scientist--or on our new experiment in electronic publishing--can contact me, via Bitnet, at garfield@aurora.cis.upenn.edu; or, via CompuServe, at 70550.130@compuserve.com. (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ LETTERS New Spin On Ultracentrifuges (Page 12 of newspaper) The article on analytical ultracentrifugation (Franklin Hoke, The Scientist, Nov. 9, 1992, page 18) was well-written and informative but had one omission. It did not mention the National Analytical Ultracentrifugation Facility, which is part of the Biotechnology Center at the University of Connecticut. This facility was set up by an initial grant from the National Science Foundation in 1988. 
The principal investigators are Emory H. Braswell, Todd M. Schuster, and David A. Yphantis. The facility is staffed with two master's-level scientists and one master's-level engineer. In addition to the new Beckman XLA, the facility maintains four Model E instruments equipped with advanced electronics and cells developed at the facility, increasing the efficiency of the technique more than 20-fold. Although it has been suggested that we are maintaining the most complete working museum of Model Es in the world, for some studies the Model E has not yet been superseded by the new machine. Utilizing software and research strategies developed at the facility, we do complete analyses of self-association and hetero-association of molecules, such as DNA oligomers, proteins, peptides, and inorganic complexes. We are also studying nonaqueous systems, such as buckyballs in toluene. After an initial development period, the facility has been involved in 30 research projects with 18 academic and 12 industrial collaborators. We have worked closely with Beckman Instruments Inc. of Fullerton, Calif., helping them in the development of a data-analysis package, and have been influential in shaping the development of Beckman's current and future centrifuges. The facility provides workshops for researchers interested in applying the new research strategies and analysis techniques in their own laboratories. It's interesting that this old technique, kept alive during the dark ages by a few dedicated researchers, is once again on the cutting edge, this time making important contributions to biotechnology. EMORY H. BRASWELL Head Analytical Ultracentrifugation Facility Biotechnology Center University of Connecticut Storrs Liberal Arts Colleges (Page 12 of newspaper) I am sure my colleagues at liberal arts colleges would join me in thanking you for an insightful article on the pleasures and occasional pains of doing science in a smaller college (Linda Marsa, The Scientist, Nov.
23, 1992, page 21). But the headline you chose, "Doing Science Off The Beaten Path At Liberal Arts Schools," undercuts the message of the article. "Off the beaten path" suggests that our institutions are all in academic Nowheresville. How about "Science In Liberal Arts Colleges--The Path Less Traveled By"? That would make all the difference. BRUCE PARTRIDGE Provost Haverford College Haverford, Pa. A Caricature? (Page 12 of newspaper) Patrick H. Cleveland (The Scientist, Nov. 23, 1992, page 12) laments growing antipathy toward animal experimentation, but he gives the animal protection movement too much credit. The movement is highly fractionated, wastes much of its resources on mass mailings, and commits only a small fraction of its resources to animal research campaigns. In fact, much opposition to animal research is scientific. I doubt I shock many readers of The Scientist when I suggest that many, if not most, animal experimentation projects are poorly conceived. Because animal experiments cannot disprove hypotheses about human anatomy, physiology, or pathology, they do not constitute human medical research in the strictest Popperian sense. Most nonscientists are unfamiliar with Karl Popper, a 20th-century philosopher of science, but many can recognize that "animal models" are often poor analogs to human conditions. Cleveland denounces animal rights, but he does not offer a cohesive alternative ethic. Do we have the right to submit innocent, sentient creatures to pain and/or suffering for supposed human benefit? Many people are concluding that animal research rests on an ethic of "the ends justify the means," which they find unacceptable. I suggest that the notion of a wealthy and violent animal rights movement is a caricature created by animal research defenders as a straw man to avoid legitimate scientific and ethical concerns about animal experimentation. STEPHEN R. 
KAUFMAN Medical Research Modernization Committee Cleveland (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) WHERE TO WRITE: Letters to the Editor The Scientist 3501 Market Street Philadelphia, PA 19104 Fax:(215)387-7542 Bitnet: garfield@aurora.cis.upenn.edu The Scientist welcomes letters from its readers. Only signed letters will be considered for publication. Please include a daytime telephone number for verification purposes. ================================ At The Interface Of Biology And Chemical Engineering (Page 15 of newspaper) BY SCOTT VEGGEBERG Chemical engineers and biologists say that collaborations between their disparate disciplines are beginning to bear some valuable fruit, not just in the applied field of biotechnology, but also in understanding basic cellular mechanisms. Yet those who are currently working at this cross-disciplinary interface say the challenge today is to get these two scientific cultures to interact. One way to bridge such a gulf is to have a meeting and invite interested members from each discipline to attend--which is exactly what has recently happened. "Research Opportunities in Biomolecular Engineering: The Interface Between Chemical Engineering and Biology" was convened in Washington, D.C., on December 7-8, under the auspices of the National Institute of General Medical Sciences (NIGMS). Says George Georgiou, a chemical engineer at the University of Texas, Austin, who co-chaired the meeting: "The idea of the meeting, in my mind, was to make more people aware of the interface that already exists and attract them to work at that interface and to build momentum for federal support of this work." Georgiou, for instance, has crossed the disciplinary barrier to collaborate with UT microbiologist Charles Earhart as well as fellow chemical engineer Joseph Francisco.
Together they have produced Escherichia coli bacteria that instead of excreting proteins--as is the usual case in the biotech industry--anchor important enzymes, such as cellulases, to their surface (J.A. Francisco, et al., Bio/Technology, in press). At the December meeting, Georgiou attempted to define "biomolecular engineering," the new term that's been coined to describe the emerging body of work coming from the interface of biologists and chemical engineers. "It may be argued that biomolecular engineering is applied biology or, more precisely, applied molecular biology to distinguish it from areas of applied research related to organismic biology," he told the attendees. "Yet, this definition is inadequate and self-limiting. It is best to think of biomolecular engineering as an area of scientific endeavor which is characterized by the following elements: emphasis on the analysis of model systems of obvious significance to medicine and biotechnology; the synthesis of information and research approaches from disciplines such as cellular physiology, genetics, physical biochemistry, and chemical engineering; and the translation of biological information into a quantitative framework." The translation of the biological into the numerical seems to be the most valuable element of this interface of fields, according to many researchers who attended the meeting. "Chemical engineers all have a certain quantitative bent that exists in their training that doesn't exist in the training of biologists," says Jonathan King, a highly regarded molecular biologist at the Massachusetts Institute of Technology who works closely with chemical engineers. About seven years ago, he says, he was looking at problems associated with protein aggregation. The biotechnology industry was just beginning to scale up production of its products but was encountering problems of proteins that have the intended amino acid sequence but don't fold up properly and hence aren't bioactive.
King says he found few biologists or biochemists who were interested in or capable of working on these "scrambled egg" messes of misfolded proteins. "Initially, we were only able to talk to chemical engineers," he says. "They're used to dealing with polymers, and the problems you get when you scale up [production] are their stock in trade." He found that a mind-set problem hindered his interactions with biologists. "Biologists want to win Nobel Prizes and chemical engineers want to build factories. They have very different standards of achievement," he says. Douglas A. Lauffenburger, a chemical engineer at the University of Illinois, Urbana-Champaign, chaired a session on cell and tissue engineering at the December meeting. He is excited by the possibilities the melding of these two disciplines can engender. "An engineer is taught over and over again to think quantitatively and in terms of dynamics and kinetics, while a biologist is taught how to think descriptively. It's incredibly complementary. The best work in biomolecular engineering is going to be done at the interface and by two people, instead of one." But bringing these two scientific cultures together is a continuing challenge, says King. MIT's Biotechnology Process Engineering Center, funded by the National Science Foundation, has helped biologists and engineers coalesce. Janet Westpheling, a microbial physiologist at the University of Georgia in Athens, says the best physiologists are among chemical engineers because "they're interested in directing carbon flow from one metabolic pathway to another." She explains: "It's better for me to make contact with a chemical engineer who knows what he's doing than trying to be a mediocre chemical engineer. But biologists are going to have to make an effort to learn the language of chemical engineering."
Before trying to make contacts, she recommends reading the highly cited textbook Biochemical Engineering Fundamentals (New York, McGraw-Hill Publishing Co., 1986), written by California Institute of Technology chemical engineer James E. Bailey. One problem that Westpheling and others note is that making contacts is difficult because chemical engineers and biologists rarely attend each other's meetings and don't publish in or read the same journals. That's why the NIGMS biomolecular engineering meeting was so useful, they say. It was a gathering place for biologists and chemical engineers--basic researchers as well as industry-based scientists. Lauffenburger says these cross-cultural conclaves are vital as ways of getting in touch with other researchers interested in working at the interface of disciplines. In fact, it was a workshop four years ago that sparked what has turned out to be quite a valuable collaboration. He says the organizers of the "Workshop on Mechanisms of Protein Trafficking," held at the University of California, Davis, in March 1988, "went out of their way to bring a diverse group together." At Davis, he met cell biologist Steven Wiley from the University of Utah Medical Center in Salt Lake City. Each man was quite familiar with the other's work, but they had never met or talked. Wiley says he is best known for a paper in Cell that laid out a quantitative model for how the epidermal growth factor receptor works (H.S. Wiley, et al., Cell, 25:433-40, 1981). A more detailed paper appeared the next year in the Journal of Biological Chemistry, and has been his most popular work, garnering 180 citations as of mid-1992, and reaching its peak at 30 citations in 1990 (H.S. Wiley, et al., JBC, 257:4222, 1982).
To develop this model, he took what was then known about the molecular "parts" of the EGF receptor and the rates at which they functioned, and he then built a mathematical model that could predict how the cell would respond if certain parameters were changed, such as concentration of hormone or number of surface receptors. While this paper garnered a lot of attention when it was published, as time rolled on and research progressed on EGF receptors, more "parts" had to be added to the model. That's where his mathematical abilities ran out. And that's also when a lot of biologists became increasingly skeptical of the value of this mathematical modeling approach to receptor study. Kindred Souls "Then I met Doug Lauffenburger, and suddenly I had a kindred soul," he says. Together they've been able to advance this modeling approach to understanding the basic science of receptor function (H.S. Wiley, et al., JBC, 266:11083-94, 1991). "What I'm trying to add," says Lauffenburger, "is a quantitative understanding of rates and magnitudes of signals. Most people in the field are trying to identify components. Identifying components is critical but insufficient for understanding the signal because they depend on magnitudes." One use for the model has been to guide their thinking on how an excessive number of receptors leads to a transformation into a cancerous cell. What apparently is happening is that these receptors choke the system that recycles them, shutting down the regulatory pathway, thus leaving the cells out of control, not overcontrolled as might be expected. Wiley says his newest biomolecular engineering project is to attempt to develop an "autonomous bioreactor system." Under a grant from NSF's biotechnology program, he is working on a system of cell cultures that would not only produce valuable proteins but also supply its own growth regulators, obviating the need to supply complex and costly serums containing these factors.
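The flavor of this kind of receptor modeling can be conveyed with a toy calculation: simple receptor-ligand binding kinetics integrated numerically, then queried with "what if" changes to the parameters. The equation and all rate constants below are illustrative placeholders, not the published EGF-receptor model:

```python
def bound_receptors(r_total, ligand, kon, koff, dt=0.001, t_end=50.0):
    """Integrate dC/dt = kon*L*(Rtot - C) - koff*C with Euler steps.

    Treats ligand concentration L as constant (assumed large excess)
    and returns the number of bound complexes C at t_end. Parameter
    values passed in are purely illustrative, not measured constants.
    """
    c = 0.0
    for _ in range(int(t_end / dt)):
        dc = kon * ligand * (r_total - c) - koff * c
        c += dc * dt
    return c

# One "what if" such a model can answer: how does the steady-state
# number of bound complexes change if surface receptors double?
print(bound_receptors(50000, 1e-9, kon=1e8, koff=0.1))
print(bound_receptors(100000, 1e-9, kon=1e8, koff=0.1))
```

Adding more molecular "parts" to such a model means adding more coupled rate equations, which is exactly where, as the article notes, an engineer's training in dynamics and kinetics becomes complementary to the biologist's.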
A project intended to bring about increased interest in work at the interface of disciplines is a book coming out this spring by Lauffenburger and coauthor Jennifer J. Linderman, also a UI chemical engineer. In the forthcoming Receptors--Models for Binding, Trafficking, and Signalling (New York, Oxford University Press) they write: "We intend this book to build a bridge between cell biologists and engineers over the ground that can be called quantitative cell biology or cellular bioengineering. Our aim is to communicate how insights can be gained into the relationship between receptor/ligand molecular properties and the cell functions they govern, by a judicious combination of cell biology experimentation and quantitative engineering models." The bulk of funding for studies in biomolecular engineering has traditionally come from NSF, with additional money available from the Whitaker Foundation, which now has an office in Washington, D.C. (Scott Veggeberg, The Scientist, March 30, 1992, page 21), as well as from the biotechnology industry, says Michael Shuler, a chemical engineer at Cornell University. The National Institutes of Health has provided some funding in support of biologist-engineer collaborations through its Interdisciplinary Biotechnology Training Grant Program. But otherwise, grant review committees have apparently been prejudiced against anyone with a chemical engineering title, Shuler says. "Funding committees don't really seem to understand the purpose of what's being done," he says, noting, however, that the December conference has probably been helpful in breaking down this bias. "There were probably people at NIH who had certain questions about the quality of science in this community, and where the interface may be," he says. "And I think they came away from this conference feeling the quality of science was very good."
He says while contacts were made and grant officials--he hopes--were convinced of the value of the biology-chemical engineering interface, it will be a couple of years before it's known how effective the conference really was in terms of increased research opportunities. More funding will be critical not only because of the explosion of intellectual interest in these studies but also to take advantage of the field's benefits to U.S. competitiveness in the biotechnology sector. While NIH may try to implement more programs to encourage research at the interface, there's more than life scientist snobbery that's blocking chemical engineers' access to funding, says Marvin Cassman, deputy director of NIGMS. Chemical engineers need to examine their own biases, he says. "If you're going to do well in the NIH peer review and funding decisions, you need to submit proposals that are hypothesis- driven. This is not the way an engineer approaches a research problem," Cassman says. "Also, it's not that there are specific barriers to getting funding at NIH, but it's a mind-set that chemical engineers may not think of NIH." They instead go to NSF first rather than trying to successfully negotiate the NIH landscape, he says. Training Biologist-Engineers In addition to increased funding opportunities, engineers must be trained differently, says Jim Swartz, group leader for bacterial fermentation development at Genentech Inc. of South San Francisco, Calif. He was part of a committee, including such widely respected chemical engineers as MIT's Danny Wang and Harvard University's George Whitesides, that produced Putting Biotechnology to Work: Bioprocess Engineering (Washington, D.C., National Academy of Sciences, 1992). In this report, they call for a new paradigm for training the next generation of biologically grounded engineers. 
"It is important that bioprocess engineers trained in the next decade have a strong background in biochemistry, molecular biology, cell biology, and genetics. That will facilitate useful communication of bioprocess engineers with the bench scientists who are at the initial discovery stage of biological product research and development," according to the report. "To implement this type of research, cultural changes in the engineering and scientific communities will be required. For example, a doctoral candidate in chemical engineering is often viewed as performing research as a single investigator when, in fact, input from multiple disciplines is essential." In Swartz's mind the ball is now clearly in the NIH court for advancing work at the interface. "I think NIH is an excellent place to foster this research." NIGMS's Cassman says his first order of business is to get the summary papers from the December meeting out in published form, sometime this spring. From there it will be up to other officials at his and other institutes whether they catch the cross-disciplinary fever and move ahead with funding new research and training initiatives. (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ RESEARCH ASTROPHYSICS (Page 16 of newspaper) A.I. Abazov, O.L. Anosov, E.L. Faizov, V.N. Gavrin, et al., "Search for neutrinos from the sun using the reaction 71Ga(nu_e,e-)71Ge," Physical Review Letters, 67:3332-35, 1991. R.T. Kouzes (Battelle Pacific Northwest Laboratories, Richland, Wash.): "The fusion mechanism that drives the stellar furnace was long thought to be well understood. Sophisticated computer models have been developed to describe this process, and experimental tests of the model for our sun have been carried out for more than 20 years.
The `solar neutrino problem'--the deficiency in the experimentally measured flux of neutrinos produced in the fusion process at the center of the sun compared to model calculations-- has cast some doubt on our understanding of stellar energy generation, weak interaction physics, or both. "The Soviet American Gallium Experiment (SAGE), a solar neutrino detector collaboration between the Russian Academy of Sciences and United States institutions led by Los Alamos National Laboratory, is located in the Caucasus mountains of southern Russia and has been operational for more than two years. This gallium-based detector, unlike earlier experiments, is sensitive to the low-energy neutrinos produced by the proton-proton fusion process in the sun's core, a flux closely tied to the observed luminosity of the sun. The experimental measurements to date have shown an apparent deficiency in the low-energy solar neutrino flux, a result also found by a similar European detector (GALLEX). Both SAGE and GALLEX indicate that the number of neutrinos observed experimentally is less than expected from the standard computer models. "These initial experimental results, if verified by several more years of observation and direct neutrino calibrations of the experiments, require a substantial change either in the physics of neutrinos or in our understanding of the stellar energy generation process. One solution to the solar neutrino problem that is of great interest to physicists is the possibility that we are observing oscillations of neutrinos from one type to another. This elegant solution requires that at least one neutrino type have a non-zero mass and that this mass be close to the mass of another neutrino type. Under certain reasonable stellar conditions, the electron-type neutrinos emitted in the stellar fusion process could resonantly oscillate into another neutrino type and, as a result, pass through our terrestrial detectors unobserved. 
Detectors now under construction, such as the one being built at the Sudbury Neutrino Observatory, may provide the definitive data to determine whether the solar neutrino problem is due to faulty astrophysics or new particle physics." (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ OPTICS (Page 16 of newspaper) U. Keller, G.W. 't Hooft, W.H. Knox, J.E. Cunningham, "Femtosecond pulses from a continuously self-starting passively mode-locked Ti:sapphire laser," Optics Letters, 16:1022-24, 1991. Ursula Keller (AT&T Bell Laboratories, Holmdel, N.J.): "Until the late 1980s, advances in ultrashort pulse generation were dominated by dye lasers. These ultrafast laser systems with pulse duration shorter than 100 femtoseconds are relatively large, are maintenance-intensive, and require a laboratory environment with a skilled technician or scientist to operate. An all-solid-state laser technology could provide a compact, reliable, push-button type of laser. The development of diode-pumped solid-state lasers addressed these issues; however, for ultrashort solid-state lasers, a broad gain bandwidth was required. "Renewed research efforts in tunable solid-state lasers produced the first demonstration of the Ti:sapphire laser and, more recently, diode-pumped laser materials such as Cr:LiSAF. The development of such laser materials was crucial to the current rapid developments in femtosecond pulse generation. "One key issue for solid-state lasers is the mode-locking mechanism, which produces ultrashort pulses. Because diode-pumped solid-state lasers typically have about a 1,000 times smaller gain cross section than dye lasers, the traditional femtosecond mode-locking techniques developed for dye lasers did not work. However, within a very short time since 1988, the pulse duration of mode-locked Ti:sapphire lasers moved from picosecond pulses to 15 femtoseconds because of a strong worldwide research effort.
During this time many new mode-locking techniques have been introduced. "For many of these new mode-locking techniques, self-starting is a serious problem. The mode-locking was typically started by temporarily increasing the noise of the laser, which produced a sufficiently strong noise spike to initiate passive mode-locking. Techniques such as banging or misaligning the laser, moving an end mirror, or using an acousto-optic modulator have been used. However, for applications beyond the scientific market, reliability and low cost are the main issues. We concentrated on an all-solid-state passive continuous starting mechanism for which we initially used semiconductor saturable absorbers inside a coupled cavity. However, an intracavity mode-locking technique is more desirable because it does not require active stabilization and the overall cavity design becomes more compact, leading to increased mechanical stability. The coupled-cavity result motivated the invention of a new compact (about 400 microns thick) semiconductor saturable absorber device, an antiresonant Fabry-Perot saturable absorber (A-FPSA), which is the first passive intracavity mode-locker that both starts and sustains stable mode-locking of diode-pumped solid-state lasers (U. Keller, et al., Optics Letters, 17:505-7, 1992). Bandgap engineering of semiconductors allows custom-designed nonlinearities and extension of this technique to other laser materials and wavelengths. We expect the robust nature of an all-solid-state ultrafast laser system will allow for many new applications outside the laboratory and by researchers who do not wish to become laser experts." (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ CELL BIOLOGY (Page 16 of newspaper) C.C. Thompson, T.A. Brown, S.L. McKnight, "Convergence of Ets- and Notch-related structural motifs in a heteromeric DNA binding complex," Science, 253:762, 1991. Catherine C.
Thompson (Carnegie Institution of Washington, Baltimore): "Our analysis of GABP, the heteromeric DNA binding protein, revealed the functional properties of a structural motif found in a diverse array of proteins. One of the subunits of GABP, GABPa, is a member of the ETS family of DNA binding proteins. Although the other subunit, GABPb, does not bind to DNA by itself, it forms a complex with GABPa in which it stabilizes the binding of GABPa to DNA. We found that four imperfect tandem repeats of a 33-amino-acid motif at the amino terminus of GABPb mediate interaction with GABPa. This motif, the cdc10/SWI6 or ankyrin repeat, had previously been identified in a variety of interesting proteins. "Surprisingly, the proteins in which repeats had been identified seemed to lack a common function or even subcellular location. For example, repeats are found in transmembrane (Notch, Lin-12, and Glp-1), cytoplasmic (ankyrin, NF-kB p105, IkB), nuclear (SWI4, SWI6, GABPb), and secreted (alpha-latrotoxin) proteins. Proteins with this motif had been identified in diverse species including fruit flies, nematodes, yeast, and man, indicating that the motif is evolutionarily conserved. Until now, the function of the repeats had remained obscure. "The importance of our results was to demonstrate a distinct biochemical role for the repeats--as modules that form specific dimerization interfaces. Direct evidence for participation of the repeats in protein:protein recognition has been obtained for GABP, ankyrin, and most recently, IkB and related proteins (G.F. Wulczyn, et al., Nature, 358:597-9, 1992; J.-I. Inoue, Proceedings of the National Academy of Sciences, 89:4333-7, 1992; S. Kidd, et al., Cell, 71:623-35, 1992). In all cases, it is likely that the function of the repeats will be to mediate specific, inter- or intra-molecular protein:protein interactions." (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.)
================================ TOOLS & TECHNOLOGY Molecular Biology Reagent Kits Simplify Lab Procedures (Page 19 of newspaper) BY RICKI LEWIS Flip through the pages of the major science journals, and you'll find ads for kits that enable a researcher to do nearly anything imaginable--detection or purification, for example--to nucleic acids, chromosomes, and proteins. Such ads are starting to trickle into the medical journals, also, as physicians begin to replace or supplement traditional microbiology-based tests with DNA probes or monoclonal antibody kits to diagnose infections. With labs in the biotechnology industry and the universities busier than ever and researchers in the Human Genome Project and elsewhere generating a constant stream of new material for kit development, the future market for molecular biology kits looks substantial. A reagent kit, also often called a "system," consists of the chemicals needed to perform a given laboratory procedure, packaged with directions on their use. The catalog from Bio-Rad Laboratories in Richmond, Calif., provides a somewhat sweeping definition of such kits as "systems that enable scientists to recognize and decipher the multiple chemical patterns which are the clues to understanding life, human health, and materials science." What's In A Reagent Kit? "A reagent kit is an assembly of all components, optimized and pretested, so they give good results without the user needing to do a lot of work beforehand," says Gary A. Dahl, president of Epicentre Technologies of Madison, Wis. "The idea is to produce something that will help researchers do the simple things so they can get on with the experiments, which is what they are supposed to be doing," adds Jim Gautsch, vice president of kit development at BIO 101 Inc. of Vista, Calif. Most kit producers also offer flexibility. 
"All of our products are sold in kit forms, and you can buy the reagents separately, but that is more expensive," says Annette Short, customer service representative at Gaithersburg, Md.-based Oncor Inc. Oncor calls itself a "molecular pathology" company specializing in using DNA probes to highlight chromosome abnormalities behind inherited disorders and cancer. "We try to offer laboratories the ability to customize particular needs, not forcing them into a kit," Short says. In molecular biology, choosing a kit is not as daunting as the bulging catalogs might suggest: despite the large number of reagent kits on the market, the procedures they facilitate--and the molecules they target--are comparatively few. The job of choosing an appropriate kit is simplified by first identifying the task to be performed and then matching it with the molecule involved. The most common molecules, of course, are DNA, RNA, and proteins. And the most common procedures are detection, extraction, isolation, purification, synthesis, sequencing, labeling, modification, or mutagenesis. A typical reagent kit includes enzymes, controls, buffers, labeling reagents, possibly DNA probes or antibodies, and the all-important protocol. Very helpful are the newsletters published by reagent kit manufacturers, in which users describe applications of the products. For example, Cleveland-based U.S. Biochemical Corp.'s "Editorial Comments," Oncor's "The Molecular Cytogenetics Newsletter," and La Jolla, Calif.-based Stratagene Corp.'s "Strategies in Molecular Biology" provide helpful hints and new applications. The challenge in the reagent kit business is to stay at the cutting edge of research and be among the first to recognize when a new approach both fills a niche and can save a scientist time if its components are prepackaged together.
One way that a company can do this is to tap into the experiences of its staff, says BIO 101's Gautsch. "I came from the Scripps Clinic [of La Jolla, Calif.], where I spent a lot of time in the lab making basic materials so I could go on [with research]," Gautsch says. "Our kits allow you to do procedures that [otherwise would] slow you down, that often are stumbling blocks." He cites as an example Geneclean, a kit that removes impurities and inhibitors (such as organic solvents, unreacted nucleotides, and salts and proteins) from agarose gels, ensuring that results of subsequent experiments are reliable. Kit Genesis New kit ideas can come from a variety of sources. "We have a scientific advisory board of people in the field either working on projects or keeping their eyes open to what's coming up," says David Bruning, genetics marketing manager at Oncor. "We also have scientists in house who scour papers [in the literature]. We get feedback from going to meetings, information from customers, and from customer service reps." Many considerations enter into deciding which new DNA probes to commercialize as a kit. Would research labs actually use a new probe kit, or are they already using their own versions? How many people have a disorder caused by the gene detectable with a new DNA probe? Would physicians ultimately use the new DNA probe kit to diagnose the associated illness, or do diagnostic kits using other technologies already exist? Are other companies close to developing a kit using the particular probe? Knowledge about the disease in question is important, too. "Sometimes a disease is not well enough characterized to judge if a specific probe is useful," says Bruning. He cites, for example, the phenomenon of genetic heterogeneity, in which a set of symptoms is caused by more than one gene. A diagnostic test based on a DNA probe to one gene will yield a false negative result for patients whose problem is caused by another gene.
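The genetic-heterogeneity problem Bruning describes can be put in rough numbers. A minimal sketch, using entirely hypothetical gene names and case fractions (the article gives no figures): if a probe detects mutations in only one of the genes that can cause a disorder, the test's clinical sensitivity is capped by that gene's share of cases.

```python
# Rough illustration of how genetic heterogeneity limits a
# single-gene DNA probe test. All numbers and gene names are
# hypothetical, invented for this sketch.

def probe_sensitivity(locus_fractions, detected_loci, analytic_sensitivity=0.99):
    """Estimate the fraction of affected patients a probe panel flags.

    locus_fractions: {gene: fraction of cases caused by that gene}
    detected_loci:   genes the probe(s) actually target
    analytic_sensitivity: chance a targeted mutation is detected
    """
    covered = sum(frac for gene, frac in locus_fractions.items()
                  if gene in detected_loci)
    return covered * analytic_sensitivity

# Hypothetical disorder: 70% of cases from GENE_A, 30% from GENE_B.
cases = {"GENE_A": 0.70, "GENE_B": 0.30}

single = probe_sensitivity(cases, {"GENE_A"})           # probe to one gene
panel = probe_sensitivity(cases, {"GENE_A", "GENE_B"})  # probes to both

print(f"single-probe sensitivity: {single:.0%}")  # 69% -- >30% of cases missed
print(f"two-probe panel:          {panel:.0%}")   # 99%
```

Even a technically excellent probe, in other words, yields false negatives for every patient whose disease arises from the untargeted gene, which is why a company weighs how well a disorder's genetics are characterized before committing the probe to a kit.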
Assembling a kit that not many others have thought of is one approach at Epicentre Technologies. This firm set its sights on kits based on the ligase chain reaction (LCR, also known as ligation amplification). LCR is an alternative gene amplification technology based on the ability of ligase to knit together pieces of a gene. It has diagnostic uses--if a gene from an infectious microbe or virus is present in a patient's sample, the ligase will join the pieces, which are then detected. "Our Ampligase kit contains everything necessary for ligation amplification, with the customer's own target-specific set of four oligodeoxynucleotides," says Dahl. The kit includes buffers, ligase, and a positive control template. Another strategy in reagent kit development is to troubleshoot, identifying a problem and solving it. Dahl describes a problem in sequencing DNA that is avoided with Epicentre Technologies' SequiTherm cycle sequencing kit. "DNA fragments with regions rich in the DNA bases C and G tend to be lost, so if you read down the four lanes of a [sequencing] gel, you miss certain areas of the sequence," he says. This is more than just a minor annoyance, because GC-rich areas are often present very near to genes that encode protein, and are also the parts of genes that most frequently mutate, causing disease. Carving A Niche Oncor Inc. is focusing on a fusion of technologies--DNA probes with cytogenetics (identifying chromosomal variants that cause illness). This technology is called "FISH," for fluorescence in situ hybridization, or chromosome painting, and was pioneered at Lawrence Livermore National Laboratory, Livermore, Calif. In FISH, a labeled piece of DNA, the probe, binds to its complementary sequence on a chromosome. This is much more specific than classical chromosome staining, in which dyes are used to stain chromosomes in characteristic patterns reflecting more generalized features, such as GC- vs. AT-rich areas.
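Both LCR and FISH ultimately rest on the same principle: a probe (or oligonucleotide) pairs only with its complementary sequence in the target DNA. A toy sketch of that matching step, with invented sequences that stand in for a real probe and target:

```python
# Toy model of probe hybridization: a probe can bind wherever the
# target strand contains the probe's reverse complement.
# The sequences below are invented for illustration only.

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq):
    """Return the sequence a probe of this sequence will pair with."""
    return seq.translate(COMPLEMENT)[::-1]

def binding_sites(target, probe):
    """List the positions in `target` where `probe` can hybridize."""
    site = reverse_complement(probe)
    return [i for i in range(len(target) - len(site) + 1)
            if target[i:i + len(site)] == site]

target = "GGCTATCGATTACGGATCGATCC"  # hypothetical target strand
probe = "TAATCGAT"                  # pairs with ATCGATTA

print(binding_sites(target, probe))  # prints [4]: one specific site
```

The specificity the article contrasts with classical staining falls out of this exact-match requirement: a sufficiently long probe matches essentially one place in the genome, whereas a dye responds to broad features such as GC-richness.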
FISH is very useful in distinguishing among some blood cell cancers, which are often associated with chromosomal rearrangements. A definitive diagnosis is crucial to determining prognosis and course of treatment, and the diagnosis is made by deciphering the associated chromosomal anomaly. A classic example is chronic myelogenous leukemia (CML), which is nearly always associated with a chromosome translocation that places the tip of one chromosome, number 9, onto chromosome 22 (chromosomes are numbered in size order). This so-called Philadelphia chromosome, named for the city where it was discovered in 1960, juxtaposes the Abelson (or ABL) oncogene on chromosome 9 against a region of chromosome 22 called the breakpoint cluster region, or BCR. When the two genes are apposed, a single "fusion" protein is manufactured under their direction. This protein somehow lifts cell cycle control to cause cancer. Oncor's BCR/ABL Translocation DNA Probe highlights the ABL gene with yellow-green fluorescein and BCR with red rhodamine. If a patient with unexplained fatigue and bruising and a leukemic white blood cell count has the red and green flash next to one another in a chromosome preparation, the telltale Philadelphia chromosome is there, and the diagnosis is CML. "We also have a Translocation 15/17 Probe for acute promyelocytic leukemia, neuroblastoma, and HER-2/neu, which is an oncogene associated with breast cancer," says Bruning. Oncor's Chromosome In Situ System for Dual Color Detection includes reagents to block repetitive DNA so that the probes home in on the single copy genes of interest; reagents for hybridization, detection, and signal amplification; and the probe. The dual color technology is licensed from Indianapolis-based Boehringer Mannheim Corp. Perhaps most exciting is the use of Oncor's probe kits to solve chromosomal mysteries. Vicki E. Powers and Huntington F.
Willard of the department of genetics at Stanford University used the probe kits to identify isochromosomes more definitively than is possible with traditional staining. Isochromosomes have identical arms because, during cell division, the duplicated chromosome splits along the wrong plane. The result is double doses of some genes and the absence of others--an unhealthy imbalance of genetic material. "The information obtained [from using the probes] both aids diagnosis and provides fundamental information on the nature of human chromosome abnormalities," they reported in the company's newsletter. Another chromosomal mystery is being solved by David F. Callen of the department of cytogenetics and molecular genetics at Adelaide Children's Hospital in Australia. He is using Oncor's DNA probe kits to determine the chromosomal origins of ring chromosomes. These tiny rings of genetic material are sometimes present in cells of patients with unrecognized syndromes. Being able to tell which chromosome gave rise to the ring can aid diagnosis. While Oncor reaches both the research and clinical marketplaces, other firms specialize in molecular biology reagent kits for the health care industry. Here, too, innovation, convenience, and time-saving are the goals. For example, there is the PACE 2-Direct Probe Assay for Chlamydia trachomatis and Neisseria gonorrhoeae, offered by San Diego-based Gen-Probe. A dual test is useful because a patient who has one of these sexually transmitted diseases has up to a 50 percent chance of having the other. The kit includes the probes, an activator, a separation reagent, probe diluent, positive control, negative reference, and a wash solution, sufficient for 100 specimens. A single test uses one urogenital swab and replaces standard microbiology culturing. Such kits serve a need because physicians do not have the time or training to develop DNA probes.
A Crustacean Simulation The quintessential molecular biology kit is one that provides a test that cannot be done--at least not easily--otherwise. This is so for the Limulus amebocyte lysate (LAL) test, a standard method for detecting bacterial endotoxin in drugs or on medical devices such as syringes. Endotoxin causes septic shock, which can result in death. The LAL test borrows the distinctive response of the horseshoe crab (Limulus) to encountering endotoxin--its coppery blue blood gels. This reaction was discovered in 1956 by Frederick Bang at the Marine Biological Laboratory at Woods Hole; Bang, with colleague Jack Levin, isolated the amebocytes responsible for the reaction and re-created the clotting in vitro. The LAL test rapidly replaced the older practice of injecting rabbits with suspect drugs and watching for a fever response, which indicates the presence of endotoxin. Today, companies that market the LAL test usually get freeze-dried crab blood from Associates of Cape Cod Inc., Falmouth, Mass. Here, during summers, college students "bleed" horseshoe crabs that hang upside down, the animals' blood collecting in jugs below them. (Only 30 percent of each crab's blood is taken, and the crabs are returned, apparently unscathed, to the sea.) Because most manufacturers of medical devices can't simply pluck a horseshoe crab from a nearby shore and watch its blood clot into blue gelatin, companies such as BioWhittaker Inc. of Walkersville, Md., provide LAL kits. "Our product is derived from the amebocyte," says product manager Maribeth Donovan. "We lyse the cell, and all the components of the blood coagulation system are there. The LAL test consists of lysates of the blood cells. You mix a sample with LAL, and if it gels, it means endotoxin is in the sample." More informative is the company's "chromogenic" version of the LAL test. "Instead of the flipped tube having solid on top, it turns yellow," says Donovan.
"We can quantitate the amount of endotoxin in the sample by the intensity of the yellow compared to a standard curve." Ricki Lewis is a freelance science writer based in Scotia, N.Y. She is the author of a biology textbook and has just completed a human genetics text. (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ LEADING REAGENT KIT MANUFACTURERS Companies discussed in the accompanying article: (Page 19 of newspaper) Bio-Rad Laboratories 2000 Alfred Nobel Dr. Hercules, Calif. 94547 (800) 4 - BIORAD Fax: (800) 879-2289 Products: Call for catalog BIO 101 Inc. 1060 Joshua Way Vista, Calif. 92083 (800) 424-6101 Fax: (619) 598-0116 Products: phage DNA isolation yeast transformation Geneclean DNA purification Mermaid oligomer purification RNaid RNA purification Circleprep plasmid preparation G nome DNA isolation, for genomic DNA BioWhittaker Inc. 8830 Biggs Ford Rd. Walkersville, Md. 21793-0127 (800) 638-8174 Fax: (301) 845-8291 Products: Limulus Amebocyte Lysate Gel-Clot LAL Chromogenic LAL Epicentre Technologies 1202 Ann St. Madison, Wis. 53713 (800) 284-8474 Fax: (608) 251-3199 Products: Ampligase thermostable DNA ligase SequiTherm cycle sequencing AmpliScribe transcription Oncor Inc. 209 Perry Pkwy. Gaithersburg, Md. 20877 (800) 77-ONCOR Fax: (301) 926-6129 Products: Oncor probes Chromosome in situ hybridization systems Probe tech systems for DNA analysis RNA/DNA in situ reagent (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ PROFESSION Judging Interpersonal Skills Is Key To Hiring In Industry (Page 21 of newspaper) BY LISA J. BAIN About five or 10 years ago, good scientific credentials were enough to land a job in industry, human resources experts say. But hiring managers report that times have changed. In today's high-technology companies, teamwork is the key to developing technological products and bringing them to market, they say. 
And effective teamwork requires the ability to communicate both vertically and horizontally through an organization. As a result, interpersonal skills are almost as important as technical competence. Assessing these skills, however, can be difficult, and requires managers to look beyond the obvious. Kirby Vosburgh knows this well. Vosburgh, manager of the Applied Physics Laboratory at the General Electric Research and Development Center in Schenectady, N.Y., recently hired two scientists to fill out a new research group doing advanced work in lighting systems. The people he hired "are both really sharp but very different," he says. "One is a retro-'70s hippie type. The other looks like he came out of business school. What they share is they are both excellent scientists. They have a good natural understanding of their field, they are good communicators, and can do [the science] well." Vosburgh interviewed 40 scientists before settling on these two disparate characters. Although the interviewing process was lengthy--including a seminar conducted by each candidate--Vosburgh says that after 20 years with GE and lots of interviewing experience, he knows about 15 minutes into an interview whether a candidate is the right person for the job. Other scientist/managers interviewed for this article say much the same thing--that intuition plays an important role in making hiring decisions. But human resources professionals urge caution. "People often make decisions on their gut feelings, and that can be a costly mistake," says Melanie Graper, director of worldwide R&D employment for SmithKline Beecham Pharmaceuticals in Philadelphia. Adds Paul Connolly, president of Performance Programs, a human resources consulting firm in Rowayton, Conn., "Typically, scientist-types tend to overhire on technical competence and underhire on personal characteristics. They tend to overestimate the importance of technical knowledge to overall job performance."
As a result, in many companies, scientists screen candidates for technical competence and human resources professionals screen for personal characteristics such as self-motivation and initiative. But Connolly says that scientists can be trained to do this "personality" interviewing as well. "We call it behavior-based interviewing," he says. "It says that the past is the best predictor of the future." For example, an interviewer might ask questions about problems the candidate ran into during his or her research, and how the applicant went about solving the problems. The answer should help the interviewer discern some of the most important qualities he or she is looking for: not only technical competence, but also adaptability, flexibility, independence, and tenacity. Andrew Nichols, a research fellow in cardiovascular pharmacology at SmithKline Beecham, recently went through a training program for interviewers and found it "most useful. It highlighted how you can make mistakes if you don't do a careful job." Nichols recently finished interviewing for a Ph.D.-level research scientist. Through the training program, he realized he wasn't asking specific enough questions. "I was asking vague questions and getting vague answers that were open to interpretation and making assumptions," he says. As further evidence of the dangers of deficient interviewing, Nichols recalls a hiring decision made several years ago that proved to be a mistake: "We hired him very quickly, concentrated on technical abilities, and didn't look into how good he was at working with people. And he wasn't good at working with people." Indeed, interpersonal skills should be emphasized in an interview, scientist/managers say. "In an industrial concern, the ability to work well in a team and to share credit is absolutely critical," says Michael Montague, director of research operations for St. Louis-based Monsanto Co.
There are people who seem spectacular but don't adapt well to an industrial environment, Montague says: "Some people are much better off working as individuals in academia." Kirby Vosburgh says that in a company such as GE, teamwork takes on an added dimension. "We're very diverse, with a broad span of technologies. A successful person may have to hold together scientists from five or six disciplines." For example, GE's recent development of a magnetic resonance microscope required expertise in several fields, including magnetic resonance imaging and high-temperature superconducting probes. "We like people with broad interests," says Vosburgh. "A lot of the commercial applications come at the boundaries between technologies. People have to be interested in communicating with people who wouldn't normally be seen as collaborators." Of course, technical competence is a major factor in hiring. But many scientist/managers say that specific skills are less important than a candidate's ability to adapt and learn. In biochemistry, enzymology, or protein purification, for example, the information and skills are readily transferable, says Monsanto's Montague. Finding The Best Fit Identifying the candidate who best fits the job profile takes time. Many employers start by placing ads in major publications. Others send recruiters out to interview candidates on college campuses. Still others rely on word of mouth. Technical people usually scrutinize the candidates' curriculum vitae. Montague says that experts in an applicant's particular field of science, from both inside and outside the company, may be called upon to evaluate the quality of the work, whether the research addressed good questions, and the sophistication of the scientific approach. "We're looking for evidence of the potential to be an independent investigator," says SmithKline Beecham's Nichols. Publications are important--how many, in what journals, and how many with first authorship.
"We expect to see an average of at least three publications per year for an entry-level Ph.D. We also look to see whether they've gained independent funding. That's an indicator of how good they are." Nichols says the publication expectation is not a strict one. "If they have worked in a field that's technically very difficult or if they've had experience that we would really like to have, we might consider them even if their c.v. looks weak. For example, in vivo experiments can take much longer, and we would look at them slightly differently. But when [candidates] don't have many publications, we want to know why." Because of the importance placed on independent research experience, postdoctoral fellows are often the preferred candidates. But some people move about within the industry. At GE, says Kirby Vosburgh, "we apply the professional [yardstick] that's appropriate--publications if the person comes from academia and patents if he comes from industry." For lower-level scientists--technicians and B.S./M.S.-level junior researchers, the balance shifts somewhat, with more importance placed on technical skills and less on independent research. But even at this level, says Nichols, "someone who has initiative and independence makes life much easier." Once the field of candidates has been narrowed, the long interviewing process begins. "During the interview, we try to see whether our hunches are correct," says Nichols. Among other things, the interviewers will try to determine how much of a research project the candidate actually conducted himself or herself. Particularly when the person comes from a large lab, this may be difficult to ascertain. Many interviews may be scheduled, often with scientists from different areas. At SmithKline Beecham, the typical Ph.D.-level candidate will have eight to 10 interviews, says Graper. In addition, some less formal interactions, such as lunch meetings with peers, may be scheduled. 
This benefits both the employer and the candidate. "When a person comes in for an interview, they're interviewing us too," says Monsanto's Montague. "The chemistry has to be right." Adds another scientist/manager, speaking on condition of anonymity, "It can make or break your lab. A small lab can be destroyed by a personality problem." Managers say they commonly ask candidates to describe examples of things that went wrong in their research and about bad interactions they've had with other scientists. Nichols says it's a "sign of integrity" when a person can admit to things he or she couldn't do. Further, Nichols notes that people who say they have not had bad interactions are probably not being honest; and if they say they've had lots of bad interactions, they're probably difficult to work with. Other qualities are assessed during the interviews as well. For Ph.D.-level scientists, leadership potential is important. "Is there an indication that the person provides seminal ideas for others?" asks Montague. Self-confidence is another important quality, as are closure skills--the ability to see a project through to completion. As part of the interviewing process, candidates are often asked to present seminars to groups of scientists at the company. This is where they can really be grilled. "We ask very aggressive questions to see how they respond and to see if they can get along with people," says Nichols. The third phase of the evaluation process involves reference checking. Nichols says that at SmithKline Beecham, letters are written to all people named on a candidate's reference list asking for comments about the applicant's technical abilities, initiative, interpersonal skills, motivations, and strong and weak points. "If something comes up in the reference that we didn't pick up in the interview, we'd want to explore that further," says Nichols.
Connolly says that human resources professionals are less likely than scientist/managers to step into legal problems when asking questions that pertain to personal circumstances. "You have to show that your question is job-related. You have to tell them about the job's expectations." For example, if a job requires extensive traveling, asking candidates if they are able to travel is all right; but asking what they'll do with their children if they have to travel is "clearly out of bounds." Connolly says that hiring scientists is not much different from hiring other types of professionals, nor is there much difference among scientific disciplines. "The content of the answers might shift, depending on the discipline, but the approach is basically the same." In all cases, he says, the standard interviewing advice is identical. There are three factors to consider, he says: " `Can do'--that's technical; `Will do'--that's motivational; and `Fit'--that's cultural." Adds GE's Vosburgh: "Some people like industry, but they don't like big bureaucracies. And they may have difficulty adjusting to different motivations for doing industrial research. The work is the same, but the motivation is different--product, not publication." Lisa J. Bain is a freelance science writer based in Philadelphia. (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ PROFESSION Association Provides Seed Grants For Clinical Chemists (Page 22 of newspaper) BY JAMES M. WEIL Clinical chemists relate the chemical composition of tissue and body fluids to different illnesses--a critical step in the diagnosis and treatment of diseases. For them, as for investigators in most other fields today, transforming an idea into reality requires extensive funding. 
Sylvia Daunert, an assistant research professor at the University of Kentucky, is conducting research involving time-resolved fluorescence, creating more selective and sensitive assays for biomolecules and other biological compounds by using complexes of certain elements, such as europium, with fluorescent compounds. These complexes fluoresce much longer than the compounds naturally present in biological and clinical samples, and they bind either to antibodies or to the biomolecule being studied. Because the signal persists, its measurement can be delayed until the short-lived background fluorescence has faded--hence the name "time-resolved fluorescence." Daunert was one of two recipients of $5,000 grants awarded last year by the Van Slyke Society, the philanthropic branch of the 40-year-old American Association for Clinical Chemistry (AACC), based in Washington, D.C. Despite its name, AACC includes members from countries outside the United States; the association currently has more than 10,000 members worldwide. Daunert says that without such funding, ideas such as hers would never come to fruition: "Five thousand dollars is not a lot of money, but it's enough seed money to allow testing of a principle and to collect enough data to apply for bigger grants." Daunert says that most of her Van Slyke money will be used to purchase chemicals and supplies. Founded in 1988, the Van Slyke Society aims to "... promote, encourage, and stimulate the study and application of clinical chemistry," according to the society's mission statement. The society awards two $5,000 grants per year, as well as six educational grants of $1,250 each to support undergraduate summer internships for students interested in clinical chemistry careers. The other $5,000 grant recipient in 1992, Theodore K. Christopoulos, an assistant professor of clinical chemistry at the University of Windsor in Ontario, Canada, is conducting studies on a new amplification system for immunoassays and nucleic acid hybridization assays.
Christopoulos proposes to increase the sensitivity of these assays by taking an antibody to a substance, such as thyrotropin, and attaching it to a gene that produces the CAT enzyme. The CAT enzyme acts as a reporter, generating a color signal that corresponds to the amount of the substance being studied. Instead of contributing only one detectable molecule, the attached gene would direct production of thousands of enzyme molecules. These molecules would catalyze the breakdown of the substrate, and the rate at which they did so would be a measure of how much thyrotropin was in the blood sample. This system would be useful for diagnosing and monitoring thyroid diseases. Also, the system would be used to detect and quantify leukemia-specific mRNA sequences for diagnosing and monitoring chronic myeloid and acute lymphocytic leukemias. Christopoulos came upon the idea while working as a research fellow at Toronto Western Hospital and applied for the Van Slyke grant just before accepting his position at the University of Windsor. "The money helped me more than I could believe," says Christopoulos. "I needed money to set up my new lab right from the beginning. The money helped me to buy the reagents and supplies needed to get started." The research most likely to be considered for funding consists of studies that focus on new ways to determine chemicals in the body and new ways to use such information, says Harry Pardue, chairman of the Van Slyke Research Grants Committee and a professor of chemistry at Purdue University. "We look for proposals with the potential for transportability from the research lab to the practicing clinical lab," says Pardue. Other factors considered are the project's scope, originality, significance to the science, and the soundness of the research plan, as well as how achievable the proposed project is. A lower priority is given to requests for money to be used for salaries or general operating expenses.
Pardue says that 20 percent to 40 percent of the half-dozen or so proposals the committee receives per year are funded. The proposals are reviewed by a six-member committee, consisting entirely of working clinical chemists. The grant funding cycle begins in June and ends in May the following year. The application deadline is March 15, and the project start date should be no later than June 1 the same year. Grants are awarded by June 1 and formally announced in July. Grants are not renewable. For more information, call Christopher Hoelzel, director of marketing and publications at AACC, (800) 892-1400 or (202) 857-0717, or write: Grant Request, 2029 K St., N.W., Seventh Floor, Washington, D.C. 20006. James M. Weil is a freelance writer based in Philadelphia. (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ PEOPLE MIT Meteorologist Is Named First Winner Of New American Geophysical Union Medal (Page 23 of newspaper) Edward Lorenz, a professor emeritus in the department of earth, atmospheric, and planetary sciences at the Massachusetts Institute of Technology, has been named the first recipient of the Roger Revelle Medal, presented by the Washington, D.C.-based American Geophysical Union (AGU). Lorenz received the award at the fall meeting of AGU in San Francisco on December 9. Named for oceanographer Roger Revelle, who died in July 1991, the award honors individuals who have contributed to understanding the processes involved in the Earth's atmosphere, including its dynamics, chemistry, and radiation. Revelle--former director of the Scripps Institution of Oceanography in La Jolla, Calif., now a part of the University of California, San Diego--was best known as a co-founder of the theory of plate tectonics, which states that the ocean floor moves because of an upward flow of heat from the Earth's interior. Honoree Lorenz's work focuses primarily on chaos theory as a way of explaining atmospheric science.
"In modern terminology, chaos has been used to colloquially mean something that is not random but still looks random," he explains. "Chaos is something that is determined by precise laws yet behaves rather unpredictably in any one case." As a meteorologist, Lorenz has applied this idea to the ever- changing weather patterns in order to establish some method of pre- dictability. "The theory of chaos is a lot like the weather. There are precise laws determining the weather but still, it's very difficult to forecast it very far in advance," he says. Ironically, says Lorenz, in order for chaos theory to work correctly, a number of intricate laws must be followed. In the early 1970s he wrote a paper examining the hypothesis that "the flap of a butterfly's wings in Brazil can set off a tornado in Texas." The paper was presented at the AAAS Convention of the Global Atmospheric Research Program at MIT, Dec. 29, 1972. "The idea was that some very minor, small, undetectable influence could lead to something quite detectable after a sufficient amount of time," Lorenz says. "This illustrates the difference between what we think the weather is and what it really is, because we can't observe it precisely." Lorenz received his bachelor's degree in mathematics from Dartmouth College in 1938 and his Sc.D. in meteorology from MIT in 1948, and has taught at MIT ever since. In 1983, he and a colleague, Henry Stommel, received the Crafoord Prize from the Royal Swedish Academy of Sciences. An explanation of his work appears in the academy's journal, Tellus (E. Lorenz, "Irregularity: a fundamental property of the atmosphere," 36A:98- 110, 1984). --Ron Kaufman (The Scientist, Vol:7, #3, February 8, 1993) (Copyright, The Scientist, Inc.) ================================ General Electric Company Taps Imaging Specialist To Head R&D At Its Schenectady Facility (Page 23 of Newspaper) Lewis S. 
Edelheit, formerly the manager of the Electronic Systems Research Center at General Electric Co., has been appointed senior vice president for corporate research and development at the GE Research and Development Center in Schenectady, N.Y. He began his new job on November 2. The longtime GE employee--he first joined the GE R&D center in 1969, after receiving his Ph.D. in physics from the University of Illinois in Urbana--specializes in the development of medical imaging scanners. The Schenectady center is one of 26 R&D facilities run by GE around the United States. In 1991, GE was issued 973 U.S. patents; of them, the R&D center accounted for 313. GE products based on inventions from the center include the "fan-beam" computed tomography scanner, synthetic industrial diamonds, and Lexan polycarbonate resin. Edelheit says that having a vision for the future as well as quick decision-making are perhaps the most essential characteristics of a successful R&D director. "What I see as the key part of my job is getting the right people working on the right projects every day," he says. "We've got to balance off high-risk and low-risk projects; balance off near-term and long-term projects; and balance off going for singles and taking a swing at a home run. I think we live in a world where things move so quickly that time ends up being the most important factor in success." Edelheit hopes to create better pathways for exchanging information among corporate entities. "I'm going to be worrying a lot about breaking down the boundaries between the research lab and the businesses; the marketing, engineering, and manufacturing departments; and the customer," he says. "We've got to work very hard to break down the walls and boundaries between organizations in order to move more quickly." Edelheit received his bachelor's degree in engineering physics from Illinois in 1964.
He then worked at the R&D center, where he was the project manager overseeing GE's first computed tomography scanner, and at the company's medical imaging R&D facility in Milwaukee. In 1986, he left GE to become president and CEO of Quantum Medical Systems, a small start-up medical imaging company that was eventually bought by Siemens Allis Inc. in Cherry Hill, N.J. He returned to GE in 1991 as manager of the R&D center's Electronic Systems Research Center.

--Ron Kaufman

(The Scientist, Vol:7, #3, February 8, 1993)
(Copyright, The Scientist, Inc.)

================================
PEOPLE BRIEFS
(Page 23 of Newspaper)

Roberto J. Poljak, a professor and head of the structural immunology laboratory at the Pasteur Institute in Paris since 1981, has taken a position as director of the Center for Advanced Research in Biotechnology (CARB) in Rockville, Md. Established in 1984, CARB is a research institute founded by the University of Maryland's Biotechnology Institute and the National Institute of Standards and Technology. Poljak is known for developing the first three-dimensional models of key antibodies and antigens (A.G. Amit, et al., "Three-dimensional structure of an antigen-antibody complex at 2.8 A resolution," Science, 233:747-53, 1986). Born and educated in Argentina, he came to the United States in 1958 as a postdoc at the Massachusetts Institute of Technology. From 1962 to 1981 he was a biophysics professor at Johns Hopkins University School of Medicine.

Arthur W. Nienhuis has been appointed director of St. Jude Children's Research Hospital in Memphis, Tenn. Scientists at the 30-year-old St. Jude perform research in pediatric cancer, sickle cell disease, and opportunistic infections associated with AIDS. Nienhuis, formerly chief of the Clinical Hematology Branch at the National Heart, Lung and Blood Institute of the National Institutes of Health, will assume his duties on April 1. At St.
Jude he will be continuing his genetic research as well as investigations in hematology, the study of blood diseases. Nienhuis received his M.D. from the University of California Medical School at Los Angeles in 1968.

(The Scientist, Vol:7, #3, February 8, 1993)
(Copyright, The Scientist, Inc.)

================================
PEOPLE
MIT Meteorologist Is Named First Winner Of New American Geophysical Union Medal
(Page 23 of newspaper)

Edward Lorenz, a professor emeritus in the department of earth, atmospheric, and planetary sciences at the Massachusetts Institute of Technology, has been named the first recipient of the Roger Revelle Medal, presented by the Washington, D.C.-based American Geophysical Union (AGU). Lorenz received the award at the fall meeting of AGU in San Francisco on December 9.

Named for oceanographer Roger Revelle, who died in July 1991, the award honors individuals who have contributed to understanding the processes involved in the Earth's atmosphere, including its dynamics, chemistry, and radiation. Revelle--former director of the Scripps Institution of Oceanography in La Jolla, Calif., now a part of the University of California, San Diego--was best known as a co-founder of the theory of plate tectonics, which states that the ocean floor moves because of an upward flow of heat from the Earth's interior.

Lorenz's work focuses primarily on chaos theory as a way of explaining atmospheric science. "In modern terminology, chaos has been used colloquially to mean something that is not random but still looks random," he explains. "Chaos is something that is determined by precise laws yet behaves rather unpredictably in any one case."

As a meteorologist, Lorenz has applied this idea to ever-changing weather patterns in order to establish some method of predictability. "The theory of chaos is a lot like the weather.
There are precise laws determining the weather but still, it's very difficult to forecast it very far in advance," he says.

Ironically, says Lorenz, in order for chaos theory to work correctly, a number of intricate laws must be followed. In the early 1970s he wrote a paper examining the hypothesis that "the flap of a butterfly's wings in Brazil can set off a tornado in Texas." The paper was presented at the AAAS Convention of the Global Atmospheric Research Program at MIT, Dec. 29, 1972. "The idea was that some very minor, small, undetectable influence could lead to something quite detectable after a sufficient amount of time," Lorenz says. "This illustrates the difference between what we think the weather is and what it really is, because we can't observe it precisely."

Lorenz received his bachelor's degree in mathematics from Dartmouth College in 1938 and his Sc.D. in meteorology from MIT in 1948, and has taught at MIT ever since. In 1983, he and a colleague, Henry Stommel, received the Crafoord Prize from the Royal Swedish Academy of Sciences. An explanation of his work appears in the academy's journal, Tellus (E. Lorenz, "Irregularity: a fundamental property of the atmosphere," 36A:98-110, 1984).

--Ron Kaufman

(The Scientist, Vol:7, #3, February 8, 1993)
(Copyright, The Scientist, Inc.)
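Lorenz's point--that fully deterministic laws can still defeat long-range forecasting--lends itself to a short numerical sketch. The program below is an editorial illustration, not part of the article: it integrates the three-variable convection model from Lorenz's 1963 work with a simple Euler scheme, and the step size, starting point, and function names are all arbitrary choices made here.

```python
# A sketch of the sensitivity Lorenz describes: his 1963 three-variable
# convection model, run twice from starting points that differ by one
# part in a billion. The equations are fully deterministic, yet the two
# runs eventually bear no resemblance to each other -- the "butterfly
# effect" of the 1972 talk quoted above.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one step by forward Euler."""
    x, y, z = state
    return (x + sigma * (y - x) * dt,
            y + (x * (rho - z) - y) * dt,
            z + (x * y - beta * z) * dt)

def max_separation(eps, steps=3000):
    """Largest x-distance between two runs whose starting x differs by eps."""
    a = (1.0, 1.0, 1.0)
    b = (1.0 + eps, 1.0, 1.0)
    worst = 0.0
    for _ in range(steps):
        a, b = lorenz_step(a), lorenz_step(b)
        worst = max(worst, abs(a[0] - b[0]))
    return worst

print(max_separation(1e-9))  # an undetectable nudge grows macroscopic
print(max_separation(0.0))   # identical starts stay identical: 0.0
```

The tiny initial difference is repeatedly stretched by the dynamics until it saturates at the size of the attractor itself, which is exactly why, as Lorenz says, the weather cannot be forecast "very far in advance" from imprecise observations.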
