Saturday, August 29, 2009

In Vino Veritas

Winemakers disappointed by organic methods have turned to biodynamics as the purest route to wine that's true to soil, grape, and climate.

By Corby Kummer

For years the question in winemaking was how technology could make wine better. This was especially true if the wine was Californian. When California cabernet sauvignon bested the best of Bordeaux--in a legendary blind tasting, the "Judgment of Paris," convened by the English wine merchant Steven Spurrier--it was a moment of great national pride at the time of America's Bicentennial, and it was achieved in part because California winemakers had used technology in ways tradition-bound French winemakers would not. As California wine became respectable, Silicon Valley millionaires bought vineyards in Napa and Sonoma counties. California wine and tech soon enjoyed a happy marriage.

Insectaries are natural habitats for beneficial insects that control pests. The Benziger Family Winery’s main insectary is planted with more than 50 kinds of plants and flowers.
Credit: Benziger Family Winery
RESOURCES:
Benziger Estate v.2006 Tribute
Sonoma Mountain
14.5% alcohol
$80.00
www.benziger.com

Two generations of winemakers came out of the University of California at Davis armed with the latest knowledge of clones, viticulture, and gas chromatography. With their chemical toolbox, they could fix any flaw--a dry year, overripe grapes left on the vine a day or two too long, sour wine. The descendants of the original Hungarian and Italian immigrants who first planted grapes in Napa and Sonoma may have been slow to sign on to the new methods, but not the high-tech grandees who were living the California dream by buying land and putting their names on bottles of wine. New money is always attracted to old vineyards (even if California's vineyards aren't really that old).

Like most activities the very rich are drawn to, winemaking is highly subject to fashions. The current fashion is a practice that was far on the fringes even 10 years ago: biodynamic farming, ever so much more authentic and true to nature than plain old organic. It's the realization of what an increasingly vocal minority of winemakers, particularly in France, began calling for in the 1980s--utterly unmanipulated wines, with no corrections, no adjustments, no filtering, and no chance to compensate for a mistake made during the growing season.

That true reflection of the air, rain, sun, and soil of a place is what's meant by terroir, the cachet-laden term being slapped on every local food these days. Biodynamic farming, says the studiedly eccentric, preternaturally persuasive California winemaker Randall Grahm, "is the royal road to terroir."

This approach sounds completely in tune with Slow Food, the movement (about which I wrote a book) that since the 1980s has called for a return to growing and production methods dictated by nature, place, and subsistence economics. These are the methods that gave rise to the world's great artisan foods and wines in the centuries before artisan was needed to indicate "nonindustrial," when organic was the default.

Biodynamic principles in fact predate organic farming, although both were reactions to the rise of nitrogen-laced fertilizers in the early 20th century. In 1924, the Austrian-born philosopher Rudolf Steiner gave a series of lectures on farming as it related to anthroposophy, the movement he founded upon Goethe's scientific works. Anthroposophy attempts to unite science, the arts, and the spiritual and invariably views the part in the context of the whole, up to and including the cosmos. It survives in applied "daughter forms" that include the Waldorf schools--and biodynamic farming.

Steiner's followers argue that Sir Albert Howard, the British botanist who pioneered organic agriculture after observing Indian farming practices; Lord Northbourne, the agronomist who coined the term organic farming in his 1940 book Look to the Land; and the publisher J. I. Rodale, who popularized it in the United States, were simply building on and codifying his ideas. As organic farming is now defined in government standards, however, the important things are what you don't do: apply chemical pesticides and fertilizers to crops and soil. But farmers can draw on a whole range of nonchemical surrogates for the chemical correctives they give up. It takes three years to gain full organic certification, as the land detoxes, and then it's relatively straightforward.

Biodynamic, though--that's really hard. Steiner, who gave his lectures on the home farm of a count who had an estate in what is now Poland, viewed farms as living, unified organisms that should be completely self-sustaining. Maintaining that standard means paying daily attention to exactly what's happening in your vineyard and your dirt. It means not buying the pest fighters and fertilizers that still get delivered every season to organic farms. It means constantly rebuilding soil for the future. It means not planting a good portion of your land at all--and if you're in Sonoma or Napa, that's some of the most expensive farmland on earth--and raising cows, sheep, goats, chickens, and the other animals that keep a farm thriving and independent.

A biodynamic label can differentiate a wine from the passel that are already organic. But the term hasn't quite reached the point of conferring bragging rights. Also standing in the way of status is the hippie image. Biodynamic farming involves using sprayed applications meant to encourage growth and keep pests in check, composts infused with various herbs in homeopathic quantities, and a bunch of shamanistic, ridiculous-sounding "preparations" based on a too-literal reading of what Steiner, observing life on Central European farms, mentioned in his few writings on farming.

The Benziger Family Winery, in Glen Ellen, is a postcard-perfect biodynamic farm, and the people who run it speak with the air of calm longtime converts--unlike several winemakers I talked to on a recent visit to Sonoma and Napa, who were slightly scary. When it comes to hearing about some biodynamic practices--burying manure in a cow horn in autumn and digging it up in spring; burying oak bark in a goat's skull; using stags' bladders and cows' intestines as casings for herbs; planting and picking on "root, leaf, flower, and fruit days" shown on lunar farming calendars covered with zodiac symbols--it can be hard to tell the difference between calm and zealotry.

Glen Ellen is a relic of an era when a family of normal means could buy a beautiful piece of land and grow grapes. Mike and Mary Benziger bought the property in 1980 with the help of Mike's father, Bruno, a wine and spirits importer. Bruno and his wife moved there a year later, and other siblings followed. It was "quite mediocre" wine, Mike Benziger says, that made them change their farming ways: "We'd killed what would have been native yeasts"--the naturally occurring organisms long beloved of sourdough-bread bakers and now of winemakers--"through years of using herbicides, so we had to add lab yeasts." The soil was "very like dirt balls or talc," and was strangely quiet: "You just heard the wind in the vines. It was a green desert." Now, Mike says, the soil is "almost cakelike--like brownies."

The view as Mike talks, from a hillside vineyard across to another hill, is a patchwork of zinfandel vines, lavender, rosemary, and olive trees. Demeter, the international certification program for biodynamic farming (it has branches in 43 countries), requires that 10 percent of a farm's land be uncultivated--not as much as the percentage that would have to be wild, or reserved for grazing, in a truly self-sufficient farm, but enough to scare off farmers who profit from a single crop, however much they dislike monocropping.

Although many biodynamic vineyards do not have enough cows to make all the manure they need (and are thus not the truly closed system that is the biodynamic ideal), the Benzigers' three cows are sufficient to their needs. They also raise sheep that clean fields by eating weeds, and grow vegetables that renew soil by providing cover crops--and provide beautiful purslane, lamb's-quarter lettuce, and fresh peas to be sold to Ubuntu, an organic vegetarian restaurant in downtown Napa that is the talk of the food world. And this being a postcard, the farmer who delivers those vegetables is a photogenic straw-hatted college grad married to a former cook at Chez Panisse (and who, by complete chance, tested recipes for my book on Slow Food).

Cow horns are filled with manure and buried through the winter to create biodynamic preparation 500, which promotes root growth.
Credit: Benziger Family Winery

The Benzigers are quick to point out that they use satellite imagery and sophisticated soil analysis and winemaking technology to verify their low-tech methods. The high-tech-low-tech seesaw they boast about--and also the high-tech money that finances low-tech methods all through Sonoma and Napa--is on equally scenic display at DaVero, a farm just outside Healdsburg, the Napa-fying but not completely Napafied main town of Sonoma County. DaVero is kept alive by the money its owner, Ridgely Evers, made developing QuickBooks software. Its chief product is olive oil, and it leaves much more than the required 10 percent of its land open--60 percent, Evers claims.

Evers gives at least one compelling reason for paying to be certified as biodynamic rather than organic: it's good marketing. Biodynamics can deliver on the promise that organics make but don't keep: as Evers succinctly summarizes it, food that's "sustainably and responsibly farmed near where you live." That, indeed, is the idea that started Slow Food in the 1980s and made it into an international movement in the '90s, and that made locavore the New Oxford American Dictionary's 2007 word of the year. And it's the promise that got buried in the years leading up to the USDA's National Organic Program (NOP), which finally set a single national standard for organic certification after years of state-by-state definitions. "They didn't codify best practices," Evers says, in an undiplomatic summary of what many farmers think of the USDA's approach. "Lobbying organizations came in, and now the NOP is so far from what people think organic means as to be a joke."

Many of the vineyard owners and farmers I talked to called biodynamic the new organic. And unlike early organic-farming associations, Demeter is taking no chances that the standards it's using will be watered down. It has registered a trademark in the United States on the word biodynamic itself. Now its work will be to make consumers understand the meaning of biodynamic farming and its stricter-than-organic rules.

Interest in biodynamic farming is growing, chiefly among winemakers. Disillusion with big industry's encroachment on organics and desire for a marketing edge have led Demeter's U.S. membership to triple in the past five years, according to Elizabeth Candelario, Demeter USA's marketing director.

One reason winemakers are more drawn to the biodynamic label than the organic is that they outright reject organic winemaking methods (though not organic farming methods). "The organic law in the U.S. is not sustainable for winemaking," says Larry Stone, a legendary sommelier who is now pursuing his boyhood dream of winemaking as the general manager of the very successful Napa winery Rubicon, owned by Francis Ford Coppola. The problem, he explains, is that the standards for organic wine were written at the same time that high sulfite levels in salad bars were causing health problems. So the permitted levels of sulfites in wine--10 parts per million--are much lower than the European standard of about 50 parts per million. Good news for migraine sufferers who think sulfites are triggers. Bad news for red-wine makers: "It's almost impossible to make wines, especially red wines, that can withstand the ravages of oxidation after a year," Stone says. Thus the great disparity between the number of organic vineyards and the number of organic wines. Biodynamic standards for sulfites are in line with European ones for organic wine--which gives Demeter a big market opportunity.

But is it even possible to tell that a wine is biodynamic? In particular, does biodynamic wine taste any better?

You won't find reason to suppose so in the winemaking itself. Biodynamic certifiers dictate no loony methods for making wine, though the calendar for propitious times to make it--those leaf and fruit days--strikes many winemakers as superstitious. The winemaker's skill, or lack of it, determines the taste of any wine. And so, of course, does the quality of the vineyard.

But the argument that biodynamic farming produces better-tasting grapes is easy to make and easier to test. Chefs including Jeremy Fox, at Ubuntu, say that biodynamically grown fruits and vegetables are more likely than organic ones to taste of themselves--that elusively pure and focused flavor cooks always pursue. When Jim Fetzer, part of a family that has adopted the slogan "The Earth-Friendly Wine," converted his 160 acres to biodynamic practices to sell grapes to the Fetzer winery (now owned by a conglomerate), Benziger and others called the resulting grapes the most beautiful they'd ever seen.

The wines Benziger makes from grapes grown in its own biodynamic vineyards are highly regarded--particularly the Benziger Estate Sonoma Mountain V.2006 Tribute, a cabernet blend the winery introduced five years ago as a "tribute" to biodynamic farming. It has a surprisingly shy nose for a wine that, as is the California custom, is a high 14.5 percent alcohol. It is delicate on the palate, too, because it includes cabernet franc, merlot, and petit verdot, and it is far cleaner and less oaky than the California norm. Tribute is deceptively luscious; it's so clean in the nose and light on the tongue that only after a while does the deep fruit creep up on you and make you want more, much more--very unlike the usual brassy, heavy, overripe California cab. It's not cheap (about $80 in retail stores), but the French style will appeal even to timid merlot drinkers. Is Tribute so good because of the sympathetic family's beautiful property and admirable farming methods? Maybe. It's certainly because they know how to make wine.

Many grape growers in both valleys are sold. David Bos, a young farmer with midwestern roots and the evangelical air of the religion major he once was, extols the advantages of biodynamics; all five of the vineyards owned by Grgich Hills, the Napa winery he works for, are Demeter-certified. "People ask if it makes economic sense," he said when he took me to one, near Yountville in Napa. (Several farmers said their initial changeover to biodynamics cost them a few thousand dollars an acre over several years.) "But we've seen biodynamics heal our vineyards." Using biodynamic methods, he rescued a blighted vineyard other growers would have torn up. Now, grapes from that vineyard are part of his esteemed Yountville cab. "We've been making 300 to 400 cases a year," he says. "We sell it only through our tasting room, and it sells out at $135 a bottle."

Vintage '70s farmers move quickly from the realm of the practical to the spiritual. Michael Sipiora, for instance, farms at Quintessa, a spectacular property on the Silverado Trail. He knows wine; he farmed the vineyards at Stag's Leap before joining the conservation-minded couple who own Quintessa, Valeria and Agustin Huneeus. The difference between organic and biodynamic, he told me, lies in "energy." He went on to talk about Steiner's levels of consciousness: the "etheric" level of plant life, the "astral" level of the animal kingdom, the cosmic and telluric levels of energy we share with animals, the "eagle" level attained by humans.

Sipiora buries crystals and "puts intent" on them. Water--"the great messenger"--is his main theme. His pride is the "flow form," a cascading fountain with double bowls on each level that spins water in opposite "vortexes," charging it with energy; he pipes that water around the property. He makes many of Steiner's preparations himself--valerian, stinging nettle, and chamomile are basic components--and what he can't grow on the property, he buys from the Josephine Porter Institute in Virginia: stag's bladder, oak bark for burying in skulls.

This kind of cultishness drives Aaron Pott crazy. Pott is a winemaker and consultant (formerly for Quintessa) who is planting his own vineyard. He studied at both UC Davis and the University of Burgundy and worked at two chateaux in Bordeaux, so he is familiar with New World and Old. He first encountered biodynamic farming in France and learned more when Quintessa expanded its biodynamic program. He calls many biodynamic preparations "ludicrous" and "medieval."

The problem, he says, is that Steiner wrote little on grapes (just half a page in his agricultural lectures), and his knowledge of farming was based on his experiences in chilly Central Europe--entirely removed from the climate of Napa and Sonoma. Many of the preparations aim to encourage ripening of grapes, whereas in California, overripening is the concern.

Pott doesn't dismiss biodynamics altogether. "The tenets I like," he says, "are those things that say--the way Steiner actually said--'Look at everything that's around you. Use preparations that work. These are things that work for me in the middle of Germany.' You see what's naturally occurring on your farm and use those techniques." Pott crushed leaves of the agave plant, whose interior stays moist in the desert, and sprayed them on vines to prevent sunburn--and "lo and behold, it worked." Why don't others adapt Steiner's philosophy to such pragmatic effect, and discard what is clearly unsuitable to their own climate? He shrugs. "Why don't Christians follow the teachings of Christ?"

In the end, it comes down to faith. Scientific studies comparing organic and conventional farms have shown that organic farms have better soil quality, according to John Reganold, a soil scientist at Washington State University. But studies comparing the soil on biodynamic and organic farms show "mixed results," he says. He has compared soil from adjacent biodynamic and organic vineyards and seen no difference; and although a chemical analysis of grapes revealed some differences, in a blind tasting of merlot wines from those vineyards, wine tasters were stumped. Still, Reganold is an advocate: "Biodynamic farmers observe and are in contact with the crop more often than conventional growers." And, of course, he likes that biodynamic farmers care so much about the soil.

If biodynamic means only that the soil the grapes were grown in will be better for generations to come, that's all right. "There's no money in winemaking, let me tell you," says Jim Fetzer, whose family stayed in property development and grape growing after selling its winery. The money is in the land. Given the undisputed benefits biodynamic farming has for the life of soil, maybe it's a good investment after all.

Corby Kummer is a senior editor at the Atlantic and the author of The Joy of Coffee and The Pleasures of Slow Food.




http://www.technologyreview.com/biomedicine/23201/

A Beacon to Guide Cancer Surgery

A modified virus makes cancer cells fluoresce to better identify tumors.

By Courtney Humphries


Removing tumors from cancer patients always brings uncertainty. Surgeons fear that cells they don't spot and remove might re-emerge. Researchers have been looking for ways to make cancer cells visible so that none is left behind. Some of these strategies rely on injecting fluorescent probes or nanoparticles like quantum dots that will attach to the surface of cancer cells. Now a company is working on technology that makes cancer cells fluoresce from the inside out. The approach, developed by San Diego-based company AntiCancer, in partnership with scientists at Okayama University in Japan, uses a virus that infects cancer cells to integrate a fluorescence gene into tumors. The result is cancer that permanently glows, which the company hopes would allow surgeons to remove tumors with more precision and to monitor any cancer that re-emerges.

A bright idea: Mice carrying tumors from human colon cancer cells were given a virus that causes cancer cells to fluoresce. Researchers were able to visualize the dispersed tumors (in green) and remove them surgically. The small intestine is shown in red.
Credit: Hiroyuki Kishimoto and Robert M. Hoffman, AntiCancer

To make cancer cells fluoresce, the researchers used a virus called OBP-401, a modified cold virus that can enter all cells but will only replicate in those that have activated telomerase, an enzyme that is expressed in cancer cells and allows them to divide indefinitely. Normally a cell can only divide a limited number of times before dying, because at every division it loses part of its telomeres, caps of DNA at the ends of chromosomes that keep the genome stable. But cancer cells can keep dividing because telomerase replaces the telomeres every time the cell divides.

The OBP-401 virus had been developed as an anticancer therapy. Here, the researchers modified the virus to carry green fluorescent protein (GFP), a protein derived from jellyfish that fluoresces green under blue light. When the virus is injected into an animal, the gene becomes active in cells that express telomerase. Robert Hoffman, president of AntiCancer and a surgeon at the University of California, San Diego, explains that GFP is permanently integrated into the genome of cancer cells, making this technology fundamentally different from approaches that rely on attaching a fluorescent particle to a protein on the surface of cancer cells. Hoffman believes that by creating a genetic marker, the approach "takes advantage of the tumor biology more effectively."

In a recent paper in the Proceedings of the National Academy of Sciences, Hoffman's team used the virus to illuminate tumors in mice that were scattered throughout the body. During surgery to remove the tumors, they could visualize them by exposing tumors to light of the proper wavelength and looking through a filter that picks up GFP fluorescence. "In principle it should pick up any cancer cell," says Hoffman. The team has not yet reached single-cell precision, but they are able to see and remove small cancerous areas that would otherwise be invisible.

Although new cancers that form after the virus is delivered would not be fluorescent, Hoffman says that any of the original cancer that began to grow again should still express GFP, allowing clinicians the opportunity to monitor the results of the surgery over time.

Hisataka Kobayashi, a scientist in the molecular imaging program at the National Cancer Institute, says the advantages of this method are that it very specifically targets cancer cells and makes it possible to monitor the cancer over time. The method also allows for flexibility; for instance, a gene that would cause the cancer cells to kill themselves could be added to the virus along with GFP, pairing imaging with treatment.

Kobayashi says that one of the key questions of the technology is safety. Giving patients a virus carrying a gene for imaging is very similar to giving them a gene to correct a disease, he says, and consequently "all the problems with gene therapy apply to this method." Many gene therapy approaches have been stalled because of immune reactions to the treatment. However, Lily Wu, a scientist at the University of California, Los Angeles, who develops cancer therapies, points out that similar gene therapy treatments for cancer have so far been found safe in clinical trials, whereas safety "is still not determined for other synthetic vectors such as quantum dots." Wu believes that this method offers several advantages over other ways of labeling tumors but says that it will require a more thorough quantitative analysis to demonstrate its effectiveness.

Hoffman says AntiCancer hopes to complete further safety testing that will allow it to bring the technology into clinical trials. Although fluorescence in mice can be visualized throughout the body, in humans the task will be more difficult, because the light scatters easily and doesn't penetrate very far into tissues. For that reason, the researchers envision this technique being used during surgery where the tumor can be seen directly.



http://www.technologyreview.com/biomedicine/23311/

Quantum Cryptography for the Masses

A new partnership will make quantum cryptography more widely available.

By Duncan Graham-Rowe


Quantum cryptography could finally hit the mainstream thanks to a deal that will allow customers to adopt the technology without having to install dedicated optical fibers.

Light box: id Quantique's Cerberis quantum key distribution system (bottom) with two link encryption units (above) is now widely available over dark fiber networks.
Credit: id Quantique

Quantum cryptography--a means of keeping secrets safe by using light particles to help scramble data--has been commercially available for several years. But the technology has only been practical for government agencies or large private-sector organizations that can afford the dedicated point-to-point optical fiber it requires. Under the new deal, struck between Siemens IT Solutions and Services in the Netherlands and Geneva, Switzerland-based id Quantique, any organization or individual wanting state-of-the-art data security will be able to buy the complete package of quantum cryptography and cable.

For the commercial development of quantum cryptography it's a significant step, says Seth Lloyd, an expert in the subject and a professor at MIT. "It makes it a lot more commercially viable. The fiber is by far the most expensive part," he says.

Quantum cryptography is a method that seeks to solve the problem of how to securely send cryptographic keys between two parties by encoding them within light particles, or photons. It allows the parties to share a random--and so almost unbreakable--key without fear of third-party interception. If anyone does try to eavesdrop on the key exchange, the mere act of observing the photons changes them, making the attack detectable.

But for this quantum key distribution (QKD) to work, the same photons transmitted by one party have to be received by the other. This means that unlike most optical fiber data signals, which are periodically boosted by repeaters, quantum keys can only be sent through dedicated, unamplified, point-to-point fibers.
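A rough sense of how such a key exchange works comes from the toy simulation below. It sketches a BB84-style prepare-and-measure protocol--random bits sent in random bases, with mismatched-basis results thrown away--and shows how an eavesdropper betrays herself through an elevated error rate. This is an illustrative sketch only, not id Quantique's implementation; the photon count and the simple error check are assumptions made for the example.

```python
import random

def bb84_sift(n_photons=1000, eavesdrop=False):
    """Toy BB84-style exchange: Alice encodes random bits in random bases,
    Bob measures in random bases; matching-basis results form the raw key.
    An eavesdropper who measures in her own random basis and resends the
    photon disturbs roughly a quarter of the sifted bits."""
    alice_bits  = [random.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [random.choice("+x") for _ in range(n_photons)]
    bob_bases   = [random.choice("+x") for _ in range(n_photons)]

    bob_bits = []
    for bit, photon_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            # Eve measures in a random basis and resends in that basis.
            e_basis = random.choice("+x")
            bit = bit if e_basis == photon_basis else random.randint(0, 1)
            photon_basis = e_basis
        # Bob reads the bit correctly only if his basis matches the photon's.
        bob_bits.append(bit if b_basis == photon_basis else random.randint(0, 1))

    # Sifting: keep only positions where Alice's and Bob's bases agreed.
    sifted = [(a, b) for a, b, x, y in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if x == y]
    errors = sum(1 for a, b in sifted if a != b)
    return len(sifted), errors / len(sifted)

print(bb84_sift())                # ~0% error rate: channel looks clean
print(bb84_sift(eavesdrop=True))  # ~25% error rate: eavesdropper revealed
```

In the simulation, the two parties compare a sample of their sifted bits; a near-zero disagreement rate means the remaining bits can be used as a key, while a rate near 25 percent means someone has been measuring the photons in transit.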

Telecom companies have spent the last few years installing precisely this kind of fiber, but for entirely different reasons, says Lloyd. Known as dark fiber, this is essentially extra capacity that has been laid in bulk to accommodate future growth.

Some companies lease this dark fiber for their own secure data connections, but for the most part it's just lying there waiting for deployment, says Andrew Shields, head of Toshiba Research Europe's Quantum Information Group in Cambridge, U.K. "For quantum key distribution, this is a godsend. There is all this dark fiber in the ground right now that's not being used."

In the new deal, Siemens SIS will offer id Quantique's QKD system over Siemens' existing dark fiber. "It's important from a commercial point of view that companies like Siemens, a global player, are showing an interest in this technology," says Grégoire Ribordy, co-founder and CEO of id Quantique. "There's potential to really accelerate commercial development."

Initially it will only be made available to Dutch customers, says Feike van der Werf, sales director of Siemens SIS, but in time may be deployed more widely. "I see this as the first step in the switch to quantum-based security," says Charlotte Rugers, a security consultant with Siemens SIS.

In essence, this deal means that for the first time QKD will be commercialized and marketed like standard IT services, says Ribordy. Dark fiber has become so prevalent that in some countries you have fiber direct to your home, he says. At the moment QKD is still not widely used; its customers are mainly organizations that really care about security. But in theory this new deal means that even individuals could adopt the technology, "if you were really paranoid," he says.

This is an important step that should help bring QKD into the mainstream, says Shields. Previously, customers were forced to source their own dark fiber, either by laying it themselves or getting a telecom to provide it, but this new deal allows them to buy the complete, scalable package. Although some bigger companies may have their own dark fiber, the deal will make it easier for smaller companies to adopt the technology, he says. "There are people out there using it but mostly it's to assess the capability, rather than using it to hide their secrets."

It will still be expensive. Besides the $82,000 price tag for a pair of id Quantique's QKD boxes, the cost of dark fiber remains high, because the customer will have to bear the cost of at least two fibers--one for the QKD and the other with which to send the encrypted data once keys have been exchanged. Normally, the cost of each fiber is offset by having dozens of customers share it, says Shields. But QKD customers will be unlikely to want to share their cables. "I think in the longer term we will need to see QKD integrated with normal telecom fibers." But for now this isn't possible, he says. Quantum signals are very weak and classical data signals are very strong, so there is a danger they will be drowned out. Once this problem has been solved, QKD should become even more attractive, he says.


http://www.technologyreview.com/computing/23317/?a=f



Friday, August 28, 2009

Changing A Cell's Biological Battery

A new method tested in monkeys for replacing mitochondrial DNA could one day prevent devastating diseases.

By Lauren Gravitz


Mitochondrial diseases, which affect as many as one in 4,000 people, can impair muscles, nerves, even entire organ systems, and have no known cure. Now, in a breakthrough study, Oregon researchers replaced defective mitochondrial DNA with that from a healthy donor. The first subjects, four baby monkeys, are pushing the envelope on the ethical debate that surrounds bioengineering.

Born of science: Spindler, a baby rhesus macaque, is one of only four monkeys born with DNA from three parents.
Credit: Nature
Multimedia
video Watch how the researchers create a fertile egg.

Mitochondria are often called the cell's power plants--the tiny organelles are responsible for energy production, and there can be hundreds to thousands of them in a single cell. They also contain their own DNA. Unlike nuclear DNA, which is a unique combination of both parents' genomes, mitochondrial DNA (or mtDNA) is passed down through the mother, is derived almost exclusively from her egg, and typically remains unchanged from one generation to the next. Mutations in a woman's mtDNA are inherited by her child, and so far there has been no way to cure these conditions or stop their transmission.

Now, Shoukhrat Mitalipov and his colleagues at Oregon Health & Science University in Beaverton, OR, have found a way to get rid of mutant mtDNA. Using a process similar to cloning, they first harvested a fertile egg. Then, when the egg was undergoing cell division, they removed a set of its chromosomes and inserted them into an egg harvested from another female, one that already had its nucleus removed. In essence, the enucleated egg provided a set of mitochondrial chromosomes, while the transferred nuclear chromosomes provided the main genetic material for development. Other researchers have attempted similar processes, but previous efforts couldn't prevent mutant mitochondria from tagging along to the new egg.

The researchers avoided this problem by carefully isolating chromosomes during a very specific and segregated process of cell division, in which nuclear DNA is tied up into an elliptical spindle. "Our whole technique comes to efficiently separating the two different types of DNA that [mammals] carry, and to separate them very cleanly," Mitalipov says. "We believe this can be used to prevent transmission of mutated mitochondrial DNA...[and] correct for mitochondrial DNA mutations in children even before they're born."

To date, there are 200 to 250 known disease-causing mutations in mitochondrial DNA, and they occur in as many as one in 4,000 people. The syndromes vary in severity, with symptoms ranging from muscle weakness and loss of motor control to diabetes, liver disease, and developmental delays. Many patients die before ever reaching adulthood. "The patients carrying these types of mutations don't have the same options for genetic counseling," Mitalipov says, since any mutation a woman has will be passed to her egg. "Currently, her only options are using donated eggs or adopting a child."

"It's an important study, and it's the only approach that I can think by which you could render a family free of risk of their offspring developing a mitochondrial DNA disease," says Douglas Wallace, a mitochondrial DNA researcher at the University of California, Irvine. Because mitochondrial DNA is self-replicating, the technique allows for a way to "swap" healthy versions for mutant ones without genetic alterations.

But therein also lies the rub. Many researchers and ethicists alike balk at the idea of making genetic changes to the germline, ones that fundamentally affect an egg or sperm and will be passed along to the next generation. While swapping out mitochondrial DNA may not qualify as the kind of germline engineering people have in mind when they worry about made-to-order babies--with certain traits like intelligence or eye color specifically engineered--it edges toward that shaky ethical ground.

Two babies from three parents: Mito and Tracker, baby rhesus macaque twins, are the first to have DNA from three parents--nuclear DNA from their mother and father, and mitochondrial DNA from a second female monkey, which they inherited through her donated egg.
Credit: Nature


"The technique that they used, transferring a chromosomal spindle to get new mitochondria to power the egg, seems completely ethical and defensible," says Arthur Caplan, a bioethicist at the University of Pennsylvania. "But while this technique doesn't have much use outside of fixing problems in the mitochondria, it does open the door a tiny bit on germline engineering." Because it's using a self-contained part of the cell, he notes that it's not what people typically have in mind when they talk about tinkering with germline genetics. "But by cracking open the door, it puts the principle of never doing germline engineering into dispute."

David Magnus, who heads Stanford University's Center for Biomedical Ethics, agrees that most of society's germline engineering concerns don't apply in this case. But he does point out that the procedure would lead to, essentially, three parents instead of two, "making legal and social arrangements more complicated," he says. "What happens if the mitochondrial donor decides, down the road, that she should have some parental rights to the offspring?"

This is, of course, getting way ahead of the science itself. Much more must be done before the procedure is approved. The new technique has only been applied in nine rhesus macaques, three of which became pregnant (one with twins)--a 33 percent success rate that appears to mirror that of regular in vitro fertilization in human patients. And since the seemingly healthy offspring have not yet reached reproductive age, Mitalipov and his colleagues don't yet know whether the procedure has genetic implications that have yet to surface. The procedure will also need to be refined, tested in more nonhuman primates and at other research facilities before human trials can begin. (The Oregon lab is known for very high success rates that other labs can rarely duplicate.)

Researchers in Britain, at the University of Newcastle upon Tyne, have reportedly done something similar in human embryos, but have yet to publish their results and would not comment on Mitalipov's research.

The Oregon researchers believe they may be ready to apply for clinical trials in two to three years, but much depends on funding and government approval. "This points the way to a technique--it doesn't provide a therapy," says UC Irvine's Wallace, who was the first researcher to discover disease-causing mutations in mitochondrial DNA. "It shows that the concept can work as one approach to treating mitochondrial DNA disease. And that's an incredible advance, since we have very little to offer these families right now."


http://www.technologyreview.com/biomedicine/23314/

New Type of Disappearing Ink

Nanoparticle inks that fade away in hours could be ideal for secure communications.

By Prachi Patel


Top-secret maps and messages that fade away to keep unwanted eyes from seeing them could be made with a new nanoparticle ink. Researchers at Northwestern University, led by chemical and biological engineering professor Bartosz Grzybowski, have used gold and silver nanoparticles embedded in a thin, flexible organic gel film to make the new type of self-erasing medium.

Timely disappearance: Metal nanoparticles that clump together and change color under ultraviolet light are used as an ink to create images. In visible light, the clumps break apart and the image fades away in nine hours.
Credit: Rafal Klajn

Shining ultraviolet light on the film through a patterned mask or moving an ultraviolet "pen" over it records an image on the film. In visible light, the image slowly vanishes. Writing on the medium takes a few tens of milliseconds, but the researchers can speed up the process by using brighter light. They can also tweak the nanoparticles to control how quickly the images disappear, anywhere from hours to a few days. The images vanish in a few seconds when they are exposed to bright light or heat.

The film can be erased and rewritten hundreds of times without any change in quality. It can be bent and twisted.

The technology, described in an online Angewandte Chemie paper, would be ideal for making secure messages, Grzybowski says. He also envisions self-expiring bus and train tickets. "It self-erases and there's no way of tracing it back," Grzybowski says. "Also this material self-erases when exposed to intense light, so putting it on a copier is not possible."

There have been previous reports of self-erasing media. In 2006, Xerox announced a paper that erases itself in 16 to 24 hours. These materials use photochromic molecules that rearrange their internal chemical structure when exposed to light, which changes their color. Typically, these molecules can only switch between two colors and they lose their ability to switch after a few cycles. Besides, says Grzybowski, the molecules are not bright so you need a large number to see any color change. "You have to put a kilogram of this into paper before you see something," he says.

Grzybowski and his colleagues make the self-erasing ink with 5-nanometer-wide gold or silver particles. They coat the nanoparticles' surfaces with molecules that change shape under ultraviolet (UV) light and attract one another. "They're like a molecular glue that you can regulate using light," he says. The unwritten films are red if they contain gold particles and yellow if they contain silver. The films can also be made in other colors, ranging from red to blue, by choosing nanoparticles of a different size. Particles exposed to light form clusters of a different color--the red film changes to blue and yellow changes to violet.

In the absence of light, the clusters fall apart. How quickly they fall apart, erasing the writing, depends on the amount of the gluelike molecules on them.
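One way to picture that tunability is to treat the clusters' breakup as simple first-order decay, calibrated to the nine-hour fade time quoted in the image caption above. The exponential model and the 5 percent visibility threshold in the sketch below are assumptions for illustration, not the kinetics reported in the paper.

```python
import math

def remaining_contrast(t_hours, fade_time_hours=9.0, threshold=0.05):
    """Toy model of a self-erasing image: assume the written clusters
    dissociate exponentially and call the image 'gone' once only
    `threshold` (5%) of the original contrast is left after
    `fade_time_hours` (the nine-hour figure for this film)."""
    k = -math.log(threshold) / fade_time_hours  # decay constant, per hour
    return math.exp(-k * t_hours)

for t in (0, 1, 3, 6, 9):
    print(f"{t:>2} h: {remaining_contrast(t):.0%} of written contrast left")
```

Changing the amount of light-switchable "glue" on the particles amounts to changing the decay constant, which is how the researchers stretch the fade time from hours to days.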

You can write in different colors depending on how much light you put in--more UV light makes the particles form tighter clusters, which have a different color than looser clusters. The researchers were also able to write two images, one over the other, on the same film: not all the nanoparticles are used up writing the first image, so the rest can record the second.

"The concept of using photostimulated reversible aggregation of gold or silver particles for self-erasing images is quite interesting and new," says Masahiro Irie, a chemistry professor at Rikkyo University in Tokyo who studies photochromic molecules. However, he believes that photochromic molecules might be better for practical self-erasing systems. Images or text written with the new inks might not have a high resolution because they require clusters of nanoparticles. Plus, the unwritten film is colored because of the nanoparticles, and it would be more desirable to have a colorless or white original film, he says.

But the flexibility and control that the new material offers make it attractive. It is easy to control the speed of writing and erasure, as well as the color, Grzybowski says. He adds that the technology has drawn interest from a United Kingdom-based security firm.



http://www.technologyreview.com/communications/23316/

TR35 2009 Young Innovator

Pranav Mistry, 28

MIT

A simple, wearable device enhances the real world with digital information

Multimedia
video Watch Mistry demo SixthSense.
video Mistry explains how his technology, SixthSense, works.

Retrieving information from the Web when you're on the go can be a challenge. To make it easier, graduate student Pranav Mistry has developed SixthSense, a device that is worn like a pendant and superimposes digital information on the physical world. Unlike previous "augmented reality" systems, Mistry's consists of inexpensive, off-the-shelf hardware. Two cables connect an LED projector and webcam to a Web-enabled mobile phone, but the system can easily be made wireless, says Mistry.

Users control SixthSense with simple hand gestures; putting your fingers and thumbs together to create a picture frame tells the camera to snap a photo, while drawing an @ symbol in the air allows you to check your e-mail. It is also designed to automatically recognize objects and retrieve relevant information: hold up a book, for instance, and the device projects reader ratings from sites like Amazon.com onto its cover. With text-to-speech software and a Bluetooth headset, it can "whisper" the information to you instead.

Remarkably, Mistry developed SixthSense in less than five months, and it costs under $350 to build (not including the phone). Users must currently wear colored "markers" on their fingers so that the system can track their hand gestures, but he is designing algorithms that will enable the phone to recognize them directly. --Brittany Sauser

1. Camera: A webcam captures an object in view and tracks the user's hand gestures. It sends the data to the smart phone.

2. Colored Markers: Marking the user's fingers with red, yellow, green, and blue tape helps the webcam recognize gestures. Mistry is working on gesture-recognition algorithms that could eliminate the need for the markers.

3. Projector: A tiny LED projector displays data sent from the smart phone on any surface in view--object, wall, or person. Mistry hopes to start using laser projectors to increase the brightness.

4. Smart Phone: A Web-enabled smart phone in the user's pocket processes the video data, using vision algorithms to identify the object. Other software searches the Web and interprets the hand gestures.

Credit: Sam Ogden
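The sketch below shows how a pipeline built from these four components might fit together: grab a webcam frame, locate the colored fingertip markers, interpret their arrangement as a gesture, and trigger an action for the projector overlay. It is a conceptual illustration written with OpenCV, not Mistry's software; the HSV color ranges and the "frame gesture" rule are invented placeholders.

```python
import cv2
import numpy as np

# Hypothetical HSV ranges for the red, yellow, green, and blue finger tape.
MARKER_RANGES = {
    "red":    ((0, 120, 120), (10, 255, 255)),
    "yellow": ((20, 120, 120), (35, 255, 255)),
    "green":  ((45, 120, 120), (75, 255, 255)),
    "blue":   ((100, 120, 120), (130, 255, 255)),
}

def find_markers(frame):
    """Return the pixel centroid of each colored marker visible in the frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    centroids = {}
    for name, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        m = cv2.moments(mask)
        if m["m00"] > 0:  # some pixels matched this color
            centroids[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return centroids

def interpret(centroids):
    """Invented placeholder rule: all four markers in view counts as the
    thumbs-and-index 'picture frame' gesture described in the profile."""
    return "frame_gesture" if len(centroids) == 4 else None

cap = cv2.VideoCapture(0)  # stands in for the pendant webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if interpret(find_markers(frame)) == "frame_gesture":
        # Stand-in for the real actions: snap a photo, look the object up on
        # the phone, and send the result to the LED projector as an overlay.
        print("frame gesture detected: capture photo")
cap.release()
```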





TR35 2009 Young Innovator

Michelle Khine, 32

University of California, Irvine

A children’s toy inspires a cheap, easy production method for high-tech diagnostic chips

Purposeful play: Biomedical engineer Michelle Khine sits in her lab at the University of California, Irvine, where she uses Shrinky Dinks straight from the toy store to build microfluidic devices.
Credit: Dave Lauridsen
Multimedia
photo How Khine Builds Microfluidic Devices
video Watch Khine explain and demo her devices.

In 2006, Michelle Khine arrived at the University of California's brand-new Merced campus eager to establish her first lab. She was experimenting with tiny liquid-filled channels in hopes of devising chip-based diagnostic tests, a discipline called microfluidics. The trouble was, the specialized equipment that she previously used to make microfluidic chips cost more than $100,000--money that wasn't immediately available. "I'm a very impatient person," says Khine, now an assistant professor at the University of California, Irvine. "I wanted to figure out how I could set things up really quickly."

Racking her brain for a quick-and-dirty way to make microfluidic devices, Khine remembered her favorite childhood toy: Shrinky Dinks, large sheets of thin plastic that can be colored with paint or ink and then shrunk in a hot oven. "I thought if I could print out the [designs] at a certain resolution and then make them shrink, I could make channels the right size for micro­fluidics," she says.

To test her idea, she whipped up a channel design in AutoCAD, printed it out on Shrinky Dink material using a laser printer, and stuck the result in a toaster oven. As the plastic shrank, the ink particles on its surface clumped together, forming tiny ridges. That was exactly the effect Khine wanted. When she poured a flexible polymer known as PDMS onto the surface of the cooled Shrinky Dink, the ink ridges created tiny channels in the surface of the polymer as it hardened. She pulled the PDMS away from the Shrinky Dink mold, and voilà: a finished microfluidic device that cost less than a fast-food meal.

Khine began using the chips in her experiments, but she didn't view her toaster-oven hack as a breakthrough right away. "I thought it would be something to hold me over until we got the proper equipment in place," she says. But when she published a short paper about her technique, she was floored by the response she got from scientists all over the world. "I had no idea people were going to be so interested," Khine says.

At the same time, she faced considerable skepticism. How on earth, critics wondered, could you use a toy to make a sophisticated device that's normally forged from high-grade silicon? "People either love it or they laugh at me," Khine says. She hastens to point out that Shrinky Dink microfluidics isn't perfect--minute ink splatters from the printer, for instance, can give rise to slight irregularities in the finished channels.

Still, glitches like these don't pose a problem for most applications. And Khine has already found a way around a more serious difficulty: PDMS can absorb proteins, throwing off the results of sensitive tests. She has begun to make chips directly out of the Shrinky Dinks by etching the design into the plastic using syringe tips. As the plastic shrinks, the channels become narrower and deeper--perfect for microfluidics. She can even make three-dimensional chips by melting several etched Shrinky Dinks together. The whole process, from design to finished chip, takes only minutes.

Khine plans to use her chips to detect various medical conditions, and she hopes the cheap and portable devices will someday be used to diagnose HIV and other infections at the bedside. She has also found that by growing stem cells in a Shrinky Dink device that contains wells instead of channels, she can coax them to become heart muscle cells. Such a tool might allow researchers trying to grow those cells for tissue transplants to control the process more closely.

Douglas Crawford, associate executive director of the California Institute for Quantitative Biosciences, sees advantages in Khine's approach. "Michelle's technique is better, faster, and cheaper--it can put microfluidic prototyping into the hands of every lab," he says.

Khine recently printed metal patterns on Shrinky Dinks. As the plastic shrinks, the metal buckles to form shallow wells, which Khine thinks may concentrate sunlight; the discovery could help make solar cells more efficient. "We haven't come close to pushing the limits of this technology yet," she says. --Elizabeth Svoboda


http://www.technologyreview.com/TR35/Profile.aspx?Cand=T&TRID=764

Wednesday, August 26, 2009

Finding out What Colors Dinosaurs Were

Scientists have found evidence of iridescence in a 40 million-year-old fossilized feather.

By Katherine Bourzac

Nanostructures in this 40 million-year-old feather once made it iridescent.
Credit: Jakob Vinther/Yale University

By using an electron microscope to examine nanoscale structures in a 40 million-year-old bird feather, researchers have determined that, in life, the birds were black with an iridescent, bluish-green coppery sheen, like starlings and grackles. The key to figuring this out was the discovery by researchers at Yale University that rod-shaped nanostructures in the feather specimens aren't bacteria, but remnants of pigment-containing organelles called melanosomes.

Iridescence in bird feathers is caused by constructive interference of light scattered by these structures; how the light scatters is determined by the arrangement of the melanosomes, which are preserved not only in the bird fossils but in some dinosaur and mammalian ones as well. The Yale researchers hope this technique could be used to get a better picture of the coloring and patterning of dinosaurs and other extinct creatures. This work is described online in the journal Biology Letters.
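The underlying optics is ordinary thin-film interference: a stack of regularly spaced melanosome layers reinforces wavelengths that satisfy the constructive-interference condition m * wavelength = 2 * n * d * cos(theta), where d is the layer spacing, n the refractive index, theta the viewing angle, and m the order. The snippet below evaluates that condition; the spacing and refractive index are plausible illustrative values, not measurements from the Biology Letters paper.

```python
import math

def reinforced_wavelengths(spacing_nm, refractive_index, angle_deg=0.0, orders=(1, 2)):
    """Simplified constructive-interference condition for a layered stack:
    m * wavelength = 2 * n * d * cos(theta). Returns the reinforced
    wavelength (nm) for each order m."""
    theta = math.radians(angle_deg)
    return {m: 2 * refractive_index * spacing_nm * math.cos(theta) / m
            for m in orders}

# Illustrative numbers only: ~180 nm melanosome layer spacing, n ~ 1.6.
print(reinforced_wavelengths(180, 1.6))
# First-order peak near 576 nm, a greenish sheen; tilting the feather
# (larger angle) shifts the reinforced color, which is what makes it iridescent.
```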


http://www.technologyreview.com/blog/editors/24033

Better Gas-to-Methanol Catalyst

An improved catalyst could reduce the cost of making methanol from methane.

By Kevin Bullis


A new catalyst for converting methane, the main component of natural gas, into a liquid fuel--methanol--has been developed by researchers in Germany. The catalyst could make direct conversion of methane to methanol cheaper than it is with existing catalysts, but it will likely fall short of a holy grail of hydrocarbon chemistry--a catalyst that allows natural gas to replace petroleum fuels on a large scale.

Solid idea: A micrograph of a new catalyst, made primarily of carbon, nitrogen, and platinum, that is used to convert methane into a liquid fuel (methanol).
Credit: Regina Palkovits, Max Planck Institute for Coal Research

The new catalyst is based on one of the few catalysts that convert methane directly to methanol, at low temperatures, without producing much carbon dioxide or other unwanted byproducts. That catalyst, developed by Roy Periana, now a professor of chemistry at the Scripps Research Institute, proved too expensive to commercialize.

The new catalyst, described in the early online version of the journal Angewandte Chemie, has "solved one of the main problems with Periana's catalyst," says Ferdi Schüth, director of the Max Planck Institute for Coal Research, who led the work. Because Periana's catalyst is a liquid dissolved in sulfuric acid, it's difficult to recycle, a serious problem because the catalyst requires the expensive metal platinum. The new catalyst is a solid, says Schüth, and so is much easier to recycle because it can be removed from the sulfuric acid simply with filters.

Schüth says the discovery of the solid catalyst was "serendipitous." His colleagues had developed a polymer with a molecular structure that he recognized was similar to Periana's catalyst. He was able to incorporate platinum into that structure and showed that the resulting solid catalyst performed as well as the liquid version.

Methane-based fuels could be significantly cleaner than petroleum ones. What's more, the supply of natural gas is vast, with large supplies now being accessed with new drilling techniques and orders of magnitude more potentially available in the form of methane hydrates at the bottom of the ocean. But because it is a gas, methane is more expensive to transport and less convenient for use in vehicles than liquid fuels, and so far chemical methods of converting it to a liquid have been costly.

While the new catalyst does solve one of the problems with the Periana catalyst, "it is by no means the biggest problem," says Jay Labinger, faculty associate in chemistry at Caltech. Indeed, Periana says that the development of a solid version of his catalyst will not be enough to commercialize it. He is working on new catalysts that use similar mechanisms but cheaper and more effective materials.

The two key issues are typical problems for experimental catalysts--they don't work fast enough, which increases the size and cost of equipment needed, and they don't produce high enough concentrations of the desired product, making it expensive to separate the product from other chemicals. Labinger estimates that the rates of the new German catalyst need to increase by an order of magnitude, and Periana says the concentrations need to increase three- to fivefold.

Periana suggests, however, that the German catalyst may offer new directions for research, especially if the mechanisms involved in producing the methanol are different from his liquid catalyst. Indeed, Schüth says that one key component of Periana's catalyst, chlorine, isn't necessary with the new form, suggesting it could work by different means. Meanwhile, he's also developing catalysts that use different materials. One is promising, he says, producing methanol at rates two times faster than Periana's liquid catalyst.


http://www.technologyreview.com/energy/23313/

The Evolution of Retweeting

Formalizing the retweet may change people's behavior.

By Kristina Grifantini


The microblogging and social networking site Twitter took off last year and had more than 44.5 million users worldwide as of June. In the 140-character limited ecosystem of Twitter, users have evolved a language of their own, figuring out creative ways to filter the sometimes overwhelming stream of Twitter posts. Now, Twitter has announced that a user-generated communication technique called retweeting--reposting someone else's message, similar to quoting--will be formally incorporated into Twitter. Some experts say Twitter's approach will hinder the conversational aspect of retweeting; others predict that it will create a new way of communicating.

Credit: Technology Review

Twitter has incorporated other user-generated linguistic tools, such as using a hash symbol in front of a word to make it easily searchable (like "#conference09"). Another common technique is typing @ in front of a username to reply directly (but publicly) to the user, which Twitter also formalized after users adopted it. These linguistic tools have even trickled into other social media environments, including YouTube, Flickr, Facebook, and blogs.

Currently, there is no set format for retweeting, which loosely consists of reposting someone's tweet and giving due credit. The most common scheme for a retweet involves prefacing the post with the letters "RT," then the @ symbol, and the username of the person being quoted. The retweet rebroadcasts the information to a new set of followers, who see the retweet and have the option of retweeting themselves. In this way, ideas, links, and other information can be distributed--and tracked--fairly quickly.

But the retweeting format is much more inconsistent and complex than the targeted reply and hashtag conventions, according to Microsoft Research social media scientist Danah Boyd, who recently posted a paper on the behavior of retweeting. Variations include typing the attribution at the end and using "via," "by," or "retweet" instead of "RT." What's more, people often add their own comments before or after a retweet. This becomes a problem with Twitter's 140-character limit, explains Boyd. Typing "RT @username" takes up characters, and so does adding a comment. To deal with this, users will paraphrase or omit part of the original text, sometimes leading to incorrect quotes.
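To see what a parser has to cope with, here is a small sketch that recognizes the hand-rolled conventions described above and computes how many of the 140 characters remain once the attribution and a comment are added--the squeeze that drives the paraphrasing Boyd describes. The regular expression covers only the variants mentioned here and is an illustration, not Twitter's or Boyd's tooling.

```python
import re

# Matches "RT @user", "via @user", "by @user", or "retweet @user" anywhere.
RETWEET_PATTERN = re.compile(r"\b(RT|via|by|retweet)\b[:\s]*@(\w+)", re.IGNORECASE)

def parse_retweet(tweet):
    """Return (style, quoted_user) if the post looks like a retweet, else None."""
    match = RETWEET_PATTERN.search(tweet)
    return (match.group(1), match.group(2)) if match else None

def remaining_budget(original, quoted_user, comment=""):
    """Characters left out of 140 after the 'RT @user ' prefix and any added
    commentary -- when this goes negative, users start paraphrasing."""
    prefix = f"RT @{quoted_user} "
    return 140 - len(prefix) - len(comment) - len(original)

print(parse_retweet("Great piece on terroir RT @winewriter biodynamics explained"))
print(remaining_budget("biodynamics explained in 140 characters or less",
                       "winewriter", comment="Must read! "))
```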

Last week, Twitter announced that it will soon implement a button that will let users automatically repost someone else's tweet. While this will make it quicker and easier for users to accurately retweet, the mockup of the new button does not appear to let users edit the retweet to incorporate their own commentary. Rather, the "retweet" button will add the image and name of the quoted person to the original tweet and post it for those who follow the retweeter.

The new retweet function "is not going to meet the needs of those who retweet. At the same time, I think it's going to bring retweeting to a whole new population," says Boyd. "Adding commentary is a huge element to why people retweet." Instead of just replying privately to a person with an opinion, by retweeting and adding a comment, users can target a larger audience, sharing their opinions and inviting others to do the same, she says.

Boyd found that the percentage of Twitter users who retweet is fairly small, though she expects that number to grow once the retweet button is incorporated. Her research also showed that 11 percent of the retweets examined contained commentary, a figure she says likely understates the phenomenon, since she looked for comments only at the beginning of each message.

"Retweeting is primarily used by the geeks and news folks," she says. "What's really starting to hit [Twitter] in large numbers... are those involved with the pop culture." Boyd expects that a retweet button will bring the practice to those millions of users who follow celebrities, such as Twitter fanatics Ashton Kutcher and Oprah Winfrey, for example. "We're going to see information spread from populations who haven't engaged in that way [before]. We'll see an evolution of the behavior," says Boyd. "It will become a way to validate or agree with other users' content."

Users often employ retweets to provide context in conversation, says Susan Herring, a professor of information science and linguistics at Indiana University and editor in chief of the Language@Internet journal. "I can't imagine that [the new Twitter tool] will be very satisfactory to Twitter retweeters," says Herring. "A retweet plus a comment is a conversation. A retweet alone could be an endorsement, but it's a stretch to view an exchange of endorsements as a conversation." Herring does agree that it will increase retweeting and broaden the range of users who retweet.

Retweets are not just of interest to users but also are valuable to companies and researchers who strive to keep track of how ideas spread. Retweeting "is this elegant viral mechanism," says Dan Zarrella, a Web developer who studies viral marketing in social media. "The scale and data you can extract from [retweets] has never been possible with [other] viral or word-of-mouth communications," says Zarrella, who claims to have a database of more than 30 million retweets.

"I think that having a button and supported structure of retweeting is definitely a good idea, but I disagree with the implementation," Zarrella says, and suggests using a format like third-party Twitter tool TweetDeck and others do: pressing a retweet button there will automatically copy and paste the old link with the "RT" syntax, but the tool still allows the retweeter to modify the text.

By taking out the "RT @username," Twitter is making it impossible for users to search for retweets themselves, says Zarrella. "They're limiting how much you can analyze retweets." Zarrella speculates that the retweet button may have been designed so that, down the road, Twitter can charge for features such as extensive retweet tracking.
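The kind of analysis Zarrella describes leans on the textual convention itself. A rough sketch of the pattern matching a third-party tracker might use (the regex is an illustrative assumption, not Zarrella's tooling) shows why: button-based retweets carry no "RT @username" marker in their text, so they are invisible to this sort of search.

    import re

    # Old-style retweets announce themselves in the text itself.
    RT_PATTERN = re.compile(r"\b(?:RT|via|retweet)\b\s*:?\s*@(\w+)", re.IGNORECASE)

    def find_retweets(tweets):
        """Yield (quoted_author, full_text) for tweets that look like retweets."""
        for text in tweets:
            match = RT_PATTERN.search(text)
            if match:
                yield match.group(1), text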

In addition to showing the original tweeter's image, the new Twitter button will also show the latest 20 retweets of a post. "If they show the breadcrumbs of the trail of everyone who retweeted, that's a good thing," says Steve Garfield, a new media advisor to several large companies and prolific video blogger. "I like to add value to my retweets by adding a comment, to tell people why I like it." If the new function doesn't allow for comments, Garfield says users will just design a new way or revert to the old way.

"People will continue to repurpose Twitter to meet their needs," predicts Herring. "I can't imagine that those who are passionate retweeters will erase their practice."



http://www.technologyreview.com/web/23312/?a=f

Entangled Light, Quantum Money

A breakthrough explores the challenges--and suggests the financial possibilities--of creating quantum networks.

By Mark Williams

In recent years, the Austrian physicist Anton Zeilinger has bounced entangled photons off orbiting satellites and made 60-atom fullerene molecules exist in quantum superposition--essentially, as a smear of all their possible positions and energy states across local space-time. Now he hopes to try the same stunt with bacteria hundreds of times larger. Meanwhile, Hans Mooij of the Delft University of Technology, working with Seth Lloyd, who directs MIT's Center for Extreme Quantum Information Theory, has created quantum states (which occur when particles or systems of particles are in superposition) on scales far above the quantum level: a superconducting loop, visible to the human eye, carries a supercurrent whose electrons run clockwise and counterclockwise simultaneously, allowing it to serve as a quantum computing circuit.

Two nodes of a quantum network that Caltech researchers created by halting entangled photons within two ensembles of cesium atoms housed in an ultrahigh-vacuum system. Temporarily storing entanglement provides a basis for quantum data storage, which might be useful for various applications, including quantum cryptography.
Credit: Nara Cavalcanti
RESOURCES:
Functional Quantum Nodes for Entanglement Distribution over Scalable Quantum Networks
Chin-Wen Chou, Julien Laurat, Hui Deng, Kyung Soo Choi, Hugues de Riedmatten, Daniel Felinto, and H. Jeff Kimble
Science 316: 1316–1320 (2007)

Mapping Photonic Entanglement into and out of a Quantum Memory
K. S. Choi, H. Deng, J. Laurat, and H. J. Kimble
Nature 452: 67–71 (2008)


The physicist Richard Feynman proposed the idea of quantum computing in 1981 as a way to exploit the information-processing potential of atoms, photons, and elementary particles. The field has since advanced far enough that researchers are not only achieving unprecedented experimental effects but also proposing commercial applications.

But before technologies like quantum communications, computing, and metrology can realize their potential--a quantum Internet and uncounterfeitable money are two interesting possibilities--quantum networks must be able to transmit and store data. The quantum optics group at the California Institute of Technology has been working toward this goal. The team is headed by H. Jeff Kimble, Valentine Professor of Physics, who led the 1998 effort that achieved the first unambiguous teleportation of one photon's quantum state--that is, the information represented by its spin, energy, and such--to another photon. Now Kimble and his team have demonstrated a way for entanglement--the nonlocal relationship that allows quantum teleportation, which Einstein skeptically dismissed as "spooky action at a distance"--to be created in networks.

Much as the motion of electrons in microprocessor circuits transmits data within today's computers, the teleportation of quantum states between entangled particles would perform that task in quantum networks. As for data storage, says Kyung Soo Choi, a researcher in Kimble's group, a central question that one of their recent experiments resolved was, "How do you convert entangled light into an entanglement of matter and back into light?" Entangled states are fragile, and networks of entangled light will require repeating devices--much the way long-distance fiber-optic networks require optoelectronic repeating devices to regenerate diminishing signals. Entanglement will therefore need to be generated and stored in component subsystems within a greater quantum network. Kimble's team has now demonstrated a technical solution to that problem.

The Caltech team used two ensembles of cesium atoms whose states they influenced with a laser, making them either transparent or opaque as needed to manipulate incoming photons' speeds. The researchers then split single photons, putting them in superposition--that is, they were part of the same quantum wave function and, thus, entangled--while ensuring that they propagated along two paths into the two cesium ensembles. Choi explains, "We slowed the light to a crawl and halted it inside the matter by deactivating the control laser that was making the cesium ensembles transparent, so the quantum information--the entangled light--was stored inside the atomic ensembles. By reactivating the control laser, we reaccelerated the photons to normal speed, restoring the beams of entangled light." So far, the Caltech researchers have stored entanglement in matter for spans of one microsecond. Kimble estimates that he and his team can extend that to 10 microseconds.

Kimble possesses a courtly Texas gentleman's manner, as I discovered after his lab manager found him 15 minutes on the schedule following two weeks when the physicist was away, making presentations at four conferences on two continents. Those 15 minutes became a tutorial on recent technical advances in verifying and quantifying entanglement. Measurement is the central problem in quantum mechanics, since any particle or system exists in a quantum state only until another system, whether one as slight as a stray air molecule or as complex as a human observer, gains information about it and thereby collapses that state. This is mind-bendingly abstruse stuff. Aside from discussing quantum metrology, though, Kimble made one easily graspable assertion: "Our society's technical base is information commerce. In the next 20 years, quantum information science--a fusion of computer science and quantum mechanics that didn't exist 20 years ago--will radically change that commerce."

The revolutionary technology that Kimble envisions is large quantum networks, resembling the Internet but relying on entanglement. What inherent advantages would promote the development and adoption of such networks?

Substantial ones. Quantum networks have already been built on a limited scale. In 2004, the world's first permanent quantum cryptography system was activated in Cambridge, MA, linking Harvard, Boston University, and DARPA contractor BBN Technologies (formerly known as Bolt Beranek and Newman, under which name the company created the original ARPAnet). Today, id Quantique, a Swiss company, and MagiQ Technologies, a U.S. one, offer commercial modules using optical fiber to transmit quantum keys, in the form of photons encoded as bits by controlling their polarization, over limited distances that top out at about 100 kilometers. Since attempted interception of these light particles would disturb their state and expose eavesdropping, such quantum cryptography systems offer absolute data security.
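The security argument rests on a simple quantum fact: measuring a photon in the wrong polarization basis randomizes the result, so an eavesdropper who intercepts and resends photons leaves a statistical fingerprint. The toy simulation below sketches a BB84-style key exchange, the family of protocols such commercial systems build on; it models the principle under simplified assumptions (no noise, no error correction), not any vendor's actual implementation.

    import random

    def bb84_sifted_key(n_photons=1000, eavesdrop=False, seed=0):
        """Toy BB84-style exchange: Alice sends polarization-encoded bits in
        random bases ('+' rectilinear, 'x' diagonal); Bob measures in his own
        random bases; they keep only the positions where their bases matched.
        An intercept-resend eavesdropper corrupts ~25% of the sifted key."""
        rng = random.Random(seed)
        alice_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
        alice_bases = [rng.choice("+x")  for _ in range(n_photons)]
        bob_bases   = [rng.choice("+x")  for _ in range(n_photons)]

        bob_bits = []
        for bit, sent_basis, bob_basis in zip(alice_bits, alice_bases, bob_bases):
            photon_basis = sent_basis
            if eavesdrop:
                eve_basis = rng.choice("+x")          # Eve guesses a basis,
                if eve_basis != photon_basis:         # a wrong guess randomizes the bit,
                    bit = rng.randint(0, 1)
                photon_basis = eve_basis              # and she resends in her own basis.
            if bob_basis == photon_basis:
                bob_bits.append(bit)                  # matching basis: faithful readout
            else:
                bob_bits.append(rng.randint(0, 1))    # mismatched basis: random readout

        # Alice and Bob publicly compare bases (not bits) and keep the matches.
        sifted = [(a, b) for a, b, ab, bb in
                  zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
        errors = sum(1 for a, b in sifted if a != b)
        return len(sifted), errors

    # Without Eve the sifted keys agree perfectly; with Eve roughly a quarter
    # of the sifted bits disagree, which is how interception exposes itself.
    # bb84_sifted_key(eavesdrop=False) -> (~500, 0)
    # bb84_sifted_key(eavesdrop=True)  -> (~500, ~125)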

Furthermore, the prospect of quantum computing was what provided the initial impetus for research into quantum networks. If such computing can be done at practical scale (so far, experiments have used at most seven qubits, or quantum binary digits), it promises to surpass classical computing in significant respects. Scott Aaronson, an MIT expert on computational complexity, cites the algorithm published in 1994 by MIT mathematician Peter Shor as the breakthrough that proved quantum computing a viable proposition by demonstrating that it could factor very large numbers in reasonable computing time. Because that task has been beyond classical computers, most public-key cryptography has hitherto been based on factoring large numbers. But it would be vulnerable to cryptanalysis based on quantum computing. As Aaronson says, "That's why the National Security Agency is interested in quantum computing." Quantum cryptography, however, would offer data security against quantum code-breaking as well as against conventional cryptanalysis.
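To put rough numbers on the gap Shor's result opened: the best known classical factoring algorithm, the general number field sieve, takes sub-exponential time in the size of the number being factored, while Shor's quantum algorithm takes roughly cubic time in the number of digits. These are the standard textbook complexity estimates, not figures from Aaronson:

    T_{\mathrm{GNFS}}(N) \approx \exp\!\left(\left(\tfrac{64}{9}\right)^{1/3}(\ln N)^{1/3}(\ln\ln N)^{2/3}\right),
    \qquad
    T_{\mathrm{Shor}}(N) = O\!\left((\log N)^{3}\right)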

Besides ensuring the security of data, the quantum wide-area repeater networks, or QWANs, that Kimble has in mind would have few of current networks' latency issues--indeed, they could be nearly as instantaneous as the speed of light allows. Moreover, the exponential parallelism that would give quantum computing its power--with two entangled particles, or qubits, representing four different values, four qubits 16 values, and so on--ought to apply to networks of quantum computing devices. Kimble says, "Though there'll be a largest size attainable for the state space of individual quantum processing units, it'll be possible to surpass that by linking those units together into a fully quantum network." A quantum computer's "state space" is the full range of potential states in which the computer could exist. When a quantum algorithm is run, the computation collapses that state space, shrinking the computer's range of possible states down to a single one: the correct answer to the given problem. With a network of quantum computers, Kimble is claiming, the exponential computational power of each device would itself be multiplied exponentially.
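The arithmetic behind that claim is the tensor-product growth of quantum state spaces: n qubits span 2^n dimensions, so linking two n-qubit processors into a single entangled network squares, rather than doubles, the joint state space. In LaTeX shorthand:

    \dim \mathcal{H}_n = 2^{n}, \qquad
    \dim\left(\mathcal{H}_n \otimes \mathcal{H}_n\right) = 2^{n}\cdot 2^{n} = 2^{2n}

Two linked 10-qubit units, for example, would jointly span about a million dimensions, where each alone spans only 1,024.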

MIT's Seth Lloyd has given some thought to the design options for quantum networks. He says, "Networks using cesium-atom ensembles are one of the most promising technologies for transporting quantum information over long distances." Yet the ensemble approach is relatively bulky, and the larger a quantum system, the greater the problems for computing. Lloyd says, "Circuit-based approaches like superconducting loops are more scalable within a small space, with potentially large numbers of qubits on one circuit board." But such systems are unsuitable for communications. "Kimble and I have collaborated on concepts using individual atoms instead of ensembles," he says. "If we could move information between atomic ensembles and individual ions and ion traps, that's a scalable quantum technology." A plausible scenario, according to Lloyd, seems to be to use ensembles for communications and the more localized, scalable quantum devices, like the superconducting loops or the ion traps, for computation.

So Kimble has a reasonable argument that quantum networks are feasible. And the advantages that he envisions--absolute data security, no latency, and a further exponential gain in computational power--would hardly be negligible in the world of information commerce.

Some commercial applications of quantum information technology are fairly obvious. Human stock traders have come to rely on computerized trading programs known as high-frequency traders (HFTs); on some days, these generate more than half the volume on the New York Stock Exchange. Major trading institutions have spent millions developing algorithms that analyze market data and execute large numbers of trades, mostly as sophisticated variations on a single strategy: buy microseconds after a piece of data arrives, then sell microseconds later, at the expense of other traders who couldn't get the data in or their orders out as quickly. Futures traders who use near-instantaneous quantum networks will have clear advantages over those who don't.

Other commercial applications are possible as well. Scott Aaronson suggested one of them in a paper called "Quantum Copy-Protection and Quantum Money." He observed that quantum states cannot be copied because any measurement process destroys them, which "raises the possibility of using quantum states as unclonable information." Exploiting this possibility will require circumventing the fact that quantum states collapse under measurement and creating, first (for purposes of quantum money), unclonable states that can be verified as authentic, and second (for purposes of quantum copy protection), unclonable states that would still allow the protected software, DVDs, CDs, and so on to be used. Aaronson demonstrated that at least one type of publicly verifiable quantum money and two schemes for quantum-based copy protection are theoretically feasible--raising the possibility, for the first time ever, of absolutely uncounterfeitable money and insurmountable digital-rights protection.

The first generation of money emerged with the invention of coins in Lydia nearly 3,000 years ago, its second generation with the paper bills of exchange issued by the banks of Renaissance Italy, and its third with electronic money and the virtual economy of the modern era. If scientists like Kimble and Aaronson are correct, quantum networks may soon give rise to a further generation of money.

Mark Williams is a contributing editor to Technology Review.



http://www.technologyreview.com/computing/23198/?a=f

Tuesday, August 25, 2009

Biotech Bacteria Could Help Diabetics

Genetically engineered gut bacteria trigger insulin production in mice.

By Emily Singer


Friendly gut microbes that have been engineered to make a specific protein can help regulate blood sugar in diabetic mice, according to preliminary research presented last week at the American Chemical Society conference in Washington, D.C. While the research is still in the very early stages, the microbes, which could be grown in yogurt, might one day provide an alternative treatment for people with diabetes.

Engineering edible bacteria: Researchers engineered friendly bacteria (dots in the bottom half of the image) to produce a protein that triggers intestinal epithelial cells (top, highlighted in blue) to produce insulin.

The research represents a new take on probiotics: age-old supplements composed of nonharmful bacteria, such as those found in yogurt, that are ingested to promote health. Thanks to a growing understanding of these microbes, a handful of scientists are attempting to engineer them to alleviate specific ailments. "The concept of using bacteria to help perform (or fix) human disorders is extremely creative and interesting," wrote Kelvin Lee, a chemical engineer at the University of Delaware, in an e-mail. "Even if it does not directly lead to a solution to the question of diabetes, it opens up new avenues of thought in a more general sense," added Lee, who was not involved in the research.

People with type 1 diabetes lack the ability to make insulin, a hormone that triggers muscle and liver cells to take up glucose and store it for energy. John March, a biochemical engineer at Cornell University, in Ithaca, NY, and his collaborators decided to re-create this essential circuit using the existing signaling system between the epithelial cells lining the intestine and the millions of healthy bacteria that normally reside in the gut. These epithelial cells absorb nutrients from food, protect tissue from harmful bacteria, and listen for molecular signals from helpful bacteria. "If they are already signaling to one another, why not signal something we want?" asks March.

The researchers created a strain of nonpathogenic E. coli bacteria that produce a protein called GLP-1. In healthy people, this protein triggers cells in the pancreas to make insulin. Last year, March and his collaborators showed that engineered bacterial cells secreting the protein could trigger human intestinal cells in a dish to produce insulin in response to glucose. (It's not yet clear why the protein has this effect.)

In the new research, researchers fed the bacteria to diabetic mice. "After 80 days, the mice [went] from being diabetic to having normal glucose blood levels," says March. Diabetic mice that were not fed the engineered bacteria still had high blood sugar levels. "The promise, in short, is that a diabetic could eat yogurt or drink a smoothie as glucose-responsive insulin therapy rather than relying on insulin injections," says Kristala Jones Prather, a biochemical engineer at MIT, who was not involved in the research.

Creating bacteria that produce the protein has a number of advantages over using the protein itself as the treatment. "The bacteria can secrete just the right amount of the protein in response to conditions in the host," says March. That could ultimately "minimize the need for self-monitoring and allow the patient's own cells (or the cells of the commensal E. coli) to provide the appropriate amount of insulin when needed," says Cynthia Collins, a bioengineer at Rensselaer Polytechnic Institute, in Troy, NY, who was not involved in the research.

In addition, producing the protein where it's needed overcomes some of the problems with protein-based drugs, which can be expensive to make and often degrade during digestion. "Purifying the protein and then getting past the gut is very expensive," says March. "Probiotics are cheap--less than a dollar per dose." In underprivileged settings, they could be cultured in yogurt and distributed around a village.

The researchers haven't yet studied the animals' guts, so they don't know exactly how or where the diabetic mice are producing insulin. It's also not yet clear if the treatment, which is presumably triggering intestinal cells to produce insulin, has any harmful effects, such as an overproduction of the hormone or perhaps an inhibition of the normal function of the epithelial cells. "The mice seem to have normal blood glucose levels at this point, and their weight is normal," says March. "If they stopped eating, we would be concerned."

March's microbes are one of a number of new strains being developed to treat disease, including bacteria designed to fight cavities, produce vitamins, and treat lactose intolerance. March's group is also engineering a strain of E. coli designed to prevent cholera. Cholera prevention "needs to be something cheap and easy and readily passed from village to village, so why not use something that can be mixed in with food and grown for free?" says March.

However, the work is still in its early stages; using living organisms as therapies is likely to present unique challenges. More research is needed to determine how long these bacteria can persist in the gut, as well as whether altering the gut flora has harmful effects, says MIT's Prather.

In addition, recent research shows that different people have different kinds of colonies of gut bacteria, and it's unclear how these variations might affect bacterial treatments. "This may be particularly challenging when it comes to determining the appropriate dose of the therapeutic microbe," says Collins at Rensselaer. "The size of the population of therapeutic bacteria and how long it persists will likely depend on the microbes in an individual's gut."


http://www.technologyreview.com/biomedicine/23302/