Saturday, August 8, 2009

DOE Energy Hubs on the Brink

Research centers conceived to speed energy-related research are facing a tough battle in Congress.

By Kevin Bullis


A major effort to revamp research and development at the Department of Energy, which Energy Secretary Steven Chu says is critical to solving energy-related challenges, hangs in the balance as the Obama administration attempts to make its case to a skeptical Congress.

Bell's Nobels: Steven Chu, speaking at MIT on the subject of energy innovation hubs, cited the success of Bell Laboratories at spurring invention. The inset is a slide from his talk, picturing the inventors of the transistor and the first of many Nobel Prize winners, including Chu, from Bell Labs.
Credit: MIT World/Technology Review

Last month, the House and Senate committees responsible for appropriating money to the Department of Energy shot down Chu's proposed "Energy Innovation Hubs," with the House killing funding for all but one of the eight proposed hubs and the Senate provisionally funding only three. The House committee called the hubs redundant and criticized the Department of Energy for a lack of planning and clear communication about them. Since then, the department has issued much more detailed accounts of the hubs, and the Obama administration has said it "strongly opposes" the committee's decision to cut the requested funds.

Each hub would bring together top researchers under one roof to address one of eight "grand challenges" related to energy and would be modeled on the Manhattan Project, which developed the atomic bomb, and the legendary Bell Laboratories, where the invention of the transistor and the development of information theory, among other things, helped make possible the semiconductor industry and the Internet.

"The intent is to create a fierce sense of urgency to deliver solutions," says a report released by the DOE in response to congressional criticism. (For a full list of the eight challenges, and detailed descriptions of the hubs, click here and here.)

The idea for the hubs was inspired by Chu's own tenure as a researcher at Bell Labs. That well-funded facility put top researchers in charge, giving them the authority to decide quickly whether or not to fund a new project based on discussions with the researchers who came up with the idea. "You could say no within an hour, and you could say yes within a day or a week," he said at a talk earlier this year at MIT. What's more, the close proximity of leading experts in a variety of fields made it easy to find out what work had already been done in an area and what pitfalls should be avoided. After talking to a couple of people, "you were likely to be sitting down with a world expert," he said. At Bell Labs, research ran the gamut from basic efforts to explore how the world works to applied work that turned those findings into technical solutions--stages of research and development that are typically kept separate at universities and national laboratories.

The proposed innovation hubs would mimic this approach, with managers located on-site rather than behind a desk in Washington. They would be funded for five years at a time, freeing researchers from yearly funding cycles that make planning difficult. And the funding would be substantial--$35 million for the first year and $25 million per year thereafter. In comparison, research projects at universities typically receive $150,000 a year. To obtain a second round of five-year funding, the hubs would have to show significant progress in proving that their new technologies can work, with the goal of developing something that industry could bring to market.

The innovation hubs could correct some historical shortcomings of the Department of Energy, according to Chu. In the past, the department has not focused on commercializing technologies, and most of its efforts have revolved not around renewable energy but rather around cleaning up after nuclear weapons development, says Mark Muro, a fellow and policy director at the Brookings Institution in Washington, DC. "The reason Steven Chu wanted eight of them, dispersed across the entire corpus of the lab's research activities, was essentially to transform the culture and practices of the lab system," he says.

But some experts say that replicating Bell Labs today isn't a good idea, and likely isn't even possible. "So many people have tried to build a mini-Bell Labs and it has never quite succeeded," says Howard Anderson, a professor at MIT's Sloan School of Management and a venture capitalist. For one thing, when Bell Labs was at its height, "they didn't have 125 venture-capital firms ready to suck off all the brightest guys all at once," he says. When venture capitalists "see someone with a breakthrough who is on a government salary, we say, come over here, take $5 million, a chance to be rich." Over time, the hubs could be drained of the top talent essential to their functioning well.

William Aulet, director of the MIT Entrepreneurship Center, argues that the Bell Labs approach has been superseded in industry by a newer model exemplified by companies such as Cisco, which draw on many outside researchers and are open to spinning off technology into other companies. He says that while clusters of researchers are definitely a good idea, a more open, Cisco-type approach will ultimately be more effective than a lab that tries to do everything itself. Aulet is encouraged that more-recent descriptions of the hubs by the DOE include ties to industry, which could help foster such an open model.

The appropriations bills from the House and Senate committees are now awaiting conference, so there's a chance that funding for the hubs could be restored before a final vote. What's more, the Waxman-Markey energy and climate-change bill that's working its way through Congress also has a provision for very similar energy innovation hubs.



http://www.technologyreview.com/energy/23126/

A Browser's View of Your Computer

Researchers reveal how attackers may be able to peer into users' computers over the Web.

By Erica Naone


The Internet is already a difficult place to maintain privacy, and now two security researchers have revealed new ways to spy on Web users via the browser. At a presentation at DEFCON 17, a hacking conference held in Las Vegas last week, the researchers showed a variety of ways to snoop on people online, despite the privacy tools employed by most browsers.

Credit: Technology Review

Robert Hansen, CEO and founder of the Internet security company SecTheory, and Joshua Abraham, a security consultant for the security company Rapid7, demonstrated how to do everything from obtaining details of the software running on a user's system to gaining complete control of a computer. If the attacker can convince the user to visit a website he controls, perhaps through a link in an e-mail, a number of attacks on the user's browser become possible.

The attacks worked with minimal participation from the user and, in one case, none at all.

"Your privacy is up to whichever site you're visiting and what browser you're using," says Hansen, who emphasizes that users cannot trust the privacy controls built into a browser to keep them safe. "[Browser] privacy buttons are just a basic protection," he says. In many cases, they're mainly designed for benign situations, such as protecting a user's privacy from other members of a household. To a determined attacker, however, Hansen says these privacy protections aren't enough.

Hansen and Abraham showed how an attacker could build up detailed information about a user and her system with a variety of simple tricks. For example, by persuading a user to cut and paste a particular URL into a browser bar, an attacker can discover the person's username and the name assigned to her computer, and can gain access to files on that system. Similar attacks can detect what plug-ins the user has installed in her browser.

This sort of information can be used to build a targeted attack against a particular user, Abraham says. Knowing which plug-ins a user has installed, for example, makes it easier to break into a system using a software flaw.

Hansen and Abraham raised privacy concerns about Google Safe Browsing, a commonly used extension for the Firefox Web browser that is designed to warn users about malicious websites. The researchers say that the tool performs that function well, but it also regularly issues a cookie that could be used to track all of the websites that a user visits. This information could be revealed if, for example, a government chose to subpoena the data.

Abraham went on to demonstrate a Java applet--code that runs inside the browser--that could grant an attacker access to a user's machine, including encrypted files, and to the machine's microphone. To pull this off, the attacker has to get the user to click twice--once to visit a page the attacker controls, and once to click through a browser warning. However, Abraham says that an attacker could disguise the applet as legitimate software related to programs the user has already installed.

While many of the attacks revealed by the pair need to be customized to a particular person, Abraham says it might be worth the effort if, for example, an attacker is trying to gain access to a particular company network.

Hansen adds that the attacks don't call for much technical skill. "Most of the hard work has already been done for you," he says, since many of the tools needed to pull off the attacks are freely available online.

Kate McKinley, a security researcher with San Francisco-based iSec Partners who studies browser privacy, agrees that plug-ins such as Flash can open up privacy holes. She notes that most browsers offer a feature that clears private data, but says this often doesn't cover what is stored in plug-ins or certain newer browser features. Cookies stored in Flash, for example, can persist even when a user switches browsers, since Flash keeps its data in a separate, dedicated location.

Users can protect themselves, Hansen says, but this means changing their online habits. For example, users need to get into the habit of questioning any dialogue boxes that are thrown up by the browser. "Are you willing to trade off usability for your security and privacy?" he asks. "There's no easy answer, but we need to raise awareness of these issues."


http://www.technologyreview.com/web/23136/

Scaling Up a Quantum Computer

A series of sustained quantum operations shows promise for developing a practical device.

By Kate Greene


Researchers at the National Institute of Standards and Technology (NIST) in Boulder, CO, have demonstrated multiple computing operations on quantum bits--a crucial step toward building a practical quantum computer.

Shine on, ions: Beryllium ions are trapped inside the dark slit on the left side of this chip. When researchers focus lasers on the ions, the ions can be used to perform quantum calculations.
Credit: J. Jost at NIST

Quantum computers have the potential to perform calculations far faster than the classical computers used today. This superior computing power comes from the fact that these computers use quantum bits, or qubits, which can represent both a 1 and a 0 at the same time, in contrast to classical bits that can represent only a 1 or a 0. Scientists take a number of different approaches to creating qubits. At NIST, the researchers use beryllium ions stored within so-called ion traps. Lasers are used to control the ions' electronic states, depending on the frequency to which the laser light is tuned. The electronic states of the ions and their interactions determine the quantum operations that the machine performs.
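To make the idea of representing a 1 and a 0 at the same time concrete, here is a minimal Python sketch using NumPy. It shows generic qubit arithmetic only, not a model of NIST's ion-trap hardware: a qubit's state is a pair of complex amplitudes, and the squared magnitudes give the probabilities of reading out 0 or 1.

```python
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)   # the |0> state
ket1 = np.array([0.0, 1.0], dtype=complex)   # the |1> state

# An equal superposition: the qubit carries both possibilities at once.
psi = (ket0 + ket1) / np.sqrt(2)

# Readout probabilities are the squared magnitudes of the amplitudes.
print(np.abs(psi) ** 2)   # [0.5 0.5]

# A register of n qubits needs 2**n amplitudes, which is where the potential
# computing power (and the classical simulation cost) comes from.
```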

Over the past few decades, researchers have made steady progress toward a quantum computer, for instance, by storing quantum data or performing logic operations on qubits. But the NIST work, which is published online today by the journal Science, pieces together several crucial steps for the first time. The work involved putting an ion into a desired state, storing qubit data in it, performing logical operations on one or two of the qubits, transferring that information among different locations, and finally reading out the qubit result individually. Importantly, the researchers show that they can perform one operation after another in a single experiment.

"This is the next step in trying to put a quantum computer together," says Dave Wineland, lead researcher on the project. "It's nice to have reached this stage."

The NIST team performed five quantum logic operations and 10 transport operations (meaning they moved the qubit from one part of the system to another) in series, while reliably maintaining the states of their ions--a tricky task because the ions can easily be knocked out of their prepared state. In other words, the researchers had to be careful that they didn't lose quantum combinations of 1s and 0s while they manipulated their ions.

One of the major problems in performing multiple operations is that the ions heat up after a single operation, in which laser beams, tuned to specific frequencies, adjust the energy level of electrons. Once this happens, explains Jonathan Home, a postdoctoral researcher at NIST, the researchers can't do any further operations because the qubits can no longer hold both a 1 and a 0. To solve this problem, the researchers added magnesium ions to the mix. These ions are cooled with another set of lasers and, though the cold magnesium ions are not used for computation, they effectively chill the beryllium ions, keeping them in a stable state.

A second challenge when repeating operations inside this type of quantum computer is making sure that the ions are protected from stray magnetic fields, which can also cause them to lose their quantum state. To solve this problem, the researchers chose to encode the qubits in specific energy levels that are insensitive to changes in the surrounding magnetic field. This maintains the qubit's state for up to 15 seconds--plenty of time, says Home, to perform a series of millisecond-long operations. "Our particular choice of levels doesn't change with the magnetic field," he says. "We don't have to worry about the lifetime of the qubits anymore."

The experiment is a "milestone accomplishment," says Isaac Chuang, a professor in the electrical engineering, computer science, and physics departments at MIT. "Very much like the early evolution of transistors into calculators, this work demonstrates a complete assembly of basic steps needed for a scalable quantum computer." Chuang adds that the research "sets the bar" for other quantum computing systems.

In demonstrations, the researchers manipulated two qubits at a time. For ion-trap systems, the maximum number of qubits used in various experiments so far is less than 10. In order to outperform a classical computer, the researchers would need to perform operations on 30 or more qubits, suspects Home, something he thinks could happen in the next five to 10 years. While quantum computers hold promise for breaking ultrasecure encryption codes, he says that early quantum computers will most likely be used to simulate physical systems--for example, the electronic properties of materials.

But to get there, the researchers will need to improve their system. Currently, it performs with 94 percent accuracy; for a quantum computer to be reliable enough to use, it must be 99.99 percent accurate. A major source of error is intensity fluctuation in the lasers that perform the operations on the ions. However, new, more-reliable, and more-powerful ultraviolet lasers could solve this problem, says Home.
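A rough back-of-the-envelope calculation, my own arithmetic rather than anything from the paper, shows why the gap between 94 percent and 99.99 percent matters: the chance that a whole sequence of operations succeeds shrinks exponentially with its length.

```python
def sequence_success(per_op_accuracy, num_ops):
    """Probability that num_ops consecutive operations all succeed,
    assuming independent errors (a simplification)."""
    return per_op_accuracy ** num_ops

for accuracy in (0.94, 0.9999):
    print(f"per-op accuracy {accuracy}: "
          f"15 ops -> {sequence_success(accuracy, 15):.3f}, "
          f"1000 ops -> {sequence_success(accuracy, 1000):.3f}")

# At 94 percent, even the 15-operation sequence reported here succeeds only
# about 40 percent of the time; at 99.99 percent, a thousand operations still
# succeed roughly 90 percent of the time.
```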


http://www.technologyreview.com/computing/23137/


Thursday, August 6, 2009

Q&A: Aneesh Chopra, National CTO

The presidential adviser explains how information technology can reboot America.

By David Talbot


In announcing the appointment of Aneesh Chopra as the nation's first chief technology officer (CTO) in April, President Barack Obama said Chopra would "promote technological innovation to help achieve our most urgent priorities." So far, Chopra's federal policy focus has been on leveraging information technology to revamp health care, education, and the energy infrastructure.

Credit: David Deal

Chopra works within the Office of Science and Technology Policy under presidential science adviser John Holdren but also directly advises the president on technology policy--a new role in the executive branch. His job is distinct from that of Vivek Kundra, the nation's chief information officer (CIO), who provides oversight of the government's information-technology contracts and efforts to make the federal government more open and efficient.

Before he took the CTO post, Chopra served as Virginia's technology secretary. His efforts ranged from forging public-private initiatives on rural broadband expansion to launching a competition for iPhone apps to aid middle-school math students. Chopra, who is 37, spoke recently with Technology Review.

Technology Review: Why do we need a national CTO?

Aneesh Chopra: President Obama has suggested there is a thoughtful role for technology and innovation across a wide range of priorities. While we have had White House leadership on technology policy in the past, this Administration has taken a broader view of the power and potential of technology to reduce health-care costs, deliver energy efficiency through smart-grid applications, and improve the skills of the workforce.

TR: Some economists see little evidence that federal spending on broadband will have a payback for the economy. How does spending $10,000 to get broadband to a rural farmhouse help the economy?

AC: It's not just broadband for the sake of laying pipe and capacity--it's about spurring innovative applications. Our teams at the commerce and agriculture departments are collaborating with the private sector not only to extend access where service today doesn't exist, but also to achieve a wider range of goals. We envision innovation in health care through telemedicine, distance learning, and even smart-grid infrastructure.

One example of rural application innovation we championed in Virginia was creating regional, shared e-911 services powered by broadband. In funding this with seed capital, we anticipated both improvements in emergency coverage and reductions in long-term costs. And it is conceivable that a grant that supports a rural farmhouse would open up higher-wage telework opportunities to that resident.

TR: What will make government more effective at promoting these kinds of technology applications?

AC: Government's role in promoting technology has traditionally been in the investment of basic R&D or the procurement of goods and services. It is my intention as CTO to focus on public-private collaboration to operate between those two extremes. In some cases, we might invest in a more targeted R&D opportunity that would bring private sector resources, universities, and the public sector together on a given problem. In others, we might use a procurement opportunity to spur market innovation.

TR: How would that work?

AC: For example, defensesolutions.gov is a website that seeks to drive innovation towards Department of Defense needs. Instead of procuring a specific device described by a multi-thousand-item specification, the department asks for a solution, such as "How do you field-test for the presence of explosives, drugs, and gunshot residue?" By leaving room for unanticipated and potentially disruptive technologies in the private sector, we can procure an innovative solution. Dozens of ideas have already been submitted to defensesolutions.gov, and a few are already gaining traction.

TR: That method might find you a disruptive technology--but on health IT, aren't there plenty of well-established electronic medical records technologies out there?

AC: Yes, but to receive stimulus funds, health-care providers will have to demonstrate meaningful use of technology to improve care quality, lower costs, or improve patient engagement and communication. I am focused on helping ensure the "meaningful use" criteria are designed to spur health reform and promote new products or services. Today, we don't have benchmarks differentiating the various products. When we do, we are confident the market will drive towards better value in achieving them.

TR:Is "meaningful use" an idea that should be more broadly applied?

AC: I would love to see this model apply in other areas where we see policy benefit in the adoption and use of IT.

TR: On the smart grid, power utilities can prove "meaningful use" of existing technologies by documenting reduced electricity demand. Right now, though, the investment decisions are left mainly to 50 sets of state regulators and local utilities, and implementation is spotty. How can the federal government fix this?

AC: The federal role there has been very clear. First, we are seeding capital investment in this space through the Recovery Act--$4.5 billion for matching funds and demonstration projects. These initial projects are crucial to proving the value of the smart grid. Once the business case has been demonstrated, we believe that state and local decision-makers will continue investing in the build-out. Secondly, we are working through NIST [National Institute of Standards and Technology] on open standards to ensure the interoperability, reliability, and security of the smart grid. As we saw with the Internet, open standards enable innovation and scalability. These federal initiatives are complementary and in a collaborative spirit with state regulators and industry.

TR: Looking past these initial efforts, you are working on a broader policy document about spurring innovation. Can you give us at least a broad outline of what a national innovation policy would look like? What would it cover? What would be its goals?

AC: The Administration has three key goals for strengthening America's competitiveness and driving innovation. The first is improving the environment for private-sector innovation. This includes efforts to make the Research and Experimentation Tax Credit permanent, to encourage small businesses with targeted capital-gains tax reductions and improved access to capital, and to reform our patent system.

Second, we must invest in the building blocks of innovation such as human capital, fundamental research, and infrastructure. The president has committed to double the budgets of key science agencies, triple the number of National Science Foundation Graduate Research Fellowships, improve public school performance in science and math, and restore America's leadership in college attainment.

Finally, we must harness innovation to address key national priorities, including accelerating the transition to a low-carbon economy, allowing Americans to lead longer, healthier lives, and making government more open and transparent. All of this must be done with a view towards concrete measurable outcomes.


http://www.technologyreview.com/communications/23127/


Bringing Graphene to Market

A startup's conductive graphene inks can be used to print RFID antennas.

By Katherine Bourzac


A startup company in Jessup, MD, hopes later this year to bring to market one of the first products based on the nanomaterial graphene. Vorbeck Materials is making conductive inks based on graphene that can be used to print RFID antennas and electrical contacts for flexible displays. The company, which is banking on the low cost of the graphene inks, has an agreement with the German chemical giant BASF and last month received $5.1 million in financing from private-investment firm Stoneham Partners.

Crumpled graphene: Conductive inks made by startup company Vorbeck Materials contain crumpled graphene. This atomic-force microscope image is colorized to show the topography of a piece of graphene of the type used in the inks; red areas are higher and blue are lower.
Credit: Ilhan Aksay and Hannes Schniepp

Since it was first created in the lab in 2004, graphene has been hailed as a wonder material: the two-dimensional sheets of carbon atoms are the strongest material ever tested, and graphene's electrical properties make it a potential replacement for silicon in faster computer chips. Synthesizing pristine graphene of the quality needed to make transistors, though, remains a painstaking process that, as yet, can't be done on an industrial scale, though researchers are working on this problem.

Vorbeck Materials is making what company scientific advisor Ilhan Aksay calls "defective" graphene in large quantities. Though the electrical properties of the graphene aren't good enough to support transistors, it's still strong and conductive.

Vorbeck Materials licensed the method for making "crumpled" graphene from Aksay, a professor of chemical engineering at Princeton University. The company says the inks made with this crumpled graphene are conductive and cheap enough to compete with silver and carbon inks currently used in displays and RFID-tag antennas. (Another startup working on defective graphene, Graphene Energy of Austin, TX, is using a similar form of the material to make electrodes for ultracapacitors.)

Aksay's method begins by oxidizing graphite with acids. The oxidized graphite is then rapidly heated, creating carbon dioxide gas that builds up pressure and forces the atom-thin graphene sheets apart. This process is common, says Aksay, but his research group developed monitoring methods to improve the yield and ensure that the graphene sheets completely separate. The sheets are then heated again to remove the remaining oxygen groups. "The conductivity nears that of pristine graphene, but the sheets are crumpled so they don't stack together again," says Aksay. The resulting powder can be added to a solvent to make inks or added to polymers to make composites such as tough tire rubber.

Most conductive inks on the market are made out of silver particles. These inks can be printed using cheap techniques but the inks themselves are expensive. Silver is used instead of cheaper metals because it is less prone to oxidation. Silver inks are more conductive than Vorbeck's graphene ink, says company president John Lettow, but also much more expensive. They also have to be heat-treated after they're applied, which means they can't be printed on polymers and other heat-sensitive materials. Graphene ink requires no heat treatment and is more conductive than other carbon-based alternatives to silver inks.

Potential applications for the inks, Lettow says, include antennas for cheap RFID tags printed on paper and electrical contacts for displays. Nick Colaneri, director of the Flexible Display Center at Arizona State University, says the inks' conductivity may limit their application in displays to low-resolution devices.

Vorbeck Materials is in talks with electronics manufacturers to develop inks to their specifications. Lettow says the company will begin selling graphene inks by the end of the year.


http://www.technologyreview.com/business/23129/



Rogue Pharmacies Dominate Bing's Ads

Illegitimate pharmacies account for 90 percent of drug ads on Microsoft's Bing, according to a new report.

By Kristina Grifantini


The pharmacy ads that appear alongside search results on Microsoft's Bing are dominated by "rogue" companies, according to a report released yesterday by KnujOn, a spam-monitoring company, and LegitScript, a firm that verifies online pharmacies.

Fake drugs: Researchers ordered counterfeit Cialis (left, yellow) by clicking a Bing advertisement for affordabledrugs.com that redirected to a rogue Internet pharmacy, expressdelivery.biz.
Credit: LegitScript/KnujOn

The report investigates the ads that appear when a person enters search terms such as "generic meds" or "online pharmacy" into Bing. Of the 69 advertisers that the researchers investigated, only seven were deemed to be legitimate. The remaining 62 did not require a prescription (in violation of US law), did not have a US address, or offered to ship drugs from outside the US.

"We were able to get prescription drugs without a prescription, and some were counterfeit," says John Horton, founder of LegitScript. LegitScript states that over 40,000 online pharmacies do not meet the certifications of the National Association of Boards of Pharmacy (NABP), which stipulate that companies must have a valid pharmacy license, a location in the US, and only dispense medicine with a valid prescription.

For certain drugs, federal law states not only that a user needs a prescription but also that the prescribing doctor must have a bona fide relationship with the patient, generally consisting of face-to-face contact and the sharing of medical information. The US Drug Enforcement Administration also prohibits certain controlled drugs from being imported into the US. The FDA also recommends that consumers buy pharmaceuticals online only from companies that are based in the US, require a prescription, and are licensed by the NABP. These regulations aim to prevent the abuse of prescription drugs and the dangers of taking unregulated medicine that may be adulterated, expired, or toxic.

Many of the pharmaceutical ads on major search engines do not comply with these standards, says Garth Bruen, who runs KnujOn. "Almost 90 percent of the pharmacy ads that we reviewed are for fake pharmacies. This has been going on for a while," he adds.

The new report examined 10 online pharmacies in closer detail and confirmed through the websites' FAQs or through live-chats that no prescription was needed to order prescription drugs. In two cases, the researchers purchased prescription drugs, one of which turned out to be counterfeit.

"If you look on the major search engines, you will find ads from pharmacies that are not legitimate, selling controlled substances," says Susan Foster, vice president and director of policy research at the National Center on Addiction and Substance Abuse (CAMA) at Columbia University.

No prescription: Researchers live-chatted with a representative from rx-medical-center.com. A live-chat correspondent assured them that they did not need a prescription to order a muscle relaxant postmarked from India.
Credit: LegitScript/KnujOn

Bing, Google, and Yahoo all verify the legitimacy of online pharmacies by using PharmacyChecker, which covers overseas pharmacies in addition to US-based ones. For an online pharmacy to meet PharmacyChecker's standards, it must be licensed with a pharmacy board, require prescriptions based on face-to-face doctor-patient meetings, comply with patient-privacy laws, and have valid contact information and sufficient online encryption.

PharmacyChecker also covers Canadian pharmacies, because they undergo a regulation process as rigorous as the FDA's, according to Tod Cooperman, president of PharmacyChecker. "None of those 10 pharmacies [examined in the report] are approved members of the PharmacyChecker program," says Cooperman. He was unable to explain why ads for unverified pharmacies showed up on Bing.

"They shouldn't advertise any pharmaceuticals until they figure out ways to [regulate]," says Bruen. "They should require that anyone who uses specific drug names in their search criteria disclose their pharmacy license and brick-and-mortar location."

To sell ads, search engines typically run an auction for certain search keywords, with popular search terms, such as "Viagra," costing more. Every time a user clicks on a search ad, the seller pays the search engine a fee, ranging from a few cents to a few dollars, depending on the keyword. In this bidding system, Horton suggests that the abundance of illegal pharmacy ads may overwhelm legitimate ones. "If the drugs they're selling are counterfeit or knockoffs, they have lower costs than legitimate pharmacies," says Horton. "The rogue Internet pharmacies have more money to bid on advertising [and] drive up the auction rates, to the detriment of legitimate advertisers."
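The bidding dynamic Horton describes can be captured in a toy sketch. Real search engines run more elaborate generalized second-price auctions; the advertiser names and dollar figures below are invented purely for illustration.

```python
def run_keyword_auction(bids):
    """bids maps advertiser -> bid per click (dollars). The highest bidder
    wins the ad slot and pays its bid (a simple first-price rule)."""
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

bids = {
    "legitimate_pharmacy": 1.50,   # margins must cover genuine drugs
    "rogue_pharmacy": 3.00,        # counterfeit stock leaves a bigger ad budget
}

winner, price_per_click = run_keyword_auction(bids)
print(f"{winner} wins the slot at ${price_per_click:.2f} per click")
```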

Bruen, who monitors large, organized spam networks, also found that many online pharmacies are tied to spam groups. By looking at domain registration records, the researchers found that some pharmacies were located in Russia, India, or Panama, despite stating that they were based elsewhere.

E. J. Hilbert, a former FBI agent and the head of online security for Epic Advertising, says that search engines shouldn't allow advertisers to display one website address and then direct a user to a different one. "Why would you call it dailymedrx.com when it redirects to k2med.com, which is located in Russia?" he asks. Hilbert also suggests that search engines check the domain names of their advertisers to see where they are really based. "If an advertiser is bidding a lot higher for a keyword like 'Viagra' than others, it may be a red flag that they are illegitimate," he adds.

"The online advertising market is a multibillion-dollar business," Hilbert says. "The profit model is in favor of 'run the ads and buyer beware,' versus being consumer conscious."

Last year, NABP, the American Pharmacists Association, and CASA wrote to major search engines expressing concern about fraudulent pharmacies. "We never heard back from Google and Yahoo," says Foster. "We did eventually hear back from Microsoft [and] they indicated that they would look into the problem."

Horton and Bruen say that they plan to investigate additional search engines.


http://www.technologyreview.com/web/23128/

Wednesday, August 5, 2009

Cell on a Chip

The first artificial cell organelle may help researchers find a way to make bioengineered heparin and other synthetic drugs.

By Lauren Gravitz


The drug heparin is widely used to prevent blood from clotting in medical procedures ranging from dialysis to open-heart surgery. With a $6 billion market, it is one of the most common drugs used in hospitals today. But its widespread use belies its crude origins: more than 90 years after it was discovered, heparin is still made from pig intestines. Now a new microfluidics chip, which mimics the actions of one of the cell's most mysterious organelles, may help change that. Researchers at Rensselaer Polytechnic Institute in Troy, NY, have created the first artificial cellular organelle and are using it to better understand how the human body makes heparin.

Fake cell: This microfluidics chip can replicate the activity of one of the eukaryotic cell's most important, yet least understood, organelles--the Golgi apparatus. Researchers hope that it can help them understand how to create synthetic versions of important drugs such as heparin.
Credit: Courtesy JACS

Scientists have been working to create a synthetic version of the medication, because the current production method leaves it susceptible to contamination--in 2008, such an incident was responsible for killing scores of people. But the drug has proven incredibly difficult to create in a lab.

Much of the mystery of heparin production stems from the site of its natural synthesis: a cellular organelle called the Golgi apparatus, which processes and packages proteins for transport out of the cell, decorating the proteins with sugars to make glycoproteins. Precisely how it does this has eluded generations of scientists. "The Golgi was discovered over 100 years ago, but what happens inside it is still a black box," says Robert Linhardt, a biotechnologist at Rensselaer who's been working on heparin for nearly 30 years and is lead author of the new study. "Proteins go in, glycoproteins come out. We know the enzymes that are involved now, but we don't really know how they're controlled."

To better understand what was going on inside the Golgi, Linhardt and his colleagues decided to create their own version. The result: the first known artificial cell organelle, a small microfluidics chip that mimics some of the Golgi's actions. The digital device allows the researchers to control the movement of a single microscopic droplet while they add enzymes and sugars, split droplets apart, and slowly build a molecule chain like heparin. "We can essentially control the process, like the Golgi controls the process," Linhardt says. "I think we have a truly artificial version of the Golgi. We could actually design something that functions like an organelle and control it. The next step is to make more complicated reaction combinations."

"People have had bits and pieces of the toolbox for making these important carbohydrates, but one thing you should potentially do is try to emulate nature, or at least figure out how it works," says Paul DeAngelis, a biochemist and molecular biologist at the University of Oklahoma who was not involved in the research. "The miniaturization that they're doing--having little bubbles of liquid fuse and go to different compartments with different catalysts under different conditions--that's how your body and the Golgi apparatus works. It's a nice model."

Currently, researchers know what heparin looks like and what enzymes are required to make it, but they don't quite know how it's made. "It's like having all the materials and tools required to build a house and knowing what the final house looks like, and then having someone say, 'Okay, go build the house,'" Linhardt says. "What we need is a blueprint. We need to know how these tools function together, how the house is assembled." He likens the microfluidics chip to a house-building DIY reel, one that "tells us how to hammer nails, how to saw, how to assemble struts, how to put walls in." By testing reagents in different amounts, with different reaction times, the artificial Golgi may be able to teach them how to synthesize heparin and other molecules in a laboratory setting.

"It's a fusion of engineering and biology," says Jeffrey Esko, a glycobiologist at the University of California, San Diego. "One can do this in test tubes, but the chip provides a way to automate the process on a microscale." The chip also allows for precise control over each individual interaction, and at a small scale.

Linhardt believes that, with the help of the microchip and substantial funding from the National Institutes of Health, his team should be able to bring bioengineered heparin into clinical trials within the next five years.



http://www.technologyreview.com/biomedicine/23122/

Warning Issued on Web Programming Interfaces

Tools that connect websites can also open up new security vulnerabilities, experts say.

By Erica Naone


The rapid growth of Web applications has been fueled in part by application programming interfaces (APIs)--software specifications that allow sites and services to connect and interact with one another. But at the DEFCON hacking conference in Las Vegas last weekend, researchers revealed ways to exploit APIs to attack different sites and services.

Credit: Technology Review

APIs have been behind the meteoric rise of many key social sites. The social-networking site Facebook, for example, won huge gains in popularity and attention after opening its site to applications written by outside developers using its API.

The API of the microblogging media darling, Twitter, is also credited with partly driving its popularity. John Musser, the founder of Programmable Web, a website for users of mashups and APIs, says that the traffic that comes into Twitter through APIs--for example, from desktop clients--is four to eight times greater than the traffic that comes through its website. "The API has been crucial to the success of that startup," he says.

But researchers Nathan Hamiel of Hexagon Security Group and Shawn Moyer of Agura Digital Security say that APIs could also be exploited by hackers. They note that several APIs are often stacked on top of each other. For example, an API might be used by the developers of other websites who, in turn, publish APIs of their own. "There could be security problems at the different layers when this sort of stacking happens," Hamiel says.
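As a sketch of what that stacking looks like in practice, the hypothetical example below (the endpoints and field names are invented, not any real site's API) shows how a service that republishes a lower-layer API response unfiltered also inherits whatever that layer over-exposes.

```python
import requests  # any HTTP client would do


def site_a_lookup(user_id):
    """Bottom layer: a hypothetical site A API. Suppose it returns more
    fields than callers strictly need, such as an internal e-mail address."""
    resp = requests.get(f"https://api.site-a.example/users/{user_id}")
    resp.raise_for_status()
    return resp.json()  # e.g. {"id": ..., "name": ..., "internal_email": ...}


def site_b_profile(user_id):
    """Upper layer: site B's API is built on top of site A's. Forwarding the
    record unfiltered would forward the over-exposure, too; whitelisting the
    fields it actually means to publish keeps the layers decoupled."""
    record = site_a_lookup(user_id)
    return {"id": record["id"], "name": record["name"]}
```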

Hamiel also notes that APIs can open sites to new kinds of threat. For example, he points to APIs for building applications that work across multiple websites. These tools may allow developers to pull in content from third-party websites, but Hamiel says that this also opens up possibilities for attacks.

During his presentation Hamiel showed that an attacker might be able to use an API in unintended ways to gain access to parts of a website that shouldn't be visible to the public. "Whenever you add functionality, you increase your attack surface," Hamiel says, noting that what makes an API powerful is often the same as what makes it risky.

Programmable Web's Musser says that many of the security risks introduced by an API are similar to those found in desktop computers. In both cases, he says, security vulnerabilities exist wherever there is an access point that an attacker might abuse. Any site that builds its API on top of another site's API is relying on someone else's security, and it's not easy to look into what has been built to see how well it has been handled, Musser says. "Part of the fundamental issue is just how new the technology is," he adds.

Jeremiah Grossman, founder and chief technology officer for WhiteHat Security, says that sites that publish APIs can find it hard to discover security flaws in them. He notes that often it's difficult to tell how a third-party site is using an API, and if that site has been compromised by an attacker.

APIs are also harder to test than traditional websites, Grossman says. Though software tools have been developed that can analyze a site's underlying code to pinpoint potential vulnerabilities, those tools won't work for testing APIs. "It's a lot more manual with a lot less automation, and it means, at the end of the day for the business, more expense," he says.

But while experts agree that there's no easy fix for the risks introduced by APIs, they also say the technology isn't going away. "Websites are becoming Web services, and that trend isn't going to stop," Musser says.


http://www.technologyreview.com/web/23121/


A More Efficient Spacecraft Engine

NASA's new ion-propulsion system could be ready for launch as soon as 2013.

By Brittany Sauser


NASA engineers have finished testing a new ion-propulsion system for earth-orbiting and interplanetary spacecraft. The system is more powerful and fuel-efficient than its predecessors, enabling spacecraft to travel farther than ever before.

Ion power: NASA’s new ion-propulsion system is undergoing testing at the Jet Propulsion Laboratory in Pasadena, CA.
Credit: NASA

Ion propulsion works by electrically charging, or ionizing, a gas using power from solar panels and emitting the ionized gas to propel the spacecraft in the opposite direction. The concept was first developed over 50 years ago, and the first spacecraft to rely on the technology was Deep Space 1 (DS1) in 1998. Since then, one other spacecraft has used ion propulsion: the Dawn mission to the asteroid belt, launched in 2007.

To build the new ion-propulsion system under NASA's Evolutionary Xenon Thruster (NEXT) program, engineers at NASA's Glenn Research Center in Cleveland, OH, modified and improved the design of the engines used for DS1 and Dawn. "We made it physically bigger, but lighter, reduced the system's complexity to extend its lifetime, and, overall, improved its efficiency," says Michael Patterson, the principal investigator on the project.

Patterson presented a paper describing the engine at the Joint Propulsion Conference and Exhibit held this week in Denver. He says that his team could start building a mission-ready version of the engine by January 2010, which would take about 36 months to complete.

Chemical propulsion systems are most commonly used for spacecraft, but they require large amounts of fuel and are inefficient for deep-space missions. "You are limited in what you can bring to space because you have to carry a rocket that is mostly fuel," says Alexander Bruccoleri, a researcher in the aeronautics and astronautics department at MIT. In addition, he says, "you have to compensate for the weight and size of the propellant tanks by building a spacecraft that is flimsy or does not have many structures to reinforce it."

As an alternative, several research groups are exploring electric propulsion systems. While these engines produce much less thrust than chemical engines, they are very efficient, making them ideal for long-distance missions to asteroids, comets, or planets like Jupiter and Mercury. However, "one of the biggest challenges in electric propulsion is the high power and lifetime of the system," says Daniel Brent White, another researcher in aeronautics and astronautics at MIT.

The new ion engine builds upon the electric propulsion systems used by both DS1 and Dawn, says Patterson. It uses the same method to achieve thrust: xenon gas flows into a reaction chamber inside the engine and is ionized by electrons; electromagnets positioned around this chamber enhance the efficiency of ionization. Electrodes positioned near the engine's thrusters (known as ion optics) are then used to accelerate the ions electrostatically and shoot them out of the exhaust to push the spacecraft forward.
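The fuel efficiency comes from how fast the ions leave the engine. Here is a back-of-the-envelope sketch assuming a singly charged xenon ion and an illustrative 1,000-volt accelerating voltage; the voltage is an assumption made for the example, not a NEXT specification.

```python
import math

ELEMENTARY_CHARGE = 1.602e-19              # coulombs
XENON_ION_MASS = 131.29 * 1.66054e-27      # kilograms

def ion_exhaust_velocity(accelerating_voltage):
    """Speed gained by a singly charged ion falling through the given voltage."""
    return math.sqrt(2 * ELEMENTARY_CHARGE * accelerating_voltage / XENON_ION_MASS)

v = ion_exhaust_velocity(1000.0)           # meters per second
print(f"exhaust velocity ~ {v / 1000:.0f} km/s, "
      f"specific impulse ~ {v / 9.81:.0f} s")

# Chemical rockets top out near 4.5 km/s of exhaust velocity, which is why an
# ion engine needs far less propellant for the same total change in velocity.
```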

The Glenn Research Center engineers optimized the mechanical design of the engine's magnets and ion optics, and made other modifications, including reducing the number of thrusters, to make the system more powerful and more efficient. "The engine has a higher power level and a larger throttling dynamic range--it can go from very high power to very low power--so it can operate for longer periods of time and better execute its mission," says Patterson.

Michael Huggins, of the space and missile propulsion directorate in the Air Force Research Laboratory at Edwards Air Force Base in California, says it is important to find ways to make propulsion systems more efficient, smaller, and more economical. The fact that NASA is looking at more-efficient devices for interplanetary missions "is definitely the right answer," he says.

However, there are potential drawbacks to ion propulsion. For example, solar power cannot be relied on too far from the sun. "Solar just won't work out to distances like Neptune," says White, who presented a paper at the same conference on using nuclear energy as a power source for deep-space missions. While nuclear power would provide plenty of energy in deep space, safety concerns would make it politically challenging to launch a nuclear-powered spacecraft.

"The only competitor we really have is advanced chemical technology," says Patterson. "The advantage that we have is that we are very fuel efficient." Thus, for complex planetary missions that require lots of energy, says Patterson, the US and its international partners, including Japan and European nations, are transitioning to ion propulsion engines.



http://www.technologyreview.com/computing/23120/

Tuesday, August 4, 2009

Device Offers a Roadside Dope Test

The system uses magnetic nanoparticles to detect traces of cocaine, heroin, cannabis, amphetamine, and methamphetamine.

By Alexander Gelfand


Later this year, Philips will introduce a handheld electronic device that uses magnetic nanoparticles to screen for five major recreational drugs.

Quick fix: Philips' drug tester uses a cartridge containing magnetic nanoparticles and a handheld analyzer. Frustrated total internal reflection (FTIR) is used to detect five major recreational drugs in 90 seconds.
Credit: Philips Research

The device is intended for roadside use by law enforcement agencies and includes a disposable plastic cartridge and a handheld analyzer. The cartridge has two components: a sample collector for gathering saliva and a measurement chamber containing magnetic nanoparticles. The particles are coated with ligands that bind to one of five different drug groups: cocaine, heroin, cannabis, amphetamine, and methamphetamine.

Philips began investigating the possibility of building a magnetic biodetector in 2001, two years after a team of researchers at the Naval Research Laboratory (NRL) in Washington, DC, first used magnetic sensors similar to those employed in hard drives to sniff out certain biowarfare agents. The NRL scientists labeled biological molecules designed to bind to target agents with magnetic microbeads, and then scanned for the tagged targets optically and magnetically. The latter approach used the same giant magnetoresistive (GMR) sensors that read the bits on an iPod's hard drive. The team quickly developed a shoebox-sized prototype capable of detecting toxins, including ricin and anthrax.

Philips initially developed both a GMR sensor and an optical one that relies on frustrated total internal reflection (FTIR)--the same phenomenon that underlies fingerprint scanners and multitouch screens. The company decided to go the FTIR route in order to exploit its expertise in building optical sensors for consumer electronics devices, says Jeroen Nieuwenhuis, technical director of Philips Handheld Immunoassays, the division responsible for commercializing the biosensor technology, which goes by the trade name Magnotech.

Moving to an optical detection method also allowed Philips to simplify the test cartridges that the device employs, making them easier to mass-produce, says Nieuwenhuis. With the current FTIR-based system, "we can make simpler cartridges in larger quantities more easily," he adds.

Once the device's sample collector has absorbed enough saliva, it automatically changes color and can then be snapped into the measurement chamber, where the saliva and nanoparticles mix. An electromagnet speeds the nanoparticles to the sensor surface, different portions of which have been pretreated with one of the five target-drug molecules. If traces of any of the five drugs are present in the sample, the nanoparticles will bind to them. If the sample is drug free, the nanoparticles will bind to the drug-coated sensor surface instead.
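The binding step is a competitive assay: the more drug in the saliva, the more nanoparticles it occupies and the fewer are left to land on the drug-coated sensor spots, so the eventual optical signal falls as drug concentration rises. The crude sketch below uses invented numbers and a one-particle-per-molecule simplification purely to illustrate that inverse relationship; it is not a model of Philips's chemistry.

```python
def particles_on_sensor(drug_molecules, total_nanoparticles=1000):
    """Crude model: each drug molecule in the sample captures one nanoparticle;
    only the uncaptured particles can bind to the drug-coated sensor surface."""
    captured = min(drug_molecules, total_nanoparticles)
    return total_nanoparticles - captured

for drug in (0, 100, 500, 1000):
    print(f"{drug:5d} drug molecules -> {particles_on_sensor(drug):4d} particles on the sensor")
```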

The orientation of the magnetic field that first drew the nanoparticles to the sensor is then reversed, pulling away any nano-labeled drug molecules that may accidentally have stuck to the sensor surface but leaving legitimately bound ones in place. This last magnetic trick promises to reduce what Larry Kricka, a clinical chemist at the University of Pennsylvania who recently co-authored an article in Clinical Chemistry on the use of magnetism in point-of-care testing, calls "a major restraint in such assays": the unintentional capture of molecular labels on the test surface, a leading cause of both false positives and false negatives. Kricka is not involved with Philips but does serve as a consultant to T2 Biosystems, a Cambridge, MA, firm that promotes a magnetic biosensor based on MRI technology.

During the analysis phase, a beam of light is bounced off the sensor. Any nanoparticles bound to the surface will change its refractive index, thereby altering the intensity of the reflected light and indicating the concentration of drugs in the sample. By immobilizing different drug molecules on different portions on the sensor surface, the analyzer is able to identify the drug traces in question. An electronic screen displays instructions and a simple color-coded readout of the results.

The test takes less than 90 seconds and can detect drugs at concentrations measured in parts-per-billion using a single microliter of saliva. The sensor is capable of even greater sensitivity--it has been used to detect cardiac troponin, a commonly used indicator of heart attack, at concentrations 1,000 times lower.

Philips plans ultimately to enter the healthcare market. It is working on a platform capable of testing blood as well as saliva and is seeking partners that can help expand its testing menu by providing it with additional biomarkers.

Other researchers have built experimental devices to magnetically detect a wide range of biomolecules in minuscule samples of blood or saliva at extremely low concentrations. Often this involves using microfluidic or magnetic forces to quickly shepherd the magnetically labeled molecules through scanners--though a group at the University of Utah has even built a prototype in which a sample-laden stick is swiped across a GMR sensor, like a credit-card through a reader.

The combination of high sensitivity, low sample volumes, miniaturization, speed, and ease of use has raised hopes for a handheld biosensor that could perform sophisticated tests with high accuracy.

"Everyone's trying to get there," says Kricka. "The question is who's going to win?" With Philips set to introduce its drug tester in Europe by the end of the year in partnership with the British diagnostics firm Cozart, the consumer electronics maker appears poised to take the prize.


http://www.technologyreview.com/biomedicine/23111/

Rapid TB Detector

An ultrasensitive test can spot bacteria in a half hour.

By Katherine Bourzac


One third of the world's population is infected with tuberculosis. Detecting the bacteria is time-consuming and expensive, even in hospitals with sophisticated lab equipment. And in the poor countries where the infection is most prevalent, people often don't have access to this equipment. Researchers at Massachusetts General Hospital in Boston and Harvard University have now demonstrated that a handheld device can be used to count as few as 20 bacteria in a sputum sample in a half hour. They hope to develop the test into an inexpensive product that can be deployed for TB testing.

TB detector: A handheld device can count as few as 20 bacteria in a sputum sample. A plastic card inside the device uses microfluidics to direct magnetically labeled bacteria in a sample to one of two chambers surrounded by metal coils. The coils are used for nuclear magnetic resonance imaging of the sample.
Credit: Hakho Lee

The bacteria counter is being developed by researchers led by Ralph Weissleder, director of the Center for Systems Biology and the Center for Molecular Imaging Research at Harvard Medical School, and Hakho Lee, an instructor at Mass General. The technology uses magnetic-nanoparticle labels and a detector that works on the same principles as magnetic resonance imaging. They're focused on tuberculosis, says Lee, because "even one bacterium can cause the disease, but at this point there is no easy way to detect the bacteria at high sensitivity."

The biggest problem with existing tests is that they are too slow, says Peter Katona, associate clinical professor of infectious diseases at the University of California, Los Angeles. The most accurate way to identify the infection is to grow a sample in the lab. But because TB grows slowly, that can take as long as six weeks. What's more, expensive culture equipment is typically not available in poor areas where the infection is prevalent.

The cheapest and fastest way to detect TB is a skin test that screens for an immune reaction. But such tests are not particularly accurate. "There are a number of conditions where there is no immune reaction" even if the patient carries the infection, says Steven Miller, director of the clinical laboratories at the University of California, San Francisco Medical Center. TB and HIV often go hand in hand, but in HIV-positive patients the skin test doesn't work. Another common test, staining a sputum sample with a dye that targets TB and examining it under the microscope, also has a high rate of false negatives. "Unless there is a very high load of bacteria, you can't pick it up," says Miller.

The Harvard detector can find very small loads of bacteria. It's a miniaturized version of a nuclear magnetic resonance imager, a very sensitive but typically large and expensive device used for clinical and chemical applications such as brain imaging and determining protein structures. The size and expense of typical nuclear magnetic resonance imagers is dictated by the need for a strong magnet. Weissleder's group simplified the instrument into a portable, one-pound device with disposable parts by compromising on signal quality and by placing the sample chamber right inside the radio-frequency coils. "When you're measuring bacteria, you don't need high resolution--you just need to pick up one pattern," says Lee.

As proof of principle, Weissleder and Lee demonstrated that they could detect a bacterium very similar to tuberculosis in sputum samples. First, the viscous sample must be liquefied. Then it's mixed with a solution of cannonball-shaped iron nanoparticles coated in antibodies that stick to the bacteria. The sample is loaded onto the detector, which uses microfluidics to force it through a channel fitted with a screen that traps bacteria and washes away any nanoparticles that haven't bound to a target. This channel is surrounded by a metal coil that pulses the trapped bacteria with radio-frequency waves under the influence of a magnet. This causes the iron nanoparticles to emit a magnetic signal, in turn affecting the protons in the surrounding water molecules. The Harvard device picks up on these changes, whose magnitude and duration are directly proportional to the number of labeled bacteria in the sample.
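Since the measured change is described as proportional to the number of labeled bacteria, turning a reading into a count amounts to a linear calibration. The sketch below uses invented calibration numbers; it is not data from the paper.

```python
import numpy as np

# Hypothetical calibration samples: known bacteria counts vs. the measured
# change in the NMR signal (arbitrary units).
counts = np.array([0.0, 20.0, 100.0, 500.0, 1000.0])
signal = np.array([0.0, 0.8, 4.1, 19.7, 40.2])

slope, intercept = np.polyfit(counts, signal, 1)   # fit: signal = slope*count + intercept

def estimate_count(measured_signal):
    """Invert the calibration line to estimate how many bacteria were labeled."""
    return (measured_signal - intercept) / slope

print(f"a reading of 2.0 suggests roughly {estimate_count(2.0):.0f} bacteria")
```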

The bacteria detection process takes about 30 minutes and is as sensitive as processes that use culture samples grown in the lab. The results are described in the journal Angewandte Chemie.

"Trying to diagnose very low levels of bacteria in a sample while maintaining high quality is not an easy thing to do," says pathologist Miller. The Harvard test is very sensitive--that is, it can detect low levels of bacteria--but until the device undergoes more tests, it's impossible to say how specific it is. If it proves to have high rates of false positives, says Miller, it won't be viable in places like the United States, where tuberculosis rates are low. However, says Miller, "there could be a lot of value for a cheap and easy test like this in areas with high tuberculosis prevalence."

The researchers are collaborating with the Harvard School of Public Health to test the device on clinical samples from patients carrying tuberculosis.


http://www.technologyreview.com/biomedicine/23110/

Hackers Game a Multiplayer World

Two programmers reveal covert ways to automate characters in the immensely popular game.

By Robert Lemos


It's the simple necessities that sometimes spur invention. For Christopher Mooney, four years ago, it was the need to take a shower. A senior at the University of Southern Maine at the time, Mooney was in the midst of a long quest with a group of friends in the immensely popular online game World of Warcraft. Mooney didn't want to leave his friends in the lurch and then have to redo the quest all over again. So instead, he cobbled together some code to keep his character running with the party and healing anyone who needed it, then left his computer to freshen up.

Fantasy world: World of Warcraft is a popular massively multiplayer online role-playing game (MMORPG), with more than 11.5 million subscribers as of December 2008.
Credit: Blizzard Entertainment

On Friday, Mooney and colleague James Luedke showed off an evolved version of the original trick at DEFCON 17, a hacker conference in Las Vegas: a set of programs to automate in-game characters that have so far evaded detection by World of Warcraft's developer Blizzard Entertainment.

"Playing the game was fun, but what kept me up at night was figuring out ways to change the environment and extend the game experience," Mooney says. "Over the years, the stuff we did wrong, the things we rewrote, it must have totaled a full-time job for a year."

The project, dubbed Behead the Prophet (BTH) by the two programmers, includes code for automating characters described as "helpers." Such automated programs, known as "bots," are controversial in massively multiplayer online role-playing games (MMORPGs) because they are often used to automate the collection of valuable items--an activity known as "gold farming." Moreover, some bots use programming loopholes to cheat in other ways, for example, by giving characters super speed or the ability to attack more quickly.

Blizzard allows some third-party developers to create scripts and in-game add-ons that enhance the user interface. But the company has taken measures to prevent third-party developers and hackers from using in-game information in external programs in ways it does not approve. The company has even created a program, called the Warden, to detect programs that violate its policies.

Mooney and Luedke argue that their programs are benign. They programmed their helpers to wait until a character from a particular guild asks for assistance and then follow that character's lead in taking certain actions: healing, casting spells, and attacking enemies.

To avoid detection as well as legal issues, Mooney and Luedke created a script written in the Lua programming language that makes decisions based on what's happening within the game. The script's decisions are represented as a particular color in a bar at the top of the screen. A second program uses this color to determine which keys to press in order to control the helper character. "The outside program is the stupid thing--it just presses keys," Mooney says. "All the power is on the inside add-on."
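To make that split concrete, here is a minimal sketch of the "outside" half of such a design, written in Python rather than whatever the pair actually used: it polls the color of a single pixel drawn by the in-game Lua add-on and translates it into a key press. The pixel coordinates, the color-to-key table, and the libraries (Pillow for screen capture, pyautogui for key presses) are all illustrative assumptions, not details from the talk.

```python
# Minimal sketch of the "dumb" external program described above: it reads the
# color of a pixel drawn by the in-game Lua add-on and presses a key in
# response. Pixel coordinates, the color-to-key table, and the timing are
# invented for illustration; Pillow and pyautogui are assumed to be installed.
import time
from PIL import ImageGrab   # screen capture
import pyautogui            # synthetic key presses

PIXEL = (10, 10)            # hypothetical location of the add-on's color bar
COLOR_TO_KEY = {
    (255, 0, 0): "1",       # e.g. red   -> cast a heal (bound to key 1 in-game)
    (0, 255, 0): "2",       # green      -> cast a damage spell
    (0, 0, 255): "3",       # blue       -> follow the lead character
}

def read_pixel(xy):
    """Grab a 1x1 region of the screen and return its RGB color."""
    img = ImageGrab.grab(bbox=(xy[0], xy[1], xy[0] + 1, xy[1] + 1))
    return img.getpixel((0, 0))[:3]

while True:
    key = COLOR_TO_KEY.get(read_pixel(PIXEL))
    if key:
        pyautogui.press(key)   # all decision logic lives in the Lua add-on
    time.sleep(0.2)            # poll a few times per second
```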

Judging from past efforts to ban similar programs, it is likely that Blizzard Entertainment will take a dim view of the duo's activities.

In February, Blizzard successfully argued in court that a company called MDY Industries, which created a similar in-game helper program to automate a user's character for short periods of time, had circumvented the game maker's protections and violated copyright. The bot, called MMO Glider, allowed users to automate the sometimes-onerous tasks of killing monsters and collecting loot.

"They are saying that we own the license and, if you don't follow the license terms, we are taking away your license and you are a copyright infringer," says Jef Pearlman, fellow and staff attorney at Public Knowledge, a Washington, DC-based digital-rights group. "It's a very worrisome model."

Blizzard Entertainment did not immediately comment on the DEFCON presentation.

Rather than trying to eliminate bot programs, Mooney argues, Blizzard should start a handful of separate servers as a playground for developers and players who want to experiment with automating their characters. Aside from helping eliminate boring quests in which characters have to kill an onerous number of monsters--an activity referred to as "grinding"--the separate environment could be a good place to test new approaches to automation and machine intelligence, he says.

"There is a community of developers that enjoy this type of game experience," Mooney says. "I think that would go a long way toward preventing the bitter back and forth between Blizzard and their developers."


http://www.technologyreview.com/computing/23112/

Monday, August 3, 2009

Nanotube-Powered X-Rays

Tiny electron emitters inside an x-ray generator could improve medical imaging and cancer therapy.

By Prachi Patel


Carbon nanotubes are at the heart of a new x-ray machine that is slated for clinical tests later this year at the University of North Carolina (UNC) Hospitals. The machine could perform much better than those used today for x-ray imaging and cancer therapy, say the UNC researchers who developed the technology. They have shown that it speeds up organ imaging, takes sharper images, and could increase the accuracy of radiotherapy so it doesn't harm normal tissue.

Capturing the heart: In a new scanner, carbon nanotubes fire electrons instantly to generate x-rays. This gives sharp, high-resolution pictures, such as this one of a fast-beating mouse heart.
Credit: Otto Zhou, University of North Carolina

Conventional x-ray machines consist of a long tube with an electron emitter, typically a tungsten filament, at one end and a metal electrode at the other. The tungsten filament emits electrons when it is heated to 1,000 degrees Celsius. The electrons are accelerated along the tube and strike the metal, creating x-rays.

Instead of a single tungsten emitter, the UNC team uses an array of vertical carbon nanotubes that serve as hundreds of tiny electron guns. While tungsten requires time to warm up, the nanotubes emit electrons from their tips instantly when a voltage is applied to them.

The researchers presented work on their nanotube scanner at the meeting last week of the American Association of Physicists in Medicine.

Physics and materials science professor Otto Zhou cofounded a company called Xintek in Research Triangle Park, NC, to commercialize the technology. Xintek has teamed with Siemens Medical Solutions to form a joint-venture company, XinRay Systems, which has developed the prototype system that will be clinically tested this year.

Taking clear, high-resolution x-ray images of body organs is much easier with the new multi-beam x-ray source, Zhou says. Conventional computerized tomography (CT) scanners take a few minutes to create clear 3-D images using x-rays. "Because the radiation is coming from one point in space, the machine has to move the [electron] source and detector around the object," Zhou says. The x-ray emitter fires while the tube moves. Because the heart and lungs move during the scan, images can blur, so a CT scanner takes hundreds of pictures that are synthesized to reconstruct a 3-D image.

The new machine, by contrast, turns multiple nanotube emitters on and off in sequence to take pictures from different angles without moving. Because the emitters turn on and off instantaneously, says Daniel Kopans, director of breast imaging at Massachusetts General Hospital, the system should be able to take more images every second. This faster exposure, Kopans says, should reduce blur, much as a high-speed camera captures ultrafast motion. Zhou and his colleagues have been able to take breast images at nearly twice the resolution of commercial scanners, using 25 simultaneous beams in a few seconds.
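A toy sketch of the control idea may help: the loop below pulses a stationary array of emitters one at a time and tags each detector frame with the angle of the emitter that produced it, replacing mechanical rotation with electronic switching. The driver functions, the timing, and the geometry are hypothetical placeholders, not the UNC or XinRay API.

```python
# Toy sketch of the control idea only: instead of rotating a single source,
# a stationary array of nanotube emitters is pulsed in sequence and each
# detector frame is tagged with the view angle of the emitter that fired.

NUM_EMITTERS = 25           # the article mentions 25 beams for breast imaging
PULSE_SECONDS = 0.001       # placeholder exposure per emitter

def pulse_emitter(index, duration):
    """Hypothetical driver call: apply voltage to one nanotube emitter."""
    ...

def read_detector_frame():
    """Hypothetical driver call: return the detector image from the last pulse."""
    ...

def acquire_projections():
    """Collect one projection per emitter without any mechanical motion."""
    projections = []
    for i in range(NUM_EMITTERS):
        pulse_emitter(i, PULSE_SECONDS)
        frame = read_detector_frame()
        angle = i * (360.0 / NUM_EMITTERS)   # placeholder geometry, not the real arc
        projections.append((angle, frame))
    return projections   # handed off to standard CT reconstruction afterwards
```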

Fast, real-time imaging will in turn improve cancer treatment. "State-of-the-art radiation therapy is highly image-based," says Sha Chang, a professor of radiation oncology at the UNC School of Medicine who is working with Zhou. Pictures of the tumor area are taken so that radiation can be focused on the tumor, sparing the normal tissue surrounding it. But since today's scanners are slow, Chang says it isn't possible to take 3-D images and treat the patient at the same time. "Using the [nanotube] x-ray imaging device allows [us] to collect 3-D imaging while we're treating the patient, to make sure high-dose radiation and heat [are] delivered to the right place," she says.

The clinical test results will determine whether Xintek can enter the medical-imaging market. Meanwhile, the company is also selling its nanotube emitters to display manufacturers. Companies such as Samsung and Motorola are making displays based on nanotube emitters; because these displays work on the same principle as bulky cathode-ray-tube TVs--shooting electrons at a screen coated with red, green, and blue phosphors--they promise a CRT's brightness and sharpness while consuming less power than liquid-crystal displays or plasma screens.

Xintek's imaging technology is also proving useful for research on laboratory animals. It can take sharp cardiac images of mice, which is hard because of their rapid heartbeats. Zhou says that biomedical researchers at UNC are already using the system and are installing a second unit at the medical-school research facility.



http://www.technologyreview.com/biomedicine/23107/

Solar Industry: No Breakthroughs Needed

The solar industry says incremental advances have made transformational technologies unnecessary.

By Kevin Bullis


The federal government is behind the times when it comes to making decisions about advancing the solar industry, according to several solar-industry experts. This has led, they argue, to a misplaced emphasis on research into futuristic new technologies, rather than support for scaling up existing ones. That was the prevailing opinion at a symposium last week put together by the National Academies in Washington, DC, on the topic of scaling up the solar industry.

Cheaper solar: First Solar’s improvements in manufacturing photovoltaics have helped lead to big drops in cost. A worker at a First Solar factory in Frankfurt, Germany, moves one of the company's solar panels.
Credit: First Solar

The meeting was attended by numerous experts from the photovoltaic industry and academia. And many complained that the emphasis on finding new technologies is misplaced. "This is such a fast-moving field," said Ken Zweibel, director of the Solar Institute at George Washington University. "To some degree, we're fighting the last war. We're answering the questions from 5, 10, 15 years ago in a world where things have really changed."

In the past year, the federal government has announced new investments in research into "transformational" solar technologies that represent radical departures from the crystalline-silicon and thin-film technologies already on the market. The investments include new energy-research centers sponsored by the Department of Energy and a new agency, ARPA-E (the Advanced Research Projects Agency-Energy), modeled after the Defense Advanced Research Projects Agency. Such investments are prompted by the fact that conventional solar technologies have historically produced electricity that's far more expensive than electricity from fossil fuels.

In fact, Energy Secretary Steven Chu has said that a breakthrough is needed for photovoltaic technology to make a significant contribution to reducing greenhouse gases. Researchers are exploring solar cells that use very cheap materials or even novel physics that could dramatically increase efficiency, which could bring down costs.

But industry experts at the Washington symposium argued that new technologies will take decades to come to market, judging from how long commercialization of other solar technologies has taken. Meanwhile, says Zweibel, conventional technologies "have made the kind of progress that we were hoping futuristic technologies could make." For example, researchers have sought to bring the cost of solar power to under $1 per watt, and as of the first quarter of this year one company, First Solar, has done this.
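As a rough illustration of why the dollar-per-watt figure matters, the sketch below converts an installed cost in dollars per watt into an approximate cost per kilowatt-hour. Every input (total installed cost, capacity factor, lifetime, discount rate) is an illustrative assumption, not a figure from the article or from First Solar.

```python
# Back-of-the-envelope levelized-cost sketch. All inputs are illustrative
# assumptions, not figures reported in the article or by any company.

def levelized_cost_per_kwh(capital_per_watt, capacity_factor, lifetime_years, discount_rate):
    """Rough levelized cost of energy (USD/kWh) for a solar installation.

    capital_per_watt -- total installed cost, USD per watt of capacity
    capacity_factor  -- average output as a fraction of nameplate capacity
    lifetime_years   -- assumed operating life
    discount_rate    -- annual rate used to annualize the up-front cost
    """
    # Capital recovery factor turns an up-front cost into an equivalent annual payment.
    crf = discount_rate * (1 + discount_rate) ** lifetime_years / (
        (1 + discount_rate) ** lifetime_years - 1)
    annual_cost_per_watt = capital_per_watt * crf          # USD per watt per year
    annual_kwh_per_watt = capacity_factor * 8760 / 1000    # kWh produced per watt per year
    return annual_cost_per_watt / annual_kwh_per_watt

# Example: $3/W installed (module plus balance of system), 20% capacity factor,
# 25-year life, 8% discount rate -> roughly $0.16/kWh, before operating costs.
print(round(levelized_cost_per_kwh(3.0, 0.20, 25, 0.08), 3))
```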

These cost reductions have made solar power cheaper than the natural-gas-powered plants used to produce extra electricity to meet demand on hot summer days. With subsidies, which Zweibel argues are justified because of the "externalities" of other power sources, such as the cost from pollution, solar can be competitive with conventional electricity even outside peak demand times, at least in California. And projected cost decreases will make solar competitive with current electricity prices in more areas, even without subsidies.

Representatives of the solar industry say the federal government should do more to remove obstacles that are slowing the industry's development. One issue is financing for new solar installations, which can be much more expensive if lending institutions deem them high risk. A recent extension of federal tax credits and grants for solar investments is a step in the right direction, many solar experts say. But more could be done. A price on carbon would help make solar more economically competitive and more attractive to lenders.


http://www.technologyreview.com/energy/23108/

Less May Be More for Wind Turbines

Nordic Windpower's two-bladed rotors depart from conventional wind-power design.

By Peter Fairley


One of the first awards to a renewable-energy developer from the economic-stimulus funds approved by Congress this spring could have a dramatic impact on the design of wind turbines. The $16 million loan guarantee offered by the U.S. Department of Energy (DOE) to Berkeley, CA-based Nordic Windpower will accelerate commercialization of the company's Swedish-designed, two-bladed wind turbines--the first utility-scale alternative to the industry's dominant three-bladed design in over a decade.

Less to lift: Nordic Windpower’s N1000 wind turbines use two blades to generate up to one megawatt of power, making them cheaper to build than conventional three-bladed machines.
Credit: Nordic Windpower

In recent years, wind-energy entrepreneurs have been pushing beyond the standard design. Blue H Technologies of the Netherlands and Norway's SWAY, for example, are testing unorthodox turbine designs tailored for placement on platforms anchored in deep water far from shore. Blue H is testing two-bladed turbines akin to Nordic's, while SWAY has a three-bladed design that faces the rotor downwind, bucking the industry's conventional into-the-wind orientation.

For all three companies, the attraction of these unconventional approaches is essentially the same: their designs could be substantially lighter than today's turbines and could thus produce energy at much lower cost. That remains an important goal for wind power, which, though presently the cheapest form of renewable-power generation, is still dependent on government incentives.

What sets Nordic apart from others rethinking wind-turbine architecture is that its prototypes have been operating successfully for over a decade. Backed by Goldman Sachs since 2007 and now by the DOE, the company plans to begin shipping commercial models later this year. Nordic's experience should help overcome skepticism that such alternative designs can be robust in megawatt-scale machines--skepticism that was reinforced by earlier, failed experiments with two-bladed and downwind turbines.

Nordic Windpower CEO Tom Carbone, who formerly led the U.S. operations of Danish wind-turbine giant Vestas, says that Nordic's key technology is the "teetered hub" on which the two blades are mounted. The hub provides a flexible link between the rotor and the generator driveshaft, enabling the blades to move in and out of the plane of rotation in response to gusts or turbulence. Carbone says Nordic's lightweight design can deliver a whopping 20 to 25 percent cost reduction relative to three-bladed turbines.

Bumpers constrain the hinging to just two degrees in either direction, but that is enough to shed unwanted forces that would otherwise strain the turbine's gearbox. Shedding unwanted forces also means that the entire structure, from tower to generator to blades, can be built lighter and cheaper. "You're reducing the amount of material normally used to strengthen the structure against those loads," says Carbone.

Delivering on the promise of savings, however, is harder than it sounds. Nordic's design, Carbone says, "is pretty simple in function. It's just a hinge that's perpendicular to the rotor. But perfecting that simplicity took a lot of time and effort."

Larry Miles has spent the last decade trying to develop a flexible two-bladed wind turbine with individually hinged blades. Miles's Wind Turbine Company, based in Bellevue, WA, was preparing to push a 500-kilowatt prototype of its hinged-blade turbine to 750 kilowatts when a control-system error allowed one blade to swing too far and strike the tower. The resulting damage ultimately caused the DOE to withdraw support for the firm's research program.

Carbone says that such setbacks have not tarnished Nordic because its teetered design is already well proven. Since the mid-1970s, the Swedish government has poured close to $75 million into Nordic's Swedish predecessor, producing a string of prototypes. Four of the five one-megawatt turbines Nordic has installed since 1995 are still running, demonstrating an average mechanical reliability of 98 percent. "We can catapult off that experience and use today's control [systems] and materials to make an even better product," says Carbone.

Nordic's plan is to first validate its design by selling turbines to community-scale wind-energy developments with up to 20 turbines--projects that are too small to support a dedicated maintenance staff and therefore need reliability. Nordic says it already has orders for 19 turbines for small installations at a military base in Arizona, a housing development in Minneapolis, and a power project in Uruguay. The primary use of the DOE loan guarantee will be expansion of the company's assembly plant in Pocatello, ID.

Carbone says Nordic will ramp up carefully to assure reliability and customer satisfaction before engineering its next step: a 2.5- to 3-megawatt turbine to compete for use by large, utility-scale wind farms. "In 2012 we will prototype a 2.5 to 3-megawatt machine, which will take us to a $1 billion company in roughly seven years from now," promises Carbone.

Miles estimates that the savings from Nordic's design might be closer to 10 percent. But Miles says that could still make an important difference: "If their machine works reliably, they're going to have a definite cost advantage over a three-blade machine."


http://www.technologyreview.com/energy/23109/