Microserver powered by the sun

IBM scientist Ronald Luijten has many hobbies, from gliding over the Swiss Alps at 4,000 meters to taking photos with his quadrocopter to tinkering with technology -- particularly microservers, which he refers to as "data centers in a box."

By day, Ronald is working on a 64-bit microserver for the Square Kilometre Array (SKA), an international consortium building the world’s largest and most sensitive radio telescope. He hopes that someday petabytes of Big Data from the Big Bang (13 billion years ago) will be crunched on the microserver, helping answer fundamental questions about the universe, including: are we alone?

By night and on the weekends, Ronald has also built a microserver to host his website swissdutch.ch. The passionate environmentalist spent this past weekend "unplugging" his microserver from the electricity grid and now powers it from solar panels backed up by batteries.

"On September 27, 2014, I changed the energy source of the Wandboard Quad to solar panels. I installed 40W of photovoltaic panels feeding an 18Ah lead-acid battery (2x 9Ah). The panels come in increments of 20W, and I did not think 20W was enough to make it through winter. Note that around this time of year (September), the sun is about midway between its lowest and highest points in the sky. So, I pointed the panels due south at an angle of 45 degrees," Ronald said.
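As a rough illustration of the sizing involved, the back-of-envelope budget below checks whether 40 W of panels and an 18 Ah battery can carry a small board through a stretch of poor sun. The panel and battery figures come from Ronald's setup; the server draw, battery voltage and winter sun-hours are assumptions made for this sketch, not measurements from his system.

```python
# Back-of-envelope energy budget for a solar-powered microserver.
# Only the 40 W panel and 18 Ah battery come from the article;
# everything else is an assumed, illustrative figure.
PANEL_W = 40            # installed photovoltaic capacity (from the article)
BATTERY_AH = 18         # lead-acid capacity, 2x 9 Ah (from the article)
BATTERY_V = 12          # assumed nominal battery voltage
SERVER_W = 5            # assumed draw of a Wandboard-Quad-class board
SUN_HOURS_WINTER = 1.5  # assumed equivalent full-sun hours on a winter day

harvest_wh = PANEL_W * SUN_HOURS_WINTER  # energy collected per day
load_wh = SERVER_W * 24                  # energy consumed per day
battery_wh = BATTERY_AH * BATTERY_V      # reserve, ignoring depth-of-discharge limits

print(f"daily harvest: {harvest_wh:.0f} Wh, daily load: {load_wh:.0f} Wh")
days_of_reserve = battery_wh / load_wh
print(f"battery alone lasts ~{days_of_reserve:.1f} days with no sun")
```

Under these assumptions the winter harvest falls short of the daily load, which is consistent with Ronald's worry that 20 W of panels would not make it through winter.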

To keep track of Ronald's progress, visit his blog or follow him on Twitter @ronaldgadget.


The algorithms of show business

New IBMer begins work to make Watson work smarter, having made HBO’s Silicon Valley look smarter

Vinith Misra
“We need to talk.” Generally not something you want to hear from your PhD advisor. But when it was followed by, “Have you heard of HBO?”, then-Stanford student, now-IBM Watson engineer Vinith Misra was intrigued (and as a film enthusiast, a little amused). His advisor, Dr. Tsachy Weissman, had been contacted by the technical advisor for HBO’s Silicon Valley in 2013 to help the show’s creators develop fictionalized compression algorithms, and he wanted to bring Vinith on board.

The series, heading into its second season, follows the story of Richard Hendricks, a brilliant young programmer and founder of Silicon Valley startup Pied Piper. He and his colleagues face a race against time – in the form of competition from a bigger tech company reverse-engineering their work – to land funding from venture capitalists.

Favorite algorithm? 
“The most memorable algorithm I’ve heard has to be the solution to the countably-infinite prisoners-with-hats problem. You really have to admire the ludicrousness of it (and the insanity of its claim).” 

Favorite film? 
“I’m a big fan of the director Bong Joon-Ho, and his film Memories of Murder is probably my favorite. There is a new layer every time you watch it.”

For Vinith, who studied and now works in information theory and machine learning, the project held great promise. The challenge: create a lossless compression algorithm more powerful and efficient than anything that currently exists – or, rather, to make it look like they had created such an algorithm. They had to come up with something that isn’t currently possible, but isn’t immediately identifiable as such.

Vinith approached the challenge of writing the algorithm like he would any research problem. “You have a few things to keep in mind when formulating a problem. It needs to be impactful to people, it needs the potential to work, and the ideas should be elegant, compelling, and provocative,” he said.

With script in hand, Vinith knew that the algorithms were in essence their own character in the show, so he needed to make sure his work could play the part. He combined elements of lossy algorithms for their visual aesthetics on a whiteboard, with the lossless algorithms the show called for, and even created the Weissman Score — a fictitious compression benchmark that could fool even the biggest fanboys.
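The Weissman Score that came out of this work is usually written as W = α · (r / r̄) · (log T̄ / log T), comparing a codec's compression ratio r and run time T against a reference codec's r̄ and T̄. A minimal sketch of the formula in Python follows; the constant α and all sample numbers are arbitrary choices for illustration:

```python
import math

def weissman_score(r, t, r_ref, t_ref, alpha=1.0):
    """Weissman score: alpha * (r / r_ref) * (log t_ref / log t).

    r, t         -- compression ratio and run time of the codec under test
    r_ref, t_ref -- ratio and run time of a standard reference codec
    alpha        -- scaling constant (an arbitrary choice)
    Times must be > 1, in the same units, for the logs to behave.
    """
    return alpha * (r / r_ref) * (math.log(t_ref) / math.log(t))

# A codec that compresses better AND runs faster than the reference
# scores above 1; matching the reference exactly scores alpha.
print(weissman_score(r=2.5, t=10.0, r_ref=2.0, t_ref=20.0))
```

The ratio-of-logs term rewards speed less aggressively than a plain time ratio would, which is part of what makes the benchmark look plausible on a whiteboard.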

Vinith and Weissman think it’s reasonable to imagine radically innovative compression algorithms of this sort emerging in the future – which explains why these algorithms would also survive a cursory glance from highly trained engineers. “If you gave these ideas to a first-year grad student, they could run with them,” Vinith said. It would require a more detailed analysis to uncover the algorithms’ fundamental infeasibility.

Hard at work finishing his doctoral studies in electrical engineering, Vinith didn’t spend much time on set during filming. Instead, he would conduct what he called “firefighting calls” with cast and crew, adjusting technical dialogue, finding material for the whiteboards and sets, and helping producers and actors deal with unexpected technical elements.

Spoiler alert!

Vinith knew that the finale to the first season would involve a drastic alteration to Pied Piper’s compression algorithm – an alteration that was the product of a mathematical joke from a pivotal scene. Coding even as his team pitches the compression in the final episode, Richard comes up with a groundbreaking development after a humorous debate about how to sway the potential funders to choose their work. This breakthrough, which set the stage for the second season, stemmed from the phrase “middle out.” Vinith even published a 12-page analysis of the scenario that gives rise to the breakthrough, and while it is highly detailed and mathematically rigorous, it is explicit; reader discretion is advised.

Vinith will continue to work on the show, but this month he also began a new chapter with IBM’s Watson Group. “Watson is transitioning from the conceptual to working on real problems…and I’m glad to be a part of it,” he said.

Vinith will work on Watson out of the IBM Research – Almaden lab, where he will be developing algorithms to reason about concepts across a variety of domains, including subjects ranging from food to baseball cards — all in an effort to make Watson more agile.

Regarding the future of the show, Vinith can’t reveal much. He continues to work on Pied Piper’s breakthroughs and is looking forward to seeing the finished product of his work on screen. “I hope people treat the show like science fiction,” he said, “but the good kind.” He says it has been a fantastic and unique experience, and one from which he has gained a greater appreciation for the aesthetic side of what he does. “Algorithms and systems are designed to be used, but the ideas behind them can often be compelling, even to non-technical people.”


IBMer earns "Genius Grant"

Craig Gentry's cryptography recognized by MacArthur Foundation

In 2009, computer scientist Craig Gentry solved a cryptography problem – one posed in 1978. The problem: can encrypted data be analyzed without being accessed? Thought impossible for more than 30 years, Craig’s “fully homomorphic encryption” technique did just that. And the John D. and Catherine T. MacArthur Foundation took notice. They recognized the impact this solution may have on cloud computing and how we protect information on the web by naming him a MacArthur Fellow.

“It has the potential to pave the way for more secure cloud computing services – without having to decrypt or reveal original data,” said Craig. His team later earned a patent for the efficient implementation of fully homomorphic encryption.

He explained to the Foundation how homomorphic encryption works with a physical analogy of the fictitious “Alice’s Jewelry Store.”

“Alice wants her workers to turn raw materials into rings and necklaces, but she doesn't trust her workers. So, she creates these glove boxes that have locks on them. She then puts the raw materials inside and locks the box. The workers can stick their hands into the box's gloves to manipulate the raw materials to create the jewelry. And then she can unlock the box to remove the finished piece.

“This is what I try to do with cryptography (and could apply to cloud computing).”
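Gentry's fully homomorphic scheme is far beyond a short snippet, but the glove-box idea can be glimpsed in a much older, partial example: unpadded ("textbook") RSA is multiplicatively homomorphic, so multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. The toy below uses tiny, insecure primes purely for illustration; it is not Gentry's technique, which supports arbitrary computation rather than a single operation:

```python
# Toy demonstration of a homomorphic property, NOT Gentry's scheme:
# textbook RSA lets you multiply "inside the locked box."
# Tiny primes for illustration only; textbook RSA is insecure in practice.
p, q = 61, 53
n = p * q                           # public modulus, 3233
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

enc = lambda m: pow(m, e, n)        # encrypt: m^e mod n
dec = lambda c: pow(c, d, n)        # decrypt: c^d mod n

a, b = 7, 6
c = (enc(a) * enc(b)) % n           # work on ciphertexts only
assert dec(c) == (a * b) % n        # decrypting yields the product
print(dec(c))                       # -> 42
```

The point of the analogy survives even in this toy: the "worker" who multiplies the ciphertexts never learns 7, 6 or 42; only the key holder who unlocks the box does.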

Craig Gentry on what it means to earn the “genius grant” (its unofficial title since the first Fellows were named in 1981).

The MacArthur Foundation extends each Fellow “a no-strings-attached stipend of $625,000, paid out over five years, with no stipulations or reporting requirements, and allows recipients maximum freedom to follow their own creative visions.” And while the Foundation does lay out its process for choosing the Fellows, the award has achieved near-mythic status: no one can apply, no one knows if they are being considered, and when they’re told, they’re sworn to secrecy until the official announcement.


Growing single-crystalline materials on reusable graphene

Editor’s note: This article is by IBM Research Staff Member and Master Inventor Dr. Jeehwan Kim

Dr. Jeehwan Kim
Since the first demonstrations of removing graphene from graphite a decade ago, the size of a single-oriented piece has been limited to less than a millimeter – far too small for real-world application. Our team worked to break through the original millimeter barrier, and together we found success in producing wafer-scale, single-crystalline sheets of graphene 100 millimeters (four inches) in diameter. Having achieved scalability, we reported the possibility of replacing a thick, “single-crystalline” wafer with a single-atom-thick graphene layer for growing single-crystalline materials in the paper, “Principle of direct van der Waals epitaxy of single-crystalline films on epitaxial graphene” – published today with my chief collaborator Dr. Can Bayram in Nature Communications.

Graphene holds incredible promise as a linchpin material for breakthroughs in numerous technologies, and my team at IBM’s Thomas J. Watson Research Center is working to make its potential a reality. Due to its incredible strength and supreme electrical, optical and mechanical properties, graphene – pure carbon functional at the thickness of one atom – has been touted as the next big thing in everything from high-frequency transistors and photo-detectors, to flexible electronics and biosensors. IBM is investing $3B over the next five years towards initiatives such as this, which are building a bridge to a “post-silicon” era. 

Part of what makes the material so promising is its strength relative to thickness. At only 0.3 to 0.4 nanometers thick (that’s 60,000 times thinner than a sheet of plastic wrap, or 1,000,000 times thinner than a strand of human hair), graphene is an astonishing 200 times stronger than steel. It is also the most conductive material yet discovered, extraordinarily flexible and – as a single layer of carbon atoms – the first two-dimensional material.

Our groundbreaking approach, known as “graphene-based growth/transfer,” allows single-crystalline semiconductor film growth on graphene – rather than on expensive, single-crystalline wafers. The graphene serves as a “seed” for single-crystalline film growth, and because this film can be separated precisely from the graphene surface, the graphene can be reused for further growth. In principle, graphene has therefore been demonstrated as an infinite source for growing these semiconductor materials, making the work an enormously cost-effective and reliable production method for single-crystalline films.

Graphene’s periodic hexagonal crystal structure then allowed us to experiment with growing other semiconductor materials that demonstrate similar structural properties. Previously, production of single-crystalline semiconductor films required the use of ~1 millimeter-thick, single-crystalline wafer templates that were not reusable and were very expensive. For example, growth of a 4-inch, wafer-scale GaN (gallium nitride, a direct bandgap semiconductor) film would require a 4-inch SiC wafer – at the cost of some $3000. Now, graphene can be produced in a lab to replace the expensive SiC wafer. 

Furthermore, the new growth technique is useful in that semiconductor devices can be deposited on graphene and released or transferred to a flexible substrate.

While we have demonstrated an important, present-day use for this material, the future of graphene as a standalone material is still bright. Uses for graphene are being developed for a number of electronics, and over the next five years, the material could be used as transparent electrodes for touch screen devices, rollable e-paper and foldable LEDs. In the near future, uses are being developed for large-area graphene in high-frequency transistors, logic transistors/thin-film transistors and beyond. Its high electronic mobility – the ability of charged particles to move through a medium in response to an electronic field – makes graphene a promising material. 

Read Principle of direct van der Waals epitaxy of single-crystalline films on epitaxial graphene by Jeehwan Kim, Can Bayram, Hongsik Park, Cheng-Wei Cheng, Christos Dimitrakopoulos, John A. Ott, Kathleen B. Reuter, Stephen W. Bedell and Devendra K. Sadana


Analyzing healthcare's big data of real-world evidence

  • A custom-produced pill engineered for your unique genetic makeup, designed to fight the exact genotype of cancer in your metabolism.
  • A living avatar incorporating a genetic replica of your cancer, allowing tests to be conducted on the avatar before real-world application.
  • Differentiating between Munchausen Syndrome and domestic abuse by collecting data on a patient from various parts of a health system.
None of the above is fiction; each is an emerging reality – and they are just the tip of the iceberg of innovations and discoveries emerging from an unexpected source: Big Data. Like so many other medical advances today, these are examples of how hugely powerful computers, processing and analyzing vast amounts of data, are changing the way healthcare is conducted and administered.

“Genome testing of cancer is critical because the disease is driven by genetic mutations,” said Ya’ara Goldschmidt, leader of IBM’s Healthcare Analytics work at the company’s Research lab in Haifa, Israel. “That’s why a great deal of the genomics work that has been done in the past decade focuses on the sequencing of this particular disease.” 

Goldschmidt was speaking at the conclusion of two days of workshops discussing clinical genomic analysis and medical informatics innovations that brought together a full mix of academia, industry, health providers and policymakers from Israel’s healthcare ecosystem, as well as from abroad.

“With the enormous amounts of genomic data available today, the challenge is to analyze it methodically to better understand and control disease, ultimately providing treatment recommendations at the point of care. With the technological breakthroughs of the last few years and sequencing costs dropping rapidly, today we can do things we couldn't have imagined just a decade ago,” said Michal Rosen-Zvi, Senior Manager of Analytics at IBM Research - Haifa.

Recent years have seen a dramatic increase in the availability of data collected in the practice of healthcare. Analyzed properly, these data (known as real world evidence or RWE) are transforming healthcare for everyone, from providers to practitioners to patients. IBM's Haifa lab is playing a leading role – both in the analysis itself, and in bringing players together to engage in dialog and the exchange of ideas.

Scientists at the lab have developed decision support solutions that blend cloud and mobile technologies with advanced analytics to gather, manage, analyze, and visualize data on different kinds of cancer and disease. These technologies include machine learning to infer the complex associations between genetic factors, demographic data, disease progression, and treatment options.

“Even the policymakers are conscious of the tremendous benefit we can derive from data analysis,” Rosen-Zvi said. “The legal and ethical obstacles that obstructed progress are slowly being resolved, and this is allowing us to make headway. The fact that the head of Israel’s Ministry of Health attended the event illustrates the legislature's awareness.”

A keynote presented by Isaac Kohane of Harvard Medical School explained how data analysis of patients “bouncing around” a health system (constantly checking into hospitals with a variety of issues) can pinpoint likely domestic abuse. He also explained how medical informatics could have identified the dangers of the painkilling drug Vioxx earlier by collecting information about heart attacks from different hospitals. The drug was removed from the market after clinical studies.

“It's imperative that we improve data sharing among all partners in our health systems,” said IBM's Ranit Aharonov, who organized the Genomic Analysis Workshop with the Safra Center for Bioinformatics at Tel Aviv University. “Technological barriers exist, as well as legal, and of course, commercial. Drug companies, for example, spend enormous sums on research so they're not enthusiastic about sharing their data freely. But even they are recognizing that improved data exchange is for their benefit too.”

Among the many presentations, Fresenius Medical Care, the world's largest integrated provider of products and services for people undergoing dialysis, explained how they gather data about alternating-day visits of their dialysis patients. Dozens of factors in the data are analyzed and processed, then guidelines are provided to the attending physician during each subsequent patient visit. Studies showed that when physicians adhere to the guidelines, patient outcome is improved.

Prof. David Sidransky of Johns Hopkins University explained how human cancer cells are transferred to a mouse - a process called xenografting - allowing genetic testing to be conducted without harming the patient. When the correct formula is found to counter that particular cancer, it can be administered to the patient. 

“In past gatherings, we'd talk about what it will be possible to do in the future,” Rosen-Zvi said. “But these two days showed us that the future is already here. The data is now available and we've begun using it to improve healthcare for everyone.”

Many of IBM's current healthcare advances address chronic care and cancer because of their potential impact on society. With new possibilities in genomic sequencing, cognitive computing and other analytics technologies, IBM is providing decision support to enable more reliable diagnoses and care plans, including treatment options. The company is also working with healthcare partners across the globe on exciting technologies for medical training and other chronic-care areas, such as diabetes, heart disease and mental health.


The energy to innovate

IBMer and MIT TR35 honoree is making electricity accessible and available

Tanuja Ganu grew up in a small town in India about 400 kilometers south of Mumbai, where – like much of the country – energy outages happen all the time. 

“The voltage was often so low that the lights were dim and the refrigerator would burn out.

“I studied for exams by candlelight, and endured summers without working fans. To deal with this as children, we learned to time-shift critical things we needed electricity for – like cooking and cleaning," Tanuja said.

Now an engineer at IBM Research, Tanuja was recognized by MIT as a “2014 Innovator Under 35” for building solutions that begin to solve these challenges. Her collaboration with the University of Brunei Darussalam led to the inventions of SocketWatch, nPlug, and iPlug.

The Indian electricity sector, despite having the world's fifth-largest installed capacity, suffers from a 12.9% peaking shortage. This shortage could be alleviated if a large number of deferrable loads could be moved from on-peak to off-peak times.

Q&A With Tanuja Ganu: Experience to expertise

IBM Research: How did the experience of dealing with electrical outages influence your decision to work in this field?

Tanuja Ganu: Knowing the inconvenience of time-shifting, I was particularly fascinated with the idea of democratizing the Demand Side Management (DSM) of energy. It’s something that average citizens can make a difference doing by simply reducing their consumption during peak hours and avoiding other energy wastage (like leaving the TV and other appliances on standby).

IR: But you studied computer science and machine learning at university. How did you connect that expertise with energy and utilities – and eventually your solutions of nPlug, SocketWatch and iPlug?

TG: I first learned practical engineering from my father (also an engineer) when we had to fix appliances at home. These projects got me interested in engineering and particularly influenced my thinking about inventing and applying knowledge to solve real-world problems.

Though I graduated with a degree in computer science and completed graduate studies in data mining and machine learning, I looked for domains where I could address real societal problems using data insights and technological change. During campus interviews, I came to know about the Smarter Energy group at IBM Research-India. It was the perfect combination of computer science and electrical engineering techniques specifically addressing energy issues. After an internship with the group, I joined as an engineer in 2011.

IR: Where did your ideas for nPlug, SocketWatch and iPlug come from?

TG: My first project was nPlug, or “Smarter Planet in a Plug.” It is aimed at alleviating peak usage loads through inexpensive, autonomous DSM. Working with a team of engineers with backgrounds in embedded systems and power optimization, we developed a device that fits between the wall socket and appliances such as hot water heaters and even electric vehicles. nPlug senses line voltage and line frequency (indicators of stress on the grid), and then uses machine learning techniques to infer peak periods as well as supply-demand imbalance conditions. It then schedules usage for the attached appliances in a decentralized manner to alleviate peaks whenever possible – without violating the requirements of consumers.
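The interview does not publish nPlug's actual algorithm, but the core idea of inferring peak periods from the line itself can be sketched: on a stressed grid, voltage sags during peak hours, so hours whose average voltage sits well below the overall mean can be flagged as peak, and deferrable loads scheduled around them. Everything below – the threshold rule and the synthetic data – is an illustrative assumption, not IBM's implementation:

```python
# Minimal sketch of voltage-sag-based peak inference (not IBM's algorithm):
# flag hours whose mean voltage falls well below the overall average.
def infer_peak_hours(readings, k=1.0):
    """readings: list of (hour, voltage) samples collected over many days.
    Returns the set of hours whose mean voltage is more than k standard
    deviations below the mean of the hourly averages."""
    by_hour = {}
    for hour, v in readings:
        by_hour.setdefault(hour, []).append(v)
    hourly_mean = {h: sum(vs) / len(vs) for h, vs in by_hour.items()}
    overall = sum(hourly_mean.values()) / len(hourly_mean)
    spread = (sum((m - overall) ** 2 for m in hourly_mean.values())
              / len(hourly_mean)) ** 0.5
    return {h for h, m in hourly_mean.items() if m < overall - k * spread}

# Synthetic week of data: voltage sags to ~225 V at hours 18-20 (evening
# peak) and holds ~235 V otherwise.
samples = [(h, 225.0 if 18 <= h <= 20 else 235.0) for h in range(24)] * 7
print(sorted(infer_peak_hours(samples)))  # -> [18, 19, 20]
```

A real device would also watch line frequency and adapt over time, but even this crude rule shows how a plug can learn peak hours locally, with no communication to the utility.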

SocketWatch is another device that fits between an appliance and the wall socket. It autonomously monitors the appliance’s usage – and based on the appliance’s power consumption, SocketWatch alerts the end consumer of the device’s proper usage (preventing energy waste). For example, it can switch off a TV if it is on standby mode, or alert the consumer about the energy “leaking” from a refrigerator due to a malfunction, like a leaking gasket.

Our most-recent project, iPlug, will help distributed energy sources such as a home’s rooftop solar panels. It – like our other devices – autonomously decides how to route electricity from solar panels back to the grid (on the most loaded phase during peak times), or to store or use the energy locally, based on the home’s usage needs.

IR: How do machine learning, data mining and analytics play a role in these energy projects?

TG: Thanks to advances in embedded systems and sensor technologies, a lot of high-frequency data related to energy parameters – such as line voltage, frequency, active power, and reactive power – is available for analysis, like finding irregularities in the operations of energy systems. My skills in machine learning and data mining help analyze and bring insight from the data by writing pattern-learning algorithms.

Once the patterns are analyzed, optimization skills help in coming up with optimal strategies to solve specific issues at hand. For example, in the case of nPlugs, we apply machine learning techniques to line voltage and frequency data to understand the times of peak demand and supply-demand mismatches. Then we apply optimization techniques to determine preferred times to schedule an appliance in a decentralized manner such that they follow user-defined deadlines, but do not overload the grid.

IR: What stage have these projects reached? And what results have you been able to show?

TG: Though we have not evaluated these devices in large scale pilots yet, we have evaluated prototypes of nPlugs and SocketWatches in real-life settings.

We’re able to show that nPlugs correctly defer loads such as storage water heaters to off-peak hours without inconveniencing their owners. We have also studied the collective behaviors of thousands of nPlugs using simulations. They are able to reduce peak loads by up to 45 percent with a realistic mix of deferrable loads.

And we can show that SocketWatches are able to accurately pinpoint malfunctions in appliances, such as air conditioners (blocked air filters and obstructed fans) and refrigerators (gasket leakage). 

IR: How do you envision these devices being used in the future?

TG: I think there are multiple ways these devices could roll out to consumers and the industry. Utility companies can subsidize nPlugs for high consuming deferrable loads, like electric vehicle charging, to alleviate peak demand.

In the case of SocketWatch, since it provides alerts for reducing electricity waste, helps in preventive maintenance, and lowers a home’s electric bill, it could be directly commercialized to end consumers. And we could also partner with appliance manufacturers since these devices could be integrated within an appliance.

Read more about Tanuja and her work in MIT Technology Review’s 2014 Innovators Under 35.


Oil Applies Brakes to Molecules under STM at Room Temp

IBM scientists Marilyne Sousa, Peter Nirmalraj, Heike Riel and Bernd Gotsmann
Since the first microscope was invented, researchers and scientists around the world have searched for new ways to stretch their understanding of the microscopic world. In 1981, two IBM researchers who went on to become Nobel Laureates, Gerd Binnig and Heinrich Rohrer, broke new ground in the science of the minuscule with their invention of the scanning tunneling microscope (STM), which enabled scientists to visualize the world all the way down to its molecules and atoms using a quantum phenomenon called tunneling.

Tunneling atoms escape the surface of a solid to form a kind of cloud that hovers above the surface; when another surface approaches, its atomic cloud overlaps and an atomic exchange occurs. By maneuvering a sharp metal conducting tip over the surface of a sample at an extremely small distance, Binnig and Rohrer found that the amount of electrical current flowing between the tip and the surface could be measured. Variations in this current could provide information about the inner structure and the height-relief of the surface. And from this information, one could build a three-dimensional atomic-scale map of the sample’s surface to reveal what atoms look like for the first time.
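The instrument's extreme sensitivity comes from the exponential dependence of the tunneling current on tip–surface distance, I ∝ V·exp(−2κd) with κ = √(2mφ)/ħ in the standard textbook model (a general result, not specific to this paper). The sketch below shows the consequence: for a typical metal work function, the current falls by roughly an order of magnitude for every ångström the tip retracts.

```python
import math

# Exponential distance dependence of the STM tunneling current
# (textbook one-dimensional barrier model): I ∝ V * exp(-2 * kappa * d),
# with kappa = sqrt(2 * m_e * phi) / hbar for barrier height phi.
M_E = 9.109e-31    # electron mass, kg
HBAR = 1.055e-34   # reduced Planck constant, J*s
EV = 1.602e-19     # joules per electronvolt

def decay_per_angstrom(phi_ev):
    """Factor by which the tunneling current falls when the tip
    retracts by one angstrom, for a barrier height phi_ev (in eV)."""
    kappa = math.sqrt(2 * M_E * phi_ev * EV) / HBAR  # inverse meters
    return math.exp(-2 * kappa * 1e-10)              # delta-d = 1 angstrom

# For a typical metal work function of ~4.5 eV the current drops to
# roughly a tenth of its value per angstrom of retraction -- the
# sensitivity that lets an STM resolve individual atoms.
print(decay_per_angstrom(4.5))
```

It is this steep falloff that turns tiny height variations in a surface into large, easily measured changes in current.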

IBM scientists in Zurich continue to push the boundaries of this instrument, and in a paper appearing in Nature Materials today, titled “Nanoelectrical analysis of single molecules and atomic-scale materials at the solid/liquid interface,” they have developed a new and frugal technique that enables the direct imaging and stable electrical readouts of single molecules in a liquid environment using STM at room temperature.

In the unique and ultra-controlled environment of the Noise Free Labs of the Binnig and Rohrer Nanotechnology Center, scientists are using a high-density liquid called silicone oil as a liquid brake to nearly freeze single-molecule motion, and ultra-thin organic spacers to electronically decouple the molecules from the contact metals.

In-situ STM image of individual fullerene
molecules adsorbed on spacer-coated gold substrate.
The combined effect allows the scientists to record high-resolution, real-space images and decode the intrinsic electronic structure of single molecules.

The technique has been successfully extended to further resolve the atomic lattice, quantify topological defects and map the band structure of monatomic graphene.

In addition to applications in hybrid electronics, where electronically active molecules (organic switches) are embedded into 2-D crystal matrices, these findings provide new pathways and insights in mapping DNA electronic structure and dynamics as they interact with graphene nanopores, which has direct implications in engineering genome sequencing devices.

This research was done in collaboration with chemists from IMDEA in Spain and theoretical physicists from the University of Limerick, Ireland.