IBM 5 in 5 and Global Poverty


ARMONK, New York – Imagine mobile devices that allow farmers to touch a crop to determine its health, computers that help doctors see diseases before they occur, laptops that hear and interpret the sounds of the environment, computers that understand taste to solve food problems around the world, or cell phones that have a sense of smell to help medical practitioners identify critical diseases. This is IBM 5 in 5, a prediction of how the world could change in five years, and what it could mean for global poverty.

Touch: Mobile devices that allow you to touch through your phone

Sight: See a pixel that is worth a thousand words

Hearing: Computers that hear what matters

Taste: Digital taste buds that help you eat a healthy diet

Smell: Computers that have a sense of smell

“These five predictions show how cognitive technologies can improve our lives, and they’re windows into a much bigger landscape – the coming era of cognitive systems,” wrote Bernard Meyerson on the Smarter Planet Blog. Meyerson is IBM’s Chief Innovation Officer.

What is cognitive technology?

When Watson, IBM’s smartest computer, beat Brad Rutter and Ken Jennings, two champions of the TV quiz show Jeopardy!, it marked the start of the era of cognitive computing – “one that could learn, reason and understand natural language.”

Watson is now helping physicians expand their cognitive boundaries by providing greater access to large volumes of information such as patient medical history, clinical journals and scholarly articles. Watson not only leverages the computer’s ability to deal with huge amounts of data, but also understands that knowledge and applies it to the problem.

According to IBM, these new computing systems can help city leaders in any country prepare for natural disasters, forecast electrical outages, plan evacuations and organize emergency equipment and personnel to respond to affected areas.

“One of the most intriguing aspects of this shift is our ability to give machines some of the capabilities of the right side of the human brain,” Meyerson emphasized. “New technologies make it possible for machines to mimic and augment the senses.”


Imagine farmers in developing countries using a mobile device to find out the health of their crops by comparing what they are growing against a lexicon of healthy samples they can feel and touch through a tablet.

Currently, it is already possible to “recreate a sense of texture through vibration.” For example, cell phones produce vibrations that are a “recognizable sensation” for the user. Robyn Schwartz, associate director of IBM Research Retail Analytics, and her team indicate that what is needed now is to translate those vibrations “into a lexicon, or dictionary of textures that match the physical experience.”

IBM intends to capture texture qualities in a Product Information Management (PIM) system through digital image processing and digital image correlation. Farmers could match textures in the dictionary, which include data such as sizes, dimensions and color. The dictionary of textures can expand as technology develops. This has vast implications for developing countries that rely on agriculture.
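To make the matching idea concrete, here is a toy sketch of how a measured texture could be looked up in such a dictionary. The crop labels, feature values and the nearest-neighbor approach are invented for illustration; they are not IBM’s actual system or data.

```python
import math

# Hypothetical texture "lexicon": each entry maps a texture label to a
# feature vector (e.g. vibration frequency, amplitude, roughness score).
# All entries and values here are illustrative, not real data.
TEXTURE_LEXICON = {
    "healthy_wheat": (220.0, 0.8, 0.3),
    "blighted_wheat": (140.0, 0.5, 0.7),
    "healthy_rice": (260.0, 0.9, 0.2),
}

def match_texture(sample, lexicon=TEXTURE_LEXICON):
    """Return the lexicon label whose feature vector is closest
    (by Euclidean distance) to the measured sample."""
    return min(
        lexicon,
        key=lambda label: math.dist(sample, lexicon[label]),
    )

# A measured sample close to the "blighted_wheat" profile:
print(match_texture((150.0, 0.55, 0.65)))  # blighted_wheat
```

A real system would extract richer features (size, dimensions, color, as the article notes), but the lookup step reduces to the same idea: find the dictionary entry closest to the measurement.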


John Smith, senior manager of IBM Intelligent Information Management, believes that computers will not only be able to see images, but also help us understand them. He reveals that “by taking a cognitive approach, and showing a computer thousands of examples of a particular scene, the computer can start to detect patterns that matter,” and help medical practitioners see diseases before they happen.

Medical imaging such as MRI, X-ray and CT scans already plays a crucial role in aiding doctors to identify life-threatening tumors and blood clots. However, the telltale features in these images can be subtle and require a skilled eye. A computer can be trained with pattern recognition techniques to effectively recognize critical issues in these images before a tumor becomes visible.
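The “thousands of examples” approach can be sketched with a minimal pattern-recognition example: average the feature vectors of labeled examples, then assign a new case to the nearest average. The labels and numbers below are toy values, not real medical data, and real systems use far more sophisticated models.

```python
# Minimal nearest-centroid classifier over labeled feature vectors.
# Labels and features are toy values for illustration only.

def train_centroids(examples):
    """examples: list of (label, feature_vector). Returns per-label mean vectors."""
    sums, counts = {}, {}
    for label, vec in examples:
        if label not in sums:
            sums[label] = [0.0] * len(vec)
            counts[label] = 0
        sums[label] = [s + v for s, v in zip(sums[label], vec)]
        counts[label] += 1
    return {label: [s / counts[label] for s in sums[label]] for label in sums}

def classify(vec, centroids):
    """Assign vec to the label whose centroid is nearest."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(vec, centroids[label]))

examples = [
    ("normal", [0.1, 0.2]), ("normal", [0.2, 0.1]),
    ("anomaly", [0.9, 0.8]), ("anomaly", [0.8, 0.9]),
]
centroids = train_centroids(examples)
print(classify([0.85, 0.85], centroids))  # anomaly
```

The key point matches Smith’s description: the computer never receives an explicit rule, only examples, and the patterns that matter emerge from the training data.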

Another example is in the field of dermatology, where a computer can use the same recognition techniques to look for patterns and identify pre-cancerous lesions before melanomas become noticeable.

Other possible applications include pre-empting safety issues. Uploaded photos of severe storms would enable nations to assess the situation in real time and pinpoint where to send safety crews. The same photo analysis could be applied to a city’s security cameras: police or security datacenters could study the video data and determine whether there are security breaches or potential safety issues. This would enable countries in conflict to address pertinent security issues and provide immediate assistance. Through the use of visual information, the computer helps us make informed and timely decisions.


IBM Master Inventor Dimitri Kanevsky patented a way to capture data from sound and interpret the information. He believes that “algorithms embedded in cognitive systems” can help us understand any sound. For example, sensors can help us interpret the sound of weather. During a storm, sensors can capture the sound of a tree under stress and, before the tree collapses onto the road, relay the information to a city datacenter to warn ground crews. Scientists at the IBM Research lab in São Paulo are utilizing IBM Deep Thunder to make similar types of weather predictions in Brazil.
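A very reduced version of this kind of acoustic monitoring is an energy detector: split the sensor signal into frames, compute each frame’s loudness (RMS energy), and flag frames above an alert threshold. The signal values and threshold below are made up for illustration; real systems classify the *kind* of sound, not just its loudness.

```python
# Toy acoustic-monitoring sketch: flag signal frames whose RMS energy
# exceeds an alert threshold. All values are invented for illustration.

def rms(frame):
    """Root-mean-square energy of one frame of samples."""
    return (sum(x * x for x in frame) / len(frame)) ** 0.5

def flag_frames(signal, frame_size, threshold):
    """Yield (frame_index, energy) for frames whose RMS exceeds threshold."""
    for i in range(0, len(signal) - frame_size + 1, frame_size):
        energy = rms(signal[i:i + frame_size])
        if energy > threshold:
            yield i // frame_size, energy

quiet = [0.05, -0.04, 0.03, -0.05]   # background noise
loud = [0.9, -0.8, 0.85, -0.95]      # e.g. cracking wood under stress
alerts = list(flag_frames(quiet + loud, frame_size=4, threshold=0.5))
print(alerts)  # only the second (loud) frame is flagged
```

In a deployment like the one Kanevsky describes, the flagged frames would be the trigger for relaying data to the city datacenter for deeper analysis.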

Visualize how such cognitive systems could play a crucial part in saving lives in developing countries before the onslaught of natural disasters such as Typhoon Haiyan.


Dr. Lav Varshney, research scientist with IBM Services Research, says digital taste buds can play a part in alleviating health issues such as obesity and malnutrition. He proposes the dual goal of optimizing flavor while meeting nutritional requirements.

Varshney acknowledges that communities in sub-Saharan Africa only have access to a few basic ingredients for their daily meal, but even under such constraints, a creative computer can “optimize flavor profiles.”

IBM’s research team includes a professionally trained chef-turned-computer-engineer. Varshney believes that within five years, great meals can be created through utilizing cognitive systems.

Adding the dimension of creativity to cognitive computing, the system “analyzes foods in terms of how chemical compounds interact with each other, the number of atoms in each compound, and the bonding structure and shapes of compounds.” The outcome is a distinctive recipe, using a blend of ingredients that are “scientifically flavorful.”
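One simple way to think about “analyzing how chemical compounds interact” is compound-sharing: ingredients whose flavor compounds overlap tend to pair well. The sketch below scores ingredient pairs by the overlap (Jaccard index) of their compound sets; the ingredients and compound lists are invented for the example and are not IBM’s actual chemistry data.

```python
# Toy compound-based flavor pairing: score ingredient pairs by the
# overlap of their flavor-compound sets. Compound lists are invented.

COMPOUNDS = {
    "strawberry": {"furaneol", "linalool", "hexanal"},
    "basil": {"linalool", "hexanal", "eugenol"},
    "beef": {"pyrazine_a", "hexanal", "furan_b"},
}

def pairing_score(a, b, table=COMPOUNDS):
    """Jaccard similarity of two ingredients' compound sets."""
    shared = table[a] & table[b]
    union = table[a] | table[b]
    return len(shared) / len(union)

# Find the best-matching pair among all distinct ingredient pairs.
best = max(
    ((a, b) for a in COMPOUNDS for b in COMPOUNDS if a < b),
    key=lambda pair: pairing_score(*pair),
)
print(best)
```

A recipe-generating system would add many more constraints (nutrition, available ingredients, novelty), but a pairwise compatibility score of this kind is one plausible building block.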


Envisage your mobile device telling you that you are getting a cold before your very first sneeze. This innovation may become a reality soon.

Dr. Hendrik F. Hamann, IBM research manager on physical analytics, says that with every breath, “one expels millions of different molecules.” Certain molecules are biomarkers that carry a wealth of information about one’s physical health, and technology is able to pick up these indicators to provide beneficial diagnostic information to the doctor.

Hamann indicates that tiny sensors that smell can be incorporated into cell phones and other mobile devices, relaying the information contained in the biomarkers to a computer system that examines the data. This is similar to how a breath analyzer detects alcohol from a breath sample. Sensors can be created to gather different types of data from the biomarkers, helping identify liver and kidney disorders, diabetes and tuberculosis.
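The analysis step can be pictured as screening measured molecule concentrations against per-condition biomarker profiles. Every molecule name, threshold and profile below is a made-up placeholder, not medical guidance or IBM’s method.

```python
# Hypothetical breath-screening sketch: compare measured molecule
# concentrations against per-condition biomarker thresholds.
# Molecule names and threshold values are invented examples.

BIOMARKER_PROFILES = {
    "diabetes_indicator": {"acetone": 1800.0},
    "liver_disorder_indicator": {"ammonia": 900.0, "dimethyl_sulfide": 50.0},
}

def screen(sample, profiles=BIOMARKER_PROFILES):
    """Return profiles for which every biomarker threshold is met or exceeded."""
    flagged = []
    for name, thresholds in profiles.items():
        if all(sample.get(mol, 0.0) >= level for mol, level in thresholds.items()):
            flagged.append(name)
    return flagged

sample = {"acetone": 2100.0, "ammonia": 300.0}
print(screen(sample))  # ['diabetes_indicator']
```

As with the breath analyzer comparison in the article, the sensor only measures; the interpretation lives in the profiles the computer system checks against.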

These are IBM’s predictions of how cognitive technologies and innovations will “touch our lives and see us into the future.” While innovation itself does not end extreme poverty, it enables a new way of thinking and doing things that can help break down the systems that keep people poor.

– Flora Khoo

Sources: A Smarter Planet, IBM, IBM Cognitive Systems, NY Times
Photo: Wikimedia

