
Ecological Computing

by Feng Zhao and John Seely Brown

Great Duck Island, Maine

On a small patch of land ten miles off the coast of Maine, a team of computer engineers from UC Berkeley is conducting an experiment in ecological computing. Working with biologists at the College of the Atlantic, the engineers have installed 32 wireless sensors to monitor the habitat of the petrels nesting on the island. In the past, biologists studying the birds' nesting behavior had to travel to the island periodically to gather observations. To check on the petrels, they literally had to stick their hands into the burrows, often causing the birds to abandon their homes.

Now those same biologists can check on the birds from the comfort of their offices, browsing data from the sensors via a satellite link. Their colleagues thousands of miles away can share the same joy, thanks to the Internet. The untethered, matchbox-sized sensors left in the burrows monitor occupancy by recording temperature variations inside, and wirelessly send the data to a gateway node on the island. Convenience aside, the more significant benefit of the technology is that it minimizes disturbance to the very habitat it is trying to help preserve.

The experiment on Great Duck Island is a small lens onto an expansive future. To grasp what might happen, multiply these 32 sensors by 10 million or 100 million and distribute them globally. Once ubiquitous, the sensor grid becomes an enormous digital retina stretched over the surface of the planet. This planet-scale system could help us understand and address tomorrow's environmental challenges, from monitoring global biodiversity to sensing millions of low-level, non-point sources of pollution.

Now add intelligent browsers to this vast sensing system, letting scientists, government regulators, or environmental advocates use the Internet to ask questions never before imaginable. Call it Google on steroids. We could search vast amounts of data for abnormal events or detect interesting patterns at many different scales. But what else?

Equipped with a new generation of sensors, automobiles and trucks could monitor their own emissions and download the data at a service station or to a home computer, or transmit it in batches over cellular networks. When cars can talk to each other, we can begin to create dynamic networks that can be optimized to reduce congestion, cut air pollution, speed up just-in-time deliveries, or help people find the closest available parking space in an unfamiliar city. This is about more than convenience: we waste enough energy sitting in traffic jams each year to run our entire domestic airline fleet.

As networked sensors become dramatically less expensive and come with wireless capability built in, we may find them in a Midwestern cornfield, helping farmers optimize water and fertilizer use and minimize the use of harmful pesticides. Sensor systems can go where we cannot, monitoring environmental damage from an oil spill or forest fire, tracking ocean currents, or helping biologists unravel the wonders of a rainforest canopy. We could begin to instrument whole ecosystems, using ground-based sensors networked to the next generation of satellites to understand subtle but far-reaching changes in land use and vegetation.

With pervasive and embedded intelligence, our manufacturing systems could become self-managing and self-regulating. Products and parts with on-board sensors and radio-frequency tags could keep track of themselves, help manage inventories, know when they need repair or replacement, and find their way back to the right place to be remanufactured or recycled. These systems would be capable of acting independently in response to their environment without requiring constant, and often expensive, human intervention. Industrial systems would begin to operate more like ecological ones: continually aware of their surroundings, self-organizing, and perhaps even engaging in micro-auctions to balance energy loads.

Can any of this happen? Yes, and here is why and how.


Instrumenting the Planet

In 10 to 20 years, the Internet will change many of the ways biologists and ecologists study living systems at nearly every scale. But we have barely scratched the surface of what is becoming possible. Visionaries, entrepreneurs, and technologists are designing an omnipresent, planet-scale sensor network that will dwarf the Internet by many orders of magnitude. This sensor network, or informational grid, will provide entirely new kinds of instruments for doing environmental science on a scale never before possible. The grid will be adaptive, able to select and attend to interesting things happening in the environment. To build a system of this size, computer scientists and engineers will have to borrow ideas from biology and ecology and figure out how large-scale complex systems adapt, repair, and self-organize. An open, two-way interaction between environmental scientists and computer scientists is likely to have far-reaching implications for both the computational and biological worlds for decades to come.

The transformational force underlying this change is the confluence of rapid technological advances: micro-electro-mechanical systems (MEMS) sensors and actuators, wireless and mobile networking, and low-power embedded microprocessors. Moore's Law describes our ability to manufacture progressively smaller transistors; that, in turn, means we can make sensors progressively cheaper, smaller, more versatile, and less power-hungry. Wireless sensors that integrate communication, computation, memory, sensing, and on-board power in a single package can detect anything from temperature, humidity, light, sound, and pollutants to traffic on roads. Today, one can readily order buckets of sensor nodes from startup companies such as Crossbow that manufacture integrated sensors. These matchbox-sized sensors, costing $100-200 each in small quantities today, will soon be smaller than a thumbnail and cost no more than a few cents each when produced in massive quantities. Massively distributed systems built on these emerging sensor technologies will have to be designed to operate in radically different ways. In fact, our familiar notions of personal computing will not help us much in understanding an emerging, distributed system that largely runs itself.

Self-Organizing Systems

Unlike the Internet, which requires round-the-clock human supervision and maintenance, the planet-scale sensor net will be for the most part autonomous, self-configuring, and attentive to its context and its users. Because it is deeply embedded in the physical world, the system is subject to some very severe constraints, the most important of which is the limited onboard energy supply of each node. Remember, the sensors talk to one another wirelessly, and the same principle applies as with your cell phone: the more you talk, the faster you drain the battery.

To maximize the usefulness and lifetime of the system, the sensor net has to adapt, and reorganize itself, as environmental conditions or user needs change. Engineers are busy figuring out how such systems can borrow ideas such as diffusion and reaction from living organisms. Nature does this very well: termites in Australia's Kakadu National Park, for example, build impressive mounds with little global knowledge or design.

Researchers have been developing biology-inspired, lightweight, peer-to-peer communication protocols to link the sensors together. Unlike the Internet's TCP/IP protocols, which route information to fixed addresses, the so-called diffusion routing protocols developed at UCLA/ISI use the data the sensors collect to set up routing pathways dynamically, incurring much less overhead than TCP/IP. More recently, PARC scientists have developed an IDSQ protocol that uses an information gradient derived from local sensor data to self-organize the network into coherent aggregates of nodes. As physical phenomena in the environment move or application requirements change, the aggregates adapt accordingly, all without centralized supervision. A new generation of featherweight sensing, communication, and security protocols is being developed to make practical deployment of sensor nets a reality.
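To make the gradient idea concrete, here is a minimal sketch, in Python, of data-centric routing in the spirit of directed diffusion. The topology, the hop-count gradient, and all function names are invented for illustration; real protocols use much richer interest-matching and reinforcement rules.

```python
# A minimal sketch of data-centric gradient routing. An "interest" floods
# outward from a sink; each node records its hop distance (the gradient);
# readings then flow downhill toward the sink. All names are illustrative.
from collections import deque

def build_gradients(neighbors, sink):
    """Flood an interest from the sink; each node stores its hop distance."""
    gradient = {sink: 0}
    frontier = deque([sink])
    while frontier:
        node = frontier.popleft()
        for nbr in neighbors[node]:
            if nbr not in gradient:            # first arrival = shortest hop path
                gradient[nbr] = gradient[node] + 1
                frontier.append(nbr)
    return gradient

def route_reading(neighbors, gradient, source):
    """Forward a reading downhill along the gradient until it reaches the sink."""
    path, node = [source], source
    while gradient[node] > 0:
        node = min(neighbors[node], key=gradient.get)   # steepest descent
        path.append(node)
    return path

# Toy topology: node id -> set of radio neighbors
net = {0: {1, 2}, 1: {0, 3}, 2: {0, 3}, 3: {1, 2, 4}, 4: {3}}
g = build_gradients(net, sink=0)
print(route_reading(net, g, source=4))   # e.g. [4, 3, 1, 0]
```

Note that no node ever holds a map of the whole network; each forwarding decision is purely local, which is what lets the scheme scale and adapt when nodes die or move.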

An important feature of such a large-scale organic system is the ability to focus on interesting things. The sensor net must allocate its limited resources to attend to current and emerging events of interest. The mechanisms of biological vision systems can inspire interesting designs here: attention reduces the amount of energy the system needs, since nodes outside the attentional foci can be turned off until needed.
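As a rough sketch of attention-driven duty cycling, assuming invented node positions and an arbitrarily chosen focus radius: only nodes near a detected event stay awake, and everything else sleeps.

```python
# A hedged sketch of attention-driven duty cycling: wake only the nodes
# inside the attentional focus. Positions and radius are invented.
import math

def awake_set(positions, focus, radius):
    """Return the ids of nodes inside the attentional focus."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return {nid for nid, pos in positions.items() if dist(pos, focus) <= radius}

nodes = {i: (i % 5, i // 5) for i in range(25)}      # a 5x5 toy grid of sensors
focus = (2.0, 2.0)                                   # where something interesting happens
print(sorted(awake_set(nodes, focus, radius=1.5)))   # a handful awake, the rest asleep
```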

There are two sorts of information to be gathered and stored in the sensor net: dynamic state information about the environment, and models of the world. What does a node know when it wakes up in the wild? One piece of knowledge crucial for many sensing tasks is the node's location. Can nodes figure out their relative distances and orientations by observing some common physical phenomenon? The human visual system achieves subpixel resolution by exploiting the random placement of retinal cells. We may be able to turn this around and use moving stimuli, in the form of scanning lines or shadows, to let sensors calibrate their positions with respect to a common reference stimulus.
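As one illustration of how this might work, suppose a straight shadow edge sweeps the field at a known, constant speed and each node timestamps the moment its light reading drops; relative positions along the sweep direction then fall out of the timestamp differences. The speed and times below are invented.

```python
# A sketch of the shadow-line calibration idea: a shadow edge sweeps the
# field at known speed v; each node records when it darkens. Distances
# along the sweep axis follow from timestamp differences. All values
# here are made up for illustration.
v = 0.5                                            # sweep speed, meters per second
detection_times = {"a": 1.2, "b": 3.0, "c": 4.6}   # seconds at which each node darkened

origin = min(detection_times.values())
positions = {nid: v * (t - origin) for nid, t in detection_times.items()}
print(positions)   # roughly {'a': 0.0, 'b': 0.9, 'c': 1.7}, in meters along the sweep
```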

A big concern often raised about such an omnipresent system is personal privacy. How could one walk down a street without being filmed and tracked by an unauthorized individual? One way to protect our privacy is to build some friction or loss into the system, so that not every piece of information is immediately accessible or recoverable. Ideas such as the statistical sampling used in databases could be extended to allow resolution-controlled access to the various information repositories on the sensor net. Just as a lens can be defocused, one can imagine building mechanisms that blur out identity information, say license plate numbers, while still providing aggregate data such as the extent of a traffic jam on Highway 101. But, of course, how such a distributed system can be controlled, and by whom, are questions not answered simply by listing technical capabilities, especially given the self-organizing nature of this kind of system.
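One minimal sketch of resolution-controlled access, assuming an invented record format and an arbitrarily chosen group-size threshold: queries may only retrieve aggregates, and answers about groups too small to hide an individual are simply refused.

```python
# A sketch of resolution-controlled access: the repository answers only
# aggregate queries, withholds answers for small groups, and adds a little
# noise so exact membership cannot be inferred. Records, threshold, and
# noise level are invented for illustration.
import random

def blurred_count(records, predicate, min_group=10, noise=2.0):
    """Answer 'how many match?' without resolving individuals."""
    n = sum(1 for r in records if predicate(r))
    if n < min_group:
        return None                              # too fine-grained: access denied
    return max(0, round(n + random.gauss(0, noise)))

cars = [{"plate": f"P{i}", "speed": random.uniform(0, 70)} for i in range(500)]
print(blurred_count(cars, lambda c: c["speed"] < 15))      # size of the slowdown, roughly
print(blurred_count(cars, lambda c: c["plate"] == "P42"))  # None: identity stays blurred
```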

Ecological Computing: The Co-evolution of Digital and Biological Worlds

The collective challenge now facing the computer science and environmental communities is how to move from our traditional focus on personal computing to a broader ecological computing that draws on the notions of complex adaptive and self-organizing systems in the design of a new kind of information fabric. Ecological computing systems are blended into the physical environment through sensors, actuators, and logical elements; they are invisible, untethered, adaptive, and self-organizing. This is where the computational world meets the physical world.

A sensor net is an example of an ecological computing system. To co-exist and co-evolve with the surrounding environment, the sensor net must be able to regenerate itself and recycle its parts for new uses. Living systems are incredibly good at this, and the sensor net has to be designed in a similar way if we expect it to survive. What will you do with a dead sensor node, depleted of battery power? Changing the batteries every couple of weeks on zillions of embedded sensor nodes, some of which may be physically inaccessible, is clearly not feasible. And leaving the dead nodes out there would create the next environmental super-disaster, one our grandchildren would pay dearly to clean up.

However, there are abundant energy sources in the environment. Sensors could harvest energy from the vibrations of passing foot traffic, the temperature differentials of body heat, or chemical reactions in the soil. Solar energy is a clean and limitless source, but the current generation of solar cells is still too inefficient for massive sensor-net deployment. They may find use in open environments such as deserts; forests and other densely covered areas will require different modes of energy harvesting, unless panels can be mounted above tree canopies or rooftops.

One interesting idea is to hijack the garbage-eating bacteria commonly found in woodlands to convert carbohydrates in the environment into usable energy for sensor nodes. Out in the wild there are plenty of dead leaves, branches, and roots on the ground, all good sources of carbohydrates. If we are careful in adjusting the density of the sensor nodes and their duty cycles, nature can easily replenish what the bacteria consume. An extra bonus of such bacteria-powered energy cells is that their naturally occurring "rotten smell" could repel certain unwelcome animals from devouring the sensor nodes, without causing environmental damage. Interestingly, one of us had the misfortune of attracting animals to sensors in a recent field experiment in the Mojave Desert: coyotes chewed up all the windshield foam on our microphone sensors, apparently because the foam contained a chemical called urea that the coyotes found tasty.

On a more global scale, balancing energy across nodes could shift load from nodes with low energy reserves to well-stocked ones. Remember, the fabric as a whole is the sensor; there will always be nodes somewhere with some energy left, and the trick is to get the energy to where the action is. Some sort of peer-to-peer diffusion, with gradients set up autonomously by local energy demand, may work here. The net may even be able to heal itself, patching sensing holes or moving nodes around, and nanotechnology might one day enable damaged nodes to grow new "eyes, ears and noses". Clearly, environmentally friendly design must be high on any future agenda for such systems.
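A minimal sketch of what diffusion-style balancing could look like, assuming it is sensing and communication load (rather than physical charge) that gets shifted between neighbors, and using an invented chain topology and diffusion rate:

```python
# A sketch of peer-to-peer balancing by diffusion: each step, a node shifts
# a fraction of the difference between its reserve and each neighbor's, so
# reserves relax toward a common level with purely local exchanges.
# Topology, starting reserves, and the rate are invented.
def diffuse(energy, neighbors, rate=0.2, steps=50):
    for _ in range(steps):
        nxt = dict(energy)
        for node, nbrs in neighbors.items():
            for nbr in nbrs:
                nxt[node] += rate * (energy[nbr] - energy[node])
        energy = nxt
    return energy

chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}       # four nodes in a line
reserves = {0: 100.0, 1: 10.0, 2: 10.0, 3: 0.0}      # very uneven to start
print(diffuse(reserves, chain))                       # roughly 30 each after diffusing
```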

Programming an ecological computing system will be more like designing a biology experiment, telling the system to "grow a finger here," than like writing low-level embedded programs. New transformations need to be invented that translate the global properties one wants into local representations that are easy to specify and implement on the embedded nodes. Invisible "compilers" will take care of the low-level, mechanical translations. Computer scientists and environmental researchers must join hands in the design of this programming technology.
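As a sketch of this global-to-local flavor, in the spirit of amorphous-computing research: the "program" below states a global intent, a gradient field growing outward from a seed node, and every node runs the same tiny local rule until the field stabilizes. The names and topology are invented.

```python
# Global intent: a hop-count gradient emanating from a seed node.
# Local rule each node runs: "be one more than my closest neighbor."
# Repeated locally everywhere, the rule grows the global field.
def local_rule(neighbor_values):
    return min(neighbor_values) + 1

def grow_gradient(neighbors, seed, rounds=10):
    value = {n: (0 if n == seed else float("inf")) for n in neighbors}
    for _ in range(rounds):
        value = {n: (0 if n == seed else local_rule([value[m] for m in nbrs]))
                 for n, nbrs in neighbors.items()}
    return value

net = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(grow_gradient(net, seed=0))    # {0: 0, 1: 1, 2: 2, 3: 3, 4: 4}
```

The programmer never addresses an individual node; the compiler's job, on this view, is exactly the translation from the stated global property to the local rule.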

Various species in nature co-exist through complex dependencies and feedback loops, the so-called web of life. The coming digital and ecological worlds will co-evolve in a similar, symbiotic way. In fact, the boundary between the two may become so blurred that we cannot tell one from the other in this ecological computing fabric. At this juncture we are not certain how far the symbiosis can go, but we are certain that as we move from a focus on information-processing systems to massively distributed informating systems, systems that read and respond to their context, we will have fundamentally new tools for analyzing and affecting ecological systems. How such systems are designed and applied to the environmental challenges of the future is one of the primary governance challenges of today.

For Further Exploration

Brown, J.S. & Rejeski, D. (2000). "Ecological Computing," The Industry Standard, Dec. 18. http://www.thestandard.com/article/0,1902,21365,00.html?printer_friendly=

See the Chapter on “Beyond the Internet” at: http://www.rand.org/scitech/stpi/ourfuture/

National Research Council Report: “Embedded, Everywhere: A research agenda for networked systems of embedded computers,” National Academy Press, 2001. (http://www.nap.edu/html/embedded_everywhere/)

The PARC Collaborative Sensing Project web page: http://www.parc.com/ecca

Crossbow Technology, Inc.: http://www.xbow.com