Mighty Science in Defense of Nature

by Mike Hamilton

Digitize or Die

Soon I'll turn 65, so I better get on with digitizing my life. My granddaughter is 5, my son and daughter are in their 30s, and if plans work out, future interactions with them might include conversations with me as a digital avatar.

Several of my colleagues have written their autobiographies, either as humorous stories published in book form or as long-running blogs. I prefer leaving an interactive copy of myself, one engineered to know some of my best stories and to answer questions. It won't be an easy task: I've been digitizing steadily since the late 1970s, when I built my first computer and started measuring and mapping the environment with gadgets and widgets. You might say I have led a very digitally abundant life.

As an explorer — a digital naturalist — I've been figuring out ways to digitize nature in order to amplify our understanding of the complexity of biological diversity. I've worked with interdisciplinary teams of faculty and students to design, engineer, and deploy new tools for remote sensing of ecosystems using specialized cameras, environmental sensors, and robots. The robots we designed were mobile sensor platforms: drones capable of autonomous airborne surveys of vegetation and wildlife; water chemistry sensing and sampling drones; multi-sensory robots navigating along cables strung through a forest canopy; underground tubular robots collecting ultra-high-resolution microscopic images of the connections between roots and fungi; and floating, swimming, and submarine robots measuring water quality or recording coral reef health. We learned a lot through digital tech, but what motivated us most was the sense of urgency to digitize as much of nature as we could before conditions changed irreversibly.


Now that I'm retired, I have a new urge to reprocess my life's work, recovering as much of the original data, or at least its metadata, as I can in order to repurpose those studies for environmental storytelling and, hopefully, entertainment. Better yet if my stories can motivate young people to carry on our duty to protect and study nature, and thereby protect ourselves. Working against me is the rapid change in digital formats: many digital files or media types older than ten years are nearly impossible to open or convert, unless you saved the original hardware and software. I've also found that tools that automatically back up files can be flawed, leaving unfortunate gaps that are, well, extinct.

Thanks to my careers at Cornell University, Cal Poly Pomona, and the University of California, I have used a great many computers, starting with an IBM 360 mainframe and a DEC PDP-11 minicomputer, plus too many desktop PCs to count! With the evolution of programming languages, operating systems, file formats, and applications, I've found the technology learning curve to be steep but at the same time amazingly fun. The digital archive of my professional life contains the punch cards I used in graduate school, back-up tape drives, and floppy disks, first large ones and then smaller. My first hard disk drive was a 10MB unit attached to an Apple IIe, then came a 20MB drive in my first Mac and 40MB in the Mac II, followed by a progression from 1GB to 500GB drives over the subsequent ten years. My current SD card stores 256GB, and my portable USB-C external drives hold 5TB of data…yet they keep filling up! As operating systems evolve, so do file formats at both the system and program levels. On a Mac, for example, without knowledge of the resource fork, you might not be able to open a file from an old operating system with a new one.

My first foray into artificial intelligence came at a time when the Lisp programming language, inference engines, knowledge bases, and expert systems were cutting-edge. Unlike today's machine learning approaches built on fuzzy logic, Bayesian statistics, and neural nets, expert systems were all about distilling domains of knowledge into collections of if-then rules or decision trees and then applying an inference engine to resolve patterns, not unlike a mirror of our logical thinking process. Apple's advanced technology group, led by Alan Kay, had discovered my early work and seeded our lab with Macs and a Smalltalk programming environment, a language with roots in Lisp's ideas. We were also given Bill Atkinson's newly written HyperCard. Without deep-diving into the computer science weeds, Smalltalk is well suited to modeling the hierarchical classifications common to plant and animal taxonomy, and HyperCard was easy to program and to integrate with languages like Smalltalk and with hardware devices. Using these tools, my students and I wrote a program to identify common wildflowers and their habitats using a visual question-and-answer approach that took on the metaphor of a nature walk.
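Just for flavor, here is a minimal sketch in Python of how that style of expert system works; the rules, species names, and observation fields below are hypothetical stand-ins, not the actual knowledge base we built in Smalltalk and HyperCard.

```python
# A minimal expert-system-style identifier: if-then rules over observed
# features, applied in order until one rule's conditions all match.
# Species and features are hypothetical examples, not our real rules.

observations = {"petal_color": "yellow", "leaf_shape": "lobed", "habitat": "meadow"}

rules = [
    # Each rule: (conditions that must all hold, conclusion)
    ({"petal_color": "yellow", "habitat": "meadow"}, "common monkeyflower (hypothetical match)"),
    ({"petal_color": "blue"}, "lupine (hypothetical match)"),
]

def infer(observations, rules):
    """Forward-chain through the rules; return the first conclusion
    whose conditions are all satisfied by the observations."""
    for conditions, conclusion in rules:
        if all(observations.get(k) == v for k, v in conditions.items()):
            return conclusion
    return "no rule matched; ask another question"

print(infer(observations, rules))  # -> common monkeyflower (hypothetical match)
```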

Our project was dubbed "The Macroscope," inspired by the science fiction novel of the same name by Piers Anthony, who conceived of a space-based telescope of infinite resolution in the space-time continuum. Our Macroscope included a geographic information system with satellite and aerial photographs, "street-view" style 360-degree panoramas of representative habitats, and a species database coupled to close-up images of as many of the plants and animals as we could identify on our field trips to biological field stations and reserves around the world. The goal was to link the "Big Data" of our digital multimedia database of ecosystems and species with a natural-language AI that would simulate a human botanical or ecological expert. The typical dichotomous keys in field guides rely on the user learning specialized botanical anatomy and require direct measurements of flowers and other plant parts. Our system instead stored 54,000 photographs of natural scenes and close-ups of species on a large laserdisc, controlled by the first color Apple Macintosh, to take you on a virtual guided naturalist walk in many places around the world. It was usable by children as well as adults.

Technical aspects of building our Macroscope in the late 1980s involved shooting single frames of 16mm film sequentially from a stationary point to record overlapping panoramic images in every direction. There were no hemispherical lenses in those days, so we used a specialized tripod head that let us photograph a calibrated overlap between frames as we moved the pan-head horizontally, then vertically. Approximately 200 images were recorded for each panorama. The film was later transferred to 1-inch videotape at a post-production facility in Burbank and then written to a recordable laserdisc. On the computer side, we built FileMaker databases that sent a command to the laserdisc player to search for a particular image, fed the analog video signal through a frame grabber (a device for converting images from analog to digital), and finally inserted the new digital image into the database record that contained the technical information about the image: its geographic location, its spatial position relative to adjacent images, the common and scientific names of any species in the frame, the habitat classification, the names of students involved, additional field notes, and the date and time.
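To give a sense of that pipeline, here is a minimal sketch in Python of the kind of per-frame record we kept and the ingest loop tying the player and frame grabber together. The field names and the two hardware stubs are hypothetical; the real system lived in FileMaker scripts talking to the player over a cable.

```python
from dataclasses import dataclass

@dataclass
class FrameRecord:
    # Hypothetical field names mirroring what each laserdisc frame's
    # database record stored; the real schema lived in FileMaker.
    frame_number: int
    latitude: float
    longitude: float
    neighbors: list          # spatial position relative to adjacent frames
    species: list            # common and scientific names seen in the frame
    habitat: str
    field_notes: str
    date_time: str

def laserdisc_seek(frame_number):
    """Stand-in for the command that told the player to seek a frame."""
    print(f"SEARCH {frame_number}")  # hypothetical command string

def grab_frame():
    """Stand-in for the frame grabber digitizing the analog video signal."""
    return b"...raw digitized image bytes..."

def ingest(record, database):
    """Seek the frame, digitize it, and file it with its metadata."""
    laserdisc_seek(record.frame_number)
    image = grab_frame()
    database.append((record, image))

db = []
ingest(FrameRecord(12345, 33.81, -116.78, [12344, 12346],
                   ["lemon lily (Lilium parryi)"], "montane meadow",
                   "first bloom of the season", "1989-06-12T10:30"), db)
```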


Apple provided a new video format for us to test called QuickTime, and we wrote HyperCard extensions (XCMDs) that could manipulate the video in ways that allowed individual images to become clickable maps linking to other images, and that let color data be looked up and used to query a database or classification program. Using this tool, I could hover the cursor over a flower, automatically read the RGB spectral data of a region of the plant, cross-reference a database file, and narrow the search to one or a few possibilities. Another tool took the individual overlapping stills and merged them into a continuous panorama compatible with QuickTime VR, the precursor to most of today's panoramic experiences such as Google Earth, Google Street View, and YouTube VR.
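The color lookup boils down to a nearest-match search, sketched minimally below in Python; the species and signature values are hypothetical illustrations, not our original tables.

```python
# Narrow a species search by color: compare a sampled RGB value against
# stored per-species color signatures and keep the closest matches.
# The signatures below are hypothetical illustrations.

signatures = {
    "lemon lily": (235, 205, 40),
    "scarlet bugler": (200, 30, 35),
    "sky lupine": (90, 105, 190),
}

def rgb_distance(a, b):
    """Squared Euclidean distance in RGB space; crude but fast."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def narrow_candidates(sampled_rgb, signatures, top_n=2):
    """Rank species by how closely their signature matches the sample."""
    ranked = sorted(signatures, key=lambda name: rgb_distance(sampled_rgb, signatures[name]))
    return ranked[:top_n]

# e.g., the pixel region under the cursor averages to a strong yellow:
print(narrow_candidates((230, 200, 50), signatures))  # ['lemon lily', ...]
```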

The late 1980s were a very experimental time for digital image formats and cameras. Portable cameras were entirely analog at the time, either film-based or analog motion and still video. When we worked in the rainforests of Venezuela, the Canon camera company loaned us one of their prototype professional still video cameras, and Pioneer loaned us a laserdisc video recorder that could store up to 54,000 images. The equipment enabled us to build an extensive image database of rainforest, savanna, and alpine species while we explored Venezuela. We thought the images were beautiful, but they were only 700 by 500 pixels, with 24-bit color! Manipulating an image down to the pixel required digitizing it from the video source, and the thousands of images we collected were small and grainy by today's standards. The Macroscope was a substantial multi-year project that required expensive hardware and lots of training to set up and operate, let alone to catalog and add new data, and the AI interface was difficult to use. Today the project's entire data, images and text, easily fits on a 16GB thumb drive. The Macroscope's functions were a precursor to features that would become mainstream on everyone's smartphones 30 years later.
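As a back-of-envelope check on that thumb-drive claim (my rough arithmetic, assuming roughly 10:1 JPEG compression, not figures from the project):

```python
# Rough storage math for 54,000 frames at 700 x 500 pixels, 24-bit color.
frames = 54_000
bytes_per_frame = 700 * 500 * 3           # 3 bytes per pixel at 24-bit color
raw_gb = frames * bytes_per_frame / 1e9   # ~56.7 GB uncompressed

jpeg_ratio = 10                           # assumed ~10:1 JPEG compression
compressed_gb = raw_gb / jpeg_ratio       # ~5.7 GB, comfortably under 16 GB

print(f"raw: {raw_gb:.1f} GB, compressed: {compressed_gb:.1f} GB")
```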

Throughout my career, I participated in dozens of digitally infused conservation and technology projects, yet as I review them and attempt to convert whatever original data files I still have, I'm finding the process to be a daunting project in itself. There are heartbreaking gaps for me. For example, the entire 1998 field season in Namibia, during which I digitally documented wildlife and field-mapped thousands of their geolocations using a GPS-linked Apple Newton, was lost 20 years later to a hard drive failure and the lack of reliable back-ups (not to mention that the Newton is now a non-functioning artifact). Only a handful of the miniDV videotapes from that expedition remain, and thankfully those have been converted to QuickTime by a video transfer service, since the camera that could read those tapes is long gone.


In about a year, I'll have the digital assets at hand to start programming my avatar. The user interface is still up in the air. I've tried out several flavors of chat-bots, and they hold some promise. My conversations with Siri on my HomePod are just not that deep, even if he does a great job controlling my smart home appliances or recommending music; moreover, Siri is not an open system. Vector, the little robot, is a favorite of my granddaughter's, and I've been experimenting with its SDK, but Anki, the parent company that hosts the cloud behind his conversational AI, is closing down in September. That leaves DIY, so I signed up for a free Google Cloud account and began learning Dialogflow for the conversational AI and TensorFlow to add machine learning to the visual side of my stories. If a picture can tell a thousand words, wouldn't it also be helpful if a picture could explain itself?
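For the conversational side, the Dialogflow quickstart pattern looks roughly like this sketch: send a user's text to an agent and read back the reply. The project and session IDs are placeholders, and this assumes the Google Cloud Dialogflow Python client; it's a starting point, not my finished avatar.

```python
# A minimal Dialogflow detect-intent call, following the client library's
# quickstart pattern. "my-avatar-project" and the session ID are placeholders.
from google.cloud import dialogflow

def ask_avatar(text, project_id="my-avatar-project", session_id="demo-session"):
    """Send one utterance to the Dialogflow agent and return its reply."""
    client = dialogflow.SessionsClient()
    session = client.session_path(project_id, session_id)
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code="en-US")
    )
    response = client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    return response.query_result.fulfillment_text

print(ask_avatar("Tell me about the Macroscope."))
```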

Can I explain myself? Can my avatar? Time will tell.




The James Reserve "Robot Forest" in 2005

These are the aquatic sampling sensor buoys and networked info-mechanical robots in action at the James Reserve in 2006.

The first version of the Macroscope simulated a guided naturalist walk through the ecosystems of the San Jacinto Mountains of California in 1987.

I'm holding the 16mm film Bolex movie camera we used on our first Venezuela field expedition to film the Macroscope ecology laserdisc in 1989.

An overview map of the clickable panoramic nodes that link the QuickTime VR "view maps" in the James Reserve Macroscope

From interactive laserdisc to real-time sensors, the Macroscope evolved from this first conceptual design to a real-world laboratory of wireless sensors and robots "taking nature's pulse" from 2000 to 2012. The National Science Foundation provided funding.

The majority of the digital data collected during our 1998 Namibia field expedition was lost to an unfortunate drive failure and an inadequate back-up system. A few digital video files are all that remain. The quality of these 20-year-old videos makes me laugh, and then feel old!

August 28, 2019