Today, scientists, engineers, and physicians of every discipline are using digital imaging. Since digital images are data made visible, the ability to analyze what the images represent is greatly facilitated.
Imaging for science applications has a long history, going back to the late 1800s, when Professor Wilhelm Roentgen's discovery of X-rays gave rise to the medical sub-specialty of radiology. Over the past quarter century, radiology has benefited dramatically from digital imaging, which made possible computed tomography (CT) scanning, magnetic resonance imaging (MRI), and ultrasound. The next decade should see dramatic changes in medical imaging as a large number of conventional X-ray imaging systems switch over to digital methods, which will eventually render film-based systems obsolete. The benefits of digital technology for conventional X-ray imaging include more information being available to the physician in less time, resulting in better diagnoses with less exposure of the patient to harmful X-rays. Additionally, MRI and CT technology continue to take advantage of digital imaging and image processing techniques as well, pushing the limits of computational power.
In addition to medicine, we should note that many of our advances in digital imaging, image processing, and image interpretation are due to technologies developed by the U.S. Department of Defense (DOD), the Central Intelligence Agency (CIA), the National Reconnaissance Office (NRO), and the National Imagery and Mapping Agency (NIMA). In the mid to late 1950s, the U.S. launched the first of its spy satellites, the Corona series. To get information from these satellites, they would be tasked to take pictures during specific portions of their orbits. To deliver those pictures, a satellite would eject a film canister, which would fall through the atmosphere and be caught in mid-air by an airplane. From there, the film would be taken to special facilities, developed, and interpreted. Clearly, more efficient methods needed to be developed. Current satellite imaging technology allows digital imagery to be downloaded and processed essentially in real time, along with other modalities such as infrared and multispectral imagery. How good are the pictures? As an example, the make and model of cars and trucks can be determined at about 20 cm resolution. Conventional optics tells us that the optimal resolution for even the best imagery from space is probably about 10 cm (possibly better if the platform is tasked to a lower altitude, but that is VERY expensive). However, there is no real advantage to going to higher resolutions from space. The issues being worked on by those with the best technology are computer vision and an enhanced ability to interpret data, since imagery analysts have to decide what to look at and then interpret it, producing the real bottleneck in analysis. In fact, I knew of folks who specialized in runway lengths: they looked at images of runways all day, every day, to determine the lengths and capacities of runways.
Faster multispectral imaging, real-time visualization, better and faster tasking of platforms, and data storage are among the other hot areas of study.
These technologies often end up trickling down to other, non-classified areas so that science can benefit. NASA and other government agencies use these techniques to study the Earth and its oceans from space, resulting in a more thorough understanding of our planet and its weather. Turned another way, digital imaging has long been an incredible tool for the study of both outer space and inner space. Many of the technologies in the current charge-coupled devices (CCDs) used by biologists and chemists for digital microscopy were developed by astronomers to see some of the largest and most distant objects in the universe. Cooled CCDs, among other technologies, are important for reducing noise in images so that astronomers can see ever fainter objects, while biologists, chemists, and physicists use many of the same cameras and techniques to image the smallest things in the universe. Geologists, botanists, and scientists in forestry, among others in the remote sensing community, also use these techniques to study everything from the composition of various strata to the health of forests. Additionally, a number of papers have been written by archaeologists using remote sensing techniques to survey ancient roads and cities.
Image forensics, used by police agencies and the FBI, is another rapidly growing field taking advantage of a variety of techniques becoming common in the commercial graphics industry. In image forensics, software for the acquisition, processing, archiving, and retrieval of digital images is designed to optimize image capture and to support analysis of both raw and enhanced digital images. Using these techniques, images can be examined to reveal detail that would otherwise be masked by artifacts and interference during acquisition.
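As a toy illustration of the kind of enhancement involved (not any agency's actual toolchain), a simple percentile-based contrast stretch can pull detail out of an underexposed image. The synthetic image and the percentile parameters below are purely hypothetical, chosen only to show the idea:

```python
import numpy as np

def contrast_stretch(img, low_pct=2, high_pct=98):
    """Linearly rescale pixel values so that the low/high percentiles
    map to 0 and 255, revealing detail hidden in a narrow value band."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    stretched = (img.astype(float) - lo) / max(hi - lo, 1e-9)
    return np.clip(stretched * 255, 0, 255).astype(np.uint8)

# A synthetic "underexposed" frame: all detail crammed into values 10..40
rng = np.random.default_rng(0)
dark = rng.integers(10, 41, size=(64, 64)).astype(np.uint8)

enhanced = contrast_stretch(dark)
print("before:", dark.min(), "-", dark.max())      # narrow value range
print("after: ", enhanced.min(), "-", enhanced.max())  # spread over 0..255
```

The same principle, with far more sophistication, underlies the deconvolution and enhancement tools used in forensic and scientific imaging packages.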
So, given that this is a Macintosh-centric site focused on the use of Macs in science and engineering, how does all of this relate to Apple, the Macintosh, and their roles in science? It should be noted that many of the technologies we use in digital imaging in science and engineering owe quite a bit to Apple and other companies such as Xerox, Texas Instruments, Kodak, and Adobe. Indeed, much of the technology that writers and graphic artists rely upon was pioneered by two of these companies, Apple and Adobe. In fact, Photoshop is one of the most complex digital imaging environments ever developed. Adobe has done an admirable job of hiding the mathematics and complexity of image analysis from the non-technical user, creating a software package that allows the graphic artist to import, modify, and enhance images. Most scientists I know are also quite proficient Photoshop power users, as Photoshop has become a de facto standard for many of us for creating and laying out images for publication. Photoshop plug-ins such as Dr. Russ's Fovea Pro and Image Processing Tool Kit (IPTK) even allow Photoshop to serve as an actual scientific analysis environment.
Historically, Apple's primary focus has been on the consumer market. However, given the UNIX underpinnings of OS X and technologies such as FireWire, QuickTime, and even iPhoto, Apple's products give scientists working with digital imaging tools to pursue their research questions at price points never before seen.
QuickTime, Apple's product for media of all types, is now over ten years old. I have long thought that QuickTime has the potential to become a platform in its own right, functioning as the heart of media devices, and perhaps with Apple's focus on digital consumer devices this will become reality. Regardless, QuickTime has been used from its inception by scientists and engineers to display information. I myself was using it in 1992 to display sequential still images of neuronal growth cones as time-lapse video.
FireWire, from Apple (accepted as an official industry standard in 1995), has made the process of actually capturing image data from cameras potentially much easier and cheaper. For example, those using digital imaging for microscopy have for years needed to purchase not only the CCD camera but also special (and expensive) hardware interface cards, or frame grabbers, to get the information from the camera to the computer. Cameras for science use are now coming onto the market with FireWire interfaces built in, greatly simplifying the process of image capture while providing potentially large increases in data throughput, resulting in higher frame rates among other benefits.
It is important to realize that the tools of digital imaging and their benefits flow both ways between the science arena and the consumer markets, and this nexus is one of Apple's strong points. You might remember, or be interested to know, that the iPod is not Apple's first foray into digital consumer devices. While the first patents for a "film-less" electronic camera were awarded to Texas Instruments in 1972, Apple was the first company to bring a truly digital still camera to the consumer market, in 1994. The QuickTake 100, when released, could take digital still images and be connected to a personal computer via a serial cable so that the images could be downloaded and then manipulated. I know a number of folks who were introduced to digital imaging with the QuickTake 100. (I was one of them, eventually trading in my QuickTake 100 for a QuickTake 200 and, last year, upgrading to a Canon S20.) Although these cameras are oriented toward the consumer market, it can certainly be argued that science users have benefited greatly from the development of digital imaging for this market as well. While CCD cameras for scientific use typically have tighter tolerances, square pixels, and so on than consumer-grade CCDs, the economies of scale help push the price of digital imaging down for scientists and help create tools which scientists can use in their research. Indeed, many scientists' needs are quite nicely filled by current consumer-level CCDs.
That said, we should now address the issues concerning data. Going back to the earliest massive construction projects, scientists and engineers have long needed data management and storage. Wax and stone tablets from some of the earliest civilizations on earth served as data repositories for commerce and for the logistics of undertaking large projects. Starting in the 1940s with the advent of computers, data management became much more complex, beginning with the use of paper tape and then magnetic tape to record and archive data. Today the use of optical media is much more common, and we are on the cusp of being able to store and retrieve truly mind-boggling amounts of data using holographic storage. One only has to look at companies like Oracle to see that many have realized the need for tools to manage this data.
So, why bring all of this data management business up, and what does it have to do with Apple? As I alluded to before, digital images are data made visible. Images, still and motion, take up huge amounts of storage space. But beyond the actual space requirements of images, we need the means to organize our data and images. Apple is not the first company to see the need for image archives. Manufacturers of digital imaging products have all had proprietary products to manage images taken using their microscopes and other instruments, and products like Phylum from Improvision, Ugather from Codeblazer Technologies, and Image-Pro from Media Cybernetics have been around for a while. Additionally, database applications not traditionally designed for image archiving have also been pressed into service as image archiving tools.
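To make the storage problem concrete, here is a quick back-of-the-envelope sketch. The camera resolution, bit depth, and capture rate are hypothetical figures chosen only for illustration, but they are in the right ballpark for a scientific CCD of the day:

```python
# Back-of-the-envelope storage figures for uncompressed image data.
# Assumed (hypothetical) setup: a 1280x1024 monochrome science CCD
# producing 12-bit data stored as 16-bit, running a time-lapse
# experiment at one frame every 10 seconds for 24 hours.

width, height = 1280, 1024
bytes_per_pixel = 2                     # 12-bit samples padded to 16 bits
frame_bytes = width * height * bytes_per_pixel

frames = 24 * 60 * 60 // 10             # one frame per 10 s for a day
run_bytes = frames * frame_bytes

print(f"one frame:       {frame_bytes / 2**20:.1f} MB")   # 2.5 MB
print(f"24 h time-lapse: {run_bytes / 2**30:.1f} GB")     # 21.1 GB
```

A single day of modest time-lapse microscopy already runs to tens of gigabytes uncompressed, which is why the organizing and archiving tools discussed here matter as much as the cameras themselves.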
Apple's role in digital imaging with QuickTime, FireWire, and iPhoto has been to apply its talents to dramatically simplifying the process of storing and organizing image data. With the advent of iPhoto, I am sure that many in the software industry will copy the techniques and interfaces it pioneered for digital still image acquisition and archiving. Perhaps Apple itself will even become interested in making an iPhoto Pro product for higher-end needs.
In conclusion, we see that many of the digital imaging technologies required by scientists and engineers are already present on the Macintosh platform. Future articles in this column will illustrate how these technologies are used in various areas of science, and we aim to review products that fill the needs of science users doing digital imaging on the Macintosh. Your feedback on the products you would like to see reviewed and evaluated is critical to fulfilling these goals.