Can you imagine if the universe of what we know (or, said another way, what we know about the universe) were restricted to what we can see with our naked eyes?
The classic Powers of Ten short film, created in 1977, does a marvelous job conveying the relative size of things. Beginning with a view of a man and woman having a picnic near a lake in Chicago, the film first zooms out to visualize what the universe looks like 100 million light-years from Earth and then zooms in all the way to a single proton. This is a journey through 40 powers of 10 — from 10²⁴ meters beyond Earth’s galaxy down to 10⁻¹⁶ meters inside the nucleus of an atom. (There’s a more recent version of the movie starting in Venice, Italy; I remain partial to the original.)
The human eye has a limited visual range. Most of us can resolve something as small as a millimeter (10⁻³ meters, like a grain of sand) and pick out detail at distances up to about 100 meters (10² meters, roughly the length of a football field). That’s a span of five powers of ten. Beyond this range, most humans need assistance.
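As a back-of-the-envelope check, the number of powers of ten separating two lengths is simply the difference of their base-10 logarithms. A minimal sketch, using the round numbers above:

```python
import math

# Rough limits of unaided human vision (round numbers from the text)
smallest_m = 1e-3   # a millimeter: a grain of sand
farthest_m = 1e2    # a hundred meters: a football field

# Powers of ten spanned = difference of base-10 logarithms
span = math.log10(farthest_m) - math.log10(smallest_m)
print(span)  # → 5.0
```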
The ancient Egyptians and Mesopotamians were the first to develop lenses — transparent optical devices that focus or disperse light by refraction. A convex lens bends light rays inward toward a focal point; a concave lens spreads them outward. Scientists including Ibn al-Haytham (965–1040) and Isaac Newton (1643–1727) made foundational discoveries that advanced our understanding of optics to the point that instruments could be built to help us see things that couldn’t otherwise be seen.
These instruments vary in their complexity and utility. A magnifying glass is a single convex lens that bends light rays to make an image appear larger than the object actually is. Binoculars use two convex lenses to magnify something in the distance. A movie projector uses convex lenses to project a magnified image onto a screen. Cameras use a series of lenses to bend and focus light onto film or an imaging chip. Eyeglasses and contact lenses fix blurry vision by focusing light on the appropriate spot on the retina — using, for instance, a concave lens to correct nearsightedness. Laser eye surgery reshapes the cornea (the eye’s outermost lens) to improve visual acuity (an outcome of pioneering work on excimer lasers carried out at IBM Research).
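Behind all of these devices sits the thin-lens equation, which relates an object’s distance from the lens, the image’s distance, and the lens’s focal length (a textbook relation, included here for illustration):

```latex
\frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}
```

A convex (converging) lens has a positive focal length f; a concave (diverging) lens has a negative one, which is why a concave lens can shift the eye’s focal point backward onto the retina for a nearsighted viewer.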
Perhaps the two most sophisticated optical instruments are the microscope and the telescope. Microscopes reveal structures too small for the eye to resolve; telescopes bring faraway objects into view. Over centuries these technologies have evolved to increase their range, magnification, and resolution. Today innovation continues to push the boundaries of what we can see.
Microscopes: Seeing tiny cells, molecules, and atoms
A Dutch father-and-son team, Hans and Zacharias Janssen, is credited with making the first compound microscope (one with more than one lens) in 1590. This early model consisted of a narrow tube with a convex lens on one end and a concave lens on the other. It magnified objects between 3x and 9x, but images appeared blurry. By the late 1600s, improved lenses had enhanced resolution and pushed magnification up to 270x, opening the door to all sorts of new applications.
During the next 100 years, optical microscopes were used by botanists, zoologists, physicists, and physicians to study plants, materials, and biological specimens, leading to the discovery of bacteria, observations of cell division, and much more. Over subsequent centuries, microscopes became smaller, higher quality, and more stable, and they grew into critical tools across diverse fields including microelectronics, medicine, biotechnology, and mineralogy.
Today, the compound light microscope remains the most common type of microscope in use, typically with three or four magnification options (40x, 100x, 400x, and sometimes 1000x). Decent-quality microscopes, like those used in schools, run a few hundred dollars; sophisticated instruments cost tens of thousands of dollars (typically with ultra-high-quality optics, motorized stages, and digital image capture). The best optical microscopes can resolve features smaller than a micron (10⁻⁶ meters), limited by the wavelength of the light itself. To see anything smaller still, we need a different type of microscope.
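That wavelength limit can be made concrete with the Abbe diffraction formula, d = λ / (2·NA). The sketch below is illustrative; the 550 nm green-light wavelength and the numerical aperture of 1.4 (a high-end oil-immersion objective) are assumed values, not figures from this post:

```python
def abbe_limit_m(wavelength_m, numerical_aperture):
    """Smallest feature a light microscope can resolve (Abbe limit)."""
    return wavelength_m / (2 * numerical_aperture)

# Green light (~550 nm) through a high-end oil-immersion objective (NA ~1.4)
d = abbe_limit_m(550e-9, 1.4)
print(f"{d * 1e9:.0f} nm")  # → 196 nm, a fraction of a micron
```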
Ernst Ruska built the first electron microscope in 1933, work that later earned him the Nobel Prize. Instead of using visible light to create a magnified image, an electron microscope uses a focused beam of electrons, achieving far higher resolution and magnification. The transmission electron microscope (TEM) shines electrons through a thin sample, while the scanning electron microscope (SEM) sweeps an electron beam back and forth across the surface of an object. According to Guinness World Records, the highest-resolution microscope today is an electron microscope at Cornell University that resolves features as small as 0.039 nanometers (3.9×10⁻¹¹ meters). Electron microscopes cost $300,000 on average, with some running into tens of millions of dollars.
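Why do electrons beat light? Because an electron accelerated through a microscope’s voltage has a de Broglie wavelength thousands of times shorter than visible light. A rough sketch, assuming a 10 kV beam (an illustrative SEM-like energy) and ignoring the relativistic correction that real high-voltage instruments require:

```python
import math

H = 6.626e-34     # Planck constant, J*s
M_E = 9.109e-31   # electron rest mass, kg
Q_E = 1.602e-19   # elementary charge, C

def electron_wavelength_m(accel_volts):
    """Non-relativistic de Broglie wavelength of an electron
    accelerated through accel_volts volts."""
    return H / math.sqrt(2 * M_E * Q_E * accel_volts)

lam = electron_wavelength_m(10_000)
print(lam)  # ~1.2e-11 m, versus ~5e-7 m for visible light
```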
Electron microscopes allow us not only to “see” the shape of a material but also to learn about its chemistry and electronic structure. They are useful for assessing material quality, studying tissue samples, and analyzing failures in microelectronic chips. Early in my career as a research scientist at IBM, I spent countless hours inspecting and measuring tiny semiconductor patterns in an SEM lab — a sound-proof basement space dedicated to this high-sensitivity instrument. The SEM image at the top of this post shows a hexagonal array of nanometer-scale pores formed by polymer materials that self-organize into regular structures. It takes practice to capture great SEM images, which can in fact make beautiful wall art.
Acoustic microscopes use sound waves to penetrate objects and “see” structures inside without causing damage. Karl Dussik was the first to use this approach for medical imaging in 1942 — at ultrasonic frequencies, beyond what is audible to humans. It wasn’t until the 1990s that ultrasonic imaging became commonplace for medical diagnostics, allowing resolution of internal structures like brain tumors, gallstones, blood flow through the heart, and fetal development. I recall with awe the early glimpses of my babies when they were still in my belly – measuring the crown-rump length, checking for abnormalities, and visualizing facial features.
The scanning acoustic microscope was introduced in 1974 by R. A. Lemons and (my thesis advisor) Calvin Quate at Stanford University, detecting reflection and diffraction of sound waves off microscopic structures to yield information about what’s inside. It allows non-destructive visualization of features as small as 100 nanometers (10⁻⁷ meters) and is used to inspect optical and electronic devices for defects.
Scanning probe microscopes are a family of instruments that move a tiny tip along the surface of a sample to monitor the nanoscale surface characteristics (sort of like fingers on Braille or a needle on a record). The scanning tunneling microscope (STM) was invented in 1981 by Gerd Binnig and Heinrich Rohrer at IBM Research, for which they won the Nobel Prize. By using a feedback loop to maintain a constant current flow between the probe tip and a conducting sample, it is possible to achieve atomic resolution.
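The physics behind that atomic resolution is the exponential sensitivity of the tunneling current to the tip-sample gap, roughly I ∝ exp(−2κd). A minimal sketch, assuming a textbook decay constant κ of about 1 per ångström (typical for a few-electron-volt work function; not a figure from this post):

```python
import math

KAPPA = 1.0  # decay constant in 1/angstrom (assumed textbook value)

def current_drop_factor(extra_gap_angstrom):
    """Factor by which tunneling current falls when the tip-sample
    gap widens by extra_gap_angstrom (I ~ exp(-2 * kappa * d))."""
    return math.exp(2 * KAPPA * extra_gap_angstrom)

# Widening the gap by one angstrom (about one atomic radius)
print(current_drop_factor(1.0))  # → ~7.4x less current
```

That order-of-magnitude-per-ångström sensitivity is what lets the feedback loop trace individual atoms.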
The STM’s cousin, the atomic force microscope (AFM), generates a three-dimensional topographic image of any surface. Developed in 1985 by Binnig, Quate, and Christoph Gerber, the AFM is now routinely used to inspect nanometer-scale features of organic and inorganic materials. When I joined Dr. Quate’s research lab at Stanford a decade later, he was already well known as the co-inventor of the scanning acoustic microscope and the AFM. His latest question: could we leverage the same tools used for seeing nanoscale structures in order to create tiny patterns in a controlled and efficient way? This became the topic of my Ph.D. research.
Telescopes: Seeing faraway planets, stars, and galaxies
The first telescope was invented in 1608 by Dutch eyeglass maker Hans Lippershey, who applied for a patent on an instrument “for seeing things far away as if they were nearby.” (Interestingly, Lippershey lived in the very same town in the Netherlands as the Janssens, inventors of the compound microscope. It was an innovative optics community!)
Soon after, Galileo Galilei created his own refracting telescope, increasing the magnification of heavenly objects to 30x, allowing him to discover sunspots and Jupiter’s moons. Newton created a reflecting telescope in 1668, which replaced lenses with mirrors, making it simpler and less expensive to build. Over the following centuries, scientists refined and augmented the telescope, and the entire field of astronomy flourished.
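The magnifying power of a simple refracting telescope like Galileo’s is set by the focal lengths of its two optical elements (a standard optics result, included for illustration):

```latex
M = \frac{f_{\text{objective}}}{f_{\text{eyepiece}}}
```

Reaching 30x thus means pairing an objective with an eyepiece whose focal length is roughly 30 times shorter.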
During the early twentieth century, aided by enhancements to the mirror manufacturing process, large-scale telescopes were built to image and research space. These telescopes were housed in observatories situated in high-altitude locations with unobstructed views of the sky. Active and adaptive optics improved range and resolution, enabling discoveries of new galaxy clusters, multi-planet systems, and extrasolar planets. Many telescopes capture wavelengths beyond the visible segment of the electromagnetic spectrum (including radio waves, infrared, and ultraviolet), which provide unique data and insights. Today’s largest telescopes include the Keck Observatory in Hawaii, the Gran Telescopio Canarias in the Canary Islands, the Large Binocular Telescope in Arizona, and the Very Large Telescope in Chile. Currently, at least three new giant telescopes are under construction.
If we can make so much progress seeing the heavens from Earth, wouldn’t we be able to see things even more clearly if the telescope itself were in space? That was the concept conceived by American physicist Lyman Spitzer in 1946. Space telescopes avoid both light pollution and the filtering and distortion due to Earth’s atmosphere, which plague ground-based observatories. Space telescopes, however, are much more expensive and difficult to maintain.
The first space telescope launched in 1968. Perhaps the most famous of all is the Hubble Space Telescope, which was launched in 1990 into low Earth orbit and is still in operation today. The Hubble has made more than 1.4 million observations at optical and ultraviolet wavelengths, allowing scientists to estimate the age of the universe, establish the presence of black holes in nearby galaxies, and determine the mass and size of the Milky Way. It has also collected awe-inspiring photographs, like those of Jupiter and Europa and of galaxy NGC 2525, situated 70 million light-years from Earth.
On Christmas Day 2021, NASA launched the James Webb Space Telescope, known as the successor to the Hubble. This $10 billion infrared telescope is expected to have a science lifetime of more than a decade. After traveling 1.5 million kilometers from Earth (much farther away than the Hubble), it reached its final destination on January 24, 2022. Now thousands of scientists will leverage this large space telescope to study the universe.
Home telescopes today range in price from roughly a hundred to many thousands of dollars. Simple instruments will do the job for casual stargazing, but extensive deep-space exploration requires more sophisticated optics and electronics. With automated viewfinders and digital image capture, even amateur astronomers can get vivid views of the moon, planets, and stars. Last year, I purchased a home telescope (it seemed like a safe, socially distant pandemic activity), hoping to observe the full lunar eclipse in May 2021. I struggled due to dense cloud and tree cover, but others had better luck, capturing some stunning photos of the so-called “super flower blood moon.”
Centuries of innovations have revolutionized our fundamental understanding of the universe and enabled ground-breaking discoveries. This has transformed medicine, revolutionized physics and astronomy, and launched entirely new industries like microelectronics and biotechnology.
Where would we be without the discoveries of early scientists and the ingenuity of countless engineers? We wouldn’t know about viruses or faraway galaxies, about sperm or our solar system. We’d be blind to tumors and asteroids, protons and black holes, bacteria and exoplanets. We’d be back in the dark ages, literally: guessing or inferring, squinting and straining to resolve what is now illuminated by the technological advances of the microscope and telescope.