Wood
Before stone there was wood. We humans are, after all, products of the forests. No material has followed the history of our species, from millions of years ago in the heart of Africa to the present, more intimately and persistently than wood has. This simple fact is often lost in the archeologist's understandable passion for more durable substances like stone and brick, but our material world was once almost exclusively one of logs, sticks, bark, twigs, bamboo, and the myriad other forms that wood takes in nature.
No material is so impressive in its versatility. This follows from nature's own design, endowing woody plants with a great range of strengths, densities, and degrees of flexibility. This, in turn, is largely due to the composite structure of the material—generally combinations of strong fibers and tough binders. The result may be dense and heavy, like oak or hornbeam, or soft and light, like balsa. It may be extremely flexible, like willow or laurel, or tough and resistant to change, like teak or mahogany.
Mastery of wood allowed humans to transform rivers and seas from barriers to highways; to build fences, homes, and walls, even when little stone was available; and to furnish their lives with everything from beds and chairs to buckets and barrels. In the last century or so, this mastery—and the dependency that followed—has extended to entirely new forms, including slices, laminates, particles, and chips.
Clay & ceramics
In the 21st century, we tend not to think much about pots—or any of a host of things made from fired clay. But to archeologists, the existence or absence of ceramics is one of the most fundamental indicators of a society's stage of development. An "aceramic" society has few ways of storing food or water, and thus its agriculture remains limited and primitive until it learns how to fire clay.
Clay can be shaped and then hardened by drying, but only when it is baked to a sufficiently high temperature do its chemistry and structure permanently change to make it impervious to liquids and allow it to retain its shape over a range of conditions. The earliest of the great civilizations—in Mesopotamia (modern Iraq), China, and India—thrived on creating not only practical containers but also colorful tiles, statues, and jewelry. Combining clays with other minerals and then firing the mixture resulted in brilliant colors—the first glazes.
Ironically, for this most ancient of manufactured materials, some of the most advanced substances to come from 21st-century laboratories, such as high-temperature superconductors, are ceramics.
Fiber & cloth
So ubiquitous is cloth that we hardly even think of it as a material, but it is arguably the first complex manufacture that humans mastered. While a limited number of "fibers" exist in nature, there is no cloth. And the fibers that do exist—silk, cotton, or wool, for example—have to be transformed before they can be made into cloth. How humans learned to take the hair of a sheep or goat and twist hundreds or thousands of individual hairs together to make woolen thread or yarn, or how they learned to soak the straw of the flax plant to separate fibers that they could then spin into linen, is one of the great mysteries of our early history.
One of the qualities of cloth that has appealed so strongly through the ages is its great variety. The Chinese discovered perhaps as much as 5,000 years ago that they could use the fine exudations of the silkworm larva to make a sheer and exquisite cloth. About the same time, cloth makers in India and Egypt were spinning the fibers of the cotton plant into a remarkably versatile fabric, one that they could weave into a range of weights and textures. Throughout much of the world, our forebears found that they could work the hair from a variety of herded animals, from camels to sheep, into some of the softest fabrics known or the scratchiest materials ever to rub human skin.
The versatility and variety of cloth have not diminished with the application of modern chemistry to its improvement. While some of the earlier efforts at fashioning clothing from synthetic materials such as rayon or polyester became the butt of jokes, today it is a rare wardrobe that does not feature some cloth whose history began in a test tube rather than on a farm.
Copper & bronze
The first metal that humans were able to make serious use of was copper. But this metal, occasionally found as nuggets though more commonly reduced from ores, is fairly soft and will not hold an edge. Bronze, an alloy of copper with small amounts of tin, is a much more useful form. We do not know how craftsmen came up with this combination, but thousands of years of working with copper preceded the discovery of bronze some 5,000 to 6,000 years ago (in a number of places).
As the first intentionally produced alloy, bronze set the precedent for the widespread use of metals, notably for weapons. The first swords had bronze blades, and the material's ability to hold a sharp edge while resisting chipping or breaking was perhaps its most valued attribute.
Because the constituents of bronze, particularly tin, are not especially common, they were among the earliest objects of long-distance trade. Even before classical times, the Mediterranean region saw extensive trade of copper from Cyprus (the island that gave the metal its Latin name, cuprum) and tin from sources as far away as southwest England (Cornwall).
Iron & steel
Even in the 21st century, no more important metal exists than iron, and this has been true for as much as 3,000 years. Workable ores of iron occur in almost every part of the world, and a variety of techniques can produce forms of the metal with a great range of properties. Historically, there have been three basic forms of iron: wrought iron, cast iron, and steel. Craftsmen relying entirely on experience and observation discovered each of these forms and used them for centuries. It was not until the 19th century that the constituent differences among them were understood, particularly the role of carbon.
Wrought iron is almost pure iron, a metal that can be readily worked in a forge and that is tough and yet ductile, meaning it can be hammered into shape. Cast iron, on the other hand, has a marked amount of carbon, perhaps as much as five percent, mixed in with the metal (in both chemical and physical combination). This constitutes a product that, unlike wrought iron, can be melted in charcoal furnaces and thus poured and cast in molds. It is very hard but also brittle. Historically, cast iron was the product of blast furnaces, first used by Chinese metalsmiths perhaps as early as 2,500 years ago.
For the last century and a half, the most important form of iron has been steel. Steel is actually a great range of materials, whose properties depend both on the amount of carbon contained—typically between about 0.2 and 2 percent—and on other alloying materials. Generally, steel combines the toughness of wrought iron with the hardness of cast iron; hence it has historically been valued for such uses as blades and springs. Before the middle of the 19th century, achieving this balance of properties required craftsmanship of a high order, but new tools and techniques, such as the open-hearth furnace and the Bessemer process (the first inexpensive industrial process for mass-producing steel), made steel cheap and plentiful, displacing its rivals for almost all uses.
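The three traditional forms of iron described above can be summarized, to a first approximation, by carbon content alone. The sketch below is not from the source; the cutoff values are simplified illustrations, since real classifications vary by source and also depend on other alloying elements and on microstructure:

```python
def classify_iron(carbon_pct):
    """Rough historical classification of iron by carbon content.

    The boundary values here are illustrative only; real-world
    classifications also depend on alloying and microstructure.
    """
    if carbon_pct < 0.1:
        return "wrought iron"   # nearly pure iron: tough, ductile, forgeable
    elif carbon_pct <= 2.0:
        return "steel"          # toughness of wrought iron, hardness of cast iron
    else:
        return "cast iron"      # meltable and castable, but hard and brittle

print(classify_iron(0.03))  # nearly pure iron   -> wrought iron
print(classify_iron(1.0))   # blade-making range -> steel
print(classify_iron(3.5))   # blast-furnace iron -> cast iron
```

The point of the sketch is how narrow the useful window is: a difference of a percent or two of carbon separates a metal that can be forged from one that shatters under the hammer.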
Glass
The first glass was probably made by accident, as sand found its way into a kiln and then fused. The result was a material that resembles a ceramic in its hardness and brittleness but is very different in structure and properties. While we think of glass as normally clear, it took centuries of experience and experiment to produce a reasonably clear glass, probably not until Roman times.
Glass is made by heating sand (silicon dioxide) with a "flux," a mineral (such as sodium carbonate) that lowers the melting point of the mixture. The color or clearness of the glass depends on the purity of the ingredients and the particular combination of sand and flux. The result is a transparent or translucent material that is quite brittle. Although glass is often popularly described as a liquid that slowly "flows," it is better understood as an amorphous solid: it lacks any crystalline structure, but at ordinary temperatures it does not flow at any observable rate.
The first glassmakers who could turn out truly clear glass consistently and reliably were the Venetians, whose "cristallo" set the standard for fine glass in the late Middle Ages. Glass was much too expensive for common use in windows until about the 17th century, and true widespread application had to wait until industrial-scale manufacture in the 19th century. The material's versatility has continued to flower up to our own day, when cables of glass fiber constitute the backbone of our Information Age.
Paper
Humans set down their language in writing for thousands of years before they had paper—writing can be put onto stone, clay, wood, cloth, skin, and other surfaces. But despite the diversity of alternatives, and the emergence of electronic media, a world without paper would be very different from the one we know.
Paper is one of a number of inventions that came out of an astonishingly creative period in China about 2,000 years ago. When fibers—either directly from a plant or, more typically, from discarded cloth—are permitted to soak and partially rot, they can be reconstituted into randomly entangled sheets. By carefully pressing these sheets until dry and then, usually, "sizing" them with a starch (to seal gaps among the fibers), an exceptionally versatile material can result. The variety of forms the final paper can take ranges from the fineness and flimsiness of tissue to the toughness of kraft and other wrapping papers or the "washi" paper the Japanese use for lanterns, screens, and even walls.
Plastics & rubber
For thousands of years, our ancestors appreciated a small number of materials distinguished by their smooth and often colorful appearance: Ivory, tortoiseshell, and horn could be turned into small luxury items that bore a warmth and elegance hard to achieve in other substances. In the 1800s, the growth of markets, combined with a greatly enlarged understanding of chemistry, gave rise to a whole new class of artificial substances that made it possible to turn out such items in great quantities at low cost. The first commercial plastic was celluloid, made from nitrated cotton and camphor. When this combination was heated under pressure, it was transformed into an astonishingly versatile substance that could be made into everything from combs and collars to dolls, playing cards, and, eventually, ping pong balls.
In the 20th century, this first plastic was joined by a host of substances that were even more, well, plastic. The new plastics, by then typically made from by-products of coal or petroleum production, were sometimes fashioned into more ersatz luxury items. But their properties also lent themselves to a host of technical uses, from telephones and other electrical devices to substitute body parts and other medical devices that would be all but unthinkable made out of any other substance.
Closely related to these plastics was rubber, which started out as a natural product brought to Europe by early explorers of South America. Indigenous peoples used the milky sap of Hevea brasiliensis for waterproofing and for making bouncing balls. The Europeans adapted lumps of dried sap to rub out pencil marks (hence the name "rubber"). It was not until about 1840 that Charles Goodyear discovered vulcanization, which turned the material into a range of stable forms good for everything from combs to inflatable rafts. The rubber pneumatic tire proved indispensable for motor transport in the 20th century, and this dependence led to the invention of synthetic rubber, much as it had for the plastics.
Aluminum
For millennia, the range of metals available for making things comprised a limited set of elements. Apart from precious metals used for money and jewelry, the practical metals were exclusively forms or alloys of lead, copper, or iron. This changed in the late 19th century, most spectacularly with the introduction of aluminum. This was a metal that was not even suspected to exist until about 200 years ago, when chemists began to use new tools to explore the composition of common minerals.
One of the minerals they investigated was alum, which people had relied on since ancient times as an astringent and in making dyes more stable. This material appeared to have an unknown metallic origin, but the metal was not separated out until the 1820s and not purified until the 1850s. Even then, it was very expensive and difficult to make. The wonderful properties of aluminum—especially its remarkable light weight and its silvery luster—attracted wide attention. Precious jewelry and exotic objects such as opera glasses were fabricated from it. It was even deemed fitting to make the small apex of the Washington Monument, completed in 1884, out of pure aluminum.
The status of aluminum changed dramatically in the 1880s, when two young chemists, one in France and the other in the United States, discovered how to make pure aluminum metal using strong electric currents. This "electrolytic process" made the metal readily available, depending only on the availability of cheap electricity. The coming of aviation in the early 1900s gave aluminum new strategic value, and expanded production to meet wartime demand led to the metal becoming one of the most ubiquitous of the 20th century.
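Why the electrolytic process tied aluminum to cheap electricity can be sketched with Faraday's law of electrolysis, which relates the mass of metal reduced to the electric charge passed. The calculation below is not from the source: the cell voltage is an assumed, illustrative figure, and 100 percent current efficiency is assumed.

```python
F = 96485.0   # Faraday constant, coulombs per mole of electrons
M_AL = 26.98  # molar mass of aluminum, grams per mole
Z = 3         # electrons needed to reduce one Al(3+) ion to metal

def charge_per_kg_al():
    """Charge in coulombs to deposit 1 kg of aluminum, assuming
    100% current efficiency (real cells are less efficient)."""
    moles_al = 1000.0 / M_AL
    return moles_al * Z * F

def kwh_per_kg(cell_voltage=4.2):
    """Electrical energy per kilogram of aluminum at an assumed
    (hypothetical) cell voltage."""
    joules = charge_per_kg_al() * cell_voltage
    return joules / 3.6e6  # joules -> kilowatt-hours

print(f"{kwh_per_kg():.1f} kWh per kg of aluminum")
```

Even this idealized figure runs to roughly a dozen kilowatt-hours per kilogram, and real smelters consume somewhat more, which is why access to cheap electricity governed where the metal could be made.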
Semiconductors
As scientists began to develop a much deeper appreciation for the various properties of materials, particularly from the mid-19th century onward, one of the key behaviors they described was how readily or not an electric current passed through the material. Some substances, such as wood or glass, appeared to resist the passage of electricity, while others conducted currents easily. The former were used to insulate and the latter, such as copper and then aluminum, as conductors, typically in the form of wires. These investigations of electrical properties turned up another class of materials that resided midway between the two main classes. Such "semiconductors," including carbon, allowed electric currents to pass but only poorly and with some physical response. (Carbon, for example, tends to heat up, which is why it was the material Edison found suitable for the filaments of his light bulb.)
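The gulf between the three classes can be made concrete with the textbook relation R = ρL/A for a uniform bar of material. The resistivity figures below are not from the source; they are order-of-magnitude illustrations only, since real values depend heavily on purity, doping, and temperature:

```python
# Illustrative room-temperature resistivities in ohm-metres.
# Order-of-magnitude values only; real figures vary greatly
# with purity, doping, and temperature.
RESISTIVITY = {
    "copper (conductor)":      1.7e-8,
    "silicon (semiconductor)": 2.3e3,   # intrinsic, i.e. undoped
    "glass (insulator)":       1e12,
}

def bar_resistance(rho, length_m=1.0, area_m2=1e-6):
    """Resistance of a uniform bar: R = rho * L / A.
    Defaults model a 1 m run with a 1 mm^2 cross-section,
    roughly the geometry of a thin wire."""
    return rho * length_m / area_m2

for name, rho in RESISTIVITY.items():
    print(f"{name:26s} {bar_resistance(rho):10.3g} ohms")
```

For the same geometry the three classes span some twenty orders of magnitude in resistance, which hints at why the middle class proved so useful: its conduction is poor enough to be manipulated yet good enough to carry a signal.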
These 19th-century investigators could never have suspected that in the latter part of the next century, these semiconductors would be put to work in ways that would transform our world as profoundly as any other material. Early in the 20th century, inventors devised means of controlling electrical currents and waves in remarkable ways, leading to marvelous new forms of communication, such as radio and television. The means they discovered involved controlling the actions of electrons: When these movements were controlled precisely, sound, light, and other phenomena could be picked up, transmitted, replicated, amplified, and manipulated to inform, entertain, investigate, and calculate in ways never before possible.
At first this electronic control depended on vacuum tubes, but these came at a great cost in energy, speed, and efficiency. Spurred by World War II and then by rapidly expanding demands for communications and control technologies, alternatives to vacuum tubes, such as the transistor, began to appear in the postwar years. These alternatives, at first made with exotic and expensive semiconductor materials, were not initially easy to manufacture or use. But when chemists discovered how to grow large, very pure crystals of silicon, a host of new possibilities opened up. With the development of techniques for imprinting entire complex circuits, including the workings of digital computers, cheaply and reliably onto wafers of silicon, a revolution in information and communications became possible, a revolution that we continue to see unfolding before us today.