Code by Charles Petzold, published 21 Oct 2000
What do flashlights, the British invasion, black cats, and seesaws have to do with computers? In CODE, they show us the ingenious ways we manipulate language and invent new means of communicating with each other. And through CODE, we see how this ingenuity and our very human compulsion to communicate have driven the technological innovations of the past two centuries.
Using everyday objects and familiar language systems such as Braille and Morse code, author Charles Petzold weaves an illuminating narrative for anyone who’s ever wondered about the secret inner life of computers and other smart machines.
It’s a cleverly illustrated and eminently comprehensible story—and along the way, you’ll discover you’ve gained a real context for understanding today’s world of PCs, digital media, and the Internet. No matter what your level of technical savvy, CODE will charm you—and perhaps even awaken the technophile within.
I'll be honest: I only read this book because Joel Spolsky cited it as a must-read in a Stack Exchange answer about how to go about learning programming (and finding out whether you want to, or should, be a programmer).
I was a little hesitant because of the year of release: at some eleven years old, that's a long time in the tech world. Ultimately, though, that doesn't matter. I defy any developer, programmer or system builder to read this book and not blitz through it, lapping it up. Yes, if you've done some schooling in computing or computer science you may already be familiar with much of the content, but you'll surely find things you've either never thought about in much depth or never seen explained in quite the elegant way that Petzold manages. For me, whether it was down to age, experience or the maturity that comes with both, the book filled gaps in my memory and, indeed, gaps in student course material.
Petzold opens up the world of computing through a concise, linear storytelling format. Starting from a basis in Morse code and Braille, he moves through the telegraph system, barcodes, Boolean logic, circuits with memory, von Neumann machines, peripherals, I/O devices and GUI interfaces, until we just about catch up to the modern era with talk of HTTP and the World Wide Web, having pretty much built the systems under discussion (or simplified versions of them) through the incremental circuit and system diagrams along the way.
Admittedly there are some rather 'of their time' phrases and facts that raise a smile (low resolutions, high costs for 'small' hard-drive storage sizes, consumers using cassette tapes), but this is all still valid information when taken in the context of the time of writing.
If you are a developer or programmer, you're not going to go into work having had an epiphany about how to do things better, but you may have a new-found respect for what you're doing and the many, many ingenious shoulders you are standing on.
Raise your hand if you think metaphors and analogies should be used sparingly. I'll raise my hand with you. This book is for us.
After reading this book, I can see behind the pixels on my computer screen. I know what I'm really looking at. So many layers of abstraction are removed by learning about how logic gates can be arranged as processors and RAM, how code is simply a representation of those microscopic switches being flipped, and how pixels are simply a graphical interpretation of the state of particular switches. Moreover, I also have a little bit of an understanding of the historical evolutions these inventions and conventions went through: not just how computers work, but why they work that way and how they came to be.
The book was tougher to grasp than I thought it would be (I do not have an extensive background in electronics or programming). Although it started off easily, it became progressively more complicated except for the last chapter or two. Of course, this was to be expected, as the book began with the basic building blocks of a computer and built progressively more complicated systems from those initial components. However, the problem wasn't really a result of the subject matter, but of the writing style, which seemed to grow more terse in later chapters. I was left with the impression that the author felt he was running out of space, which I'm sure he was; it must be difficult to keep a book with such a vast scope to a manageable size and prevent it from turning into a reference manual. I would characterize this book as grueling, but that might be because I was obstinate in making sure I fully understood every detail of every page. There were a few pages that I had to pore over repeatedly until I reached a eureka moment. A few more explanatory sentences here and there would have alleviated this, but ultimately, drawing my own conclusions was very rewarding. The book seemed to recover from its gradually adopted terseness with an appreciated but sudden reference to the first chapter in the very last sentence. Someone less focused and more inclined to skim might find this book a bit lighter reading; even so, it only took me a few days to read the whole thing.
I was surprised to see that the book did not really cover how transistors work at the electron level, which leaves what I consider to be a major gap in any understanding of how modern computers based on integrated circuits work. The text says that transistors are functionally equivalent to electromechanical relays or vacuum tubes and work similarly, but hardly any more than that. This missing knowledge is something that would have been appreciated and wouldn't have taken up much space. It seems like an especially glaring omission when juxtaposed with the inclusion of a few pages on EBCDIC, an obsolete alternative to ASCII text codes descended from paper punch cards.
Despite these minor gripes, this is a really great book, and I highly recommend it to anyone who has the interest and persistence to get through it. It teaches and ties together many mathematical and electrical concepts, and the payoff for the reader is a new perspective on computing. Despite being first published in 1999, it hardly seems dated at all, probably because it's really a history book and most of the computing history it covers happened in the 1980s and earlier. All computing history after that is basically just increasingly complex variations on those simpler foundations. A sequel would be welcome.
P.S. I think I've discovered a typo in the assembly code program on page 322. It seems to me that there should be an additional "AND A,0Fh" after the four lines of "RRC" and before the first "CALL NibbleToAscii" line. If I'm wrong, would anyone mind explaining why? And if I'm correct, would anyone mind giving me peace of mind by confirming this? Thanks! :)
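For what it's worth, the suspicion makes sense under one assumption: that the book's NibbleToAscii routine expects its input to already be a single nibble (0 to 15). The sketch below is my own Python stand-in, not the book's code; it shows that after four circular rotations the old low nibble sits in the high bits, so a mask is needed before the call.

```python
# Hypothetical Python stand-ins for the 8080 instructions in question.
# Assumption: the book's NibbleToAscii expects a value in 0..15.

def rrc(a):
    """8080 RRC: rotate the 8-bit accumulator right by one bit, circularly."""
    return ((a >> 1) | (a << 7)) & 0xFF

def nibble_to_ascii(n):
    """Assumed behavior of NibbleToAscii: map 0..15 to '0'..'F'."""
    assert 0 <= n <= 15, "input must already be a single nibble"
    return "0123456789ABCDEF"[n]

a = 0x5B                 # example byte to display as the two characters "5B"
rotated = a
for _ in range(4):       # four RRCs move the high nibble into the low bits...
    rotated = rrc(rotated)
# ...but rotated is now 0xB5: the old LOW nibble occupies the high bits,
# so without "AND 0Fh" the routine would receive 0xB5, not 0x05.
high_digit = nibble_to_ascii(rotated & 0x0F)   # mask applied: '5'
low_digit = nibble_to_ascii(a & 0x0F)          # low nibble: 'B'
print(high_digit + low_digit)                  # prints "5B"
```

Without the `& 0x0F` mask, the assertion fires, which is the failure mode the reviewer is describing.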
"Electricity is like nothing else in this universe, and we must confront it on its own terms." That sentence, casually buried near the beginning of the book, exemplifies the engineer's muse: a striving to become aware of the inhuman and how it operates, and to find a means of creating a socket for human enterprise, something to extend the fallible chassis of our flesh.
The first two-thirds or so of this book follows a double track. One track covers the ways in which meaning may be encoded into messages; the other weaves repetitions of a relatively simple device, the telegraph relay, into machines that marshal electricity into the forms of logic and memory. These two tracks eventually coincide at the device we know as a computer. Though it would be wildly impractical to build a modern computer from telegraph relays, the machines we use today perform the same tricks with electricity that were possible in the 19th century.
The last third of the book is more concerned with the makeup and successive improvements in implementation of the devices that embody the marriage of electricity and meaning. For someone like me, accustomed to the elves of the internet bringing me regular helpings of news, porn, and status updates from the virtual smörgåsbord, it was interesting to see how computers have been made so much easier to use since the era of assembly code and text terminals.
Regarding electricity, that prime mover of the information age, it has struck me that electricity is the stuff minerals dream with, and that we may have subjected an alien order to the vagaries of our desire without being prepared to one day pay the price. We live, all of us, in an era of debt, making allowances even for a future of submerged cities and massive conflicts fostered by drought. When it finally comes time to pay off our mineral deficit, will it be our dreams, that which makes us human, that are ultimately forfeit?
If you work with computers and didn't read this book, you are lame.
What a ride! A book about computers “without pictures of trains carrying a cargo of zeroes and ones” — the absolute no-nonsense book on the internals of the computer. From circuits with a battery, switch and bulb to logic gates to a thorough description of the Intel 8080. Great way to fill blanks in my computer knowledge.
The book takes the approach of constructing the computer "on the paper and in our minds". That's great when you're at least a little familiar with the topic, maybe less so when you're trying to discover completely unknown territory (but the author goes to great lengths to walk through everything step by step, e.g. the various gates, binary subtraction, memory handling, etc.).
In a way, this is a perfect book on the topic. If you know a better one, I want to read it.
Every single person in tech should read this book. Or if you're just interested in tech. Or if you just want a basic appreciation of one of the most important technologies in human history—the computer.
This book contains the best, most accessible explanation I've seen of how computers work, from hardware to software. The author manages to cover a huge range of topics (electricity, circuits, relays, binary, logic, gates, microprocessors, code, and much more) while doing a remarkable job of gradually building up your mental model using lots of analogies, diagrams, and examples. Just about everyone should be able to understand the majority of the book and gain a deep appreciation of what's really happening every time they use a laptop or smartphone or read this review online.
I wish I had this book back in high school and college. I've been coding for 20 years and I still found a vast array of insights in the book. Some of the topics I knew already, and this book helped me appreciate them more; others, I knew poorly, and now understand with better clarity; still others were totally new. A small sampling of the insights:
* Current is the number of electrons flowing past a point per second. Voltage is a measure of potential energy. Resistance is how much the substance through which electricity is flowing resists the passage of those electrons. The water/pipes analogy is great: current is similar to the amount of water flowing through a pipe; voltage is similar to the water pressure; resistance is similar to the narrowness of the pipe (a thin pipe resists the flow more than a wide one). I took an E&M physics course in college and, while I learned all the current/voltage/resistance equations, I never got this simple, intuitive understanding of what it all actually means!
* We use base 10 because we have 10 fingers; a "digit," after all, is just a finger (so obvious when you actually take a second to think about it!). Had we been born with 8 fingers, like most cartoon characters, we'd probably use base 8 math. Computers use base 2 because building circuitry based on two states—the presence or absence of voltage (on and off, 1 or 0)—is much easier than circuitry based on ten states.
* The notation we use in math is essential. It's not about looking pretty or not, but about actually making the math easier or harder. For example, addition and subtraction are easy in Roman numerals, but multiplication and division are much harder. Arabic numerals make multiplication and division much easier, not least because they include a 0. Sometimes in math you switch to a different coordinate system or a different geometry to make solving a problem easier. So it's no surprise that programming languages have the same property: while any language can, in theory, solve the same problems as any other, in practice some languages make certain problems much easier than others.
* This book does a superb job of showing how logic gates (AND, OR, etc) can be built from simple physical circuits—e.g., from relays, which are much easier to imagine and think about than, for example, transistors—and how easy it is to do math with simple logic gates. I remember learning this back in college, but it still amazes me every time I see it, and with the crystal-clear examples in this book, I found myself smiling when I could picture a simple physical circuit of relays that could do arithmetic just by entering numbers with switches and passing some electricity through the system (e.g., to add, you have a sum and a carry, where the sum is an XOR and the carry is an AND).
* The explanation of circuits that can "remember" (e.g., the memory in your computer) was superb and something I don't remember learning at all in college (how ironic). I love the idea that circuits with memory (e.g., latches) work based on a feedback mechanism: the output of the circuit is fed back into the same circuit, so if it gets into one state (e.g., on, because electricity is flowing through it), that feedback mechanism keeps it in that state (e.g., by continuing the flow of electricity through it), effectively "remembering" the value. And all of this is possible because it takes a finite amount of time for electricity to travel through a circuit and for that circuit to switch state.
* The opcodes in a CPU consist of an operation to perform (e.g., load) and an address. You can write assembly code to express the opcodes, but each assembly instruction is just a human-friendly way to represent an exactly equivalent binary string (e.g., 32 or 64 binary digits in modern CPUs). You can enter these opcodes manually (e.g., via switches on a board that control "on" and "off"), and each bit of an instruction becomes a high or low voltage. These high and low voltages pass through the physical circuitry of the CPU, which consists of logic gates. Based purely on the layout of those logic gates, voltage comes out the "other end," triggering new actions: it may produce low and high voltages in a memory chip that then "remembers" the information (store) or returns information that was previously "remembered" (load); it may produce low and high voltages passed to a video adapter that, based on the layout of its own logic gates, results in an image being drawn on a screen; or it may produce low and high voltages fed back into the CPU itself, causing it to read another opcode (e.g., perhaps from ROM or a hard drive, rather than physical switches) and repeat the whole process. This is my lame attempt at describing, end to end, how software affects hardware and results in something happening in the real world, based solely on the "physical layout" of a bunch of circuits with electricity passing through them. I think there is something magical about the fact that the "shape" of an object is what makes it possible to send emails, watch movies, listen to music, and browse the Internet. But then again, the "shape" of DNA molecules, plus the laws of physics, is what makes all of life possible too!
And, of course, you can't help but wonder what sort of "opcodes" and "logic gates" are used in your brain, as your very consciousness consists entirely of electricity passing through the physical "shape" of your neurons and the connections between them.
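Two of the ideas in the bullets above, adding with an XOR and an AND, and remembering with feedback, can be sketched in a few lines of Python, with booleans standing in for relay voltages. This is an illustration of the concepts, not the book's own circuit diagrams:

```python
def half_adder(a, b):
    """Add two 1-bit numbers: the sum bit is XOR, the carry bit is AND."""
    return a ^ b, a & b            # (sum, carry)

def full_adder(a, b, carry_in):
    """Chain two half adders (plus an OR) to propagate a carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

print(half_adder(1, 1))            # 1 + 1 = binary 10, so prints (0, 1)

def nor(x, y):
    """A NOR gate: output is on only when both inputs are off."""
    return 1 if (x == 0 and y == 0) else 0

class SRLatch:
    """Cross-coupled NOR gates: each output feeds the other gate's input."""
    def __init__(self):
        self.q, self.q_bar = 0, 1
    def step(self, s, r):
        for _ in range(4):         # re-evaluate until the feedback settles
            self.q = nor(r, self.q_bar)
            self.q_bar = nor(s, self.q)
        return self.q

latch = SRLatch()
latch.step(s=1, r=0)               # "set" pulse drives q to 1
print(latch.step(0, 0))            # inputs released, yet it remembers: 1
latch.step(s=0, r=1)               # "reset" pulse drives q back to 0
print(latch.step(0, 0))            # prints 0
```

The settle loop stands in for the finite propagation time the bullet mentions: the gates are evaluated repeatedly until the feedback reaches a stable state.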
There are a few places the book seems to go into a little too much detail—e.g., going over all the opcodes of a specific Intel CPU—and a few places where it seems to skip over all the important details—e.g., the final chapter on modern software and the web—but overall, I have not found another book anywhere that provides as complete of a picture of how a computer works. Given the ubiquity of computers today, I'd recommend this book to just about everyone. It'll make you appreciate just how simple computers really are—and how that simplicity can be used to create something truly magical.
As always, I've saved a few of my favorite quotes from the book:
A computer processor does moronically simple things—it moves a byte from memory to register, adds a byte to another byte, moves the result back to memory. The only reason anything substantial gets completed is that these operations occur very quickly. To quote Robert Noyce, “After you become reconciled to the nanosecond, computer operations are conceptually fairly simple.”
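The loop the quote describes, move a byte in, add a byte, move the result back out, can be sketched as a toy machine. The opcodes, program layout, and `run` function below are invented for this sketch; they are not the book's 8080 examples:

```python
# A toy fetch-execute loop: the "CPU" can only load, add, store, and halt,
# yet that is already enough to compute. All opcode values are made up.
LOAD, ADD, STORE, HALT = 0x10, 0x20, 0x30, 0xFF

def run(memory):
    acc, pc = 0, 0                            # accumulator, program counter
    while True:
        op, addr = memory[pc], memory[pc + 1] # fetch opcode and operand
        pc += 2
        if op == LOAD:
            acc = memory[addr]                # memory -> register
        elif op == ADD:
            acc = (acc + memory[addr]) & 0xFF # add a byte to another byte
        elif op == STORE:
            memory[addr] = acc                # register -> memory
        elif op == HALT:
            return memory

# Program: load mem[8], add mem[9], store the sum at mem[10], halt.
mem = [LOAD, 8, ADD, 9, STORE, 10, HALT, 0, 3, 4, 0]
print(run(mem)[10])                           # prints 7
```

Each pass of the `while` loop is one of Noyce's "moronically simple" operations; real processors just do billions of them per second.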
The first person to write the first assembler had to hand-assemble the program, of course. A person who writes a new (perhaps improved) assembler for the same computer can write it in assembly language and then use the first assembler to assemble it. Once the new assembler is assembled, it can assemble itself.