The last few weeks I've been musing about digital logic, FPGAs, and Verilog — all things that are appearing more and more regularly in the embedded systems space. So far, you've seen some simple logic gates and how to simulate Verilog using Icarus.
The problem is, simple logic gates are only interesting to a point. Combinatorial logic can transform an input to an output, but it has limitations. Chief among them: it isn't practical to create circuits with memory using purely combinatorial methods. The obvious place you need memory is in the registers of a CPU, but you also need it when building state machines and other circuits you might not immediately associate with memory. For example, if you were building a traffic light controller, you'd need a way to keep track of the light's current state, and that isn't easy to do with combinatorial logic. You'd somehow have to feed the output of the circuit back to its input, and there are a few potential problems with doing that.
One of those problems is that real logic circuits don't react instantly. There is a small but finite delay between when an input changes state and when the output (or outputs) change to reflect the new input. What's worse, a complex circuit may have several paths between inputs and outputs that don't take the same amount of time, and that mismatch can cause momentary glitches on the outputs.
The solution, and the real action, comes when you start playing with sequential logic. I'll spare you all the hairy theory and the drawings of cross-connected inverting gates. What you really need to know is that there is a special class of logic element called a "flip flop". The funny name refers to how the element flips into one state or flops into the other. You can think of a flip flop as a one-bit memory cell that operates with a clock and special rules (depending on the flip flop type).
Verilog has the reg data type, which looks like a little memory element, but don't get confused. While a reg is a key part of a flip flop realization, a reg by itself isn't a flip flop. You'll see what I mean after you examine a few examples.
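Here's a quick sketch of what I mean (the module and signal names are just for illustration). The same reg data type produces plain combinatorial logic in one always block and a real flip flop in the other; it's the sensitivity list, not the declaration, that decides.

```verilog
// A reg only becomes a flip flop when the surrounding code says so.
module reg_demo(
    input  wire clk,
    input  wire a,
    input  wire b,
    output reg  comb_out,   // synthesizes to a plain AND gate, no memory
    output reg  ff_out      // synthesizes to a D flip flop
);
    // Sensitive to any input change: pure combinatorial logic,
    // even though comb_out is declared as a reg.
    always @(*)
        comb_out = a & b;

    // Sensitive only to the rising clock edge: this reg really is memory.
    always @(posedge clk)
        ff_out <= a & b;
endmodule
```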
For our purposes, all flip flops will have a clock input and — this is key — the outputs only change on the rising (or falling) edge of the clock (depending on how you design the flip flop). If the clock is stable (no edge) or presents the wrong edge, then the inputs don't matter.
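You can watch that edge-triggered behavior in a simulator. Here's a little self-contained testbench sketch (you could run something like it under Icarus) that wiggles the D input between clock edges; the output q holds its old value until the next rising edge, no matter what d does in between.

```verilog
`timescale 1ns/1ps
// Testbench sketch: d changes mid-cycle, but q only updates at
// the rising edge of clk.
module dff_tb;
    reg clk = 0, d = 0;
    reg q = 0;

    // The flip flop itself.
    always @(posedge clk)
        q <= d;

    always #50 clk = ~clk;   // 100 ns clock period

    initial begin
        $monitor("t=%0t clk=%b d=%b q=%b", $time, clk, d, q);
        #10 d = 1;    // changes between edges -- q stays put
        #20 d = 0;    // this is what the edge at t=50 will capture
        #30 d = 1;
        #100 $finish;
    end
endmodule
```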
This is why the flip flop is like a memory. It is also how you solve the variable delay issue. Suppose you have an AND gate with two inputs. The delay between your inputs and the first AND gate input is 100 ns; the delay to the second AND gate input is 80 ns. That means when your inputs change, between 80 ns and 100 ns later the output could be "wrong" because the first input hasn't "caught up" yet. If you capture the output of the AND gate with a flip flop, and synchronize your inputs to a clock whose period is no shorter than, say, 110 ns, then the captured output will always be correct.
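The scenario above can be sketched in Verilog using delayed continuous assignments to model the two mismatched paths (the module and signal names are mine, and the delays are simulation-only annotations, not something synthesis honors):

```verilog
`timescale 1ns/1ps
// Sketch of the glitch problem: two paths with different delays feed
// an AND gate, and a flip flop samples the result only at clock edges.
module glitch_demo(
    input  wire clk,      // assume a period of at least 110 ns
    input  wire in1,
    input  wire in2,
    output reg  q
);
    wire slow, fast, raw;

    assign #100 slow = in1;    // 100 ns path
    assign #80  fast = in2;    // 80 ns path
    assign raw = slow & fast;  // can be momentarily wrong between
                               // 80 ns and 100 ns after an input change

    // Sampling at the clock edge ignores the transient glitch.
    always @(posedge clk)
        q <= raw;
endmodule
```

The design choice here is the whole point of synchronous design: you don't try to eliminate the unequal path delays, you just make sure the clock is slow enough that everything has settled before you look.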