Sunday, February 3, 2013

Two roads diverged in computing history...


I’m still working on differentiating between an analog and a digital computer. Radar, though called an analog technology, was largely based on electronic equipment. I understand that analog technology also used electricity, but I’m struggling to see where the dividing line between the two technologies falls. Is the distinction based on moving vs. fixed parts, on the method of processing data, or on something else? For example, the ENIAC, being programmed using “switches and plug boards,” seems like a hybrid of digital and analog, since it appears to make use of both. Surprisingly, the ENIAC is referred to on page 67 as “a wartime investment in digital computing…”, which leads me down a rabbit trail of defining concepts. I hope we have a moment in class to touch on this.

It is unsurprising that entrenched interests resisted the emerging digital paradigm. Although analog computing was the status quo, it’s interesting to note that beginning in the 1960s, ARPA’s explorations into AI and computer networking were far more digital in nature. Although these projects represented only a subset of computing, they seem to have held some of the greatest potential for impact and hinted at the direction computing would ultimately follow.

2 comments:

  1. Great question, Matt, and I think we'll have time today to address it. The differences between analog and digital technologies can be explained in several ways: continuous input vs. discrete sampling, for instance. I often tell people to think about an analog vs. a digital clock, i.e. one whose spring- or motor-driven hands sweep continuously through every position, and one that reads out distinct intervals rather than a range of connected states. The power source isn't the real dividing line; what matters is whether the machine represents quantities continuously or in discrete steps.
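
     To make the sampling idea concrete, here's a minimal Python sketch (my own illustration, not anything from the reading; the function name sample_and_quantize and its parameters are invented for the example). It treats sin(2*pi*t) as the "analog" signal and reduces it to a handful of time-stamped, quantized values, which is exactly the move a digital system makes:

         import math

         def sample_and_quantize(f, duration, rate, levels):
             """Sample a continuous function f at `rate` samples per second,
             snapping each sample to one of `levels` evenly spaced values.
             Assumes f ranges over [-1, 1]."""
             step = 2.0 / (levels - 1)   # spacing between discrete levels
             out = []
             for i in range(int(duration * rate)):
                 t = i / rate
                 x = f(t)                # the "analog" value: any real number
                 q = round((x + 1.0) / step) * step - 1.0  # nearest level
                 out.append((t, q))
             return out

         # One second of a 1 Hz sine wave, reduced to 8 samples on 4 levels:
         wave = lambda t: math.sin(2 * math.pi * t)
         for t, q in sample_and_quantize(wave, duration=1.0, rate=8, levels=4):
             print(f"t={t:.3f}s  level={q:+.3f}")

     An analog machine would track the whole continuous curve; the digital version only ever sees the finite list of discrete levels printed here.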

  2. Thanks, Alenda...I understand it better now.
