[Novalug] Algorithms As Art

Bryan J Smith b.j.smith@ieee.org
Wed May 27 15:13:22 EDT 2015


On Wed, May 27, 2015 at 12:29 PM, John Franklin via Novalug
<novalug@firemountain.net> wrote:
> In college, the Computer Science program was in the college of Arts and Sciences,
> not Engineering.

My Alma Mater was like that until the mid '90s as well.  Computer
Engineering and Computer Science were completely different, the latter
requiring far less calculus and physics, and not focusing on
materials, semiconductors, analog, layout and those realities.  I
know several universities were like this.

But in the case of my Alma Mater, it wasn't so much a college having
CS first, and adding the Engineering and EE departments later.  The
school started out as an engineering college, and offered Computer and
Environmental Engineering from day 1 ('67), programs that were rare at
the time.  The Space Grant had a lot to do with this, including
several of those programs being among the first ABET accredited,
thanks to industry partnerships on several leading edge technologies
being developed in the area -- from NASA to Martin-Marietta.

One could say MIT's innovative guidance research, and the digital
computers NASA adopted -- especially given those very resource-limited
computers ran an operating environment that prioritized time
sharing -- pushed any new engineering school to seriously consider the
fact that "math-driven" logic concepts were leaving a lot out, and
that completely new, EE-centric ideas were needed.  I don't think
people realized how advanced, let alone how high risk but rewarding,
this was until the Apollo 11 landing "issues."

In hindsight, I'm lucky to have been exposed to non-clocked,
non-boolean semiconductor design and layout early in my career, via my
Alma Mater's industry relationships -- almost a decade before
clockless units became a required part of any microprocessor design.
Most people still wrongly assume a clock is how processors are
self-timed internally, but that's literally not the case any more;
the physics and analog realities have made that impossible for a long
time.

> We used to joke it was because programming was an art, and a black one at that.

I'll never forget being in the library while a CS major was explaining
digital logic to me.  I just stayed humble -- rare for me when I was
so young; I was a brash, arrogant SoB until some humbling events in my
late '20s, and still have to watch it today -- and listened for 10
minutes while in line.

When he asked what my major was, I said with a straight face, "Computer
Engineering."  He said, "oh, you understand digital logic then."  And
I merely replied, "I do, and I also understand how it doesn't scale,
and how layout becomes very different in today's pipelined,
superscalar microprocessors."

I was lucky to have an instructor from industry whom my Alma Mater,
wisely so, let go completely off-materials to teach us both Verilog
and VHDL for synthesis, along with Cadence and other layout tools.

It's kinda funny to look back on it nowadays: because I nearly
completed a CS minor (only 1 class short), I got 3 different
treatments of finite state machines (the first is sketched in C after
the list).
- Discrete Methods (Math-like -- CS minor)
- Linear Systems (Analog electronics -- EE core)
- Digital Signal Processing (Analog-to-Digital Processing -- EE/CpE specialty)
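
For the curious, here's a minimal, hypothetical sketch (mine, not
coursework) of the "Discrete Methods" view in C: the machine is
nothing but a transition table indexed by state and input.  The Linear
Systems and DSP treatments would instead model the same behavior as
difference equations over sampled signals.

    /* Table-driven FSM: a 2-state machine tracking the parity of
     * 1-bits seen so far.  States and inputs are just indices. */
    #include <stdio.h>

    enum state { EVEN, ODD, NUM_STATES };
    enum input { BIT0, BIT1, NUM_INPUTS };

    /* next[current_state][input] */
    static const enum state next[NUM_STATES][NUM_INPUTS] = {
        [EVEN] = { [BIT0] = EVEN, [BIT1] = ODD  },
        [ODD]  = { [BIT0] = ODD,  [BIT1] = EVEN },
    };

    int main(void)
    {
        const enum input bits[] = { BIT1, BIT0, BIT1, BIT1 };
        enum state s = EVEN;

        for (size_t i = 0; i < sizeof bits / sizeof bits[0]; i++)
            s = next[s][bits[i]];    /* one table lookup per step */

        printf("parity of 1-bits: %s\n", s == ODD ? "odd" : "even");
        return 0;
    }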

In the same regard, I'm glad I went through most of the CS minor,
because I got far more "programming methods" than I ever would have in
my CpE program.  We only took a pair of analytics courses, plus one
"Data Structures" class (with a basically OO-centric lab, C++ at the
time).  They didn't care what languages you used, although you had to
do 1-2 programs in Fortran, which was basically mandated by my Alma
Mater.  Why?  Again, industry partnerships ... there was just a
crapload of legacy Fortran code in the area.

I was glad I learned Fortran, because at my first salaried job, I was
working with Rocket Scientists who still used it, and I had to take
their work and write autopilot and guidance in C.  I also quickly
learned how much inherently easier it is, using the basic syntax of
Fortran 90+, to do common 6DOF (+ time) work in a 7 dimension array
than in C with various libraries.  It was from then on that I realized
there is no "best" programming language.  What matters is that you
know the methods to use languages, and their various libraries,
effectively.
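
To illustrate the point (a hypothetical sketch, not the actual flight
code, with the six DOF axes collapsed into one index for brevity):
Fortran 90+ expresses the whole-array update in a single statement,
while plain C makes you write the loops -- or find a library --
yourself.

    /* Explicit Euler integration of a 6DOF state over time in C.
     * In Fortran 90+ the inner update collapses to one whole-array
     * statement:  state(:, t+1) = state(:, t) + dt * rates(:, t)  */
    #define NSTEPS 1000
    #define NDOF   6

    static double state[NSTEPS][NDOF];  /* one value per DOF per step      */
    static double rates[NSTEPS][NDOF];  /* one derivative per DOF per step */

    void integrate(double dt)
    {
        /* C has no whole-array arithmetic, so the loops are ours. */
        for (int t = 0; t + 1 < NSTEPS; t++)
            for (int d = 0; d < NDOF; d++)
                state[t + 1][d] = state[t][d] + dt * rates[t][d];
    }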

<pet peeve=on>
Which is why I cringe every time a CS major says one should learn
assembler (2G) to optimize code.  In reality, just learning computer
organization with C (3G), and the most efficient libraries for an
architecture and platform, is far, far more useful in my view.  Given
how pipelined and superscalar processors are today, with their
RISC-like internal microcode and clockless circuits, you're _rarely_
going to outsmart the engineers who wrote the optimizing compilers and
know the internals -- at least not without studying it for 2+ years,
and then only for a single implementation that is not transferable.

At most, one should read the optimization sections of the programming
manual for a specific processor implementation, and in-line the
assembler instructions published in that manual into C for a specific,
optimized library.  But because most libraries already do this (using
different instructions for the extensions of specific implementations,
in a very modular way), it's more important to know which libraries
have these optimizations, and when to use them.  A sketch of that
dispatch pattern follows.
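
Here's a minimal, hypothetical sketch of that pattern in C, using
GCC/Clang's x86 builtins (the function names are mine; real libraries
do this with far more machinery): probe the CPU once, then dispatch
through a function pointer to the routine built for the best available
extension.

    #include <stddef.h>

    static void add_scalar(float *d, const float *s, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            d[i] += s[i];
    }

    __attribute__((target("avx2")))
    static void add_avx2(float *d, const float *s, size_t n)
    {
        /* Same C loop; the target attribute lets the compiler
         * auto-vectorize this one function with AVX2 even if the
         * rest of the file is built for baseline x86-64. */
        for (size_t i = 0; i < n; i++)
            d[i] += s[i];
    }

    static void (*add_best)(float *, const float *, size_t) = add_scalar;

    void vec_add_init(void)
    {
        __builtin_cpu_init();
        if (__builtin_cpu_supports("avx2"))
            add_best = add_avx2;
    }

    void vec_add(float *d, const float *s, size_t n)
    {
        add_best(d, s, n);   /* callers never see which version ran */
    }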

Which is why I always correct anyone who recommends a 4G coder learn
assembler (2G).  It would do them a lot more good to just use C (3G)
to learn basic computer organization, and then follow that up by
focusing on which optimized libraries to use for their applications,
than to attempt to go lower.  Even the instruction set really isn't
the instruction set any more -- it's decoded into that internal
microcode -- and designs are way too complex below the instruction set
to understand how to optimize an implementation, especially since you
have to learn each one.  Instead, learn how to leverage what others
have already done for each architecture, in a generic language that
loads the modular library optimized for it.
</pet peeve>

-- bjs

-- 
Bryan J Smith - http://www.linkedin.com/in/bjsmith
