On December 29, 1972, Eastern Air Lines Flight 401 crashed into the Florida Everglades, killing 101 people. The crash occurred because the entire flight crew became preoccupied with a burnt-out nose gear indicator light. Meanwhile, they failed to notice that the autopilot had been inadvertently switched from Altitude Hold to Control Wheel Steering mode. In that mode, once the pilot releases pressure on the yoke, the autopilot holds whatever pitch attitude was last selected until the yoke is moved again. Investigators believe the mode was switched accidentally and that an ever-so-slight forward pressure was then applied to the yoke, putting the aircraft into a slow descent.
What is the parallel with Information Security? Information Security, with its sundry standards and glut of gizmos, has been on a nearly imperceptible descent for years…while those involved, and those who should be concerned, are focused on the indicator light.
I have been in this industry for many years; I cut my teeth installing networks. At the time I was the young techno punk who thought his technology was so very “kewl.” I still remember the day an “old-timer” mainframe person asked me how I did turnover. I had never heard the word applied to computers, so I pointed to the floppy disk drive on the server I was installing and said, with my young-kid smirk, “You pull that out, turn it over, and you get twice the space.” I guess I am now the old-timer network person…but I understand that question much better now.
In the early days of personal computers and the DOS operating system, you did not need a password. Indeed, you could not even add one if you wanted to. The first Windows operating system started by typing WIN, pressing [Enter]…and that was all one needed.
Then people stumbled onto the idea that a password would be innovative, so one was added to the humble Windows 95, well before the Windows NT (New Technology) line reached most desktops. It was hardly any kind of security at all; like so many security ideas that were afterthoughts, it could be bypassed by simply booting to DOS.
LAN Manager added the LM hash, an algorithm for hashing passwords so weak that it could be cracked in short order. Windows NT did support some decent security, but as with so many things, what caught on like wildfire was the “easy” one.
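Just how weak is worth spelling out. The LM scheme uppercased the password, padded or truncated it to 14 characters, and then hashed each 7-character half independently, so an attacker only ever faced two small, separate searches. The rough arithmetic below sketches that reduction; the character-set sizes are approximations chosen for illustration, not figures from any audit.

```python
# Rough, illustrative arithmetic on why the LM hash was so weak.
# Assumptions: ~95 printable ASCII characters, of which ~69 remain
# once lowercase letters are folded to uppercase.

FULL_CHARSET = 95      # printable ASCII a stronger scheme could draw from
LM_CHARSET = 69        # LM uppercases everything before hashing
MAX_LEN = 14           # LM pads/truncates passwords to 14 characters
HALF_LEN = 7           # ...then splits them into two independent 7-character halves

# Search space if all 14 case-sensitive characters had to be attacked as one unit
ideal_space = FULL_CHARSET ** MAX_LEN

# Search space an attacker actually faces: two separate 7-character halves
lm_space = 2 * (LM_CHARSET ** HALF_LEN)

print(f"14 chars, case-sensitive, one unit : {ideal_space:.2e} candidates")
print(f"LM hash, two 7-char halves         : {lm_space:.2e} candidates")
print(f"Reduction factor                   : {ideal_space / lm_space:.1e}x")
```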
The slow descent of the Information Security aircraft began here — with the personal computer. It was rare to hear about someone “cracking” into a mainframe, and even less common to hear about a mainframe virus spreading.
There is a reason for that.
Mainframes were built with a business mindset; they cost so much that they were treated like a major piece of the business. Therefore, that question about turnover, the process of moving application code from a test environment into production, included segregation of duties. The design let a developer write code but required an operator to submit it. Operators could not write code; they could only take code written by a developer and move it into production: an enforced segregation of duties built directly into the system.
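In modern terms, that turnover rule is easy to picture. The sketch below is a hypothetical illustration of the check, not any actual mainframe facility; the roles, names, and ticket structure are mine.

```python
# A minimal sketch of turnover with enforced segregation of duties.
# Roles, names, and the ChangeTicket structure are hypothetical; real
# mainframe shops enforced this through the operating environment itself.

from dataclasses import dataclass

@dataclass
class ChangeTicket:
    change_id: str
    author: str        # developer who wrote the code
    submitted_by: str  # person requesting the move to production

ROLES = {
    "alice": "developer",
    "bob": "operator",
}

def promote_to_production(ticket: ChangeTicket) -> None:
    """Move code from test to production only if duties are segregated."""
    if ROLES.get(ticket.submitted_by) != "operator":
        raise PermissionError("Only operators may promote code to production.")
    if ticket.submitted_by == ticket.author:
        raise PermissionError("Authors may not promote their own code.")
    print(f"{ticket.change_id}: promoted to production by {ticket.submitted_by}")

# An operator moving a developer's code: allowed.
promote_to_production(ChangeTicket("CHG-042", author="alice", submitted_by="bob"))

# The developer trying to move her own code: rejected.
try:
    promote_to_production(ChangeTicket("CHG-043", author="alice", submitted_by="alice"))
except PermissionError as err:
    print(f"CHG-043: rejected ({err})")
```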
And it didn’t stop there. To connect to the mainframe, you first had to be connected to a controller, which, as its name implies, controlled access to the main system for both performance and security. If you needed to connect from a remote location, there were dial-back systems. Your phone number had to be on file, matched to your login ID. You would call into the system and enter your login and, yes, a password; the system would then hang up and call you back at the number listed in your account.
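Conceptually, a dial-back login worked something like the sketch below; the directory, credentials, and print statements are placeholders I am using in place of real modem hardware.

```python
# Conceptual sketch of a dial-back login flow. The directory, credentials,
# and "dial"/"hang up" steps are stand-ins for real modem-era equipment.

REGISTERED_NUMBERS = {"jsmith": "555-0142"}   # login ID -> number on file
CREDENTIALS = {"jsmith": "s3cret"}            # illustrative only

def dial_back_login(login: str, password: str, caller_number: str) -> bool:
    # Step 1: check the login and password as usual.
    if CREDENTIALS.get(login) != password:
        return False

    # Step 2: hang up. Whatever line the caller dialed in from is now irrelevant.
    print("Hanging up...")

    # Step 3: call back only at the number registered to this account.
    number_on_file = REGISTERED_NUMBERS.get(login)
    if number_on_file is None:
        return False
    print(f"Dialing {number_on_file} (the number on file, not {caller_number})")
    return True

# A stolen password alone is not enough: the session comes back to the
# registered phone number, not to wherever the attacker is calling from.
dial_back_login("jsmith", "s3cret", caller_number="555-9999")
```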
Mainframes were usually housed in secure locations with restricted access; backups were standard operating procedure, along with sending that media off-site. Job logs for backups and applications were monitored daily, and if the mainframe connected to another system, that connection was well-defined and secured.
Obviously, none of this was very agile.
Furthermore, it was expensive and not at all user-friendly. That is partly because the technology had not evolved to present-day standards, but those business thought processes were all baked in and provided a great deal of assurance.
The personal computer changed all of that structure by putting computing power closer to the end-user. Soon payroll could be running on Bob’s PC over in the corner and no one batted an eye, not even the CIO, who was under pressure to get more of these thingamajigs into the hands of users.
All of a sudden, making a change didn’t take an act of Congress; it took the end-user fiddling around until the “thingamajig” did what was needed… or close enough was good enough, as long as one could do it on one’s own. Was Bob’s PC being backed up? Were changes controlled? Was there any security (other than typing WIN)? No, but who needed all of that overly complicated stuff anyway? Everything was working fine, and it was a lot more agile.
Around that same time, a buzzword began being thrown around about the PC and these changes: paradigm shift. And to be sure, it was; it did indeed put computing power closer to the user. I was “paradigm shifting” networks like a madman, hooking PCs up to Token-Ring, creating shares, and even adding passwords (on occasion).
However, the other paradigm shift was away from security and standard business practices. All eyes were fixated on the nose gear indicator light, while the secure standard business practices began their gentle negative glideslope.
(Stay tuned for Part II: A Glimmer of Hope.)