AskTog, November, 2003
A D'ohLT is a person who screws up some product or service so badly that, when you attempt to use it, you end up slapping your head and yelling that famous homersimpsonian cry, "D'oh!"
D'ohLT rhymes with "jolt," and is, of course, not to be confused with any similar sounding, but possibly actionable word.
Over the next few months, I'm going to be talking about some real D'ohLTs. Some of these people, like the ones I'll be talking about today, have screwed up once that I know of, but they did it spectacularly. Others have screwed up repeatedly.
One entire occupation, which I'll be discussing in September, is heavily populated by D'ohLTs--bright people, but D'ohLTs nonetheless.
Your old Maximum Security article [April, 1999] sounds very familiar to me: I also have some all-purpose passwords. Unfortunately, some (many) services restrict the characters you can type. At least some of them tell you before you try the first time. ("No special characters, numbers, or blanks allowed.") The pity is: all of my passwords are normally letter/number/special-character combinations.
When I want to register for a new service I try to use one of my usual passwords. Then I get: Password not allowed.
I have a very secure password, and yet I am being forced to use a less secure one!
All right, I have to think up a simpler, letters-only password. Done and registered. This time, I can use the service. But the next time I surely won't remember this special password. Then the "Forgotten your password?" cycle starts. That is, if I can remember my account name...
My personal solution to this problem has been to create a database with a record for each site listing the user name and password chosen. I have a shorthand for my usual password, but all the others I'm forced to create are "in the clear," typed in right there for anyone with access to my machine to see.
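The reader's complaint is easy to quantify. As a rough back-of-the-envelope sketch (the alphabet sizes and the eight-character length are my assumptions, not figures from the letter), here is what a letters-only rule costs in entropy:

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy in a password chosen uniformly at random."""
    return length * math.log2(alphabet_size)

# Assumed alphabets: 26 lowercase letters, versus upper/lower case
# letters plus digits plus roughly 32 keyboard symbols.
letters_only = entropy_bits(26, 8)                  # about 37.6 bits
full_charset = entropy_bits(26 + 26 + 10 + 32, 8)   # about 52.4 bits

print(f"letters only: {letters_only:.1f} bits")
print(f"full charset: {full_charset:.1f} bits")
```

Every bit lost doubles an attacker's odds, so forbidding numbers and special characters hands back roughly fifteen bits here--a factor of tens of thousands in guessing work--exactly the "less secure" password the reader was forced to use.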
I've been watching security people for years as they've slowly increased the security of everything they can get their hands on until any idiot can wander in.
That sounds a bit contradictory, but I will soon prove my point. Before getting into the proof, however, I would like to explain that it is not solely the security people's fault. They have all attended one D'ohLT University or another, where their professors have carefully groomed them for their current state of profound D'ohLTism. That's the problem with being D'ohLTed; you are very likely to turn around and D'ohLT someone else.
My wife, the Doctor, was working over the summer at a local hospital. They are fiercely into security, requiring no fewer than four sets of passwords to navigate their system. And why not? There are confidential patient records on those systems! By golly, they ought to have eight sets of passwords, and really make things secure!
So works the mind of a D'ohLTish security engineer, working feverishly away in his cubicle in the basement next to the steam plant.
Take him out for a walk. Let him see the sunshine for the first time in years. Introduce him to some normal human beings. Be gentle at first; these are creatures with whom he has had no contact since being sucked into the depths of the university system.
Then, when his pallor begins to fade and he begins to take on signs of socialization, take him into the offices in the hospital and let him see the four sets of user names and passwords clinging to the monitors on yellow stickies (e.g., Post-it Notes) or, for the more security-minded, slid into the top drawer where no one would think to look.
Only a D'ohLT would come up with a security scheme so overly complex that it's guaranteed people will write down their passwords. And yet, this kind of D'ohLTishness is par for the course with these guys. They are the most clueless profession I know, and they are showing no signs of getting any better.
Of course, there's always room for more retardation of productivity, and, if it can be found, these guys will find it. After the first six weeks, my wife had received only two of the four sets of user names/passwords, and she'd had to speak to no fewer than seven people to get them. Two weeks of further extreme effort finally produced the last two sets.
What was she doing in the meantime? Instead of spending full-time repairing people, which is nominally her job, she wasted hours camping out in another doc's office, using his computer (and passwords--they were right there on the sticky note) to do her work.
Meanwhile, the other doc, bumped from his office, would go and get an extra cup of coffee. The security D'ohLTs had thus not only opened up your medical records to anyone schooled in the use of sticky notes, they were pouring money down the drain in the form of lost productivity and company-supplied coffee.
Fortunately, of course, this problem is self-limiting. Yes, she only worked at full throttle for the final two weeks of her ten-week stint, but when she returns in December to work for another three weeks, her user names and passwords will all be waiting for her.
Except unused user names and passwords expire after 90 days.
Even constant users have to make up (and post on their computer monitors) new passwords every 90 days, even if they keep their user names. Expiring stuff is the only way these guys can prevent the unthinkable: memorization. Once people memorize the little devils, they don't need their cheat sheets anymore, and then, suddenly, there's real security. They can't let that happen!
Hospitals all over the country are freaking out at this moment because of the new security law that suddenly hit them by surprise, with no more than about six years' notice. My wife called down to Emergency a couple of days after the law struck to ask them to fax a few pages from the record of a patient they had just sent up, and they refused. Someone could steal the fax off the machine that sits right out in the hall, with easy patient access.
While these worthies spent years thinking up ways to require four sets of auto-expiring user names and passwords for all the doctors, they failed to set up physical security for either computers or fax machines.
The goal of security is not to build a system that is theoretically securable, but to actually make it secure!
The universities, at least as evidenced by their graduates, are only interested in theory. That needs to change, and change now. The yellow-sticky phenomenon has become so pandemic that it has received attention in both newspapers and business journals. I realize that many of these professors don't get out a lot, but they are at least supposed to read. Turning out graduates at this late date who are making security worse, instead of better, is simply irresponsible.
These Primary D'ohLTs shouldn't shoulder all the responsibility. The Secondary D'ohLTs, in the form of practitioners, are not stupid people. In fact, they are, in my experience, uniformly bright. The evidence of the error of their ways is all around them, gracing the edges of monitors everywhere. They need to take some initiative. They must look outward, to the way things "really work," once people are in the mix.
Excessive security can not only turn your financial and medical information into an open book, it can actually kill you.
Fifteen years ago, the approved method for gaining possession of a vehicle other than your own was to wait for the owner to wander off, then jimmy the door and hammer a screwdriver into the ignition. Bowing to auto-insurance industry pressure, auto makers have removed that option in many high-end cars, which are no longer practical to steal.
This has made the insurance companies very happy, but, unfortunately, it is getting a lot of their clients killed, since high-end cars are no longer being taken when the owners are away, but when the owners are there, car keys in hand.
Car theft only costs the insurance company money. Car jacking could cost you your life.
Even when the auto security D'ohLTs aren't killing us outright, they are raising our blood pressure to dangerous heights. We had a VW Rabbit several years ago which featured a theft-proof radio, rendered useless once it was removed from the vehicle. It could only be made to function again by performing an elaborate and secret ritual, involving pressing a whole bunch of buttons in sequence while holding your right foot with your left hand and crowing to the moon.
Of course, the radio didn't really know it had been stolen. It only knew that it had lost its connection to the car battery. So the first time the battery went dead, we no longer had a radio.
VW had given us a sheet with the magical incantations, but it had clearly warned us not to leave the sheet in the car, the equivalent of leaving passwords on a yellow sticky. Ever compliant, we put the sheet in "a safe place," where it probably rests today. (I can't know for sure, since we've never remembered where the safe place is.)
After waiting several weeks for VW to confirm our identity through DNA analysis, we received a copy of the magic sheet. This second copy remained in plain sight in the glove box as long as we owned the car.
Lately, the auto makers have been kowtowing to the insurance companies once again, by adding special lug nuts to each wheel, keyed to a special socket that must be used to remove the wheel.
Unfortunately, the special lug nut has only about 2% or 3% of the surface in contact with the tool, compared to a standard lug nut. If the wheel was overtightened at the factory, as happened with our Lexus RX-300, the custom part of the lug nut will crack right off when you attempt to change a tire on a dark road late at night, as happened to us, rendering removal of the wheel impossible.
Both our Lexus and new VW now have standard lug nuts on all wheels and to heck with the auto insurance company. We want to keep our life insurance company happy!
A security design must be comprehensive, covering every aspect, every detail of the user experience. However, even the most perfect design can't cover every eventuality. You must also test thoroughly and actively solicit user feedback to catch holes in the security net of which you are not even aware.
A fellow wrote me not too long ago about his experience with an encryption application. He’d been doing a little work for the gummint (that’s "politician-talk" for "government") and had to keep all his output encrypted, so he was using a high-security encryption program for the Mac.
He noticed that every so often the program would "eat" a document, instead of encrypting it, removing every trace of the document from his system. After his initial shock, he quickly developed a work-around: He'd just drag a copy onto the Tresor icon, instead of the original. If it were eaten, he'd try it again until it worked. Once encryption was successful, he’d drag the unencrypted original and copy onto his software shredder.
This worked like a charm until one day when he attempted to launch Tresor by clicking on it. To his surprise, instead of launching the application, the icon "opened up" into a window. That's because what he thought was an application wasn't, exactly. Instead, it was an Apple "package," a clever object that looks like a folder to the developer, yet looks and acts like an application to the user.
A package allows the developer to have what appears to be a single application that might contain, for example, an OS 9 application, along with its OS X counterpart, and any supporting files associated with the applications.
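Nothing about a package is magic at the file-system level; it is an ordinary directory whose layout the Finder recognizes and presents as a single icon. A minimal sketch (the bundle name "Tresor.app" and its contents are invented for illustration):

```python
import os

# Build a skeleton that mimics a Mac OS X application package
# ("Tresor.app" here is illustrative, not the product's real bundle name).
os.makedirs("Tresor.app/Contents/MacOS", exist_ok=True)
open("Tresor.app/Contents/MacOS/Tresor", "w").close()

# The Finder would show Tresor.app as one double-clickable icon,
# but the file system sees a plain directory tree:
print(os.listdir("Tresor.app/Contents"))  # ['MacOS']
```

Which is exactly why something that looks like a single document or application can suddenly "open up" into a window full of files: the folder was there all along.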
This is an idea I first proposed back at Apple in 1987 for, I think, System 6. The version I conceived, however, was intended to be bug-free. The released version is not, because, while the package acts like an application most of the time, once in a while, with no apparent pattern or visible feedback, it acts like a folder.
When my correspondent looked inside the suddenly-revealed folder, guess what he saw? All the missing, unencrypted, secret gummint documents, ripe for the taking.
Before Tresor, he would lock his hard disk up. With Tresor, he felt it was OK to leave the hard disk around. His security was actually reduced.
Unless you take a comprehensive approach to security, both at the human level and at the system level, you are likely not only to fail to increase the user's security, you may actually succeed in decreasing it. In this case, the bug was Apple's, not Tresor's, but the Tresor folk had failed to "close the loop" by actively soliciting feedback. That one error seriously compromised their otherwise excellent product.
And speaking of security, don't you just love those websites that ask you to enter your requested password, all done in 128-bit encryption mode, with the characters blanked out so you can't see what you're typing, only to parrot it back to you in an email that can be read by any 12-year-old with a Radio Shack computer? Of course, it's the same password you use for every single site short of your bank, so now everyone has full access to your computer and your life.
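A site that can email your password back to you is, by definition, storing it in recoverable form. The standard remedy is to store only a salted, slow hash, so there is nothing to parrot back. A minimal sketch using Python's standard library (a real deployment would reach for a dedicated password-hashing library rather than rolling its own):

```python
import hashlib
import hmac
import os

def store_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); only these, never the password, get stored."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the digest from the attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = store_password("correct horse battery")
print(check_password("correct horse battery", salt, digest))  # True
print(check_password("wrong guess", salt, digest))            # False
```

With this arrangement, the worst a "Forgotten your password?" link can do is issue a reset; it cannot mail your universal password to every 12-year-old on the wire.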
If you are a security expert, unless you are addressing, testing for, and actively soliciting feedback about every eventuality, you are not doing your job.
If you teach security, unless you are teaching a holistic, comprehensive, and practical approach to this vital effort, you are doing your students and your country a disservice.
If you are a security expert who is competent, you need to work to change your profession. It is in deep trouble, and your colleagues are dragging you down.
If you are a designer who must work with a D'ohLT, don't despair. Treat him or her as mildly retarded, in need of help, not criticism, and you will get along fine. Take it on yourself to form a comprehensive security plan. Ensure that user, field, and quality-assurance testing, along with user feedback, will thoroughly and comprehensively prove out the security design.
If you are a designer who gets to work with a competent security professional, thank your lucky stars. I've had the pleasure of working with more than a few, and it is a sheer joy.
Entry of a wrong digit--that is, a digit not part of the correct code--resulted in the red LED illuminating and the lock resetting. Entry of a correct digit resulted in a green LED, whether it was in the correct sequence or not.
With that kind of feedback, once one had determined which numerals were not part of the code, it was only a matter of trying the 16 remaining possible combinations to gain entry. It was rather amusing to watch newly arrived visiting engineers try, and succeed, in getting the lock to open before the personnel inside could answer the door buzzer.
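The reader's arithmetic checks out: per-digit feedback turns the lock into an oracle. A figure of 16 remaining combinations is consistent with a four-press code built from just two distinct digits, since 2^4 = 16. A sketch of the attack (the code value and code length are invented for illustration):

```python
from itertools import product

def crack(code: str, digits: str = "0123456789") -> int:
    """Count the guesses needed once the LEDs have leaked digit membership."""
    # Phase 1: press each digit once; a green LED means it appears in the code.
    in_code = [d for d in digits if d in code]

    # Phase 2: brute-force only the sequences built from the green digits.
    for attempts, guess in enumerate(product(in_code, repeat=len(code)), 1):
        if "".join(guess) == code:
            return attempts
    raise AssertionError("unreachable: the code is always in the candidate set")

# A hypothetical code using two distinct digits, as the "16" in the story implies:
print(crack("7337"))  # at most 2**4 = 16 guesses, versus 10**4 blind tries
```

Without the leak, a four-digit code means up to 10,000 tries; with it, the search collapses to a few seconds of button pushing, which is exactly what the visiting engineers demonstrated.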
Have a comment about this article? Send a message to Tog.
Copyright Bruce Tognazzini. All Rights Reserved