Manuals: Simplicity a Virtue?
CDs and Tube Amplifiers Revisited
Web Page Naming Conventions
Standards: The High Cost of Low Cooperation
Log In vs. Low Gin
Simonides as UI Researcher
Microsoft's "Office Binder"
Loved your article on manual writing, but I think I profoundly disagree with it.
Should a great manual be a work of literature you can curl up in bed with,
or the easiest possible reference guide? Yes, there is a role for the former,
but for a software manual people want simplicity, ease-of-lookup,
step-by-step, and not to have to wade through long texts, however well written.
How do I know? I regularly run manual-writing workshops, which start with
people sitting in front of piles of manuals and trying to find out how to
do something. What comes out at the top of the want list are:
- Easy step-by-step instructions
- Great Table of Contents
- Great Index
- As little blurb as possible
- Graphics and screen dumps
What do you use a manual for? Something like looking up how to do styles in Word. For this, I want the simplest of explanations and then step-by-step how to do it. It may not be great literature but it is simple, practical and effective.
I had started the article Henry mentioned with a tribute to a beautifully written manual by David Pogue. But I have no disagreement with Henry's view. David's manual was a bit unusual in that the vast majority of users of the manual would first approach the product having insufficient domain knowledge. (Worse, many of those with the least knowledge would assume they had the most, a dangerous situation.)
In many cases, a combination tutorial/instruction manual is not called for. When many users already have domain knowledge, by all means give 'em what they need in the simplest, most straightforward way possible.
Back in the early days at Apple, we had a writer compose a perfectly lovely 400+ page manual for a reasonably simple piece of software. All our entreaties that it was just too much fell on deaf ears, right up until user testing. We brought in a couple of users to try out the software. After four hours, each was still poring over the manual, having yet to turn on the computer. The writer left without saying a word. A week later, we received a forty-page manual in the mail. It contained all that was necessary.
Lest you suddenly err on the side of sparseness, however, consider the case of OnStream. These are the folks that sell 13 gigabyte 8-track tape storage devices labeled "30 gig." (Apparently, someone, somewhere can get the equivalent of 30 gigs on the 13 gig tape as long as they have remarkably compressible material and a strong tailwind. Kind of like claiming a sub-sub-compact car is "roomy enough for six" as long as the six are all under two feet tall and weigh less than 22 pounds each.)
OnStream just released their 13 gig drive labeled 30 gig for the Mac, along with a very handy "quick start" guide. Unfortunately, it only gets you to the point where both the computer and drive are connected to each other and turned on. Then, you are referred to a 267 page manual written in a language I'm sure is understood by hardware engineers. Of course, the manual is contained in one of those hateful PDF files that you can either view with normal text and distorted illustrations or kindergarten text (48 point type) with readable illustrations.
Capping the whole thing off, the manual, written by the software OEM, was written to cover every conceivable storage device other than 8 track tapes that pretend to hold 30 gigs but don't. It was a thoroughly unpleasant experience, and equally thoroughly unnecessary.
If OnStream just once had sat two normal Mac users down in a room with an unopened OnStream box and a Mac computer and told them to back up the Mac disk, they would have immediately discovered what their tech support people are certainly discovering now: Their documentation is simultaneously inadequate and excessively voluminous.
By trying to cut corners, they decided to foist off all operating instructions onto their generic software partner, supplying only a hardware-hookup manual. Said partner, also trying to cut corners, produces one giant PDF manual for all their OEM customers, instead of custom assembling (a feature of some of the more powerful publishing systems, such as Interleaf) a manual that has some vague relationship to the hardware people have purchased.
The trick in manuals is to avoid being either too terse or too wordy. Find the sweet spot. And you can make that sweet spot a lot bigger by supplying quick reference cards, reference manuals, and tutorials separately, so people can attack the literature that they find most informative. And consider using paper. PDF files save a lot of time for everyone except the poor schlub who has to wade through them.
CDs and Tube Amplifiers Revisited
Just a few points to clear up some of what you said about CDs and tube amplifiers in A Century of Scams (December, 1999).
First, tubes: It is well known that they distort more than transistors. The "advantage" that tubes have is that they degrade gracefully. When you crank up a transistor amplifier, its input-output relationship is nearly linear up to a limit, then it simply clips the output at that limit. This introduces clearly audible harmonics that come through as unpleasant distortion. Solution: don't play it too loud. Vacuum tubes, on the other hand, do not clip; instead, they have a nonlinear input-output relationship throughout. This results in more overall distortion at normal operating levels, but it is never of a type considered unpleasant. No one in their right mind would spend big bucks on a vacuum tube amp when they can get better reproduction from a cheap solid-state amp by just cooling it with the volume knob.
Second, CDs. The sampling argument against CDs is spurious. If people cannot hear frequencies above 20 kHz (and you can't) then CDs only have to sample at 40 kHz to contain all the info up to 20 kHz. The only thing to look out for is that you might get aliasing of even higher frequencies, so before a CD is sampled, a filter removes all frequencies higher than the 20 kHz limit. This can (in theory) yield physically perfect reconstruction of the sound up to 20 kHz.
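The sampling and aliasing argument above can be checked in a few lines. This is a minimal illustration using only Python's standard library; the 40 kHz rate is the theoretical minimum discussed above (real CDs sample at 44.1 kHz), and the 25 kHz tone is just an example of ultrasonic content that must be filtered out before sampling:

```python
import math

fs = 40_000     # sampling rate: twice the 20 kHz audibility limit
n = range(400)  # 10 ms worth of sample instants

def sampled(freq_hz):
    """Values a freq_hz sine takes at the discrete sample instants k/fs."""
    return [math.sin(2 * math.pi * freq_hz * k / fs) for k in n]

# A 25 kHz tone (above the fs/2 = 20 kHz Nyquist limit) produces, sample
# for sample, the same magnitudes as a 15 kHz tone (fs - 25 kHz): it
# "aliases" down into the audible band. This is why an anti-alias filter
# removes everything above 20 kHz before the signal is sampled.
above = sampled(25_000)
alias = sampled(15_000)
identical = all(abs(a + b) < 1e-9 for a, b in zip(above, alias))
print(identical)  # True: indistinguishable once sampled (up to a sign flip)
```

Once the ultrasonic content is removed, the 40,000 samples per second really do capture everything below 20 kHz, which is the sampling theorem's guarantee.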
The problems some people have with CDs occur when they replace their records. A vinyl recording is biased to counteract the biases in the vinyl playback procedure (the needle, being a physical object, cannot follow high frequencies as well as low, so the high frequencies are amplified before pressing to compensate). If the original masters are not remastered for CD, you get the biases that were put there for vinyl. Unfortunately, the CD playback system doesn't have the vinyl system's biases, so an unremastered vinyl recording on CD sounds tinny - too much treble. Sadly, audiophiles know squat about auditory perception and Shannon's sampling theorem, so we get these misconceptions about CDs and tubes repeated again and again, and these misconceptions influence what people believe they hear when they play their systems (especially audiophiles, who are both near-deaf and extremely gullible).
I will grant you that tubes become nonlinear before they clip, but they do clip. I've seen it on a 'scope. Usually, however, even a sharp clip gets softened by interstage capacitors, output transformers, or whatever else might be in the way. (Solid state amplifiers, with their low working voltages, can avoid most of these encumbrances.)
As for conventional audio CDs, there are theoretically already enough samples there to do the job. I have yet to see anyone, in a properly controlled study, prove otherwise. There are, however, two areas of improvement for digital recording. First, CDs do not have sufficient dynamic range (16 bits) to properly reproduce the quietest passages of a symphony. Second, they cannot handle 5.1 or 6.1 audio, or what we used to call "quadraphonic" in the early days.
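To put a number on that 16-bit complaint: each bit of linear PCM buys roughly 6 dB of dynamic range, so CD's 16 bits top out near 96 dB between the loudest peak and the quantization floor. A quick back-of-the-envelope check (the 24-bit line is a common studio word length added here for comparison, not something from the text above):

```python
import math

def dynamic_range_db(bits):
    """Theoretical dynamic range of linear PCM with the given word length:
    20 * log10(2**bits), i.e. about 6.02 dB per bit."""
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))  # 96.3  (CD audio)
print(round(dynamic_range_db(24), 1))  # 144.5 (a common studio word length)
```

A full orchestra swinging from near-silence to fortissimo can press right up against that 96 dB ceiling, which is why quiet passages on CD sit uncomfortably close to the quantization floor.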
Surround sound, properly mixed, leaves stereo in the dust. This is not surprising since stereo is only one-dimensional sound. And just so you don't have to write and say, "Hey, what are you talking about? Stereo is 3D sound!", I'll explain.
Monaural sound, like from a standard radio, comes from a single point, the single speaker in the radio. That is not one dimensional sound; that is zero dimensional sound. A single point has no dimensions.
Stereo sound has two sound sources, instead of one. In effect, your stereo speakers represent two points in space. (Because you likely have two or three speakers inside your enclosure, you sort of have multiple points, but for any given frequency range there is only one, and let's keep this thing simple.)
Two points allow you to represent a line, a one-dimensional space.
Going to surround sound, whether with four, five, or six speakers, as currently spec'ed, allows for two dimensional sound, with both width and depth. What continues to be missing from all the proposed standards is height.
The height problem can be addressed in several different ways. One simplistic way is to double up the speakers, with one set high and one set low, enabling us to drive a single point of sound around the room, placing it at will at any point in three-space. An old proposal, floated when quadraphonic first came around 30 years ago, was to have a single speaker directly overhead. This was rejected for fear the wives of America would move each and every one of their husbands out to the garage. ("You're putting that ugly thing where???")
Tomlinson Holman, who developed the THX standard and coined the term "5.1 channel," referring to five surround speakers plus a subwoofer, has proposed an ultimate goal of 10.2 sound, with an array of seven speakers covering the front of the room out to 60 degrees, plus three speakers similarly filling in the rear area. The point two in 10.2 refers to two subwoofers, one on each side at 180 degrees.
In this scheme, the two speakers in the front set at 45 degrees would be elevated, supplying necessary height information, giving us the third dimension, at least within a reasonable space, without requiring that we run wires across the ceiling. Holman further proposes that the 10.2 standard gracefully degrade, so you get the best possible sound whether you are using 10.2, 5.1, or a table radio with a single speaker. Sign me up!
Web Page Naming Conventions
After visiting your site I have to agree with all the ideas on interface design that I've read from your pages. Very good read. Congratulations.
However, when I tried to bookmark your page, I see on my bookmark "Ask Tog Home Page: March, 2000" - which is very irritating IMHO. Shouldn't the title of a web page be made "date independent" so that users don't have to rename their bookmarks after adding?
Just want to see how you think about this. I believe your opinion will be interesting.
Mea Culpa, Michael. I would like to claim that this was another of my little tests to see if y'all were paying attention, but that excuse has worn thin. In this case, unlike most months where I have multiple articles on the front page, I only had the one, so I just displayed the article "in place," rather than having users link to it.
Such a scheme is actually much worse than Michael suggested. I could have replaced my usual date title with a title properly reflecting the article, but consider what would have happened had I done so: A month later, perhaps, Michael returns by using his bookmark. Does he find the article? No! Why? Because, regardless of the name of the page, the URL it points to will be www.asktog.com/index.html. That's the URL of the home page every month, but the article Michael so carefully bookmarked will no longer be there!
That's easily fixed! Just have index.html whisk people immediately to articleName.html and no one will be the wiser. People who want to revisit the article will bookmark the page and automatically be returned to it.
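That "whisk people immediately" scheme amounts to answering requests for index.html with an HTTP redirect, so the browser lands on (and bookmarks) the article's own, permanent URL. A minimal sketch of the routing logic; the article filename here is a hypothetical example, not anything from asktog.com:

```python
def route(path, current_article="/aprilColumn.html"):
    """Return an (HTTP status, location) pair for a requested path.
    current_article is a hypothetical example filename."""
    if path in ("/", "/index.html"):
        # Redirect the front page to this month's article, so the reader
        # bookmarks the article's permanent URL rather than index.html.
        return 302, current_article
    # Any other page (including last month's article) is served as-is.
    return 200, path

print(route("/index.html"))         # (302, '/aprilColumn.html')
print(route("/lastMonth.html"))     # (200, '/lastMonth.html')
```

Next month the site owner only changes `current_article`, and old article bookmarks keep working.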
Ooops! What if someone bookmarked the page not because they wanted to come back to the article, but because they wanted an easy link to the home page. Next month, they'll be in for quite a surprise.
It's probably best to just lead into articles on the first page of a periodical, with links to individual articles, properly titled.
Standards: The High Cost of Low Cooperation
I bought a Sony Cybershot digital camera in October of 1999. It came with some relatively nice software for digital photo management. So, I jumped right in. Shot 1,000 pictures in six weeks' time and dutifully added keywords and captions each day to what I had shot. I soon found the limits of the software and began my search for other programs that have similar, but more advanced, functions.
After I installed, used, and threw away Adobe's ActiveShare, I had an awful realization. The metadata I was adding in Sony's software was proprietary and was not being stored in any standard form. In fact, none of these programs for so-called digital photo management store metadata that can be universally accessed. Then I had a realization that was worse: there is no standard. There might be a movement for this standard and a movement for that one, but consumers are left in the middle, many unaware. I guess this is nothing new in the computing world; just look how hard address book transfers have been in the past.
It's spooky, especially when you consider this:
By the time our 6-year-old is 30, my age, I conservatively estimate that he will have:
- Written between 750,000 and 1 million emails
- Taken between 20,000 and 30,000 digital photos
- Filmed about 1,000 hours of digital video
I'm not advocating that it's necessary to store every one of these items. But have companies like Adobe, Sony, and Apple thought about digital asset management on the individual level? Sure, if it points to their pocketbooks.
So I askTog - you must have friends in high places - is there hope? Or have I made a commitment to Sony's PictureGear 3.2 for life?
Macintosh user for 12 years by choice, PC for 1 year by necessity
PS The 2-button mouse rules!
There is little hope. The best you can do is to try to store things in the most generic forms possible and, even then, you may lose them all. I still have several 8 mm tapes I made back in the 1960s. A few I transferred to VHS video at the time. Those, of course, look horrible. I am waiting for a stable, standard digital format before I go through the exercise again.
Computers are worse. I first got a royal screwing by those nice people at Intuit who never bothered to write an Apple II to Macintosh file translator. I've got three years of data, much detailing tax-deductible improvements to my current house, that have simply become unreadable. Microsoft has, I am given to understand, stopped recognizing early Word files in the newer copies of Word. Apple has stopped reading the original Mac disks.
The sad thing about many of these changes is that there is little or no notice, so several years may go by before you realize that the material you are so carefully protecting may have become, in effect, unreadable.
I now refuse to become a foot soldier in any more format wars. If these guys want to fight out DVD-audio and DVD read/write formats, let them. I'll just sit on the sidelines until a true standard comes along.
You would think that they might have learned something from the success of LPs, Compact Cassettes, CDs and DVDs: Standardized formats sell.
Read/Write DVDs will be a particularly ironic war, as close to 10 "standards" prepare to compete: It was the computer industry that forced the entertainment world to standardize on one single DVD-Video format. Now, the computer industry is prepared to cause utter chaos. I'll be sitting on the sidelines watching this one. They can have my money when they learn to play together.
Log In vs. Low Gin
"log in" is the correct term when asking someone to enter userID and
Hassan Abdel-Rahman wrote:
I'm trying to identify which term is correct: "login" or "log on". I have to say, I cannot find many applications that use the term "log on"...
If you want to log on to a site, you must first log in. The dialog requesting you to do so should say, "log in," not "login," which is properly pronounced "low gin." Yes, I am well aware that login ("low gin") is used frequently; that does not make it right.
Simonides as UI Researcher
...Those who have studied memory know that there is a stunning application to computer interfaces that will forever make the concept of the 'lost file' archaic.
Around 500 BC the Greek poet Simonides attended a dinner for a rich patron. After reciting a poem comparing his host to one of the gods, Simonides became upset when only half the promised fee was paid. "Get the other half from the god," is what he was told, and he left the banquet. (Code writers and other writers for hire, beware: there is nothing new under the sun.)
There was an earthquake. The bodies could not be recognized.
Simonides came to the scene. He was able to identify the bodies by remembering where everyone had sat. And thus came a crucial realization: when information is tied to a physical place, it is easy to remember.
This technique can easily be adapted to computer interfaces today.
And it is applied today, particularly in the Macintosh. How sad that, with the advent of System X, it appears it will no longer be.
Microsoft's "Office Binder"
Your most recent column discusses various ways of bundling associated documents together (through "piles," "filing cabinets," etc.). I'm curious as to what you think of Office Binder, which has been a component of Microsoft Office (on the PC, at least) since Office 95. This in fact addresses many of the issues you raise, is capably implemented and easy to use, but no one ever seems to use it.
The metaphor at play is a bulldog clip. You create an empty "Binder" file and then add files to it (by drag-and-drop or browsing); these files can be of any format that supports OLE (Object Linking and Embedding). When you open the binder, you see a list of files down the left-hand side, containing icons representing the embedded files. Clicking on one of these opens the relevant document in the main part of the window, loading the relevant menus, toolbars, etc. at the same time. This is in fact rather similar to the universal menu bar on a Mac: when editing documents within the Binder, the Word or Excel menus all appear in the Binder window's menu area as required.
In effect, it's a simple host for OLE embedded files. You can embed all associated files for a project into one document that can be e-mailed around, or printed etc., and all the different document types can be edited within the same window. Contents pages, templates, indexing and page numbering etc. can all be applied centrally (which is fabulous when coordinating submissions from different people - just drop the Word doc containing a new submission into the right place in the sequence, and watch the page numbering etc. adjust automatically).
Virtually no one seems to know about, or use, the Binder; use spreads among my colleagues on a word-of-mouth basis. Yet no one, having seen it, rejects it per se. I suspect the problem, as with (perhaps) some of your solutions, is that it's just one more thing to learn how to use; the thing itself is intuitive, once you grasp the metaphor, but I often get the sense that many people have quite enough difficulty with the Alias metaphor. Folders seem to be 'good enough'...
I must confess to having never known about it. I had proposed such a device a couple years earlier as part of the Starfire project. The concept is that you give people an object in which they can view a lot of disparate material, but you don't require them to formalize the relationships between the material. The value is that people can approach their tasks informally, only preparing the formal document toward the end, when the problems and solutions have finally become clear.
Office Binder is probably an orphan because it is so well hidden. While it does appear under the "new" menu in the desktop properties menu, it is sandwiched in among about 47 different document types. It is not a document, it is a container. It belongs clustered with folders, shortcuts and any other new containers.
It could also use a little publicity. Because Office Binder is part of Office, instead of part of the OS, where it belongs, it has become lost in the shuffle.
It is true that people are not assimilating much continuing knowledge when it comes to objects and behaviors, but that can be changed. People 20 years ago said Americans could never learn the European style of highway markings. Today, the red circle with the slash through it is ubiquitous. If you look at how much GUIs have grown and expanded during this same time, it is obvious users can learn, too. We just sometimes have to help them along.
Contact Us: Bruce Tognazzini
Copyright Bruce Tognazzini. All Rights Reserved