NUblog archives

June 2000

(2000.12, 11, 10, 09, 08, 07, 05, 04
2001.01, 02)

See also: Special reports on moguls and megalomania; Olympix



2000.06.28

Jumpout photos

We wrote recently – griped, really – about online photography. We pretty much skipped over a discussion of the idée fixe of thumbnails, which, when selected, lead to a new page showing a bigger version. (A bigger version that hasn’t been optimized for onscreen display. Gripe, gripe, gripe.)

We will give Quokka Sports mad props for a feat of JavaScript trickery. An acquaintance of one contenunian, Douglas Robson (passim), wrote a first-person account of scaling Kilimanjaro in the company of his father. Brush your mouse over the photos and they leap to a larger size (and sometimes errantly stay that way like a port wine stain on the page that will not go away: thumbnail, be not proud).

They’re amateur snapshots still, but the interface approach works quite well. Improvement: Mousing over one photo brings up a selectable floating menu of other photos, which, when clicked on, replace the source thumbnail at full size. Sort of like the much-maligned OS X icons.
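
For the curious, the basic trick need not be exotic: swap an image’s source and dimensions on mouseover. A minimal sketch – filenames and sizes invented for illustration, and older browsers like Netscape 4 won’t resize the image in place:

<IMG SRC="kili-thumb.jpg" WIDTH="90" HEIGHT="60" ALT="Photo: the summit of Kilimanjaro"
onMouseOver="this.src='kili-full.jpg'; this.width=360; this.height=240"
onMouseOut="this.src='kili-thumb.jpg'; this.width=90; this.height=60">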


Scrunching eras together?

We explored this issue in coverage of the near-shutdown of APBNews.com. Christopher Byron’s pointed but factually accurate article on layoffs at Salon made some arguments that butt up against our ideals here at contenu.nu.

  1. By firing writers, Salon eroded its only reason for being: Good content.
  2. Since advertising revenue isn’t even remotely approaching cash outlays, the magazine will burn through all its venture-capital cash. (The way of all things online, shurely?!) Near the endtimes, Salon will fire even more editorial staff, further weakening any reason to read it. No readers, no audience, no advertising. It’s a death spiral.

We are not so whip-smart here at contenu.nu that we have a snappy and convincing response for the foregoing. Indeed, there is all sorts of evidence that people resent having to pay for content online, as we know from the case of the music industry.

But will that always be the case?

In Canada, television used to be free – after you bought a set. The parallels are considerable, actually. You buy a computer and net access (which is effectively free in Canada: $29 a month for unlimited access is essentially nothing), and you are then able to enjoy the entire Web.

In the ’80s, cable television came along. You were then given the option of paying for the technical infrastructure to receive free television programming your antenna could not otherwise pick up. Later, specialty channels were devised, funded in part by a levy on your cable bill. And everyone knows about premium channels, which you pay for because you specifically want the content.

Presently we may be trying to scrunch two eras together. We may be working in Internet time when the natural course of media evolution should slow us down. We seem to be in the free-content stage, akin to early broadcast television. (Of course, advertising on broadcast TV actually brought in money.) When content-rich sites like APBNews and Salon bleed, we skip to the wrong conclusion: That there is no sustainable economic model for online content, only services and transactions.

But there was no model for paying for otherwise-free TV signals, either. Cable companies invented it, and eventually nearly everyone signed on, willingly. What no one has figured out yet is a next step in online evolution analogous to the jump from antenna reception to cable TV. Imagine that cable TV had gone over like a lead balloon, with cable operators tanking left and right. Would we have shut down television stations in response?

That, after all, is only one step removed from Christopher Byron’s predictions: Content sites die. Killing off content sites, since they’ll wither on the vine anyway, would at worst be a form of euthanasia.

Are we quite sure of what plug we’re pulling?


(We note, however, that ad-supported content schemes seem to be skipping a step of their own. Media conglomerates envision a day when you could never own a copyrighted work for unlimited future enjoyment; each rerunning would trigger a royalty payment merely because it is technically possible, not because it is just and proper. And cellphone makers are entirely in thrall to the idea that telephone owners should be spammed with ads at every possible intersection merely because it is technically possible rather than just and proper. The entire WAP structure permits and encourages an overdose of advertising. Some form of happy medium is required in both cases.)


2000.06.27

Will "the death of content" please shut up?

Even we are getting tired of discussing the so-called death of content, and we wrote about it weeks ago. We’re all being hasty, every one of us. The patient just checked into the hospital with a sore neck. It is too early to diagnose meningitis.

In Weblogging tradition, we will, however, triage and bloodlet some recent sightings on this increasingly tedious topic. Maybe soon we can pull a sheet over its face.

  1. Jon Katz, the excessively famous media critic, desperately needs to write shorter, and the incompetence of his copy editor (improbably, he claims to actually use one) makes Katz’s Freedom Forum columns an exercise in foreign-language translation. He has quite a valid point, though, about Closed vs. Open media.
    1. Instead of wringing hands over layoffs at Salon, Katz refocuses on the cœlacanth-like habits of traditional journos and the sites that obsess them. There’s next to no interactivity at Salon, just as there isn’t in print newspapers, and the hand-wringers entirely overlook mailing lists, Weblogs, and small- and large-scale "content" sites scattered all around the world. We interpret this as benign ignorance with malign effect. Journos are too old to really get the net, and too swayed by brand names to read what any individual netter writes on his or her site, or to give that writing any credence. Journalists, after all, are Arbiters of Truth™; individual netters are little more than jumped-up conspiracy theorists pounding away on keyboards in trailer parks, at least as far as journos are concerned.
    2. Katz goes so far overboard as to fish Kate Winslet out of the Atlantic in dubbing all these neglected authors "journalists." He drops the word so many times (in that context, ten times) that he ends up sounding like a member of a religious cult who must utter the word "Yahweh" in every third sentence. Katz goes so far as to say "anybody with a computer and a modem can be a journalist and use the open protocols of the Net." Um, no. Anyone can publish. That doesn’t make you a journalist. A writer, yes. A publisher, sure. A contributor, a participant, a blogger, a (content) creator. But journalism requires more acumen than your typical Weblogger or contributor to a mailing list could ever put together. If that’s a value judgement, so be it. The ability to use a Font menu does not make you a designer; reading and writing do not make you a journalist.
    3. (It’s an entire scene in Jesus of Montreal, in fact: "You ought to publish a book." "I’m not much of a writer." "I said publish a book.")
    4. Nonetheless, Katz hasn’t merely hit the nail on the head, he’s hit the right nail on the head. There is a tendency to fret over the content-rich Web sites that most closely resemble print newspapers scanned and uploaded to the Web. That also includes APBNews.com, a glorified Allô Police–style tabloid gussied up for the Internet.
    5. Surely anyone reading a Weblog like this would never make that mistake. Surely anyone writing a Weblog like this would never do likewise.

In a similar vein, Matt Welch provides a delicious You Are There! diary of his brief tenure as a cog in a very expensive wheel at DEN, the broadband-content concern that crashed and burned to no one’s regret a month ago. You won’t believe your eyes, which is to say you will believe every single word, but with ill-restrained distaste and horror. Welch:

Web sites looking to make real money either need to (a) be Yahoo, (b) sell porn, or, best of all, (c) start small and win a damn following. There is a dirty little secret about content companies: popular, scaled-down sites like Suck, CapitolHillBlue, and the Smoking Gun all make money, as in that stuff left over after the bills are paid. Meanwhile, heavily-staffed, venture-backed heavyweights like Salon, TheStreet.com, and APBNews are bleeding money like hemophiliacs.... [T]he funding bubble for money-losing content companies is probably over.

OK. But the problem then becomes making a name for oneself. This NUblog, for example, has been read about 12,000 times in its first month, or 400 times a day. Maybe the right 400-odd people are reading it, but that’s still too small an audience to get bought by Wired.

So here’s one lesson we learn from the net: Anyone can publish. Not everyone who publishes expects to make a fortune. Some of us do it to be heard, or for love. Vile commerce is not our only aspiration. (contenu.nu will, of course, assist you in your pursuit of vile commerce. It’s just that this NUblog is not in that category.)


2000.06.24

Photography: Worth a thousand?

We’re going to borrow a phrase from our friends Just van Rossum and Erik van Blokland, the kooky type designers–cum–programmers at LettError, who complain that software tools unintentionally constrict the imaginations of their users. You may be able to do a hundred things in Photoshop, but that’s still only a hundred things. Toolspace is smaller than Ideaspace.

Now, what do you see on typical Web sites? Graphics and text. What do the graphics look like? A lot of line art and illustrations using the Web-safe palette. A preponderance of effects using stepped screened horizontal and vertical lines, shading, and faux-3D. Rectilinearism as a kind of imposed artistic religion.

In a nutshell, everything online looks like it was produced on a computer by a Web designer.

Yet films tend to look like films even though filmmakers benefit from a vastly wider array of options – aspect ratio, film stock, lighting and décor, makeup, and of course postprocessing. (In-camera effects, like the filters used to such flabbergasting effect by Slavomir Idziak in Gattaca, are, sadly, now démodé.) Even digital-video snippets, as seen in Welcome to Sarajevo and other motion pictures shot on film, are still accepted by film audiences as rightful forms of filmmaking. All-digital films are slowly catching on. Those divergent visual approaches are all accepted as film, and they span a huge gamut. Ideaspace and Toolspace are neck-and-neck here.

So don’t try telling us that Web sites look like Web sites because they’re Web sites produced by Web designers using Web tools. By comparison with cinema, we on the Web labour under slavery-era æsthetic constraints, but we nonetheless do not see enough visual variety. (We exclude Flash from the discussion.)

As fans of photography (one contenunian is a published photographer; another went through the standard buy-a-camera–learn-to-see-again epiphany), we know it doesn’t have to be so.

Photos are crap

But, Mission Control, we have a problem. Photo reproduction online is poor. Web designers just do not have the skills to customize photographs for onscreen display under the rigors imposed by the JPEG format. (Don’t even think of saving a photo as a GIF.)

  1. Pictures are too small. Weaned on a diet of thumbnails, designers shrink photographs to the size of the metal shutter on a 3.5" floppy disc.
  2. Pictures are dull. Computer monitors are luminous. In effect, you’re dealing with a lightbox. Ever looked at a slide through a loupe on a lightbox? The richness and saturation are astounding. A reasonable facsimile of that visual epiphany is attainable online. Photographs can appear jewel-like even on a VGA monitor. Philip Greenspun gives technical explanations of how to wrangle Photoshop to make it so.
  3. Pictures are generic. The rise of royalty-free stock-photo houses has made it easy for that minority of designers who bother to add photography to select terminally boring, unoriginal, and inappropriate imagery, of the sort that features smiling, well-dressed, improbably multicultural office workers huddling close enough together to trigger a sexual-harassment lawsuit. (The musical group Erasure, which lampooned this falsely-generic, consent-manufacturing, homogenized-yogurt photographic style in the CD-insert photography of its 1991 single "Love to Hate You," described it in an interview as "the twentieth century, really. It’s 1990s/1980s advertising. I don’t think advertising’s moved on. It’s just that maybe it’s more kind of evil.")

Understanding photography

We know the real reasons why photography is so slipshod online. Photos cost, they require an eye, and the only real way to handle them is out-of-house.

  1. Photographers cost good money. No wonder: You have to own a lot of equipment (now including digital cameras, whose technology expires faster than homogenized yogurt), not to mention computerization, which requires crème-de-la-crème hardware and software. Few photographers will sell you a photo outright. You pay for limited usage rights. The concept of "limited" anything on the Web sticks in the craw. (What did the Wired apologists used to say? "Information wants to be free"?)
  2. You need to know your stuff. Developing an eye for photography takes years of reading, gazing at photos, and shooting. In short, photography requires taste. (Alert readers will note the contenu.nu slogan: "Taste | Acumen | Content." We listed the words in that order for a reason.) Web-focused art directors with digital-only skills who lack a print background simply haven’t had time to gain that kind of eye. (It could be worse. Developing an eye for typography takes a decade. That may explain why online type is so bad.)
  3. Employers are absolutely notorious for piling expectations onto their Web employees.
    1. Everyone at contenu.nu (as of this writing, there are five of us) has combined artistic and technical skills. You need both to function in the content "space."
    2. But it’s not at all atypical to find a job posting requiring full fluency in HTML hand-coding and Dreamweaver, Photoshop and Illustrator, and Javascript, Perl, PHP, CSS, IIS, and ASP. Are you a designer or a programmer? Or neither? (Reporter to Ringo Starr in A Hard Day’s Night: "Are you a mod or a rocker?" Starr: "I’m a mocker.")
    3. So are you a photographer, too?
  4. With such emphasis on doing everything not only in-house but funnelled through a single person, who the hell wants to deal with freelance photographers and their artistic temperaments?

Photobrokerage

We see an opening for an entirely new industry – the online photobrokerage. Spiritually akin to a stock-photo agency, which maintains large corpora of already-shot photographs (and makes limited assignments for photos with anticipated future demand), the photobrokerage would act as a central photo-management depot for all the Web-design shops in town who don’t have the time, patience, or taste to oversee photography themselves.

Say you’re a designer. You’ve finally received signoff on your client’s Web-redesign brief. (Finally.) You run the brief by the photobrokerage. "Got any ideas?" you ask. "Know anyone who could shoot something for this?" The broker acts as a de facto art director/photo editor, brainstorming approaches and energizing his or her network of experienced shooters while you do your "real" work.

The broker nursemaids the photographer, if necessary, and serves the Web designer by delivering fully-optimized, jewel-like JPEGs (or PNGs) of candidate shots, akin to a contact sheet. The photographer always gets paid his or her minimum day rate. That gives the broker an incentive to understand the designer’s requirements and know how to put them into effect with the shooter, because a mismatch will cost the designer money and undermine the relationship. Of course, the designer pays full rates for reasonable Web usage of selected photographs (e.g., two years of unlimited visitor downloads, but no alteration of the delivered files; photographer bylined on a credits page, on the actual site page, in ALT and LONGDESC).

The system leapfrogs the common obstacles to using photos online. The designer and broker are both artists, and can talk in the same vernacular. The Web designer trades off personal control against convenience and an ultimately superior Web design that couldn’t otherwise be accomplished (being, as they are, too busy learning IIS and Javascript). Only photographers keen on seeing their work on the Web will be interested, and shooters who worry about crappy JPEG quality are reassured, since the brokerage takes care of optimization. (Or the photographer can do it, if technically inclined.)

The photobrokerage system ends up costing more than hiring a photographer directly. Rather like the despised temp industry, you live with higher prices in return for reduced hassle. Except unlike office temps, who make next to nothing for hard work, photographers earn standard rates. The artist is protected. (Relax, Courtney.)

Stock-photo agencies are micturating themselves trying to figure out how to survive in an age where everyone expects visual content to carry no cost. (And how to survive in a world where Gates and Getty own nearly everything.) By slightly expanding their self-conception – from a warehouse into a brokerage – agencies can have it both ways: Selling the past and meeting the needs of the present.

We told you we open-source our knowledge. (We also said we wouldn’t give away the store. Hmm.)


2000.06.20

Quickie update from the redoubtable Lawrence Lee of Tomalak’s Realm, a man who embodies the best content-management system on the entire Internet, who can find everything he ever saw in an instant: The Canadian government actually published guidelines on bilingualism on "electronic networks." Leave it to Lawrence to know exactly where to find it.

Anyone with any kind of content problems in Vancouver who does not immediately hire Lawrence at the highest possible rates is a fool. And you can quote us on that.

Metadata

Remember the word "shovelware"? The term described uploading old print articles to the Web. It was a cardinal sin for years, and was credited with the demise of Time Warner’s Übersite Pathfinder.

We were told that real Web content contained multimedia! and links! and interactivity! Well, yeah. Content on the Web containing those features is truly Web-like. But, just as inverted-pyramid blurbettes aren’t the only acceptable form of Web writing, there’s more to life than Flash and chat.

Layers

With HTML metadata, you can add layers to static text without recourse to animated GIFs or anything produced by Macromedia.

For most Web-surfers, the metadata we’re talking about are "Hey, neato!" features. Suddenly a balloon pops up with more information, for example. It’s impossible in print. It’s intrinsically Web-like. Even if all your site presents is words, you can add Webness through these tags.

Pages

At the page level, there’s actually a tag called <LINK>. You place it inside <HEAD> and before <BODY>. You can specify relationships between the current page and pages below it in hierarchy (<LINK REL>) and above it (<LINK REV>). You can keep these straight pretty easily: REV means REVerse, so when you use <LINK REV> you’re going up in hierarchy.

The NUblog you are reading, for example, has a relationship to the contenu.nu homepage above it and the Background page below it. We’d show the relationship this way:

<LINK REV="Homepage" href="home.html" title="contenu.nu homepage">
<LINK REL="AbouttheNUblog" href="nublogbackground.html" title="Background on the NUblog">

Who cares? Well, suddenly you’re able to navigate from one page to another without using text navigation coded into the Web page. Try that in print.

(There’s actually a list of LINK types you can use, including LINKs for author, copyright, start, end, next, previous, and more.)
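
A couple of garden-variety examples, with hypothetical filenames:

<LINK REL="next" href="nublog-2000-07.html" title="NUblog archives: July 2000">
<LINK REL="prev" href="nublog-2000-05.html" title="NUblog archives: May 2000">
<LINK REL="copyright" href="copyright.html" title="Copyright and reuse">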

The problem? Pretty much the only browsers that support LINK metadata are Lynx and iCab, which next to no one uses.

Titles

Did you know you can slap a TITLE on nearly anything? An image, a link, a table row?

Why would you want to? To add an extra dimension. If you’re using pretty much any browser other than Netscape, you’ll find that nearly every link on this page contains hidden TITLE text. In Explorer, a balloon or ToolTip pops up with the TITLE text. In iCab, you see it on the status line. In Lynx, you have to hit L.

Now, we do condone the practice of adding a bit of fun or cheekiness through TITLE. Example:

<a href="http://www.alertbox.com" title="Alertbox, proof that freedom of the press belongs to the high-priced consultant who owns one">

For images, in HTML 4 you must provide an ALT text for users who cannot see the graphic, and you should do so for every single graphic image even if you aren’t coding to the HTML 4 standard. (ALT="" is perfectly valid for spacer GIFs.) It’s an accessibility thing. You can add a TITLE, and the ALT and TITLE can be different. We put our image TITLEs in brackets as a differentiator.
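
A made-up example (filename and wording ours, purely for illustration):

<IMG SRC="kilimanjaro.jpg" ALT="Photo: the summit of Kilimanjaro" TITLE="[The summit of Kilimanjaro at dawn]">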

Smart operators like Blast Radius, who insist on throwing big, graphical pages at you, use the TITLE tag to give you something to look at while the massive image loads. (You can do something similar with Javascript, but we’ll leave that for another day.)

You can enTITLE table rows. We do it here, actually, and the effect can be somewhat annoying, because, in Explorer, a balloon follows your cursor around like a lost puppy at a picnic. We consider TITLE on TR a bit of an experiment. (We’ve played around with the settings today. Try mousing around. Good for a laugh.)
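
In markup terms, that experiment amounts to nothing more than this (wording invented for illustration):

<TR TITLE="[Archive entry: Metadata]">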

Abbreviations and acronyms

What HTML desperately needs is an EXPLANATION tag, so that you could embed explanatory text around a passage that would appear when you selected the text. (We could have used it in the preceding sentence, for example.) In HTML 4, there are two close analogues, ABBR and ACRONYM. You can cause an expansion of an abbreviation or acronym to appear:

<ACRONYM title="frequently-asked questions">FAQ</acronym>
<ABBR title="independent">indep</abbr>

We use those tags consistently here. Again, try mousing around. (Explorer does not indicate which abbreviations and acronyms are graced by that metadata, so you have to stab in the dark. iCab underlines such text.)

Images

Remember LINK, virtually unsupported by browsers? It gets worse.

By far the most obscure of the metadata tags, LONGDESC is used to link to a textual description of an image – a long textual description, much longer than ALT, which theoretically has a maximum length of 1,000 characters. You add it inside the IMG tag: LONGDESC="picture-LD.html".

LONGDESC is an HTML 4.0 tag meant to provide accessibility for blind and visually-impaired net-surfers. Though well-intentioned, LONGDESC is replete with problems:

  1. Of browsers currently available, only iCab supports it. (Control-click on an image. Select Description from the Image submenu.) Even two browsers alleged to support the entire HTML 4 spec, Mozilla and Macintosh Explorer 5, ignore it altogether, putting the lie to compatibility claims.
  2. Many screen readers used by blind and visually-impaired people support LONGDESC, but few authors bother to include such information. (Still today, in the year 2000, we find Web authors who can’t be bothered to enter ALTs.)
  3. Writing a description of an image, to paraphrase Lester Bangs, is akin to dancing about architecture. It can be done; the skills are cognate with those taught to audio describers. Pretty much no one walks in the door with those skills.
    1. The problem, as ever, is that Web development is typically carried out by two kinds of people: Designers and programmers. Neither designers nor programmers can necessarily write – and we mean write vivid, concise English.
    2. What few content experts there are in the world – contenunians among them – rarely know anything about accessibility. Even if they had the skills to sum up a photo in 300 words, few would even know it was necessary.
    3. Authoring tools do not prompt you to include a LONGDESC, though Homesite has been upgraded to make LONGDESC addition at least possible. (Had no shipping browser supported LONGDESC, even that much would not have happened, according to the consultant who oversees feature additions.) That will change after U.S. federal government regulations kick in. Any vendor who hopes to sell authoring tools to the American feds will have to upgrade those tools to prompt for or otherwise automate accessibility features – which will cause rage throughout the land as designers and programmers are forced to spend extra seconds crafting ALTs, and extra minutes crafting LONGDESCs, for every image, time they could otherwise spend producing Flash-only splash pages.
  4. Practically no sites outside of the disability field even bother to include LONGDESC, so the tag has no profile among surfers at all, not that browsers could even expose the tag to people.

You can see just how difficult accessibility to imagery actually is by looking at the Break This Page! experiment, which pitted four access approaches against one another, LONGDESC included.

Any image on this site that does not consist of text (text being fully encapsulable in an ALT) is accompanied by a LONGDESC. For good measure, a D. link, overtly visible in all browsers, is also provided. Hey, we’re cutting-edge. (You can find examples in the archives.)
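
A typical pairing, with hypothetical filenames, looks like this:

<IMG SRC="street.jpg" ALT="Photo: Queen Street West at dusk" LONGDESC="street-LD.html">
<A HREF="street-LD.html" TITLE="Long description of the photo">D.</A>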


2000.06.18

In proper Weblogging tradition, here are some content-related links we’re appreciating lately:


2000.06.14

Multilingual content

Now, we have chided our dear American friends elsewhere on this fine site. We were making the point that Web sites don’t have to be designed for 19-year-old American frat boys (invariably described as surfing the net from their "dorms," a word we never use in Canada, preferring "residence"; Cf. "college" vs. "university").

At a more global level (irony intended), the Web is a multilingual phenomenon. This is a hard sell with our American friends, who simply cannot understand why anyone on earth – certainly including Latinos – would ever want to speak any language other than American.

We look at multilanguage support as a form of accessibility. We have adopted this viewpoint after finally accepting, in the manner of breaking down and letting the love of a deity into our hearts to heal our plantar warts, that subtitling and dubbing are access technologies and techniques on a par with captioning (FAQ) and audio description. We came out of our denial. We broadened the metaphor and now accept Web multilingualism as access. You should, too.

The first hump

Let’s assume you accept that some non-English content on your site would be useful – if only to sell to people who speak that language.

There. We got you over the first hump. (Notice we’re not doing a big sell job here. Either you get multilingualism or you don’t. It’s a personal journey. We can’t impel you toward salvation. You need to save yourself. We can’t do your saving for you.)

But you’re not off the Bactrian camel yet, and you’re facing another hump. You’ve got an interface problem: How do you tell people who don’t understand English that, on your site, they don’t have to?

contenunians all have combined artistic and technical skills, so right now we’ll breeze you through the tech requirements, rather like a manly hand passing over solarized fields of wheat. (Hire us and we can program the whole shebang for you.)

Doin’ it right on the wrong side of town

We have never encountered a satisfactory multilingual site. We like what ettusaishomme.ne.jp does (see archive), but that’s an interface element, not a strategy. We thought the Quebec government site, at www.gov.qc.ca, was being smart in reading our browser settings until we tried the obvious French-language variant of that URL, www.goUv.qc.ca ("government" vs. «gouvernement»), and realized that the system was feeding language-appropriate content from two distinct domains.

Canadian government sites are the worst. Nearly all of them carry hopelessly unwieldy bilingual acronym URLs, like www.ccra-adrc.gc.ca. Then you pick a language. Then you search and plod forever. (Try finding absolutely anything at the CRTC site – and it benefits from a bilingual acronym: www.crtc.gc.ca.)

Rare government sites with separate English and French URLs (like Defence, dnd.ca|mdn.ca) flub it even worse: DND/MDN still asks you to make a selection. (We tried it with EN and FR as our settings.)

Seasoned netters will know that this discussion impinges on overlapping and storied fields in software development best known by alphanumeric acronyms: i18n, l10n, and g11n (internationalization, localization, and globalization, so named because of the number of letters between initial and final). We would like to get to know professionals in those fields (or just language obsessifs).

While we’re waiting for the baby-blue Ericofon telephone to ring, we’ll confess our frustration at having always been bridesmaids and never brides when it comes to multilingual Web content. We want it done better. Somehow, anyhow, by someone, anyone. We have ideas. (Hey, we open-source, but we don’t give away the store.) We know how to do it right. Now we need to find someone who wants it done right. Know anybody?


2000.06.20

Multilingualism redux

We’d like to offer further advice on multilingual content. Last time, we talked about technical and user-interface details. We’ll now discuss actual words.

The translation problem

In practical terms, multilingual Web sites are most widely found among multinational corporations. They’re the ones with the cross-language reach. (Number two on the list is government, usually in multilingual countries.)

The multinationals are in the marketing business. In planning a polyglot site, the question you have to ask yourself is: What am I marketing in English, and what am I marketing in the other language? Computers are sold around the world. Specific laundry detergents are not, for example.

Then another question, the trickier one: What’s different between the two languages? In some cases, nothing. Tide laundry detergent is the same product as expressed in English, French, and Spanish in North America. Apple has gone to some lengths to standardize its graphic design and marketing approach worldwide; international Apple sites look like the American Apple site. Cases like those fit the conventional wisdom that translation merely means restating the same sentences in two different languages.

Content or narrative sites, however, offer more obstacles. If you’re marketing a product by showcasing an individual person in a story, whether based on real life or made up for the purposes of advertising, you impinge on culture.

As an example, MTV in Germany is unlike MTV in Australia. Indeed, this happens a lot on TV (because narrative works swimmingly in TV commercials), but rarely on the Web, which is a graphic-design medium. When it happens, you’re dealing with text and pictures. On the image side, diversity issues come up. Should the people featured be Caucasian? female? young? in a wheelchair?

Workflow

On the text side, you’re stuck with workflow problems. You have to find a reputable translator, provide finalized English copy, and integrate the translated copy into your site. But what do you do about updates? If you’re the European subsidiary of a multinational and you manage Web sites in seven languages, which languages do you update first? Or do you update none of them until all languages have been translated? (The latter is the better way, unless you want foreign-language speakers to get the impression they aren’t wanted on the voyage.)

If your in-house staff handles English copy, will you bite the bullet and tell everyone, all the way up to the VP of marketing, that you won’t go to press with English until the other languages are finished, too?

That will often mean having in-house translation staff, at least on some days of the week. If not, can you develop a relationship with out-of-house translators to turn copy around on a dime?

Designing what you can’t understand

How will your translators handle designed copy?

How will your Web designers handle the translated copy?

Nuances

It gets worse. You may have to customize your copy for different variants of the same language. "National" French isn’t the same as Canadian French. Spanish is subtly different in separate countries in Central and South America and in Spain.

Now, maybe the words won’t need alteration at all. But if you provide a feedback form that asks for a postal address, you have to either provide free-form entry fields (a big box to type in everything but personal name and country name, for example; careful about what happens when you hit the Return key!) or well-researched country-specific forms (e.g., asking for a state in Mexico but not in Puerto Rico).
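
One way to sketch the free-form variant (field names and the ACTION URL are ours, and this is only a sketch): a multi-line TEXTAREA holds everything between personal name and country, so a stray Return key adds a line instead of prematurely submitting the form.

<FORM METHOD="post" ACTION="/feedback">
Name: <INPUT TYPE="text" NAME="name" SIZE="40"><BR>
Address:<BR>
<TEXTAREA NAME="address" ROWS="5" COLS="40"></TEXTAREA><BR>
Country: <INPUT TYPE="text" NAME="country" SIZE="30"><BR>
<INPUT TYPE="submit" VALUE="Send / Envoyer">
</FORM>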

Do you offer downloadable PDFs of your content? Have you taken care of paper sizes and preferred formats for page numbers?

English-speakers will be aware that British and American English are written differently. But Canadian English uses yet another spelling (and different date formats). We can sum up the spelling differences this way:

Most Canadians and many British will put up with American spellings. Some Americans will put up with British or Canadian spellings (though we have received angry American E-mail about words like valourize that are allegedly misspelled). However, you shouldn’t ask your readers to "put up with" anything. Canadian- and British-specific sites should be custom-written. (We skip for the moment the nuances of, say, Australian and Irish English sites.)

Asymmetry

Now back to the real world. ("Finally!" you mutter.) In the real world, we don’t always have exactly equivalent copy in two languages at hand. In fact, sometimes parts of a site will never be translated.

You have to be up front with people about these gaps in bilingual content. In language-selection areas of the site (discussed previously), tell us we’re only able to view a selection: Français (extraits seulement).

If a subpage on a topic has been translated but most of the subsubpages further exploring that topic have not, say so, and give people the option of reading the English anyway. "Sorry, but this site doesn’t offer any further pages in French on that topic. Would you like to look at the English pages?" (Or German, or whatever language or languages you offer them in.)

In searching, sometimes real-world constraints will come up. If you’re the legal governing body of a bilingual province and your visitors search for French-speaking lawyers, sometimes there won’t be any in the vicinity. The system should then say "We couldn’t find any French-speaking lawyers in your locale. Would you like us to expand the area of search or show you some English-speaking lawyers?"

As you can see, multilingualism is not limited to replicating the same information in two languages. As we learned in the disability field, where accommodation can sometimes mean that treating a disabled and a nondisabled person exactly equally results in inequality, treating English- and non-English-speakers identically sometimes ends up highlighting your poor preparations. People can live with inconsistent bilingual content – if you don’t try to fib and pretend there’s more available than there actually is.

Who’s gonna handle all this?

We apologize for the length and generality of this discussion. Our goal is sensitization, not training. We can see an objection coming, though, from real-world clients, who won’t be thrilled at all by the prospect of multiplying their Web-development costs by some unknown factor to accommodate a known number of languages.

But that’s the reality. Doing multilingualism right requires hiring people just for that task. Even consultants like contenu.nu can’t do it all for you. All we could do is iron out the language, content, and workflow issues. But it’s gonna cost. And it’s gonna be a pain in the neck. Welcome to the global village.


2000.06.12

Stuffing too much down the pike

The essence of the Internet is text, and that will remain true until Steve Case finally succeeds in turning the Web into television. (NBTel is already doing it.)

Nonetheless, we like streaming media. Streaming audio, anyway, which functions adequately well for short periods even over a 56K modem and very well over any kind of high-speed connection. What a pleasure it was to discover that Triple J in Oz is even worse than CBC’s attempts at programming music of any recent century, but in an entirely different way!

From a content-management and usability standpoint, however, too many streaming sites put the cart before the horse. They assume that, merely by directing your browser to their site, you’ve already got all the equipment to handle broadband signals. They throw hugely complex JavaScript and/or Flash sites at you just at the homepage.

What’s wrong with this picture?

When was the last time you browsed the aisles of a video store holding a VCR and television in your arms?

Isn’t it understood that you set up the artwork before you enjoy it? Even in middle-class bookstores with in-house cafés, you buy your book first, then read it.

Sites like these, which slap you across the face with high-bandwidth content before you’re ready for it and before you even asked for it, are designed for programmer elites, not real people.

You can’t even visit a broadband site without a broadband connection. It’s even more infuriating when the site offers multiple feeds, because then you know you’re going to be wading through even more molasses. If only one feed interests you, you naturally face another level of selection – a second round of setup before you get to the enjoyment.

Example? The shadowy, little-understood Toronto cybercaster 1groove.com, which has secured crème-de-la-crème DJ talent and good management at parent Iceberg Media, according to print reports unavailable online. The outfit, which runs three net-radio posts, also swung a deal to broadcast programming overnights on CIUT, a beleaguered "campus/community radio station."

1groove’s site is so complicated it requires tech support on-air. Dr. Trance, alias Don Berns, dictates directions on how to navigate the site (to send comments to DJs, for example), most of which have to do with clicking some kind of bottle cap at the site’s homepage. (How’s that for an interface element?)

No doubt in response to complaints, 1groove.com now features a "static" page "for Rapid access to the 1Groove experience..." (sic). Even that subsection requires JavaScript, and the noframes copy indicates hotshot programmers’ attitude toward anyone not exactly like them: "What are you thinking? Get frames."

Yes, what were we thinking? Why are you making us walk to the airport just to sit down in an airplane?

How to do it right

If you’re looking for repeat listeners, accept that they’ll want to find the exact, unchanging, permanent link for their favourite feed and bookmark it.

Give us svelte, sexy, quick-loading, self-evident pages that lead us to those bookmarkable links. The only pages that should require a broadband link are those that actually feed sound. Everything else – DJ bios, program playlists and transcripts, archive listings, the lot – should be built for narrowband, not broadband.

And kids, no Flash, please. None of it. Just as some Republicans are gay, some of us interested in Internet audio are not interested in Internet animation.

(Later, we will discuss content-sharing between print and electronic news sources online.)


2000.06.10

Our hate-on for portals

We promised to explain why we hate portals. We live up to our promises.

The raison d’être of the portal is to attract eyeballs, which, the theory goes, would then become monetized. Portals are intended as one-stop shopping for the vast, intimidating, scary, pedophile-stacked, pornography-drenched, hate-literature-replete Internet.

In an attempt to please everyone, portals please no one. Portal content rests on the profoundly elitist, ill-informed, and condescending premise that the plebes are interested in generalities. Portals provide information on Sports or Entertainment, but what people are actually interested in is the Dallas Stars, the All Blacks, or Juventus; Stephen Rea, Air, or Run Lola Run.

Even a general sports fan will not sit there and download generic sports headlines. A general sports fan who’s been online for more than three months will have cultivated his (sic) specific interests. Why? Because the net is all about specificity. Freed of the costs of printing and schlepping paper throughout meatspace, there’s effectively unlimited space for niche interests.

How were eGroups and Topica born? Why are there 33,122 public Listserv lists? People have specific interests, not general.

Portals are an American-network-television model. They errantly recapitulate the concept of the mass audience.

What we’ve got are well-paid marketing men (sic) and geeks setting up portals with links to "content" on every conceivable mainstream topic. The marketing men are then able to say "Sports? We got sports. Entertainment? We got that too." They themselves aren’t interested in the generic concepts of "sports" and "entertainment." Even for the marketing men’s own interests, portals are a fraud.

Portal unusability

Portals invariably ape the taxonomy of Yahoo, with categories and subcategories, all of them selectable. That hierarchical taxonomy has become an absolute standard online. Epinions is trying to make it work. Philip Greenspun’s famous photography site, photo.net, recently adopted the meme. It’s omnipresent.

What is the effect on content? You can’t find a damn thing. Not without a search, anyway, and if your topic isn’t natively covered by your portal’s in-house content staff – what’s that? the portal you run doesn’t have a content staff? – then what you end up with are links to other sites. Once you know those links, you have no reason to re-visit the portal for that topic. Ever.

At a structural level, presenting dozens or hundreds of links on a page is too dang confusing, especially to the neophytes on whom portals prey.

Portals are not about content. We know this because of the crappiness of their content. What galls us is the arrogance of portalistas, who foist content and a content structure on us that they themselves would never accept.

What is the solution?

As much as we despise the neologism, vortals make much more sense to us. Provide tightly-edited content on a specific topic – very much including links to off-site sources – produced by a dedicated team of obsessifs on that topic. Now, why wasn’t this obvious in the first place? Furthermore, do it right and you can own your "category," which absolutely no general-interest portal can claim.


2000.06.08

We will stretch our earlier metaphor of duelling Fred Flintstones on the rack a bit more in discussing the demise of APBNews.com.

The serious crime site ran out of cash this week and laid off its entire 140-person journalistic staff.

(It’s worse, of course. Salon laid off staff. Why? "Traffic, baby. Traffic for media [coverage] never gets huge, compared to Britney Spears." And those aren’t the only layoffs and closures.)

On hearing the news, one Fred Flintstone suffered mild cardiac arrest. If something as tightly-focused, non-exploitative, and professionally-run as APBNews.com could tank ("No media outlet – no Web site, no newspaper, no magazine, no broadcast – came close to providing the criminal justice content that APBnews.com did"), didn’t that mean that supporters of portals, a near-fatal melanoma on the skin of the Internet, would have won?

Didn’t it mean that the only content of interest to people is headlines? The promise of a zillion Reuters articles you could maybe read if you could weed your way through the barrage of links, banners, and animated GIFs at your favourite portal site?

Marketers see the Internet as a transactional medium. So do a lot of netters themselves. Admittedly, E-commerce (or, as we write it, E$) is often convenient. Marketers get the most press, because the business pages, physical and virtual, are about selling. Selling is one step away from the thing being sold. Just as portals provide links to content rather than content, the business pages provide meta-coverage of the net rather than genuine coverage.

Anyone who’s been online more than a year or so, and anyone with a brain and actual interests, would never rank "Shopping" as the main reason they go to the trouble of dialing a modem and connecting to a creaky local service provider (or AOL). Shopping does not captivate the interest. Content does.

Content can be and is part of E$ sites. Indeed, we have criticized the skimpy content at E$ sites elsewhere. What we want is more content at shopping sites, not less.

But we want more and better content generally. APBNews.com gave us both. And it tanked anyway.

This Fred Flintstone is almost ready to concede defeat. He figures the portalistas have prevailed, and we might as well just pack up and head home to surf Reuters links for the rest of our lives.

But the other Flintstone notes the fact, largely unremarked by the chattering classes, that every portal save for Yahoo is losing money hand over fist. Frankly, portal content isn’t working. Portals aren’t working. On another day, we will explain how portals insult the intelligence of the Internet audience.

This Flintstone also notes that every APBNews.com staff member is being aggressively recruited. (Programmers, too, of course. Running a content site isn’t an all-journo venture.) As Dan Fost put it, perhaps too optimistically,

I’m not too worried about my unemployed online colleagues, certainly not in the way I am when a newspaper folds. When someone takes a job online, they shouldn’t think of it merely as a job with a particular dot-com; instead, it’s like taking a job with a huge company called the Internet. Your project or division may fold, but once you’re working online, there are plenty of other places hungry to hire you.

If the demise of APBNews.com really signaled a low-water mark in the still-nascent field of excellent content, why are other content sites clamoring for their talent?

In meatspace, sometimes good products die on the vine. The canonical example is Betamax. Beta died. Home video did not.

APBNews.com is dead. Content is not.


We note, though, that online content has an added dimension compared to print: You can measure it. You always know how many people read your stories. Salon was somewhat notorious for using logfiles to adjust the prominence of various articles. And ultimately, those logfiles sealed the fate of the 13 people dismissed. All their sections pulled in low numbers.

Still, on the net, where niche content is cheap, we wonder what "low" really means. We suspect it has something to do with greased totem poles at charity benefit picnics rather than readership.


2000.06.07

We were pleasantly surprised to find, tucked away at BusinessWeek’s site, a regular column by John M. Williams ostensibly on adaptive technology, though in practice it concerns how technology affects disability. A bit drab, the writing. But this is one case where online media make it possible to cover topics that would be too expensive to justify in print production. Space on a printed page costs more than files on a server.


2000.06.06

Usability is like love

Over on the CHI-Web mailing list – and you cannot consider yourself civilised unless you subscribe to CHI-Web – we discussed the deserved demise of boo.com. Since the CHI-Web list is made up of usability types, we recapitulated the blind-person-exploring-an-elephant phenomenon and laid blame for boo’s failure on usability.

Then Jared Spool piped in, saying that since no site his company had ever studied had enabled more than 42% of test subjects to complete typical tasks, usability is much less important than we think it is. Indeed, it has next to nothing to do with a site’s viability.

Thus was unleashed, however unwittingly, a battle of doctrines. One Fred Flintstone, hovering over a shoulder, exclaims "Boo died because of crap usability. Usability rules!"

A second Fred Flintstone, hovering over the other shoulder and resembling Jared Spool, counters with "Usability is barely relevant in determining who flourishes online – our subjects are successful less than half the time."

Should we be looking at a multifactorial approach here?

Isn’t it likely that, as everywhere on the Web, usability and every other factor are all interlinked? A well-designed site with usability proven in repeated testing and iterative refinement can still fail over a 14.4 modem connection. You "did everything right," but factors outside control worked against you.

What are the factors?

If we think about factors we can control, though, shouldn’t it be obvious to seasoned Web designers and researchers that a successful site has to:

  1. load fast;
  2. look good;
  3. offer content or wares of genuine relevance to its visitors;
  4. be usable;
  5. work in any browser, on any platform, with any settings?

We take these factors all together at once and arrive at a gestalt. boo was slow, looked great, offered nothing of real relevance to genuine consumers (the wares were too twee and overpriced), was actively unusable, and told you to get lost if you used a Mac (and failed with any combination of graphics, Java, or Flash turned off).

Amazon is fast, looks OKish, offers many useful items and some content (nothing we particularly like), is quite usable (if only for repeat customers who have One-Click Ordering set up), has never failed to function in any of a zillion browsers we’ve used, and will let anyone browse and visit.

Now think of a couple of also-ran E$ sites and try applying the list above – sites you visit despite their annoyances. You will end up with a matrix of Dungeons & Dragons–style hit points that can help diagnose the successfulness of the site. (In fact, should we start a new meme – Web-site hit points?)

When looking for love,
leave the checklist at home

It’s sort of like love, isn’t it? Or getting a cat. We went to the Humane Society specifically looking for a small, short-haired female cat and went home with a long-haired Garfieldesque male orange tabby who was so huge he could reach all the way to the countertop.

Or, getting back to love, you may say you adore redheads, but suddenly a black person sweeps you off your feet. Or, over time, your black friend becomes something more.

Maybe Web sites are the same way. Maybe, viewed as designers/researchers, we have to stop slagging sites because they fail on a single component X when it’s components T through Z that are all relevant. Maybe we shouldn’t look for love armed with a yes-or-no checklist.

And maybe – this is pretty well accepted – people put up with sites for the first confusing minutes and keep at it until they get what they want. This directly contradicts the nielsenism that people will instantly switch to another site if what you’re offering isn’t what they want or if it’s hard to use.

(There’s so much wrong with that declaration. Among other things, only superusers know what all the other options are to go to in the first place.)

Stick to it anyway!

Is it also true that, even with all the other factors usability specialists cannot control, we need to take care of usability anyway? Though we might not have the absolute diagnosis of why boo went under ("Usability rules!"), we should still keep working to make other sites usable (despite "Things work only 42% of the time")?

We say this knowing it’s already the way to go. contenu.nu principal Joe Clark spent nearly 25 years working on disability issues and accessibility. Even in very large cities, you’re never serving more than a few people at any given time – by some objective measures, especially to conservatives, that represents failure. But not for those few people. Same with our hit points: The site may be slow (demerit!) and look like shite (demerit!), but by gar, most everyone who comes to the site can actually use it for its intended purpose. Because we knew usability was important despite the other problems and kept at it to make it work.

Usability Is Like Love. We can see the T-shirts already.


2000.06.05

Would Jakob Nielsen please shut up?

We owe a debt to Jakob Nielsen. We really do. We’ve read all his Alertbox columns. We agree with much of what he says, particularly his pleas for visitor-focused design and accessibility. (Nielsen’s advice on creating thumbnails is very clever and entirely ignored.)

But jeez, this guy ought to stop pretending he knows everything about online reading.

In response to a now-famous study by the Poynter Institute on the habits of news-site readers, Nielsen writes:

Web content is intellectually bankrupt

– in itself a self-incriminating overstatement, akin to Roger Black’s notorious claim that "There’s hardly any good work on the Internet at all," as if he, a Web designer, weren’t complicit in that fact if true –

and almost never designed to comply with the way users behave online. Almost all websites contain content that would have worked just as well in print. Even online-only webzines are filled with linear articles with traditional blocks-of-text layouts. No hyperlinks, no scannability. New forms of content that are optimized for online are exceedingly rare, and I keep returning to the same four examples when I am asked to name good writing for the Web: Tomalak’s Realm, AnchorDesk, the Feed Daily mini-column, and Yahoo Full Coverage.

Except that linear articles do indeed function online in many cases. But hold that thought. The usability arbiter continues:

[T]he most common behavior is to hunt for information and be ruthless in ignoring details. But once the prey has been caught, users will sometimes dive in more deeply. Thus, Web content needs to support both aspects of information access: foraging and consumption. Text needs to be scannable, but it also needs to provide the answers users seek.

Well, either it needs to be scannable or it doesn’t. Nielsen pays obeisance here to the unstated principle that the Web is for E-commerce or discrete information tidbits. Service-oriented sites are the most popular on the Web (Amazon, search sites – you know the scene). But they’re not the only kind of written content.

In the Poynter study, seasoned news-site visitors were asked to read news as they usually would. But news is decomposable. You can sum it up in a headline, or boil it down to a couple of sentences.

It’s a mistake to expand the newswriting model to apply to everything on the Web. At a newspaper site, for example, cultural reviews are much less likely to be skimmed, as are gripping news stories that really mean a lot to you – like water contamination in Walkerton, where you hang on every word.

Other kinds of writing – including the now-ubiquitous Weblog format you are presently enjoying – often call for thorough reading, particularly when the writers are good. (Current fave: Ryan Gantz, Sixfoot6.com.)

But here’s the biggie: The four sites Nielsen loves the most aren’t even Web writing sites, let alone news sites. They are links to other articles – real articles, the kind you have to sit and read.

What Jakob Nielsen likes isn’t content, it’s meta-content. And we’re supposed to trust his advice when applied to the entire Web? As the kids used to say in the latter days of the passing century, homey don’t play dat.

So what do we do instead?

Nielsen is not entirely wrong. We strongly agree that many forms of online content should be chunked up, using tags like <STRONG> and Cascading Stylesheet parameters that let you box or overline-and-underline text you wish to highlight. We use it here, and the essential CSS is:

SPAN.hilite { border: solid 1px #FF6600; padding: 1px }
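
And the corresponding markup is simply:

We <SPAN CLASS="hilite">strongly agree</SPAN> that many forms of online content should be chunked up.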

That markup looks pretty awful in Netscape, as so much that even resembles standards-compliant HTML increasingly does.

But when you need to present 3,000 words, or even 250, because that’s how many words you need, what you do is refuse to throw the baby out with the bathwater and engage in best practices for online reading:

So which is better: Presenting as much text as you need to present, but responsibly and readably, or listening to a pundit whose fave Web sites are little more than file cards linking to books in a great library?

(If you think we’re being rough on those four sites, don’t. We adore Tomalak’s Realm and could not live without it, and Lawrence Lee’s ability to manage vast oceans of content, all of it seemingly at his fingertips, is so impressive it’s Orwellian. And best of all, this blog entry looks a lot like an Alertbox screed. Ironic, huh?)


Earlier, we mentioned "a stunningly concise evisceration of the usability deficiencies of Reflect.com." We learn now, through an article in a certain newspaper, that Reflect.com was masterminded by Critical Mass of Calgary. The accompanying photo depicts not one but two Critical Mass employees riding BMX bikes around the office. Given the firm’s apparent inability to produce a usable site, let’s add an item to the questionnaire clients should present to would-be consultants (including us, of course):

How many of your employees ride BMX bikes around the office?

If the number is greater than zero, thank the consultants for their time and high-tail it outta there.


Some usability critiques we’re liking lately: A stunningly concise evisceration of the usability deficiencies of Reflect.com in BusinessWeek; usability praise, largely from an automated software source, for REI. We’re just wondering, though: Computers find raw census data usable, for gosh sakes. Why have we developed simulation software to evaluate usability? Isn’t this asking the wolf to guard the coop? We ask you: Haven’t we learned anything from The Matrix?