Sunday, 17 December 2017

Kodak DCS Pro 14n

Let's have a look at the Kodak DCS Pro 14n, a full-frame digital SLR from 2003, notable for being one of the first ever full-frame digital SLRs. On paper the 14n looked like a winner. It was cheaper than the Canon 1Ds and had a higher resolution sensor, with fourteen megapixels to Canon's eleven. But the reviews were indifferent, and the 14n is generally regarded today as a mis-step that did nothing to halt Kodak's inexorable decline.

But before that, the most important thing. In all of my years I have pushed my body to the limits of physical endurance, and beyond; I have pushed my mind to the edge of sanity; I have gazed into the abyss, and contemptuously turned my back on it. I have sneered at God's wisdom and become numb to pain and pleasure.

A lesser man than me would break down and cry, but underneath my hard exterior there is a hard interior. I admit that I haven't had much luck with women but there's still time. However one thing I haven't done is transfer files from one device to another with FireWire. In all my years I have never transferred files from one device to another with FireWire.

Transferring files with FireWire. My life is now complete.

In the first decade of the twenty-first century many millions of people used FireWire on a daily basis to transfer music from their Apple Macintosh to their iPod, and other people - richer, more influential people - used FireWire to transfer video files, and even some PC people used FireWire, but not many, because PC people had USB. I had USB. For a while USB was much slower than FireWire, but it caught up, and eventually Apple switched to USB and gave up on FireWire.

But the air of glamour remains. FireWire was the interface of Apple people. People who had sex regularly. Sex people. As a PC person I could only dream of the kind of lifestyle that Apple people took for granted. With their model-like good looks, lounging in IKEA kitchens, going out to nightclubs, drinking just champagne, having lots of sex, and transferring files with FireWire. The Apple people probably knew nothing about data transfer protocols or eSATA or USB polling, but they did have a lot of sex. Unlike me.

Well, that's all changed. Except for having lots of sex. That hasn't changed. But I have used FireWire. I used it for the first time yesterday and I will use it again later today.

What was it like? The DCS 14n has a four-pin FireWire socket, so I needed a four-pin-to-six-pin cable to connect it with my G5. In the days before everybody used Adobe Camera Raw or Lightroom, camera manufacturers had their own image management software. Fuji had FinePix Viewer, Canon had Digital Photo Professional, and Kodak had DCS Capture (which interfaced with the camera) and PhotoDesk (which developed RAW files). Sadly Kodak no longer seems to host either, but I have archived copies.

After shooting some test images I plugged the 14n into my G5 with the FireWire cable. There is of course something sexual about plugging a cable into a socket. You have to find the right orientation and ensure you use the correct amount of force, but unlike actual sex you don't have to worry about the connector going limp. And, ironically, the more you worry about the connector going limp the more likely you are to have problems. It's a feedback loop - strictly a positive one, even though the outcome is negative. You worry that you won't perform; your worry affects your performance; the humiliation of failure feeds back into the worry.

One quirk of the 14n is a tendency towards purple highlights around brightly-lit spots. It's the same colour as the IR blocking filter on the sensor, which makes me wonder if it's a result of flare inside the camera body.

A wise man recognises the problem and takes steps to deal with it, even if this means admitting fault and doing a lot of hard work to rectify it. An unwise man blames his failure on others, which is how you get Peter Sutcliffe. All those women he killed weren't the problem. He was the problem. He should have killed himself. But anyway I fired up DCS Capture and transferred the files. Was the result any more or less euphoric than using USB? I can't tell. Digital SLRs of the period were notoriously slow at transferring files, and the 14n was one of the slowest. But that progress bar, oh boy. That magic feeling. If nothing else comes from this rotten year, I can at least say I have used FireWire. I'm going to go on eBay and see if I can find an old iPod with FireWire (checks eBay) I'm not paying that much.

Over the years I've used several of Kodak's old DCS cameras, most recently the six megapixel Kodak DCS 760, which was launched in 2001, and the two megapixel Kodak DCS 520, which came out in 1998. I've also owned and used the DCS 420, DCS 460, DCS 560, and the Canon D2000, which was a rebadged DCS 520 sold by Canon because Canon took ages to release their own digital SLR. The DCS cameras are fascinating nowadays because they're huge and strange; for the most part they're curiosities, although the 500-600-700 series are still usable as cameras. They take standard Compact Flash cards and, surprisingly, batteries are still available.

The 14n. It has a strange mixture of design cues from the DCS 520/560 (the abbreviated vertical grip) and the 600/700 (the bulging handgrip). The Fuji S3 was essentially Fuji's take on the same idea, and it was a lot more elegant.

The 14n has long fascinated me. It has a fourteen megapixel, full-frame sensor, without an antialiasing filter. The specification doesn't sound bad today. It has a base ISO of 80, which gives me a chance to see how steady I can hold the camera. Furthermore the general concept - a proprietary sensor built into a Nikon F80 film camera body, with a portrait grip - reminds me of the Fuji S3, which was pretty good and actually got better with time, because later versions of Adobe Camera Raw could process the camera's files even better than Fuji's own software. The 14n has a number of fatal problems, but in general it's a better camera now than it was in 2003. Back then photographers had to run the RAW files through Kodak PhotoDesk, which applied smeary noise reduction. Modern versions of Adobe Camera Raw bypass that. The images I have shot with my 14n look less plastic than the samples I can find in contemporary, 2003-era reviews.

A Brief History of the DCS Series
The original Kodak DCS 100 of 1991 was the first ever digital SLR. It was essentially a Nikon F3 with a digital back, attached by cable to a large storage and playback unit that the photographer had to carry over his shoulder. The image quality was superior to contemporary still video cameras and, unlike film, its images could be transmitted instantly across the world, so despite an astronomically high price it sold well enough to prompt Kodak to enter the digital SLR business.

Fans of Digital Photography Review might recognise this scene.

Back in 2003 the reviewers had to use Kodak PhotoDesk, which applied noise reduction to the files. Adobe Camera Raw doesn't, with results that look less artificial.

History recalls that Kodak invented the digital camera and then failed to exploit it because they were schmucks, but I think that's unfair. The company's professional digital SLRs had a near-monopoly for several years, and its consumer digital division sold millions of cameras. But the professional imaging department faced the same problems as DEC, Sun, IBM and so forth - a reliance on high-margin, big ticket items that failed to scale down when commodity products became "good enough" - and its eventual demise was not unique in that respect. Meanwhile the consumer digital camera division was out-competed by the likes of Casio, which again was common throughout the camera and computing industries. I grew up in a world that had always been Japanese and Korean; Kodak's survival into the 2000s was impressive.

In my opinion Kodak was not uniquely mismanaged or undercapitalised; it suffered from being an American company in a market that America could no longer dominate, that no longer needed America.

The DCS 100's successors divide roughly into three waves. The DCS 200 and 400-series of the mid-1990s were Nikon film SLRs mounted on a large imaging component that resembled a huge motor drive. The camera bodies could actually be separated from the digital back and, with a bit of work, turned into 35mm film cameras again. I've used the DCS 420 and 460. They're incredibly awkward today, with chunky batteries and erratic colours. The DCS 460 is notable for its then-extraordinary six megapixel resolution and its APS-H sensor. Although the APS-H format is generally associated with the later Canon 1D press cameras, it was invented several years earlier by Kodak and used in all of their six-megapixel digital SLRs.

The DCS 200/400 generation apparently sold quite well and introduced many newsrooms to digital photography, although the cameras still felt a bit home-made. One thing the DCS cameras all had in common was a relatively naked sensor. The infrared blocking filter was always very thin, and none of the DCS cameras had a built-in anti-aliasing filter, although later models had a removable filter mounted just behind the lens.

In parallel with the Nikon-bodied models Kodak also sold the DCS 1, 3, and 5, which used a Canon EOS-1n chassis. On the whole Canon was less keen on collaborating with Kodak than Nikon, so there were fewer Canon-bodied DCS cameras. Kodak also sold the 300-series, which used Nikon Pronea APS film camera bodies and was aimed at a lower price range.

A DCS 760 sitting next to a Nikon D1x, six megapixels versus five interpolated. The DCS 760 was a Nikon F5 with digital additions; the D1x was an entirely new body.

The next wave of DCS cameras was slicker. The two-megapixel DCS 620 and six-megapixel 660 were built around the Nikon F5, with a more elegant digital body that was massive and tough as nails. The DCS 520 and 560 used the EOS-1n again, and were sold by Canon as the D2000 and D6000. Kodak upgraded the Nikon models into the six-megapixel DCS 760 and the two-megapixel DCS 720x, which had a novel sensor filter that allowed it to shoot at ISO 6400, impressive at the time.

The same DCS 760 sitting next to a Fuji S2. The S2 was a six megapixel, APS-C digital SLR built on a Nikon F80 film body, and as you can see Fuji managed to keep the size and weight right down. The F80 chassis was also used by the Nikon D100, and Kodak's later SLR/n. The rivalry between Kodak and Fuji was akin to that between Canon and Nikon. Both companies dipped their toes in the professional digital SLR market, along the way producing some interesting cameras.

The DCS cameras were staggeringly expensive in their heyday - selling for $15k+, mainly to news agencies - and despite massive price cuts the range apparently never made a profit. As with the Apple Power Macintosh G5 I wrote about last month they were technically competitive, but the cost was too high. Not just the financial cost; the weight and size as well. The Nikon D1 and D1x essentially killed off Kodak's press cameras, although Nikon's decision to discontinue the F5 film camera helped force Kodak's hand as well.

The DCS cameras are fascinating today for their quirks. The DCS 500/600/700 had huge batteries that resembled sub-machine gun magazines. They had dual hot-swappable PCMCIA card slots which could take multi-gigabyte cards, plus GPS support, timelapse and tethered shooting at a time when those things were unusual.

They shared one problem, however, which is that they couldn't generate JPEG images in real time. Photographers were supposed to download the RAW files into Kodak's PhotoDesk and batch process them into JPGs, or presumably transmit the RAW files to home base so that the picture editors could do it instead. The cameras had an option to generate JPEG files, but it took ages, drained the battery, and in my experience the results always looked purple. The 14n's JPEG engine isn't particularly good either, and was one of the reasons for its indifferent reviews. Compared to (for example) the Fuji S2, the 14n's JPEGs looked washed-out and had intrusive noise reduction, and furthermore the camera took ages to generate them.

With the demise of the DCS 760 Kodak decided to shift from the press market to the advertising, product, wedding, historical imaging, surveying, geography etc market instead. To that end they launched the DCS Pro Back, a sixteen megapixel medium format back for Hasselblad, Mamiya, and Contax cameras, notable for having a square format at a time when most medium format backs were rectangular 645. To confuse matters it was sold for 645 cameras, but had a 1.6x crop factor that essentially chopped off the sides of the 645 image circle. It wasn't Kodak's first digital medium format back, but it was the first one they seemed to put some marketing effort into. Lord knows if the DCS Pro Back was any good. Used examples still sell for hundreds of pounds. Can you find the batteries? What if an irreplaceable component burns out?

Then Kodak launched the DCS Pro 14n. It had the body of a digital SLR but was essentially a turn-of-the-millennium medium format camera in disguise, with all that entails. I still can't spell millennium without looking it up. I'm never going to learn. What's the point? It's 2017. I will have been dead for nine hundred years before anybody gets excited about the millennium again. You will be dead. We will all be dead. Together at last, joined by the one thread that binds us.

All of Kodak's earlier DCS cameras used Kodak-made CCD sensors, but for whatever reason Kodak decided to source a third-party sensor for the 14n, a CMOS design from Belgium's FillFactory. I have no idea what went on behind the scenes during the development of the 14n. The camera body doesn't seem to have been a problem; reviewers found the vertical grip awkward, but it's not that bad. The interface is similar to the late DCS 760 interface, with Kodak's typical hold-the-button-to-select-a-page, let-go-to-select-a-function design. In retrospect it seems that something went wrong with either the sensor, or the electronics surrounding the sensor, or the programming involved in taking data off the sensor, or all of those things, or the management of those things.

The reviews were scathing, singling out the poor startup times, the flaky interface, appalling image write times, poor battery life, a poorly-implemented lens optimisation feature that gave photographs a colour cast, generally washed-out, magenta-toned colours, and, most importantly, excessive noise. Nowadays full-frame cameras are generally high-ISO champs, but the early models were surprisingly poor. The early Contax N Digital was plagued with banding noise and the Canon 1Ds was no great shakes at higher ISOs either.

The DCS Pro 14n was also noisy with long exposures, which is why I think of it as a 35mm-sized medium format digital back; it was made to be used with studio strobes. The 1Ds had noise at higher ISOs but could do multi-minute exposures at lower ISOs without a problem. The 14n, on the other hand, hates exposures longer than a few seconds. It has a dedicated long exposure mode that works by stacking a series of shorter exposures, but although the thought of thirty-second exposures in daylight is intriguing the feature feels tacked on.

A four-second exposure at ISO 80 with the 14n, showing my workhorse camera with my most-used lens, a Leica-R 60mm f/2.8.

A crop from the above. Notice the smeary colour noise - this is the worst kind of noise because it's hard to get rid of. Bear in mind this is a four-second exposure at ISO 80. In the studio you can just turn up the strobes, or buy more strobes. What if you're shooting a landscape at sunset with stacked grad filters, and you need an eleven-minute exposure? The 14n is not your camera. At higher ISO the smeary noise is joined by vertically-banded noise.

The 14n has a curious "longer exposure" feature that simulates lower ISOs, but with a fixed range of shutter speeds. It works by stacking a series of shorter exposures. The later DCS SLR/n had more options here, including a simulated ISO 50.

The result works, sort-of. This is a thirty-second, ISO 6 exposure. There's very little noise. 

But you can see the stacked exposures. If this had been a single thirty-second exposure the people would be invisible ghosts; instead they're captured at points along their timeline.
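The arithmetic behind the stacking trick is easy to sketch. Averaging N equally-noisy frames of a static scene keeps the signal but divides the random noise by the square root of N, which is what a simulated low ISO is trading on. A toy simulation - not Kodak's actual pipeline, and the frame count and noise figures here are invented:

```python
import numpy as np

rng = np.random.default_rng(14)

def stacked_exposure(scene, n_frames, read_noise):
    """Average n_frames short exposures of the same static scene.

    Each frame carries independent Gaussian sensor noise; the average
    keeps the signal but shrinks the noise by sqrt(n_frames).
    """
    frames = scene + rng.normal(0.0, read_noise, (n_frames,) + scene.shape)
    return frames.mean(axis=0)

scene = np.full((64, 64), 100.0)                  # a flat grey patch
single = scene + rng.normal(0.0, 8.0, scene.shape)
stacked = stacked_exposure(scene, 16, 8.0)

print(single.std())    # about 8
print(stacked.std())   # about 2, i.e. 8 / sqrt(16)
```

Hence the ghosts: anything that moves between the sub-exposures is recorded at a handful of discrete instants rather than being smeared into a continuous blur.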

Therein lies the 14n's big problem, or problems. The low base ISO is awkward for photojournalists because not every news event happens in broad daylight. It's awkward for general photography. The combination of low base ISO and limited buffer makes it unappealing for the wedding market, although in its defence the buffer isn't as tiny as the Fuji S3's. It can't do arbitrarily long exposures, so it's not much use for landscape images.

Even in a studio context the lack of an antialiasing filter means that fine hair and certain clothes produce unattractive multi-coloured moiré patterns, something that Kodak's software was supposed to fix, but didn't. For outdoor portraits on a sunny day at f/1.4 the 14n is probably fab, although I imagine the ugly purple highlights will be a problem if you use the sun as a backlight.
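The moiré itself is plain aliasing: detail finer than the sensor's Nyquist limit folds back as a coarser false pattern, and the Bayer colour mosaic turns the false pattern into rainbow fringes. A toy one-dimensional sketch - the photosite pitch and fabric frequency are invented for illustration:

```python
import numpy as np

fs = 100          # photosites per millimetre on an imaginary sensor
f_detail = 90     # fabric weave at 90 cycles/mm - beyond Nyquist (fs/2 = 50)

x = np.arange(fs) / fs
samples = np.sin(2 * np.pi * f_detail * x)   # no anti-aliasing filter in the way

# The sampled pattern no longer contains 90 cycles/mm; it has folded
# down to |f_detail - fs| = 10 cycles/mm, a coarse false stripe.
spectrum = np.abs(np.fft.rfft(samples))
print(np.argmax(spectrum[1:]) + 1)   # -> 10
```

An optical low-pass filter blurs the detail away before it reaches the photosites, trading the false stripes for a slightly softer image - the trade Kodak chose not to make.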

In common with other F80-based cameras the 14n doesn't meter with non-CPU lenses, so if you have a desk drawer full of AI/S lenses you have to bring along a lightmeter. I'm sorry for all that stuff about FireWire, by the way. It was just supposed to be a throwaway gag, but it developed a life of its own. It's true, though. I genuinely haven't ever used FireWire. It sailed right past me, but I had a dream, and like many men who dream in the daytime I act upon my dreams.

After the relative disaster of the triangular plants thing, the swings and swinging ball / padded stairs are much more popular.

Back to the 14n. In 2003 there was a sense that the camera wasn't finished. Kodak released a flurry of firmware updates, and on the side of the body is a mysterious port marked TEST. This is actually a serial port that will interface with GPS units. The lack of a USB port seems odd nowadays - it has FireWire instead - but then again the Canon 1D and 1Ds didn't have USB either, so I can't really hold that against Kodak.

The last official firmware was version 5.4.1, but this enterprising chap from Ukraine managed to pull the firmware apart and improve a few features, in particular eliminating the lossy compression on Kodak's 12-bit DCR RAW files. I still use Photoshop CS4, so after installing this new firmware I have to run the DCR files through a modern version of Adobe's DNG converter before they look right.

What's the Pro 14n like? Physically it uses Nikon's F80 film camera as its base, which is a step down from the Nikon F5. The same chassis was used by the Nikon D100, Fuji S2, and Fuji S3, of which the S3's implementation was the best. The F80 was built for film, and the cameras based on it all had one usability flaw - ISO is on the PASM dial, which means twisting the dial to change ISO and then twisting it back again. As with all of the aforementioned cameras the F80's buttons and dials feel a bit cheap, although whereas the S2 and S3 were all-plastic the 14n's back section is made of metal. Holding it horizontally is fine. Vertically is awkward - the 14n should have been half an inch taller and half an inch shallower - but not impossible, although I wouldn't fancy toting around an 80-200mm f/2.8 zoom all day.

There are two card slots, one CompactFlash and one SD, which can be set to save RAW files to one card and JPEGs to another. At launch the 14n's SD card slot didn't work, and even with later firmware upgrades it doesn't read SDHC cards, so it's limited to 2gb cards. Whereas the other DCS cameras had no problem with multi-gigabyte cards the 14n is fussy. It's unproblematic with the 1gb and 2gb Compact Flash cards I have lying around, and the 1gb IBM Microdrive I still have, but with 4gb cards I start to get ERR messages and card write errors.

It uses lightweight stick batteries. The DCS cameras were notoriously bad at power management - they drained the battery even when turned off - and it's good form to remove the batteries when you aren't using them. This generally means having to reset the date, because despite having a brand-new CR2032 button cell the camera doesn't retain the date. Kodak stopped making the batteries a long time ago and they're surprisingly hard to come by nowadays. Most of the sticks available today appear to be home-made clones. The ones I have will last long enough to fill a 2gb card, with juice left over. My camera also came with a dummy battery that hooks up to the charger so that it can be powered by the mains. The charger is huge.

What else? The lack of an auto-ISO mode is irritating. ISO 160 isn't too bad, and even ISO 400 is usable if the image doesn't have too many shadows. The camera tops out at ISO 800, but this can only be activated with six-megapixel RAW files, and it looks dreadful. I found myself continually switching between ISO 80 and ISO 160 depending on the lighting conditions. The camera is compatible with D-TTL, which was Nikon's D1/D100-era flash system; it didn't last long before being replaced by I-TTL. In practice the 14n was intended for use with studio strobes, which are dumb and use the PC Sync socket (or an IR or other suitable trigger).

The LCD screen has a pair of notches above and below it that appear to be designed for a plastic screen protector, but I can find no evidence that Kodak ever sold one. The camera is new enough to be covered extensively by camera review sites; as always Rob Galbraith's review had the best combination of technical evaluation and professional photographic oversight.

Sometimes it goes wrong.

What happened to the 14n? The reviews were poor, but it was cheaper than the Canon 1Ds and in theory outresolved it, and perhaps some photographers were fond of Kodak's workflow, so judging by this list of serial numbers it appears to have sold 15,000 units or so. The 14n name suggested that there was going to be a Canon-bodied 14c, but perhaps understandably Canon were uninterested in helping Kodak compete with the 1Ds.

Instead Kodak essentially re-released the 14n a year later as the DCS Pro SLR/n, with a new sensor and new electronics. For a fee the company offered to upgrade existing 14n cameras to SLR/n standard; the resulting cameras were rebadged 14nx. The Pro SLR/n attracted slightly more favourable reviews than the 14n, although it was still overshadowed by the 1Ds.

Kodak did eventually sell a Canon version of the camera, albeit without Canon's involvement. The DCS Pro SLR/c used a body donated by Sigma, with a reverse-engineered version of Canon's lens mount; reverse-engineering the flash automation and lens interface must have been a difficult job. The result was essentially a clone, and remains unusual today as the only Canon-compatible digital SLR not made or even sanctioned by Canon.

What else? In common with digital SLRs of the era the sensor cleaning mode simply flips up the mirror. You have to clean the sensor manually. There's no live view or video recording. The only factory option I am aware of was a buffer expansion from 256mb to 512mb. The serial port could in theory interface with GPS units, and might have been used to connect to digital image verification hardware, but if so I have seen no evidence of it. Unlike earlier DCS cameras the DCS Pro 14n was never carried into space by NASA, who had by that time switched to Nikon. There are rumours of a monochrome-only model, but I've never seen one.

Kodak discontinued the SLR/n and SLR/c in 2005 and disbanded its DCS division. Kodak continued to make sensors for other manufacturers, notably Leica, and was an early driver of the Four-Thirds system, but in 2011 the company sold its sensor division. Whatever remains of it now belongs to ON Semiconductor, who make sensors for cars and mobile phones. Kodak itself declared bankruptcy in 2012. The company still exists, and still makes motion picture film. When I go on holiday I occasionally see Kodak signs, slowly fading away. Like IBM it's one of those names from the past that people remember, although no-one knows what it does any more.

A Kodak sign, shot in 2015, ironically with Fuji Velvia.

Thursday, 14 December 2017

Star Wars: The Last Jedi

Off to the cinema to see Star Wars: The Last Jedi, the latest instalment in the long-running Star Wars space saga. You wait thirty-two years for a Star Wars film and then three come along at once. To paraphrase Howard Hawks, The Last Jedi has three good scenes and lots of indifferent scenes. It's overlong and repetitive and, as is the fashion with modern Star Wars films, it feels like a combination of bits from other Star Wars films. Did I enjoy myself? I did, although I would have enjoyed myself slightly more if the film had been half an hour shorter.

Now that Disney has its hands on the franchise the films have become an annual event. Next year there will apparently be a film about space rogue Han Solo. The year after that, who knows? Something about the bounty hunters, probably. Or a variation of Fifty Shades of Grey starring sadistic torture droid EV-9D9, and hopefully Maggie Gyllenhaal. Or the swashbuckling adventures of Lobot, which will be both an action film and a poignant exploration of autism. I don't know. I don't work in Hollywood.

I saw the film at the Science Museum in London. The Last Jedi was shot on good old-fashioned 35mm film with 70mm IMAX inserts. The Science Museum is the only place in the UK and perhaps all of Europe screening an actual film print. I went into the film without preparation. I hadn't seen any of the trailers or read any of the reviews. The film was released in the UK on 14 December 2017. I saw it on 14 December 2017. I was of sound mind and body.

For Force and Rogue the Science Museum introduced the film with cheaply-animated graphics of Darth Vader and Chewbacca. This time they have persuaded one of the staff to do a little filmed introduction in which he is strangled by Darth Vader. Mid-way through the screening the fire alarm went off, and we all walked out into the street, and then we walked back into the theatre and the film resumed. The end.

I've written about Star Wars before. The first film, just called Star Wars, was released in 1977. It was an enormously popular space adventure notable for its striking special effects and its sincerity. Although director George Lucas was an arty film school hipster, he treated Star Wars as if it was a real film, like Lawrence of Arabia, even though it was set in space and had laser swords and spaceships and robots. Audiences worldwide were eager to be distracted from Gerald Ford and punk music and William Friedkin's Sorcerer so they lapped it up.

The long-awaited sequel, The Empire Strikes Back, was released in 1980. Although Lucas took the risky decision to finance it from his own pocket he didn't compromise his artistic vision; Empire had a peculiar structure and a downbeat tone. Our heroes spent the whole film running from one catastrophe to another. The film ended on a bleak note, with one of the heroes maimed both physically and mentally and another imprisoned in a block of stone.

On both an artistic and technical level Empire remains the high point of the series and one of the best science fiction adventures of all time. Even today it looks and sounds awesome, all blue and orange, with cool stop-motion models and a rocking soundtrack by top orchestral mastermind John Williams. It has class; very few films have class.

The original film trilogy came to an end in 1983 with Return of the Jedi, which threw all the flaws of the Star Wars series into stark relief. The first film had been assembled from things that inspired George Lucas. There were bits of Dune, Buck Rogers, Dambusters, Japanese Samurai films, old westerns, Lensman and so forth. The sci-fi treatment felt fresh and new, but by the time of Return of the Jedi the series had begun to cannibalise itself. Jedi is by no means a bad film; at the very least it resolved the original trilogy in the most efficient way possible that didn't result in people asking for their money back.

The fundamentally derivative nature of the series hurts The Last Jedi. Instead of drawing inspiration from outside the series, the filmmakers have remixed a collection of elements from the original films and from the flood of media that followed them. Yet again there is an evil superweapon which, yet again, has a weak point. For what must be only the second or third time, but feels like the millionth, our heroes easily infiltrate a heavily guarded military base by wearing captured uniforms. Again, the good guys attack enemy vehicles that can only fire forwards by approaching them directly from the front, instead of for example the sides. There is a technical problem that can only be resolved by plugging in some fuses, or opening a circuit board, at which point a door opens. Our heroes run for safety towards a spaceship which is blown up just before they reach it. And so forth.

The film picks up the story from The Force Awakens, which was released two years ago. I remember being impressed that it didn't suck. The new young cast could have been irritating but were instead charismatic, even Daisy Ridley with her plummy BBC English accent, and BB-8, the cute new robot. The treatment of homosexual love between dreamy space ace Poe Dameron and reluctant Imperial Stormtrooper Finn was sweet; the characterisation of the chief villain was unusually complex for a Star Wars film, although the absence of a truly hissable baddie left the film's drama feeling surprisingly low-stakes.

On the other hand John Williams' score had one good new theme but was otherwise weak, and the plot felt like a rewrite of the original. Two years later I barely think of The Force Awakens, but then again there are very few things from 2015 that I think about, indeed I can barely remember 2015. It was the year in which no-one died. Sometimes I worry that all the sealed Force Awakens merchandise I bought might not pay for my retirement after all. The crate of Sphero BB-8 toys in particular cost a fortune. What if the batteries wear down? I'll have to pay someone in China to make new batteries.

The Force Awakens was overshadowed by last year's Rogue One, which was a prequel that filled in some of the storyline from just before Star Wars, using CGI to recreate some of that film's original cast, in the process returning Peter Cushing to the silver screen over twenty years after he died. Now that Christopher Lee is dead it would only require some deft legal manoeuvring to reboot Hammer's Dracula films with CGI versions of the original cast, perhaps including a CGI Ingrid Pitt, who is also dead. Madeline Smith and Gabrielle Drake are still alive. I hope there is a CGI model of them somewhere. I would pay money to borrow it.

Before you write in to complain, I am fully aware that Ingrid Pitt did not appear in any of Hammer's original Dracula films. I merely hope that some part of her is preserved so that she may continue to entertain audiences forevermore, even as whatever remains of her soul begs for the sweet release of oblivion. This may seem cruel and self-centred, and it is, but if this world was not created for my amusement, what was it created for? Or is the universe merely a byproduct of physical processes, created by no-one, for no purpose? Write your answers on a piece of paper and then throw it on the fire, because whatever answer you chose was wrong because there is no answer and we are all doomed.

As I was saying, The Force Awakens was overshadowed by Rogue One, which despite production problems that resulted in a sometimes disjointed narrative - and another weak score, recorded in a rush when the original composer had to drop out - was possessed of some gripping and surprisingly brutal action sequences. The decision to make a darker film than its predecessor seemed self-conscious, and the two lead heroes were a bit dull, but overall the two films distracted me from the horror of life for a few hours, and for that I am grateful, but also resentful because they gave me hope in a world where hope is a lie.

But what about The Last Jedi? Is it any good? Is stoic hero Luke Skywalker a virgin? Sadly the film doesn't answer that. Did the audience applaud when space-princess-turned-military-commander Leia Organa appeared on the screen? No, they didn't. Leia is played by Carrie Fisher, who died almost a year ago to the day. The main credits end with a dedication to her; the audience applauded at that point. 2016 was the year in which everyone died, culminating in George Michael (25 December), Carrie Fisher (27 December) and Fisher's mother, Debbie Reynolds (28 December).

At the time it felt as if 2016 was Muhammad Ali and we were Sonny Liston; the year knocked us down in round one and then dared us to get up so it could knock us down again. Ali himself died in 2016. Sonny Liston was lucky. He died in 1970. If he had lived until 2016 he would have died as well, so it's perhaps lucky that he died earlier. The people who made The Last Jedi have access to a CGI model of Carrie Fisher - it was deployed briefly in Rogue One - but they have promised not to use it. Nonetheless it sits on a hard drive somewhere. Waiting.

Back to the film. Without wishing to spoil it, The Last Jedi borrows an awful lot from Empire. It begins and ends with an evacuation against seemingly impossible odds. The middle section has a training montage in which Daisy Ridley's Rey apparently teaches herself how to be a Jedi Master, while Luke Skywalker moans a lot; Yoda makes a cameo appearance, here rendered with CGI that's supposed to look like a foam puppet.

However it's not all Empire. The middle section also has a bit of James Bond with John Boyega's Finn and Kelly Marie Tran's Rose, a spunky space mechanic, who infiltrate an alien casino. This is, incidentally, when the fire alarm went off, so I missed a teeny-tiny bit of the action. The film has a bit of politics at this point. Our heroes turn into Jeremy Corbyn and decide to liberate the serfs and their livestock, although it's plainly obvious that just after our heroes escape the livestock is either rounded up or killed and the serfs are put back to work. A short scene at the very end of the film suggests that the serfs were however inspired by Finn and Rose, so perhaps at some point they will rise up and slaughter their capitalist masters. Also, does that little kid use The Force to move a broom, or what?

I have to wonder. Do Islamic terrorists see themselves as freedom fighters? Do they see us as the evil empire, and our society as a den of rich parasites? Are they in fact morally right, if legally wrong? As before, if you have any answers, keep them to yourself or write them on a piece of paper and burn it.

This section of the film has one of the three good scenes that I mentioned. A short but exciting chase on the back of an alien horse. What are the other two? There's a very short fight involving Luke Skywalker, in which it becomes apparent that all is not what it seems - the audience applauded this part, and I was impressed with its chutzpah. The highlight however is an extended light sabre battle involving a pair of unlikely allies. There's something audacious about it because it materialises out of thin air. It was the film's only punch-the-air moment.

Sadly however the second half of the film bogs down. The Rebel fleet becomes involved in a long chase with an Imperial squadron that seems to go on forever. Poe Dameron leads a rebellion against the Rebellion that goes around in a circle and leads nowhere. Along the way Laura Dern single-handedly destroys an Imperial Star Destroyer with a tactic that made me wonder why it hadn't been tried before; I found myself wondering why they gave her purple hair, and what happened to Laura Dern anyway? She was in Wild at Heart and Jurassic Park, and then seemed to fall down the same hole as Juliette Lewis and Mary Elizabeth Mastrantonio.

By the final battle, which again borrows a lot from Empire, I found myself becoming bored. There are only so many times you can watch a bunch of attack craft speeding towards another bunch of vehicles before your brain starts to melt. My hunch is that the director had more of a handle on the physical action than the space battles. The fight scenes are exciting, the space battles dull, except for one short sequence in which the Millennium Falcon takes on some TIE Fighters and leads them through a crystalline tunnel. This was the film's fourth good scene. It was obviously a homage to a similar sequence in Return of the Jedi, but it worked.

Overall The Last Jedi passes the time but suffers badly from padding, especially in the second half. If every scene involving Laura Dern was cut it would have been slightly better. Not because Laura Dern is bad but because her "arc" is compartmentalised and pointless. The end.

What else? The toys this time are called Porgs. They're little bird things. At one point the chief villain uses the word spunk, in the old-fashioned way; the audience laughed. The film has a short cameo from Benicio del Toro, who plays a wasted variation of Han Solo. He's terrific but sadly only in it for a few minutes. Once again Gwendoline Christie is completely wasted behind a metal mask, although as before it's ambiguous whether she dies or not. A sequence in which a beloved main character survives certain death by floating through space is either heartwarming or ridiculous depending on how drunk you are. Andy Serkis is terrific as a motion captured CGI character, this time the evil Supreme Leader Snoke. I was surprised to learn that he voiced the character as well. His performance - with lots of close-ups of leering and bad teeth - is probably the best acting in the film. Lupita Nyong'o has a one-scene cameo that was presumably filmed in a shed somewhere. I missed her; her character in Force was entertaining.

Neither Mark Hamill nor Daisy Ridley can act, in a conventional way, but they both have charisma and Daisy Ridley has gusto, so I don't mind. Do you remember Anjelica Huston? She was a better actor than Daisy Ridley, but she didn't have charisma, so no-one remembers her nowadays. The cast of Star Wars were, for the most part, very limited actors, but they had charisma, and that goes a long way. Half of them were acting behind masks and they had more charisma than Anjelica Huston.

The Star Wars films take place a long time ago in a galaxy far, far away, which means that the writers have to be careful with language. The characters can't talk about miles or kilograms or hours or New York, because those things don't exist in the Star Wars universe. In this film a character uses the word "god", and another character uses the word "bastard", which surprised me; the Star Wars universe is usually very po-faced. A gag in which one character pretends to have a bad mobile phone conversation with another character feels un-Star Warsy, and Yoda's cameo involves what may or may not be a metatextual dig at the masses of Star Wars books and merchandise that have appeared since 1977. I don't know. I just don't know, the end. Until next year.

Wednesday, 6 December 2017

The Night Before the Death of the Sampling Virus

Let's have a look at Otomo Yoshihide's The Night Before the Death of the Sampling Virus, a fascinating CD that came out in 1993. It's a collection of 77 disjointed snippets of noise, some of which are only a few seconds long. In the liner notes Otomo requests that you listen to the CD in shuffle mode, or alternatively that you listen to several copies of the record at the same time, which sounds like a clever ploy to sell more records. He also suggests that you smear grease on the disc, but I didn't do this.

I only have one copy of Sampling Virus, so I used a computer to shuffle the tracks. Then I layered them four times, thus:
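For anyone who wants to repeat the experiment, the shuffle step is trivial to script. Here's a minimal Python sketch - the 77-track count comes from the CD, while the seed and the four-layer figure are just my arbitrary choices; the actual layering was done afterwards in an audio editor:

```python
import random

# Four independent random orderings of the CD's 77 tracks, one per layer.
# The seed is arbitrary but makes the "performance" reproducible.
def shuffled_layers(n_tracks=77, layers=4, seed=1993):
    rng = random.Random(seed)
    orders = []
    for _ in range(layers):
        order = list(range(1, n_tracks + 1))
        rng.shuffle(order)
        orders.append(order)
    return orders

for i, order in enumerate(shuffled_layers(), 1):
    print(f"layer {i} opens with tracks {order[:5]}")
```

Each layer is a complete permutation of all 77 tracks, which is arguably more faithful to the liner notes than a CD player's shuffle mode, since nothing gets skipped.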

Whilst that sweet sound assaults your ears contemplate all the music you might be listening to instead. Sampling Virus suffers from a technical problem whereby compact disc players can't seek instantly, so no matter how it's sequenced it always sounds like a series of disjointed sound snippets. A few years later Autechre, working under the name Gescom, released Minidisc, a minidisc that exploited the format's gapless playback. I haven't heard it. The idea sounds a bit naff nowadays. It's probably the kind of thing that was envisaged by the man who devised the CD specification way back in the late 1970s.

Extreme Records still exists. It has nothing to do with Hungary's Arrow Cross Party, despite having a similar logo.

Sampling Virus was released a few months before Meat Loaf's Bat Out of Hell II and was largely overshadowed by that record. I remember hearing Meat Loaf a lot more on the radio. As a consequence it didn't chart. Is it glitch music? I'm not sure. Some of the lengthier tracks have a glitchy sound, but I think they were edited manually with a sampler. The idea of randomly ordering fragments of music isn't really glitch music, it's chance music, which is something else.

I've always thought of glitch as a musical form that exploits the technical errors of musical equipment, as in Oval's Systemisch (1994), which uses CD-skipping sounds. Sampling Virus is essentially Japanese noise music with a chance element.

Will I ever listen to it a second time? Unlikely, but there are few people on this planet who can truthfully say that they have listened to Sampling Virus all the way through, and I am one of them. Last year I wrote about Touch Records' Ringtones, a collection of audio snippets intended for use as mobile phone ringtones - the album was released about eighteen months before audio ringtones took off, and is interesting now for being slightly ahead of the curve. Ringtones has 99 tracks, and although it was never intended to be played in shuffle mode, or even listened to as an album, it works equally well as an accidental sequel to Sampling Virus. Just for fun I decided to apply the same treatment to Ringtones that I applied to Sampling Virus up the page:

Has there ever been a genre more ripe for commercial discovery than Japanese noise music? Katy Perry's most recent album, Witness, has been relatively unsuccessful, but she has a knack of bouncing back from adversity with a new sound. What better seam to mine than Japanoise? Few genres encapsulate modern life more perfectly than Japanese noise music, and where Katy Perry leads, we will follow. Glitch music itself almost threatened to break into the mainstream a few years ago, and although the likes of Merzbow and KK Null are not mainstream figures they could probably sell out the Royal Festival Hall, so Katy Perry should have no difficulty bringing Japanese noise to the world's arenas. She has something that Merzbow doesn't have; attractive breasts.

If you think about it, choral music is a kind of glitch music. The vocal texture of a choir comes from the layering of different voices; if all the voices in a choir were identical, the result would sound like a very loud soloist. Even highly trained vocalists can't produce pure tones, because the human animal is much less precise and repeatable than a machine. We don't mean to be different, we just are.

Wednesday, 22 November 2017

Apple Power Macintosh G5: Flame On

Let's have a look at the Apple Power Macintosh G5, a weighty space heater that can also perform computing tasks. Apple launched the G5 in 2003 with great fanfare, but nowadays it has a decidedly mixed legacy. In 2003 it was a desktop supercomputer that was supposed to form the basis of Apple's product range for years to come, but within three years it had been discontinued, along with the entire PowerPC range, in favour of a completely new computing architecture. The G5 puts me in mind of an ageing footballer who finally has a chance to play a World Cup match; he is called up from the substitute's bench, entertains the crowd for twenty minutes, but the team loses, and by the time of the next World Cup the uniform is the same but the players are all different. Our time in the sun is brief, the G5's time especially so.

I've long been a PC person, and from my point of view the G5 came and went in the blink of an eye. I knew that it had a striking case and a reputation for high power consumption and heat output, and for being 64-bit at a time when that was rare in the PC world, but that's about it. Almost fifteen years later G5s are available on the used market for almost nothing - postage is incredibly awkward - so I decided to try one out. Mine is a 2.0ghz dual-processor model, the flagship of the first wave of G5s. Back in 2003 this very machine was, in Apple's words, "the world's fastest, most powerful personal computer".

Before turning it on for the first time I informed the local flying club that I was about to activate a powerful radio frequency source. I sent a letter to the Home Office and British Telecom and my electricity provider to inform them that I was not a terrorist, and that the souls of the dead are reincarnated on Jupiter, and that I no longer wished to be married to Helena Bonham-Carter. Furthermore I removed all of my clothes out of fear and assumed a defensive posture, attempting to prove to any and all observers that I was no harm to anyone.

I was slightly disappointed when the machine started not with flames but with a muted whoosh, which settled down into a quiet hum. Nothing exploded or shuddered. Masked men did not burst into the room. Instead there was a familiar chime and OS X 10.5.8 loaded up.

My G5 still had its original 160gb 7200rpm hard drive, which was very noisy. The date code says that it was constructed in August 2003.

I have a gadget that can measure the power consumption of electrical items. My desktop PC is a quad-core 64-bit i5-2500k running at 3.3ghz, with an SSD, two spinning hard drives, and an Nvidia GTX 750. I built it back in 2011, and apart from adding the graphics card and SSD I haven't felt the need for more power.

According to my gadget the PC idles at 70 watts and at full whack consumes 150 watts of power. Under load that's about the same as a modern 4K television, perhaps slightly more if I include the PC's monitor, but my PC rarely runs at full power.

The G5 plays DVDs perfectly well, like every desktop computer since the late 1990s. The G5 predated Blu-Ray, but there were a couple of G5-compatible Blu-Ray drives. They were only useful for burning data discs, however, as only the fastest G5s had the necessary combination of processor grunt and graphics hardware to decode Blu-Ray video, and even then finding PowerPC software that would play Blu-Ray films was problematic.

In comparison my new, fourteen-year-old Power Macintosh G5 has two separate 64-bit PowerPC 970 processors running at 2ghz. It has two spinning hard drives and an ancient Radeon 9600. At idle it consumes 140 watts of power - only ten watts less than my PC under load - and when taxed it sucks up 280+ watts. That's almost twice as much as my fridge, and apparently with very heavy processing power consumption goes up to 400+ watts. On the positive side the G5 is quieter than my PC. It has more fans, but they run slower, only going mad every once in a while.
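To translate those idle figures into money, here's a back-of-the-envelope sketch. The eight-hours-a-day duty cycle and the 15p-per-kWh tariff are my assumptions, not measurements:

```python
# Rough annual idle running cost, using the idle wattages measured above.
# Assumed duty cycle and tariff; adjust for your own electricity bill.
TARIFF_GBP_PER_KWH = 0.15
HOURS_PER_DAY = 8

def annual_idle_cost(watts):
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * TARIFF_GBP_PER_KWH

for name, idle_watts in [("i5-2500K PC", 70), ("Power Mac G5", 140)]:
    print(f"{name}: about £{annual_idle_cost(idle_watts):.0f} a year")
# about £31 a year for the PC, about £61 for the G5
```

Thirty pounds a year isn't ruinous, but the G5 achieves it while doing nothing at all.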

OSX 10.5.8 Leopard was the final version of OSX for the PowerPC-era Macintoshes. Leopard is now ten years old. It's a close contemporary of Windows Vista, but whereas Vista was mocked as a bloated mess Leopard was generally regarded as a decent-albeit-inessential upgrade of 10.4 Tiger. As a PC person I'm impressed with how well Leopard and the G5 have aged. The toned-down look of mid-decade OSX Leopard is still attractive. It recognises USB peripherals without popping open an irritating dialogue box, the interface feels smoother and less flaky than Windows XP on a similarly old machine, and the tablet-esque "cover flow" feature feels ahead of its time.

Cover Flow was included in the first iPhone, released at almost the same time as Leopard in 2007, and although Apple has subsequently fallen out of love with Cover Flow it remains part of MacOS nowadays. Leopard runs an obsolete version of iTunes, version 10, which is faster and easier to use than the latest version.

Back in 2007 there was some debate as to whether Tiger (older, less featuresome, but faster) or Leopard (a few more features, more modern, slower) was the best choice for the Power Macintosh G5. I'm in two minds. My hunch is that the minor performance hit of Leopard is insignificant on the faster G5s, but on the other hand very little OSX software was made obsolete in the jump from 10.4 to 10.5 - Photoshop CS4 and Logic Pro 8 also work on 10.4 - so beyond a feeling of completeness there's no pressing practical need to switch from 10.4 to 10.5.

A pair of 1TB Western Digital Caviar Greens I had lying about, with jumpers over pins 5+6 to force them into SATA-I/II mode. The Intel-powered Mac Pro had four drive bays arranged from front to back through the middle of the machine, with "cold swappable" caddies. The G5 isn't quite as elegant - the caddy is fixed in place, and the drives go in backwards, so you still have to plug in the SATA connectors. G5-era OSX had software RAID support, although I suspect an SSD would be more sensible.

I bought the G5 mainly to use Logic Express, a music sequencer. Here's a song I wrote with this combination, as featured in the previous post:

And here's a little video of Logic Express performing the track. A more complex arrangement with masses of reverb would sorely tax the CPU, although it wouldn't be too hard to fix this with creative bouncing:

There was something melancholic about the process of setting up Logic Express. Logic's big selling point is its simple interface and its massive selection of genuinely good built-in sound generators, in particular a decent software sampler that has a range of usable, natural-sounding, but not annoyingly obvious instruments. Logic uses AU "audio unit" plugins instead of the more common VST standard. There was a boom time in the early 2000s when masses of free VST/AU plugins were available, but over the years the market has died off, partially because the audio units included with the modern Logic Pro X are extensive and well-made, partially because the developers have moved on.

So there was something sad about hunting down old plugins that were last updated in 2007, hosted on personal websites that died in 2011, or that remain as shells with (c) 2005 dates on them. It's as if creative electronic computer-based music flourished in the early part of the 2000s and then died suddenly, which is surely not the case, but browsing through dead links gives that impression. HyperUPIC and Sonasphere, for example, appear to have completely left the internet, never to return. Fortunately MDA and DestroyFX still exist. Perhaps I'm out of touch.

The only PowerPC browser actively maintained today is TenFourFox, which is based on Firefox. On my 1.67ghz PowerBook G4 it's painfully slow, but it's much more usable on a G5. It even accesses the web version of Google Drive without grinding to a halt. It has very limited support for online high-def video and Netflix is a distant dream, but on the whole it makes the G5 almost a usable everyday machine, especially if you own a solar power plant. I imagine the late dual-core and quad-core G5s would be pretty speedy.

The problem of course is that the same could be said of almost any cheap laptop or Windows tablet released during the last ten years, minus the bit about having a solar power plant. The laptop would probably have more USB ports, and might well have USB 3, which would make up for the lack of the G5's second drive bay. Anandtech made this very point when they tested one of the later G5s against a 2010 Mac Mini, which was generally faster while using roughly one-tenth the power. My late-2008 MacBook Pro, for example, is slightly less flexible than the G5 but runs MacOS High Sierra and is much faster.

Fourteen years later the exterior of the case is still stunning. The interior has however tarnished a bit.

I decided to benchmark my G5. I ran the trial version of Geekbench 2, an older benchmarking utility that runs on different platforms. Geekbench is as old as the G5, and in a neat coincidence it uses the lowest-specification Power Macintosh G5 as its benchmark, with a score of 1000. Perhaps the author wrote the original version on a G5. I don't know.

My 2.0ghz dual-processor G5 scores 1645, which makes it about 65% faster than the 1.6ghz entry-level model, or at least the benchmark score is 65% higher. That's reasonable given that the 1.6ghz model had a lower clock speed, slower memory, a lower bus speed and only one processor.

The G5 again. It was divided into three thermal zones. The front of the machine is to the left. From top to bottom, left to right, the top compartment has a DVD drive, a pair of fans, and two hard drives. The middle compartment has a fan plus mono speaker mounted in front of a 56K modem, followed by space for PCI-X cards and the AGP graphics card. At the rear of the machine is a catch that releases the access panel.
The lower compartment has the memory sitting beneath a Wi-Fi/Bluetooth card, then the CPU modules and their heatsinks, then another pair of fans. The base of the case contains a hefty 600w power supply.

The 17" 1.67ghz PowerBook G4 I wrote about last month scores 883, drawing just 47 watts whilst doing so, which suggests that the 1.6ghz Power Macintosh G5 wasn't much cop. Putting it another way, my G5 draws six times more power than a contemporary G4 laptop, but benchmarks only twice as fast. I realise I'm comparing two different fruit, but the G5 feels like an attempt to achieve performance gains with brute force rather than sophistication. It also feels like the result of two companies with different product release schedules trying to reach an uneasy compromise. The G5's deficiencies were masked by the fact that contemporary PCs were just as power-hungry, but therein lies a history lesson.
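The fruit comparison in numbers, taking the Geekbench 2 scores above and the G5's 280-watt load figure from earlier:

```python
# Performance-per-watt from the figures quoted in the text
# (Geekbench 2 scores, measured power draw under load).
g5_score, g5_watts = 1645, 280  # dual 2.0GHz Power Mac G5
g4_score, g4_watts = 883, 47    # 1.67GHz PowerBook G4

print(f"G5 speed advantage: {g5_score / g4_score:.1f}x")  # 1.9x
print(f"G5 power penalty:   {g5_watts / g4_watts:.1f}x")  # 6.0x
print(f"Geekbench points per watt: G5 {g5_score / g5_watts:.1f}, "
      f"G4 {g4_score / g4_watts:.1f}")
```

On points-per-watt the laptop wins by a factor of three, which rather makes the brute-force case for me.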

In the 1980s and 1990s home computers didn't use much electricity. No-one cared about "thermal design power" and most computers were either air-cooled or used a single fan to push air over the power supply unit. By the 2000s however heat became a major issue. Intel's Pentium 4, introduced in 2000, was a small step in terms of performance improvements over the Pentium III but a giant leap in heat output. Even without overclocking the typical Pentium 4 system required a PSU fan, a CPU fan, a case exhaust fan, perhaps intake fans and a fan on the graphics card. I call the early-mid 2000s the time of nine fans.

Eventually the Pentium 4's design team reached something called the "power wall", whereby the gains from extra clock speed were outweighed by the difficulty of cooling the chip. Furthermore a phenomenon called electromigration, whereby circuitry degrades at higher temperatures, started to eat into the lifespan of the chips. This is one of the reasons why modern CPUs tend to use multiple, modestly-clocked cores rather than one single very fast processing unit.

The Power Macintosh G5 did actually have nine fans. My Power Macintosh G5 has nine fans. Two are hidden away in the base of the machine, where they are attached to the power supply unit. Four fans front and back draw air over the gigantic CPU heatsinks. There's a single fan in the middle, which airs the PCI cards, and two small fans blow air over the hard drives and rear of the motherboard.

That's nine fans. I've counted. Two plus four (six) plus one (seven) plus two (nine) equals nine. The case has an array of temperature sensors that make sure everything is cooled effectively. Each machine apparently has a unique thermal profile stored somewhere in its firmware. Some modifications require that the thermal profile is recalibrated before the fans work properly again, and you can only do that with Apple's G5 diagnostic tools, which aren't publicly available.

Photoshop CS2 - technically Bridge - plus TenFourFox. In this shot I've had to use Adobe's DNG converter to convert my camera's RAW files. Good luck finding the last version of Adobe's DNG converter that supports the PowerPC! The G5 will run up to CS4, which is still competent nowadays.

The G5's case is a clever piece of design that works well, but there must have been a better way. Imagine if the time and brainpower spent dealing with the G5's heat generation had been applied to other problems instead. Back in the 1990s the PowerPC chip was touted as an efficient, RISC-based alternative to the Intel 80X86. It ran at lower clock speeds than contemporary Pentiums but did as much work. Successive generations of the PowerPC chip kept Apple Macintoshes competitive during the late 1990s, but the architecture started to lag in the early 2000s with the introduction of second-generation Pentium 4s and efficient X86 clones from AMD. The G3 and G4 remained competitive mobile chips but Apple was in danger of having its desktop machines fall behind.

The PowerPC G5, formally known as the IBM PowerPC 970, was Apple's great white hope. It was announced at Apple's 2003 keynote presentation, which is available on Youtube:

The keynote is like something from a parallel world. Nowadays Apple's product announcements are full of 3D face recognition and all-glass backing and dual-lens cameras and rose gold; in 2003 the company chose to highlight the G5's bandwidth and bus speed and its advanced chip fabrication technology. Nowadays Apple doesn't talk about cost - if Sir or Madam baulks at the price of a new MacBook Pro, perhaps Sir or Madam might consider going elsewhere - but in 2003 the G5 was sold as a cheaper alternative to an equivalent dual-Xeon PC. There was also a rackmounted file server version of the G5, the XServe, which is something the modern Apple would never dream of releasing. This was a time when Apple was fond of pointing out the UNIX roots of OS X.

On paper the G5 looked terrific. The PowerPC 970 was a 64-bit chip attached to a system bus that ran at lightning speed, with a multi-processor-enabled architecture that could access up to 8gb of memory, with SATA hard drives and an awesome case. All except the most basic Power Macintosh G5 machines had either two CPUs or a dual-core chip - in one case two dual-core chips - and later machines increased the memory limit to 16gb, with the last batch of G5s adding support for PCI-e. Even in 2017 the idea of a quad-core desktop PC with 16gb of memory plus SATA and PCI-e sounds current.

The G5's considerable weight is focused on these little pads. There were aftermarket cork pads, but I've used masking tape to wrap a pair of old cycle gloves around the handle-stands, which doesn't change the fact that in 1998 The Undertaker threw Mankind off Hell In A Cell, causing him to plummet sixteen feet through an announcer’s table.

As mentioned earlier Apple's television adverts claimed that the G5 was "the world's fastest, most powerful personal computer", although here in the United Kingdom the ITC objected to that claim and forbade Apple from repeating the ad. The single-processor machines weren't particularly impressive, but the G5 was new and hopefully had room for expansion, whereas by 2003 the Pentium 4 and Xeon were several years old.

But that seems to be where things went wrong, because the G5 didn't have room for expansion. Or contraction, because no matter how hard IBM tried they couldn't produce a chip that would fit into a laptop. The PowerPC 970 was simply too power-hungry, and when underclocked it didn't run much faster than the G4, and the 64-bit architecture was of questionable benefit in a mobile context. Many years later ARM demonstrated that it was possible to make incredibly frugal, powerful mobile RISC chips, but that was still science fiction in the early 2000s.

The G5's 64-bit architecture was something of a false start. Only a handful of applications used the G5's 64-bit address space, and although OSX could access huge amounts of memory it didn't have a 64-bit kernel for several years. When Apple abandoned the PowerPC they temporarily took a step back into a predominantly 32-bit world with the Core Duo, only fully embracing 64 bits a few years later, with the Core 2 Duo and OS X 10.7.

On a G4 PowerBook, Logic Express is essentially just a multi-track audio / MIDI sequencer. On a G5 however it will run lots of instruments and effects at once.

The G5's lack of mobile mojo was unfortunate in a world that was gradually pivoting towards mobile computing and mobile internet, doubly so given that half of Apple's profits came from its laptops. It's fascinating to speculate whether IBM's failure to make a mobile G5 stemmed from genuine technical limitations or from a simple lack of motivation. In the past IBM had made mobile versions of the 80386, and it had even sold RISC-powered versions of the ThinkPad laptop, but that was a long time ago, and IBM had no other use for mobile chips. In the early 2000s Apple had turned the corner into profitability but from IBM's point of view Apple was still just another customer, of relatively minor importance. IBM might have spent a fortune setting up a dedicated POWER mobile team, but to what end?

And there was something else. Intel had launched the Pentium 4 with great hopes that it would still be around in ten years, but development hit a brick wall in the early 2000s, and for a brief period AMD seemed poised to become the dominant player in the X86 market. Intel's problems with the hot, power-hungry Pentium 4 mirrored those of IBM with the G5, but whereas IBM was uninterested in a mobile G5 Intel made three concerted attempts to stuff the Pentium 4 into laptops, failing each time. To its credit Intel wasn't too proud to admit defeat. After going back to the drawing board the company came up with the Pentium M, which was released to the world in 2003, just as Apple was putting the finishing touches on the Power Macintosh G5.

The Pentium M was perhaps the most influential CPU of the early 2000s. Very few people recognised it at the time. With the exception of a few small-form-factor PCs it was generally fitted into laptops, which were of limited interest to performance enthusiasts. Laptops had a reputation for being underpowered; no-one in 2003 expected that the Pentium M would be any good. Its name was easy to confuse with the earlier Pentium 4M, and furthermore Intel insisted on downplaying the Pentium M in favour of the Centrino platform. Part of this unwillingness to publicise the Pentium M might have come from the fact that the design owed more to the Pentium III than the Pentium 4. I imagine Intel was unwilling to make the Pentium 4 look bad given that it was still theoretically their desktop flagship.

But not for long. The Pentium M didn't just outperform the Pentium 4M mobile chip, it also benchmarked within a few percent of the desktop Pentium 4, while consuming less power and generating less heat. After a brief diversion with the Pentium D Intel essentially gave up on the Pentium 4 in favour of a multi-core development of the Pentium M, which was sold as the Core architecture. The mostly-mobile Core Duo and desktop-oriented, 64-bit Core 2 Duo went on to re-establish Intel's dominant position in the X86 marketplace.

Apple had, as a side project, already ported OSX to the X86 architecture. There were rumours that Pentium 4-based development machines actually ran OSX faster than the G5. At some point Apple's engineers must have become privy to the Pentium M development roadmap, and in mid-2005 Apple publicly announced that it was saying goodbye to the PowerPC architecture in favour of Intel.

Given the G5's notorious heat issues the switch to Intel was less of a shock than it might have been. I have the impression that long-term Apple fans are fond of the PowerPC era and nostalgic for the likes of the dual-processor, mirrored drive door G4, but not blind to the G5's faults. Apple fans aren't like Amiga fans, thank goodness. They know when to admit defeat.

The fans pull straight out. Further work generally isn't necessary - it's easy enough to blow dust out of the heatsinks, and the airflow tends to keep the G5's interior surprisingly clean. Most other faults are terminal and can be fixed by throwing the G5 into a deep bog and buying a new one instead. The single-processor models just had the top heatsink and fan. The liquid-cooled models enclosed the CPUs and cooling unit in a single large block.

The Pentium M's life ran alongside the PowerPC 970/FX used in the G5. They were both launched in 2003 and ended their lives in 2005. During that period the Pentium M underwent a die shrink and scaled from 1.3ghz up to 2.26ghz, roughly doubling in performance in the process. The PowerPC 970 also underwent a die shrink, but its performance increases were more modest until the very last wave of multi-core G5s, which were impressively fast but not enough to change Apple's mind.
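
As a back-of-the-envelope check, clock speed alone doesn't account for that doubling. A trivial sketch, using the launch clock of the original Banias core and the top clock of the later Dothan core:

```python
# Rough check on the "roughly doubling" claim: how much of the
# Pentium M's gain came from clock speed alone? Banias launched at
# 1.3GHz; the fastest Dothan topped out around 2.26GHz.
clock_ratio = 2.26 / 1.3
print(round(clock_ratio, 2))  # 1.74
```

The rest of the gain presumably came from the die shrink's doubled L2 cache and the faster front-side bus on the last models.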

The first wave of G5s consisted of a 1.6ghz entry-level model, a 1.8ghz model just above it, and a 2ghz dual-processor flagship. The second wave, launched in mid-2004, introduced the more efficient 970FX processor but was otherwise very similar to the first wave, with a 2.5ghz model sitting at the top of the range. Performance-wise the second wave seemed to be only slightly faster than the first. A third wave came out in mid-2005, but again the machines were much the same as their predecessors. The last batch of G5s emerged in October 2005 and introduced dual-core processors and PCI Express ports. They were launched when it was already known that Apple was going to abandon PowerPC and were therefore doomed to be the last of the line.

Apple also launched a couple of orphan G5s - a 1.8ghz Dual Processor machine that filled out the bottom of the range, and a single-processor 1.8ghz model that used iMac components in an attempt to sell a budget model. More than half of the fourteen different G5 models ran at 1.8ghz or 2.0ghz. The final, dual-core 2ghz model was only slightly faster than my first-generation 2ghz dual-processor G5; the last 1.8ghz model was actually slower than its predecessors. Meanwhile the later high-end models needed liquid cooling units to tame their incredible thermal output.

Within a few years some of the liquid cooling units developed leaks that could silently corrode the machines away from the inside. Some units were more reliable than others, but nonetheless the effect on the resale value of liquid-cooled G5s was dramatic. If you don't want to bother with liquid cooling, the most powerful air-cooled G5s are the third-wave dual-processor 2.3ghz models and the 2005 dual-core 2.0ghz and 2.3ghz models, of which the dual-core models are the most desirable due to the inclusion of PCI Express.

The access panel is a rigid, weighty chunk of aluminium. If the entire G5 run had fallen through a timewarp to Nazi Germany circa 1941 the scrap aluminium could have filled the sky with Messerschmitts.

Upgrading the G5 is generally easy. In ascending order of difficulty, easy first:

All but the most basic models had eight RAM slots, which accept memory in pairs, working from the inside out. They aren't picky; you can mix brands and capacities. My G5 has the two 256mb sticks it was sold with, plus a pair of 512mb sticks, plus two pairs of 1gb sticks for a total of 5.5gb. The later models could accept up to 16gb of memory, although from what I have read going beyond 4gb under Leopard gives only a minimal improvement, and only then if you're using something like Photoshop heavily. The memory is air-cooled and does tend to get hot. My desktop PC has four slots but two of them are blocked by hard drive cables; the G5's memory is easy to reach.
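
For what it's worth, the totals are easy to tally. A trivial sketch of my machine's configuration, one tuple per matched pair, sizes in megabytes:

```python
# My G5's eight RAM slots, filled as matched pairs.
# Pairs must match internally, but the pairs can differ from each other.
pairs = [(256, 256), (512, 512), (1024, 1024), (1024, 1024)]

assert all(a == b for a, b in pairs)  # the G5 insists on matched pairs
total_mb = sum(a + b for a, b in pairs)
print(total_mb, "mb =", total_mb / 1024, "gb")  # 5632 mb = 5.5 gb
```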

All models take SATA hard drives, but the early models were designed for the SATA-I standard. This means that if you use a modern SATA-III drive, you have to put a little jumper over pins 5-6 to enable SATA-I/II compatibility. I have a packet of these little jumpers. I'll paste some of them into the next line so that you can use them:

The G5 is fussy with SSDs, and OS X 10.5 doesn't support TRIM, and even though SSDs are now trivially cheap I had a pair of old Western Digital Caviar Green HDDs sitting about doing nothing, so I used them instead. I cloned the operating system across. My original G5 hard drive is in theory faster than the Caviar Green (7200rpm vs 5400rpm), but it was very noisy, and in my personal experience new slow drives tend to be better performers than old fast drives.

The G5 uses PCI-X slots, which are backwards-compatible with original PCI cards. PCI-X was a dead-end standard common in servers; it was obliterated by PCI Express. In 2017 you will only find PCI-X cards on the used market. My G5 came with a four-port eSATA PCI-X card that connects to external eSATA hard drives and a Mark of the Unicorn audio interface card that's useless without an external hardware module that I don't have. Most PCI-X cards were for storage, ethernet access, RAID controllers and the like. Sadly there don't seem to be any PCI-X USB cards.

eSATA, by the way, was essentially SATA for external drives. As with FireWire 800 it was competitive with USB for a while, but was eventually overshadowed by USB 3.0. eSATA doesn't carry power, so you can only use drives and drive arrays that have their own power supplies.

The G5 has three USB 2 ports, two FireWire 400 ports, and a FireWire 800 port. In my life I have never used FireWire to transfer data. I will probably go to my grave having never used FireWire.

The initial wave of G5s shipped with graphics cards that had a DVI port and an ADC port. What was ADC? It was a proprietary Apple thing that, like so many proprietary Apple things, was technically clever enough that it didn't seem like deliberate lock-in, but nonetheless didn't even take off within the Apple ecosystem, let alone outside it. Only a handful of Apple monitors supported it and the standard was essentially dead even before the G5 came out. There are ADC-DVI adapters available, but they're too expensive to make sense. The later G5s used graphics cards with dual DVI outputs; the most powerful had a dual-link DVI port that could drive the 30", 2560x1600 Apple Cinema Display.

Early G5s used AGP; later models used PCI-E. My G5 has an air-cooled Radeon 9600. In theory I could upgrade the card, but in practice the G5 only accepts Macintosh-only versions of the various graphics cards that were available, and they're rare on the used market because most of them were sold with the G5 rather than separately. There's not much point upgrading the G5's graphics card unless you want to use dual monitors. OS X might feel slightly snappier. The few games available for the G5 might run faster. I would be wary of the extra heat and power draw.

Everything Else
My G5's optical drive is a bit flaky. Sometimes it reads a disc, sometimes it doesn't. Replacing it is apparently easy but I'm not going to bother. There was a brief period in the early 2000s when it was feasible to back up data to a writeable DVD, but with the availability of cheap SD cards and USB sticks there's no point any more.

Fans, brackets, antennae and other components are still widely available. The G5 is in theory entirely replaceable - you can build a new one from spare parts and an empty case, if you have a copy of Apple's thermal calibration software - but there's no point when so many G5s are available on the used market.

As a long-term ownership proposition the G5 is problematic. The fastest models were outpaced by their Intel replacements either immediately or within a couple of years, and are thoroughly obsolete nowadays; the G5's power consumption is a reminder of a time when oil was cheap, interest-only mortgages were a fantastic idea, and the economy was not only going well, it would continue to go well forever. Using the G5 as a file server or overnight rendering machine is an expensive proposition. As a space heater it's less efficient than an actual space heater unless you do useful work with it.

On a more esoteric level the G5's fantastic case can be stripped out and used to house a PC, although it's tricky because the ports and buttons don't conform to the PC standard. This chap here chopped his down and made a cute G5-based mini-PC. Alternatively you could gaffer tape a Mac Mini to the inside and just run all the cables through the cooling holes, using a little stick to press the power button. Two G5s joined with a plank of wood make a neat coffee table. Turned on its side, stacked, and suitably modified, the G5 can be used as a chest of drawers. The G5's metal case generally resists corrosion but isn't stainless, so its marine applications are limited.

Seriously though, the G5's aluminium case is now both its greatest strength and its greatest weakness. On the downside it's too huge to send through the post, but on the upside it's a solid, genuinely impressive work of engineering that feels useful as a spare part, if only as an objet d'art. Apple is routinely mocked for putting style ahead of substance, but the G5's case is a superb example of a functional design that works well and is also beautiful to look at and indeed think about.

Even with the fan in place there's still a large empty space in front of the memory chips. Enterprising storage vendors sold brackets that could house extra hard drives in this space, although bearing in mind that the memory chips get hot I'm sceptical that it would have been a good idea. I often wonder if Apple could have sold a scaled-down G5 case purely as a robust RAID enclosure, with fans etc.

Back to Geekbench. My 2003 2.0ghz dual-processor G5 scores 1645 while consuming 280+ watts of power. In comparison my late-2008 unibody MacBook Pro laptop, powered by a 2.4ghz Intel Core 2 Duo, scores 2758, drawing 45 watts in the process. By that time the Intel-powered, eight-core Macintosh Pro was Geekbenching values of almost 10,000. My desktop PC, an Intel i5-2500k, Geekbenches at 8238, consuming 150 watts. One criticism levelled at the Intel-powered Macintosh Pro was that despite its class-leading power, the edge it had over ordinary Core 2 Duo / i5-powered Intel hardware didn't justify the cost, but that's another argument for another blog post.
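
Dividing one figure by the other makes the point starkly. A quick sketch, using the scores above and my rough wattages (they're ballpark figures, not lab measurements):

```python
# Geekbench points per watt for the machines mentioned above.
# Scores are from the text; the power figures are approximate.
machines = {
    "2003 dual 2.0ghz G5": (1645, 280),
    "2008 MacBook Pro (Core 2 Duo)": (2758, 45),
    "i5-2500k desktop PC": (8238, 150),
}
for name, (score, watts) in machines.items():
    print(f"{name}: {score / watts:.1f} points per watt")
# The G5 manages about 5.9; the MacBook Pro about 61.3.
```

Five years of progress bought a tenfold improvement in efficiency.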

Judging by Everymac's figures the very last, quad-core G5 Geekbenched at 3316, which is very impressive for a machine now twelve years old. I imagine it would still have a niche in a recording studio, if you had the aforementioned MOTU audio interface and the appropriate version of Logic Pro and were so familiar with the workflow that it would be disruptive to change. I shudder to think of that setup's power consumption over twelve years of use.

What's the logical next step from a G5? That's a difficult question. Apple intended for you to replace your G5 with an Intel-powered Mac Pro. The Mac Pro used the same basic case design as the G5, albeit that the interior was rejigged. It was conceptually much the same as the G5, combining multiple processors and multiple drive bays with a plethora of RAM slots and ports. However the switch to Intel coincided with a new appreciation for frugal computing, and many former G5 owners opted for one or more Mac Minis instead, using USB and latterly Thunderbolt for external storage.

The G5-descended Macintosh Pro was discontinued in 2013. Apple then intended for professional users to adopt the next-generation Mac Pro, a tubular monstrosity that defies description, but in practice professionals often switched to the 5K iMac, assuming they remained with the Macintosh platform at all.

The problem is that the basic design philosophy of the G5 and Mac Pro - monster processors, tonnes of internal storage, all in a big case - is a throwback to the past, because for all but edge cases standard desktop processors are fast enough and faster ports mean that external storage isn't appreciably slower than internal storage any more. Furthermore The Cloud continually eats away at the idea of a fat client of any kind.

From top to bottom the graphics card has DVI and ADC ports. Then there are holes for the wi-fi and Bluetooth antennae, although my machine connects to the internet without them. Then SPDIF, line out, line in, USB 2, FireWire 400, FireWire 800, Ethernet, Modem. The small front panel has the power button, a headphone port, USB 2, and FireWire 400. I've plugged in a USB hub, because three USB ports isn't enough.

Nowadays the G5 is a magnificent example of excess. The Wild Bunch of Sam Peckinpah's classic Western "came too late and stayed too long"; the G5 came too late, but with a lifespan of only three years its time was brief.

And gone forever, because fifteen years later the public's appetite for electricity-guzzling computers is about as great as that for petrol-guzzling cars, i.e. nil. Some G5s probably soldier on in recording studios, and if you happen to be given one for free and you're willing to leave the television turned off and never use the oven it's a perfectly usable albeit very slow desktop computer. It's the cheapest Macintosh desktop tower that's still generally usable.

But even a cheap Intel Atom-powered Windows tablet outperforms it, and once you get bored you face the difficult prospect of selling it on again. If you live near a small airfield they might be able to use it as a means of de-icing aeroplanes. When I tire of mine I will offer it to the Royal Navy as a potential replacement for their amphibious assault craft. On the one hand aluminium has a tendency to melt at high temperatures, but on the other hand the RN is strapped for cash, besides which ships have access to huge amounts of seawater, the end.

EDIT: After writing the above I decided to see if I could get Linux working. Until a few years ago Linux generally had PowerPC support, although this was more theoretical than actual - several distributions claimed to support the PowerPC architecture, but very few people seemed to have got it working. Debian was the easiest so I tried Debian. I didn't bother with dual-booting; I just popped in a spare hard drive and installed on to that, thus swapping the UNIX-based OS X for a home-made UNIX clone. Unfortunately it wasn't much cop.

Firefox on Debian on a Power Macintosh G5 in 2017, failing to load a page. It just stops.

I used Debian 8, because PowerPC support was dropped from Debian 9. After some messing around with partitioning and then installing the appropriate firmware for the AirPort Extreme card I had a working computer with wi-fi, but it was surprisingly slow and clunky for a dual-processor Pentium 4-class machine with seven gigabytes of memory. Perhaps it wasn't using the graphics card properly. In comparison OS X sped along.
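
One plausible culprit, beyond the graphics driver: the PowerPC is big-endian while x86 is little-endian, and software that only ever gets tested on x86 tends to accumulate subtle byte-order bugs. A trivial illustration of the difference:

```python
import struct

# The same 32-bit integer, laid out in memory PowerPC-style
# (big-endian) and x86-style (little-endian).
n = 0x12345678
big = struct.pack(">I", n)     # how a G5 stores it
little = struct.pack("<I", n)  # how an x86 PC stores it
print(big.hex())     # 12345678
print(little.hex())  # 78563412
```

Code that reads raw bytes while assuming one layout or the other silently produces garbage on the opposite architecture, which may go some way towards explaining why the PowerPC ports were so poorly exercised.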

Debian on the PowerPC has a modern version of Firefox, albeit not the most modern version, but it was unusably slow and didn't render pages fully. Most other browsers either crashed immediately - perhaps JavaScript was the problem - or were only a step removed from text-mode browsing. Could I have fixed this by spending my precious time reading five-year-old forum posts and typing strings of arcane characters in a terminal window? I don't care. There's no point. "Software sells hardware", as the saying goes. I bought a MacBook Pro with my own money purely to use Logic; I have an Android mobile phone because it runs OsmAnd. I couldn't care less about Debian because it has nothing I want, so into the bin it goes.