
Saturday, June 20, 2015

Do Androids Dream of Electric Sheep...?

The above is the title of the Philip K. Dick novel from which Blade Runner took inspiration and, finally, we can answer the question that it posed.

No, they don't.

Apparently, they dream of dog-fish, camel-birds and pig-snails...

Monday, July 23, 2012

Doing no evil

"Don't be evil" has always been the semi-official motto of Google—to those who believe in the company, it probably still is.

However, those of us in the tech industry have been highly sceptical (to say the least) for some time, and a couple of recent incidents have come to light that I would like to highlight.*

The Children's Furniture Company has recently gone out of business, directly blaming the change in Google's search algorithms for the decision.
As we are a purely web based business, it has always been important to be somewhere near the top of Google for keywords such as 'bunk beds' and 'childrens beds'. Google is where all our customers look and up until May, that's exactly where we were - page 1. To get to that slot is highly prized and competitive and over the past few years we have both advertised on Google and like many companies, used SEO specialists (Search Engine Optimisation as it's called), to help move us up the natural Google listings. A company we used about two years ago put some external links onto our site that Google now considers as webspam and for this it has demoted us to nowhere land (along with 1,000's of other businesses as well).

It seems that we cannot take these links off and the only option open is to completely rebuild the site. Sadly this would take too much time and too much money, whilst not being able to sell furniture at the same time. So as we couldn't see how people would find us; and as we were about to have to invest in a heap more stock, we decided that there was no option but to shut.
This state of affairs was brought to my attention by an SEO professional writing at SearchWatch.
Were the Children’s Furniture Company a good company? Who knows? Certainly not Google. Nor did they care. What is it to Google if they fulfilled all their orders, had great customer satisfaction and a satisfactory range of products? The algorithm trumps all and thus customer choice is lessened and another few mouths are on the dole queue.

If you think that the Google of 2012 is a search engine, you’re fooling yourself. It is an advertising channel. It was only yesterday that a screengrab was doing the rounds showing that just 14% of a Google search result is made of organic listings. The rest? Adwords and Google’s own properties—YouTube, News, Shopping and so on. Throw in the increasing personalisation and localisation of results and tie-ins with review sites and you’re left with not much space for the little guy. Even the long tail has been ceded to such “quality” sites as eHow and Yahoo! Answers, leaving the middle ground for people to fight over the scraps that fall from the top table.

And maybe that’s fair enough. Businesses used to close all the time because they couldn’t afford to advertise during Coronation Street and no-one cried about it very much. That’s an expensive way to get in front of a million noses and get your brand known, and it was always closed to small business. If Tesco decided they were going to start selling paint and rollers, then your little round-the-corner DIY shop was often toast by the time the 3rd ad for Tesco Paint was on rotation during Hollyoaks.

Google was supposed to be different: a leveller. If you sold paint out of your little shack on the A650, you could go toe to toe with Wickes, B&Q and any retailer in the world so long as you paid your dues, built a good site, offered good service and worked within Google’s guidelines. And for a while, that held up. It’s still the message they peddle.

But I think we can safely call bullshit on that notion now.
Indeed. And that is only to be expected: Google's responsibility is to its shareholders.

But there have been a number of actions by Google, in the last few years, that blow apart their claim to be an ethical company.

As I've said, those of us in the tech industry always thought that this "don't be evil" bullshit was... well, bullshit.

Let us be clear about this: Google is not primarily a technology firm.

Google derives 96% of its revenues from advertising: it is in Google's interests to provide you with free products, which enables it to show you adverts, which persuade you to buy its sponsors' products.

There is nothing wrong with this: and, assuming that I must be shown adverts, I would rather be shown adverts for products that I might be interested in.

However, in pursuit of this goal, Google has made a number of questionable technological and business decisions: decisions that might be understandable, but which most people would find difficult to reconcile with the company's "don't be evil" motto.

But what about Google's reputation as a hotbed of technological invention? Apart from its search—which is becoming more and more polluted by financial interests—what wonderful, successful technologies have they come up with recently?

Yes, GMail and Reader were built and deployed by Google themselves—and they remain very good products, integral to my daily workflow.

And I am writing this—ironic, I know—on a Google product. But Blogger was invented and deployed by others, and bought by Google.

I also use Feedburner—also invented by others and then bought by Google. And the same applies to YouTube.

Picasa? Mostly lost out to Flickr (and now, arguably, Instagram) but was, in any case, invented by Lifescape.

Google+...? Does anyone actually use it? Regularly?

What we did admire was the way that Google churned out good products: or bought them, made them freely available and improved them. But the reaction to Google's recent acquisition of Sparrow—a brilliant Mac OS X email client—shows that even this reputation is at an end.

Daring Fireball has only this pithy comment to make:
Congratulations to the Sparrow guys, I guess, but this gives me The Fear for Sparrow’s future.
Sure enough, Sparrow will no longer be updated and developed. This is, as Matt Gemmell points out, a success for the Sparrow developers—it's what, I imagine, they were aiming for. It is, nonetheless, an acquisition intended to shut down competition.

More damagingly, online tech magazine Boing Boing goes further—promoting this short but entertaining video.



The point that I am trying to make is that Google has lost whatever respect it had amongst many technologists—either for its technological prowess, or its radical attitude.

UPDATE: I knew I'd forgotten something—whoops! Android, of course, requires a post all of its own. However, there are three things to consider when assessing Google's conduct here:
  1. Google loses money on Android.
  2. Android looked very different before and after the launch of the iPhone.
  3. Android is a massively fractured platform that, with every iteration, is demonstrating why the Apple "walled garden" ecosystem—and control over carriers—is, in my opinion, better for consumers.
I will address these points in more detail in a later post.

The other product raised is Chrome: this is a browser running on the open-source WebKit rendering engine, which is sponsored by Google—but also by Apple and a number of other big corporations. It is not a Google-alone product any more than, for instance, Blogger is. Although the V8 JavaScript engine is also worth discussing...

* I realise the irony of the fact that I am writing this on a Google product, and using a video hosted on another Google product.**

** I also realise that the irony is lessened slightly by the fact that Google did not invent or deploy these products—it bought them. Yes—all of them: Blogger, Feedburner and YouTube were all acquired, not invented, by Google.

Tuesday, June 19, 2012

Surface detail

You might be unsurprised to know that, as a Mac fan, your humble Devil is pretty underwhelmed by Microsoft's new Surface tablet.* Although, to be fair, the video is not as cringe-inducingly embarrassing as Microsoft's usual promos.

It does underscore one important thing, of course—that Microsoft has understood that having control of both hardware and software makes it easier to create a great user experience. Further, Microsoft are trying to lock down some of the software elements too—restricting the choice of web browsers on the ARM version of Metro.

Anyway...

Many media outlets are hailing the Surface as Microsoft's competitor to the iPad. Whilst I think some serious competition to Apple's iPad is a good thing, I share Justin Watt's opinion that Microsoft is not, in fact, competing directly with the iPad as such.

Whilst I know from personal experience that people in businesses are loving their iPads and iPhones, as Justin points out, the "enterprise" IT-integrated iPad experience is very locked down—for reasons of "security", of course.

Basically, most IT departments that I have encountered are highly conservative at best: at worst, they can be lazy, hide-bound and arrogant. Personally, I think that many IT departments are signing their own death warrants**, but they will be around for a good long time yet.
Enterprise employees can be inspiring, but that depends on said enterprise that they work for. A place that fosters creativity, thinking outside the box, and new ideas leads to happy workers who are open to change if it means making their day to day routine more enjoyable. Let’s just say that having 30,000+ workers doesn’t make for an accommodating work environment for new ideas and embracing change. Integrating iOS and thinking of mobile development in parallel with desktop software development for this many users isn’t an easy or quick task and for that reason the Surface may succeed very well in the enterprise. It’s more of the same. Buried underneath that beautiful Metro interface is Windows. Pure Windows able to run that software developed in 1992, not needing Citrix remote desktop apps, and not needing 100’s of new apps bought to open Office documents that don’t format or display properly on iOS.

Goliath Wants Your Market

In enterprise, Apple is David. The Goliath in enterprise that is Microsoft wants Apple’s market in mobile enterprise. Apple hasn’t entrenched itself nearly deep enough in enterprise. Microsoft has the ability to successfully corner the mobile enterprise market just as it has with the desktop enterprise market. Goliath is bringing the Surface to the table and inside of the enterprise market, it has a fighting chance of succeeding.
I agree with this: the Surface will be adopted largely in enterprise environments.
Outside of enterprise, I think it’s a different story. I think the Surface will fail miserably, but that’s another post I intend on publishing later this week.
I'll look forward to that.

* For a start, there is no firm availability date, nor any indication of pricing.

** In the businesses that I work with, I am finding that more and more CEOs and executives are becoming tech-savvy. And, in all too many organisations, the IT departments are fighting the management.

The result: more and more outsourcing of entire IT functions. This is especially happening amongst many of the smaller, nimbler organisations, but larger ones are also starting to adopt this trend.

And, of course, if your IT supplier says that they won't support the CEO's shiny new iPad, then it is far easier to change supplier than it is to fire your IT department.

Especially when more and more of your productive work environments are outsourced to web suppliers or Cloud applications.

Tuesday, June 12, 2012

Apple's Mac Pro update

At the Worldwide Developers Conference (WWDC) yesterday, Apple released a slew of hardware upgrades.

Many of them look very impressive—not least the flagship retina display MacBook Pro. However, your humble Devil has always been a Mac Pro user—I require the expansion capabilities that the power tower offers—and, in this respect, I can only echo Shawn Blanc's comment...
Not much new — no USB 3.0 ports like the whole MacBook lineup got today, and still no Thunderbolt. Why did Apple even bother?

Quite. This is the first update that the Mac Pro has had in two years, and Apple have elected to omit all of the latest pro hardware features.

I would like to think that Apple have given the standard model a small speed bump, and little else, simply to keep sales going whilst they prepare for a massively revised model later in the year.

However, I fear that this is not the case. Instead, this derisory update lends credence, I think, to the rumours of the Mac Pro's imminent demise.

UPDATE: I may have called that too soon. Via Daring Fireball, I see that MacWorld reports that a customer sent an email to Apple CEO Tim Cook, essentially stating something similar to the above, and Cook replied thusly:
Franz,
Thanks for your email. Our Pro customers like you are really important to us. Although we didn’t have a chance to talk about a new Mac Pro at today’s event, don’t worry as we’re working on something really great for later next year. We also updated the current model today.
We’ve been continuing to update Final Cut Pro X with revolutionary pro features like industry leading multi-cam support and we just updated Aperture with incredible new image adjustment features.
We also announced a MacBook Pro with a Retina Display that is a great solution for many pros.
Tim

This is good news.

Regardless of the actual state of the hardware, I love that Apple executives reply to their customers directly like this: Steve Jobs did the same thing.

As far as I am concerned, if the CEO of a multi-billion dollar company can be bothered to respond to a customer via personal email, that is indicative of great customer service across the company.

UPDATE 2: to compound my annoyance, the latest OS X version—Mountain Lion—will not run on my 2006 Mac Pro. So it looks like I shall have to wait until next year before I can upgrade both hardware and software.

Wednesday, August 24, 2011

Jobs done

So, the day has finally come: Steve Jobs has resigned as CEO of Apple.
PRESS RELEASE: Letter from Steve Jobs

August 24, 2011–To the Apple Board of Directors and the Apple Community:

I have always said if there ever came a day when I could no longer meet my duties and expectations as Apple’s CEO, I would be the first to let you know. Unfortunately, that day has come.

I hereby resign as CEO of Apple. I would like to serve, if the Board sees fit, as Chairman of the Board, director and Apple employee.

As far as my successor goes, I strongly recommend that we execute our succession plan and name Tim Cook as CEO of Apple.

I believe Apple’s brightest and most innovative days are ahead of it. And I look forward to watching and contributing to its success in a new role.

I have made some of the best friends of my life at Apple, and I thank you all for the many years of being able to work alongside you.

It has become increasingly obvious, over the last few years, that Jobs's illness has taken a heavy toll on his health—and one does not have to read between the lines to understand that his failing health is the major driver for this resignation.



Pancreatic cancer has a very bad prognosis—it killed the 32-year-old Bill Hicks in very short order (as well as many, many others)—and the Whipple Procedure (which Jobs originally took a leave of absence to undergo a few years ago) is, in itself, pretty radical. I last saw Jobs when he introduced the WWDC keynote back in early June: although he was enthusiastic, he looked pretty frail.

Jobs has taken Apple from being, as he put it, "90 days from bankruptcy" in the mid-90s—when I bought my very first Mac—to, at one point this month, the biggest company in the world (by market capitalisation). Indeed, at the end of July, it was reported that Apple had more cash in the bank than the US Federal Government—which is pretty good going.

To those of us who follow Apple with a near-fanatical zeal, it has been obvious for some time that the company was putting in place a transition plan. Over the last few years, each successive keynote has seen more presentations from the likes of Scott Forstall, Jonathan Ive, Phil Schiller and Tim Cook—even when Jobs has, theoretically, been back at full fitness. For watchers of the company, this moment has been long anticipated and, whilst not welcome news, we can at least be confident that Apple has—as Jobs puts it in his letter—a "succession plan". And, indeed, Tim Cook has been named CEO.

Whilst former COO Cook may not have Jobs's imagination, he is an immensely competent administrator and has been handling much of the day-to-day running of Apple since he joined the company in 1998. Indeed, it was Cook who took over as temporary CEO when Steve Jobs took a leave of absence, for surgery, in 2004.

Jobs has not entirely left the company: he takes over as Chairman of Apple and it is to be hoped that Jobs's vision will continue to drive the company for as long as he is able. Personally I fear that it may not be for too much longer, but I hope that I am wrong. Because Steve Jobs is a genius.

As I have been saying for some time—paraphrasing the great Bill Hicks—the fact that we live in a world where Steve Jobs is dying of cancer, but Bill Gates coooooontinues to enjoy his ill-deserved wealth shows that there really is no god*.

In the meantime, I expect Apple to go from strength to strength, and to continue to produce great machines that I can use to actually get my work done—rather than having to fuck about with bollocks like Create A New Network Place.

I salute you, Steve Jobs, and wish you many more years of creating beautiful things.

*UPDATE: just to clarify, for those with a nastier frame of mind than myself, I am not wishing death on Bill Gates. I am simply pointing out that Gates is not ill and that, if there were any justice in the world, Jobs would not be dying of cancer either. 'Kay? 'Kay. Good.

UPDATE 2: John Gruber at Daring Fireball comes to pretty much the same conclusion, but makes the interesting point that Jobs's creation is not really any one product.
Apple’s products are replete with Apple-like features and details, embedded in Apple-like apps, running on Apple-like devices, which come packaged in Apple-like boxes, are promoted in Apple-like ads, and sold in Apple-like stores. The company is a fractal design. Simplicity, elegance, beauty, cleverness, humility. Directness. Truth. Zoom out enough and you can see that the same things that define Apple’s products apply to Apple as a whole. The company itself is Apple-like. The same thought, care, and painstaking attention to detail that Steve Jobs brought to questions like “How should a computer work?”, “How should a phone work?”, “How should we buy music and apps in the digital age?” he also brought to the most important question: “How should a company that creates such things function?”

Jobs’s greatest creation isn’t any Apple product. It is Apple itself.

Quite.

Monday, August 22, 2011

Identify the browser...

... a most amusing game sourced from The Art of Trolling.



Well, it made me giggle. And then, when I have to debug that bastard toilet again tomorrow, it will make me smile in between the bouts of incandescent rage...

Monday, May 30, 2011

Quantum computing

Via Dale Amon at Samizdata, I see that D-Wave have made the world's first commercial sale of a quantum computer.
On Wednesday, D-Wave Systems made history by announcing the sale of the world's first commercial quantum computer. The buyer was Lockheed Martin Corporation, who will use the machine to help solve some of their "most challenging computation problems." Lockheed purchased the system, known as D-Wave One, as well as maintenance and associated professional services. Terms of the deal were not disclosed.

D-Wave One uses a superconducting 128-qubit (quantum bit) chip, called Rainier, representing the first commercial implementation of a quantum processor. An early prototype, a 16-qubit system called Orion, was demonstrated in February 2007. At the time, D-Wave was talking about future systems based on 512-qubit and 1024-qubit technology, but the 128-qubit Rainier turned out to be the company's first foray into the commercial market.

According to D-Wave co-founder and CTO Geordie Rose, the D-Wave One technology uses a method called "quantum annealing" to solve discrete optimization problems. While that may sound obscure, it applies to all sorts of artificial intelligence-type applications such as natural language processing, computer vision, bioinformatics, financial risk analysis, and other types of highly complex pattern matching.

As the subsequent interview with Geordie Rose reveals, proof that quantum computing is actually being undertaken by the chip was demonstrated in a Nature paper recently (£); further, Google—whose engineers worked with D-Wave on some of the software components—have also published some information on how the system has been used.
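For those wondering what "quantum annealing" actually involves: it is, roughly speaking, a quantum-mechanical take on the classical heuristic of simulated annealing. Below is a minimal sketch of that classical cousin, applied to a toy Ising-style spin problem of the sort D-Wave's hardware targets. It is written in TypeScript purely for illustration, and has nothing whatsoever to do with D-Wave's actual toolchain.

    // Classical simulated annealing on a toy Ising-style problem:
    // minimise the "energy" of a vector of +/-1 spins with random
    // couplings. Quantum annealing attacks the same class of discrete
    // optimisation problem, but escapes local minima by quantum
    // tunnelling rather than by thermal jiggling.

    type Spin = -1 | 1;

    function energy(spins: Spin[], J: number[][]): number {
      let e = 0;
      for (let i = 0; i < spins.length; i++)
        for (let j = i + 1; j < spins.length; j++)
          e -= J[i][j] * spins[i] * spins[j];
      return e;
    }

    function anneal(J: number[][], steps = 20000): Spin[] {
      const n = J.length;
      const spins: Spin[] = Array.from({ length: n }, () => (Math.random() < 0.5 ? -1 : 1));
      let e = energy(spins, J);
      for (let t = 0; t < steps; t++) {
        const temp = 1 - t / steps + 1e-9;        // simple linear cooling schedule
        const i = Math.floor(Math.random() * n);  // propose flipping one spin
        spins[i] = -spins[i] as Spin;
        const eNew = energy(spins, J);
        // Always accept improvements; accept worsenings with a
        // temperature-dependent probability, so the search can climb
        // out of local minima while the system is still "hot".
        if (eNew <= e || Math.random() < Math.exp((e - eNew) / temp)) {
          e = eNew;
        } else {
          spins[i] = -spins[i] as Spin;           // reject: flip it back
        }
      }
      return spins;
    }

    // Example: eight spins with random couplings in [-1, 1].
    const size = 8;
    const J = Array.from({ length: size }, () =>
      Array.from({ length: size }, () => Math.random() * 2 - 1));
    console.log(anneal(J));

The quantum version replaces that thermal jiggling with tunnelling through the energy barriers, which is precisely the behaviour that the Nature paper set out to verify.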

All of this is pretty impressive, but don't expect such technology to come to the consumer market in the near future: in operation the machine needs some 15 kilowatts of power—not least because the chip needs to operate at near absolute zero—and the box's footprint is about 100 square feet!

Still, it's good to see that such mind-boggling technology can be produced—and by the private sector too*...

UPDATE: Counting Cats comments on this development in terms of its application to cryptography.

* Yes, this is a dig at all those idiots who think that only governments can invest tons of cash into scientific research.

Thursday, April 28, 2011

Making stuff

As some readers may know, in real life your humble Devil designs software. I have only really started doing it formally in the last eighteen months or so, and the learning curve has been massive.

Whilst you might have a vision of what your software should look like and how it should operate, things never seem to come out quite as you imagine: you are learning how code works, you are constantly learning new paradigms, constantly deepening your understanding of your markets, adapting to laws and embracing new technologies—and, most challengingly, you are (almost certainly) trying to realise your vision with vastly limited resources.

Which is why I find this quote from Ira Glass—as transcribed by Daring Fireball from this video—so incredibly apposite...
All of us who do creative work, we get into it because we have good taste. But there is this gap. For the first couple years you make stuff, it’s just not that good. It’s trying to be good, it has potential, but it’s not. But your taste, the thing that got you into the game, is still killer. And your taste is why your work disappoints you. A lot of people never get past this phase, they quit. Most people I know who do interesting, creative work went through years of this.

I have, indeed, gone through years of this—and in several professions. I spent many years trying to produce great art in the medium of print, before taking my first hesitant steps in web design.

I'm not sure that I have ever produced really great work in pure website design—but in the area of software design, working with fantastic developers, I am finally producing really great work (it's the combination of problem-solving, elegant code, workflow design and aesthetic beauty that captivates).

And with the new system that we are about to embark on, it is just getting better and better and better...

But (lest the above seem overly self-congratulatory) one of the things that continues to drive me on is that my "work disappoints" me still—it could and should be better. But that is, at root, why I continue to love my job (and neglect The Kitchen)—because I know I can do better, and because the people that I work with not only give me the freedom to try, but the skills and the creativity to realise it...

Friday, April 01, 2011

Classic bait and switch

Following on from my last post about Google's non-release of Android "Honeycomb", it seems that the company have quite neatly bent the mobile phone companies over a barrel and begun screwing them royally...
Playtime is over in Android Land. Over the last couple of months Google has reached out to the major carriers and device makers backing its mobile operating system with a message: There will be no more willy-nilly tweaks to the software. No more partnerships formed outside of Google's purview. From now on, companies hoping to receive early access to Google's most up-to-date software will need approval of their plans. And they will seek that approval from Andy Rubin, the head of Google's Android group.

John Gruber sums up what this means, whilst also claiming that he "saw this coming all along"...
So here’s the Android bait-and-switch laid bare. Android was “open” only until it became popular and handset makers dependent upon it. Now that Google has the handset makers by the balls, Android is no longer open and Google starts asserting control.

I pass no judgement on Google's behaviour—it is, after all, a business: what does amuse me is the fact that so many people somehow thought that Google wasn't...

Monday, March 28, 2011

Open

One of the things that makes Google so super—as compared to, say, Apple—is that all its software is "open source".

Apart from its search algorithms. Obviously.

Oh, and Honeycomb—the latest version of its Android operating system.
Google says it will delay the distribution of its newest Android source code, dubbed Honeycomb, at least for the foreseeable future. The search giant says the software, which is tailored specifically for tablet computers that compete against Apple's iPad, is not yet ready to be altered by outside programmers and customized for other devices, such as phones.

As John Gruber notes...
Guess we need a new definition of “open”.

Snigger...

Monday, March 07, 2011

It's long past time that IE6 died

Over the last few years, your humble Devil has been working for a small web software company in Surrey. I was hired as a second-string website designer and—mainly due to the fact that I just won't shut up when I see things that need sorting out—I have swiftly moved through various jobs within the company: from second-string designer, to Project Manager, to Head of Marketing*.

My current role, and the one that I hope to stay in, is as Product Manager. Despite the fact that I have seen the company triple in size over my three years with them, it is still a small company and, as such, I do rather more than a Product Manager in a large company would do. I put together the product roadmap, write software specifications, design the workflows, user experience (UX) and user interfaces (UI) for the products, as well as coding a good deal of the actual UIs too.

It's busy but immense fun and, usually, incredibly satisfying.

However, we are a web software company and, as such, there are a few things that are massively annoying: these can generally be defined as Internet Explorer 6, Internet Explorer 7 and Internet Explorer 8 (I am reserving judgement on IE9, since it looks to be half-way decent), and their prevalence amongst our customer base.

Of all of these, Internet Explorer 6 is the worst: its support for CSS and Javascript is pitiful and its debugging tools non-existent. What that means is that not only does it not work "properly" but it won't even give you a clue as to why. Released in 2001, IE6 for Windows had worse CSS support than the (now defunct) IE 5.2 for the Mac: as a browser it is slow, archaic and outdated.
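To give a flavour of what "pitiful" means in practice, here is the sort of defensive shim that IE6 forced every web developer to write. This is an illustrative sketch in TypeScript, not code from our actual product:

    // Old IE had no addEventListener (only the non-standard attachEvent),
    // and no console object at all--so debugging was alert() or nothing.
    // Wrappers like this were boilerplate in every project of the era.

    interface LegacyElement extends Element {
      attachEvent?(event: string, handler: () => void): void;
    }

    function listen(el: LegacyElement, type: string, handler: () => void): void {
      if (el.addEventListener) {
        el.addEventListener(type, handler);    // standards-based browsers
      } else if (el.attachEvent) {
        el.attachEvent("on" + type, handler);  // IE 6-8 fallback
      }
    }

    // Stub out console so a stray log line doesn't crash the page in IE6.
    if (typeof (window as any).console === "undefined") {
      (window as any).console = { log: (..._args: unknown[]) => { /* no-op */ } };
    }

Multiply that by every event, every style rule and every layout bug, and you have some idea of the tax that this browser levies on web development.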

Unfortunately, for various technical reasons—mainly to do with the tight integration with Windows, which led to accusations of monopoly abuse and provides massive security holes—many large organisations still use IE6 and are having a hard time weaning themselves off it.

But the simple fact is that IE6 not only prevents people like me from writing better web software: it is a massive security risk. As one writer at ZDNet put it... [Emphasis mine.]
Any IT professional who is still allowing IE6 to be used in a corporate setting is guilty of malpractice. Think that judgment is too harsh? Ask the security experts at Google, Adobe, and dozens of other large corporations that are cleaning up the mess from a wave of targeted attacks that allowed source code and confidential data to fall into the hands of well-organized intruders. The entry point? According to Microsoft, it’s IE6...

This would be worrying enough: after all, there are plenty of corporations which are still using IE6—but at least you don't have to give them your sensitive information.

But, as I know from personal experience, one of the areas most resistant to upgrades is the NHS—and they do have plenty of your most personal details on file. Yes, they are behind the N3 network (which brings a whole new set of challenges to those of us working with them) but it only needs one entry point to compromise the entire system.

Many NHS organisations believe that they are supposed to be using IE6; many of them believe that the Spine applications that they need to access will not work on anything other than IE6. This is not only untrue, but these organisations are ignoring a very clear Directive—issued over a year ago by the Department of Health—to cease using IE6 and to upgrade to IE7 as a minimum.
The Department of Health has told trusts using Windows 2000 or XP to move to version 7 of Microsoft's browser.

In a technology bulletin published by the department's informatics directorate on 29 January 2010, it advised NHS trusts using Microsoft Internet Explorer 6 on either Windows 2000 or Windows XP to move to version 7 of the browser.

"We've advised NHS trusts to upgrade to IE7 as early as possible," said a spokesperson. The guidance said that IE7 works with the department's Spine applications, and provides additional security.

The notice also recommended that organisations that continue to use IE6 should apply a security update patch from Microsoft to all affected computers, or if this is not possible apply mitigation methods suggested by the vendor.

Microsoft reported a significant security problem with IE6 on 14 January which could compromise a computer's operating system, although the browser was already known to be less secure than newer versions. The new vulnerability could act as an entry point for hackers to a network, allowing sensitive information to be stolen, according to the DoH bulletin.

Some weeks ago, I raised this issue with a number of NHS organisations, and asked—given the sensitive nature of the data that they hold—why they are still using this browser. Most have said they will look into it, and that is the last that I have heard of the matter.

It is hardly surprising that government organisations—not known for their ability to keep our data safe—are still using this out-dated and flawed browser. It is bordering on the criminal that they continue to use IE6.

Now, Microsoft themselves have set up a new website—IE6 Countdown—which seeks to encourage the death of this shitty piece of software. Naturally, M$ do not put it in quite those terms—they seek to push the benefits of upgrading to the latest version of IE rather than pointing out that IE6 is crap—but the message is the same: don't use IE6, especially for security-critical systems.

Perhaps, with IE6's own manufacturers seeking to kill it, those who risk the integrity of our data every single day might pay some attention.

And then we can take some small steps towards a better web experience too...

* At the moment, we are desperately looking for friendly, enthusiastic people to fill two roles: that of a web designer/front-end developer and that of first-line tech support. Please drop me a line if you would like more details...

Wednesday, October 13, 2010

Is it in their Nature to lie?

In his real life, your humble Devil is a Product Manager for a small software company. Given that it is a small company, your humble Devil actually delves into the methods and programming of said software.

As such, I know a little about how software programming works, and what is considered acceptable and what is not—both by the programmers themselves, and by those performing the "acceptance tests".

Having established some vague credentials, I would like to draw your attention to this article in Nature—as highlighted by His Ecclesiastical Eminence—regarding the ClimateGate data releases last year.

As most people will know, most of the forensic fury was focused upon the emails exchanged between the key players in this fraud, but a few people started delving into the data that was released alongside those communications.

In fact, your humble Devil highlighted a large part of this in my collation of comments around the HARRY_READ_ME.txt file (a post that resulted in over 24,000 absolute unique visitors in one day).

What this file displayed was not what Nature dismisses as "wonky code", but an utter failure of any kind of systematic programming ability, plus a total lack of verification and testing.
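For non-programmers: "verification and testing" need not be anything exotic. Even a couple of trivial assertions would have caught the habits that the HARRY_READ_ME.txt file describes. The sketch below, in TypeScript, is entirely hypothetical (it is emphatically not CRU's code):

    // A made-up data-processing step with the two checks that matter:
    // a known input must give a known output, and missing data must
    // fail loudly rather than be quietly invented.

    function meanAnomaly(readings: number[], baseline: number): number {
      if (readings.length === 0) {
        throw new Error("no data: refusing to make any up");
      }
      const mean = readings.reduce((a, b) => a + b, 0) / readings.length;
      return mean - baseline;
    }

    // 1. A regression test: if this ever fails, the code no longer meets spec.
    console.assert(meanAnomaly([10, 12, 14], 11) === 1, "known input, known output");

    // 2. A negative test: absent data must be an error, not "synthetic data".
    let threw = false;
    try { meanAnomaly([], 11); } catch { threw = true; }
    console.assert(threw, "missing data must fail, not be silently filled in");

The point, both Harry's and mine, is that nothing even this basic appears to have been done.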

As far as I—and, I am sure, most programmers—are concerned, the construction of models based upon such obviously inaccurate software is tantamount to fraud. Regardless, Nature does not agree...
When hackers leaked thousands of e-mails from the Climatic Research Unit (CRU) at the University of East Anglia in Norwich, UK, last year, global-warming sceptics pored over the documents for signs that researchers had manipulated data. No such evidence emerged...

Where to start? There was far more data than the HARRY_READ_ME.txt file to examine, and I hadn't the time to collate the results—if anyone can donate links to those who did, please leave them in the comments.

But the HARRY_READ_ME.txt is enough: it details the lack of raw data, the rough estimates, the use of rainfall as a substitute for temperature, the use of synthetic data (i.e. "data" that was made up to fit the climatologists' prejudices) and any number of other really poor practices.

Are they fraudulent? Maybe not.

But the fact that the software programme created by Harry was used to construct the next lot of models—despite the fact that the file existed and that it is inconceivable that Harry didn't tell his employers what a fucking massive pile of shit it was—most certainly is.

These people knew that the software did not operate according to specification, but they used it anyway. FAIL.

These people knew that much of the original data was missing, corrupted or faked, but they used it anyway. FAIL.

These people knew that, together, these factors would produce results that were incorrect. FAIL.

These people knew that, regardless, the software would produce the result that they wanted. FRAUD.

But the killer comment is made by Bishop Hill...
Now correct me if I'm wrong, but none of the inquiries actually looked at the computer code, apart from there being a brief word from Tim Osborn in evidence to Muir Russell, denying that the bodges he'd mentioned affected published results. I'm pretty sure the Harry Readme was not looked at by any of the inquiries.

You are not wrong. None of the "independent" enquiries looked at the code, and this was for the same reason that none of the media rebuttals mentioned the code.

The reason that it was only the emails that were mentioned was that they had some kind of plausible deniability. Excuses were wheeled out, along the following lines...

"Oh, don't worry! Scientists are always having little spats. These were personal emails, not intended for release."

Well, we know that they weren't intended for release because the scientists in question were all urged to delete data and emails to prevent them being released under FoI.

This was to ignore the fact that the data had been examined—the code had been examined too. And from looking at those files, there were only two conclusions to draw:
  1. the climatologists were deliberately defrauding the community about their results (very likely), or

  2. the climatologists were so fucking incompetent that their data and results mean nothing at all (even more likely), or

  3. both.

Either way, there is simply no way that we should be restructuring the world economy—and, by the by, killing fuck-loads of poor people—on this evidence.

Of course, facts, logic and science are seriously unlikely to trouble the idiots at Nature—they might lose some of their share of "the money flood"...

Sunday, May 16, 2010

Like the turn of a page, or a change of gear...*

Ah, technology! How do we love thee? Let me count the ways...

... later. Right now, the normally non-sweary Ministry of Type has a gripe that he needs to get off his chest.
Then we get to the real fake-Georgian pediment over the front door, the overly-shiny brassy door furniture, the PVC window frames, something that infests reading software rather than dedicated e-reader hardware (but is no less annoying for it): yes, it’s the page turn animation. Oh how these software producers love their page turn animations. They might not make a big deal about their font selection, their crappy justification algorithms or even the number of books you can buy through their store, but they will always make a great big bloody feature of their sodding page turns, even the app I pointed to above. Even if an app doesn’t have these damn things, you get the impression they’re working on adding them. In a book, an actual dead-tree book, you don’t notice turning the page because it’s just part of what a book is. That’s how you get to the next bit of text. The whole idea of pages bound like that is an artifact of a particular printing technology — it’s the nature of the delivery medium, not the message. So when we have a digital book, we’re using technology that has its own set of conventions, its own restrictions and its own freedoms, and every bit of digital technology has some means of moving through any arbitrary content: a keyboard has cursor keys, page up and page down keys, a mouse has a scroll wheel, laptops have trackpads with scroll areas, and smartphones have touchscreens, joysticks or D-pads. But no. Those aren’t good enough. They’re not booky enough. You’re going to be reading Ulysses on this thing, War and Peace, The Iliad with this thing for crying out loud! You can’t sully things like that with a scroll wheel! You’re supposed to be imagining reverentially turning the thick, musty, ancient pages in some great national library somewhere, worshipping at the altar of Knowledge! Never mind the story! Never mind leaving you free to just read! No, every 250 words, perform the gesture, watch the animation!

Just let me scroll, please? I’ve been reading stuff off the screen seriously for what, 15 years? More? Scrolling is fine, you know.

Like Aegir, I'm not interested in the state of e-readers (or whatever) themselves. Nor have I ever used an e-book reader. But I have watched the videos of those page-turning animations and thought...

... o god, why?

* From the excellent Waterboys song, Good News.

Monday, May 03, 2010

Understanding the computer market

The simple fact is, the computer market is changing—traditional PC makers have been having a torrid time of it lately. And whilst everyone might be thrilled that their SuperMegaFast™ Windows™ Funky Series 7 WhizzoGraphics GameMonster Workstation now costs only tuppence ha'penny, they should accept that these may become rather rarer quite soon.

Because computer manufacturers whose business model depends on shifting ka-gillions of units, on very slim margins, are soon going to realise that people just don't need yet another black/blue/beige gamebox even if the light-up translucent bits are red rather than blue.

As a quick illustration of where it's all going, John Gruber has a concise summary...
Jean-Louis Gassée:
The center of financial gravity in the computing world—the Center of Money—has shifted. No longer directed at the PC, the money pump now gushes full blast at the smartphones market.

He backs this up with a striking financial comparison: Apple makes six times the profit from iPhone OS device sales as HP makes from PC sales — despite the fact that by unit sales, HP is the world’s leading PC maker, and Apple is not the leading smartphone maker.

HP’s purchase of Palm shows that they understand this opportunity.

Palm's WebOS got quite good reviews when it came out, but it was too little too late for a company that basically bet the farm on the Palm Pre thrashing the iPhone in the marketplace. Yes, they did.

In March 2009, major Palm investor Roger McNamee said... [Emphasis mine.]
Palm Inc.’s new Pre smart phone will lure customers away from Apple Inc.’s iPhone when subscribers’ contracts start expiring in June, Palm investor Roger McNamee said.

“You know the beautiful thing: June 29, 2009, is the two-year anniversary of the first shipment of the iPhone,” McNamee said today in an interview in San Francisco. “Not one of those people will still be using an iPhone a month later.”

How the mighty are fallen, eh? But, as has been pointed out, Palm has now been bought by Hewlett Packard (HP): speculation has been rife for some time that HP have been frustrated with Windows and desired to create their own operating system. With Palm's WebOS on board, they now have the foundations of an operating system in what is fast becoming the single most profitable area of the computer market.

Again, here's John Gruber, commenting on HP's purchase.
MG Siegler, interviewing HP senior VP Brian Humphries:
“This is a great opportunity to take two Silicon Valley idols and put them together,” Humphries noted. That’s an obvious statement, but he quickly moved on to the meat. “WebOS is the best-in-class mobile operating system. Our intent is to double down on WebOS.”

I really do think this is a great move for HP. I don’t know that it’s going to work, but it certainly gives them better opportunities in the mobile space than they would have had otherwise. They should announce that the Windows 7 “slate” they pre-announced a few months ago has been canned, to be replaced by a version running WebOS. Just saying they’re “doubling down” doesn’t mean squat if they don’t act on it. The easiest way HP could screw this up is by not committing fully to WebOS for all mobile devices — phones, handhelds, tablets.

Only a couple of days later, HP did announce the death of the Slate—and Microsoft the death of its Courier slate concept. HP, it seems, has plans of its own for tablet computing—and I'm pretty damn sure that they don't involve Microsoft or Windows.

For some time, I have opined that Microsoft is over the hill and on its way out. Sure, it'll take a very long time, but the company is an utter irrelevance in terms of technological enhancements. The money is in mobile computing and—given the limitations of form factor in these devices—in "cloud" storage and services.

Cloud storage and services are led by built-in applications—such as the way in which the iTunes Store and the App Store are built into the iPhone—or through web browsers. And the simple fact is that in web browser technology too, Microsoft is way behind Mozilla or WebKit (which is fast becoming the de facto rendering engine for built-in mobile web browsers).

Although Internet Explorer 9 (IE9) will provide enhanced support for HTML5 and CSS3 (including rounded corners. Finally) it is unlikely to be released before the first quarter of 2011. And whilst many corporate set-ups still run IE6 or 7 (irritatingly), more and more people are switching to Firefox (Mozilla) or Chrome (WebKit) for home use.

Given this, web application designers like myself are no longer designing for IE: we are not even aiming to give IE users the same experience as those on more advanced browsers—no, we are aiming only to make the applications work in IE.
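"Making the applications work in IE" looks something like this in practice: detect a capability, enhance where it exists, and fall back to something duller where it doesn't. Here is a hypothetical sketch of the pattern in TypeScript (the chart function is invented; the pattern is the point):

    // Feature-detect, enhance where possible, degrade gracefully where not.

    function supportsCanvas(): boolean {
      return typeof document.createElement("canvas").getContext === "function";
    }

    function renderChart(container: HTMLElement, values: number[]): void {
      if (supportsCanvas()) {
        const canvas = document.createElement("canvas");
        canvas.width = 300;
        canvas.height = 100;
        const ctx = canvas.getContext("2d")!;
        values.forEach((v, i) => ctx.fillRect(i * 30, 100 - v, 25, v)); // crude bar chart
        container.appendChild(canvas);
      } else {
        // In IE it still *works*--it just isn't pretty.
        container.textContent = "Values: " + values.join(", ");
      }
    }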

Microsoft is dead: it just doesn't know it yet.

Saturday, May 01, 2010

On Apple

Brian Micklethwait has written a superb riposte to Instapundit (and all those others who maintain that Apple is mostly about "image").
Like Apple, Obama’s strength is mostly in the image department ...

That may be right on the money about Obama. Don't know for sure. Don't live there. But I definitely think it's wrong about Apple. For me, Apple's stellar "image" is based on an underlying reality of product quality, not on how nicely Apple supposedly behaves, or did behave until this recent atrocity.

I'm not quite sure how Apple insisting that those who stole its property be prosecuted could be described as an "atrocity" (I'd call it "the rule of law administered under California statute" myself), but Brian's assertion that Apple makes good products is absolutely bang on the money.

No, they are not the cheapest or even the most powerful products for your buck (although the so-called "Apple Premium" is nowhere near as high as most people suppose, or as high as it used to be when John Sculley and Co. were running the company into the ground); a few of them have not even been particularly good products (beautiful and fantastically engineered though it was, the G4 Cube springs to mind).

But what Apple does produce is beautifully designed products. Like Brian's, my Apple keyboard is far and away the most comfortable and high-quality keyboard that I have ever used. Is it the cheapest? No. Was it worth every pound that I paid for it? Absolutely.

Most importantly, it is pretty much unlike any other keyboard that I have used, or even seen (though I am sure that there are now numerous cheaper rip-offs available). Somebody at Apple thought about how people use a keyboard; someone at Apple realised that a very slightly convex keyboard would fit the shape of one's hands better, that less travel in the keys was better, that people wanted labelling on the keys that wouldn't rub off; then someone at Apple designed something new and better (for me and Brian, at least).

As Brian quite correctly points out, however, there is always a darker side to such perfectionism.
Meanwhile, I also think of Apple, not as serenely nice people, but more like neurotic and borderline psychotic artists. The kind of artists who regard the transcendent excellence of their creations as an excuse to be mad bastards. I pretty much agree with them. It comes down to my understanding of the character of Steve Jobs. Genius. Mad bastard. Hell to work for, apart from that little thing that you get to make supremely great stuff and everyone thinks you are great too, which you are. "Insanely great", you might say. So, for me, Apple getting the government to smash down the door of some defenceless little tech-bloggers is no deviation for them. That's regular Apple behaviour. That's Jobs throwing a mad tantrum and stamping his never-grown-up feet, insisting that just as his products must be perfect, so must the launching of them be perfect, or not enough people will buy them quickly enough and the network effect won't cut in soon enough, and can't you pathetic fuckheads see that!!!!

Personally, that is a price that I am willing to pay for the quality of Apple products that I enjoy. And I do enjoy them.

Many people will, I am sure, pop up in the comments and call me a mindless Mac fanboi—they will be ignored. If someone turned up on your doorstep and started asserting that you were far stupider than they and too utterly stupid to be able to distinguish between good and bad, you would slam the door in their face.

Similarly, people will also turn up in the comments and assert lots of lies and idiocies that they've picked up from a 1992 issue of PC World—rubbish about how Macs are so much more expensive, or how they only have one mouse button, or how there are no applications (particularly games) for the platform.

So let's nip those in the bud right now by pointing out that Macs are far less expensive than, for instance, Sony machines; that they have come equipped with a multi-button mouse for many years (and before that we used our other hand to press Control on the keyboard); that there are many thousands of excellent (and cheap) Mac applications out there for doing just about anything; and that, yes, many games come late to the Mac but I, not being a 12-year-old boy, don't particularly give a crap and would certainly not spend more than tuppence ha'penny on a machine used primarily for indulging my rape, theft and murder fantasies.

Let's focus instead on the fact that I love my Macs in a way that I never loved the PCs that I used; let's focus on how I appreciate good design (like so many other designers) and how anyone who has seen me rage about abysmal user interfaces can utterly understand why I admire Steve Jobs.

And, finally, how I can agree with every word that Brian has written about why Apple is not in the least like some ersatz lawyer who has somehow found a way to rise to be the most powerful man in the world through promising the Earth and delivering only rubbish.

DISCLAIMER: I no longer hold Apple shares, having sold them last week at roughly $268. I bought my first tranche at about $80 and my second at about $140.

Saturday, April 10, 2010

Adobe is hiring! (Mac users need not apply)

As some will know, Adobe is the software development company that makes such applications as Photoshop, Illustrator and InDesign—all those applications that professional graphic designers rely on. One such application is Flash and the next release of that application was going to allow programmers to compile Flash applications for the iPhone—only Apple has just scuppered that in their new iPhone OS 4 SDK (which delivers, amongst other things, multi-tasking—eliminating one of my gripes about the iPad).

Incensed, Adobe developer Jim Dowdell tweeted thusly...
I know that a number of good people work at Apple. If you're seeking a more ethical company, Adobe is hiring: adobe.com/aboutadobe/careeropp

Really? Gosh—let's go and have a look at the recruitment page that Jim is pointing those Apple developers to, shall we? Hang on, what's this...? [Emphasis mine.]
Adobe has a new talent acquisition system. This system is optimized for performance on IE 6 or IE 7, running on Windows XP. Unfortunately it is not supported on Firefox, nor is it supported on a Mac at this time.

Way to go, Adobe! Here's a software development company whose "talent acquisition system" software, apparently, doesn't even work on standards-based browsers.

Further, a developer at Adobe—a company which was started by ex-Xerox PARC engineers and became a big company through, initially, selling Mac-only software—is urging Apple employees to apply for jobs through a system that doesn't support Macs.

Nice one, Jim, you moron.

DISCLAIMER: I own an insignificant number of Apple shares—currently sitting at $241.79...

Sunday, January 31, 2010

Apple's iPad

Apple's iPad: a thing of beauty—but is it any use?

A number of people—commenting on the blog and in email, IM and physical conversations—have asked your humble Devil for my thoughts on the Apple iPad. Having had a few hours to digest the announcement, and glide around the web to see the opinions of others (most notably this superb rundown from Daring Fireball), I am now ready to unburden myself (with the usual disclaimer*).

First, I would like to say that it is quite obviously a thing of beauty. When Steve Jobs first held it out, between his two hands, I was unconvinced; once he sat down to use it, however, holding it in one hand, I realised that the proportions were exactly right.

Second, there are some features that are sorely lacking (although I expect them to be in the next release). The first is that there is no camera; no, not in the back, but in the front—surely being able to make video-calls via Skype or iChat is an obvious use for the iPad? I cannot understand why this would have been left out, as it would have been superb to demo too. As such, I shall have to put it down to a desire to keep something back for the next edition.

The next gripe here is the lack of multi-tasking—and I have two specific problems (which may or may not transfer to the final product). The first is with music: on the iPhone, some of Apple's applications do run in the background—I am thinking of the Mail programme and of the iPod element. As such, I can listen to music whilst doing other things, e.g. answering an email, etc. I have heard that one cannot do this on the iPad at present and it seems counterintuitive since one can perform these tasks on its smaller sibling.

Further, I have heard that one cannot have more than one Safari browser window open at a time: this, too, is a problem since one of my main activities—blogging—requires me to shuttle back and forth between windows, copying and pasting sections of text and URLs.

As I have pointed out, however, both of these features are present in the iPhone, so it may simply be that the software was not ready for the demo and that Apple intend to add these features in the two or three months before the iPads actually go on sale. Or, of course, they may be provided in a software update shortly afterwards.

One of the other main criticisms is, of course, that the iPad ecosystem is, like the iPhone, entirely closed—even to the extent that you cannot see the file system. For many, this is, of course, a deal breaker but I am not sure that it entirely matters.

Why? Well, the iPad is clearly not intended, for most people, to be their main computer but an adjunct to it. As long as one can transfer files between the iPad and one's main machine (a Mac Pro in my case—this has relevance later) then this is not really a problem.

In fact, for many people, it might actually be a virtue—as Fraser Speirs notes in his excellent Future Shock article.
For years we've all held to the belief that computing had to be made simpler for the 'average person'. I find it difficult to come to any conclusion other than that we have totally failed in this effort.
...

I'm often saddened by the infantilising effect of high technology on adults. From being in control of their world, they're thrust back to a childish, mediaeval world in which gremlins appear to torment them and disappear at will and against which magic, spells, and the local witch doctor are their only refuges.

With the iPhone OS as incarnated in the iPad, Apple proposes to do something about this, and I mean really do something about it instead of just talking about doing something about it, and the world is going mental.

Fraser makes the point that many techies are up in arms about this because "secretly, I suspect, we technologists quite liked the idea that Normals would be dependent on us for our technological shamanism" but for many normal people, a computer can be a massive hassle.
The tech industry will be in paroxysms of future shock for some time to come. Many will cling to their January-26th notions of what it takes to get "real work" done; cling to the idea that the computer-based part of it is the "real work".

It's not. The Real Work is not formatting the margins, installing the printer driver, uploading the document, finishing the PowerPoint slides, running the software update or reinstalling the OS.

The Real Work is teaching the child, healing the patient, selling the house, logging the road defects, fixing the car at the roadside, capturing the table's order, designing the house and organising the party.

Think of the millions of hours of human effort spent on preventing and recovering from the problems caused by completely open computer systems. Think of the lengths that people have gone to in order to acquire skills that are orthogonal to their core interests and their job, just so they can get their job done.

If the iPad and its successor devices free these people to focus on what they do best, it will dramatically change people's perceptions of computing from something to fear to something to engage enthusiastically with. I find it hard to believe that the loss of background processing isn't a price worth paying to have a computer that isn't frightening anymore.

I couldn't agree more, and I think that the iPad is aimed at precisely this market.

It is also worth noting that a consensus is forming, amongst those who have actually used the iPad, that there really is no substitute for getting the machine in your hot little hands—here's Cruftbox on its power.
Well, I am lucky enough to have been at the Apple Event today. Deep within the Reality Distortion Field. I saw the demo live, not snap shots on a web site. I got to use the iPad and see how it worked in person. I talked with other people that had tried it.

And you know what, just like Steve Jobs said, you need to hold it for yourself. It’s a different computing experience. It’s intuitive and simple. The device is blazingly fast and obvious how to use. It is a third kind of computing between a smartphone and a laptop.

For those that have iPhones, you know the experience of showing someone the iPhone for the first time. The look in their face, when they first flick the screen or squeeze the image to zoom. The realization that this is something different, very different, than what they have experienced before.

I am a technology professional. For almost 20 years I’ve tested, used, broke, fixed, and played with all kinds of technology from broadcasting to air conditioning to software. I am not easily swayed in these things. But even with all my skepticism, I think the iPad is something different. A new way of computing that will become commonplace.

Oh Internets, I know you won’t believe till you hold one in your hands. You’ll bang on about features, data plans, DRM, open source, and a multitude of issues. You’ll storm the message boards, wring your hands, and promise you won’t buy one till ‘Gen 2’. The din will grow and grow as time passes.

And then one day, in a few months, you will actually hold one and use it. And you will say, “I want one. I want one right now.”

This lack of multi-tasking is massively offset by just how fast the damn thing is—applications launch instantly. John Gruber points out a very significant development: not simply that the iPad is fast, but that one of the reasons for this is that it's driven by an Apple-designed chip. This is extremely significant: Apple have never designed their own chips before—yes, they had financial input into the AIM chip group (before the switch to Intel), but they didn't actually design or manufacture those chips. Apple really do want to control the whole ecosystem—because the company believe that this allows them to make better products (and thus more money).

Now, I know that very many people object to this—after all, they have popped up on this blog to criticise Apple's control of the far less closed Mac platform. And that's just fine—you don't have to buy an iPad (or a Mac).

But your humble Devil simply isn't worried about such things: I am a designer, a graphic artist, a website coder, a writer, whatever—I don't want to get down and dirty with my computer. As Fraser Speirs points out (above), fucking around with my computer is not my Real Work—my computer is a tool that allows me to do my Real Work more efficiently. Every hour that I spend fixing, hacking or otherwise configuring my tool is an hour less of my Real Work done.

Do I really need to start mucking about in the guts of my machine? After all, as Jeff LaMarche succinctly puts it...
I'm a techie, but I don't need to be able to program on every electronic device I own. I don't hate my dishwasher because I can't get to the command line. I don't hate my DVD player because it runs a proprietary operating system. Sheesh.

And how much more exciting would websites be if WebKit were the only rendering engine that anyone used? As it is, we will have to wait many years before we can use the amazing CSS advancements—such as CSS-driven animation—that the WebKit group have built in.

Unless, of course, you are designing websites purely for the iPhone or iPad—because they run WebKit as the rendering engine for Safari. In the same way that I currently design websites for standards-based browsers and then hack for those that aren't (yes, IE, I'm looking at you), I can see myself starting to design websites for WebKit browsers, and then hacking for less-advanced browsers such as Firefox and IE. It's incredibly exciting.
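To make that concrete, here is a minimal, purely illustrative sketch of the "WebKit first, hack for the rest" approach, written in TypeScript; the function name, offsets and timings are my own inventions rather than anything taken from a real site. Feature-detect the prefixed property, let the engine animate natively where it can, and fall back to a hand-rolled timer everywhere else.

```typescript
// Hypothetical sketch: slide a positioned element in from the left.
// Assumes the element starts off-screen, e.g. with
// -webkit-transform: translateX(-200px) (or left: -200px for the fallback).
function slideIn(el: HTMLElement): void {
  // Cast so TypeScript accepts the vendor-prefixed properties.
  const style = el.style as CSSStyleDeclaration & {
    webkitTransition?: string;
    webkitTransform?: string;
  };

  if ("webkitTransition" in el.style) {
    // Design for WebKit first: declare the animation, let the engine run it.
    style.webkitTransition = "-webkit-transform 0.5s ease-out";
    style.webkitTransform = "translateX(0)";
  } else {
    // Then hack for the rest: fake the same movement with a timer.
    let x = -200;
    const timer = window.setInterval(() => {
      x = Math.min(x + 10, 0); // 10px per tick until it reaches its resting spot
      el.style.left = `${x}px`;
      if (x === 0) window.clearInterval(timer);
    }, 16);
  }
}
```

The telling part is that the fallback branch is pure grunt work, exactly the per-browser hackery described above, whilst the WebKit branch is two declarative lines.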

Anyway, that is slightly off-topic and yet also relevant because, ironically, the iPad is also desirable to techies like me (and yes, this is where I answer the question, "will you get one, dear Devil?")—and, yes, I will get an iPad when they are available. Why?

It is because I am a power-user that I will get an iPad. Let me explain...

I have had Apple laptops but I never really used them very much. The screens were too small for me to do graphics work on them and, besides, the trackpad is not much good for that. So, I used to find myself carrying not only the laptop and its heavy power block, but also a mouse so that I could use it half-way effectively.

But still I didn't really use it—I had no real need to. With a bigger, more powerful machine at home and a reasonable one at work, I had no need to use the laptop in any meaningful way—it felt underpowered and thus rather frustrating (although that is partly because Adobe's software is increasingly bloated). As a result, I always felt that I was wasting its potential. And, of course, once it was nicked, I felt no need to get a new one.

In short, because I am a power-user, a laptop does not have enough power for me—and yet it is too expensive and too powerful for me not to try using it for that power work.

Nevertheless, I do travel more and more these days—both for work events and for speaking engagements on behalf of the Libertarian Party—and, given how much of it I now do, I want to be able to get work done whilst I am travelling.

What I mainly need to get done is presentations or speech-writing: two activities for which the iPad—equipped with the new iWork suite—is admirably suited. In fact, it gets even better...

One of the problems that I have is that I am constantly translating my Keynote slides into PowerPoint so that we can present them on the work's demo laptop—and, of course, a lot of things just don't translate tremendously well. Sure, there are other options, but at present I still have to spend time checking and correcting my slides. But with the addition of a VGA-out dock, I can simply connect my iPad to the projector, thus avoiding all of the translation problems that I currently have—plus I can use a remote control to move my presentation along without breaking my rapport with the audience.

In addition, the iPad will do all of those other things that I want to do whilst on the move—although an iPad edition of Coda would make my day (hear that, Panic?)—and in a package that is smaller and, crucially, cheaper than one of Apple's (admittedly superb) laptops**.

In other words, the iPad does enough for me to use it as a mobile device, whilst being cheap enough for me to justify buying one.

Plus, of course, it is a thing of beauty—and, yes, I just want one.

* DISCLAIMER: I own an insignificant number of Apple shares, which have provided a pretty good return, i.e. 200%+ over the last few years. They have, as usual, fallen after the news of this announcement (they fell pretty heavily after the iPhone announcement too—and I picked up some more on the cheap) to a current price of $192.06. That's a good buying price, given that they were up at around $217 a few weeks ago. Not, of course, that I am giving anyone investment advice.

** This is not to say that I think that Apple's laptops are overpriced—I don't think that they are. It is just that they are too expensive for me to justify buying another one given the very limited use that I would get out of it.

Saturday, January 30, 2010

The Sun won't rise again

Daring Fireball notes the demise of Sun Microsystems, which has been bought by Oracle and is thus no longer listed in its own right.

Sun was already pretty much irrelevant by the time that I got into computing—which was, admittedly, quite late. In its last decade or so, the company had mainly been associated with a number of open-source projects—which, it seems, weren't making any money.

So, Sun is dead and nothing much will change. Your humble Devil will just have to review the Apple iPad instead...