Creepy crawlies on the desktop

A friend of mine says she’s completely distracted by the tiles on her Windows 8 desktop. “It’s like a picture gallery on the staircase at Hogwarts. Things that shouldn’t move are always squirming.” I have sympathy. I also find it distracting. But what astonishes me even more is that we’ve been here before. Doesn’t anyone remember ‘Active Desktop’? Remember how we were all going to watch live feeds of weather, news, pictures and share prices through OCX controls embedded in our Windows 98 desktops? That idea crashed along with the shares. You don’t remember that? There’s my answer. Neither did anyone in Redmond on the Windows 8 team. Why would they? That was almost 15 years ago!

The short memory of people in this industry is a constant worry to me. Youth drives innovation, but must we constantly revisit the failed experiments of the past? I know the answer is yes, because conditions change; but if we are going to try again, we need to be clear about what has changed to give something a chance to work a second time.

It is in this frame of mind that I’ve been musing – for a couple of years now – on the state of software applications. And here’s the confusion: should I now really say, on the state of Apps? While the world marvels at the remarkable democratisation of software creation and consumption, I find that I am constantly faced with what I consider to be a mess of software that creates quite a few obstacles to getting things done. Since I am talking about ‘getting something done’, we can exclude everything to do with entertainment, although I suspect some of what bothers me is equally an obstacle to ‘having some fun’. Anyway, if it stands in my humble way, I’m sure that it is also costing organizations dearly.

Looking at how first the Internet and now mobile (and soon wearable) technologies have churned up, and continue to churn up, the whole of application delivery – from the process to the product – I find myself wondering: are we actually making any progress? Just what is the right architecture for applications? Do our current technologies support it? How did we get here? Have we been here before?

Programs and data

Part of my disquiet is over something I thought was a fundamental principle of application design: separation of programs and data. However, we’ve never been 100% sure that separation was needed, and when they were separated, we’ve had different ideas about which one is more important.

Let’s take a stroll through application history. In the beginning the line separating programs and data was blurry. There was some external storage (think paper tapes, cards) but a lot of data was stored in program memory too. Pretty quickly more external storage options were added. Computing became ‘data processing’. You had your data; you had your programs to process it. By the 1980s this dichotomy was entrenched in the mainframe world. You designed your data and then you designed your programs to process it. Designers and programmers had data models. But in operation, the program was king. Operators ran programs.

Along come microcomputers (pre-PC) and we’re again using devices and languages that don’t easily process external data. You have computers with no external storage, and Get and Print are the only instructions for retrieving unformatted data. In architectural terms, the distinction between data and program is blurred. The program is once again king of computing on micros and then PCs.

This centrality of programs in the PC world continued, but another idea took hold in the world of the Mac. Mac users got used to opening documents that brought up the programs that created them. Allegedly users weren’t concerned about word processors, but their reports. The idea transferred over to PCs in the 1990s. We started to be aware of associating a file extension with a program to allow us to open a document directly. This was followed by a move to create ‘composite’ documents (think Excel spreadsheets embedded in Word documents) and the wars between rival standards (OpenDocument and Open XML).

What did users really do? In fact, they largely continued to open programs and create simple, not composite, documents.

Bruisers and browsers: through thick and thin

Another perennial seesaw act in software architectures has been the thin or thick client debate.

With the Internet came the viability of the thin client (the concept had been around for many years; after all, what were dumb terminals?). However, the need for permanent connectivity and adequate bandwidth has made the move towards computing in what we now call the Cloud a fairly slow process. Let’s take email as an example. It took quite a few years for people to accept using web mail solutions instead of Outlook or another desktop mail client. And to this day, thick mail clients are still widely used. Likewise for most ‘office productivity applications’. Maybe that is now changing.

But the Internet did usher in a whole new era of self-service applications like banking, retail and even financial trading. These new applications were presented to us through the browser, which became a powerful execution and presentation engine through the addition of Web 2.0 technologies. It is now to the point where a browser-based application can be as functional, interactive, responsive and robust as a thick-client application (maybe more so).

The heterogeneity of technologies to achieve this is both a wonder and a worry. Look under the hood and you find a dog’s dinner: simple HTML with embedded scripts – in multiple scripting languages; outward references to style sheets that act in a passive, declarative way; calls to external services; all coupled with implicit browser behaviour. Who’s in charge? Use CSS floats and you can’t even tell in what order the HTML elements will be displayed on the page. Not the worst problem in the world, but a fundamental violation of one of HTML’s original simple premises: that things just stacked on top of each other from top to bottom.
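The float point is easy to demonstrate. In this minimal sketch (the class name and content are invented for illustration), the element that comes first in the source is floated right, so it is pulled out of the normal flow and rendered beside, not above, its sibling:

```html
<!-- Hypothetical example: source order no longer dictates visual order -->
<style>
  .sidebar { float: right; width: 40%; }
</style>
<div class="sidebar">First in the HTML source, but shown on the right.</div>
<p>Second in the source, yet the reader’s eye meets it first, on the left.</p>
```

The markup still reads top to bottom, but the page no longer does.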

Never mind where’s the data? Where’s the program? It’s all in there somewhere. Despite this muddlement, the browser became a bruiser and looked destined to dominate the shape of programs and programming until…

Back to the smartphone future

What got me thinking about data versus programs and thin versus thick clients – about architectures in general – was the introduction of the iPhone. Among the many disruptions it caused, it equivocated over the direction of application development. At the launch, Steve Jobs proudly revealed how browsing on the iPhone became feasible. The device had a clear screen. Tap to enlarge an area of interest on a page. Touch to interact with buttons, good drop-down lists, usable text entry. Anything you could work with in a browser on your desktop computer, you could do on the iPhone. (Except Flash, as we all know.)

But in the wake of the phenomenal success of the iTunes Store, Apple also launched the App Store, which has become a model for the industry. What we got from that was a whole new way (yet again) of delivering old-fashioned programs. I don’t know whether Apple knew how this would pan out. But what has happened drives me to blog. Or should I say rant?

Rant 1: Simple. Rejoice. Great to have new ways of delivering functionality that requires portability and mobility.

Rant 2: Democratisation of software. Anyone with an idea can build and market an App. They could have done that on the Internet using web technologies. The App Store adds the dimension of a clear marketing vehicle. It was probably more rewarding when the Apps numbered in the thousands. Trouble is, of course, app stores (no brand intended here) give credibility to some unholy and unmitigated junk. I resent the lies, the Apps deceptively named to confuse, and the debilitated functionality. This includes software that is nothing more than advertising. How does it get through screening? It’s as bad as a link farm.

Rant 3: Half-baked mobile software. Make up your minds. Is it something I should run in a browser or is it a self-contained App? Thank you for your App’s big buttons to select a primary function, but shame on you for those buttons that simply take me to an in-App or out-of-App browser window. The user interface suddenly changes, and the out-of-App browser window has no integral navigation back to the App.

Rant 4: If I have the App, please don’t run the browser version. I hate it when this happens: Facebook, YouTube, Twitter – Apps all on my iPhone. Sometimes they open in the browser; sometimes in the App. Why?

Rant 5: Same functionality everywhere, please. I notice that with iOS 6, if you use the browser it will tell you that there is an App. Good idea. I’ll get that App. Of course, your App had better not do any of the nonsense mentioned in my prior three rants, and one further thing too. If you tell me to use the App, it had better have at least the same functionality as your browser application. The worst case is when each does some things the other doesn’t. That’s a nightmare for a user – for your customers, if you need reminding.

Rant 6: Where’s my data? Clearly not an issue for most people, although I daresay some people might detect there’s something wrong when they can’t see text written in one program (say a Note) when they want to look at it in another program (say a Diary). The trouble is that programs make very poor viewers of collections of data. If I have word processing documents filed by topic on my computer, why can’t I have them filed with documents of all formats about the same topic on my smartphone or tablet? Is staying organized really anathema to mobile computing? I know that Search is the new organizational paradigm (i.e. don’t organize, but instead use Search to find your needle in a haystack), but I just did an experiment to see if I could find a Pages document via title or contents on my iPhone with Spotlight. The answer is no.

What’s up, App?

Mobile devices are very App-centric. You’d almost think they contained no data. I think this is the legacy of embedded devices being largely single-purpose sensing devices – telemetry, imagery, etc. But our current mobile devices are truly small all-purpose computers.

The irony for me in Apple’s spearheading Apps is that my first experience of clicking on a document to open a program was on my very first Macintosh 128K in 1984. Now for Apple – and the rest of the industry – we’re definitely back to the App. The program remains supreme and the App is crown prince. And data is almost invisible.

Are Apps a good thing? My view is that there is as much drag from Apps as tailwind. Innovation and anything that fosters it is a great thing. However, we still need to remember what has and hasn’t worked in the past. And we need to be discriminating about what’s appropriate. Just because an App is possible doesn’t mean it’s right.

Alongside Apps, I have to ask: what will happen to the browser-based application model? I don’t know. I’m an analyst and so my crystal ball should tell me, but it is more like a snowy winter scene in there when I ask that question. There are times I really want a desktop version of a mobile App and it doesn’t exist. I’d be glad to have it in either thick-client or browser form. When you look at them, Apps are not really thin clients. Nor are they exactly thick clients. What the world is moving to is maybe a crossbreed ‘thinck’ client. (And I want credit if ‘thinck’ or ‘think’ takes hold.)

Whether we are talking thin-, thick- or thinck-clients, we need to be careful with the latest technologies. It is not the architectures that are a problem, but their expression in development technologies. HTML with CSS with XML with JavaScript with jQuery and JSON – making clients and RESTful services, and, and, and – it is a dangerous cocktail. I challenge you to get one of your programmers with two years’ experience to decipher the work of a colleague. My view is that you’ll find the only way to manage the incredible inscrutability and complexity is through automation that gives you clarity on the diverse components of a single program, let alone whole systems, and helps you manage the software delivery process. In an industry where the cobbler’s children rarely have decent shoes, will you believe me when I say you must use the best development and ALM tools available? We can’t afford undisciplined or individual artisan programmers. Because if you thought you had mounted up technical debt in legacy systems, I assure you that it has just taken on global-debt-crisis proportions.
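To make the cocktail concrete, here is a contrived sketch – the `/api/prices` endpoint, the file names and the element id are all invented for illustration – of how many technologies can end up steering one small fragment of one page:

```html
<link rel="stylesheet" href="styles.css">   <!-- passive, declarative styling -->
<script src="jquery.min.js"></script>       <!-- a library with behaviour of its own -->
<ul id="prices"></ul>                       <!-- plain markup: an empty list -->
<script>
  // Imperative JavaScript, via jQuery, fetching JSON from a RESTful service
  $.getJSON('/api/prices', function (data) {
    data.forEach(function (item) {
      $('#prices').append('<li>' + item.symbol + ': ' + item.price + '</li>');
    });
  });
</script>
```

Four files, three languages, one remote service and the browser’s own implicit behaviour, all to render a single list. Who’s in charge, indeed.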