Apple TV Siri Annoyance

I finally got hold of my new Apple TV. The timing was not ideal, as it arrived on Monday - but I had left early on Monday morning for a week-long trip abroad, so I only got to set it up on Friday morning. I wasn’t exactly worried about spoilers, though, so I went ahead and read many of the early reaction reviews. My reaction was similar to what Michael Rockwell describes:

Reviews of the new Apple TV started showing up on Wednesday of last week with deliveries of the device starting to arrive on Friday. I wholeheartedly expected to see overwhelmingly positive reactions from reviewers and owners in my Twitter timeline. But what I saw instead was a barrage of complaints about what I'd consider to be relatively minuscule pain points about the experience.


The complaints I have seen focus mainly on text input. The issue is that all the letters are on a single row, so you end up swiping left and right a lot to enter text. This is somewhat mitigated by the super-easy initial setup, where the Apple TV simply asks you to place your phone near it and picks up your Apple ID, wifi settings, and so on from the phone. Inexplicably, it made me enter my Apple ID password again to set up Home Sharing, though, and the input process was indeed mildly annoying. However, at least in password fields the numbers and punctuation marks appear on a second row, and you can go up and down between rows without having to scroll all the way to the end, so IMHO it’s no worse than any other on-screen text input method. Also, you don’t really enter a lot of text after the initial setup process, so the pain is pretty contained.

On the old Apple TV you could get around the pain by using the Remote app on iOS, which then let you use your iPhone, or even better, your iPad’s soft keyboard to enter text. Unfortunately, the Remote app has not yet been updated to support the new Apple TV.

My own complaint is different. No matter what I do, Siri remains stubbornly disabled.

It seems that Apple have only made Siri available on the Apple TV in certain countries. At the time of writing, the list is as follows:

  • English (Australia, Canada, UK, US)

  • German (Germany)

  • French (France)

  • Spanish (Spain)

  • Japanese (Japan)

I gather that this limitation is because they want to train Siri to pronounce media titles and artists’ names correctly for each locale. However, the way they have implemented it is, as I put it in my initial hot-take tweet, bullshit1.

I spent some time attempting to fool the Apple TV into enabling Siri by setting language and region combos that were supported, disabling Location Services, and so on. Nothing I tried got past it - it seems to be going exclusively by the country of the iTunes Store account, so I can choose whether to have Siri or the Store, but not both.

Why can’t big companies understand that some people live in Region A, but want their media from Locale B? If I set everything up to be in en-GB, you don’t need to worry about Siri mangling anything, because it will be speaking the Queen’s English2.

Unfortunately Apple is not new to this particular brand of bullshit1. The iTunes Store forces users to register to the country where their credit card bills are sent. This means that all the catalogues, curated selections, promotional offers and whatnot are specific to that country. In my case, I consume most of my media - books, films, music, etc. - in English, and so the front page of the Italian iTunes Store is utterly useless to me.


It gets worse, though. Sometimes a title is simply not available to me, for no apparent reason, even though it is in the UK or US iTunes Store and available in Italy from other (legal) sources. There is of course never any explanation of why this might be. A few times I have asked writers if they could shed any light (thinking of ongoing international rights negotiations, that sort of thing), and none have yet had an answer - although all have been unfailingly polite and usually suggested alternatives.

The worst, though, are the subtle differences. Animated films in general, and Pixar films in particular, are often available in the iTunes Store with only one audio track: the Italian dub. If you buy the DVD you get the original English as well, but Apple in its wisdom will only sell you the dub - even though almost every other film in the Store has multiple audio tracks.

Just to be clear, this is not only Apple’s problem. Another recent offender is OpenTable. OpenTable does not operate in Italy, so reasonably enough, the app is not available in the Italian App Store. However, I spend a lot of time in regions where OpenTable is supported, and web apps on a phone are a faff, so I jumped the fence and got the app on my phone anyway. When I fired it up, though, all it would do was give me a snippy message about only being available in certain countries - despite the fact that I was standing in the middle of the capital city of one of those countries, within a stone’s throw of a dozen restaurants that supported OpenTable.

I ended up eating at a restaurant that did not accept OpenTable, and enjoyed an excellent meal without their help.

Michael Rockwell is bullish about the software gremlins in the new Apple TV getting fixed soon:

I have high hopes, though. In a few short months, after Apple's shipped a software update or two, we'll no longer have quite as many criticisms to talk about. What we'll be left with is a well-crafted software platform that could revolutionize the way we think about our TVs, in much the same way the App Store has changed how we think about our telephone. As long as developers build incredible software and Apple continues to focus on improving the experience for users, this is going to be a big deal.

I wish I could be equally bullish about the bullshit1 regional policies being addressed as promptly, or indeed ever.

  1. Sorry about the swearing, but this really is bullshit. 

  2. Also known as "English (Traditional)" - as opposed to the "English (Simplified)" they have in the colonies… Don’t be afraid of the U, Americans - it won’t bite you! 


Enterprise Brand Advertising

I've spent most of my career around enterprise IT sales. I have learned a lot from my colleagues on the sales side, which makes me much more effective in supporting them. After all, let's not forget - whatever your business card or your email sig says your particular role is, ultimately we're all in sales, or we're all out of a job.

One of the things I have learned, however, is that there is a very common misunderstanding of the role of marketing and advertising in enterprise IT sales.

The first thing to bear in mind is the difference between direct marketing and brand marketing. To quote Wikipedia,

Direct marketing is attractive to many marketers because its positive results can be measured directly. For example, if a marketer sends out 1,000 solicitations by mail and 100 respond to the promotion, the marketer can say with confidence that the campaign led directly to 10% direct responses. This metric is known as the 'response rate,' and it is one of many clearly quantifiable success metrics employed by direct marketers. In contrast, general advertising uses indirect measurements, such as awareness or engagement, since there is no direct response from a consumer.

Measurement of results is a fundamental element in successful direct marketing. The Internet has made it easier for marketing managers to measure the results of a campaign. This is often achieved by using a specific website landing page directly relating to the promotional material. A call to action will ask the customer to visit the landing page, and the effectiveness of the campaign can be measured by taking the number of promotional messages distributed and dividing it into the number of responses. Another way to measure the results is to compare the projected sales or generated leads for a given term with the actual sales or leads after a direct advertising campaign.
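The response-rate arithmetic in the quoted passage is simple enough to sketch. The figures below are the hypothetical ones from the quote (1,000 mailings, 100 responses), not real campaign data:

```python
def response_rate(messages_sent: int, responses: int) -> float:
    """Return the direct-marketing response rate as a percentage.

    This is the metric from the quote: responses divided by the
    number of promotional messages distributed.
    """
    return responses / messages_sent * 100

print(response_rate(1000, 100))  # 10.0 - the 10% response rate in the quote
```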

Sales people tend to assume that all marketing should be direct marketing - that is, marketing that should trigger a measurable action. Each campaign must generate a certain number of leads, a certain percentage of which will turn into actual qualified opportunities, and with any luck some of those opportunities will eventually close.


Coffee is for closers only

Of course this situation generates all sorts of fun conversations where sales people question the quality of those leads, while marketing answers back with barbed retorts about the number of opportunities that the sales team actually convert. Lost in the Sturm und Drang of the resulting blamestorm is the question of whether this model can even work at all.

On the other hand, straightforward brand advertising is ridiculed as a waste of money. Since it is hard to measure almost by definition, it doesn't fit into the usual opportunity conversion spreadsheets, and is therefore the first expense to be cut when Sales is driving the bus. In fact, the only way of measuring the impact of brand advertising is by looking at what happens when you stop doing it.

I would argue that for the sort of long-duration, big-ticket sales cycles that we have in enterprise IT, the expectations of direct marketing in the traditional sense are overinflated. On the other hand, the potential of brand advertising is vastly underestimated - including by marketing departments.

Brand advertising - what is it good for?

Very few people in this space will make or even consider a purchase based on a single campaign or targeted VITO (Very Important Top Officer) letter, no matter how good. This idea plays into the heroic self-image of sales, but it is rarely true. In actual fact, before receiving that first formal sales approach, our prospect already has an opinion of what our company and products are like. This opinion is formed from various different sources, but the most important ones are personal experience and received opinion. If you can get personal experience right the first time you have generally bought yourself a customer for life, but that's a whole other topic. The way you can influence the frame of mind of your VITO is to work on that other axis: the received opinion that they already have of your product and/or company.

In turn, that received opinion is also determined by two main sources: word of mouth, and - yes - brand advertising. Brand advertising helps position your company as the sort of company that your VITO would want to do business with: innovative, customer-focused, stable & reliable, or whatever your particular values might be.


A prospect reading a VITO letter

So far, so much like direct marketing, except without the all-important metrics. Where brand advertising pays off is in what happens next. What you want to happen after VITO reads their customised1 letter is that they run down the hallway, burst into their colleague's office or their team's open space, and announce excitedly that they have just read about a very cool-sounding product from your company.

Their willingness to do this - to expose themselves in this way - is going to be predicated on your company's credibility in the space where you operate. In 2015, anyone getting excited about a new offering from Blackberry would have to do a fair amount of explaining (sorry, Blackberry). Conversely, nobody has to explain why they are planning to buy cloud services from Amazon, storage from EMC, networking gear from Cisco, or insert your own favoured example.

While no amount of brand advertising could save Blackberry at this point, Amazon, EMC, Cisco, et al got to where they are in no small part by working on their perception in the market. In other words, their brand advertising ensures that VITO will be excited, rather than embarrassed, to involve their colleagues in an evaluation and advocate for a new offering.

What advertising can and cannot do

None of this is to say that brand advertising alone is sufficient if the products themselves don't meet expectations around price, performance, support, or any other axis that is important to customers. However, brand advertising can help get to the point where those variables can come into play.

Ironically, this distrust of advertising is one of the very few things that sales people and engineers both agree on. Both are making a category error which is nicely explained in this episode of the excellent Exponent podcast, with Ben Thompson and James Allworth.

Bottom line, I think overlooking brand advertising is a false economy. If you’re a startup, of course, you can’t blanket airports and fill business magazines like the big guys2, so figure out what you can do in that space. And if you are a big company that doesn’t do brand advertising, know that prospects are asking themselves why that is - and this definitely feeds into the perception they have of your company.

  1. You are customising your VITO letters to each prospect, aren't you? 

  2. Startups should not waste resources trying to act like bigger, more established companies. Startups have a disruptive value of their own, and can play on that3

  3. Says the guy with the psychedelic cow logo on his business card. I mean, check out our website. Nobody will mistake us for a staid, established vendor - and that’s the point

DVDs: Dead Video Discs

It seems that Microsoft has not only removed Windows Media Center from Windows 10, but will charge users $15 to restore its functionality.

I think Ars Technica's theory that this is about offsetting DVD format licensing fees in a free upgrade is probably correct. Apple of course provides all its operating systems for free, but since none of its computers include optical drives any more, maybe it's not affected. Even when Macs did have DVD drives, I assume the license fee was covered by the hardware.

What is more significant to my mind is how long it took after the release of Windows 10 for this to come to light. It's yet another sign of the death of physical media.

PFY story

Tales from the front

In honour of Sysadmin Day, here’s a story from my own sysadmin days, originally posted here. It’s a snapshot of history, from CodeRed, to Ghost, to my sub-BOFH stylings. Enjoy!

So I'm talking to my BOFH, currently undergoing recovery on some Mediterranean island with his s/o, via a rather good free online SMS service, when I notice the (usually) non-PHB bearing down on me at a high rate of panic. It's too late to hide beneath my desk (too much "reassigned" hardware), so I stand my ground...

Once I make sense of his pathetic gibbering, I gather that he has just received a call from IT-Security (an oxymoron if I ever heard one, at least in these people's hands) about some machines in our IP pool still sending requests from CodeRed, only three weeks after it first turned up in the wild and (of course) instantly penetrated $ORK's network. So I speak to him in calming tones, get into my records and find the owner of one machine, but note that the second IP address is still listed as "free" in my records. Now any discrepancy between the Real World(tm) and my records must immediately be adjusted, naturally in my records' favour, and traditionally with much screaming and gnashing of teeth on the part of the luser who made the Real World(tm) alteration.

I wander over to the luser I did manage to locate, but he tells me that he passed the machine on to $OTHER_LUSER and did not see fit to inform me... Sometimes being the PFY sucks - I need to instill more respect in these people, but I'm leaving soon anyway.

At $OTHER_LUSER's desk I find both the problem machines... This one will need watching, I fear. I enquire of $OTHER_LUSER what part of the extra-ultra-high-priority email I sent around about patching all Windows machines1 he didn't understand, and he burbles something in reply - a pointless waste of neurons, forgotten even as it emerged from his mouth.

Sighing, I kill the processes, remove the backdoors, patch the machines and reboot them without asking if he had anything important to save among the myriad apps uselessly cluttering his taskbar. I sense him about to protest and turn all 500 watts of my hardest stare on him - he holds out longer than most, and his eyebrows are beginning to singe when he finally looks down.

"Good", I mutter darkly, and wander back to my cube to check the web servers’2 logs yet again - I had foolishly assumed that all was well, but yet again my naive trust in humanity's native intelligence has been proved wrong. Fortunately this time everything would appear to be well, so it's back to snoozing for another hour or so until it's time to go home...

  1. Yes, even if it's a test machine. Yes, even if you're going to Ghost it again in a few weeks. Yes, even if it would be hugely inconvenient. YES, ALL FSCKING WINDOWS MACHINES!!! YOU INSISTED ON HAVING THEM!!! YOU IGNORED ME WHEN I POINTED OUT EVERYTHING THAT WOULD GO WRONG! YOU THEN INTERRUPTED ME WHEN EVERYTHING ON THAT LIST PROCEEDED TO PROVE ME RIGHT BY GOING WRONG! sigh 

  2. All Apache on various flavours of UNIX and Linux - do I look like I enjoy pain3? 

  3. Someone evidently did - when I arrived there was IIS everywhere. Not any more! 
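For the curious: CodeRed announced itself in web server access logs as requests for /default.ida followed by a long run of filler characters, so the log check in the story could be sketched along these lines. This is a sketch, not the original script; the log format is assumed to be Apache's combined format, and the log path is whatever your setup uses:

```python
"""Count CodeRed probes per source IP in an Apache access log.

CodeRed scans show up as "GET /default.ida?" followed by a long run of
filler characters (N for the original worm, X for CodeRed II).
"""
import re
import sys
from collections import Counter

PROBE = re.compile(r'"GET /default\.ida\?[NX]+')

def scan(log_path: str) -> Counter:
    """Return a Counter mapping source IPs to the number of probes sent."""
    hits = Counter()
    with open(log_path, errors="replace") as log:
        for line in log:
            if PROBE.search(line):
                # In the combined log format, the client IP is the first field
                hits[line.split()[0]] += 1
    return hits

if __name__ == "__main__" and len(sys.argv) > 1:
    for ip, count in scan(sys.argv[1]).most_common():
        print(f"{ip}\t{count}")
```

Pointing this at the day's access log would have produced exactly the short list of infected IPs that IT-Security was panicking about.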


Presentation Mode

I do a lot of PowerPoint in my job, and have done ever since I moved out of being a full-time sysadmin. Whether it was preparing the stage for a demo when I was in technical pre-sales, delivering RoI projections during my stint in sales, or big-picture context setting in marketing, the vehicle of choice always ended up being PowerPoint.

While life has got better over the years, one thing is still surprisingly difficult at times, and that is getting the presentation to show up on the prospect’s equipment. When you are presenting at an event, you typically have some time to go test all the A/V kit and so on, but when you’re pounding the pavement, you get shown to a meeting room and you have to plug in to whatever is there and be ready to go.


This is where the trouble can start. First off, we are still working with VGA connectors. While not a terrible connector for fixed equipment, it’s not ideal for laptops - and in fact, most modern laptops have followed Apple’s lead and dropped the VGA connector, usually in favour of HDMI1. However, only the best-equipped conference rooms offer HDMI (and vanishingly few have Mini DisplayPort), so this means that we all get to carry VGA dongles around.

Physical connection achieved, we have to cross our fingers and hope for a decent resolution. XGA - 1024x768 pixels - is still the lowest common denominator, so you have to make sure your slides look okay at that resolution. Getting the slides to work is the easy part, unfortunately; most modern software GUIs will struggle at that resolution. Make sure you do your warm-up exercises for your scrolling finger!

The Great Demo blog has a useful set of tips for making sure your slides will work in unexpected situations. They’re mostly good suggestions, as is the rest of that blog, but I really take issue with the last point, which recommends disabling Presenter Mode.

I could not disagree more. I have never yet seen a situation where Presenter Mode was the factor that made the difference between being able to work with a projector and not. However, I have often been in situations where the ability to keep an eye on the time, see presenter notes, take a peek at the next slide, or even jump to a backup or optional slide without having to break the presentation flow, has been invaluable.

In fact, one of my pet peeves is at conferences or events where the organisers provide their own laptop instead of allowing you to connect your own. On the one hand, this avoids all the trouble with connecting the laptop to the projector in the first place - but on the other hand, it means that you’re not using your own setup. Nine times out of ten, the presentation laptop is in mirror mode, not Presenter Mode. During prep, if I have the time I will switch it to Presenter Mode - and all too often, A/V staff will then switch it back to mirror mode.

It may well be that Presenter Mode is confusing to inexperienced presenters, but this means that our suggestion to them as seasoned presenters should be to learn it and love it, not just to turn it off. Sure, it’s a power user feature, so maybe don’t mess around with it in your first week on the job - but maybe you shouldn’t be giving customer presentations until you are confident enough to roll with that anyway.

That said, don’t allow it to turn into a crutch. Too much jumping around within a deck will confuse your audience. They will be aware of it even if they don’t actually see you doing it on screen. Also, if you are presenting in a webinar, you will almost certainly not be able to use Presenter Mode unless you jump through a lot of hoops. In that situation, the better and more robust solution is to have your presentation on a second machine (or my personal solution: an iPad) and a timer on your phone (muted!) to help you stick to your story thread and timing.

  1. On the other hand I did have a Dell a few years ago with a DisplayPort output. No, not Mini DisplayPort - full-size DisplayPort. It looks like an HDMI port with one end squared off. I have never seen another piece of DisplayPort hardware apart from that generation of Dell laptops. 

Apple Watch

Since I’m in the US this week, I thought I’d go check out an Apple Watch.

You can’t just rock up at the Apple Store though - you need to make an appointment beforehand through Apple’s concierge service. I duly did this, and was greeted at the door by a blueshirt - by name!

The device itself was pretty cool - I particularly like the Watch with the Milanese Loop band - but the try-on experience was underwhelming. The watch you get to strap on is running a canned demo video, which is not interactive in any way. To test the applications, you have to use the Watches that are in the display units with the iPad controllers.


The problem with this experience is that it doesn’t let you try the features I was most interested in: Glances, and the Taptic Engine. I was really curious about these two, and in fact I still am, since I didn’t get to check out either one.

Both of these features relate to notifications. The Taptic Engine is what generates the "tapping" sensation on your wrist that tells you that you have a notification in the first place. It can also be used for other things, such as giving you silent navigation directions. Meanwhile, the idea of Glances is that when you receive that notification on the Watch, you can rotate your wrist towards you, and the Watch wakes up and displays the notification on the screen. At this point you can either rotate your wrist away again, in which case the notification is dismissed, or start interacting with it in whatever way is appropriate.

Based on everything I’ve read and heard, I expect the main value of the Watch to be in its ability to help wearers process notifications. If you can take a quick look at your wrist to see whether a notification requires your immediate attention or not, that is much less disruptive (and rude!) than getting your phone out, unlocking it, and so on. However, I can’t tell how good the Watch actually is at doing that on the basis of a canned demo reel or interacting with stored notifications!

I have also seen some comments that the watch is sometimes slow to display the actual, y'know, watch face. The screen is off when you're not looking at it, but it's supposed to turn on when you rotate your wrist to look at the watch. This was something else that the try-on experience didn't allow me to test.

I really hope Edition buyers get a more complete experience. Maybe I should have tried for that instead? I was after all conspicuously the only person in the Apple Store wearing a blazer…

If nothing else, one good thing that has come of the Watch is that Apple has redesigned its UK power adapter.

I really hope that when the Watch launches in continental Europe, that plug also gets redesigned. The US Apple power adapter, with its folding prongs, is definitely the best to carry around. The UK & European1 versions always seem like afterthoughts in comparison, sticking out and catching on things.

  1. Heavy fog in the Channel; Continent cut off. 

Faster disruption

There are theories which seem just intrinsically right when you hear them. Clayton Christensen's famous "disruption theory" is one of these. I was recommended to read "The Innovator's Solution" by a friend of mine who had previously worked directly with Professor Christensen, and it definitely shaped my thinking about the technology business.

When the core business approaches maturity and investors demand new growth, executives develop seemingly sensible strategies to generate it. Although they invest aggressively, their plans fail to create the needed growth fast enough; investors hammer the stock; management is sacked; and Wall Street rewards the new executive team for simply restoring the status quo ante: a profitable but low-growth core business.


In sustaining circumstances—when the race entails making better products that can be sold for more money to attractive customers—we found that incumbents almost always prevail. In disruptive circumstances—when the challenge is to commercialise a simpler, more convenient product that sells for less money and appeals to a new or unattractive customer set—the entrants are likely to beat the incumbents.

Disruption theory explains a lot about many markets, although it is not without its critics. In particular, Jill Lepore caused a minor furore with a piece in the New Yorker entitled The Disruption Machine, in which she accused Professor Christensen of cherry-picking his evidence.

There is one famous exception and objection to disruption theory, and that is Apple. According to the orthodox version of the theory, Apple should have been disintermediated by now by smaller, more agile modularised competitors. Every year seems to bring its candidate as the disruptor of Apple: whether it's the Nexus, or Samsung, or Xiaomi, or whoever. Apple somehow survives them all, and not just survives, but goes from strength to strength.

Why is this?

Ben Thompson wrote about how classic disruption theory applies mainly to enterprise products, where the buyer is not the user, and so "feeds & speeds" that can be mapped on a checklist rule the purchasing process. The buyer is looking for a product that can satisfy some simplified criteria. Beyond that binary fit, the decision is primarily based on price.

Apple emphatically does not fit this model, with its focus on design and the user experience. The buyer is the user, and once their basic criteria are met, they can still be swayed by a different user experience and by personal preferences - of course, up to the budget they have available or are willing to assign to a piece of electronics. Therefore, Apple continues to be able to command much higher prices for devices that are - on paper - comparable to their modularised competitors. Users return again and again for newer versions of their device, every year or two, and Apple's profit margins are legendary.

Enterprise software had always seemed to be on a much more classic track to disruption, with procurement departments working from Request for Proposal (RfP) documents that generally allow yes/no or at most grading on a short scale, typically from one to four. Recently, though, the market has been changing, and incumbent vendors are being disrupted by offerings which bypass the traditional buyer in Procurement and appeal directly to the end user. One model is open-source, where sufficiently technical users have been able to download and use free software without support from central IT for at least the last fifteen years. The main roadblock to this avenue of disruption was in users' willingness to futz around with graphics drivers or whatever. More recently, a new avenue has opened up, namely software as a service. SaaS offerings require much less technical acumen, and much less effort even when the technical acumen is available. Even users who are able and willing to get their hands dirty only have a limited amount of time available to do so, and are quite happy to hand the responsibility off to someone else.

This is how you get the infamous "shadow IT" - enterprise IT's particular incarnation of disruption theory. However, I do wonder whether enterprise software might not also have exceptions to classic disruption theory. Buyer inertia may prevent modularisation, or at least complete modularisation, from taking hold, or delay it for a long time.

From integrated to modular to orchestrated

Apple cannot easily be replaced by modular competitors, even when those competitors offer lower prices and nominally higher performance, because the overall user experience delivered by those competing devices is inferior. There is an equivalent mechanism in enterprise software - although, unfortunately, it does not often manifest in attractive user interfaces and satisfying interactions. Rather, it is the experience of the buyer which is important.

Many mature, established companies have a "vendor rationalisation" initiative of some sort. Some may even go so far as to have an "Office of Vendor Management" or equivalent. One way of looking at this is the "one throat to choke" school of thought taken to its extreme, but there is something else going on here.

As software becomes more complex, and user requirements more varied, there are fewer and fewer one-stop software packages. Even within a single vendor's offerings, users will need to select multiple packages, many of which will have been developed by different teams or even by different companies acquired by the vendor over the years. Customers are looking for a trade-off between best-of-breed solutions from different vendors or open-source tools that require substantial work to integrate with each other, versus vertically integrated solutions from a single vendor that may not excel in any one area but can deliver on the whole task.

The variable that will drive choice in one direction or the other is the rate of change. If the integration between the best-of-breed packages remains valid, once developed, for a significant length of time, then the modularised, disrupting solutions - whether commercial on-premise, open-source, or SaaS - will win. If on the other hand integration is a constant effort that never fully stabilises, requiring never-ending development to chase a constantly moving target, then the benefits of the pre-integrated solution become more attractive in their turn.

Timing is everything

The twist is in the incentives. Developers of the modularised solutions are in a race with other modularised solutions, hired to do the same job, in Christensen's terminology. The way they keep ahead in the race is by evolving faster, adding more functionality sooner than their competitors. They have no direct incentive to stabilise their solutions. For the same reason, they have no particular incentive to stabilise the interfaces to their solutions, as this makes them more easily replaceable by their competitors (less sticky).

The upshot of all this is that a vertically-integrated company can stay ahead of the curve of disruption by innovating just enough to maintain stability for its users, while supporting a certain speed of evolution. This is the job that their customers hire them to do.

The commercial Unix platforms were displaced by Linux because both followed standards (GNU, POSIX, and so on) that made them largely fungible from the point of view of their buyers, once Linux had developed beyond its beginnings. iPhones were not displaced by Android phones because they were not fungible to their buyers.

How not to be fungible

What are the characteristics of enterprise software that can make it non-fungible? Simply put, it comes pre-integrated, both with itself (or rather, between different components of itself) and with everything else. This is why content is so important. An API is not enough to avoid being disrupted; any open-source project worth its salt comes with an API - probably RESTful these days, but the principle is independent of technology. What prevents disruption, making the software "sticky", is content: pre-built integrations, workflows, best practices, and data transformations that make the software work seamlessly for customers' needs.

Enterprise software needs enterprise-grade content that takes advantage of those integrations. Relying on technical capabilities alone leaves it enormously vulnerable to motivated developers and agile start-ups. A would-be enterprise vendor must focus on what it can do to prevent disruption. Agility - chasing the bleeding edge - is not the job that buyers hire it to do.

However, there ain't no such thing as a free lunch: those integrations have to keep up with those other fast-moving targets. There is no "we support these two products, and we will add the third-placed platform with the next release of our software in a year's time" [1]. The integrations themselves have to evolve, and do so in a way that is both backwards-compatible (you can't break everything your users have built every time you upgrade) and fast-moving (you have to keep up with where your users are going, whether to new platforms or to new versions of existing platforms [2]).
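A minimal sketch of what "backwards-compatible but fast-moving" can look like at the code level. All names here are hypothetical, invented for illustration: the point is simply that a new version of an integration can be a strict superset of the old one, so existing users keep working while new platforms get supported.

```python
# Illustrative only: a vendor-maintained integration that evolves without
# breaking its existing consumers.

def export_v1(record):
    """The original integration: the payload shape existing customers rely on."""
    return {"id": record["id"], "name": record["name"]}

def export_v2(record):
    """A newer integration targeting a newer platform. It reuses v1 and only
    adds fields, so anything built against v1's output still works."""
    payload = export_v1(record)              # reuse, don't rewrite
    payload["tags"] = record.get("tags", [])  # new field, safe default
    return payload

record = {"id": 7, "name": "widget"}
# Every key v1 produced is still present, with the same value, in v2's output.
assert all(export_v2(record)[k] == v for k, v in export_v1(record).items())
```

The design choice being illustrated is additive evolution: the v2 adapter never removes or renames what v1 emitted, which is what lets the vendor chase new platforms without breaking what users have already built.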

The bottom line

All of this represents yet another level of abstraction. The competition moves to a different layer of the stack: the content and integrations. Enterprise vendors who refuse to follow along are simply ceding the field to their competition; users - and importantly, buyers - are there already.

Hardware got commoditised, then operating systems got commoditised, and now it's the next layer up. It's not what you have under the hood, it's what you do with it - and both people and enterprises will buy the tool that enables them to get the most done.

  [1] Please note the small print around any roadmap estimates. 

  [2] Note that saying "nobody is using X in production yet" doesn't cut it. Users are most certainly using X in testing as a prelude to putting it into production, and as a part of that process they need to test that everything else integrates with X. Missing that wave is the first step to hearing "we went into production with your competitor on all new projects because they were able to support us in our move to X". 

Happiness in Typesetting

In my usual spirit of always wanting to try an alternative way of doing something - partly in hope that the alternative might be better, partly due to my latent hipster gene trying to express itself - I have always been curious about LaTeX. It's a big commitment, though, and until now I had lacked data to drive my decision.

Somebody (shockingly, they are in Germany) has done a scientific comparison of LaTeX vs Word.

So it turns out that a specialised tool is really good at a specialised task, while a jack-of-all-trades tool does better at general tasks. Big whoop.

The interesting question to me would be a breakdown of how many Word users know how to use even fairly basic features. The style sheet functionality seems to be a mystery even to people who really should know better.

People complain a lot about Word being obtrusive, and there is definitely truth to that complaint: try nesting tables, or padding them, or doing two-column layouts that don't flow, and then come back and tell me about Word - but only once you stop swearing and twitching, please. However, many of the complaints that I hear tend to be more about people not knowing about a feature in Word, or not using it properly.

Part of that problem is of course due to the design and usability of Word itself, but it's noticeable that all of the alternatives to Word run into the exact same problem of complexity as soon as they get past the basics. It's often said that 80% of users use only 20% of the features of software - but Word is the perfect example of the fact that everybody has a different 20% subset that is critical to them.

Anyway, while it looks like LaTeX only really shines for mathematical equations, LaTeX users appear to be happier, so I may yet have to give it a go.
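For what it's worth, this is the sort of thing LaTeX makes effortless - a minimal standalone document (my own sketch, not from the study), with no fighting an equation editor:

```latex
\documentclass{article}
\begin{document}
The quadratic formula, typeset directly from markup:
\[
  x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
\]
\end{document}
```

Getting the same output in Word means round-tripping through the equation editor; in LaTeX the equation is just text in the source file.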

Watch this space…

Management agents

I went to buy some lunch today to eat at my desk, because it's a short week and I'm busy and shut up don't judge me. I got back from my 15-minute trip to find my MacBook Air fairly propelling itself off the desk with its fan, uncomfortably hot to the touch, and minus 30% of the battery life expectancy that it was showing when I left.

This is why people hate those management agents that corporate IT departments foist on them.

I have this unkillable process (running as root, natch), which creates its own undeletable user account and does Cthulhu only knows what horrible things to the filesystem. Now I don't have a problem with my employer keeping tabs on their machine that is currently assigned to me. Even if you assume all users are honest, someone might make an honest mistake that winds up endangering corporate data. What I do object to is when that process of keeping an eye on things gets intrusive.

This is why I first did the BYOD thing, after all. Unfortunately as Macs went from niche to ubiquitous, the Security Solutions came to the Mac too.

Maybe it's time to go back to doing the VM dance, with the clunky corporate environment sandboxed safely away in a VM that can be shut down when I don't want to deal with its overhead?

Sic Transit Gloria Mundi

Our local IT person came round looking for an external optical drive. Seems the new generation of corporate-standard laptops has finally ditched the internal optical drive, but she only has optical media to install the OS. Of course the only optical drive we could find speaks eSATA, and the new laptop doesn't have that either.

(Now of course with USB Type-C arriving there will be three interfaces to worry about…)

I no longer own a machine with an onboard optical drive. When my sister (who also owns no optical drives) asked me to rip the CD that came with a (paper) book to MP3 for her, I had to dig out an old internal drive from the Pile Of Stuff That Might Be Useful Someday, hook it up with a USB-to-ATA bridge, and download a bunch of programmes to rip and encode the data - because of course I didn't have any of that installed either.

This used to be, not a daily activity, but probably a weekly one on average for me - and I didn't even take note of the last time I did it. I will remember this time because it was such a hassle, but otherwise I might have completely forgotten. One day my kids will find my old CDs and ask me what they are, and be amazed at the clunkiness. Meanwhile I still remember how amazing CDs were compared to tape cassettes. Nobody will remember CDs fondly, though, precisely because they weren't clunky.


Imperfections are what stick in the memory and stir emotions. People go to a lot of trouble to make iPhone apps that simulate the imperfections of old cheap or disposable cameras: Lomography, vignetting, and so on.

The slick, digital CD will slip into the past and out of our memories as smoothly as it arrived.