There's a pattern to successful IT projects, and it's not about who works the longest hours, has the most robust infrastructure, or uses the most fashionable programming language.
Here is a recent example, which came to my attention specifically because it mentions my current employer, although the trend is a general one: How Nationwide taps Kafka, MongoDB to guide financial decisions. Here is the key part I am talking about:
A lot of organizations try and go for a big data approach — let’s throw everything into a data lake and try and capture everything and then work out what we’re going to do with it. It’s interesting, but actually it doesn’t solve the problem. And therefore, the approach we’ve taken is to start at the other end. Let’s look at the business problem that we’re trying to solve, rather than trying to solve the mess of data that organizations are typically trying to untangle.
It is indeed a common pitfall in IT to start with the technology. You hear about some cool new thing, and you want to try it out in practice, so you go casting around for an excuse to do so. You'll notice, however, that very few of these decisions lead to the sort of success stories that get profiled in the media. The more probable outcome is that the project either dies a quiet death in a corner when it turns out that the shiny new tech wasn't quite ready for prime time, or, if the business stakeholders are important or loud enough, it gets a vastly expensive emergency rewrite at the eleventh hour into something more traditional.
Meanwhile, all the success stories start with a concrete business requirement. Somebody needs to get something done, so they work out what their desired outcome is and how they will know when it has been achieved. Only then do they start coding, or procuring services, or whatever it is they were planning to do.
This is not to say that it's not worth experimenting with the new tech. It's just that "playing around with new toys" is its own thing, a proof of concept or whatever. You absolutely should be running these sorts of investigations, so that when the business need arises, you will have enough basic familiarity with the various possibilities to pick one that has a decent chance of working out for you. To take the specific example of what Nationwide was doing, data lakes are indeed enormously useful things, and once you have one in place, new ways of using it will almost certainly emerge — but your first use case, the one that justifies starting the project at all, should be able to stand on its own, without hand-waving or references to a nebulous future.
Re idiotic "we should teach kids coding in school" takes: if we start in elementary school — what are the odds of the specific stack they learn still being around two decades later when kids are entering the job market? I learned Turbo Pascal in school; very very dead now. https://t.co/YSJNGIY1Fv
— Dominic 🇪🇺 (@dwellington) June 14, 2021
This is also why it's probably not a good idea to tie yourself too closely to a specific technology, in business, let alone in education. You don't know what the requirements are going to look like in the future, so being overly specific now leaves gratuitous hostages to fortune. Instead, focus on a requirement you have right now.
Nationwide is facing competition from fintechs and other non-traditional players in banking, and one of the axes of competition is giving customers better insight into their spending. The use case Nationwide has picked is to help users achieve their financial goals:
We’re looking at how we create insight for our members that we can then expose to them through the app. So you’ll see this through some of the challenger banks that will show you how you’ve spent your money. Well, that’s interesting — we can do that today. But it isn’t quite as interesting as a bit of insight that says, "If you actually want to hit your savings target for the holiday that you want next year, then perhaps you could do better if you didn’t spend it on these things."
Once this capability is in place, other use cases will no doubt emerge.
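To make that kind of insight concrete, here is a minimal sketch of the calculation it might rest on: compare the saving rate implied by a goal with what the member is actually putting aside, and surface the biggest discretionary spending as candidates to cut. The function, category names, and numbers are entirely hypothetical, not anything from Nationwide's actual implementation.

```python
# Hypothetical sketch: is a member on track for a savings goal,
# and which discretionary spending is getting in the way?
from dataclasses import dataclass


@dataclass
class Goal:
    target: float           # amount needed, e.g. for next year's holiday
    months_remaining: int


def savings_insight(goal: Goal, monthly_saving: float,
                    discretionary_spend: dict[str, float]) -> str:
    """Compare the current saving rate against the goal and, if there is a
    shortfall, point at the largest discretionary categories (illustrative only)."""
    needed_per_month = goal.target / goal.months_remaining
    shortfall = needed_per_month - monthly_saving
    if shortfall <= 0:
        return "You are on track for your savings target."
    # Rank categories by spend, largest first, as candidates to cut back on.
    candidates = sorted(discretionary_spend.items(),
                        key=lambda kv: kv[1], reverse=True)
    top = ", ".join(f"{name} (£{amount:.0f}/month)" for name, amount in candidates[:2])
    return (f"You need to save £{shortfall:.0f} more per month to hit your target; "
            f"your biggest discretionary spending is on {top}.")


# Example: saving £150/month towards a £2,400 holiday a year away.
print(savings_insight(Goal(target=2400, months_remaining=12),
                      monthly_saving=150,
                      discretionary_spend={"eating out": 220, "subscriptions": 60}))
```

The point is not the arithmetic, which is trivial; it is that the whole thing is defined by the member's goal, and the data pipeline behind it only has to be good enough to answer that one question before anything grander is attempted.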
But what is the education equivalent of this thinking? Saying "let's teach kids Python in school!" is not useful. Python is in vogue right now, but kids starting elementary school this September will emerge from university fifteen or twenty years from now. I am willing to place quite a large bet that, while Python will certainly still be around, something else, maybe even several somethings, will have eclipsed its current importance.
We should not focus narrowly on teaching coding, let alone specific programming languages — not least because the curriculum is already very packed. What are we dropping to make room for Python?
And another question: how are we actually going to deliver the instruction? In theory, my high school curriculum included Basic (no, not Visual; just plain Basic). In practice, it was taught by the maths and physics teacher, and those subjects (rightly!) took precedence. I think we got maybe half a dozen hours a year of Basic instruction, and it may well have been less; it's been a while since high school.
The current flare-up of the conversation about teaching IT skills at school has this in common with failed projects in business: it's been dreamed up in isolation by technologists, with no reference to anyone in actual education, whether teachers, students, or parents. None of these groups operate at Silicon Valley pace, but that's fine; this is not a problem that can be solved with a quick hackathon or a quarter-end sprint. Very few worthwhile problems can be, or they would not remain unsolved.
Don't confuse today's needs with universal requirements, and don't think that the tools you have on the shelf today are the only ones anyone will ever need. Take the time to think through what the actual requirement is, and make sure to include the people doing the work today in your planning.