Tuesday 31 October 2017

How Open-Source Can Be the True Catalyst for Digital Change

[This article was first published in an abridged form in CIO Outlook magazine]

Twenty years ago, when the web was just starting to become available to consumers, a large proportion of non-tech websites was essentially advertising space: ephemeral virtual billboards promoting some aspect of an organisation’s product or service, driven mostly by marketing departments with no solid connection to day-to-day business. Having been involved in developing websites in those days, and having often argued in vain with my customers that their web presence needed to be more than just a one-off disconnected experiment, it’s still remarkable to me how quickly the online world has become intrinsic to our lives. Today it is the disconnected organisation that is the anomaly: the first thing anyone does when starting a new business these days is to check whether the proposed company name is available as a domain name, and to change it if it is not. Customers expect an online-first and mobile-first experience: we all know people who will skip past providers that don’t have an easy-to-navigate site letting them conduct business online, rather than having to telephone or, even worse, do something in person.

So, it’s a truism that customers expect to be able to access business services on their own terms, and to do as much as possible online without having to resort to (potentially) slower methods of interaction. Businesses have to provide this ease of access or risk being passed by – but developing and maintaining the degree of interaction required is much more complex than the advertising websites of the 1990s. To be truly effective, a company’s digital entry-point must reach directly into the workings of the organisation so that it can handle transactions in real time and satisfy customers’ demands. This is the crux of digital transformation: the business processes that used to be kept internal to an organisation have to be codified and expressed in a way that makes it easy for customers to interact with the company… or else they will simply go to a competitor who can offer that experience.

This approach can be relatively easy when starting from scratch, but for established businesses the need to unravel years (or decades) of business logic and interconnected systems can become a nightmare, especially when time-to-market is important to satisfy customers’ ever-growing demands. Even for new companies, the need to constantly refresh and update quickly brings its own challenges: the market is never static, and if a competitor finds a new and more attractive way of providing the service, then a response needs to be found quickly.

Many organisations are turning to agile methodology and the associated concepts of DevOps and continuous integration in an attempt to address the need for speedy time-to-market; yet the challenge for IT organisations is how to provide the systems needed to support these new approaches when up to 70% of their resources are spent just keeping the lights on in their day-to-day operations.

This is where open-source technology can help. In recent years the vast majority of new systems innovation has arisen from the open-source arena; almost all cutting-edge technologies have an open-source aspect, with industry giants such as Google, Intel, IBM, and even Microsoft embracing open source as a way to accelerate development and spread the adoption of the technologies underlying the modern web beyond traditional proprietary boundaries. While open source software itself is not at all new, it has in recent years become the mainstream way of developing innovative ideas – based in large part on the foundation of that quintessential open source project: GNU/Linux. The freely-available nature of the Linux OS provides a platform with a level playing field for involvement, and the free GNU tools (compilers, libraries, and development environments) also lower the entry barrier for new developers who can contribute towards community projects. Some of these projects can be tiny, perhaps with only a single contributor. Others, such as the OpenStack cloud infrastructure project, include thousands of developers, project managers, and technical writers drawn from big corporations and research organisations as well as individual contributors.

The open nature of development means that more eyes and more ideas are brought to bear on a project: performance and security issues have a higher chance of being observed and resolved, with individual developers eager to build and maintain their personal credibility, and little to no opportunity for sweeping problems “under the carpet” to meet a specific deadline, as is the risk with closed, proprietary code. Open development also means that users aren’t locked in to a particular vendor’s technology, or subjected to the risk of either unconscionable price hikes or the prospect of a product being “killed” due to an acquisition or other business imperative.

Of course, the challenge for IT managers when applying open source technology is how to support it – and especially how to support it whilst maintaining their existing systems. The widely-available nature of open source may make it very cost-effective to acquire, but those savings can be quickly eroded if an organisation has to employ its own experts to build, manage, maintain, and integrate those technologies – and relying on a few key employees becomes a risky undertaking if they leave the company for any reason (or even want to take a vacation). That’s where open-source software companies such as SUSE come in: over 25 years ago the Germany-based software company produced the first ever enterprise-ready version of Linux, and it has been building and integrating “infrastructure software” for the enterprise ever since. This (profitable) longevity means that even though the technologies it produces may be cutting-edge, the engineering and support is solid and reliable. A lot of this success is based on the hugely experienced development team, which makes up over 50% of the company’s employees. The egalitarian nature of open source development communities means that an individual developer’s personal credibility is extremely important in influencing the direction of an upstream project, and since SUSE boasts some of the most experienced developers in the industry, their influence can be seen across a wide range of projects – with the added result that the real-world scenarios they observe in customers’ workloads are considered when changes or improvements are made.

The message, then, for CIOs wanting help with digital transformation is to look to the open source world for the reactive, adaptive, and innovative technologies that make it possible to deliver on consumer expectations at an affordable price point – and to do so with the help of an experienced open-source partner who can provide the enterprise-grade support needed for a stable, reliable business.

Thursday 7 July 2016

n-1 isn't necessarily the wisest choice

[Image: BMW 328 - original and Hommage versions]

Ask any vendor and you will find that one of their greatest frustrations is when customers insist on implementing only the "n-1" release of a particular product.

At almost every meeting, vendors are asked about the availability of new features, or new capabilities, or new supported configurations that will match what the customer is trying to achieve, and yet when these are finally made available after much development and testing, customers will wait, and stick with supposedly safer older versions.

The risk-management logic, of course, is that the latest release is untried, and may contain flaws and bugs. Unfortunately this misses the point that fixes to older flaws are made possible by deploying a new release. It also brings up the laughable scenario of customers asking for new features to be "back-ported" to the older, "safe" release. Pro-tip: if you back-port all of your new features to the older release, then you end up with the new release anyway!

There are also times when you just can't take advantage of the latest technology unless you're up to date: for example, getting the most benefit out of new CPUs requires the operating system software to be in sync. As SUSE VP of Engineering Olaf Kirch points out in this article from 2012, when new features are introduced, you can either back-port them to old code (possibly introducing errors) or take the new code and harden it.

Which brings me to the real point of this article - when we're talking about open source, the rate of change can be extremely rapid. This means that by the time you get a hardened, tested, enterprise version of software out of the door, it is already at least version "n-1": the bleeding-edge stuff is happening at the forefront of the community project, where many eyes and many egos are working on improvements to correctness and performance as well as features. So there's really no reason to require an n-1 release of, say, enterprise Linux... all you're doing in that case is hobbling your hardware, paying more for extended support, and missing out on access to improvements.

So when SUSE introduces a new kernel revision mid-way through a major release, as it is doing with SUSE Linux Enterprise 12 Service Pack 2 (SLE12SP2), don't fret about the risks: the bleeding edge has already moved forward, and what you're getting is just the best, hardened, QA'd, engineered version of Linux with the most functionality.

Tuesday 7 June 2016

More on Scalability, Again


This week Intel announced its new Xeon E7 v4 processor, which takes x86_64 processor scale to another level: a single CPU socket now gives you 24 cores and access to 3TiB of RAM. That means a medium-sized server of 8 sockets can now give you access to 192 cores and 24TiB RAM. The upshot is that if you actually want to access all of that RAM with a supported operating system, SUSE Linux Enterprise Server is your only choice.

The new architecture also raises the limit for CPU sockets in a box to 64 – which means that you could max out this system in a standard kind of configuration at 1,536 cores. Again, SUSE Linux Enterprise Server is the only OS to support this degree of scalability for this kind of processor.
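As a back-of-the-envelope check, here is a minimal Python sketch (assuming a Linux host, since it reads /proc/meminfo) that runs the same socket arithmetic using the per-socket figures quoted above, and then prints what the running kernel actually reports for the machine it runs on:

    # Socket arithmetic using the per-socket figures quoted above
    # (24 cores and 3 TiB of RAM per Xeon E7 v4 socket).
    import os

    CORES_PER_SOCKET = 24
    TIB_PER_SOCKET = 3

    for sockets in (8, 64):
        print(f"{sockets} sockets -> {sockets * CORES_PER_SOCKET} cores, "
              f"{sockets * TIB_PER_SOCKET} TiB RAM")

    # Compare with what the running kernel can actually see on this host
    # (MemTotal in /proc/meminfo is reported in KiB; 1 TiB = 2**30 KiB).
    logical_cpus = os.cpu_count()
    with open("/proc/meminfo") as meminfo:
        mem_kib = int(next(l for l in meminfo if l.startswith("MemTotal")).split()[1])
    print(f"This host: {logical_cpus} logical CPUs, {mem_kib / 2**30:.2f} TiB RAM visible")

The point of the comparison is simple: the hardware ceiling only matters if the operating system you deploy can actually address it.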

I wrote about this just a few weeks ago in the context of the HPE Integrity Superdome X, which still only has published benchmarks running SUSE Linux Enterprise Server. It's interesting to see that all of the numbers have doubled (yet again) in such a short time.

Of course, SGI has been doing this degree of scaling with NUMA systems for a while, which is why SUSE Linux Enterprise Server is known to scale to 8,192 CPU cores and 64TiB RAM (they couldn't fit in any more memory): it's a little frightening to consider what they might end up doing with these new CPUs – at the very least, 128TiB RAM will soon be on the horizon.

So when the processor hardware manufacturers can still drop a doubling of capacity on us, it's worth taking note of whether your software can deal with it...