By now, you’ve probably heard of low-code, and if you haven’t, you soon will. In 2017, Forbes called the technology ‘extraordinarily disruptive’, and the message was repeated in 2021, calling it ‘the most disruptive trend for 2021’.

The market for low-code platforms has also ballooned since 2016, reaching an estimated $10.1 billion in 2020 and forecast by Forrester to reach $21.2 billion in 2022 and $65 billion in 2027.

This breathtaking growth directly reflects the real-world use of low-code platforms for building enterprise-class software and apps – a trend that looks set to continue.

Low-code has become one of the most powerful tools in software development. Using a graphical interface, code can be assembled into powerful, secure apps in a fraction of the time it used to take. Just as humans share roughly 90% of their DNA with cats, much of the basic code is the same across very different software applications, so cutting out unnecessary, repetitive hand-coding simply makes sense. A skilled coder is still needed to create the customized elements of the software, but a low-code platform saves a lot of the ‘grunt work.’

The experts at Sourceful have become very familiar with low-code platforms; some of them have been developing apps this way since the mid-1990s. Somewhat puzzlingly, despite being in use for about twenty years, these platforms only gained real traction from 2016 onwards. So where did low-code come from, and why is it so pivotal now?

Behind the waterfall

During the 1980s, the ‘waterfall’ development approach was challenged by James Martin’s concept of Rapid Application Development (RAD), which he developed while working at IBM.

RAD was built on the realization that software requires a fundamentally different approach from ‘static’ engineering projects, prescribing a more adaptive and dynamic process instead. The method had its drawbacks and limitations, but programmers had now recognized that new tools and methods were needed. Things could – and should – be easier.

The fundamental concepts leading to low-code include model-driven development and the rise of fourth-generation programming languages (4GLs).

During the 1990s, model-driven development led to an entirely new approach to building software tools – one that sought to abstract processes, systems, functions and data resources into something that could be represented graphically.

4GLs were critical to enabling this movement, as they could handle more diverse tasks and resources. Another big feature of 4GLs was their more English-like syntax, which made coding more versatile and accessible to a wider base of coders.
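The English-like character of a 4GL is easiest to see next to third-generation code. As a minimal illustrative sketch (using Python’s built-in sqlite3 module and a made-up `orders` table, both chosen purely for this example), the same customer totals are expressed once declaratively in SQL, a classic 4GL, and once imperatively, step by step:

```python
import sqlite3

# Tiny in-memory table of orders, invented purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 120.0), ("acme", 80.0), ("globex", 50.0)])

# 4GL style: declare *what* you want in near-English SQL.
declarative = conn.execute(
    "SELECT customer, SUM(amount) FROM orders"
    " GROUP BY customer ORDER BY customer"
).fetchall()

# 3GL style: spell out *how* to compute the same totals, step by step.
totals = {}
for customer, amount in conn.execute("SELECT customer, amount FROM orders"):
    totals[customer] = totals.get(customer, 0.0) + amount
imperative = sorted(totals.items())

print(declarative)  # [('acme', 200.0), ('globex', 50.0)]
print(imperative)   # [('acme', 200.0), ('globex', 50.0)]
```

The SQL version states the desired result almost in plain English; the imperative version must describe the mechanics of looping, accumulating and sorting – exactly the kind of ‘grunt work’ that 4GLs, and later low-code platforms, set out to eliminate.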

Low-code arose from this fertile ground of user-friendly development. Companies like OutSystems and Mendix were among the first in the field, and they remain market leaders in the low-code space today. But the early market was not quite ready for all of the possibilities low-code could offer.

Stepping cautiously into the 21st Century

By around 2010, the Agile/Scrum development approach was becoming more widely adopted. Business Process Management (BPM) was the ‘hot technology’ of the day, and companies were already starting to see how software could manage and improve increasingly complex systems.

A decade earlier, with the ‘millennium bug’ only just vanquished, the IT department was largely seen as an expense – an ongoing maintenance cost. It was a department hidden in the basement that you only called on when you needed to be told to “turn it off and on again”, or when there was a paper jam in the printer.

Companies had no idea just how useful the IT department could be, or how crucial it would soon become in staying ahead of the competition.

The Agile/Scrum approach had already started to break down the old ‘waterfall’ system in favor of an iterative, modular methodology that could be tested at each stage. Businesses were seeing impressive results with these methods, and when low-code technology was deployed in combination with an Agile approach, the results were remarkable.

Impressive synergies

Of course, many companies were already using Agile/Scrum methodologies in their development projects, but the synergies achieved when these were combined with low-code were a surprising benefit.

Using low-code, each sprint could deliver much more functionality and reach completion more quickly. One major side effect was increased engagement with stakeholders within the company, whose testing and feedback were a vital part of the process. Fast turnarounds meant that executives and other decision-makers were inclined to test and give feedback sooner, keeping pace with the development of the software itself. Increased engagement also produced richer feedback, which in turn generated better end results.

This sense of pace and visible progress was refreshing for companies accustomed to projects that never seemed to end and frequently delivered too little, too late. Reduced costs certainly sweetened the appeal of low-code, but the improved process and heightened engagement were the icing on the cake.

As the internet started to mature, it became a chief enabler of new efficient business practices. Possibilities started to emerge for companies to link together different systems and information using custom-built software solutions or enterprise software packages.

However, the barrier to entry was set high: only companies with plenty of spare cash (and time) were willing to develop custom projects, while SMEs struggled to find software packages that could actually adapt to their business model.

Ten times faster

Before low-code became established, software development was a costly enterprise that relied on the fragile ‘waterfall’ approach and hand-coding, with projects typically taking two to three years to reach fruition.

Along the way, many rounds of testing would be needed. When bugs were detected (and they always were), laborious searches followed, scanning thousands of lines of code for missing semicolons or other characters accidentally omitted during hand-coding.

Then – suddenly – a perspective shift came about.

Around 2016, everyone seemed to be talking about low-code, with its promise of transforming business through intelligent enterprise applications that would drastically increase efficiencies and decrease burden.

The earliest adopters of low-code technology were companies at the top end of the mid-market. They could see the potential benefits and were willing to accept the small risk in exchange for the big rewards that rapid development at low cost could bring.

Following their success stories, bigger companies started to experiment with using low-code to develop mission-critical software – with amazing results.

Compared to the many years of development time required by traditional methods, low-code platforms made it possible for skilled coders to create new software in just days or weeks. Many companies were initially skeptical, feeling that these claims were ‘too good to be true’, but, as time has shown, the promise of low-cost, rapid development of robust, effective enterprise-class software was fulfilled.

According to Forrester, low-code methods are 10 times faster than traditional approaches.

Applications built on low-code platforms typically run on .NET Core or Java. They are intrinsically flexible, capable of handling multiple processes combined with business logic – all built via a single platform.

Applications developed using low-code are versatile enough to work with legacy infrastructure too. Quick development also enables a collaborative process that uses input from users to create the perfect software tool for the need, without being as burdensome for stakeholders as a longer, traditional development process would be.

Small Teams, Big Projects

Before long, the success stories about low-code were everywhere. Small teams were successfully developing huge projects in very little time – and the difference in cost was staggering.

In the hands of a skilled coder, high quality, secure, enterprise-class software could be developed and tested in a matter of weeks or months.

From this point, things snowballed fast, and companies were soon replacing their cumbersome ERPs and CRMs with low-code-built, fully customized solutions that fit their requirements exactly, at a fraction of the development cost. It was a persuasive argument no profit-seeking company could ignore.

What lies ahead

Low-code is set to become a major player in the software development industry, but there is also a lot of talk about the role no-code platforms might play. According to Forbes, no-code can be 100 times faster than hand-coding and can be used by practically anyone. This opens up some very interesting possibilities, while perhaps blurring the traditional boundaries between departments. While these tools can put a rocket booster under the efficiency of many repetitive processes, there are legitimate security concerns when programming moves out of the hands of skilled programmers and into those of non-technical employees. The trick to succeeding with this technology is finding which of these tools can be used where, for what purpose, and by whom.

Timeline of development  

1957 FORTRAN (Formula Translation) language developed by IBM. Still used for some niche purposes such as data-heavy computation.

1959 COBOL was developed, based on the FLOW-MATIC language created by Grace Hopper – the first English-based computer language. COBOL was a procedural language with an English-like syntax (object-oriented features were only added decades later). It was standardized in 1968, and by 1997 an estimated 80% of businesses ran on COBOL.

1964 BASIC was created by researchers at Dartmouth College. Using an English-based syntax, this was developed over time to become procedural and object-based. BASIC was the foundation of many early video games.

1972 C, the first language in the ‘C’ family, was developed at Bell Labs.

1991 Visual Basic was released by Microsoft, building on the blueprint of BASIC. Python was also released the same year.

2001 Visual Basic .NET was released by Microsoft as a further development of the Visual Basic model, with a graphical user interface (GUI) enabling the drag-and-drop of pre-prepared code.

2001 Low-code is born. OutSystems hits the market with its low-code platform. After a slow start, low-code becomes increasingly popular, finding more applications as the internet expands and technology such as smart devices becomes more prevalent.

2016 .NET Core is released. Not so much a language as a common cross-platform framework that uses the Common Language Infrastructure to enable software to operate across different platforms and handle multiple languages.