Back to the future
It’s time to get the DeLorean out of the garage, service the Flux Capacitor, and get ready for some time travel back to the early days of Low Code. Or you could skip the risky journey and just read this article.
Low code is often presented as the future of application development. Is that true? As a matter of fact, it already was. Let’s activate the time circuits by turning the handle, select February 15th, 1986 as the target date on the keypad, and accelerate to 88 mph (you may close your eyes for a stronger effect).
Terminals are still character-based, and the biggest innovation for the popular VT100 terminal was black-on-white display. I still remember the “Wow, it looks like paper!” when those 25-line, 80-character terminals were introduced. Yet those screens, which reacted directly to each key press, were a serious improvement in those old-new days: the rise of Unix minicomputers provided real-time feedback to the user, bringing a new level of User Experience. Warning: the term “User Experience” might be an anachronism, as cognitive psychologist and designer Don Norman will only coin it in the next decade.
Those years are also the early years of SQL, and programs running on minicomputers are mostly written in classic third-generation (read: procedural) languages such as C.
1986: the new wave arrives
And then there is the 4GL wave. Let’s quote Wikipedia:
“A fourth-generation programming language (4GL) is any computer programming language that belongs to a class of languages envisioned as an advancement upon third-generation programming languages (3GL). Each of the programming language generations aims to provide a higher level of abstraction of the internal computer hardware details, making the language more programmer-friendly, powerful, and versatile.”
Although there have been early approaches such as Sperry Mapper, released in 1979, Informix ignites the real revolution by releasing Informix 4GL on this very day, February 15th, 1986.
Before, we used to write code to fetch data out of a SQL database, push it to the screen, and manage the user interactions: displaying a form, moving from one field to the next, validating input in real time, and displaying errors before they had consequences. Now we just link database fields to display fields and put some glue in between.
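As an illustration, here is a minimal sketch of what that looked like in Informix 4GL (the database, form, and table names are hypothetical, and a real program would add error handling; the point is how little code the developer writes):

```
DATABASE stores

MAIN
    DEFINE p_customer RECORD LIKE customer.*

    -- The form file maps screen fields to database columns;
    -- the runtime handles field navigation and input control.
    OPEN FORM custform FROM "custform"
    DISPLAY FORM custform

    -- One statement replaces pages of C: accept input field by
    -- field, with real-time validation against the column types.
    INPUT BY NAME p_customer.*

    IF NOT int_flag THEN
        INSERT INTO customer VALUES (p_customer.*)
    END IF
END MAIN
```

Everything technical — cursor movement, redrawing, type checking — is the runtime’s job; the developer only declares the mapping and the glue.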
Wait… the Flux Capacitor must have had a problem. This sentence could have been written in 2022. It looks like we are reading a cheap pitch for a low code system!
Let’s travel back to 2022 (you may again close your eyes). We’ll leave the DeLorean in the garage for now. After double-checking, nothing was wrong. As long as we don’t mention that the glue is written in a language instead of assembled from windows, panels, sliders, and wires dragged with a mouse, the sentence still applies to low code, at least to parts of it.
4GL Rise and Fall
Informix 4GL and other 4GL application development systems were a massive hit. The productivity delta was so high that only the hard, tattooed men and women were still writing applications in C (all my apologies to tattooed people; this image is just for the sake of bringing you back into the spirit of the 80s). The term low code didn’t exist yet, but the market had already chosen its side. Adoption was massive, and there were good reasons for that: productivity, flexibility, easy skilling, and better quality, amongst others.
The meteorite that caused the 4GL extinction (at least for new projects) is called Windows. The richness of this new interface was never properly embraced by that generation of low code systems, and the end result always looked like a revamped character screen: it looked like Windows, but it was still a character-mode user journey. Until that fatal event, though, 4GLs dominated the world of application development with productivity and quality ratios that are hardly matched today.
History is repeating itself
There are striking similarities between 4GLs and Low Code systems, whether we are speaking about promises, flexibility, easy skilling, or go-to-market acceleration.
The main promise with 4GLs was to eliminate unnecessary complexity from the application development process.
- Back in 1986, it suddenly looked crazy to have to build a business application in a technical language such as C. After all, those applications were only manipulating business entities, without any specific technical engineering: the goal wasn’t to launch a moon rocket, and we weren’t speaking about embedded computing. And after the mainframe-only era, having to code so much for the same business result looked like a regression.
- In 2022, the same reasoning applies, but it got worse. Just count the number of dependencies (front-end plus back-end) and the number of lines of code of a modern business application: compared to 1986, those numbers have grown by a factor of 10.
In fact, it has never been so technical to build a business application! If you’re not convinced, buy yourself an Arduino and follow a tutorial to build a simple, yet fully functional, time-critical control application. Then follow SQL, Java or .Net, and React.js/Angular/Vue.js tutorials and build a simple, yet fully functional, business application. You’ll be surprised by the total effort required by the latter.
Discussing what happened in IT for us to have to write so much technical code to build business applications is not the subject here. However, the competition to attract and keep users is a race for survival. This is even the case for internal applications, where it translates into a race for acceptance. 4GLs are so long gone that the question “Why?” wasn’t even raised until recently.
Flexibility (until the massive paradigm change of Windows) was also a strong point of some 4GLs, with the possibility to integrate pieces of code written in C, and for a 4GL program to be called from a C program (or any other 3GL). This made it possible to encapsulate the technical parts once and for all, and to keep building business applications without technical skills, even when some technical components were involved.
Windows (and, earlier, the Macintosh) broke this flexibility. Flexible, but not enough to cope with the exponential user journey opportunities brought by windowing systems, 4GLs died. Today, microservices, APIs, and orchestrators allow Low Code platforms to call other systems and to be called: flexibility is back, and stronger than before, thanks to these new decentralized architectures.
Easy skilling is another common point. By eliminating most of the technical work required to pursue a non-technical goal, 4GLs simplified the setup of high-performing teams, where developers are selected mostly for their ability to understand business needs, and less for pure technical skills.
The same happens with Low Code systems. A more natural distribution of competences within the team benefits the quality of the delivered application. This point might even be stronger today than back in the 80s: developing an application is more technical than ever, and the race for good IT resources makes it even harder to find people who understand the business while also being highly technical. On that topic, the situation is definitely worse in 2022 than it was in 1986.
Finally, go-to-market acceleration is comparable, as in both cases the paradigm shift translates into faster development for the same business result.
So, are Low Code systems the new 4GLs?
What happened in the IT world that we have to write so much technical code to build business applications? And what happened that we even find this absolutely normal?
First, the fast-evolving web made things difficult: Web 1.0, the rise and fall of Rich Internet Applications (Flash Player, Silverlight), and the arrival of the current Single Page Application wave. This last wave comes with greater maturity through a well-defined separation of concerns: the front-end in charge of the user interface on one side, and web services in charge of the business logic on the other. This common understanding made it possible to create low code tools that take the separation into account, most of the time with user interfaces that can be extended with any of the market-leading front-end technologies. This was a huge paradigm shift, and as such it took some time.
Today, different flavours of Low Code systems promise go-to-market acceleration, easy skilling, and flexibility. Just like in 1986, some are more flexible than others, with matching pros and cons. The market is still gaining maturity, just like any other development tool segment.
Judging by their promises, they are the new 4GLs.
Due to their visual development approach, one might be tempted to compare them with the code generators of the same era, but I personally tend to match them with 4GLs because of the same ease of maintenance. Even the selection criteria are similar to those used for their ancestors: cost of ownership (and thus Return on Investment), flexibility, ease of learning, and long-term roadmap.
4GL successors or not, they might become the new kings. It is even surprising that we still develop so many business applications using more classic technologies, spending time on non-value-added tasks.
Finally: a classic story about costs, go-to-market, and information system coherence
Dev(Sec)Ops now adds new constraints. Not only must the Low Code system integrate with a changing technical environment (such as front-end libraries), it must also integrate nicely with Dev(Sec)Ops pipelines. This makes the choice even more complicated, and the decision depends on the type of industry, company size, application scope, re-skilling possibilities, and DevSecOps practices. But when all boxes are ticked, Low Code allows developers to save even more time. It can even be an easy way to adopt DevOps principles, by using the tooling natively integrated with the Low Code suite.
Also, the monolith era is now long gone. Modern low code systems integrate within a modular architecture instead of being THE architecture, as 4GLs were. During the past 15 years we have seen that companies that embraced service-oriented architectures (and APIs) found it easier to evolve towards new technologies and architectures. Front-end interfaces and back-end business services have completely different concerns and lifecycles. Although some of today’s champions, such as Angular, have a longer lifetime than initially predicted for a front-end framework, it is still not comparable to the stability of back-end systems.
Fitting into this modular and coherent business landscape is a key criterion. It is also an opportunity, as it makes the move easier. There are some sweet spots for entering the game, for example a new major project combined with the desire to adopt a more Agile delivery cycle.
But the main change the IT industry might be facing is how the resource shortage will impact Return on Investment calculations and go-to-market capabilities:
Good tools have a price, and so do good developers. The point where the two curves cross is changing rapidly.
In such a context, go-to-market might become the decision-maker.
Photo credits
By Dwurban – Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=114823858