The best-protected applications feed on data

When visitors to a tropical aquarium rave about its beautiful corals, they don’t see what’s hiding underneath. Beneath the sand and the rocks lies an incredible ecosystem of micro-organisms without which the corals cannot survive, let alone thrive.

Even less visible is the vigilance with which aquarium staff monitor the conditions of that environment. Visitors don’t see the actions taken minute by minute, and day by day, that allow staff to react as quickly as possible when something goes wrong. Nor do they hear the alarm that sounds when any parameter crosses a safety threshold.

And it’s exactly the same in the world of technology.

When people shop online, they see only one thing: the user experience. They don’t see the myriad of applications, infrastructures, environments and services that secure it all and make that experience possible. But they are there, and they are essential to a healthy user experience.

The importance of data

There is an important difference between knowing something is wrong and knowing what to do about it. In an aquarium, something as simple as understanding the relationship between pH and temperature can be the difference between solving a problem and making it worse. The same goes for the user experience. And the first fundamental step is making sure you collect the right data.

Unfortunately, a significant percentage of organizations do not.

A Turbonomic survey exposes this phenomenon: “When we asked respondents how their organization measures application performance, it was promising that over 60% measure it in some form or another. But in reality, the most common approach has been measuring availability, as opposed to managing service level objectives (SLOs), which typically take the form of response time or transaction throughput. And worse: 13% don’t measure app performance at all.”
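The gap the survey describes can be made concrete. Below is a minimal sketch, with entirely illustrative request samples and a hypothetical 300 ms latency target, showing why an availability number can look healthy while a response-time SLO tells a very different story:

```python
from dataclasses import dataclass

@dataclass
class RequestSample:
    succeeded: bool      # did the request return a valid response at all?
    latency_ms: float    # how long the user waited

def availability(samples):
    """The 'up or down' view: share of requests that succeeded."""
    return sum(s.succeeded for s in samples) / len(samples)

def slo_compliance(samples, latency_target_ms=300.0):
    """The user-experience view an SLO captures: share of requests
    that were BOTH successful and fast enough."""
    ok = sum(s.succeeded and s.latency_ms <= latency_target_ms
             for s in samples)
    return ok / len(samples)

# Illustrative data, not real measurements.
samples = [
    RequestSample(True, 120.0),
    RequestSample(True, 950.0),   # served, but painfully slow
    RequestSample(True, 180.0),
    RequestSample(False, 0.0),    # outright failure
]

print(availability(samples))      # 0.75 -- looks acceptable
print(slo_compliance(samples))    # 0.5  -- half the users had a bad experience
```

The slow-but-successful request is invisible to the availability metric, which is exactly why measuring only availability understates the damage to the user experience.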

Focusing on availability alone, rather than on the overall user experience, can have a very tangible negative impact on businesses. For example, 89% of customers say they are more likely to turn to a competitor after a bad user experience. And once they are gone, the cost of acquiring a replacement is high, ranging from an average of €70 per customer in the retail sector to over €200 in the financial sector. The lost revenue is even greater: loyal customers are worth an average of ten times the value of their first purchase.

Thus, maintaining a superior user experience is an absolute must!

The key to it all: fierce attention to measuring the right data, just as tropical aquarium experts do! And to achieve this, it is important to identify the right control points for automating responses, both in the infrastructure and in the services that deliver and secure the applications.

Adaptive applications are data driven

In Warsaw, Poland, eight clams have proven more effective than any other technology at measuring water quality. “These shells, very sensitive to pollution, close when they detect polluted water, an action that then triggers alarms thanks to special sensors attached to their shells.” Until recently, hardly anyone knew of their existence. Residents only knew they had access to safe drinking water, with no idea how polluted water was being detected.

Living organisms like this clam instinctively measure everything and are particularly astute at recognizing danger based on measured data. But no single internal system is responsible for this superpower. It takes the collaboration of hundreds of internal systems generating measurements, plus the ability to analyze the resulting data to decide in a fraction of a second that the water is dangerous.

Likewise, it’s the metrics – the data – that make an application adaptive. Without clear triggers for action, it is not possible to adapt. Identifying malicious activity triggers security actions, while detecting performance degradation triggers optimization actions. They are two very different things, and it is necessary to do both.
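The split between security triggers and optimization triggers can be sketched as a simple routing function. The metric names, thresholds and response strings below are purely illustrative assumptions, not drawn from any real product:

```python
def route_signal(metric: str, value: float) -> str:
    """Map a measurement to the kind of automated response it should
    trigger. Two distinct families of action: security and optimization."""
    # Security trigger: a spike in failed logins suggests malicious activity.
    if metric == "failed_logins_per_min" and value > 100:
        return "security: rate-limit the source and alert the security team"
    # Optimization trigger: rising tail latency suggests the service is degrading.
    if metric == "p95_latency_ms" and value > 500:
        return "optimization: scale out the service pool"
    return "no action"

print(route_signal("failed_logins_per_min", 250.0))
print(route_signal("p95_latency_ms", 800.0))
```

The point of the sketch is the separation itself: the same telemetry pipeline feeds both branches, but each kind of anomaly maps to its own kind of automated response.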

To be effective, this data must include measurements taken from each layer of the technology stack. Taken as a whole and confronted with business processes – right down to workflows – they can be analyzed and transformed into information to which applications will have to adapt automatically.

By then carrying out deeper analyses, it becomes possible – again based on that same data – to surface relationships, patterns and trends that enable business leaders to align architectures, infrastructure and applications with actual business outcomes.

This rich knowledge makes it possible to take very rapid automated actions when necessary (thanks to AI and automation), while allowing the humans in charge to make more informed decisions.

To act, therefore, we need systems capable of receiving instructions and carrying them out. In nature, it is not our interface with the world that reacts to dangerous conditions; it is our immune system and other internal systems that act on our behalf, based on the information that interface collects. In the digital world, these internal systems are application services and infrastructure. The technologies behind the interface generate data and act to protect, extend and optimize the user experience.

Adaptive applications are thus driven by data, collected through the telemetry generated by application services, infrastructure, and the systems that deliver, secure and scale them. With a platform capable of analyzing this data and producing actionable, automatable insights, organizations can move faster, and with confidence, toward applications that truly adapt to external conditions.

The most ironic part of all this? Customers will still never realize it… they will only know that their user experience is flawless, at all times!