As we evolve toward a software-defined world, there’s a new user experience urgency emerging. That’s because the definition of “user” is going to be vastly expanded. In the Internet of Things (IoT) era, users include machines.
Companies today are generating, collecting and analyzing more data than ever before. They want to get better insights into their customers and their business operations. This is driving substantial investments in new architectures that extend to cloud and mobility.
They’re also yielding to user demands for more and newer sources of big data. They’re experimenting with data lakes to store this potential trove. And they’re investing in data blending and visualization technologies to analyze it all.
In the IoT world of the near future, however, much of this analysis is going to be done by machines with deep learning capabilities. With forecasts for as many as 50 billion connected devices by 2020, the experience of these “users” with the applications they engage with will be no less critical to achieving strategic objectives than customer experience is now – and will remain.
But how are companies going to get smarter if the user experience is poor? Where will this greater insight come from if the business intelligence software they’ve deployed is not performing to user expectations?
They’re not going to win customer satisfaction and loyalty by frustrating users. And the risks involved with disappointing machine users could be catastrophic.
It’s Time to Get Strategic
More companies have come to realize the strategic value of their data. As such, they’re seeking ways to get a higher return on those data assets. The databases – both transactional and analytic – they’ve invested in are critical to corporate strategy.
To maximize the performance of business-critical apps, companies must get strategic about user experience and application performance. Monitoring technologies can no longer be implemented as short-term tactical bandages.
They might put out a brush fire temporarily, but they create more complexity and management headaches in the long run. They often don’t work well together and generate more false positives than a smoke detector with a failing battery. Annoying, right?
IT teams are going to have to get more efficient with their ops data. They will need a standardized approach to integrating diverse data sets, including those from SaaS applications and IaaS or PaaS clouds. This is critical to gaining physical and logical knowledge of the computing environment across the entire application delivery chain.
Next-generation data integration technologies can unify ops data from traditional monitoring solutions with real-time streams of machine data and other types of big data. They automate much of the cleansing, matching, error handling and performance monitoring that IT Ops teams often struggle with manually.
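As a rough illustration of what that unification involves, here is a minimal sketch in Python. The sources, field names and records are hypothetical, not any specific product’s API; a real pipeline would route rejected records to an error-handling queue rather than drop them.

```python
from datetime import datetime, timezone

# Hypothetical raw records from two monitoring sources with
# inconsistent field names and timestamp formats.
legacy_monitor = [
    {"host": "web-01", "ts": "2016-03-01 12:00:05", "latency_ms": "210"},
    {"host": "web-01", "ts": "2016-03-01 12:00:05", "latency_ms": "210"},  # duplicate
]
machine_stream = [
    {"device": "sensor-7", "timestamp": 1456833610, "latency": 95.0},
]

def normalize(record):
    """Cleansing step: map each source's fields onto one common
    schema, coercing types and rejecting unparseable values."""
    source = record.get("host") or record.get("device")
    raw_ts = record.get("ts") or record.get("timestamp")
    raw_latency = record.get("latency_ms") or record.get("latency")
    try:
        if isinstance(raw_ts, (int, float)):
            ts = datetime.fromtimestamp(raw_ts, tz=timezone.utc)
        else:
            ts = datetime.strptime(raw_ts, "%Y-%m-%d %H:%M:%S").replace(
                tzinfo=timezone.utc
            )
        latency = float(raw_latency)
    except (TypeError, ValueError):
        return None  # error handling: reject, don't crash the pipeline
    return {"source": source, "ts": ts.isoformat(), "latency_ms": latency}

def unify(*streams):
    """Matching step: merge all streams into one deduplicated,
    time-ordered ops data set."""
    seen, unified = set(), []
    for stream in streams:
        for record in stream:
            clean = normalize(record)
            if clean is None:
                continue
            key = (clean["source"], clean["ts"])
            if key in seen:  # drop exact duplicates across streams
                continue
            seen.add(key)
            unified.append(clean)
    return sorted(unified, key=lambda r: r["ts"])

ops_data = unify(legacy_monitor, machine_stream)
```

The point of the sketch is that once every record lands in one schema with consistent types and timestamps, downstream analytics can treat traditional monitoring output and real-time machine data as a single data set.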
As this ops data grows with IoT, it can be fed into a data lake for analysis. In fact, IT teams can kill two birds with one stone. First, IT Ops data is a natural fit as an early test case for a data lake. And by starting now, they can hone skill sets for big data analytics and the coming IoT data deluge.
IT Ops teams, which are increasingly becoming part of DevOps, can learn from and share their experiences with data management and analytics teams – as well as business teams. It makes sense to bring application governance and data governance together because they share a common goal: ensuring that users have access to the highest quality data at the point of decision to optimize business outcomes and mitigate risks.
The Path to ROI and Risk Management Objectives
This environment necessitates communication and collaboration among IT and business teams to proactively anticipate, identify and resolve application performance and user experience problems. It also facilitates efficient orchestration and management of both internally and externally sourced services to improve decision-making and business outcomes.
Through a unified approach to performance analytics, IT can help their companies leverage technology investments to discover, interpret and respond to the myriad events that impact their operations, security, compliance and competitiveness. Ops data efficiency becomes actionable to facilitate strategic initiatives and positively impact financial results.
Successful strategy implementation manifests in return on investment (ROI) and risk management. Multiple studies, including ours and the annual Puppet Labs State of DevOps report, confirm that companies taking a strategic approach to user experience and application performance outperform their respective peer groups in financial metrics and market performance.
Vendors in this space – usually referred to as application performance management (APM) – need to advance their thinking and technology. Machine learning and predictive analytics are going to be table stakes in the IoT future.
APM vendors have a choice: they can maintain a focus on human user experience, which will always be essential, or they can think more broadly about user experience in the IoT world. Because some of today’s enterprise customers – which produce everything from home monitoring devices and appliances to turbine engines, agricultural machinery and healthcare equipment – could one day well become competitors.
By collecting data from embedded sensors and applying advanced analytics to give the customers using their equipment deeper insights, they could capture what will become the lion’s share of the IoT user experience market. Leading manufacturers are already there.
This article first appeared on LinkedIn.
Photo: Gorbash Varvara