The Monitoring Renaissance: Putting People at the Center


In the post “Living the Monitoring Renaissance,” I argued that a monitoring renaissance, a “revival of or renewed interest in something,” is here today. In that earlier post, I discussed how next-generation, cloud-based monitoring platforms supply the capabilities we need to deliver proactive and predictive monitoring. Traditional on-premises platforms have made many promises and come up short because of the limitations of being on-premises. Most monitoring teams end up spending much of their time managing the monitoring platform itself: ensuring availability, provisioning storage, and tending to the glue scripts that keep the solution working.


If this sounds familiar, you may be wondering how to move from “keeping the lights on” to a more innovative, forward-thinking approach. Next-generation platforms like LogicMonitor give monitoring practitioners new and improved capabilities for containers, virtualization, cloud infrastructure, and cloud services, while also addressing traditional infrastructure in a novel way. It’s time for us to rethink monitoring and leverage these new offerings to reduce cost, speed up monitoring of new technology stacks, and improve visibility for the business.

Next-gen Cloud-based Monitoring Platforms with Empowered Networks and LogicMonitor

As Empowered Networks has worked with savvy customers making the switch from legacy systems to next-gen cloud-based monitoring platforms, we’ve noticed some interesting and unique challenges. The traditional monitoring landscape is a collection of tools from various vendors, acquired and integrated over the course of years. Not only is there a significant technical job in understanding what tools are in the environment and how they’re being used, but there’s also an organizational challenge around who owns these tools, who uses them, and why.

In a recent customer engagement, we needed to work with more than 12 internal teams and examine more than 100 different tools to understand how best to improve the monitoring landscape. All too often, this assessment work can feel like a “win-lose” situation, with some groups feeling that their favourite tools are being threatened. That is a very natural human reaction, and it points to a key learning:

There’s something for everyone in a next-gen monitoring system.

Next-gen platform vendors have worked diligently to ensure there’s a baseline of capability for all consumers. What’s more, the benefit of having a single tool that everyone integrates with and collaborates around cannot be overstated. Practically, having a single tool at the centre of the monitoring discipline reduces the misunderstandings between teams that often lead to extended recovery times.

A single presentation of the data provides your teams with a common vision of the state of infrastructure.

The intrinsic value of next-generation monitoring platforms may not be obvious at first blush. It’s in the idea of democratization (the act of making something available to everyone) that we uncover the significant value these platforms can deliver. Our task in design and implementation becomes a conversation about how we make monitoring safe for democracy, with capabilities for all consumers.

Does having this single core tool mean it’s the only tool you’ll use? Absolutely not. There is a balance to be struck between what a next-gen cloud-based monitoring platform provides and what discipline-specific tools provide.

Understanding Core Monitoring Tools

When choosing a core monitoring tool that everyone will use, the key things to examine are:

  • Understanding what capabilities are available for each discipline; and
  • Understanding how richly the platform can integrate with discipline-specific tools (see the sketch after this list).
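
To make the integration question concrete, here is a minimal sketch of the kind of glue a webhook-style alert integration might involve: a small receiver that accepts an alert payload from the monitoring platform and forwards a summary to a discipline-specific tool. The endpoint, the payload field names (severity, host, message), and the downstream ticketing URL are hypothetical placeholders for illustration only, not LogicMonitor’s actual alert schema.

```python
# Sketch of a webhook receiver bridging a monitoring platform and a
# discipline-specific tool. All field names and URLs are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request

TICKET_API = "https://ticketing.example.internal/api/tickets"  # hypothetical downstream tool


class AlertWebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read and parse the alert payload sent by the monitoring platform.
        length = int(self.headers.get("Content-Length", 0))
        alert = json.loads(self.rfile.read(length) or b"{}")

        # Map the (hypothetical) alert fields onto the downstream tool's format.
        ticket = {
            "title": f"{alert.get('severity', 'unknown')}: {alert.get('host', 'unknown host')}",
            "body": alert.get("message", ""),
            "source": "monitoring-platform",
        }

        # Forward the summary to the discipline-specific tool.
        req = request.Request(
            TICKET_API,
            data=json.dumps(ticket).encode(),
            headers={"Content-Type": "application/json"},
        )
        try:
            request.urlopen(req, timeout=5)
            self.send_response(202)
        except OSError:
            self.send_response(502)  # downstream tool unreachable
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), AlertWebhookHandler).serve_forever()
```

The point of the sketch is how little glue is needed once a single platform emits alerts in one consistent way; each discipline keeps its preferred tool while the core platform remains the common source of truth.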

As we continue to work with customers to explore the value of the platform, our experience with LogicMonitor shows robust capabilities for database teams, network teams, infosec users, application owners, storage and compute teams, and the folks who run public and private cloud operations. The platform also makes it straightforward to surface additional data through DataSources, and there’s a strong, active community behind LogicMonitor that keeps adding to those DataSources as needed.
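
LogicMonitor’s DataSources are typically built in the platform itself (often with Groovy or PowerShell scripts run by a Collector), so purely as an illustration, here is a Python sketch of the kind of collection logic a custom DataSource wraps: poll a service, compute a few numeric datapoints, and emit them one per line. The status URL and metric names are hypothetical, and the exact output format your datapoints expect should be confirmed against LogicMonitor’s documentation.

```python
# Illustrative sketch of custom metric collection, the kind of logic a
# DataSource wraps. The endpoint and metric names are hypothetical.
import json
import sys
import time
from urllib import request

STATUS_URL = "http://app.example.internal/status"  # hypothetical service endpoint


def collect() -> dict:
    """Poll the service and return a flat dict of numeric datapoints."""
    started = time.monotonic()
    with request.urlopen(STATUS_URL, timeout=5) as resp:
        payload = json.loads(resp.read())
    elapsed_ms = (time.monotonic() - started) * 1000.0

    return {
        "ResponseTimeMs": round(elapsed_ms, 1),
        "QueueDepth": payload.get("queue_depth", 0),
        "ActiveWorkers": payload.get("active_workers", 0),
    }


if __name__ == "__main__":
    try:
        # Emit one "key=value" line per datapoint for the collector to parse.
        for name, value in collect().items():
            print(f"{name}={value}")
    except OSError:
        # A non-zero exit signals a failed collection attempt.
        sys.exit(1)
```

This is the sort of small, discipline-specific extension that lets each team surface the data it cares about without standing up yet another standalone tool.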

Are you ready to improve your monitoring system? Reach out to our team and let’s explore how we can help you.