VMblog: If you were giving an AWS re:Invent attendee a quick overview of the company, what would you say? How would you describe the company?
Michael Allen: In today's digital-first world, one of the most pressing challenges organizations face is limited visibility and control over their cloud environments. Dynatrace helps the world's 15,000 largest organizations overcome this challenge and ensure their software runs perfectly. The Dynatrace unified platform combines the deepest and broadest observability, continuous runtime application security, and AI to deliver precise answers and intelligent automation. With insights into the performance of their applications, microservices, and underlying infrastructure, and into the experience of end users, organizations can innovate faster, collaborate more efficiently, and deliver more value to customers.
VMblog: How can attendees of the event find you? What do you have planned at your booth this year? What type of things will attendees be able to do at your booth?
Allen: During the event, Dynatrace will be at booth #606. At the booth, we'll be hosting live demos of the Dynatrace platform all day. Stop by to learn how the Dynatrace unified observability and security platform leverages the industry's most powerful AI engine to deliver precise answers and intelligent automation from all cloud data.
Onsite at our booth, we'll be giving away Dynatrace swag (t-shirts, water bottles, socks, and more). We'll also be raffling off an Oculus VR headset each day of the show.
VMblog: To what do you attribute the success and growth of this industry?
Allen: Digital transformation is accelerating, and it is happening in the cloud. Software defines how we bank, manufacture, deliver healthcare, receive government services, and communicate with friends, families, and colleagues. Organizations have shifted to the cloud for improved scalability, flexibility, and cost savings. By leveraging cloud computing, businesses can reduce their IT infrastructure costs, improve their operational efficiency, and enhance their agility.
To drive successful digital transformation, organizations need a modern observability platform, like the Dynatrace platform, that uses AI and automation to proactively manage and optimize cloud operations so teams can focus on delivering software faster and more securely.
VMblog: Do you have any speaking sessions during the event? If so, can you give us the details?
Allen: Yes, we'll have several Dynatrace representatives participating in speaking sessions during the event. Details can be found below.
- "Accelerate secure and reliable AWS deployments with Dynatrace for DevSecOps"
  - Organizations can spend a lot of time on manual, repetitive work, juggling monitoring tools, compliance challenges, and inconsistent, undocumented approaches. DevOps teams must go beyond putting data on dashboards and manually sifting through scans and tests of observability, security, and business data. In this session, learn how Dynatrace delivers intelligent workflow-based automation that proactively identifies quality, performance, and security issues before release and facilitates faster, more secure deployments.
- Matt Gibiec - Sr. Solutions Engineer at Dynatrace
- Susan St. Clair - Principal Security Solutions Engineer at Dynatrace
  - Time: Wednesday, Nov. 29th at 1:00 PM
- Location: The Venetian - Bellini 2005
- "Best practices for optimizing Kubernetes applications on AWS"
- The importance of effective Kubernetes deployments on AWS cannot be overstated. Key technologies such as automation, security analytics, and causal AI will help you optimize Kubernetes applications while delivering higher-quality software faster. In this session, learn about best practices for operating Kubernetes on AWS infrastructure and gain actionable insights to elevate your Kubernetes deployment on AWS.
- Jason Ostroski - Principal Solutions Engineer at Dynatrace
- Markie Duby - Principal Solutions Engineer at Dynatrace
  - Time: Tuesday, Nov. 28th at 1:00 PM
- Location: The Wynn - Cristal 7
VMblog: What kind of message will an attendee hear from you this year? What will they take back to help sell their management team and decision makers?
Allen: As organizations continue to build and run millions of new apps in the cloud, observability has moved from optional to mandatory. Traditional cloud monitoring methods can no longer scale to meet organizations' demands. IT teams can't resort to playing defense, fighting daily fires rather than focusing on more important tasks, like innovation.
Teams need a modern observability approach with artificial intelligence at its core, like Dynatrace. The Dynatrace platform is tightly integrated with major cloud providers like AWS, enabling customers to quickly and easily gain insights into their dynamic cloud environments so they can optimize operations and improve digital experiences for end-users.
VMblog: Can you double-click on your company's technologies and talk about the types of problems you solve for an AWS re:Invent attendee?
Allen: Dynatrace is the leader in observability and security, and we exist to make software work perfectly. The Dynatrace platform unifies observability, business, and security data at a massive scale (via the Grail™ data lakehouse) with continually updated topology and dependency mapping to retain data context.
Dynatrace Davis® AI combines predictive AI to anticipate future behaviors, causal AI to deliver precise answers and intelligent automation, and generative AI to automatically provide recommendations, create suggested workflows or dashboards, and let people use natural language to explore, solve, and complete tasks. Dynatrace analytics and automation capabilities enable teams to modernize and optimize cloud operations, deliver software faster and more securely, and ensure flawless digital experiences.
VMblog: While thinking about your company's solutions, can you give readers a few examples of how your offerings are unique? What are your differentiators? What sets you apart from the competition?
Allen: A significant differentiator of the Dynatrace platform is its innate AI capabilities. Dynatrace has been an AI leader for over a decade. Dynatrace Davis AI sits at the core of the Dynatrace platform and is the only artificial intelligence solution to combine fact-based causal and predictive AI insights with generative AI capabilities.
Davis AI enables customers to harness the power of AI for cloud observability and security at scale.
- Davis predictive AI uses dynamic machine learning and statistical models to anticipate future behavior based on past data and observed patterns, and to recommend actions related to the performance and security of an organization's software.
- Davis causal AI analyzes real-time and context-rich observability, security, and business data within the Dynatrace platform to provide precise, fact-based, and deterministic answers and intelligent automation.
- Davis CoPilot™ generative AI works with Dynatrace predictive and causal AI to automatically provide recommendations, create suggested workflows and dashboards, or let people use natural language to explore, solve, and complete tasks.
Leveraging the powerful combination of these artificial intelligence techniques, the Dynatrace platform helps organizations speed up issue resolution and increase security in their cloud environments by quickly detecting and pinpointing the root cause of problems in software, recommending solutions, and automating remediation before end-users are impacted. With Dynatrace, organizations cut through the complexity and massive volumes of data that modern cloud ecosystems create to ensure the reliability, efficiency, and stability of applications and infrastructure.
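To make the predictive side of this combination concrete, here is a minimal, generic sketch of baseline-deviation anomaly detection over a latency metric. This is an illustrative toy in plain Python, not Dynatrace's actual Davis AI implementation; the function name, window, and threshold are all hypothetical.

```python
from statistics import mean, stdev

def detect_anomalies(series, window=10, threshold=3.0):
    """Flag points deviating more than `threshold` standard deviations
    from a rolling baseline -- a toy stand-in for predictive AI that
    learns 'normal' behavior from past data and spots departures from it."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]          # trailing window of history
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            anomalies.append(i)                  # record the anomalous index
    return anomalies

# Simulated per-request latency (ms): steady traffic, then a spike.
latency = [100, 102, 99, 101, 100, 98, 103, 100, 101, 99, 100, 250, 101]
print(detect_anomalies(latency))  # [11] -- the spike is flagged
```

In a real platform, a detection like this would carry the topology and dependency context described above, so the alert points at a root cause rather than just an index in a time series.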
Thousands of Dynatrace customers have benefited from Davis predictive and causal AI for many years. The expansion of Davis AI to include generative AI capabilities via Davis CoPilot will be generally available in late 2023, making it accessible to all customers as a core technology within the Dynatrace platform.
VMblog: Are companies going all in for the cloud? Or do you see a return back to on-premises? Are there roadblocks in place keeping companies from going all cloud?
Allen: Gartner predicts that by 2026, 75% of organizations will adopt a digital transformation model predicated on cloud as the fundamental underlying platform. Organizations are realizing the scalability, flexibility, and cost-savings benefits the cloud has to offer, and we believe cloud adoption will continue to accelerate. As new technologies that help businesses improve digital services continue to emerge, like generative AI, organizations will need to keep shifting workloads to the cloud in order to scale and be successful.
VMblog: The keynote stage will be covering a number of big topics, but what big changes or trends does your company see taking shape as we head into 2024?
Allen: We're seeing a few key trends take shape in the AI and observability markets as we head into 2024. The first is that the world will go hypermodal in its approach to AI. As generative AI enters the later stages of its hype cycle in 2024 and organizations realize that it cannot deliver meaningful value by itself, they will move toward a hypermodal approach. We will see organizations combine generative AI with other types of artificial intelligence and sources of data to enable more advanced reasoning and bring precision, context, and meaning to its outputs.
Additionally, we predict that AI-generated code will create the need for digital immune systems. As software developers continue to use generative AI-powered autonomous agents to write code for them, organizations will be exposed to greater risk of unexpected problems that impact customer and user experiences. Those that attempt to use generative AI to review and resolve issues in the code created by another generative AI will find themselves with a recursive problem, as they will still lack the fundamental knowledge and understanding needed to manage it effectively. This will drive organizations to develop digital immune systems that protect their applications from the inside, by ensuring they are resilient by default. To enable this, organizations will harness predictive AI to automatically sense problems as they begin to emerge and trigger an instant, automated response to safeguard the user experience.
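The "sense and respond" loop described above can be sketched in a few lines. This is a hypothetical toy, not any vendor's actual mechanism: a watchdog that compares a stream of per-minute error rates against an expected baseline and fires an automated rollback the moment errors spike past a multiple of it.

```python
def immune_response(error_rates, baseline=0.01, factor=5.0, rollback=None):
    """Toy 'digital immune system': scan per-minute error rates and trigger
    an automated rollback as soon as the rate exceeds `factor` times the
    expected baseline. All names and thresholds here are hypothetical."""
    for minute, rate in enumerate(error_rates):
        if rate > baseline * factor:
            if rollback is not None:
                rollback(minute, rate)   # automated response, e.g. redeploy last good build
            return minute                # minute at which remediation fired
    return None                          # no intervention needed

# A deploy at minute 3 introduces a regression; remediation fires that minute.
rates = [0.008, 0.010, 0.009, 0.12, 0.30]
fired = immune_response(
    rates,
    rollback=lambda m, r: print(f"rolling back: minute {m}, error rate {r:.0%}"),
)
print(fired)  # 3
```

The design point is resilience by default: detection and response are wired together up front, so a regression, whether human-written or AI-generated, is contained before users feel it.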
And lastly, data observability will become mandatory. With the volume of data continuing to double every two years, organizations are urgently seeking to ingest and analyze it faster and at greater scale. However, the cost and risk of poor-quality data are greater than ever. In a recent survey, 57% of DevOps practitioners said the absence of data observability makes it difficult to drive automation in a compliant way. As a result, there will be increased demand for data observability solutions that enable organizations to rapidly and securely ingest high-quality, reliable data that is ready for analytics on demand.