How to Run Analysis?

date January 1, 2011 02:13 by author Sukesh Ashok Kumar

One of the core features of Debug Analyzer.NET is to run Analysis on memory dumps and produce a friendly report showing the details of the dump, the issues detected, and recommendations for resolving those issues. This is accomplished through a modular approach built on Analysis Plugs. The Analysis Plugs are grouped into what are called "Analysis Templates" according to the scenarios or issues we are troubleshooting.

Here is how the "Analysis Templates" are shown on the left side of the main screen.

How do you Run Analysis?
To run an analysis, select a memory dump, choose one of these "Analysis Templates", and click "Run Analysis".
Each sub-element shown under the Template Name (highlighted in blue) is an individual Analysis Plug.
Please see the article titled "Hello World Analysis Plug" on the right to learn how to write your own Analysis Plugs.

How do you create Analysis Template?
From the top menu, click Tools > Analysis Templates. This opens the following screen, where you can edit your Templates.


Architecture & Design

date January 1, 2011 01:13 by author Sukesh Ashok Kumar

From my experience as a developer, I have seen a lot of applications that are great from an architecture perspective. One of the things that is often left out, or comes last, is application extensibility and the developer experience. This leads either to complicated configuration settings or to a huge learning curve to accomplish the simplest of things.

When I started building this application, I had a few guidelines, and I worked backwards in many things to make the development experience easier (still learning to do more, though...).
One of the best things to happen in the developer world is IntelliSense support. It is almost the copy/paste of the developer world, and I wanted to make sure I used it in the best possible way.

Design Thoughts

  • Write Analysis just once and it works for both x86 and x64
    The Debug Analyzer built-in framework abstracts away the Data Access, so your Analysis works without changes on x86 and x64 targets: it is written against the Object Model and has no dependency on the processor architecture of the memory dump.
  • Plug Framework
    This framework allows the flexibility of dropping a DLL into a designated folder and having the application consume it, which makes writing analyses easy, with little or no learning curve. The framework is used in all core features: Analysis Plug, Command Plug, Visualizer Plug, etc.
  • Analysis Workflow
    Enabling developers to learn memory-dump debugging was one of the requirements. This is achieved by having the flow of analysis plugs follow almost the same pattern you would use manually with windbg and sos/psscor. If you know how to debug with windbg + sos/psscor you can write analyses, and vice versa.
  • Object Model
    Build an Object Model hierarchy that is consistent and easy to discover. This is implemented as the Analyzer Object Model hierarchy, and IntelliSense exposes the entire hierarchy for a smooth flow.
  • Extensibility with Abstraction layers
    Designed as a tiered application, with the ability to replace components through configuration thanks to the abstraction layers. The Analysis Engine can be called with simple parameters, and the Data Access Components can be replaced by just changing the config file, especially useful if you want to change the data source.
  • API is contained in single DLL (Debug.Analyzer.API)
    Whatever you do, you reference only a single DLL, whether it is to write Plugs, for Tracing, the DAC, or anything else.
  • Reporting Framework
    The Reporting Framework handles the look and feel entirely through XML/XSL. This ensures a structured way of reporting and also makes it possible to change the look and feel of the reports without rewriting or recompiling the Plugs.
  • Reduced footprint
    Since the full Windows Debugging Tools package is a large download, Debug Analyzer uses the MSDN-documented COM interfaces to interact with memory dumps and uses sos/psscor to extract CLR-relevant data. Moving to .NET Framework 4.0 was also done to reduce the number of prerequisites!
  • Better resources to educate
    Learn mode plays a major role in educating developers on what the issue was and how it can be avoided, explaining the reasons for the issue in detail. Often, analysis results come with complicated terminology, assuming everyone is familiar with it. The Learn option lets developers get more information about the problem and gain the knowledge to make sure the mistakes are not repeated.
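The Plug Framework idea from the list above — drop a file into a designated folder and the application discovers and consumes it — can be sketched roughly as follows. This is a hypothetical Python sketch, not the actual Debug Analyzer.NET API (which is .NET-based and loads DLLs); the folder layout and the `run_analysis` entry-point name are illustrative assumptions.

```python
import importlib.util
import pathlib

def load_plugs(folder):
    """Discover plug modules dropped into a designated folder.

    Any .py file that defines a run_analysis() entry point is treated
    as an Analysis Plug (names are hypothetical, for illustration only).
    """
    plugs = []
    for path in sorted(pathlib.Path(folder).glob("*.py")):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        if hasattr(module, "run_analysis"):
            plugs.append(module)
    return plugs

def run_template(folder, target):
    """Run every discovered plug against the target, collecting results."""
    return [plug.run_analysis(target) for plug in load_plugs(folder)]
```

The point of the design is that nothing needs to be registered anywhere: the act of dropping the file into the folder is the whole deployment step.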

A simplified look at the different component layers

How it works!
Here is the flow of events that happens when you click 'Run Analysis':

  • Calls the Debug Analyzer Engine with the 'Analysis Template' name and the memory dump file details
  • The Engine loads all the Analysis Plugs which are part of the specified Template
  • Instantiates each individual Plug and runs its code through the entry point 'RunAnalysis'
  • On first use, the Plug Framework invokes the Input Provider, which is part of the Provider Framework
  • The Provider Framework does a lookup in the config and loads/instantiates the relevant Data Access Component
  • The Provider Framework also caches Collections inside the Object Model on first load to make the analysis faster
  • The Data Access Component contains the complete logic for how the data is extracted from the Data Source, which is memory dumps (for now)
  • The Reporting Framework captures the output generated in XML form and returns the Analysis fragment for each Plug
  • After all the Plugs have returned, the individual fragments are combined by the Reporting Framework with the built-in XSL to generate the output HTML
  • Writes the output to the Reports folder and shows the generated report in the main application
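The steps above can be condensed into a pipeline sketch. Again, this is a hypothetical Python sketch of the control flow only; the real engine is .NET, and the parameter names (`data_access`, `to_html`) stand in for the Data Access Component and the XSL transform.

```python
def run_analysis(template, plugs, data_access, to_html):
    """Sketch of the engine flow: run each plug of the chosen template,
    capture its XML fragment, then transform the combined report.

    data_access: callable standing in for the Data Access Component.
    to_html: callable standing in for the built-in XSL transform.
    """
    cache = {}  # Provider Framework caches collections on first load

    def provider(query):
        if query not in cache:
            cache[query] = data_access(query)   # lazy load + cache
        return cache[query]

    fragments = []
    for plug in plugs[template]:
        fragments.append(plug(provider))        # each plug returns an XML fragment
    report_xml = "<analysis>" + "".join(fragments) + "</analysis>"
    return to_html(report_xml)
```

Note how the plugs never touch the data source directly: they only see the provider, which is what makes the caching and the later swapping of data sources transparent to them.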

There were a few challenges as well, which are explained below.

Challenge #1 - Textual output from Debugger and Extensions
The Debugger Engine (dbgeng) works great for manually debugging and analyzing issues, but when you want to build debugging automation, the textual output from the debugger becomes difficult to use. None of the debugger extensions provide an alternate output format that can be easily consumed by automation.

This became a challenge especially because the same is true of debugger extensions like sos/psscor used for .NET debugging. Writing my own extension with all the features does not make much sense, since that would also involve keeping up to date with changes in the CLR, etc. So the best alternative was to take the debugger output from sos/psscor and parse the string output.
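To give a flavor of what that parsing involves, here is a hypothetical Python sketch that picks type statistics out of `!dumpheap -stat`-style text. The column layout shown in the regular expression is an assumption for illustration; the real format varies between sos/psscor versions, which is exactly the fragility discussed here.

```python
import re

# Assumed "!dumpheap -stat"-style lines of the form:
#   <MT (hex)> <Count> <TotalSize> <Class Name>
# Header lines and other text simply fail to match and are skipped.
STAT_LINE = re.compile(
    r"^(?P<mt>[0-9a-fA-F]+)\s+(?P<count>\d+)\s+(?P<size>\d+)\s+(?P<name>\S.*)$")

def parse_heap_stats(text):
    """Return a list of (type name, instance count, total size) tuples."""
    stats = []
    for line in text.splitlines():
        m = STAT_LINE.match(line.strip())
        if m:
            stats.append((m.group("name"),
                          int(m.group("count")),
                          int(m.group("size"))))
    return stats
```

Every such parser is a small contract with one particular output format, which is why the next section pushes all of them behind an abstraction layer.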

This raises a question: what happens when the debugger extension changes its output format? Do I need to change my analyses as well? The answer was to build a Data Access Component abstraction layer, so that Analysis is written against the Object Model and the underlying data source can be changed without breaking it. This also allows moving from the SOS strings used today to, perhaps, ICorDebug later, all without breaking the existing Analyses already written.
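The "replace the component through config" idea can be sketched like this (a hypothetical Python sketch; the real application loads .NET assemblies named in its config file, and the key names below are made up):

```python
import importlib

def load_data_access(config):
    """Instantiate the Data Access Component named in the config.

    Analysis code only ever sees the object model this component
    returns, so switching e.g. from an SOS-string parser to an
    ICorDebug-based component is a config-file change, not a code
    change. (Config key names here are hypothetical.)
    """
    module = importlib.import_module(config["data_access_module"])
    component_cls = getattr(module, config["data_access_class"])
    return component_cls()
```

Because the component is looked up by name at runtime, the analyses never hold a compile-time reference to any particular data source.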

Challenge #2 - Debugger (dbgeng) redistribution and the version in the OS
The Debugger Engine, which provides all the debugger functionality, is contained in a DLL called dbgeng. It is distributed with the OS but not regularly updated; the latest updates come with the 'Windows Debugging Tools' package. Since this package is large, it becomes a heavy download dependency for a simple application like Debug Analyzer.NET.

The best choice was to use the version of the Debugger Engine available in the OS through the COM-based interfaces documented on MSDN. This is implemented at the Data Access Component layer, which is written in C++/CLI.

[Note: If you have any questions/concerns/feedback you can reach out to me through comments or Contact page ]