The idea originated from the problem that some sites simply stall on page load, especially when the cache has been cleared. The first real usage was when I contributed to Discuss to fix load issues. With the xBug Page Profiler it was easy to see that there were queries running for seconds which definitely needed attention. Even though the profiler does not point to exactly which file or plugin is responsible, it gives a good heads-up for identifying the issue just by knowing which page you load; it is even easier if the slow one is a snippet, which will be a clean outlier in the output tables.
Along came the Query Debugger, which was developed to profile individual xPDOCriteria / xPDOQuery objects without the hassle of calling prepare() and toSQL() and writing the result to the error log with xPDO::log(). The tool is a bit more complex to use, but after this post it should be easy.
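For reference, the manual workflow the tool replaces looks roughly like this (a sketch; the criteria shown is arbitrary and assumes a working MODX instance with $modx available):

```php
// Build an arbitrary example query against modResource.
$query = $modx->newQuery('modResource');
$query->where(array('parent' => 1));

// The manual way: prepare the statement, then dump the raw SQL
// into the MODX error log by hand.
$query->prepare();
$modx->log(xPDO::LOG_LEVEL_ERROR, $query->toSQL());
```

The Query Debugger does this inspection for you and shows the SQL next to the editor instead of burying it in the error log.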
When this article comes out, the stable version of xBug is 0.7.3-pl, and hopefully all the basic functionality stays the same up to the first major version, or is improved; for example, the profiler columns could be extended or split into more precise ones. If you encounter a bug, have an idea for improving the existing utilities, or want to see more of them, please create an issue in the xBug GitHub repository.
xBug Page Profiler
The easier to use of the two tools, and most likely the one that will be used more often than the Query Debugger. To get started, simply enter the id number of a resource, or drag and drop a resource from the tree into the "URL to be tested" field, and click 'Profile Page'; the outcome will look something like the following. Note: only uncached tags are shown.
Overview of sections
Parser Data Columns
- The tag as processed, without the [ and ] characters and the passed variables; only output modifiers are left on the tag. The collapsed view shows the total number of tags in that group.
- The complete tag in its original form, with inner tags processed, shown when the group is expanded.
- Processing Time (s)
- Time in seconds for the tag to be processed completely. When collapsed, shows the total time taken for the tags in that group.
The SQL Profiles
This profile is generated using the MySQL profiling feature, which explains the truncated queries: the profiles in MySQL do not hold the full queries, but they should give a good pointer to which queries are the bad ones. The columns are mostly self-explanatory anyway:
- Internal id from the profiles list; this is what the "SHOW PROFILE" command uses. Later versions may make the full profiling data available here.
- How long the query took
- The truncated query
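Under the hood this corresponds to MySQL's session profiling, which you can also run by hand in the MySQL CLI to see the same data:

```sql
SET profiling = 1;          -- enable profiling for this session
-- ... run the page's queries ...
SHOW PROFILES;              -- query ids, durations and truncated statements
SHOW PROFILE FOR QUERY 1;   -- detailed stage breakdown for one query
```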
Page Profile Options
- Host URL (will be included in 0.8-pl)
- Can be used to change the context. Accepts subdomain, domain and domain/path values. Defaults to the default context's base_path.
- URL to be tested
- Id number or URI of the resource. A drag and drop from the resource tree, in the format [~N], is also accepted for the resource id. When friendly URLs are enabled, both id forms are turned into URIs for the page load.
- URL Parameters
- Pattern: &var1=value&var2=value, i.e. the typical URL parameters passed to the page. Can be used to test pagination, for example when using getPage. As an example, indeksed.net has &page= on the front page.
- Refresh cache before page load
- Ticking the checkbox refreshes the whole cache before page load.
Simple use cases
Got a page that loads painfully slowly when the cache is refreshed
Got a page that always loads slowly
Want to cache a page after a cache refresh: just drag and drop the resource and leave 'Refresh cache before page load' unticked
Just want to see a general overview of what the page does
xBug Query Debugger
Query Debugger can be used in a variety of ways. Initially it was designed to show query information, like the memory usage of a collection. Later releases have added plenty of new features, like bits of MySQL documentation for the Explain tab's select_type, type and Extra fields. Basically it automates the workflow I use when developing a new query or troubleshooting a bad one, where the most crucial part is the Explain tab and the data it holds. It also skips over many of the stages typically needed to troubleshoot xPDO queries.
At its simplest, you can run the query that comes with it with two different options: getCollection and getObject. The output should look somewhat similar to the next picture:
Overview of Sections
On the left is the query editor. If you have ACE installed, it will be used to render the editor.
On top of the editor, the drop-down has five options for which kind of query processing will be used in the background.
To the right of the editor is the SQL representation of the query. It is not editable.
The first tab shows the result set generated by the query. You can view individual rows by clicking them.
The Explain tab comes next. This is the same result as "EXPLAIN <the query>" would show in the MySQL CLI or other tools. Three fields per row can be clicked to show information taken from the MySQL documentation: select_type, type and Extra.
Last but not least, some basic information about the query and the generated collection.
Query Debugger is a bit more complex to use than Page Profiler due to the way it hands the criteria to the processor. In the background, a new snippet is generated each time for the get*() methods; this way it can return the criteria to the collector, which is why the return statement is a mandatory part of the query. As it uses a snippet to handle everything, you may include PHP in the script, such as addPackage(), which is needed to test custom tables.
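As a sketch of the custom-table case (the package name, model path and class name here are hypothetical placeholders for your own schema), the snippet can register the package before building the query:

```php
// Hypothetical package: adjust the name and model path to your own component.
$modx->addPackage('mypackage', MODX_CORE_PATH . 'components/mypackage/model/');

$query = $modx->newQuery('myCustomObject');
$query->where(array('active' => 1));

// The return is mandatory: the generated snippet hands the
// criteria back to the Query Debugger's collector.
return $query;
```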
Using getObject and getCollection
These two really require no special attention. You can create a new xPDOCriteria or xPDOQuery object in the normal manner and then return it with return $theObject. The background processor knows how to handle it.
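A minimal getCollection query, for instance, is just the usual xPDOQuery construction ending in a return (the criteria here is arbitrary):

```php
$query = $modx->newQuery('modResource');
$query->where(array('published' => 1));
$query->sortby('pagetitle', 'ASC');

// The background processor accepts the returned criteria as-is.
return $query;
```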
Using 'Pure SQL Query'
You can write a normal MySQL-compliant SQL query in the editor and process it without a return value. You also need to leave out the <?php tags if there are any; MySQL obviously cannot handle PHP.
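For instance, assuming the default modx_ table prefix, a plain query against the resource table would be entered as just:

```sql
SELECT id, pagetitle, parent
FROM modx_site_content
WHERE parent = 1;
```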
Using getObjectGraph and getCollectionGraph
These two are a bit more complex, due to the JSON/array graph required by the collector, and need some refining. The return clause requires an array with two keys in order to be processed. An example of a simple getCollectionGraph query:
<?php
$crit = $modx->newQuery('modResource');
$crit->where(array('parent:=' => 1));
$graph = array('Children' => array());
return array(
    'graph' => $graph,
    'criteria' => $crit
);
Note that you can use a normal array as the graph instead of JSON. This is actually normal behavior and, in my opinion, a lot easier to write than a JSON string. There is a known issue where the output of the result set does not show child results properly. Aside from that, give the tool a try and report issues on GitHub.
The future and conclusion
I will be developing the package as often as I can take time off from regular everyday work. What exactly is coming in the future is truly unknown. There are quite a few bugs and "features" in the application right now which will take some time to fix, so there might be one more minor version batch before 0.8.0-pl. For 0.9 and 1.0 the feature list is still empty, and I hope users will create feature requests. Plenty of things could be done, and in particular could be done differently, like how the output is shown.
Hopefully this tool is useful to developers out there having issues with slow pages. The Page Profiler should definitely show where the issues lie. And if you can't seem to find a solution for a slow tag, plenty of help is available for free on the MODX forums and IRC, and there is always the possibility of paid support from me or a multitude of other MODX developers out there.