
MySQL profiler

March 4th, 2009

When a query is slow, you can debug it with EXPLAIN. But did you know that since MySQL 5.0 you can also use MySQL's built-in profiler?
It is easy to use and can help you find the bottleneck in your query.

How it works

Open a MySQL console and turn on the profiler:

mysql> set profiling=1;
Query OK, 0 rows affected (0.00 sec)

Then execute the query you want to profile. You can execute as many queries as you like, but by default only the last 15 show up in the profiler.

mysql> select count(1) from files;
+----------+
| count(1) |
+----------+
|    13631 |
+----------+
1 row in set (0.09 sec)
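If you need a longer history, it can be enlarged through the profiling_history_size session variable (a standard MySQL variable; the manual caps it at 100):

```
mysql> set profiling_history_size=50;
Query OK, 0 rows affected (0.00 sec)
```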

Now look up the query ID with:

mysql> show profiles;
+----------+------------+----------------------------+
| Query_ID | Duration   | Query                      |
+----------+------------+----------------------------+
|        1 | 0.09114000 | select count(1) from files |
+----------+------------+----------------------------+
1 row in set (0.00 sec)

To get the details of a query, use its query ID:

mysql> show profile for query 1;
+----------------------+-----------+
| Status               | Duration  |
+----------------------+-----------+
| (initialization)     | 0.000003  |
| checking permissions | 0.000031  |
| Opening tables       | 0.000028  |
| System lock          | 0.000018  |
| Table lock           | 0.000009  |
| init                 | 0.00002   |
| optimizing           | 0.000008  |
| statistics           | 0.000023  |
| preparing            | 0.000012  |
| executing            | 0.000008  |
| Sending data         | 0.0909409 |
| end                  | 0.000013  |
| query end            | 0.000005  |
| freeing items        | 0.00001   |
| closing tables       | 0.000008  |
| logging slow query   | 0.000003  |
+----------------------+-----------+
16 rows in set (0.01 sec)

Now you can see the duration of every step MySQL takes. You can get a lot more detail, such as CPU usage; take a look at the manual.
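SHOW PROFILE also accepts type clauses that add extra columns to the per-step breakdown. For example, the CPU clause (documented in the MySQL manual) adds CPU_user and CPU_system columns:

```
mysql> show profile cpu for query 1;
```

Other types include BLOCK IO, CONTEXT SWITCHES and SOURCE.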

pgFouine PostgreSQL log analyzer

February 11th, 2009

If you have a busy PostgreSQL database, you may want to know which queries take up most of the postmaster's time. To get a clear overview of the slowest and most frequent queries, you can use pgFouine.
This PHP script parses a PostgreSQL log file and generates various reports.

The command I usually use:

./pgfouine.php -from "- 1 week" -memorylimit 512 -file <logfile.log> \
  -logtype stderr -title 'PostgreSQL analyze' \
  -report db-report.html=overall,bytype,hourly,slowest,n-mosttime,n-mostfrequent,n-slowestaverage,n-mostfrequenterrors \
  -format html-with-graphs
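pgFouine can only analyze what PostgreSQL actually logs. A postgresql.conf along these lines makes the log usable with -logtype stderr (the setting names are standard PostgreSQL options; check the pgFouine site for the exact log_line_prefix it expects):

```
# postgresql.conf - log statements slower than 100 ms to stderr
log_destination = 'stderr'
log_min_duration_statement = 100      # in milliseconds; 0 logs every statement
log_line_prefix = '%t [%p]: [%l-1] '  # timestamp/pid/line-number prefix pgFouine can parse
```

A low log_min_duration_statement gives a more complete report, but on a busy server it noticeably increases log volume.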