PostgreSQL & PHP Tutorials - Using Explain - Setting up


Posted By Chris Smith Posted on 05 Mar 2006, 07:31 AM

Now that we've seen how to index our database, how do we make sure those indexes are actually being used?

PostgreSQL allows us to 'explain' queries to see what's going on. For every query, the planner works out a plan describing how it will fetch the results, and EXPLAIN shows us that plan. We'll only look at 'select' statements here.
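As a quick preview (a minimal sketch; the exact plan and cost numbers will vary with your data and PostgreSQL version), prefixing a query with EXPLAIN prints the plan instead of running the query:

```sql
-- Show the plan the planner chose (costs are estimates):
EXPLAIN SELECT * FROM indextest WHERE id = 1000;

-- EXPLAIN ANALYZE actually runs the query and reports real timings:
EXPLAIN ANALYZE SELECT * FROM indextest WHERE id = 1000;
```

We'll dig into reading that output over the following pages.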

First of all we'll need to create a large database table so we can look at different plans.

This little perl script will create 5 million records to import into a temporary table:

$count = 1;
$outputfile = "import.sql";
@chars = ("A" .. "Z", "a" .. "z", 0 .. 9);
@numbers = (0 .. 9);
@single_chars = ("a" .. "e");
$totalrecords = 5000000; # 5 million

open(OUTPUT, ">", $outputfile) or die "Cannot open $outputfile: $!";
print OUTPUT "DROP TABLE indextest;\n";
print OUTPUT "CREATE TABLE indextest (";
print OUTPUT "id INT, name VARCHAR(10), number INT, letter CHAR(1)";
print OUTPUT ");\n";
print OUTPUT "COPY indextest (id,name,number,letter) FROM stdin;\n";
while ($count <= $totalrecords) {
    $randstring = join("", @chars[ map { rand @chars } (1 .. 8) ]);
    $randnum = join("", @numbers[ map { rand @numbers } (1 .. 8) ]);
    $randletter = $single_chars[ rand @single_chars ];
    # One tab-separated row in COPY format.
    print OUTPUT $count."\t".$randstring."\t".$randnum."\t".$randletter."\n";
    $count++;
}
# COPY input is terminated by a backslash-period on its own line.
print OUTPUT "\\.\n";
print OUTPUT "CREATE INDEX id_index ON indextest(id);\n";
print OUTPUT "CREATE INDEX number_index ON indextest(number);\n";
print OUTPUT "VACUUM ANALYZE indextest;\n";
close OUTPUT;

It may take a while to generate the data, depending on your hardware.

The "VACUUM ANALYZE" command updates PostgreSQL's planner statistics, which the planner uses to decide when an index is worth using and when it should be ignored.
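If you're curious what ANALYZE actually collected, the per-column statistics are exposed through the standard pg_stats view. A small sketch:

```sql
-- Per-column statistics gathered by ANALYZE for our table:
SELECT attname, n_distinct, null_frac
FROM pg_stats
WHERE tablename = 'indextest';
```

Columns like `letter`, with only a handful of distinct values, are exactly the ones where the planner may decide an index isn't worth using.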

Once it's done we'll import it (I'm assuming you've read how to create new databases):

psql -d indextest < import.sql

Again, this will take some time depending on your hardware.

We create the indexes after importing all of the data because it saves PostgreSQL a lot of work.

If we created the indexes beforehand, PostgreSQL would have to update each index as every row went into the table, which takes much longer for 5 million rows than building the indexes once over the finished table. Of course, this only applies to bulk loads like this one; importing from an application (eg through php) will be different.
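For application-side loading, one common technique (an addition here, not part of the original tutorial) is to batch many INSERTs into a single transaction, so PostgreSQL commits once instead of once per row:

```sql
BEGIN;
INSERT INTO indextest (id, name, number, letter) VALUES (1, 'Ab3dEf9h', 12345678, 'a');
INSERT INTO indextest (id, name, number, letter) VALUES (2, 'Zx8yWv2u', 87654321, 'b');
-- ...many more rows...
COMMIT;
```

The values above are made-up examples in the same shape as the generated data.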

Once that's done we can get to work!