How to improve performance in a reporting database?

From: Matthew Wilson <matt(at)tplus1(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: How to improve performance in a reporting database?
Date: 2010-07-22 14:45:45
Message-ID: i29lip$9q2$1@dough.gmane.org
Lists: pgsql-general

I have a daily job that pushes data from the production database into
the reporting database, which, right now, is an exact copy.

I have a webapp that builds lots of reports for users. Most of these
reports involve elaborate joins of lookup tables and lots of summations,
and they take too long to run even after applying everything I know
about tuning the queries.

Since I know this data is read-only, it seems like I should be able to
speed everything up dramatically if I run the queries offline and save
the results into new tables. Then the webapp could just grab the cached
results out of those tables and spit them out quickly.
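
For concreteness, here's roughly the kind of thing I mean; the table and
column names (sales, regions, report_sales_by_region) are made up:

    -- Build a cached report table once; the daily job would refresh it later.
    -- sales, regions, and report_sales_by_region are made-up names.
    CREATE TABLE report_sales_by_region AS
        SELECT r.region_name,
               date_trunc('day', s.sold_at) AS sale_day,
               sum(s.amount)                AS total_amount,
               count(*)                     AS num_sales
          FROM sales s
          JOIN regions r ON r.region_id = s.region_id
         GROUP BY r.region_name, date_trunc('day', s.sold_at);

    -- Index the columns the webapp filters on, so lookups stay fast.
    CREATE INDEX report_sales_by_region_idx
        ON report_sales_by_region (region_name, sale_day);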

I've heard people talking about using "materialized views" for this, but
that was with Oracle.

What's the PostgreSQL way here?
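
For what it's worth, the manual refresh I'm picturing for the daily job
would look something like this (refresh_report_tables and the table
names are again made up):

    -- Re-fill the cached table after the nightly copy from production.
    -- Wrapping it in a function keeps the daily job down to one call.
    CREATE OR REPLACE FUNCTION refresh_report_tables() RETURNS void AS $$
    BEGIN
        TRUNCATE report_sales_by_region;
        INSERT INTO report_sales_by_region
            SELECT r.region_name,
                   date_trunc('day', s.sold_at) AS sale_day,
                   sum(s.amount)                AS total_amount,
                   count(*)                     AS num_sales
              FROM sales s
              JOIN regions r ON r.region_id = s.region_id
             GROUP BY r.region_name, date_trunc('day', s.sold_at);
        RETURN;
    END;
    $$ LANGUAGE plpgsql;

    -- Run once a day, right after the copy from production finishes:
    SELECT refresh_report_tables();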

More generally, any advice on running reporting databases well is
welcome.

Matt
