Out of memory error during updating huge table

Lists: pgsql-general
From: Gabor Siklos <siklosg(at)yahoo(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Out of memory error during updating huge table
Date: 2006-02-10 15:58:56
Message-ID: 20060210155856.62005.qmail@web54215.mail.yahoo.com

I'm trying to update a table with 300 million records in it,
all inside a single transaction, and I'm getting an out of
memory error.

ERROR: out of memory
DETAIL: Failed on request of size 32.

Within that transaction, I first delete all the records in
the table and then use COPY to repopulate it with the new
data from a file. The out of memory error happens during
the COPY.
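
For reference, the whole operation looks roughly like this (the table
and file names here are just placeholders, not the real ones):

    BEGIN;
    -- throw away the old contents
    DELETE FROM big_table;
    -- reload from the new data file; the out of memory error
    -- happens somewhere during this step
    COPY big_table FROM '/path/to/newdata.dat';
    COMMIT;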

Any thoughts? Is there a way to get around this?
Thanks a lot!



From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: Gabor Siklos <siklosg(at)yahoo(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Out of memory error during updating huge table
Date: 2006-02-12 00:43:33
Message-ID: 782.1139705013@sss.pgh.pa.us
Lists: pgsql-general

Gabor Siklos <siklosg(at)yahoo(dot)com> writes:
> I'm trying to update a table with 300 million records in it,
> all inside a single transaction, and I'm getting an out of
> memory error.

The update per se shouldn't be a problem, but if you have AFTER ROW
triggers on the table then the list of pending trigger events could
be a problem. Foreign key constraints, in particular, use AFTER
triggers.
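
You can check for such triggers and constraints with psql's \d, or by
looking in pg_constraint directly, for instance (the table name is just
an example):

    \d big_table

    SELECT conname, conrelid::regclass, confrelid::regclass
    FROM pg_constraint
    WHERE contype = 'f'
      AND (conrelid = 'big_table'::regclass
           OR confrelid = 'big_table'::regclass);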

> Within that transaction, I first delete all the records in
> the table and then use COPY to repopulate it with the new
> data from a file. The out of memory error happens during
> the COPY.

If the problem is coming from an FK constraint, you might consider
dropping the constraint and then re-adding it after the COPY. This'd
probably be faster than checking the rows "retail", too.
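
Something along these lines (constraint, column, and table names are
made up for illustration):

    BEGIN;
    ALTER TABLE big_table DROP CONSTRAINT big_table_ref_fk;
    DELETE FROM big_table;
    COPY big_table FROM '/path/to/newdata.dat';
    -- re-adding the constraint validates all the rows in one pass
    -- instead of firing a per-row trigger for each COPY'd row
    ALTER TABLE big_table ADD CONSTRAINT big_table_ref_fk
        FOREIGN KEY (ref_id) REFERENCES ref_table (id);
    COMMIT;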

regards, tom lane