Re: My Experiment of PG crash when dealing with huge amount of data

From: Jeff Janes <jeff(dot)janes(at)gmail(dot)com>
To: 高健 <luckyjackgao(at)gmail(dot)com>
Cc: pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: My Experiment of PG crash when dealing with huge amount of data
Date: 2013-08-31 20:42:29
Message-ID: CAMkU=1xDZ-yaK+mzLweqHL1wNCf0MTgh+Vpe1SwmGzzgicBXBQ@mail.gmail.com
Lists: pgsql-general

On Fri, Aug 30, 2013 at 2:10 AM, 高健 <luckyjackgao(at)gmail(dot)com> wrote:
>
>
> postgres=# insert into test01 values(generate_series(1,2457600),repeat(
> chr(int4(random()*26)+65),1024));

The construct "values (srf1, srf2)" generates its entire result set
in memory up front; it does not "stream" its rows to the insert
statement on the fly.
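A rough back-of-envelope estimate (an illustration, not a measurement of
PostgreSQL's actual allocation, which adds per-row overhead on top) shows
why materializing that result set is painful:

```python
# Size of the result set the VALUES construct builds up front:
# 2,457,600 rows, each carrying a 1024-byte repeat(...) string.
rows = 2_457_600            # generate_series(1, 2457600)
payload = 1024              # repeat(chr(...), 1024) -> 1024 bytes per row
total_bytes = rows * payload
print(total_bytes / 1024**3)  # 2.34375 GiB of payload alone
```

So the backend needs well over 2 GiB before the INSERT consumes a single
row, which fits the crash you observed.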

To spare memory, you would want to use something like:

insert into test01
select generate_series,
       repeat(chr(int4(random()*26)+65), 1024)
from generate_series(1, 2457600);
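The difference between the two forms is analogous to building a full list
up front versus consuming a generator lazily (a Python analogy, not a
description of PostgreSQL internals):

```python
import sys

n = 100_000

# Like VALUES(generate_series(...)): every row exists at once.
materialized = [i for i in range(n)]

# Like SELECT ... FROM generate_series(...): rows produced on demand,
# so the container itself stays tiny regardless of n.
streamed = (i for i in range(n))

print(sys.getsizeof(materialized) > sys.getsizeof(streamed))  # True
```

In the SELECT form, each row is handed to the INSERT as it is generated,
so memory use stays flat no matter how many rows you load.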

Cheers,

Jeff
