Re: BUG #4527: Prepare of large multirow insert fails without error

Lists: pgsql-bugs
From: "Vincent Kessler" <vincent(dot)kessler(at)quantec-networks(dot)de>
To: pgsql-bugs(at)postgresql(dot)org
Subject: BUG #4527: Prepare of large multirow insert fails without error
Date: 2008-11-13 01:17:10
Message-ID: 200811130117.mAD1HAra061986@wwwmaster.postgresql.org


The following bug has been logged online:

Bug reference: 4527
Logged by: Vincent Kessler
Email address: vincent(dot)kessler(at)quantec-networks(dot)de
PostgreSQL version: 8.3.4
Operating system: Debian Linux 2.6.18-6-686-bigmem
Description: Prepare of large multirow insert fails without error
Details:

Hi,

I am trying to do large multirow inserts using PQsendPrepare. I have not
found a documented limit on the number of parameters or the size of the query
string, so I assume memory is the limit.
When I call PQsendPrepare with a query string of about 100 kB and about
10000 parameters, the function returns, but the connection later times out. A
tcpdump shows a "Parse" message with a declared length of 100 kB, but the
transfer stops after roughly 30 kB.

The server log shows:
LOG: incomplete message from client
LOG: unexpected EOF on client connection

A pcap file for analysis using tcpdump or wireshark is available here:
http://www.quantec-networks.de/downloads/dump.pcap

I have no quick way to upgrade to 8.3.5, but according to the changelog the
fixes in that release should not be relevant here.

What can I do to narrow down the problem?
Any help is greatly appreciated.

Regards,

Vincent


From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: "Vincent Kessler" <vincent(dot)kessler(at)quantec-networks(dot)de>
Cc: pgsql-bugs(at)postgresql(dot)org
Subject: Re: BUG #4527: Prepare of large multirow insert fails without error
Date: 2008-11-13 15:28:47
Message-ID: 20112.1226590127@sss.pgh.pa.us

"Vincent Kessler" <vincent(dot)kessler(at)quantec-networks(dot)de> writes:
> i am trying to do large multirow inserts using PQsendPrepare. I have not
> found a limit in the number of parameters or the size of the querystring, so
> i assume memory is the limit.
> When executing the PQsendPrepare function using a querystring of about 100kb
> in size and about 10000 parameters the function returns after timeout. A
> tcpdump shows a "parse" message with a length of 100kb but the transfer
> stops after roughly 30kb.

With such a large statement it's unlikely that the PQsendPrepare call
would have been able to push all the data out immediately. Since it's
intended to not block the application, it would return with some data
still unsent. You need to call PQflush periodically until the data is
all transmitted, if you want to run in nonblocking mode.
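The flush loop described above might look like the following minimal sketch. The connection string, table, statement name, and two-parameter query are placeholders standing in for the real ~100 kB statement, and error handling is abbreviated; it also needs a reachable server to actually run.

```c
/* Sketch of flushing a large Parse message in libpq nonblocking mode.
 * Placeholders: conninfo "dbname=test", table t, statement "bigstmt". */
#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    PGconn *conn = PQconnectdb("dbname=test");   /* placeholder conninfo */
    if (PQstatus(conn) != CONNECTION_OK)
    {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    /* Put the connection into nonblocking mode. */
    if (PQsetnonblocking(conn, 1) != 0)
    {
        fprintf(stderr, "PQsetnonblocking failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    /* A large multirow INSERT would go here; two parameters for brevity. */
    const char *query = "INSERT INTO t (a, b) VALUES ($1, $2)";
    if (!PQsendPrepare(conn, "bigstmt", query, 2, NULL))
    {
        fprintf(stderr, "PQsendPrepare failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    /* In nonblocking mode the Parse message may be only partially sent.
     * PQflush returns 1 while queued data remains, 0 once everything has
     * been transmitted, and -1 on error, so loop until it finishes. */
    int rc;
    while ((rc = PQflush(conn)) == 1)
    {
        /* A real application should wait for PQsocket(conn) to become
         * writable (e.g. with select() or poll()) instead of spinning. */
    }
    if (rc == -1)
    {
        fprintf(stderr, "PQflush failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    /* From here, consume the server's reply as usual
     * (PQconsumeInput / PQisBusy / PQgetResult). */
    PQfinish(conn);
    return 0;
}
```

The same loop is needed after PQsendQueryPrepared and friends whenever the connection is nonblocking, since any of those calls can leave data queued in libpq's output buffer.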

> The server log shows:
> LOG: incomplete message from client
> LOG: unexpected EOF on client connection

Although this explanation doesn't say why the client apparently dropped
the connection. I think you need to show us a complete example of what
you're doing, if the above hint isn't sufficient.

regards, tom lane


From: Vincent Kessler <vincent(dot)kessler(at)quantec-networks(dot)de>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: pgsql-bugs(at)postgresql(dot)org
Subject: Re: BUG #4527: Prepare of large multirow insert fails without error
Date: 2008-11-13 17:03:22
Message-ID: 491C5DDA.7090909@quantec-networks.de


On 13.11.2008 16:28, Tom Lane wrote:
> "Vincent Kessler" <vincent(dot)kessler(at)quantec-networks(dot)de> writes:
>> i am trying to do large multirow inserts using PQsendPrepare. I have not
>> found a limit in the number of parameters or the size of the querystring, so
>> i assume memory is the limit.
>> When executing the PQsendPrepare function using a querystring of about 100kb
>> in size and about 10000 parameters the function returns after timeout. A
>> tcpdump shows a "parse" message with a length of 100kb but the transfer
>> stops after roughly 30kb.
>
> With such a large statement it's unlikely that the PQsendPrepare call
> would have been able to push all the data out immediately. Since it's
> intended to not block the application, it would return with some data
> still unsent. You need to call PQflush periodically until the data is
> all transmitted, if you want to run in nonblocking mode.
>

Thank you very much, that was exactly the problem. Everything works
perfectly now.

Regards,

Vincent