Re: Adapter update.

From: Ow Mun Heng <Ow(dot)Mun(dot)Heng(at)wdc(dot)com>
To: Richard Huxton <dev(at)archonet(dot)com>
Cc: Murali Maddali <murali(dot)maddali(at)uai(dot)com>, pgsql-general(at)postgresql(dot)org
Subject: Re: Adapter update.
Date: 2007-09-07 03:06:30
Message-ID: 1189134390.17218.40.camel@neuromancer.home.net
Lists: pgsql-general

On Wed, 2007-08-22 at 20:41 +0100, Richard Huxton wrote:
> Murali Maddali wrote:
> > This is what I am doing, I am reading the data from SQL Server 2005 and
> > dumping to out to Postgresql 8.2 database.

My 2 cents.. I'm doing roughly the same thing, but I'm using Perl and
DBI to do it.

> Fastest way to load data into PG is via COPY, don't know if npgsql
> driver supports that. If not, you'd have to go via a text-file.
>
> Load the data into an import table (TEMPORARY table probably) and then
> just use three queries to handle deletion, update and insertion.
> Comparing one row at a time is adding a lot of overhead.
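For what it's worth, those three set-based queries could look roughly
like this in PG (import_t, target_t, pkey and col1 are names I've made
up for illustration):

-- deletion: rows that vanished from the source
DELETE FROM target_t t
 WHERE NOT EXISTS
       (SELECT 1 FROM import_t i WHERE i.pkey = t.pkey);

-- update: rows present in both
UPDATE target_t t
   SET col1 = i.col1
  FROM import_t i
 WHERE t.pkey = i.pkey;

-- insertion: brand-new rows
INSERT INTO target_t (pkey, col1)
SELECT i.pkey, i.col1
  FROM import_t i
 WHERE NOT EXISTS
       (SELECT 1 FROM target_t t WHERE t.pkey = i.pkey);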

My way of doing it..

1. Pull from SQL Server via DBI to a temp CSV file.
2. Import via \copy into PG to a temp table.
begin transaction
3. Delete duplicate pkey entries in the actual table.
4. Insert new entries into the actual table.
5. Truncate the temp table.
6. Update a log file.
end transaction.

works great..

Note on [3]: all the data are new, so instead of doing an update I
resorted to a delete-then-reinsert, like MySQL's mysqlimport --replace
command. (my choice -- see the sketch below)
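
In case it helps, here's a rough sketch of my flow in Perl/DBI. The
DSNs, file paths and table/column names (source_table, import_t,
target_t, pkey, col1) are made up for illustration, and the \copy step
just shells out to psql:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Text::CSV_XS;

my $mssql = DBI->connect('dbi:ODBC:mssql_dsn', 'user', 'pass',
                         { RaiseError => 1 });
my $pg    = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                         { RaiseError => 1 });

# 1. Pull from SQL Server via DBI to a temp CSV file.
my $csv = Text::CSV_XS->new({ binary => 1, eol => "\n" });
open my $out, '>', '/tmp/extract.csv' or die $!;
my $sth = $mssql->prepare('SELECT pkey, col1 FROM source_table');
$sth->execute;
while (my $row = $sth->fetchrow_arrayref) {
    $csv->print($out, $row);
}
close $out;

# 2. Import via \copy into a PG temp table (shell out to psql).
system('psql', '-d', 'mydb', '-c',
       q{\copy import_t FROM '/tmp/extract.csv' WITH CSV}) == 0
    or die "psql \\copy failed";

# 3-5. One transaction: delete dupes, insert, truncate.
$pg->begin_work;
$pg->do('DELETE FROM target_t
          WHERE pkey IN (SELECT pkey FROM import_t)');
$pg->do('INSERT INTO target_t SELECT * FROM import_t');
$pg->do('TRUNCATE import_t');   # TRUNCATE is transaction-safe in PG
$pg->commit;

# 6. Update a log file. File writes don't roll back, so I've put
# this after the commit rather than inside the transaction.
open my $log, '>>', '/tmp/load.log' or die $!;
print $log scalar(localtime), " batch loaded\n";
close $log;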
