Re: array_to_json re-encodes ARRAY of json type

From: Andrew Dunstan <andrew(at)dunslane(dot)net>
To: Itagaki Takahiro <itagaki(dot)takahiro(at)gmail(dot)com>
Cc: PostgreSQL Hackers <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: array_to_json re-encodes ARRAY of json type
Date: 2012-02-20 14:01:08
Message-ID: 4F425224.4010904@dunslane.net
Lists: pgsql-hackers

On 02/20/2012 07:30 AM, Itagaki Takahiro wrote:
> If we pass an ARRAY of json type to the array_to_json() function, it
> seems to re-encode the JSON text. Shouldn't all three of the following
> examples produce the same result?
> I'm not sure why we don't have a special case for the json type in
> datum_to_json() -- should we pass json values through it unchanged?
>
> =# \x
> =# SELECT '["A"]'::json,
> array_to_json(ARRAY['A']),
> array_to_json(ARRAY['"A"'::json]);
> -[ RECORD 1 ]-+----------
> json          | ["A"]
> array_to_json | ["A"]
> array_to_json | ["\"A\""]
>

Hmm, maybe. The trouble is that datum_to_json doesn't know what type
it's getting, only the type category. We could probably fudge it at the
point where the category is detected, by faking a false category for
JSON, say a lower case 'j', which should be fairly future-proof. For
efficiency reasons we do that detection once for the whole array rather
than once per element.
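
Roughly what I have in mind, as an untested sketch (TYPCATEGORY_JSON is
a made-up local define, not an existing category in pg_type.h): where
the element category is worked out for the array, check for the json
type OID and flag it, and have datum_to_json emit such values verbatim:

    /* in json.c -- fake category, chosen not to collide with pg_type.h */
    #define TYPCATEGORY_JSON 'j'

    /* where the element category is determined, once per array */
    if (element_type == JSONOID)
        tcategory = TYPCATEGORY_JSON;
    else
        tcategory = TypeCategory(element_type);

    /* and in datum_to_json's switch on the category */
    case TYPCATEGORY_JSON:
        /* json text is already valid JSON - append it unescaped */
        outputstr = OidOutputFunctionCall(typoutputfunc, val);
        appendStringInfoString(result, outputstr);
        pfree(outputstr);
        break;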

There's another case on my list to fix too: some numeric outputs, such
as "NaN" and "Infinity", are not legal JSON numeric values, and need to
be quoted to avoid generating illegal JSON.
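
For the numeric case, something along these lines (again untested; the
letter-set test is just one way to spot NaN/Infinity in the output
function's result):

    case TYPCATEGORY_NUMERIC:
        outputstr = OidOutputFunctionCall(typoutputfunc, val);
        /*
         * A legal JSON number contains only digits, sign, decimal
         * point and exponent characters; anything containing letters
         * from "NaN"/"Infinity" (other than the exponent 'e'/'E')
         * gets quoted so we never emit illegal JSON.
         */
        if (strpbrk(outputstr, "NnAaIiFfTtYy") == NULL)
            appendStringInfoString(result, outputstr);
        else
            escape_json(result, outputstr);
        pfree(outputstr);
        break;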

cheers

andrew
