From: Andrew Dunstan <andrew(at)dunslane(dot)net>
To: Teodor Sigaev <teodor(at)sigaev(dot)ru>
Cc: Pgsql Hackers <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: json/jsonb inconsistence - 2
Date: 2014-05-29 15:23:55
Message-ID: 5387510B.2090601@dunslane.net
Lists: pgsql-hackers
On 05/29/2014 08:15 AM, Andrew Dunstan wrote:
>
> On 05/29/2014 08:00 AM, Teodor Sigaev wrote:
>> postgres=# select '["\u0000"]'::json->0;
>> ?column?
>> ----------
>> "\u0000"
>> (1 row)
>>
>> Time: 1,294 ms
>> postgres=# select '["\u0000"]'::jsonb->0;
>> ?column?
>> -----------
>> "\\u0000"
>> (1 row)
>>
>> It seems to me that escape_json() is wrongly used in
>> jsonb_put_escaped_value(); a more accurate name for escape_json()
>> would be escape_to_json().
>
>
> That's a bug. I will look into it. I think we might need to
> special-case \u0000 on output, just as we do on input.
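For concreteness, here's a rough sketch of what that output-side special
case could look like -- illustrative only, not a tested patch; the
function name is invented, and the real change would live wherever
jsonb_put_escaped_value() does its escaping:

#include "postgres.h"
#include "lib/stringinfo.h"

/*
 * Sketch only: emit a stored "\u0000" sequence verbatim instead of
 * re-escaping its backslash, mirroring the input-side special case
 * (a NUL byte can't be stored in a text datum, so \u0000 is kept as
 * the literal six characters).
 */
static void
escape_json_keep_nul(StringInfo buf, const char *str)
{
    const char *p;

    appendStringInfoCharMacro(buf, '"');
    for (p = str; *p; p++)
    {
        if (*p == '\\' && strncmp(p, "\\u0000", 6) == 0)
        {
            /* pass the stored escape through untouched */
            appendBinaryStringInfo(buf, p, 6);
            p += 5;
            continue;
        }
        switch (*p)
        {
            case '"':
                appendStringInfoString(buf, "\\\"");
                break;
            case '\\':
                appendStringInfoString(buf, "\\\\");
                break;
            /* ... remaining control-character cases as in escape_json() ... */
            default:
                appendStringInfoCharMacro(buf, *p);
                break;
        }
    }
    appendStringInfoCharMacro(buf, '"');
}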
Actually, this is just the tip of the iceberg.
Here's what 9.3 does:
andrew=# select array_to_json(array['a','\u0000','b']::text[]);
array_to_json
---------------------
["a","\\u0000","b"]
I'm now wondering if we should pass through any unicode escape
(presumably validated to some extent). I guess we can't change this in
9.2/9.3 because it would be a behaviour change.
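If we did go the pass-through route, the "validated to some extent"
part could be as small as checking for exactly four hex digits after
\u. Again just a sketch with an invented name, not a proposal for the
actual code:

#include <ctype.h>
#include <stdbool.h>

/*
 * Returns true if p points at a backslash beginning a well-formed
 * \uXXXX escape (exactly four hex digits).  Stops at the string's
 * terminating NUL, so it never reads past the end.
 */
static bool
is_valid_unicode_escape(const char *p)
{
    int         i;

    if (p[0] != '\\' || p[1] != 'u')
        return false;
    for (i = 2; i < 6; i++)
    {
        if (!isxdigit((unsigned char) p[i]))
            return false;
    }
    return true;
}

The output loop would then copy the whole six-byte sequence whenever
this returns true, rather than matching \u0000 alone.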
These unicode escapes have given us more trouble than any other part of
the JSON spec :-(
cheers
andrew