Migrations with csv / json file #1402
-
CSV/text/binary is rather straightforward as long as you read it from STDIN instead of trying to access a file, because the file would have to be accessible from the server, not from the client. I can implement this, since I also need it from time to time to populate tables that I use as flexible, externally accessible enums. All that is needed is to write this type of SQL, so it's really not overly difficult to do:

```sql
COPY target_table FROM STDIN CSV HEADER DELIMITER ';' QUOTE '"';
"id";"name";"date"
1;"foo";"2025-07-03"
2;"bar";"2025-07-03"
\.
```

The COPY documentation lists the usable options right away.
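As a side note, the same statement can also be written with COPY's option-list syntax (PostgreSQL 9.0+), and psql's \copy meta-command streams a client-side file through STDIN the same way, so the server never needs filesystem access to it (the file name below is just a placeholder):

```sql
-- Option-list form of the same COPY (PostgreSQL 9.0+):
COPY target_table FROM STDIN WITH (FORMAT csv, HEADER true, DELIMITER ';', QUOTE '"');

-- psql equivalent that reads a client-side file and feeds it via STDIN:
-- \copy target_table FROM 'data.csv' WITH (FORMAT csv, HEADER true, DELIMITER ';', QUOTE '"')
```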
With JSON a pure-SQL approach is rather difficult, but still theoretically possible, since JSON(B) can be parsed from a string by typecasting:

```sql
BEGIN;

CREATE TEMPORARY TABLE target_table (
    id INT,
    v1 TEXT,
    v2 DATE
) ON COMMIT DROP;

WITH source_data(data_as_json) AS (
    VALUES ('
        [
            {
                "id": 1,
                "v1": "foobar 1",
                "v2": "2025-06-03T21:00:00.000+03:00"
            },
            {
                "id": 2,
                "v1": "foobar 2",
                "v2": "2025-06-03T22:00:00.000+03:00"
            }
        ]
    '::JSONB)
)
INSERT INTO target_table (id, v1, v2)
SELECT
    x.*
FROM
    source_data,
    jsonb_to_recordset(source_data.data_as_json) AS x(id INT, v1 TEXT, v2 DATE);

SELECT * FROM target_table;

END;
```

A very big issue is that the script has to know the intended columns and their types, so it would mean either heuristics or an unnecessarily clumsy interface; in practice this would best be done with separate or grouped migrations.
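That said, when the target table already exists, the column list does not have to be repeated in the SQL at all: jsonb_populate_recordset can derive the record type from the table itself. A minimal sketch, assuming target_table has the schema above:

```sql
-- jsonb_populate_recordset takes the column names and types from
-- target_table's row type, so the script need not spell them out;
-- JSON fields are matched to columns by name, missing ones become NULL.
INSERT INTO target_table
SELECT *
FROM jsonb_populate_recordset(
    NULL::target_table,
    '[
        {"id": 1, "v1": "foobar 1", "v2": "2025-06-03"},
        {"id": 2, "v1": "foobar 2", "v2": "2025-06-03"}
    ]'::JSONB
);
```

This only sidesteps the problem when the table (and therefore its types) already exists; importing into a fresh table would still require the schema from somewhere.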
-
Now I am actually not sure whether you meant defining the migrations themselves as CSV/JSON files, rather than injecting data. Which was it? :)
-
Hey, just wondering if there is any plan to add CSV / JSON file importing / migration to Postgres in this package? Thanks.