Question:
We have a query which we want to optimize for high performance.
The message column in the table is of type JSONB and holds a large JSON object.
Is there a way to optimize this?
We already have an index on batch_num.
SELECT jsonb_array_elements_text(message #> '{results}') AS res FROM TABLE WHERE batch_num = '11';
Answer:
You painted yourself into a corner by modeling this with jsonb: the whole large object has to be read just to extract a single attribute.
The best I can think of is to break the desired value out of the large object into its own column:
ALTER TABLE mytab
ADD results jsonb GENERATED ALWAYS AS (message -> 'results') STORED;
This works on PostgreSQL 12 and later, where generated columns are available; on older versions you can use a trigger that does the same thing, as sketched below.
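A minimal sketch of the trigger variant, assuming the same placeholder names as above (mytab, message, results); adjust them to your actual schema:

ALTER TABLE mytab ADD results jsonb;

CREATE FUNCTION mytab_set_results() RETURNS trigger
   LANGUAGE plpgsql AS
$$BEGIN
   -- keep the extracted column in sync with the large object
   NEW.results := NEW.message -> 'results';
   RETURN NEW;
END;$$;

CREATE TRIGGER set_results
   BEFORE INSERT OR UPDATE OF message ON mytab
   FOR EACH ROW EXECUTE PROCEDURE mytab_set_results();

-- populate the new column for rows that already exist
UPDATE mytab SET results = message -> 'results';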
Then the query only has to read the results column instead of the whole message.
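With that in place, the query from the question could be rewritten along these lines (mytab again standing in for the real table name):

SELECT jsonb_array_elements_text(results) AS res
FROM mytab
WHERE batch_num = '11';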