Question:
I am aware that this has been discussed before, but I’ve read conflicting information on this particular topic, so I’m hoping for some more decisive opinions from you.
I have multiple tables that stand in a 1:1 relationship to a main table, linked to it by a foreign key.
The largest of these, the parameter table, now has about 1000 columns, each holding a single numerical parameter.
The table in question has a size of about 16 MB with 7000 rows right now.
After creating a normalized structure (ID, Key, Value) and inserting all the data into it (leaving out the columns that had never been used in the wide table), it turned into 3 million rows and 107 MB.
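Roughly, the new structure looks like this (a simplified sketch; the actual names differ):

    CREATE TABLE parameter_kv (
        main_id   INT UNSIGNED NOT NULL,  -- foreign key to the main table
        param_key VARCHAR(64)  NOT NULL,  -- which parameter this row holds
        param_val DOUBLE       NOT NULL,  -- the numerical value
        PRIMARY KEY (main_id, param_key)
    ) ENGINE=InnoDB;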
Now I’m wondering whether the effort is really worth the gain. Sure, it’s a lot more scalable (lately I’ve had to add columns manually far too often just to keep it working), but this way there are up to 1000 rows per entity to sort through, and I’m just not sure any more.
Some input would be appreciated. Thank you very much.
Edit: The majority of the operations will be SELECT, followed by UPDATE and INSERT … ON DUPLICATE KEY UPDATE, respectively (the latter is for the relational approach).
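For the relational approach, the upsert would look something like this (using the sketch table above; the values are just examples):

    INSERT INTO parameter_kv (main_id, param_key, param_val)
    VALUES (42, 'pressure', 1.013)
    ON DUPLICATE KEY UPDATE param_val = VALUES(param_val);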
Answer:
MySQL has limits, such as 1017 columns in an InnoDB table (as of 5.6.9).
If you are not filtering or sorting on most of the columns, put them in a JSON string and store that string in a TEXT (or maybe MEDIUMTEXT) column.
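A minimal sketch of that layout, assuming the application serializes and parses the JSON (table and column names hypothetical):

    CREATE TABLE parameter_json (
        main_id INT UNSIGNED NOT NULL PRIMARY KEY,  -- foreign key to the main table
        params  MEDIUMTEXT   NOT NULL               -- e.g. '{"p1": 1.5, "p2": 0.25}'
    ) ENGINE=InnoDB;

Columns you do filter or sort on stay as real columns; everything else travels together in the one blob.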
A key-value store can lead to lots of JOINs; MySQL’s limit is 61 tables in a single query.
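For example, rebuilding just two parameters as columns already costs two self-joins (a sketch; the main table’s name is assumed):

    SELECT  m.id,
            p1.param_val AS pressure,
            p2.param_val AS temperature
    FROM    main AS m
    JOIN    parameter_kv AS p1
            ON p1.main_id = m.id AND p1.param_key = 'pressure'
    JOIN    parameter_kv AS p2
            ON p2.main_id = m.id AND p2.param_key = 'temperature';
    -- One join per parameter; past 61 joined tables the query is rejected.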
I’m confused: do you have 1000 sets of 7000 parameters, or what? If it is a single set of parameters, then a single key-value table makes some sense.