Question : I have a lot of large tables (around 10 million wide rows) which need to be regularly loaded […]
Tag: compression
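The snippet does not say which engine the tables live in; assuming SQL Server, a minimal sketch of creating a wide staging table that is page-compressed from the start, so regular loads land compressed (table and column names are hypothetical):

    CREATE TABLE dbo.WideStaging (
        Id      BIGINT         NOT NULL,
        Payload NVARCHAR(4000) NULL,
        CONSTRAINT PK_WideStaging PRIMARY KEY CLUSTERED (Id)
    )
    WITH (DATA_COMPRESSION = PAGE);  -- Enterprise-only before SQL Server 2016 SP1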
GZip existing varbinary(max) column in SQL Server 2008+
Question : I have an existing legacy table that is ~180GB in size due to a client application storing PDF […]
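One possibility, if the instance can be upgraded: COMPRESS() and DECOMPRESS() are built in from SQL Server 2016 and use the GZip format; on 2008/2012 the same pattern needs a CLR GZip function instead. The table and column names below (dbo.Documents, PdfData) are placeholders, and since PDF streams are often already deflate-compressed the gains can be modest:

    ALTER TABLE dbo.Documents ADD PdfDataCompressed VARBINARY(MAX) NULL;

    -- Compress in place; on a ~180 GB table this update should be batched.
    UPDATE dbo.Documents
    SET    PdfDataCompressed = COMPRESS(PdfData)
    WHERE  PdfDataCompressed IS NULL;

    -- Read path: DECOMPRESS() restores the original bytes.
    SELECT TOP (1) DECOMPRESS(PdfDataCompressed) AS PdfData
    FROM   dbo.Documents;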
How to plan to bring the space used by databases down as low as possible
Question : Per our security breaks and guidelines, when a database grows so big that it can't meet the requirements […]
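If the databases are SQL Server, it helps to measure what compression alone would recover before shrinking or archiving anything; a sketch using sp_estimate_data_compression_savings (dbo.BigTable is a placeholder, run it per table of interest):

    EXEC sp_estimate_data_compression_savings
         @schema_name      = N'dbo',
         @object_name      = N'BigTable',
         @index_id         = NULL,
         @partition_number = NULL,
         @data_compression = N'PAGE';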
SQL Server 2012 compress column for whole table
Question : I was wondering if SQL Server supports a kind of dictionary compression for a whole table, and not […]
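SQL Server's closest built-in feature is PAGE compression, which layers prefix and dictionary compression on top of ROW compression, but the dictionary is built per page rather than per table. A sketch of enabling it (the table name is a placeholder):

    ALTER TABLE dbo.SomeTable
        REBUILD WITH (DATA_COMPRESSION = PAGE);

    ALTER INDEX ALL ON dbo.SomeTable
        REBUILD WITH (DATA_COMPRESSION = PAGE);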
Postgres: Is migrating old records to separate table and creating new aggregated records a good strategy to improve search speed?
Question : I have a table with 1M+ records. New records are created and updated each day each time an […]
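One common PostgreSQL pattern, sketched with hypothetical names (events, events_archive with the same columns as events, and events_daily with a unique constraint on day):

    BEGIN;

    -- Move rows older than a year into the archive table and delete them
    -- from the hot table in one data-modifying CTE.
    WITH moved AS (
        DELETE FROM events
        WHERE  created_at < now() - interval '1 year'
        RETURNING *
    )
    INSERT INTO events_archive SELECT * FROM moved;

    -- Maintain a small aggregate table for searches that only need counts.
    INSERT INTO events_daily (day, event_count)
    SELECT date_trunc('day', created_at)::date, count(*)
    FROM   events_archive
    GROUP  BY 1
    ON CONFLICT (day) DO UPDATE SET event_count = EXCLUDED.event_count;

    COMMIT;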
Compressing Base64 Encoded Image in Database Field. Table Size too Large
Question : Database info: database type – Percona MySQL; table engine – InnoDB. I have a database table that is […]
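A hedged MySQL sketch: decoding the Base64 text back to raw bytes saves roughly 25% on its own, and COMPRESS() can shrink it further, although already-compressed image formats such as JPEG or PNG won't deflate much. Table and column names are placeholders, and FROM_BASE64() needs MySQL 5.6+:

    ALTER TABLE photos ADD COLUMN image_blob MEDIUMBLOB NULL;

    -- Decode the Base64 text and zlib-compress the raw bytes.
    UPDATE photos
    SET    image_blob = COMPRESS(FROM_BASE64(image_base64))
    WHERE  image_blob IS NULL;

    -- Read path: UNCOMPRESS() restores the original image bytes.
    SELECT UNCOMPRESS(image_blob) FROM photos WHERE id = 42;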
slow disk with InnoDB page compression
Question : I have a write-intensive MariaDB server with both NVMe SSD and HDD disks. I recently enabled page compression (innodb_compression_default=ON). […]
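A small MariaDB sketch for narrowing the problem down: check the server-wide compression settings, then toggle page compression per table so tables on the HDD can be tested separately from those on the SSD (table names are placeholders):

    -- Server-wide settings (MariaDB 10.1+): algorithm, level, default.
    SHOW GLOBAL VARIABLES LIKE 'innodb_compression_%';

    -- Page compression is also a per-table option.
    ALTER TABLE hot_table  PAGE_COMPRESSED = 0;
    ALTER TABLE cold_table PAGE_COMPRESSED = 1;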
Do SQL Server compressed indexes remain compressed on rebuild without specifying data compression?
Question : After one rebuilds their SQL Server indexes using page compression (ALTER INDEX IX1 REBUILD PARTITION = ALL WITH […]
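One way to check rather than guess: sys.partitions records the compression state per partition, so the same query can be run before and after the rebuild (IX1 is the index name from the question):

    SELECT i.name AS index_name,
           p.partition_number,
           p.data_compression_desc       -- NONE / ROW / PAGE
    FROM   sys.indexes    AS i
    JOIN   sys.partitions AS p
           ON  p.object_id = i.object_id
           AND p.index_id  = i.index_id
    WHERE  i.name = N'IX1';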
Which Database to choose: MySQL or Oracle
Question : I have an application generating time series data (32 channels, 22KHz, 6 secs) once a minute. These have […]
Do I gain read performance improvement by using zlib rather than snappy compression in MongoDB?
Question : My current storage engine is WiredTiger and its compression is left at the default, snappy. I've come across MongoDB […]