Question : The following is a paragraph from Microsoft Docs: New pages allocated in a heap as part of DML […]
Tag: compression
I wish to restore a gzipped tarball – what does mongodump --archive do? How to do a mongorestore with --gzip?
Question : I created a backup with mongodump -h $RANDOM_SECONDARY_SUP -u $BACKUP_USER -p $BACKUP_PASSWD --out /data/$BACKUP_USER/sup-repl/sup_$DUMPFILE tar cvzf /data/$BACKUP_USER/dbt/st_$DUMPFILE.tar.gz /data/$BACKUP_USER/st-repl/st_$DUMPFILE […]
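A sketch of the two restore paths this question contrasts (hosts, credentials, and paths are illustrative placeholders, not from the original commands): a gzipped tarball of a `mongodump --out` directory must be extracted first, whereas `mongodump --archive --gzip` produces a single compressed stream that `mongorestore` reads back directly.

```shell
# Path 1: the dump was a directory (--out), later tarred and gzipped.
# Extract the tarball, then point mongorestore at the dump directory:
tar xzf /data/backup/st_dump.tar.gz -C /tmp/restore
mongorestore -h somehost -u someuser -p somepass /tmp/restore/data/backup/st-repl/st_dump

# Path 2: --archive --gzip writes one gzip-compressed archive stream;
# mongorestore takes the same two flags to read it back:
mongodump --archive=dump.gz --gzip
mongorestore --archive=dump.gz --gzip
```

Note that `--gzip` alone (without `--archive`) compresses the individual BSON/metadata files inside the `--out` directory, which is a different layout from a hand-made `tar czf` of an uncompressed dump.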
How does SQL Server calculate the initial size of a (compressed) backup?
Question : When a backup is created, SQL Server guesses(?) the size of the initial backup file. Later on, […]
Does Windows Server File Compression slow Oracle DB performance?
Question : I’m starting a new contract and I get to troubleshoot their Oracle instance… I’m more apps […]
Postgres: Is migrating old records to separate table and creating new aggregated records a good strategy to improve search speed?
Question : I have a table with 1M+ records. New records are created and updated each day each time an […]
backup using gzip slow
Question : We are currently backing up some schemas in Postgres using this command: pg_dump -h localhost -n test_schema mydb […]
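Two common ways to speed up a gzip-bottlenecked `pg_dump`, sketched with the schema and database names from the excerpt (the output file names are illustrative): lower the compression level, or let `pg_dump`'s custom format compress inline.

```shell
# Plain-format dump piped through gzip at the fastest level (-1);
# the default level 6 is often the bottleneck, not pg_dump itself:
pg_dump -h localhost -n test_schema mydb | gzip -1 > test_schema.sql.gz

# Custom-format dump with its built-in compression set to level 1:
pg_dump -h localhost -n test_schema -Fc -Z 1 -f test_schema.dump mydb
```

If a single gzip process still saturates one core, a parallel compressor such as `pigz` can replace `gzip` in the pipeline with the same command-line shape.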
Are toast columns compressed also in shared_buffers?
Question : Reading here I cannot find a clear answer: http://www.postgresql.org/docs/9.1/static/storage-toast.html I need to know if setting storage to EXTERNAL […]
Can I bulk insert into an empty page-compressed table and get full compression?
Question : I have a lot of large tables (around 10 million wide rows) which need to be regularly loaded […]
GZip existing varbinary(max) column in Sql Server 2008+
Question : I have an existing legacy table that is ~180GB in size due to a client application storing PDF […]