Question :
I've been tasked with checking whether MySQL can be used for large data warehousing.
The requirements are as follows:
- the table contains 83 columns
- each row's payload is 1272 bytes
- the database must handle 150,000 inserts per second of the above rows
So far I haven't been able to get past 20K inserts/second using InnoDB, even with the data files on a RAM disk. Current settings:
tmp_table_size = 320M
max_heap_table_size = 320M
query_cache_type = 0
query_cache_size = 0
max_connections = 500
thread_cache_size = 50
open_files_limit = 65535
table_definition_cache = 1024
table_open_cache = 2048
innodb_write_io_threads = 8
innodb_read_io_threads = 8
innodb_thread_concurrency = 12
innodb_flush_method = O_DIRECT
innodb_log_files_in_group = 2
innodb_log_file_size = 512M
innodb_flush_log_at_trx_commit = 1
innodb_file_per_table = ON
innodb_buffer_pool_size = 20G
innodb_buffer_pool_instances = 12
innodb_log_buffer_size = 10M
Answer :
For InnoDB on a RAM disk:
- innodb_flush_log_at_trx_commit = 2 (on a RAM disk, the per-commit flush of =1 buys no extra durability)
- innodb_doublewrite = OFF
- batch the inserts (100-1000 rows per transaction)
- possibly other settings
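Batching matters because each COMMIT pays a fixed per-transaction cost (flush, lock release, binlog write); spreading that cost over hundreds of rows is often the single biggest win. A minimal sketch of the idea, using Python's sqlite3 module as a stand-in for a MySQL connector (table name, columns, and batch size are made up for illustration):

```python
import sqlite3

def batched_insert(conn, rows, batch_size=500):
    """Insert rows with one COMMIT per batch instead of one per row.

    With batch_size=500, a million rows cost ~2000 commits
    instead of a million.
    """
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        # executemany loops one prepared statement over the batch;
        # a real MySQL client would build one multi-row INSERT.
        cur.executemany("INSERT INTO t (a, b) VALUES (?, ?)", batch)
        conn.commit()  # one transaction per batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b TEXT)")
batched_insert(conn, [(i, str(i)) for i in range(1234)])
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 1234
```

With a MySQL connector the same loop applies; the extra step worth taking there is concatenating each batch into a single multi-row `INSERT ... VALUES (...),(...),...` statement, which also amortizes parse and network round-trip costs.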
Better yet, use ENGINE=MEMORY instead of InnoDB. Either way, the data will be lost in a power failure.
If you need high-speed, multi-threaded ingestion, see http://mysql.rjweb.org/doc.php/staging_table
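The staging-table pattern linked above amounts to: writers insert into a small, lightly-indexed staging table, and a background job periodically moves the accumulated batch into the real table (in MySQL, via an atomic `RENAME TABLE` swap so writers never block). A rough sketch of the flush step, again with sqlite3 standing in and hypothetical table names; the `INSERT ... SELECT` here substitutes for MySQL's rename-and-swap:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE main (id INTEGER, payload TEXT);
    CREATE TABLE staging (id INTEGER, payload TEXT);
""")

def ingest(conn, rows):
    # Fast path: writers touch only the small staging table.
    conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
    conn.commit()

def flush_staging(conn):
    """Move the staged batch into main, then empty the staging table.

    In MySQL the staging table is swapped out atomically
    (RENAME TABLE staging TO flip, staging_new TO staging)
    and 'flip' is processed offline, so ingestion never stalls.
    """
    conn.execute("INSERT INTO main SELECT * FROM staging")
    conn.execute("DELETE FROM staging")
    conn.commit()

ingest(conn, [(i, "x") for i in range(10)])
flush_staging(conn)
print(conn.execute("SELECT COUNT(*) FROM main").fetchone()[0])  # 10
```

The point of the swap is that the bulk move into the big, fully-indexed table happens in one large transaction off the hot path, while concurrent writers keep hitting a fresh, empty staging table.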