How to speed up queries for 2M-row datasets in PHP & MySQL

Question:

We have a SaaS application site with over 2M records in one table.

Server version: 8.0.19 - MySQL Community Server - GPL
Protocol version: 10

Even with indexes, these queries are slow.

My questions:

  1. Our database query response is very slow on our PHP user-table page; it takes more than 4-5 minutes to fetch the data. How can we improve the response time?
  2. Our login page also responds slowly when authenticating a user. How can we improve its performance?
  3. Isn't our server good enough? Here are our specs: Dedicated Server XL6, managed; CPU: AMD Hexa-Core, 6 cores x 2.8 GHz (3.3 GHz Turbo Core); RAM: 16 GB DDR3 ECC; HDD: 1,000 GB (2 x 1,000 GB SATA).

  4. Would separating this table/data onto another database, on a server with enough RAM to hold the data in memory, speed up these queries? Is there anything about the way the tables/indexes are set up that we can improve to make these queries faster?

Table Structure

MySQL Table Information

Data    102.9   MiB
Index   10.2    MiB
Overhead    380 B
Effective   113.1   MiB
Total   113.1   MiB

Row statistics
Format  dynamic
Collation   latin1_swedish_ci
Rows    1,034,964
Row length  104 B
Row size    115 B

Answer:

OR is un-optimizable: MySQL cannot use an index to resolve an OR across two different columns, so this query ends up scanning the whole table.

SELECT *
  FROM user_tbl
 WHERE ( EMAIL_ID = '$username'
      OR MOBILE   = '$username' )
   AND PWD = '$pwd';

Instead, split it into two separate SELECTs, each of which can use its own index:

SELECT *
  FROM user_tbl
 WHERE EMAIL_ID = '$username'
   AND PWD = '$pwd';

SELECT *
  FROM user_tbl
 WHERE MOBILE = '$username'
   AND PWD = '$pwd';
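The two lookups can also be combined into a single round trip with UNION, where each branch is free to use its own index (a sketch using the same columns, with placeholders instead of interpolated variables):

```sql
SELECT * FROM user_tbl WHERE EMAIL_ID = ? AND PWD = ?
UNION
SELECT * FROM user_tbl WHERE MOBILE = ? AND PWD = ?;
```

Binding `?` placeholders via a prepared statement, rather than interpolating `'$username'` into the SQL string, also closes the SQL-injection hole in the original query.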

and have these two separate indexes:


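The index definitions were left blank in the answer; a plausible pair, assuming the column names from the queries above, would be composite indexes that cover both the lookup column and PWD:

```sql
ALTER TABLE user_tbl
  ADD INDEX idx_email_pwd  (EMAIL_ID, PWD),
  ADD INDEX idx_mobile_pwd (MOBILE, PWD);
```

With these, each SELECT can be satisfied by a single index range scan instead of a full table scan.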
Meanwhile, a bare SELECT * FROM user_tbl cannot be sped up; no index helps a query that reads every row. Do you really need to show all 1,034,964 rows and all the columns? Elaborate! (See Phill's comment.)
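If the user-table page only needs a screenful of rows at a time, keyset pagination avoids pulling the whole table; a sketch, assuming an auto-increment primary key named ID and listing only the columns actually displayed:

```sql
SELECT ID, EMAIL_ID, MOBILE
  FROM user_tbl
 WHERE ID > ?          -- last ID seen on the previous page
 ORDER BY ID
 LIMIT 50;
```

Unlike OFFSET-based paging, this stays fast on later pages because MySQL seeks directly to the last seen ID instead of skipping over all earlier rows.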
