I’m the developer, not the DBA. I don’t know much about setting up databases, but I’m concerned about the setup at my current workplace.
The server is installed on desktop PC hardware (i7, 16 GB RAM, 1 TB SATA disk) running Windows Server 2003 (yup, that’s correct).
The system running on it has 25 users (most likely increasing to 30 soon), each issuing (2 selects, 2 updates, 1 insert) at least twice every minute. The table isn’t that big yet either, just reaching 2 million rows with 21 columns. I’ve checked every query, and my indexes work fine.
As with all things SQL Server, the answer is “it depends”. A good database developer can design an application to mitigate SQL Server hardware requirements through attention to database design, index/query tuning, and application design (e.g. appropriate data caching). It seems you may have that part covered if you see index seeks rather than scans in the execution plans, with the actual number of rows being the minimum needed.
Whether or not your hardware is up to the task will depend largely on your response time service level agreements (SLAs). Doing the math at 30 users running the workload twice a minute, the server will need to execute 120 selects, 120 updates, and 60 inserts per minute. This comes to 5 queries per second, which my gut says can achieve sub-second response time on the PC-class hardware you mentioned, especially if the entire database fits in RAM (which it probably can, based on your description) and you are accessing a small number of rows per query.
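For anyone who wants to redo that back-of-the-envelope math with their own numbers, here is a minimal sketch of it (assuming, as above, 30 users each running 2 selects, 2 updates, and 1 insert twice per minute):

```python
# Back-of-the-envelope throughput estimate for the workload described above.
# All figures are assumptions taken from the question: adjust for your own system.
users = 30                 # expected user count
runs_per_minute = 2        # each user runs the batch at least twice a minute
selects_per_run = 2
updates_per_run = 2
inserts_per_run = 1

selects_per_min = users * runs_per_minute * selects_per_run   # 120
updates_per_min = users * runs_per_minute * updates_per_run   # 120
inserts_per_min = users * runs_per_minute * inserts_per_run   # 60

total_per_min = selects_per_min + updates_per_min + inserts_per_min
queries_per_second = total_per_min / 60

print(f"{total_per_min} queries/min = {queries_per_second} queries/sec")
# → 300 queries/min = 5.0 queries/sec
```

Note this is an average rate; peak load can be higher if users tend to act at the same time, so leave headroom above the 5 queries/second figure.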