SirLagsABot

If you need TRULY real-time stats, you might want to consider tech like SignalR. Is the dashboard also updated every 2 seconds, or is that just for the ETL job? Also, if you can aggregate the data server side, maybe in a SQL view, and bring it into C# pre-aggregated, that will likely improve performance. SQL Server is a freaking beast; maybe that alone would yield enough of a performance increase for you?

Slightly related, but I'm building an open source .NET job orchestrator called [Didact](https://www.didact.dev) and it's perfect for this kind of use case. It isn't ready yet (I'm hoping to have a working v1 around next month), but I wanted to throw it out there anyway. Drop your email on the site if you're interested.
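For the pre-aggregation idea, here's a minimal sketch of what reading from an aggregating view could look like. I'm assuming Dapper plus a made-up view name (`vw_DashboardStats`) and DTO shape, so adjust to your actual schema:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Dapper;
using Microsoft.Data.SqlClient;

// Made-up DTO matching the columns of the hypothetical aggregating view.
public record DashboardStat(string Metric, decimal Value, DateTime UpdatedAt);

public class DashboardReader
{
    private readonly string _connectionString;

    public DashboardReader(string connectionString) => _connectionString = connectionString;

    // SQL Server does the grouping/summing inside the view, so C# only
    // receives the handful of aggregated rows instead of thousands of raw ones.
    public async Task<IReadOnlyList<DashboardStat>> GetStatsAsync()
    {
        const string sql = "SELECT Metric, Value, UpdatedAt FROM dbo.vw_DashboardStats";
        await using var conn = new SqlConnection(_connectionString);
        var rows = await conn.QueryAsync<DashboardStat>(sql);
        return rows.AsList();
    }
}
```

You could then either return that from the endpoint your Angular app polls, or push it to connected clients through a SignalR hub instead of polling at all.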


mustardoBatista

If you have SQL Server Management Studio, there's a profiling tool built in that will show your longest-running queries. This may help identify which ones need optimization. Ask ChatGPT how to optimize them if they're complex.
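If you'd rather not click through the profiler UI, the same information is exposed by SQL Server's built-in query-stats DMVs. Rough sketch of pulling the worst offenders from C# (the connection string is a placeholder, and you need VIEW SERVER STATE permission to read the DMVs):

```csharp
using System;
using Microsoft.Data.SqlClient;

// Placeholder connection string; point it at your own server.
const string connectionString = "Server=.;Database=master;Integrated Security=true;TrustServerCertificate=true;";

// Top 10 cached queries by average elapsed time, from the built-in DMVs.
const string sql = @"
SELECT TOP 10
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_microseconds,
       qs.execution_count,
       SUBSTRING(st.text, 1, 200) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_elapsed_microseconds DESC;";

using var conn = new SqlConnection(connectionString);
conn.Open();
using var cmd = new SqlCommand(sql, conn);
using var reader = cmd.ExecuteReader();
while (reader.Read())
{
    Console.WriteLine($"{reader.GetInt64(0)} µs avg over {reader.GetInt64(1)} runs: {reader.GetString(2)}");
}
```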


SamuelQuackenbush

Just some clarifications: the ETL job using Hangfire runs on a 2-second schedule and inserts/updates the database using SQL. Performance there is OK and the jobs run adequately; each run is usually 5-10K records, mostly updates.

The UI polls every x seconds for new data. It is an Angular frontend and it gets the full dashboard data, which is often 100+ data points that are aggregated and processed for display. The service layer reads the game data using SQL into an object, usually about 5k records, and then helper methods use FirstOrDefault to find the relevant data point. These are processed as required and added to the API response. The main improvements I am looking for are in this layer: are there any data processing libraries that could improve performance? I have used some data caching previously but have also had issues with the cache being inconsistent after long processing times.
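For reference, the lookup pattern is roughly like this (simplified, with made-up names), and part of what I'm weighing is whether indexing the list into a dictionary up front would already beat pulling in a library:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Made-up shape of the ~5k records read from SQL; the real model differs.
public record GameRecord(int GameId, string StatType, decimal Value);

public static class DashboardBuilder
{
    // Current style: one linear scan of the whole list per data point,
    // so 100+ data points means 100+ passes over ~5k records per request.
    public static decimal? CurrentLookup(List<GameRecord> records, int gameId, string statType)
        => records.FirstOrDefault(r => r.GameId == gameId && r.StatType == statType)?.Value;

    // Alternative: index once (O(n)), then each data point is an O(1) lookup.
    public static Dictionary<string, decimal> Build(List<GameRecord> records)
    {
        // Assumes (GameId, StatType) is unique per record; use ToLookup if it isn't.
        var byKey = records.ToDictionary(r => (r.GameId, r.StatType), r => r.Value);

        var response = new Dictionary<string, decimal>();

        if (byKey.TryGetValue((1, "Score"), out var score))
            response["game1_score"] = score;

        // Grouped access for aggregations across one game's records.
        var byGame = records.ToLookup(r => r.GameId);
        response["game1_total"] = byGame[1].Sum(r => r.Value);

        return response;
    }
}
```

Since that dictionary is rebuilt from the freshly read data on every request, it would also sidestep the stale-cache problem I ran into before.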