The earlier listings didn't give us much to go on, but now, according to Avalanche Software, the full Hogwarts Legacy PC specs are out:
Minimum:
- OS: 64-bit Windows 10
- CPU: Intel Core i5-6600 (3.3 GHz) or AMD Ryzen 5 1400 (3.2 GHz)
- RAM: 16GB
- GPU: NVIDIA GeForce GTX 960 4GB or AMD Radeon RX 470 4GB
- DirectX Version: DX 12
- Storage: 85GB HDD
- Notes: SSD (Preferred), HDD (Supported), 720p/30 FPS, Low-Quality Settings
Recommended/High Specs:
- OS: 64-bit Windows 10
- CPU: Intel Core i7-8700 (3.2 GHz) or AMD Ryzen 5 3600 (3.6 GHz)
- RAM: 16GB
- GPU: NVIDIA GeForce GTX 1080 Ti, AMD Radeon RX 5700 XT, or Intel Arc A770
- DirectX Version: DX 12
- Storage: 85GB SSD
- Notes: SSD, 1080p/60 FPS, High-Quality Settings
Ultra Specs:
- OS: 64-bit Windows 10
- CPU: Intel Core i7-10700K (3.80 GHz) or AMD Ryzen 7 5800X (3.80 GHz)
- RAM: 32GB
- GPU: NVIDIA GeForce RTX 2080 Ti or AMD Radeon RX 6800 XT
- DirectX Version: DX 12
- Storage: 85GB SSD
- Notes: SSD, 1440p/60 FPS, Ultra-Quality Settings
Ultra 4K Specs:
- OS: 64-bit Windows 10
- CPU: Intel Core i7-10700K (3.80 GHz) or AMD Ryzen 7 5800X (3.80 GHz)
- RAM: 32GB
- GPU: NVIDIA GeForce RTX 3090 Ti or AMD Radeon RX 7900 XT
- DirectX Version: DX 12
- Storage: 85GB SSD
- Notes: SSD, 2160p/60 FPS, Ultra-Quality Settings
32GB of RAM is once again proving to be the norm over 16GB, as we saw with Returnal's specs. I feel that if Hogwarts Legacy had used UE5 with its software DX12 RT (like in the recent Fortnite update) instead of UE4, we'd have seen much better RT performance on AMD GPUs as well... maybe that could come in a later update?
But overall this should be a great-looking title, the first major Harry Potter AAA game with real wow factor, and I hope we get shader caching to keep the experience smooth rather than on-the-fly rendering pop-ins. I think the RTX 3090 Ti recommendation for 4K is accurate, and the same goes for the Radeon RX 7900 XT, since it can handle RT but isn't as strong at it as the RTX 3000 series.
(I'm only speculating; Hogwarts Legacy might have RT, given that they're asking for such beefy recommended GPUs.)
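For context on what "hardware RT" means at the API level: a DX12 title can query the GPU's DXR (DirectX Raytracing) tier before enabling ray-traced effects. The snippet below is only a minimal sketch of that kind of check, not code from Avalanche Software or the game itself.

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a D3D12 device on the default adapter (feature level 12_0).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No DX12-capable GPU found.");
        return 1;
    }

    // OPTIONS5 reports the hardware DXR (ray tracing) tier, if any.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5)))) {
        std::puts("Feature query failed.");
        return 1;
    }

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1)
        std::puts("Hardware ray tracing supported: DXR Tier 1.1");
    else if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::puts("Hardware ray tracing supported: DXR Tier 1.0");
    else
        std::puts("No hardware ray tracing; an engine would fall back to software RT or skip RT effects.");
    return 0;
}
```

A GTX 960 would report no DXR tier here, while the RTX and RX 6000/7000 cards in the upper tiers would, which is why the RT question only really matters for the recommended-and-up specs.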