Athawolfus' forum posts

Thanks. It's certainly not the fan noise, though. Anyway, the whine is not that bad either, and it's kinda late to return it now. Funny thing is, I used to think coil whine only happened when the GPU is pushing out thousands of frames per second, like on a menu screen, yet I often don't get any whine on menus and instead get it in-game when the GPU is outputting 100 frames or so.
Hello,
I have a distinct "coil whine" problem with my PC. It only happens in games, and it's a noticeable, high-pitched "whining" sound. I know this often happens when the frame rate is very high, but it doesn't just happen in menus or loading screens: it can also happen when a game is lagging (like when it stops for a second) or while I'm changing the camera angle in a strategy game. There's also a different "wind tunnel" type of noise that happens routinely in most games, for example in GTA V. I have a GTX 970.
Now I can't even be sure whether it's the CPU, the GPU or the PSU, but most people say it should be the GPU. For some reason, some forum posts claim it can be fixed by disabling CPU power-saving features like C1E in the BIOS, but I don't understand how that's related. To isolate the problem, I used the "stress CPU" option in CPU-Z and there was no coil whine. But when I fired up 3DMark there was no whine either, even though it DOES happen in games at similar framerates, which is weird.
What do you think? Is it the GPU? Is it a major problem other than the sound? How would disabling C1E fix this? Thanks.
EDIT: I actually got that a bit wrong. It's more of a buzzing than a high-pitched whine, and it actually disappears momentarily when the game lags, which, as I said, can happen when I'm changing the camera angle or right after loading screens. That probably also points to the GPU. The buzzing can show up at any time, even when the framerate is around 120, not just when it goes way up like in menus. The "wind tunnel" sound does line up with heavy load in games, though.
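To pin down whether the buzz follows the GPU, I might try logging the card while I reproduce it and then line the timestamps up with when the buzzing starts and stops. This is just a rough Python sketch, assuming the driver's nvidia-smi tool is on the PATH and the card reports these query fields (names taken from nvidia-smi --help-query-gpu):

    # Rough logging sketch: poll nvidia-smi once per second and record clock,
    # load, power and temperature, so the buzzing can be matched against GPU
    # activity afterwards. Assumes nvidia-smi ships with the driver and that
    # the card/driver actually expose these fields.
    import csv, subprocess, time

    FIELDS = "timestamp,clocks.gr,utilization.gpu,power.draw,temperature.gpu"

    with open("gpu_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(FIELDS.split(","))
        while True:
            out = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
                text=True,
            )
            writer.writerow([col.strip() for col in out.strip().split(",")])
            f.flush()
            time.sleep(1)

If the clock and power numbers jump exactly when the buzz does while the CPU side stays quiet, that would at least point at the card (or the PSU feeding it) rather than the CPU.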
Thanks. So is it better for me in terms of performance etc. to prevent the hard drive from going to sleep? Would it cause any problems?
Hey there,
I recently got a new 1TB Seagate hard drive to supplement my first one and plugged it in. It was recognized fine and works, but there's one thing bugging me:
It seems that sometimes the second hard drive goes "inactive": its noise dies down after a click, and when I then do something with it, even something as simple as creating a new folder, the PC freezes for a moment and the drive comes back to life with another click.
Is there a way to disable this feature? Would it cause problems if I did? More importantly, if it's left on, will it affect performance? I got the new drive mostly to put my games on it, so this has me worried about performance.
I'm not very familiar with newer hard drives, so I don't know if this is a regular, common feature nowadays. I guess it's there for power-saving and longevity purposes, but I'm not too happy with it.
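From what I can tell, this spin-down timer is usually the Windows power plan's "Turn off hard disk after" setting, so here's a rough sketch of how I'd set it to "never" from a script. The powercfg command is the standard Windows tool; the Python around it is just a wrapper, and it won't help if the spin-down actually comes from the drive's own firmware power management:

    # Sketch: set the active power plan's disk idle timeout to 0 ("never"),
    # which should stop Windows from spinning the drive down. Assumes Windows
    # with the built-in powercfg tool.
    import subprocess

    def set_disk_idle_timeout(minutes: int) -> None:
        # 0 minutes = never turn off the disk; the -dc value only matters on battery
        subprocess.run(["powercfg", "/change", "disk-timeout-ac", str(minutes)], check=True)
        subprocess.run(["powercfg", "/change", "disk-timeout-dc", str(minutes)], check=True)

    set_disk_idle_timeout(0)

The same setting is also reachable through Control Panel > Power Options > Change plan settings > Change advanced power settings > Hard disk, if that's easier.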
Thanks. So I will switch to the HDMI connection. I guess the compatibility issues may be down to small scaling errors. I'll try it and see.
It seems they didn't bother with a DVI connection when the HDMI already provides a digital input.
Now, on a slightly unrelated note: when you say it's pretty low-quality, I'm aware it's not an expensive monitor, but does it being a budget monitor actually affect picture quality? I'm pretty happy with the resolution and the contrast ratio, which is comparable to Samsung monitors. Or is the "low quality" more about it being less reliable over the long term, less durable, drawing more power, etc.? Don't make me unhappy about my new monitor :D
Hey there,
I have an issue with my new LG E2251 monitor. It has HDMI and D-Sub ports, while my GTX 570 graphics card has HDMI and DVI outputs. My 4-year-old monitor had a DVI-D connection, so I was surprised not to find one on the new monitor. For now, I've connected the monitor to the graphics card over D-Sub, using a D-Sub-to-DVI adapter on the card side, and the image quality seems good enough (1920x1080 resolution).
The question is, will I get better quality and reliability if I just use an HDMI cable? I don't care that HDMI can also carry audio; I only care about image quality. The monitor's user guide says I could encounter "compatibility issues" when using the HDMI connection, so I'm not sure.
Also, does anyone know why a brand-new monitor would not carry a DVI-D connection while an older one does? It seems counter-intuitive to me.
Thanks.
Hey guys,
I bought a new monitor a few days ago. It runs at 1920x1080, and compared to my old 1280x1024 monitor it caused a noticeable FPS drop in some games. In Skyrim especially, I get sudden FPS drops that persist for a few seconds, plus stuttering.
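(For context on why some drop makes sense: 1280 x 1024 is about 1.31 million pixels per frame, while 1920 x 1080 is about 2.07 million, so the card now has to push roughly 58% more pixels every frame.)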
So, in the Nvidia control panel, I went into the Power Management setting for that particular program (TESV.exe) and switched it from Adaptive to Maximum Performance, and I also gave the program the highest priority in the Task Manager. The problem seems to be largely solved, with a steady framerate around 60 FPS.
The question is, would this cause any issues for my GTX 570 in the long run? The card should have switched to this performance level on its own, but for some reason it doesn't, I guess. The temperature in MSI Afterburner did not exceed 66°C, which is around what it would normally be with the Adaptive setting, but I'm still wondering whether keeping the card at its maximum clock speeds could cause any problems beyond temperature. If not, I'll use the same approach for other GPU-intensive games with stuttering and lag too.
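On the priority part: whatever I set in Task Manager resets when the game restarts, so I might script it instead. A rough sketch, assuming the third-party psutil package is installed and using the TESV.exe name from above:

    # Sketch: find the running Skyrim process by name and raise its priority
    # to "High" (same as doing it in Task Manager). Assumes Windows and psutil;
    # needs to be re-run each time the game is restarted.
    import psutil

    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == "tesv.exe":
            proc.nice(psutil.HIGH_PRIORITY_CLASS)  # "High", not "Realtime"
            print("Raised priority of PID", proc.pid)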
Thanks.
What's LLC? I don't think I noticed it in the BIOS.
So it's okay if the core voltage is actually lower than the VID, even under load, as long as it's stable. I ran IntelBurnTest with 10 runs on Standard and got around 62°C max on both cores (stock cooling).
I might try to go up to 3.6 GHz later, as 3.2 GHz isn't really making much of a difference. Do you think I'll get there with my current vcore reading of 1.18 V, or should I push it up a little?
I set the FSB:RAM ratio to 1:1 and the CPU is running at 3.2 GHz now. The thing I haven't fully figured out is the voltage. CoreTemp gives the VID as 1.225 V, so I set "CPU Voltage" to that in the BIOS. But CPU-Z now shows the vcore as around 1.18 V, lower than the value I set, even though SpeedStep etc. are disabled. Is that normal? And can I set the voltage to a value lower than the VID, which I understand to be the CPU's "normal" voltage?
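(To spell out the math I'm assuming, in case I've got it backwards: core clock = FSB clock x multiplier, so for example an 8x multiplier needs a 400 MHz FSB to reach 3.2 GHz, and with the 1:1 ratio the memory clock simply matches the FSB, i.e. DDR2 at 400 MHz, or 800 MT/s effective. My chip's actual multiplier may differ; that part is just an example.)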