|
I want a way to switch between a 1680x1050 monitor and 1080p tv, but have them both use a single PC output. I only need one to work at a time.
Splitters seem to just want to duplicate the signal but I think that'd be problematic given two different resolutions. Switches seem to want multiple inputs into one output, while I want to do one input to two outputs (toggled, not shared).
What do I need for this?
|
What output? HDMI?
I think if you only want one device on at once, a passive splitter should work. I've never tried it and don't know much about what goes on in terms of handshaking and signalling (if there's a problem if connected to a device that's also powered off, that kind of thing).
http://www.monoprice.com/Category?c_id=101&cp_id=10113&cs_id=1011303
A push-button selector switch should work too, I think, even if it's labeled for two inputs and one output. AFAIK it's just connecting copper. If it's not, it probably won't work.
|
Yeah, most likely HDMI. The TV has HDMI and the monitor in question would have VGA/DVI-D, but I think I could throw a passive adapter at it here and be fine (HDMI -> DVI).
Ideally I want some kind of one click / push where it both switches output device (monitor <--> tv) and resolution simultaneously. If it's something I could control through the PC I can probably program that myself, but I expect it mostly to be something you physically do on the switch itself.
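If you do end up controlling it from the PC on Windows, you may not even need to script the resolution change: `DisplaySwitch.exe` (ships with Windows 7 and later) switches which output is active, and each display then comes up in its own configured mode. A minimal sketch of a toggle script (the "monitor"/"tv" names are just labels I made up for this example):

```python
# Toggle between two display outputs on Windows using DisplaySwitch.exe,
# which ships with Windows 7+. /internal = first output only,
# /external = second output only.
import subprocess

MODES = {
    "monitor": "/internal",  # primary display only
    "tv": "/external",       # secondary display only
}

def switch_cmd(target: str) -> list:
    """Build the DisplaySwitch command line for the given target."""
    if target not in MODES:
        raise ValueError("unknown target: " + target)
    return ["DisplaySwitch.exe", MODES[target]]

def switch(target: str) -> None:
    """Actually switch the active display (Windows only)."""
    subprocess.run(switch_cmd(target), check=True)

if __name__ == "__main__":
    import sys
    switch(sys.argv[1] if len(sys.argv) > 1 else "monitor")
```

Bind the two calls to hotkeys (or a desktop shortcut each) and you have your one-click toggle without any extra hardware in the video path.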
|
Is it generally safe to plug a good cpu into a questionable mobo?
|
So my motherboard has a PCI-E 2.0 x16 slot (GIGABYTE GA-P67A-UD3-B3). I was thinking about upgrading my video card.
Does a PCI-E 3.0 card work in a PCI-E 2.0 slot? And if it does, are there any repercussions?
And performance wise, is the GTX 770 worth shelling out extra $$ versus the GTX 760?
My current rig: Intel i5-2500k 8GB RAM GTX 560 (not the TI)
|
I use a TV connected to my computer via HDMI as a second monitor, but I only turn it on to watch stuff with a group of people. I've just recently found out that having it connected and using it as a duplicate display (even when the TV isn't on) significantly lowers the framerate of my primary monitor. I've used 2 different monitors and both of them have had their framerate lowered by having the TV connected. Is this just a shortcoming of my graphics card? Is there a way to keep the framerate of my primary monitor up while having the TV connected?
My computer specs are: Intel i5-3350P 3.10GHz 8GB RAM AMD Radeon HD 7570
|
On March 11 2014 12:27 minitelemaster wrote: Is it generally safe to plug a good cpu into a questionable mobo? Usually. Unless the motherboard CPU power VRMs failed catastrophically, I don't really see how there would be a problem.
On March 12 2014 00:17 MiyaviTeddy wrote: So my motherboard has a PCI-E 2.0 x 16 slot (GIGABYTE GA-P67A-UD3-B3). I was thinking about upgrading my video card .
Does a PCI-E 3.0 card work on a PCI-E 2.0 slot? and if it does, are there any repercussions?
And performance wise, is the GTX 770 worth shelling out extra $$ versus the GTX 760?
My current rig: Intel i5-2500k 8GB RAM GTX 560 (not the TI) You get PCIe 2.0 transfer rates instead of PCIe 3.0. For gaming and these kinds of graphics cards, this results in a performance loss of... well, somewhere around the margin of testing error. Very slight.
Both upgrades would be kind of reasonable in general. GTX 770 isn't priced in the territory where generally it's warned that you're getting heavily ripped off, but it's not really the price/performance sweet spot either. So it depends on games, settings, expected fps, etc.
Note that GTX 7xx is just one process shrink and one architectural change over your GTX 560. Later this year there will be another process shrink and new architecture, which promises much higher power efficiency and maybe some other improvements. I don't expect the price/performance to be better on launch, though. If you always wait for the next big thing, you'll never get anything. Just be aware of what's coming. Even GTX 760 should be a significant upgrade now.
On March 12 2014 02:16 wongfeihung wrote: I use a TV connected to my computer via HDMI as a second monitor, but I only turn it on to watch stuff with a group of people. I've just recently found out that having it connected and using it as a duplicate display (even when the TV isn't on) significantly lowers the framerate of my primary monitor. I've used 2 different monitors and both of them have had their framerate lowered by having the TV connected. Is this just a shortcoming of my graphics card? Is there a way to keep the framerate of my primary monitor up while having the TV connected?
My computer specs are: Intel i5-3350P 3.10GHz 8GB RAM AMD Radeon HD 7570 What's the framerate of the primary monitor and the TV? How are you determining that the framerate is lower? And you mean the actual refreshes per second and not say some game's fps?
|
On March 12 2014 03:14 Myrmidon wrote: On March 12 2014 02:16 wongfeihung wrote: I use a TV connected to my computer via HDMI as a second monitor, but I only turn it on to watch stuff with a group of people. I've just recently found out that having it connected and using it as a duplicate display (even when the TV isn't on) significantly lowers the framerate of my primary monitor. I've used 2 different monitors and both of them have had their framerate lowered by having the TV connected. Is this just a shortcoming of my graphics card? Is there a way to keep the framerate of my primary monitor up while having the TV connected?
My computer specs are: Intel i5-3350P 3.10GHz 8GB RAM AMD Radeon HD 7570 What's the framerate of the primary monitor and the TV? How are you determining that the framerate is lower? And you mean the actual refreshes per second and not say some game's fps? Yes, I'm referring to the refresh rate. Sorry about that. The refresh rates of my current monitor and the TV are 60Hz. My old monitor's refresh rate is 30Hz. I can visibly see the difference in the refresh rate, as everything becomes much choppier and not nearly as smooth as seeing 60Hz. It's not choppy to the point of being unplayable, but enough to make it more difficult to play something like Starcraft 2. I don't know exactly how much the refresh rate is lowered.
Speaking of which, when I display the FPS in Starcraft 2 while playing under the lowered refresh rate, the counter still shows 100+ FPS, but it doesn't look anywhere near that smooth on screen.
|
So you just mean perceptually? Does it skip frames say here? http://www.testufo.com/#test=frameskipping
How's it look on the desktop? In other games?
It kind of sounds like cloning is forcing VSync, and that's just it.
|
On March 12 2014 03:14 Myrmidon wrote: On March 11 2014 12:27 minitelemaster wrote: Is it generally safe to plug a good cpu into a questionable mobo? Usually. Unless the motherboard CPU power VRMs failed catastrophically, I don't really see how there would be a problem. On March 12 2014 00:17 MiyaviTeddy wrote: So my motherboard has a PCI-E 2.0 x 16 slot (GIGABYTE GA-P67A-UD3-B3). I was thinking about upgrading my video card.
Does a PCI-E 3.0 card work on a PCI-E 2.0 slot? and if it does, are there any repercussions?
And performance wise, is the GTX 770 worth shelling out extra $$ versus the GTX 760?
My current rig: Intel i5-2500k 8GB RAM GTX 560 (not the TI) You get PCIe 2.0 transfer rates instead of PCIe 3.0. For gaming and these kinds of graphics cards, this results in a performance loss of... well, somewhere around the margin of testing error. Very slight. Both upgrades would be kind of reasonable in general. GTX 770 isn't priced in the territory where generally it's warned that you're getting heavily ripped off, but it's not really the price/performance sweet spot either. So it depends on games, settings, expected fps, etc. Note that GTX 7xx is just one process shrink and one architectural change over your GTX 560. Later this year there will be another process shrink and new architecture, which promises much higher power efficiency and maybe some other improvements. I don't expect the price/performance to be better on launch, though. If you always wait for the next big thing, you'll never get anything. Just be aware of what's coming. Even GTX 760 should be a significant upgrade now. On March 12 2014 02:16 wongfeihung wrote: I use a TV connected to my computer via HDMI as a second monitor, but I only turn it on to watch stuff with a group of people. I've just recently found out that having it connected and using it as a duplicate display (even when the TV isn't on) significantly lowers the framerate of my primary monitor. I've used 2 different monitors and both of them have had their framerate lowered by having the TV connected. Is this just a shortcoming of my graphics card? Is there a way to keep the framerate of my primary monitor up while having the TV connected?
My computer specs are: Intel i5-3350P 3.10GHz 8GB RAM AMD Radeon HD 7570 What's the framerate of the primary monitor and the TV? How are you determining that the framerate is lower? And you mean the actual refreshes per second and not say some game's fps?
Sweet. There is one last thing that I'm unsure on:
There are always multiple brands for a card: EVGA, Gigabyte, Zotac, Asus, etc.
The Zotac GTX 760 is $50 cheaper than most of the other brands available to me, but what's the difference?
EDIT: The difference in core clock and boost core is 150MHz and 100MHz respectively.
|
The other brands use higher-grade hamster feed.
|
On March 12 2014 04:58 Myrmidon wrote: So you just mean perceptually? Does it skip frames say here? http://www.testufo.com/#test=frameskipping How's it look on the desktop? In other games? It kind of sounds like cloning is forcing VSync, and that's just it. Just tried it. It does skip frames on that link. It skips every other square.
It does look just as bad on the desktop; when I'm scrolling down a page, it's not as smooth as when the TV isn't connected. I'm not sure if it's as bad in other games, but I would guess that it isn't. For Diablo 3, it isn't as noticeable. Counter-Strike still plays optimally, though I haven't played it on the new monitor.
Now that you mention VSync, I've noticed some minor screen tearing when watching videos on Youtube or VLC when the TV is connected. It usually creates a horizontal divide on the bottom half of the monitor that flickers sometimes. On the TV, there is no screen tearing and the refresh rate is fine.
If VSync is the issue, is there a way to disable it? Or should I even disable it?
|
Cloning as I understand it forces both outputs to have identical specs (lowest common denominator gets chosen) so that makes sense.
I think it has to do with avoiding potential hardware damage trying to send more than what a device can handle.
|
Well, the thing is I didn't used to have this problem. It started around October after I installed Windows 8.1. Whether or not the problem stems from 8.1 is unknown to me. But it used to run fine, with the older 30Hz monitor and TV both connected at the same time; there were no refresh rate issues before October, and all of my games played perfectly.
|
On March 12 2014 00:17 MiyaviTeddy wrote: So my motherboard has a PCI-E 2.0 x 16 slot (GIGABYTE GA-P67A-UD3-B3). I was thinking about upgrading my video card .
Does a PCI-E 3.0 card work on a PCI-E 2.0 slot? and if it does, are there any repercussions?
And performance wise, is the GTX 770 worth shelling out extra $$ versus the GTX 760?
My current rig: Intel i5-2500k 8GB RAM GTX 560 (not the TI)
770 and 760 are both gk104:
gk104 without any parts disabled (680/770) has 8 SMX, 32 ROPs and a 256-bit memory bus
760 has 2 of those 8 SMX disabled, but doesn't lose anything else aside from VRAM clocks (it has worse memory, so loses about 10% bandwidth)
typically the gap is about 1.3x in favor of the 770, but it's in a more awkward place IMO as it can't really fall into the super high end segment due to being somewhat limited by memory bandwidth, memory amount and ROPs (the next step up, 780, has a 50% wider memory bus and 50% more ROPs, as well as 3 GB of VRAM over 2 GB, which is a safer bet for maxing games in the future)
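Those ratios can be sanity-checked with quick arithmetic. The memory clocks below are the reference specs as I remember them (roughly 7 GT/s effective on the 770, 6 GT/s on the 780), so treat the exact GB/s figures as approximate:

```python
# Back-of-the-envelope GPU memory bandwidth:
# bus width in bits / 8 (bits -> bytes) * effective memory rate in GT/s.
def mem_bandwidth_gbps(bus_bits: int, mem_gtps: float) -> float:
    return bus_bits / 8 * mem_gtps

gtx770 = mem_bandwidth_gbps(256, 7.0)  # 256-bit bus -> ~224 GB/s
gtx780 = mem_bandwidth_gbps(384, 6.0)  # 384-bit bus -> ~288 GB/s

# The shader-count ratio between 770 (8 SMX) and 760 (6 SMX)
# lines up with the ~1.3x performance gap quoted above.
smx_ratio = 8 / 6  # ~1.33
```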
x8 2.0 is fast enough; it's a marginal difference on high end cards, and you can benchmark twice and have the 2.0 outperform 3.0 by raw chance. If you're using a single GPU in an x16 slot, having 2.0 instead of 3.0 is not really something to worry about. I'd hang on and wait for a midrange Maxwell card though, to upgrade two gens (400/500, 600/700, 800+) instead of one, given that we've had this gen for like two years straight already and it's not the best time to buy
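For scale, the per-direction bandwidth of each PCIe generation works out like this once you include the line-encoding overhead (Gen 2 uses 8b/10b encoding, Gen 3 uses 128b/130b):

```python
# PCIe per-direction bandwidth for a link.
# GT/s per lane * encoding efficiency / 8 (bits -> bytes) * lane count.
def pcie_gbytes(gt_per_s: float, efficiency: float, lanes: int) -> float:
    return gt_per_s * efficiency / 8 * lanes

gen2_x16 = pcie_gbytes(5.0, 8 / 10, 16)     # 8.0 GB/s
gen3_x16 = pcie_gbytes(8.0, 128 / 130, 16)  # ~15.75 GB/s
gen2_x8 = pcie_gbytes(5.0, 8 / 10, 8)       # 4.0 GB/s, the "x8 2.0" case
```

Even the 4 GB/s of an x8 Gen 2 link is far more than a single gaming GPU actually streams over the bus most of the time, which is why benchmarks barely show a difference.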
Just tried it. It does skip frames on that link. It skips every other square.
This is a major problem, worse than being vsynced @30hz.
|
Is there any possible way to use a combination of 3 HDMI + DVI ports or is it strictly impossible to use all three simultaneously (i.e. you MUST use the displayport slot for the 3rd)?
Could you run two unrelated cards in a non-bridged setup (i.e. NOT sli/crossfire) and use the additional card for running additional monitors. What about 2 bridged + 1 not? And what if these 2 bridge + 1 not were not the same vendor (i.e. 2 nvidia + 1 amd)?
|
Can one watch twitch.tv via the native browser of Windows RT?
|
On March 12 2014 17:34 Craton wrote: Is there any possible way to use a combination of 3 HDMI + DVI ports or is it strictly impossible to use all three simultaneously (i.e. you MUST use the displayport slot for the 3rd)?
Could you run two unrelated cards in a non-bridged setup (i.e. NOT sli/crossfire) and use the additional card for running additional monitors. What about 2 bridged + 1 not? And what if these 2 bridge + 1 not were not the same vendor (i.e. 2 nvidia + 1 amd)? There's something about some sort of clock generator hardware that's needed for the signal for DVI and HDMI. The normal cards have two of those and that's where that two-monitor-limit comes from. If you build something with monitors that are identical, always show same resolution and Hz, two ports on the card can share the same clock. That way, you can get to three monitors without DisplayPort on a normal card. It seems to me, people often report problems getting this to run.
Regarding resolution and Hz needing to be equal: what's probably actually important is the pixel clock the signal has to run at going from pixel to pixel, not the resolution and refresh rate of the finished frame. This can sabotage the setup for you; two screens showing the same resolution and refresh rate might still run differently behind the scenes, as there's a whole bunch of numbers involved: blank lines at the top and bottom of the frame, blank pixels at the start and end of each line. At least that's where I think the problems people report probably come from.
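To make those "numbers behind the scenes" concrete: the pixel clock counts the blanking intervals too, not just the visible pixels. A sketch using the standard CEA-861 timing for 1080p60 (those particular totals are the spec values for that one mode; other modes and reduced-blanking timings differ):

```python
# Pixel clock = total pixels per line * total lines per frame * refresh rate.
# "Total" includes the blanking intervals, not just the visible area.
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    return h_total * v_total * refresh_hz / 1e6

# 1920x1080 @ 60 Hz per CEA-861: 2200 total pixels per line and
# 1125 total lines per frame (280 pixels and 45 lines are blanking).
clk_1080p60 = pixel_clock_mhz(2200, 1125, 60)  # 148.5 MHz
```

Two modes with the same visible resolution and refresh rate can still end up with different pixel clocks if their blanking totals differ, which is exactly the clock-sharing pitfall described above.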
You can run multiple graphics cards without problems nowadays. Windows handles it fine, and the Intel, AMD, and NVIDIA drivers don't murder each other.
One thing you might have overlooked: if you have integrated graphics through your CPU and board, it is usually disabled by default when the BIOS finds a discrete graphics card at boot. You have to manually enable the integrated graphics in that case, but it runs fine afterwards, and you'll be able to connect monitors to the ports on the motherboard.
I don't think SLI or Crossfire improves anything here. The second card has no connection to the outside; only the first card drives the outputs, so the limit on the clock signals should still apply.
|
On March 12 2014 16:22 Cyro wrote: Just tried it. It does skip frames on that link. It skips every other square. This is a major problem, worse than being vsynced @30hz. So, I guess I should just keep the TV unplugged from the computer unless I want to watch something on it, then?
|
On March 13 2014 03:10 wongfeihung wrote: On March 12 2014 16:22 Cyro wrote: Just tried it. It does skip frames on that link. It skips every other square. This is a major problem, worse than being vsynced @30hz. So, I guess I should just keep the TV unplugged from the computer unless I want to watch something on it, then?
If you can't fix it, yea
|