Problem is, I would need white fans, or white and gold, to fit the aesthetic I'm going for.
Enermax does make some case fans which have the look and have good static pressure ratings, at least.
Normally I wouldn't be. But the Enermax has revealed that it basically fails within a year and putrefies its own innards, while tons of people are talking about that reactor-looking version of the Castle AIO basically popping its own assembly and spewing coolant over the entire PC.
I've only had two AIO failures, neither of which left residue. One was a Corsair H50 that was already the best part of seven years old. The other was a second-hand Thermaltake thick 120, whose pump itself failed. Neither of them dripped coolant into the system, so both were completely recoverable. The failure I don't want is having one burst and drip coolant over the entire system. Get what I'm saying?
Yeah, that freezer 34 is about as white as an FBI pursuit vehicle.
To the same end, neither of the Thermalright coolers is all that white either, but the silver should work well enough.
Strictly on air coolers, these fit the aesthetic goal best, but all of them seem to review horribly even with a less demanding CPU. Which makes me want to lean toward liquid cooling.
Though for a 360mm AIO, every white radiator option I know of has some concerning failure rates.
I have had good success with Enermax air coolers, so this T50 ARGB is looking tempting, even though I really don't care for RGB lighting. The rear "vent" thing seems like much of a gimmick, to the point I feel I'd be better off just trying to find a second fan to attach there.
Yeah... those plates, with shipping and tax, are close to $40 by themselves on Amazon. Combined with the sinfully ugly fans Noctua makes, that's just not even worth considering. Besides, if I had to replace the fans on the NH-D15, it would already be more expensive than even a 360mm AIO, and the CLC is much more likely to get better results for the money.
Would any of those be available in White?
Also, my experience doesn't really mirror that. I had this big tower cooler on my 5820K, but even at stock, it would instantly hit 90C when testing with Prime95. The 240mm cooler I do have now, with the fans it came with, wasn't much better, true, but since finally swapping the fans for something with more static pressure, it dropped tremendously, to now only peaking at 60C even with peak sustained load.
Or at the least, with a radiator, there's more I can do to address cooling it seems than just an air cooler.
Part of it is your use case, and whether you have the hearing (expectations) to pick out those finer differences.
Like, I'm no audiophile to the degree some go to. However, there is an expected audio reproduction quality I look for, especially in games; to that end, especially for anything old enough to expect DirectSound and/or some degree of EAX, no onboard really cuts it. Realtek in particular is guilty of this, since their software and drivers barely even work a lot of the time. Realtek audio is kind of like Intel GMA graphics for sound, if you get the analogy: for anyone serious about it, it's a joke, but because it's so cheap, everyone and their dog has it in their computers unless you specifically build it out of your system.
Most PCI-E sound cards would only need a 1x slot, or bandwidth for one. Sure, you could stick that into an x16 slot, but that feels pretty wasteful.
Not sure where, but I did see something of an adapter/bridge, likely made for GPU mining, that would turn a PCI-E x1 slot into a USB 3.0-type connector to plug into the computer. Not sure if that's an option you'd want to entertain, or if you were hoping to keep it internal.
As for sound card recommendations? My use case and experience leans toward SoundBlaster, so for around $50~60, the Audigy Rx is the go-to choice, unless you need more specific outputs. Towards the high-end, unless you have a specific need for the extra hardware in the AE-9, either the AE-5 or AE-7 are fantastic premium options with plenty of outputs to support whatever it is you plan to drive on it.
That said, please don't neglect putting some good speakers to be driven by either of those choices.
What a lovely rig. Very well done, and I'm sure your son is going to be super happy with it.
So? Are you not aware that restored-content mods exist for the first two games?
KOTOR 2 on Steam also got a tremendous update, plus Steam Workshop support, making it easier than ever to play it the way it was meant to be played, with restored content in spades.
The good thing is that SWTOR doesn't use a bunch of cores, so it's entirely possible to build something that could easily play the game and stream it on current hardware.
That said, it's very much bound to clock speed. It doesn't really care about having a bunch of cores so much as having enough fast cores.
Tech Deals on YouTube demonstrated as such with a build featuring the i3 8350K, and how overkill it was with a suitably high-end enough GPU. The PvE portion sure as heck doesn't need all that extra power; even my old Phenom II could handle it. The extra speed and IPC is more for the PvP features that otherwise would bring performance down.
Reading this, I brought up Google Maps to look for something like that. Found one that's almost within walking distance from me. They want $75 for the privilege, and I would already have to have the 9900K and a cooler in hand for them to install. Yeah, that seems pretty excessive for something that should take a couple of minutes, honestly.
You sure? Because at least Square's own benchmark list shows in their testing, for instance, that a Radeon 5700 XT does worse than a standard GTX 1080 on 4K High. Yet I was able to find a video showing they were able to get the card doing a consistent 60+ on a 9900K.
While I would personally prefer to upgrade the GPU, I see no point in spending $1200 for a 2080 Ti to maybe get an increase... or $2900 for the Galax HOF model to fit the aesthetic theming I would have been going for. But that's neither here nor there. So I was just going to take the aesthetic hit and bring my 1080 Ti over for the meantime, until a more palatable upgrade for the price comes along that can be had in white. Or I just get stupid lucky on an auction for Gigabyte's white edition 2080. Either way...
Based on footage of people running the game on a more modern platform (either Ryzen 9 or Intel i9), it seems evident that the platform upgrade is basically what I need to do to wring out enough extra performance to keep playing it, and games like it, in the future.
Every CPU compatibility list I've seen for this board basically excludes everything but the Core i chips on the launch BIOS. Given the overclocking focus, I'm not terribly surprised, but it basically means I have no reason to suspect any of those Pentiums or Celerons would even work.
I'd still have to buy a board, and I'd still have no way to use this one. There's the rub.
The latency has to do with how snappy the system is: how long it takes to start fetching and loading data. Once it's actually retrieving the data, the clock and bandwidth factor in.
It's why stuff rated for a particularly quick set of timings can command such a high asking price. It might not affect gaming terribly much, but a system trained to run tight timings would invariably feel snappier than one that's on some cheap, slow kit.
XMP is an Intel standard. AMD's work-alike is more commonly called DOCP. Some boards elect to support Intel XMP on an AMD platform, but this is hit-or-miss. It's possible you have an Intel-only kit in an AMD platform, or vice versa. Hopefully you're still within your return window to get that sorted out.
3466 / CL18 == ~10.4ns latency
3466 / CL16 == ~9.23ns latency
The improvement would result in a snappier system. But you might be able to use memtest and train your existing memory to run tighter timings. I'd suggest giving that a try first, and if it fails, then yeah, feel free to go for tighter kits if you can afford them.
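If you want to check the math yourself, here's a minimal sketch of the standard "true latency" formula those numbers come from. DDR transfers twice per clock, so one CL cycle lasts 2000 / (data rate in MT/s) nanoseconds; the function name is just my own.

```python
# Sketch: converting a DDR kit's data rate and CAS latency (CL) into
# "true latency" in nanoseconds. DDR is double data rate, so the real
# clock is data_rate / 2, making one cycle 2000 / data_rate ns long.

def true_latency_ns(data_rate_mt_s: float, cas_latency: int) -> float:
    """True latency in ns = CL * 2000 / data rate (MT/s)."""
    return cas_latency * 2000.0 / data_rate_mt_s

if __name__ == "__main__":
    for rate, cl in [(3466, 18), (3466, 16)]:
        print(f"DDR4-{rate} CL{cl}: {true_latency_ns(rate, cl):.2f} ns")
```

Running it gives roughly 10.39 ns for CL18 and 9.23 ns for CL16 at 3466, matching the figures above.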
If that were the case, why do the ones that flag for virtual 7.1 always sound so hollow and miserable? This is all so confusing, and it's not really helping my case of why I have such annoyance with headphones on top of it.
Not true. Or rather, not necessarily accurate.
I use the turtle beach x41's with my tv, since it had an optical out. It supports 5.1... its biggest problem though, is that using RF for wireless, the battery life is pretty awful. But they were the only way I was able to play through Alien Isolation on the console to completion. With just the TV's speakers in stereo, I was getting shredded by the Alien constantly. Using the x41's had me scared a lot more often, but I survived a lot more than died, because I actually had the positional awareness.
It's especially stressful to me right now, because I think one of my speaker sets is going on the fritz... and it seems like my suspicions about audio degradation in PC stuff are basically confirmed. Can't even find any locally anymore, and even the same model I had goes for twice the price online now. Back when I bought this set of speakers, they were $50; the same set is $98 online now.
Every, and I mean every, set of headphones or earbuds I've ever had has broken on me, be they wired or Bluetooth. With wired, it was always the wire fraying that would render one or both sides unusable. And that's assuming they were able to fit my head in the first place. I also fail to see what you're seeing about the Tiamat 7.1, since it shows it accepts four sets of analog inputs. Are you sure you're not comparing them to the USB-based 2.2 set? There are a ton of "gamer" headphones that claim virtual surround, and everything that says virtual surround has always disappointed: hollow-sounding and, of course, not really supporting anything. I wouldn't even be considering them if they didn't take the audio out from the sound card, which I'd rather have handle the audio processing.
Again, why would I spend that much for something that's just going to break and still not meet my needs or preferences?
Also, the Tiamat takes the 5.1/7.1 from the sound card; I'm deliberately avoiding the trashy USB headphones because their drivers don't support any of the audio features for the legacy games I still play.
I already said the $100 for these Tiamats was the most I would consider paying for headphones. And looking up that model number, those start at $200. Why would I spend that much, ever? They're only stereo, so I lose the positional advantage of 5.1 right off the bat. Just on specs, they don't make a lick of sense.
Then would you mind defining what "good cans" are? I don't even understand the term. As far as I'm aware, there's no such thing; a lot of that has to do with the fact that all of them are marked up as heck just because of whatever arbitrary name is on the box. I see no point in spending $400 for glorified stereo headphones that are just going to break anyway.
The Houser brothers that run Rockstar will sooner sue every person that plays RDR2 on PC before they'll admit their launch was anything but perfect.
Yeah, keep in mind the notes for that set of recommended specs is for 1080p gameplay. 4K is a whole other order. And the latest/final build of the game has a memory leak that allows it to creep past 20GB in my main system's usage. It probably hasn't affected me because of having such a cavernous amount of system memory on that setup, but something worth noting.
Can't speak for RDR2 (not my kind of game), but I've been pleased enough with my 1080 Ti at 4K. At this point I need to do a platform upgrade at minimum, and my most demanding game would be Final Fantasy XV, so it probably favors Nvidia more there.
Downside? It's all Realtek.
Good news is... unless you're on an ITX board, you should definitely have the ability to upgrade the sound hardware. Though I would hope you'd have some really high-quality speakers and/or headphones to justify it. And even then, if the headphones are USB, they won't use a sound card or onboard audio anyway.
For my use case, I prefer internal sound cards, because the software for most external cards just doesn't handle what I'm looking for. But in every scenario, Realtek is inadequate, unless you're just that tone-deaf or using really crap hardware. GPU and CPU balancing is a thing; so is balancing your audio chip against your output medium of choice.
As of this typing, for frame of reference, an RX 580 8GB card can typically be had for $100 USD used. A 1660 (non-Super/Ti) can be had for $220 new. Inexplicably, an RX 590 sells for $150, which makes no sense.
A used Vega 56 on a blower cooler can typically be had for around $200. The RX 590 is overpriced compared to either the 580 or the Vega 56. If you can negotiate below $200 for a 56, it's a great option; it just requires a lot more tinkering to find a sweet spot of clocks and voltages and get the best out of it. You'll also want to look for a better cooler if you have to settle for a blower model, since the ones that came with better coolers from the factory are also inflated in price.
My suggestion? Sub-$200 USD for a 56, and set aside the savings for an Arctic Accelero Xtreme III cooler to mount on it. That should help keep temps in control and give you a greater experience and some longevity out of the Vega.
Some of it is unjustifiable in how much it appreciated in value.
You kind of needed a late-era Socket A board to support the 400FSB Barton cores to begin with. And the nForce2 Ultra 400 boards in particular are harshly expensive; you could almost buy a new X470 motherboard for what some people want for the old platform, especially for one that has the 4-pin CPU power connector. That connector is crucial to get such a build working with a modern power supply, since before it, AMD Socket A drew most of its power through the 5V rail.
And for some graphics cards, the asking price for a working one now is just obscene. I don't even want to get into that.
True, but I also painfully remember how obnoxious it was to get a VIA-based anything to accept anything. Back then I had a first-gen Audigy, and with that combo I was reinstalling weekly.
Only reason I went with VIA was because the most affordable nForce2 motherboard at the time was 4x the asking price of this one, to support the XP 3200+.
As in, the cheapest listing for an Asus A7N8X (so not even a DFI LANParty) was $120 with nothing included. This Abit KW7 was $37 with an extra CPU, RAM, and video card thrown in, which I figured I could use for troubleshooting, or in case I needed to update the BIOS before it would accept such a (relatively) flagship CPU for the platform.
Out of curiosity, I went and downloaded the apps/install CD for the base Audigy 2, and found that it does have drivers for Win98... that would seem to make it the no-brainer choice, presuming it works for this build. Even if it doesn't, I already have an XP build it could drop into, and at least another Windows 10 build I can use it in thanks to the Daniel K drivers having such a wide breadth of OS support. The only issue is if it somehow doesn't work on this VIA based board, I'd probably be back here later looking for another alternative.
Supposedly, with as famously fussy as VIA's chipsets are (gee, no wonder they're not really a thing in modern PC hardware...), one name that also came to mind was the Turtle Beach Santa Cruz. Another card I used to have; not too bad... but the drivers basically stopped working once Half-Life 2 released. Though it's safe to say I wouldn't be playing HL2 on a Windows 9x box anyway.
The price is a little inflated for my liking, but the VOGONS link you shared also had some other links that were rather eye-opening... not least of which implies there is an ISO for the Audigy 2 that includes Win9x drivers. Which, if true, would be fantastic news IMO.
I did discover one thing asking around: that USB thing uses an Ensoniq chip, and Ensoniq is also famously incompatible with VIA chipsets.
Already went through one hot episode of that kind of incompatibility with the Aureal. I'd really rather avoid going through that again.
To be fair, it is cheap enough to be worth trying as a last resort... A very last resort, though.
Based on my limited knowledge of DOS, it's safe to say this flat out won't work if I have to drop into DOS mode for any period-correct games. Not to mention, I doubt it would even be adequate for more than the most basic Windows sounds.
That's because of a few factors:
- Let's be real, the majority of viewers are below average intelligence
- OpenAL drastically reduced the dependency for basic audio support, admittedly
- You have to have a use case where upgrading the audio makes sense to you
- You have to be able to fit one
Let's be real: the majority of people are idiots and tone-deaf, and likely using $5 USB-powered speakers with their stuff, or maybe $10 headphones. Those aren't going to sound like triple-digit reference monitors, cans, or anything approaching a 7.1.2 Dolby Atmos arrangement, no matter how much sound hardware you put behind them. So of course they think the $2 Realtek trash on most boards sounds fine. Yes, I'm salty. Deal with it.
As I referenced earlier, you need a use case where having better audio makes sense for you. Do you play legacy games at all? Do you stream with commentary? Though, to be fair... for the speakers you did select there, it would not be unreasonable to upgrade your audio on that factor alone.
If I may, I'll share my own use case. Even with my current beefy main PC, I still happily spin up old games that were made when EAX was still the predominant audio format. And what most people seem to fail to grasp is that the whole DirectX interface for scaling graphics fidelity up and down applied to audio, too. So it doesn't matter how capable Realtek claims their ****** $2 audio chip is; if it isn't reporting that capability to the game, it won't matter what Realtek claims it does. And Realtek is especially guilty of this, because they won't even try to provide any avenue or recourse to support these games on Windows XP, let alone anything after it where such support was needed.

Basically, as of Windows Vista, Microsoft tried to break Creative's monopoly hold via EAX by adopting OpenAL as the de-facto audio interface for Windows. Not knocking that decision, because it greatly opened up the ability for more hardware to run modern games with the proper audio fidelity. The problem is that Microsoft provided no backward-compatibility layer to smooth over the transition. Right now, the only companies providing anything (to my knowledge) to handle that would be Asus, with their GX program for their Xonar-based sound cards, and Creative, with ALchemy for their modern Sound Blaster products. And for complete EAX support, you're still stuck needing a Creative Sound Blaster sound card. Despite Creative obviously licensing access to third parties, Realtek simply goes "nope" and is too cheap to offer it. So yeah, theirs makes games sound hollow and trashy, in my opinion.
The good thing is that a sound card won't really need a high bandwidth PCI-E lane at present. So you shouldn't need to put it so close to a GPU unless your other slots are going to be occupied for some reason. If you want to talk tight spacing, you should see my mATX build, where I have an X-Fi OEM basically butting up against the GPU in it... there is a PCI-E x1 slot, under the cooler, which I basically can't use as a result. ):
Even if you don't play legacy games, I also mention streaming with commentary because even the YouTube channel Tech Deals noticed a considerable improvement in audio fidelity, in recording and production, when they dropped in a sound card in place of their board's onboard audio. So if being able to record your vocals clearly is something you care about, and you find the onboard isn't cutting it, that's another valid reason to use a sound card.
This is what I got at present as my main pc.
PCPartPicker Part List
Would that be something I'd need to ask the motherboard maker if they support? Because as it is right now, one of the examples I'm looking at is someone selling a 2690 v3 for ~$215 or best offer, which means that if I could enable that, that's 3.5GHz across 12 cores. Which should definitely put me round-about first-gen Threadripper performance.
Okay, but how do I figure out which of these are overclockable? Or at the least, able to set all the cores to their max turbo frequency?
That's pretty impressive, all considered. Everywhere else I've read that a 9900K really needs a 280mm AIO or better. Tech Deals also alluded to it; his 8700K was able to hit 5GHz with a 240mm AIO cooler, but it really seemed to be unhappy during stress testing.
Try running Prime95 with AVX enabled; I doubt it'll stay as cool. The really paranoid run it consecutively for a week straight. At a minimum, I look for eight continuous hours without errors; preferably, a day or two. If said 240mm AIO can handle that? Color me impressed.
Simply put, I would look at no less than a 280mm radiator; preferably, 360mm or better. I would also not rule out replacing whatever fans an AIO comes with for something with a lot of static pressure in its specs. You could probably have more flexibility with an open loop, but when it comes to an AIO, the things that matter are surface area and static pressure. Most brands nowadays use the same pump type, so there simply isn't that much variance elsewhere.
As for the fans, pay attention to the static pressure rating, and then just get one with an appropriate sound profile or noise rating that you'd be okay with.
For reference, I started with a Cooler Master MasterLiquid 240, and while it was only $40 USD at the time, the stock fans were pretty trash, and even at stock clocks I was hitting 90C in stress testing on a 5820K. Replacing the thermal compound helped a bit, but replacing the fans not only greatly reduced the noise at full speed, it also drastically brought temps down further.
Just a shame about the sTIM on the 9900K, because I think a direct-die mount would likely help that CPU a bunch if it could be done. It is possible to de-lid an i9, true; but Intel's soldered TIM makes it unsafe for someone with an off-the-shelf delid kit from an 8700K or similar to do it, not without warming the chip enough for the solder to be malleable enough to pop.
Taking some advisement, this is one avenue that I'm considering, in the event of simply importing my 1080Ti over. It'll just mean I would have to basically take the aesthetic hit for a while.
Noticed this optical drive has support for UltraHD Blu-ray discs, which should cover whatever kind of movies I may encounter now or in the future.
I'm already familiar with the Enthoo Pro, and it would look like a fitting sibling to my current main PC.
The next change was dropping the secondary storage, and going with a 2TB NVMe drive... reasoning that should finally give me enough storage internally so I wouldn't have to manage storage as aggressively as I currently do, and give me enough wiggle room to actually install Final Fantasy XV with the optional high res texture pack DLC that Square provides, which together eats about ~170GB of space.
Also, as a bit of a workflow and quality-of-life enhancement, using the money that I would have used for the GPU to instead get a 4K monitor that has VRR support, and rated very well on [DisplayLag](https://displaylag.com/benq-el2870u-4k-hdr-gaming-monitor-review/). Just thinking out loud at the moment.
Huh? But even EVGA's QVL has tested up to 4133 on this board... 3600 for 4x16, admittedly.
The only reason I'd want the iGPU is on the off chance I didn't have a GeForce in the system, or until AMD gets its act together and offers more support for its NVENC equivalent in the likes of OBS. Seriously, on my HTPC, the low-latency encodes let me play all but the most timing-sensitive games from the preview window while it's supposed to be encoding. But the newest Radeon I have that supports this has a 3+ second delay. Nowhere near as bad as the delay when I first tested an Elgato Game Capture HD, but still seconds behind rather than milliseconds. From what I see, just about every encoding app (paid or free) supports Quick Sync to great effect; only the Turing stuff so far seems to encode at the same quality that QS can do in real time.
Gaming-wise, the most intensive game I care about is FF XV. Even though the benchmark is decisively more aggressive than the full game (from my own experience), 4K High results show the 5700 XT doing worse than a standard GTX 1080, let alone a 1080 Ti. And chances are FF VII Remake will be no less intensive in the same manner. For reference, the Radeon VII did even worse. Not sure what the deal is there, but it's an important enough game to me to warrant a pause.
On the flip side, saving on the GPU upgrade now could possibly let me snag another 4K monitor, maybe one with VRR. As is, the closest thing I could find affordably that matches the desired aesthetic would be the Gigabyte Gaming OC White RTX 2080, not even a Super variant. I did see a manufacturer-refurbished model on Newegg's eBay page.
I noticed you picked DDR4-3000 as opposed to 3200 or higher... is there a reason for this?
Big surprise, I still like to spin up EAX-era games on my current gaming PC. The good Star Wars games, Halo, Need for Speed... yeah.
So yeah, I need a way to correctly handle that. Telling me to get something that won't support it isn't helping. I don't understand what part of this you're failing to comprehend here.
I'm sorry you have the tonal response of a Bose speaker, but don't force your ****** lo-fi audio on me.
my HTPC has an OEM X-Fi card not just for the gaming, but because the card in question also has optical in and out, so that when applicable, I have an avenue where I can capture digital audio from whatever I happen to be capturing from.
Even Mr. Tech Deals, who didn't have a gaming need for a sound card, found one useful for his encoding and streaming needs; as in, a marked improvement in the quality of his recording and streaming sessions. Even within the bandwidth constraints Twitch puts on streaming, it was a noticeable improvement... one I first mistook for him buying a more expensive studio mic.
I use the companion program ALchemy with my X-Fi and Audigy Rx in my present main quite a bit. Because I still have old games in my Steam library or on disc that I like to spin up and play. And where I can't get a native OpenAL audio patch, ALchemy fills in the blanks to make correct audio quality happen.
Clearly, your bias colors your ability to provide constructive feedback. Do us both a favor, and disregard this topic.
My current build has 64GB of RAM. My most demanding game at present is Final Fantasy XV... and with everything running, it easily reaches past 16GB now. Psychologically, it would not make sense for me to go with less RAM than my current main has. Chances are, the eventual PC release of the FF VII remake will be no less memory-intensive.
As far as BD playback goes, there's a freeware program called Leawo Blu-ray Player. Suspect name aside, it's come up clean in everything I've scanned it against, and has been flawless for playing such movies on my system.
As far as a sound card goes... a must-run for me are the original KOTOR games. KOTOR 2 did get a modern patch on Steam, but no such luck for KOTOR 1. And it doesn't matter what the DAC claims to support; if it doesn't report to the game that it supports what the game expects, it will sound like trash. I don't understand why people continue to be obstinate about this. And to date, a modern Creative card with its companion ALchemy is the only thing that completely solves this and makes these games work and sound as good as they did when EAX was still the de-facto standard. Unless your DAC can support this, advising me to go this way is a waste of both our time and writing.
You'll definitely notice an improvement in raster gaming performance. But first-gen ray tracing tech will suck for the meantime, unless you're willing to settle for 1080p or even 720p rendering; for a $700 video card in 2019, that's excessive. Past that, the improved NVENC will help for streaming, if that's a thing you do.
Way back, my previous main PC eventually became a Q6600 with 6GB of DDR2-800 (2x2GB plus 2x1GB) that was all able to run 4-4-4-12 timings. When the primary 2x2GB set errored, while waiting for the warranty replacement (props to G.Skill there), I broke down and bought a 4x2GB kit for cheap... which refused to run tighter than 6-6-6-18. The added capacity helped for programs, but the system felt laggy ever after.
That same kit I now have in a build with a Pentium EE, and I at least made time to try tuning down to 5-5-5-15... not as snappy as 4-4-4-12, but a marked improvement. By the math, the latency with the new settings went from 15ns to 12.5ns.
My most memory demanding game to run would be Final Fantasy XV, easily. And with background stuff running (like say browser window or even OBS), I definitely pushed toward 20GB. Part of the reason I went with 4x16 for my current main pc, was with the reasoning that by the time 64GB of system RAM is normalized, I should have already at least enough to move to a much faster machine, if not had an intermediary one in between. My Previous main lasted (no joke) as such for 10 years. So I went into building it with the mind that I probably didn't know if I would be able to upgrade it at all again. I certainly wasn't expecting to receive such a modern board so soon to warrant thinking of a new build again. but I'd be remiss to not use it.
Thing is, for DDR4, I'm not seeing all that many memory kits that go much below 10ns latency. The fastest kit I did see, which reached down to 8.25ns for 4x16, is no longer on sale, so knowing that doesn't do me a whole lotta good.
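For anyone curious which speed/CL combos actually dip under 10ns, here's a quick sketch applying the CL * 2000 / data-rate formula to a handful of common DDR4 specs. The kit list is purely illustrative, not a shopping guide.

```python
# Sketch: true latency (ns) for some common DDR4 speed/CL combos,
# using latency = CL * 2000 / data rate (MT/s). Illustrative values only.

kits = {
    "DDR4-3000 CL15": (3000, 15),
    "DDR4-3200 CL14": (3200, 14),
    "DDR4-3466 CL16": (3466, 16),
    "DDR4-3600 CL16": (3600, 16),
    "DDR4-4000 CL19": (4000, 19),
}

for name, (rate, cl) in kits.items():
    ns = cl * 2000 / rate
    flag = "  <- under 10 ns" if ns < 10 else ""
    print(f"{name}: {ns:.2f} ns{flag}")
```

By this math, 3200 CL14 lands around 8.75ns and 3600 CL16 around 8.89ns, which is why those bins tend to command the premium.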
OBS does support the AMD version of things (VCE, I think they call it). What's a buzzkill for my use cases, personally, is that the Radeon team doesn't support low-latency encoding for third-party apps (basically, nothing except their own ReLive software). This might not be a big deal to you, but for a PC that's set up to encode from a capture-card source, it's kind of a dealbreaker for me. With Nvidia and their low-latency support for OBS, I can actually play the console capture from the OBS preview on all but the most latency-sensitive games. The Radeon hardware should be able to do this, but AMD still doesn't extend that support, for whatever odd or contrived reason.