Sega Rally 2 - Model 3 arcade emulation takes a big step forward



parallaxscroll
03-23-2016, 09:00 PM
https://www.youtube.com/watch?v=1GRsPQyaHro

Be sure to switch to 720p to get 60fps.

The Supermodel emulator is being updated. It's a work in progress, but it's playable, and we can forget about the mediocre ports.

Thief
03-23-2016, 09:14 PM
Always wanted to play Sega Rally 2. Never played it before. Nice to see Model 3 emulation has come this far too. The last of the classic Sega Arcade games awaits.

Blades
03-23-2016, 09:23 PM
Isn't Supermodel dead? Last I checked, it still had timing issues (music and gameplay too fast) in SCUD Race/Super GT.

parallaxscroll
03-24-2016, 01:14 AM
Seems like Scud Race has been improved.


https://www.youtube.com/watch?time_continue=94&v=zrjLqnjuFLc

Blades
03-24-2016, 10:27 AM
No, it hasn't changed. You can easily check this by comparing against the original arcade machine, or just timing the in-game timer with a stopwatch; it runs way too fast. The guy skipped the intro movie, in which the music from the SCSP also plays too fast. When I talked to Bart (the author), he said the problem lies in the SCSP emulation and is unlikely to be corrected, since Supermodel is not in active development.

Apparently the Model 3 itself runs at a refresh rate slightly lower than 60Hz, which creates all these weird timing issues.
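To put a number on that mismatch: if an emulator simply paces one emulated frame per host vsync on a 60Hz monitor while the board's native refresh is closer to 57.5Hz (an approximate figure, not something stated in this thread), the game logic advances a few percent too fast, which matches the "runs way too fast" symptom. Below is a minimal sketch of the error and of a frame limiter pinned to the native rate; it is illustrative only, not Supermodel's actual code:

```cpp
// Illustrative only (not Supermodel source): shows why syncing a ~57.5 Hz
// board to a 60 Hz host display makes everything run fast, and how pacing
// frames at the native rate instead avoids it. The 57.5 Hz value is an
// assumption used for the example.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    const double nativeHz = 57.5;   // assumed Model 3 refresh rate (approximate)
    const double hostHz   = 60.0;   // typical PC monitor refresh rate

    // One emulated frame per 60 Hz vsync means 60 frames of game logic per
    // real second instead of ~57.5, i.e. the game runs this much too fast:
    std::printf("Speed error: %.1f%% too fast\n", (hostHz / nativeHz - 1.0) * 100.0);

    // A frame limiter that paces emulation at the native rate instead:
    using clock = std::chrono::steady_clock;
    const auto frameTime = std::chrono::duration<double>(1.0 / nativeHz);
    auto next = clock::now();
    for (int frame = 0; frame < 5; ++frame) {
        // ... run one emulated frame here ...
        next += std::chrono::duration_cast<clock::duration>(frameTime);
        std::this_thread::sleep_until(next);
    }
    return 0;
}
```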

parallaxscroll
05-25-2016, 10:40 PM
Some improvement in this build, but probably not what you were talking about in terms of timing or sound.


https://www.youtube.com/watch?v=L-gIpAuUXBU



Scud Race on the Supermodel emulator, SVN r415.
Played on a Medion PC: i5-4570 @ 3.2GHz, 16GB RAM, GeForce GTX 760.
Fantastic new improvements in the latest SVN build. Thanks to Bart and Ian.

Blades
05-26-2016, 07:34 PM
No change.

Damn, they are so close. The only problem is the music plays back slower than the real hardware while the game itself runs faster than the real board.

So closeeee.

Tower of Power
05-27-2016, 12:28 AM
Screw emulation, a real gamer would get a full size Sega Rally 2 arcade cabinet.

Soulis
05-27-2016, 03:22 PM
Screw emulation, a real gamer would get a full size Sega Rally 2 arcade cabinet.
A real gamer with enough room and money, you mean.

Tower of Power
05-27-2016, 10:20 PM
A real gamer with enough room and money, you mean.

A real gamer would find a way.

CasetheCorvetteman
01-06-2018, 03:12 AM
A real gamer would find a way.

I have the money, I have the space, and I have a way, but I still haven't done it.

Not even with Daytona or Sega Rally. I just use the more than adequate emulation, which allows the same thing to be done, and if it breaks down? Just build another basic PC.

Barone
01-06-2018, 03:54 AM
Screw emulation, a real gamer would get a full size Sega Rally 2 arcade cabinet.

A real gamer would find a way.
So this is what the real gamers do? Lol.

Such a poor view of what gaming is, but I can't say I'm surprised.

Tower of Power
01-06-2018, 09:28 AM
So this is what the real gamers do? Lol.

Such a poor view of what gaming is, but I can't say I'm surprised.

Are you saying you don't have hundreds of full-size arcade machines in your house/apartment? Weak. Obviously you're not a real gamer.

Barone
01-06-2018, 10:44 AM
Yeah, I'm not up to the standards. :(

CasetheCorvetteman
01-06-2018, 07:39 PM
Yeah, I'm not up to the standards. :(

Be at the standard you want to be at. Who cares what others can or want to achieve? It's all about you, and never let anyone tell you any different.

Blades
01-07-2018, 06:37 AM
All I want is a port of Sega Super GT.

Vector
01-21-2018, 04:50 PM
I asked this in another thread, but perhaps it is more suited for this one: "Ha, I love my Fusion for SMS, Genesis, SCD and 32X games, but I love my real Genesis Model 1 VA2 and VA3 hooked up to my '80s Pioneer stereo systems too. I'm sure other Genesis emulators are better than Fusion, but it looks great to me, and the ability to play 4 different Sega systems is a plus for me. For Saturn and DC, I always choose hardware. I have a Core i5 or Core i7 computer and I doubt Saturn or DC emulation would run as well as Fusion does. Windows 10 is now giving me crappy errors that Windows 7 or XP never did with Fusion: a red screen of death for certain ROMs when I patch them (Golden Axe.md) that worked before (although Thief hooked me up big time, thanks bro), constantly having to re-plug my Saturn USB controller and redo the config when I never had to before, and other weird errors like it just crashing and the Fusion window magically disappearing while playing a game. Ha, but overall I still love Fusion, and since I don't have all the Genesis add-ons, it helps. I guess it doesn't matter even if I go out and buy a Core i9, because the Saturn and DC software emulators are not up to par, not finished, or not even updated in some time, correct?"

TrekkiesUnite118
01-21-2018, 08:40 PM
What are you trying to ask?

If it's about Saturn and Dreamcast emulation, I can't speak for the Dreamcast, but Saturn emulation has come miles recently. It's still not perfect, but it's definitely playable for most games, and most casual users won't notice most of the issues. SSF works quite well on my AMD FX-8350, and I've heard good things about Mednafen's recent Saturn core, which came out of nowhere. The issues I've noticed tend to be in things like Capcom fighters not running at quite the right speeds (too fast), or games like Lunar either crashing or just acting odd. Though if you just want to play the popular exclusives like the Shining games or the Panzer Dragoon games, those have all worked fine on most Saturn emulators for years now.

Vector
01-22-2018, 05:05 AM
Since you answered my question perfectly, I'd say you knew what I was asking. :cool: Thank you for the answer. I haven't tried the Saturn emulators since 2011 or so. When I used them, they weren't working 100% perfectly with many games I liked, and the last time I tried DC emulation the games played but were very slow and glitchy. The reason I typed it like that is that I haven't used those emus in years, and even if I had a Core i9 with a great graphics card, if the emulator/software hadn't been updated since 2011 it wouldn't matter how powerful my hardware is now. Let me check what they were. OK: Yabause, SSF v10, the Model 2 Emulator, Satourne, etc.

Ace
01-24-2018, 12:07 PM
You would be wasting your money on the likes of a Core i9. I don't know how heavily multi-threaded emulation of the Dreamcast and Saturn is, but when you hit the Core i9 range, you're getting a CPU with a ton of cores, but one that also has a questionable internal cache structure (seems to degrade performance more often than not). Not to mention, I STRONGLY advise against any Skylake-X CPU (Core i series CPUs requiring an X299 chipset motherboard) as they have very high power draw and inadequate thermal compound between the CPU die and the heat spreader (a layer of crummy thermal paste rather than solder or liquid metal). You're pretty much forced to either delid your chip or invest in a fairly beefy liquid cooler to keep something like that running at acceptable temperatures.

Depending on how many threads emulators use, you'd be better off with either AMD Ryzen (Ryzen 5 at least) or Intel's new overclockable Coffee Lake CPUs (Core i3 8350K - quad-core, Core i5 8600K - hex-core, Core i7 8700K - hex-core with multithreading). AMD is strong with multi-core performance while Intel does best with single-core performance. These would be my suggestions for current CPUs to use with software emulation, depending on whether the emulators are mostly single-threaded (Intel) or multi-threaded (AMD). Don't bother with a Core i9 or Threadripper for those scenarios; those are made more for very heavy workloads requiring a large number of cores.
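If you want a rough feel for that single-threaded vs. multi-threaded distinction on your own machine, one option is to time a CPU-bound task on one thread and then on every hardware thread. The sketch below is a generic harness, not a measurement of any real emulator: spin() is just a stand-in for per-thread work. If the all-threads run takes about as long as the single-thread run, the workload scales with cores (which favours Ryzen); if it slows down badly, per-core speed matters more (which favours the high-clocked Intel parts).

```cpp
// Generic toy harness for checking how a CPU-bound workload scales with
// thread count. spin() is placeholder busy-work, not emulator code.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

static std::atomic<double> sink{0.0};  // keeps the busy-work from being optimised away

static void spin(long iters) {
    double x = 0.0;
    for (long i = 0; i < iters; ++i) x += 1.0 / (1.0 + i);
    sink.store(x, std::memory_order_relaxed);
}

// Wall-clock seconds to run `threads` copies of the work in parallel.
static double secondsFor(unsigned threads, long itersPerThread) {
    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < threads; ++t) pool.emplace_back(spin, itersPerThread);
    for (auto& th : pool) th.join();
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - start).count();
}

int main() {
    const long work = 50000000L;                      // arbitrary amount of busy-work
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4;                                // fallback if the count is unknown
    double t1 = secondsFor(1, work);                  // one thread, one unit of work
    double tn = secondsFor(n, work);                  // n threads, one unit of work each
    std::printf("1 thread: %.2fs, %u threads: %.2fs (perfect scaling keeps these equal)\n",
                t1, n, tn);
    return 0;
}
```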

Vector
01-25-2018, 02:15 PM
Thank you very much for your outstanding and detailed reply. It was just that I had been out of the Saturn and DC emulator loop for almost a decade and didn't know that Saturn emulation had greatly improved until Treks also answered me.

I have these 2 computers. I know they aren't great and don't have great gaming graphics cards, etc., but they do have an SSD main drive and TB mechanical drives with OK graphics cards. What graphics card do you recommend (NVIDIA), or is what I have fine?

https://images-na.ssl-images-amazon.com/images/I/71CVxVvDGdL._SL1000_.jpg

I'm sure some members would laugh at my computers being sold at Best Buy, because some of you make your own PCs with way more power. I would consider mine just mid-range computers; I've seen computers with more upgraded Core i7s, like a Core i7-7747 or something (getting near Core i9 levels), with an NVIDIA graphics card, multiple SSD drives for video editing, etc., sold at Best Buy now for $2,000. I guess just having the video file on the internal SSD you are using (vs. externally - is that actually true, or are SSDs so much better than mechanical drives that it doesn't matter?) makes the preview window in a video editor like Wondershare seem smoother, but no matter what, I still get a slight delay of either video or audio and still must watch the exported video to see if everything came out correctly. I wonder what kind of computers big-channel YouTubers use, or whether they use Wondershare or more powerful video editors, or Macs instead of PCs or something?

Ace
01-25-2018, 05:09 PM
Nothing wrong with staying on mid-range hardware. As long as the computer does what you want, that's all that matters. That said, you're probably better off building your own PC as you then have full control over how powerful your computer will be and how you want it to look.

Content creators tend to use expensive, high-core-count CPUs like AMD's Threadripper or the Core i9 (as well as some high-core-count Core i7s, usually with 8 or 10 cores) and I commonly see them use Adobe Premiere or Sony (now MAGIX) Vegas (I've used both myself and switched over to Premiere after seeing that it cut my render times in half). Unless you are churning out a ton of videos or rendering videos at high resolution like I do, a CPU like that is just throwing your money away. If you're mostly doing heavy software emulation with the likes of MAME (by that, I mean 3D games from the late '90s; I don't believe MAME is too heavily multithreaded), getting an Intel CPU that can overclock to 5GHz would be the better option (Core i7 7700K minimum, or if you want something with 6 cores, Core i5 8600K or Core i7 8700K). Otherwise, if your main focus is Sega Model 2 and Sega Model 3 emulation, I would say save your money and go with AMD Ryzen (minimum Ryzen 5 1400, sweet spot being the Ryzen 5 1600, and if you want the best balance of emulation performance and video encoding thanks to AMD's very strong multi-core performance, get a Ryzen 7 1700). I personally have a Ryzen 7 1700 running at its stock clocks (usually runs at 3.2GHz on all cores, but with proper cooling, the highest it can go is 3.9GHz - Ryzen doesn't overclock very far due to the 14nm process used by AMD hitting a voltage wall at high clock speeds). If you're willing to wait, AMD will refresh the Ryzen lineup; hopefully, those refreshed chips will hit higher clock speeds (they're on a slightly different 12nm process that should clock higher). Ideally, you should wait for those before making up your mind.

Graphics cards are DISASTROUS right now as the explosion in cryptocurrency mining has made graphics card prices go through the roof into unbelievably absurd territory. Even used cards are often overpriced. Normally, for a new card, I would recommend a GTX1060 6GB for nVidia or an RX 580 8GB for AMD (I tend to lean towards AMD myself, mostly due to their noticeably clearer upscaling at low resolutions compared to nVidia, and having a 4K HDR-capable TV, I prefer how AMD handles HDR over nVidia - that last bit is irrelevant to you unless you have recently-released games with HDR support). Used, I'd lean towards the GTX980 for nVidia or the R9 290/R9 290X/R9 390/R9 390X for AMD (the GTX970 was hugely popular, but I'm hesitant to recommend it due to the VRAM controversy: the card has 4GB of VRAM, but 512MB of it is virtually unusable due to how nVidia shaved down the GTX980 to make the GTX970 - I list multiple AMD cards because the four of them are very similar to one another). If you choose to go AMD, be sure to undervolt the card. Recent Radeon cards have a tendency to be massively overvolted out of the factory and will consume a lot of power as well as dump out a lot of heat. These cards typically don't overclock very far, so you're better off undervolting them instead.

Vector
01-26-2018, 04:16 AM
Wow, that is top-notch information exactly answering all my questions; thank you very much, I appreciate it. My problem is that I input large AVIs from my SSD (seems to run smoother in the preview window than off a mechanical drive) into my editor and layer them with many PIPs (the preview freezes more the more video-on-top-of-video or picture-on-top-of-picture there is in the video I'm making), but maybe it is my outdated program not syncing video and audio in "real time". I have to watch it after it's rendered and exported to see if it is exactly like it was when previewing, because sometimes the video or audio is a frame off from the preview. Maybe I'll buy this one too, https://www.liken.sale/index.php?dispatch=products.view&sl=en&product_upc=191628492534&product_price=499.000000&product_name=HP%20Pavilion%20Power%20580-023w%20Gaming%20Tower,%20Intel%20Core%20i5-7400,%20NVIDIA%20GT - it has a graphics card you recommended =) but is sadly a "generation" before the Core i5 8600K and has no SSD main drive =( However, if I added an SSD, would you recommend it for the stuff we discussed (MAME, editing, etc.)?

Ace
01-26-2018, 10:04 AM
A Core i5 7400 is not enough for video work. That sort of workload benefits more from high core count than high clock speeds and single-threaded performance. You're looking at a quad-core with no multithreading.
My personal sweet spot for gaming/emulation performance and video work is a multithreaded 8-core CPU. Open up a little and stop considering just Intel, this isn't 2015 where AMD had absolutely nothing that could even touch Intel's CPUs. I had two PCs between mid-2016 and mid-2017 that I would use, one with the FX 8320 and another with an overclocked Core i7 6700K, and as much as I liked my FX 8320 PC, it simply couldn't handle it when I made the move to 4K videos and I had to retire it as my main PC and used another with the Core i7 6700K until I botched a delid on it in an attempt to improve temperatures, which were pretty damn high at the 4.6GHz overclock I had on it. The FX 8320 is still going strong, but I'm mostly experimenting with it rather than actually using it. As for the Core i7 6700K, I sold my Z170 motherboard and bought a Ryzen 7 1700 with an ASUS B350-Plus motherboard (Ryzen uses socket AM4, if you're wondering) in May 2017 (not quite fond of the motherboard choice, and I have since got an MSI motherboard that has better port placement, but can be annoying and has questionable VRM quality - this last bit is critical when using overclocked CPUs). I have no regrets; even at stock clock speeds (should be 3GHz, but my Ryzen 7 1700 sits at 3.2GHz unless one core is under heavy load, in which case that one core hits 3.75GHz), my games run very nicely, software emulation is pretty good and it just churns through 4K 60FPS videos better than my old Toshiba laptop would render 1080p 30FPS videos in 2012 (and that thing rendered 1080p 30FPS videos fairly quickly). Although I don't really have proper cooling (need a bigger heatsink or a liquid cooler, the latter of which I'm not entirely fond of), I got my Ryzen 7 1700 to hit 3.9GHz on all 8 cores and the performance difference is fairly significant.
On a Ryzen 7, you can expect to hit 3.8-4GHz, with the 1700X and 1800X having better odds of hitting 4GHz. Just make sure not to overvolt your CPU past 1.425V (some spikes above that are fine, but don't give the CPU a sustained voltage above 1.425V) and ideally get a motherboard with a 6-phase or better VRM for the CPU (you can get cheaper B350 chipset boards with 4-phase VRMs, but these tend to be under heavier stress than the 6-phase+ motherboards). You'll typically find these higher-phase-count VRMs on X370 motherboards, mostly from ASRock and ASUS. Avoid Gigabyte (their VRMs seem to run very hot) and MSI (most of their X370 motherboards use 4-phase VRMs with doubled components to make them appear like 8-phase designs, their top-of-the-line motherboard has a very inefficient 6-phase VRM, and every single AM4 motherboard in their product range uses low-quality components in its VRM). Motherboard suggestions that don't break the bank with 6-phase or higher VRMs would be the ASRock X370 Killer SLI and the ASUS X370-Pro (the X370 Killer SLI has an 8-phase VRM, but the X370-Pro has a 6-phase with higher-quality components). Pre-built PCs with Ryzen 7s are few and far between, and between you and me, you're better off building your own PC than relying on pre-built PCs from the likes of HP, Dell, Acer, etc.
As for the GTX1060, you'd be better off with the 6GB model. The 3GB model isn't just missing half the RAM, the GPU on the card also has fewer shaders than the 6GB model. At that kind of power level, you should just choose what's cheaper between a 6GB GTX1060 or an 8GB RX 580. They're pretty much equal in terms of performance.
On that note, I suggest we move to private messages to continue the conversation, as this thread is starting to go off-topic a little bit.

Vector
01-26-2018, 02:16 PM
Thank you once again for the great answers about emulation/video. I don't want to derail the thread. About PMs: I tried to PM you yesterday so as not to derail this thread, but it said your PM box was full.

https://image.ibb.co/i4b2Yb/Screenshot_2018_01_25_14_22_25gg.png

Ace
01-26-2018, 04:34 PM
I deleted a number of messages after having received multiple e-mail notifications. You can send them to me again.

gamevet
01-29-2018, 02:21 AM
The prices for the GTX 1060 and RX 580 are insane because of miners driving up the prices. The better choice is finding a GTX 980 on the used market.

Ace
01-29-2018, 11:54 AM
The prices for the GTX 1060 and RX 580 are insane because of miners driving up the prices. The better choice is finding a GTX 980 on the used market.

Wholeheartedly agree with that suggestion. I would personally look for a GTX980 Ti, but that's just me; I feel more comfortable with 6GB of VRAM.

gamevet
01-29-2018, 02:37 PM
Unfortunately, the prices of the GTX 980 Ti have risen as well. I'm just glad I got my EVGA GTX 1080 Classified right after the 1080 Ti came out. The price dropped down to $519.

You can expect to pay $400-plus to get a 980 Ti. It was around $325 about a year ago.

Ace
01-29-2018, 09:14 PM
I'm pretty pissed off at the whole situation with RX Vega. I personally have an R9 Fury, RX 580s (4GB and 8GB) and a GTX1070, though I've ditched the R9 Fury in my main rig because it doesn't have HDMI 2.0. I'm now using the 8GB RX 580 because, although it's not as powerful as the GTX1070, I prefer AMD's picture quality and how HDR looks on my 4K TV. Problem is, I can't record at 4K with the RX 580 using ReLive, but RX Vega allows that, and while it is a power hog that isn't as powerful as nVidia's higher-tier offerings, the aforementioned preference for AMD's picture quality and HDR presentation keeps it on my list (I couldn't even get HDR10 to work on the GTX1070, which only seems to want to work with scRGB). I usually move my cards around for testing purposes, though my new case doesn't allow the R9 Fury to fit, so that's out. RX Vega has been a nightmare since launch, with non-stop shortages and custom cards with open-shroud cooling solutions taking an eternity to show up (I am not a fan of blower coolers after having used an 8800GTX and a GTX260 with such a cooler). I game at 4K, but I have no issue turning down details as necessary (unless the drop in visual quality is really bad or performance is below what I would accept, in which case I drop the resolution and up the detail level). Kinda wish more games offered checkerboarded 4K; F1 2017 is the only one I have that does, and the picture quality is very close to what you would get with native 4K, with better performance than even 4K without anti-aliasing.

gamevet
01-29-2018, 11:15 PM
4K just isn't there yet. Even the 1080 Ti has to have settings turned down to run top-tier games at 60fps in 4K.

I have my 1080 paired with a Dell 1440p/144Hz G-Sync monitor. It's the sweet spot for that GPU.

Ace
01-31-2018, 11:15 AM
I had the GTX1080 Ti in my sights for a while until I found that my GTX1070 handled HDR in a bit of an odd manner when compared to my RX 580. I only have one HDR-compatible PC game, and that's F1 2017, which only offers one HDR option for the GTX1070, scRGB, while on the RX 580, I have that plus HDR10. scRGB doesn't look as nice on my TV (yes, I use my 4K TV as a monitor). At the same time, there are some games that I find are not worth running maxed out as you lose a lot of performance for a visual upgrade that is either insignificant or non-existent, and with the GTX1080 Ti still requiring dropping details at 4K to maintain 60FPS, what's the point in buying one? I might as well get something within the GTX1070/GTX1080 performance tier and pocket the savings (however little they may be with this exceptionally inflated pricing), and the best option I can see is a Radeon RX Vega 56 that is flashed with an RX Vega 64 VBIOS, has its HBM2 VRAM overclocked to the limit and its core undervolted as low as it can go.

There are some instances where 4K at 30FPS is acceptable, but it's really game-dependent for me.
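For anyone unfamiliar with the checkerboarding mentioned above: the usual idea is that each frame shades only half of the 4K pixels in a checker pattern of 2x2 quads and fills in the other half from the previous frame, so the per-frame shading cost is roughly that of a half-resolution image. A toy count of the shaded pixels, purely as an illustration of the pattern and not any game's actual implementation:

```cpp
// Toy illustration of checkerboard rendering's cost saving: count how many
// 4K pixels fall in the half of the 2x2-quad checker pattern shaded this
// frame. Not taken from any real renderer.
#include <cstdio>

int main() {
    const long w = 3840, h = 2160;      // native 4K resolution
    const int frameParity = 0;          // a real renderer flips this 0/1 each frame
    long shadedThisFrame = 0;
    for (long y = 0; y < h; ++y)
        for (long x = 0; x < w; ++x)
            if (((x / 2 + y / 2) & 1) == frameParity)   // 2x2 quads laid out as a checkerboard
                ++shadedThisFrame;
    std::printf("Shaded %ld of %ld pixels (%.0f%%) this frame\n",
                shadedThisFrame, w * h, 100.0 * shadedThisFrame / (w * h));
    return 0;
}
```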

gamevet
02-03-2018, 02:35 AM
That 4K @ 30 FPS is going to be harder to obtain when newer games like Far Cry 5 and Final Fantasy XV are released.

Far Cry 5 running at 4K @ 30 FPS will require a GTX 1070 or a Vega 56 with a mix of high and medium settings, while 4K @ 60 FPS will require two GTX 1080 Ti cards in SLI, or Vega 56s in CrossFire, with a mix of high and ultra settings.

https://news.ubisoft.com/article/far-cry-5-pc-specs-and-system-requirements-revealed

I just ran the FFXV benchmark with my overclocked GTX 1080 @ 2061 MHz and the memory @ 11 Gbps, running the benchmark @ 1080p with high settings. I got a score of 8100, compared to the GTX 1070's 6215 and the Vega 56's 4700. The GTX 1080 Ti managed just over 9100 points.
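Putting those numbers side by side relative to the overclocked GTX 1080 (the scores are the ones quoted above; the percentages are just arithmetic on them):

```cpp
// Relative performance computed from the benchmark scores quoted in the post above.
#include <cstdio>

int main() {
    struct Entry { const char* gpu; double score; };
    const Entry results[] = {
        {"GTX 1080 Ti",   9100.0},
        {"GTX 1080 (OC)", 8100.0},
        {"GTX 1070",      6215.0},
        {"Vega 56",       4700.0},
    };
    const double baseline = 8100.0;  // the poster's overclocked GTX 1080
    for (const auto& e : results)
        std::printf("%-14s %6.0f  (%+.0f%% vs GTX 1080 OC)\n",
                    e.gpu, e.score, (e.score / baseline - 1.0) * 100.0);
    return 0;
}
```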