Comparison of 7th generation (HD) game console hardware



sheath
01-12-2013, 06:45 PM
Okay, I'm calling it. We have seen everything that 2005 hardware is going to show us, I don't care how long the megacorps milk their products. I'm ready for a comparison. I'm also ready to have a laugh at my seven-years-younger self, so I'll start the comparison with my own research from the early part of the generation.

http://s4.zetaboards.com/Whip_Ass_Gaming/topic/454081/1/#new


http://dpad.gotfrag.com/portal/story/35372/?spage=1

Some choice quotes:

On the topic of the Consoles' CPUs in comparison to PCs.

"Both the 360 and PS3ís CPUs are heavily stripped down compared to what most of us are probably using on our desktop computers to view this article. Both consoles are labeled as 3.2GHZ, but they donít offer performance comparable to that of a typical Athlon 64 3200+ or better than even an Athlon XP 2800+ CPU. The CPUs inside the Xbox 360 and PS3 are ďIn-Order ExecutionĒ CPUs with narrow execution cores, whereas what we use on our computers are classified as ďOut-of-Order ExecutionĒ CPUs with wider execution cores.

... This is because they've stripped out hardware designed to optimize the scheduling of instructions at runtime. As a result, neither the 360 nor PS3's CPU contains an instruction window. Instead, instructions pass through the processor in the order in which they were fetched; hence both are "In-Order Execution" CPUs."

"# #1 Both consoles are using in-order execution CPUs that are half the speed of out-of-order execution processors when it comes to running most game code, especially the more troublesome type which contains branches, loops and pointers.
# #2 The very code theyíre hoping to get improved performance out of isn't the type to lend itself so easily to multi-threadingÖ to say it's hard would be the understatement of the century.

Here is a bit of what John Carmack, technical director of id Software, has to say about this.

"I do somewhat question whether we might have been better off this generation having an out-of-order main processor, rather than splitting it all up into these multi-processor systems."

"It's probably a good thing for us to be getting with the program now, the first generation of titles coming out for both platforms will not be anywhere close to taking full advantage of all this extra capability, but maybe by the time the next generation of consoles roll around, the developers will be a little bit more comfortable with all of this and be able to get more benefit out of it." "

A direct comparison of both consoles' CPUs.

"GFLOPS is something that gets thrown around a lot, but it should be clear that the peak theoretical GFLOP numbers for both these CPUs are:
# 115GFLOPS Theoretical Peak Performance for 360 CPU
# 218GFLOPS Theoretical Peak Performance for PS3 CPU."
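To put rough numbers behind those peaks, here's a quick Python sanity check. The per-unit rates are my assumptions (3.2GHz clock, and 8 single-precision FLOPs per SPE per cycle via 4-wide fused multiply-add), not an official breakdown of the 115/218 GFLOPS marketing figures, which also count the PPE and other units:

# Theoretical peak = clock (GHz) x number of FP units x FLOPs per unit per cycle.
# Unit counts and per-cycle rates are illustrative assumptions, not official figures.
def peak_gflops(clock_ghz, units, flops_per_unit_per_cycle):
    return clock_ghz * units * flops_per_unit_per_cycle

print(peak_gflops(3.2, 1, 8))   # 25.6 GFLOPS for a single Cell SPE
print(peak_gflops(3.2, 6, 8))   # 153.6 GFLOPS for the 6 SPEs games can use
print(peak_gflops(3.2, 8, 8))   # 204.8 GFLOPS for all 8 SPEs

Compare those SPE-only peaks with the measured IBM numbers quoted below and you can see how far real code lands from "theoretical."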

"In IBMís controlled testing environment, their optimized code on 8 SPE only yielded a performance number of 155.5GFLOPS. If it took 8 SPE to achieve that, no way 6 will be able to and that testing was done in a fashion that didnít model all the complexities of DMA and the memory system. Using a 1Kx1K matrix and 8 SPE they were able to achieve 73.4GFLOPS, but the PS3 uses 6 SPE for games and these tests were done in controlled environments. So going on this information, even 73.4GFLOPS is seemingly out of reach, showing us that Sony didnít necessarily lie about the cellís performance as they made clear the 218GFLOPS was ďtheoretical.Ē But just like Microsoft they definitely wanted you to misinterpret these numbers into believing they were achievable."

"Itís also worth mentioning that even the PS2 CPU had more than twice the GFLOPS of the original Xboxís CPU, but it didnít necessarily lead it to being the performance winner. This time around, while the cell has the GFLOPS advantage, its advantage isnít quite as big as the PS2 CPU had on the Xbox. This teaches us that there is more than one meter of real world performance."

"The reason the PS3ís CPU will be significantly more difficult to program for is because the CPU is asymmetric, unlike the 360ís CPU. Because of the PS3 CPU only having 1 PPE compared to the 360ís 3, all game control, scripting, AI and other branch intensive code will need to be crammed into two threads which share a very narrow execution core and no instruction window. The cellís SPE will be unable to help out here as they are not as robust; hence, not fit for accelerating things such as AI, as itís fairly branch intensive and the SPE lacks branch prediction capability entirely."

"Microsoft made a better decision from the perspective of the developer; it's still difficult, but much easier compared to working with the Cell architecture. The 360ís CPU isnít asymmetric like the PS3ís cell and has 3 PPE as opposed to 1, but all 3 are robust enough to help handle the type of code only the PS3ís single PPE is capable of handling. When Microsoft says they have three times the general purpose processing power this is what they mean. Based on the simple fact that the 360 has 3 Power PC cores to the PS3ís 1, more processing power can be dedicated to helping with things such as game control AI, scripting and other types of branch intensive code."

"Itís highly doubtful that Blu-Ray will lead to better graphics because the PS3, due to split memory pools containing 256MB worth of GDDR3 memory and 256MB worth of XDR memory can at best dedicate 256MB worth of ram to textures at any given moment whereas the 360 uses unified memory for a total of 512MB."

"There are many titles that are currently claiming to already be filling up an entire Blu-Ray disc, but Iíd be lying to myself if I said I actually see visual signs of it. There just isnít anything yet that makes me say ďThis is a result of having Blu-RayĒ. From talking to developers none seem to be concerned with the 360ís disc space and there are some that say they expect disc space to become an issue only if games use a lot of high definition movie content. With the graphical horsepower we have today the need for CG video is dropping significantly, but even so 360 titles like Blue Dragon appear to have a healthy dose of CG video and ďThe DarknessĒ reportedly has over 4 hours worth of high def movie footage in addition to the actual game, all on the same disc."

"One thing that escapes most people is that anything that makes it into a game level is taking up space in memory. Now if the PS3 had a gigabyte maybe even 2 gigs worth of ram, for example, then in that case Blu-Ray would end up being a major factor between the 2 machines, but as of now it seems more like a luxury or convenience rather than a necessity."

"The 360ís DVD drive pulls information off of a 12X DVD disc twice as fast as the PS3ís 2X Blu-Ray does off of a Blu-Ray disc. The 360ís 12 DVD drive has a speed of 16.5 megabytes per second compared to the PS3ís 2X Blu-ray drive which has a speed of 8.7 megabytes per second. I found this information regarding the speed of the ps3's blu ray drive and 360's DVD drive while reading a developer's rants on what he thought about both these consoles and he said both consoles are extremely powerful, but are neglecting something rather important. He says processing speed continues to increase, GPU performance continues to increase and the amount of available memory is increasing and yet there have been no such similar improvements as to how fast they can read data from the disc. He suggests either giving the console 1GB of ram or come up with a solution in the future."

"The Cost of the Operating System"

Xbox 360:

"# 32MB of the 512mb of available GDDR3 RAM
# 3% CPU time on Core1 and Core2 (nothing is reserved on Core0)"

PS3

"# 32mb of the 256mb of available GDDR3 memory off the RSX chip
# 64mb of the 256mb of available XDR memory off the Cell CPU
# 1 SPE of 7 constantly reserved
# 1 SPE of 7 able to be "taken" by the OS at a moments notice (games have to give it up if requested)"
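Taking those reservations at face value, a quick bit of Python shows what's actually left for games on each machine (assuming the quoted figures are accurate):

# Memory left for games after the quoted OS reservations.
total_360, os_360 = 512, 32
print(total_360 - os_360)              # 480 MB for games on the 360

ps3_gddr3, ps3_xdr = 256, 256          # PS3's split pools
os_gddr3, os_xdr = 32, 64              # quoted OS reservations
print(ps3_gddr3 - os_gddr3)            # 224 MB of video memory left
print(ps3_xdr - os_xdr)                # 192 MB of system memory left
print((ps3_gddr3 - os_gddr3) + (ps3_xdr - os_xdr))  # 416 MB total for games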

"RSX (PS3GPU) & Xenos (360GPU)"

"Alright letís get underway the GPU inside the PS3 is NV47 based which is another name for the 7800GTX. It has 24 pixel shader pipelines and 8 vertex shader pipelines. Itís capable of 136 shader operations per clock and according to Sony it has 256MB of GDDR3 memory at 700MHZ and performs 74.8 billion shader operations per second."

(Snip) exposure of Sony's lie about the RSX's FLOP performance

"The RSX has 20.8GB/s of video memory bandwidth from the GDDR3 ram. The RSX has an extra 32 GB/sec writing to the system's main memory. If the RSX can fully utilize the memory system it can achieve pushing out 58.2GB/s worth of pixel rendering to memory. The RSX is pretty much a 7800GTX class GPU in some cases its worse in some cases better, nothing that is really new. Now the same canít be said about the 360ís GPU at all."

"Now the 360ís GPU is one impressive piece of work and Iíll say from the get go itís much more advanced than the PS3ís GPU so Iím not sure where to begin, but Iíll start with what Microsoft said about it. Microsoft said Xenos was clocked at 500MHZ and that it had 48-way parallel floating-point dynamically-scheduled shader pipelines (48 unified shader units or pipelines) along with a polygon performance of 500 Million triangles a second."

(Snip) No current gen game will ever see that many triangles in a second.

"To quote ATI on the 360ís GPU they say.
"On chip, the shaders are organized in three SIMD engines with 16 processors per unit, for a total of 48 shaders. Each of these shaders is comprised of four ALUs that can execute a single operation per cycle, so that each shader unit can execute four floating-point ops per cycle."
# 48 shader units * 4 ops per cycle = 192 shader ops per clock
# Xenos is clocked at 500MHz * 192 shader ops per clock = 96 billion shader ops per second."
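Both shader-throughput claims fall out of the same simple formula; here's a Python check that reproduces the 74.8 billion (RSX) and 96 billion (Xenos) figures, assuming the commonly cited 550MHz RSX core clock:

# Shader operations per second = ops per clock x core clock.
def shader_ops_per_sec(ops_per_clock, clock_mhz):
    return ops_per_clock * clock_mhz * 1e6

print(shader_ops_per_sec(136, 550) / 1e9)  # RSX:   74.8 billion ops/sec
print(shader_ops_per_sec(192, 500) / 1e9)  # Xenos: 96.0 billion ops/sec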

(Snip) Lots of math proving the 360's GPU is super advanced and has better bandwidth than the RSX.

""Needless to say the 360 not only has an overabundance of video memory bandwidth, but it also has amazing memory saving features. For example to get 720P with 4XFSAA on traditional architecture would require 28MB worth of memory. On the 360 only 16MB is required. There are also features in the 360's Direct3D API where developers are able to fit 2 128x128 textures into the same space required for one, for example. So even with all the memory and all the memory bandwidth, they are still very mindful of how itís used."

zyrobs
01-12-2013, 07:44 PM
There is no comparison, all machines of this generation suck.

Hardware-wise the X360 is better designed, easier to use, and arguably more powerful. The PS3's only advantages are Blu-ray disc space and, for select models, backwards compatibility.

Games-wise, the PS3 has more exclusives, but multiplatform titles look worse.

This is the gist of it.

sheath
01-13-2013, 10:12 AM
And now for the Wii U:



http://en.wikipedia.org/wiki/Wii_U

Processors (main article: Espresso (microprocessor))

CPU: IBM PowerPC 750-based tri-core processor "Espresso", reportedly clocked at 1.24 GHz.

GPU: AMD Radeon High Definition processor codenamed "Latte", with an eDRAM cache built onto the die, reportedly clocked at 550 MHz.

The Wii U CPU is designed by IBM. It is described by IBM as an "all-new, Power-based microprocessor"; it is a multi-core design manufactured at 45 nm with an eDRAM cache. Neither Nintendo nor IBM has revealed detailed specifications, such as the number of cores, clock rate, or cache sizes. References have been made to the chip containing "a lot" of eDRAM and "the same processor technology found in Watson". The Wii U CPU is produced by IBM at their 300 mm semiconductor manufacturing facility in East Fishkill, New York. Both the CPU and the GPU are on one MCM.

RAM

2 GB total, consisting of four 512 MB (4 Gb) DDR3-1600 DRAM chips at 12.8 GB/s total bandwidth, with 1 GB reserved for the operating system and unavailable to games.




That's a CPU clocked at roughly a third of the Xbox 360's, and DDR3-1600 RAM with about half the bandwidth. Supposedly the newer video chip can compensate for this, though.
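The bandwidth figures follow directly from bus width and transfer rate; here's a quick Python check (taking the 360's GDDR3 as 700MHz, i.e. 1400MT/s effective, on a 128-bit bus, which reproduces the usual 22.4 GB/s figure):

# Peak DRAM bandwidth = effective transfer rate x bus width in bytes.
def bandwidth_gb_s(mt_per_sec, bus_bits):
    return mt_per_sec * (bus_bits / 8) / 1000

print(bandwidth_gb_s(1600, 64))    # Wii U: 12.8 GB/s (DDR3-1600, 64-bit bus)
print(bandwidth_gb_s(1400, 128))   # 360:   22.4 GB/s (GDDR3 at 1400MT/s, 128-bit)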

TVC 15
01-13-2013, 01:40 PM
The large pool of eDRAM helps a lot in the Wii U's case, much like the 360's own eDRAM for effects that require a lot of bandwidth.

However, according to sources the Wii U has 32MB of eDRAM versus the 360's 10MB, which is enough to either keep a full 1080p frame buffer in there or a 720p buffer with some anti-aliasing; the design thus relies a lot on eDRAM bandwidth to make up for the slower main RAM pool.
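A rough Python check of whether those buffers actually fit in 32MB, assuming 4 bytes per pixel for color and another 4 for depth (real buffer layouts will differ):

# Do common framebuffer setups fit in 32MB of eDRAM?
def buffer_mb(width, height, bytes_per_pixel, samples=1):
    return width * height * bytes_per_pixel * samples / 2**20

print(buffer_mb(1920, 1080, 8))             # 1080p color+depth: ~15.8 MB
print(buffer_mb(1280, 720, 8, samples=4))   # 720p with 4X MSAA: ~28.1 MB

Both fit, which lines up with the claim above.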

By keeping a lot of stuff such as shadow buffers, back buffers etc inside the very fast on-chip eDRAM, I assume you wouldn't incur a massive penalty from using the slow RAM, hence why it's able to just keep up with multi-platform ports where the lower system bandwidth would otherwise cripple the game a lot more. Technical articles such as those at Digital Foundry on Eurogamer, comparing cross-ports of big titles like Batman: Arkham City, definitely indicate the eDRAM helps. Shadow buffers, for example, are much lower resolution than in the comparable PS360 ports, indicating difficulty squeezing everything into the 32MB of eDRAM, where the other platforms can simply use main unified RAM (360) or VRAM (PS3), which have higher bandwidth.

A lot of devs have also complained about the Wii U's main CPU; some have even gone as far as naively stating it's little more than 3 Wii CPUs duct-taped together, but that's far too simplistic, even if the Wii U's CPU isn't massively high performance. I assume most contemporary titles rely on the floating-point grunt of the PS360's designs, where, using simplistic prevailing logic, if the Wii U CPU is refabbed and updated Wii tech, then it will probably have more ALUs but less floating-point grunt. But game code is completely different to standard operating system and application code, with specific needs; there have been mentions of background objects missing in Wii U games, indicating the weaker CPU struggles to handle the logic and branching code for objects onscreen.

However, you can't really just stick 3 Wii Broadways together; the design they originated from, the PowerPC 750 (aka the Apple-branded G3), lacks multi-processor logic, and you would have to rework the Broadway CPU for cache coherency and multi-processor support to glue them all together, which means IBM has probably done a lot of tinkering to the basic design. Perhaps they've beefed the general design up a bit with more ALUs and better out-of-order support, since the G3 was a very basic design. I would imagine it's a new CPU design compatible with Gamecube/Wii code. There may also be some additional vector/FPU units.

According to a developer, ERP, over at Beyond3D (who worked on the team that coded World Driver Championship on N64), there is a sad irony to how Nintendo designed the Wii to be low cost and developer friendly: it actually cost more for devs to redesign pre-existing engines from lead platforms like the Xbox 360 and PS3, or build new engines from scratch, along with reworking assets to sub-HD level, hence the lower developer interest in the Wii. Third party sales also convinced devs and publishers it wasn't worth the effort.

I am, from a gameplay perspective, genuinely interested in the Wii U, while respecting it's not to everyone's tastes; there's certainly a Wii U fail bandwagon. But I could see Nintendo painting themselves into a similar corner next gen, when the PS4/Durango 720 release with massively higher specs using more traditional AMD x86 processors, leaving the Wii U orphaned yet again, with the tablet controller being less an incentive than an actual disincentive to make ports, like the Wii's motion controls requiring reworking games. There have been some limited attempts at redressing the balance with the Pro Controller.

sheath
01-13-2013, 02:57 PM
Supposedly the Radeon chip in the Wii U is the same as an HD 4000 series. My HD 4850 blows the balls off of the PS360 for light shading, shadows, texture mapping and speed. I was able to run Batman: Arkham Asylum/City at 1080p and 30FPS, or 720p at 60FPS with 8x FSAA and a generation or two newer shaders than the PS360 have. This was all on my older dual-core Athlon X2 6400+ (3200MHz). I would at least hope that the Wii U's CPU is more designed for game calculations than my general-purpose dual core is.

I also agree there are a lot of things IBM could have done with that CPU to compensate for the slower speed; in fact they would need to, or the games would already be showing half the performance of the PS360 or worse. Anyway, here is a GPU comparison from Tom's Hardware.





GeForce                                    | Radeon
GTX 295                                    | 4870 X2
GTX 280, GTX 285                           | 4850 X2
9800 GX2, GTX 260                          | 4870
8800 GTX, 8800 Ultra, 9800 GTX, 9800 GTX+  | 3870 X2, 4850 (Wii U?)
8800 GT 512 MB, 8800 GTS 512 MB, 9800 GT   | 4830 (Wii U?)
8800 GTS 640 MB, 9600 GT                   | HD 2900 XT, 3870
8800 GS, 9600 GSO                          | 3850 512 MB, Mobility 3870, 4670
8800 GT 256 MB, 8800 GTS 320 MB, GO 8800M  | HD 2900 PRO (360?), 3850 256 MB, Mobility 3850, 4650 (Wii U?)
7950 GX2                                   | X1950 XTX
7800 GTX 512, 7900 GTO, 7900 GTX           | X1900 XT (360?), X1950 XT, X1900 XTX
7800 GTX (PS3), 7900 GT, 7950 GT           | X1800 XT, X1900 AIW, X1900 GT, X1950 PRO, HD 2900 GT (360?)

I have highlighted the respective graphics chips in their generations of PC hardware. Next I'll see if I can't find a benchmark or two with some performance numbers for each card on a quad core general purpose PC processor.

Chilly Willy
01-13-2013, 03:33 PM
The RAM in a modern Wii (from the AnandTech breakdown) is misleading... the original Wii had 64MB of GDDR3 RAM, so that's all games can count on to be compatible with all models. I imagine that new models use the extra RAM in the OS apps, like for browsing and such. So for purposes of comparison (gaming), you can only count 64MB for system RAM on the Wii. It has another 24MB of 1T-SRAM and 16MB of ARAM, mostly for GC compatibility.

kool kitty89
01-21-2013, 04:28 AM
The Wii U is just weird and disappointing on the raw hardware front . . . it's an even worse situation than the Wii was, since the Wii was at least universally superior to the best 6th gen consoles (well, the Xbox arguably has some modest advantages in some areas), but the CPU is a huge bottleneck on the Wii U, and even worse, this apparently wasn't the case on the preproduction dev units. (From what I've read, several developers had to scale games back, others dropped them entirely, and still others who had shown interest in future development -like Bethesda- dropped that after the final release specs.)

On top of that, the Wii-U is even further behind given just how old the other 7th gen consoles are . . . with the launch of the Xbox 360 in 2005, this has been the longest generation ever without a superior and/or full next generation system being launched by either a competitor or the parent company (even the Famicom's 1983-1990 timeframe has been exceeded by Microsoft).

Clock speed certainly isn't everything, and the G3 (PowerPC 750) derived Wii CPU core is certainly faster per clock than the PPE-based cores of the Xenon and Cell (SIMD comparisons aside), but the clock speeds are so vastly different that it really can't come close for CPU-bottlenecked games . . . though the PS3 could have more trade-offs (a lot of trade-offs depending on how balanced multithreading is and how the SPEs can be used -though for the latter, any added SIMD capabilities of the Wii U CPU as well as newer GPU core features would be mitigating factors for the SPEs).
Had it been something like a 2 GHz triple-core PowerPC G4 derived CPU (or maybe even a PPC 970/Power4), it would have been a lot more sensible and still shouldn't have been a major compromise in terms of cost/power consumption (better for the G4) . . . or a hypothetical 2 GHz G3 for that matter, if the short pipelines weren't a limiting factor. (Or whatever was in the dev units.)

That and 2 GB of shared RAM (with 1/2 locked to OS usage) is still pretty limited for a forward-looking console, though given it's in simple DIMM-type modules now, there's the slight potential for expansion. (Albeit internal only, so in-store only for an official upgrade, or a user-based, warranty-voiding hack . . . so not likely Nintendo's plan.)





But on that note, for the actual mainstay 7th gen consoles, my biggest thought would be on RAM. TBH, this is something that goes back to when CD/mass storage based consoles became the de facto standard (ie mid 90s). If you could make provision for just 1 component, it should really be RAM, especially for a system with a unified memory subsystem (thus expanding the RAM shared for most/all purposes). Rather ironic that the last system to do this was the N64, and it benefited a fair bit less than a CD based console might have.

RAM is one of the most basic components to a system, and also one that tends to be a mass-market commodity that dramatically drops in price. The Playstation used common EDO DRAM for its main memory, the Saturn used SDRAM (which became commodity towards the end of its life), the Dreamcast used PC-100 SDRAM (common PC standard at the time), PS2 used RDRAM (more exotic, but still came close to commodity prices while competing with DDR1 in the early 2000s), and on to the real topics of choice:

The Xbox 360 uses GDDR3, high-end at launch and never commodity PC memory (like DDR2 or DDR3), but it became the mainstream/lower-end standard for video cards by the middle of the system's life, and should easily have been cheap enough to be offered as a cost-effective expansion module allowing 2, 3, or perhaps 4 times the launch system's RAM at a sensibly low price for the user (depending on MS's marketing scheme -ie loss as with console hardware, or profit). To really make sense, they'd have to standardize newer models with expanded RAM as the default, and offering more than 1 step in RAM expansion probably would get too confusing for marketing/users to be worthwhile.
They made no hardware or marketing provisions for that, of course, so it was never a real option from the launch of the console onward . . . same for the PS3, though the use of XDR DRAM for main memory would complicate that somewhat by comparison.

The raw computational and graphics power of the Xbox 360 isn't horribly off from a basic/budget bottom end gaming-capable PC of today if not for the RAM bottleneck . . . and, hell, if they handled detail options somewhat like PC games tend to (along with decent auto-detect presets), you could have a lot of games benefiting from RAM expansion without absolutely requiring it either. (higher res textures, other added details, more buffering -less frequent loading, options for triple buffering video, etc, etc) While games that really needed it to work at all would add to the system what would otherwise be totally absent.


IMO, this is one area where the concept of add-ons for consoles really shouldn't have been abandoned. Things like CPU or GPU upgrades are far more troublesome, let alone optical drive upgrades (particularly since PCs still haven't adopted BD for games -unlike the FD to CD or CD to DVD transitions), though HDD upgrades are accessible and common at least. (and really necessary in many cases)

-And a side note on Blu-ray in general: there's been a shift in emphasis to realtime, dynamic/semi-interactive cutscenes combined with DLC options (and mods) that makes the transition less pressing than CD to DVD. With CD games, there were dozens of cases of multi-disc games (4 to 7 discs not that uncommon on PC), and that could get really cumbersome . . . so the switch to DVD was a godsend, and a big chunk of those discs were due to multimedia content.
You just don't see that with current DVD based games (including PC exclusives) . . . but it does beg the question of whether developers are intentionally holding themselves back or whether they really don't need the added space.

kool kitty89
01-21-2013, 04:40 AM
Supposedly the Radeon chip in the Wii U is the same as an HD 4000 series. My HD 4850 blows the balls off of the PS360 for light shading, shadows, texture mapping and speed. I was able to run Batman: Arkham Asylum/City at 1080p and 30FPS, or 720p at 60FPS with 8x FSAA and a generation or two newer shaders than the PS360 have. This was all on my older dual-core Athlon X2 6400+ (3200MHz). I would at least hope that the Wii U's CPU is more designed for game calculations than my general-purpose dual core is.
Saying it's 4000 series means virtually nothing . . . that could imply anything down to the likes of the basic HD-4350 or even the HD-4290 IGP (both massively outclassed by the Xbox 360 GPU).

Now, given the rumored performance specs (and the absolutely necessary lower power consumption), it's probably something more in line with the HD-4650, which is far off from the 4850, but still way ahead of the other consoles. (And pretty similar to AMD's current top-end 7660 IGP in their A10-5800 APU -ignoring DX11 features, the 7660 is ahead in FPU performance but behind in texture/pixel fillrates going by the raw specs on wiki -actual benchmarks/games would vary obviously, depending on those aspects and how well drivers actually utilize the hardware.)

Kind of strange that they'd be bothering with a 4000 derivative at all though, given AMD's emphasis on 6000 and 7000 series embedded GPUs.

In any case, this is mostly moot, as the Wii U CPU will be the primary bottleneck in a huge number of games . . . though use of multiple touch-pad controllers would mitigate that more (being GPU intensive), aside from fundamentally GPU heavy games with limited CPU overhead. (though even there you'd have a problem if things weren't very multi-thread balanced -a single thread intensive/bottlenecked game would still drag things down)
As such, that aforementioned A10-5800 would kick the crap out of the Wii U in the vast majority of cases, even though the GPU is (presumably) no better. (actually, most modern PC games end up GPU bottlenecked on that APU, at least at 1920x1080)

sheath
01-21-2013, 08:50 AM
Yeah, I agree that it could be any 4000 series card. With the Wii U CPU being as slow as it is, I would assume that the GPU would have to be at least one generation better than the one in the 360 to even be able to play the games it already has available. I don't even understand what the tablet gaming is all about on the Wii U; it sounds like such a weird thing.

TVC 15
01-21-2013, 05:06 PM
The Wii U is just weird and disappointing on the raw hardware front . . . it's an even worse situation than the Wii was, since the Wii was at least universally superior to the best 6th gen consoles (well, the Xbox arguably has some modest advantages in some areas), but the CPU is a huge bottleneck on the Wii U, and even worse, this apparently wasn't the case on the preproduction dev units. (From what I've read, several developers had to scale games back, others dropped them entirely, and still others who had shown interest in future development -like Bethesda- dropped that after the final release specs.)

The Xbox did have a slightly more feature-rich DX8-equivalent GPU, but you're certainly right, the Wii was rewarmed 6th gen, and it topped most of the stuff from that period when developers could be bothered. Still pretty pathetic they couldn't have bolted on an R300 GPU, with some workarounds for backwards compatibility if that was such a big selling point.


Clock speed certainly isn't everything, and the G3 (PowerPC 750) derived Wii CPU core is certainly faster per clock than the PPE-based cores of the Xenon and Cell (SIMD comparisons aside), but the clock speeds are so vastly different that it really can't come close for CPU-bottlenecked games . . . though the PS3 could have more trade-offs (a lot of trade-offs depending on how balanced multithreading is and how the SPEs can be used -though for the latter, any added SIMD capabilities of the Wii U CPU as well as newer GPU core features would be mitigating factors for the SPEs).
Had it been something like a 2 GHz triple-core PowerPC G4 derived CPU (or maybe even a PPC 970/Power4), it would have been a lot more sensible and still shouldn't have been a major compromise in terms of cost/power consumption (better for the G4) . . . or a hypothetical 2 GHz G3 for that matter, if the short pipelines weren't a limiting factor. (Or whatever was in the dev units.)

I'm pretty sure Motorola/Freescale hold the IP for the G4, so that's out of the question. The G5 was just part of that whole horrible NetBurst generation of processors: brute clock speed with less efficiency per watt/cycle, and all the associated electrical and heat problems. Xenon and the Cell's PPE were based on PPC970/Power4 tech, but I would guess you already know that.


That and 2 GB of shared RAM (with 1/2 locked to OS usage) is still pretty limited for a forward-looking console, though given it's in simple DIMM-type modules now, there's the slight potential for expansion. (Albeit internal only, so in-store only for an official upgrade, or a user-based, warranty-voiding hack . . . so not likely Nintendo's plan.)


If Nintendo believed a mid-gen RAM upgrade was viable, there would be an expansion slot; this isn't the early 90s, carting your Apple LC II over to a qualified engineer to get the motherboard swapped for a Quadra. ;)

I imagine Nintendo will free up more RAM once the OS is better optimised. Hopefully for disc caching; apparently loading times are atrocious.

The other big issue is RAM bandwidth; aggregate bandwidth is around 12.8 GB/s due to the 64-bit bus. Ninty cheaped out on the RAM. The eDRAM pool is there to alleviate that. I imagine the latency will be a lot better on the Wii U, perhaps with a decent memory controller; also, doesn't DDR3 handle latency better than GDDR3? I'll have to check up on that.

Anyway latency was pretty horrendous on PS360.

kool kitty89
01-21-2013, 06:25 PM
Yeah, I agree that it could be any 4000 series card. With the Wii U CPU being as slow as it is, I would assume that the GPU would have to be at least one generation better than the one in the 360 to even be able to play the games it already has available. I don't even understand what the tablet gaming is all about on the Wii U; it sounds like such a weird thing.
A fast GPU still won't help CPU-bottlenecked games at all. For certain things (like physics) you can potentially offload some of the computation work to the GPU (sort of like some of the intended uses of the SPEs), but that only helps some CPU-bottlenecked areas, and I'm pretty sure that technique is mostly emphasized on Nvidia's GPUs (along with their PhysX software).

The hybrid tablet gaming thing actually has some really interesting potential that's partially demonstrated in Nintendoland. That includes some things that have already been done previously via network play (mostly just exploiting the 2nd monitor), along with other things taking more specific advantage of the tablet's touch/motion control abilities.

OTOH, the relatively high cost of the tablets and the added GPU overhead of each added screen limits the practicality to a pretty significant degree.
And, of course, it also has the potential to be inundated with cheap, poorly tacked-on gimmicks with poorly made games solely riding on that or games forcing that feature without benefiting from it at all . . . and in the latter case, quite often being worse off than with conventional controls. (ie the same thing that happened to numerous DS and Wii games, even including some relatively well made exclusives like Star Fox Command) It's really stupid that more games didn't at least allow the option for conventional input methods. For the DS and Wii games that actually benefited from touchscreen, motion, and (especially) pointer controls, some were only about equal to conventional controls while a select few really used it well . . . and if only those sorts of games had forced the control method, it would be another story entirely.




The Xbox did have a slightly more feature-rich DX8-equivalent GPU, but you're certainly right, the Wii was rewarmed 6th gen, and it topped most of the stuff from that period when developers could be bothered. Still pretty pathetic they couldn't have bolted on an R300 GPU, with some workarounds for backwards compatibility if that was such a big selling point.
That and a faster CPU . . . even in the existing 750-based series there were CPUs in the 1 GHz range, and a 1 GHz dual-core G3 derivative would have been a lot more sensible at least. It still should have made for a much lower-power, lower-cost system than the competition, perhaps more analogous to how the Dreamcast hardware compared with the later 6th gen fare.


I'm pretty sure Motorola/Freescale hold the IP for the G4, so that's out of the question. The G5 was just part of that whole horrible NetBurst generation of processors: brute clock speed with less efficiency per watt/cycle, and all the associated electrical and heat problems. Xenon and the Cell's PPE were based on PPC970/Power4 tech, but I would guess you already know that.
I'd gotten more the impression that the G5/Power4 had a more typical high IPC rate, but was just power hungry (somewhat more in line with the early Athlon 64s than the P4 -plus at that time it was still the contemporary of the cooler-running Northwood rather than the Prescott). IIRC the G4 ALU had a much higher IPC rate than the PPE/Xenon cores at least, which was apparently a shocking problem for early developers transitioning from the dual-G4-based dev units. There was a very long-winded discussion on this topic here: http://forum.beyond3d.com/showthread.php?t=33335 (also cropping up was the Xenon's superior SIMD performance, but conversely that was also true for the Pentium D vs the Core Duo, with the latter having better real-world gaming performance -also pointing out some of the fundamental ALU bottlenecks).
In any case, we're also talking about a revised (hypothetical) modern 45 nm part.

OTOH, if the existing G3 derivatives were/are totally capable of higher speeds and were clocked back for power/heat and/or yield reasons, then it certainly could have been those that were featured in the (rumored) dev kits. I may be wrong on my presumption of the short pipeline too, as I'd gotten the impression that the entire 750 series (and this current derivative) were still working with the old short 4-stage pipeline . . . or maybe I'm mistaken in thinking such a short pipelined CPU wouldn't be able to scale up to such high speeds, even at 45 nm. (in which case, it also kind of makes me wonder how something like AMD's old K6 core might have scaled up, even with its 6-stage pipeline)


If Nintendo believed a mid-gen RAM upgrade was viable, there would be an expansion slot; this isn't the early 90s, carting your Apple LC II over to a qualified engineer to get the motherboard swapped for a Quadra. ;)
Yeah, I get that . . . I rambled too much in that comment. Still, there's the potential for homebrew/hacker exploits with RAM expansion. (only useful for homebrew software, of course)


I imagine the latency will be a lot better on the Wii U, perhaps with a decent memory controller; also, doesn't DDR3 handle latency better than GDDR3? I'll have to check up on that.

Anyway latency was pretty horrendous on PS360.
It may have been more an issue with early-generation GDDR3, as with early-gen DDR2 having such high latencies that the added bandwidth was useless compared to old DDR-400. OTOH, by 2005 DDR2 latencies were pretty decent, so that might be another argument for plain dual-channel DDR2 being a better alternative to what MS ended up using. (Along with lower cost and commodity/mass-market supply, either allowing better cost trade-offs -and maybe better, ahem, quality control- or maybe more RAM from day 1.)


smurfted
02-05-2013, 06:17 PM
A serious shame about the WiiU..

sheath
02-21-2013, 11:22 AM
OK, that's more of a complaint about the convenience (and distribution costs) of multiple DVDs vs BD, as well as the PS3's lowest-common-denominator HDD space being higher than the 360's at that point. (Thus being more risky to require a large chunk of HDD for a permanent install and potentially alienate some 360 owners . . . granted, 20 GB PS3 owners wouldn't be THAT much better off.)

It is, but it was also a technological thing, because Carmack admitted he started the super-texture thing with high-capacity storage in mind. Here is an interview (now deleted) from Tom's Games (http://web.archive.org/web/20080831113051/http://www.tomsgames.com/us/2008/08/07/carmack_interview/) where Carmack discusses some of it from 2008's perspective. I think what I saw was this Tom's Guide report (http://www.tomsguide.com/us/carmack-id-rage-xbox-ps3,news-2254.html) at that time, which confirmed what I had been arguing over at the WAG forums for a couple of years. Apparently VGChartz and 1UP reported on it (http://www.vgchartz.com/article/1614/quakecon-carmack-dishes-dirt-on-sony-and-microsoft/) as well. They were also discussing this online in 2007 (http://forum.blu-ray.com/showthread.php?t=13112). It was all the rage to discuss whether this one engine would finally prove the PS3's superiority and shift sales from the Xbox 360 on over.

sheath
02-20-2014, 10:53 PM
Okay, I finally found an old write up I did on the Tech 5 engine and super textures for the first HD generation consoles.

"Xbox 360 and PS3 don't need HD disc formats for games. (http://www.gamespot.com/profile/blog/xbox-360-and-ps3-dont-need-hd-disc-formats-for-gam/25190289/)

by SHEATH013 (http://www.gamespot.com/profile/SHEATH013/) on October 21, 2007 0 Comments (http://www.gamespot.com/profile/SHEATH013/blog/xbox-360-and-ps3-dont-need-hd-disc-formats-for-gam/25190289/#comments-block)

Oct 21 2007 Sony's hype machine has convinced many that HD disc formats will become necessary for games within the lifespan of the 360 and PS3. Yet again, Sony's hype and reality simply do not match up. First, it is an illusion that a storage medium larger than 8GB can be used to benefit games on these two systems directly. What will be shown is that the added space of an HD disc format will only be used for FMV cutscenes and more high definition audio. Games will not be made for either the 360 or PS3 that require more than 8GB of storage space for game data.

My first question is whether or not either of these consoles can display textures of a resolution for which a larger disc format would be needed. Bandwidth from the system RAM to the GPU is going to be the best indicator of this.

http://www.gamespot.com/features/6125087/index.html?type=tech

The Xbox 360 and the PS3 have 22.4 GB/s of memory bandwidth relevant to the GPU. That translates to a maximum of 746 MB per frame of data transfer from RAM to the GPU at 30 frames per second. That means that even if the Blu-ray or DVD drive could stream 746 MB thirty times per second to RAM, and the system could render this much data per frame, the theoretical maximum amount of data that can be on screen at once is 746 MB. In a vacuum with no limitations on texture mapping, that would translate into 746 512x512 textures. This size of texture is huge by current standards, and the highest screen resolution is still 1920x1080, so only eight of these textures could fit on the screen at one time. Running with this hypothetical situation, let's say that a game will never reuse a single texture over 10 levels (another impossibility); that would mean that the game would take up 7460 MB, or just under 7.5 GB. These systems are not capable of rendering this much data per frame. http://www.gamespot.com/pc/driving/idtech5/news.html?sid=6176210&tag=topslot;action;1
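The per-frame arithmetic in that paragraph can be checked in a few lines of Python (the post rounds a 512x512 32-bit texture to 1 MB, hence its count of 746; the exact binary-MB count comes out a bit lower):

# Per-frame data budget = memory bandwidth / frame rate.
bandwidth_bytes = 22.4e9          # RAM-to-GPU bandwidth cited above
fps = 30
budget = bandwidth_bytes / fps
print(budget / 1e6)               # ~746.7 MB of traffic per frame

texture_bytes = 512 * 512 * 4     # one 32-bit 512x512 texture
print(budget / texture_bytes)     # ~712 such textures per frame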

However, there are a few choice quotes here about Id's Tech 5 engine being able to display as many or as few textures as the developer wants, and this not affecting graphical performance. Carmack has apparently developed something unique that will revolutionize bandwidth utilization, but there will still be limitations.

Now, apply this possibility to a comment on megatextures that I found here: http://episteme.arstechnica.com/eve/forums/a/tpc/f/48409524/m/820001606831/p/14
QUOTE Originally posted by Hat Monster: The biggest texture you're likely to see on any of the current generation systems is 4096x4096 (16.7 million pixels) and each pixel is four bytes, making the texture exactly 64MiB. Your texture would be 62.5 GIGAbytes. Even compressed with something like DXTC3, you're looking at a cool 20GB.

I'm not aware of any system capable of rendering with such a large texture.

I think Carmack used it for exaggeration, but id seems firm on the fact that megatextures are the future. I will concede that the meaning of this is "textures up to 128000x128000 pixels". I'm sure a world overlay texture would more likely be a quarter of that, but still. BIG! And more detailed textures for place-ables will add up fast too.


Let's take that 4096x4096 texture as an example; it is 64MB in size. Both the PS3 and the 360 are fairly well limited to 256MB of video RAM, which would mean that only four such textures could possibly, ever, fit into one scene. Now, we know that 512x512 textures are considered large today, but we are talking about needing a larger storage medium for textures. How many practically unusable (for the PS3 and 360) 4096x4096 textures would fit onto 8GB? Roughly 125 textures of this size would fit on one DVD uncompressed. That is not considering using lossless compression on the disc, which could account for as much as 4X the storage. So, when 360 and PS3 games need over 500 unique 4096x4096 textures to function, let me know, and I will concede the point.
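A quick Python version of that texture math (using the 8.5 GB capacity of a dual-layer DVD-9; the "roughly 125" above comes from rounding to a flat 8 GB):

# Uncompressed 32-bit texture sizes, and how many fit on a dual-layer DVD.
def texture_mib(side_px, bytes_per_pixel=4):
    return side_px * side_px * bytes_per_pixel / 2**20

print(texture_mib(512))     # 1 MiB
print(texture_mib(4096))    # 64 MiB
dvd9_bytes = 8.5e9          # dual-layer DVD capacity
print(dvd9_bytes / (texture_mib(4096) * 2**20))  # ~127 such textures per disc

Now, let's consider disc access speeds of these systems.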
Xbox 360:
http://www.cdfreaks.com/news/Xbox-360-external-HD-DVD-drive-supports-movies-not-games.html

"The internal DVD drive of a XBOX 360 is a 12x drive with about 12 MBPS max transfer rate.

The internal drive of a PS3 is a BD 2x drive with about 8 MBPS max transfer rate."

That means that the 360's drive can do 96 Mbits/s data transfer to the PS3's 64Mbits/second maximum.

Another example, this time with a maximum and minimum comparison between the two formats.
http://www.gamespot.com/pages/profile/show_blog_entry.php?topic_id=23916169&user=skektek

"The comparison

Mb = megabits
MB =megabytes

Blu-ray 1x: 36Mbps / 4.5MBps
12x DVD: 66 - 132Mbps / 8.2 - 16.5MBps

Blu-ray 2x: 72Mbps / 8MBps
12x DVD: 66 - 132Mbps / 8.2 - 16.5MBps

Blu-ray 3x: 108Mbps / 13.5MBps
12x DVD: 66 - 132Mbps / 8.2 - 16.5MBps

Blu-ray 4x: 144Mbps / 18MBps
12x DVD: 66 - 132Mbps / 8.2 - 16.5MBps"

PS3 is a 2X, so its constant speed is slower than the minimum of the 360's drive. At any rate, neither system can stream enough data to keep up with the system's internal memory bandwidth. This means that the above hypothetical situations are entirely impossible for both the PS3 and Xbox 360. Megatextures will not replace standard-resolution textures in this generation, and therefore console gaming can safely dwell on DVDs until the next-gen wars heat up. With a variety of lossless compression methods at developers' disposal, and with the storage medium still being the bottleneck for both the PS3 and the Xbox 360, both true HD audio and high-definition FMV cutscenes will only be of limited use.
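To underline just how large that gap is, a two-line Python comparison of the drives against the 22.4 GB/s memory figure used earlier (taking the quoted peak rates):

# How far disc streaming falls short of the memory subsystem.
mem_bw = 22.4e9                       # bytes/sec, RAM-to-GPU
print(mem_bw / 16.5e6)                # 12X DVD:    ~1358x slower than memory
print(mem_bw / 8e6)                   # 2X Blu-ray: ~2800x slower than memory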

Also, I've been looking for these middleware articles everywhere, finally found them too."

"What can we really expect from Next gen systems? (http://www.gamespot.com/profile/blog/what-can-we-really-expect-from-next-gen-systems/24338055/)

by SHEATH013 (http://www.gamespot.com/profile/SHEATH013/) on February 19, 2006 0 Comments (http://www.gamespot.com/profile/SHEATH013/blog/what-can-we-really-expect-from-next-gen-systems/24338055/#comments-block)
Feb 19 2006 Climax "Blimey 2" Middleware specs for PS3 *and* Xbox 360
http://www.beyond3d.com/forum/showthread.php?p=175494
http://www.gaming-age.com/news/2003/11/18-49 (https://web.archive.org/web/20070812211718/http://gaming-age.com/news/2003/11/18-49)
http://www.gamenews.pcvsconsole.com/view.php?news=2227 (https://web.archive.org/web/20060219180540/http://gamenews.pcvsconsole.com/view.php?news=2227)

"Performance Specs:
4 pass renderer - 12 million poly/sec
# Sprite renderer - 6 Million sprites/sec (12 M polys/sec), fully textured
# Terrain renderer - 10 million poly/sec
# Textures - 12MB per frame, fully managed
# Xbox - Full usage of both pixel and vertex shaders
# Hundreds of interactive objects in one scene
# 32 players online "

There's our next gen performance, and now we know no current gen console was pushing over 5 million polygons per second, though they were all doing their own things in texture mapping and special effects."

Crazyace
02-21-2014, 02:22 AM
Weren't those Blimey 2 specs for the Xbox/PS2, not the 'next gen' consoles at the time? The press release was in 2003 after all, and mentions the following (from here, as I couldn't get your links to work: http://forum.beyond3d.com/showthread.php?t=7746):



"We've already got games in development that are using Blimey 2 and we're ready now for PS3, Xbox 2, PSP and any other next generation games platform."

sheath
02-21-2014, 08:31 AM
That is a great question. I went back and checked the links last night on archive.org shortly before going to bed. I updated two of them in the previous post. Thanks for finding the beyond3D thread, I hadn't found that yet.

This press release (https://web.archive.org/web/20070812211718/http://gaming-age.com/news/2003/11/18-49) leaves me with a lot of questions. Is the 12 million polygons per second four-pass figure a total? The other figures don't seem compatible with 6th generation consoles, but perhaps they are. Everything about the press release and even the forum discussion is aimed at the soon-to-be-finalized 7th gen HD consoles. It is somewhat vague about the target platform(s) though, and keep in mind that Blimey 1 & 2 were used for PC games as well.

I will highlight the new features that I don't think apply to Xbox, PS2 and Gamecube capabilities.

"Blimey 2's improvements include:


New plug-in based rendering architecture allowing complete control of rendering.
Massive terrains - dynamic LOD, streaming heightfields.
Advanced particle systems - multiple chainable behaviors.

Full screen effect - motion blur, depth-of-field, bloom effects.
Highly optimized 4 pass rendering - allowing base texture, environment map, specular highlights and damage / scratches and scrapes.

Advanced character animation - supporting blending, overlays, component animations, IK, ragdoll and event triggers.

Solid multi-platform core libraries - streaming file systems, powerful memory management, optimizes vector mathematics.
State-of-the-art sound rendering - Dolby Surround, 3D sounds with multiple realtime effects, streaming music.
Highly optimized libraries - optimization using Sony's PS2 performance analyzer ensuring maximum performance.
Fully cross-platform online support (including support for online API's such as Gamespy, Xbox Live, SCE-RT etc).
DYNE2 highly advanced and optimized physics library - constraint based rigid body dynamics, adaptable, flexible and accurate collision detection, multi-body equation solver.
Advanced AI framework and library for all racing games, including those that have split routes.
Enhanced toolset allowing artists and designers greater control over fine tuning game assets for outstanding visuals and gameplay (Nipple, Tomcat). Key Features:
Modular architecture
Solid design principles
Dedicated core development team
Fully cross-platform
Open and extensible Performance Specs:
4 pass renderer - 12 million poly/sec

Sprite renderer - 6 Million sprites/sec (12 M polys/sec), fully textured

Terrain renderer - 10 million poly/sec

Textures - 12MB per frame, fully managed
Xbox - Full usage of both pixel and vertex shaders
Hundreds of interactive objects in one scene

32 players online "


I don't think 6th gen consoles saw 10 million polygons per second in the terrain alone. We did find some choice quotes from Melbourne House about Grand Prix Challenge having 6000 polygons per car model with 22 cars on track at 60FPS. That would be just shy of 8 million polygons per second for the cars alone, if true. I mixed that figure up with some other game: 11,000 polygons per car across 22 cars at 60FPS would be 14,520,000 per second for the cars alone. I have no way of guessing how 500k polygons per entire track factors into the polygons per second. The game has quite obvious LOD step-downs for the car models though, and the tracks aren't pushing what I would call an extraordinary amount of detail.

If the 12 Million Polygons Per Second 4 pass allows for "base texture, environment map, specular highlights and damage / scratches and scrapes" I would say that has to be 7th Gen (or PC) they are talking about. How many passes do we think the 6th gen consoles could manage per frame before slowdown?
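For reference, the per-car arithmetic from the previous paragraphs in a few lines of Python (the model budgets are the quoted figures, not verified numbers):

# Implied polygon throughput = polys per model x models on screen x frame rate.
def polys_per_sec(polys_per_model, models, fps):
    return polys_per_model * models * fps

print(polys_per_sec(6_000, 22, 60))   # 7,920,000/sec ("just shy of 8 million")
print(polys_per_sec(11_000, 22, 60))  # 14,520,000/sec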

Chilly Willy
02-21-2014, 01:21 PM
I noticed this starting point for the quoted material:


"Xbox 360 and PS3 don't need HD disc formats for games.

by SHEATH013 on October 21, 2007 0 Comments

Oct 21 2007 Sony's hype machine has convinced many that HD disc formats will become necessary for games within the lifespan of the 360 and PS3. Yet again, Sony's hype and reality simply do not match up. First, it is an illusion that a storage medium larger than 8GB can be used to benefit games on these two systems directly. What will be shown is that the added space of an HD disc format will only be used for FMV cutscenes and more high definition audio. Games will not be made for either the 360 or PS3 that require more than 8GB of storage space for game data.

This is stupid - it's on the same level as saying "Computers will never need more than 639KB of RAM!" First of all, no matter HOW big a medium is, devs WILL find a way to fill it, if only to boost sales by proclaiming how big their games are. We saw MANY XB360 games that came on SEVERAL DVDs. Second, so what if most of the extra space will be full of HD FMV and audio? FMV cut-scenes and audio are important parts of many games, and people will NOT want to see DVD-quality cut-scenes in their HD game. Can you imagine? Break into your 1280x720/1920x1080 HD game with a 640x480i video cut-scene??

sheath
02-21-2014, 04:15 PM
There is a big difference between being able to use more space and absolutely needing more space to make the same game. Sony, and Carmack for a little while, were claiming that DVD couldn't even handle the texture mapping that BluRay allowed, which was ultimately questionable at best. My comment was also limited to the 7th gen consoles.

Chilly Willy
02-21-2014, 05:18 PM
There is a big difference between being able to use more space and absolutely needing more space to make the same game. Sony, and Carmack for a little while, were claiming that DVD couldn't even handle the texture mapping that BluRay allowed, which was ultimately questionable at best. My comment was also limited to the 7th gen consoles.

It's not questionable at all. It's a fact. There are tons of games that have graphics assets bigger than 8GB. That's not counting video or sound. Even if there weren't ANY that big at the time, you could see it coming easily. Textures are the biggest part of any game after FMV, and they keep consuming more and more of a game. Why do you think most better video cards come with 4 to 8 GB of RAM? You can consume an entire DVD worth of textures IN A SINGLE LEVEL now-a-days!

sheath
02-21-2014, 05:51 PM
It's not questionable at all. It's a fact. There are tons of games that have graphics assets bigger than 8GB. That's not counting video or sound. Even if there weren't ANY that big at the time, you could see it coming easily. Textures are the biggest part of any game after FMV, and they keep consuming more and more of a game. Why do you think most better video cards come with 4 to 8 GB of ram? You can consume an entire DVD worth of textures IN A SINGLE LEVEL nowadays!

The PS3 and Xbox 360 have 512MB of RAM total. My comments were about the 7th gen consoles only. Even on PC, the few games I have bought in recent years are definitely not 40GB downloads, and most came on DVDs with downloads for extra content. Rage ended up looking fine on Xbox 360 without BluRay; that was the entire context of my comment in 2007.

Expecting BluRay to add next to no technical in-game advantage over the 360 doesn't seem "stupid" to me at all. Especially since the PS3's BluRay drive is so damned slow that games have to be installed on the hard drive.

Crazyace
02-21-2014, 06:08 PM
Highly optimized libraries - optimization using Sony's PS2 performance analyzer ensuring maximum performance.
Fully cross-platform online support (including support for online APIs such as Gamespy, Xbox Live, SCE-RT etc).

Fully cross-platform
Open and extensible

Performance Specs:
4 pass renderer - 12 million poly/sec

Sprite renderer - 6 million sprites/sec (12M polys/sec), fully textured

Terrain renderer - 10 million poly/sec

Textures - 12MB per frame, fully managed
Xbox - Full usage of both pixel and vertex shaders
Hundreds of interactive objects in one scene

32 players online "
I don't think 6th gen consoles saw 10 million polygons per second in the terrain alone. We did find some choice quotes from Melbourne House about Grand Prix Challenge having 6000 polygons per car model with 22 cars on track and 60FPS. That would be just shy of 8 million polygons per second for the cars alone if true. I mixed that figure up with some other game. 11 million polygons per second by 22 cars at 60FPS would be 14,520,000 for the cars alone. I have no way of guessing how 500k polygons per entire track factors into the polygons per second. The game has quite obvious LOD step-downs for the car models though, and the tracks aren't pushing what I would call an extraordinary amount of detail.

If the 12 million polygons per second 4-pass renderer allows for "base texture, environment map, specular highlights and damage / scratches and scrapes", I would say that has to be 7th gen (or PC) they are talking about. How many passes do we think the 6th gen consoles could manage per frame before slowdown?

Blimey2 was on PS2 and Xbox (original) at least - note the quote about the use of the PS2 performance analyser, and the comment about pixel and vertex shaders on the Xbox. Also this was a 2003 press release, so before the 360/PS3 (which both support pixel and vertex shaders).

11 mill / 22 / 60 would be 8333 polygons per car.

On PS2, four passes would be 4 polygons raw - but if the VU code culled backfacing polys at 50%, it would be effectively 2 polygons for four passes. To be honest, the size of the polygons becomes more important anyway (fillrate rather than transform rate).
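
That cost model can be sketched in C; the 50% cull rate is the illustrative figure from the post above, not a measurement:

#include <stdio.h>

int main(void)
{
    const int passes = 4;
    const double cull_rate = 0.5;  /* fraction rejected as back-facing */

    double raw    = passes;                      /* 4 polygons submitted */
    double culled = passes * (1.0 - cull_rate);  /* effectively 2        */

    printf("cost per source polygon: %.0f raw, %.0f with VU culling\n",
           raw, culled);
    return 0;
}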

sheath
02-21-2014, 06:31 PM
Blimey2 was on PS2 and Xbox (original) at least - note the quote about the use of the PS2 performance analyser, and the comment about pixel and vertex shaders on the Xbox. Also this was a 2003 press release, so before the 360/PS3 (which both support pixel and vertex shaders).

11 mill / 22 / 60 would be 8333 polygons per car.

On PS2, four passes would be 4 polygons raw - but if the VU code culled backfacing polys at 50%, it would be effectively 2 polygons for four passes. To be honest, the size of the polygons becomes more important anyway (fillrate rather than transform rate).

Gah, I completely screwed up that number again. The Melbourne House interview listed 11 thousand polygons per model RAW, and 20 thousand per model with multi-pass effects. I shouldn't PM chat and post at the same time. I am trying to figure out if re-rasterizing polygons each pass is just as resource-heavy as the raw polygon, or more specifically if it makes sense to count the same polygons again on a per pass basis when determining polygons per second performance. Between that and the wide variety of ways systems render polygons (tile-based renderers, multi-pass to single-pass renderers, and whatever else is out there), the whole polygon spec is becoming even less meaningful to me than it was before.

Barone
02-21-2014, 08:10 PM
more specifically if it makes sense to count the same polygons again on a per pass basis when determining polygons per second performance.
If your whole quest was about determining how many polygons a system could render, why the heck would you invalidate polygons which are used for effects?
It makes no sense IMO. Polygons are polygons, period.

If you're comparing the geometry complexity of raw 3D models, then it makes sense, but polygon count shouldn't be racialist or sexist IMO. ;)

sheath
02-22-2014, 08:22 AM
If your whole quest was about determining how many polygons a system could render, why the heck would you invalidate polygons which are used for effects?
It makes no sense IMO. Polygons are polygons, period.

If you're comparing the geometry complexity of raw 3D models, then it makes sense, but polygon count shouldn't be racialist or sexist IMO. ;)

You keep claiming that I am invalidating this, or ignoring that. Please stop. I want to know whether the re-rasterizing costs as much in bandwidth and resources as the initial one. If this offends you then ignore ME, please, from now on if it suits you. If, and only if, the polygon is the same in every way on the first, second, third and fourth passes would I consider it four times the polygons "on screen". Then, as you say, the model itself is actually no more complex for it. So, for these reasons it makes sense to ask. Please note this is in the tech forum, not the console flame war that is the rest of the forum.

Barone
02-22-2014, 10:09 AM
I want to know whether the re-rasterizing costs as much in bandwidth and resources as the initial one. If this offends you then ignore ME, please, from now on if it suits you.
That sounds very different from what's written in the part I quoted in my previous post. Now, can you quit overreacting for a change?

Until now, all polygons-per-frame or per-second counts that I've heard of were based on how many polygons you rendered, no matter how or for what purpose you rendered them. That's why I said the question about ignoring this or that polygon for counts makes no sense to me, and that's why I quoted that specific part and not your whole sentence.


I have a source that will probably answer some of your questions, but I will post it in the 6th gen thread, since that engine is clearly for the 6th gen consoles.

Chilly Willy
02-22-2014, 09:29 PM
The PS3 and Xbox 360 have 512MB of RAM total. My comments were about the 7th gen consoles only.

The point that Sony was making was that with smaller amounts of ram for textures, you're more likely to stream textures off the DVD/BD as the game goes along since you can't load them all at once. So you were STILL going to see tons of textures in games.

rusty
02-23-2014, 11:19 AM
You keep claiming that I am invalidating this, or ignoring that. Please stop. I want to know whether the re-rasterizing costs as much in bandwidth and resources as the initial one.


On PS2? The simple answer is no.

The cost would be less for each pass, because you didn't render the whole model for one pass and the same again for each following pass. You rendered groups of triangles for all of the passes in one go, so you transformed the verts and then just rendered them multiple times. Also, you would disable things like z-write for every pass after the first one, which meant faster rendering.

So say I had to render 12 triangles in a packet with diffuse, environment and shadow mapping. I would do the following (after VU transform); a schematic sketch follows the list.

1) Build GIF tags for 12 triangles for diffuse texture (enable z-write)
2) Build GIF tag for disable z-write
3) Build GIF tags for blending environment map
4) Build GIF tags for 12 triangles
5) Build GIF tags for shadow map
6) Build GIF tags for 12 triangles
7) Wait for GS to finish rendering previous geometry
8) Perform a GS kick and start processing next VIF tag (next packet of triangles)
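
Sketched in C for clarity (every name here is a hypothetical stand-in, with printf in place of real 128-bit tag building - these are not actual PS2 SDK calls, and only the sequencing is the point):

#include <stdio.h>

typedef struct { float x, y, z; } vert_t;

/* one "GIF tag + data" block per pass; printf stands in for the
   actual tag building */
static void emit_giftag(const char *what, int tris)
{
    printf("GIF tag: %d triangles, %s\n", tris, what);
}

static void render_packet(vert_t *verts, int tris)
{
    (void)verts;                     /* verts transformed once on the VU */

    emit_giftag("diffuse texture, z-write ON", tris);  /* pass 1 */
    printf("GIF tag: disable z-write\n");              /* state change */
    printf("GIF tag: environment map blend mode\n");
    emit_giftag("environment map", tris);              /* pass 2 */
    printf("GIF tag: shadow map blend mode\n");
    emit_giftag("shadow map", tris);                   /* pass 3 */

    printf("wait for GS, kick packet, move to next VIF packet\n");
}

int main(void)
{
    vert_t verts[36];                /* 12 triangles' worth of verts */
    render_packet(verts, 12);
    return 0;
}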

What's more, in many cases you didn't reject back facing triangles on the PS2; it was cheaper to just render them than to perform the calculation on the VU. Maybe for three passes it might be cheaper to perform the calculation, especially if you were running out of VU mem to build your double-buffered GIF tags.

But Barone brings up an excellent point about effects polys. Since they were typically using alpha blending, which you would use when doing multiple passes, they were reading from the colour buffer and writing to the colour buffer. They are a VERY good measure of the available rendering bandwidth on graphics hardware.

sheath
02-23-2014, 01:08 PM
The point that Sony was making was that with smaller amounts of ram for textures, you're more likely to stream textures off the DVD/BD as the game goes along since you can't load them all at once. So you were STILL going to see tons of textures in games.

Based on the information available to me when I made my comment about BluRay not being a requirement for 7th gen games, at peak the PS3's drive can only transfer 8MB per second, half of what a 12X DVD drive could transfer. What I questioned back then and question today is whether that transfer speed was fast enough to realistically "stream" textures in-game, whether they be 512x512 textures (~384-512KB each compressed, or 1MB at 4 bytes per pixel) or super textures of 4096x4096 (64MB each).

Now that that generation is behind us, did any game(s) do this? Rage certainly did not depend on BluRay; it depended on the Hard Drive install. Was BluRay used to create a greater diversity in 3D environments on the 7th Generation consoles? By asking I am not saying it hasn't happened, I just haven't seen it. I see multi-pass effects pop in a lot more than textures on these consoles even when they are loading from the Hard Drive. I would also figure Sony fans would have painted Youtube with videos of greater texture diversity in even one game if it ever happened. Everything I have seen showed the 360 with sharper textures on average (no I don't want to argue about that).


On PS2? The simple answer is no.

The cost would be less for each pass, because you didn't render the whole model for one pass and the same again for each following pass. You rendered groups of triangles for all of the passes in one go, so you transformed the verts and then just rendered them multiple times. Also, you would disable things like z-write for every pass after the first one, which meant faster rendering.
...
What's more, in many cases you didn't reject back facing triangles on the PS2; it was cheaper to just render them than to perform the calculation on the VU. Maybe for three passes it might be cheaper to perform the calculation, especially if you were running out of VU mem to build your double-buffered GIF tags.

But Barone brings up an excellent point about effects polys. Since they were typically using alpha blending, which you would use when doing multiple passes, they were reading from the colour buffer and writing to the colour buffer. They are a VERY good measure of the available rendering bandwidth on graphics hardware.

So on the PS2, the models' front sides would have their polygons re-rasterized, and the back sides or back-facing polygons were typically only rendered in the initial pass? By available rendering bandwidth, are you saying that these same extra front-facing polygons could have been included in the raw model but the effects polygons were preferred by the developer?

TrekkiesUnite118
02-23-2014, 01:34 PM
Sonic Unleashed, Generations, and any other game using the Hedgehog Engine stream graphics data off the disc in real time as the game is playing. That's how the levels were able to be so long without having to stop and load new segments.

If I remember correctly, the bigger issue with the PS3 was that its 512 MB of RAM was split in half between the CPU and GPU. That's why games like Skyrim had issues running well on the system.

BluRay's benefit is the same benefit that CDs gave over cartridges, and that DVDs gave over CDs. Developers simply had more space available to make bigger games with higher quality assets. Saying it wasn't needed would be like saying CDs weren't needed back when the Sega CD and Turbo CD came out.

rusty
02-23-2014, 01:35 PM
Now that that generation is behind us, did any game(s) do this? Rage certainly did not depend on BluRay; it depended on the Hard Drive install. Was BluRay used to create a greater diversity in 3D environments on the 7th Generation consoles? By asking I am not saying it hasn't happened, I just haven't seen it.


To name a few;

Grand Theft Auto IV and V
inFamous
Resistance 2 + 3
Midnight Club
Red Dead Redemption
Batman: Arkham Asylum

However, I'm not too clear on whether they all streamed textures and/or geometry from the optical drive. I know that texture streaming was something they wanted to fit into the original Resistance, but didn't have time.

I think Blu Ray meant more diversity in content on 7th gen consoles. I remember when Turn 10 were having issues fitting textures onto Forza 2 on the 360, and it took a lot of work from Microsoft to compress the hell out of them to fit them on disc. Forgetting Blu Ray bandwidth for a second, just the increase in size helps a hell of a lot.




I see multi-pass effects pop in a lot more than textures on these consoles even when they are loading from the Hard Drive. I would also figure Sony fans would have painted Youtube with videos of greater texture diversity in even one game if it ever happened. Everything I have seen showed the 360 with sharper textures on average (no I don't want to argue about that).

There's more to texturing than just diffuse maps. These days you have at least three basic textures for a material: diffuse, normal and specular. The diffuse map is OK to compress, and so is the specular, but I'm not too sure you want to *always* compress the normal map, which tends to be quite large.

I think you're using the wrong metric to prove that the larger capacity was used effectively. It's not so much a case of "how many different materials can we have" but rather "how can we make the materials look better".
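
To put rough numbers on a single material, a minimal sketch; the 1024x1024 size and the 4:1 block-compression ratio are illustrative assumptions, not figures from the post above:

#include <stdio.h>

int main(void)
{
    const double raw_mb = 1024.0 * 1024 * 4 / (1024 * 1024);  /* 4 MB raw */

    double diffuse  = raw_mb / 4;   /* block-compressed, assumed 4:1 */
    double specular = raw_mb / 4;   /* block-compressed, assumed 4:1 */
    double normal   = raw_mb;       /* left uncompressed, as noted above */

    printf("one material: %.1f MB total (%.1f + %.1f + %.1f)\n",
           diffuse + specular + normal, diffuse, specular, normal);
    return 0;
}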

rusty
02-23-2014, 01:49 PM
So on the PS2, the models' front sides would have their polygons re-rasterized, and the back sides or back-facing polygons were typically only rendered in the initial pass?


Well...if you culled the back facing polys, you didn't render them at all. And if you did render them, you were doing so because you didn't perform the calculation to determine if they were backfacing or not. So you would have rendered the extra passes for them too.



By available rendering bandwidth, are you saying that these same extra front-facing polygons could have been included in the raw model but the effects polygons were preferred by the developer?

Like any console, it depends where you want to spend your budget. You can often get away with lower poly detail because you've worked hard on good lighting and environment mapping. Simple lighting, however, will only highlight that you've used a lot of polys for a curved surface and it *still* looks faceted unless you're in a dimly lit environment.

sheath
02-23-2014, 02:10 PM
8MB per second, at 30FPS, for a 2X Bluray drive still equates to 273KB per frame. If that were all used for streaming texture data, I suppose you could get some mileage updating as many lines as possible per frame. That is still less than half of a 512x512 texture at 4 bytes per pixel though, or one 512x512 texture highly compressed. The real issue I was attempting to debate against back in 2007-8 was that super textures weren't going to require Bluray in the first place. I have yet to notice anybody claiming any multiplatform PS360 title had greater texture diversity in the PS3 version.
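
Worked through in C, that per-frame budget looks like this (the 8MB/s figure is the one quoted above; raw sizes assume 4 bytes per texel, compression ignored):

#include <stdio.h>

int main(void)
{
    const double drive_kbps   = 8.0 * 1024;          /* 8 MB/s in KB/s   */
    const double per_frame_kb = drive_kbps / 30;     /* ~273 KB at 30fps */

    const double tex512_kb  = 512.0 * 512 * 4 / 1024;    /* 1024 KB  */
    const double tex4096_kb = 4096.0 * 4096 * 4 / 1024;  /* 65536 KB */

    printf("budget per frame: %.0f KB\n", per_frame_kb);
    printf("512x512 raw: %.1f frames to stream one texture\n",
           tex512_kb / per_frame_kb);                    /* ~3.8 */
    printf("4096x4096 raw: %.0f frames (~%.0f seconds)\n",
           tex4096_kb / per_frame_kb,
           tex4096_kb / per_frame_kb / 30);              /* ~240, ~8s */
    return 0;
}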

I have played the Arkham games extensively, first on the 360 and then on PC. Even when I had a video card from 2007, having more VRAM and newer shaders mattered a lot more than anything else. I haven't seen anything to indicate that the PS3 versions have more diversity of features of any kind from the DVD or download based PC and 360 versions. Sure, some exclusive content was purchased and then added to the GotY versions later, or as DLC. As you said, extra content was probably the biggest usage for the added disk space Bluray provided in the 7th gen.

rusty
02-23-2014, 03:56 PM
I haven't seen anything to indicate that the PS3 versions have more diversity

As I've said: diversity, or looking for more textures, is the wrong metric to use when judging the usefulness of the storage available on BD. This is especially true when you're dealing with a cross-platform title where the idea is to have the same basic content on all platforms.

sheath
02-23-2014, 05:04 PM
As I've said: diversity, or looking for more textures, is the wrong metric to use when judging the usefulness of the storage available on BD. This is especially true when you're dealing with a cross-platform title where the idea is to have the same basic content on all platforms.

It would be the wrong metric for judging whether a new medium can be used to make a game better, certainly. I fail to see how it is wrong to look for more detailed textures, or more texture diversity, or some sort of greater diversity in gameplay within a single generation though. CD-ROM was used for more real-time cutscenes and FMV, CD-Audio, voice, more levels, more enemy diversity and even more animation right from the start with the Turbo/PCE CD and Sega CD and PC CD-ROM games. That is even with the limitation of only 64-256KB RAM in the Turbo CD and 768KB in the Sega CD, and the slower-than-slow 1X CD-ROM standard. Even still, people complained about CD-ROM games that were no different than cart games with new cutscenes and CD-Audio. Was that the wrong way to compare CD-ROM to cartridge? I think in some ways it was, as it ignores the advantages to the manufacturer and developer that these small improvements represent. Still, it is a very common complaint against the Sega CD in particular.

The switch from CD-ROM to DVD-ROM was less dramatic for games at first, but eventually we got more cars and tracks and levels and cutscenes and spoken dialog in DVD-ROM games than could be realistically crammed into a multi-disc CD/GD-ROM release. I'm not sure where I would put that cutoff, as Shenmue 1 & 2 were about as in-depth as it gets and swapping discs wasn't a game-breaking problem.

If such a dramatic shift as CDs over 8-16MB cartridges had occurred from DVD-ROM to BluRay in the 7th gen, I think Sony and Carmack's claims would have some basis in reality. As things stand, at least from what I have seen, we're talking about more space for "stuff" that does not affect the typical experience of the game, and DLC and hard drives are negating that advantage handily. I suppose with the Xbone and PS4 having Bluray standard we might start seeing much larger games that probably could not be done on multiple DVDs. Carmack said that he designed Rage in such a way that splitting it between two DVDs would be impossible without significant losses in textures, but he managed to make that work anyway. That is the context of my wayward expectation.

TrekkiesUnite118
02-23-2014, 05:23 PM
8MB per second, at 30FPS, for a 2X Bluray drive still equates to 273KB per frame. If that were all used for streaming texture data, I suppose you could get some mileage updating as many lines as possible per frame. That is still less than half of a 512x512 texture at 4 bytes per pixel though, or one 512x512 texture highly compressed. The real issue I was attempting to debate against back in 2007-8 was that super textures weren't going to require Bluray in the first place. I have yet to notice anybody claiming any multiplatform PS360 title had greater texture diversity in the PS3 version.

I have played the Arkham games extensively, first on the 360 and then on PC. Even when I had a video card from 2007, having more VRAM and newer shaders mattered a lot more than anything else. I haven't seen anything to indicate that the PS3 versions have more diversity of features of any kind from the DVD or download based PC and 360 versions. Sure, some exclusive content was purchased and then added to the GotY versions later, or as DLC. As you said, extra content was probably the biggest usage for the added disk space Bluray provided in the 7th gen.

You're not going to see anything significant in most multiplatform games, as most of them last generation were developed with the 360 as the lead platform. You may not need to stream those super textures in real time either; you may just need to load them into RAM. BluRay helps with that by giving you more space for more textures, or higher quality textures, that still fit into RAM.

Final Fantasy XIII is a pretty good example of what the lack of BluRay did to the 360 version. The cutscene quality looks like absolute dogshit, even after the game was split across multiple discs.

zyrobs
02-23-2014, 06:10 PM
8MB per second, at 30FPS, for a 2X Bluray drive still equates to 273KB per frame.

You don't stream textures like that in a modern game. Well, if you did, it would be ridiculous. A DVD or Bluray would just have too high a latency to stream textures on a per-frame basis.

Instead, you'd load whole level chunks at given checkpoints and so on (which included the textures too), either from main memory onto the GPU, or from the disc media to the main memory (and maybe from there to the GPU, if it is needed). Sort of like how the first Halo did, if you remember. You could optimize the data for faster loading that way (sequential access of one block of data was faster than randomly accessing individual textures each frame). And this is also the reason why most disc-based games have a LOT of duplicate resources on them. It is true for all CD-based games down to the Sega CD, but the PS3 was very notorious for it because it had huge Blu-ray space to use but the drive had slower data access.

This is also why installing games to HDD became commonplace, as they had orders of magnitude better random access speeds.
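
A sketch of that checkpoint-style chunk loading; the chunk names, sizes and layout are hypothetical, and the point is one contiguous read per trigger rather than per-frame fetches from disc:

#include <stdio.h>

typedef struct {
    const char *name;
    long offset;   /* sequential position on disc */
    long size;     /* one contiguous read, no seeking inside */
} chunk_t;

/* one big sequential read beats many small seeks on optical media */
static void load_chunk(const chunk_t *c)
{
    printf("seek to %ld, read %ld MB (%s)\n",
           c->offset, c->size >> 20, c->name);
}

int main(void)
{
    chunk_t level[] = {
        { "area 1: geometry + textures", 0,         48L << 20 },
        { "area 2: geometry + textures", 48L << 20, 52L << 20 },
    };

    load_chunk(&level[0]);   /* at level start */
    /* ... player crosses a checkpoint trigger (elevator, corridor) ... */
    load_chunk(&level[1]);
    return 0;
}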

rusty
02-24-2014, 03:14 AM
You don't stream textures like that in a modern game. Well, if you did, it would be ridiculous. A DVD or Bluray would just have too high a latency to stream textures on a per-frame basis.

Instead, you'd load whole level chunks at given checkpoints and so on (which included the textures too), either from main memory onto the GPU, or from the disc media to the main memory (and maybe from there to the GPU, if it is needed). Sort of like how the first Halo did, if you remember. You could optimize the data for faster loading that way (sequential access of one block of data was faster than randomly accessing individual textures each frame). And this is also the reason why most disc-based games have a LOT of duplicate resources on them. It is true for all CD-based games down to the Sega CD, but the PS3 was very notorious for it because it had huge Blu-ray space to use but the drive had slower data access.

This is also why installing games to HDD became commonplace, as they had orders of magnitude better random access speeds.

I'd give you rep for this, but I have to spread it around first.

You and Trekkie are both spot on with this. I guess you've seen the effects of this in various games.

The first ever games with geometry and texture streaming would load whole chunks in one go. But there's just not enough bandwidth to deal with that these days, and so the solution is to load it in bits over a period of time, rather than in one go. Seek speeds on any type of drive have always been the real killer, and I think that even with the piecemeal strategy, you wouldn't be seeking all over the place - you'd have duplicated resources stored sequentially in order of importance for a stream section.

zyrobs
02-24-2014, 07:56 AM
Well yeah, you can't easily load bigger chunks while also, say, streaming music or whatever from the disc. So the better solution is to load smaller and smaller packs more and more often. I don't think it would be possible to do modern, huge open-area games without doing that anyway, even with modern monster machines with 16gb ram and 3gb vram and whatnot.

Halo CE did load bigger parts, and it paused for a second while it did. But the level design made allowances for this - usually when it loaded new stuff, you were in an elevator or in a corridor leading to a new area. But that was, like, in 2001. It was maybe one of the first titles to do this too. How far we've come since then....


You and Trekkie are both spot on with this. I guess you've seen the effects of this in various games.

I don't think I did since I play very few modern games. But I do dabble in programming.

Borderlands comes to mind, though. It had crazy texture mipmapping issues: when it loaded a level you'd be swimming in textures so blurry that they made N64 games look sharp. Then it slowly sharpened them out. It all lasted like 2 seconds, but it was very jarring since it ALWAYS happened when you loaded a level.
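
That pop-in pattern can be sketched as mip streaming: the blurriest mip level is resident immediately and sharper levels arrive over the following frames. The mip counts and timings below are illustrative, not Borderlands' actual values:

#include <stdio.h>

int main(void)
{
    int resident_mip = 6;       /* blurry placeholder level, always loaded */
    const int target_mip = 0;   /* full resolution */

    /* each step here stands in for one chunk of streaming work */
    for (int frame = 0; resident_mip > target_mip; frame += 10) {
        printf("frame %3d: sampling mip %d\n", frame, resident_mip);
        resident_mip--;         /* next sharper level finished streaming */
    }
    printf("sharp: mip %d resident\n", resident_mip);
    return 0;
}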

sheath
02-24-2014, 08:40 AM
For the record, I wasn't the one who brought up streaming textures or details from disk as a viable option; I'm pretty sure I was called stupid for saying the opposite in my five-plus-year-old post. I was just trying to see what it would look like on a per-frame basis based on Bluray 2X bandwidth. The first game I saw stream level data without freezing to load the level segment from disk was probably Crazy Taxi, but definitely Soul Reaver on PS1, Dreamcast and PC. I imagine Carmageddon 1+2 on PC were doing it, and I know Half-Life was before that, but they were streaming from the hard drive. All sandbox games do it quite obviously, especially on the PS2, where I can see non-textured objects in the far background, but even on the Xbox 360 I can see multi-pass effects and model detail pop in as it loads.

Again though, does anybody have any evidence that Bluray was *needed* in the 7th generation for anything in-game?

Barone
02-24-2014, 08:54 AM
Again though, does anybody have any evidence that Bluray was *needed* in the 7th generation for anything in-game?
Lossless 7.1 audio vs Lossy 5.1 audio.

It may help with higher res textures for native 1080p games, I think.
http://forum.beyond3d.com/showthread.php?t=46241

zyrobs
02-24-2014, 09:11 AM
Also, it may have some relation with the fact that the PS3 seems to have more games running natively at 1080p:
http://forum.beyond3d.com/showthread.php?t=46241

A game running at whatever resolution has nothing to do with the storage space. There are 4-kilobyte demos that can run in 1080p.

edit: unless of course you are talking about the FMVs.

Barone
02-24-2014, 09:15 AM
I suppose that games targeting 1080p natively would use higher res textures to look really good, which would probably require more storage space. IDK, just a guess.
I didn't mean that the Blu-ray magically allowed games to run at higher res...

sheath
02-24-2014, 09:39 AM
PS3 multiplatform games in particular tended to run at a lower native resolution than the 360 versions besides. Some cases are as low as 600p if I recall. We saw a lot of that come out, finally, when we started comparing Call of Duty Ghosts on the 7th gen consoles (including WiiU) and PS4 and Xbone. Of course we already knew the 360 didn't always display even at 720p thanks to Halo 3 and the entire internet being up in arms about its native resolution.

Is there a list somewhere of native 1080p 7th gen games?

Barone
02-24-2014, 09:43 AM
PS3 multiplatform games in particular tended to run at a lower native resolution than the 360 versions besides. We saw a lot of that come out, finally, when we started comparing Call of Duty Ghosts on the 7th gen consoles (including WiiU) and PS4 and Xbone.
Multiplatform games weren't my point, I'm quite sure. And for the reasons that both zyrobs and rusty already pointed out.



Is there a list somewhere of native 1080p 7th gen games?
That thread link I provided has the native resolutions listed, you just have to search in there; that's why I linked it.

sheath
02-24-2014, 10:00 AM
Multiplatform games weren't my point, I'm quite sure. And for the reasons that both zyrobs and rusty already pointed out.

That thread link I provided has the native resolutions listed, you just have to search in there; that's why I linked it.

I just got to that part, very informative. Just to be clear, I posted my personal recollection, not any expression of mastery of your link. I was about to say that some 360 games are down below 600p as well, and both consoles have games that might as well be 800x600 or thereabouts. Either way, somebody more interested than me in "proving" one set-top box's superiority might want to look at the native resolutions list.

The reasons rusty and zyrobs cited for ignoring multiplatform games are interesting, but not an absolute reason to do so. If anything, an absolute technical advantage, such as many times the optical disk storage space, should show even in multiplatform titles. I sense another free pass being given.

Barone
02-24-2014, 10:22 AM
Either way, somebody more interested than me in "proving" one set-top box's superiority might want to look at the native resolutions list.
CTRL+F "1080" shouldn't be all that hard.


The reasons rusty and zyrobs cited for ignoring multiplatform games are interesting, but not an absolute reason to do so. If anything, an absolute technical advantage, such as many times the optical disk storage space, should show even in multiplatform titles. I sense another free pass being given.
I'm pretty sure they didn't say to ignore multiplatform games, as I'm also not saying that.
It just seems to be a very ineffective approach to search for the advantages of Blu-ray *only* in games which were designed to be multiplatform and, in many cases, which targeted the DVD-based platform first. Unless the goal is to not find any advantages...

It's not a "free pass" since the same argument has been used by many people, including you, to advocate the existence of the Sega CD; or the SAT/PS1 against the N64. If you look only at the multiplatform multiformat releases and adopt the "I don't care about stupid extras" approach, well, we should still be using cartridges then.

As for the multiplatform advantage, I've already pointed it out: Lossless 7.1 audio vs Lossy 5.1 audio.
Better audio fidelity and more channels isn't a valid advantage?

rusty
02-24-2014, 10:27 AM
The reasons rusty and zyrobs cited for ignoring multiplatform games are interesting, but not an absolute reason to do so. If anything, an absolute technical advantage, such as many times the optical disk storage space, should show even in multiplatform titles. I sense another free pass being given.

So the bit in bold. Why should it? What is the specific reason that a technical advantage should show itself? Because I can give you a production reason for it NOT happening.

It's not so much a free pass but more of a comment on the reality of cross platform development. You won't have more diverse textures on one platform for a cross platform title "just because". It makes sense to do it if a) it makes financial sense and b) if it makes sense in terms of art direction. But for a game large enough to use streaming technology, that's a lot of overhead to create two completely different sets of texture/material libraries and apply them to a pretty big level.

There's nothing stopping a studio from doing this but why bother if the customer is happy as long as you have parity with the other platform that has lower storage requirements?

sheath
02-24-2014, 10:37 AM
CTRL+F "1080" shouldn't be all that hard.

Just did, the PS3 has more 1080p games, though most of them are first party. I don't discount first party games though, even on the Saturn and Dreamcast, as these games are where the majority of the top performers tend to be.



I'm pretty sure they didn't say to ignore multiplatform games, as I'm also not saying that.
It just seems to be a very ineffective approach to search for the advantages of Blu-ray *only* in games which were designed to be multiplatform and, in many cases, which targeted the DVD-based platform first. Unless the goal is to not find any advantages...

It's not a "free pass" since the same argument has been used by many people, including you, to advocate the existence of the Sega CD; or the SAT/PS1 against the N64. If you look only at the multiplatform multiformat releases and adopt the "I don't care about stupid extras" approach, well, we should still be using cartridges then.

Who said to only look at the multiplatform games? I brought up what games I have seen compared, which tend to be multiplatform, and that comparison was downplayed to the point of having, what, four forum members telling me not to? Oh, but now you aren't telling me not to, you are just saying I shouldn't let them be conclusive, right?



As for the multiplatform advantage, I've already pointed it out: Lossless 7.1 audio vs Lossy 5.1 audio.
Better audio fidelity and more channels isn't a valid advantage?

By saying nothing I'm even exposing my bias? God damn!?


So the bit in bold. Why should it? What is the specific reason that a technical advantage should show itself? Because I can give you a production reason for it NOT happening.

It is a simple rubric really: something that is many times greater than something else ought to show it. If this were not the case, then all of the games I have bought for the PC since 2008, including the Arkham games, wouldn't use shaders many generations newer and feature faster streaming and less detail pop-in than their Xbox 360 or PS3 counterparts. The advantage of a faster hard drive and full game installs on said drive shows that I have something many times more capable than the other. Bluray on the PS3 should be held to a similar standard. If Sony had only promised greater amounts of extra modes (read: content) and uncompressed 7.1 audio, then I would not be questioning Bluray's contribution to these things.



It's not so much a free pass but more of a comment on the reality of cross platform development. You won't have more diverse textures on one platform for a cross platform title "just because". It makes sense to do it if a) it makes financial sense and b) if it makes sense in terms of art direction. But for a game large enough to use streaming technology, that's a lot of overhead to create two completely different sets of texture/material libraries and apply them to a pretty big level.

There's nothing stopping a studio from doing this but why bother if the customer is happy as long as you have parity with the other platform that has lower storage requirements?

It is a free pass. The textures, effects, everything, are better in the PC versions from 2008 on. Should I mention that Sony promised in nearly every press release that the PS3 would begin to show its absolute superiority around this time as well? Granted that was aimed at how the Cell processor would take off and leave the competition a generation behind, but Bluray was integral to the claims as well. It didn't happen from what I have seen, a handful more games at native 1080p (ignoring the low end) and 7.1 audio aside.

zyrobs
02-24-2014, 10:39 AM
I suppose that games targeting 1080p natively would use higher res textures to look really good, which would probably require more storage space. IDK, just a guess.
I didn't mean that the Blu-ray magically allowed games to run at higher res...

If anything, targeting 1080p means that the GPU has to spend more time and effort to display at that resolution, and other effects have to be axed in order to keep the framerate up. Including texture quality. Which is the reason why most games didn't bother with either 1080p or 60fps - you can get a game to look better if you add extra effects (like better lighting), and instead run at a lower resolution to keep speed up.
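
In numbers (pure pixel-count arithmetic, no hardware figures assumed):

#include <stdio.h>

int main(void)
{
    const double p720  = 1280.0 * 720;    /*   921,600 pixels */
    const double p1080 = 1920.0 * 1080;   /* 2,073,600 pixels */

    printf("1080p has %.2fx the pixels of 720p\n", p1080 / p720);  /* 2.25 */
    printf("per-pixel budget at 1080p: %.0f%% of the 720p budget\n",
           100.0 * p720 / p1080);                                  /* ~44% */
    return 0;
}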


The reasons rusty and zyrobs cited for ignoring multiplatform games are interesting, but not an absolute reason to do so. If anything, an absolute technical advantage, such as many times the optical disk storage space, should show even in multiplatform titles. I sense another free pass being given.

So if an "absolute" advantage is dismissed because it has little reflection on actual in-game quality, it is considered giving it a free pass?

The difference between X360 with a DVD and PS3 with a Bluray is about the same as a Megadrive and a Mega CD "special edition" of the same game (say, Chuck Rock, Earthworm Jim, etc): you get more space for extra levels, FMVs, and more/better quality speech and streaming music, in return for loading times. But you don't get extra visual fidelity because you still have to cram the game into the same amount of video ram!

In fact on the PS3 you technically have even less video ram than on the x360... And like I said, the PS3 had a slower drive, so they offset that by duplicating a lot of content, as a form of optimizing loading times.


Should I mention that Sony promised in nearly every press release that the PS3 would begin to show its absolute superiority around this time as well? Granted that was aimed at how the Cell processor would take off and leave the competition a generation behind, but Bluray was integral to the claims as well. It didn't happen from what I have seen, a handful more games at native 1080p (ignoring the low end) and 7.1 audio aside.

You really should not confuse marketing bullshit with real hardware performance (unless of course you consider the Jaguar to be 4x more powerful than the Playstation).


Just did, the PS3 has more 1080p games, though most of them are first party.

I suspect this could also be due to internal politics choosing a design goal of "must run at 1080p", to be able to show the impressive numbers to the media. I mean if it was THAT easy to get 1080p, then more multiplatform games would have it as well.
Not saying it was because of that, just that it MAY have been the reason.

Barone
02-24-2014, 10:49 AM
If anything, targeting 1080p means that the GPU has to spend more time and effort to display at that resolution, and other effects have to be axed in order to keep the framerate up. Including texture quality. Which is the reason why most games didn't bother with either 1080p or 60fps - you can get a game to look better if you add extra effects (like better lighting), and instead run at a lower resolution to keep speed up.
And so I was pretty wrong...
Yep, what you said makes a lot more sense to me (I should have taken the fill rate into account as well, duuuhhh). Thanks for the correction and clarification.

Barone
02-24-2014, 11:12 AM
that comparison was downplayed to the point of having, what, four forum members telling me not to?
If four members are telling you the same thing, well, maybe you should re-read or rephrase what you wrote.



By saying nothing I'm even exposing my bias? God damn!?
By presuming no advantages to the bigger storage medium and fiercely attacking what you think to be non-advantages, you didn't paint a good background for your arguments IMO.

sheath
02-24-2014, 11:27 AM
If four members are telling you the same thing, well, maybe you should re-read or rephrase what you wrote.


If anything I have proven that no matter how many times I rephrase my statements, you and everybody else in this place will continue to push their own agenda and interpretation while reading my statements in the worst possible light. A microcosm of this is how everybody keeps assuming I am saying there is no benefit to bluray whatsoever. This assertion exists only in your minds.



By presuming no advantages to the bigger storage medium and fiercely attacking what you think to be non-advantages, you didn't paint a good background for your arguments IMO.

These happened only in your mind. I got fierce when you more than implied, again, that I was attempting to dismiss facts or somehow discredit them. The constant defensiveness in regard to all things Sony makes any and all discussion of these facts impossible. It has even allowed the trolls to (accidentally) poke holes in your arguments for the PS2's superiority.

When I ask a question, you accuse me of dismissing and obfuscating facts. To the former I say, again and again, nothing is dismissed and contextualizing facts is right and beneficial to the discussion. To the latter I say the facts are already stupefying and confused in these discussions. You might as well be blaming me for why the wires under my desk are a rat's nest right now when the last time I touched them they were in neat rows.

Barone
02-24-2014, 12:32 PM
If anything I have proven that no matter how many times I rephrase my statements, you and everybody else in this place will continue to push their own agenda and interpretation while reading my statements in the worst possible light. A microcosm of this is how everybody keeps assuming I am saying there is no benefit to bluray whatsoever. This assertion exists only in your minds.

These happened only in your mind. I got fierce when you more than implied, again, that I was attempting to dismiss facts or somehow discredit them. The constant defensiveness in regard to all things Sony makes any and all discussion of these facts impossible. It has even allowed the trolls to (accidentally) poke holes in your arguments for the PS2's superiority.

When I ask a question, you accuse me of dismissing and obfuscating facts. To the former I say, again and again, nothing is dismissed and contextualizing facts is right and beneficial to the discussion. To the latter I say the facts are already stupefying and confused in these discussions. You might as well be blaming me for why the wires under my desk are a rat's nest right now when the last time I touched them they were in neat rows.
Taking forum posts as personal attacks like that usually doesn't lead to a good outcome.


I said something very simple which wasn't supposed to be offensive: if everyone is complaining about what you wrote and you think you're being misinterpreted, why not just try to rephrase it instead of engaging in a petty fight?

You say that all people are always defensive about Sony when you're discussing it, so, maybe from another point of view people are probably seeing you as offensive towards Sony all the time. What do you think?

Why overreact?

stu
02-24-2014, 01:09 PM
Taking forum posts as personal attacks like that usually doesn't lead to a good outcome.


I said something very simple which wasn't supposed to be offensive: if everyone is complaining about what you wrote and you think you're being misinterpreted, why not just try to rephrase it instead of engaging in a petty fight?

You say that all people are always defensive about Sony when you're discussing it, so, maybe from another point of view people are probably seeing you as offensive towards Sony all the time. What do you think?

Why overreact?

"You must spread some Reputation around before giving it to Barone again."

Still not able to rep? Seriously? :confused:

sheath
02-26-2014, 10:31 AM
Taking forum posts as personal attacks like that usually doesn't lead to a good outcome.


I said something very simple which wasn't supposed to be offensive: if everyone is complaining about what you wrote and you think you're being misinterpreted, why not just try to rephrase it instead of engaging in a petty fight?

You say that all people are always defensive about Sony when you're discussing it, so, maybe from another point of view people are probably seeing you as offensive towards Sony all the time. What do you think?

Why overreact?

I overreact? Why do you continually twist my words into things I plainly did not write? You even quote me and claim it says something extreme versus what I actually wrote. "Bluray wasn't necessary" is a far cry from "Bluray was useless and couldn't be used". ________ is a far cry from "7.1 uncompressed doesn't exist." You are the one who is overreacting, and you are the one who cannot handle any statement regarding a Sony platform that does not make it seem pristine in comparison. You are the one who needs the PS2 CPU to be far and away better at polygonal 3D than anything else that generation.

Why am I offended? It has nothing to do with the systems at all, and everything to do with my expecting somebody who has known me for years to know better and not expect me to ignore/obscure facts. When all facts are on the table, which has not happened yet for the 7th gen, an objective mind has to search for a middle ground. In the meantime, my questions are meant to uncover more sources and nothing more.

Still think I'm lying or just blinded by fanboyism? Go back and look at all of my posts and consider how you dropped out caveats and conditions, and phrases like "looks like" or "seems like" or "might be", as opposed to assertive conclusions. I knew what I was writing would draw out system advocates and full-on fanboys, I just didn't expect you to be one of them.

Barone
02-26-2014, 11:23 AM
I overreact?
Yes, you do.



Why do you continually twist my words into things I plainly did not write? You even quote me and claim it says something extreme versus what I actually wrote. "Bluray wasn't necessary" is a far cry from "Bluray was useless and couldn't be used". ________ is a far cry from "7.1 uncompressed doesn't exist." You are the one who is overreacting, and you are the one who cannot handle any statement regarding a Sony platform that does not make it seem pristine in comparison. You are the one who needs the PS2 CPU to be far and away better at polygonal 3D than anything else that generation.
Yeah, I twist your words, that's why people were challenging your claims using your own posts.



Why am I offended? It has nothing to do with the systems at all, and everything to do with my expecting somebody who has known me for years to know better and not expect me to ignore/obscure facts. When all facts are on the table, which has not happened yet for the 7th gen, an objective mind has to search for a middle ground. In the meantime, my questions are meant to uncover more sources and nothing more.
Oh, so you're offended? Do me a favor: try to own what you have written yourself instead of faulting me for it.
Can you do it without getting too emotional or offended? Can you handle criticism in an objective way?



Still think I'm lying or just blinded by fanboyism? Go back and look at all of my posts and consider how you dropped out caveats and conditions, and phrases like "looks like" or "seems like" or "might be", as opposed to assertive conclusions. I knew what I was writing would draw out system advocates and full-on fanboys, I just didn't expect you to be one of them.
Every time I go back to those posts they sound just as passionate and radical as they did the first time. And it feels like I'm not alone.
It's not my fault when you make claims which you cannot sustain with sources while you invoke "historical facts" all the time.
How much of your reviews and the GP's archive are really just "historical facts"?
Is the criticism that you have received towards those texts also my fault? Am I poisoning people's minds in order to distort their perception of your "questions" and "historical facts"?

Gimme a break. If it hurts you so much to see your theories confronted with actual sources, just let me know and I'll quit posting in these threads which you like to drive in circles.

Yharnamresident
04-07-2014, 07:09 PM
Does the Wii even do 854x480p, or does it still have to do anamorphic 640x480p?