
Thread: Comparison of 6th generation game console hardware

  1. #316
    Outrunner

    Funny how the PS2 had an edge because of its unexpected, crazy-high memory bandwidth, and then on the PS3 you had this:


  2. #317
    Wildside Expert

    Quote Originally Posted by sheath View Post
    Interesting. Did these texture read stalls on the Gamecube and Xbox require 1st party document releases to help developers figure out what was going wrong?
    The PS2 documents are pretty low level - way lower than you might expect. A lot of the optimisation was aimed at post-processing, where fillrate was the bottleneck (i.e. clearing the screen is quicker using a mesh of polygons than a single large screen-sized polygon).
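
    A rough sketch of that idea (submit_screen_quad() is a hypothetical helper that draws one flat-shaded rectangle, and the 64-pixel strip width is an assumption, not a figure from the docs):

    Code:
    /* Clear the frame as narrow vertical strips instead of one big quad.
       Sketch only: submit_screen_quad() is made up; the point is many
       narrow polygons rather than one screen-sized one. */
    enum { SCREEN_W = 640, SCREEN_H = 448, STRIP_W = 64 };

    void submit_screen_quad(int x0, int y0, int x1, int y1, unsigned rgba); /* hypothetical */

    void clear_screen_fast(unsigned rgba)
    {
        for (int x = 0; x < SCREEN_W; x += STRIP_W)
            submit_screen_quad(x, 0, x + STRIP_W, SCREEN_H, rgba);
    }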



    Quote Originally Posted by sheath View Post
    As far as I know the MIPS and VUs are on the same, what, sub die on the main ASIC. Whatever we want to call that is fine with me. My point wasn't about how many processors and assistant chips were on the same die, but that the PS2 relies on the CPU's processing capabilities rather than the GPU for 3D graphics.
    In some ways VU1+GS are the GPU, even though VU1 is physically present on the CPU chip. VU0 is far more closely coupled with the CPU, and was mainly used as a vector fpu in the same way as the SH4, or SSE on a PC CPU.
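
    As an illustration of that "vector FPU" role only (PC SSE intrinsics, following the SSE analogy above - this is not actual VU0 code):

    Code:
    /* SSE stand-in for the VU0-as-coprocessor idea: the main CPU issues
       4-wide float ops inline with its normal code.
       Assumes count is a multiple of 4. */
    #include <xmmintrin.h>

    void scale_vectors(float* xyzw, int count, float s)
    {
        __m128 scale = _mm_set1_ps(s);
        for (int i = 0; i < count; i += 4)
        {
            __m128 v = _mm_loadu_ps(&xyzw[i]);             /* one x,y,z,w vector */
            _mm_storeu_ps(&xyzw[i], _mm_mul_ps(v, scale)); /* scaled in one op */
        }
    }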


    Quote Originally Posted by sheath View Post
    I got the VUs mixed up again, VU1 was the one that was only used to 56% by 2003, VU0 was at 2% (pg14). Again, and I don't know how to be more completely bluntly clear about this, I have always known that the PS2 required the CPU to be fully utilized, including the VUs, in order to be competitive with the Xbox and Gamecube polygon wise or effects wise. How in the world my posts keep resulting in the same "you don't seem to understand the PS2 has VUs" statements is the only thing I am confused about.
    That slide is actually pretty misleading.
    Firstly, the 56% VU1 figure. If you look at slide 21 in that presentation you'll actually see that in that frame VU1 is active for half the time (56%, maybe) because that frame is bottlenecked by the CPU.
    Secondly, the 2% VU0 figure. The counter shows VU0 running its own programs independently - something that wasn't always used. It doesn't mean that VU0 was idle, as it was also tied into the core as a SIMD coprocessor. You could run EE+VU0 in code as if it were the SH4, giving 3.2 GFlops, and still show 0% on the PA.

  3. #318
    Barone

    Quote Originally Posted by sheath View Post
    Those disadvantages are certainly worth noting. I would put this under the hazards of being brief, you imply malicious intent or incompetence.
    I'm not implying anything. It's listed under "Chip Features" and it actually doesn't have a z-buffer:

    DirectX component: DirectDraw
    Main differences on Dreamcast:
    • Full-Screen Exclusive Mode Only.
    • IDirectDrawClipper not supported.
    • Overlay surfaces not supported.

    DirectX component: Direct3D Immediate Mode
    Main differences on Dreamcast:
    • Dreamcast-specific flags used to expose some PowerVR features.
    • Frame buffer cleared after each frame.
    • No Z-buffer on PowerVR.
    "Usage of Direct3D under Dreamcast is very similar to that of its PC equivalent. You create the Direct3D device, create and load textures, and then call BeginScene. Inside the loop you set render-states for each mesh and then call one of the DrawPrimitive* functions. When finished you call EndScene and Flip. One notable difference is that there is no Z buffer on the Dreamcast. However, it is still supported by Direct3D for code compatibility purposes."
    http://www.gamesurge.com/dreamcast/t.../directx.shtml
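
    The call order described there, as a schematic (Device and Mesh are stand-ins; only the method names come from the article, not the real DirectX-era signatures):

    Code:
    /* Per-frame flow from the article: BeginScene, per-mesh render states
       plus a DrawPrimitive* call, then EndScene and Flip.  Placeholder types. */
    struct Mesh { /* vertices, texture, render states */ };
    struct Device {
        void BeginScene() {}
        void SetRenderStatesFor(const Mesh&) {}
        void DrawPrimitive(const Mesh&) {}
        void EndScene() {}
        void Flip() {}                        /* present; no Z-buffer on DC */
    };

    void render_frame(Device& d, const Mesh* meshes, int n)
    {
        d.BeginScene();
        for (int i = 0; i < n; i++) {
            d.SetRenderStatesFor(meshes[i]);
            d.DrawPrimitive(meshes[i]);
        }
        d.EndScene();
        d.Flip();
    }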

    "I sense another free pass being given."



    Quote Originally Posted by sheath View Post
    While large textures causing the GS to stall may not be black and white, though Sony's doc more than implies it is, they certainly are a reliable problem based on the evidence on the table.
    It feels like picking one drawback of the PS2's hardware, stretching it out and forgetting about its context.

    Let's see what you claimed: "Didn't we see that PS2 GS stalls were caused by the dev trying to use 256x256 textures without slicing them up into 32x32 first?"
    While Sony's doc actually states that the VU1 usage is not optimal "Due to stalls on large polygons and textures".
    There's a whole world of difference between those statements IMO.

    Also:
    Under "Data Packing", in the same doc, page 24, I can read:
    "Big textures are ok, as long as the texel to pixel ratio is ~1:1"

    By calmly reading that or the other docs in SCEE's repository (or some related tech articles), one could figure out that the GS stalls seemed to be more affected by the data flow design of your program than by this or that specific texture size.
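
    One way to read the ~1:1 advice is as a mip-selection rule of thumb; a back-of-envelope helper (hypothetical, not from the SCEE docs):

    Code:
    #include <cmath>

    /* Pick a mip level so one texel maps to roughly one screen pixel.
       Inputs are linear extents along one axis. */
    int pick_mip_level(float texels_covered, float pixels_covered, int max_level)
    {
        float ratio = texels_covered / pixels_covered;   /* >1 means oversampled */
        int   level = (ratio > 1.0f) ? (int)std::floor(std::log2(ratio)) : 0;
        return (level < max_level) ? level : max_level;
    }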

    Having to worry that much about the data flow design is a good thing? Of course not.
    The Performance Analyzer could help you with that though:
    "The whole dataset can be displayed, showing the pattern of activity over the entire capture or the view can be zoomed in to show individual machine cycles Ė this is necessary to really show the complex interactions between different parts of the PlayStation 2 hardware. There are a large number of graphs showing things such as pipeline activity, VU usage, bus transfers, and the state of the GS pixel units."
    http://develop.scee.net/files/articl...ceAnalyser.pdf

    "What the PA Can Do
    - Separates processes into their parts
    - Shows how busy the hardware is
    - Shows bottlenecks
    - Shows parallelism or lack thereof
    - Gives facts and figures"
    http://lukasz.dk/mirror/research-sce...er_18Mar03.pdf



    Quote Originally Posted by sheath View Post
    I am leaning toward counting them only as much as they weigh in resources
    Good. The more the system struggles to render polygons, the more they count...
    By that "logic" I'm promoting this new polygon monster: the Sega Genesis!


    Quote Originally Posted by sheath View Post
    and even then only in the context of systems with better GPUs not needing more polys for effects in the first place.
    I can see you struggling to find sources to sustain that sort of "view" and even more to determine how many polygons per frame are used by a given GPU for each effect in each game in each situation or, better yet, which effects are rendered without using any polygons...

    I just hope it doesn't turn out to be another excuse to avoid polygon counts which contradict your claims about the 6th gen games; or maybe just another "evidence" that the DC would fare pretty fine in the later years just by performing all those effects on its untapped PowerVR2, using its imaginary z-buffer, free of CPU time cost and without having to render any additional polygons.


    Quote Originally Posted by sheath View Post
    The heavy requirement on the developer and the obvious difficulty in getting the VUs to function even at half efficiency is a technical disadvantage.
    Let's see:
    5. Myth: VU code is hard.
    VU code isn't hard. Fast VU code is hard, but there are now some tools to help you get 80 percent of the way there for a lot less effort.

    VCL (Vector Command Line, as opposed to the interactive graphic version) is a tool that preprocesses a single stream of VU code (no paired instructions necessary), analyses it for loop blocks and control flow, pairs and rearranges instructions, opens loops and interleaves the result to give pretty efficient code.
    by Robin Green, R&D Programmer, SCEA; on September 26, 2001.
    http://www.gamasutra.com/view/featur...n_.php?print=1


    Quote Originally Posted by sheath View Post
    Listing their theoretical peaks without that context is pure biased fabrication on the part of everyone who does it.
    Something similar could be said about DC's and PowerVR2's numbers while using the superb Windows CE Toolkit and its insanely optimized implementation of an early Direct3D.

  4. #319
    TrekkiesUnite118

    Well, there is this article on Dreamcast DirectX, Barone:

    http://msdn.microsoft.com/en-us/library/ms834190.aspx

    Basically it mentions that the lack of a Z-Buffer doesn't matter due to the fact that the PowerVR only renders what's visible to begin with.

  5. #320
    Barone

    Quote Originally Posted by TrekkiesUnite118 View Post
    Basically it mentions that the lack of a Z-Buffer doesn't matter due to the fact that the PowerVR only renders what's visible to begin with.
    It doesn't mention that it doesn't matter; it says that it isn't needed to render pixels on the DC (since it uses tile-based rendering):
    "Thus, every pixel on the screen is actually rendered to the screen buffer only once. Other 3-D hardware systems render every pixel as often as that pixel is recovered by a triangle, but not the Dreamcast hardware.
    By using this method, the hardware is not limited by the fill rate. No matter how many triangles recover a single pixel, that single pixel is rendered only once. Therefore, with the Dreamcast hardware, you don't need a Z-Buffer, because only the closest triangle is rendered."

    It doesn't change the fact that it would be harder, more expensive or even impossible to implement graphical effects which rely on the z-buffer.
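
    A toy model of that "each pixel rendered once" behaviour (grossly simplified: covers(), depth_at() and shade() are hypothetical, and real hardware works tile by tile, not on the whole frame):

    Code:
    struct Tri { /* vertices, shading data */ };
    bool     covers(const Tri&, int x, int y);       /* hypothetical */
    float    depth_at(const Tri&, int x, int y);     /* hypothetical */
    unsigned shade(const Tri&, int x, int y);        /* hypothetical */

    void render_tile(unsigned* out, int w, int h, const Tri* tris, int n)
    {
        for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            int best = -1; float best_z = 1e30f;
            for (int i = 0; i < n; i++)              /* nearest triangle wins */
                if (covers(tris[i], x, y) && depth_at(tris[i], x, y) < best_z) {
                    best_z = depth_at(tris[i], x, y);
                    best   = i;
                }
            if (best >= 0)
                out[y * w + x] = shade(tris[best], x, y); /* written exactly once */
        }
    }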

    "Shadows

    The code for Tony Hawk for Playstation rendered shadows by rendering black ellipses, one for each segment of the character model, onto the ground. We appropriated that code, made it work on the Dreamcast, but discovered that it looked terrible due to Z-buffer poke-through. We then switched to rendering to a texture and then projecting that texture into the world. We tried to find a good way to render that texture dynamically using light sources and projecting it on the nearest surfaces. And after a lot of passable results Wade and Sean came up with the idea of rendering everything near the skater in two passes: one for the geometry, and another to project the shadow texture on the geometry. This resulted in near-perfect shadows, as they would conform to steps, slopes, and walls. It took Sean some elbow grease to get them clipping correctly so they wouldn't show up on the backside of walls or on both sides of the skater, but the end result was well worth the extra time spent on it."
    http://www.gamasutra.com/view/featur...s_.php?print=1

    For the above case, you could probably use the DC's volume modifiers instead but maybe the SDK they were using didn't support it.
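
    In outline, the two-pass trick from that postmortem looks something like this (every name here is a hypothetical stand-in, not Neversoft's actual code):

    Code:
    struct Texture; struct World; struct Character;
    void set_render_target(Texture*);                /* 0 = frame buffer; hypothetical */
    void draw_shadow_silhouette(const Character&);   /* dark blob seen from above */
    void draw_world(const World&);
    void draw_nearby_geometry_projected(const World&, const Character&, const Texture*);

    void render_with_projected_shadow(const World& w, const Character& c, Texture* shadow_tex)
    {
        set_render_target(shadow_tex);
        draw_shadow_silhouette(c);         /* pass 0: render the skater to a texture */

        set_render_target(0);
        draw_world(w);                     /* pass 1: normal geometry */

        /* pass 2: redraw geometry near the skater with UVs generated by
           projecting each vertex through the shadow projection, so the
           shadow conforms to steps, slopes and walls. */
        draw_nearby_geometry_projected(w, c, shadow_tex);
    }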

  6. #321
    sheath

    Quote Originally Posted by rusty View Post
    Then you know more than I do about the die layout. But is that such a big deal? The VU is still slaved to the GPU and has immediate access to it. I'm not really sure what it is you're trying to prove, or the relevance. It doesn't matter where the VU was located; its only job was to transform data and push triangles to the GS. For all intents and purposes, it was part of the rendering pipeline and, in principle, part of the GPU.
    That is the problem with your replies to me in a nutshell. You started from the assumption that I was coming from some opposing side, and that I was ignorant or stupid like the rest of the Insert Coin crowd. Barone probably didn't help with this perception. Under it, both of you and a few others persist in looking down on my posts no matter how many times I phrase everything with caveats and as questions rather than statements.

    I am asking questions, I want to understand these systems better so I can develop better hardware sheets with footnotes to all primary docs available. That is all.

    Quote Originally Posted by rusty View Post
    Yeah, but that was from 2003. I had finished work on my first PS2 game by 2002 and the results of that presentation would have been from the first two years of European development for non-launch titles. While I wrote a lot of animation and physics code on VU0, I remember one of the smartest guys (a PhD in Theoretical Astrophysics) in the team just not "getting" the system even though he thought that he did. And it was awful. He wrote this apparently awesome ray-casting code for VU0 that would do NOTHING for long periods of time while the MIPS core would gather data and then suddenly jump into life.

    "One ray-cast per frame is the best you'll ever get on this fucking horrible system" he proclaimed. Two years later, we were doing 120 ray-casts on a much more complex scene at 60fps.
    Two years later, as in 2004 or were you talking about different projects? What games relied so heavily on ray casting? I have always thought that ray casting engines were essentially dumped by the late 90s in favor of hardware accelerated 3D engines. I have also always wanted to see what later hardware could do with ray casting even if combined with a polygonal 3D engine.

    Quote Originally Posted by rusty View Post
    Please don't apologize. It's a good point. Having to make a lot of effort is not the same as being difficult, at least not in my view. By effort I mean planning, and benchmarking different ideas and features to poke away at the stuff that the hardware documents and SDK documents don't explain, like how the system should be used. If you were fairly intelligent, you'd figure a lot of that out from looking at the system as a whole. The PS2 wasn't N64-difficult, where no matter what you did the GPU had bus priority, so you were fucked six ways from Sunday no matter how good your code was. On the PS2, you always felt like you had another hardware feature that could help out.
    Okay, contrary to the accusations against me, this is not my idea but rather stems from Sony fans' assertions against the Sega Saturn, or even the Jaguar, and Nintendo fans' claims about the PS1 vs N64 or Genesis vs SNES. If something is overly time consuming (also phrased as "hard" in most discussions) to implement, it probably will not be used to a great extent or by very many developers in general. The published games for each platform show this fairly handily. Platforms with hardware support for effects, or just more capable hardware (not the same thing, obviously), see greater and more uniform usage of them.

    As such, what I see you and Barone saying is that, because something could be done, all instances of it not being done don't count or shouldn't be considered. That is a metric ton of games being slid off the analysis table on a whim, by any objective standard. I find it objective and right to catalog a game by its engine, year of release, and whether it was a first-gen effort by its developer or the product of experience (and how much experience). I say these conditions first because they are the easiest to uncover; team size and development time are much more obscure conditions.


    Quote Originally Posted by rusty View Post
    So by effort, I meant that you had to really design your code first. You weren't a rock star "coder". You found yourself being a proper, boring software engineer.
    I thought you also compared the same level of development to other platforms and found the PS2 much more time consuming as well. "The Dreamcast's design meant that you spent more time on the game code, where the PS2 forced you to spend a lot of time just getting the basics right." I would say the same applies to development on PC, Xbox or Gamecube. Is there any reason why I should not consider this a technical disadvantage of the PS2?

    Please note the lack of extremism in that statement, a technical disadvantage is not a total absolute technical deficiency in all areas. Also note that I have never claimed that the Dreamcast could outperform the PS2 polygon wise, no not once. My speculations and experience have caused me to post about the possibility that most games that generation did not double or triple the Dreamcast's peak performance. For some reason some cannot handle even the slightest suggestion that might be the case, as indicated by press reports about PS2 RE4 and Sony's 2003 doc.

    Quote Originally Posted by rusty View Post
    Hmm....that depends. This is more a reflection on the tools than the hardware. I came from a low level background, although I also spent a lot of time writing tools and higher level stuff. In fact, most of the industry at that point had a low level background and still looked at C as being a complete luxury. A lot of the guys my age had just come from university (I didn't go to uni) and weren't from the same background as me. And for most of them, the low level nature of the tools for the PS2 was something beyond (or beneath) them.
    This is certainly true, and easily demonstrated by post-2003 PS2 games being visibly improved over earlier titles. This falls directly under what I said above, though: if the average developer wasn't going to "get it", then it is a technical disadvantage.


    Quote Originally Posted by rusty View Post
    It wasn't so much about optimizing microcode. That was pretty straightforward. It was making sure that you had everything running in parallel without stuff blocking or waiting for data. Which is the big trick on modern console hardware, as it happens. It was really about tweaking the numbers...like how much data to process to balance the MIPS core and VU0.
    From what I see in the games, on any platform that requires low level parallel processing, optimizing for peak performance would take more time than the system would be on the market, unless you are talking about the world-dominating PS2.


    Quote Originally Posted by rusty View Post
    Geez...I didn't see one until 2003. I'm pretty sure that other devs doing stuff for Sony had it before then though.
    That jibes with the games as well. Similarly, the PS1's performance analyzer came out after Gran Turismo 1, and the games take on a very late-gen, hyper-optimized look compared to pre-1998 titles. If I recall, Polyphony developed the analyzer for GT1. It would not surprise me if they did the same for Gran Turismo 3 A-Spec and then worked on it for the eventual release of Gran Turismo 4 Prologue in 2003. Assembler Games has the earliest revision in Spring 2003.

    Quote Originally Posted by rusty View Post
    Originally Posted by sheath

    That sounds like a nice feature to free up the MIPS, I wonder if Sony viewed this as the 2-8% usage or something else.

    I think that it was things like I previously mentioned on that awful ray-cast implementation. It's people not really getting how the system was supposed to be leveraged.

    Originally Posted by sheath
    Also from the previously linked Sony document:

    "Should run almost 100% of the time"

    This is a balance issue. VU1 and the GS should be running in parallel and with no interruptions. If it isn't, it means you've not done your homework and just sat back and been impressed with how quickly you can transform a packet of vertices without considering what's going on with the GS. Or you've chosen a simple single buffered approach to VU1/GS workload.

    Originally Posted by sheath

    "Often stalls on textures"

    Could be improper GS cache alignment, polys that straddle texture cache blocks, or small triangles that are using the incorrect MIP/not using MIP mapping.

    Originally Posted by sheath

    "Often stalls on big polygons
    - Subdivide when possible (e.g. particles)"

    Again, a GS cache issue.

    Originally Posted by sheath

    "Don't overdo clipping"

    Clipping was done on VU1. I remember one really smart guy who had access to the PS2 before I did, and all his VU1 code was limited because he put every triangle through clipping even if it didn't need to be. We all just thought "that's the way you do it" because other graphics hardware put every triangle through its clip stage, right? Instead, the trick was to have clipping and non-clipping microcode, project the bounding volume of the object into clip space during scene visibility checks, and determine which microcode to use.
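
    That trick boils down to a cheap per-object test; a sketch (the frustum test and microcode handles are made up):

    Code:
    struct Bounds; struct Object { const Bounds* bounds; };
    enum Cull { OUTSIDE, FULLY_INSIDE, STRADDLES };
    Cull frustum_test(const Bounds*);                    /* hypothetical */
    void run_microcode(const char* mpg, const Object*);  /* hypothetical */

    void submit_object(const Object* obj)
    {
        switch (frustum_test(obj->bounds)) {
        case OUTSIDE:      break;                                 /* skip entirely */
        case FULLY_INSIDE: run_microcode("no_clip", obj); break;  /* fast path */
        case STRADDLES:    run_microcode("clip", obj);    break;  /* per-triangle clipping */
        }
    }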

    Originally Posted by sheath

    So the games polled up to the time of this research in 2003 used 2-8% of VU0, and VU1 was still needing further explanation to avoid the above pitfalls. pg 29 advocates VU0 usage as you are describing, specifically for skinning, testing visibility, AI, physics and particles, to prevent the MIPS from stalling. pg 32 summarizes the article to say that "most games still don't use VU0" and that most games are using 2-5 million polygons per second (aka horrible mid-gen Dreamcast land for the Insert Coin crowd).

    It would have been games released in the mid-2002/early-2003 period. I'm betting my first PS2 game was one of them *shudder*. The thing was not that the hardware was difficult; people just didn't get its parallel nature, and Sony had a really difficult time promoting that among developers.
    On a pragmatic, and historical, level it doesn't matter what the development issue was. The facts are the games were what they were for up to three years. All of this went downhill when I pointed out that 2003 was when the Dreamcast would have been 5 years old. Some here like to think what we saw on Dreamcast from 1998-2001 was all we would ever see, even while they want to argue that all games prior to 2003 on PS2 are null and void in a comparison. That_is_bias.

    I have no problem with saying that many if not most games failed to take full advantage of the PS2's capabilities; that is not the issue. The issue is that, by failing to do so, the published record of PS2 games is not absolutely outperforming the first console of its generation. That isn't even getting into image quality issues and 480p, or how apples-to-oranges the rendering types are in the first place.

    Quote Originally Posted by Crazyace View Post
    In some ways VU1+GS are the GPU, even though VU1 is physically present on the CPU chip. VU0 is far more closely coupled with the CPU, and was mainly used as a vector fpu in the same way as the SH4, or SSE on a PC CPU.
    As I said, the system relied on the CPU and its support chips rather than the GPU having hardware T&L. Without the VU1 the GS wouldn't be able to pump out polygons on a competitive level.

    Quote Originally Posted by Crazyace View Post
    That slide is actually pretty misleading.
    Firstly, the 56% VU1 figure. If you look at slide 21 in that presentation you'll actually see that in that frame VU1 is active for half the time (56%, maybe) because that frame is bottlenecked by the CPU.
    Secondly, the 2% VU0 figure. The counter shows VU0 running its own programs independently - something that wasn't always used. It doesn't mean that VU0 was idle, as it was also tied into the core as a SIMD coprocessor. You could run EE+VU0 in code as if it were the SH4, giving 3.2 GFlops, and still show 0% on the PA.
    The entire document is easy to misread; that page alone is averaging the games polled, though fortunately it gives some highs and lows. Aside from that, I think everybody, in their persistent advocacy, has completely missed my point here. By 2003 even Sony was saying that most games performed on a Dreamcast level. I have been actively insulted for saying that it looks like this might be the case for the whole generation. "Most games" does not equal all games, and "most games" do not indicate peak performance, as is true for all platforms.

  8. #323
    Road Rasher

    Quote Originally Posted by sheath View Post


    Please note the lack of extremism in that statement, a technical disadvantage is not a total absolute technical deficiency in all areas. Also note that I have never claimed that the Dreamcast could outperform the PS2 polygon wise, no not once. My speculations and experience have caused me to post about the possibility that most games that generation did not double or triple the Dreamcast's peak performance. For some reason some cannot handle even the slightest suggestion that might be the case, as indicated by press reports about PS2 RE4 and Sony's 2003 doc.
    I'm sorry, but I cannot agree with the bolded statement. You have repeatedly attempted to imply that, despite other posters' attempts to show that it is false.

    Quote Originally Posted by sheath
    I'm pretty sure Shenmue caps out at 3 million polygons per second. Also, one of the articles I linked to shows that as late as 2003 PS2 developers were struggling to maintain 1.5 million polygons per second at 25-30FPS even into 2003. Most weren't even maintaining that framerate with 52,000 polygons per frame.

    The PS2's polygon prowess is entirely exaggerated, an urban myth, a fabrication of Ken Kutaragi's cracked-out brain and idiot Sony fans. Resident Evil 4 had to have its textures and lighting chopped for the PS2, and the polygon counts were dropped to 900,000 per second from the Gamecube version's 1.5 million. Nobody thought Resident Evil 4 on either system looked low detail, quite the opposite. Why? The entire generation was floating around 1 million polygons per second, especially outside of racers and FPS.
    In that quote from the PS2 vs Dreamcast thread you made it quite clear that you felt the PS2 was overrated and that the Dreamcast was able to keep up with, and in some cases probably surpass, the PS2; you have used the DC version of Le Mans and the fact that "it is using 5 million polygons in game" to try and back up your claims.


    Then of course there is this "golden nugget" of a quote from you:


    Quote Originally Posted by sheath
    "Also, the PS2 seems to get a free pass all the time for sucking graphically because somehow it is performing better while looking like complete ass".
    It's clear as day that you think that PS2 graphics suck, and it seems to annoy you greatly that others don't see it for themselves. Frankly, to me this is no better than some of the fanboy BS that ABF has been shitting all over that thread since it was moved over to "Insert Coin".

  9. #324
    Wildside Expert

    Quote Originally Posted by sheath View Post
    As I said, the system relied on the CPU and its support chips rather than the GPU having hardware T&L. Without the VU1 the GS wouldn't be able to pump out polygons on a competitive level.
    Actually, if there was no VU1 PS2 games would have used VU0 a lot more, a bit like a much faster version of the SH4 on the Dreamcast.



    Quote Originally Posted by sheath View Post
    The entire document is easy to misread, that page alone is averaging the games polled, fortunately it gives some highs and lows. Aside from that, I think everybody, in their persistent advocacy, has completely missed my point here. By 2003 even Sony was saying that most games performed on a Dreamcast level. I have been actively insulted for saying that it looks like this might be the case for the whole generation. "Most games" does not equal all games, and "most games" do not indicate peak performance as is true for all platforms.
    I think that the main problem is people misreading the Sony technical presentations. These are almost always counting polygons sent to the GS after clipping, and counting over a whole game frame, so idle time is included. Both of those act to artificially reduce the polycounts when compared with most 'game engine' measurements based on total geometry throughput and exact GPU time.
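
    The gap between the two conventions is easy to see with made-up numbers (illustrative only):

    Code:
    /* The same frame measured two ways. */
    void counting_example(void)
    {
        const float submitted = 60000.0f;           /* polys the engine sends */
        const float drawn     = submitted * 0.75f;  /* 25% clipped away */
        const float busy_s    = 0.010f;             /* GPU active for 10 ms */
        const float frame_s   = 1.0f / 60.0f;       /* whole 16.7 ms frame */

        float engine_rate = submitted / busy_s;     /* 6.0M polys/s */
        float pa_rate     = drawn / frame_s;        /* ~2.7M polys/s, same frame */
        (void)engine_rate; (void)pa_rate;
    }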

    I can't see where Sony is saying that most games performed on a Dreamcast level in those documents. Are you trying to compare the worst PS2 poly counts to the best Dreamcast poly counts? (The polycounts here http://www.youtube.com/watch?v=uMeUjG-RPTA seem to show DOA running around 1-1.5 million vertices per second, not polygons - which is at least 4x lower than the actual polygon counts for Ratchet here http://www.research.scea.com/researc...er_18Mar03.pdf - slide 46)

  10. #325
    sheath

    Quote Originally Posted by stu View Post
    I'm sorry, but I cannot agree with the bolded statement. You have repeatedly attempted to imply that, despite other posters' attempts to show that it is false.

    Originally Posted by sheath

    I'm pretty sure Shenmue caps out at 3 million polygons per second. Also, one of the articles I linked to shows that as late as 2003 PS2 developers were struggling to maintain 1.5 million polygons per second at 25-30FPS even into 2003. Most weren't even maintaining that framerate with 52,000 polygons per frame.

    The PS2's polygon prowess is entirely exaggerated, an urban myth, a fabrication of Ken Kutaragi's cracked-out brain and idiot Sony fans. Resident Evil 4 had to have its textures and lighting chopped for the PS2, and the polygon counts were dropped to 900,000 per second from the Gamecube version's 1.5 million. Nobody thought Resident Evil 4 on either system looked low detail, quite the opposite. Why? The entire generation was floating around 1 million polygons per second, especially outside of racers and FPS.


    In that quote from the PS2 vs Dreamcast thread you made it quite clear that you felt that PS2 was over rated and that the Dreamcast was able to keep up with and in some cases probably surpass the PS2 and have used the DC version of Le Mans and the fact that "it is using 5 million polygons in game" to try and back up your claims.
    Notice the caveats: "exaggerated" and "an urban myth" based on Kutaragi's claims (e.g. individual grains of wood being rendered in a door, etc.). The PS2's polygon counts are taken from developer quotes and Sony themselves; do you contest those sources? Now where do you see me claiming that the Dreamcast was more proficient at pumping out polygons, in game or in theory? By citing actual game polygon figures? Obviously every Dreamcast game did not hit those peaks; the games themselves did not average those figures either.

    Okay, let me back up one more time. That silly troll thread started out the way all comparison threads in forums do. Some, like me, said the Dreamcast was probably good enough for the time and had great image quality; the majority, even in this Sega forum, said that the PS2 was obviously the most powerful system, with dramatic increases in polygon performance in particular. Did the games show this? If so, were they the exception, the mean or the median? When were these games released?

    Quote Originally Posted by stu View Post
    Then of course there is this "golden nugget" of a quote from you:

    Originally Posted by sheath
    "Also, the PS2 seems to get a free pass all the time for sucking graphically because somehow it is performing better while looking like complete ass".


    Its clear as day that you think that PS2 graphics suck and it seems to annoy you greatly that others don't see it for themselves. Frankly to me this is no better than the some of the fanboy BS that ABF has been shitting all over that thread since it was moved over to "Insert Coin"
    Obviously you associate me with ABF and others, which is why you persistently reply to flames against me with "you need to spread around more rep... bla bla". It's like you can't read a single thing I write without seeing it as something horrible against something you find great. I can understand that; the PS2's marketing was quite successful at establishing it as the most powerful system in the eyes of its supporters.

    In light of this quite prolific belief... actually, no, I don't even need to put this under that expectation. When I see a 224/240p PS2 game, or even a, what, 448-line PS2 game with 4-bit textures in most places and no anti-aliasing to speak of, I do, in fact, think it looks like ass. I'd rather play a PS1 or Saturn game than a PS2 game with those kinds of graphics. When the PS2 runs at a resolution more in line with reasonable 6th generation expectations, I think it looks fine. The problem is, in my not inconsiderable experience with the PS2 library since 2000, including three years of retail experience, the games that look like they should are far, far, far fewer than Sony advocates would ever want to admit.

    All of that is coming around to say, the very common claim that the PS2 is "way more powerful" really does not show up in most games but people believe it because that is how marketing works. The same people also tend to claim the Dreamcast was already maxed out in 2000-01, and that its polygon performance was so low that it was dead as soon as the PS2 launched. If that were so, then most PS2 games would have been laughed off the shelf for their "poor" performance.
    "... If Sony reduced the price of the Playstation, Sega would have to follow suit in order to stay competitive, but Saturn's high manufacturing cost would then translate into huge losses for the company." p170 Revolutionaries at Sony.

    "We ... put Sega out of the hardware business ..." Peter Dille senior vice president of marketing at Sony Computer Entertainment

  11. #326
    Outrunner

    Denuan set up his emulator to display triangle counts, and the DOA2 title screen (or was it the intro, I don't remember) was hitting over 3 million. DOA2 was also a very early game.

    So the question is not whether the DC can push above 3 million polys, but whether there is a point in doing so.

  12. #327
    sheath

    Quote Originally Posted by Crazyace View Post
    I think that the main problem is people misreading the Sony technical presentations. These are almost always counting polygons sent to the GS after clipping, and counting over a whole game frame, so idle time is included. Both of those act to artificially reduce the polycounts when compared with most 'game engine' measurements based on total geometry throughput and exact GPU time.
    http://research.scee.net/files/prese...rHaveWeGot.pdf
    pg13:
    "52,000 polys per frame
    - Min 10,000 - Max 145,000

    Framerate: 60% were running at 25/30 or less

    95% were using full height buffers"

    pg 32

    "More than Half the games run at 25/30
    ...
    Most recent games draw over 50k polys
    - Fastest so far seems to be 125k polys at 60fps

    Most games draw between 2 and 5 Mps

    Main slowdown is still CPU efficiency
    - Cache misses"

    So, peak performance of 125k polys at 60 FPS equals 7.5 million polygons per second, but the average is between 2-5 million, according to Sony, prior to report time in 2003. The "over half are 25/30" comment on both pages cited indicates that the polygon count figures cover more than 60% of the games polled; otherwise they would have just said 60% in both cases.

    I am totally open to any discussion over whether these polygons are to be considered polygons worth counting, but then why did Sony count them in an analysis of how far PS2 development had to go?

    Quote Originally Posted by Crazyace View Post
    I can't see where Sony is saying that most games performed on a Dreamcast level in those documents? Are you trying to compare the worst PS2 poly counts to the best Dreamcast poly counts? ( The polycounts here http://www.youtube.com/watch?v=uMeUjG-RPTA seem to show DOA running around 1-1.5Million vertices per second, not polygons - which is at least 4x lower than the actual polygon counts for Ratchet here http://www.research.scea.com/researc...er_18Mar03.pdf - slide 46 )
    Sony wasn't talking about the Dreamcast at all, and obviously didn't have any reason to in 2003. I am trying to compare the average, or median, PS2 game polycount to the Dreamcast's conservative in-game specs, not the low ends of either. If Dead or Alive is less than 1 million polygons per second, then all developer comments about the game are proven false.

    Either way, especially in light of Melbourne House's second and third efforts on PS2, I have no problem seeing the PS2 exactly as I did when I bought my first one in December 2001. That is at least as capable as the Dreamcast with the potential for double or triple the polygons of the Dreamcast peak. What I don't see in the games is what apparently most others, especially Sony fans, see and that is the Dreamcast as a half generation behind and incapable of competing for a full lifecycle with the PS2 on the market. That, too, isn't even getting into the PS2's inherent image quality issues, which apparently everybody wants completely dismissed as inconsequential.
    "... If Sony reduced the price of the Playstation, Sega would have to follow suit in order to stay competitive, but Saturn's high manufacturing cost would then translate into huge losses for the company." p170 Revolutionaries at Sony.

    "We ... put Sega out of the hardware business ..." Peter Dille senior vice president of marketing at Sony Computer Entertainment

  13. #328
    Wildside Expert

    Hi Zyrobs,

    I haven't looked at many emulators for DC - so I'm just taking the numbers from the NullDC video as is - Do you think that they are inaccurate?

    Hi Sheath,

    That presentation is pretty misleading at times - The comparison with DOA doesn't invalidate the 3M poly/second claim for that engine - it's just a capture of a single frame, just like the PS2 PA captures, which show the drawn GS polygons normalised to the frame/capture time or markers.

    ( It's funny how this thread seems to be PS2 vs DC - and the other PS2 vs DC thread seems to be a general 6th gen comparison )

    I agree that the PS2 is at least as capable as the Dreamcast - but not the other way around. That doesn't equate to the DC not being competitive in terms of games for a full lifecycle though, as better textures would be balanced against lower polycounts (just like the RE4 Gamecube/PS2 conversion).

  14. #329
    Barone

    @stu
    Quote Originally Posted by sheath View Post
    If the PS2 hadn't lasted ten freaking years it would be remembered as a DVD player first, a PS1 second and a PS2 player third.
    Quote Originally Posted by sheath View Post
    The PS2 is the ultimate version of what Nintendo 64 marketing attempted. People have bought that line hook, line and sinker and I see no reason to do anything but oppose it.
    But we are the fanboys.




    Quote Originally Posted by sheath View Post
    Barone probably didn't help with this perception. In this perception, both of you and a few others persist in looking down on my posts no matter how many times I phrase everything with caveats and as questions rather than statements.
    Yeah, we'll see about that...


    Quote Originally Posted by sheath View Post
    As such, what I see you and Barone saying is that because something could be done all instances of it not being done don't count or shouldn't be considered.
    And you complain about people distorting your posts? Ha.


    Quote Originally Posted by sheath View Post
    Also note that I have never claimed that the Dreamcast could outperform the PS2 polygon wise, no not once.
    My speculations and experience have caused me to post about the possibility that most games that generation did not double or triple the Dreamcast's peak performance. For some reason some cannot handle even the slightest suggestion that might be the case, as indicated by press reports about PS2 RE4 and Sony's 2003 doc.
    Quote Originally Posted by sheath View Post
    Now where do you see me claiming that the Dreamcast was more proficient at pumping out polygons, in game or in theory? By citing actual game polygon figures? Obviously every Dreamcast game did not hit these peaks, the games themselves did not average those figures either.
    __________ much?
    Quote Originally Posted by sheath View Post
    The bottom line is the Dreamcast was more capable of surpassing 3 million polygons per second than the PS2 and the entire generation floated around 1 million with a few exceptions on each platform.
    What about that, sheath? Am I "twisting" your words?


    Quote Originally Posted by sheath View Post
    I am asking questions, I want to understand these systems better so I can develop better hardware sheets with footnotes to all primary docs available. That is all.
    Quote Originally Posted by sheath View Post
    Either way, especially in light of Melbourne House's second and third efforts on PS2, I have no problem seeing the PS2 exactly as I did when I bought my first one in December 2001. That is at least as capable as the Dreamcast with the potential for double or triple the polygons of the Dreamcast peak.


    This is how you actually acted all the time:
    Quote Originally Posted by sheath View Post
    My experience is very few games this generation even achieved a solid 30FPS, and even fewer managed 60FPS, most of which are on the Dreamcast or are graphically unimpressive games.
    Quote Originally Posted by sheath View Post
    Yeah, those specs were as outrageous as the PS2's specs. I don't think any of these systems breached the 5 million per second mark.
    Quote Originally Posted by sheath View Post
    For whatever reason 4 to 5 million polygons per second was all any developer boasted of their engines for the entire generation.
    Quote Originally Posted by sheath View Post
    The PS2 got a lot of hype early on, and empty promises of games with 10 or even 20 million polygons per second, but that never happened.
    Quote Originally Posted by sheath View Post
    Then I looked at polygon counts for known games and found that these games were not performing at that level and proceeded from there.
    Quote Originally Posted by sheath View Post
    The point of the polygon counts discussion is that the marketing for the PS2 and Xbox, and to a lesser extent the Gamecube, have "proven" to people that these were more powerful than the Dreamcast when they apparently were_not.

    On top of that, you also got cocky without having researched the other hardware designs:
    Quote Originally Posted by sheath View Post
    The other systems were never going to match the Dreamcast in fillrate, ever.
    Quote Originally Posted by sheath View Post
    By all means please explain exactly how much slower the Dreamcast's CPU is in comparison to the Xbox and Gamecube CPUs.
    Quote Originally Posted by sheath View Post
    I guess since that's EA's article this isn't an uncommon practice. Still, there's a lot of assumptions in that figure. Do we have a single confirmed developer quote of any Xbox, Gamecube or PS2 game surpassing 5 million polygons per second? I have no delusions that the Dreamcast wasn't going to achieve that in a sustained way, but from what I've seen the later consoles benefit more from their RAM advantage than from any clear rendering advantage.
    Quote Originally Posted by sheath View Post
    If the PS2 ends up being the polygon monster Sony always promised, even if that took five, eight or ten years, I will not deny it. Do you have that fact documented? Why have you not shared it? Why are you going completely nuts on me if you have not seen documentation of a PS2 game well and above the Dreamcast performance wise?

    And the proof that you didn't grasp the hardware that you love to thrash all the time came in quotes like these:
    Quote Originally Posted by sheath View Post
    Yes, the PS2's VRAM was intended to be high speed cache, but it still needs to store the polygons and frame buffer, I'm sure.
    Quote Originally Posted by sheath View Post
    More fun with numbers from the first page of the 6th gen page:
    "Let see, if the Dreamcast can render more polygons then it can store, and I will use 6 mpps as an example:

    6,000,000 (polygons) / 60 (frames per second) = 100,000 polygons per scene
    100,000 x 40 Bytes (size of polygon) = ~4 MB

    Since the Dreamcast only has 8 MB of video memory, that is a lot of memory!

    8 MB - 1.2 MB (640x480x16-bits double buffered frame buffer) - 4 MB (polygon data) = 2.8 MB

    Only 2.8 MB left for textures, and even with VQ compression that is not very much. At 3 mpps per second, there is 5.8 MB available for textures, and that is much better. Just shows you, that there is not much point in creating a game engine on the DC that does more than 3 million polygons per second. Anyway 90 percent of the developers out there cannot even get over a million polygons per second on the Dreamcast."
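
    Recomputing the quote's own arithmetic (its assumptions, not mine: 40 bytes per polygon, decimal megabytes, 640x480 16-bit double buffering):

    Code:
    void dc_vram_budget(void)
    {
        const double per_frame  = 6e6 / 60.0;                  /* 100,000 polys/frame */
        const double poly_bytes = per_frame * 40.0;            /* 4.0 MB of polygon data */
        const double fb_bytes   = 640 * 480 * 2 * 2;           /* ~1.2 MB, double buffered */
        const double tex_bytes  = 8e6 - fb_bytes - poly_bytes; /* ~2.8 MB left for textures */
        (void)tex_bytes;
    }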

    How do you suppose, with 4MB of VRAM, the PS2 was supposed to normally function at or above this level? Why is it so obvious to you that the PS2, with 4MB of VRAM and hundreds of games that run at 240/224 lines (as your Melbourne House interview agrees), completely outclasses the Dreamcast in graphical capabilities in the actual library?
    If you had spent 1% of the time that you've dedicated to blaming Sony, the PS2, the mainstreamers and your evil friend (a.k.a. me) on actually researching and trying to learn how those systems worked, as you're claiming now, you'd have read stuff like this.
    And you'd know that a calc like this applies to tile-based rendering, but not to consoles like the PS2/GC/Xbox.


    So, after claiming all that, blaming people who have different gaming tastes from yours, getting cocky and all, you were confronted with actual sources which proved:
    - The Dreamcast was less capable than the other consoles of breaking the polygon count barriers that you had established for that gen.
    - There were several games which surpassed the 1 million, 3 million and 5 million barriers that you've created in your mind.
    - There are PS2 games which did surpass the supposed DC polygon count limit by a wide margin.
    - The polygon count promises about the PS2 games weren't empty.
    - Not only ultra-late releases surpassed your polygon count barriers.
    - 3D platformers, and not only racing and FPS games, also surpassed those barriers.



    What was your reaction to that?
    Quote Originally Posted by sheath View Post
    I will never assume or accept what you and a couple of others have read into my statements. The actual truth is as I have always said, we need more facts not more opinions.
    Quote Originally Posted by sheath View Post
    In these quotes you have bolded yourself you see Melbourne House, which always developed the most advanced engines, did it again on the PS2 with Grand Prix Challenge. Are you claiming that all or most other PS2 racers were up to this performance level? I have already admitted that "in a vacuum" benchmarks for the PS2 are just about triple that of the Dreamcast's peak performance.
    Quote Originally Posted by sheath View Post
    Taking more advanced games than even the same system were showing at the time as some sort of proof that the Dreamcast could not, in the absolute sense, keep up is unbiased?
    Quote Originally Posted by sheath View Post
    I am trying to figure out if re-rasterizing polygons each pass is just as resource heavy as the raw polygon, or more specifically if it makes sense to count the same polygons again on a per pass basis when determining polygons per second performance. Between that and the wide variety of ways systems render polygons (Tile Based Renderers, Multi Pass to Single Pass Renderers, and whatever else is out there) the whole polygon spec is becoming even less meaningful to me than it was before.
    Quote Originally Posted by sheath View Post
    I am leaning toward counting them only as much as they weigh in resources, and even then only in the context of systems with better GPUs not needing more polys for effects in the first place.
    You just minimized and relativized the meaning of those sources; the sources that disproved a lot of what you had claimed, repeatedly, for years. The sources that you claimed to be searching for all the time.
    The sources you supposed did not exist...




    And, after that, you also found "new" ways to downplay the PS2's hardware:
    Quote Originally Posted by sheath View Post
    While large textures causing the GS to stall may not be black and white, though Sony's doc more than implies it is, they certainly are a reliable problem based on the evidence on the table.
    Quote Originally Posted by sheath View Post
    As for the Vector Units making up for any perceived deficiency in the CPUs, you seem to be ignoring the hardware processing capabilities of the GPUs for the other systems. I know you aren't, but since we are jumping to conclusions based on omissions there it is.
    Quote Originally Posted by sheath View Post
    For my part I have always shown the PS2 as a system that depended almost entirely on the CPU and its on die processing capabilities to push 3D, while the later 6th gen consoles relied more heavily on the GPU's processing capabilities. I seem to be getting lumped in with "typical" forum "Sega fans" though.
    Quote Originally Posted by sheath View Post
    The heavy requirement on the developer and the obvious difficulty in getting the VUs to function even at half efficiency is a technical disadvantage. Listing their theoretical peaks without that context is pure biased fabrication on the part of everyone who does it.
    And several of those assumptions have already been refuted as well...



    Your latest step was to melt down, play the victim and turn against me, stu and whoever questions all that shit that you've been saying and doing for a long time:
    Quote Originally Posted by sheath View Post
    If anything I have proven that no matter how many times I rephrase my statements you and everybody else in this place will continue push their own agenda and interpretation while reading my statements in the worst possible light. A microcosm of this is how everybody keeps assuming I am saying there is no benefit to bluray whatsoever. This assertion actually is only in your minds.
    Quote Originally Posted by sheath View Post
    These happened only in your mind. I got fierce when you more than implied, again, that I was attempting to dismiss facts or somehow discredit them. The constant defensiveness in regard to all things Sony makes any and all discussion of these facts impossible. It has even allowed the trolls to (accidentally) poke holes in your arguments for the PS2's superiority.
    When I ask a question, you accuse me of dismissing and obfuscating facts. To the former I say, again and again, nothing is dismissed, and contextualizing facts is right and beneficial to the discussion. To the latter I say the facts are already stupefying and confused in these discussions. You might as well be blaming me for why the wires under my desk are a rat's nest right now, when the last time I touched them they were in neat rows.

    Better yet, you keep evoking the same crap:
    Quote Originally Posted by sheath View Post
    On a pragmatic, and historical, level

    ...and blaming "people":
    Quote Originally Posted by sheath View Post
    All of that is coming around to say, the very common claim that the PS2 is "way more powerful" really does not show up in most games but people believe it because that is how marketing works. The same people also tend to claim the Dreamcast was already maxed out in 2000-01, and that its polygon performance was so low that it was dead as soon as the PS2 launched. If that were so, then most PS2 games would have been laughed off the shelf for their "poor" performance.
    Empty accusations towards the "idiot Sony fans" (a.k.a. "mainstreamers") will not make any of your previous absurd statements any less absurd.

    OTOH, your anti-Sony speech and your conspiracy theory about PR statements being responsible for the DC's and/or Sega's failure get more and more anecdotal every time you reiterate them.
    Be it for their own emptiness, or for the fact that you are the same guy who advocates the 32X to no end, an add-on which was advertised as being "FORTY TIMES faster than 16-bit machines" while several of its 16-bit-looking games failed to run at the same frame rate as the supposedly far inferior 16-bit machines (and let's forget that the Neo Geo was also a 16-bit machine).
    IF buying a platform which, overall, underdelivered on the PR specs is a sign of having suffered a PR and/or media brainwash and/or of being an idiot fan, well, then I'm sad to say that you're not in a better position than the "idiot Sony fans"/"mainstreamers".


    Seriously, sheath, cut the crap. Once and for all.
    Quit the drama, quit the accusing and get down off your high horse.

  15. #330
    Outrunner

    Quote Originally Posted by Crazyace View Post
    I haven't looked at many emulators for DC - so I'm just taking the numbers from the NullDC video as is - Do you think that they are inaccurate?
    More accurate than random old newspaper quotes. It's checking what the hardware is drawing, instead of what some paid-off "developer" is writing about consoles he may not even have worked on.
