
Thread: PS2 vs Dreamcast Graphics

  2. #362
    Hard Road! ESWAT Veteran Barone's Avatar
    Join Date
    Aug 2010
    Location
    Brazil
    Posts
    6,839
    Rep Power
    142

    Default

    @rusty
    Wow, great to have a developer from those days around here. Welcome aboard, and thanks a lot for your input.
    Please, let me ask you some questions and add some comments:

    Quote Originally Posted by rusty View Post
    Of course, what Alex and Rich don't tell everybody is that B1 and B3 didn't use RenderWare for rendering at all. It was *all* custom stuff. B2 did but it was a nightmare getting decent performance out of it. Oh the irony, given that Criterion Games was the show pony to help sell RenderWare. It's true that they had the same version of RenderWare as everybody else - it just wasn't used!
    I guess it has something to do with PR, ehehe.
    By the way, I still think that Burnout 2 looked amazing on the PS2 by the time of its release, especially compared to the other games available then. I love how detailed the city is. The 3D models of buildings and trees, to name a few, look far more detailed than on the best looking DC racers IMO.

    Do you have any idea about how many polygons per second or per frame the Burnout games, especially from the second one and on, were pushing on the PS2?


    Quote Originally Posted by rusty View Post
    On PS2 vs Dreamcast, there's no question which console wins on capability. The PS2. Sure, the VRAM was tiny, and there was the AA bug in the GS, meaning you were never going to be able to have 640x480 at 60fps.
    Just to clarify, you meant you had to go without Anti-Aliasing to be able to achieve 640x480 at 60fps, right?
    Some games, like Grand Prix Challenge from Melbourne House, claim to be using real-time supersampling instead. Was that the best way to go in your opinion, or did you use some other workaround to the AA bug in order to avoid jaggies in your games?
    Have you tried/used any resolution mode higher than 640 x 480 in your PS2 games? What was the major limitation keeping you from using the PS2's higher res modes?


    Quote Originally Posted by rusty View Post
    But the machine was just such a beast under the hood. A beast that took a lot of effort to get the simplest things up and running, but still a beast all the same. The fill-rate was just so insane on the PS2; I remember one of the first technical notes from Sony was "don't bother checking for back-facing polygons. The GS is so fast, it takes less time for the GS to do it than the calculation does on VU1".
    Thanks for the info. It confirms what was said in that old MDK2 developer interview and several other comments regarding the fill-rate capacity of the PS2.


    Quote Originally Posted by rusty View Post
    It wasn't just theory. Every PS2 game I worked on just used VRAM as a place for the frame buffers and as a texture cache. You sorted your scene by material, and uploaded the textures once and once only for a series of draw calls and did this every frame.
    Thanks a lot for this. I was afraid that I could have made some stupid mistake while trying to comprehend how that stuff worked, but it feels like I wasn't so far from reality.

    Please, do you remember the actual amount of main RAM that you used to "share" as texture RAM in your games?
    This could give me a better idea about how long the Dreamcast could have remained competitive "fixed" at its 8 MB of VRAM.


    Quote Originally Posted by rusty View Post
    When I worked on a UE based PS2 title, it only confirmed to me just how dumb PC coders were at the time - they guys who had done the port and stuck to the PC way of doing things, by using VRAM as a permanent storage for all the textures with some really horrendously complex memory allocator. The comments in the code were hilarious, along the lines of "OMG - how do other games do this? Undocumented haxxxx???". They just didn't get it, despite all of the technical whitepapers and docs that Sony had.
    "UE" means "Unreal Engine"???

    But yes, I imagine the source code of most PC games at the time was a mess. Several of those engines were developed in a "chaos" environment AFAIK, with no sense of software engineering and without giving much importance to clear, well-organized/structured code. Much less to actual optimization of resource usage.
    The "infinite" resources of the PC platform encouraged a lot of bad programming and project development practices when compared to consoles AFAIK.

    Do you think that such a "mess" state of things might have prevented the DC from being able to receive more and better ports of PC games?
    Speed Busters, for example, was awesome on the PC but it had some serious frame rate problems on the DC, despite most of the DC's specifications and its hardware design suggesting that it would have no problem running games like that.
    Do you remember about any major bottleneck in the DC hardware when you worked with it? Did you have a background in other platforms?
    What do you think about Sega using the Windows CE on the Dreamcast? Was it efficient when compared to the Sony solution for the PS2 in your opinion?


    Quote Originally Posted by rusty View Post
    MGS2 is one of the better examples of this scheme though. When the performance analyzer came out, you were able to run a retail game and profile EVERYTHING, including the commands sent to the VIF/GIF. The guys on MGS2 were so damned clever. All of their characters were broken into micro-packets that used little 32x32 texture squares to minimize the risk of the GS stalling because a texture upload took too long (texture transfers could happen in parallel while rendering was happening). My jaw just dropped when I saw this. Then I looked at how our cars were rendered, and, keep in mind, Alex Fry is a really smart guy; and we were still uploading massive 256x256 uncompressed textures in one go.
    Pure gold, thanks.
    Any idea on how many polygons MGS2 was able to push (I imagine the performance analyzer could measure that too, right?)?


    Quote Originally Posted by rusty View Post
    When I look at MGS2 today; I'm still amazed at how advanced it is for such an early PS2 title. I wish I still had access to the PS2PA kits, because I also suspect that game was using per pixel lighting, rather than just vertex lighting.
    Per pixel lighting in 2001? Yep, that would be really impressive.


    Quote Originally Posted by rusty View Post
    Mind you...I just wish Sega had been more careful with the design of the dev hardware. There's nothing like switching off the TV and dev kit in the wrong order at 3am only to hear five grand of kit go "pop".
    Ouch!

    Thanks again for taking the time to take part in this discussion.


    Quote Originally Posted by gamevet View Post
    I still think Metal Gear Solid 2 is an amazing looking game. I had bought the Xbox version of MGS2: Substance, only to find out it ran like crap on the console.
    Especially in the outdoor sections of the Tanker, IIRC.
    Several PS2-designed games had more slowdown on the Xbox from what I've read. I guess that they relied on the parallelism of the PS2 hardware for some important tasks and then ended up being bottlenecked on the Xbox.

  3. #363
    Wildside Expert
    Join Date
    Jan 2014
    Posts
    146
    Rep Power
    9

    Default

    Quote Originally Posted by Barone View Post
    @rusty
    Wow, great to have a developer from those days around here. Welcome aboard, and thanks a lot for your input.
    Please, let me ask you some questions and add some comments:
    Oh hey, you're quite welcome. I'm glad that you find it interesting.

    **edit**
    But please take everything I write with a pinch of salt; I'm very hazy on some exact numbers and such. The general techniques, though, I remember vividly. We never tried to hit a certain number of triangles per frame - our goal was to make the game look good. You'd be amazed how re-phrasing your goal makes you push harder and come up with more inventive solutions.

    Quote Originally Posted by Barone View Post
    By the way, I still think that Burnout 2 looked amazing on the PS2 by the time of its release, especially compared to the other games available then. I love how detailed the city is. The 3D models of buildings and trees, to name a few, look far more detailed than on the best looking DC racers IMO.

    Do you have any idea about how many polygons per second or per frame the Burnout games, especially from the second one and on, were pushing on the PS2?
    I have absolutely no idea. It was a lot though. The number of polys that you can push is a bit of a crap benchmark on PS2 though. And the reason for this is that it wasn't designed to push polys in a 1:1 fashion. It could do it, but if you look at the hardware, especially how DMA works through the VIF and how VU1 and the GS work in parallel, your first thought is: it's designed to make use of high-order primitives (curved surfaces, procedural models, sub-division cages etc.). That is, it's designed to take a small data set and spit out lots of triangles by tessellating on the fly.

    So really...this sort of measurement has little meaning, as the real constraint was pushing data through the VIF in a one-to-one fashion, and trying to double-buffer that up in VU1 memory so that you could have the GS and VU1 operating in parallel. But everybody was so committed to the traditional 1:1 triangle rendering that machines like the Xbox supported that they replicated it for every aspect of rendering on the PS2.
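
    To picture the "small data set in, lots of triangles out" idea, here's a toy sketch in plain C. Nothing PS2-specific and no SDK calls, just the shape of the data flow; on the real hardware the expansion would happen on VU1 so only the control points cross the bus:

    Code:
    /* Toy version of "small data set in, lots of triangles out": one
     * four-corner patch tessellated N x N on the fly. */
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    /* Bilinear interpolation across the four corner control points. */
    static Vec3 bilerp(const Vec3 c[4], float u, float v)
    {
        Vec3 p;
        p.x = (1-u)*((1-v)*c[0].x + v*c[2].x) + u*((1-v)*c[1].x + v*c[3].x);
        p.y = (1-u)*((1-v)*c[0].y + v*c[2].y) + u*((1-v)*c[1].y + v*c[3].y);
        p.z = (1-u)*((1-v)*c[0].z + v*c[2].z) + u*((1-v)*c[1].z + v*c[3].z);
        return p;
    }

    int main(void)
    {
        Vec3 corners[4] = { {0,0,0}, {1,0,0}, {0,0,1}, {1,0,1} };
        enum { N = 16 };                    /* tessellation factor        */
        Vec3 grid[N + 1][N + 1];            /* the expanded vertex stream */
        for (int j = 0; j <= N; ++j)
            for (int i = 0; i <= N; ++i)
                grid[j][i] = bilerp(corners, (float)i / N, (float)j / N);
        printf("4 control points -> %d vertices, %d triangles\n",
               (N + 1) * (N + 1), 2 * N * N);
        printf("last vertex: (%.2f, %.2f, %.2f)\n",
               grid[N][N].x, grid[N][N].y, grid[N][N].z);
        return 0;
    }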


    Quote Originally Posted by Barone View Post
    Just to clarify, you meant you had to go without Anti-Aliasing to be able to achieve 640x480 at 60fps, right?
    Some games, like Grand Prix Challenge from Melbourne House, claim to be using real-time supersampling instead. Was that the best way to go in your opinion, or did you use some other workaround to the AA bug in order to avoid jaggies in your games?
    Have you tried/used any resolution mode higher than 640 x 480 in your PS2 games? What was the major limitation keeping you from using the PS2's higher res modes?
    So I was less than clear on this, sorry. I suffered from fizzy-drink-induced insomnia this evening. Achieving 640x480 was not really possible in a game because it took up so much VRAM for the 2x back buffers and the front buffer. The chip AA just didn't work. At all. And the way you got around that was by rendering to a double-height back-buffer, and then drawing a full-screen quad (well..a series of 32x32 quads because it was faster that way) at half height, using the texture sampler to reduce the jaggies. So yeah....the guys at Melbourne House (who were awesome BTW) are talking about that technique. It was actually an advisory from Sony right from day 1: "Hardware AA is broken. Don't use it. Do this instead."
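
    If it helps to see the trick in code, this is a rough CPU-side sketch of what drawing the double-height buffer back out at half height effectively does. It's just the arithmetic of a 2:1 vertical filter, not GS/SDK code:

    Code:
    /* CPU-side sketch of the double-height buffer trick: collapse vertical
     * pixel pairs with a 2:1 box filter, which is roughly what the GS ends
     * up doing when the half-height textured quads are drawn with bilinear
     * sampling. */
    #include <stdint.h>

    void downsample_vertical_2to1(const uint32_t *src, uint32_t *dst,
                                  int width, int dst_height)
    {
        for (int y = 0; y < dst_height; ++y) {
            const uint32_t *row0 = src + (2 * y)     * width;
            const uint32_t *row1 = src + (2 * y + 1) * width;
            for (int x = 0; x < width; ++x) {
                uint32_t a = row0[x], b = row1[x];
                /* average each 8-bit channel; masking keeps bytes separate */
                dst[y * width + x] = ((a & 0xFEFEFEFEu) >> 1)
                                   + ((b & 0xFEFEFEFEu) >> 1);
            }
        }
    }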

    It was nowhere near as good as proper AA though, but good enough for most things. It also meant you could do some neat tricks. See the bloom/fake HDR in B2 and B3 as an example of that.

    The PS2 was never really meant to render at high res. It was supposed to render at a lower resolution than the DC, but use HW AA to smooth out the jaggies. Doh!


    Quote Originally Posted by Barone View Post
    Thanks for the info. It confirms what was said in that old MDK2 developer interview and several other comments regarding the fill-rate capacity of the PS2.
    You have no frickin' idea. It was just insane; ~40GB/s alone for the rendering side, which is close to the XB360!! The odd thing that people don't realise is that the GS was meant to do programmable pixel ops. But it didn't work in the same way as traditional pixel shaders worked. You were supposed to do things in multiple passes, hence the insane fill-rate. You could even do texture decompression using VQ compression as a series of passes on a texture. I ran out of space for my decompressor so I just used the alpha channel in the front-buffer as a work buffer. It was so flexible that you could do things like that.

    Interesting note; modern render engines use multiple passes to do most things. Just like the PS2 hardware was designed to do.
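
    For anyone curious what VQ texture data actually looks like, here's a little reference decode in plain C. The layout is kept linear for clarity (the real formats are swizzled/twiddled), and the 2x2, 16-bit-per-texel codebook entry is an assumption about the scheme, not copied from any SDK:

    Code:
    /* Reference decode of a VQ texture: 256-entry codebook, each entry a
     * 2x2 block of 16-bit texels, one index byte per output 2x2 block. */
    #include <stdint.h>

    void vq_decode(const uint16_t codebook[256][4], const uint8_t *indices,
                   uint16_t *out, int w, int h)
    {
        for (int by = 0; by < h / 2; ++by) {
            for (int bx = 0; bx < w / 2; ++bx) {
                const uint16_t *e = codebook[indices[by * (w / 2) + bx]];
                out[(by * 2 + 0) * w + bx * 2 + 0] = e[0];  /* top-left     */
                out[(by * 2 + 0) * w + bx * 2 + 1] = e[1];  /* top-right    */
                out[(by * 2 + 1) * w + bx * 2 + 0] = e[2];  /* bottom-left  */
                out[(by * 2 + 1) * w + bx * 2 + 1] = e[3];  /* bottom-right */
            }
        }
    }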


    Quote Originally Posted by Barone View Post
    Thanks a lot for this. I was afraid that I could have made some stupid mistake while trying to comprehend how that stuff worked, but it feels like I wasn't so far from reality.

    Please, do you remember the actual amount of main RAM that you used to "share" as texture RAM in your games?
    This could give me a better idea about how long the Dreamcast could have remained competitive "fixed" at its 8 MB of VRAM.
    All of it. What wasn't used for render buffers, or temporary render targets, was used for texture cache. So the amount tended to vary depending on your needs.
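
    The frame loop that falls out of that is dead simple. Rough shape below; upload_texture and submit_draw are made-up stand-ins here, not anything from the Sony libraries:

    Code:
    /* Sketch of the "VRAM as texture cache" frame: sort draws by material,
     * upload each texture once, then issue every draw call that uses it. */
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct { int material_id; /* plus mesh data */ } Draw;

    /* Hypothetical stand-ins for the real upload and kick routines. */
    static void upload_texture(int id)     { printf("upload material %d\n", id); }
    static void submit_draw(const Draw *d) { printf("  draw (material %d)\n", d->material_id); }

    static int by_material(const void *a, const void *b)
    {
        return ((const Draw *)a)->material_id - ((const Draw *)b)->material_id;
    }

    void render_frame(Draw *draws, int count)
    {
        qsort(draws, count, sizeof(Draw), by_material);
        int current = -1;
        for (int i = 0; i < count; ++i) {
            if (draws[i].material_id != current) {
                current = draws[i].material_id;
                upload_texture(current);   /* one upload per material per frame   */
            }
            submit_draw(&draws[i]);        /* many draws reuse the cached texture */
        }
    }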


    Quote Originally Posted by Barone View Post
    "UE" means "Unreal Engine"???
    Yeah, sorry about that. Unreal Engine 2.5.

    Quote Originally Posted by Barone View Post
    Do you think that such a "mess" state of things might have prevented the DC from being able to receive more and better ports of PC games?
    Speed Busters, for example, was awesome on the PC but it had some serious frame rate problems on the DC, despite most of the DC's specifications and its hardware design suggesting that it would have no problem running games like that.
    Do you remember about any major bottleneck in the DC hardware when you worked with it? Did you have a background in other platforms?
    What do you think about Sega using the Windows CE on the Dreamcast? Was it efficient when compared to the Sony solution for the PS2 in your opinion?
    I'm sort of running out of time before I have to get out of my PJ's and head to work. So I'll give you a condensed answer, and expand on anything specific that you want me to answer.

    Real problem for DC was low sales. Low sales = no fiscal reason for spending money on a port, let alone a good one.
    Major bottleneck: it was sort of fixed in what you could do. It wasn't really all that flexible (I liked it though), and as I said, this could also be a strength.
    Windows CE was horrible. Remember Tomb Raider on the DC? Yeah...the lead programmer on that, who I worked beside, still bitches about how bad CE was, even today.


    Quote Originally Posted by Barone View Post
    Pure gold, thanks.
    Any idea on how many polygons MGS2 was able to push (I imagine the performance analyzer could measure that too, right?)?
    Again...a bit of a meaningless number. Lots is the answer. The real tell of how well written that game was is the number of animated characters it could have on screen at once. There is a lot of detail, so I'll just say "more than most PS2 games, even towards the end of the console's life". Pushing stuff through the VIF and being clever with VU1 was the real limiting factor on PS2.



    Quote Originally Posted by Barone View Post
    Especially in the outdoor sections of the Tanker, IIRC.
    Several PS2-designed games had more slowdown on the Xbox from what I've read. I guess that they relied on the parallelism of the PS2 hardware for some important tasks and then ended up being bottlenecked on the Xbox.
    Hit the nail on the head.

    I remember my first day as a Senior PS2 coder on the UE project (I was promoted to Lead a couple of months later). I sat down with the team to discuss our strategy, and this artist guy starts off with "the PS2 is crap at doing outdoor scenes, and this game is mostly outdoors. We should just accept it's going to be bad". Whut???? I ran the level he had made for the PS2 through the PA, noted the results, and talked with another artist. Got that artist to reduce his 256x256 textures to be smaller (in fact, we merged several into one texture) and did things like NOT HAVING 6 DIFFERENT TEXTURES ON A BOX. Then we showed him his outdoor scene with no noticeable drop in quality, running at 60fps on UE 2.5.

    Can't do outdoor scenes very well, eh? Badly made outdoor scenes maybe. I had a chat with the art director and he was moved onto the PC.
    Last edited by rusty; 01-30-2014 at 01:43 AM.

  4. #364
    Mastering your Systems Hero of Algol TmEE's Avatar
    Join Date
    Oct 2007
    Location
    Estonia, Rapla City
    Age
    27
    Posts
    9,965
    Rep Power
    103

    Default

    Incredibly fascinating stuff!
    Death To MP3, :3

  5. #365
    Raging in the Streets azonicrider's Avatar
    Join Date
    May 2013
    Location
    British Columbia
    Posts
    2,588
    Rep Power
    38

    Default

    Yes of course low sales were a problem for Dreamcast. I bet if it sold 8 million by Sept 2000, third-parties would be going ape shit over the console.
    Certified F-Zero GX fanboy

  6. #366
    Wildside Expert
    Join Date
    Jan 2014
    Posts
    146
    Rep Power
    9

    Default

    Quote Originally Posted by azonicrider View Post
    Yes of course low sales were a problem for Dreamcast. I bet if it sold 8 million by Sept 2000, third-parties would be going ape shit over the console.
    Yeah, it was a problem. Sega had gotten everything about the Dreamcast right: the cost, the spec, the cost of development, the absolutely insanely good SDK compared to the stuff that Sony offered. But the marketing sucked, and they didn't have the cash to ride out the storm after several poorly selling hardware choices (Mega CD, 32X and Saturn).

    There was also the crap about EA demanding exclusive rights to Sports games on the Dreamcast which didn't help. If people had seen the EA sports titles on DC before PS2 had been launched, I'm pretty sure it would have gotten a fair market share. DC vs PS2 is not a great discussion for the DC, but DC vs PS1? There's more than just a clear winner there.


    I have a story to tell about the shenanigans that went on. It was around this time that Microsoft were canvassing devs for the Xbox. However, the Dreamcast was in the way of them having a clear #2 spot to aim for. They were afraid that if Sega rode out the tough period, it would cannibalise sales of the Xbox. So they would offer to pay the cost of developing a Dreamcast title that was in development, which the developer would keep working on but then never release. Talk about underhanded, eh?

  7. #367
    Raging in the Streets azonicrider's Avatar
    Join Date
    May 2013
    Location
    British Columbia
    Posts
    2,588
    Rep Power
    38

    Default

    The spec's fine? You don't think the Dreamcast could've used more main RAM? It only has 6x more RAM than the Saturn.

    I think the main RAM could've been bumped up to 24 MB; it's not like they were using super-expensive RAM like XDR.

  8. #368
    Wildside Expert
    Join Date
    Jan 2014
    Posts
    146
    Rep Power
    9

    Default

    At the time? It seemed fine to me. Sega also provided CriWare as part of the SDK which at the time, had an amazingly good general purpose streaming component to stream in data from the GD-ROM.

    Sure more memory would have been nice, but it wasn't a massive issue. Besides...as good as the SH4 was at processing vertex data, there was a relatively low limit on how much you could process. Which also limited data size. Don't get me wrong, it wasn't ideal, but it wasn't a disaster either. Especially after having come from developing on the PS1 and N64.

    My first reaction wasn't "how am I going to fit my data in there" but rather "what the hell am I going to fill all that memory with". *shrug* These days, I won't get out of bed for anything less than 1GB of memory.

    The thing that really kills me about this discussion is that I wish my ex had followed through with a promise to me. She was working in IT at a previous employer, and they had a tonne of Version 5 Dreamcast devkits, with SCSI cards and GD-ROM emulators. The company was going to scrap them, because Sega didn't want them back. So she offered to let me have a whole batch! It didn't happen though.

    I'd really love to write a streaming open-world game on the DC.
    Last edited by rusty; 01-30-2014 at 04:53 AM.

  9. #369
    Raging in the Streets azonicrider's Avatar
    Join Date
    May 2013
    Location
    British Columbia
    Posts
    2,588
    Rep Power
    38

    Default

    So the Dreamcast has the disc-drive speed necessary for GTA games, it's just that the bandwidth isn't fast enough?

    The Dreamcast has a 12x disc drive, so yeah, that was pretty fast. Not sure if I should spell it "disc" or "disk".

  10. #370
    Wildside Expert
    Join Date
    Jan 2014
    Posts
    146
    Rep Power
    9

    Default

    It really depends on what you're going to render. I think the geometry could have been on par, with better textures. After all, the PowerVR2 supported compressed textures. Uploading them to VRAM might have been an issue though. But then the majority would have been 64x64 tiled textures with a static area for characters. VQ compression at that size results in a 3K texture.

    So maybe 128x128 would have been better, as it results in a 6K texture since the codebook is always 2K. Hmm. Yeah, I reckon it could have worked with a bit of effort.
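
    Back-of-the-envelope version of those numbers, assuming the usual layout of a 2K codebook (256 entries of 2x2 16-bit texels) plus one index byte per 2x2 block:

    Code:
    /* Sizes for VQ-compressed textures: fixed 2K codebook + 1 byte per 2x2 block. */
    #include <stdio.h>

    static unsigned vq_size_bytes(unsigned w, unsigned h)
    {
        unsigned codebook = 256 * 4 * 2;        /* 256 entries x 4 texels x 2 bytes = 2048 */
        unsigned indices  = (w / 2) * (h / 2);  /* one index byte per 2x2 block            */
        return codebook + indices;
    }

    int main(void)
    {
        printf("64x64:   %u bytes\n", vq_size_bytes(64, 64));    /* 3072, ~3K */
        printf("128x128: %u bytes\n", vq_size_bytes(128, 128));  /* 6144, ~6K */
        return 0;
    }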

  11. #371
    Bite my shiny, metal ***! Hero of Algol retrospiel's Avatar
    Join Date
    Mar 2008
    Location
    Cologne, FRG
    Posts
    7,816
    Rep Power
    87

    Default

    Quote Originally Posted by rusty View Post
    Achieving 640x480 was not really possible in a game because it took up so much VRAM for the 2x back buffers and the front buffer.
    Really nice to see that confirmed. Welcome to Sega-16 btw.
    The Mega Drive was far inferior to the NES in terms of diffusion rate and sales in the Japanese market, though there were ardent Sega users. But in the US and Europe, we knew Sega could challenge Nintendo. We aimed at dominating those markets, hiring experienced staff for our overseas department in Japan, and revitalising Sega of America and the ailing Virgin group in Europe.

    Then we set about developing killer games.

    - Hayao Nakayama, Mega Drive Collected Works (p. 17)

  12. #372
    Wildside Expert
    Join Date
    Jan 2014
    Posts
    146
    Rep Power
    9

    Default

    Quote Originally Posted by Christuserloeser View Post
    Really nice to see that confirmed. Welcome to Sega-16 btw.
    Thanks

    By not really possible, I actually mean it could do it. But with a double buffered backbuffer required for parallel rendering and update, you'd only have 0.5MB of VRAM. Not enough to work with really.
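
    Quick numbers behind that, assuming 32-bit colour buffers (a Z buffer would eat further into what's left):

    Code:
    /* 640x480 with a front buffer plus two 32-bit back buffers in 4MB of VRAM. */
    #include <stdio.h>

    int main(void)
    {
        const double MB = 1024.0 * 1024.0;
        double one_buffer = 640.0 * 480.0 * 4.0 / MB;   /* ~1.17 MB per buffer */
        double used       = 3.0 * one_buffer;           /* front + 2x back     */
        printf("buffers: %.2f MB, left over: %.2f MB\n", used, 4.0 - used);
        return 0;
    }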

    But if there's one area that the Dreamcast stood head and shoulders above the PS2, it was the resolution and the really nice RAMDAC it had. The colour from the PS2 and XBOX always looked pretty washed out to me. But the Dreamcast had a really vibrant colour space. No matter what TV I had my DC on, the colour clarity was just amazing.
    Last edited by rusty; 01-30-2014 at 08:29 AM.

  13. #373
    Road Rasher
    Join Date
    Oct 2012
    Posts
    409
    Rep Power
    35

    Default

    Quote Originally Posted by rusty View Post
    Yeah, it was a problem. Sega had gotten everything about the Dreamcast right: the cost, the spec, the cost of development, the absolutely insanely good SDK compared to the stuff that Sony offered. But the marketing sucked, and they didn't have the cash to ride out the storm after several poorly selling hardware choices (Mega CD, 32X and Saturn).

    There was also the crap about EA demanding exclusive rights to Sports games on the Dreamcast which didn't help. If people had seen the EA sports titles on DC before PS2 had been launched, I'm pretty sure it would have gotten a fair market share. DC vs PS2 is not a great discussion for the DC, but DC vs PS1? There's more than just a clear winner there.


    I have a story to tell about the shenanigans that went on. It was around this time that Microsoft were canvassing devs for the Xbox. However, the Dreamcast was in the way of them having a clear #2 spot to aim for. They were afraid that if Sega rode out the tough period, it would cannibalise sales of the Xbox. So they would offer to pay the cost of developing a Dreamcast title that was in development, which the developer would keep working on but then never release. Talk about underhanded, eh?
    Hi rusty,

    Welcome to the forum; it's really great to have someone who developed games on the PS2 and Dreamcast as part of this forum, taking part in this discussion.
    I am really interested in that last section of your post regarding Microsoft and their apparent role in the Dreamcast's demise. If I am understanding you correctly then Microsoft (despite being a partner of Sega) was actively attempting to sabotage Sega's chances of pulling the DC out of its tough period in order to further their own chances with the Xbox.

    I was wondering if you know of any specific examples of these unreleased games that were created by independent developers and financed by Microsoft, and could give a rough idea of how many titles we are talking about. With all the talk of unreleased games for the Dreamcast coming to light and finally being shown to the Dreamcast community (e.g. Geist Force and the Dreamcast version of Toejam and Earl 3), I just wonder how many other games are sitting on hard drives out there that Microsoft financed and sat on while the Dreamcast was suffocated.

  14. #374
    Nameless One
    Join Date
    Sep 2010
    Posts
    50
    Rep Power
    7

    Default

    Great to see the perspective of someone who worked on both machines. It confirms what I always thought: that the DC was a decent machine with a straightforward, balanced design that was perfectly capable, but that the PS2 was an absolute beast that was difficult to harness properly. It was a different architecture to what people were used to and required a different mindset: very idiosyncratic but supremely flexible. More capable, but it took a lot of effort to get great results.

    It also confirms my feeling that the DC had a super strong and "clean" video signal. It always looked really bright and vibrant to me whatever video connection I used; the image always seemed to "pop" off the screen. The PS2 and Xbox always seemed a bit more muted and "muddy" somehow (not in terms of art design, as obviously Sega Blue Skies and all that, but just in terms of sharpness, contrast and clarity). I always noticed it when comparing the Namco logo between systems, even when using Bleem. The Dreamcast was definitely a system that benefited from seeing it in person, as it were. Soul Calibur via VGA still looks ace.

    Shame you didn't get those dev kits. It would have been great to have more people developing on the DC.

  15. #375
    Hard Road! ESWAT Veteran Barone's Avatar
    Join Date
    Aug 2010
    Location
    Brazil
    Posts
    6,839
    Rep Power
    142

    Default

    Quote Originally Posted by rusty View Post
    I have absolutely no idea. It was a lot though. The number of polys that you can push is a bit of a crap benchmark on PS2 though. And the reason for this is that it wasn't designed to push polys in a 1:1 fashion. It could do it, but if you look at the hardware, especially how DMA works through the VIF and how VU1 and the GS work in parallel, your first thought is: it's designed to make use of high-order primitives (curved surfaces, procedural models, sub-division cages etc.). That is, it's designed to take a small data set and spit out lots of triangles by tessellating on the fly.
    The more I read about the PS2 hardware design the more I feel that it was intended to give as many alternatives as possible to the developer in order to offload the CPU from most of the rendering-related tasks and make the game run as fast as possible.
    I mean, it seems to have been designed to provide a plethora of possibilities for graphical effects without hurting the frame rate.



    Quote Originally Posted by rusty View Post
    So really...this sort of measurement has little meaning, as the real constraint was pushing data through the VIF in a one-to-one fashion, and trying to double-buffer that up in VU1 memory so that you could have the GS and VU1 operating in parallel. But everybody was so committed to the traditional 1:1 triangle rendering that machines like the Xbox supported that they replicated it for every aspect of rendering on the PS2.
    Oh, sorry for that. But some people here usually go apeshit when I say that the PS2's polygon-rendering capabilities were clearly ahead of the DC's, and visual examples seem to fail to convince them; so cold hard numbers are all they usually like to "bite".

    But, really, cold comparisons of hardware specs usually lead to many misconceptions and assumptions about how well the hardware will perform. However, it's not an easy task to convince people to pay attention to how the specs actually play out in practical situations, and to how the design of the system can influence the overall performance much more than bigger or smaller numbers...



    Quote Originally Posted by rusty View Post
    So I was less than clear on this, sorry. I suffered from fizzy-drink-induced insomnia this evening. Achieving 640x480 was not really possible in a game because it took up so much VRAM for the 2x back buffers and the front buffer. The chip AA just didn't work. At all. And the way you got around that was by rendering to a double-height back-buffer, and then drawing a full-screen quad (well..a series of 32x32 quads because it was faster that way) at half height, using the texture sampler to reduce the jaggies. So yeah....the guys at Melbourne House (who were awesome BTW) are talking about that technique. It was actually an advisory from Sony right from day 1: "Hardware AA is broken. Don't use it. Do this instead."
    Very interesting info there. By "from day 1", do you mean since the official release of the PS2, or had Sony already admitted to third parties prior to that that their AA hardware was broken?
    'Cause launch titles like Namco's Ridge Racer V ended up being released without any sort of workaround for the lack of AA, it seems... I mean, I always supposed that they waited until the last minute for a hardware revision that would fix the problem before the console's release, ended up screwed without one, and had to burn a bit of their reputation by releasing their major franchise with lots of jaggies.
    The first Virtua Fighter 4 port is also a serious offender in that respect despite being released in 2002.


    Quote Originally Posted by rusty View Post
    It was nowhere near as good as proper AA though, but good enough for most things. It also meant you could do some neat tricks. See the bloom/fake HDR in B2 and B3 as an example of that.
    Interesting. Yeah, I love how you used fake High Dynamic Range in Burnout 2; it still looks good today IMO.
    Do you think that it would be possible/feasible to implement fake-HDR in Dreamcast racing games as well? 'Cause I don't remember playing any DC games with that effect.


    Quote Originally Posted by rusty View Post
    The PS2 was never really meant to render at high res. It was supposed to render at a lower resolution than the DC, but use HW AA to smooth out the jaggies. Doh!
    The PS2 would always be punching above its weight that way... The N64 did something like that previously; N64 fans to this day love to say how it didn't have the PS1's jaggies, but it was still rendering most of its games at 320 x 240.


    Quote Originally Posted by rusty View Post
    You have no frickin' idea. It was just insane; ~40GB/s alone for the rendering side, which is close to the XB360!! The odd thing that people don't realise is that the GS was meant to do programmable pixel ops. But it didn't work in the same way as traditional pixel shaders worked. You were supposed to do things in multiple passes, hence the insane fill-rate. You could even do texture decompression using VQ compression as a series of passes on a texture. I ran out of space for my decompressor so I just used the alpha channel in the front-buffer as a work buffer. It was so flexible that you could do things like that.

    Interesting note; modern render engines use multiple passes to do most things. Just like the PS2 hardware was designed to do.
    I remember how a lot of people were going crazy in gaming forums when it was announced that the GameCube would support one-pass multitexture. I guess that was another case where people should have been trying to understand what that meant rather than going crazy reading the specs...
    The same goes for S3TC hardware support on the GC, 'cause, if I'm not misunderstanding what you said, you could also get fixed-rate texture compression on the PS2 with similar results just by using VQ-based schemes. Of course, you would have to implement it yourself though.

    Still on the subject of exploiting the PS2's fill-rate capacity: I think emboss bump mapping could also be done with no significant performance hit.
    Did you use it in the Burnout games on the PS2 or in any other game that you developed on the console (there are dozens of threads in different gaming forums about the subject)? What about "DOT3 bump mapping"/normal mapping? EMBM?
    Any idea what Ubisoft's Geotexture actually was (they hyped it a lot back then)?

    For the comparative sake of this thread, could those techniques have been effectively used on the Dreamcast, for example on the pavement of the track in a racing game, without a major impact on the system's performance (F-Zero GX on the GC does it on at least one of its tracks)? 'Cause we've seen very little use of bump mapping in Dreamcast games, stuff like a coin in Shenmue.
    AFAIK you could implement normal mapping in a scene with a single light source without much trouble on the DC, but IDK how expensive it could be when used in situations like F-Zero GX's. I also think that the "single light source" limitation could be a deal breaker in many cases, but please correct me if I'm wrong.

    Also, in a previous discussion in this thread, a forum member said that the Dreamcast didn't have hardware support for bump mapping and posted a picture from an old magazine (http://farm6.staticflickr.com/5471/1...bfc5912f_c.jpg) to prove it. Despite having been trashed by other members, I think he (and Hideki Sato) was probably talking about the lack of hardware support for bump mapping with per-pixel lighting (which used to be considered the "real" kind at the time IIRC), in which case he would be right. Any thoughts/comments?

    In terms of environment mapping, I'd like to ask whether you used real-time reflections in any of the PS2 Burnout games (like Melbourne House claimed to be using in Grand Prix Challenge), or was it more like fake dynamic environment mapping done right (Burnout 3 seems to have improved over Burnout 2 in that respect)? Back in the day, the press reported that Gran Turismo 3 was using real-time reflections, but it seems like it is also just a well-done fake, from what I have carefully observed.
    And, again, even fake dynamic environment mapping was something quite rare on the Dreamcast; I just remember Le Mans (also from Melbourne House) using it, but you could easily notice that it was fake (not on par with GT3's, for example). Would the DC's fill-rate be a problem for dynamic environment mapping, or was it more about being CPU-heavy?
    On the PS2, would it be possible to implement dynamic environment mapping with real-time reflections using just/mostly the GS, instead of it being CPU-heavy?
    For comparison, PC games of the time like F1 Racing Championship (by Ubisoft) would make your CPU crawl if you switched from "Static Environment Mapping" to "Dynamic Environment Mapping" in the options menu.

    Please, if you could provide a similar brief comparative analysis of the feasibility of implementing on the Dreamcast other effects that you used in the Burnout games, like Radial Blur (which I also don't remember having seen in any Dreamcast game), it would be really cool.



    Quote Originally Posted by rusty View Post
    All of it. What wasn't used for render buffers, or temporary render targets, was used for texture cache. So the amount tended to vary depending on your needs.
    Awesome to be able to know about that. Thanks.



    Quote Originally Posted by rusty View Post
    I'm sort of running out of time before I have to get out of my PJ's and head to work. So I'll give you a condensed answer, and expand on anything specific that you want me to answer.

    Real problem for DC was low sales. Low sales = no fiscal reason for spending money on a port, let alone a good one.
    Major bottleneck: it was sort of fixed in what you could do. It wasn't really all that flexible (I liked it though), and as I said, this could also be a strength.
    Windows CE was horrible. Remember Tomb Raider on the DC? Yeah...the lead programmer on that, who I worked beside, still bitches about how bad CE was, even today.
    Maybe "memory leak" would ring a bell to your Tomb Raider friend?
    Besides that, I wonder how buggy the Windows CE Toolkit (especially its first version) was. For what I've read, the Direct3D version it used didn't support all of the DC's PowerVR capabilities and sometimes it would be impossible to use them even if you recurred to the Dreamcast-specific flags to expose the PowerVR features.
    MS also claimed that such Direct3D implementation had been optimized for the PowerVR but I wouldn't bet it was really near to squeeze everything that PowerVR could deliver in terms of performance.
    All MS promises became more and more grey to me when they announced that they would be jumping into the market by themselves. Would you trust that they wouldn't be delivering half-assed "optimizations" and buggy tools to their competitors at Sega?


    Quote Originally Posted by rusty View Post
    Again...a bit of a meaningless number. Lots is the answer. The real tell of how well written that game was is the number of animated characters it could have on screen at once. There is a lot of detail, so I'll just say "more than most PS2 games, even towards the end of the console's life".
    I keep repeating stuff like that around here but it doesn't seem to work in most of the discussions.


    Quote Originally Posted by rusty View Post
    Pushing stuff through the VIF and being clever with VU1 was the real limiting factor on PS2.
    Talking about "Pushing stuff through the VIF", when you mentioned that "and we were still uploading massive 256x256 uncompressed textures in one go" I didn't understand why you were doing that in the first place. Were you willing to compress that later on using some more traditional scheme with the IPU and so you were initially uploading 256x256 in order to achieve better compression ratio later or something like that? It just bugged me.


    Quote Originally Posted by rusty View Post
    I remember my first day as a Senior PS2 coder on the UE project (I was promoted to Lead a couple of months later). I sat down with the team to discuss our strategy, and this artist guy starts off with "the PS2 is crap at doing outdoor scenes, and this game is mostly outdoors. We should just accept it's going to be bad". Whut???? I ran the level he had made for the PS2 through the PA, noted the results, and talked with another artist. Got that artist to reduce his 256x256 textures to be smaller (in fact, we merged several into one texture) and did things like NOT HAVING 6 DIFFERENT TEXTURES ON A BOX. Then we showed him his outdoor scene with no noticeable drop in quality, running at 60fps on UE 2.5.

    Can't do outdoor scenes very well, eh? Badly made outdoor scenes maybe. I had a chat with the art director and he was moved onto the PC.
    lol
    With artists designing the scenes like that, you didn't need anything else to get into trouble.

    By the way, would you say that the first PA system is what really boosted the look of the later PS1 games?
    Do you consider the PS2PA a major advantage in terms of pushing the hardware when compared to the DC-related tools?




    Quote Originally Posted by rusty View Post
    There was also the crap about EA demanding exclusive rights to Sports games on the Dreamcast which didn't help. If people had seen the EA sports titles on DC before PS2 had been launched, I'm pretty sure it would have gotten a fair market share. DC vs PS2 is not a great discussion for the DC, but DC vs PS1? There's more than just a clear winner there.
    Not having EA on the DC side of things was really bad IMO. They also had a good catalog of PC games that could have been ported to the Dreamcast to give a huge boost to its library right from the beginning. Heck, even the freakin' 3DO experienced something like that in a much worse context.
    I'd love to have played NFS Porsche on the Dreamcast instead of being stuck with the PS1 port which was more like a different game (despite using a very good engine IMO).


    Quote Originally Posted by rusty View Post
    I have a story to tell about the shenanigans that went on. It was around this time that Microsoft were canvassing devs for the Xbox. However, the Dreamcast was in the way of them having a clear #2 spot to aim for. They were afraid that if Sega rode out the tough period, it would cannibalise sales of the Xbox. So they would offer to pay the cost of developing a Dreamcast title that was in development, which the developer would keep working on but then never release. Talk about underhanded, eh?
    Yuck, that sucks but thanks for sharing this with us.


    Quote Originally Posted by rusty View Post
    At the time? It seemed fine to me. Sega also provided CriWare as part of the SDK which at the time, had an amazingly good general purpose streaming component to stream in data from the GD-ROM.

    I'd really love to write a streaming open-world game on the DC.
    Great to hear that.


    Quote Originally Posted by rusty View Post
    By not really possible, I actually mean it could do it. But with a double buffered backbuffer required for parallel rendering and update, you'd only have 0.5MB of VRAM. Not enough to work with really.
    Thanks for refining the answer.


    Quote Originally Posted by rusty View Post
    But if there's one area that the Dreamcast stood head and shoulders above the PS2, it was the resolution and the really nice RAMDAC it had. The colour from the PS2 and XBOX always looked pretty washed out to me. But the Dreamcast had a really vibrant colour space. No matter what TV I had my DC on, the colour clarity was just amazing.
    This (by Simon Fenney, one of the designers of the Dreamcast's graphics chip) probably explains, at least partially, why the video output of the Dreamcast looked so vibrant:
    "Performance wise, there'd be very little difference. Obviously 32bit > 24 > 16bit when it comes to bandwidth use, but that's not going to be terribly significant for a 640x480x[50|60] Hz display.

    The main reason, I suspect, is that the developers want to reserve as much of the graphics memory for textures, and so using a 16-bit mode saves them a bit of space.

    Now because PowerVR's 16-bit format is much better in general than other systems', there's not really a significant loss in quality in going to 16 bit. (IIRC, there's a public demo of this on the PowerVR developer website)"
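
    To put rough numbers to the "not terribly significant" part, here's my own back-of-the-envelope, assuming 60Hz progressive scan-out:

    Code:
    /* Display scan-out bandwidth at 640x480, 60Hz, 32-bit vs 16-bit colour. */
    #include <stdio.h>

    int main(void)
    {
        const double MB = 1024.0 * 1024.0;
        double pixels_per_sec = 640.0 * 480.0 * 60.0;
        printf("32-bit: %.1f MB/s\n", pixels_per_sec * 4.0 / MB);  /* ~70 MB/s */
        printf("16-bit: %.1f MB/s\n", pixels_per_sec * 2.0 / MB);  /* ~35 MB/s */
        return 0;
    }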


    I had a PC with a Voodoo3 at the time and I can confirm that its 16-bit format looked far inferior to the DC's PowerVR one. Colors looked washed out for the most part and games like NFS Porsche would have stuff like green-ish smoke effects on the Voodoo 3. This made me upgrade to a Diamond Viper V770 Ultra TNT2 and get drunk in its 32-bit mode.



    Quote Originally Posted by stu View Post
    I just wonder how many other games are sitting on hard drives out there that Microsoft financed and sat on while the Dreamcast was suffocated.
    We're probably talking about something in the 50-150 (maybe a bit more, IDK) range of games, projecting numbers based on what this September 1999 article states:
    "Currently, more than 70 titles in development for the Dreamcast are making use of the Win CE development environment. If Microsoft's tools live up to expectations, we can hopefully expect great things from these titles."

    http://www.ign.com/articles/1999/09/...ced-for-the-dc



    As a bit of an off-topic question/story: since you worked at Criterion, did you know/work with anyone from the early days of the company?
    I used to love a game from the early days of RenderWare (don't tell me it didn't use it, ehehe) called SpeedBoat Attack. That game received mixed reviews at the time (it was actually panned by several magazines) and it's forgotten nowadays, but I played it to death. Have you ever played it?
    It had a cool announcer ("Impressive start!!!", "Faaaannnnntastic!") with a British accent and the game was very fun to play. I think that Hydro Thunder copied a lot of ideas from that game, like the overuse of secret shortcuts and improved several aspects of the gameplay. It's crazy how fast some games are forgotten and others praised as revolutionary when they actually just stole a lot of ideas from previous games.
    SpeedBoat Attack also supported the first Voodoo card and I was blown away when I saw the water "properly" rendered for the first time; people would laugh at those graphics now...

    My 2 cents.
