December 18, 2002, 08:07
|
#31
|
Emperor
Local Time: 14:35
Local Date: November 1, 2010
Join Date: Aug 1999
Location: Aarhus, Denmark
Posts: 3,618
|
Now bear in mind that I know next to nothing about these things. But it struck me that in the days just after Civ3 was released, Soren Johnson - the AI programmer for that game - said that one of the ways he tested the AI was to let the game play an entire night with no human players (something which in that game can only be done in debug mode). Surely QSI have a similar method.
Asmodean
__________________
Im not sure what Baruk Khazad is , but if they speak Judeo-Dwarvish, that would be "blessed are the dwarves" - lord of the mark
|
|
|
|
December 18, 2002, 11:18
|
#32
|
Emperor
Local Time: 14:35
Local Date: November 1, 2010
Join Date: Jan 2001
Location: Ashes
Posts: 3,065
|
Playing the AI against itself will have little to do with regression testing. Regression testing expects the results of a run to always be the same so that they can be compared (it's a bit more complicated than that - when testing performance, for example, you're allowed to do better than the baseline).
You want to track inconsistencies and crashes; that is not the same as letting the AI do stuff all on its own. When you do regression testing, you want to track selections and actions from the player.
Some regression tests could be automated by playing the game against itself, but it would be very limited, and you would have to develop some code to "read" the output. For instance, if you make a scenario to test a warrior vs. three tanks fight, you have to check that the warrior is dead after the fight - something Soren Johnson or you would see by casting a glance at the screen, but which the computer has no idea how to report.
When you want to simulate thousands of different combinations (think of each unit-vs-unit fight in Civ, then think of the variety of units in Moo, spiced up with racial special abilities and tactical combat), thousands of fight reports have to be looked at to see if everything went as expected.
If you want to test these, you'd better write some actual testing software. Believe me, it is much more complicated than just launching the game (against itself or not). This could be partly automated, however.
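A minimal sketch of the kind of harness being described here (all unit names and numbers invented) might look like this in C++ - the point being that the program, not a tester, checks the scripted outcome:
[code]
// Sketch of an automated scenario check; all names and stats are invented.
// The harness runs a scripted fight and asserts on the outcome instead of
// relying on a tester glancing at the screen.
#include <cassert>
#include <iostream>
#include <vector>

struct Unit {
    const char* type;
    int hitpoints;
    int attack;
};

// Toy combat resolution standing in for the real engine: the attacker and
// each defender trade blows until one side is destroyed.
bool AttackerSurvives(Unit attacker, std::vector<Unit> defenders) {
    for (Unit& d : defenders) {
        while (attacker.hitpoints > 0 && d.hitpoints > 0) {
            d.hitpoints -= attacker.attack;
            if (d.hitpoints > 0)
                attacker.hitpoints -= d.attack;
        }
        if (attacker.hitpoints <= 0)
            return false;
    }
    return true;
}

int main() {
    Unit warrior{"warrior", 10, 2};
    std::vector<Unit> tanks{{"tank", 30, 8}, {"tank", 30, 8}, {"tank", 30, 8}};

    // The harness, not a human, verifies the expected result of the scenario.
    assert(!AttackerSurvives(warrior, tanks));
    std::cout << "scenario 'warrior vs 3 tanks': PASS\n";
    return 0;
}
[/code]
Scaled up to thousands of scripted scenarios, this is the combination problem described above: the checks can run unattended, but somebody still has to write and maintain every one of them.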
(edited: added paragraphs)
Last edited by LDiCesare; December 18, 2002 at 11:27.
|
|
|
|
December 18, 2002, 21:23
|
#33
|
Warlord
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Dec 2002
Posts: 124
|
Quote:
|
Originally posted by Asmodean
Now bear in mind that I know next to nothing about these things. But it struck me that in the days just after Civ3 was released, Soren Johnson - the AI programmer for that game - said that one of the ways he tested the AI was to let the game play an entire night with no human players (something which in that game can only be done in debug mode). Surely QSI have a similar method
Asmodean
|
Sure, they probably do, but that will only show the more obvious bugs.
Most of the bugs found through regression testing turn up because the tests are designed to run through boundary cases or to exercise every single line of code possible.
The AI will do 90% of the things humans will do, but it's the 10% that it doesn't do that they need to check for. And since they can't tell exactly what the AI has or hasn't done, everything still needs to be tested.
|
|
|
|
December 19, 2002, 10:31
|
#34
|
Warlord
Local Time: 07:35
Local Date: November 1, 2010
Join Date: Apr 2002
Location: North Carolina
Posts: 190
|
Quote:
|
Originally posted by moomin
This seems pretty obvious, yes. Given that you are something of an industry insider, would you care to guess at the reasons behind this seemingly self-destructive behaviour? Is it because the customers accept it? Is it because it has become a norm no one seeking funding dares deviate from? It's hard to understand it from the outside.
|
The problem, IMHO, is that too few people understand that software really IS engineering, as much as building a bridge or fabricating a circuit board.
There are several reasons for this. One, software is intangible -- you run the binary, and things happen, but only someone who understands the source code can appreciate the complexity of what is making those things happen. It's much much easier to be impressed by a towering bridge or an intricate circuit board.
Second, a lot of people seem to think that, since seemingly anyone can do fairly basic stuff -- create an HTML page, or a simple VB GUI -- the rest of it can't be "all that hard". Of course, these same people don't apply the same logic to building a sand castle vs. building a REAL castle.
Third, since software CAN be modified and improved after the fact, there's a perception that getting it as right as possible the first time isn't AS important as it is for, say, building construction. Again, a flawed assumption -- good software is a product of up front analysis and design, like any engineering effort. People that just jump into writing the code are DOOMED to fail, and don't let them tell you otherwise.
Finally, the ultimate problem is that the people *making the decisions* are the ones who aren't waking up to these facts -- that software development IS an engineering process, and one that needs the same level of attention to detail, preparation, review, and so on.
When the right people understand these issues, and give them due attention, you get software done on time, on budget, feature complete, and stable. The very first project I worked on for a paying customer, less than 6 months out of college, *doubled* its performance requirement, on hardware that cost half as much as the originally proposed spec machine, and *does not crash* -- I mean it is almost literally impossible to crash the software (you have to have some other app sabotaging the OS, or have the OS or the hardware fail). That was a 6-month project, of which only 1.5 months were spent writing the code -- almost a month was spent first nailing down the requirements, another month doing (and reviewing!) the design, then implementation, then testing, performance tuning, etc. Just writing code from day one, I'm sure the project would have taken twice as long, or more, because the decisions and changes that became necessary during requirements and design can be made MUCH faster if code doesn't have to be changed as a result.
__________________
Xentax@nc.rr.com
|
|
|
|
December 19, 2002, 20:19
|
#35
|
Settler
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Nov 2002
Posts: 16
|
"When the right people understand these issues, and give them due attention, you get software done on time, on budget, feature complete, and stable."
Xentax,
For one who is a BT for a game with so many missed release dates, that is one helluva statement. I applaud you for it.
|
|
|
|
December 19, 2002, 23:44
|
#36
|
Emperor
Local Time: 05:35
Local Date: November 1, 2010
Join Date: Dec 1969
Location: LF & SG(2)... still here in our hearts
Posts: 6,230
|
Yep. When I took structural engineering programming we spent most of our time creating the design (not really a traditional flowchart, but something close). We only started coding when the design was 100% complete. Then each subroutine was created as a null-return stub, and the code was fleshed in and tested out incrementally.
Even with that precaution I had an error in one parameter (an array dimension - we were using good ol' FORTRAN) that took two or three weeks to track down. As soon as I discovered the errant digit, the whole program worked flawlessly, because the other subroutines had already been tested and were working right.
Obviously, with a game in which functions or features or the entire objective may change with the stroke of a pen, the ability to complete the design before coding may be hampered. That can't be fun for a disciplined programmer…
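A rough C++ rendering of that stub-first approach (the original was FORTRAN, and the routine names here are invented):
[code]
// Every routine starts as a do-nothing stub so the whole program compiles,
// links, and runs end to end; each stub is then fleshed in and tested
// incrementally.
#include <cassert>
#include <vector>

// Still a stub: returns a harmless empty result until it is implemented.
std::vector<double> AssembleStiffnessMatrix(int /*numNodes*/) {
    return {};
}

// Already fleshed in and testable on its own while other routines are stubs.
double SumLoads(const std::vector<double>& loads) {
    double total = 0.0;
    for (double load : loads) total += load;
    return total;
}

int main() {
    // The finished routine can be exercised in isolation...
    assert(SumLoads({1.5, 2.5}) == 4.0);
    // ...while the stub keeps the overall program runnable.
    assert(AssembleStiffnessMatrix(4).empty());
    return 0;
}
[/code]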
__________________
(\__/) Save a bunny, eat more Smurf!
(='.'=) Sponsored by the National Smurfmeat Council
(")_(") Smurf, the original blue meat! © 1999, patent pending, ® and ™ (except that "Smurf" bit)
|
|
|
|
December 20, 2002, 07:06
|
#37
|
Prince
Local Time: 13:35
Local Date: November 1, 2010
Join Date: Dec 1969
Location: Belgium
Posts: 301
|
Hmmm... I think the complexity of this program is somewhere in the hundreds of thousands of lines of source code. It is impossible to produce a single design document with a complete description of the whole program. Probably they have a kind of big-picture design doc and many smaller design and programmer docs for each aspect/part of the game.
__________________
Blade
|
|
|
|
December 20, 2002, 10:54
|
#38
|
Warlord
Local Time: 07:35
Local Date: November 1, 2010
Join Date: Apr 2002
Location: North Carolina
Posts: 190
|
It's not impossible to produce a low-level design document. It's time consuming, and depending on the size/stability of the dev team, it may be *overkill*, but it's not impossible.
At a minimum, module interfaces -- anywhere one major area of the code is talking to another major area -- need to be completely designed and reviewed. Anything one person is designing and another implementing should also see as complete a design as possible, because the designer's set of assumptions won't necessarily match up to the actual implementor's.
While a given dev may not need to get every last detail designed for the parts of the project that are "internal" to his area of responsibility, he still SHOULD at least cover all the bases. Ideally, he should still generate as complete a design as possible, because there IS always a chance he'll get hit by a bus halfway through.
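As a rough illustration of that interface-first idea (the module names and types here are invented, not anything from Moo3), the boundary between two major areas might be pinned down like this, with a placeholder implementation so the other side can be built against it right away:
[code]
// Sketch of a module-to-module contract agreed and reviewed up front.
#include <vector>

struct FleetId { int value; };
struct BattleReport { FleetId winner; int roundsFought; };

// Everything the AI or UI needs from the combat module is declared here,
// so both sides can be designed and implemented against it independently.
class ICombatResolver {
public:
    virtual ~ICombatResolver() = default;
    virtual BattleReport Resolve(const std::vector<FleetId>& attackers,
                                 const std::vector<FleetId>& defenders) = 0;
};

// Placeholder implementation; it lets another team code and test against
// the interface before the real combat engine exists.
class StubCombatResolver : public ICombatResolver {
public:
    BattleReport Resolve(const std::vector<FleetId>& attackers,
                         const std::vector<FleetId>& /*defenders*/) override {
        return BattleReport{attackers.front(), 1};  // attacker wins in one round
    }
};

int main() {
    StubCombatResolver combat;
    BattleReport report = combat.Resolve({{1}, {2}}, {{7}});
    return report.winner.value == 1 ? 0 : 1;
}
[/code]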
__________________
Xentax@nc.rr.com
|
|
|
|
December 20, 2002, 13:06
|
#39
|
Chieftain
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Dec 2002
Posts: 71
|
Even more important than getting hit by a bus:
- Designing first (and documenting it!) means you have a chance to think about all the questions you need to answer, so that you are less likely to get halfway through implementation and suddenly realize you completely forgot something
- Having a review of written documentation will give other designers working on related components, or on components you need to interface with, a chance to see what you're doing, make suggestions, and accommodate what they're doing to fit what you're doing (more than just at the interface level -- for instance, common look and feel, similar debugging tools, etc.)
|
|
|
|
December 22, 2002, 21:48
|
#40
|
Settler
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Dec 2002
Posts: 2
|
Heh... Moomin, you have provoked a lot of discussion...
Funnily enough you compared MOO3 with Neverwinter Nights... which had to be one of the most shoddily developed games in the history of the gaming industry.
I am truly saddened that Bioware managed to make such a mess of a game with so much promise. Honestly, their quality control process has fans living in fear of what the next patch will break, a sad thing really.
Perhaps they were rushed into releasing the game, rather than simply putting off the release date until the game was well and truly finished.
Personally I have enough other interests and enough to do that I am not going to chew off my fingernails waiting for a game, no matter how eagerly I might await it.
....in the end I sit back and enjoy the excitement of the expectation.... ahhhh sweeet anticipation.
|
|
|
|
December 23, 2002, 01:45
|
#41
|
Warlord
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Nov 2002
Posts: 282
|
More often than not, a low-level design doc ends up looking eerily like code; my argument would be to write self-documenting code, and allow it to explain the low-level architecture.
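A small, invented example of what "self-documenting" means in practice: the names and structure state the rule, so a separate low-level design doc would mostly restate what the code already says.
[code]
// Invented example: the function and variable names carry the design intent.
#include <algorithm>
#include <vector>

struct Colony {
    int population;
    int foodProduced;
};

// The rule a design doc would otherwise have to spell out: colonies that
// produce less food than they need lose one population unit.
void ApplyStarvationToColoniesWithFoodDeficit(std::vector<Colony>& colonies) {
    for (Colony& colony : colonies) {
        const bool hasFoodDeficit = colony.foodProduced < colony.population;
        if (hasFoodDeficit) {
            colony.population = std::max(0, colony.population - 1);
        }
    }
}

int main() {
    std::vector<Colony> colonies{{5, 3}, {4, 6}};
    ApplyStarvationToColoniesWithFoodDeficit(colonies);
    return (colonies[0].population == 4 && colonies[1].population == 4) ? 0 : 1;
}
[/code]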
Design first is often an excellent strategy, but it's not the only one that works. Ever read any XP/Agile programming books, Xentax? I would argue that many developers think in terms of coding as the way they explain design, and so long as you approach coding early on as rough drafts that will be modified or thrown away in the future, it can be just as stable, useful, and reliable as fully specced, fully designed engineering projects. Often more so, because they are able to go through many iterations, and iterations often produce better code.
The most important thing for everyone out there to understand is that the specs are the key. Period. Bad specs == bad code. Incomplete specs == incomplete code. If one expects to write anything of quality when specifications aren't understood, they are wrong.
That's clearly one of the bigger flaws of MOO3, and we all know it. The specs called for IFP, which is a high-level way of controlling the entire game. Removing it changes a LOT of the way that the code was organized, prepared, and planned. It effectively could touch everything. That's something of a biggie. Every time features get added or removed, again, you have incomplete specs. These are things that are almost impossible to avoid in the gaming environment without massive beta testing (like Blizzard does, for instance), and in general are hard to deal with when releasing a commercial retail-level product. The fewer customers a product has, the easier it is to get specs right the first time. Retail has the most customers. QED.
|
|
|
|
December 23, 2002, 16:32
|
#42
|
King
Local Time: 13:35
Local Date: November 1, 2010
Join Date: Jul 2000
Location: Moo Like In Moomin
Posts: 1,579
|
Quote:
|
Originally posted by Xentax
The problem, IMHO, is that too few people understand that software really IS engineering, as much as building a bridge or fabricating a circuit board.
|
Hmm. But this is true for the entire software industry - people have much more respect for hardware engineering than for software. Nevertheless, it seems that downright ludicrous practices with regard to time planning and project requirements management are not so much the exception as the rule in the PC gaming industry, and given the countless failures of the model currently employed, you'd think the backers would finally start doing their due diligence and demand better practices. I still don't understand why this isn't happening.
__________________
"The number of political murders was a little under one million (800,000 - 900,000)." - chegitz guevara on the history of the USSR.
"I think the real figures probably are about a million or less." - David Irving on the number of Holocaust victims.
|
|
|
|
December 23, 2002, 16:44
|
#43
|
King
Local Time: 13:35
Local Date: November 1, 2010
Join Date: Jul 2000
Location: Moo Like In Moomin
Posts: 1,579
|
Quote:
|
Originally posted by Satyre
Funnily enough you compared MOO3 with Neverwinter Nights... which had to be one of the most shoddily developed games in the history of the gaming industry.
|
Well, I've played NWN every weekend online with my old college pals since it was released, and while it took a patch or two until it networked reliably, I haven't had it crash on me yet. Not a single time. Neither have any of the people I play with. So I'm pretty impressed with the game, given how incredibly flexible and open-ended the editor is.
But that's neither here nor there. My point wasn't that NWN is much better quality than Moo3 will be (although I will be very, very pleased if it runs half as smoothly and stably for me as NWN has been doing). The point is that it is an undertaking very much more complicated than Moo3, because of the versatility of the toolset, which Moo3 won't even come close to matching.
__________________
"The number of political murders was a little under one million (800,000 - 900,000)." - chegitz guevara on the history of the USSR.
"I think the real figures probably are about a million or less." - David Irving on the number of Holocaust victims.
|
|
|
|
December 24, 2002, 19:28
|
#44
|
Emperor
Local Time: 13:35
Local Date: November 1, 2010
Join Date: Mar 2000
Location: London, UK
Posts: 3,732
|
I'd second that opinion of NWN. It has been a very good piece of software imo. Some people had issues with the nature of the 1 player campaign that came with the box, but the whole package was excellent and stable as far as I am concerned.
__________________
To doubt everything or to believe everything are two equally convenient solutions; both dispense with the necessity of reflection. H.Poincare
|
|
|
|
January 2, 2003, 02:57
|
#45
|
Warlord
Local Time: 07:35
Local Date: November 1, 2010
Join Date: Nov 2001
Location: Ontario
Posts: 108
|
Quote:
|
Originally posted by moomin
Well, I've played NWN every weekend online with my old college pals since it was released, and while it took a patch or two until it networked reliably, I haven't had it crash on me yet. Not a single time. Neither have any of the people I play with. So I'm pretty impressed with the game, given how incredibly flexible and open-ended the editor is.
But that's neither here nor there. My point wasn't that NWN is much better quality than Moo3 will be (although I will be very, very pleased if it runs half as smoothly and stably for me as NWN has been doing). The point is that it is an undertaking very much more complicated than Moo3, because of the versatility of the toolset, which Moo3 won't even come close to matching.
|
As I recall, NWN was in development for 5 years. Enough said.
|
|
|
|
January 2, 2003, 10:37
|
#46
|
King
Local Time: 07:35
Local Date: November 1, 2010
Join Date: Sep 1999
Posts: 1,657
|
Quote:
|
Originally posted by HugoHillbilly
As I recall, NWN was in development for 5 years. Enough said.
|
We may be able to say the same about Moo3 when and if it is released.
|
|
|
|
January 2, 2003, 12:34
|
#47
|
Warlord
Local Time: 07:35
Local Date: November 1, 2010
Join Date: Apr 2002
Location: North Carolina
Posts: 190
|
Quote:
|
Originally posted by kalbear
More often than not, a low-level design doc ends up looking eerily like code; my argument would be to write self-documenting code, and allow it to explain the low-level architecture.
Design first is often an excellent strategy, but it's not the only one that works. Ever read any XP/Agile programming books, Xentax? I would argue that many developers think in terms of coding as the way they explain design, and so long as you approach coding early on as rough drafts that will be modified or thrown away in the future, it can be just as stable, useful, and reliable as fully specced, fully designed engineering projects. Often moreso, because they are able to go through many iterations, and iterations often produce better code.
|
If I said, or implied, that a complete low level design should be done before implementation starts, I apologize as that was not my intent. And, as I said, a *complete* low-level design is often not necessary or even appropriate. Depending on the situation, it's quite possible to start coding on one area or module before completing the low-level design of another area. We usually try to get all interfaces (and, for C++, all headers) done first, then we're free to fill in the rest, even if "the rest" involves some additional low-level design before coding. By "interfaces", I mean any module-to-module communication, not just "interface" in the OO sense of the word.
However, a complete *high level* design, BEFORE implementation, is a MUST, at least IMHO.
I have done some XP, and I've read about XP/Agile development. Paired programming can be a WONDERFUL experience in terms of code quality, and it's a great way to "learn the ropes", pairing an experienced developer with a lesser-experienced one (I've been in both positions now, though I'm still more likely to be the lesser-experienced one in most situations).
But, as others have said, you HAVE to spend time up front in design, because any potential oversights or gotchas ("Sleeping Dragons" as I've heard them called) are MUCH cheaper to resolve before you start coding.
Quote:
|
The most important thing for everyone out there to understand is that the specs are the key. Period. Bad specs == bad code. Incomplete specs == incomplete code. If one expects to write anything of quality when specifications aren't understood, they are wrong.
That's clearly one of the bigger flaws of MOO3, and we all know it. The specs called for IFP, which is a high-level way of controlling the entire game. Removing it changes a LOT of the way that the code was organized, prepared, and planned on. It effectively could touch everything. That's something of a biggie. Every time features get added or removed, again, incomplete specs. These are things that are almost impossible to avoid in the gaming environment without massive beta testing (like Blizzard does, for instance), and in general are hard to deal with when releasing a commercial retail level product. The less customers a product has, the easier it is to get specs right the first time. Retail has the most customers. QED.
|
I don't know how much game concepts like IFP affected game design in terms of code. I'm sure removing IFP had a non-trivial impact on the code, but it's hard to estimate just how big a change that required. I mean, they could have left a lot of the mechanics alone and just kept a hidden, non-depleting pool of IFP. Obviously, certain techs and actions had to be removed since they were tied to IFPs directly, but techs at least are pretty much data rather than code, so removing them isn't too big a deal. They may leave behind game mechanics that are never used, though.
In more general terms, we at least *try* to keep specs up to date when situations change -- so that, at the end of the project, the design DOES match the code. Of course, we're usually short of the ideal, since revising design docs after or during the fact takes time (and we all know time = money).
All of this tends to be problems of scale. On a small project, it's both cheap and easy to wiggle as needed. On a larger project (especially if the team is large), you pretty much have to keep the specs current or you'll have inconsistencies in implementation, as some people will know about changes but others won't.
__________________
Xentax@nc.rr.com
|
|
|
|
January 2, 2003, 16:02
|
#48
|
Warlord
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Nov 2002
Posts: 282
|
Quote:
|
We usually try to get all interfaces (and, for C++, all headers) done first, then we're free to fill in the rest, even if "the rest" involves some additional low-level design before coding. By "interfaces", I mean any module-to-module communication, not just "interface" in the OO sense of the word.
|
Agreed, though often the interfaces dictate well enough the behavior of the modules and the interactions so that full-fledged documentation is not necessary.
A complete high-level design for a retail product is a must, also agreed. This is especially true for the UI framework; the underlying mechanics don't need as much of this, as it's basically one large spreadsheet with various things acting on it (some of which are players, some of which are AIs).
Quote:
|
But, as others have said, you HAVE to spend time up front in design, because any potential oversights or gotchas ("Sleeping Dragons" as I've heard them called) are MUCH cheaper to resolve before you start coding.
|
Depends on the problem at hand, the scope of the problem, and how expensive it is to refactor that code out of there. I brought up the example of IFPs as a tough one to remove, and the reason is that it touches everything. All the graphics need to be changed, all the ways you interact with those graphics need to be modified. It's a pretty large interface change on the macro and micro level, and those kinds of high-level design changes usually are expensive.
Except that, if you have everything else in place, it's not so bad. If you have all the unit tests, if you have the graphics frameworks and the storyboard, it's just a matter of making a lot of propagating changes all over the place. It's work you have to do again, but it's not hugely costly. That's at least one of the ideas behind XP - that future design changes won't be horrendously expensive in time and manpower.
I think that can be very true for a small group of people working on a retail product that doesn't involve fabricating hardware - much like, say, almost all games published. That isn't to say you should just start coding and be done with it; it's to say that design should be considered and reconsidered every time you touch the code (refactoring, another rule of XP), and you shouldn't be afraid to do so early.
Quote:
|
All of this tends to be problems of scale. On a small project, it's both cheap and easy to wiggle as needed. On a larger project (especially if the team is large), you pretty much have to keep the specs current or you'll have inconsistencies in implementation, as some people will know about changes but others won't.
|
True, though with things like pair programming and a large body of unit tests, the specs need not be updated nearly as much.
Games, I feel, really can benefit from the XP process - at least for devs. Having massive unit testing done early is a godsend, after all. Being able to change the design without serious expense, because you've budgeted for that up front and plan on doing so multiple times, seems like the way to go. Having a working model as early as possible also seems like a good thing. Imagine how many problems in, say, EverQuest, would have been solved this way?
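A minimal, invented sketch of that kind of early unit test: the game rule here is made up, but the point is that when the rule is later redesigned, these checks fail immediately instead of the bug surfacing in play.
[code]
// Invented example of small, fast unit tests written early; a real project
// would normally use a test framework instead of bare asserts.
#include <cassert>
#include <iostream>

// Rule under test: each population point mans one factory, and each manned
// factory yields 3 production.
int IndustrialOutput(int population, int factories) {
    const int mannedFactories = population < factories ? population : factories;
    return mannedFactories * 3;
}

void TestOutputIsCappedByFactories() {
    assert(IndustrialOutput(10, 4) == 12);
}

void TestOutputIsCappedByPopulation() {
    assert(IndustrialOutput(2, 4) == 6);
}

int main() {
    TestOutputIsCappedByFactories();
    TestOutputIsCappedByPopulation();
    std::cout << "all unit tests passed\n";
    return 0;
}
[/code]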
|
|
|
|
January 2, 2003, 17:16
|
#49
|
Chieftain
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Jan 2002
Location: Chicago, Il.
Posts: 86
|
Moomin, I don't know of many applications...
...that have the user base, platform diversity, and reliability requirements of computer games.
I spent 8 years working for a securities exchange which had a parallel-processing hot backup and 16 redundant databases, because each minute of downtime was pegged at 1 million dollars. Even so, that was a cakewalk compared to game testing. And, appropriately, we didn't have nearly the regression staff Chantz described.
I also worked for a credit history firm, building an API between the applications and the database. We had the fortune and misfortune to be able to completely automate the regression test, so that the test took 5 hours. (Of course, getting a trillion test cases down to 0 bugs took a few years.)
MOO III might not have a trillion test cases, but testing cannot be automated very much (because of the custom graphical interface).
Bottom line is that a bug-free product is impossible, and most people are ignorant of that fact. Chantz's reply tried to explain why. It also tried to go further and say that the product *could* be released now, but was not, and that represented a commitment to quality. I would like to add: a commitment that is sorely lacking in today's computer game industry.
Your response to Chantz's post reminds me of the director who almost fired me on the spot when I told him that every release of software we ever loaded into his system had bugs.
Pop Quiz: What's the biggest problem with doing a thorough job of looking for bugs?
Answer: Finding them.
Paraphrased: Ignorance is bliss...
|
|
|
|
January 2, 2003, 17:44
|
#50
|
Chieftain
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Jan 2002
Location: Chicago, Il.
Posts: 86
|
Okay design freaks... :)
Paralysis by Analysis... I am a systems architect, so believe me when I say, I love designs, but you have got to start building to test it out. Too much time is wasted arguing whether something will work, too little spent on arguing about what is needed.
Building a program is not like building a bridge. All the bridges in the world have the same set of rules. Bridge stand, Bridge bear load, Bridge not fall down. Sir Isaac Newton is the primary subject-matter expert, now go do your max/min projects and be done with it. Every program has different requirements and different SMEs.
Even with bridge building, every design has build-and-test cycles incorporated, using scaled-down models.
The gaming industry has an even tougher row to hoe. Even if the code satisfies the design flawlessly, it still has to be fun. Fun is not as easy to measure as "bridge still standing" or "field not null". The only way to test "fun" is to build it first...
I see the problem as just the opposite. Too much of the game development cycle is now dedicated to pre-build work, and too little dedicated to post-build adjustments.
PS I believe the reason for this is economics 101 stuff. Why pay for a QA staff when you can get a QA staff to pay you for the privilege? The only risk comes when you are too far off the mark and cannot rework your way into a quality product (HOMMIV). This approach only works for established franchises. New franchises tend to be more solid because the free QA staff does not yet exist...
|
|
|
|
January 3, 2003, 11:13
|
#51
|
Warlord
Local Time: 07:35
Local Date: November 1, 2010
Join Date: Apr 2002
Location: North Carolina
Posts: 190
|
Quote:
|
Originally posted by kalbear
Agreed, though often the interfaces dictate well enough the behavior of the modules and the interactions so that full-fledged documentation is not necessary.
A complete high-level design for a retail product is a must, also agreed. This is especially true for the UI framework; the underlying mechanics don't need as much of this, as it's basically one large spreadsheet with various things acting on it (some of which are players, some of which are AIs).
Depends on the problem at hand, the scope of the problem, and how expensive it is to refactor that code out of there. I brought up the example of IFPs as a tough one to remove, and the reason is that it touches everything. All the graphics need to be changed, all the ways you interact with those graphics need to be modified. It's a pretty large interface change on the macro and micro level, and those kinds of high-level design changes usually are expensive.
Except that, if you have everything else in place, it's not so bad. If you have all the unit tests, if you have the graphics frameworks and the storyboard, it's just a matter of making a lot of propagating changes all over the place. It's work you have to do again, but it's not hugely costly. That's at least one of the ideas behind XP - that future design changes won't be horrendously expensive in time and manpower.
I think that can be very true for a small group of people working on a retail product that doesn't involve fabricating hardware - much like, say, almost all games published. That isn't to say you should just start coding and be done with it; it's to say that design should be considered and reconsidered every time you touch the code (refactoring, another rule of XP), and you shouldn't be afraid to do so early.
True, though with things like pair programming and a large body of unit tests, the specs need not be updated nearly as much.
Games, I feel, really can benefit from the XP process - at least for devs. Having massive unit testing done early is a godsend, after all. Being able to up-front change design without serious expense, because you've already budgeted that in and plan on doing so multiple times, seems like the way to go. Having a working model as early as possible also seems like a good thing. Imagine how many problems in, say, Everquest, would have been solved this way?
|
It's that budgeting for changes, in terms of both cost and time, that is the great failing of so many software projects. It's so much more common to have less than 100% of the time you'd need to do it in UTOPIAN conditions, and we all know that it NEVER goes 100% according to plan.
Maybe that's one reason the project manager we had with a Marine background was SOOO capable (far and away the best manager I've ever worked under) -- he *expected* "the waste material to interface with the air oscillation unit" at least once per project, and planned appropriately.
__________________
Xentax@nc.rr.com
|
|
|
|
January 4, 2003, 12:24
|
#52
|
Emperor
Local Time: 14:35
Local Date: November 1, 2010
Join Date: Aug 1999
Location: Aarhus, Denmark
Posts: 3,618
|
Quote:
|
Originally posted by Xentax
he *expected* "the waste material to interface with the air oscillation unit" at least once per project
|
HeHe...Nice one, Xentax.
Asmodean
__________________
Im not sure what Baruk Khazad is , but if they speak Judeo-Dwarvish, that would be "blessed are the dwarves" - lord of the mark
|
|
|
|
January 6, 2003, 06:36
|
#53
|
Warlord
Local Time: 20:35
Local Date: November 1, 2010
Join Date: Oct 2002
Posts: 103
|
Well, don't you have to get console games perfect the first time, because you can't patch a console game? So it's kind of like having a car that you can't fix, so you would spend more time getting the console game perfect, or something close to it.
__________________
"Dont move or ill shoot you full of... little yellow bolts of light!" -John Crichton, astronaut and scientist
|
|
|
|
January 6, 2003, 09:18
|
#54
|
King
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Oct 2002
Location: Birmingham, AL
Posts: 1,595
|
Quote:
|
Originally posted by Gooberman32
Well, don't you have to get console games perfect the first time, because you can't patch a console game? So it's kind of like having a car that you can't fix, so you would spend more time getting the console game perfect, or something close to it.
|
Enjoy this while it lasts. I have an Xbox and it already has games that could use a patch. As online console gaming becomes more prevalent, it offers a convenient way to patch a broken game. Console makers like Sony and Microsoft won't complain, because it forces consumers to participate (buy) in their online experience. Look for more shoddily produced games to hit the market as more people get online; it's not going to be any different from the PC market.
|
|
|
|
January 6, 2003, 10:14
|
#55
|
Warlord
Local Time: 07:35
Local Date: November 1, 2010
Join Date: Apr 2002
Location: North Carolina
Posts: 190
|
*If* the console online experience catches on, anyway -- Xbox seems to have a small but rabid following so far, and the requirement that Xbox Live games all support voice and a game-matching service, etc., is an interesting approach.
By the time you get the console, the modem or broadband adapter, a keyboard, and whatever else, it really is little different from a PC (except that the target platform is much more uniform).
I dunno, I suppose it'll happen, if good enough and unique enough games are console only (I'm still royally pissed that Halo hasn't come out on the PC yet!).
But, AFAIK, only the Xbox has the internal hard disk that a patch really needs to be viable on the console -- I can't see the PS2 and Gamecube trying to use a memory card to store a patch...
So, Gooberman, yes you DO need to get much closer to "perfection" for console games, since patches aren't a possibility yet, and probably won't be for some time. However, since you have a consistent target platform and a purpose-built bare-bones OS (rather than a multi-purpose does-all one like Windows, MacOS, etc.), it tends to be a somewhat simpler problem to solve, though it will always require very vigorous testing.
And you usually *do* find a few glitches in any console game, but they're rarely if ever major ones. For example, I noticed that the "You are here" marker in GTA:Vice City is slightly offset from your actual position, at least sometimes. Annoying, to be sure, but also not surprisingly something that escaped the testers' attention.
__________________
Xentax@nc.rr.com
|
|
|
|
January 6, 2003, 11:04
|
#56
|
King
Local Time: 12:35
Local Date: November 1, 2010
Join Date: Oct 2002
Location: Birmingham, AL
Posts: 1,595
|
The PS2 has an attachable hard drive that's required for some games, FFXI being the most notable. I believe the EverQuest port requires the hard drive as well.
|
|
|
|
January 16, 2003, 17:03
|
#57
|
Emperor
Local Time: 14:35
Local Date: November 1, 2010
Join Date: Aug 1999
Location: Aarhus, Denmark
Posts: 3,618
|
Hmmm... maybe it's time to "unsticky" this thread. No replies since January 6th.
Asmodean
__________________
Im not sure what Baruk Khazad is , but if they speak Judeo-Dwarvish, that would be "blessed are the dwarves" - lord of the mark
|
|
|
|
January 17, 2003, 01:17
|
#58
|
King
Local Time: 07:35
Local Date: November 1, 2010
Join Date: Dec 1969
Location: Seattle
Posts: 1,038
|
Six stickies is a bit much.
|
|
|
|