Quake First Person Shooters (Games)

VoodooExtreme Interview With John Carmack 70

We've had quite a number of submissions concerning the VoodooExtreme interview with John Carmack. Day One is on the site, as is Day Two. Day Three goes up (surprise!) tomorrow - so check back there tomorrow, 'cuz I'm not posting it again. *grin*
  • Well, other than the fact it requires an external power source, are there any stats on the voodoo6?
    Because, from all the comparisons I've seen of the new Voodoo cards vs. the nVidia and Riva cards, Voodoo is losing the FPS war. Badly.
  • by Trevor Goodchild ( 187368 ) on Wednesday September 20, 2000 @12:33PM (#765497)
    Is today a slow news day, or something?

    Well, today we learned:

    Nobody understands the need for a DVD full of Linux apps

    There's legal liability in having a web site

    Voodoo cards sure are zippy

    "Hackers" is an old book, and we don't care anymore

    CueCat is as doomed as everyone thought it was

    Stuff's still happening in the Mozilla world

    The FBI is evil

    Robots are cool

    OS-X has a BSD kernel, which should be nifty

    Geeks live in houses that any right-thinking person would avoid

    AMD proves that Moore's law ain't croaked yet

  • nope. that's not right. it's actually almost 180 degrees off from what the story is about.

    Hemos posted that Voodoo Extreme has an interview with John Carmack on their site.
    the interview is broken into 3 parts.
    The 3 parts are called Day 1, Day 2, and Day 3.
    Day 1 and Day 2 are already available on their site. Day 3 will be available tomorrow.

    Hemos specifically says he's not posting anything about it tomorrow.


    Darth -- Nil Mortifi, Sine Lucre

  • Oh, I read "there" as "here"...

    /me slaps himself with a wet trout.

    ---

  • As much as I have liked 3DFX products in the past, I think they have made a few (fatal?) errors on this product. One, reading the review, you get the definite impression that this product was cobbled together quickly with many shortcuts that affect the overall quality, and then rushed to market in order to keep afloat against NVIDIA. The product almost has an air of desperation about it. While they do have a few good ideas, like being able to daisy-chain the processors in future products, I would bet heat and power considerations will limit this in the future. Overall, it is almost as if 3DFX is trying to cheat their customers with a holding action while they, hopefully, deliver a once more superior product in the future.

    And this isn't meant as flamebait, but I am sure flames will ensue...
  • And I suppose you got Brad's permission to link directly to his essay, hmm? ;p

  • by Junks Jerzey ( 54586 ) on Wednesday September 20, 2000 @12:40PM (#765502)
    John Carmack is brilliant. That really doesn't need to be said. He's a top notch programmer, and a 3D graphics expert. He exerts great influence over PC video hardware. He brought 3D graphics research from decades earlier to the PC. He came up with some cool ways of getting high-end looking graphics on fairly low-end PCs.

    Obviously, though, this is all very technology oriented. There's more to games than that. It gets tiring to read interviews in which he is called the Top Dog of computer games, and all the questions are about 3D APIs and which video card is best and what console has faster hardware. In short, he's The King of 3D Tech on the PC, but this is being equated with the driving pulse of computer games. In a way it's sort of depressing that PC gaming has been reduced to video cards and benchmarks. This isn't Mr. Carmack's fault, of course, but it all feels very materialistic and empty.
  • Er...VooDoo Extreme is a review site. Has nothing to do with The VooDoo vid cards.
  • The technology exists to stop people from linking directly. It's just that most sysadmins are too lazy to use it. If you don't want people to link directly to your webpages, then don't put them up. Or use an HTML session to stop the people who link in !directly! and their 'vile' ways.
  • Look, the 'net was designed from the ground up to serve as a knowledge repository, a method for researchers to share information with each other. Linking goes right to the core of this - when you publish something on the 'net without HTTP authentication or a login, you are implicitly saying that it is OK to link to your content.

    It is ideas like the one presented here that threaten the nature of the 'net. Some people would like to modify the HTML standard so that each link has a Copyright field in it, so you can't go there unless your browser is authorized to. Or how about compiling HTML so other people can't view your code? Let's make the net read-only and royalty-based, lock out the information have-nots and the poor, create a digital divide, and set up different classes of citizens!

    It quickly snowballs, you see, and not for the better. Slashdot needs no such permission; if they want to make sure people don't link, they can do like the NY Times and require registration before you can view. It's called fair use - Slashdot doesn't profit directly from linking, or not linking, to that article. And btw - I thought the goal of a magazine was to GET READ... putting up barriers to linking only encourages people NOT TO READ your 'zine. Or make dynamic links that break every 15 minutes or so. Go ahead, destroy your website and make people want to go elsewhere if that's your game plan, but keep your laws off my content.

    --

  • Does anybody who's read the article have any idea what the vague references to Doom 3's tech are supposed to mean? Between the lines it almost sounds like he's going the Unreal way: no more precalculated PVSes, but editor-placed portals, for more dynamic scenery and lower vis times. Which would be pretty strange, since the PVS constant-time culling was arguably one of the breakthroughs enabling Quake's framerate. Is he saying Sweeney was right after all? It's obvious that the Q3A engine is too static for a single-player focused Doom3; still, Valve did some nice hack-n-patch with the equally static Q1 code... Curious.
  • Umm.. that case already went to court. Nobody owns links. And since it's not DeCSS or kiddie porn, it is their right. But, morally and out of consideration.. I don't know. Depends on how the site is set up. But, either way, voodooextreme is slammed nonresponsive. I can't get it no matter if I hit the link or the main cache. So, moot point in this instance.
  • Ummmmmm, you deep linked to his essay. You are denying him the right to spam (good or bad) us with things we don't want to see. Just trying to make a simple point.

    I really don't care if I come off like a flamer, but what planet are you from? If you haven't noticed, most articles posted on Slashdot have both the root web address for the site and the address for the article. Quite frankly, I don't want to always surf through crap to find what I'm looking for. Most don't. Besides, sites that are updating news and content often change their entry pages. So, this Carmack interview may be hard to locate in three days.

    Also, many sites on the web overdo the banners and revenue stream thing anyway. It's like, click next page 10 times for something that could fit on one page. I click the printer friendly formats whenever possible just to skip the crap. Banners and gifs and crap on the top, bottom, left and right. You readers know what I'm talking about.

    This is lame anyway. Permission? They posted it. It's the web. We don't stand for that crap. Some stupid judge in New York may call linking wrong, but we don't. By the way, the search engines are stealing their content too if you apply the same logic.

    You should work for the RIAA or the MPAA.
  • I agree.

    Not to sound like some sort of bandwagon jumper-oner.

    But without people like John C. and the Unreal Team there would be no new graphics technology. We would be forced to find our graphics enjoyment in the realm of console gaming. Which is good, but it's nice to be able to get a really awesome card during that 1-2 year period when gaming consoles don't improve.

    Hardware/Software are a funny thing because hardware drives the software to new heights and without the software, nobody would buy the hardware.

  • by startled ( 144833 ) on Wednesday September 20, 2000 @12:49PM (#765510)
    LOL. I don't think the moderator realized this was satire.

    To the uninitiated: VoodooExtreme is a news collection site. While it produces some of its own content, much of it is just links all over the web. You know, like slashdot. They excuse it by giving credit to all the contributors.

    It's a thin line, though, between performing a service by seeking out cool stories, and just ripping off other sites' news. This point was best made by OldManMurray, who just linked to all of VoodooExtreme. It used to be at this link to "marvin sedate" [oldmanmurray.com], but that gave me some odd redirect to here [oldmanmurray.com], so they probably stopped doing it back in February, which is what the latter links to. Still funny.
  • How the hell is this stealing revenue from the site? First off I haven't been to the site... but I am willing to bet there is an AD on every page.

    If someone from slashdot goes to this interview it means they weren't reading it from VoodooExtreme in the first place. This means if Slashdot hadn't linked to it they prolly never would have read the story... now that it has been linked from TrollDot lots of new people are going to read the interview. Slashdot has in fact increased the number of AD impressions and revenue for the site overall.

    You obviously have never had to deal with issues like this before... people linking into your site is great. It brings you a new untapped market that may potentially come back for more.
    ---
    Solaris/FreeBSD/Openstep/NeXTSTEP/Linux/ultrix/OSF/...
  • I know this thread is off topic, but hopefully I won't be moderated into oblivion for responding to it...

    Here's an article all about deep linking and its legal ramifications.
    GigaLaw Deep Linking Article [gigalaw.com]
    Also, as far as copyright violation goes, at least one U.S. District Judge disagrees with Brad Templeton as he ruled that deep linking did not constitute copyright violation. (it's on page 4, Ticketmaster vs. Tickets.com)
    (well, to be honest he said it's not copyright violation as long as it's clear that you are being transported to another site....so no opening them into frames).

    Realistically, I think Voodoo Extreme is making a lot more money off of everyone at slashdot jumping directly to the article and being served the ad banner that's on the article pages. (I'm assuming there's one there since they are slashdotted and I can't check... but it's a pretty safe assumption considering I've never seen a news site that only served ads on their main page.)


    Darth -- Nil Mortifi, Sine Lucre

  • I think you're thinking of Romero
  • Huh? From the fact that all interviews with The Carmack are about 3D APIs and videocards, you conclude that gaming has been reduced to videocards and benchmarks?

    Of course, I agree that most games put way too much emphasis on the flash and bang, but otoh don't underestimate the importance of a good engine, either. You can hate the industry's focus on fillrates, but as long as those fillrates aren't high enough, you can simply forget about Doom-like rooms with literally 70-or-so imps in them -- unless you go back to 2D sprites. Yeah, that's an option, but no publisher'll buy it. So, for every 10 games that come out that are all about the tech & gfx but offer no gameplay, there is one game out there that has a designer who really wants the tech for his gameplay ideas. That's probably a fair ratio. You can argue in which category id's products fall -- but at the very least it's clear into which some of their licensees go (Half-Life, anyone?). So give 'm a break ;)

  • Well, I am going to bite:
    Carmack never said he'd make you his bitch; that was Romero talking about Daikatana.


    $var = <STDIN>;
    $var =~ s/\\$//;
  • The only thing for which Slashdot should be held responsible is that they get to decide whose server gets /.'d next. (Example: I am having issues loading voodooextreme right now.)

    But I find it rather interesting that you consider webbing (i.e., linking to other sites) something immoral to do on the web. Last time I checked, that's why we have this thing called the world wide web, as opposed to the world wide set of non-interlinked sites. Linking is the reason we have a web.

    Moreover, I find it bothersome that you consider it a bad thing that /. sends customers to other people's pages. Sure, it would make sense if they were sending us to a frame that contains only the story, but they aren't; the link sends us to a voodooextreme page. A full page. If voodooextreme only bothers to put banners on their front page, it's their fault. More likely they have banners on every page. Also, just from the fact that we are now on voodooextreme's page, it would seem to me that a good number of people would actually look around other, non-related pages on voodooextreme just because we are in the neighborhood. I know I do that all the time. Therefore, /. is supplying voodooextreme with customers, yet again.

    In all, I find this stance ridiculous, if only for the fact that if /. never linked to a page, we wouldn't be part of the "web." And if they only linked to the front page, I sure as hell know I would never follow a link from /. again. (Though I am sure a lot of sysadmins would be damned happy about not having to face being /.'d.)

    Just a little logic in the hands of an illogical computer user.

  • This article has nothing to do with 3dfx. What's wrong with you?
  • they enable readers to skip past many of the ad banners and cripple the targeted site's
    revenue stream -- in effect, they're stealing VoodooExtreme's interview.


    Wrongo.
    You are so incredibly negative. Do you ever have any_damn_thing positive to say?

    Did you even go to the page and read the interview?

    Gaming sites have ads on every page, including interviews.

    I would say that slashdot actually generated some revenue for voodooextreme by sending people to that interview.

  • Several readers have already replied to this post and made good points, but to summarize:

    1. Technology to discourage deep linking exists.
    2. Once published online, there's NOTHING anyone can do to prevent the content from propagating, either as a link or a copy. At least slashdot links to the site rather than doing a lynx -dump and posting that on their own site.
    3. I wouldn't want it any other way. I have no interest in reading VoodooExtreme regularly to find the articles that interest me, nor would I be impressed if SlashDot posted a "there's a John Carmack interview somewhere on VoodooExtreme...go find it if you can!"
    4. The issue of whether deep linking or mirroring is "wrong" is not as clear-cut as you suggest with your "do the right thing" comment.
    5. If VoodooExtreme asked Slashdot to remove the link, I bet Slashdot would be happy to oblige. I don't think VE cares that much. It's still publicity.
    6. If I say "I saw an article in the local paper on page B6 about increasing medical costs..." I am not cheating the newspaper out of all the "page views" they would have gotten if I had left out the page number.
  • by John Carmack ( 101025 ) on Wednesday September 20, 2000 @01:11PM (#765520)
    PVS was The Right Thing when level geometry counts were much lower. With tens of thousands of polygons in a scene, creating a cluster tree directly from that becomes completely unrealistic from a space and time perspective.

    The options are to either do PVS with a simplified version of the world, or ignore the geometry and just work with portal topology.

    Unreal used a scan-line visibility algorithm, which crippled its ability to have high poly counts or high framerates with hardware accelerators.

    Tim Sweeney knows full well that the architecture is very wrong for modern systems, but many of the original decisions were based on earlier software technologies. Unreal was supposed to be a "killer app" for the Pentium-200 MMX processor.

    I have a lot of respect for Tim and Unreal, but the visibility algorithm in Unreal turned out to be a bad call. He is changing it for future work.

    John Carmack
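
    (A footnote for the curious: the "constant-time lookup" a PVS gives you is, at bottom, just a bit test. Here's a minimal sketch in C - the names are invented and the bit vectors are stored uncompressed for clarity; the real Quake PVS is run-length compressed - to show why this culling step is so cheap per frame.)

    #include <stdio.h>

    /* One visibility bit vector per cluster: bits[i] holds one bit for
       every cluster potentially visible from cluster i. */
    #define MAX_CLUSTERS 1024

    typedef struct {
        int           numClusters;
        unsigned char bits[MAX_CLUSTERS][MAX_CLUSTERS / 8];
    } pvs_t;

    /* Constant-time test: is cluster 'to' potentially visible from 'from'? */
    static int PVS_ClusterVisible(const pvs_t *pvs, int from, int to)
    {
        return (pvs->bits[from][to >> 3] >> (to & 7)) & 1;
    }

    int main(void)
    {
        static pvs_t pvs;            /* static: the table is too big for the stack */
        pvs.numClusters = 3;

        pvs.bits[0][2 >> 3] |= 1 << (2 & 7);    /* mark cluster 2 visible from 0 */

        printf("0 sees 2: %d\n", PVS_ClusterVisible(&pvs, 0, 2));   /* prints 1 */
        printf("0 sees 1: %d\n", PVS_ClusterVisible(&pvs, 0, 1));   /* prints 0 */
        return 0;
    }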
  • go to http://johncglass.com/leethaxors.htm [johncglass.com] and you will learn about being l33t!
  • Well blow me down!

    VoodooExtreme is set as my start page....and I always wondered if that TI-85 they run the site off of would get Slashdotted......looks like they are!

    Good thing I already read the article.


    -Julius X
  • This is the second part of the article, posted today.

    --

    Voodoo Extremist Chris Rhinehart; Human Head Studios -- From what I've read, Doom3 is intended to have a strong single-player experience. What do you anticipate to be the biggest design hurdles to overcome while creating Doom3, as opposed to designing a title intended primarily for multiplayer?

    John Carmack -- We sort of went into Q3 thinking that the multi-player only focus was going to make the game design easier. It turned out that the lack of any good unifying concept left the level designers and artists without a good focal point, and there was more meandering around than we cared for. The hardest thing is deciding what to focus on, because DOOM meant different things to different people. We have decided to make the single player game story experience the primary focus, but many people would argue that DOOM was more about the multi-player.

    Voodoo Extreme -- When do you think computers will become fast enough so that developers can dump BSP based VSD algorithms for more flexible ones?

    John Carmack -- I think this has been mis-characterized for a long time - None of the Quake games have had what I would call a "BSP based VSD algorithm". The visibility associated with quake is a cluster to cluster potentially visible set (PVS) algorithm, masked by an area connectivity graph (in Q2 and Q3), followed by hierarchical frustum culling (which does use the BSP). The software renderers then performed an edge based scan-line rasterization algorithm, which resulted in zero-overdraw for the world.

    Early in Q1's development, I pursued "beam trees", which were truly a BSP based visibility algorithm that did exact visibility by tracking unfilled screen geometry going front to back, but the log2 complexity scaling factor lost out to the constant complexity factor from the PVS.

    That highlights an important point that some graphics programmers don't appreciate properly - it is the performance of the entire system that matters, not a single metric. It is very easy to go significantly slower while drawing less primitives or with less overdraw, because you spent more time deciding which ones to not draw than it would have taken to draw them in a more optimized manner. This applies heavily to visibility culling and level of detail work, and is much more significant now with geometry processors and static meshes.

    The PVS system had two significant benefits: constant time lookup, and complete automation (no designer input required).

    Through Q2 and Q3, the "complete automation" advantage started to deteriorate, as designers were coerced into marking more and more things as detail brushes to speed up the processing, placing hint brushes to control the cluster sizes, or manually placing area-portals.

    The principal drawbacks of the PVS are the large pre-processing time, the large storage space cost, and the static nature of the data.

    The size and space drawbacks were helped with detail-brushes, which basically made a more complex map seem less complex to the visibility process, but they required the level designers to pro-actively take action. It has been interesting to watch the designers' standard practices. Almost nobody just picks a policy like "all small trim will be detail brushes". Instead, they tend to completely ignore detail brushes until the map processing time reaches their personal pain threshold. Here at Id, we usually didn't let maps take more than a half-hour to process (on our huge 16-CPU server), but I heard tales from other companies and the community of maps that were allowed to take overnight or all weekend to vis. That is a mistake, but the optimize-for-vis-time guidelines are not widely understood.

    The static nature of a pre-computed PVS showed up most glaringly when you had your face in front of a closed door, but the game was running slow because it was drawing everything behind the door, then drawing the door on top of it. I introduced areaportals in Q2 to allow designers to explicitly allow large sections of the vis to be pruned off when an entity is in a certain state. This is much more efficient than a more generalized scheme that actually looked at geometric information.

    In the Q1 timeframe, I think the PVS was a huge win, but the advantage deteriorated somewhat as the nature of the rendering datasets changed.

    In any case, the gross culling in the new engine is completely different from previous engines. It does require the designers to manually place portal brushes with some degree of intelligence, so it isn't completely automated, but I expect that for commercial grade levels, there will be fewer portal brushes than there currently are hint brushes. It doesn't have any significant pre-processing time, and it is an exact point-to-area, instead of cluster-to-cluster. There will probably also be an entity-state based pruning facility like areaportals, but I haven't coded it yet.
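
    (An aside for readers: the areaportal idea Carmack describes above is just a flood-fill across a small graph. A sketch in C - the area/portal structures and names are invented for illustration, this is not id's code - showing how one closed door prunes everything behind it in a single step:)

    #include <stdio.h>

    /* Areas are graph nodes; portals are edges a door entity can open
       or close. Flooding from the viewer's area marks what to draw. */
    #define MAX_AREAS   32
    #define MAX_PORTALS 64

    typedef struct { int a, b; int open; } portal_t;

    static portal_t portals[MAX_PORTALS];
    static int      numPortals;
    static int      areaVisible[MAX_AREAS];

    static void FloodArea(int area)
    {
        int i;
        if (areaVisible[area])
            return;
        areaVisible[area] = 1;
        for (i = 0; i < numPortals; i++) {
            if (!portals[i].open)
                continue;                 /* closed door: prune everything beyond */
            if (portals[i].a == area)
                FloodArea(portals[i].b);
            else if (portals[i].b == area)
                FloodArea(portals[i].a);
        }
    }

    int main(void)
    {
        /* Two rooms joined by a door portal. */
        portals[0].a = 0; portals[0].b = 1; portals[0].open = 0;
        numPortals = 1;

        FloodArea(0);                     /* viewer stands in area 0 */
        printf("area 1 drawn, door closed: %d\n", areaVisible[1]);  /* 0 */

        areaVisible[0] = areaVisible[1] = 0;
        portals[0].open = 1;
        FloodArea(0);
        printf("area 1 drawn, door open:   %d\n", areaVisible[1]);  /* 1 */
        return 0;
    }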

    Voodoo Extreme -- The shader rendering pipeline [in DOOM 3] - completely re-written from Quake III? How are you going to handle the radically different abilities of today's cards to produce a similar visual effect on each? For example I'm thinking of the presence or absence of register combiners, and the different implementations of these extensions.

    John Carmack -- The renderer is completely new, and very different in structure from previous engines. Interestingly, the interface remained fairly close for a long time, such that I was able to develop most of the DOOM renderer using the rest of Q3 almost unmodified. It finally did diverge, but still not too radically.

    The theoretically ideal feature set for a 3D accelerator would be:

    Many texture units to allow all the lighting calculations to be done in a single pass. I can use at least eight, and possibly more if the reflection vector math needs to burn texture units for its calculations. Even with the exact same memory subsystem, this would more than double the rendering speed over a current dual texture chip.

    Flexible dependent texture reads to allow specular power function lookups and non-triangulation dependent specular interpolation. No shipping card has this yet. I was initially very excited about the possibility that the ATI Radeon would be able to do some of this, but it turns out to not quite be flexible enough. I do fault Microsoft for adopting "bumped environment mapping" as a specialized, degenerate case of dependent texture reads.

    Dot3 texture blending. This is critical for bump mapping. Embossing and bump env mapping don't cut it at all. GeForce and Radeon have this now, and everyone will follow.

    Flexible geometry acceleration. I can't use current geometry accelerators to calculate bumped specular, so the CPU must still touch a lot of data when that feature is enabled. Upcoming geometry processors will be powerful enough to do it all by themselves. I could also use multiple texture units to get the same effect in some cases, if the combiners are flexible enough.

    Destination alpha and stencil buffer support are needed for the basic functioning of the renderer. Every modern card has this, but no game has required it yet.
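
    (An aside for readers wondering what "Dot3 texture blending" actually computes: per pixel, the hardware dots a normal fetched from a normal map against a light vector, both packed into RGB with 0..255 mapped to -1..1. A CPU sketch of the same arithmetic - purely illustrative, not a driver path:)

    #include <stdio.h>

    typedef struct { unsigned char r, g, b; } rgb_t;

    static float Expand(unsigned char c)    /* 0..255 -> -1..1 */
    {
        return c / 127.5f - 1.0f;
    }

    /* The per-pixel diffuse term dot3 blending produces. */
    static float Dot3(rgb_t n, rgb_t l)
    {
        float d = Expand(n.r) * Expand(l.r)
                + Expand(n.g) * Expand(l.g)
                + Expand(n.b) * Expand(l.b);
        return d < 0.0f ? 0.0f : d;         /* clamp texels facing away */
    }

    int main(void)
    {
        rgb_t flat  = { 128, 128, 255 };    /* normal pointing out of the surface */
        rgb_t light = { 128, 128, 255 };    /* light shining straight in */

        printf("diffuse term: %.3f\n", Dot3(flat, light));   /* ~1.0 */
        return 0;
    }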

    The ideal card for DOOM hasn't shipped yet, but there are a couple good candidates just over the horizon. The existing cards stack up like this:

    Nvidia GeForce[2]: We are using these as our primary development platform. I play some tricks with the register combiners to get a bit better quality than would be possible with a generic dual texture accelerator.

    ATI Radeon: All features work properly, but I needed to disable some things in the driver. I will be working with ATI to make sure everything works as well as possible. The third texture unit will allow the general lighting path to operate a bit more efficiently than on a GeForce. Lacking the extra math of the register combiners, the specular highlights don't look as good as on a GeForce.

    3DFX Voodoo4/5, S3 Savage4/2000, Matrox G400/450, ATI Rage128, Nvidia TNT[2]: Much of the visual lushness will be missing due to the lack of bump mapping, but the game won't have any gaping holes. Most of these except the V5 probably won't have enough fill-rate to be very enjoyable.

    3DFX Voodoo3, S3 Savage3D/MX, Matrox G200, etc: Without a stencil buffer, much of the core capabilities of the renderer are just lost. The game will probably run, but it won't be anything like we intend it to be viewed. Almost certainly not enough fill rate.

    Voodoo Extreme -- The game side is C++, why not the rest of the code?

    John Carmack -- It's still a possibility, but I am fairly happy with how the internals of the renderer are represented in straight C code.

    Voodoo Extremist Gabe Newell; Valve Software -- John has consistently made very clear decisions about the scope of projects id has undertaken, which I would say is one of the main reasons id has been such a consistent producer over an extended period of time. Not having spoken with John about it directly, I think I understand his rationale for focusing id on the Doom project. For the benefit of other developers, are there a couple of heuristics John uses to decide what does and doesn't make sense to undertake on a given project?

    John Carmack -- The basic decision making process is the same for almost any choices: assess your capabilities, value goals objectively, cost estimate as well as you can, look for synergies to exploit and parasitic losses to avoid. Maximize the resulting values for an amount of effort you are willing to expend.

    Computer games do have some notable aspects of their own, though. Riding the wave of Moore's Law causes timeliness to take on a couple new facets. Every once in a while, new things become possible or pragmatic for the first time, and you have an opportunity to do something that hasn't been seen before, which may be more important than lots of other factors combined.

    It also cuts the other way, where something that would have been a great return on the work involved becomes useless or even a liability when you miss your time window. Several software rendering engines fell into that category.



    -Julius X
  • Comment removed based on user account deletion
  • by Frac ( 27516 )
    VoodooExtreme gets Slashdotted to the Extreme. Bye bye server.
  • Though it may be worthy of trolling or flamebait.

    GPL is meant to allow people the freedom of using code. It means anyone anywhere anytime can use and hack and play with the code to suit their needs, to scratch their itches.

    It just so happens that GPL would not help id. It would not even help the community, I think, because anyone who can casually jump in and 'browse' and edit and play with the code, would probably be able to write this kind of stuff from scratch in the first place. It's highly specialized, highly tuned, highly precise code to do things tight and fast.

    I guess it would benefit non-owners/writers of code if we could look through it and learn from it, but that is almost exactly why he GPLs his old out of date code. It's more useful as training material than it is for release/sales.

    If someone wants a GPL high performance 3d engine, there is Crystal Space. Otherwise, ID owns Quake3, and can decide *when* and *if* to GPL it. No one else.

    The nick is a joke! Really!
  • Whoa dude, wrong article.
  • Without the 'technology as game phase' we wouldn't have the Half Lifes and Hexens, would we?

    ID does what it does well, and everyone else mooches off them. If no one is doing what you need, in terms of games, then go do the noble, honorable, open source thing, and go scratch your itch. Buy a copy of Quake[123] and go brew your own game. Code your own logic, make your own models, map your own levels, monsters, weapons, stories, etc. And then release it/sell it/distribute it. Because others are just as weary as you are, and will give you much praise/wealth/accolades for your contribution to their life.

    The nick is a joke! Really!
  • In short, he's The King of 3D Tech on the PC, but this is being equated with the driving pulse of computer games.

    Well, have you ever heard of a little game called DooM? Or how about Quake? Or, if you want to get old school, how about Wolfenstein 3D? Wolfenstein was simply the first first-person shooter EVER. That's Ever with a big E. DooM is the most successful computer game on the planet. Hands down. Quake started the online multiplayer movement. There would be no EverQuest or Diablo II massive-multiplayer abilities if Quake hadn't broken the old creed that multiplayer wasn't viable. QuakeWorld quickly and simply made everyone turn their heads and wake up to the fact that millions of people would like to play other people great distances away from them. In real time. Without the lag that NetQuake suffered. QuakeWorld still has thousands of people playing it today. DooM was the first to let you build your own levels (this is debatable, as you could build them in Wolf3D, and perhaps Rise of the Triad, but I'm not sure about the latter). Quake introduced QuakeC, allowing people to make their own games that had NOTHING to do with the original game. Ever heard of Quake Rally, the racing game? Or how about Air Quake, where you got to fly around in airplanes and tanks and such?

    If John Carmack isn't the founder of modern gaming, I don't know who is. The reason he's speaking only of 3d graphics, algorithms and such is because id is keeping its mouth shut on Doom 3 or Doom2k or whatever they're calling it. What else is there to talk about when you can't talk of the game? The technology that makes them work.

  • I'm looking for someone to create the ultimate co-op game.

    I don't think I've found a good one yet; Diablo and other such games have some of the flavor, but don't quite have the teamwork or shared experience that Doom [12] did.

    Multiplayer deathmatch games are too haphazard and uncoordinated, whereas tight story-driven games are too delicate. Is there a way to create a *shared* experience game? I guess a war sim, with players as soldiers in a platoon, might do it.

    I think it just requires the proper matching of story, plot, and gameplay to do it, and that it isn't impossible, even if it hasn't quite been done yet either. Think of movies that have group dynamics, and try to capture some of that in a game. I guess I'm rambling now.

    Still, I'm looking forward to Doom3, if only to see what ID puts out this time...

    The nick is a joke! Really!
  • no you were fooled. that was an impostor... look at the 'r': c a r r n a c k
  • geez, of all people, I didn't expect to see you fall for this.

  • I think you fundamentally fail to understand the nature of the Web as a medium. One of the things that differentiates the web from, say, gopher or a magazine, is that every page can be directly linked to, and can have links. This creates an organizational structure in which the basic unit is a page, not a site (or book, or magazine). I think this is a good thing. Whether it is good or not, it is indisputably true.

    This allows for much more useful and dynamic configurations than a less fully interconnected graph would have. It also means that every web page must stand (or fail) on its own. On a fundamental level every file, be it text, image, or other, is offered independently by the web server.

    Any file with a unique permanent URL can be linked to, or even included in (via frames or tags) any HTML page, or fetched directly by URL. That is the basic nature of the web. Creating a business model which ignores the way the world is is unwise. Trying to change reality to support that business model is arrogant and stupid.
  • Here's the email I just sent the author of that page:

    Hi Brad,

    I'm writing to offer another perspective on the deep linking issue, in response to your page at

    http://www.templetons.com/brad/linkright.html

    " The trick is that there may be restrictions on how the pages are fetched that the owner wishes to set. In particular, many web pages are composite documents, consisting of several items, such as graphics and text, and are not meant to be viewed in their individual parts. That they can be viewed as independent parts is an artifact of the HTML language, not the intention of the copyright holders."

    I'd argue that this isn't like specifying that a book cannot be photocopied, but that it is more like specifying that the pages of the book must be read in a particular order - trying to claim that reading the last page of a mystery novel first is a violation of copyright law. Such an argument wouldn't get very far.

    "So does the owner have the right to say you can only fetch pages from a server according to rules they might set? Quite possibly. Remember that while technologically it is difficult (but not quite impossible) to stop people from being able to fetch a component graphic from a web page or a sub-page buried under advertising supported menus, the whole purpose of copyright law is to provide legal protection for documents when technological protection is hard. You don't need legal copyright protection if technological protection is easy, after all. "

    It seems to me that it would be technologically easy to require that the referrer tag have some particular value in order to retrieve a particular file.

    A server could be set up to compare the value of the referrer to the expected value. If they matched, the file would be sent. If they failed to match, another page would be sent instead, making it clear that the copyright holder wishes that the requested file be viewed as part of a whole, and providing the URL for the page the copyright holder would like to have people start from.

    While it would be possible for someone to bypass such a mechanism, it would be _much_ more clear that they knew about the copyright holder's wishes, and that they were disregarding them.

    The way HTTP requests are structured, and the way web servers are typically configured, it seems to me that the intent is that all files which are accessible are available to be viewed in any order or no order at all.

    While I agree that some copyright holders may want something else, there should be some burden placed on them to make their wishes known. This could be done clearly, easily and concisely by checking the referrer and providing a page with a URL link when the referrer doesn't have the expected value.

    For a copyright holder to complain about deep linking when they haven't taken such a step is a lot like someone dumping copies of their book out of an airplane and complaining when people fail to pay for the copies. Both people are choosing a distribution medium which doesn't provide for their desires, then blaming the medium for their poor choice.

    The fact is that the web is well designed for sharing information without regard to order. If that isn't what someone wants to do, they should avoid using the web to distribute their information, or they should make the effort to change the details of distribution to more closely suit their desires.
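
    (The referrer check proposed above is a few lines of code. A sketch written as a C CGI program - the CGI interface really does expose the Referer header via the HTTP_REFERER environment variable, though the site URL here is made up:)

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        const char *referrer = getenv("HTTP_REFERER");
        const char *home     = "http://www.example-zine.com/";

        /* Deep link: state the publisher's wishes instead of serving the page. */
        if (referrer == NULL || strncmp(referrer, home, strlen(home)) != 0) {
            printf("Content-Type: text/html\r\n\r\n");
            printf("<p>This article is meant to be read as part of the site.\n");
            printf("Please start from <a href=\"%s\">the front page</a>.</p>\n", home);
            return 0;
        }

        printf("Content-Type: text/html\r\n\r\n");
        printf("<p>...the article itself would be served here...</p>\n");
        return 0;
    }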

  • Something that seems a bit disturbing to me about Brad Templeton's essay is the notion that putting a frame around someone else's web page constitutes a derivative work. Suppose that we consider Netscape Navigator (or any application) to be a "frame". Does this mean that if Microsoft were to put a legal restriction on their web site that it may only be accessed by IE, that they could take a $1000 (???) bite out of Netscape every time someone hit their site using Netscape?

    Since when does copyright law have any notion of restricting _HOW_ someone uses a copyrighted work, other than using it at all, copying it, and redistributing it? Would it be legally enforceable to disallow people from flipping through a magazine backwards?

    Do web site owners think that I have nothing better to do than sift through their horribly organized web sites to find the thing that someone else mentioned but were disallowed to link to? What a huge leap backwards! The other day there was mention of a story in the Washington Post but only a link to the front page of the Post. I got there and I couldn't frikkin find anything.

    There are already enough technological problems with interoperability, such as [all Microsoft products]. The last thing the web needs is legally enforced non-interoperability.
  • "3DFX Voodoo4/5, S3 Savage4/2000, Matrox G400/450, ATI Rage128, Nvidia TNT[2]: Much of the visual lushness will be missing due to the lack of bump mapping, but the game won't have any gaping holes. Most of these except the V5 probably won't have enough fill-rate to be very enjoyable.:

    If DOOM3 is as critical to accelerator success as Q3 and Q2 were, this could very well spell the end of 3dfx as we know it. As John said, the entire Voodoo series lacks dot product bump mapping, and in DOOM3, this will be used extensively. What does this mean? It means the difference between the textures in Quake3 at the highest texture level, and the wall textures in Half-Life: the Voodoo5, despite FSAA, will only be able to provide horrid texturing in DOOM3. Without texture clarity, the FSAA is worthless; the output will probably have the tris density of Quake3, but the texturing of GLQuake on the 3dfx 8-bit texture extensions (can you say, geezerware?). Unless the Rampage becomes real within a year or two, 3dfx could be in a very bad position.

  • As the webmaster of Doomworld.com this stuff is supposed to interest me, but hearing all about the new id engine really doesn't mean much in my book. All of the id engines have kicked ass and I have complete faith that this one will as well. The game, however, is what I want to know about. Unfortunately they have made it clear that we're not going to be hearing much about the gameplay previous to release... argh!
  • Yes. Just go to the /idstuff/quake3/source directory in /idgames3 of ftp.cdrom.com. There's the Quake3 source code, staring at you. Sure, it's an .exe, just use wine, you whiner.
  • Look at Q3DM12. What a big level. It turns out there's an areaportal inside each door closest to the central courtyard. This is very good, as it prevents the rendering of EVERYTHING behind the door when it's closed. I turned on the FPS counter and went behind the door on the rocket platform. Here are the results: door closed: 90 fps; door open: 45 fps.
    That's why there are portals.
  • > If DOOM3 is as critical to accelerator success as Q3 and Q2 were, this could very well spell the end of 3dfx as we know it.

    Doom3/Doom2K isn't going to be out for another YEAR. That leaves plenty of time for the Voodoo 6 (or 7 or whatever they call it ;-) to ship.
  • >entire Voodoo series lacks dot product bump
    >mapping, and in DOOM3, this will be used

    Thanks; I'll have nightmares about matrix manipulation all night now. Thanks a lot. :^)
  • You also forgot about the Commander Keen games John made. Those were the first games that used timers to control the speed at which the game runs. Prior to that, if you were to run, say, Joust on a 1GHz PC, it would be *completely* unplayable. But fire up Commander Keen, and it runs just as it should -- a major improvement.

    Also, the reason everyone is talking about 3D APIs, and FPS, and fill-rate is because, like it or not, it's the future of gaming. Even Diablo II, probably one of the last top-tier sprite games we'll ever see, uses Direct3D to accelerate and add more realism.

    It's getting to the point that a graphics card is almost as important to your over-all game playability as your CPU. Have a 1GHz PC and a TNT1? Don't even try playing most new games.

    John Carmack talks about this because people listen. He and id Software make *the best* 3D game engines. Whether or not you like the content of the game, you cannot complain about the engine. Not to mention John sits on the advisory board of nVidia and 3DFX. If those companies are listening to him, why aren't you? I don't think John ever claimed to talk about the plot line, or other creative items of the games he was working on. He talks about what he does for id Software: the engines.
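
    (The timer trick credited to Commander Keen above boils down to a fixed-timestep loop: advance game logic by wall-clock ticks, not by frames rendered, so the game runs at the same speed on any machine. A generic sketch in C - not id's actual loop, and RunOneGameTick is a made-up placeholder:)

    #include <stdio.h>
    #include <time.h>

    #define TICS_PER_SEC 35    /* Doom's tick rate, for flavor */

    int main(void)
    {
        clock_t start = clock();
        long    simulated = 0;

        while (simulated < TICS_PER_SEC * 2) {   /* run two seconds of game time */
            long elapsed = (long)((clock() - start) * TICS_PER_SEC / CLOCKS_PER_SEC);

            while (simulated < elapsed) {
                simulated++;                     /* RunOneGameTick() would go here */
            }
            /* A fast machine spins here (or renders extra frames); game
               logic still advances exactly TICS_PER_SEC per second. */
        }

        printf("simulated %ld tics\n", simulated);
        return 0;
    }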

  • this could very well spell the end of 3dfx as we know it.

    Don't worry - the Voodoo666 1.1G (short for 1100000000) will sport 2000 VSA-100 chips in parallel, and contain 32 gigabytes of RAM on the card. It will require a server rack to house the videocard, a basketball court to house the power supply, and liquid nitrogen cooling it at 62 gallons per minute.

    Unfortunately, the architecture used is not unlike the V5 and V6's now, which means the 64 gigabytes of RAM are shared between all 2000 processors, and that equates to 16 megabytes per chip. Sorry, no big textures for you!

  • dohma - I meant 32 gigabytes of RAM in the second paragraph.
  • It actually would've been funny if you talked about the right person. Do not pass go! Do not collect two hundred dollars!
    VoodooExtreme is set as my start page....

    You must be a pretty big loser.

    And the fact that you cut and pasted the entire day two interview in your post makes you a karma whore too.

  • by Junks Jerzey ( 54586 ) on Wednesday September 20, 2000 @08:39PM (#765547)
    If John Carmack isn't the founder of modern gaming, I don't know who is.

    Let's think about this. There were 3D games in the late 1970s and all through the 1980s. Remember Atari's Hard Drivin' coin-op from 1989? Fully 3D polygonal graphics, including realistic physics (and physics didn't start becoming a buzzword until 1997 or so). I could name dozens more 3D games released before 1992. The Wolfenstein 3D graphics technique of ray casting was used in a couple of games from 1983 (Way Out and Capture the Flag). There were multiplayer networked games before Doom and Quake, too. You need to learn your gaming history!

    Wolfenstein 3D and Doom jumpstarted 3D gaming on the PC. There were 3D PC games before that, but John Carmack did a bang-up job of bringing us all up to date. Doom and Quake (which, remember, had John Romero as co-designer), are linked to the rise of 3D graphics on the PC, and the rise of the First Person Shooter Genre.

    The mistake you are making is saying that gaming can be equated to these items. No, it cannot be. Consider Civilization, The Sims, X-Com, all the Ultima games (including Ultima Underworld), everything Sid Meier has done, everything Shigeru Miyamoto has done, the Freespace games, the Final Fantasy series, The Need for Speed games (which started before Quake, BTW), and so on and so on.

    A classic fanboy mistake is thinking that not only are the Quake and Unreal engines the epitome of 3D technology, but that the development of these engines is the foundation for gaming. Neither of these is true. It's just that all the other 3D game developers out there aren't poster children for PC graphics card manufacturers.

    Am I insulting John Carmack? Certainly not. But as for the fanboys who insist that Carmack and Sweeney are the Sole Carriers of The Gaming Torch -- well, they're just misguided.
  • 2 people modded it as "troll". Whatever. I applaud the one intelligent individual who modded it "underrated".
    ---
  • This guy has basically summed up the Napster issue quite well. "Stealing Interview by linking" == "Stealing music by downloading MP3s"
  • John Carmack is brilliant...

    I just wonder whether his next step will be to pioneer space travel now that he has read that rocket science book.
  • Did anyone else notice Gabe Newell's question regarding the targeting of the technology at broadband-capable systems? Maybe TF2 is already finished and they are just waiting for everyone to get Cable/xDSL so we can play it :)

    Maybe whilst they are waiting they could port it to Linux?

    As a programmer who has hacked about with OpenGL and as a gamer I would like to pledge my undying allegiance to John Carmack as IMNSHO he is a genius.

    Nearly as clever as these guys http://webpages.mr.net/bobz/ttyquake/ [mr.net]
  • If John Carmack isn't the founder of modern gaming, I don't know who is.

    Well, let's see. How about Nolan Bushnell? Ken Kutaragi? Shigeru Miyamoto? Get some perspective; id has invented a single genre of game. A very cool genre to be sure, but not the only one and not the most popular one either.

  • Larger, but with fewer polygons being drawn. This is still true of Unreal Tournament. Q3A can have far more polygons visible than UT. This is why UT levels tend to look blockier than Q3A levels. Both are great games and both reflect critical design paths chosen for the programming. Now, if we could just get Sweeney to add multi-threading to his next engine :) (I'm assuming Carmack will keep his for DOOM3.)
  • I've played Quake for a while now, and I still can't understand why rockets all fly at the same speed. If you're running at 20mph and shoot a rocket, the speed of the rocket should be 20mph + (speed of rocket)... am I wrong? If you step on one of those accelerator jump pad things in Quake 3, go flying in the air, and shoot a rocket, the speed of the rocket you just shot should be 300mph + (speed of rocket). But if you've played Quake before, you should know that you arrive at the destination before the rocket does. Maybe someone should tell John that he isn't following Newton's laws of motion.
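
    (The Newtonian behavior being asked for is a one-liner: add the shooter's velocity to the muzzle velocity. Quake deliberately launches at a fixed speed for gameplay reasons. A hypothetical sketch in C, with made-up numbers in Quake-ish units per second:)

    #include <stdio.h>

    typedef struct { float x, y, z; } vec3_t;

    /* Projectile launch velocity = shooter velocity + muzzle velocity. */
    static vec3_t LaunchRocket(vec3_t shooterVel, vec3_t aimDir, float muzzleSpeed)
    {
        vec3_t v;
        v.x = shooterVel.x + aimDir.x * muzzleSpeed;
        v.y = shooterVel.y + aimDir.y * muzzleSpeed;
        v.z = shooterVel.z + aimDir.z * muzzleSpeed;
        return v;
    }

    int main(void)
    {
        vec3_t running = { 320.0f, 0.0f, 0.0f };   /* player sprinting along x */
        vec3_t forward = { 1.0f, 0.0f, 0.0f };

        vec3_t v = LaunchRocket(running, forward, 1000.0f);
        printf("rocket speed along x: %.0f units/sec (vs. a fixed 1000)\n", v.x);
        return 0;
    }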
  • The third (and last) part of the interview is present on the site now, go here [voodooextreme.com].
  • Multiplayer deathmatch games are too haphazard and uncoordinated, whereas tight storydriven games are too delicate.

    I think that the best co-op I've played is System Shock 2. A lot of the problem with co-op comes from difficulty balancing - ultimately, a party of four in a co-op game can take four times the damage, carry four times as much stuff, etc...

    If you could complete Doom on Ultra-Violence, then co-op on Nightmare. Given that people re-spawn when they die, it's impossible to lose... then again, in System Shock 2, you re-spawn in single player as well.

    Ultimately, a game designed for co-op would need to have a dynamic difficulty, which made the game harder depending on the number of players. Facilities to split the party would be good, as these would heighten tension. When playing System Shock 2, my friends and I try to heighten the excitement by going as quickly as possible, often splitting into two or more groups to accomplish targets simultaneously. We're not forced to, but it makes the game so much more fun...

    Alex
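
    (The dynamic difficulty idea is easy to prototype. A hypothetical scaling rule in C - the formula and numbers are invented, purely illustrative: monster durability grows with party size while incoming damage grows more slowly, so bigger parties kill at the same per-player rate but still feel pressure.)

    #include <math.h>
    #include <stdio.h>

    /* Scale an encounter for a co-op party of 'players'. */
    static void ScaleEncounter(int players, float *hpMul, float *dmgMul)
    {
        *hpMul  = (float)players;           /* N players deal ~N times damage */
        *dmgMul = sqrtf((float)players);    /* incoming pain rises gently */
    }

    int main(void)
    {
        for (int n = 1; n <= 4; n++) {
            float hp, dmg;
            ScaleEncounter(n, &hp, &dmg);
            printf("%d players: monster HP x%.1f, monster damage x%.2f\n", n, hp, dmg);
        }
        return 0;
    }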

  • I fall for anything these days...

    --

  • Quake Army, one day, it will be here.
  • To me, PVS in Quake meant lean, fast speed in interior spaces and "sort-of courtyard" spaces. Ideal for deathmatch or mowing down monsters. Until someone integrates a true procedural world with detailed, high-poly landmarks that are custom artist/designer created, FPS will be forever "doomed" (no pun intended) to being creepy dungeons and weird-o quasi science fiction locations such as the ones I created myself in Unreal and Unreal Tournament. Designers of SP RPG titles will have to make "mission based" games that use "creepy dungeons and weird-o quasi science fiction locations" as stages for game play, since they quite simply cannot create REAL places of any immense size or configuration.

    I think the arrival of MMP projects that are incredible entertainment and make money will push engine renderer creators such as yourself and Tim to hopefully focus more on creating engine technology that can do truly amazing worlds that intrigue and inspire players, rather than better, faster "tournament arenas". But this is driven by the design and the idea of "what is fun" for people such as yourself. If you yourself played MMP titles, saw how utterly crappy their engines were, and vowed to change the way "3d gaming draws worlds", I would be overwhelmed with joy that finally someone truly capable saw the need. Or if Tim were to take his latest tech movie and show us a city filled with hundreds of players coming and going in a persistent world of grand detail, rather than "dude carrying flag being chased by dude", I would also be overwhelmed with joy.

    For me as a designer, I want to create worlds with meaning and longevity. But to date no one has risen to this challenge to provide persistent online entertainment technology the nitrous oxide it needs. I personally hope that reputable FPS engine writers will eventually expand their technologies to achieve more fantastic ways to render real worlds, and spend less time adding polys to deathmatch games. Of course this may all be part of the "master plan" - I'm crossing my fingers that it is.

    Myscha the sled dog
    T. Elliot Cannon, Co-Lead, CYAN
    myscha@cyan.com
  • John, if you had a ' in your code where it didn't belong...

    Christ, I hate to be so petty, but after seeing your tirade about Mr. Carmack's incorrect apostrophe I must point out your grammatical mistake. Your sentence should read "if you had AN ' in your code." The ' would be pronounced "apostrophe" if you were reading it out loud, and therefore should be preceded by an "an" not an "a."

    Ok I'm done. Now who's going to nitpick *MY* grammar?

    _____

    ToiletDuk (58% Slashdot Pure)
  • Why score 2? This should be a 5.
  • I prefer "tick" (and the respective back-tick) to "single quote". Fewer syllables. Also you can't differentiate ' from ` if you say single quote, because they are both single quotes. And back- single-quote sounds bad, but back-tick doesn't.
  • Beautiful, my man, beautiful.

    No one wants to steal anything from a news site; when we *really* want to steal content, we have Napster and Gnutella on our side.
  • John, are you saying that Unreal required too much hardware at the time it was released, or that it was the wrong thing even for the leading edge hardware at the time? Because while I agree that Quake 2 ran nicely on a Pentium 200 without MMX and Unreal didn't, the fact is that Unreal blew me away with its graphics on a PII-400 with a Voodoo2, which was relatively leading edge at the time Unreal was released.

    But honestly I have to say that the large outdoor areas that Unreal offered seemed to me to justify the hardware, and a year later I had to go back and play the game again because it was so nice to run through the huge outdoor levels like Vortex Rikers and Sunspire. I enjoyed Quake 2 for its multiplayer gaming goodness on mid-range machines, but its graphics just didn't drop my jaw the way Unreal's did. Also, it seemed that Quake 2 wasn't as scalable for large outdoor levels the way Unreal was. I could make huge homemade levels without running vis and still have decent framerates - something I wasn't able to achieve with the Quake 2 engine, even after hours of vis'ing.

    So to me, to say that what Tim did was wrong may be true if you're taking the view that a game should run well on mid-range machines at the time of its release, but isn't true if you're targeting a game for leading edge hardware and want to push the envelope more.
  • I can't stand voodoo - I can't stand games - I can't stand magicians who mess up the web with games - I can't stand people abusing users' compulsiveness - I can't stand smart people who use their intelligence for luring other people into degrading experiences - I can't stand the game industry - I can't stand people who think being intelligent is good enough - why don't you play your stupid games somewhere else - I know what the web is all about and you can't have it (said someone whose name I have forgotten).

    No one brave enough to question extreme voodoo-ism?
    What a bunch of losers.

"All the people are so happy now, their heads are caving in. I'm glad they are a snowman with protective rubber skin" -- They Might Be Giants

Working...