VoodooExtreme Interview With John Carmack 70
We've had quite a number of submissions concerning the VoodooExtreme interview with John Carmack. Day One is on the site, as well as Day Two. Day Three goes up (Surprise!) tomorrow - so check back there tomorrow, 'cuz I'm not posting it again. *grin*
voodoo6 question (Score:1)
Because, from all the comparisons I've seen of the new Voodoo cards vs. the NVIDIA and Riva cards, Voodoo is losing the FPS war. Badly.
Re:WTF is this? (Score:5)
Well, today we learned:
Nobody understands the need for a DVD full of Linux apps
There's legal liability in having a web site
Voodoo cards sure are zippy
"Hackers" is an old book, and we don't care anymore
CueCat is as doomed as everyone thought it was
Stuff's still happening in the Mozilla world
The FBI is evil
Robots are cool
OS-X has a BSD kernel, which should be nifty
Geeks live in houses that any right-thinking person would avoid
AMD proves that Moore's law ain't croaked yet
Re:WTF is this? (Score:2)
Hemos posted that Voodoo Extreme has an interview with John Carmack on their site.
the interview is broken into 3 parts.
The 3 parts are called Day 1, Day 2, and Day 3.
Day 1 and Day 2 are already available on their site. Day 3 will be available tomorrow.
Hemos specifically says he's not posting anything about it tomorrow.
Darth -- Nil Mortifi, Sine Lucre
Re:WTF is this? (Score:1)
Oh, I read "there" as "here"...
---
It seems.... (Score:1)
And this isn't meant as flamebait, but I am sure flames will ensue...
Re:Why does Slashdot keep stealing content? (Score:1)
Starting to tire of technology passed off as games (Score:5)
Obviously, though, this is all very technology oriented. There's more to games than that. It gets tiring to read interviews in which he is called the Top Dog of computer games, and all the questions are about 3D APIs and which video card is best and what console has faster hardware. In short, he's The King of 3D Tech on the PC, but this is being equated with the driving pulse of computer games. In a way it's sort of depressing that PC gaming has been reduced to video cards and benchmarks. This isn't Mr. Carmack's fault, of course, but it all feels very materialistic and empty.
Re:voodoo6 question (Score:1)
Re:Why does Slashdot keep stealing content? (Score:1)
NEWSFLASH: NET DESIGNED FOR LINKING (Score:2)
It is ideas like the one presented here that threaten the nature of the 'net. Some people would like to modify the HTML standard so that each link has a Copyright field in it, so you can't go there unless your browser is authorized to. Or how about compiling HTML so other people can't view your code? Let's make the net read-only and royalty based, and lock out the information have-nots and poor and create a digital divide, setting up different classes of citizens!
It quickly snowballs, you see, and not for the better. Slashdot needs no such permission; if a site wants to make sure people don't deep link, it can do like the NY Times and require registration before you can view. It's called fair use - Slashdot doesn't profit directly from linking, or not linking, to that article. And btw - I thought the goal of a magazine was to GET READ... putting up barriers to linking only encourages people NOT TO READ your 'zine. Or make dynamic links that break every 15 minutes or so. Go ahead, destroy your website and make people want to go elsewhere if that's your game plan, but keep your laws off my content.
--
Portals ? (Score:2)
Re:Slashdot stealing content? (Score:1)
Re:Why does Slashdot keep stealing content? (Score:2)
I really don't care if I come off like a flamer, but what planet are you from? If you haven't noticed, most articles posted on Slashdot have both the root web address for the site and the address for the article. Quite frankly, I don't want to always surf through crap to find what I'm looking for. Most don't. Besides, sites that are updating news and content often change their entry pages. So, this Carmack interview may be hard to locate in three days.
Also, many sites on the web overdo the banners and revenue stream thing anyway. It's like, click next page 10 times for something that could fit on one page. I click the printer-friendly formats whenever possible just to skip the crap. Banners and gifs and crap on the top, bottom, left and right. You readers know what I'm talking about.
This is lame anyway. Permission? They posted it. It's the web. We don't stand for that crap. Some stupid judge in New York may call linking wrong, but we don't. By the way, the search engines are stealing their content too if you apply the same logic.
You should work for the RIAA or the MPAA.
Re:Starting to tire of technology passed of as gam (Score:1)
Not to sound like some sort of bandwagon jumper-oner.
But without people like John C. and the Unreal team, there would be no new graphics technology. We would be forced to find our graphics enjoyment in the realm of console gaming. Which is good, but it's nice to be able to get a really awesome card during that 1-2 year period when gaming consoles don't improve.
Hardware/Software are a funny thing because hardware drives the software to new heights and without the software, nobody would buy the hardware.
Re:Why does Slashdot keep stealing content? (Score:3)
To the uninitiated: VoodooExtreme is a news collection site. While it produces some of its own content, much of it is just links all over the web. You know, like slashdot. They excuse it by giving credit to all the contributors.
It's a thin line, though, between performing a service by seeking out cool stories, and just ripping off other sites' news. This point was best made by OldManMurray, who just linked to all of VoodooExtreme. It used to be at this link to "marvin sedate" [oldmanmurray.com], but that gave me some odd redirect to here [oldmanmurray.com], so they probably stopped doing it back in February, which is what the latter links to. Still funny.
Stealing revenue? I think not. (Score:1)
If someone from slashdot goes to this interview it means they weren't reading it from VoodooExtreme in the first place. This means if Slashdot hadn't linked to it they prolly never would have read the story... now that it has been linked from TrollDot lots of new people are going to read the interview. Slashdot has in fact increased the number of AD impressions and revenue for the site overall.
You obviously have never had to deal with issues like this before... people linking into your site is great. It brings you a new untapped market that may potentially come back for more.
---
Solaris/FreeBSD/Openstep/NeXTSTEP/Linux/ultrix/OS
Re:Why does Slashdot keep stealing content? (Score:2)
Here's an article all about deep linking and its legal ramifications.
GigaLaw Deep Linking Article [gigalaw.com]
Also, as far as copyright violation goes, at least one U.S. District Judge disagrees with Brad Templeton as he ruled that deep linking did not constitute copyright violation. (it's on page 4, Ticketmaster vs. Tickets.com)
(well, to be honest he said it's not copyright violation as long as it's clear that you are being transported to another site....so no opening them into frames).
Realistically, I think Voodoo Extreme is making a lot more money off of everyone at Slashdot jumping directly to the article and being served the ad banner that's on the article pages. (I'm assuming there's one there, since they are slashdotted and I can't check... but it's a pretty safe assumption, considering I've never seen a news site that only served ads on its main page.)
Darth -- Nil Mortifi, Sine Lucre
Re:Seeing as how... (Score:1)
Re:Starting to tire of technology passed of as gam (Score:2)
Of course, I agree that most games put way too much emphasis on the flash and bang, but otoh don't underestimate the importance of a good engine, either. You can hate the industry's focus on fillrates, but as long as those fillrates aren't high enough, you can simply forget about Doom-like rooms with literally 70-or-so imps in them -- unless you go back to 2D sprites. Yeah, that's an option, but no publisher'll buy it. So, for every 10 games that come out that are all about the tech & gfx but offer no gameplay, there is one game out there that has a designer who really wants the tech for his gameplay ideas. That's probably a fair ratio. You can argue in what category id's products fall -- but at the very least it's clear into which some of their licensees go (HalfLife anyone ?). So give 'm a break ;)
Re:Seeing as how... (Score:1)
Well, I am going to bite.
Carmack never said he'd make you his bitch; that would be Romero, talking about Daikatana.
$var = <STDIN>;
$var =~ s/\\$//;
Re:Why does Slashdot keep stealing content? (Score:1)
But I find it rather interesting that you consider webbing (i.e., linking to other sites) something immoral to do on the web. Last time I checked, that's why we have this thing called the World Wide Web, as opposed to the world wide set of non-interlinked sites. Linking is the reason we have a web.
Moreover, I find it bothersome that you consider it a bad thing that /. sends customers to other people's pages. Sure, it would make sense if they were sending us to a frame that contains only the story, but they aren't; the link sends us to a VoodooExtreme page. A full page. If VoodooExtreme only bothers to put banners on their front page, it's their fault. More likely they have banners on every page. Also, just by the fact that we are now on VoodooExtreme's page, it would seem to me that a good number of people would actually look around other, unrelated pages on VoodooExtreme just because we are in the neighborhood. I know I do that all the time. Therefore, /. is supplying VoodooExtreme with customers, yet again.
In all, I find this stance ridiculous, if only for the fact that if /. never linked to a page, we wouldn't be part of the "web." And if they only linked to the front page, I sure as hell know I would never follow a link from /. again. (Though I am sure a lot of sysadmins would be damned happy about not having to face being /.'d.)
Just a little logic in the hands of an illogical computer user.
3dfx? (Score:1)
Re:Why does Slashdot keep stealing content? (Score:1)
revenue stream -- in effect, they're stealing VoodooExtreme's interview.
Wrongo.
You are so incredibly negative. Do you ever have any_damn_thing positive to say?
Did you even go to the page and read the interview?
Gaming sites have ads on every page, including interviews.
I would say that slashdot actually generated some revenue for voodooextreme by sending people to that interview.
Whatever (Score:1)
1. Technology to discourage deep linking exists.
2. Once published online, there's NOTHING anyone can do to prevent the content from propagating, either as a link or a copy. At least Slashdot links to the site rather than doing a lynx -dump and posting that on their own site.
3. I wouldn't want it any other way. I have no interest in reading VoodooExtreme regularly to find the articles that interest me, nor would I be impressed if SlashDot posted a "there's a John Carmack interview somewhere on VoodooExtreme...go find it if you can!"
4. The issue of whether deep linking or mirroring is "wrong" is not clear-cut as you suggest with your "do the right thing" comment.
5. If VoodooExtreme asked Slashdot to remove the link, I bet Slashdot would be happy to oblige. I don't think VE cares that much. It's still publicity.
6. If I say "I saw an article in the local paper on page B6 about increasing medical costs..." I am not cheating the newspaper out of all the "page views" they would have gotten if I had left out the page number.
Re:Portals ? (Score:4)
The options are to either do PVS with a simplified version of the world, or ignore the geometry and just work with portal topology.
Unreal used a scan-line visibility algorithm, which crippled its ability to have high poly counts or high framerates with hardware accelerators.
Tim Sweeney knows full well that the architecture is very wrong for modern systems, but many of the original decisions were based on earlier software technologies. Unreal was supposed to be a "killer app" for the Pentium-200 MMX processor.
I have a lot of respect for Tim and Unreal, but the visibility algorithm in Unreal turned out to be a bad call. He is changing it for future work.
John Carmack
Re:l33t-gamer (Score:1)
Slashdotted!! (Score:1)
VoodooExtreme is set as my start page... and I always wondered if that TI-85 they run the site off of would get Slashdotted... looks like they are!
Good thing I already read the article .
-Julius X
Since it is /.'d---This is the second part. (Score:2)
--
Voodoo Extremist Chris Rhinehart; Human Head Studios -- From what I've read, Doom3 is intended to have a strong single-player experience. What do you anticipate to be the biggest design hurdles to overcome while creating Doom3, as opposed to designing a title intended primarily for multiplayer?
John Carmack -- We sort of went into Q3 thinking that the multi-player only focus was going to make the game design easier. It turned out that the lack of any good unifying concept left the level designers and artists without a good focal point, and there was more meandering around than we cared for. The hardest thing is deciding what to focus on, because DOOM meant different things to different people. We have decided to make the single player game story experience the primary focus, but many people would argue that DOOM was more about the multi-player.
Voodoo Extreme -- When do you think computers will become fast enough so that developers can dump BSP based VSD algorithms for more flexible ones?
John Carmack -- I think this has been mis-characterized for a long time - none of the Quake games have had what I would call a "BSP based VSD algorithm". The visibility associated with Quake is a cluster-to-cluster potentially visible set (PVS) algorithm, masked by an area connectivity graph (in Q2 and Q3), followed by hierarchical frustum culling (which does use the BSP). The software renderers then performed an edge-based scan-line rasterization algorithm, which resulted in zero overdraw for the world.
Early in Q1's development, I pursued "beam trees", which were truly a BSP based visibility algorithm that did exact visibility by tracking unfilled screen geometry going front to back, but the log2 complexity scaling factor lost out to the constant complexity factor from the PVS.
That highlights an important point that some graphics programmers don't appreciate properly - it is the performance of the entire system that matters, not a single metric. It is very easy to go significantly slower while drawing fewer primitives or with less overdraw, because you spent more time deciding which ones not to draw than it would have taken to draw them in a more optimized manner. This applies heavily to visibility culling and level of detail work, and is much more significant now with geometry processors and static meshes.
The PVS system had two significant benefits: constant time lookup, and complete automation (no designer input required).
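To make that constant-time property concrete, here is a minimal sketch in Python; the packed bit-vector layout is my assumption for illustration, not Quake's actual on-disk format:

```python
# Sketch of a cluster-to-cluster PVS lookup: one packed row of
# visibility bits per cluster, precomputed offline. The layout is an
# illustrative assumption, not Quake's actual on-disk format.

class PVS:
    def __init__(self, num_clusters):
        self.num_clusters = num_clusters
        row_bytes = (num_clusters + 7) // 8
        self.rows = [bytearray(row_bytes) for _ in range(num_clusters)]

    def mark_visible(self, src, dst):
        self.rows[src][dst >> 3] |= 1 << (dst & 7)

    def is_visible(self, src, dst):
        # Constant time: one indexed byte fetch and a bit test,
        # regardless of world complexity.
        return bool(self.rows[src][dst >> 3] & (1 << (dst & 7)))

pvs = PVS(64)
pvs.mark_visible(3, 17)
print(pvs.is_visible(3, 17))  # True
print(pvs.is_visible(3, 18))  # False
```

Whatever the real format, the point stands: the runtime cost of a visibility query is a fixed bit test, paid for by the offline pre-processing.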
Through Q2 and Q3, the "complete automation" advantage started to deteriorate, as designers were coerced into marking more and more things as detail brushes to speed up the processing, placing hint brushes to control the cluster sizes, or manually placing area-portals.
The principal drawbacks of the PVS are the large pre-processing time, the large storage space cost, and the static nature of the data.
The size and space drawbacks were helped with detail-brushes, which basically made a more complex map seem less complex to the visibility process, but they required the level designers to pro-actively take action. It has been interesting to watch the designers' standard practices. Almost nobody just picks a policy like "all small trim will be detail brushes". Instead, they tend to completely ignore detail brushes until the map processing time reaches their personal pain threshold. Here at Id, we usually didn't let maps take more than a half-hour to process (on our huge 16 CPU server.), but I heard tales from other companies and the community of maps that were allowed to take overnight or all weekend to vis. That is a mistake, but the optimize-for-vis-time guidelines are not widely understood.
The static nature of a pre-computed PVS showed up most glaringly when you had your face in front of a closed door, but the game was running slow because it was drawing everything behind the door, then drawing the door on top of it. I introduced areaportals in Q2 to allow designers to explicitly allow large sections of the vis to be pruned off when an entity is in a certain state. This is much more efficient than a more generalized scheme that actually looked at geometric information.
In the Q1 timeframe, I think the PVS was a huge win, but the advantage deteriorated somewhat as the nature of the rendering datasets changed.
In any case, the gross culling in the new engine is completely different from previous engines. It does require the designers to manually place portal brushes with some degree of intelligence, so it isn't completely automated, but I expect that for commercial-grade levels, there will be fewer portal brushes than there currently are hint brushes. It doesn't have any significant pre-processing time, and it is an exact point-to-area solution, instead of cluster-to-cluster. There will probably also be an entity-state based pruning facility like areaportals, but I haven't coded it yet.
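A toy sketch of the portal idea (a one-dimensional analogue I made up for illustration, not the new engine's actual code): visibility is found at runtime by recursively narrowing the view window through each portal, and a room is culled as soon as the window closes.

```python
# Toy sketch of exact point-to-area portal culling, reduced to one
# dimension: each portal is an interval, and a room stays visible only
# while some chain of portals leaves a non-empty view window. Purely
# illustrative; assumes an acyclic portal chain for brevity.

def visible_areas(areas, start, window=(-1e9, 1e9)):
    """areas maps area_id -> [(neighbor_id, (lo, hi)), ...]."""
    seen = {start}

    def recurse(area, win):
        for neighbor, (lo, hi) in areas.get(area, []):
            # Clip the current view window against this portal.
            new_win = (max(win[0], lo), min(win[1], hi))
            if new_win[0] < new_win[1]:  # window still open?
                seen.add(neighbor)
                recurse(neighbor, new_win)

    recurse(start, window)
    return seen

rooms = {
    0: [(1, (0.0, 4.0))],  # room 0 sees room 1 through this portal
    1: [(2, (3.0, 6.0))],  # room 1 sees room 2
    2: [(3, (5.0, 8.0))],  # 0 -> 1 -> 2 narrows to (3, 4); (5, 8) misses
}
print(sorted(visible_areas(rooms, 0)))  # [0, 1, 2]
```

In a real 3D engine the "window" is a clipped view frustum rather than an interval, but the recursive narrowing is the same idea, and it needs no pre-processing pass.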
Voodoo Extreme -- The shader rendering pipeline [in DOOM 3] - completely re-written from Quake III? How are you going to handle the radically different abilities of today's cards to produce a similar visual effect on each? For example, I'm thinking of the presence or non-presence of register combiners, and the different implementations of these extensions.
John Carmack -- The renderer is completely new, and very different in structure from previous engines. Interestingly, the interface remained fairly close for a long time, such that I was able to develop most of the DOOM renderer using the rest of Q3 almost unmodified. It finally did diverge, but still not too radically.
The theoretically ideal feature set for a 3D accelerator would be:
Many texture units to allow all the lighting calculations to be done in a single pass. I can use at least eight, and possibly more if the reflection vector math needs to burn texture units for its calculations. Even with the exact same memory subsystem, this would more than double the rendering speed over a current dual texture chip.
Flexible dependent texture reads to allow specular power function lookups and non-triangulation dependent specular interpolation. No shipping card has this yet. I was initially very excited about the possibility that the ATI Radeon would be able to do some of this, but it turns out to not quite be flexible enough. I do fault Microsoft for adopting "bumped environment mapping" as a specialized, degenerate case of dependent texture reads.
Dot3 texture blending. This is critical for bump mapping. Embossing and bump env mapping don't cut it at all. GeForce and Radeon have this now, and everyone will follow.
Flexible geometry acceleration. I can't use current geometry accelerators to calculate bumped specular, so the CPU must still touch a lot of data when that feature is enabled. Upcoming geometry processors will be powerful enough to do it all by themselves. I could also use multiple texture units to get the same effect in some cases, if the combiners are flexible enough.
Destination alpha and stencil buffer support are needed for the basic functioning of the renderer. Every modern card has this, but no game has required it yet.
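For readers wondering what the Dot3 blending mentioned above actually computes: per pixel, it is a clamped dot product between a normal fetched from a texture and the light direction. A scalar Python sketch (the encoding convention and sample values are illustrative assumptions):

```python
# Per-pixel Dot3 bump lighting, in scalar Python for illustration.
# In hardware the combiner does this per fragment; normals are stored
# in a texture, remapped from the [0, 255] range to [-1, 1].

def decode_normal(rgb):
    """Expand an 8-bit-per-channel normal-map texel to [-1, 1]."""
    return tuple(c / 127.5 - 1.0 for c in rgb)

def dot3(n, l):
    """Clamped dot product: diffuse intensity for a unit light vector."""
    return max(0.0, sum(a * b for a, b in zip(n, l)))

# A texel encoding a normal pointing straight out of the surface,
# lit head-on, gives full intensity.
texel = (128, 128, 255)
light = (0.0, 0.0, 1.0)
print(round(dot3(decode_normal(texel), light), 3))  # 1.0
```

Embossing only approximates this with image-space subtraction, which is why it breaks down at glancing light angles where the true dot product does not.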
The ideal card for DOOM hasn't shipped yet, but there are a couple good candidates just over the horizon. The existing cards stack up like this:
Nvidia GeForce[2]: We are using these as our primary development platform. I play some tricks with the register combiners to get a bit better quality than would be possible with a generic dual texture accelerator.
ATI Radeon: All features work properly, but I needed to disable some things in the driver. I will be working with ATI to make sure everything works as well as possible. The third texture unit will allow the general lighting path to operate a bit more efficiently than on a GeForce. Lacking the extra math of the register combiners, the specular highlights don't look as good as on a GeForce.
3DFX Voodoo4/5, S3 Savage4/2000, Matrox G400/450, ATI Rage128, Nvidia TNT[2]: Much of the visual lushness will be missing due to the lack of bump mapping, but the game won't have any gaping holes. Most of these except the V5 probably won't have enough fill-rate to be very enjoyable.
3DFX Voodoo3, S3 Savage3D/MX, Matrox G200, etc: Without a stencil buffer, much of the core capabilities of the renderer are just lost. The game will probably run, but it won't be anything like we intend it to be viewed. Almost certainly not enough fill rate.
Voodoo Extreme -- The game side is C++, why not the rest of the code?
John Carmack -- It's still a possibility, but I am fairly happy with how the internals of the renderer are represented in straight C code.
Voodoo Extremist Gabe Newell; Valve Software -- John has consistently made very clear decisions about the scope of projects id has undertaken, which I would say is one of the main reasons id has been such a consistent producer over an extended period of time. Not having spoken with John about it directly, I think I understand his rationale for focusing id on the Doom project. For the benefit of other developers, are there a couple of heuristics John uses to decide what does and doesn't make sense to undertake on a given project?
John Carmack -- The basic decision making process is the same for almost any choices: assess your capabilities, value goals objectively, cost estimate as well as you can, look for synergies to exploit and parasitic losses to avoid. Maximize the resulting values for an amount of effort you are willing to expend.
Computer games do have some notable aspects of their own, though. Riding the wave of Moore's Law causes timeliness to take on a couple new facets. Every once in a while, new things become possible or pragmatic for the first time, and you have an opportunity to do something that hasn't been seen before, which may be more important than lots of other factors combined.
It also cuts the other way, where something that would have been a great return on the work involved becomes useless or even a liability when you miss your time window. Several software rendering engines fell into that category.
-Julius X
Re: (Score:2)
Oh no (Score:1)
This is *not* interesting (Score:3)
GPL is meant to allow people the freedom of using code. It means anyone anywhere anytime can use and hack and play with the code to suit their needs, to scratch their itches.
It just so happens that GPL would not help id. It would not even help the community, I think, because anyone who can casually jump in and 'browse' and edit and play with the code, would probably be able to write this kind of stuff from scratch in the first place. It's highly specialized, highly tuned, highly precise code to do things tight and fast.
I guess it would benefit non-owners/writers of code if we could look through it and learn from it, but that is almost exactly why he GPLs his old out of date code. It's more useful as training material than it is for release/sales.
If someone wants a GPL high performance 3d engine, there is Crystal Space. Otherwise, ID owns Quake3, and can decide *when* and *if* to GPL it. No one else.
The nick is a joke! Really!
Re:It seems.... (Score:2)
Can't exactly forgo it, though! (Score:3)
ID does what it does well, and everyone else mooches off them. If no one is doing what you need, in terms of games, then go do the noble, honorable, open source thing, and go scratch your itch. Buy a copy of Quake[123] and go brew your own game. Code your own logic, make your own models, map your own levels, monsters, weapons, stories, etc. And then release it/sell it/distribute it. Because others are just as weary as you are, and will give you much praise/wealth/accolades for your contribution to their life.
The nick is a joke! Really!
Misconceptions (Score:2)
Well, have you ever heard of a little game called DooM? Or how about Quake? Or, if you want to get old school, how about Wolfenstein 3D? Wolfenstein was simply the first first-person shooter EVER. That's Ever with a big E. DooM is the most successful computer game on the planet. Hands down. Quake started the online multiplayer movement. There would be no EverQuest or Diablo II massive-multiplayer abilities if Quake hadn't broken the old creed that multiplayer wasn't viable. QuakeWorld quickly and simply made everyone turn their heads and wake up to the fact that millions of people would like to play other people far distances from them. In real time. Without the lag that NetQuake suffered. QuakeWorld still has thousands of people playing it today. DooM was the first to let you build your own levels (this is debatable, as you could build them in Wolf3D, and perhaps Rise of the Triad, but I'm not sure about the latter). Quake introduced QuakeC, allowing people to make their own games that had NOTHING to do with the original game. Ever heard of Quake Rally, the racing game? Or how about Air Quake, where you got to fly around in airplanes and tanks and such?
If John Carmack isn't the founder of modern gaming, I don't know who is. The reason he's speaking only of 3d graphics, algorithms and such is because id is keeping its mouth shut on Doom 3 or Doom2k or whatever they're calling it. What else is there to talk about when you can't talk of the game? The technology that makes them work.
Woohoo, it's JC(at least, it's one of them ^^) (Score:2)
I don't think I've found a good one yet; Diablo and other such games have some of the flavor, but don't quite have the teamwork or shared experience that Doom [12] did.
Multiplayer deathmatch games are too haphazard and uncoordinated, whereas tight story-driven games are too delicate. Is there a way to create a *shared* experience game? I guess a war sim, with players as soldiers in a platoon, might do it.
I think it just requires the proper matching of story, plot, and gameplay to do it, and that it isn't impossible, even if it hasn't quite been done yet either. Think of movies that have group dynamics, and try to capture some of that in a game. I guess I'm rambling now.
Still, I'm looking forward to Doom3, if only to see what ID puts out this time...
The nick is a joke! Really!
Re:MODERATORS MOD THIS UP!!!! (Score:1)
NEWSFLASH: YHBT YHL HAND (Score:2)
Basic structure of the web (Score:1)
This allows for much more useful and dynamic configurations than a less fully interconnected graph would have. It also means that every web page must stand (or fail) on its own. On a fundamental level, every file, be it text, image, or other, is offered independently by the web server.
Any file with a unique permanent URL can be linked to, or even included in (via frames or tags) any HTML page, or fetched directly by URL. That is the basic nature of the web. Creating a business model which ignores the way the world is is unwise. Trying to change reality to support that business model is arrogant and stupid.
Re:Why does Slashdot keep stealing content? (Score:2)
I'm writing to offer another perspective on the deep linking issue, in response to your page at
http://www.templetons.com/brad/linkright.html
" The trick is that there may be restrictions on how the pages are fetched that the owner wishes to set. In particular, many web pages are composite documents, consisting of several items, such as graphics and text, and are not meant to be viewed in their individual parts. That they can be viewed as independent parts is an artifact of the HTML language, not the intention of the copyright holders."
I'd argue that this isn't like specifying that a book cannot be photocopied, but that it is more like specifying that the pages of the book must be read in a particular order, trying to claim that reading the last page of a mystery novel first is a violation of copyright law. Such an argument wouldn't get very far.
"So does the owner have the right to say you can only fetch pages from a server according to rules they might set? Quite possibly. Remember that while technologically it is difficult (but not quite impossible) to stop people from being able to fetch a component graphic from a web page or a sub-page buried under advertising supported menus, the whole purpose of copyright law is to provide legal protection for documents when technological protection is hard. You don't need legal copyright protection if technological protection is easy, after all. "
It seems to me that it would be technologically easy to require that the referrer tag have some particular value in order to retrieve a particular file.
A server could be set up to compare the value of the referrer to the expected value. If they matched, the file would be sent. If they failed to match, another page would be sent instead, making it clear that the copyright holder wishes that the requested file be viewed as part of a whole, and providing the URL for the page the copyright holder would like to have people start from.
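A minimal sketch of such a referrer check, using Python's standard http.server (the hostnames, paths, and response bodies here are hypothetical):

```python
# Minimal referrer-gating sketch. If the Referer header doesn't name
# the expected entry page, serve a pointer to that page instead of
# the requested file. Hostnames and bodies here are hypothetical.

from http.server import BaseHTTPRequestHandler, HTTPServer

EXPECTED_PREFIX = "http://www.example.com/"

def choose_body(referer: str) -> bytes:
    """Pick a response body based on the Referer header value."""
    if referer.startswith(EXPECTED_PREFIX):
        return b"...the deep-linked article..."
    return b"Please start from our front page: http://www.example.com/\n"

class RefererGate(BaseHTTPRequestHandler):
    def do_GET(self):
        body = choose_body(self.headers.get("Referer", ""))
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve for real (blocks forever):
# HTTPServer(("localhost", 8080), RefererGate).serve_forever()
```

Note that the Referer header is supplied by the client and is trivially forged or omitted, which is exactly why bypassing such a gate would clearly signal that the visitor knew the site's wishes.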
While it would be possible for someone to bypass such a mechanism, it would be _much_ more clear that they knew about the copyright holder's wishes, and that they were disregarding them.
The way HTTP requests are structured, and the way web servers are typically configured, it seems to me that the intent is that all files which are accessible are available to be viewed in any order, or no order at all.
While I agree that some copyright holders may want something else, there should be some burden placed on them to make their wishes known. This could be done clearly, easily and concisely by checking the referrer and providing a page with a URL link when the referrer doesn't have the expected value.
For a copyright holder to complain about deep linking when they haven't taken such a step is a lot like someone dumping copies of their book out of an airplane and complaining when people fail to pay for the copies. Both people are choosing a distribution medium which doesn't provide for their desires, then blaming the medium for their poor choice.
The fact is that the web is well designed for sharing information without regard to order. If that isn't what someone wants to do, they should avoid using the web to distribute their information, or they should make the effort to change the details of distribution to more closely suit their desires.
Re:Why does Slashdot keep stealing content? (Score:1)
Since when does copyright law have any notion of restricting _HOW_ someone uses a copyrighted work, other than using it at all, copying it, and redistributing it? Would it be legally enforceable to disallow people from flipping through a magazine backwards?
Do web site owners think that I have nothing better to do than sift through their horribly organized web sites to find the thing that someone else mentioned but were disallowed to link to? What a huge leap backwards! The other day there was mention of a story in the Washington Post but only a link to the front page of the Post. I got there and I couldn't frikkin find anything.
There are already enough technological problems with interoperability, such as [all Microsoft products]. The last thing the web needs is legally enforced non-interoperability.
DOOM3 might be the nail in the 3Dfx coffin (Score:2)
If DOOM3 is as critical to accelerator success as Q3 and Q2 were, this could very well spell the end of 3dfx as we know it. As John said, the entire Voodoo series lacks dot product bump mapping, and in DOOM3, this will be used extensively. What does this mean? It means the difference between the textures in Quake3 at the highest texture level, and the wall textures in Half-Life: the Voodoo5, despite FSAA, will only be able to provide horrid texturing in DOOM3. Without texture clarity, the FSAA is worthless; the output will probably have the tris density of Quake3, but the texturing of GLQuake on the 3dfx 8-bit texture extensions (can you say, geezerware?). Unless the Rampage becomes real within a year or two, 3dfx could be in a very bad position.
Blah (Score:2)
Re:John Carmack already released it. (Score:2)
Re:Portals are good! (Score:2)
Re:DOOM3 might be the nail in the 3Dfx coffin (Score:1)
Doom3/Doom2K isn't going to be out for another YEAR. That leaves plenty of time for the Voodoo 6 (or 7 or whatever they call it).
Re:DOOM3 might be the nail in the 3Dfx coffin (Score:1)
>mapping, and in DOOM3, this will be used
Thanks; I'll have nightmares about matrix manipulation all night now. Thanks a lot.
Re:Misconceptions (Score:2)
Also the reason everyone is talking about 3D API's, and FPS and fill-rate is because, like it or not, it's the future of gaming. Even Diablo-II, probably one of the last top-tier sprite games we'll ever see, uses Direct3D to accelerate and add more realism.
It's getting to the point that a graphics card is almost as important to your overall game playability as your CPU. Have a 1G PC and a TNT1? Don't even try playing most new games.
John Carmack talks about this because people listen. He and id Software make *the best* 3D game engines. Whether or not you like the content of the games, you cannot complain about the engines. Not to mention John sits on the advisory boards of nVidia and 3dfx. If those companies are listening to him, why aren't you? I don't think John ever claimed to talk about the plot line, or other creative items of the games he was working on. He talks about what he does for id Software: the engines.
Re:DOOM3 might be the nail in the 3Dfx coffin (Score:2)
Don't worry - the Voodoo666 1.1G (short for 1100000000) will sport 2000 VSA-100 chips in parallel, and contain 32 gigabytes of RAM on the card. It will require a server rack to house the videocard, a basketball court to house the power supply, and liquid nitrogen cooling it at 62 gallons per minute.
Unfortunately, the architecture used is not unlike the V5 and V6's now, which means the 32 gigabytes of RAM are shared between all 2000 processors, which equates to 16 megabytes per chip. Sorry, no big textures for you!
Re:DOOM3 might be the nail in the 3Dfx coffin (Score:2)
Re:Seeing as how... (Score:2)
Re:Slashdotted!! (Score:2)
You must be a pretty big loser.
And the fact that you cut and pasted the entire day two interview in your post makes you a karma whore too.
Re:Misconceptions (Score:3)
Let's think about this. There were 3D games in the late 1970s and all through the 1980s. Remember Atari's Hard Drivin' coin-op from 1989? Fully 3D polygonal graphics, including realistic physics (and physics didn't start becoming a buzzword until 1997 or so). I could name dozens more 3D games released before 1992. The Wolfenstein 3D graphics technique of ray casting was used in a couple of games from 1983 (Way Out and Capture the Flag). There were multiplayer networked games before Doom and Quake, too. You need to learn your gaming history!
Wolfenstein 3D and Doom jumpstarted 3D gaming on the PC. There were 3D PC games before that, but John Carmack did a bang-up job of bringing us all up to date. Doom and Quake (which, remember, had John Romero as co-designer), are linked to the rise of 3D graphics on the PC, and the rise of the First Person Shooter Genre.
The mistake you are making is saying that gaming can be equated to these items. No, it cannot be. Consider Civilization, The Sims, X-Com, all the Ultima games (including Ultima Underworld), everything Sid Meier has done, everything Shigeru Miyamoto has done, the Freespace games, the Final Fantasy series, The Need for Speed games (which started before Quake, BTW), and so on and so on.
A classic fanboy mistake is thinking not only that the Quake and Unreal engines are the epitome of 3D technology, but that the development of these engines is the foundation for gaming. Neither of these is true. It's just that all the other 3D game developers out there aren't poster children for PC graphics card manufacturers.
Am I insulting John Carmack? Certainly not. I'm insulting the fanboys who insist that Carmack and Sweeney are the Sole Carriers of The Gaming Torch; that's just misguided.
Re:Crackhead moderators. (Score:2)
---
Umatched Satire. (Napster). (Score:1)
Re:Starting to tire of technology passed of as gam (Score:1)
John Carmack is brilliant...
I just wonder whether his next step will be to pioneer space travel now that he has read that rocket science book.
Gabe Newell using new channels for info. (Score:1)
Did anyone else notice Gabe Newell's question regarding the targeting of the technology at broadband-capable systems? Maybe TF2 is already finished and they are just waiting for everyone to get Cable/xDSL so we can play it :)
Maybe whilst they are waiting they could port it to Linux?
As a programmer who has hacked about with OpenGL and as a gamer I would like to pledge my undying allegiance to John Carmack as IMNSHO he is a genius.
Nearly as clever as these guys http://webpages.mr.net/bobz/ttyquake/ [mr.net]
Re:Misconceptions (Score:1)
Well, let's see. How about Nolan Bushnell? Ken Kutaragi? Shigeru Miyamoto? Get some perspective: id has invented a single genre of game. A very cool genre to be sure, but not the only one, and not the most popular one either.
Re:Portals ? (Score:1)
damn rockets in quake (Score:1)
The third part is present now! (Score:1)
Re:Woohoo, it's JC(at least, it's one of them ^^) (Score:1)
I think that the best co-op I've played is System Shock 2. A lot of the problem with co-op comes from difficulty balancing - ultimately, a party of four in a co-op game can take four times the damage, carry four times as much stuff, etc...
If you could complete Doom on Ultra-Violence, then co-op on Nightmare. Given that people re-spawn when they die, it's impossible to lose... then again, in System Shock 2, you re-spawn in single player as well.
Ultimately, a game designed for co-op would need to have a dynamic difficulty, which made the game harder depending on the number of players. Facilities to split the party would be good, as these would heighten tension. When playing System Shock 2, my friends and I try to heighten the excitement by going as quickly as possible, often splitting into two or more groups to accomplish targets simultaneously. We're not forced to, but it makes the game so much more fun...
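The dynamic-difficulty idea could be as simple as scaling monster hit points with the party size, so four players can't just absorb four times the damage. A hypothetical sketch in C (the 75% growth factor is my own assumption, not from any shipped game):

```c
/* Hypothetical co-op difficulty scaling: each extra player adds 75%
   of the base monster hit points, so a bigger party still faces a
   proportionate threat rather than a four-times-easier game. */
static int scaled_hitpoints(int base_hp, int num_players)
{
    /* 1 player -> base_hp; each additional player adds 0.75 * base_hp */
    return base_hp + (int)(base_hp * 0.75f * (num_players - 1));
}
```

So a 100 HP monster stays at 100 for a solo player but has 325 HP against a party of four.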
Alex
Re:NEWSFLASH: YHBT YHL HAND (Score:1)
--
Re:Woohoo, it's JC(at least, it's one of them ^^) (Score:1)
Re:Portals ? (Score:1)
Re:Portals ? (Score:1)
Christ, I hate to be so petty, but after seeing your tirade about Mr. Carmack's incorrect apostrophe I must point out your grammatical mistake. Your sentence should read "if you had AN ' in your code." The ' would be pronounced "apostrophe" if you were reading it out loud, and therefore should be preceded by an "an" not an "a."
Ok I'm done. Now who's going to nitpick *MY* grammar?
ToiletDuk (58% Slashdot Pure)
Re:Misconceptions (Score:1)
Re:Portals ? (Score:1)
Re:Why does Slashdot keep stealing content? (Score:1)
No one wants to steal anything from a news site; when we *really* want to steal content, we have Napster and Gnutella on our side.
Re:Portals ? (Score:1)
Voodoo magicians - I don't like you ! (Score:1)
No one brave enough to question extreme voodoo-ism?
What a bunch of losers.