No More Unreal Ports For Linux?

Ant was among the first to write in with a link to this article on Blue's News claiming that Epic's new game engine is Direct3D-only, based on statements made at an E3 demo yesterday. Check that link to read the full article, but consider this excerpt: "A major side effect of this is that any future ports of Unreal-engine titles that use the new technology will need to have a completely rewritten rendering system, making Mac and Linux ports significantly more difficult." DoenerMord also wrote, saying "This kind of puts some perspective on recent comments from Tim Sweeney (the man behind the Unreal engine) on Microsoft's breakup ..."
  • by Anonymous Coward
    Quoting from http://lists.openprojects.net/pipermail/glx-dev/1999-December/002511.html:

    I asked a Microsoft rep about exactly that one time. He said they would
    "protect" the API. Big difference from SGI's tacit approval of Mesa.
  • Sadly, what many people are ignoring is that it's not only the Linux ports that are losing out, but the PC as a platform, too. To get the last four games I bought running on my Windows system, I had lots of trouble and lots of tweaking to do. For me that's fine, but for the average John Smith user it's a nightmare. I went to E3 last week, and at the PS2 booth I overheard Jay Wilbur behind me (formerly id, now I guess working for Epic) saying that though Unreal Tournament sold poorly on Windows, they hope that on PS2 it will receive the attention it has lacked so far. I think this may be the last year of the PC as a gaming platform, because - let's face it - the second generation of console platforms is just very attractive to developers: potentially much higher sales and far fewer technical-support headaches. QA is becoming a major problem these days. As much as many /.ers may believe that Linux ports are the key to success, I think the platform losing out is the PC in general. If there are PC ports in the future, they will be ports from existing 2nd-gen consoles, and if those ports are from the X-Box, it is clear this may be problematic for Linux ports.
  • The spec is already open; otherwise people wouldn't be able to program games for it. The major problem is that if you want to use Direct3D, you'd need to implement a lot of Windows core functionality to get it to work. It would probably end up being a lot more work than porting OpenGL, actually. You'd need DirectDraw for sure, a lot of GDI code, and a lot of COM stuff as well.
  • Sure, different implementations in D3D have different capabilities. In D3D, however, they are all emulated to smooth things over. Take compiled vertex arrays, for example. In OpenGL, if the ICD doesn't support that extension, then you're ass out; in D3D, it is emulated where not available (see the sketch below). The same goes for the new vertex and pixel stuff in DX8: it will be emulated where not available. Write to the API and you can be sure that high-quality implementations like nVidia's, Matrox's, and ATI's will accelerate the shading correctly. If the feature is not available, it will be correctly emulated by D3D.

    About your second point, I think there is a general consensus that extensions suck. Take those nVidia extensions: sure, every feature will be there, but you'll have to rewrite your code to support the Radeon, then do it again to support Matrox's new card. A game developer is not going to wait a year for the ARB to approve an extension when a major new graphics architecture comes out every six months. Also, about MS shafting vendors, you've got it wrong. The game developers and the hardware companies come up with a set of features they want; the developers ask for features they'd like to see in hardware, and MS puts those features in D3D, emulating them if they are not available in hardware. They bait the hook, if you will. In the next revision of graphics cards, the hardware vendors take the bait and add the feature to the hardware. The transition is seamless: when the new cards come out, the feature goes from being emulated to accelerated. No shafting is involved. In fact, MS hates nVidia (too much support for OpenGL), but I don't see any features in D3D shafting nVidia - quite the opposite; stuff like cubic environment mapping shows off a lot of features in nVidia cards.

    As for your third comment, MS stopped copying OpenGL features long ago. Why? They ran out of features by version 6. Most of the cool, nifty stuff in 7.0 is new, and 7.0 is quite stable and fast. I doubt 8.0 will be any different.
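    A minimal C sketch of that difference, assuming a GL context is already current (the renderer functions are hypothetical): under OpenGL the application itself must probe the extension string and carry its own fallback path, while the D3D runtime emulates the missing feature behind the same interface.

        /* Sketch: OpenGL makes the application check for an extension and
           supply its own fallback; D3D would emulate it transparently. */
        #include <GL/gl.h>
        #include <string.h>

        static int has_extension(const char *name)
        {
            /* only valid once a GL context has been made current */
            const char *ext = (const char *)glGetString(GL_EXTENSIONS);
            return ext != NULL && strstr(ext, name) != NULL;
        }

        void draw_mesh(void)    /* hypothetical renderer entry point */
        {
            if (has_extension("GL_EXT_compiled_vertex_array")) {
                /* fast path: lock the vertex arrays, render, unlock */
            } else {
                /* hand-written fallback using plain vertex arrays */
            }
        }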
  • Eh, that's true, but..

    The GNU port of Allegro is in a rather bad state right now. I downloaded it to play a nifty game called "Liquid War", and straight off discovered that I had to install the binary version because it would not compile from source (it looked like a syntax error, as I recall, but I wasn't in the mood to hunt the bug down). Once installed, it worked -- except for several very annoying efficiency problems, among other things:

    -> It appears not to block at all, even when the program is doing nothing but waiting for input with an unchanging display. I suspect it just spins to handle timing instead of sleeping (I haven't tried to verify this; see the sketch below for what I mean).
    -> Drawing seems to be insanely inefficient. I don't know whether it's using shared memory (I suspect it isn't), but I've seen games of much less visual complexity work with much less CPU usage. Some button-pushing popped up what looked like a meter of time used, showing that almost all the time was going into drawing routines rather than game logic.
    -> For some reason, Allegro insists on using its own special mouse cursor instead of X's.

    The upshot of this is that displaying a static title screen takes 100% of my CPU time. Not so good.

    Of course, this is really still a beta version of the port, and I'm sure it'll get better... unless these problems (e.g., assumptions of 100% CPU access) are present in the API as well. I haven't looked to see; perhaps you can tell me.

    Daniel
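    For illustration only (this is not Allegro's actual code): the difference between a main loop that spins and one that blocks in the kernel until input arrives or a frame timer fires.

        /* Illustrative sketch. A spinning loop burns 100% CPU even when
           nothing changes; a blocking loop sleeps inside select(). */
        #include <sys/select.h>
        #include <sys/time.h>

        /* Spinning: never yields the CPU, even on a static title screen. */
        void spin_loop(int (*poll_input)(void))
        {
            for (;;) {
                if (poll_input()) {
                    /* handle the event, redraw */
                }
                /* no sleep here: 100% CPU, all the time */
            }
        }

        /* Blocking: sleep until the display connection is readable or
           one frame period (~60 Hz) elapses. */
        void blocking_loop(int display_fd)
        {
            for (;;) {
                fd_set fds;
                struct timeval tv = { 0, 16667 };  /* one 60 Hz frame */
                FD_ZERO(&fds);
                FD_SET(display_fd, &fds);
                select(display_fd + 1, &fds, NULL, NULL, &tv);
                /* handle any input, redraw only if something changed */
            }
        }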
  • No one I have ever known has done that. And they get pretty good performance too, at 100MHz :)
  • Actually, if these were standardized on hardware, then the API would never evolve. No cubic environment mapping for you!
  • Err, isn't that exactly what Microsoft is doing to force people to upgrade to Win2k, with their refusal to release DX6 for NT4?
  • The problem is, no one follows the "standards" the same way. Take HTML for example ...

    Bad example. HTML is not supposed to look the same everywhere. When are people going to figure that out? HTML is supposed to be a presentation-independent, logical markup language. You mark up what you are trying to say; the user agent (browser) determines how to present it.

    Netscape and IE. Wait, hold on, those two are guilty of extending the markup language on their own...

    I'm not saying NSCP and MSFT aren't both guilty of the crime of discarding standards. Indeed, I've condemned Netscape for their behavior in the past. Mozilla, at least, is based on standards, although I think that was because NSCP saw that was their only hope.

    I personally just want to use whatever works well, and I want it to be the same.

    So, we shouldn't have a choice? A single, select few companies should control what is fast becoming the most important communications medium in human history? We should be at the mercy of MSFT, AOL, and the like?

    And you call yourself a libertarian? Liberty is the root of that word, and it seems you're not all that familiar with the concept.

    ... those OEM vendors signed the contracts, no one forced them to do so ...

    Um, MSFT very much forced them to do so. It was either that or lose their right to distribute Microsoft's OS, which was pretty much the only choice (see monopoly). The only choice the companies had was to sign the contracts or go out of business.

    If that's your definition of "liberty", well, I'm glad I don't agree.

    There is a great advantage to having a somewhat standardized Operating System.

    There are advantages to having a somewhat standardized health care system, too. And to the regulation of firearm ownership and operation. Yet, if you are a libertarian, I'm sure you are strongly against such things. Why are OSes different?

    One of my greatest frustrations with Linux is the amount of variance in the systems and the versions of the libraries they include.

    Funny, that's my greatest frustration with MS-Windows. On Linux, all I have to do is grab a copy of the library I need, install it, and I'm all set. The better package-management tools even take care of this for you, automatically and transparently.

    On Windows, that is often illegal, and when it isn't, you still cannot have more than one version of a library installed. DLL Hell. No, thank you!
  • > OpenGL seems to me to be an exceedingly well-thought-out, clean, and consistent API.
    It is. Check out the blue book for more goodies on the orthogonality of OpenGL.

    > What is wrong with GL?
    Not a hell of a lot.

    - The only real drawback of GL is that software rendering speeds are, shall we say, "quite lacking". It was meant for 3D hardware.

    - OpenGL drivers usually aren't as optimized as DirectX due to limited time and money.

    - Some people will complain that OpenGL doesn't support every new feature in a consistent way, but the OpenGL extension mechanism is a much better way to do things. ANY vendor can add their own extensions (see the sketch below), and after implementations settle down, an extension may become part of the standard. Look at how long it took M$ to get rid of the Execute Buffer garbage in Direct3D 2/3.

    --
    "If I protest an illegal tax, does that make me an illegal tax protester ? ;-)
  • > I have never understood why no one has developed a game OS.

    It's called Winblows 9X. The darn thing isn't stable enough for anything else ... ;-)
  • The Celeron II is a decent value for BX-motherboard owners who will overclock to a 100-112MHz bus. The article you mentioned even admits it didn't overclock the Celeron II properly, although it's written very slyly, as if it were deliberately meant to slander the new chip. Bad choice of reading, if you ask me. Try www.hardocp.com or www.overclockers.com, etc. Sheesh.
  • The difference is that IE is "tied" into the operating system. You're not _required_ to have telnet, or Lynx, or an equivalent of Lynx.
  • by Anonymous Coward on Sunday May 14, 2000 @07:09AM (#1073667)

    It's sad, but true. OpenGL needs a lot of extensions now, and these extensions must be made mandatory for the next generation of games.

    Cards are now hitting the market with environmental bump mapping, cube mapping, texture coordination perturbation, etc, and unfortunately, the only way these are currently supported is through mutually incompatible OpenGL vendor extensions.

    In typical MS fashion, DX 1 through 5 sucked. By DX6 and 7 it was good, and now DX8 overtakes OpenGL.

    You can't blame game developers for moving away from OGL if they can't write to a consistent standard API that utilizes all of the hardware acceleration features available on a card.

    This Christmas we will see the ATI Radeon, 3dfx Rampage, and NV20. All of these chipsets are likely to support the DX8 shading language. OpenGL won't, except through proprietary extensions like nVidia's register combiners (see the sketch below). That means developing a game that uses advanced photorealistic lighting on OGL will be hard.

    Herein lies the difference between open-source and commercial development. MS actually runs conventions, gathers developers and manufacturers together in the same room, has people present papers, and asks: what do you want in the API? The game developers and hardware manufacturers interact, and MS adds the most requested features to the API.

    As far as I know, there is no open-source equivalent to MS Meltdown, E3, WinHEC, etc.

    What Linux needs is a hardware/kernel convention so that there is actual two-way communication between developers and manufacturers, instead of just: "Gimme your goddamn specs so I can write a driver."
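    To make the extension mess concrete, here is a sketch of what "mutually incompatible vendor extensions" means for a game's code. The extension names are real; the three renderer functions are hypothetical stand-ins for entire separate code paths.

        /* Sketch: one effect, one code path per vendor. */
        #include <string.h>

        static void use_nvidia_combiner_path(void) { /* nVidia-only setup */ }
        static void use_env_combine_path(void)     { /* simpler, more portable */ }
        static void use_multipass_fallback(void)   { /* lowest common denominator */ }

        void choose_bump_mapping_path(const char *gl_extensions)
        {
            if (strstr(gl_extensions, "GL_NV_register_combiners"))
                use_nvidia_combiner_path();
            else if (strstr(gl_extensions, "GL_EXT_texture_env_combine"))
                use_env_combine_path();
            else
                use_multipass_fallback();
        }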

  • The only reason the Windows version of UT made the sales it did is that there was no boxed Linux version available. Yeah, I had to buy the windoze version and d/l the binaries to play under Linux. And you know what, my framerates blew away everyone else running it under windoze. I guess they are ditching Glide for the very same reason nVidia is ditching Linux: money. Seems they enjoy sucking the micro$oft moneycock just like nVidia. Thanks a lot, Epic. Enjoy your time while you can.
  • id wrote Quake 3 entirely in OpenGL and entirely by themselves. Loki merely packaged it for them.
  • by dek2 ( 162721 ) on Sunday May 14, 2000 @07:12AM (#1073670) Homepage
    You can't "just port Direct3D to linux", because D3D is carefully licensed and protected by Microsoft. The reason they switched to D3D-only is because OpenGL doesn't have the features Sweeney wanted, and he told Microsoft to add such features to D3D. OpenGL's ARB just can't implement new features and versions of OpenGL as quickly as Microsoft can add new features to D3D, due to organization structure and manpower.
  • I guess I can add Epic to my list of companies to no longer buy games from, right up there with the fine folks at Blizzard who refuse to allow Linux ports of their games. Tim Sweeney just doesn't make any sense to me. Unreal was a decent game ruined by horrible netcode and a terribly buggy, unoptimized engine. The UT engine is just as much a bloated turd as the Unreal engine, rarely topping 40 FPS on even the most powerful systems out there. Perhaps if Tim stepped back and thought about how it might have been easier to work with an OS that isn't so damned buggy and hard to code for, his own games might not have required the numerous patches he had to write to fix all the horrible bugs.
  • I bought Unreal Tournament only because there was a linux port of it.

    I will certainly not buy Unreal 2 unless there is a Linux port of it. And a server-only port is not enough for me.
  • I think this is a good idea. It wouldn't hurt to have DirectX _and_ OpenGL available on free Unix platforms.

    However, the "flamebait" moderation on your post show how immature and ignorant many Linux users are, please don't let such dogmatic opinions stop you from implementing FreeDirectX.

    -T

  • by atw ( 9209 ) on Sunday May 14, 2000 @07:19AM (#1073674) Homepage
    You are absolutely right, in my opinion, in asserting that Microsoft paid them money (and they have every right to do so; nothing bad here). However, since Epic always targeted DOS/Windows as their primary gaming platform, it would never have been a problem to run it on the Xbox. The actual answer, as I see it, is that if Microsoft is to compete with Sony and SEGA in the console arena (and that's presumably what they're trying to do with the Xbox), then Microsoft has to have some EXCLUSIVE games to lure players into buying Xboxes in the first place. Use of DirectX ensures that whoever is crazy enough to port this game will have a hell of a time.

    What's really interesting is that Epic seems to be desperate enough to bank on their relationship with Microsoft, and they are doing all this in a very foggy situation where Microsoft may be split up and God knows what else can happen to them. Epic is really taking a major bet here, and this is probably because they are not doing too well in financial terms; hence the cash infusion from Microsoft. Games development is an expensive business, you know... and hey, Epic guys deserve to drive Ferraris, too, right?

  • Let me add to what you posted. Tim essentially said that networking was in chaos until MS formally incorporated a solution that was the cross-platform industry standard. Well, duh, Tim! I wonder why that was a good solution? Now suppose someone were to use a nice cross-platform de facto 3D graphics standard like, say, OpenGL?

    Or what about a cross-platform security feature like Kerberos? No wait, I've got to think more about that one. :-)

  • It's not that they can't do the "components", but that the components can't be an integral part of the OS. They can't standardize their OS around a specific component, as most end users (i.e., the ones bitching about Windows) think it is monopolistic to do so.

    The problem here is again, a different stack here, a different browser here, a different protocol there, a different compression scheme here, a different xml engine there, a different ipsec here, and you get the point.

    I could see Microsoft being forced to allow its applications to run on or be developed for OTHER platforms, but making Win98/Win2000 and future versions be broken up into replaceable components is DUMB. And that will cause problems.

    P.S. You can diss Glide, but it was 3dfx, Voodoo, and Glide that started the whole PC gaming era as we now know it. Without them, we would all still have VLB 1MB Cirrus Logics :)

  • I am slashdot; that is, I am one of the many people who read and believe in Slashdot and its ideas about open source. I feel that if Microsoft released the source code for its products, they COULD be improved drastically. But about this whole DirectX thing:

    DirectX is a standard that is not perfect, but it does its job quite well. The only problem is that it is ONLY Windows. If the DOJ breaks up Microsoft, then Epic may not be able to pull back together enough to improve their games using OpenGL or something. They are jumping into a TRUST relationship with Microsoft, saying "you can do it, and we will make our shate for your shate only".

    My solution:
    The DOJ makes Microsoft release the source code for the technologies that give it an advantage. DirectX, for one. Microsoft gives the Linux community the source for DirectX; the next day there is a custom installer for GNOME or KDE, BeOS can run every game Windows can with perfect memory protection, and Macs can play more than solitaire on their proprietary OS. Next step:
    The DOJ looks at Intel and says "monopoly: you must turn over all of your proprietary technology and open-source the plans". Suddenly the AMD Thunderbird is Socket 370 on an Alpha bus at a 200MHz FSB, and you can still clock it down to a 66MHz FSB to run your old Celeron 366.

    Overclockers are happy because they have bus options up to 200MHz (or 400MHz later on) and can still use the chip they have today, then upgrade down the road. They know that their Socket 370 chip will work in any new motherboard. Servers run the same Socket 370 (or Socket A if you're a diehard AMD fan), so you can strap in your new AMD Athlon Xeon (which just happens to support first-generation SIMD and advanced 3DNow!) and later swap it out for the new 1.5GHz Intel Willamette Xeon (which has better SIMD, but its 3DNow! ain't quite as good).

    Standards are good, and there is something to be said for open source. Final thoughts:
    Open-source every standard, not necessarily to be modified, but to MAKE it standard. Game developers would only have to make one rendering engine; it would run on Mac, Linux, BeOS, BSD, Windows, Java, the new Amiga, everything. They would just need some poor, bored hacker to port the executable over.
  • Do you have any idea what you're saying?
  • Join the OpenUT list at openut.sourceforge.net. I guarantee it is an installation issue. 99.9% of the maps out there work just fine. In fact, I haven't found one that doesn't, but I know of a few circumstances that would prevent a map from running. jf
  • Like it or not, the X-Box will still be the fastest performing console available when it is released in Q4 2001.

    Given how complex the PS2 graphics engine is, and how current games have only just scratched the surface of what it can do, I'm not convinced that the X-Box will even be faster than the PS2, let alone the PS3, which should be out by then. Never mind the graphics engine; the X-Box also appears to have a hard time competing on memory bandwidth vs. the PS2 with RDRAM. I'm no fan of RDRAM in general, but in situations where bandwidth is more important than latency - which includes graphics - it certainly beats SDRAM.

    Anyway, my point wasn't how X-Box compares to other consoles, but how it compares to PCs and users general speed expectations. A 600MHz processor is hardly top-of-the-line today, let alone by Q4 2001. AMD's SLOWEST processor is about to be the Duron 600MHz, which has a full speed bus and will beat the PIII based Celeron II. Processor speeds from both AMD and Intel are expected at 1.5GHz in Q1 2001, and 2GHz by Q4.

    The VooDoo 5 6000 may be $600 now, but by Q4 2001 it'll doubtless be MUCH cheaper, as well as replaced and outclassed by the VooDoo 6 or whatever. nVidia will also obviously keep bringing out higher-performance cards, while the X-Box's design is frozen. Given the relative size of the games console and PC markets, game developers will continue to target PC platforms for games (maybe X-Box/PC can be targeted together), and users' expectations will be raised by PC performance levels.

    If you've ever looked at the PS2 graphic engine architecture (Ars Technica have a good in depth review), you'll realize how complex it is, and what type of performance it can offer to games that truly take advantage. It's very hard to compare it to PC graphics cards. It's significant though that a number of arcade games manufacturers are switching to Sony's engine...
  • by Anonymous Coward
    so it'll be an easy port to the Xbox once it's released. I'm sure Microsoft threw them several million dollars.
  • is not JUST a graphics API like OpenGL is. DirectX is a whole package of media libraries. Funny thing about this: I sent in an Ask Slashdot a couple of months ago asking whether people were using OpenGL more, or D3D. Anyway, DX is very popular (in my estimation) because it is a unified set of media libraries that are all written together and in the same way. A bonus is that video and sound cards have hardware DX support. This is a boon to any game developer because it means the CPU can pump out a few extra transforms or some smarter characters and so forth. What I have yet to hear about is a completely open and cross-platform media API. There probably are a few floating about; they need some attention. I really like being able to play the same game on multiple platforms, and I really wish more games could be developed on multiple platforms. I would have thought a game like Unreal would have primarily cross-platform development in mind; you're trying to get as many people fragging as possible. Oh well, I don't play Unreal anyway, and HALO will be on Mac and PC, so I'm happy.
  • Oh well, I suppose they do.
    Of course, this is going to ruin the lives of people like us now. I think that we might possibly have to find some replacement, but what I wonder...
  • by pb ( 1020 )
    MEEPT is back! All Hail MEEPT!

    Please moderate the parent up. (only moderate my plea down after moderating the parent up, or ignoring my post and moderating nothing)

    Why?

    Because now that slashdot is censoring trolls and other fun-loving individuals, (try posting something with "First Post" in the subject line, too many capital letters, too much punctuation, etc., etc. and see what snide replies you get) a good way to combat this would be "intelligent trolling".

    ...and if you don't remember him, MEEPT is the king of intelligent trolling.

    So please, let him post with a default score of 0 or 1 just so the funny ones have a chance of being moderated up more, and I can see his posts more often! (and look through his history, moderate the good posts up to +5!)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • by Millennium ( 2451 ) on Sunday May 14, 2000 @07:32AM (#1073692)
    Epic has already made a ton of money off of all three operating systems. And yet they would do something which would hinder that strategy? Particularly with an inferior 3D API?

    I'm not worried about the Mac port; Westlake (who did the Unreal and UT ports to MacOS) has a lot of experience converting DirectX to interoperable APIs (heck, most of The Sims is Windows-specific, but they've already got it playable, albeit not yet at alpha). But this could spell big trouble for the Linux port.

    It's yet another of Microsoft's Broken Promises (tm). The Mac compatibility layer for DirectX was due years ago, according to Microsoft itself. And with a public API, a Linux layer would have been far easier to implement.

    Now, let's look for a moment at layers:

    DirectX
    Pro - Easily controlled if you're in bed with MS, as Epic admits to being. Relatively easy to program.
    Con - Windows-specific.

    QuickDraw3D
    Pro - Very easy to program. Available on many platforms via the Quesa [quesa.org] project (already nearly complete), and available on MacOS and Windows in its original, closed implementation. Open-source, again via Quesa. Implements a standard file format (3DMF, the basis for VRML97).
    Con - No longer being actively developed by Apple.

    OpenGL
    Pro - The most powerful API out there. Already has the non-game marketshare by an overwhelming margin. Runs almost anywhere (hell, it even runs on PalmPilots!) Has Open-Source "alternatives."
    Con - Few primitives, making it harder to program (see the sketch below). Eats resources like you wouldn't believe; requires hardware acceleration for decent results. No true Open-Source "implementations," though that's more a technicality than anything else.

    So which is the best? Each has its pros and cons. I'm more partial to the underappreciated QD3D (which, incidentally, Quesa implements on top of OpenGL). I will say that DirectX's platform specificity is a big problem unless you only plan to port the thing to the X-Box. But again, if you're in bed with MS, you can change the API practically at will. OpenGL really doesn't have that big a share of the gaming market (though overall it's overwhelmingly the most popular), but it's more powerful than any of the others. QD3D has ease of programming, but it never really caught on for some reason.
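    On the "few primitives" point: core OpenGL of this era draws only points, lines, and triangles, so meshes, scene graphs, and model file formats are all the application's problem. A minimal C example of immediate-mode drawing (standard OpenGL 1.x, nothing hypothetical except that a context is assumed current):

        /* Draw one smooth-shaded colored triangle; everything
           higher-level is up to the application. */
        #include <GL/gl.h>

        void draw_triangle(void)
        {
            glBegin(GL_TRIANGLES);
                glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
                glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
                glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  1.0f, 0.0f);
            glEnd();
        }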
  • The Dreamcast supports DirectX as well, so it is far from foolish to stick with DirectX when the licensing is virtually free (you get it with the compiler, or when you license WinCE to develop the game).

    So from one source base, with small code changes here and there, you get Win95, Win98, Me, Win2000, Windows CE (Personal/Palm PCs), Dreamcast, and the soon-to-be-released X-Box.

    Windows still runs 99.9% of consumer/business PCs, the Dreamcast has 4 million+ units shipped, and the X-Box can claim more.

    How is having a market of 20+ million customers a bad decision, versus porting to Linux and having to worry about licensing more than quality?

  • by Frac ( 27516 )
    I wouldn't be surprised if Epic arrived at this decision as a result of lackluster enthusiasm for their Linux port of Unreal Tournament.

    Their conclusion probably stems from the huge overlap between Windows and Linux users (those who dual-boot to run Windows games); it doesn't make financial sense for them to support both platforms, since those dual-booting Linux users will buy their games anyway.

    Besides, supporting DirectX only will get them on Microsoft's good side, which leads to preferable treatment when the X-Box comes out. Remember, Microsoft will be the central authority for the X-Box, as they have the right to grant or deny a license for a game (unlike Windows). How does waiting two weeks for an X-Box license sound, instead of waiting four months like the OpenGL-supporting id Software?

    What this means is that you, as a gamer who wants to see games ported to Linux, should boycott game companies like Epic, who are abandoning platforms like the Mac and Linux for the greener pasture that is the X-Box. E-mail the developers at Epic that you refuse to dual-boot to Windows just to run their games, and that you will only choose games from developers that value the "right thing" over greed.


  • No shit, as if I wasn't aware of the meaning of the acronym... however, AFAIK, it actually *emulates* Direct3D by translating its API to OpenGL... so there you go.
  • It is obvious that Tim doesn't properly separate specifying APIs and ABIs on one hand from a specific implementation on the other. The TCP/IP stack example shows this: to end the DOS/early-Windows chaos, people needed to agree (or be forced to agree) on APIs and ABIs; a single implementation wouldn't have been required.

    However, in several ways D3D as an API does what game programmers want where OpenGL does not. For example, it has better control over texture loading, which is important for games that try to be more beautiful and/or more outdoor-oriented than Quake. The step from Glide to OpenGL/D3D initially widened the speed gap between Quake and games like Unreal, Starsiege Tribes, etc., mostly because texture loading was suboptimal and couldn't be controlled/tuned as in Glide.

    Microsoft quickly extends D3D in ways the game programmers want, while the OpenGL consortium moves much more slowly, and while moving it has to weigh the concerns of "serious" (non-game) 3D applications as well.

    There are some more far-sighted programmers like Carmack and Chris Hecker who think (my interpretation) that using a single-vendor API will cause more trouble in the long term than the quick improvements now are worth, but few people enjoy the luxury of being allowed to be stubborn.

    Sweeney has to position his product relative to Quake; he needs to do some things better than Quake, and (engine-wise) he lost ground moving from Unreal vs. Q2 to UT vs. Q3A. OpenGL needs a lot of "right" tuning by the vendor, and Quake is often used as the benchmark. It is obvious that Sweeney *needs* the better control features of D3D to get a product that is different from the new id engine while still performing well.

    I'd like to make clear that I'm absolutely on the OpenGL side here; I'm just trying to point out that Sweeney isn't necessarily evil (although his post about the MS split-up is rather stupid IMHO, and I can't agree with most of his programming language observations either).

    Martin
  • But the problem is, if Janet Reno and Co. get their way, programmers won't have the power to make that decision anymore -- the government will.

    Where on this Earth did you get that from? Have you been watching too many "X-Files" shows?

    The government is not proposing, nor are they allowed to, dictate what programmers have the power to include in anything. If you honestly think that, I think you should log off, shutdown your workstation, and step outside. Notice the lack of flying saucers and KGB patrols. Once reality has returned, please re-join the discussion.

    What the DoJ is proposing is that a single company -- one Microsoft Corporation, in Redmond, Washington -- be broken up into two or more separate companies, as punishment for illegally using their monopoly power to stifle competition, and to help prevent it from happening again.

    No more, no less. It will not affect what other companies can do. It will not affect what other programmers can do. It won't even affect what the new "Baby Bills" can do. It will simply split the company and prevent the pieces from working together.
  • AFAIK, there is no intention to pull OpenUT or any of the source code. In fact Brendon posted to the OpenUT devel list that he was going to release the new UT 413 headers directly after E3 so we can start developing again.

    Also note that Epic has done fairly little development on the Linux port itself; rather, they released enough of the source code to allow the OpenUT group to get the engine running. Since the source release, the code has been ported to SDL for window handling, I/O, and OpenGL, and there is currently heavy development on the sound code, which, with the exception of 3D audio, is already on a par with the Windows version.

    But there are three things that give me much hope for the future of Unreal on Linux:

    1) Daniel Vogel, one of the main programmers on OpenUT (he would probably be considered the 'lead' programmer), started working for Loki about a week ago. He has a pretty good working relationship with Brandon Reinhart, and I would imagine there will be a dialog between Loki and Epic.

    2) AFAIK, the PlayStation uses OpenGL as its primary graphics API and Linux as its development platform. Heck, I KNOW that Brandon has been doing the UT port on a Linux box. How can you support ONLY D3D and still port to the PlayStation in any reasonable fashion?

    3) It's a long way from here to a new Unreal engine, and there is lots of time for them to change their mind.

    Anyway, that's my 2 cents. I've enjoyed hacking on the OpenUT code, and will continue to. And EPIC has done the Linux community a great service in releasing parts of the UT engine source, which can be a great learning experience on how a game engine is written.

    OpenUT is very playable on many systems, and this will only improve as the state of accelerated OpenGL (via DRI or GLX) on Linux improves.

    jf
  • Hell, all the linux zealots are a bunch of hypocrites anyway - all y'all have Win98 on your machines for games. So basically, your ethics can be bought for a video game. Ha.

    I don't have Windows anywhere -- five desktop boxes at home, a desktop at work, and a laptop all run Linux and FreeBSD (one of the desktops is a PowerMac 8100/80 -- it also runs MacOS as a bootloader for mkLinux). And yes, I have bought UT, and I am absolutely sure some Epic bean-counter counted it as a sold Windows version, even though I run it on my Linux box.

  • You should wait patiently. It took 5 minutes to render, but it finally rendered. (I tried it on NT; sorry if it doesn't apply to the Linux version.)

    Szo
  • emulator is, namely, that which emulates (tries to act like, but is not) an original product.

    So the second system to implement an API is the emulator? All X11 servers but the first implementation (incl. XF86) are emulators, right? And all Java VMs other than the first are emulators, right? And the GNU system [gnu.org] is a UNIX® system [unix-systems.org] emulator, right? Microsoft IIS is an NCSA-httpd (now Apache [apache.org] Server) emulator?

  • Just a few months ago, Tim Sweeney got into a big .plan/news page battle with Jason Hall from Monolith about Unreal's place in the Engine licensing space. In those discussions, Tim kept pointing out that Epic licensees got a source tree that worked for PC, Linux, Mac and PS 2. This was supposed to be a great advantage.

    So it surprises me that there is all this talk about dropping other platform support, especially after just committing to the PS2, as I doubt it's going to have a port of DirectX.

    Both Tim's and Brandon's comments taken together look a little strange and may support rumors of dropping cross-platform support. But it's far from a clear-cut statement that they will. Blue's News only reported that Brandon claimed it would be more difficult to port to Mac and Linux, not that it wouldn't be done. After all, porting to the PS2 won't be any easier, and that's definitely going forward.

    Let's hope those platforms don't become victims of insignificant revenues, just after the Unreal engine is gaining respect in a market where only John Carmack stood out as a programmer more interested in better technology than revenues.

  • by Anonymous Coward
    Not long ago, Sweeney had a post on the unreal tech page where he ranted and ranted about how great D3D and Microsoft tools are.

    Another time, he had some none-too-favourable comments about OpenGL, claiming that Carmack was the only one who could 'afford' to take an ideological stand.

    D3D is an ugly, non-standard uni-platform hairball. In every instance, D3D games look worse than OpenGL or even Glide games. And god help you if you're a programmer...just look at the case of Ultima IX. Origin originally wrote the engine for Glide and very quickly found out that D3D just could not DO what Glide did (at least, not without a LOT of pain). In the end the D3D performance was very much below Glide's performance.

    So, if Sweeney wants to support D3D only, he's really going against the tide. The era of Windows 95/98 'only' is OVER. Ignore the sea change at your own risk.
  • One of the most common reasons to dual boot Linux is to run Windows to play games.

    True. However, he's referring to hardcore gamers, who are more likely to be using dedicated gaming consoles (i.e. Sony Playstation, Nintendo, etc.) than either Windows or Linux.

    There isn't a single game that runs better anywhere except on Windows. Get real! Windows runs games better. Period.

    This is a load of absolute crap. Games written for Windows run better under Windows. Big surprise. Games written for Linux run better under Linux. The same is true of most games written for any particular platform: they will generally run best on their target platform.

    As for what software and hardware combination runs games the best, neither Linux, nor Windows, nor Macintosh, nor any other generalized platform does it as well or as cheaply as dedicated gaming consoles. Not even your beloved Windows.
  • The various X11 implementations are NOT emulators, since their purpose, their only purpose, is to implement the X11 protocol itself. Berlin, AFAIK, aims at EMULATING X11: because its main purpose is not to implement X11, it will implement X11 on top of another API. That's emulation. WINE, however, allows you to run Windows programs on Linux. Linux is not meant to be a Windows implementation. See the difference? WINE Is Not A [CPU/Machine Code/Virtual Machine] Emulator. It runs things natively, yes. That...
  • these problems (eg, assumptions of 100% CPU access) are present in the API as well

    Sorry, but you're right. Allegro was originally written for DOS. As such, it didn't get sleeping routines until 3.9.32, it displays the mouse cursor as a sprite in the game window, and it draws all graphics by blitting from an internal framebuffer to the X11 screen (see the sketch below). But it also runs on the framebuffer console (fbcon), where many of the problems you experienced are gone. (If you care about CPU usage, why are you running an Allegro game on the server anyway?)

    But it's still beta. Try Allegro 3.9.32 (from the Work in Progress page); it's quite stable now.
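    For anyone curious what "blitting from an internal framebuffer" looks like, here is a sketch (not Allegro's actual code; the function and parameters are hypothetical): the game renders into a memory image, and the whole frame is pushed to the X server in one copy.

        /* Sketch: present a software-rendered backbuffer to an X11 window.
           One big copy per frame; expensive without MIT-SHM shared memory. */
        #include <X11/Xlib.h>

        void present_frame(Display *dpy, Window win, GC gc,
                           XImage *backbuffer, int width, int height)
        {
            XPutImage(dpy, win, gc, backbuffer, 0, 0, 0, 0, width, height);
            XFlush(dpy);  /* make sure the server actually gets it */
        }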

  • In other words, Mesa is to OpenGL as GNU is to UNIX.
  • Actually, if these were standardized on hardware, then the API would never evolve. No cubic environment mapping for you!

    This is why decent standards include a standardized way to extend them.

  • Now, don't get silly. The basis of the government's case (at least part of it) is that, in their opinion, a web browser cannot be part of an OS. Microsoft disagrees.

    There is (or at least was) a separate market for web browsers, distinct from the market for operating systems -- Netscape, Opera, and others competed in it, and none of them suffered to any degree from the lack of "integration" that Microsoft "innovated" later. OTOH, at no point in Unix history was there a market for a basic-functionality text editor or telnet client for Unix, and even though, say, Unix tar competed with GNU tar, neither version ever managed to hurt the other.

  • by DragonHawk ( 21256 ) on Sunday May 14, 2000 @07:53AM (#1073726) Homepage Journal
    The problem here is again, a different stack here, a different browser here, a different protocol there, a different compression scheme here, a different xml engine there, a different ipsec here, and you get the point.

    And this is why standards are what should be followed. Not Microsoft's "here is our proprietary solution that locks you into our platform" APIs, but standards. Like POSIX, like HTML, like CSS, like XML. If you follow the standard, you will be okay.

    It is when vendors like Microsoft decide to "enhance" the standard with extensions designed to lock out competition that things break down. Microsoft is an illegal monopoly, and something needs to be done to stop their abuse of that power, or the future of the information age will be owned by a single company.


    I don't think breaking up MSFT is the best solution possible, but it may be the best solution possible under US law. I would rather be inconvenienced than enslaved.

    What would my ideal solution be?

    First, behavioral restrictions: force Microsoft to publish complete specifications for all of their interfaces and file formats, and allow unlimited third-party re-implementation of them. If they cannot provide a spec, they cannot sell the product.

    This would keep Microsoft from using their products to lock people into their products. People would be free to choose Windows, Exchange, IIS, IE, or what-have-you if that was the best product available. But if MS tries anything funny, people would be free to switch vendors.

    Prohibit Microsoft from entering into exclusive licensing agreements with PC OEMs. This is something that is technically already done, but MS has found loopholes. Close them.

    Next, some punitive and corrective measures. MSFT has broken the law and deserves to be punished; they also have locked out their competition in many markets and can continue that lock-out legally. So:

    Prohibit bundling any other MS product with Windows for the next five years. No Windows Media Player. No MS-Works or MS-Office pre-loads. Force a choice on people, as MS has forced the lack of a choice on them for so many years.

    Force MS to include freely available third-party browsers in the Windows OS package along with IE, and give users the choice of which they want to use. They've used their OS monopoly to build a browser monopoly; correct that.

    But again, I don't know if any of this is possible or feasible, and I would rather be inconvenienced than enslaved.

  • by The Man ( 684 ) on Sunday May 14, 2000 @07:56AM (#1073727) Homepage
    Pretty much everything he says is correct. And it's a good thing. Microsoft's "standards" may be helpful for the peecee gamerz, but they're harmful to virtually everyone else who's ever touched a computer. We're all better off if Microsoft goes away completely; whatever "remedies" come out of this court case will probably be insignificant in the long run. Already the market is solving the problems itself with the rise of Linux, BSD, and the strengthening and in some cases re-invention of the traditional high-end vendors, then the subsequent Windows 2000 response from Microsoft. Sure, it still sucks, but if you ask me it's obvious that they recognized some of their shortcomings and have at least attempted to improve.

    The problem for gamez writers is real. The fundamental problem with peecee gaming is that gamez, to get the performance gamerz want today, need to get very close to the hardware. Unfortunately, this is exactly the type of thing that good operating systems both prevent and facilitate: arbitrary random physical access to hardware devices is out of the question, but there exists one prescribed method of gaining the access you need, which offers protection from at least most mistakes. DirectAnything is hamstrung by two problems: 1) It fails to provide any real protection, and has been designed and implemented in the same slipshod way as the rest of windows, 2) It requires hardware support which is anything but universal.

    So it's understandable that gamez manufacturers like targeting the consoles - it's a single known environment (as is pointed out in the article) which is generally rock-solid internally and offers very direct hardware access, making high performance easy to get. Naturally, the lack of protection is problematic, but people have been writing hardware gamez since Pong - it's a science by now.

    I would argue that, once completely implemented, the new DRI/DRM XFree implementation will be infinitely preferable to Microsoft targets for gamez developers. Why? Performance (with the right hardware), universal availability, and uniformity. In other words, you won't _need_ a FuxorGraphics model 455CA specifically; anything will work, though naturally if you want good performance you'll want something that can do hardware GL. GL is not itself without problems, but at least it's widely available. And with a fast enough CPU, software rendering is not out of the question (in fact, some SGI systems were set up specifically to do most of the rendering on the CPU. This is not altogether a bad thing; the 3D performance on those systems does not suck, to put it mildly).

    Whether this ever happens, and whether it's ever adopted, remain to be seen. But even if all of his worst fears are realized, and DRI never gets completed in any sane fashion, it's not the end of the world. The worst doomsday scenario he comes up with is the death of the peecee. Would it be so bad to kiss the relic of the 70's, never designed but patched together by marketing types and marginal hackers from day one, unstable, unreliable, and nonuniform, a final goodbye? I'd like nothing better than the death of the peecee. Good riddance. Tragically, though, this fear is massively overblown: the peecee's existence relies on buyer ignorance and gullibility, of which there seem to be no shortage.

  • you got Win95, Win98, Me
    That's great. You just listed 1 operating system with 3 different names.
    , Win2000
    Ah yes, Windows 2000. It's got an enormous market share. It's so different from Windows 95 that you would absolutely need DirectX to be portable. Can't use OpenGL or your own engine.
    , Windows CE (Personal/Palm PC's),
    What? Do you honestly expect someone to run Unreal on a "Pocket PC"? Are you crazy?
    DreamCast
    Nobody uses DirectX on the Dreamcast. The performance is just terrible. And from your very own comment, it appears that it also costs more.
    and the soon to be released X-Box.
    The X-Box is a PC. Just because Microsoft is trying to convince you otherwise means nothing. If you write it for the PC, it'll run on the X-Box, with or without DirectX.

    So, it seems while you tried to list 7 platforms, you really listed 3 that are so similar that you don't need DirectX to be portable.

  • Wine Is Not an Emulator. It replaces Windows libraries; it doesn't emulate.

    Oh, please.

    emulate: 3. Computer science: To imitate the function of (another system), as by modifications to hardware or software that allow the imitating system to accept the same data, execute the same programs, and achieve the same results as the imitated system.

    Wine emulates the Windows binary interface. Funny names aside, Wine very much is an emulator.

  • by DoenerMord ( 21821 ) on Sunday May 14, 2000 @08:22AM (#1073733)
    A few posts to comp.sys.mac.games.action from the premier porting house for Mac games, Westlake Interactive, clear up the situation a bit. Here's a post from president Mark Adams:

    Of course porting "difficult" code is what we get paid to do. And we've dealt with many more Direct3D-only games recently (TR:TLR, The Sims, etc.), so we're pretty confident we're keeping our engine re-writing skills up.

    Kenn Cobb (also from Westlake) had this to say:

    Not a big problem. We port D3D titles all the time now (TR4, The Sims, currently). It really depends how they use D3D. There are various approaches to take with D3D, and some are easier to port than others.

    3dfx themselves are phasing out Glide, so I can understand Epic dropping it. I'm disappointed about OpenGL, but from a PC-only viewpoint I can see the argument. Most OpenGL drivers on the PC are designed primarily for Quake. Even though fuller implementations have been coming out, they are generally not as mature (ie, optimized, bug-free) as the D3D drivers. There are far more D3D games than OpenGL on the PC, so this only makes sense. Still, it would be nice if there were two major fronts pushing OpenGL (Epic and id) instead of just one.

    So, while it may not be a great situation, at least Mac users won't be shut out entirely. Westlake knows what's up...

    -doenermord

  • saw this on Blue's News [bluesnews.com]:
    "one of the reasons for this is the fact that they have direct input to Microsoft as to the development of the API"
  • I could see microsoft being forced to allow its applications to run/be developed for OTHER platforms, but making win98/win2000 and future versions be broke up to replaceable components is DUMB. And that will cause problems.

    Problems. Even with MS's driver certification program there are still problems with video drivers (a component) and sound drivers (another component). What if MS bought 3dfx, nVidia, and Creative? Don't you think there would be fewer problems? I think there would be. I think the quality of first-release drivers would go up substantially. Do you think that scenario would be a good thing?

    What if Microsoft bought every company, or division of a company, that made anything at all for the PC market? That would bring them to a console-like level of control (or better) over the PC market. That level of control is proven to generate very high-quality, high-reliability stuff. Do you think this scenario would be a good thing?

    What if we turned the knob the other way? Dozens of Baby Bill Co.'s each write their own tiny fraction. Baby Bill Co. #34 writes the code for the IDirect3DLight interface, Baby Bill Co. #35 writes the code for IDirect3DExecuteBuffer, Baby Bill Co. #36 writes..., etc. Would that be a good thing?

    My opinion is that the answers to the STRONG questions are no, no, and no.

    Microsoft has demonstrated (Java) and continues to demonstrate (Kerberos) that the price for allowing them to bring standards and interoperability to the PC market is strict Microsoft control over implementations of said standards. One of the primary sources of Microsoft's desire to innovate is the desire to "de-commoditize" standards (Halloween docs). So, I think it is fair to say that the more control MS gains, the less motivated they will be to innovate. And so I think it is appropriate for the knob to be turned down a notch or two on Microsoft.

    Dealing with variations in standard implementations and getting people to play nice together, insofar as standard implementations go, IS a problem, but Microsoft domination and lock-down of those standards is not the answer.

  • Microsoft has been silent on the Wine issue, probably because it cannot run native-built Windows applications acceptably without a lot of tweaking. As soon as Microsoft Office on the Linux desktop via Wine becomes feasible, watch Microsoft bust out the attack lawyers and execute its Super Litigation Combo Death Move.

    They have been silently aggressing against things like Wine, too, by breaking API compatibility and licensing software for Windows only in the EULA (except it's phrased as "only the operating system for which it was written" or somesuch).
  • I must say I find it difficult to disagree with him on this. In the early days of D3D there was heavy contention between OpenGL and D3D; I myself was very vocal on the side of OpenGL. OpenGL has barely moved forward in the intervening years, whereas D3D has developed at an alarming rate. OpenGL support for the full range of features available on different graphics cards is limited at best, whereas D3D has done a pretty good job of keeping up (although there are some nasty sides to the yearly release of the API).

    Now I know I'm going to get flamed for saying anything pro a Microsoft API, and the first comment is going to be "but id Software don't have a problem". The largest benefit of OpenGL is that the specification defines exactly how everything gets done; if the hardware can't do something, it gets done in software. Unfortunately, this is also its downfall. id Software don't have a problem: whenever they are about to release a big-name title, all the card vendors run around making sure their drivers are optimally configured for the feature set id is using. If you're a smaller developer, things work differently; the vendors are not as concerned with you, and basically you have to limit yourself to using the features the big boys have already used. With D3D the situation is different: you are no longer shielded from the differences between cards, but this can be used to your advantage, using one set of optimal features to achieve an effect on one card and another set (or a simpler effect) on another.

    The ironic situation here is that the (Microsoft-proprietary) D3D API is providing the smaller developer with a far freer development platform than the open OpenGL.
  • Oh, please.

    First of all, don't jump to the conclusion that games written for Linux are necessarily going to be better performers, either where netplay or graphics are concerned. Most graphics cards these days have better OOTB support for Windows, hands down. All things being equal, that translates to better performing games. And where netplay is concerned, that varies highly from game to game. Compare Unreal's unpatched netplay to the latest patchlevel, or to UT's. Compare either to Q3. On the same hardware, with the same OS.

    Second, don't be ridiculous about "refusing to allow Linux ports." Companies like Loki don't politely knock on the doors of game companies and go, "Hey there, mind if we port your game for you? We'll only be a minute." Loki gets paid to do it. That's fine - they provide a service to their customers and they deserve the right to charge for it. But the folks at Blizzard aren't being evil because they don't feel like paying for a Linux port of the game.

    If Loki can show that they can sell enough copies of a game for Linux, Blizzard will likely change their minds. Until then, don't have a hissyfit at Blizzard for looking out for their bottom line. This may come as a shock to you, but Blizzard, Loki, and the publishers of every game you're likely to find at the local Best Buy are, in fact, in it for the money.
  • You totally missed my point. I don't care about the ethics of being open. SCREW THAT. People *CHOSE* Microsoft. Microsoft has NO responsibility to conform to ANYONE but THEMSELVES. It is and always HAS been up to consumers to choose what THEY WANT. Demand for something else has driven Macs, Linux, and the other OSes out there, but the demand for this "openness" you preach about is BS. It's a CHOICE, people. You DON'T like it? DON'T CHOOSE IT. DON'T preach to me about what is wrong or right.

    Have you EVER tried mixing those "clone" PCs you talk about? Have you ever seen the inside of a COMPAQ? They're not a clone. You can't replace the motherboard on 99% of them because they're proprietary. DO YOUR HOMEWORK before you preach to me about CLONES and this and that.

    A COMPUTER is not defined as a PC CLONE; where in the world is that misnomer coming from?

    I don't think SUN calls their workstations PCs, and IBM had nothing to do with their creation; the market demanded a workstation and Sun provided one. Sure, the 'clone' market exploded, but it exploded around a common software package called DOS, and then Windows, and then whatever it is today. It's because people chose DOS and chose Windows and chose whatever they want to run.

    Please, get your head out of your ass and look around. Standards are just one group's or one person's concept of anything. Microsoft has Microsoft standards, and maybe YOU don't agree with them, but don't give me the spiel on how having 2000 variants can be anything good for the industry.

    Should we sue telephone companies because we can't run our own protocols or signals on their networks? Should we sue cable companies because they choose what we get to watch? Should we sue car makers because they put the stereo too high or too low? The world would be boring if everything were standardized.

    Innovation is NOT conformity, people.

  • If you claim that OpenGL is better, then please give some examples. All I know is that under Unreal Tournament, DirectX is faster on my GeForce 256.

    Out of that whole message, this was the only thing I could even comprehend, so I'll address it. OpenGL is better for people who want a portable, robust API, that doesn't change every six months. It's not the only API out there, and it may not be the best for everyone.

    And I repeat, there is nothing about an API that would make one faster or slower than another. An API is just a specification.

    If a game is faster in D3D than in OpenGL on the same hardware, that is almost always because the game made poorer use of the OpenGL API; it has nothing to do with the illusion that one API is faster than another.
  • by yerricde ( 125198 ) on Sunday May 14, 2000 @08:27AM (#1073755) Homepage Journal

    The definition of "emulator" has a slippery slope: NES virtual machines [zophar.net] emulate the NES binary interface. Java virtual machines emulate the Java binary interface. Linux emulates the UNIX® source interface (most of POSIX® and much of the Single UNIX Spec). XFree86 [xfree86.org] "emulates" the X11 source interface. GTK+ [gtk.org] emulates the GTK+ source interface. So you're saying an emulator is any program that exposes APIs -- that all libraries are emulators?

  • There's a portable wrapper around much of DirectX. It's called Allegro [demon.co.uk]. It supports nearly everything popular that runs on x86 (DOS, Windows, GNU/Linux, BeOS) and other hardware (the X11 version supports linux-ppc, Slolaris, HockeyPUX, AIX and Pains, etc.) but does not support Mac OS because none of the developers has a Mac to code on. And there are lots [allegro.cc] of games written for Allegro, including all of mine [rose-hulman.edu].

  • It's called Windows CE, and it powers some Sega Dreamcast games (mostly PC ports) and will power most Xbox games.
  • OpenGL ... No true Open-Source "implementations,"

    Then what's Mesa [mesa3d.org]? It implements the OpenGL API on several platforms.

  • by empath ( 44572 ) on Sunday May 14, 2000 @06:48AM (#1073769) Homepage
    For those of us who don't follow this sort of thing, wine already does a fair chunk of DirectX, including Direct3d. If you want to play Windows games, go help wine.
  • by Troed ( 102527 ) on Sunday May 14, 2000 @06:48AM (#1073770) Homepage Journal
    ... reading Tim's comment, I saw this:

    Microsoft fixed that with Windows 95's built-in TCP/IP.
    The government's proposal would basically shackle Microsoft and prevent them from adding new features of that sort to the OS

    (this was about networking being bad under DOS but working fine on Win95+)

    Breaking up Microsoft into one OS vendor and one applications company won't prevent Microsoft from adding stuff like TCP/IP or DirectX to its OS! They're OS components; I don't think anyone would claim they're applications in the eyes of the ordinary user (or the DOJ, surely).

    Linux and Mac will have to push OpenGL over DirectX - I'd say that's quite reasonable. At least it's not as bad as Glide ...

  • Although this is an obvious troll, I'd still like to address these points, because a lot of people posting here seem to be clueless on some of these issues:

    We are not talking about D3D 3.0 anymore. 7.0 is here, and it's not your mother's pansy-ass D3D anymore. It's as fast as OpenGL, and blows it away in features and power.

    This sounds more like market-speak than any kind of statement based in fact. Can you provide some examples of how an API can be fast and/or powerful?

    OpenGL used to be safe in the knowledge that, however much support D3D got, it was still king in terms of features. Not anymore. These days there is a host of features that the D3D programmer has direct access to but that are not directly accessible from the OpenGL API.

    Tell you what... For every "feature" you can find that cannot be implemented as an OpenGL extension, I'll list one that isn't in D3D... Go on, I'm waiting...

    OpenGL does not have the speed advantage anymore.

    Well there goes all your credibility. Please explain again, how an API can be faster or slower than another one. An API is just an interface. Some vendors' implementations may differ in speed, but there is nothing about the API that specifies a speed difference.

    The OpenGL API is not powerful enough. It does not give you enough low-level access.

    You're programming down to the vertex on both API's. How much lower do you want to go, and still be able to run on all hardware?

    OpenGL is not extensible. Sure, there are OpenGL extensions, but they are silly compared to the ones in D3D.

    They're silly... Another statement based in fact I see...

    Yeah, that's right, everyone... just a troll... move along... nothing to see here...

  • Well, I'm running it on the X server because:

    -> fbcons doesn't work with my video card, which predates VESA2.0;
    -> besides, fbcons gives me a headache on VESA2.0 cards (60Hz refresh rate..)
    -> I happen to be running stuff inside X a majority of the time and I'm too lazy to switch VTs
    -> svgalib (if Allegro supports it) needs root permissions, something I'm not going to give to a binary download of a beta library, and tends to lock the console and do other weird things if the program using it crashes

    I care about CPU usage because I sometimes want to run other things in the background, and not only are they impacted, but Allegro-based games (in the version I tried) don't deal at all well with anything else getting CPU time; they get horribly jerky whenever, e.g., fetchmail wakes up and delivers the mail that's arrived for me in the last 10 minutes.

    and finally: Nethack doesn't have this problem ;-)

    Daniel
  • It's clear both API's will be around for a while, so why not learn to live with them both?

    Only the most hard-core pundits are going to make this out to be some kind of war between API's. They both have their advantages and disadvantages--and as programmers, we can choose which one suits our needs best.

    Those who want a portable and robust API will choose OpenGL, and those who want to be able to run with the latest hardware and take advantage of Windows-only features, can choose D3D.

    They both have their drawbacks too. It's unfortunate that OpenGL's review process takes so long to specify new features, just as it is unfortunate that D3D changes with each version at MS's whim. As a programmer you just have to decide which drawbacks you can live with.
  • So you want to establish a consortium of people who believe in something. This requires time, money, and lawyers. Just because YOUR idea of a standard means a consortium, that doesn't mean it is MY idea, nor ANYONE else's idea, of a standard. Windows set the standard for today. Why else would GNOME/KDE be in such a hurry to copy it? Sure, they use "open source" and "open standards," but Windows was built around Win32, OLE, and all the Windows standards.

    There is nothing wrong, nor abusive, about standards. What Microsoft is in trouble for is not having a closed API or a "politicized" standard, but price gouging, license gouging, vendor contracts, and bad business practices.

    Java isn't a standard; everyone has their own concept of it, so why don't we sue SUN for not having one? Sun has proprietary knowledge and developers that build it and can customize it for THEIR OS and THEIR HARDWARE. Doesn't that sound kind of strange? Sure, you get the source code if you want, but that doesn't stop Sun from doing things behind closed doors.

    The IBM PC isn't a standard either. Look at the major labels. Dell is the only one I know of that uses replaceable components for its business desktops. HPs have crappy motherboards and proprietary miniature cases; Compaqs have disgusting BIOSes and really proprietary, cheap-ass hardware solutions. Just because it runs Windows doesn't mean it followed some "PC standard," as I can emulate this and that, and there is no "PC" standards body. Sure, you have the PCI folks, VESA groups, and all sorts of other standards bodies, but the open source community can't afford the costs of POSIX licensing/standardizing, ISO-9000 certifications, and public meetings/relations/developer consortiums.

    So tell me again, why shouldn't people use DirectX 8? Is it because OpenGL is standardized by a company that is losing money faster than a drunk tycoon at a slot machine? I mean, what in the hell is going through you people's brains? If you don't like a product, DON'T USE IT. You don't hear Windows people bashing Linux/Unix users for their "standards," and you don't hear them crying wolf against Sun for having an edge just the same as Microsoft does. (YOU CAN'T TELL ME THAT SUN DOESN'T PRICE GOUGE ON ITS HARDWARE/SOFTWARE/SUPPORT SOLUTIONS.) You can't tell me that a SPARCstation 5 should retail for $2,900 while a PC with similar specs and tons more software and features runs for a third that price.

    Sure, Microsoft has its own beliefs on XML/DHTML/HTML and whatever else, but who is to judge whether that is right or wrong? Isn't that what your beliefs on standards are? Follow the basics and add your own features to expand upon them. As you state, that breaks the reason for a standard: if the standard is to be followed, then ONE CHANGE by ONE VENDOR screws up the whole standard, should that ONE VENDOR be the sole leader of that market. JUST like SGI & OpenGL -- sure, they're in with NVIDIA, but I guess that is all right since NVIDIA is a Linux-friendly company. Tell me how come, if NVIDIA follows a standard, I can't replace it with another graphics card that follows the exact same "standard" you (*&#( think exists and have it work in the exact same fashion to the spec???

    Get my point? Or do people just not think for themselves anymore? Got your head out of your ass yet? Or did you think I said your head was up your ass because you run Linux? Or did you think for a second and accept someone's ideas? Or are you standardized on the notion that "standards" are best, so you can cry wolf when someone you don't like doesn't adopt them?

  • I don't understand Tim's point on the forum. Is he really that stupid? I don't know much about him, but it would appear as though he has no clue what "operating system" means. He talks about how breaking up MS will send us back to the days of DOS, when TCP/IP stacks conflicted and sound cards all had to be coded for separately. The problem with DOS and TCP/IP and sound cards was that they were new-ish technologies that the OS hadn't been updated for. When the new OS (Windows) came out, the problem was fixed. The definition of an OS is a piece of software that provides a "glue" between applications and hardware. That includes the hardware's protocols, such as TCP/IP. Any new technology such as TCP/IP and sound cards WOULD be covered by the MS operating-system company. No progress would be halted and no incompatibilities would arise, because handling that stuff is what the OS is supposed to do. What the OS is NOT supposed to do is provide web browsers and office suites, which is what Windows does now.
    I'd really like to see what Tim's opinion would be if MS decided to release a 3d game engine with Windows as one of the people in the forum said. Let's see if he is as sympathetic to MS's cause when it is his company on the line.
    Oh, and if DirectX is such a superior gaming platform then what about Unreal Tourney for Linux and Mac? Maybe I'm just a conspiracy theorist, but it seems as though maybe Bill has thrown a little money Tim's way to cause a complete 180 in the company's direction, no?

  • So the second system to implement an API is the emulator?

    Are you being obtuse on purpose, or were you born that way? Read what I flippin' wrote.

    All X11 servers but the first implementation (incl. XF86) are emulators, right?

    X11 servers are generally implementing the X11 protocol standard, not trying to act like the original X server implementation. In fact, many of them -- including XFree -- use the same code.

    And all Java VMs other than the first are emulators, right?

    I suppose that depends on Java's status this week -- is it a standard, or a product owned by Sun?

    And the GNU system is a UNIX® system emulator, right?

    That was how it started, after all. RMS wanted a Unix system without the Unix license problems.

    Microsoft IIS is an NCSA-httpd (now Apache Server) emulator?

    Well, I can't speak for Microsoft. Are they trying to be NCSA HTTPD all over again, or are they trying to implement the HTTP standard? (Or, more likely in MSFT's case, embrace and extend the standard?)

    If a project is trying to implement a standard, then I wouldn't call that an emulation. They are starting from scratch and creating something that meets an abstract specification.

    But, if a project sets out with the goal of implementing something which acts just like another product, then they are building an emulator, by definition. Emulate means "act like".

    Is that clear enough for you, or do I need diagrams?
  • Of course MS is going to give game developers more input into their API. With the release of the Xbox, MS is going to court as many game developers as possible. But just wait a year or so down the road: when these developers have abandoned all other platforms and are dependent on MS, it will stick them with outrageous licensing fees or make them somehow pay through the nose to use D3D, just as it has done so many times before.
  • "I'm not surprised if Epic arrived at this decision as a result of lackluster enthusiasm from their Linux port of Unreal Tournament."

    Yes, a lack of enthusiasm on THEIR part. I run a rather large LAN party and I try very hard to run Linux for my game servers. But when the guy doing the Linux ports doesn't give a fucking shit about the port, what do you do? Epic has EARNED their Linux enthusiasm, not had it dumped on them.

    Bad Mojo [rps.net]
  • Breaking up Microsoft into one OS vendor and one Applications company won't prevent Microsoft from adding stuff like TCP/IP or DirectX to it's OS!
    Okay, what about a text editor? How about telnet? Lynx (or an equivalent)? What about a real browser, like IE?

    Slashdot readers probably have some reasoned opinions on which of the above can be considered "OS features", and which can't. But the problem is, if Janet Reno and Co. get their way, programmers won't have the power to make that decision anymore -- the government will. And once you give the government a sweeping power like that, they'll never give it up, not even when Microsoft and Windows are long gone. That's a fundamental law of nature.
  • by Allen Akin ( 31718 ) on Sunday May 14, 2000 @09:08AM (#1073797)

    You have valid concerns, and I don't want make light of all of them. But I did want to inject a dose of realism about a few things:

    • D3D is far from consistent. Different implementations have different capabilities, and that definitely includes the new vertex and pixel shader stuff in DX8. Developers will have the same trouble getting those new features to work in D3D that they'll have dealing with OpenGL extensions, if not more (see the caps-query sketch after this list).
    • There are a lot of people working to keep OpenGL competitive. For example, a VP at NVIDIA recently told me "Every hardware feature will be exposed in OpenGL extensions." The first step is always to expose new functionality in extensions; it gets incorporated in the core standard only after it's shown to be useful and generally practical to implement. (This is unlike D3D, where Microsoft decides which vendors to favor and shafts others by adding D3D features that they won't necessarily be able to support.)
    • Always view new stuff in D3D with a healthy dose of skepticism. Microsoft has had to reverse its design decisions in D3D many times in the past, and now that they can no longer copy features wholesale from OpenGL, they're even less likely to get things right than before. There are highly knowledgeable people in the games industry who have major doubts about the new features in DX8.
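
    To make that concrete: here is roughly what the capability-sniffing looks like from the developer's side. A minimal DX7-style sketch from memory (DX8 isn't public yet), so treat the exact struct members as illustrative rather than gospel:

        // Sketch: ask the device what it can do before relying on it.
        // Field names follow the DX7 docs as I recall them (illustrative).
        #include <d3d.h>

        bool SupportsTwoTextures(IDirect3DDevice7 *dev)
        {
            D3DDEVICEDESC7 desc;
            if (FAILED(dev->GetCaps(&desc)))
                return false;                    // can't query: assume the worst
            return desc.wMaxSimultaneousTextures >= 2;
        }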

    Bottom line, I think you're correct that more needs to be done to support graphics and games in the open-source world. But I also believe that the game is far from over.

  • I experience serious video problems with WINE

    After three or four bottles things are really out of focus

  • The problem is, no one follows the "standards" the same way. Take HTML, for example: not every page is compatible between Netscape and IE. Wait, hold on -- those two are guilty of extending the markup language on their own, but what about the other browsers? Opera, Mozilla, iCab, Lynx, etc. They each have their own glitches, and they each render pages differently. Now throw in the differences between operating systems on top of that, and you know why it's so hard to make web pages that work well and look good across every platform and browser.

    I personally just want to use whatever works well, and I want it to be the same. I support PCs for a living, and it's bad enough having millions of combinations of hardware and drivers to contend with; I'd at least like to keep the OS possibilities in the single digits.

    That said, I don't support everything Microsoft has done. Their tactics with the OEM vendors are despicable, but at the same time, the libertarian inside of me points out that those OEM vendors signed the contracts, no one forced them to do so, thus my sorrow for them is greatly mitigated.

    There is a great advantage to having a somewhat standardized Operating System. One of my greatest frustrations with Linux is the amount of variance in the systems and the versions of the libraries they include. That doesn't mean I don't like Linux, it just means I'm not going to use it as my primary desktop OS anytime soon, the BeOS has a much better chance with me.
    ---
  • by Admiral Burrito ( 11807 ) on Sunday May 14, 2000 @10:20AM (#1073802)

    The problem here is again, a different stack here, a different browser here, a different protocol there, a different compression scheme here, a different xml engine there, a different ipsec here, and you get the point.

    [SARCASM]
    I'd like to talk to you about an upstart computer company called "Compaq". It seems they're about to release a so-called "IBM-compatible" PC. This is the worst thing that can happen to the computer industry! Everyone should buy IBM. Not just because nobody ever got fired for buying IBM, but because IBM is standard. Just look at what Compaq and companies like them are going to do to the computer industry. Soon everyone will have different computers with different hard drives and different video cards and different CPUs and different amounts of RAM. Imagine the chaos that will ensue! There is no way that all of these different parts can possibly work together in a "100% IBM compatible" way. I predict that 10-15 years from now computers will be twice the size of a house and so expensive that only the five richest kings of England will be able to afford them. Put a stop to this! Buy IBM!
    [/SARCASM]

    But seriously. I believe it was Bill Joy who said, "The function of an operating system is fixed." That function is to control access to the computers' resources (CPU, RAM, files, network sockets, etc). Applications ask the operating system to provide access to resources through a standard API.

    What we need are operating systems (multiple, not one) providing APIs that are standard and available to all manufacturers the way hardware standards like PCI are. Then we can start to see the same competition in operating systems that we see in PC hardware.

    On the internet we have dozens of different TCP/IP stacks, web browsers, protocols, compression schemes, XML engines, IPSEC implementations, etc. As long as they are written to open standards, more is better. It means you have a choice in what software you use, as opposed to being locked into a single vendor such as Microsoft. Vendor lock-in is what happens when you have a single dominating implementation of a "standard". When such a "standard" is violated there are few people to complain about interoperability problems and no standards-compliant vendor to turn to. When there are multiple vendors with nobody dominating, people will simply stop buying from a non-compliant vendor (why buy broken software when you can get software that works?), thus making non-compliance unprofitable.

    Competition is a Good Thing(tm).

  • D3D's not very good on WINE yet (it's only around the DX3 level), but OpenGL support is in current CVS now (look for it in the next release). If you have an XFree86 4.0-supported OpenGL card, it plays Half-Life in OpenGL mode (quite fast on my TNT2, even). Gaming on Linux is gonna get even more interesting as we go, I think :)
  • Mesa cannot legally call itself an implementation of OpenGL, because it has not gone through SGI's hideously expensive testing process. Pretty much everyone agrees that it's "close enough" (even SGI), so this is more a semantic point than anything else. But stupid as it is, that's enough to scare away some businesses.
  • They still seem like a Linux-friendly company (Epic, NOT GT Games). Hopefully Loki can work with Epic to port games, like they already did with the D3D-only Heavy Gear II (an excellent port to OpenGL) and with Quake3 for id. I have purchased two copies of UT and like it very much. If Epic is listening: please don't cut off the Linux community. We do play and pay for your games.
  • Although this is an obvious troll, I'd still like to address these points, because a lot of people posting here seem to be clueless on some of these issues:
    >>
    Nope, not trolling.

    This sounds more like market-speak than any kind of statement based in fact. Can you provide some examples of how an API can be fast and/or powerful?
    >> I was just trying to be poetic. Geez, nerds. APIs can be faster or slower than others based on how effectively they allow access to hardware and how effectively they manage resources. Take early versions of Direct3D. They were the reason that D3D got a bad name. An engine implemented in D3D and optimized equally with one implemented in OpenGL would still lose very significantly. Back around D3D 3 or 5, a Glide or OpenGL game would run upwards of 40% faster than the same engine using a D3D renderer. In current implementations, however, the pipeline has been streamlined, the whole thing has been nearly rewritten, and provides much better access to hardware.

    Tell you what... For every "feature" you can find that cannot be implemented as an OpenGL extension, I'll list one that isn't in D3D... Go on, I'm waiting...
    >>>>>>>>>>
    Did you not read my entire post before answering? I say that extensions suck. They are not evenly implemented, and the developer has to be constantly aware of which ones he is using. I can easily (relatively) write an engine that uses every feature in D3D. Now, no matter what graphics card I'm using, my app will run, with D3D smoothing over hardware differences. In OpenGL, I have to use extensions, and a lot of the cutting-edge features (like the ones that nVidia is exposing as OpenGL extensions) are not ARB or EXT extensions, in which case I have to write separate code for each card I want to support. E.g., if I use the skinning support in DirectX 8.0, my game will automatically take advantage of both the GeForce 2 and the Radeon. In OpenGL, I would have to use both the NV skinning extension and the ATI skinning extension. Do you think the ARB is going to be able to keep up with a market where major hardware changes happen every 6 months? Hell no.
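
    To make the branching concrete, this is the kind of thing I mean -- a rough sketch. GL_EXT_vertex_weighting is nVidia's real vertex-weighting extension, but "GL_ATI_skinning" is a made-up placeholder, and the substring check is the crude-but-typical approach:

        #include <GL/gl.h>
        #include <string.h>

        // Crude extension check: substring match on the extension string.
        // (A robust version would match whole space-separated tokens.)
        static bool HasExtension(const char *name)
        {
            const char *exts = (const char *) glGetString(GL_EXTENSIONS);
            return exts != 0 && strstr(exts, name) != 0;
        }

        void SkinOnCPU();        // fallback path (stub)
        void SkinWithNVPath();   // path using GL_EXT_vertex_weighting (stub)
        void SkinWithATIPath();  // "GL_ATI_skinning" is a placeholder name (stub)

        void PickSkinningPath()
        {
            if (HasExtension("GL_EXT_vertex_weighting"))
                SkinWithNVPath();
            else if (HasExtension("GL_ATI_skinning"))
                SkinWithATIPath();
            else
                SkinOnCPU();     // a third code path, maintained by hand
        }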
    OpenGL does not have the speed advantage anymore.

    Well there goes all your credibility. Please explain again, how an API can be faster or slower than another one. An API is just an interface. Some vendors' implementations may differ in speed, but there is nothing about the API that specifies a speed difference.
    >>>>>>>>>
    APIs can be fast or slow. Have you ever implemented an API before? Take hypothetical APIs X and Y. To change the color in X, I have to create a color object, register it with the server, then activate the color change. In Y, I send three color components to activate the color change. Which do you think will be faster?
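
    Sketched out, with both APIs entirely hypothetical (no-op bodies, since the interface shape is the whole point):

        struct Server { int active; };
        struct ColorHandle { int id; };

        // API X: object creation, registration, then activation.
        ColorHandle X_CreateColor(float r, float g, float b)
        { ColorHandle c = { 0 }; (void)r; (void)g; (void)b; return c; }
        void X_RegisterColor(Server *s, ColorHandle c) { (void)s; (void)c; }
        void X_ActivateColor(Server *s, ColorHandle c) { s->active = c.id; }

        // API Y: immediate mode.
        void Y_SetColor(float r, float g, float b) { (void)r; (void)g; (void)b; }

        void SetRed(Server *s)
        {
            ColorHandle c = X_CreateColor(1.0f, 0.0f, 0.0f);  // X: call one...
            X_RegisterColor(s, c);                            // ...two...
            X_ActivateColor(s, c);                            // ...three.

            Y_SetColor(1.0f, 0.0f, 0.0f);                     // Y: one call.
        }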

    You're programming down to the vertex on both API's. How much lower do you want to go, and still be able to run on all hardware?
    >>>>>>>>
    That's not my point. The vertex level is entirely irrelevant to what I'm talking about. In OpenGL, you have abstract access to color, depth, etc. buffers, vertex buffers, and hardware resources. In D3D, you have direct access to these. You can create hardware buffers, vertex buffers, etc. Hell, you even have direct access to what textures are uploaded to the card! None of this glPrioritizeTextures crap. Case in point: in D3D, you can render directly to a secondary hardware buffer -- let's see OpenGL do that!

    They're silly... Another statement based in fact I see...
    >>>
    Did you bother to read the last half of the post??!!! I go through, point by point, why GL extensions are silly! They aren't implemented fast enough to keep pace with the evolution of consumer hardware, they aren't implemented evenly, and they aren't emulated when not implemented.

  • by Nicolas MONNET ( 4727 ) <nicoaltiva&gmail,com> on Sunday May 14, 2000 @06:58AM (#1073815) Journal
    Wine seems to emulate Direct3D quite OK -- AFAIK, Unreal (not Tournament) for Windows runs on it in Direct3D mode. I guess it should be quite possible to link future Unreal games to WineLib.
  • The basis of the government's case (at least part of it) is that, in their opinion, a web browser cannot be part of an OS.

    That's just plain wrong.

    The basis of the government's case is that, according to US law, a monopoly cannot use their monopoly power to further their own products. It doesn't matter what Microsoft is trying to bundle with Windows; what matters is the fact that they are a monopoly. If you are a monopoly, you have to tread carefully, or your monopoly will be taken away -- forcefully. That's the law, and for good reason.

    Why is that so hard for people to understand?

    I'm glossing over the legal issues, partly because they're complicated and partly because I don't know them all, but please accept my rough interpretation of the case.

    I am sorry, but I cannot accept an interpretation that is fundamentally flawed.
  • Berlin, AFAIK aims at EMULATING X11: because its main purpose is not to implement X11, it will implement X11 on top of another API. That's emulation.

    Personally, I wouldn't call that an emulation, per se. I would call it a X server implemented for the Berlin platform (similar to the XFree port for OS/2). But that is starting to split hairs; if you want to call Berlin's X support "emulation", I won't argue with you.

    WINE, however, allows to run Windows programs on Linux. Linux is not meant to be a Windows implementation.

    True, but I never claimed that Wine made Linux a Windows emulator; I claimed Wine itself is a Windows emulator.

    WINE is Not A [CPU/Machine Code/Virtual Machine] Emulator.

    But it is an OS emulator.

    BTW, "Wine" originally stood for "Windows Emulator". The change to the recursive acronym came later. I suspect the reasons were:

    Emulating -- acting like -- Windows could be interpreted to mean crashing a lot, and they wanted to avoid that.

    A lot of people seem to think that "Windows Emulator" also means "i386 Emulator", which Wine most explicitly is not.

    Recursive acronyms are fun.
  • "Another time, he had some none-too-favourable comments about OpenGL, claiming that Carmack was the only one who could 'afford' to take an ideological stand.

    To me, this seems to imply that Tim Sweeney thinks Carmack can choose The Right Thing because he's got a legion of Quake-Heads who will buy anything he writes. But maybe, just maybe, Carmack has a legion of Quake-Heads who will purchase anything he writes BECAUSE he takes a stand and uses The Right Thing instead of bowing to pressure from other companies.

    IIRC, Carmack was told many times that his ideas for Castle Wolfenstein couldn't possibly be done on a modern PC (486 at the time). Had Carmack decided to go with the flow, I don't think Tim Sweeney would even be listened to. IMO of course.

    Bad Mojo [rps.net]
  • Yep, you heard it here first. OpenGL is losing the 3D API war to Direct3D. Heresy, you say, but hear me out.
    A) We are not talking about D3D 3.0 anymore. 7.0 is here, and it's not your mother's pansy-ass D3D anymore. It's as fast as OpenGL, and blows it away in features and power.
    B) OpenGL used to be safe in the knowledge that, however much support D3D got, it was still king in terms of features. Not anymore. These days there is a host of features that the D3D programmer has direct access to but that are not directly accessible from the OpenGL API.
    C) OpenGL does not have the speed advantage anymore. D3D has evolved to the point where it is just as fast as OpenGL, if not faster (depending on the OpenGL ICD being used.)
    There are many reasons why OpenGL is falling behind, but I'll highlight the main ones.
    1) The OpenGL API is not powerful enough. It does not give you enough low-level access. When I first learned OpenGL, I was shocked to see that I had no control of hardware resources. I was used to DirectX, allocating and deallocating hardware buffers left and right. Nope, OpenGL didn't let me have any fun with that stuff. Even more shocking was that it had no concept of a vertex buffer. Sure, it will let you use the glVertexPointer command to select an array, but graphics cards cannot optimize that properly, because the data can change between calls to glVertexPointer. D3D offers a host of low-level features dealing with buffers and the like to let you wring as much performance as possible out of the hardware.
    2) OpenGL is not extensible. Sure, there are OpenGL extensions, but they are silly compared to the ones in D3D. There are many problems with extensions. First, they take forever to pass as ARB or EXT extensions. Second, even then they are not implemented on all ICDs. Take a look at the MS software renderer: no compiled vertex arrays or multitexturing for you. Third, they are not emulated when they are not implemented. You have to have separate code paths for each combination of extensions you may require. In D3D, I can turn on a new feature confident in the fact that if the feature is not supported in hardware, it will be emulated. Eventually, all hardware will accelerate the feature, and my app will automatically take advantage of it. No such thing for GL apps. The extension problem is going to reach a watershed as the new graphics accelerators come out with all these hardware features. Take the new accelerators that accelerate skinning or keyframing. MS will just make a keyframer COM object, and you can use the functions in that. If hardware-accelerated keyframing is not available, then it will be emulated by D3D. In OpenGL, get ready for glKeyFramerATI and glKeyFramerNV and glKeyFramerS3, ad nauseam -- or no glKeyFramer at all! Then you have more extensions for stuff like texture compression. In D3D, just call some functions on the texture object.
    Despite all this, I still use OpenGL. Why? Because though I prefer the COM-based approach of D3D, and I like the low-level control, and I like advanced rendering features, I hate Windows, and I like BeOS. All that BeOS (and Linux, and QNX, etc.) supports is OpenGL. I think OpenGL needs a major overhaul in version 2. OpenML is coming out to compete with the rest of DirectX, but can OpenGL be updated to compete with Direct3D? The shoe is on the other foot now.
  • Standards are vendor neutral. You can make up whatever definition you want, but that is how the industry uses the word. XML, TCP/IP, OpenGL, these are standards. DirectX is not a standard, because it is tightly coupled with the Windows API.

    Until it is possible to sell a complete Windows clone, nothing Microsoft-specific can be considered a standard.

    I suggest you have an assheadendectomy yourself. In one paragraph you mock having a choice in who implements a standard, and then claim standardization results in boring sameness. Make up your mind.

    Standards are good because they promote interoperability while at the same time encouraging competition. With MS, you're interoperable with nothing, and you're stuck with their mediocre implementation.
  • I've been picking up a little bit of OpenGL, and it seems to me to be an exceedingly well-thought-out, clean, and consistent API. Certainly better than the small amount of Win16/32 programming, and the rather larger amount of DOS programming, I did. Admittedly, I haven't done any DirectX programming; maybe Microsoft has designed an uncharacteristically decent interface, but anyway:

    What is wrong with GL?

    Daniel
  • by Kojo ( 1903 ) on Sunday May 14, 2000 @10:50AM (#1073824) Homepage Journal
    Now, I'm not an industry expert, but Tim Sweeney lost me somewhere with his comments on Evil Avatar [evilavatar.com]...
    What's really holding back PC gaming...it's harder than it should be to make a game work with your 3D card, drivers, etc. That's not really Microsoft's fault; it's a natural result of having a huge ecosystem of independent PC makers, 3D card makers, and sound card makers working together.

    Ok, I agree with that... lots of different people can lead to compatibility problems. This doesn't mean they can't be overcome, but problems can and do occur. But then...

    In all, Microsoft has done a pretty good job of making PC's work. They have forced 3D hardware makers to adopt common standards with DirectX. That is good. They did the same for printer drivers, sound drivers, and PCMCIA. Their implementation isn't perfect, but it's pretty damn good.

    Jigga-What?!? Ok, this is where he loses me. First, isn't MS in trouble for exactly that... forcing software makers to do things that MS wanted them to do? What about choosing standards based on technical merit... you know, robustness, interoperability, that sort of thing... instead of standards based on their ability to tie developers to a single platform?

    What ever happened to Open Standards? You know, like the ones used to build the Internet and make email work? Again, I'm no expert, but they seem to work pretty well. Don't look now, but you're soaking in them! I realize some of the Open Standards may not have developed as quickly or as well as some of the MS 'standards', but how much of that was due to MS marketing muscle?

    I'm not trying to bash MS or Tim Sweeney, but I don't know that dominating a market can be compared to 'setting standards'. If there's something I'm overlooking, someone please fill me in.

  • So you're saying an emulator is any program that exposes APIs -- that all libraries are emulators?

    No, I'm not, and that should be obvious. Check my previous post, and you'll see the formal definition of what I'm saying an emulator is: namely, that which emulates (tries to act like, but is not) an original product.

    Don't be obtuse.
  • Even granting your point that DX is now better than OpenGL (which I really don't), what might OpenGL be today if Microsoft had been willing to play nice and develop OpenGL alongside other companies, like they originally said they would? Remember NT's original OpenGL support?

    It really seems like another case of MS not being happy with the existing open standards and deciding to reinvent the wheel using proprietary technology.

  • by Maïdjeurtam ( 101190 ) on Sunday May 14, 2000 @07:05AM (#1073834) Homepage Journal
    Isn't Linux a matter of choice?

    I can't see why a Linux implementation of OpenGL/Direct3D/Glide/Grits3D or any other 3D API I can think of would hurt anyone.

    If Linux's creed was unification, it would hurt, but Linux is about choice. So if you think OpenGL is the best 3D API, that's OK. And if some folks decide to implement Direct3D too (and they [winehq.com] do!), that's OK as well.

    Stéphane
  • No, I mock the 13-year-old fuckfaces. If you need help understanding, place a <sarcasm></sarcasm> pair around anything that refers to gamez and gamerz. I hate g4m3z d00dZ with unmatched passion. G4m3Z d00dz, 0v3rcl0|<|<3r |<1DD13zz, and general peecee wankers piss me off. I long for a return to the days when the bizarre priesthood carefully guarded access to monumentally complicated, expensive, and inaccessible computers. I wish people could understand and appreciate the design and engineering that go into real computers, and the idea that computers are meant to compute. In short, I think the referenced article points out everything that's wrong with the computer industry today, and I passionately despise the forces at work in the massive dumbing-down of the computers themselves and those who use them.

    Hence the misspeeling. Ditto for the word "peecee."

    I was just trying to be poetic. Geez, nerds. APIs can be faster or slower than others based on how effectively they allow access to hardware and how effectively they manage resources.

    But neither D3D nor OpenGL specify access to hardware or resource management! These are both implementation details, and can be bad or good depending on who wrote the implementation.

    In current implementations, however, the pipeline has been streamlined, the whole thing has been nearly rewritten, and provides much better access to hardware.

    Exactly my point. As implementations improve, speed improves. A few years ago, no one had a particularly fast OpenGL implementation, but as more and more engineering hours have been put into them, now, most vendors have very fast implementations. But the API never changed!

    I can easily (relatively) write an engine that uses every feature in D3D. Now, no matter what graphics card I'm using, my app will run, with D3D smoothing over hardware differences.

    No game developer in their right mind would risk using a feature not implemented in hardware on most cards. Why would you want to fall back to Microsoft's HAL? It's SLOW.

    In this case, OpenGL is almost the same as D3D. Instead of being implemented by the HAL, unsupported features of OpenGL are implemented by the vendor.

    If I use the skinning support in DirectX 8.0, my game will automatically take advantage of both the GeForce 2 and the Radeon. In OpenGL, I would have to use both the NV skinning extension and the ATI skinning extension.

    But what if the user is running on an S3 Virge? Heck, I guess he's out of luck! Time for the slow HAL to kick in! As a game developer, you should know whether or not your neat little trick is actually running in hardware!

    In OpenGL, you have abstract access to color, depth, etc. buffers, vertex buffers, and hardware resources. In D3D, you have direct access to these.

    This is entirely false. Both APIs abstract access to vertex buffers and provide direct access to frame buffers (depth, color). Do you really think that when you fill a vertex buffer in D3D, you are using the vertex format that will eventually get sent to the card? Think again.

    Hell, you even have direct access to what textures are uploaded to the card! None of this glPrioritizeTextures crap.

    What about implementations that don't even load textures into a card's on-board memory? How will your LoadThisTextureToVideoMemory work on a card without local video memory free? You'll get an error and will have to load it into AGP anyway. Why not let the implementation take care of this minor detail?

    Case in point: in D3D, you can render directly to a secondary hardware buffer -- let's see OpenGL do that!

    Read up on auxiliary buffers.

  • I agree. DRI is just appearing, and OpenGL on DRI should provide excellent performance.

    The 3dfx VooDoo 5 6000 just BLOWS AWAY the nVidia GeForce 2, according to early sightings, while Celeron II benchmarks show that Intel's decision to deliberately keep the 66MHz bus (to slow the chip down) was very effective!

    So much for X-Box performance! A 600MHz PIII-derived processor (i.e., Celeron II) and nVidia are already looking outdated before the box is even introduced!

    VooDoo 5 6000 vs GeForce 2 [simhq.com]

    Celeron II benchmarks [ixbt-labs.com]

    But neither D3D nor OpenGL specify access to hardware or resource management! These are both implementation details, and can be bad or good depending on who wrote the implementation.
    >>>
    You cannot take that statement in a vacuum. Both OpenGL and D3D specify a significant number of implementation details, and it is a proven fact (go read some benchmarks circa 1996-7) that OpenGL was faster than D3D. Now take a look at current games that support both OpenGL and D3D. You'll see that they both run at about the same speed.

    No game developer in their right mind would risk using a feature not implemented in hardware on most cards. Why would you want to fall back to Microsoft's HAL? It's SLOW.

    In this case, OpenGL is almost the same as D3D. Instead of being implemented by the HAL, unsupported features of OpenGL are implemented by the vendor.
    >>>>>>>>>
    Take a closer look at my vertex skinning example. No matter what, the app will have to do vertex skinning (it's integral to many engines' animation capabilities). In OpenGL, if you want to take advantage of hardware acceleration for skinning, you write 3 code paths, one for each case (software, nVidia, ATI). In D3D, you write one code path (the D3D case), and D3D will take care of things. If you have to fall back to software, so be it -- you'd have had to do that anyway -- and if you have hardware, you get a nice speed increase. You say that unsupported features of OpenGL are implemented by the vendor. That's the whole problem. In D3D, the core feature set is much larger, and unsupported features are emulated. Many of the newer features don't take a terrible speed hit if they are being emulated. Even if the speed hit IS that bad, D3D still comes out ahead. Take a hypothetical lighting feature that has not yet become an ARB extension. It is accelerated on two or three cards. In OpenGL, you'd have to support the case where HW acceleration is not available, in which case you'd disable the feature; the case where nVidia accelerates it, in which case you'd use the nVidia extension; the case where ATI accelerates it, in which case you'd use the ATI extension; etc. In D3D, you'd have two cases: if D3D reports that the feature is not being hardware accelerated, you disable it; otherwise, leave it enabled. The best thing about this is that it takes advantage of new hardware AUTOMATICALLY.
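
    In code, the D3D side of that argument collapses to something like this. The names here are stand-ins, NOT real DirectX calls -- imagine DeviceAccelerates() as a caps query:

        struct Device;                                   // stand-in device type
        enum Feature { FEATURE_FANCY_LIGHTING };
        bool DeviceAccelerates(Device *dev, Feature f);  // imagined caps query (stub)

        struct Scene { bool fancyLighting; };

        void ConfigureLighting(Device *dev, Scene *s)
        {
            // Two cases total: accelerated -> enable it; emulated-but-too-slow
            // -> disable it. A new card flips the answer automatically, with
            // no new code path.
            s->fancyLighting = DeviceAccelerates(dev, FEATURE_FANCY_LIGHTING);
        }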

    This is entirely false. Both APIs abstract access to vertex buffers and provide direct access to frame buffers (depth, color). Do you really think that when you fill a vertex buffer in D3D, you are using the vertex format that will eventually get sent to the card? Think again.

    But what if the user is running on an S3 Virge? Heck, I guess he's out of luck! Time for the slow HAL to kick in! As a game developer, you should know whether or not your neat little trick is actually running in hardware!
    >>>>>
    If he's running a Virge, he is screwed anyway. Besides that, vertex skinning is done by most apps anyway; hardware just makes it faster. We're not talking about stuff like texture mapping here. Most of the new features in cards can be done adequately in software, just not to the level the developer would like. As such, glossing over these differences in hardware makes sense. Plus, you do know if the feature is being accelerated -- that's what driver caps are for. If it is not accelerated, you can fall back to lower-quality rendering, but even if you don't, the app will still WORK.
    >>>>>>>>>>
    Okay, niggly little point: you do not have direct access to vertex buffers in D3D, but you do have direct control. When you create a vertex buffer, it is a locked object like a screen buffer. The display driver can manage these much better than a program-space array like glVertexPointer uses. (Yes, I know about compiled vertex arrays, but that's an extension.) Second, OpenGL does not give you direct access to buffers. I'm not talking about this glCopyPixels crap, I'm talking about a pointer to video memory. Believe it or not, there are a lot of cool things you can do with that. Under direct access, I also include control. I find it silly that there is nothing in the core OpenGL API like a vertex buffer object. Texture objects exist, and both are buffers that are uploaded to the card, so why the lack of one?
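
    For reference, this is the client-side array pattern I'm complaining about -- plain GL 1.1, nothing exotic. The driver only ever gets a pointer into application memory, which the app is free to rewrite between draws, so it can't safely cache or reformat the data:

        #include <GL/gl.h>

        static const GLfloat verts[] = {
            0.0f, 0.0f, 0.0f,
            1.0f, 0.0f, 0.0f,
            0.0f, 1.0f, 0.0f,
        };

        void DrawTriangle(void)
        {
            glEnableClientState(GL_VERTEX_ARRAY);
            glVertexPointer(3, GL_FLOAT, 0, verts);  // just a pointer, no ownership
            glDrawArrays(GL_TRIANGLES, 0, 3);        // driver must re-read verts[]
            glDisableClientState(GL_VERTEX_ARRAY);
        }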

    What about implementations that don't even load textures into a card's on-board memory? How will your LoadThisTextureToVideoMemory work on a card without local video memory free? You'll get an error and will have to load it into AGP anyway. Why not let the implementation take care of this minor detail?
    >>>>>>>>>>
    If the card doesn't have local video memory free, you still have to load it into AGP memory. Under OpenGL, you have less control of the situation than under D3D. This is not a minor detail; it can get in the way of some nifty rendering styles. Say you are rendering a room with a strip of a certain texture down the middle. A lot of stuff in that area uses that texture, and it is critical that it stays in video memory at that time. However, it is not used for the rest of the room. Do you give it a really high texture priority? That way it stays in memory, but it will eat lots of space while the rest of the room is rendering. You could reprioritize in the middle of the frame, but state changes are expensive. An API should not try to force a way of development on the developer. Why not give more low-level control over stuff like this that can potentially be important?
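
    And here is the only residency knob OpenGL 1.1 actually gives you for that situation -- a [0,1] priority hint that the implementation is free to interpret however it likes:

        #include <GL/gl.h>

        void PinStripTexture(GLuint stripTex)
        {
            const GLclampf high = 1.0f;
            glPrioritizeTextures(1, &stripTex, &high);  // hint: keep this resident
            // There's no way to say "resident for just these triangles";
            // re-prioritizing mid-frame is yet another state change.
        }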

    Read up on auxiliary buffers.
    >>>>>>>>>
    I have, and they are fairly useless. Most hardware implementations of OpenGL can't render accelerated to an auxiliary buffer. This is bad, because it is a pretty nifty effect to render parts of the scene into a texture buffer. It looks really cool when rendering stuff like portals, mirrors, etc. In the future, it could be even more helpful: rendering a 3D movie to a texture buffer and then using that as the source texture would look pretty cool.
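
    The usual OpenGL workaround today is to render the portal/mirror view into the back buffer and then copy it into a texture -- the copy mostly stays on the card, but it's still a copy, not direct render-to-texture:

        #include <GL/gl.h>

        void RenderMirrorToTexture(GLuint tex, int w, int h)
        {
            // ... draw the mirrored scene into the back buffer here ...
            glBindTexture(GL_TEXTURE_2D, tex);
            glCopyTexSubImage2D(GL_TEXTURE_2D, 0,    // mipmap level 0
                                0, 0,                // offset within the texture
                                0, 0, w, h);         // framebuffer region to copy
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        }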
