Alienware Discusses New Video Array Technology For Gamers
Gaming Nexus writes "Over at Gaming Nexus, we've posted an interview with Alienware about their new video array technology, which 'will provide gamers with an expected 50% increase in gaming performance by utilizing two video cards.' The interview covers the creation of the technology, the problems they had in developing it, as well as some more details on how it works." The short version is that it utilizes multiple cards to render one screen, similar to SLI, but with many more features added in as well. What Alienware has developed is a software layer that sits between the video drivers and the application, routing things to where they need to be.
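Very roughly, the idea is something like the sketch below (the class names, the fixed top/bottom split, and the render interface are all invented for illustration; this is not Alienware's documented design):

class VideoArrayShim:
    """Pretends to be one video driver while farming work out to two cards."""
    def __init__(self, gpu_top, gpu_bottom, height):
        self.gpu_top, self.gpu_bottom = gpu_top, gpu_bottom
        self.height = height
        self.split = height // 2          # scanline where the frame is divided

    def draw_frame(self, scene):
        # Both cards receive the full scene; each rasterizes only its band,
        # and the two bands are stitched back together for the single display.
        top = self.gpu_top.render(scene, y0=0, y1=self.split)
        bottom = self.gpu_bottom.render(scene, y0=self.split, y1=self.height)
        return top + bottom

class FakeGPU:
    def render(self, scene, y0, y1):
        return [f"{scene}: line {y}" for y in range(y0, y1)]

frame = VideoArrayShim(FakeGPU(), FakeGPU(), height=8).draw_frame("q3dm17")
assert len(frame) == 8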
AGP Slots (Score:1)
It's PCI-X. (Score:1)
You mean PCI-E! (Score:2, Informative)
No, not that it really matters. And yes, I'm being overly anal.
--LordPixie
Yes yes. (Score:2, Funny)
Re:AGP Slots (Score:3, Informative)
GamingNexus: Was this something that you couldn't do with AGP or had you considered doing something with AGP?
Brian Joyce: We actually had a working prototype with AGP. But as soon as it became clear that PCI Express was going to become the industry standard, we had to start re-working it for PCI Express.
Re:AGP Slots (Score:4, Informative)
Still not sure whether they've patented it or not - hopefully not, so we'll be able to buy these mobos from other vendors and build these rigs ourselves without paying Alienware an extra $1500 for unnecessary services.
Re:AGP Slots (Score:3, Informative)
Re:AGP Slots (Score:2)
I... guess.... (Score:5, Funny)
Dr. Pepper and Potato Chips: $5
Alienware Super Extreme Gaming System: $10,000
Having the "Sorry, I'm broke" Excuse to Avoid Going out on Weekends and Playing Computer Games Instead: Priceless.
There are some things students can't afford. For everyone else, there's Alienware.
Re:I... guess.... (Score:2, Insightful)
Re:I... guess.... (Score:1)
You, my friend, are not a gamer.
Tax Deductible (Score:1)
Re: (Score:1, Redundant)
Multi-cards vs multi-heads (Score:5, Interesting)
Didn't this only last about 1/2 of a "generation" the last two times it was attempted?
"Two?" you say?
Yes, the obvious one is the old Voodoo 1 & 2 cards, but I distinctly remember at least one (I think 2 or 3) company(ies) making cards that used 3 S3 chips (one processing each color) for a performance boost.
They were all "really hot" (popularity, not thermally... well, ok both) for a very short period of time, since the next full generation of chips completely blew them away.
It was silly then, it's silly now.
Now what _I_ want is a triple-headed system that you can play FPS games on with a front view and two side views (peripheral vision, or at least just a wider landscape across 2 or 3 monitors). The hardware is there (well, for dual at least), but do any games support this?
It _can't_ be that off-the-wall, after all, the SPARC version of Doom supported triple-heads way back in version 1.2! (they dropped it after that)
OK, that wasn't *exactly* the same thing... that required a different box for each of the left and right displays, but they acted as a slave so you only operated the center system... it was _extremely_ cool!
Hmmm... I wonder how long it'll be before 16:9 displays are common; the only one I know of is the sweet monster made by Apple that costs as much as a used car!
Re:Multi-cards vs multi-heads (Score:3, Informative)
Re:Multi-cards vs multi-heads (Score:2)
Re:Multi-cards vs multi-heads (Score:2)
Further, Matrox claimed that the Parhelia would have performance competitive with or superior to the ATI and nVidia offerings which were out at the same time. Well, as it turns out, the card shipped late, but even
Re:Multi-cards vs multi-heads (Score:3, Informative)
Go here [matrox.com] and check out the TripleHead Desktop table.
Re:Multi-cards vs multi-heads (Score:3, Insightful)
Plus, it really makes more sense to have the power in one card anyway if you're getting it, so the market for these is a niche one.
Cool software anyway; kudos to them and yadda yadda.
Re:Multi-cards vs multi-heads (Score:1)
Re:Multi-cards vs multi-heads (Score:3, Insightful)
Re:Multi-cards vs multi-heads (Score:2, Informative)
"Video rendering is an inherently parallelizable problem..."
Um, no... not at all.
Think about how much bandwidth is needed just for the CPU of the system to feed the graphics hardware that is doing all the work (AGP 1x, 2x, 4x, now 8x and the new PCI Express, etc.).
Rendering on two boards means _4x_ the traffic over the bus!
Don't believe me? Think about it... your CPU has to feed card A _and_ card B (2x so far).
Then, since you're only displaying on ONE of them, card B has to xfer the rendered display back over
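To make the parent's arithmetic concrete, here is a toy back-of-envelope calculation in Python (every number is a made-up assumption, not a measurement; and, as noted elsewhere in the thread, the real system reportedly recombines the outputs in hardware downstream, which would remove the read-back term entirely):

# Illustrative per-frame numbers only.
scene_mb = 50.0                               # geometry + textures pushed to a card
half_frame_mb = 1280 * 1024 * 4 / 2 / 2**20   # half of a 32-bit 1280x1024 framebuffer

one_card        = scene_mb                    # upload the scene once
two_cards       = 2 * scene_mb                # upload the scene to both cards
two_cards_merge = two_cards + 2 * half_frame_mb  # read card B's half back over the bus,
                                                 # then push it to the displaying card
print(one_card, two_cards, two_cards_merge)   # 50.0 100.0 105.0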
Re:Multi-cards vs multi-heads (Score:1)
There was a picture of the setup at some other site where this system was mentioned a couple weeks ago.
Re:Multi-cards vs multi-heads (Score:2)
First of all, most devices can access other devices and memory without the CPU being involved. This is what DMA and its ilk are for. Secondly (and I don't know if this is possible or not), two devices on a bus could perhaps both be written to at the same time, since they are both listening to the bus at all times.
Anyway, the realities of implementing a sol
Re:Multi-cards vs multi-heads (Score:2)
"First of all, most devices can access other devices and memory without the CPU being involved. This is what DMA and its ilk are for."
Sorry. While it is true that DMA allows one device to talk to another device without the assistance of the CPU, it still requires the transfer take place over the bus that the devices are plugged into.
If the bus is a bottleneck with one card, two makes things much worse.
"Secondly, and I don't know if this is possible or not, but it's possible that two devices on a bus co
Re:Multi-cards vs multi-heads (Score:1)
To the best of my knowledge 3d rendering is all about matrix operations. Matrix operations are inherently "parallelizable". Period.
What the heck does the bus have to do with any of it? Yes, there are going to be bottlenecks, but that doesn't mean that the original problem can't be easily broken down into discrete chunks. Inter-operation communication is an issue to be sure, but it isn't the driving force behind whether or not an algorithm can be processed in a par
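The parent's point about matrices fits in a few lines of Python with NumPy (a toy example, obviously, not how a GPU driver actually works): each vertex depends only on itself and the transform, so the work splits cleanly in half with no communication between the halves.

import numpy as np

def transform(vertices, matrix):
    # vertices: (N, 4) homogeneous coordinates; matrix: a 4x4 transform
    return vertices @ matrix.T

verts = np.random.rand(100000, 4)
mvp = np.eye(4)                                   # stand-in model-view-projection

half = len(verts) // 2                            # "card A" gets the first half...
a = transform(verts[:half], mvp)
b = transform(verts[half:], mvp)                  # ..."card B" gets the second half

assert np.allclose(np.vstack([a, b]), transform(verts, mvp))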
Re:Multi-cards vs multi-heads (Score:2)
"What the heck does the bus have to do with any of it?"
Try unplugging your AGP card and see how well it works.
If the bus were so unimportant, why the heavy focus on the BUS since the original VGA card?
ISA->EISA->VLB->PCI->AGP->AGP2x->AGP4x->AGP8x->PCI Express...
If you would actually READ what I posted, you'd understand that first you have to TRANSFER the data to be rendered, then you have to TRANSFER the rendered image BACK so you can TRANSFER it to the other card again.
OK, so anothe
Re:Multi-cards vs multi-heads (Score:1)
Re:Multi-cards vs multi-heads (Score:2)
"After a card has rendered a frame/scene, there is no data that needs to be return; it has performed its job."
As I stated in at least 2 postings, if the video is not recombined with extra hardware downstream of the rendering (which as one poster indicated it is, making this a moot point), then the rendered data (which is drastically larger than the pre-rendered data) must be merged with the other card's rendered data in order for it to be displayed. This would require a data transfer over the only medi
Re:Multi-cards vs multi-heads (Score:1)
ugh (Score:2)
Re:ugh (Score:2)
PC games are not lagging because of hardware. There are other factors, like price for instance (consoles are much cheaper for the same hardware), playing on a TV, controllers, less console warez... etc.
But it certainly is not due to the technology, since it's the same.
Re:ugh (Score:1)
Re:ugh (Score:1)
Unreal 3 is not out yet, and next-generation consoles will be using next-generation graphics hardware as well, so there is a good chance that the new Unreal engine will run on consoles too.
I'm no
Re:ugh (Score:1)
Re:ugh (Score:2)
Console games are simple fun five-minuters for playing on the sofa with your mates. They have neither the depth nor the eye candy of modern "PC" games. Sure they have lots of antialiasing, and are fairly smooth - but hell: they'd better be at TV resolution! If you saw that on your PC's monitor you'd ask for your money back
Using multiple cards is a way of getting a "sneak preview", if you like, of what the mainstream tech will do for y
Re:ugh (Score:1)
Re:ugh (Score:1)
Correction (Score:4, Informative)
In reality SLI stands for Scan Line Interleave.
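For anyone fuzzy on the distinction, here is a toy illustration of which card owns which scanlines under the two schemes (the code is just for show, not anyone's actual driver logic):

height = 8

# 3dfx Scan Line Interleave: the cards alternate every other scanline.
sli = ["card A" if y % 2 == 0 else "card B" for y in range(height)]

# Split-frame (what Alienware describes): contiguous top and bottom bands,
# with the split point movable so the load can be balanced.
split = 5
split_frame = ["card A" if y < split else "card B" for y in range(height)]

print(sli)
print(split_frame)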
Comparing SLI (Score:1)
Murphy's law here.... too many things can go wrong SLI-ing ATI and Nvidia cards, more than any forums can handle I am sure. Christ, the PC gaming industry has already shot itself in the foot with years of driver problems.
Re:Comparing SLI (Score:2)
How I assume it's going to work: both cards keep the same geometry/texture/whatever information in RAM. They both try and render the same scene, only the software tells each to render only one half of the screen by "blue-screening" -- defining
Re:Comparing SLI (Score:2)
I must admit though, your proposed solution is elegant in its simplicity.
Re:Comparing SLI (Score:2)
This allows you to render at odd resolution *without* having to mess with transformation matrices.
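Presumably that works out to something like the following (the state layout here is invented for illustration, not any real driver interface): both cards keep the identical full-resolution viewport and projection, and only the scissor band differs, so moving the split line never touches a matrix.

def half_screen_state(width, height, y0, y1, projection):
    # Same viewport and projection on both cards; only the scissor band differs.
    return {
        "viewport": (0, 0, width, height),
        "projection": projection,
        "scissor": (0, y0, width, y1 - y0),
    }

top    = half_screen_state(1280, 1024, 0, 600, "mvp")     # card A's band
bottom = half_screen_state(1280, 1024, 600, 1024, "mvp")  # card B's band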
Re:Comparing SLI (Score:1)
The compositor card is just a video switch. At the horizontal retrace interval where the subdivision is, it switches from one card's output to the other's. It's probably set up to count retrace intervals and do the switching itself, rather than being interrupt-driven. (I don't think horizontal retrace interrupts are supported by most video cards.)
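If it really is just a scanline-counting mux, the logic is about this simple (a toy software model of the hypothetical hardware, nothing more):

def composite(lines_a, lines_b, split_line):
    # Forward card A's output above the split and card B's below it,
    # flipping the "switch" when the scanline counter reaches split_line.
    frame = []
    for y, (line_a, line_b) in enumerate(zip(lines_a, lines_b)):
        frame.append(line_a if y < split_line else line_b)
    return frame

print(composite(["A"] * 8, ["B"] * 8, split_line=5))   # A A A A A B B B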
Alienware NEEDS a new Director of Marketing! (Score:5, Interesting)
Of course any "hardcore gamer" knows about the history of their "patents pending technology" as their Director of Marketing calls it. Too bad he doesn't.
In the article, this guy says: "SLI stood for Scan Line interface where each card drew every other line of the frame and my understanding was that the major challenge was to keep the image in sync. If one line's longer than another, then tearing, artifacts, and keeping the two cards in sync was a real issue. The benefits of doing it half and half is we can take advantage of the load balancing and the synchronization challenge can be overcome."
Alright... I'm sure the technology they've developed over there is some hot fscking shit. I'm sure they have a top R&D team that knows what they're doing && this custom motherboard + pre-driver thing is a good idea. Once developed fully, it could let you keep adding as many video cards as your case can hold, even potentially from different manufacturers, to improve total rendering capacity. That is great. Alienware has some very talented people to solve all the associated problems with accomplishing this. I respect their achievement.
That said, what the hell do they have a Director of Marketing for who doesn't know what he's talking about? He gets the SLI acronym wrong. How the fsck could one scan line be longer than the other resulting in tearing or cards getting out-of-sync? Come on! I know he's not a technical guy but then he should just stick to his hype buzzwords && patents && shit like that because he totally ruins Alienware's credibility when he shows no understanding of the most prominent attempt at this type of endeavor in the past. At least he said "my understanding" in there but he should've said "I don't know or understand the history so I'll just talk about what I do know."
Although I hold Alienware in high regard for making really fast gaming computers (that are arguably worth the premium price if you can't be bothered to build your own), I lose substantial respect for them when they allow their cool new technology to be represented by a marketing turd who couldn't be bothered to understand the history of what his company has done or what he's talking about. Buy a clue if you care to succeed. I want to like Alienware... I really do. TTFN.
-Pip
Re:Alienware NEEDS a new Director of Marketing! (Score:1, Insightful)
Re:Alienware NEEDS a new Director of Marketing! (Score:1)
-Pip
Re:Alienware NEEDS a new Director of Marketing! (Score:1)
Screen-space subdivision (ala Alienware) can subdivide the geometry work as well as the rasterization work. However, there will still be a lot of geometry work being duplicated, since you don't know where a polyg
Re:Alienware NEEDS a new Director of Marketing! (Score:2, Insightful)
Although I hold Alienware in high regard for making really fast gaming computers (that are arguably worth the premium price if you can't be bothered to build your own), I lose substantial respect for them when they allow their cool new technology to be represented by a marketing turd who couldn't be bothered to understand the history of what his company has done or what he's talking about. Buy a clue if you care to succeed. I want to like Alienware... I really do. TTFN.
Try and actually order a system fro
Re:Alienware NEEDS a new Director of Marketing! (Score:1)
As an example, a primary rule of video capture is that you tie yourself to a single timing source. In other words, if you're capturing both video and a
Re:Alienware NEEDS a new Director of Marketing! (Score:1)
Re:Alienware NEEDS a new Director of Marketing! (Score:1)
Re:Alienware NEEDS a new Director of Marketing! (Score:1)
Second, Alienware is not exactly using generic off-the-shelf hardware. Can you find a PCI Express video card on any shelf? I'm sure they've worked with their vendor in order to find out how to gen-lock the cards, in additi
Re:Alienware NEEDS a new Director of Marketing! (Score:1)
Until any of this actually makes it to market, it's all speculation. Perhaps NVIDIA and ATI are going to insist that PCI express cards have connectors for genlocking on even the lower-end gaming video cards. As it stands today, the only current NVIDIA chip that support
Interesting That Alienware (Score:3, Interesting)
Oh, yeah, right. They are.
I mean, come on, with the kinda influence they have - they asked ATI and nVidia for custom cards for the Area51m - is it any real surprise they are attempting to make themselves even better?
I suppose that the fact there are a number of other producers in this niche - see the earlier Slashdot story - might encourage the development. But the simple fact remains: they are on top, and if they can lock down this intellectual property till 2nd gen, then they can release it publicly and become innovators in more than just overclocking and cool case mods.
MMMmmmm...Cool case mods.
It kinda does apply here... (Score:4, Funny)
Re:It kinda does apply here... (Score:3, Interesting)
Re:It kinda does apply here... (Score:1)
http://www.cs.princeton.edu/~rudro/sketch00.pdf
Hardware's old news, but availability is not (Score:3)
The thing that's new about this implementation is that you won't have to run out and drop $40,000 on the base Onyx4 if you have an application that needs this (to some extent - SGI's solution will go to 16 cards, with the bandwidth to drive them all, while Alienware's is currently limited to 2). Only $4000 for the Alienware box.
Somehow I doubt that Alienware will get the patents that are 'pending' - I'd imagine that SGI probably already has a whole portfolio covering this area, since this kind of thing is their bread and butter. It's nice, though, to see a consumer-affordable implementation of this technology coming to market.
Re:Hardware's old news, but availability is not (Score:3, Funny)
Yes, because nobody is granted patents when there is a lot of prior art [yahoo.com].
Re:Hardware's old news, but availability is not (Score:2)
Make your own video card? (Score:1, Funny)
Or can I finally put to use those old ISA video cards I used to have in my 386?
Load balancing? Not in their demo. (Score:5, Interesting)
In other words, in their examples, which used Quake 3, there was NO load balancing going on. If there were, then when we saw, for example, the top half of the screen, the size of that half should have been constantly changing.
I understand fully that we were seeing alpha or beta level stuff here, but perhaps they should have waited until they had a fully functional model before showing it off.
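For what it's worth, here is one plausible way the balancing could work once it is hooked up (purely a guess on my part, not anything Alienware has described): time how long each card took on its half of the last frame, and nudge the split line toward the card that finished first.

def rebalance(split, time_top, time_bottom, height, step=8):
    # Shrink whichever region took longer to render, a few scanlines per frame.
    if time_top > time_bottom:
        split = max(step, split - step)
    elif time_bottom > time_top:
        split = min(height - step, split + step)
    return split

split = 512
split = rebalance(split, time_top=9.0, time_bottom=7.0, height=1024)   # -> 504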
Re:Load balancing? Not in their demo. (Score:2)
I've personally only seen one computer that cared if a monitor was plugged in, and that was all custom hardware.
Re:Load balancing? Not in their demo. (Score:2)
Instead we saw a fixed size, which indicates the card was always rendering the same size, meaning NO load balancing was being done.
Re:Load balancing? Not in their demo. (Score:2)
Or that the amount of data being fed to the two cards to crunch was staying roughly the same for the two seconds of available grainy video from Tech TV. Geeze, for a multi-thousand dollar system which requires an 800 watt power supply and two top-of-the-line graphics cards for a 50% increase in performance, the best complaint you can come up with is that the preview
Re:Load balancing? Not in their demo. (Score:3, Interesting)
I didn't watch it on TechTV either, as I don't get TechTV. I watched it on the net, and the quality wasn't that poor, and there was significantly more than 2 seconds available.
Who said anything about upgrades every 6 months? Up until now (with the release of the Radeon X800 and GeForce 6800), there hasn't been a single videocard that has dramatically impro
Not quite practical (Score:4, Insightful)
Re:Not quite practical (Score:1)
Also, a 4-way screen-subdivision system will t
Wow, another way to charge us. (Score:1)
2 16x PCIe Slots.... (Score:2, Funny)
Alienware on the downside (Score:1)