AMD Multi-Display Tech Has Problems, Potential
EconolineCrush writes "While AMD's Eyefinity multi-display gaming tech is undeniably impressive at first glance, digging deeper reveals key limitations. Some games work well, others not at all, and many are simply better suited to specific screen configurations. A three-way setup looks to be ideal from a compatibility perspective, and given current LCD prices, it's really not all that expensive. But would you take that over a single high-resolution display or a giant HDTV?"
Hell Yes (Score:5, Insightful)
But would you take that over a single high-resolution display or a giant HDTV?"
If I'm sitting at my desk playing, an HDTV at 1080p is going to look absolutely horrible. So will even a ridiculously expensive large format display. Even three low end 20 inch monitors will give a much higher resolution, and much, much higher DPI, than I could get for the same amount of money spent on a single large display.
Gaps between monitors (Score:3, Insightful)
Even three low end 20 inch monitors will give a much higher resolution
With annoying gaps between the screens. Watch you not notice something because it's straddling a gap. But I bet you already realized this because you said three monitors, not two.
Re: (Score:2)
As opposed to not noticing it because your ultra-high-resolution single screen setup didn't provide you with the field of view needed to even render that far into your peripheral vision?
Of course, that assumes the setup changes that (Score:2)
You assume that a very widescreen setup gives you wider vision; that the monitors on each side become side views. That is how flight-sim setups work, but do games support this at all?
I don't think the gaps are that big a deal; cars have them and who cares about that?
Mostly I think this is a case of penis envy: "I can't afford a multi monitor setup, therefore it is stupid."
Of course a single monitor that size would be better, but they just ain't available. You can buy three high quality 1080p monitors plus
Re: (Score:2)
I don't think the gaps are that big a deal, cars have them and who cares about that?
In a car, the combined movement of your head and binocular vision allows you to see what's behind that gap at relatively short distance. That won't work on a display unless you have head-tracking.
Re: (Score:2)
That is primarily what they are using to describe whether or not a game "supports" the ATI multi-monitor technology. Some games keep the field of vision and just increase the resolution (or worse, stretch the image if the monitors are set up in a non-standard aspect ratio). This makes the technology worthless. Of course, people with wide-screen monitors have been dealing with this problem for a while with games like Bioshock.
Personally, I think much of the dissing of this stuff is coming from people
Re: (Score:2, Interesting)
A 1995 edition of Doom 1 supported multiple monitors (attached to multiple workstations). If one could use 3 PCs for a single player, it worked nicely and gave the advantage of side views.
Re: (Score:2)
Not a lot that I'm aware of (I know you could change FOV with the Quake series, and I bet most flight sims will let you), but I'm about to do a 3-monitor Eyefinity setup for iRacing [slashdot.org]. It lets you adjust FOV and enter the bezel width for your monitors, so it can be nice... example [youtube.com]
Re: (Score:3, Insightful)
The bezel gap makes me think this would be better suited for one main monitor and one monitor for menus/inventory/what-have-you, but I can't think of any games that would support something like that.
Having a dedicated map monitor would be amazing for so many games.
Re: (Score:3, Informative)
Doesn't Supreme Commander let you do that?
I haven't seen it myself; I'm just sure I heard it did.
Re: (Score:3, Informative)
Yes, yes it does.
Re: (Score:1)
It's one of the few games that actually make use of additional displays (even though multi monitor setups have become much more popular over the last couple of years).
Re: (Score:2)
Supreme Commander makes pretty decent use of two monitors, but it's the only game I've found where it works.
On the other hand, I've been using multiple monitors for music production and video editing for a long time. Having the ability to drive multiple accelerated displays is a good thing. When doing music, for example, I have the DAW "track" view on one screen, the virtual "mixer" on another screen (controlled by one of several MIDI devices) and on the third screen, w
WoW does this with addons (Score:2)
I've seen pics of the pure game view on one screen and everything else (maps, bags, stats, chat) on the side monitor...
Re:Gaps between monitors (Score:5, Informative)
World in Conflict can put a map on a second monitor.
There are a number of games which do allow changing their field of view, and they work quite well...
http://www.matrox.com/graphics/surroundgaming/en/games/ [matrox.com]
http://www.widescreengamingforum.com/wiki/Essential_Games_List [widescreen...gforum.com]
Re:Gaps between monitors (Score:4, Funny)
+1 funny mod needed for a matrox.com URL that contains the substring 'gaming'
Re: (Score:2)
How come? Matrox has been making a piece of equipment that for quite some time has allowed what AMD multi-display allows only since recently. On all GFX cards, more or less.
Re: (Score:2)
You just don't get it.
That is good, since it's just a splitter for the video signal; it takes very widescreen video from one output of whatever powerful card (or cards - SLI, et al.) you like (out of those which can output such a resolution; simply multiply the typical horizontal by 3) and displays it on three monitors.
This new AMD multi-display tech is essentially the same thing, built into the card (but the Matrox solution is vendor-agnostic and has been on the market for a few years already).
Also, Matrox wasn't a joke with 3D games
Surprisingly not really a problem (Score:2)
With annoying gaps between the screens. Watch you not notice something because it's straddling a gap.
I've been doing a multi-monitor setup for a while. In practice this isn't a problem. Usually you have different items you are working on on different screens. Now and then you'll stretch across multiple monitors, but really, most of the time I prefer 2-3 monitors over one huge one. Normally I have my email/IM/calendar on one monitor, my active work on a second monitor, and a browser or documentation on the third if I have it (usually I have 2). Works really well.
Re: (Score:2)
That's exactly how I used to use a 3 CRT setup back in the day. Code in the middle, debug on the left, reference material/email/chat/misc. on the right.
Re: (Score:3, Insightful)
How do you manage to drive a car with all its "screen bezels"?
Also, when the time comes, I guess it's only contacts for you, not glasses (how do so many people put up with them?)... not that you could even wear glasses after the amputation of the nose that you already performed so it wouldn't irritatingly obstruct your field of vision.
Re: (Score:3, Insightful)
Also, when the time comes, I guess it's only contacts for you. not glasses
With glasses, you can make the world move relative to the rim by moving your head, and your brain uses this to help filter out the rim. Do PC window managers have an analogous feature to nudge all windows?
Re: (Score:2)
And the same is true with the most common of apps that span several monitors - games; your head (camera) moves. Multimonitor with other apps usually works on the basis of "one window per monitor", etc., so there's no issue as far as spanning & window managers are concerned.
Re: (Score:2)
games; your head (camera) moves.
With very few exceptions, such as the light gun game Police 911, the player's head does not control the camera.
Re: (Score:2)
But the player's mind does; there's not much difference.
Or will you suddenly start trying to convince us that first-person view & mouselook don't really feel right?
Re: (Score:2)
More importantly, with glasses frames and car frames in your vision, you can make a minute movement with your head to see what is behind them, and your subconscious does that for you. With a computer screen this reflex does not work.
Re: (Score:2)
Of course it works, just on a slightly different level: constant movements with mouselook, or the scenery moving while you're "in" some vehicle.
Re: (Score:2)
Yup; I wonder if those saying "this can't work" have even tried it...
One day I will set up three projectors; should be even more fun (and no bezels! ;) ). Maybe not that expensive, considering the peripheral ones can probably do with worse parameters (resolution).
Re: (Score:2)
I'm not sure I will have a place at my disposal with the amount of space required; and not that I would want to toy with a proper screen/etc. for back projection... when there's really nothing wrong with front projection?
Re: (Score:2)
However fun this always looks... not that practical, I guess. Not many applications would justify not only the trouble (and additional expense / space consumed), but also actually standing inside. Those uses which do seem nice should be, well, nice enough with 3 screens filling most of your FOV while you're sitting; basically a pimped-up three-monitor thing, requiring much less space, with quick setup in a small room, easy to do with three cheap front projectors.
Re: (Score:2)
they don't seem to realize there are LCDs made specifically without borders for this purpose, and they're not all break-the-bank priced.
Then why don't I see borderless LCD monitors in Best Buy or Office Depot?
macs were not only the first PCs
I'm sorry, but the IBM Personal Computer 5150 was out in 1981, while the Macintosh didn't come out until 1984.
but are still to this day PCs.
Every country has an internal revenue service, but only one has "the IRS". Likewise, "PC" when abbreviated specifically connotes Lenovo-compatible personal computers. (IBM sold its PC business to Lenovo half a decade ago.) Thus a Mac is a personal computer, but it is not a "PC" until you install Boot Camp or a VM because it do
Re: (Score:1)
The things you say don't make sense. 3 20" monitors will not give you "much higher resolution". At best, they'll give equal resolution. Assuming you buy 3 1920x1080 20" monitors and not 3 1680x1050 20" monitors. Because a 1080p monitor's resolution is 1920x1080.
As for looking "absolutely horrible", I suggest you try it before you bash it. I've been using a 37" 1920x1080 LCD as my primary monitor for years and it's freakin' awesome. Hooks up via DVI (with HDCP support). I sit back about 3' from the di
Re: (Score:2)
Perhaps he rotates his screens to get 3240x1920.
Re: (Score:2)
Psst. I'll let you in on a little secret. These TVs you speak of... They're LCD monitors! A 37" 1920x1080 "TV" works exactly the same as a 20" 1920x1080 "monitor". You can hook up as many "TVs" as you have display ports on your computer. Your video card(s) can't tell the difference. But don't tell anyone! It's [apparently] a secret.
Re: (Score:2)
It makes perfect sense. Three 1920x1080 monitors (which are rather cheap nowadays) give you 6 megapixels. You can find single screens with around that amount of pixels...but they will be significantly more expensive, also "per pixel", than three cheap ones.
Re: (Score:2)
Apparently you didn't understand his claim. His claim was that a 20" monitor would provide greater resolution than a 1080p TV. The highest resolution 20" panel you'll find at Best Buy or NewEgg has a resolution of 1920x1080. Exactly the same as a 1080p TV. There also seems to be some confusion about the use of multiple TVs as monitors. They're exactly the same. If you can connect 3 20" 1920x1080 monitors to your computer, you can connect 3 37" (or 52" or 60") 1920x1080" monitors to your computer. Goi
Re: (Score:3, Informative)
Apparently you didn't understand. He said:
Even three low end 20 inch monitors will give a much higher resolution, and much, much higher DPI than I could get for the same amount of money spent on a single large display.
(emphasis mine)
Re: (Score:2)
I tried a Philips 37'' 1920x1080 LCD last year. It was pretty awful. It was hard to keep focus and it looked much worse than my (several years old) monitors.
I'm back to my dual 22'' monitors. Not that I didn't plan to keep a secondary monitor anyway - there is no comparison on the total resolution as well as just putting windows on the secondary monitor. It is awesome for developing software.
I've also bought one of the new AMD cards to try 3 monitors for gaming eventually.
1080P? (Score:4, Funny)
And here I was thinking 1080 lines of vertical resolution should be enough for anybody.
Re: (Score:2)
> And here I was thinking 1080 lines of vertical resolution should be enough for anybody.
Everyone's airing of grievances over the seemingly pervasive march towards fewer vertical pixels than even mid-priced LCD monitors had 3 years ago (ah, the happy days when you could actually buy a laptop with 1920x1440 display) was a different Slashdot topic a few months ago. ;-)
Re: (Score:2)
Please define "HDTV."
I sit before a 24" Asus 1080p "HDTV" (it does have a proper HDMI input, after all, though it's geared toward being a computer monitor). It looks fine.
I also sit before a ~20" 1600x1200 IPS-paneled LCD from NEC, which has a little bit higher DPI, and fantastic viewing angle.
Meanwhile, my 5-year-old Dell laptop has a 15.4" 1920x1200 display. In terms of DPI, it's a dream.
However, I'm not about to take the 52" Samsung 1080p LCD from my living room and put it on my desk, though I do use i
Re: (Score:2)
Even three low end 20 inch monitors will give a much higher resolution
Low end 20 inch monitors seem to be 1600x900 so with three of them you would get just over twice as many pixels as on a 1920x1080 (full HD) screen. It also seems I can get a 2048 x 1152 display for less than the three low end 20 inch monitors.
Effective resolution (Score:3, Insightful)
obviously you've never owned a 30" lcd, 2560x1600 is a wonderful resolution. I've got one of those screens. It blows away anything you can buy in multi monitor.
A 3 monitor setup with 1920x1200 displays gives an effective resolution of 5760x1200. That's roughly 69% more pixels than your 30" 2560x1600 display. Nothing wrong with a huge monitor, but it's not better for every purpose. Personally I find a multiple monitor setup more useful for the way I work. YMMV.
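The pixel counts being compared in this subthread are easy to check directly:

```python
# Pixel counts for the setups discussed above.
triple_1200p = 3 * 1920 * 1200     # three 1920x1200 panels side by side
single_30in  = 2560 * 1600         # one 30" 2560x1600 panel
full_hd      = 1920 * 1080         # a single "full HD" screen

print(triple_1200p)                # 6912000
print(triple_1200p / single_30in)  # 1.6875, i.e. about 69% more pixels
print(triple_1200p / full_hd)      # over 3x a single 1080p display
```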
Re: (Score:2)
Actually, one of these plus a 20" portrait (1200x1600) screen to its left and right is a fair bit more awesome than just the 30" display. Since, if you're doing that, you probably run two video cards anyway: throw in a 1080p projector, just in case of wanting to watch a movie in bed or something. Ten million pixels. Fun!
Hell yeah! (Score:2, Informative)
The problem with a big TV is that it's only 1920x1080 max, with pixels the size of my fingernail. (There's a recent "obligatory" xkcd about this; I'll let some karmawhore who cares dig up the link...)
The problem with single high-resolution displays is that, while they keep the pixel density sane, they stop at only 30" (without going insanely expensive) and 2560x1600. Even these are way pricier than the same number of pixels in two or three smaller monitors.
Multiple monitors (30" if you can afford them, more likely 21-
Me! Me! (Score:2, Informative)
http://xkcd.com/732/ [xkcd.com]
Re: (Score:2)
Sadly this is the new norm: HDTV-sized displays are the new in thing, and that seems to be about where companies have stopped packing pixels.
My 17" Dell laptop is running at 1920x1200, and it's about perfect as far as DPI goes. The 21" monitor next to it only does 1680x1050. I've seen LCDs as large as 24" that only do 1920x1200. Come on, have all the manufacturers just given up at "1080p"?
Re: (Score:2)
Well, there's some hope [wikipedia.org]. Not for quite a few years, though...
Re: (Score:2)
What's worse, they take all those pixels and waste them on 20 pixel wide window borders and giant glossy buttons. I'm fine with accommodating the visually impaired, but I usually want more resolution so I can fit more *useful information* on a screen. I'm glad I switched to linux a long time ago, else I'd be a very sad person these days without at least some level of control on my GUI.
Anyone tried to change the system/menu font size on a mac yet?
Re: (Score:2)
The ATI drivers support Xinerama and RANDR.
In multi-card/X Screen (:0.0) mode, you only have Xinerama - and minimal configurability.
In multi-card/multi-screen mode, you have RANDR per screen + Xinerama extensions for glass layout.
If the above sounds like some obscure language, it is. The X code really only understands 2 heads, with lots of X developers trying to kill multiple-GPU configurations (thanks, Intel). It's not pretty, but the skeleton is mostly there. Look for the comment below http: [slashdot.org]
Whatever (Score:1)
Massive bezel makes them incredibly annoying if not flat out unusable.
Missing from the summary... (Score:5, Interesting)
What's not mentioned in the summary is that, if the game properly supports it, the screens on the right and left of your setup get tilted inwards a little and your field of view is increased by 3X (assuming a 3 display setup). This means that you get all the view you would normally get on the central screen and a massive amount of the peripheral vision that we all enjoy in real life but never get in gaming. Is there a gap from the screen bezels? Sure, but you barely notice it because you don't focus on the left and right wings. You just focus on the central display and use the other two to detect motion you wouldn't have otherwise seen (such as an enemy approaching you from your left).
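A rough sketch of the geometry behind that "3X field of view" claim. The panel width and viewing distance below are made-up example numbers, not figures from the article:

```python
import math

def screen_hfov_deg(panel_width_cm, distance_cm):
    """Horizontal angle (degrees) one screen subtends at the viewer's eye."""
    return math.degrees(2 * math.atan(panel_width_cm / (2 * distance_cm)))

# Example numbers: a ~47 cm wide panel (roughly a 22" 16:10 screen)
# viewed from 60 cm. If the two side screens are angled in toward the
# viewer, each subtends about the same angle as the center one, so the
# angular coverage roughly triples.
center = screen_hfov_deg(47, 60)
total = 3 * center
print(f"one screen: {center:.0f} degrees, three angled screens: {total:.0f}")
```

This is why the side monitors land in your peripheral vision rather than just stretching the same picture wider: the game has to render a genuinely larger FOV to fill them.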
Re: (Score:2)
That also means the periphery monitors don't really have to be of the same "quality" as the main one, lowering costs even more (which only strengthens the point that, in "price per pixel", it's cheaper to buy three average screens than one with a massive resolution).
Also, people don't seem to mind "screen bezels" much when wearing glasses or driving a car; heck, not many cut off their nose so it won't be obstructing their field of view...
Re: (Score:2)
I imagine that in the case of an RTS, the extra screens could be better used as auxiliary screens (such as construction and stats) and data (as they already do with some WoW add-ons and Supreme Commander).
I'm not sure if it was Supreme Commander, but I remember one RTS where you could point to a place and make a mini-screen out of it, so you could keep an extra screen with your base, the enemy's base, etc.
More screen = More resolution = Advantage. (Score:1)
No matter how big your TV, it's almost a certainty that 3 low-res monitors would have more visual real estate. More pixels.
You see more; you literally get a larger field of view.
You have a significant advantage if your resolution is higher and the game supports an enhanced FOV. No TV, no matter how good, and no display, no matter how big, can show you more.
One HDTV screen is too tall (Score:2)
3d helmet. (Score:1)
Re: (Score:2)
Some time ago, there was a university experiment combining a high-resolution (centered) display with a large-surface but low-res projection for peripheral vision. Never heard of it again, so apparently it wasn't that successful. IIRC, the high-res part was fixed in place, though.
Small 1080p displays should be available; there are plenty of LCD projectors. Possibly not at one inch, and probably not too cheap.
Linux users...screwed again (Score:5, Informative)
According to ATI, support for Eyefinity on Linux will be enabled by a 'future Catalyst release'. Three releases of the Catalyst driver have come and gone since I got my Radeon in February, and they still have zero support for Eyefinity on Linux. Which is irritating as hell, because the famed YouTube demo of Eyefinity running a flight sim on 24 screens was a Linux box.
Some days, it really sucks to be a Linux zealot. This is one of them.
Re: (Score:2)
Have their Linux drivers improved since their announcements from a few years ago? I remember they promised the drivers would be improved. From what I read and heard, NVIDIA is still better. Also, what is the status of their open-sourced drivers?
Re:Linux users...screwed again (Score:5, Informative)
Obligatory link - http://www.youtube.com/watch?v=N6Vf8R_gOec [youtube.com]
24 displays done under Linux - in October last year. The drivers were carefully teased into that condition, so the tech is on its way.
Be aware that the RANDR/Xinerama maturity in Linux is weak, so it will take a few years for it to be able to handle >2 heads - note that it's taken almost a decade to get reasonable 2-head support...
Re: (Score:2)
Some days, it really sucks to be a Linux zealot. This is one of them.
It won't make you feel any better, but nVidia is just as lame. I have a GTS 240 and it was unsupported by several driver releases which came out after the card hit the streets. I had to use an old driver release under which it worked, but VDPAU didn't.
No Displayport == No luck (sort of) (Score:3, Informative)
Re: (Score:2)
No; active DP -> VGA converters are much cheaper than active DP -> DVI converters for some reason, but passive adapters will do you no good.
Re: (Score:2)
I recently bought a Radeon HD 5670 for $100. It has Eyefinity tech "with support for up to three displays". "Yeah, so what?" I was thinking... not exactly a new thing. Well, tricking the OS into thinking multiple displays are actually one? NEW! Glad for this ./ article.
PS, card made by XFX and purchased through TigerDirect. Great so far but had a TERRIBLE driver issue with crashing, had to roll back to an older driver.
Re: (Score:2)
Well, tricking the OS into thinking multiple displays are actually one? NEW!
Old, actually. The Matrox TripleHead2Go has been out since... 2007, I believe?
But now it's integrated into affordable videocards, so that's something.
As an Eyefinity card owner... (Score:2, Interesting)
Having 3 x 22" 1680x1050 Dell monitors side by side playing Hawx or WoW or any other game is absolutely stunning.
The Catalyst interface is a bit quirky (profiles do not remember relative screen position, so you have to specify it each time you change profiles), but once you have it set up and get into a game and choose your insane 5040 x 1050 resolution, you will be blown away.
Bezel gap is not as much of a problem as you might think. Your brain kinda just adj
Neat idea but not that useful (Score:2)
There are three major problems you face:
1) Whatever amount you were willing to spend on a monitor, you must now spend 3 times that. It requires 3 monitors of the same resolution, and to look good they need to be the same 3 monitors. That means your budget has to triple. So the argument of "well, there are cheap monitors" doesn't hold weight. If you were happy with a cheap monitor, you probably didn't want to spend much anyhow, and now you need to spend as much as on a more expensive monitor. If you like higher
Re: (Score:2)
No, not really. I have a 5870 and quite like it, but I'm not buying 3 monitors. Desk space is part of it, but money is most of it. I like good monitors, and I'll pay for one. $1000 was not too much to spend on a monitor, same as I spent on my HDTV. I am not going to spend $3000 on monitors though; that is out of my budget. I am also not willing to step down to inferior monitors. Then of course there's the performance issue. I like my games to look good and run smooth. My 5870 can do that with my single display
Re: (Score:2)
Dude, you really need a 5970 minimum for decent triple monitor, and even then you're talking mid-range frame rates with some games ;)
With the attendant PSU, maybe a new case due to size... etc. New desk? Monitor stands? etc.
My mate has a 3x24 inch setup with a 5970 and it is... expensive. AND his FPS is pathetic compared to, say, a 5850 @ 1920x1080.
But fudge, it is frickin' awesome
what's the point (Score:1)
Re: (Score:2)
If you are interested go look up Sensics.
I would go multiple screens (Score:2)
Most of the activities I perform work better with multiple screens simply because I can have applications maximized on separate screens. Whether it be surfing the web, working with spreadsheets, or debugging applications.
As for gaming, a single, large screen would be fun. Add in left & right screens and it's even better.
HDTV with more than 1920x1080p ? (Score:2)
But would you take that over a single high-resolution display or a giant HDTV?
Yes, but point us to a mainstream HDTV that indeed has triple the resolution?
Most screens seem to be stuck at 1920x1080 [slashdot.org] (ob xkcd [xkcd.com]).
As the LCD panel stagnates, an obvious work-around is to glue several of them together. Ergo Eyefinity.
Re: (Score:2)
I never understood that comic. You couldn't get content at that high a resolution outside of owning a 35mm print until we had high-definition distribution mechanisms like Blu-ray. Why shouldn't that be exciting?
Large displays used to be many times the overall size and cost, why shouldn't that be impressive?
I own a Dell U2711 but we still watch movies around the house on my roommate's Epson 8500UB. Resolution isn't the only factor in what makes a good watching experience.
When will we get actual high-res displays instead? (Score:5, Interesting)
An actual high-res monitor would be better than any of these supposedly "HD" screens kludged together using expensive GPUs.
I do have a 22.2" 3840x2400 IPS display (ViewSonic VP2290b), it's from 2003. It's driven by two DVI ports of a regular GeForce 8800GT in my Mac Pro. Additionally, I have two low-res (1920x1200) 24" screens connected to another GPU for video and games.
IBM sold their monitor factory to Sony around the same time they sold their ThinkPad business to Lenovo in 2005.
Since then, the meaning of "HD" has been just 1920x1080, just 22.5% of the resolution these 3840x2400 displays have.
Here's a wikipedia article about them: http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors [wikipedia.org]
Re: (Score:3)
Unfortunately, finding one of these magnificent monitors is damn hard and they still command rather high prices (although nowhere near the original ~$7500 price tag).
Dear monitor manufacturers, I just want a 200+dpi monitor, is that really so hard to understand? 100dpi is stone age technology compared to the massive leaps forward every single other piece of hardware has experienced.
Even the lowly computer mouse has gone from low-res two-button models hindered by the low update speed of the serial port to mo
Re: (Score:2)
Why are we still stuck at 100dpi?
We aren't stuck at 100dpi. The monitor quoted by otuz is about 200 dpi. If you are asking why they aren't more common, and why they are so expensive, that is because they are:
- Difficult to manufacture
- Unsupported by most software
- Pointless for 99% of applications
- Require high-end hardware to even make use of it
You can't compare the DPI of mice to DPI of screens. To increase the resolution of a mouse you don't have to increase the density of the sensors. Creating high resolution LCD screens is not tr
Re: (Score:2)
My point is that the average monitor is still stuck somewhere below 100dpi for no good reason.
Laptops are available with 145+dpi displays, some smartphones have displays in excess of 200dpi and yet the average desktop monitor has only moved from about ~75dpi to less than 100dpi in the last 20 years. Why can't I buy a desktop monitor with the same pixel density display as a 15.4" 1920x1200 Thinkpad?
- Difficult to manufacture
- Unsupported by most software
- Pointless for 99% of applications
- Require high-end hardware to even make use of it
- Somehow the panel manufacturers make it work for laptops etc.
- It's just higher resolution, nothing fancy abo
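For reference, the pixel-density figures thrown around in this thread follow directly from a panel's resolution and diagonal size. A quick sketch (the 24" desktop panel is a made-up comparison point, not one from the thread):

```python
import math

def dpi(h_px, v_px, diagonal_inches):
    """Pixel density: diagonal pixel count divided by diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_inches

# The 15.4" 1920x1200 ThinkPad panel mentioned above vs. the same
# resolution stretched across a hypothetical 24" desktop monitor.
print(round(dpi(1920, 1200, 15.4)))   # ~147 dpi
print(round(dpi(1920, 1200, 24.0)))   # ~94 dpi
```

Same pixel count, very different density: the laptop panel really does sit around the 145+ dpi claimed, while the desktop version lands below 100 dpi.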
Re: (Score:2)
My point is that the average monitor is still stuck somewhere below 100dpi for no good reason.
My point is that 1) they aren't stuck, and 2) there is good reason for them to be behind.
Laptops are available with 145+dpi displays...Why can't I buy a desktop monitor with the same pixel density display as a 15.4" 1920x1200 Thinkpad?
You can. And they are expensive. Just like laptop displays are expensive.
- Somehow the panel manufacturers make it work for laptops etc.
Yes, just like they do for desktops. It's exactly the same - laptop displays are really expensive. This is why.
- It's just higher resolution, nothing fancy about it until you reach the limits of DVI etc.
There's a lot that is fancy about it. Memory usage, bandwidth, cost. Monitors work over more than just DVI connections. Manufacturing yields decrease by the square of the pixel density. So the higher the resolution, the harder it i
Re: (Score:2)
Where are these mythical 150+dpi displays sold? I have yet to see any for sale outside of the bizarro-pricing world of medical displays etc., which goes far beyond what I'm guessing you meant by "expensive".
But it seems you'd rather insult me than offer any genuine insight. It's funny how peo
Re: (Score:2)
But it seems you'd rather insult me than offer any genuine insight.
I didn't insult you because you want high-DPI displays. I insulted you because of your lack of reading comprehension.
Where are these mythical 150+dpi displays sold?
The very first post in this thread links to one. That's what started the discussion.
I have yet to see any for sale outside of the...
Aha! So you even know of them.
I want print-quality text on my monitor and I'm willing to compromise on a lot of other parameters to get it,
Except for price apparently. Every post you've complained that they are expensive. And every reply I've explained why manufacturing high DPI displays is pricier.
Re: (Score:2)
I am willing to compromise on price, but paying 10 or 20 times what I'd pay for a 100dpi display? That's just ridiculous.
Re: (Score:2)
Where are these mythical 150+dpi displays sold?
The very first post in this thread links to one. That's what started the discussion.
These aren't really available anywhere. Every now and then someone sells a single display on eBay, and they still go for thousands apiece.
Re: (Score:2)
Unless you're watching video, the drawbacks you mentioned aren't particularly serious. For working with large amounts of text, pictures, graphs etc. or photo editing, you'd probably never even notice. Besides, the T22x monitors were first introduced in 2001; 9 years of semiconductor development should be able to get us markedly better response times (the 41Hz refresh rate is perfectly fine for anything but gaming or 60fps video).
As you wrote, it's perfect for photo and graphic design work, why hasn't high-re
Re: (Score:2)
The display response time isn't worse than that of other displays from the era. The refresh rate isn't actually too bad; a regular graphics card drives it at 33.8Hz nowadays, and I'd guess the largest limiting factor at the time was the driving electronics (dual-link DVI hadn't been invented yet, etc.).
Sony probably also profits from the technology they bought from IBM by making 200dpi cell phone displays rather than larger displays. After all, most people aren't demanding anything better than "HDTV", because most don't kn
One huge screen (Score:2)
Huge screen with huge resolution is probably the way I'd prefer, and the technology is there, but for some reason ever since HDTV the resolutions themselves have been going backwards. And yes, of course there is an xkcd on the subject: http://xkcd.com/732/ [xkcd.com]
Minor editing nitpick: Does it have problems AND potential, problems OR potential, problems WITH potential, or potential problems? I suspect it has problems BUT potential; this would be so much clearer and less lazy than a comma ;)
Re: (Score:2)
Oh dear, I made a right mess of that last sentence by not using Preview. Serves me right for acting like a grammar nazi; as per usual, it backfired!
So as not to stray off topic, to add to my previous points, there are some very impressive and incredibly immersive looking setups there. The problem I have with it is they put the "window" into Windows. All those frames look terribly distracting. What's to stop any of that being manufactured in a single screen without much more expense than buyin
Re:Three huge screens (Score:1)
Why decide based on either/or? Go for "and": 3 screens, all huge!!!
As far as bezels go, it shouldn't be much more distracting than the A-pillars of your car.
If you want to eliminate them, you can set up projectors and align them.
57.2" is almost too wide (Score:1)
Frame Sync (Score:2)
Didn't RTFA but - requires support from games (Score:2, Interesting)