A sad day indeed, at least for me. I inevitably end up running some flavor of Linux on almost all my computers sooner or later. ATI's Linux support, compared to nVidia's, has been piss-poor to say the least (certainly in my experience).
I have always recommended and bought nVidia cards for this one reason. In fact, we recently went with nVidia's QuadroFX 4800 cards for 5 new workstations (requiring stereo vision on Linux/macOS). Want to guess the biggest factor why no one on the team even dared consider ATI? ... The (almost always) nightmarish experiences with ATI's low-end 'consumer' cards on their personal Linux machines led to a lack of faith in any ATI product.
Please note: as far as Linux support goes, I am talking about the "official ATI closed source drivers".
The open ATI drivers really are improving. From my own experience, my home desktop with an nVidia 8800GTS can run for about a week before the display starts getting crazy amounts of corruption and KDE grinds to a halt. My work machine, with an AMD HD4650 card, runs smoothly and never has any issues. Support for KMS is also a plus. I don't really do much 3D stuff on either machine, though, so I haven't noticed nVidia's generally superior OpenGL support in a while. Both machines have dual 1920x1080 monitors, and the home machine is a lot beefier than the work one.
As long as you don't need 3D acceleration, they are the best choice. In distributions like Ubuntu, they are built into the default install, so your display will be working perfectly even on the live CD. Also, their multi-monitor support is fantastic.
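To give an idea, with the open drivers a dual-head setup is usually just an xrandr one-liner (a sketch; output names like DVI-0/VGA-0 vary by card, so check "xrandr -q" for what yours reports):

    # list connected outputs and supported modes
    xrandr -q
    # enable both monitors, second one to the right of the first (hypothetical output names)
    xrandr --output DVI-0 --auto --output VGA-0 --auto --right-of DVI-0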
This makes it pretty pointless to have discrete ATI graphics under Linux. I have an integrated ATI graphics card in my desktop and a discrete ATI graphics card in my laptop. They both have the same use cases: they can handle basic desktop functionality (which is a huge step up from fglrx) but not any serious 3D.
No, I have 2 recent cards, both running the radeon driver. From what I understand, the two projects (radeon and radeonhd) have mostly merged, and in the future radeon is the driver to be used for all cards.
Unfortunately, this is not my experience. Just this summer I installed Ubuntu 10.04 desktop on a computer with a relatively new ATI card (it was top of the line a couple of years ago, I believe). While it's fine in general use, there are plenty of really annoying issues. Unfortunately for us, as we were attempting to use it as a media center, there were significant issues with full-screen video, where it would freeze up for 5-10 seconds at a time every minute or two. This is a known issue for the entire series of cards, and it's a driver bug, but it hasn't been fixed in the 1+ year the issue has been open. Then there are other, smaller issues with the card, such as a 5-second delay when maximizing a window if you're using compiz. Not a deal breaker, but still annoying, and there were a few similar bugs like this. After this incredibly sour experience with ATI, I'm not sure I'll recommend them again for a long time. And I haven't even gotten into how long I had to fight with X just to get dual monitors working.
At some arbitrary time in the past, when I was using compiz and things were slow on an ATI card, I used the xorg server from this PPA: https://launchpad.net/~ubuntu-x-swat/+archive/xserver-no-bac... I don't know if that PPA is updated for Lucid, but you could give it a try. If you're using it as an HTPC, can't you disable compiz? I mean, in most cases wouldn't you want to boot directly into mythtv / xbmc? Also, can you specify what card you're using? If it's the x1xxx series or earlier, then you're beyond the support window (the closed source driver is only for newer cards) and you might be better off using the radeonhd driver. I think I should do a write-up on how to set up ATI drivers on Linux the right way. It takes < 5 minutes to get a dual-head display with auto-detection going these days. HINT: aticonfig is your friend.
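To sketch what I mean (options depend on your monitor layout; check the aticonfig man page before copying this blindly):

    # generate a fresh xorg.conf with two independent screens, second one on the right
    sudo aticonfig --initial=dual-head --screen-layout=right
    # or stretch one big desktop horizontally across both monitors
    sudo aticonfig --initial --dtop=horizontal

Restart X afterwards; the Catalyst control centre can handle the fine-tuning from there.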
I think you're in for a surprise if you try the new drivers! On Ubuntu the install is seamless; even otherwise, the installer just works, and ATI has a pretty simple interface for doing colour balancing across multiple displays.
Let me disagree with that. The open drivers don't support 3D, and the closed drivers make X crash at least once a day on my setup and similar ones. I have a laptop that is just one year old. All in all, I've always had a better Linux experience with nVidia.
The open drivers do support 3D. You won't get the same performance as with the closed drivers, but you will get accelerated 3D that's plenty good enough for desktop effects and simple 3D games.
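If you want to verify what you're actually getting, a quick sanity check (assuming glxinfo from the mesa utils is installed):

    # confirm direct rendering is on and see which renderer is in use
    glxinfo | grep -E "direct rendering|OpenGL renderer"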
Are you using some funky compiz plugins? What distro are you using? When did you last update your drivers? What laptop are you using, and what card does it have?
I think that ATI is going in the right direction with their Linux drivers: the newest version supports kernel 2.6.34 and Xorg 1.8 out of the box, whereas past versions needed hacks.
nVidia has closed source drivers too, no? I think your best bet is the Intel stuff. If I recall correctly, they produce actual open source drivers that end up integrated where they need to be, rather than as some binary blob you have to download.
Of course, I don't think the Intel ones are high end, but all I need is browsing and emacs, really.
There have been problems with the latest Intel chips on Linux, and even on well-supported chips I've had trouble with weird resolutions, multi-monitor, etc. I'm currently using an AMD 785G chipset based motherboard, which comes with a Radeon 4200 - the open source drivers for this actually work, including enough 3D support for desktop compositing. It's more stable than the proprietary driver for my old GeForce 8800 (no crashes or corruption so far).
That said, Radeon 5xxx series support is in its infancy.
For best results on Linux, the latest anything is usually best avoided while issues are worked out. It depends, though, and things are improving a lot.
True, although Intel have been very good about this in the past, and to some extent still are - there are often drivers in the kernel and in X.org for IGPs which haven't even been released yet.
I've also run into regressions on older hardware that no one cares to troubleshoot because not enough people are on the setup (the sound card on a 667MHz PowerBook, for example).
As a corollary to this, I am also curious about vitamin D levels in people using UV tanning beds.
Assuming equal carcinogenic potential from sun exposure and from such beds (which is most likely an incorrect assumption), the results should be insightful.
The absolute amount of Vitamin D needed is quite modest, and the risk of cancer is (typically seen as) proportional to sun exposure; hence the recommendation for some - but not much - UV exposure.
Indeed. The D3 form is the one you want. I actually had an endocrinologist prescribe the D2 form. Luckily, my nutritionist has been following the literature for years.
And that incredibly low standard is also how the FDA set the "recommended daily allowance." Similar things are true for most other vitamins and minerals, as I recall.
Also, D2 causes overdose effects pretty easily, but you can take big multiples of the "RDA" in D3 (I think 10,000 IU/day for months has been mentioned as a treatment for vitamin D deficiency) for quite a while without overdosing. For scale, the RDA itself is only on the order of 400-600 IU, so that's roughly a 15-25x multiple.
> you can take big multiples of the "RDA" in D3 (I think 10,000 IU/day for months has been mentioned as a treatment for vitamin D deficiency) for quite a while without overdosing.
That's actually a problem, as people can become accustomed to taking large doses and then not understand why problems start occurring way down the road.
It could work.
They could add a simulator-esque cursor without too much technical trouble.
That said, it shouldn't.
Why would anyone prefer indirect manipulation of a cursor over direct manipulation of the screen elements? If you've got an iPad in a keyboard dock, you're primarily writing and occasionally navigating. I don't see the use case.
Gorilla-arm would only set in if you were primarily navigating, in which case you'd just undock the thing.
The fact that it is closed source should not be a problem unless you are RMS!
um, or unless you use an operating system or architecture for which they don't build a binary, so you can't use it no matter how well it works as advertised.
riak does support automatic sharding and lets you tune your CAP parameters. Getting nodes in and out of a cluster is way easier with riak.
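For example, the replication factor and the default read/write quorums are just bucket properties you can set over HTTP (a sketch against riak's /riak endpoint; the bucket name and values here are made up):

    # replicate objects in 'mybucket' to 3 nodes; require 2 reads / 2 writes by default
    curl -X PUT http://127.0.0.1:8098/riak/mybucket \
      -H "Content-Type: application/json" \
      -d '{"props":{"n_val":3,"r":2,"w":2}}'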
While riak does also support map/reduce, generating something like couchdb's "views" will probably end up being a problem, since listing all the keys of a large bucket (needed for a view over all your data) is a really expensive operation in riak. Also: couchdb supports incremental views, so once a view is defined, only the changed documents need to be reprocessed.
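To make that concrete, a couchdb view is just a design document holding a JS map function; define it once and couchdb maintains the index incrementally (hypothetical db and design doc names):

    # create a design document with a simple map function
    curl -X PUT http://127.0.0.1:5984/mydb/_design/example \
      -H "Content-Type: application/json" \
      -d '{"views":{"by_type":{"map":"function(doc){ emit(doc.type, null); }"}}}'
    # query it; only documents changed since the last query get re-mapped
    curl http://127.0.0.1:5984/mydb/_design/example/_view/by_type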
My main problem with couchdb was that in systems with a lot of changing data, compaction was too slow for me; I was actually changing data faster than the compactor could keep up. Also: you have to trigger compaction yourself by sending an explicit HTTP request.
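For reference, the trigger looks like this (compaction then runs in the background; view indexes are compacted per design doc, names here are made up):

    # compact the database itself
    curl -X POST http://127.0.0.1:5984/mydb/_compact \
      -H "Content-Type: application/json"
    # compact the view index of a given design doc
    curl -X POST http://127.0.0.1:5984/mydb/_compact/example \
      -H "Content-Type: application/json"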
Would you be willing to write up how your writes were outrunning the compactor in a little more detail, in either a blog post or an email to the apache couchdb dev list? We've been talking about possible improvements to the compactor, and having a solid use case that outruns it would help focus that.