
I was going to get a 4090 this year, but I just don't think 24GB of VRAM is going to be enough in the short-term future (for AI-related stuff).

Ended up getting a 3060 for 1/4 of the cost, and I'm planning on using/paying for Colab until some 6090 comes out with 128GB of VRAM.

Something that kind of bothers me and that I don't understand: why is there such an obsession about having small computers/servers? I don't care if I have to put my computer in the basement because it's the size of a closet or two. Maybe there is an EE issue, like too much current or EMF, if you make a large computer.

I'd usually use Colab for 99% of the jobs... but I admittedly want to do some NSFW stuff with me and the wife with AI art.



I agree we need more VRAM. I wish someone would combine a GTX 960-level processor made on modern lithography (to cut power draw) with 32GB of low-power GDDR VRAM and try to fit it in a single slot. 1x DP connector would be fine.

I agree about small computers. The rich folks in New York City and Silicon Valley set the trends, and real estate is expensive there. Also, graphic arts people make advertisements, and they want PCs to look cool regardless of their power. If you want a good, big system, take a look at used Xeon-based workstations. Sometimes a used HP Z840 will sell for as little as $400 with an 1125-watt power supply.


They made a two-slot, 24GB RAM card with the same chip as the GTX 980 Ti, called the Quadro M6000.

They're still around US$650 on eBay though, unlike the 12GB variant of that card (much, much cheaper). Random example:

https://www.ebay.com.au/itm/255069557859

---

Digging around a bit more, it looks like there's a "Tesla M40" card of the same generation with 24GB of RAM. There seem to be quite a few of those on eBay for around US$150-200.


As far as I know, the Tesla M40 is server "only" (it doesn't have sufficient cooling for a consumer desktop).


Yeah, that's a good point. Did a bit of looking into them, and people do use them in desktops but need to add some cooling.

Some places now sell them with a 3D-printed cooling shroud and fan already assembled:

https://www.ebay.com.au/itm/125971572492

If I had the need for something with 24GB of RAM (even if slow), I'd probably get one of those. It's a fraction of the cost of the alternatives and is still supported by CUDA.
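If you do go that route, it's easy to sanity-check that an older card like the M40 is actually visible and usable. A minimal sketch with PyTorch (this assumes a CUDA-enabled PyTorch install and that the M40 shows up as device 0):

    # Illustrative check: print what PyTorch sees for GPU 0.
    # A 24GB Tesla M40 should report roughly 24GB and compute capability 5.2.
    import torch

    if torch.cuda.is_available():
        p = torch.cuda.get_device_properties(0)
        print(f"{p.name}: {p.total_memory // 2**20} MiB, compute capability {p.major}.{p.minor}")
    else:
        print("No CUDA device visible to PyTorch")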


Dell T7810 or 7820 are also good candidates


> Why is there such an obsession about having small computers/servers?

Space. Most people don't have room for a closet-sized computer, or if they do, they can't justify using the space that way instead of for something else. There's also the cost of materials for a case: to sell one, you generally need to deal properly with EMI.

Most of the cost is in the RAM and CPUs/GPUs, and beyond a certain point you simply don't need a bigger unit to fit them in. If you want to build up more compute power, the usual way is to put a rack in that cabinet's worth of space and install separate machines in it (that generally means low-profile, high-density kit with cooling solutions that are not optimised for quietness, but I'm sure water-cooled options are available too).


I did go with the 4090 and so far don't regret it. It has forced me to learn some memory-reducing tricks when I want to fine-tune SD, but it's been fun to see what can be done.
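To give an idea of the kind of tricks I mean, here's a rough sketch using Hugging Face diffusers and bitsandbytes (the checkpoint id, dummy tensors, and hyperparameters are placeholders, not my exact setup): gradient checkpointing, an 8-bit optimizer, and fp16 mixed precision together cut a big chunk off peak VRAM.

    # Common VRAM-saving knobs for fine-tuning SD (illustrative placeholders throughout).
    import torch
    import bitsandbytes as bnb
    from diffusers import UNet2DConditionModel

    unet = UNet2DConditionModel.from_pretrained(
        "runwayml/stable-diffusion-v1-5", subfolder="unet"  # placeholder checkpoint
    )
    unet.enable_gradient_checkpointing()  # recompute activations in backward instead of storing them
    unet.train().to("cuda")

    # 8-bit Adam keeps optimizer state quantized, cutting optimizer memory roughly 4x
    optimizer = bnb.optim.AdamW8bit(unet.parameters(), lr=1e-5)

    scaler = torch.cuda.amp.GradScaler()  # gradient scaling for fp16 stability
    with torch.cuda.amp.autocast(dtype=torch.float16):
        latents = torch.randn(1, 4, 64, 64, device="cuda")       # dummy noisy latents
        timesteps = torch.randint(0, 1000, (1,), device="cuda")  # dummy timesteps
        text_emb = torch.randn(1, 77, 768, device="cuda")        # dummy text encoder output
        pred = unet(latents, timesteps, encoder_hidden_states=text_emb).sample
        loss = pred.float().pow(2).mean()                        # stand-in loss

    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()

LoRA-style training (updating only small adapter weights) goes further still, which is how people get fine-tuning onto much smaller cards.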

While it may not be enough VRAM for the future, I'm sort of taking the opposite stance: I'm only going to be playing with AI that I can build on my own machine using (high end) consumer hardware.

People have been creating amazing work with only SD 1.4 and 1.5. If this is the limit of what I can do with SD for the next few years, so be it; there's more than enough to play with, and I'm personally not willing to fully cede the future of AI to large corps. There need to be hackers in the space pushing the limits of what can be done (without worrying about copyright, TOS, etc.).

Maybe the best stuff will be locked behind a server, but I'm much more interested in the creative possibilities of running projects (nsfw and otherwise) on my personal hardware.

That said, I think it's great to extend that logic to less pricey pieces of hardware (i.e. I'm only interested in what I can build at home on 8GB of VRAM, etc.).


I think you only need 12GB to train a DreamBooth model? I trained some on Colab over the Christmas break, but just being able to run that locally would open a lot of doors. Using Colab is just kind of a pain.


> Why is there such an obsession about having small computers/servers

More likely just cost. If a "small" computer costs $2k, how much do you think a "big" one will cost? Probably $10k+, and what is the market for that?

And you can buy a rack for your home and put 10 servers in it if you really want a closet-sized computer.



You'd expect manufacturing costs to go down.


Most of the cost is in CPU/GPU/RAM.


Ahh, most of the cost is RAM, then GPU, then CPU.

Look at a Dell or HP server configurator website to see why.

If you "only" need GPU, a Craigslist search should find a less expensive two- or three-year-old server to plug your latest GPU into.


Ahh, most of the cost is CPU. Register memory is so expensive that you have L1 and L2 caches, each further away and cheaper. RAM is the cheapest, which is why you can have so much of it. Try having 16GB of register memory. UMA is a step towards that, but not quite.
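The latency side of that hierarchy is easy to see even from Python. A rough illustration (array sizes are arbitrary and NumPy is assumed): summing the same number of floats through a large stride drags in a full cache line per element, while a contiguous pass uses every byte it fetches.

    # Same element count, very different memory behaviour (illustrative only).
    import numpy as np
    import timeit

    a = np.ones(64_000_000, dtype=np.float32)  # ~256 MB, far larger than typical caches

    contiguous = timeit.timeit(lambda: a[:4_000_000].sum(), number=20)  # ~16 MB actually read
    strided = timeit.timeit(lambda: a[::16].sum(), number=20)           # ~256 MB of cache lines pulled in
    print(f"contiguous: {contiguous:.3f}s   strided: {strided:.3f}s")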


Yes, so if your CPU didn't need to be the size of a box of cards and your GPU didn't need to be the size of a library book, the manufacturing costs would go down.


It gets harder to build big things with small transistors, not easier, due to yield.

Also, if you want to use big transistors, power dissipation means you have to stay small or you will start fires.


This is why I like the "open case" design. Your "PC" is just a frame to hang components on, with nothing to enclose it. And the nice thing is that these days people care a lot about quiet computers, so you can have an open-frame case with three GPUs that just makes a not-unpleasant whooshing sound at full bore.


This exposes the components to more dust and potentially static shocks as a result. Cases are also designed to encourage airflow over specific components, so being case-free might not result in improved thermals.

But it is certainly more convenient for tweaking your build.


That's vastly overblown. I'd be more worried about pets or just accidents knocking some things off.


I highly doubt open air is going to drastically reduce component life. I don't see how open air is anywhere near as bad for dust build-up as some of the negative-pressure, unfiltered builds typical of OEMs. I don't even think dust causes "static shocks" in computers; primarily it just clogs ventilation, leading to overheating.

Open air certainly is a choice, and cases do function to guide airflow better when designed and built correctly.


I doubt it as well - for the most part - but if you're buying high-end parts, I think it's hard to argue that it doesn't expose (literally) the components to more risk.

My understanding of the association between dust and static shocks is that friction between dust particles creates static charges.


I have a wall-mounted open-design case (Thermaltake Core P5) - dust has not been an issue at all. In fact, getting rid of dust is a lot easier, since I don't have to open up the case to clean it out. I just use a handheld blower to blow the dust out from the sides every so often. Regardless, for some reason it has far less dust pile-up than a closed case. I assume that's because in closed cases dust has no room to move out as easily.


Most cases are not designed to do any of that stuff particularly well. Case designers simply cannot predict what people are going to put in them, so any sort of detailed modeling isn't useful.

Most cases rely on moving a bunch of air. As long as you move enough air, you cool well enough.


It reminds me of those sci-fi villains with an exposed brain sticking out of the top of their head. You'd think it would be terribly vulnerable, and it is, but the cooling and dust-management advantages make it worth it.


I think it has a lot to do with living situation. Most of us don't have basements or many square feet to spare anywhere.

I built a system on one of these open frames back in 2015 when I was living in New Mexico and kept it in my grad student office: https://www.amazon.com/gp/product/B009RRIP86 . It was great (especially for swapping components), but then I moved. Living in a small apartment here in CA, that thing was pretty annoying. One time, I got egg on the motherboard. Long story, but I projected the first Trump-Hillary debate onto an old bedsheet on the wall, had friends over, and supplied eggs. Let's just say I was surprised at how far an egg can splatter. I did my best to clean it, but I had recurring issues with the RAM slots following that incident. Last week I finally got rid of the whole system after its long and prestigious career.

Also, I kept it under my desk for a while and had a bad habit of sticking my feet in there and causing problems.

So, advantages of a closed case besides small footprint:

- Raw egg does not splatter into the RAM slots

- You can't put your feet in the computer


Most consumer, prosumer, and pro hardware has to have the EE/EMF issues figured out before it can be sold.

There are other things worth looking into when setting up your own gear.

Factoring in electricity costs is important too, long term.

Beyond this, running hardware long term:

- If you run your hardware using a PDU (power distribution unit) that conditions the electricity and keeps it stable, there is much less wear and tear on the electronics, especially transistors and other parts that love to pop.

Lots of gear out there like this: https://www.apc.com/us/en/product/AP7921B/apc-netshelter-swi...

- Cooling without a fan is most certainly only so good, and more dependent on how well the closet or the surrounding environment can cool. It's almost certainly better to use a good case with fans, and add some cabinet fans, for example from AC Infinity:

https://www.amazon.com/s?k=ac+infinity

Is the closet sealed? Maybe cut a hole for venting and run a few small items, like a small stand-up AC that can be temperature-activated, and something to handle the humidity one way or the other, depending on the basement.

For setting up your own space, places like eBay are your friend. Industrial-grade equipment is not made to fail.


> - If you run your hardware using a PDU (power distribution unit) that conditions the electricity and keeps it stable, there is much less wear and tear on the electronics, especially transistors and other parts that love to pop.

> Lots of gear out there like this: https://www.apc.com/us/en/product/AP7921B/apc-netshelter-swi...

Those distribute power. They have no kind of conditioning on board, which you'd know if you read the link you provided. They're just there so you can remotely switch stuff on and off; any protection in a datacenter would be in the electrical cabinet.

And you do want any protection, especially against overvoltage/lightning, to be as far as possible from the device.


In my case, I just realized the last time I used one of these was in a data centre where electricity was conditioned by another piece of equipment upstream from it.

It would be helpful if you provided a link to something that does condition the electricity, since you know about that, instead of dropping the mic. Thanks!


Why wouldn't you use Colab for NSFW? Do you really not trust Google Cloud that much? Or do they have some kind of policy I'm not aware of (I can't imagine how they would enforce this)?


As always, the red carpet is rolled out to welcome all new users into the funhouse, but with time, the carpet is withdrawn, the doors are very slowly bolted shut, and then the house master starts to look under the beds of all the guests who don't fit a particular group.


Yes, I don't trust any offsite stuff. (Remember PRISM?)

My wife and I aren't even that obsessed with nudity and stuff, but given how NSFW stuff comes up in media and politics, I'll keep it offline until nudist colonies are the norm.


You apparently can't use anything to do with the AUTOMATIC1111 webui from Colab. I don't remember the exact restriction, but I hear that even if you do a simple print with the word, it freaks out.


You can with paid Colab. You just can't on the free tier.


Cool, I have the Pro; I haven't tried it myself.


Google Cloud really freaks out when you ask it to do weird stuff with porcupines.

My wife and I just want to do weird stuff with porcupines, damnit!


If you want more VRAM, the Nvidia pro cards are always an option (if you can afford them). The top spec has 48GB of VRAM, and you can get more than one.
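And the software side will happily use more than one card. A hedged sketch (the model id is a placeholder, and it assumes the Hugging Face transformers and accelerate libraries): device_map="auto" shards a big model's layers across whatever GPUs are visible, so two 48GB cards behave roughly like one 96GB pool for weights.

    # Illustrative only: spread a large model across all visible GPUs.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "EleutherAI/gpt-j-6B"  # placeholder checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # halves weight memory
        device_map="auto",          # accelerate places layers across available GPUs
    )

    inputs = tokenizer("Lots of VRAM means", return_tensors="pt").to("cuda:0")
    output = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(output[0], skip_special_tokens=True))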




