>the usual rule is that if your cabbie starts talking about it, it's time to get out of the market.
>If your cabbie understands BTC you might have a point. We aren't even close to there yet
The point is that the cabbie likely doesn't understand BTC. When an asset class becomes so popular that people who have no idea how it works are putting money into it and loudly campaigning for others to do the same, that is a sign that you are in a bubble.
It doesn't necessarily mean that the bubble is about to pop. Some bubbles can last decades. See real estate in a variety of locations around the world as an example. However, a lot of dumb money (people who don't understand what they're buying) flowing into a thing because of FOMO can be a predictor that prices are unsustainable.
"When an asset class becomes so popular that people who have no idea how it works are putting money into it and loudly campaigning for others to do the same that is a sign that you are in a bubble."
It absolutely isn't a sign of a bubble. There aren't many people who understand what they are investing in, be it a currency or Google.
The success of the investment has nothing to do with the cabbie's understanding. Taking his advice does.
Have you participated in any bug bounty, responsible disclosure type stuff?
>a gift card phone bot that calls gift card phone lines, bypasses captcha, and transcribes balance.
It probably doesn't, but does that company have a mechanism for people to report security problems?
This is partly about what you do with what you find out or build. If you discover some loophole or vulnerability, report it to the company, and then publish your code publicly once the security problem is fixed, that is very different from finding a problem and directly using it to acquire money.
>because they only solve the problem that the PR submitter needs solved right now.
This is something that can lead to angry interactions between maintainers and pull request raisers. Never mind adding technical debt; pull requests can outright break use cases they don't care about in order to implement the single use case they do care about.
raiser: "Merge my PR. It fixes this issue."
maintainer: "It fixes this single issue but you've broken this for everyone else."
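To make that failure mode concrete, here is a hypothetical sketch in Python (the function and the scenario are invented for illustration): a one-line "fix" that resolves the submitter's issue while silently breaking every other caller.

    # Hypothetical library function. Many existing callers rely on a
    # missing file being treated as "no config, use defaults".
    def read_config(path):
        try:
            with open(path) as f:
                return f.read()
        except FileNotFoundError:
            return ""  # established behaviour: missing file -> empty config

    # The PR: the submitter was bitten by a typo'd path and wants a loud
    # error, so they "fix" the function to raise instead...
    def read_config_after_pr(path):
        with open(path) as f:  # now raises FileNotFoundError
            return f.read()

    # ...which fixes their single issue, but breaks every caller that
    # depended on the old missing-file behaviour.

The diff looks tiny and the submitter's test passes, which is exactly why these changes slip through without a regression suite covering the other callers.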
"Possibly" in that comment refers to the possibility that the GGP comment was referring to the native peoples, rather than the meaning you took from it.
But how much of that land was actually "owned" by the people who actually first settled it? Didn't they have their own disputes and conquests?
Anyway the idea of land "ownership" is kind of stupid in the first place. You can't "own" land, you can just possess it under the rules of some kind of system of power.
100,000 years ago for most of the Old World, 40,000 years ago for Australia, and 6,000 years ago at the maximum for South America (12kya for North America).
The rule of thumb I've picked up somewhere is that for hard cheese you can just cut mouldy bits off. For soft cheese you should throw it out.
This is of course talking about the kind of mould that grows on it while it's in your fridge, not the outer layer some cheeses acquire during production.
IIRC, all cheese has mold. Some of the mold is safe and/or desirable to eat (blue cheese's blue is mold).
But those soft cheeses can grow other mold: If it does this, you have to throw it all out as it could be dangerous. In general, you don't want mold where you didn't expect it.
Sorta. Fresh cheeses, like curds or paneer, don't have significant bacteria or mold colonies, outside whatever natural stuff happens to be on all food. All aged cheeses have bacterial colonies; that's what does the aging, fermenting the milk's sugars into acid and other tasty byproducts. Some cheeses have mold as well, but not all. Usually it's on the surface, and breaks down the milk's proteins, which turns it soft. That's why when you slice into brie or camembert, the outside is the softest part: the white, fuzzy mold on the outside has done its work on the proteins there. Blue cheese has holes poked in it where they insert the blue mold spores, to more evenly colonize the entire cheese instead of just the outside.
Many cheeses actually benefit from acquiring wild mold colonies. There are ways to tell which molds are "good" and which are "bad," which home cheesemakers can learn. Here[1] is a properly aging cheddar, wrapped in bandages and allowed to mold. The mold gives some flavors, but more importantly dries out the cheese's exterior and acts as a guard to prevent nasty bacterial infections from getting into the "good stuff" in the middle.
>I feel like we would have the hindsight to design it better, faster, and cheaper.
Maybe I'm just pessimistic, but I suspect the opposite would occur. It would be buggier, slower, and more expensive once you started trying to operate at any kind of scale. The reason is that hidden within all the accumulated layers of cruft are a million fixes for specific security, reliability, and performance issues discovered through real-world usage.
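As a toy illustration of what that hidden cruft can look like up close (the incidents are invented, and a requests-style HTTP session API is assumed), compare the clean-room version of a helper with the one that survived production:

    import time

    # The version a from-scratch rewrite would produce.
    def naive_fetch(url, session):
        return session.get(url)

    # The version that accumulated fixes from real incidents. Each branch
    # encodes a lesson the naive rewrite would have to relearn the hard way.
    def battle_tested_fetch(url, session, retries=3):
        for attempt in range(retries):
            resp = session.get(url, timeout=10)  # fix: requests that hung forever
            if resp.status_code == 429:          # fix: partner API rate limiting
                time.sleep(2 ** attempt)         #      back off exponentially
                continue
            if resp.status_code >= 500:          # fix: flaky upstream, retry
                continue
            return resp
        raise RuntimeError("giving up on " + url)

Nothing in the second version is visible in a spec; you only find out why each branch exists by deleting it and watching something break.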
That's a pity. Wouldn't it be nice if we somehow could separate the cruft from the craft?
What if some computational deity sifted through our collective codebase and distilled the essence, the minimal, purest kernel? Such a minimangel's task would be daunting and probably useless, since the moment we'd throw in some new use cases and requirements, NIH and patents, POCs and MVPs, we'd be back to the simmering mess in no time.
Those hidden snippets are not the cruft. Every ugly hack we would find is likely a required use case, and the whole mess is the craft. Maybe we should stop calling it a mess and accept that non-trivial software with on-demand requirements always takes this form...
So let's call it detail instead. You cannot separate detail from the craft.
OK, but let's distinguish smart things that were done to work around past blunders from smart things that you would have done anyway.
Both are "craft", but only the latter is "necessary" and would thus survive the "total rewrite" posited in this "thought experiment".
For example, the x86 is full of crap; layer upon layer of smarts has been added to deal with design flaws that were impossible to fix without breaking backward compatibility.
Backward compatibility was practically important because of network effects (and indeed competitors lost the battle because they were not compatible with x86, or even with their own past selves). This practical constraint caused smart people to build smart tricks on top of that foundation.
Sure, it required a lot of smarts to do, etc., but what would you call it when you end up tunnelling a SCSI command inside an ATA packet wrapped in another SCSI command?
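For anyone who wants to see the shape of that layering, here is a rough byte-level sketch in Python. The field layouts are heavily simplified and not spec-accurate, but the opcodes are real: 0x28 is SCSI READ(10), 0xA0 is the ATA PACKET command ATAPI uses to carry a SCSI CDB, and 0xA1 is the SAT ATA PASS-THROUGH(12) CDB that wraps an ATA command back up in SCSI (e.g. for an ATAPI drive behind a USB bridge).

    import struct

    # Innermost layer: a SCSI READ(10) CDB addressed to the ATAPI device.
    # Fields: opcode, flags, LBA, group, transfer length, control.
    inner_scsi_cdb = struct.pack(">BBIBHB", 0x28, 0, 0, 0, 1, 0)

    # Middle layer: ATAPI delivers that SCSI CDB via the ATA PACKET
    # command (0xA0). Register fields omitted for brevity.
    ata_packet = bytes([0xA0]) + inner_scsi_cdb

    # Outer layer: SAT wraps the ATA command in a SCSI ATA PASS-THROUGH(12)
    # CDB (0xA1) so it can cross a SCSI transport.
    outer_scsi_cdb = bytes([0xA1]) + ata_packet

    print(outer_scsi_cdb.hex())  # a SCSI command, in ATA, in SCSI

Each layer made sense on its own terms when it was added; the absurdity only shows when you print the whole stack.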
https://www.mayoclinic.org/healthy-lifestyle/nutrition-and-h...