We really just need a group like Consumer Reports, Labdoor, or Wirecutter that actually digs into the quality and claims of devices. Very few people are going to buy 30 competing options plus a known-good control and compare and contrast them scientifically, but that's what's needed.
I remember looking around for a radon sensor for way too long. An electronic device, a litmus-like strip test, some analog crystal ball -- hell, even a mail-in lab kit with a bag of carbon -- but they all seemed inaccurate. The few people that "bought two" always showed heavy variance while the majority that bought and got a number *loved it*. Even buying two that agree often just means the products are consistently and predictably wrong, but most products don't even check this box, ffs.
Reviewers that try and apples-to-apples all products by comparing cost, size, advertised features, and ergonomics are doing no better than you or I opening up a package and appraising it. For a while a Google dev was putting up extremely detailed Amazon reviews of USB-C cables based on their purported features vs. reality, and it was wonderful, but tucked away and undiscoverable.
The world needs multiple parties providing unbiased in-depth reviews, but there's very little will to pay for even one group to do something like this. Ironically, paying for information in the information age seems almost comical to today's consumers, and while there's demand for good information, no one's willing to open their wallets. Labdoor moved from subscription-only memberships to "certified products" and affiliate links because everyone wants to know what's in their multivitamin, but no one wants to pay for an in-depth review. We aren't cultivating a supply side that works for us, but against us.
The result of this is shilling and fake reviews running wild. I mean hell, consumers will buy a product knowing full well they might get a shoddy knock-off, and yet it's $5 less and they can get it tomorrow, so they'll roll the dice. Sadly, we don't live in a world where people are intimately concerned with quality and accuracy in their purchases.
For tools and other items that you might get from Home Depot, Project Farm is pretty much the standard now in my book. https://www.youtube.com/c/projectfarm
His video on Water Purifiers measures only one dimension of water purification: TDS. That leaves out a whole world of other aspects to consider when testing water and filtration quality. TDS can give an indication of the cleanliness of the water, but you can only glean from what you can measure, and because a TDS meter relies on electrical conductivity it will not catch all contaminants in the water.
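For context, a TDS meter doesn't see "solids" at all; it estimates them from electrical conductivity. A minimal sketch (the 0.5 conversion factor is an assumed, typical calibration):

```python
# Minimal sketch of what a TDS meter computes: total dissolved solids
# estimated from electrical conductivity via a fixed conversion factor.
# The factor 0.5 is an assumption; real meters vary (roughly 0.5-0.7).
def tds_ppm_from_conductivity(ec_us_per_cm: float, k: float = 0.5) -> float:
    """Estimate TDS (ppm) from conductivity (microsiemens/cm)."""
    return k * ec_us_per_cm

# Dissolved ions (salts, minerals) conduct, so they register.
# Non-ionic contaminants (many solvents, pesticides, microbes) barely
# conduct, so a filter can pass them while the TDS reading looks great.
print(tds_ppm_from_conductivity(500.0))  # -> 250.0 ppm
```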
Also, he does not disclose that his TDS measuring device is actually bundled with one of the devices he is testing. He should have used a third-party tool to verify that it gives an accurate reading, instead of a unit bundled with one of the water filters as a marketing ploy. It casts some doubt on how seriously he takes his other tests. These are the things you miss if you don't actually know any better (I assume the majority of his audience hadn't looked into what makes for good water filtration before watching his video).
Nevertheless, the fact that he is really the only one doing these kinds of tests is enough to give him some sort of super reviewer credibility.
From his podcasts it looks like Linus (Linus Tech Tips) is building his lab up to be a massive consumer-review machine; can't wait for that to happen. But I doubt they'd be able to tackle the massive number of gadgets like USB cables, USB hubs, USB docks, chargers, air-purity sensors, weather stations, etc. Plus, their reviews often aren't in-depth enough for people to make up their own minds on stuff.
Linus has mentioned that part of the intent of LTT Labs/Lab 32 is to not just release video reviews, but to produce comprehensive datasheets on products with some sort of open-access release model. I think the video they did testing HDMI cables[0] is sort of a prototype for that.
Project Farm is poorly controlled, sample-size-one, poorly-designed-but-very-specific "testing." It's entertainment, not actual useful product testing, and a good example of this would be his "let's see how many bolts this impact gun can remove in X amount of time" - each time a bolt is screwed on and screwed off, it polishes the threads, changing the amount of resistance.
It's of slightly better quality than the usual YouTube comparisons.
Channels like his are wildly popular because corporations prefer inexpert entertainers aka "influencers" and "creators" over product/market experts, so they shower the popular ones with free shit to "review."
This is one reason you see e-bike companies heavily courting the tech press; the tech press know fuck-all about what to look for in a bicycle, so for example, they don't notice that the bike has Chinesium bearings that won't survive being ridden in the rain, terrible no-name tires with little flat protection and high rolling resistance, and brakes made of chewing gum that perform much worse than proper offerings from major brands.
This is pretty wildly inaccurate specifically in Project Farm's case. Most "consumable" parts of tests, like bolts, are swapped out from test to test. From what I've seen he's pretty transparent when that's not the case.
He also explicitly buys all products with his own money (some by way of viewers contributing, of course) and products are sourced from what folks in the comments are asking for.
Scientifically, one should test multiple “identical” components for each test and average the results (or do something like a box and whisker plot) to incorporate manufacturing variability.
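As a toy illustration (model names and readings below are made up), even three units per product is enough to report a mean and spread instead of a single-sample winner:

```python
# Toy illustration: summarize repeated tests of several "identical"
# units per product so manufacturing variability shows in the result.
# Model names and readings are invented for demonstration.
import statistics

results = {
    "Model A": [710.0, 695.0, 742.0],  # e.g. breakaway torque, ft-lb
    "Model B": [688.0, 790.0, 651.0],
}

for model, readings in results.items():
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    print(f"{model}: mean={mean:.0f}, stdev={stdev:.0f}, "
          f"range=[{min(readings):.0f}, {max(readings):.0f}]")
```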
But his results pass as authoritative because the bar is so low.
> and a good example of this would be his "let's see how many bolts this impact gun can remove in X amount of time" - each time a bolt is screwed on and screwed off, it polishes the threads, changing the amount of resistance.
He's smart enough to know that...
> Channels like his are wildly popular because corporations prefer inexpert entertainers aka "influencers" and "creators" over product/market experts, so they shower the popular ones with free shit to "review."
> while the majority that bought and got a number loved it
This seems to be Amazon's main unspoken strategy behind promoting positive reviews... many people want to be helpful and provide a review of a product that they purchased. That's great, but the problem is that most people are thoroughly unqualified to review things: "I took it out of the box and turned it on, five stars!"
(And I'm not even going to mention Vine Voice or whatever it is called, these are reviews you should ALWAYS ignore because they are given free products in exchange for ostensibly unbiased but implicitly biased reviews.)
Agreed, and that's if they review the product at all.
Lots of reviewers talk about the delivery, how Amazon Prime bungled the return, or even a completely different product; all completely unrelated to the actual features of the product you'll be buying. Pair that with how Amazon allows companies to put completely different products under the same listing as "variants", which are again abused by companies to transfer reviews from a well-received product onto a poorly received one. Most users just look at the five-star vs. one-star ratio in the end anyway, so an electrician going over a wire gets the same final weight as someone who confusingly put a question in the review box.
I've put up good quality critical reviews too and had them silently removed. Amazon may be an okay place for getting an item, but too many people use it as a discovery and curation vehicle and that's where the manipulation starts and ends. I get better product recommendations looking almost anywhere else than the Amazon search box.
There are lots of "favorites", but one of mine is when someone rates a product highly and says that they ordered it and haven't yet received it, but are very excited to try it. So in other words, they have no idea, but they are so hopeful and excited they already rated it!
I would add that something like this often happens outside online shops too, for devices and household appliances more generally.
My theory is that many people are "proud" to have bought something: either for its novelty, or because they managed to get the whatever-it-is for a lowish price/discount, or, on the contrary, because they already spent an awful amount of money so it must be exceptional, or because they read somewhere that the thing is exceptionally good.
Short of the thing completely not working, they are (in good faith) convinced that they got the best, and talk endlessly about how good it is and how nothing else is as good.
Once upon a time this was typical of self-appointed hi-fi experts, but in more recent years I have seen the same behaviour in many other cases, from vacuum cleaners to TVs.
Both of these are caused by the emails Amazon sends out, where unsuspecting people try to do their best to answer the question or write a review, sometimes thinking they have to.
Yeah. Amazon didn't start the answers section like this. Once Amazon started emailing random purchasers to answer questions about the item they purchased, the quality of the answers went way down.
They'll do more than a discount. They have entire groups dedicated to giving out entirely free products in exchange for five-star reviews. They provide whole catalogs of available products; you put in an order request, then get a confirmation to place the order. Then you buy the item on Amazon and leave your five-star review. After the seller verifies your review, you are sent a full reimbursement, typically through PayPal.
I got invited to join one of these groups from a friend and got tons of free products on Amazon this way. I remember getting a free home security system that was like $500 or $600. I never even took it out of the box, just re-sold it on Facebook Marketplace.
Eventually I left the group because I felt bad leaving good reviews when some of the products were absolute garbage. It's an unfortunate situation for the sellers too, though; people don't buy products on Amazon that don't have reviews, so if a seller can't bootstrap a few good reviews early on, they won't ever start making real sales.
Wait till you see AliExpress reviews. Often it's: "I received the package, yay, it arrived (photo of package). I haven't opened the box or turned it on yet. 5 stars."
To be fair, just getting your order from Ali is a momentous achievement :P
Reviews are all like "Such an improvement compared to my 10-year-old 30-inch 720p plasma TV" or "nice, it has Netflix"!
Not a single review mentions that Android TV takes like 3 seconds to respond to any button press on the remote because it's running an 8-year-old ARMv7 CPU, basically like the first Raspberry Pi.
Since the 1960s we have had "Stiftung Warentest" in Germany, which does unbiased, in-depth product reviews and is a very popular magazine. They review more than 200 products every year, and the tests are carried out by independent external test labs worldwide.
I bought one of WireCutter's prior recommendations, the "Kaiterra Laser Egg+ Chemical", and as far as I could tell it did the "TVOC-to-CO2" lookup-table conversion mentioned in this article, so it would give CO2 readings that were inconsistent with the same-branded CO2 sensor (the "Kaiterra Laser Egg+ CO2"). WireCutter seem to have since updated the recommendation to something with a different type of sensor.
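Presumably the lookup works something like the sketch below: the device never measures CO2, it just maps a VOC reading through a fixed table calibrated on the assumption that exhaling humans are the only VOC source (table values invented for illustration):

```python
# Hypothetical sketch of an "equivalent CO2" (eCO2) estimate derived
# from a TVOC reading via a fixed lookup table. All table values are
# made up; the point is the mechanism, not the exact numbers.
TVOC_TO_ECO2 = [(0, 400), (100, 500), (300, 800), (1000, 1500)]  # (ppb, ppm)

def eco2_from_tvoc(tvoc_ppb: float) -> float:
    """Piecewise-linear lookup, clamped at the table ends."""
    pts = TVOC_TO_ECO2
    if tvoc_ppb <= pts[0][0]:
        return float(pts[0][1])
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if tvoc_ppb <= x1:
            return y0 + (y1 - y0) * (tvoc_ppb - x0) / (x1 - x0)
    return float(pts[-1][1])

# Cooking, cleaning, or fresh paint spike TVOC, so the "CO2" number
# spikes too -- exactly why it disagrees with a real NDIR CO2 sensor.
print(eco2_from_tvoc(200))  # -> 650.0 ppm "CO2" without measuring any CO2
```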
One of the reasons you may be having a problem with radon sensors is that most radon sensors will report results with a resolution in tenths of pCi/L (in the US) while the statistical variance in the range below 4.0 pCi/L (where most houses should be) is +/-25% or more for many common sensors. So you can have two readings which are e.g. 2.4 pCi/L and 1.9 pCi/L, and from the standpoint of a radon measurement specialist, these readings agree!
You might say, well, +/-25% sucks. You wouldn't accept a thermometer or pressure sensor which read +/-25%. And you are right, it does suck, but then try to design a radon sensor with a sensitivity of more than a few counts/hour per pCi/L but which is also affordable... you could spend $10k+ and get an Alphaguard which has a huge sensitivity and thus very little statistical uncertainty, or spend $25 on a charcoal kit which is +/-50% if you are lucky, or any of the options in between. If you buy a detector which does not explicitly list the sensitivity in the specs, you can be sure it absolutely sucks.
Because many detectors are measuring in single counts per hour, you have to measure over many hours (usually 48) just to get enough statistics to make a decent estimate of the radon level. Decent in this case being +/-10 to 25%. Then you add other sources of error like temperature, humidity, air speed, etc. If the sensitivity is absolutely abysmal, you may need to measure for weeks or months to get enough counts to know what your radon level is to +/-10%.
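To put rough numbers on that (the sensitivity and radon level below are illustrative assumptions; the 1/sqrt(N) Poisson behavior is the real constraint):

```python
# Back-of-envelope counting statistics for a consumer radon detector.
# Assumed values: a detector with 1 count/hour per pCi/L sensitivity
# in a home sitting at 2 pCi/L. Poisson statistics: sigma/N = 1/sqrt(N).
import math

sensitivity = 1.0   # counts per hour per pCi/L (assumed)
radon_level = 2.0   # pCi/L (assumed)

for hours in (1, 24, 48, 24 * 30):
    counts = sensitivity * radon_level * hours
    rel_err = 1.0 / math.sqrt(counts)
    print(f"{hours:4d} h -> {counts:5.0f} counts, "
          f"+/-{100 * rel_err:.0f}% statistical uncertainty")
```

With those assumed numbers, 48 hours gets you to about +/-10% from counting statistics alone, before any of the temperature, humidity, or air-speed errors are stacked on top.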
Often the best bang for your buck in radon measurement is to hire someone with a really nice radon measurement system for a short-term measurement. However, if your radon level changes over time, then you get a good snapshot only of what's going on right then. For most homes this isn't a problem, but occasionally you run into a situation like one guy I remember whose well water was radon-saturated, so his radon was fine except when the shower was on.
It is an interesting thought to contemplate how one would disclose lightly radioactive water to a potential future purchaser of the home. I got curious and looked it up, apparently you can purchase an aeration system for $3k-6k to take the radon out of the water.
Unfortunately, there is no business model to support this. People don't want to pay to support unbiased product review sites. And as they say - "if you're not paying for the product, you are the product"
Project Farm has unsponsored empirical comparative reviews of multiple products for the same use. He has affiliate links for every product tested so he makes commission no matter which one you buy. Compare this to a one-at-a-time reviewer where the reviewer is motivated to sell each product as much as possible.
The only perverse incentives are (1) he makes more if you buy the more expensive one and (2) he doesn't make money from free/DIY solutions. (1) would require knowing the market really well and faking tests on video. As for (2), he's demonstrated good faith by including DIY and vintage options.
Was that single topic enough of a reason to throw away all the good content he makes about other things? It's not like his opinion on the value of a vaccine requirement will make him off base in his knowledge of machining and tools.
My dad has garbage political opinions, and is generally what I would call a low information voter who is swayed more by advertising than vote record, but he's still a very talented general contractor who can build anything and knows his stuff.
I pay for Consumer Reports. I used to love Wirecutter, and then I bought some of their recommendations that were garbage.
Consumer Reports sometimes overindexes to things I don't care about ("complicated technology"), but they usually put in the work and break it down well.
That's a new word to me, "overindex". I had to Google it, and that was kind of fun. It seems to come from economics, but nowadays it's a bit of a Microsoft term, maybe?
Here's Raymond Chen in 2017 explaining it [1]:
> To over-index on an attribute means to give that attribute too much prominence in a discussion or analysis.
It's also fun to note that Wiktionary [2] doesn't mention this meaning at all, instead saying it means to index out of bounds in an array, which, given the technical usage sphere of the word, is kind of amusing.
I think they grew too quickly and just hired some people that were "content" makers, not real reviewers; people who weren't spending the time required to actually understand the products, nor following any scientific method.
I wouldn't be surprised if some of their reviewers were getting paid on the side for their recommendations (reading some of the reviews, it seemed like the work might've been freelanced out).
I've seen some signs that they might be getting better... let's hope.
But Consumer Reports has always been great for me.
> Reviewers that try and apples to apples all products by comparing cost, size, advertised features and ergonomics are doing no better than you or I opening up a package and appraising it.
Unless you're buying every product on the market and load-testing them in your garage yourself, some folks do a lot better than that (e.g. Project Farm, or the Torque Test Channel).
But it clearly requires time and passion, and serious rigs for some of the tests (not everybody has a Skidmore lying around).
There are some good reviewers out there, yeah. Sadly, they often run on the creator's passion more than on getting paid by their audience to provide a service. YouTube ad revenue and Amazon affiliate links are keeping these people's lights on more than the "thank you's" of their most ardent "supporters".
Google tightening the belt on YouTube monetization is probably, strangely, the best thing that could have happened to mostly-direct payments to content creators. Patreon and the like have made viewers at least somewhat more willing to pay out to access content. On the flip side, it seems to have also led to an explosion of more biased reviewers pushing their own discount codes and affiliate links to make up the difference.
There's a cultural misalignment somewhere. Journalism still hasn't cracked the nut nor has scientific peer review. Information curation and verification isn't cheap regardless of how much we'd like it to be so.
> The world needs multiple parties providing unbiased in-depth reviews, but there's very little will to pay for even one group to do something like this. Ironically, paying for information in the information age seems almost comical to today's consumers, and while there's demand for good information, no one's willing to open their wallets. Labdoor moved from subscription-only memberships to "certified products" and affiliate links because everyone wants to know what's in their multivitamin, but no one wants to pay for an in-depth review. We aren't cultivating a supply side that works for us, but against us.
I feel like there might be more willingness to pay for this sort of stuff if people knew that they are being straight up lied to. Many people have a sort of vague "yeah probably the company is doing something scummy" attitude, but, at least in my experience, people still mostly trust that their devices do what they explicitly promise to do. Buying a TVOC sensor that doesn't actually measure TVOC at all somehow feels too outrageous to be true.
It's because we used to regulate against the ability to sell something which didn't do the main task it purported to do. Was that not the entire point of the BBB?
Here's one marketer's belief about its value in 2022:
"at this point, I believe the only people who still think the BBB means something are the older generation. And business owners seem to pursue BBB accreditation mainly so they can slap a logo seal on their site." - https://www.blogmarketingacademy.com/better-business-bureau/
Not sure what the BBB has to do with regulation? They are just a private organisation.
(And you are right, that they don't do a good job of what they are ostensibly trying to do.)
> It's because we used to regulate against the ability to sell something which didn't do the main task it purported to do.
If someone lies about what the product can do, can't you just sue them for fraud or so? (Do a class action lawsuit, if the individual damages are too small.)
Even if such a service existed, it wouldn't accomplish much. OEMs are free to change the design and internals of their products at any time and many have had no choice but to do so because of semiconductor shortages.
Oh, of course. You can compare external product numbers, but often these aren't updated when things change. That's not even getting into how buying Levi's 501s from Kmart will get you an inferior product compared to buying them from levi.com, despite being marketed as the same product. Companies will also often abuse their brand and former goodwill to ship subpar products today that look almost like their products of yesteryear, doubly so after an acquisition. I have a friend who is into Rubik's cubes, and when I asked which cube I should buy, he just said:
> Any of them are great, except if they come from Rubik's
I still think being defeatist about this isn't helpful. More information, even if it's not complete and timeless, is good. And if this kind of information proliferated, companies would have a disincentive to update what is loved, works, and isn't broken.
Also, I realize the chip shortage is a real issue and has had genuine effects... still, companies will find any external boogeyman to explain why their products should get weaker, their turnaround times longer, and their sticker prices higher.
You can't forget the "but verify" part of "trust, but verify".
This is literally nonsense: every electronic device sold to consumers in the USA must conform to basic standards, and stay that way. At face value, the reply says "you cannot really rate devices".
The most upsetting thing to me several years ago was that I couldn't find a way to verify a product's UL rating. The sticker on the product can say whatever it wants, but I couldn't find anywhere on UL's website to verify that the sticker was genuine.
You are just describing a company, a billion-dollar company indeed, because I would buy from a company I could trust on these matters. I'd happily pay the extra 10% or 20%, as long as you don't screw me on details, quality, etc.