Seems like more of a productivity killer than an aid, but cats are great marketing. I see no reason not to submit this for YC funding in the next round.
I don't want to criticize Cloudflare; I love what they do and understand the scale of the challenge. But most people don't, and two incidents like this in a month or so are going to hit their reputation.
After being overly critical of Matrix on here the other day, I've come around to another conclusion: talent issues are industry-wide, and it sucks making a bad hire whose competence doesn't match the resume.
I think it's intended as a comparison of cost when building a gaming-capable computer vs. a console of somewhat equivalent power.
It used to be a general rule of thumb that you could build a computer of roughly equivalent power for the cost of a game console, or a little more — now the memory costs more than the whole console.
Thank you for mentioning this. Not knowing the specs of a PS5, I'd assumed that the comparison was made because the PS5 now sold for less than the RAM it contains, and scalpers were now hungrily buying up all available PlayStation inventory just to tear them open and feast on the RAM inside.
But since it's 16 GB, the comparison doesn't really make sense.
It still is a rule of thumb: you don't need DDR5 for a gaming computer, let alone 64 GB of it. A low-end AM4 CPU, 16 GB of DDR4-3600, and a decent GPU will beat a PS5 on both performance and cost. I don't understand why the headline made this strange comparison.
You do have the option to open up Discord voice chats on PS5. Amazing what Discord could do when forced to actually write something efficient.
YouTube also exists as an app, and maybe you can trick the heavily gimped built-in browser into going there as well, although last I checked it wasn't trivial.
I can run the Spotify Electron app and Discord and watch YouTube on my second monitor perfectly fine with 16 GB of DDR3 RAM. When I open my game, I get better FPS than the PS5.
It doesn't help that GPU prices have also generally gone up over the past decade: there's more of a market for them beyond gaming, they benefit from being hugely parallel (the larger you can make them, the better), and fabrication costs are shooting up. I think there was a GamersNexus video at the launch of one of the previous GPU generations that noted a move away from "more for your money" each generation towards "more for more", i.e. keeping the value roughly static and charging more for a more capable product.
Because, unless this has changed, consoles are loss leaders. At least back in the PS2/GameCube/original Xbox era, the systems were sold at a loss and the money was recouped on controllers and games.
Can't use a PS2 controller to play a PS2 game on a PS2 without the PS2 console.
Whether this is still true, I don't know. I do know that the PS5 with an optical drive costs $100 more than the Digital Edition. I also know that the drive does not cost $100, and I sincerely doubt the labor makes up the difference.
That's an analogy, a literary technique the writer is using to show the correspondence between the price of a specific amount of DDR5 RAM and a fully integrated system, so the reader can follow the conclusions of the article more easily.
We got around this by keeping the same day of the month and swapping the months out. We started dating in August, actually got legally married in that same month, but held the ceremony in October, all on the same day of the month. I used the cheat code of familiarity to never forget.
This appears to be backfiring spectacularly.
It is a shame in many ways because a decent digital ID system would be very beneficial.
The problem is the approach is completely wrong.
There are already 10+ competing ID systems, which are now largely digital.
A well-executed solution for bringing all of that together could make things significantly more secure by reducing the attack surface, and much more reliable.
Instead, it looks like they are going for one more competing system, the implementation of which will be steered by politics and ideology rather than technology and technical requirements.
It does actually perform a security function. The lid angle sensor is used to know when the device is open or closed, and when it's closed, the microphone is physically disconnected. If you could recalibrate the sensor at any time, you would leave your device vulnerable to having the microphone enabled while the lid is closed. You can argue whether that justifies the practice, but it's not as simple as a burned-in EEPROM serial number telling the machine to turn the display on or off; it defends the user against an attack vector.
From that perspective making it one-time programmable is not unreasonable.
Though it could be simpler if it were something like a magnet on the lid that activates a magnetic switch in the base (and it would be harder to get a false negative). But Apple is going to Apple.
Yes, it could be done with a Hall effect sensor or something, like they used to. The cool thing about this approach is that they actually use a different angle to turn the screen off as you close the lid than they do to turn it on when you open it, to create a better experience. And since it's a security feature, the "open" vs. "closed" state should use the same source of truth. So it's a trade-off between complexity and experience.
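To make that hysteresis concrete, here's a minimal sketch in C. The function names and threshold values are hypothetical, purely for illustration, and not Apple's actual firmware logic; the point is that one state variable drives both the display and the hardware mic cutoff, and the open/close thresholds differ.

    #include <stdbool.h>
    #include <stdio.h>

    /* Hypothetical thresholds; the gap between them is the hysteresis band. */
    #define LID_CLOSE_DEG 10.0f
    #define LID_OPEN_DEG  30.0f

    static bool lid_closed = false; /* single source of truth for display and mic */

    static void display_set(bool on) { printf("display %s\n", on ? "on" : "off"); }
    static void mic_set(bool on)     { printf("mic %s\n", on ? "connected" : "cut"); }

    /* Called on each calibrated reading from the lid-angle sensor. */
    void lid_angle_update(float deg) {
        if (!lid_closed && deg < LID_CLOSE_DEG) {      /* closing past 10 degrees */
            lid_closed = true;
            display_set(false);
            mic_set(false);  /* hardware mic cutoff keyed to the same state */
        } else if (lid_closed && deg > LID_OPEN_DEG) { /* opening past 30 degrees */
            lid_closed = false;
            mic_set(true);
            display_set(true);
        }
        /* readings between the two thresholds keep the previous state */
    }

    int main(void) {
        /* lid swings from open to closed and back again */
        float readings[] = { 45.0f, 20.0f, 8.0f, 20.0f, 25.0f, 35.0f };
        for (int i = 0; i < 6; i++) lid_angle_update(readings[i]);
        return 0;
    }

If the sensor's calibration could be rewritten at will, an attacker could shift those thresholds so the firmware never sees "closed", which is why one-time-programmable calibration isn't crazy.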
Does this have anything to do with them taking passwords without consent?
I rarely use Windows, and when I do, one of the first things I do is switch from Edge to Chrome.
I think I set up Edge and used it once to see what it was actually like, but I was pretty careful about the data syncing/sharing settings. I have the Microsoft Authenticator app on my phone, and I was pretty careful about the privacy settings on that too, but it's been through a couple of phone upgrades.
Somehow all of my passwords were making their way into Microsoft Authenticator, so I must have missed something somewhere. I can only imagine how many millions of people must have had their passwords unintentionally slurped by Microsoft if it's been that aggressive about it.