Yes, I actually do need a faster dishwasher, thank you. Last 3 dishwashers I've owned have taken between 2 and 2.5 hours to do a load. Between my large family and frequently hosting people I am constantly waiting for the dishwasher to finish.
In Canada we have something called the "Tax Free Savings Account" which functions in a very similar manner. EXCEPT when the CRA started figuring out people were using it for private assets, like non-public company shares, they changed the rules.
I have a 2018 VW Atlas and its automated braking system has done the right thing 99.9% of the time. It's definitely saved me from 5-6 accidents -- being cut off a number of times, a raccoon jumping in front of my car, and that one time my attention waned.
My Tesla, on the other hand, gets it right about 10% of the time. I wanted to type a higher percentage, but phantom braking happens so often. The number of times I've seen the red X on the Atlas in the entire time owning it is about the number of times the M3 loses its mind on a monthly basis.
The other interesting thing: I could probably write the algorithm for the adaptive cruise control on the Atlas. It's so predictable that I know exactly what it's going to do in reaction to traffic around me. If a car ahead of me slows down, I slow down at a constant rate. If the car speeds up, I know exactly how much it's going to speed up. Same with people changing lanes ahead of me, etc.
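That kind of predictable behavior can be sketched as a toy controller. This is purely my guess at the logic (the function name, the `min_gap` and `rate` parameters, and all numbers are made up for illustration, not VW's actual algorithm):

```python
# Toy model of a predictable adaptive cruise control:
# approach the set speed at a constant rate, and never go
# faster than the lead car allows when the gap is too small.

def next_speed(current, set_speed, lead_speed, gap,
               min_gap=30.0, rate=1.0, dt=0.1):
    """Return our speed (m/s) for the next control tick.

    current    -- our speed right now
    set_speed  -- the cruise setting
    lead_speed -- speed of the car ahead, or None if the lane is clear
    gap        -- distance to the car ahead in metres
    """
    target = set_speed
    if lead_speed is not None and gap < min_gap:
        # Too close: match the lead car, never exceed it.
        target = min(target, lead_speed)
    # Constant-rate approach to the target -- the "I know exactly
    # how it will speed up or slow down" behavior described above.
    delta = max(-rate * dt, min(rate * dt, target - current))
    return current + delta
```

The point of the sketch is the determinism: for the same inputs you always get the same acceleration, which is what makes the Atlas easy to anticipate.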
The M3, OTOH, feels like a buggy machine learning model. I have no idea why it's at the current speed it's at. If I set it to 120 km/h and there's no car in front of me, the chances of it actually being at 120 km/h are about 30%. If a car pulls away from me and my car should speed up, there is no rhyme or reason to when and how quickly it speeds up.
All that being said, I'd still take the Tesla automated braking over nothing at this point. Reading the other posts, it sounds like VW is doing something right.
Do you mean that credit cards in other countries don't have magnetic stripes, or just that nobody uses them?
Living in Canada, there's basically no instance where we'd swipe our credit cards, but they've still got mag stripes; and when I visit the US, I sometimes have to swipe my card, and it does work to do so.
> Do you mean that credit cards in other countries don't have magnetic stripes, or just that nobody uses them?
I don't know about other countries, but a few days ago I got a new Visa card in France. It still has the magnetic stripe. I'm not sure if I've ever seen it used here, but a few years ago I think they were still in use in German gas stations (at least).
Some banks have a nice feature where they allow you to disable the magnetic stripe of the card. I presume the information is still physically on the card, but they can somehow tell that a payment was made via the stripe and decline it.
I’m in the UK, my cards still have the stripes but in my entire life (I’m 27) I’ve never used it. I don’t even think the card payment terminals we have here have the reader for the stripe anymore.
In Europe, most still have the stripe, but often you can't do magstripe payments unless you enable it beforehand through your bank's app or similar (which is often temporary and disables itself again after 24 hours). I've only ever had to do that when travelling outside of Europe. Most terminals here can't even read the magstripe (and there are even some NFC-only ones).
The card brands are just now beginning the process of phasing out actually putting magstripes on the card. The USA will be a bit later, but over the next few years they’re going to start disappearing from cards.
You're right. Poor choice of words. I seem to recall it's around 2025ish where most cards won't have mag stripes. Mag stripes only exist today because of the US. Everyone else uses the chip for either chip and pin or tap.
He's an entertainer, not a scientist. His stuff is definitely entertaining and thought provoking, but he does make some "Gladwellian leaps" to gloss over some inconvenient facts to get to his ideas.
When I see the Gladwell hate threads, I usually conclude that I'm not quite smart enough to truly criticize him. I'm not an intense fan or anything, but I have read a book or two of his, many articles in the new yorker, and listened to some podcast eps. And, yeah, he's entertaining and thought provoking. His podcast endeared me to him since he genuinely seemed to enjoy it (I haven't listened in a couple years - not since WFH and not having to commute!).
Well, most of the criticism isn't based on a charitable interpretation; it's more like nitpicking isolated, out-of-context statements.
The 10K hours part is a good example, where people jump on him for supposedly pushing a bullshit theory that anyone can become an expert by putting in 10,000 mindless hours at any skill they want to acquire. But reading that book, I never got the impression that it was some kind of magic.
What I find funniest is that a lot of the people who consider MG a fraud can easily get hoodwinked by some other person/place/technology/thing if they just happen to like it.
I haven't coded for a while, but here's a weird situation where print() worked better in some cases for me -- you know how people say "I have better learning retention when I take notes by hand"? I found that sometimes the act of writing the print() statement helped to focus my mind and think about what I was trying to debug. For example, what specific aspects of the data structure do I need to see and why? How do I want to frame the relationship between two or more variables? Stuff like that would sometimes lead to the discovery of the bug before I even completed the print() line.
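A hypothetical example of the kind of deliberate print() I mean, where framing the relationship between two values is the real work (the order data and field names are invented for illustration):

```python
# Instead of dumping the whole structure...
#   print(order)
# ...framing the exact question often surfaces the bug first.
order = {"items": [{"qty": 2, "price": 9.99}], "total": 25.00}

# What relationship am I actually checking? Stored vs recomputed total.
computed = sum(i["qty"] * i["price"] for i in order["items"])
print(f"stored total={order['total']} vs computed={computed}")
# Writing this line forces the question: which total do I trust,
# and should these two even be equal at this point in the flow?
```

Half the time, by the time you've decided what to put inside the f-string, you've already spotted the mismatch.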
Seeing the state with a debugger is probably faster, but the print() process is analogous to explaining the situation to a rubber duck.
Many musicians have giant screens in their live sets, which display incredibly detailed visuals. When you're spending hundreds of thousands of dollars (if not more) on a live setup, it looks really amateurish to have an orange dot in the corner of the screen.
Yes, there are workarounds. But artists shouldn't have to deal with that when it worked perfectly fine beforehand. In addition, the more stuff you add to your setup, the higher chance that something will go wrong.
Professionals who spend hundreds of thousands of dollars on a live setup are not going to be using the internal I/O of their Macbook/Mac Pro. They're going to have dedicated video output cards that would not be affected by this. Those cards are specifically for having 100% control over the output.
The right call here for Apple is to allow users to give permission to specific apps to disable this, but let's not start with the idea that pros are outputting directly from their computers without the right hardware.
>Professionals who spend hundreds of thousands of dollars on a live setup are not going to be using the internal I/O of their Macbook/Mac Pro. They're going to have dedicated video output cards that would not be affected by this. Those cards are specifically for having 100% control over the output.
Have you ever worked in this industry? Because yes they absolutely are.
The people building the video walls (renting them), and the people actually running the visuals are not the same people.
Of course I have. A VJ with an old Macbook Pro isn't someone for whom this dot makes the display "unusable". If it did, then they wouldn't be using it because they could also have notifications, OS alerts, security prompts, or anything else that shows up on a display come up during their performance.
There's a _massive_ difference between an unremovable and highly visible orange dot and the small chance that "notifications, OS alerts, security prompts etc." could pop up. I think it's undeniable that there's a contingent of people who play live video who will be negatively affected by this change and I'm surprised so many in this thread are implying that if they don't own a playback card, their experience doesn't matter.
No one is saying their experience doesn't matter. Stop arguing straw men. All anyone is saying is that, if having the ability to control the output that's going to a display outside of the OS is a necessity, then you need a hardware controller. That has always been the case. The OS can always interfere with a full-screen app on a secondary display. The only reason there's any issue now is that these people disagree with this specific feature of the OS. It's not "unremovable". Just turn off whatever recording device is active and it'll go away. If you're a bit more tech savvy, turn off SIP and change it yourself, or go to GitHub and build the utility that already exists to get rid of it.
All anyone in this thread is implying is that, if this is important to you, you need to have the hardware to do it. If Microsoft tomorrow decided to put a Windows logo in the corner of the screen just to say "fuck you", you would still be unaffected with a hardware I/O device.
> All anyone is saying is that, if having the ability to control the output that's going to a display outside of the OS is a necessity, then you need a hardware controller. That has always been the case.
And without the orange dot issue, it wasn't necessary to have absolute perfect control. (Which could still go wrong anyway, because it's a computer.) Making it necessary all of a sudden is bad.
Or to put it another way, even without a hardware controller it was possible to have your macbook's use of an external output be one of the strongest links in the chain. Which is enough. Was enough.
>it was possible to have your macbook's use of an external output be one of the strongest links in the chain
Not if you need to be able to decide what is and isn't displayed. The I/O subsystem on Monterey is almost exactly the same as it has been since like Mavericks. If it was enough in the past because people were lucky, then that's awesome. There are ways to deal with the dot if people want to continue running these setups off of luck and ignorance. I'm only arguing against the people that say the dot makes their setup "unusable" and somehow ruins their livelihood.
I'm not arguing a strawman, I'm arguing against your proposition that because "notifications, OS alerts, security prompts" can also pop up, it's unreasonable for someone to complain about the recording indicator. They're entirely different animals, and the fact that you're comparing the two seems very uncharitable to me.
> Just turn off whatever recording device is active
It seems you haven't considered the most obvious scenario where someone would be annoyed by this: a multimedia presentation where they're running video output while recording audio over top, an extremely common use case that performers I personally know take part in. Suggesting that this clearly non-technical individual turn off SIP and compile and run some arbitrary code from GitHub is laughable to me.
They're not, though. I only used notifications and the like because they were common things that people have probably experienced outside of this specific situation. It doesn't have to be that. It could have been a microphone icon or a speaker icon or literally whatever the OS wants to display. And I'm not even arguing that it's unreasonable to complain about the recording indicator. I'm arguing against people who claim that something being on screen that they didn't want makes their workflow "unusable". If that were the case, there have been solutions for it for years. Just because they weren't affected by their ignorance in the past doesn't excuse it.
>clearly non-technical individual
If this person is that non-technical, then something like this is definitely not a showstopper that makes it "unusable". They are not the people I'm directing these comments at.
> because they could also have notifications, OS alerts, security prompts
Most will be using HDMI out as a second screen, so those won't show up
> Yes. I currently work in this industry
What do you use for video output? I'm a hobbyist and used iPad & TouchViz with HDMI plug. I just picked up Resolume and planned to use a Mac Mini M1 w/ Monterey and HDMI out. I'm livestreaming, so I'm only doing 1080.
You're not typically cloning your main screen but rather outputting to a second display (that doesn't have the focus). It would be very weird indeed for a notification to wind up on that display.
Weird but not impossible. The whole point here is that professionals who can't take that chance have always had to use a hardware I/O device because there is no other option.
Perhaps professionals with a lot of money riding on it, sure. But for prosumers, the status quo was good enough that breaking it cannot be justified by "oh it wasn't perfect so who cares?"
Airplane mode prevents almost all the triggers that would cause such things to pop up. I thought this was standard VJ advice: turn on airplane mode before a performance.
Doesn't fix the orange dot, but it helps pretty much everything else.
Helps but doesn't solve. There is nothing that you can do to remove the OS's ability to put things on a secondary display outside of your control. The only option is to have a separate I/O controller.
You meet professionals not taking professional precautions in every field. It doesn't mean they're in the right, it means they're playing fast and loose and hoping common stuff doesn't bite them and their clients.
Yup! And that leads to situations just like this. I'm not saying it's right or wrong. I'm just saying that people who depend on not being surprised by stuff like this prepare for stuff like this because it happens, in whatever form it takes.
Yeah, I was seeing some of the tweets claiming that conferences had been cancelled over it and I can't help but think "good, maybe they'll learn to be more paranoid about critical projects". Though I assume it's 99% hyperbolic.
To have this be a business blocker, they clearly not only don't have any other standard tools available to prevent this display issue, but they also upgraded their OS, can't roll back to an earlier backup (probably don't have backups), and did all this on the machine used for the presentations, without any alternatives.
That's a shocking lack of care and they deserve to receive flak for it. You test that kind of thing well before the event, and then you do not touch anything. Least of all upgrading the OS. I can't help but assume that these people are just using their personal laptops and hoping for the best on the day of an event. If they are, a small orange dot is the least of their concerns, and serves as a nice canary to their employer.
> The right call here for Apple is to allow users to give permission to specific apps to disable this but let's not start with the idea that pros are outputting directly from their computers without the right hardware.
The unfortunate case with a preference is that as soon as you enable such a permission folks can force users to enable that permission to use their invasive software. The orange dot exists because applications have been abusing privacy by invasively using audio and visual recording to spy on people. The solution to this problem isn't very simple and while the orange dot is causing headaches the lack of an orange dot also causes headaches.
As per my comment - the unfortunate truth is that offering users a choice means denying users the freedom from being creeped on as every app under the sun asks for silent microphone access "for design reasons". We've seen how ineffective app permissions (that can't be selectively restricted by the OS as on Android) have been for iOS devices. Apps boot up and demand access to contacts, your camera and your microphone and if you refuse they quit out.
It can be empowering to users to deny bad choices - since it prevents users from being coerced by malicious software (i.e. tiktok, facebook, instagram - not like virus laden software).
That all said there is some legitimate functionality being lost with this decision.
Just my opinion, but these linguistic contortions undermine your point.
Providing users with a decision in which there is an asymmetry and/or incentives could be setting them up for manipulation. But i think there are ways to balance the asymmetry vs. just removing the choice. A simple report showing which apps were watching/listening along with screen time could be useful, for example.
I hope this isn't nitpicking but I don't consider those linguistic contortions. A minimum wage empowers workers to receive an (ideally) living wage while, on the surface, restricting them from being able to sell their time for ever lower amounts. There are a lot of debates as to the efficacy and justifiability of things like a minimum wage but it's important to remember that any prevailing sense of the linguistic definitions you might assume is a local effect. Comparing American vs. European definitions of empowerment is a pretty clear demonstration of this where in Europe the ability to live a good healthy life is paramount and restrictions that promote that life style are generally considered empowering.
I do think there might be some other solutions but I also think the orange dot is, for almost all users, a perfectly acceptable solution - visually obvious without being obnoxious.
> but I also think the orange dot is, for almost all users, a perfectly acceptable solution - visually obvious without being obnoxious.
That is why the user should be given the choice to activate it: make it a sensible default in the respective settings. Experienced users who know what they're doing should be empowered to make a different choice if it makes sense for their workflow.
The OP mentioned a preboot setting specifically which would not be per app and be tricky enough to scare regular users from being tricked into doing it. Sounds like a good solution for Pro users.
Yea - a preboot setting actually seems like a pretty rational way to allow this fix. I don't think it's easy to get ignorant users to mess around with bios or other system level settings.
I think you underestimate how many places do use the internal I/O. My tiny church has a single iMac doing both recording and running slides. Small concert venues aren't much better.
Some of the conventions I've gone to ran everything in a room off a single laptop. (I've set up such things.)
Concerts aren't much better. Only the largest events and venues have the kinds of "professional" setups you're thinking of.
I really don't understand why you are being so dismissive of many users, just because they don't meet your personal definition of "professional". This change makes some use cases objectively worse, and telling thousands of people to spend hundreds of dollars plus some amount of time to mitigate a change they didn't ask for is not a respectful position, IMO.
I think this change would be fine as a default, but it should be configurable by the end user.
I'm not being dismissive and this has nothing to do with my personal definition of the word "professional". You mis-framing my position doesn't help this discussion at all.
The point is that users never had control of the output on external monitors. This situation is exactly like the situation that's happened multiple times on every OS where someone discovers a way to do something that relies on some function that they either misunderstand or are misusing (think Y2K). Just because it hasn't bitten people in the ass doesn't mean that it's a good way of doing it. People in this thread have commented surprised that Apple hasn't run into this issue during internal uses by its production teams but the reality is that Apple's production teams don't run off the built-in I/O because that's not how you run a video system you need 100% control of.
I agree that Apple should find a way to make this better (or at least hide it on secondary displays) but that's only a workaround. Today it's an orange dot. Tomorrow, it'll just be some other display indicator. The fact is that you can't (and have never been able to) control what displays from the OS on these secondary monitors. The fact that it's just now biting people in the ass is their fault, not Apple's.
One issue is I don’t think software like PowerPoint or Keynote can output to a DeckLink or similar. A quick Google search indicates you need to use ProPresenter, which is a big change for someone who just wants to display PowerPoint slides.
That's partially true but only if you're only looking at DeckLink devices exclusively (which are mostly for full video productions) that rely on some specific things to function. There are other cards that have software that can capture the video from any window and use it as an I/O source. Think OBS but to an I/O device instead of to a virtual camera.
> They're going to have dedicated video output cards that would not be affected by this. Those cards are specifically for having 100% control over the output.
Um, no. The built-in one worked fine and didn't require me to buy a very expensive additional piece of hardware. This change took away functionality that worked before the upgrade.
> let's not start with the idea that pros are outputting directly from their computers without the right hardware.
If professional means derives income from work, you would be wrong. If pro means works for an organization with unlimited budget, you would be right.
It had the same problem that it still has. The OS can place items on the display that you don't want in the middle of a presentation/performance. The only thing that "worked fine before" is that you were ok with what the OS put there because it was rare for that to happen.
>If professional means
It means that a dot in the upper right hand corner makes the function "unusable". If that's the case, then a professional would not extend/mirror a desktop display. They make sure that they control exactly what is being displayed and you can't (and have never been able to) do that with macOS.
It's not reasonable to say it's the "same" problem when they changed it this much.
> The only thing that "worked fine before" is that you were ok with what the OS put there because it was rare for that to happen.
So rare it may have never happened. So yes, it did work fine. Are you implying that's wrong? There's no way to make a bulletproof setup, after all. Maybe with an external device you get a glitched or blank screen instead of a notification, but any hardware or software could fail. Possible chance of failure is not the same as a constant 100% chance problem, and does not excuse a constant 100% problem.
They didn't change it much. They added an indicator to the existing OS UI when a recording device is active. That's the only change that was made. As I've said elsewhere here, they could have added anything to the UI in the past and it would have had the same effect. People just didn't care because that stuff didn't affect them.
>There's no way to make a bulletproof setup, after all.
No one is saying it needs to be bulletproof. If it did and that's important to your production, you'd have backups to switch over to immediately. You're taking what I'm saying out of context and arguing straw men. If a small dot on the screen makes things "unusable" for you, then your setup is wrong. If anything that you don't want on the screen that you didn't put there is important for you, then you need to create your setup to function like that and allow for that.
All I'm saying is that there are people all over here, professionals or otherwise, who claim that a small dot on the screen is a dealbreaker for their ability to do their jobs. If that's the case, then they've been leaving their livelihoods up to chance because every OS has the ability to display things on an external display on top of a full-screen application. I'm glad some people were lucky enough for that not to happen but people whose livelihoods depend on that don't leave those things to luck.
They changed the percentage of time you have an OS overlay on the screen drastically. That's the metric I was using.
> No one is saying it needs to be bulletproof.
When you accuse people in situations where the dot matters of "leaving their livelihoods up to chance" for using normal output, you basically are saying that the notification-stopping part of the system has to be bulletproof. Swap 'bulletproof' with 'nigh-perfect, far in excess of all the other parts of the system' if you want.
And no, that's not going "by their own logic" or anything. Refusing to accept a constant dot in the corner does not mean a single notification would ruin their career.
>It means that a dot in the upper right hand corner makes the function "unusable".
I'm pretty sure professional has nothing to do with dots in the corner. There's really not much to defend about this change, and this change really reduces the utility of macbooks for many people who derive their income from work using that macbook. Hopefully, apple gets the message from users and fixes it.
>I'm pretty sure professional has nothing to do with dots in the corner.
It absolutely does in terms of this conversation thread since that's what the topic of discussion is.
>really not much to defend about this change
Yes, there is. It makes the vast majority of Mac users aware of when their input device is activated in situations where they may not know.
>reduces the utility of macbooks for many people who derive their income from work using that macbook
It absolutely does not. It only affects the very small portion of users for whom the dot is a dealbreaker, that have to capture audio while presenting, that are not technical enough to follow the steps to remove the dot, and that also have never cared before about the OS being able to render chrome on their work/display output.
Because professionals can't take the chance that an in-app notification (from another app) or a menu bar or something else will end up in their output. We're not just talking about an external monitor here.
Yes it is. Notifications can pop-up if someone forgets to disable them. Any OS prompts can pop-up on the display. You don't leave those types of things to chance.
Well, this is a “no true Scotsman” fallacy. In practice they do. It's not frequent, but I've seen it a few times and it's always “fun” to watch.
So it shows that there's a lot of professionals out there not following best practices (which isn't surprising to be honest, it's the case in every industry, including super critical ones…).
Maybe the orange dot will actually help these people start using best practices in the end… (note that I'm not defending Apple's move by saying so; I really hate their tendency to think their customers are wrong and that, because they are Apple, they know better)
It's not a "No True Scotsman" because I'm using their definition of "Scotsman". If someone wants to be able to have full control of what goes on the display, outside of the OS, then they have to have a hardware I/O controller on a Mac. Their only argument is that they were OK with what the OS was putting on there because it didn't affect their specific use case. It's only an issue because they don't like what the OS is doing now. It's great if people got lucky in the past and never ran into an OS prompt or an alert from an app (looking at you, Steam) but that doesn't change the fact that the situation is currently the same as it was before Monterey. Anyone who's saying that a dot in the corner makes it "unusable" has to admit that anything else would have also made it unusable, yet they chose to continue without managing the I/O of the device and didn't care.
You're all over this thread trying to gaslight people into believing a constant dot is somehow the same as a rare chance of an OS notification that you forgot to disable. People plug their Macs directly into shit and that worked fine; now it doesn't. End of story.
For an incredibly small set of people who are both professional enough for this to be an issue but also not professional enough to spend a hundred bucks on a device specifically meant for this purpose.
If you're going to claim gaslighting, then maybe actually include the intricacy that you seem to have missed there.
It continues to work fine for the vast majority of people.
No. You're describing a different setup when you suggest that hardware. Read the post again, "plugging the mac directly into shit" is broken now (for people that are using their microphone).
> for people that are using their microphone and are outputting video that they want undisturbed
Let's call a spade a spade. You're blowing it way out of proportion while in reality it's an issue that impacts an incredibly small fraction of the userbase.
I never said it affected a large fraction of mac users. I'm not blowing anything out of proportion.
Your post above was also only talking about those people with mic input and video output, so it shouldn't be a problem that I did the same.
Also it's not just video, it's basically anyone "plugging their mac into shit" for professional output purposes. Could be video, could be a powerpoint, could be a software demo.
Woah! Someone learned a new word and then used it incorrectly!
It's not about OS notifications that you forgot to disable. It's anything that the OS wants to display. Notifications and alerts were just an easy example because everyone knows what they are.
People can still plug their shit in and it'll "work fine". If they're recording audio, they'll see a little dot. If they want control of what shows up on the display, there are answers for that. Pretending that hasn't always been the case is misguided.
I really don't believe anyone running Steam on their video computer is worried enough or even serious about reliability to use a dedicated video playback card. Sure it would be nice if everyone used a dedicated card, but it's 100x more important that those people stop running Steam... unless maybe if they're pro game streamers or something.
Also, even Steam requires extra permissions on Mac to display the overlays you mention.
I only mentioned Steam because I've had an experience where that happened to someone. They were running some game (similar to Jackbox but it wasn't that) for a conference and were just outputting the display and the Steam update prompt showed up on top of the display. It wasn't a big deal, it was just an example that summed up exactly the types of things I'm talking about.
And no, this wasn't a Steam overlay. It was a prompt to restart the Steam app to complete some update. That does not need additional permissions.
The "prepared" in my post implies that notifications are disabled.
Also notifications won't really be an issue for anyone but people using the machine both for personal and professional stuff. In the worse case, you can have different user accounts. A professional machine used for VJing or even audio recording will have zero notifications.
Not a problem in practice. On macOS, OS crashes show up on the first monitor, and so do other crash alerts. Also, again, if this is not an amateur thing, the only programs that will be running will be those directly related to the presentation.
Also I wonder if we're talking about different scales here. I'm not talking about the 150 inch monitor, I'm talking about video art, VJing, and small scale stuff. macOS works fine for those things.
But the "current" monitor is the one with the GUI and the mouse cursor. The secondary monitor is the one being used for external video. There are even dedicated APIs for it.
Are you a macOS user? Your other examples talk about Windows Update... the situation in macOS is a bit different, which is a lot of people doing audio/video flock to it. Not everyone needs external hardware, just a MacBook can do a lot.
Yes. macOS is my daily driver and I'm on Monterey.
All I'm saying is that, if anyone wants to say that this dot makes their use case unusable, then they have to admit that the current OS setup was always unusable for them because the OS was always able to display chrome on their displays. It may not have happened often or even in a way that they thought was "unusable" but it was able to happen. The only difference here is that they're not happy with the type of OS-level things that are displayed.
In my experience, people for whom any kind of errant display item matters use dedicated hardware devices for their I/O. If it didn't matter before because it was only windows/alerts/notifications/whatever, then that clearly doesn't make it "unusable", just "not preferred". I fully agree that there needs to be some kind of option for this on presentation displays, but the suggestion that SNL wouldn't have dedicated hardware for their displays is asinine.
Sure, in theory you are correct. We should seek the more reliable solution. In practice, this is not really a problem for anyone using macOS for small-time visuals/performance/presentations, as long as you keep your computer well prepared for those situations. It works 99.9% of the time, which is 100% for most people (even pros) doing it sporadically. Maybe your solution covers a few more 9s, and you may need those 9s (I know live broadcasting does), but this is unnecessary for most common folk, and you're dismissing this use case across this thread, which is why I'm replying to you.
I feel like the notifications issue you mention is a bit of a red herring, because having too many things running in the background will cause problems regardless of whether you use external gear, and regardless of whether they show on the screen or not. You can't rely on external gear alone for stability; the computer itself has to be stable. And the computer alone being stable is enough for 90% of people. And even if there are notifications... so what? These are people doing it for art purposes, at parties. They learn a lesson and never have to care again.
If those people are really using Steam on their computers (like you said on another comment), they surely aren't pros worried about performance, reliability, or anything of the sort that warrants a dedicated playback card, so I don't really see this use case (Steam+Blackmagic) existing at all.
Surely the default I/O is nowhere near enough for SNL or even for local broadcast, but it is good enough for a large contingent of people that don't need the same reliability that you or SNL need. And it does work for them in practice, without notifications, and without OS chrome... except for the new orange dot, which is a nuisance.
You can afford a separate laptop for VJing, but don't want to spend $200 to get 100% protection from unexpected notifications, error messages, calls, and orange dot?
What separate laptop? macOS supports multiple accounts, no need for separate laptop. And people don't want to buy a completely unnecessary $200 dongle (it's actually cheaper) for something that worked 100% perfectly before. Is that really hard to understand?
If I could have audio inputs/outputs that were good enough for my audio work I would also prefer not using an external audio interface. I often compose on earbuds, and that's fine. For mixing I need something else. Some people might not. Who am I to judge?
Also, it's not an "or" option. Pros turn off notifications/internet, and don't leave Steam running when working, like the other poster is saying.
Do you use any professional audio interfaces like DANTE? Just activating DANTE causes the stupid orange dot to show, even if you are only sending audio OUT from your Mac to, say, a digital mixing board.
There is ZERO reason to force the dot on every display. Restrict it to the display(s) also showing the menu bar and this becomes a non-issue. I can't believe anyone involved in this at Apple thought this was a remotely reasonable thing to do!
Wow, this is one of the most misinformed takes I've seen in a while. I know a number of performers who use a Macbook for their visuals, and they absolutely just plug their machines into whatever I/O is available at their venue. I don't know where you're getting this idea that everyone just lugs around a rackmount AV machine, and if they don't they're not truly a "professional".
We're not talking about a rackmount AV machine. We're talking about a tiny device that can output to HDMI.
And, again, we are talking about situations where this orange dot would make the function "unusable". Those situations are not situations where a professional uses the built-in I/O and leaves things to chance.
My friend has VJ'd large clubs and music festivals on her ancient Macbook without ever using an external display driver.
There's not a lot of money in the scene for most people. They use the software/hardware they have. Hiding notifications and colourful dots from the OS shouldn't really be an issue.
There are quite a few people here replying to you, letting you know that there are indeed situations where A/V professionals have used and continue to use built-in I/O. I have seen the same. A properly prepared machine is immune to the issues I’ve heard you describe in this thread (notifications, etc.).
It sounds like there’s something about all of these responses that isn’t resonating with you because I see a pattern of responding and letting us know our experiences are essentially invalid, for some reason. Are you able to speak to why that’s important to you? Why does this seem so far-fetched / unbelievable to you?
I think you're misreading what I'm saying. I have only been responding to people that are saying that the dot makes this setup "unusable". The machine you're describing is not possible without a dedicated hardware I/O device because the OS always has access to display devices and apps cannot override that.
If the dot makes their setup unusable, then the situation prior to Monterey should also have made their setup unusable, because the OS could have popped up an alert dialog at any time (or any kind of OS chrome). Using built-in I/O is absolutely fine in professional settings, but not in settings where you need complete control of what's being displayed, and that's precisely what they're complaining about. They never had complete control of what was being displayed. They were just OK with it because it either didn't bother them often or it wasn't a dealbreaker for whatever they were doing. If you need to know that you're only going to see what you want to see, you have to use hardware I/O.
I've never said anyone's experiences are invalid. Stop talking down to me like a child and making things up.
You posted a ton of comments to this thread (over 60 of them!) perpetuating a massive flamewar, and broke the site guidelines in a whole bunch of places, including outright personal attacks. We ban accounts that carry on this way, and we've had to warn you about breaking the site guidelines more than once in the past.
We don't want flamewars here. Please make your substantive points thoughtfully and without swipes in the future—regardless of how wrong other people are or you feel they are.
Btw, when discussion degenerates to the level of people arguing about what each other did or didn't say, that's a sure sign that the conversation has become tedious and uninteresting to those not involved in the spat, and that it's time to step away from the comment box.
You weren't the only person in the thread doing these things, but (from the subset I've seen), your account was behaving the worst, both qualitatively (in terms of how badly you were breaking the site guidelines) and quantitatively (in terms of how many posts you made and how much fuel you added to the flames).
Could you please review https://news.ycombinator.com/newsguidelines.html and take the intended spirit of this site more to heart from now on? I don't want to ban you, but if this keeps happening, we'll end up having to, because it's not what this site is for and it destroys what it is for.
I'm sorry. I didn't mean to start a flamewar and I didn't think that posting comments was frowned upon. It's just something I'm passionate about and felt was being misconstrued by people.
Can you clarify where I broke the site guidelines? I don't believe that I've personally attacked anyone but that feels like a subjective measure so I'd like to be clear on what you consider a personal attack (in this thread specifically). I don't feel like calling someone out for mischaracterizing what I'm saying or flat-out being dishonest about what I've said is a personal attack but, if it is considered that by the admins, then I'll stop doing that. If it's something else, then I'd like to be aware of it so I can check myself in the future.
I'm reading the guidelines now and will try to take the intended spirit. I don't want to get banned but I also don't want discussion to be surface-level and I get that there's a fine line there.
I'm not going to be responding to any more comments on this post to prevent getting banned and I don't know if there's any way for you to respond to my questions outside of here but, if there is and I just don't know about it, please let me know. I'm hardly an expert when it comes to HN and usually just check the threads page to see replies and this is so far down now that I don't want to miss your reply.
> I have only been responding to people that are saying that the dot makes this setup "unusable". The machine you're describing is not possible without a dedicated hardware I/O device because the OS always has access to display devices and apps cannot override that.
This seems to be the crux of the disagreement in this thread. You're equating the effect of two quite different things:
1. A pop-up that's quite intrusive / potentially embarrassing, but has (say) a 1/100 chance of happening in any given show, and in any case would only be there for a few seconds; the rest of the show would be unaffected
2. A small but intentionally noticeable orange dot that's there 100% of the show for every show
Yes, if you want to be a top level professional, then you can afford to have neither. But I can certainly imagine people / venues where #1 would be considered a normal cost of doing business, but #2 would not.
That said, if fixing them both is as easy and inexpensive as people in this thread seem to think, then the small "nudge" by #2 to get them to fix #1 is probably beneficial for the ecosystem overall.
You're mis-framing what I'm saying. The effect is the same. There is something on the display that is unwanted. Regardless of what that thing is, the issue at hand is that people do not want things that they didn't choose to be on the display to actually be on the display. It doesn't matter what that is. In this case, it's a small dot but it could have been a little microphone icon or a camera icon or (like Windows) an orange border around the display or literally anything else.
The fact that it didn't affect some people before is great. They were lucky that it didn't happen to them. All I'm saying is that, for years before this change in Monterey, there have been things that pop up like this on external displays, because apps do not have the ability to prevent the OS from using a display as a display (with a couple of exceptions when using the APIs for exclusivity). That's a known problem with using extended displays, and it's been known for a while, regardless of the OS you're using. There's also a solution to that problem that's pretty standard in the presentation/production industry.
That being said... there are now multiple solutions for people that both don't break the intention behind this change (they're slightly technical solutions that most people won't do) and allow for app developers themselves to fix the problem (using the APIs already provided to get screen exclusivity).
The point you're not addressing is that in practice, before this change, people usually had close enough to full control, except for maybe a <1% chance of something going wrong. The OS can, in principle, do anything, but in reality it usually doesn't. Whereas with this new orange dot, there is a 100% chance of it being there.
It's easy to imagine a pretty wide range of people for whom a tiny theoretical risk of the OS going crazy and showing some kind of notification even with notifications disabled is acceptable, but an orange dot that's 100% deterministically guaranteed to be there isn't.
No they didn't. That's literally the entire point. Just because people don't know that, or weren't affected by it regularly, doesn't change that fact. People are focusing on things like notifications because that was the initial example I used for ease of understanding, but there are literally thousands of things that the OS can display, on any OS, on top of a full-screen window or by kicking the display out of full-screen. Even on Windows, Virtual DJ has a full-screen mode, and errors will kick you out of full-screen and pop up on that display. All of that is avoidable with an I/O box.
The fact is that if you're not OK with the OS choosing what shows up on your screen, there are ways to prevent that, and they've been industry standard for years.
Thanks for explaining, I hear what you are saying about dedicated hardware giving much more control. In the case of this dot and an external monitor / display, and until the OS is updated, that definitely sounds like a great option.
Perhaps I was misreading what you are saying; regarding:
> If the dot makes their setup unusable, then the situation prior to Monterey should also have made their setup unusable because the OS could have popped up an alert dialog at any time
It sounds like we are in disagreement around current un-usability because of (dot/notifications/etc) implying past un-usability. I get what you are saying in theory, and agree with you.
However, in my life experience, and that of many others in this thread, I am hearing counterexamples to this statement. The way in which I observed you responding to those statements is what felt invalidating _to me_. My felt experience is not "making things up".
I do not know your age and I am making no assumptions about that. The intention of my original comment was to: respectfully and kindly share the impact of how I have been receiving your comments, gain more understanding about your perspective on this issue, and offer an (obviously subjective) reflection to you about a communication pattern that I noticed in the comments here.
This sort of reflection is always coming from place of curiosity and so carries an implicit invitation to deepen into greater shared understanding and connection; I have no attachment to you receiving that reflection and I apologize if it did not land well for you. I understand these kind of things do not always translate so well in text.
I have never said that anyone was making anything up. Please do not put words into my mouth or make statements on my behalf when I have not said them.
>I apologize if it did not land well for you
This sounds like an abusive partner apologizing: "I'm sorry you made me so mad that I had to hit you."
>I am hearing counterexamples to this statement
You're not hearing counter-examples. You're hearing situations where the person, in the past, was lucky enough not to be affected by the situation they were in. That's great. It's like someone who refuses to wear a seatbelt telling you they've driven 10000000 miles and never been injured. If reducing your chances of dying in a car wreck were important to you, you would wear a seatbelt. This is like complaining that the car's seatbelt chime wasn't insistent enough after you got into a wreck and are now paralyzed.
I am not carrying any weapons in this conversation.
Nobody is attacking you here.
Yet, I experience your interactions here as carrying defensiveness and un-checked assumptions. That is my experience. That is different from me saying, “you are defensive.”
I am having an experience in this dialogue that feels in opposition to one of the main principles encouraged in this board, which is that of graciousness and good intent. I do not feel you have been reading my comments from that kind of orientation. Perhaps you have and I have misconstrued.
Regardless, I really don’t care enough about this issue to continue engaging with you about it like this.
So you made up something that I said or did, then framed it as my fault (something "not resonating" with me), and then want to claim that you're being gracious and discussing from a place of good intent. What un-checked assumptions am I making? Everyone who has responded to me and disagreed with me has done so on the premise that this small dot is a "showstopper" that makes their setup "unusable" (it's literally in the title of the article in the OP). All my comments and responses are based on those statements, not assumptions. The difference is that these people feel it is Apple's responsibility to fix the issues they're having, without taking any responsibility for their own ignorance.
And you're right... I haven't been reading your comments from that kind of orientation because your very first comment to me claimed that I had somewhere let someone know that their experience was invalid, which I never did, or that I had somehow claimed that people's realized experiences were far-fetched, which I also never did. How can I assume graciousness and good intent from someone who didn't assume the same of me and then went out of their way to mis-frame what I have been saying? The only time I ever even suggested that something was far-fetched were the people who were claiming that something like this would affect productions like SNL or Mariah Carey on NYE because those are far-fetched suggestions.
That's fine not to continue but please don't pretend that I was the one that engaged with you "like this". You initiated the current conversation, including its tone and intent.
Yeah, I'm watching this unfold and I don't see how the other person can't see the problem.
Someone spends a thousand, or two thousand dollars on a high-end device with state-of-the-art ports and graphics and processing, and because "OS notifications sometimes pop up if you don't disable them", real pros buy a piece of middleware hardware that does nothing but filter out a software issue?
Yes, exactly. If you need to be able to control what goes to the display and take that away from the OS, you need a hardware I/O device.
OS notifications and alerts are just examples of any number of things that could be displayed that are unwanted. In situations where something like that makes the setup "unusable", you have to have a hardware I/O device. There's not another option.
It's rendered on the external screen also, even when there is no menu bar.
For me, it's good, not bad. I’m sorry that for some people it means they need to do their job a little less carelessly, but the overall benefit of this dot is worth it.
I had the chance to put these on a house I built. I ended up not doing it and using a more standard type. I slightly regret not having them in a few of my window use cases BUT, at least the ones I saw, felt very flimsy and rattled/slammed a lot.
Another drawback of this type of window: it's challenging to hook up a portable AC unit in an aesthetically pleasing way. This is a factor if you don't have central air and need to retrofit cooling.
What’s the deal with leading with AirTags? It’s the least interesting part of the process. I can also use my eyes driving around neighbourhoods looking for high end cars. Or any other tracking device that’s existed since the 70s.
Is it supposed to create FUD around the dangerous world we live in now that we have AirTags?
Other scary technology listed in the article:
- the device used to factory reset the car
- blank keys
- a screwdriver
It'd be more interesting to use 2 AirTags, and try to figure out when the owner is away from the car, e.g. if you can see them take an Uber to the airport and their car is in the driveway.
To turn on my criminal mind: if you're driving around neighborhoods, a witness might notice you cruising around a few evenings in a row and note your license plate (you'd have to go at night, because during the day the cars wouldn't be there, and you might have to go a few times to see what cars are available).
The AirTag/any tracking device is quite ingenious: just go to the mall carpark (no one will be suspicious, it's just someone at the mall), where you have a better selection of cars in one place, plant the AirTags (the riskiest part), and then wait. Even if they find it at home, you already know where the car can be found, and if you're worried the cops might be waiting, just wait a few more months; statistically the car will still be there.
I'm sure there is a complex Venn diagram to be had here. Anyone making blanket statements about wants/efficiency/etc is a fool.
I have an hour commute each way if I want to go in, and a dedicated 12x12 home office bathed in natural light. There's no possible way a company office could come close to the productivity and comfort I have here, even on video calls.
Another chunk of the population lives in apartments not much bigger than my personal office and has a 10-minute walking commute to an office. If I were in that situation, I'd probably enjoy going into an office to work more (though I would probably never choose to put myself into that specific living scenario).
There is likely everything in between those extremes. You also need to factor in introvert/extrovert personalities, how much and what type of collaboration the job requires, etc.
We used to live in a world where there was only one option: come into the office where the paperwork lived. Now we have a choice. Let's take maximum advantage of it.