Hacker News | shados's comments

CC would explode even further if they had an official Team/Enterprise plan (likely in the works, given the Claude Code Waffle flag), and if it worked on Windows without WSL (supposedly pretty easy to fix, they just didn't bother). Cursor learned that the percentage of Windows users was really high once they started looking, even before they really supported it.

They're likely artificially holding it back, either because it's a loss leader they want to use in a very specific way, or because they're planning the next big boom/launch (maybe with a new model to build hype?).


They quietly released an update to CC earlier today so it can now be run natively on Windows.


Conversely Cursor is still broken on WSL2.


Assuming they don't have an EU presence of some sort, EU law doesn't apply to them.

Now if they want to open up shop in the EU, or charge money through a payment processor that has an EU presence, things change.


> Assuming they don't have an EU presence of some sort, EU law doesn't apply to them.

That's not correct. If they handle EU people's data, they are responsible for it and can still be fined. Obviously this cannot be enforced if they never visit and have no assets in the EU.


It's correct purely because of jurisdiction. EU laws don't apply to people with no presence in the EU, unless there's some kind of treaty where one country agrees to enforce another's.

That's just how laws, any law, works. The EU can "fine" all they want but it would be entirely symbolic.

That's like requiring US restaurants to enforce EU food safety laws on US soil because an EU citizen is eating there.

Fortunately, unlike US laws, GDPR, by virtue of being EU law, is actually readable by normal human beings, so it's fairly straightforward:

https://gdpr.eu/article-3-requirements-of-handling-personal-...


Yes, and "the monitoring of their behaviour as far as their behaviour takes place within the Union" absolutely applies to examining activity of LinkedIn users from the EU.

As for jurisdiction in general: the US routinely jails people for activities that took place outside the US, as soon as they set foot on US soil - occasionally even when they don't even do that (Kim Dotcom). European convictions for civil matters will not result in an arrest warrant, but can result in financial penalties and confiscations applied to anything that has to go through Europe in one way or the other.

The limits of enforcement, in the internet era, are becoming mostly practical rather than theoretical. Which is interesting and poses a number of new, unanswered questions. Simply speaking, one cannot just wave away any law simply because they don't live in this or that place anymore.


Yup, but Article 3 point 2(a) has a fairly strict definition: for an entity outside of the EU to be considered as "offering services" to EU members requires some kind of concrete ties. The de facto examples are offering a product with prices specifically in euros, or operating on a domain with a member state's top-level TLD. If no ties show the offering is directed at EU members, it doesn't apply.

Very little of a tie is required (eg: just having one employee in the EU in a 50,000-person org would do it right there), but the law has been fairly consistently interpreted that way.

I get where you're coming from, but this isn't an if or a but or a theoretical. It's just how GDPR gets applied. I probably confused things by introducing poor analogies, when the law itself is pretty consistently interpreted a specific way.


It really shows how AI is going to change the entire industry.

Let's imagine tomorrow a Product Manager at LinkedIn wants to introduce this as an official feature. They're going to have to run it by management or their pod (or find the PM in charge of that area if it's not them), finish their existing project, wait for resources to free up, get legal/marketing/compliance involved, get it developed, go through all the other red tape, etc.

I don't know exactly how LinkedIn works internally, but I'm sure some of this is accurate.

So maybe, MAYBE they'll have it in a couple of months? But someone can build it in a few hours, even if they're not super good at this stuff.

It changes everything about how we think about products and SaaS software.


> But someone can build it in a few hours, even if they're not super good at this stuff.

Note that the end result is not the same as what LinkedIn would have built. Perhaps in some ways better and in some ways worse.

E.g. personally I am not comfortable bulk uploading personal data of myself and my network to a third party server.


Yup. But my point is you can build this yourself, FOR yourself. So if you're not comfortable with using this one, you can build one on your own that you can trust (because you built it yourself).

That's the whole point. In an AI world, you're no longer bound by the limits of what third parties do or don't do, plus or minus some datasets (like the job postings in this case).


Yep! I agree to the principle - I also would not trust a random third-party app with my personal details. Though as noted in my other comment, this app is mostly client-side (including the CSV and ZIP extraction), except of course for the JSearch calls to find jobs by company name, and the CSV export if you choose to use that.


> personally I am not comfortable bulk uploading personal data of myself and my network to a third party server.

The subject of this Show HN is completely client-side.


It changes nothing, what you describe has always been the case even before AI. There are things people can build in a weekend that take weeks or even months at a larger company. Large companies have a way of slowing everything down, for reasons that have nothing to do with coding.


Correct, but the number of people who can do it has drastically increased, and the amount of time it takes most people to build these things has drastically decreased.


It has not drastically increased. One could even argue it has decreased. Ultimately, the productivity gains will perish to bureaucracy.


This is exactly where I think we're heading as well. This project took about 2 hours.


Primarily because LinkedIn has to bother with complying with their privacy policies and other T&Cs, and your site has none of that.

Why should I take all of my data and give it to you, a rando on the internet? Is it being stored? Will it be shared? Sold? Maybe, since nothing says it won't be.

Looks neat, but strong pass because of the above.


Correct. But you can build this thing on your own, for yourself. LinkedIn didn't, and you don't trust this third party. So if you had the problem it's trying to solve, you could just spend the 2-3 hours and build it for yourself, even if you don't have all the necessary skills.

That's the whole point.


And in the process you develop another skill you can add to your LinkedIn profile.


Those are totally fair points but what I agreed with here was that people will create _personal_ software to solve a particular itch for themselves, which is what I did here for my friends. I just decided to throw it on HN as I have some API credits remaining :)


Not sure it changes EVERYTHING as the issue of sales and marketing still favours LinkedIn. Sales/Marketing is very expensive and sometimes even difficult to predict. Building is not everything.


Right, the point I'm trying to make is that if someone has a requirement, they can build it on their own, for themselves, at much lower cost (because they don't need sales and marketing for themselves).

It was always possible (hire software devs to do it), but the bar and cost is much, MUCH lower.


Sure, no doubt about it.


This feature already exists and has existed for like 10 years (I know because I built the original implementation of it).


Yup. People act like they are geniuses there, but they were pulled kicking and screaming to allow native apps.


Honkai Impact 3rd does have a pay2win ladder. It's pretty hardcore too; even as a whale it can be quite rough to get to the top. There are a LOT of top-tier whales in that game, and you need some pixel-perfect timing to compete at the top level.


It's not just that. A large portion of IT people who work in these industries find Windows much easier to administer. They'd be very resistant to switching even if it were possible and everything the company needed were available elsewhere.

Even if they did switch, they'd then want to install all the equivalent monitoring crap. If such a thing existed, it would likely be some custom kernel driver, and it could bring a Unix system to its knees when things go wrong, too.


I mean, CrowdStrike has a Linux equivalent, which recently broke RHEL by triggering a kernel panic.


You highly overestimate the capabilities of the average IT person working for a hospital. I'm sure some could do it. But most who can, work elsewhere.


The backend-for-frontend pattern. Something most apps where the frontend and backend are in the same organization (fullstack or otherwise) should have. It does wonders for maintainability and performance.

Even though we use GQL here, we still have a B4F, so it's browser -> B4F -> GQL -> Database.

The tooling we use makes it trivial, so it doesn't add any development overhead (it often reduces it; no CORS bullshit, for one), and our app goes ZOOOOOOM.
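To illustrate the pattern, here's a minimal Python sketch of the B4F idea; the service names, data shapes, and `profile_page_bff` function are all made up for illustration, not taken from the comment's actual stack:

```python
# Backend-for-frontend sketch: the browser makes ONE call to the BFF, which
# fans out to internal services (stubbed here as plain functions) and returns
# exactly the shape the page needs -- nothing more.

def fetch_user(user_id):
    # Stand-in for an internal user service call.
    return {"id": user_id, "name": "Ada", "plan_id": "pro"}

def fetch_plan(plan_id):
    # Stand-in for an internal billing service call.
    return {"id": plan_id, "label": "Pro", "seats": 5}

def profile_page_bff(user_id):
    """One browser-facing endpoint: aggregate and trim to what the UI renders."""
    user = fetch_user(user_id)
    plan = fetch_plan(user["plan_id"])
    # The BFF owns the response shape, so the frontend never stitches
    # services together client-side (and same-origin hosting avoids CORS).
    return {"name": user["name"], "plan": plan["label"], "seats": plan["seats"]}
```

In the comment's setup the same idea just has one more hop: the BFF calls the GraphQL layer instead of the services directly.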


Not really. The usual strategy for persisted queries is to only use them in production or adjacent environments. You build your app using regular GraphQL, doing whatever you need; then, after you've tested things and it all looks good, when you ship, you parse the codebase, store the queries, and ship them.
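The build-and-ship flow above can be sketched in a few lines of Python; the registry layout and hashing choice are illustrative assumptions (real frameworks handle this for you, typically with a SHA-256 of the query document):

```python
import hashlib

# Build time: the queries extracted from the client codebase (stubbed here
# as a literal list) are registered under a hash and shipped with the release.
QUERIES = [
    "query Me { me { id name } }",
    "query Jobs { jobs { title company } }",
]

def query_id(query: str) -> str:
    # Illustrative choice: identify each persisted query by its SHA-256 hash.
    return hashlib.sha256(query.encode()).hexdigest()

REGISTRY = {query_id(q): q for q in QUERIES}

# Run time (production): the client sends only the hash. Anything that wasn't
# registered at build time is rejected before it ever reaches the executor.
def resolve(persisted_id: str) -> str:
    query = REGISTRY.get(persisted_id)
    if query is None:
        raise ValueError("unknown persisted query")
    return query  # hand off to the GraphQL executor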

It doesn't work if you don't control the client (eg: exposing the API to third parties), in which case query complexity limits are easy to implement and do the job, too.
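As a toy illustration of a complexity limit, here's a depth cap sketched in Python. Real servers work on the parsed query AST and usually weight fields, not just depth; counting brace nesting on raw text is only a stand-in for the same idea:

```python
def max_depth(query: str) -> int:
    # Toy proxy for query complexity: deepest selection-set nesting,
    # approximated by brace nesting in the raw query text.
    depth = deepest = 0
    for ch in query:
        if ch == "{":
            depth += 1
            deepest = max(deepest, depth)
        elif ch == "}":
            depth -= 1
    return deepest

def check(query: str, limit: int = 5) -> None:
    # Reject overly nested queries before execution ever starts.
    if max_depth(query) > limit:
        raise ValueError("query too deep, rejected")
```

So `check("query { me { friends { name } } }")` passes (depth 3), while a query nested past the limit is refused before any resolver runs.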


A lot of the points in the article are issues anywhere; it just depends on how well they're understood in context and how mature the tooling is. Eg: rate limiting in REST is rarely problematic because we have no end of WAFs, gateways, and circuit breaker libraries that understand REST very well.

It's very true that a few years ago the gap was there, and people implementing GQL had to solve all these problems without access to these mature solutions.

But these days, they're mostly solved problems. Authorization by type that propagates through the graph is pretty simple. Rate limiting and complexity limits on queries are often built into the frameworks. Query parsing is solved with persisted queries, which all the major frameworks support as first-class citizens. Performance: there are lots of GQL caching solutions out there, and the "multi-threaded by default" model used by most frameworks helps a lot.

Etc etc etc.

I jumped companies a lot and did both. For a while I too thought GQL was redundant; then I went back to a more classic world after a successful GQL implementation. I had forgotten how many problems were solved, especially organizational and people-communication issues, through GQL. And so we're moving to GQL again.

Someone in the comments mentioned GQL is a tech solution for an organizational problem. I'd say most good software tooling and frameworks ARE tech solutions for org problems, because technology is usually a solution to people problems. GQL is the peak of that, and it does it quite well. Don't use GQL to solve tech problems; there are better tools for those.

