stoptalkingshit's comments

I think the "verified" thing just means they've provided a certain amount of id - as it does on most sites.


Piss off. All this bullshit about people taking some personal attribute or behavior and insisting that it's a major part of their "identity" is nothing more than a defense mechanism for losers.


Oh God. Really?


Like the considerate people who keep importing loser attitudes into computing, telling everyone else how to behave, etc.


Yes, this person who can slander GitHub all over the twitterverse with the dreaded "sexist" label with no evidence is totally powerless. lol.


This is going to hurt Horvath more than it hurts GitHub.

She's going to receive a torrent of abuse for speaking out like this, whether her version of events is accurate or not.


>This is going to hurt Horvath more than it hurts GitHub.

Really? You'd rather pay out a $250,000 settlement than incur an increase in the number of negative comments you receive on the internet for a couple weeks?


If you do a little research about how things work out for whistle blowers you might not be quite as confident about what a great deal this is for her.


And yet, she initiated that chain of events. You should think more carefully about what the word "power" means.


Apparently her exit from Github was to be a secret, but an anonymous person publicly posted about it to a social network with a nasty comment. So somebody inside Github leaked it. To the mattresses!


Are you aware of how patronising this response is? If you want to make a comment about the nature of power, make it. Don't tell me to think more carefully.

I'm not sure precisely which event you're saying she initiated, but if you're talking about her quitting her job and then speaking out like this: you should think about why someone would put their reputation and livelihood on the line.


"If you are one of the people in this thread jumping to defend GitHub in this situation, ask yourself why."

"If you want to make a comment about [...], make it. Don't tell me to think more carefully."

Pot, meet kettle.


Difference is I don't actually know why people are defending GH here, whereas the other commenter has something specific to say to me about power (and they did).


The nature of power is that the person who sets the ball rolling is the one in power. That's why a sculptor is the one with the power, not the chisel. The fact that there's some appearance of backlash doesn't alter that. And as to which chain of events, how could I be any clearer? The chain of events starting with her deciding to go public. The rest of your comment is just shallow moralizing that doesn't tell us anything about who has the power. I suppose you're the one to always side with the crying woman, because that's as deep as your understanding of power goes. It doesn't occur to you that tears themselves have power.


[deleted]


She did set it in motion. And she was clearly - in part - set in motion by others. But she has far more power than the person I was responding to is acknowledging. Tiger Woods has millions in the bank too, and he still had to self-flagellate when he wronged his wife. In summary: you are an idiot.


In summary: you are an idiot.

This kind of comment does not belong here. Please remember, this is not Slashdot or Reddit. We try to keep the level of discourse a little higher here at HN.


The chain of events did not start with 'her deciding to go public.' In fact neither of us know what started this ugly mess. But one thing we do know is that it started long before she went public. For example, the actions of the founder's wife were long before, as were many other hostile acts that have been alleged.


The chain of events that begins with her going public is part of a larger chain of events, which includes her employment at GitHub and the events she has described. That was not clear from your comment, and it has large ramifications for who started what.

My position is simply that defending GitHub and dissecting her story is unnecessary at best and harmful at worst. GitHub will respond, and presumably more information will be brought forward by other parties. I will reserve judgement (not that it's really my place to judge at all!) until more of the story has emerged.

Your characterisation of my position betrays your sexist bias. Of course anyone with a voice has power. But why would someone quit their job and ignite a shitstorm of drama unless something really bad had happened to them? Some benefit of the doubt and empathy for Horvath seems appropriate at this juncture.


> Your characterisation of my position betrays your sexist bias.

If you accuse someone of something so serious and cringe-inducing, you should at least try to argue for it/substantiate it.


Hilarious that you run the same tactic of accusing me of being "sexist" to discredit me. Nor does your response address anything other than your own deranged imaginings. I didn't say something bad didn't happen to them. What I said was that she is not powerless. You tried to cast her as powerless - you were quite clear on this, and you were wrong. And somehow in your deranged, imbecilic mind pointing that out is "mischaracterizing" your point-of-view, whereas rabbiting on about things I never even said and accusing me of being a "sexist" is not. All I can say is: LOL


I didn't say she was powerless. I said that GitHub has the power. On balance, they do.


Clueless. Just clueless. None of these benefits are tied to the crap browser platform that's been foisted on us. The stewards of the browser did a terrible job of designing it, as proven by shoddy hacks like asm.js. Did you know that it took WebGL to get them to introduce typed arrays? They weren't smart enough to see the usefulness of this, they had to bumble into it. Same with async HTTP requests. So woopty-doo you've caught up to where the JVM was over a decade ago. Semantic HTML has been destroyed by this mindset also.


WTF? You simply go around calling people clueless without knowing their background? What is wrong with you?

The browser platform is not perfect, it's not close to perfect, it's evolving to do something it was not originally designed to do, but it is still the best shot at cross-platform software. You can pick the JVM for all the elegance it had; it never achieved what simple servers with bad markup languages did. There is a reason why you are typing in a browser right now and not using a Java applet. The reason is that even with all the drawbacks of web technology, it works, it is easy to implement, it is easy to learn. It's not about über programmers in ivory towers, it's about empowering everyone with a technology that is easy to grasp. It takes a very good professional to craft a good web app, but it is still easy enough that people from non-technical backgrounds can share knowledge. It's a platform for sharing stuff.

Yes they added typed arrays just now. The engines evolved a lot during the last two years. Lots of things are being ironed out. Instead of just seeing deficiencies, see how progress is being made with something that is open, free and common.


So what you're saying is it's lowest-common-denominator crap? Then we agree!


None of these benefits are tied to the crap browser platform that's been foisted on us.

No, but they are tied to the web, for which browsers and JavaScript are an incumbent technology. You are right, in the same way that Windows has nothing in particular to do with games, outside of a fortunate adoption of DirectX. It's just that it's the way the world turned out. Admittedly, there are downsides.


The popularity of DirectX doesn't vindicate Windows either. So what's your point? It's shit but, eh, that's the way the world turned out? People have every right to be annoyed because the people who were entrusted with the concept of the web implemented it very badly. At least on Windows there's a serious effort to give me as much of the machine's power as possible. I had a better programming experience with Java applets more than a decade ago. It is a joke.


It's shit but, eh, that's the way the world turned out?

We are in agreement.

People have every right to be annoyed because the people who were entrusted with the concept of the web implemented it very badly.

Again, no disagreement. But in the meantime, implementing things is still my preferred activity to complaining.


Well, I agree. And asm.js etc are better than what we had before, so some praise should be allocated to them. But it is not innovative or elegant; it's a hack that takes us back to the state of the JVM more than a decade ago, and we're going to be paying the cost for it for some time. Hell, they can't even get the simplest possible thing - the syntax of HTML - right. So it will always anger me when I see web types touting their "innovation" etc. They are good at one thing: marketing to stupids.


They are good at one thing: marketing to stupids.

Windows. The market has spoken. Let's just all be smart enough to not be defeated by the marketing to stupids.


> At least on Windows there's a serious effort to give me as much of the machine's power as possible.

What is asm.js (and WebGL, for that matter) if not that?


WebGL is a deliberately limited but secure sandbox based on a spec for mobile devices that is two major releases out of date and makes an Xbox 360 look like science fiction.

Asm.js is an enormous hack where we limit the computational expressiveness of performance critical code by the semantics and standard library of a language notoriously shit at it.


> WebGL is a deliberately limited but secure sandbox based on a spec for mobile devices that is two major releases out of date and makes an Xbox 360 look like science fiction.

Unreal Engine 4 is a counterexample.

> Asm.js is an enormous hack where we limit the computational expressiveness of performance critical code by the semantics and standard library of a language notoriously shit at it.

How is it limited?


To be honest, the video in TFA looks like something out of a 2009 game, it's not particularly impressive. And I guess it'll probably be a rolling demo, not having the requirements of a complete game that also has to handle a bunch of computationally expensive stuff like collisions, game logic etc.


"Unreal Engine 4" is marketing buzzspeak. WebGL is modelled after OpenGL ES 2. Its GLSL support is massively outdated. It lacks features like multiple render targets (used by any deferred renderer in the XB360 era, e.g. Unreal Engine 3), geometry shaders, multisampled reads (for HDR MSAA and hybrid spatial/temporal AA techniques), and tons of other common techniques.

Asm.js is limited by having no 32-bit float type, no 64-bit int type, no SIMD, no dynamic memory allocation and the fact that they had to extend the language from day 1 just to be able to multiply a 32-bit integer by another quickly.
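For what it's worth, the integer-multiply extension referred to is Math.imul; a quick sketch of why plain JS multiplication wasn't enough:

```javascript
// All JS numbers are doubles, so a 32-bit x 32-bit product can exceed
// the 53-bit mantissa and silently lose its low bits:
const a = 0x7fffffff, b = 0x7fffffff;
console.log((a * b) | 0);     // 0 -- the low bits were rounded away before the |0
// Math.imul (added alongside asm.js) performs an exact C-style
// 32-bit wrapping multiply:
console.log(Math.imul(a, b)); // 1
```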

You may want to actually inform yourself.


> "Unreal Engine 4" is marketing buzzspeak. WebGL is modelled after OpenGL ES 2. Its GLSL support is massively outdated. It lacks features like multiple render targets (used by any deferred renderer in the XB360 era, e.g. Unreal Engine 3), geometry shaders, multisampled reads (for HDR MSAA and hybrid spatial/temporal AA techniques), and tons of other common techniques.

WebGL 2.0 has many of the extensions and is in Editor's Draft status: http://www.khronos.org/registry/webgl/specs/latest/2.0/ It has a prototype implementation already: https://wiki.mozilla.org/Platform/GFX/WebGL2

> Asm.js is limited by having no 32-bit float type, no 64-bit int type, no SIMD, no dynamic memory allocation and the fact that they had to extend the language from day 1 just to be able to multiply a 32-bit integer by another quickly.

asm.js has 32-bit floats already: https://hacks.mozilla.org/2013/12/gap-between-asm-js-and-nat...

64-bit ints and SIMD are being discussed on TC39 right now. Dynamic memory allocation (assuming you mean growing the typed array) isn't that important and we have ways to do it if we need to.
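Concretely, the 32-bit float support works through Math.fround, which rounds a double to the nearest float32 so engines can keep such values in single precision:

```javascript
// 0.1 is not exactly representable in float32, so rounding changes it:
console.log(Math.fround(0.1) === 0.1); // false
// 1.5 is exactly representable, so it round-trips unchanged:
console.log(Math.fround(1.5) === 1.5); // true
```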

> You may want to actually inform yourself.

This kind of comment is emblematic of HN's evaporative cooling, especially when it's inaccurate. :(


There are numerous security issues with WebGL, so I wouldn't really call it "secure".


A decade-late attempt to catch up with the JVM. Notice how there's no design here? It's because the web people had no idea this would be necessary and it got bolted on as a reaction to competitors. That is to say, the innovation belongs to plugin vendors, not the web people.


I would have been perfectly happy to be working with the JVM here, except that the JVM as a platform for web-type client stuff died because the user experience was implemented so shoddily.

There was no conspiracy against it. It had its chance. It used to have high install rates. It was placed about as perfectly for success as you could hope to be. If it hadn't failed so miserably at what it set out to do and be (and instead succeeded in a different area), we would be using it now, and living in the utopia you wish for.

As an aside, a lot of your complaints against web technologies make it sound as if there is a cohesive group of people designing something and doing it wrong. Actually it's a much more chaotic process involving a diverse group of people and companies and competing interests. It doesn't guarantee that we end up with the best designs, but such a process has benefits too, and since it's happening in the open, you could actually take part too (assuming you aren't already).


>It doesn't guarantee that we end up with the best designs, but such a process has benefits too, and since it's happening in the open, you could actually take part too (assuming you aren't already).

Unfortunately, what is needed is LESS people participating.


Of course they do! asm.js is a collection of everything in Javascript that optimizes on the current hardware. It's not the least bit dynamic, i.e. it is not Javascript.


Those are all hardware setups, whereas the browser is a very poorly designed portability layer (viewed in this way). It makes me physically angry how badly the browser vendors have dropped the ball here. It was obvious years ago that things were going this way, but instead of developing a proper application-development standard they just shoved everything into HTML. Every time I hear Mozilla talking about "innovation" I get upset because it is such a blatant lie. You have reinvented the past badly here. The best that can be said is you have been good at marketing the whole thing by constantly pretending that it's "just Javascript" etc, when the end result is completely static (i.e. not Javascript).


> The best that can be said is you have been good at marketing the whole thing by constantly pretending that it's "just Javascript" etc, when the end result is completely static (i.e. not Javascript).

asm.js executes according to the ECMA-262 semantics, which are the full dynamic semantics of JavaScript.


What GALL you have to be so willfully dishonest. asm.js is statically typed and has only the most coincidental relationship to Javascript. Just as if I take pointers out of C it is no longer C. What's funny is that you are exhibiting EXACTLY the "nothing to see here" behavior I just described by invoking some completely nominal relationship between Javascript and asm.js. And it is exactly this "nothing to see here" behavior that emphasizes how deep down the web people know they are simply reacting to competitors to retain market share.


If you wrote a C compiler that had special optimizations for a subset of C without pointers, then it would still be a C compiler, and the subset that it optimized for would still be C. (In fact, this is exactly the approach taken with languages like GLSL 1.0, although they added some extensions to make it not C anymore.)


I didn't say the Javascript implementations weren't Javascript implementations. Further, you are incorrect. A subset of C cannot be C. Are you dumb? If I make an expression language is this C because C has a little expression language inside it? It is a subset - and also a SUBSET OF AN INFINITE SET OF OTHER LANGUAGES. Hopefully you can understand my anger that baboons are apparently allowed to implement programming languages. They got it wrong with Javascript and did a complete about-turn with emscripten etc in REACTION to competitors. I look forward to laughing at your continued attempts to hide this embarrassing little abortion.


> the subset that it optimized for would still be C

This doesn't mean to imply the subset is the entirety of C; it means the subset is part of the definition of C — just like you can say "'int x = 4;' is C", even though it's a (demonstration of a) subset of C.

The JavaScript which happens to be optimised under the name "asm.js" is still JavaScript that executes in any non-optimising runtime, and the semantics with those optimisations and without are the same.


Javascript was the wrong design choice, and the asm.js people are very sensitive about this. Which is why they pretend it's "just Javascript". Nobody denies that it's backward compatible. But it is a new language, the rest is word games.


People have been parroting this nonsense for years. You have WebGL etc in the browser now. Web "pages" aren't pages anymore, and there is no way to enforce semantically meaningful markup. All that's been accomplished by fighting against the future is to ensure that all these low level services were implemented poorly. You can have all your high-level, common concepts but they need to be built onto a layered system. What we have now is a laughable mess, and it proves that the browser vendors have no idea what they're doing.


HTML etc get flack because they're badly designed. What do you want? Semantic markup, document layout or distributed programming? Sorry but the web people got _everything_ wrong. We don't have a platform that does any one thing well. The whole thing is a laughable dog's breakfast with no layering, no foundations or unity of design. The better course is evolution.. for Mozilla, who have been very good at maintaining their market position while endlessly bungling platform design.


and yet despite its flaws it's become the most successful platform for content since moveable type. Java failed to do that. iOS, despite its financial success, is failing to do that. android is failing to do that. flash failed.

Aren't you the least bit curious why it succeeded despite not being to your taste? Do you think it's just accidental? That the laws of physics and rationality were suspended for one brief glorious moment in the 1990's? Are you a web creationist?

Or do you believe that there's an actual answer?


I think part of it was the wide availability of web browsers, combined with the explosive adoption rates of internet usage... as time moved forward, the entrenchment of existing sites (and their quirks) got baked in...

If we'd started with xml compatible markup (all tags must close in order), and no browsers supported a quirks mode... we'd have much cleaner web browser engines and a much more usable web today.

I think that JS has a few quirks as well... so does CSS... JS and CSS came after HTML, and even then have grown/distorted a bit. XHTML broke too many things, so we went pragmatic with HTML5. Just the same, no "new wheel" will get adopted in this space wholesale. People have ditched XHTML and run back to HTML5.

I think a lot of things could be better, and will get better... so long as there are billions of pages/sites out there as-is, quirks mode browsers aren't going away.


well-formedness? _that's_ the problem with the web? Not slow performance? Not no realistic offline story? Not a loosely-typed, dynamic language? Well-formedness.

Sigh.


How much of a browser's time is spent in JS vs. rendering? How much overhead is spent on parsing/rendering? JS isn't even 25% of the overhead on most sites, or even in dynamic applications. Rendering, reflows, and other UI work are. Understanding this in the scope of JS is important, as this is where it gets triggered. Hell, having something like AngularJS out of the box in the browser earlier on would have helped a lot.

Just the same, it started with well-formedness being loose, and continued from there.


Sure, slow rendering/layout/performance would be a perfectly reasonable complaint about the web platform.

The well-formedness thing is, sorry to be blunt, a totally crazy thing to complain about.

Every single platform that has any popularity introduces rough edges like this over time. It's impossible not to because every single bug that introduces relaxations gets baked in as content comes to rely on it. It is impossible to ever remove those relaxations, and really, it's totally fine.

There is a cost to lack of well-formedness, but on the list of problems with the web it is waaaaay waaaay down there.


"Content" is the key there. But I am at a loss as to what exactly iOS and Android are failing to do. I think they do very well with music, movies, books, etc, thank you very much. Hypertext content? Well, we do have the web for that.


oh, iOS and Android are doing just fine, as of this moment. Check back in 5-10 years though.

Just how exactly do you archive that content anyway? Anything you "buy" on those platforms is not something you own. It's something you're licensing for short term use. Doesn't sound like a great outlet for culture to me. It sounds like a death trap.


Well, how do you explain the fact that Web did not win on desktop? I am still puzzled why people think web should take over mobile for some reason, but never mention desktop.

  > Just how exactly do you archive that content anyway?
Why should I archive anything? I don't archive web pages I visit either.

  > Anything you "buy" on those platforms is not something you own.
I don't care if I "own" something. Owning for the sake of owning means nothing to me. If I pay for the book it is because I want to read it. If I pay for the music, it is because I want to listen to it. Even CDs I do own are now represented by their cloudy ghosts using iTunes Match. Why? Because they are always there. I don't have to walk around with a backpack full of CDs just in case I'd like to listen to a particular song. I can get it on any of the devices I use. Yeah, I don't have an install DVD for every app I bought. I don't care. I change my phone: they are already there. I get a new Mac: I go to the App Store app and just click "Install" for every app I want to have on that machine. Maybe to some it is a death trap, I don't know.


"Well, how do you explain the fact that Web did not win on desktop? I am still puzzled why people think web should take over mobile for some reason, but never mention desktop."

Define "win". From my point of view, the web has not only won desktop, but utterly dominated it, and relegated the rest of the OS to a mere substrate for webpages. The only things it hasn't really replaced are photoshop, final cut and protools. It's only a matter of time.

"Why should I archive anything? I don't archive web pages I visit either"

oh boy. Paging Jason Scott. Jason Scott on aisle 12.

" I don't care. I change my phone: they are already there. I get new Mac:"

I'm glad you can thrive only on corporately produced content you license for brief periods of time. Many people out in the world are not corporations, and produce things that they care about. Many use computers to do this. Most care about the thing they made, and not about the tech they made it with. And so it ends up in these closed off little data silos and proprietary formats- not only are these things not backed up, they can't be backed up.

And then those people die, and there is nothing left of that person except what they made on the iPad with iOS6. The apps the things live inside are not compatible with the newer iOS. When that iPad dies, it's like the father, the lost son, the missing daughter- they die for a second time.

But you know, it's good that owning that stuff doesn't matter to you, and therefore should not matter to anyone else.

ta.


The web did win on the desktop. New desktop apps are web apps, except for clients to sync your files with a cloud service. The popular desktop apps all predate the rise of the web.

Also, having books and movies stuck in your amazon or apple account is convenient for consumption, but horrible for creation. You can't DO anything with that content. I've wanted to extract interesting stuff from books I own before and was forced to make screengrabs. If you don't understand what's wrong with the media rental model, go read "the right to read" by stallman.


It is already there ... as long as the provider allows it. _that_ is the death trap. You can download a file (and back it up locally), not a stream.

Well, yes, technically, we as tech-savvy people could, but the commoners (not to say technophobes), they don't know how.

and that is how it is designed. You don't need backup, it's in the cloud. What happens when it is not anymore ?

It is not owning but rather preserving that is the concern. Is everything worth preserving though, I don't know.


It is not clear what you refer to when you say "buy".

By the way, you have drifted away from discussing the technological aspects of the Web. This line of argument doesn't help establish why HTML/JavaScript/CSS are better.


HTML can be archived. Backed up. Saved. Become a part of history.

Apps cannot.

That's why HTML matters.


The man accusing me of "web creationism" jumps from "HTML etc suck and were poorly designed from the start" to "there's no explanation for their success". And in the same breath you engage in magical thinking regarding cause of the popularity of the web (indeed, I would bet you aren't capable of mentally separating "the web" from implementation details like HTML, CSS etc). Creationism was highly "successful" for the longest time too. Bravo on looking an absolute fool.


It works just fine.. for non-realtime games (and has been widely used for such games since forever). I've done the tests - you cannot just "tweak" it, and even if you could you'd be relying on the networking stack to do something it's not intended to do.


You might be surprised. World of Warcraft and Guild Wars 2 are both TCP. It's pretty common for MMOs. It can also work well for some types of RTS games where the added latency isn't a major issue.


They're not real-time, though, in the sense that a delayed message is as good as no message. The tolerance is high enough that a few delays usually don't affect gameplay.


All flash games use TCP (as you can't use UDP in flash). There are many realtime flash games. Many MMORPGs use TCP for compatibility reasons: they are very much real-time, especially something like PVP in world of warcraft, etc. Guild Wars also uses TCP and is very much a 'twitch' based realtime game as well.


PVP in MMOs is hardly real time.


Spell interruption in warcraft? Also first youtube video for 'guild wars 2 pvp' : https://www.youtube.com/watch?feature=player_detailpage&v=KR...

Looks like real time to me.


WoW is the type of game where it can retroactively change things when data is received and have it mostly be unnoticeable.

Your spell interruption example illustrates it perfectly - spells can be interrupted if the interrupt is within the 1-2s cast time of the spell. It's adjudicated on the server side, where the server knows the spellcast start and sends feedback on the spell results (including whether any interrupts happened). If there is minimal delay then it works without being noticed; however, when I had significant packet loss, the spellcasting would 'end' for me only after the response (including the check for whether it was interrupted) was received, and that could be multiple seconds later.

The point is that WoW can mostly function if the orders and feedback are received with delays, since if some opponent is in reality (on the server) somewhere else than where I see them, it usually doesn't change anything - but for a rapid-movement FPS game, that would change hits to misses, which is the core of that game.


Didn't play much WoW, but aren't spell cast times like 2-3 secs? Even with a 0.5-1 sec delay, it would be playable. Never played Guild Wars.

Maybe my definition of real time is incorrect. But you don't point and click in MMOs generally; it is all AoE spells or targeted abilities. There are big margins of time or spell AoE to overcome delays of network packets.


1.5 second cast times with abilities to push that down to around 1.0s. A lot of the interrupts are off the global cooldown in order for players to use spell interrupts effectively.

0.5-1 seconds of delay may not seem like much, but it adds up to be quite significant, because the global cooldown would also be enforced client side, and so a player with higher delay would have done fewer actions in the same amount of time.


Spells vary in casting time. Many spells are instant.


- and the instant ones aren't interruptible I guess?

People, I don't even get what you're discussing here. That TCP should be fine for any game's network code because it works for WoW?


I disagree. Even back in 2005-2006 - almost a decade ago!, in Guild Wars pvp, it was possible to interrupt a 1 second spell with a 1/4 second one, leaving you with 750ms for human reaction time and latency.


World of Warcraft is not a "twitch" game. Ever multi-screened it? Why make all this stuff up? This may surprise you, but Flash games don't set the standard for quality network code.

