Surprisingly, it does matter! The key abstraction here is partial evaluation combined with memory-safe languages.

Look at how the JVM world does it with GraalJS (or GraalPython, or the other languages). This approach eliminates the sorts of vulnerabilities V8 is talking about, because the semantics of the language are defined as a memory-safe interpreter, which is then converted directly to JIT-compiled code by having the compiler treat the interpreter's data structures as constants during its constant-folding passes.
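To make that concrete, here's a toy sketch in plain Java of what a language operation looks like in this style. It is not the real Truffle/Graal API; Node, Frame, and IndexNode are invented for illustration. The point is that the array-read semantics are ordinary, bounds-checked Java, and the JIT is derived by partially evaluating execute() against a fixed node tree whose fields are treated as constants.

    // Minimal stand-ins, assumed for this sketch (not the real Truffle API).
    abstract class Node {
        abstract Object execute(Frame frame);
    }

    final class Frame {
        final Object[] locals;
        Frame(Object[] locals) { this.locals = locals; }
    }

    // Toy interpreter node for the language's "a[i]" operation.
    // During partial evaluation the compiler specializes execute() for one
    // concrete tree, treating arrayChild/indexChild as constants and inlining
    // their execute() methods, so the JIT-compiled code is derived directly
    // from this memory-safe source.
    final class IndexNode extends Node {
        private final Node arrayChild;  // constant to the partial evaluator
        private final Node indexChild;  // constant to the partial evaluator

        IndexNode(Node arrayChild, Node indexChild) {
            this.arrayChild = arrayChild;
            this.indexChild = indexChild;
        }

        @Override
        Object execute(Frame frame) {
            Object[] array = (Object[]) arrayChild.execute(frame);
            int index = (Integer) indexChild.execute(frame);
            // The bounds check is part of the semantics; it survives into the
            // compiled code unless the optimizer can prove the index is in range.
            if (index < 0 || index >= array.length) {
                throw new IllegalStateException("index out of bounds: " + index);
            }
            return array[index];
        }
    }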

This gives you a one-two win:

1. The example given at the top of the blog post wouldn't be exploitable in interpreter mode in GraalJS, because the VM intrinsic would be written in Java and thus bounds-checked.

2. It's also not exploitable once JIT-compiled, because the intrinsic and its bounds check have been inlined into the user's code and optimized as a unit (meaning the bounds check might be removed if the compiler can prove that the user's implementation of ToNumber doesn't modify the size); see the sketch after this list.
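Here's a similarly hedged sketch of point 2, with invented names (JSArray, ArrayBuiltins, arrayAt, and toNumber are not GraalJS internals): because the intrinsic is plain Java, the backing storage and its length are read after the ToNumber step, and the bounds check is compiled as one unit with the load, so user code that resizes the array during ToNumber can't desynchronize them.

    // Minimal stand-ins, assumed for this sketch (not GraalJS internals).
    final class JSArray {
        Object[] storage = new Object[0];
    }

    final class Undefined {
        static final Undefined INSTANCE = new Undefined();
    }

    final class ArrayBuiltins {
        // Stand-in for JS ToNumber; a real engine may dispatch into user
        // valueOf()/toString() here, which can run arbitrary JS.
        static int toNumber(Object value) {
            return (value instanceof Integer) ? (Integer) value : 0;
        }

        // Hypothetical built-in for reading arr[index]. The JIT may drop the
        // bounds check only if it proves toNumber() cannot change arr's storage.
        static Object arrayAt(JSArray arr, Object indexObj) {
            int index = toNumber(indexObj);            // may run user JS
            Object[] storage = arr.storage;            // re-read after ToNumber
            if (index < 0 || index >= storage.length) {
                return Undefined.INSTANCE;             // out of bounds: undefined
            }
            return storage[index];
        }
    }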

GraalJS has nearly V8-level peak performance, so this technique doesn't require big sacrifices for server workloads, and client workloads where latency matters more are now a focus of the team. In this way you can build a high-performance JS engine that still has tight security. It also supports code sandboxing.

(Disclosure: I do part time work for Oracle Labs and sit next to some of the people doing that work)


