mdwrigh2's comments | Hacker News

Yeah, that's exactly what happened with the Pixel C. There was a lot of politics around who would own tablets and laptops at the time, which meant the winds changed direction ~yearly, hence the horribly confusing product lineups that resulted.


Wrong Ian -- Ian Hickson wrote about that in a blog post; this post is about Ian Lance Taylor.


You can disable them, which is functionally uninstalling. Actually being able to uninstall them isn't technically possible because their base images live on a read-only partition, so they survive factory reset (which erases all RW partitions). The OS could _pretend_ they're uninstalled, but that seems strictly worse than the existing presentation, which more accurately reflects reality.


Not exactly the same situation, but I keep wondering: is deleting personal data only "pretending", since most of it could potentially be restored for quite some time (especially on transistor storage)?


> When we refer to "log", this almost always means a camera gamma - an OOTF.

Wouldn't this be the OETF? OOTF would include the EOTF, which is typically applied on the display side (as you noted).


You're right, looks like I got the acronym wrong. I'm referring to OETF.


> The ritual to compile, pack and flash super.img into the device is absurd.

I typically only do a full flash for the first build after a sync. Afterwards I just build the pieces I'm changing and use `adb sync` to push them to the device, skipping both the step that packs the image files and the flash. The `sync` target will build just the pieces needed for an `adb sync` if you don't know exactly what you need to build; I typically use it so I don't have to even think about which pieces I'm changing when rebuilding.

So typical flow goes something like:

```
// Rebase is only needed if I have existing local changes
> repo sync -j12 && repo rebase

// I don't actually use flashall, we have a tool internally that also handles bootloader upgrades, etc.
> m -j && fastboot flashall

// Hack hack hack... then:
> m -j sync && syncrestart
```

Where `syncrestart` is an alias for:

```
syncrestart () {
    adb remount && adb shell stop && sleep 3 && adb sync && adb shell start
}
```

Incremental compiles while working on native code are ~10 seconds with this method. Working on framework Java code can still take a couple of minutes because metalava needs to run.


This is a really nice tip.

AFAIK, sync works on Linux only since it needs $ANDROID_PRODUCT_OUT. The problem is that I develop on a Windows machine (vscode with ssh extension) and my source code is on remote Linux machines (on premises) dedicated to building AOSP. Since I build for at least 5 other platforms, my working PC cannot cope with the current (and future) workload/space requirements, so I asked to move all the source to dedicated hardware for building. Perhaps I can do it with ADB over Wi-Fi...

I always thought sync worked with frameworks or packages, but since you mentioned "native" I guess it will also sync vendor stuff?


I'm not the parent you're asking, but figure you might be interested anyways since I could've likely made the same comment:

I work on displays within an OS team. Having some basic understanding of colour theory is critical for a significant number of modern display projects, particularly for the high end. For example, enabling colour-accurate rendering (games, photos, etc), shipping wide-gamut displays (how do you render existing content on a WCG display?), etc. More specifically to the Planckian locus, it generally comes up when deciding which white point to calibrate a given display to at the factory (e.g. iPhone is 6470K, S20 is 7020K in Vivid)[1][2] and if you're doing any sort of chromatic white point adaptation, like Apple's True Tone[3][4].
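
To make "calibrate to a white point" a bit more concrete, here's a minimal C++ sketch converting a correlated colour temperature like 6470K into CIE 1931 xy chromaticity using the standard cubic approximation of the Planckian locus. The function and variable names are mine, not from any shipping pipeline, and note that real calibration targets are often specified on the daylight locus, which sits slightly off the blackbody curve:

```
#include <cstdio>

// CIE 1931 xy chromaticity of a blackbody radiator at temperature T (Kelvin),
// using the cubic-polynomial approximation of the Planckian locus
// (valid roughly 1667K..25000K). Display white points like ~6470K or ~7020K
// fall on the upper branch.
struct Chromaticity { double x, y; };

Chromaticity planckianXY(double T) {
    double x;
    if (T < 4000.0) {
        x = -0.2661239e9 / (T * T * T) - 0.2343589e6 / (T * T)
            + 0.8776956e3 / T + 0.179910;
    } else {
        x = -3.0258469e9 / (T * T * T) + 2.1070379e6 / (T * T)
            + 0.2226347e3 / T + 0.240390;
    }
    double y;
    if (T < 2222.0) {
        y = -1.1063814 * x * x * x - 1.34811020 * x * x + 2.18555832 * x - 0.20219683;
    } else if (T < 4000.0) {
        y = -0.9549476 * x * x * x - 1.37418593 * x * x + 2.09137015 * x - 0.16748867;
    } else {
        y =  3.0817580 * x * x * x - 5.87338670 * x * x + 3.75112997 * x - 0.37001483;
    }
    return {x, y};
}

int main() {
    const double whitePoints[] = {6470.0, 7020.0};  // targets mentioned in [1][2]
    for (double cct : whitePoints) {
        Chromaticity c = planckianXY(cct);
        std::printf("%5.0fK -> x=%.4f y=%.4f\n", cct, c.x, c.y);
    }
    return 0;
}
```

This only covers the "pick a target" half; the factory step then measures the panel's native white and applies per-channel corrections to land on that chromaticity.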

My background before joining the team was a degree in math, but I really enjoyed doing low level projects in my spare time, so ended up on an OS team. We also have colour scientists who study this full time and have a _significantly_ better understanding of it all than I do :)

[1]: https://www.displaymate.com/iPhone_13Pro_ShootOut_1M.htm#Whi...
[2]: https://www.displaymate.com/Galaxy_S20_ShootOut_1U.htm#White...
[3]: https://support.apple.com/en-gb/HT208909
[4]: http://yuhaozhu.com/blog/chromatic-adaptation.html


> It seems somewhat silly on a non-realtime OS to grab one frame, request a 15ms wait, and then grab another frame, when your deadline is 16.6ms

A couple of reasons this isn't as silly as it seems:

1) ~All buffers in Android are pipelined, usually with a queue depth of 2 or 3 depending on overall application performance. This means that missing a deadline is recoverable as long as it doesn't happen multiple times in a row. I'd also note that since Netflix probably only cares about synchronization and not latency during video playback, they could have a buffer depth of nearly anything they wanted, but I don't think that's a knob Android exposes to applications.

2) The deadline is probably not the end of the current frame but rather the end of the next frame (i.e. ~18ms away) or further. The application can specify this with the presentation time EGL extension[1] that's required to be present on all Android devices (see the sketch below).

[1]: https://www.khronos.org/registry/EGL/extensions/ANDROID/EGL_...
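
Since that extension is just a per-frame timestamp hint, here's a rough C++ sketch of how an app might use it. This is my own illustration, not Netflix's or Android's actual code; `presentFrame` and `kFramePeriodNs` are invented names and error handling is omitted:

```
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <time.h>
#include <stdint.h>

// Hypothetical 60Hz frame period; a real player would derive this from the
// display's refresh rate and the content's frame rate.
static const int64_t kFramePeriodNs = 16666667;

// Queue one frame with an explicit target display time, so the compositor
// latches it at (roughly) the requested vsync rather than as soon as possible.
void presentFrame(EGLDisplay dpy, EGLSurface surface) {
    // Extension entry point, looked up at runtime (EGL_ANDROID_presentation_time).
    static PFNEGLPRESENTATIONTIMEANDROIDPROC presentationTime =
        reinterpret_cast<PFNEGLPRESENTATIONTIMEANDROIDPROC>(
            eglGetProcAddress("eglPresentationTimeANDROID"));

    timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    int64_t now = static_cast<int64_t>(ts.tv_sec) * 1000000000LL + ts.tv_nsec;

    // Aim ~two frame periods out: the deadline is the *next* vsync (or later),
    // not the one that's about to happen, so a late wakeup is still recoverable.
    if (presentationTime) {
        presentationTime(dpy, surface, now + 2 * kFramePeriodNs);
    }

    // ... render the frame ...
    eglSwapBuffers(dpy, surface);
}
```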


Skia has been around for 15 years (open source for 12) and is used in a number of high priority projects for Google (e.g. Android, Chrome). It's incredibly unlikely it will be killed any time soon.


> At a meta-level, I'm surprised that something with so factual (and testable) an answer can still not be settled.

It is absolutely settled, and has been tested over and over again. Power is roughly proportional to the amount of light emitted[1], so having dark grey is absolutely a power savings over pure white.

Google slides: https://www.theverge.com/2018/11/8/18076502/google-dark-mode...
Display energy modeling: https://onlinelibrary.wiley.com/doi/am-pdf/10.1002/stvr.1635

[1]: This isn't totally true, mostly because the display is broken into RGB elements that emit light with differing efficiencies, and human perception of the brightness of those elements is not identical.
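
To put a rough number on the proportionality claim: pixel values are gamma-encoded, so you have to undo the sRGB transfer function before comparing emitted light. A back-of-the-envelope C++ sketch (the per-channel weights are placeholders, per the caveat in [1], not measured subpixel efficiencies, and real panels also have a fixed baseline power term):

```
#include <cmath>
#include <cstdio>

// sRGB EOTF: decode an 8-bit channel value to linear light in [0, 1].
double srgbToLinear(int v8) {
    double c = v8 / 255.0;
    return (c <= 0.04045) ? c / 12.92 : std::pow((c + 0.055) / 1.055, 2.4);
}

// Very rough relative emission for one pixel: a weighted sum of linear
// per-channel light. Equal weights are an assumption; per [1], real red,
// green and blue subpixels have different efficiencies.
double relativeEmission(int r, int g, int b) {
    const double wr = 1.0, wg = 1.0, wb = 1.0;  // placeholder weights
    return wr * srgbToLinear(r) + wg * srgbToLinear(g) + wb * srgbToLinear(b);
}

int main() {
    double white = relativeEmission(255, 255, 255);
    double grey  = relativeEmission(0x33, 0x33, 0x33);  // "dark grey" #333333
    std::printf("dark grey emits ~%.1f%% of white's light\n",
                100.0 * grey / white);  // roughly 3% with these assumptions
    return 0;
}
```

Even before accounting for per-channel differences, a dark grey UI ends up emitting only a few percent of what a white one does over the same area.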


It's "Don't be evil" and it's still in the code of conduct:

  And remember… don’t be evil, and if you see something that you think isn’t right – speak up!
- https://abc.xyz/investor/other/google-code-of-conduct/


[^1] is the particular source I had in mind. It seems to no longer be a value but part of the signoff.

[^1]: https://gizmodo.com/google-removes-nearly-all-mentions-of-do...


The last sentence in your article:

'The updated version of Google’s code of conduct still retains one reference to the company’s unofficial motto—the final line of the document is still: “And remember… don’t be evil, and if you see something that you think isn’t right – speak up!”'

So they literally just moved it to the end of the code of conduct and did not remove it.

