Somebody could use this as a starting point. http://touchscale.co/ You'd have to collect new data on touch strength vs. weight to get the regression parameters.
(If you do this, let me know and I can add it to the site above, and then we can both delight in the surprisingly large amount of unmonetizable traffic it gets.)
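The regression step is simple once you have the data. A minimal sketch of the calibration fit (all force/weight numbers below are made-up placeholders; you'd substitute your own touch-strength-vs-weight measurements):

```python
import numpy as np

# Hypothetical calibration data: normalized touch-force readings recorded
# while known weights rest on the screen. These numbers are placeholders --
# real values would come from your own measurements.
forces = np.array([0.10, 0.21, 0.33, 0.41, 0.52])    # raw force readings
weights = np.array([20.0, 40.0, 60.0, 80.0, 100.0])  # grams

# Least-squares linear fit: weight ~= a * force + b
a, b = np.polyfit(forces, weights, 1)

def estimate_weight(force):
    """Map a raw force reading to an estimated weight in grams."""
    return a * force + b
```

A linear fit is probably only a first approximation; the force sensor's response may well be nonlinear near its limits, in which case a higher polyfit degree or a lookup table would do better.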
The single most irritating feature Apple has ever killed. They redesigned half of their UI to rely on 3D Touch to make sense, then got rid of 3D Touch without redesigning the UI. Previewing links, moving the cursor, interacting with items: they're all "press and hold until haptic feedback" instead of "quickly press hard and get immediate feedback." Easier to trigger accidentally, slower to trigger on purpose.
Hardware cost + extra weight (the glass needs to be thicker to handle the extra force without pressing on the display). Turns out nobody was really using it because discoverability sucked.
Hardware cost & weight, fine. But the glass doesn't need to be thicker than it currently is: I can press on my 13 Pro's screen about twice as hard as 3D Touch's maximum depth required, with no issues, and the last time I replaced a battery on a 12, the screen was just as thick as the XS's.
> Turns out nobody was really using it because discoverability sucked.
Sure, but then redesign the UI after removing 3D Touch to not be equally undiscoverable but less precise. Even on the latest iOS beta with its full redesign, there's still many, many actions that require a long press that are completely undiscoverable. (For example, if you don't have the Shazam app installed, go find the list of songs Siri has recognized when asked "What's this song?" Don't look up the answer.)
> Glass doesn't need to be thicker than it currently is (I can press on my 13 Pro's screen about twice as hard as was needed for 3D Touch's max depth, and no issues with the screen)
I don't think this is a great argument. The glass may need to be thicker so the sensors at the border can properly measure the pressure, not because the screen is close to shattering.
He is capable of pressing twice as hard as the feature required at maximum. The screen handles 2x the maximum without issues. Therefore, the glass is thick enough to handle half that pressure, as required by the feature.
As far as I know, the pressure is measured around the edge of the screen. If the screen is thin enough, it could bend when pressed and the pressure applied to the center of the screen can’t be properly measured. I don’t think the problem with a too thin screen is the screen breaking when pressing it.
The discoverability sucked because Apple never rolled this out to all of their devices, grossly underutilized the feature themselves, and eventually ghosted it.
It was by far the best cursor control paradigm on iOS. Now everything is a long press, which is slow and just as error-prone.
I'm all for proposing different paradigms as accessibility options, but 3D Touch was awesome.
3D Touch was amazing for typing alone, I miss it basically every day when I type more than a couple of words on my phone. It was so great to be able to firm-press and slide to move the insertion point, or firmer press to select a word or create a selection. It was like a stripped down mobile version of the kind of write-and-edit flow of jumping around between words that I can get on a proper keyboard with Emacs keybindings drilled into my brain.
You can still move the cursor by long pressing on the space bar, in case you didn't know. There's no equivalent replacement for the selection behavior you're describing, though (as far as I'm aware).
I don't like it when old people are the reason the rest of us can't have nice things. Some grandma in Nebraska can't use 3D touch and now the rest of the demographic of Apple's customers are deprived of it.
There was a principle of UI design that all UI actions should be discoverable, either with a visible button or a menu item in the menus at the top of the screen (or window on Windows). This is annoying for power users and frequently used actions, so those can also be made available with keyboard shortcuts or right-click actions or what have you, but they must always be optional. This allows power users to be power users without impacting usability for novices.
We've been losing this idea recently, especially in mobile UIs where there's a lot of functionality, not much space to put it in, and no equivalent of the menu bar.
When I had an iPhone XS, I could never work out how to predictably do a normal touch versus a 3D Touch, or where exactly the OS had different actions for one vs. the other.
And I play games [1] using just my macbook pro's trackpad...
[1] For example, Minecraft works perfectly without a mouse. So does Path of Exile. First person shooters ofc don't.
iPhone 6s and 6s Plus (2015) - First to introduce 3D Touch
iPhone 7 and 7 Plus (2016)
iPhone 8 and 8 Plus (2017)
iPhone X (2017)
iPhone XS and XS Max (2018) - Last models with 3D Touch
Interesting that the 2nd- and 3rd-generation iPhone SE, despite using the iPhone 8 form factor, do not have 3D Touch but "Haptic Touch" instead.
The choice of activation function isn't entirely clear to me, but I think it's definitely possible to make a network that operates entirely in the frequency domain. It would probably be pretty easy to start experimenting with such a thing with the nice complex number and FFT support in PyTorch 1.8. :)
Like you said, there's already a significant connection between convolutional networks and the Fourier domain (the convolution theorem).
Tangentially, I've recently worked on a project that focused on implementing convolution in the Fourier domain, and how that allows one to control useful properties of convolutions (like "smoothness" and orthogonality).
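That convolution-theorem connection is easy to check numerically. A minimal sketch in NumPy (PyTorch's `torch.fft` mirrors the `np.fft` API closely); note the theorem gives *circular* convolution, which is why FFT-based convolution layers need padding to emulate the usual zero-padded kind:

```python
import numpy as np

def circular_conv(x, k):
    """Direct circular convolution: y[i] = sum_j x[(i - j) mod n] * k[j]."""
    n = len(x)
    return np.array([sum(x[(i - j) % n] * k[j] for j in range(n))
                     for i in range(n)])

def fft_conv(x, k):
    """Same operation via the convolution theorem:
    FFT both signals, multiply pointwise, inverse FFT."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)))

rng = np.random.default_rng(0)
x = rng.random(8)
k = rng.random(8)
assert np.allclose(circular_conv(x, k), fft_conv(x, k))
```

The pointwise multiply in `fft_conv` is what a "network operating entirely in the frequency domain" would replace spatial convolution with; the open question the parent raises is what a good activation looks like on those complex-valued spectra.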