ashertrockman's comments | Hacker News

Somebody could use this as a starting point. http://touchscale.co/ You'd have to collect new data on touch strength vs. weight to get the regression parameters.

(If you do this, let me know and I can add it to the site above, and then we can both delight in the surprisingly large amount of unmonetizable traffic it gets.)
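Roughly, the calibration just amounts to fitting a regression from the browser's reported touch force to known weights. A minimal sketch of that step (the force/weight numbers below are made up for illustration, not real calibration data):

  import numpy as np

  # Hypothetical calibration pairs: browser-reported touch force
  # (e.g. Touch.force, roughly 0..1) vs. known reference weight in grams.
  force = np.array([0.10, 0.25, 0.40, 0.55, 0.70])
  weight = np.array([20.0, 55.0, 95.0, 130.0, 170.0])

  # Least-squares fit of weight = a * force + b.
  a, b = np.polyfit(force, weight, 1)

  def estimate_weight(f):
      return a * f + b

  print(round(estimate_weight(0.5), 1))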


On iPhones at least a hack was to rest a metal spoon on the screen and weigh something in the spoon...


If anyone happens to be using an iPhone 6S... http://touchscale.co/


This worked all the way up through the iPhone XS.


The single most irritating killed feature from Apple. Redesign half of their UI to rely on 3D Touch to make sense, then get rid of 3D Touch without redesigning the UI. Previewing links, moving the cursor, interacting with items, they’re all “press and hold until haptic feedback” instead of “quickly press hard and get immediate feedback.” Easier to accidentally trigger, slower to trigger on purpose.


Hardware cost + extra weight (you need to make the glass thicker so it can handle the extra force without pushing on the display). Turns out nobody was really using it because discoverability sucked.


Hardware cost & weight, fine. Glass doesn't need to be thicker than it currently is (I can press on my 13 Pro's screen about twice as hard as was needed for 3D Touch's max depth, and no issues with the screen), and the last time I replaced a battery on a 12, the screen was just as thick as the XS.

>Turns out nobody was really using it because discoverability sucked..

Sure, but then after removing 3D Touch, redesign the UI so it isn't equally undiscoverable but less precise. Even on the latest iOS beta with its full redesign, there are still many, many actions that require a long press and are completely undiscoverable. (For example, if you don't have the Shazam app installed, go find the list of songs Siri has recognized when asked "What's this song?" Don't look up the answer.)


> Glass doesn't need to be thicker than it currently is (I can press on my 13 Pro's screen about twice as hard as was needed for 3D Touch's max depth, and no issues with the screen)

I don't think this is a great argument. Maybe the glass needs to be thicker so the sensors on the border can properly measure the pressure, not because the screen is close to shattering.


Maybe you had a hard time parsing his comment.

He is capable of pressing twice as hard as the feature required at maximum. The screen handles 2x the maximum without issues. Therefore, the glass is thick enough to handle half that pressure, as required by the feature.

It's a good argument.


As far as I know, the pressure is measured around the edge of the screen. If the screen is too thin, it could bend when pressed and the pressure applied to the center of the screen can't be properly measured. I don't think the problem with a too-thin screen is that it breaks when you press it.


For what it’s worth, I made the same parsing error upon first read.


stiffness != strength


The discoverability sucked because Apple never rolled this out to all of their devices, grossly underutilized the feature themselves, and eventually ghosted it.

It was by far the best cursor control paradigm on iOS. Now everything is a long press, which is slow and just as error-prone.

I'm all for proposing different paradigms as accessibility options, but 3D Touch was awesome.


3D Touch was amazing for typing alone, I miss it basically every day when I type more than a couple of words on my phone. It was so great to be able to firm-press and slide to move the insertion point, or firmer press to select a word or create a selection. It was like a stripped down mobile version of the kind of write-and-edit flow of jumping around between words that I can get on a proper keyboard with Emacs keybindings drilled into my brain.


You can still move the cursor by long pressing on the space bar, in case you didn't know. There's no equivalent replacement for the selection behavior you're describing, though (as far as I'm aware).


I don't understand why Apple doesn't just let us slide to move the cursor. Who needs force touch?


Nobody? Really? It’s definitely the UX feature I miss most on modern iPhones. Long press feels janky in comparison.


Really? For me it's the "open image in new tab" option in Safari.

I have no idea why you'd go out of your way to remove that, other than to placate image-sharing services.


Apple UX is generally very bad, especially around discoverability.


I hated when my mother-in-law came to me for help using her iPhone. She had a hard time controlling and understanding 3D Touch.


I don't like it when old people are the reason the rest of us can't have nice things. Some grandma in Nebraska can't use 3D Touch, and now the rest of Apple's customers are deprived of it.


There was a principle of UI design that all UI actions should be discoverable, either with a visible button or a menu item in the menus at the top of the screen (or window on Windows). This is annoying for power users and frequently used actions, so those can also be made available with keyboard shortcuts or right-click actions or what have you, but they must always be optional. This allows power users to be power users without impacting usability for novices.

We've been losing this idea recently, especially in mobile UIs where there's a lot of functionality, not much space to put it in, and no equivalent of the menu bar.


For many, old people are the reason the rest of us have nice things.


When I had an iPhone XS, I could never understand how to predictably do a normal touch or a 3D Touch, or where exactly the OS had different actions for one vs. the other.

And I play games [1] using just my MacBook Pro's trackpad...

[1] For example, Minecraft works perfectly without a mouse. So does Path of Exile. First-person shooters, of course, don't.


  iPhone 6s and 6s Plus (2015) - First to introduce 3D Touch
  iPhone 7 and 7 Plus (2016)
  iPhone 8 and 8 Plus (2017)
  iPhone X (2017)
  iPhone XS and XS Max (2018) - Last models with 3D Touch

Interesting that the 2nd/3rd generation iPhone SE, despite the iPhone 8 form factor, does not have 3D Touch but "Haptic Touch" instead.


The choice of activation function isn't entirely clear to me, but I think it's definitely possible to make a network that operates entirely in the frequency domain. It would probably be pretty easy to start experimenting with such a thing with the nice complex number and FFT support in PyTorch 1.8. :)

Like you said, there's already a significant connection between convolutional networks and the Fourier domain (the convolution theorem).

Tangentially, I've recently worked on a project that focused on implementing convolution in the Fourier domain, and how that allows one to control useful properties of convolutions (like "smoothness" and orthogonality).

I made a demonstration of convolution in the Fourier domain in PyTorch, which you might find interesting: https://nbviewer.jupyter.org/github/locuslab/orthogonal-conv...

More generally, you could look here for more code and the corresponding paper: https://github.com/locuslab/orthogonal-convolutions
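If you just want to see the convolution theorem in action without opening the notebook, here is a minimal sketch (assuming PyTorch >= 1.8 for torch.fft; it shows 1-D circular convolution only, not the orthogonality machinery from the paper):

  import torch

  torch.manual_seed(0)
  n = 8
  x = torch.randn(n)  # signal
  k = torch.randn(n)  # kernel

  # Convolution theorem: circular convolution in the spatial domain
  # equals elementwise multiplication in the Fourier domain.
  y_fft = torch.fft.ifft(torch.fft.fft(x) * torch.fft.fft(k)).real

  # Direct circular convolution, y[i] = sum_j x[j] * k[(i - j) mod n].
  y_direct = torch.stack(
      [(x * k[(i - torch.arange(n)) % n]).sum() for i in range(n)])

  print(torch.allclose(y_fft, y_direct, atol=1e-5))  # True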


SIGBOVIK is proud to have published this groundbreaking work in its 2021 proceedings, which were released today.

Also, check out the author's video: https://youtu.be/HLRdruqQfRk

For more non-serious research that is often executed seriously, see: http://sigbovik.org


Really clever. I am working through EOPL, and this seems pretty familiar; it's the procedural representation of a database!

Can't wait to deploy this with my next web app.
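(For anyone who hasn't read EOPL: "procedural representation" means representing the data structure purely as the procedure you'd use to query it. A minimal sketch in Python, with made-up names, of a key-value "database" in that style:)

  # The database is nothing but its lookup procedure.
  def empty_db():
      def lookup(key):
          raise KeyError(key)
      return lookup

  def extend_db(db, key, value):
      def lookup(k):
          return value if k == key else db(k)
      return lookup

  db = extend_db(extend_db(empty_db(), "a", 1), "b", 2)
  print(db("a"), db("b"))  # 1 2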


What's EOPL?


Essentials of Programming Languages; an introduction to functional programming, as well as other concepts.


EOPL isn't really about FP... it's more about programming languages.



It is definitely more accurate than that. I made some changes to make sure it is compatible with all sensitivity settings.

