
> I've seen children accustomed to using touch interfaces blurring the line between physical and virtual, i.e swiping at physical objects

A friend of mine in college once wrote the date on a sheet of paper during an exam, expecting to find out what time it was...




> Touch screens are also pretty terrible for older people. Vision and fine motor skills both degenerate. Without tactile feedback, it is hard for older people to determine if they've successfully clicked on something. Gestures can be tough to teach and successfully implement without error.

And that's when people are holding their phones in a living room. In a car, the whole cabin is vibrating (normally), and the user is trying to drive, with one hand on the wheel, etc.


> Touchscreens are pretty natural and intuitive

Try watching someone in their 60s operate them and you will learn that they are anything but.

My dad finds them endlessly frustrating, as his first instinct when talking about something in an image is to put his finger on it. But that invariably triggers something on such a screen.

He also struggles with things like Google Maps on a PC with a scroll mouse, as to him the natural action is to "pull" the map closer rather than "zoom in". The end result is that he often finds himself scrolling the wrong way...


This! So many people attribute this "failed attempt" to bad parenting or stunted development and fail to realize it is only the natural progression of development.

Children (and adults) learn from their interactions with items, be it technology or not. Attempting to use "touchscreen behaviors" on something that isn't a touch interface is just another learning moment. They're attempting to use their knowledge and experiences to explore the world around them, as each generation did before us.

We're just more connected than we ever were, so this behavior is now more noticeable due to that connectivity and sharing.


This highlights something incredibly important. There's a very stark age line in people using touch screens by default. I TA an intro CS course, and I've noticed the line creep. It's somewhere between those born in 1998 and 2002. They just all like touch screens better because that's what they are used to.

I haven't even graduated and am feeling like a greybeard when it comes to this area. I have to wonder if it's simply that we were trained to reach for the mouse by default...


>Touch is magical.

Touch is at best a second-class input system. It's never going to have the kind of pointing precision we got used to twenty years ago with mice, and it's never going to be comparable for text-input with a dedicated keyboard. We use it on our phones because it's all we've got, and as terrible as touch screens on a device that small are (my Samsung phone has an effective touch resolution of about 4x7 finger tips, and any elements smaller than that I can't reliably interact with), the alternatives are worse.


> Touchscreens were intended to reduce the level of distraction

Uhm, what???


> even though it isn't really necessary for anything.

It is one of the best, if not the best, way of letting users know what their action will lead to, and of letting them explore the range of available actions.

By behaving similarly to a well-known interface, it simplifies discovery, lowers friction and reduces frustration.

So when smartphones and touch interfaces were new to most people, behaviours such as this one were major, although nearly invisible, product features.

Now? Yes, it mostly isn’t necessary for anything, but it was nice.


> anything with an app is about 15,000 times more useable than the touch surfaces for most modern appliances.

As a person with pretty good sight (140% with glasses), I have to say that these sorts of devices tend to be difficult to use even for me. For example, my father has an induction stove with a touch interface, and I commonly fail to properly use it on the "subsequent first tries".

When I see a touch screen interface that is completely static (same "buttons" at the same positions), I often wonder what led to the decision to utilize them instead of just using hardware buttons/knobs. Is it a cost-saving measure, or just some managerial person who decided they needed to use something more modern?


>Now if touch screens had a dynamic 3d surface that you could navigate blindly it would be something else.

It's not about touch or the physicality of a button or knob, but about a control that is not dependent on UI state.


> Touch interfaces provide no benefit over physical buttons

Unless you have something like arthritis.

Mind you, every other accessibility issue I thought of favoured physical buttons to some extent (eyesight, motor control, etc.)


I highly doubt you'll find a 14 year old who swipes at a book... kids very quickly learn context and the rules of physics. It's much harder when some screens are touch screens and some are not, though; I see adults trying to swipe non-touch screens all the time (mall displays, ads, iMacs, etc.)

You may not be convinced, but the generation of younger kids expect every screen to be a touch screen. It's amusing to watch my kids (3 and 5) touch the screen expecting it to react the same way an iPad or iPhone responds.

I have seen a 3 year-old failing to grasp how to use the arrow keys to control a character in a game, but swipe to review pictures on an iPhone spontaneously. There is one indirection less, which probably makes all the difference.

(This does not mean that touch interfaces are better, though -- just that they are more intuitive.)


> it requires active attention to operate.

It just requires context. How that context is established is critically important. If it is a hierarchical menu, then the context is the navigation path (i.e. the sequence of previous button pushes, each of which transitions from one state to the next). Importantly, with a fixed hierarchical menu, the path to a button's functionality doesn't change and can be memorized. With some audio feedback, the current state can also be announced, so that a person's mental state matches the state the interface is in.
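The idea above, that a fixed menu is just a state machine whose transitions can be memorized, can be sketched in a few lines of Python. The menu layout and names here are entirely hypothetical, purely for illustration:

```python
# A fixed hierarchical menu as a state machine: each button press is a
# deterministic transition, so a memorized press sequence always lands
# on the same function, whether or not you can see the display.
MENU = {
    "root": {"1": "settings", "2": "messages"},
    "settings": {"1": "volume", "2": "brightness", "0": "root"},
    "messages": {"1": "inbox", "0": "root"},
}

def navigate(start: str, presses: str) -> str:
    """Follow a sequence of button presses, announcing each new state."""
    state = start
    for key in presses:
        state = MENU.get(state, {}).get(key, state)  # ignore invalid keys
        print(f"now in: {state}")  # stands in for the audio feedback
    return state

# Pressing "1" then "2" from the root always reaches brightness:
assert navigate("root", "12") == "brightness"
```

Because the transition table never changes, the "1 2" sequence works blind; a touchscreen that rearranges its targets per screen state breaks exactly this property.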

There are several problems with touchscreens, not the least of which is the context issue. The next issue is there is no tactile feedback, which requires you to look at where you are touching, often because interactive things can appear anywhere.


> A touch screen looks amazing and so futuristic

Not true. If you examine sci-fi movies from the 1960s onwards, you'll learn that the most futuristic-looking interfaces have the most buttons and physical affordances. Touchscreens were never regarded as futuristic, and thus rarely depicted in sci-fi.


It's just a typo. These are known to happen and predate touch screens.

I understand the argument and read it often on HN. I just want to mention that you can have muscle memory on touchscreens too. If you use a smartphone, you can probably type without looking.

It’s a bit the same with the car touchscreen. I don’t notice big differences when I aim for a virtual button on the car touchscreen and a physical button.

For seldom-used buttons, I give a brief glance, both for virtual and physical.


And while we're doing that touch vs. tactile comparison, what I also loved about the old "dumb phones" was deterministic timing. I could do stuff on those phones from start to end without looking at the screen - I quickly learned and remembered that e.g. this operation completed instantly, and that menu has ~1/4 of loading time, etc.

I think this feature was actually key to being able to memorize how to perform operations quickly. There's no way you can do that with modern smartphone, where every other interaction lags for anything between 10 and 1000 ms, and inputs are sometimes dropped at random. It's this non-determinism of smartphone UI that makes me look at the screen all the time when using it.


«Critically, it lacks the bandwidth of other forms of input.»

The most classic forms of input (keyboard and mouse) are touch manipulations of plastic artifacts: their bandwidth is essentially a subset of touch, rather than the other way around.

With modern touch screens we are only beginning to scratch the surface of touch sensory bandwidth. Certainly there remain plenty of interesting areas in which to expand haptic feedback. There are also plenty of ways left to blend and hybridize touch sensitivity, gestures, and even context-sensitive/context-reliant physical objects (like digital pens/pencils, or how our keyboards might interact with touch).

«It's for consumption, not creation.»

Many of the earliest forms of human creativity were manipulating things through touch. Maybe we haven't yet arrived at something like a good approach to digital sculpting or pottery, but that doesn't mean that we won't.

Even then, I've seen amazingly creative touch apps. There are some really cool touch apps in music creation, just off the top of my head. The fact that you associate touch with consumption (and "less cognitive development") may say more about you than the world of touch apps that already exist.

«It leaves the screen dirty.»

Yeah. So? Things get dirty when you use them. You find ways to clean them. Are you keeping your screen pristine and untouched/undirtied because it needs to be sconced in a museum some day?

I get that it's a personal preference and it plainly drives the Monk-esque sorts of OCD wild, but at the end of the day, entropy wins anyway.

«It encourages wasteful use of screen real estate by UI.»

One person's waste of space is another person's accessibility. Touch invites larger click targets to better accommodate the fatness of people's fingers. That accommodation, however, also helps people with accuracy issues with a mouse (which has always been a big deal that is easily ignored). Even further, it helps in situations that shouldn't demand accuracy in the first place.

When I'm working on a spreadsheet at work, why does every click need to be a "headshot" to get my work accomplished? Lining up those shots takes time and energy I could be spending on the actual work. Just because mice can be pixel-accurate doesn't mean they should be. As monitor resolutions and DPI increase and pixels shrink, this only becomes crazier when an application has a small pixel-accurate hitbox. (It amazes me how many mission-critical enterprise apps you see with old-school 16x16 pixel icons on toolbars running on modern hardware, as if those businesses need to be FPS sniper schools.)

Fitts's Law suggests we should do better than that. Targets that are bigger and/or closer to the mouse pointer are easier to hit. If it takes touch to force more developers to be mindful of Fitts's Law, then that alone is reason enough to support touch.
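The relationship can be made concrete with the Shannon formulation of Fitts's law, where predicted movement time grows with an "index of difficulty" ID = log2(D/W + 1), D being the distance to the target and W its width along the axis of motion. A quick Python sketch (the constants a and b are illustrative, not measured values):

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Fitts's index of difficulty in bits (Shannon formulation)."""
    return math.log2(distance / width + 1)

def movement_time(distance: float, width: float,
                  a: float = 0.1, b: float = 0.15) -> float:
    """Predicted acquisition time in seconds: MT = a + b * ID.
    a and b are device/user-specific constants, made up here."""
    return a + b * index_of_difficulty(distance, width)

# A 16 px icon 800 px away is far harder to hit than a 64 px one:
print(index_of_difficulty(800, 16))  # ≈ 5.67 bits
print(index_of_difficulty(800, 64))  # ≈ 3.75 bits
```

Quadrupling the target width cuts nearly two bits off the difficulty, which is exactly why touch-sized targets also speed up mouse users.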

