If an application is showing a progress bar of some kind, it means it is doing something. If that something involves doing actual work, not just waiting for a network response, the app should indicate the current work it is doing.
Instead, some apps use an “idiot light” approach. Case in point: my PC crashed, and Startup Repair is running. All it shows is a progress indicator, not one based on completion status or time remaining, just a blue rectangle cycling by. It’s been running for hours. Is it still doing anything, is it hung, is there hope?
Sure, for many apps, actual work-status output is redundant and not useful to the “average” user. So when should more information be shown? When the elapsed time exceeds some threshold. Or, if a user wants more information, they can signal that to the app. Many applications use this approach. Why something as fundamental as repairing disks doesn’t is puzzling.
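The threshold idea above can be sketched in a few lines. This is a minimal, hypothetical example (the class name, threshold value, and method names are my own, not from any real Windows API): show a generic indicator at first, then switch to detailed status once the operation runs long, or when the user asks.

```python
import time

DETAIL_THRESHOLD_SECONDS = 30  # hypothetical cutoff; tune per application


class ProgressReporter:
    """Generic "working" indicator at first; detailed step status once the
    operation has run longer than a threshold or the user requests it."""

    def __init__(self, threshold=DETAIL_THRESHOLD_SECONDS):
        self.start = time.monotonic()
        self.threshold = threshold
        self.verbose = False

    def request_details(self):
        # The user signalled they want more information.
        self.verbose = True

    def report(self, step_description):
        elapsed = time.monotonic() - self.start
        if self.verbose or elapsed > self.threshold:
            # Long-running (or user asked): say exactly what is happening now.
            return f"[{elapsed:.0f}s] {step_description}"
        # Short-running: a simple indicator is enough.
        return "Working..."
```

A Startup Repair built this way would still show a plain animation for quick repairs, but after half a minute it would start naming the phase it is in, which is exactly the signal a user staring at a blue rectangle for hours is missing.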
The same dialog box is used in other parts of Windows 10, like when creating a Restore Point, so we still have the same “idiot light” User Experience Design (UxD).
In a prior post I compared “Google Now” and the concept of a Proactive User Interface. It looks like ‘Google Now on tap’ will finally be a step in the right direction.
My first impression from a quick read of some articles is that it is an expansion of the info-cards concept, with more correlation with the current UI context. This is such a powerful and extremely obvious feature that you wonder why it was not done years ago. True, Google will put more search and Big Data power behind this. But is it really predictive, and will it “learn” a user’s information patterns?
An example of an information pattern (from my prior post): a user is viewing a web site. There is a probability that, if a certain amount of time is spent or a certain page or article type is visited, clicking a share button will be followed by predictable actions, for example sharing the link with a colleague or loved one. The UI will then present a proactive plan. See “Proactive User Interface“. Merely generating information related to context still requires the user to expend effort forming and acting on immediate action plans. So what are those octo-core chips for?
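A crude sketch of learning such a pattern: count which follow-up action historically comes after a share click in a given context, and only propose a proactive plan when the observed pattern is strong enough. Everything here (class, method names, the confidence cutoff) is hypothetical illustration, not how Google Now actually works.

```python
from collections import Counter, defaultdict


class ActionPredictor:
    """Learn from observed history which follow-up action usually
    comes after a share click in a given context."""

    def __init__(self):
        # context -> counts of follow-up actions observed after a share
        self.history = defaultdict(Counter)

    def observe(self, context, followup_action):
        self.history[context][followup_action] += 1

    def predict(self, context, min_confidence=0.6):
        counts = self.history.get(context)
        if not counts:
            return None
        action, n = counts.most_common(1)[0]
        confidence = n / sum(counts.values())
        # Propose a proactive plan only when the pattern is strong enough;
        # otherwise stay quiet rather than guess.
        return action if confidence >= min_confidence else None
```

For instance, if three of four observed shares of long news articles were emails to a colleague, the predictor would surface “email this to a colleague” as the proactive plan next time that context recurs.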
Nov 9, 2015: Google just open-sourced a machine learning system, TensorFlow.