What makes interaction intuitive? And why are IT systems so dependent on the one screen paradigm?
The Siftables tangible user interface shows us that there are alternatives. Do you see such interaction being used in a wide range of applications? Or are such specific approaches doomed to serve their purpose as mere toys and learning aids?
For decades we have used keyboards to communicate and express ourselves in the digital world. Relatively recently, the focus of user input has shifted towards multi-touch interaction, at least for a select group of devices and tasks. Now it seems that handwriting is about to get a boost in the virtual space.
Wacom is releasing a cross-platform standard for sharing handwritten notes and drawings. What makes this even more impressive is that users will be able to experience, in real time, how their collaborators are sketching something or writing a note. Imagine sitting in your hotel room with your tablet in hand, watching live as your child creates a drawing on their device at home. Technology is getting increasingly focused on the human factor, and I believe this is exactly the direction to follow.
Here’s a video with the people who are working on the Tesla Model S, the famous electric car. It shows how companies like Tesla try to improve their efficiency by embracing novel technologies, like mid-air interaction, augmented reality and 3D printing.
By the way, when will I get my chance to play with a 3D printer? The CAD model is already prepared. 😉
It seems the big tech companies are starting to play with the idea of smartwatches. Proof lies not only in the fact that there are already some on the market, but also in the increasing number of new ones being announced (like this week at IFA). Here are a couple of the most noteworthy smartwatch candidates:
I must say that I’m quite impressed with the amount of functionality being packed into these small wrist accessories. However, elements like battery life and the fact that these devices aren’t standalone (you still need to couple their functionality with a smartphone) are a deterrent for me. Once smartwatches become truly smart (and this seems to be in the near future) and no longer require the “brain” of your smartphone, I’m in. Until then, however, I simply enjoy following the developments.
It seems like the Emotiv company is preparing to develop and release a new EEG neuroheadset called the Emotiv Insight. While it will have fewer sensors than the Emotiv EPOC or Emotiv EEG, it will be easier to deploy and more user friendly, with what seems to be a powerful framework. Can’t wait to get my hands on one! 🙂
Oh, and before I forget, Emotiv also needs your support to develop this new product. For this purpose, they opened a Kickstarter project. As the campaign is still running for a couple of days, you might want to consider chipping in so that this new EEG headset has a better chance of seeing the light of day.
This looks like a great idea opening entirely new possibilities in terms of interaction. Frankly, it reminds me of projects on smoke walls where the user could interact in a somewhat similar way. Now I just need to find the video for that too and post it here for comparison.
Have you heard of the Emotiv company? Well, they are one of the players that are developing portable wireless EEG solutions. And while their initial products—the Emotiv EPOC and Emotiv EEG—have been used with success in various research projects, gaming solutions and user evaluations, they are now focusing on a new device: the Emotiv Insight.
The Insight is planned to be a more user-friendly device that will require less effort for setting up and—based on my experience with the EPOC—will probably be more comfortable to wear. On the other hand, Emotiv Insight will have only 5 sensors that will reduce the resolution of the device when compared to the 14 channels available on the EPOC and EEG models.
However, the Insight will clearly have some advantages compared to the previous Emotiv neuroheadsets. You can currently read more about the Emotiv Insight on this Kickstarter page, where the company is trying to raise some money to start initial production (the goal of $100k has already been surpassed). And more importantly, if you also believe that this is a project that could change the face of computing, evaluation or human-computer interaction, do participate on Kickstarter and help it get funded. I’m convinced that brain-computer interfaces are one of the major ways to change the face of human-machine interaction in the following decades, and that projects like this one are important stepping stones in that direction.
You can find some more information about the Emotiv Insight on Visual.ly.
This is an awesome project for showing young school children the fun behind science and engineering. The system uses a genetic algorithm to create mutations and thus new generations of virtual creatures. However, while the survival metric for such creature generations is usually more objective (e.g. being able to move from A to B in a given period of time), in this case young children get to decide. How? Well, they will wear the Emotiv EPOC headset, which reads and interprets EEG brain signals. Whenever the device notices that a child likes what they see, that generation of creatures survives, and vice versa (or so I deduce). Excellent idea!
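The core loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the idea, not the project’s actual code: creatures are just lists of numbers, the “likes” are placeholder boolean judgments standing in for the EPOC’s affective signal, and all function names (`evolve`, `mutate`, `random_creature`) are my own inventions for the sketch.

```python
import random

def random_creature(genes=8):
    """A creature is just a vector of parameters (body shape, limb lengths, ...)."""
    return [random.uniform(-1, 1) for _ in range(genes)]

def mutate(creature, rate):
    """Randomly perturb some genes to produce a mutated offspring."""
    return [g + random.gauss(0, 0.2) if random.random() < rate else g
            for g in creature]

def evolve(population, likes, mutation_rate=0.1):
    """One generation step: creatures the child 'liked' survive and
    reproduce with random mutations; disliked ones die out."""
    survivors = [c for c, liked in zip(population, likes) if liked]
    if not survivors:  # nothing liked: restart from random creatures
        survivors = [random_creature() for _ in range(2)]
    next_gen = []
    while len(next_gen) < len(population):
        parent = random.choice(survivors)
        next_gen.append(mutate(parent, mutation_rate))
    return next_gen

# Simulated session; in the real project the likes would come from
# the EPOC headset's interpretation of the child's EEG response.
population = [random_creature() for _ in range(6)]
likes = [True, False, True, False, False, True]
population = evolve(population, likes)
print(len(population))  # → 6, population size stays constant
```

The interesting design twist is that the fitness function is a human emotional response rather than an objective measure, so the algorithm effectively evolves toward whatever the child finds appealing.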
Yes, as you can see here, researchers have finally taken the EPOC headset out onto the streets. In this case, it’s a study that took place on the streets of Edinburgh, investigating the correlation between emotional states and the environments of various city areas. Cool, right?