The Future of Technology?

Ali Haider
Published in Analytics Vidhya
6 min read · Mar 21, 2020

The world is forever changing.

Who would’ve guessed 40 years ago that almost every person on this earth would carry, in their pocket, a computer more powerful than the ones used to take the first people to the moon?

The technology we use in the future will drastically change from where it is today.

It’s likely that in less than 20 years our most common pieces of technology won’t even have a screen!

This shift away from screens toward more interactive methods of user interaction is called zero user interface (or zero UI).

It seems like something out of an Iron Man or sci-fi movie, doesn’t it?

Now zero UI may seem far off, but there are already examples of people moving away from screens today. Take, for example, Apple’s Siri or Amazon’s Alexa: two perfect examples of voice-controlled software.

Zero UI is all about moving away from screens, whether through haptic technology, voice control, computer vision, or artificial intelligence.

In this article, I am going to focus on haptic technology and computer vision, two of the most underrated and exciting emerging technologies.

Let’s dive straight into one of the most immersive ways technology can be a part of our lives.

Haptic Technology

A piece of technology that creates the sensation of touch

Haptic technology can and will revolutionize countless fields, from fashion to medicine. Imagine being able to feel the fabric of that new pair of jeans you want to buy online. Or a surgeon operating on a patient from a remote location without having to be there in person.

This technology has already debuted in our everyday lives. The vibrations you feel from your game controller or phone are great examples of it.

There are three types of haptic technology:

  • Touchable
  • Wearable
  • Graspable
Credit: Stanford Shape Lab

Touchable Haptic Technology

The most common kind of this technology you can see today is touchable haptic technology.

A great example of this technology that we could see within the next few years is devices that mimic the feel of certain textures.

One leader in this specific use of touchable haptic technology is Heather Culbertson. By recording how a pen reacts as it is stroked across a surface at different speeds and pressures, Culbertson was able to reproduce those sensations on a touchscreen. She calls this data-driven haptics.
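To make the data-driven idea concrete, here is a toy sketch in Python. All of the data and names are hypothetical: recordings of a texture are indexed by the pen’s speed and pressure, and playback simply picks the recording captured under the conditions closest to the user’s current stroke.

```python
# Hypothetical texture recordings: (speed in mm/s, pressure in N) maps to a
# vibration amplitude profile captured under those stroke conditions.
recordings = {
    (10, 0.5): [0.1, 0.3, 0.2],
    (10, 1.0): [0.2, 0.5, 0.4],
    (50, 0.5): [0.4, 0.7, 0.5],
    (50, 1.0): [0.6, 0.9, 0.8],
}

def closest_recording(speed, pressure):
    """Return the vibration profile recorded under the nearest conditions."""
    key = min(recordings,
              key=lambda k: (k[0] - speed) ** 2 + (k[1] - pressure) ** 2)
    return recordings[key]

# A stroke at 45 mm/s and 0.9 N plays back the (50, 1.0) recording.
profile = closest_recording(45, 0.9)
```

A real system would interpolate between recordings and drive an actuator in real time, but the core idea is the same: sensations are replayed from measured data rather than synthesized from scratch.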

An example of touchable haptic technology shown on a touchscreen device with a pen. Credit: Stanford Shape Lab

With technology continuing to emerge, a screen may no longer need to be included for this technology to work.

Graspable Haptic Technology

Another, more exciting form of this technology is graspable haptic technology.

Graspable haptic technology works to imitate the feeling of objects, usually on a different scale. An example of where this technology could be useful is when combined with other technologies such as robotics or aerial drones.

Imagine being able to control a robot using haptic technology and being able to feel exactly what the robot is touching.

Joysticks that control robotic machinery and push back with resistance are a great example of graspable haptic technology in use.

The biggest issue with creating more immersive graspable haptic devices is imitating the weight of objects. This was the reason behind the creation of Grabity.

Grabity is a mechanism you can use to grasp and squeeze virtual objects that makes it seem like those objects have weight.

Grabity, a creation of the Stanford University Shape Lab

Wearable Haptic Technology

Wearable haptic technology is commonly used alongside virtual reality.

Imagine being able to write with a pencil in virtual reality and being able to feel the pencil’s pressure on the paper.

Devices using this technology have already been created. Wearable fingerpads let you actually feel the objects you touch on a screen.

Similarly, a new wearable haptic device shaped almost like a ring has been taking the world by storm. It has motors built in to give the user a more immersive virtual experience while still allowing them to interact with real-life objects.

Finally, a wearable vest to aid individuals with hearing loss is an exciting opportunity for many around the world. The vest works by using vibrations to help these individuals understand what is being said to them.

Understanding how this vest works is a little more involved. Its sensing and actuating layers are separated by silicone. A piezoelectric material (PZT) lets the computer read the pressure exerted on the device and convert it into a signal, providing user input to the computer. The actuating layer contains tiny air-filled pockets that vibrate according to what the device needs to convey.
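One simple way a vest like this could turn sound into touch is to split the audio into bands and drive one motor per band, so different frequencies are felt at different spots on the torso. The sketch below is purely illustrative, not the actual device’s firmware, and it chunks the signal in time as a crude stand-in for a real filter bank or FFT.

```python
def band_energies(samples, n_bands=4):
    """Split the signal into n_bands chunks and measure each chunk's
    mean absolute amplitude (a crude stand-in for frequency analysis)."""
    chunk = len(samples) // n_bands
    return [sum(abs(s) for s in samples[i * chunk:(i + 1) * chunk]) / chunk
            for i in range(n_bands)]

def motor_intensities(energies, max_energy=1.0):
    """Map each band's energy to a 0-100% motor duty cycle, clamped at 100."""
    return [min(100, round(100 * e / max_energy)) for e in energies]

# Louder bands drive their motors harder; quiet bands barely vibrate.
intensities = motor_intensities(band_energies([0.1, 0.1, 0.9, 0.9], n_bands=2))
```

The real device converts speech into patterns the wearer learns to interpret over time; the mapping here just shows the general signal-to-vibration shape of the idea.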

Credit: H.A. Sonar et al / Frontiers in Robotics and AI 2016

Computer Vision

A field of artificial intelligence that allows computers to understand the visual world around them

One of the most exciting applications of computer vision is in autonomous vehicles, which rely on it to understand what’s around them.

Credit: Towards Data Science

Computer vision works a bit like solving a jigsaw puzzle. The device recognizes parts of an image through their edges and other properties in order to detect objects, looking at groups of pixels to see what they form.
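The "groups of pixels" idea can be shown with a tiny toy example. In the sketch below (pure Python, made-up image data), we take the difference between neighbouring pixel values: wherever brightness changes sharply, a large difference marks an edge, which is one of the basic cues vision systems use.

```python
# A 3x4 toy image: a dark left half next to a bright right half.
image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]

def horizontal_edges(img):
    """Absolute difference between each pixel and its right neighbour;
    large values mark vertical edges in the image."""
    return [[abs(row[x + 1] - row[x]) for x in range(len(row) - 1)]
            for row in img]

# The boundary between the dark and bright halves shows up as a
# column of large values.
edges = horizontal_edges(image)
```

Real systems use more sophisticated filters (and learn many of them automatically), but they build on exactly this kind of local pixel comparison.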

For the device to understand what is what, machine learning is required. By feeding in large amounts of labeled example data, the computer learns to distinguish objects that look like other objects and classify them correctly.
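Here is a minimal sketch of that learning-from-examples idea, using toy data and a simple nearest-neighbour rule (real systems use deep neural networks, but the principle of comparing against labelled examples is the same): the computer is given labelled pixel patterns and classifies a new one by finding which known example it most resembles.

```python
# Hypothetical labelled examples: a pixel pattern and its class.
labelled = [
    ([0, 0, 9, 9], "vertical edge"),
    ([9, 9, 9, 9], "bright patch"),
    ([0, 0, 0, 0], "dark patch"),
]

def classify(pixels):
    """Label the input with the class of the closest known example
    (smallest sum of squared pixel differences)."""
    def dist(example):
        return sum((a - b) ** 2 for a, b in zip(pixels, example[0]))
    return min(labelled, key=dist)[1]

# A slightly noisy input still lands on the right class.
label = classify([1, 0, 8, 9])
```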

Many fields from medicine to retail are able to use computer vision.

Within radiology, computer vision can be used to detect abnormalities in CT scans or MRIs.

Retail can find it much easier to see what items are out of stock in a store with the introduction of computer vision.

Computer vision works in a three-step process:

  1. An image of the focused area is captured
  2. The image is processed by a machine learning model trained on the inputted data
  3. Objects within the image are identified
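The three steps above can be sketched as a schematic pipeline. Everything here is illustrative (the camera, model, and function names are stand-ins, not a real library’s API):

```python
def capture_image(camera):
    """Step 1: grab a frame of the focused area."""
    return camera()

def run_model(image, model):
    """Step 2: push the frame through a trained model,
    which returns a confidence score per label."""
    return model(image)

def detect_objects(scores, threshold=0.5):
    """Step 3: keep only the labels the model is confident about."""
    return [label for label, score in scores.items() if score >= threshold]

# Toy stand-ins for a camera and a trained classifier.
frame = capture_image(lambda: [[0, 1], [1, 0]])
scores = run_model(frame, lambda img: {"pedestrian": 0.92, "bicycle": 0.31})
found = detect_objects(scores)   # only "pedestrian" clears the threshold
```

In a real autonomous vehicle this loop runs many times per second, with each detection feeding into the planning system.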

Many around the world are sensing the emergence of this technology.

Computer vision is one of the most remarkable things to come out of the deep learning and artificial intelligence world. The advancements that deep learning has contributed to the computer vision field have really set this field apart.

  • Wayne Thompson, Data Scientist at SAS

The uses of computer vision and haptic technology are truly limitless.

A world with no screens is truly right around the corner! Are you ready for the change that is destined to happen that will influence the lives of millions around the world?
