Personality of things

Humans are starting to get better acquainted with personal assistants such as Google Assistant, Siri, Cortana, and Bixby. But how would people feel about personalities extending to automobiles, laptops, and other household items? Would we want a single seamless personality across all devices, or would we prefer to build new relationships with each of these things? Would we want these things to understand and empathize with us? Do we really need them to “feel” what we feel, or do we just need the sense that they “get” us?

Humans have an innate habit of anthropomorphizing the objects around them, especially those that move, grow, or talk to them. As technological advances give robots and IoT devices greater intelligence, humans are likely to assign personality to more of the devices they interact with in their daily lives.

The vacuum cleaner, which was once a simple tool, is now a Roomba with a cheerfully clueless personality as it makes happy chimes and bumps its way through the living room. And while replacing a standard vacuum cleaner is no big deal, many Roomba users demand that they get their exact same robot back from repairs and that it not be “killed” and scrapped for parts. They view it almost as a part of the family.

Tools, on the other hand, are replaceable. And the more an intelligent system or piece of hardware feels like a tool, the more replaceable it becomes. In Star Trek, the crew doesn’t spare a second thought about replacing and upgrading the ship’s computer, which speaks to them in a monotone, disembodied voice, because they view it as a tool. However, upgrades or maintenance for Commander Data, an android crewmember, bring substantial concern because his human shape and personality make him feel alive and relatable. Further, studies have shown that humans are more likely to forgive mistakes when they come from a device they view as “alive,” while showing no such leniency toward things they view as tools.

How does this apply to our future? For a company concerned with improving retention and engagement, giving a device a strong personality seems like an obvious way to capitalize on human anthropomorphization: it earns the product leeway on bugs while boosting engagement over time. The real challenge is choosing how much personality to inject.

Personality Risk?

Individual and cultural preferences differ, and too much personality in a device can be a risk factor: some people may enjoy a new digital friend, but others may find the idea of a tool trying to be personable with them annoying.

An extreme example can be found in the book (and movie) The Hitchhiker’s Guide to the Galaxy, where doors fitted with “real people personalities” sigh happily as people walk through them, a touch that becomes extremely annoying over time. Designers today, however, are already thinking about when to “tone down” the personality of their devices.

Digital assistant designers are cautious about how they use the assistant’s voice. For example, they are careful not to use it for anything that could interrupt or otherwise irritate the end user, including alarms, timers, and even intercom broadcasts. Google Home lets users place outgoing phone calls but not receive incoming ones (so the assistant never bothers you by ringing), and Amazon has been hesitant to adopt prompts or push notifications on Alexa. Major voice assistants have even started reducing how talkative they are: Google and Amazon have recently cut back on confirmation remarks for simple commands like “turn off the lights,” so users aren’t annoyed by hearing “okay, turning off the lights” when they just want to sleep.
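As a rough illustration of the kind of policy described above, the sketch below decides whether a completed command deserves a spoken reply, a simple chime, or silence. All names and the command list are invented for this example; it does not reflect any real assistant’s API.

```python
from dataclasses import dataclass
from typing import Optional

# Commands routine enough that a spoken "okay, done" would be more annoying
# than helpful; a chime (or nothing) is acknowledgement enough.
LOW_STAKES_COMMANDS = {"turn off the lights", "set a timer", "pause the music"}

@dataclass
class Acknowledgement:
    spoken_text: Optional[str]  # None means stay silent
    play_chime: bool

def acknowledge(command: str, succeeded: bool) -> Acknowledgement:
    """Decide how chatty the assistant should be about a finished command."""
    if not succeeded:
        # Failures always warrant an explanation.
        return Acknowledgement(f"Sorry, I couldn't {command}.", play_chime=False)
    if command in LOW_STAKES_COMMANDS:
        # Routine success: a quiet chime instead of speech.
        return Acknowledgement(None, play_chime=True)
    # Less common or higher-stakes actions still get an explicit confirmation.
    return Acknowledgement(f"Okay, {command}.", play_chime=False)

if __name__ == "__main__":
    print(acknowledge("turn off the lights", True))  # silent chime
    print(acknowledge("lock the front door", True))  # spoken confirmation
```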

Personality of Robotics

Robots are an obvious place to inject artificial personalities, since humans already personify most of them. With their Jackrabbot project, researchers at Stanford are experimenting with methods for navigating human-occupied spaces such as doorways and hallways; the robot uses tones and gestures to signal appeasement or frustration as it moves through crowds of people. Other projects are even more direct in evoking emotion: Tombot, for example, builds a golden retriever robot that acts as a support animal. Still other companies are taking a horizontal approach: Embodied builds software designed to run on a variety of robots to enable more lifelike and meaningful interactions.

Implications

Attaching personality to consumer products can create a greater sense of connection to the devices we use. That attachment can make these devices more “sticky,” increasing engagement over the long term and potentially boosting the attach rate of services. However, this is a delicate line to navigate: products with too much personality can irk users and cause them to abandon the product entirely; Tamagotchi is a notable example of this phenomenon. A best practice may be to simply offer a slider that lets users adjust how much personality they want in their devices, from utilitarian tool to best friend. That way, humans stay in control of their things.
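To make the slider idea concrete, here is a minimal sketch of how such a setting might map to behavior. The levels, field names, and thresholds are all hypothetical, invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class PersonalityProfile:
    spoken_confirmations: bool  # speak after completing a command?
    expressive_sounds: bool     # play emotive chimes or gestures?
    small_talk: bool            # offer unprompted, chatty remarks?

def profile_for(level: int) -> PersonalityProfile:
    """Map a 0-3 slider position (tool ... best friend) to concrete behaviors."""
    level = max(0, min(3, level))  # clamp out-of-range input
    return PersonalityProfile(
        spoken_confirmations=level >= 1,
        expressive_sounds=level >= 2,
        small_talk=level >= 3,
    )

# A user who wants a quiet, tool-like device sets the slider to 0;
# a user who wants a companion turns it all the way up.
print(profile_for(0))
print(profile_for(3))
```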

Written by Jonathan Shieber