Buildings That Feel You

February 25, 2016
Enlighted Inc.

Clifton Lemon is a well-known lighting and building guru with a keen eye for the future. He has been contributing a series of articles to our blog on the future of lighting and his thoughts about some of the implications of IoT. This is the fifth of ten articles.

As IoT solutions are applied to building management, we become better able to address a problem that has so far been particularly difficult: how to measure and act upon emotions, behavior, and experience. This is crucial to us now because, in order to effect the large-scale behavioral change necessary to manage the global problems of climate change, shrinking resources, increasing urban density, and loss of biodiversity, we need to understand the fundamental mechanisms of behavior in entirely new ways. Fortunately, new technology and economic theory are helping us do that.

The engineers, architects, physicists, and other technically oriented professionals who design, construct, and operate the built environment are trained to understand physical things very well – materials, configurations, energy efficiency, and complex systems and projects. Where they run into trouble is putting actual, unpredictable, messy human beings into the picture. In the architecture field, this focus is evidenced by the conspicuous absence of people in project photos, as though people are irrelevant and will only mess things up once they start using the building.

In the past two decades or so, design practitioners, policy makers, industry organizations, and building owners have made great strides in energy efficiency. People in general are much more aware of it, and many technology and design innovations have resulted in considerable progress. But the underlying assumed behavioral motivator – saving – is not as powerful as others may be, like gaining status, knowledge, or productivity. The theoretical framework of behavioral economics, or BEcon, developed by researchers like Daniel Kahneman, Cass Sunstein, and Richard Thaler, offers many productive ways of managing our irrational behavior around energy use, as detailed by Amy Jewell in this article.

An understanding of how people actually behave in buildings can be greatly enhanced by Enlighted’s sensor and analytic system, which from what I can tell is the first such system to market. It addresses one of the fundamental “layers” of design in buildings – the circulation system – in a direct, evidence-based way, and gives architects, engineers, space planners, and facility managers valuable data on existing use patterns that can be used to “tune” existing facilities and inform the design and management of future ones.

And it’s quite possible now to go even deeper – much deeper, in fact – into our understanding of how people construct and use buildings, by measuring emotions. Emotions drive most behavior. It may not always be strictly necessary to know why people congregate in certain areas and avoid others, but if we can understand the why – and with sensor networks that measure emotions, cheap and easy to install with today’s technology, we can – then we will, and we should.

Emotion and Behavior Analytics

It’s not as though humans are unable to perceive and act on the subtle emotional and behavioral signals we trade with each other – in fact, that’s mostly how we communicate: non-verbally. What these new technologies provide is a precise, automated way to do this at scale, with a deep data set, in real time. That allows us to make better decisions at the scale we increasingly need in order to manage our complex built environment. The emerging fields of emotion analytics (also sometimes called “affective computing”) and behavior analytics show some fascinating recent advances and touch on a wide variety of technologies and research areas. Here are three main types of technologies that I know something about:

Facial expression recognition: Based partly on the work of University of California San Francisco researcher Dr. Paul Ekman, who developed the Facial Action Coding System, these software tools can recognize the emotions behind facial expressions and microexpressions that often elude our conscious perception. In other words, in a certain way, the computer knows how you’re feeling almost better than you do. The science behind this is very solid, based on decades of analyzing and categorizing thousands of different expressions across a wide variety of world cultures. It’s been used lately primarily in A/B testing for advertising, but it’s interesting enough for Apple to have recently acquired Emotient, a key provider of expression recognition software. Other companies that provide solutions in this field include Affectiva, Realeyes, and Nviso. I’ve experienced this before – it’s uncannily accurate, and can even be applied to large groups of people. It’s important to differentiate between expression recognition and facial identification: expression recognition relies only on plotted vectors in faces and their changes relative to a set point, while facial identification is a much deeper and more complex algorithm based on bitmap files.
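To make the vector idea concrete, here is a deliberately toy sketch of the approach described above: tracking how a few facial landmarks move relative to a neutral set point, rather than matching identity. All landmark coordinates, thresholds, and the two-rule classifier are invented for illustration – real FACS-based systems track dozens of action units.

```python
# Toy sketch: expression from landmark displacement relative to a
# neutral "set point". Coordinates are (x, y) pixels, y grows downward.
# Landmark names and thresholds are hypothetical.

NEUTRAL = {"mouth_left": (30, 70), "mouth_right": (70, 70), "brow_inner": (50, 30)}

def displacement(neutral, current):
    """Per-landmark displacement vectors from the neutral set point."""
    return {k: (current[k][0] - neutral[k][0], current[k][1] - neutral[k][1])
            for k in neutral}

def classify(disp):
    """Crude rule: mouth corners raised (negative y = up) -> 'smile'."""
    dy = (disp["mouth_left"][1] + disp["mouth_right"][1]) / 2
    if dy < -3:
        return "smile"
    if dy > 3:
        return "frown"
    return "neutral"

# Corners of the mouth have moved up a few pixels in this frame.
frame = {"mouth_left": (29, 64), "mouth_right": (71, 65), "brow_inner": (50, 30)}
print(classify(displacement(NEUTRAL, frame)))  # smile
```

Note that nothing here could identify who the face belongs to – only how its landmarks moved – which is exactly the distinction drawn above.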

Voice Analysis: According to the website for Beyond Verbal, a leading emotion analytics provider, “Emotion Analytics change the way we interact with our machines and ourselves – forever. By decoding human vocal intonations into their underlying emotions in realtime, Emotion Analytics enables voice powered devices, apps, and solutions to interact with us on an emotional level, just as humans do.” Voice analytics have wide application in health care, where it’s often possible to make diagnoses by recognizing voice modulation patterns; in market research; and in user interface design. As far as I know no one is yet using it for building controls, but it’s certainly worth exploring. Another leading voice analysis company is VocalIQ, also snapped up by Apple recently. Hmm…a pattern here?
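Commercial systems like the one quoted above use proprietary models over many prosodic features, but one of the most basic such features – pitch, the raw material of intonation – can be sketched simply. The zero-crossing estimator below is a hedged illustration, not anyone’s product:

```python
import math

# Toy sketch: estimate the fundamental frequency of a mono signal by
# counting zero crossings (two per cycle). Real intonation analysis
# tracks how pitch, energy, and timing vary over an utterance.

def estimate_pitch(samples, sample_rate):
    """Rough fundamental-frequency estimate from zero-crossing count."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if a < 0 <= b or b < 0 <= a
    )
    duration = len(samples) / sample_rate
    return crossings / (2 * duration)  # two crossings per cycle

rate = 8000
tone = [math.sin(2 * math.pi * 220 * t / rate) for t in range(rate)]  # 220 Hz, 1 s
print(round(estimate_pitch(tone, rate)))  # close to the true 220 Hz
```

A controls application would watch how an estimate like this drifts over a conversation, not its absolute value.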

Gait Analysis: Gait analysis is an emerging technology that can identify people and their emotional and physical states by analyzing how they walk. It has applications in security, and in sports biomechanics to help high-performing athletes train. Gait analysis can also be used in conjunction with wearable computing devices to provide data about health and wellness for employees or patients, and can be combined with indoor positioning systems (IPS) and other location analysis solutions to provide more granularity.
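One primitive that gait and wearable systems share is cadence – steps per minute – extracted from an acceleration trace. The sketch below is a minimal illustration with invented data and thresholds; real systems fuse many joint-angle and timing features:

```python
# Toy sketch: walking cadence from a vertical-acceleration trace
# (e.g. from a wearable), counting upward crossings of a threshold.
# The threshold value and the synthetic trace are hypothetical.

def cadence(accel, sample_rate, threshold=1.2):
    """Steps per minute: count upward threshold crossings, one per step."""
    steps = sum(
        1 for prev, cur in zip(accel, accel[1:])
        if prev < threshold <= cur
    )
    minutes = len(accel) / sample_rate / 60
    return steps / minutes

# 10 s of fake data at 50 Hz: one acceleration spike every 0.5 s,
# i.e. two steps per second.
rate = 50
trace = [1.5 if i % 25 == 10 else 1.0 for i in range(rate * 10)]
print(round(cadence(trace, rate)))  # 120 steps per minute
```

Changes in a measure like this over a day – slowing, shuffling, irregularity – are the kind of signal the health and wellness applications above would watch for.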

How Do Companies Benefit?

Energy efficiency metrics, while far from perfect, are well developed. True, there is much work to be done, and making most buildings optimally energy efficient will take many generations. But we understand fairly well how to do this; implementing that knowledge is a matter of economics, policy, and cultural change, all of which depend heavily on an understanding of behavior – an understanding that can be gained by applying behavioral and emotion analytics to data gathered by IoT systems. Built on lighting, power, and IT networks, the sensors and input devices are relatively cheap, portable, and use little energy. Just as electrification dramatically changed the nature of work in the early part of the 20th century – prompting companies to create healthcare and other employee benefits in order to curtail an unacceptably high turnover rate – IoT is causing dramatic disruptions to today’s workplace, for better or for worse. The rationale for even having offices in the first place is being seriously questioned, for very good reasons. Now that work can be done anywhere, physical assets become secondary to human assets, which need to be managed differently and better than in a largely industrial economy. Companies that put behavioral analysis in place in the workplace can better manage change, provide better comfort, and create healthier environments for their employees. Work environments can become more dynamic and responsive, and increase productivity. While productivity has been notoriously difficult to define and measure, metrics for behavior and emotion will provide the data to establish causal relationships between the environment and output.

Fear of Loss of Privacy and Control

Of course, serious questions arise when AI devices and networks can read our minds better than we can. Our initial reaction to most powerful new technology like this is fear, and we eventually get over it. We were afraid of credit cards over the internet when online commerce first started. We’re gradually getting comfortable paying a small price of lost privacy for the much greater gains in utility, connectedness, and quality of life in general. While there is much potential for abuse and coercion with emotion and behavior analytics, fortunately there are several strategies to employ. For starters, data on the emotions and behavior of users or employees can be immediately disaggregated from the identities of individual users – which in fact reduces the required computational load as well. And users and system operators can negotiate various opt-in or opt-out options when designing their networks. Often, surprising behaviors emerge when shared data visualization is transparent and real-time, such as the competition to save energy stimulated by dashboards designed by Lucid Design in early applications of their solution.
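The disaggregation strategy mentioned above is simple to picture in code: raw readings may arrive tagged with a person, but what gets stored keeps only zone-level aggregates, so no individual’s emotional state is retained. The record layout and field names here are invented for illustration:

```python
from collections import defaultdict

# Privacy-by-aggregation sketch: incoming readings carry a user tag,
# but the stored result holds only per-zone mean comfort scores.
# The "user" field is read nowhere and is dropped on aggregation.

def aggregate_by_zone(readings):
    """Return {zone: mean comfort score}, discarding user identity."""
    totals = defaultdict(lambda: [0.0, 0])
    for r in readings:
        t = totals[r["zone"]]   # r["user"] is never stored
        t[0] += r["comfort"]
        t[1] += 1
    return {zone: s / n for zone, (s, n) in totals.items()}

readings = [
    {"user": "u1", "zone": "lobby", "comfort": 0.8},
    {"user": "u2", "zone": "lobby", "comfort": 0.6},
    {"user": "u3", "zone": "cafe", "comfort": 0.9},
]
print(aggregate_by_zone(readings))  # per-zone means, no user IDs
```

As the text notes, this also shrinks the computational load: the system carries a handful of zone aggregates instead of a stream of per-person records.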

What Will It Feel Like?

One of the difficult things about new technology is that we discuss and make judgments and decisions about most of it long before actually experiencing it. Sometimes we first hear about a new innovation through an article that talks about its potential market by 2020 or so – kind of like trying to judge what a movie’s about by its widely reported box office returns. So it’s particularly hard to envision how a system that feels your mood will respond by changing the lights, temperature, or ventilation, for instance, because very few of us (myself included) have ever even experienced adaptive lighting. But that doesn’t stop us from imagining how it will feel. It’s possible we may not feel much, if the lights dim or rise very slowly and the temperature regulates so that we’re comfortable most of the time. One immediate result might be that the irritating things about most office environments – drab lighting, stuffy air, poor layouts – gradually subside and give way to places that feel better without necessarily undergoing huge transformations. Also, if people can see how you’re feeling, maybe they’ll stop by your desk when your happiness level is off. And I’m also interested in the emotions of the people designing and operating spaces – their happiness levels might spike as they gain more control over costs, logistical and strategic decisions, and employee satisfaction. Part of the beauty of sensor networks with analytic capability is that we can use them to allow for much greater flexibility in the workplace, something vital to business today, and that alone will provide unexpected positive outcomes.
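The “we may not feel much” intuition can be sketched as a control loop: the light level eases toward a mood-derived target by a small fraction each tick, so occupants never see an abrupt change. The mood-to-lux mapping and the rate constant are invented for illustration:

```python
# Toy sketch of barely perceptible adaptive lighting: exponential
# approach to a target level derived from an aggregate mood score.
# The 300-500 lux range and 0.05 rate are hypothetical choices.

def target_level(mood_score):
    """Map a 0..1 mood score to a light level: lower mood -> brighter."""
    return 300 + (1 - mood_score) * 200

def step_toward(current, target, rate=0.05):
    """Move a fraction of the remaining gap each control tick."""
    return current + (target - current) * rate

level = 500.0
for _ in range(60):                                # one minute of 1 Hz ticks
    level = step_toward(level, target_level(0.9))  # occupants are content
print(round(level))  # has eased most of the way toward the dimmer target
```

Because each step closes only 5% of the remaining gap, the change is spread over minutes rather than delivered as a visible jump – which is exactly why occupants might not consciously notice it.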