Integrating interactive technologies into interior environments is becoming increasingly common. So, too, is the use of interactive robots in nonindustrial settings.

What if you could combine the two to create an interior space that is itself an intelligent, interactive agent? That’s the goal of a project being developed at Cornell University’s Architectural Robotics Lab.

Today’s "smart" environments employ a variety of technologies that respond to, and in some cases anticipate, interactions with occupants. These include sensors, cameras, touch screens, and voice-activated devices linked together by the internet of things (IoT).

Some are passive, awaiting commands the user issues by voice or through a Wi-Fi-connected device. Others are programmed to respond proactively to particular triggers, such as someone entering a room or changes in sunlight during the day.
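To make the distinction concrete, here is a minimal, purely illustrative Python sketch of the two interaction styles: a passive device that waits for an explicit command, and a proactive one that reacts to a sensor trigger. The device classes, command strings, and thresholds are invented for this example and do not correspond to any real product or API.

```python
class PassiveLamp:
    """A passive device: it does nothing until the occupant issues a command."""

    def __init__(self):
        self.on = False

    def handle_command(self, command: str):
        if command == "lights on":
            self.on = True
        elif command == "lights off":
            self.on = False


class ProactiveBlinds:
    """A proactive device: it adjusts itself when a reading crosses a trigger threshold."""

    def __init__(self, glare_threshold_lux: float = 2000.0):
        self.glare_threshold_lux = glare_threshold_lux
        self.position = 1.0  # 1.0 = fully open, 0.0 = fully closed

    def on_light_reading(self, lux: float):
        # Lower the blinds when strong sun makes the room too bright.
        self.position = 0.3 if lux > self.glare_threshold_lux else 1.0


lamp = PassiveLamp()
lamp.handle_command("lights on")   # explicit user command (passive)
blinds = ProactiveBlinds()
blinds.on_light_reading(2500.0)    # sensor trigger (proactive)
print(lamp.on, blinds.position)    # True 0.3
```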

Over time, by collecting, storing and analyzing data, some of these devices or the computers they are connected to can "learn" to identify user preferences and adjust the environment accordingly when the user is present in the space.
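At its simplest, that kind of "learning" can be pictured as aggregating the occupant's past manual adjustments and replaying the result, as in the hypothetical sketch below. The class name, the hour-of-day averaging rule, and the default setpoint are assumptions made for illustration, not a description of how any particular product works.

```python
from collections import defaultdict
from statistics import mean


class PreferenceLearner:
    """Learns a preferred temperature for each hour of the day from manual adjustments."""

    def __init__(self):
        self.history = defaultdict(list)  # hour of day -> list of chosen setpoints

    def record_adjustment(self, hour: int, setpoint_c: float):
        self.history[hour].append(setpoint_c)

    def suggested_setpoint(self, hour: int, default_c: float = 21.0) -> float:
        # Fall back to a default until some data has been collected for that hour.
        readings = self.history.get(hour, [])
        return mean(readings) if readings else default_c


learner = PreferenceLearner()
learner.record_adjustment(9, 22.5)    # the occupant nudged the thermostat at 9 a.m.
learner.record_adjustment(9, 23.0)
print(learner.suggested_setpoint(9))  # 22.75, applied when the occupant is present
```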

Robotics offers engineers and designers another way to address human needs with technology. In recent years, labs and companies have introduced a variety of interactive, responsive service robots designed to perform a range of functions, from independently vacuuming a room to providing information at shopping malls and airports to serving as home health aides and companions for the elderly and chronically ill.

It’s not much of a stretch to envision "smart" environments designed to accommodate the use and movements of service robots so that those robots can better support occupants.

Several years ago, Rajesh Elara Mohan and his team at Singapore University of Technology and Design noted that more attention needed to be paid to the creation of barrier-free, robot-inclusive spaces using proven design best practices and principles, including lighting schemes, furniture choices and arrangement, wall and floor surfaces, and wayfinding. The challenge, as they see it, is to incorporate architectural and design features that optimize the performance of service robots but that also are aesthetically pleasing to human occupants.

Along similar lines, Keith Evan Green, a professor in the Department of Design + Environmental Analysis (DEA) at Cornell University and author of "Architectural Robotics: Ecosystems of Bits, Bytes, and Biology," established the Architectural Robotics Lab (ARL) to focus on "making our physical surroundings interactive and adaptive to help us do what we do: work, play, learn, roam, explore, create, interconnect, heal, and age."

Combining design, robotics, and psychology, the lab develops built environments embedded with robotics to support the activities those environments are designed for. One such project is the Animated Work Environment (AWE), a user-programmable robotic work environment that can change shape to suit different work and play needs, such as collaborating, composing, presenting, viewing, lounging, and gaming.
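The AWE's actual control software is not described here, but one way to picture a "user-programmable" environment that changes shape for different activities is a simple mapping from named modes to target actuator poses. In the sketch below, the mode names echo those listed above, while the panels, angles, and heights are invented purely for illustration.

```python
from dataclasses import dataclass


@dataclass
class PanelPose:
    angle_deg: float  # tilt of an actuated panel
    height_cm: float  # height of the panel's lower edge above the work surface


# One target pose per panel for each activity mode (values are made up).
MODES = {
    "presenting":    [PanelPose(90, 120), PanelPose(75, 110), PanelPose(60, 100)],
    "collaborating": [PanelPose(45, 90),  PanelPose(45, 90),  PanelPose(45, 90)],
    "lounging":      [PanelPose(15, 60),  PanelPose(10, 55),  PanelPose(5, 50)],
}


def reconfigure(mode: str) -> list:
    """Return the poses the panel actuators should move toward for the chosen mode."""
    if mode not in MODES:
        raise ValueError(f"unknown mode: {mode}")
    return MODES[mode]


print(reconfigure("presenting"))
```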

Taking that concept a step further, Yixiao Wang, a DEA graduate student and member of ARL, is developing a prototype of an AI-enhanced, partially intelligent interactive environment that holistically supports its occupants. He refers to this environment as a "space agent": one that users perceive as having humanlike traits, such as those one might attribute to a robot or a voice agent like Siri.

Because it is partially intelligent, the space agent can register movement, gestures, voice commands, and the like, not only to respond to the user but also to assess and anticipate how the environment should adapt at any given moment to best accommodate the current activity.
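What "assess and anticipate" might look like in code is, of course, an open question; the sketch below simply fuses a few hand-picked signals with hard-coded rules to choose an adaptation. The signal names, thresholds, and rules are assumptions made for illustration and are not drawn from Wang's prototype.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Signals:
    people_present: int
    seconds_since_input: float
    voice_command: Optional[str] = None


def choose_adaptation(s: Signals) -> str:
    """Map current sensor signals to a single suggested environmental adaptation."""
    if s.voice_command:                  # an explicit command always takes priority
        return f"execute: {s.voice_command}"
    if s.people_present > 1:             # a meeting seems to be starting
        return "reconfigure for meeting: add privacy screening, damp outside noise"
    if s.seconds_since_input > 300:      # the occupant appears stuck or idle
        return "offer inspiration prompts and reference images"
    return "hold current configuration"


print(choose_adaptation(Signals(people_present=1, seconds_since_input=420.0)))
```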

To help visualize his idea, Yixiao describes a scenario in which an interior designer arrives at her office and prepares to begin her day’s work. A ceiling-mounted flexible robotic environment, similar perhaps to the AWE, gently bends down and presents her with an interactive surface tablet, positioned according to her preprogrammed preferences.

When she finds her creativity blocked, the space agent notices she has stopped working and provides her with prompts and images to inspire her. As she recommences her work, the space agent collaborates with her, searching for relevant information and making suggestions. Later, some colleagues enter the room for a scheduled meeting, and the space agent proactively reconfigures the environment to create privacy and block out external noise.

Yixiao concedes there are many issues, both technical and psychological, that still need to be worked out. His study will assess several trials involving interior designers to determine how users might react to, and prefer to interact with, the space agent. Perhaps in the not-too-distant future, those designers may be called upon to help design and implement space agents for their clients.