Invading a conference in a robotic doppelganger

This evening I attended the Educause Learning Initiative (ELI)’s annual conference, but not in person.  From my Vermont home I operated a telepresence robot from Double Robotics, and drove around the ELI reception interacting with people.  Some first impressions:

The Double Robotics device (which I insist on calling a doppelbot) was simple, even elegant.  It’s an iPad on a stick, mounted on rollers, like a stripped-down videoconference system mated to a baby Segway.  I couldn’t operate it without the help of Michelle Diaz, who patiently charged the ‘bot, then booted it up.

Taken by Dr JSchramm

Me, rolling across a hallway towards a mob.

I couldn’t see that, of course.  Instead I saw through DR’s Web interface, which was similarly simple:

“Seated” at a table with Ben Harwood and friends.

The central panel shows what the doppelbot sees.  The left menu controls media tools, while the right panel shows the face the bot displays to the world.  That last panel also has a control for raising or lowering the bot, which is very entertaining.  On the keyboard, the arrow keys direct the robot to advance, retreat, or turn.

I was able to direct the doppelbot from my rural home, over a wireless network.

Once Michelle and I logged in and powered up, we rolled forward into a reception hall.  “We rolled” means Michelle walked alongside, while I clicked arrow keys and watched anxiously for obstacles.

For the next 30 minutes we rolled around a crowded hall full of friendly, often surprised ELI conference participants.  I looked for people to meet or reconnect with, then drove straight for them.  Many were delighted, and either knew of telepresence robots or figured out what was going on.

Photo by Liana Nunziato

Some face-to-face conversation.

The strangeness and excitement of the experience brought out my goofy side.  I insisted on “sitting down” at tables, lowering the iPad until it was roughly at the level of people’s heads.  I stood in line for the buffet, then shouted “gangway, humans!” when rolling at top speed.  I toasted drinkers repeatedly.  In its shortened mode I snuck up on participants, appearing at their elbows.

The mechanism worked well, but for one problem.  The video feed was nearly constant, stuttering only rarely.  I could hear people as well as one does in a videoconference.  Only once did the doppelbot threaten to topple over (caught by Michelle!), which is remarkable in a crowded, slightly inebriated room.

The biggest problem was my audio.  The iPad’s speakers are far too tiny to make an impression on a crowd, and were quickly swamped by ambient noise.  People could only hear me if they put their heads within inches of the device – and to one side, meaning we couldn’t look at each other while talking.  An uncrowded room should make this less of a problem, a theory confirmed by short conversations we held in the hallway next door.  Headphones attached to the doppelbot might also help.

One possible productive weirdness: it was strange to see people not acknowledge me in robot form.  Most people did, of course, smiling, nodding, waving, or laughing.  But some participants saw “me” as furniture, apparently, as they refrained from expressing human connection.  They pushed around the iPad, or bumped it.  This might be a pedagogical tool for teaching people about the experience of marginalization, or dehumanization.

I expected to feel more awkward in the doppelbot than I did.  Instead it felt like Second Life at its best: simple movements usually just worked, while human interactions offered awkward moments due to lag or altered expectations of body language.  I gradually felt present to a degree, holding a double awareness of the conference site and my faraway office.

With Susan Brower, taken by Lisa Stephens

People loved shooting photos and video of this.

I must emphasize the fun aspect of this experience.  This doppel-embodiment felt playful, inclining me to pull pranks.  People laughed frequently, and reflected quickly on possibilities, both serious and silly.  One participant wanted to dance with me, and I agreed – why not?

After nearly an hour of rolling and hollering, with Michelle patiently walking alongside, I decided to log out.  It was late on the east coast, and my embodied space had its own demands.  Tomorrow I’ll try another doppelbot session, perhaps in a different setting within the conference.

Readers should try out telepresence robotics, preferably from both sides.  This doppelbot technology could appear in many places, and now’s the time to explore.

(photos by Dr JSchramm, Liana Nunziato, and Lisa Stephens)


28 Responses to Invading a conference in a robotic doppelganger

  1. How did you get the Double Robotics unit to ELI?

  2. CogDog says:

    The last human conference attendee on earth sat alone in a conference room….

  3. Had to laugh as I remembered the episode of The Big Bang Theory -> Sheldon #2 !

    • I don’t watch the show, but did see some clips of that. Is Sheldon normally very shy?

      • Joe Murphy says:

        Sheldon is too far on the autistic spectrum for “shy” to be the right word… I’d say that he’s deeply change averse, and narcissistically/obliviously committed to his own convenience and preferences. I’m not sure his fears (or “risk analyses” as he’d probably call them) are any weirder than anyone else’s, really, but for the fact he voices them bluntly.

      • Huh. Maybe I should watch the show, then.

  4. Joe Murphy says:

    I wonder if the designers have considered adding an acoustical shell to the top, to address your audio issues. Looking at the web site, it seems to me that the speakers are pointed straight up; reflecting the sound 90 degrees would deliver a lot more volume to the listeners. (It also might be less distracting in a workplace environment.) There are multiple brands of these available on the market for ~$10. (For some reason they’re called “amplifiers” but they’re really reflectors.)

  5. Good thoughts.
    I don’t think the DR designers did anything to the iPad’s speakers; if so, then they point, what, to the sides?

    • Joe Murphy says:

      iPad speakers point out the top (as it would be mounted on this robot). I’ve noticed even cupping my hand over my speaker can significantly increase the volume at my ears; if you look online you’ll find designs for building one with an index card or a red Solo cup. (I’m guessing people who spend $3000 on a telepresence robot don’t want to send quite that DIY message).

      There are some pretty cool looking designs available for a 3D printer… but would a big red gramophone horn above your head detract from your “presence”?

      • A giant horn would work fine – for me. A matter of personal style… which suggests the possibility that distant users will request on-site customization of the bot. Imagine bows, hats, ties, vests, ribbons, buttons.

For sound I’m looking at the X-Mini II XAM4-B Portable Capsule Speaker.

        PS: great point about the DIY price gap!

  6. CogDog says:

Seriously, it is interesting and fun that you tried it. But is it truly “presence” to have an iPad of you doing live video on a stick? How different is that than if someone walked around holding the device (pretty much what they had to do to accompany robo you)? I would guess the difference is you could steer?

    Did you *Feel* present? Did people Robo Alexander connect with really feel you there as a presence? Sure they see you and barely hear you.

    My first comment was trying to imagine a future conference where we all sit at home, and the conference hall is full of these robots rolling around hobnobbing and such.

I felt much more present than through videoconferencing or virtual worlds. Yes, the steering made a huge difference, plus the entertaining vertical option. I was moving stuff around in another location, and knew this because of all kinds of feedback: seeing people and objects move past me or approach, motion blurs, bumping into things.
      Bumping into things: my keyboard actions caused effects in the visible world. I made people shriek or jump away. Miles away from videoconferencing.

    • Joe Clark says:

      I’d guess the feeling of presence emerges from the perception of agency — Bryan can move when/where he wishes, and participants see the bot moving around seemingly independently. The Second Life reference is apt — it reminds me of the tendency we have to project subjectivity into ‘bots, cleverly scripted animals, and other NPCs, as well as other avatars, for that matter. Reduced lag and wider sensory bandwidth (better sound, force feedback when you bump into things, etc.) would deepen the feeling of embodied presence, methinks.

      • Come to think of it, Joe, those issues with lag and bandwidth also (unfortunately) remind me of Second Life. I hope the doppelbot companies fare better.

        Agreed about our projection of agency. We’re pretty comfortable around robots, especially younger folks. The face, the voice, the movements doppelbots add really bring this forward.

        Do you think we’re in uncanny valley territory?

        • CogDog says:

          Great point about Second Life parallels, and with the doppelbot you have no worries of rezzing inside a wall or accidentally disrobing in public.

          I do believe from Bryan’s recount that the presence was realized (I’m only teasing about iPad on a stick). But telepresence is not solely in the mechanics; I bet much has to do with Bryan’s strong personality, booming voice, and his natural ability to recognize people he knows or draw in people he first meets. It’s more than just the hardware…

      • jsclarkfl says:

        I think we’d only be in the valley if the telepresence bots became more humanoid (is it live? or is it Memorex?). Alan’s point about personality is spot-on, too — the wry humor makes the experience just meta enough for everyone but also puts the focus on the person. I’d imagine a bunch of shy ‘bots lurking around a conference gathering would be downright creepy.

      • Joe Clark says:

        Oops, jsclarkfl is me. Dang interwebz.

      • Great point about extroverted vs introverted bot-drivers, Joe.
        I imagine being in a room with doppelbots that lurk, saying little. Perhaps it’s hard to make out the face on the iPad, or maybe the person is disturbingly generic, or switches out frequently. Maybe we can just make out letters on their lapel: N… S…. is that an “A”?

      • Maybe it’s like teaching, acting, recitation, or other forms of performance, Alan. The platform is there for all, but it takes a certain measure of training and predilection to take advantage of it.

        Personal example: as a young kid I was painfully shy. In 5th grade I took the stage for a play, and reeled back from the crowd, fleeing into the wings. Years went by before my personality shifted and I started learning the craft of performance on stage.

  7. Pingback: On a second doppelbot experience | Bryan Alexander

  8. I was looking at Double Robotics website and I noticed some people put clothes on the bot to make it look more human. In your case perhaps you could have attached a long beard to the screen.

  9. Pingback: Doppelbotting in the Chronicle | Bryan Alexander

  10. Pingback: Considering a year of bloggery | Bryan Alexander
