Microsoft Patent | Providing Living Avatars Within Virtual Meetings

Publication Number: 20190004639

Publication Date: 2019-01-03

Applicants: Microsoft

Abstract

Systems and methods for providing a living avatar within a virtual meeting. One system includes an electronic processor. The electronic processor is configured to receive a position of a cursor-control device associated with a first user within the virtual meeting. The electronic processor is configured to receive live image data collected by an image capture device associated with the first user. The electronic processor is configured to provide, to the first user and a second user, an object within the virtual meeting. The object displays live visual data based on the live image data and the object moves with respect to the position of the cursor-control device associated with the first user.

Background

Embodiments described herein relate to multi-user virtual meetings, and, more particularly, to providing living avatars within such virtual meetings.

Summary

Virtual meeting or collaboration environments allow groups of users to engage with one another and with shared content. Shared content, such as a desktop or an application window, is presented to all users participating in the virtual meeting. All users can view the content, and users may be selectively allowed to control or edit the content. Users communicate in the virtual meeting using voice, video, text, or a combination thereof. Also, in some environments, multiple cursors, each from a different user, are presented within the shared content. Accordingly, the presence of multiple users, cursors, and modes of communication may make it difficult for users to identify who is speaking or otherwise conveying information to the group. For example, even when live video from one or more users is displayed within the virtual meeting, users may find it difficult to track which user is currently speaking, which cursor or other input is associated with which user, and, similarly, which cursor is associated with the current speaker.

Thus, embodiments described herein provide, among other things, systems and methods for providing living avatars within a virtual meeting. For example, in some embodiments, a user's movements and facial expressions are captured by a camera on the user's computing device, and the movements and expressions are used to animate an avatar within the virtual meeting. The avatar reflects what a user is doing, not just who the user is. For example, living avatars may indicate who is currently speaking or may reflect a user's body language, which allows for more natural interactions between users.

To create a living avatar, the avatar is associated with and moves with the user's cursor. Thus, as the user moves his or her cursor, other users can simultaneously view the movement of the cursor and the avatar and are not forced to focus on only one area within the virtual meeting. In some embodiments, live video may be used in place of an avatar and may be similarly associated with the user's cursor. Similarly, when a user provides audio data but not video data (live or as an avatar), the audio data may be represented as an animation (for example, an object that pulses or changes shape or color based on the audio data) associated with the user's cursor. Thus, the living avatars associate live user interactions within a virtual meeting (in video form, avatar form, audio form, or a combination thereof) with a user's cursor or other input mechanism or device to enhance collaboration.
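The behavior described above, an object that repositions itself relative to a reported cursor position and pulses with a user's audio level, can be sketched as follows. This is a minimal illustration only; the class name, the offset constant, and the scaling formula are assumptions for clarity, not taken from the patent.

```python
from dataclasses import dataclass

# Hypothetical offset: draw the avatar slightly to the right of and
# above the cursor hotspot so it does not obscure the pointer.
OFFSET = (24.0, -24.0)

@dataclass
class AvatarObject:
    """An on-screen object that tracks a user's cursor position."""
    x: float = 0.0
    y: float = 0.0
    scale: float = 1.0

    def follow_cursor(self, cursor_x: float, cursor_y: float) -> None:
        # Reposition the avatar relative to the reported cursor position,
        # so it moves "with respect to" the cursor-control device.
        self.x = cursor_x + OFFSET[0]
        self.y = cursor_y + OFFSET[1]

    def pulse_with_audio(self, amplitude: float) -> None:
        # Grow the object with the audio level (assumed normalized to
        # 0..1) so a speaking user's avatar visibly pulses.
        clamped = max(0.0, min(1.0, amplitude))
        self.scale = 1.0 + 0.5 * clamped
```

In a real client, `follow_cursor` would be driven by pointer-move events and `pulse_with_audio` by a per-frame audio level, with the same object rendered to every meeting participant.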

For example, one embodiment provides a system for providing a living avatar within a virtual meeting. The system includes an electronic processor. The electronic processor is configured to receive a position of a cursor-control device associated with a first user within the virtual meeting. The electronic processor is configured to receive live image data collected by an image capture device associated with the first user. The electronic processor is configured to provide, to the first user and a second user, an object within the virtual meeting. The object displays live visual data based on the live image data and the object moves with respect to the position of the cursor-control device associated with the first user.

Another embodiment provides a method for providing a living avatar within a virtual meeting. The method includes receiving, with an electronic processor, a position of a cursor-control device associated with a first user within the virtual meeting. The method includes receiving, with the electronic processor, live image data collected by an image capture device associated with the first user. The method includes providing, with the electronic processor, an object to the first user and a second user within the virtual meeting. The object displays the live image data and the object moves with respect to the position of the cursor-control device associated with the first user.

Another embodiment provides a non-transitory computer-readable medium including instructions executable by an electronic processor to perform a set of functions. The set of functions includes receiving a position of a cursor-control device associated with a first user within a virtual meeting. The set of functions includes receiving live data collected by a data capture device associated with the first user. The set of functions includes providing an object to the first user and a second user within the virtual meeting. The object displays data based on the live data and the object moves with respect to the position of the cursor-control device associated with the first user. The data include at least one selected from a group consisting of live image data captured by the data capture device, a live avatar representation based on live image data captured by the data capture device, and a live animation based on live audio data captured by the data capture device.
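The group recited above names three live representations the object may display: live image data, an avatar derived from live image data, or an animation driven by live audio. A selection among them could look roughly like this; the function, its inputs, and the fallback are hypothetical and intended only to illustrate the grouping, not the patent's implementation.

```python
def choose_representation(has_video: bool, wants_avatar: bool, has_audio: bool) -> str:
    """Pick which live representation to attach to the user's cursor.

    Hypothetical helper mirroring the recited group: live image data,
    a live avatar representation based on image data, or a live
    animation based on audio data.
    """
    if has_video and wants_avatar:
        # Image data is available and the user prefers an avatar
        # animated from it rather than raw video.
        return "live avatar representation"
    if has_video:
        return "live image data"
    if has_audio:
        # No image data: represent the user as an audio-driven animation.
        return "live audio animation"
    return "static placeholder"  # no live data available
```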
