US9349118B2 - Input, display and monitoring of contact center operation in a virtual reality environment - Google Patents
- Publication number
- US9349118B2 (Application US13/471,890; publication US201213471890A)
- Authority
- US
- United States
- Prior art keywords
- monitor
- virtual object
- human
- virtual
- monitored
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/101—Collaborative creation, e.g. joint development of products or services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/103—Workflow collaboration or project management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Definitions
- the disclosure relates generally to contact centers and particularly to contact center monitoring.
- Contact centers service incoming contacts from customers while predictive dialing facilities initiate outgoing customer contacts regarding goods and services.
- An important function in contact centers and predictive dialing facilities is monitoring performance in light of policies and objectives.
- An ongoing problem, particularly in large and/or distributed systems, is enabling one or more supervisory staff members to monitor the performance of resources, particularly contact center agents. Supervisory staff members are on their feet much of the day circulating among contact center agents, which adds to job fatigue, and the problem is only made worse when the agents are spread over a large physical area. In geographically distributed contact centers in particular, the problem takes a different form: the supervisory staff members must understand the issue and access the screen information being viewed by a remote agent, which can add time to the customer experience while the supervisory staff loads and reviews that information.
- a related problem is that agent behavior can change when a supervisor is nearby. Some agents become nervous and perform below their normal level, while others perform better because of the effect of “the boss” watching. Thus, the physical presence of the supervisor may have an undesirable impact on contact center monitoring and/or performance.
- the disclosure is directed generally to a virtual reality environment for monitoring a selected entity.
- a method that includes the steps:
- providing, by a virtual reality environment rendering module, a virtual reality environment representative of or associated with one or more of a contact center, a predictive dialer, and a media collaboration session, the virtual reality environment comprising virtual objects associated with a plurality of human entities;
- in response, providing, by an information manager through the virtual reality environment to a human entity, information associated with another entity corresponding to a selected monitored virtual object.
- a method includes the steps:
- each of the monitored virtual objects is active in the virtual reality environment independent of any control or other input by the corresponding monitored entity
- the monitor virtual object is automatically suspended when the monitor virtual object performs a first function and automatically reactivated when the monitor virtual object performs a second function, the status of the monitor virtual object being independent of the human monitor logging into or out of the virtual reality environment.
- a system that includes:
- a monitor communication device of a human monitor to monitor the plurality of agent communication devices and/or human agents
- a processor executable virtual reality environment rendering module to provide a virtual reality environment comprising a plurality of monitored virtual entities corresponding to the plurality of human agents and a monitor virtual entity corresponding to the human monitor, the monitor virtual entity and/or monitored virtual entities being able to move relative to a coordinate system defining the virtual reality environment;
- a processor executable information manager to provide, to the human monitor, information associated with a selected monitored entity corresponding to a selected human agent.
- the monitored virtual objects and the monitor virtual object are avatars;
- the information comprises one or more of: a live media stream between the selected monitored entity and a customer, a contact identifier (“ID”), a contact type code, an outbound contact initiation method, a customer ID, a data source ID, an ID of the selected monitored entity and customer involved in the contact, a business role code, a party role start date and time, a contact direction code, a contact direction description, a contact purpose, a contact reason, a contact media interaction disposition, a contact disposition, a contact wait treatment and/or time, an ID of the selected monitored entity, a selected monitored entity ID, a selected monitored entity percentage time in state by state identifier, a current state of the selected monitored entity, a skill of the selected monitored entity, a skill level for the selected monitored entity, a performance statistic and/or metric of the selected monitored entity, a customer value indicator, a customer historic business level and/or purchase(s), a contact history, contact details, contact queue details, a
- a focus and/or focal point of the monitor virtual object is/are determined.
- the focus is based on one or more of a virtual position of the monitor virtual object relative to the coordinate system and/or the selected monitored virtual object, a proximity of a virtual location of the monitor virtual object relative to a virtual location of the selected monitored virtual object, a virtual gesture or touch of the monitor virtual object, and/or a view of the monitor virtual object.
- the determined focus and/or focal point of the monitor virtual object is, for example, the selected monitored virtual object.
- the one or more trigger events is/are one or more of the monitor virtual object entering and/or leaving a defined region, area, or volume in the virtual reality environment, selection, by the human monitor, of the selected monitored virtual object, performance of a predetermined action by the monitor virtual object, receipt of a request from a monitored entity, detection of a keyword or key word phrase in a media stream associated with a contact between a monitored entity and a customer, and detection of a human monitor's keyword or key word phrase in an audio stream and/or a text string.
- the virtual reality environment based on the determined focus and/or focal point of the monitor virtual object, is reconfigured, by the virtual reality rendering module, whereby a virtual position of the monitor virtual object is closer to a virtual position of the at least one selected monitored virtual object and a dashboard comprising the information associated with the at least one selected monitored virtual object is displayed to the human monitor.
- the interrupt is a first movement of the monitor virtual object.
- the provided information is a live media stream from a contact between the selected monitored entity and a customer.
- the human monitor is connected automatically as a third party to the contact, and the human monitor is disconnected automatically from the contact in response to a second movement by the monitor virtual object.
- the interrupt is a first movement of the monitor virtual object
- the monitored entity is an enqueued contact with a customer in an on hold status.
- a communication module automatically connects the human monitor to the customer contact, and, by a second movement of the monitor virtual object, the human monitor is disconnected automatically from the customer contact and the customer contact is again put on an on-hold status.
- the present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration.
- the disclosed architecture can increase the effectiveness of supervisory staff in contact center and predictive dialer operations while avoiding the deviant agent behavior from a supervisory staff member being in close physical proximity to the agents.
- Supervisors can, from a fixed location, virtually monitor each of his or her contact center agents as if he or she were standing at the workstation of each of the selected agents. This can allow supervisory staff to monitor seamlessly whether agents are co-located or geographically distributed. It can reduce the cost overhead associated with contact center and/or predictive dialer operations.
- the avatars of the monitored agents can be a picture of the actual corresponding agent, thereby enabling the supervisor to memorize the actual agent's appearance for later use.
- each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
- automated refers to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be “material”.
- An “avatar” is the graphical representation of a person, such as a user, or a person's alter ego or character. It may take either a three-dimensional or four-dimensional form, as in games or virtual worlds, or a two-dimensional form as an icon in Internet forums and other online communities.
- a “call center” is a physical place where customer and other telephone calls are handled by or on behalf of an organization. Typically, a call center concurrently handles a considerable volume of calls, which treatment includes screening and forwarding the calls to human or automated resources or agents for servicing.
- a “computer-readable medium” refers to any tangible storage and/or transmission medium that participates in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media.
- Non-volatile media includes, for example, NVRAM, or magnetic or optical disks.
- Volatile media includes dynamic memory, such as main memory.
- Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, magneto-optical medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a solid state medium like a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
- a digital file attachment to e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium.
- when the computer-readable media is configured as a database, the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior art-recognized equivalents and successor media, in which the software implementations of the present disclosure are stored.
- a “contact” refers to an interaction between selected parties/entities over one or more channels.
- the parties/entities can be human, such as a customer, agent, and supervisor, or nonhuman, such as an Interactive Voice Response unit, a Web server, content analyzer, email server, and the like.
- a “contact center” (or a customer interaction center or e-contact center) is a facility within an enterprise from which customer contacts are directed to or managed or serviced by contact center resources.
- the contact center typically includes one or more online call centers but may include other types of customer contacts, including e-mail newsletters, postal mail catalogs, Web site inquiries and chats, instant messages, and the collection of information from customers during in-store purchasing.
- a contact center is generally part of an enterprise's overall customer relationship management (“CRM”).
- a “module” refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element. Also, while the disclosure is presented in terms of exemplary embodiments, it should be appreciated that individual aspects of the disclosure can be separately claimed.
- a “predictive dialer” dials a list of telephone numbers and connects answered dials to people making calls, often referred to as agents. Predictive dialers use statistical algorithms to minimize the time that agents spend waiting between conversations, while minimizing the occurrence of someone answering when no agent is available.
- a “virtual world” or “virtual environment” or “virtual reality environment” is typically an online community that takes the form of a computer-based simulated environment through which users can interact or communicate (such as via video, audio, and/or text communication modalities) with one another and use and create objects.
- the term has become largely synonymous with interactive virtual reality environments (which have a two-, three or four-dimensional coordinate space), where the users take the form of avatars visible to others and able to move within the virtual reality environment. These avatars usually appear as textual, two-dimensional, three-dimensional, or four-dimensional representations, although other forms are possible (auditory and touch sensations for example).
- the “fourth” dimension referenced herein with respect to the avatar and virtual reality environment can be expressed in many forms.
- the dimension could be temporally based, or a function of a measure of time.
- the dimension could be evolution over time (aging, maturity, expertise and the like).
- the dimension is based on localized feedback. Other variations or combinations are also possible for the dimension.
- FIG. 1 is a block diagram of a monitoring system according to an embodiment
- FIG. 2 is a block diagram of virtual reality supervision module according to an embodiment
- FIG. 3 depicts monitored entity ordering according to an embodiment
- FIGS. 4-5 depict a display according to various embodiments.
- FIGS. 6A-B depict flowcharts according to an embodiment.
- the invention will be illustrated below in conjunction with an exemplary communication system. Although well suited for use with, e.g., a system having an ACD or other similar contact processing switch, the invention is not limited to use with any particular type of communication system switch or configuration of system elements. Those skilled in the art will recognize that the disclosed techniques may be used in any communication application in which it is desirable to have a team of agents engage in training while servicing a contact.
- a virtual reality supervision module exports information being accessed by a selected agent (such as the agent screen(s) or display(s) and agent/customer media stream(s)) and/or information describing the selected agent (such as an agent representation, e.g., a photograph, video monitoring output, an avatar, etc.) and/or the selected agent and/or contact center, predictive dialer, and/or call center performance metrics into a virtual reality environment or world.
- a supervisor or monitor representation (e.g., a photograph, video monitoring output, an avatar, etc.) is likewise provided in the virtual reality environment so that he or she can view selected agents, such as agents partitioned physically into workspaces or work areas and seated before a computer screen or display with one or more possible communication devices.
- the supervisor or monitor would be automatically or selectively linked to a media stream between the selected agent and a customer and would view the selected agent's screen(s) or display(s).
- the screen(s) or display(s) could optionally enlarge to give the appearance that the supervisor or monitor would see if he or she were the selected agent.
- the supervisor or monitor could passively monitor the media stream or join in as an active participant. If the supervisor or monitor elected to join, he or she could whisper announce that election to the selected agent.
- a change in state of a monitored entity causes a corresponding change in a state of the monitored entity's virtual object.
- a change in a state of a contact center agent causes a corresponding change in a state or appearance of the agent's avatar.
- a change in a performance metric associated with the agent can cause a corresponding change in an appearance of the agent's avatar to reflect the performance metric.
- the color, size, or other physical characteristic or behavior of the agent's avatar and/or a virtual object in proximity to the agent's avatar could change in response to a performance metric falling below or rising above a selected threshold.
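- As an illustration of the appearance change described above, the following minimal sketch (in Python, with hypothetical thresholds and color names not taken from the patent) maps a performance metric onto an avatar appearance attribute:

```python
def avatar_appearance(metric_value, low_threshold, high_threshold):
    """Map a performance metric onto an avatar appearance attribute (illustrative)."""
    if metric_value < low_threshold:
        return "red"      # metric fell below the selected threshold
    if metric_value > high_threshold:
        return "green"    # metric rose above the selected threshold
    return "neutral"      # within the acceptable band

# e.g. avatar_appearance(0.42, low_threshold=0.5, high_threshold=0.9) -> "red"
```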
- the words “supervisor” and “user” and the words “entity” and “object” are used interchangeably.
- “user” is also used to refer not only to the supervisor but also to a human monitored entity, such as a contact center agent.
- “supervisor” is used to refer not only to a supervisor but also to other types of human monitors, which may or may not be a supervisor of the monitored entity (which may be human or non-human).
- an agent can act as a supervisor by reviewing his or her behavior over a period in a virtual reality environment.
- an agent could train to be a supervisor by reviewing actual events in a virtual reality training simulator.
- the monitoring system 100 includes a supervisor communication device 104 , a virtual reality supervision module 108 and associated database 112 , a plurality of monitored entities 116 a - n , all interconnected by networking elements 120 , 124 , and 128 , as appropriate.
- the supervisor communication device 104 can be any computational device having one or more display(s) or screen(s), such as a personal computer, a laptop, a notebook computer, a smart phone, and the like.
- the virtual reality supervision module 108 receives inputs from the supervisor communication device 104 , one or more selected monitored entities 116 a - n , and the database 112 and, based on the input, provides a virtual reality environment for presentation to the supervisor via the supervisor communication device 104 and/or to a monitored entity via a monitored entity communication device (not shown).
- the virtual reality supervision module 108 is a modified version of Avaya Web.alive™ or AvayaLive™.
- the database 112 can have any suitable data structure, such as a relational or object oriented data structure, and includes a variety of data, including without limitation contact center, predictive dialer, and/or call center information or objects (e.g., for a contact, contact ID (an internally generated unique identifier of a contact); contact type code (which identifies the type of contact), outbound contact initiation method, customer ID (the identifier(s) of the customer(s) engaged in the contact), data source ID, party ID (the identifier(s) of the parties to the contact), business role code, party role start date/time (the date/time that the contact center recognizes that the party may play a role in interactions with the enterprise), contact direction code, contact direction description, contact purpose, contact reason, contact media interaction disposition, contact disposition, and contact wait treatment/time; for an agent, agent identifier, agent contact information, agent group identifier and information, agent percentage time in state by state identifier, current agent state, agent skill(s), agent skill level for each agent skill, agent performance statistics
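- The record layout below is a rough sketch of the kind of contact and agent data the database 112 might hold; the field names are drawn from the description above, but the structure itself is an assumption for illustration only:

```python
from dataclasses import dataclass, field

@dataclass
class ContactRecord:
    contact_id: str                  # internally generated unique identifier of a contact
    contact_type_code: str           # identifies the type of contact
    customer_id: str                 # customer(s) engaged in the contact
    party_ids: list = field(default_factory=list)   # parties to the contact
    direction_code: str = ""
    purpose: str = ""
    disposition: str = ""
    wait_time_seconds: float = 0.0

@dataclass
class AgentRecord:
    agent_id: str
    group_id: str
    current_state: str
    percent_time_in_state: dict = field(default_factory=dict)  # state -> percentage
    skills: dict = field(default_factory=dict)                 # skill -> skill level
```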
- the monitored entities 116 a - n can be any human, computational, or other tracked or monitored entity.
- monitored entities include agents or agent groups, agent communication devices (e.g., desktops, phones, displays, keyboards, mouse or other user input device, etc.), inbound or outbound customer contacts, queues or queue groups, trunks or trunk groups, and other selected entities or objects.
- the networking elements 120 , 124 , and 128 can be any signal path or channel, such as a bus, a local area or wide area network, and the like.
- the virtual reality supervision module 108 includes a user focus module 200 , an information manager 204 , a virtual reality environment rendering module 208 , user interface module 212 , interrupt module 216 , communication module 220 , and control module 224 , all interconnected by a networking element 228 , which can be any of the networking elements 120 , 124 , or 128 discussed with reference to FIG. 1 .
- the user focus module 200 determines a focus or focal point of a selected avatar relative to the virtual reality environment and/or avatars or other objects (or monitored entities) in the environment.
- the focus may be based on a virtual position, a location, or physical proximity of the avatar, a virtual gesture or touch of the avatar, and/or a view of the avatar.
- An avatar's view, which can be presented to a corresponding user, can itself depend on the location and/or orientation of the avatar within the virtual reality environment.
- the avatar's view may depend on the direction in which the avatar is facing and the selected viewing option, such as whether the user has opted to have the view appear as if the user were looking through the eyes of the avatar or whether the user has opted to pan back from the avatar to see a three-dimensional view of where the avatar is located and what the avatar is doing in the three-dimensional computer-generated virtual reality environment.
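- A minimal sketch of how a focus determination like the one performed by the user focus module 200 could work, assuming a simple two-dimensional coordinate system and a fixed view cone (both assumptions; the patent does not specify the computation):

```python
import math

def determine_focus(monitor_pos, monitor_heading, agents, max_range=5.0, fov_deg=60.0):
    """Return the id of the nearest agent avatar inside the monitor avatar's view cone."""
    focus, best = None, float("inf")
    for agent_id, (ax, ay) in agents.items():
        dx, dy = ax - monitor_pos[0], ay - monitor_pos[1]
        dist = math.hypot(dx, dy)
        if dist > max_range or dist == 0.0:
            continue
        bearing = math.degrees(math.atan2(dy, dx))
        # signed angular difference between bearing and heading, in [-180, 180)
        if abs((bearing - monitor_heading + 180) % 360 - 180) <= fov_deg / 2 and dist < best:
            focus, best = agent_id, dist
    return focus

# determine_focus((0, 0), 0.0, {"agent_a": (3, 0), "agent_b": (0, 4)}) -> "agent_a"
```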
- the information manager 204 retrieves, from the database 112 , data and receives and filters inputs from the supervisor communication device 104 , monitored entities 116 a - n , user focus module 200 , user interface module 212 , and interrupt module 216 and provides relevant information to the virtual reality environment rendering module 208 .
- the retrieved data includes not only information related to a contact center, predictive dialer, and/or call center and/or components thereof, but also information related to the virtual reality environment itself.
- the latter information includes the coordinate system and the various objects indexed or located in the coordinate system.
- the virtual reality environment rendering module 208 receives information from the information manager 204 and renders and updates the virtual reality environment.
- the virtual reality environment can be interactive, have rules based on reality (e.g., gravity, mass, topography, locomotion, real-time actions, and communication, etc.) and a two-, three-, or four-dimensional coordinate space.
- the virtual reality environment depicts the user(s) and selected object(s) as avatars or other visual or graphical representation visible to the user and possibly to others.
- the coordinate system can have any configuration, such as a floor plan with multiple rooms, a network site map with agent avatars at each network node, a circular room with the agent avatars positioned around the circumference of the room and the supervisor avatar in the central interior of the room, and so forth.
- the user's avatar, and possibly other avatars, is/are able to move within the virtual reality environment in response to commands received from the user, such as key strokes, mouse cursor movements, user gestures or bodily movements, and the like.
- the avatars usually appear as textual, two-dimensional, three-dimensional, or four-dimensional representations, though other forms are possible (auditory and touch sensations for example).
- the virtual reality environment rendering module 208 presents perceptual stimuli to the user based on his or her movement or other interaction with the objects in the virtual reality environment.
- multiple users, each by way of a corresponding client, connect simultaneously to a virtual reality server to interact with one another and with the virtual reality environment.
- the user interface module 212 receives command and request inputs from the supervisor communication device 104 and/or communication device(s) (not shown) associated with a set of monitored entities 116 a - n and provides information, such as a visual rendering of the virtual reality environment (received from the virtual reality environment rendering module 208 ) and contact center, predictive dialer, and/or call center information or component thereof information (received from the information manager 204 ), to the supervisor communication device 104 and/or a communication device (not shown) associated with a selected monitored entity 116 a - n.
- a user can see a representation of a portion of the computer-generated virtual reality environment on a display and input commands via his or her user input device, such as a mouse or keyboard.
- the user interface module 212 receives the command(s) and other input(s) from the user and passes the user input to the virtual reality environment rendering module 208 .
- the virtual reality environment rendering module 208 causes the user's avatar or other object under the control of the user to execute the desired action in the virtual reality environment. In this way, the user may control a portion of the virtual reality environment, such as his or her avatar or other objects in contact with the avatar, to change the virtual reality environment.
- the user interrupt module 216, based on information received from the user focus module 200, information manager 204, and/or user interface module 212, generates an interrupt requiring one or more specified actions by the control module 224.
- the virtual reality environment may include triggers or trigger events that generate actions when triggered by an activity in the environment.
- the user interrupt module 216 is a scripted mapping function (written, for example, in Perl, REXX, JavaScript, VBScript, or Tcl/Tk) designed to listen and watch for trigger events in the virtual reality environment, look up each trigger event in a trigger event table to determine the trigger event type, and pass the trigger event type and associated parameter(s) to a scripted function.
- a script is a program or sequence of instructions that is interpreted or carried out by another program rather than by the computer processor (as in the case of a compiled program).
- the associated parameters can include the identity of the user who caused the trigger event (if available), any trigger event variables, and a string which can be used by the Scripted logic to select an appropriate action.
- the Scripted mapping function enables input from the user focus module 200 , information manager 204 , and/or user interface module 212 to be passed to a Scripted function so that interaction with the virtual reality environment content can cause particular actions or events to occur within the virtual reality environment, such as providing, in substantial real time, the supervisor with a selected agent's desktop display, dashboard (which are supervisor-configurable to enable him or her to drill down into additional detail in a selected pane or segment of the dashboard), and/or current media stream(s) in customer contact sessions.
- trigger event detection may be performed by functionality other than and different from the Scripted mapping function.
- although the functionality is described as being scripted, it is to be appreciated that the functionality could be a compiled program or a combination of a script and a compiled program.
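- The following hedged sketch illustrates the trigger-event lookup and dispatch pattern described above; the table contents and handler names are hypothetical:

```python
# Hypothetical trigger event table: event name -> trigger event type.
TRIGGER_EVENT_TABLE = {
    "enter_region": "focus_change",
    "leave_region": "focus_change",
    "click_object": "selection",
    "keyword_detected": "assistance_request",
}

def scripted_action(event_type, user_id, params):
    # Placeholder for the scripted logic that selects the appropriate action.
    return {"type": event_type, "user": user_id, "params": params}

def handle_trigger(event_name, user_id=None, **params):
    """Classify a raw trigger event and pass its type and parameters onward."""
    event_type = TRIGGER_EVENT_TABLE.get(event_name)
    if event_type is None:
        return None
    return scripted_action(event_type, user_id, params)

# handle_trigger("enter_region", user_id="supervisor", region="room_3")
```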
- Trigger events in the virtual reality environment can take many forms. Examples of possible trigger events include when a user enters or leaves a defined region, area, or volume in the virtual reality environment (i.e., the user enters/leaves a partitioned area or room), the user approaches or leaves a particular person, the user starts/stops talking to another user, the user clicks on an object, the user invokes a control feature to enable his or her avatar to take special action within the virtual reality environment, and the user starts/stops/updates content mapped to a surface of the virtual reality environment.
- a monitored entity such as a queue or trunk group, can automatically request a supervisor's attention or assistance.
- a contact center agent can request a supervisor's attention through a user interface, through gesture recognition, or via automatic speech recognition of a keyword or key word phrase in an audio stream or keyword or key word phrase in a text string.
- Other trigger events include gesture recognition, or automatic speech recognition of a supervisor's keyword or key word phrase in an audio stream or a text string.
- Further trigger events include commands or requests received from a user interface of the supervisor or agent.
- Other actions or stimuli may be captured as trigger events; therefore, this list is not exhaustive but simply shows several examples of possible actions that may be captured and interpreted as triggers.
- for example, when the user's avatar approaches an agent avatar, the movement in the rendered virtual reality environment will cause the rendered virtual reality environment to be updated.
- the action may be interpreted as an event which may cause information (particularly non-virtual world information such as contact center, predictive dialer, and/or call center information) associated with the approached agent to be retrieved by the information manager 204 and provided to the user.
- the movement would be interpreted as an event to update not only the rendered virtual reality environment but also the non-virtual world information provided to the user.
- the monitored entity is represented by an avatar or other visual or graphical image or representation whose appearance is a function primarily of a resource or media type.
- for example, an automated resource, such as an interactive response unit, and a human agent would have a substantially similar appearance provided they use a common media type, and a substantially different appearance if they were to use different media types.
- the communication module 220 effects a communication session between the user and a selected object by a selected communication modality, such as voice call, multimedia call, email, instant messaging, text chat, and the like.
- the communication session can be by initiating a new session or joining an existing session.
- the control module 224 supervises and controls operations of the other modules and manages the interactions between and among them.
- a supervisor avatar 300 is associated with a supervisor and first, second, third, . . . pth agent avatars 304 a - p are associated with contact center agents under the supervision of the supervisor.
- the supervisor is spatially proximal (relative to the coordinate system of the virtual reality environment) to the first and second contact center agent avatars 304 a - b (which acts as a trigger for an interrupt and is deemed by the user focus module 200 to be the focus 308 of the supervisor) and therefore the interrupt module 216 , assuming that the supervisor is interested in the first and second agent avatars 304 a - b , causes the information manager 204 to provide the user interface module 212 with selected non-virtual reality environment information associated with the first and second agent avatars 304 a - b for presentation to the supervisor. What non-virtual reality environment information is to be provided to the supervisor or other action to be performed in response to a selected trigger may be defined by the supervisor.
- the user interface module 212 either provides the non-virtual reality environment information directly to the supervisor communication device 104 for display to the supervisor or to the virtual reality environment rendering module for incorporation into the virtual reality environment display information provided to the supervisor communication device 104 .
- the supervisor avatar 300 moves in the direction 312 indicated and is now positioned in spatial proximity to the second and third agent avatars 304 b - c as shown in FIG. 3B (which spatial proximity acts as a trigger for an interrupt and is deemed by the user focus module 200 to be the focus 312 of the supervisor).
- the interrupt module 216 In response to the detected trigger, the interrupt module 216 , assuming that the supervisor is now interested in the second and third agent avatars 304 b - c , causes the information manager 204 to provide the user interface module 212 with selected non-virtual reality environment information associated with the second and third agent avatars 304 b - c for presentation to the supervisor.
- an avatar is generally a three dimensional rendering of a person or other creature that represents the user in the virtual reality environment.
- a user corresponding to an avatar selects the appearance of the avatar when creating a profile (such as specifying height, weight, gender, age, hair color, etc) and controls, tactilely, by bodily movement, by eye movement, etc., the movement of the avatar in the virtual reality environment.
- the appearance of the avatar is based on information collected from an on-line source, such as a social network (e.g., Facebook™ or LinkedIn™).
- the user can cause the avatar to walk, run, wave, talk, or make other similar movements.
- the image 318 (see FIG. 3A) and other similar images representing the avatar in the virtual reality environment are crude avatar representations not intended to show how an avatar would actually appear in a virtual reality environment.
- Avatars have generally been represented herein using simple geometric shapes or two dimensional drawings, rather than complex three dimensional shapes such as people and animals. The actual appearance of avatars in the computer-generated virtual reality environment is not important to the concepts discussed herein.
- avatars are discussed with reference to human contact center agents, it is to be appreciated that avatars can be associated with non-human monitored entities.
- an avatar can be a picture or image of a selected agent's personal computer, desktop, phone, keyboard or other user input device, etc. It could also be a representation of an automated resource, such as an interactive response unit.
- in FIG. 4, which represents a display 400 of the supervisor communication device 104, an agent avatar 404 corresponding to a selected contact center agent is depicted in a first display portion 408 of the display 400.
- a dashboard 412, which is depicted in a second display portion 420, comprises first, second, . . . zth information display segments 416 a - z, each of which includes non-virtual reality environment information related to the contact center, predictive dialer, and/or call center and/or the selected contact center agent.
- the dashboards of at least two of the selected agents contain different types of non-virtual reality environment information, such as a first dashboard containing one or more of agent group identifier and information, agent percentage time in state by state identifier, current agent state, agent skill(s), and agent skill level for each agent skill and a second dashboard containing a different one or more of agent group identifier and information, agent percentage time in state by state identifier, current agent state, agent skill(s), and agent skill level for each agent skill.
- display portions other than or in addition to the dashboard are displayed simultaneously with the virtual reality environment.
- one or more additional display portions can display the corresponding selected agent's desktop display in substantial real time, a historic display of the selected agent's desktop, a live video of the selected agent, and the like.
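- A small sketch (assumed field names) of how dashboard segments 416 a - z might be assembled for a selected agent from non-virtual reality environment information:

```python
def build_dashboard(agent):
    """Return (label, value) segments for the dashboard shown beside the agent avatar."""
    return [
        ("Current state", agent.get("current_state", "unknown")),
        ("Skills", ", ".join(agent.get("skills", []))),
        ("Time in state (%)", agent.get("percent_time_in_state", {})),
        ("Performance score", agent.get("score", 0)),
    ]

# build_dashboard({"current_state": "on-call", "skills": ["billing"], "score": 87})
```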
- the virtual reality environment is viewed by the user with non-virtual reality environment information, in a transparent overlay or background, overlaying the three dimensional content of the virtual reality environment, whereby the three dimensional content of the virtual reality environment continues to be visible through the overlay or background.
- a dashboard showing selected agent and/or contact center, predictive dialer, and/or call center information overlays the virtual reality environment so that both the dashboard contents and the underlying virtual reality environment are substantially visible to the user.
- the first and second display portions 408 and 420 would be at least partially overlapping. As will be appreciated, other display configurations are possible within the virtual reality environment.
- the supervisor avatar has changed focus from one selected contact center agent to multiple contact center agents.
- Each agent avatar 404 a - q in a respective first display portion 408 a - q has a corresponding dashboard 412 a - q in a corresponding second display portion 420 a - q .
- the ordering of the agent avatars, whether left-to-right, right-to-left, or otherwise, can depend on the relative performance of each agent. For instance, the left-most agent avatar can correspond to the lowest-performing of the selected agents and the right-most to the best-performing of the selected agents.
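- A minimal sketch of the performance-based ordering, assuming each agent record carries a single numeric score (an assumption; the patent allows any performance metric):

```python
def order_agents_by_performance(agents):
    """agents: list of dicts with 'agent_id' and 'score'; lowest score ordered first."""
    return sorted(agents, key=lambda a: a["score"])

# order_agents_by_performance([{"agent_id": "a2", "score": 91},
#                              {"agent_id": "a1", "score": 64}])
# -> a1 displayed left-most (lowest performing), a2 right-most (best performing)
```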
- FIG. 5 depicts how the number of supervisor selected avatars can vary depending on the breadth of focus by the supervisor.
- the changing focus of the supervisor changes the spatial relationships between objects in the coordinate system of the virtual reality environment.
- the virtual reality environment can be reconfigured by the virtual reality environment rendering module 208 .
- the virtual reality environment can be changed or reconfigured such that two or three stations immediately before the selected two or three agent avatars (as determined by the user focus module 200 ) are the two or three stations in closest (virtual) spatial proximity to the supervisor avatar or are the two or three stations currently being monitored by the supervisor avatar. In this way, the supervisor need spend less time with navigation keys walking his or her avatar from virtual partition-to-partition or room-to-room to monitor the two or three selected agents.
- automatic speech recognition of a selected term, such as “supervisor”, in an audio or text stream, other keyword or key word phrase detection in an audio stream or text string, or a detected contact center agent gesture or other movement would bring a specific contact center agent to the attention of the supervisor and/or move the virtual position of the agent station into spatial proximity with the supervisor avatar. This avoids the supervisor having to recognize the specific contact center agent, determine that agent's virtual location, and navigate the supervisor avatar to it.
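- A hedged sketch of keyword-triggered repositioning; the keyword list and the move_next_to call are hypothetical stand-ins for whatever speech-recognition output and environment API an implementation would use:

```python
KEYWORDS = {"supervisor", "escalate", "manager"}   # illustrative keyword set

def on_transcript_fragment(agent_id, text, environment):
    """If a keyword is detected, reposition the agent station next to the supervisor avatar."""
    if any(word in text.lower() for word in KEYWORDS):
        environment.move_next_to(agent_id, "supervisor_avatar")  # assumed environment API
        return True
    return False
```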
- the supervisor's experience or total virtual reality session is recorded for future replay.
- the supervisor or the control module 224 can tag and/or timestamp media streams, agent screen detail, dashboard, or other performance monitoring information to recall specific data or events at a future point in time.
- the tags could output snippets or mark specific events or sections of a larger recording.
- the supervisor avatar, when within a defined (virtual) spatial proximity of an avatar associated with a selected contact center agent, hears, in a type of eavesdropping mode, either the agent's end alone or both the agent's and the customer's ends of a live contact session.
- the supervisor via his or her avatar, would hear all or part of the dialog between the selected agent and customer.
- the selected agent's dashboard or current desktop display would pop up on the supervisor's display (such as discussed above with reference to FIGS. 4-5 ), thereby enabling the supervisor to monitor more effectively the selected agent's performance overall and current servicing of the customer.
- the supervisor's display changes in real time to reflect changes in the selected agent's desktop.
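- A rough sketch of the proximity-based eavesdropping behavior, with assumed stream and display interfaces (play_audio, show_panel, and so on are illustrative, not part of the patent):

```python
def update_monitoring(supervisor_view, agent, distance, eavesdrop_radius=3.0):
    """Feed the live media stream and agent desktop to the supervisor while in proximity."""
    if distance <= eavesdrop_radius:
        supervisor_view.play_audio(agent.live_media_stream())   # agent's end or both ends
        supervisor_view.show_panel(agent.desktop_snapshot())    # desktop/dashboard pop-up
    else:
        supervisor_view.clear_panels()                          # out of range: stop monitoring
```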
- the communication module 220 initiates a new session with a selected agent by an action undertaken in the virtual reality environment by the user's avatar relative to the selected agent's avatar or the user joins an existing session between a customer and an agent by an action undertaken in the virtual reality environment by the user's avatar relative to the selected agent's avatar, customer's avatar, or an avatar or other visual or graphical image or object in the virtual reality environment associated with the existing session.
- Initiation of the new session or joining of an existing session can also be caused by other trigger events, such as detection of a keyword or key word phrase spoken by the supervisor.
- the action trigger in the virtual reality environment is the (virtual) proximity of the participants' avatars.
- a change in state of a monitored entity or set of monitored entities causes a corresponding change in a state of a monitored entity's virtual object.
- a change in a state of a contact center agent or group of which the contact center agent is a part causes a corresponding change in a state or appearance of the agent's avatar.
- a change in a performance metric associated with the agent, supervisor, predictive dialer, contact center, and/or call center can cause a corresponding change in an appearance of the agent's avatar to reflect the performance metric.
- exemplary performance metrics include a selected monitored entity percentage time in state by state identifier, a current state of the selected monitored entity, a skill of the selected monitored entity, a skill level for the selected monitored entity, a performance statistic of the selected monitored entity (e.g., an agent “scorecard”), and/or compliance with a contact center, predictive dialer, and/or call center performance statistic, policy, and/or objective (e.g., requirement regarding first-call resolution, service level/response time, average, predicted, forecasted, actual, or mean wait time, forecasting accuracy, (e.g, forecasted work load versus actual work load), contact quality, service level compliance rate, agent occupancy, conversion rate, up-sell/cross-sell rate, cost per call, and customer satisfaction (e.g., complaints, first contact resolution rate, transfer rate
- performance statistics include the time an agent spends reviewing daily alerts or special promotions, logged into a relevant application, organizing his or her workspace before servicing a contact, contact wrap-up activity, average or total contact handle time, average speed of answer, adherence to schedule (a measurement of how much time during an agent's shift he or she is logged in and handling contacts), contact quality (e.g., courtesy and professionalism, providing customers with correct and relevant information, first-contact resolution (e.g., one-and-done), and grammar and spelling in text communication), analyzed competency, number of escalations, agent peak performance index, transfer rate, communication skills, adherence to procedures, and customer satisfaction.
- a monitor re-arranges monitored virtual entities based on selected criteria (including the performance metrics above), such as for coaching or help.
- a monitor can select a performance metric threshold and organize into a subgroup those monitored virtual entities having in common a specified relationship (above or below) to the specified threshold.
- the subgroup can automatically be placed into a conference with the supervisor via a one-way or two-way virtual communication medium (e.g., live voice, email, text, chat, and the like), monitored as discussed above (such as shown in FIGS. 3A, 3B, and 4-5 ), or otherwise placed in spatial proximity to the monitor virtual entity in the virtual reality environment.
- a grouping of monitored virtual entities or objects can be assembled automatically and selected information conveyed or broadcast to the group.
- two-way communications between the supervisor and supervised entities would be enabled, such as to engage in a question-and-answer session.
- An example would be where a supervisor desires to convey information to agents regarding a performance metric falling below or rising above a selected threshold.
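- A minimal sketch of threshold-based subgrouping and broadcast; the conferencing object and its add_party/send methods are assumptions for illustration:

```python
def build_subgroup(agents, metric, threshold, below=True):
    """Select agents whose chosen metric is below (or above) the selected threshold."""
    if below:
        return [a for a in agents if a.get(metric, 0) < threshold]
    return [a for a in agents if a.get(metric, 0) > threshold]

def broadcast_to_subgroup(conference, agents, message):
    """Assemble the subgroup into a conference and convey the selected information."""
    for agent in agents:
        conference.add_party(agent["agent_id"])   # assumed conferencing API
    conference.send(message)
```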
- a monitor can customize the appearances of avatars associated with himself or herself and/or monitored entities to emphasize information important to the monitor.
- an avatar of a monitored entity can have a first appearance when performing to a higher level and a second different appearance when performing to a lower level.
- the virtual reality environment can be modified to suit the particular personality and beliefs of the monitor.
- a monitor avatar can have a first appearance to a first monitored entity and a second different appearance to a second monitored entity. In this manner, the virtual reality environment can be modified to suit the particular personality and beliefs of the monitor.
- spatially dislocated entities such as two supervisors, two agents, an agent and a subject matter expert, or a supervisor and agent, are able to communicate through the virtual reality environment.
- first and second spatially dislocated agents can converse with one another regarding a work-related topic via the virtual reality environment. This has the advantage of facilitating agent-to-agent interaction, with substantial benefits to the contact center, including better quality of customer service and better inter-employee relationships.
- the communication module 220 enables a supervisor to receive a queued call (on hold) with a customer by an action undertaken in the virtual reality environment by the supervisor's avatar relative to an avatar or icon or visual or graphical image associated with the call.
- Each of the calls in-queue has a corresponding avatar or icon or image, which may or may not be virtually shown as being related to a specific queue. Moving the supervisor avatar to a location spatially proximal to the avatar or icon or image would cause the communication module 220 to route the call to the supervisor's communication device 104 .
- the supervisor avatar can select an avatar or other virtual representation of a queued inbound or outbound contact, such as a contact on hold or initiated but not yet answered, and interact with the customer or potential customer on the other end of the contact, either through or outside of the virtual reality environment.
- the supervisor avatar would be connected by the communication module 220 with the customer and able to service the customer or otherwise interact with the customer.
- a trigger event for such intervention could be the perceived value or rating of the customer, a customer request, a wait time of the contact, or a sensed indicator by the customer of a need for immediate attention, such as detection by an interactive response unit of the contact center of customer frustration or anger.
- the trigger event would also work for a contact being serviced by an agent, in which event the supervisor would use this modality to join in the contact session.
- the supervisor avatar could move away from the virtual representation of the contact, which would cause the supervisor to be disconnected automatically from the contact and the contact either to be placed on hold again or to be disconnected as well.
- the supervisor interaction with the customer could be recorded and/or keyword or key word phrase analyzed and tagged or annotated for use by the agent selected to subsequently service the contact.
- this configuration could have the appearance of the supervisor avatar located in a corridor, room, or other defined area behind the agent avatars, with virtual representations of contacts virtually interacting with the agent avatars and progressing through a path of travel (which represents the queue) to the agent avatars.
- the virtual contact avatars, icons or visual or graphical images exit the agent avatar area through an exit. This type of customer interaction could also be possible for an agent avatar.
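- A sketch, under stated assumptions (hypothetical router and queue objects), of the queued-contact interaction: moving the supervisor avatar close to a queued contact's representation routes the contact to the supervisor, and moving away re-queues it on hold:

```python
def on_supervisor_position(router, queue, contact_id, distance, pickup_radius=1.5):
    """Route or re-queue a queued contact based on the supervisor avatar's proximity."""
    if distance <= pickup_radius:
        call = queue.take_off_hold(contact_id)           # assumed queue API
        router.route_to(call, "supervisor_device_104")   # assumed routing API
    else:
        router.disconnect("supervisor_device_104", contact_id)
        queue.put_on_hold(contact_id)                    # contact placed on hold again
```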
- a person represented by an avatar such as the supervisor, normally is not “in world”, or does not have an active avatar in the virtual reality environment, unless the person is logged into the virtual reality supervision module 108 . Due to the monitoring function, this arrangement is modified such that the virtual reality supervision module 108 , on its own initiative or at the request of a user, particularly the supervisor, has the ability to activate the avatars and other virtual representations of monitored entities 116 a - n even when they are not logged into the virtual reality supervision module 108 or are not “in world”. In this manner, a monitored entity does not have the ability to evade monitoring simply by “forgetting” to log into the virtual reality supervision module 108 .
- the supervisor can toggle into and out of the virtual reality environment to interact with monitored entities.
- the supervisor's avatar would be “on hold” or inactive but would, upon the supervisor toggling back into the virtual reality environment, become active again without the supervisor needing to log out and log back in.
- the control module 224 or interrupt module 212 receives a stimulus in step 600.
- the stimulus requires the virtual reality supervision module 108 to analyze the virtual reality environment and/or the behavior of the supervisor and/or monitored entities inside and/or outside of the virtual reality environment.
- the stimulus may be the passage of a predetermined period of time, a system interrupt, and/or a request by the supervisor and/or a monitored entity.
- in step 604, the user focus module 200 determines a focus of the user, as discussed above.
- the virtual reality environment rendering module 208 reconfigures the set of monitored entities based on the sensed user focus. Reconfiguration may include not only selecting new monitored entities for more intense monitoring but also accessing new or additional types of virtual and non-virtual reality environment information associated with the new monitored entities for presentation to the user.
- the virtual reality environment rendering module 208 determines whether or not to reorder, in the virtual reality environment, the members of the set of monitored entities. When reordering in the virtual reality environment is to be performed, the virtual reality environment rendering module 208, in step 616, reorders the members of the set based on the sensed or determined focus of the user.
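A possible, simplified rendering of steps 604 through 616 is sketched below; the focus_score heuristic and the entity fields are illustrative assumptions, not the disclosed algorithm, and the sketch only shows reordering the monitored set so that higher-focus entities come first.

```python
from dataclasses import dataclass

@dataclass
class Entity:
    entity_id: str
    gaze_hits: int        # e.g., how often the user's gaze or cursor touched it
    alarm_level: int      # pending alarms or trigger events for this entity

def focus_score(e: Entity) -> float:
    # Hypothetical weighting: alarms dominate, gaze breaks ties.
    return e.gaze_hits + 10 * e.alarm_level

def reorder_monitored_set(entities: list) -> list:
    """Return the set reordered so higher-focus entities are rendered first
    (and could be given richer virtual/non-virtual information)."""
    return sorted(entities, key=focus_score, reverse=True)

agents = [Entity("a1", 2, 0), Entity("a2", 0, 3), Entity("a3", 5, 0)]
print([e.entity_id for e in reorder_monitored_set(agents)])  # ['a2', 'a3', 'a1']
```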
- the interrupt module 216 determines whether the stimulus was an interrupt, such as an interrupt resulting from detection of one or more trigger events or triggers. When the stimulus is not an interrupt, the virtual reality supervision module 108 returns to step 600. When the stimulus is an interrupt, the interrupt module 216, in decision diamond 624, determines the type of interrupt.
- There are five types of interrupts shown; however, the system is not limited to the interrupts specified.
- a first type of interrupt is triggered by a request by the supervisor, an agent, and/or the virtual reality supervision module 108 for a resource, such as a subject matter expert or an interactive response unit (e.g., an interactive voice response unit).
- the virtual reality supervision module 108 determines the availability of the requested resource and, in step 632, provides the resource when available. Availability may be determined on any suitable basis, such as first-come-first-served.
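As a non-limiting illustration of first-come-first-served resource allocation for steps 628 and 632, the following sketch could be used; the ResourcePool class and its method names are assumptions, and only the FCFS policy is taken from the text.

```python
from collections import deque

class ResourcePool:
    def __init__(self, resources):
        self.available = set(resources)      # e.g., subject matter experts, interactive response units
        self.waiting = deque()               # requesters served in arrival order

    def request(self, requester):
        self.waiting.append(requester)
        return self._dispatch()

    def release(self, resource):
        self.available.add(resource)
        return self._dispatch()

    def _dispatch(self):
        # Grant resources strictly in arrival order while any remain available.
        granted = []
        while self.waiting and self.available:
            granted.append((self.waiting.popleft(), self.available.pop()))
        return granted

pool = ResourcePool({"sme-1"})
print(pool.request("supervisor"))   # [('supervisor', 'sme-1')]
print(pool.request("agent-7"))      # [] - waits until a resource is released
print(pool.release("sme-1"))        # [('agent-7', 'sme-1')]
```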
- a second type of interrupt is triggered by a monitored entity, such as described above.
- the control module 224, in decision diamond 636, determines whether or not to accept the action associated with the interrupt. This decision may be made by the control module 224 alone or based on a response from the supervisor.
- the virtual reality supervision module 108 returns to step 600.
- a third type of interrupt is triggered by a supervisor and can be of numerous sub-types, which are determined in decision diamond 644.
- a first sub-type is when the supervisor toggles out of, or leaves, the virtual reality environment without logging out. In that event, the virtual reality environment rendering module 208, in step 648, suspends the avatar associated with the supervisor.
- a second sub-type is when the supervisor toggles in, or re-enters, the virtual reality environment without logging in.
- the virtual reality environment rendering module 208, in step 652, reactivates the avatar associated with the supervisor.
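The toggle-out and toggle-in sub-types of steps 648 and 652 can be pictured as a small state machine, sketched below under illustrative assumptions; the behavior it models, suspension on toggle-out without logging out and reactivation on toggle-in without logging back in, follows the description.

```python
class SupervisorAvatar:
    def __init__(self):
        self.logged_in = True
        self.state = "active"

    def toggle_out(self):
        if self.logged_in and self.state == "active":
            self.state = "suspended"      # avatar placed "on hold"; the login session is kept

    def toggle_in(self):
        if self.logged_in and self.state == "suspended":
            self.state = "active"         # reactivated with no fresh login needed

avatar = SupervisorAvatar()
avatar.toggle_out(); print(avatar.state)  # suspended
avatar.toggle_in();  print(avatar.state)  # active
```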
- a third sub-type is from the supervisor or a non-virtual reality environment application, such as a contact center, predictive dialer or call center application (or a monitored component thereof such as an agent group, queue, queue group, trunk, or trunk group).
- the sub-type is associated with a command, request, or alarm.
- the virtual reality environment rendering module 208 takes the appropriate action.
- a fourth type of interrupt is triggered by the virtual reality environment rendering module 208 itself, such as described above.
- the control module 224, in step 660, maps the interrupt, or trigger event, against a rule set and selects and applies an appropriate rule.
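A minimal sketch of the rule mapping of step 660 follows; the Rule structure, the example predicates, and the actions are hypothetical placeholders for whatever rule set the control module 224 applies.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    matches: Callable   # predicate over the trigger event
    action: Callable    # what to do when the rule applies

rule_set = [
    Rule(lambda ev: ev.get("type") == "queue_overflow",
         lambda ev: print("alerting supervisor about", ev["queue"])),
    Rule(lambda ev: ev.get("type") == "agent_idle",
         lambda ev: print("highlighting avatar of", ev["agent"])),
]

def apply_rules(event: dict, rules=rule_set) -> bool:
    # Apply the first matching rule; report whether any rule fired.
    for rule in rules:
        if rule.matches(event):
            rule.action(event)
            return True
    return False        # no rule matched; the event is ignored or escalated elsewhere

apply_rules({"type": "queue_overflow", "queue": "billing"})
```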
- a fifth type of interrupt is triggered by a customer, as discussed above.
- the interrupt module 216 determines whether or not to accept the action associated with the interrupt.
- the interrupt module 216, in step 668, creates an appropriate set of data structures and/or creates appropriate communication path(s) to enable the supervisor to interact with the customer.
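Step 668 can be illustrated with the following sketch, in which the Session data structure and the open_path callback are assumptions standing in for whatever data structures and media bridging the interrupt module 216 actually creates.

```python
from dataclasses import dataclass, field
import itertools

_session_ids = itertools.count(1)

@dataclass
class Session:
    session_id: int
    supervisor_id: str
    customer_id: str
    media: list = field(default_factory=lambda: ["voice"])

def connect_supervisor_to_customer(supervisor_id: str, customer_id: str,
                                   open_path=print) -> Session:
    """Create the bookkeeping record and open a communication path."""
    session = Session(next(_session_ids), supervisor_id, customer_id)
    # open_path stands in for whatever actually bridges the media streams.
    open_path(f"bridging {supervisor_id} <-> {customer_id} (session {session.session_id})")
    return session

connect_supervisor_to_customer("sup-1", "cust-88")
```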
- the virtual reality supervision module 108 proceeds to step 640 and terminates operation with respect to the interrupt.
- certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system.
- the components of the system can be combined into one or more devices, such as a server, or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network.
- the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system.
- the various components can be located in a switch such as a PBX and media server, gateway, in one or more communications devices, at one or more users' premises, or some combination thereof.
- one or more functional portions of the system could be distributed between a telecommunications device(s) and an associated computing device.
- the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later developed element(s) that is capable of supplying and/or communicating data to and from the connected elements.
- These wired or wireless links can also be secure links and may be capable of communicating encrypted information.
- Transmission media used as links can be any suitable carrier for electrical signals, including coaxial cables, copper wire and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
- the monitoring system is used for other monitoring applications in which the supervisor or security personnel can be spread too thin, such as:
  - monitoring of testing or other types of facilities, in which the monitored entity is the facility or a segment or component thereof and the supervisor is security, scientists, or other personnel responsible for the operation or maintenance of the facility;
  - monitoring of classrooms or other teaching facilities, in which the monitored entities are the pupils or students and the supervisor is the teacher or instructor;
  - monitoring of security cameras, in which the monitored entities are the security cameras and the supervisor is security personnel responsible for the operation, maintenance, and/or monitoring of the cameras; and
  - Transportation Security Administration (TSA)-type transportation screening, in which the monitored entities are the TSA security personnel and/or passengers and the supervisor is the TSA manager over the security personnel or the TSA personnel, as appropriate.
- the supervisor would virtually interact with a virtual representation of the passenger and, by the virtual interaction, access the passenger's credentials, ticket information, baggage x-ray images or results, and/or security screening results.
- the monitoring system is used for monitoring students or pupils remotely.
- the teacher or instructor would have a corresponding avatar viewable by the students and the students, or the monitored entities, would have corresponding avatars viewable not only by the teacher but also by one another.
- the avatars would be human-like and have human-like capabilities.
- the student avatars would be arranged before the teacher or instructor avatar in a classroom-like setting at desks or other stations. For instance, student avatars could raise their virtual hands, speak up virtually, or ask a textual question virtually. As each student avatar performs one of these activities or is otherwise prompted by the teacher or instructor, the virtual classroom would be rearranged such that the student avatar and/or his or her desk would come to the front of the virtual classroom to the attention of the other avatars.
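The classroom rearrangement just described can be pictured with a small sketch; the seating model below is an assumption made only for illustration of moving a prompted student avatar to the front of the virtual classroom.

```python
def bring_to_front(seating: list, student: str) -> list:
    """Return a new seating order with `student` moved to the front (index 0),
    keeping the relative order of everyone else."""
    return [student] + [s for s in seating if s != student]

classroom = ["ana", "bo", "chen", "dee"]
classroom = bring_to_front(classroom, "chen")   # chen raised a virtual hand
print(classroom)   # ['chen', 'ana', 'bo', 'dee']
```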
- the systems and methods of this disclosure can be implemented in conjunction with a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, a programmable logic device or gate array such as a PLD, PLA, FPGA, or PAL, any comparable means, or the like.
- any device(s) or means capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure.
- Exemplary hardware that can be used for the disclosed embodiments, configurations and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet enabled, digital, analog, hybrids, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, nonvolatile storage, input devices, and output devices.
- alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
- the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms.
- the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure is dependent on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
- the disclosed methods may be partially implemented in software that can be stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like.
- the systems and methods of this disclosure can be implemented as a program embedded on a personal computer, such as an applet, JAVA® or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system or system component, or the like.
- the system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
- the present disclosure, in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, subcombinations, and/or subsets thereof.
- the present disclosure, in various aspects, embodiments, and/or configurations, includes providing devices and processes in the absence of items not depicted and/or described herein or in various aspects, embodiments, and/or configurations hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease and/or reducing cost of implementation.
Landscapes
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Engineering & Computer Science (AREA)
- Strategic Management (AREA)
- Entrepreneurship & Innovation (AREA)
- Operations Research (AREA)
- Economics (AREA)
- Marketing (AREA)
- Data Mining & Analysis (AREA)
- Quality & Reliability (AREA)
- Tourism & Hospitality (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Information Transfer Between Computers (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (35)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/471,890 US9349118B2 (en) | 2011-08-29 | 2012-05-15 | Input, display and monitoring of contact center operation in a virtual reality environment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161528521P | 2011-08-29 | 2011-08-29 | |
US13/471,890 US9349118B2 (en) | 2011-08-29 | 2012-05-15 | Input, display and monitoring of contact center operation in a virtual reality environment |
Publications (2)
Publication Number | Publication Date |
---|---|
US20130050199A1 US20130050199A1 (en) | 2013-02-28 |
US9349118B2 true US9349118B2 (en) | 2016-05-24 |
Family
ID=47742990
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/471,890 Active 2033-11-04 US9349118B2 (en) | 2011-08-29 | 2012-05-15 | Input, display and monitoring of contact center operation in a virtual reality environment |
US13/471,931 Active 2033-08-24 US9105013B2 (en) | 2011-08-29 | 2012-05-15 | Agent and customer avatar presentation in a contact center virtual reality environment |
US13/471,954 Active 2033-09-26 US9251504B2 (en) | 2011-08-29 | 2012-05-15 | Configuring a virtual reality environment in a contact center |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/471,931 Active 2033-08-24 US9105013B2 (en) | 2011-08-29 | 2012-05-15 | Agent and customer avatar presentation in a contact center virtual reality environment |
US13/471,954 Active 2033-09-26 US9251504B2 (en) | 2011-08-29 | 2012-05-15 | Configuring a virtual reality environment in a contact center |
Country Status (1)
Country | Link |
---|---|
US (3) | US9349118B2 (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170078236A1 (en) * | 2015-09-16 | 2017-03-16 | CrowdReach, LLC | Systems, computing devices, and methods for facilitating communication to multiple contacts via multiple, different communication modalities |
US10109095B2 (en) | 2016-12-08 | 2018-10-23 | Bank Of America Corporation | Facilitating dynamic across-network location determination using augmented reality display devices |
US10109096B2 (en) | 2016-12-08 | 2018-10-23 | Bank Of America Corporation | Facilitating dynamic across-network location determination using augmented reality display devices |
US10158634B2 (en) | 2016-11-16 | 2018-12-18 | Bank Of America Corporation | Remote document execution and network transfer using augmented reality display devices |
US10210767B2 (en) | 2016-12-13 | 2019-02-19 | Bank Of America Corporation | Real world gamification using augmented reality user devices |
US10212157B2 (en) | 2016-11-16 | 2019-02-19 | Bank Of America Corporation | Facilitating digital data transfers using augmented reality display devices |
US10217375B2 (en) | 2016-12-13 | 2019-02-26 | Bank Of America Corporation | Virtual behavior training using augmented reality user devices |
US10311223B2 (en) | 2016-12-02 | 2019-06-04 | Bank Of America Corporation | Virtual reality dynamic authentication |
US10339583B2 (en) | 2016-11-30 | 2019-07-02 | Bank Of America Corporation | Object recognition and analysis using augmented reality user devices |
US10362167B2 (en) * | 2013-06-20 | 2019-07-23 | Avaya Inc. | Proximity based interactions with wallboards |
US10481862B2 (en) | 2016-12-02 | 2019-11-19 | Bank Of America Corporation | Facilitating network security analysis using virtual reality display devices |
US10586220B2 (en) | 2016-12-02 | 2020-03-10 | Bank Of America Corporation | Augmented reality dynamic authentication |
US10600111B2 (en) | 2016-11-30 | 2020-03-24 | Bank Of America Corporation | Geolocation notifications using augmented reality user devices |
US10607230B2 (en) | 2016-12-02 | 2020-03-31 | Bank Of America Corporation | Augmented reality dynamic authentication for electronic transactions |
US10685386B2 (en) | 2016-11-30 | 2020-06-16 | Bank Of America Corporation | Virtual assessments using augmented reality user devices |
US10943229B2 (en) | 2016-11-29 | 2021-03-09 | Bank Of America Corporation | Augmented reality headset and digital wallet |
US20220292254A1 (en) * | 2021-03-15 | 2022-09-15 | Avaya Management L.P. | Systems and methods for processing and displaying messages in digital communications |
US11553087B1 (en) * | 2020-09-29 | 2023-01-10 | Amazon Technologies, Inc. | Communication session using a virtual environment |
US11785140B2 (en) | 2020-09-23 | 2023-10-10 | Avaya Management L.P. | Gesture-based call center agent state change control |
US11799920B1 (en) | 2023-03-09 | 2023-10-24 | Bank Of America Corporation | Uninterrupted VR experience during customer and virtual agent interaction |
Families Citing this family (66)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9053196B2 (en) | 2008-05-09 | 2015-06-09 | Commerce Studios Llc, Inc. | Methods for interacting with and manipulating information and systems thereof |
EP2610724B1 (en) * | 2011-12-27 | 2022-01-05 | Tata Consultancy Services Limited | A system and method for online user assistance |
US9582332B2 (en) * | 2012-08-31 | 2017-02-28 | Intel Corporation | Enabling a cloud to effectively assign workloads to servers |
US8792633B2 (en) | 2012-09-07 | 2014-07-29 | Genesys Telecommunications Laboratories, Inc. | Method of distributed aggregation in a call center |
US9900432B2 (en) | 2012-11-08 | 2018-02-20 | Genesys Telecommunications Laboratories, Inc. | Scalable approach to agent-group state maintenance in a contact center |
US9756184B2 (en) | 2012-11-08 | 2017-09-05 | Genesys Telecommunications Laboratories, Inc. | System and method of distributed maintenance of contact center state |
US9477464B2 (en) | 2012-11-20 | 2016-10-25 | Genesys Telecommunications Laboratories, Inc. | Distributed aggregation for contact center agent-groups on sliding interval |
US10412121B2 (en) | 2012-11-20 | 2019-09-10 | Genesys Telecommunications Laboratories, Inc. | Distributed aggregation for contact center agent-groups on growing interval |
US20140188552A1 (en) * | 2013-01-02 | 2014-07-03 | Lap Chan | Methods and systems to reach target customers at the right time via personal and professional mood analysis |
US20140279800A1 (en) * | 2013-03-14 | 2014-09-18 | Agincourt Gaming Llc | Systems and Methods for Artificial Intelligence Decision Making in a Virtual Environment |
US8767948B1 (en) * | 2013-03-15 | 2014-07-01 | Genesys Telecommunications Laboratories, Inc. | Back office services of an intelligent automated agent for a contact center |
US9578171B2 (en) | 2013-03-26 | 2017-02-21 | Genesys Telecommunications Laboratories, Inc. | Low latency distributed aggregation for contact center agent-groups on sliding interval |
US20150088761A1 (en) * | 2013-09-20 | 2015-03-26 | International Business Machines Corporation | Implementing a bargaining strategy between teams with majority voting |
US9571644B2 (en) | 2013-11-20 | 2017-02-14 | Avaya Inc. | Contact advocate |
US9444940B2 (en) * | 2013-11-20 | 2016-09-13 | Avaya Inc. | Pseudo agent matching |
US9805320B2 (en) * | 2014-02-27 | 2017-10-31 | Genesys Telecommunications Laboratories, Inc. | Tag-based performance framework for contact center |
US9641686B1 (en) * | 2014-03-21 | 2017-05-02 | Amazon Technologies, Inc. | Method for using customer attributes to select a service representative |
US10839409B1 (en) | 2014-04-30 | 2020-11-17 | Wells Fargo Bank, N.A. | Augmented reality store and services orientation gamification |
US10726473B1 (en) | 2014-04-30 | 2020-07-28 | Wells Fargo Bank, N.A. | Augmented reality shopping rewards |
US10395292B1 (en) | 2014-04-30 | 2019-08-27 | Wells Fargo Bank, N.A. | Augmented reality electronic device using facial recognition functionality and displaying shopping reward at retail locations |
US20150347421A1 (en) * | 2014-05-29 | 2015-12-03 | Avaya Inc. | Graph database for a contact center |
US9531880B2 (en) * | 2014-06-04 | 2016-12-27 | Avaya Inc. | Optimization in workforce management using work assignment engine data |
US10178473B2 (en) * | 2014-09-05 | 2019-01-08 | Plantronics, Inc. | Collection and analysis of muted audio |
US9338404B1 (en) * | 2014-12-23 | 2016-05-10 | Verizon Patent And Licensing Inc. | Communication in a virtual reality environment |
US10529030B2 (en) * | 2015-01-09 | 2020-01-07 | Conduent Business Services, Llc | System and method for labeling messages from customer-agent interactions on social media to identify an issue and a response |
US10097953B1 (en) | 2015-06-13 | 2018-10-09 | United Services Automobile Association (Usaa) | Network based resource management and allocation |
US9596349B1 (en) * | 2015-06-29 | 2017-03-14 | State Farm Mutual Automobile Insurance Company | Voice and speech recognition for call center feedback and quality assurance |
US10412194B1 (en) * | 2015-12-10 | 2019-09-10 | Massachusetts Mutual Life Insurance Company | Systems and methods for managing computer-based requests |
US9761056B1 (en) * | 2016-03-10 | 2017-09-12 | Immersv, Inc. | Transitioning from a virtual reality application to an application install |
DE112017002784T5 (en) * | 2016-06-02 | 2019-02-14 | Sony Corporation | Display controller and display control method, display device and mobile device |
US11108708B2 (en) | 2016-06-06 | 2021-08-31 | Global Tel*Link Corporation | Personalized chatbots for inmates |
US10289261B2 (en) | 2016-06-29 | 2019-05-14 | Paypal, Inc. | Visualization of spending data in an altered reality |
EP3264228A1 (en) * | 2016-06-30 | 2018-01-03 | Nokia Technologies Oy | Mediated reality |
US9596350B1 (en) * | 2016-07-21 | 2017-03-14 | Genesys Telecommunications Laboratories, Inc. | Virtual interactions in contact center operations |
US20180067717A1 (en) * | 2016-09-02 | 2018-03-08 | Allomind, Inc. | Voice-driven interface to control multi-layered content in a head mounted display |
CN106385682A (en) * | 2016-09-21 | 2017-02-08 | 平越 | Multi-functional virtual reality entertainment system and method thereof |
US10484643B2 (en) | 2016-11-10 | 2019-11-19 | Avaya Inc. | Intelligent contact recording in a virtual reality contact center |
US10595012B2 (en) | 2016-12-02 | 2020-03-17 | Google Llc | Representations of event notifications in virtual reality |
US10404804B2 (en) * | 2017-01-30 | 2019-09-03 | Global Tel*Link Corporation | System and method for personalized virtual reality experience in a controlled environment |
WO2018195280A1 (en) * | 2017-04-21 | 2018-10-25 | Walmart Apollo, Llc | Virtual reality network management user interface |
US10824293B2 (en) * | 2017-05-08 | 2020-11-03 | International Business Machines Corporation | Finger direction based holographic object interaction from a distance |
US20180342109A1 (en) * | 2017-05-25 | 2018-11-29 | Thomson Licensing | Determining full-body pose for a virtual reality environment |
US10091358B1 (en) * | 2017-06-26 | 2018-10-02 | Splunk Inc. | Graphical user interface for call center analysis |
US10069972B1 (en) * | 2017-06-26 | 2018-09-04 | Splunk, Inc. | Call center analysis |
US20190025906A1 (en) | 2017-07-21 | 2019-01-24 | Pearson Education, Inc. | Systems and methods for virtual reality-based assessment |
US10748525B2 (en) * | 2017-12-11 | 2020-08-18 | International Business Machines Corporation | Multi-modal dialog agents representing a level of confidence in analysis |
US10901687B2 (en) | 2018-02-27 | 2021-01-26 | Dish Network L.L.C. | Apparatus, systems and methods for presenting content reviews in a virtual world |
US20200380486A1 (en) * | 2018-03-20 | 2020-12-03 | Rocky Jerome Wright | Augmented reality and messaging |
CN108771866B (en) * | 2018-05-29 | 2021-09-24 | 网易(杭州)网络有限公司 | Virtual object control method and device in virtual reality |
CA3101001A1 (en) * | 2018-07-19 | 2020-01-23 | Soul Machines Limited | Machine interaction |
US10606345B1 (en) * | 2018-09-25 | 2020-03-31 | XRSpace CO., LTD. | Reality interactive responding system and reality interactive responding method |
US11538045B2 (en) | 2018-09-28 | 2022-12-27 | Dish Network L.L.C. | Apparatus, systems and methods for determining a commentary rating |
CN109472913A (en) * | 2018-10-23 | 2019-03-15 | 广东觅游信息科技有限公司 | Business hall intelligent queuing method and system |
US11403933B2 (en) * | 2019-05-06 | 2022-08-02 | Teleperformance Se | Systems and methods for implementing and using a proximity dashboard |
CN110225089A (en) * | 2019-05-09 | 2019-09-10 | 厦门网宿有限公司 | It is a kind of that the method and system of differentiation cloud desktop is provided |
US11531950B2 (en) * | 2019-12-30 | 2022-12-20 | Liveperson, Inc. | Dynamic analytics and forecasting for messaging staff |
US11437017B2 (en) * | 2020-07-23 | 2022-09-06 | International Business Machines Corporation | Embodied negotiation agent and platform |
US11691076B2 (en) * | 2020-08-10 | 2023-07-04 | Jocelyn Tan | Communication with in-game characters |
US11172163B1 (en) * | 2021-01-29 | 2021-11-09 | Zoom Video Communications, Inc. | Video call queues |
US20220270505A1 (en) * | 2021-02-24 | 2022-08-25 | Interactive Video Images, Inc. | Interactive Avatar Training System |
US20230156157A1 (en) * | 2021-11-15 | 2023-05-18 | Lemon Inc. | Facilitating collaboration in a work environment |
US11677908B2 (en) | 2021-11-15 | 2023-06-13 | Lemon Inc. | Methods and systems for facilitating a collaborative work environment |
US11887266B2 (en) * | 2022-05-23 | 2024-01-30 | The Government of the United States of America, as represented by the Secretary of Homeland Security | Augmented reality security screening and guidance |
US20230376969A1 (en) * | 2022-05-23 | 2023-11-23 | Bank Of America Corporation | Providing customer service within a metaverse |
JP7571981B2 (en) | 2022-09-12 | 2024-10-23 | 株式会社アバターディスパッチ | Customer service server, customer service system, customer service method, and program |
US12100088B2 (en) * | 2022-12-30 | 2024-09-24 | Theai, Inc. | Recognition of intent of artificial intelligence characters |
-
2012
- 2012-05-15 US US13/471,890 patent/US9349118B2/en active Active
- 2012-05-15 US US13/471,931 patent/US9105013B2/en active Active
- 2012-05-15 US US13/471,954 patent/US9251504B2/en active Active
Patent Citations (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999017189A1 (en) * | 1997-09-30 | 1999-04-08 | Mci Communications Corporation | Monitoring system client for a call center |
US6987846B1 (en) * | 2000-05-31 | 2006-01-17 | Rockwell Electronic Commerce Technologies, Llc | System and method of changing entity configuration information while automatically monitoring and displaying such information in a call center |
US20100299180A1 (en) * | 2005-06-10 | 2010-11-25 | Odysseas Tsatalos | Virtual Office Environment |
US20080139315A1 (en) | 2006-12-08 | 2008-06-12 | Nortel Networks Limited | Provision of Contact Center Services To Players of Games |
US8601386B2 (en) * | 2007-04-20 | 2013-12-03 | Ingenio Llc | Methods and systems to facilitate real time communications in virtual reality |
US20080275794A1 (en) | 2007-05-03 | 2008-11-06 | Emma E. Aguirre | Virtual real estate office |
US8082297B2 (en) | 2007-05-24 | 2011-12-20 | Avaya, Inc. | Method and apparatus for managing communication between participants in a virtual environment |
US20100169444A1 (en) | 2007-07-11 | 2010-07-01 | International Business Machines Corporation | Method, system and program product for assigning a responder to a requester in a collaborative environment |
US20100313147A1 (en) | 2007-10-22 | 2010-12-09 | Michael Hartman | Representations of communications sessions in virtual environments |
US20100306021A1 (en) | 2007-10-22 | 2010-12-02 | O'connor Neil | Contact center integration into virtual environments |
US20100257464A1 (en) * | 2008-02-28 | 2010-10-07 | Chevron U.S.A. Inc. | System and method for immersive operations intelligence |
US20090240359A1 (en) * | 2008-03-18 | 2009-09-24 | Nortel Networks Limited | Realistic Audio Communication in a Three Dimensional Computer-Generated Virtual Environment |
US20090241126A1 (en) | 2008-03-18 | 2009-09-24 | International Business Machines Corporation | Service and Commerce Based Cookies and Notification |
US20090251457A1 (en) | 2008-04-03 | 2009-10-08 | Cisco Technology, Inc. | Reactive virtual environment |
US20090307051A1 (en) | 2008-06-10 | 2009-12-10 | Finn Peter G | Consumer rating and customer service based thereon within a virtual universe |
US8655674B2 (en) | 2008-06-24 | 2014-02-18 | International Business Machines Corporation | Personal service assistance in a virtual universe |
US20100083140A1 (en) | 2008-09-26 | 2010-04-01 | International Business Machines Corporation | Method, system, and program product for variable rendering of virtual universe avatars |
US20100235218A1 (en) | 2008-09-29 | 2010-09-16 | Avaya Inc. | Pre-qualified or history-based customer service |
US20100162121A1 (en) | 2008-12-22 | 2010-06-24 | Nortel Networks Limited | Dynamic customization of a virtual world |
US20100169795A1 (en) | 2008-12-28 | 2010-07-01 | Nortel Networks Limited | Method and Apparatus for Interrelating Virtual Environment and Web Content |
US20100169796A1 (en) | 2008-12-28 | 2010-07-01 | Nortel Networks Limited | Visual Indication of Audio Context in a Computer-Generated Virtual Environment |
US20100164956A1 (en) * | 2008-12-28 | 2010-07-01 | Nortel Networks Limited | Method and Apparatus for Monitoring User Attention with a Computer-Generated Virtual Environment |
US20100169837A1 (en) | 2008-12-29 | 2010-07-01 | Nortel Networks Limited | Providing Web Content in the Context of a Virtual Environment |
US20100169798A1 (en) | 2008-12-29 | 2010-07-01 | Nortel Networks Limited | Visual Indication of User Interests in a Computer-Generated Virtual Environment |
US20100169797A1 (en) | 2008-12-29 | 2010-07-01 | Nortel Networks, Limited | User Interface for Orienting New Users to a Three Dimensional Computer-Generated Virtual Environment |
US20100169486A1 (en) | 2008-12-30 | 2010-07-01 | Nortel Networks Limited | Access to resources in a virtual environment |
US20100169799A1 (en) | 2008-12-30 | 2010-07-01 | Nortel Networks Limited | Method and Apparatus for Enabling Presentations to Large Numbers of Users in a Virtual Environment |
US20100296417A1 (en) | 2009-05-20 | 2010-11-25 | Avaya Inc. | Grid-based contact center |
US8386930B2 (en) * | 2009-06-05 | 2013-02-26 | International Business Machines Corporation | Contextual data center management utilizing a virtual environment |
US20110055927A1 (en) | 2009-08-27 | 2011-03-03 | International Business Machines Corporation | Updating assets rendered in a virtual world environment based on detected user interactions in another world |
US20110066938A1 (en) | 2009-09-15 | 2011-03-17 | Verizon Patent And Licensing Inc. | Method and system for mutidimensional virtual online support center |
US20110075819A1 (en) | 2009-09-30 | 2011-03-31 | International Business Machines Corporation | Customer support center with virtual world enhancements |
US20110125826A1 (en) | 2009-11-20 | 2011-05-26 | Avaya Inc. | Stalking social media users to maximize the likelihood of immediate engagement |
US20110125793A1 (en) | 2009-11-20 | 2011-05-26 | Avaya Inc. | Method for determining response channel for a contact center from historic social media postings |
US20110255683A1 (en) | 2010-04-14 | 2011-10-20 | Avaya Inc. | One-to-one matching in a contact center |
Non-Patent Citations (13)
Title |
---|
"Avaya web.alive(TM) Enterprise Collaboration Experience," Avaya, Aug. 2011, 4 pages. |
"Avaya web.alive™ Enterprise Collaboration Experience," Avaya, Aug. 2011, 4 pages. |
Avaya-The Power of We; "Avaya web.alive™ Enterprise Collaboration Experience"; 4 pages, ~Aug. 2011. |
Avaya Compact Contact Center-System Supervisor's Manual; 40DHB0002UKDV Issue 1 (Apr. 2003); 96 pages. |
Avaya web.alive Fact Sheet, Avaya, Feb. 2011, 4 pages. |
Avaya-Avaya web.alive; Fact Sheet; 4 pages, ~Feb. 2011. |
Notice of Allowance for U.S. Appl. No. 13/471,931, mailed Mar. 31, 2015 12 pages. |
Notice of Allowance for U.S. Appl. No. 13/471,954, mailed Aug. 24, 2015 5 pages. |
Official Action for U.S. Appl. No. 13/471,954, mailed May 1, 2015 9 pages. |
U.S. Appl. No. 13/471,931, filed May 15, 2012, Chavez. |
U.S. Appl. No. 13/471,954, filed May 15, 2012, Chavez. |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10362167B2 (en) * | 2013-06-20 | 2019-07-23 | Avaya Inc. | Proximity based interactions with wallboards |
US10652195B2 (en) * | 2015-09-16 | 2020-05-12 | CrowdReach, LLC | Systems, computing devices, and methods for facilitating communication to multiple contacts via multiple, different communication modalities |
US20170078236A1 (en) * | 2015-09-16 | 2017-03-16 | CrowdReach, LLC | Systems, computing devices, and methods for facilitating communication to multiple contacts via multiple, different communication modalities |
US10979425B2 (en) | 2016-11-16 | 2021-04-13 | Bank Of America Corporation | Remote document execution and network transfer using augmented reality display devices |
US10158634B2 (en) | 2016-11-16 | 2018-12-18 | Bank Of America Corporation | Remote document execution and network transfer using augmented reality display devices |
US10212157B2 (en) | 2016-11-16 | 2019-02-19 | Bank Of America Corporation | Facilitating digital data transfers using augmented reality display devices |
US10462131B2 (en) | 2016-11-16 | 2019-10-29 | Bank Of America Corporation | Remote document execution and network transfer using augmented reality display devices |
US10943229B2 (en) | 2016-11-29 | 2021-03-09 | Bank Of America Corporation | Augmented reality headset and digital wallet |
US10600111B2 (en) | 2016-11-30 | 2020-03-24 | Bank Of America Corporation | Geolocation notifications using augmented reality user devices |
US10685386B2 (en) | 2016-11-30 | 2020-06-16 | Bank Of America Corporation | Virtual assessments using augmented reality user devices |
US10679272B2 (en) | 2016-11-30 | 2020-06-09 | Bank Of America Corporation | Object recognition and analysis using augmented reality user devices |
US10339583B2 (en) | 2016-11-30 | 2019-07-02 | Bank Of America Corporation | Object recognition and analysis using augmented reality user devices |
US10607230B2 (en) | 2016-12-02 | 2020-03-31 | Bank Of America Corporation | Augmented reality dynamic authentication for electronic transactions |
US11288679B2 (en) | 2016-12-02 | 2022-03-29 | Bank Of America Corporation | Augmented reality dynamic authentication for electronic transactions |
US10481862B2 (en) | 2016-12-02 | 2019-11-19 | Bank Of America Corporation | Facilitating network security analysis using virtual reality display devices |
US10311223B2 (en) | 2016-12-02 | 2019-06-04 | Bank Of America Corporation | Virtual reality dynamic authentication |
US10586220B2 (en) | 2016-12-02 | 2020-03-10 | Bank Of America Corporation | Augmented reality dynamic authentication |
US11710110B2 (en) | 2016-12-02 | 2023-07-25 | Bank Of America Corporation | Augmented reality dynamic authentication |
US10999313B2 (en) | 2016-12-02 | 2021-05-04 | Bank Of America Corporation | Facilitating network security analysis using virtual reality display devices |
US10109096B2 (en) | 2016-12-08 | 2018-10-23 | Bank Of America Corporation | Facilitating dynamic across-network location determination using augmented reality display devices |
US10109095B2 (en) | 2016-12-08 | 2018-10-23 | Bank Of America Corporation | Facilitating dynamic across-network location determination using augmented reality display devices |
US10217375B2 (en) | 2016-12-13 | 2019-02-26 | Bank Of America Corporation | Virtual behavior training using augmented reality user devices |
US10210767B2 (en) | 2016-12-13 | 2019-02-19 | Bank Of America Corporation | Real world gamification using augmented reality user devices |
US11785140B2 (en) | 2020-09-23 | 2023-10-10 | Avaya Management L.P. | Gesture-based call center agent state change control |
US11553087B1 (en) * | 2020-09-29 | 2023-01-10 | Amazon Technologies, Inc. | Communication session using a virtual environment |
US20220292254A1 (en) * | 2021-03-15 | 2022-09-15 | Avaya Management L.P. | Systems and methods for processing and displaying messages in digital communications |
US11556696B2 (en) * | 2021-03-15 | 2023-01-17 | Avaya Management L.P. | Systems and methods for processing and displaying messages in digital communications |
US11799920B1 (en) | 2023-03-09 | 2023-10-24 | Bank Of America Corporation | Uninterrupted VR experience during customer and virtual agent interaction |
Also Published As
Publication number | Publication date |
---|---|
US20130050199A1 (en) | 2013-02-28 |
US9105013B2 (en) | 2015-08-11 |
US9251504B2 (en) | 2016-02-02 |
US20130051547A1 (en) | 2013-02-28 |
US20130051548A1 (en) | 2013-02-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9349118B2 (en) | Input, display and monitoring of contact center operation in a virtual reality environment | |
US11755978B2 (en) | System for providing dynamic recommendations based on interactions in retail stores | |
US11088971B2 (en) | Virtual area communications | |
Saatçi et al. | (re) configuring hybrid meetings: Moving from user-centered design to meeting-centered design | |
US10878226B2 (en) | Sentiment analysis in a video conference | |
US9621731B2 (en) | Controlling conference calls | |
US10931827B1 (en) | Database allocation and analytics for service call centers | |
WO2022086954A1 (en) | Methods and systems for simulating in-person interactions in virtual environments | |
US20060256953A1 (en) | Method and system for improving workforce performance in a contact center | |
US20160036981A1 (en) | System and method for case-based routing for a contact | |
US20160036982A1 (en) | System and method for anticipatory dynamic customer segmentation for a contact center | |
US11849245B2 (en) | Participation management system for videoconferencing | |
US20230132588A1 (en) | Adaptive integrated, and interactive education and communication system for people with hearing loss and hearing healthcare providers and related methods | |
Uchihira et al. | Collaboration management by smart voice messaging for physical and adaptive intelligent services | |
CN106797382A (en) | For the system and method for the route based on event of call center | |
US20210075750A1 (en) | Method for controlling display of a consultation session | |
Lim | Beyond Fatiguing Virtual Meetings: How Should Virtual Meetings for Workplaces Be Supported? | |
Luo | Models of reference services | |
Lewis et al. | Co-evolution: exploring synergies between Artificial Intelligence (AI) and the supervisor | |
Klokmose | Banu Saatçi1*, Kaya Akyüz2*, Sean Rintel3 & | |
Danninger | Intelligently connecting people: facilitating socially appropriate communication in mobile and office environments | |
ROMAN | Improving Meetings Effectiveness using Interactive Artifacts |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: AVAYA INC., NEW JERSEY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CHAVEZ, DAVID L.;REEL/FRAME:028211/0518 Effective date: 20120509 |
|
AS | Assignment |
Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., PENNSYLVANIA Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA, INC.;REEL/FRAME:029608/0256 Effective date: 20121221 Owner name: THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A., P Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA, INC.;REEL/FRAME:029608/0256 Effective date: 20121221 |
|
AS | Assignment |
Owner name: BANK OF NEW YORK MELLON TRUST COMPANY, N.A., THE, PENNSYLVANIA Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA, INC.;REEL/FRAME:030083/0639 Effective date: 20130307 Owner name: BANK OF NEW YORK MELLON TRUST COMPANY, N.A., THE, Free format text: SECURITY AGREEMENT;ASSIGNOR:AVAYA, INC.;REEL/FRAME:030083/0639 Effective date: 20130307 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS ADMINISTRATIVE AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS INC.;OCTEL COMMUNICATIONS CORPORATION;AND OTHERS;REEL/FRAME:041576/0001 Effective date: 20170124 |
|
CC | Certificate of correction | ||
AS | Assignment |
Owner name: AVAYA INTEGRATED CABINET SOLUTIONS INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: OCTEL COMMUNICATIONS LLC (FORMERLY KNOWN AS OCTEL COMMUNICATIONS CORPORATION), CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 029608/0256;ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A.;REEL/FRAME:044891/0801 Effective date: 20171128 Owner name: VPNET TECHNOLOGIES, INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: AVAYA INTEGRATED CABINET SOLUTIONS INC., CALIFORNI Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: OCTEL COMMUNICATIONS LLC (FORMERLY KNOWN AS OCTEL Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 041576/0001;ASSIGNOR:CITIBANK, N.A.;REEL/FRAME:044893/0531 Effective date: 20171128 Owner name: AVAYA INC., CALIFORNIA Free format text: BANKRUPTCY COURT ORDER RELEASING ALL LIENS INCLUDING THE SECURITY INTEREST RECORDED AT REEL/FRAME 030083/0639;ASSIGNOR:THE BANK OF NEW YORK MELLON TRUST COMPANY, N.A.;REEL/FRAME:045012/0666 Effective date: 20171128 |
|
AS | Assignment |
Owner name: GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS LLC;OCTEL COMMUNICATIONS LLC;AND OTHERS;REEL/FRAME:045034/0001 Effective date: 20171215 Owner name: GOLDMAN SACHS BANK USA, AS COLLATERAL AGENT, NEW Y Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS LLC;OCTEL COMMUNICATIONS LLC;AND OTHERS;REEL/FRAME:045034/0001 Effective date: 20171215 |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA INTEGRATED CABINET SOLUTIONS LLC;OCTEL COMMUNICATIONS LLC;AND OTHERS;REEL/FRAME:045124/0026 Effective date: 20171215 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: WILMINGTON TRUST, NATIONAL ASSOCIATION, MINNESOTA Free format text: SECURITY INTEREST;ASSIGNORS:AVAYA INC.;AVAYA MANAGEMENT L.P.;INTELLISIST, INC.;AND OTHERS;REEL/FRAME:053955/0436 Effective date: 20200925 |
|
AS | Assignment |
Owner name: WILMINGTON TRUST, NATIONAL ASSOCIATION, AS COLLATERAL AGENT, DELAWARE Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNORS:AVAYA INC.;INTELLISIST, INC.;AVAYA MANAGEMENT L.P.;AND OTHERS;REEL/FRAME:061087/0386 Effective date: 20220712 |
|
AS | Assignment |
Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403 Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403 Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403 Owner name: AVAYA HOLDINGS CORP., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS AT REEL 45124/FRAME 0026;ASSIGNOR:CITIBANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:063457/0001 Effective date: 20230403 |
|
AS | Assignment |
Owner name: WILMINGTON SAVINGS FUND SOCIETY, FSB (COLLATERAL AGENT), DELAWARE Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNORS:AVAYA MANAGEMENT L.P.;AVAYA INC.;INTELLISIST, INC.;AND OTHERS;REEL/FRAME:063742/0001 Effective date: 20230501 |
|
AS | Assignment |
Owner name: CITIBANK, N.A., AS COLLATERAL AGENT, NEW YORK Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNORS:AVAYA INC.;AVAYA MANAGEMENT L.P.;INTELLISIST, INC.;REEL/FRAME:063542/0662 Effective date: 20230501 |
|
AS | Assignment |
Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: CAAS TECHNOLOGIES, LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: HYPERQUALITY II, LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: HYPERQUALITY, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: ZANG, INC. (FORMER NAME OF AVAYA CLOUD INC.), NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: VPNET TECHNOLOGIES, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: OCTEL COMMUNICATIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: INTELLISIST, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 045034/0001);ASSIGNOR:GOLDMAN SACHS BANK USA., AS COLLATERAL AGENT;REEL/FRAME:063779/0622 Effective date: 20230501 Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023 Effective date: 20230501 Owner name: INTELLISIST, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023 Effective date: 20230501 Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023 Effective date: 20230501 Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 53955/0436);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063705/0023 Effective date: 20230501 Owner name: AVAYA INTEGRATED CABINET SOLUTIONS LLC, NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES 
COLLATERAL AGENT;REEL/FRAME:063690/0359 Effective date: 20230501 Owner name: INTELLISIST, INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359 Effective date: 20230501 Owner name: AVAYA INC., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359 Effective date: 20230501 Owner name: AVAYA MANAGEMENT L.P., NEW JERSEY Free format text: RELEASE OF SECURITY INTEREST IN PATENTS (REEL/FRAME 61087/0386);ASSIGNOR:WILMINGTON TRUST, NATIONAL ASSOCIATION, AS NOTES COLLATERAL AGENT;REEL/FRAME:063690/0359 Effective date: 20230501 |
|
AS | Assignment |
Owner name: AVAYA LLC, DELAWARE Free format text: (SECURITY INTEREST) GRANTOR'S NAME CHANGE;ASSIGNOR:AVAYA INC.;REEL/FRAME:065019/0231 Effective date: 20230501 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |