US20140282109A1 - Context frame for sharing context information - Google Patents
Context frame for sharing context information
- Publication number
- US20140282109A1 (application US14/206,906)
- Authority
- US
- United States
- Prior art keywords
- items
- participant
- collaboration session
- leader
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/1066—Session management
- H04L65/1069—Session establishment or de-establishment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/1066—Session management
- H04L65/1083—In-session procedures
- H04L65/1089—In-session procedures by adding media; by removing media
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/1066—Session management
- H04L65/1083—In-session procedures
- H04L65/1093—In-session procedures by adding participants; by removing participants
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/403—Arrangements for multi-party communication, e.g. for conferences
- H04L65/4038—Arrangements for multi-party communication, e.g. for conferences with floor control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/14—Session management
- H04L67/142—Managing session states for stateless protocols; Signalling session states; State transitions; Keeping-state mechanisms
Definitions
- in an interactive collaboration session, different items of the interactive collaboration session are identified. For example, different agenda items or comments made during the collaboration session can be identified.
- a participant in the interactive collaboration session can provide input to tag an individual item during the interactive collaboration session.
- context information is associated with the individual item. For example, a participant can indicate that they like a particular comment made in the interactive collaboration session.
- a participant can associate a context frame, such as a window, with the individual item.
- information is generated to display the associated context information to a first participant of the interactive collaboration session.
- tagging the individual item is accomplished by a second participant of the interactive collaboration session.
- the tagged individual item is an item that accepts input from the second participant of the interactive collaboration session.
- the input is received by displaying a window for the second participant to input the context information.
- a tag tray is used for tagging individual items of the interactive collaboration session.
- the tag tray displays a number that identifies a limit on a number of times a particular type of context information can be tagged in the interactive collaboration session.
- tagging the individual item comprises dragging an icon to tag the individual item.
- the context frame displays a number associated with an individual type of context information that represents a number of associated context items.
- the tagged individual item comprises one or more of: a like, a dislike, an importance, a question, a clarification, a comment, a document, a contact, a favorite, a monetary value, a summary, a presentation, a photo, a value driver, and a link.
- a selection of the like button is received from a participant.
- information is generated to display a participant who liked the individual item.
- FIG. 1 is a block diagram of a first illustrative system for controlling a display of a collaboration framework.
- FIG. 2 is a diagram of a first view of a leader window for controlling a collaboration framework.
- FIG. 3 is a diagram of a first view of a participant window of a controlled view in a collaboration framework.
- FIG. 4 is a diagram of a second view of a leader window for controlling a collaboration framework.
- FIG. 5 is a diagram of a second view of a participant window of a controlled view in a collaboration framework.
- FIG. 6 is a diagram of a leader window for viewing contributions in a collaboration framework.
- FIG. 7 is a diagram of a view of a participant window that contains a context frame for managing contributions to an interactive collaboration session.
- FIG. 8 is a diagram of a first view of a leader window for controlling a collaboration framework that includes a stepped activity framework.
- FIG. 9 is a diagram of a first view of a participant window of a controlled view in a collaboration framework that uses a stepped activity framework.
- FIG. 10 is a diagram of a second view of a leader window for controlling a collaboration framework that includes a stepped activity framework.
- FIG. 11 is a diagram of a second view of a participant window of a controlled view in a collaboration framework that uses a stepped activity framework.
- FIG. 12 is a flow diagram of a method for controlling a display of a collaboration framework.
- FIG. 13 is a flow diagram of a method for managing context information in an interactive collaboration session.
- FIG. 1 is a block diagram of a first illustrative system 100 for controlling a display of a collaboration framework.
- the first illustrative system comprises communication devices 101 A- 101 N, a network 110 , and a collaboration system 120 .
- the communication devices 101 A- 101 N may be any device that can communicate on the network 110 , such as a Personal Computer (PC), a telephone, a video system, a cellular telephone, a Personal Digital Assistant (PDA), a tablet device, a notebook device, a smart phone, and the like. As shown in FIG. 1 , any number of communication devices 101 A- 101 N may be connected to the network 110 , including only a single communication device 101 . In addition, the communication device 101 may be directly connected to the collaboration system 120 .
- PC Personal Computer
- PDA Personal Digital Assistant
- the communication device 101 comprises a processor 102 , a memory 103 , and a network interface 104 .
- the processor 102 can be any type of processor 102, such as a microprocessor, a multi-core processor, a Digital Signal Processor (DSP), a microcontroller, and/or the like.
- the memory 103 can be any type of memory, such as a Random Access Memory, a Flash Memory, a hard disk, and/or the like.
- the network interface 104 can be any type of interface, such as a wireless interface, a wired interface, a cellular interface, an Ethernet interface, and/or the like. The network interface 104 is used to communicate with the network 110 .
- the communication device 101 generally includes participant output devices, such as a display, a speaker, a vibrator, and the like.
- the communication device 101 can include participant input devices such as a keyboard, a keypad, a touch screen, a mouse, a pointing device, and/or the like.
- the processor 102 and the memory 103 can be used to execute programs.
- the processor 102 and the memory 103 can be used to execute a web browser or other applications that work in conjunction with the collaboration system 120 .
- the network 110 can be any network that can send and receive information, such as the Internet, a Wide Area Network (WAN), a Local Area Network (LAN), the Public Switched Telephone Network (PSTN), a packet switched network, a circuit switched network, a cellular network, a combination of these, and the like.
- the network 110 can use a variety of protocols, such as Ethernet, Internet Protocol (IP), Session Initiation Protocol (SIP), Integrated Services Digital Network (ISDN), and the like.
- IP Internet Protocol
- SIP Session Initiation Protocol
- ISDN Integrated Services Digital Network
- the collaboration system 120 can be any hardware/software that can communicate on the network 110 , such as a collaboration server, a multimedia server, a server, a web server, a web conferencing system, a communication system, a combination of these, and the like.
- the collaboration system 120 can include a server that incorporates the memory/data storage 127 .
- the memory/data storage contains programming/computer instructions for implementing various modules or functions.
- the collaboration system 120 includes a processor 126 for executing the programming/computer instructions in the memory/data storage 127 .
- the modules or functions can communicate information via the network interface 128 to the communication devices 101A-101N.
- the collaboration system 120 further comprises an authentication module 121 , a voting module 122 , a collaboration module 123 , a template 124 , and a display module 125 .
- the authentication module 121 can be any hardware/software that can authenticate a participant, such as, a private key authentication system, a digital certificate system, a password verification system, and/or the like.
- the authentication module 121 can use one or more encryption technologies such as Data Encryption Standard (DES), Public Key Infrastructure (PKI), Secure Socket Layer (SSL), Diffie Hellman, Advanced Encryption Standard (AES), and/or the like.
- DES Data Encryption Standard
- PKI Public Key Infrastructure
- SSL Secure Socket Layer
- AES Advanced Encryption Standard
- the authentication module 121 is used to authenticate participants into the collaboration system 120 .
- the voting module 122 can be any hardware/software that can process votes from collaborating participants.
- the voting module 122 works in conjunction with the collaboration module 123 to process votes of collaborating participants.
- the collaboration module 123 can be any hardware/software that allows participants at the communication devices 101 A- 101 N to collaborate with one another.
- the collaboration module 123 works in conjunction with the authentication module 121 and voting module 122 .
- the collaboration module 123 is designed to facilitate different types of collaborations between participants.
- the display module 125 can be any hardware/software that can generate a display of the interactive collaboration session and/or can generate information for display in the interactive collaboration session.
- the display module 125 can be a web server, a video card, a browser, and the like.
- the display module 125 is shown as being part of the collaboration system 120. However, in some embodiments, the display module 125 may be in the communication devices 101A-101N and/or the collaboration system 120 (a minimal sketch of the overall module composition follows below).
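- As an illustration only, the module composition described above (the authentication, voting, collaboration, and display modules of FIG. 1) might be organized as in the following sketch. The class and method names are assumptions chosen for exposition; the patent does not specify an implementation.

```python
# Minimal sketch of the collaboration system's module composition (illustrative names only).
class AuthenticationModule:
    def __init__(self, credentials):
        self._credentials = credentials        # e.g., {username: password}

    def authenticate(self, username, password):
        return self._credentials.get(username) == password


class VotingModule:
    def __init__(self):
        self._votes = {}                       # ballot item -> {participant: choice}

    def cast_vote(self, ballot_item, participant, choice):
        self._votes.setdefault(ballot_item, {})[participant] = choice


class CollaborationModule:
    def __init__(self):
        self.activities = {}                   # activity name -> hierarchical structure


class DisplayModule:
    def render(self, view):
        # A real system might generate a web page; here we just return plain text.
        return "\n".join(str(item) for item in view)


class CollaborationSystem:
    # Ties the modules together, loosely mirroring modules 121, 122, 123, and 125 of FIG. 1.
    def __init__(self, credentials):
        self.authentication = AuthenticationModule(credentials)
        self.voting = VotingModule()
        self.collaboration = CollaborationModule()
        self.display = DisplayModule()


system = CollaborationSystem({"leader": "secret"})
assert system.authentication.authenticate("leader", "secret")
```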
- An interactive collaboration session typically occurs between a leader and a set of participants.
- the leader and the participants join the interactive collaboration session using the communication devices 101 A- 101 N.
- the leader of an interactive collaboration session is typically defined by a system administrator.
- the leader of the interactive collaboration session is typically a designated person that facilitates the flow of the interactive collaboration session.
- An interactive collaboration session can be any type of group activity where two or more people collaborate.
- an interactive collaboration session can be a brainstorming session, a discussion of one or more topics, a group meeting, a regularly scheduled meeting, and/or the like.
- there may be one or more defined co-leaders.
- the co-leader(s) may have all the same permissions as the leader or only a sub-set of permissions.
- the co-leader may have all the permissions of the leader with the exception of setting default permissions for participants.
- the leader, for example at the communication device 101A, logs in to the collaboration system 120 via the authentication module 121.
- the leader generates a collaboration activity using the collaboration module 123 .
- the collaboration activity (e.g., an agenda) is used to define the interactive collaboration session.
- an activity of the interactive collaboration session can be a series of items that will be discussed during the interactive collaboration session.
- the leader defines a set of participants that will collaborate on the collaboration activity.
- the leader via the collaboration module 123 , generates a structure of the interactive collaboration session.
- the structure of an interactive collaboration session can be any structure defined by the leader or defined in the template 124 .
- a hierarchical structure in a collaboration activity can comprise categories, ideas associated with each category, and comments associated with each idea, which are viewed in levels (see FIG. 2 as an illustrative example and the sketch below).
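- A minimal sketch of such a three-level hierarchy is shown below, using the category/idea/comment names from FIG. 2. The data-class layout is an assumption for illustration; the patent does not prescribe a data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Comment:
    text: str

@dataclass
class Idea:
    text: str
    comments: List[Comment] = field(default_factory=list)

@dataclass
class Category:
    name: str
    ideas: List[Idea] = field(default_factory=list)

    def item_count(self) -> int:
        # Total items below this category (ideas plus comments),
        # matching the count shown next to each category in FIG. 2.
        return sum(1 + len(idea.comments) for idea in self.ideas)

# Example mirroring the "Marketing" category of FIG. 2: two ideas and two comments -> 4 items.
marketing = Category("Marketing", ideas=[
    Idea("Whether to change our logo"),
    Idea("Should we hire Sally?", comments=[Comment("Yes, she is good"),
                                            Comment("No, she cannot do X")]),
])
assert marketing.item_count() == 4
```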
- the names of the structure of the interactive collaboration session can be different.
- the leader can also define the structure of the interactive collaboration session by using a structure wizard (not shown) that allows the leader to define the structure.
- the structure of the interactive collaboration session is displayed by the display module 125 to the leader (See FIG. 2 as an example).
- the participants are presented with a participant window 300 for the selected activity (see FIGS. 3 and 5 as examples of participant windows 300) by the display module 125.
- the view presented to the participants in a participant window 300 is typically different than the view presented to the leader in a leader window 200 .
- the leader can control which levels of the structure of the interactive collaboration session are displayed by the display module 125 to the participants. This can be accomplished via input from the leader, such as clicking on a button, selecting a menu, via a voice command, via a gesture, and/or the like. This way the leader can select individual levels/groupings/items, multiple levels/groupings/items, or all the levels/groupings/items in the structure of the interactive collaboration session to display by the display module 125 to the participants. These types of input can also be used to control other aspects/levels of the invention, such as agendas, rosters, instructions, documents, and/or the like.
- the collaboration module 123 uses the input from the leader to control the interactive collaboration session.
- the leader can divide the structure of the interactive collaboration process into two or more structures. For example, if the leader wants to break an existing interactive collaboration session into two different groups, one to discuss the current activity and a new group to discuss a new activity, the leader can add/create a new collaboration activity and take a subset of the participants in the existing activity and place them in the new activity. For example, if the current collaboration activity is Activity A with participants H, I, J, and K, the leader can divide the current activity into two activities (Activity A-1 and Activity A-2) and move the participants J and K into the Activity A-2. The leader can moderate the two sessions by toggling back and forth between the two activities. This can be done in a similar manner as done with separate activities (See FIG.
- the participants of the two activities may or may not be able to view the interactive collaboration session of the other activity.
- the leader may be able to divide the activity into two or more activities where all the participants (or only specific ones) may be able to toggle back and forth between the two activities in a similar manner as the leader.
- the leader can then merge the two activities back into a single activity. This can be accomplished by merging the corresponding categories of one activity with the categories of the other activity. Categories that came from Activity A-1 could have a flag that identifies that the categories came from the divided Activity A-1. Likewise, categories that came from Activity A-2 can be identified in a similar manner. The corresponding structure for the ideas associated with the categories and the comments associated with the ideas will be merged in a similar manner. This way all the categories, ideas, and comments will be available to the leader in the merged activity.
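- The divide-and-merge behavior described in the preceding items can be sketched roughly as follows. The dictionary layout, the function names, and the "origin" flag field are assumptions used to illustrate the flags mentioned above.

```python
def divide_activity(activity, moved_participants):
    # Split one activity into two, moving a subset of the participants to the new activity.
    part_1 = {"name": activity["name"] + "-1",
              "participants": [p for p in activity["participants"] if p not in moved_participants],
              "categories": list(activity["categories"])}
    part_2 = {"name": activity["name"] + "-2",
              "participants": list(moved_participants),
              "categories": []}
    return part_1, part_2


def merge_activities(part_1, part_2):
    # Merge the two activities back, flagging each category with the divided activity it came from.
    merged = {"name": part_1["name"].rsplit("-", 1)[0],
              "participants": part_1["participants"] + part_2["participants"],
              "categories": []}
    for source in (part_1, part_2):
        for category in source["categories"]:
            merged["categories"].append({**category, "origin": source["name"]})
    return merged


# Example: Activity A with participants H, I, J, and K is split, and J and K move to Activity A-2.
activity_a = {"name": "Activity A", "participants": ["H", "I", "J", "K"],
              "categories": [{"name": "Sales"}]}
a1, a2 = divide_activity(activity_a, ["J", "K"])
merged = merge_activities(a1, a2)
assert [c["origin"] for c in merged["categories"]] == ["Activity A-1"]
```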
- the leader can propose a vote using the voting module 122 on a specific ballot item.
- the leader can have control over how the participants view the voting results based on clicking a button, menu, voice input, and/or the like.
- Individual ballot items on the vote can be controlled using Ballot Item Controls (Not pictured).
- the display of individual vote criteria can also be controlled using Vote Criteria displays (Not Pictured).
- the collaboration module 123 can identify items (e.g., categories, ideas, comments, etc.) for the interactive collaboration session.
- the collaboration module 123 receives an input to tag an individual item in the interactive collaboration session (e.g., as shown in FIG. 7 ).
- a participant can tag a comment to indicate that the participant likes the comment.
- the collaboration module 123 associates context information with the tagged item.
- the like is associated with the tagged item.
- Another participant or the same participant can associate a context frame with the individual item of the plurality of items (as shown in FIG. 7 , element 710 ).
- the display module 125 can generate information to display the associated context information to the participant.
- the display module 125 can generate information to display the context information to the participant by sending a web page to the communication device 101 .
- the communication devices 101, the collaboration system 120, the authentication module 121, the voting module 122, and the collaboration module 123 are stored-program-controlled entities, such as a computer or processor, which perform the processes described herein by executing program instructions stored in a tangible computer readable storage medium, such as a memory or disk.
- FIG. 2 is a diagram of a first view of a leader window 200 for controlling a collaboration framework.
- the leader window 200 is presented by communication device 101 of the leader.
- the leader window 200 can include various selectable buttons, menus, text boxes or other inputs that allow the leader to control the collaboration framework.
- the leader window 200 comprises an activity menu 201, collaboration activities 202-203, a roster 204, participant status buttons 205, an activity control button 206, a participant control button 207, a leader category column 210, a category control button 211, an add category box 212, an add categories control button 213, a leader idea column 220, an ideas control button 221, an add idea box 222, an add idea control button 223, a leader comment column 230, a comment control button 231, an add comment box 232, an add comment control button 233, a leader instructions area 240, a view instructions control button 241, a leader documents area 250, and a view documents control button 251.
- the leader window 200 is the main window that the leader uses to control structure and levels of the interactive collaboration session.
- the leader can create a new collaboration activity by selecting the activity menu 201 to add a new activity.
- the leader has created two activities: 1) Activity ABC 202 , and 2) Activity XYZ 203 .
- the Activity ABC 202 is the current activity being displayed in leader window 200 (indicated by the grey background). If the leader selected the Activity XYZ 203, the leader window 200 would display the structure of the interactive collaboration session for the Activity XYZ 203, which can have a completely different structure.
- the activity menu 201 also includes the activity control button 206 .
- the activity control button 206, in this example, is a toggle button that allows the leader to control whether the participants can view activity items. In this example, the activity control button 206 has been toggled to grey. When toggled to grey, the participants will be unable to see the activity items (202 and 203). If toggled to green (green is represented by black in all attached black and white drawings), the participants will see the activity items 202 and 203.
- the roster 204 shows which participants of the interactive collaboration session are currently logged in and involved in the interactive collaboration session for the Activity ABC 202 .
- three participants are currently involved in the interactive collaboration session for the Activity ABC 202 (indicated by the green color (black in the drawing) of the user status button 205 for each user).
- a user can be either a participant or a leader.
- the leader may also control whether a person is part of an interactive collaboration session by clicking on the user status button 205 of each user. This would result in the user status button 205 changing color (e.g., from green to red).
- the participant status button 205 may also convey that a participant has been defined as part of the interactive collaboration session, but is currently not logged into the interactive collaboration session.
- the view that is presented to the leader can show one color for logged in and collaborating (e.g., green), one color for not logged in (e.g., grey), and another color for logged in, but where the leader has taken the participant out of the interactive collaboration session (e.g., red).
- the ability to take a participant in and out of the interactive collaboration session is useful based on the specific topic being discussed.
- the leader can also control if all the participants can view the other participants in the interactive collaboration session by clicking the participant control button 207 .
- the participant control button 207, if toggled to grey, prevents all participants from viewing the other participants in the interactive collaboration session.
- the structure of the interactive collaboration session for the Activity ABC 202 comprises three levels.
- the names given to these levels can be customized by the session leader to match the structure of collaboration that they intend to guide participants through.
- Levels may be named anything the leader wishes, such as Questions/Responses/Comments or Categories/Risks/Root causes.
- the levels are named: 1) categories, 2) ideas, and 3) comments.
- the leader instructions area 240 and leader documents area 250 can also be considered levels.
- the structure of the interactive collaboration session for the Activity ABC 202 is shown in the leader category column 210 , the leader ideas column 220 , and the leader comment column 230 .
- the leader category column 210 shows three categories for the activity ABC 202 .
- the categories, in this embodiment, are like an activity of the interactive collaboration session. In other alternatives, the categories can be used for a variety of purposes. For example, items in the category column 210 can be an idea, an event, an action, a concept, a question, a combination of these, and the like.
- the categories 210 are: 1) Sales, 2) Public Relations, and 3) Marketing.
- the number next to the categories indicates the total number of items below that level.
- Marketing shows a 4 (0) in the leader categories column 210 .
- the 4 indicates that there are 4 total items in marketing (2 in ideas, and 2 in comments).
- the 0 indicates that all the items have been viewed by the leader.
- the Sales item in the leader category column 210 shows a 2 (1). This indicates that there are two items below this item and that there is one item that has not been viewed by the leader.
- the structure (i.e., the ideas and comments associated with a category) is displayed when the category is selected.
- Marketing, as shown in the leader ideas column 220, has two ideas: 1) whether to change their logo, and 2) should we hire Sally?
- the “should we hire Sally” item has been selected (indicated by the grey background), which results in the two comments that were added (in the leader comments column 230 ) being displayed, one that indicates that yes, she is good and a second one that indicates no, she cannot do X.
- the text items in each of the levels are defined as a contribution. A contribution can come from a leader, a co-leader, or a participant.
- Each of the levels of the interactive collaboration session has a button ( 211 , 221 , and 231 ), that allows the leader to control the view of participants.
- the category control button 211 allows the leader to control if the participants can view or not view the contributions in the leader categories column 210 in the participant window 300 (see FIGS. 3 and 5 ).
- the ideas control button 221 allows the leader to control if the participants can view or not view the contributions in the leader ideas column 220 in the participant window 300 .
- the comment control button 231 allows the leader to control if the participants can view or not view the contributions in the leader comment column 230 in the participant window 300 .
- buttons 211, 221, and 231 are toggle buttons that change color.
- the button ( 211 , 221 , and 231 ) is green (black in the drawing) when the participants can view the respective column.
- the button ( 211 , 221 , and 231 ) is grey when the participant cannot view the respective column.
- one or more of the buttons (211, 221, and 231) may be a three way toggle button that displays three colors (e.g., green, yellow, and grey). When toggled to green, the participants can view the respective column. When toggled to grey, none of the participants can view the respective column.
- Each of the columns 210 , 220 , and 230 contains a text box ( 212 , 222 , and 232 ) that allows the leader to add an item to each level in the structure.
- the leader via the add category box 212 can add a new category to the leader category column 210 .
- the leader, via the add idea box 222 can add a new idea to the leader ideas column 220 .
- the leader, via the add comments box 232 can add a new comment to the leader comments column 230 .
- the leader can also control whether the participants can add items (contributions) to each of the levels of the structure of the interactive collaboration session.
- the leader can select the add categories control button 213 to allow or not allow a participant to add a new item to a participant category column (See FIG. 3 ).
- the leader can select the add idea control button 223 to allow or not allow a participant to add a new idea to the participant idea column.
- the leader can select the add comment control button 233 to allow or not allow a participant to add a new comment to the participant comment column.
- a contribution that is provided by a participant is typically anonymous. However, in some embodiments, the contribution provided by the participant will be marked with an identifier indicating the participant who made the contribution. In one embodiment, this is only known to the leader. In other embodiments, it may be known by the other participants.
- the leader can also control if the participants can view instructions in the leader instructions area 240 . In some embodiments this area will be represented as a tab in the interface. This is accomplished by the leader selecting the view instructions control button 241 . Likewise, the leader can control if participants can view and add documents to the leader documents area 250 . In some embodiments, this area will be represented as a tab in the interface.
- the leader can toggle the view documents control button 251 to control whether participants can view, add, and/or delete documents. Although not shown, control of viewing documents may be separate from control of adding and/or deleting documents. In FIG. 2, both the view instructions control button 241 and the view documents control button 251 have been toggled to green (black in the drawing), indicating that the participants will be able to view instructions and documents along with adding and deleting documents.
- the leader via a control button may control if a participant can modify another participant's contribution.
- the leader may control (enable or disable) if a participant can navigate the structure of the interactive collaboration session.
- the leader window 200 is an example of a hierarchical structure of an interactive collaboration.
- the structure of the interactive collaboration session may be based on different structures, such as based on tabs.
- the tabs may be used to define the structure of the interactive collaboration session. For example, instead of columns, there would be a tab for the different items, such as a tab for the categories, a tab for the ideas, and a tab for the comments (or tabs for different/additional groupings).
- a control button may be displayed on each of the tabs to allow the leader to enable/disable a tab from being viewed by the participants of the interactive collaboration session.
- a grid can be used to define the structure of the interactive collaboration session.
- a control button is placed on a row(s) and/or column(s). This allows the leader to enable/disable which rows/columns can be viewed by the participants of the interactive collaboration session.
- items of a menu can be used to define the structure of the interactive collaboration session.
- a control button is placed on menus and/or menu items. This allows the leader to enable/disable what menus/menu items can be viewed by the participants of the interactive collaboration session.
- each of the buttons 205, 206, 207, 211, 221, 231, 241, 251, 213, 223, and 233 is green when the participants can view the contributions/items/information and grey when the participants cannot view the contributions/items/information.
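- A minimal sketch of this per-level visibility control is shown below: the leader's toggle buttons become boolean flags that filter what is rendered in the participant window. The class and field names are assumptions for illustration.

```python
# Sketch of per-level visibility control (illustrative names; not the patent's implementation).
class LevelControls:
    def __init__(self):
        # True corresponds to a green button (participants can view the level);
        # False corresponds to grey (participants cannot view it).
        self.visible = {"categories": True, "ideas": True, "comments": True,
                        "instructions": True, "documents": True}
        # Whether participants may add their own contributions at each level.
        self.can_add = {"categories": True, "ideas": True, "comments": True}

    def toggle_view(self, level):
        self.visible[level] = not self.visible[level]

    def participant_view(self, structure):
        # Return only the levels the leader currently allows participants to see.
        return {level: items for level, items in structure.items() if self.visible.get(level)}


controls = LevelControls()
structure = {"categories": ["Sales", "Public Relations", "Marketing"],
             "ideas": ["Should we hire Sally?"],
             "comments": ["Yes, she is good"]}
controls.toggle_view("categories")           # leader greys out the category control (as in FIG. 4)
print(controls.participant_view(structure))  # the categories level no longer appears (as in FIG. 5)
```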
- FIG. 3 is a diagram of a first view of a participant window 300 of a controlled view in a collaboration framework.
- the participant window 300 comprises a participant categories column 310 , a participant ideas column 320 , a participant comments column 330 , a participant instruction area 340 and a participant documents area 350 .
- the participant window 300 also comprises a participant add categories box 312 , a participant add ideas box 322 , and a participant add comments box 332 .
- the participant window 300 is the corresponding participant window 300 that is displayed to the participants based on the current settings of leader window 200 as shown in FIG. 2 .
- the activity control button 206 and the participant control button 207 in FIG. 2 have been toggled to grey.
- the Activity ABC 202 and the Activity XYZ 203 in the activity menu 201 , along with the roster 204 are not displayed in the participant window 300 .
- in the leader window 200, the category control button 211, the ideas control button 221, and the comment control button 231 have been toggled to green (black in the drawing) to allow the participants to view each of the columns (210, 220, and 230). These selections result in the leader categories column 210, the leader idea column 220, and the leader comment column 230 of the leader window 200 being displayed respectively in the participant categories column 310, the participant idea column 320, and the participant comment column 330 of the participant window 300.
- the add categories control button 213 , the add idea control button 223 , and the add comment control button 233 have been toggled green (black in the drawing) to allow the participants to enter items into the respective columns ( 310 , 320 , and 330 ).
- the view instructions control button 241 is toggled to green (black in the drawing) to indicate that the participants can view this field. This results in the leader instructions area 240 being displayed in the participant window 300.
- the view documents control button 251 is toggled to green (black in the drawing). This results in the leader documents area 250 being displayed in the participant window 300 .
- FIG. 4 is a diagram of a second view of a leader window 200 for controlling a collaboration framework.
- the difference between FIG. 4 and FIG. 2 is that the category control button 211 has been toggled grey by the leader to indicate that the participants cannot view the information in the participant category column 310 .
- the add idea control button 223 has been toggled to grey to indicate that the participants cannot add any new ideas to the ideas columns 320 .
- the instructions control button 241 and the documents control button 251 have also been toggled to grey to indicate that the participants cannot view the information in the respective areas 340 and 350.
- FIG. 5 is a diagram of a second view of a participant window 300 of a controlled view in a collaboration framework.
- the categories column 310 is no longer displayed in the participant window 300 shown in FIG. 5 .
- because the add idea control button 223 has been toggled (to grey) as shown in FIG. 4, the participant add ideas box 322 is no longer displayed in participant window 300 as shown in FIG. 5.
- the participant instruction area 340 and the participant documents area 350 are also no longer displayed in the participant window 300.
- by selecting various combinations of the category control button 211, the ideas control button 221, the comment control button 231, the instructions control button 241, and the documents control button 251, the leader can easily control which levels of the structure of the interactive collaboration session are displayed to the participants.
- FIG. 6 is a diagram of a leader window 600 for viewing contributions in a collaboration framework. Although primarily used by a leader, the leader windows 200 and 600 can be used by anyone who has been given the necessary permissions, such as a co-leader or manager.
- the leader window 600 comprises the same items as leader window 200 (excluding the items 240, 241, 250, and 251).
- the leader window 600 also comprises a search box 620, a search button 621, and a search window 624.
- the search window 624 further comprises an add to category button 622 , an add to idea button 623 , and a search result 625 .
- the search box 620 allows the leader to enter a keyword(s) in order to search the structure of one or more activities 202 - 203 . For example, as shown in search box 620 , the leader wants to search for the key word “Sales Schedule.” After entering the keyword(s), the leader selects the search button 621 . This brings up the search window 624 . In this example, there was one contribution that was found in searching the Activity ABC 202 .
- the contribution was the statement “Is Sales Schedule really behind?” that is displayed in search result 625 .
- the contribution “Is sales schedule really behind?” is not shown in the structure because it is a comment off of the sales activity in Activity ABC.
- the leader can select the search result 625 and then select the add to category button 622 to add the selected contribution to the leader category column 210 .
- the leader can select the add to idea button 623 and add the selected search result 625 to the currently displayed leader idea column 220.
- the leader can also select the go to button 624 to bring up the structure for the “Is sales schedule really behind?” contribution.
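- The keyword search of FIG. 6 amounts to walking the activity structure and collecting contributions whose text contains the keyword(s). A minimal, case-insensitive sketch is shown below; the function name, dictionary layout, and sample idea text are assumptions.

```python
def search_contributions(activity, keywords):
    # Return every contribution in the activity whose text contains all of the keywords.
    keywords = [k.lower() for k in keywords]
    results = []
    for category in activity["categories"]:
        for idea in category.get("ideas", []):
            contributions = [idea["text"]] + [c["text"] for c in idea.get("comments", [])]
            for contribution in contributions:
                if all(k in contribution.lower() for k in keywords):
                    results.append(contribution)
    return results


activity_abc = {"categories": [
    {"name": "Sales", "ideas": [
        {"text": "Improve outreach",
         "comments": [{"text": "Is Sales Schedule really behind?"}]}]}]}
print(search_contributions(activity_abc, ["sales", "schedule"]))
# ['Is Sales Schedule really behind?']
```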
- FIG. 7 is a diagram of a view of a participant window 700 that contains a context frame 710 for managing context information.
- FIG. 7 comprises participant window 700 , tag tray 720 , document display window 730 , dislike display window 731 , and clarification enter window 732 .
- the participant window 700 comprises the participant ideas column 320 , the participant comments column 330 , and the participant add categories box 312 .
- the participant window 700 also comprises the context frame 710 .
- the context frame 710 is described below from a participant view in the participant window 700 . However, the context frame 710 can work in the same manner for the leader window 200 .
- the context frame 710 is shown as being over the idea “Should we hire Sally.”
- the context frame 710 works in conjunction with the tag tray 720 for associating context information with items (e.g., ideas or comments) of the participant window 700 .
- the context frame 710 further comprises a like tag button 711 , a dislike tag button 712 , a documents tag button 713 , and an add question tag button 714 .
- the context frame 710 can comprise a variety of buttons, such as an important tag button (to flag an item as important), a clarification tag button (where a participant can clarify an item), a comment tag button, a contact tag button (where a participant can add a contact to an item), a favorites tag button, a monetary value tag button (where a participant can associate a monetary number with an item), a summary tag button (which by default is not visible to participants), a presentation tag button, a photo tag button, a link tag button (e.g., a link to a web site), a custom tag button, and/or the like.
- the context frame 710 can also comprise copy, cut, paste, delete, indent, outdent, and reorder buttons.
- the leader may control which buttons each participant sees in the context frame 710 . For example, one participant may see different buttons than another participant, or participants may see a different set of buttons in each of the activities in a session.
- associated with each of the tag buttons 711-714 is a number.
- the number indicates an amount associated with the tag button 711 - 714 for the particular item that the context frame 710 is over.
- the like tag button 711 has 3 likes associated with the idea “Should we hire Sally.”
- the dislike tag button 712 has one dislike associated with the idea “Should we hire Sally.”
- the documents tag button 713 has one document associated with the idea “Should we hire Sally.”
- the add question tag button 714 has two questions associated with the idea “Should we hire Sally.”
- these numbers may not be displayed, and instead, the tag button 711 will appear on the context frame where it has been applied at least once, and not be displayed if it has not been applied to the information by any user.
- the tag tray 720 comprises a variety of icons for tagging different types of context information to items in the participant window 700 .
- the tag tray 720 comprises a like icon 721 , a dislike icon 722 , an importance icon 723 , a question icon 724 , a clarification icon 725 , a comment icon 726 , a document icon 727 , and a contact icon 728 .
- the tag tray 720 can also include other icons for associating different types of context information, such as a favorites icon, a monetary value icon, a summary icon, a presentation icon (for holding a link to a presentation), a photo icon, a link icon (e.g., a link to a web page), a custom icon, and/or the like.
- the icons that a participant sees in the participant tag tray 720 can be controlled by the leader or can be administered.
- Each of the icons 721 - 728 in the tag tray 720 may have a limit number associated with the icon.
- icons 721 - 722 and 724 - 728 have a limit number below the respective icon 721 - 722 and 724 - 728 .
- the like icon 721 has the limit number 10/5 below the like icon 721.
- the 10 represents the maximum number of likes that the participant can use for the activity (Activity ABC). In this example, the participant has used five likes out of the allowed ten likes.
- the importance icon 723 has a dash below the importance icon 723 . The dash indicates that there is no limit for the importance icon.
- if an icon only has a single number below it, this indicates the maximum limit and that the participant has not used any of the maximum limit.
- the comment icon 726 shows that this particular participant has a limit of 10 comments, but the participant in this case has not used any of the 10 available comments for the activity ABC.
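- The per-participant limits shown under the tag tray icons (e.g., 10/5 meaning five of ten allowed likes used, or a dash for no limit) can be tracked with a simple counter. The sketch below is illustrative only; the class and method names are assumptions.

```python
class TagAllowance:
    # Tracks how many times a participant may still apply a given tag in an activity.
    def __init__(self, limit=None):
        self.limit = limit      # None corresponds to the dash (no limit), as for the importance icon 723
        self.used = 0

    def can_apply(self):
        return self.limit is None or self.used < self.limit

    def apply(self):
        if not self.can_apply():
            raise ValueError("tag limit reached for this activity")
        self.used += 1

    def label(self):
        # Text shown under the icon: "-" for unlimited, the bare limit if unused, else "limit/used".
        if self.limit is None:
            return "-"
        return str(self.limit) if self.used == 0 else f"{self.limit}/{self.used}"


likes = TagAllowance(limit=10)
for _ in range(5):
    likes.apply()
print(likes.label())            # "10/5", matching the like icon 721 in FIG. 7
print(TagAllowance().label())   # "-", matching an icon with no limit
```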
- the participant can select an icon 721 - 728 and drag the icon onto an item to select the item. For example, the participant could select the clarification icon 725 and drag the clarification icon 725 (as shown in step 740 ) onto the item “Yes, she is good.” This results in the clarification enter window 732 being displayed so that the participant can enter a clarification of the comment “Yes, she is good.”
- the process of selecting an item by dragging an icon onto an item can be done for any of the icons 721 - 728 . For example, the participant can drag the dislike icon 722 onto the item “Yes, she is good” item.
- this information can then be displayed during the interactive collaboration session via the context frame 710 .
- if the participant drags the like icon 721 onto an item, the other participants in the interactive collaboration session will see the added like for the item by moving the context frame over the item.
- the participant of participant window 700 would see the number under the like tag button 711 change from 3 to 4 in the context frame 710.
- participants and the leader can see details of the context information associated with each of the tag buttons 711-714 in the context frame 710. For example, if the participant or the leader selects the documents tag button 713, this results in the documents display window 730 being displayed to the participant or leader in step 742.
- the documents display window 730 shows the document that is associated with the item “Should we hire Sally,” which is the employee handbook.doc. The participant can then select the employee handbook.doc to open this document.
- the participant or leader can click on the dislike tag button 712 .
- the dislike display window 731 is then displayed in step 743 to the participant or leader that indicates, in this example, that Jack Hammer did not want to hire Sally. This can also be done in a similar manner for the like tag button 711 .
- the participant and/or leader may only be able to see that there is a dislike, but not be allowed to click on the dislike tag button 712 to display the particular participant that disliked the item.
- the context frame 710 can be displayed to a participant in various ways.
- a context frame button (not shown) could be placed by each item to allow the participant to bring up the context frame 710 for that item.
- the participant can drag the context frame 710 to different items in the participant window 700 to display the context information associated with the item.
- the process of liking and disliking an item can be repeated by the same participant. For example, if Jack Hammer initially liked the item “Should we hire Sally” and then later on in the discussion decided that he no longer liked hiring Sally, Jack Hammer could dislike the item “Should we hire Sally.” This would decrement the like count under the like tag button 711 by one and increment the count under the dislike tag button 712 .
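- That change-of-mind behavior (a repeated reaction moves the count from one tag to the other) can be sketched as follows; the dictionary layout and function name are assumptions for illustration.

```python
def set_reaction(item, participant, reaction):
    # Record a participant's like/dislike; changing one's mind moves the count between tags.
    previous = item["reactions"].get(participant)
    if previous == reaction:
        return                              # repeating the same reaction changes nothing
    if previous is not None:
        item["counts"][previous] -= 1       # e.g., decrement the like count...
    item["counts"][reaction] += 1           # ...and increment the dislike count
    item["reactions"][participant] = reaction


item = {"text": "Should we hire Sally",
        "counts": {"like": 3, "dislike": 1},   # the counts shown in the context frame 710 of FIG. 7
        "reactions": {}}
set_reaction(item, "Jack Hammer", "like")      # like count goes to 4
set_reaction(item, "Jack Hammer", "dislike")   # change of mind: likes back to 3, dislikes to 2
print(item["counts"])                          # {'like': 3, 'dislike': 2}
```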
- the icons in the tag tray 720 and the buttons on the context frame 710 can change based on the item that is selected. For example, a different set of icons in the tag tray 720 and a different set of buttons on the context frame 710 may be displayed when the context frame is over the item “Yes, she is good.” In one embodiment, the icons in the tag tray 720 and/or context frame can be enabled or disabled based on the selected item.
- other tags can also be created that can be applied to any item/contribution. For example, tags could be created to feature items, include items in a report, identify items as belonging to a particular classification, etc.
- only the leader will have access to the tag tray 720 .
- the leader can use the tag tray 720 to determine which buttons 711 - 714 will be displayed in the context frame 710 to the participants.
- the individual participants can then select the button 711-714 and perform a drag-and-drop onto the item (e.g., item 710) to tag the item.
- the leader could do a drag-and-drop of the like icon 721 onto the context frame 710 to add the like button 711 to the context frame 710 that is displayed to the participants.
- a participant could indicate that she likes the “Should We Hire Sally” item by dragging-and-dropping the like button 711 onto the item “Should We Hire Sally.”
- the leader can also remove a button 711-714 from the context frame 710. This is accomplished by the leader dragging-and-dropping one or more of the buttons 711-714 onto the tag tray 720.
- FIG. 8 is a diagram of a first view of a leader window 800 for controlling a collaboration framework that includes a stepping framework.
- the leader window 800 is similar to the leader window 200 with the addition of a step frame 801 .
- the step frame 801 comprises step buttons 810 A- 810 N.
- the step frame 801 is used to step a leader through a defined series of steps in an interactive collaboration session by allowing the leader to click on each step button 810 A- 810 N to advance to the next step in the interactive collaboration session.
- a typical interactive collaboration session will include a series of steps or patterns. For example, the leader may want to discuss the category “sales” first in an interactive collaboration session with the participants. Next the leader may want to discuss the “marketing” category. In each step, the leader may only want the participants to view specific information in the interactive collaboration session. For example, when the leader wants to discuss the “sales” category, the leader may want to only display the ideas column 220 (controlled by the ideas control button 221 ) and the comments column 230 (controlled by the comment control button 231 ). In addition, the leader may only want to allow the addition of comments into the comment column (controlled by the add comment control button 233 ). For the “marketing” step, the leader may only want to display the comments column 230 without allowing any comments by the participants.
- the step frame 801 allows the leader to step through the interactive collaboration session by selecting the step buttons 810 A- 810 N to provide the desired view (by automatically selecting the buttons 206 , 207 , 211 , 213 , 221 , 223 , 231 , 233 , 241 , and 251 defined for that step) to the participants in the interactive collaboration session. This way, the leader does not have to worry about what is being displayed to the participants in each step of the interactive collaboration session.
- the steps of an interactive collaboration session can be defined by the leader or an administrator prior to the beginning of the interactive collaboration session in the template 124 .
- the template 124 can be generated in various ways. For example, the leader or administrator can set a particular set of buttons 206 , 207 , 211 , 213 , 221 , 223 , 231 , 233 , 241 , and 251 to a particular configuration and then save it off as a step. This process can be repeated for each of the steps of the interactive collaboration session.
- the steps can be programmed by a leader defining the settings of the buttons 206 , 207 , 211 , 213 , 221 , 223 , 231 , 233 , 241 , and 251 for each step in the template 124 .
- the template 124 is then loaded as part of the interactive collaboration session.
- the loading of the template 124 causes the step frame 801 to be displayed in the leader window 800 .
- the template 124 has N steps. Each of the steps is represented by one of the step buttons 810A-810N.
- the leader can go to a specific step by clicking on the specific step button 810 for the step that the leader wants displayed to the participants of the interactive collaboration session.
- previous and next buttons could be used in place of the step buttons, or a combination of step buttons 810 and next and previous buttons can be used.
- in FIG. 8, the leader has selected step button 810B (as indicated by the grey background of step button 810B) to display the configuration associated with step 2 of the interactive collaboration session.
- the template 124 defines that only the categories column 210 will be displayed along with allowing the participants to add to the categories. This results in the leader window 800 being displayed to the leader that only has the category control button 211 and the add categories control button 213 set to green (black in FIG. 8 ).
- a corresponding participant window 300 is displayed to the participants of the interactive collaboration session as shown in FIG. 9 .
- the participant window 300 in FIG. 9 only shows the categories column 310 along with the participant add categories box 312 .
- the configuration associated with the selected step is then used to display the defined view for the leader and the participants.
- a context associated with the step button 810 is displayed.
- the context for step 2 is for a “review of categories” for activity ABC as shown near the top of the leader window 800 .
- Information that describes the step can be displayed in different areas in leader window 800 . For example, this information could be displayed on the step buttons 810 or be displayed via a popup window when a cursor is positioned above one of the step buttons 810 .
- the leader may still manually select one or more of the buttons 206 , 207 , 211 , 213 , 221 , 223 , 231 , 233 , 241 , and 251 to dynamically change the view in the participant window 300 during the interactive collaboration session. This gives the leader the ability to easily control the interactive collaboration session.
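- A step template of this kind can be sketched as an ordered list of stored button settings; selecting a step simply applies the stored configuration and updates the context string. The layout below is an assumption for illustration, loosely following the step 2 example above.

```python
# Illustrative step template: each step stores the toggle settings applied when it is selected.
TEMPLATE_124 = [
    {"name": "Step 1 (not described in the example)",
     "view": {"categories": False, "ideas": False, "comments": False},
     "add":  {"categories": False, "ideas": False, "comments": False}},
    {"name": "Review of categories",                       # step 2, as shown in FIG. 8
     "view": {"categories": True, "ideas": False, "comments": False},
     "add":  {"categories": True, "ideas": False, "comments": False}},
    {"name": "Discuss categories and give comments",       # step 3, as shown in FIG. 10
     "view": {"categories": True, "ideas": True, "comments": True},
     "add":  {"categories": False, "ideas": True, "comments": True}},
]


def apply_step(controls, template, step_index):
    # Set the leader's control buttons to the configuration defined for the selected step.
    step = template[step_index]
    controls["view"] = dict(step["view"])
    controls["add"] = dict(step["add"])
    return step["name"]     # context string shown near the top of the leader window


controls = {"view": {}, "add": {}}
print(apply_step(controls, TEMPLATE_124, 1))   # "Review of categories"
```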
- FIG. 10 is a diagram of a second view of a leader window 800 for controlling a collaboration framework that includes a stepping framework.
- the leader after selecting step 2 as described in FIG. 8 , selects the step button 810 C (as indicated by the grey background of step button 810 C) for step 3 of the interactive collaboration session.
- Information for step 3 of the interactive collaboration session in the template 124 is used to create the display as shown in FIG. 10 .
- buttons 221 , 223 , 231 , and 233 are toggled from grey to green (black in the drawings) based on the template 124 to indicate that the participants of the interactive collaboration session can now view and enter information in the ideas column 220 and the comments column 230 .
- This also results in the display of a corresponding participant window 300 that is displayed to the participants of the interactive collaboration session as shown in FIG. 11 .
- the participants can now add ideas and comments.
- when step 3 is selected, the activity context for step 3 also changes to “discuss categories and give comments” in the leader window 900. This gives the leader the context of this step (step 3). The leader then repeats this process through step N, which completes the interactive collaboration session.
- FIG. 12 is flow diagram of a method for controlling a display of a collaboration framework.
- the process starts in step 1200 .
- the process generates, in step 1202 , a structure of an interactive collaboration session that comprises one or more levels.
- the structure of the interactive collaboration session can be generated based on a template that is defined by the leader of the interactive collaboration session.
- the structure of the interactive collaboration session is displayed to a leader of the interactive collaboration session in step 1204 (e.g., as shown FIGS. 2 , 4 , 6 , 8 , and 10 ).
- Input is received from the leader to control the display of one or more levels of the structure of the interactive collaboration session in step 1206 (e.g., by the leader clicking on one of the buttons 206 , 207 , 211 , 213 , 221 , 223 , 231 , 233 , 241 , and 251 as described previously).
- a display of one or more of the levels that is displayed to a set of participants of the interactive collaboration session is controlled in step 1208 .
- the leader of the interactive collaboration session clicks on the category control button 211 , the participant categories column 310 is removed from the participant window 300 .
- the process determines in step 1210 if the interactive collaboration session is complete. If the interactive collaboration session is not complete in step 1210 , the process goes to step 1206 to receive additional input from the leader. Otherwise, if the interactive collaboration session is complete in step 1210 , the process ends in step 1212 .
- FIG. 13 is a flow diagram of a method for managing context information in an interactive collaboration session.
- the process starts in step 1300 .
- the process identifies a plurality of items in an interactive collaboration session in step 1302 .
- An item can be identified from a profile.
- An item can be an item associated with a level in the structure of the interactive collaboration session.
- an item can be an item in the leader categories column 210 and/or the participant categories column 310 such as “sales”, “PR”, and “marketing.
- An item can be an item in the columns 201 , 204 , 220 , 230 , 240 , 250 , 320 , 330 , 340 , 350 and/or the like.
- the item could be the column itself.
- the item may be text object, a graphical object, a button, an icon, a menu item, a tab, a menu bar, a row, a combination of these, and the like.
- Input to tag an individual item is received during the interactive collaboration session in step 1304 .
- a participant of the interactive collaboration session can like the “Yes she is good” item using the tag tray 720 in the participant comments column 310 of the participant window 300 .
- the context information is associated with the tagged item (e.g., the like that is associated with the tagged item “Yes she is good”) in step 1306 .
- a context frame 710 is associated with the tagged item in step 1308 .
- the associated context information (e.g., the like) is displayed to a participant (e.g., via a number that displays the total number of likes for the item) of the interactive collaboration session in step 1310 .
- information can be generated in a web page that is sent to a communication for display at a communication device.
- the process determines in step 1312 if the interactive collaboration session is complete in step 1312 . If the interactive collaboration session is not complete in step 1312 , the process goes to step 1304 to get more tagged items. Otherwise, if the interactive collaboration session is complete in step 1312 , the process ends in step 1314 .
Abstract
In an interactive collaboration session, different items of the interactive collaboration session are identified. For example, different agenda items or comments made during the collaboration session can be identified. A leader can use a tag tray to configure which tag types will be made available to the participants of their session, and how those tags will behave. A participant in the interactive collaboration session can provide input to apply a tag to an item during the interactive collaboration session. In response to the input to apply the tag to the item, context information is associated with the item. For example, a participant can indicate that they like a particular comment. A participant can use a context frame to view the context information associated with a selected item. In response to applying tags to the individual item, information is generated to display the applied tag information to the leaders and participants of the interactive collaboration session.
Description
- This application claims the benefit under 35 U.S.C. §119(e) of U.S. Provisional Application No. 61/787,164, filed Mar. 15, 2013, entitled “CONTROLLABLE DISPLAY OF A COLLABORATION FRAMEWORK SYSTEM,” the entire disclosure of which is incorporated herein by reference.
- The systems and methods disclosed herein relate to collaboration systems and, in particular, to collaboration frameworks.
- With the advent of interactive collaboration technologies, such as web conferencing, the ability for participants to collaborate has increased dramatically. In a fully interactive collaboration session, current systems are limited in their ability to dynamically capture and share context information between participants. As the interactive collaboration session progresses, it is sometimes difficult for participants to recall subjects that were discussed previously. One way to deal with this is to allow an interactive collaboration session to be recorded. A drawback with a recorded interactive collaboration session is that the information is not presented in a form that can easily be disseminated to users of an ongoing live interactive collaboration session.
- Systems and methods are provided to solve these and other problems and disadvantages of the prior art. In an interactive collaboration session, different items of the interactive collaboration session are identified. For example, different agenda items or comments made during the collaboration session can be identified. A participant in the interactive collaboration session can provide input to tag an individual item during the interactive collaboration session. In response to the input to tag the individual item, context information is associated with the individual item. For example, a participant can indicate that they like a particular comment made in the interactive collaboration session. A participant can associate a context frame, such as a window, with the individual item. In response to associating the context frame with the individual item, information is generated to display the associated context information to a first participant of the interactive collaboration session.
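As a concrete illustration of the flow just summarized, the following TypeScript sketch models an item, the context information attached to it by a tag, and the act of tagging. The names (Item, ContextEntry, tagItem) and the data shapes are illustrative assumptions only, not the implementation disclosed herein.

```typescript
// Minimal sketch of the tagging flow: hypothetical names and shapes.
type TagType = "like" | "dislike" | "question" | "document" | "comment";

interface ContextEntry {
  tag: TagType;          // kind of context information
  participantId: string; // who applied the tag
  payload?: string;      // e.g., question text or a document link
}

interface Item {
  id: string;
  text: string;            // e.g., an agenda item or a comment
  context: ContextEntry[]; // context information associated with the item
}

// Apply a tag to an item and return the context information that a
// context frame could then display for that item.
function tagItem(item: Item, entry: ContextEntry): ContextEntry[] {
  item.context.push(entry);
  return item.context;
}

// Example: a participant likes a comment made during the session.
const comment: Item = { id: "c1", text: "Yes, she is good", context: [] };
tagItem(comment, { tag: "like", participantId: "p42" });
console.log(comment.context.length); // 1
```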
- In one embodiment, tagging the individual item is accomplished by a second participant of the interactive collaboration session.
- In one embodiment, the tagged individual item is an item that accepts input from the second participant of the interactive collaboration session. The input is received by displaying a window for the second participant to input the context information.
- In one embodiment, a tag tray is used for tagging an individual item of the interactive collaboration session.
- In one embodiment, the tag tray displays a number that identifies a limit on a number of times a particular type of context information can be tagged in the interactive collaboration session.
- In one embodiment, tagging the individual item comprises dragging an icon to tag the individual item.
- In one embodiment, the context frame displays a number associated with an individual type of context information that represents a number of associated context items.
- In one embodiment, the tagged individual item comprises one or more of: a like, a dislike, an importance, a question, a clarification, a comment, a document, a contact, a favorite, a monetary value, a summary, a presentation, a photo, a value driver, and a link.
- In one embodiment, the context information comprises a like; the information to display the associated context information to the first participant of the interactive collaboration session comprises generating information for displaying a like button along with a number that indicates an associated number of likes. A selection of the like button is received from a participant. In response to receiving the selection of the like button, information is generated to display a participant who liked the individual item.
- In one embodiment, the context information comprises a document; the information to display the associated context information to the first participant of the interactive collaboration session comprises generating information for displaying a document button along with a number that indicates an associated number of documents. A selection of the document button by a participant is received. In response to receiving the selection of the document button, a link to a document associated with the individual item is displayed.
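Continuing the sketch above, the counts and detail views described in these embodiments (e.g., a like button with a number of likes, or a document button that expands to document links) could be derived from the accumulated context information roughly as follows. The types are restated locally and remain hypothetical.

```typescript
// Hypothetical context-information shapes, restated for self-containment.
type TagType = "like" | "dislike" | "document" | "question";
interface ContextEntry { tag: TagType; participantId: string; payload?: string; }

// Summarize tags into the per-type counts shown on the context frame.
function countByTag(context: ContextEntry[]): Map<TagType, number> {
  const counts = new Map<TagType, number>();
  for (const entry of context) {
    counts.set(entry.tag, (counts.get(entry.tag) ?? 0) + 1);
  }
  return counts; // e.g., like -> 3, document -> 1
}

// Expanding a tag button: list who liked the item, or the document
// links attached to it.
function detailsFor(context: ContextEntry[], tag: TagType): string[] {
  return context
    .filter((entry) => entry.tag === tag)
    .map((entry) => entry.payload ?? entry.participantId);
}
```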
FIG. 1 is a block diagram of a first illustrative system for controlling a display of a collaboration framework.
FIG. 2 is a diagram of a first view of a leader window for controlling a collaboration framework.
FIG. 3 is a diagram of a first view of a participant window of a controlled view in a collaboration framework.
FIG. 4 is a diagram of a second view of a leader window for controlling a collaboration framework.
FIG. 5 is a diagram of a second view of a participant window of a controlled view in a collaboration framework.
FIG. 6 is a diagram of a leader window for viewing context in a collaboration framework.
FIG. 7 is a diagram of a view of a participant window that contains a context frame for managing contributions to an interactive collaboration session.
FIG. 8 is a first diagram of a first view of a leader window for controlling a collaboration framework that includes a stepped activity framework.
FIG. 9 is a diagram of a first view of a participant window of a controlled view in a collaboration framework that uses a stepped activity framework.
FIG. 10 is a second diagram of a second view of a leader window for controlling a collaboration framework that includes a stepped activity framework.
FIG. 11 is a diagram of a second view of a participant window of a controlled view in a collaboration framework that uses a stepped activity framework.
FIG. 12 is a flow diagram of a method for controlling a display of a collaboration framework.
FIG. 13 is a flow diagram of a method for managing context information in an interactive collaboration session.
FIG. 1 is a block diagram of a first illustrative system 100 for controlling a display of a collaboration framework. The first illustrative system comprises communication devices 101A-101N, a network 110, and a collaboration system 120. - The
communication devices 101A-101N may be any device that can communicate on thenetwork 110, such as a Personal Computer (PC), a telephone, a video system, a cellular telephone, a Personal Digital Assistant (PDA), a tablet device, a notebook device, a smart phone, and the like. As shown inFIG. 1 , any number ofcommunication devices 101A-101N may be connected to thenetwork 110, including only a single communication device 101. In addition, the communication device 101 may be directly connected to thecollaboration system 120. - The communication device 101 comprises a
processor 102, amemory 103, and anetwork interface 104. Theprocessor 102 can be any type ofprocessor 102, such as a microprocessor, a multi-core processor, a Digital Signaling Processor (DSP), a microcontroller, and/or the like. Thememory 103 can be any type of memory, such as a Random Access Memory, a Flash Memory, a hard disk, and/or the like. Thenetwork interface 104 can be any type of interface, such as a wireless interface, a wired interface, a cellular interface, an Ethernet interface, and/or the like. Thenetwork interface 104 is used to communicate with thenetwork 110. - The communication device 101 generally includes participant output devices, such as a display, a speaker, a vibrator, and the like. The communication device 101 can include participant input devices such as a keyboard, a keypad, a touch screen, a mouse, a pointing device, and/or the like. The
processor 102 and thememory 103 can be used to execute programs. For example, theprocessor 102 and thememory 103 can be used to execute a web browser or other applications that work in conjunction with thecollaboration system 120. - The
network 110 can be any network that can send and receive information, such as the Internet, a Wide Area Network (WAN), a Local Area Network (LAN), the Public Switched Telephone Network (PSTN), a packet switched network, a circuit switched network, a cellular network, a combination of these, and the like. Thenetwork 110 can use a variety of protocols, such as Ethernet, Internet Protocol (IP), Session Initiation Protocol (SIP), Integrated Services Digital Network (ISDN), and the like. - The
collaboration system 120 can be any hardware/software that can communicate on thenetwork 110, such as a collaboration server, a multimedia server, a server, a web server, a web conferencing system, a communication system, a combination of these, and the like. For example, thecollaboration system 120 can include a server that incorporates the memory/data storage 127. The memory/data storage contains programming/computer instructions for implementing various modules or functions. Thecollaboration system 120 includes aprocessor 126 for executing the programming/computer instructions in the memory/data storage 127. The modules or function can communicate information via thenetwork interface 128 to thecommunication devices 101A-101N. - The
collaboration system 120 further comprises anauthentication module 121, avoting module 122, acollaboration module 123, atemplate 124, and adisplay module 125. Theauthentication module 121 can be any hardware/software that can authenticate a participant, such as, a private key authentication system, a digital certificate system, a password verification system, and/or the like. Theauthentication module 121 can use one or more encryption technologies such as Data Encryption Standard (DES), Public Key Encryption (PKI), Secure Socket Layer (SSL), Diffie Hellman, Advanced Encryption Standard (AES), and/or the like. Theauthentication module 121 is used to authenticate participants into thecollaboration system 120. - The
voting module 122 can be any hardware/software that can process votes from collaborating participants. Thevoting module 122 works in conjunction with thecollaboration module 123 to process votes of collaborating participants. - The
collaboration module 123 can be any hardware/software that allows participants at thecommunication devices 101A-101N to collaborate with one another. Thecollaboration module 123 works in conjunction with theauthentication module 121 andvoting module 122. Thecollaboration module 123 is designed to facilitate different types of collaborations between participants. - The
display module 125 can be any hardware/software that can generate a display of the interactive collaboration session and/or can generate information for display in the interactive collaboration session. For example, the display module 125 can be a web server, a video card, a browser, and the like. The display module 125 is shown as being part of the collaboration system 120. However, in some embodiments, the display module 125 may be in the communication devices 101A-101N and/or the collaboration system 120. - An interactive collaboration session typically occurs between a leader and a set of participants. The leader and the participants join the interactive collaboration session using the
communication devices 101A-101N. The leader of an interactive collaboration session is typically defined by a system administrator. The leader of the interactive collaboration session is typically a designated person that facilitates the flow of the interactive collaboration session. An interactive collaboration session can be any type of group activity where two or more people collaborate. For example an interactive collaboration session can be a brainstorming session, a discussion of one or more topics, a group meeting, a regularly scheduled meeting, and/or the like. In addition to leaders, there may be one or more defined co-leaders. The co-leader(s) may have all the same permissions as the leader or only a sub-set of permissions. For example, the co-leader may have all the permissions of the leader with the exception of setting default permissions for participants. - The leader, for example, at the
communication device 101A, logs in tocommunication system 120 via theauthentication module 121. The leader generates a collaboration activity using thecollaboration module 123. The collaboration activity (e.g., an agenda) is used to define the interactive collaboration session. For example, an activity of the interactive collaboration session can be a series of items that will be discussed during the interactive collaboration session. During this process, the leader defines a set of participants that will collaborate on the collaboration activity. The leader, via thecollaboration module 123, generates a structure of the interactive collaboration session. - The structure of an interactive collaboration session can be any structure defined by the leader or defined in the
template 124. For example, a hierarchical structure in a collaboration activity can comprise categories, ideas associated with each category, and comments associated with each idea that are viewed in levels (see FIG. 2 as an illustrative example). In other embodiments, the names of the structure of the interactive collaboration session can be different. The leader can also define the structure of the interactive collaboration session by using a structure wizard (not shown) that allows the leader to define the structure. The structure of the interactive collaboration session is displayed by the display module 125 to the leader (see FIG. 2 as an example). - Once the collaboration activity is defined, participants at the
communication devices 101B-101N authenticate to thecollaboration system 120 viaauthentication module 121. The participants select the collaboration activity they are going to join. The participants who have joined the collaboration activity are then displayed in a participant window for the selected activity (SeeFIGS. 3 and 5 as examples of participant windows 300) by thedisplay module 125. The view presented to the participants in aparticipant window 300 is typically different than the view presented to the leader in aleader window 200. - The leader can control which levels of the structure of the interactive collaboration session are displayed by the
display module 125 to the participants. This can be accomplished via input from the leader, such as clicking on a button, selecting a menu, via a voice command, via a gesture, and/or the like. This way the leader can select individual levels/groupings/items, multiple levels/groupings/items, or all the levels/groupings/items in the structure of the interactive collaboration session to display by thedisplay module 125 to the participants. These types of input can also be used to control other aspects/levels of the invention, such as agendas, rosters, instructions, documents, and/or the like. Thecollaboration module 123 uses the input from the leader to control the interactive collaboration session. - In another embodiment, the leader can divide the structure of the interactive collaboration process into two or more structures. For example, if the leader wants to break an existing interactive collaboration session into two different groups, one to discuss the current activity and a new group to discuss a new activity, the leader can add/create a new collaboration activity and take a subset of the participants in the existing activity and place them in the new activity. For example, if the current collaboration activity is Activity A with participants H, I, J, and K. The leader can divide the current activity into two activities (Activity A-1 and Activity A-2) and move the participants J and K into the Activity A-2. The leader can moderate the two sessions by toggling back and forth between the two activities. This can be done in a similar manner as done with separate activities (See
FIG. 2 ,items 202 and 203). Once the activities are divided, the participants of the two activities, based on a setting of the leader, may or may not be able to view the interactive collaboration session of the other activity. In another embodiment, the leader may be able to divide the activity into two or more activities where all the participants (or only specific ones) may be able to toggle back and forth between the two activities in a similar manner as the leader. - Once the interactive collaboration session for the two activities is complete, the leader can then merge the two activities back into a single activity. This can be accomplished by merging the corresponding categories of one activity with the categories of the other activity. Categories that came from Activity A-1 could have a flag that identifies that the categories came from the divided Activity A-1. Likewise, categories that came from Activity A-2 can be identified in a similar manner. The corresponding structure for the ideas associated with the categories and the comments associated with the ideas will be merged in a similar manner. This way all the categories, ideas, and comments will be available to the leader in the merged activity.
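A minimal sketch of the split-and-merge behavior described above, under the assumption that an activity tracks its participants and categories, and that merged categories carry a flag identifying the divided activity they came from. The Activity and Category shapes are illustrative, not prescribed by this disclosure.

```typescript
// Illustrative shapes; the origin flag marks where a merged category came from.
interface Category { name: string; origin?: string; }
interface Activity { name: string; participants: string[]; categories: Category[]; }

// Split: move a subset of participants into a new activity.
function splitActivity(src: Activity, newName: string, moved: string[]): Activity {
  src.participants = src.participants.filter((p) => !moved.includes(p));
  return { name: newName, participants: moved, categories: [] };
}

// Merge: fold the categories of a divided activity back in, flagging
// where each category came from so the leader can tell them apart.
function mergeActivities(target: Activity, divided: Activity): void {
  for (const category of divided.categories) {
    target.categories.push({ ...category, origin: divided.name });
  }
}

// Example: move participants J and K into "Activity A-2".
const a1: Activity = { name: "Activity A-1", participants: ["H", "I", "J", "K"], categories: [] };
const a2 = splitActivity(a1, "Activity A-2", ["J", "K"]);
console.log(a1.participants, a2.participants); // ["H", "I"] ["J", "K"]
```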
- In another embodiment, the leader can propose a vote using the
voting module 122 on a specific ballot item. The leader can have control over how the participants view the voting results based on clicking a button, menu, voice input, and/or the like. Individual ballot items on the vote can be controlled using Ballot Item Controls (Not pictured). The display of individual vote criteria can also be controlled using Vote Criteria displays (Not Pictured). - In another embodiment, the
collaboration module 123 can identify items (e.g., categories, ideas, comments, etc.) for the interactive collaboration session. Thecollaboration module 123 receives an input to tag an individual item in the interactive collaboration session (e.g., as shown inFIG. 7 ). For example, a participant can tag a comment that the participant likes the comment. Thecollaboration module 123 associates context information with the tagged item. For example, the like is associated with the tagged item. Another participant or the same participant can associate a context frame with the individual item of the plurality of items (as shown inFIG. 7 , element 710). Thedisplay module 125 can generate information to display the associated context information to the participant. For example, thedisplay module 125 can generate information to display the context information to the participant by sending a web page to the communication device 101. - Illustratively, the communication devices 101, the
collaboration system 120, theauthentication module 121, thevoting module 122 and thecollaboration module 123 are stored-program-controlled entities, such as a computer or processor, which performs the processes described herein by executing program instructions stored in a tangible computer readable storage medium, such as a memory or disk. -
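For orientation, the module decomposition described for the collaboration system 120 might be wired together roughly as sketched below. The interface names mirror the modules above, but every method signature is an assumption introduced for illustration.

```typescript
// Assumed signatures; only the module names follow the description.
interface AuthenticationModule { authenticate(user: string, secret: string): boolean; }
interface VotingModule { castVote(ballotItem: string, participantId: string, choice: string): void; }
interface CollaborationModule { handleLeaderInput(sessionId: string, input: string): void; }
interface DisplayModule { render(sessionId: string, userId: string): string; } // e.g., a web page

class CollaborationSystem {
  constructor(
    private auth: AuthenticationModule,
    private voting: VotingModule,
    private collaboration: CollaborationModule,
    private display: DisplayModule,
  ) {}

  // A participant joins: authenticate first, then return the rendered view.
  join(sessionId: string, user: string, secret: string): string | null {
    if (!this.auth.authenticate(user, secret)) return null;
    return this.display.render(sessionId, user);
  }
}
```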
FIG. 2 is a diagram of a first view of aleader window 200 for controlling a collaboration framework. In accordance with embodiments of the present disclosure, theleader window 200 is presented by communication device 101 of the leader. In addition, theleader window 200 can include various selectable buttons, menus, text boxes or other inputs that allow the leader to control the collaboration framework. Theleader window 200 comprises anactivity menu 201, collaboration activities 202-203, aroster 204,participant status buttons 205, an activity control button, 206, aparticipant control button 207, aleader category column 210, acategory control button 211, anadd category box 212, an addcategories control button 213, aleader idea column 220, anideas control button 221, anadd idea box 222, an addidea control button 223, aleader comment column 230, acomment control button 231, anadd comment box 232, an addcomment control button 233, aleader instructions area 240, a viewinstructions control button 241, a leader documentsarea 250, and a view documentscontrol button 251. - The
leader window 200 is the main window that the leader uses to control structure and levels of the interactive collaboration session. The leader can create a new collaboration activity by selecting theactivity menu 201 to add a new activity. In this example, the leader has created two activities: 1)Activity ABC 202, and 2)Activity XYZ 203. TheActivity ABC 202 is the current activity being displayed in leader window 200 (indicated by the grey background). If the leader selected theActivity XYZ 203, theleader window 200 would display the structure of the interactive collaboration session for theActivity XYZ 203, which can have a completely different structure - The
activity menu 201 also includes the activity control button 206. The activity control button 206, in this example, is a toggle button that allows the leader to control whether the participants can view activity items. In this example, the activity control button 206 has been toggled to grey. When toggled to grey, the participants will be unable to see the activity items (202 and 203). If toggled to green (green is represented by black in all attached black and white drawings), the participants will see the activity items 202 and 203. - The
roster 204 shows which participants of the interactive collaboration session are currently logged in and involved in the interactive collaboration session for theActivity ABC 202. In this example, three participants are currently involved in the interactive collaboration session for the Activity ABC 202 (indicated by the green color (black in the drawing) of theuser status button 205 for each user). A user can be either be a participant or a leader. The Leader may also control if a person is part of an interactive collaboration session by clicking on theuser status button 205 of each user. This would result in theuser status button 205 changing color (e.g., from green to red). Theparticipant status button 205 may also convey that a participant has been defined as part of the interactive collaboration session, but is currently not logged into the interactive collaboration session. The view that is presented to the leader can show one color for logged in and collaborating (e.g., green), one color for not logged in (e.g., grey), and another color for logged in, but where the leader has taken the participant out of the interactive collaboration session (e.g., red). The ability to take a participant in and out of the interactive collaboration session is useful based on the specific topic being discussed. - The leader can also control if all the participants can view the other participants in the interactive collaboration session by clicking the
participant control button 207. Theparticipant control button 207 if toggled to grey disables all participants from viewing the other participants in the interactive collaboration session. - The structure of the interactive collaboration session for the
Activity ABC 202, comprises three levels. The names given to these levels can be customized by the session leader to match the structure of collaboration that they intend to guide participants through. Levels may be named anything the leader wishes, such as Questions/Responses/Comments or Categories/Risks/Root causes. In the example shown in 202, the levels are named: 1) categories, 2) ideas, and 3) comments. In this example, there are three levels; however, in other embodiments, there can be more or less than three levels. Theleader instructions area 240 andleader documents area 250 can also be considered levels. The structure of the interactive collaboration session for theActivity ABC 202 is shown in theleader category column 210, theleader ideas column 220, and theleader comment column 230. Theleader category column 210 shows three categories for theactivity ABC 202. The categories, in this embodiment, are like an activity of the interactive collaboration session. In other alternatives, the categories can be used for a variety of purposes. For example, items in thecategory column 210 can be an idea, an event, an action, a concept, a question, a combination of these, and the like. - In this example, the
categories 210 are: 1) Sales, 2) Public Relations, and 3) Marketing. The number next to the categories indicates the total number items below that level. For example, Marketing shows a 4 (0) in theleader categories column 210. The 4 indicates that there are 4 total items in marketing (2 in ideas, and 2 in comments). The 0 indicates that all the items have been viewed by the leader. The Sales item in theleader category column 210 show a 2 (1). This indicates that there are two items below this item and that there is one item that has not been viewed by the leader. - In this example, the structure (i.e., the ideas and comments associated with a category) that is shown is a structure for Marketing (identified by the line under Marketing in the leader categories column 210). Marketing, as shown in the
leader ideas column 220, has two ideas: 1) whether to change their logo, and 2) should we hire Sally? The “should we hire Sally” item has been selected (indicated by the grey background), which results in the two comments that were added (in the leader comments column 230) being displayed, one that indicates that yes, she is good and a second one that indicates no, she cannot do X. The text items in each of the levels are defined as a contribution. A contribution can come from a leader, a co-leader, or a participant. - Each of the levels of the interactive collaboration session has a button (211, 221, and 231), that allows the leader to control the view of participants. The
category control button 211 allows the leader to control if the participants can view or not view the contributions in theleader categories column 210 in the participant window 300 (seeFIGS. 3 and 5 ). The ideas controlbutton 221 allows the leader to control if the participants can view or not view the contributions in theleader ideas column 220 in theparticipant window 300. Thecomment control button 231 allows the leader to control if the participants can view or not view the contributions in theleader comment column 230 in theparticipant window 300. - Each of the
buttons - Each of the
columns add category box 212 can add a new category to theleader category column 210. The leader, via theadd idea box 222 can add a new idea to theleader ideas column 220. The leader, via the add commentsbox 232 can add a new comment to theleader comments column 230. - The leader can also control whether the participants can add items (contributions) to each of the levels of the structure of the interactive collaboration session. The leader can select the add
categories control button 213 to allow or not allow a participant to add a new item to a participant category column (see FIG. 3). The leader can select the add idea control button 223 to allow or not allow a participant to add a new idea to the participant idea column. The leader can select the add comment control button 233 to allow or not allow a participant to add a new comment to the participant comment column. A contribution that is provided by a participant is typically anonymous. However, in some embodiments, the contribution provided by the participant will be marked with an identifier indicating the participant who made the contribution. In one embodiment, this is only known to the leader. In other embodiments, it may be known by the other participants. - The leader can also control if the participants can view instructions in the
leader instructions area 240. In some embodiments this area will be represented as a tab in the interface. This is accomplished by the leader selecting the viewinstructions control button 241. Likewise, the leader can control if participants can view and add documents to the leader documentsarea 250. In some embodiments, this area will be represented as a tab in the interface. The leader can toggle the view documentscontrol button 251 to control if participant can view and add and/or delete documents. Although not shown, control of viewing documents may be separate from control of adding and/or deleting documents. InFIG. 2 , both the viewinstructions control button 241 and the view documentscontrol button 251 have been toggled to green (black in the drawing), indicating that the participants will be able to view instructions and documents along with adding and deleting documents. - In other embodiments, the leader via a control button (not shown) may control if a participant can modify another participant's contribution. In another embodiment, the leader may control (enable or disable) if a participant can navigate the structure of the interactive collaboration session.
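The per-level controls described above (view and add rights for categories, ideas, comments, instructions, and documents) can be pictured as a small record of flags that the leader's toggle buttons flip. The following sketch is illustrative; the field names are assumptions.

```typescript
// Each level carries leader-controlled view/add flags; names are illustrative.
interface LevelControl {
  level: "categories" | "ideas" | "comments" | "instructions" | "documents";
  participantsCanView: boolean;
  participantsCanAdd: boolean;
}

// A toggle button flips one flag; the updated control would then be
// pushed to the participants so their windows re-render.
function toggleView(control: LevelControl): LevelControl {
  return { ...control, participantsCanView: !control.participantsCanView };
}

function toggleAdd(control: LevelControl): LevelControl {
  return { ...control, participantsCanAdd: !control.participantsCanAdd };
}

// Example: hide the instructions area from participants.
const instructions: LevelControl = { level: "instructions", participantsCanView: true, participantsCanAdd: false };
console.log(toggleView(instructions).participantsCanView); // false
```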
- The
leader window 200 is an example of a hierarchical structure of an interactive collaboration. However, in alternative embodiments, the structure of the interactive collaboration session may be based on different structures, such as based on tabs. The tabs may be used to define the structure of the interactive collaboration session. For example, instead of columns, there would be a tab for the different items, such as a tab for the categories, a tab for the ideas, and a tab for the comments (or tabs for different/additional groupings). A control button may be displayed on each of the tabs to allow the leader to enable/disable a tab from being viewed by the participants of the interactive collaboration session. - In another embodiment, a grid can be used to define the structure of the interactive collaboration session. A control button is placed on a row(s) and/or column(s). This allows the leader to enable/disable what rows/columns can be view by the participants of the interactive collaboration session.
- In a different embodiment, items of a menu can be used to define the structure of the interactive collaboration session. A control button is placed on menus and/or menu items. This allows the leader to enable/disable what menus/menu items can be viewed by the participants of the interactive collaboration session.
- In one embodiment the color of the
buttons -
FIG. 3 is a diagram of a first view of aparticipant window 300 of a controlled view in a collaboration framework. In this exemplary view, theparticipant window 300 comprises aparticipant categories column 310, aparticipant ideas column 320, a participant commentscolumn 330, aparticipant instruction area 340 and a participant documentsarea 350. Theparticipant window 300 also comprises a participant addcategories box 312, a participant addideas box 322, and a participant addcomments box 332. Theparticipant window 300 is the correspondingparticipant window 300 that is displayed to the participants based on the current settings ofleader window 200 as shown inFIG. 2 . - In the
leader window 200, theactivity control button 206 and theparticipant control button 207 inFIG. 2 have been toggled to grey. Thus, theActivity ABC 202 and theActivity XYZ 203 in theactivity menu 201, along with theroster 204 are not displayed in theparticipant window 300. - In the
leader window 200, thecategory control button 211, theideas control button 221, and thecomment control button 231 have been toggled to green (black in the drawing) to allow the participants to view each of the columns (210, 220, and 230). These selections result in theleader categories column 210, theleader idea column 220, and theleader comment column 220 of theleader window 200 being displayed respectively in theparticipant categories column 310, theparticipant idea column 320, and theparticipant comment column 330 of theparticipant window 300. - In addition, in the
leader window 200, the addcategories control button 213, the addidea control button 223, and the addcomment control button 233 have been toggled green (black in the drawing) to allow the participants to enter items into the respective columns (310, 320, and 330). This allows the participants (in addition to the leader) to add new items (contributions) to be displayed in both the category columns (210 and 310), the idea columns (220 and 320), and the comment columns (230 and 330). - In the
leader window 200, the view instructions control button 241 is toggled to green (black in the drawing) to indicate that the participants can view this field. This results in the leader instructions area 240 being displayed in the participant window 300. Likewise, the view documents control button 251 is toggled to green (black in the drawing). This results in the leader documents area 250 being displayed in the participant window 300.
FIG. 4 is a diagram of a second view of aleader window 200 for controlling a collaboration framework. The difference betweenFIG. 4 andFIG. 2 is that thecategory control button 211 has been toggled grey by the leader to indicate that the participants cannot view the information in theparticipant category column 310. In addition, the addidea control button 223 has been toggled to grey to indicate that the participants cannot add any new ideas to theideas columns 320. The instructions controlbutton 241 and the documents controlbutton 251 button have also been toggled to grey to indicate that the participants cannot view the information in therespective areas -
FIG. 5 is a diagram of a second view of a participant window 300 of a controlled view in a collaboration framework. Based on the category control button 211 being toggled (to grey) as shown in FIG. 4, the categories column 310 is no longer displayed in the participant window 300 shown in FIG. 5. Likewise, based on the add idea control button 223 being toggled (to grey) as shown in FIG. 4, the participant add ideas box 322 is no longer displayed in the participant window 300 as shown in FIG. 5. Based on the instructions control button 241 and the documents control button 251 being toggled, the participant instruction area 340 and the participant documents area 350 are no longer displayed in the participant window 300. - Since the leader can select various combinations of the
category control button 211, theideas control button 221, thecomment control button 231, theinstructions control button 241, and the documents controlbutton 251, the leader can easily control which levels of the structure of the interactive collaboration session are displayed to the participants. -
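Deriving the participant window from the leader's current combination of toggles can be sketched as a simple filter over those flags, as below. This is an illustrative reading of the behavior shown in FIGS. 2-5, not the disclosed implementation.

```typescript
// Only levels whose view flag is on appear in the participant window.
interface LevelToggle { level: string; participantsCanView: boolean; }

function participantView(toggles: LevelToggle[]): string[] {
  return toggles.filter((t) => t.participantsCanView).map((t) => t.level);
}

// Example mirroring FIG. 5: categories hidden, ideas and comments shown.
console.log(participantView([
  { level: "categories", participantsCanView: false },
  { level: "ideas", participantsCanView: true },
  { level: "comments", participantsCanView: true },
])); // -> ["ideas", "comments"]
```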
FIG. 6 is a diagram of aleader window 600 for viewing contributions in a collaboration framework. Although primarily used by a leader, theleader windows leader window 600 comprises the same items of leader window 200 (excluding theitems - The
leader window 600 also comprises asearch box 620, asearch button 621, asearch window 624. Thesearch window 624 further comprises an add tocategory button 622, an add toidea button 623, and asearch result 625. Thesearch box 620 allows the leader to enter a keyword(s) in order to search the structure of one or more activities 202-203. For example, as shown insearch box 620, the leader wants to search for the key word “Sales Schedule.” After entering the keyword(s), the leader selects thesearch button 621. This brings up thesearch window 624. In this example, there was one contribution that was found in searching theActivity ABC 202. The contribution was the statement “Is Sales Schedule really behind?” that is displayed insearch result 625. In this example, the contribution “Is sales schedule really behind?” is not shown in the structure because it is a comment off of the sales activity in Activity ABC. The leader can select thesearch result 625 and then select the add tocategory button 622 to add the selected contribution to theleader category column 210. In a similar manner, the participant can select the add toidea button 623 and add the selectedsearch result 625 to the currently displayedleader idea column 220. The leader can also select the go tobutton 624 to bring up the structure for the “Is sales schedule really behind?” contribution. -
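The keyword search performed from the search box 620 can be pictured as a depth-first walk over the category/idea/comment tree, with the "add to category" and "add to idea" buttons promoting a found contribution into a target column. The sketch below assumes a simple tree of contributions; the shapes are hypothetical.

```typescript
// Hypothetical contribution tree: categories -> ideas -> comments.
interface ContributionNode { text: string; children: ContributionNode[]; }

// Depth-first search for contributions containing the keyword.
function searchContributions(root: ContributionNode, keyword: string): ContributionNode[] {
  const hits: ContributionNode[] = [];
  const visit = (node: ContributionNode): void => {
    if (node.text.toLowerCase().includes(keyword.toLowerCase())) hits.push(node);
    node.children.forEach(visit);
  };
  visit(root);
  return hits;
}

// "Add to category" / "add to idea": copy a found contribution into a column.
function addToColumn(column: ContributionNode[], hit: ContributionNode): void {
  column.push({ text: hit.text, children: [] });
}
```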
FIG. 7 is a diagram of a view of aparticipant window 700 that contains acontext frame 710 for managing context information.FIG. 7 comprisesparticipant window 700,tag tray 720,document display window 730,dislike display window 731, and clarification enterwindow 732. Theparticipant window 700 comprises theparticipant ideas column 320, theparticipant comments column 330, and the participant addcategories box 312. Theparticipant window 700 also comprises thecontext frame 710. Thecontext frame 710 is described below from a participant view in theparticipant window 700. However, thecontext frame 710 can work in the same manner for theleader window 200. - In this example, the
context frame 710 is shown as being over the idea “Should we hire Sally.” Thecontext frame 710 works in conjunction with thetag tray 720 for associating context information with items (e.g., ideas or comments) of theparticipant window 700. Thecontext frame 710 further comprises a like tag button 711, adislike tag button 712, adocuments tag button 713, and an addquestion tag button 714. Although not shown, thecontext frame 710 can comprise a variety of buttons, such as an important tag button (to flag an item as important), a clarification tag button (where a participant can clarify an item), a comment tag button, and a contact button (where a participant can add a contact to an item), a favorites tag button, a monitory value tag button (where a participant can associate a monetary number with an item), a summary tag button (by default is not visible to participants), presentation tag button, a photo tag button, a link tag button (e.g., a link to a web site), a custom tag button and/or the like. Thecontext frame 710 can also comprise copy, cut, paste, delete, indent, out dent, and reorder buttons. The leader may control which buttons each participant sees in thecontext frame 710. For example, one participant may see different buttons than another participant, or participants may see a different set of buttons in each of the activities in a session. - Below each of the tag buttons 711-714 is a number. The number indicates an amount associated with the tag button 711-714 for the particular item that the
context frame 710 is over. For example, the like tag button 711 has 3 likes associated with the idea “Should we hire Sally.” Thedislike tag button 712 has one dislike associated with the idea “Should we hire Sally.” The documents tagbutton 713 has one document associated with the idea “Should we hire Sally.” The addquestion tag button 714 has two questions associated with the idea “Should we hire Sally.” In some embodiments, these numbers may not be displayed, and instead, the tag button 711 will appear on the context frame where it has been applied at least once, and not be displayed if it has not been applied to the information by any user. - The
tag tray 720 comprises a variety of icons for tagging different types of context information to items in theparticipant window 700. Thetag tray 720 comprises a like icon 721, a dislike icon 722, an importance icon 723, a question icon 724, a clarification icon 725, a comment icon 726, a document icon 727, and a contact icon 728. Thetag tray 720 can also include other icons for associating different types of context information, such as a favorites icon, a monetary value icon, a summary icon, a presentation icon (for holding a link to a presentation), a photo icon, a link icon (e.g., a link to a web page), a custom icon, and/or the like. The icons that a participant sees in theparticipant tag tray 720 can be controlled by the leader or can be an administered. - Each of the icons 721-728 in the
tag tray 720 may have a limit number associated with the icon. In this example, icons 721-722 and 724-728 have a limit number below the respective icon 721-722 and 724-728. The like icon 721 has the limit No. 10/5 below the like icon 721. The 10 represents the maximum number of likes that the participant can use for the activity (Activity ABC). In this example, the participant has used five likes out of the allowed ten likes. The importance icon 723 has a dash below the importance icon 723. The dash indicates that there is no limit for the importance icon. If an icon only has a single number below the icon, this indicates the maximum limit, but that the participant has not used any of the maximum limit. For example, the comment icon 724 shows that this particular participant has a limit of 10 comments, but the participant in this case has not used any of the 10 available comments for the activity ABC. - To associate context information with an item (contribution), the participant can select an icon 721-728 and drag the icon onto an item to select the item. For example, the participant could select the clarification icon 725 and drag the clarification icon 725 (as shown in step 740) onto the item “Yes, she is good.” This results in the
clarification enter window 732 being displayed so that the participant can enter a clarification of the comment “Yes, she is good.” The process of selecting an item by dragging an icon onto an item can be done for any of the icons 721-728. For example, the participant can drag the dislike icon 722 onto the item “Yes, she is good” item. - As the participants of the interactive collaboration session associate various types of context information with an item (e.g., the clarification just described), this information can then be displayed during the interactive collaboration session via the
context frame 710. For example, as a participant drags the like icon 721 onto an item, the other participants in the interactive collaboration session will see the added like for the item by moving the context frame over the item. In this example, if another participant dragged the like icon 721 onto the “Should we hire Sally” item, the participant ofparticipant window 700 would see the number under the like tag item 711 change from 3 to 4 in thecontext frame 710. - Via the
context frame 710, participants and the leader can see details of the context information associated with the each of the tag buttons 711-714 in thecontext frame 710. For example, if the participant or the leader selects thedocuments tag button 713, this results in the documents displaywindow 730 being displayed to the participant or leader instep 742. The documents displaywindow 730 shows the document that is associated with the item “Should we hire Sally,” which is the employee handbook.doc. The participant can then select the employee handbook.doc to open this document. - If the participant wanted to see who did not like the item “Should we hire Sally,” the participant or leader can click on the
dislike tag button 712. Thedislike display window 731 is then displayed instep 743 to the participant or leader that indicates, in this example, that Jack Hammer did not want to hire Sally. This can also be done in a similar manner for the like tag button 711. In some embodiments, the participant and/or leader may only be able to see that there is a dislike, but not be allowed to click on thedislike tag button 712 to display the particular participant that disliked the item. - The
context frame 710 can be displayed to a participant in various ways. For example, a context frame button (not shown) could be by each item that allows the participant to bring up thecontext frame 710 for each item. Alternatively, there can be a context frame icon in thetag tray 720 that allows a participant to select an item by dragging the context frame icon onto an item to display the context frame for that item. In one embodiment, the participant can drag thecontext frame 710 to different items in theparticipant window 700 to display the context information associated with the item. - The process of liking and disliking an item can be repeated by the same participant. For example, if Jack Hammer initially liked the item “Should we hire Sally” and then later on in the discussion decided that he no longer liked hiring Sally, Jack Hammer could dislike the item “Should we hire Sally.” This would decrement the like count under the like tag button 711 by one and increment the count under the
dislike tag button 712. - In another embodiment, the icons in the
tag tray 720 and the button on the context frame can change based on the item that is selected. For example, a different set of icons in thetag tray 720 and a different set of buttons on thecontext frame 720 may be displayed when the context frame is over the item “Yes, she is good.” In one embodiment, the icons in thetag tray 720 and/or context frame can be enabled or disabled based on the selected item. - Leaders can also create other tags that can be applied to any item/contribution. For example, tags could be created to feature items, include items in a report, identify items as belonging to a particular classification etc.
- In one embodiment, only the leader will have access to the
tag tray 720. The leader can use the tag tray 720 to determine which buttons 711-714 will be displayed in the context frame 710 to the participants. The individual participants can then select a button 711-714 and perform a drag-and-drop onto the item (e.g., item 710) to tag the item. For example, the leader could do a drag-and-drop of the like icon 721 onto the context frame 710 to add the like button 711 to the context frame 710 that is displayed to the participants. A participant could indicate that she likes the "Should We Hire Sally" item by dragging-and-dropping the like button 711 onto the item "Should We Hire Sally." - The leader can also remove a button 711-714 from the
context frame 710. This is accomplished by the leader dragging-and-dropping one or more of the buttons 711-714 onto the tag tray 720.
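The leader's configuration of the context frame by dragging tag icons onto it, and removal of buttons by dragging them back to the tag tray 720, amounts to adding and removing entries from a set of enabled tag types. A minimal sketch, with assumed names, follows.

```typescript
// The set of tag buttons shown in the context frame is a leader-edited
// set of tag types; all names here are assumptions.
type TagKind = "like" | "dislike" | "document" | "question" | "clarification";

interface ContextFrameConfig { buttons: Set<TagKind>; }

function addTagButton(config: ContextFrameConfig, tag: TagKind): void {
  config.buttons.add(tag);    // leader drags an icon from the tag tray onto the frame
}

function removeTagButton(config: ContextFrameConfig, tag: TagKind): void {
  config.buttons.delete(tag); // leader drags a button back onto the tag tray
}

// Example: enable likes and documents in the participants' context frame.
const config: ContextFrameConfig = { buttons: new Set<TagKind>() };
addTagButton(config, "like");
addTagButton(config, "document");
console.log([...config.buttons]); // ["like", "document"]
```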
FIG. 8 is a first diagram of a first view of aleader window 800 for controlling a collaboration framework that includes a stepping framework. Theleader window 800 is similar to theleader window 200 with the addition of astep frame 801. Thestep frame 801 comprisesstep buttons 810A-810N. Thestep frame 801 is used to step a leader through a defined series of steps in an interactive collaboration session by allowing the leader to click on eachstep button 810A-810N to advance to the next step in the interactive collaboration session. - A typical interactive collaboration session will include a series of steps or patterns. For example, the leader may want to discuss the category “sales” first in an interactive collaboration session with the participants. Next the leader may want to discuss the “marketing” category. In each step, the leader may only want the participants to view specific information in the interactive collaboration session. For example, when the leader wants to discuss the “sales” category, the leader may want to only display the ideas column 220 (controlled by the ideas control button 221) and the comments column 230 (controlled by the comment control button 231). In addition, the leader may only want to allow the addition of comments into the comment column (controlled by the add comment control button 233). For the “marketing” step, the leader may only want to display the
comments column 230 without allowing any comments by the participants. Normally the leader would have to manually select each of thebuttons participant window 200 for each step in the interactive collaboration session. Thestep frame 801 allows the leader to step through the interactive collaboration session by selecting thestep buttons 810A-810N to provide the desired view (by automatically selecting thebuttons - The steps of an interactive collaboration session can be defined by the leader or an administrator prior to the beginning of the interactive collaboration session in the
template 124. Thetemplate 124 can be generated in various ways. For example, the leader or administrator can set a particular set ofbuttons buttons template 124. - The
template 124 is then loaded as part of the interactive collaboration session. The loading of the template 124 causes the step frame 801 to be displayed in the leader window 800. In this example, the template 124 has N steps. Each of the steps is represented by one of the step buttons 810A-810N. The leader can go to a specific step by clicking on the specific step button 810 for the step that the leader wants displayed to the participants of the interactive collaboration session. In this example, there is a specific step button 810 associated with each step in the interactive collaboration session. However, in other embodiments, previous and next buttons could be used in place of the step buttons, or a combination of step buttons 810 and next and previous buttons can be used. - In
FIG. 8 the leader has selectedstep button 810B (as indicated by the grey background ofstep button 810B) to display the configuration associated withstep 2 of the interactive collaboration session. Forstep 2, thetemplate 124 defines that only thecategories column 210 will be displayed along with allowing the participants to add to the categories. This results in theleader window 800 being displayed to the leader that only has thecategory control button 211 and the addcategories control button 213 set to green (black inFIG. 8 ). A correspondingparticipant window 300 is displayed to the participants of the interactive collaboration session as shown inFIG. 9 . Theparticipant window 300 inFIG. 9 only shows thecategories column 310 along with the participant addcategories box 312. When the leader selects another step button 810, the configuration associated with the selected step is then used to display the defined view for the leader and the participants. - When the leader selects each of the step buttons, a context associated with the step button 810 is displayed. In this example, the context for
step 2 is for a “review of categories” for activity ABC as shown near the top of theleader window 800. Information that describes the step can be displayed in different areas inleader window 800. For example, this information could be displayed on the step buttons 810 or be displayed via a popup window when a cursor is positioned above one of the step buttons 810. - During a particular step, the leader may still manually select one or more of the
buttons participant window 300 during the interactive collaboration session. This gives the leader the ability to easily control the interactive collaboration session. -
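The step frame 801 can be pictured as a template of per-step control configurations: selecting a step button applies that step's set of enabled controls, after which the leader may still toggle controls manually. In the sketch below the control names are invented, while the step labels follow the examples given for FIGS. 8 and 10.

```typescript
// Each step stores which controls are toggled on; names are illustrative.
interface StepConfig {
  label: string;             // e.g., "Review of categories"
  enabledControls: string[]; // e.g., ["viewCategories", "addCategories"]
}

interface SessionTemplate { steps: StepConfig[]; }

// Selecting step button N applies that step's configuration.
function applyStep(template: SessionTemplate, index: number): Set<string> {
  const step = template.steps[index];
  if (!step) return new Set<string>(); // out-of-range step: nothing enabled
  return new Set(step.enabledControls);
}

// Example template: step 2 only shows categories and allows adding to them.
const template: SessionTemplate = {
  steps: [
    { label: "Kickoff", enabledControls: [] },
    { label: "Review of categories", enabledControls: ["viewCategories", "addCategories"] },
    { label: "Discuss categories and give comments",
      enabledControls: ["viewIdeas", "addIdeas", "viewComments", "addComments"] },
  ],
};
console.log(applyStep(template, 1)); // Set { "viewCategories", "addCategories" }
```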
FIG. 10 is a second diagram of a second view of aleader window 800 for controlling a collaboration framework that includes a stepping framework. InFIG. 8 , the leader, after selectingstep 2 as described inFIG. 8 , selects thestep button 810C (as indicated by the grey background ofstep button 810C) forstep 3 of the interactive collaboration session. Information forstep 3 of the interactive collaboration session in thetemplate 124 is used to create the display as shown inFIG. 10 . When the leader selects thestep button 810C forstep 3 in the interactive collaboration session, thebuttons template 124 to indicate that the participants of the interactive collaboration session can now view and enter information in theideas column 220 and thecomments column 230. This also results in the display of a correspondingparticipant window 300 that is displayed to the participants of the interactive collaboration session as shown inFIG. 11 . For this step of the interactive collaboration session, the participants can now add ideas and comments. - In
FIG. 10, the activity for step 3 also changes to "discuss categories and give comments" in the leader window 800. This gives the leader the context of this step (step 3). The leader then repeats this process through step N, which completes the interactive collaboration session.
FIG. 12 is a flow diagram of a method for controlling a display of a collaboration framework. The process starts in step 1200. The process generates, in step 1202, a structure of an interactive collaboration session that comprises one or more levels. The structure of the interactive collaboration session can be generated based on a template that is defined by the leader of the interactive collaboration session. - The structure of the interactive collaboration session is displayed to a leader of the interactive collaboration session in step 1204 (e.g., as shown
in FIGS. 2, 4, 6, 8, and 10). Input is received from the leader to control the display of one or more levels of the structure of the interactive collaboration session in step 1206 (e.g., by the leader clicking on one of the buttons in the leader window). - In response to the input from the leader in
step 1206, a display of one or more of the levels that is displayed to a set of participants of the interactive collaboration session is controlled in step 1208. For example, when the leader of the interactive collaboration session clicks on the category control button 211, the participant categories column 310 is removed from the participant window 300. The process determines in step 1210 if the interactive collaboration session is complete. If the interactive collaboration session is not complete in step 1210, the process goes to step 1206 to receive additional input from the leader. Otherwise, if the interactive collaboration session is complete in step 1210, the process ends in step 1212.
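A minimal sketch of the control loop of FIG. 12 follows, reusing the assumed types from the earlier sketch. The input and update callbacks are placeholders for whatever transport and rendering the system actually uses; this is an illustration under those assumptions, not the claimed implementation.

```typescript
// Hypothetical event loop mirroring steps 1206-1212 of FIG. 12.
interface LeaderInput {
  kind: "toggleColumn" | "selectStep" | "endSession";
  column?: ColumnId;
  step?: number;
}

function runSession(
  template: SessionTemplate,
  nextLeaderInput: () => LeaderInput,
  updateParticipants: (visible: ColumnId[]) => void
): void {
  let visible: ColumnId[] = []; // the levels currently shown in the participant window
  // Steps 1202/1204: the structure is generated from the template and shown to the leader (not modeled here).
  for (;;) {
    const input = nextLeaderInput();           // step 1206: receive input from the leader
    if (input.kind === "endSession") break;    // steps 1210/1212: session complete
    if (input.kind === "selectStep" && input.step !== undefined) {
      visible = applyStep(template, input.step).visibleColumns;
    } else if (input.kind === "toggleColumn" && input.column !== undefined) {
      visible = visible.includes(input.column)
        ? visible.filter((c) => c !== input.column) // e.g. hide the categories column
        : [...visible, input.column];
    }
    updateParticipants(visible);               // step 1208: control what the participants see
  }
}
```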
- FIG. 13 is a flow diagram of a method for managing context information in an interactive collaboration session. The process starts in step 1300. The process identifies a plurality of items in an interactive collaboration session in step 1302. An item can be identified from a profile. An item can be an item associated with a level in the structure of the interactive collaboration session. For example, an item can be an item in the leader categories column 210 and/or the participant categories column 310, such as “sales”, “PR”, and “marketing”. An item can also be an item in one of the other columns (e.g., the ideas column 220 or the comments column 230). - Input to tag an individual item is received during the interactive collaboration session in step 1304. For example, a participant of the interactive collaboration session can like the “Yes she is good” item using the
tag tray 720 in the participant comments column 310 of the participant window 300. The context information is associated with the tagged item (e.g., the like that is associated with the tagged item “Yes she is good”) in step 1306. A context frame 710 is associated with the tagged item in step 1308. - In response to step 1304, the associated context information (e.g., the like) is displayed to a participant (e.g., via a number that displays the total number of likes for the item) of the interactive collaboration session in
step 1310. For example, information can be generated in a web page that is sent to a communication device for display. The process determines in step 1312 if the interactive collaboration session is complete. If the interactive collaboration session is not complete in step 1312, the process goes to step 1304 to get more tagged items. Otherwise, if the interactive collaboration session is complete in step 1312, the process ends in step 1314. - Of course, various changes and modifications to the illustrative embodiment described above will be apparent to those skilled in the art. These changes and modifications can be made without departing from the spirit and the scope of the system and method and without diminishing its attendant advantages. The following claims specify the scope of the invention. Those skilled in the art will appreciate that the features described above can be combined in various ways to form multiple variations of the invention. As a result, the invention is not limited to the specific embodiments described above, but only by the following claims and their equivalents.
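For orientation, the tagging flow of FIG. 13 can be pictured with a short sketch. The type names, data shapes, and counting scheme below are assumptions for illustration only and are not part of the claimed method.

```typescript
// Hypothetical sketch of FIG. 13: context information is attached to an item
// and surfaced through its context frame as a per-type count.
type ContextType = "like" | "comment" | "document" | "question";

interface ContextItem {
  type: ContextType;
  participantId: string;
  payload?: string; // e.g. comment text or a document link
}

interface CollaborationItem {
  id: string;
  text: string;                                   // e.g. "Yes she is good"
  contextFrame: Map<ContextType, ContextItem[]>;  // step 1308: frame grouping tags by type
}

// Steps 1304/1306: receive a tag and associate the context information with the item.
function tagItem(item: CollaborationItem, tag: ContextItem): void {
  const existing = item.contextFrame.get(tag.type) ?? [];
  item.contextFrame.set(tag.type, [...existing, tag]);
}

// Step 1310: derive the count shown next to each context type (e.g. number of likes).
function contextCounts(item: CollaborationItem): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const [type, items] of item.contextFrame) counts[type] = items.length;
  return counts;
}

// Usage: a participant likes an item; the displayed like count becomes 1.
const likedItem: CollaborationItem = { id: "c1", text: "Yes she is good", contextFrame: new Map() };
tagItem(likedItem, { type: "like", participantId: "p2" });
console.log(contextCounts(likedItem)); // { like: 1 }
```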
Claims (20)
1. A method for collaboration comprising:
identifying a plurality of items for an interactive collaboration session;
receiving an input to tag an individual item of the plurality of items in the interactive collaboration session;
in response to receiving the input to tag the individual item of the plurality of items, associating context information with the individual item of the plurality of items;
associating a context frame with the individual item of the plurality of items; and
in response to associating the context frame with the individual item of the plurality of items, generating information to display the associated context information to a first participant of the interactive collaboration session.
2. The method of claim 1 , wherein tagging the individual item of the plurality of items is accomplished by a second participant of the interactive collaboration session.
3. The method of claim 2 , wherein the tagged individual item of the plurality of items is an item that accepts an input from the second participant of the interactive collaboration session and further comprising:
displaying a window for the second participant to input the context information.
4. The method of claim 1 , further comprising a tag tray or the context frame that is used for tagging the individual item of the plurality of items.
5. The method of claim 4 , wherein the tag tray displays a number that identifies a limit on a number of times a type of context information can be tagged in the interactive collaboration session.
6. The method of claim 4 , wherein tagging the individual item of the plurality of items comprises dragging an icon or button to tag the individual item of the plurality of items.
7. The method of claim 1 , wherein the context frame displays a number associated with an individual type of context information that represents a number of associated context items.
8. The method of claim 1 , wherein the tagged individual item of the plurality of items comprises one or more of: a like, a dislike, an importance, a question, a clarification, a comment, a document, a contact, a favorite, a monetary value, a summary, a presentation, a photo, a value driver, and a link.
9. The method of claim 1 , wherein the context information comprises a like, wherein generating the information to display the associated context information to the first participant of the interactive collaboration session comprises generating information for displaying a like button along with a number that indicates an associated number of likes and further comprising:
receiving a selection of the like button; and
in response to receiving the selection of the like button, generating information for displaying a participant who liked the individual item of the plurality of items.
10. The method of claim 1 , wherein the context information comprises a document, wherein generating the information to display the associated context information to the first participant of the interactive collaboration session comprises generating information for displaying a document button along with a number that indicates an associated number of documents and further comprising:
receiving a selection of the document button; and
in response to receiving the selection of the document button, displaying a link to a document associated with the individual item of the plurality of items.
11. A system for collaboration comprising:
a processor enabled collaboration module configured to identify a plurality of items for an interactive collaboration session, receive an input to tag an individual item of the plurality of items in the interactive collaboration session, associate context information with the individual item of the plurality of items in response to receiving the input to tag the individual item of the plurality of items, and associate a context frame with the individual item of the plurality of items, and
a processor enabled display module configured to generate information to display the associated context information to a first participant of the interactive collaboration session in response to associating the context frame with the individual item of the plurality of items.
12. The system of claim 11 , wherein tagging the individual item of the plurality of items is accomplished by a second participant of the interactive collaboration session.
13. The system of claim 12 , wherein the tagged individual item of the plurality of items is an item that accepts an input from the second participant of the interactive collaboration session, and wherein the collaboration module is further configured to display a window for the second participant to input the context information.
14. The system of claim 11 , further comprising a tag tray or the context frame that is used for tagging the individual item of the plurality of items.
15. The system of claim 14 , wherein the tag tray displays a number that identifies a limit on a number of times a type of context information can be tagged in the interactive collaboration session.
16. The system of claim 14 , wherein tagging the individual item of the plurality of items comprises dragging an icon or button to tag the individual item of the plurality of items.
17. The system of claim 11 , wherein the context frame displays a number associated with an individual type of context information that represents a number of associated context items.
18. The system of claim 11 , wherein the context information comprises a like, wherein generating the information to display the associated context information to the first participant of the interactive collaboration session comprises generating information for displaying a like button along with a number that indicates an associated number of likes, wherein the collaboration module is further configured to receive a selection of the like button, and wherein the display module is further configured to generate information for display about a participant who liked the individual item of the plurality of items in response to receiving the selection of the like button.
19. The system of claim 11 , wherein the context information comprises a document, wherein generating the information to display the associated context information to the first participant of the interactive collaboration session comprises generating information for displaying a document button along with a number that indicates an associated number of documents, wherein the collaboration module is further configured to receive a selection of the document button, and wherein the display module is further configured to generate information for displaying a link to a document associated with the individual item of the plurality of items in response to receiving the selection of the document button.
20. A method for collaboration comprising:
identifying a plurality of items for an interactive collaboration session;
receiving an input to tag an individual item of the plurality of items in the interactive collaboration session;
in response to receiving the input to tag the individual item of the plurality of items, associating context information with the individual item of the plurality of items;
associating a context frame with the individual item of the plurality of items;
in response to associating the context frame with the individual item of the plurality of items, generating information to display the associated context information to a first participant of the interactive collaboration session, wherein the context information comprises a like, wherein generating the information to display the associated context information to the first participant of the interactive collaboration session comprises generating information for displaying a like button along with a number that indicates an associated number of likes;
receiving a selection of the like button; and
in response to receiving the selection of the like button, generating information for displaying a participant who liked the individual item of the plurality of items.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/206,906 US20140282109A1 (en) | 2013-03-15 | 2014-03-12 | Context frame for sharing context information |
EP14160359.7A EP2779060A1 (en) | 2013-03-15 | 2014-03-17 | Context frame for sharing context information |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361787164P | 2013-03-15 | 2013-03-15 | |
US14/206,906 US20140282109A1 (en) | 2013-03-15 | 2014-03-12 | Context frame for sharing context information |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140282109A1 true US20140282109A1 (en) | 2014-09-18 |
Family
ID=51534443
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/206,906 Abandoned US20140282109A1 (en) | 2013-03-15 | 2014-03-12 | Context frame for sharing context information |
US14/206,638 Active 2034-08-27 US9483161B2 (en) | 2013-03-15 | 2014-03-12 | Controllable display of a collaboration framework system |
US15/157,244 Abandoned US20160259506A1 (en) | 2013-03-15 | 2016-05-17 | Controllable display of a collaboration framework system |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/206,638 Active 2034-08-27 US9483161B2 (en) | 2013-03-15 | 2014-03-12 | Controllable display of a collaboration framework system |
US15/157,244 Abandoned US20160259506A1 (en) | 2013-03-15 | 2016-05-17 | Controllable display of a collaboration framework system |
Country Status (1)
Country | Link |
---|---|
US (3) | US20140282109A1 (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140258939A1 (en) * | 2013-03-08 | 2014-09-11 | Information Resources, Inc. | Selection of hierarchically organized items |
US20140317502A1 (en) * | 2013-04-18 | 2014-10-23 | Next It Corporation | Virtual assistant focused user interfaces |
US20160269341A1 (en) * | 2015-03-11 | 2016-09-15 | Microsoft Technology Licensing, Llc | Distribution of endorsement indications in communication environments |
US9563618B2 (en) | 2009-09-22 | 2017-02-07 | Next It Corporation | Wearable-based virtual agents |
US9589579B2 (en) | 2008-01-15 | 2017-03-07 | Next It Corporation | Regression testing |
US9823811B2 (en) | 2013-12-31 | 2017-11-21 | Next It Corporation | Virtual assistant team identification |
US9824188B2 (en) | 2012-09-07 | 2017-11-21 | Next It Corporation | Conversational virtual healthcare assistant |
US9838347B2 (en) | 2015-03-11 | 2017-12-05 | Microsoft Technology Licensing, Llc | Tags in communication environments |
US9836177B2 (en) | 2011-12-30 | 2017-12-05 | Next IT Innovation Labs, LLC | Providing variable responses in a virtual-assistant environment |
US10210454B2 (en) | 2010-10-11 | 2019-02-19 | Verint Americas Inc. | System and method for providing distributed intelligent assistance |
US10379712B2 (en) | 2012-04-18 | 2019-08-13 | Verint Americas Inc. | Conversation user interface |
US10489434B2 (en) | 2008-12-12 | 2019-11-26 | Verint Americas Inc. | Leveraging concepts with information retrieval techniques and knowledge bases |
US10545648B2 (en) | 2014-09-09 | 2020-01-28 | Verint Americas Inc. | Evaluating conversation data based on risk factors |
US11196863B2 (en) | 2018-10-24 | 2021-12-07 | Verint Americas Inc. | Method and system for virtual assistant conversations |
US11507559B2 (en) * | 2020-05-25 | 2022-11-22 | Hewlett Packard Enterprise Development Lp | Object sharing by entities using a data structure |
US11568175B2 (en) | 2018-09-07 | 2023-01-31 | Verint Americas Inc. | Dynamic intent classification based on environment variables |
US11989521B2 (en) | 2018-10-19 | 2024-05-21 | Verint Americas Inc. | Natural language processing with non-ontological hierarchy models |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140282109A1 (en) * | 2013-03-15 | 2014-09-18 | GroupSystems Corporation d/b/a ThinkTank by GroupS | Context frame for sharing context information |
US20150193521A1 (en) * | 2014-01-09 | 2015-07-09 | Google Inc. | Methods for Generating an Activity Stream |
US10133460B2 (en) | 2014-03-26 | 2018-11-20 | Unanimous A.I., Inc. | Systems and methods for collaborative synchronous image selection |
US10110664B2 (en) * | 2014-03-26 | 2018-10-23 | Unanimous A. I., Inc. | Dynamic systems for optimization of real-time collaborative intelligence |
US10277645B2 (en) * | 2014-03-26 | 2019-04-30 | Unanimous A. I., Inc. | Suggestion and background modes for real-time collaborative intelligence systems |
US11269502B2 (en) | 2014-03-26 | 2022-03-08 | Unanimous A. I., Inc. | Interactive behavioral polling and machine learning for amplification of group intelligence |
US12099936B2 (en) | 2014-03-26 | 2024-09-24 | Unanimous A. I., Inc. | Systems and methods for curating an optimized population of networked forecasting participants from a baseline population |
US10310802B2 (en) | 2014-03-26 | 2019-06-04 | Unanimous A. I., Inc. | System and method for moderating real-time closed-loop collaborative decisions on mobile devices |
US12001667B2 (en) | 2014-03-26 | 2024-06-04 | Unanimous A. I., Inc. | Real-time collaborative slider-swarm with deadbands for amplified collective intelligence |
US9959028B2 (en) | 2014-03-26 | 2018-05-01 | Unanimous A. I., Inc. | Methods and systems for real-time closed-loop collaborative intelligence |
US10817159B2 (en) | 2014-03-26 | 2020-10-27 | Unanimous A. I., Inc. | Non-linear probabilistic wagering for amplified collective intelligence |
US11941239B2 (en) | 2014-03-26 | 2024-03-26 | Unanimous A.I., Inc. | System and method for enhanced collaborative forecasting |
US10817158B2 (en) | 2014-03-26 | 2020-10-27 | Unanimous A. I., Inc. | Method and system for a parallel distributed hyper-swarm for amplifying human intelligence |
US12079459B2 (en) | 2014-03-26 | 2024-09-03 | Unanimous A. I., Inc. | Hyper-swarm method and system for collaborative forecasting |
US11151460B2 (en) | 2014-03-26 | 2021-10-19 | Unanimous A. I., Inc. | Adaptive population optimization for amplifying the intelligence of crowds and swarms |
US9940006B2 (en) | 2014-03-26 | 2018-04-10 | Unanimous A. I., Inc. | Intuitive interfaces for real-time collaborative intelligence |
CA2886485C (en) * | 2014-03-31 | 2023-01-17 | Smart Technologies Ulc | Method for tracking displays during a collaboration session and interactive board employing same |
US9674243B2 (en) | 2014-09-05 | 2017-06-06 | Minerva Project, Inc. | System and method for tracking events and providing feedback in a virtual conference |
US10965622B2 (en) * | 2015-04-16 | 2021-03-30 | Samsung Electronics Co., Ltd. | Method and apparatus for recommending reply message |
USD786284S1 (en) * | 2015-05-21 | 2017-05-09 | Layer3 TV, Inc. | Display screen or portion thereof with an animated graphical user interface |
JP6523095B2 (en) * | 2015-07-30 | 2019-05-29 | 株式会社Mugenup | Thread display control system and program |
US9906755B1 (en) * | 2016-03-31 | 2018-02-27 | Biton, Llc | Method for collective contribution video creation and messaging |
JP2021157242A (en) * | 2020-03-25 | 2021-10-07 | 富士フイルムビジネスイノベーション株式会社 | Information processor and information processing program |
US11949638B1 (en) | 2023-03-04 | 2024-04-02 | Unanimous A. I., Inc. | Methods and systems for hyperchat conversations among large networked populations with collective intelligence amplification |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6314408B1 (en) * | 1997-07-15 | 2001-11-06 | Eroom Technology, Inc. | Method and apparatus for controlling access to a product |
US20050010589A1 (en) * | 2003-07-09 | 2005-01-13 | Microsoft Corporation | Drag and drop metadata editing |
US20090217175A1 (en) * | 2008-02-22 | 2009-08-27 | Accenture Global Services Gmbh | System for providing an interface for collaborative innovation |
US20120284606A1 (en) * | 2011-05-06 | 2012-11-08 | David H. Sitrick | System And Methodology For Collaboration Utilizing Combined Display With Evolving Common Shared Underlying Image |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7072940B1 (en) * | 2000-08-14 | 2006-07-04 | Ford Motor Company | System and method for managing communications and collaboration among team members |
US20020085030A1 (en) * | 2000-12-29 | 2002-07-04 | Jamal Ghani | Graphical user interface for an interactive collaboration system |
US7613687B2 (en) * | 2003-05-30 | 2009-11-03 | Truelocal Inc. | Systems and methods for enhancing web-based searching |
US9654425B2 (en) * | 2003-06-16 | 2017-05-16 | Meetup, Inc. | System and method for communicating among members of meeting groups |
US9264462B2 (en) * | 2003-06-16 | 2016-02-16 | Meetup, Inc. | System and method for confirming attendance for in-person meetings or events |
US20120179981A1 (en) * | 2011-01-07 | 2012-07-12 | Meetup, Inc. | Collaboration Meeting Management in a Web-Based Interactive Meeting Facility |
US9947053B2 (en) * | 2003-06-16 | 2018-04-17 | Meetup, Inc. | System and method for conditional group membership fees |
US7428704B2 (en) * | 2004-03-29 | 2008-09-23 | Lehman Brothers Holdings Inc. | Dynamic presentation generator |
US20060009992A1 (en) * | 2004-07-02 | 2006-01-12 | Cwiek Mark A | Method and system for assessing a community's preparedness, deterrence, and response capability for handling crisis situations |
US7676433B1 (en) * | 2005-03-24 | 2010-03-09 | Raf Technology, Inc. | Secure, confidential authentication with private data |
US20080244401A1 (en) * | 2007-04-02 | 2008-10-02 | Microsoft Corporation | User interface teaching concepts in an application |
US7895526B2 (en) * | 2007-04-03 | 2011-02-22 | Oracle International Corporation | User interface design for enabling non-sequential navigation, preview and review of branching and looping wizards |
US20100037151A1 (en) * | 2008-08-08 | 2010-02-11 | Ginger Ackerman | Multi-media conferencing system |
US8499041B2 (en) * | 2009-01-26 | 2013-07-30 | The Boeing Company | Collaborative browsing and related methods and systems |
US8739045B2 (en) * | 2011-03-02 | 2014-05-27 | Cisco Technology, Inc. | System and method for managing conversations for a meeting session in a network environment |
US20130080776A1 (en) * | 2011-09-28 | 2013-03-28 | John T. Elduff | Secure Document Collaboration |
US9280760B2 (en) * | 2011-10-26 | 2016-03-08 | Citrix Systems, Inc. | Integrated online workspaces |
US20140282109A1 (en) * | 2013-03-15 | 2014-09-18 | GroupSystems Corporation d/b/a ThinkTank by GroupS | Context frame for sharing context information |
- 2014
  - 2014-03-12 US US14/206,906 patent/US20140282109A1/en not_active Abandoned
  - 2014-03-12 US US14/206,638 patent/US9483161B2/en active Active
- 2016
  - 2016-05-17 US US15/157,244 patent/US20160259506A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6314408B1 (en) * | 1997-07-15 | 2001-11-06 | Eroom Technology, Inc. | Method and apparatus for controlling access to a product |
US20050010589A1 (en) * | 2003-07-09 | 2005-01-13 | Microsoft Corporation | Drag and drop metadata editing |
US20090217175A1 (en) * | 2008-02-22 | 2009-08-27 | Accenture Global Services Gmbh | System for providing an interface for collaborative innovation |
US20120284606A1 (en) * | 2011-05-06 | 2012-11-08 | David H. Sitrick | System And Methodology For Collaboration Utilizing Combined Display With Evolving Common Shared Underlying Image |
Non-Patent Citations (1)
Title |
---|
Nathan Linnell, Why Doesn't Facebook Share names who likes your page, Search Engine Watch, April 3, 2012, 3 pages *
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10109297B2 (en) | 2008-01-15 | 2018-10-23 | Verint Americas Inc. | Context-based virtual assistant conversations |
US9589579B2 (en) | 2008-01-15 | 2017-03-07 | Next It Corporation | Regression testing |
US10438610B2 (en) | 2008-01-15 | 2019-10-08 | Verint Americas Inc. | Virtual assistant conversations |
US10176827B2 (en) | 2008-01-15 | 2019-01-08 | Verint Americas Inc. | Active lab |
US10489434B2 (en) | 2008-12-12 | 2019-11-26 | Verint Americas Inc. | Leveraging concepts with information retrieval techniques and knowledge bases |
US11663253B2 (en) | 2008-12-12 | 2023-05-30 | Verint Americas Inc. | Leveraging concepts with information retrieval techniques and knowledge bases |
US11250072B2 (en) | 2009-09-22 | 2022-02-15 | Verint Americas Inc. | Apparatus, system, and method for natural language processing |
US11727066B2 (en) | 2009-09-22 | 2023-08-15 | Verint Americas Inc. | Apparatus, system, and method for natural language processing |
US9563618B2 (en) | 2009-09-22 | 2017-02-07 | Next It Corporation | Wearable-based virtual agents |
US10795944B2 (en) | 2009-09-22 | 2020-10-06 | Verint Americas Inc. | Deriving user intent from a prior communication |
US10210454B2 (en) | 2010-10-11 | 2019-02-19 | Verint Americas Inc. | System and method for providing distributed intelligent assistance |
US11403533B2 (en) | 2010-10-11 | 2022-08-02 | Verint Americas Inc. | System and method for providing distributed intelligent assistance |
US10983654B2 (en) | 2011-12-30 | 2021-04-20 | Verint Americas Inc. | Providing variable responses in a virtual-assistant environment |
US9836177B2 (en) | 2011-12-30 | 2017-12-05 | Next IT Innovation Labs, LLC | Providing variable responses in a virtual-assistant environment |
US11960694B2 (en) | 2011-12-30 | 2024-04-16 | Verint Americas Inc. | Method of using a virtual assistant |
US10379712B2 (en) | 2012-04-18 | 2019-08-13 | Verint Americas Inc. | Conversation user interface |
US11029918B2 (en) | 2012-09-07 | 2021-06-08 | Verint Americas Inc. | Conversational virtual healthcare assistant |
US11829684B2 (en) | 2012-09-07 | 2023-11-28 | Verint Americas Inc. | Conversational virtual healthcare assistant |
US9824188B2 (en) | 2012-09-07 | 2017-11-21 | Next It Corporation | Conversational virtual healthcare assistant |
US20140258939A1 (en) * | 2013-03-08 | 2014-09-11 | Information Resources, Inc. | Selection of hierarchically organized items |
US9606700B2 (en) * | 2013-03-08 | 2017-03-28 | Information Resources, Inc. | Selection of hierarchically organized items |
US10445115B2 (en) * | 2013-04-18 | 2019-10-15 | Verint Americas Inc. | Virtual assistant focused user interfaces |
US20140317502A1 (en) * | 2013-04-18 | 2014-10-23 | Next It Corporation | Virtual assistant focused user interfaces |
US11099867B2 (en) | 2013-04-18 | 2021-08-24 | Verint Americas Inc. | Virtual assistant focused user interfaces |
US9823811B2 (en) | 2013-12-31 | 2017-11-21 | Next It Corporation | Virtual assistant team identification |
US10088972B2 (en) | 2013-12-31 | 2018-10-02 | Verint Americas Inc. | Virtual assistant conversations |
US9830044B2 (en) | 2013-12-31 | 2017-11-28 | Next It Corporation | Virtual assistant team customization |
US10928976B2 (en) | 2013-12-31 | 2021-02-23 | Verint Americas Inc. | Virtual assistant acquisitions and training |
US10545648B2 (en) | 2014-09-09 | 2020-01-28 | Verint Americas Inc. | Evaluating conversation data based on risk factors |
US9838347B2 (en) | 2015-03-11 | 2017-12-05 | Microsoft Technology Licensing, Llc | Tags in communication environments |
US10462087B2 (en) | 2015-03-11 | 2019-10-29 | Microsoft Technology Licensing, Llc | Tags in communication environments |
US20160269341A1 (en) * | 2015-03-11 | 2016-09-15 | Microsoft Technology Licensing, Llc | Distribution of endorsement indications in communication environments |
US11568175B2 (en) | 2018-09-07 | 2023-01-31 | Verint Americas Inc. | Dynamic intent classification based on environment variables |
US11847423B2 (en) | 2018-09-07 | 2023-12-19 | Verint Americas Inc. | Dynamic intent classification based on environment variables |
US11989521B2 (en) | 2018-10-19 | 2024-05-21 | Verint Americas Inc. | Natural language processing with non-ontological hierarchy models |
US11196863B2 (en) | 2018-10-24 | 2021-12-07 | Verint Americas Inc. | Method and system for virtual assistant conversations |
US11825023B2 (en) | 2018-10-24 | 2023-11-21 | Verint Americas Inc. | Method and system for virtual assistant conversations |
US11507559B2 (en) * | 2020-05-25 | 2022-11-22 | Hewlett Packard Enterprise Development Lp | Object sharing by entities using a data structure |
Also Published As
Publication number | Publication date |
---|---|
US20140282108A1 (en) | 2014-09-18 |
US9483161B2 (en) | 2016-11-01 |
US20160259506A1 (en) | 2016-09-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9483161B2 (en) | Controllable display of a collaboration framework system | |
US10303347B1 (en) | Systems and methods for customizing sub-applications and dashboards in a digital huddle environment | |
TWI693523B (en) | Online collaboration systems and methods | |
CN109891446B (en) | Template-based calendar event with graphics enrichment | |
US11496460B2 (en) | Dynamic, customizable, controlled-access child outcome planning and administration resource | |
US8762870B2 (en) | Multifunction drag-and-drop selection tool for selection of data objects in a social network application | |
DE102017121758A1 (en) | Intelligent assistant for repeated actions | |
US20130246525A1 (en) | Instant transition from a public conversation thread to a private chat or instant message environment | |
US20130239180A1 (en) | Web-based conference collaboration tool with dynamic content and roles | |
DE102011010440A1 (en) | DEVICE SURFACES FOR USER ROLL, CONTEXT AND FUNCTION AND SUPPORT SYSTEM MASHUPS | |
US11729228B2 (en) | Systems and methods for sharing content externally from a group-based communication platform | |
CN103563344B (en) | Method and apparatus for joining a meeting using the presence status of a contact | |
US11652858B2 (en) | Integration of communication platform functionality with a third-party application | |
US12058185B2 (en) | Channel generation in a communication platform | |
US11079907B1 (en) | Systems and methods for reacting to messages | |
US10528211B2 (en) | Computing systems and processes for simultaneous co-development of dashboard interfaces | |
US11991135B2 (en) | Differentiated message presentation in a communication platform | |
US11822764B2 (en) | User interface for searching content of a communication platform using reaction icons | |
US20240095459A1 (en) | Topic Identification Based on Virtual Space Machine Learning Models | |
US20150278718A1 (en) | Systems and methods for communication sharing in a relationship management system | |
WO2023235149A1 (en) | Generating collaborative documents for virtual meetings in a communication platform | |
Comes et al. | From fuzzy exploration to transparent advice: insights into mobile advisory services | |
EP2779060A1 (en) | Context frame for sharing context information | |
US11294549B1 (en) | Systems and methods for customizing sub-applications and dashboards in a digital huddle environment | |
US11968244B1 (en) | Clustering virtual space servers based on communication platform data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GROUPSYSTEMS CORPORATION D/B/A THINKTANK BY GROUPS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WENGER, MATT;ALLISON, TRAVIS R.;MACLEAN, JUSTIN K.;SIGNING DATES FROM 20140310 TO 20140312;REEL/FRAME:032617/0512 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |