
US20140372955A1 - Visual selection of an anatomical element for requesting information about a medical condition - Google Patents


Info

Publication number
US20140372955A1
Authority
US
United States
Prior art keywords
user
annotation
anatomical
medical
medical conditions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/477,540
Inventor
Matthew M. Berry
Robert M. Berry
Wesley D. Chapman
Shawn B. Saunders
Michael V. Caldwell
Spencer T. Hall
Christopher T. Owens
Daniel D. Lyman
Darren L. Turetzky
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cochlear Ltd
Original Assignee
Orca Health Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed. "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License. https://patents.darts-ip.com/?family=46236179&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US20140372955(A1)
Application filed by Orca Health Inc filed Critical Orca Health Inc
Priority to US14/477,540
Assigned to Orca MD, LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BERRY, ROBERT M.; BERRY, MATTHEW M.; CALDWELL, MICHAEL V.; CHAPMAN, WESLEY D.; HALL, SPENCER T.; LYMAN, DANIEL D.; OWENS, CHRISTOPHER T.; SAUNDERS, SHAWN B.; TURETZKY, DARREN L.
Assigned to ORCA HEALTH, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignor: Orca MD, LLC
Publication of US20140372955A1
Assigned to WORKMAN NYDEGGER PC. LIEN (SEE DOCUMENT FOR DETAILS). Assignor: ORCA HEALTH, INC.
Assigned to ORCA HEALTH, INC. RELEASE OF ATTORNEY'S LIEN. Assignor: WORKMAN NYDEGGER
Assigned to COCHLEAR LIMITED. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignor: ORCA HEALTH INC.
Legal status: Abandoned

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/7475: User input or interface means, e.g. keyboard, pointing device, joystick
    • A61B 5/748: Selection of a region of interest, e.g. using a graphics tablet
    • G06F 19/321
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/48: Other medical applications
    • A61B 5/4824: Touch or pain perception evaluation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00: Handling natural language data
    • G06F 40/10: Text processing
    • G06F 40/166: Editing, e.g. inserting or deleting
    • G06F 40/169: Annotation, e.g. comment data or footnotes
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/20: ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/40: ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/50: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2560/00: Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B 2560/02: Operational features
    • A61B 2560/0295: Operational features adapted for recording user messages or annotations

Definitions

  • This invention relates to systems, methods, and computer program products related to the interactive annotation of displayed graphical elements and to providing users with information about selected graphical elements.
  • Annotation typically involves appending descriptive information to objects. For paper documents, for example, annotation can involve appending handwritten notes, markings, etc. to the content of the documents.
  • Implementations of the present invention include systems, methods and computer program products configured for annotating and sharing graphical elements, as well as for providing information surrounding selected graphical elements.
  • an annotation computing system provides one or more interactive user interfaces for managing, viewing, displaying, manipulating, and/or annotating the graphical elements.
  • One or more embodiments also provide a central clearinghouse for obtaining, storing, tracking, and sharing information, including annotated and/or un-annotated graphical elements.
  • some embodiments can extend to receiving information about experienced medical conditions and to providing medical information or advice about the inputted condition(s).
  • a method for providing information about a medical condition based on user input can include presenting a user with a user interface that includes an anatomical assembly representing an anatomical region of the human body.
  • the user interface can also include user-selectable display elements which, when selected, indicate areas in the anatomical assembly corresponding to areas in which a medical symptom is experienced.
  • User input selecting one or more of the user-selectable elements of the anatomical assembly can be received.
  • the user can be presented with a selection of medical conditions corresponding to the selected elements.
  • the user can be presented with information about one or more medical specialists knowledgeable about the medical conditions and the selected elements.
  • FIG. 1 illustrates a schematic diagram of an annotation environment for annotating and sharing records in accordance with one or more implementations of the invention
  • FIG. 2 illustrates a layout of a management user interface of an annotation system, in accordance with one or more implementations of the invention
  • FIG. 3 illustrates a layout of an annotation user interface of an annotation system, in accordance with one or more implementations of the invention
  • FIG. 4 illustrates a communications user interface 400 of an annotation system, in accordance with one or more implementations of the invention
  • FIG. 5A illustrates a flow diagram of communication paths within an annotation environment, in accordance with one or more implementations of the invention
  • FIG. 5B illustrates a flow diagram of communication paths within an annotation environment, in accordance with one or more implementations of the invention
  • FIG. 5C illustrates a flow diagram of communication paths within an annotation environment, in accordance with one or more implementations of the invention.
  • FIG. 6 illustrates a layout of an anatomical selection user interface, in accordance with one or more implementations of the invention
  • FIG. 7 illustrates a layout of a condition information user interface, in accordance with one or more implementations of the invention.
  • FIG. 8 illustrates a layout of a user interface which includes an interactive anatomical display, in accordance with one or more implementations of the invention
  • FIG. 9 illustrates a flowchart of a series of acts in a method, in accordance with an implementation of the present invention, for annotating a graphical element
  • FIG. 10 illustrates a flowchart of a series of acts in a method, in accordance with an implementation of the present invention, for providing information about a medical condition based on user input.
  • Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, as discussed in greater detail below.
  • Embodiments described herein are generally directed to methods, systems, and computer storage media configured for annotating graphical elements, which can include images, videos, models, and the like. Embodiments of the invention can also include accessing and transmitting the annotations.
  • an annotation computing system receives, generates, or obtains one or more graphical elements and provides one or more interactive user interfaces for managing, viewing/displaying, manipulating, and/or annotating the graphical elements.
  • Graphical element manipulation includes any form of image or video manipulation, such as rotating, zooming, color modification and/or adjustment, cropping, trimming, joining, and the like.
  • Graphical element annotation includes any form of descriptive commenting or enhancement to graphical elements, such as the addition of one or more highlights, selections, shapes, objects, textual comments, visual or audible enhancements, etc.
  • audible annotations can also be applied to and associated with the graphical elements.
  • Embodiments also include a central clearinghouse system for obtaining, storing, tracking, and sharing information, including annotated and/or un-annotated graphical elements.
  • a clearinghouse can connect the annotation computing system with third-party sources and destinations of information, other annotation computing systems, or other clearinghouses.
  • At least one embodiment also includes sharing and obtaining graphical elements separate from any clearinghouse(s).
  • some embodiments can extend to methods, systems, and computer storage media for inputting information about experienced medical conditions (e.g., experienced pain), and receiving medical information or advice about the inputted condition. These embodiments can extend to local or remote diagnosis, education, and investigation about medical conditions, tracking medical histories, etc. These embodiments can be employed separately from or in connection with the annotation computing system and the clearinghouse.
  • FIG. 1 illustrates a schematic diagram of an annotation environment 100 for annotating and sharing records, including graphical elements, in accordance with one or more implementations of the invention.
  • the annotation environment 100 can include an annotation system 104 communicatively coupled with one or more data sources 102 and one or more data destinations 108 .
  • the annotation environment 100 can also include the annotation system 104 communicatively coupled with a central record clearinghouse 106 , which can, in turn, be coupled with the data source(s) 102 and destination(s) 108 .
  • the clearinghouse 106 can serve as a central repository for storing and tracking records and for communicating records between the annotation system 104 , the data source(s) 102 , and the destination(s) 108 .
  • the annotation system 104 can comprise any computing system capable of receiving, sending, and managing records, as well as annotating graphical elements.
  • the annotation system 104 may take the form of a desktop or laptop computer, a personal digital assistant (PDA), a tablet computer (e.g., iPad), a phone device or other smart device, etc.
  • the annotation system 104 can include a management component 104 A, an annotation component 104 B, and a communications component 104 C.
  • the annotation system 104 can also include any additional components 104 D. It will be appreciated that the annotation system 104 can consist of a standalone server system or can comprise a distributed system, such as can be distributed throughout a computing cloud.
  • the management component 104 A can manage one or more textual, graphical and/or audible records.
  • a record, which can apply to any contextual topic, will include one or more graphical elements.
  • the annotation environment 100 can, in one or more implementations, comprise a medical annotation environment.
  • each record may be a different medical record for different patients, and each medical record may include one or more graphical elements (e.g., medical images).
  • each medical record may correspond with a different condition connected with the patient, a different doctor visit by the patient, different categories of disease, etc.
  • a record may simply be the graphical element itself.
  • the graphical elements can be any appropriate graphical elements including images, videos, and models.
  • the graphical elements can include images, videos, or models of human or animal anatomical features.
  • the graphical elements may be virtual anatomical models, Magnetic Resonance Imaging (MRI) scans, X-Ray images, Computerized Axial Tomography (CT or CAT) scans, photographs, endoscopic images or videos, etc.
  • the graphical elements will correspond directly to the applicable industry.
  • the graphical elements may be engineering schematics, design photographs, artwork, etc.
  • the management component 104 A can organize the records into one or more categories.
  • categories may include patients, doctors, medical conditions, hospitals or offices, severity levels, time periods, etc.
  • the management component 104 A can also provide functionality for adding or removing categories, adding or removing records or graphical elements, selecting one or more categories or records for viewing or editing, selecting one or more graphical elements for annotation, etc.
  • the management component 104 A can also provide functionality for sending or receiving records or graphical elements via the communications component 104 C.
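By way of illustration only, the division of labor among the components described above can be modeled as a simple composition. The following Python sketch is a hypothetical skeleton (all class and method names are assumptions, not taken from the patent) showing how a management component, an annotation component, and a communications component could be wired together in a system such as annotation system 104:

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """A record (e.g., a medical record) holding graphical elements."""
    record_id: str
    category: str                       # e.g., a patient name or condition
    graphical_elements: list = field(default_factory=list)

class ManagementComponent:
    """Organizes records into categories (cf. management component 104A)."""
    def __init__(self):
        self.records = {}

    def add_record(self, record):
        self.records[record.record_id] = record

    def find_by_category(self, category):
        return [r for r in self.records.values() if r.category == category]

class AnnotationComponent:
    """Attaches annotations to graphical elements (cf. annotation component 104B)."""
    def annotate(self, record, element_index, annotation):
        # Graphical elements are modeled here as plain dicts for simplicity.
        element = record.graphical_elements[element_index]
        element.setdefault("annotations", []).append(annotation)

class CommunicationsComponent:
    """Sends records to destinations (cf. communications component 104C)."""
    def send(self, record, destination):
        print(f"sending record {record.record_id} to {destination}")

class AnnotationSystem:
    """Composes the three components, mirroring annotation system 104."""
    def __init__(self):
        self.management = ManagementComponent()
        self.annotation = AnnotationComponent()
        self.communications = CommunicationsComponent()
```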
  • FIG. 2 illustrates a layout of a management user interface 200 of the annotation system 104 , for managing medical records, consistent with one or more implementations.
  • the management user interface 200 can include one or more interface controls 202 for adding categories and/or records. As illustrated, each category may correspond to a different patient, and each record may correspond to one or more graphical elements. However, other combinations are also possible.
  • medical record information can be stored and associated with different owners, different medical professionals, different conditions, etc.
  • Various property information and schemas can be used to group medical records or other medical or graphical information according to any desired need or preference.
  • the management user interface 200 can also include one or more interface controls 204 for searching existing categories and/or records. Additionally, the management user interface 200 can include a listing 206 of existing categories or records. As illustrated, the listing 206 can be an alphabetical listing of patients. However, any format of the listing 206 is available, such as drop-down menus, tables, combo boxes, etc. In connection with the listing 206 of categories or records, the management user interface 200 can also include a detailed view 208 of graphical elements or a display frame for displaying any combination of graphical elements available in a selected category or record. The graphical elements can be collated, grouped, and displayed in any desired format. The graphical elements can also comprise selectable links to other types of records, such as sound recordings or multimedia files, for example.
  • the management user interface 200 can be used to access, display, and annotate the graphical elements, or the linked-to files (e.g., sound files, multimedia files, etc.).
  • the management user interface 200 provides a doctor with a listing of medical records for his or her patients, including records associated with Aaron Anderson, who has been selected. Graphical elements, or any other elements relevant to Aaron Anderson, are shown.
  • the graphical elements can be grouped or categorized, such as by type, a date on which the graphical elements were generated or modified, a date on which the graphical elements were obtained by the annotation system 104 , a body part, or according to any other desired characterization.
  • the grouping can be performed automatically, when received, such as by parsing data on the records, or manually, as desired.
  • the management user interface 200 can also include one or more interface controls for obtaining additional records and/or graphical elements, and one or more interface controls for removing records and/or graphical elements. Additionally, the management user interface 200 can provide one or more interface controls for selecting a graphical element for making, editing, and/or accessing annotations. Further, the management user interface 200 can provide one or more interface controls for sending graphical elements to a destination 108 (e.g., patients, doctors, dentists, clearinghouses, or any other appropriate entity).
  • the annotation component 104 B of the annotation system 104 can provide one or more user interfaces and/or controls for annotating graphical elements.
  • Annotations can include any combination of colors, highlights, animations, images, audio, video, text, and the like, which is added to a record or an element of the record and which is thereafter associated with the record/record element.
  • the annotative elements can be applied from a library of available elements, or can be applied in a free-form manner (e.g. dynamic color selectors, image import, free-hand drawing).
  • pre-built or pre-scripted annotations are presented for selection and annotation of a graphical element.
  • Each annotation can also include descriptive comments or other information added by the annotator.
  • Descriptive comments can include text, drawings, images, audio recordings, video recordings, and the like.
  • the annotation component 104 B can apply optical character recognition or handwriting recognition technology to produce textual output from image or video data.
  • the annotation component 104 B can also apply text-to-speech or speech-to-text technology to produce audible output from textual data or to transcribe audible data to textual data.
  • the annotation system 104 may permit the automatic and dynamic conversion of these items (e.g., text can be dynamically converted to audio, and audio can be dynamically transcribed to text).
  • annotations can be visibly presented next to the corresponding element(s) that are annotated. This can also include presenting an annotation symbol comprising a selectable link to audio, textual, or graphical annotations.
  • the symbol or link can be presented as text or image data that is colored, sized, or presented in a distinguishing way and that, when selected, accesses or provides access to the annotation(s).
  • the graphical elements can comprise anatomical features (e.g., images, videos, anatomical models).
  • the annotations can be applied to highlight one or more injuries or conditions, to indicate the severity of injury, to illustrate treatment options, to indicate healing or degenerative progress over time, to illustrate medical procedures, to indicate medication options, and the like.
  • different shapes can illustrate different types of injuries or conditions, while different colors can represent different severity levels (or vice-versa).
  • the annotation can include modifying a display characteristic of a graphical element that was already included as part of the record, such as by enhancing coloring, contrast or other display characteristics.
  • a doctor may use the annotation system to track the treatment of a patient. With each visit, the doctor can capture a graphical representation of the patient's injury or condition, and make one or more annotations highlighting the healing or degenerative process.
  • the annotated graphical images represent a comprehensive temporal visual history of the treatment process, complete with targeted and appropriate comments in the form of annotations.
  • Graphical representations of the annotations can reflect different stages or diagnosis. This history may be useful to the doctor, other doctors, the patient, insurance carriers, educators, etc.
  • the patient can send graphical images to the doctor (e.g., between physical visits) so that the doctor can track progress and make detailed recommendations. These recommendations can be communicated back to the patient, at least in part, via annotations made to or with the graphical images.
  • the annotation component 104 B can also provide one or more user interfaces and/or controls for interactively manipulating the graphical elements prior to, during, or after annotation.
  • the annotation component 104 B can include tools for rotating, zooming, color adjustment, cropping, trimming, joining, and the like.
  • the annotation component 104 B can provide rich interactive features, including the ability to rotate the graphical elements in up to 360°, the ability to apply motion to the graphical elements, the ability to display annotations or animations, etc. This interactivity is illustrated in greater detail in subsequent Figures.
  • the annotation component 104 B can display and annotate both two-dimensional (2D) graphical elements and three-dimensional (3D) graphical elements. Furthermore, the annotation component 104 B can convert 2D graphical elements to 3D graphical elements, and vice-versa. For example, the annotation component 104 B can convert a 2D graphical element to three dimensions by using automatic algorithms, user input, model information, etc. Additionally, the annotation component 104 B can convert a 3D graphical element to two dimensions via flattening, cross-sectioning, etc.
  • When applied to 3D graphical elements, annotations may be visible only when the annotation is rotated into view. Alternatively, annotations may be made visible through a visual cue that becomes more prominent as the annotation is rotated into view. Of course, annotations may also be permanently visible regardless of the current view. Similarly, when applied to video graphical elements, annotations may only be visible during temporal periods in which the annotation was applied, or may be permanently visible or accessible through bookmarks, time stamps, etc. Annotations can also be selectively displayed in response to receiving predetermined input, such as, but not limited to, authorization information, query information, and so forth.
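The visibility behaviors described above reduce to simple predicates over the current view state. The sketch below is a hedged illustration (the function names and the 90° visibility window are assumptions, not from the patent) of how rotation-dependent and time-dependent annotation visibility could be computed:

```python
def annotation_visible_3d(annotation_angle, view_angle, window=90.0):
    """A 3D annotation is visible when the current view angle lies within
    `window` degrees of the angle at which the annotation was placed."""
    # Wrap the angular difference into [-180, 180] before comparing.
    diff = abs((view_angle - annotation_angle + 180.0) % 360.0 - 180.0)
    return diff <= window / 2.0

def annotation_visible_video(start, end, playback_time):
    """A video annotation is visible only during the temporal period in
    which it was applied."""
    return start <= playback_time <= end

# An annotation placed at 30 degrees is visible at a 60-degree view...
assert annotation_visible_3d(30.0, 60.0)
# ...but not once the element is rotated to face away.
assert not annotation_visible_3d(30.0, 210.0)
# A video annotation spanning seconds 12-18 is visible at playback time 15.
assert annotation_visible_video(12.0, 18.0, 15.0)
```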
  • FIG. 3 illustrates one embodiment of an annotation user interface 300 layout that can be utilized by the annotation system 104 .
  • the annotation user interface 300 can display one or more graphical elements, which in this circumstance comprise the image of the skeletal anatomy of a human shoulder.
  • the annotation user interface 300 also provides one or more annotation interface controls 302 for annotating the image, configuring the annotation user interface 300 , etc.
  • the user interface 300 can present one or more annotation dialogues 304 , 306 , 308 , along with any appropriate annotation options.
  • the annotation dialogue ( 304 , 306 , 308 ) may appear after selection of one or more areas of the graphical element.
  • an annotation dialogue ( 304 , 306 , 308 ) may appear after a user clicks, touches, or otherwise selects a point or region of the graphical element, with the annotation applying to the selected element.
  • the user can customize the annotation in any appropriate manner using the annotation options associated with the annotation dialogue ( 304 , 306 , 308 ).
  • the annotation user interface 300 can also include annotation options that are separate from an annotation dialogue.
  • the annotation options can include selection from among dots, circles, arrows, text input, audio recording, audio playback, etc. that are to be applied to a point or region of the graphical element (i.e., the image of the shoulder).
  • the annotation options are not limited to those shown.
  • the annotation options can also include options to add (or remove) one or more colors, to add (or remove) a plurality of shapes or graphical objects (e.g., polygons, arrows, bullets, stars, numbers), to add (or remove) animations or images, etc.
  • Annotation dialogue 304 includes several annotation options, such as an annotation option 310 for recording audio or video (shown here as selected and currently recording).
  • annotation dialogue 304 shows that an annotation option 312 for no highlighting has been selected.
  • In annotation dialogue 306 , the annotation option 310 for recording audio has not been selected.
  • an annotation option 314 for selecting a circle has been selected.
  • a circular highlight 328 is displayed on the shoulder corresponding to the area of annotation.
  • selecting the annotation option 314 may also present additional menus for selecting alternate shapes, colors, sizes, transparency, etc.
  • Annotation dialogue 308 shows that an annotation option 316 for selecting a pointer has been selected.
  • an arrow 320 is displayed on the shoulder which points to the selected area of annotation.
  • additional menus may be used to select a particular pointer type, color, size, shape, transparency, etc.
  • annotation dialogue 308 shows that media (e.g., audio or video) has been recorded, which is associated with element 320 , and that the media can be played back or deleted using the corresponding controls ( 318 and 320 ) and interface menus.
  • annotation dialogues 304 , 306 , 308 are exemplary only and do not limit the annotation dialogues available to the present invention.
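To make the preceding discussion concrete, the state behind one such dialogue can be pictured as a small option bundle. The following Python sketch is illustrative only (the field names and defaults are assumptions keyed to the reference numerals of FIG. 3):

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class HighlightShape(Enum):
    NONE = auto()      # cf. annotation option 312 (no highlighting)
    DOT = auto()
    CIRCLE = auto()    # cf. annotation option 314
    POINTER = auto()   # cf. annotation option 316 (arrow/pointer)

@dataclass
class AnnotationDialogueState:
    """Option bundle behind an annotation dialogue (cf. 304, 306, 308)."""
    shape: HighlightShape = HighlightShape.NONE
    color: str = "#ff0000"              # adjustable via additional menus
    size: float = 1.0
    transparency: float = 0.0
    recording_media: bool = False       # cf. annotation option 310
    media_clip: Optional[bytes] = None  # recorded audio/video, if any

# Dialogue 306 of FIG. 3: a circle highlight is selected, nothing recording.
dialogue_306 = AnnotationDialogueState(shape=HighlightShape.CIRCLE)
```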
  • the annotation user interface 300 can also display relevant identifying information 322 for the graphical element.
  • the identifying information 322 can include a patient's name, a date (e.g., the date when the graphical element was generated, or the date when the annotation system 104 received the graphical element), and an image identifier, etc.
  • the identifying information 322 can be part of the graphical element itself, and can be used as the basis for gathering additional information about the graphical element (e.g., through the use of optical character recognition technology).
  • the annotation user interface 300 can also provide a plurality of user interface controls for manipulating the graphical element.
  • FIG. 3 illustrates that the annotation user interface 300 can include a brightness/contrast selector 324 that provides one or more user interface controls for adjusting the brightness and contrast of the image of the graphical element.
  • any number of manipulation controls can be provided for manipulating graphical elements (including size, shape, brightness, contrast, animations, etc.), whether they be image, video, or otherwise.
  • Such manipulation controls can include user interface controls for cropping, rotating, trimming, joining, etc.
  • the annotation user interface can provide any number of interactivity controls that provide rich user interaction with the graphical element.
  • These controls can include one or more user interface controls for rotating the graphical element in up to 360°, zooming, rendering motion to the graphical element, converting the graphical element from three dimensions to two dimensions (and vice-versa), etc.
  • the communications component 104 C of the annotation system 104 can receive records and graphical elements from the data source(s) 102 (either directly, or via the clearinghouse 106 ) and can send records and graphical elements to the clearinghouse 106 and/or to the destination(s) 108 .
  • the communications component 104 C can communicate directly with the data source(s) 102 and the destination(s) 108 , or can communicate indirectly (e.g., via the clearinghouse 106 ).
  • the communications component 104 C may comprise a plurality of components, such as one for receiving data and one for sending data.
  • communications within the annotation environment 100 can occur over any appropriate connections, such as wireless connections 110 A (e.g., WiFi, Bluetooth, infra-red), or wired connections 110 B (e.g., network, USB, FireWire, internal bus). It will be appreciated that connections can be made directly (e.g., via a USB connection between the data source 102 and the annotation system 104 ), or indirectly (e.g., when communications occur over a local or wide area network, or when communications occur via the clearinghouse 106 ). Regardless of the connection type, communications protocols can also take a variety of forms, such as electronic messages 110 C or any other communications protocols 110 D. As indicated by the vertical ellipses 110 E, any other appropriate connection types and communications protocols are available within the annotation environment 100 . For example, one or more implementations may use facsimile, SMS, MMS, and so forth.
  • the data source(s) 102 should be construed broadly to include any appropriate source(s) of records and/or graphical elements.
  • graphical elements can originate from one or more doctors 102 A (e.g., family practice doctors, radiologists, emergency care doctors, physical therapists, dentists, veterinarians), one or more patients 102 B, one or more hospitals 102 C, or one or more insurance providers 102 D.
  • the graphical elements can also originate from and be automatically attached to a record by the annotation system 104 (e.g., from a still camera or a video camera 102 E, a motion capture device, or any other acquisition device at the annotation system 104 , or from a direct upload to the annotation system 104 , in combination with system logic and tagging mechanisms).
  • the data source 102 is separate from other illustrated components, such as the annotation system 104 and the clearinghouse 106 , while in other instances the data source 102 is the same as or part of the other illustrated components.
  • the data source 102 can include any number of additional sources of data for records that can be annotated or from which annotations can be derived.
  • the annotation system 104 can also import information (e.g., patient contact information) automatically from one or more external databases.
  • the annotation system 104 can also use internal sources.
  • the annotation system 104 obtains at least some information using optical character recognition of information included in the graphical elements (e.g., identifying information 306 ).
  • the annotation system 104 can also utilize speech recognition and/or handwriting recognition.
  • the graphical elements can be partially or entirely virtual (e.g., computer models). Additionally, graphical elements can incorporate, at least in part, motion capture data, metadata, audio data, etc.
  • the destination(s) 108 should also be construed broadly to include any appropriate destination of records and/or graphical elements.
  • records and/or graphical elements can be sent to one or more doctors 108 A, one or more patients 108 B, one or more hospitals 108 C, one or more insurance carriers 108 D, or even other annotation systems 108 E.
  • other destinations or combinations of destinations are also possible.
  • the data source 102 and the destination 108 may be the same.
  • a patient 102 B may send graphical element(s) to an annotation system 104 operated by his or her doctor. After receiving the graphical element(s), the doctor can annotate the graphical element(s) and send the annotated graphical element(s) back to the patient 108 B and/or to another doctor 108 A.
  • patients can be provided access to records associated with their own medical history and/or their family members' medical histories.
  • In another example, a doctor 102 A (e.g., a radiologist) can annotate an x-ray; the radiologist can subsequently send the annotated x-ray to any appropriate destination 108 (e.g., a family doctor 108 A, the patient 108 B, a hospital 108 C).
  • medical records are made available to insurance entities corresponding to carriers of insurance policies. This way an insurance representative can access and view corresponding medical records and related data for the various carriers. In educational settings, students or professors can be provided access to historical medical records.
  • the various medical records can be annotated and stored with the medical annotations for review and later access according to the invention.
  • the medical records and annotations are stored separately, but remain linked by data maintained at the annotation system.
  • the record is sent with the appropriate annotations that are relevant and/or authorized for each corresponding recipient.
  • all annotations are included with the medical record and filtering is applied at the recipient system(s) to display only authorized annotations.
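Either placement of the filtering step comes down to the same operation: keeping only the annotations whose authorization covers a given recipient. A minimal sketch, assuming a per-annotation "authorized_for" list (an invented schema, not the patent's):

```python
def filter_annotations(annotations, recipient):
    """Keep only annotations the recipient is authorized to see. An empty
    'authorized_for' list is treated here as visible to everyone (an
    assumption made for this sketch)."""
    return [a for a in annotations
            if not a.get("authorized_for") or recipient in a["authorized_for"]]

record_annotations = [
    {"text": "hairline fracture noted", "authorized_for": []},
    {"text": "billing detail", "authorized_for": ["insurance_108D"]},
    {"text": "referral comment", "authorized_for": ["doctor_108A"]},
]

# The patient receives only the unrestricted annotation.
print(filter_annotations(record_annotations, "patient_108B"))
```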
  • FIG. 4 illustrates a communications user interface 400 of the annotation system 104 , in accordance with one or more implementations, with which a user can send records and graphical elements (annotated or un-annotated) to destination(s) 108 and/or the clearinghouse 106 .
  • the user can customize a message or other data that accompanies the records and/or graphical elements as they are sent. For instance, a doctor may customize a message to another doctor or a patient, or may make any other comments. This data can be stored and transmitted/accessed with the other image data (like X-rays) or other medical records associated with a patient.
  • While FIG. 4 illustrates the composition of an e-mail message, the communications user interface 400 can, in other embodiments, send graphical elements in other forms (e.g., MMS, direct upload).
  • the clearinghouse 106 can comprise one or more computing systems configured to receive and make records, graphical images, etc., accessible to data sources 102 , destinations 108 , and the annotation system 104 .
  • the clearinghouse 106 can, in some embodiments, comprise a “cloud” configuration which includes one or more servers that are separated from the annotation system 104 .
  • the clearinghouse 106 may also be a part of the annotation system 104 itself.
  • the clearinghouse 106 can employ any appropriate security and authentication mechanisms to protect any data stored therein from unauthorized access. Thus, communications between the clearinghouse 106 and any other component can be secured.
  • the annotation environment 100 can be used to communicate graphical elements between a variety of sources and destinations in a variety of contexts beyond the medical field. For example, scientists can use the annotation environment 100 to share and annotate their research data. Furthermore, engineers can use the annotation environment 100 to share and annotate their designs. Still further, artists can use the annotation environment 100 to share and annotate their artwork. Accordingly, as mentioned previously, while some of the Figures illustrate implementations related to the medical field, the disclosure herein should not be viewed as limiting the annotation environment 100 to use by the medical field.
  • In FIGS. 5A-5C , a plurality of flow diagrams is illustrated to reflect some of the communication paths that can be utilized within the annotation environment 100 in accordance with one or more implementations.
  • the flow diagrams are for illustrative purposes only, and other communication paths are also possible.
  • Each of the flow diagrams may use all or only a part of the illustrated annotation environment 100 , as well as other components not previously discussed.
  • one or more implementations may include a web service 502 which may be provided in connection with, or separate from, the clearinghouse 106 .
  • FIG. 5A illustrates an embodiment in which the annotation system 104 communicates directly with the data source(s) 102 and the destination(s) 108 .
  • the annotation environment 100 may lack the clearinghouse 106 , or the annotation system 104 may refrain from using the clearinghouse 106 .
  • For example, a data source 102 (e.g., a doctor 102 A) can provide graphical element(s) to the annotation system 104 : the annotation system 104 may acquire the graphical element(s) directly with an acquisition device (e.g., a camera 102 E), or the graphical element(s) may be sent to the annotation system 104 via a wireless 110 A or a hard-wired 110 B connection.
  • the annotated graphical element(s) may be sent to a destination 108 (such as to a patient 108 B).
  • the doctor 102 A may use the communications user interface 400 at the annotation system 104 to send the annotated graphical element(s), along with any comments, to the patient 108 B via e-mail, MMS, direct upload, etc.
  • FIG. 5B illustrates an additional embodiment in which the annotation system 104 communicates with the data source(s) 102 and the destination(s) 108 via a web service 502 .
  • a doctor 102 A may upload a patient's records and/or graphical elements to the web service 502 .
  • the annotation system 104 , which is in communication with the web service 502 , can receive the records and/or graphical elements. Then, the doctor 102 A can make any appropriate changes (e.g., annotations) at the annotation system 104 and send these changes back to the web service 502 . Subsequently, one or more destinations 108 can retrieve the records and/or graphical elements from the web service 502 .
  • the patient 108 B, a different doctor 108 A, a hospital 108 C, or any other destination 108 can retrieve the records and/or graphical elements from the web service 502 .
  • the data source(s) 102 and/or the destination(s) 108 may also be in direct communication with the annotation system 104 .
  • the web service 502 can be a standalone web service that is separated from the annotation system 104 , or may be a component or module of the annotation system 104 or the clearinghouse 106 .
  • FIG. 5C illustrates yet another embodiment in which the clearinghouse 106 is used to manage information from a variety of sources.
  • the clearinghouse 106 can be in communication with a plurality of hospitals 102 C.
  • the hospitals 102 C can store a plurality of records and graphical elements for a plurality of patients.
  • a doctor 102 A can then use the annotation system 104 to retrieve medical records for a particular patient and to perform any appropriate annotations.
  • the medical record, along with annotations, can then be synchronized back to the clearinghouse 106 .
  • a variety of destinations (e.g., doctors 108 A, patients 108 B, hospitals 108 C, insurance companies 108 D) can then retrieve the medical record, along with the annotations, from the clearinghouse 106 .
  • the foregoing annotation system may be used as part of a local or remote diagnosis and treatment system.
  • some embodiments include mechanisms that enable a user to select anatomical regions of the human body from a displayed anatomical subassembly. From this selection, the user can be presented with information about corresponding conditions, the selection can be used to track medical conditions over time, or the selection can be used for annotation using the annotation system 104 . Of course, the selection can be used for additional purposes beyond these examples.
  • FIG. 6 illustrates a layout of an anatomical selection user interface 600 .
  • a user (e.g., a patient, a doctor, or other entity) can be presented with an anatomical subassembly 602 representing an anatomical region of the human body (e.g., a human foot). The user can select one or more anatomical regions or elements (e.g., anatomical region 604 ), which can then be used for further processing or diagnosis, as described more fully below.
  • the user can be instructed to select an area in which pain is being experienced (i.e., “touch where it hurts”).
  • This information can then be used by a doctor or by a computing system to determine possible conditions that may cause pain in the selected anatomical region 604 . While the illustrated embodiment indicates that selection is based on experienced pain, selection can be based on any appropriate criteria, such as areas experiencing inflammation, areas of known injury, areas of discoloration or rash, areas of past treatment, etc.
  • the user may also be presented with one or more user interface elements (not illustrated) for identifying a relative measure of pain or perceived severity of a condition.
  • Relative measures/magnitudes can be provided in any number of ways, such as through the selection of a number from a predefined range (e.g., 1-10), selection of a color from a color scale, selection of a graphical image from a set of graphical images (e.g., a face selected from a set of faces having varying degrees of smiles and frowns), and so forth. The user can also select from sets of objects having different sizes to reflect a relative magnitude.
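Whichever input style is offered, the selection can be normalized to a single relative severity value before being recorded. A brief sketch under that assumption (the normalization itself is not specified by the patent):

```python
def severity_from_number(n, low=1, high=10):
    """Normalize a number picked from a predefined range (e.g., 1-10)
    to a 0.0-1.0 severity value."""
    return (n - low) / (high - low)

def severity_from_scale_index(index, scale_length):
    """Normalize a pick from an ordered scale (colors, faces, object
    sizes), where index 0 is mildest and the last index is most severe."""
    return index / (scale_length - 1)

print(severity_from_number(7))          # pain level 7 of 10 -> ~0.67
print(severity_from_scale_index(3, 5))  # fourth of five faces -> 0.75
```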
  • This information may be recorded as an annotation in the user's medical history (as a medical record), or may be used to further evaluate and provide information related to corresponding specialists, or even a diagnosis. In this manner, for example, it may be possible to interact with a virtual doctor's office and corresponding attendee.
  • the user may also be presented with a selection of specialists who are knowledgeable about the anatomical subassembly 602 and/or the condition.
  • the selection of specialists can be local to the user's geographical area, or may include specialists from a broader geographical area. From the selection of specialists, the user can receive additional information about particular specialist(s) (e.g., cost, insurance affiliations, medical and educational credentials, contact information, photographs, reviews, hours of operation, and so forth). Further, the user can be presented with one or more options to contact the specialist directly.
  • any relevant information about the user that has been gathered may be sent to a doctor (e.g., a specialist) for remote diagnosis, or for helping a doctor to remotely guide the user through an investigation of possible conditions (e.g., condition 606 ).
  • the gathered information (e.g., the selected region 604 and/or the severity) may be used for making annotations using the annotation system 104 , for forming a diagnosis, for educating the patient about conditions that may correspond to the selection, for engaging other doctors, for tracking the patient's medical history, etc.
  • FIG. 7 illustrates a layout of a condition information user interface 700 consistent with one or more implementations.
  • the condition information user interface 700 can present information about the selected condition 606 in the form of photographs, audio and/or video presentations, illustrations, text, annotations and the like. Additionally or alternatively (not illustrated), the user can be presented with information about one or more medical specialists who have expertise with the selected condition (as discussed previously).
  • further processing or diagnosis can also include tracking a condition over time and/or annotating a medical record with the annotation system 104 .
  • the selected region can be used as an aid in annotating a graphical element, such as a user or patient's X-Ray or MRI image.
  • selection of an anatomical region can be made directly from the graphical element (e.g. from the user or patient's X-Ray or MRI image), or from a simulated anatomical subassembly.
  • selection of an anatomical region can be made on a simulated anatomical subassembly and then the selection can be transferred or overlaid on the displayed graphical element.
  • the selected anatomical region can be the basis of annotation, and this information can be saved (e.g. as a medical file at the clearinghouse 106 ) for future retrieval, or to maintain a record of the condition.
  • a subsequent selection of the anatomical region 604 , along with any annotations, can also be saved (e.g. at the clearinghouse 106 ) and/or can be compared to previously saved medical records.
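If each saved selection carries a date and a severity value, comparing a new record against previously saved ones reduces to sorting by date and inspecting the change. A hedged sketch (the storage schema is invented for illustration):

```python
from datetime import date

# Saved selections for anatomical region 604, as might be retrieved
# from the clearinghouse 106 (hypothetical schema and values).
history = [
    {"date": date(2013, 1, 10), "region": "604", "severity": 0.8},
    {"date": date(2013, 2, 14), "region": "604", "severity": 0.6},
    {"date": date(2013, 3, 21), "region": "604", "severity": 0.3},
]

def severity_trend(entries):
    """Compare the earliest and latest saved severities for a region."""
    ordered = sorted(entries, key=lambda e: e["date"])
    delta = ordered[-1]["severity"] - ordered[0]["severity"]
    if delta < 0:
        return "improving"
    return "degenerating" if delta > 0 else "unchanged"

print(severity_trend(history))  # -> "improving"
```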
  • Any display element (e.g., a graphical element being annotated, or the anatomical subassembly 602 ) can be displayed statically or dynamically.
  • a user interface can enable a user to interactively rotate the display element in up to 360°, to selectively display descriptive annotations, to add or remove anatomical layers, to animate the displayed element through one or more types of motion, etc.
  • the user interface can also display the display element dynamically without the use of user input.
  • FIG. 8 illustrates a layout of an interactive user interface 800 which includes an interactive anatomical display, according to one or more implementations.
  • a user can interactively modify the display of an anatomical display element 802 using one or more interactive display options 804 , one or more motion selectors 806 , or input from a user input device, etc.
  • the interactive display options 804 can be used to selectively add or remove anatomical layers (e.g., nerves, muscle, tendon, bone), or to selectively add or remove annotations (e.g., anatomical labels).
  • the interactive display options 804 can be used to select a type of motion to apply to the display element 802 .
  • the display element 802 can be automatically rotated, or selectively rotated based on user input. Once selected, the user may be able to click and/or drag on the display element 802 directly, or perform any other appropriate user input (e.g., using a separate user interface control). Other motions are also possible, such as motions that the display element 802 may actually perform (e.g., walking motion, ankle movement, toe movement, etc.).
  • the user may use the motion selector 806 to perform the motion.
  • the motion selector 806 can take any form appropriate to a motion type, such as buttons, sliders, switches, virtual or actual d-pads, virtual or actual joy-sticks, etc.
  • any of the text and any of the visual icons and objects displayed throughout the various user interfaces can be selectable links to initiate a function or to select the displayed item or a corresponding item, as generally described and inferred from the foregoing.
  • the user interfaces can be optimized for touch user interaction (e.g., via a tablet computer), while in others the application interfaces can be optimized for other types of user interaction, such as with pointer-based input devices.
  • the selection of a region or element on an anatomical assembly or graphical element can include having a user make several selections through one or more related interfaces that allow a user to drill down from a first region, to a sub-region, to a specific element, for example. Any number or combination of menu selections may be made.
  • the user can also make a selection from a pull-down menu that lists different elements/regions/features available for selection and/or make a selection of an element/region on an anatomical feature/assembly/subassembly, and so forth. Once a selection is made, it can be highlighted or otherwise visually modified to reflect the selection and to provide one or more interface elements for receiving and storing related annotations to the selected element(s) and/or for providing information related to the selected element(s).
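Such a drill-down is naturally a walk over a tree of regions, sub-regions, and elements. The sketch below illustrates the idea with an invented, deliberately tiny anatomy tree (the real selection hierarchy is not enumerated in the patent):

```python
# region -> sub-region -> elements (illustrative content only)
ANATOMY = {
    "foot": {
        "ankle": ["talus", "lateral ligament"],
        "toes": ["hallux", "second toe"],
    },
}

def drill_down(tree, *choices):
    """Resolve successive menu selections to a node of the anatomy tree."""
    node = tree
    for choice in choices:
        if isinstance(node, dict):
            node = node[choice]                        # descend a level
        else:
            node = choice if choice in node else None  # pick a leaf element
    return node

print(drill_down(ANATOMY, "foot", "ankle"))           # sub-region contents
print(drill_down(ANATOMY, "foot", "ankle", "talus"))  # specific element
```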
  • FIGS. 1-8 provide a number of components, mechanisms, and user interfaces for annotating and sharing graphical elements, as well as components, mechanisms, and user interfaces for inputting information about medical conditions for diagnosis, education, tracking medical histories, etc.
  • One or more disclosed implementations can enable rich, easy, and intuitive annotations, as well as flexible management and sharing of annotated graphical elements.
  • One or more disclosed implementations can also enable remote diagnosis and/or guiding a user to information about a medical condition.
  • FIGS. 9-10 illustrate flowcharts of computerized methods of annotating graphical elements and for selecting anatomical regions for further investigation.
  • FIG. 9 illustrates a flowchart of a method for annotating a graphical element.
  • FIG. 10 illustrates a flowchart of a method for providing information about a medical condition based on user input. The acts of FIGS. 9 and 10 are described herein below with respect to the schematics, diagrams, devices and components shown in FIGS. 1-8 .
  • FIG. 9 shows that a method for annotating a graphical element can comprise an act 902 of displaying graphical element(s).
  • Act 902 can include displaying one or more graphical elements at a user interface that includes one or more user-selectable annotation options for selecting one or more areas of the graphical elements for annotation.
  • Act 902 can include the annotation user interface 300 displaying a graphical element, either statically or dynamically.
  • the annotation user interface 300 can include any number of user-selectable graphical element manipulation options, such as color control options, cropping options, rotation options, drawing options, and so forth.
  • the annotation user interface 300 can, in some embodiments, also employ a rich interactive display, such as the interactive user interface 800 of FIG. 8 , which can add or remove layers to or from the graphical element, apply motion to the graphical element, rotate the graphical element in 360°, etc.
  • Act 902 can also include the annotation user interface 300 displaying one or more annotation interface controls 302 .
  • the interface controls 302 can include one or more tools for selecting regions for annotation, tools for launching an annotation dialogue 304 , and so forth.
  • the displayed graphical element may be a medical or anatomical image, such as the illustrated human shoulder.
  • FIG. 9 also includes an act 904 of receiving user input selecting an annotation area.
  • Act 904 can include receiving user input selecting one or more areas of the graphical elements for annotation.
  • act 904 can include the user selecting one or more regions of the graphical element by direct interaction with the graphical element (e.g., by directly clicking, touching, dragging, pinching, zooming, etc. on the graphical element).
  • act 904 can also involve the use of selection tools, menus, buttons, etc. These tools may be provided as part of the interface controls 302 , or as part of any other user interface element.
  • FIG. 9 also includes an act 906 of displaying annotation dialogue(s).
  • Act 906 can include displaying an annotation dialogue which provides selection of one or more annotation options for annotating the selected one or more areas.
  • the annotation dialogue can include one or more highlighting options, including one or more of shapes or colors.
  • the annotation dialogue can include other highlighting options, such as images, animations, and so forth.
  • the annotation dialogue can include one or more comment input options, including one or more of text or audio input. Other comment options are also available, such as hand-drawing, video attachment or recording, etc.
  • the illustrated method includes an act 908 of receiving user input selecting highlighting and comment options.
  • Act 908 can include receiving user input selecting one or more of the highlighting options.
  • the user can select one or more shapes, colors, etc. from the annotation dialogue, or any other associated user interface controls.
  • the selected highlighting options include both a selection of shape (e.g., a circle) and a selection of color (e.g., red).
  • selection of a shape can indicate a medical condition
  • selection of a color can indicate a severity level of the medical condition.
  • Act 908 can also include receiving user input selecting one or more of the comment input options. Selecting a comment input option can include selecting a comment type (e.g., text, audio or video recording, handwriting, drawing, etc.).
  • It will be appreciated that the annotation dialogue 304 can include pre-selected highlighting and/or comment input options, and that comment input options can be selected by their mere use (e.g., a text comment type can be selected by the user beginning to enter text).
  • FIG. 9 also shows an act 910 of receiving comment input.
  • Act 910 can include receiving user comment input and inputting at least one comment corresponding to the user comment input.
  • the user can type or otherwise enter text via the annotation dialogue, or an associated user interface control, using a physical or virtual keyboard.
  • the user can also record audio and/or video comments.
  • the act can include transcribing audio input into the at least one comment.
  • FIG. 9 also identifies an act 912 of displaying the annotated graphical element(s).
  • This act can include displaying the one or more graphical elements along with the selected annotation, including the selected highlighting options and the inputted at least one comment.
  • the annotation user interface 300 can display the graphical element along with any appropriate visual cues indicating that an annotation exists. When selected, the visual cue can be replaced or expanded to fully display any highlighting options and/or the entered comments.
  • the method can include any number of additional acts.
  • the method can also include an act of uploading the annotated graphical element(s) to a clearinghouse, or of sending the annotated graphical element(s) to a destination.
  • the method can include uploading the one or more graphical elements and the selected annotation as a medical file to a clearinghouse.
  • the medical file can be accessed from the clearinghouse and displayed with a selectable graphical indicator of proximate areas of the selected annotation which, when selected, render the selected annotation.
  • Rendering the selected annotation can comprise playing an audio comment, or playing a recorded video.
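A sketch of the upload step described above might look like the following; the clearinghouse endpoint URL, payload schema, and use of fetch are assumptions made for illustration, not details of the disclosed clearinghouse.

```typescript
// Hypothetical sketch of uploading an annotated graphical element as a
// medical file to a clearinghouse. Endpoint and schema are invented.
interface MedicalFile {
  patientId: string;
  imageId: string;
  annotations: unknown[]; // e.g., Annotation objects like the sketch above
}

async function uploadToClearinghouse(file: MedicalFile): Promise<void> {
  const res = await fetch("https://clearinghouse.example/api/medical-files", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(file),
  });
  if (!res.ok) throw new Error(`upload failed: ${res.status}`);
}
```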
  • FIG. 10 illustrates that one or more additional implementations of providing information about a medical condition based on user input can comprise an act 1002 of presenting an anatomical subassembly.
  • Act 1002 can include presenting a user (e.g., a patient or a doctor) with an anatomical subassembly representing an anatomical region of the human body, including one or more user-selectable display elements which, when selected, indicate one or more areas in the anatomical subassembly representing areas in which a medical condition is experienced in the human body.
  • the medical condition can be pain experienced in the human body.
  • the anatomical subassembly 602 can be presented in any appropriate static or dynamic manner, and interactive tools can be provided, such as tools for rotating the anatomical subassembly 602 to provide a 360° view.
  • FIG. 10 illustrates that embodiments of the invention also include an act 1004 of receiving input selecting a display element, such as, for example, input selecting one or more of the user-selectable elements of the anatomical assembly.
  • the user touches or clicks on anatomical region/element 604 to select that region/element of the anatomical subassembly 602 .
  • This selection can also include presenting the user with other options for providing additional information about the selection.
  • the act of selecting can include presenting the user with one or more user-selectable interface elements for further selecting a severity level (e.g., a pain level) associated with the experienced medical condition.
  • FIG. 10 includes an act 1006 of presenting a selection of medical condition(s) to a user, such as a selection of one or more medical conditions corresponding to the one or more selected elements of the anatomical subassembly.
  • This can also include presenting the user with a list or menu of medical conditions that may apply to the selected anatomical subassembly 602 . From this list, the user can select one or more of the medical conditions (e.g., medical condition 606 ). Then, the user can be presented with information about the selected medical condition(s).
  • the user may be presented with a condition information user interface 700 which provides textual, visual, and/or audio information about the selected condition.
  • the condition information user interface 700 can also provide information about procedures for treating the one or more medical conditions.
  • FIG. 10 also includes an act 1008 of presenting information about specialists.
  • Act 1008 can include presenting the user with information about one or more medical specialists corresponding to one or more of the medical conditions, as well as the one or more selected elements of the anatomical subassembly. For example, when the user is a patient, the patient can send any gathered information and any user annotations to a specialist for remote diagnosis. Alternatively, when the user is a doctor, the user can send the information to another doctor for advice, additional opinions, etc.
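The flow of acts 1004-1008 amounts to two lookups: from a selected anatomical element to candidate conditions, and from each condition to specialists. The TypeScript sketch below illustrates that shape with invented table contents; it is not a diagnostic mapping from the disclosure.

```typescript
// Hypothetical sketch of acts 1004-1008: selected element -> candidate
// conditions -> specialists. All table contents are invented.
const conditionsByRegion: Record<string, string[]> = {
  "plantar-fascia": ["plantar fasciitis", "heel spur"],
  "achilles-tendon": ["achilles tendinitis"],
};

const specialistsByCondition: Record<string, string[]> = {
  "plantar fasciitis": ["podiatrist", "orthopedic surgeon"],
  "heel spur": ["podiatrist"],
  "achilles tendinitis": ["orthopedic surgeon", "physical therapist"],
};

// Builds the selection presented to the user for a chosen region.
function optionsFor(regionId: string) {
  return (conditionsByRegion[regionId] ?? []).map(condition => ({
    condition,
    specialists: specialistsByCondition[condition] ?? [],
  }));
}

console.log(optionsFor("plantar-fascia"));
```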
  • the method can also comprise annotation by the annotation system 104 .
  • the user can be presented with one or more user-selectable annotation options for selecting one or more areas of the anatomical subassembly 602 for annotation.
  • these annotations can be uploaded as a medical file to a clearinghouse where they can be made accessible to doctors, patients, insurance companies, hospitals, etc.
  • FIGS. 1-10 provide a number of components and mechanisms for annotating graphical elements and for providing medical information or assistance based on selected graphical elements.
  • One or more disclosed implementations also provide for a central clearinghouse for sharing graphical elements between annotation systems, data sources, and destinations.
  • the foregoing embodiments may also be practiced by a computer system including one or more processors and computer readable media, such as computer memory or other storage media that are detachable from the processor(s).
  • the computer memory may store computer executable instructions that when executed by one or more processors cause various functions to be performed.
  • Embodiments may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures.
  • Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
  • Computer-readable media that store computer-executable instructions are physical storage media.
  • Computer-readable media that carry computer-executable instructions are transmission media.
  • embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: physical computer readable storage media and transmission computer readable media.
  • Physical computer readable storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage (such as CDs, DVDs, etc.), magnetic disk storage or other magnetic storage devices, flash memory, thumb drives, portable memory drives, solid state disks, or any other physical medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • the storage devices do not consist of merely transitory carrier waves and/or merely transitory signals.
  • a “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
  • a network or another communications connection can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above are also included within the scope of computer-readable media.
  • program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission computer readable media to physical computer readable storage media (or vice-versa).
  • computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface card or module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer readable physical storage media at a computer system.
  • computer readable physical storage media can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device (e.g., CPU device(s)) to perform a certain function or group of functions.
  • the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
  • the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, tablet computers (e.g., iPads, Android tablets), message processors, hand-held devices (e.g., iPods), multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, other smart devices or interactive display devices, pagers, routers, switches, and the like.
  • the invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
  • program modules may be located in both local and remote memory storage devices.

Abstract

Embodiments herein provide information about a medical condition based on user input. A user is presented with an anatomical assembly representing an anatomical region of the human body. The presentation includes user-selectable display elements which, when selected, indicate an area in the anatomical assembly representing areas in which a medical symptom is experienced in the human body. Upon receiving user input selecting one of the user-selectable elements of the anatomical assembly, the user is presented with a selection of a medical condition corresponding to the selected element of the anatomical assembly. The user is also presented with information about a medical specialist corresponding to the medical condition.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a divisional of U.S. application Ser. No. 13/093,272 filed Apr. 25, 2011, entitled “MEDICAL INTERFACE, ANNOTATION AND COMMUNICATION SYSTEMS,” which application claims the benefit of, and priority to, the following three provisional applications: U.S. Provisional Application No. 61/424,548 filed Dec. 17, 2010, entitled “INTERACTIVE ANATOMICAL MEDICAL APPLICATION USER INTERFACES;” U.S. Provisional Application No. 61/442,686 filed Feb. 14, 2011, entitled “INTERACTIVE ANATOMICAL MEDICAL APPLICATION USER INTERFACES;” and U.S. Provisional Application No. 61/442,666 filed Feb. 14, 2011, entitled “INTERACTIVE GRAPHICAL ELEMENT ANNOTATION.” All of these applications are incorporated by reference in their entireties.
  • BACKGROUND
  • 1. The Field of the Invention
  • This invention relates to systems, methods, and computer program products related to the interactive annotation of displayed graphical elements and to providing users with information about selected graphical elements.
  • 2. The Relevant Technology
  • Annotation typically involves appending descriptive information to objects. In a simple form, for example, annotation may involve the addition of handwritten notes on paper documents. Thus, annotation can involve appending handwritten notes, markings, etc. to content of the paper documents.
  • With the development of computing technology, annotation has been extended to electronic forms as well. For example, some office suites (e.g. Microsoft® Office®) enable electronic annotations to be added to documents through the use of notes, revision tools, etc. Furthermore, some ability exists to add annotations to graphical elements, such as through the use of image editing suites (e.g., Adobe® Photoshop®). Thus, some ability exists to perform electronic annotation using generic tools such as office suites and image editing suites.
  • Despite the foregoing, there is an ongoing need to improve and ease the ability to electronically annotate graphical elements (e.g., images, video, etc.), and to share these annotations with others.
  • BRIEF SUMMARY
  • Implementations of the present invention include systems, methods and computer program products configured for annotating and sharing graphical elements, as well as for providing information surrounding selected graphical elements. In one or more embodiments, an annotation computing system provides one or more interactive user interfaces for managing, viewing, displaying, manipulating, and/or annotating the graphical elements. One or more embodiments also provide a central clearinghouse for obtaining, storing, tracking, and sharing information, including annotated and/or un-annotated graphical elements. Further, in the context of medical annotation, some embodiments can extend to receiving information about experienced medical conditions and to providing medical information or advice about the inputted condition(s).
  • For example, a method for providing information about a medical condition based on user input can include presenting a user with a user interface that includes an anatomical assembly representing an anatomical region of the human body. The user interface can also include user-selectable display elements which, when selected, indicate areas in the anatomical assembly corresponding to areas in which a medical symptom is experienced. User input selecting one or more of the user-selectable elements of the anatomical assembly can be received. In response, the user can be presented with a selection of medical conditions corresponding to the selected elements. In addition, the user can be presented with information about one or more medical specialists knowledgeable about the medical conditions and the selected elements.
  • This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
  • Additional features and advantages of exemplary implementations of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of such exemplary implementations. The features and advantages of such implementations may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features will become more fully apparent from the following description and appended claims, or may be learned by the practice of such exemplary implementations as set forth hereinafter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
  • FIG. 1 illustrates a schematic diagram of an annotation environment for annotating and sharing records in accordance with one or more implementations of the invention;
  • FIG. 2 illustrates a layout of a management user interface of an annotation system, in accordance with one or more implementations of the invention;
  • FIG. 3 illustrates a layout of an annotation user interface of an annotation system, in accordance with one or more implementations of the invention;
  • FIG. 4 illustrates a communications user interface 400 of an annotation system, in accordance with one or more implementations of the invention;
  • FIG. 5A illustrates a flow diagram of communication paths within an annotation environment, in accordance with one or more implementations of the invention;
  • FIG. 5B illustrates a flow diagram of communication paths within an annotation environment, in accordance with one or more implementations of the invention;
  • FIG. 5C illustrates a flow diagram of communication paths within an annotation environment, in accordance with one or more implementations of the invention;
  • FIG. 6 illustrates a layout of an anatomical selection user interface, in accordance with one or more implementations of the invention;
  • FIG. 7 illustrates a layout of a condition information user interface, in accordance with one or more implementations of the invention;
  • FIG. 8 illustrates a layout of a user interface which includes an interactive anatomical display, in accordance with one or more implementations of the invention;
  • FIG. 9 illustrates a flowchart of a series of acts in a method, in accordance with an implementation of the present invention, for annotating a graphical element; and
  • FIG. 10 illustrates a flowchart of a series of acts in a method, in accordance with an implementation of the present invention, for providing information about a medical condition based on user input.
  • DETAILED DESCRIPTION
  • Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, as discussed in greater detail below.
  • Embodiments described herein are generally directed to methods, systems, and computer storage media configured for annotating graphical elements, which can include images, videos, models, and the like. Embodiments of the invention can also include accessing and transmitting the annotations.
  • In one or more embodiments, an annotation computing system receives, generates, or obtains one or more graphical elements and provides one or more interactive user interfaces for managing, viewing/displaying, manipulating, and/or annotating the graphical elements. Graphical element manipulation includes any form of image or video manipulation, such as rotating, zooming, color modification and/or adjustment, cropping, trimming, joining, and the like. Graphical element annotation includes any form of descriptive commenting or enhancement to graphical elements, such as the addition of one or more highlights, selections, shapes, objects, textual comments, visual or audible enhancements, etc.
  • In some embodiments, audible annotations can also be applied to and associated with the graphical elements.
  • Embodiments also include a central clearinghouse system for obtaining, storing, tracking, and sharing information, including annotated and/or un-annotated graphical elements. A clearinghouse can connect the annotation computing system with third-party sources and destinations of information, other annotation computing systems, or other clearinghouses. At least one embodiment also includes sharing and obtaining graphical elements separate from any clearinghouse(s).
  • As a preliminary matter, one will appreciate that the embodiments described herein can be applied broadly to any field in which the annotation of graphical elements is desirable or useful (e.g. mechanical arts, medicine, business methods, chemical arts, entertainment, etc.). Nevertheless, for purposes of convenience in description, the following text and figures describe the inventive embodiments primarily with respect to a system and graphical user interface for annotating medical graphical elements, such as, but not limited to human anatomical features.
  • In the context of medical annotation, some embodiments can extend to methods, systems, and computer storage media for inputting information about experienced medical conditions (e.g., experienced pain), and receiving medical information or advice about the inputted condition. These embodiments can extend to local or remote diagnosis, education, and investigation about medical conditions, tracking medical histories, etc. These embodiments can be employed separately from or in connection with the annotation computing system and the clearinghouse.
  • FIG. 1 illustrates a schematic diagram of an annotation environment 100 for annotating and sharing records, including graphical elements, in accordance with one or more implementations of the invention. The annotation environment 100 can include an annotation system 104 communicatively coupled with one or more data sources 102 and one or more data destinations 108. The annotation environment 100 can also include the annotation system 104 communicatively coupled with a central record clearinghouse 106, which can, in turn, be coupled with the data source(s) 102 and destination(s) 108. If used, the clearinghouse 106 can serve as a central repository for storing and tracking records and for communicating records between the annotation system 104, the data source(s) 102, and the destination(s) 108.
  • The annotation system 104 can comprise any computing system capable of receiving, sending, and managing records, as well as annotating graphical elements. The annotation system 104 may take the form of a desktop or laptop computer, a personal digital assistant (PDA), a tablet computer (e.g., iPad), a phone device or other smart device, etc. The annotation system 104 can include a management component 104A, an annotation component 104B, and a communications component 104C. The annotation system 104 can also include any additional components 104D. It will be appreciated that the annotation system 104 can consist of a standalone server system or can comprise a distributed system, such as can be distributed throughout a computing cloud.
  • The management component 104A can manage one or more textual, graphical and/or audible records. In some embodiments, a record, which can apply to any contextual topic, will include one or more graphical elements. As mentioned previously, the annotation environment 100 can, in one or more implementations, comprise a medical annotation environment. In this environment, each record may be a different medical record for different patients, and each medical record may include one or more graphical elements (e.g., medical images). Alternatively, each medical record may correspond with a different condition connected with the patient, a different doctor visit by the patient, different categories of disease, etc. Of course, a record may simply be the graphical element itself.
  • The graphical elements can be any appropriate graphical elements including images, videos, and models. For instance, when the annotation system 104 operates in a medical annotation environment, the graphical elements can include images, videos, or models of human or animal anatomical features. Thus, the graphical elements may be virtual anatomical models, Magnetic Resonance Imaging (MRI) scans, X-Ray images, Computerized Axial Tomography (CT or CAT) scans, photographs, endoscopic images or videos, etc. Of course, in other contexts, the graphical elements will correspond directly to the applicable industry. For instance, in the engineering industry, the graphical elements may be engineering schematics, design photographs, artwork, etc.
  • Whatever forms the records and the graphical elements take, the management component 104A can organize the records into one or more categories. In the medical context, for example, categories may include patients, doctors, medical conditions, hospitals or offices, severity levels, time periods, etc. The management component 104A can also provide functionality for adding or removing categories, adding or removing records or graphical elements, selecting one or more categories or records for viewing or editing, selecting one or more graphical elements for annotation, etc. As discussed more fully hereinafter, the management component 104A can also provide functionality for sending or receiving records or graphical elements via the communications component 104C.
  • For example, FIG. 2 illustrates a layout of a management user interface 200 of the annotation system 104, for managing medical records, consistent with one or more implementations. The management user interface 200 can include one or more interface controls 202 for adding categories and/or records. As illustrated, each category may correspond to a different patient, and each record may correspond to one or more graphical elements. However, other combinations are also possible. For instance, medical record information can be stored and associated with different owners, different medical professionals, different conditions, etc. Various property information and schemas can be used to group medical records or other medical or graphical information according to any desired need or preference.
  • The management user interface 200 can also include one or more interface controls 204 for searching existing categories and/or records. Additionally, the management user interface 200 can include a listing 206 of existing categories or records. As illustrated, the listing 206 can be an alphabetical listing of patients. However, any format of the listing 206 is available, such as drop-down menus, tables, combo boxes, etc. In connection with the listing 206 of categories or records, the management user interface 200 can also include a detailed view 208 of graphical elements or a display frame for displaying any combination of graphical elements available in a selected category or record. The graphical elements can be collated, grouped, and displayed in any desired format. The graphical elements can also comprise selectable links to other types of records, such as sound recordings or multimedia files, for example.
  • The management user interface 200 can be used to access, display, and annotate the graphical elements, or the linked-to files (e.g., sound files, multimedia files, etc.).
  • In one embodiment, the management user interface 200 provides a doctor with a listing of medical records for his or her patients, including records associated with Aaron Anderson, who has been selected. Graphical elements, or any other elements relevant to Aaron Anderson, are shown. The graphical elements can be grouped or categorized, such as by type, a date on which the graphical elements were generated or modified, a date on which the graphical elements were obtained by the annotation system 104, a body part, or according to any other desired characterization. The grouping can be performed automatically, when received, such as by parsing data on the records, or manually, as desired.
  • The management user interface 200 can also include one or more interface controls for obtaining additional records and/or graphical elements, and one or more interface controls for removing records and/or graphical elements. Additionally, the management user interface 200 can provide one or more interface controls for selecting a graphical element for making, editing, and/or accessing annotations. Further, the management user interface 200 can provide one or more interface controls for sending graphical elements to a destination 108 (e.g., patients, doctors, dentists, clearinghouses, or any other appropriate entity).
  • The annotation component 104B of the annotation system 104 can provide one or more user interfaces and/or controls for annotating graphical elements. Annotations can include any combination of colors, highlights, animations, images, audio, video, text, and the like, which is added to a record or an element of the record and which is thereafter associated with the record/record element. The annotative elements can be applied from a library of available elements, or can be applied in a free-form manner (e.g. dynamic color selectors, image import, free-hand drawing). In some embodiments, pre-built or pre-scripted annotations are presented for selection and annotation of a graphical element.
  • Each annotation can also include descriptive comments or other information added by the annotator. Descriptive comments can include text, drawings, images, audio recordings, video recordings, and the like. In some instances, the annotation component 104B can apply optical character recognition or handwriting recognition technology to produce textual output from image or video data. Furthermore, the annotation component 104B can also apply text-to-speech or speech-to-text technology to produce audible output from textual data or to transcribe audible data to textual data. Thus, when annotations include textual, audio, or audio/visual comments and/or descriptions, the annotation system 104 may permit the automatic and dynamic conversion of these items (e.g., text can be dynamically converted to audio, and audio can be dynamically transcribed to text).
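One way to structure the dynamic conversions described above is to put them behind pluggable interfaces, as in this hypothetical TypeScript sketch; the stub transcriber stands in for whatever OCR or speech engine an implementation actually uses, which the disclosure does not specify.

```typescript
// Hypothetical sketch of the annotation component's pluggable converters.
interface SpeechToText { transcribe(audio: ArrayBuffer): Promise<string>; }
interface TextToSpeech { synthesize(text: string): Promise<ArrayBuffer>; }

// Stub standing in for a real speech engine.
class StubTranscriber implements SpeechToText {
  async transcribe(_audio: ArrayBuffer): Promise<string> {
    return "[transcript placeholder]";
  }
}

// An audio comment is stored with its dynamically produced transcript,
// so the annotation can later be read or played back.
async function attachAudioComment(audio: ArrayBuffer, stt: SpeechToText) {
  return { kind: "audio" as const, audio, transcript: await stt.transcribe(audio) };
}
```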
  • The annotations can be visibly presented next to the corresponding element(s) that are annotated. This can also include presenting an annotation symbol comprising a selectable link to audio, textual, or graphical annotations. The symbol or link can be presented as text or image data that is colored, sized, or presented in a distinguishing way and that, when selected, accesses or provides access to the annotation(s).
  • In one or more implementations, when the annotation environment 100 is used in the medical context, the graphical elements can comprise anatomical features (e.g., images, videos, anatomical models). Thus, the annotations can be applied to highlight one or more injuries or conditions, to indicate the severity of injury, to illustrate treatment options, to indicate healing or degenerative progress over time, to illustrate medical procedures, to indicate medication options, and the like. For example, different shapes can illustrate different types of injuries or conditions, while different colors can represent different severity levels (or vice-versa). In such embodiments, the annotation can include modifying a display characteristic of a graphical element that was already included as part of the record, such as by enhancing coloring, contrast or other display characteristics.
  • Unique combinations of colors/shapes/objects can be used to indicate different types of annotations, different authors, different conditions, or any combination of differences in annotations. Of course, the annotations (as well as the display characteristics of the annotations) may also be applied merely for teaching or instructive purposes.
  • In one illustrative example, a doctor may use the annotation system to track the treatment of a patient. With each visit, the doctor can capture a graphical representation of the patient's injury or condition, and make one or more annotations highlighting the healing or degenerative process. Thus, when taken together, the annotated graphical images represent a comprehensive temporal visual history of the treatment process, complete with targeted and appropriate comments in the form of annotations. Graphical representations of the annotations (with or without corresponding symbols) can reflect different stages or diagnosis. This history may be useful to the doctor, other doctors, the patient, insurance carriers, educators, etc. Additionally or alternatively, the patient can send graphical images to the doctor (e.g., between physical visits) so that the doctor can track progress and make detailed recommendations. These recommendations can be communicated back to the patient, at least in part, via annotations made to or with the graphical images.
  • The annotation component 104B can also provide one or more user interfaces and/or controls for interactively manipulating the graphical elements prior to, during, or after annotation. For example, the annotation component 104B can include tools for rotating, zooming, color adjustment, cropping, trimming, joining, and the like. Furthermore, the annotation component 104B can provide rich interactive features, including the ability to rotate the graphical elements in up to 360°, the ability to apply motion to the graphical elements, the ability to display annotations or animations, etc. This interactivity is illustrated in greater detail in subsequent Figures.
  • One will appreciate, in view of the disclosure herein, that the annotation component 104B can display and annotate both two-dimensional (2D) graphical elements and three-dimensional (3D) graphical elements. Furthermore, the annotation component 104B can convert 2D graphical elements to 3D graphical elements, and vice-versa. For example, the annotation component 104B can convert a 2D graphical element to three dimensions by using automatic algorithms, user input, model information, etc. Additionally, the annotation component 104B can convert a 3D graphical element to two dimensions via flattening, cross-sectioning, etc.
  • When applied to 3D graphical elements, annotations may be visible only when the annotation is rotated into view. Alternatively, annotations may be made visible through a visual cue that becomes more prominent as the annotation is rotated into view. Of course, annotations may also be permanently visible regardless of the current view. Similarly, when applied to video graphical elements, annotations may only be visible during temporal periods in which the annotation was applied, or may be permanently visible or accessible through bookmarks, time stamps, etc. Annotations can also be selectively displayed in response to receiving predetermined input, such as, but not limited to, authorization information, query information and so forth.
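One plausible way to decide whether a 3D annotation has been "rotated into view," and how prominent its visual cue should be, is to compare the annotation's surface normal against the viewing direction; the dot-product test below is an illustrative assumption, not the disclosed method.

```typescript
// Hypothetical visibility/prominence test for an annotation on a 3D model.
interface Vec3 { x: number; y: number; z: number; }

const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

// Assuming unit vectors: returns 1 when the annotation faces the viewer
// directly, 0 at 90 degrees or beyond (annotation hidden or cue faded out).
function annotationProminence(normal: Vec3, toCamera: Vec3): number {
  return Math.max(0, dot(normal, toCamera));
}

console.log(annotationProminence({ x: 0, y: 0, z: 1 }, { x: 0, y: 0, z: 1 })); // 1
```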
  • FIG. 3 illustrates one embodiment of an annotation user interface 300 layout that can be utilized by the annotation system 104. As illustrated, the annotation user interface 300 can display one or more graphical elements, which in this circumstance comprise the image of the skeletal anatomy of a human shoulder. The annotation user interface 300 also provides one or more annotation interface controls 302 for annotating the image, configuring the annotation user interface 300, etc.
  • Upon selection of an appropriate annotation interface control 302, the user interface 300 can present one or more annotation dialogues 304, 306, 308, along with any appropriate annotation options. In one or more implementations, the annotation dialogue (304, 306, 308) may appear after selection of one or more areas of the graphical element. For example, an annotation dialogue (304, 306, 308) may appear after a user clicks, touches, or otherwise selects a point or region of the graphical element, with the annotation applying to the selected element. The user can customize the annotation in any appropriate manner using the annotation options associated with the annotation dialogue (304, 306, 308). Of course, the annotation user interface 300 can also include annotation options that are separate from an annotation dialogue.
  • In one or more implementations, the annotation options can include selection from among dots, circles, arrows, text input, audio recording, audio playback, etc. that are to be applied to a point or region of the graphical element (i.e., the image of the shoulder). Of course, the annotation options are not limited to those shown. For instance, the annotation options can also include options to add (or remove) one or more colors, to add (or remove) a plurality of shapes or graphical objects (e.g., polygons, arrows, bullets, stars, numbers), to add (or remove) animations or images, etc.
  • Annotation dialogue 304 includes several annotation options, such as an annotation option 310 for recording audio or video (shown here as selected and currently recording). In addition, annotation dialogue 304 shows that an annotation option 312 for no highlighting has been selected. Thus, while an area 326 of the shoulder is being annotated, there is no particular highlighting associated with the selected area.
  • In annotation dialogue 306 the annotation option 310 for recording audio has not been selected. In addition, an annotation option 314 for selecting a circle has been selected. Correspondingly, a circular highlight 328 is displayed on the shoulder corresponding to the area of annotation. Of course, selecting the annotation option 314 may also present additional menus for selecting alternate shapes, colors, sizes, transparency, etc.
  • Annotation dialogue 308 shows that an annotation option 316 for selecting a pointer has been selected. Correspondingly, an arrow 320 is displayed on the shoulder which points to the selected area of annotation. Like annotation option 314, additional menus may be used to select a particular pointer type, color, size, shape, transparency, etc. In addition, annotation dialogue 308 shows that media (e.g., audio or video) has been recorded, which is associated with element 320, and that the media can be played back or deleted using the corresponding controls (318 and 320) and interface menus.
  • Of course, the illustrated annotation dialogues 304, 306, 308 are exemplary only and do not limit the annotation dialogues available to the present invention.
  • The annotation user interface 300 can also display relevant identifying information 322 for the graphical element. For example, when the graphical element is a medical image, the identifying information 322 can include a patient's name, a date (e.g., the date when the graphical element was generated, or the date when the annotation system 104 received the graphical element), an image identifier, etc. In some embodiments, the identifying information 322 can be part of the graphical element itself, and can be used as the basis of gathering additional information about the graphical element (e.g., through the use of optical character recognition technology).
  • The annotation user interface 300 can also provide a plurality of user interface controls for manipulating the graphical element. For instance, FIG. 3 illustrates that the annotation user interface 300 can include a brightness/contrast selector 324 that provides one or more user interface controls for adjusting the brightness and contrast of the image of the graphical element.
  • As mentioned previously, any number of manipulation controls can be provided for manipulating graphical elements (including size, shape, brightness, contrast, animations, etc.), whether they be image, video, or otherwise. Such manipulation controls can include user interface controls for cropping, rotating, trimming, joining, etc. Additionally, as discussed previously, the annotation user interface can provide any number of interactivity controls that provide rich user interaction with the graphical element. These controls can include one or more user interface controls for rotating the graphical element in up to 360°, zooming, rendering motion to the graphical element, converting the graphical element from three dimensions to two dimensions (and vice-versa), etc.
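As an illustration of how a brightness/contrast control such as selector 324 might act on pixel data, the sketch below applies the standard linear transform; the disclosure does not specify the exact transform, so this is only one reasonable choice.

```typescript
// Hypothetical brightness/contrast adjustment: a per-pixel linear transform.
// Uint8ClampedArray automatically clamps assigned values to [0, 255].
function adjust(pixels: Uint8ClampedArray, brightness: number, contrast: number): void {
  for (let i = 0; i < pixels.length; i++) {
    // Scale around the midpoint (128), then shift by the brightness offset.
    pixels[i] = contrast * (pixels[i] - 128) + 128 + brightness;
  }
}

const img = new Uint8ClampedArray([10, 128, 250]);
adjust(img, 20, 1.2);
console.log(Array.from(img)); // [6, 148, 255] - brighter, higher contrast
```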
  • Returning again to FIG. 1, the communications component 104C of the annotation system 104 can receive records and graphical elements from the data source(s) 102 (either directly, or via the clearinghouse 106) and can send records and graphical elements to the clearinghouse 106 and/or to the destination(s) 108. The communications component 104C can communicate directly with the data source(s) 102 and the destination(s) 108, or can communicate indirectly (e.g., via the clearinghouse 106). Of course, the communications component 104C may comprise a plurality of components, such as one for receiving data and one for sending data.
  • As illustrated, communications within the annotation environment 100 can occur over any appropriate connections, such as wireless connections 110A (e.g., WiFi, Bluetooth, infra-red), or wired connections 110B (e.g., network, USB, FireWire, internal bus). It will be appreciated that connections can be made directly (e.g., via a USB connection between the data source 102 and the annotation system 104), or indirectly (e.g., when communications occur over a local or wide area network, or when communications occur via the clearinghouse 106). Regardless of the connection type, communications protocols can also take a variety of forms, such as electronic messages 110C or any other communications protocols 110D. As indicated by the vertical ellipses 110E, any other appropriate connection types and communications protocols are available within the annotation environment 100. For example, one or more implementations may use facsimile, SMS, MMS, and so forth.
  • The data source(s) 102 should be construed broadly to include any appropriate source(s) of records and/or graphical elements. In the case of a medical annotation environment, for example, graphical elements can originate from one or more doctors 102A (e.g., family practice doctors, radiologists, emergency care doctors, physical therapists, dentists, veterinarians), one or more patients 102B, one or more hospitals 102C, or one or more insurance providers 102D. The graphical elements can also originate from and be automatically attached to a record by the annotation system 104 (e.g., from a still camera or a video camera 102E, a motion capture device, or any other acquisition device at the annotation system 104, or from a direct upload to the annotation system 104, in combination with system logic and tagging mechanisms). In some instances, the data source 102 is separate from other illustrated components, such as the annotation system 104 and the clearinghouse 106, while in other instances the data source 102 is the same as or part of the other illustrated components. Different graphical display characteristics (e.g., size, color, shape, transparency, etc.) can be applied to the different annotations to visually represent the source of the annotations.
  • As indicated by the vertical ellipses 102F of FIG. 1, the data source 102 can include any number of additional sources of data for records that can be annotated or from which annotations can be derived. For example, the annotation system 104 can also import information (e.g., patient contact information) automatically from one or more external databases. The annotation system 104 can also use internal sources. For instance, in one or more implementations, the annotation system 104 obtains at least some information using optical character recognition of information included in the graphical elements (e.g., identifying information 322). The annotation system 104 can also utilize speech recognition and/or handwriting recognition. It will be appreciated that in addition to images originating from imaging devices (e.g., cameras, medical imaging devices, etc.), the graphical elements can be partially or entirely virtual (e.g., computer models). Additionally, graphical elements can incorporate, at least in part, motion capture data, metadata, audio data, etc.
  • The destination(s) 108 should also be construed broadly to include any appropriate destination of records and/or graphical elements. For example records and/or graphical elements can be sent to one or more doctors 108A, one or more patients 108B, one or more hospitals 108C, one or more insurance carriers 108D, or even other annotation systems 108E. Of course, as indicated by the vertical ellipses 108F, other destinations or combinations of destinations are also possible. In many instances, the data source 102 and the destination 108 may be the same.
  • A patient 102B may send graphical element(s) to an annotation system 104 operated by his or her doctor. After receiving the graphical element(s), the doctor can annotate the graphical element(s) and send the annotated graphical element(s) back to the patient 108B and/or to another doctor 108A. Thus, patients can be provided access to records associated with their own medical history and/or their family members' medical histories. Of course, the foregoing example is only one possibility. For instance, a doctor 102A (e.g., a radiologist) can send a patient's x-ray to the annotation system 104, where the radiologist can apply one or more annotations. The radiologist can subsequently send the annotated x-ray to any appropriate destination 108 (e.g., a family doctor 108A, the patient 108B, a hospital 108C). In another example, medical records are made available to insurance entities corresponding to carriers of insurance policies. This way, an insurance representative can access and view corresponding medical records and related data for the various carriers. In educational settings, students or professors can be provided access to historical medical records. The various medical records can be annotated and stored with the medical annotations for review and later access according to the invention.
  • In other embodiments, the medical records and annotations are stored separately, but remain linked by data maintained at the annotation system. When the record is subsequently accessed or transmitted, the record is sent with the appropriate annotations that are relevant and/or authorized for each corresponding recipient. Alternatively, all annotations are included with the medical record and filtering is applied at the recipient system(s) to display only authorized annotations.
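The recipient-side filtering alternative could be as simple as tagging each annotation with an authorized audience and filtering on display; the audience tags below are an invented authorization model used only to illustrate the idea.

```typescript
// Hypothetical recipient-side filtering of annotations by authorization.
type Audience = "patient" | "doctor" | "insurer";
interface StoredAnnotation { text: string; audiences: Audience[]; }

// Only annotations authorized for the recipient are displayed.
function visibleTo(recipient: Audience, all: StoredAnnotation[]): StoredAnnotation[] {
  return all.filter(a => a.audiences.includes(recipient));
}

const record: StoredAnnotation[] = [
  { text: "Discuss surgical option", audiences: ["doctor"] },
  { text: "Ice twice daily", audiences: ["patient", "doctor"] },
];
console.log(visibleTo("patient", record)); // only the patient-facing note
```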
  • FIG. 4 illustrates a communications user interface 400 of the annotation system 104, in accordance with one or more implementations, with which a user can send records and graphical elements (annotated or un-annotated) to destination(s) 108 and/or the clearinghouse 106. As illustrated, the user can customize a message or other data that accompanies the records and/or graphical elements as they are sent. For instance, a doctor may customize a message to another doctor or a patient, or may make any other comments. This data can be stored and transmitted/accessed with the other image data (like X-rays) or other medical records associated with a patient. While FIG. 4 illustrates the composition of an e-mail message, the communications user interface 400 can, in other embodiments, send graphical elements in other forms (e.g., MMS, direct upload).
  • Returning briefly to FIG. 1, the clearinghouse 106 can comprise one or more computing systems configured to receive and make records, graphical images, etc., accessible to data sources 102, destinations 108, and the annotation system 104. As illustrated, the clearinghouse 106 can, in some embodiments, comprise a “cloud” configuration which includes one or more servers that are separated from the annotation system 104. Of course, one will appreciate that, in one or more implementations, the clearinghouse 106 may also be a part of the annotation system 104 itself. The clearinghouse 106 can employ any appropriate security and authentication mechanisms to protect any data stored therein from unauthorized access. Thus, communications between the clearinghouse 106 and any other component can be secured.
  • As discussed previously, the annotation environment 100 can be used to communicate graphical elements between a variety of sources and destinations in a variety of contexts beyond the medical field. For example, scientists can use the annotation environment 100 to share and annotate their research data. Furthermore, engineers can use the annotation environment 100 to share and annotate their designs. Still further, artists can use the annotation environment 100 to share and annotate their artwork. Accordingly, as mentioned previously, while some of the Figures illustrate implementations related to the medical field, the disclosure herein should not be viewed as limiting the annotation environment 100 to use by the medical field.
  • Turning now to FIGS. 5A-5C, a plurality of flow diagrams are illustrated to reflect some of the communication paths that can be utilized within the annotation environment 100 in accordance with one or more implementations. The flow diagrams are for illustrative purposes only, and other communication paths are also possible. Each of the flow diagrams may use all or only a part of the illustrated annotation environment 100, as well as other components not previously discussed. For example, one or more implementations may include a web service 502 which may be provided in connection with, or separate from, the clearinghouse 106.
  • FIG. 5A, for example, illustrates an embodiment in which the annotation system 104 communicates directly with the data source(s) 102 and the destination(s) 108. In this embodiment, the annotation environment 100 may lack the clearinghouse 106, or the annotation system 104 may refrain from using the clearinghouse 106. Thus, a data source 102 (e.g., doctor 102A) may send one or more records and/or graphical elements directly to the annotation system 104.
  • As discussed previously, the annotation system 104 may acquire the graphical element(s) directly with an acquisition device (e.g., a camera 102E), or graphical element(s) may be sent to the annotation system 104 via a wireless 110A or a hard-wired 110B connection. After any appropriate annotations are made at the annotation system 104 (e.g., by the doctor 102A), the annotated graphical element(s) may be sent to a destination 108 (such as to a patient 108B). For example, the doctor 102A may use the communications user interface 400 at the annotation system 104 to send the annotated graphical element(s), along with any comments, to the patient 108B via e-mail, MMS, direct upload, etc.
  • FIG. 5B illustrates an additional embodiment in which the annotation system 104 communicates with the data source(s) 102 and the destination(s) 108 via a web service 502. For example, a doctor 102A may upload a patient's records and/or graphical elements to the web service 502. The annotation system 104, which is in communication with the web service 502, can receive the records and/or graphical elements. Then, the doctor 102A can make any appropriate changes (e.g., annotations) at the annotation system 104 and send these changes back to the web service 502. Subsequently, one or more destinations 108 can retrieve the records and/or graphical elements from the web service 502. Illustratively, the patient 108B, a different doctor 108A, a hospital 108C, or any other destination 108 can retrieve the records and/or graphical elements from the web service 502. Of course, similar to FIG. 5A, the data source(s) 102 and/or the destination(s) 108 may also be in direct communication with the annotation system 104. The web service 502 can be a standalone web service that is separated from the annotation system 104, or may be a component or module of the annotation system 104 or the clearinghouse 106.
  • FIG. 5C illustrates yet another embodiment in which the clearinghouse 106 is used to manage information from a variety of sources. For example, FIG. 5C illustrates that the clearinghouse 106 can be in communication with a plurality of hospitals 102C. Thus, the hospitals 102C can store a plurality of records and graphical elements for a plurality of patients. A doctor 102A can then use the annotation system 104 to retrieve medical records for a particular patient and to perform any appropriate annotations. The medical record, along with annotations, can then be synchronized back to the clearinghouse 106. A variety of destinations (e.g., doctors 108A, patients 108B, hospitals 108C, insurance companies 108D) can then retrieve the annotated records from the clearinghouse 106. This may be done directly, or via a separate web service 502 provided by the clearinghouse 106, as illustrated.
  • As indicated previously, particularly in the medical context, the foregoing annotation system may be used as part of a local or remote diagnosis and treatment system. For instance, some embodiments include mechanisms that enable a user to select anatomical regions of the human body from a displayed anatomical subassembly. From this selection, the user can be presented with information about corresponding conditions, the selection can be used to track medical conditions over time, or the selection can be used for annotation using the annotation system 104. Of course, the selection can be used for additional purposes beyond these examples.
  • FIG. 6 illustrates a layout of an anatomical selection user interface 600. As illustrated, a user (e.g., a patient, a doctor or other entity) may be presented with an anatomical subassembly 602 representing an anatomical region of the human body (e.g., a human foot). From the anatomical subassembly 602, the user can select one or more anatomical regions or elements (e.g. anatomical region 604), which can then be used for further processing or diagnosis, as described more fully below. As illustrated, for example, the user can be instructed to select an area in which pain is being experienced (i.e., “touch where it hurts”). This information can then be used by a doctor or by a computing system to determine possible conditions that may cause pain in the selected anatomical region 604. While the illustrated embodiment indicates that selection is based on experienced pain, selection can be based on any appropriate criteria, such as areas experiencing inflammation, areas of known injury, areas of discoloration or rash, areas of past treatment, etc.
  • The user may also be presented with one or more user interface elements (not illustrated) for identifying a relative measure of pain or perceived severity of a condition. Relative measures/magnitudes can be provided in any number of ways, such as through the selection of a number from a predefined range (e.g., 1-10), selection of a color from a color scale, selection of a graphical image from a set of graphical images (e.g., a face selected from a set of faces having varying degrees of smiles and frowns), and so forth. Objects of different sizes can also be selected from a set to reflect a relative magnitude. This information may be recorded as an annotation in the user's medical history (as a medical record), or may be used to further evaluate and provide information related to corresponding specialists, or even a diagnosis. In this manner, for example, it may be possible to interact with a virtual doctor's office and corresponding attendee.
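Because the same relative magnitude can be rendered as a number, a color, or a face, an implementation might normalize whatever the user selects onto one internal scale. The mapping below is a hypothetical sketch of that idea; the scales and labels are invented.

```typescript
// Hypothetical normalization of a 1-10 pain level onto a 5-step scale
// that can be rendered as a color or a face.
const colorScale = ["green", "yellow-green", "yellow", "orange", "red"];
const faceScale = ["smile", "slight smile", "neutral", "frown", "grimace"];

function scaleIndex(pain: number): number {
  const clamped = Math.min(10, Math.max(1, pain));
  return Math.ceil(clamped / 2) - 1; // 1-2 -> 0, ..., 9-10 -> 4
}

console.log(colorScale[scaleIndex(7)], faceScale[scaleIndex(7)]); // orange frown
```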
  • The user may also be presented with a selection of specialists who are knowledgeable about the anatomical subassembly 602 and/or the condition. The selection of specialists can be local to the user's geographical area, or may include specialists from a broader geographical area. From the selection of specialists, the user can receive additional information about particular specialist(s) (e.g., cost, insurance affiliations, medical and educational credentials, contact information, photographs, reviews, hours of operation, and so forth). Further, the user can be presented with one or more options to contact the specialist directly. Thus, any relevant information about the user that has been gathered (e.g., the patient's medical history, the selected anatomical subassembly 602, the selected anatomical region 604, the indicated severity, patient-gathered photographs or records, patient-generated comments or annotations, doctor-gathered images or annotations) may be sent to a doctor (e.g., a specialist) for remote diagnosis, or for helping a doctor to remotely guide the user through an investigation of possible conditions (e.g., condition 606).
  • Similarly, when the user is a doctor, the gathered information (e.g., the selected region 604 and/or the severity) may be used for making annotations using the annotation system 104, for forming a diagnosis, for educating the patient about conditions that may correspond to the selection, for engaging other doctors, for tracking the patient's medical history, etc.
  • Once a user has selected an anatomical region 604, and a possible condition 606, further processing or diagnosis can include presenting the user with one or more corresponding condition information user interfaces. For example, FIG. 7 illustrates a layout of a condition information user interface 700 consistent with one or more implementations. The condition information user interface 700 can present information about the selected condition 606 in the form of photographs, audio and/or video presentations, illustrations, text, annotations, and the like. Additionally or alternatively (not illustrated), the user can be presented with information about one or more medical specialists who have expertise with the selected condition (as discussed previously).
  • As mentioned, further processing or diagnosis can also include tracking a condition over time and/or annotating a medical record with the annotation system 104. For instance, after selecting an anatomical region 604, the selected region can be used as an aid in annotating a graphical element, such as a user's or patient's X-Ray or MRI image. One will also appreciate that selection of an anatomical region can be made directly from the graphical element (e.g., from the user's or patient's X-Ray or MRI image), or from a simulated anatomical subassembly. Alternatively, selection of an anatomical region can be made on a simulated anatomical subassembly and then the selection can be transferred or overlaid on the displayed graphical element. However the selection is made, the selected anatomical region can be the basis of annotation, and this information can be saved (e.g., as a medical file at the clearinghouse 106) for future retrieval, or to maintain a record of the condition. For instance, a subsequent selection of the anatomical region 604, along with any annotations, can also be saved (e.g., at the clearinghouse 106) and/or can be compared to previously saved medical records.
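  • One possible shape of such condition tracking is sketched below, with an in-memory list standing in for medical files saved at the clearinghouse 106; the record fields and function names are assumptions made for illustration.

```python
# Hypothetical sketch: store each dated selection/annotation as a record
# and read back severity over time to track a condition. An in-memory
# list stands in for files saved at a clearinghouse.

from datetime import date

history = []

def save_record(region, severity, note, when=None):
    """Append a dated selection/annotation to the stored history."""
    history.append({"date": when or date.today(), "region": region,
                    "severity": severity, "note": note})

def severity_trend(region):
    """Return (date, severity) pairs for one anatomical region."""
    return [(r["date"], r["severity"]) for r in history
            if r["region"] == region]

save_record("heel", 0.8, "sharp pain", date(2011, 1, 5))
save_record("heel", 0.5, "improving", date(2011, 3, 2))
print(severity_trend("heel"))
```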
  • Any display element (e.g., a graphical element being annotated, or the anatomical subassembly 602) can be displayed statically or dynamically. For instance, in some implementations, a user interface can enable a user to interactively rotate the display element through up to 360°, to selectively display descriptive annotations, to add or remove anatomical layers, to animate the displayed element through one or more types of motion, etc. Of course, the user interface can also display the display element dynamically without any user input.
  • FIG. 8 illustrates a layout of an interactive user interface 800 which includes an interactive anatomical display, according to one or more implementations. As shown, a user can interactively modify the display of an anatomical display element 802 using one or more interactive display options 804, one or more motion selectors 806, input from a user input device, etc. The interactive display options 804 can be used to selectively add or remove anatomical layers (e.g., nerves, muscle, tendon, bone), or to selectively add or remove annotations (e.g., anatomical labels).
  • The interactive display options 804 can also be used to select a type of motion to apply to the display element 802. When a "360°" motion option is selected, the display element 802 can be automatically rotated, or selectively rotated based on user input. Once selected, the user may be able to click and/or drag on the display element 802 directly, or perform any other appropriate user input (e.g., using a separate user interface control). Other motions are also possible, such as motions that the display element 802 may actually perform (e.g., a walking motion, ankle movement, toe movement, etc.). In one or more embodiments, the user may use the motion selector 806 to perform the motion. The motion selector 806 can take any form appropriate to a motion type, such as buttons, sliders, switches, virtual or actual d-pads, virtual or actual joysticks, etc.
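  • A minimal sketch of the interactive display state described above follows; the layer names, motion names, and class structure are assumptions made for illustration and do not represent any particular embodiment.

```python
# Hypothetical sketch of interactive display state: anatomical layers can
# be toggled and a motion type applied, loosely mirroring options 804/806.

class AnatomicalDisplay:
    LAYERS = ["skin", "muscle", "tendon", "nerve", "bone"]
    MOTIONS = ["360", "walking", "ankle", "toes"]

    def __init__(self):
        self.visible = set(self.LAYERS)   # all layers shown initially
        self.rotation = 0                 # degrees
        self.motion = None

    def toggle_layer(self, layer):
        """Add or remove an anatomical layer from the display."""
        self.visible.symmetric_difference_update({layer})

    def set_motion(self, motion):
        """Select a motion type (e.g., the "360" rotation option)."""
        if motion in self.MOTIONS:
            self.motion = motion

    def drag_rotate(self, degrees):
        """Rotate in response to a click-and-drag when "360" is active."""
        if self.motion == "360":
            self.rotation = (self.rotation + degrees) % 360

display = AnatomicalDisplay()
display.toggle_layer("skin")   # peel away the outermost layer
display.set_motion("360")
display.drag_rotate(45)
```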
  • It will be appreciated that any of the text and any of the visual icons and objects displayed throughout the various user interfaces can be selectable links to initiate a function or to select the displayed item or a corresponding item, as generally described and inferred from the foregoing. In some instances, the user interfaces can be optimized for touch user interaction (e.g., via a tablet computer), while in others the application interfaces can be optimized for other types of user interaction, such as with pointer-based input devices.
  • It will also be appreciated that the selection of a region or element on an anatomical assembly or graphical element can include having a user make several selections through one or more related interfaces that allow a user to drill down from a first region, to a sub-region, to a specific element, for example. Any number or combination of menu selections may be made. The user can also make a selection from a pull-down menu that lists different elements/regions/features available for selection and/or make a selection of an element/region on an anatomical feature/assembly/subassembly, and so forth. Once a selection is made, it can be highlighted or otherwise visually modified to reflect the selection and to provide one or more interface elements for receiving and storing related annotations to the selected element(s) and/or for providing information related to the selected element(s).
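  • As a non-limiting sketch, the drill-down pattern described above might walk a nested hierarchy of regions, sub-regions, and elements; the hierarchy shown here is invented solely for illustration.

```python
# Hypothetical sketch of drill-down selection from a region to a
# sub-region to a specific element via successive menu selections.

HIERARCHY = {
    "lower limb": {
        "foot": {"heel": {}, "arch": {}, "toes": {}},
        "ankle": {},
    },
}

def drill_down(path):
    """Walk the hierarchy along the user's selections; return the node."""
    node = HIERARCHY
    for choice in path:
        node = node[choice]   # raises KeyError on an invalid selection
    return node

drill_down(["lower limb", "foot", "heel"])
```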
  • FIGS. 1-8 provide a number of components, mechanisms, and user interfaces for annotating and sharing graphical elements, as well as components, mechanisms, and user interfaces for inputting information about medical conditions for diagnosis, education, tracking medical histories, etc. One or more disclosed implementations can enable rich, easy, and intuitive annotations, as well as flexible management and sharing of annotated graphical elements. One or more disclosed implementations can also enable remote diagnosis and/or guiding a user to information about a medical condition.
  • Some implementations of the present invention can be described in terms of flowcharts comprising one or more acts in a method for accomplishing a particular result. Along these lines, FIGS. 9-10 illustrate flowcharts of computerized methods of annotating graphical elements and for selecting anatomical regions for further investigation. FIG. 9 illustrates a flowchart of a method for annotating a graphical element. FIG. 10 illustrates a flowchart of a method for providing information about a medical condition based on user input. The acts of FIGS. 9 and 10 are described herein below with respect to the schematics, diagrams, devices and components shown in FIGS. 1-8.
  • FIG. 9 shows that a method for annotating a graphical element can comprise an act 902 of displaying graphical element(s). Act 902 can include displaying one or more graphical elements at a user interface that includes one or more user-selectable annotation options for selecting one or more areas of the graphical elements for annotation. Act 902 can include the annotation user interface 300 displaying a graphical element, either statically or dynamically. The annotation user interface 300 can include any number of user-selectable graphical element manipulation options, such as color control options, cropping options, rotation options, drawing options, and so forth. The annotation user interface 300 can, in some embodiments, also employ a rich interactive display, such as the interactive user interface 800 of FIG. 8, which can add or remove layers to or from the graphical element, apply motion to the graphical element, rotate the graphical element through 360°, etc.
  • Act 902 can also include the annotation user interface 300 displaying one or more annotation interface controls 302. As discussed previously, the interface controls 302 can include one or more tools for selecting regions for annotation, tools for launching an annotation dialogue 304, and so forth. As shown, the displayed graphical element may be a medical or anatomical image, such as the illustrated human shoulder.
  • FIG. 9 also includes an act 904 of receiving user input selecting an annotation area. Act 904 can include receiving user input selecting one or more areas of the graphical elements for annotation. For example, act 904 can include the user selecting one or more regions of the graphical element by direct interaction with the graphical element (e.g., by directly clicking, touching, dragging, pinching, zooming, etc. on the graphical element). However, act 904 can also involve the use of selection tools, menus, buttons, etc. These tools may be provided as part of the interface controls 302, or as part of any other user interface element.
  • FIG. 9 also includes an act 906 of displaying annotation dialogue(s). Act 906 can include displaying an annotation dialogue which provides selection of one or more annotation options for annotating the selected one or more areas. The annotation dialogue can include one or more highlighting options, including one or more of shapes or colors. Of course, the annotation dialogue can include other highlighting options, such as images, animations, and so forth. Additionally, the annotation dialogue can include one or more comment input options, including one or more of text or audio input. Other comment options are also available, such as hand-drawing, video attachment or recording, etc.
  • After displaying the annotation dialogue(s), the illustrated method includes an act 908 of receiving user input selecting highlighting and comment options. Act 908 can include receiving user input selecting one or more of the highlighting options. For example, the user can select one or more shapes, colors, etc. from the annotation dialogue, or any other associated user interface controls. In some instances, the selected highlighting options include both a selection of shape (e.g., a circle) and a selection of color (e.g., red). When used in the medical context, selection of a shape can indicate a medical condition, and selection of a color can indicate a severity level of the medical condition.
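  • The shape/color convention just described could be recorded along the following lines; the particular shape and color meanings below are invented for illustration and are not prescribed by any embodiment.

```python
# Hypothetical sketch: the chosen highlight shape encodes a condition
# type and the chosen color encodes its severity.

SHAPE_MEANING = {"circle": "inflammation", "square": "fracture",
                 "triangle": "lesion"}
COLOR_SEVERITY = {"green": "mild", "yellow": "moderate", "red": "severe"}

def describe_highlight(shape, color):
    """Translate a shape/color highlight into a readable description."""
    return f"{COLOR_SEVERITY[color]} {SHAPE_MEANING[shape]}"

print(describe_highlight("circle", "red"))  # -> "severe inflammation"
```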
  • Act 908 can also include receiving user input selecting one or more of the comment input options. Selecting a comment input option can include selecting a comment type (e.g., text, audio or video recording, handwriting, drawing, etc.). One will appreciate that the annotation dialogue 304 can include pre-selected highlighting and/or comment input options, and that comment input options can be selected by their mere use (e.g., a text comment type can be selected by the user beginning to enter text).
  • FIG. 9 also shows an act 910 of receiving comment input. Act 910 can include receiving user comment input and inputting at least one comment corresponding to the user comment input. For example, the user can type or otherwise enter text via the annotation dialogue, or an associated user interface control, using a physical or virtual keyboard. As illustrated, however, the user can also record audio and/or video comments. In such a circumstance, the act can include transcribing audio input into the at least one comment.
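  • A sketch of act 910 for an audio comment follows; the transcription function is a stand-in for whatever speech-to-text backend a given system might use, and all names are assumptions made for illustration.

```python
# Hypothetical sketch: store the recorded audio with the annotation and,
# where speech-to-text is available, transcribe it into a text comment.

def transcribe_audio(audio_path):
    # Placeholder only: a real system would invoke a speech-to-text
    # engine here; no particular engine is implied.
    return "patient reports sharp pain when walking"

def add_audio_comment(annotation, audio_path):
    """Attach an audio comment and its transcription to an annotation."""
    annotation["audio"] = audio_path
    annotation["comment"] = transcribe_audio(audio_path)
    return annotation

note = add_audio_comment({"region": "heel"}, "comment_001.wav")
```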
  • FIG. 9 also identifies an act 912 of displaying the annotated graphical element(s). This act can include displaying the one or more graphical elements along with the selected annotation, including the selected highlighting options and the inputted at least one comment. For example, once an annotation has been created, the annotation user interface 300 can display the graphical element along with any appropriate visual cues indicating that an annotation exists. When selected, the visual cue can be replaced or expanded to fully display any highlighting options and/or the entered comments.
  • Of course, the method can include any number of additional acts. For example, the method can also include an act of uploading the annotated graphical element(s) to a clearinghouse, or of sending the annotated graphical element(s) to a destination. In some embodiments, the method can include uploading the one or more graphical elements and the selected annotation as a medical file to a clearinghouse. The medical file can be accessed from the clearinghouse and displayed with a selectable graphical indicator proximate the areas of the selected annotation which, when selected, renders the selected annotation. Rendering the selected annotation can comprise playing an audio comment, or playing a recorded video.
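  • By way of example only, uploading an annotated graphical element as a medical file might look like the following; the endpoint URL and payload fields are invented for illustration, and no real clearinghouse service or API is implied.

```python
# Hypothetical sketch: package an annotated graphical element as a
# medical file and post it to a clearinghouse endpoint.

import json
import urllib.request

def upload_medical_file(image_path, annotations,
                        url="https://clearinghouse.example/api/files"):
    """POST the image reference and its annotations as a JSON payload."""
    payload = json.dumps({"image": image_path,
                          "annotations": annotations}).encode("utf-8")
    request = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(request)   # clearinghouse response

# Example (not executed here):
# upload_medical_file("shoulder_mri.png",
#                     [{"shape": "circle", "color": "red",
#                       "comment": "possible rotator cuff tear"}])
```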
  • In addition to the foregoing, FIG. 10 illustrates that one or more additional implementations of providing information about a medical condition based on user input can comprise an act 1002 of presenting an anatomical subassembly. Act 1002 can include presenting a user with an anatomical subassembly representing an anatomical region of the human body, including one or more user-selectable display elements which, when selected, indicate one or more areas in the anatomical subassembly representing areas in which a medical condition is experienced in the human body.
  • A user (e.g., a patient or a doctor) can be presented with an anatomical subassembly 602 from which the user can select one or more anatomical regions or elements 604. In one or more embodiments, the medical condition can be pain experienced in the human body. Of course, the anatomical subassembly 602 can be presented in any appropriate static or dynamic manner, and interactive tools can be provided, such as tools for rotating the anatomical subassembly to provide a 360° view of the anatomical subassembly 602.
  • FIG. 10 illustrates that embodiments of the invention also include an act 1004 of receiving input selecting a display element, such as, for example, input selecting one or more of the user-selectable elements of the anatomical assembly. In one embodiment, the user touches or clicks on anatomical region/element 604 to select that region/element of the anatomical subassembly 602. This selection can also include presenting the user with other options for providing additional information about the selection. For example, the act of selecting can include presenting the user with one or more user-selectable interface elements for further selecting a severity level (e.g., a pain level) associated with the experienced medical condition.
  • FIG. 10 includes an act 1006 of presenting a selection of medical condition(s) to a user, such as a selection of one or more medical conditions corresponding to the one or more selected elements of the anatomical subassembly. This can also include presenting the user with a list or menu of medical conditions that may apply to the selected anatomical subassembly 602. From this list, the user can select one or more of the medical conditions (e.g., medical condition 606). Then, the user can be presented with information about the selected medical condition(s). Illustratively, the user may be presented with a condition information user interface 700 which provides textual, visual, and/or audio information about the selected condition. The condition information user interface 700 can also provide information about procedures for treating the one or more medical conditions.
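  • A minimal sketch of act 1006 follows: looking up candidate medical conditions for a selected anatomical element. The condition table below is invented purely for illustration and does not constitute medical guidance.

```python
# Hypothetical sketch: map a selected anatomical element to a list of
# candidate conditions the user can then explore (e.g., condition 606).

CONDITIONS_BY_REGION = {
    "heel": ["plantar fasciitis", "heel spur"],
    "arch": ["flat feet", "arch strain"],
    "toes": ["bunion", "ingrown toenail"],
}

def conditions_for(region):
    """Return the candidate conditions for the selected region."""
    return CONDITIONS_BY_REGION.get(region, [])

print(conditions_for("heel"))
```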
  • FIG. 10 also includes an act 1008 of presenting information about specialists. Act 1008 can include presenting the user with information about one or more medical specialists corresponding to one or more of the medical conditions, as well as the one or more selected elements of the anatomical subassembly. For example, when the user is a patient, the patient can send any gathered information and any user annotations to a specialist for remote diagnosis. Alternatively, when the user is a doctor, the user can send the information to another doctor for advice, additional opinions, etc.
  • It will be appreciated that the method can also comprise annotation by the annotation system 104. Thus, at any point, the user can be presented with one or more user-selectable annotation options for selecting one or more areas of the anatomical subassembly 602 for annotation. After annotations are made, these annotations can be uploaded as a medical file to a clearinghouse where they can be made accessible to doctors, patients, insurance companies, hospitals, etc.
  • Accordingly, FIGS. 1-10 provide a number of components and mechanisms for annotating graphical elements and for providing medical information or assistance based on selected graphical elements. One or more disclosed implementations also provide for a central clearinghouse for sharing graphical elements between annotation systems, data sources, and destinations.
  • The foregoing embodiments may also be practiced by a computer system including one or more processors and computer-readable media, such as computer memory or other storage media, which may be detachable from the processor(s). In particular, the computer memory may store computer-executable instructions that, when executed by one or more processors, cause various functions to be performed.
  • Embodiments may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: physical computer readable storage media and transmission computer readable media.
  • Physical computer readable storage media (device(s)) include RAM, ROM, EEPROM, CD-ROM or other optical disk storage (such as CDs, DVDs, etc.), magnetic disk storage or other magnetic storage devices, flash memory, thumb drives, portable memory drives, solid state disks, or any other physical medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. The storage devices do not consist of merely transitory carrier waves and/or merely transitory signals.
  • A "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above are also included within the scope of computer-readable media.
  • Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission computer readable media to physical computer readable storage media (or vice-versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface card or module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer readable physical storage media at a computer system. Thus, computer readable physical storage media can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device (e.g., CPU device(s)) to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
  • Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, tablet computers (e.g., iPads, Android tablets), message processors, hand-held devices (e.g., iPods), multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, other smart devices or interactive display devices, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
  • The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (20)

We claim:
1. A method, implemented at a computer system that includes one or more processors and system memory, for providing information about a medical condition based on user input, the method comprising:
presenting a user with an anatomical assembly representing an anatomical region of the human body, including one or more user-selectable display elements which, when selected, indicate one or more areas in the anatomical assembly representing areas in which a medical symptom is experienced in the human body;
receiving user input selecting one or more of the user-selectable elements of the anatomical assembly;
presenting the user with a selection of one or more medical conditions corresponding to the one or more selected elements of the anatomical assembly, wherein the one or more medical conditions are medical conditions corresponding to the one or more selected elements of the anatomical assembly; and
presenting the user with information about one or more medical specialists corresponding to one or more of the medical conditions as well as the one or more selected elements of the anatomical assembly.
2. The method of claim 1, wherein the medical symptom is selected from the group comprising pain experienced in the human body, inflammation, discoloration, and rash.
3. The method of claim 1, further comprising presenting the user with one or more user-selectable interface elements for selecting a severity level.
4. The method of claim 1, further comprising:
receiving user input selecting one or more of the medical conditions; and
presenting the user with information about the selected one or more medical conditions.
5. The method of claim 4, wherein the information about the selected one or more medical conditions includes:
one or more graphical images or videos illustrating the one or more medical conditions; and
a textual description of the one or more medical conditions.
6. The method of claim 5, wherein the information about the selected one or more medical conditions also includes information about procedures for treating the one or more medical conditions.
7. The method of claim 1, further comprising:
rotating the anatomical assembly in response to user input, to provide a 360 degree view of the anatomical assembly.
8. The method of claim 1, further comprising:
presenting one or more user-selectable annotation options for selecting one or more areas of the anatomical assembly for annotation.
9. One or more hardware storage devices having stored thereon computer executable instructions that, when executed by one or more processors of a computer system, cause the computer system to provide information about a medical condition based on user input, including at least the following:
presenting a user with an anatomical assembly representing an anatomical region of the human body, including one or more user-selectable display elements which, when selected, indicate one or more areas in the anatomical assembly representing areas in which a medical symptom is experienced in the human body;
receiving user input selecting one or more of the user-selectable elements of the anatomical assembly;
presenting the user with a selection of one or more medical conditions corresponding to the one or more selected elements of the anatomical assembly, wherein the one or more medical conditions are medical conditions corresponding to the one or more selected elements of the anatomical assembly; and
presenting the user with information about one or more medical specialists corresponding to one or more of the medical conditions as well as the one or more selected elements of the anatomical assembly.
10. The one or more hardware storage devices of claim 9, wherein the medical symptom is selected from the group comprising pain experienced in the human body, inflammation, discoloration, and rash.
11. The one or more hardware storage devices of claim 9, wherein the computer system also presents the user with one or more user-selectable interface elements for selecting a severity level.
12. The one or more hardware storage devices of claim 9, wherein the computer system also receives user input selecting one or more of the medical conditions, and presents the user with information about the selected one or more medical conditions.
13. The one or more hardware storage devices of claim 12, wherein the information about the selected one or more medical conditions includes:
one or more graphical images or videos illustrating the one or more medical conditions; and
a textual description of the one or more medical conditions.
14. The one or more hardware storage devices of claim 13, wherein the information about the selected one or more medical conditions also includes information about procedures for treating the one or more medical conditions.
15. The one or more hardware storage devices of claim 9, wherein the computer system also rotates the anatomical assembly in response to user input, to provide a 360 degree view of the anatomical assembly.
16. The one or more hardware storage devices of claim 9, wherein the computer system also presents one or more user-selectable annotation options for selecting one or more areas of the anatomical assembly for annotation.
17. A computer system, comprising:
one or more hardware processors; and
one or more hardware storage devices having stored thereon computer executable instructions that, when executed by the one or more processors, cause the computer system to perform at least the following:
present a user with an anatomical assembly representing an anatomical region of the human body, including one or more user-selectable display elements which, when selected, indicate one or more areas in the anatomical assembly representing areas in which a medical symptom is experienced in the human body;
receive user input selecting one or more of the user-selectable elements of the anatomical assembly;
present the user with a selection of one or more medical conditions corresponding to the one or more selected elements of the anatomical assembly, wherein the one or more medical conditions are medical conditions corresponding to the one or more selected elements of the anatomical assembly; and
present the user with information about one or more medical specialists corresponding to one or more of the medical conditions as well as the one or more selected elements of the anatomical assembly.
18. The computer system of claim 17, wherein the medical symptom is selected from the group comprising pain experienced in the human body, inflammation, discoloration, and rash.
19. The computer system of claim 17, wherein the computer system also receives user input selecting one or more of the medical conditions, and presents the user with information about the selected one or more medical conditions.
20. The computer system of claim 17, wherein the computer system also presents one or more user-selectable annotation options for selecting one or more areas of the anatomical assembly for annotation.
US14/477,540 (US20140372955A1), priority date 2010-12-17, filed 2014-09-04: Visual selection of an anatomical element for requesting information about a medical condition (Abandoned)

Applications Claiming Priority (5)

US201061424548P, priority date 2010-12-17, filed 2010-12-17
US201161442686P, priority date 2011-02-14, filed 2011-02-14
US201161442666P, priority date 2011-02-14, filed 2011-02-14
US13/093,272 (US8843852B2), priority date 2010-12-17, filed 2011-04-25: Medical interface, annotation and communication systems
US14/477,540 (US20140372955A1), priority date 2010-12-17, filed 2014-09-04: Visual selection of an anatomical element for requesting information about a medical condition

Related Parent Applications (1)

US13/093,272 (US8843852B2), priority date 2010-12-17, filed 2011-04-25: Medical interface, annotation and communication systems (Division)

Publications (1)

US20140372955A1, published 2014-12-18

Family ID: 46236179

Family Applications (2)

US13/093,272 (US8843852B2), priority date 2010-12-17, filed 2011-04-25: Medical interface, annotation and communication systems (status: Expired - Fee Related)
US14/477,540 (US20140372955A1), priority date 2010-12-17, filed 2014-09-04: Visual selection of an anatomical element for requesting information about a medical condition (status: Abandoned)

Country Status (1)

US (2): US8843852B2

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130290826A1 (en) * 2011-12-27 2013-10-31 Toshiba Medical Systems Corporation Medical image display apparatus and medical image archiving system
US9715753B2 (en) 2013-01-23 2017-07-25 Orca Health, Inc. Personalizing medical conditions with augmented reality
WO2021067334A1 (en) * 2019-10-03 2021-04-08 Nsv, Inc. Automated process for controlling in vivo examination of the cervix and collecting image data related thereto
GB2604290A (en) * 2019-10-03 2022-08-31 Nsv Inc Automated process for controlling in vivo examination of the cervix and collecting image data related thereto
GB2604290B (en) * 2019-10-03 2023-07-19 Nsv Inc Automated process for controlling in vivo examination of the cervix and collecting image data related thereto

Also Published As

Publication number Publication date
US8843852B2 (en) 2014-09-23
US20120159391A1 (en) 2012-06-21

Similar Documents

Publication Title
US8843852B2 (en) Medical interface, annotation and communication systems
AU2020202337B2 (en) Characterizing states of subject
Kuhn et al. Clinical documentation in the 21st century: executive summary of a policy position paper from the American College of Physicians
US10037820B2 (en) System and method for managing past, present, and future states of health using personalized 3-D anatomical models
US11900266B2 (en) Database systems and interactive user interfaces for dynamic conversational interactions
US10372802B2 (en) Generating a report based on image data
US8520978B2 (en) Methods, computer program products, apparatuses, and systems for facilitating viewing and manipulation of an image on a client device
CN116344071A (en) Informatics platform for integrating clinical care
Weinberger et al. MyPACS.net: a Web-based teaching file authoring tool
JP2008506188A (en) Gesture-based reporting method and system
DE112009003492T5 (en) Systems and methods for extracting, holding, and transmitting clinical elements in a widget-based application
WO2020236678A1 (en) Apparatus for generating and transmitting annotated video sequences in response to manual and image input devices
McGrath et al. Optimizing radiologist productivity and efficiency: Work smarter, not harder
Palagin et al. Hospital Information Smart-System for Hybrid E-Rehabilitation.
Berkowitz et al. Interactive multimedia reporting technical considerations: HIMSS-SIIM Collaborative White Paper
Pereira et al. Improving access to clinical practice guidelines with an interactive graphical interface using an iconic language
Arunachalan et al. Designing portable solutions to support collaborative workflow in long-term care: A five point strategy
Conte et al. Development of a Mobile App for training health professionals in diagnostic imaging: a progress report
US20190279404A1 (en) Methods and program product for mapping of functional panels onto available physical displays
CN102609175B (en) Sketch map omniselector is used to apply sequence-level operation and the system and method for movement images
Miller Bone Metastasis on Temporal Subtraction Images from Serial CT Scans
Nah Providing Personal Health Records in Malaysia - A Portable Prototype
Takeshita Usability Evaluation for Handheld Devices: Presenting Clinical Evidence at the Point of Care

Legal Events

Date Code Title Description
AS Assignment
Owner name: ORCA MD, LLC, UTAH
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BERRY, MATTHEW M.;BERRY, ROBERT M.;CHAPMAN, WESLEY D.;AND OTHERS;SIGNING DATES FROM 20110329 TO 20110403;REEL/FRAME:033672/0055
Owner name: ORCA HEALTH, INC., UTAH
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ORCA MD, LLC;REEL/FRAME:033672/0094
Effective date: 20120116

AS Assignment
Owner name: WORKMAN NYDEGGER PC, UTAH
Free format text: LIEN;ASSIGNOR:ORCA HEALTH, INC;REEL/FRAME:045086/0195
Effective date: 20180118

STCB Information on status: application discontinuation
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment
Owner name: ORCA HEALTH, INC., UTAH
Free format text: RELEASE OF ATTORNEY'S LIEN;ASSIGNOR:WORKMAN NYDEGGER;REEL/FRAME:049998/0967
Effective date: 20190808

AS Assignment
Owner name: COCHLEAR LIMITED, AUSTRALIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ORCA HEALTH INC.;REEL/FRAME:050376/0124
Effective date: 20190811