
US20140323856A1 - Magnetic particle detection with incubation period - Google Patents


Info

Publication number
US20140323856A1
US20140323856A1 · US14/352,693 · US201214352693A
Authority
US
United States
Prior art keywords
garment
feedback
subject
sensors
recited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/352,693
Inventor
Sytske Foppen
Bernardus Jozef Maria Beerling
Willemina Maria Huijnen-Keur
Hendrik Jan De Graaf
Danielle Walthera Maria Kemper-Van De Wiel
Roland Antonius Johannes Gerardus Smits
Albert Hendrik Jan Immink
Femke Karina de Theije
Wendela Meertens
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV
Priority to US14/352,693 (US20140323856A1)
Priority claimed from PCT/IB2012/055729 (WO2013057703A1)
Assigned to KONINKLIJKE PHILIPS N.V. Assignment of assignors interest (see document for details). Assignors: MANZKE, ROBERT; CHAN, RAYMOND; KLINDER, TOBIAS
Publication of US20140323856A1
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00: Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48: Biological material, e.g. blood, urine; Haemocytometers
    • G01N33/50: Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing
    • G01N33/53: Immunoassay; Biospecific binding assay; Materials therefor
    • G01N33/543: Immunoassay; Biospecific binding assay; Materials therefor with an insoluble carrier for immobilising immunochemicals
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/113: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B01: PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L: CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L3/00: Containers or dishes for laboratory use, e.g. laboratory glassware; Droppers
    • B01L3/50: Containers for the purpose of retaining a material to be analysed, e.g. test tubes
    • B01L3/502: Containers for the purpose of retaining a material to be analysed, e.g. test tubes with fluid transport, e.g. in multi-compartment structures
    • B01L3/5027: Containers for the purpose of retaining a material to be analysed, e.g. test tubes with fluid transport, e.g. in multi-compartment structures by integrated microfluidic structures, i.e. dimensions of channels and chambers are such that surface tension forces are important, e.g. lab-on-a-chip
    • B01L3/502761: Containers for the purpose of retaining a material to be analysed, e.g. test tubes with fluid transport, e.g. in multi-compartment structures by integrated microfluidic structures, i.e. dimensions of channels and chambers are such that surface tension forces are important, e.g. lab-on-a-chip specially adapted for handling suspended solids or molecules independently from the bulk fluid flow, e.g. for trapping or sorting beads, for physically stretching molecules
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/061: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B5/064: Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body using markers
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802: Sensor mounted on worn items
    • A61B5/6804: Garments; Clothes
    • A61B5/6805: Vests
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/74: Details of notification to user or communication with user or patient; user input means
    • A61B5/742: Details of notification to user or communication with user or patient; user input means using visual displays
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B01: PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L: CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L2200/00: Solutions for specific problems relating to chemical or physical laboratory apparatus
    • B01L2200/06: Fluid handling related problems
    • B01L2200/0647: Handling flowable solids, e.g. microscopic beads, cells, particles
    • B01L2200/0668: Trapping microscopic beads
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B01: PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L: CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L2300/00: Additional constructional details
    • B01L2300/08: Geometry, shape and general structure
    • B01L2300/0861: Configuration of multiple channels and/or chambers in a single devices
    • B01L2300/0877: Flow chambers
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B01: PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L: CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L2400/00: Moving or stopping fluids
    • B01L2400/04: Moving fluids with specific forces or mechanical means
    • B01L2400/0403: Moving fluids with specific forces or mechanical means specific forces
    • B01L2400/043: Moving fluids with specific forces or mechanical means specific forces magnetic forces
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B01: PHYSICAL OR CHEMICAL PROCESSES OR APPARATUS IN GENERAL
    • B01L: CHEMICAL OR PHYSICAL LABORATORY APPARATUS FOR GENERAL USE
    • B01L2400/00: Moving or stopping fluids
    • B01L2400/04: Moving fluids with specific forces or mechanical means
    • B01L2400/0475: Moving fluids with specific forces or mechanical means specific mechanical means and fluid pressure
    • B01L2400/0487: Moving fluids with specific forces or mechanical means specific mechanical means and fluid pressure fluid pressure, pneumatics
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N35/00: Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
    • G01N35/0098: Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor involving analyte bound to insoluble magnetic carrier, e.g. using magnetic separation

Definitions

  • This disclosure relates to optical shape sensing and more particularly to systems and methods which employ optical shape sensing and sensory feedback for medical applications.
  • Breathing motion can be a problem in any thoracic or abdominal imaging procedure because, e.g., standard image reconstruction algorithms usually implicitly assume a static scan object. Motion therefore can affect imaging of larger volumes when the time needed for the imaging procedure is comparable to or even longer than the respiratory period. Image-guided interventional procedures can suffer from the effect of motion during the intervention, since this can make automated assignment of reference points in the live images difficult.
  • Devices for monitoring respiratory motion such as spirometers or breathing belts can yield only crude information about the patient's breathing motion state at specific points in time during the acquisition.
  • Precise knowledge of both the state of the respiratory cycle and the shape of body contours and surfaces at a given time becomes increasingly important. Even if breathing is accounted for in images, this may not be sufficient feedback for a physician attempting an operative task, such as inserting a needle or the like.
  • Pre-operative information (e.g., patient-specific anatomy) and direct, real-time feedback of certain interactions (e.g., needle insertion) can both assist the physician during such procedures.
  • A system for providing sensory feedback includes a garment configured to flexibly and snugly fit over a portion of a subject.
  • the garment includes one or more sensors disposed therein to monitor activity of the subject or monitor points of interest of the subject.
  • An interpretation module is coupled with the sensors to receive sensor signals and interpret the sensor signals to determine if conditions are met to provide feedback signals to the garment.
  • a feedback modality is incorporated into the garment and is responsive to the feedback signals such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure.
  • the feedback modality may be responsive to feedback signals from sensors on the garment or sensors located elsewhere, e.g., on other equipment, on an interventional device, etc.
  • a garment providing sensory feedback includes a fabric configured to fit over at least a portion of a subject.
  • One or more sensors are incorporated in the fabric to perform measurements from the subject.
  • a feedback modality is incorporated into the fabric and is responsive to one or more feedback signals derived from measurements such that the feedback modality emits energy from the fabric to provide sensory information to assist a physician during a procedure.
  • the feedback modality may be responsive to feedback signals from sensors on the garment or sensors located elsewhere, e.g., on other equipment, on an interventional device, etc.
  • a method for providing sensory feedback includes providing a garment configured to fit over at least a portion of a subject, one or more sensors incorporated in the garment to perform measurements from the subject, and a feedback modality incorporated into the garment and being responsive to the one or more feedback signals derived from measurements such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure; generating the one or more feedback signals derived from the measurements; and activating the feedback modality in accordance with the feedback signals to emit energy from the garment to provide sensory information to assist a physician.
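The system, garment, and method summarized above share one measure-interpret-actuate loop: take measurements from the subject, derive feedback signals from them, and activate the feedback modality. A rough sketch of that loop, in which the callable names and the threshold predicate are hypothetical illustrations rather than anything specified in this disclosure:

```python
def feedback_loop(read_sensors, condition_met, activate_feedback):
    """One pass of the loop: measure, derive feedback signals, emit energy."""
    measurements = read_sensors()                    # e.g., strains, temperatures
    signals = [m for m in measurements if condition_met(m)]
    for s in signals:
        activate_feedback(s)                         # e.g., light an LED near the event
    return signals
```

In practice `read_sensors` would wrap the garment's sensor interrogation, `condition_met` the interpretation module's programmed conditions, and `activate_feedback` the controller driving the feedback modality.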
  • FIG. 1 is a block/flow diagram showing an illustrative system/method which employs sensory feedback from a patient during a procedure in accordance with the present principles
  • FIG. 2 is a schematic diagram showing a garment for generating sensory feedback in accordance with one illustrative embodiment
  • FIG. 3 is a diagram showing a garment or manifold in accordance with another illustrative embodiment.
  • FIG. 4 is a flow diagram showing steps for measuring data and generating sensory feedback from a garment in accordance with an illustrative embodiment of the present invention.
  • a measurement system and method provide an adaptable and optimized setup configured to provide status feedback of a patient.
  • Lighting, sensors or other feedback mechanisms are associated with a patient's body surface (e.g., by including them in a medical vest or other garment) to provide visual/audio feedback to a physician of a physical state or states of the patient and also to account for augmented reality.
  • Augmented reality refers to the use of pre-operatively collected data employed during a present-time procedure.
  • the data collected preoperatively may be employed to indicate points of interest directly on a patient during a procedure.
  • shape sensing technology is employed to determine shape information of a patient's body.
  • the shape sensing data may be collected using a garment equipped with shape sensing technology.
  • Shape information can be derived from a variety of systems. These include: optical shape interrogation systems (e.g., fiber optic Bragg sensors, Rayleigh scattering, Brillouin scatter, optical intensity-based attenuation), multi-coil arrays for electromagnetic (EM) localization of points on the apparatus, laser scanning systems for three-dimensional surface estimation and optical/acoustic marker/emitter arrays for camera (time-of-flight or conventional optical measurement) or microphone-based interrogation of shape.
  • Real-time imaging such as ultrasound may also be used for shape information, but the clinical viability of that approach depends on additional cost and clinical value of tomographic information relative to the imaging performed.
  • a shape sensing garment may be equipped with sensors and/or feedback devices to indicate positions on the patient or a status or activity of the patient (e.g., breathing cycles, swallowing, muscle twitching, etc.).
  • one or more shape sensing optical fibers are employed in a garment with electrical, thermal or other measurement sensors. The fibers and other sensors work in conjunction to provide signals to feedback devices such as light emitting diodes or other feedback mechanisms to give guidance to the physician using preoperative data or currently measured data.
  • the present invention will be described in terms of medical instruments; however, the teachings of the present invention are much broader and are applicable to any instruments employed in tracking or analyzing complex biological or mechanical systems.
  • the present principles are applicable to internal tracking procedures of biological systems, procedures in all areas of the body such as the lungs, gastro-intestinal tract, excretory organs, blood vessels, etc.
  • the elements depicted in the FIGS. may be implemented in various combinations of hardware and software and provide functions which may be combined in a single element or multiple elements.
  • The term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (“DSP”) hardware, read-only memory (“ROM”) for storing software, random access memory (“RAM”), non-volatile storage, etc.
  • embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system.
  • a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
  • Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
  • System 100 may include a workstation or console 112 from which a procedure is supervised and managed. Procedures may include any procedure including but not limited to biopsies, ablations, injection of medications, etc. Workstation 112 preferably includes one or more processors 114 and memory 116 for storing programs and applications. It should be understood that the function and components of system 100 may be integrated into one or more workstations or systems.
  • Memory 116 may store an interpretation module 115 configured to interpret electromagnetic, optical, acoustic, etc. feedback signals from a sensitized flexible garment 106 .
  • the garment 106 may include fiber sensors, optical, acoustic, electrical or electromagnetic markers or sensors, etc. embedded therein with known geometry or with a geometry that is initialized before use.
  • a shape interrogation console 122 measures the marker/sensor distribution over the surface of interest and supplies feedback about calibration/reference sections and measurement sections to the interpretation module 115 .
  • the shape interrogation module 122 sends and receives light to/from optical fibers or provides electrical power or signals to sensors 104 .
  • the optical fiber sensors 104 are woven or otherwise integrated into garment 106 in a pattern that allows for stretching of the underlying textile substrate while accounting for the fact that the overall fiber sensor length in the textile can change only minimally (e.g., a 2D spiral pattern or 2D sinusoidal pattern embedded within the flexible membrane).
  • the fibers for sensors 104 are locally anchored at control points to provide a strain in the fiber during the flexure of the subject 148 .
  • control points can constrain the fiber in all degrees of freedom relative to the mesh, e.g., at the fiber tip, whereas others can allow for a sliding degree of freedom so that the fiber can slide freely relative to the mesh pattern to accommodate any overall path length changes in the patterned structure as the mesh deforms.
  • the interpretation module 115 may include the capability of receiving multiple inputs from multiple devices or systems to interpret an event or dynamic occurrence during a medical procedure, diagnostic test, etc.
  • a medical imaging device 110 and/or a tracking module 117 may also be included and may provide additional feedback to the interpretation module 115 .
  • the interpretation module 115 is configured to use the signal feedback (and any other feedback) to account for errors or aberrations related to dynamic changes of a patient's body.
  • a subject 148 or a region of interest 140 on the subject 148 is covered or constrained by the flexible garment 106 .
  • the flexible garment 106 may include a fabric or netting configured to stretch corresponding with movement or flexure of the subject 148 or the region of interest 140 .
  • garment 106 includes a feedback modality or feedback mechanisms 108 .
  • the feedback mechanisms 108 are configurable to react to stimuli collected by sensors 104 (or by other external data).
  • feedback mechanisms 108 include lights, such as light emitting diodes (LEDs) 109 integrated into the fabric of garment 106 .
  • the LEDs 109 are distributed within and throughout the garment 106 to provide feedback as to locations and/or events occurring relative to the patient.
  • a medical device 102 may include, e.g., a needle, a catheter, a guide wire, an endoscope, a probe, a robot, an electrode, a filter device, a balloon device or other medical component, etc.
  • the device 102 is to be inserted in the patient.
  • the device 102 has its coordinate system registered to pre-operative data 119 (e.g., image data).
  • an LED 109 nearest to a tip of the device 102 (which can be tracked by a tracking device 107 ) is illuminated on the garment 106 to provide visual feedback to the physician and any other person present in the environment.
  • One or more tracking devices or cameras 107 may be incorporated into the device 102 , so tracking information can be provided.
  • the tracking devices 107 may include electromagnetic (EM) trackers, fiber optic tracking, robotic positioning systems, cameras, etc.
  • breathing, heartbeat and swallowing data are collected by sensors 104 in the garment 106 .
  • the patient's breathing is visually indicated by a first LED (e.g., a white LED illuminated at each breathing cycle), the patient's heartbeat is indicated by a second LED (e.g., a red LED that is illuminated at each beat) and the patient's swallowing is indicated by a third LED (e.g., a blue LED that is illuminated at each swallow).
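The color-per-event mapping described above can be sketched as a small lookup table. The event names, LED indices, and return format below are illustrative assumptions, not part of this disclosure; only the white/red/blue color assignments come from the text:

```python
# Hypothetical event-to-LED table; colors follow the example in the text
# (white = breathing, red = heartbeat, blue = swallowing).
EVENT_LEDS = {
    "breath": (0, "white"),
    "heartbeat": (1, "red"),
    "swallow": (2, "blue"),
}

def feedback_commands(events):
    """Map sensed events to (led_index, color) commands, ignoring unknown events."""
    return [EVENT_LEDS[e] for e in events if e in EVENT_LEDS]
```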
  • the garment 106 is spatially registered with pre-operative data 119 , such as pre-operative images.
  • image landmarks may be indicated using LEDs 109 to assist in locating a proper insertion point; in addition, a depth of the needle may be indicated on the garment 106 by the number or color of LEDs 109 that have been lit.
  • the system 100 can be configured to be sensitive to any event or movement of a patient.
  • the sensors 104 are configured to relay positional, temperature, electric field information, shape information, etc. back to the interpretation module 115 .
  • the garment 106 may be disposed over a mid-section of a patient ( 148 ) such that during a breathing cycle sensors 104 sense the dynamic shape changes of the abdomen or chest. This information may be interpreted using the interpretation module 115 which computes distances or changes in distances between nodes or positions in the garment 106 .
  • the mesh deflections may then be employed to account for breathing in images taken by an imaging device 110 or assist in the timing of an action during a medical procedure (e.g., inserting a device on an exhale, etc.), or any other event or action that needs compensation for dynamic changes.
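A minimal sketch of the distance computation attributed to the interpretation module 115: nodes are taken as 3-D points on the garment, and deformation is reported as the change in inter-node distance relative to a reference pose. The node representation and function names are assumptions for illustration only:

```python
import math

def node_distances(nodes):
    """Distances between consecutive garment nodes (3-D points)."""
    return [math.dist(a, b) for a, b in zip(nodes, nodes[1:])]

def deformation(reference, current):
    """Per-segment change in inter-node distance relative to a reference pose."""
    return [c - r for r, c in zip(node_distances(reference), node_distances(current))]
```

A time series of such deformation vectors over the chest or abdomen would trace the breathing cycle, which is what the mesh-deflection compensation above relies on.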
  • the garment 106 may be applied to any portion of the subject's anatomy to collect dynamic data.
  • the garment 106 may be placed over the arms, legs, abdomen, chest, neck, head, or combinations thereof.
  • the garment 106 may be made adjustable to be configured for different sized subjects, different sized appendages, etc.
  • the garment 106 may be employed during a medical procedure to assist a clinician in performing the procedure.
  • Workstation 112 may include a display 118 for viewing internal images of the subject 148 using the imaging system 110 .
  • the imaging system 110 may include one or more imaging modalities, such as, e.g., ultrasound, photoacoustics, a magnetic resonance imaging (MRI) system, a fluoroscopy system, a computed tomography (CT) system, positron emission tomography (PET), single photon emission computed tomography (SPECT), or other system.
  • Imaging system 110 may be provided to collect real-time intra-operative imaging data.
  • the imaging data may be displayed on display 118 .
  • Display 118 may permit a user to interact with the workstation 112 and its components and functions. This is further facilitated by an interface 120 which may include a keyboard, mouse, a joystick or any other peripheral or control to permit user interaction with the workstation 112 .
  • a controller module 126 or other device is provided to condition signals and to control feedback mechanisms 108 .
  • the controller module 126 may generate control signals to control various controllers, sensors, radiation sources/beams, etc. in accordance with programmed conditions for which feedback is desired. The manner of the feedback response can also be programmed.
  • the controller 126 receives data from the interpretation module 115 and issues commands or signals to the feedback mechanisms 108 .
  • the interpretation module 115 dynamically provides information collected and interpreted from the sensors 104 in garment 106 to the controller 126 to render the feedback to the physician in real-time, which may then be employed in administering medication, making decisions, etc.
  • garment 106 includes a vest or manifold 202 .
  • the vest 202 is formed from a mesh or fabric 206 .
  • the mesh 206 or vest 202 measures body surface deformation continuously in time and space with high spatial resolution (e.g., a shape sensing vest).
  • the vest 202 is preferably flexible with a snug fit over the subject 148 .
  • sensing fibers 210 are integrated in the vest 202 and are employed to determine a shape of the chest of the subject 148 .
  • the sensing fibers 210 are also employed to determine dynamic geometry changes in the subject 148 and/or monitor a status of the subject 148 .
  • the sensing fiber(s) 210 may include a single optical fiber integrated into the vest 202 that spirals around the subject's body and hence delivers a sufficient picture of the geometry or may include multiple fibers integrated into the vest 202 .
  • the sensing fibers 210 may include one or more fiber Bragg gratings (FBGs). An FBG is a segment of an optical fiber that reflects particular wavelengths of light and transmits all others. This is achieved by adding a periodic variation of the refractive index in the fiber core, which generates a wavelength-specific dielectric mirror.
  • An FBG can therefore be used as an inline optical filter to block certain wavelengths, or as a wavelength-specific reflector.
  • a fundamental principle behind the operation of a FBG is Fresnel reflection at each of the interfaces where the refractive index is changing. For some wavelengths, the reflected light of the various periods is in phase so that constructive interference exists for reflection and consequently, destructive interference for transmission.
  • The Bragg wavelength is sensitive to strain as well as to temperature, which means that FBGs can be used as sensing elements in fiber optic sensors. In an FBG sensor, an applied strain (ε) and a change in temperature (ΔT) cause a shift ΔλB in the Bragg wavelength λB; the relative shift is approximately given by:
  • ΔλB/λB = Cs·ε + CT·ΔT
  • The coefficient Cs is called the coefficient of strain and its magnitude is usually around 0.8×10⁻⁶ per microstrain (με), or in absolute quantities about 1 pm/με.
  • The coefficient CT describes the temperature sensitivity of the sensor and is made up of the thermal expansion coefficient and the thermo-optic effect. Its value is around 7×10⁻⁶/K (or, as an absolute quantity, 13 pm/K).
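Using the relative-shift relation ΔλB/λB = Cs·ε + CT·ΔT and the coefficient magnitudes quoted above, the absolute wavelength shift can be computed directly. The function below is an illustrative sketch, not part of the disclosure; its default coefficients simply restate the approximate values given in the text:

```python
def bragg_shift_pm(lambda_b_nm, strain_ue, delta_t_k, c_s=0.8e-6, c_t=7e-6):
    """Absolute Bragg wavelength shift in pm for a grating at lambda_b_nm.

    strain_ue is in microstrain (c_s is quoted per microstrain);
    delta_t_k is the temperature change in kelvin.
    """
    relative_shift = c_s * strain_ue + c_t * delta_t_k
    return lambda_b_nm * relative_shift * 1e3  # nm -> pm
```

For a 1550 nm grating, 1 με of strain alone gives 1550 × 0.8e-6 nm ≈ 1.24 pm, consistent with the roughly 1 pm/με quoted above.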
  • One of the main advantages of the technique is that various sensor elements can be distributed over the length of a fiber. Incorporating three or more cores with various sensors (gauges) along the length of a fiber that is embedded in a structure allows for evaluation of the curvature of the structure as a function of longitudinal position and hence for the three dimensional form of such a structure to be precisely determined.
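One way to see how multiple cores yield curvature: under pure bending, a core offset a distance d from the fiber's neutral axis experiences strain ε = ±κ·d, so two cores placed symmetrically about the axis give the curvature from their strain difference. A sketch under that simplified one-bending-plane assumption (the function and parameter names are illustrative, not from the disclosure):

```python
def curvature_from_strains(eps_a, eps_b, core_offset_m):
    """Bending curvature (1/m) from strains of two cores placed symmetrically
    at +/- core_offset_m from the neutral axis: eps = +/- k*d, so
    k = (eps_a - eps_b) / (2*d)."""
    return (eps_a - eps_b) / (2.0 * core_offset_m)
```

With three or more cores the same idea, applied per bending plane and per longitudinal position, recovers curvature along the fiber and hence its three-dimensional shape.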
  • the vest 202 may include an integrated lighting arrangement 220 (LEDs) and/or a visual display screen 222 within the vest 202 .
  • the lighting arrangement 220 may include a grid or grids of LEDs 224 disposed on a surface or integrated within the vest 202 .
  • the grid of LEDs 224 may include or be associated with tracking devices 226 or with the sensing fibers 210 such that each LED 224 can be located relative to the subject 148 .
  • the grid pattern shown in FIG. 2 is illustrative and non-limiting as other configurations are also contemplated.
  • An initialization process may be employed once the vest 202 is securely positioned on the subject 148 to register the location of the vest 202 and/or the LEDs 224 (or display screen 222 ) with positions or references on the subject 148 .
  • a registration scheme is needed between pre-operative data (e.g., images) and intra-operative patient positions.
  • Such registration may employ known technologies and methods. For example, fiducial markers on the patient and/or in the vest 202 may be registered with corresponding points in the pre-operative images.
  • the signals from the sensors in the vest 202 are interpreted by the shape interpretation module (module 115 ), which generates signals that are addressed to one or more locations in the grid of LEDs 224 by the controller 126 ( FIG. 1 ).
  • the LEDs 224 that satisfy the programmed conditions are illuminated by the controller 126 and provide feedback to the physician or others.
  • In order to bridge the gap between the pre-operative information (in particular, the anatomical information provided by a computed tomography (CT) or a magnetic resonance (MR) scan together with associated plans, e.g., needle insertion points on the body surface) and the live view of the physician in the operating room, augmented reality is employed.
  • This may entail a camera system mounted on a head of the physician together with a special set of glasses that have been designed so that while looking through the glasses the physician can virtually overlay preoperative information over a patient or region of interest. This feature can be switched on or switched off.
  • relevant pre-operative information may be displayed directly on the body surface by making use of vest 202 with integrated lighting or display screen 222 , which is worn by the patient throughout the procedure.
  • This display screen 222 may include a flexible display integrated with the fabric of vest 202 .
  • preoperative images may be registered to the patient and displayed on the screen 222 .
  • Representations of instruments or overlays may be generated on the screen in accordance with information received from the sensors to integrate visual feedback via integrated display/lighting components.
  • the visual feedback feature(s) of the vest 202 may be used independently of the shape sensing.
  • the LEDs 224 provide visual feedback in the form of information encoded spatially and temporally via color, intensity, phase/timing, direction, etc. Depending on the application, these LEDs 224 can be spread sparsely or densely. Information can be presented to reflect guidance information for navigation of an instrument through an access port based on real-time measurements of surface anatomy deformation (which in turn reflects organ and target motion relative to an original planned path). Other sensory information may be provided to the physician and can take the form of acoustic or haptic/tactile feedback that is spatially modulated over the surface of the vest/manifold 202.
  • knowing the deformation of the outer body surface is of wide interest for many applications.
  • One application includes respiratory motion compensation which is a problem for many image guided interventions.
  • Another application is the deformation of organs and tissue due to applied forces during the interventions (e.g., needle insertion).
  • the vest provides effective feedback, e.g., through visualization.
  • the visual feedback of deformation measurements makes use of lighting from LEDs 224 or from the display screen 222 included in the clothing.
  • the sensor/feedback enabled clothing itself may take the form of a surgical drape or other sterile manifold disposed over the patient. This component may be reusable or disposable in nature, depending on the application and costs involved.
  • the use for this application would be to convert a measurement from the shape sensing fibers into an LED signal. By doing so, the physician would have direct visual feedback, e.g., of how much deformation was caused when the needle was inserted or where areas of high respiratory motion exist.
  • a pre-procedural scan of the patient is acquired.
  • the patient wears the shape sensing vest 202 with the lighting system included.
  • a continuous mapping of the outer surface can be calculated between the pre-operative data and the current patient deformation. This mapping also gives an estimate of the internal organ position (plus locations of derived information, e.g., during planning as a region of interest or insertion point).
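The surface-to-interior mapping described above can be approximated in many ways. As a minimal sketch, assuming measured displacements at discrete garment nodes, the motion of an internal point can be estimated by inverse-distance weighting of the surrounding surface displacements (the function, variable names and weighting scheme are illustrative, not taken from the disclosure):

```python
import math

def estimate_internal_shift(point, surface_nodes, displacements, power=2):
    """Estimate the displacement of an internal point from measured
    surface-node displacements via inverse-distance weighting.
    All names and the weighting scheme are illustrative assumptions."""
    weights = []
    weighted = [0.0, 0.0, 0.0]
    for node, disp in zip(surface_nodes, displacements):
        d = math.dist(point, node)
        if d < 1e-9:                     # point coincides with a node
            return list(disp)
        w = 1.0 / d ** power
        weights.append(w)
        for i in range(3):
            weighted[i] += w * disp[i]
    total = sum(weights)
    return [v / total for v in weighted]
```

For example, with nodes at (0, 0, 0) and (2, 0, 0) displaced by (0, 0, 1) and (0, 0, 3), the midpoint (1, 0, 0) is estimated to move by (0, 0, 2).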
  • the lighting system LEDs 224 and/or display screen 222 are employed to display useful information during the intervention or other procedure, e.g., to activate certain LEDs 224 to display the insertion points or even to use the LEDs 224 to display the organ position mapped on a surface of vest 202 .
  • the shape sensing vest 202 permits tracking of patient/organ deformation in real-time.
  • the LED feedback is able to account for, e.g., respiratory motion, reactionary forces, etc.
  • Display screen 222 may include a higher-resolution flexible display integrated in the vest 202 for displaying medical (imaging) data.
  • the medical data can be filtered or transformed depending on input of the deformation sensors 210 in the vest 202 or other data acquired by other sensors, e.g., location of a needle tip.
  • the display screen 222 and/or LEDs 224 may indicate medical alarms and/or sites where complications may have occurred.
  • the vest 202 may be employed to display medical condition diagnoses depending on other sensors internal or external to the vest 202 .
  • a garment 302 can have sterility maintaining access ports 310 to permit perforations to be made through the sensing manifold by percutaneous interventional devices.
  • the garment 302 can have a modular design which permits a re-usable and sterilizable sensing matrix 312 to be embedded or mated with a sterile, disposable portion 315 that makes direct contact with the patient 148.
  • Portion 315 includes a surface or interface 314 which couples or connects with matrix 312 .
  • garment 302 e.g., a pair of trunks, a bra, a skull cap, a sock, a glove, etc.
  • the clothing may include elastic, Spandex™ or another form of elastic material.
  • the garment 302 (or any other garment in accordance with the present principles) may include adjustment mechanisms 304 to be adaptively sized to snugly fit a patient.
  • the adjustment mechanisms 304 may include hook and loop connectors, buckles, elastic bands, zippers, snaps, adhesive strips, inflatable cuffs, suction devices or any other devices for connecting to a body surface and/or adjusting to a size.
  • a method for providing sensory feedback is illustratively shown in accordance with illustrative embodiments.
  • a garment is provided, which is configured to fit over at least a portion of a subject, e.g., an arm, leg, chest, etc.
  • One or more sensors are incorporated in the garment to perform measurements of the subject.
  • a feedback modality is incorporated into the garment and is responsive to one or more feedback signals derived from measurements of the one or more sensors or other devices, such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure.
  • Sensory information may include light, sound, tactile information, etc.
  • Embodiments may include one or more types of sensory information employed together.
  • the garment may include one or more shape sensing optical fibers disposed therein.
  • the optical fibers may be woven into the fabric of the garment and are preferably attached to the fabric such that flexing the fabric imparts a strain in the fiber.
  • the optical fibers may be tied to the fabric, glued or otherwise coupled to the fabric to permit flexure but also to constrain the motion of the portion of the subject.
  • one or more feedback signals are generated.
  • the feedback signals are preferably derived from measurements of the one or more sensors.
  • Other sensors or devices may also be employed to trigger the generation of feedback signals. These sensors or devices may include monitoring devices or stored data not necessarily located on the garment.
  • the measurements of the one or more sensors are interpreted to determine one or more of anatomical movement, anatomical function, a position of an interventional device, etc.
  • the feedback modality is activated to emit energy from the garment to provide sensory information to assist a physician.
  • lights or other devices may be selectively illuminated in an array to indicate, e.g., a location of an anatomical feature, anatomical movement, pre-operative data, a position of an interventional device, etc.
  • a display screen may be illuminated to render images to indicate a location of an anatomical feature, anatomical movement, pre-operative data, a position of an interventional device, etc.
  • the images may include pre-operative images.
  • decisions are made, procedures conducted, etc. using the sensory feedback generated by the garment.
  • the sensory feedback may be employed by physicians to determine a patient status, determine triggering events, understand the boundaries of an organ, obtain anatomical responses, etc.
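The measure-interpret-activate loop of the method above can be sketched as follows, assuming hypothetical callables for the sensor readout, the interpretation module and the feedback modality (none of these names come from the disclosure):

```python
def sensory_feedback_cycle(read_sensors, interpret, emit_feedback):
    """One measure-interpret-feedback cycle of the method.

    read_sensors()   -> raw measurements from the garment's sensors
    interpret(m)     -> list of feedback signals (possibly empty)
    emit_feedback(s) -> drives the feedback modality (light, sound, tactile)

    All three callables are hypothetical stand-ins for the garment hardware.
    """
    measurements = read_sensors()
    signals = interpret(measurements)
    for signal in signals:
        emit_feedback(signal)
    return signals
```

In a real system this cycle would run continuously, with the interpretation step deciding which programmed conditions are met before any feedback is emitted.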


Abstract

A system for providing sensory feedback includes a garment (106) that is configured to flexibly and snugly fit over a portion of a subject. The garment includes one or more sensors (104) disposed therein to monitor activity of the subject or monitor points of interest of the subject. An interpretation module (115) is coupled with the sensors to receive sensor signals and interpret the sensor signals to determine if conditions are met to provide feedback signals to the garment. A feedback modality (108) is incorporated into the garment and is responsive to the feedback signals such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • A related application is “Motion Compensation and Patient Feedback in Medical Imaging Systems,” PCT/IB2011/051340, filed Mar. 29, 2011.
  • This disclosure relates to optical shape sensing and more particularly to systems and methods which employ optical shape sensing and sensory feedback for medical applications.
  • Breathing motion can be a problem in any thoracic or abdominal imaging procedure because, e.g., standard image reconstruction algorithms usually implicitly assume a static scan object. Motion therefore can affect imaging of larger volumes when the time needed for the imaging procedure is comparable to or even longer than the respiratory period. Image-guided interventional procedures can suffer from the effect of motion during the intervention, since this can make automated assignment of reference points in the live images difficult.
  • Devices for monitoring respiratory motion such as spirometers or breathing belts can yield only crude information about the patient's breathing motion state at specific points in time during the acquisition. With the increasing accuracy of medical imaging modalities and the emergence of new applications such as image-guided interventions, precise knowledge of both the state of the respiratory cycle and shape of body contours and surfaces at a given time becomes increasingly important. Even if breathing is accounted for in images, it may not be sufficient feedback for a physician attempting an operative task, such as inserting a needle or the like.
  • Making pre-operative information (e.g., patient specific anatomy) effectively available during a procedure is a challenging task in image-guided interventions. Furthermore, having direct, real-time feedback of certain interactions (e.g., needle insertion) during a procedure can assist a physician in decision making or timing of his actions.
  • In accordance with the present principles, a system for providing sensory feedback includes a garment configured to flexibly and snugly fit over a portion of a subject. The garment includes one or more sensors disposed therein to monitor activity of the subject or monitor points of interest of the subject. An interpretation module is coupled with the sensors to receive sensor signals and interpret the sensor signals to determine if conditions are met to provide feedback signals to the garment. A feedback modality is incorporated into the garment and is responsive to the feedback signals such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure. The feedback modality may be responsive to feedback signals from sensors on the garment or sensors located elsewhere, e.g., on other equipment, on an interventional device, etc.
  • A garment providing sensory feedback includes a fabric configured to fit over at least a portion of a subject. One or more sensors are incorporated in the fabric to perform measurements from the subject. A feedback modality is incorporated into the fabric and is responsive to one or more feedback signals derived from measurements such that the feedback modality emits energy from the fabric to provide sensory information to assist a physician during a procedure. The feedback modality may be responsive to feedback signals from sensors on the garment or sensors located elsewhere, e.g., on other equipment, on an interventional device, etc.
  • A method for providing sensory feedback includes providing a garment configured to fit over at least a portion of a subject, one or more sensors incorporated in the garment to perform measurements from the subject, and a feedback modality incorporated into the garment and being responsive to one or more feedback signals derived from the measurements such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure; generating the one or more feedback signals derived from the measurements; and activating the feedback modality in accordance with the feedback signals to emit energy from the garment to provide sensory information to assist a physician. These and other objects, features and advantages of the present disclosure will become apparent from the following detailed description of illustrative embodiments thereof, which is to be read in connection with the accompanying drawings.
  • This disclosure will present in detail the following description of preferred embodiments with reference to the following figures wherein:
  • FIG. 1 is a block/flow diagram showing an illustrative system/method which employs sensory feedback from a patient during a procedure in accordance with the present principles;
  • FIG. 2 is a schematic diagram showing a garment for generating sensory feedback in accordance with one illustrative embodiment;
  • FIG. 3 is a diagram showing a garment or manifold in accordance with another illustrative embodiment; and
  • FIG. 4 is a flow diagram showing steps for measuring data and generating sensory feedback from a garment in accordance with an illustrative embodiment of the present invention.
  • In accordance with the present principles, a measurement system and method provide an adaptable and optimized setup configured to provide status feedback of a patient. Lighting devices or other feedback mechanisms are associated with a patient's body surface (e.g., by including them in a medical vest or other garment) to provide visual/audio feedback to a physician of a physical state or states of the patient and also to support augmented reality. Augmented reality refers to the use of pre-operatively collected data employed during a present-time procedure. In one embodiment, the data collected preoperatively may be employed to indicate points of interest directly on a patient during a procedure. In particularly useful embodiments, shape sensing technology is employed to determine shape information of a patient's body. The shape sensing data may be collected using a garment equipped with shape sensing technology.
  • Shape information can be derived from a variety of systems. These include: optical shape interrogation systems (e.g., fiber optic Bragg sensors, Rayleigh scattering, Brillouin scatter, optical intensity-based attenuation), multi-coil arrays for electromagnetic (EM) localization of points on the apparatus, laser scanning systems for three-dimensional surface estimation and optical/acoustic marker/emitter arrays for camera (time-of-flight or conventional optical measurement) or microphone-based interrogation of shape. Real-time imaging such as ultrasound may also be used for shape information, but the clinical viability of that approach depends on additional cost and clinical value of tomographic information relative to the imaging performed.
  • A shape sensing garment may be equipped with sensors and/or feedback devices to indicate positions on the patient or a status or activity of the patient (e.g., breathing cycles, swallowing, muscle twitching, etc.). In a particularly useful embodiment, one or more shape sensing optical fibers are employed in a garment with electrical, thermal or other measurement sensors. The fibers and other sensors work in conjunction to provide signals to feedback devices such as light emitting diodes or other feedback mechanisms to give guidance to the physician using preoperative data or currently measured data.
  • It should be understood that the present invention will be described in terms of medical instruments; however, the teachings of the present invention are much broader and are applicable to any instruments employed in tracking or analyzing complex biological or mechanical systems. In particular, the present principles are applicable to internal tracking procedures of biological systems, procedures in all areas of the body such as the lungs, gastro-intestinal tract, excretory organs, blood vessels, etc. The elements depicted in the FIGS. may be implemented in various combinations of hardware and software and provide functions which may be combined in a single element or multiple elements.
  • The functions of the various elements shown in the FIGS. can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (“DSP”) hardware, read-only memory (“ROM”) for storing software, random access memory (“RAM”), non-volatile storage, etc.
  • Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure). Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative system components and/or circuitry embodying the principles of the invention. Similarly, it will be appreciated that any flow charts, flow diagrams and the like represent various processes which may be substantially represented in computer readable storage media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
  • Furthermore, embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
  • Referring now to the drawings in which like numerals represent the same or similar elements and initially to FIG. 1, a system 100 for performing a medical procedure is illustratively depicted. System 100 may include a workstation or console 112 from which a procedure is supervised and managed. Procedures may include any procedure including but not limited to biopsies, ablations, injection of medications, etc. Workstation 112 preferably includes one or more processors 114 and memory 116 for storing programs and applications. It should be understood that the function and components of system 100 may be integrated into one or more workstations or systems.
  • Memory 116 may store an interpretation module 115 configured to interpret electromagnetic, optical, acoustic, etc. feedback signals from a sensitized flexible garment 106. The garment 106 may include fiber sensors, optical, acoustic, electrical or electromagnetic markers or sensors, etc. embedded therein with known geometry or with a geometry that is initialized before use. A shape interrogation console 122 measures the marker/sensor distribution over the surface of interest and supplies feedback about calibration/reference sections and measurement sections to the interpretation module 115. In one embodiment, the shape interrogation module 122 sends and receives light to/from optical fibers or provide electrical power or signals to sensors 104.
  • When the sensors 104 include optical shape sensing fibers, the optical fiber sensors 104 are woven or otherwise integrated into garment 106 in a pattern that allows for stretching of the underlying textile substrate while accounting for the fact that the overall fiber sensor length in the textile can change only minimally (e.g., a 2D spiral pattern or 2D sinusoidal pattern embedded within the flexible membrane). The fibers for sensors 104 are locally anchored at control points to provide a strain in the fiber during flexure of the subject 148. Several control points can constrain the fiber in all degrees of freedom relative to the mesh, e.g., at the fiber tip, whereas others can allow for a sliding degree of freedom so that the fiber can slide freely relative to the mesh pattern to accommodate any overall path length changes in the patterned structure as the mesh deforms.
  • The interpretation module 115 may include the capability of receiving multiple inputs from multiple devices or systems to interpret an event or dynamic occurrence during a medical procedure, diagnostic test, etc. A medical imaging device 110 and/or a tracking module 117 may also be included and may provide additional feedback to the interpretation module 115. The interpretation module 115 is configured to use the signal feedback (and any other feedback) to account for errors or aberrations related to dynamic changes of a patient's body.
  • In one embodiment, a subject 148 or a region of interest 140 on the subject 148 is covered or constrained by the flexible garment 106. The flexible garment 106 may include a fabric or netting configured to stretch corresponding with movement or flexure of the subject 148 or the region of interest 140. In addition to sensors 104, garment 106 includes a feedback modality or feedback mechanisms 108. The feedback mechanisms 108 are configurable to react to stimuli collected by sensors 104 (or to other external data). In one example, feedback mechanisms 108 include lights, such as light emitting diodes (LEDs) 109 integrated into the fabric of garment 106. The LEDs 109 are distributed within and throughout the garment 106 to provide feedback as to locations and/or events occurring relative to the patient. Some non-limiting examples of operation of the LEDs 109 (or other feedback mechanisms 108) will now be illustratively described.
  • In one application, an interventional procedure is performed. For example, a medical device 102 may include, e.g., a needle, a catheter, a guide wire, an endoscope, a probe, a robot, an electrode, a filter device, a balloon device or other medical component, etc. The device 102 is to be inserted in the patient. The device 102 has its coordinate system registered to pre-operative data 119 (e.g., image data). As the device 102 is deployed inside the patient, an LED 109 nearest to a tip of the device 102 (which can be tracked by a tracking device 107) is illuminated on the garment 106 to provide visual feedback to the physician and any other person present in the environment. One or more tracking devices or cameras 107 may be incorporated into the device 102, so tracking information can be provided. The tracking devices 107 may include electromagnetic (EM) trackers, fiber optic tracking, robotic positioning systems, cameras, etc.
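Selecting the LED 109 nearest the tracked device tip reduces, in the simplest case, to a nearest-neighbor search over the registered LED coordinates. A minimal sketch, assuming the tip and LED positions are already expressed in a common registered frame (names are illustrative):

```python
import math

def nearest_led(tip_position, led_positions):
    """Index of the LED closest to the tracked device tip; both the tip and
    the LED coordinates are assumed to be in the same registered frame."""
    return min(range(len(led_positions)),
               key=lambda i: math.dist(tip_position, led_positions[i]))
```

The controller would then illuminate the LED at the returned index to give the physician visual feedback of the tip location.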
  • In another application, breathing, heartbeat and swallowing data are collected by sensors 104 in the garment 106. The patient's breathing is visually indicated by a first LED (e.g., a white LED illuminated at each breathing cycle), the patient's heartbeat is indicated by a second LED (e.g., a red LED that is illuminated at each beat) and the patient's swallowing is indicated by a third LED (e.g., a blue LED that is illuminated at each swallow).
  • In yet another application, the garment 106 is spatially registered with pre-operative data 119, such as pre-operative images. In a procedure like a biopsy, image landmarks may be indicated using LEDs 109 to assist in locating a proper insertion point. In addition, a depth of the needle may be indicated on the garment 106 by the number or color of LEDs 109 that are lit.
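Encoding needle depth as a number of lit LEDs can be done with a simple linear bar-graph mapping. The sketch below assumes a maximum display depth and an LED count that are application choices, not values from the disclosure:

```python
def depth_to_led_count(depth_mm, max_depth_mm, num_leds):
    """Number of LEDs to light for a given needle depth, as a simple linear
    bar-graph encoding (the scale and units are assumed, not specified)."""
    if max_depth_mm <= 0:
        raise ValueError("max_depth_mm must be positive")
    fraction = min(max(depth_mm / max_depth_mm, 0.0), 1.0)
    return round(fraction * num_leds)
```

A color-based encoding would follow the same pattern, mapping the clamped depth fraction to a color ramp instead of a count.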
  • The system 100 can be configured to be sensitive to any event or movement of a patient. The sensors 104 are configured to relay positional, temperature, electric field information, shape information, etc. back to the interpretation module 115. For example, the garment 106 may be disposed over a mid-section of a patient (148) such that during a breathing cycle sensors 104 sense the dynamic shape changes of the abdomen or chest. This information may be interpreted using the interpretation module 115, which computes distances or changes in distances between nodes or positions in the garment 106. The mesh deflections may then be employed to account for breathing in images taken by an imaging device 110, assist in the timing of an action during a medical procedure (e.g., inserting a device on an exhale, etc.), or support any other event or action that needs compensation for dynamic changes.
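The distance-based interpretation of breathing described above can be sketched as a scalar expansion measure relative to a reference (e.g., end-exhale) configuration of the garment nodes. Node coordinates, units and the tolerance below are illustrative assumptions:

```python
import math

def chest_expansion(node_positions, baseline_positions):
    """Mean displacement of the garment's sensor nodes from a reference
    (e.g., end-exhale) configuration; a crude scalar respiratory proxy."""
    dists = [math.dist(p, b)
             for p, b in zip(node_positions, baseline_positions)]
    return sum(dists) / len(dists)

def is_exhaled(node_positions, baseline_positions, tol=1.0):
    """True when the surface is within `tol` (coordinate units) of the
    reference configuration, e.g., to time a needle insertion on exhale."""
    return chest_expansion(node_positions, baseline_positions) < tol
```

A real interpretation module would track this measure over time to segment the respiratory cycle rather than thresholding a single snapshot.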
  • It should be understood that the garment 106 may be applied to any portion of the subject's anatomy to collect dynamic data. For example, the garment 106 may be placed over the arms, legs, abdomen, chest, neck, head, or combinations thereof. In addition, the garment 106 may be made adjustable to be configured for different sized subjects, different sized appendages, etc.
  • The garment 106 may be employed during a medical procedure to assist a clinician in performing the procedure. Workstation 112 may include a display 118 for viewing internal images of the subject 148 using the imaging system 110. The imaging system 110 may include one or more imaging modalities, such as, e.g., ultrasound, photoacoustics, a magnetic resonance imaging (MRI) system, a fluoroscopy system, a computed tomography (CT) system, positron emission tomography (PET), single photon emission computed tomography (SPECT), or other system. Imaging system 110 may be provided to collect real-time intra-operative imaging data. The imaging data may be displayed on display 118. Display 118 may permit a user to interact with the workstation 112 and its components and functions. This is further facilitated by an interface 120 which may include a keyboard, mouse, a joystick or any other peripheral or control to permit user interaction with the workstation 112.
  • A controller module 126 or other device is provided to condition signals and to control feedback mechanisms 108. In one example, the controller module 126 may generate control signals to control various controllers, sensors, radiation sources/beams, etc. in accordance with programmed conditions for which feedback is desired. The manner of the feedback response can also be programmed. The controller 126 receives data from the interpretation module 115 and issues commands or signals to the feedback mechanisms 108. The interpretation module 115 dynamically provides information collected and interpreted from the sensors 104 in garment 106 to the controller 126 to render the feedback to the physician in real-time, which may then be employed in administering medication, making decisions, etc.
  • Referring to FIG. 2, in one embodiment, garment 106 includes a vest or manifold 202. The vest 202 is formed from a mesh or fabric 206. In this embodiment, the mesh 206 or vest 202 measures body surface deformation continuously in time and space with high spatial resolution (e.g., shape sensing vest). The vest 202 is preferably flexible with a snug fit over the subject 148. In one embodiment, sensing fibers 210 are integrated in the vest 202 and are employed to determine a shape of the chest of the subject 148. The sensing fibers 210 are also employed to determine dynamic geometry changes in the subject 148 and/or monitor a status of the subject 148. The sensing fiber(s) 210 may include a single optical fiber integrated into the vest 202 that spirals around the subject's body and hence delivers a sufficient picture of the geometry, or may include multiple fibers integrated into the vest 202.
  • The sensing fibers 210 may include one or more fiber optic Bragg gratings (FBGs). An FBG is a segment of an optical fiber that reflects particular wavelengths of light and transmits all others. This is achieved by adding a periodic variation of the refractive index in the fiber core, which generates a wavelength-specific dielectric mirror. An FBG can therefore be used as an inline optical filter to block certain wavelengths, or as a wavelength-specific reflector.
  • A fundamental principle behind the operation of an FBG is Fresnel reflection at each of the interfaces where the refractive index is changing. For some wavelengths, the reflected light of the various periods is in phase so that constructive interference exists for reflection and, consequently, destructive interference for transmission. The Bragg wavelength is sensitive to strain as well as to temperature. This means that FBGs can be used as sensing elements in fiber optic sensors. In an FBG sensor, an applied strain (ε) and a change in temperature (ΔT) cause a shift in the Bragg wavelength, ΔλB; the relative shift, ΔλB/λB, is approximately given by:
  • ΔλB/λB = Cs·ε + CT·ΔT
  • The coefficient Cs is called the coefficient of strain and its magnitude is usually around 0.8×10−6/με, or in absolute quantities about 1 pm/με. The coefficient CT describes the temperature sensitivity of the sensor and is made up of the thermal expansion coefficient and the thermo-optic effect. Its value is around 7×10−6/K (or as an absolute quantity 13 pm/K). One of the main advantages of the technique is that various sensor elements can be distributed over the length of a fiber. Incorporating three or more cores with various sensors (gauges) along the length of a fiber that is embedded in a structure allows for evaluation of the curvature of the structure as a function of longitudinal position and hence for the three dimensional form of such a structure to be precisely determined.
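Using the relation and the nominal coefficients quoted above, the expected Bragg wavelength shift can be computed directly. The sketch below treats the coefficients as order-of-magnitude defaults rather than calibrated values:

```python
def bragg_shift_pm(lambda_b_nm, strain_microstrain, delta_t_kelvin,
                   c_s=0.8e-6, c_t=7e-6):
    """Bragg wavelength shift in picometres from the relation
    dLambda_B / Lambda_B = Cs * strain + CT * dT,
    using the nominal coefficients quoted in the text as defaults."""
    relative_shift = (c_s * strain_microstrain) + (c_t * delta_t_kelvin)
    return lambda_b_nm * 1e3 * relative_shift   # nm -> pm
```

For λB = 1550 nm, a strain of 100 με with no temperature change gives a shift of about 124 pm, consistent with the roughly 1 pm/με figure quoted above.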
  • As an alternative to FBGs, the inherent backscatter in conventional optical fibers can be exploited. One such approach is to use Rayleigh scatter in standard single-mode communications fiber. Rayleigh scatter occurs as a result of random fluctuations of the index of refraction in the fiber core. These random fluctuations can be modeled as a Bragg grating with a random variation of amplitude and phase along the grating length. By using this effect in 3 or more cores running within a single length of multicore fiber, the 3D shape and dynamics of the surface of interest would be trackable.
  • In one embodiment, the vest 202 may include an integrated lighting arrangement 220 (LEDs) and/or a visual display screen 222 within the vest 202. The lighting arrangement 220 may include a grid or grids of LEDs 224 disposed on a surface or integrated within the vest 202. The grid of LEDs 224 may include or be associated with tracking devices 226 or with the sensing fibers 210 such that each LED 224 can be located relative to the subject 148. The grid pattern shown in FIG. 2 is illustrative and non-limiting as other configurations are also contemplated.
  • An initialization process may be employed once the vest 202 is securely positioned on the subject 148 to register the location of the vest 202 and/or the LEDs 224 (or display screen 222) with positions or references on the subject 148. If augmented reality is being employed, a registration scheme is needed between pre-operative data (e.g., images) and intra-operative patient positions. Such registration may employ known technologies and methods. For example, fiducial markers on the patient and/or in the vest 202 may be registered with corresponding points in the pre-operative images. The signals from the sensors in the vest 202 are interpreted by the shape interpretation module (module 115), which generates signals that are addressed to one or more locations in the grid of LEDs 224 by the controller 126 (FIG. 1). The LEDs 224 that satisfy the programmed conditions are illuminated by the controller 126 and provide feedback to the physician or others.
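The fiducial-based registration mentioned above can be sketched as a least-squares rigid fit between paired marker coordinates. The 2D version below (a Procrustes-style fit; the 3D case is analogous) assumes corresponding fiducials have already been identified in the pre-operative images and on the vest; the function name and interface are hypothetical.

```python
import math

def register_fiducials_2d(src, dst):
    """Least-squares rigid (rotation + translation) fit in the plane.

    src, dst : lists of paired (x, y) fiducials, e.g. marker positions
    in a pre-operative image and the same markers located on the vest.
    Returns (theta, tx, ty) such that rotating src by theta and then
    translating by (tx, ty) best matches dst.
    """
    n = len(src)
    msx = sum(p[0] for p in src) / n
    msy = sum(p[1] for p in src) / n
    mdx = sum(p[0] for p in dst) / n
    mdy = sum(p[1] for p in dst) / n
    # Accumulate the centered cross-covariance terms.
    sxx = sum((s[0] - msx) * (d[0] - mdx) + (s[1] - msy) * (d[1] - mdy)
              for s, d in zip(src, dst))
    sxy = sum((s[0] - msx) * (d[1] - mdy) - (s[1] - msy) * (d[0] - mdx)
              for s, d in zip(src, dst))
    theta = math.atan2(sxy, sxx)  # optimal rotation angle
    # Translation aligns the rotated source centroid with the target.
    tx = mdx - (msx * math.cos(theta) - msy * math.sin(theta))
    ty = mdy - (msx * math.sin(theta) + msy * math.cos(theta))
    return theta, tx, ty
```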
  • In order to bridge the gap between the pre-operative information (in particular, the anatomical information provided by a computed tomography (CT) or a magnetic resonance (MR) scan together with associated plans, e.g., needle insertion points on the body surface) and the live view of the physician in the operating room, augmented reality is employed. This may entail a camera system mounted on the physician's head together with a special set of glasses designed so that, while looking through the glasses, the physician can virtually overlay preoperative information over a patient or region of interest. This feature can be switched on or off. However, the drawback is that the additional camera system and glasses have not been embraced by physicians. In accordance with one embodiment, relevant pre-operative information (images) may be displayed directly on the body surface by making use of vest 202 with integrated lighting or display screen 222, which is worn by the patient throughout the procedure.
  • This display screen 222 may include a flexible display integrated with the fabric of vest 202. In this embodiment, preoperative images may be registered to the patient and displayed on the screen 222. Representations of instruments or overlays may be generated on the screen in accordance with information received from the sensors to integrate visual feedback via integrated display/lighting components. Depending on the application, the visual feedback feature(s) of the vest 202 may be used independently of the shape sensing.
  • The LEDs 224 provide visual feedback in the form of information encoded spatially and temporally via color, intensity, phase/timing, direction, etc. Depending on the application these LEDs 224 can be spread sparsely or densely. Information can be presented to reflect guidance information for navigation of an instrument through an access port based on real time measurements of surface anatomy deformation (that in turn reflects organ and target motion relative to an original planned path). Other sensory information may be provided to the physician and can take the form of acoustic or haptic/tactile feedback that is spatially modulated over the surface of the vest/manifold 202.
  • In one embodiment, knowing the deformation of the outer body surface is of wide interest for many applications. One application includes respiratory motion compensation which is a problem for many image guided interventions. Another application is the deformation of organs and tissue due to applied forces during the interventions (e.g., needle insertion). The vest provides effective feedback, e.g., through visualization. The visual feedback of deformation measurements makes use of lighting from LEDs 224 or from the display screen 222 included in the clothing. The sensor/feedback enabled clothing itself may take the form of a surgical drape or other sterile manifold disposed over the patient. This component may be reusable or disposable in nature, depending on the application and costs involved.
  • The use for this application would be to convert a measurement from shape sensing fibers into an LED signal. By doing so, the physician would have direct visual feedback, e.g., how much deformation he caused when he inserted the needle or where high areas of respiratory motion exist.
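One minimal way to convert a shape-sensing measurement into an LED signal, as described above, is a color ramp from green (little deformation) to red (large deformation). The mapping below is purely illustrative; the 10 mm full-scale value is an assumption, not from the text.

```python
def deformation_to_led(deformation_mm, max_mm=10.0):
    """Map a local deformation measurement to an (R, G, B) LED value.

    Small deformation lights the LED green, large deformation red,
    with a linear blend in between; channels are 0-255 levels.
    max_mm is the (assumed) deformation treated as full scale.
    """
    frac = max(0.0, min(1.0, deformation_mm / max_mm))  # clamp to [0, 1]
    return (int(round(255 * frac)), int(round(255 * (1.0 - frac))), 0)
```

Each value from the shape interpretation module would be mapped this way and addressed to the corresponding LED 224 in the grid by the controller 126.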
  • For an augmented reality application, a pre-procedural scan of the patient is acquired. During the intervention, the patient wears the shape sensing vest 202 with the lighting system included. Based on the measurement from the shape sensing, a continuous mapping of the outer surface can be calculated between the pre-operative data and the current patient deformation. This mapping also gives an estimate of the internal organ position (plus locations of derived information, e.g., during planning as a region of interest or insertion point). The lighting system LEDs 224 and/or display screen 222 are employed to display useful information during the intervention or other procedure, e.g., to activate certain LEDs 224 to display the insertion points or even to use the LEDs 224 to display the organ position mapped on a surface of vest 202. Different colors of the LEDs 224 could also improve the visual feedback. The shape sensing vest 202 permits tracking of patient/organ deformation in real-time. The LED feedback is able to account for, e.g., respiratory motion, reactionary forces, etc. Display screen 222 may include a higher-resolution flexible display integrated in the vest 202 for displaying medical (imaging) data. The medical data can be filtered or transformed depending on input of the deformation sensors 210 in the vest 202 or other data acquired by other sensors, e.g., location of a needle tip. The display screen 222 and/or LEDs 224 may indicate medical alarms and/or sites where complications may have occurred. The vest 202 may be employed to display medical condition diagnoses depending on other sensors internal or external to the vest 202.
  • Referring to FIG. 3, a garment 302 can have sterility maintaining access ports 310 to permit perforations to be made through the sensing manifold by percutaneous interventional devices. For procedures requiring maintenance of sterile fields, the garment 302 can have a modular design which permits a re-usable and sterilizable sensing matrix 312 to be embedded in or mated with a sterile, disposable portion 315 that makes direct contact with the patient 148. Portion 315 includes a surface or interface 314 which couples or connects with matrix 312.
  • Instead of or in addition to a vest, another form of clothing may be employed for garment 302, e.g., a pair of trunks, a bra, a skull cap, a sock, a glove, etc. The clothing may include elastic, Spandex™ or another form of elastic clothing. The garment 302 (or any other garment in accordance with the present principles) may include adjustment mechanisms 304 to be adaptively sized to snugly fit a patient. The adjustment mechanisms 304 may include hook and loop connectors, buckles, elastic bands, zippers, snaps, adhesive strips, inflatable cuffs, suction devices or any other devices for connecting to a body surface and/or adjusting to a size.
  • Referring to FIG. 4, a method for providing sensory feedback is illustratively shown in accordance with illustrative embodiments. In block 402, a garment is provided, which is configured to fit over at least a portion of a subject, e.g., an arm, leg, chest, etc. One or more sensors are incorporated in the garment to perform measurements of the subject. A feedback modality is incorporated into the garment and is responsive to one or more feedback signals derived from measurements of the one or more sensors or of other devices, such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure. Sensory information may include light, sound, tactile information, etc. Embodiments may include one or more types of sensory information employed together. The garment may include one or more shape sensing optical fibers disposed therein. The optical fibers may be woven into the fabric of the garment and are preferably attached to the fabric such that flexing the fabric imparts a strain in the fiber. The optical fibers may be tied, glued or otherwise coupled to the fabric to permit flexure but also to constrain the motion of the portion of the subject.
  • In block 404, one or more feedback signals are generated. The feedback signals are preferably derived from measurements of the one or more sensors. Other sensors or devices may also be employed to trigger the generation of feedback signals. These sensors or devices may include monitoring devices or stored data not necessarily located on the garment. In block 406, the measurements of the one or more sensors are interpreted to determine one or more of anatomical movement, anatomical function, a position of an interventional device, etc.
  • In block 408, the feedback modality is activated to emit energy from the garment to provide sensory information to assist a physician. In block 410, lights or other devices may be selectively illuminated in an array to indicate, e.g., a location of an anatomical feature, anatomical movement, pre-operative data, a position of an interventional device, etc. In block 412, a display screen may be illuminated to render images to indicate a location of an anatomical feature, anatomical movement, pre-operative data, a position of an interventional device, etc. The images may include pre-operative images.
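The sense-interpret-activate cycle of blocks 402-412 can be sketched as a single pass over a toy grid. This is a hypothetical reduction of the method, not the claimed implementation: readings arrive per grid position, a simple threshold stands in for the interpretation module's condition check, and "activation" is a boolean LED state.

```python
def run_feedback_cycle(sensor_readings, threshold, grid):
    """One pass of the sense -> interpret -> activate cycle.

    sensor_readings : dict mapping grid position -> measured value
                      (e.g., local deformation)
    threshold       : value above which an LED is switched on
    grid            : dict mapping grid position -> LED state (mutated)
    Returns the list of positions whose LEDs were activated.
    """
    activated = []
    for pos, value in sensor_readings.items():
        on = value >= threshold  # interpret: does the condition hold?
        grid[pos] = on           # activate: drive the feedback modality
        if on:
            activated.append(pos)
    return activated
```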
  • In block 414, decisions are made, procedures conducted, etc. using the sensory feedback generated by the garment. The sensory feedback may be employed by physicians to determine a patient status, determine triggering events, understand the boundaries of an organ, obtain anatomical responses, etc.
  • In interpreting the appended claims, it should be understood that:
      • a) the word “comprising” does not exclude the presence of other elements or acts than those listed in a given claim;
      • b) the word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements;
      • c) any reference signs in the claims do not limit their scope;
      • d) several “means” may be represented by the same item or hardware or software implemented structure or function; and
      • e) no specific sequence of acts is intended to be required unless specifically indicated.
  • Having described preferred embodiments for systems and methods for body surface feedback for medical interventions (which are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings. It is therefore to be understood that changes may be made in the particular embodiments of the disclosure disclosed which are within the scope of the embodiments disclosed herein as outlined by the appended claims. Having thus described the details and particularity required by the patent laws, what is claimed and desired protected by Letters Patent is set forth in the appended claims.

Claims (22)

1. A system, comprising:
a garment (106) configured to flexibly and snugly fit over at least a portion of a subject, the garment including one or more sensors (104) disposed therein to monitor activity of the subject and/or monitor points of interest of the subject;
an interpretation module (115) coupled with the sensors to receive sensor signals and interpret the sensor signals to determine if conditions are met to provide one or more feedback signals to the garment; and
a feedback modality (108) included in a positional grid incorporated into the garment and being responsive to the one or more feedback signals such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure.
2. The system as recited in claim 1, wherein the sensors (104) include shape sensing optical fibers and the interpretation module (115) provides the one or more feedback signals based on movement of the subject.
3. The system as recited in claim 1, further comprising preoperative internal images (119) of the subject registered with the garment, the interpretation module (115) providing the one or more feedback signals to indicate one or more points of interest on the garment based on the preoperative images.
4. The system as recited in claim 3, wherein the feedback modality (108) includes a display screen (222) configured to display the preoperative image registered with the garment.
5. The system as recited in claim 1, wherein the feedback modality (108) includes lights (224) configured to illuminate in accordance with the one or more feedback signals.
6. The system as recited in claim 5, wherein the lights (224) are disposed in an array and are selectively illuminated to indicate progression or position of an interventional device.
7. The system as recited in claim 5, wherein the lights (224) are configured to illuminate in accordance with at least one of: movement of a body surface of the subject or in accordance with a body function.
8. A garment providing sensory feedback, comprising:
a fabric (206) configured to fit over at least a portion of a subject;
one or more sensors (104) incorporated in the fabric to perform measurements from the subject;
a feedback modality (108) included in a positional grid incorporated into the fabric and being responsive to one or more feedback signals derived from measurements such that the feedback modality emits energy from the fabric to provide sensory information to assist a physician during a procedure.
9. (canceled)
10. (canceled)
11. (canceled)
12. (canceled)
13. (canceled)
14. (canceled)
15. The garment as recited in claim 8, further comprising an adjustment mechanism (304) configured to permit adjustment of the garment to permit flexible and snug attachment of the fabric to the subject.
16. The garment as recited in claim 8, wherein the fabric includes a first portion (312), which is sterilizable and reusable and a second portion (315) which is disposable.
17. The garment as recited in claim 16, wherein the first portion (312) includes the one or more sensors and the feedback modality.
18. A method for providing sensory feedback, comprising:
providing (402) a garment configured to fit over at least a portion of a subject, one or more sensors incorporated in the garment to perform measurements from the subject, and a feedback modality incorporated into the garment and being responsive to the one or more feedback signals derived from measurements such that the feedback modality emits energy from the garment to provide sensory information to assist a physician during a procedure;
generating (404) the one or more feedback signals derived from the measurements; and
activating (408) the feedback modality in accordance with the feedback signals to emit energy from the garment to provide sensory information to assist a physician.
19. The method as recited in claim 18, wherein activating the feedback modality to emit energy includes selectively illuminating (410) lights in an array to indicate one or more of a location of an anatomical feature, anatomical movement, pre-operative data or a position of an interventional device.
20. The method as recited in claim 18, wherein activating the feedback modality to emit energy includes illuminating (412) a display screen to render images to indicate one or more of a location of an anatomical feature, anatomical movement, pre-operative data or a position of an interventional device.
21. (canceled)
22. The method as recited in claim 18, wherein generating the one or more feedback signals includes interpreting (406) measurements of the one or more sensors to determine one or more of anatomical movement, anatomical function or a position of an interventional device.
US14/352,693 2011-10-20 2012-10-19 Magnetic particle detection with incubation period Abandoned US20140323856A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/352,693 US20140323856A1 (en) 2011-10-20 2012-10-19 Magnetic particle detection with incubation period

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161549428P 2011-10-20 2011-10-20
US14/352,693 US20140323856A1 (en) 2011-10-20 2012-10-19 Magnetic particle detection with incubation period
PCT/IB2012/055729 WO2013057703A1 (en) 2011-10-21 2012-10-19 Body surface feedback for medical interventions

Publications (1)

Publication Number Publication Date
US20140323856A1 true US20140323856A1 (en) 2014-10-30

Family

ID=47227982

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/352,448 Active 2035-04-25 US10031132B2 (en) 2011-10-20 2012-10-05 Magnetic particle detection with incubation period
US14/352,693 Abandoned US20140323856A1 (en) 2011-10-20 2012-10-19 Magnetic particle detection with incubation period

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/352,448 Active 2035-04-25 US10031132B2 (en) 2011-10-20 2012-10-05 Magnetic particle detection with incubation period

Country Status (6)

Country Link
US (2) US10031132B2 (en)
EP (1) EP2745118B1 (en)
JP (1) JP6054405B2 (en)
CN (1) CN103890588B (en)
IN (1) IN2014CN02659A (en)
WO (1) WO2013057616A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105266765A (en) * 2014-11-10 2016-01-27 北京至感传感器技术研究院有限公司 Device for detecting physiological changes of breasts
CN105929149B (en) * 2016-04-26 2018-09-11 中国科学院电子学研究所 A kind of optical detector based on magnetic enrichment and total internal reflection
JP7455529B2 (en) * 2019-08-01 2024-03-26 キヤノンメディカルシステムズ株式会社 Sample measuring device and method for controlling the sample measuring device
US11136543B1 (en) 2020-02-11 2021-10-05 Edward R. Flynn Magnetic cell incubation device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030095263A1 (en) * 2000-02-08 2003-05-22 Deepak Varshneya Fiber optic interferometric vital sign monitor for use in magnetic resonance imaging, confined care facilities and in-hospital
US20050034485A1 (en) * 2003-08-14 2005-02-17 Tam-Telesante Garment for the medical monitoring of a patient

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6319670B1 (en) * 1995-05-09 2001-11-20 Meso Scale Technology Llp Methods and apparatus for improved luminescence assays using microparticles
JP3647587B2 (en) * 1997-01-20 2005-05-11 シスメックス株式会社 Immunoassay
US6180418B1 (en) * 1998-01-20 2001-01-30 The United States Of America As Represented By The Secretary Of The Navy Force discrimination assay
US7425455B2 (en) 2002-01-29 2008-09-16 Asahi Kasei Kabushiki Kaisha Biosensor, magnetic molecule measurement device
US7736889B2 (en) * 2003-06-10 2010-06-15 The United States Of America As Represented By The Secretary Of The Navy Fluidic force discrimination
EP1926994A1 (en) * 2005-09-08 2008-06-04 Koninklijke Philips Electronics N.V. Microsensor device
US8637317B2 (en) 2006-04-18 2014-01-28 Advanced Liquid Logic, Inc. Method of washing beads
US20090251136A1 (en) 2006-07-17 2009-10-08 Koninklijke Philips Electronics N.V. Attraction and repulsion of magnetic of magnetizable objects to and from a sensor surface
WO2008017972A2 (en) 2006-08-09 2008-02-14 Koninklijke Philips Electronics N. V. A magnet system for biosensors
WO2008114025A1 (en) * 2007-03-21 2008-09-25 University Of The West Of England, Bristol Particle facilitated testing
WO2008142492A1 (en) 2007-05-22 2008-11-27 Koninklijke Philips Electronics N.V. Method for detecting label particles
US20100187450A1 (en) 2007-06-21 2010-07-29 Koninklijke Philips Electronics N.V. Microelectronic sensor device with light source and light detector
EP2017618A1 (en) * 2007-07-20 2009-01-21 Koninklijke Philips Electronics N.V. Methods and systems for detecting
RU2489704C2 (en) * 2007-12-20 2013-08-10 Конинклейке Филипс Электроникс Н.В. Microelectronic sensory unit of sensor for detecting target particles
WO2010042242A1 (en) 2008-10-10 2010-04-15 Rutgers, The State University Of New Jersey Methods and related devices for continuous sensing utilizing magnetic beads
JP5996868B2 (en) * 2008-10-17 2016-09-21 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Pulsed magnetic actuation for sensitive assays
CA2772020A1 (en) * 2009-08-31 2011-03-03 Mbio Diagnostics, Inc. Integrated sample preparation and analyte detection
CN102549445B (en) 2009-09-28 2015-04-08 皇家飞利浦电子股份有限公司 A biosensor system for single particle detection
AU2011257260A1 (en) * 2010-05-27 2013-01-10 Episentec Ab Improved method of sensor measurement
US20130088221A1 (en) 2010-06-22 2013-04-11 Koninklijke Philips Electronics N.V. Detection of magnetic particles and their clustering
US20120077184A1 (en) * 2010-09-28 2012-03-29 Starkdx Incorporated Electromagnetic multiplex assay biosensor


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160331460A1 (en) * 2015-05-11 2016-11-17 Elwha Llc Interactive surgical drape, system, and related methods
US20160331461A1 (en) * 2015-05-11 2016-11-17 Elwha LLC, a limited liability company of the State of Delaware Interactive surgical drape, system, and related methods
US10226219B2 (en) * 2015-05-11 2019-03-12 Elwha Llc Interactive surgical drape, system, and related methods
US10235737B2 (en) * 2015-05-11 2019-03-19 Elwha Llc Interactive surgical drape, system, and related methods
WO2024057310A1 (en) * 2022-09-13 2024-03-21 Marrow Wiz Ltd. Entry point identification system

Also Published As

Publication number Publication date
US10031132B2 (en) 2018-07-24
IN2014CN02659A (en) 2015-06-26
CN103890588A (en) 2014-06-25
CN103890588B (en) 2017-02-15
US20140329335A1 (en) 2014-11-06
WO2013057616A1 (en) 2013-04-25
JP6054405B2 (en) 2016-12-27
EP2745118B1 (en) 2020-12-09
JP2014531029A (en) 2014-11-20
EP2745118A1 (en) 2014-06-25

Similar Documents

Publication Publication Date Title
EP2717774B1 (en) Dynamic constraining with optical shape sensing
US20140323856A1 (en) Magnetic particle detection with incubation period
EP2747590A1 (en) Body surface feedback for medical interventions
US11219487B2 (en) Shape sensing for orthopedic navigation
US10610085B2 (en) Optical sensing-enabled interventional instruments for rapid distributed measurements of biophysical parameters
EP2677937B1 (en) Non-rigid-body morphing of vessel image using intravascular device shape
EP2866642B1 (en) Fiber optic sensor guided navigation for vascular visualization and monitoring
EP2830502B1 (en) Artifact removal using shape sensing
US20180228553A1 (en) Sensored surgical tool and surgical intraoperative tracking and imaging system incorporating same
US20190117317A1 (en) Organ motion compensation
EP2632384A1 (en) Adaptive imaging and frame rate optimizing based on real-time shape sensing of medical instruments
JP6706576B2 (en) Shape-Sensitive Robotic Ultrasound for Minimally Invasive Interventions
KR20160069180A (en) CT-Robot Registration System for Interventional Robot
JP2017500935A5 (en)
CN109715054A (en) The visualization of image object relevant to the instrument in external image
US11406278B2 (en) Non-rigid-body morphing of vessel image using intravascular device shape
WO2018160955A1 (en) Systems and methods for surgical tracking and visualization of hidden anatomical features
EP3944254A1 (en) System for displaying an augmented reality and method for generating an augmented reality

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONINKLIJKE PHILIPS N.V., NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KLINDER, TOBIAS;MANZKE, ROBERT;CHAN, RAYMOND;SIGNING DATES FROM 20130128 TO 20130201;REEL/FRAME:032705/0796

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE