
NL2019891B1 - Label-free microscopy

Label-free microscopy

Info

Publication number
NL2019891B1
Authority
NL
Netherlands
Prior art keywords
sample
light
light beam
image data
scattered
Prior art date
Application number
NL2019891A
Other languages
Dutch (nl)
Inventor
Candelli Andrea
Original Assignee
Lumicks Tech B V
Priority date
Filing date
Publication date
Application filed by Lumicks Tech B V filed Critical Lumicks Tech B V
Priority to NL2019891A priority Critical patent/NL2019891B1/en
Priority to PCT/NL2018/050753 priority patent/WO2019093895A1/en
Priority to DE112018005412.8T priority patent/DE112018005412T5/en
Application granted granted Critical
Publication of NL2019891B1 publication Critical patent/NL2019891B1/en

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 - Microscopes
    • G02B21/0004 - Microscopes specially adapted for specific applications
    • G02B21/002 - Scanning microscopes
    • G02B21/0024 - Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/0052 - Optical details of the image generation
    • G02B21/0056 - Optical details of the image generation based on optical coherence, e.g. phase-contrast arrangements, interference arrangements
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 - Microscopes
    • G02B21/36 - Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/361 - Optical details, e.g. image relay to the camera or image sensor

Landscapes

  • Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)

Abstract

A method of imaging at least part of a sample is provided which comprises: focusing at least part of a light beam in a sample plane in the sample, and in particular focusing the part of the light beam at or near the scatterer therein, thus providing unscattered light and scattered light; causing a displacement of at least part of the sample and at least part of the focus with respect to each other; and for plural relative positions of the sample and the focus: collecting the unscattered light and the scattered light with a detection system focused in at least part of the sample, and comprising a position dependent sensor; and controlling the detection system to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system. This step may also be described as: detecting with a position dependent sensor at least part of the intensity pattern in the back focal plane and/or in an optical conjugate plane of the back focal plane of the detection system. A corresponding system is also provided.

Description

Label-free microscopy
TECHNICAL FIELD
The present disclosure relates to microscopy, in particular to microscopy of biological specimens.
BACKGROUND
In microscopy there is a continuous drive towards imaging ever smaller details, ever more clearly and faster.
Current microscopy methods, in particular microscopy of biological and/or molecular interactions, rely heavily on (fluorescent) labelling strategies, which can be challenging and which may even interfere with the molecular interactions under scrutiny. Although (single-molecule) fluorescence microscopy can be very powerful due to the high selectivity that can be achieved with labelling techniques, there are a number of intrinsic drawbacks associated with it, such as the need for labelling steps, limited signal due to photobleaching, and a limited photon flux which limits imaging speed. It is noted that several label-free microscopy techniques have been developed previously. Standard bright field microscopy is based on differences in the transmitted light intensity induced by a sample; the contrast is mostly based on absorption and/or scattering of light induced by the sample. However, many specimens, especially biological specimens, do not absorb or scatter much light, leading to poor contrast in many cases. Dark field microscopy uses a special illumination system which illuminates a sample with a hollow cone of light in such a way that the unscattered light cannot reach the detector. This leads to contrast based on the scattering properties of the sample. A different technique is interferometric scattering microscopy "iSCAT" [Ortega-Arroyo et al., Phys. Chem. Chem. Phys., 14, 15625 (2012)]. This technique uses light reflected from a microscope coverglass as a reference field in order to detect scattered sample light interferometrically with great sensitivity, and it has been shown to have single-molecule sensitivity. This method, however, suffers from irregularities in the coverglass which cause large background signals. Because of the scattering produced by the glass irregularities, iSCAT imaging requires that a "background" image of the sample surface is first obtained, prior to adding the biological specimen. In addition, this approach is limited to biological samples in close proximity to the glass surface, severely limiting its application to three-dimensional structures or suspended samples.
Hence, further improvements are desired.
It is noted that for characterisation of optical traps and detection of forces on trapped objects, back-focal-plane interferometry-based techniques have been developed: Gittes and Schmidt [Gittes, F. and Schmidt, C. F., Optics Letters, 23, 7 (1998)] developed back-focal-plane interferometry to sensitively detect the displacement of microscopic beads trapped in optical tweezer systems in a non-imaging manner. Jahnel et al. [Jahnel, M., et al., Optics Letters, 36, 1260-1262 (2011)] used back-focal-plane interferometry to map the force field of an optical trap. DeSantis and Cheng [DeSantis, M. C., & Cheng, W., WIREs Nanomed Nanobiotechnol, 8, 717-729 (2016)] describe how optical trapping combined with back-focal-plane interferometry can be used to detect individual virus particles and differentiate them from aggregates.
SUMMARY
Herewith, a method and a system in accordance with the appended claims are provided.
An aspect comprises a method of imaging at least part of a sample, in particular a biological sample comprising a scatterer, e.g. a biological object. The method comprises: focusing at least part of a light beam in a sample plane in the sample, and in particular focusing the part of the light beam at or near the scatterer therein, thus providing unscattered light and scattered light; causing a displacement of at least part of the sample and at least part of the focus with respect to each other; and, for plural relative positions of the sample and the focus: collecting the unscattered light and the scattered light with a detection system focused in at least part of the sample and comprising a position dependent sensor; and controlling the detection system to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system. This step may also be described as: detecting with a position dependent sensor at least part of the intensity pattern in the back focal plane and/or in an optical conjugate plane of the back focal plane of the detection system.
The method further comprises: constructing an image of at least part of the sample, in particular at least part of the scatterer, on the basis of the image data associated with the plural relative positions, as a function of the relative positions of the sample and the focus.
Thus, imaging and/or detection of a scatterer is provided based on back-focal-plane interferometry. The imaging is based on capturing image data corresponding to detecting the scattered (and unscattered) light in the far field. The imaging may not affect the scatterer's position and/or other properties. Each of the plural relative positions of the sample and the focus produces its own intensity pattern, resulting from interference between the non-scattered and the scattered light, in the back focal plane and/or in an optical conjugate plane of the back focal plane, representative of the interaction of the light beam focus with the portion of the sample in which the light beam is focused.
In particular, the focus of the light beam and the focus of the collecting optical system may (be caused to) coincide, providing a confocal arrangement.
The light source may comprise a laser and the light beam may be a laser beam. The light beam may have any suitable beam shape but preferably is a Gaussian beam, which enables a particularly well-defined focus. Displacing the sample (part) and the focus (part) relative to each other in a controlled manner is also referred to as "scanning". The scanning may be 1-, 2- and/or 3-dimensional; scanning directions (i.e. directions of relative displacement) may be selected as desired and preferably are orthogonal to each other, and preferably at least one of the scanning directions is perpendicular to the direction of propagation of the light beam. Imaging may be based on 1-, 2- or 3-dimensional scanning, such as by scanning along a linear path (e.g. along a strand), scanning a plane (which may have any suitable shape and/or orientation in the sample) or scanning a volume, wherein the construction of the image and its possible interpretation may depend on properties of the detector, e.g. enabling position dependent detection in one or two spatial directions (see below). Information on dynamics of the sample may be derived from consecutive scans of one or more sample portions, which may be depicted as a series of images, an image stack, a kymograph, a graph indicating the time-dependent changes and/or as a time-varying image (a movie). In another option, displacement of at least part of the sample with respect to the sample holder and to the focus may be caused by diffusion of at least part of the sample, which may have natural causes allowed over time, and/or which may be caused by causing flow of a sample fluid within the sample holder. A scatterer, e.g. a scattering particle located in a focal plane of a light beam, in particular a Gaussian beam, of which the incoming field is defined as Ei, will produce a scattered field Es. It is noted that direct detection of the scattered field Es is possible (e.g. in dark field microscopy), but this is limited to relatively large structures because the scattering cross-section of a scatterer of size r scales with r^6. However, the present method relies on the notion that when the scattered field Es is allowed to interfere with the unscattered field Eu, the interference term scales with r^3 instead. This facilitates detection of smaller details.
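To make the scaling argument explicit, the detected intensity can be written as a worked equation. The following is a minimal sketch assuming a small (Rayleigh-regime) spherical scatterer of radius r with polarizability α; this assumption is standard for such scaling laws but is not spelled out above:

\[ I_{\mathrm{det}} = |E_u + E_s|^2 = |E_u|^2 + 2\,\operatorname{Re}(E_u^{*} E_s) + |E_s|^2, \qquad E_s \propto \alpha \propto r^3 . \]

The pure scattering term |E_s|^2 therefore scales as r^6 (the dark field case), whereas the interference (cross) term 2 Re(E_u* E_s) scales as r^3, which is why allowing the scattered field to interfere with the unscattered field facilitates detection of smaller details.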
The scatterer should have an index of refraction that differs from the surrounding medium in the sample for at least a portion of the wavelengths of the light of the light beam. The image in particular shows one or more of position, shape and size of the scatterer in the sample. The sample may comprise plural scatterers and/or scatterers comprising different structures and/or differently scattering structures.
Suitable biological objects for acting as a scatterer in a biological sample as discussed herein may comprise or be a cellular body or a substructure thereof, such as any one of cells, proteins, small molecules interacting with proteins, viruses, DNA and RNA molecules, chromosomes, organelles, filaments, sub-cellular structures, but also tissues, antibody-stained tissues, protein-small molecule complexes etc.
The detection system, e.g. based on a condenser (or an objective), is positioned after the sample as seen in the direction of propagation of the light beam, to collect at least a portion of the scattered light. The intensity pattern is created by interference in the back focal plane of the optical system, the pattern depending on the relative position between the scatterer, e.g. a refractive portion of a particle, and the focus. Capturing image data, e.g. by detection of this intensity pattern by the position sensitive detector in the back focal plane, or an optical conjugate plane thereof, allows sensitive detection of the scatterer. The image may be constructed by converting the image data associated with each relative position of the sample and the focus into one or more pixel values of the constructed image. By varying the relative positions of at least part of the sample and the focus, e.g. scanning the positions of the sample and the focus relative to each other, an image can be obtained that is dependent on the scattering properties of the scatterer, in particular on diffraction and refraction (but also absorption) properties of the scatterer and/or any structures thereof and/or therein. As a consequence, the scatterer need not be labelled or otherwise affected.
The light intensity pattern in the back focal plane or the optical conjugate plane thereof, and consequently the intensity distribution on the position sensitive detector, is proportional to the interaction of the light and the scatterer, in particular proportional to the amount of deflection of the light beam induced by the sample. Note that, unless specified otherwise, all references to directions and/or relative positions like "lateral" are to be understood as referring to the direction of propagation of the light beam, generally taken as the Z-direction.
The image data may comprise one or more of the intensity distribution on the position sensitive detector or a fraction thereof, a total intensity on the position sensitive detector, colour information, variation data, time stamps, etc. The image data used for construction of the image may also comprise spatial and/or time averages of detection signals and/or statistical information regarding detection signals, e.g. root-mean-square deflection noise (RMS deflection noise). The position sensitive detector may comprise e.g. a camera, a diode array, a quadrant photodiode (QPD), a position sensitive diode (PSD), or any combination thereof. A detector that is position sensitive in two dimensions, e.g. a camera, a PSD or a QPD, enables independent and/or simultaneous detection of the interaction of the scatterer and the light beam projected onto these two dimensions, i.e. it enables quantification of the interaction in the X and Y directions independently and allows the total, or absolute, signal to be calculated as S = sqrt(Sx^2 + Sy^2). Quantification of the signal might be used to obtain certain properties of the sample, e.g. the scatterer's size, the permittivity of the solvent medium, the relative refractive index of the scatterer compared to the medium and the incident wavelength of the light. A relatively simple detector such as a QPD or PSD may be faster than a camera, enabling detection of rapid changes in the sample. Such a detector may also have a larger dynamic range, enabling measurement of small signals on a large background. A scatterer in the sample plane, particularly a scatterer slightly above or below the focal plane of the light beam, might cause a symmetric change to the interference pattern in the back focal plane (i.e. the chief ray of the scattered beam might not be deflected in the lateral direction but the marginal rays of the light beam might be deflected, causing a divergence or convergence of the scattered light beam). Detection of this deflection may also be implemented for providing information for the imaging step. As an example, the total intensity of the transmitted light within a certain region in the back focal plane of the optical system may be monitored, e.g. using an aperture to restrict the range of acceptance angles of the detection system.
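As an illustrative, non-limiting sketch (not part of the original disclosure) of how the deflection and total-intensity signals mentioned above are typically derived from the four quadrant intensities of a QPD, assuming quadrants a and b on top, c and d at the bottom, with a and c on the left:

import numpy as np

def qpd_signals(a, b, c, d):
    """Derive deflection and intensity signals from four QPD quadrant intensities.

    Quadrant labelling (a: top-left, b: top-right, c: bottom-left, d: bottom-right)
    is an assumption made only for this illustration.
    """
    total = a + b + c + d               # sum signal, proportional to the detected intensity
    sx = (b + d) - (a + c)              # right minus left: lateral deflection in X
    sy = (a + b) - (c + d)              # top minus bottom: lateral deflection in Y
    s_abs = np.sqrt(sx**2 + sy**2)      # absolute signal S = sqrt(Sx^2 + Sy^2)
    return sx, sy, s_abs, total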
The image may be constructed as an array of pixels. Each pixel or group of pixels may correspond to one relative position of the sample and the focus. Also or alternatively, each pixel or group of pixels may correspond to a plurality of relative positions of the sample and the focus. This facilitates scaling the image.
The method may further comprise recording the image data, e.g. detected intensity patterns, as a function of the relative positions of the sample and the focus and constructing an image of the at least part of the sample on the basis of the recorded image data. The image data may be stored in a transient or permanent memory and/or be transmitted through the internet to a remote controller or computer. Detection and image construction may therefore be done separately.
The resolution may depend on the relative sizes of the focus and the scatterer or, respectively, any structure(s) of the scatterer to be studied. This is particularly interesting for biological samples, which may comprise scattering particles varying in one or more of size, shape and internal structure. Preferably, the light beam focus is smaller than the scatterer; this may facilitate resolving details of the scatterer smaller than the scatterer itself (i.e. sub-scatterers). In each case, dependent also on the detector used, the image data may be used for constructing the image per direction independently and/or in combination, e.g. the image being based on signals Sx, Sy, Sz (signals in the X-, Y- and Z-directions independently) and/or on an absolute value Sabs (Sabs = sqrt(Sx^2 + Sy^2 + Sz^2)), wherein "S" indicates a signal strength which may correspond to or is proportional to the amount of beam deflection. Note that a two-dimensional position dependent sensor may be formed by a combination of two one-dimensional position dependent sensors oriented (to detect) at an angle to each other, in particular perpendicular to each other. However, more complex sensors, including cameras, may also be used, wherein averaging over sensor data (e.g. averaging over parts of the camera image) may be used to define part of the image data.
The image may be constructed from a combination of image data associated with one or more individual directions, e.g. corresponding to amounts of scattering in one or more directions (e.g. Sx, Sy) and/or to an absolute value thereof (e.g. Sabs).
At least part of the image may be rendered in a brightness scale and/or an essentially single-colour scale, e.g. a grey scale wherein degrees of brightness may correspond to amounts of beam deflection in one or more directions and/or to an absolute value thereof (e.g. Sx, Sy, Sabs as discussed above). Instead of a grey scale any other sequential colormap may be used, where sequential means that the perceived lightness value increases or decreases monotonically through the colormap, e.g. a colour temperature scale (also known as a thermal red scale) ranging from purplish red via bright red, orange and yellow to white. In such a scale, images will exhibit the well-known shadow effect which gives standard DIC (Nomarski phase microscopy) images their three-dimensional appearance. The position dependent sensor may be a two-dimensional position dependent sensor, in particular a sensor capable of detecting two perpendicular directions simultaneously, conveniently called the X- and Y-directions. Then, both X and Y beam deflection data may be simultaneously available as (part of) the image data. In such a case, different linear combinations of the X and Y beam deflection data may be used for construction of the image, in particular for generating pixel values, and for rendering the image with a shadow effect oriented in any chosen direction. This may increase a (perceived) resolution or contrast of the image. This is an advantage over known techniques; e.g. in DIC microscopy one needs to physically rotate the Wollaston prism, which determines the shearing direction, in order to choose the orientation of the shadow effect, i.e. one needs to physically manipulate (the beam line of) the optical setup itself.
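As a non-limiting sketch of the post-processing described above (the function and its normalisation are illustrative assumptions, not the claimed implementation), a pixel map with a freely chosen shadow orientation can be formed from the X and Y deflection data as follows:

import numpy as np

def shaded_image(sx, sy, shadow_angle_rad):
    """Combine X and Y deflection maps into one grey-scale image with a chosen shadow direction.

    sx, sy: 2-D arrays of beam-deflection signals, one value per scan position.
    shadow_angle_rad: desired shadow orientation, measured from the X axis
    (a convention chosen for illustration only).
    """
    combined = np.cos(shadow_angle_rad) * sx + np.sin(shadow_angle_rad) * sy
    lo, hi = combined.min(), combined.max()
    # rescale to the 0..1 range for display with a sequential (e.g. grey) colormap
    return (combined - lo) / (hi - lo) if hi > lo else np.full_like(combined, 0.5)

Changing shadow_angle_rad reorients the shadow effect purely in software, without rotating a Wollaston prism or otherwise touching the optical setup.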
The output of a quadrant photodiode ("QPD") and/or a position sensitive diode ("PSD") may not only be a difference signal in perpendicular directions, but also a sum signal proportional to the total intensity detected on the diode. The former may relate to the lateral change in propagation/deflection of the light. The latter may correspond to the total intensity of the transmitted beam or reflected beam (dependent on whether the method is performed in-line or reflectively, see below). In such a case, simultaneously with the deflection-based contrast, also an intensity-based contrast image may be constructed. The method therefore also gives access to absorption/extinction parameters of the sample. In addition, the simultaneous measurement of both the deflection and the intensity of the transmitted beam allows correction for artefacts such as those caused by sudden laser emission intensity variations, e.g. by normalizing the deflection signal by the total intensity signal.
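A corresponding non-limiting sketch of the normalisation mentioned above, using the same illustrative quadrant labelling as before:

import numpy as np

def normalized_deflection(a, b, c, d, eps=1e-12):
    """Deflection signals divided by the sum signal, suppressing laser-power fluctuation artefacts."""
    total = np.maximum(a + b + c + d, eps)   # sum signal; small floor avoids division by zero
    sx = ((b + d) - (a + c)) / total         # intensity-independent X deflection
    sy = ((a + b) - (c + d)) / total         # intensity-independent Y deflection
    return sx, sy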
The method may be performed in an in-line arrangement. For that, the method may comprise arranging the light source on a first side of the sample and the optical system on a second side of the sample, in particular the first and second sides being opposite each other, such that at least part of the light beam traverses the sample from the first side to the second side before reaching the optical system and the detector. Such a method may further comprise focusing at least part of a light beam in the sample from the first side and collecting the unscattered light and the scattered light on the second side of the sample and controlling the detection system to capture image data representing at least part of the intensity pattern resulting from the collected light, as above. Thus, the method is based on forward-scattered light, or rather on interference of forward-scattered light with unscattered transmitted light.
Also or alternatively, the method may be performed in a reflection arrangement. For that, the method may comprise arranging the light source and the optical system on one side of the sample, and arranging a reflector for at least part of the light beam, such that at least part of the light beam traverses at least part of the sample from a first side and returns to the first side before reaching the optical system and the detector. Then, the method may further comprise focusing at least part of a light beam in the sample from the first side and collecting the unscattered light and the scattered light on the first side of the sample and controlling the detection system to capture image data representing at least part of the intensity pattern resulting from the collected light, as above. Thus, the method is based on backward-scattered light, or rather interference of backward-scattered light with unscattered reflected light. The reflected light may be reflected from at least one of a portion of the sample, a portion of a sample holder and/or a separate reflector.
Also or alternatively, the method may be performed in a circumventional arrangement, wherein the unscattered light is light not having traversed and/or otherwise interacted with the sample at all.
Spatial filtering of the scattered and/or the unscattered light between the optical system and the sensor may be employed, such that at least part of the scattered and/or of the unscattered light passes through a spatial filter prior to reaching the sensor.
For accurate beam deflection and absorption measurements it is advantageous to collect as much as possible of the scattered light cone, for which one may use a condenser with a numerical aperture larger than the refractive index of the sample medium. This particularly applies to forward-scattered light in an in-line arrangement. In particular for detection of symmetric changes to the back focal plane interference pattern it may be advantageous to restrict the acceptance angle of the detected light cone. Therefore an iris, which may be adjustable, may be arranged in the back focal plane of the condenser. An iris may also be otherwise employed for spatial filtering. Also, or alternatively, a dual-detection system may be provided and used, wherein a portion of the light cone, preferably substantially the full light cone, is collected by the detection system, e.g. a condenser, after which one part of the collected light is detected by the position sensitive detector, e.g. for lateral deflection determination and/or for absorption measurement and/or for capturing image data representative thereof, to construct the image, while a second detector may be provided and used for measuring another portion of the beam selected by a spatial filter, for detecting changes to the collimation of the scattered beam and/or for capturing image data representative thereof.
The method may comprise trapping at least one object in the sample, in particular optically trapping, wherein the object comprises the scatterer or the scatterer interacts with at least one of the objects, e.g. being attached to an object. The interaction may comprise one or more of being attached to the object, moving with respect to the object, reacting with the object in a chemical and/or biological and/or physical sense, etc. Thus, at least one of the position and orientation of the scatterer may be controlled and/or adjusted in the sample. This may facilitate studying the scatterer. The object may be a microsphere and the scatterer may be a biological object, e.g. a cellular body, a filament, a macromolecule etc. Optical trapping may obviate (the presence of) attachment structures for holding the scatterer, which might otherwise affect the scatterer, and/or it may support the object free from (i.e. not in contact with) a solid substrate. E.g., this might avoid unwanted contributions of the sample holder to the signal and it might decouple the sample from unwanted motions (vibrations or drift) of the sample holder. Thus, image resolution and stability may be improved.
The method may comprise trapping, in particular optically trapping, plural objects attached to each other by at least one connecting element, wherein at least one of the objects and/or the connector comprises the scatterer, and/or wherein the scatterer interacts with at least one of the objects and/or the connecting element(s), e.g. being attached to an object or to the connecting element(s). In particular the objects may be microspheres and the connecting element(s) may comprise a filament, a microtubule, a DNA strand, etc.
In case of optical trapping of at least one object in the sample with one or more optical trapping beams, the light beam may differ from at least one of the optical trapping beams in at least one of intensity, wavelength and polarization. Thus, interaction between the trapping beam(s) and the (detection) light beam may be prevented and/or the different beams may be separately controlled by wavelength-specific and/or polarization-specific optics.
The method is flexible and may comprise modifying one or more of: the focus size of the light beam, the intensity of the light beam and/or the wavelength of the light beam, as well as - in the case of trapping - modifying one or more of: the focus size of a trapping light beam, the intensity of a trapping light beam and/or the wavelength of a trapping light beam. The modification may be done within one image and/or between different images and it may be controlled by a controller. The modification allows detection of different image details and/or image data capturing with different scanning settings. In a particular embodiment, the light beam may serve as a trapping beam. The described modification facilitates switching between both functions, e.g. by dithering power and/or wavelength.
The method may comprise that at least one light beam is polarized, in particular linearly polarized. Also, (the part of) the intensity pattern of the scattered and unscattered light represented in the image data may be detected through at least one polarization dependent optical element, such as a polarizer, a polarizing beam splitter cube ("PBSC"), a Wollaston prism, etc. comprised in the collecting optical system. The light may be split in different fractions according to multiple polarizations and each split fraction may be detected separately on a position dependent sensor, and image data representing one or several of the fractions may provide information on polarization dependent characteristics of the sample, e.g. polarization-altering characteristics of (at least part of) the sample. One or more of the polarization of the light beam and the at least one polarization dependent optical element may be adjustable with respect to polarisation directions, which may be controller operable; the light beam may be sent through a polarisation-changing element such as a quarter wave and/or a half wave plate.
The method may comprise providing at least part of the sample with an optically effective label, possibly comprising optically activating or de-activating the label. Although they are not required with the presently provided techniques, labelling and associated techniques may be exploited: e.g. it might be advantageous to scan for example biological samples such as cells, tissues or biomolecules which have been (partly) fluorescently labelled and to simultaneously detect fluorescence thereof. The fluorescence may be excited or de-excited (e.g. quenched, bleached, etc.) by the scanning beam. In particular in the latter case the fluorescence might label specific structures of interest in the sample (the principal stain) while the scattering contrast may act as the counterstain for providing a composite image with more context than the primary stain alone.
In accordance with the method a system is provided herewith.
The system comprises: a sample holder to hold a biological sample, a light source providing a light beam, and, operably arranged along an optical path of at least part of the light beam: a source optical system, which may comprise one or more optical elements, in particular a focusing lens and/or an objective, and which is arranged to focus at least part of the light beam in a sample held in the sample holder; and a detection system comprising a position dependent sensor, e.g. one or more of a split photodiode, a quadrant photodiode, a photodiode array, a camera, a position-sensitive photodiode. The detection optical system, e.g. comprising one or more optical elements, in particular a condenser lens, provides a back focal plane and is arranged to collect at least part of the light beam, comprising both light not scattered by the sample and light scattered by at least one scatterer in the sample, and to provide from them an intensity pattern in the back focal plane.
The detection system is arranged to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system. E.g., the position dependent sensor is arranged to detect at least part of the intensity pattern in the back focal plane and/or in an optical conjugate plane of the back focal plane.
At least part of at least one of the sample holder, the light source and the source optical system is adjustable to controllably displace the focus of the light beam and at least part of the sample relative to each other, e.g. being connected to a position controller which the system may comprise. The system further comprises a controller connected with the position dependent sensor and programmed to construct an image of at least part of the sample, in particular at least part of the scatterer, on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus. The relative positions may result from a 1-, 2- or 3-dimensional scan of at least part of the sample. The image may be 1-, 2- or 3-dimensional and it may be rendered in a sequential colour scale.
The source optical system and the detection system preferably are configured in a confocal arrangement.
The system may comprise a spatial filtering system, e.g. comprising a pinhole and/or an iris, which may be adjustable with respect to position and/or aperture, for spatial filtering of detection light between the collecting optics (condenser, objective, ...) of the detection system and the position dependent sensor. The spatial filtering system may be connected with a controller. The spatial filtering system may further comprise relay optics.
The system may comprise a trapping arrangement to trap and/or hold one or more objects in the sample. In particular, an optical trapping arrangement may be provided. A multiple trapping arrangement to trap and/or hold one or more objects in the sample in multiple traps may be preferred. An optical trapping arrangement may comprise one or more light sources, e.g. lasers, focusing optics and detection optics arranged to provide one or more optical trapping beams in the sample.
Further, in accordance with the principles disclosed herein, an optical detection module is provided to be placed in an optical train of a sample or beam scanning microscope comprising a sample holder to hold a biological sample, a light source providing a light beam, and, operably arranged along an optical path of at least part of the light beam, a source optical system arranged to focus at least part of the light beam in a sample held in the sample holder, and wherein at least part of at least one of the sample holder, the light source and the source optical system is adjustable to controllably displace the focus of the light beam and at least part of the sample relative to each other, e.g. being connected to a position controller. The detection module comprises: a detection optical system comprising a position dependent sensor, e.g. one or more of a split photodiode, a quadrant photodiode, a photodiode array, a camera, a position-sensitive photodiode; wherein the detection system provides a back focal plane and is arranged to collect at least part of the light beam, comprising both light not scattered by the sample and light scattered by at least one scatterer in the sample, and to provide from them an intensity pattern in the back focal plane; wherein the detection system is arranged to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system; the detection system further comprising a controller connected with the position dependent sensor and programmed to construct an image of at least part of the sample, in particular at least part of the scatterer, on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus.
Another aspect is a method of imaging at least part of a sample, in particular a biological sample comprising a scatterer, the method comprising: controlling a source optical system to focus at least part of a light beam in the sample, and in particular at or near the scatterer therein, thus providing unscattered light and scattered light, which form an intensity pattern in a back focal plane of the source optical system; controlling at least one of the source optical system and a sample positioning system to position the focus and the sample at a plurality of different positions with respect to each other; for plural relative positions of the sample and the focus, controlling a detection system to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of a detection system and/or in an optical conjugate plane of the back focal plane; and constructing an image of at least part of the sample on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus.
Further, a computer-implemented method for imaging at least part of a sample, in particular a biological sample comprising a scatterer, is provided, the method comprising: controlling a source optical system to focus at least part of a light beam in the sample, and in particular at or near the scatterer therein, thus providing unscattered light and scattered light; controlling at least one of the source optical system and a sample holder to displace at least part of the sample and at least part of the focus with respect to each other for achieving plural relative positions of the sample and the focus; at each relative position of said plural relative positions, controlling a detection system to detect at least part of an intensity pattern, e.g. an interference pattern, caused by the unscattered light and the scattered light combining; and constructing an image of the at least part of the sample on the basis of the detected interference intensity patterns respectively associated with the plural relative positions.
One distinct aspect of this disclosure relates to a controller comprising a processor that is configured to execute one or more of the steps of the computer-implemented methods as described herein.
One distinct aspect of this disclosure relates to a computer program comprising instructions to cause a controller as described herein to carry out one or more of the steps of the computer-implemented methods as described herein.
One distinct aspect of this disclosure relates to a computer-readable medium comprising a computer program as described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
The above-described aspects will hereafter be explained in more detail, with further details and benefits, with reference to the drawings showing a number of embodiments by way of example.
Fig. 1A illustrates an optical system 2 according to an embodiment;
Fig. 1B illustrates a method for imaging at least part of a sample according to one embodiment;
Fig. 2 is an embodiment of a system for label-free imaging using back-focal-plane interferometry; Fig. 2A shows a typical deflection signal of the system of Fig. 2 as a light beam is scanned over a small object in the sample;
Fig. 3 is an embodiment of another in-line arrangement using sample scanning;
Fig. 4 is a detail of an embodiment of imaging with a dual-beam optical tweezer system;
Fig. 5 is an embodiment of a reflection arrangement, including optional confocal fluorescence detection;
Fig. 6 shows an exemplary embodiment for de-scanned detection in transmission geometry; Fig. 6A shows a typical deflection signal of the system of Fig. 6 as a light beam is scanned over a small object in the sample;
Figs. 7-10 are exemplary images formed in accordance with the principles presently disclosed;
Fig. 11 shows a typical intensity pattern as used in accordance with the principles presently disclosed.
DETAILED DESCRIPTION OF EMBODIMENTS
It is noted that the drawings are schematic, not necessarily to scale, and that details that are not required for understanding the present invention may have been omitted. The terms "upward", "downward", "below", "above", and the like relate to the embodiments as oriented in the drawings, unless otherwise specified. Further, elements that are at least substantially identical or that perform an at least substantially identical function are denoted by the same numeral, where helpful individualised with alphabetic suffixes.
Further, unless clearly otherwise indicated, terms like "detachable" and "removably connected" are intended to mean that respective parts may be disconnected essentially without destruction of either part, e.g. excluding structures in which the parts are integral (e.g. welded or molded as one piece), but including structures in which parts are attached by or as mated connectors, fasteners, releasable self-fastening features, etc.
Figure 1A illustrates an optical system 2 according to an embodiment. The system 2 comprises a sample holder 6 to hold a sample 70, e.g. a biological sample, comprising a scatterer (not shown). System 2 further comprises a source optical system 4 that is configured to focus at least part of a light beam 12 in the sample 70 and in particular at or near the scatterer therein, thus providing unscattered light 16 and scattered light 14. Optionally, the source optical system 4 comprises a light source for providing the light beam 12, such as a laser. As indicated by the arrows x, y, z, at least part of the sample 70 and the focus of the light beam 12 can be displaced with respect to each other. Herewith, plural relative positions of the sample 70 and the focus can be achieved. In one example, the source optical system 4 is configured to move the focus with respect to the sample 70. Additionally or alternatively, the sample holder 6 may be configured to move the focus with respect to the sample. The unscattered light 16 and the scattered light 14 combine and cause an intensity pattern, e.g. an interference pattern. The optical system 2 further comprises a detection system 8 to detect at least part of this intensity pattern. The detection system 8 comprises a position dependent sensor, e.g. one or more of a split photodiode, a quadrant photodiode, a photodiode array, a camera, a position-sensitive photodiode. The detection system 8 may provide a back focal plane and may be arranged to collect at least part of the scattered light 14 and unscattered light 16 in this back focal plane and/or in an optical conjugate plane of the back focal plane. System 2 further comprises a controller 120 that is configured to control the source optical system 4 and the detection system 8, and optionally the sample holder 6, to perform their respective functions as described herein.
Figure 1B illustrates a method for imaging at least part of a sample according to one embodiment. This method may be implemented to control at least one of the source optical system 4, the sample holder 6 and the detection system 8.
In step S2, the embodiment comprises controlling a source optical system 4 to focus at least part of a light beam 12 in the sample 70, and in particular at or near the scatterer therein, thus providing unscattered light 16 and scattered light 14. Controlling the source optical system 4 to focus at least part of the light beam 12 in the sample may consist of controlling a light source, such as a laser, to generate a light beam 12, which light beam 12 passes through passive elements, such as lenses, prisms, mirrors, filters et cetera.
In step S4, the embodiment comprises controlling at least one of the source optical system 4 and a sample holder 6 to cause displacement of at least part of the sample 70 and at least part of the focus with respect to each other, for achieving a relative position of the sample 70 and the focus. In order to achieve this position, the source optical system may be controlled, which may comprise controlling an orientation of a mirror in the source optical system 4 for directing the light beam 12. Alternatively or additionally, the plural positions may be achieved by controlling the sample holder 6, which may comprise controlling an orientation and/or position of the sample holder 6.
Step S6 comprises controlling the detection system 8 to detect at least part of an intensity pattern, e.g. an interference pattern, caused by the unscattered light 16 and the scattered light 14 combining. In an example, the detection system 8 comprises a position-dependent light sensor, such as an imaging system, that comprises a plurality of pixels. Each pixel may be configured to output a light intensity value that is indicative of the light intensity incident on the pixel, and/or indicative of other image data such as RMS deflection noise. Thus, a plurality of pixels may output light intensity values that are indicative of a light intensity pattern. A pixel for example outputs a light intensity value in the form of a voltage signal. The pixels may continuously output respective light intensity values, which may vary with time. The position-dependent sensor may further comprise an image data capture module, which may be embodied as a software module in a computer. The image data capture module may continuously receive the light intensity values from the pixels of the position-dependent sensor. It should be appreciated that controlling the detection system 8 to detect at least part of the intensity pattern may consist of transmitting an instruction to the image data capture module to store the light intensity values that it is currently receiving from the respective pixels. Herewith, the image data capture module captures the light intensity values and may thus capture the intensity pattern as image data.
Steps S4 and S6 are repeated at least once, so that at least two intensity patterns are at least partially detected for two respective relative positions of the sample and the focus. However, steps S4 and S6 may be repeated numerous times.
Step S8 comprises constructing an image of the at least part of the sample on the basis of the detected image data, e.g. the detected intensity patterns, respectively associated with the plural relative positions. After steps S4 and S6 have been repeated a number of times for a plurality of relative positions, a plurality of intensity patterns have been detected by the detection system 8, wherein each intensity pattern is associated with a relative position of the sample and the focus. Step S8 may comprise, for each detected intensity pattern, determining an image pixel value, for example a greyscale value, for an image pixel in the image to be constructed. Step S8 may further comprise constructing the image based on the determined image pixel values and their associated relative positions.
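Purely as an illustrative, non-limiting sketch of the S2-S8 loop (the scanner, detector and signal_to_pixel interfaces named here are assumptions made for the example and are not part of the disclosed system):

import numpy as np

def scan_image(scanner, detector, x_positions, y_positions, signal_to_pixel):
    """Acquire image data at plural relative positions (S4/S6) and construct an image (S8)."""
    image = np.zeros((len(y_positions), len(x_positions)))
    for i, y in enumerate(y_positions):
        for j, x in enumerate(x_positions):
            scanner.move_to(x, y)                    # S4: set relative position of focus and sample
            pattern = detector.capture()             # S6: capture back-focal-plane image data
            image[i, j] = signal_to_pixel(pattern)   # S8: e.g. map deflection to a greyscale pixel value
    return image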
Figure 2 also shows an exemplary embodiment of a system arranged for performing at least one embodiment of the method disclosed herein, the system having a source optical system 4. Fig. 2 shows a light source 10 projecting a light beam 20, which might be a laser beam, onto a scanning device 30, here being controlled by an optional controller in the form of a central processing unit (CPU) 120. The scanning device can for example be a tip/tilt mirror or an acousto-optical or electro-optical deflection system.
The beam 20 is relayed using telescope lenses 40, 50 to theback-focal-plane of a microscope objective 60.
The microscope objective 60 focusses the light onto a sample held in a sample holder 70. The sample, which might be a biological sample, can be scanned by the focused beam by means of the scanning device. A condenser lens 80 or similar optical system is used together with a relay optical system (e.g. a single lens 90) of a detection system 8 to project the light beam that has passed through the sample onto a position sensitive detector 100, which can for example be a quadrant photodiode (QPD), a position sensitive diode (PSD) or a camera of the detection system 8, positioned in a conjugate of the back focal plane of the condenser. The signals from the position sensitive detector are optionally amplified and combined in an electronic circuit 110 and are sent to the CPU 120 or other processing unit. An optional beam splitter 130 can be used to send part of the light beam that has passed through the sample to a spatial filter 140, positioned in a conjugate of the back focal plane of the condenser, that can be used to select only a part of the beam. A detector 150, e.g. a photodiode, can be used to detect e.g. changes in collimation of the scattered light which correlate with axial displacements of the scatterer. Based on the detector signals and the current position of the scanner an image can be constructed by the CPU.
The light source 10 can be a laser and the light beam 20 can be a laser beam. However, other light sources and light beams may be provided. The sample can be a biological sample, comprising scatterers such as a cell or sub-cellular structure, a filament (e.g. actin, microtubule), a protein on the surface of a substrate (e.g. a microscope cover glass) or a structure suspended in an optical trap, e.g. a dual optical tweezer setup. The sample can furthermore comprise any scatterer with topological features or variation in refractive index.
The image contrast of the image to be constructed is predominantly based on deflection of the light beam caused by interaction with (the scatterer in) the sample. Any object in the sample, in particular in a sample plane in which the light beam focus is located, having a refractive index (polarizability) that differs from the refractive index of the medium surrounding the object will cause a deflection of (part of) the beam, hence the name "scatterer". This deflection can be measured, e.g. by monitoring the difference signal (Vx and/or Vy) of a quadrant photodiode, wherein the measurement results provide image data. It has been shown that this deflection can be detected directly, e.g. in dark field microscopy, but this has the disadvantage that the scattered intensity scales with r^6 (the radius of the scattering feature raised to the 6th power), which makes it hard to detect small objects such as e.g. single proteins or small filaments, or structures thereof, and changes in size and/or mass of the biological object under study due to proteins, small molecules or other entities binding to the initial biological structure of interest. This binding can be either dynamic or static in nature and can be detected as a variation of the deflection signal.
Taking advantage of interference between the deflected/scattered and the unscattered part of the light beam, it has now been found that one can directly detect the field amplitude instead of the power amplitude of the scattered light. This reduces the scaling factor to r^3, making it much easier to image or detect small structures and/or objects such as single proteins or protein complexes.
The deflection provides a varying interference resulting in a difference signal of the QPD; as the beam is scanned over a scatterer the light detected on the QPD will first deflect to one side and then to the other. This is illustrated in Fig. 2A for a single and smooth scattering structure, e.g. a microsphere or a protein smaller than the beam focus.
Scanning an object whose optical properties, such as position, size and/or mass, change over time will result in a varying amount of deflected light and hence a varying signal.
Scanning an object with different scattering structures will result in a more complex signal shape. Scanning the beam in two dimensions allows the build-up of a 2D image of the scatterer.
Due to aberrations and imperfections in the optical path, a scanning-beam approach as illustrated in this exemplary embodiment might lead to a non-zero and/or structured background image, on top of which it is hard to detect small signals of a scattering object in the sample. This may be resolved by careful subtraction of a background image. Such a background subtraction can for example be achieved by scanning the image multiple times where at least a portion of the sample, e.g. the scatterer, is moved by a known amount between the consecutive images, e.g. using a sample stage or by moving optical traps. Subtracting such consecutive images may lead to a background-free image, possibly with two displaced copies of the sample. If the sample is larger than the displacement, post-processing might be useful to recover a single-copy image. Another method for background subtraction is to take advantage of any dynamics that might be present in the sample: for samples which are changing over time it is possible to achieve high-quality subtraction of a (static) background by subtracting an average over many images from one or more individual images.
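A non-limiting sketch of the second background-subtraction strategy described above (subtracting an average over many images from the individual images of a changing sample; the array layout is an assumption made for the example):

import numpy as np

def subtract_static_background(frames):
    """Remove a static background from a time series of scan images.

    frames: array of shape (n_frames, ny, nx). The sample is assumed to change
    between frames, while the aberration-induced background stays static.
    """
    background = frames.mean(axis=0)   # static structures survive the averaging
    return frames - background         # dynamic scatterers remain; static background is suppressed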
By repeatedly scanning 2-dimensional images of the sample while the sample and focus are displaced with respect to each other along the direction of beam propagation (the Z-direction), so that the light beam is focused in different layers of the sample, it is possible to construct a 3-dimensional image.
It is advantageous to use a condenser with a numerical aperture ("NA") higher than the index of refraction of the sample medium in order to allow optimal capturing of the scattered light.
Multi-beam scanning can be done to improve imaging speed and/or accuracy. For this, it is preferred to collect and detect the deflection of multiple beams simultaneously. At least some of the multiple beams may differ in one or more optical characteristics such as polarization state (e.g. different linear polarization directions), wavelength, wavelength modulation, intensity modulation, etc., which is detectable by the detection system by suitable (combinations of) techniques such as polarisation separation, wavelength-selective filtering and/or absorption, demodulation techniques, etc.
According to an alternative exemplary embodiment it is possible to scan the sample stage instead of (a focus of) the light beam. This might have the advantage that in this case there is less background signal caused by aberrations in the optical system. Figure 3 shows a scheme of such an embodiment. However, scanning both at least part of the sample and at least part of the light beam is also possible. Fig. 3 shows a light source 10 of a source optical system 4 projecting a beam of light 20 into the back focal plane of a microscope objective 60. The objective 60 focusses the light into a sample 70 which is mounted onto a sample holder. The sample holder can be scanned in one or more directions (here: three mutually perpendicular directions X, Y and Z), preferably two directions that are parallel to a sample plane. The scanning can be controlled by a controller, e.g. via signals provided by a central processing unit (CPU) 120 as shown here. From the sample, when the light beam is at least partly scattered, both the unscattered light beam 20 and the scattered light beam 25 are collected by the condenser 80. Via a relay lens 90 of the detection system 8 the back focal plane of the condenser 80 is imaged onto a position sensitive detector 100. Signals from the position sensitive detector 100 are amplified and combined, as indicated at reference number 110, and sent to the CPU 120. The CPU 120 uses information on the position of the sample holder 70 and the signals from the sensor 100 as image data to construct an image of at least part of the sample on the basis of the image data and the relative positions of the sample holder and the focus.
Such a system may be less susceptible to spurious background signals caused by aberrations and optical imperfections. Optionally one could also add a pinhole or other spatial filter 85 at the focal point of the optical relay system to the position sensitive detector. Spatial filtering may reject unwanted background light scattered from different portions of the sample, e.g. different focal planes. Further, reconstruction of three-dimensional datasets of scattering contrast may be facilitated and/or enhanced by scanning the sample stage in a direction along the direction of light beam propagation (here: the Z-direction) in addition to one or more lateral directions (here: the X- and Y-directions).
Part of another embodiment is illustrated in Figure 4. An object, e.g. a DNA molecule 230 with bound proteins 240, is tethered between two beads 210 held in a sample medium (not indicated) in trapping beams 200 of a dual optical trap, known per se. A light beam 250 is scanned along the DNA molecule 230. The molecule 230 and the proteins 240 each scatter the light beam 250 to some extent, dependent on their optical properties relative to the surrounding sample medium. At least part of the scattered light and unscattered light are collected and detected as generally indicated before. Thus, one or more images of (part of) the DNA molecule 230 and/or the proteins 240 may be constructed based on the image data representative of the back-focal-plane interferometry signal of the light beam. In such a case effective background subtraction can be done e.g. by (slightly) moving the optical traps between consecutive images and subtracting the images.
According to yet another embodiment, indicated in Figure 5, instead of detection of scattered light and unscattered light in the forward direction (i.e. in transmission through the sample), back-scattered light may be used. In this case the light beam 20 generated by the light source 10 travels through a beam splitter 300 and travels to the sample 70 via an optional beam scanner 30, an optional relay system, which here is indicated as two lenses 40, 50, and an objective 60 which focuses the light beam 20 into the sample 70. The sample 70 is mounted on a sample holder, here comprising a coverslip 340 that is at least partly transparent to the wavelength of the light beam 20. The sample 70 and/or the sample holder 340 holding the sample 70 may at least partly be movable, preferably controllably movable as discussed above, e.g. controlled by a controller. At or near the sample 70 at least part of the light beam 20 is reflected. E.g., a substrate-sample transition in the coverslip 340 may reflect part of the light from the light beam before the remaining part of the light beam has interacted with the actual sample. The reflected light can be used as the reference field for interferometric detection. Both the light back-scattered from the sample and the reflected reference light are collected with the objective 60, which then serves as the detection optical system and provides the back focal plane. Note that, in this embodiment, the illumination light, the sample and the detection light are, automatically, in a confocal arrangement and the source optical system 4 and the detection system 8 share a significant number of optical elements (300, 320, 30, 40, 50, 60). The detection light travels back through the objective 60 and the relay system 50, 40, and via the scanning mirror 30 to the beam splitter 300 where at least a part of the light is reflected and sent to a position sensitive detector 100 via a further optional relay 90 and an optional pinhole 85 or iris for spatial filtering. For light efficiency one might choose to use linearly polarized light from the light source, e.g. using a polarizing beam splitter 300 which transmits p-polarized light and a quarter wave plate 320 as shown. If the quarter wave plate 320 is rotated such that the illumination light travelling to the sample 70 has a circular polarization, the back-reflected detection light, after passing for a second time through the quarter wave plate 320, will have a linear polarization rotated 90 degrees with respect to the incoming light and therefore has s-polarization. This light will be efficiently reflected by the polarizing beam splitter 300, ensuring optimal light efficiency towards the position dependent sensor 100.
If one would like to simultaneously observe any fluorescence light of the sample 70, e.g. being excited in the sample 70 by the illumination/scanning beam, this can easily be achieved by adding a dichroic beam splitter 310 which e.g. transmits the scanning excitation beam but reflects the fluorescence emission. This is shown in Fig. 5, but such an optical fluorescence detection system may be added to any embodiment. In Fig. 5, the emission travels to a sensitive point detector 330 or any other suitable sensor or camera, via another optional relay 90 and an optional spatial filter 85. Any detection signals from the point detector 330 may be combined with data from the position sensitive detector 100 as part of the image data for constructing the image.
In another embodiment, not shown, polarization sensitive detection can be implemented. For this, a polarizing beam splitter may be located before the sensor 100, to split the detection beam(s) according to polarization. In this way the two orthogonally polarized components of the detection light (scattered and unscattered light) each give rise to their own detection signals, which may be treated separately or in any suitable combination as image data for constructing the image. If the scattering by the sample is polarization dependent, this leads to slight differences in the detection signals from the individual beams which can be analysed, for example, in terms of birefringence. The signals may be detected with a quadrant position sensitive detector or with two position sensitive detectors each associated with one of the polarization directions, such detectors then possibly being unidirectionally sensitive. In the case of two detectors, for optimization of the signal on both detectors the polarization of the illumination/scanning beam may be tuned, e.g. to 45 degrees, with the aid of a half wave plate. Similarly, the polarization of the illumination/scanning beam could be modulated in conjunction with polarization insensitive detection in order to characterize polarization-dependent scattering (e.g. implementing time-multiplexed polarization dependent detection).
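Purely as an illustrative sketch (function and variable names are assumptions, not part of the disclosure), a simple per-pixel measure of polarization-dependent scattering could be formed from the two polarization-channel signals as a normalized difference:

    import numpy as np

    def polarization_contrast(signal_p, signal_s, eps=1e-12):
        # Per-pixel normalized difference of the signals recorded in the two
        # orthogonal polarization channels; the result is zero wherever the
        # scattering is polarization independent.
        p = np.asarray(signal_p, dtype=float)
        s = np.asarray(signal_s, dtype=float)
        return (p - s) / np.maximum(np.abs(p) + np.abs(s), eps)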
In another embodiment, shown in Fig. 6, the beam scanning implementation and the forward scattered detection can be further refined using a de-scanning tip-tilt mirror 160 after the light has travelled through the sample, has been collected by the detection system and has passed through a pair of optical relay lenses 90. The de-scanning tip-tilt mirror ensures that the scanning beam is transformed into a stationary beam. It is then possible to use a spatial filtering assembly, consisting of a pair of lenses 170 and a pinhole or spatial filter 140, before the beam is detected by the position sensitive detector 110 (see Fig. 6A for an indicative signal). This spatial filtering can be employed to reject background light and improve e.g. the z-sectioning of the scattering signal.
Other optical techniques like wavelength variation and/or wavelength dependent detection and/or detection of angle-dependent scattering may be exploited as well.
Figs. 7-10 show exemplary images, which were obtained with a stage-scanning implementation in a microscopy system otherwise generally in accordance with Fig. 2. The illumination light beam 20 was set to a very low power level, parked at the center of the field of view in a confocal setup, and the microscope sample stage was raster scanned while acquiring image data representative of deflection data. From the image data the images of Figs. 7-10 were constructed.
Figures 7 and 8 are images of a human cheek epithelial cell, constructed from a one-directional deflection signal detected by scanning the cell through a static light beam by stage scanning. Fig. 7 is constructed from the deflection in one direction (X), whereas Fig. 8 is constructed from the deflection in a perpendicular direction (Y). The scale of both images is 80 x 60 micrometer.
Figure 9 is an absolute signal image of the cell of Figs. 7-8. The image is obtained by combining the X- and Y-direction data of Figs. 7 and 8 according to Sabs = sqrt(Sx^2 + Sy^2) and using a different brightness colour scale relative to Figs. 7-8.
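For illustration only, the combination used for Fig. 9 may be written as the following sketch (names are assumptions, not references to any actual implementation):

    import numpy as np

    def absolute_signal(s_x, s_y):
        # Combine per-pixel X- and Y-deflection images into
        # Sabs = sqrt(Sx^2 + Sy^2).
        return np.hypot(np.asarray(s_x, dtype=float), np.asarray(s_y, dtype=float))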
Figure 10 is an image of a single chromosome.
Fig. 11 shows a typical intensity pattern of a back focal plane. The pattern is offset from the centre, as indicated by the cross hairs dividing the picture. Intensity differences in the X- and Y-direction (Xdiff, Ydiff) and the total intensity (Sum) may be calculated as shown.
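By way of illustration, and assuming a particular quadrant labelling, the signals of Fig. 11 may be computed from the four quadrant intensities as sketched below; in practice the difference signals are often normalized by the Sum signal to reduce sensitivity to fluctuations of the total light level.

    def quadrant_signals(a, b, c, d):
        # Difference and sum signals from the four quadrant intensities of a
        # quadrant photodiode imaging the back focal plane; quadrant labelling
        # is assumed to be a = top-left, b = top-right, c = bottom-left,
        # d = bottom-right. Works element-wise on scalars or NumPy arrays.
        x_diff = (b + d) - (a + c)   # right minus left
        y_diff = (a + b) - (c + d)   # top minus bottom
        total = a + b + c + d        # total collected intensity
        return x_diff, y_diff, total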
The disclosure is not restricted to the above described embodiments, which can be varied in a number of ways within the scope of the claims. For instance, the interference pattern beam line may be combined with another imaging beam line with which a focal plane or an optical conjugate thereof may be imaged. Such beam lines may partly overlap, e.g. sharing the condenser and being separated by a partial beam splitter to two different optical detectors, e.g. a quadrant photodiode for the interference pattern beam line and a camera for the imaging beam line, and/or having different wavelengths and being separable using a dichroic mirror and/or a filter.
Elements and aspects discussed for or in relation to a particular embodiment may be suitably combined with elements and aspects of other embodiments, unless explicitly stated otherwise.

Claims (16)

1. Method of imaging at least part of a sample, in particular a biological sample containing a scatterer, for example a biological object, the method comprising: focusing at least part of a light beam in a sample plane in the sample, and in particular focusing the part of the light beam at or near the scatterer therein, thus providing unscattered light and scattered light; causing a displacement of at least part of the sample and at least part of the focus with respect to each other; and for plural relative positions of the sample and the focus: collecting the unscattered light and the scattered light with a detection system focused in at least part of the sample and comprising a position dependent sensor; and controlling the detection system to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system; wherein the method further comprises constructing an image of at least part of the sample, in particular of at least part of the scatterer, on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus.

2. Method according to claim 1, wherein the position dependent sensor is a two-dimensional position dependent sensor; and/or wherein the method further comprises rendering at least part of the image in a light intensity scale and/or an essentially single-colour scale; and/or wherein the method further comprises constructing the image from a combination of image data associated with one or more individual directions, e.g. corresponding to amounts of scattering in one or more directions (Sx, Sy) and/or to an absolute value thereof (Sabs); and/or wherein the method further comprises constructing a contrast image based on an intensity of the light beam after interaction thereof with the sample.

3. Method according to any one of the preceding claims, comprising arranging the light source on one side of the sample and the optical system on a second side of the sample, in particular the first and second sides being opposite each other, so that at least part of the light beam traverses the sample from the first side to the second side before reaching the optical system and the detector.

4. Method according to any one of the preceding claims, comprising arranging the light source and the optical system on one side of the sample, and arranging a reflector for at least part of the light beam so that at least part of the light beam traverses at least part of the sample from a first side and returns to the first side before reaching the optical system and the detector.

5. Method according to any one of the preceding claims, comprising spatially filtering the scattered and/or unscattered light between the optical system and the sensor, so that at least part of the scattered and/or unscattered light passes through a spatial filter before reaching the sensor.

6. Method according to any one of the preceding claims, comprising trapping at least one object in the sample, in particular optically trapping, wherein the object comprises the scatterer or the scatterer interacts with at least one of the objects, for example being attached to an object.

7. Method according to claim 6, comprising trapping, in particular optically trapping, plural objects which are connected to each other by at least one connecting element, in particular the objects being microspheres and the connecting element(s) comprising a microtubule, a DNA strand, etc., wherein at least one of the objects and/or the connector comprises the scatterer, and/or wherein the scatterer interacts with at least one of the objects and/or the connecting element, for example being attached to an object or to the connecting element(s).

8. Method according to claim 6 or 7, comprising optically trapping at least one object in the sample with one or more optical trapping beams, wherein the light beam differs from at least one of the optical trapping beams in at least one of intensity, wavelength and polarization.

9. Method according to any one of the preceding claims, wherein the method comprises modifying one or more of: the focus size of the light beam, the intensity of the light beam and/or the wavelength of the light beam; and/or wherein the method comprises providing at least part of the sample with an optically active label, possibly comprising optically activating and/or deactivating the label.

10. System for the method of any one of the preceding claims, the system comprising: a sample holder for holding a biological sample, a light source providing a light beam, and, operatively arranged along an optical path of at least part of the light beam: a source optical system configured to focus at least part of the light beam in a sample held in the sample holder, and a detection optical system comprising a position dependent sensor, for example one or more of a split photodiode, a quadrant photodiode, a photodiode array, a camera, a position-sensitive photodiode; wherein the detection system provides a back focal plane and is arranged to collect at least part of the light beam, comprising both light not scattered by the sample and light scattered by at least one scatterer in the sample, and to provide of them an intensity pattern in the back focal plane; wherein the detection system is arranged to capture image data, the image data representing at least part of the intensity pattern related to an outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system; wherein at least part of at least one of the sample holder, the light source and the source optical system is adjustable to controllably displace the focus of the light beam and at least part of the sample with respect to each other, for example being connected to a position controller; the system further comprising a controller connected to the position dependent sensor and programmed to construct an image of at least part of the sample, in particular of at least part of the scatterer, on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus.

11. System according to claim 10, comprising a spatial filter system for spatially filtering detection light between the detection system and the position dependent sensor.

12. System according to claim 10 or 11, comprising a trapping device for trapping and/or holding one or more objects in the sample, in particular an optical trapping device, preferably a multiple trapping device for trapping and/or holding one or more objects in the sample in plural traps.

13. Detection module for placement in an optical train of a sample- or beam-scanning microscope comprising a sample holder for holding a biological sample, a light source providing a light beam, and, operatively arranged along an optical path of at least part of the light beam, a source optical system arranged to focus at least part of the light beam in a sample held in a sample holder, wherein at least part of at least one of the sample holder, the light source and the source optical system is adjustable to controllably displace the focus of the light beam and at least part of the sample with respect to each other, for example being connected to a position controller; wherein the detection module comprises: a detection optical system comprising a position dependent sensor, for example one or more of a split photodiode, a quadrant photodiode, a photodiode array, a camera, a position-sensitive photodiode; wherein the detection system provides a back focal plane and is arranged to collect at least part of the light beam, comprising both light not scattered by the sample and light scattered by at least one scatterer in the sample, and to provide of them an intensity pattern in the back focal plane; wherein the detection system is arranged to capture image data, the image data representing at least part of the intensity pattern related to an outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane and/or in an optical conjugate plane of the back focal plane; the detection system further comprising a controller connected to the position dependent sensor and programmed to construct an image of at least part of the sample on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus.

14. Method of imaging at least part of a sample, in particular a biological sample containing a scatterer, the method comprising: controlling a source optical system for focusing at least part of a light beam in the sample, and in particular at or near the scatterer therein, thus providing unscattered light and scattered light which form an intensity pattern in a back focal plane of a detection optical system; controlling at least one of the source optical system and a sample positioning system to position the focus and the sample at a plurality of different positions with respect to each other; for plural relative positions of the sample and the focus, controlling the detection system to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of the intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane; and constructing an image of at least part of the sample on the basis of the image data associated with the plural relative positions of the sample and the focus.

15. A computer program comprising instructions to cause the system according to any one of claims 10-12 to carry out the steps of the method according to any one of claims 1-9.

16. A computer-readable medium having stored thereon the computer program according to claim 15.
NL2019891A 2017-11-10 2017-11-10 Label-free microscopy NL2019891B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
NL2019891A NL2019891B1 (en) 2017-11-10 2017-11-10 Label-free microscopy
PCT/NL2018/050753 WO2019093895A1 (en) 2017-11-10 2018-11-12 Label-free microscopy
DE112018005412.8T DE112018005412T5 (en) 2017-11-10 2018-11-12 Label-free microscopy


Publications (1)

Publication Number Publication Date
NL2019891B1 true NL2019891B1 (en) 2019-05-17

Family

ID=61003320



Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NL2029859B1 (en) 2021-11-22 2023-06-13 Lumicks Dsm Holding B V Method to produce DNA molecules with repeating units for use in single-molecule assays.


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008092107A1 (en) * 2007-01-26 2008-07-31 New York University Holographic microscope system and method for optical trapping and inspection of materials


Also Published As

Publication number Publication date
DE112018005412T5 (en) 2020-07-02
WO2019093895A1 (en) 2019-05-16


Legal Events

Date Code Title Description
PD Change of ownership

Owner name: LUMICKS DSM HOLDING B.V.; NL

Free format text: DETAILS ASSIGNMENT: CHANGE OF OWNER(S), DEMERGER; FORMER OWNER NAME: LUMICKS TECHNOLOGIES B.V.

Effective date: 20210621