NL2019891B1 - Label-free microscopy
- Publication number
- NL2019891B1 (application NL2019891A)
- Authority
- NL
- Netherlands
- Prior art keywords
- sample
- light
- light beam
- image data
- scattered
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/0004—Microscopes specially adapted for specific applications
- G02B21/002—Scanning microscopes
- G02B21/0024—Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
- G02B21/0052—Optical details of the image generation
- G02B21/0056—Optical details of the image generation based on optical coherence, e.g. phase-contrast arrangements, interference arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/361—Optical details, e.g. image relay to the camera or image sensor
Abstract
A method of imaging at least part of a sample is provided which comprises: focusing at least part of a light beam in a sample plane in the sample, and in particular focusing the part of the light beam at or near the scatterer therein, thus providing unscattered light and scattered light; causing a displacement of at least part of the sample and at least part of the focus with respect to each other; and for plural relative positions of the sample and the focus: collecting the unscattered light and the scattered light with a detection system focused in at least part of the sample, and comprising a position dependent sensor; and controlling the detection system to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system. This step may also be described as: detecting with a position dependent sensor at least part of the intensity pattern in the back focal plane and/or in an optical conjugate plane of the back focal plane of the detection system. A corresponding system is also provided.
Description
Label-free microscopy
TECHNICAL FIELD
The present disclosure relates to microscopy, in particular to microscopy of biological specimens.
BACKGROUND
In microscopy there is a continuous drive towards imaging ever smaller details, ever more clearly and faster.
Current microscopy methods, in particular microscopy of biological and/or molecular interactions, heavily rely on (fluorescent) labelling strategies, which can be challenging and which potentially even interfere with the molecular interactions under scrutiny. Although (single molecule) fluorescence microscopy can be very powerful due to the high selectivity that can be achieved with labelling techniques, there are a number of intrinsic drawbacks associated with it, such as the need for labelling steps, limited signal due to photobleaching, and limited photon flux which limits imaging speed.

It is noted that several label-free microscopy techniques have been developed previously. Standard bright field microscopy is based on differences in the transmitted light intensity induced by a sample. The contrast is mostly based on absorption and/or scattering of light induced by the sample. However, many specimens, especially biological specimens, do not absorb or scatter much light, leading to poor contrast in many cases. Dark field microscopy uses a special illumination system which illuminates a sample with a hollow cone of light in such a way that the unscattered light cannot reach the detector. This leads to contrast based on the scattering properties of the sample.

A different technique is interferometric scattering microscopy ("iSCAT") [Ortega-Arroyo et al., Phys. Chem. Chem. Phys., 14, 15625 (2012)]. This technique uses light reflected from a microscope coverglass as a reference field in order to detect scattered sample light interferometrically with great sensitivity. It has been shown to have single-molecule sensitivity. This method however suffers from irregularities in the coverglass which cause large background signals. Because of the scattering produced by the glass irregularities, iSCAT imaging requires that a "background" image of the sample surface first be obtained prior to adding the biological specimen. In addition, this approach is limited to biological samples in close proximity to the glass surface, severely limiting its application to three-dimensional structures or suspended samples.
Hence, further improvements are desired.
It is noted that for characterisation of optical traps and detection of forces on trapped objects, back-focal-plane interferometry-based techniques have been developed: Gittes and Schmidt [Gittes, F. and Schmidt, C. F., Optics Letters, 23, 7 (1998)] developed back-focal-plane interferometry to sensitively detect the displacement of microscopic beads trapped in optical tweezer systems in a non-imaging manner. Jahnel et al. [Jahnel, M., et al., Optics Letters, 36, 1260-1262 (2011)] used back-focal-plane interferometry to map the force field of an optical trap. DeSantis and Cheng [DeSantis, M. C., & Cheng, W., WIREs Nanomed Nanobiotechnol, 8, 717-729 (2016)] describe how optical trapping combined with back-focal-plane interferometry can be used to detect individual virus particles and differentiate them from aggregates.
SUMMARY
Herewith, a method and a system in accordance with the appended claims are provided.
An aspect comprises a method of imaging at least part of a sample, in particular a biological sample comprising a scatterer, e.g. a biological object. The method comprises: focusing at least part of a light beam in a sample plane in the sample, and in particular focusing the part of the light beam at or near the scatterer therein, thus providing unscattered light and scattered light; causing a displacement of at least part of the sample and at least part of the focus with respect to each other; and for plural relative positions of the sample and the focus: collecting the unscattered light and the scattered light with a detection system focused in at least part of the sample, and comprising a position dependent sensor; and controlling the detection system to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system. This step may also be described as: detecting with a position dependent sensor at least part of the intensity pattern in the back focal plane and/or in an optical conjugate plane of the back focal plane of the detection system.
The method further comprises: constructing an image of at least part of the sample, in particular at least part of the scatterer, on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus.
Thus, imaging and/or detection of a scatterer is provided based on back-focal-plane interferometry. The imaging is based on capturing image data corresponding to detecting the scattered (and unscattered) light in the far field. The imaging may not affect the scatterer's position and/or other properties. Each of the plural relative positions of the sample and the focus produces its own intensity pattern, resulting from interference between the non-scattered and the scattered light, in the back focal plane and/or in an optical conjugate plane of the back focal plane, representative of the interaction of the light beam focus with the portion of the sample in which the light beam is focused.
In particular, the focus of the light beam and the focus of the collecting optical system may (be caused to) coincide, providing a confocal arrangement.
The light source may comprise a laser and the light beam may be a laser beam. The light beam may have any suitable beam shape but preferably is a Gaussian beam, which enables a particularly well-defined focus. Displacing the sample (part) and the focus (part) relative to each other in a controlled manner is also referred to as "scanning". The scanning may be 1-, 2- and/or 3-dimensional; scanning directions (i.e. directions of relative displacement) may be selected as desired and are preferably orthogonal to each other, with preferably at least one of the scanning directions perpendicular to the direction of propagation of the light beam. Imaging may be based on 1-, 2- or 3-dimensional scanning, such as by scanning along a linear path (e.g. along a strand), scanning a plane (which may have any suitable shape and/or orientation in the sample) or scanning a volume, wherein the construction of the image and its possible interpretation may depend on properties of the detector, e.g. enabling position dependent detection in one or two spatial directions (see below). Information on dynamics of the sample may be derived from consecutive scans of one or more sample portions, which may be depicted as a series of images, an image stack, a kymograph, a graph indicating the time-dependent changes and/or as a time-varying image (a movie). In another option, displacement of at least part of the sample with respect to the sample holder and to the focus may be caused by diffusion of at least part of the sample, which may have natural causes allowed over time, and/or which may be caused by causing flow of a sample fluid within the sample holder.

A scatterer, e.g. a scattering particle located in a focal plane of a light beam, in particular a Gaussian beam, of which the incoming field is defined as Ei, will produce a scattered field Es. It is noted that direct detection of the scattered field Es is possible (e.g. in darkfield microscopy) but this is limited to relatively large structures, because the scattering cross-section of a scatterer of size r scales with r^6. However, the present method relies on the notion that when the scattered field Es is allowed to interfere with the unscattered field Eu, the interference term scales with r^3 instead. This facilitates detection of smaller details.
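The scaling argument can be made explicit. A minimal sketch, assuming a Rayleigh-type scatterer whose scattered field amplitude is proportional to its polarizability, and hence to r^3, which is a standard assumption not spelled out in the text:

```latex
% Intensity at the detector when the scattered field E_s interferes with the
% unscattered field E_u; for a small scatterer of size r, |E_s| scales as r^3.
\begin{aligned}
I_{\mathrm{det}} &= \lvert E_u + E_s \rvert^2
  = \lvert E_u \rvert^2 + 2\,\operatorname{Re}\!\left(E_u^{*} E_s\right) + \lvert E_s \rvert^2,\\
\lvert E_s \rvert^2 &\propto r^6 \quad \text{(pure scattering, e.g. dark field)},\\
2\,\operatorname{Re}\!\left(E_u^{*} E_s\right) &\propto r^3 \quad \text{(interference term exploited here)}.
\end{aligned}
```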
The scatterer should have an index of refraction that differs from the surrounding medium in the sample for at least a portion of the wavelengths of the light of the light beam. The image in particular shows one or more of position, shape and size of the scatterer in the sample. The sample may comprise plural scatterers and/or scatterers comprising different structures and/or differently scattering structures.
Suitable biological objects for acting as a scatterer in a biological sample as discussed herein may comprise or be a cellular body or a substructure thereof, such as any one of cells, proteins, small molecules interacting with proteins, viruses, DNA and RNA molecules, chromosomes, organelles, filaments, sub-cellular structures, but also tissues, antibody-stained tissues, protein-small molecule complexes, etc.
The detection system, e.g. based on a condenser (or an objective), is positioned after the sample as seen in the direction of propagation of the light beam, to collect at least a portion of the scattered light. The intensity pattern is created by interference in the back focal plane of the optical system, the pattern depending on the relative position between the scatterer, e.g. a refractive portion of a particle, and the focus. Capturing image data, e.g. by detection of this intensity pattern by the position sensitive detector in the back focal plane, or an optical conjugate plane thereof, allows sensitive detection of the scatterer. The image may be constructed by converting the image data associated with each relative position of the sample and the focus into one or more pixel values of the constructed image. By varying the relative positions of at least part of the sample and the focus, e.g. scanning the positions of the sample and the focus relative to each other, an image can be obtained that is dependent on the scattering properties of the scatterer, in particular on diffraction and refraction (but also absorption) properties of the scatterer and/or any structures thereof and/or therein. As a consequence, the scatterer need not be labelled or otherwise affected.
The light intensity pattern in the back focal plane or the optical conjugate plane thereof, and consequently the intensity distribution on the position sensitive detector, is proportional to the interaction of the light and the scatterer, in particular proportional to the amount of deflection of the light beam induced by the sample. Note that, unless specified otherwise, all references to directions and/or relative positions like "lateral" are to be understood as referencing the direction of propagation of the light beam, generally the Z-direction.
The image data may comprise one or more of the intensity distribution on the position sensitive detector or a fraction thereof, a total intensity on the position sensitive detector, colour information, variation data, time stamps, etc. The image data used for construction of the image may also comprise spatial and/or time averages of detection signals and/or statistical information regarding detection signals, e.g. root-mean-square deflection noise (RMS deflection noise). The position sensitive detector may comprise e.g. a camera, a diode array, a quadrant photodiode (QPD), a position sensitive diode (PSD), or any combination thereof. A detector that is position sensitive in two dimensions, e.g. a camera, a PSD or a QPD, enables independent and/or simultaneous detection of the interaction of the scatterer and the light beam projected onto these two dimensions, i.e. it enables quantification of the interaction in X and Y directions independently and allows the total, or absolute, signal to be calculated as S = sqrt(Sx^2 + Sy^2). Quantification of the signal might be used to obtain certain properties of the sample, e.g. the scatterer's size, the permittivity of the solvent medium, the relative refractive index of the scatterer compared to the medium and the incident wavelength of the light. A relatively simple detector such as a QPD or PSD may be faster than a camera, enabling detection of rapid changes in the sample. Such a detector may also have a larger dynamic range, enabling measurement of small signals on a large background.

A scatterer in the sample plane, particularly a scatterer slightly above or below the focal plane of the light beam, might cause a symmetric change to the interference pattern in the back-focal plane (i.e. the chief ray of the scattered beam might not be deflected in the lateral direction but the marginal rays of the light beam might be deflected to cause a divergence or convergence of the scattered light beam). Detection of this deflection may also be implemented for providing information for the imaging step. As an example, the total intensity of the transmitted light within a certain region in the back focal plane of the optical system may be monitored, e.g. using an aperture to restrict the range of acceptance angles of the detection system.
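As an illustration of how the X, Y, sum and absolute signals may be derived from a quadrant detector, the following sketch combines the four quadrant intensities; the quadrant layout and the function name are assumptions made for the example, not taken from the disclosure:

```python
import numpy as np

def qpd_signals(q_a, q_b, q_c, q_d):
    """Combine the four quadrant intensities of a QPD into deflection signals.

    Quadrant layout assumed for this sketch (not specified in the text):
        A | B
        -----
        C | D
    Returns the X and Y difference signals, the sum signal and the
    absolute deflection signal S = sqrt(Sx^2 + Sy^2).
    """
    s_sum = q_a + q_b + q_c + q_d          # total intensity on the diode
    s_x = (q_b + q_d) - (q_a + q_c)        # right minus left -> lateral deflection in X
    s_y = (q_a + q_b) - (q_c + q_d)        # top minus bottom -> lateral deflection in Y
    s_abs = np.sqrt(s_x**2 + s_y**2)       # total (absolute) deflection signal
    return s_x, s_y, s_sum, s_abs
```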
The image may be constructed as an array of pixels. Each pixel or group of pixels may correspond to one relative position of the sample and the focus. Also or alternatively, each pixel or group of pixels may correspond to a plurality of relative positions of the sample and the focus. This facilitates scaling the image.
The method may further comprise recording the image data, e.g. detected intensity patterns, as a function of the relative positions of the sample and the focus and constructing an image of the at least part of the sample on the basis of the recorded image data. The image data may be stored in a transient or permanent memory and/or be transmitted through the internet to a remote controller or computer. Detection and image construction may therefore be done separately.
The resolution may depend on the relative sizes of the focus and the scatterer or, respectively, any structure(s) of the scatterer to be studied. This is particularly interesting for biological samples, which may comprise scattering particles varying in one or more of size, shape and internal structure. Preferably, the light beam focus is smaller than the scatterer; this may facilitate resolving details of the scatterer smaller than the scatterer itself (i.e. sub-scatterers). In each case, dependent also on the detector used, the image data may be used for constructing the image per direction independently and/or in combination, e.g. the image being based on signals Sx, Sy, Sz (signals in X-, Y- and Z-directions independently) and/or on an absolute value Sabs (Sabs = sqrt(Sx^2 + Sy^2 + Sz^2)), wherein "S" indicates a signal strength which may correspond to or be proportional to the amount of beam deflection. Note that a two-dimensional position dependent sensor may be formed by a combination of two one-dimensional position dependent sensors oriented (to detect) at an angle to each other, in particular perpendicular to each other. However, more complex sensors, including cameras, may also be used, wherein averaging over sensor data (e.g. averaging over parts of the camera image) may be used to define part of the image data.
The image may be constructed from a combination of image data associated with one or more individual directions, e.g. corresponding to amounts of scattering in one or more directions (e.g. Sx, Sy) and/or to an absolute value thereof (e.g. Sabs).
At least part of the image may be rendered in a brightness scale and/or an essentially single-colour scale, e.g. a grey scale wherein degrees of brightness may correspond to amounts of beam deflection in one or more directions and/or to an absolute value thereof (e.g. Sx, Sy, Sabs as discussed above). Instead of a grey scale any other sequential colormap may be used, where sequential means that the perceived lightness value increases or decreases monotonically through the colormap, e.g. a colour temperature scale (also known as a thermal red scale) ranging from purplish red via bright red, orange and yellow to white. In such a scale, images will exhibit the well-known shadow effect which gives standard DIC (Nomarski phase microscopy) images their three-dimensional appearance.

The position dependent sensor may be a two-dimensional position dependent sensor, in particular a sensor capable of detecting two perpendicular directions simultaneously, conveniently called the X- and Y-directions. Then, both X and Y beam deflection data may be simultaneously available as (part of) the image data. In such a case different linear combinations of the X and Y beam deflection data may be used for construction of the image, in particular for generating pixel values, and for rendering the image with a shadow effect oriented in any chosen direction. This may increase a (perceived) resolution or contrast of the image. This is an advantage over known techniques; e.g. in DIC microscopy one needs to physically rotate the Wollaston prism, which determines the shearing direction, in order to choose the orientation of the shadow effect, i.e. one needs to physically manipulate (the beam line of) the optical setup itself.
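A minimal sketch of how such a linear combination could be formed from X and Y deflection maps to render a shadow effect in a chosen direction; the cosine/sine weighting is one straightforward choice, as the disclosure only states that different linear combinations may be used:

```python
import numpy as np

def shadow_image(s_x, s_y, angle_deg):
    """Render a DIC-like shadow image from X and Y beam-deflection maps.

    s_x, s_y  : 2D arrays of deflection signals, one value per scan position.
    angle_deg : desired orientation of the shadow effect.
    """
    theta = np.deg2rad(angle_deg)
    return np.cos(theta) * np.asarray(s_x) + np.sin(theta) * np.asarray(s_y)
```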
The output of a quadrant photodiode ("QPD") and/or a position sensitive diode ("PSD") may not only be a difference signal in perpendicular directions, but also a sum signal proportional to the total intensity detected on the diode. The former may relate to the lateral change in propagation/deflection of the light. The latter may correspond to the total intensity of the transmitted beam or reflected beam (dependent on whether the method is performed in-line or reflectively, see below). In such a case, simultaneously with the deflection based contrast also an intensity based contrast image may be constructed. The method therefore also gives access to absorption/extinction parameters of the sample. In addition, the simultaneous measurement of both the deflection and the intensity of the transmitted beam allows correction for artefacts such as those caused by sudden laser emission intensity variations, e.g. by normalizing the deflection signal by the total intensity signal.
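A minimal sketch of the normalization mentioned above, assuming the deflection and sum signals are available as scalars or arrays; the small eps guard is an addition for numerical safety, not part of the described method:

```python
def normalized_deflection(s_x, s_y, s_sum, eps=1e-12):
    """Normalize the X and Y deflection signals by the simultaneously measured
    sum (total intensity) signal, suppressing artefacts such as sudden laser
    emission intensity variations."""
    return s_x / (s_sum + eps), s_y / (s_sum + eps)
```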
The method may be performed in an in-line arrangement. For that, the method may comprise arranging the light source on a first side of the sample and the optical system on a second side of the sample, in particular the first and second sides being opposite each other, such that at least part of the light beam traverses the sample from the first side to the second side before reaching the optical system and the detector. Such a method may further comprise focusing at least part of a light beam in the sample from the first side and collecting the unscattered light and the scattered light on the second side of the sample and controlling the detection system to capture image data representing at least part of the intensity pattern resulting from the collected light, as above. Thus, the method is based on forward-scattered light, or rather on interference of forward-scattered light with unscattered transmitted light.
Also or alternatively, the method may be performed in a reflection arrangement. For that, the method may comprise arranging the light source and the optical system on one side of the sample, and arranging a reflector for at least part of the light beam, such that at least part of the light beam traverses at least part of the sample from a first side and returns to the first side before reaching the optical system and the detector. Then, the method may further comprise focusing at least part of a light beam in the sample from the first side and collecting the unscattered light and the scattered light on the first side of the sample and controlling the detection system to capture image data representing at least part of the intensity pattern resulting from the collected light, as above. Thus, the method is based on backward-scattered light, or rather interference of backward-scattered light with unscattered reflected light. The reflected light may be reflected from at least one of a portion of the sample, a portion of a sample holder and/or a separate reflector.
Also or alternatively, the method may be performed in a circumventional arrangement, wherein the unscattered light is light not having traversed and/or otherwise interacted with the sample at all.
Spatial filtering of the scattered and/or the unscattered light between the optical system and the sensor may be employed, such that at least part of the scattered and/or of the unscattered light passes through a spatial filter prior to reaching the sensor.
For accurate beam deflection and absorption measurements it is advantageous to collect as much as possible of the scattered light cone, for which one may use a condenser with a numerical aperture larger than the refractive index of the sample medium. This particularly applies to forward-scattered light in an in-line arrangement. In particular for detection of symmetric changes to the back focal plane interference pattern it may be advantageous to restrict the acceptance angle of the detected light cone. Therefore an iris may be arranged in the back focal plane of the condenser, which may be adjustable. An iris may also be otherwise employed for spatial filtering. Also, or alternatively, a dual-detection system may be provided and used, wherein a portion of the light cone, preferably substantially the full light cone, is collected by the detection system, e.g. a condenser, after which one part of the collected light is detected by the position sensitive detector, e.g. for lateral deflection determination and/or for absorption measurement and/or for capturing image data representative thereof, to construct the image, while a second detector may be provided and used for measuring another portion of the beam selected by a spatial filter, for detecting changes to the collimation of the scattered beam and/or for capturing image data representative thereof.
The method may comprise trapping at least one object in the sample, in particular optically trapping, wherein the object comprises the scatterer or the scatterer interacts with at least one of the objects, e.g. being attached to an object. The interaction may comprise one or more of being attached to the object, moving with respect to the object, reacting with the object in a chemical and/or biological and/or physical sense, etc. Thus, at least one of the position and orientation of the scatterer may be controlled and/or adjusted in the sample. This may facilitate studying the scatterer. The object may be a microsphere and the scatterer may be a biological object, e.g. a cellular body, a filament, a macromolecule, etc. Optical trapping may obviate (the presence of) attachment structures for holding the scatterer, which might otherwise affect the scatterer, and/or it may support the object free from (i.e. not in contact with) a solid substrate. E.g., this might avoid unwanted contributions of the sample holder to the signal and it might decouple the sample from unwanted motions (vibrations or drift) of the sample holder. Thus, image resolution and stability may be improved.
The method may comprise trapping, in particular optically trapping, plural objects attached to each other by at least one connecting element, wherein at least one of the objects and/or the connector comprises the scatterer, and/or wherein the scatterer interacts with at least one of the objects and/or the connecting element(s), e.g. being attached to an object or to the connecting element(s). In particular, the objects may be microspheres and the connecting element(s) may comprise a filament, a microtubule, a DNA strand, etc.
In case of optical trapping of at least one object in the sample with one or more optical trapping beams, the light beam may differ from at least one of the optical trapping beams in at least one of intensity, wavelength and polarization. Thus, interaction between the trapping beam(s) and the (detection) light beam may be prevented and/or the different beams may be separately controlled by wavelength-specific and/or polarization-specific optics.
The method is flexible and may comprise modifying one or more of: the focus size of the light beam, the intensity of the light beam and/or the wavelength of the light beam, as well as - in the case of trapping - modifying one or more of: the focus size of a trapping light beam, the intensity of a trapping light beam and/or the wavelength of a trapping light beam. The modification may be done within one image and/or between different images and it may be controlled by a controller. The modification allows detection of different image details and/or image data capturing with different scanning settings. In a particular embodiment, the light beam may serve as a trapping beam. The described modification facilitates switching between both functions, e.g. by dithering power and/or wavelength.
The method may comprise that at least the light beam is polarized, in particular linearly polarized. Also, (the part of) the intensity pattern of the scattered and unscattered light represented in the image data may be detected through at least one polarization dependent optical element, such as a polarizer, a Polarizing Beam Splitter Cube ("PBSC"), a Wollaston prism, etc., comprised in the collecting optical system. The light may be split into different fractions according to multiple polarizations and each split fraction may be detected separately on a position dependent sensor, and image data representing one or several of the fractions may provide information on polarization dependent characteristics of the sample, e.g. polarization altering characteristics of (at least part of) the sample. One or more of the polarization of the light beam and the at least one polarization dependent optical element may be adjustable with respect to polarisation directions, which may be controller operable; the light beam may be sent through a polarisation changing element such as a quarter wave and/or a half wave plate.
The method may comprise providing at least part of the sample with an optically effective label, possibly comprising optically activating or de-activating the label. Although they are not required with the presently provided techniques, labelling and associated techniques may be exploited: e.g. it might be advantageous to scan, for example, biological samples such as cells, tissues or biomolecules which have been (partly) fluorescently labelled and to simultaneously detect fluorescence thereof. The fluorescence may be excited or de-excited (e.g. quenched, bleached, etc.) by the scanning beam. In particular in the latter case the fluorescence might label specific structures of interest in the sample (the principal stain) while the scattering contrast may act as the counterstain for providing a composite image with more context than the primary stain alone.
In accordance with the method a system is provided herewith.
The system comprises: a sample holder to hold a biological sample, a light source providing a light beam, and, operably arranged along an optical path of at least part of the light beam: a source optical system, which may comprise one or more optical elements, in particular a focusing lens and/or an objective, and which is arranged to focus at least part of the light beam in a sample held in the sample holder; and a detection system comprising a position dependent sensor, e.g. one or more of a split photodiode, a quadrant photodiode, a photodiode array, a camera, a position-sensitive photodiode. The detection optical system, e.g. comprising one or more optical elements, in particular a condenser lens, provides a back focal plane and is arranged to collect at least part of the light beam, comprising both light not scattered by the sample and light scattered by at least one scatterer in the sample, and to provide from them an intensity pattern in the back focal plane.
The detection system is arranged to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system. E.g., the position dependent sensor is arranged to detect at least part of the intensity pattern in the back focal plane and/or in an optical conjugate plane of the back focal plane.
At least part of at least one of the sample holder, the light source and the source optical system is adjustable to controllably displace the focus of the light beam and at least part of the sample relative to each other, e.g. being connected to a position controller which the system may comprise. The system further comprises a controller connected with the position dependent sensor and programmed to construct an image of at least part of the sample, in particular at least part of the scatterer, on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus. The relative positions may result from a 1-, 2- or 3-dimensional scan of at least part of the sample. The image may be 1-, 2- or 3-dimensional and it may be rendered in a sequential colour scale.
The source optical system and the detection system preferably are configured in a confocal arrangement.
The system may comprise a spatial filtering system, e.g. comprising a pinhole and/or an iris, which may be adjustable with respect to position and/or aperture, for spatial filtering of detection light between the collecting optics (condenser, objective, ...) of the detection system and the position dependent sensor. The spatial filtering system may be connected with a controller. The spatial filtering system may further comprise relay optics.
The system may comprise a trapping arrangement to trap and/or hold one or more objects in the sample. In particular, an optical trapping arrangement may be provided. A multiple trapping arrangement to trap and/or hold one or more objects in the sample in multiple traps may be preferred. An optical trapping arrangement may comprise one or more light sources, e.g. lasers, focusing optics and detection optics arranged to provide one or more optical trapping beams in the sample.
Further, in accordance with the principles disclosed herein, an optical detection module is provided to be placed in an optical train of a sample or beam scanning microscope comprising a sample holder to hold a biological sample, a light source providing a light beam, and, operably arranged along an optical path of at least part of the light beam, a source optical system arranged to focus at least part of the light beam in a sample held in the sample holder, and wherein at least part of at least one of the sample holder, the light source and the source optical system is adjustable to controllably displace the focus of the light beam and at least part of the sample relative to each other, e.g. being connected to a position controller. The detection module comprises: a detection optical system comprising a position dependent sensor, e.g. one or more of a split photodiode, a quadrant photodiode, a photodiode array, a camera, a position-sensitive photodiode; wherein the detection system provides a back focal plane and is arranged to collect at least part of the light beam, comprising both light not scattered by the sample and light scattered by at least one scatterer in the sample, and to provide from them an intensity pattern in the back focal plane; wherein the detection system is arranged to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of the detection system and/or in an optical conjugate plane of the back focal plane of the detection system; the detection system further comprising a controller connected with the position dependent sensor and programmed to construct an image of at least part of the sample, in particular at least part of the scatterer, on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus.
Another aspect is a method of imaging at least part of a sample, in particular a biological sample comprising a scatterer, the method comprising: controlling a source optical system to focus at least part of a light beam in the sample, and in particular at or near the scatterer therein, thus providing unscattered light and scattered light, which form an intensity pattern in a back focal plane of the source optical system; controlling at least one of the source optical system and a sample positioning system to position the focus and the sample at a plurality of different positions with respect to each other; for plural relative positions of the sample and the focus, controlling a detection system to capture image data, the image data representing at least part of the intensity pattern related to the outgoing angular distribution of the scattered and unscattered light in the sample plane, in particular the image data representing at least part of an intensity pattern in the back focal plane of a detection system and/or in an optical conjugate plane of the back focal plane; and constructing an image of at least part of the sample on the basis of the image data associated with the plural relative positions as a function of the relative positions of the sample and the focus.
Further, a computer-implemented method for imaging at least part of a sample, in particular a biological sample comprising a scatterer, is provided, the method comprising: controlling a source optical system to focus at least part of a light beam in the sample, and in particular at or near the scatterer therein, thus providing unscattered light and scattered light; controlling at least one of the source optical system and a sample holder to displace at least part of the sample and at least part of the focus with respect to each other for achieving plural relative positions of the sample and the focus; at each relative position of said plural relative positions, controlling a detection system to detect at least part of an intensity pattern, e.g. an interference pattern, caused by the unscattered light and the scattered light combining; and constructing an image of the at least part of the sample on the basis of the detected interference intensity patterns respectively associated with the plural relative positions.
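By way of illustration, the computer-implemented method may be sketched as a control loop along the following lines; the hardware interface objects and their method names (focus_beam, move_to, capture_pattern) are hypothetical placeholders, and the pixel-value conversion shown is just one possible choice:

```python
import numpy as np

def acquire_image(source_optics, sample_holder, detector, x_positions, y_positions):
    """Minimal control-loop sketch of the computer-implemented method above."""
    source_optics.focus_beam()                    # focus the light beam in the sample
    image = np.zeros((len(y_positions), len(x_positions)))
    for i, y in enumerate(y_positions):
        for j, x in enumerate(x_positions):
            sample_holder.move_to(x, y)           # set one relative position of sample and focus
            pattern = detector.capture_pattern()  # back-focal-plane intensity pattern (image data)
            image[i, j] = pixel_value(pattern)    # convert the image data into one pixel value
    return image

def pixel_value(pattern):
    """Example conversion of a captured intensity pattern into one pixel value:
    a left/right difference, standing in for the QPD X deflection signal."""
    p = np.asarray(pattern, dtype=float)
    half = p.shape[1] // 2
    return p[:, half:].sum() - p[:, :half].sum()
```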
One distinct aspect of this disclosure relates to a controller comprising a processor that is configured to execute one or more of the steps of the computer-implemented methods as described herein.
One distinct aspect of this disclosure relates to a computer program comprising instructions to cause a controller as described herein to carry out one or more of the steps of the computer-implemented methods as described herein.
One distinct aspect of this disclosure relates to a computer-readable medium comprising a computer program as described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
The above-described aspects will hereafter be explained in more detail, with further details and benefits, with reference to the drawings showing a number of embodiments by way of example.
Fig. 1A illustrates an optical system 2 according to an embodiment;
Fig. 1B illustrates a method for imaging at least part of a sample according to one embodiment;
Fig. 2 is an embodiment of a system for label-free imaging using back-focal-plane interferometry; Fig. 2A shows a typical deflection signal of the system of Fig. 2 as a light beam is scanned over a small object in the sample;
Fig. 3 is an embodiment of another in-line arrangement using sample scanning;
Fig. 4 is a detail of an embodiment of imaging with a dual-beam optical tweezer system;
Fig. 5 is an embodiment of a reflection arrangement, including optional confocal fluorescence detection;
Fig. 6 shows an exemplary embodiment for de-scanned detection in transmission geometry; Fig. 6A shows a typical deflection signal of the system of Fig. 6 as a light beam is scanned over a small object in the sample;
Figs. 7-10 are exemplary images formed in accordance with the principles presently disclosed;
Fig. 11 shows a typical intensity pattern as used in accordance with the principles presently disclosed.
DETAILED DESCRIPTION OF EMBODIMENTS
It is noted that the drawings are schematic, not necessarily to scale, and that details that are not required for understanding the present invention may have been omitted. The terms "upward", "downward", "below", "above", and the like relate to the embodiments as oriented in the drawings, unless otherwise specified. Further, elements that are at least substantially identical or that perform an at least substantially identical function are denoted by the same numeral, where helpful individualised with alphabetic suffixes.
Further, unless clearly otherwise indicated, terms like "detachable" and "removably connected" are intended to mean that respective parts may be disconnected essentially without destruction of either part, e.g. excluding structures in which the parts are integral (e.g. welded or molded as one piece), but including structures in which parts are attached by or as mated connectors, fasteners, releasable self-fastening features, etc.
Figure 1A illustrates an optical system 2 according to an embodiment. The system 2 comprises a sample holder 6 to hold a sample 70, e.g. a biological sample, comprising a scatterer (not shown). System 2 further comprises a source optical system 4 that is configured to focus at least part of a light beam 12 in the sample 70, and in particular at or near the scatterer therein, thus providing unscattered light 16 and scattered light 14. Optionally, the source optical system 4 comprises a light source for providing the light beam 12, such as a laser. As indicated by the arrows x, y, z, at least part of the sample 70 and the focus of the light beam 12 can be displaced with respect to each other. Herewith plural relative positions of the sample 70 and the focus can be achieved. In one example, the source optical system 4 is configured to move the focus with respect to the sample 70. Additionally or alternatively, the sample holder 6 may be configured to move the focus with respect to the sample. The unscattered light 16 and the scattered light 14 combine and cause an intensity pattern, e.g. an interference pattern. The optical system 2 further comprises a detection system 8 to detect at least part of this intensity pattern. The detection system 8 comprises a position dependent sensor, e.g. one or more of a split photodiode, a quadrant photodiode, a photodiode array, a camera, a position-sensitive photodiode. The detection system 8 may provide a back-focal plane and may be arranged to collect at least part of the scattered light 14 and unscattered light 16 in this back-focal plane and/or in an optical conjugate plane of the back-focal plane. System 2 further comprises a controller 120 that is configured to control the source optical system 4 and the detection system 8, and optionally the sample holder 6, to perform their respective functions as described herein.
Figure 1B illustrates a method for imaging at least part of a sample according to one embodiment. This method may be implemented to control at least one of the source optical system 4, the sample holder 6 and the detection system 8.
In step S2, the embodiment comprises controlling a source optical system 4 to focus at least part of a light beam 12 in the sample 70, and in particular at or near the scatterer therein, thus providing unscattered light 16 and scattered light 14. Controlling the source optical system 4 to focus at least part of light beam 12 in the sample may consist of controlling a light source, such as a laser, to generate a light beam 12, which light beam 12 passes through passive elements, such as lenses, prisms, mirrors, filters, et cetera.
In step S4, the embodiment comprises controlling at least one of the source optical system 4 and a sample holder 6 to cause displacement of at least part of the sample 70 and at least part of the focus with respect to each other for achieving a relative position of the sample 70 and the focus. In order to achieve this position, the source optical system may be controlled, which may comprise controlling an orientation of a mirror in the source optical system 4 for directing the light beam 12. Alternatively or additionally, the plural positions may be achieved by controlling the sample holder 6, which may comprise controlling an orientation and/or position of the sample holder 6.
Step S6 comprises controlling the detection system 8 to detect at least part of an intensity pattern, e.g. an interference pattern, caused by the unscattered light 16 and the scattered light 14 combining. In an example, the detection system 8 comprises a position-dependent light sensor, such as an imaging system, that comprises a plurality of pixels. Each pixel may be configured to output a light intensity value that is indicative of the light intensity incident on the pixel, and/or indicative of other image data such as RMS deflection noise. Thus, a plurality of pixels may output light intensity values that are indicative of a light intensity pattern. A pixel for example outputs a light intensity value in the form of a voltage signal. The pixels may continuously output respective light intensity values, which may vary with time. The position-dependent sensor may further comprise an image data capture module, which may be embodied as a software module in a computer. The image data capture module may continuously receive the light intensity values from the pixels of the position-dependent sensor. It should be appreciated that controlling the detection system 8 to detect at least part of the intensity pattern may consist of transmitting an instruction to the image data capture module to store the light intensity values that it is currently receiving from the respective pixels. Herewith, the image data capture module captures the light intensity values and may thus capture the intensity pattern as image data.
Steps S4 and S6 are repeated at least once, so that at least two intensity patterns are at least partially detected for two respective relative positions of the sample and the focus. However, steps S4 and S6 may be repeated numerous times.
Step S8 comprises constructing an image of the at least part of the sample on the basis of the detected image data, e.g. the detected intensity patterns, respectively associated with the plural relative positions. After steps S4 and S6 have been repeated a number of times for a plurality of relative positions, a plurality of intensity patterns have been detected by the detection system 8, wherein each intensity pattern is associated with a relative position of the sample and the focus. Step S8 may comprise, for each detected intensity pattern, determining an image pixel value, for example a greyscale value, for an image pixel in the image to be constructed. Step S8 may further comprise constructing the image based on the determined image pixel values and their associated relative positions.
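A minimal sketch of step S8, assuming the captured image data have already been reduced to one signal value per scan-grid position; the record format and the greyscale mapping are assumptions made for the example:

```python
import numpy as np

def construct_image(records, nx, ny):
    """Build a greyscale image from recorded image data.

    'records' is assumed to be a list of (ix, iy, value) tuples, where ix and iy
    index the relative position of sample and focus on an nx-by-ny scan grid and
    'value' is the signal derived from the intensity pattern detected at that
    position (e.g. a deflection signal).
    """
    image = np.zeros((ny, nx))
    for ix, iy, value in records:
        image[iy, ix] = value                     # one pixel per relative position
    lo, hi = image.min(), image.max()
    if hi == lo:                                  # flat image: avoid division by zero
        return np.zeros((ny, nx), dtype=np.uint8)
    return (255 * (image - lo) / (hi - lo)).astype(np.uint8)  # 8-bit greyscale values
```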
Figure 2 also shows an exemplary embodiment of a system arranged for performing at least one embodiment of the method disclosed herein, the system having a source optical system 4. Fig. 2 shows a light source 10 projecting a light beam 20, which might be a laser beam, onto a scanning device 30, here controlled by an optional controller in the form of a central processing unit (CPU) 120. The scanning device can for example be a tip/tilt mirror or an acousto-optical or electro-optical deflection system.
The beam 20 is relayed using telescope lenses 40, 50 to the back-focal-plane of a microscope objective 60.
The microscope objective 60 focusses the light onto a sample held in a sample holder 70. The sample, which might be a biological sample, can be scanned by the focused beam by means of the scanning device. A condenser lens 80 or similar optical system is used together with a relay optical system (e.g. a single lens 90) of a detection system 8 to project the light beam that has passed through the sample onto a position sensitive detector 100, which can for example be a quadrant photodiode (QPD), a position sensitive diode (PSD) or a camera of the detection system 8, positioned in a conjugate of the back-focal plane of the condenser. The signals from the position sensitive detector are optionally amplified and combined in an electronic circuit 110 and are sent to the CPU 120 or other processing unit. An optional beam splitter 130 can be used to send part of the light beam that has passed through the sample to a spatial filter 140, positioned in a conjugate of the back-focal plane of the condenser, that can be used to select only a part of the beam. A detector 150, e.g. a photodiode, can be used to detect e.g. changes in collimation of the scattered light, which correlate to axial displacements of the scatterer. Based on the detector signals and the current position of the scanner an image can be constructed by the CPU.
The light source 10 can be a laser and the light beam 20 can be a laser beam. However, other light sources and light beams may be provided. The sample can be a biological sample, comprising scatterers such as a cell or sub-cellular structure, a filament (e.g. actin, a microtubule), a protein on the surface of a substrate (e.g. a microscope cover glass) or a structure suspended in an optical trap, e.g. a dual optical tweezer setup. The sample can furthermore comprise any scatterer with topological features or variation in refractive index.
The image contrast of the image to be constructed is predominantly based on deflection of the light beam caused by interaction with (the scatterer in) the sample. Any object in the sample, in particular in a sample plane in which the light beam focus is located, having a refractive index (polarizability) that differs from the refractive index of the medium surrounding the object will cause a deflection of (part of) the beam, hence the name "scatterer". This deflection can be measured, e.g. by monitoring the difference signal (Vx and/or Vy) of a quadrant photodiode, wherein the measurement results provide image data. It has been shown that this deflection can be detected directly, e.g. in darkfield microscopy, but this has the disadvantage that the scattered intensity scales with r^6 (a radius of the scattering feature raised to the 6th power), which makes it hard to detect small objects such as e.g. single proteins or small filaments, or structures thereof, and changes in size and/or mass of the biological object under study due to proteins, small molecules or other entities binding to the initial biological structure of interest. This binding can be either dynamic or static in nature and can be detected as a variation in the deflection signal.
Taking advantage of interference between the deflected/scattered and the unscattered part of the light beam, it has now been found that one can directly detect the field amplitude instead of the power amplitude of the scattered light. This reduces the scaling factor to r^3, making it much easier to image or detect small structures and/or objects such as single proteins or protein complexes.
The deflection provides a varying interference resulting in a difference signal of the QPD; as the beam is scanned over a scatterer the light detected on the QPD will first deflect to one side and then to the other. This is illustrated in Fig. 2A for a single and smooth scattering structure, e.g. a microsphere or a protein smaller than the beam focus.
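The bipolar signal of Fig. 2A can be illustrated with a simple one-dimensional model; the derivative-of-Gaussian form below is an illustrative assumption for a Gaussian focus scanned over a point-like scatterer, not a formula given in this disclosure:

```python
import numpy as np

def deflection_signal(scan_positions, scatterer_pos, beam_waist):
    """Qualitative 1D model of a QPD difference signal as in Fig. 2A.

    The lateral deflection is modelled as following the derivative of the
    Gaussian intensity profile, flipping sign as the focus passes the scatterer.
    """
    x = np.asarray(scan_positions, dtype=float) - scatterer_pos
    return -x * np.exp(-x**2 / beam_waist**2)

# Example: the signal swings to one side and then the other as the beam crosses the object.
positions = np.linspace(-2.0, 2.0, 401)   # scan coordinate, micrometres
signal = deflection_signal(positions, scatterer_pos=0.0, beam_waist=0.4)
```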
Scanning an object whose optical properties, such as position, size and/or mass, change over time will result in a varying amount of deflected light and hence a varying signal.
Scanning an object with different scattering structures will result in a more complex signal shape. Scanning the beam in two dimensions allows the build-up of a 2D image of the scatterer.
Due to aberrations and imperfections in the optical path, a scanning beam approach as illustrated in this exemplary embodiment might lead to a non-zero and/or structured background image, on top of which it is hard to detect small signals of a scattering object in the sample. This may be resolved by careful subtraction of a background image. Such a background subtraction can for example be achieved by scanning the image multiple times where at least a portion of the sample, e.g. the scatterer, is moved by a known amount between the consecutive images, e.g. using a sample stage or by moving optical traps. Subtracting such consecutive images may lead to a background-free image, possibly with two displaced copies of the sample. If the sample is larger than the displacement, post-processing might be useful to recover a single-copy image. Another method for background subtraction is to take advantage of any dynamics that might be present in the sample: for samples which are changing over time it is possible to achieve high quality subtraction of a (static) background by subtracting an average over many images from one or more individual images.
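Both background-subtraction strategies described above admit a very short sketch; the array-based frame representation and the function names are assumptions made for the example:

```python
import numpy as np

def subtract_shifted_frames(frame_a, frame_b):
    """Background removal by differencing two scans between which the sample
    (or the optical traps) was displaced by a known amount: the static,
    aberration-induced background cancels, leaving two displaced copies of
    the object (one positive, one negative)."""
    return np.asarray(frame_a, dtype=float) - np.asarray(frame_b, dtype=float)

def subtract_temporal_average(frames):
    """For a sample that changes over time, subtract the average over many
    frames (the static background) from each individual frame."""
    frames = np.asarray(frames, dtype=float)
    return frames - frames.mean(axis=0)
```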
By repeatedly scanning 2-dimensional images of the sample while the sample and focus are displaced with respect to each other along the direction of beam propagation (the z-direction), so that the light beam is focused in different layers of the sample, it is possible to construct a 3-dimensional image.
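Assembling such repeated 2D scans into a volume is then straightforward; a minimal sketch, assuming all slices share the same lateral scan grid:

```python
import numpy as np

def build_volume(z_slices):
    """Stack repeated 2D scans, each acquired with the focus displaced to a
    different depth along the beam propagation (z) direction, into a 3D volume."""
    return np.stack([np.asarray(s, dtype=float) for s in z_slices], axis=0)  # (n_z, n_y, n_x)
```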
It is advantageous to use a condenser with a numerical aperture ("NA") higher than the index of refraction of the sample medium in order to allow optimal capturing of the scattered light.
Multi-beam scanning can be done to improve imaging speed and/or accuracy. For this, it is preferred to collect and detect the deflection of multiple beams simultaneously. At least some of the multiple beams may differ in one or more optical characteristics such as polarization state (e.g. different linear polarization directions), wavelength, wavelength modulation, intensity modulation, etc., which is detectable by the detection system by suitable (combinations of) techniques such as polarisation separation, wavelength selective filtering and/or absorption, demodulation techniques, etc.
According to an alternative exemplary embodiment it is possible to scan the sample stage instead of (a focus of) the light beam. This might have the advantage that in this case there is less background signal caused by aberrations in the optical system. Figure 3 shows a scheme of such an embodiment. However, scanning both at least part of the sample and at least part of the light beam is also possible. Fig. 3 shows a light source 10 of a source optical system 4 projecting a beam of light 20 into the back-focal-plane of a microscope objective 60. The objective 60 focusses the light into a sample 70 which is mounted onto a sample holder. The sample holder can be scanned in one or more directions (here: three mutually perpendicular directions X, Y and Z), preferably two directions that are parallel to a sample plane. The scanning can be controlled by a controller, e.g. via signals provided by a central processing unit (CPU) 120 shown here. From the sample, when the light beam is at least partly scattered, both the unscattered light beam 20 and the scattered light beam 25 are collected by the condenser 80. Via a relay lens 90 of the detection system 8 the back focal plane of the condenser 80 is imaged onto a position sensitive detector 100. Signals from the position sensitive detector 100 are amplified and combined, as indicated at reference number 110, and sent to the CPU 120. The CPU 120 uses information on the position of the sample holder 70 and the signals from the sensor 100 as image data to construct an image of at least part of the sample on the basis of the image data and the relative positions of the sample holder and the focus.
Such a system may be less susceptible to spurious background signals caused by aberrations and optical imperfections. Optionally one could also add a pinhole or other spatial filter 85 in the focal point of the optical relay system to the position sensitive detector. Spatial filtering may reject unwanted background light scattered from different portions of the sample, e.g. different focal planes. Further, reconstruction of three-dimensional datasets of scattering contrast may be facilitated and/or enhanced by scanning the sample stage in a direction along the direction of light beam propagation (here: the Z-direction) in addition to one or more lateral directions (here: X- and Y-directions).
Part of another embodiment is illustrated in Figure 4. An object, e.g. a DNA molecule 230 with bound proteins 240, is tethered between two beads 210 held in a sample medium (not indicated) in trapping beams 200 of a dual optical trap, known per se. A light beam 250 is scanned along the DNA molecule 230. The molecule 230 and the proteins 240 each scatter the light beam 250 to some extent, dependent on their optical properties relative to the surrounding sample medium. At least part of the scattered light and unscattered light are collected and detected as generally indicated before. Thus, one or more images of (part of) the DNA molecule 230 and/or the proteins 240 may be constructed based on the image data representative of the back-focal-plane interferometry signal of the light beam. In such a case effective background subtraction can be done, e.g. by (slightly) moving the optical traps between consecutive images and subtracting the images.
According to yet another embodiment, indicated in Figure 5, instead of detection of scattered light and unscattered light in the forward direction (i.e. in transmission through the sample), back-scattered light may be used. In this case the light beam 20 generated by the light source 10 travels through a beam splitter 300 and travels to the sample 70 via an optional beam scanner 30, an optional relay system, which here is indicated as two lenses 40, 50, and an objective 60 which focuses the light beam 20 into the sample 70. The sample 70 is mounted on a sample holder, here comprising a coverslip 340 that is at least partly transparent to the wavelength of the light beam 20. The sample 70 and/or the sample holder 340 holding the sample 70 may at least partly be movable, preferably controllably movable as discussed above, e.g. controlled by a controller. At or near the sample 70 at least part of the light beam 20 is reflected. E.g., a substrate-sample transition in the coverslip 340 may reflect part of the light from the light beam prior to a remaining part of the light beam having interacted with the actual sample. The reflected light can be used as the reference field for interferometric detection. Both the light back-scattered from the sample and the reflected reference light are collected with the objective 60, then serving as detection optical system and providing the back focal plane. Note that, in this embodiment, the illumination light, the sample and the detection light are, automatically, in a confocal arrangement and the source optical system 4 and the detection system 8 share a significant number of optical elements (300, 320, 30, 40, 50, 60). The detection light travels back through the objective 60 and the relay system 50, 40, and via the scanning mirror 30 to the beam splitter 300, where at least a part of the light is reflected and sent to a position sensitive detector 100 via a further optional relay 90 and an optional pinhole 85 or iris for spatial filtering. For light efficiency one might choose to use linearly polarized light from the light source, e.g. using a polarizing beam splitter 300 which transmits p-polarized light and a quarter wave plate 320 as shown. If the quarter wave plate 320 is rotated such that the illumination light travelling to the sample 70 has a circular polarization, the back-reflected detection light, after passing for a second time through the quarter wave plate 320, will have a linear polarization rotated 90 degrees with respect to the incoming light and therefore has s-polarization. This will be efficiently reflected by the polarizing beam splitter 300, ensuring optimal light efficiency directed towards the position dependent sensor 100.
If one would like to simultaneously observe any fluorescence light of the sample 70, e.g. being excited in the sample 70 by the illumination/scanning beam, this can easily be achieved by adding a dichroic beam splitter 310 which e.g. transmits the scanning excitation beam but reflects the fluorescence emission. This is shown in Fig. 5, but such an optical fluorescence detection system may be added to any embodiment. In Fig. 5, the emission travels to a sensitive point detector 330 or any other suitable sensor or camera, via another optional relay 90 and optional spatial filter 85. Any detection signals from the point detector 330 may be combined with data from the position sensitive detector 100 as part of the image data for constructing the image.
In another embodiment, not shown, polarization sensitive detection can be implemented. For this, a polarizing beam splitter may be located before the sensor 100, to split the detection beam(s) according to polarization. In this way the two orthogonally polarized components of the detection light (scattered and unscattered light) each give rise to their own detection signals which may be treated separately or in any suitable combination as image data for constructing the image. If the scattering by the sample is polarization dependent this leads to slight differences in the detection signals from the individual beams which can be analysed for example in terms of birefringence. The signals may be detected with a quadrant position sensitive detector or with two position sensitive detectors each associated with one of the polarization directions, such detectors then possibly being unidirectionally sensitive. In case of two detectors, for optimization of the signal on both detectors the polarization of the illumination/scanning beam may be tuned, e.g. to 45 degrees, with the aid of a half wave plate. Similarly, the polarization of the illumination/scanning beam could be modulated in conjunction with polarization insensitive detection in order to characterize polarization-dependent scattering (e.g. implementing time-multiplexed polarization-dependent detection).
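One simple, non-limiting way to combine such polarization-split signals is sketched below in Python/NumPy (the channel names are hypothetical): a normalized difference of the two orthogonally polarized detection images, which vanishes for polarization-independent scattering and becomes non-zero for, e.g., a birefringent structure.

```python
import numpy as np

def polarization_contrast(s_parallel: np.ndarray, s_perpendicular: np.ndarray) -> np.ndarray:
    """Normalized difference of two orthogonally polarized detection images.

    For a polarization-independent scatterer the two channels are (ideally)
    identical and the contrast is zero; birefringent or otherwise anisotropic
    structures give a non-zero value. Pixels with zero total signal map to 0.
    """
    total = s_parallel + s_perpendicular
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(total != 0, (s_parallel - s_perpendicular) / total, 0.0)
```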
In another embodiment, shown in Fig. 6, the beam scanning implementation and the forward scattered detection can be further extended using a de-scanning tip-tilt mirror 160 after the light has travelled through the sample and has been collected by the detection system and passed through a pair of optical relay lenses 90. The de-scanning tip-tilt mirror ensures that the scanning beam is transformed into a stationary beam. It is now possible to use a spatial filtering assembly consisting of a pair of lenses 170 and a pinhole or spatial filter 140 before the beam is detected by the position sensitive detector 110 (see Fig. 6A for an indicative signal). This spatial filtering can be employed to reject background light and improve e.g. the z-sectioning of the scattering signal.
Other optical techniques like wavelength variation and/or wavelength dependent detection and/or detection of angle-dependent scattering may be exploited as well.
Figs. 7-10 show exemplary images, which were obtained with a stage-scanning implementation in a microscopy system otherwise generally in accordance with Fig. 2. The illumination light beam 20 was set to a very low power level and parked at the centre of the field of view in a confocal setup, and the microscope sample stage was raster scanned while acquiring image data representative of deflection data. From the image data the images of Figs. 7-10 were constructed.
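A minimal acquisition loop of this kind might look as follows (an illustrative Python sketch; `stage.move_to` and `detector.read_xy` are hypothetical placeholders for whatever stage and position-sensitive-detector drivers are actually used): the stage is raster scanned over a grid while the X and Y deflection signals are stored per pixel.

```python
import numpy as np

def acquire_deflection_maps(stage, detector, x_positions, y_positions):
    """Raster-scan the sample stage through a parked, low-power beam and record
    the X and Y deflection signals of the position sensitive detector per pixel.

    `stage` and `detector` are hypothetical driver objects: stage.move_to(x, y)
    positions the sample and detector.read_xy() returns the two deflection signals.
    """
    sx = np.zeros((len(y_positions), len(x_positions)))
    sy = np.zeros_like(sx)
    for iy, y in enumerate(y_positions):
        for ix, x in enumerate(x_positions):
            stage.move_to(x, y)                          # move the sample to the next pixel
            sx[iy, ix], sy[iy, ix] = detector.read_xy()  # record both deflection signals
    return sx, sy
```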
Figures 7 and 8 are images of a human cheek epithelial cell, constructed from a one-directional deflection signal detected by scanning the cell through a static light beam using stage scanning. Fig. 7 is constructed from deflection in one direction (X), whereas Fig. 8 is constructed from deflection in a perpendicular direction (Y). The scale of both images is 80 x 60 micrometres.
Figure 9 is an absolute signal image of the cell of Figs. 7-8. The image is obtained by combining the X- and Y-direction data of Figs. 7 and 8 according to Sabs = sqrt(Sx^2 + Sy^2) and using a different brightness colour scale relative to Figs. 7-8.
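The combination used for Fig. 9 is simply the per-pixel magnitude of the two deflection images; an illustrative one-function NumPy sketch (not part of the disclosure) is:

```python
import numpy as np

def absolute_signal(sx: np.ndarray, sy: np.ndarray) -> np.ndarray:
    """Per-pixel magnitude of the X and Y deflection images: Sabs = sqrt(Sx^2 + Sy^2)."""
    return np.hypot(sx, sy)
```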
Figure 10 is an image of a single chromosome.
Fig. 11 shows a typical intensity pattern of a back focal plane. The pattern is offset from the centre, as indicated by the cross hairs dividing the picture. Intensity differences in X- and Y-direction (Xdiff, Ydiff) and total intensity (Sum) may be calculated as shown.
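For a position dependent sensor that delivers a full back-focal-plane image rather than four quadrant photocurrents, the same three quantities can be emulated in software. The sketch below (Python/NumPy, an illustrative example assuming the pattern is already centred on the optical axis of the detection system) sums the left/right and top/bottom halves of the image.

```python
import numpy as np

def bfp_signals(intensity: np.ndarray):
    """Quadrant-style signals from a back-focal-plane intensity image.

    Returns (x_diff, y_diff, total): the right-minus-left and bottom-minus-top
    intensity differences and the total intensity. Dividing the differences by
    `total` is a common normalization but is left to the caller.
    """
    ny, nx = intensity.shape
    x_diff = intensity[:, nx // 2:].sum() - intensity[:, :nx // 2].sum()
    y_diff = intensity[ny // 2:, :].sum() - intensity[:ny // 2, :].sum()
    return x_diff, y_diff, intensity.sum()
```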
The disclosure is not restricted to the above-described embodiments, which can be varied in a number of ways within the scope of the claims. For instance, the interference pattern beam line may be combined with another imaging beam line with which a focal plane or an optical conjugate thereof may be imaged. Such beam lines may partly overlap, e.g. sharing the condenser and being separated by a partial beam splitter to two different optical detectors, e.g. a quadrant photodiode for the interference pattern beam line and a camera for the imaging beam line, and/or having different wavelengths and being separable using a dichromatic mirror and/or a filter.
Elements and aspects discussed for or in relation with a particular embodiment may be suitably combined with elements and aspects of other embodiments, unless explicitly stated otherwise.
Claims (16)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL2019891A NL2019891B1 (en) | 2017-11-10 | 2017-11-10 | Label-free microscopy |
PCT/NL2018/050753 WO2019093895A1 (en) | 2017-11-10 | 2018-11-12 | Label-free microscopy |
DE112018005412.8T DE112018005412T5 (en) | 2017-11-10 | 2018-11-12 | Label-free microscopy |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL2019891A NL2019891B1 (en) | 2017-11-10 | 2017-11-10 | Label-free microscopy |
Publications (1)
Publication Number | Publication Date |
---|---|
NL2019891B1 true NL2019891B1 (en) | 2019-05-17 |
Family
ID=61003320
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
NL2019891A NL2019891B1 (en) | 2017-11-10 | 2017-11-10 | Label-free microscopy |
Country Status (3)
Country | Link |
---|---|
DE (1) | DE112018005412T5 (en) |
NL (1) | NL2019891B1 (en) |
WO (1) | WO2019093895A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
NL2029859B1 (en) | 2021-11-22 | 2023-06-13 | Lumicks Dsm Holding B V | Method to produce DNA molecules with repeating units for use in single-molecule assays. |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008092107A1 (en) * | 2007-01-26 | 2008-07-31 | New York University | Holographic microscope system and method for optical trapping and inspection of materials |
Non-Patent Citations (6)
Title |
---|
ANDERS WALLIN: "OPTICAL TWEEZERS FOR SINGLE MOLECULE BIOLOGY", 24 May 2011 (2011-05-24), Helsinki, XP055495389, ISBN: 978-952-1068-80-5, Retrieved from the Internet <URL:https://helda.helsinki.fi/bitstream/handle/10138/26300/opticalt.pdf> [retrieved on 20180726] *
BASUDEV ROY ET AL: "Probing the dynamics of an optically trapped particle by phase sensitive back focal plane interferometry", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 11 January 2012 (2012-01-11), XP080557562, DOI: 10.1364/OE.20.008317 * |
FEKE G D ET AL: "INTERFEROMETRIC BACK FOCAL PLANE MICROELLIPSOMETRY", APPLIED OPTICS, OPTICAL SOCIETY OF AMERICA, WASHINGTON, DC; US, vol. 37, no. 10, 1 April 1998 (1998-04-01), pages 1796 - 1802, XP000754331, ISSN: 0003-6935, DOI: 10.1364/AO.37.001796 * |
HONGJUN LIU ET AL: "Back-Focal-Plane Interferometry for 3D Position Tracking in Optical Tweezers", 2012 SYMPOSIUM ON PHOTONICS AND OPTOELECTRONICS, 1 May 2012 (2012-05-01), pages 1 - 4, XP055495353, ISSN: 2156-8464, ISBN: 978-1-4577-0910-4, DOI: 10.1109/SOPO.2012.6270909 * |
PRALLE A ET AL: "Three-Dimensional High-Resolution Particle Tracking for Optical Tweezers by Forward Scattered Light", MICROSCOPY RESEARCH AND TECHNIQUE, WILEY-LISS, CHICHESTER, GB, vol. 44, no. 5, 1 January 1999 (1999-01-01), pages 378 - 386, XP009054304, ISSN: 1059-910X, DOI: 10.1002/(SICI)1097-0029(19990301)44:5<378::AID-JEMT10>3.0.CO;2-Z *
YEHOSHUA SAMUEL ET AL: "Axial Optical Traps: A New Direction for Optical Tweezers", BIOPHYSICAL JOURNAL, ELSEVIER, AMSTERDAM, NL, vol. 108, no. 12, 16 June 2015 (2015-06-16), pages 2759 - 2766, XP029216043, ISSN: 0006-3495, DOI: 10.1016/J.BPJ.2015.05.014 * |
Also Published As
Publication number | Publication date |
---|---|
DE112018005412T5 (en) | 2020-07-02 |
WO2019093895A1 (en) | 2019-05-16 |
Legal Events
Code | Title | Description |
---|---|---|
PD | Change of ownership | Owner name: LUMICKS DSM HOLDING B.V.; NL. Free format text: DETAILS ASSIGNMENT: CHANGE OF OWNER(S), DEMERGER; FORMER OWNER NAME: LUMICKS TECHNOLOGIES B.V. Effective date: 20210621 |