Help with Optical Interferometry?

Hey all,

I’ve been reading about OCT scanning (e.g. SS-OCT, SD-OCT) and 3D tissue imaging (mainly eyes and vasculature) with light and I see that one of the main ingredients is an interferometer with a laser. Does anyone here know how to build an interferometer with lasers/optics? Are there any homebrew projects for doing so or anyone here capable of instructing me? I don’t have a budget for fancy lab gear but even building something that at least demonstrated the principles correctly would get me on my way. I’ve been studying infrared imaging for a while and this area of it takes the cake as the coolest application for me.

I’d love to start experimenting with interferometry and tomography but there are a few key physical ingredients, like how to set up the reference vs sample arm with some lenses and a beam splitter. I have no idea how I’m supposed to decide what lenses/beam splitter setup to use/build (like on a homemade laser bench) or how to know if an interferogram is being generated correctly. The papers all explain what’s going on but none of it’s for a homebrew-quality build; there’s usually tens of thousands of dollars worth of equipment involved. It can’t be THAT special from the looks of it, other than the lasers they are using, but I’m not trying to do medical-grade imaging on my table. I would use a basic camera sensor for the receiver, which can be used for either a full image or oriented behind a diffraction grating to do spectroscopy.

Can’t claim to know much about it but this project comes to mind:

Building an OCT requires some very high spec interferometry. It is largely based on a Michelson interferometer, so I would first get a Michelson working well before trying to continue.

Basic interferometer

A basic Michelson is just a light source (laser, discharge lamp, etc), 2 mirrors, a beam splitter, and a screen. If you make sure your source is collimated (beam doesn’t change size as it travels) and fairly broad area (a few mm spot size) then you can just use a piece of paper as a screen to view interference.

What makes the above hard is that you need both laser spots (returning from each mirror) very well aligned over each other, so your mirrors need to be very finely adjustable in angle. You also need your mirrors not to distort the “wavefront”: if the mirror is bumpy rather than flat, different parts of the beam travel different distances, which destroys the signal. As we are talking about light interference, you really need flatness to a fraction of an optical wavelength. For this you normally buy special “front silvered” mirrors, where the reflective surface is exposed on top of the glass; these are easier to scratch and can only be cleaned in a special way, but they work well.

Also, if you use a thick-ish beam splitter plate, you will need an extra compensating plate to correct for the extra time the light spends travelling through the beam splitter. Once this is all correct and nicely aligned you should be able to get an interference pattern. If you do not see a pattern when the spots overlap, it is normally because the length one beam travelled is too different from the other’s (this is key for OCT). Lasers with a very narrow spectrum (small range of colours in the beam) produce interference with more allowance for path difference, i.e. they have a longer coherence length.
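The bright/dark fringe condition and the wash-out beyond the coherence length can be sketched numerically. This is a toy model only: the 633nm HeNe wavelength and the Gaussian coherence envelope with an assumed 0.2m coherence length are illustrative numbers, not values from any particular build.

```python
import numpy as np

# Two-beam interference at the detector:
#   I = I1 + I2 + 2*sqrt(I1*I2) * g(dL) * cos(2*pi*dL/lam)
# where dL is the path-length difference between the arms and g(dL) is the
# coherence envelope (modelled here as Gaussian, an assumption).

lam = 633e-9   # HeNe wavelength in metres (a common hobbyist source)
Lc = 0.2       # assumed coherence length in metres for a narrow-line laser
I1 = I2 = 1.0  # equal beam intensities, arbitrary units

def fringe_intensity(dL):
    """Detector intensity for a path-length difference dL between the arms."""
    envelope = np.exp(-(dL / Lc) ** 2)
    return I1 + I2 + 2 * np.sqrt(I1 * I2) * envelope * np.cos(2 * np.pi * dL / lam)

print(fringe_intensity(0.0))      # balanced arms: bright fringe -> 4.0
print(fringe_intensity(lam / 2))  # half-wave extra path: dark fringe -> ~0.0
print(fringe_intensity(1.0))      # far beyond Lc: fringes wash out -> ~2.0
```

The last line is the key point for OCT: once the arms differ by much more than the coherence length, the cosine term is killed by the envelope and you just see the sum of the two beams.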

Next, before you can start thinking about OCT, you will need a way to translate components very accurately. In a Michelson you normally translate one mirror; as it moves you should see the interference pattern move. To get it to move well you need the motion to be small (generally it is good if you can reproducibly move distances of less than 100nm). As it moves you also need to be as free as possible from wobble and tilt of the mirror: wobble of around 100nm will make your signal super noisy, and with much more you can’t get useful data. We had some MPhys students build a 3D printed base for a Michelson (unfortunately not well documented as of yet) which you can find here. They got about 20nm steps with a stepper motor, but there was some slight tilt of the mirror that had to be live-corrected by tilting the fixed mirror.
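To get a feel for what those 20nm steps mean in fringe terms: moving the mirror by d changes the round-trip path by 2d, so the pattern advances one full fringe for every half wavelength of mirror travel. A quick back-of-envelope sketch (assuming a 633nm HeNe as the source, which the stage above was not necessarily used with):

```python
# One full fringe per lam/2 of mirror travel, since the beam traverses
# the moving arm twice (out and back).

lam = 633e-9   # assumed HeNe wavelength (m)
step = 20e-9   # step size reported for the 3D-printed stage (m)

fringes_per_step = 2 * step / lam
steps_per_fringe = 1 / fringes_per_step

print(f"{fringes_per_step:.3f} fringes per step")  # -> 0.063
print(f"{steps_per_fringe:.1f} steps per fringe")  # -> 15.8
```

So a 20nm stepper gives roughly 16 samples per fringe, which is plenty to track the bright/dark cycle on a photodiode.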

If you get all this working really nicely, with good fringes moving reproducibly across the screen, you can start using a photodiode to just detect bright vs dark at one place on the pattern, probably done by reducing the width of the input beam. You need to be pretty confident in the reproducibility of your setup here because it is far harder to debug now.

Towards OCT

If this all starts working really well then you can start thinking about OCT. We said before that the narrower the spectrum, the longer the distance over which you get interference (the coherence length). For OCT you broaden the optical spectrum instead, so you use something like a supercontinuum (a laser beam that has had its spectrum widened to contain lots of frequencies). This will now only give a signal between beams that have travelled very nearly the same distance (this is how you set the depth you are imaging). To image you need to focus the light from one beam with a microscope objective and scan the focus through your sample in all 3 directions. The diagrams I have seen seem to tilt one mirror and move another, but I don’t see why you cannot just move the sample in 3D (there may well be a reason I am missing). It is worth bearing in mind that the only light coming back is light that gets focused through the objective, gets scattered, and then returns through the whole system. To detect this you will probably need some really clever detection scheme and an incredibly stable laser. This would require a lot of thought even for a simple proof of principle. To do the scanning you could use this (better documented) stage we designed for aligning optical fibres:
project: https://gitlab.com/openflexure/openflexure-block-stage
paper: https://www.osapublishing.org/DirectPDFAccess/E2618CF2-D64F-3145-E0A7E6DF5C2FF6AD_426579/oe-28-4-4763.pdf?da=1&id=426579&seq=0&mobile=no
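Following on from the coherence-length point above: for a source with a Gaussian spectrum, the OCT axial (depth) resolution is roughly (2 ln 2 / π) · λ₀²/Δλ. A small sketch comparing a narrow laser line with a broadband source; the 840nm centre / 50nm bandwidth numbers are typical textbook values for a superluminescent diode, not figures from this thread:

```python
import math

def axial_resolution(lam0, dlam):
    """OCT axial resolution in air for a Gaussian spectrum of centre
    wavelength lam0 and FWHM bandwidth dlam (both in metres)."""
    return (2 * math.log(2) / math.pi) * lam0 ** 2 / dlam

# Narrow HeNe-like line (assumed 1pm bandwidth): coherence length ~0.18m,
# so no useful depth gating at all.
print(f"{axial_resolution(633e-9, 1e-12) * 1e3:.0f} mm")  # -> 177 mm

# Broadband SLD-like source: micron-scale depth resolution.
print(f"{axial_resolution(840e-9, 50e-9) * 1e6:.1f} um")  # -> 6.2 um
```

This is why the broadband source is the whole trick: the interference gate shrinks from centimetres to a few microns, and that gate is what selects the depth slice.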

Further thoughts

Sounds like a fun project, but I would not underestimate how accurately these things need to be aligned and moved (especially when you get to broader sources and detecting only scattered light). You can definitely build a simple Michelson for a reasonable price (I hope we find time to document and write up ours at some point); that is a good starting point to iterate towards something more fancy.


@kaspar Great! This might actually be useful! I gotta learn about self-mixing interferometers. I have no idea what I’m doing either.

@julianstirling Awesome, yes Michelson interferometers are what got me to look deeper into this whole area. Radio and geophysics have some great uses for them too, but I’m most interested in imaging humans. Definitely not trivial as I’m reading all this, but I’m up for it as I can accrue bits and pieces for one of these devices. Everything you said is familiar to me; I would not have thought of mirror flatness being the main factor for selection. What I was kind of thinking was I could probably do some sort of homebrew interferometer microscope by sticking a microscope objective on the imaging end and a camera sensor on the other end if I size the other lenses right.

Man, this is cool. Wonder if we’ll ever get safe Visible/IR full body scanner star trek holograms from this stuff, haha. I saw one great paper that was imaging a whole rat brain https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5506292/ (or maybe it was this one https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6119107/), and another about using gaussian windows and a focusing lens to see into deeper tissues https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4674840/, and others about 3D imaging, and I gotta say it has my imagination running wild. Could you enlighten me as to the limitations of this field with optics? Could you get bigger “fuzzy” images that cover broader regions?

With BCI specifically, you don’t need nanoscopic resolution to see general activation patterns. There is another method called retinotopic mapping that uses an fNIRS technique to do that, as well as EROS, which gets high-quality live images with infrared, but interferometry seems like the next step from the looks of it.