
Eng-Tips is the largest engineering community on the Internet


Looking backwards through a holographic diffuser

Status
Not open for further replies.

BillCC

Electrical
Sep 6, 2010
3
Our application is illuminating light sensors (think CMOS image sensors for cell phones) while we test their electrical characteristics. Long ago, we used some plastic holographic diffuser sheeting (unknown specs, but guessed to be 15-30 degree output). However, the input wasn't collimated; it was a simple disk full of various colored LEDs a couple of inches away. Think of it like a multicolored MagLite.

Another designer changed that to opal glass to improve the uniformity, but I think all opal glass does is widen the diffusion (not so important) while increasing the absorption HUGELY (suddenly an important thing to minimize).

I'm recommending going back to the much less-absorptive holographic diffuser, perhaps 45-60 degree angle units, but he has an objection. He says when you look backwards through a single layer of the holographic material, you can make out the images of the point source LEDs. Since you can make out individual LEDs in the translucency of the diffuser material, it can't be a very good diffuser. He says the proof of diffusion is in the uniform milkiness of the opal glass. I claim it can be a great diffuser if the intensity of the image you see is uniform within the diffusion angle, even if you can make out the LEDs.

Can anyone help me with an explanation of how holographic diffusers can be effective, if you can also discern items on the other side of them?

For example, see this Edmunds picture of a holographic diffuser, and how it's not totally milky like opal glass:


Thanks from a first-timer!
 

What exactly are you doing? What specification are you testing?

If your pixels are not getting light from all the LEDs, then of what value are your measurements?

TTFN

FAQ731-376
 
It would seem to me more important to understand what you are trying to measure. What does the sensor "see": the diffused light falling on its surface after it passes through the optic, or is it imaging the LEDs several inches away? Can you capture the light as it is collected by the sensor (and perhaps even upload a jpeg)?

Harold
SW2010 SP3.0 OPW2010 SP1.0 Win XP Pro 2002 SP3
Dell 690, Xeon 5160 @3.00GHz, 3.25GB RAM
nVidia Quadro FX4600
 
Thanks for the suggestions. I keep seeing the term "homogenize", but in my limited research it doesn't seem to be quantifiable; it seems like it was just made up to go along with "milky glass". But it certainly describes something opal glass does more of than frosted glass, with the holographic diffuser doing the least of it.

But I'm thinking along the lines of "a black hole has no hair". An eye can't reconstruct what's on the far side of a piece of opal glass, and it seems intuitive that it acts as a good homogenizer and diffuser.

Holographic diffusers that allow the eye to reconstruct the positions of point-source LEDs on the far side can be argued (by my colleague) to be less effective diffusers. I, on the other hand, believe their specs are true and that they are effective diffusers. They certainly absorb less, and right now I need intensity improvements to get more uW/cm^2 at the device under test (DUT) for the same LED current.

I think the critical difference is that I'm looking at the holographic diffuser with my eye. It focuses and can discern an image. The device we're testing is essentially a photodiode with some UV and IR filtering, and has a relatively wide acceptance angle and can't see an image. Sorry to mention camera sensors and confuse the issue.

Photons hit it from any angle, and it doesn't matter much whether the source is diffused, as long as it's relatively uniform and can be calibrated. Since we don't collimate the source (it's composed of many LEDs on a 2" disk), I'm not entirely sure what the intensity distribution is after the diffuser. I'm pretty sure the device under test doesn't care. Many-pixeled arrays do care, especially if they are fabbed with microlenses that vary the acceptance angle across the array. This isn't that type of device.

If I had been the original designer, I probably would have made the illuminator out of a 3" integrating sphere, with LEDs poking through many holes in the interior. Just an amateur optical person, but I can design a mean LED driving system. :)

So the sensor just sees uW/cm^2 with no ability to focus an image. It can probably be considered one pixel, although there are a few photodiodes in close proximity.

My question's purpose was to understand why a holographic diffuser appears to the eye to be less "milky" than frosted or opal glass, yet can still meet all the specs for being a good diffuser. I believe the device we're testing doesn't care, but I would like to convince my colleague, and ultimately the customer, that looking into the illuminator's output port may not always look to the eye like a diffuse (uniform in all axes) source.
 
Because it is. The whole point of a holographic diffuser is to trade the level of diffusion for higher transmission. The fact that the holographic diffuser allows you to discern the different LEDs tells you that it DOES NOT homogenize over the full angle of the physical locations of the LEDs. The ONLY WAY that holographic diffusers could work correctly in your (seriously lacking in description) application is to have the LEDs closer together physically. If you cannot tell that you have separate sources, then the LEDs are close enough together, and the diffuser is "close enough."

If what you are trying to mix is not mixed, then your system has failed, regardless of how great the transmission is. Presumably, your opticker understands all the requirements, which you've not elucidated upon, and I would agree with him: if the intent is to make it appear as if there is a single source, but you can clearly discern multiple sources, then your approach is not working and is not correct.
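A rough way to see this trade-off numerically is to model each LED as a point in angle space and convolve it with the diffuser's scatter profile, approximated here as a Gaussian. The 20° LED separation and the two diffuser angles below are illustrative assumptions, not the thread's actual geometry:

```python
import numpy as np

def diffused_profile(led_angles_deg, diffuser_fwhm_deg, span=90.0, n=1801):
    # Angular intensity after the diffuser: each point-like LED contributes
    # a Gaussian lobe whose FWHM equals the diffuser's rated diffusion angle.
    theta = np.linspace(-span, span, n)
    sigma = diffuser_fwhm_deg / 2.355          # FWHM -> Gaussian sigma
    profile = np.zeros_like(theta)
    for a in led_angles_deg:
        profile += np.exp(-0.5 * ((theta - a) / sigma) ** 2)
    return theta, profile / profile.max()

# Two LEDs 20 degrees apart, seen through a narrow vs. a wide diffuser
theta, narrow = diffused_profile([-10, 10], 5)    # distinct peaks survive
theta, wide = diffused_profile([-10, 10], 60)     # peaks merge into one lobe
```

With the 5° kernel the valley between the two peaks stays deep, so an eye can still "see" two LEDs; with the 60° kernel the profile collapses into a single lobe, which is what the opal glass achieves at the cost of absorption.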

TTFN

FAQ731-376
 
Our published spec for uniformity is 2% over a 15mm diameter circle, placed about 10mm from the device under test. We're pretty sure the device doesn't notice any non-uniformity since it just accepts photons from about a 60 degree cone. And measuring uniformity is relatively difficult without an instrument with a small aperture placed close to the diffuser. Our calibration unit is a Coherent LaserCheck with 5% accuracy, and an 8mm circular silicon detector with no optics. Absolute accuracy is not important for the application.

But the application, and ultimately the specifications, aren't important to my political/social problem. This other engineer just has an intuitive feeling that holographic diffusers don't diffuse very well, because of this "seeing the LEDs through them" issue. I could take measurements for days with expensive test equipment we would need to purchase, and he could still object.

I'm confident we can meet the requirements of testing the device by using one or maybe two layers of holographic diffusers (separated by a small distance), while absorbing a lot less light than the opal glass. I just don't want to have a debate about the diffuser without some idea why they have this optical appearance.

Is "homogenization" measurable? Or is it captured by the idea that opal glass has a Lambertian transmission distribution, while holographic diffusers, even 70 or 80 degree ones, have a faster fall-off in their scattering-angle profile? Hmmmm.

If I wasn't space constrained, I'd love to have a longer tube with the LEDs closer together.
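The Lambertian-versus-faster-fall-off distinction can be put in numbers. This is only a sketch; the 60° FWHM and the Gaussian shape for the holographic diffuser are assumptions:

```python
import numpy as np

def lambertian(theta_deg):
    # Opal-glass-like behavior: radiance falls off only as cos(theta)
    return np.cos(np.radians(theta_deg))

def holographic(theta_deg, fwhm_deg=60.0):
    # Holographic diffuser approximated as a Gaussian scatter profile
    sigma = fwhm_deg / 2.355
    return np.exp(-0.5 * (theta_deg / sigma) ** 2)

lam_30 = lambertian(30.0)    # ~0.87: still bright 30 degrees off-axis
holo_30 = holographic(30.0)  # 0.50 exactly: 30 deg is the half-width
```

The Lambertian surface stays bright over almost the whole hemisphere, which is what the eye reads as "milky"; the holographic profile rolls off quickly outside its rated cone, even though both can be perfectly good diffusers within that cone.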
 
If the specification doesn't play into the discussion then I sure don't envy your position. Intuitive feeling is not engineering (my opinion).

The way I would approach the problem would be to capture an image of the light at the surface of the device under test. Perhaps by replacing the device with a mylar screen and taking a picture of the screen from behind it. We use a software package called MaxIm DL which is actually an astronomy package for analysing images for uniformity. The software is really cheap. Compare both methods for generating the lighting and resolution of the device to see if it matters.
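One simple figure of merit over a captured image is (max − min)/(max + min). This definition is an assumption on my part; the thread never says how the 2% spec is computed:

```python
import numpy as np

def uniformity_pct(image, mask=None):
    # Percent non-uniformity over a region of interest:
    # (max - min) / (max + min) * 100
    vals = image[mask] if mask is not None else image.ravel()
    hi, lo = float(vals.max()), float(vals.min())
    return (hi - lo) / (hi + lo) * 100.0

# Synthetic 15 mm field sampled on a grid, with a 3% center-to-edge roll-off
y, x = np.mgrid[-7.5:7.5:101j, -7.5:7.5:101j]
r = np.hypot(x, y)
field = 1.0 - 0.03 * (r / 7.5) ** 2
disk = r <= 7.5
print(round(uniformity_pct(field, disk), 2))  # 1.52
```

Running the same metric on images captured behind the opal glass and behind the holographic diffuser would turn the debate into a number.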

Then argue your point with data vs. the specification. Avoid the politics if you can (again, my opinion).

Harold
SW2010 SP3.0 OPW2010 SP1.0 Win XP Pro 2002 SP3
Dell 690, Xeon 5160 @3.00GHz, 3.25GB RAM
nVidia Quadro FX4600
 
Still not enough information.

> What is the diffusion angle of the holographic diffuser?

> What are the spacings between sources, between sources and diffuser, between diffuser and sensor?

> What is the sensor supposed to be seeing?

> Based on 15 mm diameter and 10 mm spacing, the FOV is 73.7°. If your holographic diffuser is only 30°, then you ought not be able to claim your stated level of uniformity unless your sources are all well within a 30° cone from the diffuser.

> You should be able to get something like a cellphone camera into the sensor's location in the system to see what it sees.
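The 73.7° figure in the list above is just geometry: the full angle subtended by a 15 mm diameter circle at a 10 mm distance.

```python
import math

# Full field of view: 2 * atan(radius / distance)
fov_deg = 2 * math.degrees(math.atan(7.5 / 10.0))
print(round(fov_deg, 1))  # 73.7
```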

TTFN

FAQ731-376
 
