1000 Tech Drive
Welcome to 1000 Tech Drive, your go-to podcast for all things optics and surveillance technology! Each episode, we’ll take you on a journey through industry trends and dive into the innovative products from CBC AMERICA’s Computar and Ganz brands. Our goal? To arm you with valuable insights and practical advice that you can apply directly to your industry applications.
What to Expect:
- Product Advice: Discover expert tips and recommendations on selecting and optimizing products for your specific needs.
- Technical Data Insights: Simplify complex specifications and performance metrics to help you make informed decisions.
- Case Studies: Learn from real-world applications that showcase how businesses across various sectors effectively leverage Computar and Ganz products to enhance efficiency, security, and automation.
Tune in to 1000 Tech Drive and stay ahead in the rapidly evolving world of optics and surveillance technology!
Breakthroughs in Unifying Visible and SWIR Imaging
For decades, capturing both visible and shortwave infrared required two separate cameras. This episode explores how breakthrough sensor technology finally unified both spectrums in a single chip—and why it's transforming industries.
- The Problem — Traditional silicon sensors can't detect infrared, forcing expensive dual-camera systems
- The Solution — Copper-to-copper bonding enables Sony's IMX992 & IMX993 sensors to capture full visible-to-SWIR spectrum (400-1700nm) on one chip
- Real-World Impact — Revolutionizing semiconductor inspection, plastic recycling, food quality control, and medical imaging
- Cost Revolution — Simplified design dramatically reduces system complexity and price
- Future Applications — Opens doors to medical diagnostics and long-range imaging previously impossible with traditional tech
Speaker 1 Welcome to one thousand Tech Drive, your go-to podcast for all things optics and surveillance technology. Hello! Today we are strapping in for a deep dive that looks past the visible spectrum. You know, the light we see with our own eyes. And we're going to explore the incredibly powerful world of shortwave infrared technology, or SWIR. Our focus is really on the optical systems and, uh, the sensor breakthroughs that are making this kind of multi-spectrum imaging much more accessible.
Speaker 2 And our mission today is really to give you a knowledge shortcut, a way into advanced industrial defense and medical imaging. Okay. We're going to talk about what happens when you decide to image in that one thousand to two thousand five hundred nanometer spectrum, and specifically how a technical problem that has plagued engineers for decades has finally been solved.
Speaker 1 And that solution enables single sensor coverage of both Visible and SWIR Light.
Speaker 2 Exactly. It's a huge deal.
Speaker 1 That sounds like a true engineering feat. Before we get into all the physics, let's quickly set the stage for where these insights are coming from. The core material for today's deep dive is based on the pioneering work of Computar Optics.
Speaker 2 Right. They've been specializing in optics for over forty years. It's all rooted in that amazing Japanese engineering excellence.
Speaker 1 And that expertise. It's rooted in a much larger history. Right?
Speaker 2 Oh, absolutely. Computar is a registered trademark of the CBC Group, which is this, uh, global conglomerate founded way, way back in nineteen twenty five.
Speaker 1 Wow.
Speaker 2 Yeah. Over forty worldwide locations. And they offer solutions across these incredibly diverse fields, everything from pharmaceuticals to, well, the precision optics we're talking about today.
Speaker 1 So this is a company whose entire history is built around this idea of achieving, quote, perfect vision.
Speaker 2 Perfect vision.
Speaker 1 Whether that's in machine vision, traffic systems, robotics, or, as we'll get into, life sciences, they're really driving the standard.
Speaker 2 They really are.
Speaker 1 Okay. Let's start with the basics, then. When we talk about imaging beyond what we can see, we use a lot of specific terms. So for someone just, you know, catching up, what are the crucial distinctions here?
Speaker 2 It is so easy to get lost in the terminology. Let's define the three types we see most often. First up you have hyperspectral imaging.
Speaker 1 Hyperspectral.
Speaker 2 Yeah. And you should think of this as like the most forensic level of detail you can get. It acquires images across over one hundred contiguous spectral bands.
Speaker 1 One hundred bands. I mean, that sounds like information overload. What's the practical use case for that?
Speaker 2 It's used primarily for material identification and characterization. Because you have so many continuous bands, you can generate a complete spectral signature.
Speaker 1 Like a fingerprint.
Speaker 2 Almost exactly a fingerprint for pretty much any material on Earth. And then you just compare that fingerprint to a library.
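The "compare that fingerprint to a library" step Speaker 2 describes is commonly done with a spectral angle comparison, treating each spectrum as a vector and picking the library entry with the smallest angle to the measurement. A minimal sketch in Python, with made-up four-band signatures purely for illustration (real hyperspectral data has over one hundred bands, and these reflectance values are not real material data):

```python
import math

def spectral_angle(a, b):
    """Angle between two spectra viewed as vectors; smaller means more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def identify(spectrum, library):
    """Return the library material whose signature is closest in angle."""
    return min(library, key=lambda name: spectral_angle(spectrum, library[name]))

# Illustrative 4-band signatures (hypothetical values, not measured data).
library = {
    "water":   [0.02, 0.01, 0.01, 0.005],
    "plastic": [0.60, 0.55, 0.20, 0.45],
    "foliage": [0.05, 0.50, 0.45, 0.40],
}

measured = [0.58, 0.52, 0.22, 0.44]  # a noisy observation of some material
print(identify(measured, library))   # matches the "plastic" signature
```

Because the angle ignores overall vector length, this comparison is insensitive to uniform brightness changes, which is one reason signature matching works across varying illumination.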
Speaker 1 Got it. Okay. Moving down a level, we have multispectral imaging.
Speaker 2 Multispectral is a bit more targeted. It takes just a few narrow spectral bands and then it compares them. So it's really useful for discrimination maybe for comparing different types of land or, uh, monitoring crop health.
Speaker 1 So you don't need the full fingerprint, you just need a few key comparisons. And finally, our main topic: shortwave infrared, or SWIR.
Speaker 2 SWIR captures images generally in the one thousand to two thousand five hundred nanometer range. And for context, visible light is roughly four hundred to seven hundred nanometers. So SWIR sits right next to near-infrared, or NIR, and it's much, much shorter than longwave infrared, that's the thermal imaging range.
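The band boundaries quoted in this exchange can be captured in a tiny lookup. A minimal sketch; note that the exact cut points vary by convention, so these ranges are just the episode's working definitions:

```python
def classify_wavelength(nm: float) -> str:
    """Rough spectral band for a wavelength in nanometers.
    Boundaries follow the ranges quoted in the episode; conventions vary."""
    if 400 <= nm < 700:
        return "visible"
    if 700 <= nm < 1000:
        return "near-infrared (NIR)"
    if 1000 <= nm <= 2500:
        return "shortwave infrared (SWIR)"
    if 8000 <= nm <= 14000:
        return "longwave infrared (LWIR, thermal)"
    return "outside the bands discussed here"

print(classify_wavelength(550))   # visible
print(classify_wavelength(1550))  # shortwave infrared (SWIR)
```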
Speaker 1 Okay. So a very specific slice of the spectrum that just behaves differently. So when an industrial user wants to build a SWIR system, what do they need besides the camera?
Speaker 2 You need specialized components across the board: the cameras, filters, software, and, critically, the specialized lenses, which we'll get into. But the single most important component, and the one that always surprises people, is the lighting.
Speaker 1 Why is lighting so essential? I would have thought if you're seeing past the visible, you might not even need illumination.
Speaker 2 You'd think so, right? But if you just use ambient light in the SWIR band, the image often looks, well, almost identical to a regular black and white photo. You lose all the unique contrast that SWIR gives you. To get those unique characteristics, like the way water or certain plastics absorb SWIR light, you have to illuminate the subject precisely.
Speaker 1 And what kind of light are we talking about? Like a special light bulb?
Speaker 2 Pretty much. You need either narrowband LED lighting so it's tuned to a specific wavelength, or you use wideband quartz halogen lighting to just flood the subject with those non-visible wavelengths.
Speaker 1 So proper lighting is what makes it valuable industrial data instead of just a novelty image.
Speaker 2 That's the key.
Speaker 1 Okay, let's unpack the biggest technical hurdle this technology has faced historically. If you wanted both visible light and SWIR data, you needed two separate systems. Why was that dual camera setup necessary? What made it so complex?
Speaker 2 It all comes down to the fundamental physics of the sensors themselves, specifically something called quantum efficiency or QE. Essentially, you run into the silicon wall.
Speaker 1 The silicon wall. Tell us about that.
Speaker 2 So standard CMOS sensors, the ones in your phone, rely on silicon, and they are fantastic for visible light, that four hundred to seven hundred nanometer range. But silicon has this technical limit: it becomes transparent to wavelengths longer than about eleven hundred nanometers.
Speaker 1 Transparent?
Speaker 2 Yeah. The photons just pass straight through the sensor, so its sensitivity drops off a cliff past nine hundred nanometers. You get no image.
Speaker 1 So it wasn't just inconvenient to use one chip for everything. It was physically impossible. If you wanted that SWIR range, you needed a totally different material.
Speaker 2 Precisely. For that critical nine hundred to seventeen hundred nanometer range, you had to use sensors made of indium gallium arsenide, or InGaAs. It's sensitive in that range, but it is vastly more expensive and complex than silicon. So you'd end up with one cheap visible camera and one really expensive SWIR camera.
Speaker 1 And you'd have to align them perfectly.
Speaker 2 Which is a huge integration headache.
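The "silicon wall" follows directly from semiconductor physics: a photon with less energy than the material's bandgap passes through undetected, so the cutoff wavelength is hc divided by the bandgap energy. A quick sanity check with textbook bandgap values (about 1.12 eV for silicon, and roughly 0.75 eV for the lattice-matched InGaAs composition used in SWIR detectors) reproduces the cutoffs mentioned in this episode:

```python
# Photon energy E = hc/λ; detection requires E >= bandgap E_g, so the
# longest detectable wavelength is λ_max = hc / E_g ≈ 1239.84 eV·nm / E_g(eV).
HC_EV_NM = 1239.84

def cutoff_nm(bandgap_ev: float) -> float:
    """Cutoff wavelength (nm) for a photodetector with the given bandgap."""
    return HC_EV_NM / bandgap_ev

print(f"silicon (~1.12 eV): ~{cutoff_nm(1.12):.0f} nm")  # ≈ 1107 nm
print(f"InGaAs  (~0.75 eV): ~{cutoff_nm(0.75):.0f} nm")  # ≈ 1653 nm
```

The silicon result lands right at the ~1100 nm transparency limit discussed above, and the InGaAs result explains why these sensors reach roughly 1700 nm.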
Speaker 1 That context is vital because now we can really appreciate the breakthrough. This brings us to the aha moment of this deep dive: the new sensors from Sony, the IMX992 and the IMX993. What specific engineering problem did they solve?
Speaker 2 They leveraged Sony's new technology, and the magic is really in the manufacturing process. They figured out how to combine the specialized InGaAs photodiodes with the standard silicon readout circuits.
Speaker 1 All on one stacked chip.
Speaker 2 All on one chip.
Speaker 1 How did they manage the stacking? That sounds like the part where you'd hit a roadblock.
Speaker 2 It's this breakthrough process. It's called copper to copper bonding.
Speaker 1 Okay.
Speaker 2 This replaces the traditional bump bonding method, which uses these tiny little metal solder spheres. Copper to copper bonding allows for this incredibly fine, robust, clean connection between the two layers.
Speaker 1 And why does that bonding method matter for the image itself?
Speaker 2 Because it allows the engineers to make the top layer, that's the indium phosphide layer, significantly thinner than was ever possible before.
Speaker 1 And a thinner layer is key because.
Speaker 2 It means more photons across a much wider spectrum can actually reach that underlying layer.
Speaker 1 That is genuinely revolutionary. So you eliminate the silicon wall, you unify the materials. What is the net result for someone using this?
Speaker 2 The implication is massive. It's why it's a true game changer. These new sensors enable broad imaging from zero point four microns, that's four hundred nanometers, deep in the visible spectrum, all the way up to one point seven microns, or seventeen hundred nanometers, covering that essential SWIR range.
Speaker 1 So now a single compact camera can cover visible, near-infrared, and SWIR.
Speaker 2 All with high quantum efficiency. It just drastically reduces the cost and complexity of these advanced imaging systems.
Speaker 1 The sensor technology has just leaped ahead, but a sensor that can capture that many wavelengths must put an enormous burden on the optics. You can't just slap a normal lens on a sensor that sees from purple all the way to infrared.
Speaker 2 That's absolutely correct. If the sensor is the eye, the lens is the eyelid and the glasses. And this required Computar to develop a specialized complementary lens series, which they call ViSWIR.
Speaker 1 ViSWIR. Okay. What are the engineering challenges of designing a lens to work perfectly across that whole four hundred to seventeen hundred nanometer range?
Speaker 2 The main challenges are twofold. First, materials: you have to use specific glass types and specialized anti-reflective coatings to transmit the SWIR wavelengths.
Speaker 1 While not losing anything else.
Speaker 2 Exactly. And second, you have to aggressively reduce what's called chromatic aberration.
Speaker 1 Chromatic aberration. That's the effect where different colors of light focus at different points. Right?
Speaker 2 It is. And when you're dealing with the entire visible and SWIR spectrum, it's incredibly difficult to manage. If you perfectly focus, say, green light, the SWIR light might be focused millimeters behind it.
Speaker 1 Which would make the image useless.
Speaker 2 Totally useless. And that's why the top-tier ViSWIR Hyper-APO series is so special.
Speaker 1 Let's focus on that Hyper-APO. What's the design element that lets it manage that huge spectrum?
Speaker 2 These are ultra-high-performance lenses, five megapixel, 2/3 inch format, covering four hundred to seventeen hundred nanometers, and their key feature is the APO floating design. APO stands for apochromatic; it means the lens corrects for three colors, not just two. And the floating design dynamically compensates for those tiny focus differences between the visible and SWIR bands.
Speaker 1 So the practical benefit is huge efficiency. If you're using one of those new Sony sensors, you can image in visible light, switch to SWIR, and the image stays perfectly in focus.
Speaker 2 Yes. No manual adjustments needed. It eliminates downtime. It ensures image integrity, which in an automated inspection line is, well, it's priceless.
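The "focused millimeters behind it" claim above can be sanity-checked with a simple singlet model: glass dispersion via a Cauchy fit, n(λ) = A + B/λ², and the thin-lens scaling f ∝ 1/(n − 1). The coefficients below are rough values for a BK7-like crown glass, used purely for illustration, not a real ViSWIR design:

```python
# Thin singlet: focal length scales as 1/(n(λ) - 1).
# Cauchy dispersion model n(λ) = A + B/λ², with λ in micrometers.
A, B = 1.5046, 0.00420  # rough BK7-like coefficients (illustrative only)

def n(lam_um: float) -> float:
    """Refractive index at wavelength λ (micrometers), Cauchy model."""
    return A + B / lam_um**2

def focal_mm(lam_um: float, f_design_mm: float = 100.0,
             lam_design_um: float = 0.55) -> float:
    """Focal length at λ for a singlet designed to be f_design at λ_design."""
    return f_design_mm * (n(lam_design_um) - 1) / (n(lam_um) - 1)

# Focus shift between green (550 nm) and SWIR (1550 nm) for a 100 mm singlet.
shift = focal_mm(1.55) - focal_mm(0.55)
print(f"focus shift: {shift:.1f} mm")  # a couple of millimeters
```

Even this crude model puts the uncorrected visible-to-SWIR focus shift at the millimeter scale, which is exactly the error the apochromatic floating design has to remove.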
Speaker 1 And that feature alone probably justifies the premium price.
Speaker 2 It does, but not every application needs that extreme performance.
Speaker 1 Right. So what about more accessible options?
Speaker 2 That's where the ViSWIR Lite series comes in. It's a lot more cost effective and is designed for one point five megapixel resolution.
Speaker 1 It's still pretty good.
Speaker 2 It's excellent. It still delivers that wideband transmission from four hundred to seventeen hundred nanometers using a hyper-wideband anti-reflective coating. So it's high quality, high transmittance, but for systems where cost is a major factor.
Speaker 1 We also saw an incredible example of a super telephoto solution. I mean, sometimes you just need extreme reach, especially in surveillance.
Speaker 2 That would be the ViSWIR reflex zoom. This is a very specific piece of engineering. It's compatible with the new one point three megapixel sensors, but the focal length is just staggering.
Speaker 1 It really is.
Speaker 2 Five hundred and twenty millimetres all the way up to thirteen hundred millimetres. It's designed specifically for long-distance surveillance.
Speaker 1 I remember seeing the demonstration shots. It's one thing to hear the number one thousand three hundred millimetres, but seeing it is something else. They were shooting twenty kilometers to a wind farm.
Speaker 2 Or eleven kilometers to an airplane.
Speaker 1 And the visible image was just a hazy mess. But the ViSWIR image was crystal clear.
Speaker 2 It's amazing. That ability to penetrate haze, fog, and just atmospheric moisture over these massive distances is the whole point of using the SWIR band.
Speaker 1 Because visible light scatters.
Speaker 2 It scatters right off water droplets and pollution, but SWIR light just passes right through, which makes this lens immediately valuable for things like coastal monitoring.
Speaker 1 Okay, so let's tie these technologies, the Sony sensor and these lenses, back to the real world. What industries are actually driving this adoption? What does SWIR see that we just can't?
Speaker 2 The number one application, I mean, the one dominating the industry right now, is semiconductor inspection.
Speaker 1 Why is SWIR so indispensable for making chips?
Speaker 2 Well, it goes right back to that silicon wall we talked about earlier. Because silicon is transparent to wavelengths longer than eleven hundred nanometers. This transparency lets you image through the silicon wafer itself.
Speaker 1 So if a normal camera can only see the top and bottom surfaces, what problem are these specialized SWIR cameras solving?
Speaker 2 They can detect critical defects that are trapped between bonded wafers, things like microparticles or alignment errors deep inside the integrated circuit. This ability to see internal defects is crucial for failure analysis in a multi-billion dollar industry.
Speaker 1 Beyond the chip factory, where else is this seeing-through capability a huge asset?
Speaker 2 In plastic sorting and quality control. You know, most common plastics look identical in visible light, but they absorb SWIR light differently based on their chemical composition. This lets automated recycling machines distinguish plastic types, like PE versus PVC, with reported efficiencies up to ninety eight percent.
Speaker 1 That's a massive efficiency boost for the recycling stream. What about something like food inspection?
Speaker 2 SWIR plays a key role in food inspection and grading. It can be used for foreign object detection, for sure, but more uniquely, it's used for grading meat quality based on fat and water content.
Speaker 1 Interesting.
Speaker 2 And verifying fill levels in opaque packaging without ever having to open the product.
Speaker 1 And finally, we touched on defense and surveillance.
Speaker 2 Exactly. For security, for coastal monitoring. SWIR is just superior for long range detection through all that environmental clutter like fog and smoke, and the band is also used for secure communications because the wavelengths are less susceptible to distortion.
Speaker 1 This deep dive has moved us really quickly from the basic physics of light through this complex engineering of copper to copper bonding, and into these incredible real world applications. The core value for you, the learner, is really gaining these technical insights quickly. You know, the problem of silicon transparency, the brilliance of the SWIR solution and how the APO lens design is applied in the real world.
Speaker 2 And if we connect this to the bigger picture, we really have to talk about medical research. SWIR and NIR are profoundly important for diagnostic imaging.
Speaker 1 In animals and potentially in humans.
Speaker 2 And eventually human use. Yes. Why? Because these specific wavelengths can penetrate biological tissue much deeper than visible light. They minimize tissue absorption and scattering.
Speaker 1 But previously, the lack of good sensitive detectors past eleven hundred nanometers was a huge roadblock for that.
Speaker 2 That's right. The new single-sensor systems covering four hundred to seventeen hundred nanometers overcome that hurdle cleanly and, importantly, cost effectively. So this isn't just an incremental improvement for checking semiconductors. No. It's a technology that could unlock previously inaccessible scientific research by giving us a much clearer window into physiological processes.
Speaker 1 So the provocative thought we want to leave you with is this, given that these new sensor and lens combinations now provide a clear, full spectrum, deep penetrating view into biological tissue, what long sought biological breakthroughs will become possible next week that just weren't possible last week?
Speaker 2 That's an exciting question.
Speaker 1 That's all the time we have for this deep dive. We hope this has clarified the complex but rapidly evolving landscape of multi-spectrum optics, and we invite you to join Computar's upcoming webinar, From Data to Insights How Hyperspectral and Multispectral Imaging Transforms Industries. On January fifteen, twenty twenty six at 2:00 pm EDT.