1000 Tech Drive

Machine Vision: The LensConnect Series

Computar Optics Season 1 Episode 3


Welcome to 1000 Tech Drive, your go-to podcast for all things optics and surveillance technology! In this episode, we explore Computar's LensConnect Series, a pioneering approach to machine-vision lenses that transforms how cameras operate in industrial automation.

Imagine remotely adjusting focus, iris, and zoom with precision—no more climbing ladders or awkward manual tweaks. LensConnect integrates motorized lenses with USB control, enabling seamless adjustments that enhance efficiency and adaptability in applications such as robotics, logistics, and traffic monitoring.

Join us as we explore real-world applications, gain insights from industry experts, and discover the remarkable impact of LensConnect on modern automation. Tune in to discover how this technology is transforming the landscape of machine vision.

Computar LensConnect Series, July 17, 2025


Speaker: All right, let's jump right into it. When you think about modern industrial automation systems, it's so often about seeing. Right. Right. Analyzing, reacting. Absolutely. Vision is fundamental. And the eyes of those systems, that's the cameras. And maybe even more critically, the lenses on those cameras. For years, you know, getting that perfect focus, dialing in the zoom, it meant actually getting to the camera, physically adjusting it. Yeah, maybe it's perched way up high or tucked inside some machine. You're there manually fiddling with rings. It takes time. It can be awkward, and it really limits how dynamic those systems can be. Exactly. But what if you could control those eyes, those lenses, from far away with, like, perfect precision? Well, that's where things get interesting. And that's what we're diving into today. We're looking at the LensConnect series from Computar. It seems like a really fascinating development in machine vision lenses. It is. It tackles that exact problem. So we've gathered quite a bit of material for this deep dive. We've got technical specs, product pages, a YouTube video explaining some of the benefits, and, what I think is really useful, some real-world case studies, companies showing how they're actually using this stuff. Right. So our mission here, for you listening, is to unpack all of this. We want to get past just the feature list and really grasp the core idea behind LensConnect. How does it work, you know, under the hood? And crucially, what kinds of actual problems is it solving out there, especially in machine vision, which is a pretty demanding field? Yeah, it feels like the core appeal is taking lens adjustment from that static, manual thing into something dynamic, remote controlled. A pretty big shift. It really is. So let's start right there. The basics. When we talk about the LensConnect series, what is it, fundamentally? What makes it different from just a standard camera lens? Okay.
So fundamentally, LensConnect is a series of machine vision lenses. But the key difference is they have built-in motors and a standard USB interface. Motors inside the lens? Exactly. So instead of you turning rings for focus or iris or zoom, those functions are motorized. Meaning you set it up, install it, and then you don't have to physically touch the lens again for adjustments. That's the idea. You connect it with just a single USB cable to a PC. It could be Windows, could be Linux, and you remotely control focus, iris, and zoom too, if it's one of the varifocal ones, right? For the varifocal models, you control the zoom position remotely as well, all through that USB connection. Okay, but how precise can that remote control really be? Machine vision often needs really fine adjustments. What's the tech making that happen? Yeah. Good question. The sources all point to the use of stepping motors. Stepping motors. And that's a deliberate choice. Stepping motors move in discrete steps. They respond very precisely to digital commands. So they allow for incredibly delicate and precise positional control. Ah. So not just smooth movement, but very specific positions. Exactly. And why that's critical for machine vision is repeatability. The system needs to tell the lens, go to this exact focus position, and know that it will, every single time. Right. And it needs to get back to that same position reliably. Precisely, quickly, reliably, durably. That's essential for automated tasks that happen over and over. Repeatability. Yeah. Yeah, that makes total sense for automation. The sources also talk about it being plug and play. What does that mean in practice? Well, they've tried to make the integration side easier for you. The motor driver board, the electronics that control the motors, is built right into the lens itself. So you don't need a separate controller box. Yep. And Computar provides the software.
You get sample programs and an API, an application programming interface, so developers can integrate control into their own software. Correct. The goal is you don't have to be, you know, an electrical engineer or an optics guru to get this lens connected and controlled by your system. Okay. That lowers the barrier to entry. What about the image quality? When you add motors and control systems inside a lens, does that compromise the optics? That's always a concern, but according to the sources, they've designed it specifically to maintain high optical performance. The focusing and zooming happen internally. How so? They mention a floating focus design. This is a technique where different groups of lens elements inside move relative to each other when you adjust focus. Yeah, it's mechanically more complex than just moving the lens barrel back and forth. Right. But the benefit is it helps maintain really high resolution, ultra-high resolution, they call it. And it limits optical flaws, aberrations, across different focusing distances. So the lens can actually make full use of those high-resolution sensors we see now, like 12, even 20 megapixels. That's the claim. The optical design is meant to support the performance of those modern high-resolution sensors. It's not just a motorized lens, it's optically designed for that motorization. Okay, interesting. And power? Is that another cable you have to run, another power supply? No. And that's another neat part of the design. It's powered directly by the USB connection. USB bus power. Seriously, just the one cable for control and power? Yep. No need for an external power supply brick or batteries. Just the USB cable. Simplifies things quite a bit. That does simplify things, and I assume there's a range of these lenses, different focal lengths, that sort of thing. Oh yeah. Absolutely. The sources detail quite a range.
You've got monofocal, so fixed focal length, lenses from 8 millimeters up to 75 millimeters. Yeah. Supporting different sensor sizes up to 1.1-inch format, and those high resolutions, 12 MP, 20 MP. And then you have the varifocal, or zoom, models. They list examples like a 4-10 millimeter, a 9-50 millimeter, a 12-36 millimeter, even a 16-96 millimeter. So plenty of options depending on the field of view and working distance you need for your application. Got it. Okay, so that paints a pretty clear picture of what LensConnect is. Motorized, USB controlled, precise, designed for performance, a range of options. But let's get into the why. Why does this actually matter out in the real world, in factories or labs? What problems does it really solve for you? Right. This is where you see the practical value proposition. Just imagine a camera that's mounted way up high on a ceiling. Or maybe it's inside some complex piece of machinery, or in a clean room. Maybe somewhere access is restricted. Exactly. Getting to that camera to manually tweak the lens focus or iris, it's a major headache. Yeah, time consuming, maybe even unsafe, depending on the location, for sure. So remote control immediately solves that. You can make delicate adjustments. You can fine tune the settings, maybe change the focus or iris in real time, all without physically touching the camera. That sounds incredibly useful, just for initial setup, but also for ongoing optimization. Definitely. Or adapting to changing conditions. And think about systems with multiple cameras. You could potentially manage and adjust all of them efficiently from one central point. It really boils down to saving time, saving effort, and gaining a lot more flexibility. Okay, that makes sense. And the sources mention specific application areas where this flexibility is paying off. Yes. Several key areas pop up again and again. Robot vision is a big one. Cameras on robot arms.
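To make the control model just described a little more concrete: a single USB cable carries both power and commands, and the lens exposes focus, iris, and zoom as motor positions you set from software. The sketch below imitates that idea in Python with a purely simulated lens. The class name, method names, and step ranges are illustrative assumptions for this episode's summary, not Computar's actual SDK.

```python
# Hedged sketch: a software-simulated motorized lens. Because stepping
# motors move in discrete steps, every position is an integer step count
# that can be commanded and revisited exactly. All names and ranges here
# are invented for illustration; a real driver would send these moves
# over the USB connection instead of mutating local state.

class LensConnectLens:
    """Simulated remote-controllable lens with motorized focus and iris."""

    def __init__(self, focus_range=(0, 5000), iris_range=(0, 100)):
        self.focus_range = focus_range      # arbitrary step limits for the demo
        self.iris_range = iris_range
        self.focus = focus_range[0]
        self.iris = iris_range[0]

    @staticmethod
    def _clamp(value, lo, hi):
        # Keep commanded positions inside the motor's travel limits.
        return max(lo, min(hi, int(value)))

    def set_focus(self, steps):
        # Absolute move: same step count in -> same optical focus out.
        self.focus = self._clamp(steps, *self.focus_range)
        return self.focus

    def set_iris(self, steps):
        self.iris = self._clamp(steps, *self.iris_range)
        return self.iris


lens = LensConnectLens()
lens.set_focus(1200)    # a remote, software-driven adjustment
lens.set_iris(40)
print(lens.focus, lens.iris)    # → 1200 40
```

The point of the sketch is the shape of the interface: adjustments become ordinary function calls, which is what lets a host program, rather than a person on a ladder, own the lens settings.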
Yeah, industrial robots are using vision more and more for tasks like, uh, picking things up, placing them, assembly, inspection, you name it. Right. Being able to remotely adjust the lens on that camera, maybe mounted right on the robot's wrist, that can significantly boost the performance and adaptability of the robot vision system. That makes immediate sense, because the robot arm is moving, right? So the distance to the object is constantly changing. Remote focus would be almost essential there. Precisely. Logistics is another huge one. They highlight warehousing, shipping. Exactly. Automated warehouses, sorting facilities, systems identifying different types of cargo, reading labels on packages flying by. These systems need to handle objects of all different sizes, maybe at different distances, maybe moving really fast. Yeah, logistics automation is exploding right now. And remote lens control helps streamline all that automated sorting, collection, warehouse management. That Rapid TR system we'll talk about, it fits right into this category. Okay. Rapid TR and logistics. What else? Uh, intelligent transport systems. That's traffic monitoring, things like that? Yeah. Things like advanced traffic monitoring, automated toll collection, maybe safety systems on highways. These systems are getting more sophisticated, needing high speed, high precision imaging. LensConnect allows them to adapt remotely, maybe to different lighting conditions, different traffic flows, specific monitoring needs. It promotes safety and efficiency. Wow. Okay. So factory floors, robot arms, warehouses, highways, and they even mention sports analytics. Briefly, yes. Optimizing image analysis in modern sports. You can imagine tracking players or, you know, the ball with more precision. Ah, interesting. And then, of course, the more general applications you'd expect. Warehouses and factories using vision for automated inventory checks, quality control, inspection, anywhere adaptability to different products or situations is needed. So it's clear why remote control is useful in theory. Let's make it concrete with that first case study. Artemis Vision. Who are they and what were they building? Okay, so they're a machine vision integrator. They build robust, effective industrial vision systems for their clients, and tracking technology is one of their specialties. Gotcha. And the product we're looking at is called Rapid TR, the rapid pallet tracker. That's the one. Rapid TR. It's designed as this compact unit you'd typically install, like, say, over a warehouse dock door. Okay. And its job is to track pallets moving at high speed underneath it, and at the same time use an ultra-high-definition camera to scan and read barcodes or other labels on the cargo loaded onto that pallet.

Wow. High speed scanning and tracking. Yeah. And a key feature is that it timestamps all that data accurately. This helps to prevent missed shipments and ensures everything is tracked correctly. They really wanted to make it easy to install and integrate, aiming to set a new standard in warehousing accuracy and efficiency. Sounds like a sophisticated system for a pretty tough environment. So how did LensConnect fit into their plan for Rapid TR? Well, they were already using Computar lenses in some other systems, so they were familiar with the brand. They found out about the LensConnect series through their account reps. And they saw potential. They did. They saw it could potentially solve a problem for Rapid TR. And what specifically made them choose LensConnect for Rapid TR? What was the problem it solved, that maybe other lenses couldn't? Well, initially a couple of things got their attention. LensConnect supported the larger sensor sizes they needed for that ultra-high-def camera in Rapid TR. And they found the pricing competitive. Okay. Practical considerations. But the real driver, the thing that really made it click for the Rapid TR application, was that LensConnect allowed them to create an auto-configuration option for the lens setup, right inside their Rapid TR system software. Hold on. Auto-configuration. You mean the Rapid TR system could basically set up its own lens, focus and stuff? That's exactly it. They tested various options and found LensConnect was, in their words, the most straightforward to integrate. That ease of integration let them build this auto-setup feature. Okay, that's cool. But here's the crucial insight from the case study. The biggest problem LensConnect actually solved for a customer, it wasn't really a technical challenge they faced integrating the lens. It was a usability challenge for their customers, the people buying and installing Rapid TR. So it wasn't about making their job easier, but making the end user's job easier. Precisely.
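An auto-configuration routine of the kind just described is, at heart, a focus sweep: step the motorized focus through its range, score image sharpness at each position, and park on the best one. Here is a minimal, simulated sketch of that pattern. The function names and the stand-in sharpness metric are assumptions for illustration, not Artemis Vision's actual implementation; a real system would score live camera frames, for example with the variance of a Laplacian-filtered image.

```python
# Hedged sketch of an autofocus sweep over a motorized focus axis.
# The "lens" and "camera" are simulated so the example is self-contained.

def autofocus(set_focus, sharpness, positions):
    """Move through candidate focus positions and return the sharpest one."""
    best_pos, best_score = None, float("-inf")
    for pos in positions:
        set_focus(pos)          # in a real system, a remote USB command
        score = sharpness()     # in a real system, a score on a live frame
        if score > best_score:
            best_pos, best_score = pos, score
    set_focus(best_pos)         # return to the winning position
    return best_pos


# Simulated lens/camera: sharpness peaks when focus is at step 1400.
state = {"focus": 0}
set_focus = lambda p: state.__setitem__("focus", p)
sharpness = lambda: -abs(state["focus"] - 1400)

print(autofocus(set_focus, sharpness, range(0, 3000, 100)))  # → 1400
```

A production routine would typically do a coarse sweep followed by a fine sweep around the peak, but the structure, commanded moves plus a sharpness score, is the same.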
It dramatically simplified the product setup for the warehousing companies putting Rapid TR units in place. Their staff didn't need to become experts in manually focusing lenses or making super fine adjustments in potentially awkward locations. The Rapid TR system, using LensConnect, could handle a lot of that initial setup itself. Yeah, guided by the software. It made their product more accessible, easier to deploy successfully. Probably opened it up to customers who might have been hesitant about the setup complexity otherwise. It turned a potential pain point into a selling point. That's a really powerful benefit, making the end product simpler for the customer. Did they have any feedback for Computar, things they'd like to see improved or added? They did, yeah. Based on their experience deploying Rapid TR, they asked for a more rugged USB cable connection, something better suited for tough industrial environments. Like a locking connector. Exactly. They suggested locking USB, or maybe a standard industrial connector like a 4-pin M8, instead of just a standard USB plug that could potentially get knocked loose. Makes sense for a warehouse. They also asked for a more fully featured SDK, the software development kit, specifically mentioning things like better ways to monitor the connection status to the lens, and better support for handling concurrent communications if multiple processes needed to talk to the lens. Okay, software improvements. And looking ahead, as their customers start finding new ways to use Rapid TR, they mentioned interest in a LensConnect model for those 1.1-inch sensors, but with a wider focal length range, something like 4 to 40mm, offering more flexibility. Got it. That's really valuable practical feedback. Okay, let's switch gears to the second case study. Micro Technica and their product Sempron. What's their area of expertise? Micro Technica focuses on inspection systems, especially using image processing technology.
They provide hardware, things like industrial cameras, smart cameras, and also software solutions for inspection. Right. And Sempron is their smart camera designed to go on robots? Exactly. It's a compact smart camera, meaning it has processing power built right in. It's specifically designed to be attached to industrial robots. So the camera itself does the thinking. Yeah, it performs image analysis, can directly control machines or the robot itself. And it comes with built-in apps for common tasks, like reading barcodes, QR codes, characters on objects. It uses a Sony high speed sensor, and importantly, users can write their own custom software for it too. Okay, a smart camera on a moving robot arm. That immediately brings back that challenge you mentioned earlier. The robot can move all over, but what about the lens? What was the specific problem Micro Technica hit with traditional lenses on Sempron? It was exactly that issue. Industrial robots have incredible mobility, right? They can move through large areas, but a conventional lens, especially a fixed-focus one mounted on Sempron, has a limited depth of field, meaning things are only sharp in a specific distance range. Right. A fairly narrow band. So even if they chose different fixed focal lengths, it was really difficult for one lens setup to handle all the different situations a robot might encounter. Sometimes it needs a close-up view, sometimes it needs to see something further away, and just moving the robot arm to get the object perfectly in that focused sweet spot, that's tricky. It can be complex, slow, and sometimes just not practical, depending on the workspace or the task. Relying only on robot movement to achieve focus was limiting. So the robot could be anywhere, but the camera could only see clearly in one specific depth zone. I get it. Exactly. And that's precisely why Micro Technica saw the drive-type lens concept, the motorized LensConnect, as the best option.
They felt it enabled much better cooperation between their Sempron smart camera and the robot system it was attached to. Okay, so how did LensConnect specifically fix that problem for Sempron users? It allowed them to dynamically adjust the lens parameters, mainly the focus, remotely, while the robot was working. So as the robot moved, or as it encountered different objects, like maybe boxes of different sizes at slightly different distances, the lens could adjust. Yes. Instead of the robot having to perform complex maneuvers to place every single object at the exact right distance for a fixed lens, the LensConnect lens could adjust its focus to the object. Much more flexible. That makes sense. It decouples the focusing from the robot's precise position, somewhat. Right. And they also mentioned it gave them options for customers who, maybe for safety reasons or cycle time, wanted to minimize how much the robot moved. They could achieve the necessary focus by adjusting the lens instead of moving the whole arm. Ah, interesting. So it added this layer of flexibility right into the vision system itself. Now, Micro Technica apparently had three key reasons for picking LensConnect specifically. Yes, the sources highlighted three crucial factors in their decision. First was that high reproducibility of the focus drive. There's that word again, reproducibility. It's critical, knowing the lens will reliably go back to the exact same saved focus position time after time. Absolutely essential for automated sequences in a factory. Makes total sense for repeatability. Okay. What was reason number two? The second was a really interesting architectural benefit for them. It was the ability to control the LensConnect lens directly from the Sempron smart camera itself. Oh, so the smart camera, the brain on the robot arm, could talk directly to the lens. It didn't have to go through, like, a separate PC for every command. Exactly.
They didn't necessarily need to route every single adjustment command back through an external host computer. The smart camera could manage the lens adjustments locally. Wow, that makes the whole robot-camera unit much more autonomous, doesn't it? It really does. It simplifies the overall system wiring and control architecture, reduces communication latency. It's a significant advantage. Okay, that's a big one. And the third reason? The third was simply the wide range of LensConnect models available, having options from wide angle to telephoto, for different sensor sizes. It gave Micro Technica confidence, confidence that as their customers came up with new applications, new requirements for Sempron, there would likely be a suitable LensConnect lens available. Right. It gave them security that they could build solutions for diverse needs. They felt this flexibility, combined with that reliable positional control, being able to tell the lens go to position X, made it much easier for the Sempron camera to cooperate effectively with what they called upper systems. Upper systems? Yeah. Can you clarify what that generally means in this kind of industrial context? Sure. In this context, upper systems usually refers to the higher level controllers that are orchestrating the whole operation. That could be the main robot controller itself, maybe a PLC, a programmable logic controller, running the production line, or even a factory-wide system like an MES, a manufacturing execution system. Okay. The things telling the robot and camera what job to do. Exactly. Sempron needs to talk to these systems, report results, get instructions, and having reliable, addressable remote control of the lens makes that interaction much smoother, especially when the task changes and the lens needs to be reconfigured. That makes a lot of sense. And seeing how the design of a component like the lens enables new system architectures, like controlling the lens directly from the smart camera, that's...
Well, that's where things get really interesting from an engineering perspective. It really is fascinating. You see how overcoming what seems like a simple limitation, a static lens, totally transforms the capability of something like a vision system on a mobile robot arm. So if we bring it all together, whether we're talking about Artemis Vision tracking pallets with Rapid TR, or Micro Technica guiding robots with Sempron, or maybe monitoring traffic, the fundamental value of LensConnect seems to be this precise, repeatable, remote lens adjustment capability. Yeah, it fundamentally changes the game. You're moving from a lens that's basically set it and forget it, or set it and hope you don't have to touch it again, right? To a lens that's a dynamic, software-controlled part of the whole automated system. Which makes the vision systems way more adaptable to different products, changing conditions. Exactly. More adaptable, often easier to set up and configure for the end user, and ultimately more efficient and capable of handling a wider range of tasks that might have been difficult or impossible before. And that innovation got noticed, didn't it? The sources mention it received an award. Yeah. The series was recognized as a Silver honoree in the Vision Systems Design Innovators Awards program. Which seems like well-deserved recognition for tackling a pretty fundamental challenge in a very practical way. Definitely. So maybe here's a final thought for you, the listener, to consider after this deep dive. We've seen how LensConnect isn't just a slightly better lens. It's the kind of component innovation that actually enables new types of systems, like Rapid TR, or allows existing systems, like Sempron on a robot, to tackle problems they couldn't effectively solve with older tech. It changed what was possible by rethinking the eyes. It unlocked new capabilities. So think about systems or processes you're familiar with. What other limitations, maybe limitations we currently just accept, could be overcome if a key component, maybe something seemingly simple, was fundamentally rethought or upgraded? What new possibilities might that unlock, in automation or other fields?
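As a closing illustration of the repeatability theme from the Micro Technica case study: because stepping motors address absolute step positions, a controller can store named focus presets and recall them exactly when an upper system, a robot controller, PLC, or MES, switches tasks. A minimal, simulated sketch, with every name invented for illustration rather than taken from any vendor's SDK:

```python
# Hedged sketch: named focus presets over an absolute-positioning lens.
# The lens is simulated by a plain callable; a real controller would
# issue the same absolute move over USB each time a preset is recalled,
# which is what makes the result repeatable.

class FocusPresets:
    def __init__(self, set_focus):
        self.set_focus = set_focus   # callable that moves the (simulated) lens
        self.presets = {}

    def save(self, name, steps):
        # Remember the absolute step position under a task name.
        self.presets[name] = steps

    def recall(self, name):
        steps = self.presets[name]
        self.set_focus(steps)        # same step count -> same focus, every time
        return steps


state = {"focus": 0}
presets = FocusPresets(lambda p: state.__setitem__("focus", p))
presets.save("near_bin", 800)      # e.g. close-up inspection position
presets.save("far_shelf", 2600)    # e.g. longer working distance
presets.recall("far_shelf")
print(state["focus"])   # → 2600
```

The presets dictionary is the whole trick: once lens positions are addressable numbers, task changes become a lookup and a move rather than a manual readjustment.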