MVPro Media – The Vision Podcast #20
Guest – Georgy Das, Director of Systems and Training at Midwest Optical Systems (MidOpt)
In this episode, we dive into the physics of imaging with Georgy Das, Director of Systems and Training at MidOpt, to explore why optical filters are essential in 99.9% of machine vision applications. Georgy explains how the right filtration can control glare, manage ambient light, and stabilize imaging performance as systems move from controlled lab environments to real-world factory floors. He also shares practical tools and approaches engineers can use to design complete, reliable vision systems from the start.
“There’s nothing worse than coming home and finding out they forgot your barbecue sauce. That’s the mindset. A complete solution for machine vision involves using a filter 99.9% of the time.”
On this page:
- Podcast player
- Guest information
- Useful links
- Episode chapters
- Episode transcript
Listen to the Episode:
About our Guest:

Georgy Das is Director of Systems and Training at MidOpt, where he works with engineers and integrators to improve machine vision system performance through optical filtering and system design best practices. With more than 15 years of experience across technical training, solutions support, and industry education, he focuses on helping users bridge the gap between optical physics and real-world deployment. Georgy also serves on the Association for Advancing Automation Vision Technology Board, contributing to initiatives that support the advancement of machine vision technologies.

Useful Links:
- Georgy Das LinkedIn profile
- Midwest Optical Systems: midopt.com | LinkedIn page
- MidOpt Curve Compare tool
- Automate 2026 – Visit MidOpt at Booth #333 | automateshow.com
Episode Chapters
Click on the chapters to jump to the relevant sections of the transcript below.
1. Solving Glare and Contrast Problems in Vision Systems
Georgy explains how filters are used in practical machine vision applications, such as eliminating glare when reading barcodes through plastic or improving contrast in challenging environments.
2. Why Filters Are a Necessity, Not an Accessory
Georgy discusses why filters are often overlooked in system design and why engineers should consider them from the beginning to ensure reliable performance across different deployment environments.
3. The Fundamentals of Optical Filtering
Georgy explains how camera sensors perceive more of the electromagnetic spectrum than human eyes and how filters help isolate the wavelengths needed to detect parts, defects, and barcodes.
4. Choosing the Right Filter for an Application
The discussion turns to practical design considerations such as wavelength range, lighting conditions, reflectivity of the target, and filter placement within a vision system.
5. Thermal Imaging and New Materials
Georgy introduces emerging applications in thermal imaging and discusses new materials developed to replace traditional germanium windows in long-wave infrared systems.
Episode Transcript
Georgy Das
There’s nothing worse than coming home and finding out they forgot your barbecue sauce. That’s the mindset. A complete solution for machine vision involves using a filter 99.9% of the time.
Josh Eastburn – Host
Welcome to the MV Pro podcast. We spend a lot of time on this show discussing the brains of machine vision: AI models, high-speed processors. But as any engineer in the trenches will tell you, if the eyes of your system can’t see through the glare or handle inconsistent lighting, all that computing power is essentially wasted. Today’s guest sits at the intersection of deep technical know-how and practical education on this subject. Georgy Das is the Director of Systems and Training at Midwest Optical Systems, better known as MidOpt. With over 15 years of experience across IT, marketing, and training, Georgy also serves on the A3 Vision Technology Board, helping to guide the industry’s priorities at a national level. MidOpt is a global leader in machine vision filters and optical solutions. Based in Palatine, Illinois, MidOpt brings over 30 years of experience serving customers in more than 30 countries with a portfolio of over 3,000 products. Their filters are engineered for durability, precision, and superior image quality, helping industries like factory automation, medical imaging, security, and aerial imaging get dependable real-world results. In this episode, Georgy and I talk about why filters are core components, not just accessories, how to standardize optics for scaling, and why even the best AI model is only as good as the photons that reach it. Please enjoy my conversation with Georgy Das.
Josh Eastburn – Host
I read that you started at MidOpt as a solutions engineer, is that right?
Georgy Das
That’s correct, yeah.
Josh Eastburn – Host
Okay, so what does that role involve at a company like MidOpt?
Georgy Das
MidOpt manufactures filters and optical components for machine vision systems. As a solutions engineer, my primary goal was to help customers figure out how to use those filters in their machine vision applications.
Josh Eastburn – Host
Can you give me an example of what problem somebody might approach you with?
Georgy Das
Sure. A lot of the time, folks would be setting up an application to look at something that’s wrapped in plastic because they’re trying to read a barcode. But the light on the factory floor, or the light that they’re using to illuminate the barcode, was causing a lot of glare. They needed to get rid of that glare so the camera could see just that barcode. So we would use a polarizing filter to help extinguish that glare and let them see that barcode clearly so it could be read quickly and efficiently. That’s just one example, but managing ambient light and being able to create contrast were the primary focus, and really where I learned a lot about how essential optical filters and optics are in machine vision.
Josh Eastburn – Host
Do you have a favorite application from that time? Something that maybe you felt like was a big win or just a really cool application?
Georgy Das
I mean, there’s all sorts of applications. A theme park company that I’m not going to mention by name needed a particular wavelength, or color, of blue light to create the atmosphere for a particular attraction, and one of our blue filters helped to do that. It was cool to be able to visit that theme park with my kids and show them, hey, this is kind of what dad works on. And for a second, they’re like, you built this ride? And I’m like, no, no, I helped to create the blue light of the atmosphere. So that’s one cool thing. Another really cool one, I think it was in South America, used drones on beaches. The drones carried a flotation device so that when somebody was drowning, or they could see somebody was having an issue in the water, the lifeguards could send a drone out much quicker than they could actually run to the victim. The drone would go out and drop the flotation device right above the person having difficulty, so they would have it while the lifeguard was swimming up to them.
Georgy Das
Again, we used some filters so that the drone camera system had an easier time picking out the person from the other folks who were in the water. Again, helping to create that contrast and protect the elements in the camera system. That was another really cool application. I’m not sure if it ever got used widely, but it was a cool concept and something I could see happening in the future.
Josh Eastburn – Host
Yeah, that’s an application that seems obvious that I hadn’t thought of. Pretty neat. You’ve mentioned a couple of aspects of filtration that are helpful. What I’m wondering is maybe what’s the most common topic that you feel like you need to educate people on? I know you spend a lot of time training now, right? You’re not working as a solutions engineer anymore. It sounds like you focus a lot on education, though. So yeah, maybe what’s something that you feel like people frequently need to be educated on as it relates to optical filtration? Or maybe what’s something that you wish that people would just unlearn about that area of machine vision?
Georgy Das
I feel like somewhat of a filter evangelist at times. People, honestly, don’t think about filters. They treat them as an accessory. Our tagline, or slogan, or mantra at MidOpt is: filters are a necessity, not just an accessory. Usually, when design engineers, or whoever is designing an application, are working, they’re thinking about their camera, their lighting, their processing, and filters aren’t even a thought. It’s usually only something they consider if there’s a problem. What we know to be true about machine vision applications is that there are so many variables. Often you’re designing an application in a lab setting where lighting conditions are controlled and everything is working perfectly. But then that application or that system gets rolled out to multiple locations. Each one of those locations is going to have variable lighting because of skylights, because of the overhead lighting, because of people walking around, all sorts of things that you couldn’t account for. What a filter helps to do is control that ambient light, because a filter is filtering out everything except for the wavelengths that you’re intending to pass. If you’re using a particular wavelength, like a red light or a blue light, to illuminate a barcode or some other part or a defect, then you can use a filter to block out everything else and just pass that wavelength of light.
Georgy Das
I feel like for folks who are designing or installing an application, if you think about that filtering aspect before you get to the customer, it’s going to be a great experience for that customer and for you, because it’s going to work as best as it can out of the box, because you’ve eliminated a lot of those variable lighting conditions. Then for the end user, the customer receiving that application, it’s also a great experience because it saves time and money; you don’t have to go back to the drawing board, you don’t have to figure out what was wrong. That’s the biggest thing. Hey, filters are a necessity, not just an accessory. That’s our biggest push. Then once you get into filtering, there are all sorts of different applications: creating contrast, creating repeatability, helping extinguish the glare in applications where you’re looking at something shiny, protecting equipment. But again, I’d say the main thing my role involves is trying to get people to understand that you need a filter most of the time, and to consider it in that design phase, not just when there’s a problem.
Josh Eastburn – Host
How do you reframe the importance of filtration when you’re talking to people who are, like you said, more focused on cameras and lighting, the other things they feel are the major components of the system? Is there a mindset people should have? Maybe it’s something along the lines of: look, you can save money now and spend it later, or you can spend it now and have a predictable process going forward. What’s the shift that you feel people need to make?
Georgy Das
I think the biggest thing is trying to get people to understand that giving a complete solution, finding a complete solution, something that’s going to work as close to perfect as possible, is a really great experience for both the person doing the designing and delivering the product and the person receiving it. I mean, think about when you order takeout: when they include everything in the bag, and you’re sitting down to dinner and everything is there, all your sauces, your utensils, everything is perfect. It lifts your experience to such a high level. There’s nothing worse than coming home and finding out they forgot your barbecue sauce. That’s the mindset: a complete solution for machine vision involves using a filter, again, 99.9% of the time. That’s the mindset that we try to encourage.
Josh Eastburn – Host
That’s such a great example. Yes, I think we can all relate to that: what? They didn’t put in a fork? Go find a fork, please. And the analogy here is to, let’s say, the design lab, where you have built out a theoretical system, right? Even if you’ve built it from some physical components and tried it out, now you’re going to take this to five or ten different sites that might have varying conditions, and you want to be sure. What you’re saying is MidOpt is offering you the option of being sure from the get-go that the system is going to perform as designed across all of those different sites where you’re deploying it.
Georgy Das
From a vision perspective, from an optics perspective, from enabling your system to see what you’re intending it to show, again, whether it’s a barcode, a defect, trying to get rid of those variables, yes, I would say that’s one of our main goals at MidOpt.
Josh Eastburn – Host
Makes sense. You mentioned contrast, glare, protection. Let’s go maybe a little bit deeper. Can you give us the five-minute primer on this aspect of machine vision design? What are some of the different things we’re able to control for, the different reasons why we might apply optical filtration, and what’s the end goal in each of those cases?
Georgy Das
Yeah, sure. I guess the main thing to understand is that when we talk about machine vision, you’re typically using some camera or sensor to look at something. And the camera sensor sees a lot more, picks up a lot more, than what human eyes see. If you look at a spectral response or a transmission curve, our eyes see heavier in the blue-green, and that’s what natural color rendition looks like for us. But cameras are way more sensitive. They see a lot in the visible, where we all can see, but they’re also picking up things in the infrared, things that are outside of what our human eyes can see. And that’s the electromagnetic spectrum. It’s pretty wide. It starts from UV, or ultraviolet, rays and goes all the way to the infrared, and keeps going. Because cameras are more sensitive than our eyes, we might be able to look at the application and easily pick out that contrast, pick out that barcode, that defect, that part. But because a camera sensor can see much more, you need to limit what it can see, because, again, you’re trying to create some kind of difference so that the system has an easier time picking out that barcode, part, defect, what have you.
Georgy Das
That’s what an optical filter does. It helps to block certain wavelengths and pass other wavelengths. Or there’s a certain type of filter called a neutral density filter, for when there’s too much of a particular wavelength, or too much light overall, like in outdoor applications. It’s like sunglasses. When you go outside and the sun just hits you in the face, it’s so much, and the image is oversaturated; a neutral density filter can help bring that down. And glare. With all of these, we’re controlling the wavelengths, filtering out certain wavelengths and passing other ones. That’s basically what a filter is. It’s a simple concept, but it’s something that people often forget, something that people often don’t even think about, again, until there’s an issue.
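The pass/block behavior Georgy describes can be sketched numerically. The spectrum and transmission values below are hypothetical round numbers for illustration, not MidOpt curve data; real values come from manufacturer datasheets.

```python
# Toy model: what a sensor "sees" is the scene's light multiplied,
# wavelength by wavelength, by the filter's transmission (0..1).

# Hypothetical ambient light reaching the camera (wavelength nm -> relative power)
scene_light = {365: 0.2, 450: 0.8, 525: 0.6, 660: 1.0, 850: 0.7}

def apply_filter(light, transmission):
    """Attenuate each wavelength by the filter's transmission at that wavelength."""
    return {wl: p * transmission.get(wl, 0.0) for wl, p in light.items()}

# Hypothetical red bandpass filter: passes ~660 nm, blocks UV/blue/green/IR
red_bandpass = {365: 0.0, 450: 0.01, 525: 0.02, 660: 0.92, 850: 0.03}

# Hypothetical neutral density filter: attenuates everything uniformly, like sunglasses
nd_filter = {wl: 0.25 for wl in scene_light}

seen_through_red = apply_filter(scene_light, red_bandpass)
seen_through_nd = apply_filter(scene_light, nd_filter)

print(seen_through_red[660])  # the intended red illumination survives
print(seen_through_red[525])  # ambient green is nearly eliminated
```

Pairing a red light with the red bandpass keeps the signal you installed while the variable ambient wavelengths are suppressed, which is the "control the ambient light" effect from the conversation above.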
Josh Eastburn – Host
It sounds like contrast is the primary thing that you’re helping people control for. Is that right?
Georgy Das
Contrast is probably one of the biggest things. Contrast, again, just simply means creating separation. That’s all you want. You want your system to be able to separate something, whether it’s something from the background or something from a machine, and you want to see the slightest defect: a crack in a window, a defect in the paint job of a car, a scratch on a pair of glasses. So contrast simply means creating that difference. Oftentimes people rely on image processing or lighting to create that contrast, and oftentimes it works. But the problem is when you try to scale it, and there are so many variables because of the factory that you’re in, or the shop that you’re in, or even on consumer products like an iPhone camera or the front Face ID. There are so many variables because you’re going to be in so many different environments. And so what you’re trying to do is control that light.
Josh Eastburn – Host
Garbage in, garbage out.
Georgy Das
Garbage in, garbage out.
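The "separation" Georgy describes has a standard measure: Michelson contrast, (I_max − I_min) / (I_max + I_min), which runs from 0 (no separation) to 1 (maximum separation). The pixel intensities below are invented to show how glare that lifts the dark regions destroys contrast.

```python
# Michelson contrast between the brightest and darkest intensities in a region.

def michelson_contrast(pixels):
    """(I_max - I_min) / (I_max + I_min); 0 = flat, 1 = full separation."""
    i_max, i_min = max(pixels), min(pixels)
    if i_max + i_min == 0:
        return 0.0
    return (i_max - i_min) / (i_max + i_min)

# Hypothetical intensity samples across a barcode edge (dark bars vs. light gaps)
unfiltered = [120, 130, 180, 190]  # glare lifts the dark bars toward the gaps
filtered = [20, 30, 180, 190]      # a bandpass filter suppresses the glare

print(michelson_contrast(unfiltered))  # low: bars and gaps look similar
print(michelson_contrast(filtered))    # high: easy for the reader to decode
```

Same scene, same camera; only the light reaching the sensor changed, which is the "garbage in, garbage out" point in one number.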
Josh Eastburn – Host
Yeah, okay. So another way to think about this would be, what are some of the different things that it’s possible to control for? Maybe give us an idea of what’s the variety. I think you carry something like, is it 3,000 products that MidOpt carries?
Georgy Das
Yeah, so those are just different variations of the filter. Sometimes you want to see something in the ultraviolet and block everything in the visible. If you’ve seen a fluorescence application, maybe you’ve seen a bottle of motor oil that has a really light, faint date code or barcode printed on it, and you can use an ultraviolet light to help illuminate that. That’s something that has to be quality controlled. Or maybe you want to see something visible and block everything else out, like a color camera. A color camera just needs to pass visible wavelengths. But again, cameras are picking up UV, they’re picking up IR. So you use a filter to block everything outside of the visible, to kind of mimic what our human eyes are seeing. Maybe you want to see something in the infrared, like the IR on TV remotes. There’s a little red filter on the top of your remote that’s passing that infrared wavelength out but blocking all the other wavelengths, so they’re not interfering. So we carry many different variations for blocking certain segments of the spectrum and passing others. We also carry all sorts of optical windows.
Georgy Das
Sometimes you need a clear window to protect something. Again, like on your iPhone, you have a clear window. On the dash of your car, you have a clear window that protects those components and screens. On your thermostat at home, like your Nest or your Ecobee, there’s sometimes a clear plastic cover that’s protecting the sensors behind it, but you still need to be able to see things; they need to pass certain wavelengths back and forth to take different readings. So yes, 3,000 products is a lot. Then we also carry lenses and different adapters and all sorts of things, again, to help make it as easy as possible to get your machine vision application off the floor, into a customer’s hands, and running smoothly.
Josh Eastburn – Host
With that many different options to pick from, what are some of the maybe fundamental properties that an engineer needs to consider when they’re speccing out this type of component?
Georgy Das
Yeah, some of the biggest things are: what wavelength range are you working in? Are you looking at something that’s visible, something in the UV, something in the IR? You need to figure that out, because that’s going to play a big part in what filter you use; you want something that’s complementary. Are you using lighting, or are you relying on ambient lighting? What part are you looking at? Is it super shiny? Is it reflective? Then there are the angles of how you have it set up, the physics of it all. Those are probably the essential things to consider when you’re trying to incorporate a filter into your setup, or when you need to figure out whether you need a filter for your setup.
Josh Eastburn – Host
How about the filters themselves? What are some of the different specifications that might be involved in selecting the filter itself? Obviously, you mentioned wavelength, that it’s actually designed to fit that scenario. I’m sure there are also things like size, curvature, anything like that.
Georgy Das
Yeah, not curvature. These aren’t lenses. They’re not providing any magnification like your eyeglasses might. There’s no curvature. But what you said is important, I guess, about the placement or the size. Is your filter going to go in front of the lens? Is it going to go behind the lens? Is it going to go in some kind of enclosure? One of the biggest strengths of Midwest Optical is our ability to quickly customize the shape and the size. We offer a lot of in-stock products. We have a lot of mounted filters that are ready to go that you can thread onto the end of a lens. But we also have ones that you can mount in front of the sensor, behind the lens. Ultimately, if you need an unmounted circle or a square or a rectangle, we can help create that really quickly. So placement, where that filter is going to go, is probably another essential thing that I forgot to mention.
Josh Eastburn – Host
Sure. Okay. So placement, you mentioned different shapes as well, depending on what type of, I suppose, enclosure you’re trying to fit into, right? Or mount you’re trying to fit it into.
Georgy Das
Correct. Yeah.
Josh Eastburn – Host
Or the different angles, I suppose, is what you’re thinking of as well. If it’s ambient light that you’re trying to control for, which direction is it going to be coming in?
Georgy Das
Yeah, the way a filter works, or I guess a bandpass filter, which is probably one of the most popular types of filters: bandpass filters have a band, certain curves, allowing certain wavelengths through that band and blocking everything outside of it. What can happen in certain situations is that if the light is coming in at an angle, because of the way the coatings work, that passband can start to shift. Let’s say you’re using a red bandpass filter. You would typically use a red bandpass filter if you’re using a red light to illuminate something, let’s say a barcode. You’ll see barcode readers have red lights that flash out. What can happen is that if that red light is entering the filter and lens at an angle, with a lot of filters that passband will start to shift to the left. We call it short shifting. So instead of passing red wavelengths like it’s supposed to, it’ll start to pass green wavelengths and block red ones because of the shift. What ends up happening is that this filter that was supposed to help pass red is now blocking red and allowing potentially interfering wavelengths through.
Georgy Das
So that’s something to consider. MidOpt has a technology on a lot of our bandpass filters called Stable Edge, where the filters are designed in such a way that that shift doesn’t happen, or is very limited. You can read more about that on our website. It’s a kind of glossed-over feature, but it’s very essential, because people will buy filters all the time from lots of different sources, and all of a sudden this red filter that’s supposed to be passing red is passing green. It doesn’t make any sense. But it’s because of this phenomenon called short shifting: the passband starts to shift to the shorter wavelengths. Shorter wavelengths are the UV and visible, and the longer wavelengths are the infrared, so if you’re looking at that spectrum as a line, shorter is to the left. One way to help avoid this situation, aside from the Stable Edge technology, is, if possible, to mount the filter behind the lens, in front of the sensor. This also limits the shift. But oftentimes, because of the design of the camera or the lens, for a multitude of reasons, that’s not always possible.
Georgy Das
That’s why it’s essential to use a filter with this Stable Edge technology. Again, that’s a proprietary element of the design of our filters that’s available from MidOpt.
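The short shift Georgy describes follows a well-known approximation for interference filters: the center wavelength at angle of incidence θ is λ(θ) = λ₀·√(1 − (sin θ / n_eff)²), where n_eff is the coating stack’s effective index. The n_eff of 2.0 below is a typical ballpark figure, not a MidOpt specification.

```python
import math

def shifted_center(lambda_0_nm, angle_deg, n_eff=2.0):
    """Approximate center wavelength (nm) of an interference bandpass filter
    when light arrives at the given angle of incidence (degrees)."""
    s = math.sin(math.radians(angle_deg)) / n_eff
    return lambda_0_nm * math.sqrt(1.0 - s * s)

# A nominal 660 nm red bandpass filter at increasing angles of incidence:
for angle in (0, 15, 30, 45):
    print(angle, round(shifted_center(660, angle), 1))
# The passband only moves toward shorter (greener) wavelengths as the angle
# grows, which is why off-axis ambient light can sneak through a "red" filter.
```

Mounting the filter behind the lens reduces the ray angles hitting the coating, which is why that placement, when the optics allow it, also limits the shift.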
Josh Eastburn – Host
Very cool. We’ve been talking a lot about contrast, controlling for contrast, controlling for a glare, but you’ve also mentioned the need for protective windows. I’m wondering if there’s some nuance in that application, or is it as simple as, I need to put something in front of the expensive equipment so that it doesn’t get damaged by the process that I’m monitoring?
Georgy Das
I’m not sure if there’s much nuance to it, but let’s think about a couple of different applications. Say you’re on a factory floor and there’s a bunch of cameras set up. There are cans of soda pop, or something, flying down the conveyor belt, and these cameras are taking multiple images. If in that environment you’re bottling, or there’s food, or liquid, or grease, something like that, it’s very possible that there’s going to be constant splashing or spills on these systems. So oftentimes what happens is they have to pause the line and somebody has to go and clean the camera. And when you do that, of course, there are going to be fingerprints and smudging. A protective window is just the easy way to prevent that. Protective windows are often just clear glass or acrylic, depending on what you need. But the benefit is that we can apply certain coatings on those protective windows, something like an oleophobic coating or a hydrophobic coating. These coatings are a big help. At a consumer level, people are going to be really familiar with that, because there’s a coating like that on your iPhone to help prevent smudges and fingerprints.
Georgy Das
These kinds of coatings almost work like rain on the windshield of your car, where the liquid beads off instead of sitting on there and spreading, and it makes it easier to clean. That’s one application. We supply a lot of protective windows for aerial or drone applications as well because you’re flying and there’s condensation, there’s things that build up, there could be things flying, and hitting your lens on your camera. Protective windows are a great way to just prevent that damage. Also, those optional coatings help to make it easier to clean, make it easier so that the system can continue to see even if it’s raining or if there’s some condensation or moisture in the air.
Josh Eastburn – Host
Makes sense. Okay, so again, this theme of trying to control the conditions that the vision system was designed to operate in, right?
Georgy Das
Yeah.
Josh Eastburn – Host
That makes sense. When we met up, I think for the first time, at Automate this last year, you were talking a lot about the new SiLWIR™ protective window line. Can you help me understand what’s special about that, or what types of applications it’s designed to support?
Georgy Das
Sure. We’ve seen an increase in the number of thermal applications. Thermal applications are applications where you’re looking at something in the far IR, or long-wave IR. And this is, again, something that our eyes can’t pick up. But what these cameras and systems are doing is converting this nonvisible energy to something that’s visible so that we can see it. They’re trying to capture changes in temperature and heat to create maps, whether it be of people, animals, or anything else you’re trying to look at. These are used a lot in surveillance applications. Or maybe, if you’ve ever gotten a home inspection, the home inspector has a special thermal camera to help look behind the walls to see if there’s water or water damage, or if there’s a raccoon living behind the wall. We’ve seen an explosion, and a lot of it, or some of it, comes from the defense side, too. There’s more surveillance, more cameras on every corner, all sorts of applications being used globally. Typically, in these kinds of applications, you need a window made out of a material called germanium. And germanium has been really difficult to get, for lots of reasons.
Georgy Das
It primarily comes from China. This element comes from China, where it’s usually mined. It’s been increasingly difficult to get, or it’s been really expensive. What MidOpt has been able to do is create a similar product, with the same properties as germanium, but using silicon glass and applying special coatings to make it more durable. We’re really excited to be able to offer an alternative to germanium. The reception has been really interesting. It’s cool to see that we’ve been able to come to the market with that offering.
Josh Eastburn – Host
Yeah, interesting. What is it about germanium that makes it best adapted to this, or what is it that you’re trying to replicate with a silicon-based product?
Georgy Das
Yeah, it’s because you need to see in that 8 to 12 micron range of the spectrum, and germanium is just really great, perfect, for that application. It’s also durable, and that’s what we’re trying to replicate: being able to see in that 8 to 12 micron range, because that’s where these thermal applications work. We’ve been able to put anti-reflective coatings on it to increase that transmission, so we can capture as much information as possible. We put on a DLC, or Diamond-Like Carbon, coating to help protect it, because these applications are oftentimes being deployed in environments that might have liquid or other elements, pebbles, rocks, things flying around. We want to make it as durable as possible. And as of right now, it’s significantly lower cost and more readily available to wider markets.
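Why the 8 to 12 micron band in particular? Standard blackbody physics, not anything MidOpt-specific, explains it: by Wien’s displacement law, an object’s peak emission wavelength is λ_peak = b / T with b ≈ 2898 µm·K, so objects near room or body temperature radiate most strongly inside exactly that long-wave IR window.

```python
# Wien's displacement law: peak blackbody emission wavelength vs. temperature.

WIEN_B_UM_K = 2897.77  # Wien's displacement constant, micron * kelvin

def peak_emission_um(temp_kelvin):
    """Wavelength (microns) at which a blackbody at this temperature emits most."""
    return WIEN_B_UM_K / temp_kelvin

print(round(peak_emission_um(310), 1))  # human body, ~37 C: ~9.3 um
print(round(peak_emission_um(293), 1))  # room temperature, ~20 C: ~9.9 um
# Both peaks fall inside the 8-12 um band, so a thermal window (germanium,
# or a silicon-based alternative like SiLWIR) must transmit well there.
```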
Josh Eastburn – Host
Interesting. You mentioned a couple of applications or industries that could probably best benefit from SiLWIR™. I think one of them was defense. What were some of the other ones that you mentioned? Who is this best designed for?
Georgy Das
Yeah, defense, healthcare. It’s used a lot in manufacturing, building, and construction, because, again, you’re able to take a peek behind things our human eyes aren’t able to see. You’re able to see moisture and temperature changes. I know pest companies are able to use these thermal cameras to see if there are squirrels running around in your attic, because raccoons or squirrels hear people and they stop, they don’t move, and you can find them without breaking a wall down or ripping up your attic. Construction companies and architects will probably use it for looking at the stress in different materials, all sorts of potential applications that I’m sure we’ll continue to see. It’s a pretty expensive endeavor. I mean, those cameras are very expensive, but as they’re used for more and more things, I’m sure they’ll find their way down to the consumer level even more than they have now. I mean, you can go to Home Depot and buy a $200 thermal gun to look behind your walls for all sorts of applications.
Josh Eastburn – Host
That’s fascinating. Okay, so I feel like we’ve covered a lot of ground here, from the fundamentals of optical filtration all the way to the specialty items that you carry. What’s coming next for MidOpt? It’s still Q1, right? So we can still talk about what’s big for ’26. Anything we should look forward to this year?
Georgy Das
We’re continuing our evangelism for optical filters, getting people to understand that they’re super important. We’re looking forward to the upcoming trade shows. Automate is going to be in Chicago this year, and we have a booth there. I hope people can come visit us and learn more about why you might need a filter, or if you have an application where something I’ve described struck a chord, like you’re not able to create that contrast, or there’s too much glare or too much ambient light. That’s something that we would love to help with and discuss and show you how it works. I probably can’t say anything about new products until the release, but we’re trying to understand where the market is going and to create products that solve the problems arising from those trends.
Josh Eastburn – Host
Right on. What booth number are you going to have at Automate so people know where to find you?
Georgy Das
Booth number 333. Yeah, so come find us at 333 in Chicago. We’ll buy you a cup of coffee if you come.
Josh Eastburn – Host
Awesome. Hope to see you there. Okay, well, is there anything I’ve neglected to ask about that you’d really like to talk about before we wrap?
Georgy Das
I mean, the other thing is that MidOpt is really well known in the industry, and I think we got that way by being a resource for people, for engineers, for customers, for hobbyists, to understand optics and to understand filters. And we want to continue being a resource for people. Our website is a great resource to learn more about our products and about different applications, and to help you understand why you would want to use a filter. But there are a couple of tools I want to mention. One is our Curve Compare. It’s a tool where you can select multiple filters and see how they work together, see the curves and the transmissions. That’s been really useful for engineers trying to determine which filter might work best depending on the lighting they’re using or the application. It’s a really great tool. You can print out the results, or you can send them to us to help you understand what’s going on there. That’s one tool. The other relates to what we talked about earlier, the placement of the filter, or how to get the filter onto a lens.
Georgy Das
Oftentimes, when you buy a lens, different manufacturers all have different thread sizes, and you don’t know what thread size you need because, even though there are filter threads, they often don’t say which size filter thread fits, or it’s not easily available. So we also have a lens mount finder. You can just type in the serial number or the part number of that lens from multiple manufacturers, and we’ll tell you, hey, this is the mount size that goes on. Again, it just goes back to MidOpt wanting to be a resource for machine vision users, and those tools help us do that.
Josh Eastburn – Host
Where can people find you online?
Georgy Das
At midopt.com. That’s M-I-D-O-P-T dot com.
Josh Eastburn – Host
Perfect. Do you guys do anything on social? Is that a good place to go for education?
Georgy Das
Yeah, we’re posting a lot on LinkedIn. I think we have some Twitter and Facebook and all that, too, but LinkedIn and our website are probably the best places. If you go to our website, you can sign up for our newsletter. We include a lot of tips and tricks and application examples to help you understand more about filtering.
Josh Eastburn – Host
Okay. Well, thanks so much for your time. I learned a lot today. This is not a subject that we had covered before, so I’m really glad we got to go into some depth on it.
Georgy Das
Cool. Thank you so much for your time. Appreciate it.
Josh Eastburn – Host
Again, that was Georgy Das, Director of Systems and Training at Midwest Optical Systems. It’s a great reminder that while we’re often chasing big breakthroughs, the fundamental laws of physics still have a lot to do with whether a project succeeds or struggles in the real world. If you’re looking to sharpen your own mental model around optical filtering, I highly recommend checking out the tools Georgy mentioned over at midopt.com. You can also find Georgy on LinkedIn. As he mentioned, MidOpt will be on the show floor at Automate, booth number 333, this June. If you are a vision professional or an integrator with a unique application story that moves beyond the hype and into the reality of the shop floor, please reach out to me at josh.eastburn@mvpromedia.com. For MVPro Media, I’m Josh Eastburn. Be well.