MVPro Media – The Vision Podcast #19
Guest – Brian Mock, President at Event Capture Systems
How do you modernize 50-year-old paper mills to make operations safer? Brian Mock, President and Co-Founder of Event Capture Systems (ECS), explains how machine vision, edge computing, and AI are transforming the pulp and paper industry. He digs into why aging, siloed vision systems are a hidden drag on uptime, how edge devices change the economics of quality control from the woodyard to finished rolls, and why even tiny efficiency gains can mean millions in savings.
“Ultimately, paper is a low margin, high volume business. If you don’t make it right, you can’t sell it. There’s a lot of energy you’ve already spent to top the tree, grind the tree, put the energy into it. And then if you can’t get top dollar — you’ve invested all of that time into it.” Brian Mock, President of ECS
On this page:
- Podcast player
- Guest information
- Useful links
- Episode chapters
- Episode transcript
Listen to the Episode:
About our Guest:

Brian Mock is President and Co-Founder of Event Capture Systems (ECS), a North Carolina–based industrial automation company specializing in machine vision for pulp and paper. With more than 25 years of experience in mill environments, Brian began his career in the mid-1990s working on some of the first digital industrial video systems in the United States. Since co-founding ECS in 2008, he has led the company’s growth across quality control, edge computing, and AI-driven safety solutions, helping major paper producers modernize legacy assets, reduce downtime, and improve worker safety across North America and Europe.
Useful Links:
Brian Mock:
- Brian Mock on LinkedIn
- Event Capture Systems Inc Website: ecsuptime.com
- Event Capture Systems Inc on LinkedIn
- TAPPICon 2026 Program
Episode Chapters
Click on the chapters to access the relevant sections of the transcript below.
Chapter 1 — From VCRs to Vision Systems
Brian looks back to the mid-1990s, when industrial troubleshooting meant rewinding VHS tapes, and shares how early digital frame grabbers and pre-event recording laid the groundwork for modern machine vision in pulp and paper.
Chapter 2 — Why Paper Is a Perfect (and Painful) Vision Use Case
A deep dive into the realities of a 200-year-old industry running on 50-year-old machines, razor-thin margins, and massive raw material inputs — and why even a 0.01% efficiency gain can mean millions.
Chapter 3 — Edge Devices vs. “Server Room” Vision
Brian explains the architectural shift from siloed, server-heavy camera systems to edge-based smart devices that process data at the source, reduce failure points, and make AI practical on the mill floor.
Chapter 4 — AI for Industrial Safety: The Red Zone Revolution
Inside ECS’s large-scale safety deployments with International Paper, where edge-based human detection integrates directly with PLCs to prevent near misses and fatalities in real time.
Chapter 5 — The Evolution of Machine Vision in Pulp & Paper
The conversation closes with how ECS helps mills upgrade incrementally — from chip quality inspection to roll handling — using better data, smarter lighting, and pragmatic AI to increase uptime, protect margins, and save lives.
Episode Transcript
Brian Mock – ECS
Ultimately, paper is a low margin, high volume business. If you don’t make it right, you can’t sell it. There’s a lot of energy you’ve already spent to top the tree, grind the tree, put the energy into it. And then if you can’t get top dollar, you lose on revenue, but you’ve already invested all of that time into it.
Josh Eastburn – Host
Welcome to the MV Pro podcast. Today, we’re taking a deep dive into a part of the vision market we haven’t talked about before. To help us understand how machine vision, edge computing, and AI are transforming the world of pulp and paper production, I’m joined by Brian Mock, President and co-founder of Event Capture Systems. Brian has spent more than 25 years in pulp, paper, and industrial automation, starting in the mid-1990s on one of the first industrial video technology teams in the US. Since then, he’s become a national leader in machine vision for mills, designing, implementing, and scaling systems that reduce downtime, protect margins, and now deploy AI-based safety solutions. In 2008, Brian and his friend John Larkin co-founded ECS with a long-term mission to build technology that actually works in the real world and to stand behind it with service that never stops. Today, he leads multidisciplinary teams across R&D, sales, engineering, manufacturing, and customer service, helping North America’s largest paper and nonwoven producers modernize without ripping and replacing entire plants. If you work in pulp and paper, you’ve probably felt the pain of aging vision systems, siloed hardware, and flip phone technology trying to keep up with modern demands.
Josh Eastburn – Host
In this episode, Brian and I talk about how edge devices are changing the architecture of machine vision in mills, why AI is finally moving from trade show theory to real deployments, and how projects like ECS’s work with International Paper are raising the bar on industrial safety. We also get into practical war stories from the woodyard to roll handling, and talk about how better data on near misses can literally save lives. With that said, let’s get rolling.
Josh Eastburn – Host
I know we will talk about ECS more, but I do like to always start with a little bit of background, so people know where you’ve come from and how you came about your expertise, right? As technical people, we want to know how you earned your stripes. You mentioned the mid ’90s. Let’s go back there for a second. You and your partner, you’re working together. How does the idea for Event Capture Systems come out of that?
Brian Mock – ECS
We both graduated college around the mid ’90s. We got our first job at a small company in North Carolina called Carotek. They were actually the first company to come out with a digital solution to replace VCRs in the packaging world. If you wanted to go back and see how a paper envelope was shredded by the US Postal Service, you’d actually have a little lipstick camera connected to a VCR. You’d hit Stop and Rewind, and that’s how you’d do this pre-event troubleshooting. It was done with VCRs in a lot of places. Carotek digitized that with a frame grabber. I mean, a lot of people don’t even know what a VCR is now. So John and I were among the first service and sales engineers to go out. We did a lot of work with Procter & Gamble at the time, and so we worked together for several years. And then paper found out about this device that’s able to help them understand the cause of paper breaks, and paper breaks in the paper industry are a very expensive event. So whatever data they can gather, in this case visually, helps them understand: I know it broke here, but what did it look like 15 seconds before that at some other part of the same process?
Brian Mock – ECS
That’s called upstream, and it involves this thing called pre-event. And if you’re doing it by VCR, you’ve got lag time, you’ve got degradation, you’ve got blind spots while you’re reviewing the video, and you can’t synchronize. I mean, all these concepts back then were really novel and unique. Now it can be done on your cell phone. So we did that for many years, and then I left to go work for a competitor. John stayed with that company. The company had many divisions, and one happened to be video technologies. They were mostly into engineering services, so this wasn’t really a big deal for them. But when paper exploded, they said, this is great. I was gone for about eight years, and then I called up John in 2008. Event Capture Systems existed as a name, and it was owned by Carotek back in the day. So I left, and then I called him back and said, Hey, man, the company I’m working for is hiring. Why don’t you come with me? And the cat becomes the mouse, the mouse becomes the cat. John says, Hey, maybe there’s a chance that you and I could get together and buy the division from Carotek.
Brian Mock – ECS
I was like, Well, I don’t know much about running a business, but it’s an opportunity you can’t pass up. So I said yes. And John and I mortgaged everything we could and purchased the division, with some ownership retained by the parent company at first; that’s no longer the case. And that was in 2008. Josh, it’s been one heck of an adventure deploying these quality control solutions in this paper industry, having done it for so long, both as an employee and now as an owner, and working with the same guy for almost 30 years.
Josh Eastburn – Host
Yeah, man. It’s the American dream story right there.
Brian Mock – ECS
Fun. And now we’ve grown. So we can talk about the days way back then, and now it’s the same, but yet very, very different.
Josh Eastburn – Host
Yeah. And all of that time, from the beginnings to the work that you’re doing now, is there one project in particular that stands out as a favorite? Maybe it’s a war story. Maybe it’s the revolution that we’re in now. But yeah, what comes to mind?
Brian Mock – ECS
Well, just real quick, here’s the backdrop of the playground that we get to be in: the pulp and paper industry has been doing things the same way for almost 200 years. They’re still making it the same way. The data suggests that over 50% of the paper machines today are over 50 years old, but there’s more pressure now than ever to increase efficiencies, reduce waste, and improve industrial safety. So machine vision has always had a seat at the table in order to help increase OEE. But now you can’t run a machine without really getting pretty deep into machine vision. So we’ve seen a transformation, from nice-to-have to I’m-not-running-anything-unless-the-camera-system-is-working. And it’s not just on the paper machine now. It’s from the chip all the way to the finished product. So that’s been really fun. But your question was, what’s the single coolest thing? Of course, it’s been fun having all of that exposure across these individual quality controls. The best one, though, is the deployment of machine vision and AI to increase workplace safety. I mean, it is a game changer, because industrial safety is based on self-reported incidents.
Brian Mock – ECS
Go back and do some reading on it. It starts around 1930 with the Heinrich triangle, which says there is a certain correlation between unsafe behavior at the bottom, near misses in the middle, and, at the top, unfortunately, what could be a fatality. The problem is that, until camera systems, all of those near misses were self-reported. So it’s like a safety iceberg. You don’t really know all the data, and you can’t always make the right decisions on partial data. So these safety systems that we’re now deploying in the paper industry are providing the data, man, and they are making a difference in how the operators interact and how management understands how incidents happen. You can just see the puzzle pieces fitting together.
Josh Eastburn – Host
So the goal with that is to move beyond self-reporting and actually have data where you could say, this is what happened, this is how many actual near misses we have, that kind of thing. Yeah?
Brian Mock – ECS
There’s a lot of video. I mean, the system knows when to record, it knows when humans enter the red zone, and it knows what the PLC’s truth table says. Our point is that it’s usually not bad behavior that’s going on. What we find is that it’s miscommunication: a new version of the machine, or an SOP that wasn’t followed, or a button that never got moved over there. Management just doesn’t understand or doesn’t realize, and operators will do what they have to do to do the best that they can. So a lot of it’s a communication gap that the camera-based safety systems fill in. And bear in mind, mills still have gates and pressure pads and lasers. This is a layer of safety. It’s not replacing that. It’s an additional data layer, an additional stream. But it fills in those gaps so you can have very real-world conversations: Hey, man, I’m curious, why did this happen that way?
Josh Eastburn – Host
Yeah, what did we miss? So they have a clear view into what the reality is on the floor, right?
Brian Mock – ECS
Right. Better data. I mean, we know that data is the new oil; better data makes better decisions. And in this case, it saves lives.
Josh Eastburn – Host
Yeah. Fantastic. So I’m interested in talking, too, about the bigger picture of pulp and paper that you mentioned. We haven’t talked about that on the show before, and I’d like to unpack it a little for listeners so they can understand some of the unique applications that you see. You mentioned a little bit about why you think vision has gained traction in the industry in general, but maybe walk us through some of the typical steps in the process and how machine vision comes into play there.
Brian Mock – ECS
So paper comes from a natural product, from a tree. The process goes from growing the tree, to harvesting it, to making the chip, to making the slurry, to putting it on the machine, to making what’s called a parent roll. And then eventually you do some things to make it a saleable product. That process, from when it arrives at the paper mill as a chip to when it ends, can be several days. So it’s really hard to correlate between different processes. They’re all silos. But as the information capture gets better, you can start to understand that the tail is not wagging the dog. And machine vision is one of many tools that they can now use in a better way to really get some better understanding of variance. That’s what kills the process: too much variance and you’re outside of quality limits. And ultimately, paper is a low margin, high volume business. If you don’t make it right, you can’t sell it. And there’s a lot of energy you’ve already spent to top the tree, grind the tree, put the energy into it. And then if you can’t get top dollar, you lose on revenue, but you’ve already invested all of that time into it.
Brian Mock – ECS
So paper is unique in that way, and as I mentioned, it’s a technology that has done the same thing for 200 years. So how do you modernize it with machines that are 50 years old? It has been a lot of fun. It’s adding another element, too, and I’m kind of going everywhere here. But back in the day, Josh, in the ’90s, they would only put these machine vision systems on the paper machine, the most expensive asset, right? Because that’s where the downtime was the most expensive, and because machine vision was pretty expensive back then as well. It was often custom-made hardware, custom-made computers. So you had custom-made applications, frame grabbers, servers that were very custom. We call them silos of technology, where after five years, man, you basically throw it out and you start over. I mean, I’ve literally been to mills where there’s an interface cabinet from 20 years ago, another interface cabinet from another vendor 15 years ago, another one from 10 years ago, and another one, I’m not kidding, from five years ago. They ripped it out every time. And it’s like, can we please stop the madness and find a way so that the technology doesn’t get obsolete?
Brian Mock – ECS
And you can build it such that as technology improves, you can take the legacy hardware, if possible, and bootstrap it to the new software, because you’ve got operating systems, you’ve got computer hardware, you’ve got the tip of the spear, the camera. They all have to work together. Any system is better than none, Josh. But the point is, the better the system, the better the data that you get. So today’s systems are virtualized, which is awesome, because if you look at the failure modes, the data suggests that after a couple of years, something like 80% of the failures these quality control systems have are computer related. And that’s really troublesome. A little ironic, Josh: the quality control product is supposed to be the checker, but it’s the one that’s having the problem. So as a pulp and paper maker, you’re really upset that you’ve invested this money to have quality control, but the quality control system itself is also not on its best day.
Josh Eastburn – Host
Okay, so sorry to interrupt there, but it sounds like there’s a couple of different trends that are coming together here. So we’ve gone from a state where we’ve got this semi-continuous process of material moving through from raw to a finished good. We’ve got real narrow margins, so quality, especially upstream, is really important. And what I’m understanding from what you’re saying is the industry has responded by investing more and more heavily in machine vision over time, right? Gone from just critical asset monitoring to more and more, can we do the upstream quality control? Am I following?
Brian Mock – ECS
Yeah, 100%. Okay, cool. What you’re adding to that, too, is that the cost of implementation, the total cost of ownership, has gone down because the technology is less proprietary, less specialized. Virtualization, like I say, has helped. We’ll talk about edge devices, which have completely blown it up in certain ways, too. So you start with the woodyard: it’s a mile that way, man. So even running a piece of Ethernet out there is a nonstarter, okay? The edge device has to make the decision. If you’re going to do closed loop control, it has to communicate out there with its control system. Traditional systems just don’t work well because of all the data cable that you have to run. And then you go into the pulping and the paper machine, you go into the converting processes, you go into the finished product handling. We can deploy better strategies everywhere that need less maintenance and are less complicated, in order to do really a couple of things. And it’s interesting: it reduces human-based quality control. And we know about this in machine vision. I mean, articles have been around forever, and there are these cost brackets that talk about doing that.
Brian Mock – ECS
And we’re seeing a transformation where you can get better decision making, more consistent, 24/7, at a price point that’s very acceptable, and it’s accurate and it’s precise. And you can do it, oftentimes, with an edge device. You asked earlier what I was on before this; it was a really unique call with a mill that is having a problem with… It’s like they make the chip and they spend all the money and they have the massive roll, and then there’s a little roll, and that’s what they wrap up and ultimately put on a pallet to sell at Walmart, just as a generality. But at the very end, the core isn’t quite right sometimes, because it’s a human process at times. It’s very rare, but they can’t sell the roll if the core isn’t right. And they’ve already done all the work. Every time we go to these mills, it’s either a human that’s watching it, like, it looks okay, or there’s no human at all because it’s never supposed to be wrong. So now, because it’s an edge device and it’s easy to deploy, it’s about 100% accurate. I mean, we’ve done the stats on it.
Brian Mock – ECS
We can now go to mills and be like, you have 100% traceable quality control before it goes on the truck. And obviously, you can use that data to keep the problem from happening to begin with. So customer retention is preserved, OEE goes up, waste goes down. I mean, all of those things are happening because you can use quality control to take the human out of the loop from the very beginning to the very end of the process, not just where you’ve got the bottlenecks, which traditionally, again, has been at the paper machine.
Josh Eastburn – Host
Yeah, because the other trend I was hearing you describe is that we, and when I say we, I mean the industry, I guess, have accepted that we need vision systems in order to improve margins. But at the same time, you’re seeing all these, I don’t know what the right word is for it, but you’re seeing a failure in the quality of these systems over time. They’re not aging well. You have these multiple generations. And that’s what I’m wondering: are you seeing pushback from people? Are you seeing a demand that needs to be filled that led you to find this niche for an application of edge devices?
Brian Mock – ECS
So you’re bringing up a great point, and it’s good in the context that the efficiency of the process is still quite high. But since it’s low margin and high volume, you can increase efficiency by 0.01% and your returns are extremely high. For instance, you have a tree, you grind it up into chips, then you put those chips into a digester. Roughly 50% of the weight of that chip turns into lignin, which is used as fuel. It doesn’t go into the paper. But if you shift that ratio of usable fiber to non-usable fiber to just 51 and 49, it could be millions of dollars that you’ve saved, just right there. There’s no online, real-time quality control for chips, by the way. That’s something we have at ECS, using AI with edge devices, to give you something as simple as: what is the shape and thickness of that chip when it’s moving at 300 feet a minute on a conveyor line that’s 38 inches wide? Here’s another fun fact for you. In paper, the most expensive part of the papermaking process is the raw material they buy.
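Brian’s yield point can be made concrete with back-of-envelope arithmetic. The chip tonnage and pulp value below are assumed round numbers chosen purely for illustration; they are not figures from the episode.

```python
# Illustration of how a one-point yield shift compounds at mill scale.
# The tonnage and value figures are hypothetical round numbers, not
# data from ECS or the episode.
chips_tonnes_per_day = 5000      # assumed large-mill daily chip intake
pulp_value_per_tonne = 500       # assumed $/tonne of usable fiber

def annual_fiber_value(yield_fraction):
    """Dollar value of usable fiber produced per year at a given yield."""
    return chips_tonnes_per_day * yield_fraction * pulp_value_per_tonne * 365

# Moving from a 50/50 fiber/lignin split to 51/49, as in Brian's example:
gain = annual_fiber_value(0.51) - annual_fiber_value(0.50)
print(f"${gain:,.0f} per year from a 50% -> 51% yield shift")
```

Even with modest assumed numbers, the one-point shift lands in the millions per year, which is the "don’t have to move the needle very far" effect Brian describes.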
Brian Mock – ECS
It’s more than the people, it’s more than the energy and all that. It’s the wood chip. Your average large paper mill fills up almost 20 Olympic-size swimming pools of chips per day. That’s how much mass of chips goes in. Okay, wow. How much of those chips actually runs through some quality control device? A couple of gallons, maybe, because you can do an offline tester.
Josh Eastburn – Host
You can just sample it. Okay. Yeah.
Brian Mock – ECS
You’re still going to make paper, but what you’re going to do is dose the chemicals at their highest. You’re going to use as much energy, as much steam. You’re going to get usable fiber, but you’re going to waste some, too. So a lot of the paper industry’s drive to get as efficient as possible with the tools they’ve had available has reached its limit. There are new control systems, but when they get some of this new machine vision, they’re like, wow, I didn’t know I could do that. So the point is that you don’t have to move the needle very far in order to have dramatic effects on the profitability of the paper mill.
Josh Eastburn – Host
Okay. I would love to dig into that now because the way you’re doing that, you’ve mentioned a few times now, is through the application of edge devices and AI. I’d really love to get into the nitty-gritty of what that means. Can you just, first of all, define what you mean by an edge device for somebody who isn’t familiar with that term?
Brian Mock – ECS
Josh, I’ll define what it is and then compare it to what we would do before we had access to the technology. The edge device is the component that sits as close to the process as possible. It’s right there at the paper machine. It’s the sensor. It’s going to get the data in, it’s going to process the data, and it’s going to make a decision based on what it’s trained to do. And then that device has I/O that can report directly to, typically, a PLC, a control system, a DCS, something like that, which integrates that data with other data to make a decision that makes the system run better. It can be closed loop. It can also involve a human, where an alarm goes off and the human may make the final decision on what ultimately is done. Either way, there’s a piece of data generated at the site, and it’s communicated directly to a larger body of information to make the decision. Okay. Now, let me compare that to what I would have done 10 years ago or five years ago.
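The acquire, process, decide, report loop Brian describes can be sketched in a few lines. Everything here is a hypothetical illustration, not an ECS or vendor API: `classify` stands in for the trained model, and the "PLC" is just a list receiving decision tags.

```python
# Minimal sketch of the edge-device loop: acquire at the source, decide
# locally, and report only the decision downstream. All names here are
# hypothetical illustrations, not ECS or vendor APIs.

def classify(frame):
    """Stand-in for the on-device model: flag frames whose mean
    brightness falls below an assumed trained quality limit."""
    mean = sum(frame) / len(frame)
    return "alarm" if mean < 50 else "ok"

def edge_loop(frames, plc):
    """Process each frame at the edge; send only decisions to the PLC."""
    for frame in frames:
        decision = classify(frame)   # inference happens on-device
        plc.append(decision)         # tiny I/O message, not raw video
    return plc

plc_tags = edge_loop([[200, 210, 190], [10, 20, 30]], [])
print(plc_tags)  # ['ok', 'alarm']
```

The key architectural point is in `plc.append`: what leaves the device is a few bytes of decision, not gigabits of raw frames, which is what makes the woodyard-a-mile-away case workable.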
Brian Mock – ECS
I would have had a camera out there. It would have been a raw camera, meaning there’s nothing on the camera other than converting the signal to gigabit Ethernet, something that transmits data. But it’s transmitting all of the frames, all of the resolution, to a 42U stack cabinet that’s probably getting some exposure to H2S, in a control room that’s not supposed to have that, but it does. And you’ve got 12 computers stacked up, because each computer may talk to one camera, with thousands of watts going in there. So you’ve got all these failure points. You’ve got the transmission failure. You’ve got the legacy computers that are going to die in probably five years or sooner. They’re running some OS that may or may not still be supported. And then that thing ties into the control system. So you’ve got all these points of failure, and it is a silo of technology. It is a total cost of ownership nightmare. And here’s really what the mills do: they just grin and bear it, because they have to have it, but they hate it, because when it breaks, it’s a proprietary frame grabber, all these things. No one wants to deal with the camera system, because no one knows the magic sauce that went into it. I mean, a couple of years ago, if you were to ask a pulp and paper manager what some of the most frustrating technologies they have to deal with are, a camera system may come up in that conversation.
Josh Eastburn – Host
Oh, man. Okay. So the camera system really becomes the smoking gun.
Brian Mock – ECS
Well, the other thing, too, is that these cameras are in very nasty environments. There’s steam. There’s stock flying everywhere. You’re having to see through a whole bunch of debris in order to make these image processing decisions. So the environment is absolutely awful. So, to wrap up that part of it: notice I said you’ve got this raw camera and it’s sending all this data over to the server. Sending all of that is like running a tractor trailer on a dirt road at 100 miles an hour. It has potholes. It’s hard to do, but you’ve got to do it. What would be better is if you could get all the data, and more resolution, by the way, because if you do gigabit Ethernet, guess what? You’re limited to a gigabit per second. That’s all you can do. So you can only have so many pixels, only so much data. Your data is already limited. But wait a minute, maybe I install a 10 gigabit backend per camera. Well, let’s not do that, because the expense to maintain that in a paper mill is insanely high.
Brian Mock – ECS
I mean, the technology is available, but it doesn’t make sense. Also, the distance between camera and computer may be a thousand feet. An edge device, on the other hand, can process more data because it’s doing it at the source. It doesn’t have to transfer all the data to the server. It doesn’t have to deal with a server at all. And then finally, you can bring some inferencing models into that AI edge device, depending on the quality of that edge device. And you talk about failure: well, if the camera fails, what do you do? You go grab another camera out of stores, you load up the program again, there may be a little bit of configuration, and boom, you’re off solving your quality control problem. Or if you want to run the quality control at another point of the process, you take the camera down there. You don’t have to drag the Ethernet cable or all the other infrastructure that’s required for the traditional raw-camera-to-massive-server-farm setup. And ultimately, it’s the decision that matters. So maybe there’s only a very small communication that comes out of that camera, because it’s making the decision, obviously, on the edge.
Josh Eastburn – Host
Yeah. Okay. So that’s the distinction. And we’re consolidating or we’re co-locating all of that processing and decision making into a single device. Is that right?
Brian Mock – ECS
Yeah. And if done correctly, it will be able to process more bit rate, more pixels, and more frame rate than you could with a raw signal, because of the data transmission limits. I mean, it could be 50 cameras at 100 to 200 frames per second, doing HD resolution, all at the same time. They all need to be making decisions within the frame they captured, by the way. Okay, so there is post-processing, of course, but a lot of these decisions are made in the frame where they see it. That’s a lot of bandwidth, man. If you want to truck all that to a bunch of servers, which we had to do, that is one option; we’ve done it for 30 years. The edge is a different way to do it, and it has all the advantages and very few cons. I mean, the cost of the camera is typically quite low because of the technology. What I’m saying is that when people ask me about the technology, I’m like, Hey, how’s that cell phone you’re using? Are you still using a flip phone? In some instances, the paper industry is still using a flip phone.
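The bandwidth numbers Brian cites can be sanity-checked with quick arithmetic. The pixel format below is our assumption (8-bit monochrome 1920x1080), chosen as a conservative baseline; color or higher bit depths would only make the gap worse.

```python
# Back-of-envelope for the raw-transport problem Brian describes.
# Assumptions (ours, for illustration): 8-bit monochrome HD frames,
# 150 fps as the midpoint of the 100-200 fps range mentioned.
CAMERAS = 50
FPS = 150
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 1  # 8-bit mono

bits_per_camera = WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * FPS
gbps_per_camera = bits_per_camera / 1e9
total_gbps = gbps_per_camera * CAMERAS

print(f"{gbps_per_camera:.2f} Gbit/s per camera")  # already over GigE's 1 Gbit/s
print(f"{total_gbps:.0f} Gbit/s for the fleet")
```

A single camera at these assumed settings needs roughly 2.5 Gbit/s, well past the 1 Gbit/s gigabit Ethernet ceiling Brian mentions, and the 50-camera fleet would need over 100 Gbit/s of raw transport, which is why deciding on-device and sending only results scales where raw streaming does not.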
Josh Eastburn – Host
I 100 % believe that. Yes, I get the analogy. Oh, yeah. I think most consumers would be surprised to understand how many industries, for example, are still running on paper. They still have people walking around and tracking things manually. Yeah, you’re right.
Brian Mock – ECS
The analog style. I know you’ve probably got a lot of podcasts coming up about the AI transformation and the Internet of Things, and it’s pretty cool, I think, to see it in action. A couple of years ago, we’d go to the trade shows in paper, and 50% of the talks were all about AI, but it was all theoretical. Now we can go back, and one of the things we’ll talk about is a case study of how AI is actually being used, not just thought about, and making concrete differences, not only in quality control but also in human safety.
Josh Eastburn – Host
Yeah, I’d love to jump into that. Just one clarifying question before we do. A term that some of our listeners will probably be familiar with is smart camera, right? Is there a parallel here to what you’re talking about, or is there a difference? Are they the same thing?
Brian Mock – ECS
I would say they’re the same in these ways. A smart camera, in my opinion, is anything that does more than just acquire the data. It does something more. It could compress the data; some people call that a smart camera, because that’s pretty important, too. Your data highway can still be pretty restricted, and yet you can decompress at the other end. But to take it one step further, that smart camera, I think, is more like an Internet protocol camera, or IP-based camera, where it’s sending data over Ethernet of some sort, and it incorporates capture, storage, potentially acquisition and compression, and also some analytics. That’s the total package that we would define as a smart camera.
Josh Eastburn – Host
Yeah. So it’s really a change in the architecture. The focus, like you said, is doing as much of the data capture, processing, and decision making at the edge, close to the process, rather than trucking all that data back to the server room and back and forth, right?
Brian Mock – ECS
And there may be instances where we’ve got to truck the data back. We’ll talk about that in a minute. But the idea is that the cameras keep getting better, and like your cell phone, it’s incredible the tools that we have at our disposal that we didn’t have three or four months ago. I mean, there’s bleeding edge and there’s cutting edge; we get to do both. And some of the technologies we know are out there translate really well into just moving that stuff closer to the process.
Josh Eastburn – Host
Great. With that as a foundation, I’d love to hear about the project you’re doing with International Paper. They represent such huge influence in the market; they’re like a benchmark, maybe, is the way to say it. I’d also love to hear what you think that indicates about where the industry is going.
Brian Mock – ECS
It’s been an honor to work with IP. From the proof of concept study, we were chosen as the vendor to then deploy safety-based systems across 100% of their assets. Andy Silvernail, who’s the CEO, just made a statement that we will not tolerate any more fatalities, or near misses either. Unfortunately, in the paper industry, if you look at the data, people die every single year, and these are all avoidable. So they did some analysis on where those potential high-risk areas are, and about two years ago we started on a proof of concept study. What’s unique is, as far as I know, it’s the first time, we’ll just say in paper, that an edge device has been used to do red zone human detection and then, in that same device, communicate that condition to what’s called the PLC, which you can think of as the brain of the paper machine. The PLC takes that statement that there’s someone in the red zone, which is one sensor, and matches it against other conditions: is the machine on, is the gate open, is the gate closed. It’s called a truth table, and that’s what actually stops the machine.
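The truth-table interlock Brian describes could be sketched roughly like this. This is an illustrative sketch only: the function name, the specific sensor inputs, and the stop condition are assumptions for the example, not IP's or ECS's actual logic.

```python
# Hypothetical sketch of a red-zone "truth table": the human detection
# is just one sensor input; it is combined with other machine states
# before a stop command is issued to the PLC. Inputs and logic are
# illustrative, not the deployed system's actual conditions.

def should_stop(person_in_red_zone: bool,
                machine_running: bool,
                gate_open: bool) -> bool:
    """Return True if a stop command should be sent.

    A detection alone is not enough: if the machine is already off,
    or the safety gate is closed so no one can reach the hazard,
    no stop is issued.
    """
    return person_in_red_zone and machine_running and gate_open

# Walk the full truth table of sensor combinations:
for person in (False, True):
    for running in (False, True):
        for gate in (False, True):
            print(person, running, gate, "->",
                  should_stop(person, running, gate))
```

In a real deployment this evaluation lives in the PLC's safety logic, and, as Brian notes, it has to account for false positives, false negatives, and single points of failure, none of which this sketch models.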
Brian Mock – ECS
It’s the first time, I think, in paper. And what’s unique is it’s edge, and the speed at which it’s done is under a second; in some cases, it’s a tenth of a second. There are all these elements of the solution you’ve got to consider: false positives and false negatives, which is really accuracy, the ability to communicate to the PLC, single points of failure. If it doesn’t work, what happens? Does the machine run? With all of these considerations, they pioneered the layers of safety concept to make it something that is now standard on every one of their machines. I mean, that’s a lot of teeth cutting right there.
Josh Eastburn – Host
It’s amazing that they’re using that technology to raise the standard for the whole industry, right? I mean, this will set a pattern that others are going to need to follow.
Brian Mock – ECS
100%, Josh. Yes. And it’s not only going to be in these high-risk areas. Because it can be deployed pretty easily on edge devices, it can be, again, everywhere from the woodyard all the way to the shipping department. It’s not just, in this case, on a part of the machine that historically showed the highest risk, but in every location. Because ultimately, as we know, we don’t know all the behaviors. But this brings them to the surface, so then we can understand what was maybe even a near miss to a near miss, and all those other translations of, well, how can we do this better?
Josh Eastburn – Host
Are you able to talk about the scope of this project? How many sites are we talking about?
Brian Mock – ECS
Yeah. Generally speaking, you can see online that IP has roughly 25 paper machines still running in the United States, so we’re going to call that the asset-centric 25. The paper machine has a couple of different processes around it, so you’re averaging anywhere from 15 to 30 cameras per asset, maybe less, because you’re going to look at the paper machine, the winder, roll handling, the highest potentials for an interaction with a machine that’s not going to end well. But they’re going to expand that, I’m sure, to the wet end of the paper machine, roll handling, the woodyard, which is where you have big trucks driving around. There are not a lot of people out there, but still, it only takes one; you don’t know what you don’t know until it’s a problem. So it’s several hundred cameras, I think, is the way to answer that, deployed across their North American operations and some in Europe. There’s not a paper machine, a winder, or a roll handling system that’s not going to be covered by another layer of safety.
Brian Mock – ECS
And in this case, it may be the only layer of safety. For roll handling, where it’s just a roll going down a conveyor line at two miles an hour, there may not be a gate around that. But now these camera systems can also provide a three-dimensional understanding of human presence in that area. It’s a rich data set, absolutely rich. There’s not going to be a single one that doesn’t have this protection.
Josh Eastburn – Host
Fantastic. Well, thanks for sharing that with us, and thanks for your time today. I would like to let people know where they can learn more about Event Capture Systems. So where can we find you online, first of all?
Brian Mock – ECS
So it’s ecsuptime.com; that’s our website. We have a new one coming out. It’s fabulous. It’s got a good description of the layers of machine vision we do, from the top, which is industrial safety for everything, down to the particular points of machine vision quality control from the beginning of the process to the end. And within there, you’ll also see that we’ve got our biggest trade show coming up in April. It’s in Columbus, Ohio, April 26th to 27th. It’s called TAPPICon, and I’ll be giving a paper on case studies and results of industrial safety using edge devices. It’s going to be so much fun. We also do a lot of work on LinkedIn, Josh. I like doing what I call chalk talks, where I’ll take one thing, like: real quick, steam in the paper industry is a real big problem, right? We can build the best machine vision solution in the world, but if it can’t see through steam to see what it’s looking at, all the money you invested in the sensor and the camera and the data backbone and the edge or whatever it is, is absolutely worthless. It’s zero, because you can’t see anything. So what if you’re in a steamy environment? We know there are technologies that can see through steam pretty well, like NIR. It’s a longer wavelength of light, so it minimizes the effect of steam; white light just balloons it out. But the problem is, if operators still have to walk in that area, the human eye can’t see NIR light. So if you need process lighting and you need what I’ll call machine vision lighting, they don’t work together. So many times we used to say, all right, we’re on the wet end; if the machine was two feet from you, you couldn’t see it, it’s that steamy. And so we put in a really sophisticated NIR light that’s synchronized with the camera, and it’s doing great.
And then some maintenance guy comes over behind you and sticks a halogen light there, because it’s a safety problem if you can’t see: you’ve got a machine there that’s going to kill you if you get too close, or you could stumble and fall down a ledge or something. So we developed a technology, patented to us, where the camera is only on for part of each cycle, and within a second you’ve got a full cycle.
Brian Mock – ECS
So there’s the camera time that absorbs the NIR energy. And then in that same cycle, another light, white light or broad spectrum, also fires. So the human is perfectly happy, and the camera also gets data that’s not corrupted by white light. So end of day, what I’m getting at is, that’s another chalk talk that we give. Everyone hates steam, because it renders every machine vision system at its worst; it’s zero value. And so that’s the thing.
Josh Eastburn – Host
That’s incredible, though. I’m sorry to interrupt. You’re saying the two types of light are synchronized and alternating. Is that right?
Brian Mock – ECS
Correct. It is. And you’ve got these multimillion-dollar, well, maybe not that much, systems that don’t have any value, and all you do is change the light, right? And then that solves the problem. So those are the things ECS can bring to the table, because we’ve been in the industry for so long. I love walking around a paper machine. We’re a people, process, then technology-based company. So you’ve got to be in the paper mills. You’ve got to be able to talk to the people to get the frustrations and the pain points. And then you come up with stuff like that, that literally changes the way these systems work, so they continue to deliver what people expect, with something as simple as a light.
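The alternating light scheme Brian describes can be sketched as a simple timing model: within each short cycle, the camera exposes only during the NIR pulse, and a broad-spectrum pulse fills the rest of the cycle for operators. All timing values and names below are invented for illustration, not ECS's patented parameters.

```python
# Illustrative timing sketch of alternating NIR and white-light pulses.
# The camera exposure window coincides with the NIR strobe, so the
# white light that operators need never corrupts the captured image.
# These durations are made-up example values.

CYCLE_MS = 33        # one full cycle, roughly 30 cycles per second
NIR_PULSE_MS = 4     # camera exposure window, NIR strobe on
WHITE_PULSE_MS = 25  # broad-spectrum light for human visibility

def light_phase(t_ms: float) -> str:
    """Return which light is firing at time t_ms within the cycle."""
    phase = t_ms % CYCLE_MS
    if phase < NIR_PULSE_MS:
        return "NIR + camera exposure"
    if phase < NIR_PULSE_MS + WHITE_PULSE_MS:
        return "white light (operators)"
    return "dark gap"

# Sample a few instants across two cycles:
for t in (0, 2, 10, 30, 35):
    print(t, "ms ->", light_phase(t))
```

Because the white pulse dominates each cycle and the cycle repeats many times per second, the area appears continuously lit to the human eye, while the camera sees only its own NIR illumination.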
Josh Eastburn – Host
Very cool. That’s the stuff you’re talking about on LinkedIn, or did you say it was YouTube?
Brian Mock – ECS
Well, it would be on YouTube as well. But LinkedIn still is our number one spot for having those quick two-minute videos. Like, this is a problem. This is how it’s solved. I’m on a chalkboard writing it out in hopes to… So much of what we do is transference of knowledge.
Josh Eastburn – Host
And TAPPICon, you mentioned earlier, is that a conference that’s specific to Pulp & Paper?
Brian Mock – ECS
Yeah, it’s about the only large conference left, so it’s well attended. It’s again in Columbus. There are other pulp and paper shows overseas, but in terms of the US market, it remains the largest one. It’s going to encompass tissue...
Josh Eastburn – Host
We’ll include some links to that then.
Brian Mock – ECS
...and cardboard and some nonwovens. There are a lot of different segments within that show. And it’s a technical conference as well, so there are papers given.
Josh Eastburn – Host
Okay. Including one that you’ll be delivering, right? That’s right. Cool. Okay.
Brian Mock – ECS
It’s on red zone detection and how it increases worker safety and PPE compliance, or some very long title like that.
Josh Eastburn – Host
Right on, man. Okay. So we’ll drop some links to that so people can follow up with you and hopefully shake some hands at the conference. Thank you so much for your time. This has been super interesting. Like I said at the beginning, this is unique; we haven’t taken a deep dive into pulp and paper before, so I’m really excited to share this with people.
Brian Mock – ECS
Hopefully, there’s more podcasts to come, Josh. All right? Yeah. Hey, all right.
Josh Eastburn – Host
Thank you so much.
Josh Eastburn – Host
Thanks to Brian Mock and the team at Event Capture Systems for taking us inside the reality of the pulp and paper industry. If you’d like to learn more about ECS, visit ecsuptime.com and check the show notes for links to their other channels as well. As Brian mentioned, he will be speaking at TAPPICon, that’s T-A-P-P-I-Con, in Columbus, Ohio, April 26th to 27th, where he’ll be sharing case studies and results from deploying edge-based safety systems in live mill environments. You can find links to that as well in the show notes. To keep up with how machine vision is evolving across many different industries, from pulp and paper to robotics and beyond, follow MVProMedia on LinkedIn and visit mvpromedia.com for all of our coverage. For MVProMedia, I’m Josh Eastburn. Until next time. Be well.