Painting robots can save time and waste, and the technology is not as far off as you may think.
At the International Manufacturing Technology Show (IMTS) in Chicago earlier this month, I had the opportunity to share with clients and potential clients a new automation application that combines good, solid here-and-now technology with some not-quite-ready-for-primetime emerging technology that is rather exciting. The application involves painting robots coupled with laser scanning and non-contact encoding.
The painting applications I have typically encountered come in one of two flavors: 1) fixturing a part and loading it into a booth (or onto a carriage in moving-line applications), or 2) hanging it on a rack to be conveyed through a paint booth. The former generally supports robotic painting applications, while the latter is used in manual operations. Both are accompanied by significant costs that can be eliminated with the right application of technology.
To understand this new application, first imagine a system that removes the need for the extensive fixturing robotic painting normally requires, eliminating both the cost of the fixture and the time spent fixturing and positioning the part. Then imagine a system that eliminates the inconsistency and waste associated with manual painting.
If you’re struggling to imagine how those costs can be eliminated while still producing a quality painted part, some insight into the enabling technologies will help.
Laser scanning is a powerful tool that can capture a 3D model of a part either while it sits stationary or while it is conveyed down a line. With this technology and properly developed algorithms, a system can be designed to identify parts (differentiating between them, so mixed parts can run on a common line) and determine their position and orientation in space, thereby eliminating the need for fixturing.
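To make the position-and-orientation step concrete, here is a minimal sketch of one common way to recover a part's pose from scan data: given points on the scanned part matched to corresponding points on a reference model, the Kabsch/SVD method computes the rigid rotation and translation between them. This is an illustrative technique, not necessarily the algorithm any particular scanning vendor uses, and the function name and inputs are hypothetical.

```python
import numpy as np

def estimate_pose(model_pts, scan_pts):
    """Estimate the rigid transform (R, t) mapping reference-model points
    onto scanned points, assuming correspondences are already known.
    Uses the Kabsch/SVD method. model_pts, scan_pts: (N, 3) arrays."""
    mc = model_pts.mean(axis=0)          # model centroid
    sc = scan_pts.mean(axis=0)           # scan centroid
    H = (model_pts - mc).T @ (scan_pts - sc)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = sc - R @ mc
    return R, t
```

In practice, the correspondences themselves come from a matching step (for example, iterative closest point), which is also where part identification happens: the model that registers with the lowest residual tells you which part is on the line.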
With this technology alone, many lines could be equipped with automated robotic painting: scan the part once it has been moved into the paint booth (or as it is conveyed in), use the resulting 3D model to define the robot's path, and then dispatch the robot to paint. When parts are being conveyed, all the major brands of robot include an auxiliary axis that can tie in the conveyor axis for seamless integration with the robot, making painting on a moving line relatively straightforward once the path is generated.
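The conveyor-tracking idea reduces to simple arithmetic: shift each planned path point along the conveyor's travel direction by the distance the part has moved since it was scanned, as reported by the conveyor encoder. In a real cell the robot controller's line-tracking option does this internally; the sketch below only illustrates the math, and all names and units are assumptions.

```python
import numpy as np

def track_path_point(path_point, conveyor_dir, encoder_at_scan,
                     encoder_now, counts_per_mm):
    """Shift a path point (defined at scan time, in mm) along the conveyor
    direction by the distance the part has travelled since the scan.
    conveyor_dir: unit vector of conveyor travel.
    counts_per_mm: encoder resolution (illustrative)."""
    travel_mm = (encoder_now - encoder_at_scan) / counts_per_mm
    return np.asarray(path_point, dtype=float) + travel_mm * np.asarray(conveyor_dir, dtype=float)
```

Because the correction is a pure translation at the conveyor's speed, this works as long as the part stays rigidly fixed relative to the conveyor, which is exactly the assumption that breaks down in the hanging-part case discussed next.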
The process becomes more challenging when we start considering parts that move relative to the conveyor as they are conveyed. Think of parts hanging from a chain as they move through a paint booth: they may sway or rotate along the way. To solve this challenge, we need to look at non-contact encoding, which provides the ability to track these movements in 3D space. The challenge here is doing it fast and accurately enough to close the loop with the motion of the robot as it is painting.
As difficult as this may seem, I am here to say that robotic painting as I’ve described here is coming soon and is not as far-fetched as you may think. We have already had good success in testing similar applications. The key is to incorporate a known target that does not move relative to the part, or to pick a target point on the part itself. The scanning system then locks onto this target and monitors its position and orientation, which translates directly into the part’s position and orientation. This motion is then passed on to the robot, which uses it to adjust its path accordingly, effectively canceling out the swaying or rotating motion. There are some constraints with speed, but the tools are already here, and with each new processor release from Intel, it gets that much better.
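The compensation step described above can be sketched in a few lines: since the target is rigidly attached to the part, whatever rigid motion the target undergoes between path planning and the current scan cycle is exactly the motion to apply to each planned tool point. This is a simplified illustration of the geometry, not our production implementation, and the poses here are assumed to come from the tracking system described above.

```python
import numpy as np

def compensate(point_nominal, R_ref, t_ref, R_obs, t_obs):
    """Re-express a planned tool point so it follows a swaying part.
    (R_ref, t_ref): target pose when the path was planned.
    (R_obs, t_obs): target pose observed on the current cycle.
    Returns the point moved by the same rigid motion the target underwent."""
    # Coordinates of the point in the part's frame at planning time
    local = R_ref.T @ (np.asarray(point_nominal, dtype=float) - t_ref)
    # The same local point, expressed in the part's current pose
    return R_obs @ local + t_obs
```

Run at the tracking system's update rate, this correction is what lets the robot hold a constant standoff from a part that is swinging on its hook; the speed constraint mentioned above is simply how fast the scan-to-correction loop can close relative to the part's motion.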
As an automation solution partner, it is important that we keep an eye on these emerging technologies and how they can be applied to benefit our clients. This may be a bit premature, but I would like to go ahead and invite you to come check out our booth in Chicago at IMTS 2016. As sure as I am to be there on booth duty, this technology is sure to be on display.