Machine vision and imaging webcast schedule for 2018
In addition to understanding the numerous disciplines required to design today’s high-performance machine-vision and image-processing systems, engineers must stay up to date with the latest technology, standards, and product developments as they become available.
One of the ways we try to help with this is through our monthly webcasts, which offer tutorial-based presentations that we hope help to replenish your knowledge or to bring you up to speed. Here is a look at the rest of our webcast schedule throughout the year.
3D imaging: Discover the current and future methods and applications
3D imaging can be accomplished in various ways, including single- and multi-camera methods, structured light systems, and those based on the time-of-flight principle. With the ever-increasing market for augmented and virtual reality systems, sensors capable of real-time 3D video are hitting the market at a feverish pace, at prices never dreamed of only a few short years ago.
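As a rough illustration of the time-of-flight principle mentioned above (this sketch is ours, not material from the webcast): a ToF sensor times a light pulse's round trip to the scene and back, and depth follows directly from the speed of light.

```python
# Illustrative sketch of the time-of-flight depth calculation.
# Assumption: an idealized direct-ToF sensor measuring round-trip pulse time.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_depth_m(round_trip_s: float) -> float:
    """Depth in meters from a round-trip pulse time; halved for one-way distance."""
    return C * round_trip_s / 2.0

# A round trip of ~6.67 ns corresponds to roughly 1 m of depth,
# which is why ToF sensors need picosecond-scale timing precision.
print(tof_depth_m(6.67e-9))
```

Real sensors typically infer this timing indirectly (e.g., from the phase shift of modulated light), but the distance relationship is the same.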
In a November 28 webcast from Daniel Lau, Professor of Electrical and Computer Engineering at the University of Kentucky, attendees will learn all about these different methods of 3D imaging, in terms of both hardware and software, and which is most appropriate for a given machine vision or image processing application. He will also discuss the current state of imaging research underway that will define the market for imaging hardware in the future. This webcast will conclude with a Q&A period.
What You'll Learn:
- Various methods of 3D imaging
- Different applications of 3D imaging
- Hardware and software options
- Examples of 3D imaging products
Who Should Attend:
- Anyone wanting to learn about 3D imaging
- Scientists, engineers, designers, and managers
- Current developers who require a more in-depth understanding of the underlying technology
- Those considering 3D imaging hardware/software in future projects
- End users/OEMs
Specifying vision system components to fit your application: Cameras, lenses, lighting, optics and filters
When it comes to designing and deploying a vision system, success is contingent upon choosing the correct components for your application. If you have the right camera but the wrong lens, or your lighting is insufficient to illuminate a certain region of interest, your vision system will not function correctly, and you'll be left wondering what went wrong.
In a free webcast on December 12, Perry West, founder and president of Automated Vision Systems, Inc., will provide in-depth information on identifying the right cameras, lenses, lighting, optics, and filters to suit your particular imaging need. He will discuss how all of the components work synergistically to accomplish an automated imaging task, and how to avoid making the mistake of choosing the wrong components.
What You'll Learn:
- How to choose the correct cameras, lenses, lighting, optics, and filters
- How all of these components work together in a vision system
- How choosing the wrong components will impact your vision system
Who Should Attend:
- End users and systems integrators using machine vision
- Those interested in learning more about how these imaging components work together
- Those new to machine vision wishing to learn more about designing and deploying vision systems
*Calendar is subject to change.