Machine vision systems integrator spotlight: Phelps2020

Oct. 30, 2015
In order to provide our readers with information on how to obtain the right technology or systems integrator for their application, we are profiling major machine vision and image processing distributors and integrators from across the globe.

In this article, Besma Abidi, President of Phelps2020, Inc., discusses the development of imaging software and hardware solutions, the use of machine vision in outdoor environments, and the state of the machine vision market.

Company name: Phelps2020, Inc.
Headquarters: Knoxville, TN, USA
Year founded: 2009
Regions served: United States and international markets
Services offered: Systems integration, imaging software and hardware development

How have market changes and customer demands changed the way that you’ve approached business?

With the widespread increase in outdoor vision needs, from unmanned aerial vehicles to self-driving cars, a large portion of the video data gathered today remains unexploited because of harsh and uncontrollable outdoor viewing conditions. Illumination issues such as high dynamic range, dark shadows, and low light, and atmospheric scattering such as haze, fog, mist, and pollution, hinder visibility and cost money and sometimes lives.

Our goal is to restore visibility and facilitate the visual and automatic interpretation of visible, infrared, and multispectral data under harsh illumination, in the presence of obscurants, or with bad heat contrast. Phelps2020 integrates hardware and software solutions to solve machine vision problems under uncontrollable conditions. We offer real-time autonomous software packages that can help our customers uncover originally unseen information in their data. We also design camera-lens-software integrated solutions to accommodate the range and resolution requirements of our customers. Our solutions can also be used indoors to relax lighting and other requirements on machine vision applications.
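
As a rough, hypothetical illustration of the kind of range-and-resolution sizing mentioned above (not a Phelps2020 design tool), the pinhole-camera approximation can be used to check whether a candidate sensor and lens meet a resolution requirement at a given standoff distance. All parameter values in the sketch below are assumed examples.

```python
# Illustrative sketch: sizing a camera/lens pair against a range and
# resolution requirement using the pinhole-camera approximation.
# All numbers are hypothetical examples, not Phelps2020 specifications.

def ground_sample_distance(pixel_pitch_m: float, range_m: float,
                           focal_length_m: float) -> float:
    """Smallest ground detail (meters/pixel) resolved at the given range."""
    return pixel_pitch_m * range_m / focal_length_m

def required_focal_length(pixel_pitch_m: float, range_m: float,
                          target_gsd_m: float) -> float:
    """Focal length needed to reach a target ground sample distance."""
    return pixel_pitch_m * range_m / target_gsd_m

if __name__ == "__main__":
    pixel_pitch = 3.45e-6      # 3.45 um pixels (a common CMOS pixel size)
    standoff = 500.0           # 500 m viewing range (assumed)
    lens = 0.050               # 50 mm lens (assumed)

    gsd = ground_sample_distance(pixel_pitch, standoff, lens)
    print(f"GSD at {standoff:.0f} m with a 50 mm lens: {gsd * 100:.1f} cm/pixel")

    need = required_focal_length(pixel_pitch, standoff, target_gsd_m=0.01)
    print(f"Focal length for 1 cm/pixel at {standoff:.0f} m: {need * 1000:.0f} mm")
```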

In what areas do you see the most growth?

We believe that machine vision in the outdoors will see the most growth. Automating the interpretation and exploitation of data from all the UAS out there is a big and unavoidable challenge. Applications include state and government surveillance, law enforcement, precision agriculture, the oil and gas industry, windmills and solar panels, power lines, dam and bridge inspection, search and rescue operations, natural disasters, the real-estate market, and the retail home delivery markets. Self-driving cars are another area where robust machine vision will continue to see growth.

Can you provide one example of a relatively new technology that you are utilizing?

Although FPGA- and GPU-based systems-on-chip are not new, we've seen in recent years a sizable increase in software tools that facilitate the programming of specialized hardware. This has translated into shorter time-to-market for embedded vision. Phelps2020 is using specialized hardware systems to miniaturize and speed up its proven video exploitation algorithms.

What is one type of technological advancement or invention that you would like to see and that would benefit you, in terms of vision systems design/integration?

Although the industry has made big strides toward creating translators from high-level languages (HLL) to hardware-synthesized code, these translators still fall short of the optimization achievable with code written directly in hardware description languages. I would like to see better-performing and more robust translators from HLL to synthesized hardware code catered to embedded vision.

Can you explain how you developed your Video Dynamic Range Correction technology and how you’ve seen it being used by customers?

Phelps2020 provides real-time, fully autonomous video enhancement and exploitation technology with and without hardware in the loop. The technology mitigates obscurants, poor heat contrast, and illumination problems online, or offline in forensic mode.

The technology was developed with shared funding from Phelps2020 and a number of government contracts, originally for unmanned aerial systems. Phelps2020 was first approached to solve the very annoying and limiting problem of washouts and very dark shadows in aerial video data. The technology then evolved to cover the entire visible and infrared spectrum and to address other atmospheric obscurants as well. Phelps2020’s technology has applications in any outdoor (or indoor) situation where a camera is used and better visibility and detail are needed on the fly in an autonomous fashion.
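
For readers unfamiliar with the problem space, the sketch below shows a generic, textbook-style dynamic range correction on a single grayscale frame (a percentile contrast stretch plus a gamma lift of shadows) in Python/NumPy. It only illustrates the general idea; it is not Phelps2020's proprietary real-time algorithm, which the article does not describe.

```python
# Generic illustration of per-frame dynamic range correction:
# stretch the usable intensity range and lift dark shadows.
# This is NOT Phelps2020's algorithm, just a minimal textbook example.
import numpy as np

def correct_dynamic_range(frame: np.ndarray,
                          low_pct: float = 1.0,
                          high_pct: float = 99.0,
                          gamma: float = 0.6) -> np.ndarray:
    """Percentile contrast stretch followed by a gamma lift of shadows.

    frame: 8-bit grayscale image as a NumPy array.
    """
    f = frame.astype(np.float32)
    lo, hi = np.percentile(f, [low_pct, high_pct])
    stretched = np.clip((f - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    lifted = stretched ** gamma          # gamma < 1 brightens dark regions
    return (lifted * 255.0).astype(np.uint8)

if __name__ == "__main__":
    # Synthetic washed-out frame: low contrast, dark shadows.
    rng = np.random.default_rng(0)
    frame = rng.normal(40, 10, size=(480, 640)).clip(0, 255).astype(np.uint8)
    out = correct_dynamic_range(frame)
    print("input range:", frame.min(), frame.max())
    print("output range:", out.min(), out.max())
```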

Have there been any recent examples of vision systems you’ve installed that are particularly unique or interesting?

Phelps2020’s system has been used in the biometrics industry for recognition under limiting and uncontrollable illumination conditions. It has also been used in law enforcement and surveillance applications from unmanned aerial systems.

What is your take on the current state of the machine vision market?

The state of the machine vision market, when indoors under controlled conditions, is very strong. On the other hand, machine vision in the outdoors, under uncontrollable lighting and atmospheric conditions, has fallen behind. Outdoor situations are variable, unpredictable, and harder to handle, but it is a growing market, and the need for robust machine vision solutions will keep going up.

Is there a particular trend or product in the next few years that you see as “the next big thing?”

High resolution real-time video processing, exploitation, and dissemination will continue to grow. Any machine vision solution that can handle high volumes of data in real time will be in big demand. We see the area of embedded vision as being the next big thing.

What camera type do you think will be most popular in two years and why?

As we move toward aerial outdoor imaging, range becomes important for detail viewing and characterization, requiring higher resolution and higher bandwidth. Bandwidth of up to 12 Gbit/s is already manageable with the SDI standard, but SDI lacks easy connectivity to common computers. For a practical camera that plugs into a common computer, the most recent USB3 Vision standard looks promising.
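
As a back-of-the-envelope check on the bandwidth figures discussed above, the snippet below computes the raw bit rate of an assumed uncompressed 4K stream (3840 x 2160, 60 frames/s, 10-bit 4:2:2, i.e. 20 bits per pixel) and compares it with the nominal 12G-SDI and USB 3.0 link rates. The video format is an illustrative assumption, not one named in the interview.

```python
# Back-of-the-envelope bit-rate check for the interface discussion above.
# Parameter choices (4K, 60 fps, 10-bit 4:2:2) are illustrative assumptions.

def raw_bit_rate_gbps(width: int, height: int, fps: float,
                      bits_per_pixel: float) -> float:
    """Uncompressed video bit rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

if __name__ == "__main__":
    uhd_4k60 = raw_bit_rate_gbps(3840, 2160, 60, 20)   # 10-bit 4:2:2 -> 20 bpp
    print(f"Uncompressed 4K60 10-bit 4:2:2: {uhd_4k60:.2f} Gbit/s")
    print("12G-SDI nominal link rate:       ~11.88 Gbit/s")
    print("USB 3.0 (SuperSpeed) link rate:   ~5 Gbit/s")
```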


About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
