Strategic partnership speeds development of automotive active safety systems
FEBRUARY 5--Philips Semiconductors (San Jose, CA; www.semiconductors.philips.com), a division of Royal Philips Electronics, and MobilEye BV, a privately held company, are forming a strategic partnership to manufacture a highly integrated system-on-chip (SoC) solution for automotive driver assistance applications, taking the first step toward the development of autonomous driving systems. Philips Semiconductors and MobilEye will leverage their respective expertise in IC creation and driver-assistance systems to develop an ASIC design for applications such as adaptive cruise control to maintain safe headway distance in cruise-control mode, lane-departure warning, forward collision warning, and sensory fusion applications for collision mitigation and active safety.
The SoC solutions will deliver computationally intensive real-time applications for visual recognition and scene interpretation, customized for use in intelligent vehicle systems. The chip architecture is designed to maximize cost performance by fitting a full-fledged application, such as a low-cost version of adaptive cruise control driven by a single video source, on a single chip. Using its sensor inputs, the system can intelligently interpret the visual field, detecting vehicles, pedestrians, and road signs, to provide driver assistance. Although designed to host a full application on a single chip, the architecture is sufficiently flexible and programmable to accommodate a wide range of visual processing applications outside the automobile.
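To make the adaptive-cruise-control use case concrete, here is a minimal sketch of a time-headway rule of the kind such an application might apply. All function names, the 1.8-second target headway, and the gain value are illustrative assumptions, not details from the article.

```python
def acc_speed_command(own_speed_mps: float,
                      gap_m: float,
                      target_headway_s: float = 1.8,
                      gain: float = 0.3) -> float:
    """Return an adjusted speed command that nudges the host vehicle
    toward the target time headway (gap / speed) behind the lead car.

    Illustrative sketch only; parameters are assumed, not from Philips
    or MobilEye documentation.
    """
    if own_speed_mps <= 0:
        return 0.0
    current_headway_s = gap_m / own_speed_mps
    # Positive error means we are following too closely: slow down.
    # Negative error means the gap is larger than needed: speed up
    # (a real controller would also cap at the driver's set speed).
    error = target_headway_s - current_headway_s
    return max(0.0, own_speed_mps - gain * error * own_speed_mps)

# Example: at 25 m/s with only a 30 m gap (1.2 s headway, too close),
# the command reduces speed below 25 m/s.
cmd = acc_speed_command(25.0, 30.0)
```

A production controller would of course add smoothing, acceleration limits, and a braking path; the point here is only the headway-error feedback structure.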
The SoC functional capabilities include proprietary pattern-identification techniques for segmenting vehicles from the background scene under static and dynamic conditions; visual motion-analysis techniques for isolating dynamically moving patterns, such as passing and crossing vehicles, and for estimating the host vehicle's yaw and pitch rates; and image-processing techniques for lane following and road-path prediction. Unlike conventional approaches, the architecture is designed to deliver the full range of capabilities from a monocular (single-camera) video stream in the visible or IR spectrum, yet it can also accept multiple sensory inputs, such as millimeter-wave or laser radar vehicle tracks, for sensor-fusion applications.
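The sensor-fusion idea described above can be illustrated with a simple sketch: estimate range monocularly from a detected vehicle's apparent width, gate it against a radar range track, and blend the two. The pinhole-camera range formula is standard, but every name, parameter, and weight below is an assumption for illustration, not MobilEye's actual method.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VisionDetection:
    bbox_width_px: int   # apparent width of the detected vehicle in pixels

@dataclass
class RadarTrack:
    range_m: float       # radar-measured distance to the target

def vision_range_estimate(det: VisionDetection,
                          focal_px: float = 800.0,
                          vehicle_width_m: float = 1.8) -> float:
    """Monocular range from apparent width: range = f * W / w_px.
    Focal length and assumed vehicle width are illustrative values."""
    return focal_px * vehicle_width_m / det.bbox_width_px

def fuse(det: VisionDetection, track: RadarTrack,
         gate_m: float = 5.0) -> Optional[float]:
    """If radar and vision ranges agree within a gate, blend them.
    Radar is typically the more accurate ranging sensor, so it is
    weighted more heavily (weights here are arbitrary)."""
    vis_r = vision_range_estimate(det)
    if abs(vis_r - track.range_m) > gate_m:
        return None  # inconsistent measurements: no fused track
    return 0.8 * track.range_m + 0.2 * vis_r

# A 48-pixel-wide detection implies ~30 m; a 29 m radar track passes
# the gate, so the two are blended into one fused range.
fused = fuse(VisionDetection(48), RadarTrack(29.0))
```

Real fusion systems use probabilistic filters (e.g., Kalman filtering) rather than fixed weights, but the gate-then-blend structure captures the basic idea of combining a vision track with a radar track.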
The architecture includes multiple programmable ARM946 microprocessor cores for general-purpose computation and application-level programming, plus four application-specific modules for image preprocessing, motion analysis, pattern recognition, and lane following. It also includes 2.2 Mbit of on-chip SRAM for efficient image memory management.
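A quick back-of-envelope calculation shows what the 2.2 Mbit of on-chip SRAM buys. The frame format is an assumption (the article gives only the SRAM figure), but assuming 8-bit grayscale QVGA frames:

```python
# Assumption: 320x240 8-bit grayscale frames; the article states only
# the 2.2 Mbit on-chip SRAM figure, not any frame format.
SRAM_BITS = 2.2e6            # 2.2 Mbit on-chip SRAM, per the article
QVGA_BITS = 320 * 240 * 8    # one QVGA 8-bit frame = 614,400 bits

frames = SRAM_BITS / QVGA_BITS
# Roughly 3.5 such frames fit on chip, enough to hold a current frame
# plus one or two previous frames for the motion-analysis module,
# which is why efficient image memory management matters.
```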
To maximize cost performance, peripheral circuits are integrated, including dual CAN, PROM, and SDRAM controllers, parallel I/O, and image data input units. The SoC will be manufactured in leading-edge 0.18-μm CMOS technology, installed in several Philips-owned wafer fabs, and the product will receive full cabin-grade automotive qualification.
First silicon samples are to be released for testing by the end of 2002, with deployment targeted for 2005 car models.