Partnership speeds development of car safety systems

Philips Semiconductors has signed a strategic partnership with MobilEye that could lead to better autonomous driving systems.

Under the agreement, the two companies will manufacture a highly integrated System-on-Chip solution for automotive driver assistance applications, a first step towards the development of autonomous driving systems.

Philips Semiconductors and MobilEye will combine their respective expertise in IC development and driver assistance systems to produce an ASIC design for applications such as Adaptive Cruise Control, Lane Departure Warning, Forward Collision Warning, and sensory fusion for collision mitigation and active safety.

The System-on-Chip solution will reportedly deliver computationally intensive, real-time visual recognition and scene interpretation, customised for use in intelligent vehicle systems.

The chip architecture is designed to maximise cost-performance by putting a fully fledged application, such as a low-cost version of Adaptive Cruise Control driven by a single video source, on a single chip. Using its sensor inputs, the system can intelligently interpret the visual field, detecting vehicles, pedestrians and road signs to provide intelligent driver assistance.
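As a rough illustration of how a single camera can support an application like Adaptive Cruise Control, the sketch below estimates range to a lead vehicle from where its bottom edge sits in the image, under a flat-road assumption. The camera parameters are invented for illustration; nothing here reflects MobilEye's proprietary methods.

```python
# A hedged sketch of monocular range estimation: under a flat-ground
# model, a target's distance follows from how far below the horizon
# its road contact point appears in the image: Z = f * H / dy.
def range_from_monocular(y_bottom_px: float, horizon_px: float,
                         focal_px: float = 800.0,
                         camera_height_m: float = 1.2) -> float:
    """Distance (m) to a point on the road imaged y_bottom_px rows
    down, given the horizon row, focal length (pixels) and camera
    height (m). All parameter values here are illustrative."""
    dy = y_bottom_px - horizon_px
    if dy <= 0:
        raise ValueError("target bottom must be below the horizon")
    return focal_px * camera_height_m / dy

# A vehicle whose bottom edge sits 24 pixels below the horizon:
print(round(range_from_monocular(264.0, 240.0), 1))  # ~40.0 m
```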

Even though the architecture is designed to host a fully fledged application on a single chip, it is said to be flexible and programmable enough to accommodate a wide range of visual processing applications beyond the automobile.

The pattern classification module is application-specific, yet it is based on general principles and can accommodate other classes of objects, such as pedestrians and human faces.
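As a sketch of what "application-specific yet based on general principles" can mean in practice, the hypothetical module below keeps the detection machinery fixed and swaps only the trained weights and label per object class. All names and values are illustrative, not MobilEye's design.

```python
# A minimal sketch of a class-agnostic pattern classifier: the pipeline
# is fixed, while the learned model determines which object class
# (vehicle, pedestrian, face) is recognised.
import numpy as np

class PatternClassifier:
    """Generic window classifier; the weights, not the code, encode the class."""
    def __init__(self, weights: np.ndarray, bias: float, label: str):
        self.weights = weights   # learned template for one object class
        self.bias = bias
        self.label = label       # e.g. "vehicle", "pedestrian", "face"

    def score(self, window: np.ndarray) -> float:
        # Linear score over a normalised image window (a stand-in for
        # whatever features the real chip would compute in hardware).
        w = window.astype(np.float32).ravel()
        w = (w - w.mean()) / (w.std() + 1e-6)
        return float(self.weights @ w + self.bias)

    def detect(self, window: np.ndarray, threshold: float = 0.0) -> bool:
        return self.score(window) > threshold

# The same module can host several object classes side by side:
rng = np.random.default_rng(0)
classifiers = [
    PatternClassifier(rng.normal(size=32 * 32), 0.0, "vehicle"),
    PatternClassifier(rng.normal(size=32 * 32), 0.0, "pedestrian"),
]
window = rng.normal(size=(32, 32))
for clf in classifiers:
    print(clf.label, clf.detect(window))
```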

The System-on-Chip's functional capabilities include proprietary pattern-identification techniques for segmenting vehicles from the background scene under static and dynamic conditions; visual motion analysis for isolating dynamically moving patterns and for estimating the host vehicle's yaw and pitch rates; and image processing for lane following and road-path prediction.
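The proprietary techniques themselves are not public, but the lane-following idea can be illustrated with standard image-processing primitives. The sketch below uses off-the-shelf OpenCV edge detection and line fitting over the lower half of a frame; all thresholds are chosen arbitrarily for illustration.

```python
# A generic sketch of lane-marking extraction, standing in for the
# chip's (undisclosed) lane-following techniques.
import cv2
import numpy as np

def detect_lane_markings(frame: np.ndarray) -> np.ndarray:
    """Return line segments (x1, y1, x2, y2) that plausibly belong to
    lane markings in a single camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)

    # Keep only the lower half of the image, where the road surface lies.
    mask = np.zeros_like(edges)
    h, w = edges.shape
    mask[h // 2:, :] = 255
    edges = cv2.bitwise_and(edges, mask)

    # Fit straight segments to the remaining edge pixels.
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=40, minLineLength=40, maxLineGap=20)
    return lines if lines is not None else np.empty((0, 1, 4), dtype=np.int32)

# Synthetic frame with one white lane marking:
frame = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.line(frame, (60, 239), (160, 120), (255, 255, 255), 3)
print(detect_lane_markings(frame).shape)
```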

Unlike conventional approaches, the architecture is designed to deliver this full range of capabilities from a monocular (single-camera) video stream in the visible or IR spectrum; at the same time, the chip can accept multiple sensory inputs, such as millimetre-wave or laser radar vehicle tracks, for sensory fusion applications.
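As an illustration of the sensory fusion concept, the sketch below performs a simple greedy nearest-neighbour association between radar tracks and vision detections in vehicle coordinates. The data structures and the gating threshold are assumptions for illustration, not the partners' actual design.

```python
# A hedged sketch of radar/vision fusion: match each radar track to the
# closest unclaimed vision detection within a distance gate.
from dataclasses import dataclass
import math

@dataclass
class RadarTrack:
    range_m: float      # distance to target, metres
    azimuth_rad: float  # bearing relative to vehicle heading

@dataclass
class VisionDetection:
    range_m: float      # range estimated from monocular geometry
    azimuth_rad: float

def fuse(radar: list[RadarTrack], vision: list[VisionDetection],
         max_dist_m: float = 3.0) -> list[tuple[int, int]]:
    """Greedy nearest-neighbour association, returning (radar, vision)
    index pairs that agree to within max_dist_m in Cartesian space."""
    def to_xy(r, a):
        return (r * math.cos(a), r * math.sin(a))

    pairs, used = [], set()
    for i, t in enumerate(radar):
        tx, ty = to_xy(t.range_m, t.azimuth_rad)
        best, best_d = None, max_dist_m
        for j, d in enumerate(vision):
            if j in used:
                continue
            vx, vy = to_xy(d.range_m, d.azimuth_rad)
            dist = math.hypot(tx - vx, ty - vy)
            if dist < best_d:
                best, best_d = j, dist
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs

# A radar track and a vision detection of the same lead vehicle:
print(fuse([RadarTrack(30.0, 0.02)], [VisionDetection(29.0, 0.03)]))
```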

The first silicon samples are due for testing by the end of 2002, with deployment targeted for 2005 car models.