Sensor System Companies Take Center Stage in a Self-Driving Future

NetworkNewsWire Editorial Coverage: Self-driving cars are already appearing on our roads. One of the main technological barriers holding them back from full deployment is the creation of effective sensor systems, and several companies are conducting specialist research in this area. Foresight Autonomous Holdings Ltd. (NASDAQ: FRSX) (TASE: FRSX) (FRSX Profile) has created a unique system that combines infrared and visible-light cameras in stereo technology that can detect obstacles under all weather and lighting conditions. Google’s parent company, Alphabet, Inc. (NASDAQ: GOOG), is developing driverless cars through its Waymo subsidiary, using a wide range of different sensors. The work of Tesla, Inc. (NASDAQ: TSLA) in this area is well known and heavily reliant on a range of visible-light cameras. Automotive safety specialist Autoliv, Inc. (NYSE: ALV) has created a range of separate detection systems using different technologies. Apple, Inc. (NASDAQ: AAPL), on the other hand, is focusing on the potential of a single complex lidar system. It’s a diversity of approaches that shows a technology approaching maturity.
The Future of Driving
Technology commentators are predicting big things for self-driving cars. These autonomous automobiles are not just expected to save car users from the effort of driving. By making the most of efficient computing and by removing human error, these cars have the potential to improve the flow of traffic, reduce fuel usage and increase mobility for those who can’t drive themselves, such as the elderly and disabled. Despite alarmed responses to the idea of not having a human behind the wheel, self-driving cars are also expected to increase road safety and reduce accidents.
All of this — especially the reduction of accidents — is reliant upon the development of effective systems for the vehicles to sense what is going on around them and respond appropriately. Both the sensors and the processors dealing with this input are vital to making autonomous cars safe and effective. Radar and lidar have drawn the most attention, thanks to advances in these areas. Camera-based vision sensors have also seen significant advances.
New Detection Technology
Such a crucial area of technology needs specialist research and design to ensure that the best solutions are found. Foresight Autonomous Holdings (NASDAQ: FRSX) (TASE: FRSX) is a company focused on this specialism. Working through wholly owned subsidiary Foresight Automotive Ltd., Foresight is designing, developing and commercializing a range of technologies around detection systems for automated cars. These include stereo/quad-camera vision systems based on 3D video analysis, advanced algorithms for image processing and sensor fusion.
The company’s leading product is its QuadSight detection system. This stereoscopic automotive vision system uses two sets of stereo cameras — one infrared and the other working with visible light — to detect any obstacles on the road. It can detect obstacles despite adverse weather or extreme lighting conditions, making it a highly reliable option for self-driving cars whatever the circumstances. It detects obstacles regardless of shape, form, material or color, with near-zero false alerts, the usual weakness of highly sensitive detection equipment.
“At Foresight, we believe that a car’s vision system should be nothing less than perfect,” said Haim Siboni, the company’s CEO. “Vision is the foundation of passenger safety, and vision perfection under all weather and lighting conditions is clearly the breakthrough that vehicle makers need to build consumer confidence in order to accelerate autonomous vehicle adoption.”
Founded in 2015, Foresight has already completed a feasibility study for the QuadSight system, carried out extensive testing, and developed and produced a demo version. The company is creating a prototype for pilot projects so that the system can be tested out on the roads. It expects to see that system completed and commercialized during the second half of next year.
The first quad-camera multi-spectral vision solution of its kind, QuadSight uses advanced, proven image-processing algorithms derived from the field-proven homeland security vision technology of its major shareholder, Magna B.S.P., which has been deployed worldwide for almost two decades and is protected by patents. With a fully developed system ready for demonstrations, 2018 is the year that QuadSight goes out into the world. So far, it has made that entrance in style.
A Strong Showing at CES
The QuadSight system drew a lot of positive press for Foresight during the International Consumer Electronics Show (CES) 2018. Given the focus on self-driving cars in recent years, a lot of public and press attention was on what detection systems could bring to the autonomous vehicle game, and QuadSight’s unique features caught people’s eyes.
Electronic Design presented an article that went into detail on the Foresight system (http://nnw.fm/ym4Us). The article discussed the range of the detection system and the fact that it can detect details better than the human eye, with the detection of small objects allowing it to operate at high speeds. The site also covered the key technical difference between QuadSight and many of its potential competitors — the fact that it uses a passive system that processes all the visual information already available in the world around it rather than having to send out signals as lidar and radar do.
Automotive World highlighted the cost benefits of Foresight’s system (http://nnw.fm/wT5F4). Using multiple sensory technologies increases the cost of a self-driving vehicle, both through the sensors themselves and through the processors needed to deal with the information they provide. QuadSight provides a complete detection system based on purely visual inputs that could eliminate the need for complementary sensors and their processing support.
For EE Times, the focus was on the unique combination of infrared and visible-spectrum cameras (http://nnw.fm/cf6RI). The fusion of these two technologies allows QuadSight to detect obstacles day and night and in any weather condition. They also combine to achieve both ranging and imaging, allowing the car to judge how far away an object is without any need for additional sensors.
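As a rough illustration of how a matched pair of cameras can double as a range sensor, the sketch below applies the standard stereo triangulation relationship: range equals focal length times baseline divided by disparity. The focal length, baseline and disparity values are assumed example figures for the sake of the demonstration, not QuadSight specifications.

```python
# Generic stereo triangulation: estimating range from pixel disparity.
# Focal length, baseline and disparity below are assumed example values,
# not QuadSight specifications.

def depth_from_disparity(disparity_px, focal_length_px=1400.0, baseline_m=0.30):
    """Distance (m) to a point seen by both cameras of a stereo pair.

    disparity_px    -- horizontal pixel shift of the same point between the two images
    focal_length_px -- camera focal length expressed in pixels (assumed value)
    baseline_m      -- separation of the two camera centers in meters (assumed value)
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_length_px * baseline_m / disparity_px

if __name__ == "__main__":
    # A feature shifting 6 pixels between the two views sits roughly 70 m away
    # with these example parameters: 1400 * 0.30 / 6 = 70.
    print(f"{depth_from_disparity(6.0):.1f} m")
```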
The Pattern Recognition Problem
The way that QuadSight uses its sensory data may give it another advantage compared with leading competitors. Some self-driving initiatives rely on pattern recognition as a means of detection and to help the car judge whether or not there is a hazard. This is believed to be the technology used in Tesla’s efforts to create autonomous vehicles. It relies on the system recognizing the form of an object as a means of detection and then using this information to judge how to react. If this is true, then the pattern-recognition technology may be behind the crashes (http://nnw.fm/w2gqD) that have brought unwelcome attention to Tesla’s on-road testing.
QuadSight does not use pattern recognition as a means of detection; instead, it uses unique algorithms to detect any obstacle regardless of shape, form, material or color. It’s a technology that gives the system an advantage in responding to unexpected events, and one that might have detected the fire truck involved in the most recent Tesla crash this month.
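To make the distinction concrete, the sketch below shows one generic way an obstacle can be flagged from depth data alone: anything that sits markedly closer than a flat road surface should at that point in the image is treated as an obstruction, so the decision never depends on recognizing the object’s shape or class. This is a simplified geometric illustration under assumed camera parameters, not a description of Foresight’s proprietary algorithms.

```python
import numpy as np

def flag_obstacles(depth_m, camera_height_m=1.4, focal_px=1400.0,
                   horizon_row=240, margin=0.7):
    """Flag pixels that sit markedly closer than a flat road surface would.

    Purely geometric: the object's shape, material and color never enter the
    decision. All camera parameters here are illustrative assumptions.
    """
    h, _ = depth_m.shape
    rows = np.arange(h, dtype=float).reshape(-1, 1)
    below_horizon = rows > horizon_row
    # Range at which a flat road would appear on each image row below the horizon.
    ground_range = np.where(below_horizon,
                            camera_height_m * focal_px / np.maximum(rows - horizon_row, 1.0),
                            np.inf)
    # A pixel well inside the expected road distance protrudes above the road plane.
    return below_horizon & (depth_m < margin * ground_range)

if __name__ == "__main__":
    # Synthetic depth map of an empty road, with an unidentified object 20 m ahead.
    h, w = 480, 640
    rows = np.arange(h, dtype=float).reshape(-1, 1)
    road = np.where(rows > 240, 1.4 * 1400.0 / np.maximum(rows - 240, 1.0), 200.0)
    depth = np.broadcast_to(road, (h, w)).copy()
    depth[280:320, 300:340] = 20.0
    print(int(flag_obstacles(depth).sum()), "pixels flagged as obstacles")
```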
Finding Solutions for Self-Driving Sensors
A number of companies are working on sensor technology for automated cars, whether in isolation or as part of developing whole vehicles.
Alphabet, Inc. (NASDAQ: GOOG), the parent company of Google, is one of the leading players in the creation of driverless cars through its Waymo subsidiary. Its vehicles detect objects through a wide range of technologies, including sonar, stereo cameras, lasers, lidar and radar. These systems serve different purposes, from generating a map of the vehicle’s surroundings to identifying the presence of other vehicles and judging the speed at which they are moving. It’s by bringing these data points together that the system can judge what is going on.
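The sketch below gives a minimal, generic sense of why bringing several sensors together helps: independent range estimates can be fused by weighting each one by how trustworthy it is, yielding a combined estimate that is more certain than any single reading. The sensor list and noise figures are assumptions made for illustration; this is not a description of Waymo’s actual fusion pipeline.

```python
# Minimal sensor fusion illustration: combine independent range estimates
# (e.g., from radar, lidar and a camera) by weighting each inversely to its
# measurement variance. Sensor names and noise figures are assumed examples.

def fuse_ranges(measurements):
    """measurements: list of (range_m, std_dev_m) pairs from different sensors."""
    weights = [1.0 / (sigma ** 2) for _, sigma in measurements]
    fused = sum(w * r for (r, _), w in zip(measurements, weights)) / sum(weights)
    fused_sigma = (1.0 / sum(weights)) ** 0.5
    return fused, fused_sigma

if __name__ == "__main__":
    # The fused estimate leans on the more certain sensors while still using
    # every measurement.
    readings = [(41.8, 0.3),   # radar range, with assumed noise
                (42.1, 0.1),   # lidar range
                (43.0, 1.5)]   # camera-based range estimate
    rng, sigma = fuse_ranges(readings)
    print(f"fused range: {rng:.2f} m  (+/- {sigma:.2f} m)")
```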
One of the great modern tech innovators, Tesla, Inc. (NASDAQ: TSLA), is famous for its work in developing autonomous cars. Cameras play a big part in Tesla’s detection technology. These are all visible-light mono cameras, so the system cannot see in conditions where only infrared sensors can detect objects. Recent experiments with trifocal mono cameras are expanding the system’s detection capacity by considering views at varying distances.
Autoliv, Inc. (NYSE: ALV), the world’s largest automotive safety supplier, has developed a wide range of detection systems designed as additions to the information available to a driver, as well as options for increasingly automated cars. Its technology includes radar, lidar and a variety of camera technologies such as mono vision, stereo vision and infrared. This range of sensors provides car manufacturers with a variety of options to detect hazards on the road. Its various styles of camera currently exist as separate solutions, not an integrated system bringing their data together.
Starting in 2014, Apple, Inc. (NASDAQ: AAPL) began work on producing an electric car. This project has since been scaled back to the creation of autonomous driving systems that could be applied to other manufacturers’ cars. The company has been tight-lipped about its efforts, but revealed last year that it is working with a lidar-only system (http://nnw.fm/lzyL4). Some have argued that lidar alone can’t provide sufficient information, but Apple aims to use complex computing and artificial intelligence to make a complete lidar-based solution.
As various companies race to develop self-driving cars, their sensor systems will be vital. A company whose system can operate safely in all conditions, without the extra cost of multiple sensor types or massive processing, will have an edge in dominating this important market.
For more information about Foresight Autonomous Holdings, visit Foresight Autonomous Holdings (NASDAQ: FRSX) (TASE: FRSX).
Please see full disclaimers on the NetworkNewsWire website applicable to all content provided by NNW, wherever published or re-published: http://NNW.fm/Disclaimer

