ADAS: Market trends and emerging technologies

From the electric windows of the early 1950s to today's automated driving systems, features that debut in luxury cars have steadily migrated to mid-range and economy cars, eventually becoming must-have electronic and electrical systems. The recently emerging Advanced Driver Assistance System (ADAS) technology is no exception. Europe's Ford Focus, for example, now offers adaptive cruise control (ACC), automatic braking, and active lane keeping, all of which were previously exclusive to luxury cars, and even economy models from Kia ship with rearview cameras. Bringing ADAS to relatively inexpensive vehicles, however, presents a dilemma: substantial computing resources must be delivered at a very low price.


Competition is one reason ADAS technology is spreading beyond luxury cars, but it is not the only one. Government regulation is another important factor. In the United States, for example, the National Highway Traffic Safety Administration (NHTSA) is in the process of enacting rules that would mandate rearview cameras, and the number of on-board cameras is expected to reach 60 million by 2018. The European Commission will mandate advanced emergency braking and Lane Departure Warning (LDW) systems in 2015.

Insurance discounts for vehicles equipped with ADAS are another factor driving adoption, and such discounts have a statistical basis. A system that alerts the driver when the vehicle begins to drift out of its lane, or that enhances the driver's visibility at night, helps avoid accidents and saves lives.

The decisive factor in the widespread adoption of ADAS, however, is cost. Although ADAS technology is becoming increasingly complex, advances in sensor and processor technology, which integrate multiple functions into fewer components, now let engineers design ADAS applications at prices affordable for mid-range and even economy cars. Lower cost and reduced complexity through functional integration are the key factors driving ADAS technology into a broad range of vehicles.

Functional diversity

For all its promise, ADAS technology also brings many challenges to the automotive industry. As with many technologies in their early stages, ADAS applications are being developed in many directions, and it is not yet clear which direction will ultimately drive the market. At the time of this writing, Hitachi is focusing on a stereo pair of forward cameras that detects objects up to 100 meters away; the technology, called EyeSight, has been applied to the 2013 Subaru Legacy and Outback models. Denso recently demonstrated a drowsiness-detection system that uses an infrared (IR) camera aimed at the driver's face to determine whether the driver's eyes are open; if they are closed, the driver may have fallen asleep. Finally, Aisin is launching a lane-detection system that uses a rearview camera, a cost-effective way to monitor the vehicle's position relative to the lane markings; it combines GPS and road-map data to determine the vehicle's current position relative to the road ahead.

This article takes a closer look at the ADAS technologies now spreading rapidly into the broader market.

System: Lane departure warning

Sensor: camera

When the vehicle drifts out of its lane or approaches the edge of the road, the LDW system issues an audible alarm or a haptic alarm (a slight vibration of the steering wheel or seat). These systems activate only when the vehicle exceeds a certain speed threshold (for example, 55 mph) and the turn signal is off. A camera system observes the lane markings and determines when the vehicle's position relative to them indicates it is likely to leave the lane. While the application requirements are similar across vehicle manufacturers, each vendor takes a different approach, using a front-view camera, a rear-view camera, or a dual/stereo front-view camera. No single hardware architecture can therefore satisfy all of these camera variants; a flexible hardware architecture is needed to support the different implementation choices.
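The activation conditions above (speed threshold, turn-signal state, lane position) can be sketched as a simple decision function. All names and threshold values here are hypothetical illustrations, not taken from any production LDW system.

```python
# Sketch of LDW activation logic with hypothetical sensor inputs.
# Threshold constants are illustrative only.

SPEED_THRESHOLD_MPH = 55.0   # system is active only above this speed
OFFSET_WARN_M = 0.2          # warn when this close to a lane marking

def ldw_alarm(speed_mph: float,
              turn_signal_on: bool,
              lateral_offset_m: float) -> bool:
    """Return True if a lane-departure alarm should be raised.

    lateral_offset_m is the camera-measured distance from the
    vehicle's edge to the nearest lane marking (negative once
    the marking has been crossed).
    """
    if speed_mph <= SPEED_THRESHOLD_MPH:
        return False             # below activation speed
    if turn_signal_on:
        return False             # driver is changing lanes deliberately
    return lateral_offset_m < OFFSET_WARN_M
```

The same decision logic applies regardless of whether the lane position comes from a front-view, rear-view, or stereo camera, which is why vendors can vary the sensor while keeping the warning behavior consistent.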

System: Adaptive Cruise Control

Sensor: radar

Over the past decade ACC technology has moved from luxury cars into the wider market. Unlike traditional cruise control, which is designed to hold a constant speed, ACC adapts the vehicle's speed to traffic conditions: if the vehicle ahead is too close, the system slows down, and when road conditions permit it accelerates back up to the set speed. These systems are implemented with a radar mounted at the front of the vehicle. Because radar cannot recognize a target's size and shape and has a relatively narrow field of view, it must be combined with a camera. The difficulty is that the cameras and radar sensors in use today are not standardized, so once again a flexible hardware platform is required.
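The follow-the-leader behavior described above can be sketched as a speed-command function: hold the set speed unless the radar reports a lead vehicle closer than the desired time gap. The function name, interface, and constants are hypothetical illustrations, not any production controller.

```python
# Illustrative ACC speed-command logic (hypothetical interface).

from typing import Optional

def acc_speed_command(set_speed_mps: float,
                      current_speed_mps: float,
                      lead_distance_m: Optional[float],
                      time_gap_s: float = 2.0) -> float:
    """Return the target speed for the longitudinal controller.

    lead_distance_m is the radar range to the lead vehicle,
    or None when the road ahead is clear.
    """
    if lead_distance_m is None:
        return set_speed_mps                      # road clear: cruise
    desired_gap_m = current_speed_mps * time_gap_s
    if lead_distance_m < desired_gap_m:
        # Too close: scale the target speed down in proportion
        # to how far inside the desired gap the lead vehicle is.
        return min(set_speed_mps,
                   current_speed_mps * lead_distance_m / desired_gap_m)
    return set_speed_mps                          # gap OK: set speed
```

A real controller would also smooth the command and handle braking limits, but the sketch shows why ACC needs reliable range data, and hence why radar (with camera confirmation of the target) is the sensor of choice.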

System: Traffic Sign Recognition

Sensor: camera

As its name suggests, the Traffic Sign Recognition (TSR) feature uses a forward camera combined with pattern-recognition software to identify common traffic signs (speed limit, stop, no U-turn, and so on). The feature alerts the driver to the traffic signs ahead so the driver can obey them. TSR reduces the chance that a driver misses a stop sign, makes an illegal left turn, or commits another unintentional traffic violation, thereby improving safety. These systems require a flexible software platform so the detection algorithms can be enhanced and adjusted for the traffic signs used in different regions.

System: Night Vision

Sensor: IR or thermal imaging camera

The Night Vision (NV) system helps the driver identify objects in very dark conditions, beyond the reach of the vehicle's headlights. The NV system thus gives the driver early warning of vehicles and other objects on the road ahead, helping to avoid collisions.

NV systems use a variety of camera sensors and displays, depending on the manufacturer, but generally fall into two basic types: active and passive.

· Active systems, also known as near-IR systems, combine a charge-coupled device (CCD) camera with an IR light source to present a black-and-white image on the display. These systems offer high resolution and very good image quality, with a typical viewing range of 150 meters. They can see every object in the camera's field of view, including objects that emit no heat, but their effectiveness drops sharply in rain and snow.

· Passive systems use no external light source; instead, a thermal-imaging camera captures images from the natural thermal radiation of objects. These systems are unaffected by oncoming headlights and by severe weather, with detection ranges from 300 to 1000 meters. Their disadvantages are grainy images and reduced performance in warm conditions, and they can only detect objects that emit thermal radiation. Combined with video analytics, passive systems can clearly highlight objects on the road ahead, such as pedestrians.

NV systems thus present a variety of architectural choices, each with its own advantages and disadvantages. To stay competitive, automakers should support a variety of camera sensors, implemented on a versatile, flexible hardware platform.

System: Adaptive High Beam Control

Sensor: camera

Adaptive High Beam Control (AHBC) is an intelligent headlight-control system that uses cameras to detect oncoming and same-direction traffic and, depending on those conditions, brightens or dims the high beams. AHBC lets the driver use the high beams at their maximum illumination distance without having to dim them manually when other vehicles appear, avoiding distraction and improving safety. Some systems can even control the headlights independently, dimming one while the other stays fully lit. AHBC complements forward-camera systems such as LDW and TSR; it does not require a high-resolution camera, so if a vehicle already has a front-view camera for another ADAS application, the feature's price/performance ratio can be very high.
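The independent per-headlight behavior mentioned above reduces to a small piece of logic: dim whichever side of the scene contains detected vehicle lights. The function and its camera-detection inputs are hypothetical, for illustration only.

```python
# Minimal sketch of per-headlight AHBC control. The boolean inputs
# stand in for the forward camera's detections of headlights or
# taillights in each half of the image (hypothetical interface).

def ahbc_beams(lights_left: bool, lights_right: bool) -> dict:
    """Return the high-beam state for each headlight."""
    return {
        "left_high_beam": not lights_left,    # dim if traffic on the left
        "right_high_beam": not lights_right,  # dim if traffic on the right
    }
```

The point of the sketch is that the control logic itself is trivial; the engineering cost sits in the camera detection pipeline, which AHBC can share with LDW and TSR.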

System: Pedestrian / Obstacle / Vehicle Detection (PD)

Sensor: camera, radar, IR

Pedestrian (and obstacle and vehicle) detection (PD) systems rely primarily on camera sensors to understand the surrounding environment, using a single camera or, in more complex systems, a stereo camera. The relevant "category variables" (clothing, lighting, size, and distance) vary widely, the background is complex and constantly changing, and the sensors sit on a moving platform (the vehicle), all of which make it difficult to determine the visual characteristics of pedestrians in motion. IR sensors can therefore be used to enhance a PD system. Radar can likewise enhance vehicle detection: it provides excellent distance measurement, performs well in harsh weather, and can measure the speed of other vehicles. Such complex systems must use data from multiple sensors simultaneously. (This process, called sensor fusion, is discussed in more detail later.)

System: Driver Drowsiness Alarm

Sensor: In-car IR camera

The drowsiness alarm system monitors the driver's face, measuring head position, eye state (open/closed), and similar warning indicators, and issues an alarm if the driver shows signs of falling asleep or appears to be losing consciousness. Some systems also monitor heart rate and breathing. Ideas not yet realized include having the system steer the vehicle to the roadside and bring it to a stop.

Demand: Flexible, high-performance technology platform

Although it is difficult to predict in detail how these functions will develop or how widely they will be applied, several points are clear from a technical perspective:

· There is no single architecture that can meet the emerging needs of various applications.

· A flexible platform is needed to adapt to market trends and achieve the latest capabilities while meeting cost, planning and performance goals.

· To meet the high performance requirements of ADAS applications, balance should be achieved in software and hardware.

· Systems that use several different types of sensors to perform safety-related tasks will be more robust.

Sensor fusion

Most ADAS applications must process and analyze multiple signals from multiple sensors: video cameras, radar, infrared sensors, and other sensors, such as lasers, that may appear in the future. Hazard detection, for example, requires more than integrating and analyzing the data streams from multiple cameras; radar data must also be used if the system is to work in all weather conditions. The term sensor fusion describes this integration of different signals in an ADAS application.

One algorithmic approach to sensor fusion is Kalman filtering, a family of related algorithms that illustrates just how complex the ADAS task is. A Kalman filter can, for example, integrate video and radar input signals and use the data to generate a snapshot of the current environment. It then applies a process called "dead reckoning" to these snapshots, calculating what is likely to happen in the surroundings based on physics: it estimates the new positions of surrounding vehicles, determines that the trees beside the road are not moving, and so on. The filter then compares the predicted and observed snapshots and estimates, with a measure of confidence, what should be done. If the car is using ACC and the vehicle ahead is too close, for example, it can slow down or brake.
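The predict ("dead reckoning") and update steps just described can be shown in a minimal one-dimensional Kalman filter that tracks the range to a lead vehicle from noisy radar readings. The noise parameters and measurement values are illustrative assumptions, not from any real sensor.

```python
# Minimal 1-D constant-velocity Kalman filter: one predict+update
# cycle per radar reading. State is [range, range-rate].

import numpy as np

dt = 0.1                                   # sensor period, seconds
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (physics)
H = np.array([[1.0, 0.0]])                 # we measure range only
Q = np.eye(2) * 0.01                       # process noise (illustrative)
R = np.array([[1.0]])                      # measurement noise (illustrative)

def kalman_step(x, P, z):
    """One cycle: x is the (2,1) state, P its covariance, z the range."""
    # Predict: propagate the last snapshot forward by physics
    # ("dead reckoning").
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: blend the prediction with the new measurement,
    # weighted by confidence (the Kalman gain).
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Usage: track a lead vehicle slowly pulling away, from noisy ranges.
x = np.array([[50.0], [0.0]])              # initial guess: 50 m, 0 m/s
P = np.eye(2)
for z in [50.1, 50.2, 50.31, 50.4]:
    x, P = kalman_step(x, P, np.array([[z]]))
```

A production fusion system would run such filters over many tracked objects and fuse camera and radar measurements into the same state, but the predict/update structure is the same.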

The role of the ECU

The electronics controlling these processes are integrated into electronic control units (ECUs). ECUs were originally introduced into automobiles to meet environmental standards by controlling fuel injection, ignition timing, idle speed, and so on; today they take on more and more functions. In general, functions such as radar and camera processing are developed separately and integrated into the vehicle one by one, so that multiple functions end up running on multiple ECUs. From a cost perspective this is not ideal. A better approach is to connect the outputs of many sensors to one processing unit, perform sensor fusion there to understand the vehicle's situation more reliably, and develop the appropriate countermeasures in software. The computing power this integrated approach requires, however, has far exceeded the cost budget of mid-range vehicles. To meet both performance and cost requirements while remaining as flexible as software, many architectures use Field-Programmable Gate Array (FPGA) technology.

What distinguishes an FPGA from a conventional integrated circuit is that the FPGA can be programmed after it leaves the factory. It can therefore be modified late in the design/product cycle to meet the latest sensor output formats, algorithms, or specifications, making it ideal for supporting a variety of architectures and for adding new features at the last minute.

Altera's low-cost Cyclone® FPGA family is well suited to automotive applications, enabling engineers to integrate the sensor interface, image processing, communication interfaces, and analytics into a single silicon device, reducing both the component count and the complexity of a typical ECU architecture. Altera FPGAs use hardware acceleration to handle 720p30 (720 scan lines, 30 frames per second) high-definition video streams as well as inputs from next-generation 77-GHz radar sensors. Rather than committing the algorithms entirely to hardware, the combination of high-performance hardware and software programmability in Altera® FPGAs lets designers meet the stringent 25- to 30-ms latency requirements of ADAS applications.

Altera FPGAs are not only technically capable; they also shorten the design cycle, enabling OEMs and Tier 1 manufacturers to bring ADAS applications to market quickly and to add business value without significant hardware changes. In addition, Altera offers a straightforward, pin-compatible migration path to application-specific integrated circuits (ASICs), reducing cost over the entire product lifecycle.

Summary

Competitive and regulatory pressures in today's global automotive market are prompting manufacturers to adopt ADAS technology across their vehicle lines. With their combination of flexibility and performance, Altera FPGAs are well suited to meeting these goals.
