A profound change is underway in how we drive. Vehicles are becoming more automated and more autonomous, not only assisting the driver but taking over key decisions. To achieve this, cars must learn not only to see, but also to communicate with their environment. Innovative optical technologies play a central role in achieving this goal.
Janet Anderson
As any city driver knows, safely navigating unpredictable urban environments requires not just quick and accurate reactions but also fast thinking and good judgement. Is the motorcyclist ahead about to turn and cross your path? Has the child at the side of the road seen you? Do you have space to overtake the vehicle ahead before the road narrows? Scores of complex events unfold around the car at every moment – the person at the wheel has to be aware of everything that matters and is responsible for the decisions and actions they take.
That has been the standard since the invention of the automobile. But now the industry is moving toward a new vision, as more of the tasks traditionally carried out by the human are taken over by the machine, with the goal of making traffic safer for everyone on the road. Adaptive cruise control slows the car down when it comes too close to the vehicle in front; the blind-spot assistant warns of vehicles approaching from the side; the lane assistant notifies the driver if the car unintentionally drifts to the edge of the lane.

“It started with technologies that assist the driver and help them to better anticipate and overcome dangers, such as emergency braking assistance, improved rearview mirrors, and intelligent headlights,” says Dr. Steffen Runkel, Head of Optics at Bühler Leybold Optics. “But increasingly, it means the car itself is making the decisions.”
The technology that enables cars not only to sense their environment but also to make sound decisions comes from the field of optics.
Optical sensors are at the heart of this. Sensors can scan the environment and detect the size and distance of objects. Projection modules are then needed to communicate relevant information to other road users in the vicinity – a prerequisite for autonomous driving. Both sensors and projection modules can be integrated into various parts of vehicles. The smartest location for them, however, is most likely within advanced lighting units, as these have to be integrated into the car body in any case.
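As a rough illustration of the ranging principle behind such optical sensors, the distance to an object can be derived from how long a laser pulse takes to travel out and back. The sketch below is illustrative only; the 400-nanosecond round-trip time is an assumed example value.

```python
# Illustrative sketch of pulsed time-of-flight ranging, the principle behind
# many optical distance sensors such as LiDAR: a laser pulse is emitted,
# reflected by an object, and the round-trip time of the echo gives the range.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0  # halve: light travels out and back

# Assumed example: an echo detected 400 nanoseconds after emission is roughly 60 m away.
print(f"{distance_from_round_trip(400e-9):.1f} m")
```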
To scan and detect the environment accurately, the sensors need to be able to separate the important information from distractions, such as reflections from sunlight. In other words, they need the kind of filter that the human brain applies to the information we receive from our senses.
Leybold Optics in Alzenau, Germany, specializes in this area. Its HELIOS sputter coater technology, developed over 20 years ago, is used to manufacture exactly this type of filter. Known as a band-pass filter, it consists of a specific sequence of nanometer-thin optical layers created using the sputtering method. “A material like silicon or tantalum is used for the coating. We call this the target. It is placed as a block in the sputter cathode,” explains Dr. Runkel. “With the help of an energetic plasma, individual ions are created which bombard the target material. This ejects individual silicon or tantalum atoms out of the target material, which in turn condense on the filter. When oxygen gas is added, these layers oxidize and become transparent. The result is a stack of nanometer-thin layers of different materials. Depending on the composition, they filter different wavelengths and can thereby block unwanted reflections from sunlight or the distracting light from other vehicles when driving at night. The sensor detects only the light that is wanted: the light sent out from a laser and reflected back by the surroundings of the car.”
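The filtering Dr. Runkel describes is the result of optical interference within the layer stack. The following transfer-matrix sketch is purely illustrative and is not Bühler's actual filter design: the layer materials, the number of layers, and the 905-nanometer design wavelength (a common LiDAR laser line) are assumed example values. It shows how an alternating stack of high- and low-index quarter-wave layers with a half-wave spacer transmits a narrow band around the design wavelength and blocks neighboring wavelengths.

```python
import numpy as np

n_H, n_L = 2.10, 1.46      # assumed indices for a high/low-index pair (e.g. Ta2O5 / SiO2)
n_air, n_sub = 1.00, 1.52  # incident medium and glass substrate
lam0 = 905e-9              # assumed design wavelength in meters (a common LiDAR laser line)

def qw(n):
    """Physical thickness of a quarter-wave layer at the design wavelength."""
    return lam0 / (4 * n)

# Simple Fabry-Perot band-pass: quarter-wave mirror / half-wave spacer / quarter-wave mirror.
stack = [(n_H, qw(n_H)), (n_L, qw(n_L))] * 6     # first mirror, (HL)^6
stack += [(n_L, 2 * qw(n_L))]                    # half-wave cavity spacer
stack += [(n_L, qw(n_L)), (n_H, qw(n_H))] * 6    # second mirror, (LH)^6

def transmittance(lam):
    """Normal-incidence transmittance of the stack via the transfer-matrix method."""
    M = np.eye(2, dtype=complex)
    for n, d in stack:
        delta = 2 * np.pi * n * d / lam          # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    return 4 * n_air * n_sub / abs(n_air * B + C) ** 2

for lam_nm in (850, 880, 905, 930, 960):
    print(f"{lam_nm} nm: T = {transmittance(lam_nm * 1e-9):.3f}")
```

Running the sketch shows high transmittance only near 905 nanometers, with the neighboring wavelengths strongly suppressed, which is the behavior that lets a sensor see the laser return while ignoring light at other wavelengths.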
It started with technologies that assist the driver and help them to better anticipate and overcome dangers. But increasingly, it means the car itself is making the decisions.
Dr. Steffen Runkel,
Head of Optics at Bühler Leybold Optics
Currently, LiDAR systems are installed on the car’s roof, arranged at various angles so that they can capture the entire surroundings. However, such systems will only reach market maturity when the components are smaller and can be integrated into existing car components, such as headlights.
“The entire industry is working in top gear on this, both established companies and start-ups. There are dozens of start-ups around the world that have their own ideas of how LiDAR technology could work in cars in the future,” says Dr. Runkel. “There is a lot of testing and development going on. The system must be 100 percent reliable; 95 percent is not enough. It also needs to be more affordable. As the system is more widely applied in the automotive industry, the costs will come down.”
The Leybold Optics team works closely with customers and partners to keep track of the technology that is being used. “We are in regular contact with research institutions in Germany, France, and Belgium, and with automotive manufacturers and their suppliers,” says Herbig. “We offer them the opportunity to test their creative ideas and developments in our Application & Training Center in Alzenau.”
The Application & Training Center includes a 1,200-square-meter test area, a high-tech lab, and a state-of-the-art research and development area with two HELIOS systems. There is also a DLC (diamond-like carbon) machine, which is used to coat components such as night-vision cameras. These forward-facing cameras need to withstand enormous loads, such as stormy weather and use in heavy traffic. A DLC coating on the outer camera windows makes them highly resilient.
According to a report published by the consulting firm McKinsey in January 2023, even if the road to autonomous driving is proving longer than the first visionaries expected, a consensus is gathering around its potential to transform transportation and society as a whole. Today, most cars already include basic advanced driver-assistance systems.
By 2030, the consultants forecast that the value of the hardware market to support autonomous driving, including domain control units, cameras, sensors, LiDAR, and radar, could be between USD 55 billion and USD 80 billion. The combination of cameras, LiDAR, and radar is seen as the way forward because each of these sensor types works at a different distance.
“Many of these ideas are still in development,” says Dr. Runkel. “It is already clear that the car of the future will learn to see through a variety of smart sensors based on optical and radar technologies. Our solutions can support the industry in driving toward this goal and making mobility safer for everyone.”