At CES 2017, Osram stood out as a provider of lighting systems, showcasing the four-channel LiDAR laser it announced in November. It is the first four-channel LiDAR laser capable of scanning by deflecting the light beam with a MEMS chip, and it is tailored for use in self-driving cars.
As a lighting solutions company, Osram has to predict what original equipment manufacturers will require from its products. Of course, both groups must not only adhere to government regulations but anticipate them, and in the radar and LiDAR space those regulations are evolving alongside the automotive industry. To keep up with the conversation, Product Design & Development spoke to Rajeev Thakur, Product Marketing Manager from Osram Opto.
The first parameters that have to be determined, Thakur said, are speed and lighting. There are few federal rules about autonomous driving, so the automotive industry is still working to define rules for autonomous cars. The speed of the car determines the required specifications for headlights and sensors, because the car must be able to spot obstacles in time to stop for them: a car going 100 miles per hour needs a 190-meter sensing range.
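The 190-meter figure is consistent with a simple stopping-distance calculation. The sketch below is a back-of-envelope illustration, not Osram's method; the reaction time and braking deceleration are assumed values chosen for illustration.

```python
# Back-of-envelope sensing range for a given speed.
# The 1.0 s reaction time and 7 m/s^2 deceleration (dry pavement)
# are illustrative assumptions, not figures from the article.

def required_sensing_range_m(speed_mph, reaction_s=1.0, decel_mps2=7.0):
    """Distance covered during reaction plus braking distance."""
    v = speed_mph * 0.44704                 # mph -> m/s
    reaction_dist = v * reaction_s          # travel before braking starts
    braking_dist = v * v / (2 * decel_mps2) # v^2 / (2a)
    return reaction_dist + braking_dist

# At 100 mph this comes out close to the ~190 m range cited above.
print(round(required_sensing_range_m(100)))
```

With these assumed parameters the result lands within a few meters of the article's 190-meter figure; different reaction-time or road-surface assumptions shift it accordingly.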
Osram and Innoluce (now owned by Infineon) partnered to create the MEMS LiDAR, which uses a lens to focus light onto a MEMS mirror. The mirror scans in a vertical line 2,000 times per second.
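A quick way to see what 2,000 scan lines per second buys: divide by an assumed frame rate to get the number of vertical lines per frame. The 25 Hz frame rate below is a hypothetical value for illustration, not a published Osram/Innoluce specification.

```python
# What a 2,000-line-per-second MEMS scan implies for frame resolution.
# The 25 Hz frame rate is an assumed value, not a spec from the article.

SCAN_LINES_PER_SEC = 2000  # from the article: mirror scans 2,000 lines/s

def lines_per_frame(frame_rate_hz=25):
    """Vertical scan lines available in one frame at the given rate."""
    return SCAN_LINES_PER_SEC // frame_rate_hz

print(lines_per_frame())    # 80 lines per frame at an assumed 25 Hz
print(lines_per_frame(10))  # 200 lines per frame at an assumed 10 Hz
```

The trade-off is direct: a faster frame rate leaves fewer scan lines per frame, so horizontal resolution and update rate pull against each other.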
The Innoluce LiDAR is anticipated to be available in 2020 and is currently at the proof-of-concept stage. It has integrated drivers for controlling the mirror and comes as a package with the controller and the mirror. Demonstration units are not yet complete, but the end product will ideally be only about an inch deep, roughly the size of an iPhone. Packaging four laser diodes together means the entire module can be surface-mounted as one unit while the lasers remain individually controllable, which reduces assembly cost and time.
The four-laser package includes a trigger signal that controls the laser emission, as well as additional triggers for the laser, the MEMS, and the receiving element array, plus a master computer chip. This setup brings challenges, including the need to crunch a lot of data. The information returned by the lasers must define an object, at least two sensor types (radar, LiDAR, or camera) must agree on what that object is, and the system must then devise how to react to it. Weather and time of day affect lighting such that the qualifications for defining an object are complex and full of variables.
Another challenge is making the device both small and robust. Industry experience is key here; companies already know how to make devices small and rugged. Ensuring the LiDAR operates at effective range, resolution, and field of view is more difficult.
The laser has about a 16-degree field of view, with each of the four lasers covering 4 degrees vertically. This is particularly important in autonomous cars because of the need to detect objects above or below the vehicle. The lasers use 84 watts per channel, although customers can get about 120 watts at 40 amps if they customize the laser diode.
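The channel geometry above can be sketched as a simple mapping from channel index to vertical angle band. The numbers come from the article (four channels, 4 degrees each, about 16 degrees total); centering the field of view on the horizon is an assumption made here for illustration.

```python
# Mapping the four laser channels to vertical angle bands.
# Four channels x 4 degrees = 16-degree field of view (per the article).
# Centering the field on 0 degrees is an illustrative assumption.

NUM_CHANNELS = 4
DEG_PER_CHANNEL = 4
TOTAL_FOV = NUM_CHANNELS * DEG_PER_CHANNEL  # 16 degrees

def channel_band(channel):
    """Return the (low, high) vertical angle in degrees for channel 0-3."""
    low = -TOTAL_FOV / 2 + channel * DEG_PER_CHANNEL
    return (low, low + DEG_PER_CHANNEL)

for ch in range(NUM_CHANNELS):
    print(ch, channel_band(ch))
```

Because each channel owns a fixed vertical band, a return on a given channel immediately localizes the obstacle's elevation, which is what allows the system to tell an overhead sign from debris on the road.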
Another constraint is pulse width. Customers want a narrower pulse width so they can supply more peak power, and right now Osram is aiming for a pulse width under 5 nanoseconds.
As customers look for sensors with smaller and smaller pulse widths, eye safety also becomes a concern. A short pulse width combined with high peak power both clarifies the signal and keeps the infrared light within safe parameters.
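The eye-safety logic can be made concrete with per-pulse energy: energy is peak power times pulse width, so a very short pulse keeps the energy per pulse, and hence the average optical power, low even when peak power is high. The sketch below uses the article's 120 W and 5 ns figures; the 100 kHz pulse repetition rate is an assumed value for illustration only.

```python
# Why short, high-peak-power pulses can stay eye-safe:
# per-pulse energy = peak power x pulse width, and average power
# = pulse energy x repetition rate. The 100 kHz repetition rate
# is an illustrative assumption, not an Osram spec.

def pulse_energy_uj(peak_power_w, pulse_width_ns):
    """Energy per pulse in microjoules."""
    return peak_power_w * pulse_width_ns * 1e-9 * 1e6

def average_power_mw(peak_power_w, pulse_width_ns, rep_rate_hz):
    """Average optical power in milliwatts at the given repetition rate."""
    energy_j = pulse_energy_uj(peak_power_w, pulse_width_ns) * 1e-6
    return energy_j * rep_rate_hz * 1e3

# 120 W peak, 5 ns pulse: well under a microjoule per pulse,
# and tens of milliwatts average at an assumed 100 kHz rate.
print(pulse_energy_uj(120, 5))
print(average_power_mw(120, 5, 100_000))
```

The contrast is the point: a 120-watt peak collapses to milliwatt-scale average power, which is why narrower pulses let customers push peak power up without leaving the eye-safe envelope.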