Rambus, Lensless Smart Sensor Prototyping

Prototyping + Hardware

NOTICE – This project was completed while working at frog design. All rights to the project belong to frog design and the client. Portfolio Link.

Rambus asked frog to investigate use cases and develop future application prototypes for their Lensless Smart Sensor (LSS) technology. The Rambus LSS is an optical sensing technology that delivers savings in package size, power, and price by replacing traditional lenses with tiny diffractive optics. It is small (1–2 sq. mm), low power (it can last five years on a coin battery), and well suited for computer vision applications since it reconstructs a scene at a wide angle. After research into a number of verticals, and exploration of the development kit built in C++ and OpenCV on the Raspberry Pi, we settled on three focus areas: smart city lighting, drone collision avoidance, and eye tracking for future VR headsets and AR glasses.

  • Rambus
  • Semiconductor
  • Project Lead, Lead Developer
  • 2015
  • frog Rambus Prototyping
  • ARM Community
  • OpenFrameworks, C++, OpenCV, Raspberry Pi, Arduino, Anki Drive, Python

Rambus provided a developer kit built on a Raspberry Pi with a rudimentary C++ SDK that captured image data from the LSS sensor. We modified this sensor payload for each of the focus areas above and, in partnership with Rambus, demoed the results at industry summits in San Francisco and Munich.

The eye-tracking demo used OpenCV to segment the pupil and find the center point of the user's eye. We prototyped a number of sensor/IR LED orientations on different pairs of glasses. For the final demo we 3D printed a custom housing, which let different users move a small target around the screen with their gaze.
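The core of pupil tracking like this is simple: under IR illumination the pupil is the darkest region in the frame, so thresholding dark pixels and taking their centroid gives a usable center point. A minimal NumPy sketch of that idea (the threshold value and synthetic frame are illustrative assumptions, not the demo's actual pipeline):

```python
import numpy as np

def pupil_center(gray, dark_thresh=40):
    """Return the (x, y) centroid of near-black pixels, assumed to be the pupil.

    gray: 2-D uint8 grayscale frame; dark_thresh: assumed intensity cutoff.
    Returns None when no sufficiently dark pixels are found.
    """
    mask = gray < dark_thresh           # pupil pixels are near-black under IR
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                     # no pupil visible in this frame
    return float(xs.mean()), float(ys.mean())

# Synthetic frame: bright background with a dark 10x10 "pupil" blob.
frame = np.full((64, 64), 200, dtype=np.uint8)
frame[15:25, 25:35] = 10
print(pupil_center(frame))  # -> (29.5, 19.5)
```

In the real demo OpenCV's blob and moment utilities played this role; the centroid math is the same.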

Sensors with such low weight and power consumption are good candidates for use on drones. For the optical collision avoidance use case we built a custom 3D printed housing for the Raspberry Pi payload, a transformer, and the LSS sensor. The computing system was powered by the drone's onboard power (meant for a GoPro camera), and control was sent directly from the Raspberry Pi to the drone over the drone's local Wifi network. In the final demo it responded to a user raising their hand to stop forward momentum.

The smart cities use case imagined a future scenario where the LSS could be dispersed among existing city infrastructure like IoT dust. Its thermal sensing capability could enable individual street lamps to detect approaching cars or pedestrians and turn on only when needed. We hacked an Anki Drive racing system and built custom miniature street lamps to simulate the scenario.
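The lamp logic behind that scenario can be modeled as a small state machine: switch on as soon as enough warm pixels appear, and switch off only after a few consecutive empty frames so the lamp does not flicker as a car passes. A toy sketch under assumed thresholds (the frame sizes, cutoffs, and hold-off are illustrative, not values from the demo):

```python
import numpy as np

class SmartLamp:
    """Toy street-lamp controller: on while warmth is detected, off after a hold-off."""

    def __init__(self, heat_thresh=120, min_hot_pixels=10, holdoff_frames=3):
        self.heat_thresh = heat_thresh        # assumed thermal intensity cutoff
        self.min_hot_pixels = min_hot_pixels  # pixels needed to count as presence
        self.holdoff = holdoff_frames         # empty frames before switching off
        self.cold_streak = 0
        self.on = False

    def update(self, thermal_frame):
        """Feed one 2-D thermal frame; return the lamp's on/off state."""
        hot = int((thermal_frame > self.heat_thresh).sum())
        if hot >= self.min_hot_pixels:
            self.on = True
            self.cold_streak = 0
        else:
            self.cold_streak += 1
            if self.cold_streak >= self.holdoff:
                self.on = False
        return self.on

lamp = SmartLamp()
warm = np.full((16, 16), 200, dtype=np.uint8)   # pedestrian in view
cold = np.zeros((16, 16), dtype=np.uint8)       # empty street
print(lamp.update(warm))   # -> True
print(lamp.update(cold))   # -> True (still within hold-off)
print(lamp.update(cold))   # -> True
print(lamp.update(cold))   # -> False (hold-off expired, lamp off)
```

In the physical demo the equivalent output drove the miniature lamps as the Anki cars approached each sensor's field of view.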