PAWS Featured on Mashable

Source: Mashable

Dr. Xiaofan (Fred) Jiang’s lab, at the Data Science Institute of Columbia University, has a scenic view of the Hudson River and easy access to busy streets where he can test his latest invention — smart headphones that warn pedestrians of road dangers. 

Small circuit boards sit on a gray table covered with mathematical equations written in pencil. They’re specifically engineered for different versions of the prototype — the latest of which is a modified pair of $10 overhead headphones Jiang ordered from Amazon. The idea, he said, is to build parts that can one day be integrated into mass-market headphones.

Jiang’s lab re-engineers commercial headphones using custom circuit boards that help an app signal road dangers.

The headphones were re-engineered with the custom circuit boards, as well as four extra microphones meant to detect street sounds. The circuit board extracts useful features from that audio, which are then transmitted to a custom smartphone app for AI analysis.

The app is trained to differentiate car sounds from background noise; calculate the distance and position of cars relative to the headphone wearer; and alert the wearer to surrounding dangers.
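To make that division of labor concrete, here is a minimal sketch of such a pipeline in Python. Everything in it is an illustrative assumption rather than the lab's actual code: the feature choice (per-channel log energy), the stand-in classifier, the constants, and the bearing estimate from the time delay between a pair of microphones (a standard cross-correlation technique).

```python
# Hypothetical sketch of the headphone-to-app pipeline described above.
# The feature choice, the stand-in classifier, and all constants are
# illustrative assumptions, not the lab's actual implementation.
import numpy as np

SAMPLE_RATE = 16_000        # assumed microphone sample rate (Hz)
MIC_SPACING_M = 0.18        # assumed spacing between left/right mics (m)
SPEED_OF_SOUND = 343.0      # m/s


def extract_features(frames: np.ndarray) -> np.ndarray:
    """Reduce a (4, N) block of microphone samples to per-channel log
    energies -- the kind of compact summary a low-power circuit board
    could hand off to the phone."""
    energy = np.mean(frames ** 2, axis=1)
    return np.log(energy + 1e-12)


def looks_like_car(features: np.ndarray, threshold: float = -8.0) -> bool:
    """Stand-in for the trained classifier: a simple energy gate."""
    return bool(np.max(features) > threshold)


def estimate_bearing(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate direction of arrival (radians) from the time delay
    between two microphones, found by cross-correlation."""
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)
    delay_s = lag / SAMPLE_RATE
    # Clamp to the physically possible range before taking arcsin.
    ratio = np.clip(delay_s * SPEED_OF_SOUND / MIC_SPACING_M, -1.0, 1.0)
    return float(np.arcsin(ratio))


# Simulated 100 ms block of 4-channel audio standing in for the mics.
rng = np.random.default_rng(0)
frames = rng.normal(scale=0.05, size=(4, SAMPLE_RATE // 10))

features = extract_features(frames)
if looks_like_car(features):
    bearing = estimate_bearing(frames[0], frames[1])
    print(f"alert: possible vehicle at {np.degrees(bearing):.0f} degrees")
```

A production system would replace the energy gate with the trained model the article describes and fuse all four microphones, but the stages mirror the ones above: capture, on-board feature extraction, classification, localization, alert.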

Jiang explains how the custom circuit board is wired to the four microphones the lab mounted on the headphones.

Jiang offered to start our day with a lab demo. Once I was finally able to attach one of the microphones to my uncooperative mock-neck collar, Steven Xia, the lab’s PhD research assistant, handed me his phone and instructed me to hit “start” on the app.

“Let’s hope it works,” Jiang joked. “What is it called — Murphy’s Law?”

Xia positioned himself at a 45-degree angle to my right and played sounds of a moving car from a speaker he was holding. No alerts from the headphones.

“I’m supposed to hear notifications, right?” I asked Xia, who proceeded to adjust the volume on his phone. Xia repositioned himself and continued playing car sounds. The headphones beeped, and the phone buzzed. A moving red dot on the app accurately showed me the simulated car’s position relative to where I stood.

The red dot in the app signals a car’s position relative to the headphone wearer.

Xia explained that the red dot had moved because the app infers distance from the car’s volume: the louder the sound, the closer the car.
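That heuristic matches the standard free-field falloff, where sound pressure decays roughly as 1/r, i.e. about 6 dB per doubling of distance. A small sketch of the inversion follows; the reference level and distance are made-up calibration values, not numbers from the lab's system.

```python
# Illustrative loudness-to-distance heuristic under a free-field 1/r
# falloff assumption. REF_DB and REF_DISTANCE_M are made-up calibration
# values, not measurements from the lab's system.
REF_DB = 70.0           # assumed level (dB) of a typical car at 10 m
REF_DISTANCE_M = 10.0


def distance_from_level(level_db: float) -> float:
    """Invert the 1/r law: level drops by 20*log10(r / r_ref) dB,
    so r = r_ref * 10 ** ((ref_db - level_db) / 20)."""
    return REF_DISTANCE_M * 10 ** ((REF_DB - level_db) / 20)


for db in (80, 70, 64, 58):
    print(f"{db} dB -> roughly {distance_from_level(db):.0f} m away")
```

In practice a system like this would track how the level changes across successive frames rather than trusting a single absolute reading, which is why Jiang notes below that the static demo mainly showcases direction.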

“Right now, it’s about how the loudness is changing, right? But it’s not actually changing,” Jiang added. “This is really just better at showing the direction. We should go to the streets and do something more realistic.”

As we headed toward a nearby street, Jiang told me how he had tested the headphones at four different locations, both urban and suburban, to evaluate their efficacy in different settings.

“It’s funny because I drive an electric hybrid car,” Jiang added, noting that quiet electric and hybrid cars can sometimes be challenging for the system to detect. But he also mentioned that legislation in the EU requires hybrid and electric cars to generate artificial noise, and that others are advocating for cars to emit digital signals, which his system could be modified to detect.

Noticing a loud generator near our destination, Jiang recommended that we pick a location that would minimize any unforeseeable disruptions. Xia suggested a place just across the street and around the corner, where he had tested the headphones before.

“Are we comfortable with jaywalking? With our system, you can be more comfortable doing that,” Jiang said sarcastically. “No, no, no. There’s one thing we try to reassure: We’re not trying to desensitize pedestrians’ attention. We do not encourage jaywalking.”

Steven Xia, the lab’s PhD research assistant, conducts a roadside test.

We settled at a location where a generator rumbled in the background. This time, I suggested that Xia put on the headphones so that I could observe. The red dot hovered over the part of the screen where the generator was located. “I think this is from that noise back there,” Xia said.

“Also the airplane,” Xia added as an airplane approached us overhead. The red dot bounced along with the movement of the plane. Jiang noted to himself that the lab would need to sample and record more airplane sounds and label them as “not car” so the AI can learn to differentiate cars from airplanes.

Jiang observes the headphones’ performance to plan future adjustments.

“This is why I say we’re still about a year away from when we can transfer it to the university’s [venture capital partner]. When the scenario is ideal, it’s what we expect. But there are a lot of these random things — it has false positives,” Jiang said, noting that these roadside tests often worked better in suburban environments. “But this is something we need to solve. Urban is where the problem is.”

Jiang told me he had experimented with other options that would detect vehicles better, such as cameras and LiDAR technologies used by self-driving cars. “But they’re power hungry; we want to be able to power this with a cell phone battery,” he said. “And we discussed privacy as another problem.”

But he also mentioned the idea of a smart city: if, in the future, cities are equipped with sensors that track every moving object, the headphones would be much better at warning everyone of potential dangers.

A few cars drove by again, and the app worked perfectly those times.

“Every time we come out we have to think about how to improve this. Sorry this didn’t work out as well as I thought it would,” Jiang said as we walked back toward campus. “But maybe next year we’re going to meet up again.”

—by Haidee Chu