Two ASU entrepreneurs are taking their computer vision technology to the streets. Through their startup, Argos Vision, the pair are developing smart traffic cameras that can passively capture, analyze and deliver droves of data to help cities improve road safety and efficiency.
A graphical representation of the Argos Vision technology. Photo illustration by Travis Buckner
By Pete Zrioka
July 18, 2022
It’s said that nothing is certain, except death and taxes. Let’s add a third certainty to that list: traffic.
All across the globe, traffic engineers and city planners are locked in an eternal struggle to improve the flow of traffic, the efficiency of streets and the safety of pedestrians, cyclists and drivers. Finding the best way to meet these goals requires an enormous amount of data, which is often difficult to collect and analyze.
Two Arizona State University entrepreneurs are making this data easier to understand and access. Mohammad Farhadi and Yezhou Yang founded Argos Vision, a tech startup developing smart traffic cameras that can passively capture, analyze and deliver droves of data to help cities improve road safety and efficiency.
The pair created a self-contained, solar-powered traffic camera that uses on-board computer vision, a type of artificial intelligence, to identify and classify what it sees.
“We identified three major things we wanted to accomplish with this technology,” says Farhadi. “Cost reduction, privacy protection and rich metadata extraction.”
Installing traffic cameras can be costly to local governments. Closing intersections to add new power and network cable to existing infrastructure is a lengthy and expensive process. Argos Vision solves this financial roadblock with a self-contained camera system that runs off solar power and transmits data over a cellular network.
“We want to extract rich data that meets not only the minimum desire of cities, such as vehicle counting, but data that can be used in the future as well,” says Farhadi.
Named for the many-eyed giant of Greek myth, the Argos algorithm can also capture detailed contextual information, including vehicle type, dimensions, color and markings, and can even generate a 3D model of a vehicle for future reference.
Distinguishing vehicle type could be helpful for road maintenance. Roads degrade at different rates depending on their use, and understanding which vehicles use which roads at high rates may help cities better allocate resources and predict where preventative maintenance is most needed. For example, an Argos camera might observe large trucks commonly using a shortcut to access an industrial area.
“At that location, a city might elect to reinforce a road so they don’t have to replace it every year,” says Farhadi.
Despite the detailed information the Argos Vision technology collects, it employs no facial recognition and gathers no identifying information, protecting the privacy of everyone on the road.
Argos extracts detailed information using a novel software framework developed by Farhadi. As the Argos cameras capture images, a neural network analyzes their content and distills it into component parts. Much like how our brains quickly parse what we see into separate elements — person, dog on a leash, bus stop — the neural network contextualizes visual information in a similar way.
Traditionally, neural networks are computationally and power intensive, especially on small devices such as cameras. But Argos Vision’s software allows its neural network to run on low power while providing real-time traffic monitoring that collects incredibly detailed data, says Yang.
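The pipeline described above — detect objects in each frame, then keep only anonymized metadata — can be illustrated with a hypothetical sketch. The `Detection` record, category names and thresholds below are invented for illustration; Argos Vision’s actual framework and data formats are not public.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class Detection:
    """One object found by the on-board neural network in a single frame.

    Only anonymized attributes are kept; no images or identifying
    information would leave the camera.
    """
    category: str      # e.g. "car", "truck", "bus", "pedestrian", "cyclist"
    color: str         # dominant color of the object
    length_m: float    # estimated dimension, e.g. from a 3D model


def summarize(detections: list[Detection]) -> dict[str, int]:
    """Distill raw detections into the kind of rich metadata a city
    might consume: counts per category, with heavy vehicles broken out."""
    counts = Counter(d.category for d in detections)
    # Break out heavy vehicles separately (hypothetical 7 m cutoff),
    # the kind of signal useful for predicting road wear.
    counts["heavy_vehicle"] = sum(
        1 for d in detections if d.category == "truck" and d.length_m > 7.0
    )
    return dict(counts)


frame = [
    Detection("car", "red", 4.5),
    Detection("truck", "white", 9.2),
    Detection("pedestrian", "n/a", 0.5),
]
print(summarize(frame))  # counts by category, plus a heavy-vehicle total
```

Only the summary dictionary, not the imagery, would need to cross the cellular link, which is what keeps the bandwidth and privacy footprint small.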
A new point of view
Say a city wants to figure out why the intersection of Main Street and 1st Avenue is frequently congested. The city might send someone to observe traffic, or put down road sensors to count cars, or use mobile phone sensors to estimate the number of drivers in the area.
The problem with these methods is that the data collected is imprecise. Human observation offers only a snapshot of traffic and is prone to error. Road sensors don’t differentiate among buses, cars and emergency vehicles. Mobile data can’t tell whether 15 phone signals passing through an intersection represent 15 drivers or a mix of drivers, bus riders and pedestrians.
“This doesn’t give you a clear picture, because these are snapshots of data. Traffic has a dynamic nature,” says Farhadi. “The beauty of using a computer vision–based system like ours is that it gives cities a permanent, precise flow of information.”
Yang and Farhadi also see potential for the Argos system to augment and improve the function of autonomous vehicles.
“We can provide autonomous vehicles with situational awareness of other vehicles or pedestrians outside the scope of their on-board sensors,” says Yang. “Also, our rich metadata could help local authorities measure how safe the AVs are while operating on public roads.”
“Many of these research ideas, I have to attribute to Mohammad, thanks to his constant exploration of what is possible,” adds Yang.
The permanent flow of data supplied by Argos cameras can help cities evaluate more than just motor vehicle traffic. It could also help policymakers and city planners improve safety for all road users.
“Pedestrians are a big factor in street traffic,” says Farhadi. “Arizona has one of the highest pedestrian fatality rates, and we want to understand why that is happening and how to prevent it.”
Taking it to the streets
Argos cameras will be lending their vision to Arizona streets starting this summer, with two units slated for downtown Phoenix to help improve road safety for all users.
Both downtown locations — near City Hall and ASU’s Downtown campus, respectively — were chosen for their high pedestrian activity, says Simon T. Ramos, a traffic management and operations engineer in the Phoenix Street Transportation Department.
Along with collecting standard traffic information, such as counts of vehicles, pedestrians and cyclists, the Argos cameras will catalog near-miss data.
“Say there's a close call, where a vehicle crosses the path of a pedestrian. We can identify these conflict hotspots,” says Ramos.
Through its persistent monitoring and evaluation, Argos’ data will identify conflict areas between vehicles, bicycles and pedestrians. Ramos and his department can then use the near-miss data to develop tailored safety measures, such as changing signal timing or road markings, to mitigate those conflicts.
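One simple way to flag a near miss, sketched below as an assumption rather than Argos’ actual method (which the company has not detailed publicly), is to track a vehicle and a pedestrian as timestamped positions and check how close their paths come:

```python
import math

# Trajectories as (t_seconds, x_meters, y_meters) samples, the kind of
# track a camera following objects across frames might produce.
Trajectory = list[tuple[float, float, float]]


def min_separation(a: Trajectory, b: Trajectory) -> float:
    """Smallest distance between two tracked objects at matching timestamps."""
    positions_b = {t: (x, y) for t, x, y in b}
    best = math.inf
    for t, x, y in a:
        if t in positions_b:
            bx, by = positions_b[t]
            best = min(best, math.hypot(x - bx, y - by))
    return best


def is_near_miss(vehicle: Trajectory, pedestrian: Trajectory,
                 threshold_m: float = 1.5) -> bool:
    """Flag a conflict if the two paths ever come within threshold_m.

    The 1.5 m threshold is an illustrative value, not a standard.
    """
    return min_separation(vehicle, pedestrian) < threshold_m


# A car and a pedestrian whose paths nearly cross:
car = [(0.0, 0.0, 0.0), (1.0, 5.0, 0.0), (2.0, 10.0, 0.0)]
ped = [(0.0, 10.0, 5.0), (1.0, 5.0, 1.0), (2.0, 0.0, -3.0)]
print(is_near_miss(car, ped))  # prints True: the paths come within 1.0 m at t=1.0
```

A production system would likely use richer conflict measures, such as time-to-collision or post-encroachment time; this distance check simply illustrates the idea of mining trajectories for conflict hotspots.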
The city already has an array of traffic cameras collecting data, but Argos provides a more cost-effective alternative to existing systems.
“What really kind of drew our attention to this specific technology was it is economically cheaper than the competition,” says Ramos. “Phoenix is committed to working smarter and spending wisely and it’s an ongoing effort to identify technologies to improve travel times and reduce congestion and accidents.”
The Argos Vision team is looking forward to contributing to the city’s goals while refining their technology.
“Together with the city, we are excited to bring advanced AI technologies from ASU onto Arizona roads for social good,” says Yang.
Getting caught in traffic
Farhadi and Yang’s collaboration goes back to 2016, when both were newcomers to ASU.
“The school organized a student recruitment session, and I brought a poster of my research,” recalls Yang. “Four or five people stopped by, but Mohammad was the only person who was interested.”
Combining Yang’s expertise in computer vision and Farhadi’s background in hardware acceleration and computer networks, Argos Vision was born. When they began looking for the most lucrative use of their technology, they first landed on shopping malls.
“We focused on tracking the movement and amount of people to improve the HVAC efficiency in a retail area,” says Farhadi.
However, they found this route to be a dead end. Not only were a lot of competitors pursuing this application, but stores simply couldn’t justify the installation cost to save on heating and cooling. Retailers also wanted a system that could tell them more about their customers.
“We couldn’t tell you everything about somebody,” says Ryan Kemmet, Argos’ business and legal advisor. “We don’t have facial recognition and we can’t link people to their Facebook account or anything.”
Kemmet was drawn into the Argos orbit when Farhadi and Yang joined the National Science Foundation Innovation Corps Site at ASU (NSF I-Corps), which serves as a springboard for the nationwide NSF I-Corps program. The five-week training program, led by the J. Orin Edson Entrepreneurship + Innovation Institute, includes entrepreneurial training, industry mentorship and financial support for researchers looking to commercialize their technology. Kemmet served as Argos’ industry mentor during the ASU program, and after completing it, the team was selected to continue on to the national version.
“It’s quite an intensive program,” says Kemmet. “We went through some initial ideas of what we thought the applications of this technology would be, but it was the work in the national I-Corps program that helped us define the beachhead application for this technology.”
I-Corps, along with Farhadi and Yang’s professional experience and interests, ultimately led Argos to traffic monitoring. Farhadi learned about the growing need for active traffic monitoring during a 2020 summer internship with the Ford Motor Company. Yang saw the potential from his work with the Institute of Automated Mobility, which brings together academia, government and industry to develop a safe, efficient ecosystem to support testing and adoption of autonomous vehicles in Arizona.
Getting in the driver’s seat
Prior to participating in I-Corps, Yang and Farhadi participated in a number of Edson Entrepreneurship + Innovation Institute programs to strengthen their venture and connect to resources and entrepreneurial communities.
Argos joined Edson E+I Institute’s Venture Devils in 2020. The program provides mentorship and support to fledgling businesses, social enterprises and nonprofits founded by ASU students, faculty, staff and local community members with ties to ASU. The program includes an opportunity to participate in Demo Day, a biannual pitch competition where Venture Devils startups make their case for investment to a range of funding sources. In the fall 2021 Demo Day, Argos secured $6,500 in funding.
“It’s a powerful resource,” says Farhadi of Edson E+I. “Coming from Iran, I had entrepreneurial experience, but the U.S. has a totally different culture, totally different business landscape. Edson E+I has connected us with the right people, like Ryan, and really propelled Argos Vision.”
In Iran, Farhadi ran a business providing internet-based phone service and network security to remote regions. From a young age, he watched his father found and operate a telecom company, which left an impression on him.
“Iran is a consumer country; most of the time, technology is imported from elsewhere,” he says. “But when my father started selling his devices in the country, suddenly there was trust in a local company. That’s something I’ve tried to pursue in my life — people trusting your work.”
Despite entrepreneurship being a family tradition, starting a company wasn’t on his mind when he came to the U.S. to study. However, Farhadi relishes the opportunity to forge his own path.
“When you work at a company, you work within someone else’s system, you have specific goals that are assigned to you. You might be able to achieve them however you want, but they aren’t your goals,” says Farhadi. “As an entrepreneur, you create your own system. You set your own goals.”
Yang, recently named a Fulton Entrepreneurial Professor, says Edson E+I resources and programs are preparing entrepreneurs in AI like himself and Farhadi for very timely opportunities.
“As a professor in AI, I wouldn’t have been interested in entrepreneurship 20 or 30 years ago. The technology was just not ready,” he says. “Right now, we’re at a very special time, where the technology is maturing and the market is very hungry for real world applications. So having the connections and resources facilitated by ASU and Edson E+I to find those applications has been very helpful.”