Using AI-Driven Cameras to Increase Safety at Facilities

Oct. 18, 2022
Using automation to scan for potential issues allows safety professionals to focus on high-value decisions.

After about 18 minutes, a human being looking at camera footage loses half of their ability to notice aberrations. But an artificial intelligence-based software system integrated with a camera can view hours of footage without losing focus. It’s able to spot activities and behaviors that could become safety problems and send an alert to a safety manager so accidents can be prevented.

These types of solutions use computer vision and AI to capture and interpret rich media content like video and still images. And the good news is that many companies already have cameras installed so the software can be readily integrated.

“There are over one billion surveillance cameras in the world, and most are passively recording,” explains Jaidev Amrite, senior director of product management for SparkCognition, a software company based in Austin, Texas, that counts among its customers and partners Boeing, Chevron, Dell Technologies and Siemens Energy. “This means that when an accident occurs these tapes are reviewed to see what happened, develop lessons learned and create lifesaving rules. But that’s a retroactive type of process.”

A much more efficient and safer way to use the cameras is to let AI constantly scan the facility and alert safety teams when it discovers an abnormality. “This way the people can focus on high-value decisions, and this is a multiplier for safety operations,” says Amrite. The company’s technology is already being deployed by 17 major companies with 120,000 cameras in industries such as manufacturing, distribution and transportation.

The computer vision algorithms can perform a range of image-interpretation tasks, including:

  • Image classification
  • Object localization
  • Object recognition
  • Object verification
  • Object detection
  • Semantic segmentation
  • Object tracking
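
The article describes alerting safety teams only when a detection crosses a confidence threshold and matches a known unsafe condition. As a minimal sketch of that downstream logic (not SparkCognition's actual implementation — the class names, severity mapping, and threshold here are all illustrative assumptions, with the detector itself assumed to run upstream):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    """One object or condition found in a video frame by an upstream detector."""
    label: str          # class name, e.g. "person" or "missing_helmet"
    confidence: float   # model score in [0, 1]

# Hypothetical mapping from detected conditions to alert severity.
UNSAFE_CONDITIONS = {
    "missing_helmet": "high",
    "person_in_restricted_zone": "high",
    "blocked_walkway": "medium",
}

def alerts_for_frame(detections: List[Detection],
                     min_confidence: float = 0.8) -> List[dict]:
    """Return an alert record for each confident unsafe detection.

    Low-confidence detections are dropped so safety teams are only
    interrupted for conditions the model is reasonably sure about.
    """
    alerts = []
    for det in detections:
        if det.confidence < min_confidence:
            continue
        severity = UNSAFE_CONDITIONS.get(det.label)
        if severity is not None:
            alerts.append({"condition": det.label,
                           "severity": severity,
                           "confidence": det.confidence})
    return alerts

frame = [
    Detection("person", 0.97),            # known-safe class: no alert
    Detection("missing_helmet", 0.91),    # unsafe and confident: alert
    Detection("blocked_walkway", 0.55),   # unsafe but below threshold: dropped
]
print(alerts_for_frame(frame))
# → [{'condition': 'missing_helmet', 'severity': 'high', 'confidence': 0.91}]
```

The filtering step is what turns always-on monitoring into the "multiplier" Amrite describes: humans see only the confident, safety-relevant events, not the raw footage.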

The AI does this across the organization, continuously watching the facility. Only when certain types of conditions develop does the system alert the safety teams. This way humans can focus on high-value decisions instead of spending time monitoring for incidents. “It’s the problem-solving abilities of humans that become the AI code written into the technology,” says Amrite.

The Right Context for Safety

Sercan Esen, CEO of Intenseye, an EHS software platform based in New York, would agree with Amrite as to the more effective role EHS professionals can play. “Until now, EHS teams have relied upon manual, employee-led procedures to identify safe and unsafe acts happening in the workplace,” he said in a statement. “This involves individuals taking time away from their current roles to walk the shop floor in an attempt to identify unsafe acts as they happen.” A better method, notes Esen, whose company works with Fortune 500 companies, is software designed specifically for EHS professionals that uses privacy-preserving computer vision configured to run 24/7 safety inspections.

The company’s platform provides what it calls core AI features that can be activated to detect unsafe acts and conditions, drawing on data from the companies using Intenseye’s platform. When an unsafe act or condition occurs, an image and a 12-second video of that moment are displayed on the platform and sent as a real-time notification.

The areas covered include:

  • Emergency alerts provide immediate notifications in situations such as a worker falling to the ground or a fire-related emergency.
  • Area controls prevent workers from entering restricted areas and minimize occupational exposures through time limits.
  • Vehicle controls improve traffic safety by ensuring the safe operation of vehicles.
  • Behavioral safety helps avoid line-of-fire injuries such as pedestrians in vehicle paths, nonuse of handrails, and violations while working at heights.
  • Ergonomics AI detects actions that put the body at risk and prompts the manager to intervene before hazardous conditions arise.
  • PPE detection gathers data on the use of helmets, gloves, aprons, sleeves, respiratory protective equipment, reflective vests, and glasses.
  • Housekeeping reduces the risks of slips, trips, falls and collisions by ensuring that vehicle paths and pedestrian walkways in a facility are always clear and safe.
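
Intenseye's notifications pair each alert with a 12-second video of the moment, which implies the system keeps a short rolling window of recent frames. A minimal sketch of that pattern, assuming a fixed frame rate and a simple string-per-frame stand-in (the class name, frame rate, and API here are illustrative, not Intenseye's actual design):

```python
from collections import deque

class ClipBuffer:
    """Keep only the most recent N seconds of frames so that, when an
    unsafe condition is flagged, the moments leading up to it can be
    packaged into a short evidence clip."""

    def __init__(self, seconds: int = 12, fps: int = 5):
        # deque with maxlen silently discards the oldest frame
        # once the buffer is full, giving a rolling window.
        self.frames = deque(maxlen=seconds * fps)

    def push(self, frame) -> None:
        """Add the latest frame, evicting the oldest if the buffer is full."""
        self.frames.append(frame)

    def snapshot(self) -> list:
        """Return the buffered frames oldest-first for the alert clip."""
        return list(self.frames)

buf = ClipBuffer(seconds=12, fps=5)   # holds at most 60 frames
for i in range(100):                  # simulate 20 seconds of video
    buf.push(f"frame-{i}")

clip = buf.snapshot()
print(len(clip), clip[0], clip[-1])
# → 60 frame-40 frame-99
```

Because the buffer is bounded, memory stays constant no matter how long the camera runs; the cost of always-on recording is paid only when an alert actually fires.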

Armed with data from those areas, a company is in a better position to determine which areas need improvement. “When the system captures unsafe conditions, we encourage our clients to talk to the workers and teams at the frontline, openly,” Gokhan Vildiz, Intenseye’s business development director, told Wire Journal International. “It is so crucial to invite the frontline workforce to talk about the reasons why the unsafe act or situation occurs. We strongly believe that ‘context’ drives the behavior. Thus, we steer the leadership of our clients to address the systemic-level issues, to be able to provide the right context where people can work safely.”

Raising the Safety Standard

Continuously improving safety is how Chooch, an AI computer vision platform company based in San Francisco, envisions the future for this type of technology. “We believe computer vision is a fundamental part of our digital future and the applications are practically infinite—replicating any visual task,” said Emrah Gultekin, CEO and co-founder of Chooch, in a statement. “AI models can be trained and deployed on an extremely wide range of tasks, giving you a great deal of flexibility. That’s why organizations of all sizes and industries are now applying computer vision to improve efficiency and accuracy, boost their productivity, and cut costs.”

The company, which partners with such high-tech companies as Microsoft, Nvidia and Lenovo, serves the manufacturing, retail and other sectors and sees increasing adoption of the technology, pointing to a recent IDG/Insight survey in which a majority of respondents said that computer vision will boost their revenue, and 37% planned to implement the technology in the future. (Currently, only 10% of respondents are using computer vision, while 44% are still investigating it.)

The reason for this level of investment, according to the IDG/Insight report, is that the majority of those surveyed believe computer vision has the potential to affect key business areas, including growing revenue (97%) and saving time and money (96%).

“We were not surprised to find computer vision squarely in the awareness phase. It’s an extremely complex emerging technology that requires a significant investment, with an average return of two to three years and real-world examples just starting to materialize to prove the business case,” said Amol Ajgaonkar, chief technology officer of Intelligent Edge at Insight, referring to the study results.

When asked for the best use case in a variety of industries, the respondents noted that computer vision can improve their organizations in several ways. For instance, the elimination of tedious, expensive or dangerous work is a motivation for 58% of manufacturers and 49% in retail and wholesale distribution. Augmenting current processes and improving employee experiences is a driving factor for 47% in the energy sector, 46% in healthcare and 43% in manufacturing. And 53% in the energy and utility sector, and 41% in transportation, say it’s a way to stay ahead of the competition.

The energy (56%) and healthcare (51%) sectors recognize that the technology can help to deliver new, more innovative products and services to their customers. Only 44% of retailers ranked this outcome as a priority, indicating that they may be missing an opportunity for growth and differentiation.

From a market perspective, revenue for the technology was $15.9 billion in 2021 and is expected to hit $51.3 billion by 2026. Part of the reason for this increase is that by 2025 video analytics will be a standard element in two-thirds of new video surveillance installations, compared with less than 30% in 2020.

The breadth and capability of this technology will continue to grow, Amrite believes. “The beauty of having visual AI is that it can easily monitor not just human behavior but also see behind the machines. For example, it would be able to determine if they are emitting toxic chemicals. And the field will continue to integrate further into wearables so that as we become more tech literate we can raise the safety standard in all environments.”
