Machine Learning Boosts Oil Analysis

An employee at Bureau Veritas runs a test on an oil sample. The company's oil analysis business found that artificial intelligence and machine learning technologies can accurately evaluate high volumes of samples faster, allowing its analysts to focus more on critical samples. Photo courtesy of Bureau Veritas

There is increasing talk in the lubricants industry about a range of technologies collectively referred to as Industry 4.0 and how they can be used to improve operations. An oil analysis company offered concrete examples during a conference last year.

Bureau Veritas incorporated artificial intelligence and machine learning technologies to increase both the speed and accuracy of oil sample analysis, allowing more time for human analysts to focus on problematic samples that most need their attention. That was the message Cary Forgeron, Bureau Veritas North American director for oil condition monitoring, shared at the ICIS-ELGI North American Industrial Lubricants Congress in Chicago in September.

Forgeron presented a case study of predictive analytics, explaining the company’s use of artificial intelligence, which takes so-called big data to the next step by allowing computers to learn to better analyze data as more information is fed into the program.

The challenge for businesses that aren’t technology giants, he said, is learning how to best take advantage of what’s been termed the Digital Revolution. “What we found out through this project is you don’t have to be a Google or Microsoft to take advantage of what’s going on,” he told attendees.

Bureau Veritas performs oil analysis for lubricant end users, typically operators of industrial plants. Customers take samples of lubes such as greases and metalworking fluids and send them to Bureau Veritas laboratories where they are tested. Analysts then review the data and classify the samples as normal and requiring no action; abnormal, which requires corrective action; or critical, which means equipment must be shut down for immediate maintenance or repair.
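
That three-tier scheme maps naturally onto a small piece of code. As a minimal sketch (the single-test rule and the limits below are invented for illustration, not Bureau Veritas’ actual criteria), the grading logic might look like this in Python:

```python
from enum import Enum

class SampleStatus(Enum):
    NORMAL = "normal"      # no action required
    ABNORMAL = "abnormal"  # corrective action required
    CRITICAL = "critical"  # shut down equipment for immediate maintenance or repair

def classify(iron_ppm: float,
             abnormal_limit: float = 50.0,
             critical_limit: float = 100.0) -> SampleStatus:
    """Toy single-test rule: grade one wear-metal reading against two invented limits.
    Real laboratories weigh dozens of test results per sample."""
    if iron_ppm >= critical_limit:
        return SampleStatus.CRITICAL
    if iron_ppm >= abnormal_limit:
        return SampleStatus.ABNORMAL
    return SampleStatus.NORMAL

print(classify(iron_ppm=120.0))  # SampleStatus.CRITICAL
```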

Growing Fast

Forgeron recounted how Bureau Veritas struggled to expand the analysis business of Analysts Inc. after acquiring it in 2014. Before the transaction, Analysts had four laboratories that examined a million samples per year. Afterward the operation expanded to 17 labs, including several overseas, but continued to staff just 15 analysts, and sample volume grew by only 20 percent.

Investigation showed that if analysts reviewed all results, they could spend an average of only 3.5 minutes on each. The system also created opportunities for human error and produced inconsistent recommendations, since each comment was written by hand.

“In working with our customers, what we realized is they really don’t care about the normal samples,” Forgeron said. But analysts were spending the same amount of time on normal samples as on the abnormal and critical samples that needed more attention.

Bureau Veritas saw an opportunity for digitization to assist in the oil sample analysis process. It established a system that used computers to identify and process normal samples, freeing analysts to deal with abnormal and critical results and to interact with customers. Average time that analysts spent on those samples increased to 15 minutes, and comments to customers became more uniform.

Forgeron explained that this accomplished the company’s goals, which were to improve results as well as efficiency – not to replace humans with machines. “I think when a lot of people hear about artificial intelligence or automation, they think of people losing jobs,” Forgeron said. “That’s not really the case in this.”

Machine Learning

Artificial intelligence and machine learning can sound intimidating to those who don’t work in those fields, but their functions are fairly accessible. “Machine learning is getting computers to learn and act like humans and improve that learning over time,” Forgeron explained. “What we want the computer to do is start to learn. But we’ve got to teach it first.

“Machine learning requires teaching the computer, developing models for learning, then continually feeding in data. This learning loop includes a confidence factor. [If] it’s 90 percent confident that this is normal, we can release it. If it’s not 90 percent, it comes back to our data analysts. They review it, and it feeds into the machine.”
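
As a rough sketch of that confidence-gated loop (the labels, names and threshold handling below are illustrative assumptions, not a description of Bureau Veritas’ actual system), the routing rule might look like this:

```python
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.90  # the 90 percent figure Forgeron cited

@dataclass
class Prediction:
    label: str         # "normal", "abnormal" or "critical"
    confidence: float  # the model's confidence in that label

def route(p: Prediction) -> str:
    """Auto-release only confident 'normal' calls; everything else goes to an analyst,
    whose decision is later fed back in as training data, closing the loop."""
    if p.label == "normal" and p.confidence >= CONFIDENCE_THRESHOLD:
        return "auto-release"
    return "analyst-review"

print(route(Prediction("normal", 0.97)))    # auto-release
print(route(Prediction("normal", 0.82)))    # analyst-review: below threshold
print(route(Prediction("critical", 0.99)))  # analyst-review: never auto-released
```

The key design point is that only confident “normal” calls are released automatically; anything ambiguous or alarming goes to a human, and those human decisions become the next round of training data.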

Bureau Veritas’ data set for analyzing oil samples is quite large. The company may process up to 1.2 million samples in a year, with elemental analysis performed on each.

“We’re looking at 27 or 28 individual test results per sample,” Forgeron said, adding up to more than 28 million data points. “Start to add in OEM makes and models and fluid types, brands, manufacturers and grades – you can see these data points increase drastically.”
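
The back-of-the-envelope arithmetic behind that figure is easy to check:

```python
samples_per_year = 1_200_000  # "up to 1.2 million samples in a year"
tests_per_sample = 27         # low end of "27 or 28 individual test results"

print(f"{samples_per_year * tests_per_sample:,}")  # 32,400,000 -- indeed more than 28 million
```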

The next step was data preparation, recognizing that – without training and learning – artificial intelligence is not so intelligent when it comes to, for example, recognizing that subtly different spellings and iterations of a company’s name mean the same thing.

“You have to go through and scrub that data,” he said. “That’s where you start to build some of these models. If you teach the machine that those are the same or very similar, the machine can start to learn or see that in the future.”
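
One simple way to approximate the kind of scrubbing he described is plain string similarity; the canonical list and threshold below are hypothetical, and production entity resolution is usually more sophisticated:

```python
from difflib import SequenceMatcher

# Hypothetical master list; real scrubbing also covers OEM makes, models, fluid brands, etc.
CANONICAL = ["Acme Manufacturing", "Bureau Veritas"]

def normalize(raw: str, threshold: float = 0.85) -> str:
    """Map a raw, possibly misspelled name to its closest canonical form,
    keeping the raw string when nothing is similar enough."""
    best, score = raw, 0.0
    for name in CANONICAL:
        r = SequenceMatcher(None, raw.lower(), name.lower()).ratio()
        if r > score:
            best, score = name, r
    return best if score >= threshold else raw

print(normalize("Acme Manufactring"))  # -> Acme Manufacturing
print(normalize("burea veritas"))      # -> Bureau Veritas
print(normalize("Globex Corp."))       # -> unchanged, in effect flagged for human review
```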

Once test result meanings were established, the company had to teach the machine how to return useful recommendations. All the while, the artificial intelligence continued to learn and evolve.
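
Forgeron did not detail how the recommendations themselves are produced, but one plausible sketch, consistent with the more uniform customer comments mentioned earlier, is a lookup table of standardized templates; everything below is invented for illustration:

```python
# Invented templates keyed by (classification, flagged test). Standardized text like this
# is one way comments to customers become uniform; unknown cases fall back to a human.
TEMPLATES = {
    ("abnormal", "iron"):      "Iron elevated: inspect for gear or bearing wear and resample at next interval.",
    ("critical", "iron"):      "Iron critically high: shut down equipment and inspect immediately.",
    ("abnormal", "viscosity"): "Viscosity out of grade: check for fluid mixing or oxidation.",
}

def recommend(status: str, flagged_test: str) -> str:
    """Return the standardized comment for a flagged sample, or escalate to an analyst."""
    return TEMPLATES.get((status, flagged_test),
                         "No standard comment: route to an analyst for custom wording.")

print(recommend("critical", "iron"))
print(recommend("abnormal", "water"))  # no template -> routed to an analyst
```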

Even though Forgeron and his colleagues work in laboratories for customers, it’s now become all about the data. “I don’t talk about test tubes and needles to customers anymore,” he said. “I’m talking about the data – what they can do with the data and what we’re doing with the data to improve their business.”