I am going to derail this and talk about how I am actively using AI in my line of work, as I think that is a better use of my time for this blog post. Plus, I think it's cool (although my definition of "cool" may be different from others', lol). I work for an environmental technology company that specializes in using beams of UV light to measure gases along long paths. We do both community work and industry work; our industry work includes monitoring emissions along oil refinery fencelines. I work as a data analyst for this company, and we use data in many different ways.
Since we develop and maintain this technology, QA is a big part of data analytics in our business. Our systems operate in real time, which means we get new data every 5 minutes from every system at every project we manage. This is a lot of data. As a company, it is important to us that we produce high quality data: the numbers accurately reflect reality, and the systems are turned on and operational as much as possible. Since starting at this company, I have learned this is no easy task, and this is where AI integration comes in.
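Before getting to the AI piece, here is a rough picture of the simplest kind of quality check involved. This is a toy sketch of my own, not our production code, and it assumes readings land in a pandas DataFrame with a `timestamp` column; it just asks how many of the expected 5-minute intervals actually contain data.

```python
import pandas as pd

# Toy example: how complete is one system's 5-minute record over a time window?
def completeness(readings: pd.DataFrame, start: str, end: str) -> float:
    """Fraction of expected 5-minute intervals that contain at least one reading."""
    expected = pd.date_range(start, end, freq="5min")
    observed = readings["timestamp"].dt.floor("5min").nunique()
    return observed / len(expected)

# Made-up data: 250 consecutive readings out of the 288 expected in a day.
df = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01 00:00", periods=250, freq="5min"),
    "concentration_ppb": 1.0,
})
print(f"{completeness(df, '2024-01-01 00:00', '2024-01-01 23:55'):.1%}")
```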
A lot of the time, the data itself shows indications that a system isn't operating optimally. However, these indications can be very subtle, and it has become very clear to us that a human cannot sit there and go through all the data we produce on their own. To combat this, we are working on machine learning models that track system performance as data comes in and alert us to issues that may need to be addressed.
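We are still building these models out, so nothing below is our actual approach; it is a simple rolling z-score check rather than a trained model, and the window size and threshold are placeholders. But the flavor of the idea is the same: watch each incoming series against its own recent behavior and raise a flag when it drifts too far.

```python
import pandas as pd

# Illustrative only: flag readings that sit far outside the instrument's
# recent rolling behavior (window of 288 points = one day of 5-minute data).
def flag_anomalies(series: pd.Series, window: int = 288, z_thresh: float = 4.0) -> pd.Series:
    rolling = series.rolling(window, min_periods=window // 4)
    z_scores = (series - rolling.mean()) / rolling.std()
    return z_scores.abs() > z_thresh

# In a real pipeline this would run on each new batch and push an alert
# (email, dashboard flag, etc.) when flags start to accumulate.
```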
One way to tell a system is operating properly is to compare it to other systems nearby and make sure they match. To do this well, we often measure additional gases that are always in the air and are not the target pollutants we're measuring. If something goes funky with the measurement of this extra gas, we know the system isn't working. We are investigating ways to use AI and machine learning to correlate our systems with publicly available data (such as data from local government agency sites) and make sure the measurements are similar. If they do not match, we have an indication our instruments are not working properly.
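As a sketch of what that comparison might look like in code (my own illustration; the column names, hourly averaging, and 0.7 correlation threshold are all assumptions), suppose both our data and the public monitor's data are DataFrames with a datetime `timestamp` column and a `ref_gas` concentration column:

```python
import pandas as pd

# Illustration: do our reference-gas measurements track a nearby public monitor?
def check_agreement(ours: pd.DataFrame, public: pd.DataFrame, min_corr: float = 0.7) -> bool:
    """Align both series to hourly means and check their Pearson correlation."""
    merged = pd.merge(
        ours.set_index("timestamp").resample("1h").mean(),
        public.set_index("timestamp").resample("1h").mean(),
        left_index=True, right_index=True,
        suffixes=("_ours", "_public"),
    ).dropna()
    corr = merged["ref_gas_ours"].corr(merged["ref_gas_public"])
    # A low correlation doesn't prove a fault, but it is a cue to investigate.
    return corr >= min_corr
```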
Another area of machine learning we are investigating is natural language models. All of the work we do is based on Quality Assurance Project Plans (QAPPs). If the QAPP is strong and implemented well, the resulting data will follow suit. Regulatory agencies are required to review these plans to evaluate the success of a project. We are investigating ways an AI could review a project plan and give feedback on what it does well and where it falls short. This would give government agencies a faster way to approve or reject QAPPs, and let contractors create QAPPs more easily, with less time spent going back and forth on review.
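This is still exploratory, but the mechanics could be as simple as the sketch below, which uses the OpenAI Python client as one possible stand-in; the model name, prompt, and review criteria are placeholders I made up, not anything an agency or our company has settled on.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def review_qapp(qapp_text: str) -> str:
    """Ask a general-purpose LLM for first-pass feedback on a QAPP draft."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "You review Quality Assurance Project Plans for air monitoring "
                    "projects. Summarize what the plan does well and what is missing "
                    "or unclear, citing specific sections."
                ),
            },
            {"role": "user", "content": qapp_text},
        ],
    )
    return response.choices[0].message.content
```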
Visualization and UI are a big part of these projects as well, and they serve as interesting examples of what businesses are looking to do with these kinds of tools, especially in writing and data analytics.