Hadrien Diesbecq, 16 March 2022

The 3 essentials to operationalize your AI projects

Operationalizing artificial intelligence projects requires optimal data preparation

According to the consulting firm Gartner, almost half of all artificial intelligence projects within companies never get beyond the POC (proof of concept) stage. Yet artificial intelligence has rarely been more in the news. The term refers to programs capable of simulating human intelligence, and its rise accelerated in the early 2000s with the emergence of big data. By exploiting very large volumes of data, algorithms can perform tasks of impressive complexity in very little time. Witness the dazzling victory of AlphaGo, the program developed by Google's DeepMind, which in 2017 decisively defeated the South Korean Ke Jie, then the world's best Go player.

Since then, machine learning techniques have continued to improve and many fields of application have opened up: whether to automatically translate speeches in real time, to drive autonomous cars or to help doctors analyze their X-rays.

But what about companies that want to harness the power of these models to improve their business performance? There is no shortage of use cases: selecting the most relevant marketing content, sales forecasting, logistics optimization, dynamic cost and price management, and so on. Yet companies struggle to operationalize such projects, which take an average of 9 months to put into production.

Here is a brief overview of the problems encountered and the solutions that exist to solve them.

1st essential: make project integration fast and easy

The implementation of an artificial intelligence project, for example to automate the analysis of legal contracts, often starts with a POC. A dedicated team develops a program capable of performing the requested task on a small scale: in this case, on a few contracts of lesser importance. While this first step demonstrates the feasibility of the project, it is not enough: the program must also be deployable operationally. Successfully scaling up an AI project depends on:

– Its ability to integrate easily into the company’s data ecosystem. Several questions arise: How does the program access the data (contracts in our example), given that it may live in various architectures such as data lakes or data hubs (read our article on the subject here)? How does it deliver its results? Are they easily exploitable by other visualization or analysis platforms?

– Its ability to be easily handled by non-expert users. For example, an artificial intelligence tool for contract analysis is primarily intended for lawyers. Its ergonomics and speed of execution are therefore fundamental elements for it to be fully adopted.

– Its ability to meet security and data protection requirements. Controlling access and complying with current regulations, such as the GDPR, are essential. For legal contracts, the data processed is highly sensitive, and the algorithm that handles it must be easily auditable and secure to prevent any leak or malicious use.

These three dimensions (integration, ease of use, and security) must therefore be taken into account from the project design stage. To satisfy all these requirements, there is a methodology that links the development, deployment, and maintenance of a project while promoting permanent dialogue between all the teams working on it. It is called ModelOps, and it is inspired by DevOps, which is now well established in software deployment.

2nd essential: precisely quantify the business impact of the project

Like any industrial project, the successful implementation of an artificial intelligence project depends on its ability to generate revenue for the company. Given the complexity of development, which mobilizes several experts over a long period, profitability is far from guaranteed. According to Gartner, only 1 company in 10 manages to operationalize more than 75% of its AI projects. This is mainly due to poor cost estimates, development delays, or doubts about the project's strategic relevance, which lead to the abandonment of technically promising POCs.

To avoid this type of pitfall, it is important for companies to:

– Systematically apply a financial analysis to the project, quantifying the risk factors and the expected return on investment against development and deployment costs.

– Prioritize projects that are at the heart of the company’s business activity (in areas such as marketing, sales, or supply chain) and that can significantly influence key performance indicators (growth, number of customers, etc.).

– Build a multidisciplinary team in charge of the project. Teams composed only of data scientists tend to fail to operationalize their POCs. It is therefore essential to involve specialists in the company’s field of activity right from the design phase, as well as experts with complementary skills (data engineers, developers, etc.).
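The financial analysis mentioned above can be as simple as a risk-weighted return calculation. The sketch below is purely illustrative: the function, its weighting scheme, and all figures are hypothetical assumptions, not a method prescribed by Gartner or by this article.

```python
# Illustrative sketch: a risk-weighted ROI estimate for an AI project.
# All figures and the weighting scheme are hypothetical assumptions.

def expected_roi(annual_gain, success_probability, dev_cost, deploy_cost, years=3):
    """Return the expected net gain relative to total cost."""
    total_cost = dev_cost + deploy_cost
    # Weight the projected gain by the probability the project
    # actually reaches production.
    expected_gain = annual_gain * years * success_probability
    return (expected_gain - total_cost) / total_cost

# Example: 300k to develop, 100k to deploy, 250k/year in expected
# gains if the project reaches production (estimated 60% chance).
roi = expected_roi(annual_gain=250_000, success_probability=0.6,
                   dev_cost=300_000, deploy_cost=100_000)
print(f"Expected ROI over 3 years: {roi:.1%}")
```

Even a rough model like this makes the trade-off explicit: a technically promising POC with a low probability of reaching production can still have a negative expected return.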


3rd essential: do not neglect quality training data

The transition from a POC to an operational deployment is only possible if the accuracy and reliability of the results are not degraded. Indeed, the performance of an artificial intelligence algorithm is highly correlated with the quality of the data that feeds it. While a POC can work very well on a small set of quickly prepared data, its performance is not guaranteed when it comes to processing large volumes, in which the number of errors is often much higher. Because the data is entered by human operators, it is often very heterogeneous and therefore difficult to use, which slows the move of AI projects into production.

It is estimated that data scientists spend up to 80% of their time preparing their data. Moreover, this task can be complex if the company’s sector requires domain expertise to understand the data (in the medical field, for example). Misinterpreting the data can then lead to errors that affect the results of the algorithm.

Textual data is among the most problematic because it is written by human beings, who have cognitive adaptation capacities that machines lack. A typical example is the use of abbreviations, which make text fields difficult for an algorithm to interpret: “frt” for “fruit” or “yt” for “yogurt”, for example.
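To make the problem concrete, here is a minimal sketch of one common workaround: expanding known abbreviations with a hand-built mapping. The mapping and the example values are illustrative assumptions, not an exhaustive solution.

```python
# Minimal sketch: exact lookup fails on human abbreviations, but a
# hand-built mapping can expand the ones we know about.
# The mapping below is an illustrative assumption.

ABBREVIATIONS = {"frt": "fruit", "yt": "yogurt"}

def expand(token: str) -> str:
    """Expand a known abbreviation; return unknown tokens unchanged."""
    return ABBREVIATIONS.get(token.lower().strip(), token)

print(expand("frt"))   # -> fruit
print(expand("yt"))    # -> yogurt
print(expand("milk"))  # -> milk (unknown tokens pass through)
```

The weakness is obvious: every new abbreviation a human operator invents must be added to the dictionary by hand, which does not scale to large, heterogeneous datasets.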

The major challenge for companies is therefore to industrialize data quality in order to have efficient algorithms with reliable results (as we explain in more detail here) regardless of the volume of data they have to process.

Most of the time, companies try to solve this problem by imposing constraints on humans, for example with pre-filled fields that prevent the use of abbreviations. This solution is not ideal: it is very difficult to implement in practice and does not prevent mistakes. Another solution is a tool that automatically corrects errors in the data. This is what we propose at YZR.
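To illustrate the general idea of automatic correction (not YZR's actual algorithm), here is a sketch using fuzzy matching against a reference vocabulary with Python's standard-library difflib. The vocabulary and cutoff value are assumptions for the example.

```python
# Sketch of automatic correction via fuzzy matching against a reference
# vocabulary. Illustrative only; not YZR's actual algorithm.

from difflib import get_close_matches

VOCABULARY = ["fruit", "yogurt", "vegetable", "cheese"]

def autocorrect(value: str, cutoff: float = 0.5) -> str:
    """Replace a noisy value with its closest vocabulary entry, if any."""
    matches = get_close_matches(value.lower(), VOCABULARY, n=1, cutoff=cutoff)
    return matches[0] if matches else value

print(autocorrect("frut"))     # typo -> fruit
print(autocorrect("yoghurt"))  # variant spelling -> yogurt
```

Unlike pre-filled fields, this approach repairs errors after the fact, so it tolerates the variability of human input, though it still needs a clean reference vocabulary to match against.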

Exploiting the full potential of artificial intelligence and machine learning is thus a major challenge for companies. The operationalization phase is unavoidable, but it can be very costly in time and resources. The quality of the training data is therefore a crucial point on which companies must focus.

Prepare your data efficiently with YZR to operationalize your AI projects

YZR is a no-code AI platform 100% dedicated to textual data normalization, delivered through an API. Our tool has a very easy-to-use interface for any user wishing to improve the quality of their data.

Our SaaS solution integrates seamlessly with your various tools (Product Information Management, Master Data Management, ERP, etc.), allowing you to achieve, among other things:

– A better knowledge of the customer.

– Optimized sales forecasts.

– Accelerated digitization of your offer.

In other words, with YZR, you exploit the full potential of your data.

Want to know more? Would you like a demonstration of our product? Don’t hesitate to contact us directly on our website or at hadrien@yzr.ai

Sources

– Gartner, Erick Brethenoux, Frances Karamouzis, 5 Steps to Practically Implement AI Techniques, 25 April 2019, updated 6 August 2020
– Gartner, Melissa Davis, Accelerating AI Deployments — Paths of Least Resistance, 13 July 2020
– Gartner, Whit Andrews, AI-Successful Organizations Have These 4 Habits in Common, 19 January 2021
