How smart computers are redefining smart manufacturing

Posted on 16 Jul 2025 by Joe Bush

Manufacturing leaders are no strangers to innovation, but are we fully unlocking the potential of computing? Beyond automation, smarter computing is redefining how we augment human expertise, optimise operations and deliver value to those who matter most: the end customers.

Manufacturing leaders need to rethink computing’s role in shaping a more adaptive, intelligent and customer-driven future for manufacturing. How can we empower people with computers that think ahead, optimise at high speed and support every decision-maker and worker in real time? Rob Reviere, PhD, AI Solutions Architect at Lenovo, speaks to The Manufacturer.

What role has smart computing traditionally played in manufacturing environments?

RR: The last three years have been very different to what went before. Prior to that there was a lot of activity around the development of robotics and the streamlining of individual steps within manufacturing processes. We’ve also seen the advent of IoT and IIoT and, along with that, very specific methodologies that have been coalescing into the emerging space of generative AI. These are, of course, still in play and very important.

What we’re seeing come online now, however, is the combination of statistical approaches like reliability analytics with advancements in deep learning. That means looking at the entire lifespan of products and trying to capture cradle-to-grave information around manufactured goods, with that data being fed back into the manufacturing process.

As a result, we now have all this new collateral in a variety of formats. We’re taking the lessons learned from these past initiatives and combining them with the explosion in technologies around deep learning, particularly computer vision and large language models. We’re in a brave new world.

How is the use of smart computing evolving as tech becomes more sophisticated? Do we need to rethink its role?

It’s something that needs to be tweaked rather than rethought. The challenge is that traditional IT has one way of looking at systems – whether it’s ERP or MES – rather than looking at these sources from an analytics perspective. However, in the modern world of manufacturing the data science community (which is not the same as IT) also has a role to play, and it looks at manufacturing systems from a more analytical perspective. This calls for a very different data structure.

Manufacturers have structured, unstructured and semi-structured data, and everything in between, so businesses need to set up their repositories to handle all these formats. What this means is that there needs to be a tight partnership between traditional IT and the analytics community to structure their analytics data repositories to support this new paradigm. This is not IT’s traditional way of doing things.

For analytics, everything begins and ends with the data, which is the piece that has to be revisited. And companies need to have an analytics architecture that’s devoted to bridging these two very separate disciplines by creating a comprehensive data strategy.

How can smarter computing augment people and maximise business outcomes?

In a business that is structured in a fairly sophisticated way, IT would be speaking with the engineering department which in turn would be talking to the data analytics community.

What would be desirable, and something that I’ve seen evolving recently, is a smart electronic device that allows a technician on the shop floor, working at a particular station within a continuous processing line, to ask questions of a virtual expert.

This enables the collective wisdom of other people within the organisation, and even the expertise of those who have retired, to be collated in a distilled and usable fashion. This not only makes problem solving more efficient, it also enables solutions to be found closer to the occurrence of the problem. So, these new technologies have helped to realise the concept of the smart assistant technician.

These advanced AI agents, if connected to inventory management systems, also aid real-time inventory management, which can in turn empower technicians by instantaneously giving them everything they need. This makes processes more efficient, speeds up inventory turns and reduces scrap; everything you would want in a manufacturing operation.
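As a minimal sketch of the idea, the function below is the kind of tool an AI agent could call to answer a technician's parts question from live stock data. The part numbers, field names and in-memory store are illustrative assumptions, not a real inventory API.

```python
# Hypothetical stand-in for a live inventory system; a real agent
# would query an ERP/MES endpoint instead of an in-memory dict.
INVENTORY = {
    "bearing-6204": {"on_hand": 12, "bin": "A-14"},
    "drive-belt-b33": {"on_hand": 0, "bin": "C-02"},
}

def check_part(part_id: str) -> str:
    """Return a technician-friendly answer about a part's availability."""
    record = INVENTORY.get(part_id)
    if record is None:
        return f"{part_id}: not a stocked part"
    if record["on_hand"] == 0:
        return f"{part_id}: out of stock, reorder required"
    return f"{part_id}: {record['on_hand']} on hand in bin {record['bin']}"
```

Exposed as a tool to a conversational agent, a lookup like this is what turns "where is my bearing?" on the shop floor into an instantaneous, grounded answer.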

How can smart computing predict and act before challenges arise – and what benefits can this bring to manufacturing?

Predictive maintenance has been around for several decades, so it’s not a new concept. However, we’re now seeing model types emerge within manufacturing (although they are not new to statisticians), such as autoencoders and different types of neural network architectures, which are now being applied to types of problems we haven’t seen previously.

Reliability analytics has been in its own bubble for a long period of time, and is its own branch of statistics. When you are able to combine that with deep learning it offers a more prognostic way of looking at and addressing impending issues.
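To make the combination concrete, here is a minimal sketch of blending a population-level Weibull reliability estimate (a staple of reliability analytics) with a unit-specific anomaly score, such as an autoencoder's reconstruction error on recent sensor data. The parameter values and the blending rule are illustrative assumptions, not a prescribed method.

```python
import math

def weibull_survival(t: float, beta: float, eta: float) -> float:
    """Probability a component survives beyond age t under a Weibull model.
    beta is the shape parameter; eta is the characteristic life (same units as t)."""
    return math.exp(-((t / eta) ** beta))

def health_index(age: float, beta: float, eta: float,
                 reconstruction_error: float, error_threshold: float) -> float:
    """Blend fleet-level reliability with a unit-specific anomaly score
    (e.g. autoencoder reconstruction error) into a single 0-1 health index."""
    reliability = weibull_survival(age, beta, eta)
    # Normalise the anomaly score: 1.0 = clean signal, 0.0 = at/over threshold
    anomaly_ok = max(0.0, 1.0 - reconstruction_error / error_threshold)
    return reliability * anomaly_ok

# A young machine with clean sensor data scores higher than an aged
# machine whose sensors already look anomalous.
good = health_index(age=500, beta=2.0, eta=10_000,
                    reconstruction_error=0.01, error_threshold=0.1)
bad = health_index(age=9_000, beta=2.0, eta=10_000,
                   reconstruction_error=0.08, error_threshold=0.1)
```

The point of the combination is exactly what is described above: the statistical model says how a population of components ages, while the deep learning signal says how this particular unit is behaving right now.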

For example, a system can use these analytical insights to put a machine into a limp-home mode or a self-healing mode, or, if it’s an imminently catastrophic scenario, perform a graceful shutdown, as opposed to a drastic and potentially damaging one. These options all sit under the prognostic umbrella of methods to improve manufacturing operations.

Where are the biggest gaps and opportunities for AI in optimising machines in manufacturing?

A glaringly obvious gap, and opportunity, is the shortage of skill sets needed to use these emerging AI capabilities and to implement and integrate them in the most effective way. This is a big problem that has to be addressed.

Currently I’m seeing two separate ways in which businesses are tackling this. The first is by working with companies like NVIDIA and Microsoft that have a deep bench of AI expertise along with the software to enable AI. The second is to engage companies that offer AI services, which can address the skills gap.

Alternatively, there is a smaller subset of specialists who can be called upon for their advice; I would put myself in that particular bracket. And even though I work for a hardware company, it’s in Lenovo’s best interest to partner with companies to address these issues. Then, of course, certain universities are conducting training to grow internal skills.

So, the people aspect of this problem is the biggest opportunity we have right now. The models are developing at such a pace that keeping in front of that technology, or at least somewhat current, is an extreme challenge. And without people with these kinds of skills, you could find yourself outdated before you even get started.

Some manufacturers struggle to capture meaningful ROI from AI-based projects. What are your top tips for dealing with this?

Fundamentally, successful ROI begins with the use case and proof of concept (POC) that you start with. Make it small, targeted and model it in such a way that the ROI is easy to understand and communicate to all levels of the organisation.

It’s very easy in situations like this to try to boil the ocean with AI-based projects. So, it’s important to ask what is causing you pain and whether you have the data to characterise it. If you don’t, it’s advisable to stop and regroup. Having toll gates incorporated into the POC process, representing go/no-go points for the next stage, is therefore key to a successful AI project.

The first toll gate, after the parameters of the POC have been defined, relates to the data. If there is enough data to characterise the problem, and the data has sufficient fidelity to utilise, then you can pass through to the next toll gate. At this point you’re getting really close to truly defining the POC. I am a firm believer in the Six Sigma methodology, and it should be strongly applied to POCs, at least for the define and measure steps. Before you even start, document your POC, get your cross-functional team in place and define what the measure of success will be.

Other parts of the POC will start falling into place based upon these early decisions. These toll gates enable you to assess whether you have made successful progress and will stop you from going too far.

I would also advise people to get expert help where they can, even if it’s in an advisory capacity. Having a fresh set of eyes to look at something differently keeps you honest and grounded in the truth. And ultimately it’s the truth that is important, not opinion.

What were some of the key takeaways from the recent roundtable?

The surprising finding from the recent roundtable session we had with manufacturers on this topic was that, after outlining an ideal project, there was an element of pushback. Many highlighted the fact that there is still inertia to be overcome within their organisations.

Even though there are good ideas in place, manufacturers are still fighting against this inertia. So, there is a big challenge that needs to be addressed initially to get buy-in from key stakeholders and executive sponsorship before starting an AI POC. We all know this exists but it appears to be a bigger pain point than I anticipated.

For more articles like this, visit our Industrial Data & AI channel