What happens to plastics processing when large language models (LLMs) and neural networks collide? Robin Kent, Managing Director at Tangram Technology and Chartered Energy Manager, answers the question, and poses some more.
Robin Kent
Recently, I was fortunate to attend a presentation at the Irish Polymer Group 2025 Conference by Jonas Schwarz of KraussMaffei on “Advancements in injection moulding through high-resolution data.” Jonas described how huge amounts of data can be collected during every moulding cycle and used to help processors change from reacting to events to controlling the process.
At the same time, nobody could have possibly missed the meteoric rise of AI tools in the past two years. Most well-known AI tools are large language models (LLMs) trained on huge amounts of text data. A key component of any such AI is the ‘neural network’, which receives the input data and transforms it through successive layers to produce the final output.
But what happens to plastics processing when these two trends collide?
The mass of data from machines (think petabytes) will need advanced analytics and algorithms, operating at high speed, to extract meaningful, actionable information. The amount of data that must be absorbed and the time available may mean humans simply cannot do it. The people we have in our factories (the ‘wetware’, as opposed to the ‘software’ and ‘hardware’) will not be fast or insightful enough to keep up with the data flow.
The result will be a rise in advanced analytics and algorithms to automatically analyse the data and react to it without human intervention or interaction.
This can be as simple as robots correcting their placement positioning to compensate for errors caused by their own vibrations and any external ones. It can also include more complex systems, where injection moulding machines monitor the mould fill pressure curve and adjust their parameters every cycle to correct for changes in material properties or machine response. The aim is to achieve 100% reproducibility of process performance.
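As a minimal sketch of that per-cycle correction idea, consider a simple proportional adjustment: compare each cycle’s peak fill pressure against a reference and nudge the injection velocity to compensate. All names, units and the gain below are invented for illustration; real machine controllers are far more sophisticated.

```python
def correct_velocity(velocity, peak_pressure, target_pressure, gain=0.05):
    """Return an adjusted injection velocity (mm/s) for the next cycle.

    A simple proportional controller: if peak fill pressure drifts below
    target (e.g. melt viscosity has risen), speed up slightly; if it
    drifts above target, slow down.
    """
    error = target_pressure - peak_pressure  # bar
    return velocity * (1 + gain * error / target_pressure)

# Pressure came in 20 bar under a 400 bar target, so the next cycle
# runs fractionally faster.
v_next = correct_velocity(velocity=80.0, peak_pressure=380.0,
                          target_pressure=400.0)  # → 80.2 mm/s
```

Run every cycle, a loop like this silently absorbs batch-to-batch material variation without any operator ever seeing it.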
The machines’ analysis and self-adjustment will change how factories work. The current state of the art is primarily concerned with the service or machine level, but this is only the start of the process. The truly smart factory will take this to a new level, where the whole site is integrated and self-adjusting, from material ordering and input through to logistics and delivery. We may end up with factories where we don’t understand – or care – what happens. All we’ll know is that the output is what we want it to be.
The end of theory?
The great statistician George Box stated, “All models are wrong, but some are useful”. In 2008, Peter Norvig, Google’s research director, rephrased this as: “All models are wrong, and increasingly you can succeed without them.”
In this data-rich world, there’s no thesis or theory: the data tells the story, and there’s no need for any model at all. The sheer mass of data allows correlations to be found without any search for causality. We simply follow the numbers for as long as the process works.
We’re used to working with either inductive or deductive reasoning, but analytics changes everything. There’s no thesis or theory; analytics simply follows the numbers, and no model of the system’s response is ever generated. The smart, self-adjusting factory and the analytics approach could easily lead to machines adjusting themselves while neither they nor we have any idea of why they’re doing it.
As big data and analytics rise in importance, the method becomes increasingly ‘theory-free’ and the only thing linking data and action is correlation. Theory is relegated to the sidelines.
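As a toy illustration of that ‘theory-free’ stance, the sketch below (with invented data and signal names) ranks process signals purely by their correlation with a scrap rate and flags the strongest correlate. There is no physical model and no explanation of why the winning signal matters: only the numbers.

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Invented per-batch process signals and the resulting scrap rate (%).
cycles = {
    "melt_temp":     [228, 231, 229, 233, 235, 230],
    "hold_pressure": [600, 598, 605, 590, 585, 601],
    "cycle_time":    [31.2, 31.1, 31.4, 31.0, 30.9, 31.3],
}
scrap_rate = [0.8, 1.1, 0.9, 1.6, 1.9, 1.0]

# Act on whichever signal correlates most strongly with scrap,
# with no causal theory involved.
strongest = max(cycles, key=lambda k: abs(pearson(cycles[k], scrap_rate)))
```

A real system would do this continuously over millions of cycles and adjust the machine itself, but the logic is the same: correlation in, action out, theory nowhere.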
What could this mean?
The differences in our machines are only the tip of what this change means. Other potential consequences to our sites include:
- No scientific moulding. Scientific moulding is the theory that currently drives good setting (although it’s not used often enough). Who needs theory when we can simply follow the numbers?
- No setting sheets or instructions. Who needs setting sheets when you can turn the machine on and let it do its own thing?
- No setters. Who needs setters when the machine and the cloud are making the decisions and adjusting the machine?
- No schedulers or production managers. Who needs schedulers and production managers when the system knows exactly what’s happening and can dynamically reschedule the machines?
- Total remote control of the site. Increasing control through Industry 4.0 cloud connections and one master setter/controller who controls 25, 60 or 1,000 machines around the world. In reality, this will be a transient ‘data scientist’ who has never seen an injection moulding machine before and will be working in pharmaceuticals next week.
- More investment in cloud servers, analytics and data scientists. This may exceed the investment in actual injection moulding machines and the services needed to operate them.
We need to forget the past. The outputs of analytics may well be counter-intuitive and not match the current mental models, but they’re based on data. They may be ‘theory-free’ and simply based on past data, but so are the theories that we use every day.
We’re entering a post-science world where ‘what works’ is more important than ‘why it works’.
What do we do?
The collision of Industry 4.0 and AI is going to change our industry beyond recognition.
A moulder from 1980 would recognise most of the machines and processes being used in 2025. A moulder from 2025 will have few points of reference in 2070. The changes won’t be so much in the machines (although fast and large-scale additive manufacturing may change that, too) but in how they’re controlled and the type and number of people who are controlling them.
It’s going to get scary out there. Sooner than you think.