Nanotechnology meets Big Data

Direction: digital layering

Media service, November 2017

Nanoparticles can turn ordinary materials into high-performance materials, which is why nanotechnology has found its way into a wide range of products on the market. However, applying new particles to materials remains a challenge, since it is never entirely clear in advance how they will behave. Fraunhofer IPA scientists are collecting process data and linking up production systems via the cloud to reduce development times and ensure quality across the entire process chain.

© Fraunhofer IPA, Rainer Bez
Roll-to-roll layering machine at the model factory.

It has been heralded as one of the key technologies of the 21st century, and it has taken over the market almost silently: nanotechnology can be found in energy stores such as batteries, in car accessories, clothes, cosmetics, drugs and even food. High-performance, nano-modified materials can make plastics more durable, metals lighter and energy storage more efficient.

To achieve this, conventional materials are modified using nanofillers: tiny particles measuring between one and several hundred nanometers. (For reference, a nanometer is a billionth of a meter; at this scale, even a human hair is huge, at around 80,000 nanometers!) Nanoparticles such as graphene, carbon nanohorns, carbon nanotubes or nano-silver fibers are created in strictly regulated batch processes, i.e. discontinuously, one batch at a time. The particles are usually synthesized in reactors and then often functionalized. This is followed by further processing as a powder or as a dispersion in an ink or paste. The final product is then created via conventional manufacturing processes such as printing or layering.
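
To make this process chain easier to follow, here is a minimal Python sketch that models the stages named above as an ordered pipeline, each carrying a record of its process parameters. All stage names, parameters and values are illustrative assumptions, not Fraunhofer IPA's actual data model.

```python
from dataclasses import dataclass, field


@dataclass
class ProcessStage:
    """One step in the nanomaterial process chain, e.g. synthesis or layering."""
    name: str
    parameters: dict = field(default_factory=dict)  # recorded process settings


# Illustrative chain following the steps named in the text:
# synthesis -> functionalization -> powder/dispersion -> printing/layering.
chain = [
    ProcessStage("synthesis", {"reactor_temperature_C": 850}),       # invented
    ProcessStage("functionalization", {"agent": "carboxyl"}),        # invented
    ProcessStage("dispersion", {"carrier": "ink", "solids_pct": 2.5}),
    ProcessStage("layering", {"method": "roll-to-roll", "web_speed_m_min": 5.0}),
]

for stage in chain:
    print(f"{stage.name}: {stage.parameters}")
```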

Batch processes lead to quality issues

It is only in the very last step that the nanomaterial is actually processed continuously, inline. In the batch processes before that, fluctuations in quality are a big problem. Ivica Kolaric, Head of the Functional Materials Department at Fraunhofer IPA, explains: »Every batch – the raw material as well as the dispersion – has different properties due to different storage times or conditions and the effects of transportation or the environment. This makes it difficult to keep the level of quality constant. Every batch must be started from scratch, since storage technology, working times, changeover times and so on all affect the distribution of the nanomaterial in the layers, which in turn determines the properties of the layers and thus the production time and costs. It's a major challenge for the industry.«
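
To illustrate the kind of fluctuation Kolaric describes, the following minimal Python sketch quantifies batch-to-batch scatter in a layer property. Sheet resistance is picked here purely as a plausible quality metric, and all measurement values are invented.

```python
import statistics

# Hypothetical sheet-resistance measurements (ohms per square) of the same
# layer produced from four different dispersion batches; values are invented.
batches = {
    "batch_01": [112.0, 115.3, 110.8],
    "batch_02": [131.5, 128.9, 130.2],  # e.g. longer storage before coating
    "batch_03": [118.4, 121.0, 119.7],
    "batch_04": [140.2, 137.8, 141.5],  # e.g. different transport conditions
}

means = {name: statistics.mean(vals) for name, vals in batches.items()}
overall = statistics.mean(means.values())
spread = statistics.pstdev(means.values())

print(f"batch means: {means}")
# Coefficient of variation across batches: a simple measure of how far
# batch-to-batch fluctuation drifts from one repeatable quality level.
print(f"batch-to-batch CV: {spread / overall:.1%}")
```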

Theoretical simulation models no longer help

The manufacturing process used to create layers optimized with nanomaterials can change the microstructure and, by extension, the material's properties. A realistic simulation of the manufacturing process therefore has to determine the resulting properties at the material level and translate them into properties at the component level. A whole range of commercial tools exists for this, including tools for nonlinear material and structural modeling, which can capture the impact of the manufacturing process on composite materials to a certain degree and analyze the findings. However, these tools are mostly used predictively and currently still require experimental validation, which can be costly and time-consuming. Linking established structural simulation tools with simulation-based modeling of the manufacturing process could help here: both batch processes and roll-to-roll processes could then be adapted automatically to customer requirements or changed environmental conditions in a closed loop. Kolaric adds: »If you want to bring a new product onto the market, component manufacturers and end users generally require material data that has been validated either experimentally or via simulation. There are lots of simulation models that generate this data – Monte Carlo or multiscale models that describe the particles, for example, or simulations that describe fluid kinematics and adhesion – but they are difficult to scale up and do not really reflect reality.«
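
The closed-loop procedure mentioned above can be sketched in the abstract: measure a layer property inline, compare it with the target, and feed the deviation back into a process parameter. The following proportional-control sketch in Python is only an illustration of that loop; the controlled parameter, the toy measurement model and the gain are all assumptions, not the institute's actual control scheme.

```python
def measure_layer_thickness(coating_speed: float) -> float:
    """Stand-in for an inline measurement: a toy model in which slower
    coating yields thicker layers (purely illustrative physics)."""
    return 500.0 / coating_speed  # thickness in nm


target_nm = 100.0  # hypothetical target layer thickness
speed = 4.0        # hypothetical web speed, m/min
gain = 0.02        # proportional gain, chosen arbitrarily

# Closed loop: adapt the process parameter until the measured layer
# property matches the target, as described for the roll-to-roll case.
for step in range(20):
    thickness = measure_layer_thickness(speed)
    error = thickness - target_nm
    if abs(error) < 0.5:
        break
    speed += gain * error  # thicker than target -> speed up, and vice versa

print(f"step {step}: speed = {speed:.2f} m/min, thickness = {thickness:.1f} nm")
```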

Dispersion technology and layering processes are linked

This is why the Fraunhofer IPA experts are pursuing a different course. They have stayed true to the basic concepts of simulation, but the Stuttgart-based team records empirical data and bases its extrapolations and predictions on these measured process parameters. Kolaric calls this the »Big Data approach«: »Working with industry, we have now linked up the dispersion process and the layering process. Paste manufacturing can communicate with layering via the cloud. We have congruent data management from dispersion through to layering.« During this digital back-and-forth between dispersion and layering, data is recorded and compared. The scientists have thus established a basis for recording and evaluating all data, from the particles right through to the layers. Kolaric is full of pride when he talks about the model factory, and with good reason.
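
As a rough illustration of what congruent data management from dispersion through to layering could look like in code, the following Python sketch has the dispersion station publish a batch record to a shared store and the layering station read it back and log its own coating data under the same batch ID. The shared-store interface, topic names and all values are assumptions for illustration; the article does not specify the actual cloud interface.

```python
import json
from datetime import datetime, timezone

cloud_store: dict[str, str] = {}  # stand-in for the shared cloud backend


def publish(topic: str, record: dict) -> None:
    """Write a JSON record to the shared store, as the dispersion or
    layering station would push data to the cloud."""
    cloud_store[topic] = json.dumps(record)


def fetch(topic: str) -> dict:
    """Read a JSON record back from the shared store."""
    return json.loads(cloud_store[topic])


batch_id = "disp-2017-11-042"  # hypothetical batch identifier

# Dispersion side: record the paste-manufacturing parameters.
publish(f"dispersion/{batch_id}", {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "viscosity_mPas": 320.0,   # invented value
    "particle_d50_nm": 45.0,   # invented value
})

# Layering side: read the dispersion record, coat, and log results
# under the same batch ID so both data sets stay congruent.
dispersion = fetch(f"dispersion/{batch_id}")
publish(f"layering/{batch_id}", {
    "source_batch": batch_id,
    "dispersion_snapshot": dispersion,
    "web_speed_m_min": 5.0,       # invented value
    "layer_thickness_nm": 98.7,   # invented measurement
})

print(json.dumps(fetch(f"layering/{batch_id}"), indent=2))
```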

New business models

In addition to the development of the roll-to-roll layering machine at the institute, Kolaric sees the digitization of layering technology as a focus for developing new business models together with industry: the collaborative use of machines, joint product development from anywhere in the world, and pay-per-use concepts.