x0 cut prediction#

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
plt.rcParams['figure.dpi'] = 120

Loading the data#

doe = pd.read_csv('../data/doe.csv')
data = pd.read_csv('../data/cut_x0_all.csv')
data.drop(data[data.doe_id == 1000].index, inplace=True)
data.drop(data[data.doe_id == 247].index, inplace=True)
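The two drop() calls above discard the experiments with doe_id 1000 and 247 in their entirety. On a toy frame (synthetic data, not the real CSVs), the same filter can be written in an equivalent, non-inplace form:

```python
import pandas as pd

# Synthetic stand-in for cut_x0_all.csv: a few rows per experiment.
toy = pd.DataFrame({
    'doe_id': [1, 1, 247, 247, 2, 1000],
    'deviationc': [0.1, 0.2, 9.9, 9.8, 0.3, 7.5],
})

# Equivalent to the two drop() calls: keep only rows whose doe_id
# is not one of the excluded experiments.
toy = toy[~toy['doe_id'].isin([1000, 247])].reset_index(drop=True)
print(sorted(toy['doe_id'].unique()))  # [1, 2]
```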

Creating the regressor#

from mesh_predictor import CutPredictor

reg = CutPredictor()
reg.load_data(
    doe = doe,
    data = data,
    index='doe_id',
    process_parameters = [
        'Blechdicke', 
        'Niederhalterkraft', 
        'Ziehspalt', 
        'Einlegeposition', 
        'Ziehtiefe',
        'Rp0',
    ],
    categorical = [
        'Ziehspalt', 
        'Ziehtiefe',
    ],
    position = 'tp',
    output = ['deviationc'],  # 'y' and 'z' could be added here as well
    validation_split=0.1,
    validation_method='leaveoneout',
    position_scaler='minmax'
)
reg.save_config("../models/cut_x0_deviation.pkl")
reg.data_summary()
Data summary
------------------------------------------------------------

Process parameters:
    - Blechdicke : numerical [ 0.99  ...  1.48 ]
    - Niederhalterkraft : numerical [ 10  ...  500 ]
    - Ziehspalt : categorical [1.6, 2.4]
    - Einlegeposition : numerical [ -5  ...  5 ]
    - Ziehtiefe : categorical [30, 50, 70]
    - Rp0 : numerical [ 133.18263199999998  ...  296.5565 ]
Input variables:
    - tp : numerical, [ 0.0 / 5.0 ] 
Output variable(s):
    - deviationc : numerical, [ -3.16506211149574 / 7.228601057768613 ]

Inputs (879000, 10)
Outputs (879000, 1)
Total number of experiments: 879
Total number of samples: 879000
Number of training samples: 792000
Number of test samples: 87000
Number of experiments in the test set: 87
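Note how the split works: with validation_method='leaveoneout' and validation_split=0.1, whole experiments are held out rather than individual samples, so 87 of the 879 experiments (87,000 of the 879,000 rows, at 1,000 positions per experiment) end up in the test set. A minimal sketch of that experiment-level split, assuming this reading of the two arguments:

```python
import numpy as np

rng = np.random.default_rng(42)
doe_ids = np.arange(879)          # one id per experiment
samples_per_exp = 1000            # 879 * 1000 = 879000 rows total

# Hold out 10% of the *experiments*, not 10% of the rows.
n_test = int(0.1 * len(doe_ids))  # 87 experiments
test_ids = rng.choice(doe_ids, size=n_test, replace=False)
train_ids = np.setdiff1d(doe_ids, test_ids)

print(len(train_ids) * samples_per_exp,  # 792000 training samples
      len(test_ids) * samples_per_exp)   # 87000 test samples
```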

Training methods#

Autotuning#

best_config = reg.autotune(
    save_path='../models/best_x0_model',
    trials=100,
    max_epochs=20, 
    layers=[4, 6],
    neurons=[64, 256, 64],
    dropout=[0.0, 0.5, 0.1],
    learning_rate=[1e-5, 1e-3]
)
print(best_config)
reg.training_summary()
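The search-space arguments above can be read as ranges: layers between 4 and 6, neurons between 64 and 256 in steps of 64, dropout between 0.0 and 0.5 in steps of 0.1, and a learning rate sampled between 1e-5 and 1e-3. This interpretation of the [low, high] / [min, max, step] semantics is an assumption, not confirmed by the source; under it, one trial of a random search could be sampled like this:

```python
import random

random.seed(0)

def sample_config():
    # Assumed semantics: layers=[4, 6] -> integer range,
    # neurons=[64, 256, 64] -> grid with step 64,
    # dropout=[0.0, 0.5, 0.1] -> grid with step 0.1,
    # learning_rate=[1e-5, 1e-3] -> log-uniform between the bounds.
    n_layers = random.randint(4, 6)
    neurons = random.choice([64, 128, 192, 256])
    return {
        'layers': [neurons] * n_layers,
        'dropout': random.choice([0.0, 0.1, 0.2, 0.3, 0.4, 0.5]),
        'learning_rate': 10 ** random.uniform(-5, -3),
    }

trials = [sample_config() for _ in range(100)]  # trials=100 above
```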

Alternative: define a custom network and do the optimization yourself#

One can also run the autotuning for a limited number of epochs and then fine-tune the best configuration by training it longer.

config = {
    'batch_size': 2048,
    'max_epochs': 50,
    'layers': [256, 256, 256, 256, 256],
    'dropout': 0.0,
    'learning_rate': 0.01
}

# or best_config from autotune if you already did it once

reg.custom_model(save_path='../models/current_x0_model', config=config, verbose=True)
reg.training_summary()
Metal device set to: Apple M1 Pro

systemMemory: 16.00 GB
maxCacheSize: 5.33 GB

Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense (Dense)               (None, 256)               2816      

 re_lu (ReLU)                (None, 256)               0         

 dense_1 (Dense)             (None, 256)               65792     

 re_lu_1 (ReLU)              (None, 256)               0         

 dense_2 (Dense)             (None, 256)               65792     

 re_lu_2 (ReLU)              (None, 256)               0         

 dense_3 (Dense)             (None, 256)               65792     

 re_lu_3 (ReLU)              (None, 256)               0         

 dense_4 (Dense)             (None, 256)               65792     

 re_lu_4 (ReLU)              (None, 256)               0         

 dense_5 (Dense)             (None, 1)                 257       

=================================================================
Total params: 266,241
Trainable params: 266,241
Non-trainable params: 0
_________________________________________________________________
Epoch 1/50
387/387 [==============================] - 7s 12ms/step - loss: 0.0951 - val_loss: 0.0015
Epoch 2/50
387/387 [==============================] - 5s 12ms/step - loss: 0.0010 - val_loss: 9.7784e-04
Epoch 3/50
387/387 [==============================] - 5s 12ms/step - loss: 7.8229e-04 - val_loss: 8.4430e-04
Epoch 4/50
387/387 [==============================] - 5s 12ms/step - loss: 6.6143e-04 - val_loss: 6.9625e-04
Epoch 5/50
387/387 [==============================] - 5s 12ms/step - loss: 5.6085e-04 - val_loss: 7.4612e-04
Epoch 6/50
387/387 [==============================] - 5s 12ms/step - loss: 4.8998e-04 - val_loss: 5.6353e-04
Epoch 7/50
387/387 [==============================] - 5s 12ms/step - loss: 4.8325e-04 - val_loss: 4.4705e-04
Epoch 8/50
387/387 [==============================] - 5s 12ms/step - loss: 3.9933e-04 - val_loss: 6.2635e-04
Epoch 9/50
387/387 [==============================] - 5s 12ms/step - loss: 3.6695e-04 - val_loss: 5.3525e-04
Epoch 10/50
387/387 [==============================] - 5s 12ms/step - loss: 3.6690e-04 - val_loss: 3.7967e-04
Epoch 11/50
387/387 [==============================] - 5s 13ms/step - loss: 3.4395e-04 - val_loss: 3.8449e-04
Epoch 12/50
387/387 [==============================] - 5s 13ms/step - loss: 3.2885e-04 - val_loss: 6.1913e-04
Epoch 13/50
387/387 [==============================] - 5s 13ms/step - loss: 2.7704e-04 - val_loss: 4.1812e-04
Epoch 14/50
387/387 [==============================] - 5s 12ms/step - loss: 2.7017e-04 - val_loss: 2.5869e-04
Epoch 15/50
387/387 [==============================] - 5s 13ms/step - loss: 2.4560e-04 - val_loss: 5.4358e-04
Epoch 16/50
387/387 [==============================] - 5s 13ms/step - loss: 2.7837e-04 - val_loss: 3.0109e-04
Epoch 17/50
387/387 [==============================] - 5s 13ms/step - loss: 2.4484e-04 - val_loss: 3.3601e-04
Epoch 18/50
387/387 [==============================] - 5s 13ms/step - loss: 2.0835e-04 - val_loss: 3.8868e-04
Epoch 19/50
387/387 [==============================] - 6s 15ms/step - loss: 2.4355e-04 - val_loss: 2.4871e-04
Epoch 20/50
387/387 [==============================] - 5s 13ms/step - loss: 1.9843e-04 - val_loss: 3.7203e-04
Epoch 21/50
387/387 [==============================] - 5s 12ms/step - loss: 2.1181e-04 - val_loss: 2.4572e-04
Epoch 22/50
387/387 [==============================] - 5s 12ms/step - loss: 1.8778e-04 - val_loss: 2.5494e-04
Epoch 23/50
387/387 [==============================] - 5s 13ms/step - loss: 1.8066e-04 - val_loss: 2.5111e-04
Epoch 24/50
387/387 [==============================] - 5s 12ms/step - loss: 1.8112e-04 - val_loss: 2.9211e-04
Epoch 25/50
387/387 [==============================] - 5s 12ms/step - loss: 2.0284e-04 - val_loss: 2.5421e-04
Epoch 26/50
387/387 [==============================] - 4s 12ms/step - loss: 1.6379e-04 - val_loss: 2.2906e-04
Epoch 27/50
387/387 [==============================] - 5s 12ms/step - loss: 1.7656e-04 - val_loss: 1.9107e-04
Epoch 28/50
387/387 [==============================] - 5s 12ms/step - loss: 1.6522e-04 - val_loss: 1.8863e-04
Epoch 29/50
387/387 [==============================] - 5s 13ms/step - loss: 1.5896e-04 - val_loss: 2.4476e-04
Epoch 30/50
387/387 [==============================] - 6s 15ms/step - loss: 1.6748e-04 - val_loss: 3.0141e-04
Epoch 31/50
387/387 [==============================] - 5s 13ms/step - loss: 1.6237e-04 - val_loss: 2.0276e-04
Epoch 32/50
387/387 [==============================] - 5s 12ms/step - loss: 1.7063e-04 - val_loss: 2.3029e-04
Epoch 33/50
387/387 [==============================] - 6s 15ms/step - loss: 1.4083e-04 - val_loss: 2.2445e-04
Epoch 34/50
387/387 [==============================] - 5s 12ms/step - loss: 1.3416e-04 - val_loss: 2.4996e-04
Epoch 35/50
387/387 [==============================] - 4s 12ms/step - loss: 1.6694e-04 - val_loss: 2.5844e-04
Epoch 36/50
387/387 [==============================] - 4s 11ms/step - loss: 1.4822e-04 - val_loss: 1.9492e-04
Epoch 37/50
387/387 [==============================] - 5s 12ms/step - loss: 1.2537e-04 - val_loss: 2.6402e-04
Epoch 38/50
387/387 [==============================] - 6s 15ms/step - loss: 1.4451e-04 - val_loss: 2.0392e-04
Epoch 39/50
387/387 [==============================] - 5s 13ms/step - loss: 1.3550e-04 - val_loss: 2.0369e-04
Epoch 40/50
387/387 [==============================] - 6s 14ms/step - loss: 1.3397e-04 - val_loss: 1.7202e-04
Epoch 41/50
387/387 [==============================] - 5s 13ms/step - loss: 1.2029e-04 - val_loss: 2.1723e-04
Epoch 42/50
387/387 [==============================] - 5s 13ms/step - loss: 1.2936e-04 - val_loss: 2.4467e-04
Epoch 43/50
387/387 [==============================] - 5s 12ms/step - loss: 1.3571e-04 - val_loss: 1.7959e-04
Epoch 44/50
387/387 [==============================] - 6s 15ms/step - loss: 1.0874e-04 - val_loss: 1.8706e-04
Epoch 45/50
387/387 [==============================] - 5s 12ms/step - loss: 1.1912e-04 - val_loss: 1.8208e-04
Epoch 46/50
387/387 [==============================] - 5s 12ms/step - loss: 1.2890e-04 - val_loss: 1.8400e-04
Epoch 47/50
387/387 [==============================] - 5s 13ms/step - loss: 1.2369e-04 - val_loss: 2.1889e-04
Epoch 48/50
387/387 [==============================] - 5s 13ms/step - loss: 1.0986e-04 - val_loss: 2.1243e-04
Epoch 49/50
387/387 [==============================] - 5s 13ms/step - loss: 1.1663e-04 - val_loss: 1.6970e-04
Epoch 50/50
387/387 [==============================] - 6s 16ms/step - loss: 1.0531e-04 - val_loss: 1.9543e-04
43/43 [==============================] - 0s 10ms/step - loss: 1.6970e-04
INFO:tensorflow:Assets written to: ../models/current_x0_model/assets
Validation mse: 0.00016970359138213098
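The parameter count in the summary can be recomputed by hand. The network sees 10 inputs (the 4 numerical process parameters, the two categorical ones presumably one-hot encoded into 2 + 3 columns, plus the position input tp), followed by five hidden layers of 256 units and one output:

```python
# Dense layer parameters: inputs * outputs weights + outputs biases.
sizes = [10, 256, 256, 256, 256, 256, 1]
params = sum(i * o + o for i, o in zip(sizes[:-1], sizes[1:]))
print(params)  # 266241, matching "Total params" in the summary above
```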

Alternative: the model has already been trained#

We just need to reload it to make predictions.

reg.load_network(load_path='../models/best_x0_model')

Visualization#

Prediction for single process parameter values#

x, y = reg.predict({
        'Blechdicke': 1.01, 
        'Niederhalterkraft': 410.0, 
        'Ziehspalt': 2.4, 
        'Einlegeposition': -5, 
        'Ziehtiefe': 30,
        'Stempel_ID': 3,
        'E': 191.37245,
        'Rp0': 138.22696,
        'Rp50': 449.528189,
    }, 
    positions=1000)

plt.figure()
plt.plot(x, y[:, 0])
plt.xlabel('tp')
plt.ylabel('deviationc')
plt.show()
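The positions=1000 argument presumably controls how densely the position input is sampled; assuming it evaluates the network on a uniform grid of tp over its observed range [0.0, 5.0] (an assumption about the API, not confirmed by the source), the returned x would look like:

```python
import numpy as np

# Hypothetical reconstruction of the position grid behind positions=1000.
tp = np.linspace(0.0, 5.0, 1000)
print(tp[0], tp[-1], len(tp))  # 0.0 5.0 1000
```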

Comparison with the ground truth on the training set#

Randomly choose a doe_id and compare the prediction to the ground truth. If the experiment does not exist in the data, an error is raised.

idx = np.random.choice(data['doe_id'].unique())
print("Doe_ID", idx)
data = reg.compare(idx)
Doe_ID 106

%matplotlib inline
plt.rcParams['figure.dpi'] = 150

def viz(x, y):
    plt.figure()
    plt.plot(x, y[:, 0])
    plt.xlabel('tp')
    plt.ylabel('deviationc')


reg.interactive(function=viz, positions=100)