Iterations: Number of training iterations to run.
Batch Size: Number of samples processed per training step.
Hidden Layers: Number of hidden layers in the network.
Neurons per Layer: Number of neurons in each hidden layer.
Activation Function: ReLU | Sigmoid | Tanh | Leaky ReLU | Softmax. Select the activation function to use.
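The listed activation options can be sketched with their standard definitions; the leak slope of 0.01 for Leaky ReLU is an assumed default, not stated above.

```python
import math

def relu(x: float) -> float:
    # Zero out negative inputs, pass positives through.
    return max(0.0, x)

def sigmoid(x: float) -> float:
    # Squash input into the (0, 1) range.
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x: float) -> float:
    # Squash input into the (-1, 1) range.
    return math.tanh(x)

def leaky_relu(x: float, slope: float = 0.01) -> float:
    # Like ReLU, but negative inputs keep a small slope (assumed 0.01).
    return x if x > 0 else slope * x

def softmax(xs: list[float]) -> list[float]:
    # Normalize a vector into a probability distribution.
    # Subtract the max for numerical stability.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```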
Learning Rate: Step size for weight updates. Default is 0.001.
Random Seed: Seed value for random number generators to ensure reproducibility.
Loading... Please wait while the program is running.
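The settings above can be gathered into a single configuration object. This is a hypothetical sketch: only the learning-rate default (0.001) is stated above, so every other default value here is an illustrative assumption.

```python
import random
from dataclasses import dataclass

@dataclass
class TrainingConfig:
    # Defaults other than learning_rate are assumptions for illustration.
    iterations: int = 100
    batch_size: int = 32
    hidden_layers: int = 2
    neurons_per_layer: int = 64
    activation: str = "ReLU"      # one of the options listed above
    learning_rate: float = 0.001  # default stated above
    random_seed: int = 42

    def seed_rngs(self) -> None:
        # Seed Python's RNG so repeated runs are reproducible.
        random.seed(self.random_seed)
```

With a fixed seed, two runs draw identical random sequences, which is the reproducibility the Random Seed field is for.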