public final class MnistExperimentExample extends Object
This example builds a neural network with one input layer and one hidden layer.
The input layer has an input dimension of numRows*numColumns, where these variables indicate the number of vertical and horizontal pixels in the image. This layer uses a rectified linear unit (ReLU) activation function. Its weights are initialized with Xavier initialization (https://prateekvjoshi.com/2016/03/29/understanding-xavier-initialization-in-deep-neural-networks/), which keeps the variance of the signal stable across layers and helps avoid slow learning. The layer produces 1000 output signals that feed the hidden layer.
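The Xavier scheme mentioned above draws each weight from a zero-mean distribution whose spread shrinks with the layer's fan-in and fan-out. A minimal stand-alone sketch in plain Java (independent of the DL4J classes this example actually uses; the class and method names here are illustrative only):

```java
import java.util.Random;

public class XavierInitSketch {
    // Xavier (Glorot) initialization: each weight is drawn from a zero-mean
    // Gaussian with standard deviation sqrt(2 / (fanIn + fanOut)), so signal
    // variance stays roughly constant from layer to layer.
    static double[][] xavier(int fanIn, int fanOut, Random rng) {
        double std = Math.sqrt(2.0 / (fanIn + fanOut));
        double[][] w = new double[fanIn][fanOut];
        for (int i = 0; i < fanIn; i++)
            for (int j = 0; j < fanOut; j++)
                w[i][j] = rng.nextGaussian() * std;
        return w;
    }

    public static void main(String[] args) {
        // Assumed MNIST dimensions: 28 x 28 pixel images, 1000 hidden units.
        int numRows = 28, numColumns = 28;
        double[][] w = xavier(numRows * numColumns, 1000, new Random(42));
        System.out.println(w.length + " x " + w[0].length);
    }
}
```

In the actual example these weights are produced by the framework's built-in initializer rather than hand-rolled code; the sketch only shows the scaling rule.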
The hidden layer has an input dimension of 1000, fed from the input layer. Its weights are also initialized with Xavier initialization. The activation function for this layer is a softmax, which normalizes the 10 outputs so that they sum to 1. The highest of these normalized values is picked as the predicted class.
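The softmax-and-pick step described above can be sketched in isolation (plain Java; the class and method names are illustrative, not part of the example class):

```java
public class SoftmaxSketch {
    // Softmax: exponentiate each score and divide by the total, so the
    // outputs become non-negative and sum to 1.
    static double[] softmax(double[] scores) {
        double max = Double.NEGATIVE_INFINITY;
        for (double s : scores) max = Math.max(max, s); // shift for numerical stability
        double[] out = new double[scores.length];
        double sum = 0;
        for (int i = 0; i < scores.length; i++) {
            out[i] = Math.exp(scores[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }

    // The predicted class is the index of the largest normalized value.
    static int argmax(double[] probs) {
        int best = 0;
        for (int i = 1; i < probs.length; i++)
            if (probs[i] > probs[best]) best = i;
        return best;
    }
}
```

With 10 raw scores from the hidden layer, `argmax(softmax(scores))` yields the predicted digit, 0 through 9.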
| Constructor and Description |
|---|
| `MnistExperimentExample()` |
| Modifier and Type | Method and Description |
|---|---|
| `static void` | `main(String[] args)` The experiment entry point. |
| `void` | `runMnistExperiment(OnlineExperiment experiment)` The experiment runner. |
public static void main(String[] args)
You should set three environment variables to run this experiment. Alternatively, you can set these values in the resources/application.conf file.
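The page does not name the three variables. As a hedged sketch, the names below follow Comet's usual convention; they are an assumption, not confirmed by this page, so verify them against your SDK version:

```shell
# Assumed variable names -- Comet's common convention; not listed on this page.
export COMET_API_KEY="your-api-key"
export COMET_PROJECT_NAME="your-project"
export COMET_WORKSPACE="your-workspace"
```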
Parameters:
args - the command line arguments.

public void runMnistExperiment(OnlineExperiment experiment)
                        throws IOException
The experiment runner.
Parameters:
experiment - the Comet experiment instance.
Throws:
IOException - if an I/O exception is raised.

Copyright © 2022. All rights reserved.