The function that instantiates a neural network as created by create_nn().
inst(neural_network, activation_function, x)
The output of the continuous function that is the instantiation of the given
neural network with the given activation function at the given x, where x is a
vector whose length equals the width of the input layer of the neural network.
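For intuition, the following is a minimal sketch of what instantiation computes, assuming (as suggested by the arguments below) that each element of the network is a pair list(W, b) of weight matrix and bias vector. The names inst_sketch and act are illustrative only; this is not the package's implementation.
# Sketch only: compose the affine maps W %*% z + b layer by layer,
# applying the activation between layers but not after the final one.
inst_sketch <- function(neural_network, act, x) {
  z <- as.matrix(x)
  L <- length(neural_network)
  for (i in seq_len(L)) {
    z <- neural_network[[i]]$W %*% z + neural_network[[i]]$b
    if (i < L) z <- act(z)
  }
  z
}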
An ordered list of lists, of the type generated by
create_nn(), where each element in the
list of lists is a pair (W, b) giving the weight matrix and bias vector of that layer.
NOTE: We call instantiation what Grohs et al. call "realization".
A continuous function applied to the output of each layer. For now, only ReLU, Sigmoid, and Tanh are available. Note that all proofs are valid only for ReLU activation.
Our input to the continuous function formed from the activation. The input will
be an element of R^d, i.e. a numeric vector whose length equals the width of the input layer (see the example below).
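As a hedged illustration of the x argument (the layer widths here are chosen arbitrarily), a network built with create_nn(c(2, 4, 1)) takes inputs in R^2, so x must be a length-2 numeric vector:
# input width 2, one hidden layer of width 4, output width 1
nn <- create_nn(c(2, 4, 1))
# x must be a numeric vector of length 2
inst(nn, ReLU, c(0.5, -1.2))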
Grohs, P., Hornung, F., Jentzen, A. et al. Space-time error estimates for deep neural network approximations for differential equations. (2019). https://arxiv.org/abs/1908.03833.
Definition 1.3.4 in Jentzen, A., Kuckuck, B., and von Wurstemberger, P. (2023). Mathematical Introduction to Deep Learning: Methods, Implementations, and Theory. https://arxiv.org/abs/2310.20360.
More precisely, we use the definition in:
Definition 2.3 in Rafi, S., Padgett, J.L., and Nakarmi, U. (2024). Towards an Algebraic Framework for Approximating Functions Using Neural Network Polynomials. https://arxiv.org/abs/2402.01058.
# instantiate a network with layer widths 1, 3, 5, 6 at the scalar input x = 5
create_nn(c(1, 3, 5, 6)) |> inst(ReLU, 5)
# instantiate a network with input width 3 at the vector input c(4, 4, 4)
create_nn(c(3, 3, 5, 6)) |> inst(ReLU, c(4, 4, 4))
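As a hedged sanity check of the documented behavior, the second example instantiates a function from R^3 to R^6, so its result should have 6 entries:
out <- create_nn(c(3, 3, 5, 6)) |> inst(ReLU, c(4, 4, 4))
length(out)  # expected: 6, the width of the output layer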