Question 1 of 2
You've built the structure of a model! Now, let's understand its parts. The ReLU activation function replaces every negative value in a tensor with 0 and leaves non-negative values unchanged.
Examine your model and consider how ReLU activation works. If the tensor [-5, 0.5, 3, -1] were processed by ReLU, what would the output be?
Model layers: Dense(32, activation="relu")
ReLU output: [0, 0.5, 3, 0]
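To check the answer yourself, here is a minimal sketch (assuming TensorFlow/Keras, which the Dense layer notation suggests) that applies ReLU to the example tensor:

import tensorflow as tf

# The example tensor from the question.
x = tf.constant([-5.0, 0.5, 3.0, -1.0])

# ReLU computes max(0, x) element-wise: negatives become 0,
# non-negative values pass through unchanged.
y = tf.nn.relu(x)
print(y.numpy())  # [0.  0.5 3.  0. ]

Here -5 and -1 are clipped to 0, while 0.5 and 3 pass through, giving [0, 0.5, 3, 0].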