Logic Puzzle Platform

Question 1 of 2

Understanding Your Model

You've built the structure of a model! Now, let's understand its parts. The ReLU activation function replaces every negative value in a tensor with 0 and leaves non-negative values unchanged.

Goal:

Examine your model and understand how ReLU activation works. If the tensor [-5, 0.5, 3, -1] were processed by ReLU, what would the output be?

Expected Output:

Model layers: Dense(32, relu). ReLU output: [0, 0.5, 3, 0]
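To see why the expected output is [0, 0.5, 3, 0], here is a minimal sketch of ReLU applied to that tensor. The use of NumPy and the function name `relu` are assumptions for illustration; the challenge's actual runtime is not shown.

```python
import numpy as np

def relu(x):
    # ReLU: replace negative entries with 0, keep non-negative entries unchanged.
    # np.maximum compares element-wise against the scalar 0.
    return np.maximum(0, x)

t = np.array([-5, 0.5, 3, -1])
print(relu(t).tolist())  # → [0.0, 0.5, 3.0, 0.0]
```

Each negative entry (-5 and -1) is clipped to 0, while 0.5 and 3 pass through untouched, matching the expected output above.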

Build Your Solution


🧠 AI/ML Challenge

Build blocks, then run code to test your neural network
