Code Explanation:
Step 1: Create a Tensor
import torch

x = torch.tensor([1.0, 2.0, 3.0])
This creates a 1D tensor:
x = [1.0, 2.0, 3.0]
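As an illustrative check (not part of the original snippet), you can inspect the tensor's attributes to confirm what was created:
print(x.dtype)  # torch.float32 -- Python floats produce a 32-bit float tensor
print(x.shape)  # torch.Size([3]) -- a 1D tensor with three elements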
Step 2: Element-wise Subtraction
x - 2
The scalar 2 is broadcast and subtracted from each element of the tensor:
[1.0 − 2, 2.0 − 2, 3.0 − 2] = [−1.0, 0.0, 1.0]
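A minimal sketch of this step, assuming x from Step 1 is already defined; printing the intermediate result is added here only for illustration:
diff = x - 2
print(diff)  # tensor([-1., 0., 1.])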
Step 3: Apply ReLU (Rectified Linear Unit)
y = torch.relu(x - 2)
The ReLU function is defined as ReLU(z) = max(0, z): negative inputs become 0, and non-negative inputs pass through unchanged.
Applying ReLU to [-1.0, 0.0, 1.0]:
ReLU(-1.0) = 0.0
ReLU(0.0) = 0.0
ReLU(1.0) = 1.0
Thus, the result is:
y = [0.0, 0.0, 1.0]
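As a side note (not part of the original code), the same result can be reproduced with torch.clamp, which behaves like ReLU when its lower bound is 0:
y = torch.relu(x - 2)
y_alt = torch.clamp(x - 2, min=0)  # cut off everything below 0
print(torch.equal(y, y_alt))  # True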
Final Output:
tensor([0., 0., 1.])
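Putting the steps together, here is a minimal self-contained version of the walkthrough that runs as-is (the print statement is added for illustration):
import torch

# Step 1: create a 1D float tensor
x = torch.tensor([1.0, 2.0, 3.0])

# Steps 2 and 3: subtract 2 element-wise, then zero out negatives with ReLU
y = torch.relu(x - 2)

print(y)  # tensor([0., 0., 1.])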