
Activation Functions

Neural Networks

Difficulty: 5 | Problem written by hemdan219@gmail.com

Educational Resource: https://cs231n.github.io/neural-networks-1/


Problem reported in interviews at

Apple
Netflix

Given h_output, the values of the nodes in the previous hidden layer of a neural network that feed into the current node, and weights, the weight associated with each of those nodes, return the current node's value: take the dot product of h_output and weights, then apply an activation function to the result.

The third parameter, activation, is a string naming the activation function applied to that intermediate output. It can take the following four values:

'sigmoid': Sigmoid(Z) = \(\frac{1}{1 + e^{-Z}}\)

'tanh': Tanh(Z) = \(\frac{e^{Z} - e^{-Z}}{e^{Z} + e^{-Z}}\)

'relu': ReLU(Z) = \(\max(0, Z)\)

'leakyrelu': LeakyReLU(Z) = \(\max(0.00001 \cdot Z, Z)\)
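
To see how the four functions differ, consider evaluating each at \(Z = -2\) (a worked example for illustration only, not part of the problem; values rounded to three decimals):

\(\text{Sigmoid}(-2) \approx 0.119\), \(\text{Tanh}(-2) \approx -0.964\), \(\text{ReLU}(-2) = 0\), \(\text{LeakyReLU}(-2) = \max(-0.00002, -2) = -0.00002\)

Note that ReLU zeroes out negative inputs entirely, while LeakyReLU keeps a tiny negative slope.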

Sample Input:
<class 'list'>
h_output : [1, 2, 5]
<class 'list'>
weights: [8, 9, 7]
<class 'str'>
activation: relu

Expected Output:
<class 'int'>
61
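
To see why the expected output is 61: the intermediate value is the dot product

\(Z = 1 \cdot 8 + 2 \cdot 9 + 5 \cdot 7 = 8 + 18 + 35 = 61\)

and ReLU leaves a positive value unchanged, since \(\text{ReLU}(61) = \max(0, 61) = 61\).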

Comments
abhishek_kumar • 3 months, 1 week ago


import numpy as np

def sigmoid(x):
    # Sigmoid(x) = 1 / (1 + e^(-x))
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # Tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

def relu(x):
    # ReLU(x) = max(0, x)
    return max(0.0, x)

def leakyrelu(x):
    # LeakyReLU(x) = max(0.00001 * x, x)
    return max(0.00001 * x, x)

def predict(h_output, weights, activation):
    # Weighted sum of the previous layer's node values.
    net_input = np.dot(h_output, weights)
    if activation == "sigmoid":
        return sigmoid(net_input)
    if activation == "relu":
        return relu(net_input)
    if activation == "tanh":
        return tanh(net_input)
    if activation == "leakyrelu":
        return leakyrelu(net_input)
    raise ValueError(f"unknown activation: {activation}")
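
Calling this on the sample input (assuming the predict function above) reproduces the expected output:

print(predict([1, 2, 5], [8, 9, 7], "relu"))  # prints 61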

 

The activation function controls a node's output in the neural network. In layman's terms, it decides what the node outputs based on the value of its input.

 

REFERENCES:

1. The Sigmoid Activation Function – Python Implementation

2. numpy.dot

3. A beginner’s guide to NumPy with Sigmoid, ReLu and Softmax activation functions

