
Exploding Gradients Detection

Unsolved
Fundamentals
Optimization

Difficulty: 2 | Problem written by zeyad_omar
Problem reported in interviews at

Amazon
Apple
Facebook
Google
Netflix

Gradient descent is an optimization algorithm that searches for the weights that minimize the error. At each step it updates the weights by subtracting the gradient scaled by a small constant, taking a step towards the bottom of the search space. This small constant that the gradient is multiplied by is known as the learning rate (alpha).
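As a quick illustration, a single update step looks like the sketch below (the weight and gradient values are made up for the example):

import numpy as np

alpha = 0.01                        # learning rate
w = np.array([0.5, -1.2, 3.0])      # current weights (illustrative values)
grad = np.array([0.2, -0.4, 1.1])   # gradient of the error w.r.t. the weights

# One gradient-descent step: move against the gradient, scaled by alpha
w = w - alpha * grad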

How small should the learning rate be?

If alpha is too high, the model error diverges and the optimal weights will never be obtained. On the other hand, if alpha is too small, the learning process becomes far too slow.

In this problem, you are given a 1D vector of the model's gradients over time, as well as a threshold. Write a function that returns the index at which the gradients are about to explode (reach too high a value).

How?

The gradients explode when the difference between the previous gradient and the current gradient exceeds the given threshold.

Return -1 if the exploding gradients problem never occurs.

Sample Input:
<class 'list'>
grads: [1.01, 0.9, 1.56, 10.2, 30.25]
threshold: 15

Expected Output:
<class 'int'>
3
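For reference, here is a minimal sketch of a detector consistent with the sample above. The function name detect_explosion is made up, and it assumes the answer is the index of the gradient just before the jump that exceeds the threshold (in the sample, the jump from 10.2 at index 3 to 30.25 at index 4 is 20.05 > 15, so 3 is returned):

import numpy as np

def detect_explosion(grads, threshold):
    # Consecutive differences: diffs[i] = grads[i + 1] - grads[i]
    diffs = np.diff(np.asarray(grads, dtype=float))
    # Indices where the jump between consecutive gradients exceeds the threshold
    exceeded = np.nonzero(diffs > threshold)[0]
    # Return the index of the gradient just before the first such jump, or -1
    return int(exceeded[0]) if exceeded.size > 0 else -1

# Example (matches the sample above):
# detect_explosion([1.01, 0.9, 1.56, 10.2, 30.25], 15)  ->  3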


Note: numpy has already been imported as np (import numpy as np).