
K-Fold Evaluation

Tags: Optimization, Supervised

Difficulty: 6 | Problem written by ankita
Training a model involves various hyperparameters, and selecting the best one plays an important role in getting the best output. K-fold cross-validation is a standard way to make this selection.

K-fold cross-validation divides the training dataset into k folds. For a given hyperparameter, the model is trained on k-1 folds and scored on the one remaining validation fold. This is repeated until every fold has served as the validation fold exactly once, and the mean score over all k validation folds is computed.

This process is repeated until the mean score has been computed for every hyperparameter, and the hyperparameter with the best mean score is used to train the final model.
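The two nested loops described above can be sketched as follows. The callback `fit_and_score` and the equal-size-folds-with-remainder split are assumptions spelled out later in this problem; the helper names are ours, not part of any required API.

```python
import numpy as np

def cross_validate(X, Y, K, params, fit_and_score):
    """Sketch of the procedure described above. `fit_and_score` is an
    assumed callback: fit_and_score(p, X_tr, Y_tr, X_val, Y_val) -> float."""
    X, Y = np.asarray(X), np.asarray(Y)
    n, size = len(X), len(X) // K
    mean_scores = []
    for p in params:            # outer loop: one pass per hyperparameter
        scores = []
        for i in range(K):      # inner loop: rotate the validation fold
            stop = (i + 1) * size if i < K - 1 else n
            val = np.arange(i * size, stop)
            tr = np.setdiff1d(np.arange(n), val)
            scores.append(fit_and_score(p, X[tr], Y[tr], X[val], Y[val]))
        mean_scores.append(np.mean(scores))
    best = params[int(np.argmax(mean_scores))]
    return best, np.array(mean_scores)
```

The score callback is injected so the fold-rotation logic can be reasoned about (and tested) independently of the model being tuned.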

We expect you to implement the algorithm yourself and to use the score defined below.

We are going to use an SVM classifier to train our final model.

SVM stands for support vector machine, a model that finds a line or hyperplane separating the data into classes.
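As a minimal illustration on invented toy data, an sklearn `SVC` can be fit and queried for class probabilities; note that `probability=True` is needed so that `predict_proba` is available for the log-loss term of the score:

```python
from sklearn.svm import SVC

# Toy, hypothetical data: two linearly separable groups of points.
X_toy = [[i, i] for i in range(10)]
y_toy = [0] * 5 + [1] * 5

# probability=True enables predict_proba (via internal calibration),
# which the score's log_loss term will need.
clf = SVC(C=1.0, probability=True)
clf.fit(X_toy, y_toy)
proba = clf.predict_proba([[8.0, 8.0]])  # one row of per-class probabilities
```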

The score we'll use in this particular problem is:

Score = -(log_loss) + 100 * accuracy
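A sketch of this score using the sklearn metrics named in the hints (the helper name `fold_score` is ours; the `labels` argument is passed so log_loss behaves consistently even when a validation fold happens to contain a single class):

```python
from sklearn.metrics import log_loss, accuracy_score

def fold_score(y_true, proba, preds):
    """Score = -(log_loss) + 100 * accuracy for one validation fold.
    `proba` holds predicted class probabilities, `preds` hard labels."""
    return -log_loss(y_true, proba, labels=[0, 1]) + 100 * accuracy_score(y_true, preds)
```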

Input:

X: an array of training examples        

Y: an array of outputs, one corresponding to each training example

K: the number of folds 

params: hyperparameters for the model

Output:

The best-suited hyperparameter for the model and an array of mean scores, one per hyperparameter.

Algorithm:

If the number of training examples is divisible by k, the dataset is divided into k equal folds; otherwise, the first k-1 folds are of equal length and the kth fold takes all the remaining training examples.
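Under that rule, the fold boundaries can be sketched as follows (the helper name `make_folds` is ours):

```python
import numpy as np

def make_folds(n, k):
    """Split indices 0..n-1 into k folds: the first k-1 folds hold
    n // k examples each; the kth fold takes whatever remains."""
    size = n // k
    folds = [np.arange(i * size, (i + 1) * size) for i in range(k - 1)]
    folds.append(np.arange((k - 1) * size, n))
    return folds
```

For the sample input (n = 100, K = 6) this yields five folds of 16 examples and a final fold of 20.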

Hint:

You'll have to import SVC from sklearn.svm.

You'll have to import log_loss from sklearn.metrics.

You'll have to import accuracy_score from sklearn.metrics.

Use np.argmax() to get the index of the best-suited hyperparameter.
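Putting the hints together, a single train-and-score step for one fold might look like this. The helper name is ours, and treating the params as values of C is an assumption based on the sample input:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import log_loss, accuracy_score

def svc_fold_score(c, X_tr, Y_tr, X_val, Y_val):
    """Fit SVC(C=c) on the training folds, then score the held-out
    fold with -(log_loss) + 100 * accuracy."""
    model = SVC(C=c, probability=True).fit(X_tr, Y_tr)
    proba = model.predict_proba(X_val)   # probabilities for log_loss
    preds = model.predict(X_val)         # hard labels for accuracy
    return -log_loss(Y_val, proba, labels=model.classes_) \
        + 100 * accuracy_score(Y_val, preds)
```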

Sample Input:
<class 'list'>
X: [[0.28808309, 0.13271201, 0.92122371, 0.91243814, 0.4151917], [0.85052914, 0.30628268, 0.25682314, 0.87804254, 0.45199377], [0.97808989, 0.88895976, 0.34420744, 0.54623489, 0.31057216], [0.62234358, 0.57423254, 0.75037482, 0.35049899, 0.77555272], [0.96055441, 0.11425848, 0.39821816, 0.56062956, 0.08909737], [0.01338147, 0.6444357, 0.4552864, 0.66184982, 0.49481201], [0.02758924, 0.80681603, 0.09312395, 0.00928236, 0.76826061], [0.91611049, 0.98133631, 0.20562383, 0.92889794, 0.2880263], [0.66417984, 0.99673027, 0.14082773, 0.68663448, 0.1852378], [0.127306, 0.05193181, 0.34193187, 0.79006964, 0.69190596], [0.05793603, 0.50314402, 0.73088811, 0.23800427, 0.44468312], [0.70415153, 0.17036039, 0.22422623, 0.15603537, 0.95797461], [0.34986266, 0.47993462, 0.21164228, 0.27138063, 0.81004197], [0.12191907, 0.05957449, 0.74249603, 0.2570194, 0.01236144], [0.36161561, 0.0414345, 0.17963828, 0.08565233, 0.36974245], [0.81504428, 0.9325805, 0.01796581, 0.78506612, 0.42539213], [0.15517462, 0.75956168, 0.44425727, 0.69905955, 0.8919953], [0.85736733, 0.63198963, 0.52543524, 0.82662779, 0.73772603], [0.99704502, 0.03156544, 0.44710947, 0.01758611, 0.20290011], [0.79227421, 0.56130345, 0.14378843, 0.69674449, 0.22054341], [0.65594553, 0.33069314, 0.54426334, 0.99758698, 0.62174528], [0.34331697, 0.88689343, 0.24367589, 0.93993594, 0.23290597], [0.03714022, 0.98882829, 0.62153273, 0.9318707, 0.70624316], [0.64520011, 0.33643023, 0.36503792, 0.24213934, 0.79346295], [0.29174243, 0.93621123, 0.84439963, 0.43614745, 0.25341451], [0.54201512, 0.23572186, 0.77564139, 0.97141649, 0.93005915], [0.27384658, 0.44439464, 0.9245569, 0.6127407, 0.45978992], [0.93013235, 0.39591071, 0.2968617, 0.08479829, 0.84570743], [0.17438118, 0.75926282, 0.71607561, 0.87155642, 0.11381959], [0.44189104, 0.50822277, 0.02248793, 0.21051604, 0.71579882], [0.53497343, 0.00416655, 0.70641816, 0.69786548, 0.78644013], [0.944234, 0.37333323, 0.24614889, 0.89094975, 0.03650821], [0.43451792, 0.52521582, 
0.80394213, 0.88820634, 0.41390045], [0.34989699, 0.09094647, 0.76609326, 0.2566451, 0.85423253], [0.05943028, 0.47814409, 0.2798714, 0.11499398, 0.27554214], [0.27343064, 0.23370781, 0.58299519, 0.16823013, 0.42517402], [0.6343759, 0.10374378, 0.99478037, 0.91322083, 0.74115256], [0.65626876, 0.56226815, 0.54436579, 0.74667564, 0.89968993], [0.16559459, 0.89125274, 0.02333008, 0.95284121, 0.75076309], [0.59411215, 0.43988558, 0.86795254, 0.7420899, 0.26450626], [0.9240415, 0.30801223, 0.06634608, 0.15971211, 0.71680789], [0.4966878, 0.83683593, 0.68256842, 0.97665755, 0.16080067], [0.5072641, 0.52339745, 0.97606971, 0.75309326, 0.03235577], [0.72548198, 0.76678127, 0.74716786, 0.23820836, 0.24604173], [0.52839351, 0.24251113, 0.88988021, 0.99818792, 0.92640533], [0.37845405, 0.23220892, 0.06519913, 0.57503303, 0.70490758], [0.8212936, 0.10927849, 0.99535972, 0.70681867, 0.43476112], [0.68690874, 0.43757428, 0.89569817, 0.18835328, 0.17430506], [0.35882503, 0.90040749, 0.36214664, 0.14191039, 0.84137703], [0.91155096, 0.03072719, 0.97320033, 0.40312316, 0.72911722], [0.28776697, 0.52201193, 0.74349138, 0.77110118, 0.74468244], [0.77836926, 0.51905273, 0.06334974, 0.73895158, 0.05589789], [0.06312494, 0.64078434, 0.66368573, 0.08021854, 0.10716787], [0.2362742, 0.29132618, 0.16432433, 0.40782557, 0.10177961], [0.01780314, 0.40685755, 0.84033877, 0.14398734, 0.35261497], [0.75016534, 0.72420304, 0.04848289, 0.61682958, 0.17187083], [0.43359168, 0.51683344, 0.33075431, 0.6161009, 0.50886957], [0.45455179, 0.11101992, 0.69740734, 0.54958932, 0.20411589], [0.65205183, 0.81665913, 0.15260562, 0.44512577, 0.90700416], [0.66894505, 0.7547253, 0.51328169, 0.45399249, 0.52413087], [0.08879762, 0.12481797, 0.57286059, 0.04231704, 0.28682389], [0.15654173, 0.70407219, 0.2487118, 0.42385112, 0.73144897], [0.36572936, 0.13271708, 0.5680062, 0.90744433, 0.78083676], [0.57175455, 0.55173402, 0.27154379, 0.98371879, 0.75801435], [0.12098328, 0.21867875, 0.97867235, 0.08478394, 
0.04764714], [0.74229282, 0.34635157, 0.33116957, 0.0629314, 0.82807809], [0.87310944, 0.96473962, 0.58226047, 0.61871701, 0.14511178], [0.60492922, 0.41891185, 0.45823699, 0.67456315, 0.8196659], [0.21960785, 0.80479582, 0.56063182, 0.65670668, 0.04457237], [0.12954945, 0.90032136, 0.33692947, 0.38270984, 0.63480873], [0.09190535, 0.93893676, 0.19130277, 0.94270217, 0.53227416], [0.55518902, 0.37911209, 0.24968603, 0.79695427, 0.68352164], [0.01028394, 0.56413764, 0.71003891, 0.90543364, 0.1475176], [0.50061622, 0.92545155, 0.63966697, 0.83688944, 0.01119546], [0.59715891, 0.87397631, 0.99140603, 0.7175234, 0.74094065], [0.23757625, 0.8977864, 0.20677632, 0.88854015, 0.80295196], [0.32164893, 0.06907393, 0.02611696, 0.03236917, 0.71255535], [0.98664963, 0.17854053, 0.79812665, 0.60481957, 0.8636909], [0.58074405, 0.85921968, 0.05445138, 0.39707516, 0.43441103], [0.34113555, 0.37746067, 0.28238333, 0.70028379, 0.09372683], [0.72954287, 0.4250067, 0.01349116, 0.90581052, 0.97631382], [0.40171488, 0.11968335, 0.47306299, 0.17901952, 0.66506976], [0.34251949, 0.36575828, 0.06948143, 0.17280023, 0.09392637], [0.11304527, 0.74263465, 0.08926681, 0.71936409, 0.19814847], [0.51674979, 0.01244053, 0.49234971, 0.45425094, 0.31793668], [0.09440213, 0.97816435, 0.30598656, 0.42029234, 0.48287456], [0.8709424, 0.59847009, 0.75686167, 0.70745862, 0.55046962], [0.27491991, 0.62970338, 0.63160984, 0.28401754, 0.50593925], [0.58368126, 0.48805522, 0.79530289, 0.10589737, 0.74074327], [0.00883314, 0.95270768, 0.60755843, 0.66822616, 0.88743599], [0.35645768, 0.41060868, 0.38050583, 0.370375, 0.14595772], [0.31770366, 0.21004162, 0.89327997, 0.56172719, 0.47937697], [0.27677351, 0.19604685, 0.5457228, 0.37965756, 0.77057216], [0.68586814, 0.26593834, 0.30350968, 0.71327724, 0.49953284], [0.92170642, 0.43857489, 0.36049305, 0.46597855, 0.4793894], [0.28439499, 0.36816905, 0.5292559, 0.85562856, 0.63882751], [0.84748763, 0.07982133, 0.45928776, 0.29978146, 0.65273754], [0.75319528, 
0.83768627, 0.37534864, 0.70880077, 0.81149084], [0.1771618, 0.82725945, 0.59764865, 0.24586135, 0.74871198], [0.64357573, 0.06919947, 0.26045521, 0.70770567, 0.01820965]]
<class 'list'>
Y: [1, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 1, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0]
<class 'int'>
K: 6
<class 'list'>
params: [0.01, 0.05, 0.1, 0.5, 1, 10, 50, 100]

Expected Output:
<class 'tuple'>
(1, array([27.68506952, 27.68506952, 27.68506952, 22.63991039, 27.96547358, 22.64001034, 25.44289318, 22.9202678 ]))

Note: numpy has already been imported as np (import numpy as np).