1 Practical: Learning Vector Quantization

In Nestor you will find the file data lvq.mat, which contains 100 two-dimensional feature vectors. Here, we assume that the first 50 points belong to class 1 and the other data points belong to class 2.

1.1 Implementation (3 points)

Implement the LVQ1 algorithm as introduced and discussed in class, using the simple (squared) Euclidean distance. Consider
(a) LVQ1 with one prototype per class,
(b) LVQ1 with two prototypes per class.

Your code should have the following structure:
• Read in the file containing the data, define the corresponding class labels, and determine N, the dimension of the input vectors, and P, the number of examples.
• Set the following parameters: K, the number of prototypes; η, the learning rate (constant step size); tmax, the maximum number of epochs (sweeps through the data set).
• Initialize each prototype by random selection of a data point from the corresponding class.
• Repeat for epochs t = 1 to t = tmax:
– In each epoch, present all of the data set in randomized order (every example is presented exactly once; the order of examples is different in every epoch). In Matlab this can be done conveniently by permuting the examples with the randperm(P) command. Take care that the training labels are permuted in the very same way!
– Perform an epoch of training using all of the P examples. At every individual step, present a single example to the system, evaluate the distances from all prototypes, and update the winning prototype according to the LVQ1 prescription.

After each epoch, determine the number E of misclassified training examples, i.e. the "training error". Plot E/100 (the error in %) as a function of the number of epochs (learning curve) and stop the training when E becomes approximately constant. It turns out that for this data set a small learning rate of 0.002 appears appropriate. With this learning rate you should get reasonable results after, say, 100 or 200 epochs.
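The training scheme above can be sketched compactly. The following is an illustrative Python version, not part of the assignment (which asks for your own Matlab code); the function and parameter names are made up for this sketch:

```python
import random

def lvq1_train(data, labels, prototypes, proto_labels,
               eta=0.002, epochs=100, seed=0):
    """LVQ1 with squared Euclidean distance; returns the trained
    prototypes and the training error (fraction) per epoch."""
    rng = random.Random(seed)
    protos = [list(w) for w in prototypes]  # work on copies
    errors = []
    for _ in range(epochs):
        order = list(range(len(data)))
        rng.shuffle(order)        # randperm: labels follow the shared index
        mistakes = 0
        for i in order:
            x, y = data[i], labels[i]
            # squared Euclidean distance to every prototype
            d = [sum((xj - wj) ** 2 for xj, wj in zip(x, w)) for w in protos]
            winner = d.index(min(d))
            psi = 1.0 if proto_labels[winner] == y else -1.0
            if psi < 0:
                mistakes += 1     # the winner carries the wrong label
            # LVQ1 prescription: move the winner towards the example if the
            # labels agree (psi = +1), away from it otherwise (psi = -1)
            for j in range(len(x)):
                protos[winner][j] += eta * psi * (x[j] - protos[winner][j])
        errors.append(mistakes / len(data))
    return protos, errors
```

For case (b), simply pass two initial prototypes per class in `prototypes` with matching entries in `proto_labels`; the winner-take-all update is unchanged.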
1.2 Perform training and report results (7 points in total)

You should hand in a report comprising:
• (1 point) a brief introduction in words
• (2 points) one example learning curve each for (a) and (b) (as figures with appropriate labelling and captions)
• (2 points) a plot which shows the data with the LVQ1 labels (e.g. as black dots vs. white circles) in the two-dimensional feature space for case (b) at the end of the training process; also mark the prototype positions
• (2 points) a brief discussion of your results

Remarks:
• If you apply special "tricks" in your code, explain them in the text and provide the corresponding lines of code as well.
• Of course you should code LVQ1 yourself; do not use functions from the Matlab neural network toolbox or code retrieved from some repository.

Bonus (suggestions, 1 point max. in total):
- initialize prototypes in (or very close to) the class-conditional mean vectors and compare the learning curves with those of the random initialization
- consider systems with three or four prototypes per class
- display the 'learning trajectory' of the prototypes in the two-dimensional space
- implement and run "GLVQ", which, for every single example, updates the closest correct and the closest wrong prototype
- run your LVQ implementation on the 3-class Iris data set, which is available at https://archive.ics.uci.edu/ml/datasets/iris

Solution code (Matlab), case (a), one prototype per class:

    %load data
    load("/MATLAB Drive/lvqdata.mat")

    %initialization of parameters
    P = 70;                  % number of training examples used
    K = 1;                   % prototypes per class
    LearningRate = 0.002;
    MaxEpoch = 300;
    Error = 0;

    %label class A and class B: 0 stands for class A, 1 stands for class B
    lvqdata(1, 3) = 0;       % expands lvqdata by a third (label) column, zero-filled
    for i = 51:100
        lvqdata(i, 3) = 1;
    end

    %draw P examples in random order, labels attached
    newlvqdata1 = zeros(P, 3);
    index0 = randperm(100);
    for idx = 1:P
        newlvqdata1(idx, :) = lvqdata(index0(1, idx), :);
    end

    %random index per class for extracting prototype examples
    for y = 1:P
        if newlvqdata1(y, 3) == 0
            classAindex = y;
        else
            classBindex = y;
        end
    end
    PrototypeA = newlvqdata1(classAindex, :);   % index into the shuffled subset
    PrototypeB = newlvqdata1(classBindex, :);
    PrototypeA_x = PrototypeA(1, 1);
    PrototypeA_y = PrototypeA(1, 2);
    PrototypeB_x = PrototypeB(1, 1);
    PrototypeB_y = PrototypeB(1, 2);

    %error frequency per epoch
    errorfreq = zeros(1, MaxEpoch);

    %reshuffle data every epoch, keeping the class labels attached
    for h = 1:MaxEpoch
        newlvqdata = zeros(P, 3);
        index = randperm(P);
        for j = 1:P
            newlvqdata(j, :) = newlvqdata1(index(1, j), :);
        end
        Error = 0;
        for i = 1:P
            %squared Euclidean distances to both prototypes
            Distance_to_PrototypeA = (newlvqdata(i,1) - PrototypeA_x)^2 + (newlvqdata(i,2) - PrototypeA_y)^2;
            Distance_to_PrototypeB = (newlvqdata(i,1) - PrototypeB_x)^2 + (newlvqdata(i,2) - PrototypeB_y)^2;
            if Distance_to_PrototypeA < Distance_to_PrototypeB
                %prototype A wins: attract if the label matches, repel otherwise
                if newlvqdata(i, 3) == 0
                    psi = 1;
                else
                    psi = -1;
                    Error = Error + 1;
                end
                PrototypeA_x = PrototypeA_x + LearningRate*psi*(newlvqdata(i,1) - PrototypeA_x);
                PrototypeA_y = PrototypeA_y + LearningRate*psi*(newlvqdata(i,2) - PrototypeA_y);
            else
                %prototype B wins
                if newlvqdata(i, 3) == 1
                    psi = 1;
                else
                    psi = -1;
                    Error = Error + 1;
                end
                PrototypeB_x = PrototypeB_x + LearningRate*psi*(newlvqdata(i,1) - PrototypeB_x);
                PrototypeB_y = PrototypeB_y + LearningRate*psi*(newlvqdata(i,2) - PrototypeB_y);
            end
        end
        errorfreq(1, h) = Error/P;
    end

    %learning curve
    plot(1:MaxEpoch, 100*errorfreq);
    xlabel('epoch'); ylabel('training error (%)');
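Among the bonus suggestions, GLVQ updates, for every single example, the closest prototype with the correct label (index J) and the closest one with a wrong label (index K). A minimal illustrative sketch of one such update step in Python, assuming the simplest cost function φ(μ) = μ with relative distance μ = (dJ − dK)/(dJ + dK); the names are made up and constant factors of the gradient are absorbed into the learning rate:

```python
def glvq_step(x, y, protos, proto_labels, eta=0.05):
    """One GLVQ update: attract the closest correct prototype (J),
    repel the closest wrong one (K), weighted by the gradient of the
    relative distance mu = (dJ - dK)/(dJ + dK)."""
    def sqdist(w):
        return sum((xj - wj) ** 2 for xj, wj in zip(x, w))
    J = min((i for i, c in enumerate(proto_labels) if c == y),
            key=lambda i: sqdist(protos[i]))
    K = min((i for i, c in enumerate(proto_labels) if c != y),
            key=lambda i: sqdist(protos[i]))
    dJ, dK = sqdist(protos[J]), sqdist(protos[K])
    denom = (dJ + dK) ** 2
    # d(mu)/d(dJ) = 2*dK/denom  and  d(mu)/d(dK) = -2*dJ/denom
    for j in range(len(x)):
        protos[J][j] += eta * (2 * dK / denom) * (x[j] - protos[J][j])
        protos[K][j] -= eta * (2 * dJ / denom) * (x[j] - protos[K][j])
    return (dJ - dK) / (dJ + dK)  # mu < 0: example correctly classified
```

Unlike LVQ1, both prototypes move on every presentation, and the step size is scaled by how close the decision was.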
 
