Lecture 2: About data

About data and ways of preprocessing them. Principal Component Analysis (PCA). Similarity and distance measures.

PCA in R
PCA-Eigenvector-Illustration.gif

Reducing a two-dimensional space and its data (blue dots) to a one-dimensional one, which is the goal of PCA. In PCA we search for eigenvectors such that, when the data are projected onto them, the variance of the projections is maximal. The projections of the original data (blue) onto the eigenvector are the red dots; the variance of the red dots expresses how far apart they lie along the eigenvector.
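A minimal sketch of this idea (not the course's PCA.Python.py script), using NumPy on synthetic 2-D data: compute the covariance matrix, take the eigenvector with the largest eigenvalue, and project the points onto it to get the 1-D coordinates of the "red dots".

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic 2-D "blue dots" with correlated coordinates
X = rng.multivariate_normal(mean=[0, 0], cov=[[3, 2], [2, 2]], size=200)

Xc = X - X.mean(axis=0)                 # center the data
cov = np.cov(Xc, rowvar=False)          # 2x2 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigen-decomposition of the symmetric matrix

pc1 = eigvecs[:, np.argmax(eigvals)]    # direction of maximum variance (first principal component)
scores = Xc @ pc1                       # 1-D coordinates of the projected "red dots"
print("variance along PC1:", scores.var(ddof=1))
print("largest eigenvalue :", eigvals.max())   # the two values should agree
```

The printout illustrates the definition above: the variance of the projected points equals the largest eigenvalue of the covariance matrix.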

PCA.Python.py
winequality-white.csv
PCA-AsProjectionOntoLowerDimensionalSpace.png

Real-world example of PCA: reducing 3 dimensions (the leaves of a tree) to 2 (the shadow of the leaves), while at the same time preserving a large amount of the variance of the original data.
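A hedged sketch of the same 3-D to 2-D reduction with scikit-learn, on synthetic data rather than winequality-white.csv: the explained variance ratio reports how much of the original variance the 2-D "shadow" retains.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# synthetic 3-D points lying close to a tilted plane, so 2 components capture almost everything
basis = rng.normal(size=(2, 3))
X3 = rng.normal(size=(300, 2)) @ basis + 0.05 * rng.normal(size=(300, 3))

pca = PCA(n_components=2)
X2 = pca.fit_transform(X3)              # the 2-D "shadow" of the 3-D data
print("explained variance ratio:", pca.explained_variance_ratio_)
print("total variance kept    :", pca.explained_variance_ratio_.sum())
```

The same two lines (fit_transform plus explained_variance_ratio_) apply unchanged to a real dataset such as the wine-quality data once it is loaded into a numeric array.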

Lecture2-AboutAlgorithmsAndData.pdf
1ErgasiaSupportMaterial.py

Support material for working on Assignment 1.