Spin Dr.
#machinelearning Loss functions have a one-to-one mapping to distances. One clear example is the Mean Squared Error, which is equivalent (up to a monotone transformation) to the Euclidean distance between the predictions and the ground-truth response variable one is trying to predict. What about other loss functions?
09:59 AM - Feb 11, 2023
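
To make the correspondence concrete, here is a minimal sketch (NumPy only, with made-up prediction and ground-truth vectors) showing that MSE is the squared Euclidean distance between the prediction vector and the target vector, averaged over the number of samples:

import numpy as np

# Hypothetical predictions and ground-truth responses, for illustration only.
y_pred = np.array([2.5, 0.0, 2.1, 7.8])
y_true = np.array([3.0, -0.5, 2.0, 7.0])

# Mean squared error: average of the squared residuals.
mse = np.mean((y_pred - y_true) ** 2)

# Euclidean (L2) distance between the prediction and ground-truth vectors.
euclidean = np.linalg.norm(y_pred - y_true)

# MSE equals the squared Euclidean distance divided by the sample count,
# so the two are monotonically related: minimizing one minimizes the other.
assert np.isclose(mse, euclidean**2 / len(y_true))
print(f"MSE: {mse:.4f}  Euclidean distance: {euclidean:.4f}")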
Spin Dr.
In response to Spin Dr.
Picking the right distance has a significant impact on unsupervised learning tasks like dimensionality reduction and data clustering. Here is an example of clusters generated with scikit-learn's make_gaussian_quantiles (see the sketch below). These clusters are N-dimensional concentric spheres, so they cannot be separated linearly.
10:02 AM - Feb 11, 2023
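
Since the original figure is not reproduced here, the sketch below shows how such data can be generated with scikit-learn's make_gaussian_quantiles; the parameter values are illustrative, not necessarily those behind the original plot:

import matplotlib.pyplot as plt
from sklearn.datasets import make_gaussian_quantiles

# Each class is a concentric quantile "shell" of a single isotropic Gaussian,
# so no hyperplane can separate the classes.
X, y = make_gaussian_quantiles(
    n_samples=1500,
    n_features=2,   # 2-D here for plotting; the construction extends to N dimensions
    n_classes=3,
    random_state=42,
)

plt.scatter(X[:, 0], X[:, 1], c=y, s=10, cmap="viridis")
plt.title("Gaussian-quantile clusters: concentric and not linearly separable")
plt.show()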
Spin Dr.
In response to Spin Dr.
MSE (mean squared error, i.e. Euclidean distance) fails to yield a meaningful result in the UMAP (dimensionality reduction technique) output on these clusters; a rough sketch of the setup follows below.
10:04 AM - Feb 11, 2023
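
As a rough illustration of the claim (not the exact experiment behind the original output), one can run UMAP on higher-dimensional Gaussian-quantile data with the default Euclidean metric; the metric argument is a one-line change, which is what makes the distance choice easy to experiment with:

import umap  # pip install umap-learn
from sklearn.datasets import make_gaussian_quantiles

# Higher-dimensional concentric shells; parameter values are illustrative.
X, y = make_gaussian_quantiles(n_samples=2000, n_features=10, n_classes=3,
                               random_state=0)

# With the Euclidean metric, points on opposite sides of the same shell are
# far apart while points from different shells can be close, so the
# nearest-neighbour graph (and hence the embedding) mixes the classes.
embedding_euclidean = umap.UMAP(metric="euclidean", random_state=0).fit_transform(X)

# Trying another built-in metric is a single-argument change.
embedding_cosine = umap.UMAP(metric="cosine", random_state=0).fit_transform(X)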

 
