The triplet loss has a couple of disadvantages that should be considered.
First, it requires careful selection of the anchor, positive, and negative images. The negative can’t be too different from the anchor: if it is, the network satisfies the loss function trivially without learning anything. The anchor and the negative images should be similar, yet belong to different classes.
Second, it’s computationally expensive. Lastly, the triplet loss requires a margin hyperparameter, alpha, which can lead to worse results when not chosen carefully.
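To make these points concrete, here is a minimal NumPy sketch of the triplet loss (function and variable names are illustrative). It shows how an "easy" negative, one already far from the anchor, drives the loss to zero so the network gets no learning signal:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """Triplet loss: pull the anchor toward the positive and push it
    away from the negative by at least the margin alpha."""
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance anchor-positive
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance anchor-negative
    return max(d_pos - d_neg + alpha, 0.0)

a = np.array([1.0, 0.0])
p = np.array([1.1, 0.0])

# Easy negative: already far from the anchor, so the loss is zero
# and this triplet teaches the network nothing.
print(triplet_loss(a, p, np.array([5.0, 5.0])))  # 0.0

# Hard negative: close to the anchor, so the loss is positive
# and produces a useful gradient.
print(triplet_loss(a, p, np.array([1.0, 0.3])))
```

Note how the margin alpha appears directly in the loss: set it too small and easy triplets dominate; set it too large and the loss may never reach zero.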
There are many alternatives to the triplet loss; one of them is the ArcFace loss. It is based on the cross-entropy loss and aims to maximize the decision boundary between classes, thus grouping similar data points closer together.
The idea behind ArcFace is to maximize the inter-class angles and minimize the intra-class angles on a hypersphere. An angular margin penalty is added to the angle between the embedding and the weight vector of the true class, enlarging the original angle before the logit is computed.
The angular margin penalty penalizes embedding vectors that drift too far from their class center and pulls the embedding features of a given class closer together.
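The margin trick above can be sketched in a few lines of NumPy (parameter names `s` and `m` follow the common convention; this is an illustrative sketch, not the reference implementation). The embedding and class weights are L2-normalized so that their dot product is the cosine of the angle between them, and the margin is added only to the true class's angle:

```python
import numpy as np

def arcface_logits(embedding, weights, true_class, s=64.0, m=0.5):
    """Sketch of the ArcFace logit modification: add the angular
    margin m to the true class's angle, then scale all logits by s
    before feeding them to softmax cross-entropy."""
    e = embedding / np.linalg.norm(embedding)                 # normalize embedding
    w = weights / np.linalg.norm(weights, axis=1, keepdims=True)  # normalize class weights
    cos_theta = w @ e                                         # cosine similarity to each class
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))          # recover the angles
    theta[true_class] += m                                    # angular margin on the true class only
    return s * np.cos(theta)
```

Because the margin shrinks the true class's logit, the network must push the embedding even closer to its class weight vector to compensate, which is exactly what tightens the intra-class clusters.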
Many other losses operate on a similar idea to ArcFace, such as SphereFace and CosFace.
Using the triplet loss with a Siamese network is an early approach to similarity learning, and it carries the disadvantages described above. ArcFace, on the other hand, achieves the goal of maximizing the decision boundary between classes cleverly, without wasting computational resources.
Continue reading: https://towardsdatascience.com/novel-approaches-to-similarity-learning-e680c61d53cd?source=rss—-7f60cf5620c9—4