Open Access Article

Relaxed Constraints Formulation for Non-linear Distance Metric Learning in Hierarchical Clustering

A. Mhetre1 , V.S. Gaikwad2

Section: Survey Paper, Product Type: Journal Paper
Volume-5, Issue-1, Page no. 36-39, Jan-2017

Online published on Jan 31, 2017

Copyright © A. Mhetre, V.S. Gaikwad. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


How to Cite this Paper


IEEE Style Citation: A. Mhetre, V.S. Gaikwad, “Relaxed Constraints Formulation for Non-linear Distance Metric Learning in Hierarchical Clustering,” International Journal of Computer Sciences and Engineering, Vol.5, Issue.1, pp.36-39, 2017.

MLA Style Citation: A. Mhetre, V.S. Gaikwad "Relaxed Constraints Formulation for Non-linear Distance Metric Learning in Hierarchical Clustering." International Journal of Computer Sciences and Engineering 5.1 (2017): 36-39.

APA Style Citation: A. Mhetre, V.S. Gaikwad (2017). Relaxed Constraints Formulation for Non-linear Distance Metric Learning in Hierarchical Clustering. International Journal of Computer Sciences and Engineering, 5(1), 36-39.

BibTex Style Citation:
@article{Mhetre_2017,
author = {A. Mhetre and V.S. Gaikwad},
title = {Relaxed Constraints Formulation for Non-linear Distance Metric Learning in Hierarchical Clustering},
journal = {International Journal of Computer Sciences and Engineering},
issue_date = {January 2017},
volume = {5},
number = {1},
month = {jan},
year = {2017},
issn = {2347-2693},
pages = {36-39},
url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=1152},
publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY - JOUR
UR - https://www.ijcseonline.org/full_paper_view.php?paper_id=1152
TI - Relaxed Constraints Formulation for Non-linear Distance Metric Learning in Hierarchical Clustering
T2 - International Journal of Computer Sciences and Engineering
AU - A. Mhetre
AU - V.S. Gaikwad
PY - 2017
DA - 2017/01/31
PB - IJCSE, Indore, INDIA
SP - 36-39
IS - 1
VL - 5
SN - 2347-2693
ER -


Abstract

Metric learning is the process of learning a distance function over objects. Many data mining tasks, such as clustering and nearest-neighbour search, depend heavily on the quality of this distance function. Most metric learning methods assume a linear (Mahalanobis) model of distance, yet for many types of data a linear model is not adequate. Recent work on nonlinear data has demonstrated the potential power of non-Mahalanobis distance functions, particularly tree-based functions, which lead to more robust learning algorithms. We present a novel tree-based nonlinear metric learning method that can draw information from both constrained and unconstrained points. The hierarchical nature of training also reduces the cost of constraint satisfaction, since constraints are processed once per hierarchy rather than once per object. By introducing randomness during hierarchy training and combining the outputs of many of the resulting semi-random weak hierarchy metrics, we obtain a powerful and robust nonlinear metric model. We compare our method to a number of state-of-the-art benchmarks on k-nearest-neighbour classification, large-scale image retrieval, and semi-supervised clustering problems, and find that it yields results comparable to the state of the art.
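To make the forest-of-hierarchies idea concrete, the following is a minimal Python sketch, not the authors' algorithm: each hierarchy below uses purely random projection splits where the paper learns max-margin, constraint-driven splits, and the pairwise distance is a simple shared-depth score averaged over the ensemble. All names (RandomHierarchy, shared_depth, forest_distance) are hypothetical.

import numpy as np

class RandomHierarchy:
    """One weak cluster hierarchy, here a random-projection tree.

    Hypothetical stand-in for the trained hierarchies the abstract
    describes: splits are random rather than learned from
    must-link/cannot-link constraints.
    """

    def __init__(self, X, max_depth=8, min_leaf=5, seed=0):
        self.rng = np.random.default_rng(seed)
        self.max_depth = max_depth
        self.min_leaf = min_leaf
        # internal node: (w, threshold, left_id, right_id); leaf: None
        self.nodes = []
        self._build(X, depth=0)

    def _build(self, X, depth):
        node_id = len(self.nodes)
        if depth >= self.max_depth or len(X) <= self.min_leaf:
            self.nodes.append(None)                # leaf
            return node_id
        w = self.rng.standard_normal(X.shape[1])   # random split direction
        proj = X @ w
        t = float(np.median(proj))                 # split at the median projection
        mask = proj <= t
        if mask.all() or not mask.any():           # degenerate split: stop here
            self.nodes.append(None)
            return node_id
        self.nodes.append(None)                    # reserve slot, fill after recursion
        left = self._build(X[mask], depth + 1)
        right = self._build(X[~mask], depth + 1)
        self.nodes[node_id] = (w, t, left, right)
        return node_id

    def shared_depth(self, x, y):
        """Depth of the deepest node that still contains both x and y."""
        node_id, depth = 0, 0
        while self.nodes[node_id] is not None:
            w, t, left, right = self.nodes[node_id]
            go_left_x = float(x @ w) <= t
            go_left_y = float(y @ w) <= t
            if go_left_x != go_left_y:             # paths diverge here
                break
            node_id = left if go_left_x else right
            depth += 1
        return depth

def forest_distance(x, y, forest, max_depth=8):
    """Ensemble metric: pairs that separate near the root are far apart."""
    mean_depth = np.mean([h.shared_depth(x, y) for h in forest])
    return 1.0 - mean_depth / max_depth

# Usage: build 25 semi-random hierarchies and query the combined metric.
rng = np.random.default_rng(42)
X = rng.standard_normal((200, 10))
forest = [RandomHierarchy(X, seed=s) for s in range(25)]
print(forest_distance(X[0], X[1], forest))

Averaging over many such weak hierarchies illustrates why per-hierarchy randomness helps: any single tree induces a coarse, arbitrary partition, but the ensemble mean smooths these out into a usable dissimilarity, analogous in spirit to the combination step the abstract describes.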

Key-Words / Index Term

Similarity Measures, Clustering, Image Retrieval, Classification, Data Mining, Constrained and Unconstrained Points

References

[1] D. M. Johnson, C. Xiong and J. J. Corso, "Semi-Supervised Nonlinear Distance Metric Learning via Forests of Max-Margin Cluster Hierarchies," IEEE Transactions on Knowledge and Data Engineering, vol. 28, no. 4, pp. 1035-1046, April 2016.
[2] A. Bellet, A. Habrard, and M. Sebban, "A survey on metric learning for feature vectors and structured data," arXiv preprint arXiv:1306.6709, 2013.
[3] J. V. Davis, B. Kulis, P. Jain, S. Sra, and I. S. Dhillon, "Information-theoretic metric learning," in Proc. 24th Int. Conf. Mach. Learn., 2007, pp. 209-216.
[4] C. Shen, J. Kim, L. Wang, and A. van den Hengel, "Positive semidefinite metric learning with boosting," in Proc. Adv. Neural Inf. Process. Syst., 2009, pp. 1651-1660.
[5] J. Blitzer, K. Q. Weinberger, and L. K. Saul, "Distance metric learning for large margin nearest neighbor classification," in Proc. Adv. Neural Inf. Process. Syst., 2005, pp. 1473-1480.
[6] Y. Ying and P. Li, "Distance metric learning with eigenvalue optimization," J. Mach. Learn. Res., vol. 13, pp. 1-26, 2012.
[7] R. Chatpatanasiri, T. Korsrilabutr, P. Tangchanachaianan, and B. Kijsirikul, "A new kernelization framework for Mahalanobis distance learning algorithms," Neurocomputing, vol. 73, no. 10, pp. 1570-1579, 2010.
[8] S. Chopra, R. Hadsell, and Y. LeCun, "Learning a similarity metric discriminatively, with application to face verification," in Proc. IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recog., 2005, pp. 539-546.
[9] A. Frome, Y. Singer, and J. Malik, "Image retrieval and classification using local distance functions," in Proc. Adv. Neural Inf. Process. Syst., 2006, pp. 417-424.
[10] K. Q. Weinberger and L. K. Saul, "Fast solvers and efficient implementations for distance metric learning," in Proc. 25th Int. Conf. Mach. Learn., 2008, pp. 1160-1167.