Existing research (Ando and Zhang, 2007) suggests that embedding graphs on a unit sphere can be beneficial for learning labels on the vertices of a graph. However, the choice of optimal embedding remains an open question. Orthonormal representations of graphs, a class of embeddings over the unit sphere, were introduced by Lovász (1979). In this paper, we show that there exist orthonormal representations which are statistically consistent over a large class of graphs, including power-law and random graphs. This result is achieved by extending the notion of consistency, originally developed for the inductive setting, to graph transduction. As part of the analysis, we explicitly derive relationships between the Rademacher complexity measure and structural properties of graphs, such as the chromatic number. We further show that the fraction of vertices of a graph G, on n nodes, that need to be labelled for the learning algorithm to be consistent, also known as the labelled sample complexity, is governed by ϑ(G), the famous Lovász function of the graph. This, for the first time, relates labelled sample complexity to graph connectivity properties, such as the density of graphs. In the multiview setting, whenever individual views are expressed by a graph, it is a well-known heuristic that a convex combination of the graph Laplacians (Argyriou et al., 2005) tends to improve accuracy. The analysis presented here extends easily to multiple-graph transduction, and develops a sound statistical understanding of this heuristic, previously unavailable.
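To make the multiview heuristic concrete, the following is a minimal sketch (not the paper's algorithm) of forming a convex combination of the unnormalized Laplacians of two illustrative "views" over the same vertex set. The example graphs, the weight alpha, and the function names are assumptions chosen for illustration only.

```python
# Hedged sketch of the multiview heuristic: alpha * L1 + (1 - alpha) * L2,
# where L1, L2 are Laplacians of two graphs ("views") on the same vertices.

def laplacian(n, edges):
    """Unnormalized Laplacian L = D - A of an undirected graph on n nodes."""
    L = [[0.0] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1.0          # degree contribution to D
        L[v][v] += 1.0
        L[u][v] -= 1.0          # adjacency contribution, subtracted
        L[v][u] -= 1.0
    return L

def combine(L1, L2, alpha):
    """Convex combination alpha * L1 + (1 - alpha) * L2, with alpha in [0, 1]."""
    n = len(L1)
    return [[alpha * L1[i][j] + (1.0 - alpha) * L2[i][j] for j in range(n)]
            for i in range(n)]

# Two hypothetical views of a 4-vertex graph: a path and a cycle.
L_path = laplacian(4, [(0, 1), (1, 2), (2, 3)])
L_cycle = laplacian(4, [(0, 1), (1, 2), (2, 3), (3, 0)])
L_mix = combine(L_path, L_cycle, 0.5)
```

A convex combination of Laplacians is again a valid (positive semidefinite) Laplacian-like operator with zero row sums, which is why it can be plugged directly into the same transduction algorithm used for a single graph.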
Rie Kubota Ando and Tong Zhang. Learning on graph with Laplacian regularization. NIPS, 2007.
László Lovász. On the Shannon capacity of a graph. IEEE Transactions on Information Theory, 25(1):1–7, 1979.
Andreas Argyriou, Mark Herbster, and Massimiliano Pontil. Combining graph Laplacians for semi-supervised learning. NIPS, 2005.
Please feel free to contact “rakeshsmysore at gmail dot com” with any queries or comments. If you have any thoughts on or criticism of this work, we will be happy to discuss.