Hyperbolic spaces offer a rich setting for learning embeddings with superior properties that have been leveraged in areas such as computer vision, natural language processing and computational biology. Recently, several hyperbolic approaches have been proposed to learn high-quality representations for users and items in the recommendation setting. However, these approaches do not capture the higher-order relations that typically exist in the implicit feedback domain. Graph convolutional neural networks (GCNs), on the other hand, excel at capturing such information by applying multiple levels of aggregation to local representations. In this paper we combine these two approaches in a novel way: by applying the graph convolutions in the tangent space of a reference point. We show that this hyperbolic GCN architecture can be effectively learned with a margin ranking loss, and test-time retrieval is done using the hyperbolic distance. We conduct extensive empirical analysis on three public benchmarks and compare against a large set of baselines. Our approach achieves highly competitive performance with a significant improvement over state-of-the-art methods. We further study the properties of the embeddings obtained by our hyperbolic model and show that they offer meaningful insights into the data. Full code for this work will be released at the time of publication.
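To make the core idea concrete, below is a minimal sketch of a tangent-space graph convolution with a margin ranking loss and hyperbolic-distance retrieval. It assumes the Poincaré ball model with curvature -1 and uses the origin as the reference point; the function names (`tangent_gcn_layer`, `margin_ranking_loss`, etc.) and the single-layer aggregation are illustrative choices, not the paper's actual implementation.

```python
import torch


def expmap0(v, eps=1e-7):
    # Exponential map at the origin of the Poincare ball (curvature -1):
    # maps a tangent vector v to a point on the ball.
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(norm) * v / norm


def logmap0(x, eps=1e-7):
    # Logarithmic map at the origin: maps a point on the ball back to the tangent space.
    norm = x.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.atanh(norm.clamp(max=1 - 1e-5)) * x / norm


def poincare_dist(x, y, eps=1e-7):
    # Hyperbolic distance on the Poincare ball, used for test-time retrieval.
    sq_dist = (x - y).pow(2).sum(-1)
    denom = (1 - x.pow(2).sum(-1)).clamp_min(eps) * (1 - y.pow(2).sum(-1)).clamp_min(eps)
    return torch.acosh(1 + 2 * sq_dist / denom)


def tangent_gcn_layer(ball_emb, adj_norm):
    # Lift hyperbolic embeddings to the tangent space at the reference point (origin),
    # aggregate over the normalized user-item interaction graph, and project back.
    tangent = logmap0(ball_emb)
    aggregated = adj_norm @ tangent  # one round of neighborhood aggregation
    return expmap0(aggregated)


def margin_ranking_loss(user, pos_item, neg_item, margin=0.1):
    # Encourage positive items to be closer to the user than negatives by at least `margin`.
    pos_d = poincare_dist(user, pos_item)
    neg_d = poincare_dist(user, neg_item)
    return torch.clamp(pos_d - neg_d + margin, min=0).mean()
```

At test time, items would be ranked for each user by ascending `poincare_dist`, matching the retrieval rule described above.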
