ID: 2211.01369

Gravitational Dimensionality Reduction Using Newtonian Gravity and Einstein's General Relativity

October 30, 2022

Similar papers (page 4)

An Information Geometric Framework for Dimensionality Reduction

September 29, 2008

83% Match
Kevin M. Carter, Raviv Raich, Alfred O. Hero III
Machine Learning
Methodology

This report concerns the problem of dimensionality reduction through information geometric methods on statistical manifolds. While considerable work has recently been presented on dimensionality reduction for learning tasks such as classification, clustering, and visualization, these methods have focused primarily on Riemannian manifolds in Euclidean space. While sufficient for many applications, there are many high-dimensional signals which have ...


A Nonlinear Dimensionality Reduction Framework Using Smooth Geodesics

July 21, 2017

83% Match
Kelum Gajamannage, Randy Paffenroth, Erik M. Bollt
Machine Learning
Computer Vision and Pattern ...
Machine Learning
Dynamical Systems

Existing dimensionality reduction methods are adept at revealing hidden underlying manifolds arising from high-dimensional data and thereby producing a low-dimensional representation. However, the smoothness of the manifolds produced by classic techniques over sparse and noisy data is not guaranteed. In fact, the embedding generated using such data may distort the geometry of the manifold and thereby produce an unfaithful embedding. Herein, we propose a framework for nonlinea...


Nested Hyperbolic Spaces for Dimensionality Reduction and Hyperbolic NN Design

December 3, 2021

83% Match
Xiran Fan, Chun-Hao Yang, Baba C. Vemuri
Machine Learning
Artificial Intelligence
Machine Learning

Hyperbolic neural networks have been popular in the recent past due to their ability to represent hierarchical data sets effectively and efficiently. The challenge in developing these networks lies in the nonlinearity of the embedding space, namely the hyperbolic space. Hyperbolic space is a homogeneous Riemannian manifold of the Lorentz group. Most existing methods (with some exceptions) use local linearization to define a variety of operations paralleling those used in trad...


Macrodynamics of users' behavior in Information Retrieval

May 15, 2009

83% Match
Daniel Sonntag, Romàn R. Zapatrin
Information Retrieval

We present a method to geometrize massive data sets from search engines query logs. For this purpose, a macrodynamic-like quantitative model of the Information Retrieval (IR) process is developed, whose paradigm is inspired by basic constructions of Einstein's general relativity theory in which all IR objects are uniformly placed in a common Room. The Room has a structure similar to Einsteinian spacetime, namely that of a smooth manifold. Documents and queries are treated as ...


General relativity in a nutshell I

November 30, 2023

83% Match
Jorge Pinochet
Popular Physics

Einstein's general relativity is the best available theory of gravity. In recent years, spectacular tests of Einstein's theory have been carried out, arousing interest that goes far beyond the narrow circle of specialists. The aim of this work is to offer an elementary introduction to general relativity. In this first part, we introduce the geometric concepts that constitute the basis of Einstein's theory. In the second part we will use these concepts to explore the ...


NeuroDAVIS: A neural network model for data visualization

April 1, 2023

83% Match
Chayan Maitra, Dibyendu B. Seal, Rajat K. De
Human-Computer Interaction
Artificial Intelligence
Machine Learning

The task of dimensionality reduction and visualization of high-dimensional datasets has long remained a challenging problem. Modern high-throughput technologies produce newer high-dimensional datasets having multiple views with relatively new data types. Visualization of these datasets requires proper methodology that can uncover hidden patterns in the data without affecting the local and global structures within the data. To this end, however, very few such methodologies exist...


Linear Dimensionality Reduction: Survey, Insights, and Generalizations

June 3, 2014

83% Match
John P. Cunningham, Zoubin Ghahramani
Machine Learning

Linear dimensionality reduction methods are a cornerstone of analyzing high dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a r...


Optimization of distributions differences for classification

March 2, 2017

83% Match
Mohammad Reza Bonyadi, Quang M. Tieng, David C. Reutens
Machine Learning
Machine Learning

In this paper we introduce a new classification algorithm called Optimization of Distributions Differences (ODD). The algorithm aims to find a transformation from the feature space to a new space where the instances in the same class are as close as possible to one another while the gravity centers of these classes are as far as possible from one another. This aim is formulated as a multiobjective optimization problem that is solved by a hybrid of an evolutionary strategy and...


A Category Space Approach to Supervised Dimensionality Reduction

October 27, 2016

83% Match
Anthony O. Smith, Anand Rangarajan
Machine Learning
Machine Learning

Supervised dimensionality reduction has emerged as an important theme in the last decade. Despite the plethora of models and formulations, there is a lack of a simple model which aims to project the set of patterns into a space defined by the classes (or categories). To this end, we set up a model in which each class is represented as a 1D subspace of the vector space formed by the features. Assuming the set of classes does not exceed the cardinality of the features, the mode...


Dimensionality Reduction as Probabilistic Inference

April 16, 2023

83% Match
Aditya Ravuri, Francisco Vargas, ... , Neil D. Lawrence
Machine Learning
Machine Learning

Dimensionality reduction (DR) algorithms compress high-dimensional data into a lower dimensional representation while preserving important features of the data. DR is a critical step in many analysis pipelines as it enables visualisation, noise reduction and efficient downstream processing of the data. In this work, we introduce the ProbDR variational framework, which interprets a wide range of classical DR algorithms as probabilistic inference algorithms in this framework. P...
