Rank reduction of matrices has been widely studied in linear algebra. However, its geometric understanding is limited, and its theoretical connection to statistical models has not been revealed. We tackle this problem using information geometry and present a unified geometric view of matrix rank reduction. Our key idea is to treat each matrix as a probability distribution represented by a log-linear model on a partially ordered set (poset), which enables us to formulate rank reduction as projection onto a statistical submanifold corresponding to the set of low-rank matrices. This geometric view allows us to derive a novel, efficient rank-1 reduction method, called Legendre rank-1 reduction, which analytically solves the mean-field approximation and minimizes the KL divergence from a given matrix.
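As a point of orientation, the KL-optimal rank-1 approximation of a nonnegative matrix is classically given in closed form by the outer product of its row and column sums divided by the total sum (the independence, or mean-field, model). The sketch below illustrates only this standard closed form, not the paper's poset-based formulation; the function name `rank1_kl` is an illustrative choice, assuming NumPy.

```python
import numpy as np

def rank1_kl(P):
    """Closed-form rank-1 matrix minimizing KL(P || Q) over
    nonnegative rank-1 matrices Q with the same total sum.

    The minimizer is the outer product of P's row sums and
    column sums, divided by the total sum (independence model).
    """
    total = P.sum()
    return np.outer(P.sum(axis=1), P.sum(axis=0)) / total

P = np.array([[0.2, 0.1],
              [0.1, 0.6]])
Q = rank1_kl(P)
# Q has rank 1 and preserves the row and column sums of P
```

Note that the result preserves all row and column marginals of the input, which is the sense in which it solves the mean-field approximation analytically.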