We prove continuity of the solution path for the group lasso, a popular method for computing group-sparse models. Unlike the classical lasso, the group lasso solution path is non-linear and cannot be determined in closed form. To circumvent this, we first characterize the group lasso solution set and then show how to construct an implicit function for the min-norm path. We prove that our implicit representation is continuous almost everywhere and extend this to continuity everywhere when the group lasso solution is unique. Our work extends solution path analyses from the lasso setting to the group lasso and implies that grid search is a sensible approach to hyper-parameter selection. Our results also apply to convex reformulations of neural networks and are thus deeply connected to solution paths for shallow neural networks.