In recent years, there have been significant advances in efficiently solving \(\ell_s\)-regression using linear system solvers and \(\ell_2\)-regression [Adil-Kyng-Peng-Sachdeva, J. ACM'24]. Would efficient solvers for smoothed \(\ell_p\)-regression lead to even faster rates for solving \(\ell_s\)-regression when \(2 \leq p < s\)? In this paper, we answer this question affirmatively and show how to solve \(\ell_s\)-regression using \(\tilde{O}(n^{\frac{\nu}{1+\nu}})\) iterations, where \(\nu := \frac{1}{p} - \frac{1}{s}\) and each iteration solves a smoothed \(\ell_p\)-regression problem. To obtain this result, we provide improved accelerated rates for convex optimization problems given access to an \(\ell_p^s(\lambda)\)-proximal oracle, which, for a point \(c\), returns the minimizer of the regularized problem \(\min_{x} f(x) + \lambda \|x-c\|^s_p\). We further show that these rates are optimal among algorithms whose queries lie in the span of the previous iterates and oracle outputs.
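As a sanity check on the exponent (a worked specialization we include for illustration, using only the definitions above), instantiating the rate at \(p = 2\) gives
\[
\nu = \frac{1}{2} - \frac{1}{s} = \frac{s-2}{2s},
\qquad
\frac{\nu}{1+\nu} = \frac{(s-2)/(2s)}{(3s-2)/(2s)} = \frac{s-2}{3s-2},
\]
so the bound specializes to \(\tilde{O}(n^{\frac{s-2}{3s-2}})\) iterations, matching the rate achievable for \(\ell_s\)-regression via \(\ell_2\)-based solvers.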