Amemiya's Prediction Criterion penalizes R-squared more heavily than adjusted
R-squared does for each additional degree of freedom used on the
right-hand side of the equation. Lower values indicate a better model under this criterion.

$$\frac{n + p}{n - p}\left(1 - R^2\right)$$

where $n$ is the sample size, $p$ is the number of predictors including the intercept, and
$R^2$ is the coefficient of determination.
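As a minimal sketch of how the criterion could be computed, the snippet below fits an ordinary least-squares model with NumPy and evaluates the formula above. The function name `amemiya_pc` and the simulated data are illustrative assumptions, not part of any particular library.

```python
import numpy as np

def amemiya_pc(y, X):
    """Amemiya's Prediction Criterion: ((n + p) / (n - p)) * (1 - R^2).

    Lower values indicate a better model. p counts all columns of X,
    including the intercept column.
    """
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit
    resid = y - X @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot  # coefficient of determination
    return ((n + p) / (n - p)) * (1.0 - r2)

# Simulated data: y depends linearly on one predictor plus noise.
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(scale=0.5, size=n)

# Model 1: intercept + x. Model 2: adds an irrelevant noise column,
# which the (n + p) / (n - p) factor penalizes.
X1 = np.column_stack([np.ones(n), x])
X2 = np.column_stack([X1, rng.normal(size=n)])
print(amemiya_pc(y, X1))
print(amemiya_pc(y, X2))
```

Comparing the two printed values shows how the criterion trades off fit against the number of regressors: the extra noise column raises $R^2$ only slightly while increasing the penalty factor.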

References

Amemiya, T. (1976). Selection of Regressors. Technical Report No. 225, Stanford University, Stanford, CA.

Judge, G. G., Griffiths, W. E., Hill, R. C., and Lee, T.-C. (1980). The Theory and Practice of Econometrics.
New York: John Wiley & Sons.