Total Variation (TV) is an effective method for removing noise from digital images while preserving edges. The scaling, or regularization, parameter in the TV process controls the amount of denoising: a value of zero gives a result equivalent to the input signal. Here we explore three algorithms for selecting this parameter based on the statistics of the signal in the total variation process: the Discrepancy Principle, an empirical Bayesian approach, and a method based on a χ2 test on the regularized residual. These approaches are advantageous for nonlinear or computationally large problems because they automate the selection of the regularization parameter and provide statistical justification, removing the guesswork of manually adjusting the parameter or iterating it toward zero.
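To illustrate the role of the regularization parameter, the following is a minimal sketch (not the paper's method) of 1-D TV denoising by gradient descent on a smoothed TV penalty, minimizing 0.5‖u − f‖² + λ·TV(u). The function name `tv_denoise_1d` and all numerical settings (step size, smoothing constant, iteration count) are illustrative assumptions. With λ = 0 the output equals the input, as the abstract notes; larger λ yields stronger smoothing.

```python
import numpy as np

def tv_denoise_1d(f, lam, step=0.25, n_iter=300, eps=1e-6):
    """Illustrative 1-D TV denoising (not the paper's algorithm).

    Gradient descent on 0.5*||u - f||^2 + lam * sum_i sqrt((u[i+1]-u[i])^2 + eps),
    a smoothed surrogate for the TV penalty. lam is the regularization
    parameter: lam = 0 returns f unchanged; larger lam smooths more.
    """
    u = f.astype(float).copy()
    for _ in range(n_iter):
        du = np.diff(u)                       # forward differences u[i+1] - u[i]
        w = du / np.sqrt(du**2 + eps)         # smoothed sign of each difference
        # Gradient of the smoothed TV term with respect to u:
        # interior: w[j-1] - w[j]; boundaries: -w[0] and w[-1]
        grad_tv = np.concatenate(([-w[0]], -np.diff(w), [w[-1]]))
        u = u - step * ((u - f) + lam * grad_tv)
    return u
```

For example, denoising a noisy step signal with `tv_denoise_1d(noisy, 0.5)` reduces the total variation of the signal, while `tv_denoise_1d(noisy, 0.0)` returns it unchanged.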