Sharpe Ratio

Definition of 'Sharpe Ratio'

The Sharpe Ratio is also known as the Sharpe Index, Sharpe Measure or Reward-to-Variability Ratio. It measures the excess return (or risk premium) per unit of risk in an investment asset or a trading strategy. It is named after William Forsyth Sharpe, who last revised the measure in 1994:

S = (AR - RFR) / SD

where
S = Sharpe Ratio
AR = Asset Return
RFR = Risk Free Return
SD = Standard Deviation.

Expected returns can be used in place of actual returns for both the asset and the risk-free rate. Likewise, a benchmark return can be used in place of the Risk Free Return.

The standard deviation is that of the excess return, i.e. the square root of the variance of AR - RFR.
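
As a minimal sketch (assuming NumPy and made-up sample data), the ex-post calculation can be carried out as follows:

```python
import numpy as np

def sharpe_ratio(asset_returns, risk_free_returns):
    """Ex-post Sharpe Ratio: mean excess return divided by the
    sample standard deviation of the excess return."""
    excess = np.asarray(asset_returns) - np.asarray(risk_free_returns)
    return excess.mean() / excess.std(ddof=1)

# Hypothetical annual returns for an asset and a constant risk-free rate.
asset_returns = [0.12, 0.08, -0.03, 0.15, 0.07]
risk_free = [0.02] * 5

print(round(sharpe_ratio(asset_returns, risk_free), 2))  # ~0.85 for this sample
```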

The Sharpe Ratio is used to compare how well the return of one asset compensates the investor for the risk taken relative to another asset. The higher the Sharpe Ratio, the better.

When two assets with the same expected return are compared against the same benchmark (e.g. the risk-free asset), the asset with the higher Sharpe Ratio achieves that return with less risk.
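
For illustration, with hypothetical figures: suppose Assets A and B both return 10% against a risk free rate of 2%, but the standard deviation of A's excess return is 8% while B's is 16%. Then A's Sharpe Ratio is (10% - 2%) / 8% = 1.0 and B's is (10% - 2%) / 16% = 0.5, so A delivers the same return for half the risk.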

Investors are generally advised to pick investments with high Sharpe Ratios. When examining the performance of assets whose returns are smoothed (such as with-profits funds), the Sharpe Ratio should be derived from the performance of the underlying assets rather than the fund returns, since smoothing understates the true volatility and therefore inflates the ratio.

Because the Sharpe Ratio is a dimensionless number, many investors find it hard to interpret: how much better, for example, is a Sharpe Ratio of 2.5 than one of 1.5? The Modigliani Risk-Adjusted Performance Measure (M2) was developed to address this; it expresses risk-adjusted performance in units of percentage return, which is easier for investors to understand.
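
For reference, M2 can be written in the same notation as above (this formula is standard for the measure but is not part of the definition here):

M2 = RFR + S x SDB

where SDB is the standard deviation of the benchmark's return. A strategy with S = 1.0, RFR = 2% and SDB = 10% therefore has an M2 of 2% + 1.0 x 10% = 12%, a plain percentage figure.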
