What is Risk-Adjusted Return? (Plain English Definition)
Definition: Risk-adjusted return measures an investment's return relative to the amount of risk taken, showing whether higher returns adequately compensate for higher risk.
Risk-Adjusted Return Explained Simply
Risk-adjusted return answers the question: "Did this investment earn enough extra return to justify the extra risk?" A fund that returns 15% with very high volatility might actually be a worse investment on a risk-adjusted basis than a fund returning 10% with low volatility, because the second fund delivered most of the return with much less risk.
Several metrics measure risk-adjusted returns. The Sharpe ratio divides excess return (return above the risk-free rate) by standard deviation. The Sortino ratio is similar but only penalizes downside volatility. The Treynor ratio uses beta instead of standard deviation. Higher values for all of these indicate better risk-adjusted performance.
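The three metrics above can be sketched in a few lines of Python. This is a minimal illustration using hypothetical return data, not a production implementation: the function names, the sample returns, and the beta value are all assumed for the example, and returns are treated as simple periodic returns with no annualization.

```python
from statistics import mean, stdev

def sharpe_ratio(returns, risk_free):
    # Excess return above the risk-free rate, divided by total volatility.
    excess = [r - risk_free for r in returns]
    return mean(excess) / stdev(returns)

def sortino_ratio(returns, risk_free):
    # Like Sharpe, but the denominator only counts downside deviations:
    # periods where the excess return was negative.
    excess = [r - risk_free for r in returns]
    downside = [min(e, 0.0) for e in excess]
    downside_dev = (sum(d * d for d in downside) / len(downside)) ** 0.5
    return mean(excess) / downside_dev

def treynor_ratio(returns, risk_free, beta):
    # Uses beta (market sensitivity) as the risk measure instead of
    # standard deviation.
    return (mean(returns) - risk_free) / beta

# Hypothetical periodic returns and a 2% risk-free rate, for illustration.
rets = [0.10, 0.02, -0.04, 0.08, 0.04]
print(sharpe_ratio(rets, 0.02))        # return per unit of total volatility
print(sortino_ratio(rets, 0.02))       # return per unit of downside volatility
print(treynor_ratio(rets, 0.02, 1.2))  # return per unit of market risk
```

Note that the Sortino ratio comes out higher than the Sharpe ratio whenever losses are a small share of total volatility, which is exactly the point: it doesn't penalize upside swings.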
Risk-adjusted returns are particularly useful when comparing investments with very different risk profiles. Comparing a technology ETF's raw return to a bond ETF's return is not very informative because they take very different amounts of risk. But comparing their Sharpe ratios tells you which one delivered more return per unit of risk taken.
Risk-Adjusted Return Example
ETF A returns 12% with a standard deviation of 20%. ETF B returns 9% with a standard deviation of 8%. The risk-free rate is 4%. ETF A's Sharpe ratio is (12% - 4%) / 20% = 0.40. ETF B's Sharpe ratio is (9% - 4%) / 8% = 0.63. Despite ETF B's lower raw return, it delivered more return per unit of risk, making it the better risk-adjusted performer.
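The arithmetic in the example above can be checked directly. A quick sketch (the function name is illustrative; the figures come from the example):

```python
def sharpe(annual_return, risk_free, std_dev):
    # Sharpe ratio: excess return over the risk-free rate,
    # per unit of standard deviation.
    return (annual_return - risk_free) / std_dev

sharpe_a = sharpe(0.12, 0.04, 0.20)  # (12% - 4%) / 20% = 0.40
sharpe_b = sharpe(0.09, 0.04, 0.08)  # (9% - 4%) / 8% = 0.625, ~0.63 rounded

print(sharpe_a, sharpe_b)  # ETF B wins on a risk-adjusted basis
```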
Why Risk-Adjusted Return Matters for ETF Investors
Risk-adjusted returns help ETF investors look beyond raw performance numbers. A fund that consistently delivers high risk-adjusted returns demonstrates genuine skill or a sound investment strategy, while a fund with high raw returns but poor risk-adjusted returns is simply taking excessive risk. For ETF investors, comparing risk-adjusted returns across similar ETFs provides a more nuanced view of fund quality. Two ETFs tracking the same market segment might have similar raw returns but very different risk profiles. The one with better risk-adjusted returns is capturing market gains more efficiently, which means a smoother ride and fewer stomach-churning drawdowns.
Risk-Adjusted Return vs Sharpe Ratio
| Risk-Adjusted Return | Sharpe Ratio |
|---|---|
| A broad concept: any measure of an investment's return relative to the amount of risk taken, showing whether higher returns adequately compensate for higher risk. | A specific risk-adjusted metric: an investment's excess return above the risk-free rate divided by its standard deviation. |
While risk-adjusted return and the Sharpe ratio are closely related, they serve different purposes in ETF investing: risk-adjusted return is the general concept, and the Sharpe ratio is the most widely used way to measure it. Understanding both terms helps you make more informed decisions about which funds to include in your portfolio and how to evaluate their performance.
Related Terms
Deepen your understanding of ETF investing by exploring these related concepts:
Sharpe Ratio
The Sharpe ratio measures an investment's risk-adjusted return by dividing its excess return above the risk-free rate by its standard deviation.
Standard Deviation
Standard deviation measures how much an investment's returns vary from its average return, quantifying the volatility or risk of the investment.
Alpha
Alpha measures an investment's excess return compared to its benchmark index, indicating how much value a fund manager adds or subtracts.
Beta
Beta measures how much an investment's price tends to move relative to the overall market, indicating its volatility compared to a benchmark.
Volatility
Volatility measures how much and how quickly an investment's price changes, with higher volatility meaning larger and more frequent price swings.