In managing their foreign exchange exposure, international investors, including central banks, often compare actual portfolios with hypothetical portfolios constructed under certain assumptions about the statistical properties of interest rates and exchange rates. One such assumption is that the variability of returns on the various currency assets is time invariant. This paper tests that assumption using autoregressive conditional heteroskedasticity (ARCH) models. Using weekly data for the period February 1982 to December 1991 for the major reserve currencies, including the SDR, we find evidence that the variances of returns do vary over time (i.e., they are not stationary) and that ARCH models, which allow for changing variances, are superior to models that assume constant variance. When the variability of returns is incorrectly assumed to be constant, currency-asset allocations are not necessarily optimal, and measures of the riskiness of a fixed-income portfolio may be inaccurate. Furthermore, the error introduced by incorrectly assuming stationarity is smaller for the SDR than for any of the national currencies in the portfolio to be managed.
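
The kind of test the abstract refers to can be illustrated with Engle's Lagrange-multiplier (LM) test for ARCH effects, which regresses squared returns on their own lags and rejects constant variance when the statistic exceeds a chi-square critical value. The sketch below is a minimal, self-contained illustration on simulated data; the function names and parameter values are hypothetical and are not taken from the paper.

```python
import numpy as np

def simulate_arch1(T, omega=0.1, alpha=0.5, seed=0):
    """Simulate ARCH(1) returns: r_t ~ N(0, h_t), h_t = omega + alpha * r_{t-1}^2."""
    rng = np.random.default_rng(seed)
    r = np.zeros(T)
    h = np.zeros(T)
    h[0] = omega / (1.0 - alpha)          # start at the unconditional variance
    r[0] = rng.normal(0.0, np.sqrt(h[0]))
    for t in range(1, T):
        h[t] = omega + alpha * r[t - 1] ** 2
        r[t] = rng.normal(0.0, np.sqrt(h[t]))
    return r

def engle_lm_test(r, lags=1):
    """Engle's LM test: regress r_t^2 on lagged r^2; LM = T * R^2 ~ chi2(lags) under H0."""
    u2 = r ** 2
    T = len(u2) - lags
    y = u2[lags:]
    # Design matrix: intercept plus `lags` lagged squared returns
    X = np.column_stack(
        [np.ones(T)] + [u2[lags - i - 1 : len(u2) - i - 1] for i in range(lags)]
    )
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r_squared = 1.0 - resid.var() / y.var()
    return T * r_squared

# On ARCH(1) data the statistic should far exceed the 5% chi2(1) critical value (3.84),
# rejecting the null of constant (time-invariant) variance.
returns = simulate_arch1(2000, omega=0.1, alpha=0.5, seed=1)
lm_stat = engle_lm_test(returns, lags=1)
```

In the paper's setting, `returns` would instead be weekly currency-asset returns for each reserve currency and the SDR, and a rejection of the null motivates fitting an ARCH specification rather than a constant-variance model.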