Consistent High-Precision Volatility from High Frequency Data
The paper, “Consistent High-Precision Volatility from High Frequency Data”, discusses the trade-off one faces when choosing a sampling frequency for volatility estimation: increasing the sampling frequency reduces the measurement error of the estimate but amplifies the effect of microstructure noise, while decreasing the sampling frequency does the reverse.
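To make that trade-off concrete, here is a standard textbook-style decomposition under an i.i.d. noise model (an illustration in the spirit of the paper's setup, not a result quoted from it). The observed log-price is the efficient price plus noise, and the expected realized variance picks up a bias term that grows with the number of returns n, even though the sampling error on the integrated-variance term shrinks as n grows:

```latex
p_{t_i} = p^{*}_{t_i} + \epsilon_i, \qquad
\epsilon_i \ \text{i.i.d.},\ \mathbb{E}[\epsilon_i] = 0,\ \operatorname{Var}(\epsilon_i) = \sigma_\epsilon^2

\mathrm{RV}_n = \sum_{i=1}^{n} \left( p_{t_i} - p_{t_{i-1}} \right)^2, \qquad
\mathbb{E}[\mathrm{RV}_n] \approx \int_0^T \sigma_t^2 \, dt \;+\; 2\, n\, \sigma_\epsilon^2
```

The first term is what we actually want (integrated variance); the second term is the noise contribution that blows up as we sample more finely.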
Researchers in the past have suggested sampling at fixed intervals of 10 minutes, 20 minutes, or some other ad-hoc choice, based on visual diagnostics with fancy names such as volatility signature plots. The authors argue that there is a flaw in relying on such tools, which operate on homogenized time series: the sampling frequencies they point to are so low that most of the high-frequency data (HFD) ends up being discarded (see the sketch below).
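As a rough illustration of what a volatility signature plot is, here is a minimal Python sketch (my own, not code from the paper): realized volatility is recomputed at several sampling intervals from a tick series, and the noise-driven inflation at short intervals is what pushes practitioners toward long intervals that throw away most of the ticks. The simulated data and parameter values are purely hypothetical.

```python
import numpy as np
import pandas as pd

def signature_plot(prices: pd.Series, intervals=("1min", "5min", "10min", "20min", "30min")):
    """Realized volatility at several sampling intervals; prices are tick-level log prices with a DatetimeIndex."""
    rows = []
    for freq in intervals:
        sampled = prices.resample(freq).last().dropna()     # homogenize ticks onto a regular grid
        rv = (sampled.diff().dropna() ** 2).sum()            # realized variance on that grid
        rows.append({"interval": freq, "realized_vol": np.sqrt(rv)})
    return pd.DataFrame(rows)

# Simulated ticks: Brownian efficient price plus i.i.d. microstructure noise
idx = pd.date_range("2024-01-02", periods=50_000, freq="s")
rng = np.random.default_rng(0)
ticks = pd.Series(np.cumsum(rng.normal(0, 1e-4, len(idx))) + rng.normal(0, 5e-4, len(idx)), index=idx)
print(signature_plot(ticks))
```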
To address the problem differently, the authors propose handling the bias arising from microstructure noise directly. For the FX markets, they model the observed price process as a Brownian motion plus a random i.i.d. noise process. They recommend a procedure whereby an EWMA operator is used to reduce the microstructure noise, producing a filtered time series that can then be used for volatility estimation. The EWMA operator is sufficient for removing noise from FX data, whereas a more involved method is needed for equity data.
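Below is a minimal sketch of the filtering idea, assuming a simple EWMA applied in tick time; the paper's actual operator definitions and parameter choices may differ, so the half-life below is illustrative only, and the simulated data is hypothetical.

```python
import numpy as np
import pandas as pd

def ewma_filter(ticks: pd.Series, halflife_ticks: float = 20.0) -> pd.Series:
    """Smooth tick-by-tick prices with an EWMA to damp i.i.d. microstructure noise."""
    return ticks.ewm(halflife=halflife_ticks).mean()

def realized_vol(series: pd.Series, freq: str = "1min") -> float:
    """Realized volatility from a price series sampled on a regular time grid."""
    sampled = series.resample(freq).last().dropna()
    return float(np.sqrt((sampled.diff().dropna() ** 2).sum()))

# Usage on simulated FX-like ticks (Brownian motion plus i.i.d. noise)
idx = pd.date_range("2024-01-02", periods=20_000, freq="s")
rng = np.random.default_rng(1)
ticks = pd.Series(np.cumsum(rng.normal(0, 1e-4, len(idx))) + rng.normal(0, 5e-4, len(idx)), index=idx)
print(realized_vol(ewma_filter(ticks)), realized_vol(ticks))  # filtered vs raw estimate
```

The filtered series keeps essentially all of the tick data while damping the noise component, which is the point of the approach: the volatility estimate no longer has to buy noise reduction by sampling coarsely.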
The basic idea of the paper is to provide a method for filtering away the noise at the high-frequency scale, so that the resulting series can be used to obtain a high-precision volatility estimate.