Abstract: Co-authored with Tse-Chun Chen and Daisuke Hotta. The National Weather Service computes operational weather forecasts using a process called "data assimilation": a 6-hour forecast is computed starting from the current "analysis". The 6-hour forecast is then optimally combined with the observations collected 6 hours later to create the new analysis, which serves as the initial condition for the next forecast. This process, known as the "analysis cycle", is repeated every 6 hours. Miyakoda (personal communication, ~1980) pointed out that using any future information to improve current forecasts should be considered "cheating" because it cannot be done in operational forecasting. Chen (2018, PhD thesis), Chen and Kalnay (2019a, MWR), and Chen and Kalnay (2019b, under review) developed an application of Ensemble Forecast Sensitivity to Observations (EFSO; Kalnay et al., 2012, Tellus) combined with Proactive Quality Control (PQC; Hotta et al., 2017). It uses future data (e.g., observations obtained 6 hours after the present analysis) to identify and delete detrimental observations in the present analysis. We found that making a late correction of every analysis after the new observations have been received accumulates improvements over time. The accumulated improvement is much larger than the final correction, which cannot be used in order to avoid cheating, so that forecasts are significantly improved "without cheating".
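The forecast/analysis loop described above can be illustrated with a toy scalar example. This is only a minimal sketch of the idea of cycling (a damped-persistence "model" and a scalar Kalman update standing in for the operational forecast model and assimilation system); all function names and numerical settings here are illustrative assumptions, not the NWS configuration, and it does not include the EFSO/PQC step.

```python
import numpy as np

rng = np.random.default_rng(0)

def forecast(x, a=0.95):
    """Trivial 'model': damped persistence standing in for a 6-hour forecast."""
    return a * x

def analyze(xb, yo, sigma_b2, sigma_o2):
    """Optimally combine background xb and observation yo (scalar Kalman update)."""
    k = sigma_b2 / (sigma_b2 + sigma_o2)   # gain: weight given to the observation
    xa = xb + k * (yo - xb)                # analysis = background + weighted innovation
    sigma_a2 = (1.0 - k) * sigma_b2        # analysis error variance (reduced)
    return xa, sigma_a2

truth = 1.0
x = 0.0              # initial analysis
sigma_b2 = 1.0       # background (forecast) error variance
sigma_o2 = 0.5       # observation error variance

for cycle in range(10):
    xb = forecast(x)                                   # forecast from current analysis
    truth = forecast(truth)
    yo = truth + rng.normal(0.0, np.sqrt(sigma_o2))    # observation arriving 6 h later
    x, sigma_a2 = analyze(xb, yo, sigma_b2, sigma_o2)  # new analysis
    sigma_b2 = sigma_a2 + 0.1                          # model error re-inflates spread

print(abs(x - truth))  # analysis error after repeated cycling
```

Note how each cycle's analysis error variance is smaller than both the background and observation variances, which is why repeating the cycle (and, in the work above, repeatedly applying late EFSO/PQC corrections) accumulates improvement over time.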
Abstract: References for this subject are Lowell Abrams, "Two-dimensional topological quantum field theories and Frobenius algebras", J. Knot Theory Ramifications 5 (1996), no. 5, 569-587; and Stephen Sawin, "Direct sum decompositions and indecomposable TQFTs", J. Math. Phys. 36 (1995), no. 12, 6673-6680.