About

Regularly rerunning forecasts with the newest data is a core aspect of iterative near-term forecasting. This sets a higher bar for repeatability and presents informatic and computational challenges beyond those of most ecological analyses, particularly for forecasts running close to real time. The goal of the cyberinfrastructure working group is to make it easier to implement, archive, and share automated iterative forecasts, so that any research group that can develop a forecasting model can deploy it as an automated system. Efforts of this working group include, but are not limited to, developing standards and databases for transparent, open, and interoperable archiving and sharing of both forecasts and forecast workflows, and developing shared community tools for data ingest/interoperability and for forecast workflow automation and continuous integration. We will make the components of our infrastructure available through open-source software and open educational resources for using existing tools.

The integration of data and models is at the core of ecological forecasting. There is much that can be learned about forecasting methods from other disciplines, from weather forecasters through to economic forecasters. But ecologists also face challenges that fall outside the mainstream of either of these extremes, such as a high degree of process heterogeneity across many scales and an abundance of semi-mechanistic models, where physical and chemical constraints play an important role but many functional relationships are empirically derived. This working group will advance the statistical methods and tools for forecasting and data assimilation by developing statistical approaches and best practices for data assimilation, translating these into software tools usable by ecologists, and developing uncertainty estimates for commonly used data.
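The "rerun with the newest data" pattern described above can be sketched in a few lines. This is an illustrative toy, not working-group code: the model, the data stream, and the function names are all hypothetical, and a real deployment would run each cycle on a scheduler, archive every forecast, and propagate uncertainty.

```python
def fit(observations):
    """Toy 'model': forecast the running mean of all observations so far."""
    return sum(observations) / len(observations)

def forecast_cycle(stream, n_cycles=5):
    """Each cycle, ingest the newest observation, refit, and reissue the forecast."""
    observed, forecasts = [], []
    for t in range(n_cycles):
        observed.append(stream(t))       # data ingest: newest observation arrives
        forecasts.append(fit(observed))  # refit with all data to date; issue forecast
    return forecasts

# With a deterministic stream 0, 1, 2, ... the forecast updates every cycle:
print(forecast_cycle(lambda t: float(t)))  # → [0.0, 0.5, 1.0, 1.5, 2.0]
```

In an automated system the loop body would be triggered by a cron job or continuous-integration pipeline rather than a `for` loop, which is exactly the automation layer the working group aims to make reusable.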


Team