
Would using smaller data intervals to increase the information in the calibration phase (i.e., weeks vs. months) improve the applicability of the model to other years?

A26. The longer the averaging period, the better the model appears to perform. This is an artifact of averaging out the highs and lows that are so often inaccurately simulated. But "applicability" depends on your objectives. If you need to make weekly decisions that depend on, say, weekly meteorology, a monthly model obviously won't cut it. If your question is whether having a larger sample of calibration points covering the same range of conditions improves the model, I'd say yes. It may not actually improve the goodness of the calibration itself, but it will provide more information on the relative goodness of the model. That is, the goodness-of-fit statistics will be more revealing.
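The averaging artifact described above is easy to demonstrate on synthetic data. The sketch below (hypothetical data, not from the original discussion) builds a seasonal "observed" series, adds daily model error, and computes the Nash–Sutcliffe efficiency (NSE) at daily, weekly, and monthly aggregation. Because block averaging cancels much of the day-to-day error while preserving the seasonal signal, the fit statistic improves as the averaging period lengthens, even though the underlying model error is unchanged.

```python
import math
import random
import statistics

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of observations.
    1.0 is a perfect fit; values fall as simulation error grows."""
    mean_obs = statistics.fmean(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def aggregate(series, window):
    """Non-overlapping block means (window=7 -> weekly, window=30 -> monthly)."""
    return [statistics.fmean(series[i:i + window])
            for i in range(0, len(series) - window + 1, window)]

random.seed(42)
days = 365
# Hypothetical seasonal signal plus independent daily model error.
obs = [10 + 5 * math.sin(2 * math.pi * t / 365) for t in range(days)]
sim = [o + random.gauss(0, 3) for o in obs]

for label, w in [("daily", 1), ("weekly", 7), ("monthly", 30)]:
    print(label, round(nse(aggregate(obs, w), aggregate(sim, w)), 3))
```

Running this shows the monthly NSE well above the daily NSE purely because of averaging, which is why a longer-interval calibration can look better without the model actually being better.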



Experts123