{"id":375,"date":"2017-06-09T18:12:35","date_gmt":"2017-06-09T22:12:35","guid":{"rendered":"http:\/\/hydrouncertainty.org\/?p=375"},"modified":"2019-01-15T12:27:39","modified_gmt":"2019-01-15T17:27:39","slug":"uncertainty-about-uncertainty","status":"publish","type":"post","link":"http:\/\/hydrouncertainty.org\/2017\/06\/09\/uncertainty-about-uncertainty\/","title":{"rendered":"Uncertainty about Uncertainty"},"content":{"rendered":"\n
I like to point to Keith Beven's (1987) conference paper, titled 'Towards a new paradigm in hydrology', as a place for new hydrologists to start developing an understanding of uncertainty in the hydrological sciences[1]. In that paper Keith argued that there had been "little to no success" against the fundamental problem of developing theories about how small-scale complexities lead to large-scale behavior in hydrological systems[2].

The paper discussed what hydrologists might do about this situation, and in the last two paragraphs Keith made two predictions: that the heterogeneity problem would remain unsolved, and that we would not develop a rigorous theoretical basis for understanding uncertainty.

Of course, both of his predictions were essentially correct. In 2007 Jeff McDonnell wrote "to make continued progress in watershed hydrology … we need to … explore the set of organizing principles that might underlie heterogeneity and complexity" (McDonnell et al., 2007). Jeff went on to describe several possibilities for 'moving beyond' heterogeneity and process complexity, but the point is nevertheless clear: the problem remains unsolved. Similarly, Keith wrote last year that "our perceptual model of uncertainty is now much more sophisticated … but this has not resulted in analogous progress in uncertainty quantification" (Beven, 2016).

The situation in hydrology right now is that we understand that macro-scale behaviors of watersheds are governed by small-scale heterogeneities, but we have no theory about how this works, and we have no fundamental theory that allows us to (reliably) quantify predictive uncertainties related to these processes.

Instead, what we have are ad hoc strategies for obtaining numbers that seem like they might be related to uncertainty.
One example of this is the recent proliferation of multi-parameterization modeling systems that allow the user to choose among a variety of options for different flux parameterizations. An example is the Structure for Unifying Multiple Modeling Alternatives; Clark et al. (2015) wrote that "[SUMMA] provides capabilities to evaluate different representations of spatial heterogeneity and different flux parameterizations, and therefore tackle the fundamental modeling challenge of simulating the fluxes of water and energy over a hierarchy of spatial scales." It is unclear why predicting with a variety of different parameterizations of scale-dependent processes allows us to 'tackle' scale-related challenges: we still lack a fundamental theory of hydrologic scaling, and making predictions with several different parameterizations does not reflect the actual nature of our lack of knowledge about the principles and behavior of hydrologic systems.

Other methods that we often use for uncertainty quantification suffer from similar problems. Bayesian methods allow us to do precisely one thing: inter-compare or average several competing models. Bayesian methods, and indeed any methods for model inter-comparison or model averaging, are fundamentally incapable of helping us understand the difference between our family of models (i.e., those models that are assigned finite probability by the prior) and the real system.
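The family-relative nature of Bayesian weighting can be made concrete with a minimal sketch. Everything below is hypothetical (three made-up Gaussian "models" and synthetic data, not from any cited study); the point is only that posterior model weights always sum to one over the prior family, no matter how poor every member of the family is.

```python
import numpy as np
from scipy.stats import norm

# Synthetic "observations" from a system that none of the candidate models match.
rng = np.random.default_rng(0)
obs = rng.normal(loc=2.0, scale=1.0, size=50)

# A hypothetical prior family of three mis-specified models (assumed forms).
models = {
    "model_a": norm(loc=0.5, scale=1.0),
    "model_b": norm(loc=1.0, scale=1.5),
    "model_c": norm(loc=1.5, scale=1.0),
}

# Bayesian model averaging with a uniform prior: weights from log evidence.
log_evidence = {name: m.logpdf(obs).sum() for name, m in models.items()}
max_log = max(log_evidence.values())
unnorm = {name: np.exp(le - max_log) for name, le in log_evidence.items()}
total = sum(unnorm.values())
weights = {name: w / total for name, w in unnorm.items()}

# The weights redistribute belief *within* the family; they carry no signal
# that the true data-generating system lies outside it.
print(weights)
```

The averaging machinery here is standard, but notice that the output would look exactly the same whether or not the real system resembled any candidate: the weights are a statement about the family, not about the family's relationship to reality.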
Gelman & Shalizi (2013) give a philosophical treatment of this problem that is worth reading.

It has been suggested that we might use empirical methods to develop probability distributions over different components of predictive imprecision (e.g., Montanari and Koutsoyiannis, 2012), but this type of approach assumes stationarity not only in those aspects of the hydrological system that are captured by the model, but also in the relationship between model error and those parts of the hydrological system that are not captured by the model.

The point is that uncertainty is tautologically inestimable, and so it is really no surprise that Keith's second prediction came true: there was never any real possibility of developing a rigorous theoretical basis for understanding uncertainty, scale-dependent or otherwise. More than that, the methods that we have come up with to approximate uncertainty do not actually do that at all, at least not in any way that is fundamentally or theoretically reliable. I propose the following challenge: provide a theorem that proves a bounded, asymptotic, or even consistent relationship between any quantitative estimator and real-world uncertainty under evaluable assumptions. I offer the standard wager for scientific controversy[3]: a bottle of Yamazaki 12 year, or comparable.

Until we have such a theorem, I propose that it is not useful to talk about uncertainty quantification, approximate or otherwise, because none of our estimators are related to real uncertainty in any systematic way.

Instead, I predict a new paradigm change in hydrology. I suspect that within the next 30 years, the conversation in hydrology will be about information rather than uncertainty.
The reason for this is that while it is impossible to estimate uncertainty (even approximately), it is possible to obtain at least bounded estimates of information measures (Nearing and Gupta, 2017). The main project of science seems to be about comparing the information contained in observation data with the information provided by a hypothesis-driven model. Similarly, the problem of scaling under heterogeneity seems to be fundamentally about cross-scale information transfers, rather than about uncertainty. Given recent work in basic physics (e.g., Cao et al., 2017), I suspect that it will not take three more decades for us to discover real scaling theories under this type of perspective.

Of course, we will always want to know the reliability of our model predictions, and for this reason the concept of uncertainty will never go away completely. I propose, however, that the tractable and meaningful challenge is to understand the actual predictive precision implied by our hydrological theory and hypotheses. At present, we do not do this. Current practice is to build models that are over-precise, and then append 'uncertainty' distributions to their predictions. This method of dealing with a lack of complete knowledge stems fundamentally from our Newtonian heritage. Essentially all process-based hydrological models are expressed as PDEs, which must admit solution in order to make a prediction.

There are two problems with building dynamical models as PDEs. First, such models make ontological predictions (predictions about what will happen), whereas what we actually want are epistemological predictions (predictions about what we can know about what will happen). The uncertainty probabilities that we append to our models are the latter, and they are what we actually need for both hypothesis testing and decision support. But these probabilities are not the product of actually solving our model equations.
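The construction being criticized can be sketched concretely. Below is a toy (a hypothetical linear-reservoir water balance; every parameter value and the noise amplitude are assumptions made for illustration): the solved equation yields a single ontological trajectory, and any spread in the prediction comes only from a noise term bolted onto the drift.

```python
import numpy as np

rng = np.random.default_rng(42)

k = 0.1        # recession coefficient [1/day] (assumed value)
sigma = 0.05   # noise amplitude (ad hoc -- not derived from any theory)
dt = 1.0       # time step [day]
n_steps = 100
precip = rng.gamma(shape=0.5, scale=2.0, size=n_steps)  # synthetic forcing

# Euler-Maruyama integration of dS = (P - k*S) dt + sigma dW.
storage = np.empty(n_steps + 1)
storage[0] = 10.0
for t in range(n_steps):
    drift = precip[t] - k * storage[t]            # the "physics"
    noise = sigma * np.sqrt(dt) * rng.normal()    # the appendage
    storage[t + 1] = max(storage[t] + drift * dt + noise, 0.0)

print(storage[-1])
```

Nothing in `sigma` quantifies our ignorance about `k`, about the assumed linear form of the recession, or about errors in the forcing; it is simply appended to make the deterministic solution look distributional.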
Even if our model is a stochastic PDE, the random walk component is simply an ad hoc appendage to the drift function. Sampling model inputs or different model structures does not actually tell us anything about our lack of knowledge associated with any of those model structures. Models built as PDEs simply do not solve for anything that represents what we can know from our physical theory and hypotheses.

The second problem is that a PDE only provides a prediction if it can be solved. This requires that we prescribe values (or distributions) over all parameters contained in our hypothetical parameterizations, some of which are impossible to measure and otherwise difficult to estimate. It would be exciting to have a method for constructing models that allows us to assign values only to those parameters that we feel we actually have some information about.

But there is, in principle (although I have no example of such), a way to build this type of model. Instead of expressing conservation principles using differential equations, we could express them as symmetry constraints on probability distributions. To do this, we might specify a Bayesian network such that each node is a random variable representing a particular scale-dependent quantity at a particular time and location within the modeled system; conservation laws could then be used to effectively rule out large portions of the joint space of values over these random variables. By imposing conservation laws and other physical principles as constraints on joint probability distributions, our models would fundamentally solve for what we can know about the future or unobserved behavior of a dynamical system, conditional on whatever information (theories, hypotheses, data) is used to build the model.
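The idea can be sketched very roughly. The following is a toy, not a working proposal (all priors, bounds, and tolerances below are assumptions): treat one time step's water-balance terms as random variables, and impose mass conservation as a constraint on their joint distribution by rejection sampling, rather than by solving any flux equation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Broad priors over the water-balance terms (bounds are assumed for illustration).
precip = rng.uniform(0.0, 10.0, n)     # precipitation [mm]
et = rng.uniform(0.0, 5.0, n)          # evapotranspiration [mm]
runoff = rng.uniform(0.0, 10.0, n)     # streamflow [mm]
dstorage = rng.uniform(-5.0, 5.0, n)   # storage change [mm]

# Conservation of mass, P = ET + Q + dS, imposed as a constraint on the joint
# distribution (the tolerance stands in for closure/measurement error). This
# rules out most of the joint space of values.
tol = 0.1
consistent = np.abs(precip - (et + runoff + dstorage)) < tol

# Condition further on whatever we actually observed, e.g. precipitation
# near 6 mm (a hypothetical observation).
observed = np.abs(precip - 6.0) < 0.5
posterior_runoff = runoff[consistent & observed]

print(posterior_runoff.mean(), posterior_runoff.size)
```

Everything that survives rejection is consistent with both the conservation law and the observation, so the remaining sample over runoff expresses what those pieces of information let us know, and nothing more.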
In principle, anything that we do know, or wish to hypothesize, could be imposed as a constraint on the joint distribution over a family of random variables representing different aspects of system behavior.

Although such a strategy would not allow us to measure epistemic uncertainty (uncertainty is always and still inestimable), at least it would allow us to know what information we actually have about the behavior of hydrologic systems. This would be a very different way of approaching model building than appending uncertainty distributions to PDE solutions, and it would allow us to actually quantify the information content of our scientific hypotheses and models.

So perhaps my predictions about paradigm change will not come to pass. I am, after all, essentially arguing against two of the most fundamental practices in our science, by claiming that we should not use Bayesian methods to evaluate models and that we should not use differential equations to build models. I do suspect that I am right about both of these things, in the sense that our science (indeed, any science of complex systems) would accelerate by abandoning these ideas in favor of information-centric philosophies and methods, but perhaps it will take longer than 30 years to demonstrate that such a substantial change is necessary.

[1] It's the kind of paper that can be enjoyed with a beer.
[2] Dooge (1986) gave a somewhat more technical discussion of this same problem.
[3] e.g., https://www.quantamagazine.org/supersymmetry-bet-settled-with-cognac-20160822