20 results for Set covering theory
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Volcanic forcing for climate modeling: a new microphysics-based data set covering years 1600–present
Abstract:
As the understanding and representation of the impacts of volcanic eruptions on climate have improved in recent decades, uncertainties in the stratospheric aerosol forcing from large eruptions are now linked not only to visible optical depth estimates on a global scale but also to details of the size, latitude and altitude distributions of the stratospheric aerosols. Based on our understanding of these uncertainties, we propose a new model-based approach to generating a volcanic forcing for general circulation model (GCM) and chemistry–climate model (CCM) simulations. This new volcanic forcing, covering the period 1600–present, uses an aerosol microphysical model to provide a realistic, physically consistent treatment of the stratospheric sulfate aerosols. Twenty-six eruptions were modeled individually using the latest available ice-core aerosol mass estimates and historical data on the latitude and date of each eruption. The evolution of the aerosol spatial and size distributions after the sulfur dioxide discharge is thus characterized for each volcanic eruption. Large variations are seen in hemispheric partitioning and size distributions in relation to the location/date of eruptions and the injected SO2 masses. Results for recent eruptions show reasonable agreement with observations. By providing these new estimates of the spatial distributions of shortwave and longwave radiative perturbations, this volcanic forcing may help to better constrain climate model responses to volcanic eruptions over the 1600–present period. The final data set consists of 3-D values (constant in longitude) of spectrally resolved extinction coefficients, single scattering albedos and asymmetry factors, calculated for different wavelength bands upon request. Surface area densities for heterogeneous chemistry are also provided.
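As an orientation to the structure such a forcing product implies, the sketch below builds a toy zonal-mean (constant-longitude) data set on (time, latitude, altitude, band) axes and reduces it to a global-mean optical depth per band. Every name, dimension size, and value here is an invented placeholder, not the published data set or its format.

```python
# Illustrative sketch of the data layout implied by the abstract: zonal-mean
# fields on (time, latitude, altitude, band) axes. All names and values are
# hypothetical placeholders, not the actual product.
import numpy as np
import xarray as xr

time = np.arange(1600, 1605)                 # years (toy subset)
lat = np.linspace(-85.0, 85.0, 18)           # degrees north
alt = np.linspace(5.0, 35.0, 31)             # km, 1 km spacing
band = ["sw_band1", "sw_band2", "lw_band1"]  # hypothetical wavelength bands

rng = np.random.default_rng(0)
shape = (len(time), len(lat), len(alt), len(band))
ds = xr.Dataset(
    {
        # extinction coefficient (per km), single scattering albedo, asymmetry factor
        "ext": (("time", "lat", "alt", "band"), rng.uniform(0, 1e-3, shape)),
        "ssa": (("time", "lat", "alt", "band"), rng.uniform(0.9, 1.0, shape)),
        "asy": (("time", "lat", "alt", "band"), rng.uniform(0.6, 0.8, shape)),
    },
    coords={"time": time, "lat": lat, "alt": alt, "band": band},
)

# Column optical depth per band: integrate extinction over altitude, then
# area-weight by cos(latitude) for a global mean.
tau = ds["ext"].integrate("alt")
weights = np.cos(np.deg2rad(ds["lat"]))
tau_global = tau.weighted(weights).mean("lat")
print(tau_global.sel(band="sw_band1").values)
```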
Abstract:
Meteorological or climatological extremes are rare, and hence studying them requires long meteorological data sets. Moreover, addressing the underlying atmospheric processes requires detailed three-dimensional data. Until recently the two requirements were incompatible: long meteorological series were available only for a few locations, whereas detailed three-dimensional data sets such as reanalyses were limited to the past few decades. In 2011, the “Twentieth Century Reanalysis” (20CR) was released, a 6-hourly global atmospheric data set covering the past 140 years, thus combining the two properties. The collection of short papers in this volume contains case studies of individual extreme events in the 20CR data set. In this overview paper we introduce the first six cases and summarise some common findings. All of the events are represented in 20CR in a physically consistent way, allowing further meteorological interpretation and process studies. However, for most of the events the magnitudes are underestimated in the ensemble mean; possible causes are addressed. For interpreting extremes it may be necessary to examine individual ensemble members, and the density of observations underlying 20CR should be considered. Finally, we point to problems in wind speeds over the Arctic and the northern North Pacific in 20CR prior to the 1950s.
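The point about ensemble means damping extremes can be illustrated numerically. The toy sketch below (synthetic values, not 20CR output) averages hypothetical ensemble members whose extreme event is slightly misaligned in time, which is enough to weaken the peak in the mean:

```python
# Minimal sketch of why an ensemble mean damps extremes: average synthetic
# "members" of a storm's peak pressure anomaly. Purely illustrative numbers.
import numpy as np

rng = np.random.default_rng(1)
n_members = 56  # 20CR provides 56 ensemble members
t = np.linspace(0, 10, 201)

# Each member places the extreme with slightly different timing, modeled here
# as a Gaussian pulse with a jittered center.
members = np.array([
    -30.0 * np.exp(-((t - rng.normal(5.0, 0.8)) ** 2) / 2.0)
    for _ in range(n_members)
])  # hPa anomaly

ens_mean = members.mean(axis=0)
print("mean of member minima:    %.1f hPa" % members.min(axis=1).mean())
print("minimum of ensemble mean: %.1f hPa" % ens_mean.min())
# The ensemble-mean minimum is weaker: averaging misaligned members smooths
# the peak, which is why individual members matter when interpreting extremes.
```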
Abstract:
In recent years, the econometrics literature has shown a growing interest in the study of partially identified models, in which the object of economic and statistical interest is a set rather than a point. The characterization of this set and the development of consistent estimators and inference procedures for it with desirable properties are the main goals of partial identification analysis. This review introduces the fundamental tools of the theory of random sets, which brings together elements of topology, convex geometry, and probability theory to develop a coherent mathematical framework to analyze random elements whose realizations are sets. It then elucidates how these tools have been fruitfully applied in econometrics to reach the goals of partial identification analysis.
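A standard worked example from this literature shows the mechanism in miniature; the rendering below is ours, not quoted from the review. With interval-valued outcomes, the identified set for the mean is the Aumann expectation of a random interval:

```latex
% Suppose the outcome Y is only known to lie in an observed random interval:
\[
  Y \in \mathbf{Y} = [Y_L,\, Y_U] \quad \text{almost surely.}
\]
% The identified set for E[Y] is the Aumann (selection) expectation of the
% random set, which here reduces to an interval of ordinary expectations:
\[
  \mathbb{E}[\mathbf{Y}]
  = \{\, \mathbb{E}[\tilde Y] : \tilde Y \text{ a measurable selection of } \mathbf{Y} \,\}
  = \bigl[\, \mathbb{E}[Y_L],\ \mathbb{E}[Y_U] \,\bigr].
\]
% Consistent estimation then amounts to estimating the two endpoints; the
% convexity of the Aumann expectation is what makes the set tractable.
```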
Relative predicativity and dependent recursion in second-order set theory and higher-order theories
Abstract:
This article reports that some of the robustness of the notions of predicativity and autonomous progression breaks down if, as the given infinite totality, we choose mathematical entities other than the traditional ω. Namely, the equivalence between the normal transfinite recursion scheme and the new dependent transfinite recursion scheme, which does hold in the context of subsystems of second-order number theory, does not hold in the context of subsystems of second-order set theory where the universe V of sets is treated as the given totality (nor in the contexts of (n+3)-th order number or set theories, where the class of all (n+2)-th order objects is treated as the given totality).
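For readers unfamiliar with the schemes, the following is a schematic rendering in standard second-order notation; the article's exact formulation of the dependent scheme may differ in detail:

```latex
% Normal transfinite recursion along a well-ordering $\prec$ asserts, for
% each formula $\theta$:
\[
  \mathrm{WO}(\prec) \;\rightarrow\; \exists H\; \forall b \in \mathrm{field}(\prec)\;
  \bigl( (H)_b = \{ x : \theta(x,\, H^{\prec b}) \} \bigr),
\]
% where $H^{\prec b}$ codes the stages $(H)_a$ for $a \prec b$. In the
% dependent variant, the stage-$b$ condition may itself be chosen as a
% function of the previously built segment $H^{\prec b}$ rather than being
% fixed in advance. Over second-order number theory (totality $\omega$) the
% two schemes are equivalent; over second-order set theory (totality $V$),
% the article shows the equivalence fails.
```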
Abstract:
We discuss several ontological properties of explicit mathematics and operational set theory: global choice, decidable classes, totality and extensionality of operations, function spaces, class and set formation via formulas that contain the definedness predicate and applications.
Abstract:
We introduce a version of operational set theory, OST−, without a choice operation, which has a machinery for Δ0 separation based on truth functions and the separation operator, and a new kind of applicative set theory, so-called weak explicit set theory WEST, based on Gödel operations. We show that both theories and Kripke–Platek set theory KP with infinity are pairwise Π1 equivalent. We also show analogous assertions for subtheories with ∈-induction restricted in various ways, and for supertheories extended by powerset, beta, limit and Mahlo operations. Whereas the upper bound is given by a refinement of inductive definition in KP, the lower bound is obtained by combining, in a specific way, realisability, (intuitionistic) forcing and negative interpretations. Thus, despite the interpretability between classical theories, we make “a detour via intuitionistic theories”. The combined interpretation, seen as a model construction in the sense of Visser's miniature model theory, is a new way of constructing models for classical theories, and could be said to be the third kind of model construction ever used that is non-trivial at the level of logical connectives, after generic extension à la Cohen and Krivine's classical realisability model.
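For orientation, the Gödel operations underlying WEST are, in one common presentation (numbering varies by author), the following; this list is supplied here for context, not quoted from the paper:

```latex
% The ten G\"odel operations in the presentation familiar from Jech's
% \emph{Set Theory}:
\begin{align*}
  G_1(X,Y) &= \{X, Y\}      & G_6(X)    &= \textstyle\bigcup X \\
  G_2(X,Y) &= X \times Y    & G_7(X)    &= \operatorname{dom}(X) \\
  G_3(X,Y) &= \{(u,v) : u \in X,\ v \in Y,\ u \in v\}
                            & G_8(X)    &= \{(u,v) : (v,u) \in X\} \\
  G_4(X,Y) &= X \setminus Y & G_9(X)    &= \{(u,v,w) : (u,w,v) \in X\} \\
  G_5(X,Y) &= X \cap Y      & G_{10}(X) &= \{(u,v,w) : (v,w,u) \in X\}
\end{align*}
% Closure under these operations captures the $\Delta_0$-definable functions,
% which is what makes them a natural basis for an applicative set theory
% with $\Delta_0$ separation.
```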
Abstract:
This book will serve as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents the application of graph theory to low-level processing of digital images, such as a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids, and a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph-based methodologies for pattern recognition and computer vision, a presentation of a series of computationally efficient algorithms for testing graph isomorphism and related graph matching tasks in pattern recognition, and a new graph distance measure to be used for solving graph matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks. It includes a critical review of the main graph-based and structural methods for fingerprint classification, a new method to visualize time series of graphs, and potential applications in computer network monitoring and abnormal event detection.
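As a concrete taste of the graph-matching theme, the sketch below computes a generic graph edit distance with networkx; this is a stand-in illustration, not the book's specific distance measure:

```python
# A minimal sketch of a graph distance for matching tasks, using the generic
# graph edit distance from networkx.
import networkx as nx

# Two small "patterns": a triangle and a path on three vertices.
g1 = nx.Graph([(0, 1), (1, 2), (2, 0)])
g2 = nx.Graph([(0, 1), (1, 2)])

# Graph edit distance: minimum number of node/edge insertions, deletions and
# substitutions turning g1 into g2 (exponential in general, fine for tiny graphs).
d = nx.graph_edit_distance(g1, g2)
print(d)  # 1.0 here: delete the edge (2, 0)
```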
Abstract:
In this note, we show that an extension of a test for perfect ranking in a balanced ranked set sample given by Li and Balakrishnan (2008) to the multi-cycle case turns out to be equivalent to the test statistic proposed by Frey et al. (2007). This provides an alternative interpretation and motivation for their test statistic.
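To make the setting concrete, the sketch below simulates a balanced ranked set sample and counts within-cycle ordering violations as a naive signal of imperfect ranking. The statistic is a hypothetical stand-in for exposition, not the Li–Balakrishnan or Frey et al. statistic:

```python
# Illustrative sketch of balanced ranked set sampling and a naive check for
# perfect ranking. NOT the published test statistics discussed in the note.
import numpy as np

rng = np.random.default_rng(2)

def rss_sample(k, m, perfect=True):
    """Balanced ranked set sample: set size k, m cycles.

    For each (cycle, rank r) cell, draw a set of k units and keep the unit
    judged to have rank r: the true r-th smallest if ranking is perfect,
    a random unit if ranking is uninformative.
    """
    out = np.empty((m, k))
    for cycle in range(m):
        for r in range(k):
            units = rng.normal(size=k)
            out[cycle, r] = np.sort(units)[r] if perfect else rng.choice(units)
    return out

def discordance(sample):
    """Count within-cycle rank pairs r < s whose measurements are out of order."""
    m, k = sample.shape
    return sum(
        sample[c, r] > sample[c, s]
        for c in range(m) for r in range(k) for s in range(r + 1, k)
    )

x_perfect = rss_sample(k=4, m=50, perfect=True)
x_random = rss_sample(k=4, m=50, perfect=False)
print(discordance(x_perfect), discordance(x_random))
# Fewer out-of-order pairs under perfect ranking -- the kind of signal a
# formal perfect-ranking test statistic quantifies.
```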