129 results for Logic and Probabilistic Models


Relevance:

100.00%

Abstract:

This paper investigates the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. It is concluded that, given the particular scheme modelled and unavoidably subjective assumptions of input variance, simple and simplistic models may produce similar outputs to more robust and disaggregated models.
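
As an illustration of the procedure described above, the sketch below runs a crude Monte Carlo residual valuation and compares output variability for an aggregated versus a disaggregated cost structure; it is a minimal sketch with hypothetical input distributions, not the appraisal models examined in the paper.

```python
# Illustrative Monte Carlo residual-valuation sketch (hypothetical inputs,
# not the models examined in the paper).
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Gross development value drawn from an assumed distribution (GBP).
gdv = rng.normal(10_000_000, 800_000, N)

# Simple (aggregated) model: one lump-sum cost term.
total_cost_simple = rng.normal(7_000_000, 700_000, N)
residual_simple = gdv - total_cost_simple

# Disaggregated model: build cost, fees, finance and profit modelled separately.
build = rng.normal(5_000_000, 500_000, N)
fees = 0.10 * build                        # professional fees
finance = 0.06 * (build + fees) * 0.5      # crude finance charge
profit = 0.15 * gdv                        # developer's return
residual_disagg = gdv - (build + fees + finance + profit)

for name, res in [("simple", residual_simple), ("disaggregated", residual_disagg)]:
    print(f"{name:>14}: mean = {res.mean():,.0f}  std = {res.std():,.0f}")
```

Comparing the spread (standard deviation) of the two residual distributions is the kind of output-variance comparison the paper uses to judge whether added model complexity changes the appraisal outcome.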

Relevance:

100.00%

Abstract:

The Aqua-Planet Experiment (APE) was first proposed by Neale and Hoskins (2000a) as a benchmark for atmospheric general circulation models (AGCMs) on an idealised water-covered Earth. The experiment and its aims are summarised, and its context within a modelling hierarchy used to evaluate complex models and to provide a link between realistic simulation and conceptual models of atmospheric phenomena is discussed. The simplified aqua-planet configuration bridges a gap in the existing hierarchy. It is designed to expose differences between models and to focus attention on particular phenomena and their response to changes in the underlying distribution of sea surface temperature.
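
For context, aqua-planet experiments prescribe idealised, zonally symmetric SST distributions; the sketch below evaluates the "control" profile commonly attributed to Neale and Hoskins (27 °C at the equator, falling to 0 °C poleward of 60°). The coding details here are an assumption for illustration, not the APE specification itself.

```python
# Sketch of an idealised aqua-planet "control" SST profile (zonally symmetric,
# zero poleward of 60 degrees); treat as illustrative, not the APE definition.
import numpy as np

def control_sst(lat_deg):
    """SST in degrees C as a function of latitude."""
    phi = np.deg2rad(lat_deg)
    sst = 27.0 * (1.0 - np.sin(1.5 * phi) ** 2)
    return np.where(np.abs(lat_deg) < 60.0, sst, 0.0)

lats = np.arange(-90, 91, 15)
for lat, t in zip(lats, control_sst(lats)):
    print(f"lat {lat:+4d}  SST {t:5.1f} C")
```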

Relevance:

100.00%

Abstract:

Housing in the UK accounts for 30.5% of all energy consumed and is responsible for 25% of all carbon emissions. The UK Government’s Code for Sustainable Homes requires all new homes to be zero carbon by 2016. The development and widespread diffusion of low and zero carbon (LZC) technologies is recognised as a key means for housing developers to deliver against this zero-carbon agenda. The innovation challenge of designing and incorporating these technologies into housing developers’ standard design and production templates will introduce significant technical and commercial risks. In this paper we report early results from an ongoing Engineering and Physical Sciences Research Council project examining the innovation logic and trajectory of LZC technologies in new housing. The principal theoretical lens for the research is the socio-technical network approach, which considers actors’ interests and interpretative flexibilities around technologies, and how actors negotiate and reproduce ‘acting spaces’ that shape, in this case, the selection and adoption of LZC technologies. The initial findings reveal that the technology networks forming around new housing developments are highly complex in form and operation, involving a range of actors and viewpoints that vary for each development.

Relevance:

100.00%

Abstract:

Steady state and dynamic models have been developed and applied to the River Kennet system. Annual nitrogen exports from the land surface to the river have been estimated based on land use from the 1930s and the 1990s. Long-term modelled trends indicate that there has been a large increase in nitrogen transport into the river system, driven by increased fertiliser application associated with increased cereal production, increased population and increased livestock levels. The dynamic model INCA (Integrated Nitrogen in Catchments) has been applied to simulate the day-to-day transport of N from the terrestrial ecosystem to the riverine environment. This process-based model generates spatial and temporal data and reproduces the observed in-stream concentrations. Applying the model to current land use and 1930s land use indicates that there has been a major shift in the short-term dynamics since the 1930s, with increased river and groundwater concentrations caused by both non-point source pollution from agriculture and point source discharges.
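
The annual export estimates mentioned above can be illustrated with a simple export-coefficient calculation; the sketch below is not the INCA model, and the land-use areas and per-hectare coefficients are hypothetical values chosen only to show the arithmetic.

```python
# Illustrative export-coefficient estimate of annual nitrogen load to a river
# (hypothetical areas and coefficients; not INCA and not the Kennet data).
land_use_ha = {"cereals": 12_000, "grassland": 8_000, "urban": 2_000, "woodland": 3_000}
export_kg_n_per_ha = {"cereals": 30.0, "grassland": 12.0, "urban": 8.0, "woodland": 3.0}

total_kg_n = sum(area * export_kg_n_per_ha[lu] for lu, area in land_use_ha.items())
print(f"Estimated annual N export: {total_kg_n / 1000:.0f} tonnes N per year")
```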

Relevance:

100.00%

Abstract:

Wernicke’s aphasia (WA) is a condition which results in severely disrupted language comprehension following a lesion to the left temporo-parietal region. A phonological analysis deficit has traditionally been held to be at the root of the comprehension impairment in WA, a view consistent with current functional neuroimaging, which finds areas in the superior temporal cortex responsive to phonological stimuli. However, behavioural evidence to support the link between a phonological analysis deficit and auditory comprehension has not yet been shown. This study extends seminal work by Blumstein et al. (1977) to investigate the relationship between acoustic-phonological perception, measured through phonological discrimination, and auditory comprehension in a case series of Wernicke’s aphasia participants. A novel adaptive phonological discrimination task was used to obtain reliable thresholds of the phonological perceptual distance required between nonwords before they could be discriminated. Wernicke’s aphasia participants showed significantly elevated thresholds compared to age- and hearing-matched control participants. Acoustic-phonological thresholds correlated strongly with auditory comprehension abilities in Wernicke’s aphasia. In contrast, nonverbal semantic skills showed no relationship with auditory comprehension. The results are evaluated in the context of recent neurobiological models of language; they suggest that impaired acoustic-phonological perception underlies the comprehension impairment in Wernicke’s aphasia and favour models of language which propose a leftward asymmetry in phonological analysis.
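
As a rough sketch of how an adaptive discrimination threshold can be estimated, the following implements a generic two-down/one-up staircase; the simulated listener and all parameter values are hypothetical, and this is not necessarily the exact procedure used in the study.

```python
# Generic 2-down/1-up adaptive staircase converging on a discrimination
# threshold (illustrative only; not the task used in the study).
import random

random.seed(1)

def listener_correct(distance, true_threshold=3.0):
    """Simulated listener: discrimination gets easier as the distance grows."""
    p = 0.5 + 0.45 * min(distance / true_threshold, 1.0)
    return random.random() < p

distance, step = 8.0, 1.0
correct_run, reversals, last_direction = 0, [], None

while len(reversals) < 8:
    if listener_correct(distance):
        correct_run += 1
        if correct_run == 2:                  # two in a row correct: make it harder
            correct_run = 0
            if last_direction == "up":
                reversals.append(distance)
            distance = max(0.5, distance - step)
            last_direction = "down"
    else:                                     # a single error: make it easier
        correct_run = 0
        if last_direction == "down":
            reversals.append(distance)
        distance = distance + step
        last_direction = "up"

print(f"Estimated threshold: {sum(reversals) / len(reversals):.2f} (arbitrary units)")
```

Averaging the distances at which the track reverses direction gives a threshold estimate; elevated thresholds in the patient group correspond to needing a larger phonological distance before two nonwords can be told apart.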

Relevance:

100.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change occur mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limits on computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.

Relevance:

100.00%

Abstract:

Independent studies have demonstrated that flagella are associated with the invasive process of Salmonella enterica serotypes, and aflagellate derivatives of Salmonella enterica serotype Enteritidis are attenuated in murine and avian models of infection. One widely held view is that the motility afforded by flagella, probably aided by chemotactic responses, mediates the initial interaction between bacterium and host cell. The adherence and invasion properties of two S. Enteritidis wild-type strains and isogenic aflagellate mutants were assessed on HEp-2 and Div-1 cells, which are of human and avian epithelial origin, respectively. Both aflagellate derivatives showed a significant reduction in invasion compared with the wild type over the three hours of the assays. Complementation of the defective fliC allele partially recovered the wild-type phenotype. Examination of the bacterium-host cell interaction by electron and confocal microscopy showed that wild-type bacteria induced ruffle formation and significant cytoskeletal rearrangements on HEp-2 cells within five minutes of contact. The aflagellate derivatives induced fewer ruffles than the wild type. Ruffle formation by wild-type S. Enteritidis on the Div-1 cell line was less pronounced than on HEp-2 cells. Collectively, these data support the hypothesis that flagella play an active role in the early events of the invasive process.

Relevance:

100.00%

Abstract:

The Chartered Institution of Building Services Engineers (CIBSE) produced a technical memorandum (TM36) presenting research on the impact of future climate on building energy use and thermal comfort. One climate projection for each of four CO2 emissions scenarios was used in TM36, providing a deterministic outlook. As part of the UK Climate Impacts Programme (UKCIP), probabilistic climate projections are being studied in relation to building energy simulation techniques. Including uncertainty in climate projections is considered an important advance in climate impacts modelling and is included in the latest UKCIP data (UKCP09). Incorporating the stochastic nature of these new climate projections in building energy modelling requires a significant increase in data handling and careful statistical interpretation of the results to provide meaningful conclusions. This paper compares the results from building energy simulations when applying deterministic and probabilistic climate data. This is based on two case study buildings: (i) a mixed-mode office building with exposed thermal mass and (ii) a mechanically ventilated, light-weight office building. Building (i) represents an energy-efficient building design that provides passive and active measures to maintain thermal comfort. Building (ii) relies entirely on mechanical means for heating and cooling, with its light-weight construction raising concern over increased cooling loads in a warmer climate. Devising an effective probabilistic approach highlighted greater uncertainty in predicting building performance. Results indicate that the range of calculated quantities depends not only on the building type but also, strongly, on the performance parameters of interest. Uncertainty is likely to be particularly marked with regard to thermal comfort in naturally ventilated buildings.
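
A minimal sketch of the data handling described above follows: the same building is simulated over an ensemble of sampled climate inputs and the outputs are summarised as percentiles rather than a single deterministic value. The run_simulation() function is a hypothetical stand-in, not the simulation tool used in the study.

```python
# Sketch: propagate an ensemble of probabilistic climate samples through a
# (stand-in) building simulation and report percentile ranges of the output.
# run_simulation() is a hypothetical placeholder, not a real tool's API.
import numpy as np

rng = np.random.default_rng(0)

def run_simulation(summer_mean_temp_c):
    """Toy stand-in for a building energy model: annual cooling demand (kWh/m2)."""
    return max(0.0, 3.5 * (summer_mean_temp_c - 16.0)) + rng.normal(0, 1)

# Hypothetical probabilistic projection: 100 equally likely summer mean temperatures.
climate_samples = rng.normal(19.5, 1.2, 100)
cooling = np.array([run_simulation(t) for t in climate_samples])

p10, p50, p90 = np.percentile(cooling, [10, 50, 90])
print(f"Cooling demand kWh/m2 -- 10th: {p10:.1f}, median: {p50:.1f}, 90th: {p90:.1f}")
```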

Relevance:

100.00%

Abstract:

Undeniably, anticipation plays a crucial role in cognition. By what means it does so, to what extent, and what it achieves remain open questions. In a recent BBS target article, Clark (in press) depicts an integrative model of the brain that builds on hierarchical Bayesian models of neural processing (Rao and Ballard, 1999; Friston, 2005; Brown et al., 2011) and their most recent formulation using the free-energy principle borrowed from thermodynamics (Feldman and Friston, 2010; Friston, 2010; Friston et al., 2010). Hierarchical generative models of cognition, such as those described by Clark, presuppose the manipulation of representations and internal models of the world, in as much detail as is perceptually available. Perhaps surprisingly, Clark acknowledges the existence of a “virtual version of the sensory data” (p. 4), but makes no reference to some of the historical debates that shaped cognitive science concerning the storage, manipulation, and retrieval of representations in a cognitive system (Shanahan, 1997), or the emergence of intentionality within such a system (Searle, 1980; Preston and Bishop, 2002). Instead of demonstrating how this Bayesian framework responds to these foundational questions, Clark describes the structure and the functional properties of an action-oriented, multi-level system that is meant to combine perception, learning, and experience (Niedenthal, 2007).
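
For reference, the variational free energy that this family of models appeals to can be stated in its standard textbook form (a generic formulation, not Clark's own derivation), with s the sensory data, ϑ the hidden causes, q an approximate posterior and p the generative model:

```latex
% Standard variational free energy (generic textbook form):
F(s, q) \;=\; \mathbb{E}_{q(\vartheta)}\!\big[\ln q(\vartheta) - \ln p(s, \vartheta)\big]
        \;=\; D_{\mathrm{KL}}\!\big(q(\vartheta)\,\|\,p(\vartheta \mid s)\big) \;-\; \ln p(s).
```

Minimising F with respect to q therefore both bounds surprise (−ln p(s)) from above and pulls the approximate posterior toward the true one, which is the sense in which prediction-error minimisation is read as approximate Bayesian inference.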

Relevance:

100.00%

Abstract:

Wave-activity conservation laws are key to understanding wave propagation in inhomogeneous environments. Their most general formulation follows from the Hamiltonian structure of geophysical fluid dynamics. For large-scale atmospheric dynamics, the Eliassen–Palm wave activity is a well-known example and is central to theoretical analysis. On the mesoscale, while such conservation laws have been worked out in two dimensions, their application to a horizontally homogeneous background flow in three dimensions fails because of a degeneracy created by the absence of a background potential vorticity gradient. Earlier three-dimensional results based on linear WKB theory considered only Doppler-shifted gravity waves, not waves in a stratified shear flow. Consideration of a background flow depending only on altitude is motivated by the parameterization of subgrid scales in climate models, where there is an imposed separation of horizontal length and time scales but vertical coupling within each column. Here we show how this degeneracy can be overcome and wave-activity conservation laws derived for three-dimensional disturbances to a horizontally homogeneous background flow. Explicit expressions for pseudoenergy and pseudomomentum in the anelastic and Boussinesq models are derived, and it is shown how the previously derived relations for the two-dimensional problem can be treated as a limiting case of the three-dimensional problem. The results also generalize earlier three-dimensional results in that there is no slowly varying WKB-type requirement on the background flow, and the results are extendable to finite amplitude. The relationship A_E = c A_P between pseudoenergy A_E and pseudomomentum A_P, where c is the horizontal phase speed in the direction of symmetry associated with A_P, has important applications to gravity-wave parameterization and provides a generalized statement of the first Eliassen–Palm theorem.
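
For orientation, the generic local form of such a conservation law, together with the pseudoenergy–pseudomomentum relation quoted above, can be written as follows (the notation here is generic rather than the paper's; A is a wave-activity density and F its flux):

```latex
% Local wave-activity conservation law for conservative disturbances,
% and the pseudoenergy-pseudomomentum relation quoted in the abstract:
\frac{\partial A}{\partial t} + \nabla \cdot \mathbf{F} = 0,
\qquad
A_E = c\, A_P ,
```

with c the horizontal phase speed in the direction of symmetry associated with A_P.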

Relevance:

100.00%

Abstract:

In this paper we report on a study conducted using the Middle Atmospheric Nitrogen TRend Assessment (MANTRA) balloon measurements of stratospheric constituents and temperature and the Canadian Middle Atmosphere Model (CMAM). Three different kinds of data are used to assess the mutual consistency of the combined dataset: single profiles of long-lived species from MANTRA 1998, sparse climatologies from the ozonesonde measurements during the four MANTRA campaigns and from HALOE satellite measurements, and the CMAM climatology. In doing so, we evaluate the ability of the model to reproduce the measured fields and thereby test our ability to describe mid-latitude summertime stratospheric processes. The MANTRA campaigns were conducted at Vanscoy, Saskatchewan, Canada (52° N, 107° W) in late August and early September of 1998, 2000, 2002 and 2004. During late summer at mid-latitudes, the stratosphere is close to photochemical control, providing an ideal scenario for the study reported here. From this analysis we find that: (1) reducing the value of the vertical diffusion coefficient in CMAM to a more physically reasonable value results in the model better reproducing the measured profiles of long-lived species; (2) the existence of compact correlations among the constituents, as expected from independent measurements in the literature and from models, confirms the self-consistency of the MANTRA measurements; and (3) the 1998 measurements show structures in the chemical species profiles that can be associated with transport, adding to the growing evidence that the summertime stratosphere can be much more disturbed than anticipated. The mechanisms responsible for such disturbances need to be understood in order to assess the representativeness of the measurements and to isolate long-term trends.

Relevance:

100.00%

Abstract:

A new technique for objective classification of boundary layers is applied to ground-based vertically pointing Doppler lidar and sonic anemometer data. The observed boundary layer has been classified into nine different types, based on those in the Met Office ‘Lock’ scheme, using vertical velocity variance and skewness together with the attenuated backscatter coefficient and the surface sensible heat flux. This new probabilistic method has been applied to three years of data from Chilbolton Observatory in southern England and a climatology of boundary-layer type has been created. A clear diurnal cycle is present in all seasons. The most common boundary-layer type is stable with no cloud (30.0% of the dataset). The most common unstable type is well mixed with no cloud (15.4%). Decoupled stratocumulus is the third most common boundary-layer type (10.3%) and cumulus under stratocumulus occurs 1.0% of the time. The occurrence of stable boundary-layer types is much higher in winter than in summer, and boundary-layer types capped with cumulus cloud are more prevalent in the warm seasons. The most common diurnal evolution of boundary-layer types, occurring on 52 days of our three-year dataset, is cloud-free conditions with the stability changing from stable to unstable during daylight hours. These results are based on 16,393 hours of diagnosed boundary-layer type, 62.4% of the three-year dataset. This new method is ideally suited to long-term evaluation of boundary-layer type parametrisations in weather forecast and climate models.
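
To give a concrete flavour of boundary-layer typing, here is a deliberately simplified rule-based sketch driven by surface sensible heat flux, vertical-velocity statistics and a cloud flag; the thresholds are hypothetical and it illustrates the idea only, not the probabilistic nine-type method of the paper.

```python
# Greatly simplified boundary-layer typing from bulk indicators (hypothetical
# thresholds; not the paper's probabilistic nine-type classification).
def classify_boundary_layer(sensible_heat_flux_wm2, w_variance_m2s2,
                            w_skewness, cloud_present):
    stable = sensible_heat_flux_wm2 < 0 and w_variance_m2s2 < 0.05
    if stable:
        return "stable, cloudy" if cloud_present else "stable, no cloud"
    convective = w_skewness > 0.2          # surface-driven thermals
    if convective and cloud_present:
        return "cumulus-capped mixed layer"
    if cloud_present:
        return "cloud-topped mixed layer (e.g. stratocumulus)"
    return "well mixed, no cloud"

print(classify_boundary_layer(-10.0, 0.02, 0.0, cloud_present=False))  # stable, no cloud
print(classify_boundary_layer(80.0, 0.4, 0.6, cloud_present=True))     # cumulus-capped
```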

Relevance:

100.00%

Abstract:

Theorem-proving is a one-player game. The history of computer programs as players goes back to 1956 and the Logic Theory Machine (‘LT’) of Newell, Shaw and Simon. In game-playing terms, the ‘initial position’ is the core set of axioms chosen for the particular logic and the ‘moves’ are the rules of inference. Now, the Univalent Foundations Program at the Institute for Advanced Study (IAS), Princeton, and the resulting ‘HoTT’ book on Homotopy Type Theory have demonstrated the success of a new kind of experimental mathematics using computer theorem proving.
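
In this game-playing picture the axioms are the starting position and the inference rules are the moves; the tiny Lean proof below (a generic example, not taken from the HoTT book) shows a ‘game’ won in two moves.

```lean
-- A minimal theorem-proving "game": the goal p ∧ q → q ∧ p is reached with
-- two moves, `intro` (take the hypothesis) and `exact` (build the pair).
theorem and_swap (p q : Prop) : p ∧ q → q ∧ p := by
  intro h
  exact ⟨h.right, h.left⟩
```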

Relevance:

100.00%

Abstract:

An initial validation of the Along Track Scanning Radiometer (ATSR) Reprocessing for Climate (ARC) retrievals of sea surface temperature (SST) is presented. ATSR-2 and Advanced ATSR (AATSR) SST estimates are compared to drifting buoy and moored buoy observations over the period 1995 to 2008. The primary ATSR estimates are of skin SST, whereas buoys measure SST below the surface. Adjustment is therefore made for the skin effect, for diurnal stratification and for differences in buoy–satellite observation time. With such adjustments, satellite–in situ differences are consistent between day and night to within ~0.01 K. Satellite–in situ differences are correlated with differences in observation time because of the diurnal warming and cooling of the ocean. The data are used to verify the average behaviour of physical and empirical models of the warming/cooling rates. Systematic differences between adjusted AATSR and in situ SSTs against latitude, total column water vapour (TCWV), and wind speed are less than 0.1 K, for all except the most extreme cases (TCWV < 5 kg m–2, TCWV > 60 kg m–2). For all types of retrieval except the nadir-only two-channel (N2), regional biases are less than 0.1 K for 80% of the ocean. Global comparison against drifting buoys shows night-time dual-view two-channel (D2) SSTs are warm by 0.06 ± 0.23 K and dual-view three-channel (D3) SSTs are warm by 0.06 ± 0.21 K (daytime D2: 0.07 ± 0.23 K). Nadir-only results are N2: 0.03 ± 0.33 K and N3: 0.03 ± 0.19 K, showing improved inter-algorithm consistency to ~0.02 K. This represents a marked improvement over the existing operational retrieval algorithms, for which the inter-algorithm inconsistency is > 0.5 K. Comparison against tropical moored buoys, which are more accurate than drifting buoys, gives lower error estimates (N3: 0.02 ± 0.13 K, D2: 0.03 ± 0.18 K). Comparable results are obtained for ATSR-2, except that the ATSR-2 SSTs are around 0.1 K warmer than AATSR.
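
The buoy adjustment described above, converting a sub-surface buoy temperature into something comparable with a satellite skin SST before differencing, can be sketched roughly as follows; the nominal 0.17 K cool-skin offset and the linear warming-rate term are illustrative assumptions, not the ARC adjustment scheme.

```python
# Rough sketch of buoy-vs-satellite SST matchup differencing with a nominal
# cool-skin offset and a crude correction for the time difference between
# observations (illustrative assumptions, not the ARC methodology).
def satellite_minus_buoy(sat_skin_sst_k, buoy_depth_sst_k,
                         time_diff_hours, warming_rate_k_per_hour=0.0):
    cool_skin_offset_k = 0.17                     # nominal skin-effect adjustment
    expected_skin = buoy_depth_sst_k - cool_skin_offset_k
    expected_skin += warming_rate_k_per_hour * time_diff_hours  # diurnal adjustment
    return sat_skin_sst_k - expected_skin

# Example matchup: night-time case with no modelled diurnal warming.
print(f"{satellite_minus_buoy(293.05, 293.20, time_diff_hours=1.5):+.2f} K")
```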