127 results for Probabilistic constraints


Relevance: 20.00%

Abstract:

How do probabilistic models represent their targets and how do they allow us to learn about them? The answer to this question depends on a number of details, in particular on the meaning of the probabilities involved. To classify the options, a minimalist conception of representation (Suárez 2004) is adopted: Modelers devise substitutes ("sources") of their targets and investigate them to infer something about the target. Probabilistic models allow us to infer probabilities about the target from probabilities about the source. This leads to a framework in which we can systematically distinguish between different models of probabilistic modeling. I develop a fully Bayesian view of probabilistic modeling, but I argue that, as an alternative, Bayesian degrees of belief about the target may be derived from ontic probabilities about the source. Remarkably, some accounts of ontic probabilities can avoid problems if they are supposed to apply to sources only.

Relevance: 20.00%

Abstract:

The paper analyzes how to comply with an emission constraint, which restricts the use of an established energy technique, given the two options to save energy and to invest in two alternative energy techniques. These techniques differ in their deterioration rates and the investment lags of the corresponding capital stocks. Thus, the paper takes a medium-term perspective on climate change mitigation, where the time horizon is too short for technological change to occur, but long enough for capital stocks to accumulate and deteriorate. It is shown that, in general, only one of the two alternative techniques prevails in the stationary state, although both techniques might be utilized during the transition phase. Hence, while in a static economy only one technique is efficient, this is not necessarily true in a dynamic economy.

Relevance: 20.00%

Abstract:

Virtual worlds have moved from being a geek topic to one of mainstream academic interest. This transition is contingent not only on the augmented economic, societal and cultural value of these virtual realities and their effect upon real life, but also on their convenience as fields for experimentation, for testing models and paradigms. User creation is, however, not something that has been transplanted from the real to the virtual world, but a phenomenon and a dynamic process that happens from within and is defined through complex relationships between the commercial and non-commercial, the commodified and non-commodified, the individual and the communal, the amateur and the professional, art and non-art. Accounting for this complex environment, the present paper explores user-created content in virtual worlds, its dimensions and value and, above all, its constraints by code and law. It puts forward suggestions for better understanding and harnessing this creativity.

Relevance: 20.00%

Abstract:

Derivation of probability estimates complementary to geophysical data sets has gained special attention over recent years. Information about the confidence level of provided physical quantities is required to construct an error budget of higher-level products and to correctly interpret the final results of a particular analysis. Regarding the generation of products based on satellite data, a common input is a cloud mask, which allows discrimination between surface and cloud signals. Further, the surface information is divided into snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may alter their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited for 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed, which provides three types of probability estimates: between cloudy/clear-sky, cloudy/snow and clear-sky/snow conditions. As opposed to the majority of available techniques, which are usually based on a decision-tree approach, the PCM algorithm uses all spectral, angular and ancillary information in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the issue of deriving a single threshold value for each spectral test is overcome by the concept of a multidimensional information space, which is divided into small bins by an extensive set of intervals. The discrimination between snow and ice clouds and the detection of broken, thin clouds are enhanced by means of the invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions, spanning from Iceland through central Europe to the northern parts of Africa, which exhibit diverse difficulties for cloud/snow masking algorithms. The retrieved PCM cloud classification was compared to the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 cloud masks, SYNOP (surface synoptic observations) weather reports, the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) vertical feature mask version 3, and the MODIS collection 5 snow mask. The outcomes of the conducted analyses demonstrate the good detection skills of the PCM method, with results comparable to or better than those of the reference PPS algorithm.
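As a rough illustration of the look-up-table idea described above, the following Python sketch bins a few pixel features into intervals and reads class probabilities from a precomputed LUT. The feature names, bin edges and LUT contents are placeholders and are not taken from the PCM algorithm itself, which operates on the full set of AVHRR spectral, angular and ancillary inputs.

    # Minimal sketch of LUT-based probabilistic classification, assuming
    # hypothetical features and bin edges; the real PCM algorithm uses the
    # full AVHRR spectral, angular and ancillary information space.
    import numpy as np

    # Hypothetical bin edges defining the multidimensional information space.
    BIN_EDGES = {
        "reflectance_0_6um": np.linspace(0.0, 1.0, 11),            # 10 bins
        "brightness_temp_11um_K": np.linspace(200.0, 320.0, 13),   # 12 bins
        "sun_zenith_deg": np.linspace(0.0, 90.0, 7),               # 6 bins
    }

    # Precomputed probabilities for (cloudy, clear-sky, snow) in each bin;
    # filled with a uniform placeholder here instead of training data.
    LUT = np.full((10, 12, 6, 3), 1.0 / 3.0)

    def classify_pixel(reflectance, brightness_temp, sun_zenith):
        """Return (P_cloudy, P_clear, P_snow) for one pixel via LUT look-up."""
        idx = []
        values = (reflectance, brightness_temp, sun_zenith)
        for value, edges in zip(values, BIN_EDGES.values()):
            # Map the value to its interval and clip to the valid bin range.
            i = np.clip(np.digitize(value, edges) - 1, 0, len(edges) - 2)
            idx.append(int(i))
        p = LUT[tuple(idx)]
        return p / p.sum()   # renormalise against rounding

    print(classify_pixel(0.45, 255.0, 60.0))

In a real application the LUT would be populated from training data, so that each bin stores empirically derived probabilities rather than the uniform placeholder used here.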

Relevance: 20.00%

Abstract:

A fundamental capacity of the human brain is to learn relations (contingencies) between environmental stimuli and the consequences of their occurrence. Some contingencies are probabilistic; that is, they predict an event in some situations but not in all. Animal studies suggest that damage to limbic structures or the prefrontal cortex may disturb probabilistic learning. The authors studied the learning of probabilistic contingencies in amnesic patients with limbic lesions, patients with prefrontal cortex damage, and healthy controls. Across 120 trials, participants learned contingent relations between spatial sequences and a button press. Amnesic patients had learning comparable to that of control subjects but failed to indicate what they had learned. Across the last 60 trials, amnesic patients and control subjects learned to avoid a noncontingent choice better than frontal patients. These results indicate that probabilistic learning does not depend on the brain structures supporting declarative memory.

Relevance: 20.00%

Abstract:

Morphogenesis occurs in 3D space over time and is guided by coordinated gene expression programs. Here we use postembryonic development in Arabidopsis plants to investigate the genetic control of growth. We demonstrate that gene expression driving the production of the growth-stimulating hormone gibberellic acid and downstream growth factors is first induced within the radicle tip of the embryo. The center of cell expansion is, however, spatially displaced from the center of gene expression. Because the rapidly growing cells have a very different geometry from the cells at the tip, we hypothesized that mechanical factors may contribute to this growth displacement. To this end, we developed 3D finite-element method models of growing custom-designed digital embryos at cellular resolution. We used this framework to conceptualize how cell size, shape, and topology influence tissue growth and to explore the interplay of geometrical and genetic inputs into growth distribution. Our simulations showed that mechanical constraints are sufficient to explain the disconnect between the experimentally observed spatiotemporal patterns of gene expression and early postembryonic growth. The center of cell expansion is the position where genetic and mechanical facilitators of growth converge. We have thus uncovered a mechanism whereby 3D cellular geometry helps direct where genetically specified growth takes place.

Relevance: 20.00%

Abstract:

We present high resolution transmission spectra of giant planet atmospheres from a coupled 3-D atmospheric dynamics and transmission spectrum model that includes Doppler shifts which arise from winds and planetary motion. We model jovian planets covering more than two orders of magnitude in incident flux, corresponding to planets with 0.9 to 55 day orbital periods around solar-type stars. The results of our 3-D dynamical models reveal certain aspects of high resolution transmission spectra that are not present in simple 1-D models. We find that the hottest planets experience strong substellar to anti-stellar (SSAS) winds, resulting in transmission spectra with net blue shifts of up to 3 km s−1, whereas less irradiated planets show almost no net Doppler shifts. Compared to 1-D models, peak line strengths are significantly reduced for the hottest atmospheres owing to Doppler broadening from a combination of rotation (which is faster for close-in planets under the assumption of tidal locking) and atmospheric winds. Finally, high resolution transmission spectra may be useful in studying the atmospheres of exoplanets with optically thick clouds since line cores for very strong transitions should remain optically thick to very high altitude. High resolution transmission spectra are an excellent observational test for the validity of 3-D atmospheric dynamics models, because they provide a direct probe of wind structures and heat circulation. Ground-based exoplanet spectroscopy is currently on the verge of being able to verify some of our modeling predictions, most notably the dependence of SSAS winds on insolation. We caution that interpretation of high resolution transmission spectra based on 1-D atmospheric models may be inadequate, as 3-D atmospheric motions can produce a noticeable effect on the absorption signatures.
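To give a sense of scale for the net blue shifts quoted above, this back-of-the-envelope Python snippet converts a 3 km s⁻¹ line-of-sight wind into a wavelength shift using the non-relativistic Doppler relation Δλ/λ = v/c; the 589.0 nm reference wavelength is chosen only for illustration and is not tied to the species modeled in the paper.

    # Back-of-the-envelope Doppler shift for a line-of-sight wind, using the
    # non-relativistic relation delta_lambda / lambda = v / c.
    C_KM_S = 299_792.458        # speed of light in km/s
    wind_speed = -3.0           # km/s; negative sign = motion toward observer
    rest_wavelength_nm = 589.0  # illustrative optical line

    shift_nm = rest_wavelength_nm * wind_speed / C_KM_S
    print(f"net Doppler shift: {shift_nm:.4f} nm")   # about -0.0059 nm

A shift of this size is comparable to the resolution element of a spectrograph with R ≈ 100,000, which is why detecting such wind signatures calls for high-resolution ground-based spectroscopy.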

Relevance: 20.00%

Abstract:

We further develop the effective fluid theory of stationary branes. This formalism applies to stationary blackfolds as well as to other equilibrium brane systems at finite temperature. The effective theory is described by a Lagrangian containing the information about the elastic dynamics of the brane embedding as well as the hydrodynamics of the effective fluid living on the brane. The Lagrangian is corrected order by order in a derivative expansion, where we take into account the dipole moment of the brane, which encompasses finite-thickness corrections, including transverse spin. We describe how to extract the thermodynamics from the Lagrangian and we obtain constraints on the higher-derivative terms with one and two derivatives. These constraints follow by comparing the brane thermodynamics with the conserved currents associated with background Killing vector fields. In particular, we fix uniquely the one- and two-derivative terms describing the coupling of the transverse spin to the background space-time. Finally, we apply our formalism to two blackfold examples, black tori and charged black rings, and compare the latter to a numerically generated solution.

Relevance: 20.00%

Abstract:

We briefly review some of the lower-energy constraints on the perturbative behaviour of the strong coupling αs, with some emphasis on the determination coming from the energy between two static sources calculated on the lattice.

Relevance: 20.00%

Abstract:

We consider an effective field theory for a gauge singlet Dirac dark matter particle interacting with the standard model fields via effective operators suppressed by the scale Λ ≳ 1 TeV. We perform a systematic analysis of the leading loop contributions to spin-independent Dirac dark matter–nucleon scattering using renormalization group evolution between Λ and the low-energy scale probed by direct detection experiments. We find that electroweak interactions induce operator mixings such that operators that are naively velocity suppressed and spin dependent can actually contribute to spin-independent scattering. This allows us to put novel constraints on Wilson coefficients that were so far poorly bounded by direct detection. Constraints from current searches are already significantly stronger than LHC bounds, and will improve in the near future. Interestingly, the loop contribution we find is isospin violating even if the underlying theory is isospin conserving.
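The operator mixing under renormalization-group running can be illustrated schematically: a Wilson coefficient that vanishes at the matching scale Λ can be generated at the low scale through the off-diagonal entries of the anomalous-dimension matrix. The toy 2×2 matrix, scales and numerical values in the Python sketch below are purely illustrative and do not correspond to the operator basis or anomalous dimensions computed in the paper.

    # Schematic sketch of Wilson-coefficient mixing under RG running,
    #     dC/dln(mu) = gamma @ C,  solved as  C(mu) = expm(gamma * t) @ C(Lambda)
    # with t = ln(mu / Lambda). All numbers are illustrative placeholders.
    import numpy as np
    from scipy.linalg import expm

    LAMBDA = 1000.0   # GeV, matching scale
    MU_LOW = 2.0      # GeV, scale probed by direct detection
    t = np.log(MU_LOW / LAMBDA)

    # Toy anomalous-dimension matrix: the (0, 1) entry feeds operator 2
    # into operator 1 under running.
    gamma = np.array([[0.00, 0.02],
                      [0.00, 0.00]])

    C_high = np.array([0.0, 1.0])    # only operator 2 is generated at Lambda
    C_low = expm(gamma * t) @ C_high
    print(C_low)   # operator 1 now carries a small coefficient from mixing alone

The point of the sketch is only the structure of the effect: even a modest off-diagonal running between Λ and a few GeV leaves operator 1 with a nonzero coefficient at the direct-detection scale.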

Relevance: 20.00%

Abstract:

Nitrous oxide (N2O) is an important greenhouse gas and ozone-depleting substance that has anthropogenic as well as natural marine and terrestrial sources. The tropospheric N2O concentrations have varied substantially in the past in concert with changing climate on glacial–interglacial and millennial timescales. It is not well understood, however, how N2O emissions from marine and terrestrial sources change in response to varying environmental conditions. The distinct isotopic compositions of marine and terrestrial N2O sources can help disentangle the relative changes in marine and terrestrial N2O emissions during past climate variations. Here we present N2O concentration and isotopic data for the last deglaciation, from 16,000 to 10,000 years before present, retrieved from air bubbles trapped in polar ice at Taylor Glacier, Antarctica. With the help of our data and a box model of the N2O cycle, we find a 30 per cent increase in total N2O emissions from the late glacial to the interglacial, with terrestrial and marine emissions contributing equally to the overall increase and generally evolving in parallel over the last deglaciation, even though there is no a priori connection between the drivers of the two sources. However, we find that terrestrial emissions dominated on centennial timescales, consistent with a state-of-the-art dynamic global vegetation and land surface process model that suggests that during the last deglaciation emission changes were strongly influenced by temperature and precipitation patterns over land surfaces. The results improve our understanding of the drivers of natural N2O emissions and are consistent with the idea that natural N2O emissions will probably increase in response to anthropogenic warming.
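A minimal one-box sketch can convey how the kind of box model mentioned above links emissions to atmospheric N2O: the burden relaxes toward (total emissions) × (lifetime), so a sustained 30 per cent rise in emissions eventually raises the concentration by roughly the same fraction. The lifetime, emission numbers and unit conversion in the Python sketch below are approximate illustrations, not the configuration or deglacial reconstruction used in the study.

    # One-box sketch of the atmospheric N2O budget with marine and terrestrial
    # sources and a first-order stratospheric sink:
    #     dB/dt = E_marine + E_land - B / TAU
    # All parameter values are rough illustrations, not the paper's box model.
    TAU = 120.0                 # approximate atmospheric lifetime of N2O, years
    PPB_PER_TG_N = 1.0 / 4.8    # ~4.8 Tg N per ppb N2O (approximate conversion)

    def integrate(e_marine, e_land, burden0, years, dt=1.0):
        """Integrate the N2O burden (Tg N) forward with simple Euler steps."""
        burden = burden0
        for _ in range(int(years / dt)):
            burden += dt * (e_marine + e_land - burden / TAU)
        return burden

    # Illustrative late-glacial emissions (Tg N per year), run to steady state...
    glacial = integrate(4.0, 4.0, 0.0, 5000)
    # ...then a 30 per cent increase shared equally between the two sources.
    interglacial = integrate(5.2, 5.2, glacial, 5000)

    for label, b in (("glacial", glacial), ("interglacial", interglacial)):
        print(f"{label}: burden {b:.0f} Tg N, roughly {b * PPB_PER_TG_N:.0f} ppb")

With these placeholder numbers the steady-state burden scales linearly with total emissions, which is the sense in which a box model ties a reconstructed concentration rise to an inferred emission increase.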