60 results for Weak instruments
Abstract:
Methods for tracking an object have generally fallen into two groups: tracking by detection and tracking through local optimization. The advantage of detection-based tracking is its ability to deal with target appearance and disappearance, but it does not naturally take advantage of target motion continuity during detection. The advantage of local optimization is efficiency and accuracy, but it requires additional algorithms to initialize tracking when the target is lost. To bridge these two approaches, we propose a framework for unified detection and tracking as a time-series Bayesian estimation problem. The basis of our approach is to treat both detection and tracking as a sequential entropy minimization problem, where the goal is to determine the parameters describing a target in each frame. To do this we integrate the Active Testing (AT) paradigm with Bayesian filtering, and this results in a framework capable of both detecting and tracking robustly in situations where the target object enters and leaves the field of view regularly. We demonstrate our approach on a retinal tool tracking problem and show through extensive experiments that our method provides an efficient and robust tracking solution.
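The sequential Bayesian estimation at the heart of such a framework can be illustrated with a generic bootstrap particle filter. This is a minimal one-dimensional sketch, not the authors' Active Testing method; all names and noise parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, observation,
                         motion_std=1.0, obs_std=2.0):
    """One predict/update cycle of a 1-D bootstrap particle filter."""
    # Predict: propagate particles through a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # Update: reweight by the Gaussian observation likelihood.
    weights = weights * np.exp(-0.5 * ((observation - particles) / obs_std) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < len(particles) / 2:
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# Track a target drifting from position 0 toward 10 over 20 frames.
particles = rng.normal(0.0, 5.0, size=500)
weights = np.full(500, 1.0 / 500)
for z in np.linspace(0.0, 10.0, 20):
    particles, weights = particle_filter_step(particles, weights, z)
estimate = float(np.sum(particles * weights))
```

In a full detection-and-tracking system, the same recursion would run over target parameters per frame, with the detection stage (here absent) re-seeding the particle set when the target re-enters the field of view.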
Abstract:
Transforming today’s energy systems in industrialized countries requires a substantial reduction of total energy consumption at the individual level. Selected instruments have been found to be effective in changing people’s behavior in single domains. However, the so-far limited success in reducing overall energy consumption indicates that our understanding of the factors determining individual energy consumption, and of how it changes, is far from conclusive. Among other shortcomings, the scientific state of the art is dominated by analyses of single domains of consumption and tends to neglect embodied energy. It also displays strong disciplinary splits, and the literature often fails to distinguish between explaining behavior and explaining change of behavior. Moreover, there are knowledge gaps regarding the legitimacy and effectiveness of governing individual consumption behavior and its change. Against this backdrop, the aim of this paper is to establish an integrated interdisciplinary framework that offers a systematic basis for linking the different aspects of research on energy-related consumption behavior, thus paving the way for a better evidence base to inform societal action. The framework connects the three relevant analytical aspects of the topic: (1) it systematically and conceptually frames the objects of study, i.e. energy consumption behavior and its change (explananda); (2) it structures the factors that potentially explain energy consumption behavior and its change (explanantia); (3) it provides a differentiated understanding of change-inducing interventions in terms of governance. Based on existing state-of-the-art approaches from different disciplines within the social sciences, the proposed framework is intended to guide interdisciplinary empirical research.
Abstract:
A well-developed theoretical framework is available in which paleofluid properties, such as chemical composition and density, can be reconstructed from fluid inclusions in minerals that have undergone no ductile deformation. The present study extends this framework to encompass fluid inclusions hosted by quartz that has undergone weak ductile deformation following fluid entrapment. Recent experiments have shown that such deformation causes inclusions to become dismembered into clusters of irregularly shaped relict inclusions surrounded by planar arrays of tiny, newly formed (neonate) inclusions. Comparison of the experimental samples with a naturally sheared quartz vein from Grimsel Pass, Aar Massif, Central Alps, Switzerland, reveals striking similarities. This strong concordance justifies applying the experimentally derived rules of fluid inclusion behaviour to nature. Thus, planar arrays of dismembered inclusions defining cleavage planes in quartz may be taken as diagnostic of small amounts of intracrystalline strain. Deformed inclusions preserve their pre-deformation concentration ratios of gases to electrolytes, but their H2O contents typically have changed. Morphologically intact inclusions, in contrast, preserve the pre-deformation composition and density of their originally trapped fluid. The orientation of the maximum principal compressive stress (σ1) at the time of shear deformation can be derived from the pole to the cleavage plane within which the dismembered inclusions are aligned. Finally, the density of neonate inclusions is commensurate with the pressure value of σ1 at the temperature and time of deformation. This last rule offers a means to estimate magnitudes of shear stresses from fluid inclusion studies.
Application of this new paleopiezometer approach to the Grimsel vein yields a differential stress (σ1–σ3) of ∼300 MPa at 390 ± 30 °C during late Miocene NNW–SSE orogenic shortening and regional uplift of the Aar Massif. This differential stress resulted in strain-hardening of the quartz at very low total strain (<5%) while nearby shear zones were accommodating significant displacements. Further implementation of these experimentally derived rules should provide new insight into processes of fluid–rock interaction in the ductile regime within the Earth's crust.
Abstract:
In this paper we continue Feferman’s unfolding program initiated in (Feferman, vol. 6 of Lecture Notes in Logic, 1996) which uses the concept of the unfolding U(S) of a schematic system S in order to describe those operations, predicates and principles concerning them, which are implicit in the acceptance of S. The program has been carried through for a schematic system of non-finitist arithmetic NFA in Feferman and Strahm (Ann Pure Appl Log, 104(1–3):75–96, 2000) and for a system FA (with and without Bar rule) in Feferman and Strahm (Rev Symb Log, 3(4):665–689, 2010). The present contribution elucidates the concept of unfolding for a basic schematic system FEA of feasible arithmetic. Apart from the operational unfolding U0(FEA) of FEA, we study two full unfolding notions, namely the predicate unfolding U(FEA) and a more general truth unfolding UT(FEA) of FEA, the latter making use of a truth predicate added to the language of the operational unfolding. The main results obtained are that the provably convergent functions on binary words for all three unfolding systems are precisely those being computable in polynomial time. The upper bound computations make essential use of a specific theory of truth TPT over combinatory logic, which has recently been introduced in Eberhard and Strahm (Bull Symb Log, 18(3):474–475, 2012) and Eberhard (A feasible theory of truth over combinatory logic, 2014) and whose involved proof-theoretic analysis is due to Eberhard (A feasible theory of truth over combinatory logic, 2014). The results of this paper were first announced in (Eberhard and Strahm, Bull Symb Log 18(3):474–475, 2012).
Abstract:
Weak radiative decays of the B mesons belong to the most important flavor-changing processes that provide constraints on physics at the TeV scale. In the derivation of such constraints, accurate standard model predictions for the inclusive branching ratios play a crucial role. In the current Letter we present an update of these predictions, incorporating all our results for the O(αs²) and lower-order perturbative corrections that have been calculated after 2006. New estimates of nonperturbative effects are taken into account, too. For the CP- and isospin-averaged branching ratios, we find Bsγ = (3.36 ± 0.23)×10⁻⁴ and Bdγ = (1.73 +0.12/−0.22)×10⁻⁵, for Eγ > 1.6 GeV. Both results remain in agreement with the current experimental averages. Normalizing their sum to the inclusive semileptonic branching ratio, we obtain Rγ ≡ (Bsγ + Bdγ)/Bcℓν = (3.31 ± 0.22)×10⁻³. A new bound from Bsγ on the charged Higgs boson mass in the two-Higgs-doublet model II reads MH± > 480 GeV at 95% C.L.
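As a quick arithmetic cross-check of the quoted numbers (central values only; treating the asymmetric Bdγ uncertainty symmetrically with its larger side is a simplification made here, not in the paper):

```python
import math

# Central values quoted in the abstract (dimensionless branching ratios).
B_s_gamma = 3.36e-4
B_d_gamma = 1.73e-5
R_gamma = 3.31e-3  # quoted (Bsγ + Bdγ) / Bcℓν

total = B_s_gamma + B_d_gamma          # combined radiative branching ratio
B_semileptonic = total / R_gamma       # implied inclusive semileptonic ratio

# Naive quadrature of the quoted uncertainties (illustrative only;
# 0.22e-5 is the larger side of the asymmetric Bdγ error).
err_total = math.hypot(0.23e-4, 0.22e-5)
```

The implied semileptonic branching ratio comes out near 10.7%, consistent with the measured inclusive B → Xcℓν rate of roughly 10%, which is a useful sanity check on the normalization.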
Abstract:
Mechanical thrombectomy provides higher recanalization rates than intravenous or intra-arterial thrombolysis. This has finally been shown to translate into improved clinical outcome in six multicenter randomized controlled trials. However, within cohorts the clinical outcomes may vary, depending on the endovascular techniques applied. Systems aiming mainly at thrombus fragmentation and lacking protection against distal embolization have shown disappointing results when compared to recent stent-retriever studies, or even to historical data on local arterial fibrinolysis. Procedure-related embolic events are usually graded as adverse events in interventional neuroradiology. In stroke, however, the clinical consequences of secondary emboli have so far mostly been neglected and attributed to progression of the stroke itself. We summarize the evolution of instruments and techniques for endovascular, image-guided, microneurosurgical recanalization in acute stroke, and discuss how to avoid procedure-related embolic complications.
Abstract:
The important task of observing the global coverage of middle-atmospheric trace gases like water vapor or ozone is usually accomplished by satellites. Climate and atmospheric studies rely upon knowledge of trace gas distributions throughout the stratosphere and mesosphere. Many of these gases are currently measured from satellites, but it is not clear whether this capability will be maintained in the future. This could lead to a significant knowledge gap regarding the state of the atmosphere. We explore the possibilities of mapping middle-atmospheric water vapor in the Northern Hemisphere by using Lagrangian trajectory calculations and water vapor profile data from a small network of five ground-based microwave radiometers. Four of them are operated within the frame of NDACC (Network for the Detection of Atmospheric Composition Change). Since the instruments are based on different hardware and calibration setups, a height-dependent bias of the retrieved water vapor profiles has to be expected among the microwave radiometers. In order to correct and harmonize the different data sets, the Microwave Limb Sounder (MLS) on the Aura satellite is used as a kind of traveling standard. A domain-averaging TM (trajectory mapping) method is applied, which simplifies the subsequent validation of the quality of the trajectory-mapped water vapor distribution against direct satellite observations. Trajectories are calculated forwards and backwards in time for up to 10 days using 6-hourly meteorological wind analysis fields. Overall, a total of four case studies of trajectory mapping in different meteorological regimes are discussed. One of the case studies takes place during a major sudden stratospheric warming (SSW) accompanied by the polar vortex breakdown; a second takes place after the reformation of a stable circulation system.
TM cases close to the fall equinox and the June solstice of 2012 complete the study, showing the high potential of a network of ground-based remote sensing instruments to synthesize hemispheric maps of water vapor.
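The core of Lagrangian trajectory mapping is repeatedly advecting air parcels along analyzed winds. A minimal sketch of one forward step on the sphere (a simple forward-Euler scheme with illustrative values; operational trajectory models use higher-order integration and interpolated 3-D wind fields):

```python
import numpy as np

def advect(lons, lats, u, v, dt=6 * 3600.0, radius=6.371e6):
    """One forward Euler step of Lagrangian advection on the sphere.

    lons/lats: parcel positions in degrees.
    u, v: zonal/meridional wind components in m/s at each parcel.
    dt: time step in seconds (6 h, matching 6-hourly wind analyses).
    """
    dlat = np.degrees(v * dt / radius)
    dlon = np.degrees(u * dt / (radius * np.cos(np.radians(lats))))
    return lons + dlon, lats + dlat

# A parcel near Bern (8°E, 47°N) carried by a 20 m/s zonal wind for 6 h.
lon, lat = advect(np.array([8.0]), np.array([47.0]),
                  u=np.array([20.0]), v=np.array([0.0]))
```

Chaining such steps forwards and backwards for up to 10 days, each parcel carries its radiometer-measured mixing ratio with it, and the accumulated parcel positions are binned into a hemispheric map.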
Abstract:
The interaction of a comet with the solar wind undergoes various stages as the comet’s activity varies along its orbit. For a comet like 67P/Churyumov–Gerasimenko, the target comet of ESA’s Rosetta mission, the various features include the formation of a Mach cone, the bow shock, and, close to perihelion, even a diamagnetic cavity. There are different approaches to simulating this complex interplay between the solar wind and the comet’s extended neutral gas coma, including magnetohydrodynamic (MHD) and hybrid-type models. The former treats the plasma as fluids (one fluid in basic single-fluid MHD), while the latter treats the ions as individual particles under the influence of the local electric and magnetic fields. The electrons are treated as a charge-neutralizing fluid in both cases. Given the different approaches, the two models yield different results, in particular for a low-production-rate comet. In this paper we show that these differences can be reduced by using a multifluid instead of a single-fluid MHD model and by increasing the resolution of the hybrid model. We show that some major features obtained with a hybrid-type approach, such as the gyration of the cometary heavy ions and the formation of the Mach cone, can be partially reproduced with the multifluid-type model.
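A back-of-the-envelope calculation shows why heavy-ion gyration matters at low activity: the pickup-ion gyroradius r = m v⊥ / (q B) can exceed the size of the whole interaction region. The input values below are typical order-of-magnitude assumptions, not mission data:

```python
# Gyroradius of a cometary water-group pickup ion, illustrating why
# kinetic (hybrid) effects matter for a low-production-rate comet.
M_P = 1.6726e-27      # proton mass, kg
Q_E = 1.6022e-19      # elementary charge, C

def gyroradius(mass_amu, v_perp, b_field):
    """r = m * v_perp / (q * B), in metres (singly charged ion)."""
    return mass_amu * M_P * v_perp / (Q_E * b_field)

# An 18-amu water-group ion picked up at ~400 km/s solar wind speed
# in a ~5 nT interplanetary magnetic field (assumed typical values).
r = gyroradius(18.0, 4.0e5, 5.0e-9)   # metres
```

This gives a gyroradius of order 10⁴ km, comparable to or larger than the coma of a weakly outgassing comet, so a fluid description that averages over the gyration necessarily misses part of the dynamics.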
Abstract:
Aim: The usual hypothesis about the relationship between niche breadth and range size posits that species with the capacity to use a wider range of resources or to tolerate a greater range of environmental conditions should be more widespread. In plants, broader niches are often hypothesized to be due to pronounced phenotypic plasticity, and more plastic species are therefore predicted to be more common. We examined the relationship between the magnitude of phenotypic plasticity in five functional traits, mainly related to leaves, and several measures of abundance in 105 Central European grassland species. We further tested whether mean values of traits, rather than their plasticity, better explain the commonness of species, possibly because they are pre-adapted to exploiting the most common resources.
Location: Central Europe.
Methods: In a multispecies experiment with 105 species we measured leaf thickness, leaf greenness, specific leaf area, leaf dry matter content and plant height, and the plasticity of these traits in response to fertilization, waterlogging and shading. For the same species we also obtained five measures of commonness, ranging from plot-level abundance to range size in Europe. We then examined whether these measures of commonness were associated with the magnitude of phenotypic plasticity, expressed as the composite plasticity of all traits across the experimental treatments. We further estimated the relative importance of trait plasticity and trait means for abundance and geographical range size.
Results: More abundant species were less plastic. This negative relationship was fairly consistent across several spatial scales of commonness, but it was weak. Indeed, compared with trait means, plasticity was relatively unimportant for explaining differences in species commonness.
Main conclusions: Our results do not indicate that larger phenotypic plasticity of leaf morphological traits enhances species abundance.
Furthermore, possession of a particular trait value, rather than of trait plasticity, is a more important determinant of species commonness.
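One common way to express plasticity as a single number per species is a relative-distance index between control and treatment trait means, averaged over traits. This is a generic sketch of that idea, not the composite index used in the study; the trait values are hypothetical:

```python
import numpy as np

def plasticity_index(control, treatment):
    """Per-trait relative distance |treatment - control| / max(control,
    treatment), averaged across traits. One of several indices used in
    the plasticity literature; bounded between 0 and 1."""
    control = np.asarray(control, dtype=float)
    treatment = np.asarray(treatment, dtype=float)
    return float(np.mean(np.abs(treatment - control) /
                         np.maximum(control, treatment)))

# Hypothetical trait means for one species under control vs shading:
# specific leaf area (mm²/mg), leaf thickness (mm), plant height (cm).
pi = plasticity_index([20.0, 0.30, 15.0], [30.0, 0.25, 18.0])
```

Averaging such an index over several treatments (fertilization, waterlogging, shading) yields one composite plasticity value per species, which can then be regressed against measures of commonness.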
Abstract:
The goal of the AEgIS experiment is to measure the gravitational acceleration of antihydrogen – the simplest atom consisting entirely of antimatter – with an ultimate precision of 1%. We plan to verify the Weak Equivalence Principle (WEP), one of the fundamental laws of nature, with an antimatter beam. The experiment consists of a positron accumulator, an antiproton trap and a Stark accelerator in a solenoidal magnetic field to form and accelerate a pulsed beam of antihydrogen atoms towards a free-fall detector. The antihydrogen beam passes through a moiré deflectometer to measure the vertical displacement due to the gravitational force. A position- and time-sensitive hybrid detector registers the annihilation points of the antihydrogen atoms and their time-of-flight. The detection principle has been successfully tested with antiprotons and a miniature moiré deflectometer coupled to a nuclear emulsion detector.
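The size of the signal being chased can be estimated from simple free fall: a horizontally launched atom drops by ½ g t² during its time-of-flight t. The beam speed and flight path below are illustrative assumptions, and g = 9.81 m/s² presumes the WEP holds, which is exactly the hypothesis under test:

```python
G = 9.81  # m/s², assumed equal for matter and antimatter (WEP)

def vertical_drop(flight_path_m, beam_speed_m_s):
    """Free-fall displacement ½ g t² over a horizontal flight path."""
    t = flight_path_m / beam_speed_m_s
    return 0.5 * G * t ** 2

# A ~500 m/s antihydrogen beam over a 1 m flight path (illustrative).
drop = vertical_drop(1.0, 500.0)   # metres
```

The resulting displacement is of order tens of micrometres, which is why a moiré deflectometer and a position-sensitive emulsion detector, rather than direct imaging, are needed to resolve it.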
Abstract:
This brochure deals with policies and policy instruments needed to promote sustainable development in mountain areas. The first part presents an overview of key issues in mountain development, and principles and strategies that should be adopted. Each principle contains a checklist for policy-makers. The second part presents national and regional case studies of successful approaches and initiatives relating to mountain policy from all over the world. The brochure concludes with a call for multi-level initiatives and partnerships. This full-colour publication is part of the Mountains of the World series. It was prepared for the 2002 World Summit on Sustainable Development in Johannesburg by an international panel of experts coordinated by CDE. It was commissioned and funded by the Swiss Agency for Development and Cooperation (SDC).