Abstract:
In the inherently anarchic international system, the validity of moral principles is weakening. To overcome anarchy, global governance is needed: efficient international institutions, but also pressure from global civil society and the self-regulation of business. Multinational firms have a duty to cooperate in governance systems. They also have a duty to reconcile in their activity the two equally legitimate claims of universalism and cultural relativism, i.e., to apply universal moral principles while respecting local moral norms. Finally, multinationals must be guided by the principle of enhanced responsibility. However, although globalizing efforts are important in overcoming international anarchy and coordinating the protection of the global commons, strong arguments support the notion that economic globalization does not promote sustainable development. Some form of localization of the economy is certainly needed. The challenge is to find a way towards more global governance with less economic globalization.
Abstract:
There is a long debate (going back to Keynes) about how to interpret the concept of probability in economics, in business decisions, and in finance. Iván Bélyácz suggested that the Black–Scholes–Merton analysis of financial derivatives has a contribution to make to this risk vs. uncertainty debate. This article interprets that suggestion from the viewpoint of traded options, real options, the Arrow–Debreu model, the Heath–Jarrow–Morton model, and the insurance business. The article suggests making a clear distinction, and using different names, for the cases when the frequentist approach and statistics are relevant; when we merely use consistent relative weights during no-arbitrage pricing, and these weights are only interpreted as probabilities; and when we simply lack the necessary information, and there is a basic uncertainty in the business decision-making process. The paper suggests drawing a sharp distinction between financial derivatives used for market risk management and credit-risk derivatives (CDOs, CDSs, etc.) in the re-regulation process of the financial markets.
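To make the middle case concrete (weights used in no-arbitrage pricing that are merely interpreted as probabilities), here is a minimal one-step binomial sketch; the numbers and variable names are illustrative assumptions, not the article's:

```python
# Minimal sketch: one-step binomial option pricing. The "risk-neutral" weight
# q is fixed by no-arbitrage alone; it is not a frequentist probability of an
# up move. All parameter values below are illustrative assumptions.
import math

S0, K = 100.0, 100.0   # spot price and strike
u, d = 1.2, 0.8        # up and down factors for the single step
r, dt = 0.05, 1.0      # risk-free rate and step length (years)

# Chosen so the discounted expected stock price equals today's price.
q = (math.exp(r * dt) - d) / (u - d)

payoff_up = max(S0 * u - K, 0.0)
payoff_down = max(S0 * d - K, 0.0)
price = math.exp(-r * dt) * (q * payoff_up + (1 - q) * payoff_down)
print(f"risk-neutral weight q = {q:.4f}, call price = {price:.4f}")
```

The same market data price the option identically whatever the "true" chance of the up move, which is exactly why these weights deserve a different name than statistical probabilities.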
Abstract:
Last year economists marked the centenary of the birth of the Nobel laureate Milton Friedman. The commemoration was especially topical because the world financial crisis that erupted in 2008 has brought sharply into focus again the old division in 20th-century economics between the monetarism associated with Friedman's name and the Keynesianism of Keynes and his followers. One highlight in this series of disputes was the 2009 clash between two internationally known and appreciated economists, Tim Congdon and Robert (Lord) Skidelsky, in the columns of Standpoint. The central element in the discussion is the role of money: what kind of economic policy to pursue, monetary or fiscal, to pull troubled economies out of crisis. The question closely resembles a decisive dilemma for Keynes in the 1930s. Though Keynes turned against some basic propositions of neoclassical economics, he never challenged the importance of money to the functioning of the economy, or the validity of the quantity theory of money. The author argues here that the issue is not about the formal category of money or the demand for it, but about the far deeper economic concept of the role of uncertainty in economics, as well as about practical questions of economic policy: the relative efficiency, strengths, and weaknesses of monetary and fiscal policies. In this still-ongoing debate the pendulum has swung several times, now in favor of the Keynesians, now of the monetarists, but nothing has yet been settled.
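For readers outside the field, the quantity theory mentioned above is conventionally summarized by the equation of exchange; this standard textbook formulation is added for orientation and is not quoted from the article:

```latex
% Equation of exchange (standard textbook form, not from the article):
% M = money supply, V = velocity of circulation, P = price level,
% Q = real output.
\[
  M V = P Q
\]
```

Monetarists read V as stable, so that changes in M drive nominal income; the Keynesian objection, central to the uncertainty theme above, is that V can shift unpredictably when confidence collapses.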
Abstract:
Significant improvements have been made in estimating gross primary production (GPP), ecosystem respiration (R), and net ecosystem production (NEP) from diel, "free-water" changes in dissolved oxygen (DO). Here we evaluate some of the assumptions and uncertainties that are still embedded in the technique and provide guidelines on how to estimate reliable metabolic rates from high-frequency sonde data. True whole-system estimates are often not obtained because measurements reflect an unknown zone of influence that varies over space and time. A logging interval of 30 min was sufficient to capture metabolism at the daily time scale. Higher sampling frequencies capture additional pattern in the DO data, primarily related to physical mixing. Causes behind the often large daily variability are discussed and evaluated for an oligotrophic and a eutrophic lake. Despite a 3-fold higher day-to-day variability in absolute GPP rates in the eutrophic lake, both lakes required at least 3 sonde days per week for GPP estimates to be within 20% of the weekly average. A sensitivity analysis evaluated uncertainties associated with DO measurements, piston velocity (k), and the assumption that daytime R equals nighttime R. In low productivity lakes, uncertainty in DO measurements and piston velocity strongly impacts R but has no effect on GPP or NEP. Lack of accounting for higher R during the day underestimates R and GPP but has no effect on NEP. Finally, we provide suggestions for future research to improve the technique.
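As a concrete reading of the bookkeeping involved, here is a minimal sketch of daily free-water metabolism estimation; the function, variable names, and day/night treatment of R are illustrative assumptions, not the authors' code:

```python
# Sketch of daily free-water metabolism bookkeeping from diel DO data
# (illustrative only; names and structure are assumptions).
import numpy as np

def daily_metabolism(do_mgL, do_sat_mgL, is_day, k_m_per_step, z_mix_m):
    """Estimate GPP, R, NEP (mg O2 L^-1 d^-1) from one day of evenly
    spaced DO readings.

    do_mgL       : measured dissolved oxygen per time step
    do_sat_mgL   : saturation DO at ambient temperature/pressure
    is_day       : boolean array, True for daylight steps
    k_m_per_step : piston velocity per time step (m/step)
    z_mix_m      : mixed-layer depth (m)
    """
    d_do = np.diff(do_mgL)                          # DO change per step
    flux = k_m_per_step * (do_sat_mgL[:-1] - do_mgL[:-1]) / z_mix_m
    nep_step = d_do - flux                          # biological signal
    day = is_day[:-1]

    r_step = -nep_step[~day].mean()                 # nighttime decline -> R
    R = r_step * len(nep_step)                      # assumes day R = night R
    NEP = nep_step.sum()
    GPP = NEP + R                                   # by definition NEP = GPP - R
    return GPP, R, NEP
```

The second-to-last line encodes the assumption evaluated in the abstract: if daytime R actually exceeds nighttime R, both R and GPP are underestimated while NEP is unchanged.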
Abstract:
Recent attention has focused on the high rates of annual carbon sequestration in vegetated coastal ecosystems—marshes, mangroves, and seagrasses—that may be lost with habitat destruction ('conversion'). Relatively unappreciated, however, is that conversion of these coastal ecosystems also impacts very large pools of previously-sequestered carbon. Residing mostly in sediments, this 'blue carbon' can be released to the atmosphere when these ecosystems are converted or degraded. Here we provide the first global estimates of this impact and evaluate its economic implications. Combining the best available data on global area, land-use conversion rates, and near-surface carbon stocks in each of the three ecosystems, using an uncertainty-propagation approach, we estimate that 0.15–1.02 Pg (billion tons) of carbon dioxide are being released annually, several times higher than previous estimates that account only for lost sequestration. These emissions are equivalent to 3–19% of those from deforestation globally, and result in economic damages of $US 6–42 billion annually. The largest sources of uncertainty in these estimates stem from limited certitude in global area and rates of land-use conversion, but research is also needed on the fates of ecosystem carbon upon conversion. Currently, carbon emissions from the conversion of vegetated coastal ecosystems are not included in emissions accounting or carbon market protocols, but this analysis suggests they may be disproportionately important to both. Although the relevant science supporting these initial estimates will need to be refined in coming years, it is clear that policies encouraging the sustainable management of coastal ecosystems could significantly reduce carbon emissions from the land-use sector, in addition to sustaining the well-recognized ecosystem services of coastal habitats.
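As an illustration of the uncertainty-propagation approach described (a Monte Carlo over the uncertain inputs the abstract names), here is a sketch for a single ecosystem; every range below is a placeholder assumption, not the study's data:

```python
# Illustrative Monte Carlo propagation over the three uncertain inputs named
# in the abstract: global area, conversion rate, and near-surface carbon
# stock (plus the fraction oxidized). All ranges are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

area_Mha  = rng.uniform(13.8, 15.2, n)    # global area, Mha (assumed range)
conv_rate = rng.uniform(0.004, 0.02, n)   # fraction converted per year (assumed)
stock_tC  = rng.uniform(100, 300, n)      # t C per ha near-surface (assumed)
frac_lost = rng.uniform(0.25, 1.0, n)     # fraction oxidized on conversion (assumed)

# ha * (fraction/yr) * (t C/ha) * fraction -> t C/yr; * 44/12 -> t CO2; /1e9 -> Pg
emis_PgCO2 = area_Mha * 1e6 * conv_rate * stock_tC * frac_lost * (44 / 12) / 1e9

lo, hi = np.percentile(emis_PgCO2, [2.5, 97.5])
print(f"annual emissions: {lo:.3f}-{hi:.3f} Pg CO2 (95% interval)")
```

Repeating this per ecosystem and summing the draws yields the kind of pooled range (0.15–1.02 Pg CO2) the abstract reports.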
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data in the form of time-ordered series of events. Such applications must also handle large volumes of unexpected events, often modified on the fly and containing conflicting information, while dealing with rapidly changing contexts and producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered: events are no longer instantaneous, but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is argued to converge faster and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed, and wireless technologies in contemporary and future computer networks.
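The abstract does not specify the fusion formulas; as one concrete, commonly used reading of a belief value generated by consensus, here is a minimal Dempster-style combination of two network nodes' belief masses over whether a composite event occurred. This is an illustrative sketch under that assumption, not the dissertation's actual scheme; all names and numbers are invented:

```python
# Minimal Dempster-style fusion of two nodes' basic belief masses over the
# frame {event, no_event}, with "unknown" mass on the whole frame.
# Illustration of belief-theoretic consensus, not the dissertation's scheme.
from itertools import product

def combine(m1, m2):
    """Combine two mass functions keyed by frozenset hypotheses."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # incompatible hypotheses
    # Normalize out the conflicting mass (Dempster's rule).
    return {h: w / (1 - conflict) for h, w in combined.items()}

E, N = frozenset({"event"}), frozenset({"no_event"})
ALL = E | N
node1 = {E: 0.6, N: 0.1, ALL: 0.3}   # node 1: fairly sure it saw the event
node2 = {E: 0.4, N: 0.2, ALL: 0.4}   # node 2: less certain
print(combine(node1, node2))         # consensus belief over the frame
```

A scheme like the one described would additionally weight such masses by temporal overlap of the (non-instantaneous) detected events before combining them.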
Abstract:
Recent studies suggest that coastal ecosystems can bury significantly more C than tropical forests, indicating that continued coastal development and exposure to sea level rise and storms will have global biogeochemical consequences. The Florida Coastal Everglades Long Term Ecological Research (FCE LTER) site provides an excellent subtropical system for examining carbon (C) balance because of its exposure to historical changes in freshwater distribution and sea level rise and its history of significant long-term carbon-cycling studies. FCE LTER scientists used net ecosystem C balance and net ecosystem exchange data to estimate C budgets for riverine mangrove, freshwater marsh, and seagrass meadows, providing insights into the magnitude of C accumulation and lateral aquatic C transport. Rates of net C production in the riverine mangrove forest exceeded those reported for many tropical systems, including terrestrial forests, but there are considerable uncertainties around those estimates due to the high potential for gain and loss of C through aquatic fluxes. C production was approximately balanced between gain and loss in Everglades marshes; however, the contribution of periphyton increases uncertainty in these estimates. Moreover, while the approaches used for these initial estimates were informative, a resolved approach for addressing areas of uncertainty is critically needed for coastal wetland ecosystems. Once resolved, these C balance estimates, in conjunction with an understanding of drivers and key ecosystem feedbacks, can inform cross-system studies of ecosystem response to long-term changes in climate, hydrologic management, and other land use along coastlines.
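For orientation only, one common bookkeeping behind such budgets relates the net ecosystem carbon balance to production, respiration, and lateral aquatic exchange; the paper's own budget may partition the terms differently:

```latex
% Generic coastal-wetland carbon balance bookkeeping (standard form, offered
% for orientation; not quoted from the paper):
% NECB  = net ecosystem carbon balance
% GPP   = gross primary production, ER = ecosystem respiration
% F_lat = net lateral aquatic carbon import (+) or export (-)
\[
  \mathrm{NECB} = \mathrm{GPP} - \mathrm{ER} + F_{\mathrm{lat}}
\]
```

The abstract's point about aquatic fluxes maps onto the F_lat term: in riverine mangroves it is large and poorly constrained, so it dominates the uncertainty in NECB.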
Abstract:
The northern Antarctic Peninsula is one of the fastest changing regions on Earth. The disintegration of the Larsen-A Ice Shelf in 1995 caused tributary glaciers to adjust by speeding up, lowering their surfaces, and increasing overall ice-mass discharge. In this study, we investigate the temporal variation of these changes at the Dinsmoor-Bombardier-Edgeworth glacier system by analyzing dense time series from various spaceborne and airborne Earth observation missions, covering precollapse ice shelf conditions and subsequent adjustments through 2014. Our results show a response of the glacier system some months after the breakup, reaching maximum surface velocities at the glacier front of up to 8.8 m/d in 1999 and a subsequent decrease to ~1.5 m/d in 2014. Using a dense time series of interferometrically derived TanDEM-X digital elevation models and photogrammetric data, an exponential function was fitted to the decrease in surface elevation. Elevation changes in areas below 1000 m a.s.l. amounted to at least 130 ± 15 m between 1995 and 2014, with change rates of ~3.15 m/a between 2003 and 2008. Current change rates (2010-2014) are in the range of 1.7 m/a. Mass imbalances were computed with different scenarios of boundary conditions. The most plausible results amount to -40.7 ± 3.9 Gt. The contribution to sea level rise was estimated to be 18.8 ± 1.8 Gt, corresponding to a 0.052 ± 0.005 mm sea level equivalent, for the period 1995-2014. Our analysis and scenario considerations revealed that major uncertainties still exist due to insufficiently accurate ice-thickness information. The second largest uncertainty in the computations was the glacier surface mass balance, which is still poorly known. Our time series analysis facilitates an improved comparison with GRACE data and serves as input to modeling of glacio-isostatic uplift in this region. The study contributes to a better understanding of how glacier systems adjust to ice shelf disintegration.
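The exponential fit mentioned above can be reproduced generically as a nonlinear least-squares problem; the function form, synthetic data, and starting values here are assumptions, not the study's:

```python
# Generic exponential-decay fit for post-collapse surface-elevation change
# rates, in the spirit of the analysis described (synthetic data only).
import numpy as np
from scipy.optimize import curve_fit

def elev_change(t, a, tau, c):
    """Change rate: exponential relaxation toward a new steady state."""
    return a * np.exp(-t / tau) + c

t_yr = np.array([0.0, 3.0, 8.0, 13.0, 19.0])       # years since breakup
dh_dt = np.array([-9.0, -5.5, -3.2, -2.0, -1.6])   # m/a, synthetic rates

popt, pcov = curve_fit(elev_change, t_yr, dh_dt, p0=(-8.0, 6.0, -1.0))
perr = np.sqrt(np.diag(pcov))                      # 1-sigma parameter errors
print(dict(zip(["a", "tau", "c"], popt)), perr)
```

The fitted parameter uncertainties from pcov are one way such fits feed the scenario-based error budget the abstract describes.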
Abstract:
This paper determines the capability of two photogrammetric systems in terms of their measurement uncertainty in an industrial context. The first system, V-STARS inca3 from Geodetic Systems Inc., is a commercially available measurement solution. The second system comprises an off-the-shelf Nikon D700 digital camera fitted with a 28 mm Nikkor lens and the research-based Vision Measurement Software (VMS). The uncertainty estimates of the two systems are determined with reference to a calibrated constellation of points measured by a Leica AT401 laser tracker. The calibrated points have an average associated standard uncertainty of 12.4 μm, spanning a maximum distance of approximately 14.5 m. Against this reference, V-STARS inca3 achieved an estimated standard uncertainty of 43.1 μm, outperforming its manufacturer's specification; the D700/VMS combination achieved a standard uncertainty of 187 μm.
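A sketch of how such a standard-uncertainty figure can be derived from residuals against the laser-tracker reference; the alignment step is assumed already done, and all arrays are synthetic:

```python
# Sketch: estimating a photogrammetric system's standard uncertainty against
# laser-tracker reference coordinates (synthetic arrays; illustrative only).
import numpy as np

rng = np.random.default_rng(1)
ref_xyz = rng.uniform(0.0, 14.5, size=(50, 3))          # reference points (m)
meas_xyz = ref_xyz + rng.normal(0.0, 45e-6, size=(50, 3))  # measured (m)

# Per-coordinate residuals after (assumed) best-fit alignment of the two
# coordinate frames; their standard deviation estimates the system's
# standard uncertainty.
resid = meas_xyz - ref_xyz
std_uncertainty_um = resid.std(ddof=1) * 1e6
print(f"estimated standard uncertainty: {std_uncertainty_um:.1f} um")
```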
Abstract:
Students hold a number of personal theories about education that influence motivation and achievement in the classroom: theories about their own abilities, knowledge, and the learning process. College instructors therefore have a great interest in helping students develop adaptive personal theories. The current studies investigated whether specific messages that instructors send in college classrooms might serve as a mechanism of personal theory development. Across 2 studies, 17 college instructors and 401 students completed surveys assessing their personal theories about education at the beginning and end of college courses. Students and instructors reported hearing and sending many messages in the classroom, including instructor help messages, conciliatory messages, uncertainty-in-the-field messages, differential-ability messages, and generalized positive and negative feedback. Between-class and within-class differences in message reports were associated with students' personal theories at the end of their courses, controlling for initial personal theories. Students' initial personal theories were also related to the messages students reported hearing. The findings demonstrate the utility of assessing non-content messages in college classrooms as potential mechanisms for changing students' personal theories in college. Implications for research and practice are discussed.
Abstract:
This work explores the use of statistical methods in describing and estimating camera poses, as well as the information feedback loop between camera pose and object detection. Surging development in robotics and computer vision has pushed the need for algorithms that infer, understand, and utilize information about the position and orientation of the sensor platforms when observing and/or interacting with their environment.
The first contribution of this thesis is the development of a set of statistical tools for representing and estimating the uncertainty in object poses. A distribution for representing the joint uncertainty over multiple object positions and orientations is described, called the mirrored normal-Bingham distribution. This distribution generalizes both the normal distribution in Euclidean space, and the Bingham distribution on the unit hypersphere. It is shown to inherit many of the convenient properties of these special cases: it is the maximum-entropy distribution with fixed second moment, and there is a generalized Laplace approximation whose result is the mirrored normal-Bingham distribution. This distribution and approximation method are demonstrated by deriving the analytical approximation to the wrapped-normal distribution. Further, it is shown how these tools can be used to represent the uncertainty in the result of a bundle adjustment problem.
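For orientation, the Bingham density on the unit hypersphere, one of the two special cases the mirrored normal-Bingham distribution generalizes, has the standard form below; the thesis's own notation for the mirrored generalization may differ:

```latex
% Standard Bingham density on the unit sphere S^{d-1} (orientation only;
% not the thesis's mirrored normal-Bingham notation).
% x in S^{d-1}, M orthogonal, Z diagonal concentration matrix,
% F(Z) the normalizing constant.
\[
  p(\mathbf{x}) \;=\; \frac{1}{F(Z)}\,
  \exp\!\left(\mathbf{x}^{\top} M Z M^{\top} \mathbf{x}\right),
  \qquad \mathbf{x} \in S^{d-1}
\]
```

The antipodal symmetry p(x) = p(-x) of this density is what makes it natural for unit-quaternion orientations, where x and -x represent the same rotation.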
Another application of these methods is illustrated as part of a novel camera pose estimation algorithm based on object detections. The autocalibration task is formulated as a bundle adjustment problem using prior distributions over the 3D points to enforce the objects' structure and their relationship with the scene geometry. This framework is very flexible and enables the use of off-the-shelf computational tools to solve specialized autocalibration problems. Its performance is evaluated using a pedestrian detector to provide head and foot location observations, and it proves much faster and potentially more accurate than existing methods.
Finally, the information feedback loop between object detection and camera pose estimation is closed by utilizing camera pose information to improve object detection in scenarios with significant perspective warping. Methods are presented that allow the inverse perspective mapping traditionally applied to images to be applied instead to features computed from those images. For the special case of HOG-like features, which are used by many modern object detection systems, these methods are shown to provide substantial performance benefits over unadapted detectors while achieving real-time frame rates, orders of magnitude faster than comparable image warping methods.
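As background for the feature-space adaptation described above, traditional image-level inverse perspective mapping is a homography warp; here is a minimal OpenCV sketch (coordinates invented for illustration, and deliberately the image-level version the thesis improves on):

```python
# Traditional image-level inverse perspective mapping via homography warp
# (background illustration; the thesis applies the analogous transform to
# features such as HOG rather than to pixels).
import cv2
import numpy as np

img = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame

# Four ground-plane points in the image and their bird's-eye targets
# (coordinates invented for the sketch).
src = np.float32([[220, 300], [420, 300], [640, 480], [0, 480]])
dst = np.float32([[200, 0], [440, 0], [440, 480], [200, 480]])

H = cv2.getPerspectiveTransform(src, dst)      # 3x3 homography
birdseye = cv2.warpPerspective(img, H, (640, 480))
```

Warping features instead of pixels avoids re-running feature extraction on every warped image, which is where the reported orders-of-magnitude speedup comes from.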
The statistical tools and algorithms presented here are especially promising for mobile cameras, providing the ability to autocalibrate and adapt to the camera pose in real time. In addition, these methods have wide-ranging potential applications in diverse areas of computer vision, robotics, and imaging.
Abstract:
Young infants' learning of words for abstract concepts like 'all gone' and 'eat,' in contrast to their learning of more concrete words like 'apple' and 'shoe,' may follow a relatively protracted developmental course. We examined whether infants know such abstract words. Parents named one of two events shown in side-by-side videos while their 6-16-month-old infants (n=98) watched. On average, infants successfully looked at the named video by 10 months, but not earlier, and infants' looking at the named referent increased robustly at around 14 months. Six-month-olds already understand concrete words in this task (Bergelson & Swingley, 2012). A video-corpus analysis of unscripted mother-infant interaction showed that mothers used the tested abstract words less often in the presence of their referent events than they used concrete words in the presence of their referent objects. We suggest that referential uncertainty in abstract words' teaching conditions may explain the later acquisition of abstract than concrete words, and we discuss the possible role of changes in social-cognitive abilities over the 6-14 month period.
Abstract:
Plant leaf wax hydrogen isotope (δDwax) reconstructions are increasingly being used to reconstruct hydrological change. This approach rests on the assumption that variations in hydroclimatic variables, in particular the isotopic composition of precipitation (δDP), dominate δDwax. However, modern calibration studies suggest that offsets between plant types may bias the δDwax hydrological proxy at times of vegetation change. In this study, I pair leaf wax analyses with published pollen data to quantify this effect and construct the first vegetation-corrected hydrogen isotopic record of precipitation (δDcorrP). In marine sediments from Deep Sea Drilling Project Site 231 in the Gulf of Aden spanning 11.4-3.8 Ma (late Miocene and earliest Pliocene), I find 77 per mil swings in δDwax that correspond to pollen evidence for substantial vegetation change. Similarities between δDP and δDcorrP imply that the hydrological tracer is qualitatively robust to vegetation change. However, computed vegetation corrections can be as large as 31 per mil, indicating substantial quantitative uncertainty in the raw hydrological proxy. The resulting δDcorrP values quantify hydrological change and identify times considerably wetter than modern at 11.09, 7.26, 5.71 and 3.89 Ma. More generally, this novel interpretative framework lays the foundations of improved quantitative paleohydrological reconstructions with the δDwax proxy in contexts where vegetation change may bias the plant-based proxy. The vegetation-corrected paleoprecipitation reconstruction, δDcorrP, represents the best available proof-of-concept estimate, for an approach that I hope will be refined and more broadly applied.
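Schematically, the vegetation correction combines pollen-derived plant-type fractions with plant-type apparent fractionations; this toy linearized version uses invented values, not the paper's calibration:

```python
# Toy vegetation correction for a leaf-wax dD record (illustrative values;
# the paper's plant-type fractionations and pollen fractions differ, and the
# linear delta-notation mixing below is an approximation).
def vegetation_corrected_dDp(dD_wax, fractions, epsilons):
    """Invert dD_wax ~= dD_p + sum_i f_i * eps_i for precipitation dD_p.

    fractions : plant-type fractions from pollen (should sum to 1)
    epsilons  : apparent fractionation (per mil) for each plant type
    """
    eps_mix = sum(f * e for f, e in zip(fractions, epsilons))
    return dD_wax - eps_mix

# Example: 60% C3 trees/shrubs, 40% grasses (assumed apparent fractionations).
print(vegetation_corrected_dDp(-150.0, [0.6, 0.4], [-121.0, -135.0]))
```

Because eps_mix shifts whenever the pollen fractions shift, uncorrected δDwax conflates vegetation change with hydrological change, which is exactly the bias the correction removes.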
Abstract:
The compositions of natural glasses and phenocrysts in basalts from Deep Sea Drilling Project Sites 501, 504, and 505, near the Costa Rica Rift, constitute evidence for the existence of a periodically replenished axial magma chamber that repeatedly erupted lavas of remarkably uniform composition. Magma compositions were affected by three general components: (1) injected magmas carrying (in decreasing order of abundance) plagioclase, olivine, and chrome-spinel phenocrysts (spinel assemblage); (2) injected magmas carrying plagioclase, clinopyroxene, and olivine phenocrysts, but no spinel (clinopyroxene assemblage); and (3) moderately evolved hybrids in the magma chamber itself. The compositions of the injected phenocrysts and minerals in glomerocrysts are as follows: plagioclase, An85-94; olivine, Fo87-89; clinopyroxene, high-Cr2O3 (0.7-1.1%) endiopside (Wo42En51Fs7); and aluminous chromian spinel (Cr/(Cr + Al) = 0.3). These minerals resemble those thought to occur in upper mantle sources (9 kbars and less) of ocean-ridge basalts and to crystallize in magmas near those sources. In the magma chamber, more sodic plagioclase (An79-85), less magnesian olivine (Fo81-86), and low-Cr2O3 (0.1-0.4%) clinopyroxene formed rims on these crystals, grew as other phenocrysts, and formed cumulus segregations on the walls and floors of the magma chamber. In the spinel-assemblage magmas, magnesiochromite (Cr/(Cr + Al) = 0.4-0.5) also formed. Some cumulus segregations were later entrained in lavas as xenoliths. The glass compositions define 16 internally homogeneous eruptive units, 13 of which are in stratigraphic order in a single hole, Hole 504B, which was drilled 561.5 meters into the ocean crust. These units are defined as differing from each other by more than analytical uncertainty in one or more oxides. However, many of the glass groups in Hole 504B show virtually no differences in TiO2 contents, Mg/(Mg + Fe2+), or normative An/(An + Ab), all of which are sensitive indicators of crystallization differentiation. The differences are so small that they are only apparent in the glass compositions; they are almost completely obscured in whole-rock samples by the presence of phenocrysts and the effects of alteration. Moreover, several of the glass units at different depths in Hole 504B are compositionally identical, with all oxides falling within the range of analytical uncertainty, with only small variations in the rest of the suite. The repetition of identical chemical types requires (1) very regular injection of magmas into the magma chamber, (2) extreme similarity of injected magmas, and (3) displacement of very nearly the same proportion of the magmas in the chamber at each injection. Numerical modeling and thermal considerations have led some workers to propose the existence of such conditions at certain types of spreading centers, but the lava and glass compositions at Hole 504B represent the first direct evidence revealed by drilling of the existence of a compositionally nearly steady-state magma chamber, and this chapter examines the processes acting in it in some detail. The glass groups that are most similar are from clinopyroxene-assemblage lavas, which have a range of Mg/(Mg + Fe2+) of 0.59 to 0.65. Spinel-assemblage basalts are less evolved, with Mg/(Mg + Fe2+) of 0.65 to 0.69, but both types have nearly identical normative An/(An + Ab) (0.65-0.66). However, the two lava types contain megacrysts (olivine, plagioclase, clinopyroxene) that crystallized from melts with Mg/(Mg + Fe2+) values of 0.70 to 0.72.
Projection of glass compositions into ternary normative systems suggests that spinel-assemblage magmas originated deeper in the mantle than clinopyroxene-assemblage magmas, and mineral data indicate that the two types followed different fractionation paths before reaching the magma chamber. The two magma types therefore represent neither a low- nor a high-pressure fractionation sequence. Some of the spinel-assemblage magmas may have had picritic parents, but were coprecipitating all of the spinel-assemblage phenocrysts before reaching the magma chamber. Clinopyroxene-assemblage magmas did not have picritic parents, but the compositions of phenocrysts suggest that they originated at about 9 kbars, near the transition between plagioclase peridotite and spinel peridotite in the mantle. Two glass groups have higher contents of alkalis, TiO2, and P2O5 than the others, evidently as a result of the compositions of mantle sources. Eruption of these lavas implies that conduits and chambers containing magmas from dissimilar sources were not completely interconnected on the Costa Rica Rift. The data are used to draw comparisons with the East Pacific Rise and to consider the mechanisms that may have prevented the eruption of ferrobasalts at these sites.