6 results for Gap statistics

at Duke University


Relevance:

20.00%

Publisher:

Abstract:

The objective of spatial downscaling strategies is to increase the information content of coarse datasets at smaller scales. In the case of quantitative precipitation estimation (QPE) for hydrological applications, the goal is to close the scale gap between the spatial resolution of coarse datasets (e.g., gridded satellite precipitation products at resolution L × L) and the high resolution (l × l; L ≫ l) necessary to capture the spatial features that determine spatial variability of water flows and water stores in the landscape. In essence, the downscaling process consists of weaving subgrid-scale heterogeneity over a desired range of wavelengths in the original field. The defining question is, which properties, statistical and otherwise, of the target field (the known observable at the desired spatial resolution) should be matched, with the caveat that downscaling methods be as general as possible and therefore ideally without case-specific constraints and/or calibration requirements? Here, the attention is focused on two simple fractal downscaling methods using iterated function systems (IFS) and fractal Brownian surfaces (FBS) that meet this requirement. The two methods were applied to spatially disaggregate 27 summertime convective storms in the central United States during 2007 at three consecutive times (1800, 2100, and 0000 UTC, thus 81 fields overall) from the Tropical Rainfall Measuring Mission (TRMM) version 6 (V6) 3B42 precipitation product (~25-km grid spacing) to the same resolution as the NCEP stage IV products (~4-km grid spacing). Results from bilinear interpolation are used as the control. A fundamental distinction between IFS and FBS is that the latter implies a distribution of downscaled fields and thus an ensemble solution, whereas the former provides a single solution. The downscaling effectiveness is assessed using fractal measures (the spectral exponent β, fractal dimension D, Hurst coefficient H, and roughness amplitude R) and traditional operational skill scores [false alarm rate (FR), probability of detection (PD), threat score (TS), and Heidke skill score (HSS)], as well as bias and the root-mean-square error (RMSE). The results show that both IFS and FBS fractal interpolation perform well with regard to operational skill scores, and they meet the additional requirement of generating structurally consistent fields. Furthermore, confidence intervals can be directly generated from the FBS ensemble. The results were used to diagnose errors relevant for hydrometeorological applications, in particular a spatial displacement with characteristic length of at least 50 km (2500 km²) in the location of peak rainfall intensities for the cases studied. © 2010 American Meteorological Society.
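For context, the operational skill scores listed above are standard contingency-table statistics computed after thresholding both the downscaled and the reference field into rain/no-rain classes. The sketch below shows one plausible way to compute them; the 0.1 mm/h threshold, the array names, and the particular false-alarm definition are illustrative assumptions, not details taken from the study.

```python
import numpy as np

def categorical_scores(downscaled, reference, threshold=0.1):
    """Contingency-table skill scores for a downscaled precipitation field
    evaluated against a high-resolution reference field.
    Sketch only: the 0.1 mm/h rain/no-rain threshold is an assumption."""
    est = downscaled >= threshold        # rain indicated by the downscaled field
    obs = reference >= threshold         # rain present in the reference field

    hits         = np.sum(est & obs)     # a: rain in both
    false_alarms = np.sum(est & ~obs)    # b: rain only in the downscaled field
    misses       = np.sum(~est & obs)    # c: rain only in the reference
    correct_negs = np.sum(~est & ~obs)   # d: no rain in either

    pod = hits / (hits + misses)                   # probability of detection (PD)
    # False-alarm statistic defined here as b / (a + b); some studies
    # instead use b / (b + d) -- check the convention of the source.
    far = false_alarms / (hits + false_alarms)     # false alarm rate (FR)
    ts  = hits / (hits + misses + false_alarms)    # threat score (TS)
    hss = (2.0 * (hits * correct_negs - false_alarms * misses) /
           ((hits + misses) * (misses + correct_negs) +
            (hits + false_alarms) * (false_alarms + correct_negs)))  # Heidke skill score (HSS)

    bias = downscaled.mean() - reference.mean()            # one common bias definition
    rmse = np.sqrt(np.mean((downscaled - reference) ** 2)) # root-mean-square error
    return dict(PD=pod, FR=far, TS=ts, HSS=hss, bias=bias, RMSE=rmse)
```

For the FBS ensemble, the same scores could be computed per member to obtain a spread alongside the confidence intervals mentioned in the abstract.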

Relevance:

20.00%

Publisher:

Abstract:

We introduce a class of optical media based on adiabatically modulated, dielectric-only, and potentially extremely low-loss, photonic crystals (PC). The media we describe represent a generalization of the eikonal limit of transformation optics (TO). The basis of the concept is the possibility of fitting some equal frequency surfaces of certain PCs with elliptic surfaces, allowing them to mimic the dispersion relation of light in anisotropic effective media. PC cloaks and other TO devices operating at visible wavelengths can be constructed from optically transparent substances such as glasses, whose attenuation coefficient can be as small as 10 dB/km, suggesting the TO design methodology can be applied to the development of optical devices not limited by the losses inherent to metal-based, passive metamaterials.
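For orientation (not drawn from the paper itself), an elliptic equal-frequency contour in two dimensions has the generic form of the dispersion relation of a wave in a homogeneous anisotropic dielectric, with n_1 and n_2 below standing in for effective principal refractive indices (which index appears under which wavevector component depends on polarization and crystal orientation):

\[
\frac{k_x^2}{n_2^2} + \frac{k_y^2}{n_1^2} = \frac{\omega^2}{c^2}
\]

A photonic crystal whose equal-frequency surface at the operating frequency can be fitted by such an ellipse therefore refracts rays, in the eikonal limit, as if it were an anisotropic effective medium, which is the property the proposed TO designs exploit.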

Relevance:

20.00%

Publisher:

Abstract:

Nolan and Temple Lang argue that “the ability to express statistical computations is an essential skill.” A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R Markdown is a new technology that makes creating fully-reproducible statistical analysis simple and painless. It provides a solution suitable not only for cutting-edge research, but also for use in an introductory statistics course. We present experiential and statistical evidence that R Markdown can be used effectively in introductory statistics courses, and discuss its role in the rapidly-changing world of statistical computation.

Relevance:

20.00%

Publisher:

Abstract:

Claims of injustice in global forest governance are prolific: assertions of colonization, marginalization and disenfranchisement of forest-dependent people, and privatization of common resources are some of the most severe allegations of injustice resulting from globally-driven forest conservation initiatives. At its core, the debate over the future of the world's forests is fraught with ethical concerns. Policy makers are not only deciding how forests should be governed, but also who will be winners, losers, and who should have a voice in the decision-making processes. For 30 years, policy makers have sought to redress the concerns of the world's 1.6 billion forest-dependent poor by introducing rights-based and participatory approaches to conservation. Despite these efforts, however, claims of injustice persist. This research examines possible explanations for continued claims of injustice by asking: What are the barriers to delivering justice to forest-dependent communities? Using data collected through surveys, interviews, and collaborative event ethnography in Laos and at the Tenth Conference of the Parties to the Convention on Biological Diversity, this dissertation examines the pursuit of justice in global forest governance across multiple scales of governance. The findings reveal that particular conceptualizations of justice have become a central part of the metanormative fabric of global environmental governance, inhibiting institutional evolution and thereby perpetuating the justice gap in global forest governance.

Relevance:

20.00%

Publisher:

Abstract:

Slowly-compressed single crystals, bulk metallic glasses (BMGs), rocks, granular materials, and the Earth all deform via intermittent slips or "quakes". We find that although these systems span 12 decades in length scale, they all show the same scaling behavior for their slip size distributions and other statistical properties. Remarkably, the size distributions follow the same power law multiplied by the same exponential cutoff. The cutoff grows with applied force for materials spanning length scales from nanometers to kilometers. The tunability of the cutoff with stress reflects "tuned critical" behavior, rather than self-organized criticality (SOC), which would imply stress-independence. A simple mean field model for avalanches of slipping weak spots explains the agreement across scales. It predicts the observed slip-size distributions and the observed stress-dependent cutoff function. The results enable extrapolations from one scale to another, and from one force to another, across different materials and structures, from nanocrystals to earthquakes.
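Stated as a formula (with symbols chosen here for illustration rather than taken from the paper), the reported scaling form for the slip-size distribution is a power law times an exponential cutoff whose scale grows with the applied force F:

\[
D(S) \;\propto\; S^{-\kappa}\, \exp\!\left(-\frac{S}{S_{\max}(F)}\right), \qquad \frac{\mathrm{d}S_{\max}}{\mathrm{d}F} > 0 .
\]

In the simple mean-field picture invoked in the abstract, the exponent κ takes the universal value 3/2 (a standard mean-field result quoted here for context), and it is the stress dependence of S_max, rather than its absence, that distinguishes tuned criticality from SOC.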