12 results for Robust estimates

at Duke University


Relevance: 30.00%

Abstract:

Human use of the oceans is increasingly in conflict with conservation of endangered species. Methods for managing the spatial and temporal placement of industries such as military, fishing, transportation and offshore energy have historically been post hoc; i.e., the time and place of human activity is often already determined before assessment of environmental impacts. In this dissertation, I build robust species distribution models in two case study areas, the U.S. Atlantic (Best et al. 2012) and British Columbia (Best et al. 2015), predicting presence and abundance, respectively, from scientific surveys. These models are then applied to novel decision frameworks for preemptively suggesting optimal placement of human activities in space and time to minimize ecological impacts: siting for offshore wind energy development, and routing ships to minimize the risk of striking whales. Both decision frameworks present the tradeoff between conservation risk and industry profit with synchronized variable and map views as online spatial decision support systems.

For siting offshore wind energy development (OWED) in the U.S. Atlantic (chapter 4), bird density maps are combined across species, weighted by each species' sensitivity to OWED collision and displacement, and 10 km² sites are compared against OWED profitability based on average annual wind speed at 90 m hub height and distance to the transmission grid. A spatial decision support system enables toggling between the map and tradeoff plot views by site. A selected site can be inspected for sensitivity to cetaceans throughout the year, so as to identify the months that minimize episodic impacts of pre-operational activities such as seismic airgun surveying and pile driving.
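
The weighted overlay at the core of this siting framework can be sketched in a few lines. All species names, densities, sensitivity weights, and the profitability proxy below are invented for illustration; they are not values or formulas from the dissertation:

```python
import numpy as np

# Hypothetical per-species bird density rasters (birds/km^2) on a grid of
# 10 km^2 sites, and per-species OWED sensitivity weights covering
# collision and displacement; all values are invented.
density = {
    "species_a": np.array([[2.0, 5.0], [1.0, 3.0]]),
    "species_b": np.array([[4.0, 1.0], [6.0, 2.0]]),
}
sensitivity = {"species_a": 0.8, "species_b": 0.5}

# Composite conservation-risk surface: sensitivity-weighted sum of densities.
risk = sum(sensitivity[sp] * d for sp, d in density.items())

# Toy profitability proxy from wind speed at 90 m hub height, penalized by
# distance to the transmission grid (arbitrary units, not the real model).
wind_speed = np.array([[8.5, 9.0], [7.0, 8.0]])    # m/s
grid_dist = np.array([[10.0, 40.0], [5.0, 20.0]])  # km
profit = wind_speed**3 - 0.5 * grid_dist

# Each site is one point in the (risk, profit) tradeoff plot.
for (i, j), r in np.ndenumerate(risk):
    print(f"site ({i},{j}): risk={r:.2f}, profit={profit[i, j]:.1f}")
```

Each grid cell then becomes one point in the risk-versus-profit tradeoff plot, which is the view the decision support system synchronizes with the map.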

Routing ships to avoid whale strikes (chapter 5) can similarly be viewed as a tradeoff, but is a different problem spatially. A cumulative cost surface is generated from density surface maps and the conservation status of cetaceans, then applied as a resistance surface to calculate least-cost routes between start and end locations, i.e., ports and entrance points to the study areas. Varying a multiplier on the cost surface enables calculation of multiple routes that differ in their cost to cetacean conservation versus their cost to the transportation industry, measured as distance. As in the siting chapter, a spatial decision support system enables toggling between the map and tradeoff plot views of proposed routes. The user can also input arbitrary start and end locations to calculate the tradeoff on the fly.
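
A minimal sketch of the least-cost routing idea, assuming a 4-connected grid with a toy conservation-cost surface (this is not the dissertation's implementation, which operates on real density surface maps):

```python
import heapq

# Grid cells hold a hypothetical conservation cost (e.g. a weighted
# cetacean density). Each step costs 1 unit of distance plus `m` times
# the destination cell's conservation cost; varying the multiplier `m`
# trades route length against ecological risk.
cons = [
    [0, 5, 0, 0],
    [0, 5, 0, 0],
    [0, 5, 5, 0],
    [0, 0, 0, 0],
]

def least_cost_route(cons, start, end, m):
    """Dijkstra over a 4-connected grid; returns (distance, conservation cost)."""
    rows, cols = len(cons), len(cons[0])
    best = {start: 0.0}
    pq = [(0.0, 0, 0.0, start)]  # (combined cost, distance, cons cost, cell)
    while pq:
        c, d, k, (i, j) = heapq.heappop(pq)
        if (i, j) == end:
            return d, k
        if c > best.get((i, j), float("inf")):
            continue  # stale queue entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < rows and 0 <= nj < cols:
                step = 1.0 + m * cons[ni][nj]
                if c + step < best.get((ni, nj), float("inf")):
                    best[(ni, nj)] = c + step
                    heapq.heappush(pq, (c + step, d + 1, k + cons[ni][nj], (ni, nj)))
    return None

# A small multiplier tolerates crossing high-density cells; a large one
# detours around them, lengthening the route but lowering ecological risk.
for m in (0.1, 10.0):
    dist, risk = least_cost_route(cons, (0, 0), (0, 3), m)
    print(f"m={m}: distance={dist}, conservation cost={risk}")
```

Sweeping the multiplier and recording (distance, conservation cost) for each resulting route traces out the tradeoff curve shown in the plot view.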

Essential inputs to these decision frameworks are the distributions of the species. The two preceding chapters comprise species distribution models from the two case study areas, U.S. Atlantic (chapter 2) and British Columbia (chapter 3), predicting presence and density, respectively. Although density is preferred for estimating potential biological removal, per Marine Mammal Protection Act requirements in the U.S., the parameters needed to estimate it, especially distance and angle of observation, are less readily available across publicly mined datasets.

In the case of predicting cetacean presence in the U.S. Atlantic (chapter 2), I extracted datasets from the online OBIS-SEAMAP geo-database and integrated scientific surveys conducted by ship (n=36) and aircraft (n=16), weighting a Generalized Additive Model by minutes surveyed within space-time grid cells to harmonize effort between the two survey platforms. For each of 16 cetacean species guilds, I predicted the probability of occurrence from static environmental variables (water depth, distance to shore, distance to continental shelf break) and time-varying conditions (monthly sea-surface temperature). To generate maps of presence vs. absence, Receiver Operating Characteristic (ROC) curves were used to define the optimal threshold that minimizes false positive and false negative error rates. I integrated model outputs, including tables (species in guilds, input surveys) and plots (fit of environmental variables, ROC curve), into an online spatial decision support system, allowing for easy navigation of models by taxon, region, season, and data provider.
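
The thresholding step can be illustrated with a small sketch. The predicted probabilities and observations below are made up, and Youden's J statistic is used as the criterion that jointly minimizes false positive and false negative error rates (the GAM fitting itself is not reproduced here):

```python
import numpy as np

# Hypothetical predicted occurrence probabilities `p` from a fitted model
# and observed presences (1) / absences (0) `y`; values are illustrative.
y = np.array([0, 0, 0, 1, 0, 1, 1, 0, 1, 1])
p = np.array([0.1, 0.2, 0.3, 0.35, 0.4, 0.6, 0.7, 0.45, 0.8, 0.9])

# Sweep candidate thresholds along the ROC curve; the optimal threshold
# maximizes Youden's J = sensitivity - false positive rate.
best_t, best_j = None, -1.0
for t in np.unique(p):
    pred = p >= t
    tpr = (pred & (y == 1)).sum() / (y == 1).sum()   # sensitivity
    fpr = (pred & (y == 0)).sum() / (y == 0).sum()   # 1 - specificity
    if tpr - fpr > best_j:
        best_t, best_j = t, tpr - fpr

presence_map = p >= best_t   # binary presence/absence from probabilities
print(f"optimal threshold = {best_t}, Youden's J = {best_j:.2f}")
```

Applied cell-by-cell to a probability raster, the same comparison converts a continuous occurrence surface into the presence/absence maps described above.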

For predicting cetacean density within the inner waters of British Columbia (chapter 3), I calculated density from systematic, line-transect marine mammal surveys over multiple years and seasons (summer 2004, 2005, 2008, and spring/autumn 2007) conducted by Raincoast Conservation Foundation. Abundance estimates were calculated using two different methods: Conventional Distance Sampling (CDS) and Density Surface Modelling (DSM). CDS generates a single density estimate for each stratum, whereas DSM explicitly models spatial variation and offers potential for greater precision by incorporating environmental predictors. Although DSM yields a more relevant product for the purposes of marine spatial planning, CDS has proven useful in cases with fewer observations available for seasonal and inter-annual comparison, particularly for the rarely observed elephant seal. Abundance estimates are provided on a stratum-specific basis. Steller sea lions and harbour seals are further differentiated by ‘hauled out’ and ‘in water’ states. This analysis updates previous estimates (Williams & Thomas 2007) by including additional years of effort, providing greater spatial precision with the DSM method over CDS, reporting for the spring and autumn seasons (rather than summer alone), and providing new abundance estimates for the Steller sea lion and northern elephant seal. In addition to providing a baseline of marine mammal abundance and distribution against which future changes can be compared, this information offers the opportunity to assess the risks posed to marine mammals by existing and emerging threats, such as fisheries bycatch, ship strikes, and the increased risk of oil spills and ocean noise associated with growing container ship and oil tanker traffic in British Columbia’s continental shelf waters.
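
For a single stratum, the CDS estimate reduces to simple arithmetic: detections divided by the area effectively searched. All survey numbers below are invented for illustration; they are not Raincoast survey values:

```python
# Back-of-envelope Conventional Distance Sampling (CDS) estimate for one
# hypothetical stratum.
n = 40          # animals detected on transects
L = 200.0       # total transect length (km)
w = 0.5         # truncation half-width (km)
p_hat = 0.6     # mean detection probability within w, from a fitted
                # detection function (e.g. half-normal on sighting distances)
A = 1000.0      # stratum area (km^2)

# Density: detections per unit area effectively searched.
D_hat = n / (2 * L * w * p_hat)   # animals per km^2
N_hat = D_hat * A                 # stratum abundance

print(f"density = {D_hat:.3f}/km^2, abundance = {N_hat:.0f}")
```

DSM replaces the single stratum-wide density with a spatial model of density as a function of environmental predictors, which is where the additional precision and mapping capability come from.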

Starting with marine animal observations at specific coordinates and times, I combine these data with environmental data, often satellite-derived, to produce seascape predictions generalizable in space and time. These habitat-based models enable prediction of encounter rates and, in the case of density surface models, abundance, which can then be applied to management scenarios. Specific human activities, OWED and shipping, are then compared within a tradeoff decision support framework, enabling interchangeable map and tradeoff plot views. These products make complex processes transparent, enabling conservation groups, industry, and stakeholders to game scenarios toward optimal marine spatial management, fundamental to the tenets of marine spatial planning, ecosystem-based management, and dynamic ocean management.

Relevance: 20.00%

Abstract:

Existing point estimates of half-life deviations from purchasing power parity (PPP), around 3-5 years, suggest that the speed of convergence is extremely slow. This article assesses the degree of uncertainty around these point estimates by using local-to-unity asymptotic theory to construct confidence intervals that are robust to high persistence in small samples. The empirical evidence suggests that the lower bound of the confidence interval is between four and eight quarters for most currencies, which is not inconsistent with traditional price-stickiness explanations. However, the upper bounds are infinity for all currencies, so we cannot provide conclusive evidence in favor of PPP either. © 2005 American Statistical Association.
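
The half-life arithmetic behind these figures can be sketched under an AR(1) approximation for quarterly real exchange rate deviations; the coefficients below are illustrative round numbers, not estimates from the article:

```python
import math

def half_life_years(rho, periods_per_year=4):
    """Half-life of a deviation under an AR(1) with coefficient rho:
    the horizon h solving rho**h = 0.5, i.e. h = ln(0.5) / ln(rho)."""
    return math.log(0.5) / math.log(rho) / periods_per_year

# A quarterly point estimate near 0.95 implies the familiar 3-5 year
# half-life; a much smaller rho corresponds to the four-quarter lower
# confidence bound the article reports for most currencies.
print(f"rho=0.95  -> {half_life_years(0.95):.1f} years")
print(f"rho=0.841 -> {half_life_years(0.841) * 4:.1f} quarters")
```

Because the mapping from rho to half-life blows up as rho approaches 1, a confidence interval for rho that includes values near unity translates directly into the unbounded upper half-life bounds described above.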

Relevance: 20.00%

Abstract:

Conventional hedonic techniques for estimating the value of local amenities rely on the assumption that households move freely among locations. We show that when moving is costly, the variation in housing prices and wages across locations may no longer reflect the value of differences in local amenities. We develop an alternative discrete-choice approach that models the household location decision directly, and we apply it to the case of air quality in US metro areas in 1990 and 2000. Because air pollution is likely to be correlated with unobservable local characteristics such as economic activity, we instrument for air quality using the contribution of distant sources to local pollution (excluding emissions from local sources, which are most likely to be correlated with local conditions). Our model yields an estimated elasticity of willingness to pay with respect to air quality of 0.34-0.42. These estimates imply that the median household would pay $149-$185 (in constant 1982-1984 dollars) for a one-unit reduction in average ambient concentrations of particulate matter. These estimates are three times greater than the marginal willingness to pay estimated by a conventional hedonic model using the same data. Our results are robust to a range of covariates, instrumenting strategies, and functional form assumptions. The findings also confirm the importance of instrumenting for local air pollution. © 2009 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Abstract:

Does environmental regulation impair international competitiveness of pollution-intensive industries to the extent that they relocate to countries with less stringent regulation, turning those countries into "pollution havens"? We test this hypothesis using panel data on outward foreign direct investment (FDI) flows of various industries in the German manufacturing sector and account for several econometric issues that have been ignored in previous studies. Most importantly, we demonstrate that externalities associated with FDI agglomeration can bias estimates away from finding a pollution haven effect if omitted from the analysis. We include the stock of inward FDI as a proxy for agglomeration and employ a GMM estimator to control for endogenous time-varying determinants of FDI flows. Furthermore, we propose a difference estimator based on the least polluting industry to break the possible correlation between environmental regulatory stringency and unobservable attributes of FDI recipients in the cross-section. When accounting for these issues we find robust evidence of a pollution haven effect for the chemical industry. © 2008 Springer Science+Business Media B.V.

Relevance: 20.00%

Abstract:

BACKGROUND: In a time-course microarray experiment, the expression level for each gene is observed across a number of time-points in order to characterize the temporal trajectories of the gene-expression profiles. For many of these experiments, the scientific aim is the identification of genes for which the trajectories depend on an experimental or phenotypic factor. There is an extensive recent body of literature on statistical methodology for addressing this analytical problem. Most of the existing methods are based on estimating the time-course trajectories using parametric or non-parametric mean regression methods. The sensitivity of these regression methods to outliers, an issue that is well documented in the statistical literature, should be of concern when analyzing microarray data. RESULTS: In this paper, we propose a robust testing method for identifying genes whose expression time profiles depend on a factor. Furthermore, we propose a multiple testing procedure to adjust for multiplicity. CONCLUSIONS: Through an extensive simulation study, we illustrate the performance of our method. Finally, we report results from applying our method to a case study and discuss potential extensions.
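
The outlier sensitivity motivating the robust approach is easy to demonstrate with a toy example (expression values invented): a mean-based fit at a single time-point is dragged by one aberrant array, while a robust location estimate such as the median barely moves:

```python
import numpy as np

# Hypothetical replicate expression values for one gene at one time-point;
# the last array is an outlier (e.g. a hybridization artifact).
expr = np.array([5.1, 4.9, 5.0, 5.2, 12.0])

print(f"mean   = {expr.mean():.2f}")     # dragged toward the outlier
print(f"median = {np.median(expr):.2f}") # robust to the single outlier
```

The same contrast, applied across an entire trajectory rather than a single time-point, is what separates mean-regression-based trajectory estimates from the robust estimates the paper tests.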

Relevance: 20.00%

Abstract:

In this paper, we propose a framework for robust optimization that relaxes the standard notion of robustness by allowing the decision maker to vary the protection level in a smooth way across the uncertainty set. We apply our approach to the problem of maximizing the expected value of a payoff function when the underlying distribution is ambiguous and therefore robustness is relevant. Our primary objective is to develop this framework and relate it to the standard notion of robustness, which deals with only a single guarantee across one uncertainty set. First, we show that our approach connects closely to the theory of convex risk measures. We show that the complexity of this approach is equivalent to that of solving a small number of standard robust problems. We then investigate the conservatism benefits and downside probability guarantees implied by this approach and compare them to the standard robust approach. Finally, we illustrate the methodology on an asset allocation example consisting of historical market data over a 25-year investment horizon and find in every case we explore that relaxing standard robustness with soft robustness yields a seemingly favorable risk-return trade-off: each case results in a higher out-of-sample expected return for a relatively minor degradation of out-of-sample downside performance. © 2010 INFORMS.
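
The connection to convex risk measures can be illustrated with the entropic certainty equivalent, a standard convex risk measure in which a parameter theta plays the role of a smoothly tunable protection level. The scenario payoffs and probabilities below are invented, and this is a sketch of the general idea, not the paper's formulation:

```python
import math

# Hypothetical scenario payoffs and nominal probabilities.
payoffs = [0.10, 0.04, -0.08]
probs = [0.3, 0.5, 0.2]

def entropic_ce(payoffs, probs, theta):
    """Entropic certainty equivalent -(1/theta) * log E[exp(-theta * X)].
    theta -> 0 recovers the plain expectation (no protection); large theta
    approaches the worst-case payoff (full robustness)."""
    ev = sum(p * math.exp(-theta * x) for p, x in zip(probs, payoffs))
    return -math.log(ev) / theta

for theta in (0.01, 1.0, 50.0):
    ce = entropic_ce(payoffs, probs, theta)
    print(f"theta={theta}: certainty equivalent = {ce:+.4f}")
```

Sliding theta between these extremes smoothly interpolates between expected-value optimization and the single worst-case guarantee of the standard robust approach, which is the trade-off the asset allocation example explores.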

Relevance: 20.00%

Abstract:

Steady-state diffuse reflection spectroscopy is a well-studied optical technique that can provide a noninvasive and quantitative method for characterizing the absorption and scattering properties of biological tissues. Here, we compare three fiber-based diffuse reflection spectroscopy systems that were assembled to create a lightweight, portable, and robust optical spectrometer that could be easily translated for repeated and reliable use in mobile settings. The three systems were built using a broadband light source and a compact, commercially available spectrograph. We tested two different light sources and two spectrographs (manufactured by two different vendors). The assembled systems were characterized by their signal-to-noise ratios, the source-intensity drifts, and detector linearity. We quantified the performance of these instruments in extracting optical properties from diffuse reflectance spectra in tissue-mimicking liquid phantoms with well-controlled optical absorption and scattering coefficients. We show that all assembled systems were able to extract the optical absorption and scattering properties with errors less than 10%, while providing greater than ten-fold decrease in footprint and cost (relative to a previously well-characterized and widely used commercial system). Finally, we demonstrate the use of these small systems to measure optical biomarkers in vivo in a small-animal model cancer therapy study. We show that optical measurements from the simple portable system provide estimates of tumor oxygen saturation similar to those detected using the commercial system in murine tumor models of head and neck cancer.

Relevance: 20.00%

Abstract:

While numerous studies find that deep-saline sandstone aquifers in the United States could store many decades' worth of the nation's current annual CO2 emissions, the likely cost of this storage (i.e. the cost of storage only, not capture and transport costs) has been harder to constrain. We use publicly available data on key reservoir properties to produce geo-referenced rasters of estimated storage capacity and cost for regions within 15 deep-saline sandstone aquifers in the United States. The rasters reveal the reservoir quality of these aquifers to be so variable that the cost estimates for storage span three orders of magnitude and average more than $100/tonne CO2. However, when the cost and corresponding capacity estimates in the rasters are assembled into a marginal abatement cost curve (MACC), we find that ~75% of the estimated storage capacity could be available for less than $2/tonne. Furthermore, ~80% of the total estimated storage capacity in the rasters is concentrated within just two of the aquifers: the Frio Formation along the Texas Gulf Coast, and the Mt. Simon Formation in the Michigan Basin, which together make up only ~20% of the areas analyzed. While our assessment is not comprehensive, the results suggest there should be an abundance of low-cost storage for CO2 in deep-saline aquifers, but a majority of this storage is likely to be concentrated within specific regions of a smaller number of these aquifers. © 2011 Elsevier B.V.
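
Assembling a MACC from cost and capacity rasters amounts to sorting cells by unit cost and accumulating capacity. The five "cells" below are invented to mimic the headline pattern (most capacity cheap, a long expensive tail); they are not values from the paper:

```python
import numpy as np

# Hypothetical raster cells: storage capacity (Mt CO2) and unit cost
# ($/tonne) per cell; all numbers are invented.
capacity = np.array([300.0, 50.0, 120.0, 10.0, 500.0])  # Mt CO2
cost = np.array([1.5, 80.0, 0.8, 400.0, 1.9])           # $/tonne

# MACC: sort cells by unit cost, then accumulate capacity along the curve.
order = np.argsort(cost)
sorted_cost = cost[order]
cum_capacity = np.cumsum(capacity[order])

# Share of total capacity available below a cost ceiling, e.g. $2/tonne.
cheap_share = capacity[order][sorted_cost < 2.0].sum() / capacity.sum()
print(f"{100 * cheap_share:.0f}% of capacity available below $2/tonne")
```

Reading the curve at a cost ceiling, as in the last step, is exactly how statements like "~75% of capacity for less than $2/tonne" are derived from the rasters.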

Relevance: 20.00%

Abstract:

The research and development costs of 68 randomly selected new drugs were obtained from a survey of 10 pharmaceutical firms. These data were used to estimate the average pre-tax cost of new drug development. The costs of compounds abandoned during testing were linked to the costs of compounds that obtained marketing approval. The estimated average out-of-pocket cost per new drug is 403 million US dollars (2000 dollars). Capitalizing out-of-pocket costs to the point of marketing approval at a real discount rate of 11% yields a total pre-approval cost estimate of 802 million US dollars (2000 dollars). When compared to the results of an earlier study with a similar methodology, total capitalized costs were shown to have increased at an annual rate of 7.4% above general price inflation.
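
The relationship between the out-of-pocket and capitalized figures is the compounding of the spending stream forward to the approval date at the 11% real discount rate. The uniform 12-year spending profile below is an assumption for illustration; the study's actual phase-by-phase profile differs, which is why this sketch lands near, but not exactly at, the reported $802 million:

```python
# Capitalize a hypothetical uniform R&D spending stream to approval.
out_of_pocket = 403.0   # million 2000 US$ (study's out-of-pocket estimate)
years = 12              # assumed development span (illustrative)
rate = 0.11             # real discount rate from the study

annual = out_of_pocket / years
# Compound each year's outlay forward to the approval date.
capitalized = sum(annual * (1 + rate) ** (years - 1 - t) for t in range(years))

print(f"out-of-pocket: ${out_of_pocket:.0f}M, capitalized: ~${capitalized:.0f}M")
```

The exercise shows why capitalization roughly doubles the headline cost: dollars spent a decade before approval carry a decade of forgone returns.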

Relevance: 20.00%

Abstract:

The comparison of observed global mean surface air temperature (GMT) change to the mean change simulated by climate models has received much public and scientific attention. For a given global warming signal produced by a climate model ensemble, there exists an envelope of GMT values representing the range of possible unforced states of the climate system (the Envelope of Unforced Noise; EUN). Typically, the EUN is derived from climate models themselves, but climate models might not accurately simulate the correct characteristics of unforced GMT variability. Here, we simulate a new, empirical, EUN that is based on instrumental and reconstructed surface temperature records. We compare the forced GMT signal produced by climate models to observations while noting the range of GMT values provided by the empirical EUN. We find that the empirical EUN is wide enough so that the interdecadal variability in the rate of global warming over the 20th century does not necessarily require corresponding variability in the rate-of-increase of the forced signal. The empirical EUN also indicates that the reduced GMT warming over the past decade or so is still consistent with a middle emission scenario's forced signal, but is likely inconsistent with the steepest emission scenario's forced signal.
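
The consistency check can be caricatured in a few lines: an observed GMT anomaly is judged consistent with a forced signal when it lies within the signal plus or minus the envelope of unforced noise. All numbers below are invented for illustration:

```python
# Toy version of the EUN consistency test; values are not from the paper.
forced_signal = 0.8    # modeled forced GMT warming by some date (deg C)
eun_half_width = 0.25  # half-width of the empirical envelope of
                       # unforced noise (deg C)
observed = 0.62        # hypothetical observed GMT anomaly (deg C)

# Consistent if observed falls within signal +/- envelope.
consistent = abs(observed - forced_signal) <= eun_half_width
print(f"consistent with forced signal: {consistent}")
```

Widening the envelope, as the empirical EUN does relative to some model-derived ones, enlarges the set of observations deemed consistent with a given forced signal.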