925 results for "Data uncertainty"


Relevance: 30.00%

Abstract:

In this study we calculate the uncertainties in atomic data for electron-impact direct ionization and recombination, and investigate the role of these uncertainties in spectral diagnostics. We outline a systematic approach to assigning meaningful uncertainties that vary with electron temperature. Once these uncertainty parameters have been evaluated, we calculate the uncertainties on key diagnostics through a Monte Carlo routine, using the Astrophysical Plasma Emission Code (APEC) [Smith et al. 2001]. We incorporate these uncertainties into well-known temperature diagnostics, such as the Lyman-alpha versus resonance line ratio and the G ratio. We compare these calculations to a study by [Testa et al. 2004], in which significant discrepancies in the two diagnostic ratios were observed. We conclude that while the atomic physics uncertainties play a noticeable role in the discrepancies observed by Testa et al., they do not explain all of them. This indicates that another physical process is occurring in the system that is not being taken into account. This work is supported in part by the National Science Foundation REU and Department of Defense ASSURE programs under NSF Grant no. 1262851 and by the Smithsonian Institution.
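To make the propagation step concrete, the sketch below perturbs two line emissivities with fractional rate uncertainties and builds a Monte Carlo distribution for a diagnostic line ratio. It is a minimal illustration of the general technique, not APEC itself; the emissivity values, uncertainty levels and ratio definition are all hypothetical.

```python
# Minimal sketch of Monte Carlo propagation of atomic-rate uncertainties
# into a diagnostic line ratio. All numbers are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 10_000

# Hypothetical line emissivities (arbitrary units) at one electron temperature.
eps_resonance = 1.00
eps_lyman_alpha = 0.65

# Assumed fractional (1-sigma) uncertainties on the underlying rates;
# in the study these would vary with electron temperature.
sigma_res, sigma_lya = 0.10, 0.15

# Perturb each rate independently with lognormal scatter (keeps rates positive).
res_samples = eps_resonance * rng.lognormal(0.0, sigma_res, n_trials)
lya_samples = eps_lyman_alpha * rng.lognormal(0.0, sigma_lya, n_trials)

ratio = lya_samples / res_samples
print(f"diagnostic ratio = {ratio.mean():.3f} +/- {ratio.std():.3f}")
```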

Relevance: 30.00%

Abstract:

This paper demonstrates the unparalleled value of full-scale data acquired from ocean trials of Aquamarine Power’s Oyster 800 Wave Energy Converter (WEC) at the European Marine Energy Centre (EMEC), Orkney, Scotland.
High-quality prototype and wave data were recorded simultaneously in over 750 distinct sea states (comprising different combinations of wave height, wave period and tidal height) and include periods of operation in which the hydraulic Power Take-Off (PTO) system was both pressurised (damped operation) and de-pressurised (undamped operation).
A detailed model-prototype correlation procedure is presented in which the full-scale prototype behaviour is compared to predictions from both experimental and numerical modelling techniques via a high-temporal-resolution wave-by-wave reconstruction. This provides a definitive verification of the capabilities of such research techniques and enables a robust and meaningful uncertainty analysis to be performed on their outputs.
The importance of a good data-capture methodology, in terms of both handling and accuracy, is also presented. The techniques and procedures implemented by Aquamarine Power for real-time data management are discussed, including lessons learned on the instrumentation and infrastructure required to collect high-value data.

Relevance: 30.00%

Abstract:

The algorithm developed uses an octree pyramid in which noise is reduced at the expense of spatial resolution. At a chosen level, unsupervised clustering without spatial connectivity constraints is applied. After classification, isolated voxels and insignificant regions are removed by assigning them to their neighbours. The spatial resolution is then increased by down-projecting the regions, level by level. At each level the uncertainty of the boundary voxels is minimised by dynamically selecting and reclassifying them, using adaptive 3D filtering. The algorithm is tested on different data sets, including NMR data.
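As a concrete illustration of the pyramid-construction step, the sketch below builds an octree pyramid by averaging 2×2×2 voxel blocks, so each level halves the resolution and suppresses noise. It is a minimal sketch of that one step on an assumed synthetic volume, not the full segmentation algorithm.

```python
# Minimal sketch of octree pyramid construction: each level averages
# 2x2x2 voxel blocks, trading spatial resolution for noise reduction.
import numpy as np

def build_octree_pyramid(volume, levels):
    """Return a list of volumes, from full resolution to the coarsest level."""
    pyramid = [volume]
    for _ in range(levels):
        v = pyramid[-1]
        d, h, w = (s // 2 * 2 for s in v.shape)      # trim to even dimensions
        v = v[:d, :h, :w]
        # Average each 2x2x2 block into a single parent voxel.
        v = v.reshape(d // 2, 2, h // 2, 2, w // 2, 2).mean(axis=(1, 3, 5))
        pyramid.append(v)
    return pyramid

# Noisy synthetic volume: a bright cube embedded in background noise.
rng = np.random.default_rng(0)
vol = rng.normal(0.0, 1.0, (64, 64, 64))
vol[16:48, 16:48, 16:48] += 3.0

pyramid = build_octree_pyramid(vol, levels=3)
for level, v in enumerate(pyramid):
    print(level, v.shape, round(float(v.std()), 2))  # noise drops per level
```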

Relevance: 30.00%

Abstract:

A growing literature considers the impact of uncertainty using SVAR models that include proxies for uncertainty shocks as endogenous variables. In this paper we consider the impact of measurement error in these proxies on the estimated impulse responses. We show via a Monte Carlo experiment that measurement error can result in attenuation bias in the impulse responses. In contrast, the proxy SVAR that uses the uncertainty-shock proxy as an instrument does not suffer from this bias. Applying this latter method to the Bloom (2009) dataset results in impulse responses to uncertainty shocks that are larger in magnitude and more persistent than those obtained from a recursive SVAR.
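The mechanism is the classical errors-in-variables result: regressing on a noisy proxy shrinks the estimated response toward zero, whereas using the proxy as an instrument recovers the true coefficient. The sketch below demonstrates this in the simplest one-equation setting; the data-generating process and parameter values are illustrative, not those of the paper's SVAR experiment.

```python
# Attenuation bias from a noisy shock proxy, and the instrumental-variable fix.
# One-equation illustration; all parameters are made up for the demo.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
beta = 1.0                          # true response to the uncertainty shock

u = rng.standard_normal(n)          # true (unobserved) uncertainty shock
y = beta * u + rng.standard_normal(n)
proxy = u + rng.standard_normal(n)  # proxy observed with measurement error

# OLS on the proxy: attenuated toward zero by var(u)/(var(u)+var(noise)) = 0.5.
b_ols = (proxy @ y) / (proxy @ proxy)

# IV-style estimator: a second noisy measurement of the shock, instrumented
# with the first, mimicking how the proxy SVAR isolates the shock of interest.
measure2 = u + rng.standard_normal(n)
b_iv = (proxy @ y) / (proxy @ measure2)

print(f"OLS on proxy: {b_ols:.3f} (attenuated)")
print(f"IV estimate:  {b_iv:.3f} (close to true beta = {beta})")
```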

Relevance: 30.00%

Abstract:

Objective To explore people's experiences of starting antidepressant treatment. Design Qualitative interpretive approach combining thematic analysis with constant comparison. Relevant coding reports from the original studies (generated using NVivo) relating to initial experiences of antidepressants were explored in further detail, focusing on the ways in which participants discussed their experiences of taking, or being prescribed, an antidepressant for the first time. Participants 108 men and women aged 22–84 who had taken antidepressants for depression. Setting Respondents recruited throughout the UK during 2003–2004, 2008 and 2012–2013, and in Australia during 2010–2011. Results People expressed a wide range of feelings about initiating antidepressant use. Their attitudes towards starting antidepressants were shaped by stereotypes and stigma related to perceived drug dependency and potentially extreme side effects. Anxieties were expressed about starting use, about when the antidepressant might begin to take effect, about how much it might help or hinder them, and about what to expect in the initial weeks. People worried about the possibility of experiencing adverse effects and the implications for their sense of self. Where people felt they had not been given sufficient time, information or support during their consultation, the uncertainty could be particularly unsettling and could affect their ongoing views on, and use of, antidepressants as a viable treatment option. Conclusions Our paper is the first to use multi-country data to explore in depth patients' existential concerns about starting antidepressant use. People need additional support when they make decisions about starting antidepressants. Health professionals can use our findings to better understand, and explore with patients, their concerns before they start antidepressants. These insights are key to supporting patients, many of whom feel intimidated by the prospect of taking antidepressants, especially during the uncertain first few weeks of treatment.

Relevance: 30.00%

Abstract:

In many countries the use of renewable energy is increasing due to the introduction of new energy and environmental policies, so the focus on the efficient integration of renewable energy into electric power systems is becoming extremely important. Several European countries have already achieved high penetration of wind-based electricity generation and are gradually evolving towards intensive use of this generation technology. The introduction of wind-based generation poses new challenges for power system operators, mainly due to the variability and uncertainty in weather conditions and, consequently, in the generation itself. To deal with this uncertainty and to improve power system efficiency, adequate wind forecasting tools must be used. This paper proposes a data-mining-based methodology for very short-term wind forecasting that is suitable for large real databases. The paper includes a case study based on a real database comprising the last three years of wind speed measurements, and presents results for wind speed forecasting at 5-minute intervals.
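As a sketch of what such a data-driven very short-term forecaster can look like, the example below trains a regression tree on lagged wind-speed observations to predict the speed 5 minutes ahead. The synthetic series, lag count and choice of learner are illustrative stand-ins for the paper's real database and mining algorithm.

```python
# Sketch of a very short-term wind-speed forecaster: lagged observations
# as features, a regression tree as the learner. Illustrative only.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)

# Synthetic 5-minute wind-speed series (m/s): slow oscillation plus noise.
t = np.arange(20_000)
speed = 8 + 3 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 0.8, t.size)
speed = np.clip(speed, 0, None)

# Build (lags -> next value) training pairs from the last 6 observations.
n_lags = 6
X = np.column_stack([speed[i:i - n_lags] for i in range(n_lags)])
y = speed[n_lags:]

split = int(0.8 * len(y))
model = DecisionTreeRegressor(max_depth=8, min_samples_leaf=50)
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
mae = np.abs(pred - y[split:]).mean()
print(f"5-minute-ahead MAE: {mae:.2f} m/s")
```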

Relevance: 30.00%

Abstract:

This paper presents a methodology supported by the knowledge discovery in databases (KDD) process, in order to find the failure probability of electrical equipment belonging to a real high-voltage electrical network. Data Mining (DM) techniques are used to discover a set of failure probabilities and, therefore, to extract knowledge concerning the unavailability of electrical equipment such as power transformers and high-voltage power lines. The framework includes several steps: the analysis of the real database, data pre-processing, the application of DM algorithms and, finally, the interpretation of the discovered knowledge. To validate the proposed methodology, a case study based on real databases is used. These data carry heavy uncertainty due to climate conditions; for this reason, fuzzy logic was used to determine the set of failure probabilities of the electrical components needed to re-establish service. The results reflect an interesting potential of this approach and encourage further research on the topic.
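The abstract does not give the fuzzy rule base, so the sketch below shows only the general pattern: triangular membership functions over a weather variable, a few hypothetical rules, and weighted-average defuzzification into a failure probability. Every membership breakpoint and rule output is invented for illustration.

```python
# Sketch of a fuzzy-logic mapping from a climate variable (wind speed)
# to an equipment failure probability. All breakpoints and rule outputs
# are hypothetical illustration values.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def failure_probability(wind_ms):
    # Fuzzy sets over wind speed (m/s).
    mu_calm = tri(wind_ms, -1, 0, 10)
    mu_windy = tri(wind_ms, 5, 15, 25)
    mu_storm = tri(wind_ms, 20, 35, 60)
    # Rule consequents: failure probability assigned to each condition.
    weights = [mu_calm, mu_windy, mu_storm]
    probs = [0.01, 0.05, 0.30]
    total = sum(weights)
    # Weighted-average defuzzification.
    return sum(w * p for w, p in zip(weights, probs)) / total if total else 0.0

for w in (3, 12, 30):
    print(f"wind {w:2d} m/s -> failure probability {failure_probability(w):.3f}")
```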

Relevance: 30.00%

Abstract:

In this study, the concentration probability distributions of 82 pharmaceutical compounds detected in the effluents of 179 European wastewater treatment plants were computed and inserted into a multimedia fate model. The comparative ecotoxicological impact of the direct emission of these compounds from wastewater treatment plants on freshwater ecosystems, based on a potentially affected fraction (PAF) of species approach, was assessed in order to rank the compounds by priority. Because many pharmaceuticals are acids or bases, the multimedia fate model incorporates regressions to estimate pH-dependent fate parameters. An uncertainty analysis was performed by means of Monte Carlo analysis, which included the uncertainty of fate and ecotoxicity model input variables as well as the spatial variability of landscape characteristics at the European continental scale. Several pharmaceutical compounds were identified as being of greatest concern, including 7 analgesics/anti-inflammatories, 3 β-blockers, 3 psychiatric drugs, and 1 compound from each of 6 other therapeutic classes. The fate and impact modelling relied extensively on estimated data, given that most of these compounds have little or no experimental fate or ecotoxicity data available, as well as limited reported occurrence in effluents. The contribution of estimated model input variables to the variance of the freshwater ecotoxicity impact, together with the lack of experimental abiotic degradation data for most compounds, helped to establish priorities for further testing. Generally, the effluent concentration and the ecotoxicity effect factor were the model input variables with the most significant effect on the uncertainty of the output results.
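A stripped-down version of this Monte Carlo step is sketched below: sample the effluent concentration and the effect factor from assumed lognormal distributions, push them through a trivial one-compartment impact expression, and report the resulting uncertainty interval per compound. The distributions, dilution factor and two-compound comparison are illustrative, not values from the study.

```python
# Sketch of Monte Carlo uncertainty propagation for a freshwater impact
# score: impact ~ (effluent concentration / dilution) * effect factor.
# All distribution parameters are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(11)
n = 50_000
dilution = 10.0                                  # assumed river dilution factor

compounds = {
    # name: (median conc ug/L, gsd of conc, median effect factor, gsd of EF)
    "compound_A": (1.0, 2.0, 0.5, 3.0),
    "compound_B": (0.2, 1.5, 5.0, 4.0),
}

for name, (c_med, c_gsd, ef_med, ef_gsd) in compounds.items():
    conc = rng.lognormal(np.log(c_med), np.log(c_gsd), n)
    ef = rng.lognormal(np.log(ef_med), np.log(ef_gsd), n)
    impact = conc / dilution * ef                # toy PAF-like impact score
    lo, med, hi = np.percentile(impact, [2.5, 50, 97.5])
    print(f"{name}: median {med:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```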

Relevance: 30.00%

Abstract:

The magnitude of the cervical cancer problem, coupled with the potential for prevention offered by recent technological advances, made it imperative to step back and reassess strategic options for dealing with cervical cancer screening in Kenya. The purpose of this qualitative study was: 1) to explore the extent to which the Participatory Action Research (PAR) methodology and the Scenario-Based Planning (SBP) method, with the application of analytics, could enable strategic, consequential, informed decision making; and 2) to determine how influential Kenyan decision makers could apply SBP with analytic tools and techniques to make strategic, consequential decisions regarding the implementation of a Cervical Self-Sampling Program (CSSP) in both urban and rural settings. The theoretical paradigm for this study was action research; it was experiential, practical and action oriented, and resulted in co-created knowledge that influenced study participants’ decision making. Action Africa Help International (AAHI) and Brock University collaborated with Local Decision Influencing Participants (LDIPs) to develop innovative strategies for implementing the CSSP. SBP tools, along with traditional approaches to data collection and analysis, were applied to collect, visualize and analyze predominantly qualitative data. Outputs from the study included: a) a generic implementation scenario for a CSSP (along with scenarios unique to urban and rural settings); and b) 10 strategic directions and 22 supporting implementation strategies that address the variables of: 1) technical viability, 2) political support, 3) affordability, 4) logistical feasibility, 5) social acceptability, and 6) transformation/sustainability. In addition, study participants’ capacity to engage effectively in predictive/prescriptive strategic decision making was strengthened.

Relevance: 30.00%

Abstract:

In any discipline where uncertainty and variability are present, it is important to have principles which are accepted as inviolate and which should therefore drive statistical modelling, statistical analysis of data, and any inferences from such an analysis. Despite the fact that two such principles have existed for the last two decades, and that from these a sensible, meaningful methodology has been developed for the statistical analysis of compositional data, the application of inappropriate and/or meaningless methods persists in many areas of application. This paper identifies at least ten common fallacies and confusions in compositional data analysis with illustrative examples, and provides readers with necessary, and hopefully sufficient, arguments to persuade the culprits why and how they should amend their ways.
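The methodology referred to is the log-ratio approach: because compositional parts are constrained to a constant sum, correlations computed on the raw proportions can be spurious, and analysis should instead be carried out in log-ratio coordinates. The sketch below illustrates this with a centred log-ratio (clr) transform; the compositions are synthetic and invented for the demonstration.

```python
# Minimal sketch of the log-ratio methodology for compositional data:
# the centred log-ratio (clr) transform moves compositions from the
# simplex to real space, where standard statistical tools are valid.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 3-part compositions: close raw positive amounts to sum to 1.
raw = rng.lognormal(mean=0.0, sigma=0.5, size=(500, 3))
comp = raw / raw.sum(axis=1, keepdims=True)      # closure operation

# Naive correlation on closed proportions is distorted by the
# constant-sum constraint (the "spurious correlation" fallacy).
print("raw-proportion corr:", round(np.corrcoef(comp[:, 0], comp[:, 1])[0, 1], 3))

def clr(x):
    """Centred log-ratio: log of each part over the row geometric mean."""
    g = np.exp(np.log(x).mean(axis=1, keepdims=True))
    return np.log(x / g)

# Analysis in clr coordinates respects scale invariance and
# subcompositional coherence.
z = clr(comp)
print("clr-coordinate corr:", round(np.corrcoef(z[:, 0], z[:, 1])[0, 1], 3))
```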

Relevance: 30.00%

Abstract:

Satellite-based rainfall monitoring is widely used for climatological studies because of its full global coverage, but it is also of great importance for operational purposes, especially in areas such as Africa where ground-based rainfall data are scarce. Satellite rainfall estimates have enormous potential benefits as input to hydrological and agricultural models because of their real-time availability, low cost and full spatial coverage. One issue that needs to be addressed is the uncertainty on these estimates. This is particularly important in assessing the likely errors on the output from non-linear models (rainfall-runoff or crop yield) which use the rainfall estimates, aggregated over an area, as input. Correct assessment of the uncertainty on the rainfall is non-trivial, as it must take account of:
• the difference in spatial support of the satellite information and the independent data used for calibration
• uncertainties on the independent calibration data
• the non-Gaussian distribution of rainfall amount
• the spatial intermittency of rainfall
• the spatial correlation of the rainfall field
This paper describes a method for estimating the uncertainty on satellite-based rainfall values that takes account of these factors. The method involves, firstly, a stochastic calibration which completely describes the probability of rainfall occurrence and the pdf of rainfall amount for a given satellite value and, secondly, the generation of an ensemble of rainfall fields based on the stochastic calibration but with the correct spatial correlation structure within each ensemble member. This is achieved by the use of geostatistical sequential simulation. The ensemble generated in this way may be used to estimate uncertainty at larger spatial scales. A case study of daily rainfall monitoring in The Gambia, West Africa, for the purpose of crop yield forecasting is presented to illustrate the method.
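The sketch below caricatures the two-stage method: an assumed logistic/gamma calibration gives the rain probability and amount distribution for each satellite value, and a Gaussian copula over a spatially correlated latent field stands in for the geostatistical sequential simulation. Every functional form and parameter here is an illustrative assumption, not the paper's fitted calibration.

```python
# Sketch of calibrated, spatially correlated rainfall ensemble generation.
# A correlated Gaussian latent field + copula stands in for geostatistical
# sequential simulation; all calibration forms and parameters are assumed.
import numpy as np
from scipy.stats import norm, gamma

rng = np.random.default_rng(42)
n = 15                                          # grid is n x n pixels
xx, yy = np.meshgrid(np.arange(n), np.arange(n))
coords = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)

# Exponential spatial correlation with a 5-pixel range, via Cholesky.
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
L = np.linalg.cholesky(np.exp(-dist / 5.0) + 1e-8 * np.eye(n * n))

sat = rng.uniform(0.0, 50.0, n * n)             # hypothetical satellite values

# Stage 1, stochastic calibration (assumed forms):
p_rain = 1.0 / (1.0 + np.exp(-(sat - 15.0) / 5.0))   # occurrence probability
scale = 0.5 + sat / 10.0                             # gamma scale per pixel

# Stage 2, ensemble generation with the required spatial structure:
areal_means = []
for _ in range(200):
    u = norm.cdf(L @ rng.standard_normal(n * n))     # correlated uniforms
    wet = u > 1.0 - p_rain                           # spatial intermittency
    q = np.zeros(n * n)
    q[wet] = (u[wet] - (1.0 - p_rain[wet])) / p_rain[wet]
    amount = np.where(wet, gamma.ppf(q, a=2.0, scale=scale), 0.0)
    areal_means.append(amount.mean())                # aggregate to larger scale

print(f"areal rainfall: {np.mean(areal_means):.2f} +/- {np.std(areal_means):.2f}")
```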

Relevance: 30.00%

Abstract:

The τ–ω model of microwave emission from soil and vegetation layers is widely used to estimate soil moisture content from passive microwave observations. Its application to prospective satellite-based observations aggregating several thousand square kilometres requires an understanding of the effects of scene heterogeneity. The effects of heterogeneity in soil surface roughness, soil moisture, water area and vegetation density on the retrieval of soil moisture from simulated single- and multi-angle observing systems were tested. Uncertainty in water area proved the most serious problem for both systems, causing errors of a few percent in soil moisture retrieval. Single-angle retrieval was largely unaffected by the other factors studied here. Multiple-angle retrieval errors of around one percent arose from heterogeneity in either soil roughness or soil moisture, and errors of a few percent were caused by vegetation heterogeneity. A simple extension of the model's vegetation representation was shown to reduce this error substantially for scenes containing a range of vegetation types.
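For reference, the zeroth-order τ–ω brightness-temperature equation is compact enough to write down directly; the sketch below implements the standard form, with illustrative values for the soil reflectivity, vegetation optical depth τ and single-scattering albedo ω.

```python
# Zeroth-order tau-omega model: brightness temperature of a soil layer
# under a vegetation canopy. Parameter values below are illustrative.
import numpy as np

def tau_omega_tb(t_soil, t_veg, soil_reflectivity, tau, omega, theta_deg):
    """Brightness temperature (K) for one polarisation at incidence theta."""
    gamma = np.exp(-tau / np.cos(np.radians(theta_deg)))  # canopy transmissivity
    e_soil = 1.0 - soil_reflectivity                      # soil emissivity
    tb_soil = t_soil * e_soil * gamma                     # attenuated soil emission
    # Direct vegetation emission plus the part reflected back by the soil.
    tb_veg = t_veg * (1.0 - omega) * (1.0 - gamma) * (1.0 + soil_reflectivity * gamma)
    return tb_soil + tb_veg

# Wetter soil -> higher reflectivity -> lower brightness temperature.
for refl in (0.1, 0.3, 0.5):
    tb = tau_omega_tb(t_soil=290, t_veg=295, soil_reflectivity=refl,
                      tau=0.2, omega=0.05, theta_deg=40)
    print(f"soil reflectivity {refl:.1f} -> TB = {tb:.1f} K")
```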

Relevance: 30.00%

Abstract:

Faced with the realities of a changing climate, decision makers in a wide variety of organisations are increasingly seeking quantitative predictions of regional and local climate. An important issue for these decision makers, and for the organisations that fund climate research, is the potential for climate science to deliver improvements, especially reductions in uncertainty, in such predictions. Uncertainty in climate predictions arises from three distinct sources: internal variability, model uncertainty and scenario uncertainty. Using data from a suite of climate models, we separate and quantify these sources. For predictions of changes in surface air temperature on decadal timescales and regional spatial scales, we show that uncertainty over the next few decades is dominated by sources (model uncertainty and internal variability) that are potentially reducible through progress in climate science. Furthermore, we find that model uncertainty is of greater importance than internal variability. Our findings have implications for managing adaptation to a changing climate. Because the costs of adaptation are very large, and greater uncertainty about future climate is likely to be associated with more expensive adaptation, reducing uncertainty in climate predictions is potentially of enormous economic value. We highlight the need for much more work to compare: a) the cost of various degrees of adaptation, given current levels of uncertainty; and b) the cost of new investments in climate science to reduce current levels of uncertainty. Our study also highlights the importance of targeting climate science investments on the most promising opportunities to reduce prediction uncertainty.
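One common way to separate the three sources is a variance decomposition over a model-by-scenario ensemble: internal variability from the spread across realizations, model uncertainty from the spread across models, and scenario uncertainty from the spread across scenario means. The sketch below applies that decomposition to a hypothetical projection array; the ensemble itself is synthetic and is not the suite of models used in the study.

```python
# Sketch of partitioning projection uncertainty into internal variability,
# model uncertainty and scenario uncertainty, for one lead time.
# The ensemble array is synthetic; shapes and magnitudes are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n_models, n_scenarios, n_runs = 10, 3, 5

# Hypothetical decadal-mean temperature change (K): scenario signal +
# per-model offset + internal variability for each realization.
scenario_signal = np.array([0.8, 1.2, 1.6])[None, :, None]
model_offset = rng.normal(0.0, 0.3, (n_models, 1, 1))
internal = rng.normal(0.0, 0.15, (n_models, n_scenarios, n_runs))
proj = scenario_signal + model_offset + internal

# Internal variability: variance across realizations, averaged over cells.
var_internal = proj.var(axis=2).mean()
# Model uncertainty: variance across models of the run means, averaged
# over scenarios.
var_model = proj.mean(axis=2).var(axis=0).mean()
# Scenario uncertainty: variance across scenarios of the multi-model mean.
var_scenario = proj.mean(axis=(0, 2)).var()

total = var_internal + var_model + var_scenario
for name, v in [("internal", var_internal), ("model", var_model),
                ("scenario", var_scenario)]:
    print(f"{name:9s}: {100 * v / total:.0f}% of total variance")
```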

Relevance: 30.00%

Abstract:

Despite the many models developed for phosphorus concentration prediction at differing spatial and temporal scales, there has been little effort to quantify the uncertainty in their predictions. Quantification of model prediction uncertainty is desirable for informed decision-making in river-systems management. An uncertainty analysis of the process-based Integrated Catchment Model of Phosphorus (INCA-P), within the generalised likelihood uncertainty estimation (GLUE) framework, is presented. The framework is applied to the Lugg catchment (1,077 km²), a tributary of the River Wye on the England–Wales border. Daily discharge and monthly phosphorus (total reactive and total), for a limited number of reaches, are used to assess the uncertainty and sensitivity of 44 model parameters identified as being most important for discharge and phosphorus predictions. This study demonstrates that parameter homogeneity assumptions (spatial heterogeneity is treated via land-use-type fractional areas) can achieve higher model fits than a previous expertly calibrated parameter set. The model is capable of reproducing the hydrology, but a threshold Nash–Sutcliffe efficiency (E or R²) of 0.3 is not achieved when simulating observed total phosphorus (TP) data in the upland reaches or total reactive phosphorus (TRP) in any reach. Despite this, the model reproduces the general dynamics of TP and TRP in the point-source-dominated lower reaches. This paper discusses why this application of INCA-P fails to find any parameter sets which simultaneously describe all of the observed data acceptably. The discussion focuses on the uncertainty of readily available input data, and on whether such process-based models should be used when there is insufficient data to support their many parameters.
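GLUE itself is simple to state: sample many parameter sets, score each simulation against observations with a likelihood measure such as the Nash–Sutcliffe efficiency, keep the "behavioural" sets above a threshold, and form prediction bounds from their simulations. The sketch below applies that recipe to a toy one-parameter model standing in for INCA-P; the model, data and threshold are illustrative.

```python
# Sketch of the GLUE procedure with a toy model standing in for INCA-P.
# Sample parameters, score with Nash-Sutcliffe efficiency (NSE), keep the
# behavioural sets, and form prediction bounds from their simulations.
import numpy as np

rng = np.random.default_rng(9)

def toy_model(k, t):
    """Toy 'catchment': exponential recession with rate k (stand-in only)."""
    return 10.0 * np.exp(-k * t)

t = np.linspace(0, 10, 50)
observed = toy_model(0.3, t) + rng.normal(0, 0.4, t.size)   # synthetic data

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, <=0 no better than the mean."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Monte Carlo sampling of the parameter space.
k_samples = rng.uniform(0.05, 1.0, 5_000)
scores = np.array([nse(toy_model(k, t), observed) for k in k_samples])

# Behavioural parameter sets: NSE above the acceptance threshold.
behavioural = k_samples[scores > 0.3]
sims = np.array([toy_model(k, t) for k in behavioural])

lower, upper = np.percentile(sims, [5, 95], axis=0)         # 5-95% GLUE bounds
print(f"{behavioural.size} behavioural sets; "
      f"bounds at t=0: [{lower[0]:.2f}, {upper[0]:.2f}]")
```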