925 results for Data uncertainty


Relevance: 60.00%

Abstract:

Purpose – Rapid urbanisation, fragmented governance and recurrent flooding complicate resolution of DKI Jakarta’s chronic housing shortage. Failure to implement planning decision-making processes effectively risks human rights violations. Contemporary planning policy requires the relocation of households living in floodplains within fifteen metres of DKI Jakarta’s main watercourses, further constraining land availability and potentially requiring increased densification. The purpose of this paper is to re-frame planning decision-making to address flood risk and to increase community resilience. Design/methodology/approach – This paper presents a preliminary scoping study for a technologically enhanced participatory planning method, incorporating a synthesis of existing information on urbanisation, governance, and flood risk management in Jakarta. Findings – Responsibility for flood risk management in DKI Jakarta is fragmented both within and across administrative boundaries. Decision-making is further complicated by: limited availability of land-use data; uncertainty as to the delineated extent of watercourses, floodplains, and flood modelling; unclear risk and liability for infrastructure investments; and the limited technical literacy of both public and government participants. Practical implications – This research provides information to facilitate consultation with government entities tasked with re-framing planning processes to increase public participation. Social implications – Reducing risk exposure amongst DKI Jakarta’s most vulnerable populations addresses issues of social justice.

Relevance: 60.00%

Abstract:

This paper studies the problem of constructing robust classifiers when the training data are plagued with uncertainty. The problem is posed as a Chance-Constrained Program (CCP), which ensures that the uncertain data points are classified correctly with high probability. Unfortunately, such a CCP turns out to be intractable. The key novelty is in employing Bernstein bounding schemes to relax the CCP to a convex second-order cone program whose solution is guaranteed to satisfy the probabilistic constraint. Prior to this work, only Chebyshev-based relaxations were exploited in learning algorithms. Bernstein bounds employ richer partial information and hence can be far less conservative than Chebyshev bounds. Owing to this more efficient modelling of uncertainty, the resulting classifiers achieve higher classification margins and hence better generalization. Methodologies for classifying uncertain test data points and error measures for evaluating classifiers robust to uncertain data are discussed. Experimental results on synthetic and real-world datasets show that the proposed classifiers are better equipped to handle data uncertainty and outperform the state of the art in many cases.
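The gap between the two families of bounds is easy to see numerically. A minimal sketch (illustrative numbers, not from the paper) comparing a Chebyshev tail bound with a Bernstein tail bound for a sum of independent bounded random variables:

```python
import math

def chebyshev_tail(var, t):
    """Chebyshev: P(|S - E[S]| >= t) <= var / t^2 (two-sided)."""
    return min(1.0, var / t**2)

def bernstein_tail(var, M, t):
    """Bernstein: P(S - E[S] >= t) <= exp(-t^2 / (2*(var + M*t/3)))
    for a sum of independent variables with |X_i - E[X_i]| <= M.
    Uses the boundedness information that Chebyshev ignores."""
    return math.exp(-t**2 / (2.0 * (var + M * t / 3.0)))

# Sum of 100 independent variables, each with variance 0.25, bounded by M = 1:
var, M, t = 100 * 0.25, 1.0, 20.0
cheb = chebyshev_tail(var, t)      # 25/400 = 0.0625
bern = bernstein_tail(var, M, t)   # far smaller for the same deviation t
```

Because Bernstein exploits the support bound M in addition to the variance, its tail estimate decays exponentially in t, which is what makes the relaxed chance constraint much less conservative.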

Relevance: 60.00%

Abstract:

While changes in land precipitation during the last 50 years have been attributed in part to human influences, results vary by season, are affected by data uncertainty, and do not account for changes over the ocean. One of the more physically robust responses of the water cycle to warming is the expected amplification of existing patterns of precipitation minus evaporation. Here, precipitation changes in wet and dry regions are analyzed from satellite data for 1988–2010, covering land and ocean. We derive fingerprints for the expected change from climate model simulations that separately track changes in wet and dry regions. The simulations used are driven with anthropogenic and natural forcings combined, with greenhouse gas forcing only, or with natural forcing only. Results of a detection and attribution analysis show that the fingerprint of combined external forcing is detectable in observations and that this intensification of the water cycle is partly attributable to greenhouse gas forcing.
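The core of a detection and attribution analysis is projecting observations onto a model-derived fingerprint. A minimal one-dimensional sketch of that projection step (plain least squares; real analyses weight by an internal-variability covariance estimated from control runs, and all numbers here are illustrative):

```python
def scaling_factor(fingerprint, obs):
    """Least-squares scaling beta minimizing ||obs - beta * fingerprint||^2.
    A beta significantly greater than zero means the fingerprint pattern
    is 'detected' in the observations."""
    num = sum(f * o for f, o in zip(fingerprint, obs))
    den = sum(f * f for f in fingerprint)
    return num / den

# Illustrative regional trends only: wet regions get wetter, dry get drier.
fingerprint = [0.5, 0.3, -0.2, -0.4]   # modelled trend pattern per region
obs         = [0.6, 0.2, -0.1, -0.5]   # observed trends, same regions
beta = scaling_factor(fingerprint, obs)   # close to 1: pattern amplitude matches
```

In practice the uncertainty of beta is estimated from unforced model variability; the fingerprint is "detected" when that uncertainty range excludes zero, and the change is "attributable" when it includes one for the forced simulation.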

Relevance: 60.00%

Abstract:

The Middle East and Southwest Asia comprise a region that is water-stressed, societally vulnerable, and prone to severe droughts. Large-scale climate variability, particularly La Niña, appears to play an important role in region-wide drought, including the two most severe of the last fifty years (1999–2001 and 2007–2008), with implications for drought forecasting. Important dynamical factors include orography, thermodynamic influence on vertical motion, storm track changes, and moisture transport. Vegetation in the region is strongly impacted by drought and may provide an important feedback mechanism. In future projections, drying of the eastern Mediterranean is a robust feature, as are temperature increases throughout the region, which will affect evaporation and the timing and intensity of snowmelt. Vegetation feedbacks may become more important in a warming climate. There is a wide range of outstanding issues for understanding, monitoring, and predicting drought in the region, including: dynamics of the regional storm track; the relative importance of the range of dynamical mechanisms related to drought; regional coherence of drought; the relationship between synoptic-scale mechanisms and drought; predictability of vegetation and crop yields; stability of remote influences; data uncertainty; and the role of temperature. Development of a regional framework for cooperative work and for dissemination of information and existing forecasts would speed understanding and make better use of available information.

Relevance: 60.00%

Abstract:

The aim of this work is to provide a precise and accurate measurement of the 238U(n,gamma) reaction cross-section. This reaction is of fundamental importance for the design calculations of nuclear reactors, governing the behaviour of the reactor core. In particular, fast-neutron reactors, which are attracting growing interest for their ability to burn radioactive waste, operate in the high-energy region of the neutron spectrum. In this energy region, inconsistencies of up to 15% are present between the existing measurements, and the most recent evaluations disagree with each other. In addition, the assessment of nuclear data uncertainty performed for innovative reactor systems shows that the uncertainty in the radiative capture cross-section of 238U should be further reduced to 1-3% in the energy region from 20 eV to 25 keV. To this end, addressed by the Nuclear Energy Agency as a priority nuclear data need, complementary experiments, one at GELINA and two at the n_TOF facility, were scheduled within the ANDES project of the 7th Framework Programme of the European Commission. This work presents the results of one of the 238U(n,gamma) measurements performed at the n_TOF facility at CERN, carried out with a detection system consisting of two liquid scintillators. The very accurate cross-section from this work is compared with the results obtained from the other measurement performed at n_TOF, which exploited a different and complementary detection technique. The excellent agreement between the two datasets indicates that they can contribute to reducing the cross-section uncertainty down to the required 1-3%.
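Combining independent measurements of the same cross-section is what drives the uncertainty down. A minimal sketch of inverse-variance averaging, with hypothetical values and uncertainties (not the actual n_TOF results):

```python
def combine(measurements):
    """Inverse-variance weighted mean of independent measurements,
    each given as (value, sigma). Returns (combined value, combined sigma);
    the combined sigma is always smaller than any individual sigma."""
    weights = [1.0 / s**2 for _, s in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, total ** -0.5

# Two hypothetical capture cross-section results (barns) with 3% and ~4% errors:
combined, sigma = combine([(0.50, 0.015), (0.52, 0.020)])
```

Consistent datasets taken with complementary detection techniques can be averaged this way; a significant disagreement between them would instead signal an unaccounted-for systematic effect.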

Relevance: 60.00%

Abstract:

Background and Purpose. There is a growing consensus among health care researchers that quality of life (QoL) is an important outcome and, within the field of family caregiving, cost-effectiveness research is needed to determine which programs have the greatest benefit for family members. This study uses a multidimensional approach to measure the cost-effectiveness of a multicomponent intervention designed to improve the quality of life of spousal caregivers of stroke survivors. Methods. The CAReS study (Committed to Assisting with Recovery after Stroke) was a 5-year prospective, longitudinal intervention study of 159 stroke survivors and their spousal caregivers, beginning upon discharge of the stroke survivor from inpatient rehabilitation to home. CAReS cost data were analyzed to determine the incremental cost of the intervention per caregiver. The mean values of the quality-of-life predictor variables for the intervention group of caregivers were compared with the mean values of usual-care groups reported in the literature. The cost of the intervention per caregiver was then divided by each significant difference to calculate the incremental cost-effectiveness ratio for that predictor variable. Results. The cost of the intervention was approximately $2,500 per caregiver. Statistically significant differences were found between the mean scores for the Perceived Stress and Satisfaction with Life scales. Statistically significant differences were not found between the mean scores for the Self-Reported Health Status, Mutuality, and Preparedness scales. Conclusions. This study provides a prototype cost-effectiveness analysis on which researchers can build. Using a multidimensional approach to measure QoL, as in this analysis, incorporates both the subjective and objective components of QoL. Some of the QoL predictor-variable scores differed significantly between the intervention and comparison groups, indicating a significant impact of the intervention.
The estimated cost of that impact was also examined. In future studies, a scale that takes into account both the dimensions of QoL and the weighting each person places on those dimensions should be used to provide a single QoL score per participant. With participant-level cost and outcome data, the uncertainty around each cost-effectiveness ratio can be quantified using the bias-corrected percentile bootstrap method and plotted as cost-effectiveness acceptability curves.
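The ratio and bootstrap steps described above can be sketched as follows. This is a plain percentile bootstrap with hypothetical per-caregiver data; the bias-corrected variant mentioned in the text additionally shifts the percentiles and is omitted here for brevity:

```python
import random

def icer(cost_diff, effect_diff):
    """Incremental cost-effectiveness ratio: extra cost per unit of effect."""
    return cost_diff / effect_diff

def bootstrap_icer_ci(costs, effects, n_boot=2000, alpha=0.05, seed=1):
    """Plain percentile bootstrap CI for the ICER from paired
    participant-level incremental costs and effect gains."""
    rng = random.Random(seed)
    n = len(costs)
    reps = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]   # resample participants
        c = sum(costs[i] for i in idx) / n           # mean incremental cost
        e = sum(effects[i] for i in idx) / n         # mean effect gain
        reps.append(c / e)
    reps.sort()
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

# Hypothetical per-caregiver incremental costs ($) and QoL-score gains:
costs   = [2400, 2600, 2500, 2550, 2450, 2500]
effects = [4.0, 5.5, 5.0, 4.5, 6.0, 5.0]
point = icer(2500, 5.0)                 # $500 per QoL point
lo, hi = bootstrap_icer_ci(costs, effects)
```

Sweeping a willingness-to-pay threshold over the bootstrap replicates, and recording the fraction of replicates below each threshold, yields the cost-effectiveness acceptability curve.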

Relevance: 60.00%

Abstract:

Ice shelves strongly impact coastal Antarctic sea ice and the associated ecosystem through the formation of a sub-sea-ice platelet layer. Although progress has been made in determining and understanding its spatio-temporal variability based on point measurements, investigating this phenomenon on a larger scale remains a challenge due to logistical constraints and a lack of suitable methodology. In this study, we applied a laterally constrained Marquardt-Levenberg inversion to a unique multi-frequency electromagnetic (EM) induction sounding dataset obtained on the landfast sea ice of Atka Bay, eastern Weddell Sea, in 2012. In addition to consistent fast-ice thicknesses and conductivities along >100 km of transects, we present the first comprehensive, high-resolution platelet-layer thickness and conductivity dataset recorded on Antarctic sea ice. The reliability of the algorithm was confirmed using synthetic data, and the inverted platelet-layer thicknesses agreed with drill-hole measurements within the data uncertainty. Ice-volume fractions were calculated from platelet-layer conductivities, revealing that an older and thicker platelet layer is denser and more compacted than a loosely attached, young platelet layer. The overall platelet-layer volume below Atka Bay fast ice suggests that the contribution of ocean/ice-shelf interaction to sea-ice volume in this region is even higher than previously thought. This study also implies that multi-frequency EM induction sounding is an effective approach for determining platelet-layer volume on a larger scale than previously feasible. When applied to airborne multi-frequency EM, this method could provide a step towards an Antarctic-wide quantification of ocean/ice-shelf interaction.
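The step from bulk conductivity to ice-volume fraction can be illustrated with an Archie-type mixing law. This is an assumed relation for illustration only; the exponent m and the seawater conductivity below are hypothetical values, not those used in the study:

```python
def ice_volume_fraction(sigma_bulk, sigma_seawater=2.7, m=2.0):
    """Solid-ice volume fraction from bulk platelet-layer conductivity
    via an assumed Archie-type law: sigma_bulk = sigma_seawater * phi**m,
    where phi is the seawater-filled porosity. Conductivities in S/m;
    sigma_seawater and m are illustrative assumptions."""
    porosity = (sigma_bulk / sigma_seawater) ** (1.0 / m)
    return 1.0 - porosity

# A denser (older) platelet layer conducts less than a loose, young one:
old_layer   = ice_volume_fraction(1.0)   # lower bulk conductivity -> more ice
young_layer = ice_volume_fraction(2.0)
```

The monotonic link between bulk conductivity and solid fraction is why the inverted conductivities can distinguish a compacted old layer from a loosely attached young one.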

Relevance: 60.00%

Abstract:

The selection of metrics for ecosystem restoration programs is critical for improving the quality of monitoring programs and characterizing project success. Moreover, it is often difficult to balance the importance of multiple ecological, social, and economic metrics. The metric-selection process is complex and must simultaneously take into account monitoring data, environmental models, socio-economic considerations, and stakeholder interests. We propose multicriteria decision analysis (MCDA) methods, broadly defined, for selecting optimal sets of metrics to enhance the evaluation of ecosystem restoration alternatives. Two MCDA methods, multiattribute utility theory (MAUT) and probabilistic multicriteria acceptability analysis (ProMAA), are applied and compared for a hypothetical case study of a river restoration involving multiple stakeholders. Overall, the MCDA results in a systematic, unbiased, and transparent solution informing the evaluation of restoration alternatives. The two methods provide comparable results in terms of selected metrics. However, because ProMAA can consider probability distributions for the weights and utility values of the metrics for each criterion, it is suggested as the best option when data uncertainty is high. Despite the added complexity of the metric-selection process, MCDA improves upon the current ad hoc practice of consulting stakeholders and experts, and encourages transparent and quantitative aggregation of data and judgement, increasing the transparency of decision-making in restoration projects. We believe that MCDA can enhance the overall sustainability of ecosystems by addressing both ecological and societal needs.
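A minimal sketch of the MAUT side of such an analysis: a weighted additive utility ranks candidate metrics under stakeholder weights. All metric names, utilities, and weights below are hypothetical:

```python
def maut_score(utilities, weights):
    """Weighted additive multiattribute utility: sum_i w_i * u_i,
    with utilities scaled to [0, 1] and weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * u for w, u in zip(weights, utilities))

# Hypothetical candidate metrics scored against (ecological, social,
# economic) criteria, with illustrative stakeholder weights:
weights = [0.5, 0.3, 0.2]
alternatives = {
    "fish-passage index": [0.9, 0.6, 0.4],
    "riparian cover":     [0.7, 0.8, 0.5],
    "recreation access":  [0.3, 0.9, 0.8],
}
ranked = sorted(alternatives,
                key=lambda a: maut_score(alternatives[a], weights),
                reverse=True)
```

ProMAA generalizes this by drawing the weights and utilities from probability distributions and reporting, for each alternative, the probability of each rank, which is why it is preferred when data uncertainty is high.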

Relevance: 60.00%

Abstract:

Seawater intrusion into coastal agricultural areas due to groundwater abstraction is a major environmental problem along the northeastern coast of Australia. Management options are being explored using numerical modelling; however, questions remain concerning the appropriate level of sophistication in models, the choice of seaward boundary conditions, and how to accommodate heterogeneity and data uncertainty. The choice of seaward boundary condition is important because it affects the amount of salt transported into the aquifers, and it forms the focus of the present study. The impact of this boundary condition is illustrated for the seawater-intrusion problem in the Gooburrum aquifers, which occur within Tertiary sedimentary strata. A two-dimensional variable-density groundwater and solute-transport model was constructed using the computer code 2DFEMFAT (Cheng et al. 1998). The code was tested against an experiment for a steady-state freshwater-saltwater interface and against the Elder (Elder 1967) free-convection problem. Numerical simulations show that imposing the commonly used equivalent hydrostatic freshwater heads, combined with a constant salt concentration at the seaward boundary, results in overestimated seawater intrusion in the lower Gooburrum aquifer. Because this boundary condition allows water flow across the boundary, which subsequently carries salt into the aquifer, a careful check is essential to determine whether too much salt mass is being introduced.
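The "equivalent hydrostatic freshwater head" boundary condition discussed above converts a head measured in saltwater into the freshwater head that exerts the same pressure at a given elevation. A minimal sketch with illustrative densities and elevations:

```python
def equivalent_freshwater_head(saltwater_head, z, rho_s=1025.0, rho_f=1000.0):
    """Equivalent freshwater head at elevation z (m, positive up) for a
    point head measured in saltwater: h_f = z + (rho_s / rho_f) * (h_s - z).
    Densities in kg/m^3 are illustrative values."""
    return z + (rho_s / rho_f) * (saltwater_head - z)

# Seawater at mean sea level (h_s = 0) over a boundary node at z = -40 m:
h_f = equivalent_freshwater_head(0.0, -40.0)   # -40 + 1.025 * 40 = +1.0 m
```

Because the equivalent freshwater head grows with depth along the seaward boundary, imposing it as a fixed head can drive an artificial inflow across the boundary, which is the mechanism by which excess salt mass enters the modelled aquifer.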

Relevance: 60.00%

Abstract:

Construction projects are complex endeavors that require the involvement of different professional disciplines in order to meet various project objectives that are often conflicting. The level of complexity and the multi-objective nature of construction projects lend themselves to collaborative design and construction approaches such as integrated project delivery (IPD), in which the relevant disciplines work together during project conception, design, and construction. Traditionally, the main objectives of construction projects have been to build in the least amount of time at the lowest possible cost, so the inherent and well-established relationship between cost and time has been the focus of many studies. The importance of being able to effectively model relationships among multiple objectives in building construction has been emphasized in a wide range of research. In general, the trade-off relationship between time and cost is well understood and there is ample research on the subject. However, despite the growth of sustainable building design, the relationships between time and environmental impact, and between cost and environmental impact, have not been fully investigated. The objectives of this research were to analyze and identify the relationships among time, cost, and environmental impact, in terms of CO2 emissions, at different levels of a building (material, component, and building level) during the pre-use phase, including manufacturing and construction, and the relationships between life-cycle cost and life-cycle CO2 emissions during the usage phase. Additionally, this research aimed to develop a robust simulation-based multi-objective decision-support tool, called SimulEICon, which takes construction data uncertainty into account and is capable of incorporating life-cycle assessment information into the decision-making process. The findings of this research supported the trade-off relationship between time and cost at different building levels.
Moreover, the relationship between time and CO2 emissions exhibited trade-off behavior at the pre-use phase, whereas the relationship between cost and CO2 emissions was, interestingly, proportional at the pre-use phase; the same pattern persisted from construction through the usage phase. Understanding the relationships between these objectives is key to successfully planning and designing environmentally sustainable construction projects.
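Trade-off relationships like these are typically explored by filtering design alternatives to the non-dominated (Pareto) set. A minimal sketch with hypothetical (time, cost, CO2) triples, all objectives minimized:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(alternatives):
    """Non-dominated alternatives: no other alternative dominates them."""
    return [a for a in alternatives
            if not any(dominates(b, a) for b in alternatives if b is not a)]

# Hypothetical (days, $k, t CO2) triples for three component designs:
designs = [(120, 800, 450), (150, 700, 500), (160, 900, 520)]
front = pareto_front(designs)   # the third design is dominated by the first
```

A decision-support tool like the one described can present only the front to stakeholders, since any dominated design can be replaced by one that is at least as good on time, cost, and emissions simultaneously.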

Relevance: 40.00%

Abstract:

In the context of learning paradigms of identification in the limit, we address the question: why is uncertainty sometimes desirable? We use mind change bounds on the output hypotheses as a measure of uncertainty, and interpret ‘desirable’ as reduction in data memorization, also defined in terms of mind change bounds. The resulting model is closely related to iterative learning with bounded mind change complexity, but the dual use of mind change bounds — for hypotheses and for data — is a key distinctive feature of our approach. We show that situations exist where the more mind changes the learner is willing to accept, the less data it needs to remember in order to converge to the correct hypothesis. We also investigate relationships between our model and learning from good examples, set-driven, monotonic and strong-monotonic learners, as well as class-comprising versus class-preserving learnability.

Relevance: 40.00%

Abstract:

In the context of learning paradigms of identification in the limit, we address the question: why is uncertainty sometimes desirable? We use mind change bounds on the output hypotheses as a measure of uncertainty and interpret ‘desirable’ as reduction in data memorization, also defined in terms of mind change bounds. The resulting model is closely related to iterative learning with bounded mind change complexity, but the dual use of mind change bounds — for hypotheses and for data — is a key distinctive feature of our approach. We show that situations exist where the more mind changes the learner is willing to accept, the less the amount of data it needs to remember in order to converge to the correct hypothesis. We also investigate relationships between our model and learning from good examples, set-driven, monotonic and strong-monotonic learners, as well as class-comprising versus class-preserving learnability.

Relevance: 40.00%

Abstract:

This study considered the problem of predicting survival based on three alternative models: a single Weibull, a mixture of Weibulls, and a cure model. Instead of the common procedure of choosing a single “best” model, where “best” is defined in terms of goodness of fit to the data, a Bayesian model averaging (BMA) approach was adopted to account for model uncertainty. This was illustrated using a case study in which the aim was the description of lymphoma cancer survival with covariates given by phenotypes and gene expression. The results of this study indicate that if the sample size is sufficiently large, one of the three models emerges as having the highest probability given the data, as indicated by the goodness-of-fit measure, the Bayesian information criterion (BIC). However, when the sample size was reduced, no single model emerged as “best”, suggesting that a BMA approach would be appropriate. Although a BMA approach can compromise goodness of fit to the data (when compared to the true model), it can provide robust predictions and facilitate more detailed investigation of the relationships between gene expression and patient survival. Keywords: Bayesian modelling; Bayesian model averaging; cure model; Markov chain Monte Carlo; mixture model; survival analysis; Weibull distribution.
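BMA with BIC-based weights can be sketched in a few lines: each model's posterior probability is approximated from its BIC, and predictions are mixed accordingly. The BIC values and survival predictions below are hypothetical:

```python
import math

def bma_weights(bics):
    """Approximate posterior model probabilities from BIC values:
    w_k proportional to exp(-(BIC_k - min BIC) / 2). Subtracting the
    minimum first avoids underflow for large BICs."""
    best = min(bics)
    raw = [math.exp(-(b - best) / 2.0) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

def bma_predict(predictions, weights):
    """Model-averaged prediction: weighted mix of per-model predictions."""
    return sum(w * p for w, p in zip(weights, predictions))

# Hypothetical BICs for single Weibull, Weibull mixture, and cure model,
# and each model's hypothetical predicted 5-year survival probability:
w = bma_weights([1012.4, 1010.0, 1015.8])
avg_survival = bma_predict([0.62, 0.58, 0.66], w)
```

With a large sample, one BIC dominates and its weight approaches 1, recovering single-model selection; with a small sample, the weights stay spread out and the average hedges across all three models.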

Relevance: 40.00%

Abstract:

The purpose of this paper is to test for the effect of uncertainty in a model of real estate investment in Finland during the highly cyclical period of 1975 to 1998. We use two alternative measures of uncertainty. The first is the volatility of stock market returns; the second is the heterogeneity of the answers in the quarterly business survey of the Confederation of Finnish Industry and Employers. The econometric analysis is based on the autoregressive distributed lag (ADL) model, and the paper applies a 'general-to-specific' modelling approach. We find that the heterogeneity measure is significant in the model, but the volatility of stock market returns is not. The empirical results give some evidence of an uncertainty-induced threshold slowing down real estate investment in Finland.