983 results for estimation risk


Relevance: 30.00%

Abstract:

This paper considers the basic present value model of interest rates under rational expectations with two additional features. First, following McCallum (1994), the model assumes a policy reaction function where changes in the short-term interest rate are determined by the long-short spread. Second, the short-term interest rate and the risk premium processes are characterized by a Markov regime-switching model. Using US post-war interest rate data, this paper finds evidence that a two-regime switching model fits the data better than the basic model. The estimation results also show the presence of two alternative states displaying quite different features.
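The Markov regime-switching mechanism for the short rate can be sketched in a few lines; all parameter values below are hypothetical illustrations, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-regime parameters (illustrative only, not estimated values):
# regime 0 is a calm state, regime 1 a volatile state with faster mean reversion.
theta = np.array([0.04, 0.08])    # regime-specific long-run short rate
kappa = np.array([0.05, 0.20])    # regime-specific mean-reversion speed
sigma = np.array([0.002, 0.010])  # regime-specific shock volatility
P = np.array([[0.95, 0.05],       # regime transition probabilities
              [0.10, 0.90]])

T = 200
state = 0
r = np.empty(T)
r[0] = 0.05
for t in range(1, T):
    # Draw the next regime from the current row of the transition matrix,
    # then update the short rate with that regime's dynamics.
    state = rng.choice(2, p=P[state])
    r[t] = (r[t - 1] + kappa[state] * (theta[state] - r[t - 1])
            + sigma[state] * rng.normal())
```

Estimation of such a model typically proceeds by filtering over the latent regimes (e.g. with the Hamilton filter) and maximising the resulting likelihood.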

Relevance: 30.00%

Abstract:

Structural design is a decision-making process in which a wide spectrum of requirements, expectations, and concerns needs to be properly addressed. Engineering design criteria are considered together with societal and client preferences, and most of these design objectives are affected by the uncertainties surrounding a design. Therefore, realistic design frameworks must be able to handle multiple performance objectives and incorporate uncertainties from numerous sources into the process.

In this study, a multi-criteria based design framework for structural design under seismic risk is explored. The emphasis is on reliability-based performance objectives and their interaction with economic objectives. The framework has analysis, evaluation, and revision stages. In the probabilistic response analysis, seismic loading uncertainties as well as modeling uncertainties are incorporated. For evaluation, two approaches are suggested: one based on preference aggregation and the other based on socio-economics. Both implementations of the general framework are illustrated with simple but informative design examples to explore the basic features of the framework.

The first approach uses concepts similar to those found in multi-criteria decision theory, and directly combines reliability-based objectives with others. This approach is implemented in a single-stage design procedure. In the socio-economics based approach, a two-stage design procedure is recommended in which societal preferences are treated through reliability-based engineering performance measures, but emphasis is also given to economic objectives because these are especially important to the structural designer's client. A rational net asset value formulation including losses from uncertain future earthquakes is used to assess the economic performance of a design. A recently developed assembly-based vulnerability analysis is incorporated into the loss estimation.

The presented performance-based design framework allows investigation of various design issues and their impact on a structural design. It is flexible, readily accommodating new methods and concepts in seismic hazard specification, structural analysis, and loss estimation.

Relevance: 30.00%

Abstract:

This paper is concerned with time-domain optimal control of active suspensions. The optimal control problem has been generalised by incorporating both road disturbances (ride quality) and a representation of driver inputs (handling quality) into the formulation. Both a standard quadratic performance index and a risk-sensitive exponential performance index are considered. Emphasis has been given to practical considerations, including the issue of state estimation in the presence of load disturbances (driver inputs). © 2012 IEEE.
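For the risk-neutral (quadratic) case, the optimal state feedback follows from the discrete-time Riccati recursion; the risk-sensitive exponential index modifies that recursion. A minimal sketch with an invented two-state model, not the paper's suspension model:

```python
import numpy as np

# Hypothetical discrete-time 2-state plant (e.g. body displacement, velocity);
# the matrices below are illustrative, not taken from the paper.
A = np.array([[1.0, 0.01],
              [-0.5, 0.98]])
B = np.array([[0.0], [0.02]])
Q = np.diag([10.0, 1.0])   # penalise ride-quality states
R = np.array([[0.1]])      # penalise actuator effort

# Standard LQR gain by iterating the Riccati recursion to a fixed point.
P = Q.copy()
for _ in range(500):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

# The closed loop A - B K should be stable: spectral radius below 1.
rho = max(abs(np.linalg.eigvals(A - B @ K)))
```

In the risk-sensitive (exponential-of-quadratic) case, the Riccati update acquires an extra term scaled by the risk-sensitivity parameter, recovering the recursion above as that parameter tends to zero.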

Relevance: 30.00%

Abstract:

Quantile regression refers to the process of estimating the quantiles of a conditional distribution and has many important applications within econometrics and data mining, among other domains. In this paper, we show how to estimate these conditional quantile functions within a Bayes risk minimization framework using a Gaussian process prior. The resulting non-parametric probabilistic model is easy to implement and allows non-crossing quantile functions to be enforced. Moreover, it can directly be used in combination with tools and extensions of standard Gaussian Processes such as principled hyperparameter estimation, sparsification, and quantile regression with input-dependent noise rates. No existing approach enjoys all of these desirable properties. Experiments on benchmark datasets show that our method is competitive with state-of-the-art approaches. © 2009 IEEE.
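The pinball (quantile) loss underlying such Bayes-risk formulations can be illustrated directly: minimising it over a constant predictor recovers the empirical quantile. A minimal numeric sketch, not the paper's Gaussian-process model:

```python
import numpy as np

def pinball_loss(y, q, tau):
    """Average pinball (quantile) loss of the constant prediction q at level tau."""
    e = y - q
    return np.mean(np.maximum(tau * e, (tau - 1) * e))

rng = np.random.default_rng(1)
y = rng.normal(size=2000)
tau = 0.9

# Minimising the pinball loss over a grid of constants approximately
# recovers the empirical tau-quantile of the sample.
grid = np.linspace(y.min(), y.max(), 2001)
best = grid[np.argmin([pinball_loss(y, q, tau) for q in grid])]
```

In the paper's setting the constant is replaced by a function of the inputs with a Gaussian process prior, and the same loss is minimised as a Bayes risk.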

Relevance: 30.00%

Abstract:

Hip fracture is the leading cause of acute orthopaedic hospital admission amongst the elderly, with around a third of patients not surviving one year post-fracture. Although various preventative therapies are available, patient selection is difficult. The current state-of-the-art risk assessment tool (FRAX) ignores focal structural defects, such as cortical bone thinning, a critical component in characterizing hip fragility. Cortical thickness can be measured using CT, but this is expensive and involves a significant radiation dose. Instead, Dual-Energy X-ray Absorptiometry (DXA) is currently the preferred imaging modality for assessing hip fracture risk and is used routinely in clinical practice. Our ambition is to develop a tool to measure cortical thickness using multi-view DXA instead of CT. In this initial study, we work with digitally reconstructed radiographs (DRRs) derived from CT data as a surrogate for DXA scans: this enables us to compare the thickness estimates directly with the gold standard CT results. Our approach involves a model-based femoral shape reconstruction followed by a data-driven algorithm to extract numerous cortical thickness point estimates. In a series of experiments on the shaft and trochanteric regions of 48 proximal femurs, we validated our algorithm and established its performance limits using 20 views in the range 0°-171°: estimation errors were 0.19 ± 0.53 mm (mean ± one standard deviation). In a more clinically viable protocol using four views in the range 0°-51°, where no other bony structures obstruct the projection of the femur, measurement errors were -0.07 ± 0.79 mm. © 2013 SPIE.

Relevance: 30.00%

Abstract:

L. Blot and R. Zwiggelaar, 'A volumetric approach to risk assessment in mammography: a feasibility study', Physics in Medicine and Biology 50 (4), 695-708 (2005)

Relevance: 30.00%

Abstract:

BACKGROUND: Cardiovascular disease (CVD) occurs more frequently in individuals with a family history of premature CVD. Within families the demographics of CVD are poorly described. DESIGN: We examined the risk estimation based on the Systematic Coronary Risk Evaluation (SCORE) system and the Joint British Guidelines (JBG) for older unaffected siblings of patients with premature CVD (onset ≤55 years for men and ≤60 years for women). METHODS: Between August 1999 and November 2003 laboratory and demographic details were collected on probands with early-onset CVD and their older unaffected siblings. Siblings were screened for clinically overt CVD by a standard questionnaire and 12-lead electrocardiogram (ECG). RESULTS: A total of 790 siblings was identified and full demographic details were available for 645. The following siblings were excluded: 41 with known diabetes mellitus; seven with random plasma glucose of 11.1 mmol/l or greater; and eight with ischaemic ECG. Data were analysed for 589 siblings from 405 families. The mean age was 55.0 years, 43.1% were men and 28.7% were smokers. The mean total serum cholesterol was 5.8 mmol/l and hypertension was present in 49.4%. Using the SCORE system, when projected to age 60 years, 181 men (71.3%) and 67 women (20.0%) would be eligible for risk factor modification. Using JBG with a 10-year risk of 20% or greater, 42 men (16.5%) and four women (1.2%) would be targeted. CONCLUSIONS: Large numbers of these asymptomatic individuals meet both European and British guidelines for the primary prevention of CVD and should be targeted for risk factor modification. The prevalence of individuals defined as eligible for treatment is much higher when using the SCORE system. © 2007 European Society of Cardiology.

Relevance: 30.00%

Abstract:

The development of methods providing reliable estimates of demographic parameters (e.g., survival rates, fecundity) for wild populations is essential to better understand the ecology and conservation requirements of individual species. A number of methods exist for estimating the demographics of stage-structured populations, but inherent mathematical complexity often limits their uptake by conservation practitioners. Estimating survival rates for pond-breeding amphibians is further complicated by their complex migratory and reproductive behaviours, often resulting in nonobservable states and successive cohorts of eggs and tadpoles. Here we used comprehensive data on 11 distinct breeding toad populations (Bufo calamita) to clarify and assess the suitability of a relatively simple method [the Kiritani-Nakasuji-Manly (KNM) method] to estimate the survival rates of stage-structured populations with overlapping life stages. The study shows that the KNM method is robust and provides realistic estimates of amphibian egg and larval survival rates for species in which breeding can occur as a single pulse or over a period of several weeks. The study also provides estimates of fecundity for seven distinct toad populations and indicates that it is essential to use reliable estimates of fecundity to limit the risk of under- or overestimating the survival rates when using the KNM method. Survival and fecundity rates for B. calamita populations were then used to define population matrices and make a limited exploration of their growth and viability. The findings of the study recently led to the implementation of practical conservation measures at the sites where populations were most vulnerable to extinction. © 2010 The Society of Population Ecology and Springer.
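The KNM method is often presented as a ratio-of-areas estimator: survival from one stage to the next is the ratio of the areas under successive "stage j and later" stage-frequency curves, assuming a constant per-capita daily survival rate. A minimal sketch of that idea (a plausible formulation with invented counts, not the authors' implementation):

```python
import numpy as np

def knm_stage_survival(times, counts):
    """KNM-style estimate of survival from stage j to j+1 as the ratio of
    areas under the 'stage j and later' stage-frequency curves.

    times  : (T,) sampling occasions
    counts : (T, S) individuals counted in each of S stages per occasion
    """
    # Number of individuals in stage j or any later stage at each occasion.
    later = np.cumsum(counts[:, ::-1], axis=1)[:, ::-1]
    # Area under each curve by the trapezoid rule.
    dt = np.diff(times)[:, None]
    areas = (((later[1:] + later[:-1]) / 2.0) * dt).sum(axis=0)
    return areas[1:] / areas[:-1]

# Invented three-stage cohort (e.g. eggs, tadpoles, metamorphs), 5 samplings.
times = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
counts = np.array([[100.0,  0.0,  0.0],
                   [ 60.0, 30.0,  0.0],
                   [ 20.0, 40.0, 15.0],
                   [  5.0, 25.0, 25.0],
                   [  0.0, 10.0, 20.0]])
survival = knm_stage_survival(times, counts)  # stage 1->2 and 2->3 survival
```

Because the "stage j and later" curves are nested, each ratio necessarily lies in (0, 1]; as the abstract notes, the estimates are only as good as the fecundity figures fixing the number entering the first stage.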

Relevance: 30.00%

Abstract:

Arsenic (As) contamination of rice plants can result in high total As concentrations (t-As) in cooked rice, especially if As-contaminated water is used for cooking. This study examines two variables: (1) the cooking method (water volume and inclusion of a washing step); and (2) the rice type (atab and boiled). Cooking water and raw atab and boiled rice contained 40 µg As l−1 and 185 and 315 µg As kg−1, respectively. In general, all cooking methods increased t-As from the levels in raw rice; however, raw boiled rice decreased its t-As by 12.7% when cooked by the traditional method, but increased by 15.9% or 23.5% when cooked by the intermediate or contemporary methods, respectively. Based on the best possible scenario (the traditional cooking method leading to the lowest level of contamination, and the atab rice type with the lowest As content), t-As daily intake was estimated to be 328 µg, which was twice the tolerable daily intake of 150 µg.

Relevance: 30.00%

Abstract:

Arsenic contamination of rice plants by arsenic-polluted irrigation groundwater could result in high arsenic concentrations in cooked rice. The main objective of the study was to estimate the total and inorganic arsenic intakes in a rural population of West Bengal, India, through both drinking water and cooked rice. Simulated cooking of rice with different levels of arsenic species in the cooking water was carried out. The presence of arsenic in the cooking water was provided by four arsenic species (arsenite, arsenate, methylarsonate or dimethylarsinate) and at three total arsenic concentrations (50, 250 or 500 µg l−1). The results show that the arsenic concentration in cooked rice is always higher than that in raw rice and ranges from 227 to 1642 µg kg−1. The cooking process did not change the arsenic speciation in rice. Cooked rice contributed a mean of 41% to the daily intake of inorganic arsenic. The daily inorganic arsenic intakes for water plus rice were 229, 1024 and 2000 µg day−1 for initial arsenic concentrations in the cooking water of 50, 250 and 500 µg arsenic l−1, respectively, compared with the tolerable daily intake of 150 µg day−1.
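The intake arithmetic behind such estimates is simple to reproduce; the consumption figures below are hypothetical placeholders, not the study's survey values:

```python
# Daily arsenic intake from drinking/cooking water plus cooked rice.
# Concentrations echo the paper's lowest-level figures; the consumption
# amounts are hypothetical, not the study's survey data.
water_conc = 50.0    # µg As per litre of water
water_litres = 3.0   # hypothetical daily water consumption (l)
rice_conc = 227.0    # µg As per kg of cooked rice (paper's lowest level)
rice_kg = 0.4        # hypothetical daily cooked-rice consumption (kg)

intake = water_conc * water_litres + rice_conc * rice_kg  # µg per day
tolerable = 150.0    # tolerable daily intake cited in the paper (µg/day)
exceeds = intake > tolerable
```

Even at the lowest cooking-water concentration, plausible consumption levels push the combined intake past the tolerable daily figure, which is the pattern the paper reports.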

Relevance: 30.00%

Abstract:

Lead (Pb) is a non-threshold toxin capable of inducing toxic effects at any blood level but availability of soil screening criteria for assessing potential health risks is limited. The oral bioaccessibility of Pb in 163 soil samples was attributed to sources through solubility estimation and domain identification. Samples were extracted following the Unified BARGE Method. Urban, mineralisation, peat and granite domains accounted for elevated Pb concentrations compared to rural samples. High Pb solubility explained moderate-high gastric (G) bioaccessible fractions throughout the study area. Higher maximum G concentrations were measured in urban (97.6 mg kg−1) and mineralisation (199.8 mg kg−1) domains. Higher average G concentrations occurred in mineralisation (36.4 mg kg−1) and granite (36.0 mg kg−1) domains. Findings suggest diffuse anthropogenic and widespread geogenic contamination could be capable of presenting health risks, having implications for land management decisions in jurisdictions where guidance advises these forms of pollution should not be regarded as contaminated land.

Relevance: 30.00%

Abstract:

This paper presents a novel real-time power-device temperature estimation method that monitors the power MOSFET's junction temperature shift arising from thermal aging effects and incorporates the updated electrothermal models of power modules into digital controllers. Currently, the real-time estimator is emerging as an important tool for active control of device junction temperature as well as online health monitoring for power electronic systems, but its thermal model fails to address the device's ongoing degradation. Because of a mismatch of coefficients of thermal expansion between layers of power devices, repetitive thermal cycling will cause cracks, voids, and even delamination within the device components, particularly in the solder and thermal grease layers. Consequently, the thermal resistance of power devices will increase, making it possible to use thermal resistance (and junction temperature) as key indicators for condition monitoring and control purposes. In this paper, the device temperature predicted via threshold voltage measurements is compared with the real-time estimate, and the difference is attributed to the aging of the device. The thermal models in digital controllers are frequently updated to correct the shift caused by thermal aging effects. Experimental results on three power MOSFETs confirm that the proposed methodologies are effective in incorporating thermal aging effects into the power-device temperature estimator with good accuracy. The developed adaptive technologies can be applied to other power devices such as IGBTs and SiC MOSFETs, and have significant economic implications.
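The aging signature the estimator tracks can be illustrated with a steady-state series thermal network; every number below is invented for illustration:

```python
def junction_temp(power_w, ambient_c, r_th_jc, r_th_ca):
    """Steady-state junction temperature from a series thermal network:
    Tj = Ta + P * (R_junction-case + R_case-ambient)."""
    return ambient_c + power_w * (r_th_jc + r_th_ca)

p_loss, t_amb = 20.0, 40.0  # dissipated power (W), ambient temperature (°C)

tj_fresh = junction_temp(p_loss, t_amb, r_th_jc=0.5, r_th_ca=1.0)
# Solder fatigue / delamination modelled as a 20% rise in junction-case
# thermal resistance; a fixed (un-updated) model would miss this shift.
tj_aged = junction_temp(p_loss, t_amb, r_th_jc=0.5 * 1.2, r_th_ca=1.0)
drift = tj_aged - tj_fresh  # °C gap between aged device and fresh model
```

The gap between a measured temperature (e.g. via the threshold voltage) and the fixed-model estimate is the quantity the paper attributes to aging and uses to update the controller's thermal model.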

Relevance: 30.00%

Abstract:

Bridge construction responds to the need for environmentally friendly design of motorways and facilitates the passage through sensitive natural areas and the bypassing of urban areas. However, according to numerous research studies, bridge construction often incurs substantial budget overruns. Therefore, it is necessary early in the planning process for the decision makers to have reliable estimates of the final cost based on previously constructed projects. At the same time, the current European financial crisis reduces the available capital for investments and financial institutions are even less willing to finance transportation infrastructure. Consequently, it is even more necessary today to estimate the budget of high-cost construction projects, such as road bridges, with reasonable accuracy, in order for the state funds to be invested with lower risk and the projects to be designed with the highest possible efficiency. In this paper, a Bill-of-Quantities (BoQ) estimation tool for road bridges is developed in order to support the decisions made at the preliminary planning and design stages of highways. Specifically, a Feed-Forward Artificial Neural Network (ANN) with a hidden layer of 10 neurons is trained to predict the superstructure material quantities (concrete, pre-stressed steel and reinforcing steel) using the width of the deck, the adjusted length of span or cantilever and the type of the bridge as input variables. The training dataset includes actual data from 68 recently constructed concrete motorway bridges in Greece. According to the relevant metrics, the developed model captures the complex interrelations in the dataset very well and demonstrates strong generalisation capability. Furthermore, it outperforms the linear regression models developed for the same dataset.
Therefore, the proposed cost estimation model stands as a useful and reliable tool for the construction industry as it enables planners to reach informed decisions for technical and economic planning of concrete bridge projects from their early implementation stages.
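The architecture described (three inputs, ten hidden neurons, three material-quantity outputs) can be sketched with synthetic data standing in for the 68-bridge dataset; the generating relations below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the paper's dataset: inputs are deck width,
# adjusted span length and a bridge-type code; outputs are three material
# quantities. The generating relations are invented, not the real data.
n = 200
X = np.column_stack([rng.uniform(8, 15, n),      # deck width (m)
                     rng.uniform(20, 60, n),     # adjusted span (m)
                     rng.integers(0, 3, n)])     # bridge-type code
Y = np.column_stack([X[:, 0] * X[:, 1] * 0.5,    # "concrete"
                     X[:, 1] * 2.0,              # "pre-stressed steel"
                     X[:, 0] * 30.0])            # "reinforcing steel"

# Standardise, then train a 3-10-3 feed-forward network (one hidden layer
# of 10 neurons, as in the paper) by plain full-batch gradient descent.
Xs = (X - X.mean(0)) / X.std(0)
Ys = (Y - Y.mean(0)) / Y.std(0)
W1 = rng.normal(0, 0.5, (3, 10)); b1 = np.zeros(10)
W2 = rng.normal(0, 0.5, (10, 3)); b2 = np.zeros(3)
lr = 0.05
for _ in range(3000):
    H = np.tanh(Xs @ W1 + b1)              # hidden layer activations
    G = (H @ W2 + b2 - Ys) / n             # scaled output-error gradient
    GH = (G @ W2.T) * (1 - H ** 2)         # backpropagated through tanh
    W2 -= lr * (H.T @ G); b2 -= lr * G.sum(0)
    W1 -= lr * (Xs.T @ GH); b1 -= lr * GH.sum(0)

H = np.tanh(Xs @ W1 + b1)
mse = float(((H @ W2 + b2 - Ys) ** 2).mean())  # fit on standardised outputs
```

In practice one would train on the actual BoQ records and hold out bridges to verify the generalisation capability the paper reports.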

Relevance: 30.00%

Abstract:

The exponential growth of the world population has led to an increase in settlements, often located in areas prone to natural disasters, including earthquakes. Consequently, despite the important advances in the field of natural catastrophe modelling and risk mitigation actions, the overall human losses have continued to increase and unprecedented economic losses have been registered. In the research work presented herein, various areas of earthquake engineering and seismology are thoroughly investigated, and a case study application for mainland Portugal is performed. Seismic risk assessment is a critical link in the reduction of casualties and damages due to earthquakes. Recognition of this relation has led to a rapid rise in demand for accurate, reliable and flexible numerical tools and software. In the present work, an open-source platform for seismic hazard and risk assessment is developed. This software is capable of computing the distribution of losses or damage for an earthquake scenario (deterministic event-based) or earthquake losses due to all the possible seismic events that might occur within a region for a given interval of time (probabilistic event-based). This effort has been developed following an open and transparent philosophy, and it is therefore available to any individual or institution. The estimation of the seismic risk depends mainly on three components: seismic hazard, exposure and vulnerability. The latter component assumes special importance, as by intervening with appropriate retrofitting solutions, it may be possible to directly reduce the seismic risk. The employment of analytical methodologies is fundamental in the assessment of structural vulnerability, particularly in regions where post-earthquake building damage data might not be available. Several common methodologies are investigated, and conclusions are drawn regarding the method that can provide an optimal balance between accuracy and computational effort.
In addition, a simplified approach based on the displacement-based earthquake loss assessment (DBELA) is proposed, which allows for the rapid estimation of fragility curves, considering a wide spectrum of uncertainties. A novel vulnerability model for the reinforced concrete building stock in Portugal is proposed in this work, using statistical information collected from hundreds of real buildings. An analytical approach based on nonlinear time history analysis is adopted, and the impact of a set of key parameters is investigated, including the damage state criteria and the chosen intensity measure type. A comprehensive review of previous studies that contributed to the understanding of the seismic hazard and risk for Portugal is presented. An existing seismic source model was employed with recently proposed attenuation models to calculate probabilistic seismic hazard throughout the territory. The latter results are combined with information from the 2011 Building Census and the aforementioned vulnerability model to estimate economic loss maps for a return period of 475 years. These losses are disaggregated across the different building typologies, and conclusions are drawn regarding the construction types most vulnerable to seismic activity.
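The scenario (deterministic event-based) loss computation combines the three components named above: hazard fixes the ground-motion intensity, vulnerability supplies damage-state probabilities, and exposure supplies asset values. A minimal sketch with invented numbers:

```python
import numpy as np

# Damage-to-loss ratios for four damage states: none/slight/moderate/collapse.
# All figures below are invented for illustration.
damage_ratios = np.array([0.0, 0.1, 0.4, 1.0])

def expected_loss(value, ds_probs):
    """Expected monetary loss for one asset given its damage-state
    probabilities (e.g. read off fragility curves at the scenario's
    intensity measure level)."""
    assert np.isclose(ds_probs.sum(), 1.0)
    return value * float(ds_probs @ damage_ratios)

# Two hypothetical buildings under one scenario ground motion:
portfolio = [(200_000.0, np.array([0.50, 0.30, 0.15, 0.05])),
             (350_000.0, np.array([0.70, 0.20, 0.08, 0.02]))]
total = sum(expected_loss(v, p) for v, p in portfolio)
```

A probabilistic event-based run repeats this calculation over a stochastic event set and aggregates the losses per return period, which is how loss maps such as the 475-year estimates above are assembled.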