950 results for Model accuracy


Relevance: 30.00%

Abstract:

An integrated model relating workplace rumor activity, belief, and accuracy is proposed and tested. Senior VPs of Communications from a sample of Fortune 500 corporations and CEOs of established public relations firms were surveyed regarding rumor episodes that they had experienced. Results confirmed previous research on the role of uncertainty, anxiety, and belief in rumor activity. In addition, a reduced sense of control mediated the effects of uncertainty on anxiety, and anxiety mediated the effects of importance on rumor activity. Evidence was found for the role of group bias in how strongly a rumor is believed. Rumor activity was also implicated in the formation of more accurate rumors. The significance of these results for rumor theory and for public relations practitioners is presented. (C) 2002 Elsevier Science Inc. All rights reserved.

Relevance: 30.00%

Abstract:

Predictions of flow patterns in a 600-mm scale model SAG mill made using four classes of discrete element method (DEM) models are compared to experimental photographs. The accuracy of the various models is assessed using quantitative data on shoulder, toe and vortex center positions taken from ensembles of both experimental and simulation results. These detailed comparisons reveal the strengths and weaknesses of the various models for simulating mills and allow the effect of different modelling assumptions to be quantitatively evaluated. In particular, very close agreement is demonstrated between the full 3D model (including the end wall effects) and the experiments. It is also demonstrated that the traditional two-dimensional circular-particle DEM model under-predicts the shoulder, toe and vortex center positions by around 10 degrees and also under-predicts the power draw. The effect of particle shape and the dimensionality of the model are also assessed, with particle shape predominantly affecting the shoulder position while the dimensionality of the model affects mainly the toe position. Crown Copyright (C) 2003 Published by Elsevier Science B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Model updating methods often neglect the fact that all physical structures are damped. Such a simplification eases the structural modelling, but it compromises the accuracy of predictions of the structural dynamic behaviour. In the present work, the authors address the problem of finite element (FE) model updating based on measured frequency response functions (FRFs), taking damping into account. The proposed procedure builds upon the complex experimental data, which contain information related to the damped FE model parameters, and has the advantage of requiring no prior knowledge about the damping matrix structure or its content; only the damping type needs to be defined. Numerical simulations are performed in order to establish the applicability of the proposed damped FE model updating technique, and its results are discussed in terms of the correlation between the simulated experimental complex FRFs and those obtained from the updated FE model.

Relevance: 30.00%

Abstract:

Proceedings of SPIE 7477, Image and Signal Processing for Remote Sensing XV, 28 September 2009.

Relevance: 30.00%

Abstract:

This paper presents a novel approach to WLAN propagation models for use in indoor localization. The major goal of this work is to eliminate the need for in situ data collection to generate the fingerprinting map; instead, the map is generated using analytical propagation models such as COST Multi-Wall, COST 231 average wall, and Motley-Keenan. The location estimation algorithms kNN (K-Nearest Neighbour) and WkNN (Weighted K-Nearest Neighbour) were used to determine the accuracy of the proposed technique. This work relies on analytical and measurement tools to determine which path-loss propagation models are better suited to location estimation applications based on the Received Signal Strength Indicator (RSSI). The study presents different proposals for choosing the most appropriate values for the models' parameters, such as obstacle attenuations and coefficients. Some adjustments to these models, particularly to Motley-Keenan, accounting for the thickness of walls, are proposed. The best solution found is based on the adjusted Motley-Keenan and COST models, which allow the propagation loss to be estimated for several environments. Results obtained from two testing scenarios showed the reliability of the adjustments, yielding smaller errors between measured and predicted values.
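The fingerprinting-plus-kNN pipeline described above can be sketched as follows. The path-loss formula below is a generic log-distance model with a per-wall attenuation term in the spirit of Motley-Keenan, and all numeric parameters (reference loss, path-loss exponent, wall attenuation, AP layout) are illustrative assumptions, not values from the paper:

```python
import math

def motley_keenan_rssi(tx_dbm, d, n=2.0, walls=0, wall_loss_db=3.0,
                       d0=1.0, pl0_db=40.0):
    """Predicted RSSI (dBm) at distance d (m): log-distance path loss plus
    a per-wall attenuation term, in the spirit of Motley-Keenan.
    All parameter values here are illustrative assumptions."""
    path_loss = pl0_db + 10 * n * math.log10(max(d, d0) / d0) + walls * wall_loss_db
    return tx_dbm - path_loss

def knn_locate(fingerprint_map, observed_rssi, k=3):
    """fingerprint_map: list of ((x, y), [rssi per AP]) built from the model,
    not from a site survey. Returns the centroid of the k nearest entries
    in RSSI space (plain kNN; WkNN would weight by inverse RSSI distance)."""
    scored = sorted(
        fingerprint_map,
        key=lambda e: sum((a - b) ** 2 for a, b in zip(e[1], observed_rssi)),
    )[:k]
    xs = [p[0][0] for p in scored]
    ys = [p[0][1] for p in scored]
    return (sum(xs) / k, sum(ys) / k)

# Synthetic map: 2 APs at (0,0) and (10,0), reference points on a 10 m grid.
aps = [(0.0, 0.0), (10.0, 0.0)]
fp_map = []
for gx in range(11):
    for gy in range(11):
        rssis = [motley_keenan_rssi(0.0, math.hypot(gx - ax, gy - ay) or 0.5)
                 for ax, ay in aps]
        fp_map.append(((float(gx), float(gy)), rssis))

# Locate a device that is actually at (5, 5):
est = knn_locate(fp_map, [motley_keenan_rssi(0.0, math.hypot(5 - ax, 5 - ay))
                          for ax, ay in aps])
```

With only two APs the RSSI-to-position mapping is ambiguous in general; a real deployment would use more APs and, as in the paper, per-wall terms tuned to the building.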

Relevance: 30.00%

Abstract:

The purpose of this paper is to conduct a methodical drawback analysis of a financial supplier risk management approach which is currently implemented in the automotive industry. Based on identified methodical flaws, the risk assessment model is further developed by introducing a malus system which incorporates hidden risks into the model and by revising the derivation of the most central risk measure in the current model. Both methodical changes lead to significant enhancements in terms of risk assessment accuracy, supplier identification and workload efficiency.

Relevance: 30.00%

Abstract:

Inductive learning aims at finding general rules that hold true in a database. Targeted learning seeks rules for predicting the value of a variable based on the values of others, as in linear or non-parametric regression analysis. Non-targeted learning finds regularities without a specific prediction goal. We model the product of non-targeted learning as rules that state that a certain phenomenon never happens, or that certain conditions necessitate another. For all types of rules there is a trade-off between a rule's accuracy and its simplicity, so rule selection can be viewed as a choice problem among pairs of degree of accuracy and degree of complexity. However, one cannot in general tell what the feasible set in the accuracy-complexity space is. Formally, we show that finding out whether a point belongs to this set is computationally hard. In particular, in the context of linear regression, finding a small set of variables that attains a certain value of R^2 is computationally hard. Computational complexity may explain why a person is not always aware of rules that, if asked, she would find valid. This, in turn, may explain why one can change other people's minds (opinions, beliefs) without providing new information.
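The regression version of this trade-off can be made concrete. The decision problem "is there a subset of at most k variables attaining R^2 at least r?" invites the brute-force search below, whose worst case visits every one of the 2^p subsets; that combinatorial blow-up is the intuition behind the hardness result. A stdlib-only sketch on toy data (data and names are illustrative):

```python
import itertools

def ols_r2(X_cols, y):
    """R^2 of a least-squares fit y ~ intercept + given columns, solving
    the normal equations by Gaussian elimination (stdlib only)."""
    n = len(y)
    cols = [[1.0] * n] + [list(c) for c in X_cols]
    k = len(cols)
    # Normal equations A b = c with A = X^T X, c = X^T y
    A = [[sum(cols[i][t] * cols[j][t] for t in range(n)) for j in range(k)]
         for i in range(k)]
    c = [sum(cols[i][t] * y[t] for t in range(n)) for i in range(k)]
    for p in range(k):                      # elimination with partial pivoting
        piv = max(range(p, k), key=lambda r: abs(A[r][p]))
        A[p], A[piv] = A[piv], A[p]
        c[p], c[piv] = c[piv], c[p]
        for r in range(p + 1, k):
            f = A[r][p] / A[p][p]
            for j in range(p, k):
                A[r][j] -= f * A[p][j]
            c[r] -= f * c[p]
    b = [0.0] * k
    for i in range(k - 1, -1, -1):          # back substitution
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    yhat = [sum(b[j] * cols[j][t] for j in range(k)) for t in range(n)]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def smallest_subset_reaching(X, y, r2_target):
    """Exhaustive search over variable subsets, smallest first. The worst
    case enumerates all 2^p subsets, which is what makes the decision
    problem intractable as p grows."""
    p = len(X)
    for size in range(1, p + 1):
        for subset in itertools.combinations(range(p), size):
            if ols_r2([X[i] for i in subset], y) >= r2_target:
                return subset
    return None

# Toy data: y is an exact affine function of the second variable.
X = [[5.0, 1.0, 4.0, 2.0, 3.0], [1.0, 2.0, 3.0, 4.0, 5.0]]
y = [3.0, 5.0, 7.0, 9.0, 11.0]
best = smallest_subset_reaching(X, y, 0.99)
```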

Relevance: 30.00%

Abstract:

PURPOSE: The purpose of this study was to develop a mathematical model (sine model, SIN) to describe fat oxidation kinetics as a function of the relative exercise intensity [% of maximal oxygen uptake (%VO2max)] during graded exercise, and to determine the exercise intensity (Fatmax) that elicits maximal fat oxidation (MFO) and the intensity at which fat oxidation becomes negligible (Fatmin). This model includes three independent variables (dilatation, symmetry, and translation) that incorporate the primary expected modulations of the curve due to training level or body composition. METHODS: Thirty-two healthy volunteers (17 women and 15 men) performed a graded exercise test on a cycle ergometer, with 3-min stages and 20-W increments. Substrate oxidation rates were determined using indirect calorimetry. SIN was compared with measured values (MV) and with other methods currently used [i.e., the RER method (MRER) and third-order polynomial curves (P3)]. RESULTS: There was no significant difference in fitting accuracy between SIN and P3 (P = 0.157), whereas MRER was less precise than SIN (P < 0.001). Fatmax (44 ± 10% VO2max) and MFO (0.37 ± 0.16 g·min(-1)) determined using SIN were significantly correlated with MV, P3, and MRER (P < 0.001). The dilatation variable was correlated with Fatmax, Fatmin, and MFO (r = 0.79, r = 0.67, and r = 0.60, respectively, P < 0.001). CONCLUSIONS: The SIN model has the same precision as the other methods currently used to determine Fatmax and MFO, but in addition allows calculation of Fatmin. Moreover, its three independent variables are directly related to the main expected modulations of the fat oxidation curve. SIN therefore seems to be an appropriate tool for analyzing fat oxidation kinetics obtained during graded exercise.
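The idea of reading Fatmax, MFO and Fatmin off a fitted sine-shaped curve can be sketched as below. The parameterization of the sine (how dilatation, symmetry and translation enter) is an illustrative guess, not the paper's actual equation, and the parameter values are arbitrary:

```python
import math

def sin_model(intensity, mfo, dilatation, symmetry, translation):
    """Sine-shaped fat oxidation rate (g/min) vs. intensity (%VO2max).
    This is an assumed illustrative form: dilatation stretches the curve,
    symmetry skews it, translation shifts it. Not the published equation."""
    x = (intensity / 100.0) * math.pi / dilatation + translation
    return mfo * math.sin(x) ** symmetry if 0.0 < x < math.pi else 0.0

def fat_landmarks(params, n=1001):
    """Grid search for Fatmax (intensity of maximal fat oxidation),
    MFO (the maximum itself) and Fatmin (the intensity above Fatmax at
    which fat oxidation becomes negligible)."""
    grid = [100.0 * i / (n - 1) for i in range(n)]
    rates = [sin_model(g, *params) for g in grid]
    i_max = max(range(n), key=rates.__getitem__)
    fatmax, mfo = grid[i_max], rates[i_max]
    fatmin = next((grid[i] for i in range(i_max, n) if rates[i] <= 1e-9), 100.0)
    return fatmax, mfo, fatmin

# With neutral shape parameters this illustrative curve peaks at 50% VO2max:
fatmax, mfo, fatmin = fat_landmarks((0.37, 1.0, 1.0, 0.0))
```

In practice the three shape parameters would be fitted to the calorimetry data per subject; the landmark extraction then works the same way on the fitted curve.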

Relevance: 30.00%

Abstract:

Rats were treated postnatally (PND 5-16) with BSO (L-buthionine-(S,R)-sulfoximine) in an animal model of schizophrenia based on a transient glutathione deficit. The BSO-treated rats were impaired in patrolling a maze or a homing table when adult, yet demonstrated preserved escape learning, place discrimination and reversal in a water maze task [37]. In the present work, BSO rats' performance in the water maze was assessed in conditions controlling for the available visual cues. First, in a completely curtained environment with two salient controlled cues, BSO rats showed little accuracy compared to control rats. Second, pre-trained BSO rats were impaired in reaching the familiar spatial position when curtains partially occluded different portions of the room environment in successive sessions. The apparently preserved place learning in a classical water maze task thus appears to require the stability and richness of visual landmarks in the surrounding environment. In other words, the accuracy of BSO rats in place and reversal learning is impaired in a minimal-cue condition or when the visual panorama changes between trials. However, if the panorama remains rich and stable between trials, BSO rats are equally efficient in reaching a familiar position or in learning a new one. This suggests that the accurate performance of BSO rats in the water maze does not satisfy all the criteria of cognitive-map-based navigation relying on the integration of polymodal cues. It supports the general hypothesis of a binding deficit in BSO rats.

Relevance: 30.00%

Abstract:

Practice guidelines recommend outpatient care for selected patients with non-massive pulmonary embolism (PE), but fail to specify how these low-risk patients should be identified. Using data from U.S. patients, we previously derived the Pulmonary Embolism Severity Index (PESI), a prediction rule that risk stratifies patients with PE. We sought to validate the PESI in a European patient cohort. We prospectively validated the PESI in patients with PE diagnosed at six emergency departments in three European countries. We used baseline data for the rule's 11 prognostic variables to stratify patients into five risk classes (I-V) of increasing probability of mortality. The outcome was overall mortality at 90 days after presentation. To assess the accuracy of the PESI to predict mortality, we estimated the sensitivity, specificity, and predictive values for low- (risk classes I/II) versus higher-risk patients (risk classes III-V), and the discriminatory power using the area under the receiver operating characteristic (ROC) curve. Among 357 patients with PE, overall mortality was 5.9%, ranging from 0% in class I to 17.9% in class V. The 186 (52%) low-risk patients had an overall mortality of 1.1% (95% confidence interval [CI]: 0.1-3.8%) compared to 11.1% (95% CI: 6.8-16.8%) in the 171 (48%) higher-risk patients. The PESI had a high sensitivity (91%, 95% CI: 71-97%) and a negative predictive value (99%, 95% CI: 96-100%) for predicting mortality. The area under the ROC curve was 0.78 (95% CI: 0.70-0.86). The PESI reliably identifies patients with PE who are at low risk of death and who are potential candidates for outpatient care. The PESI may help physicians make more rational decisions about hospitalization for patients with PE.
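The reported operating characteristics can be re-derived from the counts given in the abstract. The sketch below reconstructs the 2x2 table (5.9% of 357 gives 21 deaths overall; 1.1% of the 186 low-risk patients gives 2 deaths; 11.1% of the 171 higher-risk patients gives 19) and recomputes sensitivity, specificity and negative predictive value:

```python
def diagnostic_stats(tp, fn, fp, tn):
    """Sensitivity, specificity and negative predictive value from a
    2x2 table. Here 'test positive' means PESI class III-V (higher risk)
    and the outcome is 90-day mortality."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, npv

# Counts reconstructed from the abstract's percentages:
# 19 deaths among 171 higher-risk, 2 deaths among 186 low-risk patients.
sens, spec, npv = diagnostic_stats(tp=19, fn=2, fp=171 - 19, tn=186 - 2)
```

These counts give a sensitivity of about 90% and an NPV of about 99%, consistent (to rounding) with the 91% and 99% reported in the abstract.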

Relevance: 30.00%

Abstract:

OBJECTIVE: To elucidate the diagnostic accuracy of granulocyte colony-stimulating factor (G-CSF), interleukin-8 (IL-8), and interleukin-1 receptor antagonist (IL-1ra) in identifying patients with sepsis among critically ill pediatric patients with suspected infection. DESIGN AND SETTING: Nested case-control study in a multidisciplinary neonatal and pediatric intensive care unit (PICU). PATIENTS: PICU patients during a 12-month period with suspected infection and plasma available from the time of clinical suspicion (254 episodes, 190 patients). MEASUREMENTS AND RESULTS: Plasma levels of G-CSF, IL-8, and IL-1ra were measured. Episodes were classified on the basis of clinical and bacteriological findings into culture-confirmed sepsis, probable sepsis, localized infection, viral infection, and no infection. Plasma levels were significantly higher in episodes of culture-confirmed sepsis than in episodes with ruled-out infection. The area under the receiver operating characteristic curve was higher for IL-8 and G-CSF than for IL-1ra. Combining IL-8 and G-CSF improved the diagnostic performance, particularly for the detection of Gram-negative sepsis. Sensitivity was low (<50%) in detecting Staphylococcus epidermidis bacteremia or localized infections. CONCLUSIONS: In this heterogeneous population of critically ill children with suspected infection, a model combining plasma levels of IL-8 and G-CSF identified patients with sepsis. Negative results do not rule out S. epidermidis bacteremia or locally confined infectious processes. The model requires validation in an independent dataset.
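The ROC comparison behind this abstract can be illustrated with a rank-based AUC (equivalent to the Mann-Whitney statistic). The marker values below are synthetic, invented purely to show how combining two markers can raise the AUC above either alone; they are not data from the study, and the "combined model" here is a deliberately crude unweighted sum rather than the fitted model the authors used:

```python
def auc(scores_pos, scores_neg):
    """Rank-based AUC (Mann-Whitney): the probability that a randomly
    chosen positive case scores above a randomly chosen negative one,
    with ties counted as 1/2."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Synthetic marker levels (arbitrary units), for illustration only:
il8_sepsis, il8_other = [10.0, 2.0, 9.0, 8.0], [1.0, 3.0, 2.0, 7.0]
gcsf_sepsis, gcsf_other = [2.0, 9.0, 8.0, 10.0], [3.0, 1.0, 7.0, 2.0]

auc_il8 = auc(il8_sepsis, il8_other)
# Crude combined score: unweighted sum of both markers per episode.
auc_combined = auc(
    [a + b for a, b in zip(il8_sepsis, gcsf_sepsis)],
    [a + b for a, b in zip(il8_other, gcsf_other)],
)
```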

Relevance: 30.00%

Abstract:

Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to reduce efficiently both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (> 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship.

In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), which are typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator was thoroughly validated against measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in the reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries. For large open fields (> 10x10 cm2), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in the reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used for conventional treatments using large open fields only.

The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly out of the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs.

Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distributions. Absolute risks varied substantially, as did the ratio of risk between two treatment techniques, reflecting the large uncertainties involved with current risk models. Despite all these uncertainties, the hybrid IMRT investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.

Relevance: 30.00%

Abstract:

In this paper the two main drawbacks of heat balance integral methods are examined. Firstly, we investigate the choice of approximating function. For a standard polynomial form it is shown that combining the Heat Balance and Refined Integral methods to determine the power of the highest-order term leads either to the same or, more often, to greatly improved accuracy over standard methods. Secondly, we examine thermal problems with a time-dependent boundary condition. In doing so we develop a logarithmic approximating function. This new function allows us to model moving peaks in the temperature profile, a feature that previous heat balance methods cannot capture. If the boundary temperature varies so that at some time t > 0 it equals the far-field temperature, then standard methods predict that the temperature is everywhere at this constant value; the new method predicts the correct behaviour. It is also shown that this function provides even more accurate results, when coupled with the new CIM, than the polynomial profile. The analysis primarily focuses on a specified constant boundary temperature and is then extended to constant-flux, Newton cooling and time-dependent boundary conditions.
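The classical heat balance integral construction the paper builds on can be sketched for the canonical problem (semi-infinite solid, constant unit boundary temperature, heat equation with diffusivity alpha): assume a profile theta = (1 - x/delta)^n and require the integral of the PDE over [0, delta] to hold, which yields delta = sqrt(2n(n+1)*alpha*t). The code below compares this profile (with the common choice n = 2) against the exact erfc similarity solution; it does not implement the paper's refined, combined or logarithmic variants:

```python
import math

def hbim_profile(x, t, alpha=1.0, n=2):
    """Heat Balance Integral approximation for a semi-infinite solid with
    unit boundary temperature: theta = (1 - x/delta)^n, where integrating
    the heat equation over [0, delta] gives delta = sqrt(2n(n+1)*alpha*t).
    n = 2 is the classical choice; refined methods instead solve for n."""
    delta = math.sqrt(2 * n * (n + 1) * alpha * t)
    return (1 - x / delta) ** n if x < delta else 0.0

def exact_profile(x, t, alpha=1.0):
    """Exact similarity solution: theta = erfc(x / (2*sqrt(alpha*t)))."""
    return math.erfc(x / (2 * math.sqrt(alpha * t)))

# Pointwise error of the n = 2 profile at x = 1, t = 1 (alpha = 1):
err = abs(hbim_profile(1.0, 1.0) - exact_profile(1.0, 1.0))
```

The error at this point is a few percent, which is the typical accuracy that motivates choosing the exponent, or the whole profile shape, more carefully.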

Relevance: 30.00%

Abstract:

Measuring the contribution of individual transactions to the total risk of a credit portfolio is a major issue in financial institutions. VaR Contributions (VaRC) and Expected Shortfall Contributions (ESC) have become two popular ways of quantifying these risks. However, the usual Monte Carlo (MC) approach is known to be a very time-consuming method for computing these risk contributions. In this paper we consider the Wavelet Approximation (WA) method for Value at Risk (VaR) computation presented in [Mas10] in order to calculate the Expected Shortfall (ES) and the risk contributions under the Vasicek one-factor model framework. We decompose the VaR and the ES as a sum of sensitivities representing the marginal impact on the total portfolio risk. Moreover, we present technical improvements to the Wavelet Approximation that considerably reduce the computational effort of the approximation while, at the same time, increasing its accuracy.
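Under the Vasicek one-factor model, the plain MC benchmark that the wavelet approximation is designed to accelerate looks roughly like the sketch below: obligor i defaults when sqrt(rho)*Z + sqrt(1-rho)*eps_i falls below the default threshold Phi^-1(PD). The portfolio, parameters and simulation budget are illustrative, and the WA method itself is not reproduced here:

```python
import random
import statistics

def vasicek_mc(weights, pd, rho, alpha=0.99, n_sims=20000, seed=7):
    """Monte Carlo VaR, ES and ES contributions (ESC) under the Vasicek
    one-factor model: obligor i defaults when
    sqrt(rho)*Z + sqrt(1-rho)*eps_i < Phi^-1(pd), losing its full weight.
    This sampling loop is the slow part that faster approximations target."""
    rng = random.Random(seed)
    thr = statistics.NormalDist().inv_cdf(pd)
    m = len(weights)
    losses, scenarios = [], []
    for _ in range(n_sims):
        z = rng.gauss(0.0, 1.0)                     # common systematic factor
        li = [w if rho ** 0.5 * z + (1 - rho) ** 0.5 * rng.gauss(0.0, 1.0) < thr
              else 0.0 for w in weights]            # idiosyncratic draws
        scenarios.append(li)
        losses.append(sum(li))
    order = sorted(range(n_sims), key=losses.__getitem__)
    var = losses[order[int(alpha * n_sims)]]        # empirical alpha-quantile
    tail = [i for i in range(n_sims) if losses[i] >= var]
    es = sum(losses[i] for i in tail) / len(tail)
    # ES contributions: each obligor's expected loss in tail scenarios;
    # by construction they sum (up to float rounding) to the total ES.
    esc = [sum(scenarios[i][j] for i in tail) / len(tail) for j in range(m)]
    return var, es, esc

# Illustrative homogeneous portfolio: 10 unit exposures, PD 5%, rho 0.2.
var, es, esc = vasicek_mc([1.0] * 10, pd=0.05, rho=0.2)
```

The additivity of the ESC vector is exactly the decomposition of total risk into per-transaction sensitivities that the paper computes analytically instead of by simulation.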

Relevance: 30.00%

Abstract:

BACKGROUND: The ASTRAL score was recently shown to reliably predict three-month functional outcome in patients with acute ischemic stroke. AIM: The study aims to investigate whether information from multimodal imaging increases the ASTRAL score's accuracy. METHODS: All patients registered in the ASTRAL registry until March 2011 were included. In multivariate logistic-regression analyses, we added covariates derived from parenchymal, vascular, and perfusion imaging to the 6-parameter model of the ASTRAL score. If a specific imaging covariate remained an independent predictor of a three-month modified Rankin score > 2, the area under the curve (AUC) of this new model was calculated and compared with the ASTRAL score's AUC. We also performed similar logistic-regression analyses in arbitrarily chosen patient subgroups. RESULTS: When added to the ASTRAL score, the following covariates from admission computed tomography/magnetic resonance imaging-based multimodal imaging were not significant predictors of outcome: any stroke-related acute lesion, any non-stroke-related lesion, chronic/subacute stroke, leukoaraiosis, significant arterial pathology in the ischemic territory on computed tomography angiography/magnetic resonance angiography/Doppler, significant intracranial arterial pathology in the ischemic territory, and focal hypoperfusion on perfusion computed tomography. The Alberta Stroke Program Early CT score on plain imaging and any significant extracranial arterial pathology on computed tomography angiography/magnetic resonance angiography/Doppler were independent predictors of outcome (odds ratio: 0.93, 95% CI: 0.87-0.99 and odds ratio: 1.49, 95% CI: 1.08-2.05, respectively) but did not increase the ASTRAL score's AUC (0.849 vs. 0.850, and 0.8563 vs. 0.8564, respectively). In exploratory analyses in subgroups of different prognosis, age or stroke severity, no covariate was found to increase the ASTRAL score's AUC either.
CONCLUSIONS: The addition of information derived from multimodal imaging does not increase the ASTRAL score's accuracy in predicting functional outcome, despite having independent prognostic value. More selected radiological parameters applied in specific subgroups of stroke patients may add to the prognostic value of multimodal imaging.