963 results for Probabilistic mean value theorem
Abstract:
Unlike the evaluation of single items of scientific evidence, the formal study and analysis of the joint evaluation of several distinct items of forensic evidence has to date received some punctual, rather than systematic, attention. Questions about (i) the relationships among a set of (usually unobservable) propositions and a set of (observable) items of scientific evidence, (ii) the joint probative value of a collection of distinct items of evidence, as well as (iii) the contribution of each individual item within a given group of pieces of evidence still represent fundamental areas of research. To some degree, this is remarkable, since both forensic science theory and practice, as well as many daily inference tasks, require the consideration of multiple items, if not masses, of evidence. A recurrent and particular complication that arises in such settings is that the application of probability theory, i.e. the reference method for reasoning under uncertainty, becomes increasingly demanding. The present paper takes this as a starting point and discusses graphical probability models, i.e. Bayesian networks, as a framework within which the joint evaluation of scientific evidence can be approached in a viable way. Based on a review of the main existing contributions in this area, the article presents instances of real case studies from the author's institution in order to point out the usefulness and capacities of Bayesian networks for the probabilistic assessment of the probative value of multiple and interrelated items of evidence. A main emphasis is placed on underlying general patterns of inference, their representation, and their graphical probabilistic analysis. Attention is also drawn to inferential interactions, such as redundancy, synergy and directional change, which distinguish the joint evaluation of evidence from assessments of isolated items of evidence. Together, these topics present aspects of interest to both domain experts and recipients of expert information, because they have a bearing on how multiple items of evidence are meaningfully and appropriately set into context.
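As a minimal numerical companion to the inferential patterns discussed above: when two items of evidence can be assumed conditionally independent given the competing propositions, their joint probative value reduces to a product of likelihood ratios, and Bayes' theorem in odds form updates the prior. The figures below are invented for illustration; capturing redundancy or synergy between items is precisely where the Bayesian network machinery described in the abstract becomes necessary.

```python
# Combining two conditionally independent items of evidence via Bayes'
# theorem in odds form. All numbers are hypothetical.

prior_odds = 1 / 1000          # prior odds of the prosecution proposition
lr1, lr2 = 200.0, 50.0         # likelihood ratios of the two items

# Under conditional independence, likelihood ratios multiply
posterior_odds = prior_odds * lr1 * lr2
posterior_prob = posterior_odds / (1.0 + posterior_odds)
print(posterior_prob)
```

Dropping the independence assumption breaks this simple product rule, which is one way to motivate graphical models for interrelated items.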
Abstract:
Background: Visual analog scales (VAS) are used to assess readiness-to-change constructs, which are often considered critical for change. Objective: We studied whether 3 constructs (readiness to change, importance of changing, and confidence in ability to change) predict risk status 6 months later in 20-year-old men with either or both of two behaviors: risky drinking and smoking. Methods: 577 participants in a brief intervention randomized trial were assessed at baseline and 6 months later on alcohol and tobacco consumption and with three 1-10 VAS (readiness, importance, confidence) for each behavior. For each behavior, we used one regression model for each construct. Models controlled for receipt of a brief intervention and used the lowest level (1-4) of each construct as the reference group (vs medium (5-7) and high (8-10) levels). Results: Among the 475 risky drinkers, mean (SD) readiness, importance and confidence to change drinking were 4.0 (3.1), 2.8 (2.2) and 7.2 (3.0). Readiness was not associated with being alcohol-risk free 6 months later (OR 1.3 [0.7; 2.2] and 1.4 [0.8; 2.6] for medium and high readiness). High importance and high confidence were associated with being risk free (OR 0.9 [0.5; 1.8] and 2.9 [1.2; 7.5] for medium and high importance; 2.1 [1.0; 4.8] and 2.8 [1.5; 5.6] for medium and high confidence). Among the 320 smokers, mean readiness, importance and confidence to change smoking were 4.6 (2.6), 5.3 (2.6) and 5.9 (2.6). Neither readiness nor importance was associated with being smoking free (OR 2.1 [0.9; 4.7] and 2.1 [0.8; 5.8] for medium and high readiness; 1.4 [0.6; 3.4] and 2.1 [0.8; 5.4] for medium and high importance). High confidence was associated with being smoking free (OR 2.2 [0.8; 6.6] and 3.4 [1.2; 9.8] for medium and high confidence). Conclusions: For drinking and smoking, high confidence in ability to change was associated, with similar magnitude, with a favorable outcome. This points to the value of confidence as an important predictor of successful change.
Abstract:
The prognostic significance of magnetic resonance imaging (MRI) in the neonatal period was studied prospectively in 43 term infants with perinatal asphyxia. MRI was performed between 1 and 14 days after birth with a high field system (2.35 Tesla). Neurodevelopmental outcome was assessed by a standardized neurological examination and the Griffiths developmental test at a mean age of 18.9 months. The predictive value of the various MRI patterns was as follows: Severe diffuse brain injury (pattern AII+III; n = 7) and lesions of thalamus and basal ganglia (pattern C; n = 5) were strongly associated with poor outcome and greatly reduced head growth. Mild diffuse brain injury (pattern AI; n = 7), parasagittal lesions (B; n = 7), periventricular hyperintensity (D; n = 2), focal brain necrosis and hemorrhage (E; n = 3) and periventricular hypointense stripes (on T2-weighted images; F; n = 3) led in one third of the infants to minor neurological disturbances and mild developmental delay. Infants with normal MRI findings (G; n = 9) developed normally with the exception of one infant who was mildly delayed at 18 months. The results indicate that MRI examination during the first two weeks of life is of prognostic significance in term infants suffering from perinatal asphyxia. Severe hypoxic-ischemic brain lesions were associated highly significantly with poor neuro-developmental outcome, whereas infants with inconspicuous MRI developed normally.
Abstract:
Description: This thesis evaluates the impact of heavy alcohol consumption on cardiovascular risk factors and on the estimated 10-year cardiovascular risk (risk of developing coronary heart disease) in a population with a high mean alcohol consumption. Moderate alcohol consumption has been linked to a lower risk of developing coronary heart disease. However, the data on heavy alcohol consumption and the risk of developing coronary heart disease are conflicting. There are also few studies in which heavy alcohol consumption could be evaluated, because of the small number of subjects with such consumption. Results: We used data from the CoLaus study, a population-based study including adults aged 35 to 75 years from the city of Lausanne. We included 5,769 participants without cardiovascular disease, for whom weekly alcohol consumption was categorized as 0, 1-6, 7-13, 14-20, 21-27, 28-34 and >=35 drinks/week, and as non-drinkers (0 drinks/week), moderate (1-13 drinks/week), heavy (14-34 drinks/week) and very heavy (>=35) drinkers. Blood pressure and lipids were measured, and the 10-year risk of developing coronary heart disease was calculated using the Framingham score. 73% of participants drank alcohol; 16% were heavy drinkers and 2% very heavy drinkers. Multivariate analysis showed an increase with alcohol consumption in HDL cholesterol (from 1.57±0.01 [mean ± standard error] in non-drinkers to 1.88±0.03 mmol/L in very heavy drinkers), triglycerides (1.17±1.01 to 1.32±1.05 mmol/L), and systolic (127.4±0.4 to 132.2±0.4 mm Hg) and diastolic blood pressure (78.7±0.3 to 81.7±0.9 mm Hg; all p-values for trend <0.001). The 10-year risk of developing coronary heart disease increased from 4.31%±0.10 to 4.90%±0.37 (p=0.03) with alcohol consumption, following a J-shaped curve. Looking at the type of drink, wine consumption had a greater effect on the increase in HDL cholesterol, whereas beer or spirits consumption had a greater effect on the increase in triglycerides. Conclusions and perspectives: Our results show that, with respect to the estimated 10-year cardiovascular risk, the protective effect of alcohol consumption disappears at very heavy consumption levels, because the beneficial effect of increased HDL cholesterol is counteracted by the increase in blood pressure. As for the different types of alcohol, further studies are needed to better evaluate their specific effects on cardiovascular risk factors.
Abstract:
The impact of depressed neonatal cerebral oxidative phosphorylation for diagnosing the severity of perinatal asphyxia was estimated by correlating the concentrations of phosphocreatine (PCr) and ATP as determined by magnetic resonance spectroscopy with the degree of hypoxic-ischemic encephalopathy (HIE) in 23 asphyxiated term neonates. Ten healthy age-matched neonates served as controls. In patients, the mean concentrations +/- SD of PCr and ATP were 0.99 +/- 0.46 mmol/L (1.6 +/- 0.2 mmol/L) and 0.99 +/- 0.35 mmol/L (1.7 +/- 0.2 mmol/L), respectively (normal values in parentheses). [PCr] and [ATP] correlated significantly with the severity of HIE (r = 0.85 and 0.9, respectively, p < 0.001), indicating that the neonatal encephalopathy is the clinical manifestation of a marred brain energy metabolism. Neurodevelopmental outcome was evaluated in 21 children at 3, 9, and 18 mo. Seven infants had multiple impairments, five were moderately handicapped, five had only mild symptoms, and four were normal. There was a significant correlation between the cerebral concentrations of PCr or ATP at birth and outcome (r = 0.8, p < 0.001) and between the degree of neonatal neurologic depression and outcome (r = 0.7). More important, the outcome of neonates with moderate HIE could better be predicted with information from quantitative 31P magnetic resonance spectroscopy than from neurologic examinations. In general, the accuracy of outcome predictability could significantly be increased by adding results from 31P magnetic resonance spectroscopy to the neonatal neurologic score, but not vice versa. No correlation with outcome was found for other perinatal risk factors, including Apgar score.
Abstract:
In crop rotations that include alfalfa (Medicago sativa L.), agronomic and environmental concerns mean that it is important to determine the N fertilizer contribution of this legume for subsequent crops in order to help increase the sustainability of cropping systems. To determine the N fertilizer replacement value (FRV) of a 2-yr alfalfa crop on subsequent crops of corn (Zea mays L.) followed by wheat (Triticum aestivum L.) under irrigated Mediterranean conditions, two 4-yr rotations (alfalfa-corn-wheat and corn-corn-corn-wheat) were conducted from 2001 to 2004 in a Typic Xerofluvent soil. Corn yields were compared after two years of alfalfa and a third year of corn under monoculture, and wheat yields were also compared after both rotations. Corn production after alfalfa outyielded monoculture corn at all four rates of N fertilizer application analyzed (0, 100, 200 and 300 kg N/ha). The FRV of 2-yr alfalfa for corn was about 160 kg N/ha. Wheat grown after the alfalfa-corn rotation outyielded that grown after corn under monoculture at both rates of N studied (0 and 100 kg N/ha). The FRV of alfalfa for wheat following alfalfa-corn was about 76 kg N/ha. Soil NO3-N content after alfalfa was greater than with the corn monoculture at all rates of N fertilizer application, and this higher value persisted during the second crop after alfalfa. This was probably one of the reasons for the better yields associated with the alfalfa rotation. These results make a valuable contribution to irrigated agriculture under Mediterranean conditions, show reasons for interest in rotating alfalfa with corn, and explain how it is possible to make savings when applying N fertilizer.
Abstract:
Probabilistic inversion methods based on Markov chain Monte Carlo (MCMC) simulation are well suited to quantify parameter and model uncertainty of nonlinear inverse problems. Yet, application of such methods to CPU-intensive forward models can be a daunting task, particularly if the parameter space is high dimensional. Here, we present a 2-D pixel-based MCMC inversion of plane-wave electromagnetic (EM) data. Using synthetic data, we investigate how model parameter uncertainty depends on model structure constraints using different norms of the likelihood function and the model constraints, and study the added benefits of joint inversion of EM and electrical resistivity tomography (ERT) data. Our results demonstrate that model structure constraints are necessary to stabilize the MCMC inversion results of a highly discretized model. These constraints decrease model parameter uncertainty and facilitate model interpretation. A drawback is that these constraints may lead to posterior distributions that do not fully include the true underlying model, because some of its features exhibit a low sensitivity to the EM data, and hence are difficult to resolve. This problem can be partly mitigated if the plane-wave EM data is augmented with ERT observations. The hierarchical Bayesian inverse formulation introduced and used herein is able to successfully recover the probabilistic properties of the measurement data errors and a model regularization weight. Application of the proposed inversion methodology to field data from an aquifer demonstrates that the posterior mean model realization is very similar to that derived from a deterministic inversion with similar model constraints.
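For readers unfamiliar with the sampling machinery, the following sketch shows a random-walk Metropolis MCMC on a deliberately tiny 1-D synthetic inverse problem; it is not the authors' 2-D pixel-based EM inversion, and the forward model, noise level and tuning constants are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(m):
    # Toy nonlinear forward operator standing in for a CPU-intensive model
    return np.array([m, m ** 2])

# Synthetic "observed" data: d = G(m_true) + Gaussian noise
m_true, sigma = 1.5, 0.1
d_obs = forward(m_true) + rng.normal(0.0, sigma, size=2)

def log_likelihood(m):
    r = d_obs - forward(m)
    return -0.5 * np.sum((r / sigma) ** 2)

# Random-walk Metropolis sampler
n_steps, step = 20_000, 0.2
chain = np.empty(n_steps)
m, ll = 0.0, log_likelihood(0.0)
for i in range(n_steps):
    m_prop = m + step * rng.normal()
    ll_prop = log_likelihood(m_prop)
    if np.log(rng.random()) < ll_prop - ll:   # Metropolis accept/reject
        m, ll = m_prop, ll_prop
    chain[i] = m

posterior = chain[5_000:]   # discard burn-in
print(posterior.mean(), posterior.std())
```

In a realistic inversion the parameter vector is high dimensional and each forward solve is expensive, which is exactly why the model structure constraints discussed in the abstract matter.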
Abstract:
Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular frequentist and Bayesian, have promoted radically different solutions for taking a decision about the plausibility of competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel large debates in the literature. More recently, controversial discussion was initiated by an editorial decision of a scientific journal [1] to refuse any paper submitted for publication containing null hypothesis testing procedures. Since the large majority of papers published in forensic journals propose the evaluation of statistical evidence based on so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.
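To make the contrast concrete, the toy computation below puts a frequentist p-value and a Bayesian posterior probability side by side for the same data (60 successes in 100 trials); the flat Beta(1, 1) prior and the numbers themselves are illustrative choices, not anything from the paper.

```python
from scipy import stats

heads, n = 60, 100

# Frequentist: two-sided exact binomial test of H0: p = 0.5
p_value = stats.binomtest(heads, n, p=0.5).pvalue

# Bayesian: Beta(1, 1) prior -> Beta(1 + heads, 1 + tails) posterior on p
posterior = stats.beta(1 + heads, 1 + n - heads)
prob_gt_half = 1.0 - posterior.cdf(0.5)

print(p_value, prob_gt_half)
```

The two numbers answer different questions (probability of data at least this extreme under the null, versus posterior probability that p exceeds 0.5), which is one root of the debate sketched above.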
Abstract:
In a cohort study of 182 consecutive patients with active endogenous Cushing's syndrome, the only predictor of fracture occurrence after adjustment for age, gender, bone mineral density (BMD) and trabecular bone score (TBS) was the 24-h urinary free cortisol (24hUFC) level, with a threshold of 1472 nmol/24 h (odds ratio, 3.00 (95 % confidence interval (CI), 1.52-5.92); p = 0.002). INTRODUCTION: The aim was to estimate the risk factors for fracture in subjects with endogenous Cushing's syndrome (CS) and to evaluate the value of the TBS in these patients. METHODS: All enrolled patients with CS (n = 182) were interviewed in relation to low-traumatic fractures and underwent lateral X-ray imaging from T4 to L5. BMD measurements were performed using a DXA Prodigy device (GEHC Lunar, Madison, Wisconsin, USA). The TBS was derived retrospectively from existing BMD scans, blinded to clinical outcome, using TBS iNsight software v2.1 (Medimaps, Merignac, France). Urinary free cortisol (24hUFC) was measured by immunochemiluminescence assay (reference range, 60-413 nmol/24 h). RESULTS: Among the enrolled patients with CS (149 females; 33 males; mean age, 37.8 years (95 % confidence interval, 34.2-39.1); 24hUFC, 2370 nmol/24 h (2087-2632)), fractures were confirmed in 81 (44.5 %) patients, with 70 suffering from vertebral fractures, which were multiple in 53 cases; 24 patients reported non-vertebral fractures. The mean spine TBS was 1.207 (1.187-1.228), and the TBS Z-score was -1.86 (-2.07 to -1.65); the area under the curve (AUC) for the mean spine TBS to predict fracture was 0.548 (95 % CI, 0.454-0.641). In the final regression model, the only predictor of fracture occurrence was the 24hUFC level (p = 0.001), with an odds ratio of 1.041 (95 % CI, 1.019-1.063) for every 100 nmol/24-h cortisol elevation (AUC (24hUFC) = 0.705 (95 % CI, 0.629-0.782)). CONCLUSIONS: Young patients with CS have a low TBS.
However, the only predictor of low traumatic fracture is the severity of the disease itself, indicated by high 24hUFC levels.
Abstract:
Purpose: We aimed to determine the impact of SPECT/CT performed in addition to whole-body scintigraphy augmented with prone lateral views in patients with well-differentiated thyroid carcinoma. Methods and Materials: This retrospective study included 141 patients (87 female, 54 male, mean age 47 years) with well-differentiated thyroid carcinoma (105 papillary, 31 follicular, 1 Hürthle cell and 4 poorly differentiated) treated with radioiodine therapy (1000-7400 MBq). Patients were referred for either first postsurgical therapy (n=76) or further treatment (n=65). Two nuclear medicine physicians interpreted the scans in consensus (first whole-body scintigraphy with prone lateral views, then SPECT/CT), reporting abnormal iodine uptake in the thyroid bed, lymph nodes and distant metastases. The corresponding ATA risk score was calculated for each patient before and after SPECT/CT, as well as the change in disease extension. Results: The analysis showed a difference between scintigraphy and SPECT/CT for 17 lesions in 14 patients (9.9%): 12 were described as suspicious on scintigraphy and could be considered benign on SPECT/CT (3 corresponded to local iodine uptake, 6 to lymph node metastases and 3 to distant metastases). The other 5 corresponded to metastases (4 lymph node and 1 distant) that were not seen on whole-body scintigraphy augmented with prone lateral views. In 10 of 141 (7.1%) patients, we observed a change in ATA risk stratification, with a risk increase in 4 of them (2.8%). Conclusion: SPECT/CT allowed the detection of 5 focal lesions missed on planar scintigraphy and established the benign nature of 12 lesions that were suspicious on planar scintigraphy. Moreover, SPECT/CT improved risk stratification in 10 patients, with a significant change in patient management.
Abstract:
We propose a new kernel estimation of the cumulative distribution function based on transformation and on bias-reducing techniques. We derive the optimal bandwidth that minimises the asymptotic integrated mean squared error. The simulation results show that our proposed kernel estimation improves on alternative approaches when the variable has an extreme value distribution with a heavy tail and the sample size is small.
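For orientation, here is the classical smoothed empirical CDF that such proposals build on: F_h(x) is the sample average of the integrated Gaussian kernel Phi((x - X_i)/h). This is the baseline construction, not the transformation-based, bias-reduced estimator of the abstract, and the rule-of-thumb bandwidth is an illustrative stand-in for the optimal one derived there.

```python
import numpy as np
from scipy.stats import norm

def kernel_cdf(x, sample, h):
    # Average the integrated Gaussian kernel over all observations
    z = (np.asarray(x)[None, :] - sample[:, None]) / h
    return norm.cdf(z).mean(axis=0)

rng = np.random.default_rng(1)
sample = rng.normal(0.0, 1.0, size=200)
h = 1.06 * sample.std() * len(sample) ** (-1 / 5)   # rule-of-thumb bandwidth

grid = np.array([-1.0, 0.0, 1.0])
est = kernel_cdf(grid, sample, h)
print(est)   # smooth, monotone estimates of F at the grid points
```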
Abstract:
Based on a polygenic system of a diploid species, without epistasis, and a population in Hardy-Weinberg equilibrium, without inbreeding and under linkage equilibrium, it can be shown that: (1) the narrow sense heritability at the half-sib family level is equal to the square of the correlation coefficient between the family mean and the additive genetic value of its common parent; (2) the narrow sense heritability at the full-sib family level is equal to the square of the correlation coefficient between the family mean and the mean of the additive genetic values of its parents; (3) the narrow sense heritability at the Sn family level is exactly equal to the square of the correlation coefficient between the family mean and the additive genetic value of its parent only in the absence of dominance or when allele frequencies are equal; and (4) the broad sense heritability at the full-sib or Sn family level can be used to analyze selection efficiency, since the progeny genotypic mean is, in general, a good indicator of the superiority of the parents, or of the Sn-1 plants, with respect to the frequency of favorable genes.
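Result (1) can be checked numerically. The sketch below simulates half-sib families under the stated assumptions (no epistasis, no inbreeding, random mating) and compares the squared correlation between family means and sire additive values with the family-level heritability; the variance components and family size are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(3)
n_fam, k = 5_000, 20          # number of families, offspring per family
var_a, var_e = 1.0, 1.0       # additive and environmental variances

# Sire additive genetic values
a_sire = rng.normal(0.0, np.sqrt(var_a), n_fam)

# Offspring phenotype: half the sire's additive value, plus the dam and
# Mendelian-sampling remainder (variance 3/4 * var_a), plus environment
resid = rng.normal(0.0, np.sqrt(0.75 * var_a + var_e), (n_fam, k))
fam_mean = a_sire / 2.0 + resid.mean(axis=1)

r = np.corrcoef(fam_mean, a_sire)[0, 1]

# Theoretical family-level heritability: additive variance among family
# means over the total variance of family means
h2_fam = (var_a / 4.0) / (var_a / 4.0 + (0.75 * var_a + var_e) / k)
print(r ** 2, h2_fam)   # the two quantities should nearly coincide
```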
Abstract:
The objective of the dissertation is to examine organizational responses of public actors to customer requirements which drive the transformation of value networks and promote public-private partnership in the electricity distribution industry and the elderly care sector. The research bridges the concept of the offering to value networks in which capabilities can be acquired for novel product concepts. The research contributes to recent literature by re-examining theories on the interaction of customer requirements and supply management. A critical realist case study approach is applied to this abductive research, which aims to describe causalities in the analyzed phenomena. The presented evidence is based on three sources: in-depth interviews, archival analysis and the Delphi method. Service provision requires awareness of the technology and functionalities of the offering. Moreover, service provision involves interactions among multiple partners, which suggests the importance of a co-operative orientation among actors. According to the findings, portfolio management has a key role when intelligent solutions are implemented in public service provision, because its concepts involve a variety of resources from multiple suppliers. However, emergent networks are not functional if they lack leaders who have access to the customer interface, the power to steer networks and the capability to build offerings. Public procurement policies were recognized to focus on a narrow scope, in which price is a key factor in decisions. In the future, the public sector has to implement technology strategies and portfolio management, which means long-term platform development and commitment to partnerships. On the other hand, service providers should also be more aware of the offerings into which their products will be integrated in the future. This requires incorporating the customer's voice into product development and co-operation in order to increase the interconnectivity of products.
Abstract:
This study aimed to describe the probabilistic structure of the annual series of extreme daily rainfall (Preabs), available from the weather station of Ubatuba, State of São Paulo, Brazil (1935-2009), by using the generalized extreme value (GEV) distribution. The autocorrelation function, the Mann-Kendall test, and wavelet analysis were used in order to evaluate the presence of serial correlation, trends, and periodic components. Considering the results obtained with these three statistical methods, it was possible to adopt the hypothesis that this temporal series is free from persistence, trends, and periodic components. Based on quantitative and qualitative goodness-of-fit tests, it was found that the GEV may be used to quantify the probabilities of the Preabs data. The best results for the GEV were obtained when the parameters of this distribution were estimated using the method of maximum likelihood. The method of L-moments also showed satisfactory results.
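A maximum-likelihood GEV fit of the kind used in the study can be sketched with scipy; the simulated annual maxima below merely stand in for the Ubatuba series, and all parameter values are invented for illustration (note that scipy's shape parameter c is the negative of the usual xi).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated annual rainfall maxima (stand-in for an observed series)
annual_max = stats.genextreme.rvs(c=-0.1, loc=80.0, scale=20.0,
                                  size=75, random_state=rng)

# Maximum-likelihood estimates of the GEV parameters
c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_max)

# 100-year return level: the quantile exceeded with probability 1/100 a year
rl_100 = stats.genextreme.ppf(1 - 1 / 100, c_hat, loc_hat, scale_hat)
print(loc_hat, scale_hat, rl_100)
```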
Abstract:
The application of Extreme Value Theory (EVT) to model the probability of occurrence of extremely low Standardized Precipitation Index (SPI) values increases our knowledge of the occurrence of extremely dry months. This sort of analysis can be carried out by means of two approaches: block maxima (BM; associated with the Generalized Extreme Value distribution) and peaks-over-threshold (POT; associated with the Generalized Pareto distribution). Each of these procedures has its own advantages and drawbacks. Thus, the main goal of this study is to compare the performance of BM and POT in characterizing the probability of occurrence of extremely dry SPI values obtained from the weather station of Ribeirão Preto-SP (1937-2012). According to the goodness-of-fit tests, both BM and POT can be used to assess the probability of occurrence of the aforementioned extremely dry monthly SPI values. However, the scalar measures of accuracy and the return level plots indicate that POT provides the best-fitting distribution. The study also indicates that the uncertainties in the parameter estimates of a probabilistic model should be taken into account when the probability associated with a severe/extreme dry event is under analysis.
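The POT side of the comparison can be sketched as follows: exceedances over a high threshold are fitted with a Generalized Pareto distribution (GPD). The simulated series below stands in for the Ribeirão Preto SPI data, and the 90th-percentile threshold is an illustrative choice (threshold selection is itself a known drawback of POT).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulated series (stand-in for observed data)
series = rng.exponential(scale=1.0, size=2_000)

u = np.quantile(series, 0.90)        # high threshold
excess = series[series > u] - u      # peaks over the threshold

# Fit the GPD to the excesses, with the location fixed at zero
xi_hat, _, beta_hat = stats.genpareto.fit(excess, floc=0.0)
print(xi_hat, beta_hat)   # shape and scale estimates
```

For exponential data the excesses are again exponential, so the fitted shape should be near zero, which is a quick sanity check on the procedure.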