882 results for VaR Estimation methods, Statistical Methods, Risk management, Investments
Abstract:
Background. Infant colic is a common condition that is thought to put infants at risk for Shaken Baby Syndrome, a particularly devastating form of child abuse. However, little research has been done on techniques parents can use to deal with infant colic. This pilot study was conducted to assess the equipment that will be used in a randomized controlled trial comparing two different techniques that parents can use to reduce crying in infants with colic. Methods. A total of 11 healthy infants, between one and five months of age, were recruited into this pilot study. All infants had a dosimeter, actiwatch and maternal log placed into the home, and a subset of infants (N=3) were also recorded by a video camera. The equipment recorded between 6 pm and 6 am for at least two and up to five nights. The maternal log and video log were compared with one another to determine whether the maternal log provides an accurate representation of the infant's night-time activities (i.e., sleep, awake, crying, feeding). The maternal log was then compared to the dosimeter and actiwatch data to determine whether the dosimeter and actiwatch accurately reproduce the maternal log. Results. Data from 10 infants were included in the analyses. The maternal log and video log were in full or partial agreement 90% of the time. When comparing events noted by the mother, the maternal log and dosimeter data were in agreement 84% of the time, and the maternal log and actiwatch data were in agreement 87% of the time. In combination, the dosimeter and/or actiwatch data agreed with the maternal log 90% of the time. Conclusions. Our preliminary analyses of these data suggest the dosimeter and actiwatch will be useful tools for defining infant sleep patterns relative to the maternal log. However, further analysis will be required to develop threshold values that can be used to objectively define events in the proposed RCT. Such analyses will need to integrate data from multiple dosimeters and deal with the shifting baselines observed for both the dosimeter and actiwatch.
Abstract:
Background: Helicobacter pylori infection is more prevalent among Native Americans than in any other minority group in the United States. Few studies involving Helicobacter pylori have been conducted on Native Americans, and no previous studies have been conducted in the Ysleta del Sur Pueblo population. We therefore wanted to explore the prevalence and risk factors of Helicobacter pylori within this community. We also explored whether household transmission is occurring. Materials and Methods: We conducted a cross-sectional study on the prevalence of Helicobacter pylori in the Ysleta del Sur Pueblo community. Main household caregivers were interviewed on household conditions, hygiene practices, and household sociodemographics. All household members were tested for IgG urine antibodies against Helicobacter pylori using RAPIRUN test kits. 13C urea breath testing using BREATHTEK kits was provided to study participants with positive antibody results and used to confirm infection. Results: The prevalence of Helicobacter pylori in the Ysleta del Sur Pueblo was determined to be 27.4%. When comparing by ethnicity, Native Americans had a higher prevalence of infection than Mexican-Americans living on the Pueblo. That prevalence ratio increased from 1.6 to 3.3 when taking into account only United States-born study participants. The household secondary prevalence rate was found to be 23.8%. Helicobacter pylori infection rates increased with increasing age and decreasing income. Conclusions: Native Americans had an increased risk of infection. As expected, risk factors for Helicobacter pylori were consistent with previous studies, but we found evidence of limited current transmission within households. However, due to the limited sample size (n=62) and power, we were not able to find statistical significance for some risk factors. A statistical association was found with age, where increasing prevalence corresponded with increasing age, suggesting that a birth cohort effect may be present within this population.
Abstract:
Mixed longitudinal designs are important study designs for many areas of medical research. Mixed longitudinal studies have several advantages over cross-sectional or pure longitudinal studies, including shorter study completion time and the ability to separate time and age effects, and are thus an attractive choice. Statistical methodology used in general longitudinal studies has been developing rapidly within the last few decades. A common approach for statistical modeling in studies with mixed longitudinal designs has been the linear mixed-effects model incorporating an age or time effect. The general linear mixed-effects model is considered an appropriate choice for analyzing repeated measurements data in longitudinal studies. However, common use of the linear mixed-effects model in mixed longitudinal studies often incorporates age as the only random effect and fails to take the cohort effect into consideration when conducting statistical inferences on age-related trajectories of outcome measurements. We believe special attention should be paid to cohort effects when analyzing data from mixed longitudinal designs with multiple overlapping cohorts; this has therefore become an important statistical issue to address. This research aims to address statistical issues related to mixed longitudinal studies. The proposed study examined the existing statistical analysis methods for mixed longitudinal designs and developed an alternative analytic method that incorporates effects from multiple overlapping cohorts as well as from subjects of different ages. The proposed study used simulation to evaluate the performance of the proposed analytic method by comparing it with the commonly used model. Finally, the study applied the proposed analytic method to the data collected by an existing study, Project HeartBeat!, which had been evaluated using traditional analytic techniques. Project HeartBeat! is a longitudinal study of cardiovascular disease (CVD) risk factors in childhood and adolescence using a mixed longitudinal design. The proposed model was used to evaluate four blood lipids adjusting for age, gender, race/ethnicity, and endocrine hormones. The results of this dissertation suggest the proposed analytic model could be a more flexible and reliable choice than the traditional model in terms of fitting the data to provide more accurate estimates in mixed longitudinal studies. Conceptually, the proposed model described in this study has useful features, including consideration of effects from multiple overlapping cohorts, and is an attractive approach for analyzing data from mixed longitudinal design studies.
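As a hedged sketch of the kind of model this abstract argues for, one plausible formulation adds a random effect for the overlapping cohort alongside the usual subject-level random effects on the age trajectory; the notation below (outcome y, covariates x, cohort effect u, subject effects b) is illustrative and not taken from the dissertation itself.

```latex
% One plausible formulation (illustrative notation, not the dissertation's exact model):
% y_{cij}: outcome for subject i in cohort c at occasion j; x_{cij}: other covariates
y_{cij} = \beta_0 + \beta_1\,\mathrm{age}_{cij} + \mathbf{x}_{cij}^{\top}\boldsymbol{\gamma}
        + u_c + b_{0,ci} + b_{1,ci}\,\mathrm{age}_{cij} + \varepsilon_{cij},
\qquad
u_c \sim N(0,\sigma_u^2), \quad
(b_{0,ci}, b_{1,ci})^{\top} \sim N(\mathbf{0}, \Sigma_b), \quad
\varepsilon_{cij} \sim N(0,\sigma^2).
```

Dropping the cohort term u_c recovers the commonly used age-only specification the abstract criticizes; comparing the two fits is one way to gauge whether the cohort effect matters for a given data set.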
Abstract:
Michelle Rhee weighs in on measuring success in public education.
New methods for quantification and analysis of quantitative real-time polymerase chain reaction data
Abstract:
Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantitation method that has been widely used in the biological and biomedical fields. The currently used methods for qPCR data analysis, including the threshold cycle (CT) method and linear and non-linear model fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence is usually inaccurate and can therefore distort results. Here, we propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for every two consecutive PCR cycles, we subtracted the fluorescence in the earlier cycle from that in the later cycle, transforming the n-cycle raw data into n-1 cycle data. Linear regression was then applied to the natural logarithm of the transformed data. Finally, amplification efficiencies and the initial DNA molecule numbers were calculated for each PCR run. To evaluate this new method, we compared it in terms of accuracy and precision with the original linear regression method under three background corrections: the mean of cycles 1-3, the mean of cycles 3-7, and the minimum. Three criteria, namely threshold identification, max R2, and max slope, were employed to search for target data points. Considering that PCR data are time series data, we also applied linear mixed models. Collectively, when the threshold identification criterion was applied and when the linear mixed model was adopted, the taking-difference linear regression method was superior, as it gave an accurate estimation of the initial DNA amount and a reasonable estimation of PCR amplification efficiencies. When the criteria of max R2 and max slope were used, the original linear regression method gave an accurate estimation of the initial DNA amount. Overall, the taking-difference linear regression method avoids the error introduced by subtracting an unknown background and is thus theoretically more accurate and reliable. This method is easy to perform, and the taking-difference strategy can be extended to all current methods for qPCR data analysis.
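The taking-difference procedure can be written down compactly. A minimal sketch follows, assuming the usual single-exponential amplification model F_n = F_bg + F0·E^n, so that consecutive differences cancel the unknown background; function and variable names are illustrative, not from the paper.

```python
import numpy as np

def taking_difference_qpcr(fluorescence, cycles=None):
    """Estimate amplification efficiency E and initial amount F0 from raw qPCR
    fluorescence without background subtraction (taking-difference idea).

    Assumed model: F_n = F_bg + F0 * E**n, so the cycle-to-cycle difference
    D_n = F_{n+1} - F_n = F0 * (E - 1) * E**n no longer contains F_bg.
    """
    f = np.asarray(fluorescence, dtype=float)
    n = np.arange(len(f)) if cycles is None else np.asarray(cycles, dtype=float)

    d = np.diff(f)                      # n-1 differences; background cancels
    keep = d > 0                        # the log requires positive differences
    x, y = n[:-1][keep], np.log(d[keep])

    slope, intercept = np.polyfit(x, y, 1)    # ln D_n = ln(F0*(E-1)) + n*ln E
    eff = np.exp(slope)                       # amplification efficiency E
    f0 = np.exp(intercept) / (eff - 1.0)      # initial fluorescence ~ initial DNA
    return eff, f0

# Toy check: simulated run with background 0.5, F0 = 1e-3, E = 1.9
cycles = np.arange(40)
raw = 0.5 + 1e-3 * 1.9 ** cycles
print(taking_difference_qpcr(raw))            # ~ (1.9, 1e-3)
```

In real data the regression would be restricted to an exponential-phase window, which is where the threshold identification, max R2 and max slope criteria discussed in the abstract come in.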
Abstract:
Background: Surgical site infections (SSIs) after abdominal surgeries account for approximately 26% of all reported SSIs. The Centers for Disease Control and Prevention (CDC) defines 3 types of SSIs: superficial incisional, deep incisional, and organ/space. Preventing SSIs has become a national focus. This dissertation assesses several associations with the individual types of SSI in patients who have undergone colon surgery. Methods: Data for this dissertation were obtained from the American College of Surgeons' National Surgical Quality Improvement Program (NSQIP); major colon surgeries performed between 2007 and 2009 were identified in the database. NSQIP data include more than 50 preoperative and 30 intraoperative factors; 40 collected postoperative occurrences are based on a follow-up period of 30 days from surgery. Initially, four individual logistic regressions were modeled to compare the associations between risk factors and each of the SSI groups: superficial, deep, organ/space, and a composite of any single SSI. A second analysis used polytomous regression to simultaneously assess the associations between risk factors and the different types of SSIs, as well as to formally test the different effect estimates of 13 common risk factors for SSIs. The final analysis explored the association between venous thromboembolism (VTE) and the different types of SSIs and risk factors. Results: A total of 59,365 colon surgeries were included in the study. Overall, 13% of colon cases developed a single type of SSI; 8% of these were superficial SSIs, 1.4% were deep SSIs, and 3.8% were organ/space SSIs. The first article identifies the unique set of risk factors associated with each of the 4 SSI models. Distinct risk factors for superficial SSIs included factors such as alcohol use, chronic obstructive pulmonary disease, dyspnea, and diabetes. Organ/space SSIs were uniquely associated with disseminated cancer, preoperative dialysis, preoperative radiation treatment, bleeding disorder, and prior surgery. Risk factors that were significant in all models had different effect estimates. The second article assesses 13 common SSI risk factors simultaneously across the 3 different types of SSIs using polytomous regression. Each risk factor was then formally tested for effect heterogeneity. If the test was significant, the final model allowed the effect estimates for that risk factor to vary across each type of SSI; if the test was not significant, the effect estimate remained constant across the types of SSIs, using the aggregate SSI value. The third article explored the relationship between venous thromboembolism (VTE) and the individual types of SSIs and risk factors. The overall incidence of VTEs after the 59,365 colon cases was 2.4%. All 3 types of SSIs and several risk factors were independently associated with the development of VTEs. Conclusions: Risk factors associated with each type of SSI were different in patients who have undergone colon surgery. Each model had a unique cluster of risk factors. Several risk factors, including increased BMI, duration of surgery, wound class, and laparoscopic approach, were significant across all 4 models, but no statistical inferences could be made about their different effect estimates. These results suggest that aggregating SSIs may misattribute and hide true associations with risk factors.
By using polytomous regression to assess multiple risk factors against the multiple types of SSI, this study was able to identify several risk factors that had significant effect heterogeneity across the 3 types of SSI, challenging the use of aggregate SSI outcomes. The third article highlights the strong association between VTEs and the 3 types of SSIs. Clinicians understand the difference between superficial, deep, and organ/space SSIs. Our results indicate that they should be considered individually in future studies.
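A minimal sketch of the polytomous (multinomial) regression setup described above, using statsmodels; the data frame, column names and the four-level outcome coding (0 = no SSI, 1 = superficial, 2 = deep, 3 = organ/space) are illustrative assumptions, not the NSQIP variable names.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000

# Illustrative risk factors (stand-ins for NSQIP covariates)
df = pd.DataFrame({
    "bmi": rng.normal(28, 5, n),
    "op_minutes": rng.normal(150, 40, n),
    "laparoscopic": rng.integers(0, 2, n),
})
# Four-level outcome: 0 = none, 1 = superficial, 2 = deep, 3 = organ/space
df["ssi_type"] = rng.choice([0, 1, 2, 3], size=n, p=[0.87, 0.08, 0.014, 0.036])

X = sm.add_constant(df[["bmi", "op_minutes", "laparoscopic"]])
fit = sm.MNLogit(df["ssi_type"], X).fit(disp=False)

# One set of coefficients (log odds vs. "no SSI") per SSI type, so each risk
# factor is allowed a different effect for superficial, deep and organ/space SSI.
print(fit.params)
```

The formal test for effect heterogeneity described in the abstract then corresponds to testing whether a factor's coefficients are equal across the three SSI equations (for example with a Wald test on the fitted model).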
Abstract:
Maritime accidents involving ships carrying passengers may pose a high risk with respect to human casualties. For effective risk mitigation, insight into the process of risk escalation is needed. This requires a proactive approach to risk modelling for maritime transportation systems. Most of the existing models are based on historical data on maritime accidents, and thus they can be considered reactive rather than proactive. This paper introduces a systematic, transferable and proactive framework for estimating the risk for maritime transportation systems, meeting the requirements stemming from the adopted formal definition of risk. The framework focuses on ship-ship collisions in the open sea, with a RoRo/Passenger ship (RoPax) considered as the struck ship. First, it covers the identification of the events that follow a collision between two ships in the open sea, and, second, it evaluates the probabilities of these events, concluding by determining the severity of a collision. The risk framework is developed with the use of Bayesian Belief Networks and utilizes a set of analytical methods for the estimation of the risk model parameters. The model can be run with the use of the GeNIe software package. Finally, a case study is presented in which the risk framework developed here is applied to a maritime transportation system operating in the Gulf of Finland (GoF). The results obtained are compared to the historical data and available models in which a RoPax was involved in a collision, and good agreement with the available records is found.
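A heavily simplified sketch of the kind of Bayesian Belief Network such a risk framework is built on, written with pgmpy rather than GeNIe; the three-node structure and all conditional probabilities are invented for illustration and are not the paper's actual model or parameters.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Illustrative chain: high-energy collision -> hull breach -> severe consequence
model = BayesianNetwork([("HighEnergyCollision", "HullBreach"),
                         ("HullBreach", "SevereConsequence")])

cpd_col = TabularCPD("HighEnergyCollision", 2, [[0.7], [0.3]])      # P(no), P(yes)
cpd_breach = TabularCPD("HullBreach", 2,
                        [[0.95, 0.40],     # P(no breach | collision = no, yes)
                         [0.05, 0.60]],    # P(breach    | collision = no, yes)
                        evidence=["HighEnergyCollision"], evidence_card=[2])
cpd_sev = TabularCPD("SevereConsequence", 2,
                     [[0.99, 0.55],
                      [0.01, 0.45]],
                     evidence=["HullBreach"], evidence_card=[2])
model.add_cpds(cpd_col, cpd_breach, cpd_sev)
assert model.check_model()

# Probability of a severe consequence given that a high-energy collision occurred
infer = VariableElimination(model)
print(infer.query(["SevereConsequence"], evidence={"HighEnergyCollision": 1}))
```

In the paper's framework the nodes would instead represent the post-collision event chain (breach location, flooding, stability loss, evacuation, and so on) with parameters obtained from the analytical methods mentioned above.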
Abstract:
Although input–output (IO) tables form a central part of the System of National Accounts, each country's national IO table exhibits somewhat different features and characteristics, reflecting the country's socioeconomic idiosyncrasies. Consequently, the compilers of a multi-regional input–output table (MRIOT) are advised to thoroughly examine the conceptual as well as methodological differences among countries in the estimation of basic statistics for national IO tables and, if necessary, to carry out pre-adjustment of these tables into a common format prior to the MRIOT compilation. The objective of this study is to provide a practical guide for harmonizing national IO tables to construct a consistent MRIOT, referring to the adjustment practices used by the Institute of Developing Economies, JETRO (IDE-JETRO) in compiling the Asian International Input–Output Table.
Abstract:
An image processing observational technique for the stereoscopic reconstruction of the wave form of oceanic sea states is developed. The technique incorporates the enforcement of any given statistical wave law modeling the quasi-Gaussianity of oceanic waves observed in nature. The problem is posed in a variational optimization framework, where the desired wave form is obtained as the minimizer of a cost functional that combines image observations, smoothness priors and a weak statistical constraint. The minimizer is obtained by combining gradient descent and multigrid methods on the necessary optimality equations of the cost functional. Robust photometric error criteria and a spatial intensity compensation model are also developed to improve the performance of the presented image matching strategy. The weak statistical constraint is thoroughly evaluated in combination with the other elements presented in order to reconstruct and enforce constraints on experimental stereo data, demonstrating the improvement in the estimation of the observed ocean surface.
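As a sketch, the variational formulation described above can be written as a cost functional of the generic form below, with the surface elevation η as the unknown; the symbols and weights are illustrative, not the paper's exact notation.

```latex
% Generic form of the cost functional (illustrative notation):
%   data term (robust photometric error between the two views),
%   smoothness prior on the surface, weak statistical constraint on the wave law
E(\eta) \;=\; \int_{\Omega} \rho\!\big(I_1(\mathbf{x}) - I_2(\mathbf{w}(\mathbf{x};\eta))\big)\,d\mathbf{x}
\;+\; \alpha \int_{\Omega} \lVert \nabla \eta \rVert^{2}\, d\mathbf{x}
\;+\; \beta\, D\!\big(p_{\eta} \,\Vert\, p_{\mathrm{model}}\big).
```

Here ρ is a robust photometric error, w(x; η) the correspondence between views induced by the surface, and D a discrepancy between the empirical distribution of the reconstructed elevations and the prescribed quasi-Gaussian wave law; as stated in the abstract, the minimizer is found from the optimality (Euler–Lagrange) equations by gradient descent accelerated with multigrid.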
Abstract:
All meta-analyses should include a heterogeneity analysis. Even so, it is not easy to decide whether a set of studies is homogeneous or heterogeneous because of the low statistical power of the statistics used (usually the Q test). Objective: Determine a set of rules enabling SE researchers to find out, based on the characteristics of the experiments to be aggregated, whether or not it is feasible to accurately detect heterogeneity. Method: Evaluate the statistical power of heterogeneity detection methods using a Monte Carlo simulation process. Results: The Q test is not powerful when the meta-analysis contains up to a total of about 200 experimental subjects and the effect size difference is less than 1. Conclusions: The Q test cannot be used as a decision-making criterion for meta-analysis in small-sample settings like SE. Random effects models should be used instead of fixed effects models. Caution should be exercised when applying Q test-mediated decomposition into subgroups.
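A minimal Monte Carlo sketch of the kind of power evaluation described above: simulate meta-analyses of k standardized-mean-difference experiments whose true effects differ between two subgroups, compute Cochran's Q, and record how often heterogeneity is detected at α = 0.05. The sample sizes, effect sizes and variance approximation are illustrative assumptions, not the study's exact simulation design.

```python
import numpy as np
from scipy import stats

def q_test_power(k=4, n_per_group=25, delta=1.0, n_sim=5000, alpha=0.05, seed=0):
    """Estimated power of Cochran's Q when half of the k experiments have true
    effect 0 and the other half have true effect `delta`."""
    rng = np.random.default_rng(seed)
    true_d = np.where(np.arange(k) < k // 2, 0.0, delta)
    rejections = 0
    for _ in range(n_sim):
        # Large-sample variance of a standardized mean difference, equal group sizes
        v = 2.0 / n_per_group + true_d**2 / (4.0 * n_per_group)
        d_hat = rng.normal(true_d, np.sqrt(v))      # simulated study effect estimates
        w = 1.0 / v
        d_bar = np.sum(w * d_hat) / np.sum(w)       # fixed-effect pooled estimate
        q = np.sum(w * (d_hat - d_bar) ** 2)        # Cochran's Q statistic
        rejections += q > stats.chi2.ppf(1 - alpha, df=k - 1)
    return rejections / n_sim

# ~200 subjects in total (4 experiments x 2 groups x 25) and an effect difference of 1
print(q_test_power())
```

Sweeping delta, k and n_per_group over a grid is what produces the kind of decision rules the abstract refers to.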
Abstract:
Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines to indicate which method is best for use in each case. Aim: Compile a set of rules that SE researchers can use to ascertain which aggregation method is best for use in the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects that they include, their variance and effect size. We empirically calculated the reliability and statistical power in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and the number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it requires more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable with other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods. If there is, software engineers should select the method that optimizes both parameters.
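For reference, a minimal sketch of how two of the aggregation approaches mentioned above combine a set of experiments: the weighted mean difference and the parametric (log-transformed) response ratio. The summary statistics and the inverse-variance weighting are standard choices for illustration, not necessarily the exact procedure evaluated in the simulation.

```python
import numpy as np

# Per-experiment summary statistics (means, SDs, sample sizes); illustrative values
mE, sE, nE = np.array([12.0, 10.5, 11.2]), np.array([3.0, 2.5, 2.8]), np.array([20, 15, 30])
mC, sC, nC = np.array([10.0,  9.8, 10.1]), np.array([3.2, 2.4, 3.0]), np.array([20, 15, 30])

# Weighted mean difference: inverse-variance weighted average of the mean differences
d = mE - mC
v_d = sE**2 / nE + sC**2 / nC
wmd = np.sum(d / v_d) / np.sum(1.0 / v_d)

# Parametric response ratio: pool log(mE / mC) by inverse variance, then back-transform
ln_rr = np.log(mE / mC)
v_rr = sE**2 / (nE * mE**2) + sC**2 / (nC * mC**2)
rr = np.exp(np.sum(ln_rr / v_rr) / np.sum(1.0 / v_rr))

print(f"WMD = {wmd:.3f}, pooled response ratio = {rr:.3f}")
```

Note that the response ratio only needs the group means when variances are unreported (an unweighted pooling can then be used), which is the strength the abstract alludes to.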
Abstract:
Training and assessment paradigms for laparoscopic surgical skills are evolving from traditional mentor–trainee tutorship towards structured, more objective and safer programs. Accreditation of surgeons requires reaching a consensus on the metrics and tasks used to assess surgeons' psychomotor skills. Ongoing development of tracking systems and software solutions has allowed for the expansion of novel training and assessment means in laparoscopy. The current challenge is to adapt and include these systems within training programs, and to exploit their possibilities for evaluation purposes. This paper describes the state of the art in research on measuring and assessing psychomotor laparoscopic skills. It gives an overview of tracking systems as well as of the metrics and advanced statistical and machine learning techniques employed for evaluation purposes. The latter have the potential to be used as an aid in deciding on the surgical competence level, which is an important aspect when the accreditation of surgeons in particular, and patient safety in general, are considered. The prospects of these methods and tools make them complementary means for surgical assessment of motor skills, especially in the early stages of training. Successful examples such as the Fundamentals of Laparoscopic Surgery should help drive a paradigm change towards structured curricula based on objective parameters. These may improve the accreditation of new surgeons, as well as optimize their already overloaded training schedules.
Abstract:
This paper describes two methods to cancel the effect of two kinds of leakage signals that may be present when an antenna is measured in a planar near-field range. One method aims to reduce leakage bias errors from the receiver's quadrature detector and is based on estimating the bias constant added to every near-field data sample. That constant is then subtracted from the data, removing its undesired effect on the far-field pattern. The estimation is performed by back-propagating the field from the scan plane to the antenna under test (AUT) plane and averaging all the data located outside the AUT aperture. The second method is able to cancel the effect of the leakage from faulty transmission lines, connectors or rotary joints. The basis of this method is also a reconstruction process to determine the field distribution on the AUT plane. Once this distribution is known, a spatial filtering is applied to cancel the contribution due to those faulty elements. After that, a near-field-to-far-field transformation is applied, obtaining a new radiation pattern from which the leakage effects have disappeared. To verify the effectiveness of both methods, several examples are presented.
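A rough numerical sketch of the first (bias-leakage) method as described above: back-propagate the sampled near field to the AUT plane with the plane-wave spectrum, average the reconstructed field outside an assumed aperture to estimate the bias constant, and subtract it from the scan-plane data. The grid spacing, the propagation sign convention and the aperture mask are illustrative assumptions, not the paper's actual processing chain.

```python
import numpy as np

def remove_bias_leakage(e_scan, dx, dy, dist, wavelength, aperture_mask):
    """Estimate the receiver bias constant and subtract it from the scan data.

    e_scan        : complex near-field samples on the scan plane (2-D array)
    dx, dy        : sample spacings on the scan plane
    dist          : scan-plane-to-AUT-plane separation
    aperture_mask : boolean array on the reconstructed plane, True inside the
                    assumed AUT aperture
    """
    k0 = 2 * np.pi / wavelength
    ny, nx = e_scan.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    propagating = (k0**2 - KX**2 - KY**2) > 0
    kz = np.sqrt(np.where(propagating, k0**2 - KX**2 - KY**2, 0.0))

    # Back-propagate the propagating part of the plane-wave spectrum to the AUT plane
    spectrum = np.fft.fft2(e_scan) * propagating
    e_aut = np.fft.ifft2(spectrum * np.exp(1j * kz * dist))

    # The true field should be ~0 outside the aperture, so what remains there is
    # (mostly) the back-propagated bias constant
    bias_aut = e_aut[~aperture_mask].mean()

    # Undo the propagation phase picked up by the constant and subtract it
    bias_scan = bias_aut * np.exp(-1j * k0 * dist)
    return e_scan - bias_scan, bias_scan
```

The corrected scan-plane data can then be fed to the usual near-field-to-far-field transformation, which is where the bias-induced error centered at boresight would otherwise show up.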
Abstract:
Following the success achieved in previous research projects using non-destructive methods to estimate the physical and mechanical aging of particle and fibre boards, this paper studies the relationships between aging and physical and mechanical changes, using non-destructive measurements of oriented strand board (OSB). 184 pieces of OSB board from a French source were tested to analyze their actual physical and mechanical properties. The same properties were estimated using acoustic non-destructive methods (ultrasound and stress wave velocity) during a physical laboratory aging test. Propagation wave velocity was recorded with the sensors aligned, edge to edge, and forming an angle of 45 degrees, with both sensors on the same face of the board, because aligned measurements are not possible on site. The velocity results are always higher in the 45-degree measurements. Given the results of the statistical analysis, it can be concluded that there is a strong relationship between the acoustic measurements and the decline in the physical and mechanical properties of the panels due to aging. The authors propose several models to estimate the physical and mechanical properties of the board, as well as its degree of aging. The best results are obtained using ultrasound, although the difference in comparison with the stress wave method is not very significant. A reliable prediction of the degree of deterioration (aging) of board is presented.
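As an illustration of the kind of estimation model proposed, a minimal sketch of a simple regression predicting a mechanical property from the acoustic measurement; the variable names, values and single-predictor form are assumptions for illustration, not the authors' fitted models or coefficients.

```python
import numpy as np

# Illustrative data: ultrasound wave velocity (m/s) and bending strength (N/mm^2)
velocity = np.array([2900.0, 3050.0, 3120.0, 3240.0, 3300.0, 3410.0])
strength = np.array([14.1, 15.8, 16.2, 17.5, 18.0, 19.3])

# Ordinary least-squares fit: strength ~ a * velocity + b
a, b = np.polyfit(velocity, strength, 1)
predicted = a * velocity + b
r2 = 1 - np.sum((strength - predicted) ** 2) / np.sum((strength - strength.mean()) ** 2)

print(f"strength ~ {a:.4f} * velocity + {b:.2f}  (R^2 = {r2:.2f})")
```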
Abstract:
This paper studies the relationship between aging, physical changes and the results of non-destructive testing of plywood. 176 pieces of plywood were tested to analyze their actual and estimated density using non-destructive methods (screw withdrawal force and ultrasound wave velocity) during a laboratory aging test. From the results of statistical analysis it can be concluded that there is a strong relationship between the non-destructive measurements carried out, and the decline in the physical properties of the panels due to aging. The authors propose several models to estimate board density. The best results are obtained with ultrasound. A reliable prediction of the degree of deterioration (aging) of board is presented.