15 results for Quantitative Methods

in DigitalCommons@The Texas Medical Center


Relevance: 60.00%

Abstract:

BACKGROUND: Early detection of colorectal cancer through timely follow-up of positive Fecal Occult Blood Tests (FOBTs) remains a challenge. In our previous work, we found 40% of positive FOBT results eligible for colonoscopy had no documented response by a treating clinician at two weeks despite procedures for electronic result notification. We determined if technical and/or workflow-related aspects of automated communication in the electronic health record could lead to the lack of response. METHODS: Using both qualitative and quantitative methods, we evaluated positive FOBT communication in the electronic health record of a large, urban facility between May 2008 and March 2009. We identified the source of test result communication breakdown, and developed an intervention to fix the problem. Explicit medical record reviews measured timely follow-up (defined as response within 30 days of positive FOBT) pre- and post-intervention. RESULTS: Data from 11 interviews and tracking information from 490 FOBT alerts revealed that the software intended to alert primary care practitioners (PCPs) of positive FOBT results was not configured correctly and over a third of positive FOBTs were not transmitted to PCPs. Upon correction of the technical problem, lack of timely follow-up decreased immediately from 29.9% to 5.4% (p<0.01) and was sustained at month 4 following the intervention. CONCLUSION: Electronic communication of positive FOBT results should be monitored to avoid limiting colorectal cancer screening benefits. Robust quality assurance and oversight systems are needed to achieve this. Our methods may be useful for others seeking to improve follow-up of FOBTs in their systems.
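
For readers who want to sanity-check a pre/post comparison like the one reported above, the sketch below runs a two-proportion z-test on the quoted rates. The denominators are invented for illustration, since the abstract does not report them, and `proportions_ztest` from statsmodels is simply one convenient implementation.

```python
# Hypothetical sketch: comparing pre- vs post-intervention rates of lacking
# timely follow-up (29.9% vs 5.4%).  The denominators below are invented for
# illustration; the abstract does not report them.
from statsmodels.stats.proportion import proportions_ztest

n_pre, n_post = 300, 300          # hypothetical numbers of positive FOBTs reviewed
lack_pre = round(0.299 * n_pre)   # ~90 without timely follow-up before the fix
lack_post = round(0.054 * n_post) # ~16 without timely follow-up after the fix

stat, p_value = proportions_ztest([lack_pre, lack_post], [n_pre, n_post])
print(f"z = {stat:.2f}, p = {p_value:.4g}")
```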

Relevance: 60.00%

Abstract:

This exploratory study assesses the utility of substance abuse treatment as a strategy for preventing human immunodeficiency virus (HIV) transmission among injecting drug users (IDUs). Data analyzed in this study were collected in San Antonio, TX, from 1989 through 1995 using both qualitative and quantitative methods. Qualitative data included ethnographic interviews with 234 active IDUs; quantitative data included baseline risk assessments and HIV screening plus follow-up interviews administered approximately six months later to 823 IDUs participating in a federally funded AIDS community outreach demonstration project. Findings that have particularly important implications for substance abuse treatment as an HIV prevention strategy for IDUs are listed below. (1) IDUs who wanted treatment were significantly more likely to be daily heroin users. (2) IDUs who wanted treatment were significantly more likely to have been to treatment previously. (3) IDUs who wanted treatment at baseline reported significantly higher levels of HIV risk than IDUs who did not want treatment. (4) IDUs who went to treatment between their baseline and follow-up interviews reported significantly higher levels of HIV risk at baseline than IDUs who did not go to treatment. (5) IDUs who went to treatment between their baseline and follow-up interviews reported significantly greater decreases in injection-related HIV risk behaviors. (6) IDUs who went to treatment reported significantly greater decreases in sexual HIV risk behaviors than IDUs who did not go to treatment. This study also noted a number of factors that may limit the effectiveness of substance abuse treatment in reducing HIV risk among IDUs. Findings suggest that the impact of methadone maintenance on HIV risk behaviors among opioid-dependent IDUs may be limited by the negative manner in which it is perceived by IDUs as well as by other elements of society. One consequence of the negative perception of methadone maintenance held by many elements of society may be an unwillingness to provide public funding for an adequate number of methadone maintenance slots. Thus, many IDUs who would be willing to enter methadone maintenance are unable to enter it, and many IDUs who do enter it are forced to drop out prematurely.

Relevance: 60.00%

Abstract:

It has been hypothesized that results from short-term bioassays will ultimately provide information that is useful for human health hazard assessment. Although toxicologic test systems have become increasingly refined, to date no investigator has been able to provide qualitative or quantitative methods that would support the use of short-term tests in this capacity. Historically, the validity of the short-term tests has been assessed using the framework of epidemiologic/medical screens. In this context, the results of the carcinogen (long-term) bioassay are generally used as the standard. However, this approach is widely recognized as being biased and, because it employs qualitative data, cannot be used in the setting of priorities. In contrast, the goal of this research was to address the problem of evaluating the utility of the short-term tests for hazard assessment using an alternative method of investigation. Chemical carcinogens were selected from the list of carcinogens published by the International Agency for Research on Cancer (IARC). Tumorigenicity and mutagenicity data on fifty-two chemicals were obtained from the Registry of Toxic Effects of Chemical Substances (RTECS) and were analyzed using a relative potency approach. The relative potency framework allows for the standardization of data "relative" to a reference compound. To avoid any bias associated with the choice of the reference compound, fourteen different compounds were used. The data were evaluated in a format that allowed for a comparison of the ranking of the mutagenic relative potencies of the compounds (as estimated using short-term data) vs. the ranking of the tumorigenic relative potencies (as estimated from the chronic bioassays). The results were statistically significant (p < .05) for data standardized to thirteen of the fourteen reference compounds. Although this was a preliminary investigation, it offers evidence that the short-term test systems may be of utility in ranking the hazards represented by chemicals that may be human carcinogens.
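
The comparison of potency rankings described above can be illustrated with a rank-correlation sketch. The compound potencies below are hypothetical, and Spearman's rho is used here only as one plausible way to compare the mutagenic and tumorigenic orderings; the abstract does not name the exact statistic used.

```python
# Illustrative sketch (not the study's data): comparing the ranking of mutagenic
# relative potencies with the ranking of tumorigenic relative potencies.
# Relative potency is taken here as potency divided by that of a chosen
# reference compound; the values are hypothetical.
import numpy as np
from scipy.stats import spearmanr

mutagenic_potency = np.array([0.4, 2.5, 1.1, 8.0, 0.9])    # short-term test results
tumorigenic_potency = np.array([0.2, 3.1, 0.7, 9.5, 1.4])  # chronic bioassay results

reference = 0  # index of the reference compound
rel_mut = mutagenic_potency / mutagenic_potency[reference]
rel_tum = tumorigenic_potency / tumorigenic_potency[reference]

rho, p_value = spearmanr(rel_mut, rel_tum)  # rank correlation of the two orderings
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```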

Relevance: 60.00%

Abstract:

Black and Hispanic youth experience the largest burden of sexually transmitted infections, teen pregnancy, and childbirth (Hamilton, Martin, & Ventura, 2011). Minority youth are disproportionately more likely to sexually debut at every age and to debut before the age of 13 compared to whites (Centers for Disease Control and Prevention, 2011). However, little is known about pre-coital sexual activity or protective parental factors in early adolescent minority youth. Parental factors such as parent-child communication and parental monitoring influence adolescent sexual behaviors and pre-coital sexual behaviors in early adolescence. Three distinct methods were used in this dissertation. Study one used qualitative methods (semi-structured, in-depth, individual interviews) to explore parent-child communication in African American mother-early adolescent son dyads. Study two used quantitative methods (secondary data analysis of a cross-sectional study) to conduct a moderation analysis. For study three, I conducted a systematic review of parent-based adolescent sexual health interventions. Study one found that mothers feel comfortable talking about sex with adolescents, provide a two-pronged sexual health message, and want their sons to tell them when they are thinking of having sex. Study two found that parental monitoring moderates the relation between parent-child communication and pre-coital sexual behaviors. Study three found that interventions use a variety of theories, methods, and strategies and that no parent-based programs target faith-based organizations, mother-son or father-daughter dyads, or parents of LGBTQ youth. Adolescent sexual health interventions should consider addressing youth-to-parent disclosure of sexual activity or intentions to debut, addressing both parent-child sexual health communication and parental monitoring, and using a theoretical framework.

Relevance: 40.00%

Abstract:

Linkage disequilibrium methods can be used to find genes influencing quantitative trait variation in humans. Linkage disequilibrium methods can require smaller sample sizes than linkage equilibrium methods, such as the variance component approach, to find loci with a specific effect size. The increase in power comes at the expense of requiring more markers to be typed to scan the entire genome. This thesis compares different linkage disequilibrium methods to determine which factors influence the power to detect disequilibrium. The costs of disequilibrium and equilibrium tests were compared to determine whether the savings in phenotyping costs when using disequilibrium methods outweigh the additional genotyping costs. Nine linkage disequilibrium tests were examined by simulation. Five tests involved selecting isolated unrelated individuals, while four involved the selection of parent-child trios (TDT). All nine tests were found to be able to identify disequilibrium with the correct significance level in Hardy-Weinberg populations. Increasing linked genetic variance and trait allele frequency increased the power to detect disequilibrium, while increasing the number of generations and the distance between marker and trait loci decreased it. Discordant sampling was used for several of the tests; the more stringent the sampling, the greater the power to detect disequilibrium in a sample of a given size. The power to detect disequilibrium was not affected by the presence of polygenic effects. When the trait locus had more than two trait alleles, the power of the tests maximized at less than one. For the simulation methods used here, when there were more than two trait alleles there was a probability equal to 1 minus the heterozygosity of the marker locus that both trait alleles were in disequilibrium with the same marker allele, resulting in the marker being uninformative for disequilibrium. The five tests using isolated unrelated individuals were found to have excess error rates when there was disequilibrium due to population admixture. Increased error rates also resulted from increased unlinked major gene effects, discordant trait allele frequency, and increased disequilibrium. Polygenic effects did not affect the error rates. The TDT (Transmission Disequilibrium Test)-based tests were not liable to any increase in error rates. For all sample ascertainment costs, for recent mutations (<100 generations) linkage disequilibrium tests were less expensive to carry out than the variance component test. Candidate gene scans saved even more money. The use of recently admixed populations also decreased the cost of performing a linkage disequilibrium test.
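
For context, the TDT statistic underlying the trio-based tests mentioned above can be computed from transmission counts in heterozygous parents. The sketch below uses hypothetical counts and is not a reproduction of the thesis simulations.

```python
# Minimal sketch of the transmission disequilibrium test (TDT) statistic.
# b and c count transmissions vs. non-transmissions of a marker allele from
# heterozygous parents to phenotypically selected offspring; the counts below
# are hypothetical.
from scipy.stats import chi2

b = 68   # heterozygous parents transmitting the allele of interest
c = 41   # heterozygous parents transmitting the other allele

tdt_stat = (b - c) ** 2 / (b + c)          # McNemar-type chi-square with 1 df
p_value = chi2.sf(tdt_stat, df=1)
print(f"TDT chi-square = {tdt_stat:.2f}, p = {p_value:.4f}")
```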

Relevance: 40.00%

Abstract:

Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantitation method that has been widely used in the biological and biomedical fields. The currently used methods for PCR data analysis, including the threshold cycle (CT) method and linear and non-linear model fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence is usually inaccurate and can therefore distort results. Here, we propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each two consecutive PCR cycles, we subtracted the fluorescence in the former cycle from that in the later cycle, transforming n cycles of raw data into n-1 cycles of data. Linear regression was then applied to the natural logarithm of the transformed data. Finally, amplification efficiencies and the initial DNA molecule numbers were calculated for each PCR run. To evaluate this new method, we compared it in terms of accuracy and precision with the original linear regression method under three background corrections: the mean of cycles 1-3, the mean of cycles 3-7, and the minimum. Three criteria, including threshold identification, max R2, and max slope, were employed to search for target data points. Considering that PCR data are time series data, we also applied linear mixed models. Collectively, when the threshold identification criterion was applied and when the linear mixed model was adopted, the taking-difference linear regression method was superior, giving an accurate estimation of initial DNA amount and a reasonable estimation of PCR amplification efficiencies. When the criteria of max R2 and max slope were used, the original linear regression method gave an accurate estimation of initial DNA amount. Overall, the taking-difference linear regression method avoids the error in subtracting an unknown background and thus is theoretically more accurate and reliable. This method is easy to perform, and the taking-difference strategy can be extended to all current methods for qPCR data analysis.
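
A minimal sketch of the taking-difference idea follows, assuming a constant background and an ideal exponential amplification phase F_n = F0*E^n + B; the simulated fluorescence values are illustrative only, not real qPCR data.

```python
# Sketch of taking-difference linear regression for qPCR, under the simplifying
# assumption of a constant background B and exponential amplification
# F_n = F0 * E**n + B over the cycles used for fitting.
import numpy as np

# Simulate an exponential-phase window of a qPCR curve
E_true, F0_true, background = 1.9, 5e-6, 0.03
cycles = np.arange(10, 21)
fluorescence = F0_true * E_true ** cycles + background

# Taking differences of consecutive cycles removes the constant background:
# D_n = F_{n+1} - F_n = F0 * (E - 1) * E**n
diffs = np.diff(fluorescence)
slope, intercept = np.polyfit(cycles[:-1], np.log(diffs), 1)

E_hat = np.exp(slope)                      # estimated amplification efficiency
F0_hat = np.exp(intercept) / (E_hat - 1)   # estimated initial amount (arbitrary units)
print(f"E = {E_hat:.3f}, F0 = {F0_hat:.2e}")
```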

Relevance: 30.00%

Abstract:

BACKGROUND: Few reports of the utilization of an accurate, cost-effective means for measuring HPV oncogene transcripts have been published. Several papers have reported the use of relative quantitation or more expensive TaqMan methods. Here, we report a method of absolute quantitative real-time PCR utilizing SYBR Green fluorescence for the measurement of HPV E7 expression in cervical cytobrush specimens. RESULTS: The construction of a standard curve based on the serial dilution of an E7-containing plasmid was the key to accurately comparing measurements between cervical samples. The assay was highly reproducible with an overall coefficient of variation of 10.4%. CONCLUSION: The use of highly reproducible and accurate SYBR-based real-time polymerase chain reaction (PCR) assays instead of TaqMan-type assays allows low-cost, high-throughput analysis of viral mRNA expression. The development of such assays will help in refining the current screening programs for HPV-related carcinomas.
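
A hedged sketch of absolute quantification against a plasmid standard curve, in the spirit of the approach above; the Ct values, copy numbers, and the unknown sample are invented for illustration and are not the study's data.

```python
# Sketch: fit a standard curve from a serial dilution of an E7-containing
# plasmid, then interpolate an unknown sample.  All numbers are hypothetical.
import numpy as np

std_copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3])   # known plasmid copies
std_ct = np.array([14.1, 17.5, 20.9, 24.4, 27.8])  # measured threshold cycles

# Ct is (approximately) linear in log10(copies): Ct = slope*log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1 / slope) - 1        # amplification efficiency implied by the slope
print(f"PCR efficiency ~ {efficiency:.0%}")

# Interpolate an unknown cervical sample from its measured Ct
sample_ct = 22.3
sample_copies = 10 ** ((sample_ct - intercept) / slope)
print(f"Estimated E7 copies: {sample_copies:.3g}")
```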

Relevance: 30.00%

Abstract:

Objective: The PEM Flex Solo II (Naviscan, Inc., San Diego, CA) is currently the only commercially available positron emission mammography (PEM) scanner. This scanner does not apply corrections for count rate effects, attenuation or scatter during image reconstruction, potentially affecting the quantitative accuracy of images. This work measures the overall quantitative accuracy of the PEM Flex system and determines the contributions of error due to count rate effects, attenuation and scatter. Materials and Methods: Gelatin phantoms were designed to simulate breasts of different sizes (4-12 cm thick) with varying uniform background activity concentration (0.007-0.5 μCi/cc), cysts and lesions (2:1, 5:1, 10:1 lesion-to-background ratios). The overall error was calculated from ROI measurements in the phantoms with a clinically relevant background activity concentration (0.065 μCi/cc). The error due to count rate effects was determined by comparing the overall error at multiple background activity concentrations to the error at 0.007 μCi/cc. A point source and cold gelatin phantoms were used to assess the errors due to attenuation and scatter. The maximum pixel values in gelatin and in air were compared to determine the effect of attenuation. Scatter was evaluated by comparing the sum of all pixel values in gelatin and in air. Results: The overall error in the background was found to be negative in phantoms of all thicknesses, with the exception of the 4-cm thick phantoms (0% ± 7%), and it increased in magnitude with thickness (-34% ± 6% for the 12-cm phantoms). All lesions exhibited large negative error (-22% for the 2:1 lesions in the 4-cm phantom), which increased in magnitude with thickness and with lesion-to-background ratio (-85% for the 10:1 lesions in the 12-cm phantoms). The error due to count rate in phantoms with 0.065 μCi/cc background was negative (-23% ± 6% for 4-cm thickness) and decreased in magnitude with thickness (-7% ± 7% for 12 cm). Attenuation was a substantial source of negative error that increased in magnitude with thickness (-51% ± 10% to -77% ± 4% in the 4- to 12-cm phantoms, respectively). Scatter contributed a relatively constant amount of positive error (+23% ± 11%) for all thicknesses. Conclusion: Applying corrections for count rate, attenuation and scatter will be essential for the PEM Flex Solo II to be able to produce quantitatively accurate images.
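
The error definitions implied above can be sketched roughly as follows; the ROI values, the low-activity baseline, and the treatment of count-rate error as a simple difference are all assumptions made for illustration, not values or formulas from the study.

```python
# Rough sketch of the error decomposition described above; numbers are invented.
def percent_error(measured, expected):
    """Signed percent error of a measured activity concentration."""
    return 100.0 * (measured - expected) / expected

true_background = 0.065          # uCi/cc, clinically relevant concentration
measured_background = 0.043      # hypothetical ROI mean in a thick phantom

overall = percent_error(measured_background, true_background)

# Count-rate contribution estimated here as the difference from a low-activity
# acquisition where count-rate effects are assumed negligible (an assumption).
overall_low_activity = -12.0     # hypothetical overall error at 0.007 uCi/cc
count_rate_error = overall - overall_low_activity

print(f"overall error {overall:.1f}%, count-rate contribution {count_rate_error:.1f}%")
```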

Relevance: 30.00%

Abstract:

BACKGROUND: Quantitative myocardial PET perfusion imaging requires partial volume corrections. METHODS: Patients underwent ECG-gated, rest-dipyridamole myocardial perfusion PET using Rb-82, decay corrected in Bq/cc, for diastolic, systolic, and combined whole-cycle ungated images. The diastolic partial volume correction relative to systole was determined from the systolic/diastolic activity ratio; the systolic partial volume correction, from phantom dimensions comparable to systolic LV wall thicknesses; and the whole heart cycle partial volume correction for ungated images, from the fractional systolic and diastolic durations applied to the systolic and diastolic partial volume corrections. RESULTS: For 264 PET perfusion images from 159 patients (105 rest-stress image pairs, 54 individual rest or stress images), the average resting diastolic partial volume correction relative to systole was 1.14 ± 0.04, independent of heart rate and within ±1.8% of stress images (1.16 ± 0.04). Diastolic partial volume corrections combined with those for phantom dimensions comparable to systolic LV wall thickness gave an average whole heart cycle partial volume correction for ungated images of 1.23 for Rb-82, compared to 1.14 if positron range were negligible, as for F-18. CONCLUSION: Quantitative myocardial PET perfusion imaging requires partial volume correction, herein demonstrated clinically from systolic/diastolic absolute activity ratios combined with phantom data accounting for Rb-82 positron range.
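
One plausible reading of how the gated quantities could combine into a whole-cycle correction is sketched below; the exact weighting is an assumption for illustration, not the paper's stated formula.

```latex
% Assumed combination (illustrative only): PVC_sys from phantom data, the
% diastolic/systolic correction ratio R_d ~ 1.14 from gated images, and
% fractional systolic/diastolic durations with f_sys + f_dia = 1.
\[
  \mathrm{PVC}_{\mathrm{dia}} \approx R_{d}\,\mathrm{PVC}_{\mathrm{sys}},
  \qquad
  \mathrm{PVC}_{\mathrm{ungated}} \approx
  f_{\mathrm{sys}}\,\mathrm{PVC}_{\mathrm{sys}} + f_{\mathrm{dia}}\,\mathrm{PVC}_{\mathrm{dia}} .
\]
```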

Relevance: 30.00%

Abstract:

A new technique for the detection of microbiological fecal pollution in drinking water and in raw surface water has been modified and tested against the standard multiple-tube fermentation technique (most probable number, MPN). The performance of the new test in detecting fecal pollution in drinking water was assessed at different incubation temperatures. The basis for the new test is the detection of hydrogen sulfide produced by hydrogen sulfide-producing bacteria, which are usually associated with the coliform group. Positive results are indicated by the appearance of a brown to black color in the contents of the fermentation tube within 18 to 24 hours of incubation at 35 ± 0.5°C. For this study, 158 water samples from different sources were used. The results were analyzed statistically with the paired t-test and one-way analysis of variance. No statistically significant difference was noticed between the two methods, when tested at 35 ± 0.5°C, in detecting fecal pollution in drinking water. The new test showed more positive results with raw surface water, which could be due to the presence of hydrogen sulfide-producing bacteria of non-fecal origin such as Desulfovibrio and Desulfotomaculum. The survival of the hydrogen sulfide-producing bacteria and the coliforms was also tested over a 7-day period, and the results showed no significant difference. The two methods showed no significant difference when used to detect fecal pollution at a very low coliform density. The results showed that the new test is most effective in detecting fecal pollution in drinking water when used at 35 ± 0.5°C. The new test is effective, simple, and less expensive when used to detect fecal pollution in drinking water and raw surface water at 35 ± 0.5°C. The method can be used for qualitative and/or quantitative analysis of water in the field and in the laboratory.
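
The paired comparison reported above can be illustrated with a small sketch; the per-sample results for the two methods are hypothetical (coded 1 for detection, 0 for no detection), which is a simplification of the study's data, and `ttest_rel` is used because the abstract names the paired t-test.

```python
# Illustrative sketch of a paired comparison of two detection methods on the
# same water samples; the data are hypothetical, not the study's 158 samples.
import numpy as np
from scipy.stats import ttest_rel

h2s_test = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])   # hydrogen sulfide method
mpn_test = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0])   # multiple-tube fermentation (MPN)

t_stat, p_value = ttest_rel(h2s_test, mpn_test)        # paired t-test on matched samples
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```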

Relevance: 30.00%

Abstract:

My dissertation focuses on developing methods for detecting gene-gene/environment interactions and imprinting effects for human complex diseases and quantitative traits. It includes three sections: (1) generalizing the Natural and Orthogonal Interaction (NOIA) model, a coding technique originally developed for gene-gene (GxG) interactions, to reduced models; (2) developing a novel statistical approach that allows for modeling gene-environment (GxE) interactions influencing disease risk; and (3) developing a statistical approach for modeling genetic variants displaying parent-of-origin effects (POEs), such as imprinting. In the past decade, genetic researchers have identified a large number of causal variants for human genetic diseases and traits by single-locus analysis, and interaction has now become a central topic in the effort to uncover the complex network of multiple genes and environmental exposures contributing to an outcome. Epistasis, also known as gene-gene interaction, is the departure from additivity of the genetic effects of several genes on a trait, meaning that the same alleles of one gene can display different genetic effects under different genetic backgrounds. In this study, we propose to implement the NOIA model for association studies that include interactions for human complex traits and diseases. We compare the performance of the new statistical models we developed and the usual functional model by both simulation study and real data analysis. Both simulation and real data analysis revealed higher power of the NOIA GxG interaction model for detecting both main genetic effects and interaction effects. Through application to a melanoma dataset, we confirmed the previously identified significant regions for melanoma risk at 15q13.1, 16q24.3 and 9p21.3. We also identified potential interactions with these significant regions that contribute to melanoma risk. Based on the NOIA model, we developed a novel statistical approach that allows us to model effects from a genetic factor and a binary environmental exposure that jointly influence disease risk. Both simulation and real data analyses revealed higher power of the NOIA model for detecting both main genetic effects and interaction effects for both quantitative and binary traits. We also found that estimates of the parameters from logistic regression for binary traits are no longer statistically uncorrelated under the alternative model when there is an association. Applying our novel approach to a lung cancer dataset, we confirmed four SNPs in the 5p15 and 15q25 regions to be significantly associated with lung cancer risk in the Caucasian population: rs2736100, rs402710, rs16969968 and rs8034191. We also validated that rs16969968 and rs8034191 in the 15q25 region significantly interact with smoking in the Caucasian population. Our approach also identified a potential interaction of SNP rs2256543 in 6p21 with smoking in contributing to lung cancer risk. Genetic imprinting is the most well-known cause of parent-of-origin effects (POEs), whereby a gene is differentially expressed depending on the parental origin of the same alleles. Genetic imprinting affects several human disorders, including diabetes, breast cancer, alcoholism, and obesity. This phenomenon has been shown to be important for normal embryonic development in mammals. Traditional association approaches ignore this important genetic phenomenon.
In this study, we propose a NOIA framework for single-locus association studies that estimates both main allelic effects and POEs. We develop statistical (Stat-POE) and functional (Func-POE) models and demonstrate conditions for orthogonality of the Stat-POE model. We conducted simulations for both quantitative and qualitative traits to evaluate the performance of the statistical and functional models with different levels of POEs. Our results showed that the newly proposed Stat-POE model, which ensures orthogonality of variance components if Hardy-Weinberg equilibrium (HWE) holds or if the minor and major allele frequencies are equal, had greater power for detecting the main allelic additive effect than the Func-POE model, which codes according to allelic substitutions, for both quantitative and qualitative traits. The power for detecting the POE was the same for the Stat-POE and Func-POE models under HWE for quantitative traits.
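
As background to the orthogonality claims above, the sketch below constructs the standard single-locus statistical (orthogonal) coding in the NOIA style, as I understand it from Álvarez-Castro and Carlborg's formulation, and checks numerically that the frequency-weighted columns are orthogonal even away from HWE. It is an illustration of that property only, not the dissertation's Stat-POE code.

```python
# Sketch of NOIA-style statistical coding for a single biallelic locus and a
# numerical check of orthogonality under arbitrary genotype frequencies.
import numpy as np

def noia_statistical_coding(p11, p12, p22):
    """Additive and dominance coding columns for genotypes (11, 12, 22)."""
    mean_count = p12 + 2 * p22                       # mean number of '2' alleles
    additive = np.array([0, 1, 2]) - mean_count      # centered allele count
    denom = p11 + p22 - (p11 - p22) ** 2             # scaling, as I recall the published form
    dominance = np.array([-2 * p12 * p22, 4 * p11 * p22, -2 * p11 * p12]) / denom
    return additive, dominance

# Genotype frequencies that deliberately violate Hardy-Weinberg equilibrium
p11, p12, p22 = 0.30, 0.25, 0.45
add, dom = noia_statistical_coding(p11, p12, p22)
w = np.array([p11, p12, p22])

# Frequency-weighted inner products are ~0, i.e. the effect estimates are orthogonal
print(np.dot(w, add), np.dot(w, dom), np.dot(w, add * dom))
```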

Relevance: 30.00%

Abstract:

Cardiovascular disease (CVD) is a threat to public health and has been reported to be the leading cause of death in the United States. The invention of next generation sequencing (NGS) technology has revolutionized biomedical research. Investigating NGS data on CVD-related quantitative traits would help address the unknown etiology and disease mechanisms of CVD. NHLBI's Exome Sequencing Project (ESP) contains CVD-related phenotypes and their associated NGS exome sequence data. Initially, a subset of next generation sequencing data consisting of 13 CVD-related quantitative traits was investigated. Only 6 traits (systolic blood pressure (SBP), diastolic blood pressure (DBP), height, platelet count, waist circumference, and weight) were analyzed by a functional linear model (FLM) and 7 currently existing methods. The FLM outperformed all currently existing methods by identifying the highest number of significant genes: 96, 139, 756, 1162, 1106, and 298 genes associated with SBP, DBP, height, platelet count, waist circumference, and weight, respectively.

Relevance: 30.00%

Abstract:

Complex diseases such as cancer result from multiple genetic changes and environmental exposures. Due to the rapid development of genotyping and sequencing technologies, we are now able to more accurately assess the causal effects of many genetic and environmental factors. Genome-wide association studies have been able to localize many causal genetic variants predisposing to certain diseases. However, these studies explain only a small portion of the heritability of diseases. More advanced statistical models are urgently needed to identify and characterize additional genetic and environmental factors and their interactions, which will enable us to better understand the causes of complex diseases. In the past decade, thanks to increasing computational capabilities and novel statistical developments, Bayesian methods have been widely applied in genetics/genomics research and have demonstrated superiority over some standard approaches in certain research areas. Gene-environment and gene-gene interaction studies are among the areas where Bayesian methods may fully exert their capabilities and advantages. This dissertation focuses on developing new Bayesian statistical methods for data analysis with complex gene-environment and gene-gene interactions, as well as extending some existing methods for gene-environment interactions to other related areas. It includes three sections: (1) deriving a Bayesian variable selection framework for hierarchical gene-environment and gene-gene interactions; (2) developing Bayesian Natural and Orthogonal Interaction (NOIA) models for gene-environment interactions; and (3) extending the applications of two Bayesian statistical methods, which were developed for gene-environment interaction studies, to other related types of studies such as adaptive borrowing of historical data. We propose a Bayesian hierarchical mixture model framework that allows us to investigate genetic and environmental effects, gene-by-gene interactions (epistasis), and gene-by-environment interactions in the same model. It is well known that, in many practical situations, there exists a natural hierarchical structure between the main effects and interactions in the linear model. Here we propose a model that incorporates this hierarchical structure into the Bayesian mixture model, such that irrelevant interaction effects can be removed more efficiently, resulting in more robust, parsimonious and powerful models. We evaluate both the 'strong hierarchical' and 'weak hierarchical' models, which specify that both, or at least one, of the main effects of interacting factors must be present for the interaction to be included in the model. Extensive simulation results show that the proposed strong and weak hierarchical mixture models control the proportion of false positive discoveries and yield a powerful approach to identifying the predisposing main effects and interactions in studies with complex gene-environment and gene-gene interactions. We also compare these two models with the 'independent' model, which does not impose this hierarchical constraint, and observe their superior performance in most of the considered situations. The proposed models are implemented in real data analyses of gene-environment interactions in lung cancer and cutaneous melanoma case-control studies. Bayesian statistical models have the property of being able to incorporate useful prior information in the modeling process.
Moreover, the Bayesian mixture model outperforms the multivariate logistic model in terms of parameter estimation and variable selection in most cases. Our proposed models impose hierarchical constraints that further improve the Bayesian mixture model by reducing the proportion of false positive findings among the identified interactions and by successfully identifying the reported associations. This is practically appealing for investigating causal factors among a moderate number of candidate genetic and environmental factors along with a relatively large number of interactions. The natural and orthogonal interaction (NOIA) models of genetic effects have previously been developed to provide an analysis framework in which the estimates of effects for a quantitative trait are statistically orthogonal regardless of whether Hardy-Weinberg equilibrium (HWE) holds within loci. Ma et al. (2012) recently developed a NOIA model for gene-environment interaction studies and showed the advantages of using the model for detecting the true main effects and interactions, compared with the usual functional model. In this project, we propose a novel Bayesian statistical model that combines the Bayesian hierarchical mixture model with the NOIA statistical model and the usual functional model. The proposed Bayesian NOIA model demonstrates more power at detecting the non-null effects with higher marginal posterior probabilities. Also, we review two Bayesian statistical models (a Bayesian empirical shrinkage-type estimator and Bayesian model averaging) that were developed for gene-environment interaction studies. Inspired by these Bayesian models, we develop two novel statistical methods that are able to handle related problems such as borrowing data from historical studies. The proposed methods are analogous to the methods for gene-environment interactions in how they balance statistical efficiency and bias within a unified model. Through extensive simulation studies, we compare the operating characteristics of the proposed models with existing models, including the hierarchical meta-analysis model. The results show that the proposed approaches adaptively borrow the historical data in a data-driven way. These novel models may have a broad range of statistical applications in both genetic/genomic and clinical studies.
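
The 'strong' versus 'weak' hierarchy distinction discussed above can be made concrete with a toy indicator rule for when an interaction term is eligible to enter the model; the function below is purely illustrative and is not the dissertation's implementation.

```python
# Toy sketch of the 'strong' vs. 'weak' hierarchical constraint on interaction
# inclusion indicators in spike-and-slab style variable selection; the function
# name and structure are illustrative only.
def interaction_allowed(gamma_main_1, gamma_main_2, hierarchy="strong"):
    """Return True if an interaction term may enter the model.

    gamma_main_1, gamma_main_2 -- 0/1 inclusion indicators of the two main effects
    hierarchy -- 'strong' (both mains required), 'weak' (at least one required),
                 or 'independent' (no constraint)
    """
    if hierarchy == "strong":
        return bool(gamma_main_1 and gamma_main_2)
    if hierarchy == "weak":
        return bool(gamma_main_1 or gamma_main_2)
    return True  # 'independent' model

# Example: a gene-environment interaction is only eligible under the weak model
print(interaction_allowed(1, 0, "strong"), interaction_allowed(1, 0, "weak"))
```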

Relevance: 30.00%

Abstract:

OBJECTIVE: To systematically review the published literature to examine the complications associated with the use of misoprostol and compare these complications to those associated with other forms of abortion induction. DATA SOURCES: Studies were identified through searches of medical literature databases including Medline (Ovid), PubMed (NLM), LILACS, SciELO, and AIM (AFRO), and through review of the references of relevant articles. STUDY SELECTION AND METHODS: A descriptive systematic review that included studies reported in English and published before December 2012. Eligibility criteria included: use of misoprostol (with or without other methods) and any other method of abortion in a developing country, as well as quantitative data on the complications of each method. The following information was extracted from each study: author/year, country/city, study design/study sample, age range, setting of data collection, sample size, the method of abortion induction, the number of cases for each method, and the percentage of complications with each method. RESULTS: A total of 4 studies were identified (all in Latin America) describing post-abortion complications of misoprostol and other methods in countries where abortion is generally considered unsafe and/or illegal. The four studies reported a range of complications including: bleeding, infection, incomplete abortion, intense pelvic pain, uterine perforation, headache, diarrhea, nausea, mechanical lesions, and systemic collapse. The most prevalent complications of misoprostol-induced abortion reported were: bleeding (7-82%), incomplete abortion (33-70%), and infection (0.8-67%). The prevalence of these complications reported for other abortion methods included: bleeding (16-25%), incomplete abortion (15-82%), and infection (13-50%). CONCLUSION: The literature identified by this systematic review is inadequate for determining the complications of misoprostol used in unsafe settings. Abortion is considered an illicit behavior in these countries, making it difficult to investigate the details needed to conduct a study on abortion complications. Given the differences between the reviewed studies as well as a variety of study limitations, it is not possible to draw firm conclusions about the rates of specific abortion-related complications.

Relevance: 30.00%

Abstract:

Accurate quantitative estimation of exposure using retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, some models have been developed using published exposure databases with their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained. In an effort to improve the prediction accuracy and generalizability of these models, and taking into account that the limitations encountered in previous studies might be due to limitations in the applicability of traditional statistical methods and concepts, the use of computer science-derived data analysis methods, predominantly machine learning approaches, was proposed and explored in this study. The goal of this study was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational outcomes based on literature-derived databases, and to compare, using cross-validation and data splitting techniques, the resulting prediction capacity to that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was measured as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the result of the exposure is expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride, and trichloroethylene were used. When compared to regression estimates, results showed better accuracy of decision tree/ensemble techniques for the categorical case, while neural networks were better for estimation of continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy. Estimates based on literature-based databases using machine learning techniques might provide an advantage when they are applied within other methodologies that combine 'expert inputs' with current exposure measurements, such as the Bayesian Decision Analysis tool. The use of machine learning techniques to more accurately estimate exposures from literature-based exposure databases might represent a starting point toward independence from expert judgment.
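
A hedged sketch of the kind of comparison described above, using scikit-learn to score a tree ensemble against linear regression by cross-validation; the synthetic features stand in for literature-derived exposure determinants, and none of the numbers reflect the study's databases.

```python
# Sketch: compare a random forest and linear regression for predicting a
# continuous exposure concentration, scored by 5-fold cross-validation.
# The data are synthetic stand-ins for literature-derived exposure determinants.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 5))                        # e.g. coded exposure determinants
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] * X[:, 2] + rng.normal(scale=0.5, size=n)

for name, model in [("random forest", RandomForestRegressor(n_estimators=200, random_state=0)),
                    ("linear regression", LinearRegression())]:
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R^2 = {scores.mean():.2f}")
```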