9 results for "Expected and observed heterozygosity"

in DigitalCommons@The Texas Medical Center


Relevance:

100.00%

Publisher:

Abstract:

Currently, more than half of Electronic Health Record (EHR) projects fail. Most of these failures are due not to flawed technology but to a lack of systematic consideration of human issues. Among the barriers to EHR adoption, function mismatching among users, activities, and systems is a major area that has not been systematically addressed from a human-centered perspective. A theoretical framework, the Functional Framework, was developed for identifying and reducing functional discrepancies among users, activities, and systems. The Functional Framework is composed of three models: the User Model, the Designer Model, and the Activity Model. The User Model was developed by conducting a survey (N = 32) that identified the functions needed and desired from the user's perspective. The Designer Model was developed by conducting a systematic review of an Electronic Dental Record (EDR) and its functions. The Activity Model was developed using an ethnographic method called shadowing, in which EDR users (5 dentists, 5 dental assistants, 5 administrative personnel) were quietly followed and observed during their activities. These three models were combined to form a unified model. From the unified model, a work domain ontology was developed by asking users to rate the functions in the unified model (190 in total) along the dimensions of frequency and criticality in a survey. The functional discrepancies, as indicated by the regions of the Venn diagrams formed by the three models, were consistent with the survey results, especially with user satisfaction. The Functional Framework survey also indicated a preference for one system over the other (R = 0.895). The results of this project show that the Functional Framework provides a systematic method for identifying, evaluating, and reducing functional discrepancies among users, systems, and activities. Limitations and the generalizability of the Functional Framework are discussed.
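As a toy illustration of the Venn-diagram analysis described above, the three models can be treated as sets of functions and the discrepancy regions computed with set operations (the function names below are hypothetical, not from the study):

```python
# Sketch of the Functional Framework's Venn-diagram analysis: functional
# discrepancies are the regions where the User, Designer (system), and
# Activity models do not overlap. Function names are hypothetical.
user_model = {"charting", "billing", "scheduling", "imaging"}
designer_model = {"charting", "billing", "reporting", "imaging"}
activity_model = {"charting", "scheduling", "imaging", "messaging"}

# Functions supported by all three models (no discrepancy).
aligned = user_model & designer_model & activity_model

# Example discrepancy region: functions users need and actually use,
# but that the system does not provide.
needed_but_missing = (user_model & activity_model) - designer_model

# Example discrepancy region: system functions nobody needs or uses.
unused_features = designer_model - (user_model | activity_model)

print(sorted(aligned))            # ['charting', 'imaging']
print(sorted(needed_but_missing)) # ['scheduling']
print(sorted(unused_features))    # ['reporting']
```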

Relevance:

100.00%

Publisher:

Abstract:

This project develops K(bin), a relatively simple, binomial-based statistic for assessing interrater agreement in which expected agreement is calculated a priori from the number of raters involved in the study and the number of categories on the rating tool. The statistic is logical in interpretation, easily calculated, stable for small sample sizes, and applicable over a wide range of possible combinations, from the simplest case of two raters using a binomial scale to multiple raters using a multiple-level scale. Tables of expected agreement values and tables of critical values for K(bin), which include power to detect three levels of the population parameter K for n from 2 to 30 and observed agreement ≥ .70, calculated at α = .05, .025, and .01, are included. An example is also included that describes the use of the tables for planning and evaluating an interrater reliability study using the statistic K(bin).
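A kappa-style statistic with a priori expected agreement can be sketched as follows; the uniform-chance assumption used for expected agreement is an illustrative simplification, not necessarily the exact K(bin) derivation:

```python
def expected_agreement(n_categories):
    # Illustrative assumption: each rater picks among c categories
    # uniformly at random, so two raters agree by chance with
    # probability 1/c.
    return 1.0 / n_categories

def k_bin(observed_agreement, n_categories):
    # Kappa-style correction: how far observed agreement exceeds
    # chance agreement, scaled by the maximum possible excess.
    p_e = expected_agreement(n_categories)
    return (observed_agreement - p_e) / (1.0 - p_e)

# Two raters, a 5-category scale, 70% observed agreement.
print(round(k_bin(0.70, 5), 3))  # 0.625
```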

Relevance:

100.00%

Publisher:

Abstract:

Objective. The prevalence of overweight and obesity differs substantially among children of different ethnic origin in the United States. The objective of this project is to estimate to what extent changes in ethnic composition since 1980 have contributed to the current general "obesity epidemic" in the childhood population of the United States.

Methods. Populations by single year of age, 0 to 19, male and female, for Hispanics, non-Hispanic whites, and non-Hispanic blacks, from the US Census' July estimates for 1985, 1990, 1995, 2000, and 2005 were taken and compared to the population and percentage of those groups in 1980. Age-, sex-, and ethnicity-specific prevalence rates for overweight in 1980 were then applied to the populations by age for the specified year, and differences between expected and actual overweight populations were assessed.

Results. The results from this investigation provide estimates of the contribution that different ethnic groups have made to the overall prevalence of overweight and obesity in the childhood population of the United States. Assuming that the 1976-1980 prevalence rates had remained unchanged, and comparing the population had there been no change in ethnic composition with the population given the actual change in ethnicity, the percentage increase was 1.06% in 1985, 1.72% in 1990, 2.57% in 1995, 3.95% in 2000, and 4.39% in 2005.

Conclusion. The changes in ethnic composition of the population, independent of changes in ethnicity-specific prevalence, have contributed substantially to the current overall prevalence of obesity in the United States childhood population. A number of factors may be responsible for the apparent susceptibility of Mexican-Americans and non-Hispanic blacks to overweight and obesity. Further research is needed on specific characteristics of those populations.
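The expected-versus-actual comparison can be sketched as a simple standardization; all population shares and prevalence rates below are hypothetical placeholders, not the census figures used in the study:

```python
# Apply fixed 1980 ethnicity-specific overweight prevalence rates to a
# later year's population, once under the 1980 ethnic composition and
# once under the actual composition; the difference isolates the effect
# of compositional change. All numbers are hypothetical.
prev_1980 = {"hispanic": 0.10, "nh_white": 0.06, "nh_black": 0.09}

pop_total = 80_000_000  # hypothetical later-year child population

# Population shares (other groups omitted, so shares need not sum to 1).
comp_1980 = {"hispanic": 0.09, "nh_white": 0.74, "nh_black": 0.15}
comp_2005 = {"hispanic": 0.18, "nh_white": 0.62, "nh_black": 0.15}

def expected_overweight(composition):
    # Expected overweight count under fixed 1980 prevalence rates.
    return sum(pop_total * share * prev_1980[g]
               for g, share in composition.items())

base = expected_overweight(comp_1980)    # no compositional change
actual = expected_overweight(comp_2005)  # actual compositional change
pct_increase = 100 * (actual - base) / base
print(round(pct_increase, 2))  # 2.69
```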

Relevance:

100.00%

Publisher:

Abstract:

This paper defines and compares several models for describing excess influenza and pneumonia mortality in Houston. First, the methodology used by the Centers for Disease Control is examined and several variations of this methodology are studied. All of the models examined highlight the difficulty of omitting epidemic weeks.

In an attempt to find a better method of describing expected and epidemic mortality, time series methods are examined. Grouping the data in four-week periods, truncating the series to adjust for epidemic periods, and seasonally adjusting the series y(t) (the adjustment formula is omitted from the source abstract; see DAI) is the best method examined. The resulting series w(t) is stationary, and a moving average model, MA(1), gives a good fit for forecasting influenza and pneumonia mortality in Houston.

Influenza morbidity, other causes of death, sex, race, age, climate variables, environmental factors, and school absenteeism are all examined in terms of their relationship to influenza and pneumonia mortality. Both influenza morbidity and ischemic heart disease mortality show a very strong relationship that remains when seasonal trends are removed from the data. However, when the three series are modeled jointly, the simple MA(1) model of the truncated, seasonally adjusted four-week data still gives a better forecast.
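One common form of seasonal adjustment, subtracting the mean of each seasonal position, can stand in for the formula omitted from the abstract (this is an illustrative choice, not necessarily the authors' exact transformation):

```python
# Seasonally adjust a periodic series by subtracting the mean of each
# seasonal position across cycles. The data are synthetic four-week
# mortality counts with a short period for readability.
def seasonally_adjust(y, period):
    n_cycles = len(y) // period
    # Mean of each seasonal position across complete cycles.
    seasonal_mean = [
        sum(y[i + c * period] for c in range(n_cycles)) / n_cycles
        for i in range(period)
    ]
    # Deviation from the seasonal mean: the adjusted series w(t).
    return [y[t] - seasonal_mean[t % period] for t in range(len(y))]

y = [20, 35, 50, 30, 22, 37, 52, 32]  # two cycles of period 4
w = seasonally_adjust(y, period=4)
print(w)  # [-1.0, -1.0, -1.0, -1.0, 1.0, 1.0, 1.0, 1.0]
```

A stationary adjusted series like w(t) is then a reasonable input for a simple MA(1) forecast model.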

Relevance:

100.00%

Publisher:

Abstract:

Mixture modeling is commonly used to model categorical latent variables that represent subpopulations in which population membership is unknown but can be inferred from the data. In recent years, finite mixture models have been applied to time-to-event data. However, the commonly used survival mixture model assumes that the effects of the covariates on failure times differ across latent classes while the covariate distribution is homogeneous. The aim of this dissertation is to develop a method to examine time-to-event data in the presence of unobserved heterogeneity within a mixture modeling framework. A joint model is developed that incorporates the latent survival trajectory along with the observed information for the joint analysis of a time-to-event variable, its discrete and continuous covariates, and a latent class variable. It is assumed that both the effects of covariates on survival times and the distribution of covariates vary across latent classes. The unobservable survival trajectories are identified by estimating the probability that a subject belongs to a particular class based on the observed information. We applied this method to a Hodgkin lymphoma study with long-term follow-up and observed four distinct latent classes in terms of long-term survival and distributions of prognostic factors. Our results from simulation studies and from the Hodgkin lymphoma study demonstrated the superiority of our joint model over the conventional survival model. This flexible inference method provides more accurate estimation and accommodates unobservable heterogeneity among individuals while taking interactions between covariates into consideration.
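The class-membership step can be sketched with Bayes' rule: the posterior probability that a subject belongs to class k is proportional to the class prior times the likelihood of the subject's observed data under that class. The numbers below are hypothetical:

```python
# Posterior latent-class membership for one subject. The likelihood
# values stand in for the joint survival-and-covariate densities
# f(t, x | class k) and are hypothetical placeholders.
def posterior_class_probs(priors, likelihoods):
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

priors = [0.5, 0.3, 0.2]          # estimated class proportions
likelihoods = [0.01, 0.08, 0.02]  # f(t, x | class k) for one subject
probs = posterior_class_probs(priors, likelihoods)
print([round(p, 3) for p in probs])  # [0.152, 0.727, 0.121]
```

The subject would be assigned to (or weighted toward) class 2, where the posterior mass concentrates.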

Relevance:

100.00%

Publisher:

Abstract:

Quantitative imaging with 18F-FDG PET/CT has the potential to provide an in vivo assessment of response to radiotherapy (RT). However, comparing tissue tracer uptake in longitudinal studies is often confounded by variations in patient setup and potential treatment-induced gross anatomic changes. These variations make true response monitoring for the same anatomic volume a challenge, not only for tumors but also for normal organs at risk (OAR). The central hypothesis of this study is that more accurate image registration will lead to improved quantitation of tissue response to RT with 18F-FDG PET/CT. Employing an in-house "demons"-based deformable image registration algorithm, pre-RT tumor and parotid gland volumes can be mapped more accurately to serial functional images. To test the hypothesis, specific aim 1 was designed to analyze whether deformably mapping tumor volumes, rather than aligning to bony structures, leads to superior tumor response assessment. We found that deformable mapping of the most metabolically avid regions improved response prediction (P < 0.05). The positive predictive power for residual disease was 63%, compared to 50% for contrast-enhanced post-RT CT. Specific aim 2 was designed to use parotid gland standardized uptake value (SUV) as an objective imaging biomarker for salivary toxicity. We found that relative change in parotid gland SUV correlated strongly with salivary toxicity as defined by the RTOG/EORTC late effects analytic scale (Spearman's ρ = -0.96, P < 0.01). Finally, the goal of specific aim 3 was to create a phenomenological dose-SUV response model for the human parotid glands. Utilizing only baseline metabolic function and the planned dose distribution, it became possible to predict parotid SUV change, or salivary toxicity via the specific aim 2 relationship. We found that the predicted and observed parotid SUV relative changes were significantly correlated (Spearman's ρ = 0.94, P < 0.01).
The application of deformable image registration to quantitative treatment response monitoring with 18F-FDG PET/CT could have a profound impact on patient management. Accurate and early identification of residual disease may allow for more timely intervention, while the ability to quantify and predict toxicity of normal OAR might permit individualized refinement of radiation treatment plan designs.
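The specific-aim-2 style analysis, relating relative SUV change to a toxicity grade via Spearman rank correlation, might look like this sketch (all SUV and toxicity values are hypothetical, and the rank computation assumes no ties):

```python
# Relative change in parotid SUV versus a late-effects toxicity grade,
# summarized with Spearman rank correlation. All values are synthetic.
def relative_change(pre, post):
    return (post - pre) / pre

def spearman_rho(x, y):
    # Spearman's rho via the rank-difference formula (no ties assumed).
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

pre_suv = [2.0, 1.8, 2.2, 1.9]   # hypothetical baseline parotid SUVs
post_suv = [1.0, 1.5, 1.2, 1.7]  # hypothetical post-RT parotid SUVs
delta = [relative_change(a, b) for a, b in zip(pre_suv, post_suv)]
toxicity = [3, 1, 3.5, 0.5]      # hypothetical late-effects grades
print(round(spearman_rho(delta, toxicity), 2))  # -0.8
```

The negative sign matches the abstract's finding: larger SUV drops accompany worse salivary toxicity.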

Relevance:

100.00%

Publisher:

Abstract:

Anticancer drugs typically are administered in the clinic in the form of mixtures, sometimes called combinations. Only in rare cases, however, are mixtures approved as drugs. Rather, research on mixtures tends to occur after single drugs have been approved. The goal of this research project was to develop modeling approaches that would encourage rational preclinical mixture design. To this end, a series of models were developed. First, several QSAR classification models were constructed to predict the cytotoxicity, oral clearance, and acute systemic toxicity of drugs. The QSAR models were applied to a set of over 115,000 natural compounds in order to identify promising ones for testing in mixtures. Second, an improved method was developed to assess synergistic, antagonistic, and additive effects between drugs in a mixture. This method, dubbed the MixLow method, is similar to the Median-Effect method, the de facto standard for assessing drug interactions. The primary difference between the two is that the MixLow method uses a nonlinear mixed-effects model to estimate parameters of concentration-effect curves, rather than an ordinary least squares procedure. Parameter estimators produced by the MixLow method were more precise than those produced by the Median-Effect Method, and coverage of Loewe index confidence intervals was superior. Third, a model was developed to predict drug interactions based on scores obtained from virtual docking experiments. This represents a novel approach for modeling drug mixtures and was more useful for the data modeled here than competing approaches. The model was applied to cytotoxicity data for 45 mixtures, each composed of up to 10 selected drugs. One drug, doxorubicin, was a standard chemotherapy agent and the others were well-known natural compounds including curcumin, EGCG, quercetin, and rhein. 
Predictions of synergism/antagonism were made for all possible fixed-ratio mixtures, cytotoxicities of the 10 best-scoring mixtures were tested, and drug interactions were assessed. Predicted and observed responses were highly correlated (r² = 0.83). Results suggested that some mixtures allowed up to an 11-fold reduction of doxorubicin concentrations without sacrificing efficacy. Taken together, the models developed in this project present a general approach to the rational design of mixtures during preclinical drug development.
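A Loewe additivity check of the kind underlying the MixLow and Median-Effect comparisons can be sketched with a Hill (median-effect) concentration-effect model; all parameters below are hypothetical:

```python
# Loewe interaction index for a fixed mixture: sum over drugs of
# (dose of drug i in the mixture) / (dose of drug i alone producing
# the same effect). Index = 1 is additive, < 1 synergy, > 1 antagonism.
def dose_for_effect(effect, ic50, hill):
    # Invert the Hill (median-effect) equation: dose giving a
    # fractional effect in (0, 1).
    return ic50 * (effect / (1 - effect)) ** (1 / hill)

def loewe_index(effect, doses_in_mix, ic50s, hills):
    return sum(d / dose_for_effect(effect, ic50, h)
               for d, ic50, h in zip(doses_in_mix, ic50s, hills))

# Two drugs, each at half its single-agent IC50, evaluated at 50% effect.
li = loewe_index(0.5, doses_in_mix=[0.5, 1.0], ic50s=[1.0, 2.0],
                 hills=[1.0, 2.0])
print(li)  # 1.0 -> exactly Loewe-additive
```

In the MixLow approach, the curve parameters fed into such an index come from a nonlinear mixed-effects fit rather than ordinary least squares, which is the source of the improved precision reported above.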

Relevance:

100.00%

Publisher:

Abstract:

Survivin (BIRC5) is a member of the Inhibitor of Apoptosis (IAP) gene family and functions as a chromosomal passenger protein as well as a mediator of cell survival. Survivin is widely expressed during embryonic development and then becomes transcriptionally silent in most highly differentiated adult tissues. It is also overexpressed in virtually every type of tumor. The survivin promoter contains a canonical CpG island that has been described as epigenetically regulated by DNA methylation. We observed that survivin is overexpressed in high-grade, poorly differentiated endometrial tumors, and we hypothesized that DNA hypomethylation could explain this expression pattern. Surprisingly, methylation-specific PCR and bisulfite pyrosequencing analysis showed that survivin was hypermethylated in endometrial tumors and that this hypermethylation correlated with increased survivin expression. We proposed that methylation could activate survivin expression by inhibiting the binding of a transcriptional repressor. The tumor suppressor protein p53 is a well-documented transcriptional repressor of survivin, and examination of the survivin promoter showed that the p53 binding site contains 3 CpG sites, which often become methylated in endometrial tumors. To determine whether methylation regulates survivin expression, we treated HCT116 cells with decitabine, a demethylating agent, and observed that survivin transcript and protein levels were significantly repressed following demethylation in a p53-dependent manner. Subsequent binding studies confirmed that DNA methylation inhibited the binding of p53 protein to its binding site in the survivin promoter. We are the first to report this novel mechanism of epigenetic regulation of survivin. We also conducted microarray analysis, which showed that many other cancer-relevant genes may be regulated in this manner.
While demethylation agents are traditionally thought to inhibit cancer cell growth by reactivating tumor suppressors, our results indicate that an additional important mechanism is decreasing the expression of oncogenes.

Relevance:

100.00%

Publisher:

Abstract:

One of the fundamental questions in neuroscience is how the encoding of sensory inputs is distributed across neuronal networks in cerebral cortex to influence sensory processing and behavioral performance. The fact that the structure of neuronal networks is organized according to cortical layers raises the possibility that sensory information could be processed differently in distinct layers. The goal of my thesis research is to understand how laminar circuits encode information in their population activity, how the properties of the population code adapt to changes in visual input, and how population coding influences behavioral performance. To this end, we performed a series of novel experiments to investigate how sensory information in the primary visual cortex (V1) emerges across laminar cortical circuits. First, the amount of information encoded by cortical circuits depends critically on whether nearby neurons exhibit correlations. We examined correlated variability in V1 circuits from a laminar-specific perspective and observed that cells in the input layer, which have only local projections, encode incoming stimuli optimally by exhibiting low correlated variability. In contrast, output layers, which send projections to other cortical and subcortical areas, encode information suboptimally by exhibiting large correlations. These results argue that neuronal populations in different cortical layers play different roles in network computations. Second, a fundamental feature of cortical neurons is their ability to adapt to changes in incoming stimuli. Understanding how adaptation emerges across cortical layers to influence information processing is vital for understanding efficient sensory coding. We examined the effects of adaptation, on the time scale of a visual fixation, on network synchronization across laminar circuits.
Specific to the superficial layers, we observed an increase in gamma-band (30-80 Hz) synchronization after adaptation that was correlated with an improvement in neuronal orientation discrimination performance. Thus, synchronization enhances sensory coding to optimize network processing across laminar circuits. Finally, we tested the hypothesis that individual neurons and local populations synchronize their activity in real-time to communicate information about incoming stimuli, and that the degree of synchronization influences behavioral performance. These analyses assessed for the first time the relationship between changes in laminar cortical networks involved in stimulus processing and behavioral performance.
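The correlated-variability measure discussed above, trial-to-trial noise correlation between pairs of neurons, can be sketched as a Pearson correlation over spike counts (the counts below are synthetic):

```python
# Noise correlation: Pearson correlation of trial-to-trial response
# fluctuations between two simultaneously recorded neurons responding
# to the same stimulus. Spike counts are synthetic.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Trial-by-trial spike counts for two neurons (same repeated stimulus).
neuron_a = [10, 12, 9, 14, 11]
neuron_b = [20, 23, 19, 27, 21]
print(round(pearson(neuron_a, neuron_b), 3))  # 0.986
```

In the laminar analysis above, low values of this quantity in the input layer correspond to near-optimal population coding, while large values in output layers indicate redundancy.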