20 results for "Multivariate measurement model"

in DigitalCommons@The Texas Medical Center


Relevance: 100.00%

Abstract:

A multivariate frailty hazard model is developed for joint modeling of three correlated time-to-event outcomes: (1) local recurrence, (2) distant recurrence, and (3) overall survival. The term frailty is introduced to model population heterogeneity. The dependence is modeled by conditioning on a shared frailty that is included in the three hazard functions. Independent variables can be included in the model as covariates. Markov chain Monte Carlo methods are used to estimate the posterior distributions of the model parameters. The algorithm used in the present application is the hybrid Metropolis-Hastings algorithm, which updates all parameters simultaneously using evaluations of the gradient of the log posterior density. The performance of this approach is examined in simulation studies using exponential and Weibull distributions. We apply the proposed methods to a study of patients with soft tissue sarcoma, which motivated this research. Our results indicate that patients who received chemotherapy had better overall survival (hazard ratio 0.242, 95% CI: 0.094-0.564) and a lower risk of distant recurrence (hazard ratio 0.636, 95% CI: 0.487-0.860), but no significant improvement in local recurrence (hazard ratio 0.799, 95% CI: 0.575-1.054). The advantages and limitations of the proposed models, as well as future research directions, are discussed.
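The joint structure described above can be illustrated with a small simulation. The sketch below is a minimal example assuming exponential baseline hazards and a gamma-distributed shared frailty (the abstract does not specify the frailty distribution); the baseline rates and effect sizes are illustrative, not the fitted values from the sarcoma study.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_joint_times(n, frailty_shape=2.0):
    """Simulate three correlated time-to-event outcomes that share a frailty.

    Hazard for outcome k of subject i:
        h_k(t | w_i, x_i) = w_i * lambda_k * exp(beta_k * x_i),
    so conditional on the frailty w_i the event times are exponential.
    """
    # Illustrative baseline rates and chemotherapy effects (log hazard ratios).
    lam = {"local": 0.05, "distant": 0.08, "death": 0.04}
    beta = {"local": -0.22, "distant": -0.45, "death": -1.42}

    chemo = rng.binomial(1, 0.5, size=n)                  # covariate: chemotherapy yes/no
    w = rng.gamma(frailty_shape, 1.0 / frailty_shape, n)  # shared frailty, mean 1

    times = {}
    for k in lam:
        rate = w * lam[k] * np.exp(beta[k] * chemo)       # subject-specific hazard
        times[k] = rng.exponential(1.0 / rate)
    return chemo, times

chemo, times = simulate_joint_times(5000)
# The shared frailty induces positive dependence among the three outcomes:
print(np.corrcoef(times["local"], times["distant"])[0, 1])
```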

Relevance: 80.00%

Abstract:

Ethnic violence appears to be the major source of violence in the world. Ethnic hostilities are potentially all-pervasive because most countries in the world are multi-ethnic. Public health's growing focus on violence documents its increasing role in this issue. The present study is a secondary analysis of a dataset of responses by 272 individuals from four ethnic groups (Anglo, African, Mexican, and Vietnamese Americans) who answered questions regarding variables related to ethnic violence from a general questionnaire distributed to ethnically diverse, purposive, nonprobability, self-selected groups of individuals in Houston, Texas, in 1993. One goal was psychometric: learning about issues in the analysis of datasets with modest numbers, comparing two approaches to dealing with missing observations not missing at random (conducting the analysis on two datasets), transformation analysis of continuous variables for logistic regression, and logistic regression diagnostics. Regarding the psychometric goal, it was concluded that measurement model analysis was not possible with a relatively small dataset with nonnormal variables, such as Likert-scaled variables; therefore, exploratory factor analysis was used. The two approaches to dealing with missing values yielded comparable findings. Transformation analysis suggested that the continuous variables were in the correct scale, and diagnostics indicated that the model fit was adequate. The substantive portion of the analysis included the testing of four hypotheses. Hypothesis One proposed that attitudes/efficacy regarding alternative approaches to resolving grievances from the general questionnaire represented underlying factors: nonpunitive social norms and strategies for addressing grievances--using the political system, organizing protests, using the system to punish offenders, and personal mediation. Evidence was found to support all but one factor, nonpunitive social norms. Hypothesis Two proposed that the factor variables and the other independent variables--jail, grievance, male, young, and membership in a particular ethnic group--were associated with (non)violence. Jail, grievance, and not using the political system to address grievances were associated with a greater likelihood of intergroup violence. No evidence was found to support Hypotheses Three and Four, which proposed that grievance and ethnic group membership would interact with other variables (i.e., age, gender, etc.) to produce variant levels of subgroup (non)violence. The generalizability of the results of this study is constrained by the purposive, self-selected nature of the sample and the small sample size (n = 272). Suggestions for future research include incorporating other possible variables or factors predictive of intergroup violence in models of the kind tested here, and the development and evaluation of interventions that promote electoral and nonelectoral political participation as means of reducing interethnic conflict.
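A hedged sketch of the analytic sequence described above (exploratory factor analysis of Likert items followed by logistic regression on the factor scores), using scikit-learn and statsmodels; the item names, factor count, and outcome are hypothetical placeholders rather than the questionnaire's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

# Hypothetical data: Likert-scaled attitude items plus an intergroup-violence indicator.
rng = np.random.default_rng(0)
n = 272
items = pd.DataFrame(rng.integers(1, 6, size=(n, 8)),
                     columns=[f"item{i}" for i in range(1, 9)])
violence = rng.binomial(1, 0.3, size=n)

# Exploratory factor analysis in place of a formal measurement model.
fa = FactorAnalysis(n_components=4, random_state=0)
scores = fa.fit_transform(items)                    # factor scores per respondent

# Logistic regression of (non)violence on the factor scores.
X = sm.add_constant(pd.DataFrame(scores, columns=[f"factor{i}" for i in range(1, 5)]))
fit = sm.Logit(violence, X).fit(disp=False)
print(fit.summary())
```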

Relevance: 80.00%

Abstract:

The purpose of this study is to investigate the effects of predictor variable correlations and patterns of missingness with dichotomous and/or continuous data in small samples when missing data are multiply imputed. Missing predictor data are multiply imputed under three different multivariate models: the multivariate normal model for continuous data, the multinomial model for dichotomous data, and the general location model for mixed dichotomous and continuous data. Following the multiple imputation process, Type I error rates of the regression coefficients obtained with logistic regression analysis are estimated under various conditions of correlation structure, sample size, type of data, and pattern of missing data. The distributional properties of the average mean, variance, and correlations among the predictor variables are assessed after the multiple imputation process. For continuous predictor data under the multivariate normal model, Type I error rates are generally within the nominal values with samples of size n = 100. Smaller samples of size n = 50 resulted in more conservative estimates (i.e., lower than the nominal value). Correlation and variance estimates of the original data are retained after multiple imputation with less than 50% missing continuous predictor data. For dichotomous predictor data under the multinomial model, Type I error rates are generally conservative, which in part is due to the sparseness of the data. The correlation structure of the predictor variables is not well retained in multiply-imputed data from small samples with more than 50% missing data under this model. For mixed continuous and dichotomous predictor data, the results are similar to those found under the multivariate normal model for continuous data and under the multinomial model for dichotomous data. With all data types, a fully observed variable included with the variables subject to missingness in the multiple imputation process and subsequent statistical analysis produced liberal (larger than nominal) Type I error rates under a specific pattern of missing data. It is suggested that future studies focus on the effects of multiple imputation in multivariate settings with more realistic data characteristics and a variety of multivariate analyses, assessing both Type I error and power.
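The pooling step that follows multiple imputation can be sketched as below. This minimal example uses scikit-learn's IterativeImputer with posterior sampling as a chained-equations stand-in for the joint multivariate normal model studied above (the study's own imputation models are not reproduced here), then combines logistic-regression coefficients across imputations with Rubin's rules; all variable names and the missingness mechanism are illustrative.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

def pool_rubin(estimates, variances):
    """Combine per-imputation estimates and variances with Rubin's rules."""
    estimates, variances = np.asarray(estimates), np.asarray(variances)
    m = estimates.shape[0]
    qbar = estimates.mean(axis=0)              # pooled point estimate
    within = variances.mean(axis=0)            # average within-imputation variance
    between = estimates.var(axis=0, ddof=1)    # between-imputation variance
    total = within + (1 + 1 / m) * between
    return qbar, total

rng = np.random.default_rng(1)
n, m = 100, 20
X = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=n)
y = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
X_miss = X.copy()
X_miss[rng.random(n) < 0.3, 1] = np.nan        # ~30% missing in one predictor

coefs, varis = [], []
for i in range(m):
    imputer = IterativeImputer(sample_posterior=True, random_state=i)
    X_imp = imputer.fit_transform(X_miss)
    fit = sm.Logit(y, sm.add_constant(X_imp)).fit(disp=False)
    coefs.append(fit.params)
    varis.append(np.diag(fit.cov_params()))

print(pool_rubin(coefs, varis))
```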

Relevance: 80.00%

Abstract:

Trauma and severe head injuries are important issues because they are prevalent, because they occur predominantly in the young, and because variations in clinical management may matter. Trauma is the leading cause of death for those under age 40. The focus of this head injury study is to determine whether variations in time from the scene of the accident to a trauma center hospital make a difference in patient outcomes. A trauma registry is maintained in the Houston-Galveston area and includes all patients admitted to any one of three trauma center hospitals with mild or severe head injuries. A study cohort derived from the Registry includes 254 severe head injury cases from 1980 with a Glasgow Coma Score of 8 or less. Multiple influences relate to patient outcomes from severe head injury. Two primary variables and four confounding variables are identified: time to emergency room, time to intubation, patient age, severity of injury, type of injury, and mode of transport to the emergency room. Regression analysis, analysis of variance, and chi-square analysis were the principal statistical methods used. The analysis indicates that within an urban setting, with a four-hour time span, variations in time to emergency room do not strongly influence or predict patient outcome. However, the data suggest that longer time periods have a negative influence on outcomes. Age is influential only when the older group (55-64) is included. Mode of transport (helicopter or ambulance) did not show any significant difference in outcome. In a multivariate regression model, outcomes are influenced primarily by severity of injury and age, which explain 36% (R²) of the variance. Adding time to emergency room, time to intubation, transport mode, and type of injury contributes only an additional 4% (R²) to the explained variation in patient outcome. The research concludes that since the group most at risk of head trauma is the young adult male involved in automobile/motorcycle accidents, more may be gained by modifying driving habits and other preventive measures. Continued clinical and evaluative research is required to provide updated clinical wisdom in patient management and trauma treatment protocols. A National Institute of Trauma may be required to develop a national public policy and evaluate the many medical, behavioral, and social changes required to cope with the country's number 3 killer and the primary killer of young adults.
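The incremental contribution of the timing variables can be sketched as a comparison of nested linear models, as below; the column names (severity, age, time_to_er, and so on) and the simulated data are illustrative stand-ins for the Registry variables, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 254
df = pd.DataFrame({
    "severity": rng.normal(size=n),                 # injury severity (illustrative)
    "age": rng.normal(35, 12, size=n),
    "time_to_er": rng.normal(45, 15, size=n),       # minutes, illustrative
    "time_to_intubation": rng.normal(60, 20, size=n),
})
df["outcome"] = (0.6 * df.severity + 0.02 * df.age
                 + 0.002 * df.time_to_er + rng.normal(size=n))

base = smf.ols("outcome ~ severity + age", data=df).fit()
full = smf.ols("outcome ~ severity + age + time_to_er + time_to_intubation",
               data=df).fit()

# How much additional variance do the timing variables explain?
print(f"base R^2 = {base.rsquared:.3f}, full R^2 = {full.rsquared:.3f}, "
      f"increment = {full.rsquared - base.rsquared:.3f}")
```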

Relevance: 80.00%

Abstract:

Complex diseases such as cancer result from multiple genetic changes and environmental exposures. Due to the rapid development of genotyping and sequencing technologies, we are now able to more accurately assess the causal effects of many genetic and environmental factors. Genome-wide association studies have localized many causal genetic variants predisposing to certain diseases. However, these studies explain only a small portion of the heritability of these diseases. More advanced statistical models are urgently needed to identify and characterize additional genetic and environmental factors and their interactions, which will enable us to better understand the causes of complex diseases. In the past decade, thanks to increasing computational capability and novel statistical developments, Bayesian methods have been widely applied in genetics/genomics research and have demonstrated advantages over standard approaches in certain research areas. Gene-environment and gene-gene interaction studies are among the areas where Bayesian methods can fully exert their advantages. This dissertation focuses on developing new Bayesian statistical methods for data analysis with complex gene-environment and gene-gene interactions, as well as extending some existing methods for gene-environment interactions to other related areas. It includes three sections: (1) deriving the Bayesian variable selection framework for hierarchical gene-environment and gene-gene interactions; (2) developing Bayesian Natural and Orthogonal Interaction (NOIA) models for gene-environment interactions; and (3) extending the applications of two Bayesian statistical methods, which were developed for gene-environment interaction studies, to other related types of studies such as adaptive borrowing of historical data. We propose a Bayesian hierarchical mixture model framework that allows us to investigate genetic and environmental effects, gene-gene interactions (epistasis), and gene-environment interactions in the same model. It is well known that, in many practical situations, there exists a natural hierarchical structure between the main effects and interactions in the linear model. Here we propose a model that incorporates this hierarchical structure into the Bayesian mixture model, such that irrelevant interaction effects can be removed more efficiently, resulting in more robust, parsimonious, and powerful models. We evaluate both the 'strong hierarchical' and 'weak hierarchical' models, which specify that both, or at least one, of the main effects of interacting factors must be present for the interaction to be included in the model. Extensive simulation results show that the proposed strong and weak hierarchical mixture models control the proportion of false positive discoveries and yield a powerful approach for identifying the predisposing main effects and interactions in studies with complex gene-environment and gene-gene interactions. We also compare these two models with the 'independent' model that does not impose this hierarchical constraint and observe their superior performance in most of the situations considered. The proposed models are applied to real data analyses of gene-environment interactions in lung cancer and cutaneous melanoma case-control studies. Bayesian statistical models also have the property of allowing useful prior information to be incorporated into the modeling process. Moreover, the Bayesian mixture model outperforms the multivariate logistic model in terms of parameter estimation and variable selection in most cases. Our proposed models impose the hierarchical constraints, which further improve the Bayesian mixture model by reducing the proportion of false positive findings among the identified interactions while successfully identifying the reported associations. This is practically appealing for studies investigating causal factors among a moderate number of candidate genetic and environmental factors along with a relatively large number of interactions. The Natural and Orthogonal Interaction (NOIA) models of genetic effects were previously developed to provide an analysis framework in which the estimates of effects for a quantitative trait are statistically orthogonal regardless of whether Hardy-Weinberg Equilibrium (HWE) holds within loci. Ma et al. (2012) recently developed a NOIA model for gene-environment interaction studies and showed the advantages of using this model for detecting true main effects and interactions, compared with the usual functional model. In this project, we propose a novel Bayesian statistical model that combines the Bayesian hierarchical mixture model with the NOIA statistical model and the usual functional model. The proposed Bayesian NOIA model demonstrates more power at detecting non-null effects, with higher marginal posterior probabilities. We also review two Bayesian statistical models (a Bayesian empirical shrinkage-type estimator and Bayesian model averaging) that were developed for gene-environment interaction studies. Inspired by these Bayesian models, we develop two novel statistical methods that can handle related problems such as borrowing data from historical studies. The proposed methods are analogous to the gene-environment interaction methods in that they balance statistical efficiency and bias within a unified model. Through extensive simulation studies, we compare the operating characteristics of the proposed models with existing models, including the hierarchical meta-analysis model. The results show that the proposed approaches adaptively borrow the historical data in a data-driven way. These novel models may have a broad range of statistical applications in both genetic/genomic and clinical studies.
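The strong and weak hierarchy constraints can be expressed compactly in terms of inclusion indicators. The sketch below illustrates the constraint logic only, not the authors' Bayesian mixture sampler; it shows which candidate interaction terms remain admissible given a set of selected main effects under each rule, with hypothetical factor names.

```python
from itertools import combinations

def admissible_interactions(selected_main, candidates, rule="strong"):
    """Filter candidate pairwise interactions by a hierarchy rule.

    strong: both parent main effects must be selected.
    weak:   at least one parent main effect must be selected.
    """
    keep = []
    for a, b in candidates:
        in_a, in_b = a in selected_main, b in selected_main
        if (rule == "strong" and in_a and in_b) or (rule == "weak" and (in_a or in_b)):
            keep.append((a, b))
    return keep

factors = ["G1", "G2", "G3", "E1"]            # genetic and environmental factors
pairs = list(combinations(factors, 2))        # all candidate pairwise interactions
selected = {"G1", "E1"}                       # main effects currently in the model

print(admissible_interactions(selected, pairs, rule="strong"))  # [('G1', 'E1')]
print(admissible_interactions(selected, pairs, rule="weak"))    # pairs touching G1 or E1
```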

Relevance: 40.00%

Abstract:

Experience with anidulafungin against Candida krusei is limited. Immunosuppressed mice were injected with 1.3 × 10^7 to 1.5 × 10^7 CFU of C. krusei. Animals were treated with saline, 40 mg/kg fluconazole, 1 mg/kg amphotericin B, or 10 and 20 mg/kg anidulafungin for 5 days. Anidulafungin improved survival and significantly reduced the number of CFU/g in the kidneys and serum beta-glucan levels.

Relevance: 40.00%

Abstract:

In regression analysis, covariate measurement error occurs in many applications. The error-prone covariates are often referred to as latent variables. In this study, we extended the work of Chan et al. (2008) on recovering the latent slope in a simple regression model to the multiple regression setting. We present an approach that applies the Monte Carlo method in a Bayesian framework to a parametric regression model with measurement error in an explanatory variable. The proposed estimator uses the conditional expectation of the latent slope given the observed outcome and surrogate variables in the multiple regression model. A simulation study shows that the method produces an estimator that is efficient in the multiple regression model, especially when the measurement error variance of the surrogate variable is large.
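The effect of covariate measurement error, and the idea of correcting it through a conditional expectation of the latent variable, can be illustrated with a small simulation. The sketch below shows naive attenuation and a simple regression-calibration-style correction under the classical error model W = X + U; it is not the authors' Bayesian Monte Carlo estimator, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
beta = (1.0, 2.0, -1.5)                      # intercept and slopes for (X, Z)

x = rng.normal(size=n)                       # latent (error-free) covariate
z = rng.normal(size=n)                       # perfectly observed covariate
sigma_u = 1.0                                # large measurement error SD
w = x + rng.normal(scale=sigma_u, size=n)    # surrogate: classical error model
y = beta[0] + beta[1] * x + beta[2] * z + rng.normal(scale=0.5, size=n)

def ols(y, cols):
    X = np.column_stack([np.ones(len(y))] + cols)
    return np.linalg.lstsq(X, y, rcond=None)[0]

naive = ols(y, [w, z])                       # slope on W is attenuated toward zero

# Regression-calibration style fix: replace W with E[X | W] = lambda * W,
# where lambda = var(X) / (var(X) + var(U)) for independent normal X and U.
lam = np.var(x) / (np.var(x) + sigma_u**2)   # known here; estimated in practice
corrected = ols(y, [lam * w, z])

print("naive slope:", naive[1], "corrected slope:", corrected[1], "truth:", beta[1])
```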

Relevance: 30.00%

Abstract:

Ionotropic glutamate receptors are important excitatory neurotransmitter receptors in the mammalian central nervous system that have been implicated in a number of neuropathologies such as epilepsy, ischemia, and amyotrophic lateral sclerosis. Glutamate binding to an extracellular ligand binding domain initiates a series of structural changes that leads to the formation of a cation-selective transmembrane channel, which subsequently closes as the receptor desensitizes. The crystal structures of the AMPA subtype of the glutamate receptor have been particularly useful in providing initial insight into the conformational changes in the ligand binding domain; however, these structures are limited by crystallographic constraints. To gain a clear picture of how agonist binding is coupled to channel activation and desensitization, it is essential to study changes in the ligand binding domain in a dynamic, physiological state. In this dissertation, a technique called luminescence resonance energy transfer (LRET) was used to determine the conformational changes associated with activation and desensitization in a functional AMPA receptor (ΔN*-AMPA) that contains the ligand binding domain and transmembrane segments; ΔN*-AMPA has been modified such that fluorophores can be introduced at specific sites to serve as a readout of cleft closure or to establish intersubunit distances. Previous structural studies of cleft closure in the isolated ligand binding domain, in conjunction with functional studies of the full receptor, suggest that the extent of cleft closure correlates with the extent of activation. Here, LRET was used to show that a similar relationship between cleft closure and activation holds in the "full-length" receptor, indicating that the isolated ligand binding domain is a good model of the domain in the full-length receptor for changes within a subunit. Similar LRET investigations were used to measure intersubunit distances, specifically to probe conformational changes between subunits within a dimer of the tetrameric receptor. These studies show that the dimer interface is coupled in the open state and decoupled in the desensitized state, similar to the isolated ligand binding domain crystal structures. However, we show that the apo-state dimer interface is not pre-formed as in the crystal structure, suggesting a mechanism for functional transitions within the receptor based on the LRET distances obtained.
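Distances in LRET follow the standard Förster relation between energy-transfer efficiency and donor-acceptor separation. The sketch below is a generic illustration of that relation with an arbitrary R0; it is not the specific calibration or probe pair used in this work.

```python
def lret_distance(efficiency, r0_angstrom):
    """Donor-acceptor distance from energy transfer efficiency.

    Förster relation: E = 1 / (1 + (R / R0)^6)  =>  R = R0 * (1/E - 1)^(1/6).
    """
    return r0_angstrom * (1.0 / efficiency - 1.0) ** (1.0 / 6.0)

# Example: 60% transfer efficiency with an (illustrative) R0 of 40 Å.
print(f"{lret_distance(0.60, 40.0):.1f} Å")   # ~37.4 Å
```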

Relevance: 30.00%

Abstract:

A historical prospective study was designed to assess the mean weight status of subjects who participated in a behavioral weight reduction program in 1983 and to determine whether there was an association between the dependent variable, weight change, and any of 31 independent variables after a 2-year follow-up period. Data were obtained by abstracting the subjects' records and from a follow-up questionnaire administered 2 years following program participation. Five hundred nine (386 female and 123 male) of the 1,460 subjects who participated in the program completed and returned the questionnaire. Results showed that mean weight was significantly different (p < 0.001) between the baseline measurement and the 2-year follow-up. The mean weight loss of the group was 5.8 pounds: 10.7 pounds for males and 4.2 pounds for females. A total of 63.9% of the group (69.9% of males and 61.9% of females) were still below their initial weight after the 2-year follow-up period. Sixteen of the 31 variables assessed with bivariate analyses were found to be significantly (p ≤ 0.05) associated with weight change after the 2-year follow-up period. These variables were then entered into a multivariate linear regression model. A total of 37.9% of the variance of the dependent variable, weight change, was accounted for by all 16 variables. Eight of these variables were found to be significantly (p ≤ 0.05) predictive of weight change in the stepwise multivariate process, accounting for 37.1% of the variance. These variables included two baseline variables (percent over ideal body weight at enrollment and occupation) and six follow-up variables (feeling in control of eating habits, percent of body weight lost during treatment, frequency of weight measurement, physical activity, eating in response to emotions, and number of pounds of weight gain needed to resume a diet). It was concluded that greater emphasis should be placed on the six follow-up variables by clinicians involved in the treatment of obesity, and by the subjects themselves, to enhance their chances of success at long-term weight loss.
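A minimal sketch of the stepwise multivariate step described above: forward selection of predictors of weight change by p-value using statsmodels OLS. The predictor names and simulated data are hypothetical stand-ins for the study's 16 candidate variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_select(df, outcome, candidates, alpha=0.05):
    """Greedy forward selection: add the predictor with the smallest p-value < alpha."""
    selected = []
    while True:
        pvals = {}
        for var in candidates:
            if var in selected:
                continue
            X = sm.add_constant(df[selected + [var]])
            pvals[var] = sm.OLS(df[outcome], X).fit().pvalues[var]
        best = min(pvals, key=pvals.get) if pvals else None
        if best is None or pvals[best] > alpha:
            return selected
        selected.append(best)

# Hypothetical follow-up data for illustration only.
rng = np.random.default_rng(5)
n = 509
df = pd.DataFrame(rng.normal(size=(n, 4)),
                  columns=["pct_over_ideal_wt", "control_of_eating",
                           "weighing_frequency", "physical_activity"])
df["weight_change"] = (-2.0 * df.control_of_eating - 1.0 * df.physical_activity
                       + rng.normal(scale=2.0, size=n))

print(forward_select(df, "weight_change", list(df.columns[:-1])))
```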

Relevance: 30.00%

Abstract:

Clinical oncologists and cancer researchers benefit from information on the vascularization or non-vascularization of solid tumors because of blood flow's influence on three common treatment types: hyperthermia therapy, radiotherapy, and chemotherapy. The objective of this research is the development of a clinically useful tumor blood flow measurement technique. The designed technique is sensitive, has good spatial resolution, is non-invasive, and presents no risk to the patient beyond his usual treatment (measurements are made only subsequent to normal patient treatment). Tumor blood flow was determined by measuring the washout of positron-emitting isotopes created through neutron therapy treatment. In order to do this, several technical and scientific questions were addressed first: (1) What isotopes are created in tumor tissue when it is irradiated in a neutron therapy beam, and how much of each isotope is expected? (2) What are the chemical states of the isotopes that are potentially useful for blood flow measurements, and will those chemical states allow these or other isotopes to be washed out of the tumor? (3) How should isotope washout by blood flow be modeled in order to use the data most effectively? These questions were answered through both theoretical calculation and measurement. The first question was answered through the measurement of macroscopic cross sections for the predominant nuclear reactions in the body. These results correlate well with an independent mathematical prediction of tissue activation and with measurements of mouse spleen neutron activation. The second question was addressed by performing cell suspension and protein precipitation techniques on neutron-activated mouse spleens. The third and final question was answered by using first physical principles to develop a model mimicking the blood flow system and measurement technique. In a final set of experiments, the above were applied to flow models and animals. The ultimate aim of this project is to apply its methodology to neutron therapy patients.
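The washout-modeling question (3) can be sketched as a single-compartment model in which the measured activity decays with both the physical decay constant of the isotope and a biological washout constant driven by blood flow. The example below, with illustrative parameter values rather than the dissertation's fitted model, fits that model to simulated count data with SciPy.

```python
import numpy as np
from scipy.optimize import curve_fit

# Physical decay constant for a positron emitter (illustrative: ~10 min half-life).
LAMBDA_PHYS = np.log(2) / 10.0          # per minute

def activity(t, a0, lambda_bio):
    """Single-compartment washout: physical decay plus blood-flow-driven clearance."""
    return a0 * np.exp(-(LAMBDA_PHYS + lambda_bio) * t)

# Simulated post-irradiation counts from a tumor region.
rng = np.random.default_rng(11)
t = np.linspace(0, 30, 31)              # minutes after treatment
true_a0, true_bio = 1000.0, 0.05        # washout constant proportional to perfusion
counts = activity(t, true_a0, true_bio) * rng.normal(1.0, 0.03, size=t.size)

popt, pcov = curve_fit(activity, t, counts, p0=(800.0, 0.01))
print(f"estimated biological washout constant: {popt[1]:.3f} /min (true {true_bio})")
```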

Relevance: 30.00%

Abstract:

Arterial spin labeling (ASL) is a technique for noninvasively measuring cerebral perfusion using magnetic resonance imaging. Clinical applications of ASL include functional activation studies, evaluation of the effect of pharmaceuticals on perfusion, and assessment of cerebrovascular disease, stroke, and brain tumors. The use of ASL in the clinic has been limited by poor image quality when large anatomic coverage is required and by the time required for data acquisition and processing. This research sought to address these difficulties by optimizing the ASL acquisition and processing schemes. To improve data acquisition, optimal acquisition parameters were determined through simulations, phantom studies, and in vivo measurements. The scan time for ASL data acquisition was limited to fifteen minutes to reduce potential subject motion. A processing scheme was implemented that rapidly produced regional cerebral blood flow (rCBF) maps with minimal user input. To provide a measure of the precision of the rCBF values produced by ASL, bootstrap analysis was performed on a representative data set. The bootstrap analysis of single gray and white matter voxels yielded coefficients of variation of 6.7% and 29%, respectively, implying that the calculated rCBF value is far more precise for gray matter than for white matter. Additionally, bootstrap analysis was performed to investigate the sensitivity of the rCBF data to the input parameters and to provide a quantitative comparison of several existing perfusion models. This study guided the selection of the optimal perfusion quantification model for further experiments. The optimized ASL acquisition and processing schemes were evaluated with two ASL acquisitions on each of five normal subjects. The gray-to-white matter rCBF ratios for nine of the ten acquisitions were within ±10% of 2.6, and none were statistically different from 2.6, the typical ratio produced by a variety of quantitative perfusion techniques. Overall, this work produced an ASL data acquisition and processing technique for quantitative perfusion and functional activation studies, while revealing the limitations of the technique through bootstrap analysis.
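The bootstrap precision estimate reported above can be sketched as follows: resample repeated voxel measurements with replacement, recompute the mean rCBF value each time, and summarize the spread as a coefficient of variation. The data here are simulated placeholders, not the study's ASL measurements.

```python
import numpy as np

def bootstrap_cv(samples, n_boot=2000, rng=None):
    """Coefficient of variation of the mean under nonparametric bootstrap resampling."""
    rng = rng or np.random.default_rng()
    samples = np.asarray(samples)
    boot_means = np.array([
        rng.choice(samples, size=samples.size, replace=True).mean()
        for _ in range(n_boot)
    ])
    return boot_means.std(ddof=1) / boot_means.mean()

rng = np.random.default_rng(2)
gray_voxel = rng.normal(60, 8, size=40)    # repeated rCBF measurements, mL/100 g/min
white_voxel = rng.normal(22, 9, size=40)

print(f"gray-matter CV:  {100 * bootstrap_cv(gray_voxel, rng=rng):.1f}%")
print(f"white-matter CV: {100 * bootstrap_cv(white_voxel, rng=rng):.1f}%")
```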

Relevance: 30.00%

Abstract:

With substance abuse treatment expanding in prisons and jails, understanding how behavior change interacts with a restricted setting becomes more essential. The Transtheoretical Model (TTM) has been used to understand intentional behavior change in unrestricted settings; however, evidence indicates that restricted settings can affect the measurement and structure of the TTM constructs. The present study examined data from problem drinkers at baseline and end of treatment from three studies: (1) Project CARE (n = 187), which recruited inmates from a large county jail; (2) Project Check-In (n = 116), which recruited inmates from a state prison; and (3) Project MATCH, a large multi-site alcohol study with two recruitment arms, aftercare (n = 724 pre-treatment and 650 post-treatment) and outpatient (n = 912 pre-treatment and 844 post-treatment). The analyses used cross-sectional data and structural equation modeling (SEM) to test for non-invariance of measures of the TTM constructs (readiness, confidence, temptation, and processes of change) across restricted and unrestricted settings. Two restricted groups (jail and aftercare) and one unrestricted group (outpatient) entering treatment, and one restricted group (prison) and two unrestricted groups (aftercare and outpatient) at end of treatment, were contrasted. In addition, TTM end-of-treatment profiles were tested as predictors of 12-month drinking outcomes (profile analysis). Although SEM did not indicate structural differences in the overall TTM construct model across setting types, there were factor structure differences in the confidence and temptation constructs at pre-treatment and in the factor structure of the behavioral processes at end of treatment. For pre-treatment temptation and confidence, differences were found in the social-situations factor loadings and in the variance of the confidence and temptation latent factors. For the end-of-treatment behavioral processes, differences across the restricted and unrestricted settings were identified in the counter-conditioning and stimulus control factor loadings. The TTM end-of-treatment profiles were not predictive of drinking outcomes in the prison sample. Both the pre- and post-treatment differences in structure across setting types involved constructs operationalized with behaviors that are limited for those in restricted settings. These studies suggest that the TTM is a viable model for explicating addictive behavior change in restricted settings, but they call for modification of subscale items that refer to specific behaviors and for caution in interpreting mean differences across setting types for problem drinkers.
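A minimal sketch of a latent-variable measurement model of the kind tested here, assuming the semopy package and hypothetical item names; a full invariance test (e.g., constraining loadings equal across jail, prison, aftercare, and outpatient groups) would build on the same specification.

```python
import numpy as np
import pandas as pd
import semopy

# Hypothetical items measuring two TTM constructs (lavaan-style syntax).
MODEL_DESC = """
confidence =~ conf1 + conf2 + conf3
temptation =~ temp1 + temp2 + temp3
confidence ~~ temptation
"""

def simulate_group(n, rng):
    """Simulate item responses driven by two correlated latent factors."""
    f = rng.multivariate_normal([0, 0], [[1, -0.4], [-0.4, 1]], size=n)
    items = {}
    for j in range(3):
        items[f"conf{j + 1}"] = 0.8 * f[:, 0] + rng.normal(scale=0.6, size=n)
        items[f"temp{j + 1}"] = 0.8 * f[:, 1] + rng.normal(scale=0.6, size=n)
    return pd.DataFrame(items)

rng = np.random.default_rng(8)
for label, n in [("restricted", 187), ("unrestricted", 912)]:
    model = semopy.Model(MODEL_DESC)
    model.fit(simulate_group(n, rng))
    # Comparing loadings (model.inspect()) across groups is a first check
    # for measurement non-invariance.
    print(label)
    print(semopy.calc_stats(model))
```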

Relevance: 30.00%

Abstract:

The airliner cabin environment and its effects on occupant health have not been fully characterized. This dissertation comprises: (1) A review of airliner environmental control systems (ECSs) that modulate the ventilation, temperature, relative humidity (RH), and barometric pressure (PB) of the cabin environment, variables related to occupant comfort and health. (2) A review and assessment of the methods and findings of key cabin air quality (CAQ) investigations. Several significant deficiencies impede the drawing of inferences about CAQ, e.g., lack of detail about investigative methods, differences in methods between investigations, limited assessment of CAQ variables, small sample sizes, and technological deficiencies in data collection. (3) A comprehensive evaluation of the methods used in the subsequent NIOSH-FAA Airliner CAQ Exposure Assessment Feasibility Study (STUDY), in which this author participated. A number of problems were identified that limit the usefulness of the data. (4) An analysis of the reliable 10-flight STUDY data. Univariate and multivariate methods applied to CO2 (a surrogate for air contaminants), temperature, RH, and PB, in association with percent passenger load, ventilation system, flight duration, airliner body type, and measurement location within the cabin, revealed that neither the measured values nor their variability exceeded established health-based exposure limits. Regression analyses suggest that CO2, temperature, and RH were affected by percent passenger load. In-flight measurements of CO2 and RH were relatively independent of ventilation system type and flight duration. Cabin temperature was associated with percent passenger load, ventilation system type, and flight duration. (5) A synthesis of the implications of the airliner ECS and cabin O2 environment for occupant health. A model was developed to predict the consequences of the 8,000-ft cabin pressure altitude limit, and the resulting model-estimated PO2, for cardiopulmonary status. Based on the PB, altitude, and environmental data derived from the 10 STUDY flights, the predicted PaO2 of adults with COPD, or of elderly adults with or without COPD, breathing ambient cabin air could be < 55 mm Hg (SaO2 < 88%). The reduction in cabin PB found in the STUDY flights could aggravate various medical conditions and require the use of in-flight supplemental O2.
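The physiological model in part (5) rests on standard relations that can be sketched numerically: barometric pressure at the 8,000-ft cabin altitude limit from the standard-atmosphere formula, and alveolar O2 tension from the water-vapor-corrected alveolar gas equation. The PaCO2 and respiratory quotient values below are typical textbook assumptions, not the dissertation's fitted model parameters.

```python
MMHG_PER_PA = 760.0 / 101325.0

def cabin_pressure_mmhg(altitude_ft):
    """Barometric pressure from the ICAO standard-atmosphere formula."""
    h_m = altitude_ft * 0.3048
    p_pa = 101325.0 * (1.0 - 2.25577e-5 * h_m) ** 5.25588
    return p_pa * MMHG_PER_PA

def alveolar_po2(pb_mmhg, fio2=0.2095, paco2=40.0, rq=0.8):
    """Alveolar gas equation: PAO2 = FiO2 * (PB - 47) - PaCO2 / RQ."""
    return fio2 * (pb_mmhg - 47.0) - paco2 / rq

pb = cabin_pressure_mmhg(8000)              # ~565 mm Hg at the regulatory cabin limit
print(f"cabin PB ≈ {pb:.0f} mm Hg")
print(f"PAO2     ≈ {alveolar_po2(pb):.0f} mm Hg")          # healthy-adult assumptions
print(f"(sea-level PAO2 ≈ {alveolar_po2(760.0):.0f} mm Hg)")
```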

Relevance: 30.00%

Abstract:

Background. Retail clinics, also called convenience care clinics, have become a rapidly growing trend since their initial development in 2000. These clinics are housed within a larger retail operation and are generally located in "big-box" discount stores such as Wal-mart or Target, grocery stores such as Publix or H-E-B, or retail pharmacies such as CVS or Walgreen's (Deloitte Center for Health Solutions, 2008). Care is typically provided by nurse practitioners (NPs). Research indicates that this new health care delivery system reduces cost, raises quality, and provides a means of access for the uninsured population (e.g., Deloitte Center for Health Solutions, 2008; Convenient Care Association, 2008a, 2008b, 2008c; Hansen-Turton, Miller, Nash, Ryan, & Counts, 2007; Salinsky, 2009; Scott, 2006; Ahmed & Fincham, 2010). Some healthcare analysts even suggest that retail clinics offer a feasible solution to the shortage of primary care physicians facing the nation (AHRQ Health Care Innovations Exchange, 2010). The development and performance of retail clinics is heavily dependent upon individual state policies regulating NPs. Texas currently has one of the most highly regulated practice environments for NPs (Stout & Elton, 2007; Hammonds, 2008). In September 2009, Texas passed Senate Bill 532 (SB 532), addressing the scope of practice of nurse practitioners in the convenience care model. In comparison with other states, this law still heavily regulates nurse practitioners. However, little research has been conducted to evaluate the impact of state laws regulating nurse practitioners on the development and performance of retail clinics. Objectives. (1) To describe the potential impact that SB 532 has on retail clinic performance. (2) To discuss the effectiveness, efficiency, and equity of the convenience care model. (3) To describe possible alternatives to Texas' nurse practitioner scope-of-practice guidelines as delineated in Texas Senate Bill 532. (4) To describe the type of nurse practitioner state regulation (i.e., independent, light, moderate, or heavy) that best promotes the convenience care model. Methods. State regulations governing nurse practitioners can be characterized as independent, light, moderate, or heavy. Four state NP regulatory types and retail clinic performance were compared and contrasted with Texas regulations using Dunn and Aday's theoretical models for conducting policy analysis and evaluating healthcare systems. Criteria for measurement included effectiveness, efficiency, and equity. The comparison states were Arizona (independent), Minnesota (light), Massachusetts (moderate), and Florida (heavy). Results. A comparative-states analysis of Texas SB 532 and alternative NP scope-of-practice guidelines among the four comparison states (Arizona, Florida, Massachusetts, and Minnesota) indicated that SB 532 has minimal potential to affect the shortage of primary care providers in the state. Although SB 532 may increase the number of NPs a physician may supervise, NPs are still heavily restricted in their scope of practice and limited in their ability to act as primary care providers. Arizona's example of independent NP practice provided the best alternative for addressing the shortage of PCPs in Texas, as evidenced by a lower uninsured rate and fewer ED visits per 1,000 population. A survey of comparison states suggests that retail clinics thrive in states that more heavily restrict NP scope of practice as opposed to those that are more permissive, with the exception of Arizona. An analysis of the effectiveness, efficiency, and equity of the convenience care model indicates that retail clinics perform well in the areas of effectiveness and efficiency but fall short in the area of equity. Conclusion. Texas Senate Bill 532 represents an incremental step toward addressing the shortage of PCPs in the state. A comparative policy analysis of the other four states, with their varying degrees of NP scope of practice, indicates that a more aggressive policy allowing for independent NP practice will be needed to achieve positive changes in health outcomes. Retail clinics pose a temporary solution to the shortage of PCPs and will need to expand their locations to poorer regions and incorporate some chronic care to achieve measurable health outcomes.

Relevance: 30.00%

Abstract:

The role of clinical chemistry has traditionally been to evaluate acutely ill or hospitalized patients. Traditional statistical methods have serious drawbacks in that they use univariate techniques. To demonstrate an alternative methodology, a multivariate analysis of covariance model was developed and applied to data from the Cooperative Study of Sickle Cell Disease (CSSCD). The purpose of developing the model for the laboratory data from the CSSCD was to evaluate the comparability of the results from the different clinics. Several variables were incorporated into the model in order to control for possible differences among the clinics that might confound any real laboratory differences. Differences for LDH, alkaline phosphatase, and SGOT were identified that will necessitate adjustment by clinic whenever these data are used. In addition, aberrant clinic values for LDH, creatinine, and BUN were also identified. The use of any statistical technique, including multivariate analysis, without thoughtful consideration may lead to spurious conclusions that may not be corrected for some time, if ever. However, the advantages of multivariate analysis far outweigh its potential problems. If its use increases as it should, its applicability to the analysis of laboratory data in prospective patient monitoring, quality control programs, and the interpretation of data from cooperative studies could well have a major impact on the health and well-being of a large number of individuals.
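A minimal sketch of a multivariate analysis of laboratory analytes on clinic, adjusting for a covariate, using statsmodels' MANOVA interface; the column names, clinic labels, and simulated values are hypothetical, not the CSSCD data or its full covariance model.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical laboratory data from three clinics (illustrative values only).
rng = np.random.default_rng(9)
n = 300
clinic = rng.choice(["A", "B", "C"], size=n)
age = rng.uniform(5, 40, size=n)
shift = {"A": 0.0, "B": 15.0, "C": -10.0}          # clinic-level LDH shift
df = pd.DataFrame({
    "clinic": clinic,
    "age": age,
    "ldh": 350 + np.vectorize(shift.get)(clinic) + 2 * age + rng.normal(0, 40, n),
    "alk_phos": 120 + rng.normal(0, 25, n),
    "sgot": 35 + rng.normal(0, 8, n),
})

# Multivariate test of the clinic effect on the three analytes, adjusting for age.
mv = MANOVA.from_formula("ldh + alk_phos + sgot ~ clinic + age", data=df)
print(mv.mv_test())
```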