884 results for Problem analysis


Relevance: 30.00%

Abstract:

BACKGROUND Proper diagnosis of skin diseases relies on dermatopathology, the most important diagnostic technique in dermatology. Unfortunately, there are few dermatopathology institutions in sub-Saharan Africa, where little is known about the spectrum of histopathological features observed. OBJECTIVES To investigate the spectrum of dermatopathological diagnoses made in a sub-Saharan African reference centre serving a large, mainly rural area. PATIENTS/METHODS All dermatopathological diagnoses made over a period of 5 years at the Regional Dermatology Training Centre (RDTC) in Moshi, Tanzania, were retrospectively evaluated. RESULTS There were a total of 1554 skin biopsy specimens. Inflammatory diseases accounted for 45% of cases, most frequently lichenoid conditions. Cutaneous neoplasms represented 30.4% of all diagnoses, with Kaposi's sarcoma (KS) and, less frequently, squamous cell carcinoma (SCC) being the two most common neoplastic conditions. The latter also reflects the intensive management of persons with albinism at the RDTC. The distribution of histological diagnoses appeared to correlate with the overall clinical spectrum of cutaneous diseases managed at the RDTC. CONCLUSIONS In this African study, inflammatory conditions were the main burden of skin diseases leading to a diagnostic biopsy. Our findings provide further evidence that KS, primarily related to the high prevalence of HIV infection, is an epidemiological problem. SCC and basal cell carcinoma represent other relatively common malignant cutaneous neoplasms, reflecting the presence of specific populations at risk. The challenging spectrum of histological diagnoses observed in this specific African setting with basic working conditions shows that the development of good-standard laboratory services and specific training in dermatopathology are urgently needed.

Relevance: 30.00%

Abstract:

Background context Studies involving factor analysis (FA) of the items in the North American Spine Society (NASS) outcome assessment instrument have revealed inconsistent factor structures for the individual items. Purpose This study examined whether the factor structure of the NASS varied in relation to the severity of the back/neck problem and differed from that originally recommended by the developers of the questionnaire, by analyzing data before and after surgery in a large series of patients undergoing lumbar or cervical disc arthroplasty. Study design/setting Prospective multicenter observational case series. Patient sample Three hundred ninety-one patients with low back pain and 553 patients with neck pain completed questionnaires preoperatively and again at 3 to 6 and 12 months follow-ups (FUs), in connection with the SWISSspine disc arthroplasty registry. Outcome measures North American Spine Society outcome assessment instrument. Methods First, an exploratory FA without a priori assumptions and subsequently a confirmatory FA were performed on the 17 items of the NASS-lumbar and 19 items of the NASS-cervical collected at each assessment time point. The item-loading invariance was tested in the German version of the questionnaire for baseline and FU. Results Both NASS-lumbar and NASS-cervical factor structures differed between baseline and postoperative data sets. The confirmatory analysis and item-loading invariance showed better fit for a three-factor (3F) structure for NASS-lumbar, containing items on “disability,” “back pain,” and “radiating pain, numbness, and weakness (leg/foot)” and for a 5F structure for NASS-cervical including disability, “neck pain,” “radiating pain and numbness (arm/hand),” “weakness (arm/hand),” and “motor deficit (legs).” Conclusions The best-fitting factor structure at both baseline and FU was selected for both the lumbar- and cervical-NASS questionnaires. 
It differed from that proposed by the originators of the NASS instruments. The NASS questionnaire represents a valid outcome measure for degenerative spine diseases and is able to distinguish among all major symptom domains (factors) in patients undergoing lumbar and cervical disc arthroplasty; nonetheless, the item structure could be improved. Any potential revision of the NASS should consider its factorial structure; factorial invariance over time should be aimed for, to allow for more precise interpretation of treatment success.
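As a hedged illustration of the exploratory first step described above, the sketch below counts eigenvalues of an item correlation matrix above 1 (the Kaiser criterion) to suggest a factor number. The two-factor synthetic data are invented stand-ins; the actual NASS items and loadings are not reproduced here.

```python
import numpy as np

# Synthetic responses generated from two latent factors (assumed
# loadings below are illustrative, not the NASS estimates).
rng = np.random.default_rng(0)
factors = rng.normal(size=(400, 2))              # two latent factors
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.75, 0.0],
                     [0.0, 0.8], [0.1, 0.7], [0.0, 0.75]])
items = factors @ loadings.T + 0.4 * rng.normal(size=(400, 6))

# Kaiser criterion: count eigenvalues of the correlation matrix above 1
eig = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))
n_factors = int((eig > 1.0).sum())               # suggests two factors here
```

A confirmatory FA would then fix this structure and test its fit (and its invariance between baseline and follow-up), which is the step the study performs.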

Relevance: 30.00%

Abstract:

Sequence analysis and optimal matching are useful heuristic tools for the descriptive analysis of heterogeneous individual pathways such as educational careers, job sequences or patterns of family formation. However, to date it remains unclear how to handle the inevitable problems caused by missing values in such analyses. Multiple Imputation (MI) offers a possible solution to this problem, but it has not been tested in the context of sequence analysis. Against this background, we contribute to the literature by assessing the potential of MI in the context of sequence analysis using an empirical example. Methodologically, we draw upon the work of Brendan Halpin and extend it to additional types of missing-value patterns. Our empirical case is a sequence analysis of panel data with substantial attrition that examines the typical patterns and the persistence of sex segregation in school-to-work transitions in Switzerland. The preliminary results indicate that MI is a valuable methodology for handling missing values due to panel mortality in the context of sequence analysis. MI is especially useful in facilitating a sound interpretation of the resulting sequence types.
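For readers unfamiliar with optimal matching, a minimal sketch of the underlying edit-distance computation is given below. The substitution and indel costs are illustrative assumptions, not the ones used in the study.

```python
def om_distance(a, b, sub_cost=2.0, indel=1.0):
    """Optimal-matching (edit) distance between two state sequences,
    computed by dynamic programming. Costs here are illustrative."""
    n, m = len(a), len(b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = 0.0 if a[i - 1] == b[j - 1] else sub_cost
            d[i][j] = min(d[i - 1][j - 1] + sub,   # substitute / match
                          d[i - 1][j] + indel,     # delete from a
                          d[i][j - 1] + indel)     # insert into a
    return d[n][m]
```

In practice the pairwise distance matrix over all (imputed) sequences is then fed into a clustering step to obtain sequence types; under MI this is repeated per imputed dataset and the resulting typologies are compared.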

Relevance: 30.00%

Abstract:

Vertebral compression fracture is a common medical problem in osteoporotic individuals. The quantitative computed tomography (QCT)-based finite element (FE) method may be used to predict vertebral strength in vivo, but needs to be validated with experimental tests. The aim of this study was to validate a nonlinear, anatomy-specific, QCT-based FE model by using a novel testing setup. Thirty-seven human thoracolumbar vertebral bone slices were prepared by removing cortical endplates and posterior elements. The slices were scanned with QCT and the volumetric bone mineral density (vBMD) was computed with the standard clinical approach. A novel experimental setup was designed to induce a realistic failure in the vertebral slices in vitro. Rotation of the loading plate was allowed by means of a ball joint. To minimize device compliance, the specimen deformation was measured directly on the loading plate with three sensors. A nonlinear FE model was generated from the calibrated QCT images, and the computed vertebral stiffness and strength were compared to those measured during the experiments. In agreement with clinical observations, most of the vertebrae underwent an anterior wedge-shaped fracture. As expected, the FE method predicted both stiffness and strength better than vBMD (R2 improved from 0.27 to 0.49 and from 0.34 to 0.79, respectively). Despite the lack of fitting parameters, the linear regression of the FE strength prediction was close to the 1:1 relation, with a slope close to one (0.86) and an intercept close to zero (0.72 kN). In conclusion, a nonlinear FE model was successfully validated through a novel experimental technique for generating wedge-shaped fractures in human thoracolumbar vertebrae.
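The vBMD-versus-FE comparison above boils down to comparing the R2 of two simple linear regressions against measured strength. A hedged sketch on synthetic data (invented numbers, not the study's measurements) is:

```python
import numpy as np

def r_squared(x, y):
    """R^2 of the least-squares line y ~ a*x + b."""
    a, b = np.polyfit(x, y, 1)
    ss_res = np.sum((y - (a * x + b)) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# 37 synthetic specimens: one noisier predictor (standing in for vBMD)
# and one tighter predictor (standing in for the FE estimate).
rng = np.random.default_rng(0)
strength = rng.uniform(2.0, 8.0, 37)              # "measured" strength (kN)
vbmd_pred = strength + rng.normal(0.0, 2.0, 37)   # weaker predictor
fe_pred = strength + rng.normal(0.0, 0.5, 37)     # tighter predictor
```

Comparing `r_squared(vbmd_pred, strength)` against `r_squared(fe_pred, strength)` reproduces the shape of the study's result: the tighter predictor explains more of the variance.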

Relevance: 30.00%

Abstract:

Recurrent wheezing or asthma is a common problem in children that has increased considerably in prevalence in the past few decades. The causes and underlying mechanisms are poorly understood, and it is thought that a number of distinct diseases causing similar symptoms are involved. Due to the lack of a biologically founded classification system, children are classified into phenotypes according to their observed disease-related features (symptoms, signs, measurements). The objectives of this PhD project were a) to develop tools for analysing phenotypic variation of a disease, and b) to examine phenotypic variability of wheezing among children by applying these tools to existing epidemiological data. A combination of graphical methods (multivariate correspondence analysis) and statistical models (latent variable models) was used. In a first phase, a model for discrete variability (latent class model) was applied to data on symptoms and measurements from an epidemiological study to identify distinct phenotypes of wheezing. In a second phase, the modelling framework was expanded to include continuous variability (e.g. along a severity gradient) and combinations of discrete and continuous variability (factor models and factor mixture models). The third phase focused on validating the methods using simulation studies. The main body of this thesis consists of 5 articles (3 published, 1 submitted and 1 to be submitted) including applications, methodological contributions and a review. The main findings and contributions were: 1) The application of a latent class model to epidemiological data (symptoms and physiological measurements) yielded plausible phenotypes of wheezing with distinguishing characteristics that have previously been used as phenotype-defining characteristics. 2) A method was proposed for including responses to conditional questions (e.g. questions on severity or triggers of wheezing that are asked only of children with wheeze) in multivariate modelling. 3) A panel of clinicians was set up to agree on a plausible model for wheezing diseases. The model can be used to generate datasets for testing the modelling approach. 4) A critical review of methods for defining and validating phenotypes of wheeze in children was conducted. 5) The simulation studies showed that a parsimonious parameterisation of the models is required to identify the true underlying structure of the data. The developed approach can deal with some challenges of real-life cohort data such as variables of mixed mode (continuous and categorical), missing data and conditional questions. If carefully applied, the approach can be used to identify whether the underlying phenotypic variation is discrete (classes), continuous (factors) or a combination of these. These methods could help improve the precision of research into causes and mechanisms and contribute to the development of a new classification of wheezing disorders in children and of other diseases that are difficult to classify.
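The latent class model used in the first phase can be sketched as an EM algorithm on binary symptom indicators. The toy implementation and synthetic two-class data below are illustrative assumptions, not the software or data used in the thesis.

```python
import numpy as np

def fit_lca(X, K=2, n_iter=200, seed=0):
    """EM for a latent class model: K classes, J binary items.
    Returns class probabilities, item probabilities, and posteriors."""
    n, J = X.shape
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)                      # class probabilities
    theta = rng.uniform(0.25, 0.75, (K, J))       # P(item = 1 | class)
    for _ in range(n_iter):
        # E-step: posterior class membership for each subject
        logp = (X @ np.log(theta).T
                + (1.0 - X) @ np.log(1.0 - theta).T + np.log(pi))
        post = np.exp(logp - logp.max(axis=1, keepdims=True))
        post /= post.sum(axis=1, keepdims=True)
        # M-step: re-estimate class sizes and item probabilities
        pi = post.mean(axis=0)
        theta = np.clip((post.T @ X) / post.sum(axis=0)[:, None],
                        1e-6, 1.0 - 1e-6)
    return pi, theta, post

# synthetic data: two well-separated symptom classes (0.9 vs 0.1)
rng = np.random.default_rng(1)
z = rng.integers(0, 2, 300)
X = (rng.random((300, 6)) < np.where(z[:, None] == 0, 0.9, 0.1)).astype(float)
pi, theta, post = fit_lca(X, K=2)
```

Factor and factor mixture models extend this scheme by letting the latent variable be continuous, or a mixture of discrete classes each with its own continuous factor.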

Relevance: 30.00%

Abstract:

Today, there is little knowledge on the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitudes. Such knowledge is needed especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative, and for contingency scenarios for ESA spacecraft like ENVISAT. ESA's "Debris Attitude Motion Measurements and Modelling" project (ESA Contract No. 40000112447), led by the Astronomical Institute of the University of Bern (AIUB), addresses this problem. The goal of the project is to achieve a good understanding of the attitude evolution and of the considerable internal and external effects which occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF), and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). The In-Orbit Tumbling Analysis tool (ιOTA) is a prototype software currently in development by Hyperschall Technologie Göttingen GmbH (HTG) within the framework of the project. ιOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees of freedom) of spacecraft in Earth orbit.
The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impact, as well as the optional definition of particular spacecraft-specific influences like tank sloshing, reaction wheel behaviour, magnetic torquer activity and thruster firing. The purpose of ιOTA is to provide high-accuracy short-term simulations to support observers and potential ADR missions, as well as medium- and long-term simulations to study the significance of the particular internal and external influences on the attitude, especially damping factors and momentum transfer. The simulation will also enable the investigation of the altitude dependency of the particular external influences. ιOTA's post-processing modules will generate synthetic measurements for observers and for software validation. The validation of the software will be done by cross-calibration with observations and measurements acquired by the project partners.
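As a hedged illustration of the attitude-propagation core such a tool needs (not the ιOTA implementation), the sketch below integrates torque-free rigid-body dynamics with a classical RK4 scheme; the inertia values and body rates are invented, not ENVISAT's.

```python
import numpy as np

I = np.array([120.0, 95.0, 60.0])    # principal moments of inertia (kg m^2)

def omega_dot(w):
    """Euler's equations with zero external torque: I dw/dt = -(w x I w)."""
    return -np.cross(w, I * w) / I

def rk4_step(w, dt):
    """One classical Runge-Kutta 4 step for the body rates."""
    k1 = omega_dot(w)
    k2 = omega_dot(w + 0.5 * dt * k1)
    k3 = omega_dot(w + 0.5 * dt * k2)
    k4 = omega_dot(w + dt * k3)
    return w + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

w0 = np.array([0.05, 0.01, -0.02])   # initial body rates (rad/s)
w = w0.copy()
for _ in range(10000):               # propagate 1000 s at dt = 0.1 s
    w = rk4_step(w, 0.1)
```

A full 6-DOF propagator adds the attitude kinematics (quaternion integration) and the perturbing torques listed above; for the torque-free case here, conservation of the angular momentum magnitude is a useful correctness check.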

Relevance: 30.00%

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have been intrigued for a long time by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties.
In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker states to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller.
Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.
In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to crucially depend on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
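The Core mentioned in the last sentence can be checked mechanically for small transferable-utility games: a payoff vector is in the Core if it distributes the grand coalition's worth and no coalition can do better on its own. The 3-player characteristic function below is a made-up toy example, purely for illustration.

```python
from itertools import combinations

def in_core(payoff, v, players):
    """payoff (dict player -> share) is in the Core iff it distributes
    v(N) exactly and no coalition S can improve: sum over S >= v(S)."""
    if abs(sum(payoff.values()) - v(frozenset(players))) > 1e-9:
        return False                               # not efficient
    for r in range(1, len(players)):
        for S in combinations(players, r):
            if sum(payoff[i] for i in S) < v(frozenset(S)) - 1e-9:
                return False                       # S blocks the payoff
    return True

def v(S):
    """Toy symmetric 3-player game: worth depends only on coalition size."""
    return {0: 0, 1: 0, 2: 30, 3: 100}[len(S)]

ok = in_core({'a': 40, 'b': 30, 'c': 30}, v, ['a', 'b', 'c'])   # in the Core
bad = in_core({'a': 90, 'b': 5, 'c': 5}, v, ['a', 'b', 'c'])    # b+c get < 30
```

Partition-function games generalize `v` by making a coalition's worth depend on how the remaining players are partitioned, which is what allows the thesis to model externalities between coalitions.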

Relevance: 30.00%

Abstract:

AIM Virtual patients (VPs) are a unique e-learning resource, fostering clinical reasoning skills through clinical case examples. Their combination with face-to-face teaching, referred to as "blended learning", is important for their successful integration. So far, little is known about the use of VPs in the field of continuing medical education and residency training. The pilot study presented here examined the application of VPs in the framework of a pediatric residency revision course. METHODS Around 200 participants of a pediatric nephrology lecture ('nephrotic and nephritic syndrome in children') were offered two VPs as a wrap-up session at the revision course of the German Society for Pediatrics and Adolescent Medicine (DGKJ) 2009 in Heidelberg, Germany. Using a web-based survey form, different aspects were evaluated concerning the learning experiences with VPs, the combination with the lecture, and the use of VPs for residency training in general. RESULTS N=40 evaluable survey forms were returned (approximately 21%). The return rate was impaired by a technical problem with the local Wi-Fi firewall. The participants perceived working through the VPs as a worthwhile learning experience that prepared them well for diagnosing and treating real patients with similar complaints. The case presentations, the interactivity, and the possibility of repeated practice independent of place and time were particularly highlighted. When asked about the use of VPs in general for residency training, there was a distinct demand for more such offerings. CONCLUSION VPs may reasonably complement existing learning activities in residency training.

Relevance: 30.00%

Abstract:

Background and purpose: Breast cancer continues to be a health problem for women, representing 28 percent of all female cancers and remaining one of the leading causes of death for women. Breast cancer incidence rates become substantial before the age of 50. After menopause, breast cancer incidence rates continue to increase with age, creating a long-lasting source of concern (Harris et al., 1992). Mammography, a technique for the detection of breast tumors in their nonpalpable stage when they are most curable, has taken on considerable importance as a public health measure. The lifetime risk of breast cancer is approximately 1 in 9 and accrues over many decades. Recommendations are that screening be periodic in order to detect cancer at early stages. These recommendations, largely, are not followed. Not only are most women not getting regular mammograms, but this is particularly the case among older women, where regular mammography has been proven to reduce mortality by approximately 30 percent. The purpose of this project was to increase our understanding of factors that are associated with stage of readiness to obtain subsequent mammograms. A secondary purpose of this research was to suggest further conceptual considerations toward the extension of the Transtheoretical Model (TTM) of behavior change to repeat screening mammography. Methods. A sample (n = 1,222) of women 50 years and older in a large multi-specialty clinic in Houston, Texas was surveyed by mail questionnaire regarding their previous screening experience and stage of readiness to obtain repeat screening. A computerized database, maintained on all women who undergo mammography at the clinic, was used to identify women who were eligible for the project. The major statistical technique employed to select the significant variables and to examine the main and interaction effects of independent variables on dependent variables was stepwise polychotomous logistic regression.
A prediction model for each stage-of-readiness definition was estimated. The expected probabilities for stage of readiness were calculated to assess the magnitude and direction of significant predictors. Results. Analysis showed that both ways of defining stage of readiness for obtaining a screening mammogram were associated with specific constructs, including decisional balance and processes of change. Conclusions. The results of the present study demonstrate that the TTM appears to translate to repeat mammography screening. Findings in the current study also support findings of previous studies suggesting that stage of readiness is associated with respondent decisional balance and the processes of change.
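The expected-probabilities step can be illustrated with a polychotomous (multinomial) logit: each stage gets a linear score, and a softmax turns the scores into stage probabilities. The coefficients and covariates below are invented for illustration, not estimates from the mammography data.

```python
import math

def stage_probs(x, intercepts, coef_rows):
    """P(stage k | covariates x) under a multinomial (polychotomous)
    logit; the first stage is the reference category with score 0."""
    scores = [0.0] + [a + sum(b * xi for b, xi in zip(row, x))
                      for a, row in zip(intercepts, coef_rows)]
    m = max(scores)                       # shift for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# three stages, two hypothetical covariates (e.g. decisional balance
# and processes-of-change scores)
probs = stage_probs([1.0, 2.0], [-0.5, 0.3], [[0.8, 0.1], [0.2, 0.4]])
```

Varying one covariate while holding the others fixed traces out how the predicted stage probabilities shift, which is how the magnitude and direction of a significant predictor can be read off.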

Relevance: 30.00%

Abstract:

A problem frequently encountered in Data Envelopment Analysis (DEA) is that the total number of inputs and outputs included tends to be too large relative to the sample size. One way to counter this problem is to combine several inputs (or outputs) into (meaningful) aggregate variables, thereby reducing the dimension of the input (or output) vector. A direct effect of input aggregation is to reduce the number of constraints. This, in turn, alters the optimal value of the objective function. In this paper, we show how a statistical test proposed by Banker (1993) may be applied to test the validity of a specific way of aggregating several inputs. An empirical application using data from Indian manufacturing for the year 2002-03 is included as an example of the proposed test.
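To make the aggregation idea concrete: in the special case of a single input and a single output, DEA efficiency reduces to each unit's output/input ratio relative to the best ratio (the general case requires solving one linear program per unit). The sketch below aggregates two inputs with assumed weights before computing efficiencies; the numbers are illustrative, not the Indian manufacturing data.

```python
def aggregate_inputs(input_matrix, weights):
    """Collapse several inputs per unit into one aggregate input
    via a weighted sum (the weights are an assumption to be tested)."""
    return [sum(w * v for w, v in zip(weights, row)) for row in input_matrix]

def dea_efficiency_1d(inputs, outputs):
    """One-input one-output DEA: efficiency relative to the best ratio."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# three units, two inputs each, aggregated with assumed equal weights
agg = aggregate_inputs([[2.0, 4.0], [3.0, 3.0], [5.0, 1.0]], [0.5, 0.5])
eff = dea_efficiency_1d(agg, [6.0, 9.0, 3.0])
```

Banker's test then compares efficiency distributions computed with and without the aggregation to judge whether the aggregation is statistically admissible.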

Relevance: 30.00%

Abstract:

Microarray technology is a high-throughput method for genotyping and gene expression profiling. Limited sensitivity and specificity are among the essential problems of this technology. Most existing methods of microarray data analysis have an apparent limitation in that they deal only with the numerical part of microarray data and make little use of gene sequence information. Because it is the gene sequences that precisely define the physical objects being measured by a microarray, it is natural to make the gene sequences an essential part of the data analysis. This dissertation focused on the development of free-energy models to integrate sequence information in microarray data analysis. The models were used to characterize the mechanism of hybridization on microarrays and to enhance the sensitivity and specificity of microarray measurements. Cross-hybridization is a major obstacle to the sensitivity and specificity of microarray measurements. In this dissertation, we evaluated the scope of the cross-hybridization problem on short-oligo microarrays. The results showed that cross-hybridization on arrays is mostly caused by oligo fragments with a run of 10 to 16 nucleotides complementary to the probes. Furthermore, a free-energy-based model was proposed to quantify the amount of cross-hybridization signal on each probe. This model treats cross-hybridization as an integral effect of the interactions between a probe and various off-target oligo fragments. Using public spike-in datasets, the model showed high accuracy in predicting the cross-hybridization signals on those probes whose intended targets are absent from the sample. Several prospective models were proposed to improve the Positional-Dependent Nearest-Neighbor (PDNN) model for better quantification of gene expression and cross-hybridization. The problem addressed in this dissertation is fundamental to microarray technology.
We expect that this study will help us to understand the detailed mechanism that determines sensitivity and specificity on microarrays. Consequently, this research will have a wide impact on how microarrays are designed and how the data are interpreted.
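The "10 to 16 nucleotide complementary run" finding can be illustrated with a small scan for the longest Watson-Crick complementary stretch between a probe and an aligned off-target fragment. The sequences below are invented, and real cross-hybridization scoring would additionally weight each stack by nearest-neighbor free energies.

```python
COMP = {"A": "T", "T": "A", "G": "C", "C": "G"}

def longest_complementary_run(probe, fragment):
    """Longest stretch of position-wise Watson-Crick complementary
    bases between a probe and an aligned off-target fragment."""
    best = cur = 0
    for p, f in zip(probe, fragment):
        cur = cur + 1 if COMP.get(p) == f else 0
        best = max(best, cur)
    return best

# a 12-nt complementary run followed by a mismatched tail
run = longest_complementary_run("ATGCATGCATGCATGC", "TACGTACGTACGAAAA")
```

Fragments scoring in the 10-16 range by such a scan are the candidates this work identifies as the dominant sources of off-target signal.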

Relevance: 30.00%

Abstract:

Liver transplantation is a widely accepted treatment for end-stage liver disease. Research has shown that people with end-stage liver disease experience improved survival and health-related quality of life after transplantation. However, the unemployment rate among liver transplant recipients remains high. The reasons for this were the subject of a study that was used as the primary dataset for this policy analysis. According to the primary data and supporting background data, many transplant recipients remain unemployed for fear of losing needed healthcare and disability benefits. When employment is considered as a health outcome, it is important in an era of evidence-based medicine to ensure that healthcare interventions such as liver transplantation produce improved health outcomes. Therefore, the high unemployment rate among liver transplant recipients is a poor health outcome that should be addressed. In this policy analysis, it is proposed that policy might affect this outcome. The problem of unemployment after liver transplantation is structured, and policies affecting the problem are evaluated according to the validated criteria of effectiveness, equity, efficiency, and feasibility. A policy solution is proposed, evaluated, and ultimately recommended to effectively address the problem, to make healthcare coverage more equitable for liver transplant recipients, and to provide a more cost-effective healthcare coverage model during this time of healthcare crisis.

Relevance: 30.00%

Abstract:

Violence against women has been recognized as a significant worldwide human rights issue and public health problem. Women of reproductive age may be particularly at risk, and pregnancy may trigger or escalate violence. Using data available from Demographic and Health Surveys on 271,103 women of reproductive age (15-49) from Bolivia, Cameroon, Colombia, Dominican Republic, Egypt, Haiti, India, Kenya, Nicaragua, Peru, South Africa, and Zambia, this study examined the nature of domestic violence during pregnancy in developing countries, including prevalence, demographic and risk factors, maternal and child health outcomes, perpetrators of violence, help-seeking behavior, and social support. In the majority of countries analyzed, violence during pregnancy consistently occurred at approximately one-third the rate at which domestic violence occurred overall. Younger women and women with more children were particularly at risk. Abuse during pregnancy was significantly associated with history of a terminated pregnancy and under-5 child mortality in most countries, and with neonatal and post-neonatal mortality in most Latin American countries. Women who were abused during pregnancy were most often abused by their current or former husband or boyfriend and most never attempted to seek help. In most countries that examined social support, women abused during pregnancy had significantly less contact with family and friends. Implications for practice and research are discussed.

Relevance: 30.00%

Abstract:

Teen pregnancy is a continuing problem, bringing with it a host of associated health and social risks. Alternative school students are especially at risk, but are historically under-represented in research. This is especially problematic in that instruments are needed to guide effective intervention development, but psychometrics for these instruments cannot be assumed when used in new populations. Decisional balance from the transtheoretical model offers a framework for understanding condom decision making, but has not been tested with alternative school students. Using responses from 640 subjects from Safer Choices 2 (a school-based HIV/STD/pregnancy prevention program implemented in 10 urban, southwestern alternative schools), a decisional balance scale for condom use was examined. A two-factor, mildly correlated model fit the data well. Tests of invariance examined scale functioning within gender and racial/ethnic groups. The underlying structure varied slightly based on subgroup, but on a practical level the impact on the use of scales was minimal. The structure and loadings were invariant across experimental condition. The pro scale was associated with a lower probability of having engaged in unprotected sexual behavior for sexually active subjects, and this association remained significant while controlling for demographic variables. The con scale did not show a significant association with engagement in unprotected sexual behaviors. Limitations and directions for future research were also discussed.


Background. Retail clinics, also called convenience care clinics, have become a rapidly growing trend since their initial development in 2000. These clinics operate within a larger retail operation and are generally located in "big-box" discount stores such as Wal-Mart or Target, grocery stores such as Publix or H-E-B, or retail pharmacies such as CVS or Walgreens (Deloitte Center for Health Solutions, 2008). Care is typically provided by nurse practitioners. Research indicates that this new health care delivery system reduces cost, raises quality, and provides a means of access for the uninsured population (e.g., Deloitte Center for Health Solutions, 2008; Convenient Care Association, 2008a, 2008b, 2008c; Hansen-Turton, Miller, Nash, Ryan, & Counts, 2007; Salinsky, 2009; Scott, 2006; Ahmed & Fincham, 2010). Some healthcare analysts even suggest that retail clinics offer a feasible solution to the nation's shortage of primary care physicians (AHRQ Health Care Innovations Exchange, 2010). The development and performance of retail clinics is heavily dependent upon individual state policies regulating nurse practitioners (NPs). Texas currently has one of the most highly regulated practice environments for NPs (Stout & Elton, 2007; Hammonds, 2008). In September 2009, Texas passed Senate Bill 532, addressing the scope of practice of nurse practitioners in the convenience care model. In comparison with other states, this law still heavily regulates nurse practitioners. However, little research has evaluated the impact of state laws regulating nurse practitioners on the development and performance of retail clinics. Objectives. (1) To describe the potential impact of SB 532 on retail clinic performance. (2) To discuss the effectiveness, efficiency, and equity of the convenience care model. (3) To describe possible alternatives to Texas' nurse practitioner scope-of-practice guidelines as delineated in Texas Senate Bill 532. (4) To describe the type of nurse practitioner state regulation (i.e., independent, light, moderate, or heavy) that best promotes the convenience care model. Methods. State regulations governing nurse practitioners can be characterized as independent, light, moderate, or heavy. Four state NP regulatory types and retail clinic performance were compared and contrasted with Texas regulations using Dunn and Aday's theoretical models for conducting policy analysis and evaluating healthcare systems. Criteria for measurement included effectiveness, efficiency, and equity. Comparison states were Arizona (independent), Minnesota (light), Massachusetts (moderate), and Florida (heavy). Results. A comparative-states analysis of Texas SB 532 and alternative NP scope-of-practice guidelines in the four states of Arizona, Florida, Massachusetts, and Minnesota indicated that SB 532 has minimal potential to affect the shortage of primary care providers (PCPs) in the state. Although SB 532 may increase the number of NPs a physician may supervise, NPs remain heavily restricted in their scope of practice and limited in their ability to act as primary care providers. Arizona's example of independent NP practice provided the best alternative for addressing the shortage of PCPs in Texas, as evidenced by a lower uninsured rate and fewer ED visits per 1,000 population. A survey of the comparison states suggests that, with the exception of Arizona, retail clinics thrive in states that more heavily restrict NP scope of practice rather than in those that are more permissive. An analysis of the effectiveness, efficiency, and equity of the convenience care model indicates that retail clinics perform well in the areas of effectiveness and efficiency but fall short in the area of equity. Conclusion. Texas Senate Bill 532 represents an incremental step toward addressing the shortage of PCPs in the state. A comparative policy analysis of the four other states, with their varying degrees of NP scope of practice, indicates that a more aggressive policy allowing independent NP practice will be needed to achieve positive changes in health outcomes. Retail clinics pose a temporary solution to the shortage of PCPs and will need to expand their locations into poorer regions and incorporate some chronic care to obtain measurable health outcomes.