733 results for Missed appointments


Relevance: 10.00%

Abstract:

Germline mutation testing in patients with colorectal cancer (CRC) is offered only to a subset of patients with a clinical presentation or tumor histology suggestive of familial CRC syndromes, probably underestimating familial CRC predisposition. The aim of our study was to determine whether unbiased screening of newly diagnosed CRC cases with next-generation sequencing (NGS) increases the overall detection rate of germline mutations. We analyzed 152 consecutive CRC patients for germline mutations in 18 CRC-associated genes using NGS. All patients were also evaluated against the Bethesda criteria, and all tumors were investigated for microsatellite instability, mismatch repair protein expression by immunohistochemistry, and the BRAF V600E somatic mutation. NGS-based sequencing identified 27 variants in 9 genes in 23 of the 152 patients studied (18%). Three of these had already been reported as pathogenic, and 12 were class 3 germline variants with an uncertain prediction of pathogenicity. Only 1 of these patients fulfilled the Bethesda criteria and had a microsatellite-unstable tumor and an MLH1 germline mutation. The others would have been missed with current approaches: 2 with an MSH6 premature termination mutation and 12 with uncertain, potentially pathogenic class 3 variants in APC, MLH1, MSH2, MSH6, MSH3 and MLH3. The higher NGS mutation detection rate compared with current testing strategies based on clinicopathological criteria is probably due to the large genetic heterogeneity and overlapping clinical presentations of the various CRC syndromes. However, unbiased screening can also identify apparently nonpenetrant germline mutations, complicating the clinical management of patients and their families.

Relevance: 10.00%

Abstract:

Background.  Cryptococcal meningitis is a leading cause of death in people living with human immunodeficiency virus (HIV)/acquired immune deficiency syndrome. The World Health Organization recommends pre-antiretroviral treatment (ART) cryptococcal antigen (CRAG) screening in persons with CD4 below 100 cells/µL. We assessed the prevalence and outcome of cryptococcal antigenemia in rural southern Tanzania. Methods.  We conducted a retrospective study including all ART-naive adults with CD4 <150 cells/µL prospectively enrolled in the Kilombero and Ulanga Antiretroviral Cohort between 2008 and 2012. Cryptococcal antigen was assessed in cryopreserved pre-ART plasma. Cox regression estimated the composite outcome of death or loss to follow-up (LFU) by CRAG status and fluconazole use. Results.  Of 750 ART-naive adults, 28 (3.7%) were CRAG-positive, corresponding to a prevalence of 4.4% (23 of 520) in CD4 <100 and 2.2% (5 of 230) in CD4 100-150 cells/µL. Within 1 year, 75% (21 of 28) of CRAG-positive and 42% (302 of 722) of CRAG-negative patients were dead or LFU (P<.001), with no differences across CD4 strata. Cryptococcal antigen positivity was an independent predictor of death or LFU after adjusting for relevant confounders (hazard ratio [HR], 2.50; 95% confidence interval [CI], 1.29-4.83; P = .006). Cryptococcal meningitis occurred in 39% (11 of 28) of CRAG-positive patients, with similar retention-in-care regardless of meningitis diagnosis (P = .8). Cryptococcal antigen titer >1:160 was associated with meningitis development (odds ratio, 4.83; 95% CI, 1.24-8.41; P = .008). Fluconazole receipt decreased death or LFU in CRAG-positive patients (HR, 0.18; 95% CI, .04-.78; P = .022). Conclusions.  Cryptococcal antigenemia predicted mortality or LFU among ART-naive HIV-infected persons with CD4 <150 cells/µL, and fluconazole increased survival or retention-in-care, suggesting that targeted pre-ART CRAG screening may decrease early mortality or LFU.
A CRAG screening threshold of CD4 <100 cells/µL missed 18% of CRAG-positive patients, suggesting guidelines should consider a higher threshold.
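The headline percentages above can be reproduced directly from the raw counts reported in the abstract; a quick sanity check:

```python
# Sanity check of the percentages reported in the abstract, computed from the
# raw counts given there (28 CRAG-positive of 750; 23/520 with CD4 <100;
# 5/230 with CD4 100-150 cells/uL).
overall = 100 * 28 / 750          # prevalence in the whole cohort
prev_cd4_lt100 = 100 * 23 / 520   # CD4 <100 stratum
prev_cd4_100_150 = 100 * 5 / 230  # CD4 100-150 stratum
missed_by_lt100 = 100 * 5 / 28    # CRAG-positives missed by a CD4 <100 cutoff

print(f"overall {overall:.1f}%, CD4<100 {prev_cd4_lt100:.1f}%, "
      f"CD4 100-150 {prev_cd4_100_150:.1f}%, missed {missed_by_lt100:.0f}%")
```

The last figure is the basis for the 18% quoted above: 5 of the 28 CRAG-positive patients fell in the CD4 100-150 stratum.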

Relevance: 10.00%

Abstract:

OBJECTIVE To determine the effect of nonadherence to antiretroviral therapy (ART) on virologic failure and mortality in naive individuals starting ART. DESIGN Prospective observational cohort study. METHODS Eligible individuals enrolled in the Swiss HIV Cohort Study, started ART between 2003 and 2012, and provided adherence data on at least one biannual clinical visit. Adherence was defined as missed doses (none, one, two, or more than two) and percentage adherence (>95%, 90-95%, and <90%) in the previous 4 weeks. Inverse probability weighting of marginal structural models was used to estimate the effect of nonadherence on viral failure (HIV-1 viral load >500 copies/ml) and mortality. RESULTS Of 3150 individuals followed for a median of 4.7 years, 480 (15.2%) experienced viral failure and 104 (3.3%) died; 1155 (36.6%) reported missing one dose, 414 (13.1%) two doses, and 333 (10.6%) more than two doses of ART. The risk of viral failure increased with each missed dose (one dose: hazard ratio [HR] 1.15, 95% confidence interval 0.79-1.67; two doses: 2.15, 1.31-3.53; more than two doses: 5.21, 2.96-9.18). The risk of death increased with more than two missed doses (HR 4.87, 2.21-10.73). Missing one to two doses of ART increased the risk of viral failure in those starting once-daily regimens (HR 1.67, 1.11-2.50) compared with those starting twice-daily regimens (HR 0.99, 0.64-1.54; interaction P = 0.09). Consistent results were found for percentage adherence. CONCLUSION Self-report of two or more missed doses of ART is associated with an increased risk of both viral failure and death. A simple adherence question helps identify patients at risk for negative clinical outcomes and offers opportunities for intervention.
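The marginal structural models used here rest on inverse probability weighting. The toy example below, with entirely invented counts, shows the core idea: weighting each patient by 1/P(observed adherence | confounder) yields a pseudo-population in which the confounder is balanced across adherence groups. It is a sketch of the principle, not the study's actual model:

```python
# Toy illustration of inverse probability weighting (IPW), the principle behind
# the marginal structural models in the adherence analysis. All counts are
# invented. A = 1 means nonadherent; C = 1 means a confounder is present
# (e.g. depression) that makes nonadherence more likely.

# counts[(C, A)] = number of patients observed with that combination
counts = {(0, 0): 50, (0, 1): 10, (1, 0): 10, (1, 1): 30}

def p_a_given_c(a, c):
    """P(A = a | C = c), estimated from the observed counts."""
    total_c = counts[(c, 0)] + counts[(c, 1)]
    return counts[(c, a)] / total_c

# Weight each patient by 1 / P(their observed A | their C).
weighted = {(c, a): n / p_a_given_c(a, c) for (c, a), n in counts.items()}

# In the weighted pseudo-population both adherence arms now have the same
# confounder mix, so a crude weighted comparison is no longer confounded by C.
for a in (0, 1):
    print(f"A={a}: weighted counts for C=0, C=1:",
          [round(weighted[(c, a)]) for c in (0, 1)])
```

Both arms end up with the same weighted confounder distribution (60 vs 40 here), which is exactly what lets the weighted outcome comparison be read causally under the usual assumptions.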

Relevance: 10.00%

Abstract:

BACKGROUND: Crossing a street can be a very difficult task for older pedestrians. With increased age and potential cognitive decline, older people decide to cross a street primarily based on vehicles' distance rather than their speed. Furthermore, older pedestrians tend to overestimate their own walking speed and fail to adapt it to traffic conditions. Pedestrians' behavior is often tested using virtual reality, which has the advantages of being safe and cost-effective and of allowing standardized test conditions. METHODS: This paper describes an observational study with older and younger adults. Street crossing behavior was investigated in 18 healthy younger and 18 older subjects using a virtual reality setting. The aim of the study was to measure behavioral data (such as eye and head movements) and to assess how the two age groups differ in terms of number of safe street crossings, virtual crashes, and missed street crossing opportunities. Street crossing behavior and eye and head movements in older and younger subjects were compared with non-parametric tests. RESULTS: The results showed that younger pedestrians crossed the street more safely than older pedestrians. The eye and head movement analysis revealed that older people looked more at the ground and less at the other side of the street they intended to cross. CONCLUSIONS: The less safe street crossing behavior found in older pedestrians could be explained by their reduced cognitive and visual abilities, which, in turn, resulted in difficulties in the decision-making process, especially under time pressure. Decisions to cross a street were based on the distance of the oncoming cars, rather than their speed, for both groups. Older pedestrians looked more at their feet, probably because they need more time to plan precise stepping movements and, in turn, paid less attention to the traffic. These findings might help to establish guidelines for improving older pedestrians' safety, in terms of speed limits, road design, and combined physical-cognitive training.

Relevance: 10.00%

Abstract:

INTRODUCTION Distraction-based spinal growth modulation by growing rods or vertical expandable prosthetic titanium ribs (VEPTRs) is the mainstay of instrumented operative strategies to correct early onset spinal deformities. To objectify the benefits, it has become common practice to measure the gain in spine height by assessing the T1-S1 distance on anteroposterior (AP) radiographs. However, by ignoring growth changes at the level of individual vertebrae and by limiting measurement to one plane, valuable data are missed regarding the three-dimensional (3D) effects of growth modulation. This information might be relevant when it comes to final fusion or, even more so, when the protective growing implants are removed and the spine is re-exposed to physiologic forces at the end of growth. METHODS The goal of this retrospective radiographic study was to assess the growth modulating impact of year-long, distraction-based VEPTR treatment on the morphology of single vertebral bodies. We digitally measured lumbar vertebral body height (VBH) and upper endplate depth (VBD) at the time of the index procedure and at follow-up in nine patients with rib-to-ilium constructs (G1) spanning an anatomically normal lumbar spine. Nine patients with congenital thoracic scoliosis and VEPTR rib-to-rib constructs, but uninstrumented lumbar spines, served as controls (G2). All had undergone more than eight half-yearly VEPTR expansions. A Wilcoxon signed-rank test was used for statistical comparison of initial and follow-up VBH, VBD and height/depth (H/D) ratio (significance level 0.05). RESULTS The average age was 7.1 years (G1) and 5.2 years (G2, p > 0.05) at initial surgery; the average overall follow-up time was 5.5 years (p = 1). In both groups, VBH increased significantly without a significant intergroup difference. Group 1 did not show significant growth in depth, whereas VBD increased significantly in the control group.
As a consequence, the H/D ratio increased significantly in group 1, whereas it remained unchanged in group 2. The growth rate for height was 1.4 mm/year (group 1) and 1.1 mm/year (group 2, p = 0.45); for depth, it was -0.3 and 1.1 mm/year (p < 0.05), respectively. CONCLUSIONS VEPTR growth modulating treatment alters the geometry of vertebral bodies by increasing the H/D ratio. We hypothesize that the implant-related deprivation of axial loads (stress shielding) impairs anteroposterior growth. The biomechanical consequence of such slender vertebrae being exposed to unprotected loads after definitive VEPTR removal at the end of growth is uncertain.
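The reported growth rates make the divergence in H/D ratio easy to illustrate. The baseline dimensions below are invented (equal height and depth, purely for simplicity); only the growth rates and the 5.5-year follow-up come from the abstract:

```python
# Illustration of how the reported growth rates reshape vertebral geometry.
# Baseline height/depth (20 mm each) are invented for simplicity; the growth
# rates (mm/year) and the 5.5-year follow-up are taken from the abstract.
years = 5.5
vbh0 = vbd0 = 20.0  # invented equal baseline height and depth, in mm

# Group 1 (VEPTR-spanned): height +1.4 mm/yr, depth -0.3 mm/yr
g1_h, g1_d = vbh0 + 1.4 * years, vbd0 - 0.3 * years
# Group 2 (uninstrumented controls): height and depth both +1.1 mm/yr
g2_h, g2_d = vbh0 + 1.1 * years, vbd0 + 1.1 * years

print(f"G1 H/D: {vbh0 / vbd0:.2f} -> {g1_h / g1_d:.2f}")  # ratio rises
print(f"G2 H/D: {vbh0 / vbd0:.2f} -> {g2_h / g2_d:.2f}")  # ratio unchanged
```

With these toy baselines the instrumented group's H/D ratio climbs from 1.00 to about 1.51 while the control ratio stays at 1.00, mirroring the pattern the study reports.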

Relevance: 10.00%

Abstract:

INTRODUCTION Left ventricular thrombus (LVT) formation may worsen the post-infarct outcome as a result of thromboembolic events. It also complicates the use of modern antiplatelet regimens, which are not compatible with long-term oral anticoagulation. Knowledge of the incidence of LVT may therefore be important to guide antiplatelet and antithrombotic therapy after acute myocardial infarction (AMI). METHODS In 177 patients with large, mainly anterior AMI, standard cardiac magnetic resonance imaging (CMR) including cine and late gadolinium enhancement (LGE) imaging was performed shortly after AMI as per protocol. CMR images were analysed at an independent core laboratory blinded to the clinical data. Transthoracic echocardiography (TTE) was not mandatory for the trial, but was performed in 64% of the cases following standard of care. Of 61 candidate parameters, 3 were used in a multivariable logistic model to predict LVT. RESULTS LVT was detected by CMR in 6.2% of patients (95% confidence interval [CI] 3.1%-10.8%). LGE sequences were best suited to detect LVT, which may be missed in cine sequences. We identified body mass index (odds ratio 1.18; p = 0.01), baseline platelet count (odds ratio 1.01, p = 0.01) and infarct size as assessed by CMR (odds ratio 1.03, p = 0.02) as the best predictors of LVT. The agreement between TTE and CMR for the detection of LVT was substantial (kappa = 0.70). DISCUSSION In the current analysis, the incidence of LVT shortly after AMI was relatively low, even in a patient population at high risk. The optimal modality for LVT detection is LGE-CMR, but TTE has acceptable accuracy when LGE-CMR is not available.
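As a rough illustration of how reported odds ratios act in a multivariable logistic model, the sketch below combines the three published ORs into a predicted probability. The intercept and the covariate units are hypothetical, since neither is given in the abstract; this is the generic mechanics of a logistic model, not the study's fitted equation:

```python
import math

# Sketch of how odds ratios combine in a logistic model. The per-unit odds
# ratios are from the abstract; the intercept and covariate units (e.g. the
# platelet-count scale) are hypothetical, chosen only for illustration.
odds_ratios = {"bmi": 1.18, "platelet_count": 1.01, "infarct_size": 1.03}
intercept = -9.0  # hypothetical baseline log-odds (not reported)

def lvt_probability(bmi, platelet_count, infarct_size):
    """Predicted LVT probability from the three reported predictors."""
    logit = (intercept
             + math.log(odds_ratios["bmi"]) * bmi
             + math.log(odds_ratios["platelet_count"]) * platelet_count
             + math.log(odds_ratios["infarct_size"]) * infarct_size)
    return 1 / (1 + math.exp(-logit))

# Per-unit ORs compound multiplicatively: a 5-unit BMI increase multiplies
# the odds of LVT by 1.18 ** 5 ≈ 2.3.
print(f"odds multiplier for +5 BMI: {1.18 ** 5:.2f}")
```

The point of the sketch is the compounding: an OR of 1.18 per unit looks modest, but across a 5-unit covariate difference it more than doubles the odds.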

Relevance: 10.00%

Abstract:

Limited data exist on the efficacy of long-term therapies for osteoporosis. In osteoporotic postmenopausal women receiving denosumab for 7 years, nonvertebral fracture rates significantly decreased in years 4-7 versus years 1-3. This is the first demonstration of a further benefit on fracture outcomes with long-term therapy for osteoporosis. INTRODUCTION This study aimed to evaluate whether denosumab treatment continued beyond 3 years is associated with a further reduction in nonvertebral fracture rates. METHODS Participants who completed the 3-year placebo-controlled Fracture REduction Evaluation of Denosumab in Osteoporosis every 6 Months (FREEDOM) study were invited to participate in an open-label extension. The present analysis includes 4,074 postmenopausal women with osteoporosis (n = 2,343 long-term; n = 1,731 cross-over) who enrolled in the extension, missed ≤1 dose during their first 3 years of denosumab treatment, and continued into the fourth year of treatment. Nonvertebral fracture rates during years 1-3 of denosumab were compared with the rate in the fourth year and with the rate during years 4-7. RESULTS For the combined group, the nonvertebral fracture rate per 100 participant-years was 2.15 for the first 3 years of denosumab treatment (referent) and 1.36 in the fourth year (rate ratio [RR] = 0.64; 95% confidence interval (CI) = 0.48 to 0.85, p = 0.003). Comparable findings were observed in the groups separately and when nonvertebral fracture rates during years 1-3 were compared to years 4-7 in the long-term group (RR = 0.79; 95% CI = 0.62 to 1.00, p = 0.046). Fracture rate reductions in year 4 were most prominent in subjects with persisting low hip bone mineral density (BMD). CONCLUSIONS Denosumab treatment beyond 3 years was associated with a further reduction in nonvertebral fracture rate that persisted through 7 years of continuous denosumab administration. The degree to which denosumab further reduces nonvertebral fracture risk appears to be influenced by the hip bone density achieved with initial therapy.
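A quick check: dividing the reported fourth-year rate by the referent rate gives approximately the published rate ratio (the exact RR of 0.64 comes from the study's statistical model, not from this crude division):

```python
# Back-of-envelope check on the reported nonvertebral fracture rates.
# The crude ratio of the two rates should sit close to the published,
# model-based rate ratio of 0.64.
rate_years_1_3 = 2.15  # fractures per 100 participant-years (referent)
rate_year_4 = 1.36     # fractures per 100 participant-years

crude_rr = rate_year_4 / rate_years_1_3
print(f"crude rate ratio ≈ {crude_rr:.2f}")  # ≈ 0.63 vs published RR 0.64
```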

Relevance: 10.00%

Abstract:

PURPOSE To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. MATERIALS AND METHODS This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using the digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using the conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome, and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was performed with the Wilcoxon rank sum test. RESULTS All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis for the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, with a significant decrease of 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). CONCLUSION Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine.
This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.

Relevance: 10.00%

Abstract:

Introduction: In team sports the ability to use peripheral vision is essential to track a number of players and the ball. Using eye-tracking devices, it has been found that players either use fixations and saccades to process information on the pitch or use smooth pursuit eye movements (SPEM) to keep track of single objects (Schütz, Braun, & Gegenfurtner, 2011). However, it is assumed that peripheral vision can be used best when the gaze is stable, while it is unknown whether motion changes can be detected equally well when SPEM are used, especially because contrast sensitivity is reduced during SPEM (Schütz, Delipetkose, Braun, Kerzel, & Gegenfurtner, 2007). Therefore, peripheral motion change detection was examined by contrasting a fixation condition with a SPEM condition. Methods: 13 participants (7 male, 6 female) were presented with a visual display consisting of 15 white squares and 1 red square. Participants were instructed to follow the red square with their eyes and press a button as soon as a white square began to move. White square movements occurred either while the red square was still (fixation condition) or while it was moving in a circular manner at 6°/s (pursuit condition). The to-be-detected white square movements varied in eccentricity (4°, 8°, 16°) and speed (1°/s, 2°/s, 4°/s), while the movement time of the white squares was constant at 500 ms. In total, 180 events were to be detected. A Vicon-integrated eye-tracking system and a button press (1000 Hz) were used to control for eye movements and to measure detection rates and response times. Response times (ms) and missed detections (%) were measured as dependent variables and analysed with a 2 (manipulation) x 3 (eccentricity) x 3 (speed) ANOVA with repeated measures on all factors. Results: Significant response time effects were found for manipulation, F(1,12) = 224.31, p < .01, ηp2 = .95, eccentricity, F(2,24) = 56.43, p < .01, ηp2 = .83, and the interaction between the two factors, F(2,24) = 64.43, p < .01, ηp2 = .84.
Response times increased as a function of eccentricity for SPEM only and were overall higher than in the fixation condition. Results further showed missed-event effects for manipulation, F(1,12) = 37.14, p < .01, ηp2 = .76, eccentricity, F(2,24) = 44.90, p < .01, ηp2 = .79, the interaction between the two factors, F(2,24) = 39.52, p < .01, ηp2 = .77, and the three-way interaction manipulation x eccentricity x speed, F(2,24) = 3.01, p = .03, ηp2 = .20. While less than 2% of events were missed on average in the fixation condition as well as at 4° and 8° eccentricity in the SPEM condition, missed events increased for SPEM at 16° eccentricity, with significantly more missed events in the 4°/s speed condition (1°/s: M = 34.69, SD = 20.52; 2°/s: M = 33.34, SD = 19.40; 4°/s: M = 39.67, SD = 19.40). Discussion: It could be shown that using SPEM impairs the ability to detect peripheral motion changes in the far periphery and that fixations help not only to detect these motion changes but also to respond faster. Due to the high temporal constraints, especially in team sports like soccer or basketball, fast reactions are necessary for successful anticipation and decision making. Thus, it is advisable to anchor gaze at a specific location if peripheral changes (e.g. movements of other players) that require a motor response have to be detected. In contrast, SPEM should only be used if tracking a single object, like the ball in cricket or baseball, is necessary for a successful motor response. References: Schütz, A. C., Braun, D. I., & Gegenfurtner, K. R. (2011). Eye movements and perception: A selective review. Journal of Vision, 11, 1-30. Schütz, A. C., Delipetkose, E., Braun, D. I., Kerzel, D., & Gegenfurtner, K. R. (2007). Temporal contrast sensitivity during smooth pursuit eye movements. Journal of Vision, 7, 1-15.

Relevance: 10.00%

Abstract:

Bern is a classic example of a so-called secondary capital city: a capital that is not the primary economic center of its nation. Such capital cities feature a specific political economy characterized by a strong government presence in the regional economy and in local governance arrangements. Bern has been losing importance in the Swiss urban system over the past decades due to a stagnating economy, population decline and missed opportunities for regional cooperation. To re-position itself in the Swiss urban hierarchy, political leaders and policymakers established a non-profit organization called “Capital Region Switzerland” in 2010, arguing that a capital city should not be measured by economic success alone, but by its function as a political center where political decisions are negotiated and implemented. This city profile analyses Bern's strategy and discusses its ambitions and limitations in the context of the city's history and its socio-economic and political conditions. We conclude that Bern's positioning strategy has so far been a political success, yet severe limitations remain with regard to advancing economic development. As a result, this re-positioning strategy cannot address the fundamental economic development challenges that Bern faces as a secondary capital city.

Relevance: 10.00%

Abstract:

PURPOSE As survival rates of adolescent and young adult (AYA) cancer patients increase, a growing number of AYA cancer survivors need follow-up care. However, there is little research on their preferences for follow-up care. We aimed to (1) describe AYA cancer survivors' preferences for the organization and content of follow-up care, (2) describe their preferences for different models of follow-up, and (3) investigate clinical and sociodemographic characteristics associated with preferences for the different models. METHODS AYA cancer survivors (diagnosed with cancer at age 16-25 years; ≥5 years after diagnosis) were identified through the Cancer Registry Zurich and Zug. Survivors completed a questionnaire on follow-up attendance, preferences for organizational aspects of follow-up care (what is important during follow-up, what should be included during appointments, what specialists should be involved, location), models of follow-up (telephone/questionnaire, general practitioner (GP), pediatric oncologist, medical oncologist, multidisciplinary team), and sociodemographic characteristics. Information on tumor and treatment was available through the Cancer Registry Zurich and Zug. RESULTS Of 389 contacted survivors, 160 (41.1%) participated and 92 (57.5%) reported still attending follow-up. Medical aspects of follow-up care were more important than general aspects (p < 0.001). Among different organizational models, follow-up by a medical oncologist was rated higher than all other models (p = 0.002). Non-attenders of follow-up rated GP-led follow-up significantly higher than attenders (p = 0.001). CONCLUSION Swiss AYA cancer survivors valued medical content of follow-up and showed a preference for medical oncologist-led follow-up. Implementation of different models of follow-up care might improve accessibility and attendance among AYA cancer survivors.

Relevance: 10.00%

Abstract:

The focus of the study was to identify variables that African American women who delivered at a teaching hospital in Houston, Harris County, Texas, between January 12, 1998 and April 24, 1998 perceived to prevent them from receiving adequate prenatal care. The research was based on Aday and Andersen's Framework for the Study of Access to Medical Care. A self-administered questionnaire, using realized and potential access indicators, was developed and administered to 161 African American patients at the study hospital. The objectives of the study were (1) to describe the demographic characteristics of African American women who delivered at a large urban teaching hospital between January 12, 1998 and April 24, 1998; and to determine the relationships between (2) predisposing factors such as age, race, educational level, marital status, family structure, social support and attitude toward prenatal care and prenatal care utilization; (3) enabling factors such as income, employment, insurance status, transportation, appointments, and regular source of care; (4) need factors such as perceived health status, number of past pregnancies, and pregnancy occurrence; and (5) the relative importance of predisposing, enabling and need factors as predictors of utilization of prenatal care. The indicators of prenatal care utilization examined included the trimester in which the women initiated prenatal care, the number of visits, and the numbers and types of services received during pregnancy. Barriers cited included low income and inadequate insurance coverage, problems of transportation and child care, unawareness of pregnancy, delays in the scheduling of appointments, and having too many other problems. The results of the study have implications for well-defined public health promotion campaigns, social support system enhancement, and appointment scheduling reform with an emphasis on prenatal care.

Relevance: 10.00%

Abstract:

Historically, morphological features were used as the primary means to classify organisms. However, the age of molecular genetics has allowed us to approach this field from the perspective of the organism's genetic code. Early work used highly conserved sequences, such as ribosomal RNA. The increasing number of complete genomes in the public data repositories provides the opportunity to look not only at a single gene, but at an organism's entire parts list. Here the Sequence Comparison Index (SCI) and the Organism Comparison Index (OCI), algorithms and methods to compare proteins and proteomes, are presented. The complete proteomes of 104 sequenced organisms were compared. Over 280 million full Smith-Waterman alignments were performed on sequence pairs that had a reasonable expectation of being related. From these alignments a whole-proteome phylogenetic tree was constructed. This method was also used to compare the small subunit (SSU) rRNA from each organism, and a tree was constructed from these results. The SSU rRNA tree produced by the SCI/OCI method looks very much like accepted SSU rRNA trees from sources such as the Ribosomal Database Project, thus validating the method. The SCI/OCI proteome tree showed a number of small but significant differences when compared to the SSU rRNA tree and to proteome trees constructed by other methods. Horizontal gene transfer does not appear to affect the SCI/OCI trees until the transferred genes make up a large portion of the proteome. As part of this work, the Database of Related Local Alignments (DaRLA) was created; it contains over 81 million rows of sequence alignment information. DaRLA, while primarily used to build the whole-proteome trees, can also be applied to shared gene content analysis, gene order analysis, and creating individual protein trees. Finally, the standard BLAST method for analyzing shared gene content was compared to the SCI method using 4 spirochetes.
The SCI system performed flawlessly, finding all proteins from one organism against itself and finding all the ribosomal proteins between organisms. The BLAST system missed some proteins from its respective organism and failed to detect small ribosomal proteins between organisms.
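The alignment primitive behind those 280 million comparisons is the Smith-Waterman local alignment. Below is a minimal sketch with a toy match/mismatch/gap scheme; a real pipeline such as this study's would use a protein substitution matrix (e.g. BLOSUM), and the tiny "proteome" here is invented for illustration:

```python
# Minimal Smith-Waterman local-alignment scorer, the primitive applied to
# every candidate protein pair in an all-vs-all proteome comparison.
# Toy match/mismatch/gap scoring; a real pipeline would use a substitution
# matrix such as BLOSUM. The peptide "proteome" below is invented.

def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Return the best local-alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    h = [[0] * cols for _ in range(rows)]  # DP matrix, floored at 0
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = h[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            h[i][j] = max(0, diag, h[i - 1][j] + gap, h[i][j - 1] + gap)
            best = max(best, h[i][j])
    return best

# All-vs-all scoring over a toy "proteome" of three short peptides.
proteome = {"p1": "MKVLAA", "p2": "MKVLSA", "p3": "GGGWWW"}
names = sorted(proteome)
scores = {(x, y): smith_waterman(proteome[x], proteome[y])
          for x in names for y in names}
print(scores[("p1", "p2")], scores[("p1", "p3")])  # related pair scores higher
```

From such a pairwise score matrix, methods like SCI/OCI aggregate per-pair scores into organism-level distances from which a tree can be built.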

Relevance: 10.00%

Abstract:

Regulatory change not seen since the Great Depression swept the U.S. banking industry beginning in the early 1980s, culminating with the Interstate Banking and Branching Efficiency Act of 1994. Significant consolidations have occurred in the banking industry. This paper considers the market-power versus the efficient-structure theories of the positive correlation between banking concentration and performance on a state-by-state basis. Temporal causality tests imply that bank concentration leads bank profitability, supporting the market-power, rather than the efficient-structure, theory of that positive correlation. Our finding suggests that bank regulators, by focusing on local banking markets, missed the initial stages of an important structural change at the state level.
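The temporal-causality logic can be sketched with a toy lead-lag check: if concentration leads profitability, lagged concentration should track current profit far better than lagged profit tracks current concentration. The series below are invented; the paper itself applies formal causality tests to state-level data:

```python
# Toy lead-lag sketch of the temporal-causality idea. All data are invented;
# the study applies formal causality tests to state-level banking data.

def lagged_corr(x, y, lag=1):
    """Pearson correlation between x[t - lag] and y[t]."""
    xs, ys = x[:-lag], y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

concentration = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]  # invented, roughly patternless
# Profit follows concentration with a one-period lag (plus an affine rescale).
profit = [0.5 * c + 2 for c in [3] + concentration[:-1]]

print(lagged_corr(concentration, profit))   # ~1: concentration leads profit
print(lagged_corr(profit, concentration))   # ~0: profit does not lead
```

A full Granger-style test would regress each series on its own lags plus the other series' lags and test the added explanatory power, but the asymmetry above is the intuition behind "concentration leads profitability".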

Relevance: 10.00%

Abstract:

In 2004, Houston had one of the lowest childhood immunization levels among major metropolitan cities in the United States, at 65% for the 4:3:1:3:3 vaccination series. Delays in the receipt of scheduled vaccinations may be related to missed opportunities due to health care providers' lack of knowledge about catch-up regimens and contraindications for pediatric vaccination. The objectives of this study are to identify, measure, and report on VFC provider-practice characteristics, knowledge of catch-up regimens and contraindications, and use of reminder/recall (R/R) and moved-or-gone-elsewhere (MOGE) practices among providers with high (>80%) and low (<70%) immunization coverage among 19- to 35-month-old children. The sampling frame consists of 187 Vaccines for Children (VFC) providers with 2004 Clinic Assessment Software Application (CASA) scores. Data were collected by personal interview with each participating practice provider. Only ten VFC providers were successful at maximizing vaccinations for every vignette, and no provider administered the maximum possible number of vaccinations at visit 2 for all six vignettes. Both coverage groups administered pneumococcal conjugate vaccine (PCV), Haemophilus influenzae type b (Hib), and diphtheria, tetanus and acellular pertussis (DTaP) vaccines most frequently, and most frequently omitted varicella zoster vaccine (VZV) and measles, mumps, and rubella (MMR) vaccine.