13 results for upper primary
at Duke University
Abstract:
Background: Because most developing countries lack sufficient resources and infrastructure to conduct population-based studies of childhood blindness, the epidemiologically reliable data needed to plan public health strategies addressing its major determinants can be difficult to obtain. The major etiologies of blindness can differ regionally and intra-regionally. The objective of this retrospective study was to determine (1) the major causes of childhood blindness (BL) and severe visual impairment (SVI) in students attending the Wa Methodist School for the Blind in the Upper West Region of northern Ghana, and (2) any potential temporal trends in the causes of blindness for this region.
Methods: In this retrospective study, demographic data and clinical information from an eye screening at Wa Methodist School for the Blind were coded according to the World Health Organization/Prevention of Blindness standardized reporting methodology. Causes of BL and SVI were categorized anatomically and etiologically. We determined the major causes of BL/SVI over time using information provided about the age at onset of visual loss for each student.
Results: The major anatomical causes of BL/SVI among the 190 students screened were corneal opacity and phthisis bulbi (n=28, 15%), optic atrophy (n=23, 13%), glaucoma (n=18, 9%), microphthalmos (n=18, 9%), and cataract (n=18, 9%). Within the first year of life, students became blind mainly due to whole globe causes (n=23, 26%), cataract (n=15, 17%), and optic atrophy (n=11, 13%). Those who became blind after age one year had whole globe causes (n=26, 26%), corneal opacity (n=24, 24%), and optic atrophy (n=13, 13%).
Conclusion: At the Wa Methodist School for the Blind, the major anatomical causes of BL/SVI were corneal opacity and phthisis bulbi. About half of all students became blind within the first year of life and were disproportionately affected by cataract and retinal causes compared with students who became blind after age one year. Although blind school studies have inherent disadvantages and limitations, accounting for temporal trends and other epidemiological factors of blindness may increase the usefulness of their data for improving newborn screening in hospitals and primary care centers, and for tailoring prevention and treatment programs to reduce avoidable childhood blindness in neonates and schoolchildren.
Abstract:
BACKGROUND: Despite the impact of hypertension and widely accepted target values for blood pressure (BP), interventions to improve BP control have had limited success. OBJECTIVES: We describe the design of a 'translational' study that examines the implementation, impact, sustainability, and cost of an evidence-based nurse-delivered tailored behavioral self-management intervention to improve BP control as it moves from a research context to healthcare delivery. The study addresses four specific aims: assess the implementation of an evidence-based behavioral self-management intervention to improve BP levels; evaluate the clinical impact of the intervention as it is implemented; assess organizational factors associated with the sustainability of the intervention; and assess the cost of implementing and sustaining the intervention. METHODS: The project involves three geographically diverse VA intervention facilities and nine control sites. We first conduct an evaluation of barriers and facilitators for implementing the intervention at intervention sites. We examine the impact of the intervention by comparing 12-month pre/post changes in BP control between patients in intervention sites versus patients in the matched control sites. Next, we examine the sustainability of the intervention and organizational factors facilitating or hindering the sustained implementation. Finally, we examine the costs of intervention implementation. Key outcomes are acceptability and costs of the program, as well as changes in BP. Outcomes will be assessed using mixed methods (e.g., qualitative analyses: pattern matching; quantitative methods: linear mixed models). DISCUSSION: The study results will provide information about the challenges and costs to implement and sustain the intervention, and what clinical impact can be expected.
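As a rough illustration of the quantitative analysis named above, the sketch below fits a difference-in-differences linear mixed model to hypothetical pre/post BP data. The file and column names are invented for the example and are not from the study protocol.

```python
# Hypothetical sketch of the pre/post BP comparison with a linear mixed
# model (statsmodels). Column names and the data file are illustrative,
# not taken from the study.
import pandas as pd
import statsmodels.formula.api as smf

bp = pd.read_csv("bp_visits.csv")  # hypothetical long-format file:
# one row per patient visit, with columns:
#   sbp    - systolic blood pressure at the visit
#   period - 0 = 12 months pre-implementation, 1 = 12 months post
#   arm    - 0 = matched control site, 1 = intervention site
#   site   - facility identifier (random-effect grouping)

# Difference-in-differences: the arm:period interaction estimates the
# change in BP at intervention sites relative to control sites.
model = smf.mixedlm(
    "sbp ~ arm * period",
    data=bp,
    groups=bp["site"],  # random intercept per facility
    re_formula="1",
)
result = model.fit()
print(result.summary())
```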
Abstract:
Light-dependent deactivation of rhodopsin as well as homologous desensitization of beta-adrenergic receptors involves receptor phosphorylation that is mediated by the highly specific protein kinases rhodopsin kinase (RK) and beta-adrenergic receptor kinase (beta ARK), respectively. We report here the cloning of a complementary DNA for RK. The deduced amino acid sequence shows a high degree of homology to beta ARK. In a phylogenetic tree constructed by comparing the catalytic domains of several protein kinases, RK and beta ARK are located on a branch close to, but separate from, the cyclic nucleotide-dependent protein kinase and protein kinase C subfamilies. From the common structural features we conclude that both RK and beta ARK are members of a newly delineated gene family of guanine nucleotide-binding protein (G protein)-coupled receptor kinases that may function in diverse pathways to regulate the function of such receptors.
Abstract:
BACKGROUND: Primary care providers' suboptimal recognition of the severity of chronic kidney disease (CKD) may contribute to untimely referrals of patients with CKD to subspecialty care. It is unknown whether U.S. primary care physicians' use of estimated glomerular filtration rate (eGFR) rather than serum creatinine to estimate CKD severity could improve the timeliness of their subspecialty referral decisions. METHODS: We conducted a cross-sectional study of 154 United States primary care physicians to assess the effect of use of eGFR (versus creatinine) on the timing of their subspecialty referrals. Primary care physicians completed a questionnaire featuring questions regarding a hypothetical White or African American patient with progressing CKD. We asked primary care physicians to identify the serum creatinine and eGFR levels at which they would recommend that patients like the hypothetical patient be referred for subspecialty evaluation. We assessed significant improvement in the timing (from eGFR < 30 to ≥ 30 mL/min/1.73 m²) of their recommended referrals based on their use of creatinine versus eGFR. RESULTS: Primary care physicians recommended subspecialty referrals later (CKD more advanced) when using creatinine versus eGFR to assess kidney function (median eGFR 32 versus 55 mL/min/1.73 m², p < 0.001). Forty percent of primary care physicians significantly improved the timing of their referrals when basing their recommendations on eGFR. Improved timing occurred more frequently among primary care physicians practicing in academic (versus non-academic) practices or presented with White (versus African American) hypothetical patients (adjusted percentage (95% CI): 70% (45-87) versus 37% (reference) and 57% (39-73) versus 25% (reference), respectively, both p ≤ 0.01). CONCLUSIONS: Primary care physicians recommended subspecialty referrals earlier when using eGFR (versus creatinine) to assess kidney function. Enhanced use of eGFR by primary care physicians could lead to more timely subspecialty care and improved clinical outcomes for patients with CKD.
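For reference, eGFR is derived from serum creatinine together with demographic factors. The abstract does not say which estimating equation was assumed, so the sketch below uses the widely cited 4-variable MDRD study equation to show why a single creatinine value maps to different eGFR values across patients.

```python
# Sketch of the 4-variable (IDMS-traceable) MDRD study equation, one
# common way eGFR is derived from serum creatinine. The study may have
# assumed a different equation; this is for reference only.
def egfr_mdrd(scr_mg_dl: float, age_years: float,
              female: bool, black: bool) -> float:
    """Estimated GFR in mL/min/1.73 m^2 from serum creatinine (mg/dL)."""
    egfr = 175.0 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# The same creatinine maps to very different eGFR values depending on
# age and sex, which is why a raw creatinine cutoff can delay referral.
print(round(egfr_mdrd(1.8, 70, female=True, black=False), 1))   # ~28
print(round(egfr_mdrd(1.8, 40, female=False, black=False), 1))  # ~42
```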
Abstract:
OBJECT: Chordoma cells can generate solid-like tumors in xenograft models that express some molecular characteristics of the parent tumor, including positivity for brachyury and cytokeratins. However, there is a dearth of molecular markers related to chordoma tumor growth, as well as of the cell lines needed to advance treatment. The objective of this study was to isolate a novel primary chordoma cell source and analyze the characteristics of tumor growth in a mouse xenograft model for comparison with the established U-CH1 and U-CH2b cell lines. METHODS: Primary cells from a sacral chordoma, called "DVC-4," were cultured alongside U-CH1 and U-CH2b cells for more than 20 passages and characterized for expression of CD24 and brachyury. While brachyury is believed essential for driving tumor formation, CD24 is associated with healthy nucleus pulposus cells. Each cell type was subcutaneously implanted in NOD/SCID/IL2Rγ(null) mice. The percentage of solid tumors formed, time to maximum tumor size, and immunostaining scores for CD24 and brachyury (intensity scores of 0-3, heterogeneity scores of 0-1) were reported and evaluated to test differences across groups. RESULTS: The DVC-4 cells retained chordoma-like morphology in culture and exhibited CD24 and brachyury expression profiles in vitro that were similar to those for U-CH1 and U-CH2b. Both U-CH1 and DVC-4 cells grew tumors at rates that were faster than those for U-CH2b cells. Gross tumor developed at nearly every site (95%) injected with U-CH1 and at most sites (75%) injected with DVC-4. In contrast, U-CH2b cells produced grossly visible tumors in less than 50% of injected sites. Brachyury staining was similar among tumors derived from all 3 cell types and was intensely positive (scores of 2-3) in a majority of tissue sections. In contrast, differences in the pattern and intensity of staining for CD24 were noted among the 3 types of cell-derived tumors (p < 0.05, chi-square test), with evidence of intense and uniform staining in a majority of U-CH1 tumor sections (score of 3) and more than half of the DVC-4 tumor sections (scores of 2-3). In contrast, a majority of sections from U-CH2b cells stained modestly for CD24 (scores of 1-2) with a predominantly heterogeneous staining pattern. CONCLUSIONS: This is the first report on xenografts generated from U-CH2b cells, in which a low tumorigenicity was discovered despite evidence of chordoma-like characteristics in vitro. For tumors derived from the primary chordoma cells and the U-CH1 cell line, similarly intense staining for CD24 was observed, which may correspond to their similar potential to grow tumors. In contrast, U-CH2b tumors stained less intensely for CD24. These results emphasize that many markers, including CD24, may be useful in distinguishing among chordoma cell types and their tumorigenicity in vivo.
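The chi-square comparison of CD24 staining reported above can be reproduced in outline as follows. The contingency counts below are invented for illustration (the abstract reports only score ranges and p < 0.05), so only the shape of the analysis, not the numbers, reflects the study.

```python
# Hypothetical sketch of the chi-square comparison of CD24 staining
# intensity across the three tumor types. The counts are illustrative.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: tumor origin; columns: number of sections with CD24 intensity
# score 0-1, score 2, and score 3.
table = np.array([
    [1, 3, 10],  # U-CH1  - mostly intense, uniform staining
    [2, 6, 5],   # DVC-4  - more than half score 2-3
    [8, 4, 1],   # U-CH2b - mostly modest staining
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```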
Abstract:
BACKGROUND: Primary care, an essential determinant of health system equity, efficiency, and effectiveness, is threatened by inadequate supply and distribution of the provider workforce. The Veterans Health Administration (VHA) has been a frontrunner in the use of nurse practitioners (NPs) and physician assistants (PAs). Evaluation of the roles and impact of NPs and PAs in the VHA is critical to ensuring optimal care for veterans and may inform best practices for use of PAs and NPs in other settings around the world. The purpose of this study was to characterize the use of NPs and PAs in VHA primary care and to examine whether their patients and patient care activities were, on average, less medically complex than those of physicians. METHODS: This is a retrospective cross-sectional analysis of administrative data from VHA primary care encounters between 2005 and 2010. Patient and patient encounter characteristics were compared across provider types (PA, NP, and physician). RESULTS: NPs and PAs attend about 30% of all VHA primary care encounters. NPs, PAs, and physicians fill similar roles in VHA primary care, but patients of PAs and NPs are slightly less complex than those of physicians, and PAs attend a higher proportion of visits for the purpose of determining eligibility for benefits. CONCLUSIONS: This study demonstrates that a highly successful nationwide primary care system relies on NPs and PAs to provide over one quarter of primary care visits, and that these visits are similar to those of physicians with regard to patient and encounter characteristics. These findings can inform health workforce solutions to physician shortages in the USA and around the world. Future research should compare the quality and costs associated with various combinations of providers and allocations of patient care work, and should elucidate the approaches that maximize quality and efficiency.
Abstract:
BACKGROUND: Little is known about the constraints of optimizing health care for prostate cancer survivors in Alaska primary care. OBJECTIVE: To describe the experiences and attitudes of primary care providers within the Alaska Tribal Health System (ATHS) regarding the care of prostate cancer survivors. DESIGN: In late October 2011, we emailed a 22-item electronic survey to 268 ATHS primary care providers regarding the frequency of Prostate Specific Antigen (PSA) monitoring for a hypothetical prostate cancer survivor; who should be responsible for the patient's life-long prostate cancer surveillance; who should support the patient's emotional and medical needs as a survivor; and providers' level of comfort addressing recurrence monitoring, erectile dysfunction, urinary incontinence, androgen deprivation therapy, and emotional needs. We used simple logistic regression to examine the association between provider characteristics and their responses to the survivorship survey items. RESULTS: Of 221 individuals who were successfully contacted, a total of 114 responded (52% response rate). Most ATHS providers indicated they would order a PSA test every 12 months (69%) and believed that, ideally, the hypothetical patient's primary care provider should be responsible for his life-long prostate cancer surveillance (60%). Most providers reported feeling either "moderately" or "very" comfortable addressing topics such as prostate cancer recurrence (59%), erectile dysfunction (64%), urinary incontinence (63%), and emotional needs (61%) with prostate cancer survivors. These results varied somewhat by provider characteristics including female sex, years in practice, and the number of prostate cancer survivors seen in their practice. CONCLUSIONS: These data suggest that most primary care providers in Alaska are poised to assume the care of prostate cancer survivors locally. However, we also found that large minorities of providers do not feel confident in their ability to manage common issues in prostate cancer survivorship, implying that continued access to specialists with more expert knowledge would be beneficial.
Abstract:
Grafts can be rejected even when matched for MHC because of differences in the minor histocompatibility Ags (mH-Ags). H4- and H60-derived epitopes are known as immunodominant mH-Ags in H2(b)-compatible BALB.B to C57BL/6 transplantation settings. Although multiple explanations have been proposed for the immunodominance of Ags, the role of vascularization of the graft is yet to be determined. In this study, we used heart (vascularized) and skin (nonvascularized) transplantations to determine the role of primary vascularization of the graft. A higher IFN-γ response toward H60 peptide occurred in heart recipients. In contrast, a higher IFN-γ response was generated against H4 peptide in skin transplant recipients. Peptide-loaded tetramer staining revealed a distinct antigenic hierarchy between heart and skin transplantation: H60-specific CD8(+) T cells were the most abundant after heart transplantation, whereas H4-specific CD8(+) T cells were more abundant after skin grafting. Neither the tissue-specific distribution of mH-Ags nor the draining lymph node-derived dendritic cells correlated with the observed immunodominance. Interestingly, non-primarily vascularized cardiac allografts mimicked skin grafts in the observed immunodominance, and H60 immunodominance was observed in primarily vascularized skin grafts. However, T cell depletion from the BALB.B donor prior to cardiac allografting induced H4 immunodominance in the vascularized cardiac allograft. Collectively, our data suggest that immediate transmigration of donor T cells via primary vascularization is responsible for the immunodominance of H60 mH-Ag in organ and tissue transplantation.
Abstract:
BACKGROUND: Early preparation for renal replacement therapy (RRT) is recommended for patients with advanced chronic kidney disease (CKD), yet many patients initiate RRT urgently and/or are inadequately prepared. METHODS: We conducted audio-recorded, qualitative, directed telephone interviews of nephrology health care providers (n = 10, nephrologists, physician assistants, and nurses) and primary care physicians (PCPs, n = 4) to identify modifiable challenges to optimal RRT preparation to inform future interventions. We recruited providers from public safety-net hospital-based and community-based nephrology and primary care practices. We asked providers open-ended questions to assess their perceived challenges and their views on the role of PCPs and nephrologist-PCP collaboration in patients' RRT preparation. Two independent and trained abstractors coded transcribed audio-recorded interviews and identified major themes. RESULTS: Nephrology providers identified several factors contributing to patients' suboptimal RRT preparation, including health system resources (e.g., limited time for preparation, referral process delays, and poorly integrated nephrology and primary care), provider skills (e.g., their difficulty explaining CKD to patients), and patient attitudes and cultural differences (e.g., their poor understanding and acceptance of their CKD and its treatment options, their low perceived urgency for RRT preparation; their negative perceptions about RRT, lack of trust, or language differences). PCPs desired more involvement in preparation to ensure RRT transitions could be as "smooth as possible", including providing patients with emotional support, helping patients weigh RRT options, and affirming nephrologist recommendations. Both nephrology providers and PCPs desired improved collaboration, including better information exchange and delineation of roles during the RRT preparation process. CONCLUSIONS: Nephrology and primary care providers identified health system resources, provider skills, and patient attitudes and cultural differences as challenges to patients' optimal RRT preparation. Interventions to improve these factors may improve patients' preparation and initiation of optimal RRTs.
Abstract:
The purpose of this study was to identify preoperative predictors of discharge destination after total joint arthroplasty. A retrospective study of three hundred and seventy-two consecutive patients who underwent primary total hip and knee arthroplasty was performed. The mean length of stay was 2.9 days and 29.0% of patients were discharged to extended care facilities. Age, caregiver support at home, and patient expectation of discharge destination were the only significant multivariable predictors regardless of the type of surgery (total knee versus total hip arthroplasty). Among those variables, patient expectation was the most important predictor (P < 0.001; OR 169.53). The study was adequately powered to analyze the variables in the multivariable logistic regression model, which had a high concordance index of 0.969.
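A minimal sketch of this kind of analysis, assuming invented variable names and data, is below: a multivariable logistic regression for discharge destination, with odds ratios and a concordance index computed from the fitted model.

```python
# Hypothetical sketch of a multivariable logistic regression for
# discharge destination and its concordance index (c-index, which for
# a binary outcome equals the AUC). Names and data are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

tja = pd.read_csv("tja_patients.csv")  # hypothetical columns:
#   discharged_ecf   - 1 if discharged to an extended care facility
#   age              - years
#   caregiver_home   - 1 if caregiver support is available at home
#   expects_facility - 1 if the patient expected facility discharge

model = smf.logit(
    "discharged_ecf ~ age + caregiver_home + expects_facility", data=tja
).fit()
print(np.exp(model.params).round(2))  # odds ratios per predictor

# Concordance index: probability the model ranks a facility-discharge
# patient above a home-discharge patient.
c_index = roc_auc_score(tja["discharged_ecf"], model.predict(tja))
print(f"c-index = {c_index:.3f}")
```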
Abstract:
OBJECTIVES: Identification of patient subpopulations susceptible to develop myocardial infarction (MI) or, conversely, those displaying either intrinsic cardioprotective phenotypes or highly responsive to protective interventions remain high-priority knowledge gaps. We sought to identify novel common genetic variants associated with perioperative MI in patients undergoing coronary artery bypass grafting using genome-wide association methodology. SETTING: 107 secondary and tertiary cardiac surgery centres across the USA. PARTICIPANTS: We conducted a stage I genome-wide association study (GWAS) in 1433 ethnically diverse patients of both genders (112 cases/1321 controls) from the Genetics of Myocardial Adverse Outcomes and Graft Failure (GeneMAGIC) study, and a stage II analysis in an expanded population of 2055 patients (225 cases/1830 controls) combined from the GeneMAGIC and Duke Perioperative Genetics and Safety Outcomes (PEGASUS) studies. Patients undergoing primary non-emergent coronary bypass grafting were included. PRIMARY AND SECONDARY OUTCOME MEASURES: The primary outcome variable was perioperative MI, defined as creatine kinase MB isoenzyme (CK-MB) values ≥10× upper limit of normal during the first postoperative day, and not attributable to preoperative MI. Secondary outcomes included postoperative CK-MB as a quantitative trait, or a dichotomised phenotype based on extreme quartiles of the CK-MB distribution. RESULTS: Following quality control and adjustment for clinical covariates, we identified 521 single nucleotide polymorphisms in the stage I GWAS analysis. Among these, 8 common variants in 3 genes or intergenic regions met p < 10^-5 in stage II. A secondary analysis using CK-MB as a quantitative trait (minimum p = 1.26×10^-3 for rs609418), or a dichotomised phenotype based on extreme CK-MB values (minimum p = 7.72×10^-6 for rs4834703) supported these findings. Pathway analysis revealed that genes harbouring top-scoring variants cluster in pathways of biological relevance to extracellular matrix remodelling, endoplasmic reticulum-to-Golgi transport and inflammation. CONCLUSIONS: Using a two-stage GWAS and pathway analysis, we identified and prioritised several potential susceptibility loci for perioperative MI.
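In outline, a stage-wise association scan of this kind regresses the MI phenotype on each variant's genotype dosage alongside clinical covariates and retains variants passing the stated threshold. The sketch below is a generic single-threaded illustration with invented file and column names, not the GeneMAGIC/PEGASUS pipeline.

```python
# Hypothetical per-SNP association scan: logistic regression of
# perioperative MI on genotype dosage with clinical covariates, keeping
# variants below the stage threshold quoted in the abstract.
import pandas as pd
import statsmodels.api as sm

pheno = pd.read_csv("pheno.csv")   # columns: mi (0/1), age, sex (0/1), ...
geno = pd.read_csv("dosages.csv")  # one column of 0/1/2 dosages per SNP

covars = sm.add_constant(pheno[["age", "sex"]])
hits = {}
for snp in geno.columns:
    X = covars.assign(dosage=geno[snp].values)
    fit = sm.Logit(pheno["mi"], X).fit(disp=0)
    p = fit.pvalues["dosage"]
    if p < 1e-5:  # stage II threshold from the abstract
        hits[snp] = p

print(sorted(hits.items(), key=lambda kv: kv[1]))
```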
Abstract:
The correlation between diet and dental topography is of importance to paleontologists seeking to diagnose ecological adaptations in extinct taxa. Although the subject is well represented in the literature, few studies directly compare methods or evaluate dietary signals conveyed by both upper and lower molars. Here, we address this gap in our knowledge by comparing the efficacy of three measures of functional morphology for classifying an ecologically diverse sample of thirteen medium- to large-bodied platyrrhines by diet category (e.g., folivore, frugivore, hard object feeder). We used Shearing Quotient (SQ), an index derived from linear measurements of molar cutting edges, and two indices of crown surface topography, Occlusal Relief (OR) and Relief Index (RFI). Using SQ, OR, and RFI, individuals were then classified by dietary category using Discriminant Function Analysis. Both upper and lower molar variables produce high classification rates in assigning individuals to diet categories, but lower molars are consistently more successful. SQs yield the highest classification rates. RFI and OR generally perform above chance. Upper molar RFI has a success rate below the level of chance. Adding molar length enhances the discriminatory power for all variables. We conclude that upper molar SQs are useful for dietary reconstruction, especially when combined with body size information. Additionally, we find that among our sample of platyrrhines, SQ remains the strongest predictor of diet, while RFI is less useful at signaling dietary differences in the absence of body size information. The study demonstrates new ways of inferring the diets of extinct platyrrhine primates when both upper and lower molars are available, or for taxa known only from upper molars. The techniques are useful in reconstructing diet in stem representatives of the anthropoid clade, who share key aspects of molar morphology with extant platyrrhines.
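A minimal sketch of the classification step, assuming invented column names and a generic 5-fold cross-validation rather than the paper's exact protocol, might look like this:

```python
# Hypothetical sketch of the Discriminant Function Analysis step:
# classifying specimens into diet categories from molar shape indices.
# Feature and label names are illustrative.
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

teeth = pd.read_csv("platyrrhine_molars.csv")  # hypothetical columns:
#   sq, or_idx, rfi - Shearing Quotient, Occlusal Relief, Relief Index
#   molar_length    - proxy for body size
#   diet            - folivore / frugivore / hard-object feeder

X = teeth[["sq", "or_idx", "rfi", "molar_length"]]
y = teeth["diet"]

lda = LinearDiscriminantAnalysis()
# 5-fold cross-validated classification rate.
rate = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated classification rate: {rate:.2f}")
```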