967 results for RECIPIENTS
Abstract:
This study examined relationships among variables in the Pre-International Baccalaureate (Pre-IB) Program admissions criteria and Pre-IB Program course grades to discriminate between recipient and non-recipient groups of the International Baccalaureate (IB) Diploma award. The study involved a multiracial sample of 142 IB Diploma graduates between 1992 and 1996 from one IB magnet school, located within an urban high school with a predominantly Black student enrollment. A discriminant function analysis found that the highest correlations between predictors and the discriminant function were for 9th- and 10th-grade mathematics and 10th-grade science course grades. Ninth-grade science grades; 9th- and 10th-grade grades in English, foreign language, and social studies; and 7th-grade Iowa Tests of Basic Skills (ITBS) Reading Comprehension scores were also highly correlated with the discriminant function. The seventh-grade ITBS Battery score, the Vocabulary, Total Language, Total Work-Study, and Total Mathematics subscores, and a seventh-grade grade point average from language arts, social studies, science, and mathematics were not highly correlated with the discriminant function. Recommendations were presented in the areas of curriculum and instruction, guidance services, student mentoring, and decision-making processes that would parallel the IB examination procedure, thereby improving the alignment of the IB Program and enabling more students to become recipients of the IB Diploma award.
Women of the Year award recipients, from left to right: Cicely Tyson, Pearl Bailey, and Maya Angelou
Abstract:
General note: Title and date provided by Bettye Lane.
Abstract:
Purpose: Given the ageing UK population and the high prevalence of activity-limiting illness and disability among the over-65s, demand for domiciliary eye care services is set to grow significantly. Over 400,000 NHS domiciliary eye examinations are conducted each year, yet minimal research attention has been directed to this mode of practice or to patient needs in this group. The study aimed to compare clinical characteristics and the benefits of cataract surgery between conventional in-practice patients and domiciliary service users. Methods: Clinical characteristics were compared between patients in North-West England receiving NHS domiciliary eye care services (n = 197; median age 76.5 years) and an age-matched group of conventional in-practice patients (n = 107; median age 74.6 years). Data including reason for visit; logMAR uncorrected and best corrected distance (UDVA and CDVA) and near acuities (UNVA and CNVA); presence of ocular pathology; and examination outcome were documented retrospectively. To compare the benefit of cataract surgery in terms of functional capacity between the patient groups, individuals undergoing routine referral for first-eye surgery completed the VF-14 questionnaire pre-operatively and at 6 weeks post-operatively. Results: UDVA was similar between the two groups (median 0.48 and 0.50 logMAR in the domiciliary and practice groups, P = 0.916); CDVA was significantly worse in the domiciliary group (median 0.18 vs 0.08 logMAR, P < 0.001), who were more likely to have clinically significant cataract. Both groups showed similar improvements in VF-14 scores following cataract surgery (mean gains 24.4 ± 11.7 and 31.5 ± 14.7 points in the in-practice and domiciliary groups, respectively; P = 0.312). Conclusions: Patients receiving domiciliary eye care services are more likely to have poorer corrected vision than in-practice patients of a similar age, partly due to a higher prevalence of significant cataract.
Despite limitations in their activities due to illness and disability, domiciliary patients experience similar gains in self-reported functional capacity following cataract surgery.
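The logMAR acuities quoted above map directly onto Snellen fractions: logMAR is the base-10 logarithm of the minimum angle of resolution, so a 6/18 letter subtends three times the 6/6 angle. A minimal sketch of the conversion (the function name and example values are illustrative, not taken from the study):

```python
import math

def snellen_to_logmar(test_distance, letter_size):
    """Convert a Snellen fraction (test_distance/letter_size, e.g. 6/18)
    to logMAR: log10 of the minimum angle of resolution."""
    return math.log10(letter_size / test_distance)

# 6/6 vision -> 0.00 logMAR; 6/18 -> ~0.48 logMAR,
# close to the median UDVA reported for both groups above.
```

Note that higher logMAR means worse acuity, which is why the domiciliary group's median CDVA of 0.18 is poorer than the in-practice group's 0.08.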
Abstract:
To explore the phenotype and function of NK cells in kidney transplant recipients, we investigated the peripheral NK cell repertoire, the capacity to respond to various stimuli, and the impact of immunosuppressive drugs on NK cell activity in kidney transplant recipients. CD56(dim) NK cells of kidney transplant patients displayed an activated phenotype characterized by significantly decreased surface expression of CD16 (p=0.0003), CD226 (p<0.0001) and CD161 (p=0.0139), and simultaneously increased expression of activation markers such as HLA-DR (p=0.0011) and CD25 (p=0.0015). Upon in vitro stimulation via Ca++-dependent signals, down-modulation of CD16 was associated with induction of interferon (IFN)-gamma expression. CD16 modulation and secretion of NFAT-dependent cytokines such as IFN-gamma, TNF-alpha, IL-10 and IL-31 were significantly suppressed by treatment of isolated NK cells with calcineurin inhibitors but not with mTOR inhibitors. In kidney transplant recipients, IFN-gamma production was retained in response to HLA class I-negative target cells and to non-specific stimuli. However, secretion of other cytokines such as IL-13, IL-17, IL-22 and IL-31 was significantly reduced compared to healthy donors. In contrast to the suppression of cytokine expression at the transcriptional level, cytotoxin release (i.e., perforin and granzyme A/B) was not affected by immunosuppression in vitro or in vivo, in patients as well as in healthy donors. Thus, immunosuppressive treatment affects NK cell function at the level of NFAT-dependent gene expression, whereby calcineurin inhibitors primarily impair cytokine secretion while mTOR inhibitors have only marginal effects. Taken together, NK cells may serve as indicators of immunosuppression and may facilitate personalized adjustment of immunosuppressive medication in kidney transplant recipients.
Abstract:
Duodeno-gastroesophageal reflux aspiration is associated with chronic lung allograft dysfunction (CLAD) and involves aspiration of bile acids (BA), functional molecules of the gastrointestinal tract with emulsifying properties. While links between reflux aspiration and lung disease have been identified, the relevance of bile acids as molecular ligands and outcome predictors remains poorly defined. We sought to determine and quantify the BA species present in the airways of lung transplant recipients, to better understand how aspirated BA contribute to post-transplantation outcomes, and to investigate their molecular effects on airway function and contractility.
Abstract:
Objective: Liver transplantation has been associated with a high prevalence of osteoporosis, although most data rely on single-center studies with limited sample sizes, most of them dating back to the late 1990s and early 2000s. The present thesis aims to assess the prevalence of fragility fractures and contributing factors in a large modern cohort of liver transplant recipients managed in a referral Italian Liver Transplant Center. Design and Methods: Paper and electronic medical records of 429 consecutive patients receiving liver transplantation from 1/1/2010 to 31/12/2015 were reviewed, and 366 patients were selected. Clinically obtained electronic radiological images within 6 months of the date of liver transplant surgery, such as lateral views of spine X-rays or abdominal CT scans, were opportunistically reviewed in a blinded fashion to screen for morphometric vertebral fractures. Clinical fragility fractures reported in the medical records, along with information on the etiology of cirrhosis and biochemistries at the time of liver surgery, were also recorded. Results: The prevalence of fragility fractures in the whole cohort was 155/366 (42.3%), with no significant differences between sexes. Of patients with fractures, most sustained vertebral fractures (145/155, 93.5%), the majority of which were mild or moderate wedges. Multiple vertebral fractures were common (41.3%). Fracture rates were similar across different etiologies of cirrhosis and were also comparable in patients with diabetes or exposed to glucocorticoids. Kidney function was significantly worse in women with fractures. Independent of age, sex, alcohol use, eGFR and etiology of liver disease, lower BMI was the only independent risk factor for fractures (adjusted OR 1.058, 95% CI 1.001-1.118, P=0.046) in this study population. Conclusions: A considerable fracture burden was shown in a large and modern cohort of liver transplant recipients.
Given the remarkably high prevalence of fractures, metabolic bone disease screening should be implemented for every patient awaiting liver transplantation.
Abstract:
Background and Aim: Acute cardiac rejection is currently diagnosed by endomyocardial biopsy (EMB), but multiparametric cardiac magnetic resonance (CMR) may be a non-invasive alternative owing to its capacity to characterize myocardial structure and function. Our primary aim was to determine the utility of multiparametric CMR in identifying acute graft rejection in paediatric heart transplant recipients. Our secondary aim was to compare textural features of parametric maps in cases with rejection versus those without. Methods: Fifteen patients were prospectively enrolled for contrast-enhanced CMR followed by EMB and right heart catheterization. Images were acquired on a 1.5 Tesla scanner, including T1 mapping (modified Look-Locker inversion recovery sequence, MOLLI) and T2 mapping (modified GraSE sequence). The extracellular volume (ECV) was calculated using pre- and post-gadolinium T1 times of blood and myocardium and the patient's hematocrit. Markers of graft dysfunction, including hemodynamic measurements from echocardiography, catheterization and CMR, were collated. Patients were divided into two groups based on the degree of rejection at EMB: no rejection with no change in treatment (Group A) and acute rejection requiring new therapy (Group B). Statistical analysis included Student's t test and Pearson correlation. Results: Acute rejection was diagnosed in five patients. Mean T1 values were significantly associated with acute rejection. A monotonic, increasing trend was noted in both mean and peak T1 values with increasing degree of rejection. ECV was significantly higher in Group B. There was no difference in T2 signal between the two groups. Conclusion: Multiparametric CMR serves as a noninvasive screening tool during surveillance encounters and may be used to identify patients at higher risk of rejection who therefore require further evaluation.
Future multicenter studies are necessary to confirm these results and to explore whether multiparametric CMR can decrease the number of surveillance EMBs in paediatric heart transplant recipients.
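The ECV computation described in the methods follows the standard hematocrit-corrected T1 relationship: ECV = (1 − hematocrit) × (ΔR1 of myocardium / ΔR1 of blood), where R1 = 1/T1 and Δ is the post- minus pre-gadolinium change. A minimal sketch (function and variable names are illustrative, not from the study):

```python
def extracellular_volume(t1_myo_pre, t1_myo_post,
                         t1_blood_pre, t1_blood_post, hematocrit):
    """ECV fraction from native (pre) and post-gadolinium T1 times (ms)
    of myocardium and blood pool, corrected by the hematocrit."""
    delta_r1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre        # change in myocardial relaxation rate
    delta_r1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre  # change in blood-pool relaxation rate
    return (1.0 - hematocrit) * (delta_r1_myo / delta_r1_blood)

# Hypothetical T1 times: myocardium 1000 -> 500 ms, blood 1600 -> 400 ms,
# hematocrit 0.40 -> ECV = 0.32 (32%).
```

A higher ECV reflects expansion of the extracellular space (edema or fibrosis), which is consistent with the higher ECV observed in the rejection group.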
Abstract:
Cyclosporine, a drug with a narrow therapeutic index used in immunosuppression protocols for hematopoietic stem cell transplantation, may cause various adverse reactions, including nephrotoxicity, with a direct clinical impact on the patient. This study aims to summarize the available evidence in the scientific literature on the use of cyclosporine with respect to its role as a risk factor for the development of nephrotoxicity in patients undergoing hematopoietic stem cell transplantation. A systematic review was conducted using the following electronic databases: PubMed, Web of Science, Embase, Scopus, CINAHL, LILACS, SciELO and Cochrane BVS. The keywords used were: bone marrow transplantation OR stem cell transplantation OR grafting, bone marrow AND cyclosporine OR cyclosporin OR risk factors AND acute kidney injury OR acute kidney injuries OR acute renal failure OR acute renal failures OR nephrotoxicity. The level of scientific evidence of the studies was classified according to the Oxford Centre for Evidence-Based Medicine. The final sample comprised 19 studies, most of which (89.5%) had an observational design and evidence level 2B, and pointed to an incidence of nephrotoxicity above 30%. The available evidence, considered to be of good quality and appropriate for the analyzed event, indicates that cyclosporine represents a risk factor for the occurrence of nephrotoxicity, particularly when combined with amphotericin B or aminoglycosides, agents commonly used in hematopoietic stem cell transplantation recipients.
Abstract:
Herpesvirus reactivation is common after liver transplantation. We analyzed the presence of cytomegalovirus (HCMV) and human herpesvirus-6 (HHV-6) DNA in liver donor biopsies, seeking to better understand issues involving donor human leukocyte antigens (HLA)-A, -B and -DR, as well as correlations with acute cellular rejection. Fifty-nine liver transplantation patients were investigated for the presence of HCMV and HHV-6 DNA in liver donor biopsies using the nested-PCR technique. Clinical donor information and HLA matches were obtained from the São Paulo State Transplant System, and the recipients' records regarding acute cellular rejection were studied. Seven (11.8%) biopsies were positive for HCMV DNA and 29 (49%) were positive for HHV-6 DNA. Of 14 donors with HLA-DR 15, nine had HHV-6 DNA-positive liver biopsies, with a tendency toward a significant association (p=0.09); 22 recipients developed acute cellular rejection, of whom 9/22 were positive for HLA-DR 15 (p=0.03; χ²=4.51), which was statistically significant in univariate analysis and showed a tendency after multivariate analysis (p=0.08). HHV-6 DNA was prevalent in the liver donors studied, as was HLA-DR 15. These findings suggest that patients whose liver donor biopsies carry HLA-DR 15 develop more rejection after liver transplantation.
Abstract:
Reports of long-term tenofovir disoproxil fumarate (TDF) treatment in HIV-infected adolescents are limited. We present final results from the open-label (OL) TDF extension following the randomized, placebo (PBO)-controlled, double-blind phase of GS-US-104-0321 (Study 321). HIV-infected 12- to 17-year-olds treated with TDF 300 mg or PBO plus an optimized background regimen (OBR) for 24-48 weeks subsequently received OL TDF plus OBR in a single-arm study extension. HIV-1 RNA and safety, including bone mineral density (BMD), were assessed in all TDF recipients. Eighty-one subjects received TDF (median duration 96 weeks). No subject died or discontinued OL TDF for safety/tolerability reasons. At week 144, the proportions with HIV-1 RNA <50 copies/mL were 30.4% (7 of 23 subjects with baseline HIV-1 RNA >1000 c/mL initially randomized to TDF), 41.7% (5 of 12 subjects with HIV-1 RNA <1000 c/mL who switched from PBO to TDF) and 0% (0 of 2 subjects who failed randomized PBO plus OBR with HIV-1 RNA >1000 c/mL and switched from PBO to TDF). Viral resistance to TDF occurred in 1 subject. At week 144, the median decrease in estimated glomerular filtration rate was 38.1 mL/min/1.73 m² (n = 25). Increases in median spine (+12.70%, n = 26) and total body less head BMD (+4.32%, n = 26) and in height-age adjusted Z-scores (n = 21; +0.457 for spine, +0.152 for total body less head) were observed at week 144. Five of 81 subjects (6%) had persistent >4% BMD decreases from baseline. Some subjects had virologic responses to TDF plus OBR, and TDF resistance was rare. TDF was well tolerated and can be considered for the treatment of HIV-infected adolescents.
Abstract:
BACKGROUND: The model for end-stage liver disease (MELD) was developed to predict short-term mortality in patients with cirrhosis. There are few reports studying the correlation between MELD and long-term posttransplantation survival. AIM: To assess the value of pretransplant MELD in the prediction of posttransplant survival. METHODS: Adult patients (age >18 years) who underwent liver transplantation were examined in a retrospective longitudinal cohort drawn from a prospectively maintained database. We excluded acute liver failure, retransplantation and reduced-size or split livers. Liver donors were evaluated according to: age, sex, weight, creatinine, bilirubin, sodium, aspartate aminotransferase, personal antecedents, cause of brain death, steatosis, expanded-criteria donor number and index donor risk. Recipient data were: sex, age, weight, chronic hepatic disease, Child-Turcotte-Pugh points, pretransplant and initial MELD score, pretransplant creatinine clearance, sodium, cold and warm ischemia times, hospital length of stay, blood requirements, and alanine aminotransferase (ALT >1,000 IU/L = liver dysfunction). The Kaplan-Meier method with the log-rank test was used for univariable analyses of posttransplant patient survival. For the multivariable analyses, the Cox proportional hazards regression method with the stepwise procedure was used, stratifying by sodium and MELD. ROC curves were used to define the area under the curve for MELD and Child-Turcotte-Pugh. RESULTS: A total of 232 patients with 10 years of follow-up were available. The MELD cutoff was 20 and the Child-Turcotte-Pugh cutoff was 11.5. For MELD score > 20, the risk factors for death were: red cell requirements, liver dysfunction and donor's sodium. For patients with hyponatremia, the risk factors were: negative delta-MELD score, red cell requirements, liver dysfunction and donor's sodium.
Univariate regression analyses identified the following risk factors for death: MELD score > 25, blood requirements, pretransplant recipient creatinine clearance and donor age >50. After stepwise analysis, only red cell requirement remained predictive. Patients with a MELD score < 25 had 68.86%, 50.44% and 41.50% chances of 1-, 5- and 10-year survival, and those with a score > 25 had 39.13%, 29.81% and 22.36%, respectively. Patients without hyponatremia had 65.16%, 50.28% and 41.98%, and those with hyponatremia 44.44%, 34.28% and 28.57%, respectively. Patients with an index donor risk (IDR) > 1.7 showed 53.7%, 27.71% and 13.85%, and those with IDR <1.7 showed 63.62%, 51.4% and 44.08%, respectively. Donor age > 50 years showed 38.4%, 26.21% and 13.1%, and donor age <50 years showed 65.58%, 26.21% and 13.1%. Association with the delta-MELD score did not show any significant difference. Expanded-criteria donors were associated with primary non-function and severe liver dysfunction. Predictive factors for death were blood requirements, hyponatremia, liver dysfunction and donor's sodium. CONCLUSION: MELD over 25, recipient hyponatremia, blood requirements and donor's sodium were associated with poor survival.
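The MELD score underlying the cutoffs above is computed from three laboratory values. A sketch using the widely cited UNOS formulation (the coefficients and lab-value bounds follow the common convention and are not taken from this study):

```python
import math

def meld_score(bilirubin_mg_dl, inr, creatinine_mg_dl):
    """Classic (pre-MELD-Na) UNOS MELD score from serum bilirubin,
    INR and serum creatinine."""
    # UNOS convention: lab values below 1.0 are raised to 1.0 so the
    # logarithms stay non-negative; creatinine is capped at 4.0 mg/dL
    # (and set to 4.0 for patients on dialysis).
    b = max(bilirubin_mg_dl, 1.0)
    i = max(inr, 1.0)
    c = min(max(creatinine_mg_dl, 1.0), 4.0)
    score = 3.78 * math.log(b) + 11.2 * math.log(i) + 9.57 * math.log(c) + 6.43
    return round(score)

# Normal labs (1.0 / 1.0 / 1.0) give the floor score of 6;
# bilirubin 3.0, INR 2.0, creatinine 2.0 give a score of 25,
# the high-risk threshold identified in this study.
```

Because all three terms are logarithmic, the score is far more sensitive to changes near normal lab values than at the extremes, which is one reason fixed cutoffs such as 20 or 25 are used for risk stratification.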
Abstract:
The objective of this work was to evaluate the effects of environment, container and substrate composition on biomass production of passion fruit (Passiflora edulis Sims f. flavicarpa Deg.) seedlings in the Pantanal region from September to November 2006. Experimental trials were conducted in four protected environments, two types of containers and three substrate compositions. The environments were: A1 (greenhouse covered with low-density, 150-micron-thick polyethylene film), A2 (screened enclosure of black monofilament mesh providing 50% shade), A3 (screened enclosure of aluminized mesh providing 50% shade) and A4 (environment covered with straw of native coconut palm). The containers were: polyethylene bags (R1) (15 x 25 cm) and polystyrene trays (R2) (with 72 cells). The substrates were: S1 (soil + organic compost + vermiculite, 1:1:1 v/v), S2 (soil + organic compost + sawdust, 1:1:1 v/v) and S3 (soil + organic compost + vermiculite + sawdust, 1:1:1/2:1/2 v/v). The experiment used a completely randomized design analyzed as a split-split-plot, with fifteen replications: environments in the plots, containers in the subplots and substrates in the sub-subplots (4 x 2 x 3 = 24 treatments). Fresh and dry mass of the aerial part and the root system were evaluated. Screened environments gave better results for yellow passion fruit seedling biomass in polyethylene bags. Polyethylene bags promoted higher biomass. The substrate with vermiculite gave better results in both container types. The substrate with a higher percentage of sawdust gave the worst result.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Prosthetic restorations that have been tried in the patient's mouth are potential sources of infection. In order to avoid cross-infection, protocols for infection control should be established in the dental office and laboratory. This study evaluated the antimicrobial efficacy of disinfectants on full metal crowns contaminated with microorganisms. Full crowns cast in a Ni-Cr alloy were assigned to one control group (n=6) and 5 experimental groups (n=18). The crowns were placed in flat-bottom glass balloons and autoclaved. A microbial suspension of each strain (S. aureus, P. aeruginosa, S. mutans, E. faecalis and C. albicans) was aseptically added to each experimental group, and the crowns were allowed to become contaminated for 30 min. The contaminated specimens were placed into containers with the chemical disinfectants (1% and 2% sodium hypochlorite and 2% glutaraldehyde) for 5, 10 and 15 min. Thereafter, the crowns were placed into tubes containing different broths and incubated at 35°C. The control specimens were contaminated, immersed in distilled water for 20 min and cultured in thioglycollate broth at 35°C. The microbial growth assay was performed by qualitative visual examination after 48 h, 7 days and 12 days. Microbial growth was noticed only in the control group. In the experimental groups, turbidity of the broths was not observed, regardless of strain or immersion interval, indicating the absence of microbial growth. In conclusion, all chemical disinfectants were effective in preventing microbial growth on full metal crowns.
Abstract:
Because health needs are broader than the available resources, choices have to be made. It follows that limits, criteria and parameters must be established to prioritize what will be offered, and to whom health services and care will be provided. Ethical alternatives for the prioritization and rationing of health care are discussed, focusing on the principles of equity and social utility.