940 results for Corneal transplant
Abstract:
SUMMARY - The current challenge for Public Health is to ensure the financial sustainability of the health system. In a setting of scarce resources, economic analyses applied to health care delivery contribute to decision-making aimed at maximizing social welfare subject to a budget constraint. Portugal is a country of 10.6 million inhabitants (2011) with a high incidence and prevalence of stage 5 chronic kidney disease (CKD5): 234 and 1,600 patients per million population (pmp), respectively. The growth of diseases underlying the causes of CKD, namely diabetes mellitus and arterial hypertension, points to an increasing number of patients. In 2011, of the 17,553 patients on renal replacement therapy, 59% were on hemodialysis (HD) in out-of-hospital dialysis centers, 37% were living with a functioning kidney graft, and 4% were on peritoneal dialysis (SPN, 2011). The active waiting list for kidney transplantation (Tx) comprised 2,500 patients (SPN, 2009). Kidney Tx is the best treatment modality owing to improved survival, quality of life, and cost-effectiveness, but eligibility for Tx and the supply of organs constrain this option. This research had two aims: i) to determine the incremental cost-utility ratio of kidney Tx compared with HD; ii) to assess the maximum capacity of cadaveric donors in Portugal, the characteristics and causes of death of potential donors nationally, by hospital and by Gabinete Coordenador de Colheita e Transplantação (GCCT), and to analyze the performance of the organ procurement network for Tx. We conducted an observational/non-interventional, prospective, analytical study of a cohort of HD patients who underwent kidney Tx. Minimum follow-up was one year and maximum three years. At baseline, sociodemographic and clinical data were collected from 386 HD patients eligible for kidney Tx.
Health-related quality of life (HRQoL) was assessed in patients on HD (time 0) and in transplant recipients at three, six, and 12 months, and annually thereafter. Patients who returned to HD after graft failure were included. HRQoL was measured with a population preference-based instrument, the EuroQol-5D, which allows subsequent calculation of QALYs. In a group of 82 patients, HRQoL on HD was assessed at two time points, allowing analysis of its evolution. A cost-utility analysis of kidney Tx compared with HD was performed from the societal perspective. Direct medical and non-medical costs and productivity changes in HD and kidney Tx were identified. The costs of organ procurement, selection of kidney Tx candidates, and follow-up of living donors were included. Each transplanted patient served as his or her own control on dialysis. The mean annual cost on chronic HD was assessed for the year preceding kidney Tx. Tx costs were assessed prospectively. The time horizon was the life cycle in both modalities. Discount rates of 0%, 3%, and 5% were applied to costs and QALYs, and one-way sensitivity analyses were performed. Between 2008 and 2010, 65 patients underwent kidney Tx. Health outcomes, including hospital admissions and adverse effects of immunosuppression, and health resource use were recorded prospectively. Repeated-measures models were used to assess the evolution of HRQoL, and multiple regression models to analyze the association of HRQoL and transplant costs with patients' baseline characteristics and clinical events. Compared with HD, utility improved by the third month after Tx, and quality of life measured with the EQ-VAS scale improved at all observation times after kidney Tx. The mean cost of HD was €32,567.57, assumed uniform over time.
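The discounting step described above (rates of 0%, 3%, and 5% applied to both costs and QALYs) can be sketched in a few lines. The cost figures below are illustrative stand-ins, not the study's actual streams, and the convention that year 0 is undiscounted is an assumption:

```python
def present_value(annual_stream, rate):
    """Discount an annual stream of costs or QALYs; year 0 is undiscounted."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(annual_stream))

# Illustrative three-year transplant cost profile (EUR), not study data.
costs = [60_210.09, 12_956.77, 12_956.77]
pv_0 = present_value(costs, 0.00)  # 0% rate leaves the stream unchanged
pv_3 = present_value(costs, 0.03)
pv_5 = present_value(costs, 0.05)
```

Higher discount rates shrink the present value of later years, which is why the study reports results under all three rates as a sensitivity check.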
The mean cost of kidney Tx was €60,210.09 in the first year and €12,956.77 in subsequent years. The cost-utility ratio of kidney Tx vs. chronic HD was €2,004.75/QALY. Beyond a graft survival of two years and five months, Tx was cost saving. Using national Diagnosis-Related Groups data, a retrospective study was carried out covering deaths that occurred in 34 hospitals with organ procurement in 2006. A potential donor was defined as an individual aged 1-70 years who died in hospital and met suitability criteria for kidney donation. The association of potential donors with population and hospital characteristics was analyzed. The performance of organ procurement organizations was assessed by the conversion rate (ratio of potential to effective donors) and by the number of potential donors per million population at the national and regional level and by Gabinete Coordenador de Colheita e Transplantação (GCCT). We identified 3,838 potential donors, of whom 608 had International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) codes that most frequently progress to brain death. A logit model for grouped data identified age, the ratio of intensive care unit beds to acute-care beds, the existence of a GCCT and of a transplant unit, and mortality from occupational accidents as predictors of converting a potential donor into an effective one, and the model's estimates were used to quantify the probability of that conversion. Organ donation must be assumed as a priority, and health authorities must ensure the funding of hospitals with donation programs, avoiding the waste of organs for transplantation as a scarce public good.
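The grouped-data logit model above turns each predictor into a conversion probability through the logistic link. A minimal sketch of that link follows; the coefficients are purely hypothetical, since the abstract does not report the estimated values:

```python
import math

def conversion_probability(intercept, coefs, covariates):
    """P(potential donor becomes effective) under a logit link:
    p = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    eta = intercept + sum(b * x for b, x in zip(coefs, covariates))
    return 1.0 / (1.0 + math.exp(-eta))

# Hypothetical example: a positive coefficient on the ICU-to-acute-beds
# ratio means a better-equipped hospital has a higher predicted conversion.
p_low = conversion_probability(-1.5, [2.0], [0.10])
p_high = conversion_probability(-1.5, [2.0], [0.30])
```

With an intercept of 0 and no covariates the link returns 0.5, the model's neutral point; positive linear predictors push the probability toward 1.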
Organ procurement should be regarded as a strategic option of hospital activity, oriented toward organizing and planning services that maximize the conversion of potential donors into effective ones, including that criterion as a measure of the quality and effectiveness of hospital performance. The results of this study show that: 1) kidney Tx provides health gains, longer survival, better quality of life, and cost savings; 2) in Portugal, the maximum achievable rate of conversion of potential cadaveric donors into effective donors is far from being reached. Investment in the organ procurement network for Tx is essential to ensure financial sustainability and to promote the quality, efficiency, and equity of the health care provided in CKD5.
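The reported break-even point can be reproduced from the figures in the abstract (mean annual HD cost of €32,567.57; transplant cost of €60,210.09 in year 1 and €12,956.77 per year thereafter). A sketch, assuming costs accrue linearly within each year:

```python
HD_ANNUAL = 32_567.57  # mean annual cost of chronic hemodialysis (EUR)
TX_YEAR1 = 60_210.09   # mean transplant cost in the first year (EUR)
TX_LATER = 12_956.77   # mean transplant cost per subsequent year (EUR)

# Cumulative costs are equal when:
#   TX_YEAR1 + (t - 1) * TX_LATER == t * HD_ANNUAL
break_even_years = (TX_YEAR1 - TX_LATER) / (HD_ANNUAL - TX_LATER)
years, months = int(break_even_years), round((break_even_years % 1) * 12)
print(f"Transplant becomes cost saving after {years} years and {months} months")
```

The result, roughly 2.41 years, matches the two years and five months of graft survival stated in the abstract.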
Abstract:
Dengue is an arboviral disease that ranges from an asymptomatic presentation to a more severe disease characterized by a vascular leakage syndrome in which abdominal pain is a major symptom. Transplant recipients are immunosuppressed and are less likely to develop the severe form of the disease because of a reduction in the immune-mediated responses that trigger plasma extravasation. Herein, we report two cases of severe dengue in the early postoperative period of two kidney transplant recipients. Given the severity of these cases, we emphasize the importance of dengue screening immediately before transplantation in areas endemic for the disease.
Abstract:
Glutamine is the most abundant amino acid in the blood and plays a key role in the response of the small intestine to systemic injuries. Mucosal atrophy is an important phenomenon that occurs in some types of clinical injury, such as states of severe undernutrition. Glutamine has been shown to exert powerful trophic effects on the gastrointestinal mucosa after small bowel resection or transplant, radiation injury, surgical trauma, ischemic injury, and administration of cytotoxic drugs. Since no study had been performed in malnourished animals, we examined whether glutamine exerts a trophic effect on the intestinal mucosa of the malnourished growing rat. Thirty-five growing female rats (aged 21 days) were divided into 4 groups: control - chow diet; malnutrition diet; malnutrition + chow diet; and malnutrition + glutamine-enriched chow diet (2%). For the first 15 days of the experiment, animals in the test groups received a malnutrition diet, a lactose-enriched diet designed to induce diarrhea and malnutrition. For the next 15 days, these animals received either the lactose-enriched diet, a regular chow diet, or a glutamine-enriched chow diet. After 30 days, the animals were weighed and sacrificed, and a section of the jejunum was taken and prepared for histological examination. All the animals had similar weights on day 1 of the experiment, and feeding with the lactose-enriched diet promoted a significant decrease in body weight in comparison to the control group. Feeding with both experimental chow-based diets promoted significant body weight gains, although the glutamine-enriched diet was more effective. RESULTS: The morphological and morphometric analyses demonstrated that small intestinal villous height was significantly decreased in the malnourished group, and this change was partially corrected by the two types of chow-based diet.
Crypt depth was significantly increased by malnutrition, and this parameter was partially corrected by the two types of chow-based diet. The glutamine-enriched diet resulted in the greatest reduction of crypt depth, and this reduction was also statistically significant when compared with control animals. CONCLUSIONS: Enteral glutamine has some positive effects on body weight gain and trophism of the jejunal mucosa in the malnourished growing rat.
Abstract:
PURPOSE: To study the indications and results of tacrolimus as rescue therapy for acute cellular or chronic rejection in liver transplantation. PATIENTS AND METHODS: Eighteen liver transplant recipients who underwent rescue therapy with tacrolimus between March 1995 and August 1999 were retrospectively studied. The treatment indication, patient, and graft status were recorded as of October 31st, 1999. Response to tacrolimus was defined as patient survival with a functional graft and histological reversal of acute cellular rejection or, for chronic rejection, serum bilirubin levels decreasing to no more than twice the upper normal limit. RESULTS: Fourteen cases (77.8%) presented a good response. The response rates for the different indications were: (1) acute cellular rejection + sepsis - 0/1 case; (2) recurrent acute cellular rejection - 1/1 case; (3) OKT3-resistant acute cellular rejection - 2/2 cases; (4) steroid-resistant acute cellular rejection + active viral infection - 3/3 cases; (5) chronic rejection - 8/11 cases (72.7% response rate). The 4 patients who did not respond died. CONCLUSION: Tacrolimus rescue therapy was successful in most cases of acute cellular and chronic rejection in liver transplantation.
Abstract:
Liver transplantation is now the standard treatment for end-stage liver disease. Given the shortage of liver donors and the progressively higher number of patients waiting for transplantation, improvements in patient selection and optimization of timing for transplantation are needed. Several solutions have been suggested, including increasing the donor pool; a fair policy for allocation, not permitting variables such as age, gender, and race, or third-party payer status to play any role; and knowledge of the natural history of each liver disease for which transplantation is offered. To observe ethical rules and distributive justice (guarantee to every citizen the same opportunity to get an organ), the "sickest first" policy must be used. Studies have demonstrated that death has no relationship with waiting time, but rather with the severity of liver disease at the time of inclusion. Thus, waiting time is no longer part of the United Network for Organ Sharing distribution criteria. Waiting time only differentiates between equally severely diseased patients. The authors have analyzed the waiting list mortality and 1-year survival for patients of the State of São Paulo, from July 1997 through January 2001. Only the chronological criterion was used. According to "Secretaria de Estado da Saúde de São Paulo" data, among all waiting list deaths, 82.2% occurred within the first year, and 37.6% within the first 3 months following inclusion. The allocation of livers based on waiting time is neither fair nor ethical, impairs distributive justice and human rights, and does not occur in any other part of the world.
Abstract:
The thymus is the central organ responsible for the generation of T lymphocytes (1). Various diseases cause the thymus to produce insufficient T cells, which can lead to immunosuppression (2). Since T cells are essential for protection against pathogens, it is crucial to promote de novo differentiation of T cells in diseased individuals. The available clinical solutions are: 1) a protocol involving the transplant of thymic stroma from unrelated children, applicable only to athymic children (3); 2) for patients with severe peripheral T cell depletion and reduced thymic activity, the administration of molecules stimulating the activity of the endogenous thymus (4). A scaffold (CellFoam) was suggested to support thymus regeneration in vivo (5), although this research was discontinued. Herein, we propose an innovative strategy to generate a bioartificial thymus. We use a polycaprolactone nanofiber mesh (PCL-NFM) seeded and cultured with human thymic epithelial cells (hTECs). The cells were obtained from infant thymus collected during pediatric cardiothoracic surgeries. We report new data on the isolation and characterization of those cells and their interaction with PCL-NFM, obtained by expanding hTECs to relevant numbers and by optimizing cell seeding methods.
Abstract:
Purpose. To analyze dry eye disease (DED) tests and their consistency in similar nonsymptomatic population samples living in two geographic locations with different climates (Continental vs. Atlantic). Methods. This is a pilot study including 14 nonsymptomatic residents from Valladolid (Continental climate, Spain) and 14 sex-matched and similarly aged residents from Braga (Atlantic climate, Portugal); they were assessed during the same season (spring) of two consecutive years. Phenol red thread test, conjunctival hyperemia, fluorescein tear breakup time, corneal and conjunctival staining, and Schirmer test were evaluated on three different consecutive visits. Reliability was assessed using the intraclass correlation coefficient and weighted kappa (κ) coefficient for quantitative and ordinal variables, respectively. Results. Fourteen subjects were recruited in each city with a mean (±SD) age of 63.0 (±1.7) and 59.1 (±0.9) years (p = 0.08) in Valladolid and Braga, respectively. Intraclass correlation coefficient and κ values of the tests performed were below 0.69 and 0.61, respectively, for both samples, thus showing moderate to poor reliability. Subsequently, comparisons were made between the results corresponding to the middle and higher outdoor relative humidity (RH) visit in each location, as there were no differences in mean temperature (p ≥ 0.75) despite RH values significantly differing (p ≤ 0.005). Significant (p ≤ 0.05) differences were observed between Valladolid and Braga samples on tear breakup time (middle RH visit, 2.76 ± 0.60 vs. 5.26 ± 0.64 seconds; higher RH visit, 2.61 ± 0.32 vs. 5.78 ± 0.88 seconds) and corneal (middle RH, 0.64 ± 0.17 vs. 0.14 ± 0.10; higher RH, 0.60 ± 0.22 vs. 0.0 ± 0.0) and conjunctival staining (middle RH, 0.61 ± 0.17 vs. 0.14 ± 0.08; higher RH, 0.57 ± 0.15 vs. 0.18 ± 0.09). Conclusions.
This pilot study provides initial evidence that DED test outcomes assessing ocular surface integrity and tear stability are climate dependent. Future large-sample studies should confirm these outcomes, also in DED patients. This knowledge is fundamental for multicenter clinical trials. The lack of consistency in diagnostic clinical tests for DED was also corroborated. (Optom Vis Sci 2015;92:e284-e289)
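Reliability in this study was summarized with the intraclass correlation coefficient and the weighted kappa (κ). As an illustration of the latter, here is a minimal linear-weighted κ for two repeated ordinal gradings (e.g., staining grades across visits); a generic sketch, not the authors' code:

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_cat, weights="linear"):
    """Linear- (or quadratic-) weighted kappa for two sets of ordinal gradings."""
    obs = np.zeros((n_cat, n_cat))
    for a, b in zip(rater_a, rater_b):
        obs[a, b] += 1
    obs /= obs.sum()                                  # observed proportions
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # chance-expected proportions
    i, j = np.indices((n_cat, n_cat))
    dist = np.abs(i - j) / (n_cat - 1)                # normalized category distance
    w = dist if weights == "linear" else dist ** 2    # disagreement weights
    return 1.0 - (w * obs).sum() / (w * exp).sum()
```

Perfect repeat gradings give κ = 1, chance-level agreement gives κ near 0, and the values below 0.61 reported above indicate moderate to poor visit-to-visit consistency.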
Abstract:
Purpose: To evaluate how soft lens power affects rigid gas-permeable (RGP) lens power and visual acuity (VA) in piggyback fittings for keratoconus. Methods: Sixteen keratoconus subjects (30 eyes) were included in the study. Piggyback contact lens fittings combining Senofilcon-A soft lenses of −6.00, −3.00, +3.00 and +6.00 D with Rose K2 RGP contact lenses were performed. Corneal topography was taken on the naked eye and over each soft contact lens before fitting RGP lenses. Mean central keratometry, over-refraction, RGP back optic zone radius (BOZR), estimated final power, and VA were recorded and analyzed. Results: In comparison to the naked eye, the mean central keratometry flattened with both negative lens powers (p < 0.05 in all cases), did not change with the +3.00 soft lens power (p = 1.0), and steepened with the +6.00 soft lens power (p = 0.02). RGP over-refraction did not change significantly between different soft lens powers (all p > 0.05). The RGP BOZR decreased significantly with both positive powers in comparison with both negative soft lens powers (all p < 0.001), but no significant differences were found among negative or positive powers separately (both p > 0.05). Estimated RGP final power increased significantly with positive in comparison with negative lens powers (all p < 0.001), but no significant differences were found among negative or positive lens powers separately (both p > 0.05). Visual acuity did not change significantly between the different soft lens powers assessed (all p > 0.05). Conclusion: The use of negative-powered soft lenses in piggyback fitting reduces RGP lens power without impacting VA in keratoconus subjects.
Abstract:
METHODS: Refractive lens exchange was performed with implantation of an AT Lisa 839M (trifocal) or 909MP (bifocal toric) IOL, the latter if corneal astigmatism was more than 0.75 diopter (D). The postoperative visual and refractive outcomes were evaluated. A prototype light-distortion analyzer was used to quantify the postoperative light-distortion indices. A control group of eyes in which a Tecnis ZCB00 1-piece monofocal IOL was implanted had the same examinations. RESULTS: A trifocal or bifocal toric IOL was implanted in 66 eyes. The control IOL was implanted in 18 eyes. All 3 groups obtained a significant improvement in uncorrected distance visual acuity (UDVA) (P < .001) and corrected distance visual acuity (CDVA) (P = .001). The mean uncorrected near visual acuity (UNVA) was 0.123 logMAR with the trifocal IOL and 0.130 logMAR with the bifocal toric IOL. The residual refractive cylinder was less than 1.00 D in 86.7% of cases with the toric IOL. The mean light-distortion index was significantly higher in the multifocal IOL groups than in the monofocal group (P < .001), although no correlation was found between the light-distortion index and CDVA. CONCLUSIONS: The multifocal IOLs provided excellent UDVA and functional UNVA despite increased light-distortion indices. The light-distortion analyzer reliably quantified a subjective component of vision distinct from visual acuity; it may become a useful adjunct in the evaluation of visual quality obtained with multifocal IOLs.
Abstract:
Objectives: To evaluate neophyte contact lens wearers' adaptation to rigid gas-permeable (RGP) contact lenses in terms of wearing time, tear volume, stability, corneal staining, and subjective ratings over a 1-month period. Methods: Twenty-two young healthy subjects were enrolled to wear RGP lenses on a daily wear basis. The participants had never worn contact lenses and scored under 10 on the McMonnies Questionnaire. The Contact Lens Dry Eye Questionnaire, Visual Analog Scales, Schirmer test, tear film break-up time (BUT), and corneal staining grading were performed. Follow-up visits were scheduled at 1, 7, 15, and 28 days. Results: Six subjects dropped out of the study due to discomfort before 1 month (27% discontinuation rate). Successful RGP wearers (16 participants) achieved high levels of subjective vision and reported comfort scores of approximately 9 of 10 between 10 and 15 days. They reported wearing their lenses for an average of 10.12 ± 2.43 hr after 1 month of wear. Conversely, unsuccessful wearers discontinued wearing the lenses after the first 10 to 15 days, showing comfort scores and wearing time significantly lower compared with the first day of wear. The Schirmer test showed a significant increase at 10 days (P<0.001), and BUT decreased after the first week of wear in the unsuccessful group. Conclusions: Symptomatology related to dryness and discomfort, detected during the first 10 days of adaptation, may help the clinician predict those participants who will potentially fail to adapt to RGP lens wear.
Abstract:
Purpose: Higher myopic refractive errors are associated with serious ocular complications that can put visual function at risk. There is consequent interest in slowing and, if possible, stopping myopia progression before it reaches a level associated with increased risk of secondary pathology. The purpose of this report was to review our understanding of the rationale(s) and success of contact lenses (CLs) used to reduce myopia progression. Methods: A review commenced by searching the PubMed database. The inclusion criteria stipulated publications of clinical trials evaluating the efficacy of CLs in regulating myopia progression, based on the primary endpoint of changes in axial length measurements, and published in peer-reviewed journals. Other publications from conference proceedings or patents were exceptionally considered when no peer-reviewed articles were available. Results: The mechanisms that presently support myopia regulation with CLs are based on changing relative peripheral defocus and changing the foveal image quality signal to potentially interfere with the accommodative system. Ten clinical trials addressing myopia regulation with CLs were reviewed, including corneal refractive therapy (orthokeratology), peripheral gradient lenses, and bifocal (dual-focus) and multifocal lenses. Conclusions: CLs were reported to be well accepted, consistent, and safe methods to address myopia regulation in children. Corneal refractive therapy (orthokeratology) is so far the method with the largest demonstrated efficacy in myopia regulation across different ethnic groups. However, factors such as patient convenience, the degree of initial myopia, and non-CL treatments should also be considered.
The combination of different strategies (i.e., central defocus, peripheral defocus, spectral filters, pharmaceutical delivery, and active lens-borne illumination) in a single device will present further testable hypotheses exploring how different mechanisms can reinforce or compete with each other to improve or reduce myopia regulation with CLs.
Abstract:
Master's dissertation in Advanced Optometry
Abstract:
Purpose: To determine the relationship of goblet cell density (GCD) with tear function and ocular surface physiology. Methods: This was a cross-sectional study conducted in 35 asymptomatic subjects with a mean age of 23.8±3.6 years. Tear film assessment and conjunctival and corneal examination were performed in each subject. Conjunctival impression cytology was performed by applying a nitrocellulose Millipore MF-Membrane filter over the superior bulbar conjunctiva. The filter paper was then fixed with 96% ethanol and stained with Periodic Acid Schiff, Hematoxylin and Eosin. GCD was determined by optical microscopy. The relationship between GCD and Schirmer score, tear break-up time (TBUT), bulbar redness, limbal redness, and corneal staining was determined. Results: The mean GCD was 151±122 cells/mm2. GCD was higher in eyes with higher Schirmer scores, but the association was not significant (p = 0.75). There was a significant relationship of GCD with TBUT (p = 0.042). GCD was not correlated with bulbar redness (p = 0.126), limbal redness (p = 0.054), or corneal staining (p = 0.079). No relationship of GCD with subjects' age or gender was observed (p > 0.05). Conclusion: GCD correlated with TBUT, but no significant correlation was found with the aqueous portion of the tear, limbal or bulbar redness, or corneal staining.
Abstract:
Integrated master's dissertation in Biomedical Engineering (specialization in Medical Informatics)
Abstract:
PURPOSE: To determine the indications for and the incidence and evolution of temporary and permanent pacemaker implantation in cardiac transplant recipients. METHODS: A retrospective review of 114 patients who underwent orthotopic heart transplantation at InCor (Heart Institute, USP, Brazil) between March 1985 and May 1993. We studied the incidence of and indications for temporary pacing, the relationship between pacing and rejection, the need for permanent pacing, and the clinical follow-up. RESULTS: Fourteen of 114 (12%) heart transplant recipients required temporary pacing and 4 of 114 (3.5%) required permanent pacing. The indication for temporary pacing was sinus node dysfunction in 11 patients (78.5%) and atrioventricular (AV) block in 3 patients (21.4%). The indication for permanent pacemaker implantation was sinus node dysfunction in 3 patients (75%) and AV block in 1 patient (25%). We observed rejection in 3 patients (21.4%) who required temporary pacing and in 2 patients (50%) who required permanent pacing. Previous use of amiodarone was observed in 10 patients (71.4%) with temporary pacing. Seven of the 14 patients (50%) died during follow-up. CONCLUSION: Sinus node dysfunction was the principal indication for temporary and permanent pacemaker implantation in cardiac transplant recipients. The need for pacing was associated with a worse prognosis after cardiac transplantation.