926 results for transplantation hépatique
Abstract:
First-year follow-up after heart transplantation requires invasive tests. Although patients used to be hospitalized for this purpose, ambulatory invasive procedures now make outpatient follow-up possible. The feasibility and safety of this strategy are unknown. From 2007 we transitioned to outpatient follow-up. We retrospectively reviewed the clinical course of the outpatient group (2007 to 2014) and an inpatient group (2000–2006). Baseline characteristics, hospital stay, infections, rejection episodes and vascular complications were evaluated. In total, 87 patients had Inpatient Follow-up (IF) and 98 Outpatient Follow-up (OF). Baseline characteristics were similar, with significant differences in immunosuppression (tacrolimus IF 44.8% vs. OF 90.8%, and mycophenolate IF 86.2% vs. OF 100%, both p values < 0.001) and age (IF 52 ± 11.5 years vs. OF 56.1 ± 11 years, p = 0.016). More clinical visits were performed in the OF group (IF 10 vs. OF 13, p < 0.001), while hospital stay was shorter (IF 23 days vs. OF 3 days, p < 0.001). The rates of infection, rejection and vascular complications were similar. No difference was found in 1-year mortality (IF 2.3% vs. OF 1.0%, p = 0.60). Outpatient follow-up during the first year after cardiac transplantation appears feasible and safe in terms of infection, rejection, vascular complications and mortality.
Abstract:
The shortage of donor hearts for patients with end-stage heart failure has accelerated the development of ventricular assist devices (VADs) that act as a replacement heart. Pulsatile, axial and centrifugal mechanical devices have been proposed. Recent clinical developments indicate that centrifugal devices are not only beneficial for bridge-to-transplantation applications, but may also aid myocardial recovery. A recent study showed that patients who received a VAD had extended lives and improved quality of life compared with recipients of drug therapy. Unfortunately, 25% of these patients develop right heart failure syndrome, sepsis and multi-organ failure, and it has been reported that 17% of patients initially receiving a left ventricular assist device (LVAD) later required a right ventricular assist device (RVAD). Hence, current research focuses on the development of a bi-ventricular assist device (BVAD). Current BVAD technology is either too bulky or requires the implantation of two pumps working independently; the latter needs a separate controller for each pump, leading to the potential complication of uneven flow dynamics and a large demand for body space. This paper illustrates the combination of the LVAD and RVAD as one complete device to augment the function of both the left and right cardiac chambers with double impellers. The proposed device has two impellers rotating in opposite directions, eliminating the need for body muscles or a tubing/heart connection to restrain the pump, and it has two separate chambers, each with an independently rotating impeller, for the left and right sides of the heart. A problem with centrifugal impellers is fluid stagnation underneath the impeller, which leads to thrombosis and blood clots. This paper presents the design, construction and location of a washout hole to prevent thrombus formation in a Bi-VAD centrifugal pump. Computational fluid dynamics (CFD) results are used to illustrate the superiority of our design concept in terms of preventing thrombus formation and hemolysis.
Abstract:
The ethics of creating ‘saviour siblings’ for the benefit of another has received much attention, but little consideration has been given to the legal position of those saviours born who may be asked to provide tissue for transplantation to another during childhood. This article examines the ethical issues surrounding minor donation as well as the existing legal framework in the UK and Australia that regulates minors providing tissue for the benefit of another. Against this background the position of minor saviours, who are called upon to donate bone marrow or peripheral blood stem cells, is examined. This analysis suggests that the law does not provide sufficient protection for minor saviours who are called upon to donate to another. It is argued that specific ethical obligations are owed to saviours—that ought to be reflected in the law—in order to protect them from exploitation while they remain minors.
Abstract:
Currently, well-established clinical therapeutic approaches for bone reconstruction are restricted to the transplantation of autografts and allografts, and the implantation of metal devices or ceramic-based implants to assist bone regeneration. Bone grafts possess osteoconductive and osteoinductive properties; however, they are limited in access and availability and associated with donor site morbidity, haemorrhage, risk of infection, insufficient transplant integration, graft devitalisation, and subsequent resorption resulting in decreased mechanical stability. As a result, recent research focuses on the development of alternative therapeutic concepts. The field of tissue engineering has emerged as an important approach to bone regeneration. However, bench-to-bedside translations are still infrequent, as the process towards approval by regulatory bodies is protracted and costly, requiring both comprehensive in vitro and in vivo studies. The resulting gap between research and clinical translation, and hence commercialization, is referred to as the ‘Valley of Death’ and describes the large number of projects and/or ventures that are ceased due to a lack of funding during the transition from product/technology development to regulatory approval and subsequent commercialization. One of the greatest difficulties in bridging the Valley of Death is to develop good manufacturing practice (GMP) processes and scalable designs and to apply these in pre-clinical studies. In this article, we describe part of the rationale and road map of how our multidisciplinary research team has approached the first steps of translating orthopaedic bone engineering from bench to bedside by establishing a pre-clinical ovine critical-sized tibial segmental bone defect model, and we discuss our preliminary data relating to this decisive step.
Abstract:
Heart disease is the leading cause of death in the world. Although this could be alleviated by heart transplantation, there is a chronic shortage of donor hearts and so mechanical solutions are being considered. Currently, many Ventricular Assist Devices (VADs) are being developed worldwide in an effort to increase life expectancy and quality of life for end-stage heart failure patients. Current pre-clinical testing methods for VADs involve laboratory testing using Mock Circulation Loops (MCLs), and in vivo testing in animal models. The research and development of highly accurate MCLs is vital to the continuous improvement of VAD performance. The first objective of this study was to develop and validate a mathematical model of an MCL. This model could then be used in the design and construction of a variable compliance chamber to improve the performance of an existing MCL as well as form the basis for a new miniaturised MCL. An extensive review of the literature was carried out on MCLs and mathematical modelling of their function. A mathematical model of an MCL was then created in the MATLAB/SIMULINK environment. This model included variable features such as resistance, fluid inertia and volumes (resulting from the pipe lengths and diameters); compliance of Windkessel chambers, atria and ventricles; density of both fluid and compressed air applied to the system; gravitational effects on vertical columns of fluid; and accurately modelled actuators controlling the ventricle contraction. This model was then validated using the physical properties and pressure and flow traces produced from a previously developed MCL. A variable compliance chamber was designed to reproduce parameters determined by the mathematical model. The variability was achieved by controlling the transmural pressure across a diaphragm to alter the compliance of the system. An initial prototype was tested in a previously developed MCL, and a variable level of arterial compliance was successfully produced; however, the complete range of compliance values required for accurate physiological representation could not be produced with this initial design. The mathematical model was then used to design a smaller physical mock circulation loop, with the tubing sizes adjusted to produce accurate pressure and flow traces whilst having an appropriate frequency response characteristic. The development of the mathematical model greatly assisted the general design of an in vitro cardiovascular device test rig, while the variable compliance chamber allowed simple, real-time manipulation of MCL compliance, enabling accurate transition between a variety of physiological conditions. The newly developed MCL provides an accurate mechanical representation of the human circulatory system for in vitro cardiovascular device testing and education purposes. The continued improvement of VAD test rigs is essential if VAD design is to improve, and hence improve quality of life and life expectancy for heart failure patients.
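The abstract above describes a lumped-parameter (Windkessel-type) mathematical model of a mock circulation loop built in MATLAB/SIMULINK. Purely as an illustration of that modelling approach, the short Python sketch below integrates a minimal two-element Windkessel: a single arterial compliance C and peripheral resistance R driven by a pulsatile inflow Q(t). All parameter values and the inflow() helper are hypothetical examples, not values or code from the study, which additionally models fluid inertia, venous and atrial compartments, gravity and ventricle actuation.

# Illustrative two-element Windkessel sketch (hypothetical parameter values,
# not taken from the study summarised above).
import numpy as np

R = 1.2               # peripheral resistance [mmHg*s/mL] (assumed example value)
C = 1.3               # arterial compliance [mL/mmHg] (assumed example value)
PERIOD = 1.0          # cardiac period [s], i.e. 60 beats/min
SYSTOLE = 0.35        # ejection duration [s]
STROKE_VOLUME = 70.0  # volume ejected per beat [mL]

def inflow(t):
    """Hypothetical half-sine ejection waveform Q(t) in mL/s."""
    phase = t % PERIOD
    if phase < SYSTOLE:
        # scaled so the integral over one beat equals STROKE_VOLUME
        return (np.pi * STROKE_VOLUME / (2.0 * SYSTOLE)) * np.sin(np.pi * phase / SYSTOLE)
    return 0.0

# Forward-Euler integration of the state equation C * dP/dt = Q(t) - P / R
dt = 1e-3
t = np.arange(0.0, 10.0, dt)
P = np.empty_like(t)
P[0] = 80.0  # initial arterial pressure [mmHg]
for i in range(1, len(t)):
    P[i] = P[i - 1] + dt * (inflow(t[i - 1]) - P[i - 1] / R) / C

last_beat = P[-int(PERIOD / dt):]
print(f"steady-state pressure swing ~ {last_beat.min():.0f}-{last_beat.max():.0f} mmHg")

A full MCL model of the kind described in the abstract would add further state equations of the same form for each compliant chamber, inertance terms for the pipe segments, and time-varying ventricular elastances, but the single-compartment sketch conveys the basic structure.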
Abstract:
Background: Interdialytic weight gain (IDWG) can be reduced by lowering the dialysate sodium concentration ([Na]) in haemodialysis patients. It has been assumed that this is because thirst is reduced, although this has been difficult to prove. We compared thirst patterns in stable haemodialysis patients with high and low IDWG using a novel technique and compared the effect of low sodium dialysis (LSD) with normal sodium dialysis (NSD). Methods: Eight patients with initial high IDWG and seven with low IDWG completed hourly visual analogue ratings of thirst using a modified palmtop computer during the dialysis day and the interdialytic day. The dialysate [Na] was progressively reduced by up to 5 mmol/l over five treatments. Dialysis continued at the lowest attained [Na] for 2 weeks and the measurements were repeated. The dialysate [Na] then returned to baseline and the process was repeated. Results: Baseline interdialytic day mean thirst was higher than the dialysis day mean for the high IDWG group (49.9±14.0 vs 36.2±16.6) and higher than the low weight gain group (49.9±14.0 vs 34.1±14.6). This trend persisted on LSD, but there was a pronounced increase in post-dialysis thirst scores for both groups (high IDWG: 46±13 vs 30±21; low IDWG: 48±24 vs 33±18). The high IDWG group demonstrated lower IDWG during LSD than NSD (2.23±0.98 vs 2.86±0.38 kg; P<0.05). Conclusions: Our results indicate that patients with high IDWG experience more intense feelings of thirst on the interdialytic day. LSD reduces their IDWG, but paradoxically increases thirst in the immediate post-dialysis period.
Abstract:
Background: Some dialysis patients fail to comply with their fluid restriction, causing problems due to volume overload. These patients sometimes blame excessive thirst. There has been little work in this area and no work documenting polydipsia among peritoneal dialysis (PD) patients. Methods: We measured motivation to drink and fluid consumption in 46 haemodialysis patients (HD), 39 PD patients and 42 healthy controls (HC) using a modified palmtop computer to collect visual analogue scores at hourly intervals. Results: Mean thirst scores were markedly depressed on the dialysis day (day 1) for HD (P<0.0001). The profile for day 2 was similar to that of HC. PD generated consistently higher scores than HD day 1 and HC (P = 0.01 vs. HC and P<0.0001 vs HD day 1). Reported mean daily water consumption was similar for HD and PD, with both significantly less than HC (P<0.001 for both). However, measured fluid losses were similar for PD and HC whilst HD were lower (P<0.001 for both), suggesting that the PD group may have underestimated their fluid intake. Conclusion: Our results indicate that HD causes a protracted period of reduced thirst but that this population's thirst perception is similar to HC on the interdialytic day despite a reduced fluid intake. In contrast, the PD group recorded high thirst scores throughout the day and were apparently less compliant with their fluid restriction. This is potentially important because the volume status of PD patients influences their survival.
Abstract:
Background: We have used serial visual analogue scores to demonstrate disturbances of the appetite profile in dialysis patients. This is potentially important as dialysis patients are prone to malnutrition yet have a lower nutrient intake than controls. Appetite disturbance may be influenced by accumulation of appetite inhibitors such as leptin and cholecystokinin (CCK) in dialysis patients. Methods: Fasting blood samples were drawn from 43 controls, 50 haemodialysis (HD) and 39 peritoneal dialysis (PD) patients to measure leptin and CCK. Hunger and fullness scores were derived from profiles compiled using hourly visual analogue scores. Nutrient intake was derived from 3-day dietary records. Results: Fasting CCK was elevated for PD (6.73 ± 4.42 ng/l vs control 4.99 ± 2.23 ng/l, P < 0.05; vs HD 4.43 ± 2.15 ng/l, P < 0.01). Fasting CCK correlated with the variability of the hunger (r = 0.426, P = 0.01) and fullness (r = 0.52, P = 0.002) scores for PD. There was a notable relationship with the increase in fullness after lunch for PD (r = 0.455, P = 0.006). When well-nourished PD patients were compared with their malnourished counterparts, CCK was higher in the malnourished group (P = 0.004). Leptin levels were higher for the dialysis patients than controls (HD and PD, P < 0.001), with pronounced hyperleptinaemia evident in some PD patients. Control leptin levels demonstrated correlation with fullness scores (e.g. peak fullness, r = 0.45, P = 0.007) but the dialysis patients did not. PD nutrient intake (energy and protein intake, r = -0.56, P < 0.0001) demonstrated significant negative correlation with leptin. Conclusion: Increased CCK levels appear to influence fullness and hunger perception in PD patients and thus may contribute to malnutrition. Leptin does not appear to affect perceived appetite in dialysis patients but it may influence nutrient intake in PD patients via central feeding centres.
Abstract:
Background: Malnutrition is common among dialysis patients and is associated with an adverse outcome. One cause of this is a persistent reduction in nutrient intake, suggesting an abnormality of appetite regulation. Methods: We used a novel technique to describe the appetite profile in 46 haemodialysis (HD) patients and 40 healthy controls. The Electronic Appetite Rating System (EARS) employs a palmtop computer to collect hourly ratings of motivation to eat and mood. We collected data on hunger, desire to eat, fullness, and tiredness. HD subjects were monitored on the dialysis day and the interdialytic day. Controls were monitored for 1 or 2 days. Results: Temporal profiles of motivation to eat for the controls were similar on both days. Temporal profiles of motivation to eat for the HD group were lower on the dialysis day. Mean HD scores were not significantly different from controls. Dietary records indicated that dialysis patients consumed less food than controls. Conclusions: Our data indicate that the EARS can be used to monitor subjective appetite states continuously in a group of HD patients. An HD session reduces hunger and desire to eat. Patients feel more tired after dialysis. This does not correlate with their hunger score, but does correlate with their fullness rating. Nutrient intake is reduced, suggesting a resetting of appetite control for the HD group. The EARS may be useful for intervention studies.
Abstract:
The aim of this study was to evaluate the healing of class III furcation defects following transplantation of autogenous periosteal cells combined with β-tricalcium phosphate (β-TCP). Periosteal cells obtained from Beagle dog periosteum explant cultures were inoculated onto the surface of β-TCP. Class III furcation defects were created in the mandibular premolars. Three experimental groups were used to test the defects' healing: in group A, β-TCP seeded with periosteal cells was transplanted into the defects; in group B, β-TCP alone was used for defect filling; and in group C, the defect was left without filling materials. Twelve weeks post-surgery, tissue samples were collected for histology, immunohistology and X-ray examination. Both the length of newly formed periodontal ligament and the area of newly formed alveolar bone in group A were significantly increased compared with groups B and C. Furthermore, the proportions of newly formed periodontal ligament and newly formed alveolar bone in group A were much higher than those of groups B and C. The quantity of cementum and its percentage in the defects in group A were also significantly higher than those of group C. These results indicate that autogenous periosteal cells combined with β-TCP can improve periodontal tissue regeneration in class III furcation defects.
Abstract:
The pore architecture of scaffolds is known to play a critical role in tissue engineering as it provides the vital framework for seeded cells to organize into a functioning tissue. In this report we have investigated the effects of different concentrations of silk fibroin protein on three-dimensional (3D) scaffold pore microstructure. Four pore size ranges of silk fibroin scaffolds were made by the freeze drying technique, with the pore sizes ranging from 50 to 300 μm. The pore sizes of the scaffolds decreased as the concentration of fibroin protein increased. Human bone marrow mesenchymal stromal cells (BMSC) transfected with the BMP7 gene were cultured in these scaffolds. A cell viability colorimetric assay, alkaline phosphatase assay and reverse transcription-polymerase chain reaction were performed to analyze the effect of pore size on cell growth, the secretion of extracellular matrix (ECM) and osteogenic differentiation. Cell migration in 3D scaffolds was confirmed by confocal microscopy. Calvarial defects in SCID mice were used to determine the bone forming ability of the silk fibroin scaffolds incorporating BMSC expressing BMP7. The results showed that BMSC expressing BMP7 preferred a pore size between 100 and 300 μm in silk fibroin protein fabricated scaffolds, with better cell proliferation and ECM production. Furthermore, in vivo transplantation of the silk fibroin scaffolds combined with BMSC expressing BMP7 induced new bone formation. This study has shown that an optimized pore architecture of silk fibroin scaffolds can modulate the bioactivity of BMP7-transfected BMSC in bone formation.
Abstract:
In order to effect permanent closure in burns patients suffering from full-thickness wounds, replacing their skin via split-thickness autografting is essential. Dermal substitutes in conjunction with widely meshed split-thickness autografts (+/- cultured keratinocytes) reduce scarring at the donor and recipient sites of burns patients by reducing demand for autologous skin (both surface area and thickness), without compromising dermal delivery at the wound face. Tissue engineered products such as Integra consist of a dermal template which is rapidly remodelled to form a neodermis, at which time the temporary silicone outer layer is removed and replaced with autologous split-thickness skin. Whilst provision of a thick tissue engineered dermis at full-thickness burn sites reduces scarring, it is hampered by delays in vascularisation, which result in clinical failure. The ultimate success of any skin graft product is dependent upon a number of basic factors, including adherence and haemostasis; in the case of viable tissue grafts, success is ultimately dependent upon restoration of a normal blood supply, and hence this study. Ultimately, the goal of this research is to improve the therapeutic properties of tissue replacements through impregnation with growth factors aimed at stimulating migration and proliferation of microvascular endothelial cells into the donor tissue post grafting. For the purpose of my masters, the aim was to evaluate the responsiveness of a dermal microvascular endothelial cell line to growth factors and haemostatic factors in the presence of the glycoprotein vitronectin. Vitronectin formed the backbone for my hypothesis and research due to its association with both epithelial and, more specifically, endothelial migration and proliferation. Early work using a platform technology referred to as VitroGro (Tissue Therapies Ltd), which comprises vitronectin-bound BP5/IGF-1, aided keratinocyte proliferation. I hypothesised that this result would translate to another epithelium, the endothelium. VitroGro had no effect on endothelial proliferation or migration. Vitronectin increases the presence of Fibroblast Growth Factor (FGF) and Vascular Endothelial Growth Factor (VEGF) receptors, enhancing cell responsiveness to their respective ligands. So, although Human Microvascular Endothelial Cell line 1 (HMEC-1) VEGF receptor expression is generally low, it was hypothesised that exposure to vitronectin would up-regulate this receptor. HMEC-1 migration, but not proliferation, was enhanced by vitronectin-bound VEGF, as well as by vitronectin-bound Epidermal Growth Factor (EGF), both of which could be used to stimulate microvascular endothelial cell migration for the purpose of transplantation. In addition to vitronectin's synergy with various growth factors, it has also been shown to play a role in haemostasis. Vitronectin binds thrombin-antithrombin III (TAT) to form a trimeric complex that takes on many of the attributes of vitronectin, such as heparin affinity, which results in its adherence to endothelium via heparan sulfate proteoglycans (HSP), followed by unaltered transcytosis through the endothelium, and ultimately its removal from the circulation. This has been documented as a mechanism designed to remove thrombin from the circulation. Equally, it could be argued that it is a mechanism for delivering vitronectin to the matrix.
My results show that matrix-bound vitronectin dramatically alters the effect that conformationally altered antithrombin III (cATIII) has on proliferation of microvascular endothelial cells. cATIII stimulates HMEC-1 proliferation in the presence of matrix-bound vitronectin, as opposed to inhibiting proliferation in its absence. Binding vitronectin to tissues and organs prior to transplant, in the presence of cATIII, will have a profound effect on microvascular infiltration of the graft, by preventing occlusion of existing vessels whilst stimulating migration and proliferation of endothelium within the tissue.