979 results for scoring rubrics


Abstract:

This paper provides guidance on how to address the 49 questions of the Australian Weed Risk Assessment (WRA) system. The WRA was developed in Australia in 1999 and has since been widely adapted for different regions. As interest in implementing the system and in comparing results has increased, consistency in answering and scoring the questions has become important. This guidance was therefore developed during the 2007 International WRA Workshop. Suggestions on search methods and data sources, together with examples, are also provided.
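The WRA aggregates the per-question scores into a single total that is mapped to an outcome; under the commonly cited decision rule, totals below 1 lead to acceptance, totals above 6 to rejection, and intermediate totals to further evaluation. A minimal sketch of this aggregation step, in which the per-question scores and the function name are illustrative, not taken from the paper:

```python
# Minimal sketch of WRA-style score aggregation (illustrative only).
# Thresholds follow the commonly cited decision rule: accept < 1,
# evaluate 1-6, reject > 6. Per-question scores are hypothetical.

def classify_wra(question_scores):
    """Sum per-question scores and map the total to an outcome."""
    total = sum(score for score in question_scores if score is not None)
    if total < 1:
        return total, "accept"
    if total > 6:
        return total, "reject"
    return total, "evaluate further"

# Example: scores for answered questions; None marks unanswered ones.
scores = [1, 0, 2, -1, None, 1, 2]
total, outcome = classify_wra(scores)
print(f"WRA score {total}: {outcome}")
```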

Abstract:

DArTseq technology is potentially the most appropriate system for discovering hundreds of polymorphic genomic loci, scoring thousands of unique genome-wide DNA fragments in a single experiment without requiring existing DNA sequence information. The DArT complexity-reduction approach was applied in combination with Illumina short-read sequencing (HiSeq 2000). To test the application of DArTseq technology in pineapple, a reference population of 13 Ananas genotypes, from primitive wild accessions to modern cultivars, was used. In a comparison of three systems, the combination of restriction enzymes PstI and MseI performed best, producing 18,900 DArT markers and close to 20,000 SNPs. Based on these markers, genetic relationships between the samples were identified and a dendrogram was generated. The topography of the tree corresponds with our understanding of the genetic relationships between the genotypes. Importantly, the replicated samples of all genotypes have a dissimilarity close to 0.0 and occupy the same positions on the tree, confirming the high reproducibility of the markers detected. The eventual aim is to identify molecular markers associated with resistance to Phytophthora cinnamomi (Pc), the most economically important pathogen of pineapple in Australia, as genetic resistance is known to exist within Ananas. Marker-assisted selection can then be utilized in a pineapple breeding program to develop cultivars resistant to Pc.
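The dendrogram step described here amounts to hierarchical clustering on a marker-based dissimilarity matrix. A minimal sketch, with hypothetical genotype names and presence/absence marker calls standing in for real DArTseq scores:

```python
# Sketch: genetic relationships from binary marker calls (hypothetical data).
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist

# Rows = genotypes, columns = presence/absence of markers.
genotypes = ["wild_A", "wild_B", "cultivar_1", "cultivar_1_rep"]
markers = np.array([
    [1, 0, 1, 1, 0, 1],
    [1, 1, 1, 0, 0, 1],
    [0, 1, 0, 1, 1, 0],
    [0, 1, 0, 1, 1, 0],   # replicate: dissimilarity ~0 to cultivar_1
])

# Dissimilarity = fraction of markers that differ between two genotypes.
dist = pdist(markers, metric="hamming")
tree = linkage(dist, method="average")  # UPGMA-style clustering
dendrogram(tree, labels=genotypes, no_plot=True)  # set no_plot=False to draw
```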

Abstract:

The aim of this thesis was to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed.

Automatic milking has become common practice in dairy husbandry; in 2006 about 4,000 farms worldwide used over 6,000 milking robots. There is a worldwide movement towards fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and growing labour costs. As the level of automation increases, the time the cattle keeper spends monitoring animals often decreases, which has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily.

Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals; these costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problems with locomotion scoring are that it requires experience to be conducted properly, it is labour-intensive as an on-farm method, and the results are subjective.

A four-balance system for measuring the leg load distribution of dairy cows during milking, in order to detect lameness, was developed and set up at the University of Helsinki research farm in Suitia. The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the numbers of kicks during milking were calculated. To develop an expert system that automatically detects lameness cases, a model was needed; a probabilistic neural network (PNN) classifier was chosen for the task. The data were divided into two parts, and 5,074 measurements from 37 cows were used to train the model. The model was evaluated on its ability to detect lameness in the validation dataset, which comprised 4,868 measurements from 36 cows. The model classified 96% of the measurements correctly as sound or lame cows, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support in a real-time lameness monitoring system.
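A probabilistic neural network is essentially a Parzen-window classifier: the density of each class at a new measurement is estimated by averaging Gaussian kernels centred on that class's training samples, and the class with the highest density wins. A minimal sketch with hypothetical per-milking features; the feature encoding and smoothing parameter are assumptions, not the thesis's actual model:

```python
# Sketch of a probabilistic neural network (Parzen-window) classifier,
# as might be applied to per-milking features such as leg-load asymmetry
# and kick counts. Data and the smoothing parameter are hypothetical.
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """Return the class whose kernel density estimate at x is largest."""
    x = np.asarray(x, dtype=float)
    scores = {}
    for label in np.unique(train_y):
        samples = train_X[train_y == label]
        # Gaussian kernel around each training sample of this class.
        sq_dist = np.sum((samples - x) ** 2, axis=1)
        scores[label] = np.mean(np.exp(-sq_dist / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

# Toy training data: [normalised leg-load asymmetry, kicks per milking].
train_X = np.array([[0.05, 0.0], [0.08, 1.0], [0.35, 3.0], [0.40, 4.0]])
train_y = np.array(["sound", "sound", "lame", "lame"])
print(pnn_classify([0.30, 2.0], train_X, train_y))  # -> "lame"
```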

Abstract:

Variation in the reaction of cereal cultivars to crown rot, caused by Fusarium spp. and in particular Fusarium pseudograminearum, was identified over 50 years ago; however, the parameters and pathways of infection by F. pseudograminearum remain poorly understood. Seedlings of wheat, barley and oat genotypes that differ in susceptibility to crown rot were inoculated with a mixture of F. pseudograminearum isolates. Seedlings were harvested from 7 to 42 days after inoculation, and expanded plant parts were rated for severity of visible disease symptoms. Individual leaf sheaths were placed onto nutrient media, and the fungal colonies emerging from the leaf sheaths were counted to estimate the degree of fungal spread within the host tissue. Significant differences in both the timing and the severity of disease symptoms were observed in the leaf sheath tissues of different host genotypes. Across all genotypes and plant parts examined, the development of visible symptoms correlated closely with the spread of the fungus into that tissue. The degree of infection of the coleoptile and sub-crown internode varied between genotypes but was unrelated to the putative resistance of the host. In contrast, leaf sheath tissues of the susceptible barley cv. Tallon and bread wheat cv. Puseas received higher disease ratings and consistently showed faster, earlier spread of the fungus into younger tissues than those of the oat cv. Cleanleaf or the wheat lines 2-49 and CPI 133814. While initial infections usually spread upwards from near the base of the first leaf sheath, the pathogen did not appear to invade younger leaf sheaths only from the base, but rather spread laterally from older leaf sheaths into younger, subtended leaf sheaths, particularly as disease progressed. Early in the infection of each leaf sheath, disease symptoms in the partially resistant genotypes were less severe than in susceptible genotypes; however, as infected leaf sheaths aged, differences between genotypes lessened as disease symptoms approached maximum values. Hence, while visual scoring of disease symptoms on leaf sheaths is a reliable comparative measure of the degree of fungal infection, differences between genotypes in the development of disease symptoms are more reliably assessed using the most recently expanded leaf sheaths.
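The reported agreement between visual symptom scores and fungal spread can be quantified as a simple correlation between per-sheath ratings and colony counts. A minimal sketch with entirely hypothetical values:

```python
# Sketch: correlating visual disease ratings with fungal colony counts
# per leaf sheath (all values hypothetical).
import numpy as np

ratings = np.array([0, 1, 1, 2, 3, 3, 4])     # visual symptom scores
colonies = np.array([0, 2, 3, 5, 8, 9, 12])   # colonies emerging on media

r = np.corrcoef(ratings, colonies)[0, 1]
print(f"Pearson r between symptoms and fungal spread: {r:.2f}")
```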

Abstract:

Background
People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can damage the skin, which is an important barrier protecting the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections.

Objectives
To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage.

Search methods
In June 2015 we searched: the Cochrane Wounds Specialised Register; the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE; and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting.

Selection criteria
All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections, in all patients in any healthcare setting.

Data collection and analysis
We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion and performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate, and otherwise synthesised data descriptively when heterogeneous.

Main results
We included five RCTs (2,277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval (2 to 5 days; control). In each study participants were followed up until the CVAD was removed or until discharge from ICU or hospital.
- Confirmed catheter-related bloodstream infection (CRBSI): one trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence).
- Suspected catheter-related bloodstream infection: two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence).
- All-cause mortality: three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all-cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence).
- Catheter-site infection: two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection. It is unclear whether there is a difference in the risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence).
- Skin damage: one small trial (112 children) and three trials (1,475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled).
- Pain: two studies involving 193 participants measured pain. It is unclear if there is a difference between long- and short-interval dressing changes in pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence).

Authors' conclusions
The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
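Risk ratios and 95% confidence intervals of the kind quoted above are derived from 2×2 event counts using the standard log-scale variance formula. A sketch with hypothetical counts, not the review's actual data:

```python
# Sketch: risk ratio with 95% CI from 2x2 counts, as underlies the
# figures quoted above. Event counts here are hypothetical.
import math

def risk_ratio_ci(events_int, n_int, events_ctl, n_ctl, z=1.96):
    """RR of intervention vs control with a Wald 95% CI on the log scale."""
    rr = (events_int / n_int) / (events_ctl / n_ctl)
    # Standard error of log(RR).
    se = math.sqrt(1/events_int - 1/n_int + 1/events_ctl - 1/n_ctl)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

rr, lo, hi = risk_ratio_ci(7, 498, 5, 497)  # hypothetical CRBSI counts
print(f"RR {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```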

Abstract:

This thesis presents methods for locating and analyzing cis-regulatory DNA elements involved in the regulation of gene expression in multicellular organisms. The regulation of gene expression is carried out by the combined effort of several transcription factor proteins collectively binding the DNA at cis-regulatory elements. Only sparse knowledge of the 'genetic code' of these elements exists today. An automatic tool for the discovery of putative cis-regulatory elements could aid their experimental analysis, resulting in a more detailed view of cis-regulatory element structure and function. We have developed a computational model for the evolutionary conservation of cis-regulatory elements. The elements are modeled as evolutionarily conserved clusters of sequence-specific transcription factor binding sites. We give an efficient dynamic programming algorithm that locates the putative cis-regulatory elements and scores them according to the conservation model. A notable proportion of the high-scoring DNA sequences show transcriptional enhancer activity in transgenic mouse embryos. The conservation model includes four parameters whose optimal values are estimated with simulated annealing. With good parameter values the model discriminates well between DNA sequences with evolutionarily conserved cis-regulatory elements and DNA sequences that have evolved neutrally. On further inquiry, the set of highest-scoring putative cis-regulatory elements was found to be sensitive to small variations in the parameter values. The statistical significance of the putative cis-regulatory elements is estimated with the two-component extreme value distribution. The p-values grade the conservation of the cis-regulatory elements above the neutral expectation. The parameters of the distribution are estimated by simulating neutral DNA evolution. The conservation of the transcription factor binding sites can be used in upstream analysis of regulatory interactions; this approach may provide mechanistic insight into transcription-level data from, e.g., microarray experiments. Here we give a method for predicting shared transcriptional regulators for a set of co-expressed genes. The EEL (Enhancer Element Locator) software implements the method for locating putative cis-regulatory elements, and facilitates both interactive use and distributed batch processing. We have used it to analyze the non-coding regions around all human genes with respect to the orthologous regions in various other species, including mouse. The data from these genome-wide analyses are stored in a relational database, which is used in publicly available web services for upstream analysis and visualization of the putative cis-regulatory elements in the human genome.
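The four model parameters are tuned with simulated annealing; a generic skeleton of that procedure is sketched below, with a stand-in objective function in place of the thesis's actual discrimination score:

```python
# Generic simulated-annealing skeleton of the kind used to tune the
# four conservation-model parameters. The objective is a stand-in; the
# real one scores discrimination between conserved elements and
# neutrally evolving sequence.
import math, random

def objective(params):
    # Placeholder objective with its optimum at (1, 2, 3, 4).
    return -sum((p - t) ** 2 for p, t in zip(params, (1, 2, 3, 4)))

def anneal(params, steps=10_000, t0=1.0):
    best = cur = list(params)
    best_val = cur_val = objective(cur)
    for i in range(steps):
        temp = t0 * (1 - i / steps) + 1e-9            # linear cooling
        cand = [p + random.gauss(0, 0.1) for p in cur]  # local perturbation
        cand_val = objective(cand)
        # Accept improvements always; worse moves with Boltzmann probability.
        if cand_val > cur_val or random.random() < math.exp((cand_val - cur_val) / temp):
            cur, cur_val = cand, cand_val
            if cur_val > best_val:
                best, best_val = cur, cur_val
    return best, best_val

print(anneal([0.0, 0.0, 0.0, 0.0]))
```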

Abstract:

This thesis presents a highly sensitive genome-wide search method for recessive mutations. The method is suitable for distantly related samples that are divided into phenotype positives and negatives. High-throughput genotyping arrays are used to identify homozygous regions and compare them between the cohorts. The method is demonstrated by comparing colorectal cancer patients against unaffected references; the objective is to find homozygous regions and alleles that are more common in cancer patients. We have designed and implemented software tools to automate the data analysis from genotypes to lists of candidate genes and their properties. The programs follow a pipeline architecture that allows their integration with other programs, such as biological databases and copy number analysis tools. The integration of the tools is crucial, as the genome-wide analysis of cohort differences produces many candidate regions not related to the studied phenotype. CohortComparator is a genotype comparison tool that detects homozygous regions and compares their loci and allele constitutions between two sets of samples. The data are visualised in chromosome-specific graphs illustrating the homozygous regions and alleles of each sample. The genomic regions that may harbour recessive mutations are emphasised with different colours, and a scoring scheme is given for these regions. The detection of homozygous regions, the cohort comparisons and the result annotations all rest on assumptions, many of which have been parameterized in our programs. The effects of these parameters and the suitable scope of the methods have been evaluated. Samples with different resolutions can be balanced with genotype estimates of their haplotypes and can then be used within the same study.
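At its core, the comparison relies on detecting runs of homozygous genotype calls in each sample before contrasting their frequency between cohorts. A minimal sketch with a hypothetical two-letter genotype encoding; the real tool's parameters and encoding may differ:

```python
# Sketch: detecting runs of homozygosity from genotype calls.
# Genotypes are encoded per marker as 'AA', 'AB', 'BB' (hypothetical).

def homozygous_runs(calls, min_len=3):
    """Yield (start, end) index pairs of runs of homozygous calls."""
    start = None
    for i, g in enumerate(calls + ['AB']):   # sentinel closes a trailing run
        if g in ('AA', 'BB'):
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_len:
                yield (start, i - 1)
            start = None

calls = ['AB', 'AA', 'AA', 'BB', 'AB', 'AA', 'AA', 'AA', 'AA']
print(list(homozygous_runs(calls)))   # -> [(1, 3), (5, 8)]
```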

Abstract:

This paper reports on a study evaluating technology-based processes for assessment moderation. The aim was to evaluate standard features found in an institutional Learning Management System and their compatibility with the values and practices of a large teaching team. The process used an online discussion board forum for tutors, the pairing of more experienced tutors with those new to the process, and further meetings conducted in both face-to-face and web conferencing environments. Online rubrics were used for assessing student work and providing feedback. A focus group conducted after marking concluded, together with analysis of the discussion board forum, demonstrated a strong community of practice with a shared understanding of assessment requirements.

Abstract:

Background
Ankylosing spondylitis (AS) is an immune-mediated arthritis particularly targeting the spine and pelvis and is characterised by inflammation, osteoproliferation and, frequently, ankylosis. Current treatments, which predominantly target inflammatory pathways, have disappointing efficacy in slowing disease progression. Thus, a better understanding of the causal association and pathological progression from inflammation to bone formation, particularly whether inflammation directly initiates osteoproliferation, is required.

Methods
The proteoglycan-induced spondylitis (PGISp) mouse model of AS was used to histopathologically map the progressive axial disease events, assess molecular changes during disease progression and define disease progression using unbiased clustering of semi-quantitative histology. PGISp mice were followed over a 24-week time course. Spinal disease was assessed using a novel semi-quantitative histological scoring system that independently evaluated the breadth of pathological features associated with PGISp axial disease, including inflammation, joint destruction and excessive tissue formation (osteoproliferation). Matrix components were identified using immunohistochemistry.

Results
Disease initiated with inflammation at the periphery of the intervertebral disc (IVD) adjacent to the longitudinal ligament, reminiscent of enthesitis, and was associated with upregulated tumor necrosis factor and metalloproteinases. After a lag phase, established inflammation was temporospatially associated with destruction of IVDs, cartilage and bone. At later time points, advanced disease was characterised by substantially reduced inflammation, excessive tissue formation and ectopic chondrocyte expansion. These distinct features differentiated affected mice into early, intermediate and advanced disease stages. Excessive tissue formation was observed in vertebral joints only if the IVD had been destroyed as a consequence of the early inflammation. Ectopic excessive tissue was predominantly chondroidal, with chondrocyte-like cells embedded within collagen type II- and X-rich matrix. This corresponded with upregulation of mRNA for the cartilage markers Col2a1, Sox9 and Comp. Osteophytes, though infrequent, were more prevalent in later disease.

Conclusions
Inflammation-driven IVD destruction was shown to be a prerequisite for axial disease progression to osteoproliferation in the PGISp mouse. Osteoproliferation led to vertebral body deformity and fusion but was never seen concurrently with persistent inflammation, suggesting a sequential process. The findings support early intervention with anti-inflammatory therapies to limit destructive processes and consequently prevent progression of AS.
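The staging by unbiased clustering can be illustrated with k-means over the semi-quantitative score triplets, with k = 3 for early, intermediate and advanced disease. A sketch with hypothetical scores; the study's actual clustering method is not specified here:

```python
# Sketch: unbiased grouping of semi-quantitative histology scores into
# disease stages via k-means. The score triplets (inflammation,
# destruction, osteoproliferation) below are hypothetical.
import numpy as np
from scipy.cluster.vq import kmeans2

scores = np.array([
    [3.0, 0.5, 0.0],   # inflammation-dominated (early)
    [2.5, 0.0, 0.0],
    [2.0, 2.5, 0.5],   # destruction appearing (intermediate)
    [2.5, 3.0, 1.0],
    [0.5, 3.0, 3.0],   # osteoproliferation-dominated (advanced)
    [0.0, 2.5, 3.5],
])
centroids, stage = kmeans2(scores, 3, minit="++", seed=0)
print(stage)  # cluster label per mouse
```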

Abstract:

Background: The fecal neutrophil-derived proteins calprotectin and lactoferrin have proven useful surrogate markers of intestinal inflammation. The aim of this study was to compare fecal calprotectin and lactoferrin concentrations with clinically, endoscopically, and histologically assessed Crohn's disease (CD) activity, and to explore the suitability of these proteins as surrogate markers of mucosal healing during anti-TNFα therapy. Furthermore, we studied changes in the numbers and expression of effector and regulatory T cells in bowel biopsy specimens during anti-TNFα therapy.

Patients and methods: Adult CD patients referred for ileocolonoscopy for various reasons (106 endoscopies in 77 patients) were recruited (Study I). Clinical disease activity was assessed with the Crohn's disease activity index (CDAI) and endoscopic activity with both the Crohn's disease endoscopic index of severity (CDEIS) and the simple endoscopic score for Crohn's disease (SES-CD). Stool samples for measurements of calprotectin and lactoferrin, and blood samples for CRP, were collected. For Study II, biopsy specimens were obtained from the ileum and the colon for histologic activity scoring. In prospective Study III, after baseline ileocolonoscopy, 15 patients received induction with anti-TNFα blocking agents, and endoscopic, histologic, and fecal-marker responses to therapy were evaluated at 12 weeks. For detecting changes in the numbers and expression of effector and regulatory T cells, biopsy specimens were taken from the most severely diseased lesions in the ileum and the colon (Study IV).

Results: Endoscopic scores correlated significantly with fecal calprotectin and lactoferrin (p<0.001). Both fecal markers were significantly lower in patients with endoscopically inactive than with active disease (p<0.001). In detecting endoscopically active disease, the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) for calprotectin ≥200 μg/g were 70%, 92%, 94%, and 61%; for lactoferrin ≥10 μg/g they were 66%, 92%, 94%, and 59%. Accordingly, the sensitivity, specificity, PPV, and NPV for CRP >5 mg/l were 48%, 91%, 91%, and 48%. Fecal markers were significantly higher in active colonic (both p<0.001) or ileocolonic (calprotectin p=0.028, lactoferrin p=0.004) than in ileal disease. In ileocolonic or colonic disease, the colon histology score correlated significantly with fecal calprotectin (r=0.563) and lactoferrin (r=0.543). In patients receiving anti-TNFα therapy, median fecal calprotectin decreased from 1173 μg/g (range 88-15326) to 130 μg/g (13-1419) and lactoferrin from 105.0 μg/g (4.2-1258.9) to 2.7 μg/g (0.0-228.5), both p=0.001. The ratio of ileal IL-17+ cells to CD4+ cells decreased significantly during anti-TNFα treatment (p=0.047). The ratio of IL-17+ cells to Foxp3+ cells was higher in the patients' baseline specimens than in their post-treatment specimens (p=0.038).

Conclusions: For evaluating CD activity against endoscopic findings, fecal calprotectin and lactoferrin were more sensitive surrogate markers than CDAI and CRP. Fecal calprotectin and lactoferrin were significantly higher in endoscopically active disease than in endoscopic remission. In both ileocolonic and colonic disease, fecal markers correlated closely with histologic disease activity. In CD, these neutrophil-derived proteins thus seem to be useful surrogate markers of endoscopic activity. During anti-TNFα therapy, fecal calprotectin and lactoferrin decreased significantly. The anti-TNFα treatment was also reflected in a decreased IL-17/Foxp3 cell ratio, which may indicate an improved balance between effector and regulatory T cells with treatment.
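The sensitivity, specificity, PPV and NPV figures quoted above follow directly from the 2×2 table of a marker cutoff against endoscopic activity. A sketch with hypothetical counts, not the study's data:

```python
# Sketch: diagnostic metrics from 2x2 counts of a fecal-marker cutoff
# versus endoscopically active disease. Counts are hypothetical.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # active disease detected
        "specificity": tn / (tn + fp),  # inactive disease correctly ruled out
        "PPV": tp / (tp + fp),          # positive test -> active disease
        "NPV": tn / (tn + fn),          # negative test -> inactive disease
    }

# e.g. a calprotectin >= 200 ug/g cutoff against endoscopic activity
for name, value in diagnostic_metrics(tp=46, fp=3, fn=20, tn=37).items():
    print(f"{name}: {value:.0%}")
```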

Abstract:

Assessment of the outcome of critical illness is complex. Severity scoring systems and organ dysfunction scores are traditional tools for mortality and morbidity prediction in intensive care. Their ability to explain the risk of death is impressive for large cohorts of patients but insufficient for an individual patient. Although events before intensive care unit (ICU) admission are prognostically important, the prediction models utilize data collected at and just after ICU admission. In addition, several biomarkers have been evaluated for predicting mortality, but none has proven entirely useful in clinical practice. Therefore, new prognostic markers of critical illness are vital when evaluating intensive care outcome.

The aim of this dissertation was to investigate new measures and biological markers of critical illness and to evaluate their predictive value and association with mortality and disease severity. The impact of delay in the emergency department (ED) on intensive care outcome, measured as hospital mortality and health-related quality of life (HRQoL) at 6 months, was assessed in 1,537 consecutive patients admitted to a medical ICU. Two new biological markers were investigated in two separate patient populations: 231 ICU patients and 255 patients with severe sepsis or septic shock. Cell-free plasma DNA is a surrogate marker of apoptosis; its association with disease severity and mortality rate was evaluated in ICU patients. Next, the predictive value of plasma DNA regarding mortality, and its association with the degree of organ dysfunction and disease severity, was evaluated in severe sepsis or septic shock. Heme oxygenase-1 (HO-1) is a potential regulator of apoptosis. Finally, HO-1 plasma concentrations and HO-1 gene polymorphisms, and their association with outcome, were evaluated in ICU patients.

The length of ED stay was not associated with the outcome of intensive care. The hospital mortality rate was significantly lower in patients admitted to the medical ICU from the ED than in those admitted from other sources, and the HRQoL of the critically ill at 6 months was significantly lower than that of the age- and sex-matched general population. In the ICU patient population, the maximum plasma DNA concentration measured during the first 96 hours in intensive care correlated significantly with disease severity and degree of organ failure and was independently associated with hospital mortality. In patients with severe sepsis or septic shock, cell-free plasma DNA concentrations were significantly higher in ICU and hospital nonsurvivors than in survivors and showed moderate discriminative power regarding ICU mortality. Plasma DNA was an independent predictor of ICU mortality, but not of hospital mortality. The degree of organ dysfunction correlated independently with plasma DNA concentration in severe sepsis and with plasma HO-1 concentration in ICU patients. The HO-1 -413T/GT(L)/+99C haplotype was associated with HO-1 plasma levels and the frequency of multiple organ dysfunction. Plasma DNA and HO-1 concentrations may support the assessment of outcome or organ failure development in critically ill patients, although their value is limited and requires further evaluation.
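Discriminative power of a marker such as plasma DNA is conventionally quantified as the area under the ROC curve, which can be computed directly from the Mann-Whitney statistic. A sketch with hypothetical marker values:

```python
# Sketch: discriminative power of a marker (e.g. plasma DNA) for
# mortality, as an ROC AUC computed via the Mann-Whitney statistic.
# Marker values are hypothetical.

def auc(pos, neg):
    """Probability a random nonsurvivor value exceeds a survivor value."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

nonsurvivors = [2.1, 3.4, 1.8, 4.0]     # plasma DNA, arbitrary units
survivors = [1.2, 1.9, 0.8, 2.2, 1.1]
print(f"AUC = {auc(nonsurvivors, survivors):.2f}")
```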

Abstract:

The purpose of the present study was to investigate the effects of low-intensity ultrasound on bioabsorbable self-reinforced poly-L-lactide (SR-PLLA) screws and on fracture healing after SR-PLLA device fixation in experimental and clinical cancellous bone fractures. In the first experimental study, the mechanical strengths of SR-PLLA screws were assessed after 12 weeks of daily 20-minute ultrasound exposure in vitro. In the second experimental study, 32 male Wistar rats with an experimental distal femur osteotomy fixed with an SR-PLLA rod received daily low-intensity ultrasound treatment for 21 days, and the effects on the healing bone were assessed. The clinical studies consisted of three prospective, randomized, placebo-controlled series of dislocated lateral malleolar fractures fixed with one SR-PLLA screw; the total number of patients in these series was 52. Half of the patients were randomly provided with a sham ultrasound device. The patients underwent ultrasound therapy for 20 minutes daily for six weeks. Radiological bone healing was assessed both by radiographs at two, six, nine, and 12 weeks and by multidetector computed tomography (MDCT) scans at two weeks, nine weeks, and 18 months. Bone mineral density was assessed by dual-energy X-ray absorptiometry (DXA). The clinical outcome was assessed by both Olerud-Molander scoring and clinical examination of the ankle. Low-intensity ultrasound had no effect on the mechanical properties or degradation behaviour of the SR-PLLA screws in vitro. There were no obvious signs of ultrasound-induced enhancement of bone healing in the SR-PLLA-rod-fixed metaphyseal distal femur osteotomy in rats. The biocompatibility of low-intensity ultrasound treatment and SR-PLLA was found to be good. In the clinical series, low-intensity ultrasound had no obvious effect on the bone mineral density of the fractured lateral malleolus, and there were no obvious differences in the radiological healing times of the SR-PLLA-screw-fixed lateral malleolar fractures after treatment. Low-intensity ultrasound also had no effect on radiological bone morphology, bone mineral density or clinical outcome 18 months after the injury. The present study thus provides no support for the hypothesis that low-intensity pulsed ultrasound enhances bone healing in SR-PLLA-rod-fixed experimental metaphyseal distal femur osteotomies in rats or in clinical SR-PLLA-screw-fixed lateral malleolar fractures. The conclusions of the present set of studies should be limited to lateral malleolar fractures fixed with an SR-PLLA screw.

Abstract:

Acute renal failure (ARF) is a clinical syndrome characterized by a rapidly decreasing glomerular filtration rate, which results in disturbances of electrolyte and acid-base homeostasis, derangement of extracellular fluid volume, and retention of nitrogenous waste products, and is often associated with decreased urine output. ARF affects about 5-25% of patients admitted to intensive care units (ICUs) and is linked to high mortality and morbidity rates. In this thesis, the outcome of critically ill patients with ARF and factors related to outcome were evaluated. A total of 1,662 patients from two ICUs and one acute dialysis unit at Helsinki University Hospital were included. In Study I, the prevalence of ARF was calculated and classified according to two ARF-specific scoring methods: the RIFLE classification and the classification created by Bellomo et al. (2001). Study II evaluated monocyte human leukocyte antigen-DR (HLA-DR) expression and plasma levels of one proinflammatory cytokine (interleukin [IL]-6) and two anti-inflammatory cytokines (IL-8 and IL-10) in predicting survival of critically ill ARF patients. Study III investigated serum cystatin C as a marker of renal function in ARF and its power in predicting survival of critically ill ARF patients. Study IV evaluated the effect of intermittent hemodiafiltration (HDF) on myoglobin elimination from plasma in severe rhabdomyolysis. Study V assessed long-term survival and health-related quality of life (HRQoL) in ARF patients. Neither of the ARF-specific scoring methods showed good discriminative power regarding hospital mortality, although the maximum RIFLE score over the first three days in the ICU was an independent predictor of hospital mortality. As a marker of renal dysfunction, serum cystatin C showed no benefit over plasma creatinine in detecting ARF or predicting patient survival. Neither cystatin C, nor plasma concentrations of IL-6, IL-8, and IL-10, nor monocyte HLA-DR expression was clinically useful in predicting mortality in ARF patients. HDF may be used to clear myoglobin from plasma in rhabdomyolysis, especially if alkaline diuresis does not succeed. The long-term survival of patients with ARF was found to be poor, and the HRQoL of those who survive is lower than that of the age- and gender-matched general population.
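For illustration, the serum creatinine arm of the RIFLE classification assigns Risk, Injury or Failure at roughly 1.5×, 2× and 3× baseline creatinine, respectively. A minimal sketch of that arm only; the full criteria also use GFR decrease, urine output, and the Loss and ESKD categories, all omitted here:

```python
# Sketch of the serum-creatinine arm of the RIFLE classification
# (Risk >= 1.5x, Injury >= 2x, Failure >= 3x baseline). The full
# criteria also use GFR decrease, urine output, and the Loss/ESKD
# categories, which this simplification omits.

def rifle_creatinine_class(baseline_cr, current_cr):
    ratio = current_cr / baseline_cr
    if ratio >= 3.0:
        return "Failure"
    if ratio >= 2.0:
        return "Injury"
    if ratio >= 1.5:
        return "Risk"
    return "No AKI by the creatinine criterion"

print(rifle_creatinine_class(baseline_cr=70, current_cr=180))  # -> "Injury"
```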

Abstract:

The aim of the study was to evaluate the long-term results of operative treatment for Hirschsprung's disease (HD) and internal anal sphincter achalasia. Fecal continence and quality of life were evaluated by questionnaire in 100 adult patients who had undergone surgery for HD during 1950-75. Fecal continence was evaluated using the numerical scoring described by Holschneider. Fifty-four of the 100 patients underwent clinical examination, rigid sigmoidoscopy and manometric evaluation. In anorectal manometry, basal resting pressure (BRP) and maximal squeeze pressure (MSP) were measured, and voluntary sphincter force (VSF) was calculated by subtracting the BRP from the MSP. The results of operative treatment for adult HD were compared with those of the patients operated on in childhood; in adult HD the symptoms are so mild that the patients attain adolescence or even adulthood. The patients with HD and cartilage-hair hypoplasia were evaluated separately. The outcome of the patients with internal anal sphincter achalasia operated on by myectomy was evaluated by questionnaire, and continence was evaluated using the numerical scoring described by Holschneider. Of the 100 patients operated on for HD, 38 had completely normal bowel habits. A normal or good continence score was found in 91 out of 100 patients; nine patients had fair continence. One of the patients with fair continence had Down's syndrome and two were mentally retarded for other reasons. Only one patient suffered from constipation. In anorectal manometry, the difference in BRP between patients with normal and good continence was statistically significant, whereas the difference between the good and fair continence groups was not. The differences in MSP and VSF between patient groups with different continence outcomes were not statistically significant. The differences between patient groups and normal controls were statistically significant in BRP and MSP, but not in VSF. The VSF reflects the working power of the muscles, including the external sphincter, levator ani and gluteal muscles. The patients operated on at adult age had as good continence as the patients operated on in childhood. The patients with HD and cartilage-hair hypoplasia had much greater morbidity and mortality than the other HD patients; their mortality was as high as 38%. In patients with internal anal sphincter achalasia, constipation was cured or alleviated by myectomy, whereas a significant number suffered from soiling-related social problems.

Abstract:

Cord blood is a well-established alternative to bone marrow and peripheral blood stem cell transplantation. To date, over 400,000 unrelated donor cord blood units have been stored in cord blood banks worldwide. To enable successful cord blood transplantation, recent efforts have focused on finding ways to increase the hematopoietic progenitor cell content of cord blood units. In this study, factors that may improve the selection and quality of cord blood collections for banking were identified.

In 167 consecutive cord blood units collected from healthy full-term neonates and processed at a national cord blood bank, mean platelet volume (MPV) correlated with the numbers of cord blood unit hematopoietic progenitors (CD34+ cells and colony-forming units); this is a novel finding. Mean platelet volume can be thought to represent general hematopoietic activity, as newly formed platelets have been reported to be large. Stress during delivery is hypothesized to lead to the mobilization of hematopoietic progenitor cells through cytokine stimulation. Accordingly, low-normal umbilical arterial pH, thought to be associated with perinatal stress, correlated with high cord blood unit CD34+ cell and colony-forming unit numbers. The associations were closer in vaginal deliveries than in Cesarean sections. Vaginal delivery entails specific physiological changes, which may also affect the hematopoietic system; thus, different factors may predict cord blood hematopoietic progenitor cell numbers in the two modes of delivery. Theoretical models were created to enable the use of platelet characteristics (mean platelet volume) and perinatal factors (umbilical arterial pH and placental weight) in the selection of cord blood collections with high hematopoietic progenitor cell counts. These observations could thus be implemented as part of the evaluation of cord blood collections for banking.

The quality of cord blood units has been the focus of several recent studies; however, hemostasis activation during cord blood collection is scarcely evaluated in cord blood banks. In this study, hemostasis activation was assessed with prothrombin activation fragment 1+2 (F1+2), a direct indicator of thrombin generation, and platelet factor 4 (PF4), an indicator of platelet activation. Altogether three sample series were collected: during the set-up of the cord blood bank, and after changes in personnel and collection equipment. The activation decreased from the first to the subsequent series, which were collected with the bank fully in operation and following international standards, and reached a level similar to that previously reported for healthy neonates. As hemostasis activation may have unwanted effects on cord blood cell contents, it should be minimized, and its assessment could be implemented as part of process control in cord blood banks.

Culture assays provide information about the hematopoietic potential of the cord blood unit. In processed cord blood units prior to freezing, megakaryocytic colony growth was evaluated in semisolid cultures with a novel scoring system. Three investigators analyzed the colony assays, and their scores were highly concordant. With such scoring systems, the growth potential of various cord blood cell lineages can be assessed. In addition, erythroid cells were observed in liquid cultures of cryostored and thawed, unseparated cord blood units without exogenous erythropoietin. This was hypothesized to be due to the erythropoietic effect of thrombopoietin, endogenous erythropoietin production, and diverse cell-cell interactions in the culture. This observation underscores the complex interactions of cytokines and supporting cells in the heterogeneous cell population of the thawed cord blood unit.
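The "theoretical models" combining MPV, umbilical arterial pH and placental weight can be pictured as a regression of progenitor counts on these predictors. A sketch using ordinary least squares with entirely hypothetical values; the study's actual model form is not specified here:

```python
# Sketch: a linear model relating hematopoietic progenitor counts to
# mean platelet volume, umbilical arterial pH and placental weight.
# All values are hypothetical.
import numpy as np

# Columns: MPV (fl), arterial pH, placental weight (g).
X = np.array([
    [10.2, 7.28, 620],
    [ 9.1, 7.35, 540],
    [11.0, 7.22, 700],
    [ 9.8, 7.30, 580],
    [10.5, 7.25, 650],
])
cd34_counts = np.array([4.1, 2.6, 5.3, 3.4, 4.6])  # x10^6 CD34+ cells

# Ordinary least squares with an intercept term.
design = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(design, cd34_counts, rcond=None)
predicted = design @ coef
print(np.round(predicted, 2))
```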