986 results for scoring systems


Relevance:

60.00%

Publisher:

Abstract:

Malignant mesothelioma (MM) is a fatal tumour of increasing incidence that is related to asbestos exposure. This work evaluated Epidermal Growth Factor Receptor (EGFR) expression in MM by immunohistochemistry in 168 tumour sections and its correlations with clinicopathological and biological factors. The microvessel density (MVD) was derived from CD34-immunostained sections. Hematoxylin and eosin stained sections were examined for intratumoural necrosis. COX-2 protein expression was evaluated with semi-quantitative Western blotting of homogenised tumour supernatants (n = 45). EGFR expression was correlated with survival by Kaplan-Meier and log-rank analysis. Univariate and multivariate Cox proportional hazards models were used to compare the effects of EGFR with clinicopathological and biological prognostic factors and prognostic scoring systems. EGFR expression was identified in 74 cases (44%) and correlated with epithelioid cell type (p < 0.0001), good performance status (p < 0.0001), the absence of chest pain (p < 0.0001) and the presence of tumour necrosis (TN) (p = 0.004), but not with MVD or COX-2. EGFR expression was a good prognostic factor in univariate analysis (p = 0.01). Independent indicators of poor prognosis in multivariate analysis were non-epithelioid cell type (p = 0.0001), weight loss, performance status and WBC > 8.3 × 10⁹ L⁻¹. EGFR status was not an independent prognostic factor. EGFR expression in MM correlates with epithelioid histology and TN. EGFR may be a target for selective therapies in MM. © 2006 Elsevier Ireland Ltd. All rights reserved.


In the UK, mortality from malignant mesothelioma (MM) is likely to more than double over the next 20 years, and despite advances in surgery, chemotherapy and radiation treatment, the overall prognosis for patients remains poor. A number of scoring systems based on assessment of clinicopathological features of patients with the disease have been developed, but the search continues for further prognostic indicators. Angiogenesis, tumour necrosis (TN), epidermal growth factor receptor (EGFR) expression, cyclooxygenase-2 (COX-2) and matrix metalloproteinases (MMPs) have been linked with poor prognosis in some types of solid tumour, and their relevance as prognostic factors in malignant mesothelioma is examined in this paper. © 2004 Elsevier Ireland Ltd. All rights reserved.


Objective To determine the relative effects of genetic and environmental factors in susceptibility to ankylosing spondylitis (AS). Methods Twins with AS were identified from the Royal National Hospital for Rheumatic Diseases database. Clinical and radiographic examinations were performed to establish diagnoses, and disease severity was assessed using a combination of validated scoring systems. HLA typing for HLA-B27, HLA-B60, and HLA-DR1 was performed by polymerase chain reaction with sequence-specific primers, and zygosity was assessed using microsatellite markers. Genetic and environmental variance components were assessed with the program Mx, using data from this and previous studies of twins with AS. Results Six of 8 monozygotic (MZ) twin pairs were disease concordant, compared with 4 of 15 B27-positive dizygotic (DZ) twin pairs (27%) and 4 of 32 DZ twin pairs overall (12.5%). Nonsignificant increases in similarity with regard to age at disease onset and all of the disease severity scores assessed were noted in disease-concordant MZ twins compared with concordant DZ twins. HLA-B27 and B60 were associated with the disease in probands, and the rate of disease concordance was significantly increased among DZ twin pairs in which the co-twin was positive for both B27 and DR1. Additive genetic effects were estimated to contribute 97% of the population variance. Conclusion Susceptibility to AS is largely genetically determined, and the environmental trigger for the disease is probably ubiquitous. HLA-B27 accounts for a minority of the overall genetic susceptibility to AS.
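The pairwise concordance rates quoted above follow directly from the reported pair counts; a few lines of Python (values taken from the abstract) reproduce them:

```python
# Disease-concordance rates recomputed from the reported twin-pair counts.
pairs = {
    "MZ": (6, 8),          # 6 of 8 monozygotic pairs concordant
    "DZ (B27+)": (4, 15),  # B27-positive dizygotic pairs
    "DZ (all)": (4, 32),   # all dizygotic pairs
}
for label, (concordant, total) in pairs.items():
    print(f"{label}: {100 * concordant / total:.1f}%")
# MZ: 75.0%, DZ (B27+): 26.7%, DZ (all): 12.5%
```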


The prioritisation of potential agents on the basis of likely efficacy is an important step in biological control because it can increase the probability of a successful biocontrol program, and reduce risks and costs. In this introductory paper we define success in biological control, review how agent selection has been approached historically, and outline the approach to agent selection that underpins the structure of this special issue on agent selection. Developing criteria by which to judge the success of a biocontrol agent (or program) provides the basis for agent selection decisions. Criteria will depend on the weed, on the ecological and management context in which that weed occurs, and on the negative impacts that biocontrol is seeking to redress. Predicting which potential agents are most likely to be successful poses enormous scientific challenges. 'Rules of thumb', 'scoring systems' and various conceptual and quantitative modelling approaches have been proposed to aid agent selection. However, most attempts have met with limited success due to the diversity and complexity of the systems in question. This special issue presents a series of papers that deconstruct the question of agent choice with the aim of progressively improving the success rate of biological control. Specifically they ask: (i) what potential agents are available and what should we know about them? (ii) what type, timing and degree of damage is required to achieve success? and (iii) which potential agent will reach the necessary density, at the right time, to exert the required damage in the target environment?


The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in the year 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement with the objective of fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time that the cattle keeper uses for monitoring animals often decreases. This has created a need for systems for automatically monitoring the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it needs experience to be conducted properly, it is labour intensive as an on-farm method and the results are subjective. A four-balance system for measuring the leg load distribution of dairy cows during milking in order to detect lameness was developed and set up at the University of Helsinki research farm, Suitia.
The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed. A probabilistic neural network (PNN) classifier model was chosen for the task. The data was divided into two parts, and 5,074 measurements from 37 cows were used to train the model. The operation of the model was evaluated for its ability to detect lameness in the validating dataset, which had 4,868 measurements from 36 cows. The model was able to classify 96% of the measurements correctly as sound or lame cows, and 100% of the lameness cases in the validation data were identified. The number of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and can be used in a real-time lameness monitoring system.
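The classification step described above can be sketched as a probabilistic neural network, i.e. a Parzen-window classifier that sums a Gaussian kernel over the training samples of each class and picks the class with the highest activation. The feature values and the smoothing parameter below are illustrative assumptions, not the study's actual measurements:

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Probabilistic neural network (Parzen-window) classifier.

    For each class, the pattern layer sums Gaussian kernels centred on
    the training samples; a test point is assigned to the class with
    the highest mean activation.
    """
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        scores = []
        for c in classes:
            Xc = X_train[y_train == c]
            d2 = np.sum((Xc - x) ** 2, axis=1)  # squared distances
            scores.append(np.mean(np.exp(-d2 / (2 * sigma ** 2))))
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

# Toy data standing in for per-milking features,
# e.g. [leg-load asymmetry, kicks per milking] (hypothetical units).
rng = np.random.default_rng(0)
sound = rng.normal([0.05, 1.0], 0.05, size=(30, 2))
lame = rng.normal([0.30, 3.0], 0.05, size=(30, 2))
X = np.vstack([sound, lame])
y = np.array([0] * 30 + [1] * 30)  # 0 = sound, 1 = lame

test = np.array([[0.06, 1.1], [0.28, 2.9]])
print(pnn_predict(X, y, test))  # [0 1]
```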


Assessment of the outcome of critical illness is complex. Severity scoring systems and organ dysfunction scores are traditional tools in mortality and morbidity prediction in intensive care. Their ability to explain risk of death is impressive for large cohorts of patients, but insufficient for an individual patient. Although events before intensive care unit (ICU) admission are prognostically important, the prediction models utilize data collected at and just after ICU admission. In addition, several biomarkers have been evaluated to predict mortality, but none has proven entirely useful in clinical practice. Therefore, new prognostic markers of critical illness are vital when evaluating the intensive care outcome. The aim of this dissertation was to investigate new measures and biological markers of critical illness and to evaluate their predictive value and association with mortality and disease severity. The impact of delay in the emergency department (ED) on intensive care outcome, measured as hospital mortality and health-related quality of life (HRQoL) at 6 months, was assessed in 1537 consecutive patients admitted to the medical ICU. Two new biological markers were investigated in two separate patient populations: 231 ICU patients and 255 patients with severe sepsis or septic shock. Cell-free plasma DNA is a surrogate marker of apoptosis. Its association with disease severity and mortality rate was evaluated in ICU patients. Next, the predictive value of plasma DNA regarding mortality and its association with the degree of organ dysfunction and disease severity was evaluated in severe sepsis or septic shock. Heme oxygenase-1 (HO-1) is a potential regulator of apoptosis. Finally, HO-1 plasma concentrations and HO-1 gene polymorphisms and their association with outcome were evaluated in ICU patients. The length of ED stay was not associated with outcome of intensive care.
The hospital mortality rate was significantly lower in patients admitted to the medical ICU from the ED than from the non-ED, and the HRQoL in the critically ill at 6 months was significantly lower than in the age- and sex-matched general population. In the ICU patient population, the maximum plasma DNA concentration measured during the first 96 hours in intensive care correlated significantly with disease severity and degree of organ failure and was independently associated with hospital mortality. In patients with severe sepsis or septic shock, the cell-free plasma DNA concentrations were significantly higher in ICU and hospital nonsurvivors than in survivors and showed a moderate discriminative power regarding ICU mortality. Plasma DNA was an independent predictor for ICU mortality, but not for hospital mortality. The degree of organ dysfunction correlated independently with plasma DNA concentration in severe sepsis and plasma HO-1 concentration in ICU patients. The HO-1 -413T/GT(L)/+99C haplotype was associated with HO-1 plasma levels and frequency of multiple organ dysfunction. Plasma DNA and HO-1 concentrations may support the assessment of outcome or organ failure development in critically ill patients, although their value is limited and requires further evaluation.


Cord blood is a well-established alternative to bone marrow and peripheral blood stem cell transplantation. To this day, over 400 000 unrelated donor cord blood units have been stored in cord blood banks worldwide. To enable successful cord blood transplantation, recent efforts have been focused on finding ways to increase the hematopoietic progenitor cell content of cord blood units. In this study, factors that may improve the selection and quality of cord blood collections for banking were identified. In 167 consecutive cord blood units collected from healthy full-term neonates and processed at a national cord blood bank, mean platelet volume (MPV) correlated with the numbers of cord blood unit hematopoietic progenitors (CD34+ cells and colony-forming units); this is a novel finding. Mean platelet volume can be thought to represent general hematopoietic activity, as newly formed platelets have been reported to be large. Stress during delivery is hypothesized to lead to the mobilization of hematopoietic progenitor cells through cytokine stimulation. Accordingly, low-normal umbilical arterial pH, thought to be associated with perinatal stress, correlated with high cord blood unit CD34+ cell and colony-forming unit numbers. The associations were closer in vaginal deliveries than in Cesarean sections. Vaginal delivery entails specific physiological changes, which may also affect the hematopoietic system. Thus, different factors may predict cord blood hematopoietic progenitor cell numbers in the two modes of delivery. Theoretical models were created to enable the use of platelet characteristics (mean platelet volume) and perinatal factors (umbilical arterial pH and placental weight) in the selection of cord blood collections with high hematopoietic progenitor cell counts. These observations could thus be implemented as a part of the evaluation of cord blood collections for banking. The quality of cord blood units has been the focus of several recent studies. 
However, hemostasis activation during cord blood collection is scarcely evaluated in cord blood banks. In this study, hemostasis activation was assessed with prothrombin activation fragment 1+2 (F1+2), a direct indicator of thrombin generation, and platelet factor 4 (PF4), indicating platelet activation. Altogether three sample series were collected during the set-up of the cord blood bank as well as after changes in personnel and collection equipment. The activation decreased from the first to the subsequent series, which were collected with the bank fully in operation and following international standards, and was at a level similar to that previously reported for healthy neonates. As hemostasis activation may have unwanted effects on cord blood cell contents, it should be minimized. The assessment of hemostasis activation could be implemented as a part of process control in cord blood banks. Culture assays provide information about the hematopoietic potential of the cord blood unit. In processed cord blood units prior to freezing, megakaryocytic colony growth was evaluated in semisolid cultures with a novel scoring system. Three investigators analyzed the colony assays, and the scores were highly concordant. With such scoring systems, the growth potential of various cord blood cell lineages can be assessed. In addition, erythroid cells were observed in liquid cultures of cryostored and thawed, unseparated cord blood units without exogenous erythropoietin. This was hypothesized to be due to the erythropoietic effect of thrombopoietin, endogenous erythropoietin production, and diverse cell-cell interactions in the culture. This observation underscores the complex interactions of cytokines and supporting cells in the heterogeneous cell population of the thawed cord blood unit.


The purpose of this study was to identify preoperative predictors of length of stay after primary total hip arthroplasty in a patient population reflecting current trends toward shorter hospitalization, using readily obtainable factors that do not require scoring systems. A retrospective review of 112 consecutive patients was performed. High preoperative pain level and patient expectation of discharge to extended care facilities (ECFs) were the only significant multivariable predictors of hospitalization extending beyond 2 days (P=0.001 and P<0.001, respectively). Patient expectation remained significant after adjusting for Medicare's 3-day requirement for discharge to ECFs (P<0.001). The study was adequately powered to analyze the variables in the multivariable logistic regression model, which had a concordance index of 0.857.
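For a binary outcome such as hospitalization beyond 2 days, the concordance index reported for a model is the probability that a randomly chosen patient with the outcome receives a higher predicted risk than a randomly chosen patient without it (ties counting one half), which equals the area under the ROC curve. A minimal sketch with hypothetical predicted probabilities:

```python
import numpy as np

def concordance_index(y, scores):
    """Concordance index (c-index) for a binary outcome: the fraction
    of (positive, negative) patient pairs in which the positive case
    has the higher predicted score, with ties counted as one half."""
    pos = scores[y == 1]
    neg = scores[y == 0]
    diff = pos[:, None] - neg[None, :]  # all pairwise score differences
    n_pairs = len(pos) * len(neg)
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / n_pairs

# Hypothetical data: 1 = stay beyond 2 days, with model-predicted risks.
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
p = np.array([0.1, 0.2, 0.3, 0.6, 0.4, 0.7, 0.8, 0.9])
print(concordance_index(y, p))  # 0.9375
```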


The purpose of this study was to identify the preoperative predictors of hospital length of stay after primary total knee arthroplasty in a patient population reflecting current trends toward shorter hospitalization and using readily obtainable factors that do not require scoring systems. A single-center, multi-surgeon retrospective chart review of two hundred and sixty consecutive patients who underwent primary total knee arthroplasty was performed. The mean length of stay was 3.0 days. Among the different variables studied, increasing comorbidities, lack of adequate assistance at home, and bilateral surgery were the only multivariable significant predictors of longer length of stay. The study was adequately powered for statistical analyses and the concordance index of the multivariable logistic regression model was 0.815.


The diagnosis of myelodysplastic syndrome (MDS) currently relies primarily on the morphologic assessment of the patient's bone marrow and peripheral blood cells. Moreover, prognostic scoring systems rely on observer-dependent assessments of blast percentage and dysplasia. Gene expression profiling could enhance current diagnostic and prognostic systems by providing a set of standardized, objective gene signatures. Within the Microarray Innovations in LEukemia study, a diagnostic classification model was investigated to distinguish the distinct subclasses of pediatric and adult leukemia, as well as MDS. Overall, the accuracy of the diagnostic classification model for subtyping leukemia was approximately 93%, but this was not reflected in the MDS samples, which reached only approximately 50% accuracy. Discordant samples of MDS were classified either into acute myeloid leukemia (AML) or


The diagnosis of patients with myelodysplastic syndromes (MDS) is largely dependent on morphologic examination of bone marrow aspirates. Several criteria that form the basis of the classifications and scoring systems most commonly used in clinical practice are affected by operator-dependent variation. To identify standardized molecular markers that would allow prediction of prognosis, we have used gene expression profiling (GEP) data on CD34+ cells from patients with MDS to determine the relationship between gene expression levels and prognosis.


Purpose: To report any differences in the visual acuity (VA) recording method used in peer-reviewed ophthalmology clinical studies over the past decade. Methods: We reviewed the method of assessing and reporting VA in 160 clinical studies from 2 UK and 2 US peer-reviewed journals, published in 1994 and 2004. Results: The method used to assess VA was specified in 62.5% of UK-published and 60% of US-published papers. In the results sections of the UK publications, the VA measurements presented were Snellen acuity (n = 58), logMAR acuity (n = 20) and symbol acuity (n = 1). Similarly, in the US publications, the VA was recorded in the results section using Snellen acuity (n = 60) and logMAR acuity (n = 14). Overall, 10% of the authors appeared to convert Snellen acuity measurements to logMAR format. Five studies (3%) chose to express Snellen-type acuities in decimal form, a method which can easily lead to confusion given the increased use of logMAR scoring systems. Conclusion: The authors recommend that, to ensure comparable visual results between studies and different study populations, it would be useful if clinical scientists worked to standardized VA testing protocols and reported results in a manner consistent with the way in which they are measured. Copyright © 2008 S. Karger AG.
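The Snellen-to-logMAR conversion discussed above is a simple logarithm: decimal acuity is the Snellen fraction itself, and logMAR is its negative base-10 logarithm. A small illustrative helper (function name is ours, not from the paper):

```python
import math

def snellen_to_logmar(numerator, denominator):
    """Convert a Snellen fraction (e.g. 20/40 or 6/12) to logMAR.
    Decimal acuity = numerator/denominator; logMAR = -log10(decimal)."""
    return -math.log10(numerator / denominator)

print(round(snellen_to_logmar(20, 40), 2))  # 0.3
print(round(snellen_to_logmar(6, 60), 2))   # 1.0
```

Note that a *higher* logMAR value means *worse* acuity, which is one reason naive comparisons between decimal and logMAR reporting cause confusion.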


Induction of graft-versus-host disease (GVHD) depends on the activation of donor T cells by host antigen-presenting cells (APCs). The prevailing theory holds that these interactions occur in secondary lymphoid organs (SLO), such as the lymph nodes (LN), Peyer's patches (PP) and the spleen (SP). This hypothesis was tested using homozygous aly/aly (alymphoplasia) mice, which lack both LN and PP, with heterozygous littermates (aly/+) as controls. Both groups were lethally irradiated, the aly/aly mice after splenectomy (LN/PP/SP-/-), while in the aly/+ mice the spleen was mobilised and repositioned. Both received bone marrow transplants (BMT) from syngeneic donors (aly/aly, H-2b) or from allogeneic donors mismatched at the major histocompatibility complex (MHC) (BALB/c, H-2d, or B10.BR, H-2k). GVHD severity was measured by survival and by well-established scoring systems for both clinical disease and target-organ disease. Surprisingly, all LN/PP/SP-/- mice survived and developed clinically significant GVHD comparable in severity to that observed in LN/PP/SP+/+ mice. Moreover, histopathological analyses showed that LN/PP/SP-/- BMT recipients developed significantly more GVHD of the liver, gut and skin than the syngeneic control animals. LN/PP/SP-/- mice also developed more severe hepatic GVHD than the LN/PP/SP+/+ control mice. Similar differences in hepatic GVHD were already observed between the allogeneic groups by day 7.
To identify which extra-lymphoid recipient organs might serve as initial sites of exposure to alloantigens in the absence of SLO, T-cell (CD3+) expansion, activation (CD69+) and proliferation (CFSE) in the bone marrow were examined 3 days after BMT. In each case, LN/PP/SP-/- mice transplanted with marrow from allogeneic donors showed significantly higher absolute numbers of cells and of cell divisions than LN/PP/SP+/+ mice. To ensure that the experimental differences observed in the aly/aly animals in the MHC-disparate system were not merely a strain-dependent phenomenon, splenectomised FucT dko mice (LN/PP/SP-/-) pre-treated with an anti-MAdCAM-1 monoclonal antibody (mAb) were also transplanted. After BMT these mice showed high clinical GVHD scores, demonstrating that SLO are not required for the induction of GVHD. In graft-versus-leukemia studies using homozygous (LN/PP/SP-/-) hosts, the mice died of tumour expansion rather than of GVHD. In vitro studies showed that the APC capacity of both splenic dendritic cells (DCs) and bone-marrow-derived DCs was comparable between aly/aly and aly/+ mice. Collectively, these results are consistent with the notion that SLO are not required for allogeneic T-cell activation, and suggest that the bone marrow may serve as an alternative, albeit less efficient, site for allogeneic antigen recognition and subsequent donor T-cell activation. These observations challenge the paradigm that secondary lymphoid tissues are required for the induction of GVHD.


BACKGROUND: Despite progress in multidisciplinary treatment of esophageal cancer, oncologic esophagectomy is still the cornerstone of therapeutic strategies. Several scoring systems are used to predict postoperative morbidity, but in most cases they identify nonmodifiable parameters. The aim of this study was to identify potentially modifiable risk factors associated with complications after oncologic esophagectomy. METHODS: All consecutive patients with complete data sets undergoing oncologic esophagectomy in our department during 2001-2011 were included in this study. As potentially modifiable risk factors we assessed nutritional status, depicted by body mass index (BMI) and preoperative serum albumin levels, excessive alcohol consumption, and active smoking. Postoperative complications were graded according to a validated 5-grade system. Univariate and multivariate analyses were used to identify preoperative risk factors associated with the occurrence and severity of complications. RESULTS: Our series included 93 patients. The overall morbidity rate was 81% (n = 75), with 56% (n = 52) minor complications and 18% (n = 17) major complications. Active smoking and excessive alcohol consumption were associated with the occurrence of severe complications, whereas BMI and low preoperative albumin levels were not. The simultaneous presence of two or more of these risk factors significantly increased the risk of postoperative complications. CONCLUSIONS: A combination of malnutrition, active smoking and alcohol consumption was found to have a negative impact on postoperative morbidity rates. Therefore, preoperative smoking and alcohol cessation counseling, together with monitoring and improving nutritional status, are strongly recommended.


Introduction: Several scores are commonly used to evaluate patients' postoperative satisfaction after lateral ankle ligament repair, including the AOFAS, FAAM, CAIT and CAIS. Comparing published studies in the literature is difficult, as the same patient can have markedly different results depending on which scoring system is used. The current study aims to address this gap in the literature by developing a system to compare these tests, to allow better analysis and comparison of published studies. Patients and methods: This is a retrospective cohort study of 47 patients following lateral ankle ligament repair using a modified Broström-Gould technique. All patients were operated on between 2005 and 2010 by a single surgeon and followed the same postoperative rehabilitation protocol. Six patients were excluded from the study because of concomitant surgery. Patients were assessed by an independent observer. We used the Pearson correlation coefficient to analyse the concordance of the scores, as well as scatter plots to assess the linear relationship between them. Results: A linear distribution between the scores was found when the results were analysed using scatter plots. We were thus able to use the Pearson correlation coefficient to evaluate the relationship between each of the different postoperative scores. The correlation was above 0.5 in all cases except for the comparison between the CAIT and the FAAM for the activities of daily living (0.39). We were, therefore, able to compare the results obtained and assess the relative concordance of the scoring systems. The results showed that the more specific the scale, the worse the score, and vice versa: the CAIT and the CAIS appeared to be more severe than the AOFAS and the FAAM activities-of-daily-living subscale. The sports subscale of the FAAM demonstrated intermediate results.
Conclusion: This study outlines a system to compare different postoperative scores commonly used to evaluate outcome after ankle stabilization surgery. The impact of this study is that it makes comparison of published studies easier, even though they use a variety of different clinical scores, thus facilitating better outcome analysis of operative techniques.
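The score-comparison approach described above reduces, for any pair of scoring systems, to computing a Pearson correlation over paired per-patient scores and inspecting the scatter for linearity. A minimal sketch; the score names are reused from the study, but the values below are hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical paired outcome scores for the same eight patients.
aofas = np.array([95, 88, 76, 90, 60, 82, 99, 70])
faam = np.array([92, 85, 70, 93, 55, 80, 98, 74])

# Pearson correlation coefficient between the two scoring systems.
r = np.corrcoef(aofas, faam)[0, 1]
print(r > 0.5)  # True: above the 0.5 concordance threshold used in the study
```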