Abstract:
Background: Studies investigating the relationship between malnutrition and post-discharge mortality following acute hip fracture yield conflicting results. This study aimed to determine whether malnutrition independently predicted 12-month post-fracture mortality after adjusting for clinically relevant covariates. Methods: An ethics-approved, prospective, consecutive audit was undertaken of all surgically treated hip fracture inpatients admitted to a dedicated orthogeriatric unit (November 2010–October 2011). The 12-month mortality data were obtained by a dual search of the mortality registry and the Queensland Health database. Malnutrition was evaluated using the Subjective Global Assessment. Demographic covariates (age, gender, admission residence) and clinical covariates included fracture type, time to surgery, anaesthesia type, type of surgery, post-surgery time to mobilize and post-operative complications (delirium, pulmonary and deep vein thrombosis, cardiac complications, infections). The Charlson Comorbidity Index was applied retrospectively. All diagnoses were confirmed by the treating orthogeriatrician. Results: A total of 322 of 346 patients were available for audit. Increased age (P = 0.004), admission from residential care (P < 0.001), Charlson Comorbidity Index (P = 0.007), malnutrition (P < 0.001), time to mobilize >48 h (P < 0.001), delirium (P = 0.003), pulmonary embolism (P = 0.029) and cardiovascular complication (P = 0.04) were associated with 12-month mortality. Logistic regression analysis demonstrated that malnutrition (odds ratio (OR) 2.4, 95% confidence interval (CI) 1.3–4.7, P = 0.007), admission from residential care (OR 2.6, 95% CI 1.3–5.3, P = 0.005) and pulmonary embolism (OR 11.0, 95% CI 1.5–78.7, P = 0.017) independently predicted 12-month mortality. Conclusions: The findings substantiate malnutrition as an independent predictor of 12-month mortality in a representative sample of hip fracture inpatients.
Effective strategies to identify and treat malnutrition in hip fracture should be prioritized.
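The adjusted odds ratios reported above come from multivariable logistic regression, where each OR is the exponentiated model coefficient. A minimal numpy-only sketch on synthetic data (not the study's data; predictor names and effect sizes are hypothetical, chosen only to mimic the malnutrition and residential-care covariates):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical binary predictors standing in for the study's covariates
malnutrition = rng.binomial(1, 0.4, n)
res_care = rng.binomial(1, 0.3, n)
# Simulate 12-month mortality with log-odds raised by both factors (assumed effects)
logit = -1.5 + 0.9 * malnutrition + 0.95 * res_care
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([np.ones(n), malnutrition, res_care])

# Newton-Raphson maximum-likelihood fit of the logistic model
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))          # fitted probabilities
    grad = X.T @ (y - p)                      # score vector
    hess = (X * (p * (1 - p))[:, None]).T @ X # observed information
    beta += np.linalg.solve(hess, grad)

# Adjusted odds ratios: OR_j = exp(beta_j)
odds_ratios = np.exp(beta[1:])
print(dict(zip(["malnutrition", "residential_care"], odds_ratios.round(2))))
```

Each OR above is adjusted for the other predictor in the model, which is what "independently predicted" means in the abstract.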
Abstract:
Background: Increased hospital readmission and longer stays in the hospital for patients with type 2 diabetes and cardiac disease can result in higher healthcare costs and a heavier individual burden. Thus, knowledge of the characteristics and predictive factors for Vietnamese patients with type 2 diabetes and cardiac disease who are at high risk of hospital readmission and longer stays in the hospital could provide a better understanding of how to develop an effective care plan aimed at improving patient outcomes. However, information about factors influencing hospital readmission and length of stay of patients with type 2 diabetes and cardiac disease in Vietnam is limited. Aim: This study examined factors influencing hospital readmission and length of stay of Vietnamese patients with both type 2 diabetes and cardiac disease. Methods: An exploratory prospective study design was conducted on 209 patients with type 2 diabetes and cardiac disease in Vietnam. Data were collected from patient charts and patients' responses to self-administered questionnaires. Descriptive statistics, bivariate correlation, and logistic and multiple regression were used to analyse the data. Results: The hospital readmission rate was 12.0% among patients with both type 2 diabetes and cardiac disease. The average length of stay in the hospital was 9.37 days. Older age (OR = 1.11, p < .05), increased duration of type 2 diabetes (OR = 1.22, p < .05), less engagement in stretching/strengthening exercise behaviours (OR = .93, p < .001) and less communication with the physician (OR = .21, p < .001) were significant predictors of 30-day hospital readmission. An increased number of additional co-morbidities (β = .33, p < .001) was a significant predictor of longer stays in the hospital. High levels of cognitive symptom management (β = .40, p < .001) significantly predicted longer stays in the hospital, indicating that the more patients practiced cognitive symptom management, the longer the stay in hospital.
Conclusions: This study provides some evidence of factors influencing hospital readmission and length of stay and argues that this information may have significant implications for clinical practice aimed at improving patients' health outcomes. However, the findings of this study relate only to the study hospital. Additionally, the investigation of environmental factors is recommended for future research, as these factors are important components of the research model.
Abstract:
Background: Children with developmental coordination disorder (DCD) face evident motor difficulties in daily functioning. Little is known, however, about their difficulties in specific activities of daily living (ADL). Objective: The purposes of this study were: (1) to investigate differences between children with DCD and their peers with typical development in ADL performance, learning, and participation, and (2) to explore the predictive values of these aspects. Design: This was a cross-sectional study. Methods: In both a clinical sample of children diagnosed with DCD (n=25 [21 male, 4 female], age range=5-8 years) and a group of peers with typical development (25 matched controls), the children's parents completed the DCDDaily-Q. Differences in scores between the groups were investigated using t tests for performance and participation and Pearson chi-square analysis for learning. Multiple regression analyses were performed to explore the predictive values of performance, learning, and participation. Results: Compared with their peers, children with DCD showed poor performance of ADL and less frequent participation in some ADL. Children with DCD demonstrated heterogeneous patterns of performance (poor in 10%-80% of the items) and learning (delayed in 0%-100% of the items). In the DCD group, delays in learning of ADL were a predictor of poor performance of ADL, and poor performance of ADL was a predictor of less frequent participation in ADL compared with the control group. Limitations: Only a limited number of children with DCD were included in this study. Conclusions: This study highlights the impact of DCD on children's daily lives and the need for tailored intervention.
Abstract:
Further improvement in performance, towards near-transparent quality LSF quantization, is shown to be possible by using a higher-order two-dimensional (2-D) prediction in the coefficient domain. The prediction is performed in a closed-loop manner so that the LSF reconstruction error is the same as the quantization error of the prediction residual. We show that an optimum 2-D predictor, exploiting both inter-frame and intra-frame correlations, performs better than existing predictive methods. A computationally efficient split vector quantization technique is used to implement the proposed 2-D prediction-based method. We show further improvement in performance by using a weighted Euclidean distance.
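The closed-loop property described above (reconstruction error equals the quantization error of the residual) follows because both encoder and decoder predict from already-reconstructed values. A minimal sketch on synthetic LSF-like data, with a uniform scalar quantizer standing in for the paper's split VQ and made-up predictor weights:

```python
import numpy as np

def quantize(r, step=0.02):
    # Uniform scalar quantizer; a stand-in for the paper's split VQ.
    # Maximum error is step / 2.
    return np.round(r / step) * step

rng = np.random.default_rng(1)
frames, order = 50, 10
# Synthetic LSF-like parameters with inter-frame correlation (illustrative only)
lsf = np.empty((frames, order))
lsf[0] = np.sort(rng.uniform(0, np.pi, order))
for t in range(1, frames):
    lsf[t] = np.sort(0.9 * lsf[t - 1] + 0.1 * rng.uniform(0, np.pi, order))

a, b = 0.6, 0.3  # hypothetical inter-frame and intra-frame predictor weights
recon = np.zeros_like(lsf)
for t in range(1, frames):
    for i in range(order):
        # 2-D prediction: previous *reconstructed* frame (inter-frame term)
        # plus the previous reconstructed coefficient of this frame (intra-frame term)
        pred = a * recon[t - 1, i] + (b * recon[t, i - 1] if i > 0 else 0.0)
        resid = lsf[t, i] - pred
        # Closed loop: the decoder forms pred + quantized residual,
        # so recon - lsf == quantize(resid) - resid exactly.
        recon[t, i] = pred + quantize(resid)

# Reconstruction error is bounded by the residual quantizer's max error
max_err = np.max(np.abs(recon[1:] - lsf[1:]))
assert max_err <= 0.01 + 1e-12
```

An open-loop scheme predicting from the unquantized past would instead accumulate quantization error across frames; the closed loop avoids that by construction.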
Abstract:
INTRODUCTION There is a paucity of research investigating the scar outcome of children with partial thickness burns. The aim of this study was to assess the scar outcome of children with partial thickness burns who received a silver dressing acutely. METHOD Children aged 0-15 years with an acute partial thickness burn of ≤10% TBSA were included. Children were originally recruited for an RCT investigating three dressings for partial thickness burns. Children were assessed at 3 and 6 months after re-epithelialization. 3D photographs were taken of the burn site, the POSAS was completed and skin thickness was measured using ultrasound imaging. RESULTS Forty-three children returned for 3- and 6-month follow-ups or returned a photograph. Days to re-epithelialization was a significant predictor of skin/scar quality at 3 and 6 months (p<0.01). Patient-rated color and observer-rated vascularity and pigmentation POSAS scores were comparable at 3 months (color vs. vascularity 0.88, p<0.001; color vs. pigmentation 0.64, p<0.001), but patients scored higher than the observer at 6 months (color vs. vascularity 0.57, p<0.05; color vs. pigmentation 0.15, p=0.60). Burn depth was significantly correlated with skin thickness (r=0.51, p<0.01). Hypopigmentation of the burn site was present in 25.8% of children who re-epithelialized in ≤2 weeks. CONCLUSION This study has provided information on outcomes for children with partial thickness burns and highlighted a need for further education of this population.
Abstract:
Defence against pathogens is a vital need of all living organisms that has led to the evolution of complex immune mechanisms. However, although immunocompetence (the ability to resist pathogens and control infection) has in recent decades become a focus for research in evolutionary ecology, the variation in immune function observed in natural populations remains relatively little understood. This thesis examines sources of this variation (environmental, genetic and maternal effects) during the nestling stage and its fitness consequences in wild populations of passerines: the blue tit (Cyanistes caeruleus) and the collared flycatcher (Ficedula albicollis). A developing organism may face a dilemma as to whether to allocate limited resources to growth or to immune defences. The optimal level of investment in immunity is shaped inherently by the specific requirements of the environment. If the probability of contracting infection is low, maintaining high growth rates even at the expense of immune function may be advantageous for nestlings, as body mass is usually a good predictor of post-fledging survival. In experiments with blue tits and haematophagous hen fleas (Ceratophyllus gallinae) using two methods, methionine supplementation (to manipulate nestlings' resource allocation to cellular immune function) and food supplementation (to increase resource availability), I confirmed that there is a trade-off between growth and immunity and that the abundance of ectoparasites is an environmental factor affecting the allocation of resources to immune function. A cross-fostering experiment also revealed that environmental heterogeneity in terms of abundance of ectoparasites may contribute to maintaining additive genetic variation in immunity and other traits.
Animal model analysis of extensive data collected from the population of collared flycatchers on Gotland (Sweden) allowed examination of the narrow-sense heritability of PHA-response, the most commonly used index of cellular immunocompetence in avian studies. PHA-response is not heritable in this population, but is subject to a non-heritable, presumably maternal, effect. However, experimental manipulation of yolk androgen levels indicates that the mechanism of the maternal effect on PHA-response is not in ovo deposition of androgens. The relationship between PHA-response and recruitment was studied for over 1300 collared flycatcher nestlings. Multivariate selection analysis shows that it is body mass, not PHA-response, that is under direct selection. PHA-response appears to be related to recruitment because of its positive relationship with body mass. These results imply either that PHA-response fails to capture the immune mechanisms that are relevant for defence against pathogens encountered by fledglings, or that the selection pressure from parasites is not as strong as commonly assumed.
Abstract:
Habitat requirements of fish are strictest during the early life stages, and the quality and quantity of reproduction habitats lay the basis for fish production. A considerable number of fish species in the northern Baltic Sea reproduce in the shallow coastal areas, which are also the most heavily exploited parts of this brackish marine area. However, the coastal fish reproduction habitats in the northern Baltic Sea are poorly known. The studies presented in this thesis focused on the influence of environmental conditions on the distribution of coastal reproduction habitats of freshwater fish. They were conducted in the vegetated littoral zone along an exposure and salinity gradient extending from the innermost bays to the outer archipelago on the south-western and southern coasts of Finland, in the northern Baltic Sea. Special emphasis was placed on reed-covered Phragmites australis shores, which form a dominant vegetation type in several coastal archipelago areas. The main aims of this research were to (1) develop and test new survey and mapping methods, (2) investigate the environmental requirements that govern the reproduction of freshwater fish in the coastal area and (3) survey, map and model the distribution of the reproduction habitats of pike (Esox lucius) and roach (Rutilus rutilus). The white plate and scoop method with a standardized sampling time and effort was demonstrated to be a functional method for sampling the early life stages of fish in dense vegetation and shallow water. Reed-covered shores were shown to form especially important reproduction habitats for several freshwater fish species, such as pike, roach, other cyprinids and burbot, in the northern Baltic Sea. The reproduction habitats of pike were limited to sheltered reed- and moss-covered shores of the inner and middle archipelago, where suitable zooplankton prey were available and the influence of the open sea was low.
The reproduction habitats of roach were even more limited and roach reproduction was successful only in the very sheltered reed-covered shores of the innermost bay areas, where salinity remained low (< 4‰) during the spawning season due to freshwater inflow. After identifying the critical factors restricting the reproduction of pike and roach, the spatial distribution of their reproduction habitats was successfully mapped and modelled along the environmental gradients using only a few environmental predictor variables. Reproduction habitat maps are a valuable tool promoting the sustainable use and management of exploited coastal areas and helping to maintain the sustainability of fish populations. However, the large environmental gradients and the extensiveness of the archipelago zone in the northern Baltic Sea demand an especially high spatial resolution of the coastal predictor variables. Therefore, the current lack of accurate large-scale, high-resolution spatial data gathered at exactly the right time is a considerable limitation for predictive modelling of shallow coastal waters.
Abstract:
The terrestrial export of dissolved organic matter (DOM) is associated with climate, vegetation and land use, and thus is under the influence of climatic variability and human interference with terrestrial ecosystems, their soils and hydrological cycles. The present study provides an assessment of spatial variation of DOM concentrations and export, and interactions between DOM, catchment characteristics, land use and climatic factors in boreal catchments. The influence of catchment characteristics, land use and climatic drivers on the concentrations and export of total organic carbon (TOC), total organic nitrogen (TON) and dissolved organic phosphorus (DOP) was estimated using stream water quality, forest inventory and climatic data from 42 Finnish pristine forested headwater catchments, and water quality monitoring, GIS land use, forest inventory and climatic data from the 36 main Finnish rivers (and their sub-catchments) flowing to the Baltic Sea. Moreover, the export of DOM in relation to land use along a European climatic gradient was studied using river water quality and land use data from four European areas. Additionally, the role of organic and minerogenic acidity in controlling pH levels in Finnish rivers and pristine streams was studied by measuring organic anion, sulphate (SO4) and base cation (Ca, Mg, K and Na) concentrations. In all study catchments, TOC was a major fraction of DOM, with much lower proportions of TON and DOP. Moreover, most of TOC and TON was in a dissolved form. The correlation between TOC and TON concentrations was strong and TOC concentrations explained 78% of the variation in TON concentrations in pristine headwater streams. 
In a subgroup of 20 headwater catchments with similar climatic conditions and low N deposition in eastern Finland, the proportion of peatlands in the catchment and the proportion of Norway spruce (Picea abies Karsten) in the tree stand had the strongest correlation with TOC and TON concentrations and export. In Finnish river basins, TOC export increased with the increasing proportion of peatland in the catchment, whereas TON export increased with the increasing extent of agricultural land. The highest DOP concentrations and export were recorded in river basins with a large extent of agricultural land and urban areas, reflecting the influence of human impact on DOP loads. However, the most important predictor of TOC, TON and DOP export in Finnish rivers was the proportion of upstream lakes in the catchment: the higher the upstream lake percentage, the lower the export, indicating organic matter retention in lakes. The molar TOC:TON ratio decreased from headwater catchments covered by forests and peatlands to the large river basins with mixed land use, emphasising the effect of the land use gradient on the stoichiometry of rivers. This study also demonstrated that the land use of the catchments is related to both organic and minerogenic acidity in rivers and pristine headwater streams. Organic anions dominated in rivers and streams situated in northern Finland, reflecting the higher extent of peatlands in these areas, whereas SO4 dominated in southern Finland and in western coastal areas, where the extent of fertile areas, agricultural land, urban areas, acid sulphate soils and sulphate deposition is highest. High TOC concentrations decreased pH values in stream and river water, whereas no correlation between SO4 concentrations and pH was observed. This underlines the importance of organic acids in controlling pH levels in Finnish pristine headwater streams and main rivers.
High SO4 concentrations were associated with high base cation concentrations and fertile areas, which buffered the effects of SO4 on pH.
Abstract:
The extent to which low-frequency (minor allele frequency (MAF) between 1% and 5%) and rare (MAF ≤ 1%) variants contribute to complex traits and disease in the general population is largely unknown. Bone mineral density (BMD) is highly heritable, a major predictor of osteoporotic fractures, and has been previously associated with common genetic variants, as well as rare, population-specific, coding variants. Here we identify novel non-coding genetic variants with large effects on BMD (n_total = 53,236) and fracture (n_total = 508,253) in individuals of European ancestry from the general population. Associations for BMD were derived from whole-genome sequencing (n = 2,882 from UK10K (ref. 10), a population-based genome sequencing consortium), whole-exome sequencing (n = 3,549), deep imputation of genotyped samples using a combined UK10K/1000 Genomes reference panel (n = 26,534), and de novo replication genotyping (n = 20,271). We identified a low-frequency non-coding variant near a novel locus, EN1, with an effect size fourfold larger than the mean of previously reported common variants for lumbar spine BMD (rs11692564(T), MAF = 1.6%, replication effect size = +0.20 s.d., P_meta = 2 × 10^-14), which was also associated with a decreased risk of fracture (odds ratio = 0.85; P = 2 × 10^-11; n_cases = 98,742 and n_controls = 409,511). Using an En1(cre/flox) mouse model, we observed that conditional loss of En1 results in low bone mass, probably as a consequence of high bone turnover. We also identified a novel low-frequency non-coding variant with large effects on BMD near WNT16 (rs148771817(T), MAF = 1.2%, replication effect size = +0.41 s.d., P_meta = 1 × 10^-11). In general, there was an excess of association signals arising from deleterious coding and conserved non-coding variants.
These findings provide evidence that low-frequency non-coding variants have large effects on BMD and fracture, thereby providing rationale for whole-genome sequencing and improved imputation reference panels to study the genetic architecture of complex traits and disease in the general population.
Abstract:
While many measures of viewpoint goodness have been proposed in computer graphics, none have been evaluated for ribbon representations of protein secondary structure. To fill this gap, we conducted a user study on Amazon's Mechanical Turk platform, collecting human viewpoint preferences from 65 participants for 4 representative superfamilies of protein domains. In particular, we evaluated viewpoint entropy, which was previously shown to be a good predictor of human viewpoint preference for other, mostly non-abstract objects. In a second study, we asked 7 molecular biology experts to find the best viewpoint of the same protein domains and compared their choices with viewpoint entropy. Our results show that viewpoint entropy overall is a significant predictor of human viewpoint preference for ribbon representations of protein secondary structure. However, the accuracy is highly dependent on the complexity of the structure: while most participants agree on good viewpoints for small, non-globular structures with few secondary structure elements, viewpoint preference varies considerably for complex structures. Finally, experts tend to choose viewpoints of both low and high viewpoint entropy to emphasize different aspects of the respective structure.
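Viewpoint entropy, as commonly defined in the viewpoint-selection literature, is the Shannon entropy of the projected-area distribution of the visible faces for a given viewpoint. A minimal sketch (the area values are illustrative, not from the study; the abstract does not specify the exact variant used):

```python
import math

def viewpoint_entropy(projected_areas):
    """Shannon entropy of the projected-area distribution of visible faces.

    projected_areas: projected area of each visible face (plus the background,
    in some formulations) for one candidate viewpoint.
    """
    total = sum(projected_areas)
    h = 0.0
    for a in projected_areas:
        if a > 0:
            p = a / total
            h -= p * math.log2(p)
    return h

# A viewpoint showing many faces at balanced sizes scores higher than one
# where a single face dominates the projection.
balanced = viewpoint_entropy([1.0] * 8)           # log2(8) = 3.0 bits
skewed = viewpoint_entropy([7.0] + [1.0 / 7] * 7)  # one dominant face
assert balanced > skewed
```

Ranking candidate viewpoints by this score and picking the maximum is the usual way such a measure is turned into an automatic "best view" predictor.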
Abstract:
Mediastinitis as a complication after cardiac surgery is rare but disastrous, increasing hospital stay, hospital costs, morbidity and mortality. It occurs in 1-3% of patients after median sternotomy. The purpose of this study was to identify risk factors and to investigate new ways to prevent mediastinitis. First, we assessed operating room air contamination monitoring by comparing the bacteriological technique with continuous particle counting at the low contamination levels achieved by ultra-clean garment options in 66 coronary artery bypass grafting operations. Second, we examined surgical glove perforations and the changes in bacterial flora of surgeons' fingertips in 116 open-heart operations. Third, the effect of a gentamicin-collagen sponge on preventing surgical site infections (SSI) was studied in a randomized controlled study with 557 participants. Finally, the incidence, outcome and risk factors of mediastinitis were studied in over 10,000 patients. With the alternative garment and textile system (cotton group and clean air suit group), the air counts fell from 25 to 7 colony-forming units/m³ (P<0.01). Contamination of the sternal wound was reduced by 46% and that of the leg wound by >90%. In only 17% of operations were both gloves found unpunctured. The frequency of glove perforations and the bacterial counts of hands increased with operation time. With local gentamicin prophylaxis, slightly fewer SSIs (4.0 vs. 5.9%) and cases of mediastinitis (1.1 vs. 1.9%) occurred. We identified 120/10,713 cases of postoperative mediastinitis (1.1%). During the study period, the patient population grew significantly older, and the proportions of women and of patients with an ASA score >3 increased significantly. In multivariate logistic regression analysis, the only significant predictor of mediastinitis was obesity. Continuous particle monitoring is a good intraoperative method for controlling the air contamination related to theatre staff behaviour during an individual operation.
When a glove puncture is detected, both gloves should be changed. Before donning a new pair of gloves, renewed disinfection of the hands helps to keep their bacterial counts lower even towards the end of a long operation. The gentamicin-collagen sponge may have beneficial effects on the prevention of SSI, but further research is needed. Mediastinitis is not diminishing. Growing populations at risk, for example the increasing proportion of overweight patients, reinforce the importance of surveillance and pose a challenge for focusing preventive measures.
Abstract:
Assessment of the outcome of critical illness is complex. Severity scoring systems and organ dysfunction scores are traditional tools in mortality and morbidity prediction in intensive care. Their ability to explain risk of death is impressive for large cohorts of patients, but insufficient for an individual patient. Although events before intensive care unit (ICU) admission are prognostically important, the prediction models utilize data collected at and just after ICU admission. In addition, several biomarkers have been evaluated to predict mortality, but none has proven entirely useful in clinical practice. Therefore, new prognostic markers of critical illness are vital for evaluating intensive care outcome. The aim of this dissertation was to investigate new measures and biological markers of critical illness and to evaluate their predictive value and association with mortality and disease severity. The impact of delay in the emergency department (ED) on intensive care outcome, measured as hospital mortality and health-related quality of life (HRQoL) at 6 months, was assessed in 1537 consecutive patients admitted to a medical ICU. Two new biological markers were investigated in two separate patient populations: 231 ICU patients and 255 patients with severe sepsis or septic shock. Cell-free plasma DNA is a surrogate marker of apoptosis. Its association with disease severity and mortality rate was evaluated in ICU patients. Next, the predictive value of plasma DNA regarding mortality, and its association with the degree of organ dysfunction and disease severity, was evaluated in severe sepsis or septic shock. Heme oxygenase-1 (HO-1) is a potential regulator of apoptosis. Finally, HO-1 plasma concentrations and HO-1 gene polymorphisms and their association with outcome were evaluated in ICU patients. The length of ED stay was not associated with the outcome of intensive care.
The hospital mortality rate was significantly lower in patients admitted to the medical ICU from the ED than from elsewhere, and the HRQoL of the critically ill at 6 months was significantly lower than in the age- and sex-matched general population. In the ICU patient population, the maximum plasma DNA concentration measured during the first 96 hours in intensive care correlated significantly with disease severity and degree of organ failure and was independently associated with hospital mortality. In patients with severe sepsis or septic shock, cell-free plasma DNA concentrations were significantly higher in ICU and hospital nonsurvivors than in survivors and showed moderate discriminative power for ICU mortality. Plasma DNA was an independent predictor of ICU mortality, but not of hospital mortality. The degree of organ dysfunction correlated independently with plasma DNA concentration in severe sepsis and with plasma HO-1 concentration in ICU patients. The HO-1 -413T/GT(L)/+99C haplotype was associated with HO-1 plasma levels and the frequency of multiple organ dysfunction. Plasma DNA and HO-1 concentrations may support the assessment of outcome or organ failure development in critically ill patients, although their value is limited and requires further evaluation.
Abstract:
Ruptured abdominal aortic aneurysm (RAAA) is a life-threatening event, and without operative treatment the patient will die. The overall mortality can be as high as 80-90%; thus repair of RAAA should be attempted whenever feasible. Quality of life (QoL) has become an increasingly important outcome measure in vascular surgery. The aim of the study was to evaluate outcomes of RAAA and to identify predictors of mortality. In the Helsinki and Uusimaa district, 626 patients were identified with RAAA in 1996-2004. Altogether 352 of them were admitted to Helsinki University Central Hospital (HUCH). Based on the Finnvasc Registry, 836 patients underwent repair of RAAA in 1991-1999. The 30-day operative mortality, hospital mortality and population-based mortality were assessed, as was the effect of regional centralisation and improved in-hospital quality on the outcome of RAAA. QoL of survivors of RAAA was evaluated with the RAND-36 questionnaire. Quality-adjusted life years (QALYs), which combine length of life and QoL, were calculated using the EQ-5D index and an estimation of life expectancy. The predictors of outcome after RAAA were assessed at admission and 48 hours after repair of RAAA. The 30-day operative mortality rate was 38% in HUCH and 44% nationwide, whereas hospital mortality was 45% in HUCH. Population-based mortality was 69% in 1996-2004 and 56% in 2003-2004. After organisational changes were undertaken, mortality decreased significantly at all levels. Among the survivors, QoL was almost equal to the norms of age- and sex-matched controls; only physical functioning was slightly impaired. Successful repair of RAAA gave a mean of 4.1 (0-30.9) QALYs for all RAAA patients, even with non-survivors included. The preoperative Glasgow Aneurysm Score was an independent predictor of 30-day operative mortality after RAAA, and it also predicted the outcome at 48 hours for initial survivors of repair of RAAA.
A high Glasgow Aneurysm Score and high age were associated with low numbers of QALYs achieved. Organ dysfunction measured by the Sequential Organ Failure Assessment (SOFA) score at 48 hours after repair of RAAA was the strongest predictor of death. In conclusion, surgery for RAAA is a life-saving and cost-effective procedure. The centralisation of vascular emergencies improved the outcome of RAAA patients. The survivors had a good QoL after RAAA. Owing to their moderate discriminatory value, predictive models can be used at the individual level only to provide supplementary information for clinical decision-making. These results support an active operation policy, as there is no reliable measure to predict the outcome after RAAA.
Abstract:
Background: Risk stratification of diffuse large B-cell lymphoma (DLBCL) requires identification of patients whose disease is not cured despite initial R-CHOP. Although the prognostic importance of the tumour microenvironment (TME) is established, the optimal strategy to quantify it is unknown. Methods: The relationship between immune-effector and inhibitory (checkpoint) genes was assessed by NanoString™ in 252 paraffin-embedded DLBCL tissues. A model to quantify net anti-tumoural immunity as an outcome predictor was tested in 158 R-CHOP-treated patients, and validated in tissue/blood from two independent R-CHOP-treated cohorts of 233 and 140 patients, respectively. Findings: T- and NK-cell immune-effector molecule expression correlated with tumour-associated macrophage and PD-1/PD-L1 axis markers, consistent with malignant B-cells triggering a dynamic checkpoint response to adapt to and evade immune surveillance. A tree-based survival model was fitted to test whether immune-effector-to-checkpoint ratios were prognostic. The CD4*CD8:(CD163/CD68)*PD-L1 ratio stratified overall survival better than any single immune marker or combination of immune markers, distinguishing groups with disparate 4-year survival (92% versus 47%). The immune ratio was independent of, and added to, the revised international prognostic index (R-IPI) and cell-of-origin (COO). The tissue findings were validated in 233 R-CHOP-treated DLBCL patients. Furthermore, in the blood of 140 R-CHOP-treated patients, immune-effector:checkpoint ratios differed between interim-PET/CT-positive and -negative patients.
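The composite score above is simple arithmetic on marker expression: effector markers in the numerator, checkpoint/macrophage markers in the denominator. A minimal sketch with hypothetical normalized expression values (the authors' normalization and cut-points are not given in the abstract):

```python
def immune_ratio(cd4, cd8, cd163, cd68, pdl1):
    """CD4*CD8 : (CD163/CD68)*PD-L1 ratio from the abstract.

    Inputs are assumed to be positive, normalized expression values
    (hypothetical units; the study's normalization is not specified here).
    Higher values indicate net anti-tumoural (effector-dominant) immunity.
    """
    return (cd4 * cd8) / ((cd163 / cd68) * pdl1)

# Effector-dominant profile scores higher than a checkpoint-dominant one
effector_dominant = immune_ratio(cd4=2.0, cd8=2.0, cd163=1.0, cd68=1.0, pdl1=1.0)
checkpoint_dominant = immune_ratio(cd4=1.0, cd8=1.0, cd163=2.0, cd68=1.0, pdl1=2.0)
assert effector_dominant > checkpoint_dominant
```

In the study, such a continuous ratio was dichotomized by a tree-based survival model to separate the favourable and unfavourable groups.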
Abstract:
We studied the mating behaviour of the primitively eusocial wasp Ropalidia marginata and the factors that may influence sperm transfer. By introducing a male and a female R. marginata into ventilated transparent plastic boxes, we were able to observe mating behaviour, which involved mounting and short or long conjugation of the wasps. Dissection of female wasps after the observation indicated that long conjugation is a good behavioural predictor of sperm transfer. This finding makes it possible to obtain mated females without dissecting them every time. We tested the effects of age, season, relatedness, body size and the female's ovarian status on mating. Under laboratory conditions, mating success declined rapidly below and above the age range of 5-20 days. Within this age range, mating success was significantly lower in December than in the other months tested. There was no nestmate discrimination, and there was no influence of male or female body size, or of the ovarian state of the female, on the probability of sperm transfer.