914 results for predictive coding
Abstract:
Experience is lacking with mineral scaling and corrosion in enhanced geothermal systems (EGS) in which surface water is circulated through hydraulically stimulated crystalline rocks. As an aid in designing EGS projects we have conducted multicomponent reactive-transport simulations to predict the likely characteristics of scales and corrosion that may form when exploiting heat from granitoid reservoir rocks at ∼200 °C and 5 km depth. The specifications of an EGS project at Basel, Switzerland, are used to constrain the model. The main water–rock reactions in the reservoir during hydraulic stimulation and the subsequent doublet operation were identified in a separate paper (Alt-Epping et al., 2013b). Here we use the computed composition of the reservoir fluid to (1) predict mineral scaling in the injection and production wells, (2) evaluate methods of chemical geothermometry and (3) identify geochemical indicators of incipient corrosion. The envisaged heat extraction scheme ensures that even if the reservoir fluid is in equilibrium with quartz, cooling of the fluid will not induce saturation with respect to amorphous silica, thus eliminating the risk of silica scaling. However, the ascending fluid attains saturation with respect to crystalline aluminosilicates such as albite, microcline and chlorite, and possibly with respect to amorphous aluminosilicates. If no silica-bearing minerals precipitate upon ascent, reservoir temperatures can be predicted by classical formulations of silica geothermometry. In contrast, Na/K concentration ratios in the production fluid reflect steady-state conditions in the reservoir rather than albite–microcline equilibrium. Thus, even though igneous orthoclase is abundant in the reservoir and albite precipitates as a secondary phase, Na/K geothermometers fail to yield accurate temperatures. Anhydrite, which is present in fractures in the Basel reservoir, is predicted to dissolve during operation. 
This may lead to precipitation of pyrite and, at high exposure of anhydrite to the circulating fluid, of hematite scaling in the geothermal installation. In general, incipient corrosion of the casing can be detected at the production wellhead through an increase in H2(aq) and the enhanced precipitation of Fe-bearing aluminosilicates. The appearance of magnetite in scales indicates high corrosion rates.
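The silica geothermometry referred to above can be illustrated with a short sketch. One classical formulation is the quartz (no steam loss) calibration after Fournier, T(°C) = 1309 / (5.19 − log₁₀ SiO₂) − 273.15 with SiO₂ in mg/kg; the silica concentration used below is an invented example value, not one taken from the Basel model.

```python
import math

def quartz_geothermometer_celsius(sio2_mg_per_kg):
    """Classical quartz geothermometer (no steam loss), after Fournier:
    T(degC) = 1309 / (5.19 - log10(SiO2)) - 273.15, with SiO2 in mg/kg."""
    return 1309.0 / (5.19 - math.log10(sio2_mg_per_kg)) - 273.15

# hypothetical silica content of a produced fluid (mg/kg); value invented
t_reservoir = quartz_geothermometer_celsius(260.0)   # roughly 199 degC
```

If no silica-bearing phases precipitate during ascent, the silica content of the wellhead fluid preserves the quartz equilibrium established at depth, which is why such a formula can recover a reservoir temperature near 200 °C.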
Abstract:
Anthracyclines are used in over 50% of childhood cancer treatment protocols, but their clinical usefulness is limited by anthracycline-induced cardiotoxicity (ACT) manifesting as asymptomatic cardiac dysfunction and congestive heart failure in up to 57% and 16% of patients, respectively. Candidate gene studies have reported genetic associations with ACT, but these studies have in general lacked robust patient numbers, independent replication or functional validation. Thus, the individual variability in ACT susceptibility remains largely unexplained. We performed a genome-wide association study in 280 patients of European ancestry treated for childhood cancer, with independent replication in similarly treated cohorts of 96 European and 80 non-European patients. We identified a nonsynonymous variant (rs2229774, p.Ser427Leu) in RARG highly associated with ACT (P = 5.9 × 10⁻⁸, odds ratio (95% confidence interval) = 4.7 (2.7–8.3)). This variant alters RARG function, leading to derepression of the key ACT genetic determinant Top2b, and provides new insight into the pathophysiology of this severe adverse drug reaction.
Abstract:
Seizure freedom in patients suffering from pharmacoresistant epilepsies is still not achieved in 20–30% of all cases. Hence, current therapies need to be improved, based on a more complete understanding of ictogenesis. In this respect, the analysis of functional networks derived from intracranial electroencephalographic (iEEG) data has recently become a standard tool. Functional networks, however, are purely descriptive models and thus are conceptually unable to predict fundamental features of iEEG time series, e.g., in the context of therapeutic brain stimulation. In this paper we present first steps towards overcoming the limitations of functional network analysis, by showing that its results are implied by a simple predictive model of time-sliced iEEG time series. More specifically, we learn distinct graphical models (so-called Chow–Liu (CL) trees) as models for the spatial dependencies between iEEG signals. Bayesian inference is then applied to the CL trees, allowing for an analytic derivation/prediction of functional networks based on thresholding of the absolute-value Pearson correlation coefficient (CC) matrix. Using various measures, the networks obtained in this way are then compared to those derived in the classical way from the empirical CC matrix. In the high-threshold limit we find (a) an excellent agreement between the two networks and (b) key features of peri-ictal networks as previously reported in the literature. Apart from functional networks, both matrices are also compared element-wise, showing that the CL approach leads to a sparse representation by setting small correlations to values close to zero while preserving the larger ones. Overall, this paper shows the validity of CL trees as simple, spatially predictive models for peri-ictal iEEG data. Moreover, we suggest straightforward generalizations of the CL approach for modeling the temporal features of iEEG signals as well.
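The two objects compared in this abstract can be sketched concretely. For jointly Gaussian signals, pairwise mutual information grows monotonically with |CC|, so a Chow–Liu tree reduces to a maximum spanning tree over the absolute-value correlation matrix. The sketch below uses synthetic data standing in for iEEG channels (channel count, sample count, and threshold are invented; this is not the authors' implementation):

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)

# synthetic stand-in for one time slice of multichannel iEEG: 6 channels x 500 samples
X = rng.standard_normal((6, 500))
X[1] += 0.8 * X[0]          # couple channels 0 and 1

C = np.corrcoef(X)          # empirical Pearson CC matrix

# Chow-Liu tree: for Gaussian signals, mutual information is monotone in |CC|,
# so the maximum spanning tree over |CC| is the Chow-Liu tree
W = -np.abs(C)              # negate so a MINIMUM spanning tree maximizes |CC|
np.fill_diagonal(W, 0.0)
tree = minimum_spanning_tree(W)
edges = np.transpose(tree.nonzero())     # the n-1 tree edges

# classical functional network: threshold the empirical |CC| matrix
threshold = 0.5
A = (np.abs(C) >= threshold) & ~np.eye(len(C), dtype=bool)
```

In the high-threshold regime only the strongest correlations survive in `A`, and these are exactly the edges the tree retains, which is the agreement the abstract reports.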
Abstract:
In this work, we propose a novel network-coding-enabled NDN architecture for the delivery of scalable video. Our scheme utilizes network coding to address a problem that arises in the original NDN protocol, where optimal use of the bandwidth and caching resources necessitates coordination of the forwarding decisions. To optimize the performance of the proposed network-coding-based NDN protocol and render it appropriate for transmission of scalable video, we devise a novel rate allocation algorithm that decides on the optimal rates of Interest messages sent by clients and intermediate nodes. This algorithm guarantees that the achieved flow of Data objects will maximize the average quality of the video delivered to the client population. To support the handling of Interest messages and Data objects when intermediate nodes perform network coding, we modify the standard NDN protocol and introduce the use of Bloom filters, which efficiently store additional information about the Interest messages and Data objects. The proposed architecture is evaluated for transmission of scalable video over PlanetLab topologies. The evaluation shows that the proposed scheme performs very close to optimal.
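The Bloom filter used above is a generic data structure worth a minimal sketch: a compact bit array that answers set-membership queries with no false negatives and a small, tunable false-positive rate. The bit-array size, hash count, and content names below are invented for illustration and do not reflect the paper's actual wire format:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: compact, probabilistic set membership.
    No false negatives; false positives occur with small probability."""

    def __init__(self, m_bits=1024, k_hashes=4):
        self.m = m_bits
        self.k = k_hashes
        self.bits = 0                       # bit array packed into one int

    def _positions(self, item):
        # derive k bit positions from salted SHA-256 digests
        for salt in range(self.k):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def __contains__(self, item):
        return all((self.bits >> pos) & 1 for pos in self._positions(item))

# e.g. a node compactly records which (hypothetical) content names
# it has already seen Interests for
seen = BloomFilter()
seen.add("/video/layer1/segment42")
```

The appeal for an NDN-style node is that membership information about many Interest messages or Data objects fits in a fixed, small number of bits, at the cost of occasional false positives.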
Abstract:
PURPOSE To quantify the coinciding improvements in the clinical diagnosis of sepsis, its documentation in the electronic health record, and the subsequent medical coding of sepsis for billing purposes in recent years. METHODS We examined 98,267 hospitalizations in 66,208 patients who met systemic inflammatory response syndrome (SIRS) criteria at a tertiary care center from 2008 to 2012. We used g-computation to estimate the causal effect of the year of hospitalization on receiving an International Classification of Diseases, Ninth Revision, Clinical Modification discharge diagnosis code for sepsis, by estimating changes in the probability of being diagnosed and coded for sepsis during the study period. RESULTS When adjusted for demographics, Charlson-Deyo comorbidity index, blood culture frequency per hospitalization, and intensive care unit admission, the causal risk difference for receiving a discharge code for sepsis per 100 hospitalizations with SIRS, had the hospitalization occurred in 2012, was estimated to be 3.9% (95% confidence interval [CI], 3.8%-4.0%), 3.4% (95% CI, 3.3%-3.5%), 2.2% (95% CI, 2.1%-2.3%), and 0.9% (95% CI, 0.8%-1.1%) for 2008 to 2011, respectively. CONCLUSIONS Patients with similar characteristics and risk factors had a higher probability of being diagnosed, documented, and coded for sepsis in 2012 than in previous years, which contributed to an apparent increase in sepsis incidence.
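The g-computation step can be sketched in its simplest (fully stratified) form: estimate the outcome risk in each confounder stratum, then average those risks over the marginal confounder distribution with the exposure (here, hospitalization year) set to one fixed value for everyone. All numbers below are simulated for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# simulated cohort: ICU admission acts as a confounder that raises
# both illness severity and the probability of a sepsis discharge code
icu = rng.random(n) < 0.3
year_2012 = rng.random(n) < 0.5            # exposure: 2012 vs an earlier year
p_code = 0.05 + 0.04 * year_2012 + 0.10 * icu
coded = rng.random(n) < p_code

def standardized_risk(exposure_value):
    """G-computation (standardization): average the stratum-specific risks
    over the confounder distribution, with exposure set for everyone."""
    total = 0.0
    for stratum in (True, False):
        mask = (icu == stratum) & (year_2012 == exposure_value)
        total += coded[mask].mean() * (icu == stratum).mean()
    return total

# causal risk difference per hospitalization; about 0.04 by construction here
risk_diff = standardized_risk(True) - standardized_risk(False)
```

In the study itself the stratum-specific risks come from a fitted regression model rather than raw stratification, but the standardization step is the same.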
Abstract:
Objective The validity of current ultra-high risk (UHR) criteria is under-examined in help-seeking minors, particularly in children below the age of 12 years. Thus, the present study investigated predictors of one-year outcome in children and adolescents (CAD) with UHR status. Method Thirty-five children and adolescents (age 9–17 years) meeting UHR criteria according to the Structured Interview for Psychosis-Risk Syndromes were followed up for 12 months. Regression analyses were employed to detect baseline predictors of conversion to psychosis and of the outcome of non-converters (remission versus persistence of UHR status). Results At one-year follow-up, 20% of patients had developed schizophrenia, 25.7% had remitted from their UHR status, and the UHR status had persisted in 54.3%. No patient had fully remitted from mental disorders, even if UHR status was not maintained. Conversion was best predicted by any transient psychotic symptom and a disorganized communication score. No prediction model for outcome beyond conversion was identified. Conclusions Our findings provide the first evidence for the predictive utility of UHR criteria in CAD in terms of brief intermittent psychotic symptoms (BIPS) when accompanied by signs of cognitive impairment, i.e., disorganized communication. However, because attenuated psychotic symptoms (APS) related to thought content and perception were indicative of non-conversion at 1-year follow-up, their use in early detection of psychosis in CAD needs further study. Overall, these findings further highlight the need for more in-depth studies of developmental peculiarities in the early detection and treatment of psychoses with illness onset in childhood and early adolescence.
Abstract:
BACKGROUND Recent reports using administrative claims data suggest the incidence of community- and hospital-onset sepsis is increasing. Whether this reflects changing epidemiology, more effective diagnostic methods, or changes in physician documentation and medical coding practices is unclear. METHODS We performed a temporal-trend study from 2008 to 2012 using administrative claims data and patient-level clinical data of adult patients admitted to Barnes-Jewish Hospital in St. Louis, Missouri. Temporal trends and annual percent change were estimated using regression models with autoregressive integrated moving average errors. RESULTS We analyzed 62,261 inpatient admissions during the 5-year study period. 'Any SIRS' (i.e., SIRS on a single calendar day during the hospitalization) and 'multi-day SIRS' (i.e., SIRS on 3 or more calendar days), which both use patient-level data, and medical coding for sepsis (i.e., ICD-9-CM discharge diagnosis codes 995.91, 995.92, or 785.52) were present in 35.3%, 17.3%, and 3.3% of admissions, respectively. The incidence of admissions coded for sepsis increased 9.7% (95% CI: 6.1, 13.4) per year, while the patient data-defined events of 'any SIRS' decreased by 1.8% (95% CI: -3.2, -0.5) per year and 'multi-day SIRS' did not change significantly over the study period. Clinically defined sepsis (defined as SIRS plus bacteremia) and severe sepsis (defined as SIRS plus hypotension and bacteremia) decreased at statistically significant rates of 5.7% (95% CI: -9.0, -2.4) and 8.6% (95% CI: -12.6, -4.4) annually. All-cause mortality, SIRS mortality, and SIRS and clinically defined sepsis case fatality did not change significantly during the study period. Sepsis mortality based on ICD-9-CM codes, however, increased by 8.8% (95% CI: 1.9, 16.2) annually. CONCLUSIONS The incidence of sepsis, defined by ICD-9-CM codes, and sepsis mortality increased steadily without a concomitant increase in SIRS or clinically defined sepsis.
Our results highlight the need to develop strategies to integrate clinical patient-level data with administrative data to draw more accurate conclusions about the epidemiology of sepsis.
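The annual-percent-change estimates above come from regression models with ARIMA errors; setting the error structure aside, the core quantity reduces to a log-linear trend, as in this sketch with invented annual rates (not the study's data):

```python
import numpy as np

# invented annual sepsis-coded admission rates per 100 admissions, 2008-2012
years = np.arange(2008, 2013)
rate = np.array([2.2, 2.5, 2.7, 3.0, 3.3])

# log-linear trend: if log(rate) = a + b*t with t in years,
# the annual percent change (APC) is (exp(b) - 1) * 100
slope, intercept = np.polyfit(years - years[0], np.log(rate), 1)
apc = (np.exp(slope) - 1) * 100     # about 10% per year for these numbers
```

Modeling the residuals as an ARIMA process, as the study does, mainly changes the standard errors (and hence the confidence intervals) rather than this point estimate.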
Abstract:
Content-Centric Networking (CCN) naturally supports multi-path communication, as it allows the simultaneous use of multiple interfaces (e.g., LTE and WiFi). When multiple sources and multiple clients are considered, the optimal set of distribution trees should be determined in order to make optimal use of all the available interfaces. This is not a trivial task, as it is a computationally intense procedure that must be done centrally. The need for central coordination can be removed by employing network coding, which also offers improved resiliency to errors and large throughput gains. In this paper, we propose NetCodCCN, a protocol for integrating network coding in CCN. In comparison to previous works proposing to enable network coding in CCN, NetCodCCN permits Interest aggregation and Interest pipelining, which reduce the data retrieval times. The experimental evaluation shows that the proposed protocol leads to significant improvements in content retrieval delay compared to the original CCN. Our results demonstrate that the use of network coding adds robustness to losses and allows more efficient use of the available network resources. The performance gains are verified for content retrieval in various network scenarios.
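The reason network coding removes the need for central coordination is that a receiver can recover the original content from any sufficiently large set of coded packets, regardless of which specific packets each node forwarded. A minimal sketch of random linear network coding over GF(2) (XOR combinations; the packet count and sizes are invented and this is not NetCodCCN's actual packet format):

```python
import numpy as np

rng = np.random.default_rng(7)

# four hypothetical source packets ("Data objects") of 8 bytes each
K = 4
packets = rng.integers(0, 256, size=(K, 8), dtype=np.uint8)

def encode_one(packets, rng):
    """One coded packet: XOR of a random nonzero subset of the sources,
    tagged with its GF(2) coefficient vector."""
    k = len(packets)
    coeffs = np.zeros(k, dtype=np.uint8)
    while not coeffs.any():
        coeffs = rng.integers(0, 2, size=k, dtype=np.uint8)
    payload = np.zeros_like(packets[0])
    for i in range(k):
        if coeffs[i]:
            payload ^= packets[i]
    return coeffs, payload

def decode(coded, k):
    """Gauss-Jordan elimination over GF(2); returns None until rank k is reached."""
    if len(coded) < k:
        return None
    A = np.array([c for c, _ in coded], dtype=np.uint8)
    B = np.array([p for _, p in coded], dtype=np.uint8)
    row = 0
    for col in range(k):
        pivot = next((r for r in range(row, len(A)) if A[r, col]), None)
        if pivot is None:
            return None                    # not yet full rank
        A[[row, pivot]] = A[[pivot, row]]
        B[[row, pivot]] = B[[pivot, row]]
        for r in range(len(A)):
            if r != row and A[r, col]:
                A[r] ^= A[row]
                B[r] ^= B[row]
        row += 1
    return B[:k]                           # rows reduced to identity order

# a receiver simply collects coded packets until decoding succeeds
coded = []
recovered = None
while recovered is None:
    coded.append(encode_one(packets, rng))
    recovered = decode(coded, K)
```

Because any full-rank collection of coded packets suffices, intermediate nodes can recombine and forward packets independently, which is what enables the Interest aggregation and pipelining described above. Practical systems typically use a larger field such as GF(2⁸) to make random combinations independent with higher probability.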
Abstract:
As translation is the final step in gene expression, it is particularly important to understand the processes involved in translation regulation. In recent years it has been shown that a class of RNAs, the non-protein-coding RNAs (ncRNAs), is involved in the regulation of gene expression via various mechanisms (e.g., gene silencing by microRNAs). Almost all of the ncRNAs discovered so far target the mRNA in order to modulate protein biosynthesis; this is rather unexpected considering the crucial role of the ribosome during gene expression. However, recent data from our laboratory showed that there is a new class of ncRNAs that target the ribosome itself (Gebetsberger et al., 2012; Pircher et al., 2014). These so-called ribosome-associated ncRNAs (rancRNAs) have an impact on translation regulation, mainly by modulating the rate of protein biosynthesis. The main goal of this project is to identify and describe novel potential regulatory rancRNAs in H. volcanii, with a focus on intergenic candidates. Northern blot analyses have already revealed interactions with the ribosome and shown differential expression of rancRNAs during different growth phases or under specific stress conditions. To investigate the biological relevance of these rancRNAs, knock-outs were generated in H. volcanii and used for phenotypic characterization studies. The rancRNA s194 showed association with the 50S ribosomal subunit in vitro and in vivo, was capable of inhibiting peptide bond formation, and seems to inhibit translation in vitro. These preliminary data make the rancRNA s194 an interesting candidate for further functional studies to identify the molecular mechanisms by which rancRNAs can modulate protein biosynthesis. Characterization of further rancRNA candidates is also underway.
Abstract:
Although a trimodality regimen for patients with stage IIIA/pN2 non-small-cell lung cancer (NSCLC) has been variably used owing to limited evidence for its benefits, it remains unknown whether any patient subgroup actually receives benefit from such an approach. To explore this question, the published data were reviewed from 1990 to 2015 to identify the possible predictors and prognosticators in this setting. Overall survival was the endpoint of our study. Of 27 identified studies, none had studied the predictors of improved outcomes with trimodality treatment. Of the potential patient- and tumor-related prognosticators, age, gender, and histologic type were the most frequently formally explored. However, none of the 3 was found to influence overall survival. The most prominent finding of the present review was the substantial lack of data supporting a trimodality treatment approach in any patient subgroup. As demonstrated in completed prospective randomized studies, the use of surgery for stage IIIA NSCLC should be limited to well-defined clinical trials.
Abstract:
PURPOSE Our main objective was to prospectively determine the prognostic value of [¹⁸F]fluorodeoxyglucose positron emission tomography/computed tomography (PET/CT) after two cycles of rituximab plus cyclophosphamide, doxorubicin, vincristine, and prednisone given every 14 days (R-CHOP-14) under standardized treatment and PET evaluation criteria. PATIENTS AND METHODS Patients with any stage of diffuse large B-cell lymphoma were treated with six cycles of R-CHOP-14 followed by two cycles of rituximab. PET/CT examinations were performed at baseline, after two cycles (and after four cycles if the patient was PET-positive after two cycles), and at the end of treatment. PET/CT examinations were evaluated locally and by central review. The primary end point was event-free survival at 2 years (2-year EFS). RESULTS Median age of the 138 evaluable patients was 58.5 years with a WHO performance status of 0, 1, or 2 in 56%, 36%, or 8% of the patients, respectively. By local assessment, 83 PET/CT scans (60%) were reported as positive and 55 (40%) as negative after two cycles of R-CHOP-14. Two-year EFS was significantly shorter for PET-positive compared with PET-negative patients (48% v 74%; P = .004). Overall survival at 2 years was not significantly different, with 88% for PET-positive versus 91% for PET-negative patients (P = .46). By using central review and the Deauville criteria, 2-year EFS was 41% versus 76% (P < .001) for patients who had interim PET/CT scans after two cycles of R-CHOP-14 and 24% versus 72% (P < .001) for patients who had PET/CT scans at the end of treatment. CONCLUSION Our results confirmed that an interim PET/CT scan has limited prognostic value in patients with diffuse large B-cell lymphoma homogeneously treated with six cycles of R-CHOP-14 in a large prospective trial. At this point, interim PET/CT scanning is not ready for clinical use to guide treatment decisions in individual patients.
Abstract:
BACKGROUND: A lack of adaptive coping and enhanced maladaptive coping with stress and negative emotions are implicated in many psychopathological disorders. We describe the development of a new scale to investigate the relative contribution of different coping styles to psychopathology in a large population sample. We hypothesized that the magnitude of the supposed positive correlation between maladaptive coping and psychopathology would be stronger than the supposed negative correlation between adaptive coping and psychopathology. We also examined whether distinct coping-style patterns emerge for different psychopathological syndromes. METHODS: A total of 2200 individuals from the general population participated in an online survey. The Patient Health Questionnaire-9 (PHQ-9), the Obsessive-Compulsive Inventory revised (OCI-R) and the Paranoia Checklist were administered along with a novel instrument called the Maladaptive and Adaptive Coping Styles (MAX) questionnaire. Participants were reassessed six months later. RESULTS: MAX consists of three dimensions representing adaptive coping, maladaptive coping and avoidance. Similar response patterns emerged across all psychopathological syndromes. Maladaptive coping was more strongly related to psychopathology than adaptive coping, both cross-sectionally and longitudinally. The overall number of coping styles adopted by an individual predicted greater psychopathology. Mediation analysis suggests that a mild positive relationship between adaptive and certain maladaptive styles (emotional suppression) partially accounts for the attenuated relationship between adaptive coping and depressive symptoms. LIMITATIONS: Results should be replicated in a clinical population. CONCLUSIONS: Results suggest that maladaptive and adaptive coping styles are not reciprocal. Reducing maladaptive coping seems to be more important for outcome than enhancing adaptive coping. 
The study supports transdiagnostic approaches advocating that maladaptive coping is a common factor across different psychopathologies.