884 results for Minimal-complexity classifier
Abstract:
Since the immunochemical identification of the bullous pemphigoid antigen 230 (BP230) as one of the major target autoantigens of bullous pemphigoid (BP) in 1981, our understanding of this protein has increased significantly. Cloning of its gene, together with the development and characterization of animal models carrying engineered gene mutations or spontaneous mouse mutations, has revealed an unexpected complexity of the gene encoding BP230. The latter, now called dystonin (DST), is composed of at least 100 exons and gives rise to three major isoforms: an epithelial, a neuronal and a muscular isoform, named BPAG1e (corresponding to the original BP230), BPAG1a and BPAG1b, respectively. The various BPAG1 isoforms play a key role in fundamental processes such as cell adhesion, cytoskeleton organization, and cell migration. Genetic defects of BPAG1 isoforms underlie epidermolysis bullosa and complex, devastating neurological diseases. In this review, we summarize recent advances in our knowledge of the various BPAG1 isoforms, their role in biological processes, and their involvement in human diseases.
Abstract:
Minimal invasive extracorporeal circulation (MiECC) systems have initiated important efforts within science and technology to further improve the biocompatibility of cardiopulmonary bypass components, minimize adverse effects and improve end-organ protection. The Minimal invasive Extra-Corporeal Technologies international Society was founded to create an international forum for the exchange of ideas on the clinical application and research of minimal invasive extracorporeal circulation technology. The present work is a consensus document developed to standardize the terminology and definition of minimal invasive extracorporeal circulation technology and to provide recommendations for clinical practice. The goal of this manuscript is to promote the adoption of MiECC systems in clinical practice as a multidisciplinary strategy involving cardiac surgeons, anaesthesiologists and perfusionists.
Abstract:
We read with great interest the large-scale network meta-analysis by Kowalewski et al. comparing clinical outcomes of patients undergoing coronary artery bypass grafting (CABG) operated on using minimal invasive extracorporeal circulation (MiECC) or off-pump coronary artery bypass (OPCAB) with those undergoing surgery on conventional cardiopulmonary bypass (CPB) [1]. The authors integrated two recently published meta-analyses, comparing MiECC and OPCAB with conventional CPB, respectively [2, 3], into a single study. According to its results, MiECC and OPCAB are both strongly associated with improved perioperative outcomes after CABG when compared with CABG performed on conventional CPB. The authors conclude that MiECC may represent an attractive compromise between OPCAB and conventional CPB. After carefully reading the whole manuscript, however, it becomes evident that the role of MiECC is clearly undervalued. Detailed statistical analysis using the surface under the cumulative ranking probabilities indicated that MiECC was the safer and more effective intervention with respect to all-cause mortality and protection from myocardial infarction, cerebral stroke, postoperative atrial fibrillation and renal dysfunction when compared with OPCAB. Even though no statistically significant differences were demonstrated between MiECC and OPCAB, the superiority of MiECC is apparent from the hierarchy of treatments in the probability analysis, which ranked MiECC first, followed by OPCAB and conventional CPB. Thus, MiECC does not represent a compromise between OPCAB and conventional CPB, but an attractive dominant technique in CABG surgery. These results are consistent with the largest published meta-analysis, by Anastasiadis et al., comparing MiECC with conventional CPB, which included a total of 2770 patients. A significant decrease in mortality was observed when MiECC was used, and MiECC was also associated with a reduced risk of postoperative myocardial infarction and neurological events [4]. Similarly, another recent meta-analysis, by Benedetto et al., compared MiECC with OPCAB and found comparable outcomes between the two surgical techniques [5]. As stated in the text, the superiority of MiECC over OPCAB observed in the current network meta-analysis could be attributed to the fact that MiECC offers the potential for complete revascularization, whereas OPCAB poses a challenge for inexperienced surgeons, especially when distal marginal branches on the lateral and/or posterior wall of the heart need revascularization. This is reflected in the significantly lower number of distal anastomoses performed in OPCAB compared with conventional CPB. Therefore, taking into consideration the literature published to date, including the results of the current article, we advocate that MiECC should be integrated into clinical practice guidelines as a state-of-the-art technique and become standard practice for perfusion in coronary revascularization surgery.
Abstract:
BACKGROUND Diabetes mellitus and angiographic coronary artery disease complexity are intertwined and unfavorably affect prognosis after percutaneous coronary intervention, but their relative impact on long-term outcomes after percutaneous coronary intervention with drug-eluting stents remains controversial. This study determined drug-eluting stent outcomes in relation to diabetic status and coronary artery disease complexity as assessed by the Synergy Between PCI With Taxus and Cardiac Surgery (SYNTAX) score. METHODS AND RESULTS In a patient-level pooled analysis of 4 all-comers trials, 6081 patients were stratified according to diabetic status and according to the median SYNTAX score (≤11 or >11). The primary end point was major adverse cardiac events, a composite of cardiac death, myocardial infarction, and clinically indicated target lesion revascularization within 2 years. Diabetes mellitus was present in 1310 patients (22%), and new-generation drug-eluting stents were used in 4554 patients (75%). Major adverse cardiac events occurred in 173 diabetic patients (14.5%) and 436 nondiabetic patients (9.9%; P<0.001). In adjusted Cox regression analyses, SYNTAX score and diabetes mellitus were both associated with the primary end point (P<0.001 and P=0.028, respectively; P for interaction, 0.07). In multivariable analyses, diabetic versus nondiabetic patients had higher risks of major adverse cardiac events (hazard ratio, 1.25; 95% confidence interval, 1.03-1.53; P=0.026) and target lesion revascularization (hazard ratio, 1.54; 95% confidence interval, 1.18-2.01; P=0.002) but similar risks of cardiac death (hazard ratio, 1.41; 95% confidence interval, 0.96-2.07; P=0.08) and myocardial infarction (hazard ratio, 0.89; 95% confidence interval, 0.64-1.22; P=0.45), without significant interaction with SYNTAX score ≤11 or >11 for any of the end points. CONCLUSIONS In this population treated predominantly with new-generation drug-eluting stents, diabetic patients were at increased risk of repeat target lesion revascularization consistently across the spectrum of disease complexity. The SYNTAX score was an independent predictor of 2-year outcomes but did not modify the effect of diabetes mellitus. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00297661, NCT00389220, NCT00617084, and NCT01443104.
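As an illustration of the kind of adjusted analysis described above, the sketch below fits a Cox proportional-hazards model with a diabetes-by-SYNTAX interaction term using the lifelines library. The input file and all column names are hypothetical placeholders, not the study's actual data.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical columns: time-to-event in months, event indicator (1 = MACE),
    # diabetes (0/1), and syntax_high (1 if SYNTAX score > 11).
    df = pd.read_csv("pooled_trials.csv")
    df["dm_x_syntax"] = df["diabetes"] * df["syntax_high"]

    cph = CoxPHFitter()
    cph.fit(df[["time", "event", "diabetes", "syntax_high", "dm_x_syntax"]],
            duration_col="time", event_col="event")
    cph.print_summary()  # hazard ratios, 95% CIs, and the interaction p-value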
Abstract:
The logic PJ is a probabilistic logic defined by adding (noniterated) probability operators to the basic justification logic J. In this paper we establish upper and lower bounds for the complexity of the derivability problem in the logic PJ. The main result of the paper is that the complexity of the derivability problem in PJ remains the same as the complexity of the derivability problem in the underlying logic J, namely Π^p_2-complete. This implies that the probability operators do not increase the complexity of the logic, although they arguably enrich the expressiveness of the language.
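For readers unfamiliar with the notation, Π^p_2 is the class at the second level of the polynomial hierarchy; the following standard characterization is textbook background, not a claim of the paper:

    \[
      \Pi_2^p \;=\; \mathrm{co}\text{-}\Sigma_2^p \;=\; \mathrm{coNP}^{\mathrm{NP}}
    \]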
Abstract:
mRNA 3′ polyadenylation is central to mRNA biogenesis in prokaryotes and eukaryotes, and is implicated in numerous aspects of mRNA metabolism, including the efficiency of mRNA export from the nucleus, message stability, and initiation of translation. However, due to the great complexity of the eukaryotic polyadenylation apparatus, the mechanisms of RNA 3′ end processing have remained elusive. Although the RNA processing reactions leading to polyadenylated messenger RNA have been studied in many systems, and much progress has been made, a complete understanding of the biochemistry of the poly(A) polymerase enzyme is still lacking. My research uses Vaccinia virus as a model system to gain a better understanding of this complicated polyadenylation process, which consists of RNA binding, catalysis and polymerase translocation. Vaccinia virus replicates in the cytoplasm of its host cell, so it must employ its own poly(A) polymerase (PAP), a heterodimer of two virus-encoded proteins, VP55 and VP39. VP55 is the catalytic subunit, adding 30 adenylates to a non-polyadenylated RNA in a rapid, processive manner before abruptly changing to a slow, non-processive mode of adenylate addition and dissociating from the RNA. VP39 is the stimulatory subunit: it has no polyadenylation catalytic activity by itself, but when associated with VP55 it facilitates the semi-processive synthesis of tails several hundred adenylates in length. Oligonucleotide selection and competition studies have shown that the heterodimer binds a minimal motif of (rU)2(N)25U, the “heterodimer binding motif”, within an oligonucleotide, and that its primer selection for polyadenylation is base-type specific. Crosslinking studies using photosensitive uridylate analogs show that within a VP55-VP39-primer ternary complex, VP55 contacts all three required uridylates, while VP39 contacts only the downstream uridylate. Further studies, using a backbone-anchored photosensitive crosslinker, show that both PAP subunits are in close proximity to the downstream −10 to −21 region of 50mer model primers containing the heterodimer binding motif. This equal crosslinking to both subunits suggests that the dimerization of VP55 and VP39 creates either a cleft or a channel between the two subunits through which this region of RNA passes. Peptide mapping studies of VP39 covalently crosslinked to the oligonucleotide have identified residue R107 as the amino acid in close proximity to the −10 uridylate. This helps us project a conceptual model onto the known physical surface of this subunit. In the absence of any tertiary structural data for VP55, we have used a series of oligonucleotide selection assays, as well as crosslinking, nucleotide transfer assays, and gel shift assays, to gain insight into the requirements for binding, polyadenylation and translocation. Collectively, these data allow us to put together a comprehensive model of the structure and function of the polyadenylation ternary complex consisting of VP39, VP55 and RNA.
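The heterodimer binding motif lends itself to a simple pattern search. Below is a minimal Python sketch that scans an RNA string for (rU)2(N)25U, two uridylates followed by any 25 nucleotides and a third uridylate; the example sequence is invented purely for illustration.

    import re

    # Motif: two uridylates, any 25 nucleotides, then a downstream uridylate.
    MOTIF = re.compile(r"UU[ACGU]{25}U")

    rna = "AUGG" + "UU" + "ACGU" * 6 + "A" + "U" + "GGCC"  # toy sequence with one match
    match = MOTIF.search(rna)
    if match:
        print(f"heterodimer binding motif at positions {match.start()}-{match.end()}")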
Abstract:
Digital terrain models (DTMs) typically contain large numbers of postings, from hundreds of thousands to billions. Many algorithms that run on DTMs require topological knowledge of the postings, such as finding nearest neighbors or finding the posting closest to a chosen location. If the postings are arranged irregularly, topological information is costly to compute and to store. This paper offers a practical approach to organizing and searching irregularly-spaced data sets by presenting a collection of efficient algorithms (O(N), O(lg N)) that compute important topological relationships with only a simple supporting data structure. These relationships include finding the postings within a window, locating the posting nearest a point of interest, finding the neighborhood of postings nearest a point of interest, and ordering the neighborhood counter-clockwise. The algorithms depend only on two sorted arrays of two-element tuples, each holding a planimetric coordinate and an integer identification number indicating which posting the coordinate belongs to. There is one array for each planimetric coordinate (eastings and northings). These two arrays cost minimal overhead to create and store but permit the data to remain arranged irregularly.
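The supporting data structure is simple enough to sketch. Assuming Python (the paper prescribes no particular language), the example below builds the two sorted coordinate arrays and answers a window query by intersecting two binary-search range lookups; the postings and window are invented for illustration.

    import bisect

    # Postings as (easting, northing); the posting id is its list index.
    postings = [(3.2, 7.1), (0.5, 2.2), (9.8, 4.4), (2.7, 6.0), (5.1, 1.3)]

    # One sorted array of (coordinate, id) tuples per planimetric coordinate.
    east = sorted((e, i) for i, (e, _) in enumerate(postings))
    north = sorted((n, i) for i, (_, n) in enumerate(postings))

    def ids_in_range(arr, lo, hi):
        """Posting ids whose coordinate lies in [lo, hi]; O(lg N) searches."""
        left = bisect.bisect_left(arr, (lo, -1))
        right = bisect.bisect_right(arr, (hi, len(arr)))
        return {pid for _, pid in arr[left:right]}

    def window_query(e0, e1, n0, n1):
        """Postings inside the window: intersect the two 1-D range results."""
        return ids_in_range(east, e0, e1) & ids_in_range(north, n0, n1)

    print(window_query(2.0, 6.0, 5.0, 8.0))  # -> {0, 3}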
Abstract:
Random Forests™ is reported to be one of the most accurate classification algorithms for complex data analysis. It shows excellent performance even when most predictors are noisy and the number of variables is much larger than the number of observations. In this thesis, Random Forests was applied to a large-scale lung cancer case-control study. A novel way of automatically selecting prognostic factors was proposed, and a synthetic positive control was used to validate the Random Forests method. Throughout this study we showed that Random Forests can handle a large number of weak input variables without overfitting and can account for non-additive interactions between these input variables. Random Forests can also be used for variable selection without being adversely affected by collinearities. Random Forests can deal with large-scale data sets without rigorous data preprocessing, and it has a robust variable importance ranking measure. We propose a novel variable selection method in the context of Random Forests that uses the data noise level as the cut-off value to determine the subset of important predictors. This new approach enhanced the ability of the Random Forests algorithm to automatically identify important predictors in complex data; the cut-off value can also be adjusted based on the results of the synthetic positive control experiments. When the data set had a high variables-to-observations ratio, Random Forests complemented the established logistic regression, and this study suggests that Random Forests is recommended for such high-dimensionality data: one can use Random Forests to select the important variables and then use logistic regression, or Random Forests itself, to estimate the effect sizes of the predictors and to classify new observations. We also found that mean decrease in accuracy is a more reliable variable ranking measure than mean decrease in Gini.
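The noise-level cut-off idea can be illustrated with scikit-learn. The sketch below appends permuted "shadow" copies of the predictors as a synthetic control, ranks variables by permutation importance (a mean-decrease-in-accuracy measure), and keeps those that beat the strongest noise variable. The data and parameter choices are illustrative, not the thesis's actual study.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=300, n_features=20, n_informative=4,
                               random_state=0)

    # Synthetic noise control: each column permuted independently,
    # destroying any association with the outcome.
    X_noise = rng.permuted(X, axis=0)
    X_all = np.hstack([X, X_noise])

    rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_all, y)
    imp = permutation_importance(rf, X_all, y, n_repeats=10,
                                 random_state=0).importances_mean

    cutoff = imp[X.shape[1]:].max()   # the data noise level as the cut-off
    selected = np.where(imp[:X.shape[1]] > cutoff)[0]
    print("selected predictors:", selected)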
Abstract:
Objectives. Minimal Important Differences (MIDs) establish benchmarks for interpreting mean differences in clinical trials involving quality-of-life outcomes and inform discussions of clinically meaningful change in patient status. The purpose of this study was to assess MIDs for the Functional Assessment of Cancer Therapy–Melanoma (FACT-M). Methods. A prospective validation study of the FACT-M was performed with 273 patients with stage I to IV melanoma. FACT-M, Karnofsky Performance Status (KPS), and Eastern Cooperative Oncology Group Performance Status (ECOG-PS) scores were obtained at baseline and 3 months following enrollment. Anchor- and distribution-based methods were used to assess MIDs, and the correspondence between the MID ranges derived from each method was evaluated. Results. This study indicates that approximate MID ranges for the FACT-M subscales are 5 to 8 points for the Trial Outcome Index, 4 to 5 points for the Melanoma Combined Subscale, 2 to 4 points for the Melanoma Subscale, and 1 to 2 points for the Melanoma Surgery Subscale. Each method produced similar but not identical MID ranges. Conclusions. The properties of the anchor instrument employed to derive MIDs directly affect the resulting MID ranges and point values. When MIDs are offered as supporting evidence of a clinically meaningful change, the anchor instrument used to derive the thresholds should be clearly stated, along with evidence supporting its choice as the most appropriate for the domain of interest. In this analysis, the KPS was a more appropriate measure than the ECOG-PS for assessing MIDs.
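Of the distribution-based methods referred to above, two common criteria are one-half the baseline standard deviation and one standard error of measurement (SEM). A minimal sketch, with invented scores and an assumed reliability coefficient rather than the study's data:

    import numpy as np

    # Hypothetical FACT-M subscale scores at baseline (not study data).
    baseline = np.array([62.0, 71.5, 55.0, 80.0, 67.5, 73.0, 59.5, 66.0])
    reliability = 0.88  # assumed internal-consistency reliability (alpha)

    sd = baseline.std(ddof=1)
    mid_half_sd = 0.5 * sd                 # 0.5-SD criterion
    sem = sd * np.sqrt(1.0 - reliability)  # 1-SEM criterion
    print(f"0.5*SD MID: {mid_half_sd:.1f} points; 1*SEM MID: {sem:.1f} points")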