957 results for SQL Query generation from examples
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors is unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized based on the presence or absence of high ATR, defined as a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of the 10 449 women included in the pooled database, 5333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had a significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In women at high ATR, the use of new-generation DES was associated with a significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P interaction=0.14). In a landmark analysis of high-ATR women, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years stent thrombosis risk was lower with new-generation devices. CONCLUSIONS Use of new-generation DES, even in women at high ATR, is associated with a benefit consistent over 3 years of follow-up and a substantial improvement in very-late thrombotic safety.
Abstract:
BACKGROUND In patients undergoing percutaneous coronary intervention (PCI), new-generation drug-eluting stents (DES) have reduced adverse events compared with early-generation DES. The aim of the current study was to investigate the long-term clinical efficacy and safety of new-generation versus early-generation DES for PCI of unprotected left main coronary artery (uLMCA) disease. METHODS The patient-level data from the ISAR-LEFT MAIN and ISAR-LEFT MAIN 2 randomized trials were pooled. The clinical outcomes of PCI patients assigned to new-generation DES (everolimus- or zotarolimus-eluting stent) versus early-generation DES (paclitaxel- or sirolimus-eluting stent) were studied. The primary endpoint was the composite of death, myocardial infarction (MI), target lesion revascularization, and stroke (MACCE, major adverse cardiac and cerebrovascular events). RESULTS In total, 1257 patients were available. At 3 years, the risk of MACCE was comparable between patients assigned to new-generation DES and early-generation DES (28.2 versus 27.5%; hazard ratio (HR) 1.03, 95% confidence interval (CI) 0.83-1.26; P = 0.86). Definite/probable stent thrombosis was low and comparable between new-generation DES and early-generation DES (0.8 versus 1.6%; HR 0.52, 95% CI 0.18-1.57; P = 0.25); in patients treated with new-generation DES no cases occurred beyond 30 days. Diabetes increased the risk of MACCE in patients treated with new-generation DES but not with early-generation DES (P interaction = 0.004). CONCLUSIONS At 3-year follow-up, PCI with new-generation DES for uLMCA disease shows efficacy comparable to early-generation DES. Rates of stent thrombosis were low in both groups. Diabetes significantly impacts the risk of MACCE at 3 years in patients treated with new-generation DES for uLMCA disease. ClinicalTrials.gov Identifiers: NCT00133237; NCT00598637.
Abstract:
Next-generation DNA sequencing platforms can effectively detect the entire spectrum of genomic variation and are emerging as a major tool for systematic exploration of the universe of variants and interactions in the entire genome. However, the data produced by next-generation sequencing technologies suffer from three basic problems: sequence errors, assembly errors, and missing data. Current statistical methods for genetic analysis are well suited to detecting the association of common variants but are less suitable for rare variants. This raises great challenges for sequence-based genetic studies of complex diseases. This dissertation used the genome continuum model as a general principle, and stochastic calculus and functional data analysis as tools, to develop novel and powerful statistical methods for the next generation of association studies of both qualitative and quantitative traits in the context of sequencing data, ultimately shifting the paradigm of association analysis from the current locus-by-locus analysis to collectively analyzing genome regions. In this project, functional principal component (FPC) methods coupled with high-dimensional data reduction techniques were used to develop novel and powerful methods for testing the associations of the entire spectrum of genetic variation within a segment of genome or a gene, regardless of whether the variants are common or rare. Classical quantitative genetics suffers from high type I error rates and low power for rare variants. To overcome these limitations for resequencing data, this project used functional linear models with a scalar response to develop statistics for identifying quantitative trait loci (QTLs) for both common and rare variants. To illustrate their applications, the functional linear models were applied to five quantitative traits in the Framingham Heart Study. The project also proposed a novel concept of gene-gene co-association, in which a gene or a genomic region is taken as the unit of association analysis, and used stochastic calculus to develop a unified framework for testing the association of multiple genes or genomic regions for both common and rare alleles. The proposed methods were applied to gene-gene co-association analysis of psoriasis in two independent GWAS datasets, which led to the discovery of networks significantly associated with psoriasis.
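As a rough sketch of the functional linear model with scalar response referred to above (the notation here is generic and not taken from the dissertation), the genotype profile of individual i across a genomic region T is treated as a function x_i(t), and a quantitative trait y_i is modeled as

    y_i = \mu + \int_T x_i(t)\,\beta(t)\,dt + \varepsilon_i,

where \beta(t) is a genetic-effect function and \varepsilon_i is residual error. Expanding x_i(t) and \beta(t) in a small number of functional principal components reduces the integral to a low-dimensional regression, so that common and rare variants in the region are tested jointly rather than locus by locus.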
Abstract:
In Latin America, the interference of international organizations in national policy is a phenomenon that cannot be overlooked. It is a burning issue in the current context and relates to the growing presence, in the countries of the region, of a series of papers, documents, and bulletins which, produced within those bodies, point to present and future challenges that Latin America will have to address. In the educational field, they set the policy orientations for the region and represent discourses on education that must be analyzed critically. The aim of this work is to analyze a series of ideas, recommendations, and political-pedagogical rhetorics of the International Organizations regarding the so-called quality of education in Latin America, with the intention of bringing out their discursive productivity, insofar as their influence is tied to signifiers that operate by ordering the political-discursive disputes of the educational field.
Abstract:
RDB2RDF systems generate RDF from relational databases, operating in two different manners: materializing the database content into RDF or acting as virtual RDF datastores that transform SPARQL queries into SQL. In the former, inferences on the RDF data (taking into account the ontologies that they are related to) are normally done by the RDF triple store where the RDF data is materialised and hence the results of the query answering process depend on the store. In the latter, existing RDB2RDF systems do not normally perform such inferences at query time. This paper shows how the algorithm used in the REQUIEM system, focused on handling run-time inferences for query answering, can be adapted to handle such inferences for query answering in combination with RDB2RDF systems.
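As a toy illustration of the virtual-datastore mode described above (this is not code from the paper; the SPARQL query, table names, and mapping are invented for the example), a SPARQL query posed against a mapped vocabulary can be rewritten into SQL over the underlying relational schema:

# Toy illustration of SPARQL-to-SQL rewriting in a virtual RDB2RDF setting.
# The vocabulary-to-schema mapping below is invented for this example;
# real systems typically express such mappings in R2RML or a similar language.

SPARQL_QUERY = """
SELECT ?name WHERE {
  ?e a ex:Employee ;
     ex:name ?name .
}
"""

# Hand-written mapping: ontology class -> (table, key), property -> (table, column).
MAPPING = {
    "ex:Employee": ("employee", "id"),
    "ex:name": ("employee", "name"),
}

def rewrite_to_sql(mapping):
    """Build the SQL a virtual RDB2RDF layer might emit for SPARQL_QUERY."""
    table, _ = mapping["ex:Employee"]
    _, column = mapping["ex:name"]
    return f"SELECT {column} FROM {table};"

print(rewrite_to_sql(MAPPING))  # -> SELECT name FROM employee;

Query-time inference in the style of REQUIEM would take place before this translation step: the original query is first rewritten against the ontology (for example, expanding ex:Employee into a union over its subclasses), and only the rewritten queries are then translated into SQL.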
Abstract:
Spanish Young Generation in Nuclear (Jóvenes Nucleares) is a commission of the Spanish Nuclear Society (SNE) whose main goal is to spread knowledge about nuclear energy throughout society. With this motivation, two seminars have been carried out in collaboration with the Technical University of Madrid: the Seminar of Nuclear Safety in Advanced Reactors (SRA) and the Seminar of Nuclear Fusion (SFN). The first, held every year since 2010, aims to present clearly the safety advances achieved with the new reactors, from a technical yet accessible point of view that requires no deep prior knowledge of nuclear engineering. The second, whose first edition was held in 2011, aims to give a general overview of the past, present, and future of nuclear fusion technology, and was born out of the growing interest of our Spanish Young Generation members in this technology.
Abstract:
The complex event processing (CEP) paradigm has become the solution for real-time analysis of high-volume data, such as stock-market monitoring or road-traffic monitoring, which demands scalability, high throughput, and low latency. A distributed system is used to meet these performance requirements, and those same requirements force the system not to store the events but to process them on the fly as they arrive. Such distributed systems are complex and take considerable time to learn and use, and most of them lack a declarative language in which to express the computation to perform over incoming events. In this work, a new SQL-like declarative query language and a compiler that translates it into the distributed system's native language have been developed. Because of its similarity to SQL, with which a large number of developers are already familiar, the language requires little effort to learn. Its use thus reduces execution failures in the queries deployed on the distributed system, while abstracting the programmer from the details of that system.
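A minimal sketch of what such an SQL-like continuous query and its compilation might look like (the query dialect, operator names, and target pipeline below are invented for illustration; the thesis's actual grammar and the native language of the underlying event-processing system are not reproduced here):

# Toy front end for an SQL-like continuous query over an event stream.
# Both the dialect and the "native" operator pipeline are hypothetical.
import re

QUERY = "SELECT symbol, AVG(price) FROM stock_ticks WINDOW 60 GROUP BY symbol"

def compile_query(query):
    """Translate the SQL-like text into a list of native-style operator specs."""
    m = re.match(r"SELECT (.+) FROM (\w+) WINDOW (\d+) GROUP BY (\w+)", query)
    if m is None:
        raise ValueError("unsupported query shape in this toy compiler")
    projection, stream, window_s, key = m.groups()
    # A real compiler would emit the distributed system's native language;
    # here we just build a descriptive operator pipeline.
    return [
        ("source", stream),
        ("window", {"seconds": int(window_s)}),
        ("group_by", key),
        ("project", [p.strip() for p in projection.split(",")]),
    ]

for op in compile_query(QUERY):
    print(op)

The point of the design, as described above, is that developers who already know SQL can express the computation declaratively, while the compiler hides the details of the distributed system and rejects malformed queries before they are deployed.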
Abstract:
Funding: This work was supported by a grant from the Medical Research Council MR/J015277/1. The Scottish National Islet Transplant Programme is funded by the National Services Division of NHS Scotland. KRM was funded by a Fellowship from the Wellcome Trust / Scottish Translational Medicine and Therapeutics Initiative 85664. Acknowledgments: This work was supported by a grant from the Medical Research Council MR/J015277/1. The Scottish National Islet Transplant Programme is funded by the National Services Division of NHS Scotland. KRM was funded by a Fellowship from the Wellcome Trust / Scottish Translational Medicine and Therapeutics Initiative 85664. We thank Joanna Sweetman for assistance in optimisation of the immunogold staining.
Abstract:
We have developed a technique called the generation of longer cDNA fragments from serial analysis of gene expression (SAGE) tags for gene identification (GLGI), to convert SAGE tags of 10 bases into their corresponding 3′ cDNA fragments covering hundreds of bases. A primer containing the 10-base SAGE tag is used as the sense primer, and a single-base anchored oligo(dT) primer is used as the antisense primer in PCR, together with Pfu DNA polymerase. By using this approach, a cDNA fragment extending from the SAGE tag toward the 3′ end of the corresponding sequence can be generated. Application of the GLGI technique can solve two critical issues in applying the SAGE technique: one is that a longer fragment corresponding to a SAGE tag that has no match in databases can be generated for further studies; the other is that the specific fragment corresponding to a SAGE tag can be identified from multiple sequences that match the same SAGE tag. The development of the GLGI method provides several potential applications. First, it provides a strategy for even wider application of the SAGE technique for quantitative analysis of global gene expression. Second, a combined application of SAGE/GLGI can be used to complete the catalogue of the expressed genes in human and in other eukaryotic species. Third, it can be used to identify the 3′ cDNA sequence from any exon within a gene. It can also be used to confirm the reality of exons predicted by bioinformatic tools in genomic sequences. Fourth, a combined application of SAGE/GLGI can be applied to define the 3′ boundary of expressed genes in the genomic sequences in human and in other eukaryotic genomes.
Abstract:
Hematopoietic stem cells (HSC) are unique in that they give rise both to new stem cells (self-renewal) and to all blood cell types. The cellular and molecular events responsible for the formation of HSC remain unknown, mainly because no system exists to study it. Embryonic stem (ES) cells were induced to differentiate by coculture with the stromal cell line RP010 and the combination of interleukin (IL) 3, IL-6, and F (cell-free supernatants from cultures of the FLS4.1 fetal liver stromal cell line). Cell cytometry analysis of the mononuclear cells produced in the cultures was consistent with the presence of PgP-1+ Lin- early hematopoietic (B-220- Mac-1- JORO 75- TER 119-) cells and of fewer B-220+ IgM- B-cell progenitors and JORO 75+ T-lymphocyte progenitors. The cell-sorter-purified PgP-1+ Lin- cells produced by induced ES cells could repopulate the lymphoid, myeloid, and erythroid lineages of irradiated mice. The ES-derived PgP-1+ Lin- cells must possess extensive self-renewal potential, as they were able to produce hematopoietic repopulation of secondary mouse recipients. Indeed, marrow cells from irradiated mice reconstituted (15-18 weeks before) with PgP-1+ Lin- cell-sorter-purified cells generated by induced ES cells repopulated the lymphoid, myeloid, and erythroid lineages of secondary mouse recipients assessed 16-20 weeks after their transfer into irradiated secondary mice. The results show that the culture conditions described here support differentiation of ES cells into hematopoietic cells with functional properties of HSC. It should now be possible to unravel the molecular events leading to the formation of HSC.