823 results for "Pipeline"
Abstract:
21 cm cosmology opens an observational window onto previously unexplored cosmological epochs such as the Epoch of Reionization (EoR), the Cosmic Dawn and the Dark Ages, using powerful radio interferometers such as the planned Square Kilometre Array (SKA). Among the many applications that can potentially improve our understanding of standard cosmology, we study the promising opportunity offered by measuring the weak gravitational lensing of 21 cm radiation. We performed this study in two different cosmological epochs, at a typical EoR redshift and subsequently at a post-EoR redshift. We show how the lensing signal can be reconstructed using a three-dimensional optimal quadratic lensing estimator in Fourier space, using a single frequency band or combining measurements from multiple frequency bands. To this purpose, we implemented a simulation pipeline capable of dealing with issues that cannot be treated analytically. Considering the current SKA plans, we studied the performance of the quadratic estimator at typical EoR redshifts for different survey strategies, comparing two thermal noise models for the SKA-Low array. The simulations take into account the beam of the telescope and the discreteness of visibility measurements. We found that an SKA-Low interferometer should obtain high-fidelity images of the underlying mass distribution in its phase 1 only if several bands are stacked together, covering a redshift range from z = 7 to z = 11.5. SKA-Low phase 2, modeled so as to improve the sensitivity of the instrument by almost an order of magnitude, should be capable of providing good-quality images even when the signal is detected within a single frequency band. Considering also the serious effect that foregrounds could have on these detections, we discuss the limits of these results and the possibility these models provide of measuring an accurate lensing power spectrum.
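A standard way to stack reconstructions from several frequency bands, as described above, is inverse-variance weighting of the per-band estimates. The sketch below is a minimal illustration of that generic idea, not the thesis' actual quadratic estimator; the function name and toy data are assumptions:

```python
import numpy as np

def combine_bands(estimates, noise_vars):
    """Combine per-band reconstructions with inverse-variance weights."""
    # Weights w_i proportional to 1/N_i, normalized to sum to 1.
    w = 1.0 / np.asarray(noise_vars, dtype=float)
    w /= w.sum()
    # Weighted sum of the per-band maps.
    return np.tensordot(w, np.asarray(estimates, dtype=float), axes=1)

# Toy usage: three noisy "bands" measuring the same underlying map.
rng = np.random.default_rng(0)
truth = rng.standard_normal(64)
noise_vars = [1.0, 0.5, 0.25]
bands = [truth + rng.standard_normal(64) * v ** 0.5 for v in noise_vars]
combined = combine_bands(bands, noise_vars)
# The combined noise variance is 1 / sum(1/N_i), i.e. always below the
# best single band -- the motivation for stacking several bands.
```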
Abstract:
Modern ESI-LC-MS/MS techniques, combined with bottom-up approaches, allow the qualitative and quantitative characterization of several thousand proteins in a single experiment. Data-independent acquisition methods such as MSE and the IMS variants HDMSE and UDMSE are particularly well suited to label-free protein quantification. Owing to their high complexity, the data acquired in this way place special demands on the analysis software, and quantitative analysis of MSE/HDMSE/UDMSE data has so far been restricted to a few commercial solutions.

In this work, a strategy and a series of new methods for cross-run quantitative analysis of label-free MSE/HDMSE/UDMSE data were developed and implemented as the software ISOQuant. The commercial software PLGS is used for the first steps of data analysis (feature detection, peptide and protein identification). The independent PLGS results of all runs of an experiment are then merged into a relational database and reworked with dedicated algorithms (retention-time alignment, feature clustering, multidimensional intensity normalization, multi-stage data filtering, protein inference, redistribution of the intensities of shared peptides, protein quantification). This post-processing significantly increases the reproducibility of the qualitative and quantitative results.

To evaluate the performance of the quantitative data analysis and compare it with other solutions, a set of precisely defined hybrid-proteome samples was developed. The samples were acquired with the MSE and UDMSE methods, analyzed with Progenesis QIP, synapter, and ISOQuant, and compared. In contrast to synapter and Progenesis QIP, ISOQuant achieved both high reproducibility of protein identification and high precision and trueness of protein quantification. In conclusion, the presented algorithms and analysis workflow enable reliable and reproducible quantitative data analysis. With the software ISOQuant, a simple and efficient tool for routine high-throughput analysis of label-free MSE/HDMSE/UDMSE data was developed. With the hybrid-proteome samples and the associated evaluation metrics, a comprehensive system for evaluating quantitative acquisition and data-analysis systems was presented.
Abstract:
This work illustrates the importance of liquefied natural gas through an analysis of the Italian National Energy Strategy and the National Energy Balance, arguing for the construction of additional regasification terminals on Italian soil. In view of a future economic recovery, this would allow the country to free itself from its almost total dependence on imports through pipelines, which rigidly and monopolistically bind the importing country to the exporting country and to the countries crossed by the pipeline itself.
Abstract:
This work studied problems related to flow assurance, focusing on two aspects: (i) a comparative evaluation of the different equations of state implemented in the multiphase simulator OLGA, to determine which yields the most conservative results; and (ii) the analysis of hydrate formation in systems containing gas and water. The first topic arises because guaranteeing flow continuity requires knowing the volumetric behavior of the fluid inside the pipelines. For such studies, flow assurance relies on cubic equations of state; in particular, the following were compared: the Soave-Redlich-Kwong equation; the Peng-Robinson equation; and the Peng-Robinson equation with the Peneloux modification. Four hydrocarbon fluids (two multiphase, one oil and one gas) with different compositions and phase conditions were analyzed. The variables considered were pressure, temperature, density and viscosity; pressure drops, a fundamental parameter in the study of fluid transport, were then evaluated, showing that the Peng-Robinson equation is the most suitable for thermodynamically characterizing the fluid during a design phase, since it gives the most conservative trend. After confirming the presence of hydrates in the multiphase fluids, the objective was to analyze how the system responded to the addition of chemical inhibitors intended to move it out of the thermodynamic region of hydrate stability. The inhibitors used were methanol and mono-ethylene glycol in aqueous solution. The analysis compared two methods: the analytical Hammerschmidt method and an iterative method using PVTsim.

The results showed that both inhibitors solve the hydrate-formation problem by shifting the stability curve outside the range of pressures and temperatures encountered in the pipeline. In estimating the quantities to inject, the Hammerschmidt method proved the more conservative, indicating larger injection rates than PVTsim, especially for methanol.
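The analytical Hammerschmidt method mentioned above is classically written as ΔT = K·W / (M·(100 − W)), where ΔT is the hydrate depression in °F, K ≈ 2335 is an empirical constant, M is the inhibitor molar mass, and W is its weight percent in the aqueous phase. A minimal sketch of this correlation and its inversion (illustrative only; the thesis' exact coefficients and workflow may differ):

```python
# Hammerschmidt correlation for hydrate-formation temperature depression:
#   dT = K * W / (M * (100 - W))      with dT in degrees F
# K ~ 2335 (empirical constant), M = inhibitor molar mass (g/mol),
# W = inhibitor concentration in the aqueous phase (wt%).
K_HAMMERSCHMIDT = 2335.0
MOLAR_MASS = {"methanol": 32.04, "MEG": 62.07}  # g/mol

def depression_F(inhibitor, wt_percent):
    """Temperature depression (degF) for a given inhibitor concentration."""
    M = MOLAR_MASS[inhibitor]
    return K_HAMMERSCHMIDT * wt_percent / (M * (100.0 - wt_percent))

def required_wt_percent(inhibitor, dT_F):
    """Invert the correlation: wt% needed for a target depression."""
    M = MOLAR_MASS[inhibitor]
    return 100.0 * dT_F * M / (K_HAMMERSCHMIDT + dT_F * M)
```

For the same target depression, methanol's lower molar mass gives a lower required weight fraction than MEG, consistent with its stronger per-mass effect.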
Abstract:
Set in a period of strong transition from on-premises systems to cloud systems, this thesis addresses several problems tied to the definition of infrastructures: how can resources be scaled on demand while recreating identical environments, monitoring them, and securing the applications' critical data? The thesis answers this question by presenting a new paradigm for conceiving infrastructure, called Infrastructure as Code. It examines the practices and methodologies most closely associated with Infrastructure as Code, including version control, configuration management, continuous integration and continuous delivery. The thesis also includes a final prototype, developed from a study of the company's software development workflow, which defines the environments in accordance with the version control and configuration management systems and applies continuous integration practices to arrive at a working deployment pipeline.
Abstract:
Retinal degenerative diseases that target photoreceptors or the adjacent retinal pigment epithelium (RPE) affect millions of people worldwide. Retinal degeneration (RD) is found in many different forms of retinal diseases including retinitis pigmentosa (RP), age-related macular degeneration (AMD), diabetic retinopathy, cataracts, and glaucoma. Effective treatment for retinal degeneration has been widely investigated. Gene-replacement therapy has been shown to improve visual function in inherited retinal disease. However, this treatment was less effective with advanced disease. Stem cell-based therapy is being pursued as a potential alternative approach in the treatment of retinal degenerative diseases. In this review, we will focus on stem cell-based therapies in the pipeline and summarize progress in treatment of retinal degenerative disease.
Abstract:
Domain-specific languages (DSLs) are increasingly used as embedded languages within general-purpose host languages. DSLs provide a compact, dedicated syntax for specifying parts of an application related to specialized domains. Unfortunately, such language extensions typically do not integrate well with the development tools of the host language. Editors, compilers and debuggers are either unaware of the extensions, or must be adapted at a non-trivial cost. We present a novel approach to embed DSLs into an existing host language by leveraging the underlying representation of the host language used by these tools. Helvetia is an extensible system that intercepts the compilation pipeline of the Smalltalk host language to seamlessly integrate language extensions. We validate our approach by case studies that demonstrate three fundamentally different ways to extend or adapt the host language syntax and semantics.
Abstract:
The effect of the swirl component of air injection on the performance of an airlift pump was examined experimentally. An airlift pump is a device that pumps a liquid or slurry using only gas injection. In this study, the liquid used was water and the injected gas was air. The effect of the air swirl was determined by measuring the water discharge from an airlift pump with an air injection nozzle in which the air flow had both axial and tangential components, and then repeating the tests with a nozzle with only axial injection. The induced water flow was measured using an orifice meter in the supply pipeline. Tests were run for air pressures ranging from 10 to 30 pounds per square inch, gauge (psig), at flow rates from 5 standard cubic feet per minute (scfm) up to the maximum values attainable at the given pressure (usually in the range from 20 to 35 scfm). The nozzle with only axial injection produced a water flow rate that was equivalent to or better than that induced by the nozzle with swirl. The swirl component of air injection was found to be detrimental to pump performance for all but the smallest air injection flow rate. Optimum efficiency was found for air injection pressures of 10 psig to 15 psig. In addition, the effect of auxiliary tangential injection of water, used to create a swirl component in the riser before air injection, on the overall capacity (i.e., flow rate) and efficiency of the pump was examined. Auxiliary tangential water injection was found to have no beneficial effect on the pump capacity or performance in the present system.
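The pump efficiency discussed above is commonly defined (after Nicklin) as the useful hydraulic power divided by the rate of isothermal expansion work of the injected air. A hedged sketch with the unit conversions implied by the psig/scfm figures; the example numbers are illustrative, not the study's data:

```python
import math

RHO_WATER = 998.0      # kg/m^3
G = 9.81               # m/s^2
P_ATM = 101325.0       # Pa

def airlift_efficiency(q_water_m3s, lift_m, q_air_scfm, p_inj_psig):
    """Airlift pump efficiency: hydraulic power delivered to the water
    over the isothermal expansion work rate of the injected air."""
    p_inj = P_ATM + p_inj_psig * 6894.76   # psig (gauge) -> absolute Pa
    q_air = q_air_scfm * 4.719e-4          # scfm -> m^3/s at standard conditions
    hydraulic = RHO_WATER * G * q_water_m3s * lift_m
    air_power = P_ATM * q_air * math.log(p_inj / P_ATM)
    return hydraulic / air_power

# Illustrative numbers (not from the study): 1 L/s of water lifted 3 m
# by 10 scfm of air injected at 15 psig.
eta = airlift_efficiency(1e-3, 3.0, 10.0, 15.0)
```

Because the numerator is linear in the induced water flow, any swirl-induced loss of discharge at fixed air flow translates directly into lower efficiency, which is how the detrimental effect of swirl shows up in such a comparison.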
Abstract:
To enhance understanding of the metabolic indicators of type 2 diabetes mellitus (T2DM) disease pathogenesis and progression, the urinary metabolomes of well characterized rhesus macaques (normal or spontaneously and naturally diabetic) were examined. High-resolution ultra-performance liquid chromatography coupled with the accurate mass determination of time-of-flight mass spectrometry was used to analyze spot urine samples from normal (n = 10) and T2DM (n = 11) male monkeys. The machine-learning algorithm random forests classified urine samples as either from normal or T2DM monkeys. The metabolites important for developing the classifier were further examined for their biological significance. Random forests models had a misclassification error of less than 5%. Metabolites were identified based on accurate masses (<10 ppm) and confirmed by tandem mass spectrometry of authentic compounds. Urinary compounds significantly increased (p < 0.05) in the T2DM when compared with the normal group included glycine betaine (9-fold), citric acid (2.8-fold), kynurenic acid (1.8-fold), glucose (68-fold), and pipecolic acid (6.5-fold). When compared with the conventional definition of T2DM, the metabolites were also useful in defining the T2DM condition, and the urinary elevations in glycine betaine and pipecolic acid (as well as proline) indicated defective re-absorption in the kidney proximal tubules by SLC6A20, a Na(+)-dependent transporter. The mRNA levels of SLC6A20 were significantly reduced in the kidneys of monkeys with T2DM. These observations were validated in the db/db mouse model of T2DM. This study provides convincing evidence of the power of metabolomics for identifying functional changes at many levels in the omics pipeline.
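The classification step described here, random forests applied to a metabolite feature table, can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the study's model, data, or parameters:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Synthetic stand-in for a metabolite feature table: 21 "monkeys"
# (10 normal, 11 T2DM) x 50 metabolite intensities.  The first five
# features carry signal (a large fold change, as for glucose above).
n_normal, n_t2dm, n_feats = 10, 11, 50
X_normal = rng.lognormal(mean=0.0, sigma=0.5, size=(n_normal, n_feats))
X_t2dm = rng.lognormal(mean=0.0, sigma=0.5, size=(n_t2dm, n_feats))
X_t2dm[:, :5] *= 8.0                      # elevated metabolites in T2DM
X = np.vstack([X_normal, X_t2dm])
y = np.array([0] * n_normal + [1] * n_t2dm)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=3)  # cross-validated accuracy

# Feature importances point back to the metabolites driving the
# classification, analogous to identifying the informative compounds.
clf.fit(X, y)
top = np.argsort(clf.feature_importances_)[::-1][:5]
```

The out-of-bag or cross-validated error of such a model plays the role of the "misclassification error of less than 5%" quoted in the abstract, and the importance ranking is what motivates the follow-up biological interpretation of individual metabolites.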
Abstract:
Current methods to characterize mesenchymal stem cells (MSCs) are limited to CD marker expression, plastic adherence and their ability to differentiate into adipogenic, osteogenic and chondrogenic precursors. It seems evident that stem cells undergoing differentiation should differ in many aspects, such as morphology and possibly also behaviour; however, such a correlation has not yet been exploited for fate prediction of MSCs. Primary human MSCs from bone marrow were expanded and pelleted to form high-density cultures and were then randomly divided into four groups to differentiate into adipogenic, osteogenic, chondrogenic and myogenic progenitor cells. The heterogeneous cell populations were expanded and tracked with phase-contrast time-lapse microscopy to record cell shape. The cells were segmented using a custom-made image-processing pipeline, and seven morphological features were extracted for each segmented cell. Statistical analysis was performed on the seven-dimensional feature vectors using a tree-like classification method. Differentiation was monitored with key marker genes and histology. Cells in differentiation media expressed the key genes for each of the three pathways (adipogenic, osteogenic and chondrogenic) after 21 days, which was also confirmed by histological staining. The time-lapse microscopy data contained new evidence that two cell-shape features, eccentricity and filopodia (= 'fingers'), are highly informative for separating myogenic differentiation from all others. However, no robust classifiers could be identified for the other cell differentiation paths. The results suggest that non-invasive automated time-lapse microscopy could potentially be used to predict the stem cell fate of hMSCs for clinical application, based on morphology at earlier time points.
The classification is challenged by cell density, proliferation and possible unknown donor-specific factors, which affect the performance of morphology-based approaches. Copyright © 2012 John Wiley & Sons, Ltd.
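One of the shape features highlighted above, eccentricity, can be computed directly from the second central moments of a segmented binary mask. A minimal NumPy sketch, assuming a standard moment-based definition rather than the thesis' exact image-processing pipeline:

```python
import numpy as np

def eccentricity(mask):
    """Eccentricity of a binary cell mask from its second central moments,
    one of the shape features used to characterize segmented cells."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([ys, xs]).astype(float)
    cov = np.cov(pts)                      # 2x2 covariance of pixel coordinates
    lam_max, lam_min = sorted(np.linalg.eigvalsh(cov), reverse=True)
    # e = 0 for a circle, -> 1 as the shape becomes increasingly elongated.
    return float(np.sqrt(1.0 - lam_min / lam_max))

# A round "cell" has eccentricity near 0; an elongated one approaches 1.
yy, xx = np.mgrid[:81, :81]
disk = (yy - 40) ** 2 + (xx - 40) ** 2 <= 20 ** 2
ellipse = ((yy - 40) / 10.0) ** 2 + ((xx - 40) / 35.0) ** 2 <= 1.0
```

A classifier fed such features can separate elongated (e.g. myogenic-like) morphologies from rounder ones, which is the intuition behind eccentricity being informative in the study.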
Abstract:
The "gold standard" for treatment of intervertebral disc herniations and degenerated discs is still spinal fusion, in line with the saying "no disc - no pain". The mechanical prostheses currently implanted have only moderate outcome success and relatively high re-operation rates. Here, we discuss some of the biological intervertebral disc replacement approaches, which can be subdivided into at least two classes according to the two different tissue types, the nucleus pulposus (NP) and the annulus fibrosus (AF). For NP replacement, hydrogels have been extensively tested in vitro and in vivo. However, these gels are usually a trade-off between cell biocompatibility and load-bearing capacity; hydrogels that fulfill both are still lacking. On the side of AF repair much less is known, and the question of the anchoring of implants has still to be addressed. New hope for cell therapy comes from developmental-biology investigations into the existence of intervertebral disc progenitor cells, which would be an ideal cell source for cell therapy. Notochordal cells (remnants of the embryonic notochord) have also recently come back into focus, since these cells have regenerative potential and can activate disc cells. Growth factor treatment and molecular therapies could be less problematic. The biological solutions for NP and AF replacement are still more fiction than fact. However, tissue engineering has only scratched the surface; more satisfying solutions are yet to be added to the biomedical pipeline.
Abstract:
With the advent of high-throughput sequencing (HTS), the emerging science of metagenomics is transforming our understanding of the relationships of microbial communities with their environments. While metagenomics aims to catalogue the genes present in a sample, metatranscriptomics, by assessing which genes are actively expressed, can provide a mechanistic understanding of community inter-relationships. To achieve these goals, several challenges need to be addressed, from sample preparation to sequence processing, statistical analysis and functional annotation. Here we use an inbred non-obese diabetic (NOD) mouse model, in which germ-free animals were colonized with a defined mixture of eight commensal bacteria, to explore methods of RNA extraction and to develop a pipeline for the generation and analysis of metatranscriptomic data. Applying the Illumina HTS platform, we sequenced 12 NOD cecal samples prepared using multiple RNA-extraction protocols. The absence of a complete set of reference genomes necessitated a peptide-based search strategy. Up to 16% of sequence reads could be matched to a known bacterial gene. Phylogenetic analysis of the mapped ORFs revealed a distribution consistent with ribosomal RNA, the majority from Bacteroides or Clostridium species. To place these HTS data within a systems context, we mapped the relative abundance of corresponding Escherichia coli homologs onto metabolic and protein-protein interaction networks. These maps identified bacterial processes with components that were well represented in the datasets. In summary, this study highlights the potential of exploiting the economy of HTS platforms for metatranscriptomics.
Abstract:
BACKGROUND: Pneumococcal meningitis is associated with high mortality (approximately 30%) and morbidity. Up to 50% of survivors are affected by neurological sequelae due to a wide spectrum of brain injury, mainly affecting the cortex and hippocampus. Despite this significant disease burden, the genetic program that regulates the host response leading to brain damage as a consequence of bacterial meningitis is largely unknown. We used an infant rat model of pneumococcal meningitis to assess gene expression profiles in cortex and hippocampus at 22 and 44 hours after infection and in controls at 22 h after mock infection with saline. To analyze the biological significance of the data generated by Affymetrix DNA microarrays, a bioinformatics pipeline was used combining (i) a literature-profiling algorithm to cluster genes based on the vocabulary of abstracts indexed in MEDLINE (NCBI) and (ii) the self-organizing map (SOM), a clustering technique based on covariance in gene expression kinetics. RESULTS: Among 598 differentially regulated genes (fold change ≥ 1.5; p ≤ 0.05), 77% were automatically assigned to one of 11 functional groups with 94% accuracy. The SOM disclosed six patterns of expression kinetics. Genes associated with growth control/neuroplasticity, signal transduction, cell death/survival, cytoskeleton, and immunity were generally upregulated. In contrast, genes related to neurotransmission and lipid metabolism were, on the whole, transiently downregulated. The majority of the genes associated with ionic homeostasis, neurotransmission, signal transduction and lipid metabolism were differentially regulated specifically in the hippocampus. Of the cell death/survival genes found to be continuously upregulated only in the hippocampus, the majority are pro-apoptotic, while those continuously upregulated only in the cortex are anti-apoptotic.
CONCLUSION: Temporal and spatial analysis of gene expression in experimental pneumococcal meningitis identified potential targets for therapy.
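The SOM clustering step can be illustrated with a minimal one-dimensional self-organizing map over toy expression kinetics. This is a generic textbook SOM, not the exact algorithm, parameters, or data used in the study:

```python
import numpy as np

def train_som(data, n_nodes=6, epochs=200, lr0=0.5, radius0=2.0, seed=0):
    """Minimal 1-D self-organizing map: cluster gene-expression kinetic
    profiles (rows of `data`) into `n_nodes` ordered patterns."""
    rng = np.random.default_rng(seed)
    nodes = rng.standard_normal((n_nodes, data.shape[1]))
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1.0 - frac)                 # decaying learning rate
        radius = radius0 * (1.0 - frac) + 0.5   # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))  # best-matching unit
            dist = np.abs(np.arange(n_nodes) - bmu)
            h = np.exp(-(dist ** 2) / (2 * radius ** 2))     # neighborhood kernel
            nodes += lr * h[:, None] * (x - nodes)
    return nodes

def assign(data, nodes):
    """Map each profile to its nearest SOM node (its kinetic pattern)."""
    return np.argmin(((data[:, None, :] - nodes[None]) ** 2).sum(-1), axis=1)

# Toy kinetics at 0, 22 and 44 h: "upregulated" vs "downregulated"
# time courses should land on different map nodes.
rng = np.random.default_rng(1)
up = np.array([0.0, 1.0, 2.0]) + rng.normal(0, 0.1, (20, 3))
down = np.array([0.0, -1.0, -2.0]) + rng.normal(0, 0.1, (20, 3))
nodes = train_som(np.vstack([up, down]))
labels = assign(np.vstack([up, down]), nodes)
```

Each node's weight vector ends up as a prototype kinetic pattern, which is the sense in which the study's SOM "disclosed six patterns of expression kinetics".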
Abstract:
The purpose of this study is to provide a procedure for including the emissions to the atmosphere resulting from the combustion of diesel fuel during dredging operations in the decision-making process of dredging equipment selection. The proposed procedure is demonstrated for typical dredging methods and data from the Illinois Waterway as performed by the U.S. Army Corps of Engineers, Rock Island District. The equipment included in this study is a 16-inch cutterhead pipeline dredge and a mechanical bucket dredge used during the 2005 dredging season on the Illinois Waterway. Considerable effort has been put forth to identify and reduce environmental impacts from dredging operations. Though the environmental impacts of dredging have been studied, no prior effort has compared the air emissions of comparable types of dredging equipment, as is done in this study. By identifying the type of dredging equipment with the lowest air emissions, when cost, site conditions, and equipment availability are comparable, adverse environmental impacts can be minimized without compromising the dredging project. A total of 48 scenarios were developed by varying the dredged-material quantity, transport distance, and production rates. This produced an "envelope" of results applicable to a broad range of site conditions. Total diesel fuel consumed was calculated using standard cost-estimating practices as defined in the U.S. Army Corps of Engineers Construction Equipment Ownership and Operating Expense Schedule (USACE, 2005). The diesel fuel usage was estimated for all equipment used to mobilize and/or operate each dredging crew for every scenario. A limited Life Cycle Assessment (LCA) was used to estimate the air emissions from the two comparable dredging operations, utilizing SimaPro LCA software.
An Environmental Impact Single Score (EISS) was the SimaPro output selected for comparison with the cost per cubic yard (CY) of dredging, potential production rates, and transport distances to identify possible decision points. The total dredging time was estimated for each dredging crew and scenario. An average hourly cost for both dredging crews was calculated based on Rock Island District 2005 dredging season records (Graham 2007/08). The results from this study confirm commonly used rules of thumb in the dredging industry, indicating that mechanical bucket dredges are better suited to long transport distances and have lower air emissions and cost per CY for smaller quantities of dredged material. In addition, the results show that a cutterhead pipeline dredge would be preferable for moderate and large volumes of dredged material when no additional booster pumps are required. Finally, the results indicate that production rates can be a significant factor when evaluating the air emissions of comparable dredging equipment.
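The fuel-to-emissions step can be illustrated with a back-of-the-envelope CO2 tally from estimated diesel consumption. The study itself used a SimaPro life-cycle assessment, which covers many impact categories beyond CO2, and the scenario numbers below are hypothetical, not the Illinois Waterway data:

```python
# Common emission factor: roughly 10.21 kg CO2 per gallon of diesel
# burned (EPA figure; an assumption here, not taken from the study).
KG_CO2_PER_GALLON_DIESEL = 10.21

def scenario_co2_kg(gallons_per_hour, hours):
    """CO2 mass (kg) for one crew/scenario from fuel burn rate and duration."""
    return gallons_per_hour * hours * KG_CO2_PER_GALLON_DIESEL

# Hypothetical comparison: a cutterhead dredge burning 120 gal/h for
# 100 h versus a mechanical bucket crew burning 60 gal/h for 180 h.
cutterhead = scenario_co2_kg(120, 100)
bucket = scenario_co2_kg(60, 180)
```

Repeating such a calculation across the 48 scenarios (varying quantity, distance, and production rate) yields the kind of "envelope" of results the study describes, with the lower-emitting crew depending on the scenario.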
Abstract:
Nitrogen and water are essential for plant growth and development. In this study, we designed experiments to produce gene expression data from poplar roots under nitrogen starvation and water deprivation. We found that a low concentration of nitrogen led first to increased root elongation, followed by lateral root proliferation, and eventually to increased root biomass. To identify genes regulating root growth and development under nitrogen starvation and water deprivation, we designed a series of data-analysis procedures through which we successfully identified biologically important genes. Differentially Expressed Gene (DEG) analysis identified the genes that are differentially expressed under nitrogen starvation or drought. Protein domain enrichment analysis identified enriched themes (within the same domains) that are highly interactive during the treatment. Gene Ontology (GO) enrichment analysis allowed us to identify biological processes changed during nitrogen starvation. Based on the above analyses, we examined the local Gene Regulatory Network (GRN) and identified a number of transcription factors; after testing, one of them, a transcription factor ranked high in the hierarchy, was shown to affect root growth under nitrogen starvation. Because analyzing gene expression data manually is tedious and time-consuming, we also automated the computational pipeline, which can now perform DEG identification and protein domain analysis in a single run. It is implemented in Perl and R scripts.
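The DEG-identification step of such a pipeline is commonly a fold-change filter combined with a per-gene statistical test. A minimal sketch of that generic approach (the thesis pipeline is implemented in Perl and R, and its exact tests and thresholds are not specified here, so the choices below are assumptions):

```python
import numpy as np
from scipy import stats

def find_degs(control, treated, fc_cutoff=2.0, p_cutoff=0.05):
    """Flag differentially expressed genes by fold change plus Welch t-test.
    `control` and `treated` are genes x replicates expression matrices."""
    fold = treated.mean(axis=1) / control.mean(axis=1)
    _, p = stats.ttest_ind(treated, control, axis=1, equal_var=False)
    up = (fold >= fc_cutoff) & (p <= p_cutoff)
    down = (fold <= 1.0 / fc_cutoff) & (p <= p_cutoff)
    return up, down

# Toy data: 100 genes, 4 replicates each; gene 0 is induced ~4x
# under the "starvation" condition.
rng = np.random.default_rng(7)
control = rng.normal(100, 5, (100, 4))
treated = rng.normal(100, 5, (100, 4))
treated[0] *= 4.0
up, down = find_degs(control, treated)
```

Wrapping such functions in a single driver script is what turns the manual analysis into the one-run automated pipeline the abstract describes.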