880 results for Large-scale Structures
Abstract:
This study presents a computational parametric analysis of DME steam reforming in a large-scale Circulating Fluidized Bed (CFB) reactor. The Computational Fluid Dynamics (CFD) model used, based on the Eulerian-Eulerian dispersed flow approach, was developed and validated in Part I of this study [1]. The effects of the reactor inlet configuration, gas residence time, inlet temperature and steam to DME ratio on the overall reactor performance and products have all been investigated. The results show that a double-sided solid feeding system gives a remarkable improvement in flow uniformity, but has a limited effect on the reactions and products. Temperature was found to play the dominant role in increasing the DME conversion and the hydrogen yield. Based on the parametric analysis, it is recommended to run the CFB reactor at an inlet temperature of around 300 °C, a steam to DME molar ratio of 5.5, a gas residence time of 4 s and a space velocity of 37,104 ml g_cat^-1 h^-1. At these conditions, the DME conversion and the hydrogen molar concentration in the product gas were both found to be around 80%.
Abstract:
Some color centers in diamond can serve as quantum bits that can be manipulated with microwave pulses and read out with a laser, even at room temperature. However, the photon collection efficiency of bulk diamond is greatly reduced by refraction at the diamond/air interface. To address this issue, we fabricated arrays of diamond nanostructures, differing in both diameter and top-end shape, with HSQ and Cr as the etching mask materials, aiming toward large-scale fabrication of single-photon sources with enhanced collection efficiency made of nitrogen-vacancy (NV) embedded diamond. With a mixture of O2 and CHF3 gas plasma, diamond pillars with diameters down to 45 nm were obtained. The evolution of the top-end shape is captured by a simple model. Tests of size-dependent single-photon properties confirmed a collection-efficiency enhancement of more than tenfold, and, as expected, a mild decrease of the decoherence time with decreasing pillar diameter was observed. These results provide useful information for future applications of nanostructured diamond as a single-photon source.
Abstract:
The spread of antibiotic resistance among bacteria responsible for nosocomial and community-acquired infections calls urgently for novel therapeutic or prophylactic targets and for innovative pathogen-specific antibacterial compounds. Major challenges are posed by opportunistic pathogens belonging to the low-GC% gram-positive bacteria. Among these, Enterococcus faecalis is a leading cause of hospital-acquired infections associated with life-threatening complications and increased hospital costs. To better understand the molecular properties of enterococci that may be required for virulence, and that may explain the emergence of these bacteria in nosocomial infections, we performed the first large-scale functional analysis of E. faecalis V583, the first vancomycin-resistant isolate from a human bloodstream infection. E. faecalis V583 belongs to the high-risk clonal complex 2 group, which comprises mostly isolates derived from hospital infections worldwide. We conducted broad-range screenings of candidate genes likely involved in host adaptation (e.g., colonization and/or virulence). For this purpose, a library was constructed of targeted insertion mutations in 177 genes encoding putative surface or stress-response factors. Individual mutants were subsequently tested for i) resistance to oxidative stress, ii) antibiotic resistance, iii) resistance to opsonophagocytosis, iv) adherence to the human colon carcinoma Caco-2 epithelial cell line and v) virulence in a surrogate insect model. Our results identified a number of factors that are involved in the interaction between enterococci and their host environments. Their predicted functions highlight the importance of cell-envelope glycopolymers in E. faecalis host adaptation. This study provides a valuable genetic database for understanding the steps leading E. faecalis to opportunistic virulence.
Abstract:
Strong convective events can produce extreme precipitation, hail, lightning or gusts, potentially inducing severe socio-economic impacts. These events have a relatively small spatial extent and, in most cases, a short lifetime. In this study, a model is developed for estimating convective extreme events based on large-scale conditions. It is shown that strong convective events can be characterized by a Weibull distribution of radar-based rainfall with a low shape and a high scale parameter value. A radius of 90 km around a station reporting a convective situation turned out to be suitable. A methodology is developed to estimate the Weibull parameters, and thus the occurrence probability of convective events, from large-scale atmospheric instability and enhanced near-surface humidity, which are usually found on a larger scale than the convective event itself. Here, the probability of occurrence of extreme convective events is estimated from the KO-index, indicating stability, and the relative humidity at 1000 hPa. Both variables are computed from ERA-Interim reanalysis. In a first version of the methodology, these two variables are applied to estimate the spatial rainfall distribution and the occurrence of a convective event. The developed method shows significant skill in estimating the occurrence of convective events as observed at synoptic stations, in lightning measurements, and in severe weather reports. In order to take frontal influences into account, a scheme for the detection of atmospheric fronts is implemented. While generally higher instability is found in the vicinity of fronts, the skill of this approach is largely unchanged. Additional improvements were achieved by a bias correction and the use of ERA-Interim precipitation. The resulting estimation method is applied to the ERA-Interim period (1979-2014) to establish a ranking of estimated convective extreme events.
Two strong estimated events that reveal a frontal influence are analysed in detail. As a second application, the method is applied to GCM-based decadal predictions in the period 1979-2014, initialized every year. Decadal predictive skill for convective event frequencies over Germany is found for the first 3-4 years after initialization.
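The Weibull characterisation described above can be sketched with SciPy's `weibull_min`. This is a minimal illustration on synthetic data: the rainfall values, the shape and scale used to generate them, and the exceedance threshold are all placeholders, not the study's radar data or parameters.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for radar-based rainfall intensities within a 90 km
# radius of a station (illustrative values only).
rng = np.random.default_rng(0)
rain = rng.weibull(0.7, size=2000) * 8.0   # low shape (~0.7), high scale (~8)

# Fit a two-parameter Weibull distribution (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(rain, floc=0)

# Occurrence probability of an extreme event, expressed as the exceedance
# probability over a high threshold (threshold value is a placeholder).
threshold = 30.0
p_exceed = stats.weibull_min.sf(threshold, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f} scale={scale:.2f} P(R>{threshold:g})={p_exceed:.4f}")
```

A low fitted shape with a high scale, as tested here, is exactly the signature the study associates with strong convective situations.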
Abstract:
Aim: Positive regional correlations between biodiversity and human population have been detected for several taxonomic groups and geographical regions. Such correlations could have important conservation implications and have mainly been attributed to ecological factors, with little testing for an artefactual explanation: more populated regions may show higher biodiversity because they are more thoroughly surveyed. We tested the hypothesis that the correlation between people and herptile diversity in Europe is influenced by survey effort.
Abstract:
The increasing integration of renewable energies into the electricity grid contributes considerably to achieving the European Union goals on energy and Greenhouse Gas (GHG) emissions reduction. However, it also brings problems for grid management. Large-scale energy storage can provide the means for better integration of renewable energy sources, for balancing supply and demand, for increasing energy security, for better management of the grid, and for converging towards a low-carbon economy. Geological formations have the potential to store large volumes of fluids with minimal impact on the environment and society. One way to achieve large-scale energy storage is to use the storage capacity of geological reservoirs. In fact, there are several viable technologies for underground energy storage, as well as several types of underground reservoirs that can be considered. The geological energy storage technologies considered in this research were: Underground Gas Storage (UGS), Hydrogen Storage (HS), Compressed Air Energy Storage (CAES), Underground Pumped Hydro Storage (UPHS) and Thermal Energy Storage (TES). For these different underground energy storage technologies, several types of geological reservoirs can be suitable, namely: depleted hydrocarbon reservoirs, aquifers, salt formations and caverns, engineered rock caverns and abandoned mines. Specific site-screening criteria are applicable to each of these reservoir types and technologies, which determine the viability of the reservoir itself, and of the technology, for any particular site. This paper presents a review of the criteria applied in the scope of the Portuguese contribution to the EU-funded project ESTMAP – Energy Storage Mapping and Planning.
Abstract:
Several decision and control tasks in cyber-physical networks can be formulated as large-scale optimization problems with coupling constraints. In these "constraint-coupled" problems, each agent is associated with a local decision variable, subject to individual constraints. This thesis explores the use of primal decomposition techniques to develop tailored distributed algorithms for this challenging set-up over graphs. We first develop a distributed scheme for convex problems over random time-varying graphs with non-uniform edge probabilities. The approach is then extended to unknown cost functions estimated online. Subsequently, we consider Mixed-Integer Linear Programs (MILPs), which are of great interest in smart grid control and cooperative robotics. We propose a distributed methodological framework to compute a feasible solution to the original MILP, with guaranteed suboptimality bounds, and extend it to general nonconvex problems. Monte Carlo simulations highlight that the approach represents a substantial improvement over the state of the art, making it a valuable solution for new toolboxes addressing large-scale MILPs. We then propose a distributed Benders decomposition algorithm for asynchronous and unreliable networks. This framework is then used as the starting point to develop distributed methodologies for a microgrid optimal control scenario. We develop an ad-hoc distributed strategy for a stochastic set-up with renewable energy sources, and show a case study with samples generated using Generative Adversarial Networks (GANs). We then introduce a software toolbox named ChoiRbot, based on the novel Robot Operating System 2, and show how it facilitates simulations and experiments in distributed multi-robot scenarios.
Finally, we consider a Pickup-and-Delivery Vehicle Routing Problem, for which we design a distributed method inspired by the approach for general MILPs, and show its efficacy through simulations and experiments in ChoiRbot with ground and aerial robots.
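The primal decomposition idea underlying the thesis can be illustrated on a toy constraint-coupled problem. This is only a minimal centralized sketch of the mechanism (a master allocation updated with subgradients returned by the agents' subproblems); the thesis' algorithms operate over graphs and handle far more general problem classes. The cost coefficients, step size, and iteration count below are arbitrary choices.

```python
import numpy as np

# Toy constraint-coupled problem:  min sum_i (x_i - c_i)^2  s.t.  sum_i x_i = R.
c = np.array([1.0, 3.0, -2.0, 4.0])
R = 10.0
n = len(c)

# Primal decomposition: the master hands allocation y_i to agent i; the
# agent's subproblem (x_i fixed to y_i) has value (y_i - c_i)^2 and returns
# the subgradient g_i = 2 (y_i - c_i) to the master.
y = np.full(n, R / n)              # feasible initial allocation
for _ in range(200):
    g = 2.0 * (y - c)              # subgradients collected from the agents
    y = y - 0.1 * g                # master subgradient step
    y += (R - y.sum()) / n         # restore the coupling constraint sum(y) = R

x_opt = c + (R - c.sum()) / n      # closed-form optimum, for comparison
print(np.allclose(y, x_opt))
```

Because each agent only ever sees its own allocation and returns a scalar subgradient, the same loop distributes naturally: agents keep their constraints and costs private, and only the resource split is negotiated.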
Abstract:
In this thesis we present the development and current status of the IFrameNet project, aimed at the construction of a large-scale lexical semantic resource for the Italian language based on Frame Semantics theories. We begin by situating our work within the wider context of Frame Semantics and of the FrameNet project, which, since 1997, has attempted to apply these theories to lexicography. We then analyse and discuss the applicability of the structure of the American resource to Italian, focusing more specifically on the domain of fear, worry, and anxiety. Finally, we propose some modifications aimed at improving this domain of the resource in terms of its coherence and its ability to accurately represent the linguistic reality, and in particular at making it applicable to Italian.
Abstract:
This paper deals with the problem of spatial data mapping. A new method based on wavelet interpolation and geostatistical prediction (kriging) is proposed. The method - wavelet analysis residual kriging (WARK) - is developed to address the problems arising for highly variable data in the presence of spatial trends. In such cases, stationary prediction models have very limited applicability. Wavelet analysis is used to model large-scale structures, while kriging of the remaining residuals focuses on small-scale peculiarities. WARK is able to model spatial patterns that feature multiscale structure. In the present work WARK is applied to rainfall data and the validation results are compared with those obtained from neural network residual kriging (NNRK). NNRK is also a residual-based method, which uses an artificial neural network to model large-scale non-linear trends. The comparison of the results demonstrates the high-quality performance of WARK in predicting hot spots and in reproducing the global statistical characteristics of the distribution and the spatial correlation structure.
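The trend/residual split at the heart of WARK can be sketched in one dimension. The hand-rolled Haar transform below (approximation kept, details zeroed, which reduces to dyadic block means) stands in for the paper's wavelet trend model, and the kriging of the residuals is omitted; the data are synthetic.

```python
import numpy as np

def haar_trend(y, levels=3):
    """Large-scale trend: keep only the Haar wavelet approximation
    coefficients (equivalent to dyadic block means) and invert the
    transform with all detail coefficients set to zero."""
    a = y.astype(float)
    for _ in range(levels):
        a = a.reshape(-1, 2).mean(axis=1)   # Haar analysis: approximation step
    for _ in range(levels):
        a = np.repeat(a, 2)                 # Haar synthesis with zero details
    return a

# Synthetic 1-D transect: smooth large-scale trend + small-scale variability.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 64)
y = 10.0 * np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.5, x.size)

t_hat = haar_trend(y, levels=3)
resid = y - t_hat     # in WARK, these residuals are then modelled by kriging
print(y.var(), resid.var())
```

The residual field has much lower variance than the raw data and is closer to stationary, which is what makes the subsequent kriging step well posed.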
Abstract:
Understanding the way in which large-scale structures, like galaxies, form remains one of the most challenging problems in cosmology today. The standard theory for the origin of these structures is that they grew by gravitational instability from small, perhaps quantum-generated, fluctuations in the density of dark matter, baryons and photons over a uniform primordial Universe. After recombination, the baryons began to fall into the pre-existing gravitational potential wells of the dark matter. In this dissertation a study is first made of the primordial recombination era, the epoch of the formation of the neutral hydrogen atoms. In addition, we analyse the evolution of the density contrast (of baryonic and dark matter) in clouds of dark matter with masses between 10^4 M⊙ and 10^10 M⊙. In particular, we take into account the several physical mechanisms that act on the baryonic component during and after the recombination era. The analysis of the formation of these primordial objects was made in the context of three background dark energy models: Quintessence, ΛCDM (Cosmological Constant plus Cold Dark Matter) and Phantom. We show that dark matter is the fundamental agent for the formation of the structures observed today, while dark energy is of great importance at the epoch of their formation.
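For reference, the linear evolution of the density contrast δ discussed above is governed, in the standard textbook form (not a result specific to this dissertation), by the growth equation

```latex
\ddot{\delta} + 2H\dot{\delta} - 4\pi G\,\bar{\rho}_m\,\delta = 0 ,
```

where H is the Hubble rate and \bar{\rho}_m the mean matter density. The background dark energy model (Quintessence, ΛCDM or Phantom) enters through H(t): a larger expansion rate strengthens the friction term and damps the growth of δ, which is why the choice of dark energy background matters at the epoch of structure formation.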
Abstract:
Due to their unique characteristics, optical sensor networks have found application in many fields, such as Civil Engineering, Geotechnical Engineering, Aeronautics, Energy and the Oil & Gas industries. Monitoring solutions based on this technology have proven particularly cost-effective and can be applied to large-scale structures, where hundreds of sensors must be deployed for long-term measurement of different mechanical and physical parameters. Sensors based on fiber Bragg gratings (FBGs) are the most commonly used solution in Structural Health Monitoring (SHM), and the measurements are performed by special instruments known as optical interrogators. Ever higher acquisition rates have become possible with recent optical interrogators, giving rise to a large volume of data whose handling, storage, management and visualization may require special software applications. This work presents two real-time software applications developed for these purposes: Interrogator Abstraction (InterAB) and Web-based System (WbS). The innovations in this work include the integration, synchronization, independence, security, real-time processing and visualization, and data persistence or storage provided by the joint operation of the developed applications. The results obtained during tests in laboratory and real-world environments demonstrated the efficiency, robustness and flexibility of these software applications for different types of sensors and optical interrogators, guaranteeing atomicity, consistency, isolation and durability of the data persisted by InterAB and presented by WbS.
Abstract:
VISTA Variables in the Via Lactea (VVV) is an ESO variability survey that is performing observations in near-infrared bands (ZYJKs and H) toward the Galactic bulge and part of the disk, with completeness limits at least 3 mag deeper than the Two Micron All Sky Survey. In the present work, we searched the VVV survey data for background galaxies near the Galactic plane using ZYJHKs photometry covering 1.636 deg^2. We identified 204 new galaxy candidates by analyzing colors and sizes and by visual inspection of the multi-band (ZYJHKs) images. The galaxy candidate colors were also compared with those predicted by star count models adopting a more realistic extinction model at the same completeness limits observed by VVV. A comparison of the galaxy candidates with the numbers expected from the Millennium simulations is also presented. Our results increase the number density of known galaxies behind the Milky Way by more than one order of magnitude. A catalog of galaxy properties, including ellipticity, Petrosian radii, and ZYJHKs magnitudes, is provided, together with comparisons of the results with other surveys of galaxies toward the Galactic plane.
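Candidate selection of this kind typically combines near-infrared colour cuts with morphology. The sketch below is a hypothetical illustration only: the field names, colour threshold, size cut, and stellarity cut are placeholders, not the selection criteria actually used in the VVV work.

```python
import numpy as np

# Hypothetical colour/size pre-selection for galaxy candidates behind the
# Galactic plane. All thresholds below are illustrative placeholders.
dtype = [("j", float), ("ks", float), ("r_half", float), ("stellarity", float)]
src = np.array(
    [(14.2, 12.1, 1.9, 0.20),   # extended and red: galaxy-like
     (13.0, 12.8, 0.4, 0.95),   # point-like and blue: star-like
     (15.1, 12.9, 2.4, 0.10)],  # extended and red: galaxy-like
    dtype=dtype,
)

j_ks = src["j"] - src["ks"]     # near-infrared colour (redder behind dust)
candidate = (j_ks > 1.5) & (src["r_half"] > 1.0) & (src["stellarity"] < 0.5)
print(candidate.tolist())       # [True, False, True]
```

In practice such automated cuts only produce a candidate list; as the abstract notes, visual inspection of the multi-band images is still needed to confirm each object.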
Abstract:
Large-scale structures can be considered an interesting and useful "laboratory" for investigating the Universe; in particular, the filaments connecting clusters and superclusters of galaxies can be a powerful tool for this purpose, since they are not yet virialised systems. The large structures of the Universe have been studied in different bands; the present work considers the emission in the radio band. In recent years both compact and diffuse radio emission have been detected, found to be associated with single objects and with clusters of galaxies respectively. The detection of these sources is important because the radiation process is synchrotron emission, which in turn is linked to the presence of a magnetic field: studying these radio sources can therefore help in investigating the magnetic field which permeates different portions of space. Furthermore, radio emission in optical filaments has been detected recently, opening new chances to further improve our understanding of structure formation. Filaments can be seen as the net which links clusters and superclusters. This work was carried out with the aim of investigating non-thermal properties in low-density regions, looking for possible filaments associated with the diffuse emission. The analysed sources are 0917+75, located at redshift z = 0.125, and the double cluster system A399-A401, at z = 0.071806 and z = 0.073664 respectively. Data were taken from VLA/JVLA observations, and reduced and calibrated with the AIPS package, following the standard procedure. Isocontour and polarisation maps were produced, allowing the main physical properties to be derived. Unfortunately, because of the low quality of the data for A399-A401, it was not possible to detect any radio halo or bridge.
Abstract:
As lightweight and slender structural elements are used more frequently in design, large-scale structures become more flexible and susceptible to excessive vibrations. To ensure the functionality of the structure, the dynamic properties of the occupied structure need to be estimated during the design phase. Traditional analysis methods model occupants simply as additional mass; however, research has shown that human occupants are better modeled as an additional degree of freedom. In the United Kingdom, active and passive crowd models have been proposed by the Joint Working Group (JWG) as a result of a series of analytical and experimental studies. The crowd models are expected to yield a more accurate estimate of the dynamic response of the occupied structure. However, experimental testing recently conducted through a graduate student project at Bucknell University indicated that the proposed passive crowd model might be inaccurate in representing the impact of the occupants on the structure. The objective of this study is to assess the validity of the crowd models proposed by the JWG by comparing the dynamic properties obtained from experimental testing data with analytical modeling results. The experimental data used in this study was collected by Firman in 2010. The analytical results were obtained by performing a time-history analysis on a finite element model of the occupied structure. The crowd models were created based on the recommendations of the JWG combined with the physical properties of the occupants during the experimental study. SAP2000 was used to create the finite element models and to run the analyses; Matlab and ME'scope were used to obtain the dynamic properties of the structure by processing the time-history analysis results from SAP2000.
The results of this study indicate that the active crowd model can quite accurately represent the impact on the structure of occupants standing with bent knees, while the passive crowd model could not properly simulate the dynamic response of the structure when occupants were standing straight or sitting on the structure. Future work related to this study involves improving the passive crowd model and evaluating the crowd models with full-scale structure models and operating data.
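The difference between the "added mass" and "added degree of freedom" views of a passive crowd can be sketched with a two-DOF eigenvalue calculation. All masses, stiffnesses, and tuning frequencies below are illustrative placeholders, not the JWG parameter values or the Bucknell test structure.

```python
import numpy as np

# Occupants as an extra degree of freedom: an SDOF structure coupled to an
# SDOF passive crowd model (illustrative numbers only).
m_s = 2000.0                                  # structure modal mass (kg)
k_s = m_s * (2.0 * np.pi * 5.0) ** 2          # tuned so the empty deck is ~5 Hz
m_p = 500.0                                   # passive crowd mass (kg)
k_p = m_p * (2.0 * np.pi * 3.0) ** 2          # crowd "body" stiffness, ~3 Hz

M = np.diag([m_s, m_p])
K = np.array([[k_s + k_p, -k_p],
              [-k_p,       k_p]])

# Undamped natural frequencies (Hz) of the coupled 2-DOF system.
w2 = np.linalg.eigvals(np.linalg.solve(M, K))
freqs = np.sort(np.sqrt(w2.real)) / (2.0 * np.pi)

# Occupants as pure added mass: one lowered frequency instead of two modes.
f_added_mass = np.sqrt(k_s / (m_s + m_p)) / (2.0 * np.pi)
print(freqs, f_added_mass)
```

The coupled model yields two modes straddling the empty-structure and crowd frequencies, whereas the added-mass model merely lowers the single structural frequency; this qualitative difference is exactly what distinguishes the two occupant representations compared in the study.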
Abstract:
The realisation of molecular assemblies featuring specific macroscopic properties is a prime example of the versatility of supramolecular organisation. Microporous materials such as zeolite L are well suited for the preparation of host-guest composites containing dyes, complexes, or clusters. This short tutorial focuses on the possibilities offered by zeolite L to study and influence Förster resonance energy transfer inside its nanochannels. The highly organised host-guest materials can in turn be structured on a larger scale to form macroscopic patterns, making it possible to create large-scale structures from small, highly organised building blocks for novel optical applications.