907 results for Large-scale system


Relevance:

100.00%

Publisher:

Abstract:

Aim: Positive regional correlations between biodiversity and human population have been detected for several taxonomic groups and geographical regions. Such correlations could have important conservation implications and have been mainly attributed to ecological factors, with little testing for an artefactual explanation: more populated regions may show higher biodiversity because they are more thoroughly surveyed. We tested the hypothesis that the correlation between people and herptile diversity in Europe is influenced by survey effort.

Relevance:

100.00%

Publisher:

Abstract:

The increasing integration of renewable energies in the electricity grid contributes considerably to achieving the European Union goals on energy and greenhouse gas (GHG) emissions reduction. However, it also brings problems to grid management. Large-scale energy storage can provide the means for better integration of renewable energy sources, for balancing supply and demand, for increasing energy security, for better management of the grid, and for converging towards a low-carbon economy. Geological formations have the potential to store large volumes of fluids with minimal impact on the environment and society. One way to provide large-scale energy storage is to use the storage capacity of geological reservoirs. In fact, there are several viable technologies for underground energy storage, as well as several types of underground reservoirs that can be considered. The geological energy storage technologies considered in this research were: Underground Gas Storage (UGS), Hydrogen Storage (HS), Compressed Air Energy Storage (CAES), Underground Pumped Hydro Storage (UPHS) and Thermal Energy Storage (TES). For these different types of underground energy storage technologies there are several types of geological reservoirs that can be suitable, namely: depleted hydrocarbon reservoirs, aquifers, salt formations and caverns, engineered rock caverns and abandoned mines. Specific site-screening criteria are applicable to each of these reservoir types and technologies, and these determine the viability of the reservoir itself, and of the technology, for any particular site. This paper presents a review of the criteria applied in the scope of the Portuguese contribution to the EU-funded project ESTMAP – Energy Storage Mapping and Planning.

Relevance:

100.00%

Publisher:

Abstract:

The correlations between the evolution of supermassive black holes (SMBHs) and their host galaxies suggest that SMBH accretion on sub-pc scales (active galactic nuclei, AGN) is linked to the building of the galaxy over kpc scales, through so-called AGN feedback. Most of the galaxy assembly occurs in overdense large-scale structures (LSSs). AGN residing in powerful sources in LSSs, such as proto-brightest cluster galaxies (BCGs), can affect the evolution of the surrounding intra-cluster medium (ICM) and nearby galaxies. Among distant AGN, high-redshift radio galaxies (HzRGs) are found to be excellent BCG progenitor candidates. In this Thesis we analyze novel interferometric observations of the so-called "J1030" field, centered on the z = 6.3 SDSS quasar J1030+0524, carried out with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Jansky Very Large Array (JVLA). This field hosts a LSS assembling around a powerful HzRG at z = 1.7 that shows evidence of positive AGN feedback heating the surrounding ICM and promoting star formation in multiple galaxies at distances of hundreds of kpc. We report the detection of gas-rich members of the LSS, including the HzRG. We show that the LSS will evolve into a local massive cluster and that the HzRG is the proto-BCG, and we unveil signatures of the proto-BCG's interaction with the surrounding ICM, strengthening the positive AGN feedback scenario. From the JVLA observations of the "J1030" field we extracted one of the deepest extragalactic radio surveys to date (~12.5 uJy at 5 sigma). Exploiting the synergy with the deep X-ray survey (~500 ks), we investigated the relation between the X-ray and radio emission of an X-ray-selected sample, finding that the radio emission is powered by different processes (star formation and AGN), and that the AGN-driven subsample is mostly composed of radio-quiet objects that display a significant X-ray/radio correlation.

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we present the development and current status of the IFrameNet project, aimed at the construction of a large-scale lexical semantic resource for the Italian language based on Frame Semantics theories. We begin by situating our work in the wider context of Frame Semantics and of the FrameNet project, which, since 1997, has attempted to apply these theories to lexicography. We then analyse and discuss the applicability of the structure of the American resource to Italian, focusing more specifically on the domain of fear, worry, and anxiety. Finally, we propose some modifications aimed at improving this domain of the resource with respect to its coherence and its ability to accurately represent the linguistic reality, and in particular at making it applicable to Italian.

Relevance:

100.00%

Publisher:

Abstract:

Abstract This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks; typical examples are video streaming and file sharing. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file-sharing application, while the user is downloading some file, the P2P application is in parallel serving that file to other users. Such peers may have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively.

To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution increases availability and reduces the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment, and each protocol is evaluated through a set of simulations.

The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that provides an approximate view of the system or of part of it. This view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays chosen to maximize the broadcast reliability, which is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of quotas of messages reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take the memory available at each process into account by limiting the view it has to maintain of the system. Using this partial view, we propose three scalable broadcast algorithms based on a propagation overlay that tends towards the global tree overlay and adapts to some constraints of the underlying system.

At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.
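
A minimal sketch (in Python, with made-up names; the thesis does not give this code) of one way to build a broadcast tree overlay that favours reliable paths, assuming per-link delivery probabilities are available from an environment-approximation step. It maximizes the product of link reliabilities along the path from the root to each peer and deliberately ignores the quota/resource constraints that the thesis also models.

```python
import heapq

def most_reliable_tree(links, root):
    """Build a broadcast tree rooted at `root` that maximizes, for every peer,
    the product of link delivery probabilities along its path from the root.

    links: dict mapping node -> list of (neighbor, reliability in (0, 1])
    Returns: dict mapping node -> parent in the tree (root maps to None).

    Dijkstra-style max-product search; link reliabilities are assumed to come
    from an environment-approximation step.
    """
    best = {root: 1.0}          # best known path reliability to each node
    parent = {root: None}
    heap = [(-1.0, root)]       # max-heap via negated reliabilities
    while heap:
        neg_rel, node = heapq.heappop(heap)
        rel = -neg_rel
        if rel < best.get(node, 0.0):
            continue            # stale queue entry
        for neighbor, link_rel in links.get(node, []):
            cand = rel * link_rel
            if cand > best.get(neighbor, 0.0):
                best[neighbor] = cand
                parent[neighbor] = node
                heapq.heappush(heap, (-cand, neighbor))
    return parent

# Toy overlay: reliabilities label each directed link.
overlay = {
    "a": [("b", 0.9), ("c", 0.6)],
    "b": [("c", 0.8), ("d", 0.7)],
    "c": [("d", 0.95)],
    "d": [],
}
print(most_reliable_tree(overlay, "a"))  # {'a': None, 'b': 'a', 'c': 'b', 'd': 'c'}
```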

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study was to explore how transgender individuals were supported to navigate the healthcare system to achieve positive healthcare experiences. A single case study was conducted in Southern Ontario, which included ten individual interviews. Data was analyzed through thematic analysis, allowing for seven themes to emerge within macro (large-scale system), meso (local/interpersonal), and micro (individual/internal) levels of healthcare system support. Themes that emerged within the levels of system support included: 1) existing deficits with hope for change; 2) significant external supports; 3) importance of informal networking; 4) support from local area family physicians and walk-in clinics; 5) navigating the healthcare system alone; 6) personality traits for successful healthcare experiences; and 7) the development of strategies to achieve positive healthcare experiences. This study outlined factors that contributed to positive healthcare experiences for transgender individuals, showing that meso and micro level support are compensating for large-scale healthcare system deficits.

Relevance:

100.00%

Publisher:

Abstract:

In the field of operational water management, Model Predictive Control (MPC) has gained popularity owing to its versatility and flexibility. The MPC controller, which takes predictions, time delays and uncertainties into account, can be designed for multi-objective management problems and for large-scale systems. Nonetheless, a critical obstacle that needs to be overcome in MPC is the large computational burden when a large-scale system is considered or a long prediction horizon is involved. In order to solve this problem, we use an adaptive prediction accuracy (APA) approach that can reduce the computational burden by almost half. The proposed MPC scheme with APA is tested on the northern Dutch water system, which comprises Lake IJssel, Lake Marker, the River IJssel and the North Sea Canal. The simulation results show that by using the MPC-APA scheme, the computational time can be reduced to a large extent and a flood protection problem over longer prediction horizons can be solved well.
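
For orientation, the generic finite-horizon problem an MPC controller solves at every step has the form below (standard textbook notation, not the paper's exact formulation); its cost grows with the horizon length N and the state dimension, which is what the adaptive-prediction-accuracy scheme mitigates.

```latex
\min_{u_0,\dots,u_{N-1}} \; \sum_{k=0}^{N-1} \ell(x_k, u_k) + V_f(x_N)
\quad \text{s.t.} \quad
x_{k+1} = f(x_k, u_k, d_k), \qquad x_k \in \mathcal{X}, \quad u_k \in \mathcal{U},
```

where x_k are the states (e.g. water levels), u_k the controls (e.g. pump and sluice settings), d_k the predicted disturbances (e.g. inflows), ℓ the stage cost and V_f a terminal cost; only the first optimized input is applied before the problem is solved again at the next time step.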

Relevance:

100.00%

Publisher:

Abstract:

We present two methods for interpreting potential field data, applied to hydrocarbon exploration. The first uses aeromagnetic data to estimate the boundary, in the horizontal plane, between continental and oceanic crust. This method is based on the existence of magnetic geological features exclusive to the continental crust, so that estimates of the edges of these features are used as estimates of the limits of the continental crust. To this end, the aeromagnetic anomaly signal over the shelf, slope and continental rise is amplified through the downward analytic continuation operator, using two implementations: the equivalent layer principle and the Dirichlet boundary condition. The largest computational load in computing the downward-continued field lies in solving a large system of linear equations. This computational effort is minimized through windowed processing and the use of the conjugate gradient method to solve the system of equations. Since the downward continuation operation is unstable, we stabilize the solution through Tikhonov's first-order stabilizing functional. Tests on synthetic aeromagnetic data contaminated with pseudo-random Gaussian noise showed the efficiency of both implementations in enhancing the ends of the magnetic features exclusive to the continental crust, allowing the delineation of its boundary with the oceanic crust. We applied the methodology, in both implementations, to real aeromagnetic data from two regions of the Brazilian coast: Foz do Amazonas and the Jequitinhonha Basin. The second method simultaneously delineates the basement topography of a sedimentary basin and the geometry of salt structures contained in the sedimentary package. The interpretation models consist of a set of juxtaposed vertical two-dimensional prisms for the sedimentary package, and of two-dimensional prisms with polygonal vertical cross-sections for the salt structures. We stabilize the solution by incorporating geometric characteristics of the basement relief and of the salt structures compatible with the geological setting, through stabilizers of global smoothness, weighted smoothness and mass concentration along preferential directions, in addition to inequality constraints on the parameters. We applied the method to synthetic gravity data produced by 2D sources simulating intracratonic and marginal sedimentary basins, with sediment density varying with depth according to a hyperbolic law and hosting salt domes and pillows. The results showed that the method has the potential to simultaneously delineate the geometries of salt pillows and domes as well as discontinuous basement reliefs. We also applied the method to real data along two gravity profiles over the Campos and Jequitinhonha basins and obtained interpretations compatible with the geology of the area.
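
As a hedged illustration of the stabilization step described above (a generic Tikhonov-regularized least-squares form; the work's exact operators and discretization are not reproduced here), the downward-continued field or the equivalent-layer coefficients, collected in a vector m, can be estimated from the observed anomaly d by solving

```latex
\min_{\mathbf{m}} \; \lVert \mathbf{d} - \mathbf{A}\,\mathbf{m} \rVert_2^2
\;+\; \mu \, \lVert \mathbf{R}_1 \mathbf{m} \rVert_2^2 ,
```

where A is the forward operator of the chosen implementation (equivalent layer or Dirichlet boundary condition), R₁ is a first-order difference operator implementing Tikhonov's first-order stabilizer, and μ > 0 is the regularization parameter. The resulting linear system is solved iteratively with conjugate gradients over spatial windows, which avoids assembling and factoring the full large system at once.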

Relevance:

100.00%

Publisher:

Abstract:

In order to examine the long-term development of offshore macrozoobenthic soft-bottom communities of the German Bight, four representative permanent stations (MZB-SSd, -FSd, -Slt, -WB) have been sampled continuously since 1969. Inter-annual variability and possible long-term trends were analysed based on spring-time samples from 1969 until 2000. This work is part of the ecological long-term series of the AWI and is supplemented by periodic large-scale mapping of the benthos. The main factors influencing the development of the benthic communities are biological interactions, climate, food supply (eutrophication) and the disturbance regime. The most frequent disturbances are sediment relocations during strong storms or by bottom trawling, while occasional oxygen deficiencies and extremely cold winters are important disturbance events acting on a much larger scale. Benthic communities at the sampling stations show a large inter-annual variability combined with variation on a roughly decadal scale. In accordance with large-scale system shifts reported for the North Sea, benthic community transitions occurred roughly between the 1970s, 1980s and 1990s. The transitions between periods are not marked by distinct, strong changes but are rather reflected in gradual changes of the species composition and dominance structure.

Relevance:

100.00%

Publisher:

Abstract:

WiMAX has been introduced as a competitive alternative for metropolitan broadband wireless access technologies. It is connection oriented and can provide very high data rates, large service coverage, and flexible quality of service (QoS). Due to the large number of connections and the flexible QoS supported by WiMAX, uplink access in WiMAX networks is very challenging, since the medium access control (MAC) protocol must efficiently manage the bandwidth and the related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed bandwidth management scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. The reason for using MEAM for the performance evaluation is that it can efficiently model a large-scale system in which the number of stations or connections is generally very high, whereas traditional simulation and analytical approaches (e.g., Markov models) cannot perform well due to the high computational complexity. We model the bandwidth management scheme as a queueing network model (QNM) consisting of interacting multiclass queues for different service classes. Closed-form expressions for the state and blocking probability distributions are derived for these schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
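
As background on the maximum-entropy modelling step (the standard ME principle, not the paper's specific derivation): the ME state distribution of a queueing system is the one that maximizes entropy subject to normalization and the known mean constraints,

```latex
\max_{p} \; -\sum_{n} p(n)\,\ln p(n)
\quad \text{s.t.} \quad
\sum_{n} p(n) = 1, \qquad
\sum_{n} f_i(n)\, p(n) = \langle f_i \rangle, \quad i = 1,\dots,m,
```

whose solution has the product (exponential-family) form p(n) = Z^{-1} \prod_i x_i^{f_i(n)}, with the x_i fixed by the constraints. This closed form is what keeps the evaluation tractable when the number of stations or connections is very high.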

Relevance:

100.00%

Publisher:

Abstract:

Real-time mechanisms and techniques are used when a system, whether embedded or large-scale, needs to possess certain characteristics that ensure its quality of service. Real-time systems are thus defined as systems with strict temporal constraints that must exhibit high levels of reliability in order to guarantee, in every instance, the timely operation of the system. Due to the growing complexity of embedded systems, distributed architectures are frequently employed, in which each module is typically responsible for a single function. In these cases there must be a communication medium between the modules, so that they can communicate with each other and fulfil the desired functionality. Owing to its high capacity and low cost, Ethernet technology has been the subject of study with the goal of turning it into a communication medium with the quality of service characteristic of real-time systems. In response to this need, the HaRTES Switch was developed at the University of Aveiro; it is able to manage its resources dynamically so as to provide real-time guarantees to the network in which it is deployed. However, for a network architecture to be able to offer quality-of-service guarantees to its nodes, it must provide flow specification, correct traffic forwarding, resource reservation, admission control and packet scheduling. Unfortunately, although the HaRTES Switch has all of these features, it does not support standard protocols. This document therefore presents the work carried out to integrate the SRP protocol into the HaRTES Switch.

Relevance:

100.00%

Publisher:

Abstract:

Samples of cultivated Ulva clathrata were collected from a medium-scale system (MSS, 1.5 × 1.5 m tank) or from a large-scale system (LSS, 0.8 ha earthen pond). MSS samples were dried directly, while the LSS sample was washed in freshwater and pressed before drying. Crude protein content ranged from 20 to 26%, with essential amino acids accounting for 32–36% of crude protein. The main analysed monosaccharides were rhamnose (36–40%), uronic acids (27–29%), xylose (10–13%) and glucose (10–16%). Some notable variations between MSS and LSS samples were observed for total dietary fibre (26% vs 41%), saturated fatty acids (31% vs 51%), PUFAs (33% vs 13%), carotenoids (358 vs 169 mg kg⁻¹ dw) and for Ca (9 vs 19 g kg⁻¹), Fe (0.6 vs 4.2 g kg⁻¹), Cu (44 vs 14 mg kg⁻¹), Zn (93 vs 17 mg kg⁻¹) and As (2 vs 9 mg kg⁻¹). The chemical composition of U. clathrata indicates that it has good potential for use in human and animal food.

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, a tube-based Distributed Economic Predictive Control (DEPC) scheme is presented for a group of dynamically coupled linear subsystems. These subsystems are components of a large-scale system, and their control inputs are computed by optimizing a local economic objective. Each subsystem interacts with its neighbors by sending them its future reference trajectory at each sampling time, and solves a local optimization problem in parallel, based on the received future reference trajectories of the other subsystems. To ensure recursive feasibility and a performance bound, each subsystem is constrained not to deviate too much from its communicated reference trajectory. The difference between the planned trajectory and the communicated one is interpreted as a disturbance at the local level. Then, to ensure the satisfaction of both state and input constraints, these constraints are tightened by explicitly considering the effect of the local disturbances. The proposed approach averages over all possible disturbances and handles the tightened state and input constraints, while satisfying the compatibility constraints that guarantee that the actual trajectory lies within a certain bound in the neighborhood of the reference one. Each subsystem optimizes an arbitrary local economic objective function in parallel while considering a local terminal constraint to guarantee recursive feasibility. In this framework, economic performance guarantees for a tube-based distributed predictive control (DPC) scheme are developed rigorously. We show that the closed-loop nominal subsystem has a local robust average performance bound which is no worse than that of a local robust steady state. Since the robust algorithm is applied to the states of the real (disturbed) subsystems, this bound can be interpreted as an average performance result for the real closed-loop system. To this end, we present our results on local and global performance, illustrated by a numerical example.
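
The constraint tightening mentioned above can be sketched generically as follows (standard tube-MPC notation; the sets and feedback gain are placeholders, not the thesis's exact construction). If e_k = x_k − x̄_k denotes the deviation of the real state from the nominal plan and Z_i is a robust positively invariant set for the local error dynamics under an ancillary feedback K_i, then requiring the nominal trajectory to satisfy

```latex
\bar{x}_k \in \mathcal{X}_i \ominus \mathcal{Z}_i, \qquad
\bar{u}_k \in \mathcal{U}_i \ominus K_i \mathcal{Z}_i ,
```

where ⊖ is the Pontryagin set difference, guarantees that the real, disturbed trajectory x_k = x̄_k + e_k with e_k ∈ Z_i still satisfies the original constraints x_k ∈ X_i and u_k ∈ U_i.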

Relevance:

100.00%

Publisher:

Abstract:

A methodology is presented for the development of a combined seasonal weather and crop productivity forecasting system. The first stage of the methodology is the determination of the spatial scale(s) on which the system could operate; this determination has been made for the case of groundnut production in India. Rainfall is a dominant climatic determinant of groundnut yield in India. The relationship between yield and rainfall has been explored using data from 1966 to 1995. On the all-India scale, seasonal rainfall explains 52% of the variance in yield. On the subdivisional scale, correlations vary between an explained variance of r² = 0.62 (significance level p < 10⁻⁴) and a negative correlation with r² = 0.1 (p = 0.13). The spatial structure of the relationship between rainfall and groundnut yield has been explored using empirical orthogonal function (EOF) analysis. A coherent, large-scale pattern emerges for both rainfall and yield. On the subdivisional scale (~300 km), the first principal component (PC) of rainfall is well correlated with the first PC of yield (r² = 0.53, p < 10⁻⁴), demonstrating that the large-scale patterns picked out by the EOFs are related. The physical significance of this result is demonstrated. Use of larger averaging areas for the EOF analysis resulted in lower and (over time) less robust correlations. Because of this loss of detail when using larger spatial scales, the subdivisional scale is suggested as an upper limit on the spatial scale for the proposed forecasting system. Further, district-level EOFs of the yield data demonstrate the validity of upscaling these data to the subdivisional scale: similar patterns are produced using data on both of these scales, and the first PCs are very highly correlated (r² = 0.96). Hence, a working spatial scale has been identified, typical of that used in seasonal weather forecasting, that can form the basis of crop modeling work for the case of groundnut production in India. Last, the change in the correlation between yield and seasonal rainfall during the study period has been examined using seasonal totals and monthly EOFs. A further link between yield and subseasonal variability is demonstrated via analysis of dynamical data.
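
A minimal sketch of the EOF/PC computation the abstract relies on, using a singular value decomposition of the anomaly matrix (a generic recipe with made-up variable names and random placeholder data, not the study's code or data):

```python
import numpy as np

def eof_analysis(field, n_modes=1):
    """EOF analysis of a (time x location) anomaly matrix via SVD.

    Returns the leading spatial patterns (EOFs), their principal-component
    time series (PCs), and the fraction of variance each mode explains.
    """
    anomalies = field - field.mean(axis=0)           # remove the time mean
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    eofs = vt[:n_modes]                              # spatial patterns
    pcs = u[:, :n_modes] * s[:n_modes]               # time series of each mode
    var_frac = s[:n_modes] ** 2 / np.sum(s ** 2)
    return eofs, pcs, var_frac

# Hypothetical example: 30 seasons x 20 subdivisions of rainfall and yield anomalies.
rng = np.random.default_rng(0)
rain = rng.normal(size=(30, 20))
crop_yield = rng.normal(size=(30, 20))

_, pc_rain, _ = eof_analysis(rain)
_, pc_yield, _ = eof_analysis(crop_yield)
r = np.corrcoef(pc_rain[:, 0], pc_yield[:, 0])[0, 1]
print(f"r^2 between first PCs: {r**2:.2f}")  # compare with the r^2 = 0.53 reported for real data
```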

Relevance:

100.00%

Publisher:

Abstract:

The predictability of high-impact weather events on multiple time scales is a crucial issue in both scientific and socio-economic terms. In this study, a statistical-dynamical downscaling (SDD) approach is applied to an ensemble of decadal hindcasts obtained with the Max Planck Institute Earth System Model (MPI-ESM) to estimate the decadal predictability of peak wind speeds (as a proxy for gusts) over Europe. Yearly initialized decadal ensemble simulations with ten members are investigated for the period 1979–2005. The SDD approach is trained with COSMO-CLM regional climate model simulations and ERA-Interim reanalysis data and applied to the MPI-ESM hindcasts. The simulations for the period 1990–1993, which was characterized by several windstorm clusters, are analyzed in detail. The anomalies of the 95% peak wind quantile of the MPI-ESM hindcasts are in line with the positive anomalies in the reanalysis data for this period. To evaluate both the skill of the decadal predictability system and the added value of the downscaling approach, quantile verification skill scores are calculated for both the MPI-ESM large-scale wind speeds and the SDD-simulated regional peak winds. Skill scores are predominantly positive for the decadal predictability system, with the highest values for short lead times and for (peak) wind speeds at or above the 75% quantile. This provides evidence that the analyzed hindcasts and the downscaling technique are suitable for estimating wind and peak wind speeds over Central Europe on decadal time scales. The skill scores for SDD-simulated peak winds are slightly lower than those for the large-scale wind speeds. This behavior can largely be attributed to the fact that peak winds are a proxy for gusts and thus have a higher variability than wind speeds. The introduced cost-efficient downscaling technique has the advantage of estimating not only wind speeds but also peak winds (a proxy for gusts), and can easily be applied to large ensemble datasets such as operational decadal prediction systems.
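
One common way to compute a quantile verification skill score of the kind mentioned above is via the pinball (quantile) loss relative to a climatological reference; the sketch below is generic and may differ in detail from the scoring used in the study, and all values are hypothetical.

```python
import numpy as np

def pinball_loss(obs, forecast, tau):
    """Mean pinball (quantile) loss of a forecast for quantile level tau."""
    diff = obs - forecast
    return np.mean(np.where(diff >= 0, tau * diff, (tau - 1) * diff))

def quantile_skill_score(obs, forecast, reference, tau):
    """Skill relative to a reference (e.g. climatology): 1 is perfect,
    0 means no improvement over the reference, negative means worse."""
    return 1.0 - pinball_loss(obs, forecast, tau) / pinball_loss(obs, reference, tau)

# Hypothetical example: observed peak winds (m/s), a downscaled forecast of the
# 95 % quantile, and a climatological 95 % quantile as the reference.
obs = np.array([22.0, 31.5, 27.3, 35.8, 24.1])
forecast_q95 = np.array([24.0, 30.0, 28.5, 34.0, 25.0])
climatology_q95 = np.full_like(obs, 29.0)
print(quantile_skill_score(obs, forecast_q95, climatology_q95, tau=0.95))
```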