995 results for multi-event


Relevance: 30.00%

Abstract:

The paleoclimatic record of the Jureia Paleolagoon, coastal southeastern Brazil, includes cyclic and gradual changes of different intensities and frequencies through geological time, controlled by astronomical, geophysical, and geological phenomena. These variations are not due to a single cause but result from the interaction of several factors acting at different temporal and spatial scales. Here, we describe paleoenvironmental evidence of climatic and sea-level changes over the last 9400 cal yr BP at the Jureia Paleolagoon, one of the main groups of protected South Atlantic ecosystems. Geochemical evidence from multi-proxy analyses of a paleolagoon sediment core was used to identify anomalies. Centennial-scale anomalies were correlated with Holocene climate and transgression-regression cycles. Decadal-scale anomalous oscillations in the Quaternary paleolagoon sediments occur between 9400 and 7500 cal yr BP and are correlated with long- and short-term natural events, which generated high sedimentation rates, mainly between 8385 and 8375 cal yr BP (10 cm/yr). Our results suggest that a modern-day short-duration North Atlantic climatic event, similar to the 8.2 ka event, could affect the environmental equilibrium in South America and intensify the South American Summer Monsoon.
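
The 10 cm/yr figure quoted above is simply sediment thickness divided by elapsed time between two dated levels. The short Python sketch below illustrates that arithmetic; the core depths are hypothetical values chosen only to reproduce the cited rate over the 8385-8375 cal yr BP interval, not the study's data.

```python
# Minimal sketch: average sedimentation rate from two dated core depths.
# The ages bracket the interval cited in the abstract (8385-8375 cal yr BP);
# the depths are hypothetical, chosen only to reproduce ~10 cm/yr.

def sedimentation_rate(depth_top_cm, depth_bottom_cm, age_top, age_bottom):
    """Return the sedimentation rate in cm/yr between two dated levels."""
    thickness = depth_bottom_cm - depth_top_cm   # cm of sediment deposited
    duration = age_bottom - age_top              # years elapsed
    return thickness / duration

# Hypothetical levels: 100 cm of sediment deposited in 10 years.
rate = sedimentation_rate(depth_top_cm=500, depth_bottom_cm=600,
                          age_top=8375, age_bottom=8385)
print(f"{rate:.1f} cm/yr")   # -> 10.0 cm/yr
```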

Relevance: 30.00%

Abstract:

Natural hazards related to volcanic activity represent a potential risk factor, particularly in the vicinity of human settlements. Besides the risk related to explosive and effusive activity, the instability of volcanic edifices may develop into large, often catastrophically destructive landslides, as shown by the collapse of the northern flank of Mount St. Helens in 1980. A combined approach was applied to analyse slope failures that occurred at Stromboli volcano. The stability of the Sciara del Fuoco (SdF) slope was evaluated using high-resolution multi-temporal digital terrain models (DTMs) and limit equilibrium stability analyses. High-resolution topographic data collected with remote sensing techniques and three-dimensional slope stability analysis play a key role in understanding the instability mechanism and the related risks. Analyses of the 2002-2003 and 2007 Stromboli eruptions, starting from high-resolution data acquired through airborne remote sensing surveys, permitted the estimation of the lava volumes emplaced on the SdF slope and contributed to the investigation of the link between magma emission and slope instability. Limit equilibrium analyses were performed on the 2001 and 2007 3D models in order to simulate the slope behaviour before the 2002-2003 landslide event and after the 2007 eruption. Stability analyses were conducted to understand the mechanisms that controlled the slope deformations which occurred shortly after the onset of the 2007 eruption and involved the upper part of the slope. The limit equilibrium analyses applied to both cases yielded results consistent with observations and monitoring data. The results presented in this work clearly indicate that hazard assessment for the island of Stromboli should take into account the fact that a new magma intrusion could lead to further destabilisation of the slope, which may be more significant than the one recently observed because it would affect an already disarranged deposit and a fractured and loosened crater area. The two-pronged approach, based on the analysis of 3D multi-temporal mapping datasets and on the application of limit equilibrium (LE) methods, contributed to a better understanding of volcano flank behaviour and to preparedness for actions aimed at risk mitigation.
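
The limit equilibrium concept behind these 3D analyses can be illustrated, in a much simplified form, by the classic one-dimensional infinite-slope factor of safety. The sketch below is only a hedged illustration of the balance between resisting and driving stresses; the material parameters are assumed values, not those used for the Sciara del Fuoco models.

```python
import math

def infinite_slope_fs(c_kpa, phi_deg, gamma_kn_m3, depth_m, beta_deg, u_kpa=0.0):
    """Factor of safety for a planar infinite slope, optionally with pore pressure u:
    FS = [c' + (gamma*z*cos^2(beta) - u) * tan(phi')] / [gamma*z*sin(beta)*cos(beta)]
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    normal = gamma_kn_m3 * depth_m * math.cos(beta) ** 2 - u_kpa   # effective normal stress
    resisting = c_kpa + normal * math.tan(phi)                      # shear strength
    driving = gamma_kn_m3 * depth_m * math.sin(beta) * math.cos(beta)  # shear stress
    return resisting / driving

# Hypothetical loose volcaniclastic material on a steep slope (illustrative values only).
print(round(infinite_slope_fs(c_kpa=5.0, phi_deg=38.0, gamma_kn_m3=17.0,
                              depth_m=3.0, beta_deg=35.0), 2))   # FS just above 1
```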

Relevance: 30.00%

Abstract:

EUMETSAT (www.eumetsat.int) is the European agency for the operation of satellites that monitor climate, weather and the Earth's environment. From its operations centre in Darmstadt (Germany), it controls meteorological satellites in geostationary and polar orbits that collect data for observing the atmosphere, the oceans and the land surface, providing a continuous 24/7 service. GEMS (Generic Event Monitoring System) provides centralised monitoring for different programmes within the EUMETSAT operational environment. The software supports monitoring of different platforms and cross-monitoring of different operational sections, and is designed so that it can be extended to future missions. The current version of the GEMS MMI (Multi Media Interface), v. 3.6, uses standard Java Server Pages (JSP) and makes heavy use of Java code; it also uses ASCII files for data filtering and display. One direct consequence, for example, is that the displayed information is not updated automatically but requires reloading the page. Further input for a new version of the GEMS MMI comes from several anomalous behaviours reported during day-to-day use of the software. This thesis focuses on the definition of new requirements for a new version of the GEMS MMI (v. 4.4) by the EUMETSAT operations engineering and maintenance division. For the support activities, testing was carried out at Solenix. The new software will provide a better web application, with faster response times, automatic updating of information, and full use of the GEMS database and its filtering capabilities, together with mobile phone applications to support on-call activities. The new version of GEMS will have a new Graphical User Interface (GUI) based on modern technologies. In an operations environment such as EUMETSAT's, where the reliability of the technologies and the longevity of the chosen approach are of vital importance, not all of the currently available tools are suitable, and some need to be improved. At the same time, a modern interface, in terms of visual design, interactivity and functionality, is important for the new GEMS MMI.

Relevance: 30.00%

Abstract:

In the context of Web control architectures, the event model is without doubt the most widespread and widely adopted. Asynchrony and a high degree of user interaction are typical characteristics of Web applications, and an event-driven architecture, through its typical control cycle known as the event loop, provides a simple yet sufficiently expressive abstraction to satisfy these requirements. The growth of the Internet and of its associated technologies, together with recent advances in multi-core CPUs, has provided fertile ground for the development of ever more complex Web applications. This increase in complexity, however, has exposed some limits of the event model that are still not fully resolved. This work proposes a different approach to this class of problems, one that overcomes the limits found in the event model by adopting a different architecture, born in the field of AI but gaining popularity also in general-purpose programming: the agent model. Agent architectures adopt a control cycle similar to the event loop of the event model, but with some profound differences: the control loop. The aim of this thesis is therefore to examine the two kinds of architecture in depth, highlighting their differences and showing what it means to design and develop a Web application with different technologies based on different control cycles, pointing out the strengths and weaknesses of the two approaches.
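
As a hedged, language-agnostic illustration of the two control cycles being contrasted, the sketch below places a minimal queue-driven event loop next to an agent whose own control loop runs a deliberate-then-act cycle. It is written in Python for brevity and is not the Web technology stack discussed in the thesis; the handler names and the toy agent goal are made up.

```python
from collections import deque

# --- Event model: handlers are invoked as events are dequeued by the event loop ---
def run_event_loop(events, handlers):
    queue = deque(events)
    while queue:
        name, payload = queue.popleft()
        for handler in handlers.get(name, []):
            handler(payload)

# --- Agent model: the agent's own control loop deliberates on its goal, then acts ---
class CounterAgent:
    """Toy agent whose goal is to reach a target count."""
    def __init__(self, target):
        self.count, self.target = 0, target
    def control_loop(self):
        while self.count < self.target:   # deliberate: has the goal been reached?
            self.count += 1               # act
            print("agent step", self.count)

handlers = {"click": [lambda payload: print("clicked", payload)]}
run_event_loop([("click", 1), ("click", 2)], handlers)
CounterAgent(target=3).control_loop()
```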

Relevance: 30.00%

Abstract:

The Standard Model of particle physics is a very successful theory which describes nearly all known processes of particle physics very precisely. Nevertheless, there are several observations which cannot be explained within the existing theory. In this thesis, two analyses with high-energy electrons and positrons using data of the ATLAS detector are presented: one probing the Standard Model of particle physics and another searching for phenomena beyond the Standard Model. The production of an electron-positron pair via the Drell-Yan process leads to a very clean signature in the detector with low background contributions. This allows for a very precise measurement of the cross-section and can be used as a precision test of perturbative quantum chromodynamics (pQCD), where this process has been calculated at next-to-next-to-leading order (NNLO). The invariant mass spectrum m_ee is sensitive to parton distribution functions (PDFs), in particular to the poorly known distribution of antiquarks at large momentum fraction (Bjorken x). The measurement of the high-mass Drell-Yan cross-section in proton-proton collisions at a center-of-mass energy of sqrt(s) = 7 TeV is performed on a dataset collected with the ATLAS detector, corresponding to an integrated luminosity of 4.7 fb^-1. The differential cross-section of pp -> Z/gamma* + X -> e+e- + X is measured as a function of the invariant mass in the range 116 GeV < m_ee < 1500 GeV. The background is estimated using a data-driven method and Monte Carlo simulations. The final cross-section is corrected for detector effects and different levels of final-state radiation corrections. A comparison is made to various event generators and to predictions of pQCD calculations at NNLO. Good agreement within the uncertainties between the measured cross-sections and the Standard Model predictions is observed. Examples of observed phenomena which cannot be explained by the Standard Model are the amount of dark matter in the universe and neutrino oscillations. To explain these phenomena, several extensions of the Standard Model have been proposed, some of them leading to new processes with a high multiplicity of electrons and/or positrons in the final state. A model-independent search in multi-object final states, with objects defined as electrons and positrons, is performed to search for these phenomena. The dataset collected at a center-of-mass energy of sqrt(s) = 8 TeV, corresponding to an integrated luminosity of 20.3 fb^-1, is used. The events are separated into different categories using the object multiplicity. The data-driven background method already used for the cross-section measurement was developed further for up to five objects to obtain an estimate of the number of events including fake contributions. Within the uncertainties, the comparison between data and Standard Model predictions shows no significant deviations.
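
Since the analysis is built around the dielectron invariant mass spectrum, a small worked example of the underlying kinematics may help. The sketch below computes m_ee from two four-momenta with the standard relation m^2 = (E1+E2)^2 - |p1+p2|^2; the electron and positron momenta are made-up values, not ATLAS data.

```python
import math

def invariant_mass(p1, p2):
    """m^2 = (E1+E2)^2 - |p1+p2|^2 for four-vectors (E, px, py, pz); units in GeV."""
    E = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
    return math.sqrt(max(E * E - (px * px + py * py + pz * pz), 0.0))

# Hypothetical back-to-back electron and positron, each with 250 GeV momentum
# (electron mass neglected), giving m_ee = 500 GeV.
e_plus = (250.0, 250.0, 0.0, 0.0)
e_minus = (250.0, -250.0, 0.0, 0.0)
print(f"m_ee = {invariant_mass(e_plus, e_minus):.1f} GeV")
```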

Relevance: 30.00%

Abstract:

Recurrence of cardiovascular events and mortality remain high after acute coronary syndromes. A Swiss multicentre study, "Inflammation and acute coronary syndromes (ACS): novel strategies for prevention and clinical management", is currently underway with the support of the Swiss National Science Foundation. The study includes a clinical research sub-project whose aim is to assess the impact of the ELIPS program (multi-dimEnsionaL prevention Program after acute coronary Syndrome) on the recurrence of cardiovascular events after an ACS. The basic research sub-projects aim to investigate novel cardiovascular risk biomarkers and genetic determinants of recurrence, and to study the role of stem cells after an ACS. Another sub-project will evaluate intracoronary imaging techniques and the efficacy of different types of stents.

Relevance: 30.00%

Abstract:

In the present multi-modal study we aimed to investigate the role of visual exploration in relation to neuronal activity and performance during visuospatial processing. To this end, event-related functional magnetic resonance imaging (er-fMRI) was combined with simultaneous eye-tracking recording and transcranial magnetic stimulation (TMS). Two groups of twenty healthy subjects each performed an angle discrimination task with different levels of difficulty during er-fMRI. The number of fixations, as a measure of visual exploration effort, was chosen to predict blood oxygen level-dependent (BOLD) signal changes using the general linear model (GLM). Without TMS, a positive linear relationship between visual exploration effort and the BOLD signal was found in a bilateral fronto-parietal cortical network, indicating that these regions reflect the increased number of fixations and the higher brain activity due to higher task demands. Furthermore, the relationship found between the number of fixations and performance demonstrates the relevance of visual exploration for visuospatial task solving. In the TMS group, offline theta-burst stimulation (TBS) was applied over the right posterior parietal cortex (PPC) before the fMRI experiment started. Compared to controls, TBS led to a reduced correlation between visual exploration and BOLD signal change in regions of the fronto-parietal network of the right hemisphere, indicating a disruption of the network. In contrast, an increased correlation was found in regions of the left hemisphere, suggesting an attempt to compensate for the function of the disturbed areas. TBS led to fewer fixations and faster response times while keeping accuracy at the same level, indicating that, without TBS, subjects had explored more than was actually needed.
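
To make the modelling step concrete, here is a hedged sketch of a general linear model in which the per-trial number of fixations predicts a BOLD-like response via ordinary least squares. The data are synthetic and the single-regressor design is a simplification, not the study's actual er-fMRI pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-trial data: number of fixations and a BOLD-like response that
# increases with exploration effort (purely illustrative, not the study's data).
n_trials = 40
fixations = rng.integers(3, 15, size=n_trials).astype(float)
bold = 0.8 * fixations + rng.normal(0.0, 1.0, size=n_trials)

# General linear model: bold = beta0 + beta1 * fixations + noise
X = np.column_stack([np.ones(n_trials), fixations])
beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
print(f"intercept = {beta[0]:.2f}, slope per fixation = {beta[1]:.2f}")
```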

Relevance: 30.00%

Abstract:

The master production schedule (MPS) plays an important role in an integrated production planning system. It converts the strategic planning defined in a production plan into tactical operational execution. The MPS is also a tool for top management to exercise control over manufacturing resources, and it becomes the input to downstream planning levels such as material requirements planning (MRP) and capacity requirements planning (CRP). Hence, inappropriate decisions in MPS development may lead to an infeasible execution plan, which ultimately causes poor delivery performance. One must ensure that the proposed MPS is valid and realistic for implementation before it is released to the real manufacturing system. In practice, where the production environment is stochastic in nature, developing an MPS is no longer a simple task. Varying processing times and random events such as machine failures are just some of the underlying causes of uncertainty that can hardly be addressed at the planning stage, so that a valid and realistic MPS is difficult to achieve. The MPS creation problem becomes even more sophisticated as decision makers try to consider multiple objectives: minimizing inventory, maximizing customer satisfaction, and maximizing resource utilization. This study proposes a methodology for MPS creation that is able to deal with those obstacles. The approach takes uncertainty into account and makes trade-offs among the conflicting objectives at the same time. It incorporates fuzzy multi-objective linear programming (FMOLP) and discrete-event simulation (DES) for MPS development.
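
As a hedged illustration of the discrete-event simulation side of such a methodology (the FMOLP part is not shown), the sketch below runs a toy event-driven model of producing one MPS lot on a single machine subject to random breakdowns; all quantities and probabilities are made-up values.

```python
import heapq
import random

def simulate_mps_lot(lot_size, unit_time=1.0, failure_prob=0.1, repair_time=5.0, seed=0):
    """Toy discrete-event simulation of producing one MPS lot on a single machine.
    Events are (time, kind) pairs processed in chronological order; each finished
    unit may trigger a random breakdown that delays the next start.
    Returns the simulated completion time (all parameters are illustrative)."""
    rng = random.Random(seed)
    events = [(0.0, "start_unit")]
    done, clock = 0, 0.0
    while events:
        clock, kind = heapq.heappop(events)
        if kind == "start_unit":
            heapq.heappush(events, (clock + unit_time, "finish_unit"))
        elif kind == "finish_unit":
            done += 1
            if done == lot_size:
                break
            delay = repair_time if rng.random() < failure_prob else 0.0
            heapq.heappush(events, (clock + delay, "start_unit"))
    return clock

print(f"lot of 20 finished at t = {simulate_mps_lot(20):.1f}")
```

Repeating such a run with many random seeds is one way to check whether a proposed MPS quantity is realistically achievable under uncertainty before releasing it.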

Relevance: 30.00%

Abstract:

Aims: To assess observations with multimodality imaging of the Absorb bioresorbable everolimus-eluting vascular scaffold performed in two consecutive cohorts of patients who were serially investigated either at 6 and 24 months or at 12 and 36 months. Methods and results: In the ABSORB multicentre single-arm trial, 45 patients (cohort B1) and 56 patients (cohort B2) underwent serial invasive imaging, specifically quantitative coronary angiography (QCA), intravascular ultrasound (IVUS), radiofrequency backscattering (IVUS-VH) and optical coherence tomography (OCT). Between one and three years, late luminal loss remained unchanged (6 months: 0.19 mm, 1 year: 0.27 mm, 2 years: 0.27 mm, 3 years: 0.29 mm), and the in-segment angiographic restenosis rate for the entire cohort B (n=101) at three years was 6%. On IVUS, mean lumen, scaffold, plaque and vessel area showed enlargement up to two years. Mean lumen and scaffold area remained stable between two and three years, whereas a significant reduction in plaque behind the struts occurred, with a trend toward adaptive restrictive remodelling of the external elastic membrane (EEM). Hyperechogenicity of the vessel wall, a surrogate of the bioresorption process, decreased from 23.1% to 10.4%, with a reduction of radiofrequency backscattering for dense calcium and necrotic core. At three years, the count of strut cores detected on OCT increased significantly, probably reflecting the dismantling of the scaffold; 98% of struts were covered. In the entire cohort B (n=101), the three-year major adverse cardiac event rate was 10.0% without any scaffold thrombosis. Conclusions: The current investigation demonstrated the dynamics of vessel wall changes after implantation of a bioresorbable scaffold, resulting at three years in stable luminal dimensions, a low restenosis rate and a low major adverse cardiac event rate.

Relevance: 30.00%

Abstract:

Cost-efficient operation while satisfying performance and availability guarantees in Service Level Agreements (SLAs) is a challenge for Cloud Computing, as these are potentially conflicting objectives. We present a framework for SLA management based on multi-objective optimization. The framework features a forecasting model for determining the best virtual machine-to-host allocation given the need to minimize SLA violations, energy consumption and resource waste. A comprehensive SLA management solution is proposed that uses event processing for monitoring and enables dynamic provisioning of virtual machines onto the physical infrastructure. We validated our implementation against several standard heuristics and were able to show that our approach is significantly better.
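
One simple way to picture the allocation decision is a weighted-sum cost over candidate hosts that combines predicted SLA-violation risk, energy and wasted capacity. The sketch below is only a hedged illustration of that trade-off; the Host fields, weights and greedy selection are assumptions for the example, not the paper's forecasting model.

```python
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    cpu_free: float      # fraction of CPU still available
    power_watts: float   # marginal power draw if the VM is placed here
    sla_risk: float      # predicted probability of an SLA violation on this host

def placement_cost(host, vm_cpu, w_sla=0.6, w_energy=0.3, w_waste=0.1):
    """Weighted-sum cost of placing a VM needing `vm_cpu` CPU on `host`.
    Weights are illustrative; an infeasible host gets infinite cost."""
    if vm_cpu > host.cpu_free:
        return float("inf")
    waste = host.cpu_free - vm_cpu   # leftover capacity after placement
    return w_sla * host.sla_risk + w_energy * host.power_watts / 100.0 + w_waste * waste

hosts = [Host("h1", 0.50, 40.0, 0.05), Host("h2", 0.30, 25.0, 0.20), Host("h3", 0.10, 10.0, 0.01)]
best = min(hosts, key=lambda h: placement_cost(h, vm_cpu=0.25))
print("place VM on", best.name)
```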

Relevance: 30.00%

Abstract:

In this paper, we present the Cellular Dynamic Simulator (CDS) for simulating diffusion and chemical reactions within crowded molecular environments. CDS is based on a novel event-driven algorithm specifically designed for precise calculation of the timing of collisions, reactions and other events for each individual molecule in the environment. Generic mesh-based compartments allow the creation or importation of very simple or highly detailed cellular structures in a 3D environment. Multiple levels of compartments and static obstacles can be used to create a dense environment to mimic cellular boundaries and the intracellular space. The CDS algorithm takes into account volume exclusion and molecular crowding that may impact signaling cascades in small sub-cellular compartments such as dendritic spines. With the CDS, we can simulate simple enzyme reactions, aggregation, channel transport, and highly complicated chemical reaction networks of both freely diffusing and membrane-bound multi-protein complexes. Components of the CDS are defined generically so that the simulator can be applied to a wide range of environments in terms of scale and level of detail. Through an initialization GUI, a simple simulation environment can be created and populated within minutes, yet the tool is powerful enough to design complex 3D cellular architecture. The initialization tool allows visual confirmation of the environment construction prior to execution by the simulator. This paper describes the CDS algorithm and its design and implementation, provides an overview of the available features, and highlights the utility of those features in demonstrations.
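
The core of an event-driven scheme of this kind is computing, exactly, the time of the next event for each molecule. As a hedged illustration (not the CDS code itself), the sketch below solves for the earliest contact time of two spheres moving ballistically between events; the positions, velocities and radii are arbitrary illustrative values.

```python
import math

def collision_time(r1, v1, R1, r2, v2, R2):
    """Earliest time t >= 0 at which two spheres touch, or None if they never do.
    Positions r and velocities v are 3-tuples; motion is ballistic between events,
    as in an event-driven scheme that queues the exact collision time."""
    dr = [a - b for a, b in zip(r1, r2)]
    dv = [a - b for a, b in zip(v1, v2)]
    R = R1 + R2
    a = sum(x * x for x in dv)
    b = 2.0 * sum(x * y for x, y in zip(dr, dv))
    c = sum(x * x for x in dr) - R * R
    disc = b * b - 4.0 * a * c
    if a == 0.0 or disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)   # smaller root = first contact
    return t if t >= 0.0 else None

# Two particles approaching head-on along x (hypothetical units): contact at t = 4.5.
print(collision_time((0, 0, 0), (1, 0, 0), 0.5, (10, 0, 0), (-1, 0, 0), 0.5))
```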

Relevance: 30.00%

Abstract:

A search for supersymmetric particles in final states with zero, one, and two leptons, with and without jets identified as originating from b-quarks, in 4.7 fb^-1 of sqrt(s) = 7 TeV pp collisions produced by the Large Hadron Collider and recorded by the ATLAS detector is presented. The search uses a set of variables carrying information on the event kinematics transverse and parallel to the beam line that are sensitive to several topologies expected in supersymmetry. Mutually exclusive final states are defined, allowing a combination of all channels to increase the search sensitivity. No deviation from the Standard Model expectation is observed. Upper limits at 95% confidence level on visible cross-sections for the production of new particles are extracted. Results are interpreted in the context of the constrained minimal supersymmetric extension to the Standard Model and in supersymmetry-inspired models with diverse, high-multiplicity final states.

Relevance: 30.00%

Abstract:

The distributions of event-by-event harmonic flow coefficients v_n for n = 2-4 are measured in sqrt(s_NN) = 2.76 TeV Pb+Pb collisions using the ATLAS detector at the LHC. The measurements are performed using charged particles with transverse momentum pT > 0.5 GeV and in the pseudorapidity range |eta| < 2.5, in a dataset of approximately 7 μb^-1 recorded in 2010. The shapes of the v_n distributions are described by a two-dimensional Gaussian function for the underlying flow vector in central collisions for v_2 and over most of the measured centrality range for v_3 and v_4. Significant deviations from this function are observed for v_2 in mid-central and peripheral collisions, and a small deviation is observed for v_3 in mid-central collisions. It is shown that the commonly used multi-particle cumulants are insensitive to the deviations for v_2. The v_n distributions are also measured independently for charged particles in two transverse momentum ranges, 0.5 GeV < pT < 1 GeV and pT > 1 GeV. When these distributions are rescaled to the same mean values, the adjusted shapes are found to be nearly the same for these two pT ranges. The v_n distributions are compared with the eccentricity distributions from two models for the initial collision geometry: a Glauber model and a model that includes corrections to the initial geometry due to gluon saturation effects. Both models fail to describe the experimental data consistently over most of the measured centrality range.
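
For readers unfamiliar with multi-particle cumulants, a toy example may help. The hedged sketch below generates synthetic events with a known v2 modulation and recovers it through the standard two-particle Q-cumulant, <2> = (|Q_n|^2 - M)/(M(M-1)); the event counts, multiplicities and input v2 are made-up values, not ATLAS data.

```python
import cmath
import math
import random

def v2_two_particle_cumulant(events, n=2):
    """Estimate v_n{2} from a list of events (each a list of azimuthal angles)
    using the Q-vector expression <2> = (|Q_n|^2 - M) / (M (M - 1))."""
    corr = []
    for phis in events:
        M = len(phis)
        Q = sum(cmath.exp(1j * n * phi) for phi in phis)
        corr.append((abs(Q) ** 2 - M) / (M * (M - 1)))
    c2 = sum(corr) / len(corr)
    return math.sqrt(c2) if c2 > 0 else 0.0

# Toy events: 500 events of 200 particles drawn with a true v2 = 0.06 modulation
# (accept-reject on dN/dphi ~ 1 + 2*v2*cos(2*phi)); values are illustrative only.
rng = random.Random(1)
def sample_phi(v2=0.06):
    while True:
        phi = rng.uniform(0.0, 2.0 * math.pi)
        if rng.uniform(0.0, 1.0 + 2.0 * v2) <= 1.0 + 2.0 * v2 * math.cos(2.0 * phi):
            return phi

events = [[sample_phi() for _ in range(200)] for _ in range(500)]
print(f"v2{{2}} estimate: {v2_two_particle_cumulant(events):.3f}")   # close to the input 0.06
```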

Relevance: 30.00%

Abstract:

In order to study further the long-range correlations ("ridge") observed recently in p+Pb collisions at sqrt(s_NN) = 5.02 TeV, the second-order azimuthal anisotropy parameter of charged particles, v_2, has been measured with the cumulant method using the ATLAS detector at the LHC. In a data sample corresponding to an integrated luminosity of approximately 1 μb^-1, the parameter v_2 has been obtained using two- and four-particle cumulants over the pseudorapidity range |eta| < 2.5. The results are presented as a function of transverse momentum and of the event activity, defined in terms of the transverse energy summed over 3.1 < eta < 4.9 in the direction of the Pb beam.

Relevance: 30.00%

Abstract:

Many extensions of the Standard Model posit the existence of heavy particles with long lifetimes. In this Letter, results are presented of a search for events containing one or more such particles, which decay at a significant distance from their production point, using a final state containing charged hadrons and an associated muon. This analysis uses a data sample of proton-proton collisions at sqrt(s) = 7 TeV corresponding to an integrated luminosity of 4.4 fb^-1 collected in 2011 by the ATLAS detector operating at the Large Hadron Collider. Results are interpreted in the context of R-parity violating supersymmetric scenarios. No events in the signal region are observed, and limits are set on the production cross section for pair production of supersymmetric particles, multiplied by the square of the branching fraction for a neutralino to decay to charged hadrons and a muon, based on the scenario where both of the produced supersymmetric particles give rise to neutralinos that decay in this way. However, since the search strategy is based on triggering on and reconstructing the decay products of individual long-lived particles, irrespective of the rest of the event, these limits can easily be reinterpreted in scenarios with different numbers of long-lived particles per event. The limits are presented as a function of neutralino lifetime, and for a range of squark and neutralino masses.