818 results for Violent event


Relevance:

20.00%

Publisher:

Abstract:

Impact cratering has been a fundamental geological process in Earth history with major ramifications for the biosphere. The complexity of shocked and melted rocks within impact structures presents difficulties for accurate and precise radiogenic isotope age determination, hampering the assessment of the effects of an individual event in the geological record. We demonstrate the utility of a multi-chronometer approach in our study of samples from the 40 km diameter Araguainha impact structure of central Brazil. Samples of uplifted basement granite display abundant evidence of shock deformation, but U/Pb ages of shocked zircons and the Ar-40/Ar-39 ages of feldspar from the granite largely preserve the igneous crystallization and cooling history. Mixed results are obtained from in situ Ar-40/Ar-39 spot analyses of shocked igneous biotites in the granite, with deformation along kink-bands resulting in highly localized, partial resetting in these grains. Likewise, spot analyses of perlitic glass from pseudotachylitic breccia samples reflect a combination of argon inheritance from wall rock material, the age of the glass itself, and post-impact devitrification. The timing of crater formation is better assessed using samples of impact-generated melt rock where isotopic resetting is associated with textural evidence of melting and in situ crystallization. Granular aggregates of neocrystallized zircon form a cluster of ten U-Pb ages that yield a "Concordia" age of 247.8 +/- 3.8 Ma. The possibility of Pb loss from this population suggests that this is a minimum age for the impact event. The best evidence for the age of the impact comes from the U-Th-Pb dating of neocrystallized monazite and Ar-40/Ar-39 step heating of three separate populations of post-impact, inclusion-rich quartz grains that are derived from the infill of miarolitic cavities. The Pb-206/U-238 age of 254.5 +/- 3.2 Ma (2 sigma error) and Pb-208/Th-232 age of 255.2 +/- 4.8 Ma (2 sigma error) of monazite, together with the inverse, 18 point isochron age of 254 +/- 10 Ma (MSWD = 0.52) for the inclusion-rich quartz grains yield a weighted mean age of 254.7 +/- 2.5 Ma (0.99%, 2 sigma error) for the impact event. The age of the Araguainha crater overlaps with the timing of the Permo-Triassic boundary, within error, but the calculated energy released by the Araguainha impact is insufficient to be a direct cause of the global mass extinction. However, the regional effects of the Araguainha impact event in the Parana-Karoo Basin may have been substantial. (C) 2012 Elsevier Ltd. All rights reserved.
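
As a check on the arithmetic, the weighted mean quoted above can be reproduced by a simple inverse-variance combination of the three independent age determinations. This is only a sketch: the exact weighting scheme used by the authors is not stated in the abstract, and this simple combination gives a slightly larger uncertainty (about 2.6 Ma) than the quoted 2.5 Ma.

```python
import math

# Independent age determinations for the Araguainha impact (value, 2-sigma error), in Ma:
# monazite Pb-206/U-238, monazite Pb-208/Th-232, and the Ar-40/Ar-39 quartz isochron.
ages = [(254.5, 3.2), (255.2, 4.8), (254.0, 10.0)]

# Inverse-variance weighted mean: w_i = 1 / sigma_i^2
weights = [1.0 / sigma**2 for _, sigma in ages]
mean = sum(w * a for (a, _), w in zip(ages, weights)) / sum(weights)
error = math.sqrt(1.0 / sum(weights))  # 2-sigma error of the weighted mean

print(f"weighted mean = {mean:.1f} +/- {error:.1f} Ma")  # 254.7 +/- 2.6 Ma
```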

Relevance:

20.00%

Publisher:

Abstract:

We present measurements of Underlying Event observables in pp collisions at root s = 0.9 and 7 TeV. The analysis is performed as a function of the highest charged-particle transverse momentum p(T),LT in the event. Different regions are defined with respect to the azimuthal direction of the leading (highest transverse momentum) track: Toward, Transverse and Away. The Toward and Away regions collect the fragmentation products of the hardest partonic interaction. The Transverse region is expected to be most sensitive to the Underlying Event activity. The study is performed with charged particles above three different p(T) thresholds: 0.15, 0.5 and 1.0 GeV/c. In the Transverse region we observe an increase in the multiplicity by a factor of 2-3 between the lower and higher collision energies, depending on the track p(T) threshold considered. Data are compared to PYTHIA 6.4, PYTHIA 8.1 and PHOJET. On average, all models considered underestimate the multiplicity and summed p(T) in the Transverse region by about 10-30%.
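
As an illustration of how each event is partitioned in such an analysis, the sketch below assigns charged tracks to the Toward, Transverse and Away regions according to their azimuthal separation from the leading track. The 60 and 120 degree boundaries are the conventional choice in Underlying Event studies and are assumed here rather than taken from the abstract; the track container and field names are likewise illustrative.

```python
import math

def delta_phi(phi, phi_leading):
    """Azimuthal separation folded into [0, pi]."""
    dphi = abs(phi - phi_leading) % (2.0 * math.pi)
    return dphi if dphi <= math.pi else 2.0 * math.pi - dphi

def classify_regions(tracks, pt_min=0.15):
    """Split tracks (dicts with 'pt', 'phi') into UE regions relative to the
    leading (highest-pt) track above the pt_min threshold."""
    selected = [t for t in tracks if t["pt"] > pt_min]
    if not selected:
        return None
    leading = max(selected, key=lambda t: t["pt"])
    regions = {"toward": [], "transverse": [], "away": []}
    for t in selected:
        dphi = delta_phi(t["phi"], leading["phi"])
        if dphi < math.pi / 3.0:            # |dphi| < 60 deg
            regions["toward"].append(t)
        elif dphi < 2.0 * math.pi / 3.0:    # 60 deg < |dphi| < 120 deg
            regions["transverse"].append(t)
        else:                               # |dphi| > 120 deg
            regions["away"].append(t)
    return regions

# Example observables in the Transverse region (event_tracks is hypothetical):
# regions = classify_regions(event_tracks, pt_min=0.5)
# n_transverse = len(regions["transverse"])
# sum_pt_transverse = sum(t["pt"] for t in regions["transverse"])
```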

Relevance:

20.00%

Publisher:

Abstract:

Chronic kidney disease-mineral bone disorder (CKD-MBD) is defined by abnormalities in mineral and hormone metabolism, bone histomorphometric changes, and/or the presence of soft-tissue calcification. Emerging evidence suggests that features of CKD-MBD may occur early in disease progression and are associated with changes in osteocyte function. To identify early changes in bone, we utilized the jck mouse, a genetic model of polycystic kidney disease that exhibits progressive renal disease. At 6 weeks of age, jck mice have normal renal function and no evidence of bone disease but exhibit continual decline in renal function and death by 20 weeks of age, when approximately 40% to 60% of them have vascular calcification. Temporal changes in serum parameters were identified in jck relative to wild-type mice from 6 through 18 weeks of age and were subsequently shown to largely mirror serum changes commonly associated with clinical CKD-MBD. Bone histomorphometry revealed progressive changes associated with increased osteoclast activity and elevated bone formation relative to wild-type mice. To capture the early molecular and cellular events in the progression of CKD-MBD, we examined cell-specific pathways associated with bone remodeling at the protein and/or gene expression level. Importantly, a steady increase in the number of cells expressing phospho-Ser33/37-beta-catenin was observed both in mouse and human bones. Overall repression of Wnt/beta-catenin signaling within osteocytes occurred in conjunction with increased expression of Wnt antagonists (SOST and sFRP4) and genes associated with osteoclast activity, including receptor activator of NF-kappaB ligand (RANKL). The resulting increase in the RANKL/osteoprotegerin (OPG) ratio correlated with increased osteoclast activity. In late-stage disease, an apparent repression of genes associated with osteoblast function was observed. These data confirm that jck mice develop progressive biochemical changes in CKD-MBD and suggest that repression of the Wnt/beta-catenin pathway is involved in the pathogenesis of renal osteodystrophy. (C) 2012 American Society for Bone and Mineral Research.

Relevance:

20.00%

Publisher:

Abstract:

The effect of event background fluctuations on charged particle jet reconstruction in Pb-Pb collisions at root s(NN) = 2.76 TeV has been measured with the ALICE experiment. The main sources of non-statistical fluctuations are characterized based purely on experimental data with an unbiased method, as well as by using single high p(t) particles and simulated jets embedded into real Pb-Pb events and reconstructed with the anti-k(t) jet finder. The influence of a low transverse momentum cut-off on particles used in the jet reconstruction is quantified by varying the minimum track p(t) between 0.15 GeV/c and 2 GeV/c. For embedded jets reconstructed from charged particles with p(t) > 0.15 GeV/c, the uncertainty in the reconstructed jet transverse momentum due to the heavy-ion background is measured to be 11.3 GeV/c (standard deviation) for the 10% most central Pb-Pb collisions, slightly larger than the value of 11.0 GeV/c measured using the unbiased method. For a higher particle transverse momentum threshold of 2 GeV/c, which will generate a stronger bias towards hard fragmentation in the jet finding process, the standard deviation of the fluctuations in the reconstructed jet transverse momentum is reduced to 4.8-5.0 GeV/c for the 10% most central events. A non-Gaussian tail of the momentum uncertainty is observed and its impact on the reconstructed jet spectrum is evaluated for varying particle momentum thresholds, by folding the measured fluctuations with steeply falling spectra.
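
The impact of such fluctuations on a steeply falling jet spectrum, mentioned in the last sentence, can be illustrated numerically by folding a hypothetical power-law spectrum with a Gaussian of the quoted 11.3 GeV/c width. This is only a sketch of the folding procedure with made-up spectral parameters, and a purely Gaussian smearing understates the effect of the observed non-Gaussian tail.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical steeply falling "true" jet pt spectrum, dN/dpt ~ pt^-5 above 20 GeV/c,
# sampled by inverting the cumulative distribution of the power law.
n_jets = 1_000_000
u = rng.random(n_jets)
pt_true = 20.0 * (1.0 - u) ** (-1.0 / 4.0)

# Fold with Gaussian background fluctuations of width 11.3 GeV/c (0-10% central Pb-Pb,
# tracks with pt > 0.15 GeV/c, as quoted in the abstract).
delta_pt = rng.normal(loc=0.0, scale=11.3, size=n_jets)
pt_smeared = pt_true + delta_pt

# Compare yields above a given threshold before and after folding: the smeared
# yields exceed the true ones because of feed-up from the steeply falling spectrum.
for threshold in (40.0, 60.0, 80.0):
    raw = np.count_nonzero(pt_true > threshold)
    smeared = np.count_nonzero(pt_smeared > threshold)
    print(f"pt > {threshold:>4.0f} GeV/c: true {raw:7d}, smeared {smeared:7d}")
```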

Relevance:

20.00%

Publisher:

Abstract:

We simulate top-energy Au + Au collisions using ideal hydrodynamics in order to make the first comparison to the complete set of midrapidity flow measurements made by the PHENIX Collaboration. A simultaneous calculation of v(2), v(3), v(4), and the first event-by-event calculation of quadrangular flow defined with respect to the v(2) event plane (v(4){Psi(2)}) gives good agreement with measured values, including the dependence on both transverse momentum and centrality. This provides confirmation that the collision system is indeed well described as a quark-gluon plasma with an extremely small viscosity and that correlations are dominantly generated from collective effects. In addition, we present a prediction for v(5).
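
For reference, the flow coefficients v(n) and event-plane angles Psi(n) used above follow the standard Fourier decomposition of the azimuthal particle distribution (a general definition, not specific to this paper):

```latex
\frac{dN}{d\varphi} \propto 1 + 2\sum_{n=1}^{\infty} v_n \cos\!\left[ n\left(\varphi - \Psi_n\right) \right],
\qquad
v_n = \left\langle \cos\!\left[ n\left(\varphi - \Psi_n\right) \right] \right\rangle,
\qquad
v_4\{\Psi_2\} = \left\langle \cos\!\left[ 4\left(\varphi - \Psi_2\right) \right] \right\rangle .
```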

Relevance:

20.00%

Publisher:

Abstract:

The paleoclimatic record of the Jureia Paleolagoon, coastal southeastern Brazil, includes cyclic and gradual changes with different intensities and frequencies through geological time, controlled by astronomical, geophysical, and geological phenomena. These variations are not due to one single cause, but result from the interaction of several factors, which act at different temporal and spatial scales. Here, we describe paleoenvironmental evidence regarding climatic and sea-level changes over the last 9400 cal yr BP at the Jureia Paleolagoon - one of the main groups of protected South Atlantic ecosystems. Geochemical evidence from multi-proxy analyses of a paleolagoon sediment core was used to identify anomalies. The centennial-scale anomalies were correlated with Holocene climate and transgression-regression cycles. Decadal-scale anomalous oscillations in the Quaternary paleolagoon sediments occur between 9400 and 7500 cal yr BP, correlated with long- and short-term natural events, which generated high sedimentation rates, mainly between 8385 and 8375 cal yr BP (10 cm/yr). Our results suggest that a modern-day short-duration North Atlantic climatic event, such as the 8.2 ka event, could affect the environmental equilibrium in South America and intensify the South American Summer Monsoon. (C) 2011 University of Washington. Published by Elsevier Inc. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Industrial recurrent event data, where an event of interest can be observed more than once in a single sample unit, arise in several areas, such as engineering, manufacturing and industrial reliability. This type of data provides information about the number of events, the time to their occurrence and their costs. Nelson (1995) presents a methodology to obtain asymptotic confidence intervals for the cost and the number of cumulative recurrent events. Although this is a standard procedure, it may not perform well in some situations, in particular when the available sample size is small. In this context, computer-intensive methods such as the bootstrap can be used to construct confidence intervals. In this paper, we propose a technique based on the bootstrap method to obtain interval estimates for the cost and the number of cumulative events. One advantage of the proposed methodology is its applicability in several areas and its easy computational implementation. In addition, Monte Carlo simulations indicate that it can be a better alternative than asymptotic methods for calculating confidence intervals. An example from the engineering area illustrates the methodology.
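
A minimal sketch of the bootstrap idea described above, under simplifying assumptions: all units are followed to a common end time, whole units are resampled with replacement, and percentile intervals are used. The data and names are illustrative; replacing the value 1.0 per event by the event cost gives the cumulative-cost version.

```python
import random

def mean_cumulative(units, t):
    """Mean cumulative value per unit up to time t.
    Each unit is a list of (event_time, value) pairs; use value=1.0 for event
    counts, or the event cost for cumulative cost."""
    return sum(v for unit in units for (time, v) in unit if time <= t) / len(units)

def bootstrap_ci(units, t, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap interval for the mean cumulative function at time t,
    resampling sample units (not individual events) with replacement."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        resample = [rng.choice(units) for _ in range(len(units))]
        estimates.append(mean_cumulative(resample, t))
    estimates.sort()
    lo = estimates[int((alpha / 2) * n_boot)]
    hi = estimates[int((1 - alpha / 2) * n_boot) - 1]
    return mean_cumulative(units, t), (lo, hi)

# Hypothetical data: repair times (hours) for five machines followed to 1000 h
machines = [
    [(120, 1.0), (540, 1.0)],
    [(300, 1.0)],
    [(80, 1.0), (450, 1.0), (900, 1.0)],
    [],
    [(700, 1.0)],
]
estimate, (low, high) = bootstrap_ci(machines, t=1000)
print(f"mean cumulative events at 1000 h: {estimate:.2f}  (95% CI {low:.2f}-{high:.2f})")
```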

Relevance:

20.00%

Publisher:

Abstract:

Which event study methods are best in non-U.S. multi-country samples? Nonparametric tests, especially the rank and generalized sign, are better specified and more powerful than common parametric tests, especially in multi-day windows. The generalized sign test is the best statistic but must be applied to buy-and-hold abnormal returns for correct specification. Market-adjusted and market-model methods with local market indexes, without conversion to a common currency, work well. The results are robust to limiting the samples to situations expected to be problematic for test specification or power. Applying the tests that perform best in simulation to merger announcements produces reasonable results.
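
A minimal sketch of the generalized sign test applied to buy-and-hold abnormal returns, in its usual form: the fraction of positive event-window abnormal returns is compared with the fraction expected from the estimation period. The inputs are illustrative and the paper's simulation design is not reproduced here.

```python
import math

def generalized_sign_test(estimation_ars, event_bhars):
    """Generalized sign test.
    estimation_ars: per firm, a list of estimation-period abnormal returns (used
    to estimate the expected fraction of positive returns under the null).
    event_bhars: one buy-and-hold abnormal return per firm for the event window.
    Returns the z-statistic (approximately standard normal under the null)."""
    # Expected probability of a positive abnormal return, averaged over firms
    p_hat = sum(
        sum(1 for ar in firm if ar > 0) / len(firm) for firm in estimation_ars
    ) / len(estimation_ars)
    n = len(event_bhars)
    w = sum(1 for bhar in event_bhars if bhar > 0)   # positives in the event window
    return (w - n * p_hat) / math.sqrt(n * p_hat * (1.0 - p_hat))

# Illustrative use (estimation_ars and event_bhars are hypothetical inputs):
# z = generalized_sign_test(estimation_ars, event_bhars)
# a two-sided test rejects at the 5% level when |z| > 1.96
```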

Relevance:

20.00%

Publisher:

Abstract:

The surface electrocardiogram (ECG) is an established diagnostic tool for the detection of abnormalities in the electrical activity of the heart. The interest of the ECG, however, extends beyond the diagnostic purpose. In recent years, studies in cognitive psychophysiology have related heart rate variability (HRV) to memory performance and mental workload. The aim of this thesis was to analyze the variability of surface ECG derived rhythms, at two different time scales: the discrete-event time scale, typical of beat-related features (Objective I), and the “continuous” time scale of separated sources in the ECG (Objective II), in selected scenarios relevant to psychophysiological and clinical research, respectively. Objective I) Joint time-frequency and non-linear analysis of HRV was carried out, with the goal of assessing psychophysiological workload (PPW) in response to working memory engaging tasks. Results from fourteen healthy young subjects suggest the potential use of the proposed indices in discriminating PPW levels in response to varying memory-search task difficulty. Objective II) A novel source-cancellation method based on morphology clustering was proposed for the estimation of the atrial wavefront in atrial fibrillation (AF) from body surface potential maps. Strong direct correlation between spectral concentration (SC) of atrial wavefront and temporal variability of the spectral distribution was shown in persistent AF patients, suggesting that with higher SC, shorter observation time is required to collect spectral distribution, from which the fibrillatory rate is estimated. This could be time and cost effective in clinical decision-making. The results held for reduced leads sets, suggesting that a simplified setup could also be considered, further reducing the costs. In designing the methods of this thesis, an online signal processing approach was kept, with the goal of contributing to real-world applicability. An algorithm for automatic assessment of ambulatory ECG quality, and an automatic ECG delineation algorithm were designed and validated.
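
By way of illustration, the sketch below computes a few standard time- and frequency-domain HRV indices (SDNN, RMSSD and the LF/HF ratio) from an RR-interval series. The abstract does not specify which time-frequency or non-linear indices the thesis actually adopts, so these are generic examples rather than the proposed ones.

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_indices(rr_ms):
    """Basic HRV indices from a sequence of RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = rr.std(ddof=1)                          # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))     # short-term (vagal) variability

    # Frequency domain: resample the RR series evenly at 4 Hz, then Welch PSD
    t = np.cumsum(rr) / 1000.0                     # beat times in seconds
    fs = 4.0
    t_even = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = interp1d(t, rr, kind="cubic")(t_even)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(rr_even)))

    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df  # low-frequency band power
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df  # high-frequency band power
    return {"SDNN": sdnn, "RMSSD": rmssd, "LF/HF": lf / hf}

# Example with a synthetic RR series (mean 800 ms with small random variation)
rng = np.random.default_rng(0)
rr_demo = 800 + 30 * rng.standard_normal(300)
print(hrv_indices(rr_demo))
```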

Relevance:

20.00%

Publisher:

Abstract:

The Web is constantly evolving: thanks to the 2.0 transition, the new features of HTML5 and the advent of cloud computing, the gap between Web and traditional desktop applications is tailing off. Web apps are more and more widespread and bring several benefits compared to traditional applications. On the other hand, the reference technologies, JavaScript primarily, are not keeping pace, so a paradigm shift is taking place in Web programming and many new languages and technologies are emerging. The first objective of this thesis is to survey the reference and state-of-the-art technologies for client-side Web programming, focusing in particular on concurrency and asynchronous programming. Taking into account the problems that affect existing technologies, we then design simpAL-web, an innovative approach to tackling Web-app development, based on the Agent-oriented programming abstraction and the simpAL language.

Relevance:

20.00%

Publisher:

Abstract:

The main goal of this thesis is to facilitate the development of industrial automated systems by applying formal methods to ensure system reliability. A new formulation of the distributed diagnosability problem in terms of Discrete Event Systems theory and the automata framework is presented, which is then used to enforce the desired property of the system rather than just verifying it. This approach tackles the state explosion problem with modeling patterns and new algorithms aimed at the verification of the diagnosability property in the context of the distributed diagnosability problem. The concepts are validated with a newly developed software tool.
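
The abstract gives no details of the framework, but a minimal illustration of the automata setting it refers to is sketched below: a finite automaton whose event set is split into observable and unobservable events (including a fault), together with the natural projection that maps a generated trace onto what an external diagnoser can observe. The model and event names are invented for illustration.

```python
# Minimal Discrete Event Systems sketch: a finite automaton with observable and
# unobservable events, and the natural projection of traces onto observable events.
TRANSITIONS = {           # state -> {event: next_state}
    "s0": {"start": "s1"},
    "s1": {"fault": "s2", "ok": "s3"},
    "s2": {"alarm": "s4"},
    "s3": {"done": "s0"},
    "s4": {"done": "s0"},
}
OBSERVABLE = {"start", "ok", "alarm", "done"}   # "fault" is unobservable

def run(trace, state="s0"):
    """Check that a trace is generated by the automaton and return the final state."""
    for event in trace:
        state = TRANSITIONS[state][event]
    return state

def project(trace):
    """Natural projection: erase unobservable events from a trace."""
    return [e for e in trace if e in OBSERVABLE]

faulty = ["start", "fault", "alarm", "done"]
healthy = ["start", "ok", "done"]
run(faulty); run(healthy)     # both traces are generated by the model
print(project(faulty))        # ['start', 'alarm', 'done']
print(project(healthy))       # ['start', 'ok', 'done']
# A fault is diagnosable if, after a bounded number of further events, the
# projections of faulty and non-faulty traces can always be told apart.
```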

Relevance:

20.00%

Publisher:

Abstract:

The Eye-Trauma project is part of the development of a surgical simulator for traumas of the ocular region, developed in collaboration with the Simulation Group in Boston, Harvard Medical School and Massachusetts General Hospital. The simulator features a silicone torso fitted with interchangeable modules of the ocular region, used to simulate different types of trauma. The user is asked to perform the medical suturing procedure with surgical instruments on which force and opening sensors are installed. The collected data are used within the software for gesture recognition and real-time performance monitoring. The gesture recognition algorithm, which I developed, is based on the concept of state machines; transitions between states are triggered by the events detected by the simulator.
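
A minimal sketch of the state-machine idea described above: the recognizer advances to the next state when the latest sensor sample satisfies the predicate attached to the current transition. States, thresholds and sensor field names are illustrative and not taken from the project.

```python
# Minimal state-machine gesture recognizer: each transition fires when a predicate
# on the latest sensor sample (force and instrument opening) is satisfied.
GESTURE_FSM = {
    "idle":        ("approaching", lambda s: s["force"] > 0.2),
    "approaching": ("gripping",    lambda s: s["opening"] < 0.3),
    "gripping":    ("pulling",     lambda s: s["force"] > 1.0),
    "pulling":     ("done",        lambda s: s["force"] < 0.1),
}

def recognize(samples):
    """Feed a stream of sensor samples through the FSM; return True if the
    full gesture (reaching the 'done' state) was recognized."""
    state = "idle"
    for sample in samples:
        if state == "done":
            break
        next_state, condition = GESTURE_FSM[state]
        if condition(sample):
            state = next_state
    return state == "done"

stream = [
    {"force": 0.3, "opening": 0.8},
    {"force": 0.4, "opening": 0.2},
    {"force": 1.2, "opening": 0.2},
    {"force": 0.05, "opening": 0.6},
]
print(recognize(stream))   # True
```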

Relevance:

20.00%

Publisher:

Abstract:

Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and the application of practical software tools and efficient algorithms from the field of computer science, aiming at the goal of enabling atmospheric scientists to analyse and to gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. In this thesis, three practical tools are presented. Two of these tools are software systems (Insight and IWAL) for different types of processing and interactive visualization of data, the third tool is an efficient algorithm for data segmentation implemented as part of Insight.

Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining at runtime data from different sources, a variety of different data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, from which two examples are presented: the usage of Insight as a WMS (web map service) server, and the automatic production of a sequence of images for the visualization of cyclone simulations. The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction of the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and the results of the segmentation of upper-tropospheric jet streams and cyclones as full 3D objects.

Finally, IWAL is presented, which is a web application for providing an easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, the needs to retrieve all input data sets and to install and handle complex visualization tools on a local machine are avoided. The main challenge in the provision of customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementational details, benchmarks and the results of a user survey are presented.
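
A minimal sketch of the kind of threshold-based 3D segmentation referred to above, using connected-component labelling; the actual algorithm in Insight is more elaborate (it also tracks features over time and guards against under- and over-segmentation), and the field, threshold and blob positions below are synthetic.

```python
import numpy as np
from scipy import ndimage

def segment_features(field, threshold):
    """Label connected 3D regions where the field exceeds a threshold.
    Returns the label array and the voxel count of each feature."""
    mask = field > threshold
    labels, n_features = ndimage.label(mask)          # 3D connected components
    sizes = ndimage.sum(mask, labels, index=range(1, n_features + 1))
    return labels, dict(enumerate(sizes, start=1))

# Illustrative example: a synthetic 3D wind-speed field with two strong blobs
rng = np.random.default_rng(0)
field = 10 * rng.random((40, 40, 20))
field[5:10, 5:10, 5:10] += 50        # feature 1
field[25:32, 20:28, 8:14] += 60      # feature 2
labels, sizes = segment_features(field, threshold=40.0)
print(len(sizes), "features:", sizes)
```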

Relevance:

20.00%

Publisher:

Abstract:

In this thesis, the main Executive Control theories are presented. Methods typical of Cognitive and Computational Neuroscience are introduced, and the role of behavioural tasks involving conflict resolution in the elaboration of the response, after the presentation of a stimulus to the subject, is highlighted. In particular, the Eriksen Flanker Task and its variants are discussed. Behavioural data from the scientific literature are illustrated in terms of response times and error rates. During experimental behavioural tasks, EEG is recorded simultaneously. Thanks to this, event-related potentials associated with the current task can be studied. Different theories regarding the relevant event-related potentials in this field - such as the N2, the fERN (feedback Error-Related Negativity) and the ERN (Error-Related Negativity) - are introduced. The aim of this thesis is to understand and simulate the processes underlying Executive Control, including performance improvement, error detection mechanisms, post-error adjustments and the role of selective attention, with the help of an original neural network model. The network described here has been built with the purpose of simulating the behavioural results of a four-choice Eriksen Flanker Task. Model results show that the neural network can simulate response times, error rates and event-related potentials quite well. Finally, the results are compared with behavioural data and discussed in light of the aforementioned Executive Control theories. Future perspectives for this new model are outlined.
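
As a generic illustration of the kind of behaviour such a model must capture (not a reproduction of the thesis's network), the sketch below races two noisy response accumulators driven by target and flanker input, with the flankers down-weighted by selective attention; incongruent trials reach the response threshold later and err more often, giving the classic flanker congruency effect. All parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def flanker_trial(congruent, attention=0.7, noise=0.35, threshold=15.0, dt=1.0, max_t=400):
    """Race between two response accumulators driven by target and flanker input.
    Returns (response_time_in_steps, correct)."""
    # The target always supports the correct response; the flankers support it on
    # congruent trials and the wrong response on incongruent trials.
    drive_correct = attention + (1 - attention) * (1.0 if congruent else 0.0)
    drive_wrong = (1 - attention) * (0.0 if congruent else 1.0)
    acc = np.zeros(2)                     # [correct, wrong] accumulators
    for step in range(1, max_t + 1):
        acc[0] += dt * drive_correct + noise * rng.standard_normal()
        acc[1] += dt * drive_wrong + noise * rng.standard_normal()
        acc = np.maximum(acc, 0.0)        # activations cannot go negative
        if acc.max() >= threshold:
            return step, bool(acc[0] >= acc[1])
    return max_t, bool(acc[0] >= acc[1])

for condition in (True, False):
    trials = [flanker_trial(condition) for _ in range(2000)]
    rts = [rt for rt, ok in trials if ok]
    accuracy = np.mean([ok for _, ok in trials])
    label = "congruent  " if condition else "incongruent"
    print(f"{label}: mean RT {np.mean(rts):6.1f} steps, accuracy {accuracy:.3f}")
```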