110 results for Particle-antiparticle correlation


Relevance:

20.00%

Publisher:

Abstract:

We present a theoretical study of the recently observed dynamical regimes of paramagnetic colloidal particles driven externally above a regular lattice of magnetic bubbles [P. Tierno, T. H. Johansen, and T. M. Fischer, Phys. Rev. Lett. 99, 038303 (2007)]. An external precessing magnetic field alters the potential generated by the surface of the film in such a way as to drive the particle either circularly around one bubble, ballistically through the array, or in triangular orbits in the interstitial regions between the bubbles. In the ballistic regime, we observe different trajectories performed by the particles phase-locked with the external driving. Superdiffusive motion, which was experimentally found to bridge the localized and delocalized dynamics, emerges only when a certain degree of randomness is introduced into the bubble size distribution.
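
A minimal, generic sketch of the kind of dynamics described above: an overdamped Brownian particle in a one-dimensional periodic potential whose minima are translated by an external drive. This is not the authors' model of the two-dimensional magnetic bubble lattice; all parameter values below are illustrative.

```python
import numpy as np

# Overdamped Langevin dynamics in a travelling periodic potential,
# loosely mimicking field-driven transport over a periodic magnetic
# landscape. Phase-locked (ballistic) transport shows up as a mean
# velocity close to the drive velocity a*omega/(2*pi).
rng = np.random.default_rng(0)

a = 1.0                     # lattice period
U0 = 5.0                    # potential amplitude
omega = 2 * np.pi * 0.5     # driving frequency
gamma = 1.0                 # friction coefficient
kT = 0.05                   # thermal energy
dt = 1e-3
n_steps = 200_000
v_drive = a * omega / (2 * np.pi)

def force(x, t):
    # F = -dU/dx with U(x, t) = U0 * cos(2*pi*(x - v_drive*t)/a)
    return (2 * np.pi * U0 / a) * np.sin(2 * np.pi * (x - v_drive * t) / a)

x = 0.0
x0 = x
for i in range(n_steps):
    noise = np.sqrt(2 * kT * dt / gamma) * rng.standard_normal()
    x += force(x, i * dt) * dt / gamma + noise

print("mean velocity:", (x - x0) / (n_steps * dt), "drive velocity:", v_drive)
```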

Relevance:

20.00%

Publisher:

Abstract:

The performance of different correlation functionals has been tested for alkali metals, Li to Cs, interacting with cluster models simulating different active sites of the Si(111) surface. In all cases, the ab initio Hartree-Fock density has been obtained and used as a starting point. The electronic correlation energy is then introduced as an a posteriori correction to the Hartree-Fock energy using different correlation functionals. By making use of the ionic nature of the interaction and of different dissociation limits, we have been able to show that all of the functionals tested introduce the right correlation energy, although to differing extents. Hence, correlation functionals appear to be an effective and easy way to introduce electronic correlation into the ab initio Hartree-Fock description of the chemisorption bond in complex systems where conventional configuration interaction techniques cannot be used. However, the calculated energies may differ by a few tenths of an eV. Therefore, these methods can be used to get a qualitative idea of how important correlation effects are, but they have limitations if accurate binding energies are to be obtained.
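
In equation form, the a posteriori scheme described above amounts to adding a density-functional correlation energy, evaluated on the frozen Hartree-Fock density, to the Hartree-Fock total energy. The sketch below uses a generic gradient-dependent correlation energy density ε_c as a stand-in for the particular functionals tested.

```latex
% A posteriori correlation correction to the Hartree-Fock energy:
% the correlation functional is evaluated on the (unrelaxed) HF density.
E_{\mathrm{tot}} \approx E_{\mathrm{HF}} + E_{c}[\rho_{\mathrm{HF}}],
\qquad
E_{c}[\rho_{\mathrm{HF}}] =
\int \rho_{\mathrm{HF}}(\mathbf{r})\,
\varepsilon_{c}\bigl(\rho_{\mathrm{HF}}(\mathbf{r}),\nabla\rho_{\mathrm{HF}}(\mathbf{r})\bigr)\,
\mathrm{d}\mathbf{r}
```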

Relevance:

20.00%

Publisher:

Abstract:

The correlation between the species composition of pasture communities and soil properties in the Plana de Vic has been studied using two multivariate methods: Correspondence Analysis (CA) for the vegetation data and Principal Component Analysis (PCA) for the soil data. To analyse the pastures, we took 144 vegetation relevés (comprising 201 species) that had previously been classified into 10 phytocoenological communities. Most of these communities are composed almost entirely of perennials, ranging from xerophilous, clearly Mediterranean, to mesophilous, related to central European pastures, but a few occurring on shallow soils are dominated by therophytes. As for the soil properties, we analysed texture, pH, depth, bulk density, organic matter, C/N ratio and carbonate content for 25 samples corresponding to representative relevés of the communities studied.
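
As an illustration of the soil-data side of the analysis, the sketch below runs a PCA via singular value decomposition on a standardized samples-by-variables matrix. The data matrix is a random placeholder for the 25 samples and 7 soil properties listed in the abstract; it is not the study's data set.

```python
import numpy as np

# Minimal PCA sketch via SVD. The placeholder matrix stands in for
# 25 samples x 7 soil variables (texture, pH, depth, bulk density,
# organic matter, C/N ratio, carbonate content).
rng = np.random.default_rng(1)
X = rng.normal(size=(25, 7))             # placeholder soil-property matrix

# Standardize: PCA on the correlation matrix is the usual choice when the
# variables are measured in different units (pH, %, g cm^-3, ...).
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# SVD of the standardized matrix gives the principal components directly.
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U * s                           # sample coordinates (PC scores)
loadings = Vt.T                          # variable loadings per component
explained = s**2 / np.sum(s**2)          # proportion of variance per PC

print("variance explained by PC1, PC2:", explained[:2])
```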

Relevance:

20.00%

Publisher:

Abstract:

The number of existing protein sequences spans a very small fraction of sequence space. Natural proteins have overcome a strong negative selective pressure to avoid the formation of insoluble aggregates. Stably folded globular proteins and intrinsically disordered proteins (IDP) use alternative solutions to the aggregation problem. While in globular proteins folding minimizes the access to aggregation prone regions IDPs on average display large exposed contact areas. Here, we introduce the concept of average meta-structure correlation map to analyze sequence space. Using this novel conceptual view we show that representative ensembles of folded and ID proteins show distinct characteristics and responds differently to sequence randomization. By studying the way evolutionary constraints act on IDPs to disable a negative function (aggregation) we might gain insight into the mechanisms by which function - enabling information is encoded in IDPs.
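
The sketch below only illustrates the general shape of an ensemble-averaged position-position correlation map; the per-residue descriptor it uses is a generic placeholder scale, not the meta-structure quantity introduced by the authors, and the sequence ensemble is random.

```python
import numpy as np

# Illustrative sketch only: the per-residue descriptor is a placeholder
# (a random hydrophobicity-like scale), NOT the authors' meta-structure.
# It shows how an averaged position-position correlation map can be
# assembled from an ensemble of sequences.
rng = np.random.default_rng(2)
AA = "ACDEFGHIKLMNPQRSTVWY"
score = {aa: s for aa, s in zip(AA, rng.normal(size=len(AA)))}  # placeholder scale

def profile(seq, window=5):
    """Smoothed per-residue profile of the placeholder descriptor."""
    vals = np.array([score[a] for a in seq])
    kernel = np.ones(window) / window
    return np.convolve(vals, kernel, mode="valid")

# Placeholder ensemble of equal-length random sequences.
L, n_seq = 120, 500
seqs = ["".join(rng.choice(list(AA), size=L)) for _ in range(n_seq)]
profiles = np.array([profile(s) for s in seqs])

# Correlation map: Pearson correlation between descriptor values at every
# pair of (windowed) sequence positions, averaged across the ensemble.
corr_map = np.corrcoef(profiles, rowvar=False)
print(corr_map.shape)
```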

Relevance:

20.00%

Publisher:

Abstract:

Particle fluxes (including major components and grain size) and oceanographic parameters (near-bottom water temperature, current speed and suspended sediment concentration) were measured along the Cap de Creus submarine canyon in the Gulf of Lions (GoL; NW Mediterranean Sea) during two consecutive winter-spring periods (2009–2010 and 2010–2011). Comparison of these data with measurements of meteorological and hydrological parameters (wind speed, turbulent heat flux, river discharge) has shown the important role of atmospheric forcing in transporting particulate matter through the submarine canyon and towards the deep sea. Indeed, atmospheric forcing during the 2009–2010 and 2010–2011 winter months differed in both intensity and persistence, leading to distinct oceanographic responses. Persistent dry northern winds caused strong heat losses (14.2 × 10³ W m⁻²) in winter 2009–2010 that triggered a pronounced sea surface cooling compared to winter 2010–2011 (1.6 × 10³ W m⁻² lower). As a consequence, a large volume of dense shelf water formed in winter 2009–2010, which cascaded at high speed (up to ∼1 m s⁻¹) down the Cap de Creus Canyon, as measured by a current meter at the head of the canyon. The lower heat losses recorded in winter 2010–2011, together with an increased river discharge, resulted in lower-density waters over the shelf, thus preventing the formation and downslope transport of dense shelf water. High total mass fluxes (up to 84.9 g m⁻² d⁻¹) recorded in winter-spring 2009–2010 indicate that dense shelf water cascading resuspended and transported sediments at least down to the middle canyon. Sediment fluxes were lower (28.9 g m⁻² d⁻¹) under the quieter conditions of winter 2010–2011. The dominance of the lithogenic fraction in the mass fluxes during the two winter-spring periods points to a resuspension origin for most of the particles transported down-canyon. The variability in organic matter and opal contents relates to seasonally controlled inputs associated with the plankton spring bloom during March and April of both years.

Relevance:

20.00%

Publisher:

Abstract:

The relationship between pressure-induced changes in individual proteins and selected quality parameters in bovine longissimus thoracis et lumborum (LTL) muscle was studied. Pressures ranging from 200 to 600 MPa at 20 °C were used. High pressure processing (HPP) at pressures above 200 MPa induced strong modifications of protein solubility, meat colour and water holding capacity (WHC). The protein profiles of non-treated and pressure-treated meat were compared using two-dimensional electrophoresis, and proteins showing significant differences in abundance among treatments were identified by mass spectrometry. Pressure levels above 200 MPa strongly modified the bovine LTL proteome, the main effects being insolubilisation of sarcoplasmic proteins and solubilisation of myofibrillar proteins; sarcoplasmic proteins were more susceptible to HPP than myofibrillar ones. Changes in individual proteins were significantly correlated with protein solubility, L*, b* and WHC, providing further insight into the mechanisms underlying the influence of HPP on quality and a basis for the future development of protein markers to assess the quality of processed meats.

Relevance:

20.00%

Publisher:

Abstract:

Computer simulations of the dynamics of a colloidal particle suspended in a fluid confined by an interface show that the asymptotic decay of the velocity correlation functions is algebraic. The exponents of the long-time tails depend on the direction of motion of the particle relative to the surface, as well as on the specific nature of the boundary conditions. In particular, we find that for the angular velocity correlation function, the decay in the presence of a slip surface is faster than that for a stick surface. An intuitive picture is introduced to explain the various long-time tails, and the simulations are compared with theoretical expressions where available.
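
A short sketch of how a long-time tail is typically extracted in practice: estimate the velocity autocorrelation function from a stored time series and fit its decay on a log-log scale. The signal below is a synthetic, exponentially correlated placeholder, not output from the confined-fluid simulations.

```python
import numpy as np

# Estimate a velocity autocorrelation function (VACF) from a velocity time
# series and read off its long-time behaviour. An algebraic tail
# C(t) ~ t^(-alpha) appears as a straight line on a log-log plot.
rng = np.random.default_rng(3)
n = 2**16
kernel = np.exp(-np.arange(200) / 50.0)
v = np.convolve(rng.normal(size=n), kernel, mode="same")    # placeholder signal

def vacf(v, max_lag):
    """<v(t) v(t+tau)> estimated by averaging over time origins."""
    v = v - v.mean()
    nv = len(v)
    return np.array([np.mean(v[:nv - lag] * v[lag:]) for lag in range(max_lag)])

c = vacf(v, max_lag=200)
c /= c[0]                                                   # normalize: C(0) = 1

# Fit the apparent decay exponent over an intermediate window; for a true
# power-law tail the fitted slope is -alpha, independent of the window.
lags = np.arange(1, len(c))
mask = (lags >= 20) & (lags <= 180) & (c[1:] > 0)
slope, _ = np.polyfit(np.log(lags[mask]), np.log(c[1:][mask]), 1)
print("apparent log-log slope:", slope)
```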

Relevance:

20.00%

Publisher:

Abstract:

High-energy charged particles in the Van Allen radiation belts and in solar energetic particle events can damage satellites in orbit, leading to malfunctions and loss of satellite service. Here we describe some recent results from the SPACECAST project on modelling and forecasting the radiation belts and modelling solar energetic particle events. We describe the SPACECAST forecasting system, which uses physical models that include wave-particle interactions to forecast the electron radiation belts up to 3 h ahead. We show that the forecasts were able to reproduce the >2 MeV electron flux at GOES 13 during the moderate storm of 7-8 October 2012, and during the period following a fast solar wind stream on 25-26 October 2012, to within a factor of 5 or so. At lower energies, from 10 keV to a few hundred keV, we show that the electron flux at geostationary orbit depends sensitively on the high-energy tail of the source distribution near 10 RE (Earth radii) on the nightside of the Earth, and that the source is best represented by a kappa distribution. We present a new model of whistler-mode chorus determined from multiple satellite measurements, which shows that the effects of wave-particle interactions beyond geostationary orbit are likely to be very significant. We also present radial diffusion coefficients calculated from satellite data at geostationary orbit, which vary with Kp by over four orders of magnitude. Finally, we describe a new automated method, taking entropy into account, to determine the position on the shock that is magnetically connected to the Earth for modelling solar energetic particle events, and we predict from analytical theory the form of the mean free path in the foreshock and the particle injection efficiency at the shock, both of which can be tested in simulations.
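
For readers unfamiliar with the kappa distribution mentioned above, the sketch below evaluates an isotropic kappa distribution in one common convention; the parameter values are illustrative and are not fitted to any spacecraft data.

```python
import numpy as np
from math import gamma, pi

# Isotropic kappa distribution (one common convention): a Maxwellian-like
# core with a power-law tail controlled by the kappa index.
def kappa_distribution(v, n, theta, kappa):
    """Phase-space density f(v) for number density n and thermal speed theta."""
    norm = n * (pi * kappa * theta**2) ** -1.5 * gamma(kappa + 1) / gamma(kappa - 0.5)
    return norm * (1.0 + v**2 / (kappa * theta**2)) ** (-(kappa + 1))

v = np.linspace(0.0, 10.0, 6)                                   # speeds in units of theta
f_kappa = kappa_distribution(v, n=1.0, theta=1.0, kappa=3.0)    # hard suprathermal tail
f_near_maxwellian = kappa_distribution(v, n=1.0, theta=1.0, kappa=50.0)  # large kappa limit

# The kappa tail falls off as v^(-2(kappa+1)), far more slowly than the
# near-Maxwellian case, which is why the high-energy tail of the source
# population matters so much for the flux at geostationary orbit.
print(f_kappa / f_near_maxwellian)
```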

Relevance:

20.00%

Publisher:

Abstract:

The final-year project came to us as an opportunity to get involved in a topic that had proved attractive to us while majoring in economics: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored every day. Data analysts able to deal with Big Data and to extract useful results from it are in high demand and, in our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value both for private corporations and for the public sector. For these reasons, the essence of this project is the study of a statistical instrument, directly related to computer science, that is valid for the analysis of large datasets: Partial Correlation Networks.

The structure of the project has been determined by our objectives throughout its development. First, the characteristics of the instrument are explained, from the basic ideas up to the features of the underlying model, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrative simulation is performed to show the power and efficiency of the model. Finally, the model is put into practice by analysing a relatively large set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series. In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective, useful instrument that allows valuable results to be extracted from Big Data.

The findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models (SPACE) approach presented by Peng et al. (2009) works well under the assumption of sparsity. Moreover, partial correlation networks are shown to be a very valid tool for representing cross-sectional interconnections between the elements of large data sets. The scope of this project is nevertheless limited, as some sections would have benefited from deeper analysis: considering intertemporal connections between elements, the choice of the tuning parameter lambda, and a more detailed analysis of the results in the real-data application are examples of aspects in which this project could be extended. To sum up, the statistical tool analysed has proved to be a very useful instrument for finding relationships connecting the elements of a large data set; partial correlation networks allow the owner of such a data set to observe and analyse linkages that might otherwise have been overlooked.
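
The sketch below shows the basic idea behind a partial correlation network using the plain precision-matrix route, rather than the SPACE joint sparse regression estimator discussed in the project (which adds sparsity penalties and scales to settings with many more variables than observations). The data and the threshold are synthetic placeholders.

```python
import numpy as np

# Partial correlations from the inverse covariance (precision) matrix:
# rho_ij = -Theta_ij / sqrt(Theta_ii * Theta_jj)
rng = np.random.default_rng(4)
n_obs, p = 500, 10
X = rng.normal(size=(n_obs, p))
X[:, 1] += 0.8 * X[:, 0]                 # plant one direct linkage
X[:, 2] += 0.8 * X[:, 1]                 # and a chained one

cov = np.cov(X, rowvar=False)
theta = np.linalg.inv(cov)
d = np.sqrt(np.diag(theta))
partial_corr = -theta / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

# Build the network: keep an edge wherever the absolute partial correlation
# exceeds a (here arbitrary) threshold.
threshold = 0.1
edges = [(i, j, round(partial_corr[i, j], 2))
         for i in range(p) for j in range(i + 1, p)
         if abs(partial_corr[i, j]) > threshold]
print(edges)
```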

Relevance:

20.00%

Publisher:

Abstract:

Background: In longitudinal studies where subjects experience recurrent incidents over a period of time, such as respiratory infections, fever or diarrhoea, statistical methods are required that take into account the within-subject correlation. Methods: For recurrent event data with censored failure times, the independent increment (AG), marginal (WLW) and conditional (PWP) models are three multiple-failure models that generalize Cox's proportional hazards model. In this paper, we assess the efficiency, accuracy and robustness of all three models under simulated scenarios with varying degrees of within-subject correlation, censoring levels, maximum numbers of possible recurrences and sample sizes. We also study the methods' performance on a real dataset from a cohort study of bronchial obstruction. Results: We find substantial differences between the methods, and no single method is optimal. AG and PWP seem preferable to WLW for low correlation levels, but the situation is reversed for high correlations. Conclusions: All methods are robust to censoring, worsen with increasing numbers of recurrences, and share a bias problem which, among other consequences, makes asymptotic normal confidence intervals not fully reliable, although these are well developed theoretically.
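
A sketch of the kind of simulated recurrent-event data such comparisons rely on: each subject receives a gamma frailty that multiplies a baseline event rate, which induces within-subject correlation of the gap times. Fitting the AG, WLW and PWP models themselves would be done with a survival package (for example R's survival); only the data-generating step is sketched here, with illustrative parameter values.

```python
import numpy as np

# Simulate recurrent events with gamma frailty and administrative censoring,
# stored in counting-process (start, stop, event) form.
rng = np.random.default_rng(5)

n_subjects = 200
baseline_rate = 0.5       # events per unit time
frailty_var = 1.0         # larger variance -> stronger within-subject correlation
follow_up = 5.0           # administrative censoring time
max_recurrences = 10

records = []              # (subject, start, stop, event)
for subj in range(n_subjects):
    frailty = rng.gamma(shape=1.0 / frailty_var, scale=frailty_var)  # mean 1
    start, k = 0.0, 0
    while k < max_recurrences:
        gap = rng.exponential(1.0 / (baseline_rate * frailty))
        if start + gap > follow_up:
            records.append((subj, start, follow_up, 0))   # censored final interval
            break
        records.append((subj, start, start + gap, 1))     # observed recurrence
        start += gap
        k += 1

events_per_subject = [sum(r[3] for r in records if r[0] == s) for s in range(n_subjects)]
print("mean events per subject:", np.mean(events_per_subject))
```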

Relevance:

20.00%

Publisher:

Abstract:

We reconsider a model of two relativistic particles interacting via a multiplicative potential, as an example of a simple dynamical system with sectors, or branches, with different dynamics and degrees of freedom. The presence or absence of sectors depends on the values of rest masses. Some aspects of the canonical quantization are described. The model could be interpreted as a bigravity model in one dimension.

Relevance:

20.00%

Publisher:

Abstract:

Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of excitation-emission matrices (EEMs), has become an indispensable tool for studying DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method that clusters input EEMs and reduces their dimensionality without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash-flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
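
A minimal SOM sketch for this kind of data: EEMs are unfolded into vectors and mapped onto a small 2-D grid of prototype vectors. This is the generic SOM algorithm, not the exact configuration used in the study; the input matrix, grid size and training schedule below are placeholders.

```python
import numpy as np

# Minimal Self-Organising Map: iterative best-matching-unit updates with a
# shrinking Gaussian neighbourhood. X is a placeholder for unfolded EEMs.
rng = np.random.default_rng(6)
n_samples, n_features = 300, 40 * 40        # e.g. a 40x40 excitation-emission grid
X = rng.random((n_samples, n_features))

grid_x, grid_y = 6, 6
weights = rng.random((grid_x, grid_y, n_features))
coords = np.array([[i, j] for i in range(grid_x) for j in range(grid_y)], float)

n_iter = 3000
sigma0, lr0 = 3.0, 0.5
for it in range(n_iter):
    x = X[rng.integers(n_samples)]
    # best matching unit (BMU): the prototype closest to the sample
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # exponentially shrinking neighbourhood radius and learning rate
    frac = it / n_iter
    sigma = sigma0 * np.exp(-3.0 * frac)
    lr = lr0 * np.exp(-3.0 * frac)
    # Gaussian neighbourhood function around the BMU on the map grid
    grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=1).reshape(grid_x, grid_y)
    h = np.exp(-grid_dist2 / (2 * sigma**2))
    weights += lr * h[:, :, None] * (x - weights)

# Each sample is assigned to its BMU; samples sharing a node (or neighbouring
# nodes) correspond to similar EEM shapes.
bmus = [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=2)),
                         (grid_x, grid_y)) for x in X]
print(bmus[:5])
```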

Relevance:

20.00%

Publisher:

Abstract:

The number and volume distributions of particles, and the removal efficiency for particles and suspended solids in different effluents and their filtrates, were analysed to study whether the filters most commonly used in drip irrigation systems remove the particles that can clog emitters. In most of the effluents and filtrates, the number of particles with diameters above 20 μm was minimal. However, when the particle volume distribution was analysed, particles larger than the mesh size of the disc and screen filters appeared in the filtrates, with the sand filter retaining the largest-diameter particles. The efficiency of the filters in retaining particles depended more on the type of effluent than on the filter. It was also verified that the particle number distribution follows a power-law relationship. Analysing the exponent β of the power law showed that the filters did not significantly modify the particle number distribution of the effluents.
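
A short sketch of how a power-law exponent β of a particle number distribution can be estimated: bin the particle diameters and fit a straight line to log(count density) versus log(diameter). The diameters below are drawn from a synthetic Pareto-like distribution, not from the effluents in the study.

```python
import numpy as np

# Estimate the power-law exponent beta of a particle number distribution,
# N(d) ~ d^(-beta), by log-log linear regression on binned counts.
rng = np.random.default_rng(7)
beta_true = 3.0
d_min = 1.0                                     # micrometres
# Inverse-transform sampling from a Pareto-like density f(d) ~ d^(-beta)
diameters = d_min * (1.0 - rng.random(50_000)) ** (-1.0 / (beta_true - 1.0))

bins = np.logspace(0, 2, 30)                    # 1 to 100 micrometres
counts, edges = np.histogram(diameters, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])       # geometric bin centres
density = counts / np.diff(edges)               # particles per unit diameter

mask = density > 0
slope, intercept = np.polyfit(np.log(centers[mask]), np.log(density[mask]), 1)
print("estimated beta:", -slope)                # should be close to beta_true
```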