888 results for "international student mobility cross-section time series model Source country host country"
Abstract:
Influences of the isospin dependence of the in-medium nucleon-nucleon cross section and the momentum-dependent interaction (MDI) on isotope scaling are investigated using the isospin-dependent quantum molecular dynamics (IQMD) model. The results show that both the isospin dependence of the in-medium nucleon-nucleon cross section and the MDI affect the isoscaling parameters appreciably and independently. The influence of the isospin dependence of the two-body collisions is larger than that of the MDI in the mean field. To explore the implication of isoscaling behaviour, namely whether statistical equilibrium is reached in the reaction, the statistical properties of the mass distribution and the kinetic-energy distribution of the fragments simulated by IQMD are presented.
Abstract:
The influences of the isospin-dependent in-medium nucleon-nucleon cross section and the momentum-dependent interaction (MDI) on isotope scaling have been investigated within the isospin-dependent quantum molecular dynamics (IQMD) model. The results show that both the isospin-dependent in-medium nucleon-nucleon cross section and the MDI reduce the isoscaling parameter α appreciably, i.e. they weaken the dependence of the yield ratios of the two systems on the isospin difference between them.
Abstract:
Al K-shell X-ray yields are measured with highly charged Ar q+ ions (q = 12-16) bombarding aluminium. The energy of the Ar ions ranges from 180 to 380 keV. K-shell ionization cross sections of aluminium are also obtained from the yield data. The experimental data are explained within the framework of 2pπ-2pσ rotational coupling. When Ar ions with 2p-shell vacancies are incident on aluminium, the vacancies begin to be filled; meanwhile, collisions with Al atoms produce new 2p-shell vacancies in the Ar ions. These Ar 2p-shell vacancies can transfer to the 1s orbital of an Al atom via 2pπ-2pσ rotational coupling, leading to the emission of an aluminium K-shell X-ray. A model is constructed on the basis of this physical scenario, and its calculated results are in agreement with the experimental results.
Abstract:
Within the dinuclear system (DNS) concept, the master equation is solved numerically, instead of solving the Fokker-Planck equation (FPE) analytically, to calculate the fusion probability of super-heavy nuclei, so that the harmonic-oscillator approximation to the potential energy of the DNS is avoided. The relative motion, with its energy, angular-momentum and fragment-deformation relaxations, is explicitly treated and coupled to the diffusion process. The nucleon transition probabilities, derived microscopically, are related to the energy dissipation of the relative motion and are therefore time dependent. Compared with the analytical solution of the FPE at equilibrium, our time-dependent results preserve more dynamical effects. The calculated evaporation-residue cross sections for the one-neutron emission channel of Pb-based reactions agree with the known experimental data within one order of magnitude.
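The numerical alternative to an analytical FPE solution described above can be sketched generically. A minimal sketch of forward-Euler time stepping of a master equation, with illustrative placeholder transition rates rather than the microscopic DNS rates of the paper:

```python
import numpy as np

# Minimal sketch: numerically solving a master equation
# dP_n/dt = sum_m [W(n<-m) P_m - W(m<-n) P_n]
# by forward-Euler time stepping. The transition rates W below are
# illustrative placeholders, not the microscopic DNS rates.

def evolve_master_equation(W, P0, dt, steps):
    """Evolve occupation probabilities P under rate matrix W.

    W[n, m] is the transition rate from state m to state n (n != m).
    """
    P = P0.astype(float).copy()
    # Build the generator: off-diagonal gains, diagonal losses.
    L = W - np.diag(W.sum(axis=0))
    for _ in range(steps):
        P = P + dt * (L @ P)
    return P

# Three-state toy system with arbitrary rates.
W = np.array([[0.0, 0.2, 0.1],
              [0.3, 0.0, 0.2],
              [0.1, 0.4, 0.0]])
P0 = np.array([1.0, 0.0, 0.0])   # start fully in state 0
P = evolve_master_equation(W, P0, dt=0.01, steps=5000)
print(P, P.sum())  # total probability is conserved by construction
```

Because the columns of the generator sum to zero, the Euler update conserves total probability exactly; a time-dependent rate matrix, as in the paper, would simply be rebuilt inside the loop.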
Abstract:
The medium effect of the in-medium nucleon-nucleon cross section σ_NN^med(α_m) on the isoscaling parameter α is investigated for two pairs of central nuclear reactions, Ca-40 + Ca-48 and Ca-60 + Ca-48, and Sn-112 + Sn-112 and Sn-124 + Sn-124, at beam energies from 40 to 60 MeV/nucleon with isospin-dependent quantum molecular dynamics. An obvious medium effect of σ_NN^med(α_m) on the isoscaling parameter α is found, and the mechanism of this medium effect is investigated.
Abstract:
Influences of the isospin-dependent in-medium nucleon-nucleon cross section σ_NN^iso and the momentum-dependent interaction (MDI) on the isoscaling parameter α are investigated for two central collisions, Ca-40 + Ca-40 and Ca-60 + Ca-60, simulated with the isospin-dependent quantum molecular dynamics model in the beam energy region from 40 to 60 MeV/nucleon. The isotope yield ratio R_21(N, Z) for these two central collisions depends exponentially on the neutron number N and proton number Z of the isotopes, i.e. it exhibits isoscaling. In particular, the isospin-dependent σ_NN^iso and the MDI induce an obvious decrease of the isoscaling parameter α. The mechanisms of the decrease of α due to σ_NN^iso and the MDI are studied respectively.
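The isoscaling relation underlying these abstracts, R_21(N, Z) = C·exp(αN + βZ), amounts to a log-linear fit of yield ratios. A minimal sketch using synthetic yields with known parameters (the (N, Z) grid and parameter values are illustrative, not simulation output):

```python
import numpy as np

# Minimal sketch: extracting isoscaling parameters alpha and beta from
# the isotope yield ratio R21(N, Z) = C * exp(alpha*N + beta*Z) by a
# log-linear least-squares fit. The synthetic ratios below are
# illustrative; real analyses use fragment yields from two reactions.

rng = np.random.default_rng(0)
alpha_true, beta_true, lnC_true = 0.5, -0.4, 0.1

# A small grid of fragment (N, Z) values.
N = np.array([2, 3, 4, 3, 4, 5, 4, 5, 6])
Z = np.array([2, 2, 2, 3, 3, 3, 4, 4, 4])
lnR21 = lnC_true + alpha_true * N + beta_true * Z
lnR21 += rng.normal(0.0, 0.01, size=lnR21.size)  # small statistical noise

# Least-squares fit of ln R21 = ln C + alpha*N + beta*Z.
A = np.column_stack([np.ones_like(N, dtype=float), N, Z])
lnC, alpha, beta = np.linalg.lstsq(A, lnR21, rcond=None)[0]
print(alpha, beta)  # recovers values close to 0.5 and -0.4
```

A decrease of the fitted α, as reported for σ_NN^iso and the MDI, corresponds to a flatter dependence of ln R_21 on the neutron number N.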
Abstract:
We report on a measurement of the ϒ(1S + 2S + 3S) → e+e- cross section at midrapidity in p + p collisions at √s = 200 GeV. We find the cross section to be 114 ± 38 (stat + fit) +23/-24 (syst) pb. Perturbative QCD calculations at next-to-leading order in the color evaporation model are in agreement with our measurement, while calculations in the color singlet model underestimate it by 2σ. Our result is consistent with the trend seen in world data as a function of the center-of-mass energy of the collision and extends the availability of ϒ data to RHIC energies. The dielectron continuum in the invariant-mass range near the ϒ is also studied to obtain a combined yield of e+e- pairs from the sum of the Drell-Yan process and bb̄ production.
Abstract:
The mirror nuclei N-12 and B-12 were separated by the Radioactive Ion Beam Line in Lanzhou (RIBLL) at HIRFL from the breakup of 78.6 MeV/u N-14 on a Be target. The total reaction cross sections of N-12 at 34.9 MeV/u and of B-12 at 54.4 MeV/u on a Si target were measured using the transmission method. Assuming N-12 consists of a C-11 core plus one halo proton, the excitation functions of N-12 and B-12 on Si and C targets were calculated with the Glauber model and fit the experimental data very well. The characteristic halo structure of N-12 was found, with a large diffusion of the proton density distribution.
Abstract:
The goal of this work is to learn a parsimonious and informative representation for high-dimensional time series. Conceptually, this comprises two distinct yet tightly coupled tasks: learning a low-dimensional manifold and modeling the dynamical process. These two tasks have a complementary relationship as the temporal constraints provide valuable neighborhood information for dimensionality reduction and conversely, the low-dimensional space allows dynamics to be learnt efficiently. Solving these two tasks simultaneously allows important information to be exchanged mutually. If nonlinear models are required to capture the rich complexity of time series, then the learning problem becomes harder as the nonlinearities in both tasks are coupled. The proposed solution approximates the nonlinear manifold and dynamics using piecewise linear models. The interactions among the linear models are captured in a graphical model. By exploiting the model structure, efficient inference and learning algorithms are obtained without oversimplifying the model of the underlying dynamical process. Evaluation of the proposed framework with competing approaches is conducted in three sets of experiments: dimensionality reduction and reconstruction using synthetic time series, video synthesis using a dynamic texture database, and human motion synthesis, classification and tracking on a benchmark data set. In all experiments, the proposed approach provides superior performance.
Abstract:
The goal of this work is to learn a parsimonious and informative representation for high-dimensional time series. Conceptually, this comprises two distinct yet tightly coupled tasks: learning a low-dimensional manifold and modeling the dynamical process. These two tasks have a complementary relationship as the temporal constraints provide valuable neighborhood information for dimensionality reduction and conversely, the low-dimensional space allows dynamics to be learnt efficiently. Solving these two tasks simultaneously allows important information to be exchanged mutually. If nonlinear models are required to capture the rich complexity of time series, then the learning problem becomes harder as the nonlinearities in both tasks are coupled. The proposed solution approximates the nonlinear manifold and dynamics using piecewise linear models. The interactions among the linear models are captured in a graphical model. The model structure setup and parameter learning are done using a variational Bayesian approach, which enables automatic Bayesian model structure selection, hence solving the problem of over-fitting. By exploiting the model structure, efficient inference and learning algorithms are obtained without oversimplifying the model of the underlying dynamical process. Evaluation of the proposed framework with competing approaches is conducted in three sets of experiments: dimensionality reduction and reconstruction using synthetic time series, video synthesis using a dynamic texture database, and human motion synthesis, classification and tracking on a benchmark data set. In all experiments, the proposed approach provides superior performance.
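The core idea of the two abstracts above, approximating nonlinear dynamics with a collection of interacting linear models, can be sketched generatively. A minimal sketch of a switching (piecewise) linear dynamical system in a 2-D latent space; the matrices and switching rule are illustrative, not the papers' learned parameters:

```python
import numpy as np

# Minimal sketch: a switching (piecewise) linear dynamical system in a
# 2-D latent space, in the spirit of approximating nonlinear dynamics
# with several linear models. All parameters are illustrative.

rng = np.random.default_rng(1)

def rotation(theta, scale=0.99):
    """A mildly contracting rotation, one linear regime."""
    c, s = np.cos(theta), np.sin(theta)
    return scale * np.array([[c, -s], [s, c]])

# Two linear regimes: slow rotations in opposite directions.
A = [rotation(0.10), rotation(-0.25)]

x = np.array([1.0, 0.0])
traj = [x]
for t in range(200):
    k = 0 if x[0] >= 0.0 else 1                    # regime picked by latent state
    x = A[k] @ x + rng.normal(0.0, 0.01, size=2)   # linear step + process noise
    traj.append(x)
traj = np.array(traj)
print(traj.shape)  # (201, 2)
```

In the papers the regime assignments and linear maps are latent and learned (via a graphical model, and in the second abstract via variational Bayes); here the switching rule is hard-coded purely to illustrate the generative process.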
Abstract:
Time-series analysis and prediction play an important role in state-based systems that deal with varying situations in terms of states of the world evolving over time. Generally speaking, the world in the discourse persists in a given state until something occurs to take it into another state. This paper introduces a framework for prediction and analysis based on time-series of states. It takes as its temporal basis a time theory that treats both points and intervals as primitive time elements. A state of the world under consideration is defined as a set of time-varying propositions with Boolean truth-values that depend on time, including properties, facts, actions, events, processes, etc. A time-series of states is then formalized as a list of states temporally ordered one after another. The framework supports explicit expression of both absolute and relative temporal knowledge. A formal schema is provided for expressing general time-series of states that may be incomplete in various ways, while the concept of a complete time-series of states is also formally defined. As applications of the formalism in time-series analysis and prediction, we present two illustrative examples.
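The formalization described, states as sets of time-dependent Boolean propositions, ordered over primitive time elements, can be sketched directly. A minimal sketch, with illustrative propositions and interval times (not the paper's formal schema):

```python
from dataclasses import dataclass, field

# Minimal sketch: a time-series of states, where each state is a set of
# Boolean propositions holding over a time element (here, an interval).
# The propositions and times are illustrative placeholders.

@dataclass(frozen=True)
class State:
    start: float                      # interval as a primitive time element
    end: float
    props: frozenset = field(default_factory=frozenset)

    def holds(self, p):
        return p in self.props

# A temporally ordered list of states. The series may be incomplete:
# the gap between t=2 and t=3 is simply not described.
series = [
    State(0, 1, frozenset({"door_open", "light_off"})),
    State(1, 2, frozenset({"door_closed", "light_off"})),
    State(3, 4, frozenset({"door_closed", "light_on"})),
]

# Temporal ordering: each state ends no later than the next begins.
assert all(a.end <= b.start for a, b in zip(series, series[1:]))
print([s.holds("light_on") for s in series])  # [False, False, True]
```

A complete time-series, in the paper's sense, would additionally require that consecutive intervals meet with no gaps; the example above deliberately violates that to show an incomplete series.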
Abstract:
In 2000 a Review of Current Marine Observations in relation to present and future needs was undertaken by the Inter-Agency Committee for Marine Science and Technology (IACMST). The Marine Environmental Change Network (MECN) was initiated in 2002 as a direct response to the recommendations of the report. A key part of the current phase of the MECN is to ensure that information from the network is provided to policy makers and other end-users to enable them to produce more accurate assessments of ecosystem state and gain a clearer understanding of factors influencing change in marine ecosystems. The MECN holds workshops on an annual basis, bringing together partners maintaining time-series and long-term datasets as well as end-users interested in outputs from the network. It was decided that the first workshop of the MECN continuation phase should consist of an evaluation of the time series and data sets maintained by partners in the MECN with regard to their ‘fit for purpose’ for answering key science questions and informing policy development. This report is based on the outcomes of the workshop. Section one of the report contains a brief introduction to monitoring, time series and long-term datasets. The various terms are defined and the need for MECN type data to complement compliance monitoring programmes is discussed. Outlines are also given of initiatives such as the United Kingdom Marine Monitoring and Assessment Strategy (UKMMAS) and Oceans 2025. Section two contains detailed information for each of the MECN time series / long-term datasets including information on scientific outputs and current objectives. This information is mainly based on the presentations given at the workshop and therefore follows a format whereby the following headings are addressed: Origin of time series including original objectives; current objectives; policy relevance; products (advice, publications, science and society). 
Section three consists of comments made by the review panel concerning all the time series and the network. Needs or issues highlighted by the panel with regard to the future of long-term datasets and time-series in the UK are shown, along with advice and potential solutions where offered. The recommendations are divided into four categories: ‘The MECN and end-user requirements’; ‘Procedures & protocols’; ‘Securing data series’; and ‘Future developments’. Ever since marine environmental protection issues came to the fore in the 1960s, it has been recognised that a suitable evidence base on environmental change is required to support policy and management for UK waters. Section four gives a brief summary of the development of marine policy in the UK, along with comments on the availability and necessity of long-term marine observations for the implementation of this policy. Policy relating to three main areas is discussed: Marine Conservation (protecting biodiversity and marine ecosystems), Marine Pollution, and Fisheries. The conclusion of this section is that there has always been a specific requirement for information on long-term change in marine ecosystems around the UK in order to address concerns over pollution, fishing and general conservation. It is now imperative that this need is addressed in order for the UK to be able to fulfil its policy commitments and manage marine ecosystems in the light of climate change and other factors.
Continuous Plankton Records - Persistence In Time-Series Of Annual Means Of Abundance Of Zooplankton
Abstract:
Time-series of annual means of abundance of zooplankton of the north-east Atlantic Ocean and the North Sea, for the period 1948 to 1977, show considerable associations between successive years. The seasonal dynamics of the stocks appear to be consistent with at least a proportion of this being due to inherent persistence from year-to-year. Experiments with a simple model suggest that the observed properties of the time-series cannot be reproduced as a response to simple random forcing. The extent of trends and long wavelength variations can be simulated by introducing fairly extensive persistence into the perturbations, but this underestimates the extent of shorter wavelength variability in the observed time-series. The effect of persistence is to increase the proportion of trend and long wavelength variability in time-series of annual means, but stocks can respond to short wavelength perturbations provided these have a clearly defined frequency.
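The "simple model" experiment described, comparing pure random forcing with forcing plus inherent year-to-year persistence, is naturally sketched as an AR(1) process. A minimal sketch, where the persistence parameter 0.6 is illustrative rather than a value fitted to the plankton data:

```python
import numpy as np

# Minimal sketch: annual means with year-to-year persistence modeled as
# an AR(1) process, x_t = phi * x_{t-1} + e_t, versus pure random
# forcing (phi = 0). phi = 0.6 is illustrative, not a fitted value.

def simulate_annual_means(phi, years=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(years)
    for t in range(1, years):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def lag1_autocorr(x):
    """Sample autocorrelation between successive years."""
    x = x - x.mean()
    return (x[:-1] * x[1:]).sum() / (x * x).sum()

persistent = simulate_annual_means(phi=0.6)
random_forcing = simulate_annual_means(phi=0.0)
print(lag1_autocorr(persistent), lag1_autocorr(random_forcing))
```

The persistent series shows strong association between successive years and proportionally more long-wavelength (low-frequency) variance, while the purely random series does not, mirroring the qualitative contrast the abstract draws.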
Abstract:
Historical GIS has the potential to re-invigorate our use of statistics from historical censuses and related sources. In particular, areal interpolation can be used to create long-run time-series of spatially detailed data that will enable us to enhance significantly our understanding of geographical change over periods of a century or more. The difficulty with areal interpolation, however, is that the data that it generates are estimates which will inevitably contain some error. This paper describes a technique that allows the automated identification of possible errors at the level of the individual data values.
Abstract:
Objectives: Methicillin-resistant Staphylococcus aureus (MRSA) is a major nosocomial pathogen worldwide. A wide range of factors have been suggested to influence the spread of MRSA. The objective of this study was to evaluate the effect of antimicrobial drug use and infection control practices on nosocomial MRSA incidence in a 426-bed general teaching hospital in Northern Ireland.
Methods: The present research involved the retrospective collection of monthly data on the usage of antibiotics and on infection control practices within the hospital over a 5 year period (January 2000–December 2004). A multivariate ARIMA (time-series analysis) model was built to relate MRSA incidence with antibiotic use and infection control practices.
Results: Analysis of the 5 year data set showed that temporal variations in MRSA incidence followed temporal variations in the use of fluoroquinolones, third-generation cephalosporins, macrolides and amoxicillin/clavulanic acid (coefficients = 0.005, 0.03, 0.002 and 0.003, respectively, with various time lags). Temporal relationships were also observed between MRSA incidence and infection control practices, i.e. the number of patients actively screened for MRSA (coefficient = -0.007), the use of alcohol-impregnated wipes (coefficient = -0.0003) and the bulk orders of alcohol-based handrub (coefficients = -0.04 and -0.08), with increased infection control activity being associated with decreased MRSA incidence, and between MRSA incidence and the number of new patients admitted with MRSA (coefficient = 0.22). The model explained 78.4% of the variance in the monthly incidence of MRSA.
Conclusions: The results of this study confirm the value of infection control policies as well as suggest the usefulness of restricting the use of certain antimicrobial classes to control MRSA.
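The lagged temporal relationships reported above (incidence following antibiotic use with various time lags) can be illustrated with a much simpler stand-in for the multivariate ARIMA transfer-function model. A minimal sketch on fully synthetic series, scanning candidate lags by least squares; all series, lags and coefficients are invented for illustration:

```python
import numpy as np

# Minimal sketch: relating a monthly incidence series to a lagged
# exogenous series (e.g. antibiotic use) by linear regression -- a
# simplified stand-in for the multivariate ARIMA transfer-function
# model used in the study. All series and coefficients are synthetic.

rng = np.random.default_rng(2)
months = 60                           # a 5-year monthly series
use = rng.normal(10.0, 2.0, months)   # synthetic antibiotic-use series

lag = 2                               # incidence follows use with a lag
incidence = 0.5 + 0.03 * np.roll(use, lag) + rng.normal(0, 0.01, months)
incidence[:lag] = incidence[lag]      # discard the wrapped-around edge

def fit_at_lag(k):
    """OLS fit of incidence[t] on use[t-k]; returns (slope, residual)."""
    y, x = incidence[k:], use[: months - k]
    A = np.column_stack([np.ones_like(x), x])
    coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[1], res[0]

# Scan candidate lags and pick the best-fitting one.
fits = {k: fit_at_lag(k) for k in range(4)}
best = min(fits, key=lambda k: fits[k][1])
print(best, fits[best][0])  # recovers the generating lag and coefficient
```

The actual study used a multivariate ARIMA model, which additionally handles autocorrelation and several simultaneous inputs; the scan over lags above only conveys the idea of a temporal relationship with a delay.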