899 results for particle trajectory computation
Abstract:
Tiger nut (Cyperus esculentus) tubers contain an oil that is high in monounsaturated fatty acids and makes up about 23% of the tuber. This study evaluated the impact of several processing factors, including enzymatic pre-treatment, on the recovery of pressed tiger nut oil. Smaller particles were more favourable for pressing. High-pressure pre-treatment did not increase oil recovery, but enzymatic treatment did. The highest yield obtained with enzymatic treatment prior to mechanical extraction was 33% on a dry defatted basis, corresponding to recovery of 90% of the oil. Tiger nut oil consists mainly of oleic acid; its acid and peroxide values reflect its high stability.
Abstract:
This paper investigates the use of a particle filter for data assimilation with a full-scale coupled ocean–atmosphere general circulation model. Synthetic twin experiments are performed to assess the performance of the equivalent-weights particle filter in such a high-dimensional system. Artificial two-dimensional sea surface temperature fields are assimilated as daily observations. Results are presented for different values of the free parameters in the method. Filter performance is measured by root-mean-square errors, trajectories of individual model variables, and rank histograms. Filter degeneracy is not observed, and the performance of the filter is shown to depend on its ability to maintain maximum spread in the ensemble.
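One of the verification measures named above, the rank histogram, can be sketched in a few lines: for each verification case the truth's rank among the sorted ensemble members is tallied, and a flat histogram indicates a statistically reliable ensemble. The Gaussian toy ensemble below is an illustrative assumption, not the paper's setup.

```python
import random

random.seed(3)
n_members, n_cases = 9, 5000
counts = [0] * (n_members + 1)   # ranks 0..n_members, so n_members+1 bins

for _ in range(n_cases):
    truth = random.gauss(0.0, 1.0)
    ensemble = sorted(random.gauss(0.0, 1.0) for _ in range(n_members))
    # rank = number of ensemble members below the truth
    rank = sum(1 for m in ensemble if m < truth)
    counts[rank] += 1
```

Because truth and members are drawn from the same distribution here, every rank is equally likely and the counts come out roughly flat; a U-shaped histogram would instead signal an under-dispersive ensemble.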
Abstract:
In this paper we propose an innovative approach to behaviour recognition in a multi-camera environment, based on translating video activity into semantics. First, we fuse tracks from individual cameras through clustering, employing soft-computing techniques. Then, we introduce a higher-level module able to translate the fused tracks into semantic information. With this approach we address the challenge set in PETS 2014 on recognising behaviours of interest around a parked vehicle, namely the abnormal behaviour of someone walking around the vehicle.
Abstract:
Estimating trajectories and parameters of dynamical systems from observations is a problem frequently encountered in various branches of science; geophysicists, for example, refer to this problem as data assimilation. Unlike in estimation problems with exchangeable observations, in data assimilation the observations cannot easily be divided into separate sets for estimation and validation; this creates serious problems, since simply using the same observations for both might result in overly optimistic performance assessments. To circumvent this problem, a result is presented which allows us to estimate this optimism, thus allowing for a more realistic performance assessment in data assimilation. The presented approach becomes particularly simple for data-assimilation methods employing a linear error feedback (such as synchronization schemes, nudging, incremental 3D-Var and 4D-Var, and various Kalman filter approaches). Numerical examples considering a high-gain observer confirm the theory.
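The "linear error feedback" covered by this result can be sketched with a toy scalar model, in the spirit of nudging: the analysis equals the forecast plus a fixed gain times the innovation. The model dynamics, gain value, and noise-free observations below are illustrative assumptions, not the paper's configuration.

```python
def step(x, a=0.9):
    # toy linear model dynamics (hypothetical)
    return a * x

gain = 0.5                     # fixed feedback gain
x_truth, x_assim = 1.0, 0.0    # truth and assimilating run start apart

for _ in range(50):
    x_truth = step(x_truth)
    obs = x_truth                                  # noise-free observation
    forecast = step(x_assim)
    # linear error feedback: analysis = forecast + gain * innovation
    x_assim = forecast + gain * (obs - forecast)
```

With this fixed gain the assimilation error contracts by a factor (1 - gain) * a per step, so the assimilating run synchronizes with the truth.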
Abstract:
Nonlinear data assimilation is high on the agenda in all fields of the geosciences: with ever-increasing model resolution, the inclusion of more physical (and biological) processes, and more complex observation operators, the data-assimilation problem becomes more and more nonlinear. The suitability of particle filters to solve the nonlinear data-assimilation problem in high-dimensional geophysical problems is discussed. Several existing and new schemes are presented, and it is shown that at least one of them, the equivalent-weights particle filter, does indeed beat the curse of dimensionality and provides a way forward for nonlinear data assimilation in high-dimensional systems.
Abstract:
Approximate Bayesian computation (ABC) is a popular family of algorithms that perform approximate parameter inference when numerical evaluation of the likelihood function is not possible but data can be simulated from the model. They return a sample of parameter values that produce simulations close to the observed dataset. A standard approach is to reduce the simulated and observed datasets to vectors of summary statistics and accept when the distance between these falls below a specified threshold. ABC can also be adapted to perform model choice. In this article, we present abctools, a new software package for R that provides methods for tuning ABC algorithms. This includes recent dimension-reduction algorithms to tune the choice of summary statistics, and coverage methods to tune the choice of threshold. We provide several illustrations of these routines on applications taken from the ABC literature.
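The standard rejection-ABC scheme described above can be sketched as follows, for a toy problem: inferring the mean of a Normal(mu, 1) model using the sample mean as summary statistic. The prior, threshold, and data sizes are illustrative assumptions (and this sketch is Python, not the R package itself).

```python
import random
import statistics

random.seed(0)
# "observed" data from the true model with mu = 2.0
observed = [random.gauss(2.0, 1.0) for _ in range(200)]
s_obs = statistics.mean(observed)       # observed summary statistic

def simulate(mu, n=200):
    return [random.gauss(mu, 1.0) for _ in range(n)]

def abc_rejection(n_draws=2000, threshold=0.1):
    accepted = []
    for _ in range(n_draws):
        mu = random.uniform(-5.0, 5.0)  # draw from a flat prior
        s_sim = statistics.mean(simulate(mu))
        # accept when the summary distance falls below the threshold
        if abs(s_sim - s_obs) < threshold:
            accepted.append(mu)
    return accepted

posterior = abc_rejection()
```

The accepted values approximate the posterior of mu; shrinking the threshold tightens the approximation at the cost of a lower acceptance rate, which is exactly the trade-off the coverage-based tuning in abctools targets.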
Abstract:
Filter degeneracy is the main obstacle to implementing particle filters in nonlinear, high-dimensional models. A new scheme, the implicit equal-weights particle filter (IEWPF), is introduced. In this scheme samples are drawn implicitly from proposal densities with a different covariance for each particle, such that all particle weights are equal by construction. We test and explore the properties of the new scheme using a 1,000-dimensional simple linear model and the 1,000-dimensional nonlinear Lorenz-96 model, and compare its performance to that of a local ensemble Kalman filter. The experiments show that the new scheme is easily implemented in high-dimensional systems and is never degenerate, with good convergence properties in both systems.
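The degeneracy that the IEWPF avoids by construction can be seen in a few lines with a standard bootstrap-weight update, sketched here for a scalar state under an assumed Gaussian observation error (this is not the IEWPF itself): when the normalized weights become uneven, the effective sample size collapses toward one.

```python
import math
import random

random.seed(1)
n_particles = 500
particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
obs, obs_var = 0.5, 0.1

# likelihood weights p(y | x_i) under Gaussian observation error
weights = [math.exp(-(obs - x) ** 2 / (2 * obs_var)) for x in particles]
total = sum(weights)
weights = [w / total for w in weights]

# effective sample size: equals n_particles for equal weights,
# approaches 1 when a single particle dominates (degeneracy)
ess = 1.0 / sum(w * w for w in weights)
```

Even in this one-dimensional example the ESS falls well below the ensemble size; in high dimensions the collapse is far more severe, which is why equal-weights constructions matter.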
Abstract:
Trust and reputation are important factors influencing the success of both traditional transactions in physical social networks and modern e-commerce in virtual Internet environments. Trust is difficult to define and quantify because it has both subjective and objective characteristics at the same time. A well-reported issue with reputation management systems in business-to-consumer (B2C) e-commerce is the “all good reputation” problem. To address it, a new computational model of reputation is proposed in this paper. Each customer rating is treated as a basic trust-score event, and the time series of massive ratings is aggregated via the Beta distribution to form the sellers’ local temporal trust scores. A logical model of trust and reputation is established based on an analysis of the dynamical relationship between trust and reputation. For a single good with repeated transactions, an iterative mathematical model of trust and reputation is established with a closed-loop feedback mechanism. Numerical experiments on repeated transactions recorded over a period of 24 months are performed. The experimental results show that the proposed method can guide both theoretical research into trust and reputation and the practical design of reputation systems in B2C e-commerce.
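The Beta-distribution aggregation step can be sketched as below, assuming the common expected-value formulation trust = (r + 1) / (r + s + 2) for r positive and s negative ratings; the paper's exact local temporal trust score may differ, and the rating series here is hypothetical.

```python
def beta_trust(positive, negative):
    # expected value of Beta(positive + 1, negative + 1):
    # the common Beta-reputation trust score
    return (positive + 1) / (positive + negative + 2)

# aggregate a time series of binary rating outcomes (1 = positive)
# into a running trust score for a seller
ratings = [1, 1, 0, 1, 1, 1, 0, 1]   # hypothetical monthly outcomes
r = s = 0
history = []
for outcome in ratings:
    r += outcome
    s += 1 - outcome
    history.append(beta_trust(r, s))
```

With no ratings the score starts at the uninformative 0.5, which mitigates the "all good reputation" effect of simply averaging positive feedback.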
Abstract:
Trust is one of the most important factors influencing the successful application of networked service environments such as e-commerce, wireless sensor networks, and online social networks. Computational models of trust and reputation have received special attention in both the computing and service-science communities in recent years. In this paper, a dynamical computation model of reputation for B2C e-commerce is proposed. First, concepts associated with trust and reputation are introduced, and a mathematical formula for trust in B2C e-commerce is given. A dynamical computation model of reputation is then proposed, based on the concept of trust and the relationship between trust and reputation; within this model, typical processes by which reputation varies in B2C e-commerce are discussed. Furthermore, iterative trust and reputation computation models are formulated via a set of difference equations based on the closed-loop feedback mechanism. Finally, a group of numerical simulation experiments illustrates the proposed model. Experimental results show that the proposed model is effective in simulating the dynamical processes of trust and reputation for B2C e-commerce.
Abstract:
Retrograde transport of NF-κB from the synapse to the nucleus in neurons is mediated by the dynein/dynactin motor complex and can be triggered by synaptic activation. The calibre of axons is highly variable, ranging down to 100 nm, which complicates the investigation of transport processes in the neurites of living neurons using conventional light microscopy. In this study we quantified for the first time the transport of the NF-κB subunit p65, using high-density single-particle tracking in combination with photoactivatable fluorescent proteins in living mouse hippocampal neurons. We detected an increase of the mean diffusion coefficient (Dmean) in neurites from 0.12 ± 0.05 µm²/s to 0.61 ± 0.03 µm²/s after stimulation with glutamate. We further observed that the relative amount of retrogradely transported p65 molecules increased after stimulation. Glutamate treatment resulted in an increase of the mean retrograde velocity from 10.9 ± 1.9 to 15 ± 4.9 µm/s, whereas a velocity increase from 9 ± 1.3 to 14 ± 3 µm/s was observed for anterogradely transported p65. This study demonstrates for the first time that glutamate stimulation increases the mobility of single NF-κB p65 molecules in the neurites of living hippocampal neurons.
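A diffusion coefficient like the Dmean reported above is typically recovered from single-particle tracks via the mean squared displacement, with MSD(t) = 4Dt for two-dimensional Brownian motion. Below is a minimal sketch on a simulated 2-D random walk; the timestep and trajectory length are hypothetical, and only the diffusion coefficient echoes the paper's pre-stimulation value.

```python
import random

random.seed(2)
D_true = 0.12        # um^2/s, cf. the pre-stimulation Dmean above
dt = 0.01            # s per frame (hypothetical)
n_steps = 10000
sigma = (2 * D_true * dt) ** 0.5   # per-axis step standard deviation

x = y = 0.0
traj = [(x, y)]
for _ in range(n_steps):
    x += random.gauss(0.0, sigma)
    y += random.gauss(0.0, sigma)
    traj.append((x, y))

# mean squared displacement at a lag of one frame
sq = [(traj[i + 1][0] - traj[i][0]) ** 2 + (traj[i + 1][1] - traj[i][1]) ** 2
      for i in range(n_steps)]
msd = sum(sq) / n_steps
D_est = msd / (4 * dt)   # invert MSD = 4*D*dt for 2-D diffusion
```

Real tracking analyses fit MSD over several lags and must separate directed (motor-driven) transport, which grows as t², from the diffusive component.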
Abstract:
Field observations of new particle formation and the subsequent particle growth are typically only possible at a fixed measurement location and hence do not follow the temporal evolution of an air parcel in a Lagrangian sense. Standard analysis for determining formation and growth rates requires that the time-dependent formation and growth rates of the particles be spatially invariant; because of air-parcel advection, the observed temporal evolution of the particle size distribution at a fixed measurement location may not represent the true evolution if there are spatial variations in the formation and growth rates. Here we present a zero-dimensional aerosol box model coupled with one-dimensional atmospheric flow to describe the impact of advection on the evolution of simulated new-particle-formation events. Wind speed, particle formation rates and growth rates are input parameters that can vary as a function of time and location, with wind speed connecting location to time. The output simulates measurements at a fixed location; formation and growth rates of the particle mode can then be calculated from the simulated observations at a stationary point for different scenarios and compared with the ‘true’ input parameters. Hence, we can investigate how spatial variations in the formation and growth rates of new particles would appear in observations of particle number size distributions at a fixed measurement site. We show that the particle size distribution and growth rate observed at a fixed location depend on the formation and growth parameters upwind, even if local conditions do not vary. We also show that different input parameters may result in very similar simulated measurements. Erroneous interpretation of observations in terms of particle formation and growth rates, and of the time span and areal extent of new particle formation, is possible if these spatial effects are not accounted for.
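The advection effect described above can be illustrated numerically: with a constant wind speed, a particle arriving at the fixed site carries the growth history of every location along its upwind trajectory, so a spatial change in growth rate appears as structure in the fixed-site record. All values below (wind speed, growth rates, distances, initial diameter) are hypothetical, and this is a one-particle caricature of the paper's box model.

```python
u = 5.0        # m/s, constant wind speed (hypothetical)
dt = 60.0      # s, integration time step
d = 1.5        # nm, diameter at formation, 36 km upwind of the site

def growth_rate(x):
    # nm/h; growth rate changes 20 km upwind of the site at x = 0
    return 3.0 if x < -20000.0 else 6.0

# integrate diameter growth along the parcel trajectory to the site
x = -36000.0
while x < 0.0:
    d += growth_rate(x) * (dt / 3600.0)   # grow at the local rate
    x += u * dt                            # advect toward the site
```

The diameter observed at the site reflects both growth-rate regimes; an analyst assuming spatially uniform conditions would infer a single effective growth rate that matches neither.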
Abstract:
Ozone dynamics depend on meteorological characteristics such as wind, radiation, sunshine, air temperature and precipitation. The aim of this study was to determine ozone trajectories along the northern coast of Portugal during the summer months of 2005, when there was a spate of forest fires in the region, and to evaluate their impact on respiratory and cardiovascular health in the greater metropolitan area of Porto. We investigated the following diseases, as coded in the ninth revision of the International Classification of Diseases: hypertensive disease (codes 401-405); ischemic heart disease (codes 410-414); other cardiac diseases, including heart failure (codes 426-428); chronic obstructive pulmonary disease and allied conditions, including bronchitis and asthma (codes 490-496); and pneumoconiosis and other lung diseases due to external agents (codes 500-507). We evaluated ozone data from air-quality monitoring stations in the study area, together with data obtained through HYbrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model analysis of air-mass circulation and from synoptic-scale zonal winds in National Centers for Environmental Prediction data. High ozone levels in rural areas were attributed to the dispersion of pollutants induced by local circulation, as well as by mesoscale and synoptic-scale processes. The fires of 2005 increased pollutant levels through the direct emission of gases and particles into the atmosphere, especially when there were incoming frontal systems. For the meteorological case studies analyzed, peaks in ozone concentration were positively associated with higher rates of hospital admission for cardiovascular diseases, although there were no significant associations between ozone peaks and admissions for respiratory diseases.
MAGNETOHYDRODYNAMIC SIMULATIONS OF RECONNECTION AND PARTICLE ACCELERATION: THREE-DIMENSIONAL EFFECTS
Abstract:
Magnetic fields can change their topology through a process known as magnetic reconnection. This process is not only important for understanding the origin and evolution of large-scale magnetic fields, but is also seen as a potentially efficient particle accelerator, producing cosmic rays mainly through the first-order Fermi process. In this work we study the acceleration of test particles inserted in reconnection zones and show that the velocity component parallel to the magnetic field increases exponentially in magnetohydrodynamic (MHD) reconnection domains, even without kinetic effects such as pressure anisotropy, the Hall term, or anomalous effects. Acceleration of the perpendicular component is also always possible in such models. We find that within contracting magnetic islands or current sheets the particles accelerate predominantly through the first-order Fermi process, as previously described, while outside the current sheets and islands the particles experience mostly drift acceleration due to magnetic field gradients. In two-dimensional MHD models without a guide field, the parallel acceleration saturates at some level. This saturation is removed, however, in the presence of an out-of-plane guide field or in three-dimensional models. We therefore stress the importance of the guide field and of fully three-dimensional studies for a complete understanding of particle acceleration in astrophysical reconnection environments.
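Test-particle studies of this kind integrate charged-particle orbits through the MHD fields; a standard orbit integrator for that task is the Boris push, sketched below for a uniform magnetic field with hypothetical parameters (this is not the paper's code). With no electric field the Boris rotation conserves kinetic energy exactly, which makes it a useful sanity check before adding reconnection fields.

```python
qm = 1.0                # charge-to-mass ratio (code units, hypothetical)
dt = 0.01               # timestep
B = (0.0, 0.0, 1.0)     # uniform magnetic field along z

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def boris_step(v):
    # half-angle rotation vectors of the Boris scheme; with E = 0 the
    # update is a pure rotation, so the speed is conserved exactly
    t = tuple(0.5 * qm * dt * b for b in B)
    s = tuple(2.0 * c / (1.0 + sum(ci * ci for ci in t)) for c in t)
    vp = cross(v, t)
    v_prime = tuple(v[i] + vp[i] for i in range(3))
    vs = cross(v_prime, s)
    return tuple(v[i] + vs[i] for i in range(3))

v = (1.0, 0.0, 0.0)
for _ in range(1000):
    v = boris_step(v)
speed2 = sum(c * c for c in v)   # remains 1.0 up to roundoff
```

In a reconnection study the fields would instead be interpolated from the MHD snapshot at the particle position, and an electric-field half-kick would be added before and after the rotation.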
Abstract:
Our numerical simulations show that magnetic reconnection becomes fast in the presence of weak turbulence, in a way consistent with the Lazarian and Vishniac (1999) model of fast reconnection. We trace particles within our numerical simulations and show that they can be efficiently accelerated via first-order Fermi acceleration. We discuss the acceleration arising from reconnection as a possible origin of the anomalous cosmic rays measured by the Voyager spacecraft.
Abstract:
In this paper, we present multiband optical polarimetric observations of the very-high-energy blazar PKS 2155-304, made simultaneously with a HESS/Fermi high-energy campaign in 2008, when the source was found to be in a low state. The intense daily coverage of the data set allowed us to study the temporal evolution of the emission in detail, and we found that the particle-acceleration time-scales are decoupled from the changes in the polarimetric properties of the source. We present a model in which the optical polarimetric emission originates at the polarized mm-wave core and propose an explanation for the lack of correlation between the photometric and polarimetric fluxes. The optical emission is consistent with an inhomogeneous synchrotron source in which the large-scale field is locally organized by a shock in which particle acceleration takes place. Finally, we use these optical polarimetric observations of PKS 2155-304 in a low state to propose an origin for the quiescent gamma-ray flux of the object, in an attempt to provide clues to the source of its recently established persistent TeV emission.