992 results for Particle Classification
Abstract:
Retrograde transport of NF-κB from the synapse to the nucleus in neurons is mediated by the dynein/dynactin motor complex and can be triggered by synaptic activation. The calibre of axons is highly variable, ranging down to 100 nm, which complicates the investigation of transport processes in neurites of living neurons using conventional light microscopy. In this study we quantified for the first time the transport of the NF-κB subunit p65 using high-density single-particle tracking in combination with photoactivatable fluorescent proteins in living mouse hippocampal neurons. We detected an increase of the mean diffusion coefficient (Dmean) in neurites from 0.12 ± 0.05 µm²/s to 0.61 ± 0.03 µm²/s after stimulation with glutamate. We further observed that the relative amount of retrogradely transported p65 molecules increased after stimulation. Glutamate treatment resulted in an increase of the mean retrograde velocity from 10.9 ± 1.9 to 15 ± 4.9 µm/s, whereas a velocity increase from 9 ± 1.3 to 14 ± 3 µm/s was observed for anterogradely transported p65. This study demonstrates for the first time that glutamate stimulation leads to an increased mobility of single NF-κB p65 molecules in neurites of living hippocampal neurons.
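As an illustration of the quantity reported above, the sketch below (my construction, not the authors' analysis pipeline) shows the standard way a mean diffusion coefficient is estimated from single-particle tracks: for two-dimensional Brownian motion the mean squared displacement obeys MSD(τ) = 4Dτ, so D follows from a linear fit over short lag times. The trajectory, frame time, and lag range are hypothetical.

```python
# Minimal sketch: estimate D from a 2-D track via the short-lag MSD fit.
import numpy as np

def diffusion_coefficient(track, dt, max_lag=4):
    """track: (N, 2) array of x/y positions in um; dt: frame time in s."""
    lags = np.arange(1, max_lag + 1)
    msd = np.array([
        np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
        for lag in lags
    ])
    # Least-squares fit of MSD = 4*D*tau through the origin.
    slope = np.sum(msd * lags * dt) / np.sum((lags * dt) ** 2)
    return slope / 4.0

# Hypothetical example: simulated track with D = 0.5 um^2/s, dt = 50 ms.
rng = np.random.default_rng(0)
D_true, dt = 0.5, 0.05
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(500, 2))
track = np.cumsum(steps, axis=0)
print(f"estimated D = {diffusion_coefficient(track, dt):.2f} um^2/s")
```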
Abstract:
Field observations of new particle formation and the subsequent particle growth are typically only possible at a fixed measurement location, and hence do not follow the temporal evolution of an air parcel in a Lagrangian sense. Standard analysis for determining formation and growth rates requires that the time-dependent formation rate and growth rate of the particles are spatially invariant; air parcel advection means that the observed temporal evolution of the particle size distribution at a fixed measurement location may not represent the true evolution if there are spatial variations in the formation and growth rates. Here we present a zero-dimensional aerosol box model coupled with one-dimensional atmospheric flow to describe the impact of advection on the evolution of simulated new particle formation events. Wind speed, particle formation rates and growth rates are input parameters that can vary as a function of time and location, using wind speed to connect location to time. The output simulates measurements at a fixed location; formation and growth rates of the particle mode can then be calculated from the simulated observations at a stationary point for different scenarios and be compared with the 'true' input parameters. Hence, we can investigate how spatial variations in the formation and growth rates of new particles would appear in observations of particle number size distributions at a fixed measurement site. We show that the particle size distribution and growth rate at a fixed location depend on the formation and growth parameters upwind, even if local conditions do not vary. We also show that different sets of input parameters may result in very similar simulated measurements. Erroneous interpretation of observations in terms of particle formation and growth rates, and the time span and areal extent of new particle formation, is possible if the spatial effects are not accounted for.
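A minimal sketch of the setup described above, under assumptions of my own (constant wind speed, a single monodisperse particle mode, and step-function formation confined to an upwind region, none of which are taken from the paper): parcels are advected past a fixed site, each carries a zero-dimensional box model with location-dependent formation and growth rates, and sampling the parcel overhead reproduces what a stationary instrument would see. It illustrates how a site with no local formation can still record a growing mode formed upwind.

```python
# Minimal sketch: Lagrangian parcels sampled at a fixed site at x = 0.
import numpy as np

u = 5.0                                     # wind speed, m/s (assumed constant)
J  = lambda x: 0.1 if x < -20e3 else 0.0    # formation only >20 km upwind, cm^-3 s^-1
GR = lambda x: 3.0 / 3600.0                 # growth rate, nm/s (3 nm/h everywhere)

dt, t_end = 60.0, 6 * 3600.0                # 1-min steps, 6-h simulation
times = np.arange(0.0, t_end, dt)

records = []
for t_arrive in times:
    # The parcel arriving at the site at t_arrive started upwind at -u*t_arrive.
    n, d = 0.0, 1.5                         # number conc. (cm^-3), mode diameter (nm)
    for t in np.arange(0.0, t_arrive, dt):
        x = -u * (t_arrive - t)             # parcel position at time t
        n += J(x) * dt                      # new particle formation
        d += GR(x) * dt                     # condensational growth (one mode, for brevity)
    records.append((t_arrive / 3600.0, n, d))

for t_h, n, d in records[::60]:             # hourly "measurements" at the site
    print(f"t = {t_h:4.1f} h   N = {n:7.1f} cm^-3   d = {d:5.1f} nm")
```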
Abstract:
Ozone dynamics depend on meteorological characteristics such as wind, radiation, sunshine, air temperature and precipitation. The aim of this study was to determine ozone trajectories along the northern coast of Portugal during the summer months of 2005, when there was a spate of forest fires in the region, and to evaluate the impact of ozone levels on respiratory and cardiovascular health in the greater metropolitan area of Porto. We investigated the following diseases, as coded in the ninth revision of the International Classification of Diseases: hypertensive disease (codes 401-405); ischemic heart disease (codes 410-414); other cardiac diseases, including heart failure (codes 426-428); chronic obstructive pulmonary disease and allied conditions, including bronchitis and asthma (codes 490-496); and pneumoconiosis and other lung diseases due to external agents (codes 500-507). We evaluated ozone data from air quality monitoring stations in the study area, together with data collected through HYbrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model analysis of air mass circulation and synoptic-scale zonal wind from National Centers for Environmental Prediction data. High ozone levels in rural areas were attributed to the dispersion of pollutants induced by local circulation, as well as by mesoscale and synoptic-scale processes. The fires of 2005 increased the levels of pollutants resulting from the direct emission of gases and particles into the atmosphere, especially when there were incoming frontal systems. For the meteorological case studies analyzed, peaks in ozone concentration were positively associated with higher rates of hospital admissions for cardiovascular diseases, although there were no significant associations between ozone peaks and admissions for respiratory diseases.
MAGNETOHYDRODYNAMIC SIMULATIONS OF RECONNECTION AND PARTICLE ACCELERATION: THREE-DIMENSIONAL EFFECTS
Abstract:
Magnetic fields can change their topology through a process known as magnetic reconnection. This process is not only important for understanding the origin and evolution of the large-scale magnetic field, but is also seen as a possibly efficient particle accelerator producing cosmic rays, mainly through the first-order Fermi process. In this work we study the properties of particle acceleration in reconnection zones and show that the velocity component parallel to the magnetic field of test particles inserted in magnetohydrodynamic (MHD) domains of reconnection, without including kinetic effects such as pressure anisotropy, the Hall term, or anomalous effects, increases exponentially. Acceleration of the perpendicular component is also always possible in such models. We find that within contracting magnetic islands or current sheets the particles accelerate predominantly through the first-order Fermi process, as previously described, while outside the current sheets and islands the particles experience mostly drift acceleration due to magnetic field gradients. Considering two-dimensional MHD models without a guide field, we find that the parallel acceleration saturates at some level. This saturation effect is, however, removed in the presence of an out-of-plane guide field or in three-dimensional models. Therefore, we stress the importance of the guide field and of fully three-dimensional studies for a complete understanding of particle acceleration in astrophysical reconnection environments.
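A minimal sketch of the test-particle approach this kind of study relies on (not the paper's code): charged particles are integrated through a prescribed field snapshot with the Boris pusher, the standard scheme for tracing particles in given electromagnetic fields. The Harris-sheet-like field, guide-field strength, and reconnection electric field below are toy stand-ins, not the simulation's values.

```python
# Minimal sketch: Boris-pusher test-particle integration in a toy field.
import numpy as np

def fields(x):
    """Toy Harris-sheet-like field with a guide component; normalized units."""
    B = np.array([np.tanh(x[1]), 0.0, 0.1])   # reversing Bx plus weak guide Bz
    E = np.array([0.0, 0.0, 0.01])            # out-of-plane reconnection E-field
    return E, B

def boris_step(x, v, dt, qm=1.0):
    E, B = fields(x)
    v_minus = v + 0.5 * qm * E * dt           # half electric kick
    t = 0.5 * qm * B * dt                     # magnetic rotation vector
    s = 2.0 * t / (1.0 + t @ t)
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)   # full magnetic rotation
    v_new = v_plus + 0.5 * qm * E * dt        # second half electric kick
    return x + v_new * dt, v_new

x, v = np.array([0.0, 0.5, 0.0]), np.array([0.1, 0.0, 0.0])
for _ in range(20000):
    x, v = boris_step(x, v, dt=0.01)
print("final kinetic energy:", 0.5 * v @ v)
```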
Abstract:
Our numerical simulations show that magnetic field reconnection becomes fast in the presence of weak turbulence, in a way consistent with the Lazarian and Vishniac (1999) model of fast reconnection. We trace particles within our numerical simulations and show that they can be efficiently accelerated via first-order Fermi acceleration. We discuss the acceleration arising from reconnection as a possible origin of the anomalous cosmic rays measured by the Voyager spacecraft.
Abstract:
In this paper, we present multiband optical polarimetric observations of the very-high-energy blazar PKS 2155-304, made simultaneously with a HESS/Fermi high-energy campaign in 2008, when the source was found to be in a low state. The intense daily coverage of the data set allowed us to study the temporal evolution of the emission in detail, and we found that the particle acceleration time-scales are decoupled from the changes in the polarimetric properties of the source. We present a model in which the optical polarimetric emission originates at the polarized mm-wave core and propose an explanation for the lack of correlation between the photometric and polarimetric fluxes. The optical emission is consistent with an inhomogeneous synchrotron source in which the large-scale field is locally organized by a shock in which particle acceleration takes place. Finally, we use these optical polarimetric observations of PKS 2155-304 at a low state to propose an origin for the quiescent gamma-ray flux of the object, in an attempt to provide clues to the source of its recently established persistent TeV emission.
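For reference, the reduction step underlying any optical polarimetric data set of this kind is the conversion of Stokes parameters to a linear polarization degree and angle; a minimal sketch with hypothetical values follows (this is not the campaign's pipeline).

```python
# Minimal sketch: linear polarization degree and angle from Stokes I, Q, U.
import numpy as np

def linear_polarization(I, Q, U):
    p = np.hypot(Q, U) / I                    # polarization degree
    chi = 0.5 * np.degrees(np.arctan2(U, Q))  # polarization angle, degrees
    return p, chi

# Hypothetical Stokes values for a weakly polarized source.
p, chi = linear_polarization(I=1.0, Q=0.03, U=-0.02)
print(f"P = {100 * p:.1f}%   angle = {chi:.1f} deg")
```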
Abstract:
The problem of cosmological particle creation for spatially flat, homogeneous and isotropic universes is discussed in the context of f(R) theories of gravity. In contrast to cosmological models based on general relativity, it is found that a conformally invariant metric does not forbid the creation of massless particles during the early stages (radiation era) of the universe.
Abstract:
Epidendrum L. is the largest genus of Orchidaceae in the Neotropical region; it shows an impressive morphological diversification, which imposes difficulties in the delimitation of both infrageneric and interspecific boundaries. In this study, we review infrageneric boundaries within the subgenus Amphiglottium and try to contribute to the understanding of morphological diversification and taxa delimitation within this group. We tested the monophyly of the subgenus Amphiglottium sect. Amphiglottium, expanding previous phylogenetic investigations, and reevaluated previously proposed infrageneric classifications. Sequence data from the trnL-trnF region were analyzed under both parsimony and maximum likelihood criteria. AFLP markers were also obtained and analyzed with phylogenetic and principal coordinate analyses. Additionally, we obtained chromosome numbers for representative species within the group. The results strengthen support for the monophyly of the subgenus Amphiglottium but do not support the current classification system proposed by previous authors. Only section Tuberculata comprises a well-supported monophyletic group, with sections Carinata and Integra not supported. Rather than morphology, it is biogeographical and ecological patterns that are reflected in the phylogenetic signal of this group. This study also confirms the large variability of chromosome numbers for the subgenus Amphiglottium (numbers ranging from 2n = 24 to 2n = 240), suggesting that polyploidy and hybridization are probably important mechanisms of speciation within the group.
Abstract:
The Tiete River and its tributary, the Pinheiros River, receive a highly complex load of organic and inorganic pollutants from sanitary sewage and industrial sources, as well as from agricultural and agroindustrial activities. The aim of the present study was to evaluate the embryotoxic and teratogenic effects of sediments from selected locations in the Tiete River Basin by means of the sediment contact embryo toxicity assay with Danio rerio, in order to provide a comprehensive and realistic insight into the bioavailable hazard potential of these sediment samples. Lethal and sub-lethal effects were recorded, and high embryo toxicity was found in the samples not only in the vicinity of the megacity Sao Paulo (Billings reservoir and Pinheiros River samples), but also downstream (in the reservoirs Barra Bonita, Promissao and Tres Irmaos). The results confirm that most of the toxicity is due to the discharges of the metropolitan area of Sao Paulo. However, they also indicate additional sources of pollutants along the river course, probably from industrial, agricultural and agroindustrial residues, which contribute to the degradation of each area. The sediment contact fish embryo test proved to be a powerful tool for detecting embryo toxicity in sediments, not only because it is a sensitive method, but also because it takes bioavailability into account. This test provides an ecologically highly realistic and relevant exposure scenario, and should therefore be included in ecotoxicological sediment quality assessments.
Abstract:
Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. As predictive performance evaluation is a multidimensional problem, single scalar summaries such as error rate, although quite convenient due to their simplicity, can seldom evaluate all the aspects that a complete and reliable evaluation must consider. Because of this, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing those aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to select a suitable graphical method for a given task, it is crucial to identify its strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods in the same framework, we hope this paper may shed some light on deciding which methods are more suitable to use in different situations.
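As a concrete example of the kind of graphical method such surveys cover, the sketch below builds an ROC curve directly from classifier scores (the labels and scores are hypothetical); the curve keeps the trade-off between true-positive and false-positive rates visible instead of collapsing it into a single scalar such as error rate.

```python
# Minimal sketch: ROC curve points from raw classifier scores.
import numpy as np

def roc_curve(y_true, scores):
    order = np.argsort(-np.asarray(scores))   # sweep thresholds high -> low
    y = np.asarray(y_true)[order]
    tpr = np.concatenate([[0.0], np.cumsum(y == 1) / np.sum(y == 1)])
    fpr = np.concatenate([[0.0], np.cumsum(y == 0) / np.sum(y == 0)])
    return fpr, tpr

# Hypothetical labels and scores for eight test instances.
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3]
fpr, tpr = roc_curve(y_true, scores)
auc = np.trapz(tpr, fpr)                      # area under the ROC curve
print("ROC points:", list(zip(fpr.round(2), tpr.round(2))))
print("AUC =", round(float(auc), 3))
```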
Abstract:
This work proposes and discusses an approach for inducing Bayesian classifiers aimed at balancing the tradeoff between the precise probability estimates produced by time-consuming unrestricted Bayesian networks and the computational efficiency of Naive Bayes (NB) classifiers. The proposed approach is based on the fundamental principles of heuristic search Bayesian network learning. The Markov blanket concept, as well as a proposed "approximate Markov blanket", are used to reduce the number of nodes that form the Bayesian network to be induced from data. Consequently, the usually high computational cost of heuristic search learning algorithms can be lessened, while Bayesian network structures better than NB can be achieved. The resulting algorithms, called DMBC (Dynamic Markov Blanket Classifier) and A-DMBC (Approximate DMBC), are empirically assessed in twelve domains that illustrate scenarios of particular interest. The obtained results are compared with NB and Tree Augmented Network (TAN) classifiers, and confirm that both proposed algorithms can provide good classification accuracies and better probability estimates than NB and TAN, while being more computationally efficient than the widely used K2 algorithm.
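A minimal sketch of the Markov blanket concept the approach builds on (the DAG below is hypothetical, and this is not the DMBC code): in a Bayesian network, the blanket of a node, i.e. its parents, its children, and its children's other parents, is the smallest node set that shields it from the rest of the network, which is why restricting attention to an (approximate) blanket of the class node can prune the structure search.

```python
# Minimal sketch: Markov blanket of a node in a DAG given as parent lists.
def markov_blanket(dag, node):
    """dag: dict mapping each node to the list of its parents."""
    parents = set(dag[node])
    children = {n for n, ps in dag.items() if node in ps}
    spouses = {p for c in children for p in dag[c]} - {node}
    return parents | children | spouses

dag = {            # hypothetical structure; edges point parent -> child
    "A": [], "B": [], "C": ["A", "B"], "D": ["C"], "E": ["D", "F"], "F": [],
}
print(markov_blanket(dag, "D"))   # {'C', 'E', 'F'}
```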
Abstract:
The substitution of missing values, also called imputation, is an important data preparation task for many domains. Ideally, the substitution of missing values should not insert biases into the dataset. This aspect has usually been assessed by measures of the prediction capability of imputation methods. Such measures assume the simulation of missing entries for some attributes whose values are actually known. These artificially missing values are imputed and then compared with the original values. Although this evaluation is useful, it does not allow the influence of imputed values on the ultimate modelling task (e.g. classification) to be inferred. We argue that imputation cannot be properly evaluated apart from the modelling task. Thus, alternative approaches are needed. This article elaborates on the influence of imputed values in classification. In particular, a practical procedure for estimating the inserted bias is described. As an additional contribution, we have used this procedure to empirically illustrate the performance of three imputation methods (majority, naive Bayes and Bayesian networks) on three datasets. Three classifiers (decision tree, naive Bayes and nearest neighbours) were used as modelling tools in our experiments. The achieved results illustrate a variety of situations that can take place in data preparation practice.
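A minimal sketch of such a task-based evaluation, under assumptions of my own (the article's exact protocol may differ): entries are removed at random from a complete dataset, imputed with a majority (most-frequent) strategy, and the effect is measured on downstream classification accuracy rather than on the reconstructed values alone. The dataset, missingness rate, and classifier are illustrative choices.

```python
# Minimal sketch: measure imputation bias through the classification task.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

X_miss = X.copy()
mask = rng.random(X.shape) < 0.2          # knock out ~20% of known entries
X_miss[mask] = np.nan

# Majority (most-frequent) imputation; crude on continuous data, but simple.
X_imp = SimpleImputer(strategy="most_frequent").fit_transform(X_miss)

clf = DecisionTreeClassifier(random_state=0)
acc_full = cross_val_score(clf, X, y, cv=5).mean()
acc_imp = cross_val_score(clf, X_imp, y, cv=5).mean()
print(f"accuracy on complete data: {acc_full:.3f}")
print(f"accuracy after imputation: {acc_imp:.3f}  (gap estimates the inserted bias)")
```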
Abstract:
Credit scoring modelling comprises one of the leading formal tools for supporting the granting of credit. Its core objective consists of the generation of a score by means of which potential clients can be listed in order of their probability of default. A critical factor is whether a credit scoring model is accurate enough to provide a correct classification of the client as a good or bad payer. In this context the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining the predicted values from models fitted to several replicated datasets and then to combine them into a single predictive classification in order to improve classification accuracy. In this paper we propose a new bagging-type variant procedure, which we call poly-bagging, consisting of combining predictors over a succession of resamplings. The study is motivated by credit scoring modelling. The proposed poly-bagging procedure was applied to several artificial datasets and to a real credit granting dataset, for up to three successions of resamplings. We observed better classification accuracy for the two-bagged and the three-bagged models in all considered setups. These results strongly indicate that the poly-bagging approach may improve modelling performance measures while keeping a flexible and straightforward bagging-type structure that is easy to implement.
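A minimal sketch of one plausible reading of bagging over a succession of resamplings (my construction; the authors' poly-bagging procedure may combine predictors differently): each bootstrap sample is itself bootstrapped, base models vote within a bag, and bags vote again at the top level. All datasets and model choices below are illustrative.

```python
# Minimal sketch: recursive "bag of bags" with majority voting.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def bagged_predict(X_tr, y_tr, X_te, levels, n_bags, rng):
    if levels == 0:                                    # base learner
        return DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr).predict(X_te)
    votes = []
    for _ in range(n_bags):
        idx = rng.integers(0, len(y_tr), len(y_tr))    # bootstrap resample
        votes.append(bagged_predict(X_tr[idx], y_tr[idx], X_te,
                                    levels - 1, n_bags, rng))
    return (np.mean(votes, axis=0) >= 0.5).astype(int) # majority vote

X, y = make_classification(n_samples=400, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rng = np.random.default_rng(0)
for levels in (1, 2, 3):                               # one-, two-, three-bagged
    pred = bagged_predict(X_tr, y_tr, X_te, levels, n_bags=9, rng=rng)
    print(f"{levels}-bagged accuracy: {np.mean(pred == y_te):.3f}")
```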
Abstract:
Knowing how much misalignment is tolerable for a particle accelerator is an important input for the design of these machines. In particle accelerators the beam must be guided and focused using bending magnets and magnetic lenses, respectively. The alignment of the lenses along a transport line aims to ensure that the beam passes through their optical axes, and represents a critical point in the assembly of the machine. There are more and more accelerators in the world, many of which are very small machines. Because the existing literature and programs mostly target large machines, in this work we describe a method suitable for small machines. This method consists of statistically determining the alignment tolerance for a set of lenses. Differently from the methods used in standard simulation codes for particle accelerators, the statistical method we propose makes it possible to evaluate particle losses as a function of the alignment accuracy of the optical elements in a transport line. Results for 100 keV electrons on the 3.5-m long conforming beam stage of the IFUSP Microtron are presented as an example of use.
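A minimal sketch of the statistical idea (the lattice, aperture, and beam parameters below are hypothetical, not the IFUSP Microtron values): random transverse lens misalignments are drawn, an ensemble of particles is tracked through drift and thin-lens maps in one transverse plane, and the loss fraction is recorded as a function of the alignment tolerance.

```python
# Minimal sketch: Monte Carlo alignment tolerance via thin-lens tracking.
import numpy as np

rng = np.random.default_rng(0)

def track_losses(sigma_align, n_lenses=10, f=0.5, drift=0.35,
                 aperture=5e-3, n_part=1000):
    x = rng.normal(0.0, 1e-3, n_part)             # initial positions, m
    xp = rng.normal(0.0, 1e-3, n_part)            # initial angles, rad
    alive = np.ones(n_part, dtype=bool)
    for _ in range(n_lenses):
        offset = rng.normal(0.0, sigma_align)     # this lens's misalignment
        x = x + drift * xp                        # drift space
        xp = xp - (x - offset) / f                # thin-lens kick about its own axis
        alive &= np.abs(x) < aperture             # beam-pipe aperture cut
    return 1.0 - alive.mean()

for tol_um in (50, 200, 500, 1000):
    losses = np.mean([track_losses(tol_um * 1e-6) for _ in range(100)])
    print(f"tolerance {tol_um:5d} um -> mean loss {100 * losses:.1f}%")
```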
Abstract:
Lattice-gas models with particle conservation and infinitely many absorbing states are studied on a one-dimensional lattice. As the particle density is increased, they exhibit a phase transition from an absorbing to an active phase. The models are solved exactly by means of the transfer matrix technique, from which the critical behavior is obtained. We find that the exponent related to the order parameter, the density of active sites, is 1 for all studied models except one, for which the exponent is 2.
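As a toy illustration of the transfer matrix technique invoked above (using the one-dimensional Ising chain, not the paper's lattice-gas models): the free energy per site of a one-dimensional model follows from the largest eigenvalue of a small transfer matrix, which is what makes exact solutions of such models possible.

```python
# Toy illustration: transfer matrix solution of the 1-D Ising chain.
import numpy as np

def free_energy_per_site(beta, J=1.0, h=0.0):
    # T[s, s'] = exp(beta * (J*s*s' + h*(s + s')/2)), with s, s' in {-1, +1}.
    s = np.array([-1.0, 1.0])
    T = np.exp(beta * (J * np.outer(s, s) + h * (s[:, None] + s[None, :]) / 2))
    lam = np.linalg.eigvalsh(T).max()     # largest transfer matrix eigenvalue
    return -np.log(lam) / beta            # exact result: -ln(2*cosh(beta*J))/beta at h=0

for beta in (0.5, 1.0, 2.0):
    print(f"beta = {beta}: f = {free_energy_per_site(beta):+.4f}")
```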