899 results for Particle Trajectory Computation
Abstract:
Finding the degree-constrained minimum spanning tree (DCMST) of a graph is a widely studied NP-hard problem, with network design as one of its most important applications. Here we deal with a new variant of the DCMST problem, which consists of finding not only the degree- but also the role-constrained minimum spanning tree (DRCMST), i.e., we add constraints to restrict the role of each node in the tree to root, intermediate or leaf node. Furthermore, we do not limit the number of root nodes to one, thereby generally building a forest of DRCMSTs. The modeling of network design problems can benefit from the possibility of generating more than one tree and of determining the role of the nodes in the network. We propose a novel permutation-based representation to encode these forests, in which one permutation simultaneously encodes all the trees to be built. We simulate a wide variety of DRCMST problems and optimize them using eight different evolutionary computation algorithms whose individuals are encoded with the proposed representation: an estimation of distribution algorithm, a generational genetic algorithm, a steady-state genetic algorithm, covariance matrix adaptation evolution strategy, differential evolution, an elitist evolution strategy, a non-elitist evolution strategy and particle swarm optimization. The best results are obtained by the estimation of distribution algorithm and both genetic algorithms, with the genetic algorithms being significantly faster.
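The abstract does not spell out how a single permutation is decoded into a forest of role-constrained trees; the sketch below shows one plausible greedy decoding under assumed node roles and a random cost matrix. The node names, roles and attachment rule are illustrative assumptions, not the authors' exact operator.

```python
import numpy as np

# Minimal sketch: decode a permutation of non-root nodes into a forest in
# which roots and intermediate nodes may have children, leaves may not.
# Roles, costs and the greedy attachment rule are illustrative assumptions,
# not the exact decoder used in the paper. A real decoder would additionally
# cap the number of children per node to enforce the degree constraints.

rng = np.random.default_rng(0)
n = 8
cost = rng.uniform(1.0, 10.0, size=(n, n))
cost = (cost + cost.T) / 2.0                          # symmetric edge costs

roles = {0: "root", 1: "root",                        # two roots -> a forest
         2: "intermediate", 3: "intermediate", 4: "intermediate",
         5: "leaf", 6: "leaf", 7: "leaf"}

def decode(perm):
    """Attach each non-root node (in permutation order) to the cheapest
    already-placed node that is allowed to have children."""
    placed = [v for v, r in roles.items() if r == "root"]
    edges = []
    for v in perm:
        candidates = [u for u in placed if roles[u] != "leaf"]
        parent = min(candidates, key=lambda u: cost[u, v])
        edges.append((parent, v))
        placed.append(v)
    return edges

perm = [4, 2, 6, 3, 5, 7]                             # one individual: the non-root nodes
forest = decode(perm)
print(forest, "total cost:", sum(cost[u, v] for u, v in forest))
```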
Abstract:
A simple evolutionary process can discover sophisticated methods for emergent information processing in decentralized spatially extended systems. The mechanisms underlying the resulting emergent computation are explicated by a technique for analyzing particle-based logic embedded in pattern-forming systems. Understanding how globally coordinated computation can emerge in evolution is relevant both for the scientific understanding of natural information processing and for engineering new forms of parallel computing systems.
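For readers unfamiliar with the substrate of such studies, the sketch below steps a one-dimensional binary cellular automaton under a fixed local rule. The GA-evolved rules analysed in this line of work typically have a larger neighbourhood radius, so this is only an illustration of the kind of spatially extended, locally updated system involved.

```python
import numpy as np

# Minimal 1D binary cellular automaton with periodic boundaries.
# The radius-1 rule here (Wolfram rule 110) is only a stand-in: evolved rules
# in emergent-computation studies are usually radius 3, but the update
# mechanics are the same.

def step(state, rule_number=110):
    left = np.roll(state, 1)
    right = np.roll(state, -1)
    idx = 4 * left + 2 * state + right        # neighbourhood as a 3-bit index
    rule = np.array([(rule_number >> i) & 1 for i in range(8)], dtype=np.uint8)
    return rule[idx]

rng = np.random.default_rng(1)
state = rng.integers(0, 2, size=64, dtype=np.uint8)
for _ in range(32):                            # space-time diagram, one row per step
    print("".join("#" if c else "." for c in state))
    state = step(state)
```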
Abstract:
We present a new model for the continuous measurement of a coupled quantum dot charge qubit. We model the effects of a realistic measurement, namely adding noise to, and filtering, the current through the detector. This is achieved by embedding the detector in an equivalent circuit for measurement. Our aim is to describe the evolution of the qubit state conditioned on the macroscopic output of the external circuit. We achieve this by generalizing a recently developed quantum trajectory theory for realistic photodetectors [P. Warszawski, H. M. Wiseman, and H. Mabuchi, Phys. Rev. A 65, 023802 (2002)] to treat solid-state detectors. This yields stochastic equations whose (numerical) solutions are the realistic quantum trajectories of the conditioned qubit state. We derive our general theory in the context of a low transparency quantum point contact. Areas of application for our theory and its relation to previous work are discussed.
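The realistic conditional (trajectory) equations derived in the paper depend on the detector circuit and are not reproduced here. As a point of reference, the sketch below integrates only the standard unconditional Lindblad master equation for a double-dot charge qubit with measurement-induced dephasing, with illustrative parameter values.

```python
import numpy as np

# Unconditional (ensemble-averaged) master equation for a double-dot charge
# qubit dephased by a measurement device:
#   d rho/dt = -i[H, rho] + Gd * (Z rho Z - rho),   H = eps/2 Z + delta/2 X.
# This is the standard reference dynamics, not the conditional quantum
# trajectories of the paper; parameter values are illustrative (hbar = 1).

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

eps, delta, gamma_d = 0.0, 1.0, 0.2          # bias, tunnelling, dephasing rate
H = 0.5 * eps * sz + 0.5 * delta * sx

rho = np.array([[1, 0], [0, 0]], dtype=complex)   # qubit starts in |0><0|
dt, steps = 0.01, 1000
for k in range(steps):
    drho = -1j * (H @ rho - rho @ H) + gamma_d * (sz @ rho @ sz - rho)
    rho = rho + dt * drho
    if k % 200 == 0:
        print(f"t={k*dt:5.2f}  P(0)={rho[0, 0].real:.3f}")
```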
Abstract:
We investigate the gas-particle dynamics of a device designed for biological pre-clinical experiments. The device uses transonic/supersonic gas flow to accelerate microparticles so that they penetrate the outer skin layers. By coupling a shock tube to a correctly expanded nozzle, a quasi-one-dimensional, quasi-steady flow (QSF) is produced to uniformly accelerate the microparticles. The system utilises a microparticle cassette (a diaphragm-sealed container) that incorporates a jet mixing mechanism to stir the particles prior to diaphragm rupture. Pressure measurements reveal that a QSF exit period - suitable for uniformly accelerating microparticles - exists between 155 and 220 μs after diaphragm rupture. Immediately preceding the QSF period, a starting-process secondary shock was shown to form, with its (x,t) trajectory comparing well to theoretical estimates. To characterise the microparticle flow, particle image velocimetry experiments were conducted at the nozzle exit using particle payloads of varying diameter (2.7-48 μm), density (600-16,800 kg/m³) and mass (0.25-10 mg). The resultant microparticle velocities were temporally uniform. The experiments also show that the starting process does not significantly influence the microparticle nozzle exit velocities. The velocity distribution across the nozzle exit was also uniform for the majority of microparticle types tested. For payload masses typically used in pre-clinical drug and vaccine applications (
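As an order-of-magnitude illustration of how a microparticle is accelerated by the quasi-steady nozzle flow, the sketch below integrates a simple drag-only equation of motion with the Schiller-Naumann drag coefficient. The gas conditions and particle properties are assumed values, not the experimental ones.

```python
import numpy as np

# Drag-only acceleration of a single microparticle in a uniform quasi-steady
# gas stream: m dv/dt = 0.5 * rho_g * Cd * A * (u - v) * |u - v|.
# Gas state and particle properties below are assumed illustrative values,
# not the shock-tube conditions of the experiments.

rho_g, mu_g, u_g = 1.0, 1.8e-5, 500.0        # gas density [kg/m^3], viscosity [Pa s], velocity [m/s]
d_p, rho_p = 10e-6, 16800.0                  # particle diameter [m] and density [kg/m^3] (gold-like)

m = rho_p * np.pi / 6.0 * d_p**3             # particle mass
A = np.pi / 4.0 * d_p**2                     # frontal area

def drag_coefficient(re):
    # Schiller-Naumann correlation for Re < ~1000, constant above.
    return 24.0 / re * (1.0 + 0.15 * re**0.687) if re < 1000.0 else 0.44

v, t, dt = 0.0, 0.0, 1e-7
while t < 200e-6:                            # ~200 microseconds of quasi-steady flow
    rel = u_g - v
    re = max(rho_g * d_p * abs(rel) / mu_g, 1e-12)
    f = 0.5 * rho_g * drag_coefficient(re) * A * rel * abs(rel)
    v += dt * f / m
    t += dt
print(f"particle velocity after {t*1e6:.0f} us: {v:.1f} m/s")
```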
Abstract:
The pyrolysis of a freely moving cellulosic particle inside a 41.7 mg/s continuously fed fluid bed reactor subjected to convective heat transfer is modelled. The Lagrangian approach is adopted for particle tracking inside the reactor, while the flow of the inert gas is treated with the standard Eulerian method for gases. The model incorporates the thermal degradation of cellulose to char with simultaneous evolution of gases and vapours from the discrete cellulosic particles. The reaction kinetics is represented according to the Broido–Shafizadeh scheme. The convective heat transfer to the surface of the particle is solved by two means, namely the Ranz–Marshall correlation and the limit case of infinitely fast external heat transfer rates. The results from both approaches are compared and discussed. The effect of the different heat transfer rates on the discrete phase trajectory is also considered.
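The Ranz–Marshall correlation named in the abstract gives the convective heat-transfer coefficient at the particle surface from the particle Reynolds number and gas Prandtl number. The sketch below evaluates it for assumed illustrative gas and particle values, not those of the paper.

```python
import math

# Ranz-Marshall correlation for convective heat transfer to a sphere:
#   Nu = 2 + 0.6 * Re^(1/2) * Pr^(1/3),   h = Nu * k_g / d_p.
# Gas properties and slip velocity are assumed illustrative values
# (roughly nitrogen at ~800 K), not taken from the paper.

rho_g, mu_g, k_g, cp_g = 0.43, 3.5e-5, 0.055, 1.1e3   # density, viscosity, conductivity, heat capacity
d_p = 0.5e-3                                          # particle diameter [m]
u_rel = 0.5                                           # gas-particle slip velocity [m/s]

re = rho_g * u_rel * d_p / mu_g
pr = cp_g * mu_g / k_g
nu = 2.0 + 0.6 * math.sqrt(re) * pr ** (1.0 / 3.0)
h = nu * k_g / d_p                                    # [W/(m^2 K)]
print(f"Re={re:.2f}  Pr={pr:.2f}  Nu={nu:.2f}  h={h:.0f} W/(m^2 K)")
```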
Abstract:
The fluid–particle interaction inside a 150 g/h fluidised bed reactor is modelled. The biomass particle is injected into the fluidised bed and the momentum transport from the fluidising gas and fluidised sand is modelled. The Eulerian approach is used to model the bubbling behaviour of the sand, which is treated as a continuum. The particle motion inside the reactor is computed using drag laws, dependent on the local volume fraction of each phase, according to the literature. FLUENT 6.2 has been used as the modelling framework of the simulations with a completely revised drag model, in the form of user defined function (UDF), to calculate the forces exerted on the particle as well as its velocity components. 2-D and 3-D simulations are tested and compared. The study is the first part of a complete pyrolysis model in fluidised bed reactors.
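One standard volume-fraction-dependent drag closure from the literature is the Wen–Yu model; the sketch below computes the corresponding momentum-exchange coefficient for assumed gas and sand properties. It stands in for, and is not necessarily identical to, the revised drag UDF used in the paper.

```python
import math

# Wen-Yu gas-solid drag closure, one common volume-fraction-dependent model:
#   beta = 0.75 * Cd * eps_g * eps_s * rho_g * |u_g - u_s| / d_s * eps_g**(-2.65)
#   Cd   = 24/(eps_g*Re) * (1 + 0.15*(eps_g*Re)**0.687)  for eps_g*Re < 1000, else 0.44
# Property values are assumed (air + sand); the paper's revised UDF may use a
# different closure.

def wen_yu_beta(eps_g, rho_g, mu_g, d_s, slip):
    eps_s = 1.0 - eps_g
    re = rho_g * d_s * slip / mu_g
    x = eps_g * re
    cd = 24.0 / x * (1.0 + 0.15 * x ** 0.687) if x < 1000.0 else 0.44
    return 0.75 * cd * eps_g * eps_s * rho_g * slip / d_s * eps_g ** (-2.65)

beta = wen_yu_beta(eps_g=0.45, rho_g=1.2, mu_g=1.8e-5, d_s=440e-6, slip=0.3)
print(f"momentum exchange coefficient beta = {beta:.1f} kg/(m^3 s)")
```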
Abstract:
In recent years, there has been enormous growth in location-aware devices, such as GPS-embedded cell phones, mobile sensors and radio-frequency identification tags. The age of combining sensing, processing and communication in one device gives rise to a vast number of applications, leading to endless possibilities and the realization of mobile Wireless Sensor Network (mWSN) applications. As computing, sensing and communication become more ubiquitous, trajectory privacy becomes critical and an important factor for commercial success. While on the move, sensor nodes continuously transmit data streams of sensed values and spatiotemporal information, known as "trajectory information". If adversaries can intercept this information, they can monitor the trajectory path and capture the location of the source node. This research stems from the recognition that the wide applicability of mWSNs will remain elusive unless a trajectory privacy preservation mechanism is developed. It seeks to lay a firm foundation for trajectory privacy preservation in mWSNs against both external and internal trajectory privacy attacks. First, to prevent external attacks, we investigated a context-based trajectory privacy-aware routing protocol that counters eavesdropping. Traditional shortest-path-oriented routing algorithms give adversaries the possibility of locating the target node within a certain area. We designed a novel privacy-aware routing phase and utilized the trajectory dissimilarity between mobile nodes to mislead adversaries about the location where a message started its journey. Second, to detect internal attacks, we developed a software-based attestation solution for detecting compromised nodes. We created a dynamic attestation node chain among neighboring nodes to examine the memory checksum of suspicious nodes, improving the computation time for memory traversal compared to previous work. Finally, we revisited the trust issue in the design of trajectory privacy preservation mechanisms, using Bayesian game theory to model and analyze the behavior of cooperative, selfish and malicious nodes in trajectory privacy preservation activities.
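The abstract does not define the trajectory-dissimilarity measure used in the privacy-aware routing phase; the sketch below uses a simple mean pointwise Euclidean distance between two equally sampled trajectories as a hypothetical stand-in for whatever measure is actually employed.

```python
import math

# Hypothetical trajectory-dissimilarity measure: mean pointwise Euclidean
# distance between two trajectories sampled at the same time instants.
# The work's actual measure is not specified in the abstract; this is only
# an illustrative stand-in.

def dissimilarity(traj_a, traj_b):
    assert len(traj_a) == len(traj_b), "trajectories must be equally sampled"
    dists = [math.dist(p, q) for p, q in zip(traj_a, traj_b)]
    return sum(dists) / len(dists)

# Two mobile nodes sampled at five time steps (x, y in metres).
node_a = [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2)]
node_b = [(0, 3), (1, 4), (2, 4), (3, 5), (4, 6)]
print(f"dissimilarity = {dissimilarity(node_a, node_b):.2f} m")
```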
Abstract:
The scope of this paper is to reflect on the theoretical construction involved in the constitution of the sociology of health, still called medical sociology in some countries. Two main ideas constitute the basis for this: interdisciplinarity and the degree of articulation between the fields of medicine and sociology. We sought to establish a dialogue with some dimensions - macro/micro, structure/action - that constitute the basis for understanding medicine/health in relation to the social/sociological dimension. The main aspects of these dimensions are presented first. Straus' two medical sociologies and the theory/application impasses are then addressed, as well as the dilemmas of the sociology of medicine in the 1960s and 1970s. The theoretical production before 1970 is placed as a counterpoint to these analyses. Lastly, the sociology of health is situated in the general context of sociology, which underwent a process of fragmentation from 1970 onward, with effects in all subfields of the social sciences. This process involves rethinking theoretical issues across a broadened spectrum of possibilities. The 1980s are highlighted as the period when theoretical issues in the sociology of health were reinvigorated and the issue of interdisciplinarity was once again addressed.
Abstract:
Current data indicate that the size of high-density lipoprotein (HDL) may be considered an important marker for cardiovascular disease risk. We established reference values of mean HDL size and volume in an asymptomatic, representative Brazilian population sample (n=590) and their associations with metabolic parameters by gender. Size and volume were determined in HDL isolated from plasma by polyethyleneglycol precipitation of apoB-containing lipoproteins and measured using the dynamic light scattering (DLS) technique. Although the gender and age distributions agreed with other studies, the mean HDL size reference value was slightly lower than in some other populations. Both HDL size and volume were influenced by gender and varied according to age. HDL size was associated with age and HDL-C in the total population, inversely with non-white ethnicity and CETP in females, and with HDL-C and PLTP mass in males. HDL volume, on the other hand, was determined only by HDL-C (total population and both genders) and by PLTP mass (males). The reference values for mean HDL size and volume using the DLS technique were thus established in an asymptomatic, representative Brazilian population sample, together with their related metabolic factors. HDL-C was a major determinant of HDL size and volume, which were differently modulated in females and males.
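For orientation, mean particle volume follows from mean hydrodynamic diameter under the usual spherical-particle assumption of DLS, V = (π/6) d³; the diameter in the sketch below is an illustrative value, not the reference value established in the study.

```python
import math

# Volume of a sphere from its hydrodynamic diameter, V = (pi/6) * d^3.
# DLS sizing conventionally assumes spherical particles; the diameter used
# here is illustrative, not the population reference value from the study.

d_nm = 8.5                                   # assumed mean HDL diameter [nm]
volume_nm3 = math.pi / 6.0 * d_nm ** 3
print(f"mean HDL volume ~ {volume_nm3:.1f} nm^3")
```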
Abstract:
Evolving interfaces were initially applied to scientific problems in fluid dynamics. With the more robust modeling provided by the Level Set method, their range of applicability was extended. In geometric modeling specifically, work relating Level Set methods to three-dimensional surface reconstruction has centered on reconstruction from point clouds scattered in space; the approach based on parallel planar slices transversal to the object to be reconstructed is still incipient. This work therefore analyses the feasibility of the Level Set method for three-dimensional reconstruction, offering a methodology that integrates ideas already shown to be effective in the literature with proposals for handling limitations of the method that have not yet been satisfactorily treated, in particular the excessive smoothing of fine contour features during Level Set evolution. As a solution, the Particle Level Set variant is suggested, given its proven ability to preserve the mass of dynamic fronts. Finally, synthetic and real data sets are used to qualitatively evaluate the presented three-dimensional surface reconstruction methodology.
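To make the evolution concrete, the sketch below advances a two-dimensional level set function outward at constant speed with a first-order Godunov upwind scheme. It illustrates only the basic Level Set machinery, not the slice-based reconstruction methodology or the Particle Level Set correction.

```python
import numpy as np

# First-order upwind (Godunov) evolution of a 2D level set function under
# motion in the normal direction with constant speed F > 0:
#   phi_t + F * |grad phi| = 0.
# Illustrative only; grid size, speed and time step are assumed values.

n, h, F, dt = 64, 1.0, 1.0, 0.4              # grid size, spacing, speed, CFL-safe step
y, x = np.mgrid[0:n, 0:n]
phi = np.sqrt((x - n / 2) ** 2 + (y - n / 2) ** 2) - 10.0   # signed distance to a circle

def step(phi):
    dxm = (phi - np.roll(phi, 1, axis=1)) / h    # backward differences
    dxp = (np.roll(phi, -1, axis=1) - phi) / h   # forward differences
    dym = (phi - np.roll(phi, 1, axis=0)) / h
    dyp = (np.roll(phi, -1, axis=0) - phi) / h
    grad = np.sqrt(np.maximum(dxm, 0) ** 2 + np.minimum(dxp, 0) ** 2 +
                   np.maximum(dym, 0) ** 2 + np.minimum(dyp, 0) ** 2)
    return phi - dt * F * grad

for _ in range(25):
    phi = step(phi)
print("cells inside the interface (phi <= 0) after 25 steps:", int((phi <= 0).sum()))
```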
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
A combination of the trajectory sensitivity method and master-slave synchronization was proposed for parameter estimation of nonlinear systems. It was shown that master-slave coupling increases the robustness of the trajectory sensitivity algorithm with respect to the initial parameter guess. Since synchronization does not guarantee that the estimation process converges to the correct parameters, a conditional test that guarantees that the new combined methodology estimates the true parameter values was proposed. This conditional test was successfully applied to Lorenz's and Chua's systems, and the proposed parameter estimation algorithm was shown to be very robust with respect to initial parameter guesses and measurement noise in these examples. Copyright (C) 2009 Elmer P. T. Cari et al.
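The sketch below shows only the master-slave coupling ingredient: two Lorenz systems with a mismatched parameter, the slave diffusively coupled to the master's x component, so that the remaining synchronization error reflects the parameter mismatch. The full trajectory sensitivity update that adjusts the parameters is not reproduced here, and the coupling gain is an assumed value.

```python
import numpy as np

# Master-slave coupling of two Lorenz systems with a mismatched "rho" in the
# slave. The coupling term k*(x_m - x_s) drives the slave toward the master;
# in the combined method the residual error is what the trajectory
# sensitivity step uses to correct the parameter estimate (not shown here).

sigma, beta, rho_true, rho_guess, k = 10.0, 8.0 / 3.0, 28.0, 24.0, 5.0

def lorenz(state, rho):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

master = np.array([1.0, 1.0, 1.0])
slave = np.array([5.0, -5.0, 20.0])
dt = 0.002
for i in range(20000):
    dm = lorenz(master, rho_true)
    ds = lorenz(slave, rho_guess)
    ds[0] += k * (master[0] - slave[0])      # diffusive coupling on x only
    master += dt * dm
    slave += dt * ds
    if i % 5000 == 0:
        print(f"t={i*dt:5.1f}  sync error |x_m - x_s| = {abs(master[0]-slave[0]):.3f}")
```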