827 results for Multiple-scale processing


Relevance:

80.00%

Publisher:

Abstract:

We investigate the dynamics of localized solutions of the relativistic cold-fluid plasma model in the small but finite amplitude limit, for slightly overcritical plasma density. Adopting a multiple-scale analysis, we derive a perturbed nonlinear Schrödinger equation that describes the evolution of the envelope of a circularly polarized electromagnetic field. Retaining terms up to fifth order in the small perturbation parameter, we obtain a self-consistent framework for the description of the plasma response in the presence of a localized electromagnetic field. The formalism is applied to standing electromagnetic soliton interactions, and the results are validated by simulations of the full cold-fluid model. To lowest order, a cubic nonlinear Schrödinger equation with a focusing nonlinearity is recovered. Classical quasiparticle theory is used to obtain analytical estimates for the collision time and minimum distance of approach between solitons. For larger soliton amplitudes the inclusion of the fifth-order terms is essential for a qualitatively correct description of soliton interactions. The defocusing quintic nonlinearity leads to inelastic soliton collisions, while bound states of solitons do not persist under perturbations in the initial phase or amplitude.
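
For orientation, a cubic-quintic envelope equation of the type referred to above can be written schematically as follows; the normalization and the coefficient are illustrative and are not taken from the paper:

\[
i\,\partial_\tau a + \tfrac{1}{2}\,\partial_\xi^2 a + |a|^2 a - \sigma\,|a|^4 a = 0, \qquad \sigma > 0,
\]

where the cubic term is focusing and the quintic term, entering at fifth order in the expansion parameter, acts as a defocusing correction.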

Relevance:

80.00%

Publisher:

Abstract:

Changes in mature forest cover amount, composition, and configuration can be of significant consequence to wildlife populations. The response of wildlife to forest patterns is of concern to forest managers because it lies at the heart of such competing approaches to forest planning as aggregated vs. dispersed harvest block layouts. In this study, we developed a species assessment framework to evaluate the outcomes of forest management scenarios on biodiversity conservation objectives. Scenarios were assessed in the context of a broad range of forest structures and patterns that would be expected to occur under natural disturbance and succession processes. Spatial habitat models were used to predict the effects of varying degrees of mature forest cover amount, composition, and configuration on habitat occupancy for a set of 13 focal songbird species. We used a spatially explicit harvest scheduling program to model forest management options and simulate future forest conditions resulting from alternative forest management scenarios, and used a process-based fire-simulation model to simulate future forest conditions resulting from natural wildfire disturbance. Spatial pattern signatures were derived for both habitat occupancy and forest conditions, and these were placed in the context of the simulated range of natural variation. Strategic policy analyses were set in the context of current Ontario forest management policies. This included the use of sequential time-restricted harvest blocks (created for woodland caribou (Rangifer tarandus) conservation) and delayed harvest areas (created for American marten (Martes americana atrata) conservation). This approach increased the realism of the analysis but reduced the generality of interpretations. We found that forest management options that create linear strips of old forest deviated the most from simulated natural patterns and had the greatest negative effects on habitat occupancy, whereas policy options that specify deferment and timing of harvest for large blocks helped ensure the stable presence of an intact mature forest matrix over time. The management scenario that focused on maintaining compositional targets best supported biodiversity objectives by providing the composition patterns required by the 13 focal species, but this scenario may be improved by adding some broad-scale spatial objectives to better maintain large blocks of interior forest habitat through time.
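
As a schematic illustration of how a spatial pattern signature can be placed in the context of a simulated range of natural variation, the Python sketch below tests whether a scenario's landscape metric falls within a percentile envelope of the natural simulations; the variable names, the 5th-95th percentile bounds, and the example numbers are assumptions for illustration, not values from the study.

import numpy as np

def within_rnv(scenario_value, natural_simulations, lower_pct=5, upper_pct=95):
    """Return True if a scenario metric lies inside the simulated range of natural variation."""
    lo, hi = np.percentile(natural_simulations, [lower_pct, upper_pct])
    return lo <= scenario_value <= hi

# Hypothetical example: mature-forest core area (ha) from 200 simulated natural landscapes
rng = np.random.default_rng(0)
natural_core_area = rng.normal(loc=12000, scale=1500, size=200)
print(within_rnv(10500.0, natural_core_area))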

Relevance:

80.00%

Publisher:

Abstract:

We use new neutron scattering instrumentation to follow, in a single quantitative time-resolving experiment, the three key length scales of structural development which accompany the crystallisation of synthetic polymers. These length scales span three orders of magnitude of the scattering vector. The study of polymer crystallisation dates back to the pioneering experiments of Keller and others, who discovered the chain-folded nature of the thin lamellar crystals normally found in synthetic polymers. The inherent connectivity of polymers makes their crystallisation a multiscale transformation. Much understanding has developed over the intervening fifty years, but the process has remained something of a mystery. There are three key length scales: the chain-folded lamellar thickness is ~10 nm, the crystal unit cell is ~1 nm, and the detail of the chain conformation is ~0.1 nm. In previous work these length scales have been addressed using different instrumentation or were coupled using compromised geometries. More recently, researchers have attempted to exploit coupled time-resolved small-angle and wide-angle x-ray experiments. These turned out to be challenging experiments, largely because of the difficulty of placing the scattering intensity on an absolute scale. However, they did suggest the possibility of new phenomena in the very early stages of crystallisation. Although there is now considerable doubt about such experiments, they drew attention to the basic question of how crystallisation proceeds in long-chain molecules. We have used NIMROD on the second target station at ISIS to follow all three length scales in a time-resolving manner for poly(ε-caprolactone). The technique can provide a single set of data from 0.01 to 100 Å⁻¹ on the same vertical scale. We analyse the results using a multiple-scale model of the crystallisation process in polymers.
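
As a rough check that the three length scales do span the quoted range of scattering vector, the standard relation between a real-space length d and the scattering vector q,

\[
q \approx \frac{2\pi}{d},
\]

gives q ≈ 0.06 Å⁻¹ for the ~10 nm lamellar thickness, q ≈ 0.6 Å⁻¹ for the ~1 nm unit cell, and q ≈ 6 Å⁻¹ for the ~0.1 nm conformational detail, all of which fall well within the 0.01 to 100 Å⁻¹ window quoted above.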

Relevance:

80.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

80.00%

Publisher:

Abstract:

By using the multiple-scale method with the simultaneous introduction of multiple times, we study the propagation of long surface waves in a shallow inviscid fluid. As a consequence of the requirements of scale invariance and the absence of secular terms in each order of the perturbative expansion, we show that the Korteweg-de Vries hierarchy equations do play a role in the description of such waves. Finally, we show that this procedure of eliminating secularities is closely related to the renormalization technique introduced by Kodama and Taniuti. © 1995 American Institute of Physics.
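
For reference, the lowest member of the Korteweg-de Vries hierarchy, written in a standard dimensionless form (the normalization is illustrative, not taken from the paper), is

\[
\partial_{t_3} u + 6\,u\,\partial_x u + \partial_x^3 u = 0,
\]

with the higher members of the hierarchy governing the evolution of the same field in the successively slower times introduced by the multiple-time expansion; eliminating the secular terms order by order is what selects these equations.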

Relevance:

80.00%

Publisher:

Abstract:

Sour cassava starch is traditionally produced in Latin America for the preparation of cheese breads such as 'pan de yuca' and 'pandebono' in Colombia, and 'pão de queijo' in Brazil. The processing involved is described and improvements are suggested. Criteria for quality assessment of sour cassava starch are based on consumers' requirements. Recommendations are made for improving the processing and product quality. Alternatives are given for extending the potential value of this traditional foodstuff.

Relevance:

80.00%

Publisher:

Abstract:

In this work we analyze the convergence of solutions of the Poisson equation with Neumann boundary conditions in a two-dimensional thin domain with highly oscillatory behavior. We consider the case where the height of the domain and the amplitude and period of the oscillations are all of the same order, given by a small parameter ε > 0. Using an appropriate corrector approach, we show strong convergence and give error estimates when we replace the original solutions by the first-order expansion obtained through the multiple-scale method.
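
Schematically, the problem has the following form; the lower-order term and the specific oscillating profile g are illustrative assumptions, since the abstract does not specify them:

\[
-\Delta u^{\varepsilon} + u^{\varepsilon} = f \ \ \text{in } R^{\varepsilon}, \qquad
\frac{\partial u^{\varepsilon}}{\partial \nu} = 0 \ \ \text{on } \partial R^{\varepsilon}, \qquad
R^{\varepsilon} = \{(x,y) : 0 < x < 1,\ 0 < y < \varepsilon\, g(x/\varepsilon)\},
\]

where g is a smooth, positive, periodic function, so that the height of the domain and the amplitude and period of the oscillations are all of order ε.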

Relevance:

80.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

80.00%

Publisher:

Abstract:

This thesis deals with heterogeneous architectures in standard workstations. Heterogeneous architectures represent an appealing alternative to traditional supercomputers because they are based on commodity components fabricated in large quantities; hence their price-performance ratio is unparalleled in the world of high performance computing (HPC). In particular, different aspects related to the performance and power consumption of heterogeneous architectures have been explored. The thesis initially focuses on an efficient implementation of a parallel application whose execution time is dominated by a high number of floating-point instructions. It then addresses the central problem of efficient management of power peaks in heterogeneous computing systems. Finally, it discusses a memory-bound problem, where the execution time is dominated by memory latency. Specifically, the following main contributions have been carried out. First, a novel framework for the design and analysis of solar fields for Central Receiver Systems (CRS) has been developed. The implementation, based on a desktop workstation equipped with multiple Graphics Processing Units (GPUs), is motivated by the need for an accurate and fast simulation environment for studying mirror imperfections and non-planar geometries. Secondly, a power-aware scheduling algorithm for heterogeneous CPU-GPU architectures, based on an efficient distribution of the computing workload to the resources, has been realized. The scheduler manages the resources of several computing nodes with a view to reducing the peak power. This work makes two main contributions: the approach reduces the power-supply cost due to high peak power while having negligible impact on the parallelism of the computing nodes, and the developed model allows designers to increase the number of cores without increasing the capacity of the power supply unit. Finally, an implementation for efficient graph exploration on reconfigurable architectures is presented. The purpose is to accelerate graph exploration by reducing the number of random memory accesses.
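
As a schematic illustration of the kind of power-aware workload distribution described above, the Python sketch below assigns task power estimates to nodes greedily so that the peak per-node power stays low; the heuristic and all names are illustrative assumptions, not the scheduler developed in the thesis.

import heapq

def distribute(task_powers, n_nodes):
    """Assign each task (by estimated power draw) to the currently least-loaded node."""
    heap = [(0.0, node) for node in range(n_nodes)]  # (accumulated power, node id)
    heapq.heapify(heap)
    assignment = {}
    # Place the largest power demands first (longest-processing-time style heuristic)
    for task_id, power in sorted(enumerate(task_powers), key=lambda t: -t[1]):
        load, node = heapq.heappop(heap)
        assignment[task_id] = node
        heapq.heappush(heap, (load + power, node))
    peak = max(load for load, _ in heap)
    return assignment, peak

# Hypothetical example: eight task power estimates (W) spread over three CPU-GPU nodes
print(distribute([120, 95, 80, 60, 150, 45, 70, 110], 3))

Placing the largest demands first tends to flatten the per-node totals, which is the sense in which peak power can be reduced without serializing the computing nodes.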

Relevance:

80.00%

Publisher:

Abstract:

Founded in 1962 by Sam Walton, a small American retail company would quickly become the world's largest corporation. From Bentonville, Arkansas, to inland China, Wal-Mart has followed a fairly classic path, from the company's headquarters to a proliferation of its stores in some fifteen countries. With revenues of 405 billion dollars in 2008, the world's leading retailer was overtaken in April 2009 by the oil company Exxon Mobil in the ranking of the five hundred largest companies in the world. But unlike other large transnationals such as Exxon Mobil or Microsoft, Wal-Mart, the world's leading retail company, has to adapt to the local environment in order to attract as many consumers as possible. This article proposes a multiple-scale approach to the development strategies of this international firm. The application of spatial analysis is thus well suited to clarifying the links between territories and the strategies of actors at different geographical scales.

Relevance:

80.00%

Publisher:

Abstract:

Purpose: A fully three-dimensional (3D), massively parallelizable list-mode ordered-subsets expectation-maximization (LM-OSEM) reconstruction algorithm has been developed for high-resolution PET cameras. System response probabilities are calculated online from a set of parameters derived from Monte Carlo simulations. The shape of a system response for a given line of response (LOR) has been shown to be asymmetrical around the LOR. This work has focused on the development of efficient region-search techniques to sample the system response probabilities, which are suitable for asymmetric kernel models, including elliptical Gaussian models that allow for high accuracy and high parallelization efficiency. The novel region-search scheme using variable kernel models is applied in the proposed PET reconstruction algorithm. Methods: A novel region-search technique has been used to sample the probability density function in correspondence with a small dynamic subset of the field of view that constitutes the region of response (ROR). The ROR is identified around the LOR by searching for any voxel within a dynamically calculated contour. The contour condition is currently defined as a fixed threshold over the posterior probability, and arbitrary kernel models can be applied using a numerical approach. The processing of the LORs is distributed in batches among the available computing devices; individual LORs are then processed within different processing units. In this way, both multicore and multiple many-core processing units can be efficiently exploited. Tests have been conducted with probability models that take into account the noncollinearity, positron range, and crystal penetration effects, which produced tubes of response with varying elliptical sections whose axes were a function of the crystal thickness and angle of incidence of the given LOR. The algorithm treats the probability model as a 3D scalar field defined within a reference system aligned with the ideal LOR. Results: This new technique provides superior image quality in terms of signal-to-noise ratio as compared with the histogram-mode method based on precomputed system matrices available for a commercial small-animal scanner. Reconstruction times can be kept low with the use of multicore and many-core architectures, including multiple graphics processing units. Conclusions: A highly parallelizable LM reconstruction method has been proposed, based on Monte Carlo simulations and new parallelization techniques aimed at improving the reconstruction speed and the image signal-to-noise ratio of a given OSEM algorithm. The method has been validated using simulated and real phantoms. A special advantage of the new method is the possibility of dynamically defining the cut-off threshold over the calculated probabilities, thus allowing for direct control of the trade-off between speed and quality during the reconstruction.
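
A minimal sketch of the thresholded region-of-response selection described above, assuming an isotropic Gaussian kernel around the LOR axis; the kernel, its width, the threshold, and all names are illustrative assumptions rather than the paper's actual system-response model.

import numpy as np

def ror_voxels(voxel_coords, lor_point, lor_dir, sigma=1.5, threshold=1e-3):
    """Return indices of voxels whose kernel probability around the LOR exceeds the cut-off."""
    d = lor_dir / np.linalg.norm(lor_dir)
    rel = voxel_coords - lor_point                     # vectors from a point on the LOR
    axial = rel @ d                                    # component along the LOR direction
    radial2 = np.sum(rel**2, axis=1) - axial**2        # squared distance to the LOR axis
    prob = np.exp(-0.5 * radial2 / sigma**2)           # toy Gaussian kernel, not the paper's model
    return np.nonzero(prob > threshold)[0]

# Hypothetical example: random voxel grid, one LOR along the x-axis through the origin
rng = np.random.default_rng(0)
voxels = rng.uniform(-20.0, 20.0, size=(1000, 3))
print(len(ror_voxels(voxels, np.zeros(3), np.array([1.0, 0.0, 0.0]))))

Lowering the threshold enlarges the region of response, trading reconstruction speed for accuracy, which mirrors the speed-quality trade-off mentioned in the conclusions.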
