895 results for Markov chains. Convergence. Evolutionary Strategy. Large Deviations


Relevance: 30.00%

Abstract:

Parallel computing is now widely used in numerical simulation, particularly for application codes based on finite difference and finite element methods. A popular and successful technique for parallelizing such codes onto large distributed-memory systems is to partition the mesh into sub-domains that are then allocated to processors. The code then executes in parallel, using the SPMD methodology, with message passing for inter-processor interactions. To improve the parallel efficiency of an imbalanced structured-mesh CFD code, a new dynamic load balancing (DLB) strategy has been developed in which the partition range limits along just one of the partitioned dimensions are allowed to be non-coincidental rather than coincidental. This 'local' change of partition limits allows greater flexibility in obtaining a balanced load distribution, as a workload increase, or decrease, on a processor is no longer restricted by the 'global' (coincidental) limit change. The automatic implementation of this generic DLB strategy within an existing parallel code is presented in this chapter, along with some preliminary results.
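The non-coincidental-limits idea can be sketched in a few lines (a toy model of ours, not the chapter's implementation): after a global split along one dimension, each strip rebalances the other dimension independently, so its limits need not line up with its neighbours'.

```python
# Toy sketch of non-coincidental partition limits (ours, not the chapter's
# code): split rows globally by workload, then let each row strip pick its own
# column split, so column limits may differ between strips.
import numpy as np

def balanced_splits(weights, parts):
    """Indices that cut a 1-D workload into `parts` chunks of near-equal weight."""
    cum = np.cumsum(weights)
    targets = cum[-1] * np.arange(1, parts) / parts
    return np.searchsorted(cum, targets).tolist()

work = np.ones((8, 8))                 # per-cell workload of a toy mesh
work[:2, :2] = 10.0                    # imbalanced hot spot

row_splits = balanced_splits(work.sum(axis=1), 2)          # 'global' row limit
strips = np.split(work, row_splits, axis=0)
col_splits = [balanced_splits(s.sum(axis=0), 2) for s in strips]
print(row_splits, col_splits)          # column limits differ between strips
```

With the hot spot above, the two strips choose different column limits, which a single coincidental column limit could not accommodate.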

Relevance: 30.00%

Abstract:

Computer egress simulation has the potential to be used in large-scale incidents to provide live advice to incident commanders. While many considerations must be taken into account when applying such models to live incidents, one of the first concerns the computational speed of the simulations. No matter how important the insight provided by the simulation, numerical hindsight will not prove useful to an incident commander. Thus, for this type of application to be useful, it is essential that the simulation can be run many times faster than real time. Parallel processing is a method of reducing run times for very large computational simulations by distributing the workload amongst a number of CPUs. In this paper we examine the development of a parallel version of the buildingEXODUS software. The parallel strategy implemented is based on a systematic partitioning of the problem domain onto an arbitrary number of sub-domains. Each sub-domain is computed on a separate processor and runs its own copy of the EXODUS code. The software has been designed to work on typical office-based networked PCs but will also function on a Windows-based cluster. Two evaluation scenarios using the parallel implementation of EXODUS are described: a large open area and a 50-story high-rise building scenario. Speed-ups of up to 3.7 are achieved using up to six computers, with the high-rise building evacuation simulation achieving run times 6.4 times faster than real time.
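As a back-of-envelope illustration (our own, not the paper's analysis), the reported speed-up of 3.7 on six machines can be inverted through Amdahl's law to estimate the implied serial fraction of the computation:

```python
# Back-of-envelope illustration (ours, not the paper's analysis): invert
# Amdahl's law S = 1 / (f + (1 - f) / p) to see what serial fraction f is
# implied by a speed-up S on p machines.
def serial_fraction(speedup, p):
    """Serial fraction implied by Amdahl's law."""
    return (1.0 / speedup - 1.0 / p) / (1.0 - 1.0 / p)

f = serial_fraction(3.7, 6)            # paper's figures: speed-up 3.7 on 6 PCs
print(f"implied serial fraction ~ {f:.0%}")   # ≈ 12%
```

An implied serial fraction of roughly 12% is consistent with the abstract's sub-linear speed-up; the true bottleneck mix (communication vs. inherently serial work) is not distinguishable from these two numbers alone.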

Relevance: 30.00%

Abstract:

Atmospheric inputs of mineral dust supply iron and other trace metals to the remote ocean and can influence the marine carbon cycle due to iron's role as a potentially limiting micronutrient. Dust generation, transport, and deposition are highly heterogeneous, and there are very few remote marine locations where dust concentrations and chemistry (e.g., iron solubility) are routinely monitored. Here we use aerosol and rainwater samples collected during 10 large-scale research cruises to estimate the atmospheric input of iron, aluminum, and manganese to four broad regions of the Atlantic Ocean over two 3-month periods for the years 2001–2005. We estimate total inputs of these metals to our study regions to be 4.2, 17, and 0.27 Gmol in April–June and 4.9, 14, and 0.19 Gmol in September–November, respectively. Inputs were highest in regions of high rainfall (the intertropical convergence zone and South Atlantic storm track), and rainfall contributed higher proportions of total input in wetter regions. By combining input estimates for total and soluble metals for these time periods, we calculated overall percentage solubilities for each metal that account for the contributions from both wet and dry deposition and the relative contributions from different aerosol types. Calculated solubilities were in the range 2.4%–9.1% for iron, 6.1%–15% for aluminum, and 54%–73% for manganese. We discuss sources of uncertainty in our estimates and compare our results to some recent estimates of atmospheric iron input to the Atlantic.
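The overall percentage solubility described above is, in essence, a deposition-weighted ratio. A minimal sketch of the bookkeeping (all numbers below are made up for illustration, not the cruise data):

```python
# Hypothetical sketch of the overall-solubility bookkeeping described above:
# sum soluble and total inputs over both deposition modes before taking the
# ratio. All numbers below are made up for illustration, not the cruise data.
def overall_solubility(sol_wet, sol_dry, tot_wet, tot_dry):
    """Deposition-weighted percentage solubility of a metal."""
    return 100.0 * (sol_wet + sol_dry) / (tot_wet + tot_dry)

print(round(overall_solubility(0.15, 0.05, 3.0, 1.2), 2))   # → 4.76
```

Summing before dividing is what lets the result account for wet and dry deposition in proportion to their actual contributions, rather than averaging two separately computed solubilities.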

Relevance: 30.00%

Abstract:

Unprecedented basin-scale ecological changes are occurring in our seas. As temperature and carbon dioxide concentrations increase, the extent of sea ice is decreasing, stratification and nutrient regimes are changing and pH is decreasing. These unparalleled changes present new challenges for managing our seas, as we are only just beginning to understand the ecological manifestations of these climate alterations. The Marine Strategy Framework Directive requires all European Member States to achieve good environmental status (GES) in their seas by 2020; this means management towards GES will take place against a background of climate-driven macroecological change. Each Member State must set environmental targets to achieve GES; however, in order to do so, an understanding of large-scale ecological change in the marine ecosystem is necessary. Much of our knowledge of macroecological change in the North Atlantic is a result of research using data gathered by the Continuous Plankton Recorder (CPR) survey, a near-surface plankton monitoring programme that has been sampling in the North Atlantic since 1931. CPR data indicate that North Atlantic and North Sea plankton dynamics are responding to both climate and human-induced changes, presenting challenges to the development of pelagic targets for achievement of GES in European Seas. Thus, the continuation of long-term ecological time series such as the CPR survey is crucial for informing and supporting the sustainable management of European seas through policy mechanisms.

Relevance: 30.00%

Abstract:

In the Ceramiaceae, one of the largest families of the red algae, there are from 1 to 4000 nuclei in each vegetative cell, but each tribe is homogeneous with respect to the uninucleate/multinucleate character state, except for the Callithamnieae. The goals of this study were to analyze rbcL gene sequences to clarify the evolution of taxa within the tribe Callithamnieae and to evaluate the potential evolutionary significance of the development of multinucleate cells in certain taxa. The genus Aglaothamnion, segregated from Callithamnion because it is uninucleate, was paraphyletic in all analyses. Callithamnion (including Aristothamnion) was monophyletic although not robustly so, apparently due to variations between taxa in rate of sequence evolution. Morphological synapomorphies were identified at different depths in the tree, supporting the molecular phylogenetic analysis. The uninucleate character state is ancestral in this tribe. The evolution of multinucleate cells has occurred once in the Callithamnieae. Multiple nuclei in each cell may combine the benefits of small C values (rapid cell cycle) with large cells (permitting morphological elaboration) while maintaining a constant ratio of nuclear volume: cytoplasmic volume.

Relevance: 30.00%

Abstract:

We address the problem of springlike coupling between bosons in an open-chain configuration where the counter-rotating terms are explicitly included. We show that fruitful insight can be gained by decomposing the time-evolution operator of this problem into a pattern of linear-optics elements. This allows us to provide a clear picture of the effects of the counter-rotating terms in the important problem of long-haul entanglement distribution. The analytic control over the variance matrix of the state of the bosonic register allows us to track the dynamics of the entanglement. This helps in designing a global addressing scheme, complemented by a proper initialization of the register, which quantitatively improves the entanglement between the extremal oscillators in the chain, thus providing a strategy for feasible long-distance entanglement distribution.
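For orientation, a springlike open-chain coupling of the kind described here can be written, in a generic notation of our own (not necessarily the paper's):

```latex
% Generic open-chain Hamiltonian with springlike coupling; notation is ours,
% not necessarily the paper's.
H = \sum_{i=1}^{N} \omega\, \hat a_i^{\dagger} \hat a_i
  + g \sum_{i=1}^{N-1} \left( \hat a_i + \hat a_i^{\dagger} \right)
                       \left( \hat a_{i+1} + \hat a_{i+1}^{\dagger} \right)
```

Expanding the product yields rotating terms \(\hat a_i^{\dagger}\hat a_{i+1} + \mathrm{h.c.}\) and counter-rotating terms \(\hat a_i \hat a_{i+1} + \mathrm{h.c.}\); the rotating-wave approximation discards the latter, and it is precisely their effect that the paper retains. Because such a Hamiltonian is quadratic, Gaussian states remain Gaussian under the evolution, which is why the variance (covariance) matrix suffices to track the entanglement.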

Relevance: 30.00%

Abstract:

A theoretical/conceptual framework is proposed to determine whether the interorganizational and interpersonal relationships of the netchain of agrifood cooperatives evolve toward a learning netchain. The propositions of the study show that a higher degree of associationism and greater vertical cooperation/collaboration along the chain are positively related to the horizontal position of the focal firm closest to the final consumer. This requires joint planning and problem solving, which is positively related to a greater flow and diversity of the information/knowledge obtained and disseminated along the netchain. At the same time, a social context must be developed in which information/knowledge and new ideas flow informally; this is achieved through personal and, above all, professional networks, and through internal and, above all, external networks. All of this will allow greater satisfaction among the members of the agrifood cooperative and its distributors, as well as greater R&D intensity, thus turning the agrifood cooperative's netchain into a learning netchain.

Relevance: 30.00%

Abstract:

The results of a study aimed at determining the most important experimental parameters for automated, quantitative analysis of solid dosage form pharmaceuticals (seized and model 'ecstasy' tablets) are reported. Data obtained with a macro-Raman spectrometer were complemented by micro-Raman measurements, which gave information on particle size and provided excellent data for developing statistical models of the sampling errors associated with collecting data as a series of grid points on the tablets' surface. Spectra recorded at single points on the surface of seized MDMA–caffeine–lactose tablets with a Raman microscope (λex = 785 nm, 3 μm diameter spot) were typically dominated by one or other of the three components, consistent with Raman mapping data which showed the drug and caffeine microcrystals were ca 40 μm in diameter. Spectra collected with a microscope from eight points on a 200 μm grid were combined, and in the resultant spectra the average value of the Raman band intensity ratio used to quantify the MDMA:caffeine ratio, μr, was 1.19 with an unacceptably high standard deviation, σr, of 1.20. In contrast, with a conventional macro-Raman system (150 μm spot diameter), combined eight-grid-point data gave μr = 1.47 with σr = 0.16. A simple statistical model which could be used to predict σr under the various conditions used was developed. The model showed that the decrease in σr on moving to a 150 μm spot was too large to be due entirely to the increased spot diameter but was consistent with the increased sampling volume that arose from a combination of the larger spot size and depth of focus in the macroscopic system. With the macro-Raman system, combining 64 grid points (0.5 mm spacing and 1–2 s accumulation per point) to give a single averaged spectrum for a tablet was found to be a practical balance between minimizing sampling errors and keeping overhead times at an acceptable level.
The effectiveness of this sampling strategy was also tested by quantitative analysis of a set of model ecstasy tablets prepared from MDEA–sorbitol (0–30% by mass MDEA). A simple univariate calibration model of averaged 64-point data had R² = 0.998 and an r.m.s. standard error of prediction of 1.1%, whereas data obtained by sampling just four points on the same tablet showed deviations from the calibration of up to 5%.
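The scaling behind the grid-averaging strategy can be sketched with a toy Monte Carlo model (illustrative numbers, not the paper's statistical model): if each single-point band-intensity ratio is an independent noisy draw, the standard deviation of an n-point average shrinks roughly as 1/√n.

```python
# Toy Monte Carlo sketch (illustrative numbers, not the paper's model) of why
# averaging grid points tames the sampling error: the standard deviation of an
# n-point average of independent single-point ratios shrinks as ~1/sqrt(n).
import random
import statistics

def tablet_estimate(n_points, rng, mu=1.47, sigma=1.20):
    """Average of n single-point ratio measurements for one tablet."""
    return statistics.fmean(rng.gauss(mu, sigma) for _ in range(n_points))

for n in (8, 64):                      # 8-point vs 64-point grids
    estimates = [tablet_estimate(n, random.Random(i)) for i in range(2000)]
    print(n, round(statistics.stdev(estimates), 3))
```

With these assumed per-point numbers the 64-point spread comes out near 1.20/√64 ≈ 0.15; the paper's actual model is richer, also accounting for spot size and depth of focus.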

Relevance: 30.00%

Abstract:

The Balanced Scorecard of Kaplan and Norton is a management tool that supports the successful implementation of corporate strategies. It has been discussed and considered widely in both practice and research. By linking operational and non-financial corporate activities with causal chains to the firm's long-term strategy, the Balanced Scorecard supports the alignment and management of all corporate activities according to their strategic relevance. The Balanced Scorecard makes it possible to take into account non-monetary strategic success factors that significantly impact the economic success of a business. The Balanced Scorecard is thus a promising starting-point to also incorporate environmental and social aspects into the main management system of a firm. Sustainability management with the Balanced Scorecard helps to overcome the shortcomings of conventional approaches to environmental and social management systems by integrating the three pillars of sustainability into a single and overarching strategic management tool. After a brief discussion of the different possible forms of a Sustainability Balanced Scorecard the article takes a closer look at the process and steps of formulating a Sustainability Balanced Scorecard for a business unit. Before doing so, the basic conventional approach of the Balanced Scorecard and its suitability for sustainability management will be outlined in brief.

Relevance: 30.00%

Abstract:

One thousand two hundred pigs were weaned at 4 weeks of age and mixed to form groups of ten animals that were balanced for gender. The groups consisted of uniform weight groups (i.e. separate groups of small, medium or large pigs), or mixed weight groups (i.e. groups containing small, medium and large pigs). Half of the groups were retained from weaning until slaughter at 21 weeks of age, and half were regrouped at the start of the finishing period at 10 weeks of age. In this regrouping, uniform weight groups were regrouped to form mixed weight groups, and mixed weight groups were regrouped to form uniform weight groups. In addition, some mixed weight groups were regrouped to form mixed weight groups in order to assess the effect of regrouping at 10 weeks of age on performance and aggressive behaviour.

Relevance: 30.00%

Abstract:

Microsatellite genotyping is a common DNA characterization technique in population, ecological and evolutionary genetics research. Since different alleles are sized relative to internal size-standards, different laboratories must calibrate and standardize allelic designations when exchanging data. This interchange of microsatellite data can often prove problematic. Here, 16 microsatellite loci were calibrated and standardized for the Atlantic salmon, Salmo salar, across 12 laboratories. Although inconsistencies were observed, particularly due to differences between migration of DNA fragments and actual allelic size ('size shifts'), inter-laboratory calibration was successful. Standardization also allowed an assessment of the degree and partitioning of genotyping error. Notably, the global allelic error rate was reduced from 0.05 ± 0.01 prior to calibration to 0.01 ± 0.002 post-calibration. Most errors were found to occur during analysis (i.e. when size-calling alleles); the mean proportion of all errors that were analytical errors across loci was 0.58 after calibration. No evidence was found of an association between the degree of error and the allelic size range of a locus, the number of alleles, or the repeat type, nor was there evidence that genotyping errors were more prevalent when a laboratory analyzed samples from outside its usual geographic area. The microsatellite calibration between laboratories presented here will be especially important for genetic assignment of marine-caught Atlantic salmon, enabling analysis of marine mortality, a major factor in the observed declines of this highly valued species.

Relevance: 30.00%

Abstract:

A scalable, large-vocabulary, speaker-independent speech recognition system is being developed using Hidden Markov Models (HMMs) for acoustic modeling and a Weighted Finite State Transducer (WFST) to compile sentence, word, and phoneme models. The system comprises a software backend search and an FPGA-based Gaussian calculation, which are covered here. In this paper, we present an efficient pipelined design implemented both as an embedded peripheral and as a scalable, parallel hardware accelerator. Both architectures have been implemented on an Alpha Data XRC-5T1 reconfigurable computer housing a Virtex 5 SX95T FPGA. The core has been tested and is capable of calculating a full set of Gaussian results from 3825 acoustic models in 9.03 ms, which, coupled with a backend search of 5000 words, has provided an accuracy of over 80%. Parallel implementations have been designed with up to 32 cores and have been successfully implemented at a clock frequency of 133 MHz.
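The numerical kernel the FPGA core accelerates can be sketched on the CPU (our own NumPy version, not the paper's design): scoring one feature vector against every diagonal-covariance Gaussian acoustic model at once. The 39-dimensional feature size is an assumption (a typical MFCC setup); 3825 models is the count quoted in the abstract.

```python
# CPU sketch (ours, not the paper's FPGA design) of the core kernel:
# log-likelihoods of one feature vector under every diagonal-covariance
# Gaussian acoustic model, vectorized over models.
import numpy as np

def gaussian_log_likelihoods(x, means, inv_vars, log_consts):
    """log N(x; mu_m, Sigma_m) for all M diagonal Gaussians."""
    diff = x - means                              # broadcast to (M, D)
    return log_consts - 0.5 * np.sum(diff * diff * inv_vars, axis=1)

rng = np.random.default_rng(0)
M, D = 3825, 39                                   # 39-dim MFCCs: an assumption
means = rng.standard_normal((M, D))
variances = rng.uniform(0.5, 2.0, (M, D))
# Precompute the constant term -0.5 * (D*log(2*pi) + sum(log var_m)).
log_consts = -0.5 * (D * np.log(2 * np.pi) + np.log(variances).sum(axis=1))

x = rng.standard_normal(D)
scores = gaussian_log_likelihoods(x, means, 1.0 / variances, log_consts)
print(scores.shape)                               # one score per model: (3825,)
```

Precomputing `inv_vars` and `log_consts` keeps the per-frame work to multiplies and adds, which is also what makes the kernel amenable to a pipelined hardware implementation.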

Relevance: 30.00%

Abstract:

We study the ground-state phase diagram of ultracold dipolar gases in highly anisotropic traps. Starting from a one-dimensional geometry, by ramping down the transverse confinement along one direction, the gas reaches various planar distributions of dipoles. At large linear densities, when the dipolar gas exhibits a crystal-like phase, critical values of the transverse frequency exist below which the configuration exhibits transverse patterns. These critical values are found by means of a classical theory, and are in full agreement with classical Monte Carlo simulations. The study of the quantum system is performed numerically with Monte Carlo techniques and shows that the quantum fluctuations smoothen the transition and make it completely disappear in a gas phase. These predictions could be experimentally tested and would allow one to reveal the effect of zero-point motion on self-organized mesoscopic structures of matter waves, such as the transverse pattern of the zigzag chain.
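As background for the zigzag mechanism (standard textbook physics, not a result from the paper): dipoles polarized perpendicular to the chain plane repel as

```latex
% Standard dipole-dipole repulsion between aligned dipoles at distance r;
% background formula, not taken from the paper.
V(r) = \frac{C_{\mathrm{dd}}}{4\pi}\,\frac{1}{r^{3}}
```

so at large linear density the linear chain can lower its interaction energy by staggering neighbours transversely; this gain competes with the transverse trapping energy, roughly \(\tfrac{1}{2} m (2\pi\nu_t)^{2} y^{2}\), which is why critical transverse frequencies appear below which the zigzag and further planar patterns emerge.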