960 results for Graph partitioning


Relevance:

20.00%

Publisher:

Abstract:

In this work, we present an adaptive unequal loss protection (ULP) scheme for H.264/AVC video transmission over lossy networks. This scheme combines erasure coding, H.264/AVC error resilience techniques and importance measures in video coding. The unequal importance of the video packets is identified at the group of pictures (GOP) and H.264/AVC data partitioning levels. The presented method can adaptively assign unequal amounts of forward error correction (FEC) parity across the video packets according to the network conditions, such as the available network bandwidth, packet loss rate and average packet burst loss length. A near-optimal algorithm is developed to handle the FEC assignment optimization. The simulation results show that our scheme can effectively utilize network resources such as bandwidth, while improving the quality of the video transmission. In addition, the proposed ULP strategy ensures graceful degradation of the received video quality as the packet loss rate increases. © 2010 IEEE.
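
A minimal sketch of the kind of priority-weighted FEC allocation described above is given below, assuming a simple proportional rule; the packet classes, weights, parity budget and the function name `allocate_fec_parity` are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only: proportional FEC parity allocation across
# packet classes of unequal importance (weights are assumed, not from the paper).

def allocate_fec_parity(importance, total_parity):
    """Split a parity-symbol budget across packet classes in proportion to
    their importance weights, rounding down and handing leftovers to the
    most important classes first."""
    total_weight = sum(importance.values())
    shares = {k: int(total_parity * w / total_weight) for k, w in importance.items()}
    leftover = total_parity - sum(shares.values())
    for k in sorted(importance, key=importance.get, reverse=True):
        if leftover == 0:
            break
        shares[k] += 1
        leftover -= 1
    return shares

# Hypothetical importance weights for H.264/AVC data-partitioning levels
# within a GOP (partition A carries headers and motion vectors, so it is
# weighted highest; the numbers are invented for illustration).
weights = {"partition_A": 5.0, "partition_B": 3.0, "partition_C": 1.0}
print(allocate_fec_parity(weights, total_parity=24))
# -> {'partition_A': 14, 'partition_B': 8, 'partition_C': 2}
```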

Relevance:

20.00%

Publisher:

Abstract:

A new denuder-filter sampling technique has been used to investigate the gas/particle partitioning behaviour of the carbonyl products from the photooxidation of isoprene and 1,3,5-trimethylbenzene. A series of experiments was performed in two atmospheric simulation chambers at atmospheric pressure and ambient temperature in the presence of NOx and at a relative humidity of approximately 50%. The denuder and filter were both coated with the derivatizing agent O-(2,3,4,5,6-pentafluorobenzyl)-hydroxylamine (PFBHA) to enable the efficient collection of gas- and particle-phase carbonyls respectively. The tubes and filters were extracted and carbonyls identified as their oxime derivatives by GC-MS. The carbonyl products identified in the experiments accounted for around 5% and 10% of the mass of secondary organic aerosol formed from the photooxidation of isoprene and 1,3,5-trimethylbenzene respectively. Experimental gas/particle partitioning coefficients were determined for a wide range of carbonyl products formed from the photooxidation of isoprene and 1,3,5-trimethylbenzene and compared with the theoretical values based on standard absorptive partitioning theory. Photooxidation products with a single carbonyl moiety were not observed in the particle phase, but dicarbonyls, and in particular, glyoxal and methylglyoxal, exhibited gas/particle partitioning coefficients several orders of magnitude higher than expected theoretically. These findings support the importance of heterogeneous and particle-phase chemical reactions for SOA formation and growth during the atmospheric degradation of anthropogenic and biogenic hydrocarbons.
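
For reference, the standard absorptive partitioning theory referred to above is usually quoted (following Pankow) in roughly the form below; unit conventions differ between studies, so this should be read as the commonly cited expression rather than the exact one used in this work.

```latex
K_{p,i} \;=\; \frac{f_{\mathrm{om}}\, 760\, R\, T}{\mathrm{MW}_{\mathrm{om}}\, 10^{6}\, \zeta_i\, p^{\circ}_{L,i}}
```

Here $K_{p,i}$ is the gas/particle partitioning coefficient of compound $i$, $f_{\mathrm{om}}$ the absorbing organic-matter fraction of the particle mass, $\mathrm{MW}_{\mathrm{om}}$ its mean molecular weight, $\zeta_i$ the activity coefficient of $i$ in that phase, and $p^{\circ}_{L,i}$ its sub-cooled liquid vapour pressure. Measured coefficients for glyoxal and methylglyoxal lying orders of magnitude above this prediction are what point to heterogeneous and particle-phase reactions.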

Relevance:

20.00%

Publisher:

Abstract:

Key life history traits such as breeding time and clutch size are frequently both heritable and under directional selection, yet many studies fail to document micro-evolutionary responses. One general explanation is that selection estimates are biased by the omission of correlated traits that have causal effects on fitness, but few valid tests of this exist. Here we show, using a quantitative genetic framework and six decades of life-history data on two free-living populations of great tits Parus major, that selection estimates for egg-laying date and clutch size are relatively unbiased. Predicted responses to selection based on the Robertson-Price Identity were similar to those based on the multivariate breeder’s equation, indicating that unmeasured covarying traits were not missing from the analysis. Changing patterns of phenotypic selection on these traits (for laying date, linked to climate change) therefore reflect changing selection on breeding values, and genetic constraints appear not to limit their independent evolution. Quantitative genetic analysis of correlational data from pedigreed populations can be a valuable complement to experimental approaches to help identify whether apparent associations between traits and fitness are biased by missing traits, and to parse the roles of direct versus indirect selection across a range of environments.
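
The two prediction routes compared above can be written compactly in standard quantitative-genetics notation (the notation is generic, not copied from the paper): the Robertson-Price identity gives the expected response of a trait as its additive genetic covariance with relative fitness, while the multivariate breeder's equation combines the additive genetic (co)variance matrix with the selection gradients.

```latex
\Delta\bar{z} \;=\; \mathrm{cov}_A(w, z) \qquad \text{(Robertson-Price identity)}
```
```latex
\Delta\bar{\mathbf{z}} \;=\; \mathbf{G}\,\boldsymbol{\beta} \;=\; \mathbf{G}\,\mathbf{P}^{-1}\mathbf{s} \qquad \text{(multivariate breeder's equation)}
```

Agreement between the two predictions is what indicates that no unmeasured trait with a causal effect on fitness is biasing the estimated selection on laying date and clutch size.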

Relevance:

20.00%

Publisher:

Abstract:

Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.

Relevance:

20.00%

Publisher:

Abstract:

Plastic pipe is solidified by passing it through a long cooling chamber. Inside this chamber, the plastic on the inner surface of the hollow extrudate is still molten, and this inner surface solidifies last. Sag, the flow driven by the self-weight of the molten plastic, therefore occurs in this cooling chamber, and thickened regions (called knuckles) sometimes arise in the lower quadrants, especially of large-diameter, thick-walled pipes. To compensate for sag, engineers normally shift the die centerpiece downward. This thesis focuses on the consequences of this decentering. Specifically, when the molten polymer is viscoelastic, as is normally the case, a downward lateral force is exerted on the mandrel. Die eccentricity also affects the downstream axial force on the mandrel. These forces govern how rigidly the mandrel must be attached (normally, on a spider die). We attack this flow problem in eccentric cylindrical coordinates, using the Oldroyd 8-constant constitutive model framework. Specifically, we revise the method of Jones (1964), called polymer process partitioning. We estimate both axial and lateral forces. We develop a corresponding map to help plastics engineers predict the extrudate shape, including extrudate knuckles. From the mass balance over the post-die region, we then predict the shape of the extrudate entering the cooling chamber. We further include expressions for the stresses in the extruded polymer melt. We include detailed dimensional worked examples to show process engineers how to use our results to design pipe dies, and especially to suppress extrudate knuckling.
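
The post-die mass balance mentioned above can be illustrated generically (this is not the thesis's actual working) by relating the local extrudate wall thickness at azimuthal position $\theta$ to the local volumetric flow rate per unit circumference leaving the die, $q'(\theta)$, and the line (haul-off) speed $V$, assuming negligible draw-down between the die and the cooling chamber:

```latex
H(\theta) \;\approx\; \frac{q'(\theta)}{V}
```

Any azimuthal variation in $q'$ caused by die eccentricity then shows up directly as thickness variation, including knuckles, in the extrudate entering the cooling chamber.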

Relevance:

20.00%

Publisher:

Abstract:

Verbal fluency is the ability to produce a satisfying sequence of spoken words during a given time interval. The core of verbal fluency lies in the capacity to manage the executive aspects of language. The standard scores of the semantic verbal fluency test are broadly used in the neuropsychological assessment of the elderly, and different analytical methods are likely to extract even more information from the data generated in this test. Graph theory, a mathematical approach to analyzing relations between items, represents a promising tool to understand a variety of neuropsychological states. This study reports a graph analysis of data generated by the semantic verbal fluency test in cognitively healthy elderly (NC), patients with Mild Cognitive Impairment of the amnestic (aMCI) and amnestic multiple-domain (a+mdMCI) subtypes, and patients with Alzheimer's disease (AD). Sequences of words were represented as a speech graph in which every word corresponded to a node and temporal links between words were represented by directed edges. To characterize the structure of the data we calculated 13 speech graph attributes (SGAs). The individuals were compared when divided into three (NC, MCI, AD) and four (NC, aMCI, a+mdMCI, AD) groups. When the three groups were compared, significant differences were found in the standard measure of correct words produced and in three SGAs: diameter, average shortest path, and network density. SGAs sorted the elderly into their groups with good specificity and sensitivity. When the four groups were compared, the groups differed significantly in network density, except between the two MCI subtypes and between NC and aMCI. The diameter of the network and the average shortest path differed significantly between NC and AD, and between aMCI and AD. SGAs sorted the elderly into their groups with good specificity and sensitivity, performing better than the standard score of the task. These findings provide support for a new methodological frame to assess the strength of semantic memory through the verbal fluency task, with the potential to amplify the predictive power of this test. Graph analysis is likely to become clinically relevant in neurology and psychiatry, and may be particularly useful for the differential diagnosis of the elderly.
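
As an illustration of the speech-graph representation described above (words as nodes, temporal succession as directed edges), a minimal sketch using networkx might look as follows; the word list is invented and this is not the authors' code, but the three attributes computed mirror those named in the abstract.

```python
# Minimal sketch: build a directed speech graph from a verbal-fluency
# word sequence and compute three of the attributes named above.
# The word list is invented; it is not data from the study.
import networkx as nx

words = ["dog", "cat", "cow", "horse", "cat", "lion", "tiger", "dog"]

G = nx.DiGraph()
for a, b in zip(words, words[1:]):
    G.add_edge(a, b)            # temporal link between consecutive words

density = nx.density(G)          # edges relative to the possible maximum

# Diameter and average shortest path are computed on the undirected view
# here, so the sketch also works when the graph is not strongly connected.
U = G.to_undirected()
diameter = nx.diameter(U)
avg_path = nx.average_shortest_path_length(U)

print(density, diameter, avg_path)
```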

Relevance:

20.00%

Publisher:

Abstract:

Abstract not available

Relevance:

20.00%

Publisher:

Abstract:

Abstract not available

Relevance:

20.00%

Publisher:

Abstract:

The use of unstructured mesh codes on parallel machines is one of the most effective ways to solve large computational mechanics problems. Completely general geometries and complex behaviour can be modelled and, in principle, the inherent sparsity of many such problems can be exploited to obtain excellent parallel efficiencies. However, unlike their structured counterparts, the problem of distributing the mesh across the memory of the machine, whilst minimising the amount of interprocessor communication, must be carefully addressed. This process is an overhead that is not incurred by a serial code, but it is shown to be rapidly computable at run time and tailored to the machine being used.
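
For readers unfamiliar with the mesh-distribution step, a toy sketch of graph-based partitioning is given below: it bisects a small grid graph standing in for a mesh connectivity graph, using the Kernighan-Lin refinement available in networkx, and reports the edge cut as a proxy for interprocessor communication. The codes discussed here use far more sophisticated parallel partitioners; this only illustrates the objective, not their method.

```python
# Toy illustration of mesh partitioning: balance the number of mesh
# entities per processor while keeping the edge cut (a proxy for
# interprocessor communication) small. Not the method used in the paper.
import networkx as nx
from networkx.algorithms.community import kernighan_lin_bisection

# Stand-in for a mesh connectivity graph: an 8 x 8 grid of cells.
mesh = nx.grid_2d_graph(8, 8)

part_a, part_b = kernighan_lin_bisection(mesh, seed=0)
edge_cut = nx.cut_size(mesh, part_a, part_b)

print(len(part_a), len(part_b), edge_cut)   # balanced sizes, small cut
```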

Relevance:

20.00%

Publisher:

Abstract:

This chapter describes a parallel optimization technique that incorporates a distributed load-balancing algorithm and provides an extremely fast solution to the problem of load-balancing adaptive unstructured meshes. Moreover, a parallel graph contraction technique can be employed to enhance the partition quality, and the resulting strategy outperforms or matches results from existing state-of-the-art static mesh partitioning algorithms. The strategy can also be applied to static partitioning problems. Dynamic procedures have been found to be much faster than static techniques, to provide partitions of similar or higher quality and, in comparison, to involve the migration of only a fraction of the data. The method employs a new iterative optimization technique that balances the workload and attempts to minimize the interprocessor communication overhead. Experiments on a series of adaptively refined meshes indicate that the algorithm provides partitions of quality equivalent or superior to static partitioners (which do not reuse the existing partition), and much more quickly. The dynamic evolution of load has three major influences on possible partitioning techniques: cost, reuse, and parallelism. The unstructured mesh may be modified every few time-steps, so the load-balancing must have a low cost relative to that of the solution algorithm in between remeshings.
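
The abstract does not spell out the iterative balancing step; the sketch below shows one classical ingredient of such schemes, a first-order diffusive load-balancing iteration on a processor graph, offered only as an illustration of cheap, local, parallelisable balancing and not as the algorithm developed in this chapter. The function name, damping factor and loads are all assumptions.

```python
# Illustrative first-order diffusion load balancing on a processor graph.
# Each sweep, every processor adjusts its load by -alpha * (own load -
# neighbour load), summed over its neighbours, so load flows from heavier
# to lighter processors and the total is conserved.
# This is a generic textbook scheme, not the chapter's algorithm.

def diffuse(loads, neighbours, alpha=0.25, steps=50):
    loads = list(loads)
    for _ in range(steps):
        flows = [0.0] * len(loads)
        for p, nbrs in enumerate(neighbours):
            for q in nbrs:
                flows[p] -= alpha * (loads[p] - loads[q])
        loads = [l + f for l, f in zip(loads, flows)]
    return loads

# Four processors in a ring with an initially uneven workload.
ring = [[1, 3], [0, 2], [1, 3], [2, 0]]
print(diffuse([100.0, 20.0, 60.0, 20.0], ring))
# -> loads close to the average of 50 after a few dozen sweeps
```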

Relevance:

20.00%

Publisher:

Abstract:

This paper deals with the measure of Aspect Ratio for mesh partitioning and gives hints as to why, for certain solvers, the Aspect Ratio of partitions plays an important role. We define and rate different kinds of Aspect Ratio, present a new center-based partitioning method that optimizes this measure implicitly, and rate several existing partitioning methods and tools under the criterion of Aspect Ratio.
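
The paper considers several definitions of Aspect Ratio; one common surrogate for 2D subdomains, used below purely as an illustrative stand-in for whichever definition the authors adopt, compares a subdomain's perimeter with that of a circle of equal area (1.0 is perfectly round, larger values indicate elongated or ragged partitions).

```python
# Illustrative aspect-ratio surrogate for a 2D subdomain: perimeter of the
# subdomain divided by the perimeter of a circle with the same area.
# 1.0 is ideal; larger means a worse-shaped partition.
# This is one common choice, not necessarily the paper's definition.
import math

def aspect_ratio(area, perimeter):
    ideal_perimeter = 2.0 * math.sqrt(math.pi * area)
    return perimeter / ideal_perimeter

print(aspect_ratio(area=1.0, perimeter=4.0))    # unit square     -> ~1.13
print(aspect_ratio(area=1.0, perimeter=20.2))   # 10 x 0.1 strip  -> ~5.70
```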

Relevance:

20.00%

Publisher:

Abstract:

The central product of the DRAMA (Dynamic Re-Allocation of Meshes for parallel Finite Element Applications) project is a library comprising a variety of tools for dynamic re-partitioning of unstructured Finite Element (FE) applications. The input to the DRAMA library is the computational mesh, and corresponding costs, partitioned into sub-domains. The core library functions then perform a parallel computation of a mesh re-allocation that will re-balance the costs based on the DRAMA cost model. We discuss the basic features of this cost model, which allows a general approach to load identification, modelling and imbalance minimisation. Results from crash simulations are presented which show the necessity for multi-phase/multi-constraint partitioning components.
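
The DRAMA cost model itself is considerably richer (per-element calculation costs plus communication terms, possibly across several phases); the fragment below only sketches the general shape of such a model, with a combined cost per sub-domain and a load-imbalance figure that could drive re-partitioning. All names and weights are illustrative assumptions and are not the library's API.

```python
# Sketch of a load-balance cost model in the general spirit described above.
# Not the DRAMA library API; function names and weights are illustrative only.

def subdomain_cost(n_elements, halo_size, calc_per_elem=1.0, comm_per_halo=0.2):
    """Combined computation + communication cost of one sub-domain."""
    return calc_per_elem * n_elements + comm_per_halo * halo_size

def imbalance(costs):
    """Max/mean cost ratio; 1.0 means perfectly balanced."""
    return max(costs) / (sum(costs) / len(costs))

costs = [subdomain_cost(1000, 120), subdomain_cost(1400, 90), subdomain_cost(950, 150)]
print(costs, imbalance(costs))   # re-partition when the imbalance exceeds a tolerance
```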