928 resultados para Hypergraph Partitioning


Relevância: 10.00%

Resumo:

The Darwinian Particle Swarm Optimization (DPSO) is an evolutionary algorithm that extends Particle Swarm Optimization with natural selection to enhance the ability to escape from sub-optimal solutions. An extension of the DPSO to multi-robot applications has recently been proposed and denoted Robotic Darwinian PSO (RDPSO); it benefits from dynamic partitioning of the whole population of robots, thereby decreasing the amount of information that robots must exchange. This paper further extends the previously proposed algorithm by adapting the behavior of robots based on a set of context-based evaluation metrics. Those metrics are then used as inputs of a fuzzy system that systematically adjusts the RDPSO parameters (i.e., the outputs of the fuzzy system), thus improving its convergence rate and its robustness to obstacles and to communication constraints. The adapted RDPSO is evaluated on groups of physical robots and further explored using larger populations of simulated mobile robots in a larger scenario.
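The fuzzy adaptation step can be pictured in miniature. The rules, membership shapes and parameter range below are illustrative assumptions for a single PSO parameter (the inertia weight), not the paper's actual fuzzy system:

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def adapt_inertia(progress):
    """Map a normalized convergence metric (0 = stalled, 1 = improving)
    to a PSO inertia weight via two fuzzy rules and weighted-average
    defuzzification: stalled swarms explore (high w), improving swarms
    exploit (low w). Rule shapes and the 0.4-0.9 range are assumptions."""
    mu_stalled = triangular(progress, -0.5, 0.0, 0.6)
    mu_improving = triangular(progress, 0.4, 1.0, 1.5)
    w_explore, w_exploit = 0.9, 0.4
    total = mu_stalled + mu_improving
    return (mu_stalled * w_explore + mu_improving * w_exploit) / total if total else 0.7
```

In the RDPSO setting, metrics such as swarm progress or obstacle density would feed several such rule sets, one per adapted parameter.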

Relevância: 10.00%

Resumo:

Consider the problem of scheduling a set of sporadic tasks on a multiprocessor system to meet deadlines using a task-splitting scheduling algorithm. Task-splitting (also called semi-partitioning) algorithms assign most tasks to just one processor, but a few tasks are assigned to two or more processors, and they are dispatched in a way that ensures a task never executes on two or more processors simultaneously. One family of task-splitting algorithms, slot-based task-splitting dispatching, is of particular interest because of its ability to schedule tasks at high processor utilizations. Unfortunately, no slot-based task-splitting algorithm had been implemented in a real operating system so far. In this paper we discuss and propose modifications to the slot-based task-splitting algorithm driven by implementation concerns, and we report the first implementation of this family of algorithms in a real operating system, running Linux kernel version 2.6.34. We have also conducted an extensive range of experiments on a 4-core multicore desktop PC, running task sets with utilizations of up to 88%. The results show that the behavior of our implementation is in line with the theoretical framework behind it.
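The core invariant of slot-based dispatching — that a split task's two per-slot reserves never overlap in time — can be illustrated with a toy sketch. The slot layout below is a simplified assumption, not the paper's exact dispatcher:

```python
def split_reserves(S, x, y):
    """Within each timeslot of length S, a task split between processors
    p and p+1 gets a reserve of length y at the START of the slot on p+1
    and a reserve of length x at the END of the slot on p. Because
    [0, y) and [S - x, S) are disjoint whenever x + y <= S, the task can
    never execute on both processors at the same instant."""
    if x + y > S:
        raise ValueError("reserves would overlap within the slot")
    return (0.0, y), (S - x, S)
```

Keeping the two reserves at opposite ends of every slot is what lets the dispatcher avoid any cross-processor synchronization at run time.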

Relevância: 10.00%

Resumo:

This paper studies static-priority preemptive scheduling on a multiprocessor using partitioned scheduling. We propose a new scheduling algorithm and prove that if the proposed algorithm is used and less than 50% of the capacity is requested, then all deadlines are met. It is known that for every static-priority multiprocessor scheduling algorithm there is a task set that misses a deadline although the requested capacity is arbitrarily close to 50%.
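A standard baseline for partitioned static-priority scheduling is first-fit with the classical Liu-Layland rate-monotonic test on each processor. The sketch below shows that baseline, not the paper's new algorithm:

```python
def first_fit_rm(utils, m):
    """Assign each task (given by its utilization) to the first of m
    processors on which the Liu-Layland bound n * (2^(1/n) - 1) still
    holds for the n tasks placed there. Returns the partition, or None
    if some task fits nowhere (the task set is not accepted)."""
    procs = [[] for _ in range(m)]
    for u in sorted(utils, reverse=True):        # decreasing utilization
        for p in procs:
            n = len(p) + 1
            if sum(p) + u <= n * (2 ** (1 / n) - 1):
                p.append(u)
                break
        else:
            return None
    return procs
```

Schemes like this motivate the 50% limit in the abstract: any partitioned static-priority algorithm can be forced to miss deadlines once the requested capacity approaches half the platform.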

Relevância: 10.00%

Resumo:

Heterogeneous multicore platforms are becoming an interesting alternative for embedded computing systems with a limited power supply, as they can execute specific tasks in an efficient manner. Nonetheless, one of the main challenges of such platforms is optimising the energy consumption in the presence of temporal constraints. This paper addresses the problem of task-to-core allocation onto heterogeneous multicore platforms such that the overall energy consumption of the system is minimised. To this end, we propose a two-phase approach that considers both dynamic and leakage energy consumption: (i) the first phase allocates tasks to cores such that the dynamic energy consumption is reduced; (ii) the second phase refines the allocation performed in the first phase in order to achieve better sleep states, trading off dynamic energy consumption against the reduction in leakage energy consumption. This hybrid approach considers core frequency set-points, task energy consumption and the sleep states of the cores to reduce the energy consumption of the system. Major value has been placed on a realistic power model, which increases the practical relevance of the proposed approach. Finally, extensive simulations have been carried out to demonstrate the effectiveness of the proposed algorithm. In the best case, energy savings of up to 18% are achieved over the first-fit algorithm, which previous work has shown to perform better than other bin-packing heuristics for the target heterogeneous multicore platform.
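The two-phase structure can be sketched with a toy cost model. Everything here — the dictionaries, the per-core leakage figures, the greedy refinement — is an illustrative assumption, not the paper's algorithm or power model:

```python
def two_phase_allocate(dyn, leakage):
    """dyn[t][c]: dynamic energy of task t on core c; leakage[c]: energy
    core c wastes when it cannot sleep. Phase 1 greedily minimizes dynamic
    energy per task; phase 2 tries to empty each used core, accepting the
    move only when the leakage saved by letting that core sleep exceeds
    the extra dynamic energy of relocating its tasks."""
    # Phase 1: per-task dynamic-energy minimization
    alloc = {t: min(costs, key=costs.get) for t, costs in dyn.items()}
    # Phase 2: refine -- try to empty each used core so it can sleep
    for c in set(alloc.values()):
        tasks_on_c = [t for t, a in alloc.items() if a == c]
        moves, extra = {}, 0.0
        for t in tasks_on_c:
            others = {k: v for k, v in dyn[t].items() if k != c}
            best = min(others, key=others.get)   # cheapest alternative core
            moves[t] = best
            extra += others[best] - dyn[t][c]
        if extra < leakage.get(c, 0.0):          # sleeping c is worth it
            alloc.update(moves)
    return alloc
```

With high leakage on core A, both tasks migrate so A can sleep; with low leakage, phase 1's allocation stands.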

Relevância: 10.00%

Resumo:

Sorption is commonly agreed to be the major process underlying the transport and fate of polycyclic aromatic hydrocarbons (PAHs) in soils. However, studies focusing on spatial variability, at the field scale in particular, are still scarce. In order to investigate the field-scale variation of phenanthrene sorption, bulk topsoil samples were taken in a 15 × 15-m grid from the plough layer in two sandy loam fields with different texture and organic carbon (OC) contents (140 samples in total). Batch experiments were performed using the adsorption method. Values for the partition coefficient Kd (L kg⁻¹) and the organic-carbon partition coefficient KOC (L kg⁻¹) agreed with the most frequently used models for PAH partitioning, as OC revealed the higher affinity for sorption. More complex models treating different OC compartments separately, such as non-complexed organic carbon (NCOC) and complexed organic carbon (COC), performed better than single-KOC models, particularly for a subset of samples with Dexter n < 10 and OC < 0.04 kg kg⁻¹. The selected threshold revealed that KOC-based models are applicable to the more organic field, while two-component models are more accurate for predicting Kd and the retardation factor (R) in less organic soils. Moreover, OC did not fully reflect the changes in phenanthrene retardation in the field with lower OC content (Faardrup). Bulk density and available water content also influenced phenanthrene transport.
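The quantities in this abstract are linked by standard relations: the single-KOC linear model Kd = KOC · fOC and the retardation factor R = 1 + (ρb/θ) · Kd. The numbers below are made up for illustration, not the study's data:

```python
def kd_from_koc(koc, f_oc):
    """Single-KOC linear partitioning model: Kd = KOC * f_OC,
    where f_OC is the mass fraction of organic carbon."""
    return koc * f_oc

def retardation(kd, bulk_density, theta):
    """Retardation factor for saturated transport:
    R = 1 + (rho_b / theta) * Kd,
    with bulk density rho_b (kg L^-1) and volumetric water content theta."""
    return 1.0 + (bulk_density / theta) * kd
```

For example, a soil with fOC = 0.02 and KOC = 10⁴ L kg⁻¹ gives Kd = 200 L kg⁻¹; with ρb = 1.5 and θ = 0.3 this retards phenanthrene by a factor of about a thousand relative to the water front.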

Relevância: 10.00%

Resumo:

Brain metastases (BM) occur in 20-50% of NSCLC and 50-80% of SCLC. In this review, we look at evidence-based medicine data and give some perspectives on the management of BM. We address the problems of multiple BM, single BM and prophylactic cranial irradiation. Recursive Partitioning Analysis (RPA) is a powerful prognostic tool to facilitate treatment decisions. Dealing with multiple BM, the use of corticosteroids was established more than 40 years ago by a unique randomized controlled trial (RCT). The palliative effect is high (~80%), as are the side-effects. Whole-brain radiotherapy (WBRT) was evaluated in many RCTs with a high (60-90%) response rate; several RT regimens are equivalent, but very high doses per fraction should be avoided. In multiple BM from SCLC, the effect of WBRT is comparable to that in NSCLC, but chemotherapy, although advocated, is probably less effective than RT. Single BM from NSCLC occur in 30% of all BM cases; several prognostic classifications, including RPA, are very useful. Several options are available for single BM: WBRT, surgery (SX), radiosurgery (RS) or any combination of these. All were studied in RCTs and are reviewed here: adding WBRT to SX or RS gives better neurological tumour control, has little or no impact on survival, and may be more toxic. However, omitting WBRT after SX alone gives a higher risk of cerebrospinal fluid dissemination. Prophylactic cranial irradiation (PCI) has a major role in SCLC. In limited disease, meta-analyses have shown a positive impact of PCI on the decrease of brain relapse and on survival improvement, especially for patients in complete remission. Surprisingly, this has recently been confirmed also in extensive disease. Experience with PCI in NSCLC is still limited, but RCTs suggest a reduction of BM with no impact on survival. The toxicity of PCI is a matter of debate, as neurological or neuro-cognitive impairment is already present prior to PCI in almost half of the patients.
However, RT toxicity is probably related to total dose and dose per fraction. Perspectives: future research should concentrate on 1) combined modalities in multiple BM; 2) exploration of treatments in oligo-metastases; 3) further exploration of PCI in NSCLC; 4) exploration of new, toxicity-sparing radiotherapy techniques (IMRT, tomotherapy, etc.).

Relevância: 10.00%

Resumo:

All-electron partitioning of wave functions into products Ψcore·Ψval of core and valence parts in orbital space results in the loss of core-valence antisymmetry, uncorrelated motion of core and valence electrons, and core-valence overlap. These effects are studied with the variational Monte Carlo method using appropriately designed wave functions for the first-row atoms and positive ions. It is shown that the loss of antisymmetry with respect to the interchange of core and valence electrons is the dominant effect, which increases rapidly through the row, while the effect of core-valence uncorrelation is generally smaller. Orthogonality of the core and valence parts partially substitutes for the exclusion principle and is absolutely necessary for meaningful calculations with partitioned wave functions. Core-valence overlap may lead to nonsensical values of the total energy. It has been found that even relatively crude core-valence partitioned wave functions can generally estimate ionization potentials with better accuracy than traditional, non-partitioned ones, provided that they achieve maximum separation (independence) of the core and valence shells accompanied by high internal flexibility of Ψcore and Ψval. Our best core-valence partitioned wave function of this kind estimates the IPs with an accuracy comparable to the most accurate theoretical determinations in the literature.

Relevância: 10.00%

Resumo:

Thylakoid membrane fractions were prepared from specific regions of the thylakoid membranes of spinach (Spinacia oleracea). These fractions, which include grana (B3), stroma (T3), grana core (BS), margins (Ma) and purified stroma (Y100), were prepared using a non-detergent method involving mild sonication and aqueous two-phase partitioning. The significance of PSIIα and PSIIβ centres has been described extensively in the literature. Previous work has characterized two types of PSII centres that are proposed to exist in different regions of the thylakoid membrane: α-centres are suggested to aggregate in the stacked regions of the grana, whereas β-centres are located in the unstacked regions of the stroma lamellae. The goal of this study is to characterize photosystem II in the isolated membrane vesicles representing different regions of the higher-plant thylakoid membrane. The low-temperature absorption spectra were deconvoluted via Gaussian decomposition to estimate the relative sub-components that contribute to each fraction's signature absorption spectrum. The relative sizes of the functional PSII antenna and the fluorescence induction kinetics were measured and used to determine the relative contributions of PSIIα and PSIIβ to each fraction. Picosecond chlorophyll fluorescence decay kinetics were collected for each fraction to characterize and gain insight into excitation energy transfer and primary electron transport in PSIIα and PSIIβ centres. The results presented here clearly illustrate the widely held notions of PSII/PSI and PSIIα/PSIIβ spatial separation. This study suggests that chlorophyll fluorescence decay lifetimes of PSIIβ centres are shorter than those of PSIIα centres and that, at FM, the longer-lived of the two PSII components renders a larger yield in PSIIα-rich fractions but a smaller one in PSIIβ-rich fractions.

Relevância: 10.00%

Resumo:

The manipulation of large (>10 kb) plasmid systems amplifies problems common to traditional cloning strategies. Unique or rare restriction enzyme recognition sequences are uncommon and very rarely located in opportune locations. Making site-specific deletions and insertions in larger plasmids consequently leads to multiple-step cloning strategies that are often limited by time-consuming, low-efficiency linker insertions or blunt-end cloning strategies. Manipulation of the adenovirus genome, and of the genomes of other viruses maintained as bacterial plasmids, typifies such situations. Recombinational cloning techniques based on homologous recombination in Saccharomyces cerevisiae have been developed that circumvent many of these common problems. However, these techniques are rarely realistic options for such large plasmid systems because of the above-mentioned difficulties associated with adding the required yeast DNA replication, partitioning and selectable-marker sequences. To determine whether recombinational cloning techniques could be modified to simplify the manipulation of such a large plasmid system, a recombinational cloning system for the creation of human adenovirus E1-deletion rescue plasmids was developed. Here we report for the first time that the 1,456 bp TRP1/ARS fragment of YRp7 is alone sufficient to support successful recombinational cloning without additional partitioning sequences, using only slight modifications of existing protocols. In addition, we describe conditions for efficient recombinational cloning involving the simultaneous deletion of large segments of DNA (>4.2 kb) and insertion of donor-fragment DNA using only a single non-unique restriction site.
The discovery that recombinational cloning can support large deletions has been used to develop a novel technique, selectable-marker 'knockout' recombinational cloning, which uses deletion of a yeast selectable marker coupled with simultaneous negative and positive selection to reduce background transformants to undetectable levels. The modification of existing protocols described in this report facilitates the use of recombinational cloning strategies that are otherwise difficult or impractical for large plasmid systems. Improvements to general recombinational cloning strategies, and to strategies specific to the manipulation of the adenovirus genome, are considered in light of the data presented herein.

Relevância: 10.00%

Resumo:

We developed the concept of split-τ to deal with large molecules (in terms of the number of electrons and the nuclear charge Z). This naturally leads to partitioning the local energy into components due to each electron shell. Minimization of the variance of the valence-shell local energy is used to optimize a simple two-parameter CuH wave function. Molecular properties (spectroscopic constants and the dipole moment) are calculated for the optimized and nearly optimized wave functions using the variational quantum Monte Carlo method. Our best results are comparable to those from the single and double configuration interaction (SDCI) method.
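The idea of optimizing a wave function by minimizing the variance of a local energy can be seen in the simplest possible setting. The hydrogenic example below is a textbook illustration of that principle, not the thesis's shell-partitioned CuH wave function:

```python
def local_energy(r, alpha, Z=1.0):
    """Local energy E_L = (H psi)/psi for the trial function
    psi(r) = exp(-alpha * r) in a hydrogenic atom of charge Z:
    E_L(r) = -alpha**2 / 2 + (alpha - Z) / r.
    At alpha = Z the local energy is the constant -Z**2 / 2, so its
    variance over any sample vanishes -- the signature of an exact
    trial function that variance minimization exploits."""
    return -0.5 * alpha ** 2 + (alpha - Z) / r

def variance(alpha, samples, Z=1.0):
    """Population variance of the local energy over fixed sample radii."""
    e = [local_energy(r, alpha, Z) for r in samples]
    mean = sum(e) / len(e)
    return sum((x - mean) ** 2 for x in e) / len(e)
```

Scanning `alpha` and picking the minimum-variance value recovers the exact parameter here; the split-τ scheme applies the same criterion shell by shell.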

Relevância: 10.00%

Resumo:

In Part I, theoretical derivations for variational Monte Carlo calculations are compared with results from a numerical calculation on He; both indicate that minimization of the ratio estimate of Evar, denoted EMC, provides different optimal variational parameters than does minimization of the variance of EMC. Similar derivations for diffusion Monte Carlo calculations provide a theoretical justification for empirical observations made by other workers. In Part II, importance sampling in prolate spheroidal coordinates allows Monte Carlo calculations of Evar to be made for the van der Waals molecule He2, using a simplifying partitioning of the Hamiltonian and both an HF-SCF and an explicitly correlated wavefunction. Improvements are suggested which would permit extending the computational precision to the point where an estimate of the interaction energy could be made.

Relevância: 10.00%

Resumo:

Soybean (Glycine max (L.) Merr. cv. Harosoy 63) plants inoculated with Rhizobium japonicum were grown in vermiculite in the presence or absence of nitrate fertilization for up to 6 weeks after planting. Overall growth of nodulated plants was enhanced in the presence of nitrate fertilization, while the extent of nodule development was reduced. Although the number of nodules was not affected by nitrate fertilization when plants were grown at a light intensity limiting for photosynthesis, at light intensities approaching or exceeding the light saturation point for photosynthesis nitrate fertilization resulted in at least a 30% reduction in nodule numbers. The mature first trifoliate leaf of 21-day-old plants was allowed to photoassimilate ¹⁴CO₂. One hour after the initial exposure to ¹⁴CO₂, the plants were harvested and the ¹⁴C radioactivity was determined in the 80% ethanol-soluble fraction in order to assess the extent of photoassimilate export and the pattern of distribution of exported ¹⁴C. The magnitude of ¹⁴C export was not affected by the presence of nitrate fertilization. However, there was a significant effect on the distribution pattern, particularly with regard to the partitioning of ¹⁴C-photosynthate between the nodules and the root tissue. In the presence of nitrate fertilization, less than 6% of the exported ¹⁴C-photosynthate was recovered from the nodules, with much larger amounts (approximately 37%) recovered from the root tissue. In the absence of nitrate fertilization, recovery of exported ¹⁴C-photosynthate from the nodules (19 to 27%) was approximately equal to that from the root tissue (24 to 33%). By initiating or terminating the application of nitrate at 14 days of age, it was determined that the period from day 14 to day 21 after planting was particularly significant for the development of nodules initiated earlier.
Addition of nitrate fertilization at this time inhibited further nodule development while stimulating plant growth, whereas removal of nitrate fertilization stimulated nodule development. The results obtained are consistent with the hypothesis that nodule development is inhibited by nitrate fertilization through a reduction in the availability of photosynthate to the nodules.

Relevância: 10.00%

Resumo:

Diatoms are renowned for their robust ability to perform NPQ (Non-Photochemical Quenching of chlorophyll fluorescence) as a dissipative response to heightened light stress on photosystem II, plausibly explaining their dominance over other algal groups in turbulent light environs. Their NPQ mechanism has been principally attributed to a xanthophyll cycle involving the lumenal pH regulated reversible de-epoxidation of diadinoxanthin. The principal goal of this dissertation is to reveal the physiological and physical origins and consequences of the NPQ response in diatoms during short-term transitions to excessive irradiation. The investigation involves diatom species from different originating light environs to highlight the diversity of diatom NPQ and to facilitate the detection of core mechanisms common among the diatoms as a group. A chiefly spectroscopic approach was used to investigate NPQ in diatom cells. Prime methodologies include: the real time monitoring of PSII excitation and de-excitation pathways via PAM fluorometry and pigment interconversion via transient absorbance measurements, the collection of cryogenic absorbance spectra to measure pigment energy levels, and the collection of cryogenic fluorescence spectra and room temperature picosecond time resolved fluorescence decay spectra to study excitation energy transfer and dissipation. Chemical inhibitors that target the trans-thylakoid pH gradient, the enzyme responsible for diadinoxanthin de-epoxidation, and photosynthetic electron flow were additionally used to experimentally manipulate the NPQ response. Multifaceted analyses of the NPQ responses from two previously un-photosynthetically characterised species, Nitzschia curvilineata and Navicula sp., were used to identify an excitation pressure relief ‘strategy’ for each species. 
Three key areas of NPQ were examined: (i) the NPQ activation/deactivation processes, (ii) how NPQ affects the collection, dissipation, and usage of absorbed light energy, and (iii) the interdependence of NPQ and photosynthetic electron flow. It was found that Nitzschia cells regulate excitation pressure via performing a high amplitude, reversible antenna based quenching which is dependent on the de-epoxidation of diadinoxanthin. In Navicula cells excitation pressure could be effectively regulated solely within the PSII reaction centre, whilst antenna based, diadinoxanthin de-epoxidation dependent quenching was implicated to be used as a supplemental, long-lasting source of excitation energy dissipation. These strategies for excitation balance were discussed in the context of resource partitioning under these species’ originating light climates. A more detailed investigation of the NPQ response in Nitzschia was used to develop a comprehensive model describing the mechanism for antenna centred non-photochemical quenching in this species. The experimental evidence was strongly supportive of a mechanism whereby: an acidic lumen triggers the diadinoxanthin de-epoxidation and protonation mediated aggregation of light harvesting complexes leading to the formation of quencher chlorophyll a-chlorophyll a dimers with short-lived excited states; quenching relaxes when a rise in lumen pH triggers the dispersal of light harvesting complex aggregates via deprotonation events and the input of diadinoxanthin. This model may also be applicable for describing antenna based NPQ in other diatom species.

Relevância: 10.00%

Resumo:

This thesis presents an implementation of lazy task creation for distributed-memory multiprocessor systems. It offers a subset of the functionality of the Message-Passing Interface and makes it possible to parallelize certain problems that are difficult to partition statically, thanks to a dynamic partitioning and load-balancing system. To do so, it builds on the Multilisp language, a dialect of Scheme oriented toward parallel processing, and implements on top of it an MPI-like interface enabling multi-process distributed computing. This system offers a language far richer and more expressive than C and considerably reduces the work a programmer must do to develop programs equivalent to their MPI counterparts. Finally, dynamic partitioning makes it possible to design programs that would be very complex to build with MPI. Tests were run on a local 16-processor system and on a 16-processor cluster; the system delivers good speed-ups compared with equivalent sequential programs, as well as acceptable performance relative to MPI. This thesis demonstrates that using futures as a dynamic partitioning technique is feasible on distributed-memory multiprocessors.
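The thesis builds on Multilisp's `(future e)` construct. A rough Python analogue of future-based dynamic partitioning looks like the sketch below — illustrative only: the cutoff, pool size and function names are assumptions, and Python threads lack Multilisp's true lazy task creation:

```python
from concurrent.futures import ThreadPoolExecutor

def fib_seq(n):
    """Sequential fallback used below the spawning cutoff."""
    return n if n < 2 else fib_seq(n - 1) + fib_seq(n - 2)

def pfib(n, pool, cutoff=10):
    """Future-style parallel Fibonacci: spawn one branch as a future,
    compute the other branch locally, then 'touch' the future with
    .result(). Small subproblems run sequentially so task-creation
    overhead stays bounded -- the point of lazy task creation."""
    if n < cutoff:
        return fib_seq(n)
    f = pool.submit(pfib, n - 1, pool, cutoff)   # like (future (pfib (- n 1)))
    right = pfib(n - 2, pool, cutoff)
    return f.result() + right                    # touching the future

with ThreadPoolExecutor(max_workers=32) as pool:
    result = pfib(15, pool)
```

On a distributed-memory machine the `submit` call would instead enqueue the task for a remote node, with load balancing deciding where it actually runs.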

Relevância: 10.00%

Resumo:

Semantic role annotation is the task of assigning role labels such as Agent, Patient, Instrument, Location, Destination, etc. to the various participants — actants or circumstants (arguments or adjuncts) — of a predicative lexical unit. This task requires rich lexical resources or large corpora of sentences annotated manually by linguists, on which certain automation approaches (statistical or machine learning) can rely. Earlier work in this area has focused mainly on English, which has rich resources such as PropBank, VerbNet and FrameNet that have fed automated annotation systems. Annotation in other languages, for which no manually annotated corpus is available, often relies on the English FrameNet. A resource like the English FrameNet is all but indispensable for automated annotation systems, and the manual annotation of thousands of sentences by linguists is a tedious and time-consuming task. In this thesis we propose an automatic system to assist linguists in this task, allowing them to limit themselves to validating the annotations the system proposes. In our work we consider only verbs, which are more likely than nouns to be accompanied by actants realized in sentences. These verbs are specialized terms from computing and the Internet (e.g. accéder, configurer, naviguer, télécharger) whose actantial structure has been manually enriched with semantic roles. The actantial structure of the verbal lexical units is described according to the principles of Mel'čuk's Explanatory Combinatorial Lexicology (ECL) and draws partly (as far as semantic roles are concerned) on the notion of Frame Element as described in Fillmore's Frame Semantics (FS).
What these two theories have in common is that both lead to the construction of dictionaries different from those produced by traditional approaches. The computing and Internet verbal lexical units that were annotated manually in several contexts constitute our specialized corpus. Our system, which automatically assigns semantic roles to actants, is based on rules, or on classifiers trained on more than 2,300 contexts. We are restricted to a short list of roles because some roles in our corpus do not have enough manually annotated examples. In our system we handled only the roles Patient, Agent and Destination, each with more than 300 examples. We created a class named Other, grouping the remaining roles, each with fewer than 100 annotated examples. We subdivided the annotation task into subtasks: identifying the actant and circumstant participants, and assigning semantic roles only to the actants that contribute to the meaning of the verbal lexical unit. We ran the sentences of our corpus through the Syntex parser to extract the syntactic information describing the various participants of a verbal lexical unit in a sentence. This information served as features in our learning model. We proposed two techniques for participant identification: a rule-based technique, for which we extracted some thirty rules, and a machine-learning technique. The same techniques were used for the task of distinguishing actants from circumstants. For assigning semantic roles to actants, we proposed a semi-supervised clustering method over instances, which we compared with semantic role classification. We used CHAMÉLÉON, an agglomerative hierarchical algorithm.
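The rule-based side of such a system can be pictured with a toy rule set. The features and rules below are invented for illustration; the thesis's thirty-odd rules operate on real Syntex parses:

```python
def assign_role(dep):
    """Assign one of {Agent, Patient, Destination, Other} to a verb
    dependent from two toy syntactic features: its relation to the verb
    and, for prepositional phrases, the preposition."""
    rel = dep.get("rel")
    if rel == "subject":
        return "Agent"
    if rel == "object":
        return "Patient"
    if rel == "pp" and dep.get("prep") in {"to", "into", "onto"}:
        return "Destination"
    return "Other"   # catch-all class for roles with too few examples
```

For a sentence like "download the file to the server", the subject would be labelled Agent, "the file" Patient, and the `to`-PP Destination; everything else falls into the Other class, mirroring the thesis's restricted role list.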