908 results for Task-to-core mapping
Abstract:
Adaptation and reproductive isolation, the engines of biological diversity, are still elusive when discussing the genetic bases of speciation. Namely, the number of genes and the magnitude of selection acting positively or negatively on genomic traits implicated in speciation are contentious. Here, we describe the first steps of an ongoing research program aimed at understanding the genetic bases of population divergence and reproductive isolation in the lake whitefish (Coregonus clupeaformis). A preliminary linkage map originating from a hybrid cross between dwarf and normal ecotypes is presented, whereby some of the segregating AFLP markers were found to be conserved among natural populations. Maximum likelihood was used to estimate hybrid indices from non-diagnostic markers at 998 AFLP loci. This allowed identification of the most likely candidate loci that have been under the influence of selection during the natural hybridisation of whitefish originating from different glacial races. As some of these loci could be identified on the linkage map, the possibility that selection of traits in natural populations may eventually be correlated to specific chromosomal regions was demonstrated. The future prospects and potential of these approaches to elucidate the genetic bases of adaptation and reproductive isolation among sympatric ecotypes of lake whitefish are discussed.
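A maximum-likelihood hybrid index of the kind described can be illustrated as a one-parameter grid search over the admixture proportion. The dominance model, function name, and allele frequencies below are illustrative assumptions, not the authors' code or data:

```python
import numpy as np

def hybrid_index(bands, p_race1, p_race2, grid=1001):
    """Grid-search ML estimate of a hybrid index from dominant
    (presence/absence) AFLP markers. Illustrative sketch only.

    bands     : 0/1 array, band presence in one individual
    p_race1/2 : band-allele frequencies in the two parental races
    Returns h in [0, 1]; h = 1 means pure race-1 ancestry.
    """
    bands = np.asarray(bands)
    p1, p2 = np.asarray(p_race1), np.asarray(p_race2)
    best_h, best_ll = 0.0, -np.inf
    for h in np.linspace(0.0, 1.0, grid):
        q = h * p1 + (1 - h) * p2           # mixture allele frequency
        pres = 1.0 - (1.0 - q) ** 2         # P(band) under dominance
        pres = np.clip(pres, 1e-9, 1 - 1e-9)
        ll = np.sum(bands * np.log(pres) + (1 - bands) * np.log(1 - pres))
        if ll > best_ll:
            best_h, best_ll = h, ll
    return best_h
```

For an individual whose band pattern matches race 1 at most loci, the estimate lands near 1; for a race-2-like pattern, near 0.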
Abstract:
Accurate habitat mapping is critical to landscape ecological studies such as those required for developing and testing Montreal Process indicator 1.1e, fragmentation of forest types. This task poses a major challenge to remote sensing, especially in mixed-species, variable-age forests such as the dry eucalypt forests of subtropical eastern Australia. In this paper, we apply an innovative approach that uses a small section of one-metre resolution airborne data to calibrate a moderate spatial resolution model (30 m resolution; scale 1:50 000) based on Landsat Thematic Mapper data to estimate canopy structural properties in St Marys State Forest, near Maryborough, south-eastern Queensland. The approach applies an image-processing model that assumes each image pixel is significantly larger than individual tree crowns and gaps to estimate crown-cover percentage, stem density and mean crown diameter. These parameters were classified into three discrete habitat classes to match the ecology of four exudivorous arboreal species (yellow-bellied glider Petaurus australis, sugar glider P. breviceps, squirrel glider P. norfolcensis, and feathertail glider Acrobates pygmaeus), and one folivorous arboreal marsupial, the greater glider Petauroides volans. These species were targeted due to their known ecological preference for old trees with hollows, and differences in their home range requirements. The overall mapping accuracy, visually assessed against transects (n = 93) interpreted from a digital orthophoto and validated in the field, was 79% (KHAT statistic = 0.72). The KHAT statistic serves as an indicator of the extent to which the percentage-correct values of the error matrix are due to 'true' agreement versus 'chance' agreement. This means that we are able to reliably report on the effect of habitat loss on target species, especially those with a large home range size (e.g. yellow-bellied glider). However, the classified habitat map failed to accurately capture the spatial patterning (e.g. patch size and shape) of stands with a trace or sub-dominance of senescent trees. This outcome makes the reporting of the effects of habitat fragmentation more problematic, especially for species with a small home range size (e.g. feathertail glider). With further model refinement and validation, however, this moderate-resolution approach offers an important, cost-effective advancement in mapping the age of dry eucalypt forests in the region.
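The KHAT statistic reported above is Cohen's kappa computed from the classification error matrix: observed agreement corrected for the agreement expected by chance. A minimal sketch, using a hypothetical 3-class habitat error matrix rather than the study's data:

```python
import numpy as np

def khat(error_matrix):
    """Cohen's kappa (KHAT): agreement corrected for chance.

    error_matrix[i][j] = number of samples whose true class is j
    that were assigned to class i.
    """
    m = np.asarray(error_matrix, dtype=float)
    n = m.sum()
    observed = np.trace(m) / n                                # overall accuracy
    chance = (m.sum(axis=0) * m.sum(axis=1)).sum() / n ** 2   # chance agreement
    return (observed - chance) / (1.0 - chance)

# Hypothetical 3-class habitat confusion matrix (not the paper's data)
cm = [[40, 3, 2],
      [4, 25, 3],
      [1, 2, 13]]
print(round(khat(cm), 2))  # -> 0.74
```

A kappa near 0 means the classifier does no better than chance; values above roughly 0.7, as in the study, indicate substantial agreement.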
What's law got to do with it? Mapping modern mediation movements in civil & common law jurisdictions
Abstract:
This study investigated the influence of a concurrent cognitive task on the compensatory stepping response in balance-impaired elders and the attentional demand of the stepping response. Kinetic, kinematic and neuromuscular measures of a forward recovery step were investigated in 15 young adults, 15 healthy elders and 13 balance-impaired elders in a single-task (postural recovery only) and dual-task (postural recovery and vocal reaction time task) situation. Results revealed that reaction times were longer in all subjects when performed concurrently with a compensatory step, longer for a step than for an in-place response, and longer for balance-impaired older adults compared with young adults. An interesting finding was that the latter group difference may be related to prioritization between the two tasks rather than attentional demand, as the older adults completed the step before the reaction time task, whereas the young adults could perform both concurrently. Few differences in step characteristics were found between tasks, with the most notable being a delayed latency and reduced magnitude of the early automatic postural response in healthy and balance-impaired elders with a concurrent task. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
A conserved helical peptide vaccine candidate from the M protein of group A streptococci, p145, has been described. Minimal epitopes within p145 have been defined, and an epitope recognized by protective antibodies, but not by autoreactive T cells, has been identified. When administered to mice, p145 has low immunogenicity: many boosts of peptide are required to achieve a high antibody titre (>12 800). To attempt to overcome this low immunogenicity, lipid-core peptide technology was employed. Lipid-core peptides (LCP) consist of an oligomeric polylysine core, with multiple copies of the peptide of choice, conjugated to a series of lipoamino acids, which acts as an anchor for the antigen. Seven different LCP constructs based on the p145 peptide sequence were synthesized (LCP1 to LCP7) and the immunogenicity of the compounds examined. The most immunogenic constructs contained the longest alkyl side-chains. The number of lipoamino acids in the constructs affected immunogenicity, and spacing between the alkyl side-chains increased it. An increase in immunogenicity (enzyme-linked immunosorbent assay (ELISA) titres) of up to 100-fold was demonstrated using this technology, and some constructs without adjuvant were more immunogenic than p145 administered with complete Freund's adjuvant (CFA). The fine specificity of the induced antibody response differed between constructs, but one construct, LCP4, induced antibodies of identical fine specificity to those found in endemic human serum. Opsonic activity of LCP4 antisera was more than double that of p145 antisera. These data show the potential of LCP technology both to enhance the immunogenicity of complex peptides and to focus the immune response towards or away from critical epitopes.
Abstract:
The use of thermodilution and other methods of monitoring in dogs during surgery and critical care was evaluated. Six Greyhounds were anaesthetised and then instrumented by placing a thermodilution catheter into the pulmonary artery via the jugular vein. A catheter in the dorsal pedal artery also permitted direct measurement of arterial pressures. Core body temperature (°C) and central venous pressure (mmHg) were measured, while cardiac output (mL/min/kg) and mean arterial pressure (mmHg) were calculated. A mid-line surgical incision was performed and the physiological parameters were monitored for a total of two hours. All physiological parameters generally declined, although significant increases (P<0.05) were noted for cardiac output following surgical incision. Central venous pressure was maintained at approximately 0 mmHg by controlling an infusion of sterile saline. Core body temperature decreased from 37.1 ± 0.6 °C (once instrumented) to 36.6 ± 0.6 °C (at the end of the study), despite warming using heating pads. Physiological parameters indicative of patient viability will generally decline during surgery without intervention. This study describes an approach that can be undertaken in veterinary hospitals to accurately monitor vital signs in surgical and critical care patients.
Abstract:
The ability to generate enormous random libraries of DNA probes via split-and-mix synthesis on solid supports is an important biotechnological application of colloids that has not been fully utilized to date. To discriminate between colloid-based DNA probes, each colloidal particle must be 'encoded' so that it is distinguishable from all other particles. To this end, we have used novel particle synthesis strategies to produce large numbers of optically encoded particles suitable for DNA library synthesis. Multifluorescent particles with unique and reproducible optical signatures (i.e., fluorescence and light-scattering attributes) suitable for high-throughput flow cytometry have been produced. In the spectroscopic study presented here, we investigated the optical characteristics of multifluorescent particles that were synthesized by coating silica 'core' particles with up to six different fluorescent dye shells alternated with non-fluorescent silica 'spacer' shells. It was observed that the diameter of the particles increased by up to 20% as a result of the addition of twelve concentric shells, and that there was a significant reduction in fluorescence emission intensities from inner shells as an increasing number of shells were deposited.
Abstract:
When the Arab Spring broke out, the United States was in a quandary over how to handle the crisis in its attempt to balance its moral obligations and ideals without undercutting its strategic interests and those of its close allies. Flaws in the US diplomatic approach have contributed to one of the most serious foreign policy crises for a US administration to date, with consequential upheaval and erosion of the US-built balance of power. The reactions and policy responses of the Obama administration highlight the difficulties in coming to grips with the new reality in the Middle East and in enunciating a policy platform that could combine American interests and values.
Abstract:
Since collaborative networked organisations are usually formed by independent and heterogeneous entities, it is natural that each member holds its own set of values, and that conflicts among partners might emerge from some misalignment of those values. Indeed, it is often stated in the literature that alignment between the value systems of members involved in collaborative processes is a prerequisite for successful co-working. As a result, the issue of core-value alignment in collaborative networks has started to attract attention. However, methods to analyse such alignment are lacking, mainly because the concept of 'alignment' in this context is still ill-defined and shows a multifaceted nature. As a contribution to the area, this article introduces an approach based on causal models and graph theory for the analysis of core-value alignment in collaborative networks. The potential application of the approach is then discussed in the context of virtual organisations' breeding environments.
Abstract:
Collaborative networks are typically formed by heterogeneous and autonomous entities, and thus it is natural that each member has its own set of core values. Since these values somehow drive the behaviour of the involved entities, the ability to quickly identify partners with compatible or common core values represents an important element for the success of collaborative networks. However, tools to assess or measure the level of alignment of core values are lacking. Since the concept of 'alignment' in this context is still ill-defined and shows a multifaceted nature, three perspectives are discussed. The first one uses a causal-maps approach in order to capture, structure, and represent the influence relationships among core values. This representation provides the basis to measure the alignment in terms of the structural similarity and influence among value systems. The second perspective considers the compatibility and incompatibility among core values in order to define the alignment level. Under this perspective we propose a fuzzy inference system to estimate the alignment level, since this approach allows dealing with variables that are vaguely defined and whose inter-relationships are difficult to define. Another advantage provided by this method is the possibility to incorporate expert human judgment in the definition of the alignment level. The last perspective uses a Bayesian belief network method, selected in order to assess the alignment level based on members' past behaviour. An example of application is presented in which the details of each method are discussed.
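The fuzzy-inference perspective described above can be illustrated with a minimal Mamdani-style sketch: crisp compatibility and incompatibility scores are fuzzified, a small rule base is fired, and a weighted average defuzzifies the result. The membership functions, rule base, and output values below are invented for illustration and are not taken from the article:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def alignment_level(compat, incompat):
    """Toy fuzzy estimate of core-value alignment.

    compat, incompat in [0, 1]; hypothetical rule base:
      high compat & low incompat  -> high alignment
      medium both                 -> medium alignment
      low compat & high incompat  -> low alignment
    Defuzzified as a weighted average of rule outputs.
    """
    low  = lambda x: tri(x, -0.5, 0.0, 0.5)
    med  = lambda x: tri(x, 0.0, 0.5, 1.0)
    high = lambda x: tri(x, 0.5, 1.0, 1.5)
    rules = [  # (rule firing strength, crisp output of that rule)
        (min(high(compat), low(incompat)), 1.0),
        (min(med(compat),  med(incompat)), 0.5),
        (min(low(compat),  high(incompat)), 0.0),
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.5

print(round(alignment_level(0.9, 0.1), 2))
```

Replacing the lambdas with domain-specific membership functions, and the rule outputs with expert judgments, is where the human expertise mentioned in the abstract would enter.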
Abstract:
Dust is a complex mixture of particles of organic and inorganic origin and of different gases absorbed in aerosol droplets. In a poultry unit it includes dried faecal matter and urine, skin flakes, ammonia, carbon dioxide, pollens, feed and litter particles, feathers, grain mites, fungal spores, bacteria, viruses and their constituents. Dust particles vary in size, and differentiation between particle size fractions is important in health studies in order to quantify penetration within the respiratory system. A descriptive study was developed to assess exposure to particles in a poultry unit during different operations, namely routine examination and floor turn-over. Direct-reading equipment was used (Lighthouse, model 3016 IAQ). Particle measurement was performed for five size fractions (PM0.5, PM1.0, PM2.5, PM5.0, PM10). The chemical composition of the poultry litter was also determined by neutron activation analysis. The litter of poultry pavilions is normally turned over weekly, and it was during this operation that the highest particle exposure was observed. In all the tasks considered, PM5.0 and PM10 were the size fractions with the highest concentration values; PM10 showed the highest values and PM0.5 the lowest. The chemical element with the highest concentration was Mg (5.7E6 mg.kg-1), followed by K (1.5E4 mg.kg-1), Ca (4.8E3 mg.kg-1), Na (1.7E3 mg.kg-1), Fe (2.1E2 mg.kg-1) and Zn (4.2E1 mg.kg-1). This high presence of particles in the respirable range (<5–7 μm) means that poultry dust particles can penetrate into the gas-exchange region of the lung. Larger particles (PM10) presented concentrations ranging from 5.3E5 to 3.0E6 mg/m3.
Abstract:
This paper is a contribution to the assessment and comparison of magnet properties based on magnetic field characteristics, particularly concerning the uniformity of the magnetic induction in the air gaps. To this end, a solver was developed and implemented to determine the magnetic field of a magnetic core to be used in Fast Field Cycling (FFC) Nuclear Magnetic Resonance (NMR) relaxometry. The electromagnetic field computation is based on a 2D finite-element method (FEM) using both the scalar and the vector potential formulations. Results for the magnetic field lines and the magnetic induction vector in the air gap are presented. The target magnetic induction is 0.2 T, a typical requirement of the FFC NMR technique, which can be achieved with a magnetic core based on permanent magnets or coils. In addition, this application requires high uniformity of the magnetic induction. To achieve this goal, a solution including superconducting pieces is analysed. Results are compared with those of a different FEM program.
Abstract:
OBJECTIVE: To analyze the core group for sexually transmitted infections (STI) among college students. METHODS: Cross-sectional study carried out in a convenience sample comprising 711 college students of the public university of Morelos, Mexico, between 2001 and 2003. Sociodemographic and sexual behavior information was collected using self-applied questionnaires. Herpes simplex 2 (HSV-2) infection was tested for in blood samples. The number of sexual partners in the last year and cocaine consumption were used as indicators to construct the dependent variable "level of STI risk" in three categories: low, medium and high risk (core group). A multinomial analysis was conducted to evaluate whether different sexual behaviors were associated with the variable "level of STI risk". RESULTS: There was a significant association between HSV-2 seroprevalence and the variable "level of STI risk": 13%, 5.6% and 3.8% were found in the high (core group), medium and low categories, respectively. There were gender differences regarding the core group. Men started having sexual intercourse earlier, had more sex partners, higher alcohol and drug consumption, more frequent intercourse with sex workers, more exchange of sex for money, and more occasional and concurrent partners than women. CONCLUSIONS: The study findings suggest existing contextual characteristics in the study population that affect their sexual behavior. In Mexico, the cultural conception of sexuality is determined mainly by gender differences, with men engaging in riskier sexual behavior than women.
Abstract:
Modern real-time systems increasingly generate heavy, dynamic computational workloads, and it is becoming unrealistic to expect them to be implemented on uniprocessor systems. Indeed, the shift from single-processor to multiprocessor systems can be seen, both in the general-purpose and in the embedded domain, as an energy-efficient way to improve application performance. At the same time, the proliferation of multiprocessor platforms has turned parallel programming into a topic of great interest, with dynamic parallelism rapidly gaining popularity as a programming model. The idea behind this model is to encourage programmers to expose all opportunities for parallelism by simply annotating potentially parallel regions within the application. All such annotations are treated by the system purely as hints; they may be ignored and replaced by the language itself with equivalent sequential constructs. How the computation is actually subdivided and mapped onto the various processors is thus the responsibility of the compiler and the underlying computing system. Lifting this burden from the programmer considerably reduces programming complexity, which usually translates into increased productivity. However, unless the underlying scheduling mechanism is simple and fast, so as to keep overall overhead low, the benefits of generating such fine-grained parallelism will be merely hypothetical. From this scheduling perspective, algorithms that employ a work-stealing policy are increasingly popular, with proven efficiency in terms of time, space, and communication requirements.

However, these algorithms do not take temporal constraints into account, nor any other form of task prioritisation, which prevents them from being applied directly to real-time systems. Moreover, they are traditionally implemented in the language runtime, creating a two-level scheduling system in which the predictability essential to a real-time system cannot be guaranteed. This thesis describes how the work-stealing approach can be redesigned to meet real-time requirements while preserving the fundamental principles that have produced such good results. Very briefly, the single conventional task-management queue (deque) is replaced by a queue of deques, sorted in increasing order of task priority. The well-known G-EDF dynamic scheduling algorithm is then applied on top, the rules of both are blended, and thus our proposal is born: the RTWS scheduling algorithm. Taking advantage of the modularity offered by the Linux scheduler, RTWS is added as a new scheduling class, in order to evaluate in practice whether the proposed algorithm is viable, that is, whether it guarantees the desired efficiency and schedulability. Modifying the Linux kernel is a complicated task, owing to the complexity of its internal functions and the strong interdependencies between its subsystems. Nevertheless, one of the goals of this thesis was to make sure that RTWS is more than an interesting concept. A significant part of this document is therefore devoted to discussing the implementation of RTWS and to exposing problematic situations, many of them not considered in theory, such as the mismatch between various synchronisation mechanisms.

The experimental results show that, compared with other practical work on the dynamic scheduling of tasks with temporal constraints, RTWS significantly reduces scheduling overhead through efficient and scalable control of migrations and context switches (at least up to 8 CPUs), while achieving good dynamic load balancing of the system, and does so at low cost. However, the evaluation uncovered a flaw in the RTWS implementation: it gives up stealing work too easily, which causes idle periods on the CPU in question when overall system utilisation is low. Although the work focused on keeping scheduling costs low and on achieving good data locality, system schedulability was never neglected. In fact, the proposed scheduling algorithm proved to be quite robust, missing no deadlines in the experiments carried out. We can therefore state that some priority inversion, caused by the BAS stealing sub-policy, does not compromise the schedulability goals, and even helps to reduce contention on the data structures. Even so, RTWS also supports a deterministic stealing sub-policy: PAS. The experimental evaluation, however, did not give a clear picture of the relative impact of the two. Overall, though, we can conclude that RTWS is a promising solution for the efficient scheduling of parallel tasks with temporal constraints.
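The core data-structure idea, replacing the single work-stealing deque with a priority-ordered collection of deques and serving the earliest deadline first in G-EDF style, can be sketched as follows. The class and method names are illustrative; the thesis implements this inside the Linux scheduler as a scheduling class, not in Python:

```python
import heapq
from collections import deque

class RTWSQueue:
    """Sketch of the RTWS queue-of-deques idea: tasks are grouped
    into deques keyed by deadline, served earliest-deadline-first.
    The owner pops LIFO from the earliest-deadline deque; a thief
    steals FIFO from the opposite end of the same deque."""

    def __init__(self):
        self._by_deadline = {}   # deadline -> deque of tasks
        self._heap = []          # min-heap of active deadlines

    def push(self, deadline, task):
        if deadline not in self._by_deadline:
            self._by_deadline[deadline] = deque()
            heapq.heappush(self._heap, deadline)
        self._by_deadline[deadline].append(task)

    def _earliest(self):
        # Drop exhausted deadlines lazily, then report the earliest.
        while self._heap and not self._by_deadline[self._heap[0]]:
            del self._by_deadline[heapq.heappop(self._heap)]
        return self._heap[0] if self._heap else None

    def pop_local(self):
        """Owner: LIFO pop (good data locality)."""
        d = self._earliest()
        return self._by_deadline[d].pop() if d is not None else None

    def steal(self):
        """Thief: FIFO steal (low contention with the owner)."""
        d = self._earliest()
        return self._by_deadline[d].popleft() if d is not None else None

q = RTWSQueue()
q.push(10, "a"); q.push(10, "b"); q.push(5, "c")
print(q.steal())      # the deadline-5 task is served first
print(q.pop_local())
```

The LIFO/FIFO split at the two ends of each deque is the classic work-stealing trick for locality and low contention; the deadline ordering on top is what lets the structure honour G-EDF priorities.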