972 results for Distributed environments
Abstract:
Multi-environment trials (METs) used to evaluate breeding lines vary in the number of years that they sample. We used a cropping systems model to simulate the target population of environments (TPE) for 6 locations over 108 years for 54 'near-isolines' of sorghum in north-eastern Australia. For a single reference genotype, each of 547 trials was clustered into 1 of 3 'drought environment types' (DETs) based on a seasonal water stress index. Within sequential METs of 2 years duration, the frequencies of these drought patterns often differed substantially from those derived for the entire TPE. This was reflected in variation in the mean yield of the reference genotype. For the TPE and for 2-year METs, restricted maximum likelihood methods were used to estimate components of genotypic and genotype by environment variance. These also varied substantially, although not in direct correlation with frequency of occurrence of different DETs over a 2-year period. Combined analysis over different numbers of seasons demonstrated the expected improvement in the correlation between MET estimates of genotype performance and the overall genotype averages as the number of seasons in the MET was increased.
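As a toy illustration of the environment-typing idea above, the snippet below assigns a trial to a drought pattern from its seasonal water-stress trajectory. The classification rule, threshold and labels are assumptions for the example, not the clustering actually used in the study.

```python
def classify_det(stress_curve, low_stress_threshold=0.3):
    """Assign a trial to a 'drought environment type' (DET) from its
    seasonal water-stress-index trajectory (0 = no stress, 1 = severe).

    Rule of thumb used here (an assumption, not the study's rule):
      'low stress' -> peak stress stays below the threshold
      'mid-season' -> stress peaks before the final third of the season
      'terminal'   -> stress peaks in the final third of the season
    """
    peak = max(stress_curve)
    if peak < low_stress_threshold:
        return "low stress"
    t_peak = stress_curve.index(peak)
    return "terminal" if t_peak >= 2 * len(stress_curve) // 3 else "mid-season"
```

Tabulating these labels over a 2-year MET versus the full 108-year TPE would then expose the frequency mismatch the abstract describes.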
Abstract:
It has been suggested that twinning may influence handedness through the effects of birth order, intra-uterine crowding and mirror imaging. The influence of these effects on handedness (for writing and throwing) was examined in 3657 Monozygotic (MZ) and 3762 Dizygotic (DZ) twin pairs (born 1893-1992). Maximum likelihood analyses revealed no effects of birth order on the incidence of left-handedness. Twins were no more likely to be left-handed than their singleton siblings (n = 1757), and there were no differences between the DZ co-twin and sibling-twin covariances, suggesting that neither intra-uterine crowding nor the experience of being a twin affects handedness. There was no evidence of mirror imaging; the co-twin correlations of monochorionic and dichorionic MZ twins did not differ. Univariate genetic analyses revealed common environmental factors to be the most parsimonious explanation of familial aggregation for the writing-hand measure, while additive genetic influences provided a better interpretation of the throwing hand data.
Abstract:
In this study we examined the diet of the green turtle, Chelonia mydas, and the factors involved in the variation of its feeding ecology. We also assessed the impact of debris ingestion and the factors that may explain the high ingestion of such debris among marine animals. For the feeding-ecology study we evaluated more than 400 individuals, combining original and literature data, distributed along a latitudinal gradient and across several environments. The turtles fed mainly on macroalgae but showed great dietary plasticity, both in foraging strategy and in diet. In colder regions with lower algal availability, turtles shifted from a herbivorous diet to one based on animal matter. This dietary shift also entailed a change in foraging strategy, from benthic to pelagic feeding, a strategy also found in estuarine areas. This feeding plasticity results from the interaction of intrinsic factors (physiological constraints) and extrinsic factors (regional and local). Differences in foraging strategy also lead to differences in exposure to threats. One example is debris ingestion, which, although recorded in more than 70% of the turtles (N = 265), posed a greater threat to animals with a pelagic foraging strategy. Plastic was the most ingested material, with food-related items and plastic bags as the main sources. The study also showed that a small amount of debris (0.5 g) is enough to cause death. This result revealed that the lethal potential of debris ingestion is much greater than the observed mortality; the true threat of debris ingestion is being masked by the high mortality associated with fishing activities.
Debris ingestion is usually attributed to the confusion of a specific food item with the debris, such as jellyfish with plastic bags. However, we show that it is a broader issue, and we use the green turtle, seabirds and fish to highlight the importance of other factors, such as the abundance of debris in the environment, foraging strategy, ability to detect the debris, and diet breadth. We believe that debris ingestion results from an evolutionary trap much broader than previously suggested, one that likely affects many more species than those reported to date. Disarming this trap will be particularly difficult given the continuous and growing discharge of plastic into the marine environment and its high persistence there.
Abstract:
According to clinicians, emerging medical conditions can be detected in a timely manner by observing changes in a person's activities of daily living and/or physiological signals. To accomplish this, it is necessary to monitor both the person's physiological signals and the home environment with sensing technology. Wireless sensor networks (WSNs) are a promising technology for this support. After receiving the data from the sensor nodes, a computer processes the data and extracts information to detect any abnormality. The computer runs algorithms that should have been previously developed and tested in real homes or in living labs. However, such installations (and volunteers) may not be readily available. To get around that difficulty, this paper proposes building a physical model that emulates basic actions of a user at home, giving autonomy to researchers who want to test the performance of their algorithms. The paper also studies some data communication issues in mobile WSNs, namely how the orientation of the sensor nodes on the body affects the received signal strength, as well as retransmission aspects of a TDMA-based MAC protocol in the data recovery process.
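The retransmission aspect of a TDMA-based MAC can be sketched roughly as follows: packets that fail in their assigned slots are retried, one per spare slot, within the frame. This is a minimal sketch assuming a fixed number of spare slots per frame; it is not the protocol studied in the paper.

```python
def recover_with_retries(delivered, spare_slots):
    """One TDMA frame: `delivered` holds one boolean per sensor-node
    slot. Failed packets are retransmitted, one per spare slot, in
    slot order; anything beyond the spare-slot budget stays lost."""
    failed = [i for i, ok in enumerate(delivered) if not ok]
    recovered = failed[:spare_slots]
    lost = failed[spare_slots:]
    return recovered, lost
```

With two spare slots and three failed slots, for instance, the first two failures are recovered and the third must wait for a later frame.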
Abstract:
This paper describes the processes of teaching illustration and animation together in the context of a master's degree program. In Portugal, until very recently, higher education courses in illustration and animation were scarce and provided only by a few private universities, which offered separate programs in either illustration or animation. The MA in Illustration and Animation (MIA), based at the Instituto Politécnico do Cávado e Ave in Portugal, dared to join these two creative areas in a common learning model; it is already starting its third edition with encouraging results and will be supported by the first international conference on illustration and animation (CONFIA). The program integrates several approaches and techniques (in illustration and animation) and encourages creative writing and critical writing. This paper describes the iterative process of constructing and implementing the program, as well as the pedagogic and learning conclusions drawn from its initial years. In summary, we aim to compare pedagogic models that teach animation or illustration separately in higher education with a more contemporary, multidisciplinary model that integrates the two at an earlier stage and allows them to be developed separately in the second part of the program. This comparison is based on the differences and specificities of animation (from classic techniques to 3D) and illustration (drawing the illustration), and on the intersection of these two subjects within a program structure focused on the students' learning and the competencies they acquire for professional or authorial projects.
Abstract:
Carpooling began in the United States in the 1970s as a response to the oil crisis, and over the past years it has increased significantly across the world. Some countries have created High Occupancy Vehicle (HOV) lanes to encourage commuters not to travel alone. In addition, carpool websites have been developed to facilitate connections between commuters, making it possible to create compatible matches faster and more efficiently. This project focuses on carpooling, especially in an academic environment, since younger people are more likely to carpool. Initially, extensive research was conducted to examine carpool studies from all over the world, followed by a survey of higher education institutions that use carpooling as a transportation mode. Most websites create carpools by targeting people from a specific country; these commuters have different origins and destinations, making it more complicated to create compatible matches. The objective of this project is to develop a system that helps teachers and students in an academic environment create carpool matches. This makes creating carpools easier, because these students and teachers share the same destination. During the research it was essential to explore as many existing carpool websites as possible. After this analysis, several sketches were made to define the layout and structure of the web application implemented throughout the project. Once the layout was established, development of the web application began. The project had its ups and downs, but it met all the necessary requirements; it can be accessed at http://ipcacarpool.somee.com. Once the website was up and running, a web-based survey was developed, using a tool called Survey Planet, to study the reasons that motivate people to consider carpooling as an alternative to driving alone.
The survey received 408 responses, of which 391 were from students and 17 from teachers. The study concludes that the majority of respondents do not carpool but would consider it if dedicated parking spaces were available. Most respondents who do carpool started less than a year ago, indicating that this mode of transportation is recent.
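The matching idea above, where a shared campus destination reduces the problem to pairing compatible origins and schedules, can be sketched as follows. The field names and the grouping rule are assumptions for illustration, not the web application's actual algorithm.

```python
from collections import defaultdict

def match_carpools(commuters):
    """commuters: (name, origin, departure_hour) tuples. Everyone
    shares the campus destination, so a match only needs the same
    origin and a compatible departure hour."""
    groups = defaultdict(list)
    for name, origin, hour in commuters:
        groups[(origin, hour)].append(name)
    # only groups with at least two people form a carpool
    return [members for members in groups.values() if len(members) > 1]
```

A real system would relax the exact-hour rule to a time window and add seat capacity, but the same-destination simplification is the core of it.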
Abstract:
One of the most efficient approaches to generating the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated from past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it is useful to design an architecture in which the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second corresponds to a motion compensated quality enhancement (MCQE) technique, where a low-quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. The novel MCQE mode can be advantageous overall from the rate-distortion point of view, even if some rate has to be invested in the low-quality Intra coded blocks, for blocks where MCI produces SI with lower correlation. The overall solution is evaluated in terms of RD performance, with improvements of up to 2 dB, especially for high-motion video sequences and long Group of Pictures (GOP) sizes.
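A per-block mode decision of the kind described above can be sketched as follows; the correlation estimate and the threshold are assumptions for illustration, not the paper's actual decision criterion.

```python
def choose_si_mode(mci_correlation, threshold=0.6):
    """Per-block SI mode decision: blocks where motion-compensated
    interpolation (MCI) is expected to yield well-correlated SI keep
    the MCI mode; poorly correlated blocks fall back to MCQE, where
    the encoder spends some rate on a low-quality Intra block that
    the decoder refines by motion estimation against the references."""
    return ["MCI" if c >= threshold else "MCQE" for c in mci_correlation]
```

The rate invested in MCQE blocks pays off only where MCI correlation is poor, which is why the decision is made block by block rather than per frame.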
Abstract:
Motion compensated frame interpolation (MCFI) is one of the most efficient solutions to generate side information (SI) in the context of distributed video coding. However, it creates SI with rather significant motion compensation errors in some frame regions and rather small errors in others, depending on the video content. In this paper, a low-complexity Intra mode selection algorithm is proposed to select the most 'critical' blocks in the WZ frame and help the decoder with reliable data for those blocks. For each block, the novel coding mode selection algorithm estimates the encoding rate for the Intra-based and WZ coding modes and determines the best coding mode while maintaining a low encoder complexity. The proposed solution is evaluated in terms of rate-distortion performance, with improvements of up to 1.2 dB over a WZ-only coding solution.
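The mode-selection logic can be sketched like this; the rate estimates, the criticality score and the threshold are hypothetical stand-ins for the paper's low-complexity estimators.

```python
def select_block_modes(intra_rate_est, wz_rate_est, criticality, crit_threshold):
    """For each block of a WZ frame, pick the cheaper of the estimated
    Intra and WZ rates, but force Intra coding for 'critical' blocks,
    i.e. those where MCFI side information is expected to fail."""
    modes = []
    for r_intra, r_wz, crit in zip(intra_rate_est, wz_rate_est, criticality):
        if crit >= crit_threshold or r_intra < r_wz:
            modes.append("Intra")
        else:
            modes.append("WZ")
    return modes
```

Because the decision uses cheap rate estimates rather than actual encodings, the encoder complexity stays low, which is the point of the scheme.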
Abstract:
A biomonitoring study using transplanted lichens (Flavoparmelia caperata) was conducted to assess indoor air quality in primary schools at urban (Lisbon) and rural (Ponte de Sor) Portuguese sites. The lichen exposure period ran from April to June 2010, and two types of primary school environments were studied: classrooms and the outdoor courtyard. Afterwards, the lichen samples were processed and analyzed by instrumental neutron activation analysis (INAA) to assess a total of 20 chemical elements. Elements accumulated in the exposed lichens were assessed and enrichment factors (EF) were determined. Indoor and outdoor biomonitoring results were compared to evaluate how biomonitors (such as lichens) react in indoor environments and to assess the type of pollutants prevalent in those environments.
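The enrichment factor mentioned above follows the standard double-ratio definition, shown below; the choice of reference element and the concentration values in the test are hypothetical, not the study's data.

```python
def enrichment_factor(c_elem, c_ref, bg_elem, bg_ref):
    """EF = (element / reference element) in the exposed lichen divided
    by the same ratio in unexposed (background) lichen. EF close to 1
    suggests a crustal/background origin; EF >> 1 suggests enrichment
    from an anthropogenic source."""
    return (c_elem / c_ref) / (bg_elem / bg_ref)
```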
Abstract:
Object-oriented programming languages are presently the dominant paradigm of application development (e.g., Java, .NET). Lately, increasingly many Java applications have long (or very long) execution times and manipulate large amounts of data, gaining relevance in fields related to e-Science (with grid and cloud computing). Significant examples include chemistry, computational biology and bioinformatics, with many available Java-based APIs (e.g., Neobio). Often, when the execution of such an application is terminated abruptly by a failure (whether caused by a hardware or software fault, lack of available resources, etc.), all of the work already performed is simply lost; when the application is later re-initiated, it has to restart from scratch, wasting resources and time, while remaining prone to another failure and possibly delaying its completion with no deadline guarantees. Our proposed solution to address these issues is to incorporate mechanisms for checkpointing and migration in a JVM. These make applications more robust and flexible, able to move to other nodes without any intervention from the programmer. This article provides a solution for Java applications with long execution times by extending a JVM (the Jikes research virtual machine) with such mechanisms. Copyright (C) 2011 John Wiley & Sons, Ltd.
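The checkpointing idea, persisting a job's state so it can resume instead of restarting from scratch, can be illustrated at application level with Python's pickle. This is only an analogy for the concept, not the transparent JVM-level mechanism the article implements, and all names here are illustrative.

```python
import os
import pickle
import tempfile

def checkpoint(state, path):
    """Atomically persist application state so a long-running job can
    resume after a failure instead of restarting from scratch."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, path)  # atomic rename: no torn checkpoint files

def restore(path, default):
    """Return the last saved state, or `default` on a fresh start."""
    if not os.path.exists(path):
        return default
    with open(path, "rb") as f:
        return pickle.load(f)
```

A JVM-level approach does this transparently for the whole execution state (stack, heap, threads), which is what makes migration to another node possible without programmer intervention.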
Abstract:
The aim of this study was to contribute to the assessment of exposure levels to ultrafine particles from automobile traffic in the urban environment of Lisbon, Portugal, by monitoring lung-deposited alveolar surface area (resulting from exposure to ultrafine particles) on a major avenue leading to the town center during late spring, as well as inside buildings facing it. Data revealed differentiated patterns for weekdays and weekends, consistent with the PM2.5 and PM10 patterns currently monitored by air quality stations in Lisbon. The observed ultrafine particulate levels may be directly correlated with fluxes in automobile traffic. During a typical week, the alveolar deposited surface area of ultrafine particles varied between 35 and 89.2 μm²/cm³, comparable with levels reported for other towns in Germany and the United States. The measured values allowed determination of the number of ultrafine particles per cubic centimeter, which is comparable to levels reported for Madrid and Brisbane. Regarding outdoor/indoor levels, we observed higher levels (32 to 63%) outdoors, which is somewhat lower than levels observed in houses in Ontario.
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to different correlation noise channels. Thus, it is proposed to exploit multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion compensated frame interpolation and fused together in a novel iterative LDPC based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings up to 8.0% are obtained for similar decoded quality.
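One simple way to combine several SI hypotheses, shown here only as an inverse-variance weighting sketch and not the iterative LDPC-based fusion the paper actually proposes, is:

```python
def fuse_si(hypotheses, noise_var):
    """Fuse multiple side-information hypotheses pixel-wise, weighting
    each hypothesis by the inverse of its estimated correlation-noise
    variance, so more reliable hypotheses dominate the fused SI."""
    weights = [1.0 / v for v in noise_var]
    wsum = sum(weights)
    n = len(hypotheses[0])
    return [sum(w * h[i] for w, h in zip(weights, hypotheses)) / wsum
            for i in range(n)]
```

The paper goes further by letting multiple LDPC syndrome decoders, one per hypothesis, exchange soft information during decoding rather than fusing once up front.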
Abstract:
Power systems have been undergoing huge changes, mainly due to the substantial increase of distributed generation and to operation in competitive environments. Virtual power players can aggregate a diversity of players, namely generators and consumers, and a diversity of energy resources, including electricity generation based on several technologies, storage, and demand response. Resource management gains increasing relevance in this competitive context, while an active demand side provides managers with increased demand elasticity. This makes the use of demand response more interesting and flexible, giving rise to a wide range of new opportunities. This paper proposes a methodology for managing demand response programs in the scope of virtual power players. The proposed method is based on the calculation of locational marginal prices (LMP). The evaluation of the impact of specific demand response programs on the LMP value supports the manager's decision concerning demand response use. The proposed method has been computationally implemented, and its application is illustrated in this paper using a 32-bus network with intensive use of distributed generation.
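A minimal sketch of the decision-support idea: compare bus-level LMPs before and after a demand response program and favour programs that lower the load-weighted cost. The prices and loads below are hypothetical; the actual method computes LMPs on the 32-bus network.

```python
def dr_program_benefit(lmp_base, lmp_with_dr, load_mw):
    """Compare locational marginal prices (LMPs, $/MWh) at each bus
    before and after applying a demand-response program. A positive
    result means the program lowers the load-weighted supply cost,
    supporting the virtual power player's decision to use it."""
    cost_base = sum(p * l for p, l in zip(lmp_base, load_mw))
    cost_dr = sum(p * l for p, l in zip(lmp_with_dr, load_mw))
    return cost_base - cost_dr
```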