936 results for Heterogeneous Nucleation
Abstract:
PROFIBUS is an international standard (IEC 61158) for factory-floor communications, with hundreds of thousands of installations worldwide. However, it does not include any wireless capabilities. In this paper we propose a hybrid wired/wireless PROFIBUS solution in which most of the design options are made in order to guarantee the proper real-time behaviour of the overall network. We address the timing unpredictability problems posed by the co-existence of heterogeneous transmission media in the same network. Moreover, we propose a novel solution to provide inter-cell mobility to PROFIBUS wireless nodes.
Abstract:
Aim - To use Monte Carlo (MC) simulations together with voxel phantoms to analyze the effect of tissue heterogeneity on dose distributions and the equivalent uniform dose (EUD) for (125)I prostate implants. Background - Dose distribution calculations in low dose-rate brachytherapy are based on the dose deposition around a single source in a water phantom. This formalism does not take into account tissue heterogeneities, interseed attenuation, or finite patient dimensions. Tissue composition is especially important due to the photoelectric effect. Materials and Methods - The computed tomography (CT) scans of two patients with prostate cancer were used to create voxel phantoms for the MC simulations. An elemental composition and density were assigned to each structure. Densities of the prostate, vesicles, rectum and bladder were determined from the CT electronic densities of 100 patients. The same simulations were performed considering the same phantom as pure water. Results were compared via dose-volume histograms and EUD for the prostate and rectum. Results - The mean absorbed doses presented deviations of 3.3-4.0% for the prostate and 2.3-4.9% for the rectum when comparing calculations in water with calculations in the heterogeneous phantom. In the calculations in water, the prostate D90 was overestimated by 2.8-3.9% and the rectum D0.1cc showed dose differences of 6-8%. The EUD was overestimated by 3.5-3.7% for the prostate and 7.7-8.3% for the rectum. Conclusions - The deposited dose was consistently overestimated in the simulations in water. In order to increase the accuracy of dose distribution calculations, especially around the rectum, the introduction of model-based algorithms is recommended.
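For reference, the equivalent uniform dose reported in this abstract is commonly computed with Niemierko's generalized formulation, EUD = (sum_i v_i D_i^a)^(1/a), where v_i are fractional DVH volumes and a is a tissue-specific parameter. The sketch below only illustrates that formula; the dose bins and the value of a are placeholders, not data from the study.

```python
# Illustrative sketch of Niemierko's generalized EUD from a differential DVH.
# The dose bins and the tissue parameter "a" below are placeholders, not study data.

def generalized_eud(doses, volumes, a):
    """EUD = (sum_i v_i * D_i**a) ** (1/a), with v_i the fractional volumes."""
    total = sum(volumes)
    fractions = [v / total for v in volumes]            # normalise to fractional volumes
    return sum(v * d ** a for d, v in zip(doses, fractions)) ** (1.0 / a)

# Hypothetical DVH bins (Gy) and volumes (cc) for a target-like structure (a < 0 for targets).
doses = [140.0, 150.0, 160.0, 170.0]
volumes = [5.0, 15.0, 20.0, 10.0]
print(f"EUD = {generalized_eud(doses, volumes, a=-10):.1f} Gy (illustrative)")
```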
Abstract:
OBJECTIVE: To perform a systematic review of the prevalence of HCV/S. mansoni co-infection and associated factors in Schistosoma mansoni-infected populations. METHODS: The bibliographic search was carried out using the Medline, Lilacs, SciELO, Cochrane Library and Ibecs databases. The criteria for study selection and data extraction were based on systematic review methods. Forty-five studies were found, of which nine were excluded in a first screening. Thirteen articles were used for data extraction. RESULTS: HCV infection rates in schistosomiasis populations range from 1% in Ethiopia to 50% in Egypt. Several studies had poorly defined methodologies, even in areas characterized by an association between hepatitis C and schistosomiasis, such as Brazil and Egypt, which meant conclusions were inconsistent. HCV infection rates in schistosomotic populations were heterogeneous, and risk factors for acquiring the virus varied widely. CONCLUSIONS: Despite the limitations, this review may help to identify regions with higher rates of hepatitis C and schistosomiasis association. However, more studies are necessary for the development of public health policies on the prevention and control of both diseases.
Abstract:
I - This report describes the specialized music teaching practicum carried out within the scope of the Master's in Music Teaching at the Escola Superior de Música de Lisboa. The practicum took place at the Instituto Gregoriano de Lisboa and at the Conservatório de Música, de Dança e de Arte Dramática de Lisboa, two schools of official specialized music education. The first being public and the second private, these schools present very different realities from an organizational and management point of view, which result in heterogeneous situations dependent on several factors that will be mentioned throughout this report. The SWOT analysis carried out for each of these organizations describes more objectively the factors and variables on which this report was built. Three students were characterized, one from each course: preparatory, basic and secondary. Maria M. is a student in the 2nd year of the preparatory course, Pedro R. is a 3rd-grade student in the basic instrument course, and Diego M. is a 7th-grade student. The pedagogical practices developed with each student, and the progress and goals achieved by each of them, are discussed in depth. This practicum results in a reflection on the pedagogical practice applied and its motivations.
Abstract:
In the present work, we studied a common outbreaking Lepidoptera species in Portuguese pine stands – Thaumetopoea pityocampa (Den. & Schiff.) – and one of its potential predators – Parus major (L.). The population dynamics of the immature stages of the Lepidoptera were studied in several types of Pinus pinaster (Aiton) plantations in three different areas: the Setúbal Peninsula, Abrantes and the National Pine Forest of Leiria. Location and plantation structure were the most important factors determining the population density of T. pityocampa. Setúbal and Abrantes were highly susceptible to attacks by the Lepidoptera, whereas Leiria had lower densities. Young and homogeneous pine stands were more susceptible to attacks than older and more heterogeneous pine stands. However, a desynchronized population of T. pityocampa, in which the larvae develop during summer instead of during winter, also reached high densities in Leiria. The impact of several mortality factors and climatic conditions on the immature stages of the insect (eggs and larvae), in normal and desynchronized populations, is discussed, as well as possible evolutionary implications of the sudden appearance of the new version of T. pityocampa. The breaking of pupal diapause and adult emergence time the annual life cycle of this insect. Adults from the desynchronized population emerged earlier than adults from the normal population, which in turn determined the change in the larval development period. Different factors potentially affecting the timing of adult emergence in both normal and abnormal populations are also discussed. To study P. major, nest-boxes were placed in the areas of Setúbal and Leiria and monitored during three seasons. The nest-boxes increased the density of breeding and wintering birds in the studied pine plantations, indicating that a lack of natural holes is in fact a limiting factor for these populations. The earliest breeding start for this species was recorded in my study area, indicating that Portuguese coastal pine stands provide good breeding conditions earlier than other areas of Europe and North Africa. This leads to an overlap between the end of the larval stage of T. pityocampa and the beginning of the breeding season of P. major. Key-words: Thaumetopoea pityocampa, Parus major, Pinus pinaster, population dynamics, Portugal.
Abstract:
The recent trend in chip architectures towards a higher number of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, defending that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots, progressing in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
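The abstract states that read-only transactions can progress wait-free by reading recent consistent snapshots, and that the required number of versions of a shared object can be calculated for a task set. The paper's calculation is not reproduced here; the sketch below only illustrates the underlying multi-version idea under assumed names, with the version count K taken as given and concurrency control omitted.

```python
# Minimal sketch (not the paper's algorithm): a shared object keeping the last K
# committed versions so that read-only transactions can read a consistent snapshot
# without blocking or aborting. K would come from the task-set analysis; locking
# of the internal structures is omitted for brevity.

import itertools
from collections import deque

class VersionedObject:
    def __init__(self, initial, k_versions):
        # Each entry is (commit_timestamp, value); only the last k_versions are kept.
        self._versions = deque([(0, initial)], maxlen=k_versions)
        self._clock = itertools.count(1)

    def commit_update(self, value):
        """Called when an update transaction commits a new value."""
        self._versions.append((next(self._clock), value))

    def read_at(self, snapshot_ts):
        """Return the newest version with commit timestamp <= snapshot_ts."""
        for ts, value in reversed(self._versions):
            if ts <= snapshot_ts:
                return value
        raise RuntimeError("snapshot too old: the needed version was already recycled")

# A read-only transaction fixes its snapshot timestamp at start and reads every
# object at that timestamp, obtaining a consistent (possibly slightly stale) view.
obj = VersionedObject(initial=0, k_versions=4)
snapshot = 0                  # reader starts before any update commits
obj.commit_update(10)
obj.commit_update(20)
print(obj.read_at(snapshot))  # -> 0, the value visible at the reader's snapshot
```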
Abstract:
Academic evaluation has been an essential component of modern science since its inception, as science has moved away from personalized patronage toward its current role as a core enterprise of contemporary, democratic societies. In recent years, Brazil has experienced sustained growth in its scientific output, which is nowadays fully compatible with its status as an upper middle-income country striving to become a fully developed, more equitable country in the years to come. Growth usually takes place amidst challenges and dilemmas and, in Brazil as elsewhere, academic evaluation is not exempt from such difficulties. In a large, profoundly heterogeneous country with a national evaluation system and nationwide on-line platforms disseminating information on the most disparate fields of knowledge, the main challenges are how to pay attention to detail without losing sight of comprehensiveness, and how to handle social and regional diversity while preserving academic excellence as the fundamental benchmark.
Abstract:
The foreseen evolution of chip architectures towards a higher number of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as an alternative to lock-based synchronisation. However, STM relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we defend the role of the transaction contention manager in reducing the number of transaction retries and in helping the real-time scheduler to assure schedulability. For this purpose, the contention management policy should be aware of on-line scheduling information.
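The abstract argues that the contention manager should exploit on-line scheduling information. One simple policy in that spirit, assumed here for illustration and not necessarily the policy the paper proposes, resolves each conflict in favour of the transaction whose task has the earlier absolute deadline:

```python
# Sketch of a deadline-aware contention policy in the spirit of the abstract
# (an illustrative assumption, not the paper's actual contention manager):
# on a conflict, the transaction of the task with the later absolute deadline retries.

from dataclasses import dataclass

@dataclass
class Transaction:
    task_id: str
    absolute_deadline: float   # taken from the on-line scheduler, e.g. an EDF queue

def resolve_conflict(attacker: Transaction, holder: Transaction) -> Transaction:
    """Return the transaction that must abort/retry so the more urgent one proceeds."""
    return attacker if attacker.absolute_deadline >= holder.absolute_deadline else holder

loser = resolve_conflict(Transaction("t1", 12.0), Transaction("t2", 9.0))
print(f"{loser.task_id} retries")   # -> t1 retries, since t2 is more urgent
```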
Abstract:
Journal of Electroanalytical Chemistry 541 (2003) 153-162
Abstract:
PROFIBUS is an international standard (IEC 61158, EN 50170) for factory-floor communications, with several thousand installations worldwide. Taking into account the increasing need for mobile devices in industrial environments, one obvious solution is to extend traditional wired PROFIBUS networks with wireless capabilities. In this paper, we outline the major aspects of a hybrid wired/wireless PROFIBUS-based architecture, in which most of the design options were made in order to guarantee the real-time behaviour of the overall network. We also introduce the timing unpredictability problems resulting from the co-existence of heterogeneous physical media in the same network. However, the major focus of this paper is on how to guarantee real-time communications in such a hybrid network, where nodes (and whole segments) can move between different radio cells (inter-cell mobility). Assuming a simple mobility management mechanism based on mobile nodes performing periodic radio channel assessment and switching, we propose a methodology to compute values for specific parameters that enable an optimal (minimum) and bounded duration of the handoff procedure.
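The abstract does not reproduce the formulas for the handoff parameters; the sketch below only shows the general shape of the bound implied by the described mechanism (periodic channel assessment and switching): if every radio channel must be assessed for a fixed listening window, the handoff duration is bounded by the number of channels times the per-channel cost. All names and figures are hypothetical.

```python
# Illustrative only: the paper derives exact parameter values; here we just show the
# shape of a worst-case handoff bound under assumed parameters (all names hypothetical).

def handoff_duration_bound(n_channels, t_switch_ms, t_assess_ms):
    """Worst-case time for a mobile node to assess every radio channel and switch
    to the best one: each channel costs one retune plus one assessment window."""
    return n_channels * (t_switch_ms + t_assess_ms)

# Hypothetical figures: 4 radio channels, 2 ms to retune, 10 ms listening per channel.
print(handoff_duration_bound(4, 2.0, 10.0), "ms upper bound on the handoff procedure")
```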
Abstract:
This Thesis describes the application of automatic learning methods to a) the classification of organic and metabolic reactions, and b) the mapping of Potential Energy Surfaces (PES). The classification of reactions was approached with two distinct methodologies: a representation of chemical reactions based on NMR data, and a representation of chemical reactions from the reaction equation based on the physico-chemical and topological features of chemical bonds. NMR-based classification of photochemical and enzymatic reactions. Photochemical and metabolic reactions were classified by Kohonen Self-Organizing Maps (Kohonen SOMs) and Random Forests (RFs) taking as input the difference between the 1H NMR spectra of the products and the reactants. Such a representation can be applied to the automatic analysis of changes in the 1H NMR spectrum of a mixture and their interpretation in terms of the chemical reactions taking place. Examples of possible applications are the monitoring of reaction processes, the evaluation of the stability of chemicals, or even the interpretation of metabonomic data. A Kohonen SOM trained with a data set of metabolic reactions catalysed by transferases was able to correctly classify 75% of an independent test set in terms of the EC number subclass. Random Forests improved the correct predictions to 79%. With photochemical reactions classified into 7 groups, an independent test set was classified with 86-93% accuracy. The data set of photochemical reactions was also used to simulate mixtures with two reactions occurring simultaneously. Kohonen SOMs and Feed-Forward Neural Networks (FFNNs) were trained to classify the reactions occurring in a mixture based on the 1H NMR spectra of the products and reactants. Kohonen SOMs allowed the correct assignment of 53-63% of the mixtures (in a test set). Counter-Propagation Neural Networks (CPNNs) yielded similar results. The use of supervised learning techniques improved the results: correct assignments reached 77% when an ensemble of ten FFNNs was used, and 80% when Random Forests were used. This study was performed with NMR data simulated from the molecular structure by the SPINUS program. In the design of one test set, simulated data were combined with experimental data. The results support the proposal of linking databases of chemical reactions to experimental or simulated NMR data for the automatic classification of reactions and mixtures of reactions. Genome-scale classification of enzymatic reactions from their reaction equation. The MOLMAP descriptor relies on a Kohonen SOM that defines types of bonds on the basis of their physico-chemical and topological properties. The MOLMAP descriptor of a molecule represents the types of bonds available in that molecule. The MOLMAP descriptor of a reaction is defined as the difference between the MOLMAPs of the products and the reactants, and numerically encodes the pattern of bonds that are broken, changed, and made during a chemical reaction. The automatic perception of chemical similarities between metabolic reactions is required for a variety of applications, ranging from the computer validation of classification systems and genome-scale reconstruction (or comparison) of metabolic pathways to the classification of enzymatic mechanisms.
Catalytic functions of proteins are generally described by EC numbers, which are simultaneously employed as identifiers of reactions, enzymes, and enzyme genes, thus linking metabolic and genomic information. Different methods should be available to automatically compare metabolic reactions and to automatically assign EC numbers to reactions not yet officially classified. In this study, the genome-scale data set of enzymatic reactions available in the KEGG database was encoded by MOLMAP descriptors and submitted to Kohonen SOMs to compare the resulting map with the official EC number classification, to explore the possibility of predicting EC numbers from the reaction equation, and to assess the internal consistency of the EC classification at the class level. A general agreement with the EC classification was observed, i.e. a relationship between the similarity of MOLMAPs and the similarity of EC numbers. At the same time, MOLMAPs were able to discriminate between EC sub-subclasses. EC numbers could be assigned at the class, subclass, and sub-subclass levels with accuracies up to 92%, 80%, and 70% for independent test sets. The correspondence between the chemical similarity of metabolic reactions and their MOLMAP descriptors was applied to the identification of a number of reactions mapped into the same neuron but belonging to different EC classes, which demonstrated the ability of the MOLMAP/SOM approach to verify the internal consistency of classifications in databases of metabolic reactions. RFs were also used to assign the four levels of the EC hierarchy from the reaction equation. EC numbers were correctly assigned in 95%, 90%, 85% and 86% of the cases (for independent test sets) at the class, subclass, sub-subclass and full EC number levels, respectively. Experiments on the classification of reactions from the main reactants and products were performed with RFs - EC numbers were assigned at the class, subclass and sub-subclass levels with accuracies of 78%, 74% and 63%, respectively. In the course of the experiments with metabolic reactions we suggested that the MOLMAP/SOM concept could be extended to the representation of other levels of metabolic information, such as metabolic pathways. Following the MOLMAP idea, the pattern of neurons activated by the reactions of a metabolic pathway is a representation of the reactions involved in that pathway - a descriptor of the metabolic pathway. This reasoning enabled the comparison of different pathways, the automatic classification of pathways, and a classification of organisms based on their biochemical machinery. The three levels of classification (from bonds to metabolic pathways) made it possible to map and perceive chemical similarities between metabolic pathways, even for pathways of different types of metabolism and pathways that do not share similarities in terms of EC numbers. Mapping of PES by neural networks (NNs). In a first series of experiments, ensembles of Feed-Forward NNs (EnsFFNNs) and Associative Neural Networks (ASNNs) were trained to reproduce PES represented by the Lennard-Jones (LJ) analytical potential function. The accuracy of the method was assessed by comparing the results of molecular dynamics simulations (thermal, structural, and dynamic properties) obtained from the NN-PES and from the LJ function. The results indicated that for LJ-type potentials, NNs can be trained to generate accurate PES to be used in molecular simulations. EnsFFNNs and ASNNs gave better results than single FFNNs.
A remarkable ability of the NN models to interpolate between distant curves and to accurately reproduce potentials to be used in molecular simulations is shown. The purpose of the first study was to systematically analyse the accuracy of different NNs. Our main motivation, however, is reflected in the next study: the mapping of multidimensional PES by NNs to simulate, by Molecular Dynamics or Monte Carlo, the adsorption and self-assembly of solvated organic molecules on noble-metal electrodes. Indeed, for such complex and heterogeneous systems the development of suitable analytical functions that fit quantum mechanical interaction energies is a non-trivial or even impossible task. The data consisted of energy values, from Density Functional Theory (DFT) calculations, at different distances, for several molecular orientations and three electrode adsorption sites. The results indicate that NNs require a data set large enough to cover well the diversity of possible interaction sites, distances, and orientations. NNs trained with such data sets can perform equally well or even better than analytical functions. Therefore, they can be used in molecular simulations, particularly for the ethanol/Au(111) interface, which is the case studied in the present Thesis. Once properly trained, the networks are able to produce, as output, any required number of energy points for accurate interpolations.
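The MOLMAP reaction descriptor described in this abstract is the difference between the MOLMAPs of products and reactants, each MOLMAP being a fixed-length count of how a molecule's bonds activate the neurons of a Kohonen SOM trained on bond properties. A minimal sketch of that difference operation, assuming the bond-to-neuron assignments are already available (the toy assignments and map size below are placeholders, not the trained SOM):

```python
# Minimal sketch of the MOLMAP reaction-descriptor idea from the abstract:
# a molecule's MOLMAP counts how its bonds activate the neurons of a Kohonen SOM
# trained on bond properties, and a reaction descriptor is products minus reactants.
# The toy "bond -> winning neuron" assignments below are placeholders, not the real SOM.

import numpy as np

SOM_SIZE = (5, 5)  # a hypothetical 5x5 Kohonen map of bond types

def molmap(bond_neurons, som_size=SOM_SIZE):
    """Fixed-length descriptor: counts of bonds falling on each SOM neuron, flattened."""
    grid = np.zeros(som_size)
    for row, col in bond_neurons:
        grid[row, col] += 1
    return grid.ravel()

def reaction_descriptor(reactant_bonds, product_bonds):
    """Encodes which bond types disappear (negative) and appear (positive) in a reaction."""
    return molmap(product_bonds) - molmap(reactant_bonds)

# Toy example: one bond type disappears and another appears during the reaction.
reactants = [(0, 1), (2, 3)]          # winning neurons of the reactant bonds
products = [(0, 1), (4, 4)]           # winning neurons of the product bonds
print(reaction_descriptor(reactants, products).reshape(SOM_SIZE))
```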
Abstract:
In heterogeneous environments, diversity of resources among devices may affect their ability to perform services with specific QoS constraints, and drive peers to group themselves into a coalition for cooperative service execution. The dynamic selection of peers should be influenced by the user's QoS requirements as well as local computation availability, tailoring the provided service to the user's specific needs. However, complex dynamic real-time scenarios may prevent the possibility of computing optimal service configurations before execution. An iterative refinement approach with the ability to trade off deliberation time for the quality of the solution is proposed. We stress the importance of quickly finding a good initial solution and propose heuristic evaluation functions that optimise the rate at which the quality of the current solution improves as the algorithms have more time to run.
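The approach described is an anytime one: find a good initial configuration quickly, then spend whatever deliberation time remains improving it. A minimal sketch of such a control loop, with hypothetical problem-specific hooks (greedy_initial, neighbours, quality) standing in for the heuristic evaluation functions mentioned in the abstract:

```python
# Sketch of the anytime pattern described in the abstract: return a feasible service
# configuration quickly, then keep refining it while deliberation time remains.
# greedy_initial(), neighbours() and quality() are hypothetical problem-specific hooks.

import time

def anytime_configure(greedy_initial, neighbours, quality, deadline_s):
    best = greedy_initial()                  # good initial solution, found fast
    best_q = quality(best)
    start = time.monotonic()
    while time.monotonic() - start < deadline_s:
        improved = False
        for candidate in neighbours(best):
            cand_q = quality(candidate)
            if cand_q > best_q:
                best, best_q = candidate, cand_q
                improved = True
                break                        # first-improvement step keeps iterations short
        if not improved:
            break                            # local optimum reached: keep the best so far
    return best, best_q

# Toy usage: configurations are integers and quality peaks at 7.
cfg, q = anytime_configure(
    greedy_initial=lambda: 0,
    neighbours=lambda x: [x - 1, x + 1],
    quality=lambda x: -abs(x - 7),
    deadline_s=0.01,
)
print(cfg, q)  # converges towards 7 within the deadline
```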
Abstract:
OBJECTIVE To analyze vaccination coverage and factors associated with a complete immunization scheme in children < 5 years old. METHODS This cross-sectional household census survey evaluated 1,209 children < 5 years old living in Bom Jesus, Angola, in 2010. Data were obtained from interviews, questionnaires, child immunization histories, and maternal health histories. The statistical analysis used generalized linear models, in which the dependent variable followed a binary distribution (vaccinated, unvaccinated), the link function was logarithmic, and the children's individual, familial, and socioeconomic factors were the independent variables. RESULTS Vaccination coverage was 37.0%, higher in children < 1 year (55.0%) and heterogeneous across neighborhoods; 52.0% of children of both sexes had no immunization records. The prevalence rate of vaccination varied significantly according to child age, mother's level of education, family size, ownership of household appliances, and destination of domestic waste. CONCLUSIONS Vulnerable groups with vaccination coverage below recommended levels continue to be present. Some factors point to inequalities that represent barriers to full immunization, indicating the need to implement more equitable policies. Knowledge of these factors contributes to planning immunization promotion measures that focus on the most vulnerable groups.
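Since the model is a binomial GLM with a logarithmic link, the exponentiated coefficients are prevalence ratios rather than odds ratios. A minimal illustration of what a prevalence ratio means, using made-up counts rather than the study's data:

```python
# Illustration only (made-up counts, not the study's data): with a log link on a binary
# outcome, the exponentiated coefficient is a prevalence ratio, i.e. the ratio of the
# proportion vaccinated in the exposed group to that in the reference group.

def prevalence_ratio(vaccinated_exposed, total_exposed, vaccinated_ref, total_ref):
    p_exposed = vaccinated_exposed / total_exposed
    p_ref = vaccinated_ref / total_ref
    return p_exposed / p_ref

# Hypothetical 2x2 table: children of mothers with secondary education vs. none.
print(round(prevalence_ratio(120, 200, 90, 300), 2))  # -> 2.0: twice the coverage
```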
Abstract:
With advances in computer science and information technology, computing systems are becoming increasingly complex, with an increasing number of heterogeneous components. They are thus becoming more difficult to monitor, manage, and maintain. This process is well known to be labor-intensive and error-prone. In addition, traditional approaches to system management struggle to keep up with rapidly changing environments. There is a need for automatic and efficient approaches to monitor and manage complex computing systems. In this paper, we propose an innovative framework for scheduling system management that combines the Autonomic Computing (AC) paradigm, Multi-Agent Systems (MAS) and Nature-Inspired Optimization Techniques (NIT). Additionally, we consider the resolution of realistic problems: the scheduling of a Cutting and Treatment Stainless Steel Sheet Line is evaluated. Results show that the proposed approach has advantages when compared with other scheduling systems.
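As an illustration of the nature-inspired optimization layer mentioned in the abstract, the toy sketch below runs a very small genetic algorithm that orders jobs on a single machine to minimise total tardiness. The job data, operators and parameters are placeholders and not the paper's hybrid AC/MAS/NIT framework.

```python
# Illustrative only: a tiny genetic algorithm for ordering jobs on one machine to
# minimise total tardiness, standing in for the nature-inspired optimisation layer
# of the framework described above (the real system and its data are not shown here).

import random

jobs = [("j1", 4, 6), ("j2", 2, 4), ("j3", 6, 14), ("j4", 3, 5)]  # (id, proc_time, due)

def total_tardiness(order):
    t, tard = 0, 0
    for _, proc, due in order:
        t += proc
        tard += max(0, t - due)
    return tard

def crossover(a, b):
    cut = random.randrange(1, len(a))
    head = a[:cut]
    return head + [j for j in b if j not in head]     # keeps the result a permutation

def mutate(order):
    i, j = random.sample(range(len(order)), 2)
    order[i], order[j] = order[j], order[i]

random.seed(0)
population = [random.sample(jobs, len(jobs)) for _ in range(20)]
for _ in range(50):                                    # generations
    population.sort(key=total_tardiness)
    parents = population[:10]                          # simple truncation selection
    children = [crossover(random.choice(parents), random.choice(parents)) for _ in range(10)]
    for child in children:
        if random.random() < 0.3:
            mutate(child)
    population = parents + children
best = min(population, key=total_tardiness)
print([j[0] for j in best], total_tardiness(best))
```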
Abstract:
Dissertation presented to obtain the degree of Doctor in Conservation and Restoration, specialty in Theory, History and Techniques, at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.