907 results for Participatory methodologies


Relevance:

10.00%

Publisher:

Abstract:

Final Master's Project submitted to obtain the degree of Master in Mechanical Engineering

Relevance:

10.00%

Publisher:

Abstract:

Final Internship Report presented to the Escola Superior de Dança to obtain the degree of Master in Dance Teaching.

Relevance:

10.00%

Publisher:

Abstract:

Final Internship Report presented to the Escola Superior de Dança to obtain the degree of Master in Dance Teaching.

Relevance:

10.00%

Publisher:

Abstract:

This Thesis describes the application of machine learning methods to (a) the classification of organic and metabolic reactions and (b) the mapping of Potential Energy Surfaces (PES). The classification of reactions was approached with two distinct methodologies: a representation of chemical reactions based on NMR data, and a representation of chemical reactions derived from the reaction equation and based on the physico-chemical and topological features of chemical bonds.

NMR-based classification of photochemical and enzymatic reactions. Photochemical and metabolic reactions were classified by Kohonen Self-Organizing Maps (Kohonen SOMs) and Random Forests (RFs), taking as input the difference between the 1H NMR spectra of the products and the reactants. Such a representation can be applied to the automatic analysis of changes in the 1H NMR spectrum of a mixture and their interpretation in terms of the chemical reactions taking place. Examples of possible applications are the monitoring of reaction processes, the evaluation of the stability of chemicals, or even the interpretation of metabonomic data. A Kohonen SOM trained with a data set of metabolic reactions catalysed by transferases correctly classified 75% of an independent test set in terms of the EC number subclass; Random Forests improved the correct predictions to 79%. With photochemical reactions classified into seven groups, an independent test set was classified with 86-93% accuracy. The data set of photochemical reactions was also used to simulate mixtures with two reactions occurring simultaneously. Kohonen SOMs and Feed-Forward Neural Networks (FFNNs) were trained to classify the reactions occurring in a mixture based on the 1H NMR spectra of the products and reactants. Kohonen SOMs allowed the correct assignment of 53-63% of the mixtures in a test set, and Counter-Propagation Neural Networks (CPNNs) gave similar results. Supervised learning techniques improved the results to 77% of correct assignments with an ensemble of ten FFNNs and to 80% with Random Forests. This study was performed with NMR data simulated from the molecular structure by the SPINUS program; in the design of one test set, simulated data were combined with experimental data. The results support the proposal of linking databases of chemical reactions to experimental or simulated NMR data for the automatic classification of reactions and mixtures of reactions.

Genome-scale classification of enzymatic reactions from their reaction equation. The MOLMAP descriptor relies on a Kohonen SOM that defines types of bonds on the basis of their physico-chemical and topological properties. The MOLMAP descriptor of a molecule represents the types of bonds available in that molecule. The MOLMAP descriptor of a reaction is defined as the difference between the MOLMAPs of the products and the reactants, and numerically encodes the pattern of bonds that are broken, changed, and made during a chemical reaction. The automatic perception of chemical similarities between metabolic reactions is required for a variety of applications, ranging from the computational validation of classification systems and the genome-scale reconstruction (or comparison) of metabolic pathways to the classification of enzymatic mechanisms.
Catalytic functions of proteins are generally described by EC numbers, which are simultaneously employed as identifiers of reactions, enzymes, and enzyme genes, thus linking metabolic and genomic information. Different methods should be available to automatically compare metabolic reactions and to automatically assign EC numbers to reactions not yet officially classified. In this study, the genome-scale data set of enzymatic reactions available in the KEGG database was encoded by MOLMAP descriptors and submitted to Kohonen SOMs in order to compare the resulting map with the official EC number classification, to explore the possibility of predicting EC numbers from the reaction equation, and to assess the internal consistency of the EC classification at the class level. A general agreement with the EC classification was observed, i.e. a relationship between the similarity of MOLMAPs and the similarity of EC numbers. At the same time, MOLMAPs were able to discriminate between EC sub-subclasses. EC numbers could be assigned at the class, subclass, and sub-subclass levels with accuracies up to 92%, 80%, and 70% for independent test sets. The correspondence between the chemical similarity of metabolic reactions and their MOLMAP descriptors was applied to the identification of a number of reactions mapped into the same neuron but belonging to different EC classes, which demonstrated the ability of the MOLMAP/SOM approach to verify the internal consistency of classifications in databases of metabolic reactions. RFs were also used to assign the four levels of the EC hierarchy from the reaction equation: EC numbers were correctly assigned in 95%, 90%, 85% and 86% of the cases (for independent test sets) at the class, subclass, sub-subclass and full EC number levels, respectively. Experiments on the classification of reactions from the main reactants and products were performed with RFs, and EC numbers were assigned at the class, subclass and sub-subclass levels with accuracies of 78%, 74% and 63%, respectively. In the course of the experiments with metabolic reactions, we suggested that the MOLMAP/SOM concept could be extended to the representation of other levels of metabolic information, such as metabolic pathways. Following the MOLMAP idea, the pattern of neurons activated by the reactions of a metabolic pathway is a representation of the reactions involved in that pathway - a descriptor of the metabolic pathway. This reasoning enabled the comparison of different pathways, the automatic classification of pathways, and a classification of organisms based on their biochemical machinery. The three levels of classification (from bonds to metabolic pathways) made it possible to map and perceive chemical similarities between metabolic pathways, even for pathways of different types of metabolism and for pathways that do not share similarities in terms of EC numbers.

Mapping of PES by neural networks (NNs). In a first series of experiments, ensembles of Feed-Forward NNs (EnsFFNNs) and Associative Neural Networks (ASNNs) were trained to reproduce PES represented by the Lennard-Jones (LJ) analytical potential function. The accuracy of the method was assessed by comparing the results of molecular dynamics simulations (thermal, structural, and dynamic properties) obtained from the NN-based PES and from the LJ function. The results indicated that, for LJ-type potentials, NNs can be trained to generate accurate PES for use in molecular simulations. EnsFFNNs and ASNNs gave better results than single FFNNs.
A remarkable ability of the NN models to interpolate between distant curves and to accurately reproduce potentials for use in molecular simulations was shown. The purpose of the first study was to systematically analyse the accuracy of different NNs. Our main motivation, however, is reflected in the next study: the mapping of multidimensional PES by NNs to simulate, by Molecular Dynamics or Monte Carlo, the adsorption and self-assembly of solvated organic molecules on noble-metal electrodes. Indeed, for such complex and heterogeneous systems, the development of suitable analytical functions that fit quantum mechanical interaction energies is a non-trivial or even impossible task. The data consisted of energy values, from Density Functional Theory (DFT) calculations, at different distances, for several molecular orientations and three electrode adsorption sites. The results indicate that NNs require a data set large enough to adequately cover the diversity of possible interaction sites, distances, and orientations. NNs trained with such data sets can perform as well as or even better than analytical functions. Therefore, they can be used in molecular simulations, particularly for the ethanol/Au(111) interface, which is the case studied in the present Thesis. Once properly trained, the networks are able to produce, as output, any required number of energy points for accurate interpolations.
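The descriptor-difference idea summarized above (a reaction encoded as the difference between product and reactant bond-type descriptors, classified by Random Forests) can be sketched in a few lines of Python with scikit-learn. This is only an illustration under assumed inputs: the fingerprints, their length and the EC labels below are random placeholders, not the MOLMAP/SOM pipeline of the Thesis.

```python
# Minimal sketch of the descriptor-difference idea: encode a reaction as
# (descriptor of products) - (descriptor of reactants) and let a Random
# Forest assign the EC class. All data here are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_reactions, n_bins = 500, 64                      # hypothetical fingerprint length
reactant_fp = rng.random((n_reactions, n_bins))    # stand-in for reactant descriptors
product_fp = rng.random((n_reactions, n_bins))     # stand-in for product descriptors
ec_class = rng.integers(1, 7, size=n_reactions)    # EC classes 1-6

X = product_fp - reactant_fp                       # reaction descriptor
X_train, X_test, y_train, y_test = train_test_split(X, ec_class, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)
print("EC class accuracy on the held-out set:", rf.score(X_test, y_test))
```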

Relevance:

10.00%

Publisher:

Abstract:

TOD (Transit-Oriented Development) is typically defined as a high-density mixed-use area (residential and commercial) within easy walking distance of a high-capacity public transport station (typically within an 800 m buffer area). TOD is viewed as a set of strategies to increase the use of public transport, increase walking activity, contain urban sprawl, and create more liveable places. It is believed that this type of combined strategy will improve sustainable growth. This is an exploratory work searching for evidence of TOD characteristics in the station areas of the Azambuja train line, and it sets out further methodologies to evaluate the success of TOD areas.
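As a side illustration of the 800 m walking-distance criterion mentioned above (not part of the study itself), the sketch below flags locations within 800 m of a station using the haversine distance; the coordinates and names are invented.

```python
# Flag locations within the 800 m TOD buffer of a station, using the
# haversine great-circle distance. Coordinates are hypothetical.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

station = (39.070, -8.868)                          # hypothetical station location
parcels = {"A": (39.072, -8.865), "B": (39.090, -8.900)}

for name, (lat, lon) in parcels.items():
    d = haversine_m(station[0], station[1], lat, lon)
    print(f"parcel {name}: {d:.0f} m -> {'inside' if d <= 800 else 'outside'} the TOD buffer")
```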

Relevance:

10.00%

Publisher:

Abstract:

Master's Degree in Early English Teaching

Relevance:

10.00%

Publisher:

Abstract:

Wind resource evaluation at two sites located in Portugal was performed using the mesoscale modelling system Weather Research and Forecasting (WRF) and the wind resource analysis tool commonly used within the wind power industry, the Wind Atlas Analysis and Application Program (WAsP) microscale model. Wind measurement campaigns were conducted at the selected sites, allowing a comparison between in situ measurements and simulated wind in terms of flow characteristics and energy yield estimates. Three different methodologies were tested, aiming to provide an overview of their benefits and limitations for wind resource estimation. In the first methodology, the mesoscale model acts as a set of “virtual” wind measuring stations: wind data were computed by WRF for both sites and inserted directly as input into WAsP. In the second approach, the same procedure was followed, but the terrain influences induced by the mesoscale model's low-resolution terrain data were removed from the simulated wind data. In the third methodology, the simulated wind data were extracted at the top of the planetary boundary layer for both sites, to assess whether the use of geostrophic winds (which, by definition, are not influenced by the local terrain) could improve the models' performance. The results obtained with these methodologies were compared with those resulting from in situ measurements in terms of mean wind speed, Weibull probability density function parameters and production estimates, considering the installation of one wind turbine at each site. Results showed that the second approach produced the values closest to the measured ones, and fairly acceptable deviations were found with this coupling technique in terms of estimated annual production. However, mesoscale output should not be used directly in wind farm siting projects, mainly because of the poor resolution of the mesoscale model terrain data. Instead, the use of mesoscale output in microscale models should be seen as a valid alternative to in situ data, mainly for preliminary wind resource assessments, although mesoscale-microscale coupling should be applied with extreme caution in areas with complex topography.
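One of the comparison quantities mentioned above, the Weibull probability density function parameters of a wind-speed sample, can be estimated with a short Python/SciPy sketch. The sample below is synthetic; the study used measured and WRF/WAsP-derived wind data.

```python
# Fit Weibull shape (k) and scale (A) parameters to a wind-speed sample and
# compare the sample and fitted mean wind speeds. The sample is synthetic.
from scipy import stats

wind_speed = stats.weibull_min.rvs(c=2.0, scale=7.0, size=8760, random_state=1)  # "one year" of hourly data

# Location fixed at zero, as is usual for wind-speed distributions
shape, loc, scale = stats.weibull_min.fit(wind_speed, floc=0)
print(f"Weibull shape k = {shape:.2f}, scale A = {scale:.2f} m/s")
print(f"mean wind speed: sample {wind_speed.mean():.2f} m/s, "
      f"fitted {stats.weibull_min.mean(shape, loc=0, scale=scale):.2f} m/s")
```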

Relevance:

10.00%

Publisher:

Abstract:

Master's Degree in Civil Engineering - Construction Management branch

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE To analyze the effect of air pollution and temperature on mortality due to cardiovascular and respiratory diseases. METHODS We evaluated the isolated and synergistic effects of temperature and particulate matter with aerodynamic diameter < 10 µm (PM10) on the mortality of individuals > 40 years old due to cardiovascular disease and that of individuals > 60 years old due to respiratory diseases in Sao Paulo, SP, Southeastern Brazil, between 1998 and 2008. Three methodologies were used to evaluate the isolated association: time-series analysis using Poisson regression model, bidirectional case-crossover analysis matched by period, and case-crossover analysis matched by the confounding factor, i.e., average temperature or pollutant concentration. The graphical representation of the response surface, generated by the interaction term between these factors added to the Poisson regression model, was interpreted to evaluate the synergistic effect of the risk factors. RESULTS No differences were observed between the results of the case-crossover and time-series analyses. The percentage change in the relative risk of cardiovascular and respiratory mortality was 0.85% (0.45;1.25) and 1.60% (0.74;2.46), respectively, due to an increase of 10 μg/m3 in the PM10 concentration. The pattern of correlation of the temperature with cardiovascular mortality was U-shaped and that with respiratory mortality was J-shaped, indicating an increased relative risk at high temperatures. The values for the interaction term indicated a higher relative risk for cardiovascular and respiratory mortalities at low temperatures and high temperatures, respectively, when the pollution levels reached approximately 60 μg/m3. CONCLUSIONS The positive association standardized in the Poisson regression model for pollutant concentration is not confounded by temperature, and the effect of temperature is not confounded by the pollutant levels in the time-series analysis. The simultaneous exposure to different levels of environmental factors can create synergistic effects that are as disturbing as those caused by extreme concentrations.
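The time-series approach described above can be sketched as a Poisson regression of daily death counts on PM10 and temperature. The example below uses simulated placeholder data and omits the adjustments for trend, seasonality and other confounders that such an analysis would normally include.

```python
# Poisson regression of daily deaths on PM10 and temperature (simulated data),
# reporting the percentage change in relative risk per 10 µg/m3 of PM10.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_days = 365
pm10 = rng.normal(45, 15, n_days).clip(5)          # µg/m3, placeholder values
temp = rng.normal(20, 4, n_days)                   # °C, placeholder values
deaths = rng.poisson(np.exp(3.0 + 0.0008 * pm10 - 0.005 * temp))  # assumed true model

X = sm.add_constant(np.column_stack([pm10, temp]))
model = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()

beta_pm10 = model.params[1]
print(f"RR change per 10 µg/m3 PM10: {(np.exp(10 * beta_pm10) - 1) * 100:.2f}%")
```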

Relevance:

10.00%

Publisher:

Abstract:

In an increasingly competitive and globalized world, companies need effective training methodologies and tools for their employees. However, selecting the most suitable ones is not an easy task: it depends on the requirements of the target group (namely time restrictions), on the specificities of the contents, etc. This is typically the case for training in Lean, the waste-elimination manufacturing philosophy. This paper presents and compares two different approaches to lean training methodologies and tools: a simulation game based on a single realistic manufacturing platform, involving production and assembly operations, which allows learning by playing; and a digital game that helps trainees understand lean tools. This paper shows that both tools have advantages in terms of trainee motivation and knowledge acquisition. Furthermore, they can be used in a complementary way, reinforcing the acquired knowledge.

Relevance:

10.00%

Publisher:

Abstract:

In recent years, the electricity industry has undergone a restructuring process. Among its aims was increased competition, especially in the generation activity, where firms would have an incentive to become more efficient. However, the competitive behavior of generating firms might jeopardize the expected benefits of electricity industry liberalization. The present paper proposes a conjectural variations model to study the competitive behavior of generating firms acting in liberalized electricity markets. The model computes a parameter that represents the degree of competition of each generating firm in each trading period. In this regard, the proposed model provides a powerful methodology for regulatory and competition authorities to monitor the competitive behavior of generating firms. As an application of the model, a study of the day-ahead Iberian electricity market (MIBEL) was conducted to analyze the impact of the integration of the Portuguese and Spanish electricity markets on the behavior of generating firms, taking into account the hourly results for June and July 2007. The advantages of the proposed methodology over other approaches used to address market power, namely the Residual Supply Index and the Lerner Index, are highlighted.
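For context, the two benchmark indices named above can be computed directly from prices, marginal costs, capacities and demand, as in the sketch below; the conjectural variations model itself is not reproduced and all figures are invented.

```python
# Two standard market-power indicators mentioned in the abstract.
def lerner_index(price, marginal_cost):
    """Lerner index: (P - MC) / P, a mark-up measure of market power."""
    return (price - marginal_cost) / price

def residual_supply_index(total_capacity, firm_capacity, demand):
    """RSI: capacity of all other firms divided by demand; values below 1
    indicate the firm is pivotal in that trading period."""
    return (total_capacity - firm_capacity) / demand

price, mc = 62.0, 55.0   # €/MWh, hypothetical hourly values
print(f"Lerner index: {lerner_index(price, mc):.3f}")
print(f"RSI: {residual_supply_index(total_capacity=40_000, firm_capacity=9_000, demand=33_000):.2f}")
```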

Relevance:

10.00%

Publisher:

Abstract:

The electrooxidative behavior of pravastatin (PRV) in aqueous media was studied by square-wave voltammetry at a glassy carbon electrode (GCE) and at a screen-printed carbon electrode (SPCE). Maximum peak current intensities in a pH 5.0 buffer were obtained at +1.3 V vs. AgCl/Ag and +1.0 V vs. Ag for the GCE and SPCE surfaces, respectively. Validation of the developed methodologies revealed good performance characteristics and confirmed their applicability to the quantification of PRV in pharmaceutical products without significant sample pretreatment. A comparative analysis of the two electrode types showed that SPCEs are preferred as an electrode surface because of their higher sensitivity and because they eliminate the need to clean the electrode surface for its renewal, which is frequently, if not always, the rate-limiting step in voltammetric analysis.

Relevance:

10.00%

Publisher:

Abstract:

Thesis submitted to the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia, for the degree of Doctor of Philosophy in Environmental Engineering

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE To evaluate the prevalence of self-medication in Brazil's adult population. METHODS Systematic review of cross-sectional population-based studies. The following databases were used: Medline, Embase, Scopus, ISI, CINAHL, Cochrane Library, CRD, Lilacs, SciELO, the Banco de teses brasileiras (Brazilian theses database) (Capes) and files from the Portal Domínio Público (Brazilian Public Domain). In addition, the reference lists of relevant studies were examined to identify potentially eligible articles. No restrictions were applied in terms of publication date, language or publication status. Data related to publication, population, methods and prevalence of self-medication were extracted by three independent researchers. Methodological quality was assessed following eight criteria related to sampling, measurement and presentation of results. The prevalences were measured from participants who used at least one medication during the recall period of the studies. RESULTS The literature screening identified 2,778 records, of which 12 were included for analysis. Most studies were conducted in the Southeastern region of Brazil, after 2000 and with a 15-day recall period. Only five studies achieved high methodological quality, of which one had a 7-day recall period, in which the prevalence of self-medication was 22.9% (95%CI 14.6;33.9). The prevalence of self-medication in the three studies of high methodological quality with a 15-day recall period was 35.0% (95%CI 29.0;40.0, I2 = 83.9%) in the adult Brazilian population. CONCLUSIONS Despite differences in the methodologies of the included studies, the results of this systematic review indicate that a significant proportion of the adult Brazilian population self-medicates. It is suggested that future research projects that assess self-medication in Brazil standardize their methods.
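The heterogeneity statistic quoted above (I²) can be illustrated with a short calculation based on Cochran's Q; the prevalences and sample sizes below are invented and do not correspond to the included studies.

```python
# Fixed-effect pooled prevalence, Cochran's Q and I² for a set of studies.
import numpy as np

p = np.array([0.29, 0.35, 0.41])     # hypothetical study prevalences
n = np.array([1200, 900, 1500])      # hypothetical sample sizes

var = p * (1 - p) / n                # variance of each prevalence estimate
w = 1 / var                          # inverse-variance weights
p_pooled = np.sum(w * p) / np.sum(w)

Q = np.sum(w * (p - p_pooled) ** 2)  # Cochran's Q
df = len(p) - 1
I2 = max(0.0, (Q - df) / Q) * 100    # I² as a percentage

print(f"pooled prevalence: {p_pooled:.3f}")
print(f"Cochran's Q = {Q:.2f}, I² = {I2:.1f}%")
```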

Relevance:

10.00%

Publisher:

Abstract:

Dependability is a critical factor in computer systems, requiring high-quality validation and verification procedures in the development stage. At the same time, digital devices are getting smaller and access to their internal signals and registers is increasingly complex, requiring innovative debugging methodologies. To address this issue, most recent microprocessors include an on-chip debug (OCD) infrastructure to facilitate common debugging operations. This paper proposes an enhanced OCD infrastructure with the objective of supporting the verification of fault-tolerant mechanisms through fault injection campaigns. This upgraded on-chip debug and fault injection (OCD-FI) infrastructure provides an efficient fault injection mechanism with improved capabilities and dynamic behavior. Preliminary results show that this solution provides flexibility in terms of fault triggering and allows high-speed, real-time fault injection in memory elements.
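As a loose software analogue of the basic operation described above (not the OCD-FI hardware infrastructure itself), the sketch below flips a single bit of a memory word to emulate the kind of transient fault injected during a campaign.

```python
# Emulate a single-event upset: invert one bit of a memory word at a chosen
# trigger point, masking the result to the register width.
def inject_bit_flip(word: int, bit: int, width: int = 32) -> int:
    """Return the word with one bit inverted, constrained to `width` bits."""
    return (word ^ (1 << bit)) & ((1 << width) - 1)

original = 0x0000_00FF
faulty = inject_bit_flip(original, bit=7)
print(f"original 0x{original:08X} -> faulty 0x{faulty:08X}")
```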