983 results for Tool materials
Abstract:
The purpose of this paper is to analyse whether multiple-choice tests may be considered an interesting alternative for assessing knowledge, particularly in mathematics, as opposed to traditional methods such as open-question exams. To this end, we review some opinions of researchers in this area. People often perceive this kind of exam as easy to create, but that is not true: constructing well-written tests is hard work and demands writing ability from teachers. Our proposal is to analyse the difficulties of constructing multiple-choice tests, as well as some advantages and limitations of this type of test. We also survey the frequent criticisms and concerns voiced since this objective format came into use. Finally, in this context, some examples of multiple-choice items in mathematics are given, and we illustrate how we can take advantage of and improve this kind of test.
Abstract:
Storm and tsunami deposits are generated by similar depositional mechanisms, making their discrimination hard to establish using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigate the tsunami deposit of the Boca do Rio estuary generated by the 1755 Lisbon earthquake, which is well described in the literature. We first test the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. Then we use concentration-sensitive (MS, SIRM) and grain-size-sensitive (chi(ARM), ARM, B1/2, ARM/SIRM) magnetic proxies coupled with SEM microscopy to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. To study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, the magnetic data were processed by multivariate statistical analyses. Our numerical simulation shows a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of ~7 m. Magnetic data show a dominance of paramagnetic minerals (quartz) mixed with lesser amounts of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate a better connection between the tsunami-induced deposit and a mixture of Units C and D. All these results point to a scenario where the energy released by the tsunami wave was strong enough to overtop the littoral dune, erode an important amount of sand from it, and mix it with materials reworked from underlying layers at least 1 m deep. The method tested here represents an original and promising tool for identifying tsunami-induced deposits in similar embayed beach environments.
Abstract:
The present work concerns a new synthesis approach to prepare niobium-based SAPO materials with the AEL structure, and the characterization of the Nb species incorporated within the inorganic matrixes. The SAPO-11 materials were synthesized with or without the help of a small amine, methylamine (MA), as co-template, while Nb was added directly during the preparation of the initial gel. Structural, textural and acidic properties of the different supports were evaluated by XRD, TPR, UV-Vis spectroscopy, pyridine adsorption followed by IR spectroscopy, and thermal analyses. Pure and well-crystalline Nb-based SAPO-11 materials were obtained, either with or without MA, using a low Si content of about 0.6 in the initial gel. Increasing the Si content of the gel up to 0.9 led to an important decrease in the samples' crystallinity. Niobium was found to be incorporated in the pores of the AEL support both as small Nb2O5 oxide particles and as extra-framework cationic species (Nb5+), compensating the negative charges of the matrix and generating new Lewis acid sites. (C) 2011 Elsevier Inc. All rights reserved.
Abstract:
The design and development of simulation models and tools for Demand Response (DR) programs are becoming increasingly important for taking full advantage of DR programs. Moreover, more active consumer participation in DR programs can help improve system reliability and decrease or defer the required investments. DemSi, a DR simulator designed and implemented by the authors of this paper, allows studying DR actions and schemes in distribution networks. It undertakes the technical validation of the solution using realistic network simulation based on PSCAD. DemSi considers the players involved in DR actions, and the results can be analyzed from each specific player's point of view.
Abstract:
The study of electricity market operation has been gaining increasing importance in recent years, as a result of the new challenges produced by electricity market restructuring. This restructuring increased the competitiveness of the market, and with it its complexity. The growing complexity and unpredictability of the market's evolution consequently increase the difficulty of decision making, so the intervening entities are forced to rethink their behaviour and market strategies. Currently, a great deal of information concerning electricity markets is available. These data, covering numerous aspects of electricity market operation, are accessible free of charge and are essential for understanding and suitably modelling electricity markets. This paper proposes a tool able to handle, store and dynamically update such data. The proposed tool is expected to be of great importance in improving the comprehension of electricity markets and the interactions among the involved entities.
Abstract:
This paper presents a simulator for electric vehicles in the context of smart grids and distribution networks. It aims to support the network operator's planning and operations, but can be used by other entities for related studies. The paper describes the parameters supported by the current version of the Electric Vehicle Scenario Simulator (EVeSSi) tool and its current algorithm. EVeSSi enables the definition of electric vehicle scenarios on distribution networks using a built-in movement engine. The scenarios created with EVeSSi can be used by external tools (e.g., power flow) for specific analyses, for instance of grid impacts. Two scenarios are briefly presented to illustrate the simulator's capabilities.
Abstract:
Short-term risk management is highly dependent on previously established long-term contractual decisions, on the agent's risk aversion factor, and on short-term price forecast accuracy. To address this problem, this paper provides a different approach to short-term risk management in electricity markets. Based on long-term contractual decisions, and making use of a price range forecast method developed by the authors, the short-term risk management tool presented here aims to find the optimal spot market strategies that a producer should adopt for a specific day as a function of his risk aversion factor, with the objective of maximizing profit while hedging against market price volatility. Due to the complexity of the optimization problem, the authors make use of Particle Swarm Optimization (PSO) to find the optimal solution. Results from realistic data, namely from the OMEL electricity market, are presented and discussed in detail.
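The PSO search described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the `profit` function below is a hypothetical one-dimensional stand-in for the producer's risk-adjusted objective, and the swarm parameters are generic textbook values.

```python
import random

random.seed(42)  # reproducible toy run

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer (maximization)."""
    pos = [[random.uniform(0.0, 1.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [objective(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical risk-adjusted profit as a function of the share sold on the
# spot market: linear expected profit minus a quadratic risk penalty.
def profit(x):
    share = x[0]
    return 50 * share - 40 * share ** 2  # toy concave curve, optimum at 0.625

best, val = pso(profit, dim=1)
```

The decision variable would in practice be a vector of hourly spot bids, but the update equations are unchanged in higher dimensions.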
Abstract:
This paper proposes a swarm intelligence long-term hedging tool to support electricity producers in competitive electricity markets. This tool investigates the long-term hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward) and financial (options) settlement. To find the optimal portfolio, the producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. The expected return and its variance are estimated from a forecasted scenario interval determined by a long-term price range forecast model, developed by the authors, whose explanation is outside the scope of this paper. The proposed tool makes use of Particle Swarm Optimization (PSO), and its performance has been evaluated by comparison with a Genetic Algorithm (GA) based approach. To validate the risk management tool, a case study using real historical price data for the mainland Spanish market is presented to demonstrate the effectiveness of the proposed methodology.
Abstract:
Neonatal anthropometry is an inexpensive, noninvasive and convenient tool for bedside evaluation, especially in sick and fragile neonates. Anthropometry can be used in neonates as a tool for several purposes: diagnosis of foetal malnutrition and prediction of early postnatal complications; postnatal assessment of growth, body composition and nutritional status; prediction of long-term complications including metabolic syndrome; assessment of dysmorphology; and estimation of body surface. However, in this age group anthropometry has been notorious for its inaccuracy and the main concern is to make validated indices available. Direct measurements, such as body weight, length and body circumferences are the most commonly used measurements for nutritional assessment in clinical practice and in field studies. Body weight is the most reliable anthropometric measurement and therefore is often used alone in the assessment of the nutritional status, despite not reflecting body composition. Derived indices from direct measurements have been proposed to improve the accuracy of anthropometry. Equations based on body weight and length, mid-arm circumference/head circumference ratio, and upper-arm cross-sectional areas are among the most used derived indices to assess nutritional status and body proportionality, even though these indices require further validation for the estimation of body composition in neonates.
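Two of the derived indices mentioned above can be computed directly from the bedside measurements. The sketch below uses Rohrer's ponderal index (a standard weight-length index) and the mid-arm/head circumference ratio as examples; the input values are illustrative, not reference data, and these are not necessarily the exact equations the authors validate.

```python
def ponderal_index(weight_g, length_cm):
    """Rohrer's ponderal index: 100 * weight (g) / length (cm)^3.
    A weight-length index used to assess body proportionality."""
    return 100.0 * weight_g / length_cm ** 3

def mac_hc_ratio(mid_arm_cm, head_cm):
    """Mid-arm circumference / head circumference ratio, a derived
    index used in the assessment of neonatal nutritional status."""
    return mid_arm_cm / head_cm

# Illustrative values for a term neonate (not reference data):
pi = ponderal_index(3200, 50)     # 100 * 3200 / 50**3 = 2.56
ratio = mac_hc_ratio(10.5, 35.0)  # 0.30
```

As the abstract notes, such indices improve on raw weight alone but still require validation against body-composition measurements in neonates.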
Abstract:
This paper addresses the optimal involvement of a power producer in derivatives electricity markets to hedge against pool price volatility. To this end, a swarm intelligence meta-heuristic optimization technique for a long-term risk management tool is proposed. This tool investigates the long-term risk-hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward contracts) and financial (options contracts) settlement. The producer's risk preference is formulated as a utility function (U) expressing the trade-off between the expectation and the variance of the return. Both are based on a forecasted scenario interval determined by a long-term price range forecasting model. This model also makes use of particle swarm optimization (PSO) to find the parameters that achieve better forecasting results. On the other hand, price estimation depends on load forecasting; this work therefore also presents a regressive long-term load forecast model that makes use of PSO to find the best parameters, as in the price estimation. The performance of the PSO technique has been evaluated by comparison with a Genetic Algorithm (GA) based approach. A case study is presented and the results are discussed taking into account real price and load historical data from the mainland Spanish electricity market, demonstrating the effectiveness of the methodology in handling this type of problem. Finally, conclusions are duly drawn.
Abstract:
This paper introduces the PCMAT platform project and, in particular, one of its components, the PCMAT Metadata Authoring Tool. This is an educational web application that allows the project's metadata creators to write the metadata associated with each learning object without any concern for the semantics of the metadata schema. Furthermore, it permits the project managers to add elements to, or delete elements from, the schema without having to rewrite or compile any code.
Abstract:
This paper presents the SmartClean tool, whose purpose is to detect and correct data quality problems (DQPs). Compared with existing tools, SmartClean has the following main advantage: the user does not need to specify the execution sequence of the data cleaning operations. To that end, an execution sequence was developed. The problems are manipulated (i.e., detected and corrected) following that sequence, which also supports the incremental execution of the operations. In this paper, the underlying architecture of the tool is presented and its components are described in detail. The validity of the tool and, consequently, of the architecture is demonstrated through a case study. Although SmartClean has cleaning capabilities at all other levels, only those related to the attribute value level are described in this paper.
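The idea of a built-in execution sequence at the attribute value level can be sketched as follows. The operation names and their order below are purely hypothetical illustrations of the concept, not SmartClean's actual operations or architecture.

```python
# Hypothetical fixed-order cleaning pipeline at the attribute value level.
# The point illustrated: the user supplies only the data; the tool, not the
# user, decides the order in which cleaning operations run.
def strip_whitespace(v):
    return v.strip() if isinstance(v, str) else v

def empty_to_none(v):
    return None if v == "" else v

def normalize_case(v):
    return v.lower() if isinstance(v, str) else v

PIPELINE = [strip_whitespace, empty_to_none, normalize_case]  # fixed order

def clean_value(v):
    for op in PIPELINE:
        v = op(v)
    return v

records = [{"name": "  Alice "}, {"name": ""}]
cleaned = [{k: clean_value(v) for k, v in r.items()} for r in records]
# cleaned == [{"name": "alice"}, {"name": None}]
```

A fixed, well-chosen order matters because operations interact: trimming whitespace before the empty-string check, for instance, lets `" "` be recognized as missing.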
Abstract:
Today, business group decision making is an extremely important activity. In recent years, a considerable amount of research and a considerable number of applications have aimed to increase the effectiveness of the decision-making process. To support the idea generation process, the IGTAI (Idea Generation Tool for Ambient Intelligence) prototype was created. IGTAI is a Group Decision Support System designed to support any kind of meeting, whether distributed, asynchronous or face-to-face. It aims at helping geographically distributed (or not) people and organizations in the idea generation task, making use of pervasive hardware in a meeting room and expanding the meeting beyond the room walls by allowing ubiquitous access through different kinds of equipment. This paper focuses on the research done to build the IGTAI prototype, its architecture and its main functionalities, namely the support given in the different phases of an idea generation meeting.
Abstract:
Since waste associated with industrial activity in Portugal and in global markets, and its inherent costs, is one of the greatest concerns at all levels of business management, the Lean philosophy emerges as guidance toward solving this problem. The Lean concept, as applied to industry, has always had, and still has, enormous emphasis. Its adoption yields good results in cost reduction, in the improvement of the overall quality of the goods produced, and in production control in general, and it is a powerful tool for strengthening the relationship between the different players in a product's value chain, above all with suppliers and customers. With Lean Management and Glass Wall Management, in environments where the most advanced companies seek to improve their competitiveness through transparent management (Glass Wall Management), in which "all relevant information is shared so that everyone understands the situation" (Suzaki, K., 1993), an organizational structure that enables this transparency, and the consequent maturity of the companies, becomes increasingly important. This work describes some transparent management processes developed over the last two years in a Portuguese SME, examining in depth the transparent management process in place and the tools that help the company, which may broadly be extrapolated to other Portuguese SMEs, so that important and relevant information is shared by all players in the corporate structure, being understood and developed by everyone through Editions of and Revisions to the company's most important documents.
In this study, twenty-one Portuguese SMEs with a make-to-order (MTO) production typology in the upholstery/furniture sector were contacted and asked to complete a questionnaire, with the aim of verifying the use of the Glass Wall Management methodology at the scale of Portuguese companies, and the interpretation of the general Lean concept as a philosophy for reducing materials, time and costs.
Abstract:
Background: A common task in analyzing microarray data is to determine which genes are differentially expressed across two (or more) kinds of tissue samples, or samples submitted to different experimental conditions. Several statistical methods have been proposed to accomplish this goal, generally based on measures of distance between classes. It is well known that biological samples are heterogeneous because of factors such as molecular subtypes or genetic background that are often unknown to the experimenter. For instance, in experiments involving the molecular classification of tumors, it is important to identify significant subtypes of cancer. Bimodal or multimodal distributions often reflect the presence of mixtures of subsamples. Consequently, there can be genes differentially expressed in sample subgroups that are missed if the usual statistical approaches are used. In this paper we propose a new graphical tool which identifies not only genes with up- and down-regulation, but also genes with differential expression in different subclasses, which are usually missed by current statistical methods. This tool is based on two measures of distance between samples, namely the overlapping coefficient (OVL) between two densities and the area under the receiver operating characteristic (ROC) curve. The methodology proposed here was implemented in the open-source R software. Results: The method was applied to a publicly available dataset, as well as to a simulated dataset. We compared our results with those obtained using some of the standard methods for detecting differentially expressed genes, namely the Welch t-statistic, fold change (FC), rank products (RP), average difference (AD), weighted average difference (WAD), moderated t-statistic (modT), intensity-based moderated t-statistic (ibmT), significance analysis of microarrays (samT) and area under the ROC curve (AUC).
On both datasets, the differentially expressed genes with bimodal or multimodal distributions were missed by all the standard selection procedures. We also compared our results with (i) the area between the ROC curve and the rising diagonal (ABCR) and (ii) the test for not-proper ROC curves (TNRC). We found our methodology more comprehensive, because it detects both bimodal and multimodal distributions, and different variances can be considered in both samples. Another advantage of our method is that the behavior of different kinds of differentially expressed genes can be analyzed graphically. Conclusion: Our results indicate that the arrow plot represents a new, flexible and useful tool for the analysis of gene expression profiles from microarrays.
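The two distance measures underlying the arrow plot can be sketched as follows. The paper's implementation is in R; this Python sketch computes the OVL between two normal densities by numerical integration and the AUC via its Mann-Whitney form. The expression values at the end are simulated toy data, not taken from the paper's datasets.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def ovl_normal(mu1, s1, mu2, s2, lo=-10.0, hi=10.0, n=10000):
    """Overlapping coefficient: integral of min(f, g) over the real line
    (approximated on [lo, hi]). 1 means identical densities, 0 disjoint."""
    h = (hi - lo) / n
    return sum(min(normal_pdf(lo + i * h, mu1, s1),
                   normal_pdf(lo + i * h, mu2, s2)) * h for i in range(n))

def auc(sample_a, sample_b):
    """Mann-Whitney estimate of P(A > B), i.e. the area under the ROC curve
    for separating the two samples; ties count half."""
    wins = sum((a > b) + 0.5 * (a == b) for a in sample_a for b in sample_b)
    return wins / (len(sample_a) * len(sample_b))

# Identical densities overlap completely; well-separated ones barely overlap.
same = ovl_normal(0, 1, 0, 1)    # ~1.0
apart = ovl_normal(0, 1, 4, 1)   # ~0.045 (= 2 * Phi(-2) for unit variances)
# Toy expression values for one gene: fully up-regulated in group A.
up = auc([2.1, 2.5, 3.0], [0.4, 0.9, 1.1])  # 1.0
```

In the arrow plot, each gene contributes one (OVL, AUC) pair, so that bimodal subclass effects, which leave the means similar but reduce the overlap, remain visible.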