992 results for Numerical tool
Abstract:
Neonatal anthropometry is an inexpensive, noninvasive and convenient tool for bedside evaluation, especially in sick and fragile neonates. Anthropometry can be used in neonates for several purposes: diagnosis of foetal malnutrition and prediction of early postnatal complications; postnatal assessment of growth, body composition and nutritional status; prediction of long-term complications, including metabolic syndrome; assessment of dysmorphology; and estimation of body surface. However, in this age group anthropometry has been notorious for its inaccuracy, and the main concern is to make validated indices available. Direct measurements, such as body weight, length and body circumferences, are the most commonly used measurements for nutritional assessment in clinical practice and in field studies. Body weight is the most reliable anthropometric measurement and is therefore often used alone in the assessment of nutritional status, despite not reflecting body composition. Indices derived from direct measurements have been proposed to improve the accuracy of anthropometry. Equations based on body weight and length, the mid-arm circumference/head circumference ratio, and upper-arm cross-sectional areas are among the most used derived indices for assessing nutritional status and body proportionality, even though these indices require further validation for the estimation of body composition in neonates.
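The derived indices mentioned above lend themselves to a short worked example. The sketch below computes two standard published indices, Rohrer's ponderal index and the Kanawati-McLaren mid-arm/head circumference ratio; these particular formulas and the example values are assumptions chosen for illustration, not equations taken from the abstract itself.

```python
# Illustrative only: two commonly cited derived anthropometric indices.
# The formulas below (Rohrer's ponderal index and the Kanawati-McLaren
# mid-arm/head circumference ratio) stand in for the "derived indices"
# the abstract mentions; they are not the authors' own equations.

def ponderal_index(weight_g: float, length_cm: float) -> float:
    """Rohrer's ponderal index: 100 * weight (g) / length (cm)^3."""
    return 100.0 * weight_g / length_cm ** 3

def mac_hc_ratio(mid_arm_circ_cm: float, head_circ_cm: float) -> float:
    """Mid-arm circumference to head circumference ratio."""
    return mid_arm_circ_cm / head_circ_cm

# Example: a term neonate weighing 3200 g with a crown-heel length of 50 cm.
print(f"PI = {ponderal_index(3200, 50):.2f}")        # ~2.56
print(f"MAC/HC = {mac_hc_ratio(10.5, 34.0):.2f}")    # ~0.31
```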
Abstract:
This paper addresses the optimal involvement of a power producer in derivatives electricity markets to hedge against pool price volatility. To this end, a long-term risk management tool based on a swarm intelligence meta-heuristic optimization technique is proposed. The tool investigates the long-term risk-hedging opportunities available to electric power producers through contracts with physical (spot and forward contracts) and financial (options contracts) settlement. The producer's risk preference is formulated as a utility function (U) expressing the trade-off between the expectation and the variance of the return. Both the expectation and the variance of the return are computed over a forecasted scenario interval determined by a long-term price-range forecasting model. This model also uses particle swarm optimization (PSO) to find the parameters that achieve the best forecasting results. Since price estimation depends on load forecasting, this work also presents a regressive long-term load forecasting model that uses PSO to find the best parameters, as in the price estimation. The performance of the PSO technique was evaluated by comparison with a Genetic Algorithm (GA) based approach. A case study is presented and the results are discussed using real price and load historical data from the mainland Spanish electricity market, demonstrating the effectiveness of the methodology in handling this type of problem. Finally, conclusions are duly drawn.
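Two ingredients of the tool, the mean-variance utility and the PSO search, can be sketched compactly. In the example below the scenario data, the utility form U = E[R] - A*Var[R], the single decision variable (fraction of energy sold forward) and all parameter values are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pool-price scenarios (EUR/MWh) standing in for the long-term
# price-range forecast; all figures here are illustrative assumptions.
pool_prices = rng.normal(50.0, 15.0, size=1000)
forward_price = 48.0   # fixed price of the forward contract
volume = 100.0         # MWh to be sold

def utility(x, risk_aversion=1e-4):
    """Mean-variance utility U = E[R] - A*Var[R] for hedge fraction x in [0, 1]."""
    revenue = volume * (x * forward_price + (1.0 - x) * pool_prices)
    return revenue.mean() - risk_aversion * revenue.var()

# Minimal PSO over the single decision variable (fraction sold forward).
n, iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
pos = rng.uniform(0.0, 1.0, n)
vel = np.zeros(n)
pbest = pos.copy()
pbest_val = np.array([utility(p) for p in pos])
gbest = pbest[pbest_val.argmax()]

for _ in range(iters):
    r1, r2 = rng.uniform(size=n), rng.uniform(size=n)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)       # keep the hedge fraction in [0, 1]
    vals = np.array([utility(p) for p in pos])
    better = vals > pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmax()]

print(f"hedge fraction ~ {gbest:.2f}, utility = {utility(gbest):.1f}")
```

With these assumed numbers the optimum is an interior hedge fraction: selling everything forward sacrifices expected revenue, while selling everything at the pool price carries the full variance penalty.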
Abstract:
This work addresses the development of the body of the Ecological Electric Vehicle (VEECO) using computer-aided technologies. Since it is impossible to cover the whole field of computer-aided technologies involved in the development of a car body, this work focuses on the process of obtaining a valid digital model and on the study of the body's aerodynamic performance. A valid digital model is the basis of any development process built on computer-aided technologies. Accordingly, in a first stage, techniques and methodologies were applied and developed that carry a car body from its design phase to a digital CAD model. These cover data conversion and import, reverse engineering, CAD construction/reconstruction in CATIA V5, and the preparation/correction of CAD models for numerical analysis. In a second stage, the external aerodynamics of the body was studied using the computational fluid dynamics (CFD) tool Flow Simulation from CosmosFloworks, integrated in SolidWorks 2010. In connection with the aerodynamic study, and given the great importance of validating numerical results against experimental data, a dimensional analysis was carried out that enables scale experimental tests, together with the analysis of the experimental results obtained.
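As an illustration of the dimensional-analysis step mentioned above, the sketch below checks Reynolds-number similarity between a full-scale body and a wind-tunnel model; all dimensions and speeds are hypothetical, not VEECO data.

```python
# A minimal sketch of the dimensional-analysis step: for the scale-model test
# to be aerodynamically similar, the Reynolds number of model and full-size
# body must match. All numbers below are hypothetical, not VEECO data.

AIR_KINEMATIC_VISCOSITY = 1.5e-5  # m^2/s, air at roughly 20 C

def reynolds(speed_m_s: float, length_m: float,
             nu: float = AIR_KINEMATIC_VISCOSITY) -> float:
    """Re = V * L / nu for a characteristic length L."""
    return speed_m_s * length_m / nu

full_length = 3.6          # m, assumed full-scale vehicle length
scale = 1 / 4              # assumed wind-tunnel model scale
road_speed = 25.0          # m/s (90 km/h)

# Matching Re in the same fluid requires the tunnel speed to grow as the
# model shrinks: V_model = V_full / scale.
tunnel_speed = road_speed / scale
print(f"Re(full)  = {reynolds(road_speed, full_length):.2e}")
print(f"Re(model) = {reynolds(tunnel_speed, scale * full_length):.2e}")
print(f"required tunnel speed for a 1:4 model = {tunnel_speed:.0f} m/s")
```

Matching Re exactly at small scale demands high tunnel speeds, which is one reason the dimensional analysis deserves the careful treatment the abstract describes.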
Abstract:
This paper introduces the PCMAT platform project and, in particular, one of its components, the PCMAT Metadata Authoring Tool. This is an educational web application that allows the project's metadata creators to write the metadata associated with each learning object without any concern for the semantics of the metadata schema. Furthermore, it permits the project managers to add elements to the schema, or delete them, without having to rewrite or compile any code.
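A minimal sketch of how such a data-driven schema might work follows; the element names and validation rules are hypothetical, not the actual PCMAT schema or implementation.

```python
# A minimal sketch of a data-driven metadata schema of the kind the abstract
# describes: elements live in data, so managers can add or delete them without
# rewriting or recompiling code. Names and rules here are hypothetical.

schema = {
    "title":       {"required": True,  "type": str},
    "description": {"required": False, "type": str},
    "difficulty":  {"required": False, "type": int},
}

def validate(record: dict) -> list[str]:
    """Return the list of validation errors for a metadata record."""
    errors = []
    for name, rule in schema.items():
        if rule["required"] and name not in record:
            errors.append(f"missing required element: {name}")
        elif name in record and not isinstance(record[name], rule["type"]):
            errors.append(f"wrong type for element: {name}")
    return errors

schema["author"] = {"required": True, "type": str}   # a manager adds an element
del schema["difficulty"]                             # ...or deletes one
print(validate({"title": "Pythagoras drill", "author": "A. Teacher"}))  # []
```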
Abstract:
This paper presents the SmartClean tool, whose purpose is to detect and correct data quality problems (DQPs). Compared with existing tools, SmartClean has the following main advantage: the user does not need to specify the execution sequence of the data cleaning operations. To that end, an execution sequence was developed, and the problems are manipulated (i.e., detected and corrected) following that sequence. The sequence also supports the incremental execution of the operations. In this paper, the underlying architecture of the tool is presented and its components are described in detail. The validity of the tool and, consequently, of the architecture is demonstrated through the presentation of a case study. Although SmartClean has cleaning capabilities at all other levels, only those related to the attribute value level are described in this paper.
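To make the fixed execution sequence concrete, here is a minimal sketch in which the order of the cleaning operations is encoded in data rather than chosen by the user; the operations shown are hypothetical stand-ins at the attribute value level, not SmartClean's actual operation set.

```python
import re

# A minimal sketch of the idea described above: each operation detects and
# corrects one attribute-value-level data quality problem, and the list order
# encodes the fixed execution sequence, so the user never specifies it.
PIPELINE = [
    ("trim surrounding whitespace", lambda v: v.strip()),
    ("collapse repeated spaces",    lambda v: re.sub(r"\s+", " ", v)),
    ("normalize capitalization",    lambda v: v.title()),
]

def clean(value: str) -> str:
    """Apply the operations incrementally: each one sees the prior fixes."""
    for _name, op in PIPELINE:
        value = op(value)
    return value

print(clean("  joHN   smITH "))  # -> "John Smith"
```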
Abstract:
Today, business group decision making is an extremely important activity. A considerable number of applications and a great deal of research have appeared in recent years with the aim of increasing the effectiveness of the decision-making process. To support the idea generation process, the IGTAI (Idea Generation Tool for Ambient Intelligence) prototype was created. IGTAI is a Group Decision Support System designed to support any kind of meeting, namely distributed, asynchronous or face-to-face. It aims at helping geographically distributed (or not) people and organizations in the idea generation task by making use of pervasive hardware in a meeting room, expanding the meeting beyond the room walls by allowing ubiquitous access through different kinds of equipment. This paper focuses on the research carried out to build the IGTAI prototype, its architecture and its main functionalities, namely the support given in the different phases of an idea generation meeting.
Abstract:
Background: A common task in analyzing microarray data is to determine which genes are differentially expressed across two (or more) kinds of tissue samples, or samples obtained under different experimental conditions. Several statistical methods have been proposed to accomplish this goal, generally based on measures of distance between classes. It is well known that biological samples are heterogeneous because of factors such as molecular subtypes or genetic background that are often unknown to the experimenter. For instance, in experiments involving the molecular classification of tumors it is important to identify significant subtypes of cancer. Bimodal or multimodal distributions often reflect the presence of mixtures of subsamples. Consequently, there can be genes differentially expressed in sample subgroups that are missed if the usual statistical approaches are used. In this paper we propose a new graphical tool which identifies not only genes with up- and down-regulation, but also genes with differential expression in different subclasses, which are usually missed by current statistical methods. This tool is based on two measures of distance between samples, namely the overlapping coefficient (OVL) between two densities and the area under the receiver operating characteristic (ROC) curve. The proposed methodology was implemented in the open-source R software. Results: The method was applied to a publicly available dataset as well as to a simulated dataset. We compared our results with those obtained using some of the standard methods for detecting differentially expressed genes, namely the Welch t-statistic, fold change (FC), rank products (RP), average difference (AD), weighted average difference (WAD), moderated t-statistic (modT), intensity-based moderated t-statistic (ibmT), significance analysis of microarrays (samT) and area under the ROC curve (AUC). On both datasets, the differentially expressed genes with bimodal or multimodal distributions were missed by all the standard selection procedures. We also compared our results with (i) the area between the ROC curve and the rising diagonal (ABCR) and (ii) the test for not-proper ROC curves (TNRC). We found our methodology more comprehensive, because it detects both bimodal and multimodal distributions, and different variances can be considered in both samples. Another advantage of our method is that the behavior of different kinds of differentially expressed genes can be analyzed graphically. Conclusion: Our results indicate that the arrow plot is a new, flexible and useful tool for the analysis of gene expression profiles from microarrays.
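The two distance measures underlying the arrow plot can be sketched in a few lines. The example below (Python rather than the paper's R) estimates the OVL from kernel density estimates and the AUC from the Mann-Whitney U statistic for one hypothetical bimodal gene; the data and implementation details are assumptions for illustration, not the paper's code.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import gaussian_kde, mannwhitneyu

rng = np.random.default_rng(1)
# Hypothetical expression values for one gene in two sample classes; class_b
# is deliberately bimodal, the case the abstract says standard methods miss.
class_a = rng.normal(0.0, 1.0, 40)
class_b = np.concatenate([rng.normal(-2.0, 0.7, 20), rng.normal(2.0, 0.7, 20)])

# OVL: area shared by the two kernel density estimates.
grid = np.linspace(min(class_a.min(), class_b.min()) - 1.0,
                   max(class_a.max(), class_b.max()) + 1.0, 512)
f_a = gaussian_kde(class_a)(grid)
f_b = gaussian_kde(class_b)(grid)
ovl = trapezoid(np.minimum(f_a, f_b), grid)

# AUC: obtained from the Mann-Whitney U statistic as U / (n_a * n_b).
u_stat, _ = mannwhitneyu(class_a, class_b)
auc = u_stat / (len(class_a) * len(class_b))

# For a symmetric bimodal gene like this one, AUC stays near 0.5 (so AUC-based
# rankings miss it) while the low OVL still flags the differential expression.
print(f"OVL = {ovl:.2f}, AUC = {auc:.2f}")
```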
Abstract:
In this work we solve Mathematical Programs with Complementarity Constraints using the hyperbolic smoothing strategy. Under this approach, the complementarity condition is relaxed through the use of the hyperbolic smoothing function, which involves a positive parameter that can be decreased to zero. An iterative algorithm is implemented in the MATLAB language and tested on a set of AMPL problems from the MacMPEC database.
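A minimal sketch of the approach follows, under the assumption of one common form of the hyperbolic smoothing function; the toy problem, the solver and the parameter schedule are illustrative choices, not the authors' MATLAB implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed smoothing: phi(a, b) = a + b - sqrt((a - b)^2 + tau^2), which tends
# to 2*min(a, b) as tau -> 0, so phi = 0 relaxes the complementarity a*b = 0.
# Toy MPCC:  min (x1 - 1)^2 + (x2 - 1)^2  s.t.  x1, x2 >= 0,  x1 * x2 = 0.

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2

def phi(x, tau):
    """Smoothed complementarity: phi = 0 replaces x1 * x2 = 0."""
    return x[0] + x[1] - np.sqrt((x[0] - x[1]) ** 2 + tau ** 2)

x = np.array([0.9, 0.1])                     # asymmetric start avoids a saddle
for tau in [0.5, 0.1, 0.01, 1e-4]:           # drive the parameter toward zero
    res = minimize(objective, x, method="SLSQP",
                   bounds=[(0.0, None), (0.0, None)],
                   constraints={"type": "eq", "fun": phi, "args": (tau,)})
    x = res.x                                # warm-start the next subproblem

print(x, objective(x))  # approaches the MPCC solution (1, 0) with value 1
```

For nonnegative variables this smoothed equality is algebraically equivalent to x1*x2 = tau^2/4, which makes the link to the regularization scheme in the next entry explicit.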
Abstract:
In this paper we present a modified regularization scheme for Mathematical Programs with Complementarity Constraints. In the regularized formulations, the complementarity condition is replaced by a constraint involving a positive parameter that can be decreased to zero. In our approach, both the complementarity condition and the nonnegativity constraints are relaxed. An iterative algorithm is implemented in the MATLAB language and tested on a set of AMPL problems from the MacMPEC database.
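For comparison with the previous entry, a minimal sketch of the relaxation idea follows on the same toy problem; the exact relaxed constraints used by the authors may differ, so the forms below are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy MPCC:  min (x1 - 1)^2 + (x2 - 1)^2  s.t.  x1, x2 >= 0,  x1 * x2 = 0.
# Assumed relaxation with parameter t > 0, loosening BOTH conditions:
#   x1 * x2 <= t   (relaxed complementarity),   x_i >= -t   (relaxed sign).

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2

x = np.array([0.9, 0.1])
for t in [0.1, 0.01, 1e-3, 1e-5]:            # drive the parameter toward zero
    cons = [{"type": "ineq", "fun": lambda x, t=t: t - x[0] * x[1]},
            {"type": "ineq", "fun": lambda x, t=t: x[0] + t},
            {"type": "ineq", "fun": lambda x, t=t: x[1] + t}]
    x = minimize(objective, x, method="SLSQP", constraints=cons).x

print(x, objective(x))  # tends to the MPCC solution (1, 0) as t -> 0
```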
Abstract:
Master's degree in Electrical Engineering – Electrical Power Systems
Abstract:
Deep ocean species: the little that is known mostly comes from collected specimens. In their Letter "Specimen collection: An essential tool" (23 May, 344: 814), L.A. Rocha et al. brilliantly discuss the importance of specimen collection and trace the evolution of collecting from the mid-19th century to our present strict codes of conduct. However, it is also important to emphasize that the vast majority of deep-ocean macro-organisms are known to us only because of collection, and this is a strong argument that should inform our actions as scientists. If the deep is considered the least known of Earth's habitats (about 1% explored, according to recent estimates), what an awesome collection of yet-to-be-discovered species must still be there to be properly described. As the authors point out, citing (1), around 86% of species remain unknown. Voucher specimens are fundamental for the reasons they give, and perhaps the vast depths of the world's oceans are the best example of that importance. The summary report of the 2010 Census of Marine Life (2) showed that, among the millions of specimens collected in both familiar and seldom-explored waters, the Census found more than 6,000 potentially new species and completed formal descriptions of more than 1,200 of them. It also found that a number of supposedly rare species are in fact common. Voucher specimens are essential and, again agreeing with the Letter of L.A. Rocha et al. (see above), the modern approach to collecting will not be a cause of extinctions but rather a valuable tool for knowledge and description, and even, as seen above, a way to discover that supposedly rare species may not be that rare and may even prove to have abundant populations.
Abstract:
Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important, since the condition can evolve into cirrhosis. In this paper, a new computer-aided diagnosis (CAD) system for steatosis classification, on a local and a global basis, is presented. A Bayes factor is computed from objective ultrasound textural features extracted from the liver parenchyma. The goal is to develop a CAD screening tool to help in steatosis detection. Results showed an accuracy of 93.33%, with a sensitivity of 94.59% and a specificity of 92.11%, using the Bayes classifier. The proposed CAD system also provides a suitable graphical display for steatosis classification.
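A minimal sketch of a Bayes-factor decision rule of the kind described follows, reduced to a single hypothetical texture feature with assumed class-conditional Gaussians; the actual system extracts several ultrasound textural features and uses its own fitted models.

```python
from scipy.stats import norm

# Assumed class-conditional Gaussians, as if fitted on training data; the
# feature, its distributions and the threshold are hypothetical.
normal_dist = norm(loc=0.40, scale=0.08)     # feature | normal liver
steato_dist = norm(loc=0.62, scale=0.10)     # feature | steatosis

def bayes_factor(feature_value: float) -> float:
    """BF = p(x | steatosis) / p(x | normal)."""
    return steato_dist.pdf(feature_value) / normal_dist.pdf(feature_value)

x = 0.55                                     # feature measured on one patch
bf = bayes_factor(x)
print(f"BF = {bf:.2f} -> {'steatosis' if bf > 1.0 else 'normal'}")
```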
Abstract:
Introduction: In twenty-first-century society the scientific investigation process has been growing steadily, and pharmaceutical research is one of its most enthusiastic and relevant fields. Here it is very important to correlate observed functional alterations with possibly modified drug biodistribution patterns. Cancer, inflammation and infection are processes that induce many molecular intermediates, such as cytokines, chemokines and other chemical complexes, that can alter the pharmacokinetics of many drugs. One cause of such changes is thought to be the modulatory action of these complexes on P-glycoprotein (Pgp) activity, because they can act as inducers/inhibitors of MDR-1 expression. This protein results from the expression of the MDR-1 gene and acts as an ATP-dependent efflux pump whose substrates include many drugs, such as antiretrovirals, anticancer agents, anti-infectives, immunosuppressants, steroids and opioids. Objectives: Given the lack of methods that provide helpful information for investigating in vivo molecular changes in Pgp activity during infection/inflammation processes, and their value in explaining altered drug pharmacokinetics, this paper evaluates the potential utility of 99mTc-Sestamibi scintigraphy in this kind of health sciences research. Although the aim is indeed to create a technique for the in vivo study of Pgp activity, this preliminary project only reaches the in vitro study phase, taken as the first step in the evaluation of a new tool. Materials and Methods: For that reason, we are performing in vitro studies of the influx and efflux of 99mTc-Sestamibi (a Pgp substrate) in a hepatocyte cell line (HepG2). We are interested in clarifying the cellular behavior of this radiopharmaceutical in lipopolysaccharide (LPS)-stimulated cells (a well-known in vitro model of inflammation) in order to validate this methodology. To validate the results, Pgp expression will finally be evaluated using the Western blot technique. Results: We do not yet have final results, but we already have enough data to suggest that LPS stimulation induces a downregulation of MDR-1, and consequently of Pgp, which could lead to prolonged retention of 99mTc-Sestamibi in the inflamed cells. Conclusions: If and when this methodology demonstrates the promising results we expect, one will be able to conclude that Nuclear Medicine is an important tool for evidence-based research in this specific field as well.