27 results for Tool path computing
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
The need for computational power keeps growing across the various areas of human activity, both in industry and in academic environments. Grid Computing allows dispersed computational resources to be connected so that they can be used more effectively, providing users with simplified access to the computing power of multiple systems. The first Grid Computing projects involved linking parallel machines or high-performance, high-cost clusters, available only at a few institutions. In contrast with the high cost of supercomputers, personal computers and the Internet have evolved significantly over recent years. Using computers dispersed over a WAN can provide a very interesting environment for high-performance processing. Grid systems make it possible to use a set of personal computers to deliver computation from resources that would otherwise go unused. This work consists of a study of Grid Computing, both as a concept and in terms of architecture, together with an analysis of its current state. As a complement, a component was developed that supports the development of services for Grids (Grid Services) more effectively than the service-support model currently in use. This component is made available as a plug-in for the Eclipse IDE.
Abstract:
Neonatal anthropometry is an inexpensive, noninvasive and convenient tool for bedside evaluation, especially in sick and fragile neonates. Anthropometry can be used in neonates as a tool for several purposes: diagnosis of foetal malnutrition and prediction of early postnatal complications; postnatal assessment of growth, body composition and nutritional status; prediction of long-term complications including metabolic syndrome; assessment of dysmorphology; and estimation of body surface. However, in this age group anthropometry has been notorious for its inaccuracy and the main concern is to make validated indices available. Direct measurements, such as body weight, length and body circumferences are the most commonly used measurements for nutritional assessment in clinical practice and in field studies. Body weight is the most reliable anthropometric measurement and therefore is often used alone in the assessment of the nutritional status, despite not reflecting body composition. Derived indices from direct measurements have been proposed to improve the accuracy of anthropometry. Equations based on body weight and length, mid-arm circumference/head circumference ratio, and upper-arm cross-sectional areas are among the most used derived indices to assess nutritional status and body proportionality, even though these indices require further validation for the estimation of body composition in neonates.
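As a rough illustration of the derived indices mentioned above, the following Python sketch computes Rohrer's ponderal index and the mid-arm/head circumference ratio; the formulas are standard textbook definitions and the example measurements are hypothetical, not taken from this study.

```python
# Hedged sketch of two derived anthropometric indices; example values are made up.

def ponderal_index(weight_g: float, length_cm: float) -> float:
    """Rohrer's ponderal index: 100 * weight (g) / length (cm)^3."""
    return 100.0 * weight_g / length_cm ** 3

def mac_hc_ratio(mid_arm_circ_cm: float, head_circ_cm: float) -> float:
    """Mid-arm circumference / head circumference ratio."""
    return mid_arm_circ_cm / head_circ_cm

if __name__ == "__main__":
    # Hypothetical term neonate: 3200 g, 49 cm, MAC 10.5 cm, HC 34 cm.
    print(f"Ponderal index: {ponderal_index(3200, 49):.2f}")
    print(f"MAC/HC ratio:   {mac_hc_ratio(10.5, 34):.2f}")
```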
Abstract:
Background: A common task in analyzing microarray data is to determine which genes are differentially expressed across two (or more) kinds of tissue samples or samples obtained under different experimental conditions. Several statistical methods have been proposed to accomplish this goal, generally based on measures of distance between classes. It is well known that biological samples are heterogeneous because of factors such as molecular subtypes or genetic background that are often unknown to the experimenter. For instance, in experiments involving molecular classification of tumors it is important to identify significant subtypes of cancer. Bimodal or multimodal distributions often reflect the presence of mixtures of subsamples. Consequently, there can be genes differentially expressed in sample subgroups that are missed if the usual statistical approaches are used. In this paper we propose a new graphical tool which not only identifies genes with up- and down-regulation, but also genes with differential expression in different subclasses, which are usually missed if current statistical methods are used. This tool is based on two measures of distance between samples, namely the overlapping coefficient (OVL) between two densities and the area under the receiver operating characteristic (ROC) curve. The methodology proposed here was implemented in the open-source R software. Results: The method was applied to a publicly available dataset, as well as to a simulated dataset. We compared our results with the ones obtained using some of the standard methods for detecting differentially expressed genes, namely the Welch t-statistic, fold change (FC), rank products (RP), average difference (AD), weighted average difference (WAD), moderated t-statistic (modT), intensity-based moderated t-statistic (ibmT), significance analysis of microarrays (samT) and area under the ROC curve (AUC). On both datasets, the differentially expressed genes with bimodal or multimodal distributions were not picked up by the standard selection procedures. We also compared our results with (i) the area between the ROC curve and the rising diagonal (ABCR) and (ii) the test for not proper ROC curves (TNRC). We found our methodology more comprehensive, because it detects both bimodal and multimodal distributions and different variances can be considered in both samples. Another advantage of our method is that the behavior of different kinds of differentially expressed genes can be analyzed graphically. Conclusion: Our results indicate that the arrow plot represents a new, flexible and useful tool for the analysis of gene expression profiles from microarrays.
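The authors' implementation is in R and is not reproduced here; the Python sketch below only illustrates the two distance measures the arrow plot builds on, estimating the OVL from kernel density estimates and the AUC via the Mann-Whitney statistic, on simulated data with one deliberately bimodal class.

```python
# Illustrative sketch (not the authors' R implementation) of OVL and AUC.
import numpy as np
from scipy.stats import gaussian_kde, mannwhitneyu

def ovl(x: np.ndarray, y: np.ndarray, grid_size: int = 512) -> float:
    """OVL = integral of min(f_x, f_y), estimated with Gaussian kernel densities."""
    grid = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), grid_size)
    fx, fy = gaussian_kde(x)(grid), gaussian_kde(y)(grid)
    return np.trapz(np.minimum(fx, fy), grid)

def auc(x: np.ndarray, y: np.ndarray) -> float:
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    u, _ = mannwhitneyu(x, y, alternative="two-sided")
    return u / (len(x) * len(y))

rng = np.random.default_rng(0)
control = rng.normal(0.0, 1.0, 50)                    # unimodal class
case = np.concatenate([rng.normal(-2, 0.5, 25),       # bimodal class:
                       rng.normal(2, 0.5, 25)])       # a mixture of two subgroups
print(f"OVL = {ovl(control, case):.2f}, AUC = {auc(control, case):.2f}")
```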
Abstract:
As is well known, competitive electricity markets require new computing tools for power companies operating in retail markets, in order to enhance the management of their energy resources. Over the last few years there has been an increase in the penetration of renewables into micro-generation, which is beginning to co-exist with the other existing forms of power generation, giving rise to a new type of consumer. This paper develops a methodology to be applied to the management of all the aggregators. The aggregator establishes bilateral contracts with its clients, in which the energy purchase and selling conditions are negotiated not only in terms of prices but also of other conditions that allow more flexibility in the way generation and consumption are addressed. The aggregator agent needs a tool to support decision making in order to compose and select its customers' portfolio in an optimal way, for a given level of profitability and risk.
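The paper's optimisation model is not detailed in the abstract; as a hedged analogue, the sketch below frames the customers' portfolio selection as a generic mean-variance (profit versus risk) problem, with hypothetical expected profits, a placeholder covariance matrix and an assumed risk-aversion parameter.

```python
# Generic profit-vs-risk portfolio sketch; all numbers are placeholder assumptions.
import numpy as np
from scipy.optimize import minimize

expected_profit = np.array([12.0, 8.0, 15.0, 5.0])   # per client type (assumed)
risk_cov = np.diag([9.0, 2.0, 16.0, 1.0])            # profit covariance (assumed)
risk_aversion = 0.5                                   # trade-off parameter (assumed)

def objective(w: np.ndarray) -> float:
    # Maximise profit minus a risk penalty == minimise its negative.
    return -(w @ expected_profit - risk_aversion * w @ risk_cov @ w)

constraints = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)   # weights sum to 1
bounds = [(0.0, 1.0)] * len(expected_profit)
result = minimize(objective, x0=np.full(4, 0.25), bounds=bounds, constraints=constraints)
print("Portfolio weights per client type:", np.round(result.x, 3))
```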
Abstract:
We investigate the phase behaviour of 2D mixtures of bi-functional and three-functional patchy particles and 3D mixtures of bi-functional and tetra-functional patchy particles by means of Monte Carlo simulations and Wertheim theory. We start by computing the critical points of the pure systems and then we investigate how the critical parameters change upon lowering the temperature. We extend the successive umbrella sampling method to mixtures to make it possible to extract information about the phase behaviour of the system at a fixed temperature for the whole range of densities and compositions of interest. (C) 2013 AIP Publishing LLC.
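As a rough sketch of the stitching step behind successive umbrella sampling (assuming per-window histogram counts are already available from simulation), the code below chains the ratios P(k+1)/P(k) into a normalised particle-number distribution; the counts used are placeholders, not simulation output, and the method's window sampling itself is not shown.

```python
# Placeholder window counts stand in for real grand-canonical Monte Carlo output.
import numpy as np

def stitch_sus_windows(window_counts):
    """window_counts[k] = (H_k(k), H_k(k+1)): histogram counts of the two particle
    numbers allowed in window k. Returns the normalised distribution P(N)."""
    log_p = [0.0]
    for h_low, h_high in window_counts:
        log_p.append(log_p[-1] + np.log(h_high / h_low))   # log P(k+1) - log P(k)
    log_p = np.array(log_p)
    p = np.exp(log_p - log_p.max())                        # avoid overflow
    return p / p.sum()

print(stitch_sus_windows([(1000.0, 800.0), (900.0, 1100.0), (700.0, 1300.0)]))
```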
Abstract:
Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important, since the disease can evolve to cirrhosis. In this paper, a new computer-aided diagnosis (CAD) system for steatosis classification, on a local and global basis, is presented. A Bayes factor is computed from objective ultrasound textural features extracted from the liver parenchyma. The goal is to develop a CAD screening tool to help in steatosis detection. Results showed an accuracy of 93.33%, with a sensitivity of 94.59% and a specificity of 92.11%, using the Bayes classifier. The proposed CAD system also provides a suitable graphical display for steatosis classification.
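The exact features and classifier settings are not given in the abstract; the following sketch only illustrates the general idea of a Bayes factor computed from Gaussian class-conditional models of hypothetical ultrasound texture features.

```python
# Hedged sketch: Bayes factor for "steatosis vs. normal" from made-up feature models.
import numpy as np
from scipy.stats import multivariate_normal

def bayes_factor(features, mean_steatosis, cov_steatosis, mean_normal, cov_normal):
    """Likelihood ratio p(features | steatosis) / p(features | normal)."""
    l1 = multivariate_normal.pdf(features, mean_steatosis, cov_steatosis)
    l0 = multivariate_normal.pdf(features, mean_normal, cov_normal)
    return l1 / l0

# Hypothetical 2-feature example (e.g. mean echo intensity, attenuation slope).
bf = bayes_factor(np.array([0.8, 0.3]),
                  mean_steatosis=np.array([0.9, 0.4]), cov_steatosis=np.eye(2) * 0.05,
                  mean_normal=np.array([0.5, 0.1]),   cov_normal=np.eye(2) * 0.05)
print(f"Bayes factor = {bf:.2f}  (>1 favours steatosis under equal priors)")
```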
Abstract:
Thesis submitted in fulfilment of the requirements for the degree of Master in Electronic and Telecommunications Engineering
Abstract:
Master's degree in Accounting and Management of Financial Institutions
Abstract:
The process of self-regulation does not develop spontaneously in students. It is therefore necessary to prepare the work with the students in order to achieve effective self-regulation and an appropriation of the meaning of the learning objectives. Within the scope of the Master's in Pre-School Education and Teaching in the 1st Cycle of Basic Education, I carried out a study aimed at understanding the contribution of the portfolio as an instrument for (self-)regulation of learning. Our working hypotheses thus consisted of analysing how the construction process was negotiated with the students and how the portfolio was promoted and used in the classroom. With this analysis we also sought to understand how the students evolved along this path in terms of their appropriation of this instrument for the development of their learning. The methodology adopted follows a qualitative approach with a design close to action research. Data were collected through observation, through interviews and a questionnaire, and through documentary analysis of the students' portfolios. Content analysis was used for data analysis, with the categories being built up in the course of the work. The results show that the appropriation of a new way of working is gradual and that the use of the portfolio, as an instrument of self-regulation, contributes to the development of a set of learnings related to the curricular areas but also to autonomy. The study also shows that the portfolio stands out as an instrument par excellence for providing feedback, which in turn enables students to improve their performance.
Abstract:
Peer-reviewed international conference: 2012 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 22-27 July 2012, Munich, Germany
Abstract:
Clustering analysis is a useful tool to detect and monitor disease patterns and, consequently, to contribute to effective population disease management. Portugal has the highest incidence of tuberculosis in the European Union (21.6 cases per 100,000 inhabitants in 2012), although it has been decreasing consistently. Two critical PTB (Pulmonary Tuberculosis) areas, the metropolitan Oporto and metropolitan Lisbon regions, were previously identified through spatial and space-time clustering of PTB incidence rates and risk factors. Identifying clusters of temporal trends can further inform policy makers about which municipalities are showing faster or slower improvement in TB control.
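The study's clustering procedure is not described in the abstract; as one illustration of clustering temporal trends, the sketch below fits a linear trend to each municipality's simulated yearly PTB incidence series and groups municipalities by k-means on the fitted slopes.

```python
# Illustrative sketch only; incidence series are simulated, not study data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
years = np.arange(2002, 2013)
# Hypothetical incidence series (cases per 100,000) for 20 municipalities,
# each with its own downward trend plus noise.
series = (30 + rng.normal(0, 2, (20, len(years)))
          - rng.uniform(0.2, 2.0, (20, 1)) * (years - 2002))

slopes = np.polyfit(years, series.T, deg=1)[0]          # yearly trend per municipality
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(slopes.reshape(-1, 1))
for k in range(3):
    print(f"cluster {k}: mean slope {slopes[labels == k].mean():+.2f} cases/100,000 per year")
```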
Abstract:
Final internship report presented to the Escola Superior de Dança in fulfilment of the requirements for the degree of Master in Dance Teaching.
Abstract:
Although the computational power of mobile devices has been increasing, it is still not enough for some classes of applications. At present, these applications delegate the computing burden to servers located on the Internet. This model assumes always-on Internet connectivity and implies non-negligible latency. This thesis addresses the challenges posed by applying the concept of a mobile collaborative computing environment to wireless networks, and the contributions made in that context. The goal is to define a reference architecture for high-performance mobile applications. Current work is focused on efficient data dissemination in a highly transitive environment, suitable for many mobile applications, and on the reputation and incentive system available in this mobile collaborative computing environment. To this end, we are improving our already published reputation/incentive algorithm with knowledge of the usage patterns of the eduroam wireless network in the Lisbon area.
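The published reputation/incentive algorithm is not reproduced in the abstract; the sketch below shows only a generic way a node's reputation could be updated from observed cooperation events in a collaborative wireless setting, with an assumed learning rate and neutral starting score.

```python
# Generic reputation-update sketch; NOT the authors' published algorithm.
from dataclasses import dataclass

@dataclass
class Reputation:
    score: float = 0.5          # start neutral, kept in [0, 1]
    alpha: float = 0.1          # learning rate (assumed)

    def update(self, cooperated: bool) -> float:
        """Exponential moving average of cooperation observations."""
        observation = 1.0 if cooperated else 0.0
        self.score = (1 - self.alpha) * self.score + self.alpha * observation
        return self.score

peer = Reputation()
for event in [True, True, False, True]:   # hypothetical task-sharing outcomes
    peer.update(event)
print(f"reputation after 4 interactions: {peer.score:.3f}")
```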
Abstract:
Physical computing has spun a true global revolution in the way in which the digital world interfaces with the real world. From bicycle jackets with turn-signal lights to Twitter-controlled Christmas trees, the Do-it-Yourself (DiY) hardware movement has been driving endless innovations and stimulating an age of creative engineering. This ongoing (r)evolution has been led by popular electronics platforms such as the Arduino, the Lilypad, or the Raspberry Pi; however, these are not designed taking into account the specific requirements of biosignal acquisition. To date, the physiological computing community has severely lacked a parallel to what is found in the DiY electronics realm, especially in what concerns suitable hardware frameworks. In this paper, we build on previous work developed within our group, focusing on an all-in-one, low-cost, and modular biosignal acquisition hardware platform that makes it quicker and easier to build biomedical devices. We describe the main design considerations, experimental evaluation and circuit characterization results, together with the results of a usability study performed with volunteers from multiple target user groups, namely health sciences and electrical, biomedical, and computer engineering. Copyright © 2014 SCITEPRESS - Science and Technology Publications. All rights reserved.
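The platform's actual transfer functions are not given in the abstract; the sketch below shows only the generic step of converting raw ADC samples from a biosignal front-end into millivolts, with assumed resolution, supply voltage and analog gain as placeholder parameters.

```python
# Hedged sketch: all parameters below are assumptions, not the platform's specification.
def adc_to_millivolts(raw: int, n_bits: int = 10, vcc: float = 3.3, gain: float = 1000.0) -> float:
    """Map a raw ADC count to the signal amplitude (mV) seen at the electrodes."""
    normalized = raw / (2 ** n_bits - 1) - 0.5     # centre around mid-scale
    return normalized * vcc / gain * 1000.0        # undo the analog gain, report in mV

print(adc_to_millivolts(700))   # hypothetical sample from a 10-bit converter
```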
Abstract:
Floating-point computing with more than one TFLOP of peak performance is already a reality in recent Field-Programmable Gate Arrays (FPGAs). General-Purpose Graphics Processing Units (GPGPUs) and recent many-core CPUs have also taken advantage of recent technological innovations in integrated circuit (IC) design and have dramatically improved their peak performance. In this paper, we compare the trends of these computing architectures for high-performance computing and survey these platforms in the execution of algorithms belonging to different scientific application domains. Trends in peak performance, power consumption and sustained performance for particular applications show that the gap between FPGAs and GPUs or many-core CPUs is widening, moving FPGAs away from high-performance computing with intensive floating-point calculations. FPGAs become competitive for custom floating-point or fixed-point representations, for smaller input sizes of certain algorithms, for combinational logic problems and for parallel map-reduce problems. © 2014 Technical University of Munich (TUM).
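The survey's figures are not reproduced here; as one common way to reason about the gap between peak and sustained performance discussed above, the sketch below applies the standard roofline model, with placeholder peak-performance and memory-bandwidth values that are assumptions rather than survey data.

```python
# Roofline-style estimate of attainable performance; platform numbers are placeholders.
def attainable_gflops(peak_gflops: float, bandwidth_gbs: float, arithmetic_intensity: float) -> float:
    """Roofline model: min(compute peak, memory bandwidth * FLOPs per byte)."""
    return min(peak_gflops, bandwidth_gbs * arithmetic_intensity)

platforms = {
    "FPGA (assumed)": (1000.0, 34.0),   # (peak GFLOP/s, bandwidth GB/s)
    "GPU (assumed)":  (4000.0, 288.0),
    "CPU (assumed)":  (500.0, 60.0),
}
for name, (peak, bw) in platforms.items():
    print(f"{name}: {attainable_gflops(peak, bw, arithmetic_intensity=2.0):.0f} GFLOP/s at 2 FLOPs/byte")
```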