985 results for Machine tool


Relevance:

20.00%

Publisher:

Abstract:

CoDeSys ("Controller Development System") is a development environment for programming automation controllers. It is an open solution fully in line with the international industrial standard IEC 61131-3: all five programming languages for application programming defined in IEC 61131-3 are available in the development environment. These features give professionals greater flexibility in programming and allow control engineers to program many different applications in the languages in which they feel most comfortable. Over 200 manufacturers of devices from different industrial sectors offer intelligent automation devices with a CoDeSys programming interface. In 2006, version 3 was released with new updates and tools. One of the great innovations of the new version of CoDeSys is object-oriented programming (OOP), which offers great advantages to the user, for example when reusing existing parts of an application or when several developers work on one application. For such reuse, source code with well-known parts can be prepared once and then automatically generated wherever it is needed in a project, improving time, cost and quality management. Until now, in version 2, an additional interface called the "ENI Server" was necessary to access the generated XML code. Another novelty of the new version is a tool called Export PLCopenXML, which makes it possible to export the open XML code without the need for specific hardware. This type of code has its own requirements in order to comply with the standard described above. With the XML code, and with knowledge of how it works, it is possible to carry out component-oriented development of machines with modular programming in an easy way.
Eplan Engineering Center (EEC) is a software tool developed by Mind8 GmbH & Co. KG that allows configuring and generating automation projects; to do so, it uses modules of PLC code. The EEC already has a library to generate code for CoDeSys version 2. For version 3, and given the constant innovation of devices by manufacturers, a new library must be implemented in this software, so it is important to study the XML export in order to then be able to design any type of machine. The purpose of this master thesis is to study the new CoDeSys XML version, taking into account all aspects and the impact on the existing CoDeSys V2 models and libraries at the company Harro Höfliger Verpackungsmaschinen GmbH. To achieve this goal, a small sample named "Traffic light" will first be implemented in CoDeSys version 2; then, using the tools of the new version, a version 3 project will be produced, together with the EEC implementation for the automatically generated code.
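Since the thesis hinges on reading the PLCopenXML export, a minimal sketch of what such parsing could look like may help. It assumes a TC6-style layout (project/types/pous/pou); the element names and the namespace URI here are illustrative and would have to be checked against an actual CoDeSys V3 export.

```python
# Sketch: listing the POUs in a PLCopenXML export. The namespace URI and
# the embedded sample document are assumptions for illustration only.
import xml.etree.ElementTree as ET

NS = {"tc6": "http://www.plcopen.org/xml/tc6_0200"}  # assumed namespace

SAMPLE = """<project xmlns="http://www.plcopen.org/xml/tc6_0200">
  <types>
    <pous>
      <pou name="TrafficLight" pouType="functionBlock"/>
      <pou name="MAIN" pouType="program"/>
    </pous>
  </types>
</project>"""

def list_pous(xml_text):
    """Return (name, pouType) pairs for every POU in the export."""
    root = ET.fromstring(xml_text)
    return [(p.get("name"), p.get("pouType"))
            for p in root.findall(".//tc6:pou", NS)]

print(list_pous(SAMPLE))
```

From such an inventory, a code-generation library (as in the EEC) could decide which modules to emit for each POU.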

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a swarm intelligence long-term hedging tool to support electricity producers in competitive electricity markets. The tool investigates the long-term hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward) and financial (options) settlement. To find the optimal portfolio, the producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. The expected return and the variance estimate are based on a forecasted scenario interval determined by a long-term price range forecast model developed by the authors, whose explanation is outside the scope of this paper. The proposed tool makes use of Particle Swarm Optimization (PSO), and its performance has been evaluated by comparison with a Genetic Algorithm (GA) based approach. To validate the risk management tool, a case study using real historical price data from the mainland Spanish market is presented, demonstrating the effectiveness of the proposed methodology.
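The stated trade-off can be made concrete as U = E[R] − A·Var[R] over scenario returns, searched by a bare-bones PSO over contract weights. The scenario returns, the risk-aversion coefficient A and the PSO parameters below are hypothetical, not the paper's data; this is a sketch of the technique, not the authors' implementation.

```python
# Sketch: mean-variance utility over hypothetical scenario returns for
# (spot, forward, option) positions, maximized by a minimal PSO.
import random

random.seed(0)

SCENARIOS = [(0.10, 0.04, 0.02), (-0.06, 0.04, 0.05), (0.02, 0.04, -0.01)]
A = 2.0  # risk-aversion coefficient in U = E[R] - A * Var[R]

def utility(w):
    rets = [sum(wi * ri for wi, ri in zip(w, s)) for s in SCENARIOS]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / len(rets)
    return mean - A * var

def pso(n_particles=20, iters=100):
    dim = 3
    pos = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pbest, key=utility)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                # keep weights in [0, 1] (long-only, illustrative bound)
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            if utility(pos[i]) > utility(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = max(pbest, key=utility)
    return gbest

best = pso()
print(best, utility(best))
```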

Relevance:

20.00%

Publisher:

Abstract:

Neonatal anthropometry is an inexpensive, noninvasive and convenient tool for bedside evaluation, especially in sick and fragile neonates. Anthropometry can be used in neonates as a tool for several purposes: diagnosis of foetal malnutrition and prediction of early postnatal complications; postnatal assessment of growth, body composition and nutritional status; prediction of long-term complications including metabolic syndrome; assessment of dysmorphology; and estimation of body surface. However, in this age group anthropometry has been notorious for its inaccuracy and the main concern is to make validated indices available. Direct measurements, such as body weight, length and body circumferences are the most commonly used measurements for nutritional assessment in clinical practice and in field studies. Body weight is the most reliable anthropometric measurement and therefore is often used alone in the assessment of the nutritional status, despite not reflecting body composition. Derived indices from direct measurements have been proposed to improve the accuracy of anthropometry. Equations based on body weight and length, mid-arm circumference/head circumference ratio, and upper-arm cross-sectional areas are among the most used derived indices to assess nutritional status and body proportionality, even though these indices require further validation for the estimation of body composition in neonates.
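Two of the derived indices mentioned above can be sketched directly. The formulas follow standard definitions (Rohrer's ponderal index as a weight-length equation, and the mid-arm/head circumference ratio); the example measurements are invented, and any clinical cutoff must come from validated reference charts, not from this example.

```python
# Sketch of two derived anthropometric indices from the abstract.

def ponderal_index(weight_g, length_cm):
    """Rohrer's ponderal index: 100 * weight (g) / length (cm)^3."""
    return 100.0 * weight_g / length_cm ** 3

def mac_hc_ratio(mid_arm_circ_cm, head_circ_cm):
    """Mid-arm circumference / head circumference ratio."""
    return mid_arm_circ_cm / head_circ_cm

# A hypothetical 3200 g, 50 cm neonate:
print(round(ponderal_index(3200, 50), 2))  # 2.56
print(round(mac_hc_ratio(10.5, 35.0), 2))  # 0.3
```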

Relevance:

20.00%

Publisher:

Abstract:

This paper addresses the optimal involvement of a power producer in derivatives electricity markets to hedge against pool price volatility. To achieve this aim, a long-term risk management tool based on a swarm intelligence meta-heuristic optimization technique is proposed. The tool investigates the long-term risk-hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward contracts) and financial (option contracts) settlement. The producer's risk preference is formulated as a utility function (U) expressing the trade-off between the expectation and the variance of the return. Both are based on a forecasted scenario interval determined by a long-term price range forecasting model, which also makes use of particle swarm optimization (PSO) to find the parameters that achieve the best forecasting results. Since price estimation depends on load forecasting, this work also presents a regressive long-term load forecast model that uses PSO for parameter estimation, as in the price model. The performance of the PSO technique has been evaluated by comparison with a Genetic Algorithm (GA) based approach. A case study is presented and the results are discussed using real historical price and load data from the mainland Spanish electricity market, demonstrating the effectiveness of the methodology in handling this type of problem. Finally, conclusions are duly drawn.
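The idea of using PSO to fit the parameters of a regressive forecast model can be sketched on a toy case: below, PSO searches for the coefficients of a simple linear load model by minimizing squared error. The data and model form are invented and far simpler than the paper's model; this only illustrates the parameter-fitting role PSO plays.

```python
# Toy sketch: PSO fitting the parameters of load = a + b * t.
import random

random.seed(1)

T = list(range(10))
LOAD = [100 + 5 * t for t in T]  # synthetic load series, exactly linear

def sse(params):
    a, b = params
    return sum((a + b * t - y) ** 2 for t, y in zip(T, LOAD))

def pso_fit(n=30, iters=200):
    pos = [[random.uniform(0, 200), random.uniform(0, 10)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=sse)
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sse(pos[i]) < sse(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=sse)
    return gbest

a, b = pso_fit()
print(round(a, 2), round(b, 2))
```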

Relevance:

20.00%

Publisher:

Abstract:

The paper introduces an approach to the problem of generating a sequence of jobs that minimizes the total weighted tardiness for a set of jobs to be processed on a single machine. An Ant Colony System based algorithm is validated on benchmark problems available in the OR-Library. The results obtained were compared with the best available results and found to be close to optimal, allowing conclusions to be drawn about the algorithm's efficiency and effectiveness.
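The objective the Ant Colony System minimizes can be stated compactly: total weighted tardiness, the sum of w_j · max(0, C_j − d_j) over the job sequence, where C_j is the completion time and d_j the due date. A short sketch with made-up jobs:

```python
# Sketch of the single-machine total weighted tardiness objective.
# Processing times, due dates and weights are invented toy data.

def total_weighted_tardiness(sequence, proc, due, weight):
    """Sum of w_j * max(0, C_j - d_j) for jobs in the given order."""
    t, twt = 0, 0
    for j in sequence:
        t += proc[j]                      # completion time C_j
        twt += weight[j] * max(0, t - due[j])
    return twt

proc = {1: 3, 2: 2, 3: 4}
due = {1: 3, 2: 4, 3: 6}
weight = {1: 2, 2: 1, 3: 3}

print(total_weighted_tardiness([1, 2, 3], proc, due, weight))  # 10
print(total_weighted_tardiness([3, 2, 1], proc, due, weight))  # 14
```

An ACS would evaluate candidate sequences with exactly such a function while pheromone trails bias the construction of new sequences.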

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces the PCMAT platform project and, in particular, one of its components, the PCMAT Metadata Authoring Tool. This is an educational web application that allows the project's metadata creators to write the metadata associated with each learning object without any concern for the semantics of the metadata schema. Furthermore, it permits the project managers to add elements to, or delete elements from, the schema without having to rewrite or compile any code.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents the SmartClean tool, whose purpose is to detect and correct data quality problems (DQPs). Compared with existing tools, SmartClean has one main advantage: the user does not need to specify the execution sequence of the data cleaning operations. To that end, an execution sequence was developed, and the problems are manipulated (i.e., detected and corrected) following that sequence. The sequence also supports incremental execution of the operations. In this paper, the underlying architecture of the tool is presented and its components are described in detail. The validity of the tool, and consequently of the architecture, is demonstrated through a case study. Although SmartClean has cleaning capabilities at all other levels, this paper describes only those related to the attribute value level.
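The key idea, a fixed execution sequence that relieves the user from ordering the cleaning operations, can be sketched as follows. The operations and records here are invented for illustration and are not SmartClean's actual operation set.

```python
# Sketch: attribute-value cleaning operations applied in a fixed order,
# so the user never specifies the sequence. Operations are hypothetical.

def strip_spaces(rec):
    return {k: v.strip() if isinstance(v, str) else v for k, v in rec.items()}

def fix_empty(rec):
    return {k: (None if v == "" else v) for k, v in rec.items()}

def normalize_case(rec):
    return {k: v.lower() if isinstance(v, str) else v for k, v in rec.items()}

# The tool, not the user, fixes this order.
EXECUTION_SEQUENCE = [strip_spaces, fix_empty, normalize_case]

def smart_clean(records):
    for op in EXECUTION_SEQUENCE:
        records = [op(r) for r in records]
    return records

dirty = [{"name": "  Ana ", "city": ""}]
print(smart_clean(dirty))  # [{'name': 'ana', 'city': None}]
```

The ordering matters: stripping spaces before the empty-value check lets `" "` be recognized as empty, which is the kind of dependency a fixed sequence encodes once for all users.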

Relevance:

20.00%

Publisher:

Abstract:

Today, business group decision making is an extremely important activity. In recent years, a considerable number of applications and a large body of research have aimed to increase the effectiveness of the decision-making process. To support the idea generation process, the IGTAI (Idea Generation Tool for Ambient Intelligence) prototype was created. IGTAI is a Group Decision Support System designed to support any kind of meeting, namely distributed, asynchronous or face-to-face. It aims at helping geographically distributed (or co-located) people and organizations in the idea generation task by making use of pervasive hardware in a meeting room, expanding the meeting beyond the room's walls by allowing ubiquitous access through different kinds of equipment. This paper focuses on the research carried out to build the IGTAI prototype, its architecture and its main functionalities, namely the support given in the different phases of an idea generation meeting.

Relevance:

20.00%

Publisher:

Abstract:

Background: A common task in analyzing microarray data is to determine which genes are differentially expressed across two (or more) kinds of tissue samples, or samples subjected to different experimental conditions. Several statistical methods have been proposed to accomplish this goal, generally based on measures of distance between classes. It is well known that biological samples are heterogeneous because of factors such as molecular subtypes or genetic background that are often unknown to the experimenter. For instance, in experiments involving the molecular classification of tumors it is important to identify significant subtypes of cancer. Bimodal or multimodal distributions often reflect the presence of mixtures of subsamples. Consequently, there can be genes differentially expressed in sample subgroups that are missed when the usual statistical approaches are used. In this paper we propose a new graphical tool that identifies not only genes with up- and down-regulation, but also genes with differential expression in different subclasses, which are usually missed by current statistical methods. This tool is based on two measures of distance between samples, namely the overlapping coefficient (OVL) between two densities and the area under the receiver operating characteristic (ROC) curve. The methodology proposed here was implemented in the open-source R software. Results: The method was applied to a publicly available dataset as well as to a simulated dataset. We compared our results with those obtained using some of the standard methods for detecting differentially expressed genes, namely the Welch t-statistic, fold change (FC), rank products (RP), average difference (AD), weighted average difference (WAD), moderated t-statistic (modT), intensity-based moderated t-statistic (ibmT), significance analysis of microarrays (samT) and area under the ROC curve (AUC).
On both datasets, the differentially expressed genes with bimodal or multimodal distributions were missed by all of the standard selection procedures. We also compared our results with (i) the area between the ROC curve and the rising area (ABCR) and (ii) the test for not proper ROC curves (TNRC). We found our methodology more comprehensive, because it detects both bimodal and multimodal distributions, and different variances can be considered in the two samples. Another advantage of our method is that the behavior of different kinds of differentially expressed genes can be analyzed graphically. Conclusion: Our results indicate that the arrow plot is a new, flexible and useful tool for the analysis of gene expression profiles from microarrays.
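The two distances underlying the arrow plot can be illustrated in a few lines: the OVL as the integral of the pointwise minimum of two densities (here two normal densities, integrated numerically), and the AUC estimated by the Mann-Whitney statistic. The sample values are toy data, and the paper's actual implementation is in R, not Python.

```python
# Sketch of the two distance measures behind the arrow plot.
import math

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def ovl_normal(mu1, sd1, mu2, sd2, lo=-10, hi=10, n=20000):
    """OVL = integral of min(f1, f2): 1 for identical densities, 0 if disjoint."""
    h = (hi - lo) / n
    return sum(min(normal_pdf(lo + i * h, mu1, sd1),
                   normal_pdf(lo + i * h, mu2, sd2)) for i in range(n)) * h

def auc(sample1, sample2):
    """Mann-Whitney estimate of P(X2 > X1), counting ties as 1/2."""
    pairs = [(x, y) for x in sample1 for y in sample2]
    wins = sum(1.0 if y > x else 0.5 if y == x else 0.0 for x, y in pairs)
    return wins / len(pairs)

print(round(ovl_normal(0, 1, 0, 1), 3))  # ~1.0: identical densities
print(auc([1, 2, 3], [4, 5, 6]))         # 1.0: complete separation
```

A bimodal expression pattern shows up as a low OVL combined with an AUC near 0.5, which is exactly the combination single-statistic methods tend to miss.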

Relevance:

20.00%

Publisher:

Abstract:

Master's degree in Electrical Engineering – Electrical Power Systems

Relevance:

20.00%

Publisher:

Abstract:

Deep-ocean species: the little that is known mostly comes from collected specimens. The Letter by L.A. Rocha et al., "Specimen collection: An essential tool" (23 May, 344: 814), brilliantly discusses the importance of specimen collection and presents the evolution of collecting from the mid-19th century to our present strict codes of conduct. However, it is also important to emphasize that the vast majority of deep-ocean macro-organisms are known to us only because of collection, and this is a strong argument that should be present in our actions as scientists. If the deep is considered the least known of Earth's habitats (only 1% or so explored, according to recent estimates), what an awesome collection of yet-to-be-discovered species must still be there to be properly described? As the authors point out, citing (1), something around 86% of species remain unknown. Voucher specimens are fundamental for the reasons pointed out, and perhaps the vast depths of the world's oceans are the best example of that importance. The summary report of the 2010 Census of Marine Life (2) showed that, among the millions of specimens collected in both familiar and seldom-explored waters, the Census found more than 6,000 potentially new species and completed formal descriptions of more than 1,200 of them. It also found that a number of supposedly rare species are in fact common. Voucher specimens are essential and, again agreeing with the Letter by L.A. Rocha et al. (see above), the modern approach to collecting will not be a cause of extinctions but rather a valuable tool for knowledge and description, and even, as seen above, a way to find out that supposedly rare species may not be that rare and may even prove to have abundant populations.

Relevance:

20.00%

Publisher:

Abstract:

Project work presented to the Instituto de Contabilidade e Administração do Porto to obtain the Master's degree in Specialized Translation and Interpreting, under the supervision of Mestre Alberto Couto.

Relevance:

20.00%

Publisher:

Abstract:

Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important, since the disease can evolve to cirrhosis. In this paper, a new computer-aided diagnosis (CAD) system for steatosis classification, on both a local and a global basis, is presented. A Bayes factor is computed from objective ultrasound textural features extracted from the liver parenchyma. The goal is to develop a CAD screening tool to help in steatosis detection. Results showed an accuracy of 93.33%, with a sensitivity of 94.59% and a specificity of 92.11%, using the Bayes classifier. The proposed CAD system also provides a suitable graphical display for steatosis classification.
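For reference, the reported percentages relate to a confusion matrix in the usual way. The counts below (TP = 35, FN = 2, TN = 35, FP = 3, i.e. 75 cases) are a hypothetical set that happens to reproduce the abstract's figures, not the study's real data.

```python
# Sketch: accuracy, sensitivity and specificity from confusion counts.
# The counts are hypothetical, chosen to match the reported percentages.

def metrics(tp, fn, tn, fp):
    total = tp + fn + tn + fp
    return {
        "accuracy": (tp + tn) / total,        # correct / all cases
        "sensitivity": tp / (tp + fn),        # detected positives
        "specificity": tn / (tn + fp),        # detected negatives
    }

m = metrics(tp=35, fn=2, tn=35, fp=3)
print({k: round(100 * v, 2) for k, v in m.items()})
# {'accuracy': 93.33, 'sensitivity': 94.59, 'specificity': 92.11}
```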

Relevance:

20.00%

Publisher:

Abstract:

Introduction: In 21st-century society the scientific research process has been growing steadily, and pharmaceutical research is one of its most enthusiastic and relevant fields. Here, it is very important to correlate observed functional alterations with possibly modified drug biodistribution patterns. Cancer, inflammation and infection are processes that induce many molecular intermediates, such as cytokines, chemokines and other chemical complexes, that can alter the pharmacokinetics of many drugs. One cause of such changes is thought to be the modulatory action of these complexes on P-glycoprotein (Pgp) activity, because they can act as inducers/inhibitors of MDR-1 expression. This protein results from the expression of the MDR-1 gene and acts as an ATP-dependent efflux pump whose substrates include many drugs, such as antiretrovirals, anticancer agents, anti-infectives, immunosuppressants, steroids and opioids. Objectives: Given the lack of methods providing helpful information for the investigation of in vivo molecular changes in Pgp activity during infection/inflammation processes, and their value in explaining altered drug pharmacokinetics, this paper aims to evaluate the potential utility of 99mTc-Sestamibi scintigraphy in this kind of health sciences research. Although the aim is indeed to create a technique for the in vivo study of Pgp activity, this preliminary project only reaches the in vitro study phase, assumed as the first step in the evaluation period for the development of a new tool. Materials and Methods: For that reason, we are performing in vitro studies of the influx and efflux of 99mTc-Sestamibi (a Pgp substrate) in a hepatocyte cell line (HepG2). We are interested in clarifying the cellular behavior of this radiopharmaceutical in lipopolysaccharide (LPS)-stimulated cells (a well-known in vitro model of inflammation) in order to possibly approve this methodology.
To validate the results, Pgp expression will finally be evaluated using the Western blot technique. Results: Up to this moment we do not yet have the final results, but we already have enough data to believe that LPS stimulation induces a downregulation of MDR-1, and consequently of Pgp, which could lead to a prolonged retention of 99mTc-Sestamibi in the inflamed cells. Conclusions: If and when this methodology demonstrates the promising results we expect, one will be able to conclude that Nuclear Medicine is an important tool to help evidence-based research in this specific field as well.