937 results for wind power, simulation, simulation tool, user interface


Relevance:

100.00%

Publisher:

Abstract:

Integrated master's dissertation in Industrial Electronics and Computer Engineering

Relevance:

100.00%

Publisher:

Abstract:

Integrated master's dissertation in Industrial Electronics and Computer Engineering

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation in Industrial Engineering

Relevance:

100.00%

Publisher:

Abstract:

Report on the scientific sojourn at James Cook University, Australia, from June to December 2007. Free convection in enclosed spaces is found widely in natural and industrial systems. It is a topic of primary interest because in many systems it provides the largest resistance to heat transfer in comparison with other heat transfer modes. In such systems the convection is driven by a density gradient within the fluid, usually produced by a temperature difference between the fluid and the surrounding walls. In the oil industry, oil, which has a high Prandtl number, is usually stored and transported in large tanks at temperatures high enough to keep its viscosity, and thus the pumping requirements, at a reasonable level. A temperature difference between the fluid and the walls of the container may give rise to an unsteady buoyancy force and hence unsteady natural convection. In the initial period of cooling, the natural convection regime dominates over the conduction contribution. As the oil cools down it typically becomes more viscous, and this increase in viscosity inhibits the convection. Eventually the oil viscosity becomes very large and unloading the tank becomes very difficult. For this reason it is of primary interest to be able to predict the cooling rate of the oil. The general objective of this work is to develop and validate a simulation tool able to predict the cooling rates of a high-Prandtl-number fluid, taking variable-viscosity effects into account.
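
The coupling the abstract describes, viscosity rising as the oil cools and damping convective heat loss, can be shown in a compact numerical sketch. This is a minimal lumped-capacitance illustration, not the thesis model: the viscosity law, the convective scaling and `tau_scale` are all hypothetical constants chosen only to exhibit the qualitative behaviour.

```python
import math

def cool_tank(T0, T_wall, hours, dt=60.0):
    """Explicit-Euler sketch of Newtonian cooling with variable viscosity."""
    # Hypothetical exponential law: viscosity rises sharply as the oil cools.
    def viscosity(T):          # Pa*s, illustrative constants only
        return 0.05 * math.exp(0.04 * (80.0 - T))

    # Convective coefficient weakens as viscosity grows (illustrative scaling).
    def h_coeff(T):
        return 50.0 / viscosity(T) ** 0.25

    tau_scale = 2.0e5          # lumped thermal mass per unit area, hypothetical
    T = T0
    for _ in range(int(hours * 3600 / dt)):
        T += -h_coeff(T) * (T - T_wall) / tau_scale * dt
    return T

# Cooling from 80 C toward a 20 C wall: the gap shrinks monotonically,
# but ever more slowly as the thickening oil suppresses convection.
print(round(cool_tank(80.0, 20.0, hours=10), 2))
```

The point of the sketch is the feedback loop: each step re-evaluates the heat-transfer coefficient at the current temperature, so the cooling rate itself slows as viscosity grows.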

Relevance:

100.00%

Publisher:

Abstract:

This work covers two aspects. First, it compares and summarizes the similarities and differences of state-of-the-art feature detectors and descriptors; second, it presents a novel approach for detecting intestinal content (in particular bubbles) in capsule endoscopy images. Feature detectors and descriptors providing invariance to changes of perspective, scale, signal-to-noise ratio and lighting conditions are important and active topics in current research, and the range of possible applications is vast. After analysing a selection of approaches presented in the literature, this work investigates their suitability for information extraction in capsule endoscopy images. Finally, a well-performing detector of intestinal content in capsule endoscopy images is presented. Accurate detection of intestinal content is crucial for all kinds of machine learning approaches and other analyses of capsule endoscopy studies, because intestinal content occludes the field of view of the capsule camera and the affected frames need to be excluded from analysis. As a byproduct of this investigation, a Feature Analysis Tool with a graphical user interface is presented that executes and compares the discussed feature detectors and descriptors on arbitrary images, with configurable parameters, and visualizes their output. The presented bubble classifier is also part of this tool; if a ground truth is available (or is generated using the tool), a detailed visualization of the validation result is produced.
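
A frame-exclusion step of the kind motivated above can be sketched very simply. The rule below (fraction of bright pixels against a fixed threshold) is a hypothetical stand-in for illustration, not the classifier developed in the work:

```python
def occlusion_fraction(frame, threshold=200):
    """Fraction of pixels at or above an intensity threshold (0-255 grayscale)."""
    total = sum(len(row) for row in frame)
    bright = sum(1 for row in frame for px in row if px >= threshold)
    return bright / total

def keep_frame(frame, max_occluded=0.3):
    """Keep a frame for analysis only if it is not heavily occluded."""
    return occlusion_fraction(frame) <= max_occluded

clear  = [[50, 60], [70, 80]]     # dark mucosa, no bubble highlights
bubbly = [[250, 240], [230, 40]]  # mostly bright specular bubble highlights
print(keep_frame(clear), keep_frame(bubbly))
```

A real detector would of course use texture or feature-descriptor evidence rather than raw brightness; the sketch only shows where such a per-frame decision sits in the pipeline.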

Relevance:

100.00%

Publisher:

Abstract:

The project presented here is dedicated to instrument control over the GPIB instrumentation bus, programmed with the Matlab software. It is divided into two parts. The first is carried out in the teaching laboratory, with the goal of controlling the oscilloscope and the function generator. As an example of this control, an application is developed that obtains the magnitude Bode diagram of any electronic system. The second part is carried out in the research laboratory, with the goal of controlling the semiconductor analyzer. In this case, the developed application performs measurements for transistor characterization. The applications of both parts are built on a graphical user interface designed with Matlab's GUIDE tool.
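
The magnitude Bode diagram the first application produces boils down to evaluating |H(jf)| in dB over frequency. A dependency-free sketch for a first-order low-pass filter; the cutoff `fc` is an arbitrary illustrative value, not one from the project:

```python
import math

def bode_magnitude_db(freq_hz, fc_hz):
    """|H| in dB for a first-order low-pass H(jf) = 1 / (1 + j f/fc)."""
    ratio = freq_hz / fc_hz
    # |H|^2 = 1 / (1 + (f/fc)^2), so 20*log10|H| = -10*log10(1 + (f/fc)^2)
    return -10.0 * math.log10(1.0 + ratio ** 2)

fc = 1000.0  # hypothetical cutoff frequency in Hz
for f in (10.0, fc, 100000.0):
    print(f, "Hz ->", round(bode_magnitude_db(f, fc), 2), "dB")
```

Sweeping the function generator and reading amplitudes from the oscilloscope over GPIB would supply measured points for the same curve; the roll-off is 0 dB well below `fc`, -3 dB at `fc`, and -20 dB/decade above it.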

Relevance:

100.00%

Publisher:

Abstract:

This project presents a DTN network scenario. The scenario poses a series of problems that make routing difficult. The project proposes a routing protocol that overcomes these difficulties. Achieving this required studying and acquiring a deep understanding of how DTN routing protocols work. To show that the proposal solves the problems, an analysis of the results of applying the protocol to the scenario is presented; these results were obtained with the NS-2 network simulation tool.

Relevance:

100.00%

Publisher:

Abstract:

This project is aimed at helping a group of researchers in the Department of Animal and Food Science (UAB) who collect genomic data obtained in experiments. The application consists of two parts. The first is the user side, where projects can be created, data inserted, modified or deleted, and the information already held in the application consulted. The second part is the administrator side, which can register new users, restore previous versions of the database, delete users and review the actions performed by users.

Relevance:

100.00%

Publisher:

Abstract:

Exposure to solar ultraviolet (UV) radiation is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors, but individual exposure data remain scarce. While ground UV irradiance is monitored via different techniques, it is difficult to translate such observations into human UV exposure or dose because of confounding factors. A multi-disciplinary collaboration developed a model predicting the dose and distribution of UV exposure on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop a simulation tool that estimates solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by various body locations is computed for direct, diffuse and reflected radiation separately. Dosimetric measurements obtained in field conditions were used to assess the model performance. The model predicted exposure to solar UV adequately, with a symmetric mean absolute percentage error of 13% and half of the predictions within 17% of the measurements. Using this tool, solar UV exposure patterns were investigated with respect to the relative contributions of the direct, diffuse and reflected radiation. Exposure doses for various body parts and exposure scenarios of a standing individual were assessed using erythemally weighted UV ground irradiance data measured in 2009 at Payerne, Switzerland, as input. For most anatomical sites, mean daily doses were high (typically 6.2-14.6 Standard Erythemal Dose, SED) and exceeded recommended exposure values. Direct exposure was important during specific periods (e.g. midday during summer) but contributed moderately to the annual dose, ranging from 15 to 24% for vertical and horizontal body parts, respectively. Diffuse irradiation explained about 80% of the cumulative annual exposure dose.
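
The per-facet computation such a mesh-based model performs can be illustrated for the direct component alone: scale the direct normal irradiance by the cosine of the incidence angle between the facet normal and the sun direction, clamping facets that face away from the sun to zero. All numbers below are illustrative, not values from the study:

```python
import math

def direct_irradiance(normal, sun_dir, dni):
    """Direct component on one mesh facet: DNI times cosine of incidence.
    Facets facing away from the sun (negative dot product) receive zero."""
    dot = sum(n * s for n, s in zip(normal, sun_dir))
    return dni * max(0.0, dot)

# Unit sun direction 45 degrees above the horizon (hypothetical geometry).
sun = (0.0, math.cos(math.radians(45)), math.sin(math.radians(45)))
up = (0.0, 0.0, 1.0)     # horizontal facet, e.g. top of the shoulders
wall = (0.0, 1.0, 0.0)   # vertical facet facing the sun, e.g. the chest
dni = 800.0              # W/m^2, illustrative clear-sky value
print(round(direct_irradiance(up, sun, dni), 1),
      round(direct_irradiance(wall, sun, dni), 1))
```

Summing this quantity over every triangle of the manikin, and adding similarly computed diffuse and reflected terms, yields the per-site doses the abstract reports.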

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a webservice architecture for Statistical Machine Translation aimed at non-technical users. A workflow editor allows a user to combine different webservices using a graphical user interface. In the current state of the project, the webservices have been implemented for a range of sentential and sub-sentential aligners. A common interface and a common data format allow the user to build workflows that exchange different aligners.
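
The chaining the workflow editor enables can be sketched as function composition over a common data format. Both aligners below are trivial positional stubs for illustration, not the project's webservices:

```python
def sentence_align(bitext):
    """Stub sentential aligner: pair sentences one-to-one (hypothetical)."""
    src, tgt = bitext
    return list(zip(src, tgt))

def word_align(sentence_pairs):
    """Stub sub-sentential aligner: pair tokens by position (hypothetical)."""
    return [list(zip(s.split(), t.split())) for s, t in sentence_pairs]

def run_workflow(bitext, *stages):
    """Apply the stages in order; the shared data format lets any
    aligner be swapped in without changing the rest of the workflow."""
    data = bitext
    for stage in stages:
        data = stage(data)
    return data

corpus = (["the house"], ["la casa"])
print(run_workflow(corpus, sentence_align, word_align))
```

Because every stage consumes and produces the same format, replacing `word_align` with a different sub-sentential aligner changes only the argument list, which is exactly the interchangeability the common interface buys.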

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Analysis of DNA sequence polymorphisms can provide valuable information on the evolutionary forces shaping nucleotide variation and gives insight into the functional significance of genomic regions. Ongoing genome projects will radically improve our capability to detect specific genomic regions shaped by natural selection. Currently available methods and software, however, are unsatisfactory for such genome-wide analysis. RESULTS: We have developed methods for the analysis of DNA sequence polymorphisms at the genome-wide scale. These methods, which have been tested on coalescent-simulated and actual data files from mouse and human, have been implemented in the VariScan software package version 2.0. Additionally, we have incorporated a graphical user interface. The main features of this software are: i) exhaustive population-genetic analyses, including those based on coalescent theory; ii) analysis adapted to the shallow data generated by high-throughput genome projects; iii) use of genome annotations to conduct comprehensive analyses separately for different functional regions; iv) identification of relevant genomic regions by sliding-window and wavelet-multiresolution approaches; v) visualization of the results, integrated with current genome annotations, in commonly available genome browsers. CONCLUSION: VariScan is a powerful and flexible software suite for the analysis of DNA polymorphisms. The current version implements new algorithms, methods, and capabilities, providing an important tool for exhaustive exploratory analysis of genome-wide DNA polymorphism data.
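
The sliding-window scan mentioned in feature iv) can be illustrated with nucleotide diversity (pi), one of the standard statistics such a package computes. This is a toy sketch over aligned sequences, not VariScan code:

```python
from itertools import combinations

def pairwise_diff(a, b):
    """Number of sites at which two aligned sequences differ."""
    return sum(1 for x, y in zip(a, b) if x != y)

def nucleotide_diversity(seqs):
    """Pi: average pairwise differences per site across all sequence pairs."""
    pairs = list(combinations(seqs, 2))
    length = len(seqs[0])
    return sum(pairwise_diff(a, b) for a, b in pairs) / (len(pairs) * length)

def sliding_pi(seqs, window, step):
    """Pi in each window -- the kind of scan used to flag candidate regions."""
    length = len(seqs[0])
    return [(start, nucleotide_diversity([s[start:start + window] for s in seqs]))
            for start in range(0, length - window + 1, step)]

sample = ["ACGTACGTAC", "ACGTACGAAC", "ACTTACGTAC"]
print(sliding_pi(sample, window=5, step=5))
```

Windows with unusually low or high pi relative to the genome-wide background are the "relevant genomic regions" a selection scan would then examine more closely.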

Relevance:

100.00%

Publisher:

Abstract:

Introduction: The field of connectomic research is growing rapidly as a result of methodological advances in structural neuroimaging on many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through connectome mapping pipelines (Hagmann et al, 2008), yielding so-called connectomes (Hagmann 2005, Sporns et al, 2005). These contain both spatial and topological information that constrains functional imaging studies and is relevant to their interpretation. The need for a special-purpose software tool to support both clinical researchers and neuroscientists in investigating such connectome data has grown. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and gives it access to a multitude of scientific libraries. Results: Thanks to a flexible plugin architecture, functionality can easily be enhanced for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib). More brain connectivity measures will be implemented in a future release (Rubinov et al, 2009).
* 3D view of networks with node positioning based on the corresponding ROI surface patch; other layouts are possible.
* Picking functionality to select nodes and edges, retrieve further node information (ConnectomeWiki), and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Storage of arbitrary metadata for networks, allowing e.g. group-based analysis or meta-analysis.
* A Python shell for scripting; application data is exposed and can be modified or used for further post-processing.
* Visualization pipelines composed of filters and modules using Mayavi (Ramachandran et al, 2008).
* An interface to TrackVis to visualize track data; selected nodes are converted to ROIs for fiber filtering.
The Connectome Mapping Pipeline (Hagmann et al, 2008) was used to process 20 healthy subjects into an average connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; the connections shown are those that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
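
The complex-network measures the tool obtains through NetworkX can be illustrated without dependencies. Two of the simplest, node degree and graph density, computed on a toy adjacency dict (the four-node "connectome" below is invented for illustration):

```python
def degrees(adj):
    """Node degree for an undirected adjacency dict."""
    return {node: len(nbrs) for node, nbrs in adj.items()}

def density(adj):
    """Fraction of possible undirected edges that are present."""
    n = len(adj)
    edges = sum(len(nbrs) for nbrs in adj.values()) / 2  # each edge counted twice
    return 2 * edges / (n * (n - 1))

# Toy network: brain regions as nodes, fiber tracts as edges (hypothetical).
adj = {
    "A": {"B", "C"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"C"},
}
print(degrees(adj), round(density(adj), 2))
```

In the actual tool the same graph would be loaded from the GraphML network inside a Connectome File and passed to NetworkX, which provides these and many richer connectivity measures.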

Relevance:

100.00%

Publisher:

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advance of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, the RISC processor architecture business is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based hardware.
They enjoy admirable profitability on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk leverage, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created new markets for personal computers, smartphones and tablets, and will eventually also affect industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance:

100.00%

Publisher:

Abstract:

The Universitat de Vic has, among other equipment, a flexible manufacturing cell from the manufacturer Festo that simulates a palletizing process using products held in an intermediate warehouse. The cell consists of four distinct assembly stations (pallet loading, plate loading, intermediate warehouse and transport). Each has a SIEMENS S7-300 PLC for its automation, and all of them are interconnected over a Profibus industrial network. The goal of this project is to deploy the Vijeo Citect SCADA system for the control and supervision of the warehouse station of this flexible manufacturing cell, also establishing data exchange between the SCADA and Microsoft Access, so that it can be used for teaching. The project was developed in five phases:
1. The first phase was devoted to the automation of the warehouse station itself with the Siemens S7-300 programmable controller, meeting the stated requirements.
2. In the second phase, communication for data exchange (read and write) between the Vijeo Citect SCADA system and the Microsoft Access database was programmed and established.
3. In the third phase, the graphical environment for the supervision and control of the process was built and programmed with the Vijeo Citect SCADA system.
4. In the fourth phase, an OPC server was installed on the PC and communication was established between the PLC and the SCADA system.
5. Finally, the various programs and communications were reviewed and debugged so that the system works as a whole.

Relevance:

100.00%

Publisher:

Abstract:

The federal government is aggressively promoting biofuels as an answer to global climate change and dependence on imported sources of energy. Iowa has quickly become a leader in the bioeconomy and wind energy production, but meeting the United States Department of Energy's goal of having 20% of U.S. transportation fuels come from biologically based sources by 2030 will require a dramatic increase in ethanol and biodiesel production and distribution. At the same time, much of Iowa's rural transportation infrastructure is near or beyond its original design life. As Iowa's rural roadway structures, pavements, and unpaved roadways become structurally deficient or functionally obsolete, public sector maintenance and rehabilitation costs rapidly increase. More importantly, costs to move all farm products will rapidly increase if infrastructure components are allowed to fail; longer hauls, slower turnaround times, and smaller loads result. When these results occur on a large scale, Iowa will start to lose its economic competitive edge in the rapidly developing bioeconomy. The primary objective of this study was to document the current physical and fiscal impacts of Iowa's existing biofuels and wind power industries. A four-county cluster in north-central Iowa and a two-county cluster in southeast Iowa were identified through a local agency survey as having a large number of diverse facilities and were selected for the traffic and physical impact analysis. The research team investigated large truck traffic patterns on Iowa's secondary and local roads from 2002 to 2008 and associated them with pavement condition and county maintenance expenditures. The impacts were quantified to the extent possible and visualized using geographic information system (GIS) tools. In addition, a traffic and fiscal assessment tool was developed to understand the impact of biofuels development on Iowa's secondary road system.
Recommended changes in public policies relating to local government and to the administration of those policies included standardizing the reporting and format of all county expenditures, conducting regular pavement evaluations on a county's system, cooperating and communicating with cities (adjacent to a plant site), considering the use of tax increment financing (TIF) districts as a short-term tool to produce revenues, and considering alternative ways to tax the industry.