947 results for System software


Relevance:

30.00%

Publisher:

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

30.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

30.00%

Publisher:

Abstract:

In this doctoral dissertation, low-voltage direct current (LVDC) distribution system stability, supply security, and power quality are evaluated by computational modelling and by measurements on an LVDC research platform. Computational models for LVDC network analysis are developed. Time-domain simulation models are implemented in the PSCAD/EMTDC simulation environment and applied to transient behaviour and power quality studies. The LVDC network power loss model is developed in a MATLAB environment and is capable of fast estimation of network and component power losses; it integrates analytical equations describing the power loss mechanisms of the network components with power flow calculations. For the LVDC network research platform, a monitoring and control software solution is developed and used to deliver measurement data for verifying the developed models and analysing the modelling results. The work describes the power loss mechanisms of the LVDC network components and their main dependencies, and presents the energy loss distribution of the network components. Power quality measurements and current spectra are provided, and harmonic pollution on the DC network is analysed. The transient behaviour of the network is verified through time-domain simulations, and DC capacitor guidelines for an LVDC power distribution network are introduced. The power loss analysis results show that one of the main optimisation targets for an LVDC power distribution network should be the reduction of no-load losses and the improvement of converter efficiency at partial loads. Low-frequency spectra of the network voltages and currents are shown, and harmonic propagation is analysed. Power quality at the LVDC network point of common coupling (PCC) is discussed, and the LVDC network is shown to meet power quality standard requirements. The network behaviour during transients is analysed by time-domain simulations, and the network is shown to be transient stable during large-scale disturbances; measurement results on the LVDC research platform confirming this are presented in the work.
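
The partial-load efficiency point can be made concrete with a simple converter loss model. The following is a minimal Python sketch, assuming a generic quadratic loss characteristic; the rated power and the coefficients are illustrative and not taken from the dissertation.

```python
# Converter loss as no-load + linear + quadratic terms (coefficients are
# illustrative fractions of rated power, not values from the dissertation)
def converter_loss_kw(p_out_kw, p_rated_kw=100.0, k0=0.01, k1=0.01, k2=0.02):
    load = p_out_kw / p_rated_kw
    return p_rated_kw * (k0 + k1 * load + k2 * load ** 2)

for load_pct in (10, 25, 50, 100):
    p_out = float(load_pct)                 # kW, with a 100 kW rated converter
    loss = converter_loss_kw(p_out)
    eta = p_out / (p_out + loss)
    print(f"{load_pct:3d}% load: loss {loss:5.2f} kW, efficiency {eta:.3f}")
```

The fixed no-load term k0 dominates at low loads, which is why efficiency drops at partial load and why no-load losses are a natural optimisation target.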

Relevance:

30.00%

Publisher:

Abstract:

Two competing models of an all-mechanical power transmission system are compared using Dymola software as the simulation tool. Dymola is in turn compared with MATLAB/Simulink software using functionality, user-friendliness, and price as comparison criteria. In this research, the torque is assumed to be balanceable, and the transmission ratios are calculated. Using kinematic connection sketches of the two transmission models, simulation models are built in the Dymola simulation environment. The models of the transmission systems are modified according to the simulation results to achieve a continuously variable transmission ratio. Simulation results are compared between the two transmission systems, the main features of Dymola and MATLAB/Simulink are compared, and the advantages and disadvantages of the two software packages are analysed.
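
For orientation, the transmission-ratio bookkeeping behind such models is straightforward. A minimal Python sketch, assuming a simple two-stage gear train; the tooth counts, input torque, and efficiency are illustrative only.

```python
# (driving teeth, driven teeth) for each stage; illustrative values
stages = [(17, 51), (23, 69)]

ratio = 1.0
for z_in, z_out in stages:
    ratio *= z_out / z_in          # speed reduction contributed by the stage

t_in = 120.0                       # input torque in Nm (illustrative)
eta = 0.97                         # assumed overall mechanical efficiency
t_out = t_in * ratio * eta         # output torque under a balanced torque
print(f"total ratio {ratio:.2f}, output torque {t_out:.0f} Nm")
```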

Relevance:

30.00%

Publisher:

Abstract:

We describe a low-cost, high-quality device capable of monitoring indirect activity by detecting touch-release events on a conducting surface, i.e., the animal's cage cover. In addition to the detecting sensor itself, the system includes an IBM PC interface for prompt data storage. The hardware/software design, while serving other purposes as well, is used to record the circadian activity rhythm pattern of rats over time in an automated, computerized fashion using minimal-cost computer equipment (an IBM PC XT). Once the sensor detects a touch-release action of the rat in the upper portion of the cage, the interface sends a command to the PC, which records the time (hours-minutes-seconds) at which the activity occurred. As a result, the computer builds up several files (one per detector/sensor) containing a time list of all recorded events. Data can be visualized as actograms, indicating the number of detections per hour, and analysed with mathematical tools such as the Fast Fourier Transform (FFT) or cosinor analysis. To validate the method, an experiment was conducted on 8 Wistar rats under 12/12-h light/dark cycle conditions (lights on at 7:00 a.m.). The results provide a biological validation of the method, since it detected the presence of circadian activity rhythm patterns in the behavior of the rats.
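
The actogram and FFT analysis described above can be sketched in a few lines. A minimal Python version, assuming the event log is a plain list of timestamps in seconds; the file name is hypothetical.

```python
import numpy as np

# Hypothetical event log: one touch-release timestamp (s) per line, one sensor
events = np.loadtxt("sensor1_events.txt")

# Actogram data: number of detections per hour
hours = (events // 3600).astype(int)
counts = np.bincount(hours, minlength=hours.max() + 1)

# FFT of the hourly counts to look for a circadian (~24 h) component
spectrum = np.abs(np.fft.rfft(counts - counts.mean()))
freqs = np.fft.rfftfreq(len(counts), d=1.0)           # cycles per hour
period_h = 1.0 / freqs[1:][np.argmax(spectrum[1:])]   # dominant period, skip DC
print(f"dominant period: {period_h:.1f} h")           # ~24 h expected
```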

Relevance:

30.00%

Publisher:

Abstract:

The objective of this project was to introduce a new software product to the pulp industry, a new market for the case company. An optimization-based scheduling tool has been developed to allow pulp operations to better control their production processes and improve both production efficiency and stability. Both the work here and earlier research indicate a potential for savings of around 1-5%. All the supporting data is available today, coming from distributed control systems, data historians, and other existing sources. The pulp mill model, together with the scheduler, allows what-if analyses of the impacts and timely feasibility of various external actions, such as planned maintenance of any particular mill operation. The visibility gained from the model also proves to be a real benefit. The aim is to satisfy demand and gain extra profit while achieving the required customer service level. Research effort has been put both into understanding the minimum features needed to satisfy the scheduling requirements in the industry and into the overall existence of the market. A qualitative study was constructed to identify both the competitive situation and the requirements versus gaps on the market. It becomes clear that there is no such system on the marketplace today and that there is room to improve the target market's overall process efficiency through such a planning tool. This thesis also provides the case company with a better overall understanding of the different processes in this particular industry.

Relevance:

30.00%

Publisher:

Abstract:

The master's thesis Biomass Utilization in a PFC Co-firing System with Slagging and Fouling Analysis studies the modern technologies of different coal-firing systems: the PFC system, the FB system, and the GF system. Biomass co-fired with coal is represented by research from the company Alstom Power Plant. Against the background of air pollution, greenhouse-effect problems, and national fuel security today, bioenergy utilization is increasingly popular. However, while biomass combustion is promoted to decrease emissions of carbon dioxide and other air pollutants, new problems arise, such as slagging, fouling, and hot corrosion in the firing systems. The thesis presents a brief overview of the different coal-firing systems used around the world and focuses on biomass-coal co-firing in the PFC system. Biomass supply and the operation of the PFC system are described, and the new problems of hot corrosion, slagging, and fouling are discussed. The slagging and fouling problem is simulated using the software HSC Chemistry 6.1, and the emissions of coal firing and co-firing are compared in simulation as well.
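
The direction of the emissions comparison can be illustrated with simple fuel bookkeeping. A rough Python sketch, using commonly cited illustrative fuel properties rather than values from the thesis, and counting biogenic CO2 as neutral, as is conventional in such comparisons.

```python
# name: (lower heating value MJ/kg, carbon mass fraction, counted as fossil?)
fuels = {
    "coal":    (25.0, 0.65, True),
    "biomass": (18.0, 0.50, False),   # biogenic CO2 treated as neutral
}

def fossil_co2_per_mj(shares):
    """kg of fossil CO2 per MJ of fuel energy for a given co-firing mix."""
    energy = sum(s * fuels[f][0] for f, s in shares.items())   # MJ per kg of mix
    co2 = sum(s * fuels[f][1] * 44.0 / 12.0                    # C mass -> CO2 mass
              for f, s in shares.items() if fuels[f][2])
    return co2 / energy

print("coal only  :", fossil_co2_per_mj({"coal": 1.0}))
print("20% biomass:", fossil_co2_per_mj({"coal": 0.8, "biomass": 0.2}))
```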

Relevance:

30.00%

Publisher:

Abstract:

How does a successful programmer work? The tasks of programming computer games and programming industrial, safety-critical systems appear rather different. Through a careful empirical study, the dissertation compares and contrasts these two forms of programming and shows that programming involves more than technical skill. Drawing on hermeneutic and rhetorical theory, and with the help of both cultural studies and computer science, the dissertation shows that the programmers' tradition and values are fundamental to their work, and that both kinds of programming can be understood and analysed through the classical tradition of text interpretation. Moreover, computer programs can be viewed and analysed with the help of classical theories of speech production in practice; in this context, programs are seen as a kind of utterance. All in all, the dissertation advocates a return to the foundations of science, which entail a constant and unceasing cyclical movement between experiencing and understanding. This stands in contrast to a reductionist view of science, which draws a sharp line between the subjective and the objective and thus presupposes the possibility of attaining complete knowledge. Incomplete knowledge is the domain of interpretation and hermeneutics. The aim of the dissertation is to demonstrate, by means of examples, the cultural, hermeneutic, and rhetorical nature of programming.

Relevance:

30.00%

Publisher:

Abstract:

The main objective of the present study was to upgrade a clinical gamma camera to obtain high-resolution tomographic images of small animal organs. The system is based on a clinical gamma camera to which we have adapted a special-purpose pinhole collimator and a device for positioning and rotating the target, based on a computer-controlled step motor. We developed a software tool to reconstruct the target's three-dimensional emission distribution from a set of planar projections, based on the maximum likelihood algorithm. We present details of the hardware and software implementation. We imaged phantoms and the heart and kidneys of rats. When using pinhole collimators, the spatial resolution and sensitivity of the imaging system depend on parameters such as the detector-to-collimator and detector-to-target distances and the pinhole diameter. In this study, we reached an object voxel size of 0.6 mm and spatial resolution better than 2.4 and 1.7 mm full width at half maximum when 1.5- and 1.0-mm diameter pinholes were used, respectively. Appropriate sensitivity to study the target of interest was attained in both cases. Additionally, we show that as few as 12 projections are sufficient to attain good-quality reconstructions, a result that implies a significant reduction of acquisition time and opens the possibility of radiotracer dynamic studies. In conclusion, a high-resolution single photon emission computed tomography (SPECT) system was developed using a commercial clinical gamma camera, allowing the acquisition of detailed volumetric images of small animal organs. This type of system has important implications for research areas such as cardiology, neurology, and oncology.
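
The reconstruction is based on the maximum likelihood algorithm; the standard ML-EM update for emission tomography is compact enough to sketch. A minimal Python version: the system matrix A, which encodes the pinhole geometry and is the hard part in practice, is assumed given.

```python
import numpy as np

def mlem(A, projections, n_iter=20):
    """ML-EM reconstruction.
    A: system matrix (n_measurements x n_voxels), assumed given
    projections: measured counts (n_measurements,)"""
    x = np.ones(A.shape[1])           # uniform initial estimate
    sensitivity = A.sum(axis=0)       # back-projection of ones, per voxel
    for _ in range(n_iter):
        forward = A @ x               # expected counts for current estimate
        ratio = projections / np.maximum(forward, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sensitivity, 1e-12)
    return x
```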

Relevance:

30.00%

Publisher:

Abstract:

There is a demonstrable association between exposure to air pollutants and deaths due to cardiovascular diseases. The objective of this study was to estimate the effects of exposure to sulfur dioxide on mortality due to circulatory diseases in individuals 50 years of age or older residing in São José dos Campos, SP. This was a time-series ecological study for the years 2003 to 2007 using information on deaths due to circulatory disease obtained from Datasus reports. Data on daily levels of pollutants (particulate matter, sulfur dioxide (SO2), and ozone), temperature, and humidity were obtained from the São Paulo State Environmental Agency. Moving-average models for 2 to 7 days were fitted by Poisson regression using the R software. Exposure to SO2 was analyzed using unipollutant, bipollutant, and multipollutant models adjusted for mean temperature and humidity. Relative risks with 95% CIs were obtained, and the percent decrease in risk was calculated. There were 1928 deaths, with a daily mean (± SD) of 1.05 ± 1.03 (range: 0-6). Exposure to SO2 was significantly associated with mortality due to circulatory disease: RR = 1.04 (95% CI = 1.01 to 1.06) in the 7-day moving average, after adjusting for ozone. There was an 8.5% decrease in risk in the multipollutant model, proportional to the decrease in SO2 concentrations. The results of this study suggest that residents of medium-sized Brazilian cities with characteristics similar to those of São José dos Campos probably experience health effects from exposure to air pollutants.
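
The analysis design (moving-average exposure plus Poisson regression with covariate adjustment) translates directly into code. A minimal sketch in Python with statsmodels rather than R; the file and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical daily series: deaths, so2, o3, temp, humidity
df = pd.read_csv("daily_data.csv")

# Exposure as a 7-day moving average of SO2
df["so2_ma7"] = df["so2"].rolling(window=7).mean()

# Bipollutant model (SO2 + ozone) adjusted for mean temperature and humidity
model = smf.glm("deaths ~ so2_ma7 + o3 + temp + humidity",
                data=df.dropna(),
                family=sm.families.Poisson()).fit()

# Relative risk for a 10 unit increase in the SO2 moving average
print("RR per 10 units:", np.exp(10 * model.params["so2_ma7"]))
```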

Relevance:

30.00%

Publisher:

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace. This leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models to address some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or missing tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. To demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
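
To give a flavor of test generation from behavioral models: in the simplest setting, a behavioral model is a state machine, and transition coverage already yields a test suite. A minimal Python sketch with a hypothetical three-state model; the thesis works with UML models and industrial tools, which this does not attempt to reproduce.

```python
from collections import deque

# Hypothetical behavioral model: states and event-labelled transitions
fsm = {
    "Idle":    {"insert_coin": "Ready"},
    "Ready":   {"press_start": "Running", "refund": "Idle"},
    "Running": {"finish": "Idle"},
}

def generate_tests(fsm, start="Idle"):
    """Breadth-first search; each newly covered transition yields one
    test case, i.e. the event sequence that reaches and fires it."""
    tests, seen = [], set()
    queue = deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        for event, target in fsm.get(state, {}).items():
            if (state, event) not in seen:
                seen.add((state, event))
                tests.append(path + [event])
                queue.append((target, path + [event]))
    return tests

print(generate_tests(fsm))   # one event sequence per transition
```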

Relevance:

30.00%

Publisher:

Abstract:

As the efficiency of a wind turbine gearbox increases, more power can be transferred from the rotor blades to the generator and less power is lost to wear and heating in the gearbox. By using a simulation model, the behavior of the gearbox can be studied before creating expensive prototypes. The objective of the thesis is to model a wind turbine gearbox and its lubrication system in order to study power losses and heat transfer inside the gearbox, and to study the simulation methods of the software used. The software used to create the simulation model is Siemens LMS Imagine.Lab AMESim, with which one-dimensional mechatronic system simulation models can be created from different fields of engineering. By combining components from different libraries, it is possible to create a simulation model that includes mechanical, thermal, and hydraulic models of the gearbox. Results for the mechanical, thermal, and hydraulic simulations are presented in the thesis. Due to the large scale of the wind turbine gearbox and the amount of power transmitted, the power loss calculations of the AMESim software are inaccurate, and the power losses are therefore modelled as a constant efficiency for each gear mesh. Starting values for the thermal and hydraulic simulations were chosen from test measurements and from empirical studies, as the compact and complex design of the gearbox prevents accurate test measurements. In further studies, to increase the accuracy of the simulation model, the components used for the power loss calculations need to be modified, and the values of unknown variables need to be determined through accurate test measurements.
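
The constant-efficiency-per-mesh modelling choice is easy to illustrate. A minimal Python sketch with assumed per-mesh efficiencies and input power; none of the numbers come from the thesis.

```python
# Assumed efficiencies for a three-stage gearbox (illustrative values)
mesh_efficiencies = [0.995, 0.99, 0.985]
p_in_kw = 3000.0                       # rotor-side input power, illustrative

p, heat_kw = p_in_kw, 0.0
for i, eta in enumerate(mesh_efficiencies, start=1):
    loss = p * (1.0 - eta)             # power converted to heat in this mesh
    heat_kw += loss
    p -= loss
    print(f"stage {i}: loss {loss:.1f} kW")

print(f"output {p:.1f} kW, heat {heat_kw:.1f} kW, "
      f"overall efficiency {p / p_in_kw:.3%}")
```

In a combined model of this kind, the heat term is what the thermal side of the simulation consumes.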

Relevance:

30.00%

Publisher:

Abstract:

The application of VSC-HVDC technology throughout the world has turned out to be an efficient solution for integrating a large share of wind power into different power systems. The technology enhances the overall reliability of the grid through active and reactive power control schemes, which allow the frequency and voltage at the busbars of end-consumers to be maintained at the level required by the network operator. This master's thesis focuses on the existing and planned wind farms as well as the electric power system of the Åland Islands. The goal is to analyze the wind conditions of the islands and predict the possible production of the existing and planned wind farms with the help of the WAsP software. Further, to investigate the influence of increased wind power, a simulation model of the electric grid and the VSC-HVDC system is developed in PSCAD, and the grid response to different wind power production cases is examined with respect to the grid code requirements to ensure the stability of the power system.
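
The wind-resource side of such a study reduces to combining a fitted wind-speed distribution with a turbine power curve. A minimal Python sketch, assuming Weibull parameters of the kind WAsP fits and a simplified, hypothetical power curve.

```python
import numpy as np

k, A = 2.0, 8.5                    # Weibull shape and scale (m/s), assumed

def power_kw(v, rated_kw=3000.0):
    """Simplified power curve (assumed): cut-in 3, rated 12, cut-out 25 m/s."""
    if v < 3.0 or v >= 25.0:
        return 0.0
    if v >= 12.0:
        return rated_kw
    return rated_kw * ((v - 3.0) / (12.0 - 3.0)) ** 3

v = np.linspace(0.0, 30.0, 3001)
pdf = (k / A) * (v / A) ** (k - 1) * np.exp(-((v / A) ** k))  # Weibull density
p = np.array([power_kw(x) for x in v])
mean_power_kw = np.trapz(p * pdf, v)          # expected output power
print(f"mean power {mean_power_kw:.0f} kW, "
      f"AEP {mean_power_kw * 8760 / 1000:.0f} MWh")
```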

Relevance:

30.00%

Publisher:

Abstract:

This thesis examines how content marketing is used in B2B customer acquisition and how a content marketing performance measurement system is built and utilized in this context. Literature related to performance measurement, branding, and buyer behavior is examined in the theoretical part in order to identify the elements that influence the design and usage of content marketing performance measurement. A qualitative case study approach was chosen in order to gain a deep understanding of the phenomenon studied. The case company is a Finnish software vendor that operates in B2B markets and has practiced content marketing for approximately two years. In-depth interviews were conducted with three employees from the marketing department. According to the findings, the infrastructure of the content marketing performance measurement system is based on the target market's decision-making processes, the company's own customer acquisition process, a marketing automation tool, and analytics solutions. The main roles of the content marketing performance measurement system are measuring performance, strategy management, and learning and improvement. The content marketing objectives in the context of customer acquisition are enhancing brand awareness, influencing brand attitude, and generating leads. Both non-financial and financial outcomes are assessed by phase-specific single metrics, phase-specific overall KPIs, and ratings related to lead involvement.

Relevance:

30.00%

Publisher:

Abstract:

Many-core systems offer great potential for application performance with their massively parallel structure. Such systems are currently being integrated into most parts of daily life, from high-end server farms to desktop systems, laptops, and mobile devices. Yet these systems face increasing challenges: high temperatures causing physical damage, high electricity bills for both servers and individual users, unpleasant noise levels due to active cooling, and unrealistic battery drainage in mobile devices; all factors caused directly by poor energy efficiency. Power management has traditionally been an area of research providing hardware solutions or runtime power management in the operating system in the form of frequency governors. Energy awareness in application software is currently non-existent: applications are not involved in power management decisions, nor does any interface exist between the applications and the runtime system to provide such facilities. Power management in the operating system is therefore performed purely on the basis of indirect implications of software execution, usually referred to as the workload. This often results in over-allocation of resources and hence in wasted power. This thesis discusses power management strategies for many-core systems in the form of increased application software awareness of energy efficiency. The presented approach allows meta-data descriptions in the applications and is manifested in two design recommendations, 1) energy-aware mapping and 2) energy-aware execution, which allow the applications to directly influence power management decisions. The recommendations eliminate over-allocation of resources and increase the energy efficiency of the computing system. Both recommendations are fully supported by a provided interface in combination with a novel power management runtime system called Bricktop. The work presented in this thesis allows both new and legacy software to execute with the most energy-efficient mapping on a many-core CPU and at the most energy-efficient performance level. A set of case study examples demonstrates real-world energy savings in a wide range of applications without performance degradation.
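
The core idea of energy-aware mapping and execution, letting the application pick the configuration that minimizes energy rather than maximizes raw speed, can be sketched compactly. The power model, workload, and numbers below are hypothetical; Bricktop's actual interface is not described in this abstract.

```python
# Candidate mappings: (active cores, frequency in GHz); illustrative set
configs = [(1, 1.0), (1, 2.0), (2, 1.0), (2, 2.0), (4, 1.0), (4, 2.0)]

def power_w(cores, f):
    # Hypothetical model: per-core static power + dynamic power ~ f^3
    return cores * (0.5 + 1.2 * f ** 3)

def runtime_s(work, cores, f, parallel_fraction=0.9):
    # Amdahl-style runtime for a workload of `work` GHz-seconds
    speedup = 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)
    return work / (f * speedup)

work = 100.0                            # hypothetical workload size
best = min(configs,                     # minimize energy = power x runtime
           key=lambda c: power_w(*c) * runtime_s(work, *c))
print("most energy-efficient mapping:", best)
```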