997 results for Graphical modeling (Statistics)


Relevance:

20.00%

Publisher:

Abstract:

This study aimed to apply mathematical models to the growth of Nile tilapia (Oreochromis niloticus) reared in net cages in the lower São Francisco basin and to choose the model(s) that best represent the rearing conditions of the region. The nonlinear Brody, Bertalanffy, Logistic, Gompertz, and Richards models were tested. The models were fitted to the weight-for-age series using the Gauss, Newton, Gradient, and Marquardt methods, with the NLIN procedure of SAS® (2003) used to estimate the parameters from the available data. The best fits were obtained with the Bertalanffy, Gompertz, and Logistic models, which are equivalent in explaining the growth of the animals up to 270 days of rearing. From a commercial point of view, it is recommended that tilapia be marketed at a weight of at least 600 g, which the Bertalanffy, Gompertz, and Logistic models estimate is reached after 183, 181, and 184 days of rearing, respectively; for a mass of up to 1 kg, it is suggested that rearing be stopped at 244, 244, and 243 days, respectively.
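As an illustration of the kind of model fitting described above, the following sketch fits a Gompertz curve to a weight-for-age series with SciPy rather than SAS NLIN; the data points and starting values are hypothetical, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Gompertz growth: W(t) = A * exp(-b * exp(-k t))
def gompertz(t, A, b, k):
    return A * np.exp(-b * np.exp(-k * t))

# hypothetical weight-for-age data (days, grams) for illustration only
t = np.array([30, 60, 90, 120, 150, 180, 210, 240, 270], dtype=float)
w = np.array([28, 85, 190, 330, 480, 620, 760, 880, 970], dtype=float)

params, _ = curve_fit(gompertz, t, w, p0=[1200, 4, 0.01], maxfev=10000)
A, b, k = params

# age at which the fitted curve reaches 600 g: solve W(t) = 600 for t
t_600 = -np.log(-np.log(600 / A) / b) / k
```

The same inversion of the fitted curve is how a marketing age for a target weight (600 g or 1 kg) can be read off the model.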

ABSTRACT Given the need for systems to better control the broiler production environment, an experiment was performed with broilers from 1 to 21 days of age, which were subjected to different air temperature intensities and durations in conditioned wind tunnels; the results were used to validate a fuzzy model. The model was developed with the duration of heat stress (days) and the dry-bulb air temperature (°C) as input variables, and feed intake (g), weight gain (g), and feed conversion (g.g-1) as output variables. The Mamdani inference method was used, 20 rules were prepared, and defuzzification was performed with the Center of Gravity technique. The results obtained in the model simulation show a satisfactory efficiency in determining productive responses when compared with the experimental data: the R² values calculated for feed intake, weight gain, and feed conversion were 0.998, 0.981, and 0.980, respectively.
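A minimal Mamdani controller along the lines described above can be sketched in plain Python; the membership functions, the two rules, and all numeric ranges below are assumptions for illustration, not the 20-rule model of the study:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a, b, c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

# universe of discourse for the output (daily feed intake, g) -- illustrative
y = np.linspace(0, 100, 501)

# input memberships (temperature in °C, stress duration in days) -- assumed shapes
temp_comfort = lambda t: tri(t, 20, 26, 32)
temp_hot     = lambda t: tri(t, 28, 36, 44)
dur_short    = lambda d: tri(d, -1, 0, 3)
dur_long     = lambda d: tri(d, 2, 7, 12)

# output memberships
intake_low  = tri(y, 0, 20, 50)
intake_high = tri(y, 40, 80, 100)

def mamdani(temp, days):
    # rule activations (min as the AND operator)
    r1 = min(temp_comfort(temp), dur_short(days))  # comfort & short -> high intake
    r2 = min(temp_hot(temp), dur_long(days))       # hot & long -> low intake
    # clip the output sets and aggregate with max
    agg = np.maximum(np.minimum(r1, intake_high), np.minimum(r2, intake_low))
    # Center of Gravity (centroid) defuzzification
    return float(np.sum(agg * y) / (np.sum(agg) + 1e-12))
```

For example, a comfortable temperature with no heat stress activates the "high intake" rule, while a long hot spell activates the "low intake" rule.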

Filtration is a widely used unit operation in chemical engineering. The huge variation in the properties of the materials to be filtered makes the study of filtration a challenging task. One of the objectives of this thesis was to show that conventional filtration theories are difficult to use when the system to be modelled contains all of the stages and features that are present in a complete solid/liquid separation process. Furthermore, most filtration theories require experimental work to be performed in order to obtain the critical parameters required by the theoretical models. Creating a good overall understanding of how the variables affect the final product in filtration is all but impossible on a purely theoretical basis. The complexity of solid/liquid separation processes requires experimental work, and when tests are needed, it is advisable to use experimental design techniques so that the goals can be achieved. The statistical design of experiments provides the necessary tools for recognising the effects of variables and also helps to perform experimental work more economically. Design of experiments is a prerequisite for creating empirical models that can describe how the measured response is related to changes in the values of the variables. A software package was developed that provides the filtration practitioner with experimental designs and calculates the parameters for linear regression models, along with a graphical representation of the responses. The developed software consists of two modules, LTDoE and LTRead. The LTDoE module is used to create experimental designs for different filter types. The filter types considered in the software are the automatic vertical pressure filter, double-sided vertical pressure filter, horizontal membrane filter press, vacuum belt filter, and ceramic capillary action disc filter. It is also possible to create experimental designs for cases where the variables are totally user defined, say for a customized filtration cycle or a different piece of equipment. The LTRead module is used to read the experimental data gathered from the experiments, to analyse the data, and to create models for each of the measured responses. Introducing the structure of the software in more detail and showing some of the practical applications is the main part of this thesis. This approach to the study of cake filtration processes has been shown to have good practical value in filtration testing.
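The design-then-regress workflow that LTDoE and LTRead implement can be sketched with a two-level full factorial design and a least-squares fit; the factor names and response values below are made up for illustration:

```python
import itertools
import numpy as np

# two-level full factorial design for three filtration variables (coded -1/+1);
# the factor names are assumed examples, not the software's variable set
factors = ["pressure", "slurry_concentration", "cake_thickness"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))), float)

# hypothetical measured response (e.g. cake moisture, %) for the 8 runs
response = np.array([22.1, 19.8, 23.4, 20.9, 18.7, 16.2, 19.9, 17.5])

# fit a main-effects linear model y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares
X = np.column_stack([np.ones(len(design)), design])
coefs, *_ = np.linalg.lstsq(X, response, rcond=None)
effects = dict(zip(["intercept"] + factors, coefs))
```

Because the design is orthogonal, each coefficient isolates the effect of one variable, which is exactly what the statistical design of experiments buys over one-variable-at-a-time testing.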

Transportation and warehousing are large and growing sectors of society, and their efficiency is of high importance. Transportation also accounts for a large share of global carbon dioxide emissions, which are one of the leading causes of anthropogenic climate warming. Various countries have agreed to decrease their carbon emissions under the Kyoto Protocol, yet transportation is the only sector where emissions have steadily increased since the 1990s, which highlights the importance of transportation efficiency. The efficiency of transportation and warehousing can be improved with the help of simulations, but models alone are not sufficient. This research concentrates on the use of simulations in decision support systems. Three main simulation approaches are used in logistics: discrete-event simulation, system dynamics, and agent-based modeling. Each individual approach, however, has weaknesses of its own. Hybridization (combining two or more approaches) can improve the quality of the models, as it allows one method to overcome the weaknesses of another. It is important to choose the correct approach (or combination of approaches) when modeling transportation and warehousing issues. If an inappropriate method is chosen (which can occur if the modeler is proficient in only one approach or the model specification is not conducted thoroughly), the simulation model will have an inaccurate structure, which in turn will lead to misleading results. This issue can escalate further, as the decision-maker may assume that the presented simulation model gives the most useful results available, even though the whole model may be based on a poorly chosen structure. This research argues that a functioning simulation-based decision support system needs to take various issues into account. The actual simulation model can be constructed using any (or several) of the approaches, it can be combined with different optimization modules, and there needs to be a proper interface between the model and the user. These issues are presented in a framework that simulation modelers can use when creating decision support systems. For decision-makers to fully benefit from the simulations, the user interface needs to clearly separate the model from the user, while still letting the user perform the appropriate runs to analyze the problems correctly. This study recommends that simulation modelers start to turn their tacit knowledge into explicit knowledge, which would greatly benefit the whole simulation community and improve the quality of simulation-based decision support systems. More studies should also be conducted using hybrid models and integrating simulations with Geographic Information Systems.
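Of the three approaches named above, discrete-event simulation is the most directly code-oriented; a minimal event-driven sketch of a single-dock warehouse (illustrative numbers, not a model from this research) looks like this:

```python
import heapq

def simulate_warehouse(arrivals, service_time):
    """Minimal discrete-event simulation of a single-dock warehouse:
    trucks arrive at the given times and are unloaded one at a time.
    Returns the waiting time of each truck."""
    events = [(t, "arrival") for t in arrivals]
    heapq.heapify(events)            # event list ordered by time
    dock_free_at = 0.0
    waits = []
    while events:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            start = max(t, dock_free_at)   # wait if the dock is busy
            waits.append(start - t)
            dock_free_at = start + service_time
    return waits

# three trucks arriving close together at one dock (hours, illustrative)
waits = simulate_warehouse([0.0, 1.0, 2.0], service_time=3.0)
```

Queueing delays like these are exactly the kind of output a simulation-based decision support system would surface to the decision-maker, e.g. to justify a second dock.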

The use of batteries as energy storage is growing in automotive and mobile working machine applications. As battery systems become larger, battery management becomes an essential part of the application with regard to battery fault situations and the safety of the user. A properly designed battery management system (BMS) extends both a single charge cycle and the whole lifetime of the battery pack. In this thesis, the main objectives and principles of a BMS are studied, and a first-order Thevenin model of a lithium-titanate battery cell is built based on laboratory measurements. The battery cell model is then verified by comparing it against the actual battery cell, and its suitability for use in a BMS is studied.
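A first-order Thevenin cell model of the kind built in the thesis can be sketched as an open-circuit voltage source in series with a resistance R0 and one RC branch; the parameter values below are illustrative defaults, not the measured lithium-titanate values:

```python
import math

def thevenin_voltage(i_load, dt, steps, ocv=2.3, r0=0.002, r1=0.001, c1=5000.0):
    """First-order Thevenin model: terminal voltage under constant discharge.
    v_t = OCV - i*R0 - v1, where v1 is the voltage over the RC branch.
    Parameter values are illustrative, not measured cell data."""
    v1 = 0.0
    tau = r1 * c1                      # RC branch time constant (s)
    volts = []
    for _ in range(steps):
        # exact discretization of dv1/dt = -v1/tau + i/C1 for constant current
        v1 = v1 * math.exp(-dt / tau) + i_load * r1 * (1 - math.exp(-dt / tau))
        volts.append(ocv - i_load * r0 - v1)
    return volts

# 10 A constant discharge sampled once per second for 100 s
v_curve = thevenin_voltage(10.0, dt=1.0, steps=100)
```

A BMS can run such a model in real time and compare the predicted terminal voltage against the measured one, which is essentially how the thesis verifies the model against the actual cell.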

Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need for managing this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way that allows easy management of processes, for example process dissemination, process tailoring, and process enactment. Process modeling languages are usually used as tools for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, which is currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles which represent the dissertation research done on process modeling over a period of approximately five years. The research follows the classical engineering research discipline, where the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed. The research applies a variety of research techniques, ranging from literature surveys to qualitative studies done among software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and the actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, such as lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. It also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way; models also help to share this knowledge with others. A qualitative study done among Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work. This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies where the modeling techniques are used, for example, to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.

In this doctoral thesis, methods are developed to estimate the expected power cycling life of power semiconductor modules based on chip temperature modeling. Frequency converters operate under dynamic loads in most electric drives. The varying loads cause thermal expansion and contraction, which stresses the internal boundaries between the material layers in the power module. Eventually, this stress wears out the semiconductor modules. The wear-out cannot be detected by traditional temperature or current measurements inside the frequency converter; it is therefore important to develop a method to predict the end of the converter lifetime. The thesis concentrates on power-cycling-related failures of insulated gate bipolar transistors (IGBTs). Two types of power modules are discussed: a direct bonded copper (DBC) sandwich structure with and without a baseplate. The most common failure mechanisms are reviewed, and methods to improve the power cycling lifetime of the power modules are presented. Power cycling curves are determined for a module with a lead-free solder by accelerated power cycling tests. A lifetime model is selected, and its parameters are updated based on the power cycling test results. According to the measurements, the power cycling lifetime of modern IGBT power modules has improved by a factor of more than 10 during the last decade. It is also observed that a 10 °C increase in the chip temperature cycle amplitude decreases the lifetime by 40%. A thermal model for chip temperature estimation is developed, based on estimating the power losses of the chip from the output current of the frequency converter. The model is verified with purpose-built test equipment, which allows simultaneous measurement and simulation of the chip temperature with an arbitrary load waveform. The measurement system is shown to be convenient for studying the thermal behavior of the chip, and the thermal model is found to estimate the temperature with an accuracy of 5 °C. The temperature cycles that the power semiconductor chip has experienced are counted with the rainflow algorithm. The counted cycles are compared with the experimentally verified power cycling curves to estimate the life consumption for the mission profile of the drive. The methods are validated by estimating the lifetime of a power module in a direct-driven wind turbine: the estimated lifetime of the IGBT power module is 15,000 years if the turbine is located in south-eastern Finland.
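The final step, turning counted temperature cycles into consumed life, can be sketched with Miner's rule; the cycles-to-failure law and its constants below are assumed stand-ins, not the thesis's fitted power cycling curves:

```python
# Miner's rule damage accumulation over counted temperature cycles.
def cycles_to_failure(delta_t, a=5e12, n=5.0):
    """Assumed Coffin-Manson-type power law N_f = a * dT^-n.
    The constants a and n are illustrative, not fitted test data."""
    return a * delta_t ** -n

def consumed_life(counted_cycles):
    """counted_cycles: list of (delta_t_kelvin, count) pairs, e.g. the
    output of a rainflow count of the chip temperature history."""
    return sum(count / cycles_to_failure(dt) for dt, count in counted_cycles)

# example mission profile: mostly small cycles, a few large ones (illustrative)
profile = [(10.0, 100000), (30.0, 5000), (60.0, 200)]
damage = consumed_life(profile)   # failure is expected when damage reaches 1
```

With a power-law exponent like this, the few large temperature swings dominate the damage sum, which is why rainflow counting of the mission profile matters for lifetime estimation.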

To obtain the desired accuracy of a robot, two techniques are available. The first option is to make the robot match the nominal mathematical model; in other words, the manufacturing and assembly tolerances of every part are made extremely tight so that all of the various parameters match the "design" or "nominal" values as closely as possible. This method can satisfy most accuracy requirements, but the cost increases dramatically as the accuracy requirement grows. Alternatively, a more cost-effective solution is to build a manipulator with relaxed manufacturing and assembly tolerances and to compensate the actual errors of the robot by modifying the mathematical model in the controller. This is the essence of robot calibration. Simply put, robot calibration is the process of defining an appropriate error model and then identifying the various parameter errors that make the error model match the robot as closely as possible. This work focuses on the kinematic calibration of a 10-degree-of-freedom (DOF) redundant serial-parallel hybrid robot. The robot consists of a 4-DOF serial mechanism and a 6-DOF hexapod parallel manipulator. The redundant 4-DOF serial structure is used to enlarge the workspace, and the 6-DOF hexapod manipulator provides high load capability and stiffness for the whole structure. The main objective of the study is to develop a suitable calibration method to improve the accuracy of the redundant serial-parallel hybrid robot. To this end, a Denavit-Hartenberg (DH) hybrid error model and a Product-of-Exponentials (POE) error model are developed for error modeling of the proposed robot. Furthermore, two kinds of global optimization methods, the differential evolution (DE) algorithm and the Markov chain Monte Carlo (MCMC) algorithm, are employed to identify the parameter errors of the derived error models. A measurement method based on a 3-2-1 wire-based pose estimation system is proposed and implemented in a SolidWorks environment to simulate real experimental validation. Numerical simulations and SolidWorks prototype-model validations are carried out on the hybrid robot to verify the effectiveness, accuracy, and robustness of the calibration algorithms.
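The identification step can be illustrated with SciPy's differential evolution on a toy stand-in problem: recovering two parameter errors of a planar two-link arm from simulated tip measurements. All geometry and numbers below are hypothetical, far simpler than the 10-DOF hybrid robot of the study:

```python
import numpy as np
from scipy.optimize import differential_evolution

TRUE_ERR = np.array([0.013, -0.008])   # "unknown" errors to recover (m, rad)
L1, L2 = 0.5, 0.4                      # nominal link lengths (m), assumed

def tip(q1, q2, dl, dq):
    """Forward kinematics with a length error dl on link 2
    and a joint-offset error dq on joint 1."""
    x = L1 * np.cos(q1 + dq) + (L2 + dl) * np.cos(q1 + dq + q2)
    y = L1 * np.sin(q1 + dq) + (L2 + dl) * np.sin(q1 + dq + q2)
    return np.array([x, y])

# "measured" tip positions generated with the true errors (noise-free toy data)
poses = [(0.2, 0.5), (1.0, -0.3), (0.7, 1.1), (-0.4, 0.8)]
measured = [tip(q1, q2, *TRUE_ERR) for q1, q2 in poses]

def cost(err):
    # sum of squared position residuals over all measured poses
    return sum(np.sum((tip(q1, q2, *err) - m) ** 2)
               for (q1, q2), m in zip(poses, measured))

result = differential_evolution(cost, bounds=[(-0.05, 0.05), (-0.05, 0.05)],
                                seed=1, tol=1e-8)
```

Once the parameter errors are identified, they are folded back into the controller's kinematic model, which is the compensation idea the abstract describes.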

In this doctoral thesis, a power conversion unit for a 10 kW solid oxide fuel cell (SOFC) is modeled and a suitable control system is designed. The need for this research was identified from the observation that no information was available about the characteristics of the solid oxide fuel cell from the perspective of power electronics and control, and suitable control methods had not previously been studied in the literature. In addition, because of the digital implementation of the control system, the inherent characteristics of the digital system had to be considered together with the characteristics of the SOFC. The characteristics of the solid oxide fuel cell, as well as the methods for modeling and controlling the DC/DC converter and the grid converter, are studied in a literature survey. Based on the survey, the characteristics of the SOFC as an electrical power source are identified, and a solution for interfacing the SOFC with distributed generation is proposed. A mathematical model of the power conversion unit is provided, and the control design for the DC/DC converter and the grid converter is carried out based on the proposed interfacing solution. The limit cycling phenomenon is identified as a source of low-frequency current ripple, which is found to be insignificant when connected to a grid-tied converter. A method to mitigate the second harmonic originating from the grid interface is proposed, and practical considerations for operation with a solid oxide fuel cell plant are presented. At the theoretical level, the thesis discusses and summarizes the methods for successfully deriving a model of a DC/DC converter, a grid converter, and a power conversion unit. The results of this doctoral thesis can also be used in other applications, and the models and methods can be adapted to similar applications such as photovoltaic systems. Comparing the results with the objectives of the doctoral thesis, we may conclude that the objectives set for the work are met. The thesis presents theoretical and practical guidelines for a successful control design that connects an SOFC-based distributed generation plant to the utility grid.
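Digitally implemented converter control loops of the kind discussed above are typically built from discrete-time PI controllers; a generic sketch with anti-windup, driving an assumed first-order plant, is shown below. The gains and plant constants are illustrative, not the thesis design:

```python
class DiscretePI:
    """Discrete-time PI controller (backward-Euler integration), the kind of
    loop used in a digitally controlled DC/DC converter. Gains are illustrative."""
    def __init__(self, kp, ki, ts, u_min=0.0, u_max=1.0):
        self.kp, self.ki, self.ts = kp, ki, ts
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def update(self, reference, measurement):
        error = reference - measurement
        self.integral += self.ki * self.ts * error
        # anti-windup: clamp the integrator to the actuator range
        self.integral = max(self.u_min, min(self.u_max, self.integral))
        u = self.kp * error + self.integral
        return max(self.u_min, min(self.u_max, u))   # e.g. a duty cycle in [0, 1]

# regulate an assumed first-order plant y[k+1] = y[k] + ts*(u*gain - y)/tau
pi = DiscretePI(kp=0.5, ki=20.0, ts=1e-3)
y, gain, tau = 0.0, 2.0, 0.05
for _ in range(2000):
    u = pi.update(1.0, y)
    y += 1e-3 * (u * gain - y) / tau
```

The fixed sample time ts is where the "inherent characteristics of the digital system" enter: sampling, computation delay, and quantization all limit the achievable bandwidth of such a loop.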

The goal of this Master's thesis is to apply process management methods, traditionally rooted in industrial production, to the development of reception services in social and health care. The purpose is to use the modeling of expert service processes to map the current state of the processes by identifying problem areas and development targets, and to apply the principles of process management methods in formulating development proposals to resolve them. In line with the theoretical framework of the study, topics such as process modeling, i.e. process description and analysis, and social and health care as service production are examined. The study uses both qualitative and quantitative methods, and the research material is collected through interviews, observation, and statistics. Based on the material, descriptions of the experts' service processes are drawn up, and problem areas and development targets related to the processes, or more broadly to the social and health care organization, are identified. The development measures emphasize customer orientation and productivity, and they take into account the perspectives of the customer, the personnel, the processes, and customer guidance. The main results of the work are the process descriptions, the identified problem areas and development targets, and the development proposals formulated for redefining the processes, whose benefits and effects, for example on performance, are assessed. Based on the results, it can be concluded that process management methods are suitable for developing social and health care operations, provided that the special characteristics and challenges of the operating environment are taken into account.

The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process: the handling of invoices. Efficient invoice processing can have an impact on an organization's working capital management and thus give companies better readiness to face the challenges related to cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling processes of four different organizations. The invoice data was collected from each organization's invoice processing system, and the sample included all invoices each organization had processed during the year 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included examining the longest lead times between process steps and the impact of manual process steps on cycle time. The processing of invoices from maverick purchases was also examined. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices. It should be noted, however, that the manual tasks related to turning a paper invoice into electronic format through scanning are ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automated steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps had a prolonging impact on the process. Regarding invoices from maverick purchases, the evidence shows that these invoices were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions for improving the process included increasing invoice matching, reducing manual steps, and leveraging different value-added services such as invoice validation, mobile solutions, and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
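The core lead-time analysis that a process mining tool performs on such an event log can be sketched in a few lines; the invoice events below are fabricated for illustration, not data from the four organizations:

```python
from datetime import datetime
from collections import defaultdict

# event log rows: (invoice_id, activity, timestamp) -- fabricated examples
log = [
    ("inv1", "received", "2012-03-01 09:00"),
    ("inv1", "matched",  "2012-03-01 09:05"),
    ("inv1", "approved", "2012-03-04 14:00"),
    ("inv2", "received", "2012-03-02 10:00"),
    ("inv2", "approved", "2012-03-02 16:30"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# group events per invoice (case), ordered by time
cases = defaultdict(list)
for case_id, activity, ts in sorted(log, key=lambda r: (r[0], parse(r[2]))):
    cases[case_id].append((activity, parse(ts)))

# measure the lead time of every observed transition between steps
transition_hours = defaultdict(list)
for events in cases.values():
    for (a1, t1), (a2, t2) in zip(events, events[1:]):
        transition_hours[(a1, a2)].append((t2 - t1).total_seconds() / 3600)

# average lead time per transition reveals the slowest hand-offs
avg = {k: sum(v) / len(v) for k, v in transition_hours.items()}
```

Ranking the averaged transitions immediately surfaces the hand-offs with the longest lead times, which is the kind of finding the research reports for manual approval steps.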

Welding has a growing role in modern manufacturing, and welded joints are used extensively from pipes to the aerospace industry. Prediction of welding residual stresses and distortions is necessary for an accurate evaluation of fillet welds with respect to design and safety conditions. Residual stresses may be beneficial or detrimental, depending on whether they are tensile or compressive and on the loading; they directly affect the fatigue life of the weld by influencing the crack growth rate. Besides the theoretical background of residual stresses, this study calculates the residual stresses and deformations caused by the localized heating of the welding process and the subsequent rapid cooling in fillet welds. Validated methods are required for this purpose because of the complexity of the process: localized heating, the temperature dependence of the material properties, and the heat source. In this research, both empirical and simulation methods were used for the analysis of the welded joints. Finite element simulation has become a popular tool for predicting welding residual stresses and distortion. Three different cases, with and without preload, were modeled in this study. The thermal load set is defined by calculating the heat flux from the given heat input energy. First a linear and then a nonlinear material behavior model is used for the calculation of the residual stresses. Experimental work was done to determine the stresses empirically, and the results from both methods are compared to check their reliability. Residual stresses can have a significant effect on the fatigue performance of welded joints made of high-strength steel; both the initial residual stress state and the subsequent residual stress relaxation need to be considered for an accurate description of the fatigue behavior. Tensile residual stresses are detrimental and will reduce the fatigue life, whereas compressive residual stresses will increase it. The residual stresses approach the yield strength of the base or filler material, and since components made of high-strength steel are typically thin, the role of distortion is emphasized.
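The observation that residual stresses approach the yield strength can be illustrated with a back-of-the-envelope estimate of fully restrained thermal stress; the material constants below are typical textbook values for steel, not the study's data, and this is not a substitute for the finite element analysis the study performs:

```python
# Fully restrained thermal stress sigma = E * alpha * dT, capped at yield.
E = 210e9        # Young's modulus of steel (Pa), typical value
alpha = 1.2e-5   # thermal expansion coefficient (1/K), typical value
yield_s = 700e6  # yield strength of a high-strength steel (Pa), assumed

def residual_stress_estimate(delta_t):
    """Elastic thermal stress for a temperature change delta_t (K),
    capped at the yield strength: the material yields beyond it."""
    return min(E * alpha * delta_t, yield_s)

sigma = residual_stress_estimate(500.0)   # contraction over 500 K of cooling
```

Even a few hundred kelvin of restrained cooling contraction exceeds the yield strength, which is why weld-zone residual stresses end up at roughly the yield level of the base or filler material.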

Model-View-Controller (MVC) is an architectural pattern used in software development for graphical user interfaces. It was one of the first solutions, proposed in the late 1970s, to the Smart UI anti-pattern, which refers to the practice of writing all domain logic into the user interface. The original MVC pattern has since evolved in multiple directions under various names, which can be confusing. The goal of this thesis is to present the origin of the MVC pattern and how it has changed over time. Software architecture in general and the MVC pattern's evolution within web applications are not the primary focus. The fundamental designs are abstracted and then used to examine the more recent versions. Problems with the subject and its terminology are also presented.
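A minimal sketch of the classic pattern, with assumed class names, shows the separation the text describes: the model notifies observers, the view renders, and the controller translates input into model updates:

```python
# Minimal MVC in the classic Smalltalk-80 spirit: the model knows nothing
# about any concrete view; views observe the model; the controller turns
# user input into model updates. All names here are illustrative.
class CounterModel:
    def __init__(self):
        self.value = 0
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def increment(self):
        self.value += 1
        for obs in self._observers:      # notify all registered views
            obs.refresh(self)

class CounterView:
    def __init__(self):
        self.rendered = ""

    def refresh(self, model):
        # re-render from model state; no domain logic lives here
        self.rendered = f"count = {model.value}"

class CounterController:
    def __init__(self, model):
        self.model = model

    def handle_click(self):
        # translate a user gesture into a model operation
        self.model.increment()

model, view = CounterModel(), CounterView()
model.attach(view)
CounterController(model).handle_click()
```

The Smart UI anti-pattern collapses all three roles into the view class; the later MVC variants (MVP, MVVM, and others) mostly differ in how the view and controller responsibilities are redistributed.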