991 results for Task Modeling
Abstract:
This study aimed to verify the differences in radiation intensity as a function of distinct relief exposure surfaces and to quantify their effects on the leaf area index (LAI) and other variables expressing eucalyptus forest productivity, for simulations in a process-based growth model. The study was carried out at two contrasting edaphoclimatic locations in the Rio Doce basin in Minas Gerais, Brazil. Two stands with 32-year-old plantations were used, with fixed plots allocated in locations with northern and southern exposure surfaces. The meteorological data were obtained from two automated weather stations located near the study sites. Solar radiation was corrected for terrain inclination and exposure surface, since it is measured on a horizontal plane, perpendicular to the local vertical. The LAI values collected in the field were used. For the comparative simulations of productivity variation, the mechanistic 3PG model was used, considering the relief exposure surfaces. It was verified that during most of the year the southern surfaces showed lower availability of incident solar radiation, with losses of up to 66% compared to the same surface treated as horizontal, probably related to their geographical location and steeper slope. Higher values of LAI, volume and mean annual wood increment were obtained for the plantings located on the northern surface, and this tendency was repeated in the 3PG model simulations.
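The radiation correction described above is geometric: pyranometers measure irradiance on a horizontal plane, so the beam component must be projected onto the inclined surface. Below is a minimal sketch of one common form of this correction (the ratio of the cosine of the beam's incidence angle on the slope to the cosine of the solar zenith angle); the function name and all angle conventions are illustrative assumptions, not the study's actual procedure.

```python
import math

def slope_correction(zenith_deg, sun_azimuth_deg, slope_deg, aspect_deg):
    """Ratio of beam irradiance on a tilted surface to that measured
    on a horizontal plane: cos(incidence angle) / cos(zenith angle)."""
    z = math.radians(zenith_deg)
    s = math.radians(slope_deg)
    # cosine of the beam's incidence angle on the inclined surface
    cos_i = (math.cos(z) * math.cos(s)
             + math.sin(z) * math.sin(s)
             * math.cos(math.radians(sun_azimuth_deg - aspect_deg)))
    return max(cos_i, 0.0) / math.cos(z)
```

A slope facing the sun yields a ratio above 1, while a slope facing away can fall toward 0, which matches the abstract's contrast between northern and southern exposure surfaces in the southern hemisphere.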
Abstract:
Traditionally, limestone has been used for flue gas desulfurization in fluidized bed combustion. Recently, several studies have examined the use of limestone in applications that enable the removal of carbon dioxide from the combustion gases, such as calcium looping technology and oxy-fuel combustion. In these processes, interlinked limestone reactions occur, but the reaction mechanisms and kinetics are not yet fully understood. To examine these phenomena, analytical and numerical models have been created. In this work, the limestone reactions were studied with the aid of a one-dimensional numerical particle model. The model describes a single limestone particle in the process as a function of time, the progress of the reactions, and the mass and energy transfer within the particle. The model-based results were compared with experimental laboratory-scale BFB results. It was observed that increasing the temperature from 850 °C to 950 °C enhanced calcination but no longer improved the sulfate conversion. A higher sulfur dioxide concentration accelerated the sulfation reaction, and based on the modeling, the sulfation is first order with respect to SO2. The reaction order of O2 appears to approach zero at high oxygen concentrations.
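The reported kinetics (first order in SO2, apparent O2 order falling toward zero at high oxygen concentration) can be sketched with a Langmuir-type saturation term. This is an illustrative rate form, not the model used in the work; the rate constant `k` and saturation constant `k_o2` are invented.

```python
def sulfation_rate(c_so2, c_o2, k=1.0, k_o2=0.5):
    """Illustrative rate law: first order in SO2, with a Langmuir-type
    O2 term whose apparent order falls from one toward zero as the
    oxygen concentration grows (k and k_o2 are invented constants)."""
    return k * c_so2 * c_o2 / (k_o2 + c_o2)
```

Doubling the SO2 concentration doubles the rate, while doubling an already large O2 concentration barely changes it, reproducing the two observations from the abstract.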
Abstract:
Linear programming models are effective tools to support initial or periodic planning of agricultural enterprises; they require, however, technical coefficients that can be determined using computer simulation models. This paper, presented in two parts, deals with the development, application and testing of a methodology and of a computational modeling tool to support the planning of irrigated agriculture activities. Part I covers the development and application, including sensitivity analysis, of a multiyear linear programming model to optimize the financial return and water use at farm level for the Jaíba irrigation scheme, Minas Gerais State, Brazil, using data on crop irrigation requirements and yield obtained from previous simulation with the MCID model. The linear programming model produced a cropping pattern with a maximum total net present value of R$ 372,723.00 for the four-year period. Constraints on monthly water availability, labor, land and production were critical in the optimal solution. Regarding water use optimization, it was verified that substantial reductions in irrigation requirements may be achieved with small reductions in the maximum total net present value.
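The optimization described above is a linear program: maximize net present value subject to land, water, labor and production constraints. A toy sketch for two hypothetical crops with only land and water constraints, solved by enumerating the vertices of the feasible region (every coefficient is invented, not Jaíba scheme data):

```python
from itertools import combinations

def solve_2var_lp(c, A, b):
    """Maximize c.x for x >= 0 subject to A x <= b, by enumerating
    intersections of constraint boundaries (fine for two variables)."""
    lines = [(A[i][0], A[i][1], b[i]) for i in range(len(A))]
    lines += [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]  # axes x = 0 and y = 0
    best = None
    for (a1, b1, r1), (a2, b2, r2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel boundaries, no vertex
        x = (r1 * b2 - r2 * b1) / det
        y = (a1 * r2 - a2 * r1) / det
        if x < -1e-9 or y < -1e-9:
            continue  # outside the non-negativity constraints
        if all(A[i][0] * x + A[i][1] * y <= b[i] + 1e-9 for i in range(len(A))):
            val = c[0] * x + c[1] * y
            if best is None or val > best[0]:
                best = (val, x, y)
    return best  # (objective value, x, y)

# margins 3 and 2 per area unit; 100 units of land, 240 units of water
plan = solve_2var_lp([3.0, 2.0], [[1.0, 1.0], [4.0, 2.0]], [100.0, 240.0])
```

Real multiyear farm models have far more variables and monthly constraints, so a proper LP solver is used; the vertex enumeration only illustrates the structure of the problem.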
Abstract:
Techniques for evaluating the risks arising from the uncertainties inherent in agricultural activity should accompany planning studies. Risk analysis should be carried out by risk simulation, using techniques such as the Monte Carlo method. This study was carried out to develop a computer program, called P-RISCO, for running risk simulations on linear programming models, to apply it to a case study, and to compare its results with the @RISK program. In the risk analysis it was observed that the mean of the output variable total net present value, U, was considerably lower than the maximum U value obtained from the linear programming model. It was also verified that the enterprise will face a considerable risk of water shortage in April, which does not occur for the cropping pattern obtained by minimizing the irrigation requirement in April across the four years. The scenario analysis indicated that the sale price of the passion fruit crop exerts a strong influence on the financial performance of the enterprise. The comparative analysis verified the equivalence of the P-RISCO and @RISK programs in executing the risk simulation for the considered scenario.
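The Monte Carlo approach implemented in a tool like P-RISCO can be sketched as: draw uncertain inputs from assumed distributions, evaluate the financial outcome for each draw, and summarize the resulting distribution. The distributions and every figure below are invented for illustration only.

```python
import random
import statistics

def npv_risk(n=20000, seed=42):
    """Monte Carlo sketch: distribution of a one-crop margin when the
    sale price and yield are uncertain. Returns the mean margin and
    the estimated probability of a negative outcome."""
    random.seed(seed)
    margins = []
    for _ in range(n):
        price = random.triangular(0.8, 1.4, 1.0)             # R$/kg
        crop_yield = random.triangular(18000, 26000, 22000)  # kg/ha
        cost = 15000.0                                       # R$/ha, fixed
        margins.append(price * crop_yield - cost)
    shortfall_risk = sum(m < 0 for m in margins) / n
    return statistics.mean(margins), shortfall_risk
```

As in the abstract, the mean of the simulated output is typically below the deterministic optimum, because the LP optimum corresponds to favorable input values rather than their expectation.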
Abstract:
In the forced-air cooling of fruits, mass transfer by evaporation occurs in addition to convective heat transfer. The energy required for evaporation is taken from the fruit, which has its temperature lowered. This study proposes the use of empirical correlations for calculating the convective heat transfer coefficient as a function of the surface temperature of the strawberry during the cooling process. The aim of this variation of the convective coefficient is to compensate for the effect of evaporation in the heat transfer process. Linear and exponential correlations are tested, both with two adjustable parameters. The simulations are performed using experimental conditions reported in the literature for the cooling of strawberries. The results confirm the suitability of the proposed methodology.
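The proposed idea, letting the convective coefficient vary with surface temperature to absorb the evaporative effect, can be sketched with a lumped-capacitance cooling model and a linear correlation h = a + b·(T − T_air). The fruit properties and correlation parameters below are illustrative assumptions, not the fitted values from the study.

```python
def cool_strawberry(T0=20.0, T_air=1.0, minutes=60, a=25.0, b=0.5):
    """Lumped-capacitance cooling with a convective coefficient that
    varies linearly with surface temperature, h = a + b*(T - T_air),
    mimicking the extra evaporative loss (all parameters assumed)."""
    m, cp, A = 0.015, 3900.0, 0.004  # kg, J/(kg K), m^2 (assumed)
    T, dt = T0, 1.0                  # explicit Euler, 1 s steps
    for _ in range(int(minutes * 60 / dt)):
        h = a + b * (T - T_air)      # W/(m^2 K), linear correlation
        T += -h * A * (T - T_air) / (m * cp) * dt
    return T
```

Because h shrinks as the fruit approaches the air temperature, cooling is fastest early on, which is the qualitative behavior the evaporation-compensating correlation is meant to capture.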
Abstract:
The interaction between the soil and a tillage tool can be examined using different parameters for the soil and the tool. Among the soil parameters are the shear stress, cohesion, internal friction angle and pre-compression stress. The tool parameters are mainly the tool geometry and the depth of operation. For the soils of Rio Grande do Sul there are hardly any studies evaluating the parameters that matter in the use of mathematical models to predict tensile loads. The objective was to obtain parameters related to the soils of Rio Grande do Sul for use in soil-tool analysis, more specifically in mathematical models that allow the calculation of the tractive effort for symmetric, narrow tools. Two of the main soils of Rio Grande do Sul, an Albaqualf and a Paleudult, were studied. Equations relating the cohesion, internal friction angle, adhesion, soil-tool friction angle and pre-compression stress to the soil water content were obtained, providing important information for the use of mathematical models in tractive effort calculation.
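The equations mentioned above relate each soil parameter to water content. A minimal ordinary-least-squares sketch for fitting such a relation, e.g. cohesion versus gravimetric water content (the data points in the usage note are invented):

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a + b*x, e.g. cohesion (kPa)
    as a function of soil water content (kg/kg)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b  # intercept, slope
```

For instance, `linear_fit([0.10, 0.20, 0.30], [30.0, 25.0, 20.0])` recovers the line with intercept 35 and slope −50, i.e. cohesion decreasing as the soil wets.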
Abstract:
This study aimed to apply mathematical models to the growth of Nile tilapia (Oreochromis niloticus) reared in net cages in the lower São Francisco basin and to choose the model(s) that best represent the rearing conditions of the region. The nonlinear Brody, Bertalanffy, Logistic, Gompertz and Richards models were tested. The models were fitted to the weight-for-age series using the Gauss, Newton, Gradient and Marquardt methods, with the NLIN procedure of the SAS® (2003) system used to estimate the parameters from the available data. The best fits were obtained with the Bertalanffy, Gompertz and Logistic models, which are equivalent in explaining the growth of the animals up to 270 days of rearing. From the commercial point of view, commercialization of tilapia is recommended from at least 600 g, which the Bertalanffy, Gompertz and Logistic models estimate at 183, 181 and 184 days of rearing, respectively; for a mass of up to 1 kg, ending the rearing at 244, 244 and 243 days, respectively, is suggested.
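As a sketch of how such fitted growth curves are used, the Gompertz model below is evaluated and then searched for the first day a target weight is reached. The parameter values are invented for illustration; they are not the estimates from the study.

```python
import math

def gompertz(t, A, b, k):
    """Gompertz growth curve: weight (g) at age t (days), with
    asymptotic weight A, shape b and rate k (all assumed here)."""
    return A * math.exp(-b * math.exp(-k * t))

def days_to_weight(target, A, b, k):
    """First day (integer) on which the modelled weight reaches
    `target` grams, or None if it never does within 1000 days."""
    for t in range(1000):
        if gompertz(t, A, b, k) >= target:
            return t
    return None
```

With A = 1200 g, b = 4 and k = 0.02 d⁻¹, the 600 g threshold is reached on day 88; the study performs the analogous calculation with its fitted parameters to recommend harvest ages.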
Abstract:
Given the need for systems to better control the broiler production environment, we performed an experiment with broilers from 1 to 21 days of age, which were submitted to different air temperature intensities and durations in conditioned wind tunnels; the results were used for validation of a fuzzy model. The model was developed using as input variables the duration of heat stress (days) and the dry bulb air temperature (°C), and as output variables the feed intake (g), weight gain (g) and feed conversion (g g-1). The inference method used was Mamdani, 20 rules were prepared, and the defuzzification technique used was the center of gravity. A satisfactory efficiency in determining productive responses is evidenced by the results obtained in the model simulation when compared with the experimental data, with calculated R2 values for feed intake, weight gain and feed conversion of 0.998, 0.981 and 0.980, respectively.
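A Mamdani system of the kind described maps crisp inputs through fuzzy membership functions, fires rules by taking minima, and defuzzifies the aggregate output. The two-rule sketch below uses singleton consequents with a weighted-average (center-of-gravity) defuzzification; the membership limits and consequents are invented and do not reproduce the paper's 20-rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani_feed_intake(temp, stress_days):
    """Tiny Mamdani sketch estimating feed intake (g) from dry bulb
    temperature (degC) and heat stress duration (days)."""
    comfort = tri(temp, 18, 24, 30)     # "comfortable temperature"
    hot = tri(temp, 26, 34, 42)         # "heat stress temperature"
    short = tri(stress_days, -1, 0, 3)  # "short stress duration"
    # rule strengths via min; consequents are singleton intakes (g)
    rules = [(min(comfort, short), 900.0),  # comfortable & short -> high
             (hot, 600.0)]                  # hot -> reduced intake
    num = sum(w * v for w, v in rules)
    den = sum(w for w, v in rules)
    return num / den if den else None
```

Intermediate temperatures fire both rules partially, so the defuzzified intake interpolates smoothly between the two consequents.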
Abstract:
Filtration is a widely used unit operation in chemical engineering. The huge variation in the properties of materials to be filtered makes the study of filtration a challenging task. One of the objectives of this thesis was to show that conventional filtration theories are difficult to use when the system to be modelled contains all of the stages and features that are present in a complete solid/liquid separation process. Furthermore, most of the filtration theories require experimental work to be performed in order to obtain critical parameters required by the theoretical models. Creating a good overall understanding of how the variables affect the final product in filtration is somewhat impossible on a purely theoretical basis. The complexity of solid/liquid separation processes requires experimental work, and when tests are needed, it is advisable to use experimental design techniques so that the goals can be achieved. The statistical design of experiments provides the necessary tools for recognising the effects of variables. It also helps to perform experimental work more economically. Design of experiments is a prerequisite for creating empirical models that can describe how the measured response is related to changes in the values of the variables. A software package was developed that provides a filtration practitioner with experimental designs and calculates the parameters for linear regression models, along with a graphical representation of the responses. The developed software consists of two modules, LTDoE and LTRead. The LTDoE module is used to create experimental designs for different filter types. The filter types considered in the software are the automatic vertical pressure filter, double-sided vertical pressure filter, horizontal membrane filter press, vacuum belt filter and ceramic capillary action disc filter.
It is also possible to create experimental designs for cases where the variables are totally user defined, say for a customized filtration cycle or a different piece of equipment. The LTRead module is used to read the experimental data gathered from the experiments, to analyse the data and to create models for each of the measured responses. Introducing the structure of the software in more detail and showing some of the practical applications is the main part of this thesis. This approach to the study of cake filtration processes, as presented in this thesis, has been shown to have good practical value when making filtration tests.
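The core of what a design-of-experiments tool like the one described provides, generating a coded experimental design and estimating effects for a linear model, can be sketched in a few lines. The two-level full factorial and main-effect estimates below are a generic illustration, not the LTDoE implementation.

```python
from itertools import product

def two_level_design(factors):
    """Full 2^k factorial design in coded units (-1 / +1)."""
    return [dict(zip(factors, levels))
            for levels in product((-1, 1), repeat=len(factors))]

def main_effects(design, response):
    """Main effect of each factor: mean response at the +1 level
    minus mean response at the -1 level."""
    effects = {}
    for f in design[0]:
        hi = [y for run, y in zip(design, response) if run[f] == 1]
        lo = [y for run, y in zip(design, response) if run[f] == -1]
        effects[f] = sum(hi) / len(hi) - sum(lo) / len(lo)
    return effects
```

With factors such as pressure and filtration time, the effect estimates are (up to scaling) the coefficients of the first-order regression model fitted to the measured responses.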
Abstract:
Life cycle assessment (LCA) is a method in which the inputs and outputs over the life of a product system are compiled, yielding its environmental load as the result. LCA is a decision-support tool. With the overall reform of the Waste Act, the use of LCA is likely to increase in municipally managed waste management. The waste management unit of the Helsinki Region Environmental Services Authority (HSY) aims to build a life cycle model with which the environmental load and economic impacts of its entire operation can be determined. HSY's waste management has decided to have the life cycle model built as consulting work. The goal of this work was to prepare a set of guidelines for procuring life cycle modelling services for HSY's waste management. The life cycle model can be built using commercial software. Three LCA tools were selected for evaluation in this study: EASEWASTE, WRATE and GaBi 4.4. The properties of the software packages were evaluated on the basis of the literature and an interview, and a set of criteria for evaluating them was drawn up. The applications of LCA in municipally managed waste management were identified from the literature, and the LCA applications and modelling needs of HSY's waste management were identified by interviewing HSY waste management experts. The updating, use and further development of the model to be built for HSY's waste management should be managed by HSY's waste management itself. All of the software packages evaluated in this work are suitable for calculating the modelling needs identified by HSY's waste management. The guidelines for life cycle modelling services aim to ensure the procurement of a model suited to the needs of HSY's waste management and the planning of follow-up measures.
Abstract:
The use of batteries as energy storage is emerging in automotive and mobile working machine applications. As battery systems become larger, battery management becomes an essential part of the application with regard to battery fault situations and user safety. A properly designed battery management system (BMS) extends both a single charge cycle of the battery pack and the whole lifetime of the battery pack. In this thesis the main objectives and principles of a BMS are studied, and a first-order Thevenin model of a lithium-titanate battery cell is built based on laboratory measurements. The battery cell model is then verified by comparing it with the actual battery cell, and its suitability for use in a BMS is studied.
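A first-order Thevenin cell model consists of an open-circuit voltage source, a series resistance and one parallel RC branch. The sketch below integrates the RC branch exactly for a constant current; every parameter value is illustrative, not a measured lithium-titanate value from the thesis.

```python
import math

def thevenin_voltage(current, dt, steps, ocv=2.4, r0=0.002,
                     r1=0.001, c1=2000.0):
    """Terminal voltage of a first-order Thevenin cell model after
    `steps` intervals of length `dt` (s) at a constant discharge
    `current` (A). ocv (V), r0, r1 (ohm) and c1 (F) are assumed."""
    u1 = 0.0          # voltage over the RC branch
    tau = r1 * c1     # RC branch time constant (s)
    for _ in range(steps):
        # exact discrete-time update of the RC branch for constant current
        decay = math.exp(-dt / tau)
        u1 = u1 * decay + r1 * current * (1.0 - decay)
    return ocv - r0 * current - u1
```

At steady state the terminal voltage settles to OCV − (R0 + R1)·I, which is the kind of response a BMS compares against the measured cell to verify the model.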
Abstract:
Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need to manage this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineers. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way which allows easy management of processes, for example process dissemination, process tailoring and process enactment. Process modeling languages are usually used as tools for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, which is currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles which represent the dissertation research done on process modeling during an approximately five-year period. The research follows the classical engineering research discipline where the current situation is analyzed, a potentially better solution is developed and finally its implications are analyzed.
The research applies a variety of different research techniques, ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and the actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, like lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. The dissertation also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work.
This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies where the modeling techniques are used, e.g., to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.
Abstract:
Social tagging evolved in response to a need to tag heterogeneous objects, the automated tagging of which is usually not feasible by current technological means. Social tagging can be used for more flexible competence management within organizations. The profiles of employees can be built in the form of groups of tags, as employees tag each other based on their familiarity with each other's expertise. This can serve as a replacement for more traditional competence management approaches, which usually become outdated due to social and organizational hurdles and obsolete data. These limitations can be overcome by people tagging, as the information revealed by such tags is usually based on the most recent employee interaction and knowledge. Task management, as part of personal information management, aims at supporting users' individual task handling. This can include collaborating with other individuals, sharing one's knowledge, both functional and process-related, and distributing documents and web resources. In this context, task patterns can be used as templates that collect information and experience around the tasks associated with them during run time, facilitating agility. Effective collaboration among contributors necessitates means to find the appropriate individuals to work with on a task, and this can be made possible by using social tagging to describe individual competencies. The goal of this study is to support finding and tagging people within task management, through effective exploitation of the work/task context. This involves utilizing knowledge of the workers' expertise, the nature of the task or task pattern, and information available from the documents and web resources attached to the task. Vice versa, task management provides an excellent environment for social tagging, since the task context already provides suitable tags.
The study also aims at assisting users of the task management solution with the collaborative construction of a lightweight ontology by inferring semantic relations between tags. The thesis project aims at an implementation of people finding and tagging within the Java task management application, which consumes web services that provide the required ontology for the organization.
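The people-tagging and people-finding mechanism described above can be sketched as tag-count profiles queried with the tags from the task context. The count-based overlap score below is only an illustration of the idea, not the thesis implementation.

```python
from collections import Counter

def tag_person(profiles, tagger, person, tag):
    """Record that `tagger` described `person` with `tag`; profiles
    accumulate tag counts per person across all taggers."""
    profiles.setdefault(person, Counter())[tag] += 1

def find_people(profiles, task_tags):
    """Rank people by the overlap between their tag profile and the
    tags found in the task context (simple count-based score)."""
    scores = {p: sum(c[t] for t in task_tags) for p, c in profiles.items()}
    return sorted((p for p in scores if scores[p] > 0),
                  key=lambda p: -scores[p])
```

Because profiles grow out of everyday tagging, the ranking reflects recent peer-observed expertise rather than a static, centrally maintained competence database.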
Abstract:
Modern machine structures are often fabricated by welding. From a fatigue point of view, structural details, and especially welded details, are the most prone to fatigue damage and failure. Design against fatigue requires information on the fatigue resistance of a structure's critical details and on the stress loads that act on each detail. Even though dynamic simulation of flexible bodies is already an established method for analyzing structures, obtaining the stress history of a structural detail during dynamic simulation is a challenging task, especially when the detail has a complex geometry. In particular, analyzing the stress history of every structural detail within a single finite element model can be overwhelming, since the number of nodal degrees of freedom needed in the model may require an impractical amount of computational effort. The purpose of computer simulation is to reduce the number of prototypes and speed up the product development process. Also, to take operator influence into account, real-time models, i.e. simplified and computationally efficient models, are required. This, in turn, requires stress computation to be efficient if it is to be performed during dynamic simulation. The research looks back at the theoretical background of multibody dynamic simulation and the finite element method to find suitable parts for forming a new approach to efficient stress calculation. This study proposes that the problem of stress calculation during dynamic simulation can be greatly simplified by combining the floating frame of reference formulation with modal superposition and a sub-modeling approach. In practice, the proposed approach can be used to efficiently generate the relevant fatigue assessment stress history for a structural detail during or after dynamic simulation. In this work, numerical examples are presented to demonstrate the proposed approach in practice. The results show that the approach is applicable and can be used as proposed.
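In the floating frame of reference formulation with modal superposition, the elastic deformation is a sum of mode shapes weighted by modal coordinates, so a stress component at a detail can be recovered as the same weighted sum of per-mode unit stresses. A minimal sketch of that recovery step (the modal stress values are invented):

```python
def stress_history(modal_coords, modal_stresses):
    """Modal-superposition stress recovery: for each time step, the
    stress at a structural detail is the sum over modes of the modal
    coordinate times that mode's unit stress at the detail (MPa)."""
    return [sum(q * s for q, s in zip(q_t, modal_stresses))
            for q_t in modal_coords]

# per-mode unit stresses at one weld detail (illustrative values, MPa)
unit_stresses = [100.0, -20.0]
# modal coordinates from the multibody simulation at three time steps
history = stress_history([[0, 0], [1, 0], [1, 1]], unit_stresses)
```

Because only a handful of modal coordinates are tracked, the stress history at each detail is available at a cost far below re-solving the full finite element model per time step, which is what makes the approach viable during real-time simulation.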