878 results for physically based modeling
Abstract:
Virtual environments and real-time simulators (VERS) are becoming increasingly important tools in the research and development (R&D) process of non-road mobile machinery (NRMM). Virtual prototyping techniques enable faster and more cost-efficient development of machines compared to the use of real-life prototypes. High energy efficiency has become an important topic in the world of NRMM because of environmental and economic demands. The objective of this thesis is to develop VERS-based methods for the research and development of NRMM. A process using VERS for assessing the effects of human operators on the life-cycle efficiency of NRMM was developed. Human-in-the-loop simulations were run with an underground mining loader to study the developed process. The simulations were run in the virtual environment of the Laboratory of Intelligent Machines of Lappeenranta University of Technology. A physically adequate real-time simulation model of NRMM was shown to be reliable and cost-effective for testing hardware components by means of hardware-in-the-loop (HIL) simulations. A control interface connecting an integrated electro-hydraulic energy converter (IEHEC) with a virtual simulation model of a log crane was developed. The IEHEC consists of a hydraulic pump-motor and an integrated electrical permanent magnet synchronous motor-generator. The results show that state-of-the-art real-time NRMM simulators are capable of resolving factors related to the energy consumption and productivity of NRMM. A significant variation between the test drivers was found. The results show that VERS can be used for assessing human effects on the life-cycle efficiency of NRMM. Comparing the HIL simulation responses with those achieved with the conventional simulation method demonstrates the advantages and drawbacks of the various possible interfaces between the simulator and the hardware part of the system under study. Novel ideas for arranging the interface were successfully tested and compared with the more traditional one.
The proposed process for assessing the effects of operators on life-cycle efficiency will be applied to a wider group of operators in the future. The driving styles of the operators can then be analysed statistically from a sufficiently large body of result data, and the statistical analysis can identify the most life-cycle-efficient driving style for a specific environment and machinery. The proposed control interface for HIL simulation needs to be studied further: the robustness and adaptation of the interface in different situations must be verified. Future work will also include studying the suitability of the IEHEC for different working machines using the proposed HIL simulation method.
Abstract:
Serine proteases are involved in vital processes in virtually all species. They are important targets for researchers studying the relationships between protein structure and activity, with a view to the rational design of new pharmaceuticals. Trypsin was used as a model to assess a possible differential contribution of hydration water to the binding of two synthetic inhibitors. Thermodynamic parameters for the association of bovine β-trypsin (homogeneous material; observed 23,294.4 ± 0.2 Da, theoretical 23,292.5 Da) with the inhibitors benzamidine and berenil at pH 8.0, 25 °C and 25 mM CaCl2 were determined using isothermal titration calorimetry and the osmotic stress method. The association constant for berenil was about 12 times higher than that for benzamidine (K = 596,599 ± 25,057 and 49,513 ± 2,732 M-1, respectively; the number of binding sites is the same for both ligands, N = 0.99 ± 0.05). Apparently the driving force responsible for this large difference in affinity is not hydrophobic interactions, because the variation in heat capacity (ΔCp), a characteristic signature of these interactions, was similar in both systems tested (-464.7 ± 23.9 and -477.1 ± 86.8 J K-1 mol-1 for berenil and benzamidine, respectively). The results also indicated that the enzyme has a net gain of about 21 water molecules regardless of the inhibitor tested. Computational modeling indicated that the difference in affinity could instead be due to a larger number of interactions between berenil and the enzyme. The data support the view that benzamidine-derived pharmaceuticals that enable hydrogen bond formation outside the catalytic binding pocket of β-trypsin may be more effective inhibitors.
Abstract:
The reduction of greenhouse gas emissions in the European Union promotes the combustion of biomass rather than fossil fuels in energy production. Circulating fluidized bed (CFB) combustion offers a simple, flexible and efficient way to utilize untreated biomass on a large scale. CFB furnaces are modeled in order to better understand their operation and to help in the design of new furnaces. Physically accurate models are therefore needed to describe the heavily coupled multiphase flow, reactions and heat transfer inside the furnace. This thesis presents a new model for the fuel flow inside the CFB furnace, which acknowledges the physical properties of the fuel and the multiphase flow phenomena inside the furnace. The model is applied with special attention to the firing of untreated biomass. An experimental method is utilized to characterize gas-fuel drag force relations. This characteristic drag force approach is developed into a gas-fuel drag force model suitable for irregular, non-spherical biomass particles and applied, together with the new fuel flow model, in the modeling of a large-scale CFB furnace. The model results are physically valid and correspond very well with measurement results from a large-scale CFB furnace firing biomass. With the methods and models presented in this work, the fuel flow field inside a circulating fluidized bed furnace can be modeled more accurately and more efficiently than in previous studies with a three-dimensional holistic model frame.
Abstract:
Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance of FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was a higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.
Abstract:
This study tested the hypothesis that simvastatin treatment can improve cardiovascular and autonomic function and reduce membrane lipoperoxidation, with a greater effect when applied to physically trained ovariectomized rats. Ovariectomized rats were divided into sedentary, sedentary+simvastatin and trained+simvastatin groups (n = 8 each). Exercise training was performed on a treadmill for 8 weeks and simvastatin (5 mg/kg) was administered in the last 2 weeks. Blood pressure (BP) was recorded in conscious animals. Baroreflex sensitivity was evaluated by the tachycardic and bradycardic responses to BP changes. Cardiac vagal and sympathetic effects were determined using methylatropine and propranolol. Oxidative stress was evaluated based on heart and liver lipoperoxidation using the chemiluminescence method. The simvastatin-treated groups presented reduced body weight and mean BP (trained+simvastatin = 99 ± 2 and sedentary+simvastatin = 107 ± 2 mmHg) compared to the sedentary group (122 ± 1 mmHg). Furthermore, the trained group showed lower BP and heart rate compared to the other groups. Tachycardic and bradycardic responses were enhanced in both simvastatin-treated groups. The vagal effect was increased in the trained+simvastatin group and the sympathetic effect was decreased in the sedentary+simvastatin group. Hepatic lipoperoxidation was reduced in the sedentary+simvastatin (≈21%) and trained+simvastatin (≈57%) groups compared to the sedentary group. Correlation analysis involving all animals demonstrated that cardiac lipoperoxidation was negatively correlated with the vagal effect (r = -0.7) and positively correlated with the sympathetic effect (r = 0.7). In conclusion, the improvement in cardiovascular and autonomic function and the reduction of lipoperoxidation with simvastatin treatment were greater in trained ovariectomized rats.
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery. One of the keys to succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are now asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work: testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is therefore reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects such as performance or security apply to the software as a whole. In this thesis, we study the use of model-based testing.
We present approaches to automatically generate tests from behavioral models to address some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or lacking tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. To demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
Abstract:
The objective of this work was to determine and model the infrared dehydration curves of apple slices of the Fuji and Gala varieties. The slices were dehydrated until constant mass in a prototype dryer with an infrared heating source, at temperatures ranging from 50 to 100 °C. Due to the physical characteristics of the product, the dehydration curve was divided into two periods, constant and falling, separated by the critical moisture content. A linear model was used to describe the constant dehydration period. Empirical models traditionally used to model the drying behavior of agricultural products were fitted to the experimental data of the falling dehydration period. Critical moisture contents of 2.811 and 3.103 kgw kgs-1 were observed for the Fuji and Gala varieties, respectively. Based on the results, it was concluded that the constant dehydration rates presented a direct relationship with temperature; thus, it was possible to fit a model that describes the moisture content variation as a function of time and temperature. Among the models tested for the falling dehydration period, the model proposed by Midilli presented the best fit for all studied conditions.
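The model-selection step described above can be illustrated with a short sketch. The code below compares the Midilli model, MR = a·exp(-k·tⁿ) + b·t, against the simpler Lewis (Newton) model on synthetic moisture-ratio data; all parameter values and data are illustrative assumptions, not the fitted values from this work.

```python
import math

# Illustrative sketch only: parameters and data are made up, not this
# study's fitted values.

def lewis(t, k):
    # Lewis (Newton) model: MR = exp(-k t)
    return math.exp(-k * t)

def midilli(t, a, k, n, b):
    # Midilli model: MR = a * exp(-k * t**n) + b * t
    return a * math.exp(-k * t ** n) + b * t

# Synthetic "observed" moisture ratios generated from a Midilli-type curve
true_params = (1.0, 0.02, 1.2, 1e-4)
times = [0, 10, 20, 40, 60, 90, 120]          # drying time, min
observed = [midilli(t, *true_params) for t in times]

def rmse(model, params):
    return math.sqrt(sum((model(t, *params) - mr) ** 2
                         for t, mr in zip(times, observed)) / len(times))

# Crude grid search for the best Lewis rate constant k
best_k = min((k / 1000.0 for k in range(1, 200)),
             key=lambda k: rmse(lewis, (k,)))

err_lewis = rmse(lewis, (best_k,))
err_midilli = rmse(midilli, true_params)
print(f"Lewis RMSE: {err_lewis:.4f}; Midilli RMSE: {err_midilli:.4f}")
```

A lower RMSE for the extra-parameter Midilli form on curved falling-rate data mirrors why it tends to win model comparisons of this kind.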
Abstract:
A mathematical model to predict microbial growth in milk was developed and analyzed. The model consists of a system of two first-order differential equations based on physical hypotheses of population growth. The model was applied to five different sets of data on microbial growth in dairy products selected from Combase, the most important database in the area, with thousands of datasets from around the world, and the results showed a good fit. In addition, the model provides equations for evaluating the maximum specific growth rate and the duration of the lag phase, which may provide useful information about microbial growth.
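The abstract does not give the two equations, so as a hedged illustration the sketch below integrates the Baranyi-Roberts model, a standard system of two coupled first-order ODEs for growth with a lag phase; the thesis model may differ, and all parameter values are assumed.

```python
import math

# Illustrative parameter values; the thesis model's equations are not given
# in the abstract, so the Baranyi-Roberts system is used as a stand-in.
mu_max = 0.8    # maximum specific growth rate (1/h), assumed
y_max = 21.0    # ln of maximum population density, assumed
y0 = 2.0        # ln of initial population density, assumed
q0 = 0.01       # initial physiological state (sets the lag), assumed

def simulate(t_end, dt=0.001):
    """Forward-Euler integration of the two coupled first-order ODEs:
       dq/dt = mu_max * q                                 (adjustment)
       dy/dt = mu_max * q/(1+q) * (1 - exp(y - y_max))    (log-density)
    """
    q, y, t = q0, y0, 0.0
    traj = [(t, y)]
    while t < t_end:
        alpha = q / (1.0 + q)          # adjustment function in [0, 1)
        q += dt * mu_max * q
        y += dt * mu_max * alpha * (1.0 - math.exp(y - y_max))
        t += dt
        traj.append((t, y))
    return traj

# Closed-form lag duration implied by this model: ln(1 + 1/q0) / mu_max
lag = math.log(1.0 + 1.0 / q0) / mu_max
traj = simulate(40.0)
final_y = traj[-1][1]
print(f"lag = {lag:.2f} h, log-density after 40 h = {final_y:.2f}")
```

The two equations reproduce the lag, exponential and stationary phases in one pass, which is the behavior the abstract's maximum-growth-rate and lag-duration formulas summarize.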
Abstract:
Celery (Apium graveolens L. var. secalinum Alef.) leaves with a weight of 50 ± 0.07 g and a humidity of 91.75 ± 0.15% (~11.21 db) were dried using 8 different microwave power densities ranging between 1.8 and 20 W g-1, until the humidity fell to 8.95 ± 0.23% (~0.1 db). The microwave drying processes were completed in between 5.5 and 77 min depending on the microwave power density. In this study, measured values were compared with predicted values obtained from twenty theoretical, semi-empirical and empirical thin-layer drying equations, together with a new thin-layer drying equation. For each applied microwave power density, the models with the highest correlation coefficient (R²) values were chosen as the best models. The Weibull distribution model gave the most suitable predictions at all power densities. With increasing microwave power density, the effective moisture diffusivity values ranged from 1.595 × 10-10 to 6.377 × 10-12 m2 s-1. The activation energy was calculated using an exponential expression based on the Arrhenius equation. The linear relationship between the drying rate constant and the effective moisture diffusivity gave the best fit.
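As a hedged illustration of the best-fitting model above, the Weibull thin-layer drying model is commonly written MR(t) = exp(-(t/α)^β). The sketch below evaluates it and inverts it to obtain the drying time for a target moisture ratio; the parameter values are assumptions, not the fitted values of this study.

```python
import math

# Illustrative parameters only; the study's fitted Weibull values are not
# given in the abstract.
alpha = 12.0   # scale parameter (min), assumed
beta = 1.3     # shape parameter, assumed

def moisture_ratio(t):
    """Weibull thin-layer model: MR(t) = exp(-(t/alpha)**beta)."""
    return math.exp(-((t / alpha) ** beta))

def time_to_reach(mr_target):
    """Invert the model: drying time needed to reach a target MR."""
    return alpha * (-math.log(mr_target)) ** (1.0 / beta)

mr_at_20 = moisture_ratio(20.0)
t_half = time_to_reach(0.5)
print(f"MR after 20 min: {mr_at_20:.3f}; time to MR = 0.5: {t_half:.1f} min")
```

The closed-form inverse is one practical reason the Weibull form is convenient for predicting drying times at a given power density.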
Abstract:
The aim of this work was to evaluate a non-agitated process of bioethanol production from soybean molasses and the kinetic parameters of fermentation using a strain of Saccharomyces cerevisiae (ATCC® 2345). The kinetic experiment was conducted in a medium with 30% (w v-1) soluble solids, without supplementation or pH adjustment. The maximum ethanol concentration was reached at 44 hours; the ethanol productivity was 0.946 g L-1 h-1, the yield over total initial sugars (Y1) was 47.87%, the yield over consumed sugars (Y2) was 88.08% and the specific cell production rate was 0.006 h-1. A polynomial model fitted to the experimental data provided very similar yield and productivity parameters. Based on this study, 103 kg of anhydrous bioethanol can be produced from one ton of soybean molasses.
Abstract:
Building Information Modeling (BIM) is spreading widely in the Architecture, Engineering, and Construction (AEC) industries. Manufacturers of building elements are also starting to provide more and more BIM objects of their products. The ideal availability and distribution of these models has not yet stabilized. A manufacturer's usual goal is to get its models into the design as early as possible, and finding ways to satisfy customer needs with a superior service would help to achieve this goal. This study seeks to determine what the case company's customers want from the models, what they consider the ideal way to obtain these models, and what the desired functionalities of such a service are. This master's thesis uses a modified version of the lead user method to gain an understanding of what the needs are in the longer term. Within this framework, current solutions and their common model functions are also benchmarked. Empirical data is collected with a survey and interviews. As a result, this thesis provides an understanding of what information customers use when obtaining a model, what kind of model they expect to obtain, and how the process should optimally function. Based on these results, an ideal service is outlined.
Abstract:
The goal of this thesis is to define and validate a software engineering approach for the development of a distributed system for the modeling of composite materials, based on an analysis of various existing software development methods. We reviewed the main features of: (1) software engineering methodologies; (2) distributed system characteristics and their effect on software development; and (3) composite materials modeling activities and the requirements for the software development. Using design science as the research methodology, the distributed system for creating models of composite materials is created and evaluated. The empirical experiments we conducted showed good convergence of modeled and real processes. Throughout the study, we paid attention to the complexity and importance of the distributed system and to a deep understanding of modern software engineering methods and tools.
Abstract:
The purpose of this thesis is credit risk estimation. Different credit risk estimation methods and characteristics of credit risk are discussed. The study is twofold, comprising an interview with a credit risk specialist and a quantitative section. The quantitative section applies the KMV model to estimate the credit risk of 12 sample companies from three industries: automobile, banking and finance, and technology. The timeframe of the estimation is one year. On the basis of the KMV model and the interview, implications for the analysis of credit risk are discussed. The KMV model yields results consistent with the existing credit ratings. However, the banking and financial sector requires calibration of the model due to the high leverage of the industry. Credit risk is driven considerably by leverage and by the value and volatility of assets. Credit risk models produce useful information on the creditworthiness of a business; yet quantitative models often require qualitative support in decision-making situations.
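A minimal sketch of the idea behind the KMV model, assuming the simplified Merton form of the distance to default; KMV's proprietary implementation (empirical default point, EDF mapping) differs, and the firm inputs below are illustrative. The example also shows why a highly leveraged firm, such as a bank, produces a much shorter distance to default.

```python
import math

# Hedged sketch: Merton-style distance to default, a simplified stand-in
# for the KMV model. All firm inputs are illustrative assumptions.

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def distance_to_default(V, D, mu, sigma, T=1.0):
    """Distance to default over horizon T (years).
    V: market value of assets, D: default point (liabilities),
    mu: expected asset return, sigma: asset volatility."""
    return (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / \
           (sigma * math.sqrt(T))

# Illustrative firms: low leverage vs high leverage (e.g. a bank)
dd_low = distance_to_default(V=100.0, D=50.0, mu=0.06, sigma=0.25)
dd_high = distance_to_default(V=100.0, D=90.0, mu=0.06, sigma=0.25)
pd_low, pd_high = norm_cdf(-dd_low), norm_cdf(-dd_high)
print(f"DD low leverage: {dd_low:.2f} (PD {pd_low:.4f}); "
      f"DD high leverage: {dd_high:.2f} (PD {pd_high:.4f})")
```

The leverage sensitivity visible here is exactly why the abstract notes that the banking and financial sector requires separate calibration.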
Abstract:
Experimental Extended X-ray Absorption Fine Structure (EXAFS) spectra carry information about the chemical structure of metal-protein complexes. However, predicting the structure of such complexes from EXAFS spectra is not a simple task. Currently, methods such as Monte Carlo optimization or simulated annealing are used in structure refinement of EXAFS. These methods have proven somewhat successful in structure refinement but have not been successful in finding the global minimum. Multiple population-based algorithms, including a genetic algorithm, a restarting genetic algorithm, differential evolution, and particle swarm optimization, are studied for their effectiveness in structure refinement of EXAFS. The oxygen-evolving complex in the S1 state is used as a benchmark for comparing the algorithms. These algorithms were successful in finding new atomic structures that produced improved calculated EXAFS spectra over atomic structures previously found.
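As a hedged sketch of one of the population-based algorithms compared above, the code below implements basic differential evolution (DE/rand/1/bin) on a toy misfit function standing in for the EXAFS residual, which in practice requires a full multiple-scattering calculation; all values are illustrative.

```python
import random

random.seed(0)

TARGET = [1.8, 2.7, 3.3]   # "true" interatomic distances (angstrom), made up

def misfit(x):
    # Toy stand-in for the EXAFS spectrum residual
    return sum((a - b) ** 2 for a, b in zip(x, TARGET))

def differential_evolution(f, bounds, pop_size=20, F=0.7, CR=0.9, gens=200):
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    scores = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1: pick three distinct members other than i
            a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
            # Mutation + binomial crossover, clipped to the search bounds
            trial = [min(max(a[d] + F * (b[d] - c[d]), bounds[d][0]),
                         bounds[d][1])
                     if random.random() < CR else pop[i][d]
                     for d in range(dim)]
            s = f(trial)
            if s < scores[i]:          # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]

best_x, best_score = differential_evolution(misfit, [(1.0, 4.0)] * 3)
print(f"best distances: {[round(v, 3) for v in best_x]}, "
      f"misfit = {best_score:.2e}")
```

Because the whole population explores the bounds simultaneously, such methods are less prone to the local-minimum trapping that the abstract attributes to Monte Carlo optimization and simulated annealing.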
Abstract:
In the literature on tests of normality, much concern has been expressed over the problems associated with residual-based procedures. Indeed, the specialized tables of critical points needed to perform the tests have been derived for the location-scale model; hence, reliance on the available significance points in the context of regression models may cause size distortions. We propose a general solution to the problem of controlling the size of normality tests for the disturbances of the standard linear regression model, based on the technique of Monte Carlo tests.
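A minimal sketch of the Monte Carlo test idea: since the null distribution of a residual-based statistic depends on the design matrix, it can be simulated exactly by redrawing standard normal errors on the same design rather than relying on location-scale critical points. The statistic used here is the Jarque-Bera form, chosen for illustration; the data and design are made up.

```python
import math
import random

random.seed(1)
n = 60
x = [i / n for i in range(n)]           # fixed regression design, illustrative

def ols_residuals(y):
    """Residuals from an OLS fit of y on a constant and x."""
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b0 = ybar - b1 * xbar
    return [yi - b0 - b1 * xi for xi, yi in zip(x, y)]

def jarque_bera(e):
    """Skewness-kurtosis normality statistic of a residual vector."""
    m = sum(e) / len(e)
    s2 = sum((v - m) ** 2 for v in e) / len(e)
    skew = sum((v - m) ** 3 for v in e) / len(e) / s2 ** 1.5
    kurt = sum((v - m) ** 4 for v in e) / len(e) / s2 ** 2
    return len(e) / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)

# "Observed" sample with genuinely normal disturbances
y_obs = [1.0 + 2.0 * xi + random.gauss(0.0, 1.0) for xi in x]
stat_obs = jarque_bera(ols_residuals(y_obs))

# Monte Carlo null: redraw N(0,1) errors on the same design; residuals are
# invariant to the regression coefficients, so simulating errors suffices.
null_stats = [jarque_bera(ols_residuals([random.gauss(0.0, 1.0)
                                         for _ in range(n)]))
              for _ in range(499)]
p_value = (1 + sum(s >= stat_obs for s in null_stats)) / (1 + len(null_stats))
print(f"statistic = {stat_obs:.3f}, Monte Carlo p-value = {p_value:.3f}")
```

With 499 replications the p-value is exact by construction for any sample size, which is the size-control property the paper exploits.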