994 results for Cognitive Simulation
Abstract:
The study evaluated the energy performance of pig farming integrated with maize production in a mechanized no-tillage system. In the proposed integration, swine excrement is used as fertilizer for the maize crop. The system was designed to include the activities associated with pig management and maize production (soil management, cultivation, and harvest). A one-year period of analysis was considered, enabling the production of three batches of pigs and two maize crops. To evaluate the energy performance, three indicators were created: energy efficiency, efficiency in the use of non-renewable resources, and cost of non-renewable energy to produce protein. The energy inputs comprise the inputs and infrastructure used in pig breeding and maize production, as well as the solar energy incident on the agroecosystem. The energy outputs are represented by the products (finished pigs and maize). The results of the simulation indicate that the integration improves the energy performance of pig farms, increasing energy efficiency (186%) and the efficiency of non-renewable energy resource use (352%) while reducing the cost of non-renewable energy to produce protein (−58%).
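A minimal sketch of the three indicators named above, assuming they are simple ratios of output energy, input energy, and protein mass; the function names and all numerical values are illustrative placeholders, not the study's data or definitions.

```python
# Illustrative sketch of the three energy indicators described above,
# assuming simple ratio definitions. All values are placeholders.

def energy_efficiency(output_mj, total_input_mj):
    """Energy contained in the products per unit of total energy input."""
    return output_mj / total_input_mj

def nonrenewable_efficiency(output_mj, nonrenewable_input_mj):
    """Energy in the products per unit of non-renewable energy input."""
    return output_mj / nonrenewable_input_mj

def energy_cost_of_protein(nonrenewable_input_mj, protein_kg):
    """Non-renewable energy spent per kilogram of protein produced."""
    return nonrenewable_input_mj / protein_kg

# Hypothetical annual totals for an integrated system (MJ and kg)
outputs = 1_500_000          # finished pigs + maize
inputs_total = 2_000_000     # feed, fuel, fertilizer, infrastructure, solar share
inputs_nonrenewable = 600_000
protein = 25_000

print(energy_efficiency(outputs, inputs_total))
print(nonrenewable_efficiency(outputs, inputs_nonrenewable))
print(energy_cost_of_protein(inputs_nonrenewable, protein))
```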
Abstract:
Hydrological models are important tools that have been used in water resource planning and management. The aim of this work was to calibrate and validate, on a daily time scale, the SWAT (Soil and Water Assessment Tool) model for the Galo creek watershed, located in Espírito Santo State. To conduct the study, we used georeferenced maps of relief, soil type, and land use, in addition to historical daily time series of the basin's climate and streamflow. Time series covering the periods Jan 1, 1995 to Dec 31, 2000 and Jan 1, 2001 to Dec 20, 2003 were used for calibration and validation, respectively. Model performance was evaluated using the Nash-Sutcliffe coefficient (ENS) and the percentage of bias (PBIAS). SWAT was also evaluated in the simulation of the following hydrological variables: maximum and minimum annual daily flows and the minimum reference flows Q90 and Q95, based on the mean absolute error. ENS and PBIAS were 0.65 and 7.2% for calibration and 0.70 and 14.1% for validation, respectively, indicating satisfactory model performance. SWAT adequately simulated the minimum annual daily flows and the reference flows Q90 and Q95, but it was not suitable for simulating maximum annual daily flows.
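For reference, a minimal sketch of the two performance metrics used above, implemented with their standard definitions (Nash-Sutcliffe efficiency and percent bias); the flow arrays below are placeholder data, not the Galo creek series.

```python
# Standard definitions of ENS (Nash-Sutcliffe) and PBIAS for daily flows.
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

observed = [3.1, 2.8, 2.5, 4.0, 6.2, 5.1, 3.3]   # daily flow, m3/s (illustrative)
simulated = [2.9, 2.7, 2.6, 3.6, 5.8, 5.4, 3.0]

print(f"ENS   = {nash_sutcliffe(observed, simulated):.2f}")
print(f"PBIAS = {pbias(observed, simulated):.1f}%")
```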
Abstract:
The Bartlett-Lewis Rectangular Pulse Modified (BLPRM) model simulates precipitation at hourly and sub-hourly time scales and has six parameters for each of the twelve months of the year. This study aimed to evaluate the behavior of 15-min precipitation series simulated with the BLPRM model in two situations: (a) when the parameters are estimated from different combinations of statistics, creating five different sets; and (b) when assessing the suitability of the model to generate rainfall. To fit the parameters, rain gauge records from Pelotas/RS/Brazil were used, from which the following statistics were estimated: mean, variance, covariance, lag-1 autocorrelation coefficient, and the proportion of dry periods in the interval considered. The results showed that the parameters related to the onset of precipitation (λ) and to intensity (μx) were the most stable, while the most unstable was the ν parameter, related to rain duration. The BLPRM model adequately represented the mean, variance, and proportion of dry periods of the 15-min precipitation series; however, the temporal dependence of rainfall depths, represented by the lag-1 autocorrelation coefficient, was less adequately reproduced by the simulated series at the 15-min duration.
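A small sketch of the aggregate statistics the abstract says were estimated from the rain gauge records (mean, variance, lag-1 autocorrelation, proportion of dry intervals) at a fixed aggregation level; the 15-min series below is synthetic, not the Pelotas record, and this is not the BLPRM fitting procedure itself.

```python
# Aggregate statistics of a rainfall series at a fixed time step (e.g. 15 min).
import numpy as np

def aggregate_statistics(rain_mm, dry_threshold=0.0):
    x = np.asarray(rain_mm, float)
    lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]      # lag-1 autocorrelation
    return {
        "mean": x.mean(),
        "variance": x.var(ddof=1),
        "lag1_autocorrelation": lag1,
        "proportion_dry": np.mean(x <= dry_threshold),
    }

rng = np.random.default_rng(0)
# Synthetic 15-min depths (mm) over 10 days, mostly dry intervals
series_15min = rng.choice([0.0, 0.2, 1.5, 4.0], size=960, p=[0.85, 0.08, 0.05, 0.02])
print(aggregate_statistics(series_15min))
```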
Abstract:
Based on experimental tests, equations were obtained for drying, equilibrium moisture content, latent heat of vaporization of the water contained in the product, and the specific heat of cassava starch pellets, which are essential parameters for the modeling and mathematical simulation of the mechanical drying of cassava starch in a newly proposed technique, consisting of preforming by pelleting and subsequent artificial drying of the starch pellets. Drying tests were conducted in an experimental chamber by varying the air temperature, relative humidity, air velocity, and product load. The specific heat of the starch was determined by differential scanning calorimetry. The generated equations were validated through regression analysis, which showed a good correlation with the data, indicating that these equations can be used to accurately model and simulate the drying process of cassava starch pellets.
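The abstract does not give the fitted equation forms, so the sketch below assumes a common thin-layer drying model (the Page model, MR = exp(-k t^n)) fitted to placeholder moisture-ratio data by nonlinear regression, only to illustrate the kind of regression-based validation described.

```python
# Hedged sketch: fit an assumed Page thin-layer drying model to illustrative data.
import numpy as np
from scipy.optimize import curve_fit

def page_model(t, k, n):
    """Moisture ratio as a function of drying time t (hours)."""
    return np.exp(-k * t ** n)

t_h = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 4.0, 6.0])         # drying time (h)
mr = np.array([0.88, 0.78, 0.61, 0.39, 0.26, 0.18, 0.09])    # illustrative data

(k, n), _ = curve_fit(page_model, t_h, mr, p0=(0.5, 1.0))
r2 = 1 - np.sum((mr - page_model(t_h, k, n)) ** 2) / np.sum((mr - mr.mean()) ** 2)
print(f"k = {k:.3f}, n = {n:.3f}, R^2 = {r2:.3f}")
```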
Abstract:
The successful implementation of wind turbines depends on several factors, including the wind resource at the installation site, the equipment used, and project acquisition and operational costs. In this paper, the electricity production of two small wind turbines, a national model of 6 kW and an imported one of 5 kW, was compared through simulation using the HOMER software. The wind resources of three different cities were considered: Campinas (SP/BR), Cubatão (SP/BR), and Roscoe (Texas/USA). A grid-connected wind power system and an isolated wind-battery system were evaluated. The results showed that the energy cost ($/kWh) is strongly dependent on the turbine characteristics and the local wind resource. Regarding the isolated wind-battery system, full supply of the simulated electrical load is only guaranteed with a large battery bank and a high number of wind turbines, due to the intermittency of wind power.
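A hedged sketch of a cost-of-energy comparison in the spirit of the analysis above: a plain levelized cost of energy (LCOE) calculation, not HOMER's own algorithm; all costs, lifetimes, and capacity factors are illustrative placeholders.

```python
# Simple LCOE comparison of two hypothetical small wind turbines.

def crf(rate, years):
    """Capital recovery factor for an annual discount rate over a lifetime."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capital_cost, om_per_year, annual_energy_kwh, rate=0.08, years=20):
    """Levelized cost of energy in $/kWh."""
    annualized_capital = capital_cost * crf(rate, years)
    return (annualized_capital + om_per_year) / annual_energy_kwh

# Hypothetical 6 kW and 5 kW turbines at a site with an assumed capacity factor
for name, kw, capex, om in [("6 kW turbine", 6, 30000, 450),
                            ("5 kW turbine", 5, 27000, 400)]:
    energy = kw * 8760 * 0.22   # assumed 22% capacity factor
    print(f"{name}: {lcoe(capex, om, energy):.3f} $/kWh")
```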
Abstract:
The role of genetic factors in the pathogenesis of Alzheimer's disease (AD) is not completely understood. In order to improve this understanding, the cerebral glucose metabolism of seven monozygotic and nine dizygotic twin pairs discordant for AD was compared to that of 13 unrelated controls using positron emission tomography (PET). Traditional region-of-interest analysis revealed no differences between the non-demented dizygotic co-twins and controls. In contrast, in voxel-level and automated region-of-interest analyses, the non-demented monozygotic co-twins displayed a lower metabolic rate in temporal and parietal cortices as well as in subcortical grey matter structures when compared to controls. Again, no reductions were seen in the non-demented dizygotic co-twins. The reductions seen in the non-demented monozygotic co-twins may indicate a higher genetically mediated risk of AD or genetically mediated hypometabolism, possibly rendering them more vulnerable to AD pathogenesis. With no disease-modifying treatment available for AD, prevention of dementia is of the utmost importance. A total of 2,165 twins of the Finnish Twin Cohort, aged at least 65 years and with questionnaire data from 1981, participated in a validated telephone interview assessing cognitive function between 1999 and 2007. Subjects reporting heavy alcohol drinking in 1981 had an elevated risk of cognitive impairment over 20 years later compared to light drinkers. In addition, binge drinking was associated with an increased risk even when total alcohol consumption was controlled for, suggesting that binge drinking is an independent risk factor for cognitive impairment. Compared to light drinkers, non-drinkers also had an increased risk of cognitive impairment. Midlife hypertension, obesity, and low leisure-time physical activity, but not hypercholesterolemia, were significant risk factors for cognitive impairment. The accumulation of risk factors increased cognitive impairment risk in an additive manner. A previously postulated dementia risk score based on midlife demographic and cardiovascular factors was validated. The risk score was found to predict cognitive impairment risk well, and the risk increased significantly as the score became higher. However, the risk score is not accurate enough for clinical use without further testing.
Abstract:
The survival of preterm-born infants has increased, but the prevalence of long-term morbidities has remained high. Preterm-born children are at increased risk for various developmental impairments, including both severe neurological deficits and deficits in cognitive development. According to the literature, developmental outcomes differ between countries, centers, and eras. Definitions of preterm infants vary between studies, and follow-up has been carried out with diverse methods, making comparisons less reliable. It is essential to offer parents up-to-date information about the outcomes of preterm infants born in the same area. A centralized follow-up of children at risk makes it possible to monitor how changes in hospital treatment practices affect developmental outcomes. This thesis is part of a larger regional, prospective, multidisciplinary follow-up project entitled "Development and Functioning of Very Low Birth Weight Infants from Infancy to School Age" (PIeniPAinoisten RIskilasten käyttäytyminen ja toimintakyky imeväisiästä kouluikään, PIPARI). The thesis consists of four original studies presenting data on very low birth weight (VLBW) infants born between 2001 and 2006, who were followed up from the neonatal period until the age of five years. The main outcome measure was cognitive development, and the secondary outcomes were significant neurological deficits (cerebral palsy (CP), deafness, and blindness). In Study I, the early crying and fussing behavior of preterm infants was studied using parental diaries, and the relation between crying behavior and cognitive and motor development at the age of two years was assessed. In Study II, the developmental outcome (cognitive, CP, deafness, and blindness) at the age of two years was studied in relation to demographic, antenatal, neonatal, and brain imaging data. Development was studied in relation to a full-term control group born in the same hospital. In Study III, the stability of cognitive development was studied in the VLBW and full-term groups by comparing outcomes at the ages of two and five years. Finally, in Study IV, the precursors of reading skills (phonological processing, rapid automatized naming, and letter knowledge) were assessed for VLBW and full-term children at the age of five years. Pre-reading skills were studied in relation to demographic, antenatal, neonatal, and brain imaging data. The main findings of the thesis were that VLBW infants who fussed or cried more in infancy were not at greater risk for problems in their cognitive development. However, crying was associated with poorer motor development. The developmental outcome of the present population was better than has been reported earlier, and this improvement also covered cognitive development. However, the difference from full-term born peers was still significant. Major brain pathology and intestinal perforation were independent, significant risk factors for adverse outcome, even when several individual risk factors were controlled for. Cognitive development at the age of two years was strongly related to development at the age of five years, stressing the importance of early assessment and the possibility of early intervention. Finally, VLBW children had poorer pre-reading skills compared with their full-term born peers, but IQ was an important mediator even when children with mental retardation were excluded from the analysis.
The findings suggest that counseling parents about the developmental prospects of their preterm infant should be based on data covering the same birth hospital. Neonatal brain imaging data and neonatal morbidity are important predictors of developmental outcome. The findings of the present study stress the importance of both short-term (two-year) and long-term (five-year) follow-ups for the individual, and for improving the quality of care.
Abstract:
The aim of this dissertation is to investigate whether participation in business simulation gaming sessions can make different leadership styles visible and provide students with experiences beneficial for the development of leadership skills. In particular, the focus is on describing the development of leadership styles when leading virtual teams in computer-supported collaborative game settings and on identifying the outcomes of using computer simulation games as leadership training tools. To address the objectives of the study, three empirical experiments were conducted to explore whether participation in business simulation gaming sessions (Studies I and II), which integrate face-to-face and virtual communication (Studies III and IV), can make different leadership styles visible and provide students with experiences beneficial for the development of leadership skills. In the first experiment, a group of multicultural graduate business students (N=41) participated in gaming sessions with a computerized business simulation game (Study III). In the second experiment, a group of graduate students (N=9) participated in training with a 'real estate' computer game (Studies I and II). In the third experiment, a business simulation gaming session was organized for a group of graduate students (N=26), and the participants played the simulation game in virtual teams, which were organizationally and geographically dispersed but connected via technology (Study IV). Each team in all experiments had three to four students, who were between 22 and 25 years old. The business computer games used in the empirical experiments presented a large number of complex operations in which the team leader needed to make the final decisions to lead the team to win the game. These gaming environments were interactive; participants interacted by solving the given tasks in the game. Thus, strategy and appropriate leadership were needed to be successful. The training was competition-based and required the implementation of leadership skills. The data of these studies consist of observations, participants' reflective essays written after the gaming sessions, pre- and post-test questionnaires, and participants' answers to open-ended questions. Participants' interactions and collaboration were observed while they played the computer games. The transcripts of the observation notes and students' dialogues were coded in terms of transactional, transformational, heroic, and post-heroic leadership styles. For the analysis of the transcribed observation notes, content analysis and discourse analysis were applied. The Multifactor Leadership Questionnaire (MLQ) was also used to measure transformational and transactional leadership styles; in addition, quantitative (one-way repeated-measures ANOVA) and qualitative data analyses were performed. The results of this study indicate that, in the business simulation gaming environment, certain leadership characteristics emerged spontaneously. Experiences of leadership varied between the teams and depended on the role individual students had in their team. These four studies showed that the simulation gaming environment has the potential to be used in higher education to exercise the leadership styles relevant to real-world work contexts. Further, the study indicated that, given debriefing sessions, the simulation game context has much potential to benefit learning.
The participants who showed interest in leadership roles were given the opportunity to develop leadership skills in practice. The study also provides evidence of unpredictable situations that participants can experience and learn from during the gaming sessions. The study illustrates the complex nature of experiences in gaming environments and the need for a team leader and role division during the gaming sessions. It can be concluded that the experience of simulation game training illustrated the complexity of real-life situations and confronted participants with the challenges of virtual leadership and the difficulties of using leadership styles in practice. As a result, the study offers playing computer simulation games in small teams as one way to exercise leadership styles in practice.
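As a hedged sketch of the quantitative step mentioned above (a one-way repeated-measures ANOVA on pre- and post-session questionnaire scores), the snippet below uses entirely made-up data, not the study's MLQ responses.

```python
# Repeated-measures ANOVA on hypothetical pre/post questionnaire scores.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

pre =  [2.8, 3.1, 2.5, 3.0, 2.7, 3.3, 2.9, 3.0]   # illustrative scores
post = [3.2, 3.4, 2.9, 3.1, 3.0, 3.6, 3.1, 3.3]

records = []
for i, (a, b) in enumerate(zip(pre, post)):
    records.append({"subject": i, "time": "pre", "score": a})
    records.append({"subject": i, "time": "post", "score": b})

df = pd.DataFrame(records)
result = AnovaRM(df, depvar="score", subject="subject", within=["time"]).fit()
print(result)   # F test for the within-subject factor 'time'
```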
Abstract:
Transportation and warehousing are large and growing sectors in society, and their efficiency is of high importance. Transportation also accounts for a large share of global carbon dioxide emissions, which are one of the leading causes of anthropogenic climate warming. Various countries have agreed to decrease their carbon emissions under the Kyoto Protocol. Transportation is the only sector where emissions have steadily increased since the 1990s, which highlights the importance of transportation efficiency. The efficiency of transportation and warehousing can be improved with the help of simulations, but models alone are not sufficient. This research concentrates on the use of simulations in decision support systems. Three main simulation approaches are used in logistics: discrete-event simulation, system dynamics, and agent-based modeling. However, each individual simulation approach has weaknesses of its own. Hybridization (combining two or more approaches) can improve the quality of the models, as it allows a different method to overcome the weakness of another. It is important to choose the correct approach (or combination of approaches) when modeling transportation and warehousing issues. If an inappropriate method is chosen (which can occur if the modeler is proficient in only one approach or the model specification is not conducted thoroughly), the simulation model will have an inaccurate structure, which in turn will lead to misleading results. This issue can escalate further, as the decision-maker may assume that the presented simulation model gives the most useful results available, even though the whole model may be based on a poorly chosen structure. This research argues that simulation-based decision support systems must take various issues into account in order to function properly. The actual simulation model can be constructed using any (or several) of the approaches, it can be combined with different optimization modules, and there needs to be a proper interface between the model and the user. These issues are presented in a framework that simulation modelers can use when creating decision support systems. For decision-makers to fully benefit from simulations, the user interface needs to clearly separate the model from the user, but at the same time, the user needs to be able to execute the appropriate runs in order to analyze the problems correctly. This study recommends that simulation modelers start to transfer their tacit knowledge into explicit knowledge. This would greatly benefit the whole simulation community and improve the quality of simulation-based decision support systems as well. More studies should also be conducted using hybrid models and integrating simulations with Geographic Information Systems.
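To make the discrete-event approach named above concrete, here is a minimal event-queue sketch (not the thesis' model): trucks arrive at a warehouse with a single unloading dock and the average waiting time is tracked; all rates are invented placeholders.

```python
# Minimal discrete-event simulation: single-dock warehouse unloading queue.
import heapq
import random

random.seed(1)

def simulate(n_trucks=50, mean_interarrival=20.0, mean_unload=15.0):
    events = []                       # (time, kind, truck_id) min-heap
    t = 0.0
    for i in range(n_trucks):
        t += random.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, "arrival", i))

    dock_free_at = 0.0
    waits = []
    while events:
        time, kind, truck = heapq.heappop(events)
        if kind == "arrival":
            start = max(time, dock_free_at)      # wait if the dock is busy
            waits.append(start - time)
            dock_free_at = start + random.expovariate(1.0 / mean_unload)
    return sum(waits) / len(waits)

print(f"average waiting time: {simulate():.1f} min")
```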
Abstract:
Preattentive perception of occasional deviant stimuli in a stream of standard stimuli can be recorded with the cognitive event-related potential (ERP) mismatch negativity (MMN). Earlier stimulus detection at the auditory cortex can be examined with the N1 and P2 ERPs. The MMN recording does not require cooperation, it correlates with the perceptual threshold, and even complex sounds can be used as stimuli. The aim of this study was to examine different aspects that should be considered when measuring hearing discrimination with ERPs. The MMN was found to be stimulus-intensity-dependent: as the intensity of sine-wave stimuli was increased from 40 to 80 dB HL, the MMN mean amplitudes increased. The effect of stimulus frequency on the MMN was studied so that the pitch difference would be equal in each stimulus block according to the psychophysiological mel scale or the difference limen of frequency (DLF); however, the blocks differed from each other. Contralateral white-noise masking (50 dB EML) was found to attenuate the MMN amplitude when the right ear was stimulated. The N1 amplitude was attenuated, whereas the P2 amplitude was not affected by contralateral white-noise masking. The perception and production of vowels by four postlingually deafened patients with a cochlear implant were studied. The MMN response could be elicited in the patient with the best vowel perception abilities. The results of the studies show that, concerning MMN recordings, the stimulus parameters and the design of the recording procedure have a great influence on the results.
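A hedged sketch of how an MMN difference wave is commonly obtained: average the ERP epochs to standard and deviant stimuli and subtract them. Synthetic data stand in for real EEG; this is not the study's recording pipeline, and the sampling rate and amplitudes are illustrative.

```python
# MMN as the deviant-minus-standard difference wave on synthetic epochs.
import numpy as np

rng = np.random.default_rng(0)
fs = 500                                   # sampling rate, Hz (assumed)
t = np.arange(-0.1, 0.4, 1 / fs)           # epoch from -100 to 400 ms

def epochs(n, mmn_amplitude_uv):
    """Simulate n single-trial epochs: noise plus an optional negative bump."""
    bump = mmn_amplitude_uv * np.exp(-((t - 0.15) ** 2) / (2 * 0.03 ** 2))
    return bump + rng.normal(0, 5, size=(n, t.size))

standard = epochs(400, 0.0).mean(axis=0)   # averaged standard response
deviant = epochs(80, -3.0).mean(axis=0)    # averaged deviant response
mmn = deviant - standard                   # difference wave

window = (t > 0.1) & (t < 0.25)            # typical MMN latency window
print(f"MMN peak amplitude ~ {mmn[window].min():.1f} uV")
```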
Abstract:
Combating climate change is one of the key tasks of humanity in the 21st century. One of its leading causes is carbon dioxide emissions from the use of fossil fuels. Renewable energy sources should be used instead of relying on oil, gas, and coal. In Finland, a significant amount of energy is produced from wood. The use of wood chips is expected to increase significantly in the future, by over 60%. The aim of this research is to improve understanding of the costs of wood chip supply chains. This is done by using simulation as the main research method. The simulation model combines agent-based modelling and discrete-event simulation to imitate the wood chip supply chain. This thesis concentrates on the use of simulation-based decision support systems in strategic decision-making. The simulation model is part of a decision support system, which connects the model to databases and also provides a graphical user interface for the decision-maker. The main analysis conducted with the decision support system compares a traditional supply chain to a supply chain utilizing specialized containers. According to the analysis, the container supply chain achieves lower costs than the traditional supply chain. A container supply chain can also be scaled up more easily due to faster emptying operations. Initially, the container operations would supply only part of the fuel needs of a power plant and would complement the current supply chain. The model can be expanded to include intermodal supply chains, since, with increasing future demand, there are not enough wood chips located close to current and future power plants.
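A hedged sketch of the kind of cost comparison described above: unit cost of delivered wood chips for a traditional chain versus a container chain with faster emptying at the plant. All parameters are illustrative placeholders, not the thesis' data.

```python
# Unit delivery cost (per MWh of fuel) for two supply chain configurations.

def delivery_cost_per_mwh(load_mwh, driving_h, unload_h,
                          truck_cost_per_h, handling_cost):
    """Cost of one delivery cycle divided by the energy it delivers."""
    cycle_cost = (driving_h + unload_h) * truck_cost_per_h + handling_cost
    return cycle_cost / load_mwh

traditional = delivery_cost_per_mwh(load_mwh=120, driving_h=3.0, unload_h=1.0,
                                    truck_cost_per_h=90, handling_cost=60)
container = delivery_cost_per_mwh(load_mwh=110, driving_h=3.0, unload_h=0.3,
                                  truck_cost_per_h=95, handling_cost=40)

print(f"traditional chain: {traditional:.2f} EUR/MWh")
print(f"container chain:   {container:.2f} EUR/MWh")
```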
Abstract:
Computational model-based simulation methods were developed for the modelling of bioaffinity assays. Bioaffinity-based methods are widely used to quantify biological substances in biological research and development and in routine clinical in vitro diagnostics. Bioaffinity assays are based on the high affinity and structural specificity of the binding biomolecules. The simulation methods developed are based on a mechanistic assay model, which relies on chemical reaction kinetics and describes the formation of the bound component as a function of time from the initial binding interaction. The simulation methods focused on studying the behaviour and reliability of bioaffinity assays and the possibilities that modelling binding reaction kinetics provides, such as predicting assay results even before the binding reaction has reached equilibrium. For example, a rapid quantitative result from a clinical bioaffinity assay can be very significant: even the smallest elevation of a heart muscle marker reveals a cardiac injury. The simulation methods were used to identify critical error factors in rapid bioaffinity assays. A new kinetic calibration method was developed to calibrate a measurement system from kinetic measurement data using only one standard concentration. A node-based method was developed to model multi-component binding reactions, which have been a challenge for traditional numerical methods. The node-based method was also used to model protein adsorption as an example of nonspecific binding of biomolecules. These methods were compared with experimental data from practice and can be utilized in in vitro diagnostics, drug discovery, and medical imaging.
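For orientation, a minimal sketch of the standard 1:1 binding kinetics that a mechanistic assay model builds on, d[AB]/dt = kon[A][B] − koff[AB], integrated numerically; this is not the thesis' node-based method, and the rate constants and concentrations are illustrative.

```python
# Time course of bound complex [AB] for reversible 1:1 binding kinetics.
import numpy as np
from scipy.integrate import solve_ivp

kon, koff = 1e5, 1e-3          # association (1/(M*s)) and dissociation (1/s) rates
A0, B0 = 1e-9, 5e-9            # initial analyte and binder concentrations (M)

def binding_rate(t, y):
    ab = y[0]
    return [kon * (A0 - ab) * (B0 - ab) - koff * ab]

sol = solve_ivp(binding_rate, (0, 3600), [0.0],
                t_eval=np.linspace(0, 3600, 7))
for t, ab in zip(sol.t, sol.y[0]):
    print(f"t = {t:6.0f} s   [AB] = {ab:.2e} M")   # approach to equilibrium
```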
Abstract:
The last decade has shown that the global paper industry needs new processes and products in order to reassert its position in the market. As the paper markets in Western Europe and North America have stabilized, competition has tightened. Along with the development of more cost-effective processes and products, new process design methods are also required to break the old molds and create new ideas. This thesis discusses the development of a process design methodology based on simulation and optimization methods. A bi-level optimization problem and a solution procedure for it are formulated and illustrated. Computational models and simulation are used to describe the phenomena inside a real process, and mathematical optimization is used to find the best process structures and control principles for the process. Dynamic process models are used inside the bi-level optimization problem, which is assumed to be dynamic and multiobjective due to the nature of papermaking processes. The numerical experiments show that the bi-level optimization approach is useful for different kinds of problems related to process design and optimization. Here, the design methodology is applied to a constrained process area of a papermaking line. However, the same methodology is applicable to all types of industrial processes, e.g., the design of biorefineries, because the methodology is fully generalized and can be easily modified.
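A hedged toy sketch of a bi-level optimization in the sense described above: an outer (design) problem whose objective depends on the solution of an inner (operating/control) problem. The quadratic objectives are illustrative stand-ins, not a papermaking model.

```python
# Nested (bi-level) optimization: design variable outside, operating point inside.
from scipy.optimize import minimize, minimize_scalar

def inner_problem(design):
    """Best operating point (and its cost) for a fixed design variable."""
    def operating_cost(u):
        return (u - 0.6 * design) ** 2 + 0.1 * u ** 2
    res = minimize_scalar(operating_cost, bounds=(0.0, 10.0), method="bounded")
    return res.x, res.fun

def outer_objective(design_vec):
    design = design_vec[0]
    _, best_operating_cost = inner_problem(design)
    capital_cost = 0.5 * (design - 3.0) ** 2
    return capital_cost + best_operating_cost

res = minimize(outer_objective, x0=[1.0], bounds=[(0.0, 10.0)])
best_design = res.x[0]
best_u, _ = inner_problem(best_design)
print(f"design = {best_design:.2f}, operating point = {best_u:.2f}, "
      f"total cost = {res.fun:.3f}")
```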
Abstract:
Modern machine structures are often fabricated by welding. From a fatigue point of view, structural details, and especially welded details, are the most prone to fatigue damage and failure. Design against fatigue requires information on the fatigue resistance of a structure's critical details and on the stress loads that act on each detail. Even though dynamic simulation of flexible bodies is already a current method for analyzing structures, obtaining the stress history of a structural detail during dynamic simulation is a challenging task, especially when the detail has a complex geometry. In particular, analyzing the stress history of every structural detail within a single finite element model can be overwhelming, since the number of nodal degrees of freedom needed in the model may require an impractical amount of computational effort. The purpose of computer simulation is to reduce the number of prototypes and speed up the product development process. Also, to take operator influence into account, real-time models, i.e., simplified and computationally efficient models, are required. This, in turn, requires stress computation to be efficient if it is to be performed during dynamic simulation. The research reviews the theoretical background of multibody dynamic simulation and the finite element method to find suitable parts for a new approach to efficient stress calculation. This study proposes that the problem of stress calculation during dynamic simulation can be greatly simplified by combining the floating frame of reference formulation with modal superposition and a sub-modeling approach. In practice, the proposed approach can be used to efficiently generate the relevant fatigue-assessment stress history of a structural detail during or after dynamic simulation. In this work, numerical examples are presented to demonstrate the proposed approach in practice. The results show that the approach is applicable and can be used as proposed.
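A hedged sketch of modal-superposition stress recovery of the general kind described above: the stress history at a detail is approximated as a combination of precomputed modal stress fields weighted by the modal coordinates from the flexible multibody simulation. All arrays below are synthetic; this is not the thesis' implementation.

```python
# Stress history via modal superposition: sigma(t) = sum_i q_i(t) * sigma_mode_i
import numpy as np

rng = np.random.default_rng(0)

n_modes, n_steps, n_stress_points = 4, 1000, 3
time = np.linspace(0.0, 10.0, n_steps)

# Modal coordinates q_i(t): in practice, output of the multibody simulation.
freqs = np.array([1.2, 3.4, 5.6, 8.9])            # Hz, illustrative
amps = np.array([[1.0], [0.4], [0.2], [0.1]])
q = amps * np.sin(2 * np.pi * freqs[:, None] * time)   # shape (n_modes, n_steps)

# Modal stress fields at the detail's recovery points: in practice, from the
# finite element (sub-)model, one stress vector per mode (MPa per unit q).
sigma_modes = rng.normal(0, 50, size=(n_modes, n_stress_points))

# Recovered stress history at each point, shape (n_steps, n_stress_points)
stress_history = q.T @ sigma_modes

print("max stress per recovery point [MPa]:", stress_history.max(axis=0).round(1))
```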