862 results for "computer simulation software"
Abstract:
We apply Agent-Based Modeling and Simulation (ABMS) to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between human resource management practices and retail productivity. Despite the fact that we are working within a relatively novel and complex domain, it is clear that intelligent agents do offer potential for developing organizational capabilities in the future. Our multi-disciplinary research team has worked with a UK department store to collect data and capture perceptions about operations from actors within departments. Based on this case-study work, we have built a simulator that we present in this paper. We then use the simulator to gather empirical evidence regarding two specific management practices: empowerment and employee development.
Abstract:
One of the challenges that neuroscience researchers pose to biomedical engineers is brain-machine interaction. The nervous system communicates through electrochemical signals, and implantable circuits make decisions in order to interact with the biological environment. It is well known that Parkinson's disease is related to a deficit of dopamine (DA). Different methods have been employed to control dopamine concentration, such as magnetic or electrical stimulators or drugs. In this work, the neurotransmitter concentration was controlled automatically, an approach not currently employed. To that end, four systems were designed and developed: deep brain stimulation (DBS), transcranial magnetic stimulation (TMS), infusion pump control (IPC) for drug delivery, and fast-scan cyclic voltammetry (FSCV), i.e. sensing circuits that detect the varying concentrations of neurotransmitters such as dopamine caused by these stimulations. Software was also developed for data display and analysis in synchrony with events in the experiments. The architecture is flexible: the infusion pumps can be combined with DBS or TMS, used alone or together with other stimuli such as lights and sounds. The developed system automatically controls the DA concentration. Its resolution is around 0.4 µmol/L, with a concentration-correction time adjustable between 1 and 90 seconds. The system can regulate DA concentrations between 1 and 10 µmol/L, with an error of about ±0.8 µmol/L. Although designed to control DA concentration, the system can be used to control the concentration of other substances. It is proposed to continue the closed-loop development with FSCV and DBS (or TMS, or infusion) using parkinsonian animal models.
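The closed-loop idea described above — FSCV sensing driving an actuator toward a DA setpoint — can be illustrated with a minimal proportional controller acting on a hypothetical first-order clearance model. All rates and gains below are assumptions for the sketch, not values from this work:

```python
def simulate_closed_loop(setpoint=5.0, steps=600, dt=1.0):
    """Toy closed loop: a sensed concentration drives the infusion rate."""
    k_clear = 0.05   # assumed first-order DA clearance rate (1/s)
    kp = 0.2         # assumed proportional controller gain
    c = 1.0          # initial DA concentration (µmol/L)
    for _ in range(steps):
        error = setpoint - c          # sensed deviation from the target
        u = max(0.0, kp * error)      # infusion rate cannot be negative
        c += dt * (-k_clear * c + u)  # plant: clearance plus infusion
    return c
```

A purely proportional loop settles with a steady-state offset (here near 4.0 µmol/L for a 5.0 µmol/L setpoint), which is one reason practical controllers of this kind add an integral term.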
Abstract:
Following the drop in estrogen at menopause, some women begin to lose bone mass at more than 1% per year, reaching a loss greater than 25% after five years. Factors such as older age, low calcium intake and premature menopause favor the onset of osteoporosis. Preventive measures such as nutritional counseling toward a proper diet, supported by technology through applications that assess dietary intake, are therefore essential. Thus, this study aimed to develop an application for the Android® platform focused on the evaluation of the nutritional and organic conditions involved in bone health and the risks of developing osteoporosis in postmenopausal women. To achieve this goal, we studied 72 women aged 46-79 years, enrolled in the physical exercise for bone health program of the Laboratory for Research in Biochemistry and Densitometry at the Federal Technological University of Paraná. Data were collected in the second half of 2014 through bone densitometry and body composition scans, blood tests, anthropometric measurements and nutritional assessment. The study included women with a current diagnosis of primary osteopenia or osteoporosis, aged over 45 years and postmenopausal. Bone mineral density and body composition were assessed with a Hologic Discovery™ Model A dual-energy X-ray absorptiometry (DXA) device. The anthropometric assessment included body mass, height, and abdominal, waist and hip circumferences. Food consumption was assessed with the 24-hour dietary recall (24HR). The intake of energy and nutrients was estimated by tabulating the foods consumed in the Diet Pro 4® software. In a subsample of 30 women with osteopenia/osteoporosis, serum calcium and alkaline phosphatase tests were performed. The results for this group (n = 30) showed an average calcium intake of 570 mg/day (± 340).
The analysis showed mean serum calcium within the normal range (10.20 mg/dL ± 0.32) and slightly elevated mean alkaline phosphatase (105.40 U/L ± 23.70). Furthermore, there was a significant correlation between protein consumption and the optimal daily intake of calcium (0.375, p-value 0.05). Based on these findings, we developed an early-stage application, named OsteoNutri, for Google®'s Android® operating system. Java with the Eclipse® IDE was used to build the Android® version of the project, choose the application icons, and configure the visual editor for building the application layouts. DroidDraw® was used for the development of the three application GUIs. For practical tests we used a mobile phone compatible with the targeted version (4.4 or higher). The prototype was developed in conjunction with the Group for Development of Applications and Instrumentation (GDAI) of the Federal Technological University of Paraná. This application can thus be considered an important tool for dietary control, allowing closer monitoring of calcium and dietary protein consumption.
Abstract:
Electrical neuromodulation of lumbar segments improves motor control after spinal cord injury in animal models and humans. However, the physiological principles underlying the effect of this intervention remain poorly understood, which has limited the therapeutic approach to continuous stimulation applied to restricted spinal cord locations. Here we developed stimulation protocols that reproduce the natural dynamics of motoneuron activation during locomotion. For this, we computed the spatiotemporal activation pattern of muscle synergies during locomotion in healthy rats. Computer simulations identified optimal electrode locations to target each synergy through the recruitment of proprioceptive feedback circuits. This framework steered the design of spatially selective spinal implants and real-time control software that modulate extensor and flexor synergies with precise temporal resolution. Spatiotemporal neuromodulation therapies improved gait quality, weight-bearing capacity, endurance and skilled locomotion in several rodent models of spinal cord injury. These new concepts are directly translatable to strategies to improve motor control in humans.
Abstract:
An unfolding method for linear intercept and section area distributions was implemented for structures with spherical grains. Although the unfolding routine depends on the grain shape, structures with spheroidal grains can also be treated by it; grains of non-spheroidal shape can be treated only approximately. Software was developed in two parts: the first part calculates the probability matrix, and the second uses this matrix and minimizes the chi-square. The results can be presented with any number of size classes, as required. The probability matrix was determined from linear intercept and section area distributions created by computer simulation; using curve fitting, the probability matrix for spheres of any size could be determined. Two kinds of tests were carried out to prove the efficiency of the technique. The theoretical tests represent ideal cases, and in them the software was able to find the proposed grain size distribution exactly. In the second test, a structure was simulated in the computer and images of its slices were used to produce the corresponding linear intercept and section area distributions, which were then unfolded. This test better simulates reality, and its results show deviations from the real size distribution caused by statistical fluctuation. The unfolding of the linear intercept distribution works perfectly, but the unfolding of the section area distribution does not, due to a failure in the chi-square minimization: the minimization method uses a matrix inversion routine, and the matrix generated by this procedure cannot be inverted. Another minimization method must be used.
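One alternative to an inversion-based chi-square minimization is a constrained least-squares solve, which never inverts the probability matrix. The sketch below is a toy with a made-up 3x3 probability matrix (not the matrix from this work): it recovers non-negative class counts g from a measured histogram h ≈ P·g by projected gradient descent.

```python
import numpy as np

def unfold(P, h, iters=5000):
    """Solve min ||P g - h||^2 subject to g >= 0 without inverting P."""
    lr = 1.0 / np.linalg.norm(P, 2) ** 2    # step size from the spectral norm
    g = np.zeros(P.shape[1])
    for _ in range(iters):
        grad = P.T @ (P @ g - h)            # gradient of the least-squares cost
        g = np.maximum(0.0, g - lr * grad)  # project back onto g >= 0
    return g

# Toy probability matrix: column j holds the intercept-class probabilities
# for sphere-size class j (larger spheres contribute to all smaller classes).
P = np.array([[1.0, 0.4, 0.2],
              [0.0, 0.6, 0.3],
              [0.0, 0.0, 0.5]])
g_true = np.array([10.0, 20.0, 30.0])
g_est = unfold(P, P @ g_true)   # recovers [10, 20, 30]
```

The non-negativity constraint also rules out the negative class counts that unconstrained inversion can produce on noisy histograms.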
Abstract:
Oil wells subjected to cyclic steam injection present important challenges for the development of well cementing systems, mainly due to tensile stresses caused by thermal gradients during the well's useful life. Cement sheath failures in wells using conventional high-compressive-strength systems have led to the use of cement systems that are more flexible and/or ductile, with emphasis on Portland cement systems with latex addition. Recent research efforts have presented geopolymeric systems as alternatives. These cementing systems are based on the alkaline activation of amorphous aluminosilicates such as metakaolin or fly ash and display advantageous properties such as high compressive strength, fast setting and thermal stability. Basic geopolymeric formulations can be found in the literature that meet basic oil-industry specifications such as rheology, compressive strength and thickening time. In this work, new geopolymeric formulations were developed, based on metakaolin, potassium silicate, potassium hydroxide, silica fume and mineral fiber, using the state of the art in chemical composition, mixture modeling and additivation to optimize the properties most relevant for oil well cementing. Starting from molar ratios considered ideal in the literature (SiO2/Al2O3 = 3.8 and K2O/Al2O3 = 1.0), a study of dry mixtures was performed, based on the compressible packing model, resulting in an optimal volume of 6% for the added solid material. This material (silica fume and mineral fiber) works both as an additional silica source (in the case of silica fume) and as mechanical reinforcement, especially in the case of the mineral fiber, which increased the tensile strength. The first triaxial mechanical study of this class of materials was performed. For comparison, a mechanical study of conventional latex-based cementing systems was also carried out.
Regardless of the differences in failure mode (brittle for geopolymers, ductile for latex-based systems), the superior uniaxial compressive strength (37 MPa for the geopolymeric slurry P5 versus 18 MPa for the conventional slurry P2), similar triaxial behavior (friction angle of 21° for both P5 and P2) and lower stiffness (5.1 GPa in the elastic region for P5 versus 6.8 GPa for P2) of the geopolymeric systems allowed them to withstand a similar amount of mechanical energy (155 kJ/m³ for P5 versus 208 kJ/m³ for P2), noting that the geopolymers work in the elastic regime, without the microcracking present in the latex-based systems. Therefore, the geopolymers studied in this work must be designed for application in the elastic region to avoid brittle failure. Finally, the tensile strength of geopolymers is originally poor (1.3 MPa for the geopolymeric slurry P3) due to their brittle structure; after additivation with mineral fiber, however, it became equivalent to that of the latex-based systems (2.3 MPa for P5 and 2.1 MPa for P2). The technical viability of the conventional and proposed formulations was evaluated for the whole well life, including the stresses due to cyclic steam injection. This analysis was performed using finite element-based simulation software. It was verified that conventional slurries are viable up to 400 °F (204 °C), while geopolymeric slurries remain viable above 500 °F (260 °C).
Abstract:
In our research we investigate the output accuracy of discrete event simulation models and agent-based simulation models when studying human-centric complex systems. In this paper we focus on human reactive behaviour, since it can be implemented in both modelling approaches using standard methods. As a case study we have chosen the retail sector, in particular the operations of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation by modelling the reactive behaviour of staff and customers of the department. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall we have found that, for our case study, both discrete event simulation and agent-based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.
Abstract:
A primary goal of this dissertation is to understand the links between mathematical models that describe crystal surfaces at three fundamental length scales: The scale of individual atoms, the scale of collections of atoms forming crystal defects, and macroscopic scale. Characterizing connections between different classes of models is a critical task for gaining insight into the physics they describe, a long-standing objective in applied analysis, and also highly relevant in engineering applications. The key concept I use in each problem addressed in this thesis is coarse graining, which is a strategy for connecting fine representations or models with coarser representations. Often this idea is invoked to reduce a large discrete system to an appropriate continuum description, e.g. individual particles are represented by a continuous density. While there is no general theory of coarse graining, one closely related mathematical approach is asymptotic analysis, i.e. the description of limiting behavior as some parameter becomes very large or very small. In the case of crystalline solids, it is natural to consider cases where the number of particles is large or where the lattice spacing is small. Limits such as these often make explicit the nature of links between models capturing different scales, and, once established, provide a means of improving our understanding, or the models themselves. Finding appropriate variables whose limits illustrate the important connections between models is no easy task, however. This is one area where computer simulation is extremely helpful, as it allows us to see the results of complex dynamics and gather clues regarding the roles of different physical quantities. On the other hand, connections between models enable the development of novel multiscale computational schemes, so understanding can assist computation and vice versa. Some of these ideas are demonstrated in this thesis. 
The important outcomes of this thesis include: (1) a systematic derivation of the step-flow model of Burton, Cabrera, and Frank, with corrections, from an atomistic solid-on-solid-type model in 1+1 dimensions; (2) the inclusion of an atomistically motivated transport mechanism in an island dynamics model, allowing for a more detailed account of mound evolution; and (3) the development of a hybrid discrete-continuum scheme for simulating the relaxation of a faceted crystal mound. Central to all of these modeling and simulation efforts is the presence of steps composed of individual layers of atoms on vicinal crystal surfaces. Consequently, a recurring theme in this research is the observation that mesoscale defects play a crucial role in crystal morphological evolution.
Abstract:
This work focuses on the creation and application of dynamic simulation software to study the hard metal (WC-Co) structure. The simulations exploit the increased hardware capacity of a GeForce 9600 GT GPU together with the PhysX engine created to make games more realistic. The software simulates the three-dimensional carbide structure inside a cubic box, where the tungsten carbide (WC) grains are modeled as triangular prisms and truncated triangular prisms. The program proved effective in validation tests, ranging from calculations of parameter measures to its capacity to increase the number of dynamically simulated particles. It was possible to investigate both the mean values and the distributions of the stereological parameters used to characterize the carbide structure through cutting planes. From the cutting planes of the analyzed structures, we investigated the linear intercepts, the area intercepts, and the perimeters of the sections of the intercepted grains, as well as the binder phase of the structure, by calculating the mean value and distribution of the free path. Since the literature shows almost consensually that the distribution of linear intercepts is lognormal, which suggests that the grain size distribution is also lognormal, a routine was added to the program that made a more detailed investigation of this issue possible. We observed that, for certain values of the parameters defining the shape and size of the prismatic grains, the resulting linear intercept distributions approach the lognormal shape. Across the simulations performed, the distribution curves of the linear and area intercepts, as well as of the section perimeters, are consistent with static computer simulation studies of these parameters.
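As a point of comparison for such intercept statistics, the chord-length distribution for the simplest grain shape — a sphere — is easy to sample, and its mean is known in closed form (mean chord = 4R/3, from the 4V/S rule). A short Monte Carlo sketch, using spheres rather than the prismatic grains of this work:

```python
import numpy as np

def sphere_chords(n, radius=1.0, seed=0):
    """Sample chord lengths of isotropic uniform random lines through a sphere."""
    rng = np.random.default_rng(seed)
    # For IUR lines, the hit point on the projected disk is uniform in area,
    # so the impact parameter d satisfies d = R * sqrt(u) with u ~ U(0, 1).
    d = radius * np.sqrt(rng.random(n))
    return 2.0 * np.sqrt(radius**2 - d**2)

chords = sphere_chords(200_000)
chords.mean()   # close to 4R/3 ≈ 1.333 for R = 1
```

For a single sphere the chord density rises linearly in the chord length, which is quite unlike a lognormal; near-lognormal intercept distributions arise from the distribution of grain sizes and shapes, as the simulations above explore.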
Abstract:
The assessment of building thermal performance is often carried out using HVAC energy consumption data, when available, or measurements of thermal comfort variables for free-running buildings. Both types of data can be obtained by monitoring or by computer simulation. The assessment based on thermal comfort variables is the more complex one because it depends on determining the thermal comfort zone. For these reasons, this master's thesis explores methods of building thermal performance assessment using thermal comfort variables simulated with the DesignBuilder software. The main objective is to contribute to the development of methods to support architectural decisions during the design process, as well as energy and sustainability rating systems. The research method consists of selecting thermal comfort methods and modeling them in spreadsheets, with output charts developed to streamline the analyses, which are then used to assess the simulation results of low-cost house configurations. The house models comprise a base case, already built, and variants with changes in thermal transmittance, absorptance, and shading. The simulation results are assessed with each thermal comfort method in order to identify their sensitivity. The final results show the limitations of the methods, the importance of a method that considers thermal radiation and wind speed, and the contribution of the proposed chart.
Abstract:
The building envelope is the principal means of interaction between the indoor and outdoor environments, with a direct influence on the thermal and energy performance of the building. By intervening in the envelope with specific architectural elements, it is possible to promote passive conditioning strategies such as natural ventilation. Cross ventilation is recommended by NBR 15220-3 as the main bioclimatic strategy for the hot and humid climate of Natal/RN, offering, among other benefits, thermal comfort for occupants. The analysis tools for natural ventilation, in turn, cover a variety of techniques, from simplified calculation methods to computational fluid dynamics, whose limitations are discussed in several papers but without detailing the problems encountered. In this sense, the present study aims to evaluate the potential of wind catchers — envelope elements used to increase natural ventilation in buildings — through simplified CFD (computational fluid dynamics) simulation, and also seeks to quantify the limitations encountered during the analysis. The procedure adopted to evaluate the implementation and efficiency of these elements was CFD simulation with the DesignBuilder CFD software. A base case was defined, to which wind catchers with various settings were added, so that they could be compared with each other and the differences in air flows and air speeds appreciated. Initially, sensitivity tests were run to gain familiarity with the software and to observe simulation patterns, recording the settings used and the simulation time for each case. The results show the limitations encountered during the simulation process, as well as an overview of the efficiency and potential of wind catchers: increased ventilation with the use of catchers, differences in air flow patterns and significant increases in indoor air speeds, besides changes due to the different element geometries.
It is considered that the software used can help designers during preliminary analyses in the early stages of design.
Abstract:
Toe-to-heel air injection (THAI™) is an enhanced oil recovery process that integrates in-situ combustion with technological advances in the drilling of horizontal wells: horizontal wells produce the oil, while vertical injection wells inject the air. This process has not yet been applied in Brazil, making it necessary to evaluate the technology under local conditions. This study therefore performed a parametric study of the in-situ combustion process with oil production through horizontal wells, using a semi-synthetic reservoir with characteristics of a Brazilian Northeast basin. The simulations were performed in the commercial simulator STARS (Steam, Thermal, and Advanced Processes Reservoir Simulator) from CMG (Computer Modelling Group). The following operating parameters were analyzed: air rate, producer well configuration and oxygen concentration. A sensitivity study on cumulative oil production (Np) was performed with the technique of experimental design, using a mixed two- and three-level model (3² × 2²), for a total of 36 runs. A technical-economic estimate was also made for each fluid model. The results showed that the air injection rate was the most influential parameter on oil recovery for both models studied, that the best well arrangement depends on the fluid model, and that a higher oxygen concentration favors oil recovery. The process can be profitable, depending on the air rate.
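The run matrix of such a mixed-level design can be generated as a full Cartesian product of the factor levels. In the sketch below the factor names and level values are placeholders (the abstract does not state which factors were varied at three levels), but the 3² × 2² structure yields the same 36 runs:

```python
from itertools import product

# Placeholder factors and levels — illustrative only, not the study's values.
factors = {
    "air_rate":         ["low", "mid", "high"],    # 3 levels
    "o2_concentration": ["21%", "30%", "40%"],     # 3 levels
    "producer_config":  ["layout_A", "layout_B"],  # 2 levels
    "fluid_model":      ["fluid_1", "fluid_2"],    # 2 levels
}
# One dict per run, covering every combination of levels.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
len(runs)   # 3 * 3 * 2 * 2 = 36
```

Each dict in `runs` is one simulation case, which is how a full factorial design guarantees that every level combination is exercised exactly once.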
Abstract:
Artist David Lyons and computer scientist David Flatla work collaboratively to create art that intentionally targets audiences of varying visual abilities, mediated through smart-device interfaces. Conceived as an investigation into theories and practices of visual perception, they explore the idea that artwork can be intentionally created to be experienced differently depending on one's visual abilities. They have created motion graphics and supporting recolouring and colour vision deficiency (CVD) simulation software. Some of the motion graphics communicate details specifically to those with colour blindness/CVD by containing moving imagery seen only by those with CVD. Others contain moving images that those with typical colour vision can experience but that appear unchanging to people with CVD. All the artwork is revealed for both audiences through the use of specially programmed smart devices fitted with augmented-reality recolouring and CVD simulation software. The visual elements come from various sources, including the Ishihara Colour Blind Test, movie marquees, and game shows. The software created reflects the perceptual capabilities of most individuals with reduced colour vision. The development of the simulation software and the motion graphic series are examined and discussed from both computer science and artistic positions.
Abstract:
Background: Intensified selection of polled individuals has recently gained importance in predominantly horned dairy cattle breeds as an alternative to routine dehorning. The status quo of the current polled breeding pool of genetically-closely related artificial insemination sires with lower breeding values for performance traits raises questions regarding the effects of intensified selection based on this founder pool. Methods: We developed a stochastic simulation framework that combines the stochastic simulation software QMSim and a self-designed R program named QUALsim that acts as an external extension. Two traits were simulated in a dairy cattle population for 25 generations: one quantitative (QMSim) and one qualitative trait with Mendelian inheritance (i.e. polledness, QUALsim). The assignment scheme for qualitative trait genotypes initiated realistic initial breeding situations regarding allele frequencies, true breeding values for the quantitative trait and genetic relatedness. Intensified selection for polled cattle was achieved using an approach that weights estimated breeding values in the animal best linear unbiased prediction model for the quantitative trait depending on genotypes or phenotypes for the polled trait with a user-defined weighting factor. Results: Selection response for the polled trait was highest in the selection scheme based on genotypes. Selection based on phenotypes led to significantly lower allele frequencies for polled. The male selection path played a significantly greater role for a fast dissemination of polled alleles compared to female selection strategies. Fixation of the polled allele implies selection based on polled genotypes among males. In comparison to a base breeding scenario that does not take polledness into account, intensive selection for polled substantially reduced genetic gain for this quantitative trait after 25 generations. 
Reducing selection intensity for polled males while maintaining strong selection intensity among females, simultaneously decreased losses in genetic gain and achieved a final allele frequency of 0.93 for polled. Conclusions: A fast transition to a completely polled population through intensified selection for polled was in contradiction to the preservation of high genetic gain for the quantitative trait. Selection on male polled genotypes with moderate weighting, and selection on female polled phenotypes with high weighting, could be a suitable compromise regarding all important breeding aspects.
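One simple way to realize the genotype-dependent weighting described above is an additive selection index: rank candidates on EBV plus a weighted polled-genotype score. The study applies the weighting inside the animal BLUP model itself; this standalone sketch, with a hypothetical weighting factor and made-up animals, only illustrates the trade-off:

```python
# Genotype score: 2 = PP (homozygous polled), 1 = Pp, 0 = pp (horned).
def select_sires(candidates, weight, n_selected):
    """Truncation selection on EBV + weight * polled score.

    candidates: list of (animal_id, ebv, polled_score) tuples.
    """
    ranked = sorted(candidates, key=lambda a: a[1] + weight * a[2],
                    reverse=True)
    return [a[0] for a in ranked[:n_selected]]

bulls = [("horned_elite", 3.0, 0), ("polled_PP", 1.0, 2), ("polled_Pp", 0.5, 1)]
select_sires(bulls, weight=0.0, n_selected=2)  # ['horned_elite', 'polled_PP']
select_sires(bulls, weight=5.0, n_selected=2)  # ['polled_PP', 'polled_Pp']
```

The trade-off reported in the abstract falls out directly: a larger weighting factor shifts selection toward polled carriers at the cost of EBV-based genetic gain.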
Abstract:
In solid rocket motors, the absence of combustion controllability and the large financial resources involved in full-scale firing tests increase the importance of numerical simulations, both to assess stringent mission thrust requirements and to evaluate the influence of thrust-chamber phenomena affecting grain combustion. Among those phenomena, local grain defects (propellant casting inclusions and debondings), combustion heat accumulation involving pressure peaks (the Friedman Curl effect), and the ablation of the case-insulating thermal protection material affect thrust prediction, producing non-negligible deviations from the nominal expected trace. Most recent models have treated the problem in a simplified way, using empirical corrective functions, with the disadvantages of not fully capturing the physical dynamics and thus not yielding predictive results for different solid rocket motor configurations under varied boundary conditions. This work introduces different mathematical approaches to model, analyze, and predict the abovementioned phenomena, presenting a detailed physical interpretation based on existing SRM configurations. Internal ballistics predictions are obtained with in-house simulation software, where the adoption of a dynamic three-dimensional triangular mesh, together with advanced computer graphics methods, allows this target to be reached. The numerical procedures are explained in detail, and simulation results are presented and discussed in light of experimental data.