831 results for computational complexity
Abstract:
Traditional econometric approaches to modeling the dynamics of equity and commodity markets have made great progress in the past decades. However, they assume rationality among economic agents and do not capture the dynamics that produce extreme events (black swans), which arise from deviations from the rationality assumption. The purpose of this study is to simulate the dynamics of silver markets using the novel computational market dynamics approach. To this end, daily closing prices of spot silver from 1 March 2000 to 1 March 2013 have been simulated with the Jabłonska-Capasso-Morale (JCM) model. The maximum likelihood approach has been employed to calibrate the model to the acquired data. Statistical analysis of the simulated series with respect to the actual one has been conducted to evaluate model performance. The model captures well the animal-spirits dynamics present in the data under evaluation.
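The abstract does not reproduce the JCM equations, so as a minimal illustrative sketch, the snippet below calibrates a generic mean-reverting drift-diffusion stand-in to a price series by maximum likelihood with a Gaussian transition density; the model form, parameter names, and synthetic data are assumptions, not the study's.

```python
# Minimal maximum-likelihood calibration sketch for a discretized
# price model (a generic stand-in, NOT the actual JCM equations).
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, prices, dt=1.0):
    """Gaussian transition likelihood for dP = theta*(mu - P)*dt + sigma*dW."""
    theta, mu, sigma = params
    if sigma <= 0:
        return np.inf
    p0, p1 = prices[:-1], prices[1:]
    mean = p0 + theta * (mu - p0) * dt      # one-step conditional mean
    var = sigma ** 2 * dt                   # one-step conditional variance
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (p1 - mean) ** 2 / var)

# Synthetic stand-in for the daily spot silver closing price series.
prices = 20.0 + np.cumsum(np.random.default_rng(0).normal(0.0, 0.1, 500))
fit = minimize(neg_log_likelihood, x0=[0.1, 20.0, 0.5],
               args=(prices,), method="Nelder-Mead")
theta_hat, mu_hat, sigma_hat = fit.x
```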
Abstract:
The Kalman filter is a powerful recursive mathematical tool that plays an increasingly vital role in innumerable fields of study. The filter has been put to service in a multitude of studies involving time series modelling, including financial time series. Modelling time series data in Computational Market Dynamics (CMD) can be accomplished using the Jablonska-Capasso-Morale (JCM) model, whose parameters have traditionally been estimated with the maximum likelihood approach. The purpose of this study is to discover whether the Kalman filter can be effectively utilized in CMD. An ensemble Kalman filter (EnKF) with 50 ensemble members, applied to US sugar prices spanning January 1960 to February 2012, was employed for this work. The real data and the Kalman filter trajectories showed no significant discrepancies, indicating satisfactory performance of the technique. Since only US sugar prices were utilized, it would be interesting to discover the nature of the results if other data sets were employed.
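For readers unfamiliar with the technique, here is a minimal scalar EnKF sketch with 50 ensemble members, matching the ensemble size mentioned above; the forecast model is a placeholder random walk rather than the JCM dynamics, and all numeric values are illustrative.

```python
# Minimal ensemble Kalman filter (EnKF) sketch for a scalar state.
# The forecast model is a placeholder random walk, not JCM dynamics.
import numpy as np

rng = np.random.default_rng(1)
n_members = 50
obs_var = 0.5                                  # assumed observation-error variance
ensemble = rng.normal(10.0, 1.0, n_members)    # initial ensemble

def enkf_step(ensemble, observation):
    # Forecast: propagate each member (placeholder random-walk model).
    forecast = ensemble + rng.normal(0.0, 0.2, ensemble.size)
    # Analysis: scalar Kalman gain from the ensemble forecast variance.
    p_f = np.var(forecast, ddof=1)
    gain = p_f / (p_f + obs_var)
    # Perturbed observations keep the analysis spread statistically consistent.
    perturbed_obs = observation + rng.normal(0.0, np.sqrt(obs_var), forecast.size)
    return forecast + gain * (perturbed_obs - forecast)

observations = [10.2, 10.5, 10.1, 9.8]         # hypothetical price data
for y in observations:
    ensemble = enkf_step(ensemble, y)
print(ensemble.mean())                          # filtered state estimate
```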
Abstract:
A support ring of AISI 304L stainless steel that holds vertical, parallel wires arranged in a circle forming a cylinder is studied. The wires are attached to the ring by heat-induced shrinkage. When the ring is heated with a torch, the heat-affected zone tries to expand while the adjacent cool structure obstructs the expansion, causing upsetting. During cooling, the ring shrinks to smaller than its original size, clamping the wires. The most important requirement for the ring is that it should be as round as possible, and the deformations should occur as overall shrinkage of the ring diameter. A three-dimensional nonlinear transient sequential thermo-structural Abaqus model is used together with a Fortran code that applies the heat flux to each affected element. The local and overall deformations inflicted on one ring by the heating are studied, with limited inspection of residual stresses. A variety of cases are studied with the constructed model to provide directional knowledge: torch flux by means of heating speed, location of the wires, heating location, and structural factors. Decreasing the heating speed increases the heat input, which raises the temperature and thereby increases shrinkage. In a single progressive heating pass, shrinkage is unevenly distributed in the start/end region; this can be partially corrected by using higher-speed passes to reinforce the heating of that region. The location of the wires greatly affects the resulting shrinkage, unlike the heating location. The ring structure also greatly affects shrinkage: a smaller diameter, a greater ring height, a thinner wall, and a greater number of wires all increase shrinkage.
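The thesis's Fortran routine is not shown; purely as an illustrative sketch, a moving Gaussian surface heat-flux model of the kind such a routine might evaluate per element can be written as below. All parameter values (power, efficiency, radius, speed) are assumptions, not values from the thesis.

```python
# Sketch of a moving Gaussian surface heat flux for a torch travelling
# along the x axis; the kind of model a per-element flux routine evaluates.
import numpy as np

def torch_flux(x, y, t, power=1500.0, efficiency=0.7, radius=0.01, speed=0.005):
    """Heat flux [W/m^2] at point (x, y) [m] and time t [s]."""
    xc = speed * t                       # torch centre position at time t
    r2 = (x - xc) ** 2 + y ** 2          # squared distance from torch centre
    # Gaussian distribution whose surface integral equals efficiency * power.
    peak = 2.0 * efficiency * power / (np.pi * radius ** 2)
    return peak * np.exp(-2.0 * r2 / radius ** 2)

# Flux seen by a point 5 mm off the torch path, 10 s into the pass.
print(torch_flux(0.05, 0.005, 10.0))
```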
Abstract:
In the field of molecular biology, scientists adopted for decades a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. However, integrative thinking was still applied at a smaller scale in molecular biology to understand the underlying processes of cellular behaviour for at least half a century. It was not until the genomic revolution at the end of the previous century that we required model building to account for systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our capability to predict cellular behaviour in a way that reflects system dynamics and system structures. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology brings about a high volume of data, whose comprehension we cannot even aim for in the absence of computational support. Computational modelling hence bridges modern biology to computer science, enabling a number of assets that prove invaluable in the analysis of complex biological systems, such as rigorous characterization of system structure, simulation techniques, and perturbation analysis. Computational biomodels have grown considerably in size in recent years, with major contributions towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating in whole-cell models, tissue-level models, organ models, and full-scale patient models. The simulation and analysis of models of such complexity very often requires the integration of various sub-models, entwined at different levels of resolution, whose organization spans several levels of hierarchy. This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology. The thesis proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model that is validated against a set of existing data. Subsequently, the model is refined to include more details regarding its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and the second for the ErbB signalling pathway. The thesis spans several formalisms used in computational systems biology, all inherently quantitative: reaction-network models, rule-based models, and Petri net models, as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeler aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
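As a toy illustration of the quantitative refinement idea (not the thesis's actual framework), the snippet below splits one species into two subtypes and shows that choosing equal rate constants preserves the aggregate dynamics of the original model; all values are invented.

```python
# Toy quantitative refinement: species A is split into subtypes A1, A2.
# With equal rate constants, the refined model reproduces the original
# model's aggregate trajectory exactly (the "fit-preserving" property).
k, a0, dt, steps = 0.3, 1.0, 0.01, 500

a = a0                      # original model: A -> B at mass-action rate k*A
a1, a2 = 0.6 * a0, 0.4 * a0  # refined model: A1 -> B1, A2 -> B2, same k

for _ in range(steps):      # simple explicit-Euler integration
    a  += -k * a * dt
    a1 += -k * a1 * dt
    a2 += -k * a2 * dt

print(abs(a - (a1 + a2)) < 1e-9)   # True: refinement preserves the fit
```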
Abstract:
Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance on FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked in the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.
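As a rough sketch of the filtering pipeline for the radial (order-zero) case only, the snippet below expands a 1-D profile in a Fourier-Bessel series, band-passes the coefficients, and reconstructs the profile. The band indices and toy profile are illustrative; the study's actual analysis was two-dimensional (radial and angular).

```python
# Order-0 Fourier-Bessel band-pass sketch on a 1-D radial profile
# (illustrative only; the study filtered 2-D face images).
import numpy as np
from scipy.special import jn_zeros, jv

R, n_terms = 1.0, 40
r = np.linspace(0.0, R, 512)
dr = r[1] - r[0]
profile = np.exp(-((r - 0.4) / 0.1) ** 2)      # toy radial profile

alphas = jn_zeros(0, n_terms)                  # first zeros of J0
basis = jv(0, np.outer(alphas, r) / R)         # basis[n] = J0(alpha_n * r / R)
norms = (R ** 2 / 2.0) * jv(1, alphas) ** 2    # orthogonality constants
coeffs = (basis @ (profile * r)) * dr / norms  # forward FB coefficients

keep = (np.arange(n_terms) >= 10) & (np.arange(n_terms) < 16)  # band-pass mask
filtered = (coeffs * keep) @ basis             # inverse FB of the kept terms
```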
Abstract:
Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of a high-energy diet. Observation of core temperature records in humans and rodents permits the identification of time-series characteristics, such as autoregressive structure and stationarity, that are well suited to stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, whose concepts match those of the physiological systems involved, applied to male HFD rats compared with age-matched male controls on standard food intake (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis was affected by the new diet. The autoregressive time series (AR) model was used to predict the occurrence of thermal homeostasis, and it proved very effective in distinguishing this physiological disorder. Thus, we infer from the results of our study that the maximum entropy distribution, as a means of stochastic characterization of temperature time series, may be established as an important early tool to aid in the diagnosis and prevention of metabolic diseases, given its ability to detect small variations in the thermal profile.
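A minimal sketch of the core step, fitting an AR(p) model to a temperature series by least squares, is shown below; the order, data, and fitting route are illustrative assumptions rather than the study's exact procedure.

```python
# Minimal least-squares AR(p) fit to a core-temperature series.
import numpy as np

def fit_ar(series, order):
    """Return AR coefficients a_1..a_p for x_t = sum_i a_i * x_{t-i} + e_t."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()                       # work with deviations from the mean
    rows = [x[i:len(x) - order + i] for i in range(order)]
    X = np.column_stack(rows[::-1])        # lagged regressors, lag 1 first
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Synthetic stand-in for a recorded core-temperature register.
temps = (37.0 + 0.3 * np.sin(np.linspace(0, 20, 400))
         + np.random.default_rng(2).normal(0, 0.05, 400))
print(fit_ar(temps, order=3))
```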
Abstract:
Bearing performance significantly affects the dynamic behavior and estimated working life of a rotating system. A common bearing type is the ball bearing, which has been under investigation in numerous published studies. The complexity of the ball bearing models described in the literature varies. Naturally, model complexity is related to computational burden. In particular, the inclusion of centrifugal forces and gyroscopic moments significantly increases the system degrees of freedom and lengthens solution time. On the other hand, for low or moderate rotating speeds, these effects can be neglected without significant loss of accuracy. The objective of this paper is to present guidelines for the appropriate selection of a suitable bearing model for three case studies. To this end, two ball bearing models were implemented: one considers high-speed forces, and the other neglects them. Both models were used to study the three structures, and the simulation results were compared.
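A back-of-envelope check below illustrates why the high-speed effects can be negligible at moderate speeds: the centrifugal force on a single ball, computed from the standard kinematic cage-speed relation, is only a few newtons at 3000 rpm for the assumed geometry. The geometry and speed are illustrative, not values from the paper.

```python
# Centrifugal force on one ball of a deep-groove ball bearing,
# using the kinematic cage speed for inner-ring rotation (zero contact angle).
import numpy as np

ball_dia = 0.012        # ball diameter [m]        (illustrative)
pitch_dia = 0.065       # bearing pitch diameter [m]
density = 7800.0        # bearing steel [kg/m^3]
shaft_rpm = 3000.0

ball_mass = density * (np.pi / 6.0) * ball_dia ** 3
omega_i = shaft_rpm * 2.0 * np.pi / 60.0
omega_c = omega_i * (1.0 - ball_dia / pitch_dia) / 2.0   # cage angular speed
f_centrifugal = ball_mass * omega_c ** 2 * (pitch_dia / 2.0)
print(f"{f_centrifugal:.2f} N")   # about 4 N: small next to typical contact loads
```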
Abstract:
In-package pasteurization is the most widely used method for the microbiological stabilization of beer. The search for safer, better-quality food has created a need to better understand the processes involved in producing it. However, little is known about the temperature and velocity profiles during the thermal processing of liquid foods in commercial packaging, which results in over-dimensioned processes to guarantee safety, degrading the sensorial and nutritional characteristics of the product and increasing process costs. Simulations using Computational Fluid Dynamics (CFD) have been used by various authors to evaluate these processes. The objective of the present paper was to evaluate the effect of packaging orientation on the pasteurization of beer in a commercial aluminum can using CFD. A heating process was simulated at 60 ºC up to 15 PUs (a conventional beer process, in which 1 Pasteurization Unit (PU) is equivalent to 1 minute at 60 ºC). The temperature profile, the convection current velocity, and the accumulation of PUs over time were evaluated for cans in the conventional, inverted, and horizontal positions. The temperature and velocity profiles were similar to those presented in the literature. The package position did not result in process improvement.
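A sketch of how PUs accumulate over a cold-spot temperature history is shown below, using the commonly cited beer lethality relation PU = t * 1.393^(T - 60), which is consistent with the 1-minute-at-60 ºC definition above; the temperature history itself is invented for illustration.

```python
# Pasteurization-unit (PU) accumulation using the common beer lethality
# relation PU = t * 1.393**(T - 60); 1 PU = 1 minute at 60 degrees C.
def pasteurization_units(temps_c, dt_min=1.0):
    """Accumulate PUs over a temperature history sampled every dt_min minutes."""
    return sum(dt_min * 1.393 ** (t - 60.0) for t in temps_c)

# Hypothetical cold-spot history during heating: PUs only accrue near 60 C.
history = [20, 35, 48, 55, 58, 59.5, 60, 60, 60, 60, 60, 60, 60, 60, 60]
print(f"{pasteurization_units(history):.1f} PU")   # roughly 10.6 PU here
```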
Abstract:
Food processes must ensure safe, high-quality products for a growing consumer demand, creating the need for better knowledge of their unit operations. Computational Fluid Dynamics (CFD) has been widely used to better understand food thermal processes, which are among the safest and most frequently used methods of food preservation. However, no study in the literature describes the thermal processing of liquid foods in a brick-shaped package. The present study evaluated such a process and the influence of package orientation on process lethality. It demonstrated the potential of using CFD to evaluate thermal processes of liquid foods and the importance of rheological characterization and convection in the thermal processing of liquid foods. It also showed that packaging orientation does not result in different sterilization values during the thermal processing of the evaluated fluids in the brick-shaped package.
Abstract:
Gravitational phase separation is a common unit operation found in most large-scale chemical processes. The need for phase separation can arise, e.g., from product purification or the protection of downstream equipment. In gravitational phase separation, the phases separate without the application of an external force. This is achieved in vessels where the flow velocity is lowered substantially compared to pipe flow. If the velocity is low enough, the denser phase settles towards the bottom of the vessel while the lighter phase rises. To find optimal configurations for gravitational phase separator vessels, several different geometrical and internal design features were evaluated based on simulations using the OpenFOAM computational fluid dynamics (CFD) software. The studied features included inlet distributors, vessel dimensions, demister configurations, and gas phase outlet configurations. Simulations were conducted as single-phase steady-state calculations. For comparison, additional simulations were performed as dynamic single- and two-phase calculations. The steady-state single-phase calculations provided indications of preferred configurations for most of the above-mentioned features. The results of the dynamic simulations supported the utilization of the computationally faster steady-state model as a practical engineering tool. However, the two-phase model provides more realistic results, especially for flows where a single phase does not determine the flow characteristics.
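The sizing logic can be illustrated with a Stokes-law estimate: a droplet separates only if its terminal settling velocity is high enough for the residence time the vessel provides. The sketch below uses illustrative fluid properties, not values from the study.

```python
# Stokes-law terminal settling velocity: the basic sizing estimate
# behind gravity separators (valid only at low droplet Reynolds number).
def stokes_velocity(d, rho_p, rho_f, mu, g=9.81):
    """Terminal settling velocity [m/s] of a small sphere of diameter d [m]."""
    return g * d ** 2 * (rho_p - rho_f) / (18.0 * mu)

# 100-micron water droplet settling through an oil phase (illustrative values).
v = stokes_velocity(d=100e-6, rho_p=1000.0, rho_f=850.0, mu=0.01)
print(f"{v:.2e} m/s")   # ~8e-5 m/s, i.e. ~0.3 m/h: hence long residence times
```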
Abstract:
The last two decades have provided a vast opportunity to live in and explore compelling imaginary worlds, or virtual worlds, through massively multiplayer online role-playing games (MMORPGs). MMORPGs give their users a wide range of opportunities to participate with multiple players on the same platform, to communicate, and to perform real-time actions. These games have a virtual economy that is largely player-driven. In-game currency allows users to build up their avatars and to buy or sell the goods necessary to play and survive in the game. Focusing on the virtual economy generated through EVE Online, this thesis mainly examines how the prices of minerals in EVE Online behave, by applying the Jabłonska-Capasso-Morale (JCM) mathematical simulation model. The aim is to verify to what degree the model can reproduce virtual economy behavior. The model is applied to the buy and sell prices of two minerals, isogen and morphite. The simulation results demonstrate that the JCM model fits the mineral prices reasonably well, which lets us conclude that virtual economies behave similarly to real ones.
Abstract:
Complexity and constructivism in economics. This paper attempts to present and summarize the concepts of rules, order, and complexity introduced around the mid-twentieth century by Friedrich August von Hayek. It also attempts to draw a contemporary parallel between those concepts and the field of complexity economics. At the time of his writings, the author sought to present arguments against Cartesian rationality. Nowadays, the concepts he presented could also serve as arguments against the mode of thought used in mainstream microeconomics. A debate can now be seen between mainstream microeconomics and the authors applying complexity theory to the economy, which can be understood as explanations guided by generic assumptions versus naturalistic explanations grounded in a computational approach.
Abstract:
The effects of a complexly worded counterattitudinal appeal on laypeople's attitudes toward a legal issue were examined, using the Elaboration Likelihood Model (ELM) of persuasion as a theoretical framework. This model states that persuasion can result from the elaboration and scrutiny of the message arguments (i.e., central route processing), or can result from less cognitively effortful strategies, such as relying on source characteristics as a cue to message validity (i.e., peripheral route processing). One hundred and sixty-seven undergraduates (85 men and 81 women) listened to either a low-status or high-status source deliver a counterattitudinal speech on a legal issue. The speech was designed to contain strong or weak arguments. These arguments were worded in a simple and, therefore, easy-to-comprehend manner, or in a complex and, therefore, difficult-to-comprehend manner. Thus, there were three experimental manipulations: argument comprehensibility (easy to comprehend vs. difficult to comprehend), argument strength (weak vs. strong), and source status (low vs. high). After listening to the speech, participants completed a measure of their attitude toward the legal issue, a thought listing task, an argument recall task, manipulation checks, measures of motivation to process the message, and measures of mood. As a result of the failure of the argument strength manipulation, only the effects of the comprehensibility and source status manipulations were tested. There was, however, some evidence of more central route processing in the easy comprehension condition than in the difficult comprehension condition, as predicted. Significant correlations were found between attitude and both favourable and unfavourable thoughts about the legal issue with easy-to-comprehend arguments; whereas there was a correlation only between attitude and favourable thoughts toward the issue with difficult-to-comprehend arguments, suggesting, perhaps, that central route processing, which involves argument scrutiny and elaboration, occurred under conditions of easy comprehension to a greater extent than under conditions of difficult comprehension. The results also revealed, among other findings, several significant effects of gender. Men had more favourable attitudes toward the legal issue than did women, men recalled more arguments from the speech than did women, men were less frustrated while listening to the speech than were women, and men put more effort into thinking about the message arguments than did women. When the arguments were difficult to comprehend, men had more favourable thoughts and fewer unfavourable thoughts about the legal issue than did women. Men and women may have had different affective responses to the issue of plea bargaining (with women responding more negatively than men), especially in light of a local and controversial plea bargain that occurred around the time of this study. Such pre-existing gender differences may have led to the lower frustration, the greater effort, the greater recall, and more positive attitudes for men than for women. Results from this study suggest that current cognitive models of persuasion may not be very applicable to controversial issues which elicit strong emotional responses. Finally, these data indicate that affective responses, the controversial and emotional nature of the issue, gender, and other individual differences are important considerations when experts are attempting to persuade laypeople toward a counterattitudinal position.
Abstract:
As the complexity of evolutionary design problems grows, so too must the quality of solutions scale to that complexity. In this research, we develop a genetic programming system with individuals encoded as tree-based generative representations to address scalability. This system is capable of multi-objective evaluation using a ranked-sum scoring strategy. We examine Hornby's features and measures of modularity, reuse, and hierarchy in evolutionary design problems. Experiments were carried out using the system to generate three-dimensional forms, and feature characteristics such as modularity, reuse, and hierarchy were analysed. This work expands on Hornby's by examining a new and more difficult problem domain. The results from these experiments show that individuals encoded with all three features performed best overall. It is also seen that the measures of complexity conform to Hornby's results. Moving forward with only this best-performing encoding, the system was applied to the generation of three-dimensional external building architecture. One objective considered was passive solar performance, in which the system was challenged to generate forms that optimize exposure to the Sun. The results from these and other experiments satisfied the requirements, and the system was shown to scale well to the architectural problems studied.
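A common variant of ranked-sum scoring (the paper's exact formulation may differ) ranks the population on each objective separately and sums the ranks, as sketched below with illustrative objective values.

```python
# Ranked-sum multi-objective scoring: rank the population on each
# objective separately, then sum the ranks (lower total = better).
import numpy as np

def ranked_sum(scores):
    """scores: (n_individuals, n_objectives) array, higher value = better."""
    order = np.argsort(-scores, axis=0)        # best-first index order per objective
    ranks = np.empty_like(order)
    for j in range(scores.shape[1]):
        ranks[order[:, j], j] = np.arange(1, scores.shape[0] + 1)
    return ranks.sum(axis=1)

pop_scores = np.array([[0.9, 0.8],    # individual 0: strong on both objectives
                       [0.5, 0.2],    # individual 1: weak on objective 2
                       [0.1, 0.9]])   # individual 2: weak on objective 1
print(ranked_sum(pop_scores))          # -> [3 5 4]; individual 0 wins overall
```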
Abstract:
The use of theory to understand and facilitate catalytic enantioselective organic transformations involving copper and hydrobenzoin derivatives is reported. Section A details the use of theory to predict, facilitate, and understand a copper-promoted amino-oxygenation reaction reported by Chemler et al. Using Density Functional Theory (DFT), employing the hybrid B3LYP functional and a LanL2DZ/6-31G(d) basis set, the mechanistic details were studied on an N-tosyl-o-allylaniline and an α-methyl-γ-alkenyl sulfonamide substrate. The results suggest the N-C bond formation proceeds via a cis-aminocupration, and not through a radical-type mechanism. Additionally, the diastereoselection observed with the α-methyl-γ-alkenyl sulfonamide originates from the avoidance of unfavourable steric interactions between the methyl substituent and the N-protecting group. Section B details the computationally guided experimental investigation of two hydrobenzoin derivatives as ligands/catalysts, as well as the attempted synthesis of a third hydrobenzoin derivative. The bis-boronic acid derived from hydrobenzoin was successful as a Lewis acid catalyst in the Biginelli reaction and the Conia-ene reaction, but provided only racemic products. The chiral diol derived from hydrobenzoin successfully increased the rate of the addition of diethylzinc to benzaldehyde in the presence of titanium tetraisopropoxide; however, poor enantioinduction was observed. Notably, the observed reactivity was successfully predicted by theoretical calculations.