979 results for Distributed Simulation
Abstract:
A visual SLAM system has been implemented and optimised for real-time deployment on an AUV equipped with calibrated stereo cameras. The system incorporates a novel approach to landmark description in which landmarks are local submaps consisting of a cloud of 3D points and their associated SIFT/SURF descriptors. Landmarks are also sparsely distributed, which simplifies and accelerates data association and map updates. In addition to landmark-based localisation, the system utilises visual odometry to estimate the pose of the vehicle in 6 degrees of freedom by identifying temporal matches between consecutive local submaps and computing the motion. Both the extended Kalman filter and the unscented Kalman filter have been considered for filtering the observations. The output of the filter is also smoothed using the Rauch-Tung-Striebel (RTS) method to obtain a better alignment of the sequence of local submaps and to deliver a large-scale 3D acquisition of the surveyed area. Synthetic experiments have been performed using a simulation environment in which ray tracing is used to generate synthetic images for the stereo system.
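The filter-plus-smoother pipeline described above can be sketched in a few lines. The toy below runs a linear Kalman forward pass followed by Rauch-Tung-Striebel backward smoothing; it is a minimal 1-D constant-velocity example in NumPy, not the thesis's 6-DoF EKF/UKF, and all names are illustrative:

```python
import numpy as np

def ekf_rts_smooth(zs, F, H, Q, R, x0, P0):
    """Forward Kalman filter pass followed by Rauch-Tung-Striebel
    backward smoothing (linear toy case; an EKF/UKF would replace
    the predict/update steps with linearised or sigma-point ones)."""
    n = len(zs)
    xs_f, Ps_f, xs_p, Ps_p = [], [], [], []
    x, P = x0, P0
    for z in zs:
        # predict
        xp = F @ x
        Pp = F @ P @ F.T + Q
        # update with measurement z
        S = H @ Pp @ H.T + R
        K = Pp @ H.T @ np.linalg.inv(S)
        x = xp + K @ (z - H @ xp)
        P = (np.eye(len(x0)) - K @ H) @ Pp
        xs_p.append(xp); Ps_p.append(Pp)
        xs_f.append(x);  Ps_f.append(P)
    # backward RTS pass: correct each estimate with future information
    xs_s, Ps_s = [xs_f[-1]], [Ps_f[-1]]
    for k in range(n - 2, -1, -1):
        C = Ps_f[k] @ F.T @ np.linalg.inv(Ps_p[k + 1])
        xs_s.insert(0, xs_f[k] + C @ (xs_s[0] - xs_p[k + 1]))
        Ps_s.insert(0, Ps_f[k] + C @ (Ps_s[0] - Ps_p[k + 1]) @ C.T)
    return np.array(xs_f), np.array(xs_s)
```

The smoothed trajectory is what aligns the sequence of submaps: each pose estimate is revised once later observations are available.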
Abstract:
The activated sludge process - the main biological technology usually applied to wastewater treatment plants (WWTP) - directly depends on living beings (microorganisms), and therefore on the unforeseen changes they produce. Good plant operation could be achieved if the supervisory control system is able to react to changes and deviations in the system and can take the necessary actions to restore the system's performance. These decisions are often based both on physical, chemical and microbiological principles (suitable for modelling by conventional control algorithms) and on some knowledge (suitable for modelling by knowledge-based systems). But one of the key problems in knowledge-based control systems design is the development of an architecture able to efficiently manage the different elements of the process (integrated architecture), to learn from previous cases (specific experimental knowledge) and to acquire the domain knowledge (general expert knowledge). These problems increase when the process belongs to an ill-structured domain and is composed of several complex operational units. Therefore, an integrated and distributed AI architecture seems to be a good choice. This paper proposes an integrated and distributed supervisory multi-level architecture for the supervision of WWTP that overcomes some of the main troubles of classical control techniques and of knowledge-based systems applied to real-world systems.
Abstract:
Dreaming is a pure form of phenomenality, created by the brain untouched by external stimulation or behavioral activity, yet including a full range of phenomenal contents. Thus, it has been suggested that the dreaming brain could be used as a model system in a biological research program on consciousness (Revonsuo, 2006). In the present thesis, the philosophical view of biological realism is accepted, and thus, dreaming is considered as a natural biological phenomenon, explainable in naturalistic terms. The major theoretical contribution of the present thesis is that it explores dreaming from a multidisciplinary perspective, integrating information from various fields of science, such as dream research, consciousness research, evolutionary psychology, and cognitive neuroscience. Further, it places dreaming into a multilevel framework, and investigates the constitutive, etiological, and contextual explanations for dreaming. Currently, the only theory offering a full multilevel explanation for dreaming, that is, a theory including constitutive, etiological, and contextual level explanations, is the Threat Simulation Theory (TST) (Revonsuo, 2000a; 2000b). The empirical significance of the present thesis lies in the tests conducted to test this specific theory put forth to explain the form, content, and biological function of dreaming. The first step in the empirical testing of the TST was to define exact criteria for what counts as a 'threatening event' in dreams, and then to develop a detailed and reliable content analysis scale with which it is possible to empirically explore and quantify threatening events in dreams. The second step was to seek answers to the following questions derived from the TST: How frequent are threatening events in dreams? What kinds of qualities do these events have? How do threatening events in dreams relate to the most recently encoded or the most salient memory traces of threatening events experienced in waking life?
What are the effects of exposure to severe waking-life threat on dreams? The results reveal that threatening events are relatively frequent in dreams, and that the simulated threats are realistic. The most common threats involve aggression, are targeted mainly against the dream self, and include simulations of relevant and appropriate defensive actions. Further, real threat experiences activate the threat simulation system in a unique manner, and dream content is modulated by the activation of long-term episodic memory traces with the highest negative saliency. To sum up, most of the predictions of the TST tested in this thesis received considerable support. The TST presents a strong argument that explains the specific design of dreams as threat simulations. The TST also offers a plausible explanation for why dreaming would have been selected for: because dreaming interacted with the environment in such a way that it enhanced the fitness of ancestral humans. By referring to a single threat simulation mechanism it furthermore manages to explain a wide variety of dream content data that already exists in the literature, and to predict the overall statistical patterns of threat content in different samples of dreams. The TST and the empirical tests conducted to test the theory are a prime example of what a multidisciplinary approach to mental phenomena can accomplish. Thus far, dreaming seems to have always resided in the periphery of science, never regarded as worth studying by the mainstream. Nevertheless, when brought to the spotlight, the study of dreaming can greatly benefit from ideas in diverse branches of science. Vice versa, knowledge learned from the study of dreaming can be applied in various disciplines. The main contribution of the present thesis lies in putting dreaming back where it belongs, that is, into the spotlight at the crossroads of various disciplines.
Abstract:
There is an increasing reliance on computers to solve complex engineering problems. This is because computers, in addition to supporting the development and implementation of adequate and clear models, can especially minimize the financial support required. The ability of computers to perform complex calculations at high speed has enabled the creation of highly complex systems to model real-world phenomena. The complexity of fluid dynamics problems makes it difficult or impossible to solve the equations for an object in a flow exactly. Approximate solutions can be obtained by constructing and measuring prototypes placed in a flow, or by numerical simulation. Since the use of prototypes can be prohibitively time-consuming and expensive, many have turned to simulations to provide insight during the engineering process. In this case the simulation setup and parameters can be altered much more easily than in a real-world experiment. The objective of this research work is to develop numerical models for different suspensions (fiber suspensions, blood flow through microvessels and branching geometries, and magnetic fluids), as well as for fluid flow through porous media. The models have merit as scientific tools and also have practical application in industry. Most of the numerical simulations were performed with the commercial software Fluent, with user-defined functions added to apply a multiscale method and a magnetic field. The results from the simulation of fiber suspensions can elucidate the physics behind the break-up of a fiber floc, opening the possibility of developing a meaningful numerical model of fiber flow. The simulation of blood movement from an arteriole through a venule via a capillary showed that the model based on VOF can successfully predict the deformation and flow of RBCs in an arteriole. Furthermore, the result corresponds to the experimental observation that the RBC is deformed during the movement.
The concluding remarks provide a correct methodology and a mathematical and numerical framework for the simulation of blood flows in branching geometries. Analysis of the ferrofluid simulations indicates that the magnetic Soret effect can be even higher than the conventional one and that its strength depends on the strength of the magnetic field, as confirmed experimentally by Völker and Odenbach. It was also shown that when a magnetic field is perpendicular to the temperature gradient, there is an additional increase in heat transfer compared to the cases where the magnetic field is parallel to the temperature gradient. In addition, the statistical evaluation (Taguchi technique) of magnetic fluids showed that the temperature and the initial concentration of the magnetic phase exert the maximum and minimum contributions to thermodiffusion, respectively. In the simulation of flow through porous media, the dimensionless pressure drop was studied at different Reynolds numbers, based on pore permeability and interstitial fluid velocity. The results agreed well with the correlation of Macdonald et al. (1979) for the range of actual flow Reynolds numbers studied. Furthermore, the calculated dispersion coefficients in the cylinder geometry were found to be in agreement with those of Seymour and Callaghan.
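The Macdonald et al. (1979) correlation referred to above is an Ergun-type expression relating pressure gradient to superficial velocity, particle size and porosity. A sketch follows; the coefficients A = 180 and B = 1.8 are the commonly quoted values for smooth particles and are assumptions here, not values taken from this thesis:

```python
def macdonald_pressure_drop(v, d, eps, mu, rho, A=180.0, B=1.8):
    """Pressure gradient (Pa/m) through a packed bed from an
    Ergun-type correlation of the Macdonald et al. (1979) form:
    a viscous term linear in velocity plus an inertial term
    quadratic in velocity.
    v: superficial velocity, d: particle diameter, eps: porosity,
    mu: dynamic viscosity, rho: fluid density."""
    viscous  = A * mu * (1 - eps) ** 2 * v / (eps ** 3 * d ** 2)
    inertial = B * rho * (1 - eps) * v ** 2 / (eps ** 3 * d)
    return viscous + inertial
```

At low pore Reynolds numbers the viscous term dominates and the dimensionless pressure drop scales as 1/Re, which is the regime where such correlations are usually validated.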
Abstract:
Previous genetic studies have demonstrated that natal homing shapes the stock structure of marine turtle nesting populations. However, widespread sharing of common haplotypes based on short segments of the mitochondrial control region often limits resolution of the demographic connectivity of populations. Recent studies employing longer control region sequences to resolve haplotype sharing have focused on regional assessments of genetic structure and phylogeography. Here we synthesize available control region sequences for loggerhead turtles from the Mediterranean Sea, Atlantic, and western Indian Ocean basins. These data represent six of the nine globally significant regional management units (RMUs) for the species and include novel sequence data from Brazil, Cape Verde, South Africa and Oman. Genetic tests of differentiation among 42 rookeries represented by short sequences (380 bp haplotypes from 3,486 samples) and 40 rookeries represented by long sequences (~800 bp haplotypes from 3,434 samples) supported the distinction of the six RMUs analyzed as well as recognition of at least 18 demographically independent management units (MUs) with respect to female natal homing. A total of 59 haplotypes were resolved. These haplotypes belonged to two highly divergent global lineages, with haplogroup I represented primarily by CC-A1, CC-A4, and CC-A11 variants and haplogroup II represented by CC-A2 and derived variants. Geographic distribution patterns of haplogroup II haplotypes and the nested position of CC-A11.6 from Oman among the Atlantic haplotypes invoke recent colonization of the Indian Ocean from the Atlantic for both global lineages. The haplotypes we confirmed for western Indian Ocean RMUs allow reinterpretation of previous mixed stock analysis and further suggest that contemporary migratory connectivity between the Indian and Atlantic Oceans occurs on a broader scale than previously hypothesized. 
This study represents a valuable model for conducting comprehensive international cooperative data management and research in marine ecology.
Abstract:
A physical model for the simulation of x-ray emission spectra from samples irradiated with kilovolt electron beams is proposed. Inner shell ionization by electron impact is described by means of total cross sections evaluated from an optical-data model. A double differential cross section is proposed for bremsstrahlung emission, which reproduces the radiative stopping powers derived from the partial wave calculations of Kissel, Quarles and Pratt [At. Data Nucl. Data Tables 28, 381 (1983)]. These ionization and radiative cross sections have been introduced into a general-purpose Monte Carlo code, which performs simulation of coupled electron and photon transport for arbitrary materials. To improve the efficiency of the simulation, interaction forcing, a variance reduction technique, has been applied for both ionizing collisions and radiative events. The reliability of simulated x-ray spectra is analyzed by comparing simulation results with electron probe measurements.
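Interaction forcing, the variance-reduction technique mentioned above, can be illustrated with a minimal sketch: rare events are sampled with an artificially inflated cross section, and each scored event carries the compensating statistical weight 1/forcing so the estimator stays unbiased. This is an illustrative Monte Carlo toy, not the paper's actual code:

```python
import math
import random

def forced_interaction_estimate(sigma, length, forcing, n_tracks, seed=1):
    """Estimate the mean number of (rare) interactions per track of
    given length, with interaction probability per unit length sigma.
    Free paths are sampled from an exponential with the inflated
    cross section sigma * forcing; each forced event scores 1/forcing."""
    rng = random.Random(seed)
    sigma_f = sigma * forcing
    total = 0.0
    for _ in range(n_tracks):
        s = 0.0
        while True:
            s += -math.log(rng.random()) / sigma_f  # exponential free path
            if s > length:
                break
            total += 1.0 / forcing  # weighted score keeps the mean unbiased
    return total / n_tracks
```

With forcing = 1 this reduces to analog simulation; larger forcing factors produce many low-weight events per track, which cuts the variance of rare-event tallies such as characteristic x-ray emission.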
Abstract:
We present a general algorithm for the simulation of x-ray spectra emitted from targets of arbitrary composition bombarded with kilovolt electron beams. Electron and photon transport is simulated by means of the general-purpose Monte Carlo code PENELOPE, using the standard, detailed simulation scheme. Bremsstrahlung emission is described by using a recently proposed algorithm, in which the energy of emitted photons is sampled from numerical cross-section tables, while the angular distribution of the photons is represented by an analytical expression with parameters determined by fitting benchmark shape functions obtained from partial-wave calculations. Ionization of K and L shells by electron impact is accounted for by means of ionization cross sections calculated from the distorted-wave Born approximation. The relaxation of the excited atoms following the ionization of an inner shell, which proceeds through emission of characteristic x rays and Auger electrons, is simulated until all vacancies have migrated to M and outer shells. For comparison, measurements of x-ray emission spectra generated by 20 keV electrons impinging normally on multiple bulk targets of pure elements, which span the periodic system, have been performed using an electron microprobe. Simulation results are shown to be in close agreement with these measurements.
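Sampling photon energies from numerical cross-section tables, as described above, is typically done by inverting a tabulated cumulative distribution. The generic sketch below builds the CDF by trapezoidal integration and samples with bisection plus linear interpolation; the actual PENELOPE tables and sampling grids are not reproduced here:

```python
import bisect
import random

def build_sampler(energies, cross_sections):
    """Return an inverse-CDF sampler for photon energies given a
    numerical cross-section table (energies ascending)."""
    # cumulative integral of the table (trapezoidal rule)
    cdf = [0.0]
    for i in range(1, len(energies)):
        de = energies[i] - energies[i - 1]
        cdf.append(cdf[-1] + 0.5 * (cross_sections[i] + cross_sections[i - 1]) * de)
    total = cdf[-1]
    cdf = [c / total for c in cdf]  # normalise to a CDF

    def sample(rng=random):
        u = rng.random()
        i = bisect.bisect_right(cdf, u) - 1  # locate the table bin
        i = min(i, len(energies) - 2)
        # linear interpolation inside the bin
        frac = (u - cdf[i]) / (cdf[i + 1] - cdf[i])
        return energies[i] + frac * (energies[i + 1] - energies[i])

    return sample
```

Piecewise-linear inversion is exact for a piecewise-constant density and a good approximation for smooth tables; production codes refine this with denser grids near thresholds.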
Abstract:
The role of transport in the economy is twofold. As a sector of economic activity it contributes a share of national income. On the other hand, improvements in transport infrastructure create room for accelerated economic growth. As a means to support railways as a safe and environmentally friendly transportation mode, EU legislation has required the opening of domestic railway freight to competition from the beginning of 2007. The importance of railways as a mode of transport has been great in Finland, as a larger share of freight has been carried on rails than in Europe on average. In this thesis it is claimed that the efficiency of goods transport can be enhanced by service-specific investments. Furthermore, it is stressed that simulation can and should be used to evaluate the cost-efficiency of transport systems at the operational level, as well as to assess transportation infrastructure investments. In all the studied cases notable efficiency improvements were found. For example, in distribution, home delivery of groceries can be almost twice as cost-efficient as the current practice of visiting the store. The majority of the cases concentrated on railway freight. In timber transportation, the item with the largest annual transport volume in domestic railway freight in Finland, the transportation cost could be reduced most substantially. Also in international timber procurement, the utilization of railway wagons could be improved by combining complementary flows. The efficiency improvements also have positive environmental effects; a large part of road transit could be moved to rails annually. If the impacts of freight transport are included in the cost-benefit analysis of railway investments, the net benefits of the evaluated alternatives can increase by up to 50 %, avoiding a possible inbuilt bias in the assessment framework and thus increasing the efficiency of national investments in transportation infrastructure.
Transportation systems are a typical example of complex real-world systems that cannot be analysed realistically by analytical methods, whereas simulation allows the inclusion of dynamics and the level of detail required. Regarding simulation as a viable tool for assessing the efficiency of transportation systems also finds support in the international survey conducted among railway freight operators: operators use operations research methods widely for planning purposes, while simulation is applied only by the larger operators.
Abstract:
Species structure and composition in Mediterranean riparian forests are determined by hydrological features, longitudinal zonation, and riverbank topography. This study assesses the distribution of four native riparian plants along the riverbank topographic gradient in three river stretches in southern Spain, with special emphasis on the occupation of adult and young feet of each species. The studied stretches suffered minimal human disturbances, displayed semi-arid conditions, and had wide riparian areas that allow the development of the target species: black alder (Alnus glutinosa), salvia leaf willow (Salix salviifolia), narrow-leafed ash (Fraxinus angustifolia), and oleander (Nerium oleander). Thalweg height was used to define the riverbank topographic gradient. The results showed a preferential zone for black alder and salvia leaf willow in the range of 0-150 cm from the channel thalweg, with adult alders and willows being more common between 51 and 150 cm and young alders being more common under 50 cm. Conversely, narrow-leafed ash and oleander were much more frequent, and showed greater development, in the ranges of 151-200 cm and 201-250 cm, respectively, whereas the young feet of both species covered the entire topographic range. Adult feet of the four species were spatially segregated along the riverbank topographic gradient, indicating their differential ability to cope with water stress, from the non-tolerant alders and willows to the more tolerant narrow-leafed ash trees and oleanders. Young feet, however, showed a strategy more closely linked to the initial availability of colonisation sites within riparian areas, to the dispersal strategy of each species, and to the distribution of adult feet. In Mediterranean areas, where riparian management has traditionally faced great challenges, the incorporation of species preferences along riverbank gradients could improve the performance of restoration projects.
Abstract:
This study examines Smart Grids and distributed generation connected to a single-family house. The distributed generation comprises a small wind power plant and solar panels. The study is done from the consumer's point of view and is divided into a theoretical part and a research part. The theoretical part covers the definitions of distributed generation, wind power, solar energy and Smart Grids, examines what Smart Grids will enable, and reviews new technology related to Smart Grids. The research part introduces the wind and sun conditions of two countries, Finland and Germany. Based on these conditions, the annual electricity production of the wind power plant and solar panels is calculated, and the costs of generating electricity from wind and solar energy are derived from the annual production figures. The study also deals with feed-in tariffs, which are support systems for renewable energy resources, and examines whether it is cost-effective for consumers to use the produced electricity themselves or to sell it to the grid. Finally, figures are produced for both countries, including the calculated cost of generating electricity from the wind power plant and solar panels, retail and wholesale prices, and feed-in tariffs. In Finland, it is not cost-effective to sell the produced electricity to the grid until support systems exist. In Germany, it is cost-effective to sell the electricity produced by solar panels to the grid because of the feed-in tariffs. On the other hand, in Germany it is cost-effective to use wind-generated electricity oneself, because the retail price is higher than the cost of the electricity produced from wind.
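The cost comparison described above rests on a generation cost per kWh that can then be set against the retail price (self-use avoids buying at retail) or the feed-in tariff (selling to the grid). A minimal levelised-cost sketch with illustrative numbers, not the thesis's actual input data:

```python
def lcoe(capex, opex_per_year, annual_kwh, lifetime_years, discount_rate):
    """Levelised cost of electricity per kWh: discounted lifetime
    costs divided by discounted lifetime production."""
    costs = float(capex)
    energy = 0.0
    for t in range(1, lifetime_years + 1):
        df = (1 + discount_rate) ** -t  # discount factor for year t
        costs += opex_per_year * df
        energy += annual_kwh * df
    return costs / energy
```

Given the LCOE, the decision rule in the study amounts to: selling pays when the feed-in tariff exceeds the LCOE, while self-use pays when the retail price exceeds the LCOE and the retail price is higher than the tariff.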
Abstract:
In this paper we address the problem of extracting representative point samples from polygonal models. The goal of such a sampling algorithm is to find points that are evenly distributed. We propose star discrepancy as a measure of sampling quality and present new sampling methods based on global line distributions. We investigate several line generation algorithms, including an efficient hardware-based sampling method. Our method contributes to the area of point-based graphics by extracting points that are more evenly distributed than those produced by current sampling algorithms.
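Star discrepancy, the quality measure proposed above, compares the fraction of points falling inside each origin-anchored box with the box's volume; the worst-case mismatch is the discrepancy. A brute-force 2-D evaluator, fine for small point sets (the paper's line-based sampling itself is not reproduced):

```python
def star_discrepancy_2d(points):
    """Star discrepancy of a 2-D point set in [0,1]^2, evaluated over
    origin-anchored boxes with corners at the point coordinates
    (sufficient because the counting function only changes there).
    Both open and closed boxes are checked to capture the supremum."""
    n = len(points)
    xs = sorted({p[0] for p in points} | {1.0})
    ys = sorted({p[1] for p in points} | {1.0})
    worst = 0.0
    for u in xs:
        for v in ys:
            inside = sum(1 for (x, y) in points if x < u and y < v)
            closed = sum(1 for (x, y) in points if x <= u and y <= v)
            vol = u * v
            worst = max(worst, abs(vol - inside / n), abs(closed / n - vol))
    return worst
```

Lower values mean more even coverage, which is why the measure discriminates between naive random sampling and the line-distribution-based methods.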
Abstract:
This thesis introduces a real-time simulation environment based on the multibody simulation approach. The environment consists of components that are used in conventional product development, including computer-aided drawing, visualization, dynamic simulation and finite element software architecture, data transfer and haptics. These components are combined to perform as a coupled system on one platform. The environment is used to simulate mobile and industrial machines at different stages of a product lifetime; consequently, the demands of the simulated scenarios vary. In this thesis, a real-time simulation environment based on the multibody approach is used to study a reel mechanism of a paper machine and a gantry crane. These case systems are used to demonstrate the usability of the real-time simulation environment for fault detection purposes and in the context of a training simulator. In order to describe the dynamical performance of a mobile or industrial machine, the nonlinear equations of motion must be defined. In this thesis, the dynamical behaviour of machines is modelled using the multibody simulation approach. A multibody system may consist of rigid and flexible bodies which are joined using kinematic joint constraints, while force components are used to describe the actuators. The strength of multibody dynamics lies in its ability to describe nonlinearities arising from wearing of the components, friction, large rotations or contact forces in a systematic manner. For this reason, the interfaces between subsystems such as the mechanics, hydraulics and control systems of a mechatronic machine can be defined and analyzed in a straightforward manner.
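The constrained equations of motion underlying the multibody approach can be sketched with the smallest possible example: a point-mass pendulum whose mass matrix, constraint Jacobian and Lagrange multiplier are assembled into one saddle-point system each step. This is a minimal illustration with Baumgarte stabilisation, not the environment's actual solver, and the gains are illustrative:

```python
import numpy as np

def pendulum_step(q, v, dt, m=1.0, L=1.0, g=9.81, alpha=5.0, beta=5.0):
    """One semi-implicit Euler step of a point mass on a rigid link,
    modelled as a constrained multibody system:
        [M  J^T] [a  ]   [f                                ]
        [J   0 ] [lam] = [-Jdot*v - 2*alpha*Cdot - beta^2*C]
    where C = q.q - L^2 is the length constraint and the right-hand
    side includes Baumgarte stabilisation of the constraint drift."""
    M = m * np.eye(2)
    f = np.array([0.0, -m * g])          # gravity only
    J = 2 * q.reshape(1, 2)              # Jacobian of C w.r.t. q
    Jdot_v = 2 * v @ v                   # (dJ/dt) v
    C = q @ q - L ** 2                   # constraint violation
    Cdot = 2 * q @ v                     # its rate
    rhs_c = -Jdot_v - 2 * alpha * Cdot - beta ** 2 * C
    A = np.block([[M, J.T], [J, np.zeros((1, 1))]])
    b = np.concatenate([f, [rhs_c]])
    sol = np.linalg.solve(A, b)          # accelerations and multiplier
    a = sol[:2]
    v = v + dt * a
    q = q + dt * v
    return q, v
```

In a full environment the same saddle-point structure is assembled for many bodies and joints, with actuator forces from the hydraulics and control subsystems entering through f.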
Abstract:
This paper presents a methodology to determine the parameters used in the simulation of delamination in composite materials using decohesion finite elements. A closed-form expression is developed to define the stiffness of the cohesive layer. A novel procedure that allows the use of coarser meshes of decohesion elements in large-scale computations is proposed. The procedure ensures that the energy dissipated by the fracture process is correctly computed. It is shown that coarse-meshed models defined using the approach proposed here yield the same results as the models with finer meshes normally used in the simulation of fracture processes.
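The kind of closed-form relations used in such a methodology can be sketched as below. Both formulas are common cohesive-zone recommendations and are assumptions here, not necessarily the exact expressions of the paper: a penalty stiffness proportional to the through-thickness modulus over the sublaminate thickness, and an interfacial strength reduced so that the cohesive zone spans several elements on a coarse mesh:

```python
import math

def cohesive_stiffness(E3, t, alpha=50.0):
    """Interface penalty stiffness from the closed-form estimate
    K = alpha * E3 / t, where E3 is the through-thickness modulus,
    t the adjacent sublaminate thickness, and alpha ~ 50 a commonly
    recommended factor (assumption, not a value from the paper)."""
    return alpha * E3 / t

def adjusted_strength(E, Gc, le, Ne=3, M=1.0):
    """Reduced interfacial strength such that the cohesive zone
    length l_cz = M * E * Gc / tau^2 covers at least Ne elements of
    size le, which keeps the dissipated fracture energy correct on
    coarse meshes (illustrative form)."""
    return math.sqrt(M * E * Gc / (Ne * le))
```

Lowering the strength while keeping Gc fixed preserves the energy dissipated by the fracture process, which is the quantity the coarse-mesh procedure is designed to protect.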
Abstract:
A damage model for the simulation of delamination propagation under high-cycle fatigue loading is proposed. The basis for the formulation is a cohesive law that links fracture and damage mechanics to establish the evolution of the damage variable in terms of the crack growth rate dA/dN. The damage state is obtained as a function of the loading conditions as well as the experimentally determined coefficients of the Paris-law crack propagation rates for the material. It is shown that by using the constitutive fatigue damage model in a structural analysis, experimental results can be reproduced without the need for additional model-specific curve-fitting parameters.
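A Paris-law-driven crack growth update of the kind described can be sketched by integrating dA/dN over blocks of cycles. The coefficients below are illustrative, not material data, and the constant energy-release-rate ratio is a simplification (in the actual model the damage variable couples back into the cohesive law):

```python
def crack_growth(a0, cycles, delta_G, Gc, C=1e-6, m=6.0, dN=1000):
    """Integrate a Paris-type law da/dN = C * (delta_G/Gc)^m with a
    forward cycle-block loop. a0: initial crack size, cycles: total
    cycle count, delta_G/Gc: normalised energy release rate range,
    C, m: Paris coefficients (illustrative), dN: block size."""
    a = a0
    history = [(0, a)]
    for n in range(dN, cycles + 1, dN):
        a += C * (delta_G / Gc) ** m * dN  # growth over one block
        history.append((n, a))
    return a, history
```

Linking the damage variable to this rate is what lets the structural analysis reproduce fatigue tests without extra curve-fitting parameters: the Paris coefficients are the only experimental inputs.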
Abstract:
A thermodynamically consistent damage model for the simulation of progressive delamination under variable mode ratio is presented. The model is formulated in the context of damage mechanics. The constitutive equation that results from the definition of the free energy as a function of a damage variable is used to model the initiation and propagation of delamination. A new delamination initiation criterion is developed to ensure that the formulation can account for changes in the loading mode in a thermodynamically consistent way. The proposed formulation accounts for crack closure effects, avoiding interfacial penetration of two adjacent layers after complete decohesion. The model is implemented in a finite element formulation, and the numerical predictions given by the model are compared with experimental results.
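Variable mode-ratio formulations typically evaluate a mixed-mode critical energy release rate as a function of the mode mix. The Benzeggagh-Kenane criterion below is a common choice in delamination modelling and is used here purely as an illustration, not necessarily the criterion of this paper:

```python
def bk_toughness(G_I, G_II, GIc, GIIc, eta):
    """Mixed-mode critical energy release rate from the
    Benzeggagh-Kenane criterion:
        Gc = GIc + (GIIc - GIc) * B**eta,  B = G_II / (G_I + G_II),
    interpolating between the pure mode I toughness GIc (B = 0)
    and the pure mode II toughness GIIc (B = 1)."""
    B = G_II / (G_I + G_II)
    return GIc + (GIIc - GIc) * B ** eta
```

Because Gc varies smoothly with the mode ratio, the damage evolution stays consistent when the loading mode changes during propagation, which is the situation the initiation criterion above is designed to handle.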