899 results for Multi-scale Fractal Dimension
Abstract:
Coastal ecosystems represent an inestimable source of biodiversity and are among the most productive areas on the planet. Despite the great ecological and economic value of these environments, many threats endanger the species living in them, such as rapid warming and ocean acidification, among many others. Benthic calcifying organisms (e.g. mollusks, corals and echinoderms) are among the most exposed to these hazards. These organisms use calcium carbonate as a structural and protective material through the process of biomineralization, which is biologically controlled by the organism but nevertheless strongly influenced by the surrounding environment. Evaluating how a changing environment can influence biomineralization is critical to understanding how these species of great ecological and economic importance will face the ongoing climate change. This thesis investigates the mechanism of biomineralization in different mollusk species of the Adriatic Sea, providing detailed descriptions of shell skeletal, biometric and growth parameters. Applying a multidisciplinary and multi-scale research approach, the influence of external environmental factors on the process of shell formation was investigated. To this end, analyses were conducted both on current populations and on fossil remains, which allow ecological responses to past climate transitions to be investigated. Mollusk shells are in fact one of the best tools for understanding climate change in the past, present and future, since they record the environmental conditions that prevailed during the animal's life, reflected in the geochemical properties, microstructure and growth of the shell. This approach made it possible to overcome the time-scale limits imposed by field and laboratory surveys and to better understand species' long-term adaptive responses to a changing environment, a crucial issue for defining proper conservation and management strategies. Furthermore, the investigation of the fossil record of mollusk assemblages offered the opportunity to evaluate the long-term biotic response to anthropogenic stressors in the northern Adriatic Sea.
Abstract:
The aim of this work is to present a general overview of the state of the art in design for uncertainty, with a focus on aerospace structures. In particular, simulations of an FCCZ lattice cell and of the profile shape of a nozzle are performed. Optimization under uncertainty is characterized by the need to make decisions without complete knowledge of the problem data. When dealing with a complex, non-linear, or optimization problem, two main issues arise: uncertainty about the feasibility of the solution and uncertainty about the value of the objective function. The first part examines Design of Experiments (DOE) methodologies, Uncertainty Quantification (UQ), and optimization under uncertainty. The second part shows an application of these theories through commercial software. Nowadays, multi-objective optimization of highly non-linear problems can be a powerful tool for approaching new concept solutions or developing cutting-edge designs. In this thesis an effective improvement has been achieved on a rocket nozzle. Future work could include the introduction of multi-scale modelling, multiphysics approaches, and any strategy useful for simulating the real operating conditions of the studied design as closely as possible.
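The abstract does not detail the UQ workflow; as a minimal sketch of the general idea (propagating input uncertainty through a model by Monte Carlo sampling and summarizing the output distribution), the following Python snippet uses a toy surrogate function and assumed input distributions. The `performance` function, the chamber-pressure and throat-radius variables, and all numbers are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n_samples = 10_000

# Illustrative uncertain inputs (assumed, not from the thesis):
# chamber pressure [bar] and throat radius [mm], both normally distributed.
p_c = rng.normal(loc=50.0, scale=2.0, size=n_samples)
r_t = rng.normal(loc=10.0, scale=0.1, size=n_samples)

def performance(p_c, r_t):
    """Toy surrogate for a nozzle performance metric (placeholder model)."""
    return p_c * r_t**2 / (1.0 + 0.01 * p_c)

y = performance(p_c, r_t)

# Uncertainty quantification: summarize the output distribution.
print(f"mean = {y.mean():.2f}, std = {y.std():.2f}")
print(f"95% interval = [{np.percentile(y, 2.5):.2f}, {np.percentile(y, 97.5):.2f}]")
```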
Abstract:
Collective behaviours can be observed in both natural and man-made systems composed of a large number of elemental subsystems. Typically, each elemental subsystem has its own dynamics but, whenever interaction between individuals occurs, the individual behaviours tend to relax and collective behaviours emerge. In this paper, the collective behaviour of a large-scale system composed of many coupled elemental particles is analysed. The dynamics of the particles are governed by the same type of equations but with different parameter values and initial conditions. Coupling between particles is based on statistical feedback, meaning that each particle is affected by the average behaviour of its neighbours. It is shown that the global system may exhibit several types of collective behaviour, corresponding to partial synchronisation, characterised by the existence of several clusters of synchronised subsystems, and to global synchronisation, where all the elemental particles synchronise completely.
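The paper's equations are not given in the abstract; the following sketch illustrates the described coupling scheme with an assumed elemental dynamics, a globally coupled logistic map, where each particle's next state mixes its own dynamics with the population average (the statistical feedback). The map, the parameter ranges, and the coupling strength are illustrative choices.

```python
import numpy as np

n, steps = 100, 500
rng = np.random.default_rng(0)

# Heterogeneous parameters and initial conditions, as in the paper's setup
# (the logistic map itself is an illustrative choice, not the paper's model).
r = rng.uniform(3.7, 3.9, size=n)   # per-particle parameter values
x = rng.uniform(0.1, 0.9, size=n)   # per-particle initial conditions
eps = 0.3                           # coupling strength to the mean field

for _ in range(steps):
    local = r * x * (1.0 - x)       # each particle's own dynamics
    # Statistical feedback: every particle is pulled toward the average state.
    x = (1.0 - eps) * local + eps * local.mean()

# A small spread indicates (partial or global) synchronisation.
print("state spread after coupling:", x.std())
```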
Abstract:
We search for evidence of physics beyond the Standard Model in the production of final states with multiple high transverse momentum jets, using 20.3 fb⁻¹ of proton-proton collision data recorded by the ATLAS detector at √s = 8 TeV. No excess of events beyond Standard Model expectations is observed, and upper limits on the visible cross-section for non-Standard Model production of multi-jet final states are set. Using a wide variety of models for black hole and string ball production and decay, the limit on the cross-section times acceptance is as low as 0.16 fb at the 95% CL for a minimum scalar sum of jet transverse momentum in the event of about 4.3 TeV. Using models for black hole and string ball production and decay, exclusion contours are determined as a function of the production mass threshold and the gravity scale. These limits can be interpreted in terms of lower-mass limits on black hole and string ball production that range from 4.6 to 6.2 TeV.
Abstract:
Magdeburg, Univ., Faculty of Economics and Management, Diss., 2013
Abstract:
The proportion of the population living in or around cities is higher than ever. Urban sprawl and car dependence have taken over from the pedestrian-friendly compact city. Environmental problems such as air pollution, land waste and noise, as well as health problems, are the result of this still continuing process. Urban planners have to find solutions to these complex problems while at the same time ensuring the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is being acquired. In order to gain a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are discussed. A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scale laws are used to characterise urban clusters. In a final section, population evolution is modelled using a model close to the well-established gravity model. The work covers quite a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision-making processes of urban planners.
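Of the methods listed, the lacunarity measure is the most self-contained to illustrate. Below is a minimal sketch of the standard gliding-box lacunarity estimator, Λ(r) = E[M²]/E[M]², where M is the occupied-cell mass inside an r×r box slid over a binary grid; the random raster is a stand-in for real urban-cluster data, not anything from the thesis.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def lacunarity(grid, box_size):
    """Gliding-box lacunarity of a binary grid: E[M^2] / E[M]^2 of box masses."""
    windows = sliding_window_view(grid, (box_size, box_size))
    masses = windows.sum(axis=(2, 3)).ravel()
    m = masses.mean()
    return (masses ** 2).mean() / m ** 2  # equals 1 + Var(M) / E[M]^2

# Toy binary "urban cluster" raster (a random example, not real data).
rng = np.random.default_rng(1)
grid = (rng.random((128, 128)) < 0.2).astype(int)

for r in (2, 4, 8, 16):
    print(f"box size {r:2d}: lacunarity = {lacunarity(grid, r):.3f}")
```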
Abstract:
Milk supply from Mexican dairy farms does not meet demand, and small-scale farms can contribute toward closing the gap. Two multi-criteria programming techniques, goal programming and compromise programming, were used in a study of small-scale dairy farms in central Mexico. To build the goal and compromise programming models, 4 ordinary linear programming models were also developed, with objective functions to maximize metabolizable energy for milk production, to maximize margin of income over feed costs, to maximize metabolizable protein for milk production, and to minimize purchased feedstuffs. Neither multi-criteria approach was significantly better than the other; however, by applying both models it was possible to perform a more comprehensive analysis of these small-scale dairy systems. The multi-criteria programming models affirm findings from previous work and suggest that a forage strategy based on alfalfa, ryegrass, and corn silage would meet the nutrient requirements of the herd. Both models suggested that there is an economic advantage in rescheduling the calving season to the second and third calendar quarters, to better synchronize the higher demand for nutrients with the period of high forage availability.
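Neither the goals nor the coefficients are given in the abstract; as a hedged illustration of the kind of goal-programming formulation the study describes, here is a toy weighted goal program solved with scipy.optimize.linprog, where deviation variables absorb over- and under-achievement of each goal. All feeds, targets, and coefficients are invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

# Toy weighted goal program (illustrative numbers, not from the study):
# x1, x2 = kg of two feeds; goals on metabolizable energy and feed cost.
# Goal rows:  energy: 10*x1 + 14*x2 + dE- - dE+ = 120   (MJ target)
#             cost:    2*x1 +  5*x2 + dC- - dC+ =  30   (cost target)
# Variables: [x1, x2, dE-, dE+, dC-, dC+]
c = np.array([0, 0, 1.0, 0.0, 0.0, 1.0])  # penalize energy shortfall, cost overrun
A_eq = np.array([
    [10, 14, 1, -1, 0,  0],
    [ 2,  5, 0,  0, 1, -1],
])
b_eq = np.array([120.0, 30.0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6, method="highs")
x1, x2, dEm, dEp, dCm, dCp = res.x
print(f"feed mix: x1={x1:.2f}, x2={x2:.2f}; "
      f"energy shortfall={dEm:.2f}, cost overrun={dCp:.2f}")
```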
Abstract:
Where users are interacting in a distributed virtual environment, the actions of each user must be observed by peers with sufficient consistency and within a limited delay so as not to be detrimental to the interaction. The consistency control issue may be split into three parts: update control; consistent enactment and evolution of events; and causal consistency. The delay in the presentation of events, termed latency, is primarily dependent on the network propagation delay and the consistency control algorithms. The latency induced by the consistency control algorithm, in particular causal ordering, is proportional to the number of participants. This paper describes how the effect of network delays may be reduced and introduces a scalable solution that provides sufficient consistency control while minimising its effect on latency. The principles described have been developed at Reading over the past five years. Similar principles are now emerging in the simulation community through the HLA standard. This paper attempts to validate the suggested principles within the schema of distributed simulation and virtual environments and to compare and contrast with those described by the HLA definition documents.
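The abstract does not specify the algorithm, but the baseline it improves on, causal ordering whose per-update cost grows with the number of participants, is conventionally implemented with vector clocks carrying one counter per peer. A minimal sketch of that conventional baseline (illustrative, not the paper's scheme):

```python
from dataclasses import dataclass, field

@dataclass
class Peer:
    """Vector-clock causal ordering: the clock carries one entry per
    participant, the overhead the paper's scalable solution reduces."""
    pid: int
    n: int
    clock: list = field(default_factory=list)

    def __post_init__(self):
        self.clock = [0] * self.n

    def send(self):
        self.clock[self.pid] += 1
        return (self.pid, list(self.clock))

    def deliverable(self, sender, stamp):
        # Deliver only if this is the next update from the sender and we have
        # already seen everything the sender had seen (causal precedence).
        return stamp[sender] == self.clock[sender] + 1 and all(
            stamp[k] <= self.clock[k] for k in range(self.n) if k != sender)

    def deliver(self, sender, stamp):
        self.clock = [max(a, b) for a, b in zip(self.clock, stamp)]

a, b = Peer(0, 2), Peer(1, 2)
msg = a.send()
print(b.deliverable(*msg))   # True: causally ready for delivery
b.deliver(*msg)
```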
Abstract:
Large-scale simulations of parts of the brain using detailed neuronal models to improve our understanding of brain functions are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance costs of these computers, including the physical space, air conditioning, and electrical power, limit the number of simulations of this kind that scientists can perform. Modern commodity graphics cards, based on the CUDA platform, contain graphics processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model the neuron. Communication among neurons located in different GPUs is coordinated by the CPU. Compared with a modern quad-core CPU, we obtained speedups of 40 for the simulation of 200k neurons receiving random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics cards containing two GPUs each.
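The paper's implementation is CUDA with one thread per neuron; as a language-portable sketch of the per-neuron computation, the following NumPy code integrates the standard Hodgkin-Huxley equations with forward Euler, with the array dimension playing the role of the thread grid. The conductances and reversal potentials are the textbook values; the input current and network size are illustrative (the paper scales to 200k neurons).

```python
import numpy as np

# Vectorized Hodgkin-Huxley step: each array element plays the role the
# paper assigns to one CUDA thread (one neuron's coupled ODE system).
def hh_step(V, m, h, n, I, dt=0.01):
    a_m = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    b_m = 4.0 * np.exp(-(V + 65) / 18)
    a_h = 0.07 * np.exp(-(V + 65) / 20)
    b_h = 1.0 / (1 + np.exp(-(V + 35) / 10))
    a_n = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    b_n = 0.125 * np.exp(-(V + 65) / 80)
    m += dt * (a_m * (1 - m) - b_m * m)
    h += dt * (a_h * (1 - h) - b_h * h)
    n += dt * (a_n * (1 - n) - b_n * n)
    I_ion = (120.0 * m**3 * h * (V - 50.0)      # Na+ current
             + 36.0 * n**4 * (V + 77.0)         # K+ current
             + 0.3 * (V + 54.4))                # leak current
    V += dt * (I - I_ion)                       # C_m = 1 uF/cm^2
    return V, m, h, n

N = 20_000                                      # illustrative network size
rng = np.random.default_rng(0)
V = np.full(N, -65.0); m = np.full(N, 0.05)
h = np.full(N, 0.6);   n = np.full(N, 0.32)
for _ in range(1000):                           # 10 ms at dt = 0.01 ms
    V, m, h, n = hh_step(V, m, h, n, I=rng.uniform(0, 10, N))
print("fraction of neurons above 0 mV:", float((V > 0).mean()))
```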
Abstract:
Note: The author thanks the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP) for the scholarship granted for the development of this research project.
Abstract:
Although it has been suggested that the retinal vasculature is a diffusion-limited aggregation (DLA) fractal, no study has been dedicated to standardizing its fractal analysis. The aims of this project were to standardize a method to estimate the fractal dimensions of the retinal vasculature and to characterize their normal values; to determine whether this estimation depends on skeletonization and on the segmentation and calculation methods; to assess the suitability of the DLA model; and to determine the usefulness of log-log graphs in characterizing the fractality of the vasculature. To achieve these aims, the information, mass-radius and box-counting dimensions of the vasculatures of 20 eyes were compared when the vessels were segmented manually or computationally; the fractal dimensions of the vasculatures of 60 eyes of healthy volunteers were compared with those of 40 DLA models; and the log-log graphs obtained were compared with those of known fractals and of non-fractals. The main results were: the fractal dimensions of the vascular trees depended on the segmentation and dimension calculation methods, but there was no difference between manual segmentation and the scale-space, multi-threshold and wavelet computational methods; the means of the information and box dimensions for the arteriolar trees were 1.29, against 1.34 and 1.35 for the venular trees; the dimensions of the DLA models were higher than those of the vessels; and the log-log graphs were straight, but with varying local slopes, both for the vascular trees and for the fractals and non-fractals. These results lead to the following conclusions: the estimation of the fractal dimensions of the retinal vasculature depends on its skeletonization and on the segmentation and calculation methods; log-log graphs are not suitable as a fractality test; the means of the information and box-counting dimensions for the normal eyes were 1.47 and 1.43, respectively; and the DLA model with optic disc seeding is not sufficient for modeling the retinal vascularization.
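Of the three estimators compared, box counting is the simplest to sketch: cover the binary vessel image with boxes of side s, count the occupied boxes N(s), and take the slope of log N(s) against log(1/s). A minimal version follows; the random raster is a stand-in for a segmented fundus image, not study data.

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension of a binary image as the slope
    of log N(s) versus log(1/s), where N(s) counts occupied s-by-s boxes."""
    counts = []
    for s in sizes:
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append((blocks.sum(axis=(1, 3)) > 0).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Toy binary "vessel" pattern (a random stand-in, not a real fundus image).
rng = np.random.default_rng(2)
img = (rng.random((256, 256)) < 0.1).astype(int)
print(f"estimated box-counting dimension: {box_counting_dimension(img):.2f}")
```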
Abstract:
This paper presents the principal results of a detailed study on the use of the Meaningful Fractal Fuzzy Dimension measure in the problem of adequately determining the topological dimension of the output space of a Self-Organizing Map (SOM). This fractal measure is conceived by combining fractal theory with fuzzy approximate reasoning. In this work the measure was applied to the dataset in order to obtain a priori knowledge, which is used to support decision making about the SOM output space design. Several maps were designed with this approach and their evaluations are discussed here.
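The Meaningful Fractal Fuzzy Dimension itself is not specified in the abstract; to illustrate the general idea of estimating a dataset's intrinsic (fractal) dimension as a priori knowledge for sizing a SOM output space, here is a plain Grassberger-Procaccia correlation-dimension sketch on illustrative data, not the paper's measure.

```python
import numpy as np

def correlation_dimension(X, radii):
    """Grassberger-Procaccia estimate: slope of log C(r) vs log r, where
    C(r) is the fraction of point pairs closer than r."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    pair_d = d[np.triu_indices(len(X), k=1)]
    C = np.array([(pair_d < r).mean() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# Points on a noisy 2-D manifold embedded in 3-D (illustrative data).
rng = np.random.default_rng(3)
uv = rng.random((500, 2))
X = np.column_stack([uv, uv.sum(axis=1)]) + 0.01 * rng.normal(size=(500, 3))

radii = np.geomspace(0.05, 0.5, 8)
dim = correlation_dimension(X, radii)
print(f"estimated intrinsic dimension ~ {dim:.2f} (suggests a 2-D SOM grid)")
```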
Abstract:
Network reconfiguration for service restoration (SR) in distribution systems is a complex optimization problem. For large-scale distribution systems, it is computationally hard to find adequate SR plans in real time, since the problem is combinatorial and non-linear, involving several constraints and objectives. Two multi-objective evolutionary algorithms that use Node-Depth Encoding (NDE) have proved able to efficiently generate adequate SR plans for large distribution systems: (i) the hybridization of the Non-Dominated Sorting Genetic Algorithm-II (NSGA-II) with NDE, named NSGA-N; and (ii) a multi-objective evolutionary algorithm based on subpopulation tables that uses NDE, named MEAN. Further challenges are now faced: designing SR plans for larger systems that are as good as those for relatively smaller ones, and plans for multiple faults that are as good as those for a single fault. To tackle both challenges, this paper proposes a method combining NSGA-N, MEAN, and a new heuristic. The heuristic focuses the application of the NDE operators on alarming network zones according to technical constraints. The method generates SR plans of similar quality in distribution systems of significantly different sizes (from 3,860 to 30,880 buses). Moreover, the number of switching operations required to implement the SR plans generated by the proposed method increases only moderately with the number of faults.
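Neither NSGA-N's nor MEAN's internals are given here; as a sketch of the core ranking step NSGA-II (and hence NSGA-N) relies on, the following implements fast non-dominated sorting over toy SR-style objective vectors (switching operations, out-of-service load), with both objectives invented for the example.

```python
def fast_nondominated_sort(objectives):
    """Pareto ranking as used in NSGA-II (all objectives minimized).
    Returns a list of fronts, each a list of solution indices."""
    n = len(objectives)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    dom_count = [0] * n                     # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if (all(a <= b for a, b in zip(objectives[i], objectives[j]))
                    and any(a < b for a, b in zip(objectives[i], objectives[j]))):
                dominated_by[i].append(j)
            elif (all(b <= a for a, b in zip(objectives[i], objectives[j]))
                    and any(b < a for a, b in zip(objectives[i], objectives[j]))):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

# Toy SR-style objectives: (switching operations, out-of-service load).
plans = [(3, 120), (5, 80), (4, 100), (6, 60), (7, 70)]
print(fast_nondominated_sort(plans))   # [[0, 1, 2, 3], [4]]
```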