915 results for deterministic fractals
Abstract:
This article presents a quantitative and objective approach to cat ganglion cell characterization and classification. Several biologically relevant features, such as diameter, eccentricity, fractal dimension, influence histogram, influence area, convex hull area, and convex hull diameter, are derived from geometrical transforms and then processed by three different clustering methods (Ward's hierarchical scheme, K-means, and a genetic algorithm), whose results are then combined by a voting strategy. These experiments indicate the superiority of some features and also suggest some possible biological implications.
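A minimal sketch of such a clustering-with-voting step, assuming scikit-learn and SciPy: Ward's method and K-means stand in for two of the three clusterers (the genetic-algorithm clusterer is omitted), labels are aligned with the Hungarian algorithm before voting, and the feature matrix X is a synthetic placeholder rather than measured cell features.

import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from scipy.optimize import linear_sum_assignment

def align_labels(reference, labels, k):
    # Contingency table between the two labelings, then the permutation
    # that best matches the reference clustering (Hungarian algorithm).
    cont = np.zeros((k, k), dtype=int)
    for r, l in zip(reference, labels):
        cont[r, l] += 1
    _, col = linear_sum_assignment(-cont)
    mapping = {old: new for new, old in enumerate(col)}
    return np.array([mapping[l] for l in labels])

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 7))          # hypothetical 7-feature matrix (one row per cell)
k = 3

ward = AgglomerativeClustering(n_clusters=k, linkage="ward").fit_predict(X)
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

km_aligned = align_labels(ward, km, k)
votes = np.stack([ward, km_aligned])   # a third (e.g. GA-based) labeling would be stacked here too
consensus = np.array([np.bincount(v, minlength=k).argmax() for v in votes.T])
print(consensus[:10])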
Abstract:
This paper presents models that can be used in the design of microstrip antennas for mobile communications. The antennas can be triangular or rectangular. The proposed models are compared with deterministic models and with empirical models based on artificial neural networks (ANN) from the literature. The models are based on Multilayer Perceptron (PML) and Radial Basis Function (RBF) ANNs, with the RBF-based models yielding the best results. The models can also be embedded in CAD systems for the design of microstrip antennas for mobile communications.
Abstract:
Obtaining the correct spatial distribution of soil attributes is relevant in agricultural planning, particularly with regard to the installation and maintenance of crops. The objective of this work was to compare a statistical interpolation method (ordinary kriging, KRIG) and a deterministic method (inverse square distance, IDW) in the estimation of CTC (cation exchange capacity) and V% (base saturation) in a dystrophic Yellow-Red Latosol. The study was carried out at an experimental farm of the Instituto Capixaba de Pesquisa, Assistência Técnica e Extensão Rural (INCAPER), on an irregular grid with 109 points. The data were collected in the 0-0.20 m layer, within the projection of the plant canopy, in the upper part of the slope. The performance of the interpolators was assessed and compared using the mean error criterion. The observations are spatially dependent up to a maximum range of 14.1 m, assuming isotropy. IDW presented a larger estimation error; however, its difference relative to KRIG was small for both variables.
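For reference, inverse-distance weighting with power 2 (the deterministic interpolator compared above) can be sketched as follows; the coordinates and attribute values are invented, and ordinary kriging would normally be handled by a geostatistics package rather than written by hand.

import numpy as np

def idw(xy_obs, z_obs, xy_new, power=2.0, eps=1e-12):
    # Inverse-distance-weighted estimate at each point of xy_new.
    d = np.linalg.norm(xy_new[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, eps) ** power
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
xy_obs = rng.uniform(0, 50, size=(109, 2))   # 109 sample points, as in the study
z_obs = rng.normal(8.0, 1.5, size=109)       # hypothetical CTC values
grid = np.array([[10.0, 10.0], [25.0, 30.0]])
print(idw(xy_obs, z_obs, grid))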
Abstract:
This paper uses artificial neural networks (ANN) to compute the resonance frequencies of rectangular microstrip antennas (MSA) used in mobile communications. Multilayer Perceptron (PML) networks were used, trained with the quasi-Newton method proposed by Broyden, Fletcher, Goldfarb, and Shanno (BFGS). Due to the nature of the problem, two hundred and fifty networks were trained, and the resonance frequency for each test antenna was obtained by statistical methods. The estimated resonance frequencies for six test antennas were compared with results obtained by deterministic models and ANN-based empirical models from the literature, and showed better agreement with the experimental values.
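A rough sketch of this regression setup, assuming scikit-learn; MLPRegressor's "lbfgs" solver is a limited-memory variant of the BFGS quasi-Newton method mentioned above, and the patch dimensions and target frequencies are synthetic placeholders rather than measured antenna data.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
# Synthetic inputs: patch width W (mm), length L (mm), substrate height h (mm), permittivity er.
X = rng.uniform([10, 10, 0.5, 2.0], [40, 40, 3.0, 10.0], size=(200, 4))
c = 3e11  # speed of light in mm/s
f_ghz = c / (2 * X[:, 1] * np.sqrt(X[:, 3])) / 1e9   # crude cavity-model-style target

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(12,), solver="lbfgs", max_iter=2000, random_state=0),
)
model.fit(X, f_ghz)
print(model.predict(X[:3]))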
Abstract:
We report results from a search for neutral Higgs bosons produced in association with b quarks using data recorded by the D0 experiment at the Fermilab Tevatron Collider and corresponding to an integrated luminosity of 7.3 fb⁻¹. This production mode can be enhanced in several extensions of the standard model (SM) such as in its minimal supersymmetric extension (MSSM) at high tan β. We search for Higgs bosons decaying to tau pairs with one tau decaying to a muon and neutrinos and the other to hadrons. The data are found to be consistent with SM expectations, and we set upper limits on the cross section times branching ratio in the Higgs boson mass range from 90 to 320 GeV/c². We interpret our result in the MSSM parameter space, excluding tan β values down to 25 for Higgs boson masses below 170 GeV/c². © 2011 American Physical Society.
Abstract:
The deterministic Optimal Reactive Power Dispatch problem has been extensively studied under the assumption that the power demand and the availability of shunt reactive power compensators are known and fixed. Against this background, a two-stage stochastic optimization model is first formulated under the presumption that the load demand can be modeled by specified random parameters. A second, stochastic chance-constrained model is then presented, considering uncertainty in the demand and in the equivalent availability of shunt reactive power compensators. Simulations on six-bus and 30-bus test systems are used to illustrate the validity and essential features of the proposed models. These simulations show that the proposed models can warn the power system operator about a deficit of reactive power in the system and suggest that shunt reactive sources must be dispatched against the unavailability of any reactive source. © 2012 IEEE.
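As a toy illustration of the two-stage structure only (not the authors' formulation), the deterministic-equivalent LP below chooses a first-stage amount of shunt compensation x before the demand scenario is revealed, with per-scenario recourse shortfall variables penalized in expectation; all numbers are invented.

from scipy.optimize import linprog

scenarios = [40.0, 55.0, 70.0]   # hypothetical reactive demand (Mvar) per scenario
probs = [0.3, 0.5, 0.2]
cap_cost, shortfall_penalty = 1.0, 10.0

# Variables: [x, y1, y2, y3]  (x = installed Mvar, y_s = unmet demand in scenario s)
c = [cap_cost] + [p * shortfall_penalty for p in probs]
# One constraint per scenario: x + y_s >= d_s  ->  -x - y_s <= -d_s
A_ub = [[-1, -1, 0, 0], [-1, 0, -1, 0], [-1, 0, 0, -1]]
b_ub = [-d for d in scenarios]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print(res.x)   # first entry: the here-and-now compensation decision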
Abstract:
The medium term hydropower scheduling (MTHS) problem involves an attempt to determine, for each time stage of the planning period, the amount of generation at each hydro plant which will maximize the expected future benefits throughout the planning period, while respecting plant operational constraints. Besides, it is important to emphasize that this decision-making has been done based mainly on inflow earliness knowledge. To perform the forecast of a determinate basin, it is possible to use some intelligent computational approaches. In this paper one considers the Dynamic Programming (DP) with the inflows given by their average values, thus turning the problem into a deterministic one which the solution can be obtained by deterministic DP (DDP). The performance of the DDP technique in the MTHS problem was assessed by simulation using the ensemble prediction models. Features and sensitivities of these models are discussed. © 2012 IEEE.
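A compact sketch of deterministic DP over a discretized reservoir, with the inflows fixed at their average values as described above; the benefit function and all numbers are placeholders, not the model used in the paper.

import numpy as np

stages = 12                      # months in the planning period
inflow = np.full(stages, 30.0)   # average inflows (hm^3): the deterministic assumption
levels = np.linspace(0.0, 100.0, 51)   # discretized storage grid
max_release = 60.0

def benefit(release):
    # Placeholder concave generation benefit.
    return np.sqrt(np.maximum(release, 0.0))

value = np.zeros(len(levels))            # terminal value function
policy = np.zeros((stages, len(levels)))
for t in reversed(range(stages)):
    new_value = np.empty_like(value)
    for i, s in enumerate(levels):
        release = np.linspace(0.0, max_release, 61)
        s_next = np.clip(s + inflow[t] - release, levels[0], levels[-1])
        total = benefit(release) + np.interp(s_next, levels, value)
        best = np.argmax(total)
        new_value[i] = total[best]
        policy[t, i] = release[best]
    value = new_value

print(policy[0, 25])   # suggested release at mid storage, first stage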
Abstract:
The results of the histopathological analyses after the implantation of highly crystalline PVA microspheres in subcutaneous tissues of Wistar rats are reported herein. Three different groups of PVA microparticles were systematically studied: highly crystalline, amorphous, and commercial ones. In addition to these experiments, complementary analyses of architectural complexity were performed using fractal dimension (FD) and Shannon entropy (SE) concepts. The highly crystalline microspheres induced inflammatory reactions similar to those observed for the commercial ones, while the inflammatory reactions caused by the amorphous ones were less intense. Statistical analyses of the subcutaneous tissues of Wistar rats implanted with the highly crystalline microspheres resulted in FD and SE values significantly higher than the statistical parameters observed for the amorphous ones. The FD and SE parameters obtained for the subcutaneous tissues of Wistar rats implanted with crystalline and commercial microparticles were statistically similar. Briefly, the results indicated that the new highly crystalline microspheres had biocompatible behavior comparable to the commercial ones. In addition, statistical tools such as FD and SE analyses, when combined with histopathological analyses, can be useful for investigating the architectural complexity of tissues caused by complex inflammatory reactions. © 2012 WILEY PERIODICALS, INC.
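For orientation, the two descriptors can be estimated from a segmented image roughly as below (a simple box-counting slope for FD and the Shannon entropy of the gray-level histogram); this is a generic sketch on a synthetic mask, not the exact image-analysis protocol of the study.

import numpy as np

def box_counting_dimension(binary, sizes=(2, 4, 8, 16, 32)):
    counts = []
    for s in sizes:
        h, w = binary.shape
        trimmed = binary[:h - h % s, :w - w % s]
        blocks = trimmed.reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())   # boxes of side s touching the object
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def shannon_entropy(gray, bins=256):
    p, _ = np.histogram(gray, bins=bins, range=(0, 255), density=True)
    p = p[p > 0]
    p = p / p.sum()
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(3)
img = rng.random((256, 256)) > 0.7          # hypothetical segmented tissue mask
print(box_counting_dimension(img), shannon_entropy(img.astype(np.uint8) * 255))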
Abstract:
Following the thermodynamic formulation of a multifractal measure that was shown to enable the detection of large fluctuations at an early stage, here we propose a new index which permits us to distinguish events like financial crises in real time. We calculate the partition function, from which we can obtain thermodynamic quantities analogous to the free energy and specific heat. The index is defined as the normalized energy variation, and it can be used to study the behavior of stochastic time series, such as daily financial market data. Famous financial market crashes, namely Black Thursday (1929), Black Monday (1987), and the subprime crisis (2008), are identified with clear and robust results. The method is also applied to the market fluctuations of 2011. From these results it appears that the apparent crisis of 2011 is of a different nature from the other three. We also show that the analysis has forecasting capabilities. © 2012 Elsevier B.V. All rights reserved.
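A rough numerical sketch in the same spirit, assuming daily returns r_t inside a sliding window: a q-th order partition function Z(q) built from the normalized |r_t|, a free-energy-like quantity derived from it, and an index given by the variation of that quantity. The exact definitions used in the paper may differ, and the return series below is synthetic.

import numpy as np

def energy_index(returns, window=250, q=5.0):
    # Sliding-window "energy" from a q-th order partition function of |returns|.
    energies = []
    for end in range(window, len(returns) + 1):
        r = np.abs(returns[end - window:end])
        p = r / r.sum()                        # normalized measure over the window
        z = (p ** q).sum()                     # partition function Z(q)
        energies.append(-np.log(z) / (q - 1))  # free-energy-like quantity
    energies = np.asarray(energies)
    return np.diff(energies) / np.std(energies)   # normalized variation of the series

rng = np.random.default_rng(4)
fake_returns = rng.standard_t(df=3, size=2000) * 0.01   # heavy-tailed synthetic daily returns
print(energy_index(fake_returns)[-5:])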
Abstract:
This work combines symbolic machine learning and multiscale fractal techniques to generate models that characterize cellular rejection in myocardial biopsies and that can serve as the basis of a diagnosis support system. The models express the knowledge through the features threshold, fractal dimension, lacunarity, number of clusters, spatial percolation, and percolation probability, all obtained by processing the myocardial biopsies. The models were evaluated, and the most significant was the one generated by the C4.5 algorithm using the features spatial percolation and number of clusters. The result is relevant and contributes to the specialized literature, since it establishes a standard diagnosis protocol. © 2013 Springer.
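A minimal sketch of the classification step, assuming scikit-learn; DecisionTreeClassifier implements CART rather than C4.5, so it only stands in for the symbolic learner used in the paper, and the feature values and labels below are synthetic.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(5)
n = 120
# Hypothetical biopsy features: [spatial percolation, number of clusters]
X = np.column_stack([rng.uniform(0, 1, n), rng.integers(1, 60, n)])
y = (X[:, 0] > 0.45).astype(int)     # synthetic "rejection" label for illustration

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["spatial_percolation", "n_clusters"]))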
Local attractors, degeneracy and analyticity: Symmetry effects on the locally coupled Kuramoto model
Abstract:
In this work we study the locally coupled Kuramoto model with periodic boundary conditions. Our main objective is to show how analytical solutions may be obtained from symmetry assumptions, and along the way we show, apart from the existence of local attractors, some unexpected features resulting from the symmetry properties, such as intermittent and chaotic period phase slips, degeneracy of stable solutions, and double bifurcation composition. As a result of our analysis, we show that stable fixed points in the synchronized region may be obtained from just a small subset of the existing solutions, and for a class of natural frequency configurations we derive analytical expressions, both exact and asymptotic, for the critical synchronization coupling as a function of the number of oscillators. © 2013 Elsevier Ltd. All rights reserved.
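For concreteness, the locally coupled Kuramoto model on a ring, dtheta_i/dt = omega_i + K*[sin(theta_{i+1} - theta_i) + sin(theta_{i-1} - theta_i)] with periodic boundary conditions, can be integrated numerically as below; this is a generic simulation sketch with arbitrary parameters, not the analytical treatment of the paper.

import numpy as np

def simulate_kuramoto_ring(n=50, K=1.0, dt=0.01, steps=20000, seed=6):
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 0.5, n)          # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, n)
    for _ in range(steps):
        right = np.roll(theta, -1) - theta    # neighbor i+1 (periodic boundary)
        left = np.roll(theta, 1) - theta      # neighbor i-1
        theta = theta + dt * (omega + K * (np.sin(right) + np.sin(left)))
    # Kuramoto order parameter as a crude synchronization measure.
    return np.abs(np.exp(1j * theta).mean())

print(simulate_kuramoto_ring(K=0.2), simulate_kuramoto_ring(K=3.0))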
Abstract:
This paper tackles a Nurse Scheduling Problem, which consists of generating work schedules for a set of nurses while considering their shift preferences and other requirements. The objective is to maximize the satisfaction of nurses' preferences and to minimize the violation of soft constraints. This paper presents a new deterministic heuristic algorithm, called MAPA (multi-assignment problem-based algorithm), which is based on successive resolutions of the assignment problem. The algorithm has two phases: a constructive phase and an improvement phase. The constructive phase builds a full schedule by solving successive assignment problems, one for each day in the planning period. The improvement phase uses a couple of procedures that re-solve assignment problems to produce a better schedule. Given the deterministic nature of this algorithm, the same schedule is obtained each time the algorithm is applied to the same problem instance. The performance of MAPA is benchmarked against published results for almost 250,000 instances from the NSPLib dataset. In most cases, particularly on large instances of the problem, the results produced by MAPA are better than the best-known solutions from the literature. The experiments reported here also show that MAPA finds more feasible solutions than other algorithms in the literature, which suggests that the proposed approach is effective and robust. © 2013 Springer Science+Business Media New York.
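The core constructive idea, one assignment problem per day, can be sketched as follows with SciPy's Hungarian-algorithm solver; the preference costs are invented, only one nurse is assigned per shift, and the real MAPA algorithm handles far richer constraints than this toy loop.

import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(7)
n_nurses, n_days, n_shifts = 6, 7, 3   # three shifts per day; unassigned nurses are off

schedule = np.full((n_nurses, n_days), -1)   # -1 = day off
for day in range(n_days):
    # cost[i, j]: how much nurse i dislikes shift j on this day (lower = preferred)
    cost = rng.uniform(0, 10, size=(n_nurses, n_shifts))
    nurses, shifts = linear_sum_assignment(cost)   # handles rectangular cost matrices
    for i, j in zip(nurses, shifts):
        schedule[i, day] = j

print(schedule)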
Abstract:
Perhaps due to its origins in a production scheduling software package called Optimised Production Technology (OPT), together with the idea of focusing on system constraints, many believe that the Theory of Constraints (TOC) has a vocation for optimal solutions. Those who assess TOC from this perspective indicate that it guarantees an optimal solution only in certain circumstances. In opposition to this view, and based on a numerical example of a production mix problem, this paper shows, by means of TOC assumptions, why TOC should not be compared with methods intended to seek optimal or best solutions, but rather with methods that seek sufficiently good solutions attainable in non-deterministic environments. Moreover, we extend the relevant literature on product mix decisions by introducing a heuristic, based on the uniquely identified work, that aims at achieving feasible solutions according to the TOC point of view. The proposed heuristic is tested on 100 production mix problems and the results are compared with the responses obtained using Integer Linear Programming. The results show that the heuristic gives good results on average, but its performance falls sharply in some situations. © 2013 Copyright Taylor and Francis Group, LLC.
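The classical TOC product-mix logic, ranking products by throughput contribution per minute on the bottleneck resource and loading the constraint greedily, looks roughly like the following; the data are illustrative only, and this is not necessarily the specific heuristic proposed in the paper.

# Products: (name, margin per unit, minutes on the bottleneck per unit, market demand)
products = [("P", 45.0, 15.0, 100), ("Q", 60.0, 30.0, 50), ("R", 40.0, 10.0, 80)]
bottleneck_minutes = 2400.0

# Rank by throughput contribution per bottleneck minute (highest first).
ranked = sorted(products, key=lambda p: p[1] / p[2], reverse=True)

mix, remaining, profit = {}, bottleneck_minutes, 0.0
for name, margin, minutes, demand in ranked:
    units = min(demand, int(remaining // minutes))   # load the constraint up to demand
    mix[name] = units
    remaining -= units * minutes
    profit += units * margin

print(mix, profit, remaining)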
Abstract:
Based on literature data from HT-29 cell monolayers, we develop a model for their growth, analogous to an epidemic model, mixing local and global interactions. First, we propose and solve a deterministic equation for the progress of these colonies. Then, we add a stochastic (local) interaction and simulate the evolution of an Eden-like aggregate using dynamical Monte Carlo methods. The growth curves of both the deterministic and stochastic models are in excellent agreement with the experimental observations. The waiting-time distributions generated by our stochastic model allowed us to analyze the role of mesoscopic events. We obtain log-normal distributions in the initial stages of the growth and Gaussians at long times. We interpret these outcomes in the light of cellular division events: in the early stages, the phenomena depend on each other in a multiplicative, geometric-based process, whereas they are independent at long times. We conclude that the main ingredients for a good minimalist model of tumor growth, at the mesoscopic level, are intrinsic cooperative mechanisms and competitive search for space. © 2013 Elsevier Ltd.
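A bare-bones Eden-like growth simulation by dynamical Monte Carlo, included only to illustrate the kind of stochastic local rule described above; the cooperative and competitive ingredients of the actual model, and its comparison with experimental growth curves, are not reproduced here.

import numpy as np

def eden_growth(size=201, n_cells=3000, seed=8):
    rng = np.random.default_rng(seed)
    lattice = np.zeros((size, size), dtype=bool)
    lattice[size // 2, size // 2] = True           # seed cell at the center
    occupied = [(size // 2, size // 2)]
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    while len(occupied) < n_cells:
        x, y = occupied[rng.integers(len(occupied))]   # pick a random occupied site
        empty = [(x + dx, y + dy) for dx, dy in moves
                 if 0 <= x + dx < size and 0 <= y + dy < size and not lattice[x + dx, y + dy]]
        if empty:                                      # "divide" into a random empty neighbor
            nx, ny = empty[rng.integers(len(empty))]
            lattice[nx, ny] = True
            occupied.append((nx, ny))
    return lattice

aggregate = eden_growth()
print(aggregate.sum())   # number of cells in the aggregate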
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)