912 results for average complexity
Abstract:
In this work, crystalline titanium dioxide (TiO2) nanoparticles with variable average crystallite sizes (e.g., 8 nm) and surface areas (e.g., 192 m² g⁻¹) were synthesized in the pure anatase phase, using H2O2 to reduce the hydrolysis rate of the titanium ions. An isopropanol (IP) solution was employed as the reaction medium. The TiO2 nanoparticles were characterized by powder X-ray diffraction (XRD), Raman spectroscopy, and transmission electron microscopy (TEM). By changing the synthesis parameters, it was possible to control the nanoparticle size and avoid coalescence. A dependence of the Raman wavenumber on nanocrystal size was determined, which is quite useful for a quick check of the size of TiO2 nanocrystals.
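A hedged sketch of the "quick check" just described: given a calibration of the Raman wavenumber against crystallite size, an unknown sample's size can be read off by interpolation. The calibration points below are invented placeholders, not the measured values from this work.

```python
import numpy as np

# Hypothetical calibration of an anatase Raman mode (cm^-1) vs. crystallite
# size (nm); the band shifts as crystals shrink. Replace with measured data.
size_nm = np.array([4.0, 6.0, 8.0, 12.0, 20.0])
wavenumber_cm1 = np.array([148.5, 146.8, 145.7, 144.6, 143.9])

def estimate_size(measured_wavenumber: float) -> float:
    """Estimate crystallite size by linear interpolation of the calibration curve."""
    order = np.argsort(wavenumber_cm1)  # np.interp needs increasing x values
    return float(np.interp(measured_wavenumber, wavenumber_cm1[order], size_nm[order]))

print(estimate_size(146.0))  # falls between the 6 nm and 8 nm calibration points
```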
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
AIM: The purpose of this study was to examine the effect of intensive table-tennis practice on perceptual, decision-making, and motor systems. Groups of elite (HL, n=11), intermediate (LL, n=6), and control (CC, n=11) subjects performed tasks of different levels. METHODS: All subjects underwent a reaction-time test and a response-time test consisting of a pointing task to targets placed at distinct distances (15 and 25 cm) on the right and left sides. The ball-speed test, in forehand and backhand conditions, was performed only by the HL and LL groups. RESULTS: Reaction time was longer in the CC group than in the HL group (P < 0.05). In the response-time test, there were significant main effects of distance (P < 0.0001) and table-tennis expertise (P = 0.011). In the ball-speed test, the HL group was consistently faster than the LL group in both the forehand stroke (P < 0.0001) and the backhand stroke (P < 0.0001). Overall, the forehand stroke was significantly faster than the backhand stroke. CONCLUSION: We conclude that table-tennis players have shorter response times than non-athletes, and that the reaction-time and response-time tasks cannot distinguish the performance of well-trained table-tennis players from that of intermediate players, but the ball-speed test seems able to do so.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Graduate Program in Environmental Sciences - Sorocaba
Abstract:
Stage-structured population models predict transient population dynamics if the population deviates from the stable stage distribution. Ecologists' interest in transient dynamics is growing because populations regularly deviate from the stable stage distribution, which can lead to transient dynamics that differ significantly from the stable stage dynamics. Because the structure of a population matrix (i.e., the number of life-history stages) can influence the predicted scale of the deviation, we explored the effect of matrix size on predicted transient dynamics and the resulting amplification of population size. First, we experimentally measured the transition rates between the different life-history stages and the adult fecundity and survival of the aphid Acyrthosiphon pisum. Second, we used these data to parameterize models with different numbers of stages. Third, we compared model predictions with empirically measured transient population growth following the introduction of a single adult aphid. We found that the models with the largest number of life-history stages predicted the largest transient population growth rates, but in all models there was a considerable discrepancy between predicted and empirically measured transient peaks and a dramatic underestimation of final population sizes. For instance, the mean population size after 20 days was 2394 aphids, compared to the highest predicted population size of 531 aphids; the predicted asymptotic growth rate (λmax), however, was consistent with the experiments. Possible explanations for this discrepancy are discussed. Includes 4 supplemental files.
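A minimal sketch of the projection calculation described, with an invented four-stage matrix rather than the measured aphid rates: a single adult is projected forward, and the transient growth is compared with the asymptotic rate λmax (the dominant eigenvalue).

```python
import numpy as np

# Hypothetical 4-stage projection matrix A (three juvenile stages + adult):
# sub-diagonal entries are stage-transition probabilities, A[0, 3] is adult
# fecundity, A[3, 3] is adult survival. These numbers are illustrative only.
A = np.array([
    [0.0, 0.0, 0.0, 4.0],
    [0.8, 0.0, 0.0, 0.0],
    [0.0, 0.8, 0.0, 0.0],
    [0.0, 0.0, 0.8, 0.6],
])

n = np.array([0.0, 0.0, 0.0, 1.0])  # introduce a single adult
for day in range(20):
    n = A @ n                        # project one time step
print("population after 20 steps:", n.sum())

# Asymptotic growth rate: dominant eigenvalue of A
lam_max = max(abs(np.linalg.eigvals(A)))
print("lambda_max:", lam_max)
```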
Abstract:
The enzymatically catalyzed, template-directed extension of an ssDNA/primer complex is an important reaction of extraordinary complexity. The DNA polymerase does not merely facilitate the insertion of dNMP; it also performs rapid screening of substrates to ensure a high degree of fidelity. Several kinetic studies have determined rate constants and equilibrium constants for the elementary steps that make up the overall pathway. This information is used to develop a macroscopic kinetic model, using an approach described by Ninio [Ninio J., 1987. Alternative to the steady-state method: derivation of reaction rates from first-passage times and pathway probabilities. Proc. Natl. Acad. Sci. U.S.A. 84, 663–667]. The principal idea of the Ninio approach is to track a single template/primer complex over time and to identify the expected behavior. The average time to insert a single nucleotide is a weighted sum of several terms, including the actual time to insert a nucleotide plus delays due to polymerase detachment from either the ternary (template-primer-polymerase) or quaternary (+nucleotide) complex, and time delays associated with the identification and ultimate rejection of an incorrect nucleotide from the binding site. The passage times of all events and their probabilities of occurrence are expressed in terms of the rate constants of the elementary steps of the reaction pathway. The model accounts for variations in the average insertion time with different nucleotides, as well as the influence of the G+C content of the sequence in the vicinity of the insertion site. Furthermore, the model provides estimates of error frequencies. If nucleotide extension is recognized as a competition between successful insertions and time-delaying events, it can be described as a binomial process with a probability distribution. The distribution gives the probability of extending a primer/template complex by a certain number of base pairs and, in general, maps annealed complexes into extension products.
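A minimal sketch of the first-passage bookkeeping described (all numbers are invented placeholders, not fitted rate constants): each attempt either succeeds, samples and rejects a wrong nucleotide, or loses the polymerase, and the expected insertion time follows from the renewal equation over these mutually exclusive outcomes.

```python
# Hypothetical per-attempt outcome probabilities (mutually exclusive) and times (s);
# real values are derived from the rate constants of the elementary steps.
p_wrong, t_wrong = 0.25, 0.02    # wrong dNTP sampled, identified and rejected
p_detach, t_detach = 0.10, 0.50  # polymerase detaches and must rebind
p_ok = 1.0 - p_wrong - p_detach  # successful insertion
t_ok = 0.05

# First-passage expectation, solving
#   E[T] = p_ok*t_ok + p_wrong*(t_wrong + E[T]) + p_detach*(t_detach + E[T])
E_T = (p_ok * t_ok + p_wrong * t_wrong + p_detach * t_detach) / p_ok
print(f"mean time per correctly inserted nucleotide: {E_T:.3f} s")
```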
Abstract:
This paper addresses the functional reliability and the complexity of reconfigurable antennas using graph models. The correlation between complexity and reliability for any given reconfigurable antenna is defined. Two methods are proposed to reduce failures and improve the reliability of reconfigurable antennas, where the failures are caused by the reconfiguration technique or by the surrounding environment. The proposed failure-reduction methods are tested, and examples verifying them are given.
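The abstract does not spell out the graph construction, so the following is only a generic illustration of graph-based reliability analysis (the topology, port choice, and failure probability are all assumptions): antenna parts become nodes, switched connections become edges that fail independently, and Monte Carlo sampling estimates the probability that two ports remain connected.

```python
import random
import networkx as nx

# Hypothetical reconfigurable-antenna graph: nodes are antenna parts, edges are
# switched connections. More edges mean more complexity, but also more redundancy.
edges = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]
p_fail = 0.05  # assumed per-switch failure probability

def reliability(src=0, dst=2, trials=100_000):
    """Monte Carlo estimate of the probability that src stays connected to dst."""
    ok = 0
    for _ in range(trials):
        g = nx.Graph([e for e in edges if random.random() > p_fail])
        ok += g.has_node(src) and g.has_node(dst) and nx.has_path(g, src, dst)
    return ok / trials

print(f"estimated reliability: {reliability():.4f}")
```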
Abstract:
Methods from statistical physics, such as those involving complex networks, have been increasingly used in the quantitative analysis of linguistic phenomena. In this paper, we represented pieces of text with different levels of simplification as co-occurrence networks and found that topological regularity correlated negatively with textual complexity. Furthermore, in less complex texts the distance between concepts, represented as nodes, tended to decrease. The complex network metrics were treated with multivariate pattern recognition techniques, which allowed us to distinguish between original texts and their simplified versions. For each original text, two simplified versions were generated manually with an increasing number of simplification operations. As expected, the distinction was easier for the strongly simplified versions, where the most relevant metrics were node strength, shortest paths, and diversity. Also, the discrimination of complex texts was improved with higher hierarchical network metrics, thus pointing to the usefulness of considering wider contexts around the concepts. Though the accuracy rate of the distinction was not as high as in methods using deep linguistic knowledge, the complex network approach is still useful for a rapid screening of texts whenever assessing complexity is essential to guarantee accessibility to readers with limited reading ability. Copyright (c) EPLA, 2012
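A minimal sketch of the co-occurrence representation (the paper's full metric set and simplification corpus are not reproduced): adjacent words become linked nodes, after which standard metrics such as node strength and average shortest path length can be read off with networkx.

```python
import networkx as nx

def cooccurrence_network(text: str) -> nx.Graph:
    """Link each word to its successor; edge weights count co-occurrences."""
    words = text.lower().split()
    g = nx.Graph()
    for a, b in zip(words, words[1:]):
        if a != b:
            weight = g.get_edge_data(a, b, default={}).get("weight", 0)
            g.add_edge(a, b, weight=weight + 1)
    return g

g = cooccurrence_network("the cat sat on the mat and the cat slept")
strength = dict(g.degree(weight="weight"))  # node strength = sum of edge weights
print("mean node strength:", sum(strength.values()) / g.number_of_nodes())
print("average shortest path length:", nx.average_shortest_path_length(g))
```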
Abstract:
The intention of this paper is to present some Aristotelian arguments regarding motion in the local terrestrial region. Because the explanation in question is highly sophisticated and complex, we deal briefly with the principles and causes underlying the theoretical sciences in general and physics in particular. The article is subdivided into eight topics in order to make these concepts easier to follow for readers not familiar with the Aristotelian texts. To avoid a naive, anachronistic, and linear view, the citations are drawn from primary sources or from commentators on Aristotle's works.
Abstract:
This paper studies the average-cost control problem for discrete-time Markov Decision Processes (MDPs for short) with general state space, Feller transition probabilities, and possibly non-compact control constraint sets A(x). Two hypotheses are considered: either the cost function c is strictly unbounded, or the multifunctions Ar(x) = {a ∈ A(x) : c(x, a) ≤ r} are upper semicontinuous and compact-valued for each real r. For these two cases we provide new results on the existence of a solution to the average-cost optimality equation and inequality using the vanishing discount approach. We also study the convergence of the policy iteration approach under these conditions. It should be pointed out that we do not make any assumptions regarding the convergence and continuity of the limit function generated by the sequence of relative differences of the α-discounted value functions and the Poisson equations, as is often encountered in the literature. (C) 2012 Elsevier Inc. All rights reserved.
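A sketch of the vanishing discount approach on a tiny finite MDP (the paper treats general Feller state spaces; the two-state model below is purely an illustrative assumption): as α → 1, (1 − α)·Vα approaches the optimal average cost, and the relative differences Vα(x) − Vα(x0) approach a solution of the average-cost optimality equation.

```python
import numpy as np

# Illustrative MDP: 2 states, 2 actions. cost[s, a]; P[a, s, s'] transition probs.
cost = np.array([[1.0, 2.0],
                 [0.5, 3.0]])
P = np.array([[[0.9, 0.1],
               [0.2, 0.8]],
              [[0.5, 0.5],
               [0.7, 0.3]]])

def discounted_value(alpha, iters=20_000):
    """Value iteration for the alpha-discounted cost criterion."""
    v = np.zeros(2)
    for _ in range(iters):
        q = cost + alpha * np.einsum("ast,t->sa", P, v)  # Q[s, a]
        v = q.min(axis=1)
    return v

for alpha in (0.9, 0.99, 0.999):
    v = discounted_value(alpha)
    print(f"alpha={alpha}: gain ~ {(1 - alpha) * v[0]:.4f}, bias ~ {v - v[0]}")
```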
Abstract:
Complexity in time series is an intriguing feature of living dynamical systems, with potential use for the identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on a generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the parameter q (qSampEn). qSDiff curves were calculated, which consist of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) recordings, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series from stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiffmax) for q ≠ 1. The values of q at which the maximum occurs and at which qSDiff is zero were also evaluated. Only the qSDiffmax values were capable of distinguishing the HRV groups (p-values 5.10 × 10⁻³, 1.11 × 10⁻⁷, and 5.50 × 10⁻⁷ for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity, which suggests a potential use for chaotic system analysis. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4758815]
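For reference, a minimal implementation of the standard SampEn(m, r) that the paper generalizes (the q-dependent qSampEn itself is the paper's contribution and is not reproduced here):

```python
import numpy as np

def sampen(x, m=2, r_factor=0.2):
    """Sample entropy: -ln(A/B), where B counts pairs of length-m templates within
    tolerance r (Chebyshev distance) and A the same for length m+1."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # Distances from template i to all later templates (no self-matches)
            d = np.abs(templates[i + 1:] - templates[i]).max(axis=1)
            count += int((d <= r).sum())
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
print(sampen(rng.normal(size=500)))  # uncorrelated noise scores a high SampEn
```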
Abstract:
In the past decades, efforts to quantify system complexity with a general tool have usually relied on Shannon's classical information framework, addressing the disorder of the system through the Boltzmann-Gibbs-Shannon entropy or one of its extensions. In recent years, however, there have been attempts to tackle the quantification of algorithmic complexity in quantum systems based on the Kolmogorov algorithmic complexity, obtaining results that diverge from the classical approach. Therefore, a complexity measure is proposed here using the quantum information formalism, taking advantage of the generality of the classically based complexities and capable of expressing these systems' complexity in a framework other than its algorithmic counterparts. To do so, the Shiner-Davison-Landsberg (SDL) complexity framework is considered jointly with the linear entropy of the density operators representing the analyzed systems, along with the tangle as the entanglement measure. The proposed measure is then applied to a family of maximally entangled mixed states.
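A sketch of the named ingredients (linear entropy, SDL complexity, tangle via the Wootters concurrence), evaluated on a two-qubit Werner state as a stand-in for the maximally entangled mixed states studied; combining them this way is an illustrative assumption, not necessarily the paper's exact construction.

```python
import numpy as np

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)  # Bell state |Phi+>

def werner(p):
    """Werner state: p |Phi+><Phi+| + (1 - p) I/4."""
    return p * np.outer(phi, phi) + (1 - p) * np.eye(4) / 4

def linear_entropy(rho):
    d = rho.shape[0]
    return (d / (d - 1)) * (1 - np.trace(rho @ rho).real)  # normalized to [0, 1]

def sdl_complexity(rho):
    delta = linear_entropy(rho)  # disorder
    return delta * (1 - delta)   # SDL measure: disorder x order (alpha = beta = 1)

def tangle(rho):
    """Tangle = squared Wootters concurrence for two qubits."""
    yy = np.kron([[0, -1j], [1j, 0]], [[0, -1j], [1j, 0]])  # sigma_y (x) sigma_y
    lam = np.sqrt(np.abs(np.linalg.eigvals(rho @ yy @ rho.conj() @ yy)))
    lam = np.sort(lam)[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3]) ** 2

for p in (0.2, 0.5, 0.8, 1.0):
    rho = werner(p)
    print(f"p={p}: SDL={sdl_complexity(rho):.3f}, tangle={tangle(rho):.3f}")
```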
Abstract:
With the globalization of financial markets, foreign investment became vital for economies, mainly in emerging countries. In recent decades, the Brazilian exchange rate has proved a good indicator of either investor confidence or risk aversion. Here, some events of global or national financial crisis are analyzed in an attempt to understand how they influenced the evolution of the dollar-real exchange rate. The theoretical tool used is the López-Ruiz-Mancini-Calbet (LMC) complexity measure, which, applied to real exchange-rate data, has shown good correspondence between critical events and measured patterns. (C) 2011 Elsevier B.V. All rights reserved.
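A minimal sketch of the LMC measure on a series (the bin count and the synthetic input are assumptions; the study applies this to dollar-real exchange-rate data): LMC complexity is the product of a normalized Shannon entropy H and a disequilibrium D measuring the distance from the uniform distribution.

```python
import numpy as np

def lmc_complexity(series, bins=30):
    """LMC complexity C = H * D computed from a histogram of the series."""
    counts, _ = np.histogram(series, bins=bins)
    p = counts / counts.sum()
    nz = p[p > 0]
    H = -(nz * np.log(nz)).sum() / np.log(bins)  # normalized Shannon entropy
    D = ((p - 1.0 / bins) ** 2).sum()            # disequilibrium
    return H * D

# Illustrative input: heavy-tailed synthetic "returns" standing in for rate data.
rng = np.random.default_rng(1)
returns = rng.standard_t(df=3, size=2000) * 0.01
print(f"LMC complexity: {lmc_complexity(returns):.5f}")
```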
Abstract:
The use of statistical methods to analyze large databases of text has been useful in unveiling patterns of human behavior and establishing historical links between cultures and languages. In this study, we identified literary movements by treating books published from 1590 to 1922 as complex networks, whose metrics were analyzed with multivariate techniques to generate six clusters of books. The latter correspond to time periods coinciding with relevant literary movements over the last five centuries. The most important factor contributing to the distinctions between different literary styles was the average shortest path length, in particular the asymmetry of its distribution. Furthermore, over time there has emerged a trend toward larger average shortest path lengths, which is correlated with increased syntactic complexity, and a more uniform use of the words reflected in a smaller power-law coefficient for the distribution of word frequency. Changes in literary style were also found to be driven by opposition to earlier writing styles, as revealed by the analysis performed with geometrical concepts. The approaches adopted here are generic and may be extended to analyze a number of features of languages and cultures.
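As a hedged sketch of the word-frequency side of this analysis (the network metrics would require a full text-processing pipeline and are omitted), the snippet below estimates the power-law coefficient of the rank-frequency distribution by least squares on the log-log plot; a smaller coefficient corresponds to the more uniform word usage described above.

```python
import numpy as np
from collections import Counter

def zipf_coefficient(text: str) -> float:
    """Power-law coefficient of the word-frequency distribution, fitted by
    least squares on the log-log rank-frequency plot."""
    freqs = sorted(Counter(text.lower().split()).values(), reverse=True)
    ranks = np.arange(1, len(freqs) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return -slope

print(zipf_coefficient("the quick brown fox jumps over the lazy dog the fox"))
```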