262 results for Randomness
Abstract:
Nonlinear dynamics has emerged as a prominent area of research in the past few decades. Turbulence, pattern formation, and multistability are some of the important areas of research in nonlinear dynamics apart from the study of chaos. Chaos refers to the complex evolution of a deterministic system that is highly sensitive to initial conditions. The study of chaos theory in the modern sense began with the investigations of Edward Lorenz in the mid-1960s, and later developments turned chaos theory into a systematic science of deterministic but complex and unpredictable dynamical systems. This thesis deals with the effect of random fluctuations, with their associated characteristic timescales, on chaos and synchronization. We introduce the concept of noise and discuss two familiar types, presenting the classification and representation of white and colored noise. Based on this, we introduce the concept of randomness that we deal with as a variant of the familiar concept of noise. The dynamical systems considered are the Rössler system, directly modulated semiconductor lasers, and the harmonic oscillator. Since the directly modulated semiconductor laser is not a widely familiar dynamical system, we include a detailed introduction to its relevance for chaotic-encryption-based cryptography in communication. We show that the effect of a fluctuating parameter mismatch on synchronization is to destroy the synchronization. Further, we show that the relation between synchronization error and timescales can be found empirically, though there are also cases where this is not possible. Studies show that under variation of the parameters the system becomes chaotic, apparently via the period-doubling route to chaos.
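For readers unfamiliar with the Rössler system mentioned above, the following is a minimal sketch (not taken from the thesis) of integrating it with SciPy; the parameter values a = b = 0.2, c = 5.7 are the commonly used chaotic defaults, assumed here purely for illustration.

```python
# Minimal sketch: integrate the Rossler system; a, b, c are assumed chaotic defaults.
import numpy as np
from scipy.integrate import solve_ivp

def rossler(t, state, a=0.2, b=0.2, c=5.7):
    x, y, z = state
    return [-y - z, x + a * y, b + z * (x - c)]

sol = solve_ivp(rossler, (0.0, 500.0), [1.0, 1.0, 1.0],
                t_eval=np.linspace(0.0, 500.0, 50_000), rtol=1e-8)
x, y, z = sol.y  # trajectory samples on (or converging to) the chaotic attractor
print(x[-5:])
```

Sensitivity to initial conditions can be seen by repeating the integration from [1.0 + 1e-9, 1.0, 1.0] and watching the two trajectories diverge.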
Abstract:
BACKGROUND: Resting-state functional magnetic resonance imaging (fMRI) enables investigation of the intrinsic functional organization of the brain. Fractal parameters such as the Hurst exponent, H, describe the complexity of endogenous low-frequency fMRI time series on a continuum from random (H = .5) to ordered (H = 1). Shifts in fractal scaling of physiological time series have been associated with neurological and cardiac conditions. METHODS: Resting-state fMRI time series were recorded in 30 male adults with an autism spectrum condition (ASC) and 33 age- and IQ-matched male volunteers. The Hurst exponent was estimated in the wavelet domain and between-group differences were investigated at global and voxel level and in regions known to be involved in autism. RESULTS: Complex fractal scaling of fMRI time series was found in both groups but globally there was a significant shift to randomness in the ASC (mean H = .758, SD = .045) compared with neurotypical volunteers (mean H = .788, SD = .047). Between-group differences in H, which was always reduced in the ASC group, were seen in most regions previously reported to be involved in autism, including cortical midline structures, medial temporal structures, lateral temporal and parietal structures, insula, amygdala, basal ganglia, thalamus, and inferior frontal gyrus. Severity of autistic symptoms was negatively correlated with H in retrosplenial and right anterior insular cortex. CONCLUSIONS: Autism is associated with a small but significant shift to randomness of endogenous brain oscillations. Complexity measures may provide physiological indicators for autism as they have done for other medical conditions.
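As a rough illustration of the kind of estimator referred to above, here is a hedged sketch of a wavelet-variance (Abry-Veitch style) estimate of the Hurst exponent, assuming the series behaves like fractional Gaussian noise and that the PyWavelets package is available; it is not the authors' exact pipeline.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def hurst_wavelet(x, wavelet="db3", level=6):
    # For fractional-Gaussian-noise-like series, log2 of the detail-coefficient
    # variance grows roughly linearly with octave j, with slope 2H - 1.
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=level)
    details = coeffs[1:]                    # coarsest (octave=level) ... finest (octave=1)
    octaves = np.arange(level, 0, -1)
    log_var = [np.log2(np.var(d)) for d in details]
    slope, _ = np.polyfit(octaves, log_var, 1)
    return (slope + 1.0) / 2.0

rng = np.random.default_rng(0)
print(hurst_wavelet(rng.standard_normal(4096)))  # white noise: expect H near 0.5
```

On this scale, values approaching 0.5 indicate more random (less persistent) fluctuations, matching the direction of the group difference reported above.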
Abstract:
Climate change has resulted in substantial variations in annual extreme rainfall quantiles across durations and return periods. Predicting future changes in extreme rainfall quantiles is essential for water resources design, assessment, and decision-making purposes. Current predictions of future rainfall extremes, however, exhibit large uncertainties. According to extreme value theory, rainfall extremes are random variables whose distributions change with return period, so there are uncertainties even under current climate conditions. Regarding future conditions, our large-scale knowledge is obtained from global climate models forced with particular emission scenarios. There are widely known deficiencies in climate models, particularly with respect to precipitation projections, and the limitations of emission scenarios in representing future global change are also well recognised. Apart from these large-scale uncertainties, downscaling methods add further uncertainty to estimates of future extreme rainfall when they convert larger-scale projections to the local scale. The aim of this research is to address these uncertainties in future projections of extreme rainfall for different durations and return periods. We combined three emission scenarios with two global climate models and used LARS-WG, a well-known weather generator, to stochastically downscale daily climate model projections for the city of Saskatoon, Canada, by 2100. The downscaled projections were further disaggregated to hourly resolution using our new stochastic, non-parametric rainfall disaggregator. Extreme rainfall quantiles can then be estimated for different durations (1-hour, 2-hour, 4-hour, 6-hour, 12-hour, 18-hour and 24-hour) and return periods (2-year, 10-year, 25-year, 50-year, 100-year) using the Generalized Extreme Value (GEV) distribution. By providing multiple realizations of future rainfall, we attempt to measure the extent of the total predictive uncertainty contributed by climate models, emission scenarios, and downscaling/disaggregation procedures. The results show different proportions of these contributors for different durations and return periods.
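The quantile step described above can be sketched with SciPy's GEV implementation; the annual-maximum values below are made up purely to make the example runnable, and the real input would be one series per duration drawn from the downscaled and disaggregated projections.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual-maximum 1-hour rainfall depths (mm), for illustration only.
annual_max = np.array([18.2, 25.1, 22.4, 31.0, 19.7, 27.3, 24.8, 35.6,
                       21.1, 29.4, 23.9, 26.7, 33.2, 20.5, 28.8])

shape, loc, scale = genextreme.fit(annual_max)          # fit the GEV distribution
for T in (2, 10, 25, 50, 100):                          # return periods (years)
    q = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year quantile: {q:.1f} mm")
```

Repeating this over many downscaled realizations, scenarios, and models gives the spread used to characterise the predictive uncertainty.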
Abstract:
We study quasi-random properties of k-uniform hypergraphs. Our central notion is uniform edge distribution with respect to large vertex sets. We will find several equivalent characterisations of this property, and our work can be viewed as an extension of the well-known Chung-Graham-Wilson theorem for quasi-random graphs. Moreover, let K(k) be the complete graph on k vertices and M(k) the line graph of the graph of the k-dimensional hypercube. We will show that the pair of graphs (K(k), M(k)) has the property that if the numbers of copies of both K(k) and M(k) in another graph G are as expected in the random graph of density d, then G is quasi-random (in the sense of the Chung-Graham-Wilson theorem) with density close to d. (C) 2011 Wiley Periodicals, Inc. Random Struct. Alg., 40, 1-38, 2012
Abstract:
Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the ‘nugget’, which represents the compositional dissimilarity at zero spatial distance. Following similar work on plant communities, we assume that this represents the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a high proportion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition, perhaps because the bird communities were relatively insensitive to management intensity. We found high temporal variation in the degree of randomness, which may indicate temporal variation in assembly processes and in the importance of key environmental drivers. We conclude that the degree of randomness in community composition should be considered in bird community studies, and the high values we find may indicate that bird community composition is relatively hard to predict at the regional scale.
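The ‘nugget’ calculation described above can be sketched as follows; the dissimilarity measure (Bray-Curtis) and the simple linear fit are assumptions for illustration, not necessarily the exact choices of the study.

```python
import numpy as np
from itertools import combinations

def degree_of_randomness(abundances, coords):
    # Regress pairwise compositional dissimilarity on pairwise distance and
    # return the y-intercept ("nugget"): the dissimilarity at zero distance.
    dis, dist = [], []
    for i, j in combinations(range(len(coords)), 2):
        a, b = abundances[i], abundances[j]
        dis.append(np.abs(a - b).sum() / (a + b).sum())   # Bray-Curtis dissimilarity
        dist.append(np.linalg.norm(coords[i] - coords[j]))
    slope, intercept = np.polyfit(dist, dis, 1)
    return intercept

rng = np.random.default_rng(2)
abund = rng.poisson(3.0, size=(20, 15))    # 20 hypothetical patches x 15 species
xy = rng.uniform(0.0, 50.0, size=(20, 2))  # patch coordinates (km)
print(degree_of_randomness(abund, xy))
```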
Abstract:
The innocent Job suffers, his friends are no help, and then Job screams at Yahweh, demanding justice. The first surprise is that Yahweh responds to Job, does not criticize him, but tells him he is ignorant and then gives him a science lesson. The content of that lesson is the second surprise.
Abstract:
Despite many diverse theories that address closely related themes (e.g., probability theory, algorithmic complexity, cryptanalysis, and pseudorandom number generation), a near-void remains in constructive methods certified to yield the desired “random” output. Herein, we provide explicit techniques to produce broad sets of both highly irregular finite sequences and normal infinite sequences, based on constructions and properties derived from approximate entropy (ApEn), a computable formulation of sequential irregularity. Furthermore, for infinite sequences, we considerably refine normality by providing methods for constructing diverse classes of normal numbers, classified by the extent to which initial segments deviate from maximal irregularity.
Abstract:
The fundamental question "Are sequential data random?" arises in myriad contexts, often with severe data length constraints. Furthermore, there is frequently a critical need to delineate nonrandom sequences in terms of closeness to randomness, e.g., to evaluate the efficacy of therapy in medicine. We address both these issues from a computable framework via a quantification of regularity, ApEn (approximate entropy), defining maximal randomness for sequences of arbitrary length and indicating applicability to sequences as short as N = 5 points. An infinite sequence formulation of randomness is introduced that retains the operational (and computable) features of the finite case. In the infinite sequence setting, we indicate how the "foundational" definition of independence in probability theory and the definition of normality in number theory reduce to limit theorems without rates of convergence, from which we utilize ApEn to address rates of convergence (of a deficit from maximal randomness), refining the aforementioned concepts in a computationally essential manner. Representative applications among many are indicated to assess (i) random number generation output; (ii) well-shuffled arrangements; and (iii) (the quality of) bootstrap replicates.
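For concreteness, a plain (unoptimised) sketch of ApEn for a finite 1-D sequence is shown below, following the standard template-matching definition; the default tolerance r = 0.2·SD is a common convention assumed here, not a prescription from the paper.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    # ApEn(m, r) = phi(m) - phi(m + 1), where phi(m) averages the log-frequency
    # with which each length-m template is matched (within tolerance r).
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def phi(mm):
        n = len(x) - mm + 1
        templates = np.array([x[i:i + mm] for i in range(n)])
        # Chebyshev distance between every pair of templates (self-matches included)
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = (dist <= r).sum(axis=1) / n
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(1)
print(approximate_entropy(rng.integers(0, 2, 300)))  # fair coin flips: close to log 2
```

Higher values indicate greater irregularity; maximally random binary sequences approach log 2 for this choice of m and r.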
Abstract:
Physical infrastructure assets are important components of our society and our economy. They are usually designed to last for many years, are expected to be heavily used during their lifetime, carry considerable load, and are exposed to the natural environment. They are also normally major structures, and therefore represent a heavy investment, requiring constant management over their life cycle to ensure that they perform as required by their owners and users. Given a complex and varied infrastructure life cycle, constraints on available resources, and continuing requirements for effectiveness and efficiency, good management of infrastructure is important. While there is often no single best management approach, the choice of options is improved by better identification and analysis of the issues, by the ability to prioritise objectives, and by a scientific approach to the analysis process. The abilities to better understand the effect of inputs in the infrastructure life cycle on results, to minimise uncertainty, and to better evaluate the effect of decisions in a complex environment are important in allocating scarce resources and making sound decisions. Through the development of an infrastructure management modelling and analysis methodology, this thesis provides a process that assists the infrastructure manager in the analysis, prioritisation and decision-making process. This is achieved through the use of practical, relatively simple tools, integrated in a modular, flexible framework that aims to provide an understanding of the interactions and issues in the infrastructure management process. The methodology uses a combination of flowcharting and analysis techniques. It first charts the infrastructure management process and its underlying infrastructure life cycle through the time interaction diagram, a graphical flowcharting methodology that is an extension of methodologies for modelling data flows in information systems. This process divides the infrastructure management process over time into self-contained modules, each based on a particular set of activities, with the information flows between modules defined by their interfaces and relationships. The modular approach also permits more detailed analysis, or aggregation, as the case may be. It also forms the basis of extending the infrastructure modelling and analysis process to infrastructure networks, by using individual infrastructure assets and their related projects as the basis of the network analysis process. It is recognised that the infrastructure manager is required to meet, and balance, a number of different objectives, and therefore a number of high-level outcome goals for the infrastructure management process have been developed, based on common purpose or measurement scales. These goals form the basis of classifying the larger set of multiple objectives for analysis purposes. A two-stage approach that rationalises and then weights objectives, using a paired comparison process, ensures that the objectives to be met are both kept to the minimum number required and fairly weighted. Qualitative variables are incorporated into the weighting and scoring process, with utility functions proposed where there is risk or a trade-off situation applies. Variability is considered important in the infrastructure life cycle; the approach used is based on analytical principles but incorporates randomness in variables where required.
The modular design of the process permits alternative processes to be used within particular modules, if this is considered a more appropriate way of analysis, provided boundary conditions and requirements for linkages to other modules are met. Development and use of the methodology has highlighted a number of infrastructure life cycle issues, including data and information aspects and the consequences of change over the life cycle, as well as variability and the other matters discussed above. It has also highlighted the need to use judgment where required, and for organisations that own and manage infrastructure to retain intellectual knowledge regarding that infrastructure. It is considered that the methodology discussed in this thesis, which to the author's knowledge has not been developed elsewhere, may be used for the analysis of alternatives, planning, prioritisation of a number of projects, and identification of the principal issues in the infrastructure life cycle.
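As one way to read the paired-comparison weighting step mentioned above, here is a generic sketch based on simple row counts of "wins" in a preference matrix; the objectives and the tie-breaking offset are hypothetical, and the thesis's exact two-stage procedure is not reproduced.

```python
import numpy as np

objectives = ["safety", "cost", "service level", "environment"]   # hypothetical set
# wins[i, j] = 1 if objective i was preferred over objective j in a paired comparison.
wins = np.array([[0, 1, 1, 1],
                 [0, 0, 1, 1],
                 [0, 0, 0, 0],
                 [0, 0, 1, 0]], dtype=float)

scores = wins.sum(axis=1) + 0.5     # +0.5 keeps the lowest-ranked objective above zero
weights = scores / scores.sum()     # normalise to weights summing to 1
for name, w in zip(objectives, weights):
    print(f"{name:>13}: {w:.2f}")
```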
Abstract:
This paper analyzes the effects of different practice task constraints on heart rate (HR) variability during 4v4 small-sided football games. Participants were sixteen football players divided into two age groups (U13, mean age 12.4±0.5 yrs; U15, 14.6±0.5 yrs). The task consisted of a 4v4 sub-phase without goalkeepers on a 25×15 m field, lasting 15 minutes per condition with an active recovery period of 6 minutes between conditions. We recorded players’ heart rates using heart rate monitors (Polar Team System, Polar Electro, Kempele, Finland) as scoring mode was manipulated (line goal: scoring by dribbling past an extended line; double goal: scoring in either of two lateral goals; and central goal: scoring only in one goal). Subsequently, %HR reserve was calculated with the Karvonen formula. We performed a time-series analysis of HR for each individual in each condition. Mean data for intra-participant variability showed that the autocorrelation function was associated with more short-range dependence processes in the “line goal” condition than in the other conditions, indicating that the “line goal” constraint induced more randomness in the HR response. Regarding inter-individual variability, line goal constraints showed lower %CV and %RMSD (U13: 9% and 19%; U15: 10% and 19%) than double goal (U13: 12% and 21%; U15: 12% and 21%) and central goal (U13: 14% and 24%; U15: 13% and 24%) task constraints, respectively. The results suggest that line goal constraints imposed more randomness on the cardiovascular stimulation of each individual and lower inter-individual variability than double goal and central goal constraints.
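The %HR reserve computation referred to above follows the Karvonen idea of expressing heart rate relative to the resting-to-maximal range; the numbers in the example are hypothetical, since individual HRrest and HRmax values are not given here.

```python
def percent_hr_reserve(hr, hr_rest, hr_max):
    # Karvonen-style %HR reserve: where hr sits between resting and maximal HR.
    return 100.0 * (hr - hr_rest) / (hr_max - hr_rest)

print(percent_hr_reserve(hr=172, hr_rest=60, hr_max=205))  # ~77% of HR reserve
```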
Abstract:
Objectives: To quantify randomness and cost when choosing health and medical research projects for funding. Design: Analysis of retrospective data from grant review panels. Setting: The National Health & Medical Research Council of Australia. Participants/Data: All panel members’ scores for grant proposals submitted in 2009. Main outcome measure: The proportion of grant proposals that were always, sometimes and never funded after accounting for random variability arising from variation in panel members’ scores; the cost-effectiveness of different size assessment panels. Results: 59% of 620 funded grants were sometimes not funded when random variability was accounted for. Only 9% of grant proposals were always funded, 61% were never funded and 29% were sometimes funded. The extra cost per grant effectively funded from the most effective system was $18,541. Conclusions: Allocating funding for scientific research in health and medicine is costly and somewhat random. There are many useful research questions to be addressed that could improve current processes.
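One way to mimic "accounting for random variability in panel members' scores" is to resample panelists and see how often each proposal stays above the funding line; the sketch below is a generic bootstrap of that idea, not necessarily the study's exact procedure.

```python
import numpy as np

def funding_stability(scores, n_funded, n_sim=5_000, seed=0):
    # scores: (n_proposals, n_panelists) raw panel scores.
    # Returns, for each proposal, the fraction of resampled panels in which it is funded.
    rng = np.random.default_rng(seed)
    n_prop, n_pan = scores.shape
    funded = np.zeros(n_prop)
    for _ in range(n_sim):
        cols = rng.integers(0, n_pan, n_pan)          # resample panelists with replacement
        mean_score = scores[:, cols].mean(axis=1)
        top = np.argsort(mean_score)[::-1][:n_funded]
        funded[top] += 1
    return funded / n_sim

rng = np.random.default_rng(1)
panel = rng.normal(5.0, 1.0, size=(50, 7))            # 50 hypothetical proposals, 7 panelists
p = funding_stability(panel, n_funded=10)
print((p == 1).mean(), ((p > 0) & (p < 1)).mean(), (p == 0).mean())  # always / sometimes / never
```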
Abstract:
Motorcycles are particularly vulnerable in right-angle crashes at signalized intersections. The objective of this study is to explore how variations in roadway characteristics, environmental factors, traffic factors, maneuver types, human factors, and driver demographics influence the right-angle crash vulnerability of motorcycles at intersections. The problem is modeled using a mixed logit model with a binary choice formulation to differentiate how an at-fault vehicle collides with a not-at-fault motorcycle in comparison to other collision types. The mixed logit formulation allows randomness in the parameters and hence accounts for underlying heterogeneities potentially inherent in driver behavior and other unobserved variables. A likelihood ratio test reveals that the mixed logit model is indeed better than the standard logit model. Night-time riding shows a positive association with the vulnerability of motorcyclists. Moreover, motorcyclists are particularly vulnerable on single-lane roads, on the curb and median lanes of multi-lane roads, and on one-way and two-way road types relative to divided highways. Drivers who deliberately run red lights, as well as those who are careless towards motorcyclists, especially when making turns at intersections, increase the vulnerability of motorcyclists. Drivers appear more restrained when there is a passenger on board, which decreases the crash potential with motorcyclists. The presence of red light cameras also significantly decreases the right-angle crash vulnerability of motorcyclists. The findings of this study should help in developing more targeted countermeasures for traffic enforcement, driver/rider training and/or education, and safety awareness programs to reduce the vulnerability of motorcyclists.
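The mixed logit idea of letting a coefficient vary randomly across drivers can be sketched with simulated maximum likelihood; the specification below (one normally distributed coefficient on a hypothetical night-time indicator, synthetic data) is a textbook-style illustration, not the authors' model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_sim_loglik(params, X, z, y, draws):
    # Fixed coefficients on X; the coefficient on z is random across drivers,
    # N(mu, sigma^2), and is integrated out by averaging over simulation draws.
    k = X.shape[1]
    beta, mu, log_sigma = params[:k], params[k], params[k + 1]
    b = mu + np.exp(log_sigma) * draws                   # (n_draws,)
    util = (X @ beta)[:, None] + np.outer(z, b)          # (n_obs, n_draws)
    p1 = expit(util).mean(axis=1)                        # simulated P(y = 1)
    p = np.where(y == 1, p1, 1.0 - p1)
    return -np.log(p + 1e-12).sum()

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.standard_normal(n)])  # intercept + one fixed covariate
z = rng.integers(0, 2, n).astype(float)                    # hypothetical night-time indicator
y = rng.integers(0, 2, n)                                  # 1 = right-angle crash with a motorcycle
draws = rng.standard_normal(200)
res = minimize(neg_sim_loglik, np.zeros(X.shape[1] + 2),
               args=(X, z, y, draws), method="BFGS")
print(res.x)  # fixed coefficients, then mu and log(sigma) of the random coefficient
```

A likelihood ratio test against the restricted model with sigma fixed at zero (a standard logit) is what motivates the mixed specification reported above.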