987 results for Complexity Theory


Relevance:

20.00%

Publisher:

Abstract:

In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are distinct theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the observed difference, or one more extreme, given that the null is true. Another concern is the risk that a substantial proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
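
A minimal sketch of the two notions, using hypothetical data and a one-sample t test (the abstract specifies no particular test): the p value is computed assuming the null hypothesis, while the Neyman-Pearson procedure fixes the Type I error level in advance and reports only a reject/fail-to-reject decision.

    # Hypothetical data; the test and the threshold are illustrative choices only.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    sample = rng.normal(loc=0.3, scale=1.0, size=30)   # 30 hypothetical measurements

    # Fisher: p = probability, assuming H0, of a statistic at least this extreme.
    t_stat, p_value = stats.ttest_1samp(sample, popmean=0.0)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

    # Neyman-Pearson: fix the Type I error level alpha in advance and reject H0
    # when the statistic falls in the critical region (equivalently, p < alpha).
    alpha = 0.05
    print("reject H0" if p_value < alpha else "fail to reject H0")

    # Note: p is not the probability that H0 is true; it is computed assuming H0.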

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to assess a population of patients with diabetes mellitus by means of the INTERMED, a classification system for case complexity integrating biological, psychosocial and health care related aspects of disease. The main hypothesis was that the INTERMED would identify distinct clusters of patients with different degrees of case complexity and different clinical outcomes. Patients (n=61) referred to a tertiary reference care centre were evaluated with the INTERMED and followed for 9 months for HbA1c values and for 6 months for health care utilisation. Cluster analysis revealed two clusters: cluster 1 (62%) consisting of complex patients with high INTERMED scores and cluster 2 (38%) consisting of less complex patients with lower INTERMED scores. Cluster 1 patients showed significantly higher HbA1c values and a tendency for increased health care utilisation. Total INTERMED scores were significantly related to HbA1c and explained 21% of its variance. In conclusion, different clusters of patients with different degrees of case complexity were identified by the INTERMED, allowing the detection of highly complex patients at risk for poor diabetes control. The INTERMED therefore provides an objective basis for clinical and scientific progress in diabetes mellitus. Ongoing intervention studies will have to confirm these preliminary data and to evaluate whether management strategies based on the INTERMED profiles will improve outcomes.
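
As a rough sketch of this kind of analysis, assuming hypothetical INTERMED totals and HbA1c values and a generic two-cluster k-means (the abstract does not state which clustering algorithm or model was actually used):

    # Hypothetical data for 61 patients; clustering method and effect size are
    # illustrative assumptions, not the study's actual procedure or results.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    intermed = np.concatenate([rng.normal(30, 5, 38), rng.normal(18, 4, 23)])  # total scores
    hba1c = 6.5 + 0.05 * intermed + rng.normal(0, 0.6, intermed.size)

    X = intermed.reshape(-1, 1)
    clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

    r_squared = LinearRegression().fit(X, hba1c).score(X, hba1c)  # share of HbA1c variance explained
    print(np.bincount(clusters), round(r_squared, 2))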

Relevance:

20.00%

Publisher:

Abstract:

The author reviews past work with Ibict and the global progress made by the Open Access Movement. He postulates that open access is an example of a complex adaptive system created by Internet-based scholarly publishing. Open access could be the cause of a cascade of increasing complexity and opportunities that will reshape this system. He has chosen the pervasive and global "Connectedness" created by the Internet, and the content spaces it provides for open access collections, as a "simple disruptive agent". He discusses how connectedness influences infinite variety, creativity, work, change, knowledge, and the information economy. Case studies from the University of New Mexico Libraries are used where appropriate.

Relevance:

20.00%

Publisher:

Abstract:

This paper evaluates the reception of Léon Walras' ideas in Russia before 1920. Despite an unfavourable institutional context, Walras was read by Russian economists. On the one hand, Bortkiewicz and Winiarski, who lived outside Russia and had the opportunity to meet and correspond with Walras, were first-class readers and very good ambassadors for Walras' ideas; on the other, the economists living in Russia were more selective in their readings. They restricted themselves to Walras' Elements of Pure Economics, in particular its theory of exchange, while ignoring its theory of production. We introduce a cultural argument to explain their selective reading. JEL classification numbers: B13, B19.

Relevance:

20.00%

Publisher:

Abstract:

Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models. A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in terms of computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales. (C) 2012 Elsevier B.V. All rights reserved.
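
A minimal sketch of the fixed-lid planar water surface idea mentioned above, with entirely hypothetical bed elevations, slope, and unit discharge (the actual RC model is considerably more involved):

    # Depths come from a planar ("fixed-lid") water surface laid over the bed,
    # rather than from solving the flow equations; all values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(2)
    x = np.linspace(0.0, 30_000.0, 301)                      # streamwise distance (m)
    z_bed = 40.0 - 2e-5 * x + rng.normal(0.0, 0.3, x.size)   # bed elevation (m)

    eta_upstream = 45.0      # water surface elevation at the inflow (m)
    slope = 4e-5             # very low surface slope, as in large sand-bed rivers
    eta = eta_upstream - slope * x                           # planar water surface

    depth = np.clip(eta - z_bed, 0.0, None)                  # flow depth (m)
    unit_q = 5.0                                             # unit discharge (m^2/s)
    velocity = np.divide(unit_q, depth, out=np.zeros_like(depth), where=depth > 0)
    print(depth[:3].round(2), velocity[:3].round(2))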

Relevance:

20.00%

Publisher:

Abstract:

The Rusk-Skinner formalism was developed in order to give a geometrical unified formalism for describing mechanical systems. It incorporates all the characteristics of Lagrangian and Hamiltonian descriptions of these systems (including dynamical equations and solutions, constraints, Legendre map, evolution operators, equivalence, etc.). In this work we extend this unified framework to first-order classical field theories, and show how this description comprises the main features of the Lagrangian and Hamiltonian formalisms, both for the regular and singular cases. This formulation is a first step toward further applications in optimal control theory for partial differential equations. (C) 2004 American Institute of Physics.
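
For orientation, a brief sketch of the mechanical (one independent variable) case that the paper extends to first-order field theories; the notation below is standard but chosen for this sketch rather than taken from the paper:

    % Unified (Rusk-Skinner) space and coupling function
    W = TQ \times_Q T^*Q, \qquad
    \mathcal{H}(q^i, v^i, p_i) = p_i v^i - L(q^i, v^i)

    % Presymplectic form (pullback of the canonical form on T^*Q) and dynamics
    \Omega = \rho_2^{*}\,\omega_{T^*Q}, \qquad i_X \Omega = \mathrm{d}\mathcal{H}

    % The compatibility (constraint) conditions recover the Legendre map
    % p_i = \partial L / \partial v^i, and the integral curves of X project onto
    % solutions of both the Euler--Lagrange and Hamilton equations.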

Relevance:

20.00%

Publisher:

Abstract:

This paper studies a risk measure inherited from ruin theory and investigates some of its properties. Specifically, we consider a value-at-risk (VaR)-type risk measure defined as the smallest initial capital needed to ensure that the ultimate ruin probability is less than a given level. This VaR-type risk measure turns out to be equivalent to the VaR of the maximal deficit of the ruin process in infinite time. A related Tail-VaR-type risk measure is also discussed.
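
In standard ruin-theory notation (the paper's own notation may differ), the risk measure described can be written as follows, where S(t) is the aggregate claims process, c the premium rate, and psi(u) the ultimate ruin probability with initial capital u:

    \psi(u) = P\Bigl( \inf_{t \ge 0} \bigl( u + ct - S(t) \bigr) < 0 \Bigr), \qquad
    \rho_\alpha = \inf\{\, u \ge 0 : \psi(u) \le \alpha \,\}

    % Since \psi(u) = P(M > u) with M = \sup_{t \ge 0} (S(t) - ct) the maximal
    % deficit in infinite time, \rho_\alpha coincides with \mathrm{VaR}_{1-\alpha}(M).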

Relevance:

20.00%

Publisher:

Abstract:

A new aggregation method for decision making is presented by using induced aggregation operators and the index of maximum and minimum level. Its main advantage is that it can assess complex reordering processes in the aggregation that represent attitudinal characteristics of the decision maker, such as psychological or personal factors. A wide range of properties and particular cases of this new approach are studied. A further generalization by using hybrid averages and immediate weights is also presented. The key difference of this approach from the previous model is that we can use the weighted average and the ordered weighted average in the same formulation. Thus, we are able to consider the subjective attitude and the degree of optimism of the decision maker in the decision process. The paper ends with an application in a decision making problem based on the use of the assignment theory.
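
A minimal sketch of the building blocks involved, with hypothetical payoffs and weights: a weighted average (WA), an ordered weighted average (OWA), and an induced OWA whose reordering is driven by order-inducing values such as attitudinal scores (the paper's actual operator combines these ingredients with the index of maximum and minimum level):

    # Hypothetical values; only the generic WA, OWA and induced-OWA steps are shown.
    import numpy as np

    def wa(values, weights):
        return float(np.dot(values, weights))

    def owa(values, weights):
        # OWA reorders the arguments from largest to smallest before weighting.
        return float(np.dot(np.sort(values)[::-1], weights))

    def iowa(values, weights, inducing):
        # Induced OWA reorders by a separate order-inducing variable, not by size,
        # which is how attitudinal or psychological factors can drive the reordering.
        order = np.argsort(inducing)[::-1]
        return float(np.dot(np.asarray(values, dtype=float)[order], weights))

    values = [60.0, 30.0, 50.0, 70.0]   # payoffs of one alternative
    weights = [0.4, 0.3, 0.2, 0.1]      # weighting vector
    inducing = [3, 9, 5, 1]             # order-inducing (attitudinal) values
    print(wa(values, weights), owa(values, weights), iowa(values, weights, inducing))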

Relevance:

20.00%

Publisher:

Abstract:

A new model is presented for dealing with decision making under risk by considering subjective and objective information in the same formulation. The uncertain probabilistic weighted average (UPWA) is also presented. Its main advantage is that it unifies the probability and the weighted average in the same formulation while considering the degree of importance that each case has in the analysis. Moreover, it is able to deal with uncertain environments represented in the form of interval numbers. We study some of its main properties and particular cases. The applicability of the UPWA is also studied; it is very broad because all previous studies that use the probability or the weighted average can be revised with this new approach. Focus is placed on a multi-person decision making problem regarding the selection of strategies by using the theory of expertons.
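
A minimal sketch of the unification idea, assuming a convex combination of probabilistic and weighted-average weights (the parameter beta and all data below are illustrative assumptions, not the paper's definition):

    # Interval ("uncertain") payoffs aggregated with unified weights; hypothetical data.
    import numpy as np

    def upwa(intervals, probs, weights, beta=0.5):
        """Uncertain probabilistic weighted average over interval numbers [lo, hi]."""
        v = beta * np.asarray(probs) + (1.0 - beta) * np.asarray(weights)  # unified weights
        lo = float(np.dot(v, [a for a, _ in intervals]))
        hi = float(np.dot(v, [b for _, b in intervals]))
        return lo, hi

    payoffs = [(40, 60), (20, 30), (50, 80)]   # interval payoffs of one strategy
    probs = [0.5, 0.3, 0.2]                    # objective probabilities of the states
    weights = [0.2, 0.3, 0.5]                  # subjective importance weights
    print(upwa(payoffs, probs, weights, beta=0.6))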

Relevance:

20.00%

Publisher:

Abstract:

Understanding the extent of genomic transcription and its functional relevance is a central goal in genomics research. However, detailed genome-wide investigations of transcriptome complexity in major mammalian organs have been scarce. Here, using extensive RNA-seq data, we show that transcription of the genome is substantially more widespread in the testis than in other organs across representative mammals. Furthermore, we reveal that meiotic spermatocytes and especially postmeiotic round spermatids have remarkably diverse transcriptomes, which explains the high transcriptome complexity of the testis as a whole. The widespread transcriptional activity in spermatocytes and spermatids encompasses protein-coding and long noncoding RNA genes but also poorly conserved intergenic sequences, suggesting that it may not be of immediate functional relevance. Rather, our analyses of genome-wide epigenetic data suggest that this prevalent transcription, which most likely promoted the birth of new genes during evolution, is facilitated by an overall permissive chromatin state in these germ cells that results from extensive chromatin remodeling.