188 results for plausibility


Relevance:

10.00%

Publisher:

Abstract:

Objective: Risk scores and accelerated diagnostic protocols can identify chest pain patients at low risk of major adverse cardiac events who could be discharged early from the ED, saving time and costs. We aimed to derive and validate a chest pain score and accelerated diagnostic protocol (ADP) that could safely increase the proportion of patients suitable for early discharge. Methods: Logistic regression identified statistical predictors of major adverse cardiac events in a derivation cohort. Statistical coefficients were converted to whole numbers to create a score. Clinician feedback was used to improve the clinical plausibility and usability of the final score (Emergency Department Assessment of Chest pain Score [EDACS]). EDACS was combined with electrocardiogram results and troponin results at 0 and 2 h to develop an ADP (EDACS-ADP). The score and EDACS-ADP were validated and tested for reproducibility in separate cohorts of patients. Results: In the derivation (n = 1974) and validation (n = 608) cohorts, the EDACS-ADP classified 42.2% (sensitivity 99.0%, specificity 49.9%) and 51.3% (sensitivity 100.0%, specificity 59.0%) of patients, respectively, as at low risk of major adverse cardiac events. The intra-class correlation coefficient for categorisation of patients as low risk was 0.87. Conclusion: The EDACS-ADP identified approximately half of the patients presenting to the ED with possible cardiac chest pain as being at low risk of short-term major adverse cardiac events, with high sensitivity. This is a significant improvement on similar, previously reported protocols. The EDACS-ADP is reproducible and has the potential to deliver considerable cost reductions to health systems.
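The derivation step described above (regression coefficients rounded to whole-number points, then combined with ECG and troponin results into a discharge rule) can be sketched in a few lines. The predictors, coefficients, scaling factor, and cut-off below are hypothetical placeholders, not the published EDACS items or thresholds:

```python
# Sketch of an EDACS-style score: logistic-regression coefficients are
# scaled and rounded to whole-number points, and the resulting score is
# combined with ECG and serial troponin results into an ADP-style rule.
# All coefficients, names, and the cut-off are HYPOTHETICAL illustrations.

HYPOTHETICAL_COEFFS = {
    "age_per_decade": 0.34,
    "male_sex": 0.51,
    "diaphoresis": 0.27,
    "pain_radiates_to_arm": 0.42,
}

def to_points(coeff, scale=6):
    """Convert a regression coefficient to a whole-number score item."""
    return round(coeff * scale)

def edacs_like_score(features):
    """Sum the points of all predictors present in `features`."""
    return sum(to_points(c) for name, c in HYPOTHETICAL_COEFFS.items()
               if features.get(name))

def low_risk(features, ecg_ischemia, troponin_0h_raised, troponin_2h_raised,
             cutoff=4):
    """ADP-style rule: low score AND normal ECG AND two negative troponins."""
    return (edacs_like_score(features) < cutoff
            and not ecg_ischemia
            and not troponin_0h_raised
            and not troponin_2h_raised)
```

The point of the whole-number conversion is usability at the bedside: clinicians add small integers rather than evaluate a logistic model.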

Relevance:

10.00%

Publisher:

Abstract:

My thesis examined an alternative approach, referred to as the unitary taxation approach to the allocation of profit, which arises from the notion that, as a multinational group exists as a single economic entity, it should be taxed as one taxable unit. The plausibility of a unitary taxation regime achieving international acceptance and agreement is highly contestable owing to its implementation issues and questions of economic and political feasibility. Using a case-study approach focusing on Freeport-McMoRan's and Rio Tinto's mining operations in Indonesia, this thesis compares both tax regimes against the criteria for a good tax system: equity, efficiency, neutrality and simplicity. It then evaluates the key issues that arise when implementing a unitary taxation approach with formulary apportionment in the context of multinational mining firms in Indonesia.

Relevance:

10.00%

Publisher:

Abstract:

This paper concentrates on Heraclitus, Parmenides and Lao Zi. The focus is on their ideas on change and on whether the world is essentially One or composed of many entities. In the first chapter I go over some general tendencies in Greek and Chinese philosophy. The differences in cultural background influence the way philosophy is done, but the paper aims to show that two questions can be brought up when comparing the philosophies of Heraclitus, Parmenides and Lao Zi: Is the world essentially One or Many? Is change real, and if it is, what is its nature and how does it take place? For Heraclitus change is real and, as will be shown later in the chapter, quite essential for the sustainability of the world-order (kosmos). The key concept in the case of Heraclitus is Logos. Heraclitus uses Logos in several senses, the best known relating to his element-theory. But another important feature of the Logos, the content of real wisdom, is the ability to regard everything as one. This does not mean that the world is essentially one for Heraclitus in the ontological sense, but that we should see the underlying unity of multiple phenomena. Heraclitus expresses this as hen panta: All from One, One from All. I characterize Heraclitus as an epistemic monist and an ontological pluralist. It is plausible that Heraclitus' views on change were the focus of Parmenides' severe criticism. Parmenides held the view that the world is essentially one and that to see it as consisting of many entities was the error of mortals, i.e. of the common man and of his philosophical predecessors. For Parmenides, what-is can be approached by two routes: the Way of Truth (Aletheia) and the Way of Seeming (Doxa). Aletheia essentially sees the world as one, where even time is an illusion. In Doxa, Parmenides gives an explanation of the world seen as consisting of many entities, and this is his contribution to the line of thought of his predecessors.
It should be noted that a strong emphasis is given to Aletheia, whereas the world-view given in Doxa is only probable. I go on to describe Parmenides as an ontological monist who grants some plausibility to pluralistic views. In the work of Lao Zi, the world can be seen as One or as consisting of many entities. In my interpretation, Lao Zi uses Dao in two different senses: Dao is the totality of things, or the order in change. The wu-aspect (seeing-without-form) attends to the world as one, whereas the you-aspect attends to the world of many entities. In the wu-aspect, Dao refers to the totality of things, while in the you-aspect Dao is the order or law in change. There are two insights in Lao Zi regarding the relationship between the wu- and you-aspects: chapter 1 states that they are two separate aspects of seeing the world, while other chapters hold that you comes from wu. This naturally raises the question of whether the One is the peak of seeing the world as many; in other words, whether there is a way from pluralism to monism. All these considerations make it probable that the work attributed to Lao Zi has had new material added to it or is a compilation of oral sayings. At the end of the paper I give some insights on how Logos and Dao can be compared in a relevant manner. I also compare Parmenides' holistic monism to Lao Zi's Dao as nameless totality (i.e. in its wu-aspect). I briefly touch on the issues of Heidegger and the future of comparative philosophy.

Relevance:

10.00%

Publisher:

Abstract:

This research deals with direct speech quotations in magazine articles through two questions. As my major research question, I study the functions of speech quotations, based on data consisting of six literary-journalistic magazine articles. My minor research question builds on the fact that there is no absolute relation between the sound waves of spoken language and the graphemes of written language. Hence, I study general views on how utterances should be arranged in written form, based on a large review of literature and textbooks on journalistic writing as well as interviews I have conducted with magazine writers and editors and with the Council for Mass Media in Finland. To support my main research questions, I also examine the reference system of the Finnish language, define the aspects of the literary-journalistic article, and study vernacular cues in written speech quotations. FUNCTIONS OF QUOTATIONS. I demonstrate the results of my analysis with a six-point apparatus. It is a continuum which extends from the structural level of the text, through the explicit functions, all the way to the implicit functions of the quotation. The explicit functions deal with the question of what the content is, whereas the implicit ones are based mainly on how the content is presented. 1. The speech quotation is a distinctive element in the structure of the magazine article. Thereby it creates a rhythm for the text, through episodes, paragraphs and clauses. 2. All stories are told through a plot, and in magazine articles the speech quotations are one of the narrative elements that propel the plot forward. 3. The speech quotations create and intensify the location written into the story. This location can be a physical one but also a social one, in which case it describes the atmosphere and mood of the physical environment and of the story's characters. 4. The quotations enhance the plausibility of the facts and assumptions presented in the article; moreover, when a text is placed between quotation marks, the reader can be assured that it has been reproduced in an authentic, verbatim way. 5. Speech quotations tell about the speaker's unique way of using language and about the first-hand experiences of the person quoted. 6. The sixth function of speech quotations is probably the most essential one: the quotations characterize the quoted speaker. In other words, in addition to the propositional content of the utterance, the way in which it has been said transmits a lot about the speaker's character (e.g. nature, generation, behaviour, education, attitudes, etc.). It is important to notice that these six functions of my speech-quotation apparatus do not exclude one another: every speech quotation basically includes all of the functions discussed above. However, in practice one or more of them play a principal role, while the others play a subsidiary role. HOW TO MAKE QUOTATIONS? It is not surprising that the field of journalism (textbooks, literature and interviews) holds heterogeneous and unsettled views on how spoken language should be arranged in written quotations, which is my minor research question. However, the most frequent and distinctive advice can be captured in a few words: serve the reader and respect the person quoted. Very common advice on how to arrange quotations is − firstly, to delete such vernacular cues (e.g. repetitions and "expletives") as are common in spoken communication but purposeless in written language; − secondly, to complete the phonetic word forms of spoken language into a more reader-friendly form (e.g. punanen → punainen, 'red'); and − thirdly, to enhance the independence of clauses from their (authentic) context and to tighten the reciprocal links between them.
According to the knowledge of the journalistic field, utterances recorded at different points in time of an interview or data-collecting session can be transferred as consecutive quotations or even merged together. However, if there is any temporal-spatial location written into the story, the dialogue of the story's characters should also be situated in an authentic context – chronologically in the right place in the continuum of events. To summarize, the way in which utterances should be arranged into written speech quotations is always situation-specific, and it is strongly based on the author's discretion.

Relevance:

10.00%

Publisher:

Abstract:

In this work we numerically model isothermal turbulent swirling flow in a cylindrical burner. Three versions of the RNG k-epsilon model are assessed against the performance of the standard k-epsilon model. Sensitivity of the numerical predictions to grid refinement, to differing convective differencing schemes and to the choice of (unknown) inlet dissipation rate was closely scrutinised to ensure accuracy. Particular attention is paid to modelling the inlet conditions to within the range of uncertainty of the experimental data, as model predictions proved to be significantly sensitive to relatively small changes in upstream flow conditions. We also examine the characteristics of the swirl-induced internal recirculation zone (IRZ) predicted by the models over an extended range of inlet conditions. Our main findings are: - (i) the standard k-epsilon model performed best compared with experiment; - (ii) no one inlet specification can simultaneously optimize the performance of all the models considered; - (iii) the RNG models predict both single-cell and double-cell IRZ characteristics, the latter both with and without additional internal stagnation points. The first finding indicates that the examined RNG modifications to the standard k-epsilon model do not result in an improved eddy-viscosity-based model for the prediction of swirl flows. The second finding suggests that tuning established models for optimal performance in swirl flows a priori is not straightforward. The third finding indicates that the RNG-based models exhibit a greater variety of structural behaviour, despite being of the same level of complexity as the standard k-epsilon model. The plausibility of the predicted IRZ features is discussed in terms of known vortex breakdown phenomena.
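The standard and RNG k-epsilon models share the same eddy-viscosity closure and differ mainly in the model constants and the dissipation-rate equation (not shown here). A minimal sketch of the shared closure, and of why the (unknown) inlet dissipation rate matters so much, is:

```python
# Eddy-viscosity closure shared by the standard and RNG k-epsilon models:
# nu_t = C_mu * k**2 / epsilon. The standard model uses C_mu = 0.09; the
# RNG derivation yields a slightly smaller constant (~0.0845). The RNG
# variants differ further in the epsilon transport equation, not modeled here.

def eddy_viscosity(k, eps, c_mu=0.09):
    """Turbulent (eddy) viscosity [m^2/s] from turbulent kinetic energy
    k [m^2/s^2] and dissipation rate eps [m^2/s^3]."""
    return c_mu * k * k / eps
```

Because nu_t scales inversely with epsilon, halving the assumed inlet dissipation rate at fixed k doubles the inlet eddy viscosity, which is one reason predictions are so sensitive to the inlet specification.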

Relevance:

10.00%

Publisher:

Abstract:

Regional impacts of climate change remain subject to large uncertainties accumulating from various sources, including those due to the choice of general circulation models (GCMs), scenarios, and downscaling methods. Objective constraints to reduce the uncertainty in regional predictions have proven elusive. In most studies to date, the nature of the downscaling relationship (DSR) used for such regional predictions has been assumed to remain unchanged in a future climate. However, studies have shown that climate change may manifest in terms of changes in the frequencies of occurrence of the leading modes of variability, and hence stationarity of DSRs is not really a valid assumption in regional climate impact assessment. This work presents an uncertainty modeling framework where, in addition to GCM and scenario uncertainty, uncertainty in the nature of the DSR is explored by linking downscaling with changes in the frequencies of such modes of natural variability. Future projections of the regional hydrologic variable obtained by training a conditional random field (CRF) model on each natural cluster are combined using the weighted Dempster-Shafer (D-S) theory of evidence combination. Each projection is weighted with the future projected frequency of occurrence of that cluster ("cluster linking") and scaled by the GCM performance with respect to the associated cluster for the present period ("frequency scaling"). The D-S theory was chosen for its ability to express beliefs in some hypotheses, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in the results. The methodology is tested for predicting monsoon streamflow of the Mahanadi River at Hirakud Reservoir in Orissa, India. The results show an increasing probability of extreme, severe, and moderate droughts due to climate change.
Significantly improved agreement between GCM predictions owing to cluster linking and frequency scaling is seen, suggesting that by linking regional impacts to natural regime frequencies, uncertainty in regional predictions can be realistically quantified. Additionally, by using a measure of GCM performance in simulating natural regimes, this uncertainty can be effectively constrained.
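One classical way to weight an evidence source before Dempster-style combination is Shafer's discounting rule, which moves mass toward the whole frame (total ignorance) in proportion to the source's unreliability. The paper's exact weighting scheme may differ; the masses, classes, and weight below are invented for illustration only:

```python
# Sketch of evidence discounting (Shafer's rule): a reliability weight
# alpha in [0, 1] scales every focal element's mass, and the removed mass
# is assigned to the whole frame of discernment. This is one standard way
# to realize per-source weighting before combination; the study's own
# weighting scheme is not reproduced here. Values are ILLUSTRATIVE.

def discount(m, alpha, frame):
    """Discount mass function `m` by reliability `alpha` over `frame`."""
    out = {s: alpha * v for s, v in m.items() if s != frame}
    out[frame] = 1.0 - alpha + alpha * m.get(frame, 0.0)
    return out

FRAME = frozenset({"drought", "normal", "wet"})
m_cluster = {frozenset({"drought"}): 0.7, FRAME: 0.3}   # invented bpa
m_weighted = discount(m_cluster, 0.8, FRAME)            # weight 0.8
```

After discounting, the mass function still sums to one, so discounted sources from different clusters can be combined by the usual D-S rules.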

Relevance:

10.00%

Publisher:

Abstract:

Representation and quantification of uncertainty in climate change impact studies are a difficult task. Several sources of uncertainty arise in studies of the hydrologic impacts of climate change, such as those due to the choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory (stochastic) uncertainty and epistemic (subjective) uncertainty. This paper shows how the D-S theory can be used to represent beliefs in some hypotheses, such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in the results. The D-S approach has been used in this work for information synthesis using various evidence combination rules with different approaches to modeling conflict. A case study is presented for hydrologic drought prediction using downscaled streamflow of the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, and are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents the uncertainty associated with each of the SSFI-4 classifications.
These uncertainties are then combined across GCMs and scenarios using the various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in the projected frequencies of the SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and the relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and a decreasing probability of normal and wet conditions in Orissa as a result of climate change. (C) 2010 Elsevier Ltd. All rights reserved.
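The belief/plausibility machinery and the best-known D-S combination rule can be sketched concretely. The following minimal illustration uses Dempster's rule over a drought/normal/wet frame with invented mass assignments for two "GCM" sources; it is not the paper's data, and the paper additionally explores other combination rules with different conflict handling:

```python
# Dempster's rule of combination plus belief and plausibility measures.
# Focal elements are frozensets over a frame of discernment; the two
# source mass functions below are ILLUSTRATIVE, not taken from the study.

from itertools import product

def combine(m1, m2):
    """Dempster's rule: intersect focal elements pairwise and renormalize
    by 1 - K, where K is the total mass falling on empty intersections."""
    out, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            out[inter] = out.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb          # conflict mass K
    return {s: v / (1.0 - conflict) for s, v in out.items()}

def belief(m, hypothesis):
    """Bel(A): mass of all focal elements contained in A (lower bound)."""
    return sum(v for s, v in m.items() if s <= hypothesis)

def plausibility(m, hypothesis):
    """Pl(A): mass of all focal elements intersecting A (upper bound)."""
    return sum(v for s, v in m.items() if s & hypothesis)

# Frame: drought (D), normal (N), wet (W); two evidence sources.
D, N, W = frozenset("D"), frozenset("N"), frozenset("W")
m_gcm1 = {D: 0.6, D | N: 0.3, D | N | W: 0.1}
m_gcm2 = {D: 0.5, N: 0.2, D | N | W: 0.3}
m = combine(m_gcm1, m_gcm2)
```

For any hypothesis, Bel is a lower and Pl an upper bound on the unknown probability, which is how the framework reports "belief and plausibility in the results".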

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we give a method for probabilistic assignment in the Realistic Abductive Reasoning Model. The knowledge is assumed to be represented in the form of causal chaining, namely a hyper-bipartite network. The hyper-bipartite network is the most generalized form of knowledge representation, and so far there has been no way of assigning probabilities to its explanations. First, the inference mechanism using the realistic abductive reasoning model is briefly described, and then a probability is assigned to each of the explanations so as to rank the explanations in decreasing order of plausibility.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a novel hypothesis on the function of the massive feedback pathways in mammalian visual systems. We propose that the cortical feature detectors compete not for the right to represent the output at a point, but for exclusive rights to abstract and represent part of the underlying input. Feedback can do this very naturally. A computational model that implements the above idea for the problem of line detection is presented, and based on it we suggest a functional role for the thalamo-cortical loop during the perception of lines. We show that the model successfully tackles the so-called Cross problem. Based on some recent experimental results, we discuss the biological plausibility of our model. We also comment on the relevance of our hypothesis (on the role of feedback) to general sensory information processing and recognition. (C) 1998 Published by Elsevier Science Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

The moments of the hadronic spectral functions are of interest for the extraction of the strong coupling alpha(s) and other QCD parameters from the hadronic decays of the tau lepton. Motivated by the recent analyses of a large class of moments in the standard fixed-order and contour-improved perturbation theories, we consider the perturbative behavior of these moments in the framework of a QCD nonpower perturbation theory, defined by the technique of series acceleration by conformal mappings, which simultaneously implements renormalization-group summation and has a tame large-order behavior. Two recently proposed models of the Adler function are employed to generate the higher-order coefficients of the perturbation series and to predict the exact values of the moments, required for testing the properties of the perturbative expansions. We show that the contour-improved nonpower perturbation theories and the renormalization-group-summed nonpower perturbation theories have very good convergence properties for a large class of moments of the so-called "reference model," including moments that are poorly described by the standard expansions. The results provide additional support for the plausibility of the description of the Adler function in terms of a small number of dominant renormalons.

Relevance:

10.00%

Publisher:

Abstract:

Enactive approaches foreground the role of interpersonal interaction in explanations of social understanding. This motivates, in combination with a recent interest in neuroscientific studies involving actual interactions, the question of how interactive processes relate to neural mechanisms involved in social understanding. We introduce the Interactive Brain Hypothesis (IBH) in order to help map the spectrum of possible relations between social interaction and neural processes. The hypothesis states that interactive experience and skills play enabling roles in both the development and current function of social brain mechanisms, even in cases where social understanding happens in the absence of immediate interaction. We examine the plausibility of this hypothesis against developmental and neurobiological evidence and contrast it with the widespread assumption that mindreading is crucial to all social cognition. We describe the elements of social interaction that bear most directly on this hypothesis and discuss the empirical possibilities open to social neuroscience. We propose that the link between coordination dynamics and social understanding can be best grasped by studying transitions between states of coordination. These transitions form part of the self-organization of interaction processes that characterize the dynamics of social engagement. The patterns and synergies of this self-organization help explain how individuals understand each other. Various possibilities for role-taking emerge during interaction, determining a spectrum of participation. This view contrasts sharply with the observational stance that has guided research in social neuroscience until recently. We also introduce the concept of readiness to interact to describe the practices and dispositions that are summoned in situations of social significance (even if not interactive). This latter idea links interactive factors to more classical observational scenarios.

Relevance:

10.00%

Publisher:

Abstract:

The dynamic interaction of limb segments during movements that involve multiple joints creates torques in one joint due to motion about another. Evidence shows that such interaction torques are taken into account during the planning or control of movement in humans. Two alternative hypotheses could explain the compensation of these dynamic torques. One involves the use of internal models to centrally compute predicted interaction torques and their explicit compensation through anticipatory adjustment of descending motor commands. The alternative, based on the equilibrium-point hypothesis, claims that descending signals can be simple and related to the desired movement kinematics only, while spinal feedback mechanisms are responsible for the appropriate creation and coordination of dynamic muscle forces. Partial supporting evidence exists in each case. However, until now no model has explicitly shown, in the case of the second hypothesis, whether peripheral feedback is really sufficient on its own for coordinating the motion of several joints while at the same time accommodating intersegmental interaction torques. Here we propose a minimal computational model to examine this question. Using a biomechanics simulation of a two-joint arm controlled by spinal neural circuitry, we show for the first time that it is indeed possible for the neuromusculoskeletal system to transform simple descending control signals into muscle activation patterns that accommodate interaction forces depending on their direction and magnitude. This is achieved without the aid of any central predictive signal. Even though the model makes various simplifications and abstractions compared to the complexities involved in the control of human arm movements, the finding lends plausibility to the hypothesis that some multijoint movements can in principle be controlled even in the absence of internal models of intersegmental dynamics or learned compensatory motor signals.

Relevance:

10.00%

Publisher:

Abstract:

Toward a comprehensive understanding of legged locomotion in animals and machines, the compass gait model has been intensively studied for the systematic investigation of complex biped locomotion dynamics. While most previous studies focused only on locomotion on flat surfaces, in this article we tackle the problem of bipedal locomotion on rough terrain by using a minimalistic control architecture for the compass gait walking model. This controller utilizes an open-loop sinusoidal oscillation of the hip motor, which induces basic walking stability without sensory feedback. A set of simulation analyses shows that the underlying mechanism lies in a "phase locking" mechanism that compensates for phase delays between the mechanical dynamics and the open-loop motor oscillation, resulting in a relatively large basin of attraction in dynamic bipedal walking. By exploiting this mechanism, we also explain how the basin of attraction can be controlled by manipulating the parameters of the oscillator, not only on flat terrain but also on slopes of various inclinations. Based on the simulation analysis, the proposed controller is implemented on a real-world robotic platform to confirm the plausibility of the approach. In addition, using these basic principles of self-stability and gait variability, we demonstrate how the proposed controller can be extended with simple sensory feedback such that the robot is able to control its gait patterns autonomously for traversing rough terrain. © 2010 Springer Science+Business Media, LLC.
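The "minimalistic" controller described above reduces to commanding the hip with a pure sinusoid and letting the mechanics entrain to it. A sketch of such an open-loop command follows; the amplitude, frequency, and phase values are arbitrary placeholders, since the abstract does not give the parameter values that shape the basin of attraction:

```python
# Open-loop sinusoidal hip command: desired hip angle as a function of
# time only, with no sensory feedback. Stability arises from the passive
# dynamics phase-locking to this oscillation; the parameter values here
# are ARBITRARY placeholders, not the paper's tuned values.

import math

def hip_command(t, amplitude=0.3, freq_hz=1.2, phase=0.0):
    """Desired hip angle [rad] at time t [s]: A * sin(2*pi*f*t + phase)."""
    return amplitude * math.sin(2.0 * math.pi * freq_hz * t + phase)
```

Adapting the gait to a slope then amounts to changing these few oscillator parameters rather than redesigning a feedback law.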

Relevance:

10.00%

Publisher:

Abstract:

In human and animal running, spring-like leg behavior is found, and similar concepts have been demonstrated by various robotic systems in the past. In general, a spring-mass model provides self-stabilizing characteristics against external perturbations originating in leg-ground interactions and motor control, although most of these systems have made use of linear spring-like legs. The question addressed in this paper is the influence of leg segmentation (i.e. the use of a rotational joint and two limb segments) on the self-stability of running, as it appears to be a common design principle in nature. This paper shows that, with leg segmentation, the system is able to perform self-stable running behavior over significantly broader ranges of running speed and control parameters (e.g. control of the angle of attack at touchdown, and adjustment of spring stiffness) by exploiting a nonlinear relationship between leg force and leg compression. The concept is investigated by using a two-segment leg model and a robotic platform, which demonstrate its plausibility in the real world. ©2008 IEEE.
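The nonlinear force-compression relationship of a segmented leg can be made concrete with a little geometry: two segments joined by a torsional spring give an axial force that follows from the virtual-work relation F = torque / (dL/dphi), which is not proportional to compression. The segment length, joint stiffness, and rest angle below are illustrative values, not the paper's robot parameters:

```python
# Why a two-segment leg is a nonlinear leg spring. Two segments of length
# L_SEG are joined by a torsional spring of stiffness C_JOINT; compressing
# the leg flexes the joint angle phi below its rest value PHI_0, and the
# axial force follows from virtual work. All parameter values ILLUSTRATIVE.

import math

L_SEG = 0.5      # segment length [m]
C_JOINT = 200.0  # torsional joint stiffness [N*m/rad]
PHI_0 = 2.8      # rest inter-segment angle [rad] (slightly flexed)

def leg_length(phi):
    """Hip-to-foot distance for inter-segment angle phi."""
    return 2.0 * L_SEG * math.sin(phi / 2.0)

def axial_force(phi):
    """Axial leg force: joint torque divided by the geometric lever arm."""
    torque = C_JOINT * (PHI_0 - phi)        # spring torque at the joint
    dl_dphi = L_SEG * math.cos(phi / 2.0)   # dL/dphi
    return torque / dl_dphi
```

Evaluating axial_force at two compressions shows that the effective stiffness F/compression is not constant, unlike an ideal linear prismatic spring; this nonlinearity is what broadens the self-stable parameter range.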

Relevance:

10.00%

Publisher:

Abstract:

The aim of this paper is to show that Dempster-Shafer evidence theory may be successfully applied to unsupervised classification in multisource remote sensing. The Dempster-Shafer formulation allows for the consideration of unions of classes and for the representation of both imprecision and uncertainty, through the definition of belief and plausibility functions. These two functions, derived from a mass function, are generally chosen in a supervised way. In this paper, the authors describe an unsupervised method, based on the comparison of monosource classification results, to select the classes necessary for Dempster-Shafer evidence combination and to define their mass functions. Data fusion is then performed, discarding invalid clusters (e.g. those corresponding to conflicting information) thanks to an iterative process. The unsupervised multisource classification algorithm is applied to MAC-Europe'91 multisensor airborne campaign data collected over the Orgeval French site. Classification results using different combinations of sensors (TMS and AirSAR) or wavelengths (L- and C-bands) are compared. The performance of the data fusion is evaluated in terms of identification of land cover types. The best results are obtained when all three data sets are used. Furthermore, some other combinations of data are tried, and their ability to discriminate between the different land cover types is quantified.
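The step of discarding clusters that carry conflicting information can be sketched via the conflict mass K of Dempster's rule: the total product mass whose focal elements have empty intersection across the two sources. The sensors, classes, masses, and rejection threshold below are invented for illustration; the paper's iterative procedure is more elaborate:

```python
# Conflict-based cluster rejection sketch. K is the mass Dempster's rule
# assigns to empty intersections when combining two sources; a cluster
# whose sources disagree this strongly can be discarded before fusion.
# Sources, classes, masses, and the threshold are INVENTED for illustration.

from itertools import product

def conflict_mass(m1, m2):
    """Total mass on empty intersections (K in Dempster's rule)."""
    return sum(pa * pb
               for (a, pa), (b, pb) in product(m1.items(), m2.items())
               if not (a & b))

# Masses over land-cover classes from two sources for one cluster;
# frozensets allow unions of classes as focal elements.
VEG, SOIL = frozenset({"veg"}), frozenset({"soil"})
m_optical = {VEG: 0.8, VEG | SOIL: 0.2}
m_sar = {SOIL: 0.7, VEG | SOIL: 0.3}

K = conflict_mass(m_optical, m_sar)   # most mass is conflicting here
valid = K < 0.5                       # reject the cluster if too conflicting
```

Here the two sources put most of their mass on disjoint classes, so K is large and the cluster would be treated as invalid; agreeing sources yield K near zero.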