918 results for Measure of riskiness
Abstract:
An approach by which the detrended fluctuation analysis (DFA) method can be used to help diagnose heart failure was demonstrated. DFA was applied to patients suffering from congestive heart failure (CHF) to check for correlations between DFA indices and CHF and to determine a correlation between DFA indices and mortality, with particular attention to the residue parameter, which is a measure of the departure of the DFA from its power-law approximation. The DFA parameters proved useful as a complement to the physiological parameters Weber and FE in sorting the patients into three prognostic groups.
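For concreteness, the minimal Python sketch below computes the DFA fluctuation function of an RR-interval series, the scaling exponent from its power-law (log-log) fit, and a 'residue' taken here as the RMS departure of the fluctuation function from that fit. The synthetic series, window range and linear detrending order are illustrative assumptions, not the authors' clinical protocol.

```python
import numpy as np

def dfa(rr_intervals, window_sizes):
    """Detrended fluctuation analysis of a beat-to-beat (RR) interval series.

    Returns the scaling exponent alpha and a 'residue' taken here as the
    RMS departure of log F(n) from its linear (power-law) fit.
    """
    x = np.asarray(rr_intervals, dtype=float)
    y = np.cumsum(x - x.mean())                              # integrated (profile) series
    fluctuations = []
    for n in window_sizes:
        n_windows = len(y) // n
        f2 = []
        for k in range(n_windows):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)     # linear detrending per window
            f2.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))            # RMS fluctuation F(n)
    log_n, log_f = np.log(window_sizes), np.log(fluctuations)
    alpha, intercept = np.polyfit(log_n, log_f, 1)           # power-law exponent
    residue = np.sqrt(np.mean((log_f - (alpha * log_n + intercept)) ** 2))
    return alpha, residue

# Example with a synthetic RR series (white noise around 0.8 s):
rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(5000)
print(dfa(rr, window_sizes=np.arange(4, 64)))
```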
Abstract:
Multidisciplinary Design Optimization (MDO) is a methodology for optimizing large coupled systems. Over the years, a number of different MDO decomposition strategies, known as architectures, have been developed, and various pieces of analytical work have been done on MDO and its architectures. However, MDO lacks an overarching paradigm which would unify the field and promote cumulative research. In this paper, we propose a differential geometry framework as such a paradigm: Differential geometry comes with its own set of analysis tools and a long history of use in theoretical physics. We begin by outlining some of the mathematics behind differential geometry and then translate MDO into that framework. This initial work gives new tools and techniques for studying MDO and its architectures while producing a naturally arising measure of design coupling. The framework also suggests several new areas for exploration into and analysis of MDO systems. At this point, analogies with particle dynamics and systems of differential equations look particularly promising for both the wealth of extant background theory that they have and the potential predictive and evaluative power that they hold. © 2012 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
Abstract:
We examine theoretically the transient displacement flow and density stratification that develops within a ventilated box after two localized floor-level heat sources of unequal strengths are activated. The heat input is represented by two non-interacting turbulent axisymmetric plumes of constant buoyancy fluxes B1 and B2 > B1. The box connects to an unbounded quiescent external environment of uniform density via openings at the top and base. A theoretical model is developed to predict the time evolution of the dimensionless depths λj and mean buoyancies δj of the 'intermediate' (j = 1) and 'top' (j = 2) layers leading to steady state. The flow behaviour is classified in terms of a stratification parameter S, a dimensionless measure of the relative forcing strengths of the two buoyant layers that drive the flow. We find that dδ1/dτ ∝ 1/λ1 and dδ2/dτ ∝ 1/λ2, where τ is a dimensionless time. When S 1, the intermediate layer is shallow (small λ1), whereas the top layer is relatively deep (large λ2) and, in this limit, δ1 and δ2 evolve on two characteristically different time scales. This produces a time lag and gives rise to a 'thermal overshoot', during which δ1 exceeds its steady value and attains a maximum during the transients; a flow feature we refer to, in the context of a ventilated room, as 'localized overheating'. For a given source strength ratio ψ = B1/B2, we show that thermal overshoots are realized for dimensionless opening areas A < Aoh and are strongly dependent on the time history of the flow. We establish the region of {A, ψ} space where rapid development of δ1 results in δ1 > δ2, giving rise to a bulk overturning of the buoyant layers. Finally, some implications of these results, specifically for the ventilation of a room, are discussed. © Cambridge University Press 2013.
Abstract:
Localization of chess-board vertices is a common task in computer vision, underpinning many applications, but relatively little work focusses on designing a specific feature detector that is fast, accurate and robust. In this paper the 'Chess-board Extraction by Subtraction and Summation' (ChESS) feature detector, designed to exclusively respond to chess-board vertices, is presented. The method proposed is robust against noise, poor lighting and poor contrast, requires no prior knowledge of the extent of the chess-board pattern, is computationally very efficient, and provides a strength measure of detected features. Such a detector has significant application both in the key field of camera calibration, as well as in structured light 3D reconstruction. Evidence is presented showing its superior robustness, accuracy, and efficiency in comparison to other commonly used detectors, including Harris & Stephens and SUSAN, both under simulation and in experimental 3D reconstruction of flat plate and cylindrical objects. © 2013 Elsevier Inc. All rights reserved.
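The sketch below illustrates only the general ring-sampling idea behind such a vertex detector: around a true chess-board vertex, samples half a turn apart on a small circle lie in like-coloured squares (similar intensity) while samples a quarter turn apart lie in unlike squares (dissimilar intensity). It is a simplified illustration, not the published ChESS response, whose exact sampling pattern and additional terms are defined in the paper.

```python
import numpy as np

def vertex_response(img, x, y, radius=5, n_samples=16):
    """Simplified chess-board-vertex response at pixel (x, y).

    (x, y) is assumed to lie at least `radius` pixels inside the image.
    The response rewards the difference between quadrants a quarter turn
    apart and penalises differences between samples half a turn apart.
    """
    angles = 2 * np.pi * np.arange(n_samples) / n_samples
    xs = np.round(x + radius * np.cos(angles)).astype(int)
    ys = np.round(y + radius * np.sin(angles)).astype(int)
    ring = img[ys, xs].astype(float)
    half, quarter = n_samples // 2, n_samples // 4
    sum_resp = np.abs(ring + np.roll(ring, half)
                      - np.roll(ring, quarter) - np.roll(ring, 3 * quarter)).sum()
    diff_resp = np.abs(ring - np.roll(ring, half)).sum()
    return sum_resp - diff_resp
```

Evaluated at every candidate pixel, such a response peaks at chess-board vertices and stays low on plain edges and uniform regions, which is the strength measure the abstract refers to.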
Abstract:
This paper discusses an algorithm for the distance from a point to an infinite subspace in high-dimensional space. With the development of Information Geometry [1], analysis tools for the distribution of points in high-dimensional space, as a measure of calculability, have drawn increasing attention from experts in pattern recognition. With the assistance of these tools, the geometrical properties of sets of samples in high-dimensional structures are studied, guided by established properties and theorems of high-dimensional geometry.
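The abstract does not reproduce the algorithm itself, but the underlying quantity, the distance from a point to a subspace spanned by a set of vectors, can be obtained by orthogonal projection; a minimal sketch with illustrative names is given below.

```python
import numpy as np

def distance_to_subspace(point, origin, spanning_vectors):
    """Distance from `point` to the affine subspace origin + span(spanning_vectors).

    `spanning_vectors` is a (k, d) array of k direction vectors in d dimensions.
    The nearest point is the orthogonal projection, found via least squares.
    """
    A = np.asarray(spanning_vectors, dtype=float).T          # d x k basis matrix
    b = np.asarray(point, dtype=float) - np.asarray(origin, dtype=float)
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)            # projection coefficients
    residual = b - A @ coeffs                                  # component orthogonal to the subspace
    return np.linalg.norm(residual)

# Distance from (1, 1, 1) to the x-y plane through the origin is 1:
print(distance_to_subspace([1, 1, 1], [0, 0, 0], [[1, 0, 0], [0, 1, 0]]))
```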
Resumo:
Three-protein circadian oscillations in cyanobacteria sustain for weeks. To understand how cellular oscillations function robustly in stochastic fluctuating environments, we used a stochastic model to uncover two natures of circadian oscillation: the potential landscape related to steady-state probability distribution of protein concentrations; and the corresponding flux related to speed of concentration changes which drive the oscillations. The barrier height of escaping from the oscillation attractor on the landscape provides a quantitative measure of the robustness and coherence for oscillations against intrinsic and external fluctuations. The difference between the locations of the zero total driving force and the extremal of the potential provides a possible experimental probe and quantification of the force from curl flux. These results, correlated with experiments, can help in the design of robust oscillatory networks.
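As a rough illustration of how a potential landscape and a barrier height can be read off a stochastic model, the sketch below histograms a long simulated trajectory, takes U = -ln(P_ss) as the generalized potential, and measures the barrier above the deepest attractor. The one-dimensional Langevin dynamics used here are a placeholder chosen for brevity, not the three-protein cyanobacterial oscillator of the paper.

```python
import numpy as np

def potential_from_trajectory(trajectory, bins=100):
    """Generalized potential U = -ln(P_ss) estimated from a long stochastic trajectory."""
    hist, edges = np.histogram(trajectory, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    with np.errstate(divide="ignore"):
        U = -np.log(hist)                     # unvisited bins get U = +inf
    return centers, U

# Placeholder dynamics: overdamped Langevin motion in a double-well force field.
rng = np.random.default_rng(1)
x, dt, noise = 0.0, 1e-3, 0.5
traj = np.empty(200_000)
for i in range(traj.size):
    force = x - x**3                          # -dV/dx for V(x) = x^4/4 - x^2/2
    x += force * dt + noise * np.sqrt(dt) * rng.standard_normal()
    traj[i] = x

centers, U = potential_from_trajectory(traj)
barrier = U[np.argmin(np.abs(centers))] - U.min()   # barrier height above the deepest attractor
print(f"estimated barrier height: {barrier:.2f}")
```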
Abstract:
We study the origin of the robustness of the yeast cell cycle cellular network by uncovering its underlying energy landscape. This is realized from the information of the steady-state probabilities obtained by solving a discrete set of kinetic master equations for the network. We discovered that the potential landscape of the yeast cell cycle network is funneled toward the global minimum, the G1 state. The ratio of the energy gap between G1 and the average, versus the roughness of the landscape, termed the robustness ratio (RR), becomes a quantitative measure of the robustness and stability of the network. The funneled landscape is quite robust against random perturbations of the inherent wiring or connections of the network. There exists a global phase transition between a more sensitive response or less self-degradation phase, leading to an underlying funneled global landscape with large RR, and an insensitive response or more self-degradation phase, leading to a shallower underlying landscape of the network with small RR. Furthermore, we show that the more robust landscape also leads to less dissipation cost for the network. Least dissipation and a robust landscape might be a realization of the Darwinian principle of natural selection at the cellular network level. This may provide an optimal criterion for network wiring connections and design.
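A minimal sketch of the robustness-ratio idea, assuming the steady-state probabilities of the network's discrete states are already available (e.g. from the master equation): with U_i = -ln(P_i), the gap is measured here from the global minimum to the mean of the remaining states and the roughness as their spread. The precise definitions in the paper may differ; the toy probabilities below are invented for illustration.

```python
import numpy as np

def robustness_ratio(steady_state_probs):
    """Robustness ratio RR = (energy gap) / (roughness) of a discrete landscape.

    The potential of state i is taken as U_i = -ln(P_i). The gap runs from the
    global minimum (most probable state) to the mean of the other states; the
    roughness is their standard deviation. Definitions are illustrative.
    """
    P = np.asarray(steady_state_probs, dtype=float)
    U = -np.log(P[P > 0])
    ground = U.min()                          # global minimum (e.g. the G1 state)
    others = U[U > ground]
    return (others.mean() - ground) / others.std()

# Toy example: one dominant (G1-like) state among 2^11 = 2048 Boolean network states.
rng = np.random.default_rng(0)
tail = rng.random(2047)
tail *= 0.1 / tail.sum()                      # remaining states share 10% of the probability
probs = np.concatenate(([0.9], tail))
print(robustness_ratio(probs))
```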
Abstract:
We uncover the underlying potential energy landscape for a cellular network. We find that the potential energy landscape of the mitogen-activated protein kinase signal transduction network is funneled toward the global minimum. The funneled landscape is quite robust against random perturbations. This naturally explains robustness from a physical point of view. The ratio of slope versus roughness of the landscape becomes a quantitative measure of the robustness of the network. The funneled landscape is a realization of the Darwinian principle of natural selection at the cellular network level. It provides an optimal criterion for network connections and design. Our approach is general and can be applied to other cellular networks.
Abstract:
The pulsed-laser polymerization in emulsions has been simulated by the Monte Carlo method. Our simulation shows that the best measure of the propagation rate coefficient k_p is the peak maximum of the molecular-weight distribution for microemulsions, when the droplets are small. However, the inflection point on the low-molecular-weight side of the peaks provides the best measure of k_p for bigger droplets. (C) 2000 Elsevier Science Ltd. All rights reserved.
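For reference, the quantity extracted in such pulsed-laser polymerization experiments follows from the standard relation L0 = k_p · c_M · t0, so k_p = L0 / (c_M · t0), where L0 is the characteristic chain length read off the molecular-weight distribution (at the peak maximum or at the low-molecular-weight inflection point, as discussed above), c_M is the monomer concentration and t0 the time between pulses. A hedged sketch follows; array names and the discretization are illustrative.

```python
import numpy as np

def kp_from_mwd(chain_lengths, mwd, monomer_conc, pulse_interval, use_inflection=True):
    """Propagation rate coefficient k_p from a pulsed-laser MWD.

    Uses k_p = L0 / (c_M * t0), with L0 taken either at the inflection point
    on the low-molecular-weight side of the PLP peak or at the peak maximum.
    """
    L = np.asarray(chain_lengths, dtype=float)
    w = np.asarray(mwd, dtype=float)
    peak = np.argmax(w)
    if use_inflection:
        slope = np.gradient(w, L)
        L0 = L[np.argmax(slope[:peak + 1])]   # steepest ascent below the peak
    else:
        L0 = L[peak]                          # peak maximum
    return L0 / (monomer_conc * pulse_interval)
```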
Abstract:
The influences of seven organic modifiers, namely urea, methanol (MeOH), dioxane (DIO), tetrahydrofuran (THF), acetonitrile (ACN), 1-propanol (1-PrOH) and 2-propanol (2-PrOH), on solute retention and electrokinetic migration in micellar electrokinetic capillary chromatography (MEKC) are investigated with sodium dodecyl sulfate (SDS) micelles as the pseudostationary phase. It is observed that, within the limited concentration ranges used in the MEKC systems, the effect of organic-modifier concentration on retention can be described by the equation log k' = log k'_w - S·C for most binary aqueous-organic buffers, but deviations from this retention equation are observed with ACN and particularly THF as organic modifiers. With the parameter S as a measure of elutropic strength, the elutropic strength of the organic modifiers is found to follow a general order urea
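Since log k' = log k'_w - S·C is linear in the modifier concentration C, the elutropic-strength parameter S follows from a straight-line fit of measured log k' values against C. A small sketch with invented illustrative data:

```python
import numpy as np

# log k' = log k'_w - S*C: fit S (elutropic strength) and log k'_w by linear regression.
C = np.array([0.0, 5.0, 10.0, 15.0, 20.0])          # modifier concentration, e.g. % (v/v)
log_k = np.array([0.80, 0.62, 0.44, 0.27, 0.09])     # illustrative measured log k' values
neg_S, log_kw = np.polyfit(C, log_k, 1)              # slope = -S, intercept = log k'_w
print(f"S = {-neg_S:.3f}, log k'_w = {log_kw:.3f}")
```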
Abstract:
The blocking probability of a network is a common measure of its performance. There exist means of quickly calculating the blocking probabilities of Banyan networks; however, because Banyan networks have no redundant paths, they are not inherently fault-tolerant, and so their use in large-scale multiprocessors is problematic. Unfortunately, the addition of multiple paths between message sources and sinks in a network complicates the calculation of blocking probabilities. A methodology for exact calculation of blocking probabilities for small networks with redundant paths is presented here, with some discussion of its potential use in approximating blocking probabilities for large networks with redundant paths.
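As a toy illustration of what an exact calculation means for a small network with redundant paths (the paper's methodology and traffic model are more general), the sketch below enumerates every busy/free combination of the links of a small graph, assumes each link is independently busy with probability p, and sums the probability of the states in which no free path joins the source to the sink.

```python
from itertools import product

def exact_blocking_probability(links, source, sink, p_busy):
    """Blocking probability of source->sink with independently busy links.

    `links` is a list of (node, node) edges, each busy with probability p_busy.
    All 2^len(links) link states are enumerated exactly, which is feasible only
    for small networks -- precisely the regime the abstract targets.
    """
    def connected(free_links):
        reached, frontier = {source}, [source]
        while frontier:
            node = frontier.pop()
            for a, b in free_links:
                if a == node and b not in reached:
                    reached.add(b)
                    frontier.append(b)
                elif b == node and a not in reached:
                    reached.add(a)
                    frontier.append(a)
        return sink in reached

    blocking = 0.0
    for state in product([False, True], repeat=len(links)):   # False = free, True = busy
        prob = 1.0
        for busy in state:
            prob *= p_busy if busy else (1.0 - p_busy)
        free = [link for link, busy in zip(links, state) if not busy]
        if not connected(free):
            blocking += prob
    return blocking

# Two redundant two-hop paths from S to T, via A or via B:
links = [("S", "A"), ("A", "T"), ("S", "B"), ("B", "T")]
print(exact_blocking_probability(links, "S", "T", p_busy=0.1))   # (1 - 0.9**2)**2 = 0.0361
```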
Abstract:
A fundamental problem in artificial intelligence is obtaining coherent behavior in rule-based problem solving systems. A good quantitative measure of coherence is time behavior; a system that never, in retrospect, applied a rule needlessly is certainly coherent; a system suffering from combinatorial blowup is certainly behaving incoherently. This report describes a rule-based problem solving system for automatically writing and improving numerical computer programs from specifications. The specifications are in terms of "constraints" among inputs and outputs. The system has solved program synthesis problems involving systems of equations, determining that methods of successive approximation converge, transforming recursion to iteration, and manipulating power series (using differing organizations, control structures, and argument-passing techniques).
Abstract:
McGuinness, T. and Morgan, R. (2005) 'The effect of market and learning orientation on strategy dynamics: The contributing effect of organisational change capability', European Journal of Marketing, 39(11-12), pp. 1306-1326.
Abstract:
Jenkins, T., Hayton, D.J., Bedson, T.R. and Palmer, R.E. (2001) 'Quantitative evaluation of electron beam writing in passivated gold nanoclusters', Applied Physics Letters, 78, pp. 1921-1923.
Abstract:
Oculographic research on people viewing a human face indicates that the beholder's eyes stop most often, and for the longest periods of time, on the eyes and the mouth of the face being looked at, and that they move among these three points most frequently. The position of the eyes and mouth in relation to one another can be described by a single number: the measure of the angle whose vertex lies at the middle of the mouth and whose arms pass through the centers of the eye pupils. These angles were measured from photographs of people from all over the world, as well as of residents of Lublin. Subsequently, the subjects from Lublin were asked to construct face schemas by positioning the eyes and the mouth in the way they considered most attractive. The eye-mouth-eye angle of these schemas was measured. Additionally, measurements of the same angle were taken from the faces depicted in icons. The schemas of the faces considered most attractive by the subjects were characterized by angles approximating the mean angle from the photographs, and significantly greater than the mean angle from the icons.
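For concreteness, the eye-mouth-eye angle described above is the angle at the mouth vertex between the rays to the two pupil centers; given landmark coordinates it can be computed as in the sketch below (the coordinates shown are illustrative, not measured values).

```python
import numpy as np

def eye_mouth_eye_angle(left_pupil, right_pupil, mouth_center):
    """Angle (in degrees) at the mouth between the rays to the two pupil centers."""
    m = np.asarray(mouth_center, dtype=float)
    v1 = np.asarray(left_pupil, dtype=float) - m
    v2 = np.asarray(right_pupil, dtype=float) - m
    cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

# Illustrative landmark coordinates in image pixels:
print(eye_mouth_eye_angle((-30, 60), (30, 60), (0, 0)))   # ~53 degrees
```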