40 results for POLYNOMIAL CHAOS

at Université de Lausanne, Switzerland


Relevance:

20.00%

Publisher:

Abstract:

Given the adverse impact of image noise on the perception of important clinical details in digital mammography, routine quality control measurements should include an evaluation of noise. The European Guidelines, for example, employ a second-order polynomial fit of pixel variance as a function of detector air kerma (DAK) to decompose noise into quantum, electronic and fixed pattern (FP) components and assess the DAK range where quantum noise dominates. This work examines the robustness of the polynomial method against an explicit noise decomposition method. The two methods were applied to variance and noise power spectrum (NPS) data from six digital mammography units. Twenty homogeneously exposed images were acquired with PMMA blocks for target DAKs ranging from 6.25 to 1600 µGy. Both methods were explored for the effects of data weighting and squared fit coefficients during the curve fitting, the influence of the additional filter material (2 mm Al versus 40 mm PMMA) and noise de-trending. Finally, spatial stationarity of noise was assessed.

Data weighting improved noise model fitting over large DAK ranges, especially at low detector exposures. The polynomial and explicit decompositions generally agreed for quantum and electronic noise but FP noise fraction was consistently underestimated by the polynomial method. Noise decomposition as a function of position in the image showed limited noise stationarity, especially for FP noise; thus the position of the region of interest (ROI) used for noise decomposition may influence fractional noise composition. The ROI area and position used in the Guidelines offer an acceptable estimation of noise components. While there are limitations to the polynomial model, when used with care and with appropriate data weighting, the method offers a simple and robust means of examining the detector noise components as a function of detector exposure.
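The decomposition described above can be sketched numerically: fit a second-order polynomial to pixel variance as a function of DAK, then read the electronic, quantum and fixed-pattern contributions off the constant, linear and quadratic terms. A minimal sketch, assuming illustrative noise coefficients (the DAK values mirror the abstract's range; nothing below is measured detector data):

```python
import numpy as np

# Hypothetical DAK values (uGy) spanning the range quoted in the abstract.
dak = np.array([6.25, 12.5, 25.0, 50.0, 100.0, 200.0, 400.0, 800.0, 1600.0])

# Assumed noise coefficients (illustrative only): electronic variance is
# constant, quantum variance grows linearly with DAK, fixed-pattern
# variance grows quadratically.
sigma2_e, k_q, k_fp = 4.0, 0.8, 1e-4
variance = sigma2_e + k_q * dak + k_fp * dak**2

# Weighted second-order polynomial fit: 1/variance weighting keeps the
# low-exposure points from being swamped by large variances at high DAK.
c2, c1, c0 = np.polyfit(dak, variance, deg=2, w=1.0 / variance)

# Fractional noise composition at a given DAK: constant term = electronic,
# linear term = quantum, quadratic term = fixed pattern.
K = 100.0
total = c0 + c1 * K + c2 * K**2
fractions = {
    "electronic": c0 / total,
    "quantum": c1 * K / total,
    "fixed_pattern": c2 * K**2 / total,
}
```

At this (synthetic) 100 µGy point the quantum term dominates, which is the regime the Guidelines' check is designed to verify.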

Relevance:

20.00%

Publisher:

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: The purpose of this study was to develop a mathematical model (sine model, SIN) to describe fat oxidation kinetics as a function of the relative exercise intensity [% of maximal oxygen uptake (%VO2max)] during graded exercise and to determine the exercise intensity (Fatmax) that elicits maximal fat oxidation (MFO) and the intensity at which fat oxidation becomes negligible (Fatmin). This model included three independent variables (dilatation, symmetry, and translation) that incorporated the primary expected modulations of the curve due to training level or body composition. METHODS: Thirty-two healthy volunteers (17 women and 15 men) performed a graded exercise test on a cycle ergometer, with 3-min stages and 20-W increments. Substrate oxidation rates were determined using indirect calorimetry. SIN was compared with measured values (MV) and with other methods currently used [i.e., the RER method (MRER) and third-degree polynomial curves (P3)]. RESULTS: There was no significant difference in fitting accuracy between SIN and P3 (P = 0.157), whereas MRER was less precise than SIN (P < 0.001). Fatmax (44 ± 10% VO2max) and MFO (0.37 ± 0.16 g·min⁻¹) determined using SIN were significantly correlated with MV, P3, and MRER (P < 0.001). The dilatation variable was correlated with Fatmax, Fatmin, and MFO (r = 0.79, r = 0.67, and r = 0.60, respectively; P < 0.001). CONCLUSIONS: The SIN model offers the same precision as the other methods currently used to determine Fatmax and MFO but additionally allows calculation of Fatmin. Moreover, its three independent variables are directly related to the main expected modulations of the fat oxidation curve. SIN therefore appears to be an appropriate tool for analyzing fat oxidation kinetics obtained during graded exercise.
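How Fatmax and Fatmin fall out of a sine-shaped curve can be sketched as follows. The functional form and parameter roles below are assumptions chosen only for illustration; they are not the authors' exact parameterisation, and the parameter values are invented (MFO is set near the reported group mean):

```python
import numpy as np

def sin_model(x, mfo, dilatation, symmetry, translation):
    """Illustrative sine-shaped fat oxidation curve over relative
    intensity x (%VO2max). This is NOT the published SIN equation; the
    three parameters merely mimic the roles of dilatation, symmetry and
    translation described in the abstract."""
    phase = np.pi * dilatation * (x / 100.0) ** symmetry + translation
    return np.clip(mfo * np.sin(phase), 0.0, None)

x = np.linspace(0.0, 100.0, 1001)
y = sin_model(x, mfo=0.37, dilatation=1.2, symmetry=1.1, translation=0.0)

fatmax = x[np.argmax(y)]   # intensity eliciting maximal fat oxidation
nonzero = x[y > 1e-3]
fatmin = nonzero[-1]       # intensity where fat oxidation becomes negligible
```

With these invented parameters, Fatmax lands around 45% VO2max and Fatmin around 85% VO2max, i.e. in the physiological ballpark the abstract reports for Fatmax.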

Relevance:

10.00%

Publisher:

Abstract:

Quantum indeterminism is frequently invoked as a solution to the problem of how a disembodied soul might interact with the brain (as Descartes proposed), and is sometimes invoked in theories of libertarian free will even when they do not involve dualistic assumptions. Taking as an example the Eccles-Beck model of interaction between self (or soul) and brain at the level of synaptic exocytosis, I here evaluate the plausibility of these approaches. I conclude that Heisenbergian uncertainty is too small to affect synaptic function, and that amplification by chaos or by other means does not provide a solution to this problem. Furthermore, even if Heisenbergian effects did modify brain functioning, the changes would be swamped by those due to thermal noise. Cells and neural circuits have powerful noise-resistance mechanisms that are adequate protection against thermal noise and must therefore be more than sufficient to buffer against Heisenbergian effects. Other forms of quantum indeterminism must be considered because they can be much greater than Heisenbergian uncertainty, but these have so far not been shown to play a role in the brain.
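The order-of-magnitude arithmetic behind the "swamped by thermal noise" claim can be checked directly from the time-energy uncertainty relation. The millisecond timescale for synaptic events is an assumption made for the sketch:

```python
# Compare the minimum energy uncertainty from the time-energy Heisenberg
# relation with thermal noise energy at body temperature. The millisecond
# synaptic timescale is an assumed, illustrative value.
hbar = 1.054571817e-34       # J*s, reduced Planck constant
k_B = 1.380649e-23           # J/K, Boltzmann constant

dt = 1e-3                    # s, assumed timescale of a synaptic event
delta_E = hbar / (2.0 * dt)  # J, Heisenberg bound: dE >= hbar / (2 dt)
thermal_E = k_B * 310.0      # J, kT at ~310 K (body temperature)

ratio = thermal_E / delta_E  # thermal energy exceeds the bound by ~11 orders
```

The ratio comes out around 10^11, consistent with the abstract's conclusion that any Heisenbergian modulation would be buried under thermal fluctuations.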

Relevance:

10.00%

Publisher:

Abstract:

Excessive exposure to solar ultraviolet (UV) radiation is the main cause of skin cancer. Specific prevention should be further developed to target overexposed or highly vulnerable populations, but a better characterisation of anatomical UV exposure patterns is needed for this purpose. The aim was to develop a regression model for predicting the UV exposure ratio (ER, the ratio between the anatomical dose and the corresponding ground-level dose) for each body site without requiring individual measurements. A 3D numeric model (SimUVEx) was used to compute ER for various body sites and postures. A multiple fractional polynomial regression analysis was performed to identify predictors of ER. The regression model was built from simulation data and its performance was tested on an independent data set. Two input variables were sufficient to explain ER: the cosine of the maximal daily solar zenith angle and the fraction of the sky visible from the body site. The regression model was in good agreement with the simulated ER (R² = 0.988). Relative errors of up to +20% and -10% were found in daily dose predictions, whereas an average relative error of only 2.4% (-0.03% to 5.4%) was found in yearly dose predictions. The regression model accurately predicts ER and UV doses on the basis of readily available data, such as global erythemal UV irradiance measured at ground stations or inferred from satellite information. It makes the development of exposure data on a wide temporal and geographical scale possible and opens broad perspectives for epidemiological studies and skin cancer prevention.
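A two-predictor regression of this kind can be sketched on synthetic data. The predictors match those named in the abstract, but the coefficients and data below are invented for illustration (and plain least squares stands in for the fractional polynomial fit):

```python
import numpy as np

# Synthetic illustration of a two-predictor ER regression. Not the
# SimUVEx-derived model: coefficients and samples are invented.
rng = np.random.default_rng(0)
n = 200
cos_sza = rng.uniform(0.2, 1.0, n)   # cosine of maximal daily solar zenith angle
sky_view = rng.uniform(0.1, 1.0, n)  # fraction of sky visible from the body site

# Assumed 'true' relationship plus a little noise.
er = 0.1 + 0.5 * cos_sza + 0.4 * sky_view + rng.normal(0.0, 0.01, n)

# Ordinary least-squares fit of ER on the two predictors.
X = np.column_stack([np.ones(n), cos_sza, sky_view])
beta, *_ = np.linalg.lstsq(X, er, rcond=None)

pred = X @ beta
r2 = 1.0 - np.sum((er - pred) ** 2) / np.sum((er - er.mean()) ** 2)
```

With only two geometric inputs the fit explains almost all the variance, mirroring the high R² reported for the published model.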

Relevance:

10.00%

Publisher:

Abstract:

In this paper we unify, simplify, and extend previous work on the evolutionary dynamics of symmetric N-player matrix games with two pure strategies. In such games, gains from switching strategies depend, in general, on how many other individuals in the group play a given strategy. As a consequence, the gain function determining the gradient of selection can be a polynomial of degree N-1. In order to deal with the intricacy of the resulting evolutionary dynamics, we make use of the theory of polynomials in Bernstein form. This theory implies a tight link between the sign pattern of the gains from switching on the one hand and the number and stability of the rest points of the replicator dynamics on the other hand. While this relationship is a general one, it is most informative if gains from switching have at most two sign changes, as is the case for most multi-player matrix games considered in the literature. We demonstrate that previous results for public goods games are easily recovered and extended using this observation. Further examples illustrate how focusing on the sign pattern of the gains from switching obviates the need for a more involved analysis.
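The link between the sign pattern of the gains from switching and the rest points can be sketched concretely. The gain sequence below is hypothetical, chosen to have exactly one sign change:

```python
from math import comb

def gradient_of_selection(x, gains):
    """Gain function in Bernstein form: gains[k] is the gain from
    switching when k of the N-1 co-players use the focal strategy, and x
    is the strategy's frequency in the population."""
    n = len(gains) - 1  # n = N - 1
    return sum(comb(n, k) * x**k * (1.0 - x)**(n - k) * g
               for k, g in enumerate(gains))

def sign_changes(seq):
    signs = [v for v in seq if v != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

# Hypothetical gain sequence for a 5-player game (N - 1 = 4 co-players).
# One sign change, so by the variation-diminishing property of Bernstein
# polynomials the replicator dynamics has at most one interior rest point.
gains = [-0.2, -0.1, 0.1, 0.3, 0.4]

# Locate interior rest points by bracketing sign changes on a grid.
grid = [i / 1000.0 for i in range(1001)]
rest_points = [grid[i] for i in range(len(grid) - 1)
               if gradient_of_selection(grid[i], gains)
               * gradient_of_selection(grid[i + 1], gains) < 0]
```

Since the gradient is negative at x = 0 and positive at x = 1, the single admissible crossing is guaranteed to exist, illustrating how the sign pattern alone pins down the phase portrait.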

Relevance:

10.00%

Publisher:

Abstract:

In the first part, the author of this article considers the unstable relations between ethics and theology, since they constitute two distinct sets of problems. In the second part, he shows that these two sets of problems are not so distinct that one could, in treating the first, remain entirely silent about the second, which confronts head-on the tension between mastery and chaos. The final part develops a reconstruction of the public future of theological ethics: signalling transcendence, answering for evil, and resisting injustices.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Management of blood pressure (BP) in acute ischemic stroke is controversial. The present study aims to explore the association between baseline BP levels, BP change, and outcome in the overall stroke population and in specific subgroups defined by the presence of arterial hypertensive disease and prior antihypertensive treatment. METHODS: All patients registered in the Acute STroke Registry and Analysis of Lausanne (ASTRAL) between 2003 and 2009 were analyzed. Unfavorable outcome was defined as a modified Rankin score greater than 2. A local polynomial surface algorithm was used to assess the effect of BP values on outcome in the overall population and in predefined subgroups. RESULTS: Up to a certain point, as initial BP increased, optimal outcome was seen with a progressively more substantial BP decrease over the next 24-48 h. Patients without hypertensive disease and with an initially low BP seemed to benefit from an increase in BP. In patients with hypertensive disease, initial BP and its subsequent changes seemed to have less influence on clinical outcome. Patients previously treated with antihypertensives did not tolerate initially low BPs well. CONCLUSION: Optimal outcome in acute ischemic stroke may be determined not only by initial BP levels but also by the direction and magnitude of the associated BP change over the first 24-48 h.
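A local polynomial smoother of the kind mentioned here fits a low-degree polynomial in a kernel-weighted neighbourhood of each evaluation point. A simplified 1D sketch (not the ASTRAL algorithm, and the data are invented, with a hypothetical optimum at a moderate BP decrease):

```python
import numpy as np

def local_linear(x0, x, y, bandwidth):
    """Local polynomial (degree-1) estimate at x0 with a tricube kernel --
    a simplified 1D analogue of a local polynomial surface fit."""
    u = np.abs(x - x0) / bandwidth
    w = np.where(u < 1.0, (1.0 - u**3) ** 3, 0.0)       # tricube weights
    X = np.column_stack([np.ones_like(x), x - x0])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]  # fitted value at x0

# Invented data: risk of unfavorable outcome vs. BP change over 24-48 h,
# with an assumed optimum at a decrease of 10 mmHg.
rng = np.random.default_rng(1)
bp_change = rng.uniform(-40.0, 40.0, 300)
risk = 0.3 + 0.0002 * (bp_change + 10.0) ** 2 + rng.normal(0.0, 0.02, 300)

grid = np.linspace(-30.0, 30.0, 7)
smoothed = [local_linear(g, bp_change, risk, bandwidth=15.0) for g in grid]
best_change = grid[int(np.argmin(smoothed))]  # BP change with lowest smoothed risk
```

The smoother recovers the risk curve without imposing a global functional form, which is what makes the approach suitable for exploring non-linear BP-outcome relations.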

Relevance:

10.00%

Publisher:

Abstract:

The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model of the network controlling the CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
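The core objects above (asynchronous state transition graph, attractors as terminal strongly connected components) can be illustrated on a toy two-gene toggle switch, a classic bistable motif. This sketch is not the Th-cell model or GINsim's algorithms:

```python
from itertools import product

# Toy two-gene toggle switch (mutual inhibition), Boolean logical model.
genes = ["A", "B"]
rules = {
    "A": lambda s: int(not s["B"]),  # B inhibits A
    "B": lambda s: int(not s["A"]),  # A inhibits B
}

def async_successors(state):
    """Asynchronous updating: change one unstable component at a time."""
    succ = []
    for i, g in enumerate(genes):
        s = dict(zip(genes, state))
        target = rules[g](s)
        if target != state[i]:
            succ.append(state[:i] + (target,) + state[i + 1:])
    return succ

# Build the full state transition graph (2^n states).
states = list(product((0, 1), repeat=len(genes)))
stg = {s: async_successors(s) for s in states}

def reachable(start):
    seen, stack = {start}, [start]
    while stack:
        for t in stg[stack.pop()]:
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

# Attractors = terminal SCCs: every state reachable from s can reach s back.
attractors = {frozenset(reachable(s)) for s in states
              if all(s in reachable(t) for t in reachable(s))}
```

The toggle switch yields two point attractors, (A=1, B=0) and (A=0, B=1), a minimal analogue of alternative differentiation outcomes; the exhaustive enumeration used here is exactly what becomes infeasible at scale, motivating the compaction and reduction methods of the abstract.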

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Shared Decision Making (SDM) is increasingly advocated as a model for medical decision making, yet its use in clinical practice remains low. High-impact-factor journals might represent an efficient way to disseminate the concept. We aimed to identify and characterize publication trends of SDM in 15 high-impact medical journals. METHODS: We selected the 15 general and internal medicine journals with the highest impact factor publishing original articles, letters, and editorials. We retrieved publications from 1996 to 2011 through the full-text search function on each journal website and abstracted bibliometric data. We included publications of any type containing the phrase "shared decision making" or five other variants in their abstract or full text. These were referred to as SDM publications. A polynomial Poisson regression model with logarithmic link function was used to assess the evolution of the number of SDM publications across the period according to publication characteristics. RESULTS: We identified 1285 SDM publications out of 229,179 publications in the 15 journals from 1996 to 2011. The absolute number of SDM publications by journal ranged from 2 to 273 over the 16 years. SDM publications increased both in absolute and relative numbers per year, from 46 (0.32% of all publications from the 15 journals) in 1996 to 165 (1.17%) in 2011. This growth was exponential (P < 0.01). We found fewer research publications (465, 36.2% of all SDM publications) than non-research publications, which included non-systematic reviews, letters, and editorials; the increase in research publications across time was linear. Full-text search retrieved ten times more SDM publications than a similar PubMed search (1285 vs. 119). CONCLUSION: This full-text review showed that SDM publications increased exponentially in major medical journals from 1996 to 2011. This growth might reflect an increased dissemination of the SDM concept to the medical community.
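The exponential-growth finding can be sketched with a log-linear fit, which approximates a Poisson regression with logarithmic link and a first-degree polynomial in time. Only the 1996 and 2011 counts (46 and 165) come from the abstract; the intermediate yearly counts below are interpolated and purely illustrative:

```python
import math

# Hypothetical yearly SDM publication counts: geometric interpolation
# between the two reported endpoints (46 in 1996, 165 in 2011).
years = list(range(1996, 2012))
counts = [round(46 * (165 / 46) ** ((y - 1996) / 15)) for y in years]

# Least-squares fit of log(count) = a + b * t.
t = [y - 1996 for y in years]
log_c = [math.log(c) for c in counts]
n = len(t)
t_bar, l_bar = sum(t) / n, sum(log_c) / n
b = (sum((ti - t_bar) * (li - l_bar) for ti, li in zip(t, log_c))
     / sum((ti - t_bar) ** 2 for ti in t))
a = l_bar - b * t_bar

annual_growth = math.exp(b) - 1.0  # estimated yearly growth rate
```

On these interpolated counts the fitted growth rate is roughly 9% per year, the kind of compounding that triples relative SDM output over the 16-year window.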