915 results for deterministic fractals
Abstract:
The β2 adrenergic receptor (β2AR) regulates smooth muscle relaxation in the vasculature and airways. Long- and short-acting β-agonists (LABAs/SABAs) are widely used in the treatment of chronic obstructive pulmonary disease (COPD) and asthma. Despite their widespread clinical use, the dominant β2AR regulatory pathways that are stimulated during therapy and bring about tachyphylaxis (the loss of drug effect) are not well understood. An understanding of how the β2AR responds to various β-agonists is therefore crucial to their rational use. Towards that end, we have developed deterministic models that explore the mechanism of drug-induced β2AR regulation. These mathematical models fall into three classes: (i) six quantitative models of SABA-induced G protein coupled receptor kinase (GRK)-mediated β2AR regulation; (ii) three phenomenological models of salmeterol (a LABA)-induced GRK-mediated β2AR regulation; and (iii) one semi-quantitative, unified model of SABA-induced GRK-, protein kinase A (PKA)-, and phosphodiesterase (PDE)-mediated regulation of β2AR signalling. The various models were constrained with all or some of the following experimental data: (i) GRK-mediated β2AR phosphorylation in response to various LABAs/SABAs; (ii) dephosphorylation of the GRK site on the β2AR; (iii) β2AR internalisation; (iv) β2AR recycling; (v) β2AR desensitisation; (vi) β2AR resensitisation; (vii) PKA-mediated β2AR phosphorylation in response to a SABA; and (viii) the LABA/SABA-induced cAMP profile ± PDE inhibitors. The models of GRK-mediated β2AR regulation show that plasma membrane dephosphorylation and recycling of the phosphorylated β2AR are required to reproduce the measured dephosphorylation kinetics. We further used a consensus model to predict the consequences of rapid pulsatile agonist stimulation and found that, although resensitisation was rapid, the β2AR system retained the memory of prior stimuli and desensitised much more rapidly and strongly in response to subsequent stimuli. This could explain the tachyphylaxis of SABAs over repeated use in rescue therapy of asthma patients. The LABA models show that the long action of salmeterol can be explained by the decreased stability of the arrestin/β2AR/salmeterol complex, which could account for the long action of β-agonists used in maintenance therapy of asthma patients. Our consensus model of PKA/PDE/GRK-mediated β2AR regulation is being used to identify the dominant β2AR desensitisation pathways under different therapeutic regimens in human airway cells. In summary, our models represent a significant advance towards understanding agonist-specific β2AR regulation and will aid a more rational use of β2AR agonists in the treatment of asthma.
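A minimal sketch of the kind of deterministic model this abstract describes: mass-action ODEs for GRK-mediated phosphorylation, plasma-membrane dephosphorylation, internalisation, and recycling of the β2AR. The species, structure, and rate constants here are illustrative assumptions, not the parameters of the published models.

```python
# Three receptor pools: surface (R), GRK-phosphorylated (Rp), internalised (Ri).
import numpy as np
from scipy.integrate import solve_ivp

def beta2ar_rhs(t, y, k_phos, k_dephos, k_int, k_rec, agonist):
    R, Rp, Ri = y
    dR = -k_phos * agonist * R + k_dephos * Rp + k_rec * Ri   # surface receptor
    dRp = k_phos * agonist * R - k_dephos * Rp - k_int * Rp   # phosphorylated receptor
    dRi = k_int * Rp - k_rec * Ri                             # internalised receptor
    return [dR, dRp, dRi]

# Simulate a 30-minute agonist exposure (hypothetical rate constants, min^-1).
sol = solve_ivp(beta2ar_rhs, (0, 30), [1.0, 0.0, 0.0],
                args=(0.5, 0.1, 0.2, 0.05, 1.0), dense_output=True)
print(sol.y[:, -1])  # receptor pool fractions at t = 30 min
```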
Abstract:
The goal of this study was to investigate offline memory consolidation with regard to general motor skill learning and implicit sequence-specific learning. We trained young adults on a serial reaction time task with a retention interval of either 24 hours (Experiment 1) or 1 week (Experiment 2) between two sessions. We manipulated sequence complexity (deterministic vs. probabilistic) and motor responses (unimanual vs. bimanual). We found no evidence of offline memory consolidation for sequence-specific learning with either interval: performance did not deteriorate over the interval, but it did not improve further either. However, we did find evidence of offline enhancement of general motor skill learning with both intervals, independent of the kind of sequence or the kind of response. These results suggest that general motor skill learning, but not sequence-specific learning, is enhanced during offline intervals in implicit sequence learning.
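A minimal sketch of how the two sequence conditions in a serial reaction time task might be generated: a deterministic sequence repeats a fixed cycle of stimulus positions, while a probabilistic sequence follows that cycle only with some probability. The cycle, number of positions, and probability are assumptions, not the materials of the study.

```python
import random

CYCLE = [0, 2, 1, 3]  # hypothetical repeating stimulus positions

def deterministic_sequence(n_trials):
    # Every trial follows the fixed cycle exactly.
    return [CYCLE[i % len(CYCLE)] for i in range(n_trials)]

def probabilistic_sequence(n_trials, p_follow=0.85, n_positions=4):
    # Each trial follows the underlying regularity with probability p_follow,
    # otherwise a random position is shown.
    seq = []
    for i in range(n_trials):
        if random.random() < p_follow:
            seq.append(CYCLE[i % len(CYCLE)])
        else:
            seq.append(random.randrange(n_positions))
    return seq

print(deterministic_sequence(8))
print(probabilistic_sequence(8))
```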
Abstract:
The rank-based nonlinear predictability score was recently introduced as a test for determinism in point processes. Here we adapt this measure to time series sampled from time-continuous flows. We use noisy Lorenz signals to compare this approach against a classical amplitude-based nonlinear prediction error. Both measures show an almost identical robustness against Gaussian white noise. In contrast, when the amplitude distribution of the noise has a narrower central peak and heavier tails than the normal distribution, the rank-based nonlinear predictability score outperforms the amplitude-based nonlinear prediction error. For this type of noise, the nonlinear predictability score has a higher sensitivity for deterministic structure in noisy signals. It also yields a higher statistical power in a surrogate test of the null hypothesis of linear stochastic correlated signals. We show the high relevance of this improved performance in an application to electroencephalographic (EEG) recordings from epilepsy patients. Here, too, the nonlinear predictability score proves more sensitive to nonrandomness. Importantly, it yields an improved contrast between signals recorded from brain areas where the first ictal EEG signal changes were detected (focal EEG signals) and signals recorded from brain areas that were not involved at seizure onset (nonfocal EEG signals).
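A minimal sketch of an amplitude-based nonlinear prediction error, the baseline measure the rank-based score is compared against: delay-embed the signal, predict each point's future from the future of its nearest embedded neighbour, and report the root-mean-square prediction error. The embedding parameters and test signal are illustrative assumptions.

```python
import numpy as np

def nonlinear_prediction_error(x, dim=3, tau=5, horizon=1):
    n = len(x) - (dim - 1) * tau - horizon
    # Delay-embedding matrix: each row is one reconstructed state vector.
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
    # The value each embedded state evolves into after `horizon` samples.
    targets = x[(dim - 1) * tau + horizon: (dim - 1) * tau + horizon + n]
    errors = []
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                      # exclude trivial self-match
        j = np.argmin(d)                   # nearest neighbour in state space
        errors.append(targets[j] - targets[i])
    return np.sqrt(np.mean(np.square(errors)))

rng = np.random.default_rng(0)
signal = np.sin(0.1 * np.arange(2000)) + 0.1 * rng.standard_normal(2000)
print(nonlinear_prediction_error(signal))
```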
Abstract:
Many studies have examined whether communities are structured by random or deterministic processes, and both are likely to play a role, but relatively few studies have attempted to quantify the degree of randomness in species composition. We quantified, for the first time, the degree of randomness in forest bird communities based on an analysis of spatial autocorrelation in three regions of Germany. The compositional dissimilarity between pairs of forest patches was regressed against the distance between them. We then calculated the y-intercept of the curve, i.e. the 'nugget', which represents the compositional dissimilarity at zero spatial distance. Following similar work on plant communities, we take this to represent the degree of randomness in species composition. We then analysed how the degree of randomness in community composition varied over time and with forest management intensity, which we expected to reduce the importance of random processes by increasing the strength of environmental drivers. We found that a high proportion of the bird community composition could be explained by chance (overall mean of 0.63), implying that most of the variation in local bird community composition is driven by stochastic processes. Forest management intensity did not consistently affect the mean degree of randomness in community composition, perhaps because the bird communities were relatively insensitive to management intensity. We found a high temporal variation in the degree of randomness, which may indicate temporal variation in assembly processes and in the importance of key environmental drivers. We conclude that the degree of randomness in community composition should be considered in bird community studies; the high values we find may indicate that bird community composition is relatively hard to predict at the regional scale.
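A minimal sketch of the distance-decay approach described above: regress pairwise compositional dissimilarity against pairwise geographic distance and read off the y-intercept (the 'nugget') as the degree of randomness. The community matrix, coordinates, and dissimilarity index (Sørensen) are simulated assumptions.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n_sites, n_species = 20, 30
presence = rng.random((n_sites, n_species)) < 0.4  # simulated presence/absence data
coords = rng.random((n_sites, 2)) * 10.0           # site coordinates, km

dists, dissims = [], []
for i, j in combinations(range(n_sites), 2):
    shared = np.sum(presence[i] & presence[j])
    total = np.sum(presence[i]) + np.sum(presence[j])
    dissims.append(1.0 - 2.0 * shared / total)     # Sørensen dissimilarity
    dists.append(np.linalg.norm(coords[i] - coords[j]))

# Linear distance-decay fit; the intercept is the dissimilarity at zero distance.
slope, intercept = np.polyfit(dists, dissims, 1)
print(f"nugget (dissimilarity at zero distance): {intercept:.3f}")
```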
Abstract:
Image denoising continues to be an active research topic. Although state-of-the-art denoising methods are numerically impressive and approach theoretical limits, they suffer from visible artifacts. While they produce acceptable results for natural images, human eyes are less forgiving when viewing synthetic images. At the same time, current methods are becoming more complex, making analysis and implementation difficult. We propose to treat image denoising as a simple physical process which progressively reduces noise by deterministic annealing. The results of our implementation are numerically and visually excellent. We further demonstrate that our method is particularly suited to synthetic images. Finally, we offer a new perspective on image denoising using robust estimators.
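A minimal sketch in the spirit of the annealing process described above, not the paper's actual algorithm: each pixel is repeatedly replaced by a robust weighted average of its 3x3 neighbourhood, with the weighting "temperature" lowered on a fixed schedule so that noise is reduced progressively. The schedule and weight function are assumptions.

```python
import numpy as np

def anneal_denoise(img, temps=(0.4, 0.2, 0.1, 0.05)):
    out = img.astype(float).copy()
    for T in temps:  # progressively lower temperature
        padded = np.pad(out, 1, mode="edge")
        # 3x3 neighbourhood stack: one layer per shifted copy of the image.
        neigh = np.stack([padded[di:di + out.shape[0], dj:dj + out.shape[1]]
                          for di in range(3) for dj in range(3)])
        # Robust weights: neighbours far from the centre value count less.
        w = np.exp(-((neigh - out) ** 2) / (2 * T ** 2))
        out = np.sum(w * neigh, axis=0) / np.sum(w, axis=0)
    return out

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 1, 64), (64, 1))        # synthetic ramp image
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
print(np.abs(anneal_denoise(noisy) - clean).mean())    # mean absolute error
```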
Abstract:
In this paper we introduce a class of descriptors for regular languages arising from an application of the Stone duality between finite Boolean algebras and finite sets. These descriptors, called classical fortresses, are objects specified in classical propositional logic that accept exactly the regular languages. To prove this, we show that the languages accepted by classical fortresses and by deterministic finite automata coincide. Classical fortresses, besides being propositional descriptors for regular languages, also turn out to be an efficient tool for providing alternative and intuitive proofs of the closure properties of regular languages.
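A minimal sketch of the automaton side of the stated equivalence: a deterministic finite automaton accepting the regular language of binary strings with an even number of 1s. A classical fortress for the same language would, per the paper, be a propositional-logic description; the DFA below is the standard object it is proved equivalent to.

```python
def dfa_accepts(word, delta, start, accepting):
    state = start
    for symbol in word:
        state = delta[(state, symbol)]  # deterministic: exactly one successor
    return state in accepting

# Two states track the parity of 1s seen so far.
delta = {('even', '0'): 'even', ('even', '1'): 'odd',
         ('odd', '0'): 'odd', ('odd', '1'): 'even'}

for w in ["", "11", "101", "1001"]:
    print(w or "ε", dfa_accepts(w, delta, 'even', {'even'}))
```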
Abstract:
We consider the problem of twenty questions with noisy answers, in which we seek to find a target by repeatedly choosing a set, asking an oracle whether the target lies in this set, and obtaining an answer corrupted by noise. Starting with a prior distribution on the target's location, we seek to minimize the expected entropy of the posterior distribution. We formulate this problem as a dynamic program and show that any policy optimizing the one-step expected reduction in entropy is also optimal over the full horizon. Two such Bayes optimal policies are presented: one generalizes the probabilistic bisection policy due to Horstein and the other asks a deterministic set of questions. We study the structural properties of the latter, and illustrate its use in a computer vision application.
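A minimal sketch of the probabilistic bisection policy due to Horstein, one of the two Bayes optimal policies mentioned above: maintain a posterior over the target's location on a grid, query whether the target lies below the posterior median, and reweight by the noisy answer. The grid, target, and answer accuracy are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 1001)
posterior = np.full(grid.size, 1.0 / grid.size)  # uniform prior
target, p_correct = 0.7314, 0.8                  # hypothetical target, answer accuracy

for _ in range(30):
    # Query the posterior median: "is the target in [0, median]?"
    median = grid[np.searchsorted(np.cumsum(posterior), 0.5)]
    truth = target <= median
    answer = truth if rng.random() < p_correct else not truth
    left = grid <= median
    in_said_set = left if answer else ~left
    # Bayes update: points consistent with the answer get weight p_correct.
    posterior *= np.where(in_said_set, p_correct, 1.0 - p_correct)
    posterior /= posterior.sum()

print("estimate:", grid[np.argmax(posterior)], "truth:", target)
```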
Abstract:
Trabecular bone score (TBS) rests on the textural analysis of DXA images to reflect the decay in trabecular structure characterising osteoporosis. Yet its discriminative power in fracture studies remains puzzling, as prior biomechanical tests found no correlation with vertebral strength. To verify this result, which may have been due to an unrealistic set-up, and to cover a wide range of loading scenarios, we used the data from three previous biomechanical studies with different experimental settings. They involved the compressive failure of 62 human lumbar vertebrae loaded 1) via intervertebral discs to mimic the in vivo situation ("full vertebra"), 2) via the classical endplate embedding ("vertebral body"), or 3) via a ball joint to induce anterior wedge failure ("vertebral section"). HR-pQCT scans acquired prior to testing were used to simulate anterior-posterior DXA, from which areal bone mineral density (aBMD) and the initial slope of the variogram (ISV), the early definition of TBS, were evaluated. Finally, the relation of aBMD and ISV to failure load (Fexp) and apparent failure stress (σexp) was assessed, and their relative contributions to a multi-linear model were quantified via ANOVA. We found that, unlike aBMD, ISV did not significantly correlate with Fexp and σexp, except in the "vertebral body" case (r2 = 0.396, p = 0.028). Aside from the "vertebral section" set-up, where it explained only 6.4% of σexp (p = 0.037), it brought no significant improvement over aBMD alone. These results indicate that ISV, a replica of TBS, is a poor surrogate for vertebral strength no matter the testing set-up, which supports the prior observations and raises a fortiori the question of the deterministic factors underlying the statistical relationship between TBS and vertebral fracture risk.
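A minimal sketch of the initial slope of the variogram (ISV), the early definition of TBS evaluated above: compute the experimental variogram of a grey-level image at the first few lag distances and fit its slope near the origin. The image is simulated, and the lag range and fitting convention are assumptions that may differ from the TBS definition.

```python
import numpy as np

def variogram(img, max_lag=4):
    lags, gammas = [], []
    for h in range(1, max_lag + 1):
        # Semivariance over horizontal and vertical pixel pairs at lag h.
        diffs = np.concatenate([(img[h:, :] - img[:-h, :]).ravel(),
                                (img[:, h:] - img[:, :-h]).ravel()])
        lags.append(h)
        gammas.append(0.5 * np.mean(diffs ** 2))
    return np.array(lags), np.array(gammas)

rng = np.random.default_rng(0)
img = rng.standard_normal((128, 128)).cumsum(0).cumsum(1)  # spatially correlated field
lags, gammas = variogram(img)
isv = np.polyfit(lags, gammas, 1)[0]  # slope of the variogram near the origin
print(f"ISV: {isv:.3f}")
```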
Abstract:
Introduction: According to the ecological view, coordination is established by virtue of the social context. Affordances, understood as situational opportunities for interaction, are assumed to represent the guiding principles underlying the decisions involved in interpersonal coordination. It is generally agreed that affordances are not an objective part of the (social) environment but depend on the constructive perception of the subjects involved. Theory and empirical data hold that cognitive operations enabling domain-specific efficacy beliefs are involved in the perception of affordances. The aim of the present study was to test the effects of these cognitive concepts on the subjective construction of local affordances and their influence on decision making in football.
Methods: 71 football players (M = 24.3 years, SD = 3.3, 21% women) from different divisions participated in the study. Participants were presented with scenarios of offensive game situations. They were asked to take the perspective of the person on the ball and to indicate where they would pass the ball from within each situation. The participants stated their decisions in two conditions with different game scores (1:0 vs. 0:1). The playing fields of all scenarios were then divided into ten zones. For each zone, participants were asked to rate their confidence in being able to pass the ball there (self-efficacy), the likelihood of the group staying in ball possession if the ball were passed into the zone (group-efficacy I), the likelihood of the ball being controlled safely by a team member (pass control / group-efficacy II), and whether a pass would establish a better initial position to attack the opponents' goal (offensive convenience). Answers were reported on visual analog scales ranging from 1 to 10. Data were analyzed by specifying general linear models for binomially distributed data (Mplus). Maximum likelihood with non-normality-robust standard errors was chosen to estimate parameters.
Results: Analyses showed that zone- and domain-specific efficacy beliefs significantly affected passing decisions. Because of collinearity with self-efficacy and group-efficacy I, group-efficacy II was excluded from the models to ease interpretation of the results. Generally, zones with high values in the subjective ratings had a higher probability of being chosen as the passing destination (βself-efficacy = 0.133, p < .001, OR = 1.142; βgroup-efficacy I = 0.128, p < .001, OR = 1.137; βoffensive convenience = 0.057, p < .01, OR = 1.059). There were, however, characteristic differences between the two score conditions. While group-efficacy I was the only significant predictor in condition 1 (βgroup-efficacy I = 0.379, p < .001), only self-efficacy and offensive convenience contributed to passing decisions in condition 2 (βself-efficacy = 0.135, p < .01; βoffensive convenience = 0.120, p < .001).
Discussion: The results indicate that subjectively distinct attributes projected onto playfield zones affect passing decisions. The study proposes a probabilistic alternative to Lewin's (1951) hodological and deterministic field theory and offers insight into how dimensions of the psychological landscape afford passing behavior. For members of a team, this psychological landscape is constituted not only by probabilities that refer to the potential and consequences of individual behavior, but also by those of the group system of which the individuals are part. Hence, in regulating action decisions in group settings, the informational basis extends to aspects referring to the group level.
References: Lewin, K. (1951). In D. Cartwright (Ed.), Field theory in social sciences: Selected theoretical papers by Kurt Lewin. New York: Harper & Brothers.
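A minimal check of the link between the coefficients and odds ratios reported in the Results above: for a binomial (logistic) model, OR = exp(β). The β values are taken directly from the abstract.

```python
import math

for name, beta in [("self-efficacy", 0.133),
                   ("group-efficacy I", 0.128),
                   ("offensive convenience", 0.057)]:
    # exp(0.133) = 1.142, exp(0.128) = 1.137, exp(0.057) = 1.059, as reported.
    print(f"{name}: OR = exp({beta}) = {math.exp(beta):.3f}")
```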
Abstract:
The application of Markov processes is very useful in health-care problems. The objective of this study is to provide a structured methodology for forecasting cost based upon combining a stochastic model of utilization (a Markov chain) with a deterministic cost function. The cost perspective in this study is the reimbursement for the services rendered. The data used are the OneCare database of claim records for their enrollees over the two-year period January 1, 1996 to December 31, 1997. The model combines a Markov chain that describes the utilization pattern and its variability, where the use of resources by risk groups (age, gender, and diagnosis) is considered in the process, with a cost function determined from a fixed schedule based on real costs or charges for those in the OneCare claims database. The cost function is a secondary application of the model. Goodness-of-fit of the model is checked against the traditional method of cost forecasting.
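A minimal sketch of the combination described above: a Markov chain over utilization states with a deterministic cost attached to each state, used to forecast expected cost over a horizon. The states, transition probabilities, and costs are illustrative assumptions, not the OneCare estimates.

```python
import numpy as np

states = ["no care", "outpatient", "inpatient"]
P = np.array([[0.85, 0.12, 0.03],      # monthly transition probabilities
              [0.50, 0.40, 0.10],
              [0.30, 0.45, 0.25]])
cost = np.array([0.0, 150.0, 4000.0])  # deterministic reimbursement per state-month

dist = np.array([1.0, 0.0, 0.0])       # everyone starts in "no care"
expected_total = 0.0
for month in range(24):                # two-year horizon, matching the claims period
    dist = dist @ P                    # propagate the utilization distribution
    expected_total += dist @ cost      # accumulate expected monthly cost

print(f"expected 24-month cost per enrollee: ${expected_total:,.2f}")
```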
Abstract:
One of the most critical aspects of G protein coupled receptor (GPCR) regulation is their rapid and acute desensitization following agonist stimulation. Phosphorylation of these receptors by GPCR kinases (GRKs) is a major mechanism of desensitization. Considerable evidence from studies of rhodopsin kinase and GRK2 suggests there is an allosteric docking site for the receptor distinct from the GRK catalytic site. While the agonist-activated GPCR appears crucial for GRK activation, the molecular details of this interaction remain unclear. Recent studies suggested an important role for the N- and C-termini and domains in the small lobe of the kinase domain in allosteric activation; however, neither the mechanism of action of that site nor the contributions of the RH domain have been elucidated. To search for the allosteric site, we first identified evolutionarily conserved sites within the RH and kinase domains that are presumed determinants of protein function, employing evolutionary trace (ET) methodology and crystal structures of GRK6. Focusing on a conserved cluster centered on helices 3, 9, and 10 in the RH domain, key residues of GRK5 and 6 were targeted for mutagenesis and functional assays. We found that a number of double mutations within helices 3, 9, and 10 and the N-terminus markedly reduced (50-90%) the constitutive phosphorylation of the β2 adrenergic receptor (β2AR) in intact cells and the phosphorylation of light-activated rhodopsin (Rho*) in vitro as compared to wild-type (WT) GRK5 or 6. Based on these results, we designed peptide mimetics of GRK5 helix 9, both computationally and through chemical modifications, with the goal of confirming the importance of helix 9 and developing a useful inhibitor to disrupt the GPCR-GRK interaction. Several peptides were found to block Rho* phosphorylation by GRK5, including the native helix 9 sequence, a Peptide Builder-designed peptide preserving only the key ET residues, and chemically locked helices. Most peptidomimetics showed inhibition of GRK5 activity greater than 80%, with an IC50 of ∼30 µM. Alanine scanning of helix 9 further revealed both essential and non-essential residues for inhibition. Importantly, substitution of Arg 169 by alanine in the native helix 9-based peptide gave almost complete inhibition at 30 µM, with an IC50 of ∼10 µM. In summary, we report a previously unrecognized crucial role for the RH domain of GRK5 and 6, and the subsequent identification of a lead peptide inhibitor of the protein-protein interaction with potential for specific blockade of GPCR desensitization.
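A minimal sketch of how an IC50 such as the ∼10-30 µM values above is typically extracted: fit a four-parameter Hill curve to percent-inhibition data across peptide concentrations. The concentration-response data here are simulated assumptions, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ic50, slope):
    # Four-parameter logistic: half-maximal inhibition at conc = ic50.
    return bottom + (top - bottom) / (1.0 + (ic50 / conc) ** slope)

conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0])        # peptide, µM (simulated)
inhibition = np.array([8.0, 22.0, 48.0, 75.0, 92.0])  # percent (simulated)

params, _ = curve_fit(hill, conc, inhibition, p0=[0.0, 100.0, 10.0, 1.0])
print(f"fitted IC50: {params[2]:.1f} µM")
```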
Abstract:
In this work, a probabilistic model was developed that uses derived probability density function theory to estimate the mean annual nitrate load transported by surface runoff, using a functional relationship between runoff and nitrate load. The deterministic hydrological and water-quality model Simulator for Water Resources in Rural Basins - Water Quality (SWRRB-WQ) was used to estimate the nitrate load in surface runoff. This model takes as input the daily precipitation observed at the Olavarría Airport station over the period 1988 to 2002. A new methodology that estimates the uncertainty in the observed values was applied to calibrate the model. Both the probabilistic and the deterministic model were applied to a rural subcatchment of the Tapalqué stream (Buenos Aires province, Argentina), and the nitrate loads estimated with the two models were finally compared against observations made at the stream section under study. The results show that the mean nitrate load obtained with the probabilistic model is of the same order of magnitude as the mean values observed and estimated with the SWRRB-WQ hydrological and water-quality model.
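A minimal sketch of the derived-distribution idea described above: given a probability distribution for runoff and a functional runoff-load relationship, the distribution (and hence the mean) of the nitrate load follows; here it is approximated by Monte Carlo sampling. The exponential runoff model and power-law rating parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
mean_runoff = 12.0   # mm per event, hypothetical
a, b = 0.8, 1.3      # hypothetical rating relation: load = a * runoff**b

runoff = rng.exponential(mean_runoff, size=100_000)  # sample the runoff pdf
load = a * runoff ** b                               # derived load values

print(f"estimated mean nitrate load per event: {load.mean():.2f} kg/ha")
```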
Abstract:
Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs on mapping coastal inundation. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation of elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. 1,000 error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation for a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated by counting the proportion of the 1,000 simulations in which a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that when considering a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent in spatial data and analysis.
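A minimal sketch of the probabilistic bathtub workflow described above: add simulated, spatially correlated elevation errors to a DEM, reclassify each realisation against the flood level, and count how often each cell is inundated. Gaussian-filtered noise stands in for sequential Gaussian simulation, connectivity is ignored (a simple rather than hydrologically correct bathtub), and the DEM, error magnitude, and flood level are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
dem = np.linspace(0.0, 3.0, 100)[None, :].repeat(100, axis=0)  # synthetic coastal ramp (m)
flood_level = 1.0 + 0.8      # 1 m SLR plus a hypothetical storm surge (m)
n_sim, error_sd = 1000, 0.15 # matching the 1,000-simulation design

counts = np.zeros_like(dem)
for _ in range(n_sim):
    # Spatially correlated error field (stand-in for sequential Gaussian simulation).
    err = gaussian_filter(rng.standard_normal(dem.shape), sigma=5)
    err *= error_sd / err.std()
    counts += (dem + err) <= flood_level   # simple bathtub reclassification

prob_inundated = counts / n_sim
deterministic = dem <= flood_level
print(f"deterministic area fraction: {deterministic.mean():.3f}, "
      f"area with >1% inundation probability: {(prob_inundated > 0.01).mean():.3f}")
```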
Abstract:
The aim of this paper is to re-evaluate the role played by the érgon argument in the Nicomachean Ethics (EN), which has long been used to support a deterministic reading of virtue in Aristotle. On this reading, to say that every nature has a function proper to it is equivalent to saying that by nature it is necessarily called to fulfil that function. Nature is then understood as both a necessary and a sufficient condition for carrying out a specific function. In what follows we aim to show that this reading is mistaken, since the natural disposition, while a necessary condition, is not also a sufficient condition for carrying out a function. To that end, we start from an elucidation of two senses of nature in Physics II and reconstruct the érgon argument in light of these new elements.