950 results for "Robust model"


Relevance:

30.00%

Publisher:

Abstract:

"Sitting between your past and your future doesn't mean you are in the present." (Dakota Skye)

Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending that expected from the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions among the system's components. In addition to the actual nature of these interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by these systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations observed in transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and fault tolerance of systems in two different contexts.

In the first part, we study cellular automata, a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, which rely on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions among cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure starting from either a regular or a random one. The outcome is remarkable: the resulting topologies share properties of both regular and random networks and display similarities with the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones.

In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean networks model. In some ways, this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to those of gene regulatory networks. We have also studied actual biological organisms and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place in Boolean networks and proposed a more biologically plausible cascading scheme. Finally, we tackled the actual Boolean functions of the model, i.e. the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of previous GRN models, yet with superior resistance to perturbations. We believe they are one step closer to biological reality.
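As a concrete point of reference for the model family discussed above, here is a minimal sketch of a classical Kauffman-style random Boolean network with synchronous updates. The network size, connectivity and random update tables are illustrative choices only, not the improved model developed in this work.

```python
import numpy as np

rng = np.random.default_rng(42)
n_nodes, k_inputs, n_steps = 20, 2, 30

# Each node reads K randomly chosen nodes and applies its own random Boolean function.
inputs = np.array([rng.choice(n_nodes, size=k_inputs, replace=False) for _ in range(n_nodes)])
tables = rng.integers(0, 2, size=(n_nodes, 2 ** k_inputs))   # one truth table per node
state = rng.integers(0, 2, size=n_nodes)

def step(state):
    # Encode each node's K input bits as an integer index into its truth table.
    idx = (state[inputs] * (2 ** np.arange(k_inputs))).sum(axis=1)
    return tables[np.arange(n_nodes), idx]

for _ in range(n_steps):
    state = step(state)
print("state after", n_steps, "synchronous updates:", state)
```

Replacing the synchronous loop with node-by-node updates, or rewiring the `inputs` array, is where topology and asynchrony experiments of the kind described above would plug in.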

Relevance:

30.00%

Publisher:

Abstract:

Many definitions and debates exist about the core characteristics of the social and solidarity economy (SSE) and its actors. Among others, legal forms, profit, geographical scope, and size as criteria for identifying SSE actors often reveal disagreements among SSE scholars. Instead of using a dichotomous, either-in-or-out definition of SSE actors, this paper presents an assessment tool that takes into account multiple dimensions to offer a more comprehensive and nuanced view of the field. We first define the core dimensions of the assessment tool by synthesizing the multiple indicators found in the literature. We then empirically test these dimensions and their interrelatedness and seek to identify potential clusters of actors. Finally, we discuss the practical implications of our model.

Relevance:

30.00%

Publisher:

Abstract:

The first-generation models of currency crises have often been criticized because they predict that, in the absence of very large triggering shocks, currency attacks should be predictable and lead to small devaluations. This paper shows that these features of first-generation models are not robust to the inclusion of private information. In particular, this paper analyzes a generalization of the Krugman-Flood-Garber (KFG) model, which relaxes the assumption that all consumers are perfectly informed about the level of fundamentals. In this environment, the KFG equilibrium of zero devaluation is only one of many possible equilibria. In all the other equilibria, the lack of perfect information delays the attack on the currency past the point at which the shadow exchange rate equals the peg, giving rise to unpredictable and discrete devaluations.
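For readers who want the benchmark intuition the paper relaxes, here is a hedged numerical sketch of the full-information first-generation logic: domestic credit grows steadily, the shadow exchange rate rises with it, and the attack occurs when the shadow rate first reaches the peg, so the devaluation at the moment of attack is (approximately) zero. The log-linear functional form and the parameter values are textbook-style illustrations, not the paper's specification.

```python
import numpy as np

peg = 1.00                              # pegged exchange rate, in logs (illustrative units)
alpha, mu, d0 = 0.5, 0.02, 0.80         # money-demand semi-elasticity, credit growth, initial credit

t = np.arange(0, 40)
shadow = d0 + mu * t + alpha * mu       # textbook log-linear shadow exchange rate

attack_t = int(t[np.argmax(shadow >= peg)])
print(f"attack at t = {attack_t}: shadow = {shadow[attack_t]:.3f}, peg = {peg:.2f}")
```

With private information, as the abstract explains, the attack is pushed past this crossing point, which is what generates discrete, unpredictable devaluations.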

Relevance:

30.00%

Publisher:

Abstract:

Agent-based computational economics is becoming widely used in practice. This paper explores the consistency of some of its standard techniques. We focus in particular on prevailing wholesale electricity trading simulation methods. We include different supply and demand representations and propose the Experience-Weighted Attraction method to include several behavioural algorithms. We compare the results across assumptions and to economic theory predictions. The match is good under best-response and reinforcement learning but not under fictitious play. The simulations perform well under flat and upward-sloping supply bidding, and also for plausible demand elasticity assumptions. Learning is influenced by the number of bids per plant and the initial conditions. The overall conclusion is that agent-based simulation assumptions are far from innocuous. We link their performance to underlying features, and identify those that are better suited to model wholesale electricity markets.
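Since the abstract names the Experience-Weighted Attraction (EWA) approach of Camerer and Ho, here is a hedged, self-contained sketch of the basic EWA attraction update and a logit choice rule. The parameter values, payoffs and three-action setting are illustrative and are not the calibration used in the paper.

```python
import numpy as np

n_actions = 3
phi, delta, rho = 0.9, 0.5, 0.9       # attraction decay, weight on forgone payoffs, experience decay
lam = 2.0                             # logit sensitivity
A = np.zeros(n_actions)               # attractions
N = 1.0                               # experience weight

def ewa_update(A, N, chosen, payoffs):
    """One EWA step: payoffs[j] is the payoff action j would have earned this round."""
    N_new = rho * N + 1.0
    reinforce = np.where(np.arange(n_actions) == chosen, 1.0, delta)  # chosen: 1, forgone: delta
    A_new = (phi * N * A + reinforce * payoffs) / N_new
    return A_new, N_new

def choice_probs(A):
    e = np.exp(lam * (A - A.max()))   # logit choice rule with overflow guard
    return e / e.sum()

rng = np.random.default_rng(0)
for t in range(50):
    p = choice_probs(A)
    chosen = rng.choice(n_actions, p=p)
    payoffs = rng.normal([1.0, 1.2, 0.8], 0.1)   # hypothetical realized/forgone payoffs
    A, N = ewa_update(A, N, chosen, payoffs)
print("final choice probabilities:", np.round(choice_probs(A), 3))
```

Setting delta and rho to their limiting values recovers reinforcement learning or belief-based rules, which is why EWA is a convenient way to nest several behavioural algorithms in one simulation.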

Relevance:

30.00%

Publisher:

Abstract:

This paper explains the divergent behavior of European and US unemployment rates using a job market matching model of the labor market with an interaction between shocks and institutions. It shows that a reduction in TFP growth rates, an increase in real interest rates, and an increase in tax rates lead to a permanent increase in unemployment rates when the replacement rates or initial tax rates are high, while no increase in unemployment occurs when institutions are "employment friendly". The paper also shows that an increase in turbulence, modelled as an increased probability of skill loss, is not a robust explanation for the European unemployment puzzle in the context of a matching model with both endogenous job creation and job destruction.
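As background only (not the paper's calibration), the steady-state unemployment rate in a textbook matching model follows from a Cobb-Douglas matching function and the flow balance between job destruction and job finding; the sketch below takes labour-market tightness as given for simplicity and shows how the pieces fit together.

```python
# Illustrative steady-state unemployment in a standard matching model.
A, eta = 0.6, 0.5                   # matching efficiency and elasticity (illustrative values)
s = 0.02                            # job destruction (separation) rate
theta = 0.8                         # labour-market tightness v/u, taken as given here

f = A * theta ** (1.0 - eta)        # job-finding rate per unemployed worker: m(u, v) / u
u_star = s / (s + f)                # flow steady state: s * (1 - u) = f * u
print(f"job-finding rate = {f:.3f}, steady-state unemployment = {u_star:.3%}")
```

Shocks and institutions of the kind discussed above enter through the endogenous determination of the separation rate and of tightness, which this sketch deliberately leaves out.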

Relevance:

30.00%

Publisher:

Abstract:

We derive an international asset pricing model that assumes local investors have preferences of the "keeping up with the Joneses" type. In an international setting, investors compare their current wealth with that of their peers who live in the same country. In the process of inferring the country's average wealth, investors incorporate information from the domestic market portfolio. In equilibrium, this gives rise to a multifactor CAPM where, together with the world market price of risk, there exist country-specific prices of risk associated with deviations from the country's average wealth level. The model performs significantly better, in terms of explaining the cross-section of returns, than the international CAPM. Moreover, the results are robust, both for conditional and unconditional tests, to the inclusion of currency risk, macroeconomic sources of risk and the Fama and French HML factor.
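Stated generically (a hedged sketch, not the paper's exact specification), the pricing relation described above has a two-factor form in which expected excess returns load on the world market and on a country-specific factor tied to deviations from the country's average wealth:

```latex
\mathbb{E}[R_i] - r_f \;=\; \lambda_{w}\,\beta_{i,w} \;+\; \lambda_{c}\,\beta_{i,c}
```

Here λ_w is the world market price of risk, λ_c the country-specific price of risk, and β_{i,w}, β_{i,c} the asset's loadings on the two factors; the cross-sectional tests then ask whether the country-specific price of risk remains significant once currency risk, macroeconomic factors and HML are included.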

Relevance:

30.00%

Publisher:

Abstract:

The n-octanol/water partition coefficient (log Po/w) is a key physicochemical parameter for drug discovery, design, and development. Here, we present a physics-based approach that shows a strong linear correlation between the computed solvation free energy in implicit solvents and the experimental log Po/w on a cleansed data set of more than 17,500 molecules. After internal validation by five-fold cross-validation and data randomization, the predictive power of the most interesting multiple linear model, based solely on two GB/SA parameters, was tested on two different external sets of molecules. On the Martel drug-like test set, the predictive power of the best model (N = 706, r = 0.64, MAE = 1.18, and RMSE = 1.40) is similar to that of six well-established empirical methods. On the 17-drug test set, our model outperformed all compared empirical methodologies (N = 17, r = 0.94, MAE = 0.38, and RMSE = 0.52). The physical basis of our original GB/SA approach, together with its predictive capacity, computational efficiency (1 to 2 s per molecule), and three-dimensional molecular graphics capability, lays the foundations for a promising predictor, the implicit log P method (iLOGP), to complement the portfolio of drug design tools developed and provided by the SIB Swiss Institute of Bioinformatics.
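To make the reported statistics concrete, the following is a minimal sketch of fitting a two-descriptor multiple linear model and computing the correlation r, MAE and RMSE quoted above. The descriptor names and synthetic data are placeholders; this does not reproduce the iLOGP model or its training set.

```python
import numpy as np

rng = np.random.default_rng(1)
dg_solv = rng.normal(-8.0, 3.0, 500)             # computed solvation free energies (illustrative)
sasa = rng.normal(400.0, 80.0, 500)              # second hypothetical GB/SA descriptor
logp_exp = -0.3 * dg_solv + 0.004 * sasa + rng.normal(0.0, 0.8, 500)  # synthetic "experimental" log P

# Ordinary least-squares fit of log P on the two descriptors plus an intercept.
X = np.column_stack([np.ones_like(dg_solv), dg_solv, sasa])
coef, *_ = np.linalg.lstsq(X, logp_exp, rcond=None)
pred = X @ coef

r = np.corrcoef(pred, logp_exp)[0, 1]
mae = np.mean(np.abs(pred - logp_exp))
rmse = np.sqrt(np.mean((pred - logp_exp) ** 2))
print(f"r = {r:.2f}, MAE = {mae:.2f}, RMSE = {rmse:.2f}")
```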

Relevance:

30.00%

Publisher:

Abstract:

I discuss the identifiability of a structural New Keynesian Phillips curve when it is embedded in a small-scale dynamic stochastic general equilibrium model. Identification problems emerge because not all the structural parameters are recoverable from the semi-structural ones and because the objective functions I consider are poorly behaved. The solution and the moment mappings are responsible for these problems.
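For reference, the hybrid New Keynesian Phillips curve is typically written in a form like the following (a textbook form, not necessarily the exact specification analyzed in the paper):

```latex
\pi_t \;=\; \gamma_f\,\mathbb{E}_t[\pi_{t+1}] \;+\; \gamma_b\,\pi_{t-1} \;+\; \kappa\,mc_t \;+\; \varepsilon_t
```

The identification question is whether the underlying structural parameters can be recovered from the semi-structural coefficients (γ_f, γ_b, κ) once the curve is embedded in the full DSGE model.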

Relevance:

30.00%

Publisher:

Abstract:

AIMS: While successful termination by pacing of organized atrial tachycardias has been observed in patients, single-site rapid pacing has not yet led to conclusive results for the termination of atrial fibrillation (AF). The purpose of this study was to evaluate a novel atrial septal pacing algorithm for the termination of AF in a biophysical model of the human atria. METHODS AND RESULTS: Sustained AF was generated in a model based on human magnetic resonance images and membrane kinetics. Rapid pacing was applied from the septal area following a dual-stage scheme: (i) rapid pacing for 10-30 s at pacing intervals of 62-70% of the AF cycle length (AFCL); (ii) slow pacing for 1.5 s at 180% AFCL, initiated by a single stimulus at 130% AFCL. AF termination success rates were computed. A mean success rate for AF termination of 10.2% was obtained for rapid septal pacing alone. The addition of the slow pacing phase increased this rate to 20.2%. At the optimal pacing cycle length (64% AFCL), AF termination rates of up to 29% were observed. CONCLUSION: The proposed septal pacing algorithm could suppress AF reentries in a more robust way than classical single-site rapid pacing. Experimental studies are now needed to determine whether similar termination mechanisms and rates can be observed in animals or humans, and in which types of AF this pacing strategy might be most effective.
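The arithmetic of the dual-stage scheme can be sketched directly from the percentages given above; the AFCL value below is only an example input, and 64% is the optimal rapid-pacing fraction reported in the abstract.

```python
# Compute the pacing intervals of the dual-stage septal scheme from a measured AFCL.
afcl_ms = 170.0                                   # example AF cycle length in ms (illustrative)

rapid_interval = 0.64 * afcl_ms                   # stage 1: rapid pacing at 62-70% AFCL for 10-30 s
transition_stim = 1.30 * afcl_ms                  # single stimulus at 130% AFCL
slow_interval = 1.80 * afcl_ms                    # stage 2: slow pacing at 180% AFCL for 1.5 s

print(f"rapid: {rapid_interval:.0f} ms, transition: {transition_stim:.0f} ms, slow: {slow_interval:.0f} ms")
```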

Relevance:

30.00%

Publisher:

Abstract:

Given the adverse impact of image noise on the perception of important clinical details in digital mammography, routine quality control measurements should include an evaluation of noise. The European Guidelines, for example, employ a second-order polynomial fit of pixel variance as a function of detector air kerma (DAK) to decompose noise into quantum, electronic and fixed pattern (FP) components and to assess the DAK range where quantum noise dominates. This work examines the robustness of the polynomial method against an explicit noise decomposition method. The two methods were applied to variance and noise power spectrum (NPS) data from six digital mammography units. Twenty homogeneously exposed images were acquired with PMMA blocks for target DAKs ranging from 6.25 to 1600 µGy. Both methods were explored for the effects of data weighting and squared fit coefficients during the curve fitting, the influence of the additional filter material (2 mm Al versus 40 mm PMMA) and noise de-trending. Finally, spatial stationarity of noise was assessed. Data weighting improved noise model fitting over large DAK ranges, especially at low detector exposures. The polynomial and explicit decompositions generally agreed for quantum and electronic noise, but the FP noise fraction was consistently underestimated by the polynomial method. Noise decomposition as a function of position in the image showed limited noise stationarity, especially for FP noise; thus the position of the region of interest (ROI) used for noise decomposition may influence fractional noise composition. The ROI area and position used in the Guidelines offer an acceptable estimation of noise components. While there are limitations to the polynomial model, when used with care and with appropriate data weighting, the method offers a simple and robust means of examining the detector noise components as a function of detector exposure.
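A minimal sketch of the polynomial method described above: pixel variance is modelled as var(K) = e + qK + fK², with the constant, linear and quadratic terms read as electronic, quantum and fixed-pattern contributions. The DAK grid matches the target values quoted in the text, but the variance data and the weighting choice are illustrative.

```python
import numpy as np

dak = np.array([6.25, 12.5, 25.0, 50.0, 100.0, 200.0, 400.0, 800.0, 1600.0])   # target DAK (uGy)
rng = np.random.default_rng(2)
var = 4.0 + 0.9 * dak + 2.0e-4 * dak**2 + rng.normal(0.0, 0.5, dak.size)       # synthetic variances

# Weight low-exposure points more strongly; np.polyfit expects 1/sigma-style weights
# that multiply the residuals.
coeffs = np.polyfit(dak, var, deg=2, w=1.0 / var)
f_fp, q, e = coeffs                                  # quadratic (FP), linear (quantum), constant (electronic)

for K in dak:
    frac_quantum = q * K / np.polyval(coeffs, K)
    print(f"DAK {K:7.2f} uGy: quantum noise fraction = {frac_quantum:.2f}")
```

The quantum-noise-limited range is then the set of exposures where this linear-term fraction dominates the other two components.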

Relevance:

30.00%

Publisher:

Abstract:

In adult mammals, neural progenitors located in the dentate gyrus retain their ability to generate neurons and glia throughout life. In rodents, increased production of new granule neurons is associated with improved memory capacities, while decreased hippocampal neurogenesis results in impaired memory performance in several memory tasks. In mouse models of Alzheimer's disease, neurogenesis is impaired and the granule neurons that are generated fail to integrate into existing networks. Thus, enhancing neurogenesis should improve functional plasticity in the hippocampus and restore cognitive deficits in these mice. Here, we performed a screen of transcription factors that could potentially enhance adult hippocampal neurogenesis. We identified Neurod1 as a robust neuronal determinant with the capability to direct hippocampal progenitors towards an exclusive granule neuron fate. Importantly, Neurod1 also accelerated neuronal maturation and functional integration of new neurons during the period of their maturation when they contribute to memory processes. When tested in an APPxPS1 mouse model of Alzheimer's disease, directed expression of Neurod1 in cycling hippocampal progenitors conspicuously reduced the dendritic spine density deficits of new hippocampal neurons, to the same level as that observed in healthy age-matched control animals. Remarkably, this population of highly connected new neurons was sufficient to restore spatial memory in these diseased mice. Collectively, our findings demonstrate that endogenous neural stem cells of the diseased brain can be manipulated to become new neurons that could allow cognitive improvement.

Relevance:

30.00%

Publisher:

Abstract:

The spared nerve injury (SNI) model mimics human neuropathic pain related to peripheral nerve injury and is based on an invasive but simple surgical procedure. Since its first description in 2000, it has undergone remarkable development. It produces robust, reliable and long-lasting neuropathic pain-like behaviour (allodynia and hyperalgesia) as well as the possibility of studying both injured and non-injured neuronal populations in the same spinal ganglion. In addition, variants of the SNI model have been developed in rats, mice and neonatal/young rodents, allowing several possible angles of analysis. The purpose of this chapter is therefore to provide detailed guidance regarding the SNI model and its variants, highlighting its surgical and behavioural testing specificities.

Relevance:

30.00%

Publisher:

Abstract:

Neuropathic pain is a major health issue and is frequently accompanied by allodynia (painful sensations in response to normally non-painful stimulations) and unpleasant paresthesia/dysesthesia, pointing to alterations in sensory pathways normally dedicated to the processing of non-nociceptive information. Interestingly, mounting evidence indicates that central glial cells are key players in allodynia, partly due to changes in the astrocytic capacity to scavenge extracellular glutamate and gamma-aminobutyric acid (GABA), through changes in their respective transporters (EAAT and GAT). In the present study, we investigated the glial changes occurring in the dorsal column nuclei, the major target of normally innocuous sensory information, in the rat spared nerve injury (SNI) model of neuropathic pain. We report that, together with a robust microglial and astrocytic reaction in the ipsilateral gracile nucleus, the GABA transporter GAT-1 is upregulated with no change in GAT-3 or glutamate transporters. Furthermore, [³H]GABA reuptake on crude synaptosome preparations shows that transporter activity is functionally increased ipsilaterally in SNI rats. This GAT-1 upregulation appears evenly distributed in the gracile nucleus and colocalizes with astrocytic activation. Neither glial activation nor GAT-1 modulation was detected in the cuneate nucleus. Together, the present results point to GABA transport in the gracile nucleus as a putative therapeutic target against abnormal sensory perceptions related to neuropathic pain.

Relevance:

30.00%

Publisher:

Abstract:

Robust estimators for accelerated failure time models with asymmetric (or symmetric) error distributions and censored observations are proposed. It is assumed that the error model belongs to a log-location-scale family of distributions and that the mean response is the parameter of interest. Since scale is a main component of the mean, scale is not treated as a nuisance parameter. A three-step procedure is proposed. In the first step, an initial high breakdown point S-estimate is computed. In the second step, observations that are unlikely under the estimated model are rejected or down-weighted. Finally, a weighted maximum likelihood estimate is computed. To define the estimates, functions of censored residuals are replaced by their estimated conditional expectation given that the response is larger than the observed censored value. The rejection rule in the second step is based on an adaptive cut-off that, asymptotically, does not reject any observation when the data are generated according to the model. Therefore, the final estimate attains full efficiency at the model, with respect to the maximum likelihood estimate, while maintaining the breakdown point of the initial estimator. Asymptotic results are provided. The new procedure is evaluated with the help of Monte Carlo simulations. Two examples with real data are discussed.
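The following is a deliberately simplified, hedged sketch of the three-step structure described above (initial robust fit, adaptive rejection of unlikely observations, weighted maximum likelihood), reduced to an uncensored log-normal AFT model. The LAD initial fit, the 0.995 normal-quantile cutoff and the data are illustrative stand-ins for the S-estimate, the adaptive cut-off and the censored-residual machinery of the paper.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true, sigma_true = np.array([1.0, 0.5]), 0.3
logT = X @ beta_true + sigma_true * rng.normal(size=n)
logT[:10] += 5.0                      # a few gross outliers

# Step 1: initial robust fit (least absolute deviations as a stand-in for an S-estimate)
# plus a robust scale estimate.
beta0 = optimize.minimize(lambda b: np.abs(logT - X @ b).sum(),
                          x0=np.zeros(2), method="Nelder-Mead").x
resid = logT - X @ beta0
scale0 = stats.median_abs_deviation(resid, scale="normal")

# Step 2: reject observations that are unlikely under the fitted model
# (here a simple normal-quantile cutoff on standardized residuals).
cut = stats.norm.ppf(0.995) * scale0
w = (np.abs(resid) <= cut).astype(float)

# Step 3: weighted maximum likelihood under the assumed log-normal error model.
def negloglik(theta):
    b, log_s = theta[:2], theta[2]
    return -np.sum(w * stats.norm.logpdf(logT, loc=X @ b, scale=np.exp(log_s)))

fit = optimize.minimize(negloglik, x0=np.r_[beta0, np.log(scale0)])
print("coefficients:", fit.x[:2], "scale:", np.exp(fit.x[2]))
```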

Relevance:

30.00%

Publisher:

Abstract:

Positive selection is widely estimated from protein-coding sequence alignments by the nonsynonymous-to-synonymous rate ratio omega. Increasingly elaborate codon models are used in a likelihood framework for this estimation. Although there is widespread concern about the robustness of the estimation of the omega ratio, more efforts are needed to assess this robustness, especially in the context of complex models. Here, we focused on the branch-site codon model. We investigated its robustness on a large set of simulated data. First, we investigated the impact of sequence divergence. We found evidence of underestimation of the synonymous substitution rate (dS) for true values as small as 0.5, with a slight increase in false positives for the branch-site test. When dS increases further, the underestimation of dS is worse, but false positives decrease. Interestingly, the detection of true positives follows a similar distribution, with a maximum for intermediate values of dS. Thus, high dS is more of a concern for a loss of power (false negatives) than for false positives of the test. Second, we investigated the impact of GC content. We showed that there is no significant difference in false positives between high-GC (up to ~80%) and low-GC (~30%) genes. Moreover, neither shifts of GC content on a specific branch nor major shifts in GC along the gene sequence generate many false positives. Our results confirm that the branch-site test is very conservative.
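For context, the branch-site test referred to above is typically carried out as a likelihood-ratio test of the alternative model against its null; the sketch below shows the standard p-value computation with illustrative log-likelihood values rather than output from a real codon-model fit.

```python
# Hedged sketch of the likelihood-ratio computation behind the branch-site test.
from scipy.stats import chi2

lnL_alt, lnL_null = -10234.7, -10237.9          # illustrative log-likelihoods of the two nested models
lrt = 2.0 * (lnL_alt - lnL_null)
p_value = chi2.sf(lrt, df=1)                    # chi2(1) reference; a 50:50 mixture of 0 and chi2(1) is also used
print(f"2*dlnL = {lrt:.2f}, p = {p_value:.4f}")
```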