925 results for Estimation Of Distribution Algorithm
Abstract:
An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. Results indicate that, for broadband signals, 50-100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases. For instance, approximately 3 times as much data is necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
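A minimal sketch, not the authors' analysis, of how such a sample-size requirement can be estimated: both signals are assumed to be AR(1) processes with lag-one coefficient `phi`, the variance inflation of the sample cross-correlation is approximated with Bartlett's formula, and a Fisher z power calculation gives the effective samples needed. The sampling rate, autocorrelation strength, alpha and power values below are illustrative assumptions, so the resulting numbers will not match the paper's figures; qualitatively, stronger autocorrelation (narrower bands) requires more data, in line with the abstract.

```python
import numpy as np
from scipy.stats import norm

def required_seconds(r_true=0.05, phi=0.5, fs=600.0, alpha=0.05, power=0.8):
    """Seconds of data needed to detect a lag-0 cross-correlation of r_true."""
    # Fisher z-transform of the target correlation
    z_r = 0.5 * np.log((1 + r_true) / (1 - r_true))
    # Effective samples required at the given alpha and power (independent samples)
    n_eff = ((norm.ppf(1 - alpha / 2) + norm.ppf(power)) / z_r) ** 2 + 3
    # Bartlett correction: for two independent AR(1) signals with coefficient phi,
    # the variance of the sample cross-correlation is inflated by (1 + phi^2)/(1 - phi^2)
    inflation = (1 + phi ** 2) / (1 - phi ** 2)
    return n_eff * inflation / fs

if __name__ == "__main__":
    for phi in (0.0, 0.5, 0.8):
        print(f"phi={phi:.1f}: ~{required_seconds(phi=phi):.0f} s of data needed")
```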
Abstract:
It is becoming clear that the detection and integration of synaptic input and its conversion into an output signal in cortical neurons are strongly influenced by background synaptic activity or "noise." The majority of this noise results from the spontaneous release of synaptic transmitters, interacting with ligand-gated ion channels in the postsynaptic neuron [Berretta N, Jones RSG (1996) A comparison of spontaneous synaptic EPSCs in layer V and layer II neurones in the rat entorhinal cortex in vitro. J Neurophysiol 76:1089-1110; Jones RSG, Woodhall GL (2005) Background synaptic activity in rat entorhinal cortical neurons: differential control of transmitter release by presynaptic receptors. J Physiol 562:107-120; LoTurco JJ, Mody I, Kriegstein AR (1990) Differential activation of glutamate receptors by spontaneously released transmitter in slices of neocortex. Neurosci Lett 114:265-271; Otis TS, Staley KJ, Mody I (1991) Perpetual inhibitory activity in mammalian brain slices generated by spontaneous GABA release. Brain Res 545:142-150; Ropert N, Miles R, Korn H (1990) Characteristics of miniature inhibitory postsynaptic currents in CA1 pyramidal neurones of rat hippocampus. J Physiol 428:707-722; Salin PA, Prince DA (1996) Spontaneous GABAA receptor-mediated inhibitory currents in adult rat somatosensory cortex. J Neurophysiol 75:1573-1588; Staley KJ (1999) Quantal GABA release: noise or not? Nat Neurosci 2:494-495; Woodhall GL, Bailey SJ, Thompson SE, Evans DIP, Stacey AE, Jones RSG (2005) Fundamental differences in spontaneous synaptic inhibition between deep and superficial layers of the rat entorhinal cortex. Hippocampus 15:232-245]. The function of synaptic noise has been the subject of debate for some years, but there is increasing evidence that it modifies or controls neuronal excitability and, thus, the integrative properties of cortical neurons. In the present study we have investigated a novel approach [Rudolph M, Piwkowska Z, Badoual M, Bal T, Destexhe A (2004) A method to estimate synaptic conductances from membrane potential fluctuations. J Neurophysiol 91:2884-2896] to simultaneously quantify inhibitory and excitatory synaptic noise, together with postsynaptic excitability, in rat entorhinal cortical neurons in vitro. The results suggest that this is a viable and useful approach to the study of the function of synaptic noise in cortical networks. © 2007 IBRO.
Abstract:
A number of papers and reports covering the techno-economic analysis of bio-oil production have been published. These have had different scopes, used different feedstocks, and reflected national cost structures. This paper reviews and compares their cost estimates and the experimental results that underpin them. A comprehensive cost and performance model was produced, based on consensus data from the previous studies, or on stated scenarios where data were not available, and reflecting UK costs. The model takes into account sales of bio-char, a co-product of pyrolysis, and the electricity consumption of the pyrolysis plant and biomass pre-processing plants. It was concluded that it should be possible to produce bio-oil in the UK from energy crops at a cost similar to that of distillate fuel oil. It was also found that there was little difference in the processing cost for woodchips and baled miscanthus. © 2011 Elsevier Ltd.
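A minimal illustration, not the paper's model, of the cost structure such a techno-economic model typically aggregates: annualized capital, feedstock, operating and electricity costs, less a credit for bio-char sold as a co-product, divided by annual bio-oil output. Every number below is a placeholder assumption for illustration, not a figure from the study.

```python
def bio_oil_unit_cost(
    capex=20e6,              # plant capital cost, GBP (assumed)
    capital_charge=0.13,     # annuity factor applied to capital (assumed)
    feedstock_t=40_000,      # dry feedstock processed per year, tonnes (assumed)
    feedstock_cost=60.0,     # GBP per dry tonne, delivered and pre-processed (assumed)
    opex=1.5e6,              # fixed plus variable operating cost, GBP/yr (assumed)
    electricity_cost=0.5e6,  # power for pyrolysis and pre-processing, GBP/yr (assumed)
    oil_yield=0.60,          # tonnes bio-oil per dry tonne feedstock (assumed)
    char_yield=0.15,         # tonnes bio-char per dry tonne feedstock (assumed)
    char_price=100.0,        # GBP per tonne of bio-char sold (assumed)
):
    """Bio-oil production cost in GBP per tonne under the assumptions above."""
    annual_cost = capex * capital_charge + feedstock_t * feedstock_cost + opex + electricity_cost
    char_credit = feedstock_t * char_yield * char_price
    oil_tonnes = feedstock_t * oil_yield
    return (annual_cost - char_credit) / oil_tonnes

print(f"~{bio_oil_unit_cost():.0f} GBP per tonne of bio-oil (illustrative only)")
```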
Abstract:
This thesis presents an analysis of the stability of complex distribution networks. We present a stability analysis against cascading failures. We propose a spin (binary) model based on concepts of statistical mechanics. We test macroscopic properties of distribution networks with respect to various topological structures and distributions of microparameters. The equilibrium properties of the systems are obtained in a statistical mechanics framework by application of the replica method. We demonstrate the validity of our approach by comparing it with Monte Carlo simulations. We analyse the network properties in terms of phase diagrams and find both qualitative and quantitative dependence of the network properties on the network structure and macroparameters. The structure of the phase diagrams points to the existence of phase transitions and the presence of stable and metastable states in the system. We also present an analysis of robustness against overloading in the distribution networks. We propose a model that describes a distribution process in a network. The model incorporates the currents between any connected hubs in the network, local constraints in the form of Kirchhoff's law, and a global optimization criterion. The flow of currents in the system is driven by the consumption. We study two principal types of model: infinite and finite link capacity. The key properties are the distributions of currents in the system. We again use a statistical mechanics framework to describe the currents in the system in terms of macroscopic parameters. In order to obtain observable properties we apply the replica method. We are able to assess the criticality of the level of demand with respect to the available resources and the architecture of the network. Furthermore, the parts of the system where critical currents may emerge can be identified. This, in turn, provides a characteristic description of the spread of overloading in the system.
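A minimal Monte Carlo sketch of a binary cascading-failure model of the general kind such replica calculations are compared against. It is not the thesis's model: the interaction rule here is a simple threshold cascade (a node fails once the fraction of failed neighbours exceeds its threshold), and the graph, threshold, and seed-failure fraction are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def cascade(n=1000, p_edge=0.01, threshold=0.3, seed_frac=0.02):
    """Final fraction of failed nodes after a simple threshold cascade."""
    # Erdos-Renyi adjacency matrix (symmetric, no self-loops), stored as floats
    upper = np.triu(rng.random((n, n)) < p_edge, 1)
    adj = (upper | upper.T).astype(float)
    degree = adj.sum(axis=1)
    degree[degree == 0] = 1.0                 # avoid division by zero for isolated nodes

    failed = rng.random(n) < seed_frac        # initial random failures (boolean state)
    while True:
        # fraction of each node's neighbours that have failed
        frac_failed = adj @ failed.astype(float) / degree
        new_failed = failed | (frac_failed > threshold)
        if new_failed.sum() == failed.sum():  # no new failures: cascade has stopped
            return failed.mean()
        failed = new_failed

print(f"final failed fraction: {cascade():.3f}")
```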
Abstract:
This article uses a semiparametric smooth coefficient model (SPSCM) to estimate TFP growth and its components (scale and technical change). The SPSCM is derived from a nonparametric specification of the production technology represented by an input distance function (IDF), using a growth formulation. The functional coefficients of the SPSCM come naturally from the model and are fully flexible in the sense that no functional form of the underlying production technology is used to derive them. Another advantage of the SPSCM is that it can estimate bias (input and scale) in technical change in a fully flexible manner. We also used a translog IDF framework to estimate TFP growth components. A panel of U.S. electricity generating plants for the period 1986–1998 is used for this purpose. Comparing estimated TFP growth results from both parametric and semiparametric models against the Divisia TFP growth, we conclude that the SPSCM performs the best in tracking the temporal behavior of TFP growth.
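A minimal sketch, not the authors' estimator, of the smooth coefficient idea behind such models: in y_i = x_i' beta(z_i) + e_i the coefficients are estimated at each point z0 by local-constant, kernel-weighted least squares, so no functional form is imposed on beta(.). The data-generating process, Gaussian kernel, and bandwidth below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def smooth_coef_fit(y, X, z, z0, h=0.1):
    """Local-constant estimate of beta(z0) in y = X beta(z) + e."""
    w = np.exp(-0.5 * ((z - z0) / h) ** 2)       # Gaussian kernel weights in z
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)   # kernel-weighted least squares

# Illustrative data: coefficients vary smoothly with z (e.g. time)
n = 2000
z = rng.uniform(0, 1, n)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.column_stack([1 + z, np.sin(2 * np.pi * z)])
y = (X * beta_true).sum(axis=1) + 0.1 * rng.normal(size=n)

for z0 in np.linspace(0.1, 0.9, 5):
    b = smooth_coef_fit(y, X, z, z0)
    print(f"z0={z0:.1f}  beta_hat={b.round(2)}  "
          f"beta_true=[{1 + z0:.2f} {np.sin(2 * np.pi * z0):.2f}]")
```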
Estimation of productivity in Korean electric power plants: a semiparametric smooth coefficient model
Abstract:
This paper analyzes the impact of load factor, facility and generator types on the productivity of Korean electric power plants. In order to capture important differences in the effect of load policy on power output, we use a semiparametric smooth coefficient (SPSC) model that allows performance to differ across power plants and over time by allowing the underlying technologies to be heterogeneous. The SPSC model accommodates both continuous and discrete covariates. Various specification tests are conducted to compare the performance of the SPSC model. Using a unique generator-level panel dataset spanning the period 1995-2006, we find that the impact of load factor, generator and facility types on power generation varies substantially in magnitude and significance across different plant characteristics. The results have strong implications for generation policy in Korea, as outlined in this study.
Abstract:
The incentive dilemma refers to a situation in which incentives are offered but do not work as intended. The authors suggest that, in an interorganizational context, whether a principal-provided incentive works is a function of how it is evaluated by an agent: for its contribution to the agent's bottom line (instrumental evaluation) and for the extent to which it is strategically aligned with the agent's direction (congruence evaluation). To further understand when incentives work, the influence of two key contextual variables, industry volatility and dependence, is examined. A field study featuring 57 semi-structured depth interviews and 386 responses from twin surveys in the information technology and brewing industries provides data for hypothesis testing. When and whether incentives work is demonstrated by the conditions under which the agent's evaluation of an incentive has positive or negative effects on its compliance and active representation. Further, some outcomes are reversed under the high-volatility condition. © 2013 Academy of Marketing Science.