930 results for CFC Rules
Abstract:
Current advanced cloud infrastructure management solutions allow scheduling actions that dynamically change the number of running virtual machines (VMs). This approach, however, does not guarantee that the scheduled number of VMs will properly handle the actual user-generated workload, especially if user utilization patterns change. We propose using a dynamically generated scaling model for the VMs that contain the services of distributed applications, one that reacts to variations in the number of application users. We answer the following question: how can we dynamically decide how many services of each type are needed to handle a larger workload within the same time constraints? We describe a mechanism for dynamically composing the Service Level Agreements (SLAs) that control the scaling of distributed services, combining data analysis mechanisms with application benchmarking across multiple VM configurations. By processing the data sets generated by multiple application benchmarks, we discover a set of service monitoring metrics able to predict critical SLA parameters. By combining this set of predictor metrics with a heuristic for selecting appropriate scaling-out paths for the services of distributed applications, we show how SLA scaling rules can be inferred and then used to control the runtime scale-in and scale-out of distributed services. We validate our architecture and models by performing scaling experiments with a distributed application representative of the enterprise class of information systems, and we show how dynamically generated SLAs can be successfully used to control the management of distributed services scaling.
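As an illustration of the kind of inferred SLA scaling rule described above, a minimal sketch follows. The metric name, thresholds and decision logic are hypothetical placeholders chosen for illustration, not the rules actually inferred by the paper's benchmarking pipeline:

```python
# Minimal sketch of a threshold-based SLA scaling rule of the kind the
# abstract describes. All metric names, thresholds and values below are
# hypothetical; in the paper such rules are inferred from benchmark data.

from dataclasses import dataclass

@dataclass
class ScalingRule:
    metric: str              # predictor metric, e.g. CPU utilization
    scale_out_above: float   # add a VM when the metric exceeds this
    scale_in_below: float    # remove a VM when the metric falls below this

def scaling_decision(rule: ScalingRule, observed: dict[str, float],
                     running_vms: int, min_vms: int = 1) -> int:
    """Return the new VM count for one service type."""
    value = observed[rule.metric]
    if value > rule.scale_out_above:
        return running_vms + 1                        # scale out
    if value < rule.scale_in_below and running_vms > min_vms:
        return running_vms - 1                        # scale in
    return running_vms                                # no change

# Example: a hypothetical rule inferred for a web-tier service.
rule = ScalingRule(metric="cpu_utilization",
                   scale_out_above=0.75, scale_in_below=0.30)
print(scaling_decision(rule, {"cpu_utilization": 0.82}, running_vms=3))  # -> 4
```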
Abstract:
We consider collective decision problems given by a profile of single-peaked preferences defined over the real line and a set of pure public facilities to be located on the line. In this context, Bochet and Gordon (2012) provide a large class of priority rules based on efficiency, object-population monotonicity and sovereignty. Each such rule is described by a fixed priority ordering among interest groups. We show that any priority rule that treats agents symmetrically (anonymity), respects a form of coherence across collective decision problems (reinforcement), and depends only on peak information (peak-only) is a weighted majoritarian rule. Each such rule defines priorities based on the relative size of the interest groups and specific weights attached to locations. We give an explicit account of the richness of this class of rules.
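As a loose illustration of a weighted majoritarian choice, the following sketch scores a finite set of candidate locations by interest-group size times a location-specific weight. The support notion and tie-breaking used here are simplifying assumptions for illustration, not the paper's formal definition:

```python
# Illustrative sketch: agents are represented by their preference peaks on
# the line; each candidate location carries a specific weight. An agent
# "supports" the candidate nearest its peak (a simplifying assumption).

def weighted_majoritarian(peaks: list[float], locations: list[float],
                          weights: dict[float, float]) -> float:
    def support(loc: float) -> int:
        # agents whose peak is at least as close to `loc` as to any other candidate
        return sum(1 for p in peaks
                   if all(abs(p - loc) <= abs(p - other) for other in locations))
    # score each location by the size of its interest group times its weight
    return max(locations, key=lambda loc: support(loc) * weights[loc])

peaks = [0.1, 0.2, 0.8, 0.9, 0.95]
locations = [0.0, 1.0]
weights = {0.0: 1.0, 1.0: 1.0}
print(weighted_majoritarian(peaks, locations, weights))  # -> 1.0 (larger group)
```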
Abstract:
Population coding is widely regarded as a key mechanism for achieving reliable behavioral decisions. We previously introduced reinforcement learning for population-based decision making by spiking neurons. Here we generalize population reinforcement learning to spike-based plasticity rules that take account of the postsynaptic neural code. We consider spike/no-spike, spike count and spike latency codes. The multi-valued and continuous-valued features of the postsynaptic code allow binary decision making to be generalized to multi-valued decision making and continuous-valued action selection. We show that code-specific learning rules speed up learning for both discrete classification and continuous regression tasks. The suggested learning rules also become faster with increasing population size, in contrast to standard reinforcement learning rules. Continuous action selection is further shown to explain realistic learning speeds in the Morris water maze. Finally, we introduce the concept of action perturbation, as opposed to classical weight or node perturbation, as an exploration mechanism underlying reinforcement learning. Exploration in the action space greatly increases the speed of learning compared to exploration in the neuron or weight space.
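A minimal sketch of reward-modulated population learning with a spike/no-spike code may help fix ideas. The task, learning rate and the exact eligibility form below are illustrative assumptions, not the paper's code-specific rules:

```python
import numpy as np

# Illustrative sketch: a population of N neurons votes on a binary decision;
# a global reward signal modulates a local (spike - expectation) eligibility.
rng = np.random.default_rng(0)
N, D = 50, 10                                     # population size, input dim

def run_trial(w, x, target, eta=0.05):
    rates = 1.0 / (1.0 + np.exp(-(w @ x)))        # per-neuron spike probability
    spikes = (rng.random(N) < rates).astype(float)  # spike/no-spike code
    decision = int(spikes.sum() > N / 2)          # population majority vote
    reward = 1.0 if decision == target else -1.0  # global reward signal
    # reward-modulated update: eligibility = (spike - expectation) * input
    w = w + eta * reward * np.outer(spikes - rates, x)
    return w, reward

w = rng.normal(0.0, 0.1, size=(N, D))
x = rng.normal(size=D)
for _ in range(20):
    w, r = run_trial(w, x, target=1)
```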
Abstract:
The city of Bath is a World Heritage site, and its thermal waters, the Roman Baths and the new spa development rely on the undisturbed flow of the springs (45 °C). The current investigations provide an improved understanding of residence times and the flow regime as a basis for source protection. Trace gas indicators including the noble gases (helium, neon, argon, krypton and xenon) and chlorofluorocarbons (CFCs), together with a more comprehensive examination of chemical and stable isotope tracers, are used to characterise the sources of the thermal water and any modern components. The use of 39Ar shows conclusively that the bulk of the thermal water has been in circulation within the Carboniferous Limestone for at least 1000 years. Other stable isotope and noble gas measurements confirm previous findings and strongly suggest recharge within the Holocene (i.e. the last 12 kyr). Measurements of dissolved 85Kr and chlorofluorocarbons constrain previous indications from tritium that a small proportion (<5%) of the thermal water originates from modern leakage into the spring pipe where it passes through the Mesozoic valley fill underlying Bath. This leakage introduces small amounts of O2 into the system, resulting in the Fe precipitation seen in the King's Spring. Silica geothermometry indicates that the water is likely to have reached a maximum temperature of 69–99 °C, implying a most probable maximum circulation depth of ∼3 km, in line with recent geological models. The water's rise to the surface is sufficiently indirect that it incurs a temperature loss of >20 °C. There is overwhelming evidence that the water has evolved within the Carboniferous Limestone formation, although the chemistry alone cannot pinpoint the geometry of the recharge area or the circulation route. For a likely residence time of 1–12 kyr, volumetric calculations imply a large storage volume and circulation pathway if typical porosities of the limestone at depth are used, indicating that much of the Bath-Bristol basin must be involved in the water storage.
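The depth estimate follows from combining a silica geothermometer temperature with an assumed linear geothermal gradient. A back-of-the-envelope sketch using one standard quartz geothermometer formulation (Fournier, 1977, no steam loss) is shown below; the silica concentration and gradient values are illustrative, not the paper's data:

```python
import math

def quartz_geothermometer(sio2_mg_per_kg: float) -> float:
    """Fournier (1977) quartz geothermometer, no steam loss. Returns deg C."""
    return 1309.0 / (5.19 - math.log10(sio2_mg_per_kg)) - 273.15

def circulation_depth_km(t_reservoir_c: float, t_surface_c: float = 11.0,
                         gradient_c_per_km: float = 30.0) -> float:
    """Depth needed to reach the reservoir temperature under a linear gradient."""
    return (t_reservoir_c - t_surface_c) / gradient_c_per_km

# Illustrative numbers only: ~47 mg/kg dissolved silica gives ~99 degC, the
# upper bound quoted above; with a typical ~30 degC/km gradient the implied
# circulation depth is ~3 km, consistent with the abstract.
t = quartz_geothermometer(47.0)          # ~99 degC
print(t, circulation_depth_km(t))        # ~2.9 km
```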
Abstract:
Spike-timing-dependent plasticity (STDP) is a phenomenon in which the precise timing of spikes affects the sign and magnitude of changes in synaptic strength. STDP is often interpreted as the comprehensive learning rule for a synapse, the "first law" of synaptic plasticity. This interpretation is made explicit in theoretical models in which the total plasticity produced by complex spike patterns results from a superposition of the effects of all spike pairs. Although such models are appealing for their simplicity, they can fail dramatically. For example, the measured single-spike learning rule between hippocampal CA3 and CA1 pyramidal neurons does not predict the existence of long-term potentiation, one of the best-known forms of synaptic plasticity. Layers of complexity have been added to the basic STDP model to repair its predictive failures, but they have been outstripped by experimental data. We propose an alternate first law: neural activity triggers changes in key biochemical intermediates, which act as a more direct trigger of plasticity mechanisms. One particularly successful model uses intracellular calcium as the intermediate and can account for many observed properties of bidirectional plasticity. In this formulation, STDP is not itself the basis for explaining other forms of plasticity, but is instead a consequence of changes in the biochemical intermediate, calcium. Eventually, a mechanism-based framework for learning rules should include other messengers, discrete changes at individual synapses, the spread of plasticity among neighboring synapses, and the priming of hidden processes that change a synapse's susceptibility to future change. Mechanism-based models provide a rich framework for the computational representation of synaptic plasticity.
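A minimal sketch of the calcium-control idea described above: the sign of plasticity is read off the intracellular calcium level via two thresholds, with depression at moderate and potentiation at high calcium. The threshold values below are illustrative choices in the general style of such models, not fitted parameters:

```python
def plasticity_sign(ca: float, theta_d: float = 0.35,
                    theta_p: float = 0.55) -> float:
    """Calcium-control rule: no change below theta_d, depression between
    theta_d and theta_p, potentiation above theta_p (illustrative thresholds)."""
    if ca < theta_d:
        return 0.0     # calcium too low: no plasticity
    if ca < theta_p:
        return -1.0    # moderate calcium: long-term depression (LTD)
    return 1.0         # high calcium: long-term potentiation (LTP)

for ca in (0.2, 0.45, 0.8):
    print(ca, plasticity_sign(ca))   # -> 0.0, -1.0, 1.0
```

In this reading, STDP emerges because different spike timings produce different peak calcium levels, which this function then maps to depression or potentiation.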
Abstract:
The Internet revolution and the digital environment have spurred a significant amount of innovative activity with spillover effects on many sectors of the economy. For a growing group of countries, both developed and developing, digital goods and services have become an important engine of economic growth and a clear priority in their future-oriented economic strategies. Neither the rapid technological developments associated with digitization nor their increased societal significance has so far been reflected in international economic law in a comprehensive manner. The law of the World Trade Organization (WTO), in particular, has not reacted in any proactive manner. A pertinent question is whether the WTO rules are still useful and able to accommodate the new digital economy, or whether they have been rendered outdated and incapable of dealing with this important development. The present think-piece seeks answers to these questions and maps the key issues and challenges that the WTO faces. Appraising the current state of affairs, developments in venues other than the WTO, and proposals tabled by stakeholders, it makes some recommendations for the way forward.