75 results for scaling rules
Abstract:
The session aims to analyze efforts to scale up cleaner and more efficient energy solutions for poor people in developing countries by addressing the following questions: Which factors along the whole value chain, and in the institutional, social, and environmental spheres, enable up-scaling of improved pro-poor technologies? Are there differences between energy carriers or across contexts? What are the most promising entry points for up-scaling?
Abstract:
Cloud Computing has evolved to become an enabler for delivering access to large-scale distributed applications running on managed, network-connected computing systems. This makes it possible to host Distributed Enterprise Information Systems (dEISs) in cloud environments while enforcing strict performance and quality-of-service requirements, defined using Service Level Agreements (SLAs). SLAs define the performance boundaries of distributed applications and are enforced by a cloud management system (CMS) that dynamically allocates the available computing resources to the cloud services. We present two novel VM-scaling algorithms focused on dEIS systems, which detect the most appropriate scaling conditions using performance models of distributed applications derived from constant-workload benchmarks, together with SLA-specified performance constraints. We simulate the VM-scaling algorithms in a cloud simulator and compare them against trace-based performance models of dEISs. We compare a total of three SLA-based VM-scaling algorithms (one using prediction mechanisms) based on a real-world application scenario involving a large, variable number of users. Our results show that it is beneficial to use autoregressive predictive SLA-driven scaling algorithms in cloud management systems for guaranteeing performance invariants of distributed cloud applications, as opposed to using only reactive SLA-based VM-scaling algorithms.
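To make the reactive-versus-predictive distinction concrete, the sketch below contrasts a purely reactive SLA check with a simple autoregressive forecast-based check. It is an illustration under assumed names and thresholds (reactive_decision, AR1Predictor, sla_latency_ms, the 0.5 scale-in factor), not the two algorithms proposed in the paper.

```python
# Illustrative sketch (not the paper's algorithms): a reactive SLA-based scaler
# compared with a simple autoregressive (AR(1)) predictive scaler.
from collections import deque


def reactive_decision(latency_ms, sla_latency_ms, vms, min_vms=1):
    """Scale only after the SLA bound is already violated (or clearly over-provisioned)."""
    if latency_ms > sla_latency_ms:
        return vms + 1                     # scale out after the violation
    if latency_ms < 0.5 * sla_latency_ms and vms > min_vms:
        return vms - 1                     # scale in when well below the bound
    return vms


class AR1Predictor:
    """Least-squares AR(1) fit over a sliding window of latency samples."""
    def __init__(self, window=20):
        self.samples = deque(maxlen=window)

    def observe(self, latency_ms):
        self.samples.append(latency_ms)

    def predict_next(self):
        x = list(self.samples)
        if len(x) < 3:
            return x[-1] if x else 0.0
        prev, curr = x[:-1], x[1:]
        denom = sum(p * p for p in prev)
        phi = sum(p * c for p, c in zip(prev, curr)) / denom if denom else 0.0
        return phi * x[-1]


def predictive_decision(predictor, latency_ms, sla_latency_ms, vms, min_vms=1):
    """Scale before the SLA bound is crossed, based on the forecast sample."""
    predictor.observe(latency_ms)
    forecast = predictor.predict_next()
    if forecast > sla_latency_ms:
        return vms + 1                     # proactive scale-out
    if forecast < 0.5 * sla_latency_ms and vms > min_vms:
        return vms - 1
    return vms
```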
Abstract:
Software architecture consists of a set of design choices that can be partially expressed in the form of rules that the implementation must conform to. Architectural rules are intended to ensure properties that fulfill fundamental non-functional requirements. Verifying architectural rules is often a non-trivial activity: available tools are often not very usable and support only a narrow subset of the rules that practitioners commonly specify. In this paper we present a new, highly readable declarative language for specifying architectural rules. With our approach, users can specify a wide variety of rules using a single uniform notation. Rules can be tested by third-party tools by conforming to pre-defined specification templates. Practitioners can take advantage of the capabilities of a growing number of testing tools without dealing with them directly.
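The abstract leaves the rule notation itself to the paper; purely as an illustration of the kind of architectural rule such tools verify, the sketch below checks a hypothetical layering rule ("modules in the ui package must not import from the db package") over Python sources. The package names, the FORBIDDEN table, and the src layout are assumptions, not the paper's declarative language or its specification templates.

```python
# Minimal sketch of one kind of architectural rule a testing tool might verify:
# a hypothetical layering rule forbidding imports of "db" modules from "ui" modules.
import ast
import pathlib

FORBIDDEN = {("ui", "db")}   # (importing package, forbidden target package)


def imported_packages(path):
    """Yield the top-level packages imported by a Python source file."""
    tree = ast.parse(path.read_text())
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            for alias in node.names:
                yield alias.name.split(".")[0]
        elif isinstance(node, ast.ImportFrom) and node.module:
            yield node.module.split(".")[0]


def violations(project_root="src"):
    """Report every source file that breaks the layering rule."""
    for path in pathlib.Path(project_root).rglob("*.py"):
        source_pkg = path.relative_to(project_root).parts[0]
        for target_pkg in imported_packages(path):
            if (source_pkg, target_pkg) in FORBIDDEN:
                yield f"{path}: '{source_pkg}' must not depend on '{target_pkg}'"


if __name__ == "__main__":
    for message in violations():
        print(message)
```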
Abstract:
In this Opinion piece, I argue that the dynamics of viruses and the cellular immune response depend on the body size of the host. I use allometric scaling theory to interpret observed quantitative differences in the infection dynamics of lymphocytic choriomeningitis virus (LCMV) in mice (Mus musculus), simian immunodeficiency virus (SIV) in rhesus macaques (Macaca mulatta) and human immunodeficiency virus (HIV) in humans.
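For readers unfamiliar with allometric scaling theory, the block below sketches the standard Kleiber-type relations such arguments typically draw on; the specific exponents and their application to viral and immune-cell dynamics in the Opinion piece may differ.

```latex
% Standard allometric scaling relations as commonly used in such arguments;
% shown for orientation only, not as the Opinion piece's derivation.
\begin{align}
  B      &\propto M^{3/4}  && \text{whole-organism metabolic rate vs.\ body mass } M,\\
  B/M    &\propto M^{-1/4} && \text{mass-specific metabolic rate},\\
  \tau   &\propto M^{1/4}  && \text{characteristic biological times (e.g.\ cell turnover)}.
\end{align}
```

Under relations of this type, per-gram rates are expected to slow with increasing host size, which is the kind of quantitative difference the piece interprets across mice, macaques, and humans.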
Abstract:
Experience is lacking with mineral scaling and corrosion in enhanced geothermal systems (EGS) in which surface water is circulated through hydraulically stimulated crystalline rocks. As an aid in designing EGS projects we have conducted multicomponent reactive-transport simulations to predict the likely characteristics of scales and corrosion that may form when exploiting heat from granitoid reservoir rocks at ∼200 °C and 5 km depth. The specifications of an EGS project at Basel, Switzerland, are used to constrain the model. The main water–rock reactions in the reservoir during hydraulic stimulation and the subsequent doublet operation were identified in a separate paper (Alt-Epping et al., 2013b). Here we use the computed composition of the reservoir fluid to (1) predict mineral scaling in the injection and production wells, (2) evaluate methods of chemical geothermometry and (3) identify geochemical indicators of incipient corrosion. The envisaged heat extraction scheme ensures that even if the reservoir fluid is in equilibrium with quartz, cooling of the fluid will not induce saturation with respect to amorphous silica, thus eliminating the risk of silica scaling. However, the ascending fluid attains saturation with respect to crystalline aluminosilicates such as albite, microcline and chlorite, and possibly with respect to amorphous aluminosilicates. If no silica-bearing minerals precipitate upon ascent, reservoir temperatures can be predicted by classical formulations of silica geothermometry. In contrast, Na/K concentration ratios in the production fluid reflect steady-state conditions in the reservoir rather than albite–microcline equilibrium. Thus, even though igneous orthoclase is abundant in the reservoir and albite precipitates as a secondary phase, Na/K geothermometers fail to yield accurate temperatures. Anhydrite, which is present in fractures in the Basel reservoir, is predicted to dissolve during operation. This may lead to precipitation of pyrite and, at high exposure of anhydrite to the circulating fluid, of hematite scaling in the geothermal installation. In general, incipient corrosion of the casing can be detected at the production wellhead through an increase in H2(aq) and the enhanced precipitation of Fe-bearing aluminosilicates. The appearance of magnetite in scales indicates high corrosion rates.
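As a point of reference for the remark on classical silica geothermometry, one widely used formulation is the Fournier quartz geothermometer (conductive cooling, no steam loss), sketched below; it is quoted here only as an illustration of how reservoir temperature is back-calculated from dissolved silica, not as part of the paper's reactive-transport model.

```latex
% Fournier quartz geothermometer (no steam loss); S = dissolved SiO2 in mg/kg.
% Illustration only; the study itself relies on multicomponent reactive-transport simulations.
T(^{\circ}\mathrm{C}) \;=\; \frac{1309}{5.19 - \log_{10} S} \;-\; 273.15
```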
Abstract:
We determine the mass of the bottom quark from high moments of the $b\bar{b}$ production cross section in $e^+e^-$ annihilation, which are dominated by the threshold region. On the theory side, next-to-next-to-next-to-leading order (NNNLO) calculations both for the resonances and for the continuum cross section are used for the first time. We find $m_b^{\mathrm{PS}}(2\,\mathrm{GeV}) = 4.532^{+0.013}_{-0.039}\,\mathrm{GeV}$ for the potential-subtracted mass and $m_b^{\overline{\mathrm{MS}}}(m_b^{\overline{\mathrm{MS}}}) = 4.193^{+0.022}_{-0.035}\,\mathrm{GeV}$ for the $\overline{\mathrm{MS}}$ bottom-quark mass.
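For orientation, the moments referred to are conventionally defined from the normalized $b\bar{b}$ production cross section; the block below gives the standard definition, not the NNNLO expressions evaluated in the paper.

```latex
% Conventional definition of the experimental moments of the b-bbar cross section;
% high n weights the low-s (threshold) region, which is why the moments are threshold-dominated.
\mathcal{M}_n \;=\; \int \frac{\mathrm{d}s}{s^{\,n+1}}\, R_{b\bar b}(s),
\qquad
R_{b\bar b}(s) \;=\; \frac{\sigma\!\left(e^+e^- \to b\bar b\,X\right)}{\sigma\!\left(e^+e^- \to \mu^+\mu^-\right)}.
```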
Abstract:
Most species do not live in a constant environment over space or time. Their environment is often heterogeneous, with huge variability in resource availability and exposure to pathogens or predators, which may affect the local densities of the species. Moreover, the habitat might be fragmented, preventing free and isotropic migration between local sub-populations (demes) of a species and making some demes more isolated than others. For example, during the last ice age populations of many species migrated towards refuge areas from which re-colonization originated when conditions improved. However, populations that could not move fast enough or could not adapt to the new environmental conditions faced extinction. Populations living in these types of dynamic environments are often referred to as metapopulations and modeled as an array of subdivisions (or demes) that exchange migrants with their neighbors. Several studies have focused on the description of their demography, probability of extinction, and expected patterns of diversity at different scales. Importantly, all these evolutionary processes may affect genetic diversity, which in turn influences the chances that populations persist. In this chapter we provide an overview of the consequences of fragmentation, long-distance dispersal, range contractions, and range shifts on genetic diversity. In addition, we describe new methods to detect and quantify underlying evolutionary processes from sampled genetic data.
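As a classical baseline for how restricted migration among demes translates into genetic differentiation (not one of the new methods described in the chapter), Wright's island model gives the relation sketched below at migration-drift equilibrium.

```latex
% Wright's island model at migration-drift equilibrium: N = deme size, m = migration rate.
% A classical baseline only; the chapter's methods address more realistic, dynamic scenarios.
F_{ST} \;\approx\; \frac{1}{1 + 4Nm}
```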
Abstract:
Various software packages for project management include a procedure for resource-constrained scheduling. In several packages, the user can influence this procedure by selecting a priority rule. However, the resource-allocation methods that are implemented in the procedures are proprietary information; therefore, the question of how the priority-rule selection impacts the performance of the procedures arises. We experimentally evaluate the resource-allocation methods of eight recent software packages using the 600 instances of the PSPLIB J120 test set. The results of our analysis indicate that applying the default rule tends to outperform a randomly selected rule, whereas applying two randomly selected rules tends to outperform the default rule. Applying a small set of more than two rules further improves the project durations considerably. However, a large number of rules must be applied to obtain the best possible project durations.
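Because the packages' resource-allocation methods are proprietary, the sketch below only illustrates the general idea being benchmarked: a serial schedule-generation scheme in which the priority rule is interchangeable, run on a toy single-resource project, keeping the best makespan over the rules tried. The activity data, rule names (SPT, LPT, MIS), and capacity are illustrative assumptions, not PSPLIB J120 instances or any package's actual method.

```python
# Illustrative sketch: serial schedule-generation scheme with interchangeable priority rules
# on a toy single-resource project; the best makespan over the tried rules is kept.

# activity -> (duration, resource demand, set of predecessors)
ACTIVITIES = {
    "A": (3, 2, set()),
    "B": (4, 3, set()),
    "C": (2, 2, {"A"}),
    "D": (5, 1, {"A", "B"}),
    "E": (2, 3, {"C", "D"}),
}
CAPACITY = 4

# A few common priority rules (lower key value = higher priority).
RULES = {
    "SPT": lambda a: ACTIVITIES[a][0],    # shortest processing time first
    "LPT": lambda a: -ACTIVITIES[a][0],   # longest processing time first
    "MIS": lambda a: -sum(a in preds for _, _, preds in ACTIVITIES.values()),  # most immediate successors
}


def serial_sgs(rule):
    """Schedule activities one by one under the given rule; return (makespan, start times)."""
    start, finish, usage = {}, {}, {}     # usage: time unit -> resource units in use
    while len(start) < len(ACTIVITIES):
        eligible = [a for a, (_, _, preds) in ACTIVITIES.items()
                    if a not in start and preds.issubset(finish)]
        act = min(eligible, key=rule)
        dur, demand, preds = ACTIVITIES[act]
        t = max((finish[p] for p in preds), default=0)
        # shift right until the resource capacity holds over the whole duration
        while any(usage.get(u, 0) + demand > CAPACITY for u in range(t, t + dur)):
            t += 1
        for u in range(t, t + dur):
            usage[u] = usage.get(u, 0) + demand
        start[act], finish[act] = t, t + dur
    return max(finish.values()), start


if __name__ == "__main__":
    results = {name: serial_sgs(rule)[0] for name, rule in RULES.items()}
    print(results)                                   # makespan obtained by each rule
    print("best rule:", min(results, key=results.get))
```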
Abstract:
OBJECTIVES: To investigate the frequency of interim analyses, stopping rules, and data safety and monitoring boards (DSMBs) in protocols of randomized controlled trials (RCTs); to examine these features across different reasons for trial discontinuation; and to identify discrepancies in reporting between protocols and publications. STUDY DESIGN AND SETTING: We used data from a cohort of RCT protocols approved between 2000 and 2003 by six research ethics committees in Switzerland, Germany, and Canada. RESULTS: Of 894 RCT protocols, 289 (32.3%) prespecified interim analyses, 153 (17.1%) stopping rules, and 257 (28.7%) DSMBs. Overall, 249 of 894 RCTs (27.9%) were prematurely discontinued, mostly because of poor recruitment, administrative reasons, or unexpected harm. Forty-six of 249 RCTs (18.4%) were discontinued for early benefit or futility; of those, 37 (80.4%) were stopped outside a formal interim analysis or stopping rule. Of 515 published RCTs, there were discrepancies between protocols and publications for interim analyses (21.1%), stopping rules (14.4%), and DSMBs (19.6%). CONCLUSION: Two-thirds of RCT protocols did not consider interim analyses, stopping rules, or DSMBs. Most RCTs discontinued for early benefit or futility were stopped without a prespecified mechanism. When assessing trial manuscripts, journals should require access to the protocol.
Abstract:
This article discusses possibilities for automating customer relationship processes in Customer Relationship Management (CRM) with the help of business rules. Using a CRM architecture, possible applications are examined and then explored in depth through the example of a cross-selling campaign. Technical aspects are not considered in detail. Rather, the focus is on discussing the automation and integration potential that the use of business rules offers in the increasingly individualized customer relationships found in mass markets.
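The article deliberately stays away from technical detail; purely to make the cross-selling example tangible, the sketch below encodes two hypothetical business rules over assumed customer attributes (owned_products, annual_volume, opted_out_of_campaigns). It illustrates the rule style discussed, not an implementation from the article.

```python
# Purely illustrative sketch of cross-selling business rules of the kind discussed in the
# article; customer attributes, thresholds, and product names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Customer:
    owned_products: set = field(default_factory=set)
    annual_volume: float = 0.0
    opted_out_of_campaigns: bool = False


def cross_selling_offers(customer):
    """Apply declarative if-then rules; each rule is a (condition, offer) pair."""
    rules = [
        (lambda c: "checking_account" in c.owned_products
                   and c.annual_volume > 50_000
                   and "credit_card" not in c.owned_products,
         "credit_card"),
        (lambda c: "credit_card" in c.owned_products
                   and "travel_insurance" not in c.owned_products,
         "travel_insurance"),
    ]
    if customer.opted_out_of_campaigns:
        return []
    return [offer for condition, offer in rules if condition(customer)]


if __name__ == "__main__":
    c = Customer(owned_products={"checking_account"}, annual_volume=80_000)
    print(cross_selling_offers(c))        # ['credit_card']
```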