955 results for cache consistency
Abstract:
This article examines the determinants of positional incongruence between pre-election statements and post-election behaviour in the Swiss parliament between 2003 and 2009. The question is examined at the individual MP level, which is appropriate for dispersion-of-powers systems like Switzerland. While the overall rate of political congruence reaches about 85%, a multilevel logit analysis detects the underlying factors which push or curb a candidate's propensity to change his or her mind once elected. The results show that positional changes are more likely when (1) MPs are freshmen, (2) individual voting behaviour is invisible to the public, (3) the electoral district magnitude is not small, (4) the vote is not about a party's core issue, (5) the MP belongs to a party located in the political centre, and (6) the pre-election statement dissents from the majority position of the legislative party group. Of these factors, the last one is paramount.
Abstract:
We consider the problem of nonparametric estimation of a concave regression function F. We show that the supremum distance between the least squares estimator and F on a compact interval is typically of order (log(n)/n)^{2/5}. This entails rates of convergence for the estimator's derivative. Moreover, we discuss the impact of additional constraints on F such as monotonicity and pointwise bounds. Then we apply these results to the analysis of current status data, where the distribution function of the event times is assumed to be concave.
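Written as a display formula, the rate stated above is the following, with $\hat{F}_n$ denoting the least squares estimator (a symbol introduced here only for this restatement):

```latex
\sup_{t \in [a, b]} \bigl| \hat{F}_n(t) - F(t) \bigr|
  \;=\; O_p\!\Bigl( \bigl( \tfrac{\log n}{n} \bigr)^{2/5} \Bigr)
```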
Abstract:
Based on balance theory (Heider, 1958), we hypothesized that emotions (i.e., schadenfreude, resentment, joy and sorrow) induced by another person's outcomes function as responses restoring balance within cognitive units consisting of the perceiver, other persons and their outcomes. As a consequence, emotional reactions towards others' outcomes depend on the perceiver's attitudes in such a way that outcomes of a well-liked person give rise to congruous responses (sorrow after failure and joy after success), while outcomes of a disliked other lead to incongruous responses (schadenfreude after failure and resentment after success). Our participants recalled a situation from their past in which somebody they liked or disliked had succeeded or failed. Additionally, we manipulated whether the outcome referred to a domain where participants' self-interest was involved or not. We analyzed the participants' average emotional state as well as specific emotions induced by the recalled events. Consistent with expectations, we found that balancing principles played a major role in shaping emotional responses to successes and failures of persons who were well-liked or disliked.
Abstract:
Studying individual differences in conscious awareness can potentially lend fundamental insights into the neural bases of binding mechanisms and consciousness (Cohen Kadosh and Henik, 2007). Partly for this reason, considerable attention has been devoted to the neural mechanisms underlying grapheme–color synesthesia, a healthy condition involving atypical brain activation and the concurrent experience of color photisms in response to letters, numbers, and words. For instance, the letter C printed in black on a white background may elicit a yellow color photism that is perceived to be spatially colocalized with the inducing stimulus or internally in the “mind's eye” as, for instance, a visual image. Synesthetic experiences are involuntary, idiosyncratic, and consistent over time (Rouw et al., 2011). To date, neuroimaging research on synesthesia has focused on brain areas activated during the experience of synesthesia and associated structural brain differences. However, activity patterns of the synesthetic brain at rest remain largely unexplored. Moreover, the neural correlates of synesthetic consistency, the hallmark characteristic of synesthesia, remain elusive.
Abstract:
To ensure the integrity of an intensity modulated radiation therapy (IMRT) treatment, each plan must be validated through a measurement-based quality assurance (QA) procedure, known as patient-specific IMRT QA. Many methods of measurement and analysis have evolved for this QA. There is no standard among clinical institutions, and many devices and action levels are used. Since the acceptance criteria determine whether the dosimetric tool's output passes the patient plan, it is important to see how these parameters influence the performance of the QA device. While analyzing the results of IMRT QA, it is important to understand the variability in the measurements. Due to the different form factors of the many QA methods, this reproducibility can be device-dependent. These questions of patient-specific IMRT QA reproducibility and performance were investigated across five dosimeter systems: a helical diode array, radiographic film, ion chamber, diode array (AP field-by-field, AP composite, and rotational composite), and an in-house designed multiple ion chamber phantom. The reproducibility was gauged for each device by comparing the coefficients of variation (CV) across six patient plans. The performance of each device was determined by comparing each one's ability to accurately label a plan as acceptable or unacceptable compared to a gold standard. All methods demonstrated a CV of less than 4%. Film proved to have the highest variability in QA measurement, likely due to the high level of user involvement in the readout and analysis. This is further shown by the fact that setup contributed more variation than readout and analysis for all of the methods except film. When evaluated for ability to correctly label acceptable and unacceptable plans, two distinct performance groups emerged, with the helical diode array, AP composite diode array, film, and ion chamber in the better group, and the rotational composite and AP field-by-field diode array in the poorer group. Additionally, optimal threshold cutoffs were determined for each of the dosimetry systems. These findings, combined with practical considerations for factors such as labor and cost, can aid a clinic in its choice of an effective and safe patient-specific IMRT QA implementation.
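As a minimal illustration of the reproducibility metric used above, the sketch below computes the coefficient of variation from repeated measurements of each plan; the plan names and numbers are synthetic, not data from this study:

```python
import numpy as np

# Hypothetical repeated QA measurements (e.g., gamma passing rates in %)
# for one device across several patient plans; values are made up.
measurements = {
    "plan_1": [97.2, 96.8, 97.5],
    "plan_2": [94.1, 93.6, 94.4],
    "plan_3": [98.0, 97.7, 98.3],
}

def coefficient_of_variation(values):
    """CV = sample standard deviation / mean, reported here as a percentage."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean() * 100.0

for plan, vals in measurements.items():
    print(plan, f"CV = {coefficient_of_variation(vals):.2f}%")
```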
Abstract:
The theoretical formulation of the smoothed particle hydrodynamics (SPH) method deserves great care because of some inconsistencies occurring when considering free-surface inviscid flows. Actually, in SPH formulations one usually assumes that (i) surface integral terms on the boundary of the interpolation kernel support are neglected, (ii) free-surface conditions are implicitly verified. These assumptions are studied in detail in the present work for free-surface Newtonian viscous flow. The consistency of classical viscous weakly compressible SPH formulations is investigated. In particular, the principle of virtual work is used to study the verification of the free-surface boundary conditions in a weak sense. The latter can be related to the global energy dissipation induced by the viscous term formulations and their consistency. Numerical verification of this theoretical analysis is provided on three free-surface test cases including a standing wave, with the three viscous term formulations investigated.
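For context on assumption (i), the surface term in question is the one produced when the continuous SPH gradient approximation is integrated by parts; in standard (textbook, not paper-specific) notation:

```latex
\langle \nabla f \rangle(\mathbf{r})
  = \int_{\partial\Omega} f(\mathbf{r}')\, W(\mathbf{r}-\mathbf{r}', h)\, \mathbf{n}'\, \mathrm{d}S'
  + \int_{\Omega} f(\mathbf{r}')\, \nabla W(\mathbf{r}-\mathbf{r}', h)\, \mathrm{d}V'
```

The surface integral vanishes when the kernel support lies entirely inside the fluid, but not when the support is truncated by a free surface, which is why neglecting it there requires justification.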
Abstract:
An approximate analytic model of a shared memory multiprocessor with a Cache Only Memory Architecture (COMA), the bus-based Data Diffusion Machine (DDM), is presented and validated. It describes the timing and interference in the system as a function of the hardware, the protocols, the topology and the workload. Model results have been compared to results from an independent simulator. The comparison shows good model accuracy, especially for non-saturated systems, where the errors in response times and device utilizations are independent of the number of processors and remain below 10% in 90% of the simulations. Therefore, the model can be used as an average performance prediction tool that avoids expensive simulations in the design of systems with many processors.
Abstract:
In just a few years cloud computing has become a very popular paradigm and a business success story, with storage being one of the key features. To achieve high data availability, cloud storage services rely on replication. In this context, one major challenge is data consistency. In contrast to traditional approaches that are mostly based on strong consistency, many cloud storage services opt for weaker consistency models in order to achieve better availability and performance. This comes at the cost of a high probability of stale data being read, as the replicas involved in the reads may not always have the most recent write. In this paper, we propose a novel approach, named Harmony, which adaptively tunes the consistency level at run-time according to the application requirements. The key idea behind Harmony is an intelligent estimation model of stale reads, allowing it to elastically scale up or down the number of replicas involved in read operations to maintain a low (possibly zero) tolerable fraction of stale reads. As a result, Harmony can meet the desired consistency of the applications while achieving good performance. We have implemented Harmony and performed extensive evaluations with the Cassandra cloud storage system on the Grid'5000 testbed and on Amazon EC2. The results show that Harmony can achieve good performance without exceeding the tolerated number of stale reads. For instance, in contrast to the static eventual consistency used in Cassandra, Harmony reduces the stale data being read by almost 80% while adding only minimal latency. Meanwhile, it improves the throughput of the system by 45% while maintaining the desired consistency requirements of the applications when compared to the strong consistency model in Cassandra.
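As a rough illustration of the adaptive idea (not Harmony's actual estimation model), the sketch below picks the smallest number of read replicas whose estimated stale-read fraction stays within an application-defined tolerance; the estimation formula, parameter names and numbers are assumptions made for this example:

```python
def estimate_stale_read_rate(read_replicas, total_replicas, write_rate,
                             propagation_delay):
    """Rough estimate: a read is stale if it arrives while a write is still
    propagating and touches only replicas that are not yet updated."""
    if read_replicas >= total_replicas:
        return 0.0
    in_flight = min(1.0, write_rate * propagation_delay)
    not_yet_updated = (total_replicas - read_replicas) / total_replicas
    return in_flight * not_yet_updated

def choose_read_consistency(total_replicas, tolerance, **workload):
    # Return the smallest read quorum whose estimated stale-read rate
    # stays within the tolerated fraction.
    for r in range(1, total_replicas + 1):
        if estimate_stale_read_rate(r, total_replicas, **workload) <= tolerance:
            return r
    return total_replicas

# Example: 3 replicas, 20 writes/s, 10 ms propagation delay, tolerate 10% stale reads.
replicas_per_read = choose_read_consistency(3, 0.10, write_rate=20.0,
                                            propagation_delay=0.01)
print("replicas involved in each read:", replicas_per_read)
```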
Abstract:
The first-level data cache in modern processors has become a major consumer of energy due to its increasing size and high-frequency access rate. In order to reduce this high energy consumption, we propose in this paper a straightforward filtering technique based on a highly accurate forwarding predictor. Specifically, a simple structure predicts whether a load instruction will obtain its corresponding data via forwarding from the load-store structure (thus avoiding the data cache access) or whether it will be provided by the data cache. This mechanism manages to reduce the data cache energy consumption by an average of 21.5% with a negligible performance penalty of less than 0.1%. Furthermore, in this paper we also address static cache energy consumption by disabling a portion of the sets of the L2 associative cache. Overall, when merging both proposals, the combined L1 and L2 total energy consumption is reduced by an average of 29.2% with a performance penalty of just 0.25%. Keywords: energy consumption; filtering; forwarding predictor; cache hierarchy
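The sketch below illustrates the general shape of such a predictor (a small table of saturating counters indexed by the load's PC); the table size, indexing and counter width are assumptions for illustration, not the configuration evaluated in the paper:

```python
class ForwardingPredictor:
    """PC-indexed table of saturating counters predicting store-to-load forwarding."""

    def __init__(self, entries=1024, bits=2):
        self.entries = entries
        self.max_count = (1 << bits) - 1
        # Initialise counters to the weak "no forwarding" side.
        self.table = [self.max_count // 2] * entries

    def _index(self, pc):
        return (pc >> 2) % self.entries

    def predict_forwarded(self, pc):
        return self.table[self._index(pc)] > self.max_count // 2

    def update(self, pc, was_forwarded):
        i = self._index(pc)
        if was_forwarded:
            self.table[i] = min(self.table[i] + 1, self.max_count)
        else:
            self.table[i] = max(self.table[i] - 1, 0)

# Usage: only access the L1 data cache when forwarding is not predicted.
pred = ForwardingPredictor()
pc = 0x40001234
if not pred.predict_forwarded(pc):
    pass  # perform the (energy-costly) L1 data cache access here
# Later, once the pipeline knows whether forwarding actually occurred:
pred.update(pc, was_forwarded=True)
```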
Abstract:
With the advent of the cloud computing model, distributed caches have become the cornerstone for building scalable applications. Popular systems like Facebook [1] or Twitter use Memcached [5], a highly scalable distributed object cache, to speed up applications by avoiding database accesses. Distributed object caches assign objects to cache instances based on a hashing function, and objects are not moved from one cache instance to another unless more instances are added to the cache and objects are redistributed. This may lead to situations where some cache instances are overloaded when some of the objects they store are frequently accessed, while other cache instances are less frequently used. In this paper we propose a multi-resource load balancing algorithm for distributed cache systems. The algorithm aims at balancing both CPU and memory resources among cache instances by redistributing stored data. Considering the possible conflict of balancing multiple resources at the same time, we give the CPU and memory resources weighted priorities based on the runtime load distributions. A scarcer resource is given a higher weight than a less scarce resource when load balancing. The system imbalance degree is evaluated from monitoring information and from the utility load of a node, a unit of resource consumption. Moreover, since continuous rebalancing of the system may affect the QoS of applications using the cache system, our data selection policy ensures that each data migration minimizes the system imbalance degree, so that the total reconfiguration cost is minimized. Extensive simulations are conducted to compare our policy with other policies. Our policy shows a significant improvement in time efficiency and a decrease in reconfiguration cost.
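A minimal sketch of the weighting idea follows, assuming (as an illustration only) that each resource's weight is proportional to its average utilisation and that the imbalance degree is the weighted spread of per-node utilisations; the paper's exact formulas may differ, and the node names and numbers are invented:

```python
import statistics

# Hypothetical per-node utilisations of the two balanced resources.
nodes = {
    "cache-1": {"cpu": 0.85, "mem": 0.40},
    "cache-2": {"cpu": 0.30, "mem": 0.55},
    "cache-3": {"cpu": 0.45, "mem": 0.50},
}

def resource_weights(nodes):
    # The scarcer (more heavily used) resource receives the larger weight.
    avg_cpu = statistics.mean(n["cpu"] for n in nodes.values())
    avg_mem = statistics.mean(n["mem"] for n in nodes.values())
    total = avg_cpu + avg_mem
    return {"cpu": avg_cpu / total, "mem": avg_mem / total}

def imbalance_degree(nodes, weights):
    # Weighted spread of per-node utilisations across the cluster.
    cpu_spread = statistics.pstdev(n["cpu"] for n in nodes.values())
    mem_spread = statistics.pstdev(n["mem"] for n in nodes.values())
    return weights["cpu"] * cpu_spread + weights["mem"] * mem_spread

w = resource_weights(nodes)
print("weights:", w)
print("imbalance degree:", round(imbalance_degree(nodes, w), 4))
```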
Abstract:
In this paper, the authors introduce a novel mechanism for data management in a middleware for smart home control, where a relational database and semantic ontology storage are used at the same time in a Data Warehouse. An annotation system has been designed for specifying the storage format and location, registering new ontology concepts and, most importantly, guaranteeing the data consistency between the two storage methods. To ease the data persistence process, the Data Access Object (DAO) pattern is applied and optimized to enhance the data consistency assurance. This mechanism also simplifies the development of applications and their integration with BATMP. Finally, an application named "Parameter Monitoring Service" is given as an example for assessing the feasibility of the system.
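A minimal sketch of the DAO-based dual-write idea is shown below: one data-access object hides the two back ends and performs every write against both, undoing the first write if the second fails. All class and method names are hypothetical and not taken from the BATMP middleware:

```python
class InMemoryStore:
    """Stand-in for either back end, keyed by the parameter's "id" field."""
    def __init__(self):
        self.items = {}

    def insert(self, parameter):
        self.items[parameter["id"]] = parameter

    # The ontology back end would add an individual; the stub behaves the same.
    add_individual = insert

    def delete(self, parameter):
        self.items.pop(parameter["id"], None)

    def find(self, parameter_id):
        return self.items.get(parameter_id)


class ParameterDAO:
    def __init__(self, relational_store, ontology_store):
        self.relational = relational_store
        self.ontology = ontology_store

    def save(self, parameter):
        # Write to the relational database first, then mirror the change into
        # the ontology; roll the first write back if the second one fails.
        self.relational.insert(parameter)
        try:
            self.ontology.add_individual(parameter)
        except Exception:
            self.relational.delete(parameter)
            raise

    def load(self, parameter_id):
        # The relational database is treated as the primary read source here.
        return self.relational.find(parameter_id)


dao = ParameterDAO(InMemoryStore(), InMemoryStore())
dao.save({"id": 1, "name": "room_temperature", "value": 21.5})
print(dao.load(1))
```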
Abstract:
Because of the high number of crashes occurring on highways, it is necessary to intensify the search for new tools that help in understanding their causes. This research explores the use of a geographic information system (GIS) for an integrated analysis, taking into account two accident-related factors: design consistency (DC) (based on vehicle speed) and available sight distance (ASD) (based on visibility). Both factors require specific GIS software add-ins, which are explained. Digital terrain models (DTMs), vehicle paths, road centerlines, a speed prediction model, and crash data are integrated in the GIS. The usefulness of this approach has been assessed through a study of more than 500 crashes. From a regularly spaced grid, the terrain (bare ground) has been modeled through a triangulated irregular network (TIN). The length of the roads analyzed is greater than 100 km. Results have shown that DC and ASD could be related to crashes in approximately 4% of cases. In order to illustrate the potential of GIS, two crashes are fully analyzed: a car rollover after running off road on the right side and a rear-end collision of two moving vehicles. Although this procedure uses two software add-ins that are available only for ArcGIS, the study gives a practical demonstration of the suitability of GIS for conducting integrated studies of road safety.
Abstract:
Increased variability in performance has been associated with the emergence of several neurological and psychiatric pathologies. However, whether and how the consistency of neuronal activity may also be indicative of an underlying pathology is still poorly understood. Here we propose a novel method for evaluating consistency from non-invasive brain recordings. We evaluate the consistency of the cortical activity recorded with magnetoencephalography in a group of subjects diagnosed with Mild Cognitive Impairment (MCI), a condition sometimes prodromal to dementia, during the execution of a memory task. We use metrics from nonlinear dynamics to evaluate the consistency of cortical regions. A representation known as parenclitic networks is constructed, where atypical features are endowed with a network structure, the topological properties of which can be studied at various scales. Pathological conditions correspond to strongly heterogeneous networks, whereas typical or normative conditions are characterized by sparsely connected networks with homogeneous nodes. The analysis of this kind of network allows us to identify the extent to which consistency is affected in the MCI group and the focal points where MCI is especially severe. To the best of our knowledge, these results represent the first attempt at evaluating the consistency of brain functional activity using complex network theory.
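As a loosely related illustration of one common way parenclitic networks are assembled (pairwise regressions over a reference group, with a subject's deviation from each regression becoming an edge weight), consider the synthetic sketch below; the paper's exact construction may differ, and all data here are simulated:

```python
import numpy as np

rng = np.random.default_rng(0)
reference = rng.normal(size=(50, 4))   # 50 reference subjects, 4 features
subject = rng.normal(size=4)           # one subject to be characterised

n_features = reference.shape[1]
adjacency = np.zeros((n_features, n_features))

for i in range(n_features):
    for j in range(i + 1, n_features):
        # Fit feature j as a linear function of feature i in the reference group.
        slope, intercept = np.polyfit(reference[:, i], reference[:, j], deg=1)
        residuals = reference[:, j] - (slope * reference[:, i] + intercept)
        sigma = residuals.std(ddof=1)
        # Edge weight: how atypical the subject's (i, j) relationship is.
        deviation = subject[j] - (slope * subject[i] + intercept)
        adjacency[i, j] = adjacency[j, i] = abs(deviation) / sigma

print(np.round(adjacency, 2))
```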
Abstract:
We examine the predictive ability and consistency properties of exchange rate expectations for the dollar/euro using a survey conducted in Spain by PwC among a panel of experts and entrepreneurs. Our results suggest that the PwC panel has some forecasting ability for time horizons from 3 to 9 months, although only for the 3-month-ahead expectations do we obtain marginal evidence of unbiasedness and efficiency in the forecasts. As for the consistency properties of the exchange rate expectations formation process, we find that survey participants form stabilising expectations in the short run and destabilising expectations in the long run, and that the expectation formation process is closer to that of fundamentalists than of chartists.
Abstract:
Computer performance is currently a hot topic. Today's semiconductors face significant physical and technological limitations, so a great deal of effort is being made by universities and industry to guarantee the continuity of Moore's law. This project focuses on the study of the cache and the memory hierarchy, one of the major topics in the field. To this end, we chose MIPSfpga, an open hardware platform from Imagination Technologies, which allowed us to implement and test different replacement policies as a proof of concept, while also demonstrating the strengths of the platform.
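As an illustration of the kind of proof-of-concept experiment described above (though not the project's actual MIPSfpga code), the sketch below models a small set-associative cache whose replacement policy (here LRU) can be swapped out to compare hit rates; the geometry and the address trace are arbitrary:

```python
from collections import OrderedDict

class SetAssociativeCache:
    """Tiny set-associative cache model with an LRU replacement policy."""

    def __init__(self, num_sets=64, ways=4, line_bytes=32):
        self.num_sets, self.ways, self.line_bytes = num_sets, ways, line_bytes
        self.sets = [OrderedDict() for _ in range(num_sets)]  # tag -> None, in LRU order
        self.hits = self.misses = 0

    def access(self, address):
        line = address // self.line_bytes
        index, tag = line % self.num_sets, line // self.num_sets
        cache_set = self.sets[index]
        if tag in cache_set:
            self.hits += 1
            cache_set.move_to_end(tag)          # mark as most recently used
        else:
            self.misses += 1
            if len(cache_set) >= self.ways:
                cache_set.popitem(last=False)   # evict the least recently used way
            cache_set[tag] = None

cache = SetAssociativeCache()
for addr in [0x0, 0x20, 0x4000, 0x0, 0x8000, 0x20]:
    cache.access(addr)
print("hits:", cache.hits, "misses:", cache.misses)
```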