966 results for CONSISTENCY
Abstract:
While the use of thromboelastometry analysis (ROTEM®) in the evaluation of haemostasis is rapidly increasing, important validity parameters of the testing remain inadequately examined. We aimed to systematically study the consistency of thromboelastometry parameters within individual tests with regard to measurements between different analysers, between different channels of the same analyser, between morning and afternoon measurements (circadian variation), and between measurements taken four weeks apart. Citrated whole blood samples from 40 healthy volunteers were analysed with two analysers in parallel. EXTEM, INTEM, FIBTEM, HEPTEM and APTEM tests were conducted. A Bland-Altman comparison was performed and homogeneity of variances was tested using the Pitman test. P-value ranges were used to classify the level of homogeneity (p < 0.15: low homogeneity; p = 0.15 to 0.5: intermediate homogeneity; p > 0.5: high homogeneity). Less than half of all comparisons showed high homogeneity of variances (p > 0.5), and in about a fifth of comparisons the data distributions were heterogeneous (p < 0.15). There was no clear pattern for homogeneity. On average, comparisons of MCF, ML and LI30 measurements tended to be better, but none of the tests assessed outperformed another. In conclusion, systematic investigation reveals large differences in the results of some thromboelastometry parameters and a lack of consistency. Clinicians and scientists should take these inconsistencies into account and focus on parameters with higher homogeneity, such as MCF.
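The abstract above relies on the Pitman (Pitman-Morgan) test for homogeneity of variances of paired measurements, such as readings of one parameter taken in parallel on two analysers. As a minimal illustrative sketch, and assuming nothing about the study's actual data, the test reduces to checking whether the sum and the difference of the paired values are uncorrelated; the sample values and variable names below are hypothetical.

    import numpy as np
    from scipy import stats

    def pitman_morgan_test(x, y):
        """Pitman-Morgan test for equal variances of two paired samples.

        Equality of the variances of x and y is equivalent to zero correlation
        between (x + y) and (x - y); the p-value of that correlation is returned.
        """
        x, y = np.asarray(x, float), np.asarray(y, float)
        r, p = stats.pearsonr(x + y, x - y)
        return r, p

    # Hypothetical paired MCF readings (mm) from two analysers for the same samples.
    analyser_a = np.array([60.0, 62.5, 58.1, 65.3, 61.2, 59.8, 63.0, 60.7])
    analyser_b = np.array([61.2, 61.9, 59.0, 64.1, 62.5, 58.9, 63.8, 61.1])

    r, p = pitman_morgan_test(analyser_a, analyser_b)
    print(f"correlation of (x+y, x-y): {r:.3f}, p-value: {p:.3f}")
    # Following the abstract's convention: p > 0.5 high, 0.15-0.5 intermediate, < 0.15 low homogeneity.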
Abstract:
Family preservation has been criticized for implementing programs that are not theoretically founded. One result of this circumstance is a lack of information regarding the processes and outcomes of family preservation services. The knowledge base of family preservation is thus rather limited at present and will remain limited unless theory is consistently integrated within individual programs. A model for conceptualizing how theoretical consistency may be implemented within programs is presented and applied to family preservation. It is also necessary for programs to establish theoretical consistency before theoretical diversity, both within individual programs and across multiple programs, in order to advance the field in meaningful ways. A developmental cycle of knowledge generation is presented and applied to family preservation.
Abstract:
This article examines the determinants of positional incongruence between pre-election statements and post-election behaviour in the Swiss parliament between 2003 and 2009. The question is examined at the individual MP level, which is appropriate for dispersion-of-powers systems like Switzerland. While the overall rate of political congruence reaches about 85%, a multilevel logit analysis detects the underlying factors which push or curb a candidate's propensity to change his or her mind once elected. The results show that positional changes are more likely when (1) MPs are freshmen, (2) individual voting behaviour is invisible to the public, (3) the electoral district magnitude is not small, (4) the vote is not about a party's core issue, (5) the MP belongs to a party located in the political centre, and (6) the pre-election statement dissents from the majority position of the legislative party group. Of these factors, the last one is paramount.
Abstract:
We consider the problem of nonparametric estimation of a concave regression function F. We show that the supremum distance between the least squares estimator and F on a compact interval is typically of order (log(n)/n)^(2/5). This entails rates of convergence for the estimator's derivative. Moreover, we discuss the impact of additional constraints on F such as monotonicity and pointwise bounds. Then we apply these results to the analysis of current status data, where the distribution function of the event times is assumed to be concave.
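For readability, the rate claim in the abstract can be written out as a display; the compact interval [a, b] and the symbol \hat F_n for the least squares estimator are notation introduced here for illustration only.

\[
  \sup_{t \in [a,b]} \bigl|\hat F_n(t) - F(t)\bigr| \;=\; O_p\!\left( \left( \frac{\log n}{n} \right)^{2/5} \right)
\]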
Abstract:
Based on balance theory (Heider, 1958), we hypothesized that emotions (i.e., schadenfreude, resentment, joy and sorrow) induced by another person's outcomes function as responses restoring balance within cognitive units consisting of the perceiver, other persons and their outcomes. As a consequence, emotional reactions towards others' outcomes depend on the perceiver's attitudes in such a way that outcomes of a well-liked person give rise to congruous responses (sorrow after failure and joy after success), while outcomes of a disliked other lead to incongruous responses (schadenfreude after failure and resentment after success). Our participants recalled a situation from their past in which somebody they liked or disliked had succeeded or failed. Additionally, we manipulated whether the outcome referred to a domain in which the participants' self-interest was involved or not. We analyzed the participants' average emotional state as well as specific emotions induced by the recalled events. Consistent with expectations, we found that balancing principles played a major role in shaping emotional responses to the successes and failures of persons who were well-liked or disliked.
Abstract:
Studying individual differences in conscious awareness can potentially lend fundamental insights into the neural bases of binding mechanisms and consciousness (Cohen Kadosh and Henik, 2007). Partly for this reason, considerable attention has been devoted to the neural mechanisms underlying grapheme–color synesthesia, a healthy condition involving atypical brain activation and the concurrent experience of color photisms in response to letters, numbers, and words. For instance, the letter C printed in black on a white background may elicit a yellow color photism that is perceived either as spatially colocalized with the inducing stimulus or internally in the “mind's eye” as a visual image. Synesthetic experiences are involuntary, idiosyncratic, and consistent over time (Rouw et al., 2011). To date, neuroimaging research on synesthesia has focused on brain areas activated during the experience of synesthesia and associated structural brain differences. However, activity patterns of the synesthetic brain at rest remain largely unexplored. Moreover, the neural correlates of synesthetic consistency, the hallmark characteristic of synesthesia, remain elusive.
Abstract:
To ensure the integrity of an intensity modulated radiation therapy (IMRT) treatment, each plan must be validated through a measurement-based quality assurance (QA) procedure, known as patient-specific IMRT QA. Many methods of measurement and analysis have evolved for this QA. There is no standard among clinical institutions, and many devices and action levels are used. Since the acceptance criteria determine whether the dosimetric tool's output passes the patient plan, it is important to see how these parameters influence the performance of the QA device. When analyzing the results of IMRT QA, it is important to understand the variability in the measurements. Because of the different form factors of the many QA methods, this reproducibility can be device dependent. These questions of patient-specific IMRT QA reproducibility and performance were investigated across five dosimeter systems: a helical diode array, radiographic film, an ion chamber, a diode array (AP field-by-field, AP composite, and rotational composite), and an in-house designed multiple ion chamber phantom. The reproducibility was gauged for each device by comparing the coefficients of variation (CV) across six patient plans. The performance of each device was determined by comparing its ability to accurately label a plan as acceptable or unacceptable against a gold standard. All methods demonstrated a CV of less than 4%. Film proved to have the highest variability in QA measurement, likely due to the high level of user involvement in the readout and analysis. This is further shown by the fact that setup contributed more variation than readout and analysis for all of the methods except film. When evaluated for the ability to correctly label acceptable and unacceptable plans, two distinct performance groups emerged, with the helical diode array, AP composite diode array, film, and ion chamber in the better group, and the rotational composite and AP field-by-field diode array in the poorer group. Additionally, optimal threshold cutoffs were determined for each of the dosimetry systems. These findings, combined with practical considerations for factors such as labor and cost, can aid a clinic in its choice of an effective and safe patient-specific IMRT QA implementation.
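A minimal sketch of the two summary quantities used above: the coefficient of variation of repeated QA measurements for a device, and a simple threshold cutoff used to label plans acceptable or unacceptable against a gold standard. All numeric values, device readings, and the 90% passing-rate threshold are hypothetical illustrations, not the study's data or cutoffs.

    import numpy as np

    def coefficient_of_variation(measurements):
        """CV = sample standard deviation / mean, expressed as a percentage."""
        m = np.asarray(measurements, float)
        return 100.0 * m.std(ddof=1) / m.mean()

    # Hypothetical repeated gamma passing rates (%) for one plan on one device.
    repeat_readings = [97.2, 96.8, 97.5, 96.9, 97.1]
    print(f"CV: {coefficient_of_variation(repeat_readings):.2f}%")

    def classify_plans(passing_rates, threshold=90.0):
        """Label each plan acceptable (True) if its passing rate meets the threshold."""
        return [rate >= threshold for rate in passing_rates]

    # Hypothetical per-plan passing rates and gold-standard labels for six plans.
    rates = [95.1, 88.3, 99.0, 91.7, 85.2, 93.4]
    gold = [True, False, True, True, False, True]
    pred = classify_plans(rates)
    accuracy = sum(p == g for p, g in zip(pred, gold)) / len(gold)
    print(f"agreement with gold standard: {accuracy:.0%}")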
Abstract:
The theoretical formulation of the smoothed particle hydrodynamics (SPH) method deserves great care because of some inconsistencies that occur when considering free-surface inviscid flows. In SPH formulations one usually assumes that (i) surface integral terms on the boundary of the interpolation kernel support can be neglected, and (ii) free-surface conditions are implicitly verified. These assumptions are studied in detail in the present work for free-surface Newtonian viscous flow. The consistency of classical weakly compressible viscous SPH formulations is investigated. In particular, the principle of virtual work is used to study the verification of the free-surface boundary conditions in a weak sense. The latter can be related to the global energy dissipation induced by the viscous term formulations and their consistency. Numerical verification of this theoretical analysis is provided on three free-surface test cases, including a standing wave, for the three viscous term formulations investigated.
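As background for assumption (i) above, the standard SPH identity below shows where the neglected surface integral comes from: integrating the kernel-smoothed gradient by parts over the truncated kernel support \Omega_i produces a boundary term that vanishes only when the support lies entirely inside the fluid. This is the generic textbook identity, not the specific viscous term formulations studied in the paper.

\[
  \int_{\Omega_i} \nabla' f(\mathbf{r}')\, W(\mathbf{r}_i - \mathbf{r}', h)\, dV'
  = \oint_{\partial \Omega_i} f(\mathbf{r}')\, W(\mathbf{r}_i - \mathbf{r}', h)\, \mathbf{n}'\, dS'
  - \int_{\Omega_i} f(\mathbf{r}')\, \nabla' W(\mathbf{r}_i - \mathbf{r}', h)\, dV'
\]

Near a free surface the kernel support is cut by the boundary, so dropping the first (surface) term is an approximation rather than an identity, which is precisely what the consistency analysis above examines.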
Abstract:
In just a few years cloud computing has become a very popular paradigm and a business success story, with storage being one of its key features. To achieve high data availability, cloud storage services rely on replication. In this context, one major challenge is data consistency. In contrast to traditional approaches that are mostly based on strong consistency, many cloud storage services opt for weaker consistency models in order to achieve better availability and performance. This comes at the cost of a high probability of stale data being read, as the replicas involved in the reads may not always have the most recent write. In this paper, we propose a novel approach, named Harmony, which adaptively tunes the consistency level at run-time according to the application requirements. The key idea behind Harmony is an intelligent estimation model of stale reads, allowing it to elastically scale up or down the number of replicas involved in read operations so as to maintain a low (possibly zero) tolerable fraction of stale reads. As a result, Harmony can meet the desired consistency of the applications while achieving good performance. We have implemented Harmony and performed extensive evaluations with the Cassandra cloud storage on the Grid'5000 testbed and on Amazon EC2. The results show that Harmony can achieve good performance without exceeding the tolerated number of stale reads. For instance, in contrast to the static eventual consistency used in Cassandra, Harmony reduces the stale data being read by almost 80% while adding only minimal latency. Meanwhile, it improves the throughput of the system by 45% while maintaining the desired consistency requirements of the applications when compared to the strong consistency model in Cassandra.
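The core idea described above, raising or lowering the number of replicas contacted by a read according to an estimated rate of stale reads, can be illustrated with a small, self-contained sketch. The controller, its thresholds, and the function names below are hypothetical simplifications; they are not Harmony's actual estimation model nor any Cassandra API.

    def choose_read_replicas(estimated_stale_rate, tolerated_stale_rate,
                             current_replicas, total_replicas):
        """Adaptively pick how many replicas a read should contact.

        If the estimated fraction of stale reads exceeds what the application
        tolerates, involve one more replica; if there is comfortable slack,
        drop one replica to regain latency and throughput.
        """
        if estimated_stale_rate > tolerated_stale_rate:
            return min(current_replicas + 1, total_replicas)
        if estimated_stale_rate < 0.5 * tolerated_stale_rate:
            return max(current_replicas - 1, 1)
        return current_replicas

    # Hypothetical run: the application tolerates at most 1% stale reads.
    replicas = 1
    for stale_rate in [0.03, 0.02, 0.012, 0.004, 0.001]:
        replicas = choose_read_replicas(stale_rate, 0.01, replicas, total_replicas=3)
        print(f"estimated stale rate {stale_rate:.3f} -> read from {replicas} replica(s)")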
Abstract:
In this paper, the authors introduce a novel mechanism for data management in a middleware for smart home control, where a relational database and semantic ontology storage are used at the same time in a Data Warehouse. An annotation system has been designed to specify the storage format and location, register new ontology concepts and, most importantly, guarantee the Data Consistency between the two storage methods. To ease the data persistence process, the Data Access Object (DAO) pattern is applied and optimized to enhance the Data Consistency assurance. This novel mechanism thus provides an easy way to develop applications and integrate them with BATMP. Finally, an application named "Parameter Monitoring Service" is given as an example for assessing the feasibility of the system.
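A minimal sketch of the general idea of a DAO that hides two storage backends (a relational database and an ontology store) behind one interface and keeps them consistent by writing to both in a single operation. The class, method names, and rollback strategy below are hypothetical and do not reflect the middleware's actual annotation system or API.

    class DeviceStateDAO:
        """Data Access Object keeping a relational store and an ontology store in sync."""

        def __init__(self, relational_db, ontology_store):
            self.relational_db = relational_db    # e.g. a wrapper around a SQL connection
            self.ontology_store = ontology_store  # e.g. a wrapper around a triple store

        def save(self, device_id, state):
            # Write to the relational backend first, then mirror the fact into the
            # ontology; undo the first write if the second one fails, so the two
            # representations stay consistent.
            self.relational_db.insert("device_state", {"id": device_id, "state": state})
            try:
                self.ontology_store.add_fact(device_id, "hasState", state)
            except Exception:
                self.relational_db.delete("device_state", device_id)
                raise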
Abstract:
Because of the high number of crashes occurring on highways, it is necessary to intensify the search for new tools that help in understanding their causes. This research explores the use of a geographic information system (GIS) for an integrated analysis, taking into account two accident-related factors: design consistency (DC) (based on vehicle speed) and available sight distance (ASD) (based on visibility). Both factors require specific GIS software add-ins, which are explained. Digital terrain models (DTMs), vehicle paths, road centerlines, a speed prediction model, and crash data are integrated in the GIS. The usefulness of this approach has been assessed through a study of more than 500 crashes. From a regularly spaced grid, the terrain (bare ground) has been modeled through a triangulated irregular network (TIN). The length of the roads analyzed is greater than 100 km. Results have shown that DC and ASD could be related to crashes in approximately 4% of cases. In order to illustrate the potential of GIS, two crashes are fully analyzed: a car rollover after running off road on the right side and a rear-end collision of two moving vehicles. Although this procedure uses two software add-ins that are available only for ArcGIS, the study gives a practical demonstration of the suitability of GIS for conducting integrated studies of road safety.
Abstract:
Increased variability in performance has been associated with the emergence of several neurological and psychiatric pathologies. However, whether and how the consistency of neuronal activity may also be indicative of an underlying pathology is still poorly understood. Here we propose a novel method for evaluating consistency from non-invasive brain recordings. We evaluate the consistency of cortical activity recorded with magnetoencephalography in a group of subjects diagnosed with Mild Cognitive Impairment (MCI), a condition sometimes prodromal to dementia, during the execution of a memory task. We use metrics from nonlinear dynamics to evaluate the consistency of cortical regions. A representation known as a parenclitic network is constructed, in which atypical features are endowed with a network structure whose topological properties can be studied at various scales. Pathological conditions correspond to strongly heterogeneous networks, whereas typical or normative conditions are characterized by sparsely connected networks with homogeneous nodes. The analysis of this kind of network makes it possible to identify the extent to which consistency is affected in the MCI group and the focal points where MCI is especially severe. To the best of our knowledge, these results represent the first attempt at evaluating the consistency of brain functional activity using complex network theory.
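One common way of building a parenclitic network, sketched here under strong simplifying assumptions (linear pairwise models fitted on a normative group, absolute standardized residuals as edge weights), is shown below. The feature data, the choice of a linear fit, and the function name are illustrative only and are not the paper's exact construction.

    import numpy as np

    def parenclitic_edges(reference, subject):
        """Weight each pair of features by how far the subject deviates from the
        linear relation observed in the reference (normative) population.

        reference: array of shape (n_subjects, n_features)
        subject:   array of shape (n_features,)
        Returns a symmetric matrix of edge weights (absolute standardized residuals).
        """
        n_features = reference.shape[1]
        weights = np.zeros((n_features, n_features))
        for i in range(n_features):
            for j in range(i + 1, n_features):
                slope, intercept = np.polyfit(reference[:, i], reference[:, j], 1)
                residuals = reference[:, j] - (slope * reference[:, i] + intercept)
                sigma = residuals.std(ddof=1)
                deviation = abs(subject[j] - (slope * subject[i] + intercept)) / sigma
                weights[i, j] = weights[j, i] = deviation
        return weights

    # Hypothetical example: 50 normative subjects, 4 features, one test subject.
    rng = np.random.default_rng(0)
    reference = rng.normal(size=(50, 4))
    subject = rng.normal(size=4)
    print(parenclitic_edges(reference, subject).round(2))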
Abstract:
We examine the predictive ability and consistency properties of exchange rate expectations for the dollar/euro using a survey conducted in Spain by PwC among a panel of experts and entrepreneurs. Our results suggest that the PwC panel has some forecasting ability for time horizons from 3 to 9 months, although only for the 3-month-ahead expectations do we obtain marginal evidence of unbiasedness and efficiency in the forecasts. As for the consistency properties of the exchange rate expectation formation process, we find that survey participants form stabilising expectations in the short run and destabilising expectations in the long run, and that the expectation formation process is closer to that of fundamentalists than of chartists.
Abstract:
The thermodynamic consistency of almost 90 VLE data series, including isothermal and isobaric conditions for systems of both total and partial miscibility in the liquid phase, has been examined by means of the area and point-to-point tests. In addition, the Gibbs energy of mixing function calculated from these experimental data has been inspected, with some rather surprising results: certain data sets that exhibit high dispersion, or that lead to Gibbs energy of mixing curves inconsistent with the total or partial miscibility of the liquid phase, nevertheless pass the tests. Several possible inconsistencies in the tests themselves or in their application are discussed. Related to this is a very interesting and ambitious initiative that arose within NIST: the development of an algorithm to assess the quality of experimental VLE data. The present paper questions the applicability of two of the five tests that are combined in the algorithm. It further shows that the deviation of the experimental VLE data from the correlation obtained with a given model, the basis of some point-to-point tests, should not be used to evaluate the quality of these data.
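For reference, the area test mentioned above is, in its simplest isothermal low-pressure form (neglecting heat-of-mixing and excess-volume corrections), the Redlich-Kister condition that the logarithm of the activity coefficient ratio integrates to zero over the whole composition range; this is the textbook form of the test, not the specific variant assessed in the paper.

\[
  \int_{0}^{1} \ln\!\frac{\gamma_1}{\gamma_2}\, dx_1 \;=\; 0
\]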