153 results for Hartree Fock scheme correlation errors


Relevance:

20.00%

Publisher:

Abstract:

It is essential for organizations to compress detailed sets of information into more comprehensible ones, thereby enabling both effective data compression and good decision-making. In chapter 1, I review and structure the literature on information aggregation in management accounting research. I outline the cost-benefit trade-off that management accountants need to consider when they decide on the optimal level of information aggregation. Beyond the fundamental information-content perspective, organizations also have to account for cognitive and behavioral perspectives. I elaborate on these aspects, differentiating between research in cost accounting, budgeting and planning, and performance measurement. In chapter 2, I focus on a specific bias that arises when probabilistic information is aggregated. In budgeting and planning, for example, organizations need to estimate mean costs and durations of projects, as the mean is the only measure of central tendency that is linear. Unlike the mean, measures such as the mode or median cannot simply be added up. Given the specific shape of cost and duration distributions, adding up mode or median values results in underestimation of total project costs and durations. In two experiments, I find that participants tend to estimate mode values rather than mean values, resulting in large distortions of estimates for total project costs and durations. I also provide a strategy that partly mitigates this bias. In chapter 3, I conduct an experimental study to compare two approaches to time estimation for cost accounting: traditional activity-based costing (ABC) and time-driven ABC (TD-ABC). Contrary to claims made by proponents of TD-ABC, I find that TD-ABC is not necessarily suitable for capacity computations. However, I also provide evidence that TD-ABC seems better suited than traditional ABC for cost allocations.
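The chapter 2 claim is easy to verify numerically. The sketch below (illustrative, not from the thesis) assumes lognormally distributed task costs, a common right-skewed choice, and shows that summing per-task modes or medians badly understates the expected total, while means add exactly because expectation is linear:

```python
# A minimal numerical sketch (not from the thesis) of the aggregation bias
# described in chapter 2: for right-skewed cost distributions, adding up
# per-task modes or medians underestimates the expected total cost, while
# means add exactly because expectation is linear.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n_tasks = 2.0, 0.8, 10          # hypothetical lognormal task costs

# Closed-form central tendencies of a single lognormal task cost
mode   = np.exp(mu - sigma**2)             # ~3.90
median = np.exp(mu)                        # ~7.39
mean   = np.exp(mu + sigma**2 / 2)         # ~10.18

# Monte Carlo: simulate total project cost over 10 independent tasks
totals = rng.lognormal(mu, sigma, size=(1_000_000, n_tasks)).sum(axis=1)

print(f"sum of task modes   : {n_tasks * mode:8.1f}")    # badly low
print(f"sum of task medians : {n_tasks * median:8.1f}")  # still low
print(f"sum of task means   : {n_tasks * mean:8.1f}")    # matches E[total]
print(f"simulated E[total]  : {totals.mean():8.1f}")
```

With these parameters the sum of modes recovers less than 40% of the true expected total, which is the kind of distortion the experiments in chapter 2 document for human estimators.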

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To compare the diagnostic performance of multi-detector CT arthrography (CTA) and 1.5-T MR arthrography (MRA) in detecting hyaline cartilage lesions of the shoulder, with arthroscopic correlation. PATIENTS AND METHODS: CTA and MRA examinations, prospectively obtained in 56 consecutive patients after the same arthrographic procedure, were independently evaluated for glenohumeral cartilage lesions (modified Outerbridge grade ≥2 and grade 4) by two musculoskeletal radiologists. The cartilage surface was divided into 18 anatomical areas. Arthroscopy was taken as the reference standard. Diagnostic performance of CTA and MRA was compared using ROC analysis. Interobserver and intraobserver agreement was determined by κ statistics. RESULTS: Sensitivity and specificity of CTA varied from 46.4 to 82.4% and from 89.0 to 95.9%, respectively; sensitivity and specificity of MRA varied from 31.9 to 66.2% and from 91.1 to 97.5%, respectively. Diagnostic performance of CTA was statistically significantly better than that of MRA for both readers (all p ≤ 0.04). Interobserver agreement for the evaluation of cartilage lesions was substantial with CTA (κ = 0.63) and moderate with MRA (κ = 0.54). Intraobserver agreement was almost perfect with both CTA (κ = 0.94-0.95) and MRA (κ = 0.83-0.87). CONCLUSION: The diagnostic performance of CTA and MRA for the detection of glenohumeral cartilage lesions is moderate, although statistically significantly better with CTA. KEY POINTS:
• CTA has moderate diagnostic performance for detecting glenohumeral cartilage substance loss.
• MRA has moderate diagnostic performance for detecting glenohumeral cartilage substance loss.
• CTA is more accurate than MRA for detecting cartilage substance loss.
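As background for the statistics used here, the sketch below shows how sensitivity, specificity, and Cohen's κ are computed for a reader study of this kind; all counts and reader calls are invented for illustration and are not the study's data:

```python
# Illustrative only: how sensitivity, specificity, and Cohen's kappa are
# computed for a reader study like this one. The counts below are made up.
import numpy as np

# Hypothetical 2x2 table of one reader's CTA grades vs arthroscopy
tp, fp, fn, tn = 56, 9, 20, 210
sensitivity = tp / (tp + fn)          # proportion of true lesions detected
specificity = tn / (tn + fp)          # proportion of intact areas cleared
print(f"sensitivity = {sensitivity:.3f}, specificity = {specificity:.3f}")

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters (binary labels)."""
    a, b = np.asarray(a), np.asarray(b)
    p_obs = np.mean(a == b)
    # Expected agreement if the raters were statistically independent
    p_exp = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical per-area lesion calls by the two readers
rng = np.random.default_rng(1)
truth = rng.random(500) < 0.25
r1 = truth ^ (rng.random(500) < 0.10)   # reader 1 flips 10% of calls
r2 = truth ^ (rng.random(500) < 0.15)   # reader 2 flips 15% of calls
print(f"kappa = {cohens_kappa(r1, r2):.2f}")
```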

Relevance:

20.00%

Publisher:

Abstract:

The relationship between the binding of Vicia villosa (VV) lectin and the expression of cytolytic function in T lymphoblasts has been investigated using flow cytofluorometric techniques. Spleen cells activated in vitro in 5-day mixed leukocyte cultures (MLC) were incubated sequentially with VV, rabbit anti-VV antiserum, and fluoresceinated sheep anti-rabbit IgG. When these stained MLC cells were analyzed on a flow cytometer gated to exclude nonviable cells and small lymphocytes, a single heterogeneous peak of fluorescence was seen, as compared with control MLC cells that had not been incubated with VV. Fluorescence of lymphoblasts was dependent on lectin dose and was eliminated when staining was performed in the presence of N-acetyl-D-galactosamine, the appropriate competitive sugar for VV. T cell blast populations activated against H-2, Mls, or parasite antigens all had comparable levels of fluorescence after staining with VV, although the cytolytic activity of these cells varied widely. Furthermore, when MLC lymphoblasts binding large or small amounts of VV were sorted on the basis of their relative fluorescence intensity and tested for cytolytic function, no appreciable difference in activity between the two populations was observed. These results are inconsistent with the hypothesis that VV binds selectively to cytolytic T lymphocytes.

Relevance:

20.00%

Publisher:

Abstract:

Background: Medical errors have recently been recognized as a relevant concern in public health, and increasing research efforts have been made to find ways of improving patient safety. In palliative care, however, studies on errors are scant. Objective: Our aim was to gather pilot data on the experiences and attitudes of palliative care professionals regarding this topic. Methods: We developed a questionnaire consisting of questions on the relevance, estimated frequency, kinds, and severity of errors, their causes and consequences, and the way palliative care professionals handle them. The questionnaire was sent to all specialist palliative care institutions in the region of Bavaria, Germany (n=168; population 12.5 million), achieving a response rate of 42% (n=70). Results: Errors in palliative care were regarded as a highly relevant problem (median 8 on a 10-point numeric rating scale). Most respondents reported a moderate frequency of errors (1-10 per 100 patients). Errors in communication were estimated to be more common than those in symptom control. The causes mentioned most often were deficits in communication or organization. Moral and psychological problems for the person committing the error were seen as more frequent than consequences for the patient. Ninety percent of respondents declared that they disclose errors to the harmed patient. For 78% of the professionals, the issue had not been part of their professional training. Conclusion: Professionals acknowledge errors, in particular errors in communication, to be a common and relevant problem in palliative care, one that has nevertheless been neglected in training and research.

Relevance:

20.00%

Publisher:

Abstract:

The construct of cognitive errors is clinically relevant for cognitive therapy of mood disorders. Beck's universality hypothesis postulates the relevance of negative cognitions in all subtypes of mood disorders, as well as of positive cognitions in manic states. This hypothesis has rarely been empirically addressed for patients presenting with bipolar affective disorder (BD). In-patients (n = 30) presenting with BD were interviewed, as were 30 participants of a matched control group. A valid and reliable observer-rater methodology for cognitive errors was applied to the session transcripts. Overall, patients made more cognitive errors than controls. When manic and depressive patients were compared, parts of the universality hypothesis were confirmed: manic symptoms were related to both positive and negative cognitive errors. These results are discussed with regard to the main assumptions of the cognitive model of depression, adding an argument for extending it to the BD diagnostic group while taking into account specificities in terms of cognitive errors. Clinical implications for cognitive therapy of BD are suggested.

Relevance:

20.00%

Publisher:

Abstract:

The evolution of continuous traits is the central component of comparative analyses in phylogenetics, and the comparison of alternative models of trait evolution has greatly improved our understanding of the mechanisms driving phenotypic differentiation. Several factors influence the comparison of models, and we explore the effects of random errors in trait measurement on the accuracy of model selection. We simulate trait data under a Brownian motion model (BM) and introduce different magnitudes of random measurement error. We then evaluate the resulting statistical support for this model against two alternative models: Ornstein-Uhlenbeck (OU) and accelerating/decelerating rates (ACDC). Our analyses show that even small measurement errors (10%) consistently bias model selection towards erroneous rejection of BM in favour of more parameter-rich models (most frequently the OU model). Fortunately, methods that explicitly incorporate measurement errors in phylogenetic analyses considerably improve the accuracy of model selection. Our results call for caution in interpreting the results of model selection in comparative analyses, especially when complex models garner only modest additional support. Importantly, as measurement errors occur in most trait data sets, we suggest that estimation of measurement errors should always be performed during comparative analysis to reduce chances of misidentification of evolutionary processes.
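The mechanism can be reproduced in a small simulation. The sketch below makes several simplifying assumptions that the paper does not (a balanced 128-tip ultrametric tree, unit BM rate, error variance expressed as a fraction of the BM tip variance, and a grid search over the OU parameter), so it illustrates the tendency rather than reproducing the paper's numbers:

```python
# Sketch, under stated assumptions: traits simulated under BM on a balanced
# ultrametric tree, iid Gaussian measurement error added at the tips, then
# BM and OU fitted by maximum likelihood (sigma profiled out, theta on a
# grid) and compared by AIC.
import numpy as np

levels = 7                                   # balanced binary tree, 2**7 = 128 tips
n = 2 ** levels
tips = np.arange(n)

# t_a[i, j] = depth (in [0, 1]) of the most recent common ancestor of tips i, j
prefix = np.zeros((n, n))
for l in range(1, levels + 1):
    same = (tips[:, None] >> (levels - l)) == (tips[None, :] >> (levels - l))
    prefix += same
t_a = prefix / levels                        # diagonal = 1 (tip depth)

def neg2ll(y, R, logdet):
    """-2 log-likelihood of y ~ N(0, s2 * R) with s2 profiled out."""
    s2 = y @ np.linalg.solve(R, y) / len(y)
    return len(y) * (np.log(2 * np.pi * s2) + 1) + logdet

R_bm = t_a.copy()                            # BM covariance structure (root fixed)
logdet_bm = np.linalg.slogdet(R_bm)[1]

thetas = np.logspace(-2, 1.2, 40)            # OU selection strengths to profile over
ou_models = []
for th in thetas:                            # OU covariance, root fixed, depth T = 1
    R = np.exp(-2 * th * (1 - t_a)) * (1 - np.exp(-2 * th * t_a)) / (2 * th)
    ou_models.append((R, np.linalg.slogdet(R)[1]))

rng = np.random.default_rng(11)
L = np.linalg.cholesky(R_bm)                 # to simulate BM tip values
for err_frac in (0.0, 0.1):                  # error variance / BM tip variance
    ou_wins = 0
    for _ in range(200):
        y = L @ rng.normal(size=n)                      # true BM trait values
        y += rng.normal(0, np.sqrt(err_frac), n)        # measurement error
        n2_bm = neg2ll(y, R_bm, logdet_bm)
        n2_ou = min(neg2ll(y, R, ld) for R, ld in ou_models)
        # AIC: BM has 1 free parameter (sigma), OU has 2 (sigma, theta)
        ou_wins += (n2_ou + 4) < (n2_bm + 2)
    print(f"error variance {err_frac:.0%} of trait variance: "
          f"OU preferred in {ou_wins / 200:.0%} of simulations")
```

Because measurement error inflates the tip variances relative to the shared (phylogenetic) covariances, the extra OU parameter can absorb part of the misfit, which is exactly the route by which BM gets erroneously rejected.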

Relevance:

20.00%

Publisher:

Abstract:

New Global Positioning System (GPS) receivers now allow a location on Earth to be measured at high frequency (5 Hz) with centimetric precision using the phase-differential positioning method. We studied whether this technique is accurate enough to retrieve basic parameters of human locomotion. Eight subjects walked on an athletics track at four different imposed step frequencies (70-130 steps/min), plus a run at free pace. Differential carrier-phase localization between a fixed base station and a mobile antenna mounted on the walking person was calculated. In parallel, a triaxial accelerometer attached to the lower back recorded body accelerations. The different parameters were averaged over 150 consecutive steps of each run for each subject (6,000 steps analyzed in total). We observed a near-perfect correlation between average step duration measured by accelerometer and by GPS (r=0.9998, N=40). Two parameters important for calculating the external work of walking were also analyzed, namely the vertical lift of the trunk and the velocity variation per step. For an average walking speed of 4.0 km/h, average vertical lift and velocity variation were 4.8 cm and 0.60 km/h, respectively. The average intra-individual step-to-step variability at constant speed, which includes both GPS errors and biological gait-style variation, was found to be 24.5% (coefficient of variation) for vertical lift and 44.5% for velocity variation. It is concluded that the GPS technique can provide useful biomechanical parameters for the analysis of an unlimited number of strides in an unconstrained, free-living environment.
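The step-level quantities reported here are straightforward to extract from a position trace. The toy sketch below uses synthetic signals with illustrative amplitudes (not the study's recordings) to compute per-step vertical lift, within-step speed variation, and their coefficients of variation from a 5 Hz trace:

```python
# Toy reconstruction (synthetic data, not the study's recordings) of how
# per-step vertical lift and speed variation can be pulled out of a 5 Hz
# differential-GPS position trace. At 5 Hz only a few samples fall within
# each step, so the per-step estimates are coarse, as in the real data.
import numpy as np

fs = 5.0                                   # GPS sampling rate (Hz)
step_f = 1.5                               # 90 steps/min
speed = 4.0 / 3.6                          # 4 km/h in m/s
rng = np.random.default_rng(3)

t = np.arange(0, 150 / step_f, 1 / fs)     # ~150 steps of walking
# Synthetic trace: forward progress + per-step oscillation + cm-level noise
x = speed * t + 0.01 * np.sin(2 * np.pi * step_f * t) + rng.normal(0, 0.01, t.size)
z = 0.024 * np.sin(2 * np.pi * step_f * t) + rng.normal(0, 0.01, t.size)

step_idx = (t * step_f).astype(int)        # which step each sample belongs to
v = np.gradient(x, t)                      # instantaneous forward speed

lift, dv = [], []
for k in np.unique(step_idx):
    m = step_idx == k
    if m.sum() > 2:                        # need a few samples per step
        lift.append(np.ptp(z[m]))          # vertical excursion within the step
        dv.append(np.ptp(v[m]))            # speed variation within the step

lift, dv = np.array(lift), np.array(dv)
print(f"lift: mean {100 * lift.mean():.1f} cm, "
      f"CV {100 * lift.std() / lift.mean():.0f}%")
print(f"speed variation: mean {3.6 * dv.mean():.2f} km/h, "
      f"CV {100 * dv.std() / dv.mean():.0f}%")
```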

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To determine the types and incidence of caruncular lesions and to investigate the correlation between clinical and histologic diagnoses. DESIGN: Retrospective, observational case series. METHODS: Records of patients with a lesion of the caruncle that was excised and submitted to our ocular pathology department between January 1979 and May 2005 were reviewed. Lesions were classified by histologic type and correlated with patient age, gender, and preoperative clinical diagnosis. RESULTS: A total of 195 consecutive caruncular lesions from 191 patients were identified, comprising 24 different types; the most common were nevi (n = 92, 47%) and papillomas (n = 29, 15%). One keratoacanthoma was identified. One hundred eighty-three lesions (93.8%) were benign, six (3.1%) were premalignant, and five (2.6%) were malignant. The preoperative clinical diagnosis corresponded to the postexcision histologic diagnosis in 73 cases (37.4%). Suspected malignancy was a common reason for excision (61 cases, 31.3%), but malignancy was confirmed in only three (4.9%) of these 61 cases. Two of the five malignant lesions had been clinically thought to be benign. CONCLUSIONS: We report the first caruncular keratoacanthoma. The rarity and variety of caruncular lesions make clinical diagnosis difficult. Malignancy is clinically overestimated, yet some malignant lesions can have a benign appearance, justifying close photographic follow-up of all lesions. Because caruncular malignant melanoma is associated with a poor prognosis, pigmented lesions should be monitored carefully. In the absence of clear criteria for malignancy, any change in color, size, or vascularization of a caruncular lesion should prompt excision.

Relevance:

20.00%

Publisher:

Abstract:

Numerous lines of evidence indicate that heterogeneity within the Earth's deep crystalline crust is complex and hence best described through stochastic rather than deterministic approaches. As seismic reflection imaging arguably offers the best means of sampling deep crustal rocks in situ, much interest has been expressed in using such data to characterize the stochastic nature of crustal heterogeneity. Previous work on this problem has shown that the spatial statistics of seismic reflection data are indeed related to those of the underlying heterogeneous seismic velocity distribution. So far, however, the nature of this relationship has remained elusive, because most of this work was either strictly empirical or based on flawed methodological approaches. Here, we introduce a conceptual model, based on the assumption of weak scattering, that allows us to quantitatively link the second-order statistics of a 2-D seismic velocity distribution with those of the corresponding processed and depth-migrated seismic reflection image. We then perform a sensitivity study in order to investigate what information regarding the stochastic model parameters describing crustal velocity heterogeneity might potentially be recovered from the statistics of a seismic reflection image using this model. Finally, we present a Monte Carlo inversion strategy to estimate these parameters, and we show examples of its application at two different source frequencies and using two different sets of prior information. Our results indicate that the inverse problem is inherently non-unique and that many different combinations of the vertical and lateral correlation lengths describing the velocity heterogeneity can yield seismic images with the same 2-D autocorrelation structure. The ratio of all of these possible combinations of vertical and lateral correlation lengths, however, remains roughly constant, which indicates that, without additional prior information, the aspect ratio is the only parameter describing the stochastic seismic velocity structure that can be reliably recovered.
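The basic building block of such an analysis, measuring the second-order statistics of a 2-D heterogeneous field, can be sketched compactly. The generator, grid, and correlation lengths below are illustrative assumptions, not the authors' setup; the sketch creates an anisotropic Gaussian random field and recovers its correlation lengths, and hence the aspect ratio, from the 2-D autocorrelation:

```python
# A small sketch (not the authors' inversion): generate a 2-D anisotropic
# random velocity-perturbation field and measure its autocorrelation to
# recover the correlation lengths. All parameter values are illustrative.
import numpy as np

n, dx = 256, 10.0                          # grid size, cell size in m
ax, az = 200.0, 50.0                       # lateral / vertical correlation lengths (m)

kx = np.fft.fftfreq(n, dx) * 2 * np.pi
kz = np.fft.fftfreq(n, dx) * 2 * np.pi
KX, KZ = np.meshgrid(kx, kz, indexing="ij")

# Spectrum of a Gaussian covariance with the chosen anisotropy
spec = np.exp(-(KX**2 * ax**2 + KZ**2 * az**2) / 4)
rng = np.random.default_rng(7)
noise = rng.normal(size=(n, n))
field = np.real(np.fft.ifft2(np.fft.fft2(noise) * np.sqrt(spec)))

# Wiener-Khinchin: autocorrelation = inverse FFT of the power spectrum
power = np.abs(np.fft.fft2(field))**2
acorr = np.real(np.fft.ifft2(power))
acorr /= acorr[0, 0]

# 1/e decay along each axis gives the correlation-length estimates
lag_x = dx * np.argmax(acorr[:, 0] < np.exp(-1))
lag_z = dx * np.argmax(acorr[0, :] < np.exp(-1))
print(f"estimated a_x ~ {lag_x:.0f} m, a_z ~ {lag_z:.0f} m, "
      f"aspect ratio ~ {lag_x / lag_z:.1f}")
```

In the paper's setting one only observes the reflection image, not the velocity field itself, which is why many (a_x, a_z) pairs with the same ratio become indistinguishable.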

Relevance:

20.00%

Publisher:

Abstract:

The human body is composed of a huge number of cells acting together in a concerted manner. The current understanding is that proteins perform most of the activities necessary to keep a cell alive, while DNA stores the information on how to produce the different proteins of the genome. Regulating gene transcription is thus the first important step that can affect the life of a cell, modify its functions, and shape its responses to the environment. Regulation is a complex operation that involves specialized proteins, the transcription factors (TFs). TFs can bind to DNA and activate the processes leading to the expression of genes into new proteins. Errors in this process may lead to disease. In particular, some transcription factors have been associated with cancer, a lethal pathological state characterized by uncontrolled cellular proliferation, invasion of healthy tissues, and abnormal responses to stimuli. Understanding cancer-related regulatory programs is a difficult task, often involving several TFs that interact and influence each other's activity. This thesis presents new computational methodologies to study gene regulation, together with applications to the understanding of cancer-related regulatory programs. Understanding transcriptional regulation is a major challenge, and we address it by combining computational approaches with large collections of heterogeneous experimental data.

First, we design signal-processing tools to recover transcription factor binding sites on the DNA from genome-wide surveys such as chromatin immunoprecipitation assays on tiling arrays (ChIP-chip). We then use the localization of TF binding to explain expression levels of regulated genes. In this way we identify a regulatory synergy between two TFs, the oncogene C-MYC and SP1. C-MYC and SP1 bind preferentially at promoters, and when SP1 binds next to C-MYC on the DNA, the nearby gene is strongly expressed. The association between the two TFs at promoters is reflected by the conservation of their binding sites across mammals and by the permissive underlying chromatin states; it represents an important control mechanism involved in cellular proliferation, and thereby in cancer.

Secondly, we identify the characteristics of the target genes of the TF estrogen receptor alpha (hERα) and study the influence of hERα in regulating transcription. Upon estrogen signaling, hERα binds to DNA to regulate transcription of its targets in concert with its co-factors. To overcome the scarcity of experimental data about the binding sites of other TFs that may interact with hERα, we conduct in silico analyses of the sequences underlying the ChIP sites using the collection of position weight matrices (PWMs) of hERα partners, the TFs FOXA1 and SP1. We combine ChIP-chip and ChIP-paired-end-diTag (ChIP-PET) data about hERα binding on DNA with this sequence information to explain gene expression levels in a large collection of cancer tissue samples, as well as in studies of the response of cells to estrogen. We confirm that hERα binding sites are distributed everywhere on the genome, but we distinguish between binding sites near promoters and binding sites along the transcripts. The first group shows weak binding of hERα and a high occurrence of SP1 motifs, in particular near estrogen-responsive genes. The second group shows strong binding of hERα and a significant correlation between the number of binding sites along a gene and the strength of gene induction in the presence of estrogen. Some binding sites of the second group also show the presence of FOXA1, but the role of this TF remains to be investigated. Different mechanisms have been proposed to explain hERα-mediated induction of gene expression; our work supports the model of hERα activating gene expression from distal binding sites by interacting with promoter-bound TFs such as SP1. hERα has been associated with survival rates of breast cancer patients, though explanatory models are still incomplete; this result is therefore important to better understand how hERα controls gene expression.

Thirdly, we address the difficult question of regulatory network inference. We tackle this problem by analyzing time series of biological measurements, such as quantifications of mRNA levels or protein concentrations. Our approach uses well-established penalized linear regression models in which we impose sparseness on the connectivity of the regulatory network. We extend this method by enforcing the coherence of the regulatory dependencies: a TF must coherently behave as an activator, or as a repressor, on all of its targets. This requirement is implemented as constraints on the signs of the regressed coefficients in the penalized linear regression model. Our approach reconstructs meaningful biological networks better than previous methods based on penalized regression. The method was tested on the DREAM2 challenge of reconstructing a five-gene/TF regulatory network, obtaining the best performance in the "undirected signed excitatory" category. These bioinformatics methods, which are reliable, interpretable, and fast enough to handle large biological datasets, have thus enabled us to better understand gene regulation in humans.
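The sign-coherence constraint of the third part can be illustrated compactly. The sketch below is a simplified stand-in, not the thesis implementation: because the DREAM2 setting has only five TFs, one sign per TF can be enforced by enumerating all sign assignments and solving a non-negative lasso for each; all data and parameters, and the use of scikit-learn's Lasso, are assumptions of this sketch:

```python
# Minimal sketch of sign-coherent sparse network inference: each TF must act
# with a single sign (activator or repressor) across *all* targets. With few
# TFs, we can enumerate sign assignments and solve a non-negative lasso each.
from itertools import product
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(5)
n_samples, n_tfs, n_targets = 100, 5, 5

# Toy data: TF activities X and a sparse, sign-coherent true network W
X = rng.normal(size=(n_samples, n_tfs))
true_sign = np.array([1, -1, 1, 1, -1])
W = true_sign * (rng.random((n_targets, n_tfs)) < 0.4) \
              * rng.uniform(0.5, 1.5, (n_targets, n_tfs))
Y = X @ W.T + 0.1 * rng.normal(size=(n_samples, n_targets))

def fit_signed(signs, alpha=0.05):
    """Non-negative lasso on sign-flipped TF activities -> coherent signs."""
    loss, coefs = 0.0, []
    Xs = X * signs                           # flip columns so weights are >= 0
    for j in range(n_targets):
        m = Lasso(alpha=alpha, positive=True).fit(Xs, Y[:, j])
        loss += np.mean((Y[:, j] - m.predict(Xs))**2) + alpha * m.coef_.sum()
        coefs.append(signs * m.coef_)        # map back to signed weights
    return loss, np.array(coefs)

# Enumerate all 2**5 sign assignments and keep the best penalized fit
best = min((fit_signed(np.array(s)) for s in product([1, -1], repeat=n_tfs)),
           key=lambda r: r[0])
print("recovered signed weights:\n", np.round(best[1], 2))
```

Enumeration is only feasible for a handful of TFs; for larger networks the same constraint has to be handled inside the optimizer, as in the thesis.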

Relevance:

20.00%

Publisher: