70 results for standard on auditing

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

1 Summary

This dissertation deals with two major aspects of corporate governance that have grown in importance in recent years: the internal audit function and financial accounting education. In three essays, I contribute to research on these topics, which are embedded in the broader corporate governance literature. The first two essays consist of experimental investigations of internal auditors' judgments. They deal with two research issues for which accounting research lacks evidence: the effectiveness of internal controls and the potentially conflicting role of the internal audit function between management and the audit committee. The findings of the first two essays contribute to the literature on internal auditors' judgment and the role of the internal audit function as a major cornerstone of corporate governance. The third essay theoretically examines a broader issue but also relates to the overall research question of this dissertation: What contributes to effective corporate governance? This last essay takes the perspective that the root of quality corporate governance is appropriate financial accounting education. I develop a public interest approach to accounting education that contributes to the literature on adequate accounting education with respect to corporate governance and accounting harmonization. The increasing importance of both the internal audit function and accounting education for corporate governance can be explained by the same recent fundamental changes that still affect accounting research and practice. First, the Sarbanes-Oxley Act of 2002 (SOX, 2002) and the 8th EU Directive (EU, 2006) have led to a bigger role for the internal audit function in corporate governance. Their implications regarding the implementation of audit committees and their oversight over internal controls are extensive. As a consequence, the internal audit function has become increasingly important for corporate governance and serves a new master (i.e. the audit committee) within the company in addition to management. Second, the SOX (2002) and the 8th EU Directive introduced additional internal control mechanisms that are expected to contribute to the reliability of financial information. As a consequence, the internal audit function is expected to contribute to a greater extent to the reliability of financial statements. Therefore, effective internal control mechanisms that strengthen objective judgments and independence become important. This is especially true when external auditors rely on the work of internal auditors in the context of the International Standard on Auditing (ISA) 610 and the equivalent US Statement on Auditing Standards (SAS) 65 (see IFAC, 2009 and AICPA, 1990). Third, the harmonization of international reporting standards is increasingly promoted by means of a principles-based approach. It has been the leading approach since an SEC study (2003), required by section 108(d) of the SOX (2002), came out in its favor. As a result, the Financial Accounting Standards Board (FASB) and the International Accounting Standards Board (IASB) have committed themselves to the development of compatible accounting standards based on a principles-based approach. Moreover, since the Norwalk Agreement of 2002, the two standard setters have developed exposure drafts for a common conceptual framework that will be the basis for accounting harmonization. The new framework will be in favor of fair value measurement and accounting for real-world economic phenomena.
These changes in standard setting lead to a trend towards more professional judgment in the accounting process. They affect internal and external auditors, accountants, and managers in general. As a consequence, a new competency set for preparers and users of financial statements is required. The basis for this new competency set is adequate accounting education (Schipper, 2003). These three issues, which affect corporate governance, are the starting point of this dissertation and constitute its motivation. Two broad questions motivated a scientific examination in three essays: 1) What are the major aspects to be examined regarding the new role of the internal audit function? 2) How should major changes in standard setting affect financial accounting education? The first question became apparent due to two published literature reviews by Gramling et al. (2004) and Cohen, Krishnamoorthy & Wright (2004). These studies raise various questions for future research that are still relevant and which motivate the first two essays of my dissertation. In the first essay, I focus on the role of the internal audit function as one cornerstone of corporate governance and its potentially conflicting role of serving both management and the audit committee (IIA, 2003). In an experimental study, I provide evidence on the challenges for internal auditors in their role as servant of two masters - the audit committee and management - and on how this influences internal auditors' judgment (Gramling et al. 2004; Cohen, Krishnamoorthy & Wright, 2004). I ask whether there is an expectation gap between what internal auditors should provide for corporate governance in theory and what they are able to provide in practice. In particular, I focus on the effect of serving two masters on the internal auditor's independence. I argue that independence is hardly achievable if the internal audit function serves two masters with conflicting priorities. The second essay provides evidence on the effectiveness of accountability as an internal control mechanism. In general, internal control mechanisms based on accountability were enforced by the SOX (2002) and the 8th EU Directive. Subsequently, many companies introduced sub-certification processes that should contribute to an objective judgment process. These mechanisms are thus important for strengthening the reliability of financial statements. Based on the need for evidence on the effectiveness of internal control mechanisms (Brennan & Solomon, 2008; Gramling et al. 2004; Cohen, Krishnamoorthy & Wright, 2004; Solomon & Trotman, 2003), I designed an experiment to examine the joint effect of accountability and obedience pressure in an internal audit setting. I argue that obedience pressure can potentially have a negative influence on accountants' objectivity (e.g. DeZoort & Lord, 1997), whereas accountability can mitigate this negative effect. My second main research question - how should major changes in standard setting affect financial accounting education? - is investigated in the third essay. It is motivated by the observation, made during my PhD, that many conferences deal with the topic of accounting education but very little is published about what needs to be done. Moreover, the findings in the first two essays of this thesis and their literature review suggest that financial accounting education can contribute significantly to quality corporate governance, as argued elsewhere (Schipper, 2003; Boyce, 2004; Ghoshal, 2005).
In the third essay of this thesis, I therefore focus on approaches to financial accounting education that account for the changes in standard setting and also contribute to corporate governance and accounting harmonization. I argue that the competency set required in practice changes due to major changes in standard setting. As the major contribution of the third article, I develop a public interest approach for financial accounting education. The major findings of this dissertation can be summarized as follows. The first essay provides evidence on an important research question raised by Gramling et al. (2004, p. 240): "If the audit committee and management have different visions for the corporate governance role of the IAF, which vision will dominate?" According to the results of the first essay, internal auditors do follow the priorities of either management or the audit committee, based on the guidance provided by the Chief Audit Executive. The study's results question whether the independence of the internal audit function is actually achievable. My findings contribute to research on internal auditors' judgment and the internal audit function's independence in the broader frame of corporate governance. The results are also important for practice because independence is a major justification for a positive contribution of the internal audit function to corporate governance. The major findings of the second essay indicate that the duty to sign work results - a means of holding people accountable - mitigates the negative effect of obedience pressure on reliability. Hence, I found evidence that control mechanisms relying on certifications may enhance the reliability of financial information. These findings contribute to the literature on the effectiveness of internal control mechanisms. They are also important in the light of the sub-certification processes that resulted from the Sarbanes-Oxley Act and the 8th EU Directive. The third essay contributes to the literature by developing a measurement framework that accounts for the consequences of major trends in standard setting. Moreover, it shows how these trends affect the required competency set of people dealing with accounting issues. Based on this work, my main contribution is the development of a public interest approach for the design of adequate financial accounting curricula.

2 Serving two masters: Experimental evidence on the independence of internal auditors

Abstract: Twenty-nine internal auditors participated in a study that examines the independence of internal auditors in their potentially competing roles of serving two masters: the audit committee and management. Our main hypothesis suggests that internal auditors' independence is not achievable in an institutional setting in which internal auditors are accountable to two different parties with potentially differing priorities. We test our hypothesis in an experiment in which the treatment consisted of two different instructions from the Chief Audit Executive: one stressing the priority of management (cost reduction) and one stressing the priority of the audit committee (effectiveness). Internal auditors had to evaluate the internal controls and their inherent costs for different processes, which varied in their degree of task complexity. Our main results indicate that internal auditors' evaluation of the processes is significantly different when task complexity is high.
Our findings suggest that internal auditors do follow the priorities of either management or the audit committee depending on the instructions of a superior internal auditor. The study's results question whether the independence of the internal audit function is actually achievable. With our findings, we contribute to research on internal auditors' judgment and the internal audit function's independence in the frame of corporate governance.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Cardiopulmonary bypass (CPB) with aortic cross-clamping and cardioplegic arrest remains the method of choice for patients requiring standard myocardial revascularization. Therefore, very high-risk patients presenting with acute coronary syndrome, unstable angina, or onset of cardiac decompensation and requiring emergency multiple myocardial revascularization can have a poor outcome. The on-pump beating heart technique can reduce mortality and morbidity in such a selected group of patients, and this report describes our clinical experience. METHODS: Out of 290 patients operated on for CABG from January 2005 to January 2006, 25 (8.6%) selected high-risk patients suffering from life-threatening coronary syndrome (mean age 69 +/- 7 years) and requiring emergency multiple myocardial revascularization underwent on-pump beating heart surgery. The mean pre-operative left ventricle ejection fraction (LVEF) was 27 +/- 8%. The majority of them (88%) suffered from three-vessel coronary disease and 6 (24%) had left main stem disease. Nine patients (35%) were in severe cardiac failure and seven of them (28%) received a pre-operative intra-aortic balloon pump. The pre-operative EuroSCORE was equal to or above 8 in 18 patients (73%). RESULTS: All patients underwent on-pump beating heart coronary revascularization. The mean number of grafts per patient was 2.9 +/- 0.6 and the internal mammary artery was used in 23 patients (92%). The mean CPB time was 84 +/- 19 minutes. Two patients died during the recovery stay in the intensive care unit, and there were no postoperative myocardial infarctions among the survivors. Eight patients suffered from transient renal failure and 1 patient developed a sternal wound infection. The mean hospital stay was 12 +/- 7 days. The follow-up was complete for all 23 patients who survived surgery and the mean follow-up time was 14 +/- 5 months. One patient died of cardiac arrest during follow-up and 2 patients required an implantable cardiac defibrillator. One year after surgery, all patients had a standard trans-thoracic echocardiogram showing a mean LVEF of 36 +/- 11.8%. CONCLUSION: Standard on-pump arrested heart coronary surgery has higher mortality and morbidity in emergencies. On-pump beating heart myocardial revascularization seems to be a valid alternative for the restricted and selected cohort of patients suffering from life-threatening coronary syndrome and requiring multiple emergency CABG.

Relevance:

80.00%

Publisher:

Abstract:

The objective of this work was to develop and validate a set of clinical criteria for the classification of patients affected by periodic fevers. Patients with inherited periodic fevers (familial Mediterranean fever (FMF); mevalonate kinase deficiency (MKD); tumour necrosis factor receptor-associated periodic fever syndrome (TRAPS); cryopyrin-associated periodic syndromes (CAPS)) enrolled in the Eurofever Registry up until March 2013 were evaluated. Patients with periodic fever, aphthosis, pharyngitis and adenitis (PFAPA) syndrome were used as negative controls. For each genetic disease, patients were considered to be 'gold standard' on the basis of the presence of a confirmatory genetic analysis. Clinical criteria were formulated on the basis of univariate and multivariate analysis in an initial group of patients (training set) and validated in an independent set of patients (validation set). A total of 1215 consecutive patients with periodic fevers were identified, and 518 gold standard patients (291 FMF, 74 MKD, 86 TRAPS, 67 CAPS) and 199 patients with PFAPA as disease controls were evaluated. The univariate and multivariate analyses identified a number of clinical variables that correlated independently with each disease, and four provisional classification scores were created. Cut-off values of the classification scores were chosen using receiver operating characteristic curve analysis as those giving the highest sensitivity and specificity. The classification scores were then tested in an independent set of patients (validation set) with an area under the curve of 0.98 for FMF, 0.95 for TRAPS, 0.96 for MKD, and 0.99 for CAPS. In conclusion, evidence-based provisional clinical criteria with high sensitivity and specificity for the clinical classification of patients with inherited periodic fevers have been developed.
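As an illustration of the cut-off selection step described above (a threshold chosen on the training set as the point of highest combined sensitivity and specificity, then assessed on an independent validation set), a minimal Python sketch follows; the scores, labels and sample sizes are invented placeholders, not the Eurofever data.

import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def choose_cutoff(scores, labels):
    # threshold maximizing sensitivity + specificity, i.e. Youden's J = TPR - FPR
    fpr, tpr, thresholds = roc_curve(labels, scores)
    return thresholds[np.argmax(tpr - fpr)]

rng = np.random.default_rng(0)
# toy classification scores: higher in 'gold standard' patients (label 1) than in controls (label 0)
train_scores = np.concatenate([rng.normal(2, 1, 200), rng.normal(0, 1, 200)])
train_labels = np.concatenate([np.ones(200), np.zeros(200)])
valid_scores = np.concatenate([rng.normal(2, 1, 100), rng.normal(0, 1, 100)])
valid_labels = np.concatenate([np.ones(100), np.zeros(100)])

cutoff = choose_cutoff(train_scores, train_labels)
sensitivity = (valid_scores[valid_labels == 1] >= cutoff).mean()
specificity = (valid_scores[valid_labels == 0] < cutoff).mean()
auc = roc_auc_score(valid_labels, valid_scores)
print(f"cut-off {cutoff:.2f}: sensitivity {sensitivity:.2f}, specificity {specificity:.2f}, validation AUC {auc:.2f}")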

Relevance:

40.00%

Publisher:

Abstract:

PURPOSE: When treating peripheral ectatic disease such as pellucid marginal degeneration (PMD), corneal cross-linking with UV-A and riboflavin (CXL) must be applied eccentrically to the periphery of the lower cornea, partly irradiating the corneal limbus. Here, we investigated the effect of standard- and double-fluence CXL on the cornea and corneal limbus in the rabbit eye in vivo. METHODS: Epithelium-off CXL was performed in male New Zealand White rabbits with two irradiation diameters (7 mm central cornea; 13 mm cornea and limbus), using standard-fluence (5.4 J/cm(2)) and double-fluence (10.8 J/cm(2)) settings. Controls were subjected to epithelial removal and riboflavin instillation, but were not irradiated with UV-A. Following CXL, animals were examined daily until complete closure of the epithelium, and at 7, 14, 21, and 28 days. Animals were killed and a corneoscleral button was excised and processed for light microscopy and immunohistochemistry. RESULTS: For both irradiation diameters and fluences tested, no signs of endothelial damage or limbal vessel thrombosis were observed, and the time to re-epithelialization was similar to that of untreated controls. Histological and immunohistochemical analysis revealed no differences in the expression pattern of the putative stem cell marker p63. CONCLUSIONS: Even with a fluence twice as high as that used in current clinical CXL settings, circumferential UV-A irradiation of the corneal limbus does not alter the regenerative capacity of the limbal epithelial cells, and the expression pattern of the putative stem cell marker p63 remains unchanged. This suggests that eccentric CXL may be performed safely in PMD.
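For orientation, fluence is irradiance integrated over exposure time, H = E \cdot t. Assuming the conventional 3 mW/cm^2 irradiance of standard clinical CXL (an assumption; the abstract does not state the irradiance or exposure times actually used), the two settings would correspond to

H = 0.003\,\mathrm{W/cm^2} \times 1800\,\mathrm{s} = 5.4\,\mathrm{J/cm^2} (standard fluence, 30 min)
H = 0.003\,\mathrm{W/cm^2} \times 3600\,\mathrm{s} = 10.8\,\mathrm{J/cm^2} (double fluence, 60 min).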

Relevance:

40.00%

Publisher:

Abstract:

Purpose: To investigate the effect of incremental increases in intraocular straylight on threshold measurements made by three modern forms of perimetry: Standard Automated Perimetry (SAP) using Octopus (Dynamic, G-Pattern), Pulsar Perimetry (PP) (TOP, 66 points), and the Moorfields Motion Displacement Test (MDT) (WEBS, 32 points). Methods: Four healthy young observers were recruited (mean age 26 yrs [25 yrs, 28 yrs]; refractive correction [+2 D, -4.25 D]). Five white opacity filters (WOF), each scattering light by a different amount, were used to create incremental increases in intraocular straylight (IS). The resultant IS values were measured with each WOF and at baseline (no WOF) for each subject using a C-Quant Straylight Meter (Oculus, Wetzlar, Germany). A 25-year-old has an IS value of ~0.85 log(s); an increase of 40% in IS, to 1.2 log(s), corresponds to the physiological value of a 70-year-old. Each WOF created an increase in IS of between 10% and 150% from baseline, ranging from effects similar to normal aging to those found with considerable cataract. Each subject underwent 6 test sessions over a 2-week period; each session consisted of the 3 perimetric tests using one of the five WOFs or baseline (both instrument and filter were randomised). Results: The reduction in sensitivity from baseline was calculated. A two-way ANOVA on mean change in threshold (where subjects were treated as rows of the block and each increment in filter as a column) was used to examine the effect of incremental increases in straylight. Both SAP (p<0.001) and Pulsar (p<0.001) were significantly affected by increases in straylight, whereas the MDT (p=0.35) remained comparatively robust. Conclusions: The Moorfields MDT measurement of threshold is robust to the effects of additional straylight compared with SAP and PP.
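A minimal sketch of the blocked two-way ANOVA described above (subjects as one factor, straylight increments as the other, run separately for each instrument on the change-in-threshold values); the numbers are invented placeholders, not the study data.

import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# one row per subject x white-opacity-filter condition: change in threshold from baseline (dB)
data = pd.DataFrame({
    "subject": ["S1", "S2", "S3", "S4"] * 5,
    "filter": sorted(["WOF1", "WOF2", "WOF3", "WOF4", "WOF5"] * 4),
    "dthresh": [-0.5, -0.8, -0.3, -0.6, -1.2, -1.5, -1.0, -1.1,
                -2.0, -2.3, -1.8, -2.1, -3.1, -3.4, -2.9, -3.0,
                -4.2, -4.6, -4.0, -4.3],
})

model = ols("dthresh ~ C(subject) + C(filter)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))   # the p-value for C(filter) tests the straylight effect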

Relevance:

40.00%

Publisher:

Abstract:

PURPOSE: To quantify the relationship between bone marrow (BM) response to radiation and radiation dose by using (18)F-labeled fluorodeoxyglucose positron emission tomography [(18)F]FDG-PET standard uptake values (SUV) and to correlate these findings with hematological toxicity (HT) in cervical cancer (CC) patients treated with chemoradiation therapy (CRT). METHODS AND MATERIALS: Seventeen women with a diagnosis of CC were treated with standard doses of CRT. All patients underwent pre- and post-therapy [(18)F]FDG-PET/computed tomography (CT). Hemograms were obtained before and during treatment and 3 months after treatment and at last follow-up. Pelvic bone was autosegmented as total bone marrow (BMTOT). Active bone marrow (BMACT) was contoured based on SUV greater than the mean SUV of BMTOT. The volumes (V) of each region receiving 10, 20, 30, and 40 Gy (V10, V20, V30, and V40, respectively) were calculated. Metabolic volume histograms and voxel SUV map response graphs were created. Relative changes in SUV before and after therapy were calculated by separating SUV voxels into radiation therapy dose ranges of 5 Gy. The relationships among SUV decrease, radiation dose, and HT were investigated using multiple regression models. RESULTS: Mean relative pre-post-therapy SUV reductions in BMTOT and BMACT were 27% and 38%, respectively. BMACT volume was significantly reduced after treatment (from 651.5 to 231.6 cm(3), respectively; P<.0001). BMACT V30 was significantly correlated with a reduction in BMACT SUV (R(2), 0.14; P<.001). The reduction in BMACT SUV significantly correlated with reduction in white blood cells (WBCs) at 3 months post-treatment (R(2), 0.27; P=.04) and at last follow-up (R(2), 0.25; P=.04). Different dosimetric parameters of BMTOT and BMACT correlated with long-term hematological outcome. CONCLUSIONS: The volumes of BMTOT and BMACT that are exposed to even relatively low doses of radiation are associated with a decrease in WBC counts following CRT. The loss in proliferative BM SUV uptake translates into low WBC nadirs after treatment. These results suggest the potential of intensity modulated radiation therapy to spare BMTOT to reduce long-term hematological toxicity.
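A rough sketch of the voxel-wise step described above (relative pre-/post-therapy SUV change summarized per 5 Gy dose range); the arrays are synthetic placeholders standing in for co-registered PET and planning-dose data.

import numpy as np

rng = np.random.default_rng(1)
dose = rng.uniform(0, 50, 10_000)                  # Gy, one value per bone marrow voxel
suv_pre = rng.normal(2.0, 0.3, 10_000)
suv_post = suv_pre * (1 - 0.01 * dose) + rng.normal(0, 0.05, 10_000)

rel_change = (suv_post - suv_pre) / suv_pre        # relative SUV change per voxel
bins = np.arange(0, 55, 5)                         # 5 Gy dose ranges
bin_idx = np.digitize(dose, bins) - 1

for i in range(len(bins) - 1):
    mask = bin_idx == i
    print(f"{bins[i]}-{bins[i + 1]} Gy: mean relative SUV change {rel_change[mask].mean():+.1%}")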

Relevance:

40.00%

Publisher:

Abstract:

We analyzed the initial adhesion and biofilm formation of Staphylococcus aureus (ATCC 29213) and S. epidermidis RP62A (ATCC 35984) on various bone grafts and bone graft substitutes under standardized in vitro conditions. In parallel, microcalorimetry was evaluated as a real-time microbiological assay for the investigation of biofilm formation and for materials science research. The materials beta-tricalcium phosphate (beta-TCP), processed human spongiosa (Tutoplast) and poly(methyl methacrylate) (PMMA) were investigated and compared with polyethylene (PE). Bacterial counts (log(10) cfu per sample) were highest on beta-TCP (S. aureus 7.67 +/- 0.17; S. epidermidis 8.14 +/- 0.05), while bacterial density (log(10) cfu per surface area) was highest on PMMA (S. aureus 6.12 +/- 0.2; S. epidermidis 7.65 +/- 0.13). The detection time for S. aureus biofilms was shorter for the porous materials (beta-TCP and processed human spongiosa, p < 0.001) compared to the smooth materials (PMMA and PE), with no differences between beta-TCP and processed human spongiosa (p > 0.05) or between PMMA and PE (p > 0.05). In contrast, for S. epidermidis biofilms the detection time differed (p < 0.001) between all materials except processed human spongiosa and PE (p > 0.05). Quantitative culture after washing and sonication of the material demonstrated the importance of monitoring factors such as the specific surface area and porosity of the test materials. Isothermal microcalorimetry proved to be a suitable tool for an accurate, non-invasive and real-time microbiological assay, allowing the detection of bacterial biomass without removing the biofilm from the surface.

Relevance:

40.00%

Publisher:

Abstract:

It has been shown that repolarization alternans, a beat-to-beat alternation in action potential duration, enhances dispersion of repolarization above a critical heart rate and promotes susceptibility to ventricular arrhythmias. It is unknown whether repolarization alternans is measurable in the atria using standard pacemakers and whether it plays a role in promoting atrial fibrillation. In this work, atrial repolarization alternans amplitude and periodicity are studied in a sheep model of pacing-induced atrial fibrillation. Two pacemakers, each with one right atrial and ventricular lead, were implanted in 4 male sheep after ablation of the atrioventricular junction. The first one was used to deliver rapid pacing for measurements of right atrial repolarization alternans and the second one to record a unipolar electrogram. Atrial repolarization alternans appeared rate-dependent and its amplitude increased as a function of pacing rate. Repolarization alternans was intermittent but no periodicity was detected. An increase of repolarization alternans preceding episodes of non-sustained atrial fibrillation suggests that repolarization alternans is a promising parameter for assessment of atrial fibrillation susceptibility.
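As a generic illustration of how a beat-to-beat alternans amplitude can be quantified from a series of per-beat repolarization measurements (one common estimator, not necessarily the one used in this sheep study):

import numpy as np

def alternans_amplitude(repol):
    # mean magnitude of the alternating (ABAB) component of a beat series
    repol = np.asarray(repol, dtype=float)
    diffs = np.diff(repol)                                     # beat-to-beat differences
    alternating = np.all(np.sign(diffs[:-1]) * np.sign(diffs[1:]) < 0)
    return np.mean(np.abs(diffs)) / 2 if alternating else 0.0

beats = 150 + np.array([+1, -1] * 16, dtype=float)             # toy ABAB pattern, ~1 ms amplitude
print(alternans_amplitude(beats))                              # -> 1.0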

Relevance:

40.00%

Publisher:

Abstract:

Accurate prediction of mortality following burns is useful as an audit tool and for guiding treatment planning and resource allocation. Common burn formulae (Ryan Score, Abbreviated Burn Severity Index (ABSI), classic and revised Baux) have not been compared with the standard Acute Physiology and Chronic Health Evaluation II (APACHE II) or re-validated in a severely (≥20% total body surface area) burned population. Furthermore, the revised Baux (R-Baux) has been externally validated thoroughly only once and the pediatric Baux (P-Baux) has yet to be. Using data from 522 severely burned patients, we show that the burn formulae (ABSI, Baux, revised Baux) outperform APACHE II among adults (AUROC increase: p<0.001 in adults; p>0.5 in children). The Ryan Score performs well, especially among the most at-risk populations (estimated mortality [90% CI], original versus current study: 33% [26-41%] versus 30.18% [24.25-36.86%] for Ryan Score 2; 87% [78-93%] versus 66.48% [51.31-78.87%] for Ryan Score 3). The R-Baux shows accurate discrimination (AUROC 0.908 [0.869-0.947]) and is well calibrated. However, the ABSI and P-Baux, although showing high discrimination in children (AUROC 0.826 [0.737-0.916] and 0.848 [0.758-0.938]), greatly overestimate mortality, indicating poor calibration. We highlight challenges in designing and employing scores that are applicable to a wide range of populations.
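A hedged sketch of the discrimination comparison described above, scoring a synthetic cohort with the revised Baux formula (age + %TBSA + 17 if inhalation injury) and comparing its AUROC with APACHE II; the cohort, outcome model and column names are illustrative assumptions, not the study dataset.

import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

def revised_baux(age, tbsa, inhalation):
    return age + tbsa + 17 * inhalation

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "age": rng.integers(18, 90, n),
    "tbsa": rng.uniform(20, 90, n),        # >=20% TBSA, as in the cohort studied
    "inhalation": rng.integers(0, 2, n),
    "apache2": rng.integers(0, 40, n),
})
df["rbaux"] = revised_baux(df.age, df.tbsa, df.inhalation)
# placeholder outcome loosely tied to the burn score, for demonstration only
df["died"] = (rng.random(n) < 1 / (1 + np.exp(-(df.rbaux - 110) / 15))).astype(int)

for score in ("rbaux", "apache2"):
    print(score, f"AUROC = {roc_auc_score(df.died, df[score]):.3f}")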

Relevance:

30.00%

Publisher:

Abstract:

A novel approach to measuring carbon dioxide (CO2) in gaseous samples, based on precise and accurate quantification using a (13)CO2 internal standard generated in situ, is presented. The main goal of this study was to provide an innovative headspace-gas chromatography-mass spectrometry (HS-GC-MS) method applicable to the routine determination of CO2. The main drawback of the GC methods for CO2 measurement discussed in the literature is the lack of a specific internal standard, which is necessary to perform quantification. CO2 is still quantified by external calibration, without taking into account the analytical problems that often occur with gaseous samples. To avoid the manipulation of a stable isotope-labeled gas, we chose to generate in situ a labeled internal standard gas ((13)CO2) on the basis of the stoichiometric formation of CO2 by the reaction of hydrochloric acid (HCl) with sodium hydrogen carbonate (NaH(13)CO3). This method allows a precise measurement of CO2 concentration and was validated on various human postmortem gas samples in order to study its efficiency.
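A sketch of the underlying chemistry and of the quantification it enables; the one-point internal-calibration form below is an illustrative assumption, since the abstract does not give the calibration model:

\mathrm{NaH^{13}CO_3 + HCl \rightarrow NaCl + H_2O + {}^{13}CO_2},

so each mole of labeled hydrogen carbonate releases one mole of ^{13}CO_2 into the headspace, i.e. n(^{13}CO_2) = n(NaH^{13}CO_3). With A_{44} and A_{45} the peak areas of the sample ^{12}CO_2 and of the in situ ^{13}CO_2 internal standard (m/z 44 and 45), and assuming equal response factors,

C_{\mathrm{CO_2,\,sample}} \approx \frac{A_{44}}{A_{45}} \times C_{^{13}\mathrm{CO_2}}.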

Relevance:

30.00%

Publisher:

Abstract:

Matrix effects, which represent an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. In the case of quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to those of the analyte is recommended. In this paper, an example of the choice of a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in the case of chiral (R,S)-methadone quantification in plasma is reported. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer, with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart covering 52 series of routine analyses was established, using both the intermediate-precision standard deviation and the FDA acceptance criteria. The results of routine quality control samples were generally within the +/-15% variability around the target value and mostly within the two-standard-deviation interval, illustrating the long-term stability of the method. The intermediate-precision variability estimated during method validation was found to be consistent with the routine use of the method. During this period, 257 trough-concentration and 54 peak-concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
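A minimal sketch of the acceptance check described above, flagging routine QC results that fall outside +/-15% of the nominal concentration (FDA-style criterion) or outside two intermediate-precision standard deviations; the numbers are illustrative, not the published validation data.

import numpy as np

nominal = 200.0        # ng/mL, QC target for one methadone enantiomer (assumed)
cv_ip = 0.06           # intermediate precision expressed as a relative SD (assumed)
qc = np.array([195.0, 212.0, 188.0, 241.0, 203.0, 176.0])   # routine QC results

within_15pct = np.abs(qc - nominal) / nominal <= 0.15
within_2sd = np.abs(qc - nominal) <= 2 * cv_ip * nominal

for value, ok15, ok2sd in zip(qc, within_15pct, within_2sd):
    print(f"{value:6.1f} ng/mL  +/-15%: {'ok' if ok15 else 'FAIL'}  2 SD: {'ok' if ok2sd else 'FAIL'}")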

Relevance:

30.00%

Publisher:

Abstract:

Synthesis report: Introduction: The Internet is an important source of information on mental health. Bipolar disorder is commonly associated with disability, comorbidities, a low level of insight, and poor treatment compliance. The burden of the illness, through its depressive and manic episodes, can lead people (whether or not a diagnosis of bipolar disorder has already been made), as well as their families, to search for information on the Internet. It is therefore important that websites dealing with the subject contain high-quality information based on scientific evidence. Objective: to evaluate the quality of the information available on the Internet about bipolar disorder and to identify quality indicators. Method: two keywords, "bipolar disorder" and "manic depressive illness", were entered into the most commonly used Internet search engines. Websites were evaluated with a standard form designed to rate sites on the basis of authorship (private, university, company, ...), presentation, interactivity, readability and quality of content. The "Health On the Net" (HON) quality label and the DISCERN tool were used to verify their effectiveness as quality indicators. Results: of the 80 sites identified, 34 were included. Based on the outcome measures, the quality of the sites' content was found to be good. The content quality of websites dealing with bipolar disorder was significantly explained by readability, accountability and interactivity, as well as by a global score. Conclusions: overall, the content of the websites on bipolar disorder examined in this study is of good quality.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this work is to evaluate the capabilities and limitations of chemometric methods and other mathematical treatments applied to spectroscopic data, and more specifically to paint samples. The uniqueness of the spectroscopic data comes from the fact that they are multivariate - a few thousand variables - and highly correlated. Statistical methods are used to study and discriminate samples. A collection of 34 red paint samples was measured by infrared and Raman spectroscopy. Data pretreatment and variable selection demonstrated that the use of Standard Normal Variate (SNV), together with removal of the noisy variables by selecting the wavenumber ranges 650-1830 cm−1 and 2730-3600 cm−1, provided the optimal results for the infrared analysis. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were then used as exploratory techniques to provide evidence of structure in the data, find clusters, or detect outliers. For the FTIR spectra, the principal components (PCs) correspond to binder types and the presence/absence of calcium carbonate, and 83% of the total variance is explained by the first four PCs. For the Raman spectra, we observe six different clusters corresponding to the different pigment compositions when plotting the first two PCs, which account for 37% and 20% of the total variance, respectively. In conclusion, the use of chemometrics for the forensic analysis of paints provides a valuable tool for objective decision-making, a reduction of possible classification errors, and better efficiency, yielding robust results with time-saving data treatments.
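A minimal sketch of the pretreatment and exploration steps described above (SNV normalization of each spectrum, restriction to the informative wavenumber windows, then PCA); the spectra here are synthetic placeholders, whereas in the study the input would be the FTIR spectra of the 34 red paints.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
wavenumbers = np.arange(600, 4000, 2.0)            # cm-1 axis
spectra = rng.random((34, wavenumbers.size))       # placeholder absorbance matrix (samples x variables)

# keep only the 650-1830 and 2730-3600 cm-1 regions (removal of the noisy variables)
keep = ((wavenumbers >= 650) & (wavenumbers <= 1830)) | (
    (wavenumbers >= 2730) & (wavenumbers <= 3600))
X = spectra[:, keep]

# Standard Normal Variate: centre and scale each spectrum individually
X_snv = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

pca = PCA(n_components=4)
scores = pca.fit_transform(X_snv)
print(pca.explained_variance_ratio_.cumsum())      # cumulative variance explained by PC1-PC4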

Relevance:

30.00%

Publisher:

Abstract:

Background and Aims: Recently, single nucleotide polymorphisms (SNPs) in IL28B were shown to correlate with response to pegylated interferon-alpha (IFN) and ribavirin therapy of chronic HCV infection. However, the cause of the SNPs' effect on therapy response and its relevance for direct anti-viral (DAV) treatment are not clear. Here, we analyze early HCV kinetics as a function of IL28B SNPs to determine their specific effect on viral dynamics. Methods: The IL28B SNPs rs8099917, rs12979860 and rs12980275 were genotyped in 252 treatment-naïve, chronically HCV-infected Caucasian patients (67% HCV genotype 1, 28% genotype 2-3) receiving peginterferon alfa-2a (180 µg/qw) plus ribavirin (1000-1200 mg/qd) in the DITTO study. HCV-RNA was measured (LD = 50 IU/ml) frequently during the first 28 days. Results: RVR was achieved in 33% of genotype 1 patients with genotype CC at rs12979860 versus 12-16% for genotypes TT and CT (P < 0.03). A significant (P < 0.001) difference in viral decline was observed as early as day 1 (see Figure). The first-phase decline was significantly (P < 0.001) larger in patients with genotype CC (2.0 log) than for TT and CT genotypes (0.6 and 0.8), indicating an IFN anti-viral effectiveness in blocking virion production of 99% versus 75-84%. There was no significant association between second-phase slope and rs12979860 genotype in patients with a first-phase decline larger than 1 log. [Figure: HCV kinetics as a function of IL28B SNP.] The same trend (not shown) was observed for HCV genotype 2-3 patients, with a different SNP genotype distribution that may indicate differential selection pressure as a function of HCV genotype. Similar results were observed for SNPs rs8099917 and rs12980275, with a strong linkage disequilibrium among the 3 loci allowing definition of the composite haplotype best associated with IFN effectiveness. Conclusions: IFN effectiveness in blocking virion production/release is strongly affected by IL28B SNPs, whereas other viral-dynamic properties, such as the infected-cell loss rate, are not. Thus, IFN-based therapy, as standard-of-care or in combination with DAV, should consider IL28B SNPs for prediction and personalized treatment, while response to pure DAV treatment may be less affected by IL28B SNPs. Additional analyses are ongoing to pinpoint the SNP effect on IFN anti-viral effectiveness.
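Under the standard biphasic viral-kinetic model commonly used for such analyses (an assumption here, though the reported figures are consistent with it), the viral load at the end of the first phase settles near (1 - \varepsilon) V_0, where \varepsilon is the effectiveness of IFN in blocking virion production, so a first-phase log_{10} decline \Delta translates into effectiveness as

\varepsilon \approx 1 - 10^{-\Delta}.

\Delta = 2.0 gives \varepsilon \approx 0.99 (99%, genotype CC), while \Delta = 0.6 to 0.8 gives \varepsilon \approx 0.75 to 0.84 (75-84%, genotypes TT and CT), matching the values quoted above.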

Relevance:

30.00%

Publisher:

Abstract:

Game theory describes and analyzes strategic interaction. A distinction is usually made between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far. In the case of imperfect information, by contrast, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept that is capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness.
Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
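For reference, the form structures discussed above are standardly formalized as follows (textbook notation, not specific to this thesis): a normal form game is a tuple

\Gamma = \langle N, (S_i)_{i \in N}, (u_i)_{i \in N} \rangle,

where N is the finite set of players, S_i the strategy set of player i, and u_i : \prod_{j \in N} S_j \to \mathbb{R} its utility function; an extensive form additionally specifies a game tree, an assignment of decision nodes to players (or to chance), information sets partitioning each player's decision nodes, and payoffs at the terminal nodes, with perfect information corresponding to all information sets being singletons.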