950 results for non-parametric smoothing
Abstract:
Introduction: Fragile X-associated Tremor/Ataxia Syndrome (FXTAS) is a recently described and under-diagnosed late-onset (≈ 60 y) neurodegenerative disorder affecting male carriers of a premutation in the Fragile X Mental Retardation 1 (FMR1) gene. The premutation is a CGG (Cytosine-Guanine-Guanine) expansion (55 to 200 CGG repeats) in the proximal region of the FMR1 gene. Patients with FXTAS primarily present with cerebellar ataxia and intention tremor. Neuroradiological features of FXTAS include prominent white matter disease in the periventricular and subcortical regions, the middle cerebellar peduncles and the deep white matter of the cerebellum on T2-weighted or FLAIR MR imaging (Jacquemont 2007, Loesch 2007, Brunberg 2002, Cohen 2006). We hypothesize that significant white matter alteration is present in younger individuals many years before clinical symptoms and/or visible lesions on conventional MR sequences, and that it may be detectable by magnetization transfer (MT) imaging. Methods: Eleven asymptomatic premutation carriers (mean age = 55 years) and seven intra-familial controls participated in the study. A standardized neurological examination was performed on all participants, and a neuropsychological evaluation was carried out before MR scanning on a 3T Siemens Trio. The protocol included a sagittal T1-weighted 3D gradient-echo sequence (MPRAGE, 160 slices, 1 mm^3 isotropic voxels) and a gradient-echo MTI (FA 30°, TE 15, matrix size 256×256, pixel size 1×1 mm, 36 slices of 2 mm thickness, MT pulse duration 7.68 ms, FA 500°, frequency offset 1.5 kHz). MTI was performed by acquiring two sets of images consecutively, first with and then without the MT saturation pulse. The MT images were coregistered to the T1 acquisition. The MTR for every intracranial voxel was calculated as MTR = 100% × (M0 − MS)/M0, yielding an MTR map for each subject.
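As an editorial illustration (not part of the study), the voxel-wise MTR calculation described above can be sketched in Python, assuming the two co-registered acquisitions are available as NumPy arrays; the toy 2×2 "volumes" below are hypothetical:

```python
import numpy as np

def mtr_map(m0, ms, eps=1e-6):
    """Voxel-wise magnetization transfer ratio, in percent.

    m0 : volume acquired without the MT saturation pulse
    ms : co-registered volume acquired with the MT pulse
    """
    m0 = np.asarray(m0, dtype=float)
    ms = np.asarray(ms, dtype=float)
    # Guard against division by zero in background voxels
    mtr = np.where(m0 > eps, (m0 - ms) / np.maximum(m0, eps) * 100.0, 0.0)
    return mtr

# Toy example: 20% and 40% signal drops under MT saturation
m0 = np.array([[100.0, 100.0], [50.0, 0.0]])
ms = np.array([[80.0, 60.0], [40.0, 0.0]])
print(mtr_map(m0, ms))
```

In practice the two volumes would come from the coregistered MTI acquisitions, and the map would be masked to intracranial voxels before the histogram analysis.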
As a first analysis, the whole white matter (WM) was used to mask the MTR image in order to create a histogram of the MTR distribution over the whole tissue class for the two groups examined. Then, for each subject, we performed a segmentation and parcellation of the brain with the FreeSurfer software, starting from the high-resolution T1-weighted anatomical acquisition. The cortical parcellation was used to assign a label to the underlying white matter by constructing a Voronoi diagram in the WM voxels of the MR volume based on the distance to the nearest cortical parcellation label. This procedure allowed us to subdivide the cerebral WM into 78 ROIs according to the cortical parcellation (see example in Fig 1). By the same procedure, the cerebellum was subdivided into 5 ROIs (two per hemisphere and one corresponding to the brainstem). For each subject, we calculated the mean MTR value within each ROI and averaged over controls and patients. Significant differences between the two groups were tested using a two-sample t-test (p<0.01). Results: Neurological examination showed that no patient yet met the clinical criteria of Fragile X Tremor and Ataxia Syndrome. Nonetheless, premutation carriers showed subtle neurological signs of the disorder: a significant increase in tremor (CRST, t-test p=0.007) and in ataxia (ICARS, p=0.004) compared to controls. The neuropsychological evaluation was normal in both groups. To obtain a general characterization of myelination in each group, we first computed the distribution of MTR values across the total white matter volume and averaged it for each group. We tested the equality of the two distributions with the non-parametric Kolmogorov-Smirnov test and rejected the null hypothesis at p=0.03 (fig. 2).
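The whole-WM distribution comparison just described can be sketched as follows; the pooled MTR samples, group shift and spread are hypothetical stand-ins, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical whole-WM MTR values (percent), pooled per group:
# carriers shifted toward lower MTR with a wider spread, as reported
mtr_controls = rng.normal(38.0, 2.0, 5000)
mtr_carriers = rng.normal(36.5, 3.0, 5000)

# Two-sample Kolmogorov-Smirnov test on the two MTR distributions
stat, p = stats.ks_2samp(mtr_carriers, mtr_controls)
print(f"KS statistic = {stat:.3f}, p = {p:.2g}")
```

The same arrays could then be binned into the per-group histograms used to compare peak value, peak position and distribution width.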
As expected, when comparing the asymptomatic premutation carriers with control subjects, the peak value and peak position of the MTR distribution within the whole WM were decreased and the width of the distribution curve was increased (p<0.01). These three changes point to an alteration of the global myelin status of the premutation carriers. Subsequently, to analyze regional myelination and white matter integrity in the same group, we performed a ROI analysis of the MTR data. The ROI-based analysis showed a decrease of the mean MTR value in premutation carriers compared to controls in bilateral orbito-frontal and inferior frontal WM, in the entorhinal and cingulum regions, and in the cerebellum (Fig 3). These differences were not detectable with conventional MR techniques. Conclusions: These preliminary data confirm that premutation carriers show alterations in "normal-appearing white matter" (NAWM) and that these alterations are visible with the MT technique. These results indicate that MT imaging may be a relevant approach to detect both global and local alterations within the NAWM of "asymptomatic" carriers of premutations in the Fragile X Mental Retardation 1 (FMR1) gene. The sensitivity of MT in detecting these alterations might point towards a specific physiopathological mechanism linked to an underlying myelin disorder. ROI-based analyses show that the frontal, parahippocampal and cerebellar regions are already significantly affected before the onset of symptoms. A larger sample will allow us to determine the minimum CGG expansion and age associated with these subclinical white matter alterations.
Abstract:
INTRODUCTION No definitive data are available regarding the value of switching to an alternative TNF antagonist in rheumatoid arthritis (RA) patients who fail to respond to the first one. The aim of this study was to evaluate treatment response in a clinical setting, based on HAQ improvement and the EULAR response criteria, in RA patients who were switched to a second or third TNF antagonist after failure of the first. METHODS This was an observational, prospective study of a cohort of 417 RA patients treated with TNF antagonists in three university hospitals in Spain between January 1999 and December 2005. A database was created at the participating centres, with well-defined operational instructions. The main outcome variables were analyzed using parametric or non-parametric tests, depending on the level of measurement and distribution of each variable. RESULTS Mean (± SD) DAS-28 on starting the first, second and third TNF antagonist was 5.9 (± 2.0), 5.1 (± 1.5) and 6.1 (± 1.1), respectively. At the end of follow-up, it decreased to 3.3 (± 1.6; Δ = -2.6; p < 0.0001), 4.2 (± 1.5; Δ = -1.1; p = 0.0001) and 5.4 (± 1.7; Δ = -0.7; p = 0.06). For the first TNF antagonist, the DAS-28-based EULAR response level was good in 42% and moderate in 33% of patients. The second TNF antagonist yielded a good response in 20% and no response in 53% of patients, while the third yielded a good response in 28% and no response in 72%. Mean baseline HAQ on starting the first, second and third TNF antagonist was 1.61, 1.52 and 1.87, respectively. At the end of follow-up, it decreased to 1.12 (Δ = -0.49; p < 0.0001), 1.31 (Δ = -0.21; p = 0.004) and 1.75 (Δ = -0.12; p = 0.1), respectively. Sixty-four percent of patients had a clinically important improvement in HAQ (defined as a decrease ≥ 0.22) with the first TNF antagonist, and 46% with the second. CONCLUSION A clinically significant effect size was seen in less than half of the RA patients cycling to a second TNF antagonist.
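The DAS-28-based EULAR response levels reported above follow a published classification rule; a minimal sketch of that rule (assuming the standard cut-offs of 1.2/0.6 for DAS-28 improvement and 3.2/5.1 for the attained DAS-28) is:

```python
def eular_response(das28_baseline, das28_current):
    """DAS-28-based EULAR response category.

    Good:     attained DAS-28 <= 3.2 and improvement > 1.2
    None:     improvement <= 0.6, or improvement <= 1.2 with attained > 5.1
    Moderate: everything in between
    """
    improvement = das28_baseline - das28_current
    if improvement > 1.2 and das28_current <= 3.2:
        return "good"
    if improvement <= 0.6 or (improvement <= 1.2 and das28_current > 5.1):
        return "none"
    return "moderate"

# Mean trajectory on the first TNF antagonist reported above: 5.9 -> 3.3
print(eular_response(5.9, 3.3))  # -> moderate (Δ = 2.6, but attained 3.3 > 3.2)
```

Note that group means are used here only for illustration; the EULAR category is assigned per patient, not from cohort averages.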
Abstract:
INTRODUCTION The human host immune response following infection with the new variant of the A/H1N1 pandemic influenza virus (nvH1N1) is poorly understood. We used systemic cytokine and antibody levels to evaluate differences in the early immune response of mild and severe patients infected with nvH1N1. METHODS We profiled 29 cytokines and chemokines and evaluated haemagglutination inhibition activity as quantitative and qualitative measurements of the host immune response in serum obtained during the first five days after symptom onset, in two cohorts of nvH1N1-infected patients. Severe patients (n = 20) required hospitalization due to respiratory insufficiency (10 of them were admitted to the intensive care unit), while mild patients (n = 15) had exclusively flu-like symptoms. A group of healthy donors was included as controls (n = 15). Differences in mediator levels between groups were assessed using the non-parametric Mann-Whitney U test. Association between variables was determined by calculating the Spearman correlation coefficient. Viral load was measured in serum using real-time PCR targeting the neuraminidase gene. RESULTS Increased levels of innate-immunity mediators (IP-10, MCP-1, MIP-1β) and the absence of anti-nvH1N1 antibodies characterized the early response to nvH1N1 infection in both hospitalized and mild patients. High systemic levels of type-II interferon (IFN-γ), and of a group of mediators involved in the development of T-helper 17 (IL-8, IL-9, IL-17, IL-6) and T-helper 1 (TNF-α, IL-15, IL-12p70) responses, were found exclusively in hospitalized patients. IL-15, IL-12p70 and IL-6 constituted a hallmark of critical illness in our study. A significant inverse association was found between IL-6, IL-8 and PaO2 in critical patients.
CONCLUSIONS While infection with the nvH1N1 induces a typical innate response in both mild and severe patients, severe disease with respiratory involvement is characterized by early secretion of Th17 and Th1 cytokines usually associated with cell mediated immunity but also commonly linked to the pathogenesis of autoimmune/inflammatory diseases. The exact role of Th1 and Th17 mediators in the evolution of nvH1N1 mild and severe disease merits further investigation as to the detrimental or beneficial role these cytokines play in severe illness.
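The two statistical procedures named in the Methods above (Mann-Whitney U for group differences, Spearman for associations between mediators) can be sketched on simulated data; the IL-6/IL-8 values below are hypothetical, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical serum IL-6 levels (pg/mL), log-normal as cytokines often are
il6_mild = rng.lognormal(mean=1.0, sigma=0.5, size=15)
il6_severe = rng.lognormal(mean=2.0, sigma=0.5, size=20)

# Non-parametric two-group comparison (Mann-Whitney U test)
u, p = stats.mannwhitneyu(il6_severe, il6_mild, alternative="two-sided")
print(f"U = {u:.0f}, p = {p:.2g}")

# Association between two mediators (Spearman rank correlation);
# il8 is correlated with il6 by construction
il8 = il6_severe * rng.lognormal(0.0, 0.2, size=20)
rho, p_rho = stats.spearmanr(il6_severe, il8)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.2g}")
```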
Abstract:
INTRODUCTION AND AIMS: Inpatient satisfaction with the food served within a hospital care system remains one of the main targets of efforts to modernize food services. The impact of type of menu, food category, hospital centre and timetable on the meal wastage produced in different Spanish healthcare settings was evaluated. METHODS: Meal wastage was measured with a semiquantitative 5-point scale ("nothing on plate", "¼ on plate", "half on plate", "¾ on plate" and "all on plate"). The study was carried out in two three-month periods, in 2010 and 2011. A trained person measured plate waste for 726 servings belonging to 11 menus. In total, 31,392 plates were served to 7,868 inpatients. The Kruskal-Wallis non-parametric test (p < 0.05) was applied to evaluate significant differences among the variables studied. RESULTS: The menus were satisfactorily consumed, as more than 50% of the plates were classified as "nothing on plate". Regarding food categories, 26.78% of the plates corresponded to soups and purées, while pasta and rice, and prepared foods, were distributed in only 4-5% of the servings. Desserts were the most fully consumed, while cooked vegetables were less accepted by the inpatients evaluated. Other factors such as hospital centre influenced plate waste (p < 0.05), but timetable did not (p > 0.05). CONCLUSION: Visual inspection of plate waste might be useful to optimize the type and quality of the menus served. The type of menu served and the food category could have a great influence on food acceptability by the inpatients studied.
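A hedged sketch of the Kruskal-Wallis comparison described above, using hypothetical plate-waste scores coded on the study's 5-point scale (the per-centre probabilities are invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Plate-waste scores: 0 = "nothing on plate" ... 4 = "all on plate"
# Hypothetical samples from three hospital centres
centre_a = rng.choice([0, 1, 2, 3, 4], size=200, p=[0.55, 0.20, 0.12, 0.08, 0.05])
centre_b = rng.choice([0, 1, 2, 3, 4], size=200, p=[0.40, 0.25, 0.15, 0.12, 0.08])
centre_c = rng.choice([0, 1, 2, 3, 4], size=200, p=[0.60, 0.20, 0.10, 0.06, 0.04])

# Kruskal-Wallis: does plate waste differ across centres?
h, p = stats.kruskal(centre_a, centre_b, centre_c)
print(f"H = {h:.2f}, p = {p:.3g}")
```

The Kruskal-Wallis test is an appropriate choice here because the 5-point waste scale is ordinal, and more than two groups (centres, menus, food categories) are compared.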
Abstract:
Objective: To assess the results of Autologous Chondrocyte Implantation (ACI) with a periosteal patch and to publicize the existing care circuits in Andalusia. Material and Methods: Since its official licence in 2005, the tissue bank and the Virgen de la Victoria Hospital in Málaga have performed ACI in the Andalusian public health system. Sixteen patients from public hospitals throughout Andalusia were operated on between 2006 and 2013, with a mean follow-up of 47.6 months (6 months-6 years), managed through hospital admission source and destination. Physiologically younger patients (<50 years) were selected, with single, symptomatic chondral lesions >2 cm² in stable, well-aligned knees. ACI was used as a rescue procedure after microfracture failure, except in osteochondritis dissecans. Results were assessed with the Cincinnati score and the Short Form 36 (SF-36) score. A descriptive analysis was performed, and non-parametric tests were used to establish correlations and compare results. Results: In 15 patients with more than one year of follow-up (14 men (87.5%) and 2 women (14.5%); mean age 28.2 years, range 17-43), the lesion was located in the femoral condyle, mostly the internal one (81.2%), with a mean size of 2.7 cm² (2-4.2). We found significant improvement (p<0.001) in daily activities (89.3% with limitation preoperatively vs. 9% postoperatively), in sports (90.2% preoperatively vs. 38% postoperatively) and on examination of the knee (67.7% with pathological findings preoperatively vs. 13.3% postoperatively). The SF-36 score improved in all categories, above all in mental health (p>0.01). Patient satisfaction was high or very high in 12 of the 15 patients (80%) and low in 3 patients. Conclusions: ACI improves quality of life and knee function in femoral condyle chondral lesions. Careful case selection and collaboration with the tissue bank allowed us to create care circuits for treating patients from other provinces within the Andalusian public health system.
It is necessary to gain further experience with this type of therapy, consolidating multicentre working groups that lend strength to the conclusions.
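For paired pre/post scores like those reported above, a common non-parametric choice is the Wilcoxon signed-rank test; the abstract does not name the specific test used, so this is only an illustrative sketch on hypothetical knee scores:

```python
from scipy import stats

# Hypothetical pre/post Cincinnati-style knee scores for 15 patients
pre  = [45, 50, 38, 60, 55, 42, 48, 52, 40, 58, 46, 44, 62, 49, 51]
post = [78, 82, 70, 85, 80, 72, 75, 79, 68, 88, 74, 71, 90, 77, 81]

# Paired non-parametric comparison (Wilcoxon signed-rank test):
# every patient improved, so the rank sum of negative differences is 0
w, p = stats.wilcoxon(pre, post)
print(f"W = {w:.0f}, p = {p:.2g}")
```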
Abstract:
Our project aims to analyze the relevance of economic factors (mainly income and other socioeconomic characteristics of Spanish households, and market prices) to the prevalence of obesity in Spain, and to assess to what extent, and under what circumstances, market intervention prices are effective in reducing obesity and improving the quality of the diet. In relation to the existing literature worldwide, this project is the first attempt in Spain to obtain an overall picture of the effectiveness of public policies on food consumption and the quality of diet, on the one hand, and on the prevalence of obesity on the other. The project consists of four main parts. The first part is a critical review of the literature on the economic approach to obesity prevalence, diet quality and public intervention policies. Although another important body of the obesity literature deals with physical exercise, we limit our attention to studies related to food consumption, in keeping with the scope of our study and because several published reviews already cover physical exercise and its effect on obesity prevalence. The second part consists of a parametric and non-parametric analysis of the role of economic factors in obesity prevalence in Spain. The third part seeks to overcome the shortcomings of many diet quality indices developed during the last decades, such as the Healthy Eating Index, the Diet Quality Index, the Healthy Diet Indicator and the Mediterranean Diet Score, through the development of a new obesity-specific diet quality index. The last part of the project concentrates on assessing the effectiveness of market intervention policies to improve the healthiness of the Spanish diet, using the new Exact Affine Stone Index (EASI) demand system.
Abstract:
As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as "a trace too small to measure", it seems reasonable to replace it by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts (and thus the metric properties) should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved.
As a generalization of the multiplicative replacement, a substitution method for missing values in compositional data sets is introduced in the same paper.
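A minimal sketch of the multiplicative replacement discussed above, assuming for simplicity a single common replacement value δ for all rounded zeros (the general method allows a part-specific δ_j):

```python
import numpy as np

def multiplicative_replacement(x, delta, total=1.0):
    """Multiplicative rounded-zero replacement for a closed composition.

    Zeros become delta; non-zero parts are rescaled multiplicatively so the
    composition still sums to `total`, preserving the ratios (and hence the
    covariance structure) of the non-zero parts.
    """
    x = np.asarray(x, dtype=float)
    zeros = x == 0
    return np.where(zeros, delta, x * (1.0 - delta * zeros.sum() / total))

comp = np.array([0.60, 0.25, 0.15, 0.0])   # closed to 1, one rounded zero
repl = multiplicative_replacement(comp, delta=0.005)
print(repl, repl.sum())
```

After replacement the composition still sums to 1, and the ratio of any two non-zero parts (e.g. 0.60/0.25) is unchanged, which is exactly the coherence property the text emphasizes.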
Abstract:
There is hardly a case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot co-exist with nepheline. Another common essential zero is a North azimuth; however, we can always change that zero for the value of 360°. These are known as "essential zeros", but what can we do with "rounded zeros" that result from values below the detection limit of the equipment? Amalgamation, e.g. adding Na2O and K2O as total alkalis, is one solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same occurs if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between copper and molybdenum values, but while copper will always be above the detection limit, many of the molybdenum values will be "rounded zeros".
So, we take the lower quartile of the real molybdenum values, establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable.
Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
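The proposed lower-quartile regression imputation can be sketched as follows; the Cu/Mo assays are simulated with a log-linear relationship, and the 0.5 ppm detection limit is an assumed value for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical Cu (ppm) and Mo (ppm) assays with a log-linear relationship
cu = rng.lognormal(mean=5.0, sigma=0.8, size=300)
mo = 0.01 * cu ** 0.9 * rng.lognormal(0.0, 0.3, size=300)
mo_censored = np.where(mo < 0.5, 0.0, mo)  # values below 0.5 ppm reported as 0

detected = mo_censored > 0
# Fit log(Mo) ~ log(Cu) on the lower quartile of the detected Mo values,
# i.e. the detected samples closest to the detection limit
q1 = np.quantile(mo_censored[detected], 0.25)
fit_mask = detected & (mo_censored <= q1)
slope, intercept = np.polyfit(np.log(cu[fit_mask]), np.log(mo_censored[fit_mask]), 1)

# Estimate each "rounded zero" of Mo from its corresponding Cu value
mo_imputed = mo_censored.copy()
mo_imputed[~detected] = np.exp(intercept + slope * np.log(cu[~detected]))
print(f"imputed {np.sum(~detected)} censored values, "
      f"max imputed = {mo_imputed[~detected].max():.3f} ppm")
```

As the text notes, each imputed value depends on the paired copper assay rather than being a single fixed constant.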
Abstract:
ABSTRACT Copula theory was used to analyze contagion between the BRIC (Brazil, Russia, India and China) and European Union stock markets and the U.S. equity market. The market indexes used for the period between January 1, 2005 and February 27, 2010 are: MXBRIC (BRIC), MXEU (European Union) and MXUS (United States). This article evaluated the adequacy of the main copulas found in the financial literature using the log-likelihood, Akaike information and Bayesian information criteria. It provides a groundbreaking study in the area of contagion through the use of conditional copulas, allowing the increase in correlation between indexes to be calculated with a non-parametric approach. The conditional Symmetrized Joe-Clayton copula provided the best fit for the pairs of returns considered. Results indicate evidence of a contagion effect in both the European Union and the BRIC markets, at a 5% significance level. Furthermore, there is also evidence that the contagion of the U.S. financial crisis was more pronounced in the European Union than in the BRIC markets, at a 5% significance level. Therefore, stock portfolios formed by equities from the BRIC countries were able to offer greater protection during the subprime crisis. The results are aligned with recent papers that report an increase in correlation between stock markets, especially in bear markets.
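The fit-by-log-likelihood/AIC workflow described above can be illustrated with a simpler copula than the Symmetrized Joe-Clayton used in the paper; the sketch below fits an ordinary Clayton copula (which, like the SJC, captures lower-tail dependence) by maximum likelihood to simulated uniform margins standing in for the pseudo-observations of two return series:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def clayton_loglik(theta, u, v):
    """Log-likelihood of the Clayton copula with dependence parameter theta > 0."""
    t = u ** -theta + v ** -theta - 1.0
    return np.sum(np.log1p(theta)
                  - (theta + 1.0) * (np.log(u) + np.log(v))
                  - (2.0 + 1.0 / theta) * np.log(t))

rng = np.random.default_rng(4)
theta_true = 2.0
# Simulate Clayton-dependent pairs by conditional inversion
u = np.clip(rng.uniform(size=2000), 1e-12, 1 - 1e-12)
w = np.clip(rng.uniform(size=2000), 1e-12, 1 - 1e-12)
v = (u ** -theta_true * (w ** (-theta_true / (1 + theta_true)) - 1) + 1) ** (-1 / theta_true)

# In practice u, v would be pseudo-observations (scaled ranks) of two index
# return series; here they are simulated directly on (0, 1).
res = minimize_scalar(lambda th: -clayton_loglik(th, u, v),
                      bounds=(0.01, 10), method="bounded")
theta_hat = res.x
aic = 2 * 1 - 2 * clayton_loglik(theta_hat, u, v)  # one fitted parameter
print(f"theta_hat = {theta_hat:.2f}, AIC = {aic:.1f}")
```

Competing copula families would each be fitted this way and ranked by log-likelihood, AIC or BIC, mirroring the model-selection step in the abstract.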
Abstract:
Objective: To test the efficacy of teaching motivational interviewing (MI) to medical students. Methods: Thirteen 4th-year medical students volunteered to participate. Seven days before and 7 days after an 8-hour interactive MI training workshop, each student performed a video-recorded interview with two standardized patients: a 60-year-old alcohol-dependent woman and a 50-year-old cigarette-smoking man. Students' counseling skills were coded by two blinded clinicians using the Motivational Interviewing Treatment Integrity 3.0 (MITI). Inter-rater reliability was calculated for all interviews, and a test-retest was completed in a sub-sample of 10 consecutive interviews three days apart. Differences between MITI scores before and after training were calculated and tested using non-parametric tests. Effect size was approximated by calculating the probability that post-test scores are greater than pre-test scores (P* = P(Pre<Post) + 1/2 P(Pre=Post)), with P* > 1/2 indicating greater post-test scores, P* = 1/2 no effect, and P* < 1/2 smaller post-test scores. Results: Median differences between MITI scores before and after MI training indicated a general progression in MI skills: MI spirit global score (median difference=1.5, inter-quartile range=1.5, p<0.001, P*=0.90); Empathy global score (med diff=1, IQR=0.5, p<0.001, P*=0.85); percentage of MI-adherent skills (med diff=36.6, IQR=50.5, p<0.001, P*=0.85); percentage of open questions (med diff=18.6, IQR=21.6, p<0.001, P*=0.96); reflections/questions ratio (med diff=0.2, IQR=0.4, p<0.001, P*=0.81). Only the Direction global score and the percentage of complex reflections were not significantly improved (med diff=0, IQR=1, p=0.53, P*=0.44, and med diff=4.3, IQR=24.8, p=0.48, P*=0.62, respectively). Inter-rater reliability indicated weighted kappa ranging from 0.14 (Direction) to 0.51 (Collaboration) and ICC ranging from 0.28 (Simple reflection) to 0.95 (Closed question).
Test-retests indicated weighted kappa ranging from 0.27 (Direction) to 0.80 (Empathy) and ICC ranging from 0.87 (Complex reflection) to 0.98 (Closed question). Conclusion: This pilot study indicated that an 8-hour MI training for volunteer 4th-year medical students resulted in significant improvement of MI skills. A larger sample of unselected medical students should be studied to generalize the benefit of MI training. Inter-rater reliability and test-retests suggested that coders' training should be intensified.
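The effect-size approximation P* defined above can be computed directly; here it is applied over paired pre/post observations, using hypothetical MITI empathy scores:

```python
def p_star(pre, post):
    """Probabilistic effect size P* = P(Pre < Post) + 0.5 * P(Pre = Post),
    computed here over paired pre/post observations.
    P* > 0.5 indicates higher post-training scores."""
    pairs = list(zip(pre, post))
    less = sum(a < b for a, b in pairs)
    equal = sum(a == b for a, b in pairs)
    return (less + 0.5 * equal) / len(pairs)

# Hypothetical MITI empathy global scores for 6 students
pre  = [2.0, 2.5, 3.0, 2.0, 3.5, 2.5]
post = [3.5, 3.5, 4.0, 2.0, 4.0, 3.0]
print(p_star(pre, post))  # -> 0.9166... (5 of 6 improved, 1 unchanged)
```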
Abstract:
How much would output increase if underdeveloped economies were to increase their levels of schooling? We contribute to the development accounting literature by describing a non-parametric upper bound on the increase in output that can be generated by more schooling. The advantage of our approach is that the upper bound is valid for any number of schooling levels with arbitrary patterns of substitution/complementarity. Another advantage is that the upper bound is robust to certain forms of endogenous technology response to changes in schooling. We also quantify the upper bound for all economies with the necessary data, compare our results with the standard development accounting approach, and provide an update on the results using the standard approach for a large sample of countries.
Abstract:
The paper proposes an approach aimed at detecting the model parameter combinations that give the most representative description of uncertainty in model performance. A classification problem is posed to find the regions of well-fitting models according to the values of a cost function. Support Vector Machine (SVM) classification in the parameter space is applied to decide whether a forward model simulation should be computed for a particular generated model. SVM is particularly well suited to classification problems in high-dimensional spaces, in a non-parametric and non-linear way. The SVM decision boundaries determine the regions that are subject to the largest uncertainty in the cost-function classification and therefore provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated by a synthetic example of fluid flow through porous media, which features a highly variable response due to combinations of parameter values.
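A hedged sketch of this workflow, assuming a hypothetical 2-D parameter space and cost threshold (the cost surface, threshold and sample sizes are illustrative, not from the paper):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(5)

def cost(params):
    """Hypothetical misfit surface over a 2-D model parameter space:
    models near (0.3, 0.7) fit the observed data well."""
    return np.sum((params - np.array([0.3, 0.7])) ** 2, axis=1)

# Sample candidate models and label them good/bad by a cost threshold
params = rng.uniform(0, 1, size=(400, 2))
labels = (cost(params) < 0.05).astype(int)

# Non-linear, non-parametric classifier of the "good model" region
svm = SVC(kernel="rbf", gamma="scale").fit(params, labels)

# Candidates closest to the decision boundary are the most uncertain:
# these are the models worth running the expensive forward simulation on
candidates = rng.uniform(0, 1, size=(1000, 2))
uncertainty = np.abs(svm.decision_function(candidates))
to_simulate = candidates[np.argsort(uncertainty)[:20]]
print(to_simulate.shape)
```

Iterating this loop (simulate the uncertain candidates, relabel, refit) progressively refines the boundary of the good-fitting region while avoiding forward runs for confidently classified models.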
Abstract:
The sample dimension, types of variables, format used for measurement, and construction of instruments to collect valid and reliable data must be considered during the research process. In the social and health sciences, and more specifically in nursing, data-collection instruments are usually composed of latent variables or variables that cannot be directly observed. Such facts emphasize the importance of deciding how to measure study variables (using an ordinal scale or a Likert or Likert-type scale). Psychometric scales are examples of instruments that are affected by the type of variables that comprise them, which could cause problems with measurement and statistical analysis (parametric tests versus non-parametric tests). Hence, investigators using these variables must rely on suppositions based on simulation studies or recommendations based on scientific evidence in order to make the best decisions.
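The parametric-versus-non-parametric decision discussed above can be illustrated on a small, hypothetical example: a t-test (which treats the Likert codes as interval data) and a Mann-Whitney U test (which uses only their rank order) applied to the same two groups of 5-point Likert responses:

```python
from scipy import stats

# Hypothetical 5-point Likert responses from two groups of respondents
group_a = [1, 2, 2, 3, 3, 3, 4, 2, 3, 2, 1, 3]
group_b = [3, 4, 4, 5, 3, 4, 5, 4, 3, 5, 4, 4]

# Parametric choice: treats the ordinal codes as interval data
t, p_t = stats.ttest_ind(group_a, group_b)
# Non-parametric choice: uses only the rank ordering of the responses
u, p_u = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"t-test p = {p_t:.4g}; Mann-Whitney p = {p_u:.4g}")
```

The two tests may agree, as here, but they rest on different assumptions about the measurement level of the scale, which is precisely the decision the passage asks investigators to justify.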
Abstract:
Aim: We asked whether myocardial flow reserve (MFR) measured by Rb-82 cardiac PET improves the selection of patients eligible for invasive coronary angiography (ICA). Material and Methods: We enrolled 26 consecutive patients with suspected or known coronary artery disease who underwent dynamic Rb-82 PET/CT and ICA within 60 days; 4 patients who underwent revascularization or had any cardiovascular event between PET and ICA were excluded. Myocardial blood flow at rest (rMBF), at stress with adenosine (sMBF) and myocardial flow reserve (MFR = sMBF/rMBF) were estimated using the 1-compartment Lortie model (FlowQuant) for each coronary artery territory. Stenosis severity was assessed using computer-based automated edge detection (QCA). MFR was divided into 3 groups: G1: MFR<1.5, G2: 1.5≤MFR<2 and G3: MFR≥2. Stenosis severity was graded as non-significant (<50% or FFR ≥0.8), intermediate (50%≤stenosis<70%) and severe (≥70%). The correlation between MFR and percentage of stenosis was assessed using the non-parametric Spearman test. Results: In G1 (44 vessels), 17 vessels (39%) had a severe stenosis, 11 (25%) an intermediate one, and 16 (36%) no significant stenosis. In G2 (13 vessels), 2 vessels (15%) presented a severe stenosis, 7 (54%) an intermediate one, and 4 (31%) no significant stenosis. In G3 (9 vessels), no vessel presented a severe stenosis, 1 (11%) an intermediate one, and 8 (89%) no significant stenosis. Of note, among the 11 patients with 3-vessel low MFR<1.5 (G1), 9/11 (82%) had at least one severe stenosis and 2/11 (18%) had at least one intermediate stenosis. There was a significant inverse correlation between stenosis severity and MFR across all 66 territories analyzed (rho = -0.38, p=0.002). Conclusion: Patients with MFR>2 could avoid ICA. Low MFR (G1, G2) on a vessel-based analysis seems to be a poor predictor of severe stenosis. Patients with 3-vessel low MFR would benefit from ICA, as they are likely to present a significant stenosis in at least one vessel.
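The MFR grouping and the Spearman correlation described above can be sketched on hypothetical per-territory values (the numbers below are illustrative, not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical per-territory values: stenosis severity (%) and MFR
stenosis = np.array([80, 75, 90, 55, 60, 30, 20, 10, 40, 65, 85, 25])
mfr      = np.array([1.1, 1.3, 0.9, 1.7, 1.6, 2.4, 2.6, 2.8, 2.1, 1.4, 1.0, 2.3])

# MFR groups as defined above: G1 < 1.5 <= G2 < 2 <= G3
groups = np.digitize(mfr, [1.5, 2.0]) + 1
print("group sizes:", np.bincount(groups)[1:])

# Inverse association between stenosis severity and MFR (Spearman)
rho, p = stats.spearmanr(stenosis, mfr)
print(f"rho = {rho:.2f}, p = {p:.3g}")
```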
Abstract:
Cases of fraud have occurred frequently in the world market. Many professionals are involved in these cases, including accountants. Accounting scandals, especially the most famous ones, such as those at Enron and WorldCom, have raised concern about the ethical conduct of accounting professionals. As a consequence, there is now greater demand for transparency and reliability in the information these professionals provide. This concern aims, above all, to maintain the confidence of companies, investors, suppliers and society in general in the ethical responsibility of the accountant, which has been tarnished by involvement in the frauds detected. This study therefore aimed to examine the ethical conduct of accountants when, in the exercise of their profession, they are confronted with issues related to fraud. To this end, it considered factors that may influence an individual's ethical decision-making process, represented by the decision-making model developed by Alves, as well as factors that may motivate an individual to commit fraud, represented by the model developed by Cressey. To answer the guiding question of this research, descriptive and statistical analyses of the data were carried out. For the descriptive analysis, frequency tables were produced. For the statistical analysis, the non-parametric Spearman test was used. The results showed that most of the accountants in the sample recognize the moral issue embedded in the scenarios, disagree with the acts of the agents in each scenario, and classify those acts as serious or very serious. The research revealed that these professionals lean towards the teleological current, since the intention to act is more strongly influenced by factors such as opportunity, rationalization and, above all, pressure. Some individual factors also influence the ethical positioning of the accountants interviewed in this research.