961 results for arguments in favor
Abstract:
Motor inhibitory control plays a central role in adaptive behavior across the entire lifespan. Inhibitory motor control refers to the ability to stop all (global) or part (selective) of a planned or ongoing motor action. Although the neural processing underlying global inhibitory control has received much attention from cognitive neuroscientists, the brain modulations that occur during selective inhibitory motor control remain largely unknown. The aim of the present thesis is to investigate the spatio-temporal brain processes of selective inhibitory motor control in young and old adults using high-density electroencephalography. In the first part, we focus on the early (preparatory period) spatio-temporal brain processes involved in selective and global inhibitory control in young (study I) and old adults (study II) using a modified Go/No-go task. In study I, we distinguished global from selective inhibition at the early attentional stage of inhibitory control and provided neurophysiological evidence in favor of the combination model. In study II, we showed an under-recruitment of neural resources associated with preserved performance in old adults during selective inhibition, suggesting efficient cerebral and behavioral adaptations to environmental changes. In the second part, we investigate beta oscillations in the late (post-execution period) spatio-temporal brain processes of selective inhibition during a motor switching task (i.e., a tapping movement switching from bimanual to unimanual) in young (study III) and old adults (study IV). In study III, we identified concomitant beta synchronization related (i) to sensory reafference processes, which enabled stabilization of the movement that was perturbed after switching, and (ii) to active inhibition processes that prevented movement of the stopping hand. In study IV, we demonstrated a larger beta synchronization in frontal and parietal regions in old adults compared to young adults, suggesting age-related brain modulations in active inhibition processes. Apart from contributing to a basic understanding of the electrocortical dynamics underlying inhibitory motor control, the findings of the present studies contribute to the knowledge needed to establish specific training programs for aging populations.
Abstract:
The numerical relation of keratinocytes to melanocytes was studied in café au lait spots and adjacent normally pigmented skin of 9 patients with classical neurofibromatosis. Compared with the normal skin of healthy individuals, the keratinocyte:melanocyte ratio distributions obtained in neurofibromatosis were shifted toward lower values in biopsies of both café au lait spots and normally pigmented skin. These results are evidence in favor of impaired tissue organization of the epidermis in neurofibromatosis with regard to the keratinocyte-melanocyte relationship.
Abstract:
OBJECTIVE: Previous literature has suggested that laws and regulations may impact the use of palliative sedation. Our present study compares the attitudes of French-speaking physicians practicing in the Quebec and Swiss environments, where different laws are in place regarding physician-assisted suicide. METHOD: Data were drawn from two prior studies, one by Blondeau and colleagues and another by Beauverd and coworkers, employing the same two-by-two experimental design with length of prognosis and type of suffering as independent variables. Both the effect of these variables and the effect of their interaction on Swiss and Quebec physicians' attitudes toward sedation were compared. The written comments of respondents were submitted to a qualitative content analysis and summarized in a comparative perspective. RESULTS: The analysis of variance showed that only the type of suffering had an effect on physicians' attitudes toward sedation. The results of the Wilcoxon test indicated that the attitudes of physicians from Quebec and Switzerland tended to be different for two vignettes: long-term prognosis with existential suffering (p = 0.0577) and short-term prognosis with physical suffering (p = 0.0914). In both cases, the Swiss physicians were less prone to palliative sedation. SIGNIFICANCE OF RESULTS: The attitudes of physicians from Quebec and Switzerland toward palliative sedation, particularly regarding prognosis and type of suffering, seem similar. However, the results suggest that physicians from Quebec could be slightly more open to palliative sedation, even though most were not in favor of this practice as an answer to end-of-life existential suffering.
Abstract:
BACKGROUND The prevalence of genotypes of the 677C>T polymorphism of the MTHFR gene varies among humans. In previous studies, we found changes in the genotypic frequencies of this polymorphism in populations of different ages, suggesting that this could be caused by an increase in the intake of folate and multivitamins by women during the periconceptional period. The aim was to analyze changes in the allelic frequencies of this polymorphism in a Spanish population, including samples from spontaneous abortions (SA). METHODS A total of 1305 subjects born in the 20th century were genotyped for the 677C>T polymorphism using allele-specific real-time PCR with TaqMan probes. A subset of our population (n = 276) born in 1980-1989 was compared with fetal samples (n = 344) from SA of unknown etiology from the same period. RESULTS An increase in the frequency of the T allele (0.38 vs 0.47; p < 0.001) and of the TT genotype (0.14 vs 0.24; p < 0.001) was observed in subjects born in the last quarter of the century. In the 1980-1989 period, the results show that the frequency of the wild-type genotype (CC) is about tenfold lower in the SA samples than in the controls (0.03 vs 0.33; p < 0.001) and that the frequency of the TT genotype increases both in the controls (0.19 to 0.27) and in the SA samples (0.20 to 0.33; p < 0.01); r = 0.98. CONCLUSION Selection in favor of the T allele has been detected. This selection could be due to increased fetal viability in the early stages of embryonic development, as deduced from the increase of the mutant allele in both the living and the SA populations.
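The cohort comparison reported above amounts to testing whether the T-allele frequency differs between two groups of chromosomes. A minimal sketch of such a test is shown below; the counts are hypothetical placeholders reconstructed from the reported frequencies (0.38 vs 0.47), not the study's actual sample sizes.

```python
from scipy.stats import chi2_contingency

# Hypothetical allele counts reconstructed from the reported frequencies
# (T-allele frequency 0.38 in earlier birth cohorts vs 0.47 in later ones);
# the real study's chromosome counts would replace these placeholders.
early_cohort = {"C": 620, "T": 380}   # 1000 chromosomes, f(T) = 0.38
late_cohort  = {"C": 530, "T": 470}   # 1000 chromosomes, f(T) = 0.47

table = [
    [early_cohort["C"], early_cohort["T"]],
    [late_cohort["C"],  late_cohort["T"]],
]

# Chi-square test of homogeneity: does f(T) differ between the two cohorts?
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
```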
Abstract:
La perception de la productivité. SUMMARY: The main objective of this thesis is the perception of productivity in the luxury hospitality industry. Despite the considerable effort already devoted to this concept in the field of goods production, productivity still remains to be defined in the services sector, and even more so in the luxury hospitality industry. Since the object of this study is the perception of productivity, we decided to analyze the elements considered relevant by top management in this field. It then seemed important to evaluate the same elements as assessed by middle management and by line employees. As perception is not static, it is indirectly linked to its own improvement and also to the means of improving it. The working hypothesis of our study concerns the possible relationship between productivity and its perception (P), quality (Q), and profitability (R). On this basis we built the P-Q-R model: R = f(P, Q). Our research on this model ultimately enabled us to establish a mathematical relation between the three predetermined constructs: fR = fP + fQ + c. This means that the profitability function of a service process (fR) is the sum of its quality function, intrinsic and extrinsic (fP), and its productivity function (fQ), plus the regression constant c. To increase profitability most significantly, productivity and quality must be increased at the same time. Conversely, according to this formula, and also according to managers' perception, at constant profitability either productivity decreases in favor of an increase in quality, or the reverse. If the dimensions of the model influence the service production process positively or negatively, then they will influence the constructs of our model (P, Q, R) in the same way. We argue that profitability depends on labor productivity, which follows the same dynamics as the perception of productivity. Identifying labor productivity as an essential element of successful hotel management is fundamental. The question that remains open, however, is the relevance of the concept of "labor productivity" for the luxury hospitality industry. No clear correlation between this notion and profitability has been demonstrated; we remain at the stage of perception. It follows that one promising avenue for future research will be the study of this correlation. As in any luxury industry, the real added value lies not in the volume produced or the speed with which the product/service is delivered, but in the creativity embodied in the result. Note that the field of luxury is strongly tied to emotions and to the experience provided to customers.
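The additive relation fR = fP + fQ + c described above is, in effect, a linear regression of a profitability measure on quality and productivity measures. The sketch below is a minimal illustration of how such a relation could be estimated from data; the toy figures are hypothetical, and the regression estimates free coefficients rather than fixing them at one as the abstract's additive form suggests.

```python
import numpy as np

# Hypothetical observations: each row is one hotel/period, with a quality
# score (f_P), a productivity score (f_Q), and a profitability measure (f_R),
# following the labeling used in the abstract.
f_P = np.array([0.62, 0.70, 0.55, 0.80, 0.66, 0.74])
f_Q = np.array([0.71, 0.65, 0.60, 0.85, 0.78, 0.69])
f_R = np.array([1.40, 1.42, 1.21, 1.73, 1.52, 1.50])

# Design matrix [f_P, f_Q, 1]; the last coefficient is the regression constant c.
X = np.column_stack([f_P, f_Q, np.ones_like(f_P)])
coef, *_ = np.linalg.lstsq(X, f_R, rcond=None)

b_P, b_Q, c = coef
print(f"estimated model: fR = {b_P:.2f}*fP + {b_Q:.2f}*fQ + {c:.2f}")
```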
Abstract:
Colorectal cancer is a heterogeneous disease that manifests through diverse clinical scenarios. For many years, our knowledge about the variability of colorectal tumors was limited to histopathological analysis, from which generic classifications associated with different clinical expectations were derived. However, we are now beginning to understand that beneath the marked pathological and clinical variability of these tumors lies strong genetic and biological heterogeneity. Thus, with the increasing available information on inter-tumor and intra-tumor heterogeneity, the classical pathological approach is being displaced in favor of novel molecular classifications. In the present article, we summarize the most relevant proposals for molecular classifications obtained from the analysis of colorectal tumors using powerful high-throughput techniques and devices. We also discuss the role that cancer systems biology may play in the integration and interpretation of the large amount of data generated, and the challenges to be addressed in the future development of precision oncology. In addition, we review the current state of implementation of these novel tools in the pathology laboratory and in clinical practice.
Abstract:
Computed tomography (CT) is an imaging technique in which interest has been growing steadily since it first came into use in the early 1970s. In the clinical environment, this imaging system has emerged as the gold-standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure that the benefit-risk balance works in favor of the patient, it is important to balance image quality and dose in order to avoid unnecessary patient exposure. If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases requiring several follow-up examinations over the patient's lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults; for this population, the risk of developing a cancer whose latency period exceeds 20 years is significantly higher than for adults. Assuming that each examination is justified, it then becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient. CT technology has been advancing at a rapid pace, and since 2009 new iterative image reconstruction techniques, called statistical iterative reconstructions, have been introduced in order to decrease patient exposure and improve image quality. The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce the dose as much as possible in CT examinations of children and young adults without compromising the image quality needed for diagnosis, in order to propose optimized protocols. The optimization step requires evaluating both the delivered dose and the image quality useful for diagnosis. While the dose is estimated using CT dose indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first approach, called the "physical" approach, computes physical metrics (SD, MTF, NPS, etc.) measured on phantoms under well-defined conditions. Although this approach has limitations because it does not take the radiologist's perception into account, it enables certain image properties to be characterized in a simple and rapid way. The second approach, called the "clinical" approach, was based on the evaluation of anatomical structures (diagnostic criteria) present in patient images. Radiologists involved in the assessment step were asked to rate the quality of these structures for diagnostic purposes using a simple rating scale. This approach is relatively complicated to implement and time-consuming; nevertheless, it has the advantage of being very close to the radiologist's practice and can be considered a reference method. Among the main results of this work, it was shown that the statistical iterative reconstruction algorithms studied in the clinic (ASIR and VEO) have a strong potential to reduce CT dose (by up to 90%). However, by their very mechanism, they modify the appearance of the image, producing a change in texture that may affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise frequency spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This work also shows that the integration of these new reconstruction techniques into clinical practice cannot be done simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the image quality tools developed can also guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
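The "physical" approach above relies on metrics such as the noise power spectrum (NPS). Below is a minimal sketch of the standard way a 2D NPS can be estimated from noise-only regions of interest; the ROI size, pixel spacing, and the synthetic white-noise input are illustrative assumptions, not the acquisition settings used in this work.

```python
import numpy as np

def noise_power_spectrum(rois, pixel_spacing_mm):
    """Estimate the 2D NPS from a stack of noise-only ROIs of shape (n_roi, N, N)."""
    n_roi, n, _ = rois.shape
    dx = dy = pixel_spacing_mm
    nps = np.zeros((n, n))
    for roi in rois:
        detrended = roi - roi.mean()                 # remove the DC (mean) component
        nps += np.abs(np.fft.fft2(detrended)) ** 2   # squared magnitude of the 2D DFT
    # Usual normalization: (dx * dy) / (Nx * Ny), averaged over all ROIs
    return nps * dx * dy / (n * n * n_roi)

# Illustrative use with synthetic white noise (sigma = 10 HU) on a 0.5 mm pixel grid
rois = np.random.normal(0.0, 10.0, size=(32, 64, 64))
nps_2d = noise_power_spectrum(rois, pixel_spacing_mm=0.5)
freqs = np.fft.fftfreq(64, d=0.5)                    # spatial frequencies in mm^-1
print(nps_2d.shape, float(nps_2d.mean()))
```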
Abstract:
Hereditary non-structural diseases such as catecholaminergic polymorphic ventricular tachycardia (CPVT), long QT syndrome, and the Brugada syndrome, as well as structural diseases such as hypertrophic cardiomyopathy (HCM) and arrhythmogenic right ventricular cardiomyopathy (ARVC), cause a significant percentage of sudden cardiac deaths in the young. In these cases, genetic testing can be useful and does not require proxy consent if it is carried out at the request of judicial authorities as part of a forensic death investigation. Mutations in several genes are implicated in arrhythmic syndromes, including SCN5A, KCNQ1, KCNH2, RyR2, and genes causing HCM. If the victim's test is positive, this information is important for relatives who might themselves be at risk of carrying the disease-causing mutation. There is no consensus about how professionals should proceed in this context. This article discusses the ethical and legal arguments in favour of and against three options: genetic testing of the deceased victim only; counselling of relatives before testing the victim; and counselling restricted to relatives of victims who tested positive for mutations causing serious and preventable diseases. Legal cases are mentioned that pertain to the duty of geneticists and other physicians to warn relatives. Although the claim for a legal duty is tenuous, recent publications and guidelines suggest that geneticists and others involved in the multidisciplinary approach to sudden death (SD) cases may nevertheless have an ethical duty to inform relatives of SD victims. Several practical problems remain, pertaining to the costs of testing, the counselling, and the need to obtain permission from judicial authorities.
Abstract:
In this paper, I reframe the long-standing controversy between 'psychological egoism', which argues that human beings never perform altruistic actions, and the opposing thesis of 'psychological altruism', which claims that human beings are, at least sometimes, capable of acting in an altruistic fashion. After a brief sketch of the controversy, I begin by presenting some representative arguments in favour of psychological altruism before showing that they can all be called into question by appealing to the idea of an unconscious self-directed motive. I will then point out that this argumentative strategy not only debunks the reasons for favouring psychological altruism, but also those for favouring psychological egoism; hence it is no use in settling the dispute between the two views. In the second part of the paper, I will try to break this deadlock by reframing the whole controversy, shifting it away from the concept of motive, towards the broader notion of motivation. As it turns out, this shift enables the debate to centre on altruistic emotions and their motivational power, thereby allowing evolutionary arguments to enter the debate and settle the dispute in favour of psychological altruism.
Abstract:
In a world where poor countries provide weak protection for intellectual property rights, market integration shifts technical change in favor of rich nations. Through this channel, free trade may amplify international income differences. At the same time, integration with countries where intellectual property rights are weakly protected can slow down the world growth rate. A crucial implication of these results is that protection of intellectual property is most beneficial in open countries. This prediction, which is novel in the literature, finds support in the data on a panel of 53 countries observed in the years 1965-1990.
Abstract:
We present a theory of choice among lotteries in which the decision maker's attention is drawn to (precisely defined) salient payoffs. This leads the decision maker to a context-dependent representation of lotteries in which true probabilities are replaced by decision weights distorted in favor of salient payoffs. By endogenizing decision weights as a function of payoffs, our model provides a novel and unified account of many empirical phenomena, including frequent risk-seeking behavior, invariance failures such as the Allais paradox, and preference reversals. It also yields new predictions, including some that distinguish it from Prospect Theory, which we test.
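As a toy illustration of the mechanism described above, replacing true probabilities with decision weights tilted toward salient payoffs, the sketch below uses a simple illustrative salience function and rank-based discounting; the specific functional form and parameter values are assumptions for demonstration, not necessarily the ones used in the paper.

```python
import numpy as np

def salience(x, y, theta=0.1):
    """Illustrative salience: payoffs that differ more (in relative terms)
    from the alternative's payoff in the same state are more salient."""
    return abs(x - y) / (abs(x) + abs(y) + theta)

def salient_value(payoffs, probs, alt_payoffs, delta=0.7, theta=0.1):
    """Value a lottery with decision weights distorted toward salient states.

    Each state's objective probability is scaled by delta**rank, where rank 0
    is the most salient state, and the weights are then renormalized."""
    payoffs, probs, alt_payoffs = map(np.asarray, (payoffs, probs, alt_payoffs))
    sal = np.array([salience(x, y, theta) for x, y in zip(payoffs, alt_payoffs)])
    ranks = np.argsort(np.argsort(-sal))      # 0 = most salient state
    weights = probs * delta ** ranks
    weights /= weights.sum()                  # distorted decision weights
    return float(np.dot(weights, payoffs))

# Toy example: a risky lottery evaluated against a safe payment of 20 in every state
risky = [0.0, 100.0]
probs = [0.8, 0.2]
safe  = [20.0, 20.0]
print(salient_value(risky, probs, safe))   # downside (0 vs 20) is most salient -> overweighted
print(float(np.dot(probs, risky)))         # objective expected value, for comparison
```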
Abstract:
We find that trade and domestic market size are robust determinants of economic growth over the 1960-1996 period when trade openness is measured as the US dollar value of imports and exports relative to GDP in PPP US$ ('real openness'). When trade openness is measured as the US dollar value of imports and exports relative to GDP in exchange rate US$ ('nominal openness') however, trade and the size of domestic markets are often non-robust determinants of growth. We argue that real openness is the more appropriate measure of trade and that our empirical results should be seen as evidence in favor of the extent-of-the-market hypothesis.
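The two openness measures contrasted above differ only in the GDP denominator. A minimal sketch with hypothetical country figures is shown below.

```python
# Hypothetical figures for one country-year (billions of US dollars)
imports_usd = 120.0
exports_usd = 150.0
gdp_ppp_usd = 900.0   # GDP converted at purchasing power parity
gdp_xr_usd  = 600.0   # GDP converted at market exchange rates

real_openness    = (imports_usd + exports_usd) / gdp_ppp_usd   # 'real openness'
nominal_openness = (imports_usd + exports_usd) / gdp_xr_usd    # 'nominal openness'

print(f"real openness    = {real_openness:.2f}")
print(f"nominal openness = {nominal_openness:.2f}")
```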
Abstract:
Objective To understand the process by which an obese woman decides to have bariatric surgery. Method A qualitative study with a social phenomenology approach, carried out in 2012 with 12 women, using phenomenological interviews. Results A woman bases the decision to have the surgery on: the inappropriateness of her eating habits; a physical appearance that is incompatible with the appearance standardized by society; the social prejudice she has to live with; the limitations imposed by obesity; and her lack of success with previous attempts to lose weight. The outcomes she hopes for from the decision include restoring her health, achieving social inclusion, and entering the labor market. Conclusion This study invites the reflection that prescriptive actions do not give a satisfactory response to the complexity of the subjective questions involved in the decision to have surgery for obesity. What is called for is a program of work based on an interdisciplinary approach, and training that values the bio-psycho-social aspects involved in a decision in favor of surgical treatment.
Abstract:
The role of order effects has been widely shown and discussed in areas such as memory and social impression formation. The first part of this work focuses on order effects influencing the verdict chosen at the end of a criminal trial. Contrary to findings on impression formation, but in line with the characteristics of a trial, we hypothesized that a recency effect would influence the choice of verdict. Three groups of students (N = 576) read a mock-trial summary in which the order of presentation of three witnesses, one incriminating expert and two exculpatory eyewitnesses, was varied. Results show a recency effect: the last testimony produced significantly more acquittals when it was exculpatory and more convictions when it was incriminating. The second part of this work builds on Gestalt and social-psychological research offering numerous insights into the cognitive organization of perceptions and opinions. We postulated that a witness's probative value changes according to the verdict delivered: an incriminating witness or expert carries greater probative value under a conviction than under an acquittal, whereas an exculpatory witness or expert carries greater probative value under an acquittal than under a conviction. Results obtained with a seven-point scale measuring witnesses' probative value confirm this hypothesis. The written arguments given by participants to explain their verdict, with reference to the incriminating expert, also show a congruency effect: the categories of arguments were identical whether the verdict was conviction or acquittal, the only difference between the two types of verdict lying in the frequency of these categories, with incriminating arguments used more when the accused was found guilty and exculpatory arguments used more when the accused was found innocent.
Abstract:
This paper aims to demonstrate the importance of accounting standardization for the analysis of financial information, taking into account the environment in which companies operate today. The changes that have occurred in the global economy have led companies to adopt new ways of preparing financial reporting in order to keep pace with these developments. Accounting harmonization arises in this context as a way of reducing the differences in financial reporting across countries. The paper therefore addresses the various initiatives that have been undertaken in favor of accounting harmonization/standardization and their relevance in the international and national context, as well as the case of Cape Verde, which recently joined the standardization process. The practical case is based on the transposition of financial statements to the new standards in force and on their analysis. The methodology is based on a review of the literature in books and journals and on the consultation of websites and legislation.