832 results for Issued-based approach


Relevance:

80.00%

Publisher:

Abstract:

Subplate neurons are among the earliest-born cells of the neocortex and play a fundamental role in cortical development, in particular in the formation of thalamocortical connections. Subplate abnormalities have been described in several neuropathological disorders including schizophrenia, autism and periventricular leukomalacia (Eastwood and Harrison, Schizophr Res, 79, 2005; McQuillen and Ferriero, Brain Pathol, 15, 2005). We have identified and confirmed a range of specific markers for the murine subplate using a microarray-based approach and found that different subplate subpopulations are characterized by distinct expression patterns of these genes (Hoerder-Suabedissen et al., Cereb Cortex, 19, 2009). In the current study, we make use of these markers to investigate neuropathological changes of the subplate after cerebral hypoxia-ischemia (HI) in the neonatal rat. First, we characterized the expression of a number of murine subplate markers in the postnatal rat using immunohistochemistry and in situ hybridization. While several genes (Nurr1, Cplx3, Ctgf and Tmem163) showed expression patterns very similar to those in the mouse, others (Ddc, MoxD1 and TRH) were completely absent from the rat cortex. This finding suggests important differences in the subplate populations of these two rodent species. In a neonatal rat model of HI, selective vulnerability of the subplate has been suggested using BrdU birthdating methods (McQuillen et al., J Neurosci, 15, 2003). We hypothesized that certain subplate subpopulations could be more susceptible than others and analyzed the above subplate markers in a similar yet slightly milder HI model. Two-day-old male rat pups underwent permanent occlusion of the right common carotid artery followed by a period of hypoxia (6% O2, 1.5 h or 2 h) and were analyzed six days later. Preliminary counts on three subplate subpopulations (Nurr1+, Cplx3+ and Ctgf+ cells, respectively) showed similar reductions in cell numbers for all three groups.
In addition, we found that the majority of cases showing changes in the subplate also exhibit lesions in deep cortical layer VI (identified by FoxP2 expression) and sometimes even in layer V (revealed by Er81 immunoreactivity), which calls into question the selective susceptibility of the subplate relative to other cortical layers under the conditions used in our model. Supported by the MRC; FMO holds a Berrow Scholarship, Lincoln College, Oxford.

Abstract:

This small study is divided into two parts. The first, based on the theoretical literature, offers an overview of the importance of creativity inside and outside school, and of how it can be taught through the experience of making films. Is working as an actor, scriptwriter, camera operator, make-up artist or set designer, among other roles, a methodological tool that fosters pupils' creativity? We investigate this in the second part of this document by means of creativity-assessment strategies. The results indicate that the children who carried out the film-making project scored better in every aspect assessed.

Abstract:

Anticoagulants are a mainstay of cardiovascular therapy, and parenteral anticoagulants have widespread use in cardiology, especially in acute situations. Parenteral anticoagulants include unfractionated heparin, low-molecular-weight heparins, the synthetic pentasaccharides fondaparinux, idraparinux and idrabiotaparinux, and parenteral direct thrombin inhibitors. Several shortcomings of unfractionated heparin and of low-molecular-weight heparins prompted the development of these newer agents. Here we review the mechanisms of action, pharmacological properties and side effects of parenteral anticoagulants used in the management of coronary heart disease treated with or without percutaneous coronary interventions, cardioversion for atrial fibrillation, and prosthetic heart valves and valve repair. Using an evidence-based approach, we describe the results of completed clinical trials, highlight ongoing research with currently available agents, and recommend therapeutic options for specific heart diseases.

Abstract:

The global structural connectivity of the brain, the human connectome, is now accessible at millimeter scale with the use of MRI. In this paper, we describe an approach to map the connectome by constructing normalized whole-brain structural connection matrices derived from diffusion MRI tractography at 5 different scales. Using a template-based approach to match cortical landmarks of different subjects, we propose a robust method that allows (a) the selection of identical cortical regions of interest of desired size and location in different subjects, with identification of the associated fiber tracts; (b) straightforward construction and interpretation of anatomically organized whole-brain connection matrices; and (c) statistical inter-subject comparison of brain connectivity at various scales. The fully automated post-processing steps necessary to build such matrices are detailed in this paper. Extensive validation tests are performed to assess the reproducibility of the method in a group of 5 healthy subjects, and its reliability is discussed in detail in a group of 20 healthy subjects.
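The core data structure described above, a normalized whole-brain connection matrix built from streamline endpoints, can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the function name, the symmetric counting and the normalization by total streamline count are assumptions made for the example.

```python
import numpy as np

def connection_matrix(streamline_endpoints, n_regions):
    """Count streamlines between pairs of cortical regions and normalize
    by the total streamline count, so matrices from different subjects
    are comparable."""
    m = np.zeros((n_regions, n_regions))
    for a, b in streamline_endpoints:
        m[a, b] += 1
        m[b, a] += 1  # structural connections carry no direction
    total = m.sum()
    return m / total if total > 0 else m

# toy example: 4 regions, 3 reconstructed fiber tracts
mat = connection_matrix([(0, 1), (0, 1), (2, 3)], 4)
```

Because the matrix is normalized to unit sum, entries can be compared across subjects regardless of how many streamlines each tractography run produced.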

Abstract:

Recent literature has discussed the unintended consequences of clinical information technologies (IT) on patient safety, yet there has been little discussion about the safety concerns in the area of consumer health IT. This paper presents a range of safety concerns for consumers in social media, with a case study on YouTube. We conducted a scan of abstracts on 'quality criteria' related to YouTube. Five areas regarding the safety of YouTube for consumers were identified: (a) harmful health material targeted at consumers (such as inappropriate marketing of tobacco or direct-to-consumer drug advertising); (b) public display of unhealthy behaviour (such as people displaying self-injury behaviours or hurting others); (c) tainted public health messages (i.e. the rise of negative voices against public health messages); (d) psychological impact from accessing inappropriate, offensive or biased social media content; and (e) using social media to distort policy and research funding agendas. The examples presented should contribute to a better understanding of how to promote the safe consumption and production of social media for consumers, and an evidence-based approach to designing social media interventions for health. The potential harm associated with the use of unsafe social media content on the Internet is a major concern. More empirical and theoretical studies are needed to examine how social media influences consumer health decisions, behaviours and outcomes, and to devise ways to deter the dissemination of harmful influences in social media.

Abstract:

In this paper we propose a method for computing JPEG quantization matrices for a given mean square error (MSE) or PSNR. We then employ our method to compute JPEG standard progressive operation mode definition scripts using a quantization approach. It is therefore no longer necessary to use a trial-and-error procedure to obtain a desired PSNR and/or definition script, reducing computational cost. Firstly, we establish a relationship between a Laplacian source and its uniform quantization error. We apply this model to the coefficients obtained in the discrete cosine transform stage of the JPEG standard. An image may then be compressed using the JPEG standard under a global MSE (or PSNR) constraint and a set of local constraints determined by the JPEG standard and visual criteria. Secondly, we study the JPEG standard progressive operation mode from a quantization-based approach. A relationship is found between the measured image quality at a given stage of the coding process and a quantization matrix; the definition script construction problem can thus be reduced to a quantization problem. Simulations show that our method generates better quantization matrices than the classical method based on scaling the JPEG default quantization matrix. The PSNR estimation error is usually smaller than 1 dB, and decreases for high PSNR values. Definition scripts may be generated avoiding an excessive number of stages and removing small stages that do not contribute a noticeable image-quality improvement during the decoding process.
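The starting point, inverting the PSNR definition and using the high-rate result that a uniform quantizer with step q incurs an MSE of about q²/12, can be sketched as follows. This is a simplified illustration, not the paper's method: it spreads the error budget evenly over the DCT coefficients with a single step, whereas the paper fits a per-coefficient Laplacian model and adds JPEG and visual constraints. The function name is ours.

```python
import math

def quant_matrix_for_psnr(psnr_db, peak=255.0):
    """Uniform 8x8 quantization matrix whose high-rate error model meets
    a target PSNR (sketch; the paper's method works per coefficient)."""
    mse = peak ** 2 / 10 ** (psnr_db / 10)  # invert PSNR = 10*log10(peak^2/MSE)
    q = math.sqrt(12.0 * mse)               # step-q uniform quantizer: MSE ~ q^2/12
    step = int(max(1, min(255, round(q))))  # JPEG quantizer steps lie in 1..255
    return [[step] * 8 for _ in range(8)]

# matrix aiming at roughly 40 dB PSNR
qm = quant_matrix_for_psnr(40.0)
```

Since the DCT is orthonormal, the per-pixel MSE equals the mean per-coefficient quantization error, which is what makes this direct inversion possible without trial and error.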

Abstract:

1 Summary This dissertation deals with two major aspects of corporate governance that have grown in importance in recent years: the internal audit function and financial accounting education. In three essays, I contribute to research on these topics, which are embedded in the broader corporate governance literature. The first two essays consist of experimental investigations of internal auditors' judgments. They deal with two research issues for which accounting research lacks evidence: the effectiveness of internal controls, and the potentially conflicting role of the internal audit function between management and the audit committee. The findings of the first two essays contribute to the literature on internal auditors' judgment and on the role of the internal audit function as a major cornerstone of corporate governance. The third essay theoretically examines a broader issue but also relates to the overall research question of this dissertation: what contributes to effective corporate governance? This last essay takes the perspective that the root of quality corporate governance is appropriate financial accounting education. I develop a public-interest approach to accounting education that contributes to the literature on adequate accounting education with respect to corporate governance and accounting harmonization. The increasing importance of both the internal audit function and accounting education for corporate governance can be explained by the same recent fundamental changes that still affect accounting research and practice. First, the Sarbanes-Oxley Act of 2002 (SOX, 2002) and the 8th EU Directive (EU, 2006) have led to a bigger role for the internal audit function in corporate governance. Their implications regarding the implementation of audit committees and their oversight over internal controls are extensive. As a consequence, the internal audit function has become increasingly important for corporate governance and serves a new master (i.e. 
the audit committee) within the company in addition to management. Second, SOX (2002) and the 8th EU Directive introduced additional internal control mechanisms that are expected to contribute to the reliability of financial information. As a consequence, the internal audit function is expected to contribute to a greater extent to the reliability of financial statements. Effective internal control mechanisms that strengthen objective judgment and independence therefore become important. This is especially true when external auditors rely on the work of internal auditors in the context of the International Standard on Auditing (ISA) 610 and the equivalent US Statement on Auditing Standards (SAS) 65 (see IFAC, 2009 and AICPA, 1990). Third, the harmonization of international reporting standards is increasingly promoted by means of a principles-based approach. It has been the leading approach since a study by the SEC (2003), required by SOX (2002) in section 108(d), came out in its favor. As a result, the Financial Accounting Standards Board (FASB) and the International Accounting Standards Board (IASB) have committed themselves to the development of compatible accounting standards based on a principles-based approach. Moreover, since the Norwalk Agreement of 2002, the two standard setters have developed exposure drafts for a common conceptual framework that will be the basis for accounting harmonization. The new framework will favor fair value measurement and accounting for real-world economic phenomena. These changes in standard setting lead to a trend towards more professional judgment in the accounting process. They affect internal and external auditors, accountants, and managers in general. As a consequence, a new competency set for preparers and users of financial statements is required. The basis for this new competency set is adequate accounting education (Schipper, 2003). 
These three issues affecting corporate governance are the starting point of this dissertation and constitute its motivation. Two broad questions motivated a scientific examination in three essays: 1) What are the major aspects to be examined regarding the new role of the internal audit function? 2) How should major changes in standard setting affect financial accounting education? The first question became apparent due to two published literature reviews by Gramling et al. (2004) and Cohen, Krishnamoorthy & Wright (2004). These studies raise various questions for future research that are still relevant and which motivate the first two essays of my dissertation. In the first essay, I focus on the role of the internal audit function as one cornerstone of corporate governance and on its potentially conflicting role of serving both management and the audit committee (IIA, 2003). In an experimental study, I provide evidence on the challenges for internal auditors in their role as servant of two masters, the audit committee and management, and how this influences internal auditors' judgment (Gramling et al. 2004; Cohen, Krishnamoorthy & Wright, 2004). I ask whether there is an expectation gap between what internal auditors should provide for corporate governance in theory and what internal auditors are able to provide in practice. In particular, I focus on the effect of serving two masters on the internal auditor's independence. I argue that independence is hardly achievable if the internal audit function serves two masters with conflicting priorities. The second essay provides evidence on the effectiveness of accountability as an internal control mechanism. In general, internal control mechanisms based on accountability were enforced by SOX (2002) and the 8th EU Directive. Subsequently, many companies introduced sub-certification processes intended to contribute to an objective judgment process. 
Thus, these mechanisms are important to strengthen the reliability of financial statements. Based on the need for evidence on the effectiveness of internal control mechanisms (Brennan & Solomon, 2008; Gramling et al. 2004; Cohen, Krishnamoorthy & Wright, 2004; Solomon & Trotman, 2003), I designed an experiment to examine the joint effect of accountability and obedience pressure in an internal audit setting. I argue that obedience pressure can negatively influence accountants' objectivity (e.g. DeZoort & Lord, 1997), whereas accountability can mitigate this negative effect. My second main research question, how major changes in standard setting should affect financial accounting education, is investigated in the third essay. It is motivated by the observation during my PhD that many conferences deal with the topic of accounting education but very little is published about what needs to be done. Moreover, the findings in the first two essays of this thesis and their literature review suggest that financial accounting education can contribute significantly to quality corporate governance, as argued elsewhere (Schipper, 2003; Boyce, 2004; Ghoshal, 2005). In the third essay of this thesis, I therefore focus on approaches to financial accounting education that account for the changes in standard setting and also contribute to corporate governance and accounting harmonization. I argue that the competency set required in practice changes due to major changes in standard setting. As the major contribution of the third article, I develop a public-interest approach for financial accounting education. The major findings of this dissertation can be summarized as follows. The first essay provides evidence on an important research question raised by Gramling et al. (2004, p. 240): "If the audit committee and management have different visions for the corporate governance role of the IAF, which vision will dominate?" 
According to the results of the first essay, internal auditors do follow the priorities of either management or the audit committee, based on the guidance provided by the Chief Audit Executive. The study's results question whether the independence of the internal audit function is actually achievable. My findings contribute to research on internal auditors' judgment and the internal audit function's independence in the broader frame of corporate governance. The results are also important for practice, because independence is a major justification for a positive contribution of the internal audit function to corporate governance. The major findings of the second essay indicate that the duty to sign work results, a means of holding people accountable, mitigates the negative effect of obedience pressure on reliability. Hence, I found evidence that control mechanisms relying on certifications may enhance the reliability of financial information. These findings contribute to the literature on the effectiveness of internal control mechanisms. They are also important in the light of the sub-certification processes that resulted from the Sarbanes-Oxley Act and the 8th EU Directive. The third essay contributes to the literature by developing a measurement framework that accounts for the consequences of major trends in standard setting. Moreover, it shows how these trends affect the required competency set of people dealing with accounting issues. Based on this work, my main contribution is the development of a public-interest approach for the design of adequate financial accounting curricula.
2 Serving two masters: Experimental evidence on the independence of internal auditors
Abstract: Twenty-nine internal auditors participated in a study that examines the independence of internal auditors in their potentially competing roles of serving two masters: the audit committee and management. 
Our main hypothesis suggests that internal auditors' independence is not achievable in an institutional setting in which internal auditors are accountable to two different parties with potentially differing priorities. We test our hypothesis in an experiment in which the treatment consisted of two different instructions from the Chief Audit Executive: one stressing the priority of management (cost reduction) and one stressing the priority of the audit committee (effectiveness). Internal auditors had to evaluate internal controls and their inherent costs for different processes which varied in their degree of task complexity. Our main results indicate that internal auditors' evaluations of the processes differ significantly between the two instruction conditions when task complexity is high. Our findings suggest that internal auditors do follow the priorities of either management or the audit committee depending on the instructions of a superior internal auditor. The study's results question whether the independence of the internal audit function is actually achievable. With our findings, we contribute to research on internal auditors' judgment and the internal audit function's independence in the frame of corporate governance.

Abstract:

The aim of this thesis is, in its theoretical part, to present the basic principles of banks' capital adequacy regulation and risk management, and to examine the current Basel I framework and its reform, the Basel II framework. The thesis concentrates on the first pillar of the new framework and the minimum capital requirements it defines. The minimum capital calculation methods for the credit-risk capital requirement, the standardised approach and the internal ratings-based approach, are examined in more detail. The standardised approach makes use of external credit ratings, whereas the more advanced internal ratings-based approach draws on banks' own information systems and the estimates of customers' creditworthiness they produce. The empirical part of the thesis uses an example bank to study the calculation of credit-risk capital requirements under the Basel I framework and with the Basel II calculation methods. Following the internal ratings-based approach, risk weights are determined for the bank's balance-sheet items, and the study also examines whether the bank, with its current balance-sheet structure, could achieve a better result by optimising its risk profile when using the more advanced internal ratings-based approach instead of the standardised approach.
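The Pillar 1 credit-risk calculation under the standardised approach reduces to mapping each exposure's external rating to a risk weight and taking 8% of the resulting risk-weighted assets. A minimal numerical sketch follows; the risk weights are the Basel II standardised table for corporate exposures, the function and data names are ours, and real calculations involve many further adjustments (collateral, maturity, other exposure classes).

```python
# Basel II standardised-approach risk weights for corporate exposures,
# keyed by external rating bucket
CORPORATE_RISK_WEIGHTS = {
    "AAA to AA-": 0.20,
    "A+ to A-": 0.50,
    "BBB+ to BB-": 1.00,
    "Below BB-": 1.50,
    "Unrated": 1.00,
}

def capital_requirement(exposures):
    """Minimum capital: 8% of risk-weighted assets.
    exposures: list of (amount, rating_bucket) pairs."""
    rwa = sum(amount * CORPORATE_RISK_WEIGHTS[bucket]
              for amount, bucket in exposures)
    return 0.08 * rwa

# 100 lent to an AA-rated corporate and 100 to an unrated one
req = capital_requirement([(100.0, "AAA to AA-"), (100.0, "Unrated")])
```

The internal ratings-based approach replaces this fixed table with risk weights derived from the bank's own estimates of default probability and loss given default, which is exactly the optimisation opportunity the thesis investigates.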

Abstract:

The aim of this thesis was to examine the questions that patent valuation raises from a taxation perspective in corporate restructurings. The study is descriptive, qualitative and normative. Corporate restructurings carried out in accordance with the tax laws are, with the exception of liquidation, tax-neutral events in which no taxable income arises for the parties. If, on the other hand, a restructuring is not carried out in accordance with the Business Income Tax Act, taxable income is realised; in that case patents, too, are valued at fair value under the Act. There is no single correct way to determine the fair value of a patent, but income-based valuation methods are considered the best. The key questions of patent valuation in the taxation of corporate restructurings are therefore how to preserve tax neutrality, and how fair value is to be determined when tax neutrality cannot be preserved.
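In their simplest form, the income-based valuation methods favoured above reduce to discounting the expected net cash flows attributable to the patent. A minimal sketch, with the function name and figures purely illustrative; a real valuation must also handle tax effects, risk premia and the patent's remaining life.

```python
def income_value(cash_flows, discount_rate):
    """Income-based (present value) valuation: discount the expected
    net cash flows attributable to the patent, year by year."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# a patent expected to yield 100 per year for three years, at a 10% rate
v = income_value([100.0, 100.0, 100.0], 0.10)
```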

Abstract:

The fast development of new technologies such as digital medical imaging has brought about an expansion of brain functional studies. One of the key methodological issues in such studies is comparing neuronal activation between individuals, and in this context the great variability of brain size and shape is a major problem. Current methods allow inter-individual comparisons by normalising subjects' brains with respect to a standard brain; the most widely used standard brains are the proportional grid of Talairach and Tournoux and the Montreal Neurological Institute (MNI) standard brain (SPM99). However, these registration methods are not precise enough to superpose the more variable portions of the cerebral cortex (e.g. the neocortex and the perisylvian zone) or brain regions that are highly asymmetric between the two cerebral hemispheres (e.g. the planum temporale). The aim of this thesis is to evaluate a new image-processing technique based on non-rigid, model-based registration. Contrary to intensity-based registration, model-based registration uses spatial rather than intensity information to fit one image to another: we extract identifiable anatomical features (point landmarks) in both the deforming and the target image, and from their correspondence determine the appropriate deformation in 3D. As landmarks, we use six control points, situated bilaterally on Heschl's gyrus, on the motor hand area and on the sylvian fissure. The evaluation of this model-based approach is performed on the MRI and fMRI images of nine of the eighteen subjects who participated in a previous study by Maeder et al. On the anatomical (MRI) images, the results show the movement of the deforming brain's control points to the locations of the reference brain's control points, and the distance between the deforming and the reference brain is smaller after registration than before. Registration of the functional (fMRI) images does not show a significant variation: the small number of landmarks, six control points, is not sufficient to produce modifications of the statistical maps. This thesis opens the way to a new registration technique for the cerebral cortex, whose main future direction will be to improve the registration algorithm by using not a single point but many points representing a particular sulcus as landmarks.

Abstract:

The advent of new advances in mobile computing has changed the way we do our daily work, even enabling us to perform collaborative activities. However, current groupware approaches do not offer an integrated and efficient solution that jointly tackles the flexibility and heterogeneity inherent to mobility as well as the awareness aspects intrinsic to collaborative environments. Issues related to the diversity of contexts of use are collected under the term plasticity. A great number of tools have emerged that address some of these issues, although always focused on individual scenarios. We are working on reusing and specializing some already existing plasticity tools for groupware design. The aim is to offer the benefits of plasticity and awareness jointly, aiming at real collaboration and a deeper understanding of multi-environment groupware scenarios. In particular, this paper presents a conceptual framework intended as a reference for the generation of plastic user interfaces for collaborative environments in a systematic and comprehensive way. Starting from a previous conceptual framework for individual environments, inspired by the model-based approach, we introduce specific components and considerations related to groupware.

Abstract:

This Perspective discusses the pertinence of variable dosing regimens with anti-vascular endothelial growth factor (VEGF) for neovascular age-related macular degeneration (nAMD) with regard to real-life requirements. After the initial pivotal trials of anti-VEGF therapy, the variable dosing regimens pro re nata (PRN), Treat-and-Extend, and Observe-and-Plan, a recently introduced regimen, aimed to optimize the anti-VEGF treatment strategy for nAMD. The PRN regimen showed good visual results but requires monthly monitoring visits and can therefore be difficult to implement. Moreover, application of the PRN regimen revealed inferior results in real-life circumstances due to problems with resource allocation. The Treat-and-Extend regimen uses an interval-based approach and has become widely accepted for its ease of preplanning and the reduced number of office visits required. The parallel development of the Observe-and-Plan regimen demonstrated that the future need for retreatment (the interval) could be reliably predicted. Studies investigating the Observe-and-Plan regimen also showed that it could be used in individualized fixed treatment plans, allowing for dramatically reduced clinical burden and good outcomes, thus meeting real-life requirements. This progressive development of variable dosing regimens is a response to the real-life circumstances of limited human, technical, and financial resources. It includes an individualized treatment approach, optimization of the number of retreatments, a minimal number of monitoring visits, and ease of planning ahead. The Observe-and-Plan regimen achieves this goal with good functional results. Translational Relevance: This Perspective reviews the process from the pivotal clinical trials to the development of treatment regimens adjusted to real-life requirements. 
The article discusses this translational process which, although not translation in the classical sense from fundamental to clinical research but a subsequent process after the pivotal clinical trials, represents an important translational step from the clinical proof of efficacy to optimization in terms of patients' and clinics' needs. The related scientific procedure includes exploration of the concept, evaluation of safety, and finally proof of efficacy.

Abstract:

The study analyses the evolution of greenhouse gas (GHG) and acidifying emissions in Italy over the period 1995-2005. The data show that while the emissions contributing to acidification decreased steadily, GHG emissions increased, driven by the rise in carbon dioxide. The aim of this study is to highlight how different economic factors, in particular economic growth, the development of cleaner technology and the structure of consumption, have driven the evolution of emissions. The proposed methodology is structural decomposition analysis (SDA), a method that decomposes changes in the variable of interest among the different driving forces and reveals the importance of each factor. Furthermore, the study considers the importance of international trade and attempts to address the "responsibility problem": through international trade relations, a country may be exporting polluting production processes without any real reduction in the pollution embodied in its consumption pattern. To this end, following first a "producer responsibility" approach, the SDA is applied to the emissions caused by domestic production. The analysis then moves to a "consumer responsibility" approach, and the decomposition is applied to the emissions associated with the domestic or foreign production that satisfies domestic demand. In this way, the exercise provides a first check of the importance of international trade and highlights some results at the aggregate and sectoral levels.
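The structural decomposition analysis used above splits an observed change in emissions among its driving factors. As a minimal two-factor sketch (the study's actual decomposition is a multi-factor input-output SDA): with emissions e = intensity × output, the averaged "polar" form attributes the change exactly to a technology effect and a scale effect. Function and variable names are illustrative.

```python
def sda_two_factor(i0, y0, i1, y1):
    """Decompose the change in emissions e = i * y (intensity x output)
    into a technology effect and a scale effect, using the averaged
    'polar' form, which is exact: the two effects sum to e1 - e0."""
    technology_effect = (i1 - i0) * (y0 + y1) / 2.0
    scale_effect = (i0 + i1) / 2.0 * (y1 - y0)
    return technology_effect, scale_effect

# intensity falls from 0.5 to 0.4 while output grows from 100 to 120
tech, scale = sda_two_factor(0.5, 100.0, 0.4, 120.0)
```

Here emissions fall from 50 to 48 overall, which the decomposition reads as cleaner technology (negative effect) partly offset by economic growth (positive effect), the same tension the study finds for Italian GHG emissions.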


Abstract:

The aim of this thesis is to describe and analyse the different areas of the ongoing reform of the banks' capital adequacy framework, particularly from the viewpoint of improving the preconditions for market discipline, focusing on the disclosure requirements. In addition, the thesis assesses the consequences of the reform as a whole for a bank and its various stakeholder groups, the strengths and weaknesses of the reform, and its potential problem areas. The thesis is based on the second proposals of the Basel Committee on Banking Supervision and the European Commission for reforming the capital adequacy framework; articles, publications and interviews on the subject provide additional perspective. The reform of the banks' capital adequacy framework consists of three mutually complementary "pillars": 1) the method of calculating minimum capital adequacy, 2) the strengthening of the supervisory process, and 3) the improvement of the preconditions for market discipline. The reform is still in progress and its content changes continuously as the positions of the various parties take shape. It is therefore too early to draw final conclusions, but it is already clear that this is a broad and significant reform. Among other things, it allows the use of internal risk ratings, encourages banks towards more effective risk management, and multiplies the amount of information to be disclosed compared with the current rules. There is international consensus on the broad lines of the reform, but many problems remain to be solved; for that reason, the drafting bodies have decided to issue a third proposal before the final decision. The greatest concerns at present are consistent international implementation and even-handed compliance with the rules. The level of detail of the framework also worries many.