782 results for standard reform
Abstract:
Since 2008, one of the International Accounting Standards Board's (IASB) objectives has been to replace the old IAS 39 – Financial Instruments standard. The IASB achieved this objective in July 2014 when, after several phases, it published the new IFRS 9 – Financial Instruments. The main purpose of this study was to find out how the Big Four audit entities have welcomed the various reforms that IFRS 9 brings to the treatment of financial instruments in financial statements. Alongside this, the study presents a short overview of the general attitude towards the new standard. The study proceeds by dividing the most significant reforms into three main categories and, within these, into individual reforms. The study is based on a qualitative research method. The empirical data consist of comment letters sent to the IASB by the Big Four entities regarding the Exposure Drafts (EDs) of IFRS 9. In total, the IASB received 757 comment letters on the specific EDs; in this study the population was restricted to the 16 comment letters sent by the Big Four entities. The data are available on the IFRS Foundation's website. According to the results, the Big Four entities consider the reforms brought by IFRS 9 to be mainly welcome and, on the whole, regard IFRS 9 as better than its predecessor IAS 39. Opinions on IFRS 9 and on specific reforms nevertheless differed among the Big Four entities. According to the findings, the most welcomed reforms were those related to the effectiveness requirements of hedge accounting and to impairments and the valuation of credit losses. The least popular reforms concerned the measurement of financial assets and liabilities; more specifically, the fair value option and the reforms concerning equity instruments were viewed as the most challenging.
Abstract:
General Introduction. This thesis can be divided into two main parts: the first one, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite the wide-ranging preferential access granted to developing countries by industrial ones under North-South trade agreements (whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA), it has been claimed that the benefits from improved market access keep falling short of their full potential. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access in a statistically significant and quantitatively large proportion. Part I. In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Celine Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it to the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows. First, it shows, in the case of PANEURO, that the R-index is useful for summarizing how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence make it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument, a Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore, changing all instruments into an MFC would bring improved transparency, much like the "tariffication" of NTBs.
The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs. In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of the good's value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with their current sets of RoOs, could be a potential model for future EU-centered PTAs. First, I have studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs used before 1997 and the "single list" RoOs used since 1997. Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly mix sales between the EU and the rest of the world by comparing producer prices on each market, I have estimated the trade effects of the EU RoOs. The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II. The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument having the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis.
First, using Poisson and Negative Binomial regressions, the count of AD measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
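As an illustration of the count-data approach described in this abstract, the sketch below regresses a hypothetical yearly count of AD revocations on initiations lagged five years with a Poisson GLM and a post-1995 interaction. The data, variable names and coefficients are invented for illustration; they are not the chapter's database or results.

```python
# Illustrative only: simulated data, not the thesis' worldwide AD database.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
years = np.arange(1985, 2006)
initiations = rng.poisson(20, size=years.size)             # hypothetical AD initiations per year
df = pd.DataFrame({"year": years, "initiations": initiations})
df["init_lag5"] = df["initiations"].shift(5)                # initiations five years earlier
df["post_ada"] = (df["year"] >= 1995).astype(int)           # post-Uruguay Round indicator
df = df.dropna().copy()
df["init_lag5_post"] = df["init_lag5"] * df["post_ada"]     # does the 5-year cycle tighten after 1995?

# simulate revocations with a stronger five-year echo after the agreement
lam = np.exp(0.3 + 0.02 * df["init_lag5"] + 0.03 * df["init_lag5_post"])
df["revocations"] = rng.poisson(lam)

X = sm.add_constant(df[["init_lag5", "init_lag5_post", "post_ada"]])
fit = sm.GLM(df["revocations"], X, family=sm.families.Poisson()).fit()
print(fit.summary())   # a one-for-one five-year cycle would require a much larger lagged-initiations effect
```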
Abstract:
If emerging markets are to achieve their objective of joining the ranks of industrialized, developed countries, they must use their economic and political influence to support radical change in the international financial system. This working paper recommends John Maynard Keynes's "clearing union" as a blueprint for reform of the international financial architecture that could address emerging market grievances more effectively than current approaches. Keynes's proposal for the postwar international system sought to remedy some of the same problems currently facing emerging market economies. It was based on the idea that financial stability was predicated on a balance between imports and exports over time, with any divergence from balance providing automatic financing of the debit countries by the creditor countries via a global clearinghouse or settlement system for trade and payments on current account. This eliminated national currency payments for imports and exports; countries received credits or debits in a notional unit of account fixed to national currency. Since the unit of account could not be traded, bought, or sold, it would not be an international reserve currency. The credits with the clearinghouse could only be used to offset debits by buying imports, and if not used for this purpose they would eventually be extinguished; hence the burden of adjustment would be shared equally - credit generated by surpluses would have to be used to buy imports from the countries with debit balances. Emerging market economies could improve upon current schemes for regionally governed financial institutions by using this proposal as a template for the creation of regional clearing unions using a notional unit of account.
Abstract:
We analyze the effect of a parametric reform of the fully funded pension regime in Colombia on the intensive margin of labor supply. We take advantage of a threshold defined by law to identify the causal effect using a regression discontinuity design. We find that a pension system that increases the retirement age and the minimum number of weeks workers must contribute to claim pension benefits causes an increase of around 2 hours in weekly hours worked; this corresponds to 4% of average weekly hours worked, or around 14% of a standard deviation of weekly hours worked. The effect is robust to different specifications, polynomial orders and sample sizes.
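A minimal sketch of a sharp regression discontinuity estimate of the kind described above, using simulated data: weekly hours are regressed on a treatment indicator, the running variable and their interaction within a bandwidth around the legal threshold. The variable names, bandwidth and the 2-hour jump are assumptions for illustration, not the paper's Colombian data.

```python
# Illustrative local-linear RD sketch with simulated data; not the paper's sample.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
run = rng.uniform(-5, 5, n)                   # running variable: distance to the legal cutoff (years)
treated = (run >= 0).astype(int)              # cohorts covered by the parametric reform
hours = 44 + 1.0 * run + 2.0 * treated + rng.normal(0, 14, n)   # built-in ~2-hour jump at the cutoff
df = pd.DataFrame({"hours": hours, "run": run, "treated": treated})

bw = 2.0                                      # illustrative bandwidth around the threshold
local = df[df["run"].abs() <= bw]
# local linear regression with separate slopes on each side of the cutoff
rd_fit = smf.ols("hours ~ treated + run + treated:run", data=local).fit(cov_type="HC1")
print(rd_fit.params["treated"])               # estimated jump in weekly hours worked at the threshold
```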
Abstract:
J.A. Ferreira Neto, E.C. Santos Junior, U. Fra Paleo, D. Miranda Barros, and M.C.O. Moreira. 2011. Optimal subdivision of land in agrarian reform projects: an analysis using genetic algorithms. Cien. Inv. Agr. 38(2): 169-178. The objective of this manuscript is to develop a new procedure to achieve optimal land subdivision using genetic algorithms (GA). The genetic algorithm was tested in the rural settlement of Veredas, located in Minas Gerais, Brazil. The implementation was based on land aptitude and its productivity index. The sequence of tests was carried out in two areas with eight different agricultural aptitude classes: one area of 391.88 ha subdivided into 12 lots and another of 404.1763 ha subdivided into 14 lots. The effectiveness of the method was measured using the standard deviation of the productivity index of the lots in the subdivided area. To evaluate each parameter, a sequence of 15 runs was performed, recording the average fitness of the best individuals (MMI) found for each parameter variation. The best parameter combination found in testing and used to generate the new subdivision with the GA was the following: 320 generations, a population of 40 individuals, a mutation rate of 0.8, and a renewal rate of 0.3. The solution generated rather homogeneous lots in terms of productive capacity.
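The sketch below is a minimal genetic algorithm in the spirit of the procedure described above, using the parameter values reported in the abstract (320 generations, population of 40, mutation rate 0.8, renewal rate 0.3). The encoding (assigning elementary land units to lots), the productivity data and the fitness function (homogeneity of lot productivity) are simplifying assumptions, not the authors' implementation.

```python
# Minimal GA sketch for homogeneous lot productivity; encoding and data are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
N_UNITS, N_LOTS = 120, 12                        # elementary land units and target number of lots
productivity = rng.uniform(0.2, 1.0, N_UNITS)    # hypothetical productivity index per unit

POP, GENERATIONS, MUT_RATE, RENEWAL = 40, 320, 0.8, 0.3   # parameters reported in the abstract

def fitness(assignment):
    # higher fitness = more homogeneous lots (lower std of lot productivity totals)
    lot_totals = np.bincount(assignment, weights=productivity, minlength=N_LOTS)
    return -lot_totals.std()

def crossover(a, b):
    cut = rng.integers(1, N_UNITS)               # one-point crossover of two lot assignments
    return np.concatenate([a[:cut], b[cut:]])

def mutate(assignment):
    child = assignment.copy()
    if rng.random() < MUT_RATE:                  # reassign one random unit to a random lot
        child[rng.integers(N_UNITS)] = rng.integers(N_LOTS)
    return child

population = [rng.integers(N_LOTS, size=N_UNITS) for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    n_new = int(RENEWAL * POP)                   # renewal: replace the worst fraction with fresh individuals
    population[POP - n_new:] = [rng.integers(N_LOTS, size=N_UNITS) for _ in range(n_new)]
    parents = population[: POP // 2]             # elitist selection of the better half
    population = parents + [
        mutate(crossover(parents[rng.integers(len(parents))], parents[rng.integers(len(parents))]))
        for _ in range(POP - len(parents))
    ]

best = max(population, key=fitness)
print("std of lot productivity for the best subdivision:", -fitness(best))
```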
Abstract:
This thesis contains three chapters. The first chapter uses a general equilibrium framework to simulate and compare the long-run effects of the Patient Protection and Affordable Care Act (PPACA) and of health care cost reduction policies on macroeconomic variables, the government budget, and the welfare of individuals. We found that all policies were able to reduce the uninsured population, with the PPACA being more effective than cost reductions. The PPACA increased the public deficit, mainly due to the Medicaid expansion, forcing tax hikes. On the other hand, cost reductions alleviated the fiscal burden of public insurance, reducing the public deficit and taxes. Regarding welfare effects, the PPACA as a whole and cost reductions are welfare improving. High welfare gains would be achieved if U.S. medical costs followed the same trend as in OECD countries. Moreover, feasible cost reductions are more welfare improving than most of the PPACA components, proving to be a good alternative. The second chapter documents that life cycle general equilibrium models with heterogeneous agents have a very hard time reproducing the American wealth distribution. A common assumption made in this literature is that all young adults enter the economy with no initial assets. In this chapter, we relax this assumption, which is not supported by the data, and evaluate the ability of an otherwise standard life cycle model to account for U.S. wealth inequality. The new feature of the model is that agents enter the economy with assets drawn from an initial distribution of assets. We found that heterogeneity with respect to initial wealth is key for this class of models to replicate the data. According to our results, American inequality can be explained almost entirely by the fact that some individuals are lucky enough to be born into wealth, while others are born with few or no assets. The third chapter notes that a common assumption adopted in life cycle general equilibrium models is that the population is stable at the steady state, that is, its relative age distribution becomes constant over time. An open question is whether the demographic assumptions commonly adopted in these models in fact imply that the population becomes stable. In this chapter we prove the existence of a stable population in a demographic environment where both the age-specific mortality rates and the population growth rate are constant over time, the setup commonly adopted in life cycle general equilibrium models. Hence, the stability of the population does not need to be taken as an assumption in these models.
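For the demographic result discussed in the third chapter, a small sketch under the stated assumptions (constant age-specific survival probabilities and a constant population growth rate n): the stable relative age distribution is proportional to cumulative survival discounted by (1+n)^j. The survival profile below is purely illustrative.

```python
# Stable age distribution under constant survival probabilities and growth rate n (illustrative values).
import numpy as np

n = 0.01                                     # constant population growth rate
surv = np.linspace(0.999, 0.90, 80)          # hypothetical one-period survival probabilities by age
cum_surv = np.concatenate(([1.0], np.cumprod(surv)))   # probability of surviving to each age

# share of age j in the stable population is proportional to cum_surv[j] / (1+n)**j
weights = cum_surv / (1.0 + n) ** np.arange(cum_surv.size)
stable_shares = weights / weights.sum()
print(stable_shares[:5], stable_shares.sum())          # relative age shares are constant once stability is reached
```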
Abstract:
IFRS 9 Financial Instruments sets out the classification and measurement, impairment and hedge accounting requirements for the accounting of financial instruments. The standard was issued by the International Accounting Standards Board to replace IAS 39 Financial Instruments: Recognition and Measurement as of 1 January 2018. Hence, the long-criticized requirements for the accounting of financial instruments, widely experienced as complex, will undergo their most significant reform. This thesis addresses the anticipated effects of IFRS 9, focusing on the challenges the new classification and measurement requirements bring forth in the case organization, Kesko. The thesis was conducted as action research in which a case study method was applied. It proceeds in a twofold manner, involving a general analysis of IFRS 9 and, further, distinct ambitions related to the case organization. For the general part, empirical data were gathered by interviewing two IFRS experts from KPMG and PwC, while the interviews within the case organization constituted the case study. Further, the literature on IFRS 9 was so scant that the theoretical examination was merged with the IFRS experts' quotations, which also strove to contribute to the overall objective of reinforcing the body of research on the subject. This thesis indicates that IFRS 9 will most fundamentally reform the impairment and hedge accounting requirements for financial instruments. With regard to impairment, the changes are anticipated to increase the amount of loan-loss provisions, whereas the relaxed hedge accounting requirements are expected to encourage more companies to commence the application of hedge accounting. The thesis provides empirical support for the view that the term business model for managing financial assets, introduced in IFRS 9, is notably hard to comprehend and remains ambiguous. It goes on to argue that the most prominent issue in defining the business model for managing financial assets is the limits IFRS 9 sets for selling financial assets. With respect to Kesko, the thesis finds that the key anticipated effects of IFRS 9 are the reshaping of the organization's treasury policy and further examination of the possibility of applying hedge accounting to foreign exchange derivatives. Moreover, the thesis presumes that, in complying with the requirements of IFRS 9, Kesko will apply the hold-to-collect-and-sell business model for managing financial assets in the future.
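IFRS 9 does not prescribe a single impairment formula, but a commonly used simplification computes expected credit losses as PD x LGD x EAD, discounted at the effective interest rate. The sketch below illustrates the 12-month versus lifetime distinction mentioned in connection with loan-loss provisions; all inputs are invented and do not relate to Kesko.

```python
# Illustrative expected-credit-loss calculation in the spirit of IFRS 9; all inputs are hypothetical.
default_probs = [0.010, 0.015, 0.020]      # marginal probability of default per year (PD)
loss_given_default = 0.45                   # LGD: share of the exposure lost if default occurs
exposure = 1_000_000.0                      # EAD: exposure at default
discount_rate = 0.03                        # effective interest rate used for discounting

# 12-month ECL uses only the first year; lifetime ECL sums over the remaining life
ecl_12m = default_probs[0] * loss_given_default * exposure / (1 + discount_rate)
ecl_lifetime = sum(
    pd_t * loss_given_default * exposure / (1 + discount_rate) ** (t + 1)
    for t, pd_t in enumerate(default_probs)
)
print(f"12-month ECL: {ecl_12m:,.0f}  lifetime ECL: {ecl_lifetime:,.0f}")
```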
A Feasibility Study of Fricke Dosimetry as an Absorbed Dose to Water Standard for 192Ir HDR Sources.
Abstract:
High dose rate brachytherapy (HDR) using 192Ir sources is well accepted as an important treatment option and thus requires an accurate dosimetry standard. However, a dosimetry standard for the direct measurement of the absolute dose to water for this particular source type is currently not available. An improved standard for the absorbed dose to water based on Fricke dosimetry of HDR 192Ir brachytherapy sources is presented in this study. The main goal of this paper is to demonstrate the potential usefulness of the Fricke dosimetry technique for the standardization of the quantity absorbed dose to water for 192Ir sources. A molded, double-walled, spherical vessel for water containing the Fricke solution was constructed based on the Fricke system. The authors measured the absorbed dose to water and compared it with the doses calculated using the AAPM TG-43 report. The overall combined uncertainty associated with the measurements using Fricke dosimetry was 1.4% for k = 1, which is better than the uncertainties reported in previous studies. These results are promising; hence, the use of Fricke dosimetry to measure the absorbed dose to water as a standard for HDR 192Ir may be possible in the future.
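As a rough illustration of the measurement principle, the sketch below applies the textbook Fricke relation D = dA / (rho * l * eps * G) with typical literature values for the molar absorptivity and G(Fe3+); the absorbance reading is invented, and the paper's actual standard involves additional correction and conversion factors to obtain absorbed dose to water.

```python
# Textbook Fricke relation D = dA / (rho * l * eps * G); constants are typical literature values,
# not the ones (or the correction factors) used by the authors.
delta_absorbance = 0.050          # measured change in optical absorbance at 304 nm (hypothetical)
path_length = 0.010               # optical path length of the cuvette in m (1 cm)
molar_absorptivity = 218.7        # eps for Fe3+ at 304 nm, 25 C, in m^2/mol (~2187 L mol^-1 cm^-1)
g_value = 1.606e-6                # radiation chemical yield G(Fe3+) in mol/J
density = 1024.0                  # density of the Fricke solution in kg/m^3

dose_gy = delta_absorbance / (density * path_length * molar_absorptivity * g_value)
print(f"absorbed dose to the Fricke solution: {dose_gy:.1f} Gy")
# converting to absorbed dose to water additionally requires perturbation and conversion factors
```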
Abstract:
PURPOSE: To evaluate the sensitivity and specificity of machine learning classifiers (MLCs) for glaucoma diagnosis using Spectral Domain OCT (SD-OCT) and standard automated perimetry (SAP). METHODS: Observational cross-sectional study. Sixty-two glaucoma patients and 48 healthy individuals were included. All patients underwent a complete ophthalmologic examination, achromatic standard automated perimetry (SAP) and retinal nerve fiber layer (RNFL) imaging with SD-OCT (Cirrus HD-OCT; Carl Zeiss Meditec Inc., Dublin, California). Receiver operating characteristic (ROC) curves were obtained for all SD-OCT parameters and global indices of SAP. Subsequently, the following MLCs were tested using parameters from the SD-OCT and SAP: Bagging (BAG), Naive-Bayes (NB), Multilayer Perceptron (MLP), Radial Basis Function (RBF), Random Forest (RAN), Ensemble Selection (ENS), Classification Tree (CTREE), Ada Boost M1 (ADA), Support Vector Machine Linear (SVML) and Support Vector Machine Gaussian (SVMG). Areas under the receiver operating characteristic curves (aROC) obtained for isolated SAP and OCT parameters were compared with those of MLCs using OCT+SAP data. RESULTS: Combining OCT and SAP data, the MLCs' aROCs varied from 0.777 (CTREE) to 0.946 (RAN). The best OCT+SAP aROC, obtained with RAN (0.946), was significantly larger than that of the best single OCT parameter (p<0.05), but was not significantly different from the aROC obtained with the best single SAP parameter (p=0.19). CONCLUSION: Machine learning classifiers trained on OCT and SAP data can successfully discriminate between healthy and glaucomatous eyes. The combination of OCT and SAP measurements improved the diagnostic accuracy compared with OCT data alone.
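A hedged sketch of the kind of classifier comparison reported above, using a few of the listed MLCs as implemented in scikit-learn on synthetic stand-in features with cross-validated ROC AUC; the data and resulting numbers are illustrative, not the study's OCT and SAP measurements.

```python
# Illustrative comparison of a few of the listed classifiers on synthetic "OCT+SAP" features.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# synthetic stand-in for combined SD-OCT (RNFL) and SAP features; not the study's 110 patients
X, y = make_classification(n_samples=110, n_features=20, n_informative=8, random_state=0)

classifiers = {
    "BAG": BaggingClassifier(random_state=0),
    "NB": GaussianNB(),
    "RAN": RandomForestClassifier(random_state=0),
    "ADA": AdaBoostClassifier(random_state=0),
    "SVML": SVC(kernel="linear", probability=True, random_state=0),
    "SVMG": SVC(kernel="rbf", probability=True, random_state=0),
}
for name, clf in classifiers.items():
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()   # area under the ROC curve
    print(f"{name}: aROC = {auc:.3f}")
```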
Abstract:
INTRODUCTION: the Centros de Atenção Psicossocial Infantojuvenil (CAPSi, psychosocial care centers for children and adolescents) are the spearhead of the Brazilian Psychiatric Reform and are intended to care for children and adolescents with severe mental disorders. The objective is to characterize the profile of the users of a CAPSi, considering sex, age, diagnostic hypothesis, source of referral, school enrollment and reason for consultation. METHOD: using a protocol, data were collected from all one hundred and three active medical records of a unit in Greater São Paulo in January 2008. RESULTS: most of the users seen are between five and fifteen years old (68.9 percent) and male (61.2 percent). The group of behavioral and emotional disorders accounts for 21.4 percent, followed by pervasive developmental disorders (16.2 percent) and mental retardation (10.5 percent). Most users were referred by the Guardianship Council (Conselho Tutelar) (22.3 percent), and the main reasons for consultation were neuromotor (17.5 percent), school-related (15.5 percent) and social-behavioral (14.6 percent) complaints. CONCLUSIONS: the high number of children with neuromotor problems may indicate specific characteristics of the institution studied, which absorbed patients and professionals from a former rehabilitation service. The large number of relevant items of information not found in the records points to the lack of standardization of the medical records.
Abstract:
A compact frequency standard based on an expanding cold 133Cs cloud is under development in our laboratory. In a first experiment, cold Cs atoms were prepared in a magneto-optical trap in a vapor cell, and a microwave antenna was used to transmit the radiation for the clock transition. The signal obtained from the fluorescence of the expanding cold-atom cloud is used to lock a microwave chain; in this way the overall system stability is evaluated. A theoretical model based on a two-level system interacting with the two microwave pulses enables interpretation of the observed features, especially the poor Ramsey fringe contrast. (C) 2008 Optical Society of America.
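A small sketch of the idealized two-pulse Ramsey signal behind the clock lock described above: the transition probability of a two-level atom versus microwave detuning, with an empirical contrast factor standing in for the reduced fringe contrast the authors report. The free-evolution time and contrast value are assumptions, not the laboratory's model.

```python
# Idealized Ramsey fringe pattern for a two-level atom probed by two microwave pulses;
# the contrast factor C < 1 mimics the reduced fringe contrast discussed in the abstract.
import numpy as np

T = 0.010                                    # free-evolution time between the two pulses (s), hypothetical
contrast = 0.4                               # empirical contrast factor (ideal Ramsey would have C = 1)
detuning = np.linspace(-400, 400, 2001)      # detuning from the Cs clock transition (Hz)

# transition probability for short, resonant pi/2 pulses separated by time T
p_transition = 0.5 * (1 + contrast * np.cos(2 * np.pi * detuning * T))
central_fringe_fwhm = 1 / (2 * T)            # FWHM of the central fringe ~ 1/(2T) = 50 Hz for T = 10 ms
print(central_fringe_fwhm, p_transition.max() - p_transition.min())   # fringe width (Hz) and peak-to-peak contrast
```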
Abstract:
In Brazil the 1990s constituted years of institutional achievements in the fields of housing and urban rights, given the incorporation of the principles of the social function of cities and property, the recognition of tenure rights for slum dwellers and the direct participation of citizens in the decision making process of urban policies, within the 1988 Constitution. These proposals have become the pillars of the Urban Reform agenda which has penetrated the federal government apparatus since the creation of the Ministry of Cities under Lula's administration. The article evaluates the limits and possibilities for the implementation of this agenda through the analysis of two policies proposed by the Ministry: the National Council of Cities and the campaign for Participatory Master Plans. The approach is based on the organization of the Brazilian State in terms of urban development, the relationship with the political system and the characteristics of Brazilian democracy.