814 results for swd: Benchmark
Abstract:
This paper presents an application of the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) approach to the estimation of quantities of Gross Value Added (GVA) referring to economic entities defined at different scales of study. The method first estimates benchmark values of the pace of GVA generation per hour of labour across economic sectors. These values are estimated as intensive variables (e.g. €/hour) by dividing the sectorial GVA of the country (expressed in € per year) by the hours of paid work in that same sector per year. This assessment is obtained using data from national statistics (top-down information referring to the national level). The approach then uses bottom-up information (the number of hours of paid work in the various economic sectors of an economic entity, e.g. a city or a province, operating within the country) to estimate the amount of GVA produced by that entity. This estimate is obtained by multiplying the number of hours of work in each sector in the economic entity by the benchmark value of GVA generation per hour of work of that particular sector (national average). This method is applied and tested on two different socio-economic systems: (i) Catalonia (considered level n) and Barcelona (considered level n-1); and (ii) the region of Lima (considered level n) and Lima Metropolitan Area (considered level n-1). In both cases, the GVA per year of the local economic entity (Barcelona and Lima Metropolitan Area) is estimated and the resulting value is compared with GVA data provided by statistical offices. The empirical analysis seems to validate the approach, even though the case of Lima Metropolitan Area indicates a need for additional care when dealing with the estimate of GVA in primary sectors (agriculture and mining).
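A minimal sketch of the two-step calculation described in this abstract, assuming purely illustrative sector names and figures (none of the numbers come from the paper):

```python
# Minimal sketch of the two-step MuSIASEM-style estimate described above.
# All sector names and figures are illustrative placeholders, not data from the paper.

# Step 1 (top-down): national benchmarks of GVA generation per hour of paid work,
# obtained as sectorial GVA (EUR/year) divided by hours of paid work (hours/year).
national_gva = {"agriculture": 5.0e9, "industry": 60.0e9, "services": 150.0e9}   # EUR/year
national_hours = {"agriculture": 0.3e9, "industry": 1.5e9, "services": 3.0e9}    # hours/year
benchmark_eur_per_hour = {s: national_gva[s] / national_hours[s] for s in national_gva}

# Step 2 (bottom-up): multiply the local entity's hours of paid work in each sector
# by the national benchmark for that sector, then sum across sectors.
local_hours = {"agriculture": 0.01e9, "industry": 0.4e9, "services": 1.1e9}      # hours/year
estimated_local_gva = sum(local_hours[s] * benchmark_eur_per_hour[s] for s in local_hours)

print(f"Estimated local GVA: {estimated_local_gva / 1e9:.1f} billion EUR/year")
```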
Abstract:
Aims and background. In 2002, a survey including 1759 patients treated from 1980 to 1998 established a "benchmark" Italian data source for prostate cancer radiotherapy. This report updates the previous one. Methods. Data on clinical management and outcomes of 3001 patients treated in 15 centers from 1999 through 2003 were analyzed and compared with those of the previous survey. Results. Significant differences in clinical management (-10% had abdominal magnetic resonance imaging; +26% received ≥70 Gy; +48% conformal radiotherapy; -20% pelvic radiotherapy) and in G3-4 toxicity rates (-3.8%) were recorded. Actuarial 5-year overall, disease-specific, clinical relapse-free, and biochemical relapse-free survival rates were 88%, 96%, 96% and 88%, respectively. On multivariate analysis, D'Amico risk categories significantly affected all the outcomes; higher radiotherapy doses were significantly related to better overall survival rates, and a similar trend was evident for disease-specific and biochemical relapse-free survival; the cumulative probability of 5-year late G1-4 toxicity was 24.8% and was significantly related to higher radiotherapy doses (P < 0.001). Conclusions. The changing patterns of practice described seem related to an improvement in the efficacy and safety of radiotherapy for prostate cancer. However, the impact of the new radiotherapy techniques should be prospectively evaluated.
Abstract:
Currently, the world of healthcare management is focusing its efforts on defining new professional roles (clinical and managerial) with transversal responsibilities (multi-area and/or multi-centre), and on designing and implementing decision-support instruments that allow professionals to reduce variability in clinical practice and foster improvements in management. It is well known that reducing variability and using common parameters makes it possible to evaluate different centres from the same perspective, which favours benchmarking, increases patient safety and, consequently, yields models of higher quality and efficiency. This variability is not found only in the purely care-delivery setting; it is also identified in areas of managerial responsibility. One of these, among others, could be the Nursing Directorate of a healthcare organisation. In this work we seek an organisational model of horizontal coordination, looking for a formula for coordinating the different Nursing Directorates of the same level of care (specialised care) through the creation of different transversal nursing management positions with responsibility over the processes that are key to the organisation. The creation of these transversal positions offers the possibility of transmitting a homogeneous and reproducible discourse to all the sites under their responsibility, favouring the alignment of objectives and therefore better results in terms of health, management and knowledge sharing. The objectives are: 1. To design a new networked organisational model of the Nursing Directorate shared by 3 specialised care centres. 2. To define the different transversal positions that will lead the strategic axes of the Nursing Directorate. 3. To create a multi-centre balanced scorecard for the Nursing Directorate. To achieve these objectives, a set of multi-lever actions has been developed: definition of the functions, responsibilities and scope of action of the new transversal management positions (bottom-up); design of an organisational model that formalises and responds to their needs (top-down); preparation of a balanced scorecard of the Nursing Directorate for the three centres; and preparation of an action plan for the implementation of this project.
Abstract:
The article presents and discusses estimates of social and economic indicators for Italy's regions in benchmark years roughly from Unification to the present day: life expectancy, education, GDP per capita at purchasing power parity, and the new Human Development Index (HDI). A broad interpretative hypothesis, based on the distinction between passive and active modernization, is proposed to account for the evolution of regional imbalances over the long run. In the absence of active modernization, Southern Italy converged thanks to passive modernization, i.e., State intervention: however, this was more effective in life expectancy, less successful in education, and expensive and as a whole ineffective in GDP. As a consequence, convergence in the HDI occurred from the late nineteenth century to the 1970s, but came to a sudden halt in the last decades of the twentieth century.
Abstract:
For the ∼1% of the human genome in the ENCODE regions, only about half of the transcriptionally active regions (TARs) identified with tiling microarrays correspond to annotated exons. Here we categorize this large amount of “unannotated transcription.” We use a number of disparate features to classify the 6988 novel TARs—array expression profiles across cell lines and conditions, sequence composition, phylogenetic profiles (presence/absence of syntenic conservation across 17 species), and locations relative to genes. In the classification, we first filter out TARs with unusual sequence composition and those likely resulting from cross-hybridization. We then associate some of those remaining with proximal exons having correlated expression profiles. Finally, we cluster unclassified TARs into putative novel loci, based on similar expression and phylogenetic profiles. To encapsulate our classification, we construct a Database of Active Regions and Tools (DART.gersteinlab.org). DART has special facilities for rapidly handling and comparing many sets of TARs and their heterogeneous features, synchronizing across builds, and interfacing with other resources. Overall, we find that ∼14% of the novel TARs can be associated with known genes, while ∼21% can be clustered into ∼200 novel loci. We observe that TARs associated with genes are enriched in the potential to form structural RNAs and many novel TAR clusters are associated with nearby promoters. To benchmark our classification, we design a set of experiments for testing the connectivity of novel TARs. Overall, we find that 18 of the 46 connections tested validate by RT-PCR and four of five sequenced PCR products confirm connectivity unambiguously.
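The final classification step described above (grouping unclassified TARs into putative novel loci by similar expression and phylogenetic profiles) could be sketched roughly as follows; the profile matrices, distance metric and cutoff are synthetic assumptions for illustration, not the authors' pipeline or ENCODE data:

```python
# Hedged sketch of clustering TARs into putative loci by combined expression and
# phylogenetic profiles. All inputs here are synthetic stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
n_tars = 50
expression = rng.normal(size=(n_tars, 10))          # expression across 10 conditions (toy)
phylo = rng.integers(0, 2, size=(n_tars, 17))       # presence/absence across 17 species (toy)

# Combine the two feature sets into one profile per TAR (z-scored so neither dominates).
features = np.hstack([
    (expression - expression.mean(0)) / expression.std(0),
    (phylo - phylo.mean(0)) / (phylo.std(0) + 1e-9),
])

# Average-linkage hierarchical clustering with a distance cutoff defines candidate loci.
tree = linkage(features, method="average", metric="euclidean")
loci = fcluster(tree, t=8.0, criterion="distance")
print(f"{len(set(loci))} putative loci among {n_tars} TARs")
```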
Abstract:
A new multimodal biometric database designed and acquired within the framework of the European BioSecure Network of Excellence is presented. It is comprised of more than 600 individuals acquired simultaneously in three scenarios: 1) over the Internet, 2) in an office environment with desktop PC, and 3) in indoor/outdoor environments with mobile portable hardware. The three scenarios include a common part of audio/video data. Also, signature and fingerprint data have been acquired both with desktop PC and mobile portable hardware. Additionally, hand and iris data were acquired in the second scenario using desktop PC. Acquisition has been conducted by 11 European institutions. Additional features of the BioSecure Multimodal Database (BMDB) are: two acquisition sessions, several sensors in certain modalities, balanced gender and age distributions, multimodal realistic scenarios with simple and quick tasks per modality, cross-European diversity, availability of demographic data, and compatibility with other multimodal databases. The novel acquisition conditions of the BMDB allow us to perform new challenging research and evaluation of either monomodal or multimodal biometric systems, as in the recent BioSecure Multimodal Evaluation campaign. A description of this campaign including baseline results of individual modalities from the new database is also given. The database is expected to be available for research purposes through the BioSecure Association during 2008.
Abstract:
The teaching of advertising creativity coexists with various contradictions and gaps. While this is true of any subject of study, here it is compounded by the peculiar scientific trajectory of both creativity and advertising, two phenomena of enormous social, cultural and economic impact, apparently accessible yet largely unknown. The present work investigates the possibility of systematising the evaluation of advertising creativity and assesses the teaching value of such systematisation. In a first phase, the reference domains that affect the topic are explored, as well as the connections established between them in the academic and professional spheres. The findings of this exploration are then organised and discussed, a hypothesis is formulated, and a study is designed to validate or refute it.
Abstract:
General Introduction. This thesis can be divided into two main parts: the first one, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite the wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements, whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA, it has been claimed that the benefits from improved market access keep falling short of the full potential benefits. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access of PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large extent. Part I. In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Pr. Olivier Cadot, Celine Carrère and Pr. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows. First, it shows, in the case of PANEURO, that the R-index is useful to summarize how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence to making it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument, Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore changing all instruments into an MFC would bring improved transparency, pretty much like the "tariffication" of NTBs.
The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs. In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of the good's value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be the potential model for future EU-centered PTAs. First, I have studied and coded at the six-digit level of the Harmonised System (HS) both the old RoOs, used before 1997, and the "single list" RoOs, used since 1997. Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly mix sales between the EU and the rest of the world by comparing producer prices on each market, I have estimated the trade effects of the EU RoOs. The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II. The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument having the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there was a serious risk of resumption of injurious dumping. The last chapter, written with Pr. Olivier Cadot and Pr. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis.
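Returning to the three-step simulated-MFC exercise at the start of this passage, a minimal sketch under strong simplifying assumptions (synthetic tariff-line data and a deliberately simplified linear utilization equation rather than the chapter's actual specification) could look like this:

```python
# Hedged sketch of the three-step exercise described above, with synthetic data and a
# simplified linear form u = a + b*pref + c*mfc. Not the thesis's actual model or data.
import numpy as np

rng = np.random.default_rng(1)
n_lines = 200                                  # tariff lines (toy)
pref = rng.uniform(0, 0.15, n_lines)           # tariff preference margin
mfc = rng.uniform(0.1, 0.6, n_lines)           # current (implicit) max foreign content
util = np.clip(0.2 + 2.5 * pref + 0.8 * mfc + rng.normal(0, 0.05, n_lines), 0, 1)
trade = rng.uniform(1, 100, n_lines)           # trade weights

# Step 1: estimate utilization as a function of preference and the RoO instrument.
X = np.column_stack([np.ones(n_lines), pref, mfc])
a, b, c = np.linalg.lstsq(X, util, rcond=None)[0]

# Step 2: invert the fitted relationship line by line to get the MFC that reproduces
# the observed utilization rate given the observed preference margin.
simulated_mfc = (util - a - b * pref) / c

# Step 3: the trade-weighted average across lines gives one summary equivalent MFC.
uniform_mfc = np.average(simulated_mfc, weights=trade)
print(f"Trade-weighted equivalent MFC: {uniform_mfc:.2f}")
```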
First, using Poisson and Negative Binomial regressions, the count of AD measures' revocations is regressed on (inter alia) the count of "initiations" lagged five years. The analysis yields a coefficient on measures' initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
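The count-data part of this analysis might look roughly like the following sketch, using a simulated yearly panel and a single interacted Poisson regression in place of the authors' actual worldwide AD database and specifications:

```python
# Hedged sketch: regressing yearly revocations of AD measures on initiations lagged
# five years, allowing the coefficient to differ after the 1995 agreement.
# The panel here is simulated, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
years = np.arange(1985, 2006)
panel = pd.DataFrame({"year": years, "initiations": rng.poisson(20, len(years))})
panel["init_lag5"] = panel["initiations"].shift(5)
panel["post_agreement"] = (panel["year"] >= 1995).astype(int)
# toy outcome: revocations loosely tied to 5-year-lagged initiations after 1995
panel["revocations"] = rng.poisson(
    2 + 0.4 * panel["init_lag5"].fillna(0) * panel["post_agreement"]
)
panel = panel.dropna()

X = sm.add_constant(panel[["init_lag5", "post_agreement"]])
X["lag5_x_post"] = panel["init_lag5"] * panel["post_agreement"]
poisson_fit = sm.GLM(panel["revocations"], X, family=sm.families.Poisson()).fit()
print(poisson_fit.summary())
```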
Abstract:
Does financial development result in capital being reallocated more rapidly to industries where it is most productive? We argue that if this was the case, financially developed countries should see faster growth in industries with investment opportunities due to global demand and productivity shifts. Testing this cross-industry cross-country growth implication requires proxies for (latent) global industry investment opportunities. We show that tests relying only on data from specific (benchmark) countries may yield spurious evidence for or against the hypothesis. We therefore develop an alternative approach that combines benchmark-country proxies with a proxy that does not reflect opportunities specific to a country or level of financial development. Our empirical results yield clear support for the capital reallocation hypothesis.
Abstract:
It is common in econometric applications that several hypothesis tests are carried out at the same time. The problem then becomes how to decide which hypotheses to reject, accounting for the multitude of tests. In this paper, we suggest a stepwise multiple testing procedure which asymptotically controls the familywise error rate at a desired level. Compared to related single-step methods, our procedure is more powerful in the sense that it often will reject more false hypotheses. In addition, we advocate the use of studentization when it is feasible. Unlike some stepwise methods, our method implicitly captures the joint dependence structure of the test statistics, which results in increased ability to detect alternative hypotheses. We prove our method asymptotically controls the familywise error rate under minimal assumptions. We present our methodology in the context of comparing several strategies to a common benchmark and deciding which strategies actually beat the benchmark. However, our ideas can easily be extended and/or modified to other contexts, such as making inference for the individual regression coefficients in a multiple regression framework. Some simulation studies show the improvements of our methods over previous proposals. We also provide an application to a set of real data.
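A rough illustration of a step-down, bootstrap-based procedure of this kind for the strategies-versus-benchmark setting (simplified, with simulated returns; not the authors' exact algorithm):

```python
# Hedged sketch of a step-down multiple-testing procedure controlling the familywise
# error rate when comparing several strategies to a common benchmark.
import numpy as np

def stepdown_test(excess, alpha=0.05, n_boot=2000, seed=0):
    """excess: (T, K) array of strategy returns minus the benchmark return."""
    rng = np.random.default_rng(seed)
    T, K = excess.shape
    mean = excess.mean(0)
    se = excess.std(0, ddof=1) / np.sqrt(T)
    tstat = mean / se                               # studentized statistics
    centered = excess - mean                        # impose the null for resampling

    rejected = np.zeros(K, dtype=bool)
    while True:
        active = ~rejected
        if not active.any():
            break
        # bootstrap distribution of the max studentized statistic over active hypotheses
        max_stats = np.empty(n_boot)
        for b in range(n_boot):
            idx = rng.integers(0, T, T)
            bs = centered[idx]
            bs_t = bs.mean(0) / (bs.std(0, ddof=1) / np.sqrt(T))
            max_stats[b] = bs_t[active].max()
        crit = np.quantile(max_stats, 1 - alpha)
        new = active & (tstat > crit)
        if not new.any():
            break
        rejected |= new                             # step down and recompute the cutoff
    return rejected

# toy usage: 5 strategies, the first two genuinely beat the benchmark
rng = np.random.default_rng(3)
returns = rng.normal(0, 1, size=(250, 5)) + np.array([0.3, 0.25, 0.0, 0.0, 0.0])
print(stepdown_test(returns))
```

The studentization of each statistic and the use of the joint (max) bootstrap distribution mirror the two features emphasised in the abstract; the centering of the resampled returns imposes the null hypothesis.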
Abstract:
Most US credit card holders revolve high-interest debt, often combined with substantial (i) asset accumulation by retirement, and (ii) low-rate liquid assets. Hyperbolic discounting can resolve only the former puzzle (Laibson et al., 2003). Bertaut and Haliassos (2002) proposed an 'accountant-shopper' framework for the latter. The current paper builds, solves, and simulates a fully-specified accountant-shopper model, to show that this framework can actually generate both types of co-existence, as well as target credit card utilization rates consistent with Gross and Souleles (2002). The benchmark model is compared to setups without self-control problems, with alternative mechanisms, and with impatient but fully rational shoppers.
Abstract:
Unemployment rates in developed countries have recently reached levels not seen in a generation, and workers of all ages are facing increasing probabilities of losing their jobs and considerable losses in accumulated assets. These events likely increase the reliance that most older workers will have on public social insurance programs, exactly at a time that public finances are suffering from a large drop in contributions. Our paper explicitly accounts for employment uncertainty and unexpected wealth shocks, something that has been relatively overlooked in the literature, but that has grown in importance in recent years. Using administrative and household-level data we empirically characterize a life-cycle model of retirement and claiming decisions in terms of the employment, wage, health, and mortality uncertainty faced by individuals. Our benchmark model explains with great accuracy the strikingly high proportion of individuals who claim benefits exactly at the Early Retirement Age, while still explaining the increased claiming hazard at the Normal Retirement Age. We also discuss some policy experiments and their interplay with employment uncertainty. Additionally, we analyze the effects of negative wealth shocks on the labor supply and claiming decisions of older Americans. Our results can explain why early claiming has remained very high in recent years even as the early retirement penalties have increased substantially compared with previous periods, and why labor force participation has remained quite high for older workers even in the midst of the worst employment crisis in decades.
Abstract:
We study how restrictions on firm entry affect intersectoral factor reallocation when open economies experience global economic shocks. In our theoretical framework, countries trade freely in a range of differentiated sectors that are subject to country-specific and global shocks. Entry restrictions are modeled as an upper bound on the introduction of new differentiated goods following shocks. Prices and quantities adjust to clear international goods markets, and wages adjust to clear national labor markets. We show that in general equilibrium, countries with tighter entry restrictions see less factor reallocation compared to the frictionless benchmark. In our empirical work, we compare sectoral employment reallocation across countries in the 1980s and 1990s with proxies for frictionless benchmark reallocation. Our results indicate that the gap between actual and frictionless reallocation is greater in countries where it takes longer to start a firm.
Abstract:
Doubts about the reliability of a company's qualitative financial disclosure increase market participant expectations from the auditor's report. The auditing process is supposed to serve as a monitoring device that reduces management incentives to manipulate reported earnings. Empirical research confirms that it could be an efficient device under some circumstances and recognizes that our estimates of the informativeness of audit reports are unavoidably biased (e.g., because of a client's anticipation of the auditing process). This empirical study supports the significant role of auditors in the financial market, in particular in the prevention of earnings management practice. We focus on earnings misstatements, which auditors correct with an adjustment, using a sample of past and current constituents of the benchmark market index in Spain, IBEX 35, and manually collected audit adjustments reported over the 1997-2004 period (42 companies, 336 annual reports, 75 earnings misstatements). Our findings confirm that companies more often overstate than understate their earnings. An investor may foresee earnings misreporting, as manipulators have a similar profile (e.g., more leveraged and with lower sales). However, he may receive valuable information from the audit adjustment on the size of earnings misstatement, which can be significantly large (i.e., material in almost all cases). We suggest that the magnitude of an audit adjustment depends, other things constant, on annual revenues and free cash levels. We also examine how the audit adjustment relates to the observed market price, trading volume and stock returns. Our findings are that earnings manipulators have a lower price and larger trading volume compared to their rivals. Their returns are positively associated with the magnitude of earnings misreporting, which is not consistent with the possible pricing of audit information.