150 results for STEP-NC
Abstract:
Today, temporary labour migration is a policy approach actively promoted by major economic and political actors such as the EC, the GCIM, and the OECD. Based on an empirical case study of temporary and circular labour migration in the Catalonian agrarian sector, which has been singled out as a particularly successful formula, we identify a new area of interest: the emergence of a new empirical migrant category, the Circular Labour Migrant, which remains theoretically unnamed and lacks public recognition. We argue that there have so far been two historical phases of temporary labour migration: one of total deregulation and another of partial regulation, led by private actors with the support of public institutions and featuring circularity. In a developed welfare-state context, it would be normatively pertinent to expect a step towards a third phase, one involving the institutionalization of this new mobility category through the elaboration of a public policy.
Abstract:
This research report covers post-doctoral activities conducted between September 2010 and March 2011 at Pompeu Fabra University, Barcelona. It seeks to identify the consequences of the convergence phenomenon for photojournalism. In a more general approach, the effort is to recover the structural elements of the concept of convergence in journalism. It also aims to map the current debates on the repositioning of photographic practices linked to news production amid the widespread adoption of digital devices in the contemporary workflow. The report further addresses the analysis of photographic collectives as a result of the convergence framework applied to photojournalism; the debate on modes of funding; alternatives to the alleged crisis of press photography; and, finally, it proposes qualifying stages in the development of photojournalism in the digital age, together with hypotheses concerning the structure of productive routines. In addition, we present three cases to be analyzed in order to explore and verify the occurrence of characteristics that may identify the object of research in the current state of practice. Finally, we develop a series of conclusions, revisiting the main hypotheses. With this strategy, it is possible to define a sequence of analysis capable of addressing the characteristics present in the cases studied here and in others in the future, and thus to affirm this stage as a step in the continuous historical course of photojournalism.
Abstract:
This project consists of the sound production for a short film. From pre-production to sound design, all the intermediate processes needed to arrive at a high-quality, innovative audiovisual product are analysed and carried out. Several tasks have been undertaken: from the shooting of the short film and the composition of the soundtrack to the 5.1 mix, among others. This report details, step by step, the process followed to achieve the expected result, together with the preliminary reflections and the investigation of the various options. The main objective is to go deeper into the field of sound design and to create sounds and effects that make an impact on the viewer and do not leave them indifferent.
Abstract:
We present a new method for constructing exact distribution-free tests (and confidence intervals) for variables that can generate more than two possible outcomes. This method separates the search for an exact test from the goal of creating a non-randomized test. Randomization is used to extend any exact test relating to means of variables with finitely many outcomes to variables with outcomes belonging to a given bounded set. Tests in terms of variance and covariance are reduced to tests relating to means. Randomness is then eliminated in a separate step. This method is used to create confidence intervals for the difference between two means (or variances) and tests of stochastic inequality and correlation.
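A minimal sketch of the randomization step described above, under assumptions not stated in the abstract: outcomes are rescaled to [0, 1], and the exact finite-outcome test is the binomial test. Each observation X_i is replaced by a Bernoulli draw B_i with success probability X_i, which preserves the mean and reduces the problem to an exact binomial test; the function name is hypothetical.

```python
import numpy as np
from scipy.stats import binomtest

def randomized_exact_mean_test(x, mu0, alternative="greater", rng=None):
    """Exact (randomized) test of H0: E[X] <= mu0 for outcomes in [0, 1].

    Each observation is replaced by a Bernoulli draw with success
    probability equal to the observation, which preserves the mean and
    reduces the problem to an exact binomial test.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    if np.any((x < 0) | (x > 1)):
        raise ValueError("rescale outcomes to the bounded set [0, 1] first")
    b = rng.random(x.size) < x          # randomization step: B_i ~ Bernoulli(X_i)
    k, n = int(b.sum()), b.size
    return binomtest(k, n, p=mu0, alternative=alternative).pvalue

# Example: test H0: E[X] <= 0.5 on bounded data
pval = randomized_exact_mean_test([0.9, 0.7, 0.8, 0.95, 0.6, 0.85], mu0=0.5)
```

The de-randomization step the abstract mentions is omitted here: a single randomized draw already yields an exact level, and the randomness is eliminated separately, as the paper describes.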
Abstract:
From the beginning of historical times, mathematics has been generated in every civilization on the basis of solving practical problems. Nevertheless, from the Greek period onwards, history shows us the need to take one step further: the historical evolution of mathematics places methods of reasoning at the central axis of mathematical research. Glancing over the objectives and working methods of some fundamental authors in the history of mathematical concepts, we postulate the learning of the forms of mathematical reasoning as the central objective of mathematics education, and problem solving as the most efficient means of achieving this objective.
Abstract:
It is common in econometric applications that several hypothesis tests are carried out at the same time. The problem then becomes how to decide which hypotheses to reject, accounting for the multitude of tests. In this paper, we suggest a stepwise multiple testing procedure which asymptotically controls the familywise error rate at a desired level. Compared to related single-step methods, our procedure is more powerful in the sense that it often will reject more false hypotheses. In addition, we advocate the use of studentization when it is feasible. Unlike some stepwise methods, our method implicitly captures the joint dependence structure of the test statistics, which results in increased ability to detect alternative hypotheses. We prove that our method asymptotically controls the familywise error rate under minimal assumptions. We present our methodology in the context of comparing several strategies to a common benchmark and deciding which strategies actually beat the benchmark. However, our ideas can easily be extended and/or modified to other contexts, such as making inference for the individual regression coefficients in a multiple regression framework. Some simulation studies show the improvements of our methods over previous proposals. We also provide an application to a set of real data.
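A minimal sketch of a bootstrap step-down procedure in the spirit of the one described, with illustrative choices not in the abstract: studentized statistics for H0_j: mu_j <= 0, critical values from the bootstrap distribution of the maximum statistic over the hypotheses still under consideration, and rejection in rounds until nothing further is rejected. The function name is hypothetical.

```python
import numpy as np

def stepdown_maxT(x, alpha=0.05, n_boot=2000, rng=0):
    """Step-down max-T multiple testing for H0_j: mu_j <= 0, j = 1..k.

    x: (n, k) array, one column per hypothesis.
    Returns a boolean array marking rejected hypotheses.
    """
    rng = np.random.default_rng(rng)
    n, k = x.shape
    mu, se = x.mean(0), x.std(0, ddof=1) / np.sqrt(n)
    t = mu / se                                  # studentized statistics
    xc = x - mu                                  # centre: impose the null
    idx = rng.integers(0, n, size=(n_boot, n))   # bootstrap resampling indices
    rejected = np.zeros(k, dtype=bool)
    while not rejected.all():
        active = ~rejected                       # hypotheses still in play
        # bootstrap distribution of the max studentized statistic over the active set
        tb = np.empty(n_boot)
        for b in range(n_boot):
            xb = xc[idx[b]]
            tstar = xb.mean(0) / (xb.std(0, ddof=1) / np.sqrt(n))
            tb[b] = tstar[active].max()
        crit = np.quantile(tb, 1 - alpha)
        new = active & (t > crit)
        if not new.any():
            break
        rejected |= new                          # step down and recompute
    return rejected

# Toy use: 5 strategies' excess returns over a benchmark, H0_j: mean <= 0
rng0 = np.random.default_rng(1)
data = rng0.normal(loc=[0.0, 0.0, 0.3, 0.4, 0.5], size=(200, 5))
print(stepdown_maxT(data))
```

Because the active set shrinks at each round, the critical value weakly decreases, which is what makes a step-down procedure reject more false hypotheses than its single-step counterpart.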
Spanning tests in return and stochastic discount factor mean-variance frontiers: A unifying approach
Abstract:
We propose new spanning tests that assess whether the initial and additional assets share the economically meaningful cost- and mean-representing portfolios. We prove their asymptotic equivalence to existing tests under local alternatives. We also show that, unlike two-step or iterated procedures, single-step methods such as continuously updated GMM yield numerically identical overidentifying restrictions tests, so there is arguably a single spanning test. To prove these results, we extend optimal GMM inference to deal with singularities in the long-run second moment matrix of the influence functions. Finally, we test for spanning using size and book-to-market sorted US stock portfolios.
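A minimal sketch of the continuously updated GMM estimator invoked above, not the paper's specific spanning moments: the weighting matrix is recomputed from the influence functions at every trial parameter value, and n times the minimized objective is the overidentifying restrictions (J) statistic. The instrumented linear model used as a moment function is purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def cue_gmm(moments, theta0):
    """Continuously updated GMM: the weighting matrix is re-estimated
    at every candidate theta instead of being fixed in a first step.

    moments(theta) -> (n, q) array of influence functions g_i(theta).
    Returns (theta_hat, J_stat), where J = n * minimized objective.
    """
    def objective(theta):
        g = moments(theta)                      # (n, q) moment contributions
        gbar = g.mean(0)
        S = np.cov(g, rowvar=False)             # second moment matrix (iid case)
        return gbar @ np.linalg.pinv(S) @ gbar  # pinv guards against singular S
    res = minimize(objective, theta0, method="Nelder-Mead")
    n = moments(res.x).shape[0]
    return res.x, n * res.fun

# Illustrative moment function: E[z_i (y_i - x_i * theta)] = 0
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 3))
x = z @ [0.6, 0.3, 0.1] + rng.normal(size=500)
y = 2.0 * x + rng.normal(size=500)
theta_hat, J = cue_gmm(lambda th: z * (y - x * th[0])[:, None], np.array([0.0]))
```

The pseudo-inverse here is only a crude guard against a singular second moment matrix; the paper develops a proper extension of optimal GMM inference for that case.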
Abstract:
Two main approaches are commonly used to empirically evaluate linear factor pricing models: regression and SDF methods, with centred and uncentred versions of the latter. We show that, unlike standard two-step or iterated GMM procedures, single-step estimators such as continuously updated GMM yield numerically identical values for prices of risk, pricing errors, Jensen's alphas and overidentifying restrictions tests irrespective of the validity of the model. Therefore, there is arguably a single approach regardless of whether the factors are traded or not, or of the use of excess or gross returns. We illustrate our results by revisiting Lustig and Verdelhan's (2007) empirical analysis of currency returns.
Abstract:
This paper shows that the distribution of observed consumption is not a good proxy for the distribution of heterogeneous consumers when the current tariff is an increasing block tariff. We use a two-step method to recover the "true" distribution of consumers. First, we estimate the demand function induced by the current tariff. Second, using the demand system, we specify the distribution of consumers as a function of observed consumption and recover the true distribution. Finally, we design a new two-part tariff, which allows us to evaluate the equity of having an increasing block tariff.
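A stylized numerical illustration of the point, under assumptions not in the abstract: linear demand q(theta, p) = theta - b*p, and a two-block tariff with marginal prices p1 < p2 and a kink at q_bar. Consumers with intermediate types bunch at the kink, so observed consumption has an atom there that the smooth type distribution does not; inverting the demand system off the kink recovers the types.

```python
import numpy as np

# Stylized two-block increasing tariff and linear demand (all values assumed)
b, p1, p2, q_bar = 1.0, 1.0, 2.0, 5.0      # demand slope, block prices, kink

def consumption(theta):
    """Optimal consumption of a type-theta consumer under the block tariff."""
    q1 = theta - b * p1                     # interior choice on the first block
    q2 = theta - b * p2                     # interior choice on the second block
    return np.where(q1 < q_bar, q1, np.where(q2 > q_bar, q2, q_bar))

rng = np.random.default_rng(0)
theta = rng.uniform(3.0, 10.0, size=100_000)   # smooth "true" type distribution
q = consumption(theta)

# Observed consumption bunches at the kink even though types are smooth:
print("share exactly at kink:", np.mean(np.isclose(q, q_bar)))   # an atom of mass

# Second step, off the kink: invert the demand system to recover types
theta_hat = np.where(q < q_bar, q + b * p1,
             np.where(q > q_bar, q + b * p2, np.nan))  # kink types not point-identified
```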
Abstract:
The central message of this paper is that nobody should be using the sample covariance matrix for the purpose of portfolio optimization. It contains estimation error of the kind most likely to perturb a mean-variance optimizer. In its place, we suggest using the matrix obtained from the sample covariance matrix through a transformation called shrinkage. This tends to pull the most extreme coefficients towards more central values, thereby systematically reducing estimation error where it matters most. Statistically, the challenge is to know the optimal shrinkage intensity, and we give the formula for that. Without changing any other step in the portfolio optimization process, we show on actual stock market data that shrinkage reduces tracking error relative to a benchmark index, and substantially increases the realized information ratio of the active portfolio manager.
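A minimal sketch of shrinkage in practice. scikit-learn's LedoitWolf implements a related Ledoit-Wolf estimator that shrinks the sample covariance matrix towards a scaled identity target; the paper's own shrinkage target and intensity formula may differ, so treat this as an illustration of the idea rather than the paper's exact method.

```python
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(0)
returns = rng.normal(size=(60, 100))   # 60 months of returns on 100 stocks (toy data)

lw = LedoitWolf().fit(returns)
sigma_shrunk = lw.covariance_          # shrunk covariance: convex combination of
intensity = lw.shrinkage_              # the sample covariance and a structured target

# The shrunk matrix is well-conditioned even when assets outnumber observations,
# so a mean-variance optimizer can invert it safely:
w = np.linalg.solve(sigma_shrunk, np.ones(100))
w /= w.sum()                           # unconstrained minimum-variance weights
```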
Abstract:
The generalization of simple correspondence analysis, for two categorical variables, to multiple correspondence analysis, where there may be three or more variables, is not straightforward, from both a mathematical and a computational point of view. In this paper we detail the exact computational steps involved in performing a multiple correspondence analysis, including the special aspects of adjusting the principal inertias to correct the percentages of inertia, supplementary points and subset analysis. Furthermore, we give the algorithm for joint correspondence analysis, where the cross-tabulations of all unique pairs of variables are analysed jointly. The code in the R language for every step of the computations is given, as well as the results of each computation.
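The paper's code is in R; below is a compact Python sketch of the usual indicator-matrix route, with the inertia adjustment as it is commonly stated in the correspondence analysis literature (the paper's exact computational steps may differ).

```python
import numpy as np

def mca(Z, Q):
    """Multiple correspondence analysis of an indicator (dummy) matrix Z
    built from Q categorical variables, via the SVD of standardized residuals.

    Returns the principal inertias of the indicator analysis and the adjusted
    inertias that correct the percentages of inertia.
    """
    P = Z / Z.sum()                          # correspondence matrix
    r, c = P.sum(1), P.sum(0)                # row and column margins
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
    _, sv, _ = np.linalg.svd(S, full_matrices=False)
    inertias = sv ** 2                       # principal inertias of the indicator matrix
    # adjust inertias above the 1/Q threshold; below it they are coding artefacts
    keep = inertias > 1.0 / Q
    adjusted = (Q / (Q - 1.0) * (inertias[keep] - 1.0 / Q)) ** 2
    return inertias, adjusted

# Toy example: 3 categorical variables (2, 3 and 2 categories) coded as dummies
rng = np.random.default_rng(0)
cats = [rng.integers(0, k, 200) for k in (2, 3, 2)]
Z = np.hstack([np.eye(k)[v] for v, k in zip(cats, (2, 3, 2))])
inertias, adjusted = mca(Z, Q=3)
```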
Abstract:
This paper studies the rate of convergence of an appropriate discretization scheme for the solution of the McKean-Vlasov equation introduced by Bossy and Talay. More specifically, we consider approximations of the distribution and of the density of the solution of the stochastic differential equation associated to the McKean-Vlasov equation. The scheme adopted here is a mixed one: Euler/weakly interacting particle system. If $n$ is the number of weakly interacting particles and $h$ is the uniform step in the time discretization, we prove that the rate of convergence of the distribution functions of the approximating sequence in the $L^1(\Omega \times \mathbb{R})$ norm and in the sup norm is of the order of $\frac{1}{\sqrt{n}} + h$, while for the densities it is of the order $h + \frac{1}{\sqrt{nh}}$. This result is obtained by carefully employing techniques of Malliavin calculus.
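A minimal sketch of the mixed Euler/weakly interacting particle scheme, under an illustrative model choice not made in the abstract: drift $b(x, \mu)$ equal to the integral of the linear attraction kernel $K(z) = -z$ against the empirical measure, and a constant diffusion coefficient.

```python
import numpy as np

def euler_particle_mckean_vlasov(n=1000, h=0.01, T=1.0, sigma=1.0, rng=0):
    """Mixed Euler / weakly interacting particle scheme for a McKean-Vlasov SDE
    dX_t = b(X_t, mu_t) dt + sigma dW_t,  mu_t = law of X_t.

    Illustrative drift: b(x, mu) = E_mu[K(x - Y)] with K(z) = -z, i.e. each
    particle is attracted to the empirical mean of the system.
    """
    rng = np.random.default_rng(rng)
    x = rng.normal(size=n)                       # mu_0 = N(0, 1)
    for _ in range(int(T / h)):
        drift = -(x - x.mean())                  # b(x_i, mu^n) via the empirical measure
        x = x + drift * h + sigma * np.sqrt(h) * rng.normal(size=n)
    return x                                     # sample approximating mu_T

# The empirical CDF of the n particles approximates the distribution function
# of X_T with error of order 1/sqrt(n) + h (the abstract's rate for distributions).
particles = euler_particle_mckean_vlasov()
```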
Abstract:
This paper establishes a general framework for the metric scaling of any distance measure between individuals based on a rectangular individuals-by-variables data matrix. The method allows visualization of both individuals and variables, while preserving all the good properties of principal axis methods such as principal components and correspondence analysis, based on the singular value decomposition, including the decomposition of variance into components along principal axes, which provide the numerical diagnostics known as contributions. The idea is inspired by the chi-square distance in correspondence analysis, which weights each coordinate by an amount calculated from the margins of the data table. In weighted metric multidimensional scaling (WMDS) we allow these weights to be unknown parameters, estimated from the data to maximize the fit to the original distances. Once this extra weight-estimation step is accomplished, the procedure follows the classical path of decomposing a matrix and displaying its rows and columns in biplots.
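A minimal sketch of the weight-estimation step as described, with assumed details: squared weighted Euclidean distances d_ij^2 = sum_k w_k (x_ik - x_jk)^2, weights fitted by least squares to a given target distance matrix, followed by classical scaling of the weighted data. Function names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist, squareform

def wmds(X, D_target, ndim=2):
    """Weighted metric MDS sketch: estimate per-variable weights w_k so that
    weighted Euclidean distances on X best match D_target, then perform
    classical scaling of the weighted data matrix."""
    n, p = X.shape
    d_t = squareform(D_target)               # condensed target distances

    def stress(logw):                         # log-parametrization keeps w_k > 0
        w = np.exp(logw)
        d = pdist(X * np.sqrt(w))             # weighted Euclidean distances
        return np.sum((d - d_t) ** 2)

    res = minimize(stress, np.zeros(p), method="L-BFGS-B")
    w = np.exp(res.x)
    Xw = (X - X.mean(0)) * np.sqrt(w)         # centre, then weight coordinates
    U, s, _ = np.linalg.svd(Xw, full_matrices=False)
    return U[:, :ndim] * s[:ndim], w          # principal coordinates, weights

# Toy use: recover weights for distances generated with known weights (2, 1, 0.25)
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
D = squareform(pdist(X * np.sqrt([2.0, 1.0, 0.25])))
coords, w_hat = wmds(X, D)
```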
Abstract:
The remarkable growth of the older population has moved long-term care to the front ranks of the social policy agenda. Understanding the factors that determine the type and amount of formal care is important for predicting use in the future and for developing long-term policy. In this context, we jointly analyze the choice of care (formal, informal, both together, or none) as well as the number of hours of care received. Given that the number of hours of care is not independent of the type of care received, we estimate, for the first time in this area of research, a sample selection model with the particularity that the first step is a multinomial logit model. With regard to the debate about complementarity or substitutability between formal and informal care, our results indicate that formal care acts as a reinforcement of family care in certain cases: for very old care recipients, where the individual has multiple disabilities, when many care hours are provided, and in cases of mental illness and/or dementia. There are substantial differences between the long-term care provided to younger and to older dependent people, and dependent women are at risk of becoming more vulnerable to the shortage of informal caregivers in the future. Finally, we document great disparities in the availability of public social care across regions.
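A minimal sketch of a two-step sample selection estimator with a multinomial logit first step. The correction term below follows one common approach (in the spirit of Lee, 1983) and is an assumption, not the paper's stated method; variable names and data are purely illustrative.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def selection_two_step(X, choice, hours, j):
    """Two-step sample selection with a multinomial logit first step.

    Step 1: multinomial logit for the care choice.
    Step 2: OLS for hours on the subsample choosing alternative j, adding a
    selectivity correction lambda = phi(Phi^-1(P_j)) / P_j (assumed, Lee-style).
    """
    Xc = sm.add_constant(X)
    mnl = sm.MNLogit(choice, Xc).fit(disp=0)          # first step
    P_j = mnl.predict(Xc)[:, j]                       # predicted prob. of choice j
    lam = norm.pdf(norm.ppf(P_j)) / P_j               # selectivity correction term
    sel = choice == j                                 # subsample choosing j
    X2 = sm.add_constant(np.column_stack([X[sel], lam[sel]]))
    return sm.OLS(hours[sel], X2).fit()               # second step

# Toy data: 4 care alternatives, hours modelled for those choosing formal care (j=1)
rng = np.random.default_rng(0)
X = rng.normal(size=(800, 2))
choice = rng.integers(0, 4, size=800)                 # placeholder choices
hours = 5 + X @ [1.0, -0.5] + rng.normal(size=800)
res = selection_two_step(X, choice, hours, j=1)
```

In a real application the second-step standard errors would also need to account for the estimated correction term.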
Abstract:
We revisit the debt overhang question. We first use non-parametric techniques to isolate a panel of countries on the downward-sloping section of a debt Laffer curve. In particular, overhang countries are ones where a threshold level of debt is reached in sample, beyond which (initial) debt ends up lowering (subsequent) growth. On average, significantly negative coefficients appear when the face value of debt reaches 60 percent of GDP or 200 percent of exports, and when its present value reaches 40 percent of GDP or 140 percent of exports. Second, we depart from reduced-form growth regressions and perform direct tests of the theory on the thus selected sample of overhang countries. In the spirit of event studies, we ask whether, as the overhang level of debt is reached: (i) investment falls precipitously, as it should when it becomes optimal to default; (ii) economic policy deteriorates observably, as it should when debt contracts become unable to elicit effort on the part of the debtor; and (iii) the terms of borrowing worsen noticeably, as they should when it becomes optimal for creditors to pre-empt default and exact punitive interest rates. We find a systematic response of investment, particularly where property rights are weakly enforced, some worsening of the policy environment, and a fall in interest rates. This easing of borrowing conditions happens because lending by the private sector virtually disappears in overhang situations, and multilateral agencies step in with concessional rates. Thus, while debt relief is likely to improve economic policy (and especially investment) in overhang countries, it is doubtful that it would ease their terms of borrowing or the burden of debt.