109 results for Laurent series
Abstract:
ADEFFI annual conference
Trinity College Dublin
October 2007
Abstract:
This article addresses the issue of ‘European popular cinema’ by discussing a very specific phenomenon: the crime series produced in the years immediately preceding World War I (e.g. Victorin Jasset’s Nick Carter, Viggo Larsen’s Arsène Lupin contra Sherlock, Ubaldo Maria del Colle’s Raffles, il ladro misterioso, Louis Feuillade’s Fantômas, George Pearson’s Ultus). On the one hand, the transnational circulation of these films is seen as a result of the development of the European cultural industries since the late nineteenth century; on the other hand, the rapid decline of the genre testifies to the historical peculiarity of this production. In particular, the popular heroic figure of the ‘gentleman thief’ seems to express at once the liberating, anti-hierarchical ethos of modernization and the dream of a quiet reconciliation of new and traditional values. It might therefore be regarded as a telling example of the economic, social and ideological transformations of that crucial phase in European history, when the second industrial revolution and the first phase of ‘globalization’ pointed to the birth of a supranational sphere, a process temporarily halted by the outbreak of World War I.
Abstract:
Objective
To investigate the effect of fast food consumption on mean population body mass index (BMI) and explore the possible influence of market deregulation on fast food consumption and BMI.
Methods
The within-country association between fast food consumption and BMI in 25 high-income member countries of the Organisation for Economic Co-operation and Development between 1999 and 2008 was explored through multivariate panel regression models, after adjustment for per capita gross domestic product, urbanization, trade openness, lifestyle indicators and other covariates. The possible mediating effect of annual per capita intake of soft drinks, animal fats and total calories on the association between fast food consumption and BMI was also analysed. Two-stage least squares regression models were estimated, with economic freedom as an instrumental variable, to study the causal effect of fast food consumption on BMI.
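The instrumental-variable step can be sketched in a few lines. The following is a minimal two-stage least squares illustration on synthetic data; the variable names, coefficients and sample size are assumptions for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated setting: z = economic freedom (instrument),
# x = fast food transactions per capita, y = BMI.
# u is an unobserved confounder affecting both x and y, which biases plain OLS.
u = rng.normal(size=n)
z = rng.normal(size=n)
x = 0.8 * z + 0.5 * u + rng.normal(size=n)
y = 0.023 * x + 0.5 * u + rng.normal(scale=0.1, size=n)

def two_stage_least_squares(y, x, z):
    """2SLS with a single endogenous regressor and a single instrument."""
    Z = np.column_stack([np.ones_like(z), z])
    # Stage 1: project the endogenous regressor onto the instrument.
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    # Stage 2: regress the outcome on the fitted values.
    X_hat = np.column_stack([np.ones_like(x_hat), x_hat])
    return np.linalg.lstsq(X_hat, y, rcond=None)[0][1]  # slope estimate

beta_iv = two_stage_least_squares(y, x, z)

# Naive OLS slope for comparison (biased upward by the confounder u)
X = np.column_stack([np.ones_like(x), x])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0][1]
```

The first stage replaces the endogenous regressor with its projection onto the instrument, so the second-stage slope is purged of the confounding that inflates the plain OLS estimate in this simulation.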
Findings
After adjustment for covariates, each 1-unit increase in annual fast food transactions per capita was associated with an increase of 0.033 kg/m2 in age-standardized BMI (95% confidence interval, CI: 0.013–0.052). Only the intake of soft drinks – not animal fat or total calories – mediated the observed association (β: 0.030; 95% CI: 0.010–0.050). Economic freedom was an independent predictor of fast food consumption (β: 0.27; 95% CI: 0.16–0.37). When economic freedom was used as an instrumental variable, the association between fast food and BMI weakened but remained significant (β: 0.023; 95% CI: 0.001–0.045).
Conclusion
Fast food consumption is an independent predictor of mean BMI in high-income countries. Market deregulation policies may contribute to the obesity epidemic by facilitating the spread of fast food.
Abstract:
This Letter describes the development and SAR of a novel series of GlyT1 inhibitors derived from a scaffold hopping approach, in lieu of an HTS campaign, which provided an intellectual property position. Members of this new [3.3.0]-based series, e.g. I, displayed excellent GlyT1 potency, selectivity and free fraction, and modest CNS penetration. Moreover, enantioselective GlyT1 inhibition was observed within this novel series and a number of other piperidine bioisosteric cores.
Abstract:
This commentary examines two principal forms of inequality and their evolution since the 1960s: the division of national income between capital and labour, and the share of total income held by the top 1 per cent of earners. Trends are linked to current discussions of inequality drivers such as financialisation, and a brief time-series analysis of the effects of trade and financial sector growth on top incomes is presented.
Abstract:
The paper addresses the issue of the choice of bandwidth in the application of semiparametric estimation of the long memory parameter in a univariate time series process. The focus is on the properties of forecasts from the long memory model. A variety of cross-validation methods based on out-of-sample forecasting properties are proposed. These procedures are used for the choice of bandwidth and for subsequent model selection. Simulation evidence is presented that demonstrates the advantage of the proposed methodology.
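One concrete instance of this setup is the log-periodogram (GPH) estimator of the long memory parameter d, whose bandwidth m can be chosen by out-of-sample forecast performance. The following sketch uses assumed simulation settings and a simple holdout scheme; it illustrates the idea rather than reproducing the paper's specific procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

def frac_weights(d, n):
    """Coefficients of (1 - L)^d: w[0] = 1, w[k] = w[k-1] * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def simulate_arfima(d, n, burn=500):
    """ARFIMA(0, d, 0) via the truncated MA(inf) weights of (1 - L)^(-d)."""
    psi = frac_weights(-d, n + burn)          # psi[k] = psi[k-1] * (k - 1 + d) / k
    eps = rng.normal(size=n + burn)
    return np.convolve(eps, psi)[:n + burn][burn:]

def gph(x, m):
    """Log-periodogram (GPH) estimate of d from the first m Fourier frequencies."""
    n = len(x)
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    reg = np.column_stack([np.ones(m), -np.log(4.0 * np.sin(lam / 2.0) ** 2)])
    return np.linalg.lstsq(reg, np.log(I), rcond=None)[0][1]

def one_step_forecasts(x, d, start):
    """Forecast x[t] from x[:t] via the AR(inf) form x_t = -sum_k pi_k x_{t-k}."""
    pi = frac_weights(d, len(x) + 1)
    return np.array([-np.dot(pi[1:t + 1], x[:t][::-1]) for t in range(start, len(x))])

n, d_true, split = 1500, 0.3, 1200
x = simulate_arfima(d_true, n)

# Candidate bandwidths m = split**a; keep the one whose implied model forecasts best
candidates = [int(split ** a) for a in (0.4, 0.5, 0.6, 0.7)]

def cv_error(m):
    d_hat = gph(x[:split], m)
    return np.mean((one_step_forecasts(x, d_hat, split) - x[split:]) ** 2)

best_m = min(candidates, key=cv_error)
d_hat = gph(x[:split], best_m)
```

Each candidate bandwidth yields an estimate of d, the estimate implies a forecasting rule, and the bandwidth minimizing the holdout forecast error is retained.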
Abstract:
Let C be a bounded cochain complex of finitely generated free modules over the Laurent polynomial ring L = R[x, x^{-1}, y, y^{-1}]. The complex C is called R-finitely dominated if it is homotopy equivalent over R to a bounded complex of finitely generated projective R-modules. Our main result characterises R-finitely dominated complexes in terms of Novikov cohomology: C is R-finitely dominated if and only if eight complexes derived from C are acyclic; these complexes are C ⊗_L R[[x, y]][(xy)^{-1}] and C ⊗_L R[x, x^{-1}][[y]][y^{-1}], and their variants obtained by swapping x and y, and replacing either indeterminate by its inverse.
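Expanding the "variants" described above, the eight Novikov-type complexes can be written out explicitly; the following list is a sketch of what the abstract implies:

```latex
\begin{gather*}
C \otimes_L R[[x, y]][(xy)^{-1}], \qquad
C \otimes_L R[[x^{-1}, y]][(x^{-1}y)^{-1}], \\
C \otimes_L R[[x, y^{-1}]][(xy^{-1})^{-1}], \qquad
C \otimes_L R[[x^{-1}, y^{-1}]][(x^{-1}y^{-1})^{-1}], \\
C \otimes_L R[x, x^{-1}][[y]][y^{-1}], \qquad
C \otimes_L R[x, x^{-1}][[y^{-1}]][y], \\
C \otimes_L R[y, y^{-1}][[x]][x^{-1}], \qquad
C \otimes_L R[y, y^{-1}][[x^{-1}]][x].
\end{gather*}
```

The first four invert the product of the two (possibly inverted) indeterminates in a joint power series ring; the last four complete in one indeterminate while keeping the other invertible.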
Abstract:
We develop a continuous-time asset price model to capture the time series momentum documented recently. The underlying stochastic delay differential system facilitates the analysis of the effects of the different time horizons used by momentum trading. By studying an optimal asset allocation problem, we find that the performance of the time series momentum strategy can be significantly improved by combining it with market fundamentals and with timing opportunity with respect to market trend and volatility. Furthermore, the results also hold for different time horizons, in out-of-sample tests and with short-sale constraints. The outperformance of the optimal strategy is immune to market states, investor sentiment and market volatility.
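The time series momentum rule itself is easy to state. The following discrete-time toy, with assumed autocorrelated returns (an illustration, not the paper's continuous-time delay model), shows the basic signal construction:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed toy return process: AR(1) with mild positive autocorrelation,
# so trailing returns carry predictive signal.
n, phi, sigma = 4000, 0.3, 0.01
r = np.empty(n)
r[0] = 0.0
for t in range(1, n):
    r[t] = phi * r[t - 1] + sigma * rng.normal()

lookback = 12
cs = np.concatenate([[0.0], np.cumsum(r)])
trailing = cs[lookback:n] - cs[:n - lookback]   # sum of the previous `lookback` returns
position = np.sign(trailing)                    # long after gains, short after losses
strategy = position * r[lookback:]              # next-period strategy return

sharpe = np.sqrt(252.0) * strategy.mean() / strategy.std()
```

Because the toy returns are positively autocorrelated, the sign of the trailing return is informative about the next return, which is the mechanism the momentum strategy exploits.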
Abstract:
Many modeling problems require estimating a scalar output from one or more time series. Such problems are usually tackled by extracting a fixed number of features from the time series (such as their statistical moments), with a consequent loss of information that leads to suboptimal predictive models. Moreover, feature extraction techniques usually make assumptions that are not met in real-world settings (e.g. uniformly sampled time series of constant length), and fail to deliver a thorough methodology for dealing with noisy data. In this paper a methodology based on functional learning is proposed to overcome these problems; the proposed Supervised Aggregative Feature Extraction (SAFE) approach makes it possible to derive continuous, smooth estimates of time series data (yielding aggregate local information), while simultaneously estimating a continuous shape function that yields optimal predictions. The SAFE paradigm enjoys several properties, such as a closed-form solution, incorporation of first- and second-order derivative information into the regressor matrix, interpretability of the generated functional predictor, and the possibility of exploiting a Reproducing Kernel Hilbert Space setting to yield nonlinear predictive models. Simulation studies are provided to highlight the strengths of the new methodology with respect to standard unsupervised feature selection approaches. © 2012 IEEE.
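A minimal sketch of the functional-learning idea: fit each irregularly sampled, variable-length series with a smooth basis expansion, then regress the scalar target on the fitted coefficients. This is a simplified stand-in rather than the authors' SAFE algorithm; the basis, sample sizes and noise levels are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def basis(t, K=6):
    # Simple polynomial basis evaluated at (possibly irregular) times in [0, 1]
    return np.column_stack([t ** k for k in range(K)])

def make_sample():
    """One noisy, irregularly sampled curve; the scalar target is a functional
    of the underlying smooth curve (here, its integral over [0, 1])."""
    n_obs = rng.integers(30, 60)              # non-constant length
    t = np.sort(rng.uniform(0.0, 1.0, n_obs)) # non-uniform sampling
    a = rng.normal(size=3)
    f = a[0] + a[1] * t + a[2] * t ** 2
    y_obs = f + 0.05 * rng.normal(size=n_obs)
    target = a[0] + a[1] / 2.0 + a[2] / 3.0   # exact integral of f
    return t, y_obs, target

def curve_coeffs(t, y, K=6, lam=1e-6):
    """Ridge-regularised least-squares fit of the basis expansion."""
    B = basis(t, K)
    return np.linalg.solve(B.T @ B + lam * np.eye(K), B.T @ y)

train = [make_sample() for _ in range(200)]
C = np.array([curve_coeffs(t, y) for t, y, _ in train])    # functional features
targets = np.array([g for _, _, g in train])
w = np.linalg.lstsq(C, targets, rcond=None)[0]             # coefficients -> target

t_new, y_new, g_new = make_sample()
pred = curve_coeffs(t_new, y_new) @ w
```

The smoothing step handles irregular sampling and noise without forcing every series to a fixed-length feature vector, which is the motivation stated in the abstract.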
Abstract:
In this paper the tracking system used to perform a scaled vehicle-barrier crash test is reported. The scaled crash test was performed as part of a wider project aimed at designing a new safety barrier made of natural building materials. The test was designed and performed as a proof of concept of the new mass-based safety barriers, and the study comprised two parts: the scaling technique and a series of scaled crash tests. The scaling method was used 1) to set the scaled test impact velocity so that the energy dissipation and the momentum transfer from the car to the barrier are reproduced, and 2) to predict the acceleration, velocity and displacement values occurring in the full-scale impact from the results obtained in a scaled test. To achieve this goal, the vehicle and barrier displacements had to be recorded together with the vehicle accelerations and angular velocities. These quantities were measured during the tests using acceleration sensors and a tracking system composed of a high-speed camera and a set of targets, from which the vehicle's linear and angular velocities were measured. A code was developed to extract the target velocities from the videos, and these velocities were then compared with those obtained by integrating the accelerations provided by the sensors, to check the reliability of the method.
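The consistency check between camera-derived and accelerometer-derived velocities can be sketched as follows; the crash pulse, frame rate, pixel resolution and noise levels are all assumptions for illustration:

```python
import numpy as np

fps = 1000.0                       # assumed high-speed camera frame rate
dt = 1.0 / fps
t = np.arange(0.0, 0.2, dt)

# Synthetic crash pulse (assumption): exponentially decaying deceleration
acc = -150.0 * np.exp(-t / 0.05)               # m/s^2
vel_true = 8.0 + np.cumsum(acc) * dt           # m/s, crude integration
pos_true = np.cumsum(vel_true) * dt            # m

# Camera track: target positions corrupted by pixel quantisation
px_per_m = 500.0
pos_cam = np.round(pos_true * px_per_m) / px_per_m

# Velocity from the tracking system: finite differences of target positions
vel_cam = np.gradient(pos_cam, dt)

# Velocity from the accelerometer: integration of noisy measured accelerations
acc_meas = acc + 2.0 * np.random.default_rng(4).normal(size=t.size)
vel_acc = 8.0 + np.cumsum(acc_meas) * dt

# Agreement between the two independent velocity estimates
rms_diff = np.sqrt(np.mean((vel_cam - vel_acc) ** 2))
```

Differentiating quantised positions amplifies pixel noise while integrating accelerometer noise accumulates drift, so comparing the two estimates, as done in the paper, is a natural cross-check of both measurement chains.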