442 results for Stan
Abstract:
Characteristics of surveillance video generally include low resolution and poor quality due to environmental, storage and processing limitations. It is extremely difficult for computers and human operators to identify individuals from these videos. To overcome this problem, super-resolution can be used in conjunction with an automated face recognition system to enhance the spatial resolution of video frames containing the subject and narrow down the number of manual verifications performed by the human operator by presenting a list of the most likely candidates from the database. As the super-resolution reconstruction process is ill-posed, visual artifacts are often generated as a result. These artifacts can be visually distracting to humans and/or affect machine recognition algorithms. While it is intuitive that higher resolution should lead to improved recognition accuracy, the effects of super-resolution and such artifacts on face recognition performance have not been systematically studied. This paper aims to address this gap while illustrating that super-resolution allows more accurate identification of individuals from low-resolution surveillance footage. The proposed optical flow-based super-resolution method is benchmarked against Baker et al.’s hallucination and Schultz et al.’s super-resolution techniques on images from the Terrascope and XM2VTS databases. Ground truth and interpolated images were also tested to provide a baseline for comparison. Results show that a suitable super-resolution system can improve the discriminability of surveillance video and enhance face recognition accuracy. The experiments also show that Schultz et al.’s method fails when dealing with surveillance footage due to its assumption of rigid objects in the scene. The hallucination and optical flow-based methods performed comparably, with the optical flow-based method producing fewer visually distracting artifacts that interfere with human recognition.
Abstract:
This paper considers VECMs for variables exhibiting cointegration and common features in the transitory components. While the presence of cointegration between the permanent components of series reduces the rank of the long-run multiplier matrix, a common feature among the transitory components leads to a rank reduction in the matrix summarizing short-run dynamics. The common feature also implies that there exist linear combinations of the first-differenced variables in a cointegrated VAR that are white noise, and traditional tests focus on testing for this characteristic. An alternative, however, is to test the rank of the short-run dynamics matrix directly. Consequently, we use the literature on testing the rank of a matrix to produce some alternative test statistics. We also show that these are identical to one of the traditional tests. The performance of the different methods is illustrated in a Monte Carlo analysis, which is then used to re-examine an existing empirical study. Finally, this approach is applied to provide a check for the presence of common dynamics in DSGE models.
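For readers unfamiliar with the setup, the following is a generic first-order VECM in standard notation; it is an illustrative sketch of the idea described above, not necessarily the paper's exact specification.

```latex
% Illustrative first-order VECM (standard notation, assumed here):
\[
  \Delta y_t = \alpha\beta' y_{t-1} + \Gamma \Delta y_{t-1} + \varepsilon_t .
\]
% Cointegration reduces the rank of the long-run matrix \alpha\beta', while a
% common feature in the transitory components reduces the rank of the
% short-run matrix \Gamma. Any vector \delta with \delta'\alpha = 0 and
% \delta'\Gamma = 0 then yields
\[
  \delta'\Delta y_t = \delta'\varepsilon_t ,
\]
% a white-noise linear combination of the first differences; traditional tests
% look for such combinations, whereas the paper tests the rank of \Gamma
% directly.
```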
Abstract:
Forecasts generated by time series models traditionally place greater weight on more recent observations. This paper develops an alternative semi-parametric method for forecasting that does not rely on this convention and applies it to the problem of forecasting asset return volatility. In this approach, a forecast is a weighted average of historical volatility, with the greatest weight given to periods that exhibit similar market conditions to the time at which the forecast is being formed. Weighting is determined by comparing short-term trends in volatility across time (as a measure of market conditions) by means of a multivariate kernel scheme. It is found that the semi-parametric method produces forecasts that are significantly more accurate than a number of competing approaches at both short and long forecast horizons.
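The weighting scheme described above can be sketched as follows; the function, parameter names and bandwidth choice are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def kernel_volatility_forecast(vol, window=5, bandwidth=0.5):
    """Sketch of a state-dependent forecast: a kernel-weighted average of
    historical volatility, weighted by how closely each past short-term
    volatility trend resembles the current one."""
    current_trend = vol[-window:]                 # trend at the forecast origin
    weights, candidates = [], []
    for t in range(window, len(vol)):
        past_trend = vol[t - window:t]
        # Product Gaussian kernel on the difference between trend vectors.
        dist2 = np.sum(((current_trend - past_trend) / bandwidth) ** 2)
        weights.append(np.exp(-0.5 * dist2))
        candidates.append(vol[t])                 # volatility that followed that trend
    weights = np.asarray(weights)
    return float(np.dot(weights, candidates) / weights.sum())

# Toy usage with simulated volatilities
rng = np.random.default_rng(0)
vol = np.abs(rng.normal(1.0, 0.2, size=500))
print(kernel_volatility_forecast(vol))
```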
Abstract:
The performance of techniques for evaluating multivariate volatility forecasts is not yet as well understood as that of their univariate counterparts. This paper aims to evaluate the efficacy of a range of traditional statistical methods for multivariate forecast evaluation together with methods based on underlying considerations of economic theory. It is found that a statistical method based on likelihood theory and an economic loss function based on portfolio variance are the most effective means of identifying optimal forecasts of conditional covariance matrices.
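As a concrete illustration of the two kinds of criteria compared above, the sketch below implements a generic Gaussian quasi-likelihood loss and a minimum-variance-portfolio loss; the function names and the toy example are assumptions for illustration, not the paper's code.

```python
import numpy as np

def qlike_loss(H_forecast, r):
    """Gaussian quasi-likelihood loss for a covariance forecast H_forecast
    against the realised return vector r (smaller is better)."""
    _, logdet = np.linalg.slogdet(H_forecast)
    return logdet + r @ np.linalg.solve(H_forecast, r)

def portfolio_variance_loss(H_forecast, H_true):
    """Economic loss: variance, under the reference covariance H_true, of the
    minimum-variance portfolio implied by the forecast H_forecast."""
    ones = np.ones(H_forecast.shape[0])
    w = np.linalg.solve(H_forecast, ones)
    w /= ones @ w                      # minimum-variance weights from the forecast
    return w @ H_true @ w

# Toy three-asset example
H_hat = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])
r = np.array([0.01, -0.02, 0.03])
print(qlike_loss(H_hat, r), portfolio_variance_loss(H_hat, H_hat))
```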
Abstract:
Synopsis and review of the Australian prison film Everynight… Everynight (Alkinos Tsilimidos, 1994). Includes cast and credits. An opening title states that Everynight… Everynight is a true story, but due to “legal implications”, the characters have been fictionalised. Another title dedicates the film to the memory of Christopher Dale Flannery, an infamous underworld figure known as ‘Mr Rent-a-Kill’ who spent time in H Division in the 1970s and 1980s. Originally from Melbourne, Flannery was a major figure in the Sydney ‘gang wars’ of 1984-85, dramatised in the television series Underbelly: A Tale of Two Cities (2009). He disappeared in mid-1985; there are several conflicting stories about his fate. The character of Bryant appears to have been based on Stan Taylor, who had spent time in H Division with Flannery. Taylor was sentenced to life imprisonment without parole in 1988 for the 1986 bombing of police headquarters in Melbourne...
Abstract:
This paper develops analytical distributions of temperature indices on which temperature derivatives are written. If the deviations of daily temperatures from their expected values are modelled as an Ornstein-Uhlenbeck process with time-varying variance, then the distribution of the temperature index on which the derivative is written is the sum of truncated, correlated Gaussian deviates. The key result of this paper is to provide an analytical approximation to the distribution of this sum, thus allowing the accurate computation of payoffs without the need for any simulation. A data set comprising average daily temperatures spanning over a hundred years for four Australian cities is used to demonstrate the efficacy of this approach for estimating the payoffs to temperature derivatives. It is demonstrated that expected payoffs computed directly from historical records are a particularly poor approach to the problem when there are trends in the underlying average daily temperature. It is shown that the proposed analytical approach is superior to historical pricing.
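In symbols, the model described above can be written roughly as follows; the notation and the degree-day index used in the example are assumed for illustration and need not match the paper's.

```latex
% Deviations X_t of daily temperature T_t from its expected (seasonal)
% value \bar{T}_t, modelled as an Ornstein-Uhlenbeck process with
% time-varying variance:
\[
  X_t = T_t - \bar{T}_t, \qquad
  dX_t = -\kappa X_t\, dt + \sigma_t\, dW_t .
\]
% A typical index underlying a temperature derivative, e.g. heating degree
% days over a contract period [t_1, t_2] with an (assumed) 18\,^{\circ}C base,
\[
  \mathrm{HDD} = \sum_{t=t_1}^{t_2} \max\bigl(18 - T_t,\, 0\bigr),
\]
% is then a sum of truncated, correlated Gaussian deviates, whose
% distribution the paper approximates analytically.
```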
Abstract:
This book provides a general framework for specifying, estimating, and testing time series econometric models. Special emphasis is given to estimation by maximum likelihood, but other methods are also discussed, including quasi-maximum likelihood estimation, generalized method of moments estimation, nonparametric estimation, and estimation by simulation. An important advantage of adopting the principle of maximum likelihood as the unifying framework for the book is that many of the estimators and test statistics proposed in econometrics can be derived within a likelihood framework, thereby providing a coherent vehicle for understanding their properties and interrelationships. In contrast to many existing econometric textbooks, which deal mainly with the theoretical properties of estimators and test statistics through a theorem-proof presentation, this book squarely addresses implementation to provide direct conduits between the theory and applied work.
Abstract:
The occurrence of extreme movements in the spot price of electricity represents a significant source of risk to retailers. A range of approaches have been considered with respect to modelling electricity prices; these models, however, have relied on time-series approaches, which typically use restrictive decay schemes placing greater weight on more recent observations. This study develops an alternative, semi-parametric method for forecasting, which uses state-dependent weights derived from a kernel function. The forecasts that are obtained using this method are accurate and therefore potentially useful to electricity retailers in terms of risk management.
Abstract:
Review(s) of: Settling the Pop Score: Pop Texts and Identity Politics, Stan Hawkins, Aldershot, Hants.: Ashgate, 2002, ISBN 0 7546 0352 0; pb, 234pp, ill, music exx, bibl., discog., index. The scholarly study of popular music has its origins in sociology and cultural studies, disciplinary areas in which musical meaning is often attributed to aspects of economic and sociological function. Against this tradition, recent writers have offered what is now referred to as ‘popular musicology’: a method or approach that tends towards a specific engagement with ‘pop texts’ on aesthetic, and perhaps even ‘musical’ terms. Stan Hawkins uses the term popular musicology ‘at his own peril,’ clearly recognising the implicit scholarly danger in his approach, whereby ‘formalist questions of musical analysis’ are dealt with ‘alongside the more intertextual discursive theorisations of musical expression’ (p. xii). In other words, popular musicologists dare to tread that fine line between text and context. As editor of the journal Popular Musicology Online, Hawkins is a leading advocate of this practice, specifically in the application of music-analytical techniques to popular music. His methodology attests to the influence of other leading figures in the area, notably Richard Middleton, Allan F. Moore and Derek Scott (general editor of the Ashgate Popular and Folk Music Series in which this book is published).
Abstract:
Techniques for evaluating and selecting multivariate volatility forecasts are not yet understood as well as their univariate counterparts. This paper considers the ability of different loss functions to discriminate between a set of competing forecasting models which are subsequently applied in a portfolio allocation context. It is found that a likelihood-based loss function outperforms its competitors, including those based on the given portfolio application. This result indicates that considering the particular application of forecasts is not necessarily the most effective basis on which to select models.
Abstract:
In this 1972 documentary, The Computer Generation, by John Musilli, artist Stan Vanderbeek talks about the possibility of computers as an artist's tool. My aim in drawing on this documentary is to compare the current state of transmedia with previous significant changes in media history, and to illustrate how diverse the current state of transmedia is.