869 results for Coordinated and Multiple Views
Abstract:
We assess the predictive ability of three VPIN metrics using two highly volatile market events in China, and examine the association between VPIN and toxicity-induced volatility through conditional probability analysis and multiple regression. We examine the dynamic relationship between VPIN and high-frequency liquidity using Vector Auto-Regression models, Granger causality tests, and impulse response analysis. Our results suggest that Bulk Volume VPIN has the best risk-warning effect among the major VPIN metrics. VPIN is positively associated with market volatility induced by toxic information flow. Most importantly, we document a positive feedback effect between VPIN and high-frequency liquidity: a negative liquidity shock boosts VPIN, which in turn leads to further liquidity drain. Our study provides empirical evidence of an intrinsic game between informed traders and market makers facing toxic information in the high-frequency trading world.
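The Bulk Volume classification behind the best-performing metric above assigns each bar's volume to buys and sells via the normal CDF of the standardized price change; VPIN is then the average order-flow imbalance over equal-volume buckets. A minimal sketch, not the paper's implementation; the price bars, bucket size, and volatility estimate `sigma` are all hypothetical:

```python
import math

def bulk_volume_vpin(prices, volumes, bucket_size, sigma):
    """Sketch of Bulk Volume VPIN: split each bar's volume into buy/sell
    shares via the normal CDF of the standardized price change, fill
    equal-volume buckets, and average |V_buy - V_sell| / bucket_size."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # normal CDF
    vb = vs = filled = 0.0
    imbalances = []
    for i in range(1, len(prices)):
        frac_buy = phi((prices[i] - prices[i - 1]) / sigma)
        remaining = volumes[i]
        while remaining > 0:
            take = min(remaining, bucket_size - filled)
            vb += take * frac_buy
            vs += take * (1.0 - frac_buy)
            filled += take
            remaining -= take
            if filled >= bucket_size:          # bucket complete
                imbalances.append(abs(vb - vs) / bucket_size)
                vb = vs = filled = 0.0
    return sum(imbalances) / len(imbalances)

# steadily rising prices: nearly all volume classified as buys -> VPIN near 1
vpin = bulk_volume_vpin([100, 101, 102, 103, 104], [0, 10, 10, 10, 10],
                        bucket_size=10, sigma=0.1)
```

In practice `sigma` would be estimated from the price-change series and the bucket size set to a fraction of average daily volume.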
Abstract:
This thesis proposes an analysis of collaboration within film projects in the work of Pierre Perrault. Since collaboration between filmmaker and participants raises ethical questions, this research studies two pivotal films in his career: Pour la suite du monde and La bête lumineuse. By contrasting the filmmaker's discourse with that of a protagonist named Stéphane-Albert Boulais, this study details the power dynamics tied to representation and analyzes the ethics of the creator. The thesis presents a comprehensive account of Pierre Perrault's thought and of his practice in both shooting and editing. It examines two cinematic fields to bring out practices before, during, and after shooting. The thesis then turns to Stéphane-Albert Boulais, whose many writings on his film experiences make it possible to multiply perspectives on collaboration. After a comparative analysis of the two cinematic fields, the thesis concludes with a detailed analysis of the ethics of the creator within collaborative projects.
Abstract:
When a site or property is protected by heritage status, whether national or local, planning regulations are among the principal tools for governing changes to the built environment. How do these regulations contribute to the conservation of heritage values? To explore this question, we chose the case of the historic and natural district of Mount Royal (AHNMR, renamed site patrimonial du Mont-Royal in 2012), a site central to Montreal's identity. We catalogued the heritage values attributed to the site and analyzed the project-management process in the four boroughs that share the Montreal portion of the AHNMR territory; we also analyzed several permit applications. The process is complex, all the more so because the evaluation is largely discretionary, involving analyses by civil servants and advisory committees as well as public consultation exercises. The research shows that planning regulations tend to concentrate on values whose material form is known (architectural and landscape values in particular) and to neglect intangible values (use values, identity and emblematic values). The juxtaposition of values can mitigate this imbalance by protecting an intangible value through a material one. Documenting heritage values and their embodiment in a site's layout is of major importance for applying evaluation criteria. Moreover, discretionary evaluation brings multiple points of view to a project, including the opinions of actors, heritage experts or not, who are generally absent from project evaluation, which contributes to the evolution of projects. Public consultations lead to the re-evaluation of heritage values and to the deepening of knowledge.
Abstract:
The thesis entitled "Queueing Models with Vacations and Working Vacations" consists of seven chapters, including the introductory chapter. In chapters 2 to 7 we analyze different queueing models, highlighting the role played by vacations and working vacations. The duration of vacation is exponentially distributed in all these models, and a multiple vacation policy is followed. In chapter 2 we discuss an M/M/2 queueing system with heterogeneous servers, one of which is always available while the other goes on vacation in the absence of customers waiting for service. A conditional stochastic decomposition of the queue length is derived. An illustrative example is provided to study the effect of the input parameters on the system performance measures. Chapter 3 considers a setup similar to chapter 2. The model is analyzed in essentially the same way as in chapter 2, and a numerical example is provided to bring out the qualitative nature of the model. The MAP is a class of point processes which is in general nonrenewal; in spite of its versatility it remains highly tractable. Phase type distributions are ideally suited for applying matrix analytic methods. In all the remaining chapters we assume the arrival process to be MAP and the service process to be phase type. In chapter 4 we consider a MAP/PH/1 queue with working vacations: at a departure epoch, a server finding the system empty takes a vacation, and a customer arriving during a vacation is served, but at a lower rate. Chapter 5 discusses a MAP/PH/1 retrial queueing system with working vacations. In chapter 6 the setup of the model is similar to that of chapter 5; the significant difference in this model is that there is a finite buffer for arrivals. Chapter 7 considers an MMAP(2)/PH/1 queueing model with a finite retrial group.
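The models above are analyzed with balance-equation and matrix-analytic machinery; the thesis's systems (heterogeneous servers, MAP arrivals, phase-type service) are far richer, but the core computation can be illustrated on a plain truncated M/M/1 queue: build the generator matrix and solve pi Q = 0 with a normalization row. The truncation level and rates below are illustrative only:

```python
import numpy as np

def mm1_stationary(lam, mu, N=200):
    """Stationary distribution of an M/M/1 queue truncated at N customers,
    found by solving pi Q = 0 together with sum(pi) = 1."""
    Q = np.zeros((N + 1, N + 1))
    for n in range(N + 1):
        if n < N:
            Q[n, n + 1] = lam          # arrival
        if n > 0:
            Q[n, n - 1] = mu           # service completion
        Q[n, n] = -Q[n].sum()          # diagonal makes rows sum to zero
    A = Q.T.copy()
    A[-1, :] = 1.0                     # replace one equation by normalization
    b = np.zeros(N + 1)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

pi = mm1_stationary(0.5, 1.0)
# with rho = 0.5 the exact result is geometric: pi_n = (1 - rho) * rho**n
```

Matrix-analytic methods exploit the block structure that MAP arrivals and PH services induce in this same generator, rather than solving the flat system directly.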
Abstract:
To assess the prevalence of faecal coliform bacteria and multiple drug resistance among Escherichia coli and Salmonella serotypes from Vembanadu Lake. Study design: Systematic microbiological testing. Methods: Water samples were collected monthly from ten stations on the southern and northern sides of a salt-water regulator constructed in Vembanadu Lake to prevent incursion of seawater during certain periods of the year. The density of faecal coliform bacteria was estimated. E. coli and Salmonella were isolated and their different serotypes identified. Antibiotic resistance analysis of the E. coli and Salmonella serotypes was carried out and the MAR index of individual isolates was calculated. Results: The density of faecal coliform bacteria ranged over mean MPN values of 2900-7100/100 ml. The results showed multiple drug resistance patterns among the bacterial isolates. E. coli showed more than 50% resistance to amikacin, oxytetracycline, streptomycin, tetracycline and kanamycin, while Salmonella showed high resistance to oxytetracycline, streptomycin, tetracycline and ampicillin. MAR indexing of the isolates showed that they originated from high-risk sources such as humans, poultry and dairy cows. Conclusions: The high density of faecal coliform bacteria and the prevalence of multi-drug-resistant E. coli and Salmonella serotypes in the lake may pose a severe public health risk through related waterborne and foodborne outbreaks.
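The MAR index referred to above is conventionally computed, following Krumperman (1983), as the number of antibiotics to which an isolate is resistant divided by the number tested, with values above 0.2 commonly read as indicating a high-risk source. A minimal sketch; the isolate and its resistance profile are hypothetical:

```python
def mar_index(resistance):
    """Multiple antibiotic resistance (MAR) index of one isolate:
    resistant antibiotics / antibiotics tested (Krumperman, 1983).
    `resistance` maps antibiotic name -> True if the isolate is resistant."""
    return sum(resistance.values()) / len(resistance)

# hypothetical isolate tested against eight antibiotics
isolate = {"amikacin": True, "oxytetracycline": True, "streptomycin": True,
           "tetracycline": True, "kanamycin": True, "ampicillin": False,
           "chloramphenicol": False, "gentamicin": False}
idx = mar_index(isolate)  # 5/8 = 0.625, above the 0.2 high-risk threshold
```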
Abstract:
The research in this dissertation covers developments and applications of short- and long-term climate predictions. The short-term predictions address monthly and seasonal climate, i.e. forecasting from the next month over a season up to a year or so ahead. The long-term predictions pertain to the analysis of inter-annual and decadal climate variations over the whole 21st century. These two climate prediction methods are validated and applied in the study area, the Khlong Yai (KY) water basin on the eastern seaboard of Thailand, a major industrial zone of the country that has been suffering from severe drought and water shortage in recent years. Since water resources are essential for further industrial development in this region, a thorough analysis of potential climate change and its subsequent impact on the water supply in the area is at the heart of this research. The short-term forecast of next-season climate, such as temperatures and rainfall, offers a general guideline for water management and reservoir operation. To that end, statistical models based on autoregressive techniques, i.e. AR, ARIMA and ARIMAex (which includes additional external regressors), as well as multiple linear regression (MLR) models, are developed and applied in the study region. Teleconnections between ocean states and the local climate are investigated, used as extra external predictors in the ARIMAex and MLR models, and shown to enhance the accuracy of the short-term predictions significantly. However, as the teleconnective relationships between ocean state and local climate provide only a one- to four-month lead time, the ocean-state indices can support only a one-season-ahead forecast. Hence, GCM climate predictors are also suggested as an additional predictor set for a more reliable and somewhat longer short-term forecast.
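The use of lagged teleconnection indices as external regressors can be illustrated with a toy MLR fit: a synthetic ocean-state index leads local rainfall by one month, and ordinary least squares recovers the relationship. All numbers below are fabricated for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120                                   # ten years of monthly data
sst = rng.standard_normal(n)              # hypothetical ocean-state index
# rainfall responds to last month's index (slope 8) plus noise
rain = 50.0 + 8.0 * np.roll(sst, 1) + rng.standard_normal(n)

sst_lag1 = np.roll(sst, 1)                # one-month-lagged predictor
X = np.column_stack([np.ones(n - 1), sst_lag1[1:]])   # drop wrapped sample
y = rain[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)          # [intercept, slope]

pred = X @ coef
r = np.corrcoef(pred, y)[0, 1]            # skill of the lagged predictor
```

The one-month lag in the predictor is what limits such models to roughly one-season-ahead forecasts, motivating the GCM predictors discussed above.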
To prepare "pre-warning" information on possible future climate change with potentially adverse hydrological impacts in the study region, the long-term climate prediction methodology is applied. It is based on downscaling climate predictions from several single- and multi-domain GCMs, using the two well-known downscaling methods SDSM and LARS-WG and a newly developed MLR downscaling technique that allows the incorporation of a multitude of monthly or daily climate predictors from one or several (multi-domain) parent GCMs. The numerous downscaling experiments indicate that the MLR method is more accurate than SDSM and LARS-WG in predicting the recent past 20th-century (1971-2000) long-term monthly climate in the region. The MLR model is consequently employed to downscale 21st-century GCM climate predictions under SRES scenarios A1B, A2 and B1. However, since the hydrological watershed model requires daily-scale climate input data, a new stochastic daily climate generator is developed to rescale monthly observed or predicted climate series to daily series, while adhering to the statistical and geospatial distributional attributes of observed (past) daily climate series in the calibration phase. Employing this daily climate generator, 30 realizations of future daily climate series are produced from the downscaled monthly GCM climate predictor sets and used as input to the SWAT distributed watershed model to simulate future streamflow and other hydrological water budget components in the study region in a multi-realization manner. In addition to a general examination of future changes in the hydrological regime of the KY basin, potential future changes in the water budgets of the three main reservoirs in the basin are analysed, as these are a major source of water supply in the study region.
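The monthly-to-daily rescaling step can be sketched, in a much simplified form, as a stochastic disaggregator that picks wet days and splits the monthly total across them while conserving the total exactly. This is a toy stand-in for the thesis's generator; the wet-day probability and monthly total are assumed parameters:

```python
import numpy as np

def monthly_to_daily_rain(total, days=30, p_wet=0.4, seed=0):
    """Toy stochastic disaggregator: choose wet days with probability
    p_wet, then split the monthly total across them in proportion to
    exponential draws, so the daily series sums to the monthly total."""
    rng = np.random.default_rng(seed)
    wet = rng.random(days) < p_wet
    if not wet.any():
        wet[rng.integers(days)] = True      # force at least one wet day
    daily = np.zeros(days)
    w = rng.exponential(1.0, wet.sum())
    daily[wet] = total * w / w.sum()
    return daily

d = monthly_to_daily_rain(180.0)            # hypothetical 180 mm month
```

A generator of the kind described above would additionally calibrate the wet-day frequency and daily intensity distribution against observed daily series, station by station.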
The results of the long-term 21st-century downscaled climate predictions provide evidence that, compared with the 20th-century reference period, the future climate in the study area will be more extreme, particularly under SRES A1B. Temperatures will be higher and will exhibit larger fluctuations. Although the future intensity of rainfall is nearly constant, its spatial distribution across the region is partially changing. There is further evidence that sequential rainfall occurrence will decrease, so that short periods of high intensity will be followed by longer dry spells. This change in the sequential rainfall pattern will also lead to seasonal reductions of streamflow and seasonal decreases of water storage in the reservoirs. In any case, these predicted future climate changes and their hydrological impacts should encourage water planners and policy makers to develop adaptation strategies to properly manage the future water supply in this area, following the guidelines suggested in this study.
Abstract:
The report addresses the problem of visual recognition under two sources of variability: geometric and photometric. The geometric source concerns the relation between 3D objects and their views under orthographic and perspective projection. The photometric source concerns the relation between 3D matte objects and their images under changing illumination conditions. Combining the two, an alignment-based method is presented for recognizing objects viewed from arbitrary positions and illuminated by arbitrary settings of light sources.
Abstract:
Babies are born with simple manipulation capabilities, such as reflexes to perceived stimuli. Their initial discoveries are accidental until they become coordinated and curious enough to actively investigate their surroundings. This thesis explores the development of such primitive learning systems using an embodied lightweight hand with three fingers and a thumb. The hand is self-contained, with four motors and 36 exteroceptor and proprioceptor sensors controlled by an on-palm microcontroller. Primitive manipulation is learned from sensory inputs using competitive learning, the back-propagation algorithm and reinforcement learning strategies. The hand will be used for a humanoid being developed at the MIT Artificial Intelligence Laboratory.
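Of the three learning strategies mentioned, competitive learning is the simplest to sketch: a winner-take-all update in which the prototype closest to the current sensor reading moves toward it, gradually clustering recurring sensory patterns. The prototypes, input, and dimensions below are toy values, not the thesis's network:

```python
import numpy as np

def competitive_step(W, x, lr=0.1):
    """One winner-take-all competitive-learning update: the prototype
    row of W closest to input x moves toward x; the others stay put."""
    winner = np.argmin(np.linalg.norm(W - x, axis=1))
    W = W.copy()
    W[winner] += lr * (x - W[winner])
    return W, winner

# two prototype units over a hypothetical 3-sensor reading
W = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
x = np.array([0.9, 1.0, 1.1])
W2, win = competitive_step(W, x)   # the second prototype wins and adapts
```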
Abstract:
In January 1983 a group of US government, industry and university information specialists gathered at MIT to take stock of efforts to monitor, acquire, assess, and disseminate Japanese scientific and technical information (JSTI). It was agreed that these efforts were uncoordinated and poorly conceived, and that a clearer understanding of Japanese technical information systems and a clearer sense of their importance to end users were necessary. That meeting led to formal technology assessments, Congressional hearings, and legislation; it also helped stimulate several private initiatives in JSTI provision. Four years later there exist better coordinated and better conceived JSTI programs in both the public and private sectors, but much room for improvement remains. This paper recounts their development and assesses future directions.
Abstract:
Planners in public and private institutions would like coherent forecasts of the components of age-specific mortality, such as causes of death. This has been difficult to achieve because the relative values of the forecast components often fail to behave in a way that is coherent with historical experience. In addition, when the group forecasts are combined the result is often incompatible with an all-groups forecast. It has been shown that cause-specific mortality forecasts are pessimistic when compared with all-cause forecasts (Wilmoth, 1995). This paper abandons the conventional approach of using log mortality rates and instead forecasts the density of deaths in the life table. Since these values obey a unit-sum constraint for both conventional single-decrement life tables (only one absorbing state) and multiple-decrement tables (more than one absorbing state), they are intrinsically relative rather than absolute values across decrements as well as ages. Using the methods of Compositional Data Analysis pioneered by Aitchison (1986), death densities are transformed into real space so that the full range of multivariate statistics can be applied, then back-transformed to positive values so that the unit-sum constraint is honoured. The structure of the best-known single-decrement mortality-rate forecasting model, devised by Lee and Carter (1992), is expressed in compositional form and the results from the two models are compared. The compositional model is then extended to a multiple-decrement form and used to forecast mortality by cause of death for Japan.
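The transform-then-back-transform step can be illustrated with the centred log-ratio, one of Aitchison's log-ratio transforms: a composition maps to unconstrained real coordinates where ordinary multivariate statistics apply, and the closure operation restores the unit sum. The three-part death-density composition below is hypothetical:

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform (Aitchison, 1986): composition -> R^D.
    Coordinates are logs relative to the geometric mean and sum to zero."""
    g = np.exp(np.mean(np.log(x)))
    return np.log(x / g)

def clr_inv(z):
    """Back-transform: exponentiate and renormalize (closure), which
    restores positivity and the unit-sum constraint."""
    e = np.exp(z)
    return e / e.sum()

# hypothetical death-density composition over three causes (sums to 1)
comp = np.array([0.6, 0.3, 0.1])
z = clr(comp)          # unconstrained coordinates, suitable for forecasting
back = clr_inv(z)      # recovers the original composition exactly
```

In a forecasting setting the Lee-Carter-style model would be fitted to the transformed coordinates `z` across ages and years, and its forecasts mapped back through `clr_inv`.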
Abstract:
The registration of full 3D models is an important task in computer vision, since range finders reconstruct only a partial view of an object. Many authors have proposed techniques to register 3D surfaces from multiple views, in which there are basically two aspects to consider: first, coarse registration, in which some sort of correspondences between views are established; second, accurate registration, in which a better solution is obtained. A survey of the most common techniques is presented, including experimental results for some of them.
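A standard building block of the accurate-registration stage is the closed-form least-squares rigid alignment of corresponding points (the SVD-based Kabsch/Procrustes solution used inside ICP-style methods). A sketch with a synthetic rotated and shifted view; the point cloud and transform are fabricated:

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q,
    given correspondences, via the SVD (Kabsch) solution."""
    cp, cq = P.mean(0), Q.mean(0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# hypothetical partial view: rotated 30 degrees about z and translated
rng = np.random.default_rng(1)
P = rng.random((50, 3))
th = np.pi / 6
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([0.5, -0.2, 0.1])
R, t = rigid_align(P, Q)
err = np.abs(Q - (P @ R.T + t)).max()        # residual alignment error
```

ICP alternates this closed-form step with re-estimating correspondences by nearest neighbour, which is why a good coarse registration matters: it determines whether those correspondences are sensible.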
Abstract:
As part of the INFO2009 coursework, an interactive resource set was produced to teach students about the Computer Misuse Act, encompassing an explanation of the law and multiple-choice questions.
Abstract:
Background: Genetic and epigenetic factors interacting with the environment over time are the main causes of complex diseases such as autoimmune diseases (ADs). Among the environmental factors are organic solvents (OSs), chemical compounds used routinely in commercial industries. Since controversy exists over whether ADs are caused by OSs, a systematic review and meta-analysis were performed to assess the association between OSs and ADs. Methods and Findings: The systematic search was done in the PubMed, SCOPUS, SciELO and LILACS databases up to February 2012. Any type of study that used accepted classification criteria for ADs and had information about exposure to OSs was selected. Of 103 articles retrieved, 33 were included in the meta-analysis. The final odds ratios (ORs) and 95% confidence intervals (CIs) were obtained by the random-effects model. A sensitivity analysis confirmed the results were not sensitive to restrictions on the data included. Publication bias was trivial. Exposure to OSs was associated with systemic sclerosis, primary systemic vasculitis and multiple sclerosis individually, and also with all the evaluated ADs taken together as a single trait (OR: 1.54; 95% CI: 1.25-1.92; p-value: 0.001). Conclusion: Exposure to OSs is a risk factor for developing ADs. As a corollary, individuals with non-modifiable risk factors (i.e., familial autoimmunity or carrying genetic risk factors) should avoid any exposure to OSs in order to avoid increasing their risk of ADs.
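The random-effects pooling step can be sketched with the standard DerSimonian-Laird estimator on the log-OR scale: fixed-effect weights give a heterogeneity statistic Q, from which the between-study variance tau-squared is estimated and folded into the weights. The three studies below are hypothetical, not drawn from the 33 included articles:

```python
import math

def dersimonian_laird(ors, cis):
    """Random-effects pooled OR (DerSimonian-Laird) from per-study ORs
    and 95% CIs, working on the log-OR scale."""
    y = [math.log(o) for o in ors]
    # standard errors from CI width: log(hi/lo) = 2 * 1.96 * se
    se = [math.log(hi / lo) / (2 * 1.96) for lo, hi in cis]
    w = [1.0 / s**2 for s in se]                     # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    Q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))  # heterogeneity
    k = len(y)
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)               # between-study variance
    wstar = [1.0 / (s**2 + tau2) for s in se]        # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(wstar, y)) / sum(wstar)
    return math.exp(mu)

pooled = dersimonian_laird([1.4, 1.8, 1.2],
                           [(1.1, 1.8), (1.2, 2.7), (0.9, 1.6)])
```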
Abstract:
Since 1991 Colombia has had a market-determined peso-US dollar nominal exchange rate (NER), after more than 20 years of controlled and multiple exchange rates. The behavior (revaluation/devaluation) of the NER is constantly reported in the news, editorials and op-eds of the nation's major newspapers, with particular attention to revaluation. The uneven reporting of revaluation episodes can be explained by the existence of an interest group particularly affected by revaluation, looking to increase awareness and sympathy in order to obtain help from public institutions. Using the number of news items and op-eds from a major Colombian newspaper, it is shown that revaluation episodes are over-reported relative to devaluation ones. Secondly, using text analysis of the content of the news, it is also shown that the words devaluation and revaluation are far apart in the distribution of words within the news, and that revaluation is highly correlated with words related to public institutions, exporters and the need for assistance. Finally, it is shown that the probability of the central bank buying US dollars to lessen revaluation effects increases with the number of news items, even though the central bank allegedly intervenes in the exchange rate market only to tame volatility or accumulate international reserves.
Abstract:
The first part of this work presents a detailed analysis of the most relevant 3D registration techniques, including initial pose estimation, pairwise registration and multiview registration strategies. A new classification is proposed, based on both the applications and the approach of the methods discussed. The main contribution of this thesis is a new 3D multiview registration strategy. The proposed approach detects revisited regions, obtaining cycles of views that are used to reduce the inaccuracies that may exist in the final model due to error propagation. The method takes advantage of both global and local information in the registration process, using graph theory techniques in order to correlate multiple views and minimize the propagated error by registering the views in an optimal way. The proposed method has been tested on both synthetic and real data to show and study its behavior and demonstrate its reliability.
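The cycle idea can be illustrated in a toy 2D setting: composing the relative rotations around a loop of revisited views should return to the start, so the accumulated loop-closure residual can be spread evenly over the cycle's edges instead of piling up at the end. This is a sketch of the general principle, not the thesis's graph-optimization method; the loop of four views is hypothetical:

```python
import math

def distribute_cycle_error(rel_angles):
    """Given the relative rotation angles along a closed cycle of views,
    measure the loop-closure residual (deviation from a full turn) and
    subtract an equal share from every edge (2D rotations for simplicity)."""
    residual = math.remainder(sum(rel_angles), 2 * math.pi)
    n = len(rel_angles)
    return [a - residual / n for a in rel_angles]

# ideally four 90-degree turns close the loop; pairwise registration
# added small angular errors on each edge
noisy = [math.pi / 2 + e for e in (0.01, -0.02, 0.03, 0.02)]
fixed = distribute_cycle_error(noisy)   # corrected edges now close the loop
```

Full multiview registration does the analogous correction in SE(3) over many overlapping cycles at once, which is where the graph-theoretic machinery comes in.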