Abstract:
Using UK equity index data, this paper considers the impact of news on time-varying measures of beta, the usual measure of undiversifiable risk. The empirical model implies that beta depends on news about both the market and the sector. The asymmetric response of beta to market news is consistent across all sectors considered. Recent research is divided on whether abnormalities in equity returns arise from changes in expected returns in an efficient market or from over-reactions to new information. The evidence suggests that such abnormalities may be due to changes in expected returns caused by time variation and asymmetry in beta.
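A basic time-varying beta can be estimated with a rolling covariance window; the sketch below uses synthetic return series and is only an illustration of the quantity being modelled, not the paper's asymmetric news-impact specification.

```python
import numpy as np

rng = np.random.default_rng(0)
market = rng.normal(0.0, 0.01, 500)                   # synthetic daily market returns
sector = 1.2 * market + rng.normal(0.0, 0.005, 500)   # sector returns with true beta ~1.2

def rolling_beta(sector_r, market_r, window=60):
    """Estimate time-varying beta as cov(sector, market) / var(market)
    over a rolling window of observations."""
    betas = []
    for t in range(window, len(market_r) + 1):
        m = market_r[t - window:t]
        s = sector_r[t - window:t]
        betas.append(np.cov(s, m)[0, 1] / np.var(m, ddof=1))
    return np.array(betas)

betas = rolling_beta(sector, market)
```

A model in the spirit of the paper would additionally let the conditional covariance respond asymmetrically to the sign of market and sector news (as in GARCH-type specifications).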
Abstract:
Classical measures of network connectivity are the number of disjoint paths between a pair of nodes and the size of a minimum cut. For standard graphs, these measures can be computed efficiently using network-flow techniques. However, in the Internet at the level of autonomous systems (ASs), referred to as the AS-level Internet, routing policies impose restrictions on the paths that traffic can take. These restrictions can be captured by the valley-free path model, which assumes a special directed graph model in which edge types represent relationships between ASs. We consider the adaptation of the classical connectivity measures to the valley-free path model, where computing them is NP-hard. Our first main contribution is a set of algorithms for computing disjoint paths and minimum cuts in the valley-free path model. These algorithms are useful for ASs that want to evaluate different options for selecting upstream providers to improve the robustness of their connection to the Internet. Our second main contribution is an experimental evaluation of our algorithms on four types of directed graph models of the AS-level Internet produced by different inference algorithms. Most importantly, the evaluation shows that our algorithms can compute optimal solutions to realistically sized instances of the connectivity problems in the valley-free path model in reasonable time. Furthermore, our experimental results provide information about the characteristics of the directed graph models of the AS-level Internet produced by different inference algorithms. It turns out that (i) we can quantify the difference between the undirected AS-level topology and the directed graph models with respect to fundamental connectivity measures, and (ii) the different inference algorithms yield topologies that are similar with respect to connectivity but different with respect to the types of paths that exist between pairs of ASs.
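The valley-free condition itself is simple to state: a path may climb customer-to-provider edges, cross at most one peering edge, and then only descend provider-to-customer edges. A minimal validity check, with a hypothetical `rel` map encoding the AS relationships (the encoding is an assumption for illustration):

```python
# rel[(a, b)] gives the relationship when traversing the edge a -> b:
# "c2p" (a is a customer of b), "peer", or "p2c" (a is a provider of b).
def is_valley_free(path, rel):
    """A valid path is: zero or more c2p (uphill) edges, at most one
    peer edge, then zero or more p2c (downhill) edges."""
    phase = 0  # 0 = uphill, 1 = just crossed a peer edge, 2 = downhill
    for a, b in zip(path, path[1:]):
        r = rel[(a, b)]
        if r == "c2p":
            if phase != 0:          # cannot climb after peering/descending
                return False
        elif r == "peer":
            if phase != 0:          # at most one peer edge, only at the top
                return False
            phase = 1
        else:                       # "p2c": start or continue the descent
            phase = 2
    return True
```

Disjoint paths and minimum cuts under this path restriction are what make the problems NP-hard, since standard max-flow arguments no longer apply directly.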
Abstract:
Experiences from the Mitigation Options for Phosphorus and Sediment (MOPS) projects, which aim to determine the effectiveness of measures to reduce pollutant loading from agricultural land to surface waters, have been used to contribute to the findings of a recent paper (Kay et al., 2009, Agricultural Systems, 99, 67–75), which reviewed the efficacy of contemporary agricultural stewardship measures for ameliorating the water pollution problems of key concern to the UK water industry. MOPS1 is a recently completed three-year research project on three different soil types in the UK, which focused on mitigation options for winter cereals. MOPS1 demonstrated that tramlines can be the major pathway for sediment and nutrient transfer from arable hillslopes. Although minimum tillage, crop residue incorporation, contour cultivation, and beetle banks also have the potential to be cost-effective mitigation options, tramline management is one of the most promising treatments for mitigating diffuse pollution losses, as it reduced sediment and nutrient losses by 72–99% in four of the five site years trialled. Using information from the MOPS projects, this paper builds on the findings of Kay et al. to provide an updated picture of the evidence available and the immediate research needs in this area.
Abstract:
This paper analyses tendencies in debates about cultural representations of terrorism to assume that artists make critical interventions while the mass media circulate stereotypes. Some recent feminist analyses of female terrorist acts have re-instituted essentialist arguments in which violence and terrorism are described as inherently masculine while women are by nature pacifist, so that femininity is the antithesis of militarism. More progressive analyses mostly tend to expose the circulation of stereotypes and their gender bias in order to protest the misrepresentation of women in violence. These analyses do not construct alternative accounts. Through an analysis of two works by the artists Hito Steyerl and Sharon Hayes, the paper argues that some of the moves to re-imagine the question of women, violence and agency have already been made in contemporary art practices. Through analysing the legacies of terrorism and feminism, it becomes possible to rethink the question of agency, militancy and the nature of political art. The paper appears in an edited interdisciplinary collection arising from a conference at Universität der Bundeswehr in Munich. It relates to wider projects involving collaborations with Birkbeck and Edinburgh on representations of terrorism and on violence and contemporary art.
Abstract:
In the context of the financial crash and the commercial property market downturn, this paper examines the basis of valuation used in the UK commercial property lending process. Post-crisis, there is discussion of counter-cyclical measures, including the monitoring of asset prices; however, there is no consideration of a different approach to property valuation. This paper questions this omission, given the role that valuations play in the bank regulatory process. The different bases of valuation available to lenders within International Valuation Standards are identified as Market Value (MV), Mortgage Lending Value (MLV) and Investment Value (IV), with MV being the most used in the UK. Using the different bases in the period before the financial crisis, the UK property market is modelled at the national office, retail and industrial/warehouse sector level to determine the performance of each alternative valuation basis within the context of counter-cyclical pressures on lending. Both MLV and IV would have produced lower valuations and could have provided lenders with tools for more informed and prudent lending. The paper concludes by recognising some of the practical issues involved in adopting the different bases for the bank lending role, but recommends a change to IV.
Abstract:
“Point and click” interactions remain one of the key features of graphical user interfaces (GUIs). People with motion impairments, however, can often have difficulty with accurate control of standard pointing devices. This paper discusses work that aims to reveal the nature of these difficulties through analyses that consider the cursor’s path of movement. A range of cursor measures was applied, and a number of them were found to be significant in capturing the differences between able-bodied users and motion-impaired users, as well as the differences between a haptic force-feedback condition and a control condition. The cursor measures found in the literature, however, do not make up a comprehensive list, but provide a starting point for analysing cursor movements more completely. Six new cursor characteristics for motion-impaired users are introduced to capture aspects of cursor movement different from those already proposed.
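As a concrete illustration of a path-based cursor measure (a generic one, not necessarily among the six the paper proposes), the ratio of the straight-line target distance to the actual path length quantifies how much a movement meanders:

```python
import math

def path_efficiency(points):
    """Ratio of straight-line distance to actual cursor path length.
    1.0 means a perfectly straight movement; lower values indicate
    a more meandering path. `points` is a list of (x, y) samples."""
    path_len = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    straight = math.dist(points[0], points[-1])
    return straight / path_len if path_len else 1.0
```

Comparing such measures between user groups, or between a haptic condition and a control condition, is the kind of analysis the paper describes.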
Abstract:
People with motion impairments can often have difficulty with accurate control of standard pointing devices for computer input. The nature of the difficulties may vary, so, to be most effective, methods of assisting cursor control must be suited to each user's needs. The work presented here involves a study of cursor trajectories as a means of assessing the requirements of motion-impaired computer users. A new cursor characteristic is proposed that attempts to capture difficulty in moving the cursor along a smooth trajectory. A study was conducted to see if haptic tunnels could improve performance in "point and click" tasks. Results indicate that the tunnels reduced times to target for those users identified by the new characteristic as having the most difficulty moving in a smooth trajectory. This suggests that cursor characteristics have potential applications in assessing a user's cursor control capabilities, which can then be used to determine appropriate methods of assistance.
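A simple proxy for trajectory smoothness, in the spirit of the characteristic described (though the paper's own definition is not reproduced here), is to count sharp changes of movement direction along the sampled path:

```python
import math

def direction_changes(points, threshold=math.pi / 2):
    """Count sharp changes of movement direction along a cursor
    trajectory; more sharp turns suggest a less smooth movement.
    `points` is a list of (x, y) samples; `threshold` is the turn
    angle (radians) above which a change counts as sharp."""
    headings = [math.atan2(q[1] - p[1], q[0] - p[0])
                for p, q in zip(points, points[1:])]
    count = 0
    for h1, h2 in zip(headings, headings[1:]):
        delta = abs(h2 - h1)
        delta = min(delta, 2 * math.pi - delta)  # wrap angle into [0, pi]
        if delta > threshold:
            count += 1
    return count
```

Users scoring highly on such a characteristic would be the natural candidates for assistance like haptic tunnels.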
Abstract:
In this paper we perform an analytical and numerical study of extreme value distributions in discrete dynamical systems that have a singular measure. Using the block maxima approach described in Faranda et al. [2011], we show numerically that the extreme value distribution for these maps can be associated with the Generalised Extreme Value (GEV) family, where the parameters scale with the information dimension. The numerical analyses are performed on a few low-dimensional maps. For the middle-third Cantor set and the Sierpinski triangle obtained using Iterated Function Systems, the experimental parameters show very good agreement with the theoretical values. For strange attractors such as the Lozi and Hénon maps, a slower convergence to the Generalised Extreme Value distribution is observed. Even with large statistics, the observed convergence is slower than for maps that have an absolutely continuous invariant measure. Nevertheless, within the computed uncertainty range, the results are in good agreement with the theoretical estimates.
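The block maxima step itself is mechanical: iterate the map, record an observable that diverges on close returns to a reference point, split the series into blocks, and keep each block's maximum (to which a GEV distribution is then fitted). A sketch using the logistic map as a stand-in for the maps studied in the paper:

```python
import numpy as np

def block_maxima(series, block_size):
    """Split a series into consecutive blocks of `block_size` samples
    and return each block's maximum (input to a GEV fit)."""
    n = (len(series) // block_size) * block_size
    return series[:n].reshape(-1, block_size).max(axis=1)

# Orbit of the chaotic logistic map x -> 4x(1-x); the observable
# g(x) = -log|x - x0| takes extreme values on close returns to x0.
x, x0 = 0.3, 0.6
obs = []
for _ in range(20000):
    x = 4.0 * x * (1.0 - x)
    obs.append(-np.log(abs(x - x0) + 1e-300))
maxima = block_maxima(np.array(obs), 500)
```

A GEV fit to `maxima` (e.g. with `scipy.stats.genextreme.fit`) would then yield the location, scale, and shape parameters whose scaling with the information dimension the paper investigates.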
Abstract:
In numerical weather prediction (NWP), data assimilation (DA) methods are used to combine available observations with numerical model estimates. This is done by minimising measures of error in both observations and model estimates, with more weight given to data that can be trusted more. Any DA method requires an estimate of the initial forecast error covariance matrix. For convective-scale data assimilation, however, the properties of the error covariances are not well understood. An effective way to investigate covariance properties in the presence of convection is to use an ensemble-based method, for which an estimate of the error covariance is readily available at each time step. In this work, we investigate the performance of the ensemble square root filter (EnSRF) in the presence of cloud growth, applied to an idealised 1D convective column model of the atmosphere. We show that the EnSRF performs well in capturing cloud growth, but the ensemble does not cope well with discontinuities introduced into the system by parameterised rain. The state estimates lose accuracy and, more importantly, the ensemble is unable to capture the spread (variance) of the estimates correctly. We also find, counter-intuitively, that by reducing the spatial frequency and/or the accuracy of the observations, the ensemble is able to capture the states and their variability successfully across all regimes.
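For readers unfamiliar with the EnSRF, its essence is a deterministic Kalman update: the ensemble mean is moved toward the observation with the Kalman gain, and the perturbations are rescaled (no perturbed observations) so that the analysis variance is exact. A minimal scalar-state, single-observation sketch, not the paper's 1D column configuration:

```python
import numpy as np

def ensrf_update_scalar(ensemble, obs, obs_var):
    """EnSRF analysis step for a scalar state observed directly.
    The mean gets the usual Kalman update; perturbations are shrunk
    deterministically so the sample analysis variance is (1-K)*Pf."""
    mean = ensemble.mean()
    perts = ensemble - mean
    pf = perts @ perts / (len(ensemble) - 1)           # forecast variance
    k = pf / (pf + obs_var)                            # Kalman gain
    alpha = 1.0 / (1.0 + np.sqrt(obs_var / (pf + obs_var)))
    new_mean = mean + k * (obs - mean)
    new_perts = perts * (1.0 - alpha * k)              # reduced-gain update
    return new_mean + new_perts
```

The difficulty the paper reports arises when the model dynamics between such updates are discontinuous (parameterised rain), which the ensemble spread fails to represent.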
Abstract:
Non-Gaussian/non-linear data assimilation is becoming an increasingly important area of research in the geosciences as the resolution and non-linearity of models increase and more and more non-linear observation operators are used. In this study, we look at the effect of relaxing the assumption of a Gaussian prior on the impact of observations within the data assimilation system. Three different measures of observation impact are studied: the sensitivity of the posterior mean to the observations, mutual information, and relative entropy. The sensitivity of the posterior mean is derived analytically when the prior is modelled by a simplified Gaussian mixture and the observation errors are Gaussian. It is found that the sensitivity is a strong function of the value of the observation and is proportional to the posterior variance. Similarly, relative entropy is found to be a strong function of the value of the observation. However, the errors in estimating these two measures using a Gaussian approximation to the prior can differ significantly. This hampers conclusions about the effect of the non-Gaussian prior on observation impact. Mutual information does not depend on the value of the observation and is seen to be close to its Gaussian approximation. These findings are illustrated with a particle filter applied to the Lorenz '63 system. The article concludes with a discussion of the appropriateness of these measures of observation impact for different situations.
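In the purely Gaussian scalar case the contrast between the measures is easy to see in closed form: relative entropy KL(posterior || prior) depends on how far the observation moves the mean, while mutual information depends only on the variance reduction. A sketch of that baseline (the paper's interest is in how a Gaussian-mixture prior changes this picture):

```python
import math

def gaussian_obs_impact(mu_f, var_f, y, var_o):
    """Scalar Gaussian prior N(mu_f, var_f), direct observation y with
    error variance var_o. Returns the posterior mean and variance,
    relative entropy KL(posterior || prior), and mutual information."""
    k = var_f / (var_f + var_o)                  # Kalman gain
    mu_a = mu_f + k * (y - mu_f)
    var_a = (1.0 - k) * var_f
    kl = (0.5 * math.log(var_f / var_a)
          + (var_a + (mu_a - mu_f) ** 2) / (2.0 * var_f) - 0.5)
    mi = 0.5 * math.log(var_f / var_a)           # independent of y
    return mu_a, var_a, kl, mi
```

Note that `mi` is fixed once the variances are fixed, whereas `kl` grows with the innovation, mirroring the abstract's observation-value dependence.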
Abstract:
In this study, two new measures of lexical diversity are tested for the first time on French. The usefulness of these measures, MTLD (McCarthy and Jarvis 2010, and this volume) and HD-D (McCarthy and Jarvis 2007), in predicting different aspects of language proficiency is assessed and compared with D (Malvern and Richards 1997; Malvern, Richards, Chipere and Durán 2004) and Maas (1972) in analyses of stories told by two groups of learners (n = 41) at two different proficiency levels and one group of native speakers of French (n = 23). The importance of careful lemmatization in studies of lexical diversity involving highly inflected languages is also demonstrated. The paper shows that the measures of lexical diversity under study are valid proxies for language ability in that they explain up to 62 percent of the variance in French C-test scores, and up to 33 percent of the variance in a measure of complexity. The paper also provides evidence that dependence on segment size continues to be a problem for the measures of lexical diversity discussed here. The paper concludes that limiting the range of text lengths, or even keeping text length constant, is the safest option in analysing lexical diversity.
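The MTLD idea can be sketched compactly: walk through the token stream, closing a "factor" each time the running type-token ratio falls to a threshold (0.72 in McCarthy and Jarvis), and divide the token count by the number of factors. The simplified one-directional version below is illustrative only; the published measure averages forward and backward passes, and careful lemmatization of the tokens matters for inflected languages like French.

```python
def mtld_forward(tokens, ttr_threshold=0.72):
    """One-directional MTLD: higher values mean greater lexical
    diversity. The final incomplete factor contributes a partial
    count proportional to how far the TTR has fallen."""
    factors = 0.0
    types, count = set(), 0
    for tok in tokens:
        count += 1
        types.add(tok)
        if len(types) / count <= ttr_threshold:
            factors += 1.0           # factor complete: TTR hit threshold
            types, count = set(), 0  # reset for the next factor
    if count > 0:
        ttr = len(types) / count
        factors += (1.0 - ttr) / (1.0 - ttr_threshold)  # partial factor
    return len(tokens) / factors if factors else float(len(tokens))
```

A highly repetitive text closes factors quickly and scores low, while a text with no repeated types never closes a factor at all, which is one face of the segment-size sensitivity the paper discusses.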