16 results for Local likelihood function
in Aston University Research Archive
Abstract:
Principal component analysis (PCA) is a ubiquitous technique for data analysis and processing, but one which is not based upon a probability model. In this paper we demonstrate how the principal axes of a set of observed data vectors may be determined through maximum-likelihood estimation of parameters in a latent variable model closely related to factor analysis. We consider the properties of the associated likelihood function, giving an EM algorithm for estimating the principal subspace iteratively, and discuss the advantages conveyed by the definition of a probability density function for PCA.
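As a concrete illustration of the EM iteration referred to above, the following is a minimal NumPy sketch of closed-form probabilistic PCA updates in the style of Tipping and Bishop; the initialisation, iteration count and variable names are illustrative rather than taken from the paper.

```python
import numpy as np

def ppca_em(X, q, n_iter=100, seed=0):
    """Sketch of EM for probabilistic PCA on data X (N x d) with q latent dims."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False, bias=True)        # sample covariance of the data
    W = rng.standard_normal((d, q))               # factor loading matrix
    sigma2 = 1.0                                  # isotropic noise variance
    for _ in range(n_iter):
        M = W.T @ W + sigma2 * np.eye(q)          # q x q matrix from the latent posterior
        Minv = np.linalg.inv(M)
        SW = S @ W
        # Closed-form M-step updates expressed through the sample covariance S
        W_new = SW @ np.linalg.inv(sigma2 * np.eye(q) + Minv @ W.T @ SW)
        sigma2 = np.trace(S - SW @ Minv @ W_new.T) / d
        W = W_new
    return mu, W, sigma2   # columns of W span the estimated principal subspace
```

At convergence the columns of W span the principal subspace; the individual principal axes can then be recovered from W by an orthogonalisation step such as an SVD.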
Abstract:
This report outlines the derivation and application of a non-zero mean, polynomial-exponential covariance function based Gaussian process which forms the prior wind field model used in 'autonomous' disambiguation. It is used principally because the non-zero mean permits the computation of realistic local wind vector prior probabilities, taken as the marginals of the full wind field prior, which are required when applying the scaled-likelihood trick. As the full prior is multivariate normal, these marginals are very simple to compute.
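Because the full prior is multivariate normal, the local wind vector prior at a grid site is obtained simply by reading off the corresponding mean entries and the matching covariance block. A minimal sketch, assuming an illustrative site-major (u, v) ordering rather than the report's own data structures:

```python
import numpy as np

def marginal_wind_prior(mean, cov, i):
    """Marginal prior for the wind vector (u, v) at grid site i.

    `mean` is the (2n,) prior mean of the full wind field and `cov` its
    (2n, 2n) covariance, ordered site by site as (u_0, v_0, u_1, v_1, ...).
    No integration is needed: the marginal of a multivariate normal is just
    the corresponding sub-vector and sub-block.
    """
    block = slice(2 * i, 2 * i + 2)
    return mean[block], cov[block, block]
```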
Abstract:
In the analysis and prediction of many real-world time series, the assumption of stationarity is not valid. A special form of non-stationarity, where the underlying generator switches between (approximately) stationary regimes, seems particularly appropriate for financial markets. We introduce a new model which combines dynamic switching (controlled by a hidden Markov model) with a non-linear dynamical system. We show how to train this hybrid model within a maximum likelihood framework and evaluate its performance on both synthetic and financial data.
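As a rough illustration of the maximum likelihood training of the switching component, the log-likelihood under a hidden Markov model can be computed with the standard forward recursion. The sketch below is generic and assumes the per-regime observation log-likelihoods (which in the hybrid model would come from each regime's non-linear predictor) are precomputed; names and shapes are illustrative, not the paper's code.

```python
import numpy as np
from scipy.special import logsumexp

def hmm_log_likelihood(log_obs, log_A, log_pi):
    """Forward-algorithm log-likelihood of a hidden Markov switching model.

    log_obs[t, k] = log p(x_t | regime k),
    log_A[i, j]   = log p(k_t = j | k_{t-1} = i),
    log_pi[k]     = log p(k_0 = k).
    """
    T, K = log_obs.shape
    alpha = log_pi + log_obs[0]                       # log p(x_0, k_0)
    for t in range(1, T):
        # Sum over previous regimes in log space, then add the new emission term
        alpha = logsumexp(alpha[:, None] + log_A, axis=0) + log_obs[t]
    return logsumexp(alpha)                           # log p(x_0, ..., x_{T-1})
```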
Abstract:
The thrust of the argument presented in this chapter is that inter-municipal cooperation (IMC) in the United Kingdom reflects local government's constitutional position and its exposure to the exigencies of Westminster (elected central government) and Whitehall (centre of the professional civil service that services central government). For the most part councils are without general powers of competence and are restricted in what they can do by Parliament. This suggests that the capacity for locally driven IMC is restricted and operates principally within a framework constructed by central government's policy objectives and legislation and the political expediencies of the governing political party. In practice, however, recent examples of IMC demonstrate that the practices are more complex than this initial analysis suggests. Central government may exert top-down pressures and impose hierarchical directives, but there are important countervailing forces. Constitutional changes in Scotland and Wales have shifted the locus of central-local relations away from Westminster and Whitehall. In England, the seeding of English government regional offices in 1994 has evolved into an important structural arrangement that encourages councils to work together. Within the local government community there is now widespread acknowledgement that to achieve the ambitious targets set by central government, councils are, by necessity, bound to cooperate and work with other agencies. In recent years, the fragmentation of public service delivery has affected the scope of IMC. Elected local government in the UK is now only one piece of a complex jigsaw of agencies that provides services to the public; whether it is with non-elected bodies, such as health authorities, public protection authorities (police and fire), voluntary nonprofit organisations or for-profit bodies, councils are expected to cooperate widely with agencies in their localities. Indeed, for projects such as regeneration and community renewal, councils may act as the coordinating agency but the success of such projects is measured by collaboration and partnership working (Davies 2002). To place these developments in context, IMC is an example of how, in spite of the fragmentation of traditional forms of government, councils work with other public service agencies and other councils through the medium of interagency partnerships, collaboration between organisations and a mixed economy of service providers. Such an analysis suggests that, following changes to the system of local government, contemporary forms of IMC are less dependent on vertical arrangements (top-down direction from central government) as they are replaced by horizontal modes (expansion of networks and partnership arrangements). Evidence suggests, however, that central government continues to steer local authorities through the agency of inspectorates and regulatory bodies, and through policy initiatives, such as local strategic partnerships and local area agreements (Kelly 2006), thus questioning whether, in the case of UK local government, the shift from hierarchy to network and market solutions is less differentiated and transformation less complete than some literature suggests. Vertical or horizontal pressures may promote IMC, yet similar drivers may deter collaboration between local authorities. An example of negative vertical pressure was central government's change of the systems of local taxation during the 1980s.
The new taxation regime replaced a tax on property with a tax on individual residency. Although the community charge lasted only a few years, it was a highpoint of the then Conservative government's policy of encouraging councils to compete with each other on the basis of the level of local taxation. In practice, however, the complexity of local government funding in the UK rendered worthless any meaningful ambition of councils competing with each other, especially as central government grant funding to local authorities is predicated (however imperfectly) on at least notional equalisation between those areas with lower tax yields and the more prosperous locations. Horizontal pressures comprise factors such as planning decisions. Over the last quarter century, councils have competed on the granting of permission to out-of-town retail and leisure complexes, now recognised as detrimental to neighbouring authorities because economic forces prevail and local, independent shops are unable to compete with multiple companies. These examples illustrate tensions at the core of the UK polity over whether IMC is feasible when competition between local authorities, heightened by local differences, reduces opportunities for collaboration. An alternative perspective on IMC is to explore whether specific purposes or functions promote or restrict it. Whether in the principal areas of local government responsibility relating to social welfare, development and maintenance of the local infrastructure or environmental matters, there are examples of IMC. But opportunities have diminished considerably as councils lost responsibility for service provision as a result of privatisation and transfer of powers to new government agencies or to central government. Over the last twenty years councils have lost their role in the provision of further or higher education, public transport and water/sewage. Councils have commissioning power but only a limited presence in providing for housing needs, social care and waste management. In other words, as a result of central government policy, there are, in practice, currently far fewer opportunities for councils to cooperate. Since 1997, the New Labour government has promoted IMC through vertical drivers and policy development; the operation of these policy initiatives is discussed following the framework of the editors. Current examples of IMC are notable for being driven by higher tiers of government, working with subordinate authorities in principal-agent relations. Collaboration between local authorities and intra-, inter- and cross-sectoral partnerships are initiated by central government. In other words, IMC is shaped by hierarchical drivers from higher levels of government but, in practice, is locally varied and determined less by formula than by necessity and function. © 2007 Springer.
Abstract:
How does the non-executant state ensure that its agents are fulfilling their obligations to deliver nationally determined policies? In the case of elected local government in England and Wales, this function is carried out by the Audit Commission (AC) for Local Authorities and the Health Service for England and Wales. Since being established in 1983, it has been the means by which local authorities are held to account by central government, both for central government's own purposes and on behalf of other interested stakeholders. Although the primary function of the AC is to ensure that local authorities are fulfilling their obligations, it does so by using different methods. By acting as a regulator, an independent expert, an opinion former and a mediator, the AC steers local authorities to ensure that they are compliant with the regulatory regime and are implementing legislation properly.
Abstract:
This thesis is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here two new extended frameworks are derived and presented that are based on basis function expansions and local polynomial approximations of a recently proposed variational Bayesian algorithm. It is shown that the new extensions converge to the original variational algorithm and can be used for state estimation (smoothing). However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new methods are numerically validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein-Uhlenbeck process, for which the exact likelihood can be computed analytically, the univariate, highly non-linear stochastic double well and the multivariate chaotic stochastic Lorenz '63 (3-dimensional model). The algorithms are also applied to the 40-dimensional stochastic Lorenz '96 system. In this investigation these new approaches are compared with a variety of other well-known methods such as the ensemble Kalman filter/smoother, a hybrid Monte Carlo sampler, the dual unscented Kalman filter (for jointly estimating the system's states and model parameters) and full weak-constraint 4D-Var. Empirical analysis of their asymptotic behaviour as the observation density or the length of the time window increases is also provided.
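For the Ornstein-Uhlenbeck benchmark mentioned above, the exact likelihood follows from its Gaussian transition density. A minimal sketch, assuming the parameterisation dx = -theta (x - mu) dt + sigma dW with regularly spaced observations; the notation is illustrative rather than the thesis's own.

```python
import numpy as np

def ou_exact_loglik(x, dt, theta, mu, sigma):
    """Exact log-likelihood of an OU path observed every dt (conditioned on x[0]).

    Transition: x_{t+dt} | x_t ~ N(mu + (x_t - mu) * exp(-theta*dt),
                                   sigma**2 / (2*theta) * (1 - exp(-2*theta*dt)))
    """
    m = mu + (x[:-1] - mu) * np.exp(-theta * dt)                      # conditional means
    v = sigma**2 / (2.0 * theta) * (1.0 - np.exp(-2.0 * theta * dt))  # conditional variance
    resid = x[1:] - m
    return -0.5 * np.sum(np.log(2.0 * np.pi * v) + resid**2 / v)
```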
Abstract:
This work is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variation of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here a new extended framework is derived that is based on a local polynomial approximation of a recently proposed variational Bayesian algorithm. The paper begins by showing that the new extension of this variational algorithm can be used for state estimation (smoothing) and converges to the original algorithm. However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new approach is validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein–Uhlenbeck process, the exact likelihood of which can be computed analytically, the univariate, highly non-linear stochastic double well and the multivariate chaotic stochastic Lorenz ’63 (3D model). As a special case the algorithm is also applied to the 40-dimensional stochastic Lorenz ’96 system. In our investigation we compare this new approach with a variety of other well-known methods, such as the hybrid Monte Carlo sampler, the dual unscented Kalman filter and the full weak-constraint 4D-Var algorithm, and analyse empirically their asymptotic behaviour as the observation density or the length of the time window increases. In particular we show that we are able to estimate parameters in both the drift (deterministic) and the diffusion (stochastic) parts of the model evolution equations using our new methods.
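For reference, the stochastic double-well benchmark named above can be simulated with a simple Euler-Maruyama scheme. The drift 4x(1 - x²) and the constants below are a common choice for this system and are illustrative; the paper's exact parameterisation may differ.

```python
import numpy as np

def simulate_double_well(x0=0.0, T=10.0, dt=0.01, sigma=0.5, seed=0):
    """Euler-Maruyama sample path of dx = 4*x*(1 - x**2) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        drift = 4.0 * x[i] * (1.0 - x[i] ** 2)                  # double-well drift
        x[i + 1] = x[i] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x
```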
Abstract:
The ERS-1 Satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of backscatter microwave radiation reflected by small ripples on the ocean surface induced by sea-surface winds, and so provides instantaneous snapshots of wind flow over large areas of the ocean surface, known as wind fields. Inherent in the physics of the observation process is an ambiguity in wind direction; the scatterometer cannot distinguish if the wind is blowing toward or away from the sensor device. This ambiguity implies that there is a one-to-many mapping between scatterometer data and wind direction. Current operational methods for wind field retrieval are based on the retrieval of wind vectors from satellite scatterometer data, followed by a disambiguation and filtering process that is reliant on numerical weather prediction models. The wind vectors are retrieved by the local inversion of a forward model, mapping scatterometer observations to wind vectors, and minimising a cost function in scatterometer measurement space. This thesis applies a pragmatic Bayesian solution to the problem. The likelihood is a combination of conditional probability distributions for the local wind vectors given the scatterometer data. The prior distribution is a vector Gaussian process that provides the geophysical consistency for the wind field. The wind vectors are retrieved directly from the scatterometer data by using mixture density networks, a principled method to model multi-modal conditional probability density functions. The complexity of the mapping and the structure of the conditional probability density function are investigated. A hybrid mixture density network, that incorporates the knowledge that the conditional probability distribution of the observation process is predominantly bi-modal, is developed. The optimal model, which generalises across a swathe of scatterometer readings, is better on key performance measures than the current operational model. Wind field retrieval is approached from three perspectives. The first is a non-autonomous method that confirms the validity of the model by retrieving the correct wind field 99% of the time from a test set of 575 wind fields. The second technique takes the maximum a posteriori (MAP) probability wind field retrieved from the posterior distribution as the prediction. For the third technique, Markov Chain Monte Carlo (MCMC) techniques were employed to estimate the mass associated with significant modes of the posterior distribution, and make predictions based on the mode with the greatest mass associated with it. General methods for sampling from multi-modal distributions were benchmarked against a specific MCMC transition kernel designed for this problem. It was shown that the general methods were unsuitable for this application due to computational expense. On a test set of 100 wind fields the MAP estimate correctly retrieved 72 wind fields, whilst the sampling method correctly retrieved 73 wind fields.
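The conditional densities produced by a mixture density network are Gaussian mixtures whose parameters are functions of the scatterometer input; with two components the density can represent the toward/away directional ambiguity described above. A minimal sketch of evaluating such a density from the network outputs; the shapes and names are illustrative, not the thesis's implementation.

```python
import numpy as np

def mdn_conditional_pdf(y, pi, mu, sigma):
    """Evaluate p(y | x) for a mixture density network output.

    pi, mu, sigma are the mixing coefficients, component means and standard
    deviations (each of shape (K,)) emitted by the network for one input x;
    y may be a scalar or an array of evaluation points.
    """
    y = np.atleast_1d(np.asarray(y, dtype=float))[:, None]            # (n, 1)
    comp = np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return comp @ pi                                                  # (n,) mixture pdf
```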
Abstract:
The research objectives were: 1. To review the literature to establish the factors which have traditionally been regarded as most crucial to the design of effective exhaust ventilation systems. 2. To design, construct, install and calibrate a wind tunnel. 3. To develop procedures for air velocity measurement followed by a comprehensive programme of aerodynamic data collection and data analysis for a variety of conditions. The major research findings were: a) The literature in the subject is inadequate. There is a particular need for a much greater understanding of the aerodynamics of the suction flow field. b) The discrepancies between the experimentally observed centre-line velocities and those predicted by conventional formulae are unacceptably large. c) There was little agreement between theoretically calculated and observed velocities in the suction zone of captor hoods. d) Improved empirical formulae for the prediction of centre-line velocity applicable to the classical geometrically shaped suction openings and the flanged condition could be (and were) derived. Further analysis of data revealed that: i) Point velocity is directly proportional to the suction flow rate and the ratio of the point velocity to the average face velocity is constant. ii) Both the shape and size of the suction opening are significant factors as the coordinates of their points govern the extent of the effect of the suction flow field. iii) The hypothetical ellipsoidal potential function and hyperbolic streamlines were found experimentally to be correct. iv) The effect of guide plates depends on the size, shape and the angle of fitting. The effect was, very approximately, to double the suction velocity, but the exact effect is difficult to predict. v) The axially symmetric openings produce practically symmetric flow fields. Similarity of connection pieces between the suction opening and the main duct in each case is essential in order to induce a similar suction flow field. Additionally, a pilot study was made in which an artificial extraneous air flow was created, measured and its interaction with the suction flow field measured and represented graphically.
Abstract:
The study utilized the advanced technology provided by automated perimeters to investigate the hypothesis that patients with retinitis pigmentosa behave atypically over the dynamic range and to concurrently determine the influence of extraneous factors on the format of the normal perimetric sensitivity profile. The perimetric processing of some patients with retinitis pigmentosa was considered to be abnormal in either the temporal and/or the spatial domain. The standard size III stimulus saturated the central regions and was thus ineffective in detecting early depressions in sensitivity in these areas. When stimulus size was scaled in inverse proportion to the square root of ganglion cell receptive field density (M-scaled), isosensitive profiles did not result, although cortical representation was theoretically equivalent across the visual field. It was conjectured that this was due to variations in the ganglion cell characteristics with increasing peripheral angle, most notably spatial summation. It was concluded that the development of perimetric routines incorporating stimulus sizes adjusted in proportion to the coverage factor of retinal ganglion cells would enhance the diagnostic capacity of perimetry. Good general and local correspondence was found between perimetric sensitivity and the available retinal cell counts. Intraocular light scatter arising both from simulations and media opacities depressed perimetric sensitivity. Attenuation was greater centrally for the smaller LED stimuli, whereas the reverse was true for the larger projected stimuli. Prior perimetric experience and pupil size also demonstrated an eccentricity-dependent effect on sensitivity. Practice improved perimetric sensitivity for projected stimuli at eccentricities greater than or equal to 30°, particularly in the superior region. An increase in pupil size for LED stimuli enhanced sensitivity at eccentricities greater than 10°. Conversely, microfluctuation in the accommodative response during perimetric examination and the correction of peripheral refractive error had no significant influence on perimetric sensitivity.
Abstract:
Rare-earth co-doping in inorganic materials has a long-held tradition of facilitating highly desirable optoelectronic properties for their application to the laser industry. This study concentrates specifically on rare-earth phosphate glasses, (R2O3)x(R'2O3)y(P2O5)1-(x+y), where (R, R') denotes (Ce, Er) or (La, Nd) co-doping and the total rare-earth composition corresponds to a range between metaphosphate, RP3O9, and ultraphosphate, RP5O14. Thereupon, the effects of rare-earth co-doping on the local structure are assessed at the atomic level. Pair-distribution function analysis of high-energy X-ray diffraction data (Qmax = 28 Å⁻¹) is employed to make this assessment. Results reveal a stark structural invariance to rare-earth co-doping which bears testament to the open-framework and rigid nature of these glasses. A range of desirable attributes of these glasses unfold from this finding; in particular, a structural simplicity that will enable facile molecular engineering of rare-earth phosphate glasses with 'dial-up' lasing properties. When considered together with other factors, this finding also demonstrates additional prospects for these co-doped rare-earth phosphate glasses in nuclear waste storage applications. This study also reveals, for the first time, the ability to distinguish between P-O and P=O bonding in these rare-earth phosphate glasses from X-ray diffraction data in a fully quantitative manner. Complementary analysis of high-energy X-ray diffraction data on single rare-earth phosphate glasses of similar rare-earth composition to the co-doped materials is also presented in this context. In a technical sense, all high-energy X-ray diffraction data on these glasses are compared with analogous low-energy diffraction data; their salient differences reveal distinct advantages of high-energy X-ray diffraction data for the study of amorphous materials. © 2013 The Owner Societies.
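Pair-distribution function analysis of this kind rests on a sine Fourier transform of the measured structure factor up to Qmax. The following is a generic textbook sketch of that transform, evaluated by the trapezoidal rule, not the authors' processing pipeline; the grids Q, S(Q) and r are assumed to come from the experiment.

```python
import numpy as np

def pdf_from_structure_factor(Q, S, r):
    """Reduced pair-distribution function G(r) from a structure factor S(Q).

    G(r) = (2/pi) * integral_0^{Qmax} Q * (S(Q) - 1) * sin(Q*r) dQ,
    truncated at the experimental Qmax (e.g. 28 inverse angstroms here).
    """
    integrand = Q * (S - 1.0)                       # (nQ,)
    kernel = np.sin(np.outer(r, Q))                 # (nr, nQ) sine kernel
    return (2.0 / np.pi) * np.trapz(kernel * integrand, Q, axis=1)
```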
Abstract:
These case studies from CIMA highlight the need to embed risk management within more easily understood behaviours, consistent with the overall organisational culture. In each case, some form of internal audit team provides either an oversight function or acts as an expert link in that feedback loop. Frontline staff, managers and specialists should be completely aligned on risk, in part just to ensure that there is a consistency of approach. They should understand instinctively that good performance includes good risk management. Tesco has continued to thrive during the recession and remains a robust and efficient group of businesses despite the emergence of potential threats around consumer spending and the supply chain. RBS, by contrast, has suffered catastrophic and very public failures of risk management despite a large in-house function and stiff regulation of risk controls. Birmingham City Council, like all local authorities, is adapting to more commercial modes of operation and is facing diverse threats and opportunities emerging as a result of social change. And DCMS, like many other public sector organisations, has to handle an incredibly complex network of delivery partners within the context of a relatively recent overhaul of central government risk management processes. Key Findings:
• Risk management is no longer solely a financial discipline, nor is it simply a concern for the internal control function.
• Where organisations retain a discrete risk management cadre – often specialists at monitoring and evaluating a range of risks – their success is dependent on embedding risk awareness in the wider culture of the enterprise.
• Risk management is most successful when it is explicitly linked to operational performance.
• Clear leadership, specific goals, excellent influencing skills and open-mindedness to potential threats and opportunities are essential for effective risk management.
• Bureaucratic processes and systems can hamper good risk management – either as a result of a ‘box-ticking mentality’ or because managers and staff believe they do not need to consider risk themselves.
Abstract:
By contrast to the far-reaching devolution settlements elsewhere in the UK, political agreement on the governance of England outside London remains unsettled. There is cross-party consensus on the need to 'decentre down' authority to regions and localities, but limited agreement on how this should be achieved. This paper first explores the welter of initiatives adopted by the recent Labour government that were ostensibly designed to make the meso-level of governance more coherent, accountable and responsive to meeting territorial priorities. Second, it explores the current Conservative-Liberal Democrat Coalition's programme of reform that involves the elimination of Labour's regional institutional architecture and is intended to restore powers to local government and communities and promote local authority co-operation around sub-regions. Labour's reforms were ineffective in achieving any substantial transfer of authority away from Whitehall and, given the Coalition's plans to cut public expenditure, any significant recalibration in central-local relations also appears improbable. © 2012 Copyright Taylor and Francis Group, LLC.