954 results for Local linearization methods
Abstract:
In this paper, a novel approach is developed to evaluate the overall performance of a local area network and to monitor for possible intrusions. The data are obtained via the system utility 'ping', and the resulting large dataset is analyzed using statistical methods. An overall performance index is then defined, and simulation experiments over three months demonstrated the effectiveness of the proposed index. A software package has been developed based on these ideas.
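A ping-derived performance index of the kind described above can be sketched as follows. This is a hedged illustration only: the function name, the weighting scheme, and the 50 ms latency target are assumptions for the example, not the paper's actual definition.

```python
import statistics

def performance_index(rtts_ms, loss_rate, rtt_target_ms=50.0):
    """Toy index in [0, 1]: penalise high or variable round-trip times
    and packet loss. Weights and target are illustrative assumptions."""
    mean_rtt = statistics.mean(rtts_ms)
    jitter = statistics.pstdev(rtts_ms)
    # Latency score saturates at 1.0 when RTT + jitter beat the target.
    latency_score = min(1.0, rtt_target_ms / (mean_rtt + jitter))
    # Scale down by the fraction of pings that were answered.
    return latency_score * (1.0 - loss_rate)
```

With fast, stable pings and no loss the index is 1.0; slow or lossy links drive it towards 0.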
Abstract:
We study the equilibrium states of energy functions involving a large set of real variables, defined on the links of sparsely connected networks, and interacting at the network nodes, using the cavity and replica methods. When applied to the representative problem of network resource allocation, an efficient distributed algorithm is devised, with simulations showing full agreement with theory. Scaling properties with the network connectivity and the resource availability are found. © 2006 The American Physical Society.
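A minimal centralised caricature of such a network resource allocation problem: link flows minimise a quadratic transport cost plus a penalty for unmet node demand. This is plain gradient descent on a made-up objective, not the cavity/replica message-passing algorithm of the paper; the graph, penalty weight, and step size are all illustrative assumptions.

```python
import numpy as np

def allocate(capacities, edges, lr=0.1, iters=2000, penalty=10.0):
    """Minimise 0.5*sum(y**2) + 0.5*penalty*sum(min(net_i, 0)**2)
    over link flows y by gradient descent, where net_i is node i's
    capacity plus inflow minus outflow. Illustrative sketch only."""
    y = np.zeros(len(edges))                 # flow on edge k: i -> j
    for _ in range(iters):
        net = np.array(capacities, dtype=float)
        for k, (i, j) in enumerate(edges):
            net[i] -= y[k]                   # outflow at source i
            net[j] += y[k]                   # inflow at sink j
        shortfall = np.minimum(net, 0.0)     # unmet demand per node
        grad = y.copy()                      # d/dy of 0.5*sum(y**2)
        for k, (i, j) in enumerate(edges):
            grad[k] += penalty * (shortfall[j] - shortfall[i])
        y -= lr * grad
    return y
```

For a two-node network with capacities (+1, -1) and one link, the flow converges to penalty/(1+penalty) of a unit, trading transport cost against residual shortfall.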
Abstract:
How does the non-executant state ensure that its agents are fulfilling their obligations to deliver nationally determined policies? In the case of elected local government in England and Wales, this function is carried out by the Audit Commission (AC) for Local Authorities and the Health Service for England and Wales. Since its establishment in 1983, the AC has been the means by which local authorities are held to account by central government, both for central government's own purposes and on behalf of other interested stakeholders. Although the primary function of the AC is to ensure that local authorities are fulfilling their obligations, it does so using several different methods. By acting as a regulator, an independent expert, an opinion former and a mediator, the AC steers local authorities to ensure that they comply with the regulatory regime and implement legislation properly.
Abstract:
Prices and yields of UK government zero-coupon bonds are used to test alternative yield curve estimation models. Zero-coupon bonds permit a purer comparison, since the models then provide only an interpolation service rather than also performing estimation. It is found that better yield curve estimates are obtained by fitting to the yield curve directly rather than fitting first to the discount function. A simple procedure for setting the smoothness of the fitted curves is developed, and a positive relationship between oversmoothness and the fitting error is identified. A cubic spline function fitted directly to the yield curve provides the best overall balance of fitting error and smoothness, both along the yield curve and within local maturity regions.
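Fitting a smoothing cubic spline directly to yields, as the abstract recommends, can be sketched with SciPy. The maturities, yields, and the smoothing value `s` below are made-up numbers for illustration, not the paper's data or its smoothness-selection procedure.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical zero-coupon yields (%) at maturities in years.
maturities = np.array([0.5, 1, 2, 3, 5, 7, 10, 15, 20, 25])
yields = np.array([4.2, 4.4, 4.7, 4.9, 5.1, 5.2, 5.3, 5.35, 5.3, 5.25])

# Cubic spline (k=3) fitted directly to the yield curve; `s` controls the
# smoothness/fit trade-off the paper's procedure is designed to balance
# (larger s = smoother curve, larger fitting error).
curve = UnivariateSpline(maturities, yields, k=3, s=0.01)
four_year_yield = float(curve(4.0))   # interpolated 4-year yield
```

Fitting to the discount function instead would mean transforming yields to discount factors first, then fitting, which the paper finds gives worse curves.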
Abstract:
The pattern of illumination on an undulating surface can be used to infer its 3-D form (shape-from-shading). But the recovery of shape would be invalid if the luminance changes actually arose from changes in reflectance. So how does vision distinguish variation in illumination from variation in reflectance to avoid illusory depth? When a corrugated surface is painted with an albedo texture, the variation in local mean luminance (LM) due to shading is accompanied by a similar modulation in local luminance amplitude (AM). This is not so for reflectance variation, nor for roughly textured surfaces. We used depth mapping and paired comparison methods to show that modulations of local luminance amplitude play a role in the interpretation of shape-from-shading. The shape-from-shading percept was enhanced when LM and AM co-varied (in-phase) and was disrupted when they were out of phase or (to a lesser degree) when AM was absent. The perceptual differences between cue types (in-phase vs out-of-phase) were enhanced when the two cues were present at different orientations within a single image. Our results suggest that when LM and AM co-vary (in-phase) this indicates that the source of variation is illumination (caused by undulations of the surface), rather than surface reflectance. Hence, the congruence of LM and AM is a cue that supports a shape-from-shading interpretation. © 2006 Elsevier Ltd. All rights reserved.
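The LM/AM stimulus construction described above can be sketched in one dimension: a noise carrier (the albedo texture) whose local amplitude is modulated either in phase or out of phase with a sinusoidal mean-luminance (shading) modulation. Parameter names and values below are illustrative assumptions, not the study's actual stimuli.

```python
import numpy as np

def shaded_texture(n=512, cycles=4, depth=0.5, phase=0.0, seed=0):
    """1-D sketch of an LM+AM stimulus. phase=0 puts the amplitude
    modulation (AM) in phase with the luminance modulation (LM), as
    shading of a textured corrugated surface would; phase=pi puts
    them out of phase. Values are illustrative."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0, 2 * np.pi * cycles, n)
    carrier = rng.uniform(-1, 1, n)         # albedo texture
    lm = 1.0 + depth * np.sin(x)            # local mean luminance
    am = 1.0 + depth * np.sin(x + phase)    # local luminance amplitude
    return lm + 0.25 * am * carrier
```

The in-phase case (phase=0) is the cue combination the study found enhances the shape-from-shading percept; phase=np.pi gives the disrupted, out-of-phase case.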
Abstract:
This thesis introduces a flexible visual data exploration framework which combines advanced projection algorithms from the machine learning domain with visual representation techniques developed in the information visualisation domain, to help a user explore and understand large multi-dimensional datasets effectively. The advantage of such a framework over other techniques currently available to domain experts is that the user is directly involved in the data mining process, and advanced machine learning algorithms are employed to produce better projections. A hierarchical visualisation model guided by a domain expert allows them to obtain an informed segmentation of the input space. Two other components of this thesis exploit properties of these principled probabilistic projection algorithms: a guided mixture of local experts algorithm which provides robust prediction, and a model that estimates feature saliency simultaneously with the training of a projection algorithm.

Local models are useful because a single global model cannot capture the full variability of a heterogeneous data space such as the chemical space. Probabilistic hierarchical visualisation techniques provide an effective soft segmentation of an input space by a visualisation hierarchy whose leaf nodes represent different regions of that space. We use this soft segmentation to develop a guided mixture of local experts (GME) algorithm which is appropriate for the heterogeneous datasets found in chemoinformatics problems. Moreover, in this approach the domain experts are more involved in the model development process, which suits an intuition- and domain-knowledge-driven task such as drug discovery. We also derive a generative topographic mapping (GTM) based data visualisation approach which estimates feature saliency simultaneously with the training of the visualisation model.
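The core idea of a soft-gated mixture of local experts can be sketched generically: each expert is a model fitted to one region, and predictions are blended by responsibilities computed from distance to region centres. This is a generic caricature with Gaussian gating and illustrative names, not the thesis's GTM-based GME formulation.

```python
import numpy as np

def gme_predict(x, experts, centres, width=1.0):
    """Blend per-region expert predictions by soft responsibilities.
    `experts` are callables fitted to separate regions; `centres` stand
    in for the expert-guided soft segmentation. Illustrative sketch."""
    d2 = np.array([(x - c) ** 2 for c in centres])
    resp = np.exp(-d2 / (2 * width ** 2))
    resp /= resp.sum(axis=0)              # responsibilities sum to 1
    preds = np.array([f(x) for f in experts])
    return (resp * preds).sum(axis=0)     # responsibility-weighted blend
```

Near a region centre the corresponding expert dominates; between regions the prediction interpolates smoothly, which is what makes the mixture robust on heterogeneous data.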
Abstract:
This sustained longitudinal study, carried out in a single local authority, investigates the implementation of a Total Quality Management (TQM) philosophy in professional local government services. At the start of this research, the large majority of what had been written about TQM was polemical and based on limited empirical evidence. This thesis seeks to make a considerable contribution to the current state of knowledge in this area. Teams from four professional services within a single local authority participated in this research, providing the main evidence on how the quality management agenda in a local authority can be successfully implemented. To supplement this rich source of data, various other sources and methods of data collection were used: 1) interviews with senior managers from within the authority; 2) customer focus groups and questionnaires; 3) interviews with other organisations, all of which were proponents of a TQM philosophy. A number of tools were developed to assist in gathering data: 1) the CSFs (critical success factors) benchmarking tool; 2) the Five Stages of Quality Improvement Model. A Best Practice Quality Improvement Model, arising from an analysis of the literature and the researcher's own experience, is proposed and tested. From the results a number of significant conclusions are drawn relating to: 1) triggers for change; 2) resistance of local government professionals to change; 3) critical success factors and barriers to quality improvement in professional local government services; 4) the problems associated with participant observation and other methodological issues.
Abstract:
The study addresses the introduction of an innovation of new technology into a bureaucratic profession. The organisational setting is that of local authority secondary schools at a time when microcomputers were being introduced in both the organisational core (for teaching) and its periphery (school administration). The research studies innovation-adopting organisations within their sectoral context; key actors influencing the innovation are identified at the levels of central government, local government and schools.

A review of the literature on new technology and innovation (including educational innovation), and on schools as organisations in a changing environment, leads to the development of the conceptual framework of the study, using a resource dependency model within a cycle of the acquisition, allocation and utilisation of financial, physical and intangible resources. The research methodology is longitudinal and draws from both positivist and interpretive traditions. It includes an initial census of the two hundred secondary schools in four local education authorities, a final survey of the same population, and four case studies, using both interview methods and documentation. Two modes of innovation are discerned. In respect of administrative use, a rationalising, controlling mode is identified, with local education authorities developing standardised computer-assisted administrative systems for use in schools. In respect of curricular use, in contrast, teachers have been able to maintain an indeterminate occupational knowledge base, derived from an ideology of professionalism in respect of the classroom use of the technology. The mode of innovation in respect of curricular use has been one of learning and enabling.
The resourcing policies of central and local government agencies affect the extent of use of the technology for teaching purposes, but the way in which it is used is determined within individual schools, where staff with relevant technical expertise significantly affect the course of the innovation.
Abstract:
This thesis describes an investigation into a Local Authority's desire to use its airport to aid regional economic growth. Short studies on air freight, the impact of an airport on the local economy, incoming tourism, and the factors influencing airlines in their use of airports show that this desire is valid, but only in so far as the airport enables air services to be provided. A survey of airlines, conducted to remedy some deficiencies in the documented knowledge of airline decision-making criteria, indicates that there is cause for concern about the methods used to develop air services. A comparison with the West German network suggests that Birmingham is underprovided with international scheduled flights, and reinforces the survey conclusion that an airport authority must become actively involved in the development of air services. Participation in the licence applications of two airlines to use Birmingham Airport confirms the need for involvement, but without showing the extent of the influence which an airport authority may exert. The conclusion is reached that, in order to fulfil its development potential, an airport must be marketed to both the general public and the air transport industry. There is also a need for a national air services plan.
Abstract:
Localised, targeted drug delivery to the oesophagus offers the potential for more effective delivery and reduced drug dosages, coupled with increased patient compliance. This thesis considers bioadhesive liquids, orally retained tablets and films, and chewable dosage forms as drug delivery systems to target the oesophagus. Miconazole nitrate was used as a model antifungal agent. Chitosan and xanthan gum hydrogels were evaluated as viscous polymer vehicles, with the in vitro retention, drug release and minimum inhibitory concentration (MIC) values of the formulations measured. Xanthan showed prolonged retention on the oesophageal surface in vitro, yet chitosan reduced the MIC value; both polymers offer potential for local targeting to the oesophagus. Cellulose derivatives were investigated within orally retained dosage forms. Both drug and polymer dissolution rates were measured to investigate the drug release mechanism and to develop a formulation with concomitant drug and polymer release, targeting the oesophagus with solubilised drug within a viscous medium. Several in vitro dissolution methods were evaluated to measure drug release from chewable dosage forms, with both drug and polymer dissolution quantified to investigate the effects of the dissolution apparatus on drug release. The results of this thesis show that a range of drug delivery strategies can be used to target drugs to the oesophagus. The composition of these formulations, as well as the methodology used during their development, is crucial to understanding each formulation and predicting its performance in vivo.
Abstract:
Removing noise from signals which are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. In the first paper (part I) of this series of two, we presented background theory building on results from the image processing community to show that the majority of these algorithms, and more proposed in the wider literature, are each associated with a special case of a generalized functional, that, when minimized, solves the PWC denoising problem. It shows how the minimizer can be obtained by a range of computational solver algorithms. In this second paper (part II), using this understanding developed in part I, we introduce several novel PWC denoising methods, which, for example, combine the global behaviour of mean shift clustering with the local smoothing of total variation diffusion, and show example solver algorithms for these new methods. Comparisons between these methods are performed on synthetic and real signals, revealing that our new methods have a useful role to play. Finally, overlaps between the generalized methods of these two papers and others such as wavelet shrinkage, hidden Markov models, and piecewise smooth filtering are touched on.
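A simple baseline for the PWC denoising problem described above is a running median, which preserves step edges that linear smoothers blur. This is only a baseline for comparison, not one of the paper's new mean-shift/total-variation hybrid methods; the window length is an illustrative assumption.

```python
import numpy as np

def median_denoise(signal, window=5):
    """Running-median baseline for piecewise-constant (PWC) denoising.
    Medians reject impulsive noise and keep step edges sharp, unlike a
    moving average. Window length is an illustrative choice."""
    half = window // 2
    padded = np.pad(signal, half, mode="edge")   # replicate ends
    return np.array([np.median(padded[i:i + window])
                     for i in range(len(signal))])
```

An isolated spike narrower than half the window is removed entirely, while a genuine level shift survives unblurred, which is exactly the behaviour PWC methods aim for.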
Abstract:
Objective-We previously demonstrated that upregulation of intermediate-conductance Ca2+-activated K+ channels (KCa3.1) is necessary for mitogen-induced phenotypic modulation in isolated porcine coronary smooth muscle cells (SMCs). The objective of the present study was to determine the role of KCa3.1 in the regulation of coronary SMC phenotypic modulation in vivo using a swine model of postangioplasty restenosis. Methods and Results-Balloon angioplasty was performed on coronary arteries of swine using either noncoated balloons or balloons coated with the specific KCa3.1 blocker TRAM-34. Expression of KCa3.1, c-jun, c-fos, repressor element-1 silencing transcription factor (REST), smooth muscle myosin heavy chain (SMMHC), and myocardin was measured using qRT-PCR in isolated medial cells 2 hours and 2 days postangioplasty. KCa3.1, c-jun, and c-fos mRNA levels were increased 2 hours postangioplasty, whereas REST expression decreased. SMMHC expression was unchanged at 2 hours, but decreased 2 days postangioplasty. Use of TRAM-34 coated balloons prevented KCa3.1 upregulation and REST downregulation at 2 hours, prevented SMMHC and myocardin downregulation at 2 days, and attenuated subsequent restenosis 14 and 28 days postangioplasty. Immunohistochemical analysis demonstrated corresponding changes at the protein level. Conclusion-Blockade of KCa3.1 by delivery of TRAM-34 via balloon catheter prevented smooth muscle phenotypic modulation and limited subsequent restenosis. © 2008 American Heart Association, Inc.
Abstract:
Purpose: Although significant amounts of vertical misalignment can have a noticeable effect on visual performance, there is no conclusive evidence about the effect of very small amounts of vertical disparity on stereopsis and binocular vision. Hence, the aim of this study was to investigate the effects of induced vertical disparity on local and global stereopsis at near. Materials and Methods: Ninety participants wearing their best-corrected refraction had local and global stereopsis tested with 0.5 and 1.0 prism diopters (Δ) of vertical prism in front of their dominant and non-dominant eye in turn. This was compared with local and global stereopsis in the same subjects without vertical prism. Data were analyzed in SPSS 17 using independent-samples t tests and repeated measures ANOVA. Results: Induced vertical disparity decreases local and global stereopsis. This reduction is greater when the vertical disparity is induced in front of the non-dominant eye, and it affects global more than local stereopsis. Repeated measures ANOVA showed differences in mean stereopsis between the different measured states for both local and global values. Local stereopsis thresholds changed by tens of seconds of arc or less on average with 1.0Δ of induced vertical prism in front of either eye, whereas global stereopsis thresholds changed by over 100 seconds of arc with the same 1.0Δ of induced vertical prism. Conclusion: Induced vertical disparity affects global stereopsis thresholds by an order of magnitude (a factor of 10) more than local stereopsis thresholds. Hence, a test that measures global stereopsis, such as the TNO, is more sensitive to vertical misalignment than a test such as the Stereofly that measures local stereopsis. © 2014 Informa Healthcare USA, Inc. All rights reserved: reproduction in whole or part not permitted.
Abstract:
Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts from these problem domains (e.g. biologists, chemists, financial analysts) have insights that can be helpful in developing powerful models, but they need a modelling framework that helps them use these insights. Data visualisation is an effective technique for presenting data and eliciting feedback from the experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate region of the input space, often work better, since the behaviour in different regions may vary. Classical local models such as Mixture of Experts segment the input space automatically, which is not always effective and also does not involve the domain experts in guiding a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to interactively segment the input space using data visualisation. The resulting segmentation is then used to develop effective local regression models.
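Once an expert has segmented the input space, fitting one local regression model per segment is straightforward. The sketch below uses an integer label array as a stand-in for the visualisation-based interactive segmentation described in the paper; the function name and the use of plain least squares are illustrative assumptions.

```python
import numpy as np

def fit_local_models(X, y, labels):
    """Fit one least-squares linear model per expert-defined segment.
    `labels[i]` gives the segment of sample i, standing in for the
    segmentation an expert produced interactively. Illustrative sketch."""
    models = {}
    for seg in np.unique(labels):
        mask = labels == seg
        # Design matrix [x, 1] for a per-segment line y = a*x + b.
        A = np.column_stack([X[mask], np.ones(mask.sum())])
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        models[seg] = coef                 # (slope, intercept)
    return models
```

Each segment gets its own (slope, intercept), so regions with different behaviour are modelled separately instead of being averaged away by a single global fit.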