14 results for local iterated function systems
in Aston University Research Archive
Abstract:
The research objectives were: 1. To review the literature to establish the factors which have traditionally been regarded as most crucial to the design of effective exhaust ventilation systems. 2. To design, construct, install and calibrate a wind tunnel. 3. To develop procedures for air velocity measurement, followed by a comprehensive programme of aerodynamic data collection and data analysis for a variety of conditions. The major research findings were: a) The literature in the subject is inadequate; there is a particular need for a much greater understanding of the aerodynamics of the suction flow field. b) The discrepancies between the experimentally observed centre-line velocities and those predicted by conventional formulae are unacceptably large. c) There was little agreement between theoretically calculated and observed velocities in the suction zone of captor hoods. d) Improved empirical formulae for the prediction of centre-line velocity, applicable to the classical geometrically shaped suction openings and the flanged condition, could be (and were) derived. Further analysis of the data revealed that: i) Point velocity is directly proportional to the suction flow rate, and the ratio of the point velocity to the average face velocity is constant. ii) Both the shape and the size of the suction opening are significant factors, as the coordinates of their points govern the extent of the effect of the suction flow field. iii) The hypothetical ellipsoidal potential function and hyperbolic streamlines were found experimentally to be correct. iv) The effect of guide plates depends on their size, shape and the angle of fitting; the effect was, very approximately, to double the suction velocity, but the exact effect is difficult to predict. v) Axially symmetric openings produce practically symmetric flow fields. Similarity of connection pieces between the suction opening and the main duct in each case is essential in order to induce a similar suction flow field. Additionally, a pilot study was made in which an artificial extraneous air flow was created, measured, and its interaction with the suction flow field measured and represented graphically.
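For context, one of the conventional centre-line velocity formulae alluded to in finding (b) is DallaValle's classical expression for a plain suction opening and its flanged variant; the abstract does not state which formulae were tested, so this is offered only as a representative example:

\[
V_x = \frac{Q}{10x^2 + A} \quad \text{(plain opening)}, \qquad
V_x = \frac{Q}{0.75\,(10x^2 + A)} \quad \text{(flanged opening)},
\]

where \(V_x\) is the centre-line velocity at distance \(x\) from the opening, \(Q\) is the volumetric suction flow rate and \(A\) is the face area of the opening.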
Abstract:
Digital image processing is exploited in many diverse applications but the size of digital images places excessive demands on current storage and transmission technology. Image data compression is required to permit further use of digital image processing. Conventional image compression techniques based on statistical analysis have reached a saturation level so it is necessary to explore more radical methods. This thesis is concerned with novel methods, based on the use of fractals, for achieving significant compression of image data within reasonable processing time without introducing excessive distortion. Images are modelled as fractal data and this model is exploited directly by compression schemes. The validity of this is demonstrated by showing that the fractal complexity measure of fractal dimension is an excellent predictor of image compressibility. A method of fractal waveform coding is developed which has low computational demands and performs better than conventional waveform coding methods such as PCM and DPCM. Fractal techniques based on the use of space-filling curves are developed as a mechanism for hierarchical application of conventional techniques. Two particular applications are highlighted: the re-ordering of data during image scanning and the mapping of multi-dimensional data to one dimension. It is shown that there are many possible space-filling curves which may be used to scan images and that selection of an optimum curve leads to significantly improved data compression. The multi-dimensional mapping property of space-filling curves is used to speed up substantially the lookup process in vector quantisation. Iterated function systems are compared with vector quantisers and the computational complexity of iterated function system encoding is also reduced by using the efficient matching algorithms identified for vector quantisers.
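As an illustration of the space-filling-curve scanning idea, the sketch below converts (x, y) pixel coordinates to their index along a Hilbert curve on an n x n grid (n a power of two), so an image can be re-ordered into a 1-D sequence with good spatial locality before a conventional coder is applied. This is a generic textbook Hilbert-curve routine, not the specific curves or optimum-curve selection procedure developed in the thesis.

```python
def hilbert_index(n, x, y):
    """Index of pixel (x, y) along a Hilbert curve covering an n x n grid,
    where n is a power of two. Scanning pixels in increasing index order
    keeps spatially close pixels close together in the 1-D sequence."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/reflect the quadrant so each sub-square is traversed
        # in the canonical orientation.
        if ry == 0:
            if rx == 1:
                x = n - 1 - x
                y = n - 1 - y
            x, y = y, x
        s //= 2
    return d

# Example: traversal order for an 8x8 image.
n = 8
scan = sorted(((x, y) for y in range(n) for x in range(n)),
              key=lambda p: hilbert_index(n, *p))
```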
Abstract:
This article investigates the determinants of foreign direct investment (FDI) location across Italian provinces. Specifically, it examines the relationship between industry-specific local industrial systems and the location of inward FDI. This extends previous analysis beyond the mere density of activity, to illustrate the importance of the specific nature of agglomerations in attracting inward investment. The article develops a model of FDI location choice using a unique FDI database stratified by industry and province. The results also suggest that the importance of agglomeration differs between industries, and offer some explanation for this.
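Location-choice studies of this kind are commonly estimated with a conditional logit specification; the abstract does not name the estimator used here, so the following is only a plausible form of such a model:

\[
P(\text{investor } i \text{ chooses province } j)
  = \frac{\exp(\mathbf{x}_{ij}'\boldsymbol{\beta})}
         {\sum_{k=1}^{J}\exp(\mathbf{x}_{ik}'\boldsymbol{\beta})},
\]

where \(\mathbf{x}_{ij}\) collects province characteristics such as industry-specific agglomeration measures and \(\boldsymbol{\beta}\) is estimated from the observed location decisions.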
Abstract:
The thrust of the argument presented in this chapter is that inter-municipal cooperation (IMC) in the United Kingdom reflects local government's constitutional position and its exposure to the exigencies of Westminster (elected central government) and Whitehall (centre of the professional civil service that services central government). For the most part, councils are without general powers of competence and are restricted in what they can do by Parliament. This suggests that the capacity for locally driven IMC is restricted and operates principally within a framework constructed by central government's policy objectives and legislation and the political expediencies of the governing political party. In practice, however, recent examples of IMC demonstrate that the practices are more complex than this initial analysis suggests. Central government may exert top-down pressures and impose hierarchical directives, but there are important countervailing forces. Constitutional changes in Scotland and Wales have shifted the locus of central-local relations away from Westminster and Whitehall. In England, the seeding of English government regional offices in 1994 has evolved into an important structural arrangement that encourages councils to work together. Within the local government community there is now widespread acknowledgement that, to achieve the ambitious targets set by central government, councils are, by necessity, bound to cooperate and work with other agencies. In recent years, the fragmentation of public service delivery has affected the scope of IMC. Elected local government in the UK is now only one piece of a complex jigsaw of agencies that provides services to the public; whether with non-elected bodies such as health authorities, public protection authorities (police and fire), voluntary non-profit organisations or for-profit bodies, councils are expected to cooperate widely with agencies in their localities. Indeed, for projects such as regeneration and community renewal, councils may act as the coordinating agency, but the success of such projects is measured by collaboration and partnership working (Davies 2002). To place these developments in context, IMC is an example of how, in spite of the fragmentation of traditional forms of government, councils work with other public service agencies and other councils through the medium of interagency partnerships, collaboration between organisations and a mixed economy of service providers. Such an analysis suggests that, following changes to the system of local government, contemporary forms of IMC are less dependent on vertical arrangements (top-down direction from central government) as these are replaced by horizontal modes (expansion of networks and partnership arrangements). Evidence suggests, however, that central government continues to steer local authorities through the agency of inspectorates and regulatory bodies, and through policy initiatives such as local strategic partnerships and local area agreements (Kelly 2006), thus raising the question of whether, in the case of UK local government, the shift from hierarchy to network and market solutions is less differentiated and the transformation less complete than some literature suggests. Vertical or horizontal pressures may promote IMC, yet similar drivers may deter collaboration between local authorities. An example of negative vertical pressure was central government's change to the system of local taxation during the 1980s.
The new taxation regime replaced a tax on property with a tax on individual residency. Although the community charge lasted only a few years, it was a high point of the then Conservative government's policy of encouraging councils to compete with each other on the basis of the level of local taxation. In practice, however, the complexity of local government funding in the UK rendered worthless any meaningful ambition of councils competing with each other, especially as central government grant funding to local authorities is predicated (however imperfectly) on at least notional equalisation between those areas with lower tax yields and the more prosperous locations. Horizontal pressures comprise factors such as planning decisions. Over the last quarter century, councils have competed on the granting of permission for out-of-town retail and leisure complexes, now recognised as detrimental to neighbouring authorities because economic forces prevail and local, independent shops are unable to compete with multiple retailers. These examples illustrate a tension at the core of the UK polity: whether IMC is feasible when competition between local authorities, heightened by local differences, reduces the opportunities for collaboration. An alternative perspective on IMC is to explore whether specific purposes or functions promote or restrict it. Whether in the principal areas of local government responsibility relating to social welfare, the development and maintenance of local infrastructure, or environmental matters, there are examples of IMC. But opportunities have diminished considerably as councils have lost responsibility for service provision as a result of privatisation and the transfer of powers to new government agencies or to central government. Over the last twenty years councils have lost their role in the provision of further or higher education, public transport and water/sewerage. Councils have commissioning powers but only a limited presence in meeting housing needs and in providing social care and waste management. In other words, as a result of central government policy, there are, in practice, currently far fewer opportunities for councils to cooperate. Since 1997, the New Labour government has promoted IMC through vertical drivers and the development of policy initiatives; the operation of these initiatives is discussed following the framework of the editors. Current examples of IMC are notable for being driven by higher tiers of government, working with subordinate authorities in principal-agent relations. Collaboration between local authorities and intra-, inter- and cross-sectoral partnerships are initiated by central government. In other words, IMC is shaped by hierarchical drivers from higher levels of government but, in practice, is locally varied and determined less by formula than by necessity and function. © 2007 Springer.
Abstract:
This thesis is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here two new extended frameworks are derived and presented that are based on basis function expansions and local polynomial approximations of a recently proposed variational Bayesian algorithm. It is shown that the new extensions converge to the original variational algorithm and can be used for state estimation (smoothing). However, the main focus is on estimating the (hyper-)parameters of these systems (i.e. drift parameters and diffusion coefficients). The new methods are numerically validated on a range of systems which vary in dimensionality and non-linearity. These are the Ornstein-Uhlenbeck process, for which the exact likelihood can be computed analytically, the univariate, highly non-linear stochastic double well and the multivariate chaotic stochastic Lorenz '63 (3-dimensional model). The algorithms are also applied to the 40-dimensional stochastic Lorenz '96 system. In this investigation these new approaches are compared with a variety of other well-known methods, such as the ensemble Kalman filter/smoother, a hybrid Monte Carlo sampler, the dual unscented Kalman filter (for jointly estimating the system states and model parameters) and full weak-constraint 4D-Var. An empirical analysis of their asymptotic behaviour as the observation density or the length of the time window increases is also provided.
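For reference, the class of models discussed here can be written in a generic Itô form in which the drift parameters and diffusion coefficients mentioned in the abstract appear explicitly; the specific drift functions (Ornstein-Uhlenbeck, double well, Lorenz '63/'96) are those named above:

\[
dX_t = f_{\boldsymbol{\theta}}(X_t)\,dt + \Sigma^{1/2}\,dW_t ,
\]

where \(f_{\boldsymbol{\theta}}\) is the deterministic drift with parameters \(\boldsymbol{\theta}\), \(\Sigma\) collects the diffusion coefficients and \(W_t\) is a Wiener process; inference then targets the state path \(X_t\) (smoothing) and the (hyper-)parameters \((\boldsymbol{\theta}, \Sigma)\).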
Abstract:
The kinematic mapping of a rigid open-link manipulator is a homomorphism between Lie groups. The homomorphism has solution groups that act on an inverse kinematic solution element. A canonical representation of solution group operators that act on a solution element of three and seven degree-of-freedom (dof) dextrous manipulators is determined by geometric analysis. Seven canonical solution groups are determined for the seven-dof Robotics Research K-1207 and Hollerbach arms. The solution element of a dextrous manipulator is a collection of trivial fibre bundles with solution fibres homotopic to the torus. If fibre solutions are parameterised by a scalar, a direct inverse function that maps the scalar and Cartesian base space coordinates to solution element fibre coordinates may be defined. A direct inverse parameterisation of a solution element may be approximated by a local linear map generated by an inverse augmented Jacobian correction of a linear interpolation. The action of canonical solution group operators on a local linear approximation of the solution element of the inverse kinematics of dextrous manipulators generates cyclical solutions. The solution representation is proposed as a model of inverse kinematic transformations in primate nervous systems. Simultaneous calibration of a composition of stereo-camera and manipulator kinematic models is under-determined by equi-output parameter groups in the composition of stereo-camera and Denavit-Hartenberg (DH) models. An error measure for simultaneous calibration of a composition of models is derived, and parameter subsets with no equi-output groups are determined by numerical experiments to simultaneously calibrate the composition of homogeneous or pan-tilt stereo-camera with DH models. For acceleration of exact Newton second-order re-calibration of DH parameters after a sequential calibration of stereo-camera and DH parameters, an optimal numerical evaluation of DH matrix first-order and second-order error derivatives with respect to a re-calibration error function is derived, implemented and tested. A distributed object environment for point-and-click image-based tele-command of manipulators and stereo-cameras is specified and implemented that supports rapid prototyping of numerical experiments in distributed system control. The environment is validated by a hierarchical k-fold cross-validated calibration to Cartesian space of a radial basis function regression correction of an affine stereo model. Basic design and performance requirements are defined for scalable virtual micro-kernels that broker inter-Java-virtual-machine remote method invocations between components of secure, manageable, fault-tolerant, open, distributed, agile, Total Quality Managed, ISO 9000+ conformant, Just-in-Time manufacturing systems.
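As background to the Denavit-Hartenberg (DH) models referred to above, the sketch below builds the standard DH homogeneous transform for one joint from its four parameters and chains them for forward kinematics; it is a generic textbook construction with a hypothetical parameter table, not the specific kinematic or calibration code of the thesis.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one link:
    Rot_z(theta) @ Trans_z(d) @ Trans_x(a) @ Rot_x(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.,       sa,       ca,      d],
        [0.,       0.,       0.,     1.],
    ])

# Forward kinematics: chain the per-joint transforms
# (hypothetical 3-dof parameter table, for illustration only).
dh_table = [(0.3, 0.1, 0.0, np.pi / 2),
            (1.2, 0.0, 0.4, 0.0),
            (-0.5, 0.0, 0.3, 0.0)]
T = np.eye(4)
for theta, d, a, alpha in dh_table:
    T = T @ dh_transform(theta, d, a, alpha)
# T now maps end-effector coordinates into the base frame.
```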
Abstract:
This work is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here a new extended framework is derived that is based on a local polynomial approximation of a recently proposed variational Bayesian algorithm. The paper begins by showing that the new extension of this variational algorithm can be used for state estimation (smoothing) and converges to the original algorithm. However, the main focus is on estimating the (hyper-)parameters of these systems (i.e. drift parameters and diffusion coefficients). The new approach is validated on a range of systems which vary in dimensionality and non-linearity. These are the Ornstein–Uhlenbeck process, the exact likelihood of which can be computed analytically, the univariate, highly non-linear stochastic double well and the multivariate chaotic stochastic Lorenz ’63 (3D model). As a special case the algorithm is also applied to the 40-dimensional stochastic Lorenz ’96 system. In our investigation we compare this new approach with a variety of other well-known methods, such as the hybrid Monte Carlo sampler, the dual unscented Kalman filter and the full weak-constraint 4D-Var algorithm, and analyse empirically their asymptotic behaviour as the observation density or the length of the time window increases. In particular, we show that we are able to estimate parameters in both the drift (deterministic) and the diffusion (stochastic) part of the model evolution equations using our new methods.
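To make the setting concrete, the sketch below simulates the Ornstein–Uhlenbeck test system with a simple Euler-Maruyama scheme; the parameter values are hypothetical, and this only generates the kind of noisy state trajectories on which smoothing and parameter estimation would be performed, not the variational inference algorithm itself.

```python
import numpy as np

def euler_maruyama(drift, sigma, x0, dt, n_steps, seed=0):
    """Simulate one path of dX_t = drift(X_t) dt + sigma dW_t."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = x[k] + drift(x[k]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

# Ornstein-Uhlenbeck drift f(x) = -theta * x, with hypothetical values
# theta = 2.0 (drift parameter) and sigma = 0.5 (diffusion coefficient).
theta, sigma = 2.0, 0.5
path = euler_maruyama(lambda x: -theta * x, sigma, x0=1.0, dt=0.01, n_steps=2000)
# Sparse, noisy observations of `path` would form the data for smoothing
# and (hyper-)parameter estimation.
```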
Abstract:
Target-specific delivery has become an integral area of research in order to increase bioavailability and reduce the toxic effects of drugs. As a drug-delivery option, trigger-release liposomes offer sophisticated targeting and greater controlled-release capabilities. These are broadly divided into two categories: those that utilise the local environment of the target site, where there may be an upregulation of certain enzymes or a change in pH, and those liposomes that are triggered by an external physical stimulus such as heat, ultrasound or light. These release mechanisms offer a greater degree of control over when and where the drug is released; furthermore, targeting of diseased tissue is enhanced by the incorporation of target-specific components such as antibodies. This review aims to show the development of such trigger-release liposome systems and the current research in this field.
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
These case studies from CIMA highlight the need to embed risk management within more easily understood behaviours, consistent with the overall organisational culture. In each case, some form of internal audit team provides either an oversight function or acts as an expert link in that feedback loop. Frontline staff, managers and specialists should be completely aligned on risk, in part just to ensure that there is a consistency of approach. They should understand instinctively that good performance includes good risk management. Tesco has continued to thrive during the recession and remains a robust and efficient group of businesses despite the emergence of potential threats around consumer spending and the supply chain. RBS, by contrast, has suffered catastrophic and very public failures of risk management despite a large in-house function and stiff regulation of risk controls. Birmingham City Council, like all local authorities, is adapting to more commercial modes of operation and is facing diverse threats and opportunities emerging as a result of social change. And DCMS, like many other public sector organisations, has to handle an incredibly complex network of delivery partners within the context of a relatively recent overhaul of central government risk management processes. Key Findings:
• Risk management is no longer solely a financial discipline, nor is it simply a concern for the internal control function.
• Where organisations retain a discrete risk management cadre – often specialists at monitoring and evaluating a range of risks – their success is dependent on embedding risk awareness in the wider culture of the enterprise.
• Risk management is most successful when it is explicitly linked to operational performance.
• Clear leadership, specific goals, excellent influencing skills and open-mindedness to potential threats and opportunities are essential for effective risk management.
• Bureaucratic processes and systems can hamper good risk management – either as a result of a ‘box-ticking mentality’ or because managers and staff believe they do not need to consider risk themselves.
Abstract:
MEG beamformer algorithms work by making the assumption that correlated and spatially distinct local field potentials do not develop in the human brain. Despite this assumption, images produced by such algorithms concur with those from other non-invasive and invasive estimates of brain function. In this paper we set out to develop a method that could be applied to raw MEG data to explicitly test this assumption. We show that a promax rotation of MEG channel data can be used as an approximate estimator of the number of spatially distinct correlated sources in any frequency band.
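For intuition only, the sketch below estimates the number of dominant components in band-passed channel data via an eigen-decomposition of the channel covariance; this is a simplified PCA-style stand-in for source counting and is not the promax rotation method used in the paper.

```python
import numpy as np

def estimate_n_components(data, var_threshold=0.95):
    """Rough count of dominant components in channel data.
    data: array of shape (n_channels, n_samples), e.g. band-passed MEG channels.
    Returns the number of covariance eigenvalues needed to explain
    `var_threshold` of the total variance (a PCA-style stand-in, not promax)."""
    cov = np.cov(data)
    eigvals = np.linalg.eigvalsh(cov)[::-1]          # eigenvalues, descending
    explained = np.cumsum(eigvals) / eigvals.sum()   # cumulative variance explained
    return int(np.searchsorted(explained, var_threshold) + 1)
```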
Abstract:
A modern electronic nonlinearity equalizer (NLE) based on the inverse Volterra series transfer function (IVSTF), with reduced complexity, is applied to coherent optical orthogonal frequency-division multiplexing (CO-OFDM) signals for next-generation long- and ultra-long-haul applications. The OFDM inter-subcarrier crosstalk effects are explored thoroughly using the IVSTF-NLE and compared with the case of linear equalization (LE) for transmission distances of up to 7000 km. © 2013 IEEE.
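For orientation, IVSTF-based equalizers of this kind are built around a frequency-domain Volterra expansion of the fibre channel truncated to third order, of the generic form below, whose approximate inverse supplies the equalizer kernels; the reduced-complexity kernels actually used in this work are not specified in the abstract.

\[
Y(\omega) \approx H_1(\omega)\,X(\omega)
 + \iint H_3(\omega_1,\omega_2,\omega)\,
   X(\omega_1)\,X^{*}(\omega_2)\,X(\omega-\omega_1+\omega_2)\,
   d\omega_1\,d\omega_2 ,
\]

where \(H_1\) is the linear (dispersion) transfer function and \(H_3\) is the third-order kernel capturing the Kerr nonlinearity; the NLE applies an approximate inverse of this map to the received CO-OFDM subcarriers.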