13 results for Mixed Elliptic Problems with Singular Interfaces
in Aston University Research Archive
Abstract:
The Vapnik-Chervonenkis (VC) dimension is a combinatorial measure of a certain class of machine learning problems, which may be used to obtain upper and lower bounds on the number of training examples needed to learn to prescribed levels of accuracy. Most of the known bounds apply to the Probably Approximately Correct (PAC) framework, which is the framework within which we work in this paper. For a learning problem with some known VC dimension, much is known about the order of growth of the sample-size requirement of the problem, as a function of the PAC parameters. The exact value of the sample-size requirement is, however, less well known, and depends heavily on the particular learning algorithm being used. This is a major obstacle to the practical application of the VC dimension. Hence it is important to know exactly how the sample-size requirement depends on the VC dimension, and with that in mind, we describe a general algorithm for learning problems having VC dimension 1. Its sample-size requirement is minimal (as a function of the PAC parameters), and turns out to be the same for all non-trivial learning problems having VC dimension 1. While the method used cannot be naively generalised to higher VC dimension, it suggests that optimal algorithm-dependent bounds may improve substantially on current upper bounds.
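For context, the classical PAC sample-size bounds alluded to above take the following form (standard results from the VC theory literature, not taken from this paper): a consistent learner over a hypothesis class of VC dimension d succeeds with a number of examples of the order of the upper bound below, while every PAC algorithm needs at least the lower bound, where ε is the accuracy and δ the confidence parameter.

```latex
% Standard PAC bounds (classical VC theory, quoted here for illustration):
% upper bound for any consistent learner, lower bound for every algorithm.
m_{\mathrm{upper}} = O\!\left(\frac{1}{\varepsilon}\left(d \ln\frac{1}{\varepsilon} + \ln\frac{1}{\delta}\right)\right),
\qquad
m_{\mathrm{lower}} = \Omega\!\left(\frac{1}{\varepsilon}\left(d + \ln\frac{1}{\delta}\right)\right)
```

The gap between the two is the slack that algorithm-dependent analyses, such as the VC dimension 1 algorithm described above, aim to close.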
Abstract:
Cadogan and Lee (this issue) discuss the problems inherent in modeling formative latent variables as endogenous. In response to the commentaries by Rigdon (this issue) and Finn and Wang (this issue), the present article extends the discussion on formative measures. First, the article shows that regardless of whether statistical identification is achieved, researchers are unable to illuminate the nature of a formative latent variable. Second, the study clarifies issues regarding formative indicator weighting, highlighting that the weightings of formative components should be specified as part of the construct definition. Finally, the study shows that higher-order reflective constructs are invalid, highlights the damage their use can inflict on theory development and knowledge accumulation, and provides recommendations on a number of alternative models which should be used in their place (including the formative model). © 2012 Elsevier Inc.
Abstract:
Legislation: Law of Property (Miscellaneous Provisions) Act 1989 (c.34) s.2 Case: Healey v Brown [2002] W.T.L.R. 849 (Ch D) This paper looks at the use of mutual wills in practice. An empirical survey of probate solicitors is carried out and the results analysed. Significantly, most solicitors seem unaware of the controversial ruling on mutual wills in Healey v Brown and the impact of s.2 of the Law of Property (Miscellaneous Provisions) Act 1989 where land is concerned. Unsurprisingly, the survey demonstrates that mutual wills are not commonly used and tend to be avoided by practising solicitors.
Abstract:
This paper presents a simulated genetic algorithm (GA) model for scheduling the flow shop problem with re-entrant jobs. The objective of this research is to minimize the weighted tardiness and makespan. The proposed model considers that jobs with non-identical due dates are processed on the machines in the same order. Furthermore, the re-entrant jobs are stochastic, as only some jobs are required to re-enter the flow shop. The tardiness weight is adjusted once a job re-enters the shop. The performance of the proposed GA model is verified by a number of numerical experiments with data from the case company. The results show that the proposed method achieves a higher order-satisfaction rate than current industrial practice.
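As an illustration of the kind of fitness function such a GA must evaluate, the sketch below (assumed data and weighting scheme, not the paper's model) scores a job permutation on a permutation flow shop by a weighted combination of makespan and total weighted tardiness; a re-entrant pass could be represented by listing a job's second visit as an extra entry in the sequence.

```python
# Minimal sketch, assuming hypothetical job data: GA fitness for a
# permutation flow shop, combining makespan and weighted tardiness.

def evaluate(sequence, proc_times, due_dates, weights, alpha=0.5):
    """Weighted sum of makespan and total weighted tardiness.

    sequence   : job order, e.g. [2, 0, 1]
    proc_times : proc_times[job][machine] = processing time
    """
    n_machines = len(proc_times[0])
    completion = [0.0] * n_machines  # completion time on each machine
    tardiness = 0.0
    for job in sequence:
        for m in range(n_machines):
            # a job starts on machine m when both the machine and the
            # job's previous operation are free
            start = max(completion[m], completion[m - 1] if m > 0 else 0.0)
            completion[m] = start + proc_times[job][m]
        tardiness += weights[job] * max(0.0, completion[-1] - due_dates[job])
    makespan = completion[-1]
    return alpha * makespan + (1 - alpha) * tardiness

# Toy example: 3 jobs on 2 machines
print(evaluate([2, 0, 1],
               proc_times=[[3, 2], [1, 4], [2, 2]],
               due_dates=[6, 9, 5],
               weights=[1.0, 2.0, 1.5]))
```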
Abstract:
The activity of a silica-supported BF3–methanol solid acid catalyst in the cationic polymerisation of an industrial aromatic C9 feedstock has been investigated. Reuse has been achieved under continuous conditions. Titration of the catalyst acid sites with triethylphosphine oxide (TEPO) in conjunction with 31P MAS NMR shows the catalyst to have two types of acid sites. Further analysis with 2,6-di-tert-butyl-4-methylpyridine (DBMP) has revealed the majority of these acid sites to be Brønsted in nature. The role of α-methylstyrene in promoting resin polymerisation via chain transfer is proposed.
Abstract:
One of the main problems with the use of synthetic polymers as biomaterials is the invasion of micro-organisms causing infection. A study of the properties of polymeric antibacterial agents, in particular polyhexamethylene biguanide, has revealed that the essential components for the design of a novel polymeric antibacterial are a balance between hydrophilicity and hydrophobicity coupled with sites of cationicity. The effect of cation incorporation on the physical properties of hydrogels has been investigated. Hydrogel systems copolymerised with either N-vinyl imidazole or dimethylaminoethyl methacrylate have been characterised in terms of their water binding, mechanical and surface properties. It has been concluded that the incorporation of these monomers does not adversely affect the properties of such hydrogels and that these materials are potential candidates for further development for use in biomedical applications. It has been reported that hydrogels with ionic character may increase the deposition of biological material onto the hydrogel surface when it is in contact with body fluids. An investigation into the deposition characteristics of hydrogels containing the potentially cationic monomers has been carried out, using specific protein adsorption and in vitro spoilation techniques. The results suggest that at low levels of cationicity, the deposition of positively charged proteins is reduced without adversely affecting the uptake of the other proteins. The gross deposition characteristics were found to be comparable to some commercially available contact lens materials. A preliminary investigation into the development of novel antibacterial polymers has been completed and some novel methods of bacterial inhibition are discussed. These methods include the development of a hydrogel whose potential application is as a catheter coating.
Abstract:
Numerical techniques have been finding increasing use in all aspects of fracture mechanics, and often provide the only means for analyzing fracture problems. The work presented here is concerned with the application of the finite element method to cracked structures. The present work was directed towards the establishment of a comprehensive two-dimensional finite element, linear elastic, fracture analysis package. Significant progress has been made to this end, and features which can now be studied include multi-crack tip mixed-mode problems involving partial crack closure. The crack tip core element was refined and special local crack tip elements were employed to reduce the element density in the neighbourhood of the core region. The work builds upon experience gained by previous research workers and, as part of the general development, the program was modified to incorporate the eight-node isoparametric quadrilateral element. Also, a more flexible solving routine was developed, which provided a very compact method of solving large sets of simultaneous equations stored in a segmented form. To complement the finite element analysis programs, an automatic mesh generation program has been developed, which enables complex problems, involving fine element detail, to be investigated with a minimum of input data. The scheme has proven to be versatile and reasonably easy to implement. Numerous examples are given to demonstrate the accuracy and flexibility of the finite element technique.
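Purely as a modern analogue of that solving step, the sketch below (a stand-in system, not the thesis code) shows a direct sparse solve of an assembled stiffness system K u = f, the operation the segmented solver above performs on much larger problems.

```python
# Minimal sketch, assuming a toy stand-in stiffness matrix: direct solve
# of a large sparse finite element system K u = f.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1000
# Banded stand-in for an assembled stiffness matrix (1-D Laplacian stencil)
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
K = sp.diags([off, main, off], [-1, 0, 1], format="csc")
f = np.ones(n)

u = spla.spsolve(K, f)   # direct sparse solve
print(u[:5])
```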
Abstract:
PURPOSE: To determine the objective measures of visual function that are most relevant to subjective quality of vision and perceived reading ability in patients with acquired macular disease. METHODS: Twenty-eight patients with macular disease underwent a comprehensive assessment of visual function. The patients also completed a vision-related quality-of-life questionnaire that included a section of general questions about perceived visual performance and a section with specific questions on reading. RESULTS: Results of all tests of vision correlated highly with reported vision-related quality-of-life impairment. Low-contrast tests explained most of the variance in self-reported problems with reading. Text-reading speed correlated highly with overall concern about vision. CONCLUSIONS: Reading performance is strongly associated with vision-related quality of life. High-contrast distance acuity is not the only relevant measure of visual function in relation to the perceived visual performance of a patient with macular disease. The results suggest the importance of print contrast, even over print size, in reading performance in patients with acquired macular disease.
Abstract:
The initial aim of this project was to improve the performance of a chromatographic bioreactor-separator (CBRS). In such a system, a dilute enzyme solution is pumped continuously through a preparative chromatographic column, while pulses of substrate are periodically injected onto the column. Enzymic reaction and separation are therefore performed in a single unit operation. The chromatographic columns used were jacketed glass columns ranging from 1 to 2 metres long with an internal diameter of 1.5 cm. Linking these columns allowed 1, 2, 3 and 4 metre long CBRS systems to be constructed. The hydrolysis of lactose in the presence of β-galactosidase was the reaction of study. From previous work at Aston University, there appeared to be no difficulties in achieving complete lactose hydrolysis in a CBRS. There did, however, appear to be scope for improving the separative performance, so this was adopted as an initial goal. Reducing the particle size of the stationary phase was identified as a way of achieving this improvement. A cation exchange resin was selected which had an average particle size of around half that previously used when studying this reaction. A CBRS system was developed which overcame the operational problems (such as high pressure drop development) associated with the use of such a particle size. A significant improvement in separative power was achieved. This was shown by an increase in the number of theoretical plates (N) from about 500 to about 3000 for a 2 metre long CBRS, coupled with higher resolution. A simple experiment with the 1 metre column showed that combined bioreaction and separation was achievable in this system. Having improved the separative performance of the system, the factors affecting enzymic reaction in a CBRS were investigated, including pulse volume and the degree of mixing between enzyme and substrate. The progress of reaction in a CBRS was then studied. This information was related to the interaction of reaction and separation over the reaction zone. The effect of injecting a pulse over a length of time, as in CBRS operation, was simulated by fed-batch experiments. These experiments were performed in parallel with normal batch experiments where the substrate is mixed almost instantly with the enzyme. The batch experiments enabled samples to be taken every minute and revealed that reaction is very rapid. The hydrodynamic characteristics of the two injector configurations used in CBRS construction were studied using Magnetic Resonance Imaging, combined with hydrodynamic calculations. During the optimisation studies, galactooligosaccharides (GOS) were detected as intermediates in the hydrolysis process. GOS are valuable products with potential and existing applications in food manufacture (as nutraceuticals), medicine and drug targeting. The focus of the research was therefore turned to GOS production. A means of controlling reaction to arrest breakdown of GOS was required. Raising temperature was identified as a possible means of achieving this within a CBRS. Studies were undertaken to optimise the yield of oligosaccharides, culminating in the design, construction and evaluation of a Dithermal Chromatographic Bioreactor-separator.
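For reference, the plate number N quoted above is conventionally estimated from a chromatographic peak's retention time and width; two standard textbook relations (general chromatography practice, not specific to this thesis) are shown below, where t_R is the retention time, w_{1/2} the peak width at half height and w_b the baseline width.

```latex
% Standard plate-count relations (textbook chromatography, assumed here):
N = 5.54\left(\frac{t_R}{w_{1/2}}\right)^{2} = 16\left(\frac{t_R}{w_b}\right)^{2}
```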
Using interior point algorithms for the solution of linear programs with special structural features
Abstract:
Linear Programming (LP) is a powerful decision making tool extensively used in various economic and engineering activities. In the early stages the success of LP was mainly due to the efficiency of the simplex method. After the appearance of Karmarkar's paper, the focus of most research shifted to the field of interior point methods. The present work is concerned with investigating and efficiently implementing the latest techniques in this field, taking sparsity into account. The performance of these implementations on different classes of LP problems is reported here. The preconditioned conjugate gradient method is one of the most powerful tools for the solution of the least-squares problem present in every iteration of all interior point methods. The effect of using different preconditioners on a range of problems with various condition numbers is presented. Decomposition algorithms have been one of the main fields of research in linear programming over the last few years. After reviewing the latest decomposition techniques, three promising methods were chosen and implemented. Sparsity is again a consideration, and suggestions have been included to allow improvements when solving problems with these methods. Finally, experimental results on randomly generated data are reported and compared with an interior point method. The efficient implementation of the decomposition methods considered in this study requires the solution of quadratic subproblems. A review of recent work on algorithms for convex quadratic programming was performed. The most promising algorithms are discussed and implemented, taking sparsity into account. The relative performance of these algorithms on randomly generated separable and non-separable problems is also reported.
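To make the preconditioning step concrete, the sketch below (an illustrative random setup, not the thesis implementation) applies a Jacobi-preconditioned conjugate gradient solve to a normal-equations matrix of the form A D A^T, the kind of system that arises, with a changing diagonal D, at every interior point iteration.

```python
# Minimal sketch, assuming random stand-ins for the LP data: solve the
# normal equations (A D A^T) x = b with conjugate gradients and a simple
# Jacobi (diagonal) preconditioner.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
A = sp.random(50, 120, density=0.05, random_state=0, format="csr")
D = sp.diags(rng.uniform(0.5, 2.0, 120))             # stand-in iterate scaling
M = (A @ D @ A.T).tocsc() + 1e-6 * sp.identity(50)   # regularised SPD matrix
b = rng.standard_normal(50)

jacobi = sp.diags(1.0 / M.diagonal())                # diagonal preconditioner
x, info = spla.cg(M, b, M=jacobi)
print(info, np.linalg.norm(M @ x - b))               # info == 0 on convergence
```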
Abstract:
This thesis considers sparse approximation of still images as the basis of a lossy compression system. The Matching Pursuit (MP) algorithm is presented as a method particularly suited for application in lossy scalable image coding. Its multichannel extension, capable of exploiting inter-channel correlations, is found to be an efficient way to represent colour data in RGB colour space. Known problems with MP, namely the high computational complexity of encoding and of dictionary design, are tackled by finding an appropriate partitioning of an image. The idea of performing MP in the spatio-frequency domain after a transform such as the Discrete Wavelet Transform (DWT) is explored. The main challenge, though, is to encode the image representation obtained after MP into a bit-stream. Novel approaches for encoding the atomic decomposition of a signal and for colour amplitude quantisation are proposed and evaluated. The image codec that has been built is capable of competing with scalable coders such as JPEG 2000 and SPIHT in terms of compression ratio.
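For readers unfamiliar with MP, the sketch below is a minimal single-channel version of the algorithm (a generic textbook form; the thesis builds its multichannel extensions, partitioning and bit-stream coding on top of it).

```python
# Minimal matching pursuit sketch (generic textbook form, not the thesis
# codec): greedily pick the atom most correlated with the residual,
# record its amplitude, subtract, and repeat.
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """dictionary: columns are unit-norm atoms.
    Returns (atom index, amplitude) pairs and the final residual."""
    residual = signal.astype(float).copy()
    decomposition = []
    for _ in range(n_atoms):
        correlations = dictionary.T @ residual
        k = int(np.argmax(np.abs(correlations)))   # best-matching atom
        amp = correlations[k]
        decomposition.append((k, amp))
        residual -= amp * dictionary[:, k]         # remove its contribution
    return decomposition, residual

# Toy example: random unit-norm dictionary
rng = np.random.default_rng(1)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)
x = rng.standard_normal(64)
atoms, r = matching_pursuit(x, D, n_atoms=10)
print(len(atoms), np.linalg.norm(r) / np.linalg.norm(x))
```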
Abstract:
The increasing intensity of global competition has led organizations to utilize various types of performance measurement tools for improving the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. In conventional DEA, all data, including input and/or output ratios, take the form of crisp numbers. However, the observed values of data in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems not suitable or difficult to model with crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios, and (4) we present a case study involving 20 banks with three interval ratios to demonstrate the applicability and efficacy of the proposed models, where the traditional indicators are mostly financial ratios. © 2011 Elsevier Inc.
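As a baseline for the interval models, the sketch below (made-up data, and the standard crisp input-oriented CCR multiplier model rather than the paper's new formulations) computes DEA efficiencies by linear programming.

```python
# Minimal sketch, assuming hypothetical crisp data: classical CCR
# multiplier model, maximising u'y_o subject to v'x_o = 1 and
# u'y_j - v'x_j <= 0 for every DMU j, with u, v >= 0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 1.0], [4.0, 3.0], [3.0, 2.0]])   # inputs, rows = DMUs
Y = np.array([[1.0], [2.0], [1.5]])                  # outputs, rows = DMUs

def ccr_efficiency(o):
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])         # maximise u' y_o
    A_ub = np.hstack([Y, -X])                        # u' y_j - v' x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v' x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

for o in range(3):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```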
Abstract:
A numerical method for the Dirichlet initial boundary value problem for the heat equation in the exterior and unbounded region of a smooth closed simply connected 3-dimensional domain is proposed and investigated. This method is based on a combination of a Laguerre transformation with respect to the time variable and an integral equation approach in the spatial variables. Using the Laguerre transformation in time reduces the parabolic problem to a sequence of stationary elliptic problems which are solved by a boundary layer approach giving a sequence of boundary integral equations of the first kind to solve. Under the assumption that the boundary surface of the solution domain has a one-to-one mapping onto the unit sphere, these integral equations are transformed and rewritten over this sphere. The numerical discretisation and solution are obtained by a discrete projection method involving spherical harmonic functions. Numerical results are included.
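To make the reduction explicit, a standard form of the Laguerre transformation used in this setting is sketched below (scaling and notation assumed from the general literature, not quoted from the paper); each coefficient u_n then satisfies a stationary elliptic equation driven by the preceding coefficients, which is what the boundary integral method solves in sequence.

```latex
% Standard Laguerre-transform reduction (notation assumed): expand the
% solution in scaled Laguerre polynomials L_n with parameter \kappa > 0
% and zero initial data,
u(x,t) = \kappa \sum_{n=0}^{\infty} u_n(x)\, L_n(\kappa t),
% so that the heat equation \partial_t u = \Delta u becomes the sequence
% of elliptic problems
\Delta u_n - \kappa\, u_n = \kappa \sum_{m=0}^{n-1} u_m, \qquad n = 0, 1, 2, \ldots
```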