20 results for Segmented thermoplastic

in Aston University Research Archive


Relevance: 20.00%

Abstract:

We present results that compare the performance of neural networks trained with two Bayesian methods, (i) the Evidence Framework of MacKay (1992) and (ii) a Markov Chain Monte Carlo method due to Neal (1996), on a task of classifying segmented outdoor images. We also investigate the use of the Automatic Relevance Determination method for input feature selection.
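The Automatic Relevance Determination (ARD) idea can be illustrated compactly. The sketch below is not the paper's neural-network setup: it applies MacKay-style evidence (type-II maximum likelihood) re-estimation to a Bayesian linear model with one prior precision per input, so that an irrelevant input is driven to a very large precision and effectively pruned. The synthetic data, the assumed known noise precision `beta`, and all names are illustrative.

```python
import numpy as np

def ard_linear(X, y, beta=25.0, n_iter=50):
    """MacKay-style evidence (type-II ML) re-estimation of per-feature
    prior precisions alpha_i in a Bayesian linear model.
    A large alpha_i marks feature i as irrelevant."""
    n, d = X.shape
    alpha = np.ones(d)
    for _ in range(n_iter):
        # Posterior over weights: Sigma = (A + beta X'X)^-1, m = beta Sigma X'y
        Sigma = np.linalg.inv(np.diag(alpha) + beta * X.T @ X)
        m = beta * Sigma @ X.T @ y
        # Fixed-point updates: gamma_i = 1 - alpha_i Sigma_ii, alpha_i = gamma_i / m_i^2
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha = np.minimum(gamma / (m ** 2 + 1e-12), 1e6)  # cap runaway precisions
    return alpha, m

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.2 * rng.normal(size=200)  # feature 2 is pure noise
alpha, m = ard_linear(X, y)   # expect alpha[2] to be driven very large
```

Inputs whose precision hits the cap contribute essentially nothing to the predictor, which is how ARD performs input feature selection.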

Relevance: 20.00%

Abstract:

A homologous series of ultra-violet stabilisers containing the 2-hydroxybenzophenone (HBP) moiety as a UV-absorbing chromophore, with varying alkyl chain lengths and sizes, was prepared by known chemical syntheses. The strong absorbance of the HBP chromophore was utilised to evaluate the concentration of these stabilisers in low density polyethylene (LDPE) films and in relevant solvents by ultra-violet/visible spectroscopy. Intrinsic diffusion coefficients, equilibrium solubilities, volatilities from LDPE films and volatilities of the pure stabilisers were studied over a temperature range of 5-100°C. The effects of structure, molecular weight and temperature on the above parameters were investigated and the results were analysed on the basis of theoretical models published in the literature. It was found that an increase in alkyl chain length does not change the diffusion coefficients to a significant level, while attachment of polar or branched alkyl groups changes their value considerably. An Arrhenius-type relationship for the temperature dependence of diffusion coefficients seems to be valid only for a narrow temperature range, and therefore extrapolation of data from one temperature to another leads to considerable error. The evidence showed that an increase in additive solubility in the polymer is favoured by lower heats of fusion and melting points of the additives. This implies that simple regular solution theory provides an adequate basis for understanding the solubility of additives in polymers. The volatility of stabilisers from low density polyethylene films showed that the loss of an additive from a polymer can be expressed in terms of a first-order kinetic equation.
In addition, the rate of loss of stabilisers was discussed in relation to their diffusion, solubility and volatility; it was found that all these factors may contribute to the additive loss, although one may be the rate-determining factor. Stabiliser migration from LDPE into various solvents and food simulants was studied at 5, 23, 40 and 70°C; from plots of the rate of migration versus the square root of time, characteristic diffusion coefficients were obtained by using the solution of Fick's diffusion equations. It was shown that the rate of migration depends primarily on the partition coefficient of the additive between the solvent and the polymer, and also on the swelling action of the contacting media. Characteristic diffusion coefficients were found to approach the intrinsic values in non-swelling solvents, whereas in the case of highly swollen polymer samples the former may be orders of magnitude greater than the latter.
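The characteristic diffusion coefficients mentioned above come from the short-time solution of Fick's second law for a plane sheet, under which fractional migration grows linearly in the square root of time. A minimal sketch of that calculation on synthetic, noise-free data (the film thickness and diffusion coefficient are hypothetical, not values from the thesis):

```python
import numpy as np

# Short-time solution of Fick's second law for a plane film of
# thickness L losing additive from both faces:
#   M_t / M_inf = (4 / L) * sqrt(D * t / pi)
# so a plot of M_t/M_inf against sqrt(t) is linear with slope
# s = (4 / L) * sqrt(D / pi), giving D = pi * (s * L / 4)**2.

L = 100e-6                     # film thickness, m (hypothetical 100 um LDPE film)
D_true = 1e-13                 # m^2/s, a typical order for additive diffusion

t = np.linspace(0, 3600, 50)[1:]                  # one hour of sampling
frac = (4.0 / L) * np.sqrt(D_true * t / np.pi)    # synthetic 'measured' migration data

slope = np.polyfit(np.sqrt(t), frac, 1)[0]        # fit M_t/M_inf vs sqrt(t)
D_est = np.pi * (slope * L / 4.0) ** 2            # recovered diffusion coefficient
```

On real migration data the same slope-based estimate applies, provided the measurements stay in the short-time regime (fractional loss below roughly 0.6).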

Relevance: 20.00%

Abstract:

PURPOSE: To assess the visual performance and subjective experience of eyes implanted with a new bi-aspheric, segmented, multifocal intraocular lens: the Mplus X (Topcon Europe Medical, Capelle aan den IJssel, Netherlands). METHODS: Seventeen patients (mean age: 64.0 ± 12.8 years) had binocular implantation (34 eyes) with the Mplus X. Three months after the implantation, assessment was made of: manifest refraction; uncorrected and corrected distance visual acuity; uncorrected and distance corrected near visual acuity; defocus curves in photopic conditions; contrast sensitivity; halometry as an objective measure of glare; and patient satisfaction with unaided near vision using the Near Acuity Visual Questionnaire. RESULTS: Mean residual manifest refraction was -0.13 ± 0.51 diopters (D). Twenty-five eyes (74%) were within a mean spherical equivalent of ±0.50 D. Mean uncorrected distance visual acuity was +0.10 ± 0.12 logMAR monocularly and 0.02 ± 0.09 logMAR binocularly. Thirty-two eyes (94%) could read 0.3 or better without any reading correction and all patients could read 0.3 or better with a reading correction. Mean monocular uncorrected near visual acuity was 0.18 ± 0.16 logMAR, improving to 0.15 ± 0.15 logMAR with distance correction. Mean binocular uncorrected near visual acuity was 0.11 ± 0.11 logMAR, improving to 0.09 ± 0.12 logMAR with distance correction. Mean binocular contrast sensitivity was 1.75 ± 0.14 log units at 3 cycles per degree, 1.88 ± 0.20 log units at 6 cycles per degree, 1.66 ± 0.19 log units at 12 cycles per degree, and 1.11 ± 0.20 log units at 18 cycles per degree. Mean binocular and monocular halometry showed a glare profile of less than 1° of debilitating light scatter. Mean Near Acuity Visual Questionnaire Rasch score (0 = no difficulty, 100 = extreme difficulty) for satisfaction for near vision was 20.43 ± 14.64 log-odd units. CONCLUSIONS: The Mplus X provides a good visual outcome at distance and near with minimal dysphotopsia. 
Patients were very satisfied with their uncorrected near vision. © SLACK Incorporated.

Relevance: 20.00%

Abstract:

Poly(l-lactide) (PLL) has been blended with a polycaprolactone-based thermoplastic polyurethane (TPU) elastomer as a toughening agent and a poly(l-lactide-co-caprolactone) (PLLCL) copolymer as a compatibilizer. Both 2-component (PLL/TPU) and 3-component (PLL/TPU/PLLCL) blends were prepared by melt mixing, characterized, hot-pressed into thin sheets and their tensile properties tested. The results showed that, although the TPU could toughen the PLL, the blends were largely immiscible leading to phase separation. However, addition of the PLLCL copolymer improved blend compatibility. The best all-round properties were found for the 3-component blend of composition PLL/TPU/PLLCL = 90/10/10 parts by weight.

Relevance: 10.00%

Abstract:

This paper assesses the extent to which the equity markets of Hungary, Poland, the Czech Republic and Russia have become less segmented. Using a variety of tests, it is shown that there has been a consistent increase in the co-movement of some Eastern European markets and developed markets. Using the variance decompositions from a vector autoregressive representation of returns, it is shown that for Poland and Hungary global factors are having an increasing influence on equity returns, suggestive of increased equity market integration. In this paper we model a system of bivariate equity market correlations as a smooth transition logistic trend model in order to establish how rapidly the countries of Eastern Europe are moving away from market segmentation. We find that Hungary is the country which is becoming integrated most quickly. © 2005 Elsevier Ltd. All rights reserved.
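A smooth transition logistic trend for a bivariate correlation can be sketched as follows; this is an illustrative fit on synthetic data, not the paper's estimator, and the parameter names (`low`, `high`, `gamma`, `tau`) are mine:

```python
import numpy as np
from scipy.optimize import curve_fit

# Smooth transition logistic trend for a bivariate market correlation:
# rho_t moves from a low (segmented) level to a high (integrated) level,
# with gamma controlling the speed of transition and tau its midpoint.
def logistic_trend(t, low, high, gamma, tau):
    return low + (high - low) / (1.0 + np.exp(-gamma * (t - tau)))

rng = np.random.default_rng(1)
t = np.arange(200.0)
rho = logistic_trend(t, 0.1, 0.6, 0.08, 100.0) + 0.02 * rng.normal(size=t.size)

p0 = [0.0, 0.5, 0.05, 90.0]                 # rough starting values for the optimiser
params, _ = curve_fit(logistic_trend, t, rho, p0=p0)
low, high, gamma, tau = params              # fitted gamma measures transition speed
```

Comparing the fitted `gamma` across country pairs is one way to rank how quickly each market is moving away from segmentation, which is the spirit of the comparison reported above.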

Relevance: 10.00%

Abstract:

Recent research has suggested that the A and B share markets of China may be informationally segmented. In this paper volatility patterns in the A and B share markets are studied to establish whether volatility changes in the A and B share markets are synchronous. A consequence of new information, when investors act upon it, is that volatility rises. This means that if the A and B markets are perfectly integrated, volatility changes in each market would be expected to occur at the same time. However, if they are segmented there is no reason for volatility changes to occur on the same day. Using the iterative cumulative sum of squares (ICSS) algorithm across the different markets, evidence is found of integration between the two A share markets but not between the A and B markets. © 2005 Taylor & Francis Group Ltd.
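The detection of a volatility change with the iterative cumulative sum of squares approach can be sketched in a few lines. The following shows one pass of the Inclan-Tiao statistic on synthetic returns with a single variance break; the full ICSS algorithm iterates this test over sub-samples, and 1.358 is the standard 95% asymptotic critical value:

```python
import numpy as np

def icss_breakpoint(returns, crit=1.358):
    """One pass of the Inclan-Tiao test: locate the most likely variance
    break in a return series.  Returns (break index, significant?)."""
    r = np.asarray(returns, dtype=float)
    T = r.size
    C = np.cumsum(r ** 2)                 # cumulative sum of squares
    k = np.arange(1, T + 1)
    D = C / C[-1] - k / T                 # centred cumulative statistic
    k_star = int(np.argmax(np.abs(D)))
    stat = np.sqrt(T / 2.0) * np.abs(D[k_star])
    return k_star + 1, stat > crit

rng = np.random.default_rng(2)
# volatility doubles halfway through: a single variance break at t = 500
r = np.concatenate([rng.normal(0, 1.0, 500), rng.normal(0, 2.0, 500)])
k_star, significant = icss_breakpoint(r)
```

Comparing the dates of the breaks detected in each market is how synchronous volatility changes, and hence market integration, can be assessed.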

Relevance: 10.00%

Abstract:

The cationic polymerisation of various monomers, including cyclic ethers bearing energetic nitrate ester (-ONO2) groups, substituted styrenes and isobutylene, has been investigated. The main reaction studied has been the ring-opening polymerisation of 3-(nitratomethyl)-3-methyl oxetane (NIMMO) using the alcohol/BF3.OEt2 binary initiator system. A series of di-, tri- and tetrafunctional telechelic polymers has been synthesised. In order to optimise the system, achieve controlled molecular weight polymers and understand the mechanism of polymerisation, the effects of certain parameters on the molecular weight distribution, as determined by Size Exclusion Chromatography, have been examined. This shows that the molecular weight achieved depends on a combination of factors including -OH concentration, addition rate of monomer and, most importantly, temperature. A lower temperature and -OH concentration tend to produce higher molecular weights, whereas slower addition rates of monomer either have no significant effect or produce a lower molecular weight polymer. These factors were used to increase the formation of a cyclic oligomer by a side reaction, and suggest that the polymerisation of NIMMO is complicated by end-biting and back-biting reactions, along with other transfer/termination processes. These observations appear to fit the model of an active chain-end mechanism. Another cyclic monomer, glycidyl nitrate (GLYN), has been polymerised by the activated monomer mechanism. Various other monomers have been used to end-cap the polymer chains to produce hydroxy ends which are expected to form more stable urethane links, than the glycidyl nitrate ends, when cured with isocyanates. A novel monomer, butadiene oxide dinitrate (BODN), has been prepared and its homopolymerisation and copolymerisation with GLYN studied. In concurrent work the carbocationic polymerisations of isobutylene and substituted styrenes have been studied.
Materials with narrow molecular weight distributions have been prepared using the diphenyl phosphate/BCl3 initiator. These systems and monomers are expected to be used in the synthesis of thermoplastic elastomers.

Relevance: 10.00%

Abstract:

The assessment of the reliability of systems which learn from data is a key issue to investigate thoroughly before the actual application of information processing techniques to real-world problems. Over recent years Gaussian processes and Bayesian neural networks have come to the fore, and in this thesis their generalisation capabilities are analysed from theoretical and empirical perspectives. Upper and lower bounds on the learning curve of Gaussian processes are investigated in order to estimate the amount of data required to guarantee a certain level of generalisation performance. In this thesis we analyse the effects on the bounds and the learning curve induced by the smoothness of stochastic processes described by four different covariance functions. We also explain the early, linearly-decreasing behaviour of the curves and we investigate the asymptotic behaviour of the upper bounds. The effects of the noise and of the characteristic lengthscale of the stochastic process on the tightness of the bounds are also discussed. The analysis is supported by several numerical simulations. The generalisation error of a Gaussian process is affected by the dimension of the input vector and may be decreased by input-variable reduction techniques. In conventional approaches to Gaussian process regression, the positive definite matrix estimating the distance between input points is often taken to be diagonal. In this thesis we show that a general distance matrix is able to estimate the effective dimensionality of the regression problem as well as to discover the linear transformation from the manifest variables to the hidden-feature space, with a significant reduction of the input dimension.
Numerical simulations confirm the significant superiority of the general distance matrix with respect to the diagonal one. In the thesis we also present an empirical investigation of the generalisation errors of neural networks trained by two Bayesian algorithms, the Markov Chain Monte Carlo method and the evidence framework; the neural networks were trained on the task of labelling segmented outdoor images.
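The role of a general (non-diagonal) distance matrix in Gaussian process regression can be sketched as follows. Assuming a squared-exponential covariance k(x, x') = exp(-0.5 (x-x')' M (x-x')), a rank-one M aligned with the hidden feature direction reduces an apparently two-dimensional regression problem to an effectively one-dimensional one. The data and the choice of M are illustrative, and M is fixed here rather than learned:

```python
import numpy as np

def kernel(X1, X2, M, sig2=1.0):
    """Squared-exponential covariance with a general positive
    semi-definite distance matrix M:
        k(x, x') = sig2 * exp(-0.5 (x-x')^T M (x-x'))"""
    diff = X1[:, None, :] - X2[None, :, :]
    quad = np.einsum('ijk,kl,ijl->ij', diff, M, diff)
    return sig2 * np.exp(-0.5 * quad)

def gp_predict(Xtr, ytr, Xte, M, noise=0.01):
    """GP posterior mean at the test inputs."""
    K = kernel(Xtr, Xtr, M) + noise * np.eye(len(Xtr))
    return kernel(Xte, Xtr, M) @ np.linalg.solve(K, ytr)

rng = np.random.default_rng(3)
Xtr = rng.uniform(-1, 1, size=(80, 2))
ytr = np.sin(2.0 * (Xtr[:, 0] + Xtr[:, 1]))   # varies along one direction only
Xte = rng.uniform(-1, 1, size=(20, 2))
yte = np.sin(2.0 * (Xte[:, 0] + Xte[:, 1]))

w = np.array([[1.0, 1.0]])        # hidden feature direction (assumed known here)
M_general = 4.0 * w.T @ w         # rank-one M: effective dimensionality of one
mu = gp_predict(Xtr, ytr, Xte, M_general)
err = np.mean((mu - yte) ** 2)    # small: the 2-D problem is effectively 1-D
```

In the thesis's setting M would be learned from the data, and its eigenstructure then reveals both the effective dimensionality and the linear map to the hidden-feature space.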

Relevance: 10.00%

Abstract:

This research project focused upon the design strategies adopted by expert and novice designers. It was based upon a desire to compare the design problem solving strategies of novices, in this case key stage three pupils studying technology within the United Kingdom National Curriculum, with designers who could be considered to have developed expertise. The findings helped to provide insights into potential teaching strategies to suit novice designers. Verbal protocols were made as samples of expert and novice designers solved a design problem and talked aloud as they worked. The verbalisations were recorded on video tape. The protocols were transcribed and segmented, with each segment being assigned to a predetermined coding system which represented a model of design problem solving. The results of the encoding were analysed, and consideration was also given to the general design strategy and heuristics used by the expert and novice designers. The drawings and models produced during the generation of the protocols were also analysed and considered. A number of significant differences between the problem solving strategies adopted by the expert and novice designers were identified. First of all, differences were observed in the way expert and novice designers used the problem statement and solution validation during the process. Differences were also identified in the way holistic solutions were generated near the start of the process, and also in the cycles of exploration and the processes of integration. The way design and technological knowledge was used provided further insights into the differences between experts and novices, as did the role of drawing and modelling during the process. In more general terms, differences were identified in the heuristics and overall design strategies adopted by the expert and novice designers. The above findings provided a basis for discussing teaching strategies appropriate for novice designers.
Finally, opportunities for future research were discussed.

Relevance: 10.00%

Abstract:

Numerical techniques have been finding increasing use in all aspects of fracture mechanics, and often provide the only means for analyzing fracture problems. The work presented here is concerned with the application of the finite element method to cracked structures. The present work was directed towards the establishment of a comprehensive two-dimensional finite element, linear elastic, fracture analysis package. Significant progress has been made to this end, and features which can now be studied include multi-crack tip mixed-mode problems, involving partial crack closure. The crack tip core element was refined and special local crack tip elements were employed to reduce the element density in the neighbourhood of the core region. The work builds upon experience gained by previous research workers and, as part of the general development, the program was modified to incorporate the eight-node isoparametric quadrilateral element. Also, a more flexible solving routine was developed, which provided a very compact method of solving large sets of simultaneous equations, stored in a segmented form. To complement the finite element analysis programs, an automatic mesh generation program has been developed, which enables complex problems, involving fine element detail, to be investigated with a minimum of input data. The scheme has proven to be versatile and reasonably easy to implement. Numerous examples are given to demonstrate the accuracy and flexibility of the finite element technique.

Relevance: 10.00%

Abstract:

This paper addresses the problem of automatically obtaining the object/background segmentation of a rigid 3D object observed in a set of images that have been calibrated for camera pose and intrinsics. Such segmentations can be used to obtain a shape representation of a potentially texture-less object by computing a visual hull. We propose an automatic approach where the object to be segmented is identified by the pose of the cameras instead of user input such as 2D bounding rectangles or brush-strokes. The key behind our method is a pairwise MRF framework that combines (a) foreground/background appearance models, (b) epipolar constraints and (c) weak stereo correspondence into a single segmentation cost function that can be efficiently solved by Graph-cuts. The segmentation thus obtained is further improved using silhouette coherency and then used to update the foreground/background appearance models, which are fed into the next Graph-cut computation. These two steps are iterated until the segmentation converges. Our method can automatically provide a 3D surface representation even in texture-less scenes where multi-view stereo (MVS) methods might fail. Furthermore, it confers improved performance in images where the object is not readily separable from the background in colour space, an area that previous segmentation approaches have found challenging. © 2011 IEEE.
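The pairwise-MRF-plus-Graph-cut step can be sketched on a toy instance. The sketch below builds the two-label energy as an s-t flow network and solves it exactly with a minimum cut; simple intensity-based unary costs on a 1-D chain of pixels stand in for the paper's appearance, epipolar and stereo terms, and all numbers are illustrative:

```python
import numpy as np
import networkx as nx

# Two-label pairwise MRF as an s-t network: after the min cut, pixels on
# the source side are labelled foreground.  Cutting s->p pays the cost of
# labelling p background; cutting p->t pays the cost of labelling p
# foreground; neighbour edges with weight lam pay for label boundaries.
I = np.array([0.1, 0.2, 0.1, 0.9, 0.8, 0.9])    # toy pixel intensities
lam = 0.1                                        # smoothness weight

G = nx.DiGraph()
for p, i in enumerate(I):
    G.add_edge('s', p, capacity=i ** 2)          # unary cost of background
    G.add_edge(p, 't', capacity=(i - 1) ** 2)    # unary cost of foreground
for p in range(len(I) - 1):                      # pairwise smoothness terms
    G.add_edge(p, p + 1, capacity=lam)
    G.add_edge(p + 1, p, capacity=lam)

cut_value, (src_side, _) = nx.minimum_cut(G, 's', 't')
labels = [int(p in src_side) for p in range(len(I))]   # 1 = foreground
```

The cut value equals the minimum of the MRF energy, which is why Graph-cuts solves this class of segmentation cost function efficiently and exactly.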

Relevance: 10.00%

Abstract:

Congenital nystagmus is an ocular-motor disorder characterised by involuntary, conjugate and bilateral to-and-fro ocular oscillations. In this study a method is presented to automatically recognise the jerk waveform within a congenital nystagmus recording and to compute the foveation time and the variability of foveation position. The recordings were performed with subjects looking at visual targets presented in nine eye gaze positions; data were segmented into blocks corresponding to each gaze position. The nystagmus cycles were identified by searching for local minima and maxima (SpEp sequence) in intervals centred on each slope change of the eye position signal (position criterion). The SpEp sequence was then refined using an adaptive threshold applied to the eye velocity signal; the outcome is a robust detection of each slow phase start point, which is fundamental for accurately computing several nystagmus parameters. A total of 1206 slow phases was used to compute the specificity of waveform recognition, applying either the position criterion alone or with the addition of the adaptive threshold; the results showed an increase in negative predictive value of 25.1% when both features were used. The duration of each foveation window was measured on raw data or using an interpolating function of the congenital nystagmus slow phases; foveation time estimation was less sensitive to noise in the second case. © 2010.
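The slow-phase start detection described above can be sketched on a synthetic jerk waveform. The sketch below uses a fixed multiple of the median absolute velocity as the adaptive threshold, a simple stand-in for the paper's scheme; the sampling rate, waveform and constants are all assumed:

```python
import numpy as np

fs = 500.0                                # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
# Synthetic jerk nystagmus: a 4 Hz sawtooth = slow drift + quick resetting fast phase
pos = 2.0 * (t * 4.0 - np.floor(t * 4.0)) - 1.0
pos += 0.005 * np.random.default_rng(4).normal(size=t.size)   # measurement noise

vel = np.gradient(pos, 1 / fs)            # eye velocity signal

# Adaptive velocity threshold: a sample belongs to a fast phase when
# |velocity| exceeds a multiple of the median absolute velocity.
thresh = 5.0 * np.median(np.abs(vel))
fast = np.abs(vel) > thresh

# Slow phase start points: samples where a fast phase has just ended.
starts = np.flatnonzero(fast[:-1] & ~fast[1:]) + 1
```

On this 4 Hz waveform the detected start points fall once per cycle, 0.25 s apart; on real recordings the foveation windows would then be measured from these start points.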

Relevance: 10.00%

Abstract:

Purpose: Phonological accounts of reading implicate three aspects of phonological awareness tasks that underlie the relationship with reading: a) the language-based nature of the stimuli (words or nonwords), b) the verbal nature of the response, and c) the complexity of the stimuli (words can be segmented into units of speech). Yet it is uncertain which task characteristics are most important, as they are typically confounded. By systematically varying response type and stimulus complexity across speech and non-speech stimuli, the current study seeks to isolate the characteristics of phonological awareness tasks that drive the prediction of early reading. Method: Four sets of tasks were created: tone stimuli (simple non-speech) requiring a non-verbal response, phonemes (simple speech) requiring a non-verbal response, phonemes requiring a verbal response, and nonwords (complex speech) requiring a verbal response. The tasks were administered to 570 2nd grade children along with standardized tests of reading and non-verbal IQ. Results: Three structural equation models comparing matched sets of tasks were built. Each model consisted of two 'task' factors with a direct link to a reading factor. The following factors predicted unique variance in reading: a) simple speech and non-speech stimuli, b) simple speech requiring a verbal response but not simple speech requiring a non-verbal response, and c) complex and simple speech stimuli. Conclusions: Results suggest that the prediction of reading by phonological tasks is driven by the verbal nature of the response and not by the complexity or 'speechness' of the stimuli. The findings highlight the importance of phonological output processes to early reading.

Relevance: 10.00%

Abstract:

Purpose: Ind suggests front line employees can be segmented according to their level of brand-supporting performance. His employee typology has not been empirically tested. The paper aims to explore front line employee performance in retail banking, and profile employee types. Design/methodology/approach: Attitudinal and demographic data from a sample of 404 front line service employees in a leading Irish bank informs a typology of service employees. Findings: Champions, Outsiders and Disruptors exist within retail banking. The authors provide an employee profile for each employee type. They found Champions amongst males, and older employees. The highest proportion of female employees surveyed were Outsiders. Disruptors were more likely to complain, and rated their performance lower than any other employee type. Contrary to extant literature, Disruptors were more likely to hold a permanent contract than other employee types. Originality/value: The authors augment the literature by providing insights about the profile of three employee types: Brand Champions, Outsiders and Disruptors. Moreover, the authors postulate the influence of leadership and commitment on each employee type. The cluster profiles raise important questions for hiring, training and rewarding front line banking employees. The authors also provide guidelines for managers to encourage Champions, and curtail Disruptors. © Emerald Group Publishing Limited.