19 results for Segmented polyurethanes

in the Aston University Research Archive


Relevance:

20.00%

Publisher:

Abstract:

We present results that compare the performance of neural networks trained with two Bayesian methods, (i) the Evidence Framework of MacKay (1992) and (ii) a Markov Chain Monte Carlo method due to Neal (1996), on a task of classifying segmented outdoor images. We also investigate the use of the Automatic Relevance Determination method for input feature selection.
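
The Automatic Relevance Determination step mentioned above can be illustrated with a small, hedged sketch: scikit-learn's ARDRegression fits a per-feature precision hyperparameter, and a large precision flags an irrelevant input. This is a generic regression illustration on synthetic data, not the paper's Bayesian neural network classifiers for segmented outdoor images.

# Minimal sketch of Automatic Relevance Determination (ARD) for input
# feature selection, using scikit-learn's ARDRegression on synthetic data.
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
n_samples, n_features = 200, 6
X = rng.normal(size=(n_samples, n_features))
# Only the first two features actually influence the target.
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.normal(size=n_samples)

model = ARDRegression()
model.fit(X, y)

# A large precision (lambda_) pushes the corresponding weight towards zero,
# i.e. that input is judged irrelevant.
for i, (w, lam) in enumerate(zip(model.coef_, model.lambda_)):
    print(f"feature {i}: weight={w:+.3f}, precision={lam:.1f}")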

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To assess the visual performance and subjective experience of eyes implanted with a new bi-aspheric, segmented, multifocal intraocular lens: the Mplus X (Topcon Europe Medical, Capelle aan den IJssel, Netherlands). METHODS: Seventeen patients (mean age: 64.0 ± 12.8 years) had binocular implantation (34 eyes) with the Mplus X. Three months after the implantation, assessment was made of: manifest refraction; uncorrected and corrected distance visual acuity; uncorrected and distance corrected near visual acuity; defocus curves in photopic conditions; contrast sensitivity; halometry as an objective measure of glare; and patient satisfaction with unaided near vision using the Near Acuity Visual Questionnaire. RESULTS: Mean residual manifest refraction was -0.13 ± 0.51 diopters (D). Twenty-five eyes (74%) were within a mean spherical equivalent of ±0.50 D. Mean uncorrected distance visual acuity was +0.10 ± 0.12 logMAR monocularly and 0.02 ± 0.09 logMAR binocularly. Thirty-two eyes (94%) could read 0.3 or better without any reading correction and all patients could read 0.3 or better with a reading correction. Mean monocular uncorrected near visual acuity was 0.18 ± 0.16 logMAR, improving to 0.15 ± 0.15 logMAR with distance correction. Mean binocular uncorrected near visual acuity was 0.11 ± 0.11 logMAR, improving to 0.09 ± 0.12 logMAR with distance correction. Mean binocular contrast sensitivity was 1.75 ± 0.14 log units at 3 cycles per degree, 1.88 ± 0.20 log units at 6 cycles per degree, 1.66 ± 0.19 log units at 12 cycles per degree, and 1.11 ± 0.20 log units at 18 cycles per degree. Mean binocular and monocular halometry showed a glare profile of less than 1° of debilitating light scatter. Mean Near Acuity Visual Questionnaire Rasch score (0 = no difficulty, 100 = extreme difficulty) for satisfaction for near vision was 20.43 ± 14.64 log-odd units. CONCLUSIONS: The Mplus X provides a good visual outcome at distance and near with minimal dysphotopsia. Patients were very satisfied with their uncorrected near vision. © SLACK Incorporated.
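
The "within ±0.50 D" outcome above rests on the standard spherical-equivalent summary (sphere plus half the cylinder). A minimal sketch, with made-up example refractions rather than the study's data:

# Spherical equivalent (SE) = sphere + cylinder / 2, the usual summary when
# reporting how many eyes fall within a target refraction band.
def spherical_equivalent(sphere_d: float, cylinder_d: float) -> float:
    return sphere_d + cylinder_d / 2.0

def within_band(se_d: float, band_d: float = 0.50) -> bool:
    return abs(se_d) <= band_d

# Example refractions (sphere, cylinder) in dioptres, invented for illustration.
refractions = [(-0.25, -0.50), (+0.50, -0.25), (-1.00, -0.75)]
for sph, cyl in refractions:
    se = spherical_equivalent(sph, cyl)
    print(f"sphere {sph:+.2f} D, cyl {cyl:+.2f} D -> SE {se:+.2f} D, "
          f"within ±0.50 D: {within_band(se)}")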

Relevance:

10.00%

Publisher:

Abstract:

This paper assesses the extent to which the equity markets of Hungary, Poland, the Czech Republic and Russia have become less segmented. Using a variety of tests, it is shown that there has been a consistent increase in the co-movement of some Eastern European markets and developed markets. Using the variance decompositions from a vector autoregressive representation of returns, it is shown that for Poland and Hungary global factors are having an increasing influence on equity returns, suggestive of increased equity market integration. In this paper we model a system of bivariate equity market correlations as a smooth transition logistic trend model in order to establish how rapidly the countries of Eastern Europe are moving away from market segmentation. We find that Hungary is the country which is becoming integrated the most quickly. © 2005 Elsevier Ltd. All rights reserved.
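
In a smooth transition logistic trend model of a correlation series, the transition-speed parameter quantifies how quickly a market moves from segmentation towards integration. The sketch below fits such a trend to synthetic correlations with scipy's curve_fit; the parameterisation and data are illustrative assumptions, not the paper's estimates.

# Smooth transition logistic trend fitted to a synthetic correlation series:
#   rho_t = a + b / (1 + exp(-g * (t - tau)))
# where g controls the speed of the move away from segmentation.
import numpy as np
from scipy.optimize import curve_fit

def logistic_trend(t, a, b, g, tau):
    return a + b / (1.0 + np.exp(-g * (t - tau)))

rng = np.random.default_rng(1)
t = np.arange(200, dtype=float)
true = logistic_trend(t, 0.1, 0.5, 0.08, 120.0)   # segmentation -> integration
rho = true + 0.05 * rng.normal(size=t.size)       # noisy observed correlations

params, _ = curve_fit(logistic_trend, t, rho, p0=[0.0, 0.5, 0.05, 100.0])
a, b, g, tau = params
print(f"low-regime corr ~ {a:.2f}, shift size ~ {b:.2f}, "
      f"speed g ~ {g:.3f}, midpoint tau ~ {tau:.1f}")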

Relevance:

10.00%

Publisher:

Abstract:

Recent research has suggested that the A and B share markets of China may be informationally segmented. In this paper volatility patterns in the A and B share markets are studied to establish whether volatility changes in the A and B share markets are synchronous. A consequence of new information, when investors act upon it, is that volatility rises. This means that if the A and B markets are perfectly integrated, volatility changes in each market would be expected to occur at the same time. However, if they are segmented there is no reason for volatility changes to occur on the same day. Using the iterative cumulative sum of squares algorithm to locate volatility changes across the different markets, evidence is found of integration between the two A share markets but not between the A and B markets. © 2005 Taylor & Francis Group Ltd.
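
The iterative cumulative sum of squares approach rests on the centred statistic D_k = C_k/C_T - k/T, whose extreme value flags a variance change point. A single-pass sketch on synthetic returns is given below; the full procedure (due to Inclan and Tiao) applies this test iteratively over sub-samples and compares the statistic with tabulated critical values.

# Single-pass sketch of the cumulative-sum-of-squares statistic used by the
# ICSS procedure to locate a volatility change point in a return series.
import numpy as np

rng = np.random.default_rng(2)
returns = np.concatenate([rng.normal(0, 1.0, 300),   # low-volatility regime
                          rng.normal(0, 2.0, 300)])  # high-volatility regime

sq = returns ** 2
C = np.cumsum(sq)                   # C_k, cumulative sum of squared returns
T = returns.size
k = np.arange(1, T + 1)
D = C / C[-1] - k / T               # centred statistic D_k

k_star = int(np.argmax(np.abs(D))) + 1
stat = np.sqrt(T / 2.0) * np.abs(D).max()   # compare with the asymptotic critical value
print(f"candidate change point at observation {k_star}, test statistic {stat:.2f}")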

Relevance:

10.00%

Publisher:

Abstract:

Hydrogels may be conveniently described as hydrophilic polymers that are swollen by, but do not dissolve in, water. In this work a series of copolymer hydrogels and semi-interpenetrating polymer networks based on the monomers 2-hydroxyethyl methacrylate, N-vinyl pyrrolidone and N,N-dimethyl acrylamide, together with some less hydrophilic hydroxyalkyl acrylates and methacrylates, have been synthesised. Variations in structure and composition have been correlated both with the total equilibrium water content of the resultant hydrogel and with the more detailed water binding behaviour, as revealed by differential scanning calorimetry studies. The water binding characteristics of the hydrogels were found to be primarily a function of the water structuring groups present in the gel. The water binding abilities of these groups were, however, modified by steric effects. The mechanical properties of the hydrogels were also investigated. These were found to be dependent on both the polymer composition and the amount and nature of the water present in the gels. In biological systems, composite formation provides a means of producing strong, high water content materials. As an analogy with these systems, hydrogel composites were prepared. In an initial study of these materials the water binding and mechanical properties of semi-interpenetrating polymer networks of N,N-dimethyl acrylamide with cellulosic-type materials, with polyurethanes and with ester-containing polymers were examined. A preliminary investigation of the surface properties of both the copolymers and semi-interpenetrating polymer networks has been completed, using both contact angle measurements and anchorage-dependent fibroblast cells. Measurable differences in surface properties attributable to structural variations in the polymers were detected by droplet techniques in the dehydrated state. However, in the hydrated state these differences were masked by the water in the gels. The use of cells enabled the underlying differences to be probed, and the nature of the water structuring group was again found to be the dominant factor.
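
Two of the quantities correlated in this work, the equilibrium water content and the split between freezing and non-freezing water estimated from a differential scanning calorimetry melting endotherm, reduce to short calculations. A hedged sketch with made-up masses and enthalpies:

# Equilibrium water content (EWC) from hydrated/dehydrated masses, and the
# freezing / non-freezing water split from a DSC melting endotherm.
# All numbers below are invented for illustration.
DELTA_H_FUSION_WATER = 334.0   # J/g, enthalpy of fusion of pure water

def equilibrium_water_content(hydrated_mass_g, dry_mass_g):
    """EWC (%) = (hydrated - dry) / hydrated * 100."""
    return 100.0 * (hydrated_mass_g - dry_mass_g) / hydrated_mass_g

def freezing_water_fraction(melting_enthalpy_j_per_g_gel, water_mass_fraction):
    """Fraction of the water in the gel that freezes, from the DSC enthalpy."""
    freezing_water_per_g_gel = melting_enthalpy_j_per_g_gel / DELTA_H_FUSION_WATER
    return freezing_water_per_g_gel / water_mass_fraction

ewc = equilibrium_water_content(hydrated_mass_g=1.00, dry_mass_g=0.40)
frac = freezing_water_fraction(melting_enthalpy_j_per_g_gel=120.0,
                               water_mass_fraction=ewc / 100.0)
print(f"EWC = {ewc:.1f}%, freezing water = {100 * frac:.1f}% of total water, "
      f"non-freezing = {100 * (1 - frac):.1f}%")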

Relevance:

10.00%

Publisher:

Abstract:

The assessment of the reliability of systems which learn from data is a key issue to investigate thoroughly before the actual application of information processing techniques to real-world problems. Over the recent years Gaussian processes and Bayesian neural networks have come to the fore and in this thesis their generalisation capabilities are analysed from theoretical and empirical perspectives. Upper and lower bounds on the learning curve of Gaussian processes are investigated in order to estimate the amount of data required to guarantee a certain level of generalisation performance. In this thesis we analyse the effects on the bounds and the learning curve induced by the smoothness of stochastic processes described by four different covariance functions. We also explain the early, linearly-decreasing behaviour of the curves and we investigate the asymptotic behaviour of the upper bounds. The effects of the noise and the characteristic lengthscale of the stochastic process on the tightness of the bounds are also discussed. The analysis is supported by several numerical simulations. The generalisation error of a Gaussian process is affected by the dimension of the input vector and may be decreased by input-variable reduction techniques. In conventional approaches to Gaussian process regression, the positive definite matrix estimating the distance between input points is often taken diagonal. In this thesis we show that a general distance matrix is able to estimate the effective dimensionality of the regression problem as well as to discover the linear transformation from the manifest variables to the hidden-feature space, with a significant reduction of the input dimension. Numerical simulations confirm the significant superiority of the general distance matrix with respect to the diagonal one. In the thesis we also present an empirical investigation of the generalisation errors of neural networks trained by two Bayesian algorithms, the Markov Chain Monte Carlo method and the evidence framework; the neural networks have been trained on the task of labelling segmented outdoor images.
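
The general distance matrix discussed above enters the squared-exponential covariance as k(x, x') = exp(-0.5 (x - x')^T M (x - x')). The sketch below contrasts a hand-fixed, nearly rank-one M with the conventional diagonal choice on synthetic data whose target depends on a single hidden direction; in the thesis the matrix is learned, which is what reveals the effective dimensionality.

# GP regression with a squared-exponential kernel whose metric is a general
# positive definite matrix M rather than a diagonal one. M is fixed by hand
# here; learning it is the point of the thesis work summarised above.
import numpy as np

def se_kernel(XA, XB, M):
    d = XA[:, None, :] - XB[None, :, :]           # pairwise differences
    return np.exp(-0.5 * np.einsum('ijk,kl,ijl->ij', d, M, d))

rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(60, 2))
y = np.sin(X[:, 0] + X[:, 1]) + 0.05 * rng.normal(size=60)   # depends on x1 + x2 only

A = np.array([[1.0], [1.0]])
M_general = A @ A.T + 1e-3 * np.eye(2)            # nearly rank-one "general" metric
M_diag = np.eye(2)                                # conventional diagonal metric

Xs = rng.uniform(-2, 2, size=(200, 2))
ys = np.sin(Xs[:, 0] + Xs[:, 1])
noise = 0.05 ** 2

for name, M in [("general", M_general), ("diagonal", M_diag)]:
    K = se_kernel(X, X, M) + noise * np.eye(60)
    mean = se_kernel(Xs, X, M) @ np.linalg.solve(K, y)        # GP predictive mean
    rmse = np.sqrt(np.mean((mean - ys) ** 2))
    print(f"{name:8s} metric: test RMSE = {rmse:.3f}")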

Relevance:

10.00%

Publisher:

Abstract:

This research project focused upon the design strategies adopted by expert and novice designers. It was based upon a desire to compare the design problem solving strategies of novices, in this case key stage three pupils studying technology within the United Kingdom National Curriculum, with designers who could be considered to have developed expertise. The findings helped to provide insights into potential teaching strategies to suit novice designers. Verbal protocols were made as samples of expert and novice designers solved a design problem and talked aloud as they worked. The verbalisations were recorded on video tape. The protocols were transcribed and segmented, with each segment being assigned to a predetermined coding system which represented a model of design problem solving. The results of the encoding were analysed and consideration was also given to the general design strategy and heuristics used by the expert and novice designers. The drawings and models produced during the generation of the protocols were also analysed and considered. A number of significant differences between the problem solving strategies adopted by the expert and novice designers were identified. First of all, differences were observed in the way expert and novice designers used the problem statement and solution validation during the process. Differences were also identified in the way holistic solutions were generated near the start of the process, and also in the cycles of exploration and the processes of integration. The way design and technological knowledge was used provided further insights into the differences between experts and novices, as did the role of drawing and modelling during the process. In more general terms, differences were identified in the heuristics and overall design strategies adopted by the expert and novice designers. The above findings provided a basis for discussing teaching strategies appropriate for novice designers. Finally, opportunities for future research were discussed.
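
Once the protocols have been segmented and each segment assigned a code, much of the comparison reduces to code frequencies (and transitions) in the expert versus novice protocols. A toy sketch with an invented coding system is shown below; the study's actual coding scheme and data differ.

# Toy analysis of encoded protocol segments: how often each design-process
# code occurs in an expert and a novice protocol (codes and sequences invented).
from collections import Counter

expert_codes = ["problem", "generate", "evaluate", "generate", "model",
                "evaluate", "integrate", "evaluate"]
novice_codes = ["generate", "model", "generate", "model", "generate",
                "evaluate", "model"]

for name, codes in [("expert", expert_codes), ("novice", novice_codes)]:
    counts = Counter(codes)
    total = len(codes)
    profile = ", ".join(f"{c}: {n / total:.0%}" for c, n in counts.most_common())
    print(f"{name}: {profile}")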

Relevance:

10.00%

Publisher:

Abstract:

Numerical techniques have been finding increasing use in all aspects of fracture mechanics, and often provide the only means for analysing fracture problems. The work presented here is concerned with the application of the finite element method to cracked structures. The present work was directed towards the establishment of a comprehensive two-dimensional finite element, linear elastic, fracture analysis package. Significant progress has been made to this end, and features which can now be studied include multi-crack tip mixed-mode problems, involving partial crack closure. The crack tip core element was refined and special local crack tip elements were employed to reduce the element density in the neighbourhood of the core region. The work builds upon experience gained by previous research workers and, as part of the general development, the program was modified to incorporate the eight-node isoparametric quadrilateral element. Also, a more flexible solving routine was developed, which provided a very compact method of solving large sets of simultaneous equations stored in a segmented form. To complement the finite element analysis programs, an automatic mesh generation program has been developed, which enables complex problems involving fine element detail to be investigated with a minimum of input data. The scheme has proven to be versatile and reasonably easy to implement. Numerous examples are given to demonstrate the accuracy and flexibility of the finite element technique.
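
As a much-reduced illustration of solving a finite element system from compact storage (not the thesis's two-dimensional fracture code), a one-dimensional bar of axial elements gives a tridiagonal stiffness matrix that can be held in banded form and solved directly with scipy:

# 1-D bar of axial finite elements: assemble only the three diagonals of the
# stiffness matrix and solve the banded system with scipy.linalg.solve_banded.
import numpy as np
from scipy.linalg import solve_banded

n_elem, L, EA, load = 10, 1.0, 1.0, 1.0     # elements, length, axial stiffness, tip load
n_node = n_elem + 1
k = EA / (L / n_elem)                       # stiffness of each element

main = np.full(n_node, 2 * k)
main[0] = main[-1] = k
upper = np.full(n_node, -k)
upper[0] = 0.0                              # solve_banded ignores ab[0, 0]
lower = np.full(n_node, -k)
lower[-1] = 0.0                             # ... and ab[2, -1]
ab = np.vstack([upper, main, lower])        # banded (compact) storage

f = np.zeros(n_node)
f[-1] = load                                # point load at the free end
ab[1, 0] += 1e12                            # fixed support at node 0 via penalty

u = solve_banded((1, 1), ab, f)
print(f"tip displacement = {u[-1]:.4f} (exact PL/EA = {load * L / EA:.4f})")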

Relevance:

10.00%

Publisher:

Abstract:

This paper addresses the problem of automatically obtaining the object/background segmentation of a rigid 3D object observed in a set of images that have been calibrated for camera pose and intrinsics. Such segmentations can be used to obtain a shape representation of a potentially texture-less object by computing a visual hull. We propose an automatic approach where the object to be segmented is identified by the pose of the cameras instead of user input such as 2D bounding rectangles or brush-strokes. The key behind our method is a pairwise MRF framework that combines (a) foreground/background appearance models, (b) epipolar constraints and (c) weak stereo correspondence into a single segmentation cost function that can be efficiently solved by Graph-cuts. The segmentation thus obtained is further improved using silhouette coherency and then used to update the foreground/background appearance models, which are fed into the next Graph-cut computation. These two steps are iterated until the segmentation converges. Our method can automatically provide a 3D surface representation even in texture-less scenes where multi-view stereo (MVS) methods might fail. Furthermore, it confers improved performance in images where the object is not readily separable from the background in colour space, an area that previous segmentation approaches have found challenging. © 2011 IEEE.
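
The core construction, turning unary appearance costs and pairwise smoothness into an s-t graph whose minimum cut gives the binary segmentation, can be sketched on a toy one-dimensional strip of pixels. The epipolar, weak-stereo and silhouette-coherency terms described above are omitted, and the costs below are invented.

# Toy binary MRF segmentation on a 1-D strip of pixels via s-t minimum cut.
# A pixel left on the source side of the cut is labelled foreground, one on
# the sink side background; cut edge capacities pay the corresponding costs.
import networkx as nx

unary_fg = [4.0, 3.5, 1.0, 0.5, 0.2, 3.0]   # cost of labelling each pixel foreground
unary_bg = [0.2, 0.5, 2.5, 3.0, 4.0, 0.4]   # cost of labelling each pixel background
smooth = 1.0                                # penalty when neighbouring labels differ

G = nx.DiGraph()
n = len(unary_fg)
for i in range(n):
    G.add_edge("s", i, capacity=unary_bg[i])   # paid if pixel i ends up background
    G.add_edge(i, "t", capacity=unary_fg[i])   # paid if pixel i ends up foreground
for i in range(n - 1):
    G.add_edge(i, i + 1, capacity=smooth)
    G.add_edge(i + 1, i, capacity=smooth)

cut_value, (source_side, _) = nx.minimum_cut(G, "s", "t")
labels = ["foreground" if i in source_side else "background" for i in range(n)]
print("cut cost:", cut_value)
print(list(enumerate(labels)))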

Relevance:

10.00%

Publisher:

Abstract:

Congenital nystagmus is an ocular-motor disorder characterised by involuntary, conjugate, bilateral, to-and-fro ocular oscillations. In this study a method is presented to automatically recognise the jerk waveform within a congenital nystagmus recording and to compute foveation time and foveation position variability. The recordings were performed with subjects looking at visual targets presented in nine eye gaze positions; data were segmented into blocks corresponding to each gaze position. The nystagmus cycles were identified by searching for local minima and maxima (SpEp sequence) in intervals centred on each slope change of the eye position signal (position criterion). The SpEp sequence was then refined using an adaptive threshold applied to the eye velocity signal; the outcome is a robust detection of each slow phase start point, which is fundamental for accurately computing several nystagmus parameters. A total of 1206 slow phases was used to compute the specificity of waveform recognition when applying only the position criterion or when adding the adaptive threshold; results showed an increase in negative predictive value of 25.1% using both features. The duration of each foveation window was measured on raw data or using an interpolating function of the congenital nystagmus slow phases; foveation time estimation less sensitive to noise was obtained in the second case. © 2010.
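
A hedged sketch of the two-stage idea, candidate turning points from local extrema of the eye position signal followed by slow-phase onsets refined with an adaptive velocity threshold, is given below on a synthetic jerk-like waveform; the signal, the threshold rule and all parameter values are illustrative, not the paper's implementation.

# Candidate cycle peaks from eye-position extrema, then slow-phase onsets
# located where the eye velocity falls back under an adaptive threshold.
import numpy as np
from scipy.signal import find_peaks

fs = 500.0                                    # sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
# Synthetic jerk nystagmus: slow drift away from target, fast corrective phase.
position = 2.0 * ((t * 3.0) % 1.0) - 1.0      # sawtooth, ~3 cycles/s, in degrees
position += 0.02 * np.random.default_rng(4).normal(size=t.size)
velocity = np.gradient(position, 1 / fs)      # deg/s

# Stage 1 (position criterion): candidate turning points from local maxima.
peaks, _ = find_peaks(position, distance=int(0.15 * fs), prominence=0.5)

# Stage 2: adaptive threshold (a multiple of the median |velocity|) marks
# where the fast phase ends and the next slow phase starts.
threshold = 5.0 * np.median(np.abs(velocity))
slow_phase_starts = []
for p in peaks:
    v = np.abs(velocity[p:])
    fast = np.flatnonzero(v > threshold)          # fast-phase samples after the peak
    if fast.size == 0:
        continue
    settled = np.flatnonzero(v[fast[0]:] < threshold)
    if settled.size:
        slow_phase_starts.append(p + fast[0] + int(settled[0]))

print(f"{len(peaks)} cycles detected, velocity threshold = {threshold:.0f} deg/s")
print("first slow-phase start samples:", slow_phase_starts[:5])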

Relevance:

10.00%

Publisher:

Abstract:

Purpose: Phonological accounts of reading implicate three aspects of phonological awareness tasks that underlie the relationship with reading; a) the language-based nature of the stimuli (words or nonwords), b) the verbal nature of the response, and c) the complexity of the stimuli (words can be segmented into units of speech). Yet, it is uncertain which task characteristics are most important as they are typically confounded. By systematically varying response-type and stimulus complexity across speech and non-speech stimuli, the current study seeks to isolate the characteristics of phonological awareness tasks that drive the prediction of early reading. Method: Four sets of tasks were created; tone stimuli (simple non-speech) requiring a non-verbal response, phonemes (simple speech) requiring a non-verbal response, phonemes requiring a verbal response, and nonwords (complex speech) requiring a verbal response. Tasks were administered to 570 2nd grade children along with standardized tests of reading and non-verbal IQ. Results: Three structural equation models comparing matched sets of tasks were built. Each model consisted of two 'task' factors with a direct link to a reading factor. The following factors predicted unique variance in reading: a) simple speech and non-speech stimuli, b) simple speech requiring a verbal response but not simple speech requiring a non-verbal-response, and c) complex and simple speech stimuli. Conclusions: Results suggest that the prediction of reading by phonological tasks is driven by the verbal nature of the response and not the complexity or 'speechness' of the stimuli. Findings highlight the importance of phonological output processes to early reading.
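
As a simplified stand-in for the structural equation models (not the study's latent-variable analysis), the idea of a task factor predicting unique variance in reading can be illustrated with incremental R-squared from hierarchical regression on simulated composite scores; all variable names and data below are invented.

# Unique variance in reading explained by each of two task composites,
# estimated as incremental R^2 (full model minus the model without that
# composite). Simulated data; a stand-in for the paper's SEM comparisons.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
n = 570
verbal_response_task = rng.normal(size=n)
nonverbal_response_task = 0.5 * verbal_response_task + rng.normal(size=n)
reading = 0.6 * verbal_response_task + 0.1 * nonverbal_response_task + rng.normal(size=n)

def r2(X, y):
    return LinearRegression().fit(X, y).score(X, y)

both = np.column_stack([verbal_response_task, nonverbal_response_task])
r2_full = r2(both, reading)
unique_verbal = r2_full - r2(nonverbal_response_task.reshape(-1, 1), reading)
unique_nonverbal = r2_full - r2(verbal_response_task.reshape(-1, 1), reading)
print(f"unique R^2, verbal-response composite: {unique_verbal:.3f}")
print(f"unique R^2, non-verbal-response composite: {unique_nonverbal:.3f}")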

Relevance:

10.00%

Publisher:

Abstract:

Purpose: Ind suggests that front line employees can be segmented according to their level of brand-supporting performance. His employee typology has not been empirically tested. The paper aims to explore front line employee performance in retail banking, and to profile employee types. Design/methodology/approach: Attitudinal and demographic data from a sample of 404 front line service employees in a leading Irish bank inform a typology of service employees. Findings: Champions, Outsiders and Disruptors exist within retail banking. The authors provide an employee profile for each employee type. Champions were found amongst males and older employees. The highest proportion of female employees surveyed were Outsiders. Disruptors were more likely to complain, and rated their performance lower than any other employee type. Contrary to extant literature, Disruptors were more likely to hold a permanent contract than other employee types. Originality/value: The authors augment the literature by providing insights into the profile of three employee types: Brand Champions, Outsiders and Disruptors. Moreover, the authors postulate the influence of leadership and commitment on each employee type. The cluster profiles raise important questions for hiring, training and rewarding front line banking employees. The authors also provide guidelines for managers to encourage Champions, and curtail Disruptors. © Emerald Group Publishing Limited.
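
The typology above is built by profiling attitudinal data; a minimal sketch clusters simulated survey scores with k-means. The choice of k-means and the three attitude scales are assumptions for illustration, since the abstract does not specify the segmentation method.

# Cluster simulated attitudinal scores into a three-group employee typology.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# Columns: brand commitment, self-rated performance, propensity to complain.
champions = rng.normal([4.5, 4.3, 1.8], 0.4, size=(150, 3))
outsiders = rng.normal([3.0, 3.5, 2.5], 0.4, size=(150, 3))
disruptors = rng.normal([2.2, 2.8, 4.2], 0.4, size=(104, 3))
X = StandardScaler().fit_transform(np.vstack([champions, outsiders, disruptors]))

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for c in range(3):
    size = int((kmeans.labels_ == c).sum())
    centre = np.round(kmeans.cluster_centers_[c], 2)
    print(f"cluster {c}: n={size}, standardised profile "
          f"(commitment, performance, complaining) = {centre}")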

Relevance:

10.00%

Publisher:

Abstract:

Purpose: To evaluate the effect of reducing the number of visual acuity measurements made in a defocus curve on the quality of data quantified. Setting: Midland Eye, Solihull, United Kingdom. Design: Evaluation of a technique. Methods: Defocus curves were constructed by measuring visual acuity on a distance logMAR letter chart, randomizing the test letters between lens presentations. The lens powers evaluated ranged between +1.50 diopters (D) and -5.00 D in 0.50 D steps, which were also presented in a randomized order. Defocus curves were measured binocularly with the Tecnis diffractive, Rezoom refractive, Lentis rotationally asymmetric segmented (+3.00 D addition [add]), and Finevision trifocal multifocal intraocular lenses (IOLs) implanted bilaterally, and also for the diffractive IOL and refractive or rotationally asymmetric segmented (+3.00 D and +1.50 D adds) multifocal IOLs implanted contralaterally. Relative and absolute range of clear-focus metrics and area metrics were calculated for curves fitted using 0.50 D, 1.00 D, and 1.50 D steps and a near add-specific profile (ie, distance, half the near add, and the full near-add powers). Results: A significant difference in simulated results was found in at least 1 of the relative or absolute range of clear-focus or area metrics for each of the multifocal designs examined when the defocus-curve step size was increased (P<.05). Conclusion: Faster methods of capturing defocus curves from multifocal IOL designs appear to distort the metric results and are therefore not valid. Financial Disclosure: No author has a financial or proprietary interest in any material or method mentioned. © 2013 ASCRS and ESCRS.
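
The range-of-clear-focus and area metrics referred to above can be sketched on a simulated defocus curve sampled at 0.50 D and 1.50 D steps, which shows how coarser sampling shifts the metric values. The 0.2 logMAR clear-vision criterion and the simulated bifocal-like curve are assumptions, not the study's data.

# Range of clear focus (defocus span with acuity better than a criterion) and
# a trapezoidal area metric, computed from one simulated defocus curve at two
# different step sizes.
import numpy as np

def simulated_acuity(defocus_d):
    """Toy bifocal-like defocus curve (logMAR, lower is better)."""
    distance_focus = 0.00 + 0.12 * defocus_d ** 2
    near_focus = 0.10 + 0.12 * (defocus_d + 3.0) ** 2
    return np.minimum(distance_focus, near_focus)

def metrics(step_d, criterion_logmar=0.2):
    defocus = np.arange(-5.0, 1.5 + 1e-9, step_d)
    acuity = simulated_acuity(defocus)
    range_of_clear_focus = (acuity <= criterion_logmar).sum() * step_d
    gain = criterion_logmar - np.minimum(acuity, criterion_logmar)
    area = float(np.sum((gain[1:] + gain[:-1]) / 2.0 * np.diff(defocus)))
    return range_of_clear_focus, area

for step in (0.5, 1.5):
    rocf, area = metrics(step)
    print(f"step {step:.1f} D: range of clear focus ~ {rocf:.1f} D, area ~ {area:.2f}")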