298 results for Correction de textures


Relevance:

10.00%

Publisher:

Abstract:

Essential hypertension is a highly heritable disorder in which genetic influences predominate over environmental factors. The molecular genetic profiles which predispose to essential hypertension are not known. In rats with genetic hypertension, there is some recent evidence pointing to linkage of renin gene alleles with blood pressure. The genes for renin and antithrombin III belong to a conserved synteny group which, in humans, spans the q21.3-32.3 region of chromosome 1 and, in rats, is linkage group X on chromosome 13. The present study examined the association of particular human renin gene (REN) and antithrombin III gene (AT3) polymorphisms with essential hypertension by comparing the frequency of specific alleles for each of these genes in 50 hypertensive offspring of hypertensive parents and 91 normotensive offspring of normotensive parents. In addition, linkage relationships were examined in hypertensive pedigrees with multiple affected individuals. Alleles of a REN HindIII restriction fragment length polymorphism (RFLP) were detected using a genomic clone, λHR5, to probe Southern blots of HindIII-cut leucocyte DNA, and those for an AT3 PstI RFLP were detected using the phATIII 113 complementary DNA probe. The frequencies of each REN allele in the hypertensive group were 0.76 and 0.24, compared with 0.74 and 0.26 in the normotensive group. For AT3, hypertensive allele frequencies were 0.49 and 0.51, compared with normotensive values of 0.54 and 0.46. These differences were not significant by χ² analysis (P > 0.2).
Linkage analysis of a family (data from 16 family members, 10 of whom were hypertensive), informative for both markers, without an age-of-onset correction, and assuming dominant inheritance of hypertension, complete penetrance and a disease frequency of 20%, did not indicate linkage of REN with hypertension, but gave a positive, although not significant, logarithm of the odds (lod) score of 0.784 at a recombination fraction of 0 for AT3 linkage to hypertension. In conclusion, the present study could find no evidence for an association of a REN HindIII RFLP with essential hypertension, or for linkage of the locus defined by this RFLP in a family segregating for hypertension. In the case of an AT3 PstI RFLP, although association analysis was negative, linkage analysis suggested possible involvement (odds of 6:1 in favour) of a gene located near the 1q23 locus with hypertension in one informative family.
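The χ² comparison of allele frequencies described above can be sketched as follows. This is a minimal illustration, not the authors' code: the 2×2 counts are reconstructed by rounding the reported frequencies against the group sizes, and `chi2_2x2` is a hypothetical helper computing the Pearson statistic without continuity correction.

```python
import math

def chi2_2x2(table):
    """Pearson chi-square statistic and p-value for a 2x2 table (df = 1)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows, cols = [a + b, c + d], [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = rows[i] * cols[j] / n
            stat += (obs - expected) ** 2 / expected
    # With one degree of freedom, the chi-square survival function
    # reduces to erfc(sqrt(x / 2)).
    return stat, math.erfc(math.sqrt(stat / 2))

# Allele counts reconstructed (with rounding) from the reported frequencies:
# 100 REN alleles in 50 hypertensives (0.76 / 0.24) and
# 182 REN alleles in 91 normotensives (0.74 / 0.26).
stat, p = chi2_2x2([[76, 24], [135, 47]])
print(round(stat, 3), round(p, 3))
```

With these reconstructed counts the p-value comes out well above 0.2, consistent with the abstract's report of non-significance.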


Strange encounters, mobility, evocative textures, cultural connections, stories, water, land, travel, discontinuity - the overriding sense of the exhibition and workshops was that they were a meditation on, and a reconfiguration of, the concept of home and belonging. Home and groundedness are never unproblematic, never simply a refuge from the world beyond, but can be disconcerting and disorienting. In this way I viewed being involved in the exhibition as an experience of being unsettled, of myself reflecting on unhomeliness. For me this was partly because curation is a novel disciplinary detour, but also because the artists' voices, their involvement in the workshops and their compelling works made it imperative for us all to intersect our work and ideas, but without a set itinerary. Being a curator or artist was always a collective, mutual, shared event, but clearly not in a claustrophobic communal sense of agreement and consensus. Rather, the events were slightly anxious, uncertain moments which flowed with some lack of fluency, dislocation and apprehension. The result was an exhibition in which diverse visual vocabularies destabilised and questioned the very grounds of belonging beyond the terms I had imagined when we started out. We were all asked to bring down certain borders, to enter a world of flux. It felt simultaneously enthralling and disconcerting.


Purpose Intensity modulated radiotherapy (IMRT) treatments require more beam-on time, and therefore produce more linac head leakage, to deliver doses similar to those of conventional, unmodulated radiotherapy treatments. It is necessary to take this increased leakage into account when evaluating the results of radiation surveys around bunkers that are, or will be, used for IMRT. The recommended procedure of applying a monitor-unit-based workload correction factor to secondary-barrier survey measurements, to account for this increased leakage, can lead to potentially costly overestimation of the required barrier thickness. This study aims to provide initial guidance on the validity of reducing the value of the correction factor when applied to different radiation barriers (primary barriers, doors, maze walls and other walls) by evaluating three different bunker designs. Methods Radiation survey measurements of primary, scattered and leakage radiation were obtained at each of five survey points around each of three different radiotherapy bunkers, and the contribution of leakage to the total measured radiation dose at each point was evaluated. Measurements at each survey point were made with the linac gantry set to 12 equidistant positions from 0° to 330°, to assess the effects of radiation beam direction on the results. Results For all three bunker designs, less than 0.5% of the dose measured at and alongside the primary barriers, less than 25% of the dose measured outside the bunker doors, and up to 100% of the dose measured outside other secondary barriers was found to be caused by linac head leakage. Conclusions The results of this study suggest that IMRT workload corrections are unnecessary for survey measurements made at and alongside primary barriers.
Use of reduced IMRT workload correction factors is recommended when evaluating survey measurements around a bunker door, provided that a subset of the measurements used in this study is repeated for the bunker in question. Reduction of the correction factor for other secondary-barrier survey measurements is not recommended unless the contribution from leakage is separately evaluated.
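The trade-off in the conclusions can be illustrated numerically. This is a minimal sketch with made-up doses and a hypothetical monitor-unit ratio: only the leakage component of a survey measurement is scaled by the IMRT workload factor, so the correction matters only where leakage dominates.

```python
def corrected_dose(total_dose, leakage_fraction, imrt_mu_factor):
    """Scale only the leakage component by the IMRT monitor-unit workload factor."""
    leakage = total_dose * leakage_fraction
    return (total_dose - leakage) + leakage * imrt_mu_factor

mu_factor = 5.0  # hypothetical IMRT-to-conventional monitor-unit ratio

# Primary barrier: <0.5% of the dose is leakage, so the correction is negligible.
print(round(corrected_dose(10.0, 0.005, mu_factor), 2))
# Other secondary barrier: up to 100% leakage, so the correction dominates.
print(round(corrected_dose(10.0, 1.0, mu_factor), 2))
```

For the primary-barrier case the corrected value barely moves (10.0 to 10.2 in this example), whereas the all-leakage case is scaled by the full workload factor.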


Purpose To evaluate the effects of the wearer's pupil size and spherical aberration on visual performance with centre-near, aspheric multifocal contact lenses (MFCLs). The advantage of binocular over monocular vision was also investigated. Methods Twelve young volunteers, with an average age of 27±5 years, participated in the study. LogMAR visual acuity (VA) was measured under cycloplegia for a range of defocus levels (from +3.0 to -3.0D, in 0.5D steps) with no correction and with three aspheric MFCLs (Air Optix Aqua Multifocal, Ciba Vision, Duluth, GA, US) with a centre-near design, providing correction for "Low", "Med" and "High" near demands. Measurements were performed for all combinations of the following conditions: i) artificial pupils of 6mm and 3mm diameter; ii) binocular and monocular (dominant eye) vision. Depth-of-focus (DOF) was calculated from the VA vs. defocus curves. Ocular aberrations under cycloplegia were measured using the iTrace aberrometer. Results VA at -3.0D defocus (simulating near performance) was significantly higher for the 3mm than for the 6mm pupil (p=0.006), and for binocular than for monocular vision (p<0.001). Similarly, DOF was greater for the 3mm pupil (p=0.002) and for binocular viewing conditions (p<0.001, ANOVA). Both VA at -3.0D defocus and DOF increased as the "addition" of the MFCL correction increased. Finally, with the centre-near MFCLs a linear correlation was found between VA at -3.0D defocus and the wearer's ocular spherical aberration (R² = 0.20, p<0.001 for 6mm data), with eyes exhibiting higher positive spherical aberration experiencing lower VAs. By contrast, no correlation was found between VA and spherical aberration at 0.00D defocus (distance vision). Conclusions Both near VA and depth-of-focus improve with these MFCLs, with the effects being more pronounced for small pupils and for binocular than for monocular vision.
Coupling of the wearer's ocular spherical aberration with the aberration profiles provided by MFCLs affects their functionality.


Purpose: In animal models, hemi-field deprivation results in localised, graded vitreous chamber elongation and, presumably, deprivation-induced localised changes in retinal processing. The aim of this research was to determine whether there are variations in ERG responses across the retina in normal chick eyes, and to examine the effect of hemi-field and full-field deprivation on ERG responses across the retina and at earlier times than have previously been examined electrophysiologically. Methods: Chicks were either untreated or wore monocular full-diffusers or half-diffusers (depriving nasal retina) (n = 6-8 per group) from day 8. mfERG responses were measured using the VERIS mfERG system across the central 18.2° × 16.7° (H × V) field. The stimulus consisted of 61 unscaled hexagons, each modulated between black and white according to a pseudorandom binary m-sequence. The mfERG was measured on day 12 in untreated chicks, following 4 days of hemi-field diffuser wear, and 2, 48 and 96 h after application of full-field diffusers. Results: The ERG response of untreated chick eyes did not vary across the measured field; there was no effect of retinal location on the N1-P1 amplitude (p = 0.108) or on P1 implicit time (p > 0.05). This finding is consistent with the retinal ganglion cell density of the chick varying by only a factor of two across the entire retina. Half-diffusers produced a ramped retina and a graded effect of negative lens correction (p < 0.0001); changes in retinal processing were localised. The untreated retina showed increasing complexity of the ERG waveform with development; form-deprivation prevented the increasing complexity of the response at the 2, 48 and 96 h measurement times and produced alterations in response timing. Conclusions: Form-deprivation, with its concomitant loss of image contrast and high-spatial-frequency content, prevented development of the ERG responses, consistent with a disruption of the development of retinal feedback systems.
The characterisation of ERG responses in normal and deprived chick eyes across the retina allows the assessment of concurrent visual and retinal manipulations in this model. (Ophthalmic & Physiological Optics © 2013 The College of Optometrists.)


The purpose of this study was to determine visual performance in water, including the influence of pupil size. The water environment was simulated by placing a goggle filled with saline in front of the eyes, with apertures placed at the front of the goggle. Correction factors were determined for the different magnification under this condition in order to estimate vision in water. Experiments were conducted on letter visual acuity (7 participants), grating resolution (8 participants), and grating contrast sensitivity (1 participant). For letter acuity, the mean loss in vision in water, compared with corrected vision in air, varied from 1.1 log minutes of arc resolution (logMAR) for a 1mm aperture to 2.2 logMAR for a 7mm aperture. Vision in minutes of arc was described well by a linear relationship with pupil size. For grating acuity, the mean loss varied from 1.1 logMAR for a 2mm aperture to 1.2 logMAR for a 6mm aperture. Contrast sensitivity for a 2mm aperture deteriorated as spatial frequency increased, with a 2 log unit loss by 3 cycles/degree. Superimposed on this deterioration were depressions (notches) in sensitivity, with the first three notches occurring at 0.45, 0.8 and 1.3 cycles/degree, with estimates for water of 0.39, 0.70 and 1.13 cycles/degree. In conclusion, vision in water is poor. It becomes worse as pupil size increases, but the effects are much more marked for letter targets than for grating targets.


Purpose To design and manufacture lenses to correct peripheral refraction along the horizontal meridian and to determine whether these resulted in noticeable improvements in visual performance. Method Subjective refraction of a low myope was determined on the basis of best peripheral detection acuity along the horizontal visual field out to ±30° for both horizontal and vertical gratings. Subjective refraction was compared to objective refractions using a COAS-HD aberrometer. Special lenses were made to correct peripheral refraction, based on designs optimized with and without smoothing across a 3 mm diameter square aperture. Grating detection was retested with these lenses. Contrast thresholds of 1.25’ spots were determined across the field for the conditions of best correction, on-axis correction, and the special lenses. Results The participant had high relative peripheral hyperopia, particularly in the temporal visual field (maximum 2.9 D). There were differences > 0.5D between subjective and objective refractions at a few field angles. On-axis correction reduced peripheral detection acuity and increased peripheral contrast threshold in the peripheral visual field, relative to the best correction, by up to 0.4 and 0.5 log units, respectively. The special lenses restored most of the peripheral vision, although not all at angles to ±10°, and with the lens optimized with aperture-smoothing possibly giving better vision than the lens optimized without aperture-smoothing at some angles. Conclusion It is possible to design and manufacture lenses to give near optimum peripheral visual performance to at least ±30° along one visual field meridian. The benefit of such lenses is likely to be manifest only if a subject has a considerable relative peripheral refraction, for example of the order of 2 D.


Purpose: To determine visual performance in water, including the influence of pupil size. Method: The water environment was simulated by placing a goggle filled with saline in front of the eyes, with apertures placed at the front of the goggle. Correction factors were determined for the different magnification under this condition to estimate vision in water. Experiments were conducted on letter visual acuity (7 participants), grating resolution (8 participants), and grating contrast sensitivity (1 participant). Results: For letter acuity, the mean loss in vision in water, compared with corrected vision in air, varied from 1.1 log minutes of arc resolution (logMAR) for a 1mm aperture to 2.2 logMAR for a 7mm aperture. Vision in minutes of arc was described well by a linear relationship with pupil size. For grating acuity, the mean loss varied from 1.1 logMAR for a 2mm aperture to 1.2 logMAR for a 6mm aperture. Contrast sensitivity for a 2mm aperture deteriorated as spatial frequency increased, with a 2 log unit loss by 3 cycles/degree. Superimposed on this deterioration were depressions (notches) in sensitivity, with the first three notches occurring at 0.45, 0.8 and 1.3 cycles/degree, and with estimates for water of 0.39, 0.70 and 1.13 cycles/degree. Conclusion: Vision in water is poor. It becomes worse as pupil size increases, but the effects are much more marked for letter targets than for grating targets.
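The reported linear relationship between vision in minutes of arc and pupil size can be illustrated with a two-point fit to the letter-acuity endpoints quoted above. This is illustrative only, and `predicted_logmar_loss` is a hypothetical helper; the real fit would use all measured apertures.

```python
import math

# Letter-acuity losses reported in the abstract: 1.1 logMAR at a 1 mm
# aperture and 2.2 logMAR at a 7 mm aperture.
points = [(1.0, 1.1), (7.0, 2.2)]            # (aperture mm, logMAR loss)
mar = [(p, 10 ** l) for p, l in points]      # convert logMAR to minutes of arc

(x0, y0), (x1, y1) = mar
slope = (y1 - y0) / (x1 - x0)
intercept = y0 - slope * x0

def predicted_logmar_loss(pupil_mm):
    """Interpolate MAR linearly in pupil size, then convert back to logMAR."""
    return math.log10(slope * pupil_mm + intercept)

print(round(predicted_logmar_loss(4.0), 2))  # intermediate aperture
```

Because MAR (not logMAR) is the quantity assumed linear in pupil size, the interpolated loss at 4 mm falls closer to the 7 mm value than a naive logMAR average would suggest.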


iTRAQ (isobaric tags for relative and absolute quantitation) is a mass spectrometry technology that allows quantitative comparison of protein abundance by measuring peak intensities of reporter ions released from iTRAQ-tagged peptides by fragmentation during MS/MS. However, current data analysis techniques for iTRAQ struggle to report reliable relative protein abundance estimates and suffer from problems of precision and accuracy. The precision of the data is affected by variance heterogeneity: low-signal data have higher relative variability; however, low-abundance peptides dominate data sets. Accuracy is compromised because ratios are compressed toward 1, leading to underestimation of the true ratio. This study investigated both issues and proposed a methodology that combines the peptide measurements to give a robust protein estimate even when the data for the protein are sparse or at low intensity. Our data indicated that ratio compression arises from contamination during precursor ion selection, which occurs at a consistent proportion within an experiment and thus results in a linear relationship between expected and observed ratios. We proposed that a correction factor can be calculated from proteins spiked at known ratios. We then demonstrated that variance heterogeneity is present in iTRAQ data sets irrespective of the analytical package, LC-MS/MS instrumentation, and iTRAQ labelling kit (4-plex or 8-plex) used. We proposed using an additive-multiplicative error model for peak intensities in MS/MS quantitation and demonstrated that a variance-stabilizing normalization is able to address the error structure and stabilize the variance across the entire intensity range. The resulting uniform variance structure simplifies the downstream analysis.
Heterogeneity of variance consistent with an additive-multiplicative model has been reported in other MS-based quantitation, including fields outside of proteomics; consequently, the variance-stabilizing normalization methodology has the potential to increase the capabilities of MS quantitation across diverse areas of biology and chemistry.
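The variance-stabilizing idea can be sketched with the generalised-log transform commonly used for additive-multiplicative error models. This is an illustrative stand-in, not the paper's implementation, and the offset `c` is a hypothetical constant that would normally be estimated from the data.

```python
import math

def glog(x, c=100.0):
    """Generalised log: approximately linear near zero (where additive noise
    dominates) and approximately log for large intensities (where
    multiplicative noise dominates)."""
    return math.log((x + math.sqrt(x * x + c * c)) / 2.0)

# For large intensities glog behaves like an ordinary log ...
print(round(glog(1e6) - math.log(1e6), 6))
# ... while it stays finite and smooth at zero intensity, unlike log.
print(round(glog(0.0), 4))
```

The transform thus interpolates between the two noise regimes, which is what flattens the intensity-dependent variance described in the abstract.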


Novel computer vision techniques have been developed for automatic monitoring of crowded environments such as airports, railway stations and shopping malls. Using video feeds from multiple cameras, the techniques enable crowd counting, crowd flow monitoring, queue monitoring and abnormal event detection. The outcome of the research is useful for surveillance applications and for obtaining operational metrics to improve business efficiency.


The huge amount of CCTV footage available makes it very burdensome to process these videos manually through human operators. This has made automated processing of video footage through computer vision technologies necessary. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task, where the system is trained on normal data and is required to detect events which do not fit the learned 'normal' model. There is no precise and exact definition of an abnormal activity; it depends on the context of the scene. Hence there is a requirement for different feature sets to detect different kinds of abnormal activities. In this work we evaluate the performance of different state-of-the-art features for detecting the presence of abnormal objects in the scene. These include optical flow vectors to detect motion-related anomalies, and textures of optical flow and image textures to detect the presence of abnormal objects. These extracted features, in different combinations, are modelled using different state-of-the-art models, such as the Gaussian mixture model (GMM) and the semi-2D hidden Markov model (HMM), to analyse their performance. Further, we apply perspective normalisation to the extracted features to compensate for perspective distortion due to the distance between the camera and the objects of consideration. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
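The novelty-detection formulation can be sketched as follows, with a single diagonal Gaussian standing in for the GMM and synthetic two-dimensional features standing in for optical-flow descriptors. All numbers are made up; a real system would fit a full mixture to extracted video features.

```python
import math, random

# Synthetic "normal" training features, e.g. (flow magnitude, texture energy).
random.seed(0)
normal = [[random.gauss(1.0, 0.2), random.gauss(0.5, 0.1)] for _ in range(500)]

# Fit a diagonal Gaussian to the normal data.
dims = len(normal[0])
mean = [sum(v[d] for v in normal) / len(normal) for d in range(dims)]
var = [sum((v[d] - mean[d]) ** 2 for v in normal) / len(normal) for d in range(dims)]

def log_likelihood(v):
    """Log-density of a feature vector under the fitted diagonal Gaussian."""
    return sum(-0.5 * (math.log(2 * math.pi * var[d]) + (v[d] - mean[d]) ** 2 / var[d])
               for d in range(dims))

# Flag anything less likely than the least likely training sample.
threshold = min(log_likelihood(v) for v in normal)

print(log_likelihood([1.0, 0.5]) > threshold)   # typical motion: fits the model
print(log_likelihood([5.0, 3.0]) > threshold)   # extreme motion: flagged as anomalous
```

The same structure carries over to a mixture model: only `log_likelihood` changes, from a single Gaussian density to a weighted sum over components.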


Purpose Many contact lens (CL) manufacturers produce simultaneous-image lenses in which power varies either smoothly or discontinuously with zonal radius. We present in vitro measurements of some recent CLs and discuss how power profiles might be approximated in terms of nominal distance corrections, near additions, and on-eye visual performance. Methods Fully hydrated soft, simultaneous-image CLs from four manufacturers (Air Optix AQUA, Alcon; PureVision multifocal, Bausch & Lomb; Acuvue OASYS for Presbyopia, Vistakon; Biofinity multifocal "D" design, CooperVision) were measured with a Phase Focus Lens Profiler (Phase Focus Ltd., Sheffield, UK) in a wet cell, and powers were corrected to powers in air. All lenses had zero labeled power for distance. Results Sagittal power profiles revealed that the "low" add PureVision and Air Optix lenses exhibit smooth (parabolic) profiles, corresponding to negative spherical aberration. The "mid" and "high" add PureVision and Air Optix lenses have biaspheric designs, leading to different rates of power change for the central and peripheral portions. All OASYS lenses display a series of concentric zones, separated by abrupt discontinuities; individual profiles can be constrained between two parabolically decreasing curves, each giving a valid description of the power changes over alternate annular zones. Biofinity lenses have constant power over the central circular region of radius 1.5 mm, followed by an annular zone where the power increases approximately linearly, with the gradient increasing with the add power, and finally an outer zone showing a slow, linear increase in power with a gradient that is almost independent of the add power. Conclusions The variation in power across the simultaneous-image lenses produces enhanced depth of focus.
The through-focus nature of the image, which influences the "best focus" (distance correction) and the reading addition, will vary with several factors, including lens centration, the wearer's pupil diameter, and ocular aberrations, particularly spherical aberration; visual performance with some designs may show greater sensitivity to these factors.
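The three-zone profile described for the Biofinity lenses can be written as a simple piecewise function of zonal radius. The zone boundaries and gradients below are illustrative placeholders, not the measured values.

```python
def power_profile(r_mm, base_power=0.0, add_gradient=1.0, outer_gradient=0.1,
                  central_radius=1.5, mid_radius=3.0):
    """Hypothetical three-zone sagittal power profile (dioptres vs radius):
    constant centre, linearly increasing mid annulus, slow linear outer rise."""
    if r_mm <= central_radius:
        return base_power
    if r_mm <= mid_radius:
        return base_power + add_gradient * (r_mm - central_radius)
    mid_power = base_power + add_gradient * (mid_radius - central_radius)
    return mid_power + outer_gradient * (r_mm - mid_radius)

# Sample the profile across the half-chord of the optic zone.
for r in (0.0, 1.0, 2.0, 3.0, 4.0):
    print(r, round(power_profile(r), 2))
```

Note that the function is continuous at both zone boundaries, matching the description of a smooth transition from the central plateau into the add annulus.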


Bluetooth technology is increasingly used, among automated vehicle identification systems, to retrieve important information about urban networks. Because the movement of Bluetooth-equipped vehicles can be monitored throughout the network of Bluetooth sensors, this technology represents an effective means to acquire accurate time-dependent Origin-Destination information. In order to obtain reliable estimations, however, a number of issues need to be addressed through data filtering and correction techniques. Among the main challenges inherent to Bluetooth data are, first, that Bluetooth sensors may fail to detect all of the nearby Bluetooth-enabled vehicles; as a consequence, the exact journey of some vehicles may become a latent pattern that needs to be estimated. Second, sensors that are in close proximity to each other may have overlapping detection areas, making the task of retrieving the correct travelled path even more challenging. The aim of this paper is twofold: to give an overview of the issues inherent to Bluetooth technology, through the analysis of the data available from the Bluetooth sensors in Brisbane; and to propose a method for retrieving the itineraries of individual Bluetooth vehicles. We argue that estimating these latent itineraries accurately is a crucial step toward the retrieval of accurate dynamic Origin-Destination matrices.
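The itinerary-retrieval problem can be sketched as gap-filling on a sensor graph: when a sensor misses a vehicle, connect consecutive detections by a shortest path through the network. The adjacency list below is hypothetical; a real system would weight edges by travel times and handle overlapping detection zones.

```python
from collections import deque

# Hypothetical sensor adjacency graph (which sensors are road-neighbours).
graph = {
    "A": ["B"], "B": ["A", "C", "D"], "C": ["B", "E"],
    "D": ["B", "E"], "E": ["C", "D"],
}

def shortest_path(src, dst):
    """Breadth-first search returning one shortest sensor sequence."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])

def fill_itinerary(detections):
    """Expand a sparse detection sequence into a full sensor itinerary."""
    itinerary = [detections[0]]
    for nxt in detections[1:]:
        itinerary += shortest_path(itinerary[-1], nxt)[1:]
    return itinerary

# A vehicle detected only at A and E is inferred to have passed intermediate sensors.
print(fill_itinerary(["A", "E"]))
```

With travel-time weights this becomes a shortest-path (or maximum-likelihood) reconstruction rather than a plain hop count, which is closer to what the paper proposes for latent itineraries.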


This paper presents a pose estimation approach that is resilient to typical sensor failure and suitable for low-cost agricultural robots. Guiding large agricultural machinery with highly accurate GPS/INS systems has become standard practice; however, these systems are inappropriate for smaller, lower-cost robots. Our positioning system estimates pose by fusing data from a low-cost global positioning sensor, low-cost inertial sensors and a new technique for vision-based row tracking. The results first demonstrate that our positioning system can accurately guide a robot to perform a coverage task across a 6 hectare field. The results then demonstrate that our vision-based row tracking algorithm improves the performance of the positioning system despite long periods of precision correction signal dropout and intermittent dropouts of the entire GPS sensor.
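The fusion strategy can be caricatured in one dimension: biased dead-reckoning from an inertial-style velocity estimate, corrected by intermittent absolute position fixes, followed by a period of signal dropout. All constants are illustrative, and the vision-based row-tracking component is omitted.

```python
dt = 0.1
true_velocity, measured_velocity = 1.0, 1.05   # dead-reckoning has a 5% bias
alpha = 0.5                                    # weight given to each position fix

true_pos = est = 0.0
for step in range(1, 201):
    true_pos += true_velocity * dt
    est += measured_velocity * dt              # biased integration slowly drifts
    if step % 20 == 0 and step <= 120:         # fixes available, then total dropout
        est = (1 - alpha) * est + alpha * true_pos   # idealised (noise-free) GPS fix

print(round(abs(est - true_pos), 2))  # residual error after 8 s of dropout
```

Even this crude blend keeps the error bounded while fixes arrive; the error then grows only with the drift accumulated during the dropout, which is the failure mode the paper's row-tracking input is designed to cover.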


The detection and correction of defects remains among the most time-consuming and expensive aspects of software development. Extensive automated testing and code inspections may mitigate their effect, but some code fragments are necessarily more likely to be faulty than others, and automated identification of fault-prone modules helps to focus testing and inspections, thus limiting wasted effort and potentially improving detection rates. However, software metrics data is often extremely noisy, with enormous imbalances in the size of the positive and negative classes. In this work, we present a new approach to predictive modelling of fault proneness in software modules, introducing a new feature representation to overcome some of these issues. This rank sum representation offers improved, or at worst comparable, performance relative to earlier approaches on standard data sets, and readily allows the user to choose an appropriate trade-off between precision and recall to optimise inspection effort for different testing environments. The method is evaluated using the NASA Metrics Data Program (MDP) data sets, and performance is compared with existing studies based on the Support Vector Machine (SVM) and Naïve Bayes (NB) classifiers, and with our own comprehensive evaluation of these methods.
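One plausible reading of a rank-sum style representation can be sketched as follows: each software metric is converted to ranks across modules, and the per-module rank sum serves as a single, scale-free risk feature. The metric values are invented and this is not the paper's exact formulation.

```python
# Hypothetical module metrics (lines of code, cyclomatic complexity, code churn).
modules = {
    "parser":    {"loc": 1200, "cyclomatic": 45, "churn": 30},
    "logger":    {"loc": 150,  "cyclomatic": 5,  "churn": 2},
    "scheduler": {"loc": 800,  "cyclomatic": 60, "churn": 25},
}

def rank_sum_scores(modules):
    """Sum each module's per-metric ranks; rank 1 = smallest metric value,
    so a higher total suggests a more fault-prone module."""
    names = list(modules)
    metrics = modules[names[0]].keys()
    scores = {name: 0 for name in names}
    for m in metrics:
        for rank, name in enumerate(sorted(names, key=lambda n: modules[n][m]),
                                    start=1):
            scores[name] += rank
    return scores

scores = rank_sum_scores(modules)
print(max(scores, key=scores.get))  # the module ranked riskiest overall
```

Because ranks are insensitive to outliers and to the scale of each metric, a representation like this is one way to blunt the noise and class imbalance the abstract describes; thresholding the score then gives a direct precision/recall trade-off.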