949 results for Retinal image quality metric
Abstract:
There is a growing demand for data transmission over digital networks involving mobile terminals. An important class of data required for transmission over mobile terminals is image information such as street maps, floor plans and identikit images. This sort of transmission is of particular interest to service industries such as the police force, fire brigade, medical services and other services. These services cannot be provided directly over mobile terminals because of the limited capacity of the mobile channels and the transmission errors caused by multipath (Rayleigh) fading. In this research, the transmission of line diagram images, such as floor plans and street maps, over digital networks involving mobile terminals at transmission rates of 2400 bits/s and 4800 bits/s has been studied. A low bit-rate source encoding technique using geometric codes is found to be suitable for representing line diagram images. In geometric encoding, the amount of data required to represent or store a line diagram image is proportional to the image detail; a simple line diagram image therefore requires only a small amount of data. To study the effect of transmission errors due to mobile channels on the transmitted images, error sources (error files) representing mobile channels under different conditions have been produced using channel modelling techniques. Satisfactory models of the mobile channel have been obtained when compared with field test measurements. Subjective performance tests have been carried out to evaluate the quality and usefulness of the received line diagram images under various mobile channel conditions, and the effect of mobile transmission errors on the quality of the received images has been determined. To improve the quality of the received images under various mobile channel conditions, forward error correcting (FEC) codes with interleaving and automatic repeat request (ARQ) schemes have been proposed. The performance of these error control codes has been evaluated under various mobile channel conditions. It has been shown that an FEC code with interleaving can be used effectively to improve the quality of the received images under both normal and severe mobile channel conditions. Under normal channel conditions, similar results have been obtained when using ARQ schemes; however, under severe mobile channel conditions, the FEC code with interleaving shows better performance.
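The study pairs FEC coding with interleaving to cope with the bursty errors of a Rayleigh-fading channel. As a generic illustration of the interleaving step only (the block dimensions and the surrounding FEC code are assumptions, not details taken from this work), a minimal row-column block interleaver might look like this:

```python
# Minimal sketch of a row-column block interleaver (illustrative only;
# block size and depth are assumed, not taken from the study).

def interleave(bits, rows, cols):
    """Write bits row by row into a rows x cols block, read out column by column.

    Spreads a burst of channel errors across many FEC codewords.
    """
    assert len(bits) == rows * cols, "need exactly one full block"
    block = [bits[r * cols:(r + 1) * cols] for r in range(rows)]
    return [block[r][c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    """Invert interleave(): write column by column, read back row by row."""
    assert len(bits) == rows * cols
    block = [[None] * cols for _ in range(rows)]
    i = 0
    for c in range(cols):
        for r in range(rows):
            block[r][c] = bits[i]
            i += 1
    return [b for row in block for b in row]

if __name__ == "__main__":
    data = list(range(12))                  # stand-in for FEC-encoded bits
    tx = interleave(data, rows=3, cols=4)
    # A burst of errors hits adjacent transmitted symbols, but after
    # de-interleaving those errors are spread across different codewords.
    assert deinterleave(tx, rows=3, cols=4) == data
```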
Abstract:
The aim of this project was to carry out a fundamental study to assess the potential of colour image analysis for use in investigations of fire-damaged concrete. This involved: (a) quantification (rather than purely visual assessment) of colour change as an indicator of the thermal history of concrete; (b) quantification of the nature and intensity of crack development as an indication of the thermal history of concrete, supporting and in addition to the colour change observations; (c) further understanding of changes in the physical and chemical properties of aggregate and mortar matrix after heating; and (d) an indication of the relationship between cracking and non-destructive methods of testing, e.g. UPV or the Schmidt hammer. Results showed that colour image analysis could be used to quantify the colour changes found when concrete is heated. Development of a red colour coincided with a significant reduction in compressive strength. Such measurements may be used to determine the thermal history of concrete by providing information on the temperature distribution that existed at the height of a fire. The actual colours observed depended on the types of cement and aggregate used to make the concrete; with some aggregates it may be more appropriate to analyse only the mortar matrix. Petrographic techniques may also be used to determine the nature and density of cracks developing at elevated temperatures, and values of crack density correlate well with measurements of residual compressive strength. Small differences in crack density were observed with different cements and aggregates, although good correlations were always found with the residual compressive strength. Taken together, these two techniques can provide further useful information for the evaluation of fire-damaged concrete, especially since petrographic analysis can also provide information on the quality of the original concrete, such as cement content and water/cement ratio. Concretes made with blended cements tended to show small differences in physical and chemical properties compared with those made with unblended cements. There is some evidence to suggest that a coarsening of the pore structure in blended cements may lead to the onset of cracking at lower temperatures. DTA/TGA was of little use in assessing the thermal history of concrete made with blended cements. Corner spalling and sloughing off, as observed in columns, was effectively reproduced in tests on small-scale specimens and the crack distributions measured. Relationships between compressive strength/cracking and non-destructive methods of testing are discussed, and an outline procedure for site investigations of fire-damaged concrete is described.
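As a hedged illustration of the kind of colour quantification described (not the authors' actual image analysis procedure), the sketch below computes a simple redness index over a region of interest of a concrete core image; the file name and region coordinates are hypothetical.

```python
# Illustrative sketch only: quantify the red content of a region of interest
# in a concrete section image. Path and ROI coordinates are hypothetical.
import numpy as np
from PIL import Image

def mean_redness(path, box):
    """Mean R/(R+G+B) over a rectangular region of interest.

    A crude proxy for the pink/red discolouration of heated concrete.
    """
    img = Image.open(path).convert("RGB").crop(box)   # box = (left, top, right, bottom)
    rgb = np.asarray(img, dtype=np.float64)
    return float((rgb[..., 0] / rgb.sum(axis=-1).clip(min=1e-9)).mean())

# Hypothetical usage: compare the redness of slices taken at different depths
# from the fire-exposed face to infer the temperature distribution.
# r = mean_redness("core_slice.png", box=(100, 100, 400, 400))
```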
Abstract:
Purpose: To relate structural change to functional change in age-related macular degeneration (AMD) in a cross-sectional population using fundus imaging and visual field status. Methods: 10-degree standard and SWAP visual fields and other standard functional clinical measures were acquired in 44 eyes of 27 patients at various stages of AMD, as well as fundus photographs. Retro-mode SLO images were captured in a subset of 29 eyes of 19 of the patients. Drusen area, measured by automated drusen segmentation software (Smith et al. 2005), was correlated with visual field data. Visual field defect position was compared with the position of the imaged drusen and deposits using custom software. Results: The effect of AMD stage on drusen area within the 6000 µm region was significant (one-way ANOVA: F = 17.231, p < 0.001); however, the trend was not strong across all stages. There were significant linear relationships between visual field parameters and drusen area. The mean deviation (MD) declined by 3.00 dB and 3.92 dB for each log % drusen area for standard perimetry and SWAP, respectively. The visual field parameters of focal loss displayed the strongest correlations with drusen area. The number of pattern deviation (PD) defects increased by 9.30 and 9.68 defects per log % drusen area for standard perimetry and SWAP, respectively. Weaker correlations were found between drusen area and visual acuity, contrast sensitivity, colour vision and reading speed. 72.6% of standard PD defects and 65.2% of SWAP PD defects coincided with retinal signs of AMD on fundus photography; 67.5% of standard PD defects and 69.7% of SWAP PD defects coincided with deposits on retro-mode images. Conclusions: Perimetry exhibited a stronger relationship with drusen area than other measures of visual function. The structure-function relationship between visual field parameters and drusen area was linear. Overall, the indices of focal loss had a stronger correlation with drusen area in SWAP than in standard perimetry. Visual field defects had a high coincidence proportion with retinal manifestations of AMD. Reference: Smith R.T. et al. (2005) Arch Ophthalmol 123:200-206.
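The linear structure-function relationship reported above can be written as MD = a + b·log10(% drusen area). A sketch of fitting such a model with ordinary least squares is shown below; the data points are made up and only illustrate the form of the analysis, not the study's results.

```python
# Sketch of the linear structure-function model MD = a + b * log10(% drusen area),
# using made-up numbers purely to show the form of the fit.
import numpy as np

log_drusen_area = np.array([-0.5, 0.0, 0.5, 1.0, 1.5])      # log10(% drusen area), hypothetical
mean_deviation  = np.array([-0.8, -2.1, -3.9, -5.2, -6.8])   # MD in dB, hypothetical

slope, intercept = np.polyfit(log_drusen_area, mean_deviation, deg=1)
print(f"MD changes by {slope:.2f} dB per log % drusen area")
# The study reports slopes of roughly -3.00 dB (standard perimetry) and -3.92 dB (SWAP).
```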
Abstract:
Purpose: We conducted a small study to assess the novel retro-mode imaging technique of the NIDEK F-10 scanning laser ophthalmoscope for detecting and quantifying retinal drusen. Methods: Fundus photographs of 4 eyes of 2 patients taken in retro-mode on the Nidek F-10 SLO were graded independently by 6 experienced, masked fundus graders for the presence of retinal drusen, and compared with stereo colour fundus photographs taken with a Topcon TRC-50DX camera. Results: The mean number of retinal drusen detected in retro-mode was 142.96 ± 60.8 (range 63-265), and on colour fundus photography 66.6 ± 32.6 (range 26-177). All observers independently detected approximately twice as many drusen in retro-mode as on colour fundus photography (p < 0.0001, Student's paired t-test). The statistical significance of inter-observer variation in drusen detection was p = 0.07 on colour fundus photography and p = 0.02 in retro-mode (ANOVA). Conclusions: The retro-mode of the F-10 camera uses an infrared laser and an aperture with a modified central stop, with the aperture deviated laterally from the confocal light path. This forms a pseudo-3D image, which provides a new means of detecting abnormalities in the deeper retinal layers. Retro-mode imaging of retinal drusen using the Nidek F-10 SLO is a highly sensitive technique for detecting and quantifying retinal drusen, and detected twice as many drusen as colour fundus photography. This small pilot study suggests that this novel type of imaging may have a role in the future detection and analysis of retinal drusen, a field that is likely to become increasingly important in future AMD prevention studies.
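The paired comparison of drusen counts between the two imaging modalities can be reproduced with a standard paired t-test; in the sketch below the per-grader counts are hypothetical, not the study's data.

```python
# Sketch of the paired comparison of drusen counts per grader between
# retro-mode and colour fundus photography (counts below are hypothetical).
from scipy import stats

retro_counts  = [150, 120, 90, 200, 110, 188]   # one count per masked grader
colour_counts = [70, 55, 40, 95, 50, 90]

t_stat, p_value = stats.ttest_rel(retro_counts, colour_counts)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```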
Abstract:
Systemic hypertension is an important public health concern. If optometrists are to perform a more active role in the detection and monitoring of high blood pressure (BP), there is a need to improve the consistency of describing the retinal vasculature and to assess patients' ability to correctly report the diagnosis of hypertension, its control and medication. One hundred and one patients aged over 40 years were dilated and had fundus photography performed. BP was measured, and a self-reported history of general health and current medication was compared with the records of each patient's general practitioner (GP). The status of the retinal vasculature was quantified using a numeric scale by five clinicians, and this was compared with the same evaluation performed with the aid of a basic pictorial grading scale. Image analysis was used to objectively measure the artery-to-vein (A/V) ratio and arterial reflex. Arteriolar tortuosity and calibre changes were found to be the most sensitive retinal signs of high BP. Using the grading scale to describe the retinal vasculature significantly improved inter- and intra-observer repeatability. Almost half the patients examined were on medication for high BP or cardiovascular disease. Patients' ability to give their complete medical history was poor, as was their ability to recall what medication they had been prescribed. GPs indicated it was useful to receive details of their patients' BP when it was >140/90 mmHg. The use of an improved description of the retinal vasculature and stronger links between optometrists and GPs may enhance future patient care. © 2001 The College of Optometrists. Published by Elsevier Science Ltd. All rights reserved.
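The artery-to-vein (A/V) ratio referred to here is simply the mean arteriolar calibre divided by the mean venular calibre measured from the same photograph; a minimal sketch with hypothetical calibre values follows.

```python
# Minimal sketch: artery-to-vein (A/V) ratio from vessel calibres measured
# on a fundus photograph (pixel values below are hypothetical).
def av_ratio(artery_widths, vein_widths):
    """Mean arteriolar calibre divided by mean venular calibre."""
    return (sum(artery_widths) / len(artery_widths)) / (sum(vein_widths) / len(vein_widths))

print(av_ratio(artery_widths=[11.2, 10.8, 12.1], vein_widths=[15.9, 16.4, 15.1]))
# Generalised arteriolar narrowing in hypertension lowers this ratio.
```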
Abstract:
Image content interpretation depends strongly on the efficiency of segmentation. The requirements of image recognition applications call for models of a new type that provide a bridge between low-level image processing, in which images are segmented into disjoint regions and features are extracted from each region, and high-level analysis, which uses the obtained set of features for making decisions. Such analysis requires some a priori information, measurable region properties, heuristics, and plausibility of computational inference. Sometimes, to produce a reliable and true conclusion, simultaneous processing of several partitions is desired. In this paper, a set of operations on obtained image segmentations and a nested-partitions metric are introduced.
Abstract:
A new distance function for comparing arbitrary partitions is proposed. Clustering of image collections and image segmentation provide the objects to be matched. The proposed metric is intended to combine visual features with metadata analysis in order to bridge the semantic gap between low-level visual features and high-level human concepts.
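Neither abstract specifies the proposed distance function, so the sketch below shows a standard, widely used partition distance (one minus the Rand index) purely as a point of reference for what comparing arbitrary partitions involves; it is not the metric proposed in these papers.

```python
# Generic illustration only: a standard partition distance (1 - Rand index)
# between two labelings of the same items. This is NOT the metric proposed
# in the papers above, just a common baseline for comparing partitions.
from itertools import combinations

def rand_distance(labels_a, labels_b):
    """Fraction of item pairs on which the two partitions disagree."""
    assert len(labels_a) == len(labels_b)
    pairs = list(combinations(range(len(labels_a)), 2))
    disagreements = 0
    for i, j in pairs:
        same_a = labels_a[i] == labels_a[j]
        same_b = labels_b[i] == labels_b[j]
        if same_a != same_b:
            disagreements += 1
    return disagreements / len(pairs)

# Two segmentations of six regions into clusters:
print(rand_distance([0, 0, 1, 1, 2, 2], [0, 0, 0, 1, 1, 2]))
```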
Abstract:
Metrics estimate the quality of different aspects of software. In particular, cohesion indicates how well the parts of a system hold together. A metric to evaluate class cohesion is important in object-oriented programming because it gives an indication of good class design. There are several proposed metrics for class cohesion, but they have various problems (for instance, low discrimination). In this paper, a new metric to evaluate class cohesion, called SCOM, is proposed; it has several relevant features. It has an intuitive and analytical formulation, which is necessary for applying it to large software systems. It is normalized to produce values in the range [0..1], thus yielding meaningful values. It is also more sensitive than those previously reported in the literature. The attributes and methods used to evaluate SCOM are unambiguously stated. SCOM has an analytical threshold, which is a very useful but rare feature in software metrics. We assess the metric with several sample cases, showing that it gives more sensitive values than other well-known cohesion metrics.
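The abstract does not give SCOM's formula, so the sketch below only illustrates the general idea behind such metrics: scoring a class in [0, 1] from how strongly its method pairs share attributes. It is an assumption-laden stand-in, not SCOM itself.

```python
# Illustration only: a simple class-cohesion score in [0, 1] based on how
# many attributes each pair of methods shares. This shows the general idea
# measured by metrics such as SCOM, but it is NOT SCOM's actual formula.
from itertools import combinations

def pairwise_cohesion(method_attrs):
    """method_attrs: dict mapping method name -> set of attributes it uses."""
    methods = list(method_attrs)
    pairs = list(combinations(methods, 2))
    if not pairs:
        return 1.0  # a class with 0 or 1 methods is trivially cohesive
    score = 0.0
    for m1, m2 in pairs:
        a, b = method_attrs[m1], method_attrs[m2]
        union = a | b
        score += len(a & b) / len(union) if union else 0.0
    return score / len(pairs)

print(pairwise_cohesion({
    "deposit":  {"balance", "history"},
    "withdraw": {"balance", "history"},
    "report":   {"history"},
}))
```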
Abstract:
This research focuses on the optimisation of resource utilisation in wireless mobile networks while considering the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated. These include video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems. A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment have been used in the validation tests. It has been shown that Pause Intensity is closely correlated with subjective quality measurement in terms of the Mean Opinion Score, and that this correlation is content independent. Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach concerns both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with consideration of the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment in which the proposed research framework is examined, and the results are compared with existing scheduling methods in terms of the achievable fairness, efficiency and correlation. Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users according to the perceived quality of the services they receive. Meanwhile, a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions and the shape of the QoE distribution amongst the users for different scheduling policies have been demonstrated in the context of LTE. Finally, work on the interworking between the mobile communication system at the macro-cell level and different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism of the user's data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fairness-efficiency spectrum.
The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and of each of the WiFi access points involved. The performance of non-seamless, user-controlled mobile traffic offloading (through mobile WiFi devices) has been evaluated and compared with that of standard operator-controlled WiFi hotspots.
Abstract:
In this paper, a new approach to the resource allocation and scheduling mechanism that reflects the user's Quality of Experience is presented. The proposed scheduling algorithm is examined in the context of the 3GPP Long Term Evolution (LTE) system. Pause Intensity (PI), an objective, no-reference quality assessment metric, is employed to represent user satisfaction in the scheduler of the eNodeB. PI is in effect a measurement of discontinuity in the service. The performance of the proposed scheduling method is compared with two extreme cases: the maxCI and Round Robin scheduling schemes, which correspond to efficiency-oriented and fairness-oriented mechanisms, respectively. Our work reveals that the proposed method is able to operate between the fairness and efficiency requirements, in favour of higher user satisfaction up to the desired level. © VDE VERLAG GMBH.
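To make the comparison concrete, the sketch below contrasts Round Robin, maxCI and a simple QoE-weighted selection rule for choosing which user receives the next resource block; the weighting rule and all numbers are illustrative assumptions, not the scheduler proposed in the paper.

```python
# Illustrative stand-in only (not the paper's algorithm): three ways to pick
# which user gets the next resource block. maxCI maximises efficiency, Round
# Robin maximises fairness, and a QoE-weighted rule trades between the two
# by favouring users whose playback is suffering (high pause intensity).
def round_robin(t, users, cqi, pause_intensity):
    return users[t % len(users)]

def max_ci(t, users, cqi, pause_intensity):
    return max(users, key=lambda u: cqi[u])

def qoe_weighted(t, users, cqi, pause_intensity):
    # weight channel quality by how badly each user's QoE is degraded
    return max(users, key=lambda u: cqi[u] * (1.0 + pause_intensity[u]))

users = ["u1", "u2", "u3"]
cqi = {"u1": 12.0, "u2": 7.5, "u3": 9.0}     # hypothetical channel quality indicators
pi  = {"u1": 0.05, "u2": 0.90, "u3": 0.10}   # hypothetical pause intensity per user
for policy in (round_robin, max_ci, qoe_weighted):
    print(policy.__name__, policy(0, users, cqi, pi))
```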
Abstract:
This work looks into video quality assessment applied to the field of telecare and proposes an alternative metric to the more traditionally used PSNR, based on the requirements of such an application. We show that the Pause Intensity metric introduced in [1] is also relevant and applicable to heterogeneous networks with a wireless last hop connected to a wired TCP backbone. We demonstrate through our emulation testbed that the impairments experienced in such a network architecture are dominated by continuity-based impairments rather than artifacts such as motion drift or blockiness. We also look into the implications of using Pause Intensity as a metric in terms of the overall video latency, which is potentially problematic should the video be sent and acted upon in real time. We conclude that Pause Intensity may be used alongside the video characteristics which have been suggested as a measure of the overall video quality. © 2012 IEEE.
Abstract:
In this work, we investigate a new objective measurement for assessing video playback quality for services delivered in networks that use TCP as the transport layer protocol. We define the new metric, pause intensity, to characterize the quality of playback in terms of its continuity since, in the case of TCP, data packets are protected from losses but not from delays. Using packet traces generated from real TCP connections in a lossy environment, we are able to simulate the playback of a video and monitor buffer behaviour in order to calculate pause intensity values. We also run subjective tests to verify the effectiveness of the metric introduced and show that the pause intensity results and the subjective scores obtained for the same real video clips are closely correlated.
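A hedged sketch of the buffer monitoring described here is given below: a constant-rate playout is fed from a made-up arrival trace, and pauses are recorded whenever the buffer empties. The final combination of pause count and duration is a simplified placeholder, not necessarily the exact pause intensity formula used by the authors.

```python
# Sketch of monitoring a playback buffer to extract pauses (buffer under-runs).
# The arrival trace is made up, and the "intensity" combination printed at the
# end is a simplified placeholder, not the authors' exact formula.
def playback_pauses(arrivals, playout_rate, dt=1.0):
    """arrivals: bytes received in each interval of length dt (from a TCP trace).

    Returns (number of pauses, total paused time) for a constant playout rate.
    """
    buffer_bytes, paused, pauses, paused_time = 0.0, False, 0, 0.0
    for received in arrivals:
        buffer_bytes += received
        need = playout_rate * dt
        if buffer_bytes >= need:          # enough data to keep playing
            buffer_bytes -= need
            paused = False
        else:                             # under-run: playback stalls
            if not paused:
                pauses += 1
                paused = True
            paused_time += dt
    return pauses, paused_time

arrivals = [120, 120, 0, 0, 300, 120, 0, 240]   # bytes per second, hypothetical
n, t = playback_pauses(arrivals, playout_rate=120)
print(f"{n} pauses, {t:.0f} s paused; intensity ~ pauses x duration / total time "
      f"= {n * t / (len(arrivals) * 1.0):.2f}")
```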
Abstract:
In this work we deal with video streams over TCP networks and propose an alternative measurement to the widely used and accepted peak signal-to-noise ratio (PSNR), owing to the limitations of this metric in the presence of temporal errors. A test-bed was created to simulate buffer under-run in scalable video streams, and the pauses produced as a result of the buffer under-run were inserted into the video before it was used as the subject of subjective testing. The pause intensity metric proposed in [1] was compared with the subjective results, and it was shown that, in spite of reductions in frame rate and resolution, a correlation with pause intensity still exists. Based on these conclusions, the metric may be employed in layer selection for scalable video streams. © 2011 IEEE.
Abstract:
Attempts at carrying out terrorist attacks have become more prevalent. As a result, an increasing number of countries have become particularly vigilant against the means by which terrorists raise funds to finance their draconian acts against human life and property. Among the many counter-terrorism agencies in operation, governments have set up financial intelligence units (FIUs) within their borders for the purpose of tracking down terrorists' funds. By investigating reported suspicious transactions, FIUs attempt to weed out financial criminals who use these illegal funds to finance terrorist activity. The prominent role played by FIUs means that their performance is always under the spotlight. By interviewing experts and conducting surveys of those associated with the fight against financial crime, this study investigated perceptions of FIU performance on a comparative basis between American and non-American FIUs. The target group of experts included financial institution personnel, civilian agents, law enforcement personnel, academicians, and consultants. Questions for the interviews and surveys were based on Kaplan and Norton's Balanced Scorecard (BSC) methodology. One of the objectives of this study was to help determine the suitability of the BSC to this arena. While FIUs in this study have concentrated on performance by measuring outputs such as the number of suspicious transaction reports investigated, this study calls for a focus on outcomes involving all the parties responsible for financial criminal investigations. It is only through such an integrated approach that these various entities will be able to improve performance in solving financial crime. Experts in financial intelligence strongly believed that the quality and timeliness of intelligence was more important than keeping track of the number of suspicious transaction reports. Finally, this study concluded that the BSC could be appropriately applied to the arena of financial crime prevention even though the emphasis is markedly different from that in the private sector. While priority in the private sector is given to financial outcomes, in this arena employee growth and internal processes were perceived as most important in achieving a satisfactory outcome.
Abstract:
The development of 3G (third-generation telecommunication) value-added services brings higher requirements for Quality of Service (QoS). Wideband Code Division Multiple Access (WCDMA) is one of the three 3G standards, and enhancement of QoS for the WCDMA Core Network (CN) is becoming increasingly important for users and carriers. This dissertation focuses on the enhancement of QoS for the WCDMA CN; the purpose is to realize the DiffServ (Differentiated Services) model of QoS for the WCDMA CN. Based on the parallelism characteristic of Network Processors (NPs), NP programming models are classified as Pool of Threads (POTs) and Hyper Task Chaining (HTC). In this study, an integrated programming model that combines both of these models was designed. This model is highly efficient and flexible, and also solves the problems of sharing conflicts and packet ordering. We used this model as the programming model to realize DiffServ QoS for the WCDMA CN. The realization mechanism of the DiffServ model mainly consists of buffer management, packet scheduling and packet classification algorithms based on NPs. First, we proposed an adaptive buffer management algorithm called Packet Adaptive Fair Dropping (PAFD), which takes into consideration both fairness and throughput and has smooth service curves. Then, an improved packet scheduling algorithm called Priority-based Weighted Fair Queuing (PWFQ) was introduced to ensure the fairness of packet scheduling and to reduce the queueing time of data packets, while keeping delay and jitter within a small range. Thirdly, a multi-dimensional packet classification algorithm called Classification Based on Network Processors (CBNPs) was designed; it effectively reduces memory accesses and storage space, and provides lower time and space complexity. Lastly, an integrated hardware and software system implementing the DiffServ model of QoS for the WCDMA CN was proposed and implemented on the NP IXP2400. According to the corresponding experimental results, the proposed system significantly enhances QoS for the WCDMA CN: it substantially improves the consistency of response time, display distortion and sound-image synchronization, and thus increases network efficiency and saves network resources.
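As background for the packet scheduling part only, the sketch below shows a simplified, textbook weighted fair queuing rule based on virtual finish times; it ignores the system virtual time and any priority extension, so it is a generic illustration rather than the PWFQ algorithm proposed in the dissertation.

```python
# Textbook illustration only: a simplified weighted fair queuing order based
# on per-flow virtual finish times. This is NOT the dissertation's PWFQ
# algorithm, just the generic WFQ idea it builds on.
def wfq_order(packets, weights):
    """packets: list of (flow, size); weights: dict flow -> weight.

    Assigns each packet a virtual finish time and returns transmission order.
    """
    finish = {flow: 0.0 for flow in weights}      # running finish time per flow
    tagged = []
    for seq, (flow, size) in enumerate(packets):
        finish[flow] += size / weights[flow]      # higher weight -> earlier finish
        tagged.append((finish[flow], seq, flow, size))
    return [(flow, size) for _, _, flow, size in sorted(tagged)]

packets = [("voice", 200), ("video", 1500), ("voice", 200), ("data", 1500)]
weights = {"voice": 4.0, "video": 2.0, "data": 1.0}   # hypothetical class weights
print(wfq_order(packets, weights))
# Voice packets are served first because their small size and high weight
# give them the earliest virtual finish times.
```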