53 results for automated full waveform logging system
Abstract:
Morphological changes in the retinal vascular network are associated with future risk of many systemic and vascular diseases. However, uncertainty over the presence and nature of some of these associations exists. Analysis of data from large population-based studies will help to resolve these uncertainties. The QUARTZ (QUantitative Analysis of Retinal vessel Topology and siZe) retinal image analysis system allows automated processing of large numbers of retinal images. However, an image quality assessment module is needed to achieve full automation. In this paper, we propose such an algorithm, which uses the segmented vessel map to determine the suitability of retinal images for use in the creation of vessel morphometric data suitable for epidemiological studies. This includes an effective 3-dimensional feature set and support vector machine classification. A random subset of 800 retinal images from UK Biobank (a large prospective study of 500,000 middle-aged adults, of whom 68,151 underwent retinal imaging) was used to examine the performance of the image quality algorithm. The algorithm achieved a sensitivity of 95.33% and a specificity of 91.13% for the detection of inadequate images. The strong performance of this image quality algorithm will make rapid automated analysis of vascular morphometry feasible on the entire UK Biobank dataset (and other large retinal datasets), with minimal operator involvement and at low cost.
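The abstract names the technique (a 3-dimensional feature set fed to a support vector machine) but not the implementation; purely as a hypothetical sketch of that final classification stage, the Python below trains an SVM on a 3-dimensional feature vector per image. The feature semantics and all data are invented placeholders, not taken from QUARTZ.

```python
# Hypothetical sketch of the final classification stage: an SVM trained on a
# 3-dimensional feature vector derived from each segmented vessel map.
# Feature meanings and data are invented placeholders, not from the paper.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)

# Placeholder features per image, e.g. vessel area fraction, segment count,
# mean segment length (names assumed for illustration).
X = rng.random((800, 3))
y = rng.integers(0, 2, 800)          # 1 = adequate image, 0 = inadequate

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)

# Sensitivity/specificity for the "inadequate" class, the metrics reported.
tn, fp, fn, tp = confusion_matrix(y_te == 0, clf.predict(X_te) == 0).ravel()
print(f"sensitivity={tp / (tp + fn):.2%}  specificity={tn / (tn + fp):.2%}")
```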
Abstract:
The performance of an air-cycle refrigeration unit for road transport, which had been previously reported, was analysed in detail and compared with the original design model and an equivalent Thermo King SL200 vapour-cycle refrigeration unit. Poor heat exchanger performance was found to be the major contributor to low coefficient of performance values. Using state-of-the-art, but achievable performance levels for turbomachinery and heat exchangers, the performance of an optimised air-cycle refrigeration unit for the same application was predicted. The power requirement of the optimised air-cycle unit was 7% greater than the equivalent vapour-cycle unit at full-load operation. However, at part-load operation the air-cycle unit was estimated to absorb 35% less power than the vapour-cycle unit. The analysis demonstrated that the air-cycle system could potentially match the overall fuel consumption of the vapour-cycle transport refrigeration unit, while delivering the benefit of a completely refrigerant free system.
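For orientation on why component efficiencies dominate air-cycle performance, below is a minimal sketch of the textbook ideal reverse-Brayton (air-cycle) coefficient of performance, which ignores exactly the heat-exchanger and turbomachinery losses the paper identifies as decisive; the pressure-ratio values are assumed for illustration, not taken from the unit analysed.

```python
# Ideal reverse-Brayton (air-cycle) COP sketch, neglecting the component
# losses the paper analyses. For an ideal gas with isentropic compression
# and expansion:
#   COP_ideal = 1 / (r_p**((gamma - 1) / gamma) - 1)
gamma = 1.4                          # ratio of specific heats for air
for r_p in (1.5, 2.0, 3.0):          # overall pressure ratio (assumed values)
    cop = 1.0 / (r_p ** ((gamma - 1.0) / gamma) - 1.0)
    print(f"pressure ratio {r_p}: ideal COP = {cop:.2f}")
```

The steep fall of the ideal COP with pressure ratio illustrates why recovering work in efficient turbomachinery, and minimising the pressure ratio via good heat exchangers, matters so much for a practical air-cycle unit.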
Abstract:
The environmental attractions of air-cycle refrigeration are considerable. Following a thermodynamic design analysis, an air-cycle demonstrator plant was constructed within the restricted physical envelope of an existing Thermo King SL200 trailer refrigeration unit. This unique plant operated satisfactorily, delivering sustainable cooling for refrigerated trailers using a completely natural and safe working fluid. The full-load capacity of the air-cycle unit at -20 °C was 7.8 kW, 8% greater than the equivalent vapour-cycle unit, but the fuel consumption of the air-cycle plant was excessively high. However, at part-load operation the disparity in fuel consumption dropped from approximately 200% to around 80%. The components used in the air-cycle demonstrator were not optimised and considerable potential exists for efficiency improvements, possibly to the point where the air-cycle system could rival the efficiency of the standard vapour-cycle system at part-load operation, which represents the biggest proportion of operating time for most units.
Abstract:
Previous studies have revealed considerable interobserver and intraobserver variation in the histological classification of preinvasive cervical squamous lesions. The aim of the present study was to develop a decision support system (DSS) for the histological interpretation of these lesions. Knowledge and uncertainty were represented in the form of a Bayesian belief network that permitted the storage of diagnostic knowledge and, for a given case, the collection of evidence in a cumulative manner that provided a final probability for the possible diagnostic outcomes. The network comprised 8 diagnostic histological features (evidence nodes) that were each independently linked to the diagnosis (decision node) by a conditional probability matrix. Diagnostic outcomes comprised normal; koilocytosis; and cervical intraepithelial neoplasia (CIN) I, CIN II, and CIN III. For each evidence feature, a set of images was recorded that represented the full spectrum of change for that feature. The system was designed to be interactive in that the histopathologist was prompted to enter evidence into the network via a specifically designed graphical user interface (i-Path Diagnostics, Belfast, Northern Ireland). Membership functions were used to derive the relative likelihoods for the alternative feature outcomes, the likelihood vector was entered into the network, and the updated diagnostic belief was computed for the diagnostic outcomes and displayed. A cumulative probability graph was generated throughout the diagnostic process and presented on screen. The network was tested on 50 cervical colposcopic biopsy specimens, comprising 10 cases each of normal, koilocytosis, CIN I, CIN II, and CIN III. These had been preselected by a consultant gynecological pathologist. Using conventional morphological assessment, the cases were classified on 2 separate occasions by 2 consultant and 2 junior pathologists. The cases were also then classified using the DSS on 2 occasions by the 4 pathologists and by 2 medical students with no experience in cervical histology. Interobserver and intraobserver agreement using morphology and using the DSS was calculated with kappa statistics. Intraobserver reproducibility using conventional unaided diagnosis was reasonably good (kappa range, 0.688 to 0.861), but interobserver agreement was poor (kappa range, 0.347 to 0.747). Using the DSS improved overall reproducibility between individuals. Using the DSS, however, did not enhance the diagnostic performance of junior pathologists when comparing their DSS-based diagnoses against those of an experienced consultant. However, the generation of a cumulative probability graph also allowed a comparison of individual performance, how individual features were assessed in the same case, and how this contributed to diagnostic disagreement between individuals. Diagnostic features such as nuclear pleomorphism were shown to be particularly problematic and poorly reproducible. DSSs such as this therefore not only have a role to play in enhancing decision making but also in the study of diagnostic protocol, education, self-assessment, and quality control. © 2003 Elsevier Inc. All rights reserved.
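As an illustration of the cumulative evidence update described above (a naive-Bayes-style sketch, not the i-Path implementation itself), each feature's likelihood vector multiplies into the current belief over the five diagnostic outcomes; all probability values below are invented placeholders.

```python
# Sketch of cumulative Bayesian evidence accumulation: each histological
# feature (evidence node) updates the belief over the diagnostic outcomes
# (decision node). All likelihood values are invented placeholders.
import numpy as np

outcomes = ["normal", "koilocytosis", "CIN I", "CIN II", "CIN III"]
belief = np.full(len(outcomes), 1 / len(outcomes))   # uniform prior

# One vector per observed feature: P(observed feature state | outcome).
# In the real system these come from the 8 conditional probability matrices
# and the membership functions applied to the pathologist's input.
likelihood_per_feature = [
    np.array([0.05, 0.10, 0.20, 0.30, 0.35]),   # e.g. nuclear pleomorphism
    np.array([0.10, 0.30, 0.25, 0.20, 0.15]),   # e.g. koilocytes present
]

for lik in likelihood_per_feature:
    belief = belief * lik                        # Bayes update per feature
    belief = belief / belief.sum()               # renormalise
    print(dict(zip(outcomes, belief.round(3))))  # cumulative belief, as graphed
```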
Abstract:
Score following has been an important area of research in AI and music since the mid-1980s. Various systems have been developed, but predominantly for providing automated accompaniment to live concert performances, dealing mostly with issues relating to pitch detection and identification of embellished melodies. Such systems have great potential in education, where student performers would benefit from them in practice situations; however, current accompaniment systems are not designed to deal with the errors that may occur during practising. In this paper we present a system developed to provide accompaniment for students practising at home. First a survey of score following will be given. Then the capabilities of the system will be explained, and the results from the first experiments with the monophonic score-following system will be presented.
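The abstract does not describe the matching algorithm used; as a hedged baseline illustration of monophonic score following only, the sketch below aligns performed pitches against score pitches with a dynamic-programming edit distance that tolerates the wrong, extra and skipped notes typical of practice sessions. Pitches as MIDI note numbers and unit edit costs are assumptions.

```python
# Baseline monophonic score-following sketch (not the paper's method):
# edit-distance alignment of performed pitches against the score, tolerant
# of wrong notes (substitution), extra notes (insertion) and skipped notes
# (deletion). Pitches are MIDI note numbers.
def align(score, performance, sub=1, ins=1, dele=1):
    """Return the edit-distance cost between score and performed pitches."""
    n, m = len(score), len(performance)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * dele
    for j in range(1, m + 1):
        d[0][j] = j * ins
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if score[i - 1] == performance[j - 1] else sub
            d[i][j] = min(d[i - 1][j - 1] + cost,   # match / wrong note
                          d[i][j - 1] + ins,        # extra played note
                          d[i - 1][j] + dele)       # skipped score note
    return d[n][m]

print(align([60, 62, 64, 65], [60, 62, 63, 64, 65]))  # one extra note -> 1
```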
Abstract:
BACKGROUND: Cardiovascular disease (CVD) occurs more frequently in individuals with a family history of premature CVD. Within families the demographics of CVD are poorly described. DESIGN: We examined risk estimation based on the Systematic Coronary Risk Evaluation (SCORE) system and the Joint British Guidelines (JBG) for older unaffected siblings of patients with premature CVD (onset ≤55 years for men and ≤60 years for women). METHODS: Between August 1999 and November 2003, laboratory and demographic details were collected on probands with early-onset CVD and their older unaffected siblings. Siblings were screened for clinically overt CVD by a standard questionnaire and 12-lead electrocardiogram (ECG). RESULTS: A total of 790 siblings were identified and full demographic details were available for 645. The following siblings were excluded: 41 with known diabetes mellitus; seven with random plasma glucose of 11.1 mmol/l or greater; and eight with ischaemic ECG. Data were analysed for 589 siblings from 405 families. The mean age was 55.0 years, 43.1% were men and 28.7% were smokers. The mean total serum cholesterol was 5.8 mmol/l and hypertension was present in 49.4%. Using the SCORE system, when projected to age 60 years, 181 men (71.3%) and 67 women (20.0%) would be eligible for risk factor modification. Using JBG with a 10-year risk of 20% or greater, 42 men (16.5%) and four women (1.2%) would be targeted. CONCLUSIONS: Large numbers of these asymptomatic individuals meet both European and British guidelines for the primary prevention of CVD and should be targeted for risk factor modification. The prevalence of individuals defined as eligible for treatment is much higher when using the SCORE system. © 2007 European Society of Cardiology.
Abstract:
Objective
Preliminary assessment of an automated weaning system (SmartCare™/PS) compared to usual management of weaning from mechanical ventilation performed in the absence of formal protocols.
Design and setting
A randomised, controlled pilot study in one Australian intensive care unit.
Patients
A total of 102 patients were equally divided between SmartCare/PS and Control.
Interventions
The automated system titrated pressure support, conducted a spontaneous breathing trial and provided notification of success (“separation potential”).
Measurements and results
The median time from the first identified point of suitability for weaning commencement to the state of “separation potential” using SmartCare/PS was 20 h (interquartile range, IQR, 2–40) compared to 8 h (IQR 2–43) with Control (log-rank P = 0.3). The median time to successful extubation was 43 h (IQR 6–169) using SmartCare/PS and 40 h (IQR 14–87) with Control (log-rank P = 0.6). Unadjusted, the estimated probability of reaching “separation potential” was 21% lower (95% CI, 48% lower to 20% greater) with SmartCare/PS compared to Control. Adjusted for other covariates (age, gender, APACHE II, SOFAmax, neuromuscular blockade, corticosteroids, coma and elevated blood glucose), these estimates were 31% lower (95% CI, 56% lower to 9% greater) with SmartCare/PS. The study groups showed comparable rates of reintubation, non-invasive ventilation post-extubation, tracheostomy, sedation, neuromuscular blockade and use of corticosteroids.
Conclusions
Substantial reductions in weaning duration previously demonstrated were not confirmed when the SmartCare/PS system was compared to weaning managed by experienced critical care specialty nurses, using a 1:1 nurse-to-patient ratio. The effect of SmartCare/PS may be influenced by the local clinical organisational context.
Abstract:
The Rapid Oscillations in the Solar Atmosphere (ROSA) instrument is a synchronized, six-camera high-cadence solar imaging instrument developed by Queen's University Belfast. The system is available on the Dunn Solar Telescope at the National Solar Observatory in Sunspot, New Mexico, USA, as a common-user instrument. Consisting of six 1k × 1k Peltier-cooled frame-transfer CCD cameras with very low noise (0.02–15 e⁻ s⁻¹ pixel⁻¹), each ROSA camera is capable of full-chip readout speeds in excess of 30 Hz, or 200 Hz when the CCD is windowed. Combining multiple cameras and fast readout rates, ROSA will accumulate approximately 12 TB of data per 8 hours of observing. Following successful commissioning during August 2008, ROSA will allow for multi-wavelength studies of the solar atmosphere at high temporal resolution.
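As a back-of-envelope check of the quoted data volume, using the camera parameters given above and assuming 16-bit pixels (a bit depth the abstract does not state):

```python
# Back-of-envelope check of the quoted ~12 TB per 8-hour observing run.
# 16-bit pixel depth is an assumption, not stated in the abstract.
cameras, width, height = 6, 1024, 1024
bytes_per_pixel, frame_rate, hours = 2, 30, 8

volume = cameras * width * height * bytes_per_pixel * frame_rate * hours * 3600
print(f"{volume / 1e12:.1f} TB")   # ~10.9 TB; headers/overheads close the gap
```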
Abstract:
Venom has only recently been discovered to be a basal trait of the Anguimorpha lizards. Consequently, very little is known about the timings of toxin recruitment events, venom protein molecular evolution, or even the relative physical diversifications of the venom system itself. A multidisciplinary approach was used to examine the evolution across the full taxonomical range of this ∼130-million-year-old clade. Analysis of cDNA libraries revealed complex venom transcriptomes. Most notably, three new cardioactive peptide toxin types were discovered (celestoxin, cholecystokinin, and YY peptides). The latter two represent additional examples of convergent use of genes in toxic arsenals, both having previously been documented as components of frog skin defensive chemical secretions. Two other novel venom gland-overexpressed modified versions of other protein frameworks were also recovered from the libraries (epididymal secretory protein and ribonuclease). Lectin, hyaluronidase, and veficolin toxin types were sequenced for the first time from lizard venoms and shown to be homologous to the snake venom forms. In contrast, phylogenetic analyses demonstrated that the lizard natriuretic peptide toxins were recruited independently of the form in snake venoms. The de novo evolution of helokinestatin peptide toxin encoding domains within the lizard venom natriuretic gene was revealed to be exclusive to the helodermatid/anguid subclade. New isoforms were sequenced for cysteine-rich secretory protein, kallikrein, and phospholipase A2 toxins. Venom gland morphological analysis revealed extensive evolutionary tinkering. Anguid glands are characterized by thin capsules and mixed glands, serous at the bottom of the lobule and mucous toward the apex. Twice, independently, this arrangement was segregated into specialized serous protein-secreting glands with thick capsules, with the mucous lobules now distinct (Heloderma and the Lanthanotus/Varanus clade). The results obtained highlight the importance of utilizing evolution-based search strategies for biodiscovery and emphasize the largely untapped drug design and development potential of lizard venoms. Molecular & Cellular Proteomics 9:2369-2390, 2010.
Abstract:
Automated examination timetabling has been addressed by a wide variety of methodologies and techniques over the last ten years or so. Many of the methods in this broad range of approaches have been evaluated on a collection of benchmark instances provided at the University of Toronto in 1996. Whilst the existence of these datasets has provided an invaluable resource for research into examination timetabling, the instances have significant limitations in terms of their relevance to real-world examination timetabling in modern universities. This paper presents a detailed model which draws upon experiences of implementing examination timetabling systems in universities in Europe, Australasia and America. This model represents the problem that was presented in the 2nd International Timetabling Competition (ITC2007). In presenting this detailed new model, this paper describes the examination timetabling track introduced as part of the competition. In addition to the model, the datasets used in the competition are also based on current real-world instances introduced by EventMAP Limited. It is hoped that the interest generated as part of the competition will lead to the development, investigation and application of a host of novel and exciting techniques to address this important real-world search domain. Moreover, the motivating goal of this paper is to close the currently existing gap between theory and practice in examination timetabling by presenting the research community with a rigorous model which represents the complexity of the real-world situation. In this paper we describe the model and its motivations, followed by a full formal definition.
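The full formal definition appears in the paper itself; purely to illustrate the kind of hard constraint such a model formalises, the sketch below counts student clashes (a student scheduled to sit two exams in the same period) over a deliberately simplified timetable representation, not the full ITC2007 data format.

```python
# Illustrative check of one canonical examination-timetabling hard
# constraint: no student may sit two exams scheduled in the same period.
# Data structures are simplified relative to the full ITC2007 model.
from collections import defaultdict

def student_clashes(assignment, enrolments):
    """assignment: exam -> period; enrolments: student -> set of exams."""
    clashes = 0
    for student, exams in enrolments.items():
        periods = defaultdict(list)
        for exam in exams:
            periods[assignment[exam]].append(exam)
        clashes += sum(len(e) - 1 for e in periods.values() if len(e) > 1)
    return clashes

assignment = {"maths": 0, "physics": 0, "history": 1}
enrolments = {"s1": {"maths", "physics"}, "s2": {"maths", "history"}}
print(student_clashes(assignment, enrolments))   # s1 has a clash -> 1
```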
Abstract:
Although interest in crossbreeding within dairy systems has increased, the role of Jersey crossbred cows within high concentrate input systems has received little attention. This experiment was designed to examine the performance of Holstein-Friesian (HF) and Jersey x Holstein-Friesian (J x HF) cows within a high concentrate input total confinement system (CON) and a medium concentrate input grazing system (GRZ). Eighty spring-calving dairy cows were used in a 2 (cow genotype) x 2 (milk production system) factorial design experiment. The experiment commenced when cows calved and encompassed a full lactation. With GRZ, cows were offered diets containing grass silage and concentrates [70:30 dry matter (DM) ratio] until turnout, grazed grass plus 1.0 kg of concentrate/day during a 199-d grazing period, and grass silage and concentrates (75:25 DM ratio) following rehousing and until drying-off. With CON, cows were confined throughout the lactation and offered diets containing grass silage and concentrates (DM ratio: 40:60, 50:50, 60:40, and 75:25 during d 1 to 100, 101 to 200, 201 to 250, and 251 until drying-off, respectively). Full-lactation concentrate DM intakes were 791 and 2,905 kg/cow for systems GRZ and CON, respectively. Although HF cows had a higher lactation milk yield than J x HF cows, the latter produced milk with a higher fat and protein content, so that solids-corrected milk yield (SCM) was unaffected by genotype. Somatic cell score was higher with the J x HF cows. Throughout lactation, HF cows were on average 37 kg heavier than J x HF cows, whereas the J x HF cows had a higher body condition score. Within each system, food intake did not differ between genotypes, whereas full-lactation yields of milk, fat plus protein, and SCM were higher with CON than with GRZ. A significant genotype x environment interaction was observed for milk yield, and a trend was found for an interaction with SCM. Crossbred cows on CON gained more body condition than HF cows, and overall pregnancy rate was unaffected by either genotype or management system. In summary, milk and SCM yields were higher with CON than with GRZ, whereas genotype had no effect on SCM. However, HF cows exhibited a greater milk yield response and a trend toward a greater SCM yield response with increasing concentrate levels compared with the crossbred cows.
Abstract:
This paper presents an innovative sensor system, created specifically for new civil engineering structural monitoring applications, allowing specially packaged fiber grating-based sensors to be used in harsh, in-the-field measurement conditions for accurate strain measurement with full temperature compensation. The sensor consists of two fiber Bragg gratings protected within a polypropylene package, with one of the fiber gratings isolated from the influence of strain and thus responding only to temperature variations, while the other is sensitive to both strain and temperature. To achieve this, the temperature-monitoring fiber grating is slightly bent and enclosed in a metal envelope to isolate it effectively from the strain. Through an appropriate calibration process, both the strain and temperature coefficients of each individual grating component, as incorporated in the sensor system, can thus be obtained. By using these calibrated coefficients in the operation of the sensor, both strain and temperature can be accurately determined. The specific application for which these sensors were designed is an innovative small-scale flexi-arch bridge, on which they are used for real-time strain measurements during the critical installation stage (lifting) and subsequent loading. These sensors have demonstrated enhanced resilience when embedded in or surface-mounted on such concrete structures, providing accurate and consistent strain measurements not only during installation but also during subsequent use. This offers an inexpensive and highly effective monitoring system tailored to the new, rapid method of installation of small-scale bridges for a variety of civil engineering applications.
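A minimal sketch of the two-grating compensation scheme described above, using the standard linear FBG response Δλ/λ = k_ε·ε + k_T·ΔT: the strain-isolated grating yields temperature directly, which is then subtracted from the strain-sensing grating's reading. The coefficient values below are illustrative textbook-order figures, standing in for the per-grating coefficients the paper obtains by calibration.

```python
# Two-grating temperature compensation sketch. Grating 1 senses strain +
# temperature; grating 2 is strain-isolated and senses temperature only.
# Coefficients are illustrative placeholders, not the paper's calibration.
k_eps1 = 0.78e-6     # relative wavelength shift per microstrain, grating 1
k_T1   = 6.7e-6      # relative wavelength shift per degC, grating 1
k_T2   = 6.7e-6      # relative wavelength shift per degC, grating 2

def strain_and_temperature(dl1_over_l1, dl2_over_l2):
    """Solve dl/l = k_eps*eps + k_T*dT using the strain-isolated grating."""
    dT = dl2_over_l2 / k_T2                      # temperature from grating 2
    eps = (dl1_over_l1 - k_T1 * dT) / k_eps1     # compensated strain (microstrain)
    return eps, dT

print(strain_and_temperature(2.0e-4, 6.7e-5))    # -> (~170 microstrain, ~10 degC)
```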
Abstract:
The use of dataflow digital signal processing system modelling and synthesis techniques has been a fruitful research theme for many years and has yielded many powerful rapid system synthesis and optimisation capabilities. However, recent years have seen the spectrum of languages and techniques splinter in an application-specific manner, resulting in an ad-hoc design process which is increasingly dependent on the particular application under development. This poses a major problem for automated toolflows attempting to provide rapid system synthesis for a wide range of applications. By analysing a number of dataflow FPGA implementation case studies, this paper shows that despite this, common traits may be found in current techniques, which fall largely into three classes. Further, it exposes limitations pertaining to their ability to adapt algorithm models to implementations for different operating environments and target platforms.
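For readers unfamiliar with the modelling style being surveyed, here is a toy synchronous-dataflow (SDF) example, not drawn from the paper's case studies: actors fire according to fixed per-firing token rates, and a balanced repetition schedule keeps production and consumption equal.

```python
# Toy synchronous-dataflow (SDF) illustration of the kind of model the
# surveyed techniques manipulate. Not taken from the paper's case studies.
from collections import deque

queue = deque()                   # the single FIFO edge of this two-actor graph

def source():                     # produces 2 tokens per firing
    queue.extend([1, 2])

def sink():                       # consumes 3 tokens per firing
    tokens = [queue.popleft() for _ in range(3)]
    print("sink fired on", tokens)

# Rate balance 2*q_source = 3*q_sink gives the repetition vector (3, 2):
# firing the source 3 times (6 tokens) feeds the sink exactly 2 firings.
for _ in range(3):
    source()
for _ in range(2):
    sink()
```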
Abstract:
The development of an automated system for the quality assessment of aerodrome ground lighting (AGL), in accordance with associated standards and recommendations, is presented. The system is composed of an image sensor, placed inside the cockpit of an aircraft to record images of the AGL during a normal descent to an aerodrome. A model-based methodology is used to ascertain the optimum match between a template of the AGL and the actual image data, in order to calculate the position and orientation of the camera at the instant each image was acquired. The camera position and orientation data are used, along with the pixel grey level for each imaged luminaire, to estimate a value for the luminous intensity of a given luminaire. This can then be compared with the expected brightness for that luminaire to ensure it is operating to the required standards. As such, a metric for the quality of the AGL pattern is determined. Experiments on real image data are presented to demonstrate the application and effectiveness of the system.
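The abstract gives the estimation principle but not the radiometric model; as a heavily hedged sketch only, the snippet below recovers a luminous intensity figure from a luminaire's summed grey level via the inverse-square law, with an assumed linear sensor response and calibration constant, and with off-axis and atmospheric terms omitted. All names and values are hypothetical.

```python
# Hedged sketch of the final estimation step: luminous intensity from pixel
# grey level, camera distance and the inverse-square law. The calibration
# constant, linear-response assumption and all values are illustrative only;
# the paper's radiometric model may differ (off-axis/atmospheric terms omitted).
def luminous_intensity(grey_sum, distance_m, exposure_s, k_cal):
    """grey_sum: summed grey level over the luminaire's pixels;
    k_cal: assumed sensor response in grey units per (lux * second)."""
    illuminance = grey_sum / (k_cal * exposure_s)   # lux at the sensor
    return illuminance * distance_m ** 2            # candela, inverse-square law

estimate = luminous_intensity(grey_sum=22.0, distance_m=900.0,
                              exposure_s=0.002, k_cal=4.5e6)
expected = 2.0e3                                    # cd, expected for this fitting
print(f"estimated {estimate:.0f} cd vs expected {expected:.0f} cd")
```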
Abstract:
Measures of icon designs rely heavily on surveys of the perceptions of population samples. Thus, measuring the extent to which changes in the structure of an icon will alter its perceived complexity can be costly and slow. An automated system capable of producing reliable estimates of perceived complexity could reduce development costs and time. Measures of icon complexity developed by Garcia, Badre, and Stasko (1994) and McDougall, Curry, and de Bruijn (1999) were correlated with six icon properties measured using Matlab (MathWorks, 2001) software, which uses image-processing techniques to measure icon properties. The six icon properties measured were icon foreground, the number of objects in an icon, the number of holes in those objects, and two calculations of icon edges and homogeneity in icon structure. The strongest correlates with human judgments of perceived icon complexity (McDougall et al., 1999) were structural variability (r_s = .65) and edge information (r_s = .64).
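As an illustration of such image-processing measures, computed here with scipy.ndimage rather than the Matlab routines used in the study, the sketch below derives foreground fraction, object count, hole count and a simple edge count from a toy binary icon:

```python
# Illustrative icon-property measures on a binary icon, in the spirit of the
# Matlab measures described above (computed here with scipy.ndimage).
import numpy as np
from scipy import ndimage

icon = np.zeros((16, 16), dtype=bool)
icon[2:8, 2:8] = True                  # a square object...
icon[4:6, 4:6] = False                 # ...with a hole in it
icon[10:14, 10:14] = True              # a second, solid object

foreground = icon.mean()               # fraction of icon pixels that are "on"
n_objects = ndimage.label(icon)[1]     # connected foreground regions

# Holes = background regions not connected to the icon border.
bg_labels, n_bg = ndimage.label(~icon)
border = np.unique(np.concatenate([bg_labels[0], bg_labels[-1],
                                   bg_labels[:, 0], bg_labels[:, -1]]))
n_holes = n_bg - len(border[border > 0])

# Simple edge measure: count of on/off transitions in both directions.
edges = np.abs(np.diff(icon.astype(int), axis=0)).sum() \
      + np.abs(np.diff(icon.astype(int), axis=1)).sum()

print(foreground, n_objects, n_holes, edges)   # 0.1875, 2, 1, 48
```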