926 results for acquisition procedures
Abstract:
Capable of three-dimensional imaging of the cornea with micrometer-scale resolution, spectral-domain optical coherence tomography (SDOCT) offers potential advantages over Placido-ring and Scheimpflug-photography-based systems for accurate extraction of quantitative keratometric parameters. In this work, an SDOCT scanning protocol and a motion-correction algorithm were implemented to minimize the effects of patient motion during data acquisition. Procedures are described for correcting image-data artifacts resulting from 3D refraction of SDOCT light in the cornea and from non-idealities of the scanning-system geometry, performed as a prerequisite for accurate parameter extraction. Zernike-polynomial 3D reconstruction and a recursive half-searching algorithm (RHSA) were implemented to extract clinical keratometric parameters, including anterior and posterior radii of curvature, central corneal optical power, central corneal thickness, and thickness maps of the cornea. The accuracy and repeatability of the extracted parameters, obtained using a commercial 859 nm SDOCT retinal imaging system with a corneal adapter, were assessed using a rigid gas-permeable (RGP) contact lens as a phantom target. Extraction of these parameters was performed in vivo in 3 patients and compared to commercial Placido topography and Scheimpflug photography systems. The repeatability of SDOCT central corneal power measured in vivo was 0.18 diopters, and the difference between systems averaged 0.1 diopters between SDOCT and Scheimpflug photography and 0.6 diopters between SDOCT and Placido topography.
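The link between the extracted radii of curvature and central corneal power can be sketched in a few lines. This is a minimal thin-lens approximation using textbook refractive indices and typical corneal radii, not values or code from the paper:

```python
# Sketch: central corneal power from anterior/posterior radii of curvature,
# of the kind extracted by the SDOCT pipeline described above. Refractive
# indices and example radii are textbook assumptions, not the paper's data.
N_AIR, N_CORNEA, N_AQUEOUS = 1.000, 1.376, 1.336

def surface_power(n_before, n_after, radius_m):
    """Refractive power (diopters) of a single spherical interface."""
    return (n_after - n_before) / radius_m

def central_corneal_power(r_anterior_m, r_posterior_m):
    """Thin-lens approximation: sum of anterior and posterior surface powers."""
    return (surface_power(N_AIR, N_CORNEA, r_anterior_m)
            + surface_power(N_CORNEA, N_AQUEOUS, r_posterior_m))

power = central_corneal_power(7.8e-3, 6.5e-3)  # typical radii, in metres
print(round(power, 2))  # ~42 D for a typical cornea
```

With these assumed radii the anterior surface contributes about +48 D and the posterior surface about -6 D, which is why both radii must be measured accurately to recover net power.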
Abstract:
A computerized handheld procedure is presented in this paper. It is intended as a complementary database tool to enhance prospective risk analysis in the field of occupational health. The Pendragon Forms software (version 3.2) was used to implement acquisition procedures on Personal Digital Assistants (PDAs) and to transfer data to a computer in MS-Access format. The proposed data acquisition strategy relies on the risk assessment method practiced at the Institute of Occupational Health Sciences (IST). It involves the use of a systematic hazard list and semi-quantitative risk assessment scales. A set of 7 modular forms was developed to cover the basic needs of field audits. Despite the minor drawbacks observed, the results obtained so far show that handhelds are adequate to support field risk assessment and follow-up activities. Further improvements must still be made to increase the tool's effectiveness and field adequacy.
Abstract:
Mode of access: Internet.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
Background: Catheter ablation procedures for atrial fibrillation (AF) may frequently require long fluoroscopic times. We sought to undertake a review of radiation safety practice in our Cardiac Electrophysiology Laboratory and implement changes to minimize fluoroscopic doses. We also sought to compare the results with radiation doses for percutaneous coronary intervention (PCI) cases performed in our hospital. Methods: Fluoroscopic times and doses for AF ablation procedures performed by a single operator on a Philips Integris H3000 image intensifier were analysed over an 11-month period. Results were compared with all PCI procedures performed over a similar period by multiple operators on a Philips Integris Allura FD system. A comprehensive review of radiation practice in the Electrophysiology Laboratory identified the potential to reduce pulse frame rates and doses, and to narrow the field of interest, without impacting the performance of the procedure. These changes were implemented and the results analysed after a further 11 months. Results: In the pre-intervention period, 50 AF catheter ablations had a mean fluoroscopic time of 86.4 min and a mean fluoroscopic dose of 68.4 Gy·cm². Post-intervention, 75 procedures had a mean fluoroscopic time of 68.9 min (p < 0.0001) and a mean dose of 14.3 Gy·cm² (p < 0.0001). 128 PCI procedures had a mean combined fluoroscopic and image-acquisition time of 10.0 min and a mean total dose of 38.8 Gy·cm². Conclusions: Catheter ablation procedures for AF may require lengthy use of fluoroscopy, but simple modifications to radiation practice can result in marked reductions in radiation dose that compare favourably with PCI case doses.
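The scale of the reported improvement is easy to check from the figures quoted in the abstract:

```python
# Figures quoted in the abstract: mean dose-area product (Gy·cm²) and mean
# fluoroscopic time (minutes) before and after the practice changes.
pre_dose, post_dose = 68.4, 14.3
pre_time, post_time = 86.4, 68.9

dose_reduction = 100 * (pre_dose - post_dose) / pre_dose   # percent
time_reduction = 100 * (pre_time - post_time) / pre_time   # percent
print(f"dose down {dose_reduction:.0f}%, fluoro time down {time_reduction:.0f}%")
```

That is, roughly a four-fifths reduction in dose against only a one-fifth reduction in fluoroscopy time, consistent with the conclusion that the dose savings came mostly from lower frame rates and collimation rather than shorter procedures.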
Abstract:
This thesis confronts the nature of the process of learning an intellectual skill: the ability to solve problems efficiently in a particular domain of discourse. The investigation is synthetic; a computational performance model, HACKER, is displayed. HACKER is a computer problem-solving system whose performance improves with practice. HACKER maintains performance knowledge as a library of procedures indexed by descriptions of the problem types for which the procedures are appropriate. When applied to a problem, HACKER tries to use a procedure from this "Answer Library". If no procedure is found to be applicable, HACKER writes one using more general knowledge of the problem domain and of programming techniques. This new program may be generalized and added to the Answer Library.
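The control loop the abstract describes can be sketched minimally. All names here are illustrative stand-ins, not code or identifiers from the thesis, and the "program-writing" step is reduced to a trivial stub:

```python
# Minimal sketch of an Answer-Library control loop as described in the
# abstract: procedures indexed by problem-type descriptions, with a fallback
# writer for unseen problem types. Names and the stub writer are illustrative.
answer_library = {}  # problem-type description -> procedure

def write_procedure(problem_type):
    # Stand-in for HACKER's program-writing step, which would use general
    # knowledge of the domain and of programming techniques.
    if problem_type == "sort":
        return lambda xs: sorted(xs)
    raise NotImplementedError(problem_type)

def solve(problem_type, *args):
    proc = answer_library.get(problem_type)
    if proc is None:                              # no applicable procedure
        proc = write_procedure(problem_type)      # write a new one
        answer_library[problem_type] = proc       # add it to the Answer Library
    return proc(*args)

print(solve("sort", [3, 1, 2]))  # first call writes the procedure; later calls reuse it
```

The key design point mirrored here is that learning happens as a side effect of solving: each newly written procedure enlarges the library for future problems.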
Abstract:
Very little research has examined K–12 educational technology decision-making in Canada. This collective case study explores the technology procurement process in Ontario’s publicly funded school districts to determine if it is informed by the relevant research, grounded in best practices, and enhances student learning. Using a qualitative approach, 10 senior leaders (i.e., chief information officers, superintendents, etc.) were interviewed. A combination of open-ended and closed-ended questions was used to reveal the most important factors driving technology acquisition, research support, governance procedures, data use, and the assessment and return on investment (ROI) measures utilized by school districts in their implementation of educational technology. After participants were interviewed, the data were transcribed, member checked, and then submitted to “Computer-assisted NCT analysis” (Friese, 2014) using ATLAS.ti. The findings show that senior leaders are making acquisitions that are not aligned with current scholarship and not made with student learning as the focus. It was also determined that districts struggle to use data-driven decision-making to support the governance of educational technology spending. Finally, the results showed that districts do not have effective assessment measures in place to determine the efficacy or ROI of a purchased technology. Although the data are limited to the responses of 10 senior leaders, the findings represent the technology leadership for approximately 746,000 Ontario students. The study is meant to serve as an informative resource for senior leaders and presents strategic and research-validated approaches to technology procurement. Further, the study has the potential to refine technology decision-making, policies, and practices in K–12 education.
Abstract:
We present an unsupervised learning algorithm that acquires a natural-language lexicon from raw speech. The algorithm is based on the optimal encoding of symbol sequences in an MDL framework, and uses a hierarchical representation of language that overcomes many of the problems that have stymied previous grammar-induction procedures. The forward mapping from symbol sequences to the speech stream is modeled using features based on articulatory gestures. We present results on the acquisition of lexicons and language models from raw speech, text, and phonetic transcripts, and demonstrate that our algorithm compares very favorably to other reported results with respect to segmentation performance and statistical efficiency.
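The MDL idea behind such segmentation can be illustrated with a toy dynamic program: among all segmentations of an unspaced string, pick the one with minimal total code length, where a word's code length is the negative log of its lexicon probability. The lexicon, its probabilities, and the whole setup below are invented for illustration and are far simpler than the paper's hierarchical model:

```python
import math

# Toy MDL-style segmentation: choose the segmentation of an unspaced string
# that minimizes total code length in bits, given a (hypothetical) lexicon
# with word probabilities. This illustrates the principle only.
lexicon = {"the": 0.4, "cat": 0.3, "at": 0.2, "heat": 0.1}

def segment(s):
    # best[k] = (minimal cost in bits to encode s[:k], words used)
    best = {0: (0.0, [])}
    for i in range(1, len(s) + 1):
        for j in range(i):
            word = s[j:i]
            if j in best and word in lexicon:
                cost = best[j][0] - math.log2(lexicon[word])
                if i not in best or cost < best[i][0]:
                    best[i] = (cost, best[j][1] + [word])
    return best.get(len(s), (float("inf"), None))[1]

print(segment("thecat"))
```

In the real algorithm the lexicon itself is also part of the code and is induced jointly with the segmentation, which is what makes the approach unsupervised.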
Abstract:
THE clinical skills of medical professionals rely strongly on the sense of touch, combined with anatomical and diagnostic knowledge. Haptic exploratory procedures allow the expert to detect anomalies via gross and fine palpation, squeezing, and contour following. Haptic feedback is also key to medical interventions, for example when an anaesthetist inserts an epidural needle, a surgeon makes an incision, a dental surgeon drills into a carious lesion, or a veterinarian sutures a wound. Yet, current trends in medical technology and training methods involve less haptic feedback to clinicians and trainees. For example, minimally invasive surgery removes the direct contact between the patient and clinician that gives rise to natural haptic feedback, and furthermore introduces scaling and rotational transforms that confuse the relationship between movements of the hand and the surgical site. Similarly, it is thought that computer-based medical simulation and training systems require high-resolution and realistic haptic feedback to the trainee for significant training transfer to occur. The science and technology of haptics thus has great potential to affect the performance of medical procedures and learning of clinical skills. This special section is about understanding
Abstract:
This paper studies a model of a sequential auction where bidders are allowed to acquire further information about their valuations of the object in the middle of the auction. It is shown that, in any equilibrium where the distribution of the final price is atomless, a bidder's best response has a simple characterization. In particular, the optimal information-acquisition point is the same regardless of the other bidders' actions. This makes it natural to focus on symmetric, undominated equilibria, as in the Vickrey auction. An existence theorem for such a class of equilibria is presented. The paper also presents some results and numerical simulations that compare this sequential auction with the one-shot auction. Sequential auctions typically yield more expected revenue for the seller than their one-shot counterparts. So the possibility of mid-auction information acquisition can provide an explanation for why sequential procedures are more often adopted.
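For reference, the Vickrey benchmark the abstract mentions is the sealed-bid auction in which the highest bidder wins but pays the second-highest bid. A minimal sketch, with illustrative bids not drawn from the paper's model:

```python
# Minimal Vickrey (second-price sealed-bid) auction: highest bid wins,
# winner pays the second-highest bid. Bids here are illustrative.
def vickrey(bids):
    """Return (winning bidder index, price paid)."""
    order = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    return order[0], bids[order[1]]

winner, price = vickrey([0.9, 0.4, 0.7])
print(winner, price)  # bidder 0 wins and pays 0.7
```

Truthful bidding is a dominant strategy in this one-shot mechanism, which is why symmetric, undominated equilibria are the natural analogue to study in the sequential setting.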
Abstract:
The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain maximum homogeneity in data quality, acquisition, and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the “instrument familiarization plan”) and to devise methods to keep under control, and eventually correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of imaging SPSS data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light curves production and analysis.
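The core step of any such aperture-photometry stage can be sketched as follows. This is a generic illustration of the technique, not the thesis pipeline; the radii and the synthetic image are invented for the example:

```python
import numpy as np

# Generic aperture photometry: sum the flux inside a circular aperture and
# subtract the background estimated (per pixel) in a surrounding annulus.
def aperture_photometry(img, x0, y0, r_ap, r_in, r_out):
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - x0, yy - y0)
    sky = np.median(img[(r >= r_in) & (r < r_out)])   # per-pixel background
    ap = r < r_ap
    return img[ap].sum() - sky * ap.sum()             # background-subtracted flux

img = np.full((51, 51), 10.0)      # flat sky of 10 counts/pixel
img[25, 25] += 1000.0              # synthetic point source
flux = aperture_photometry(img, 25, 25, r_ap=5, r_in=8, r_out=12)
print(round(flux, 1))              # recovers the injected 1000 counts
```

Running the same measurement on every frame of a night, with quality flags attached at each stage, is what turns this step into light curves suitable for the short-term variability checks described above.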
Abstract:
Absolute quantitation of clinical ¹H-MR spectra is virtually always incomplete for single subjects because the separate determination of spectrum, baseline, and transverse and longitudinal relaxation times in single subjects is prohibitively long. Integrated Processing and Acquisition of Data (IPAD), based on a combined 2-dimensional experimental and fitting strategy, is suggested to substantially improve the information content from a given measurement time. A series of localized saturation-recovery spectra was recorded and combined with 2-dimensional prior-knowledge fitting to simultaneously determine metabolite T1 (from analysis of the saturation-recovery time course), metabolite T2 (from lineshape analysis based on metabolite and water peak shapes), the macromolecular baseline (based on T1 differences and analysis of the saturation-recovery time course), and metabolite concentrations (using prior-knowledge fitting and conventional procedures of absolute standardization). The procedure was tested on metabolite solutions and applied in 25 subjects (15-78 years old). Metabolite content was comparable to previously found values. Interindividual variation was larger than intraindividual variation in repeated spectra for metabolite content as well as for some relaxation times. Relaxation times were different for various metabolite groups. Part of the interindividual variation could be explained by significant age dependence of relaxation times.
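The saturation-recovery time course underlying the T1 estimate follows S(t) = S0·(1 − exp(−t/T1)). A toy version of the fit, using a brute-force least-squares search on noiseless synthetic data (recovery times and the "true" T1 are invented for the example, and the real method fits all parameters jointly with prior knowledge):

```python
import math

# Toy saturation-recovery T1 fit: signals follow S(t) = S0 * (1 - exp(-t/T1));
# recover T1 by least-squares grid search. All values are illustrative.
def sat_recovery(t, s0, t1):
    return s0 * (1.0 - math.exp(-t / t1))

times = [0.2, 0.5, 1.0, 2.0, 4.0, 8.0]               # recovery times (s)
data = [sat_recovery(t, 100.0, 1.4) for t in times]  # noiseless synthetic data

def fit_t1(times, data, s0=100.0):
    candidates = [0.01 * k for k in range(1, 501)]    # T1 grid: 0.01 .. 5.00 s
    return min(candidates, key=lambda t1: sum(
        (d - sat_recovery(t, s0, t1)) ** 2 for t, d in zip(times, data)))

print(fit_t1(times, data))  # recovers ~1.4 s
```

In the actual 2-dimensional strategy, T2 and the macromolecular baseline are estimated from the same data set rather than from separate scans, which is where the time saving comes from.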
Abstract:
Federal Highway Administration, Safety Design Division, McLean, Va.
Abstract:
This study was concerned with the computer automation of land evaluation. This is a broad subject with many issues to be resolved, so the study concentrated on three key problems: knowledge-based programming; the integration of spatial information from remote sensing and other sources; and the inclusion of socio-economic information in the land evaluation analysis. Land evaluation and land use planning were considered in the context of overseas projects in the developing world. Knowledge-based systems were found to provide significant advantages over conventional programming techniques for some aspects of the land evaluation process. Declarative languages, in particular Prolog, were ideally suited to the integration of social information, which changes with every situation. Rule-based expert system shells were also found to be suitable for this role, including knowledge acquisition at the interview stage. All the expert system shells examined imposed severe limits on problem size, but newer products now overcome this. Inductive expert system shells were useful as a guide to knowledge gaps and possible relationships, but the number of examples required was unrealistic for typical land use planning situations. The accuracy of classified satellite imagery was significantly enhanced by integrating spatial information on soil distribution for the Thailand data. Estimates of the rice-producing area were substantially improved (a 30% change in area) by the addition of soil information. Image-processing work on Mozambique showed that satellite remote sensing was a useful tool in stratifying vegetation cover at the provincial level to identify key development areas, but its full utility could not be realised on typical planning projects without treatment as part of a complete spatial information system.