910 results for Vignetting Correction
Abstract:
Time series classification has been extensively explored in many fields of study. Most methods are based on the historical or current information extracted from data. However, if interest is in a specific future time period, methods that directly relate to forecasts of time series are much more appropriate. An approach to time series classification is proposed based on a polarization measure of forecast densities of time series. By fitting autoregressive models, forecast replicates of each time series are obtained via the bias-corrected bootstrap, and a stationarity correction is considered when necessary. Kernel estimators are then employed to approximate forecast densities, and discrepancies of forecast densities of pairs of time series are estimated by a polarization measure, which evaluates the extent to which two densities overlap. Following the distributional properties of the polarization measure, a discriminant rule and a clustering method are proposed to conduct the supervised and unsupervised classification, respectively. The proposed methodology is applied to both simulated and real data sets, and the results show desirable properties.
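To make the pipeline concrete, here is a minimal sketch of the forecast-density comparison step, assuming Python with statsmodels and scipy. The simple residual bootstrap and the overlap coefficient below are illustrative stand-ins for the paper's bias-corrected bootstrap and polarization measure; the AR order, horizon and function names are my own.

```python
# Sketch: compare two time series by the overlap of their bootstrap forecast densities.
# Illustrative stand-ins for the paper's bias-corrected bootstrap and polarization measure.
import numpy as np
from scipy.stats import gaussian_kde
from statsmodels.tsa.ar_model import AutoReg

def forecast_replicates(x, order=2, horizon=5, n_boot=500, seed=None):
    """Bootstrap replicates of the horizon-step-ahead forecast from an AR(order) fit."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    fit = AutoReg(x, lags=order).fit()
    params = np.asarray(fit.params)      # [const, phi_1, ..., phi_order]
    resid = np.asarray(fit.resid)
    reps = np.empty(n_boot)
    for b in range(n_boot):
        path = list(x[-order:])          # most recent observations
        for _ in range(horizon):
            lags = np.asarray(path[::-1][:order])   # [x_t, x_{t-1}, ...]
            path.append(params[0] + params[1:] @ lags + rng.choice(resid))
        reps[b] = path[-1]
    return reps

def density_overlap(reps_a, reps_b, grid_size=512):
    """Overlap coefficient of two kernel-estimated forecast densities (1 = identical, 0 = disjoint)."""
    f, g = gaussian_kde(reps_a), gaussian_kde(reps_b)
    t = np.linspace(min(reps_a.min(), reps_b.min()),
                    max(reps_a.max(), reps_b.max()), grid_size)
    return np.trapz(np.minimum(f(t), g(t)), t)
```

A pair of series whose overlap is close to 1 would be grouped together by the discriminant or clustering step, while a value near 0 indicates well-separated forecast densities.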
Abstract:
In this paper we analyse the role of some of the building blocks of SHA-256. We show that the disturbance-correction strategy is applicable to the SHA-256 architecture and we prove that functions Σ, σ are vital for the security of SHA-256 by showing that for a variant without them it is possible to find collisions with complexity 2^64 hash operations. As a step towards an analysis of the full function, we present the results of our experiments on Hamming weights of expanded messages for different variants of the message expansion and show that there exist low-weight expanded messages for XOR-linearised variants.
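For reference, Σ0, Σ1, σ0 and σ1 are the standard SHA-256 rotate-and-shift mixing functions. The sketch below (Python, helper names my own) reproduces their FIPS 180 definitions and the message expansion, with a switch for the XOR-linearised variant in which the modular additions are replaced by XOR.

```python
# SHA-256 building blocks (standard FIPS 180 definitions) and message expansion.
from functools import reduce

MASK = 0xFFFFFFFF

def rotr(x, n):
    """32-bit right rotation."""
    return ((x >> n) | (x << (32 - n))) & MASK

# Compression-function mixers
def big_sigma0(x):   return rotr(x, 2) ^ rotr(x, 13) ^ rotr(x, 22)    # Σ0
def big_sigma1(x):   return rotr(x, 6) ^ rotr(x, 11) ^ rotr(x, 25)    # Σ1
# Message-expansion mixers
def small_sigma0(x): return rotr(x, 7) ^ rotr(x, 18) ^ (x >> 3)       # σ0
def small_sigma1(x): return rotr(x, 17) ^ rotr(x, 19) ^ (x >> 10)     # σ1

def expand(block16, linearised=False):
    """Expand 16 message words to 64. With linearised=True, the modular additions
    are replaced by XOR, giving the XOR-linearised expansion mentioned above."""
    W = list(block16)
    for t in range(16, 64):
        terms = (small_sigma1(W[t - 2]), W[t - 7], small_sigma0(W[t - 15]), W[t - 16])
        W.append(reduce(lambda a, b: a ^ b, terms) if linearised else sum(terms) & MASK)
    return W
```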
Abstract:
Introduction There is growing interest in the biomechanics of ‘fusionless’ implant constructs used for deformity correction in the thoracic spine. Intervertebral stapling is a leading method of fusionless corrective surgery. Although used for a number of years, there is limited evidence as to the effect these staples have on the stiffness of the functional spinal unit. Materials and Methods Thoracic spines from 6-8-week-old calves were dissected and divided into motion segments including levels T4-T11 (n=14). Each segment was potted in polymethylmethacrylate. An Instron biaxial materials testing machine with a custom-made jig was used for testing. The segments were tested in flexion/extension, lateral bending and axial rotation at 37°C and 100% humidity, using moment control to a maximum of 1.75 Nm at a loading rate of 0.3 Nm per second. This torque was found sufficient to achieve physiologically representative ranges of movement. The segments were initially tested uninstrumented, with data collected from the tenth load cycle. Next, a left anterolateral Shape Memory Alloy (SMA) staple was inserted (Medtronic Sofamor Danek, USA). Biomechanical testing was repeated as before, with data collected from the tenth load cycle. Results In flexion/extension there was a non-significant drop in stiffness of 3% (p=0.478). In lateral bending there was a significant drop in stiffness of 21% (p<0.001). This was mainly in lateral bending away from the staple, where the stiffness reduced by 30% (p<0.001). In contrast, in lateral bending towards the staple the stiffness dropped by 12%, which was still statistically significant (p=0.036). In axial rotation there was an overall near-significant drop in stiffness of 11% (p=0.076). This was more pronounced towards the side of the staple, with a decrease of 14% as opposed to 8% away from the staple; in both cases the drop was not statistically significant (p=0.134 and p=0.352 respectively). Conclusion Insertion of intervertebral SMA staples results in a significant reduction in motion segment stiffness in lateral bending, especially in the direction away from the staple. The staple had less effect on axial rotation stiffness and minimal effect on flexion/extension stiffness.
Abstract:
Adolescent idiopathic scoliosis (AIS) is a spinal deformity, which may require surgical correction by attaching rods to the patient’s spine using screws inserted into the vertebrae. Complication rates for deformity correction surgery are unacceptably high. Determining an achievable correction without overloading the adjacent spinal tissues or implants requires an understanding of the mechanical interaction between these components. We have developed novel patient-specific modelling software to create individualized finite element models (FEMs) representing the thoracolumbar spine and ribcage of scoliosis patients. We are using these models to better understand the biomechanics of spinal deformity correction.
Abstract:
Capacitors are widely used for power-factor correction (PFC) in power systems. When a PFC capacitor is installed with a certain load in a microgrid, it may be in parallel with the filter capacitor of the inverter interfacing the utility grid and the local distributed-generation unit and, thus, change the effective filter capacitance. Another complication is the possibility of resonance occurring in the microgrid. This paper conducts an in-depth investigation of the effective shunt-filter-capacitance variation and resonance phenomena in a microgrid due to the connection of a PFC capacitor. To compensate for the capacitance-parameter variation, an H∞ controller is designed for the voltage-source-inverter voltage control. By properly choosing the weighting functions, the synthesized H∞ controller exhibits high gains in the vicinity of the line frequency, similar to a traditional high-performance P+resonant controller, and thus possesses nearly zero steady-state error. With the robust H∞ controller, however, it is possible to explicitly specify the degree of robustness in the face of parameter variations. Furthermore, a thorough investigation is carried out to study the performance of inner current-loop feedback variables under resonance conditions. It reveals that filter-inductor current feedback is more effective in damping the resonance. This resonance can be further attenuated by employing the dual-inverter microgrid conditioner and controlling the series inverter as a virtual resistor affecting only harmonic components, without interfering with the fundamental power flow. Finally, the study has been validated experimentally on a microgrid prototype.
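As a rough illustration of the capacitance variation described above, the sketch below (Python, with hypothetical component values not taken from the paper) shows how a PFC capacitor appearing in parallel with the inverter's output-filter capacitor lowers the LC resonance frequency; this is the kind of parameter variation the H∞ design is meant to tolerate.

```python
# Illustrative only: hypothetical filter and PFC values, not from the paper.
import math

L_f   = 2.0e-3      # inverter filter inductance [H] (assumed)
C_f   = 20e-6       # inverter filter capacitance [F] (assumed)
C_pfc = 30e-6       # PFC capacitor connected with a local load [F] (assumed)

def lc_resonance_hz(L, C):
    """Undamped resonant frequency of an LC filter."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

f_nominal = lc_resonance_hz(L_f, C_f)            # resonance seen by the voltage controller
f_shifted = lc_resonance_hz(L_f, C_f + C_pfc)    # PFC capacitor appears in parallel with C_f

print(f"filter resonance without PFC capacitor: {f_nominal:7.1f} Hz")
print(f"filter resonance with PFC capacitor:    {f_shifted:7.1f} Hz")
```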
Abstract:
The thesis investigates “where were the auditors in asset securitizations”, a criticism of the audit profession before and after the onset of the global financial crisis (GFC). Asset securitizations increase audit complexity and audit risks, which are expected to increase audit fees. Using US bank holding company data from 2003 to 2009, this study examines the association between asset securitization risks and audit fees, and how this association changed during the global financial crisis. The main test is based on an ordinary least squares (OLS) model adapted from the Fields et al. (2004) bank audit fee model. I employ a principal components analysis to address high correlations among asset securitization risks. Individual securitization risks are also tested separately. A suite of sensitivity tests indicates that the results are robust; these include model alterations, sample variations, additional controls, and correction for the securitizer self-selection problem. A partial least squares (PLS) path modelling methodology is introduced as a separate test, which accommodates high intercorrelations, self-selection correction, and sequentially ordered hypotheses in one simultaneous model. The PLS results are consistent with the main results. The study finds significant and positive associations between securitization risks and audit fees. After the commencement of the global financial crisis in 2007, there was an increased focus on the role of audits in relation to asset securitization risks arising from bank failures; therefore, I expect that auditors would become more sensitive to bank asset securitization risks after the commencement of the crisis. I find that auditors appear to focus on different aspects of asset securitization risks during the crisis and to charge a GFC premium for banks. Overall, the results support the view that auditors consider asset securitization risks and market changes, and adjust their audit effort and risk considerations accordingly.
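A minimal sketch of the principal-components step used to handle the highly correlated securitization-risk measures before the audit-fee regression is given below; the data are synthetic and the variable names are placeholders, not the thesis's or the Fields et al. (2004) specification.

```python
# Sketch: collapse correlated securitization-risk measures into principal components
# and regress log audit fees on them plus a control. Synthetic placeholder data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 400
risks = pd.DataFrame(
    rng.multivariate_normal([0, 0, 0],
                            [[1.00, 0.80, 0.70],    # deliberately high correlations
                             [0.80, 1.00, 0.75],
                             [0.70, 0.75, 1.00]], size=n),
    columns=["sec_assets", "retained_interest", "servicing_exposure"])
controls = pd.DataFrame({"log_assets": rng.normal(10, 1, n)})
log_fee = 0.5 * risks.mean(axis=1) + 0.8 * controls["log_assets"] + rng.normal(0, 0.5, n)

# Principal components of the standardized risk measures, then OLS
pcs = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(risks))
X = sm.add_constant(pd.concat([pd.DataFrame(pcs, columns=["risk_pc1", "risk_pc2"]),
                               controls], axis=1))
print(sm.OLS(log_fee, X).fit().summary().tables[1])
```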
Abstract:
Exhaust emissions from motor vehicles vary widely and depend on factors such as engine operating conditions, fuel, age, mileage and service history. A method has been devised to rapidly identify high-polluting vehicles as they travel on the road. The method is able to monitor emissions from a large number of vehicles in a short time and avoids the need to conduct expensive and time-consuming tests on chassis dynamometers. A sample of the exhaust plume is captured as each vehicle passes a roadside monitoring station, and the pollutant emission factors are calculated from the measured concentrations using carbon dioxide as a tracer. Although similar methods have been used to monitor soot and gaseous mass emissions, to date they have not been used to monitor particle number emissions from a large fleet of vehicles. This is particularly important as epidemiological studies have shown that particle number concentration is an important parameter in determining adverse health effects. The method was applied to measurements of particle number emissions from individual buses in the Brisbane City Council diesel fleet operating on the South-East Busway. Results indicate that the particle number emission factors are gamma-distributed, with a high proportion of the emissions being emitted by a small percentage of the buses. Although most of the high emitters are the oldest buses in the fleet, there are clear exceptions, with some newer buses emitting just as much. We attribute this to their recent service history, particularly improper tuning of the engines. We recommend a targeted correction program as a highly effective measure for mitigating urban environmental pollution.
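As an illustration of the distributional finding above, the sketch below (Python with scipy, synthetic numbers) fits a gamma distribution to per-bus emission factors and computes the share of total emissions contributed by the highest-emitting 10% of buses.

```python
# Sketch: fit a gamma distribution to per-bus particle number emission factors
# and ask what share of total emissions comes from the dirtiest 10% of buses.
# Synthetic data for illustration only; shape, scale and units are assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
ef = rng.gamma(shape=1.5, scale=2.0e14, size=300)   # particles per kg fuel (assumed units)

shape, loc, scale = stats.gamma.fit(ef, floc=0)     # maximum-likelihood fit, location fixed at 0
print(f"fitted gamma: shape={shape:.2f}, scale={scale:.2e}")

top10 = np.sort(ef)[-len(ef) // 10:]                # the 10% highest-emitting buses
print(f"share of total emissions from the top 10% of buses: {top10.sum() / ef.sum():.0%}")
```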
Abstract:
Purpose Small field x-ray beam dosimetry is difficult due to a lack of lateral electronic equilibrium, source occlusion, high dose gradients and detector volume averaging. Currently there is no single definitive detector recommended for small field dosimetry. The objective of this work was to evaluate the performance of a new commercial synthetic diamond detector, namely the PTW 60019 microDiamond, for the dosimetry of small x-ray fields as used in stereotactic radiosurgery (SRS). Methods Small field sizes were defined by BrainLAB circular cones (4-30 mm diameter) on a Novalis Trilogy linear accelerator, using the 6 MV SRS x-ray beam mode for all measurements. Percentage depth doses were measured and compared to an IBA SFD diode and a PTW 60012 E diode. Cross profiles were measured and compared to an IBA SFD diode. Field factors, Ω_(Q_clin,Q_msr)^(f_clin,f_msr), were calculated by Monte Carlo methods using BEAMnrc, and correction factors, k_(Q_clin,Q_msr)^(f_clin,f_msr), were derived for the PTW 60019 microDiamond detector. Results For the small fields of 4 to 30 mm diameter, there were dose differences in the PDDs of up to 1.5% when compared to the IBA SFD and PTW 60012 E diode detectors. For the cross profile measurements the penumbra values varied depending upon the orientation of the detector. The field factors, Ω_(Q_clin,Q_msr)^(f_clin,f_msr), were calculated for these field diameters at a depth of 1.4 cm in water and were within 2.7% of published values for a similar linear accelerator. The correction factors, k_(Q_clin,Q_msr)^(f_clin,f_msr), were derived for the PTW 60019 microDiamond detector. Conclusions We conclude that the new PTW 60019 microDiamond detector is generally suitable for relative dosimetry in small 6 MV SRS beams on a Novalis Trilogy linear accelerator equipped with circular cones.
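For readers unfamiliar with the notation, the field factor and detector correction factor above are commonly written, following the Alfonso et al. small-field formalism, as:

```latex
% Field factor: ratio of absorbed dose to water in the clinical field f_clin
% to that in the machine-specific reference field f_msr, linked to the ratio
% of detector readings M by the correction factor k.
\Omega_{Q_\mathrm{clin},Q_\mathrm{msr}}^{f_\mathrm{clin},f_\mathrm{msr}}
  = \frac{D_{w,Q_\mathrm{clin}}^{f_\mathrm{clin}}}{D_{w,Q_\mathrm{msr}}^{f_\mathrm{msr}}}
  = \frac{M_{Q_\mathrm{clin}}^{f_\mathrm{clin}}}{M_{Q_\mathrm{msr}}^{f_\mathrm{msr}}}
    \, k_{Q_\mathrm{clin},Q_\mathrm{msr}}^{f_\mathrm{clin},f_\mathrm{msr}},
\qquad
k_{Q_\mathrm{clin},Q_\mathrm{msr}}^{f_\mathrm{clin},f_\mathrm{msr}}
  = \frac{D_{w,Q_\mathrm{clin}}^{f_\mathrm{clin}} / M_{Q_\mathrm{clin}}^{f_\mathrm{clin}}}
         {D_{w,Q_\mathrm{msr}}^{f_\mathrm{msr}} / M_{Q_\mathrm{msr}}^{f_\mathrm{msr}}}.
```

In words, the correction factor converts a measured ratio of detector readings into the ratio of absorbed doses to water between the clinical field and the machine-specific reference field.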
Abstract:
Description of a patient's injuries is recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families: decision tree, probabilistic, neural network, instance-based, ensemble-based and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description are removed. Misspellings are corrected by finding and replacing the misspelt word with a similar-sounding word. Meaningful phrases are identified and kept, instead of having parts of phrases removed as stop words. Abbreviations appearing in many different forms of entry are manually identified and normalised to a single form. Clustering is utilised to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5,000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterized as high-dimensional and sparse: few features are irrelevant, but the features are correlated with one another. Therefore, matrix factorization techniques such as Singular Value Decomposition (SVD) and Non-Negative Matrix Factorization (NNMF) have been used to map the processed feature space to a lower-dimensional feature space. Classifiers have been built on these reduced feature spaces. In experiments, a set of tests is conducted to determine which classification method is best for medical text classification. The Non-Negative Matrix Factorization with Support Vector Machine method achieves 93% precision, which is higher than all the traditional classifiers tested. We also found that TF/IDF weighting, which works well for long text classification, is inferior to binary weighting in short document classification. Another finding is that the top-n terms should be removed in consultation with medical experts, as this affects the classification performance.
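A minimal sketch of the best-performing combination reported above (binary term weighting, NNMF dimension reduction, then a support vector machine), assuming scikit-learn; the example narratives, codes and dimensions are placeholders, not the study's data or settings.

```python
# Sketch of the NNMF + SVM pipeline on short injury narratives:
# binary term weighting -> NMF to a low-dimensional space -> linear SVM.
# Texts, labels and dimensions below are placeholders.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF
from sklearn.svm import LinearSVC

docs = ["fell from ladder fracture left wrist",
        "burn to right hand from hot oil",
        "laceration to scalp hit by cricket ball",
        "sprained ankle playing netball"]
codes = ["fall", "burn", "struck", "fall"]          # external-cause codes (illustrative)

pipeline = Pipeline([
    ("binary_terms", CountVectorizer(binary=True)), # binary weighting, per the finding above
    ("nmf", NMF(n_components=3, init="nndsvda", max_iter=500)),
    ("svm", LinearSVC()),
])
pipeline.fit(docs, codes)
print(pipeline.predict(["fractured arm after fall from trampoline"]))
```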
Abstract:
Purpose: Astigmatism is an important refractive condition in children. However, the functional impact of uncorrected astigmatism in this population is not well established, particularly with regard to academic performance. This study investigated the impact of simulated bilateral astigmatism on academic-related tasks before and after sustained near work in children. Methods: Twenty visually normal children (mean age: 10.8 ± 0.7 years; 6 males and 14 females) completed a range of standardised academic-related tests with and without 1.50 D of simulated bilateral astigmatism (with both the academic-related tests and the visual condition administered in a randomised order). The simulated astigmatism was induced using a positive cylindrical lens while maintaining a plano spherical equivalent. Performance was assessed before and after 20 minutes of sustained near work, during two separate testing sessions. Academic-related measures included a standardised reading test (the Neale Analysis of Reading Ability), visual information processing tests (Coding and Symbol Search subtests from the Wechsler Intelligence Scale for Children) and a reading-related eye movement test (the Developmental Eye Movement test). Each participant was systematically assigned either with-the-rule (WTR, axis 180°) or against-the-rule (ATR, axis 90°) simulated astigmatism to evaluate the influence of axis orientation on any decrements in performance. Results: Reading, visual information processing and reading-related eye movement performance were all significantly impaired by both simulated bilateral astigmatism (p<0.001) and sustained near work (p<0.001); however, there was no significant interaction between these factors (p>0.05). Simulated astigmatism led to a reduction of between 5% and 12% in performance across the academic-related outcome measures, but there was no significant effect of the axis (WTR or ATR) of astigmatism (p>0.05). Conclusion: Simulated bilateral astigmatism impaired children’s performance on a range of academic-related outcome measures, irrespective of the orientation of the astigmatism. These findings have implications for the clinical management of non-amblyogenic levels of astigmatism in relation to academic performance in children. Correction of low to moderate levels of astigmatism may improve the functional performance of children in the classroom.
Abstract:
Purpose To investigate the effect of different levels of refractive blur on real-world driving performance measured under day and nighttime conditions. Methods Participants included 12 visually normal, young adults (mean age = 25.8 ± 5.2 years) who drove an instrumented research vehicle around a 4 km closed road circuit with three different levels of binocular spherical refractive blur (+0.50 diopter sphere [DS], +1.00 DS, +2.00 DS) compared with a baseline condition. The subjects wore optimal spherocylinder correction and the additional blur lenses were mounted in modified full-field goggles; the order of testing of the blur conditions was randomized. Driving performance was assessed in two different sessions under day and nighttime conditions and included measures of road signs recognized, hazard detection and avoidance, gap detection, lane-keeping, sign recognition distance, speed, and time to complete the course. Results Refractive blur and time of day had significant effects on driving performance (P < 0.05), where increasing blur and nighttime driving reduced performance on all driving tasks except gap judgment and lane keeping. There was also a significant interaction between blur and time of day (P < 0.05), such that the effects of blur were exacerbated under nighttime driving conditions; performance differences were evident even for +0.50 DS blur relative to baseline for some measures. Conclusions The effects of blur were greatest under nighttime conditions, even for levels of binocular refractive blur as low as +0.50 DS. These results emphasize the importance of accurate and up-to-date refractive correction of even low levels of refractive error when driving at night.
Abstract:
This thesis investigated in detail the physics of the small X-ray fields used in radiotherapy treatments. As a result of this work, the ability to accurately measure dose from these very small X-ray fields has been improved in several ways: by scientifically quantifying when highly accurate measurements are required, through the introduction of the concept of a very small field, and by the invention of a new detector that responds the same in very small fields as in normal fields.
Abstract:
Purpose: To provide a comprehensive overview of research examining the impact of astigmatism on clinical and functional measures of vision, the short and longer term adaptations to astigmatism that occur in the visual system, and the currently available clinical options for the management of patients with astigmatism. Recent findings: The presence of astigmatism can lead to substantial reductions in visual performance in a variety of clinical vision measures and functional visual tasks. Recent evidence demonstrates that astigmatic blur results in short-term adaptations in the visual system that appear to reduce the perceived impact of astigmatism on vision. In the longer term, uncorrected astigmatism in childhood can also significantly impact on visual development, resulting in amblyopia. Astigmatism is also associated with the development of spherical refractive errors. Although the clinical correction of small magnitudes of astigmatism is relatively straightforward, the precise, reliable correction of astigmatism (particularly high astigmatism) can be challenging. A wide variety of refractive corrections are now available for the patient with astigmatism, including spectacle, contact lens and surgical options. Conclusion: Astigmatism is one of the most common refractive errors managed in clinical ophthalmic practice. The significant visual and functional impacts of astigmatism emphasise the importance of its reliable clinical management. With continued improvements in ocular measurement techniques and developments in a range of different refractive correction technologies, the future promises the potential for more precise and comprehensive correction options for astigmatic patients.
Abstract:
Purpose To examine macular retinal thickness and retinal layer thickness with spectral domain optical coherence tomography (OCT) in a population of children with normal ocular health and minimal refractive errors. Methods High resolution macular OCT scans from 196 children aged from 4 to 12 years (mean age 8 ± 2 years) were analysed to determine total retinal thickness and the thickness of 6 different retinal layers across the central 5 mm of the posterior pole. Automated segmentation with manual correction was used to derive retinal thickness values. Results The mean total retinal thickness in the central 1 mm foveal zone was 255 ± 16 μm, and this increased significantly with age (mean increase of 1.8 microns per year) in childhood (p<0.001). Age-related increases in thickness of some retinal layers were also observed, with changes of highest statistical significance found in the outer retinal layers in the central foveal region (p<0.01). Significant topographical variations in thickness of each of the retinal layers were also observed (p<0.001). Conclusions Small magnitude, statistically significant increases in total retinal thickness and retinal layer thickness occur from early childhood to adolescence. The most prominent changes appear to occur in the outer retinal layers of the central fovea.
Abstract:
Purpose: Race appears to be associated with myopiogenesis, with East Asians showing high myopia prevalence. Considering structural variations in the eye, it is possible that retinal shapes differ between races. The purpose of this study was to quantify and compare retinal shapes between racial groups using peripheral refraction (PR) and peripheral eye lengths (PEL). Methods: A Shin-Nippon SRW5000 autorefractor and a Haag-Streit Lenstar LS900 biometer measured PR and PEL, respectively, along horizontal (H) and vertical (V) fields out to ±35° in 5° steps in 29 Caucasian (CA), 16 South Asian (SA) and 23 East Asian (EA) young adults (spherical equivalent range +0.75D to –5.00D in all groups). Retinal vertex curvature Rv and asphericity Q were determined by two methods: a) PR (Dunne): the Gullstrand-Emsley eye was modified according to each participant's intraocular lengths and anterior corneal curvature. Ray-tracing was performed at each angle through the stop, altering corneal asphericity until peripheral astigmatism matched the experimental measurements. Retinal curvature, and hence the retinal co-ordinate intersection with the chief ray, was altered until the sagittal refraction matched its measurement. b) PEL: ray-tracing was performed at each angle through the anterior corneal centre of curvature of the Gullstrand-Emsley eye. Ignoring lens refraction, retinal co-ordinates relative to the fovea were determined from PEL and trigonometry. From the sets of retinal co-ordinates, conic retinal shapes were fitted in terms of Rv and Q. Repeated-measures ANOVAs were conducted on Rv and Q, and post hoc t-tests with Bonferroni correction were used to compare races. Results: In all racial groups both methods showed greater Rv for the horizontal than for the vertical meridian and greater Rv for myopes than emmetropes. Rv was greater in EA than in CA (P=0.02), with Rv for SA being intermediate and not significantly different from CA or EA. The PEL method provided larger Rv than the PR method: PEL: EA vs CA 87±13 vs 83±11 m⁻¹ (H), 79±13 vs 72±14 m⁻¹ (V); PR: EA vs CA 79±10 vs 67±10 m⁻¹ (H), 71±17 vs 66±12 m⁻¹ (V). Q did not vary significantly with race. Conclusions: Estimates of Rv, but not of Q, varied significantly with race. The greater Rv found in EA than in CA may be related to the comparatively high prevalence of myopia in many Asian countries.
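A sketch of the conic-fitting step is given below, assuming the standard meridional conic y² = 2rz − (1+Q)z² with vertex radius r = 1/Rv and fitting by ordinary linear least squares; the retinal coordinates are synthetic placeholders for the PR- or PEL-derived points.

```python
# Sketch: fit a conic section y^2 = 2*r*z - (1+Q)*z^2 (apex at the fovea) to
# retinal coordinates and report vertex curvature Rv = 1/r and asphericity Q.
# Coordinates here are synthetic placeholders for PR- or PEL-derived points.
import numpy as np

def fit_conic(z, y):
    """Linear least squares for y^2 = a*z - b*z^2 with a = 2*r and b = 1 + Q."""
    A = np.column_stack([z, -z**2])
    (a, b), *_ = np.linalg.lstsq(A, y**2, rcond=None)
    r = a / 2.0                      # vertex radius of curvature [m]
    return 1.0 / r, b - 1.0          # (Rv [1/m], Q)

# Synthetic retina: r = 12 mm (Rv ~ 83 m^-1), Q = -0.3, plus a little noise
rng = np.random.default_rng(2)
z = np.linspace(0.0005, 0.004, 15)                       # axial sag from the fovea [m]
y = np.sqrt(2 * 0.012 * z - (1 - 0.3) * z**2) + rng.normal(0, 2e-5, z.size)

Rv, Q = fit_conic(z, y)
print(f"Rv = {Rv:.1f} m^-1, Q = {Q:.2f}")
```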