939 results for Correction de textures


Abstract:

The thesis investigates “where were the auditors in asset securitizations”, a criticism levelled at the audit profession before and after the onset of the global financial crisis (GFC). Asset securitizations increase audit complexity and audit risks, which are expected to increase audit fees. Using US bank holding company data from 2003 to 2009, this study examines the association between asset securitization risks and audit fees, and how that association changed during the global financial crisis. The main test is based on an ordinary least squares (OLS) model adapted from the Fields et al. (2004) bank audit fee model. I employ a principal components analysis to address high correlations among asset securitization risks; individual securitization risks are also tested separately. A suite of sensitivity tests indicates that the results are robust. These include model alterations, sample variations, further controls, and correction for the securitizer self-selection problem. A partial least squares (PLS) path modelling methodology is introduced as a separate test, which accommodates high intercorrelations, self-selection correction and sequentially ordered hypotheses in one simultaneous model. The PLS results are consistent with the main results. The study finds significant and positive associations between securitization risks and audit fees. After the commencement of the global financial crisis in 2007, bank failures brought an increased focus on the role of audits in relation to asset securitization risks; I therefore expect auditors to have become more sensitive to bank asset securitization risks after the commencement of the crisis. I find that auditors appear to focus on different aspects of asset securitization risks during the crisis and that they appear to charge banks a GFC premium. Overall, the results support the view that auditors consider asset securitization risks and market changes, and adjust their audit effort and risk considerations accordingly.
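
As a rough illustration of the two-step design described above (principal components to condense highly correlated securitization-risk measures, then an OLS audit fee regression), the Python sketch below uses simulated data; the variable names, the single retained component and the controls are placeholders of mine, not the thesis's actual specification.

# Hypothetical sketch of the PCA-then-OLS design described above; all data are simulated
# and the variable names are placeholders, not the thesis's actual model.

import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
# Simulated bank-year data: three correlated securitization-risk proxies plus simple controls
base = rng.normal(size=n)
risks = pd.DataFrame({
    "retained_interests": base + 0.2 * rng.normal(size=n),
    "securitized_assets": base + 0.3 * rng.normal(size=n),
    "servicing_assets":   base + 0.4 * rng.normal(size=n),
})
controls = pd.DataFrame({"ln_total_assets": rng.normal(10, 1, n),
                         "gfc_period": rng.integers(0, 2, n)})
ln_audit_fee = 0.5 * base + 0.8 * controls["ln_total_assets"] + rng.normal(scale=0.5, size=n)

# Step 1: PCA on the standardised risk proxies to address their high intercorrelation
components = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(risks))

# Step 2: OLS audit-fee regression on the retained risk component and the controls
X = sm.add_constant(pd.concat([pd.Series(components[:, 0], name="risk_pc1"), controls], axis=1))
print(sm.OLS(ln_audit_fee, X).fit().summary().tables[1])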

Abstract:

This research led to the discovery of one of the best preserved remnants of the Earth's surficial environment 3.47 billion years ago. These ancient volcanic and sedimentary rocks contain original minerals and textures that are rare in rocks of this age. The research concentrated on chemical analysis of volcanic rocks to differentiate secondary alteration from the primary magmatic signature. This study contributes to our understanding of melting processes and geochemical reservoirs in the early Earth, which is vital for forward modelling of Earth's geodynamic evolution.

Abstract:

Exhaust emissions from motor vehicles vary widely and depend on factors such as engine operating conditions, fuel, age, mileage and service history. A method has been devised to rapidly identify high-polluting vehicles as they travel on the road. The method is able to monitor emissions from a large number of vehicles in a short time and avoids the need to conduct expensive and time-consuming tests on chassis dynamometers. A sample of the exhaust plume is captured as each vehicle passes a roadside monitoring station, and the pollutant emission factors are calculated from the measured concentrations using carbon dioxide as a tracer. Although similar methods have been used to monitor soot and gaseous mass emissions, they have not to date been used to monitor particle number emissions from a large fleet of vehicles. This is particularly important because epidemiological studies have shown that particle number concentration is an important parameter in determining adverse health effects. The method was applied to measurements of particle number emissions from individual buses in the Brisbane City Council diesel fleet operating on the South-East Busway. Results indicate that the particle number emission factors are gamma-distributed, with a high proportion of the emissions being emitted by a small percentage of the buses. Although most of the high emitters are the oldest buses in the fleet, there are clear exceptions, with some newer buses emitting just as much. We attribute this to their recent service history, particularly improper tuning of the engines. We recommend a targeted correction program as a highly effective measure for mitigating urban environmental pollution.
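
The CO2-tracer calculation referred to above can be sketched as follows; the carbon-balance figure for diesel and the plume values are illustrative assumptions of mine rather than numbers from the study.

# Hypothetical sketch of a CO2-tracer particle number emission factor calculation.
# The diesel carbon-balance figure and the example plume values are illustrative
# assumptions, not data from the study.

import numpy as np

CO2_PER_KG_DIESEL_G = 3160.0   # assumed g of CO2 emitted per kg of diesel burned (carbon balance)
M_CO2 = 44.01                   # molar mass of CO2, g/mol
MOLAR_VOLUME_L = 24.0           # assumed molar volume of air (L/mol) near 20 degC, 1 atm

def ppm_to_g_per_m3(co2_ppm):
    """Convert a CO2 volume mixing ratio (ppm) to a mass concentration (g/m^3)."""
    return co2_ppm * 1e-6 * M_CO2 / (MOLAR_VOLUME_L * 1e-3)

def particle_number_emission_factor(pn_plume, co2_plume_ppm, pn_bg, co2_bg_ppm):
    """
    Particle number emission factor in particles per kg of fuel burned.

    pn_plume, pn_bg           : particle number concentrations (particles per cm^3)
    co2_plume_ppm, co2_bg_ppm : CO2 mixing ratios (ppm)
    """
    delta_pn = np.trapz(pn_plume - pn_bg) * 1e6                         # excess particles per m^3, integrated over the plume
    delta_co2 = np.trapz(ppm_to_g_per_m3(co2_plume_ppm - co2_bg_ppm))   # excess g CO2 per m^3, same integration base
    return (delta_pn / delta_co2) * CO2_PER_KG_DIESEL_G                 # particles per kg fuel

# Example: a short plume captured as a bus passes the roadside station
pn = np.array([1.2e5, 4.0e5, 8.5e5, 3.0e5, 1.5e5])    # particles/cm^3
co2 = np.array([420.0, 650.0, 980.0, 560.0, 440.0])   # ppm
print(f"EF = {particle_number_emission_factor(pn, co2, 1.0e5, 410.0):.2e} particles/kg fuel")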

Abstract:

Existing crowd counting algorithms rely on holistic, local or histogram-based features to capture crowd properties. Regression is then employed to estimate the crowd size. Insufficient testing across multiple datasets has made it difficult to compare and contrast different methodologies. This paper presents an evaluation across multiple datasets to compare holistic, local and histogram-based methods, and to compare various image features and regression models. A K-fold cross-validation protocol is followed to evaluate performance across five public datasets: UCSD, PETS 2009, Fudan, Mall and Grand Central. Image features are categorised into five types: size, shape, edges, keypoints and textures. The regression models evaluated are Gaussian process regression (GPR), linear regression, K-nearest neighbours (KNN) and neural networks (NN). The results demonstrate that local features outperform equivalent holistic and histogram-based features; that optimal performance is observed using all image features except textures; and that GPR outperforms linear, KNN and NN regression.
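
The evaluation protocol can be sketched with scikit-learn as below; the feature matrix and counts are random placeholders standing in for the extracted crowd features, so this shows only the K-fold comparison of the four regression families, not the paper's actual feature pipeline.

# A minimal sketch of the K-fold evaluation protocol described above, using scikit-learn.
# Feature extraction from the crowd datasets is not shown; X is assumed to hold per-frame
# feature vectors (size, shape, edge, keypoint, texture descriptors) and y the crowd counts.

import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))                       # placeholder feature matrix (200 frames, 30 features)
y = rng.poisson(lam=25, size=200).astype(float)      # placeholder crowd counts

models = {
    "GPR": GaussianProcessRegressor(),
    "Linear": LinearRegression(),
    "KNN": KNeighborsRegressor(n_neighbors=5),
    "NN": MLPRegressor(hidden_layer_sizes=(50,), max_iter=2000, random_state=0),
}

cv = KFold(n_splits=5, shuffle=True, random_state=0)
for name, model in models.items():
    mae = -cross_val_score(model, X, y, cv=cv, scoring="neg_mean_absolute_error")
    print(f"{name}: MAE = {mae.mean():.2f} +/- {mae.std():.2f}")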

Abstract:

Purpose: Small-field x-ray beam dosimetry is difficult due to a lack of lateral electronic equilibrium, source occlusion, high dose gradients and detector volume averaging. Currently there is no single definitive detector recommended for small-field dosimetry. The objective of this work was to evaluate the performance of a new commercial synthetic diamond detector, the PTW 60019 microDiamond, for the dosimetry of small x-ray fields as used in stereotactic radiosurgery (SRS). Methods: Small field sizes were defined by BrainLAB circular cones (4–30 mm diameter) on a Novalis Trilogy linear accelerator, using the 6 MV SRS x-ray beam mode for all measurements. Percentage depth doses were measured and compared to an IBA SFD and a PTW 60012 E diode. Cross profiles were measured and compared to an IBA SFD diode. Field factors, Ω_(Q_clin,Q_msr)^(f_clin,f_msr), were calculated by Monte Carlo methods using BEAMnrc, and correction factors, k_(Q_clin,Q_msr)^(f_clin,f_msr), were derived for the PTW 60019 microDiamond detector. Results: For the small fields of 4 to 30 mm diameter, there were dose differences in the PDDs of up to 1.5% when compared to the IBA SFD and PTW 60012 E diode detectors. For the cross-profile measurements the penumbra values varied, depending upon the orientation of the detector. The field factors were calculated for these field diameters at a depth of 1.4 cm in water and were within 2.7% of published values for a similar linear accelerator. The correction factors were derived for the PTW 60019 microDiamond detector. Conclusions: We conclude that the new PTW 60019 microDiamond detector is generally suitable for relative dosimetry in small 6 MV SRS beams for a Novalis Trilogy linear accelerator equipped with circular cones.
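
For context, the field factor and detector correction factor referred to here are the quantities of the small-field formalism of Alfonso et al. (2008), whose notation the abstract uses. In that formalism (the relation itself is not spelled out in the abstract), the field factor converts the ratio of detector readings M in the clinical field f_clin and the machine-specific reference field f_msr into a ratio of absorbed doses to water:

\[
\Omega_{Q_{\mathrm{clin}},Q_{\mathrm{msr}}}^{f_{\mathrm{clin}},f_{\mathrm{msr}}}
  = \frac{D_{w,Q_{\mathrm{clin}}}^{f_{\mathrm{clin}}}}{D_{w,Q_{\mathrm{msr}}}^{f_{\mathrm{msr}}}}
  = \frac{M_{Q_{\mathrm{clin}}}^{f_{\mathrm{clin}}}}{M_{Q_{\mathrm{msr}}}^{f_{\mathrm{msr}}}}
    \, k_{Q_{\mathrm{clin}},Q_{\mathrm{msr}}}^{f_{\mathrm{clin}},f_{\mathrm{msr}}}
\]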

Abstract:

Descriptions of patients' injuries are recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for the mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families, such as decision tree, probabilistic, neural network, instance-based, ensemble-based and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description are removed. Misspellings are corrected by finding and replacing each misspelt word with a sound-alike word. Meaningful phrases are identified and kept intact, rather than having parts of them removed as stop words. Abbreviations, which appear in many variant forms, are manually identified and normalised to a single form. Clustering is used to discriminate between non-frequent and frequent terms; this reduced the number of text features dramatically, from about 28,000 to 5,000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterised as high-dimensional and sparse: few features are irrelevant, but many features are correlated with one another. Therefore, matrix factorization techniques such as Singular Value Decomposition (SVD) and Non-Negative Matrix Factorization (NNMF) are used to map the processed feature space to a lower-dimensional feature space, and classifiers are built on these reduced feature spaces. A set of experiments is conducted to determine which classification method is best suited to this medical text classification task. The Non-Negative Matrix Factorization with Support Vector Machine method achieves 93% precision, higher than all of the traditional classifiers tested. We also find that TF-IDF weighting, which works well for long-text classification, is inferior to binary weighting for short-document classification. Another finding is that the top-n terms should be removed in consultation with medical experts, as their removal affects classification performance.
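
A minimal scikit-learn sketch of the best-performing configuration reported above (binary term weighting, NNMF dimensionality reduction, then a support vector machine) is given below; the corpus and codes are toy placeholders, and the spelling, phrase and abbreviation pre-processing described in the abstract is assumed to have been applied already.

# A minimal sketch of the NNMF + SVM configuration described above, using scikit-learn.
# The injury narrative corpus is replaced here by a toy placeholder, and the pre-processing
# steps described in the abstract are assumed to have been applied already.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

texts = [
    "fell from ladder fracture left wrist",
    "burn to right hand from hot oil",
    "dog bite laceration lower leg",
    "fell from ladder fracture right ankle",
    "scald burn left forearm boiling water",
    "dog bite puncture wound right hand",
] * 10                              # toy corpus of short injury narratives
codes = [0, 1, 2, 0, 1, 2] * 10     # toy injury-mechanism codes

pipeline = make_pipeline(
    CountVectorizer(binary=True),                        # binary weighting outperformed TF-IDF on short texts
    NMF(n_components=10, init="nndsvd", max_iter=500),   # non-negative matrix factorization of the term matrix
    LinearSVC(),                                         # linear support vector machine on the reduced features
)

scores = cross_val_score(pipeline, texts, codes, cv=5)
print(f"mean accuracy: {scores.mean():.2f}")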

Abstract:

Two Archaean komatiitic flows, Fred’s Flow in Canada and the Murphy Well Flow in Australia, have similar thicknesses (120 and 160 m) but very different compositions and internal structures. Their contrasting differentiation profiles provide keys to determining the cooling and crystallization mechanisms that operated during the eruption of Archaean ultramafic lavas. Fred’s Flow is the type example of a thick komatiitic basalt flow. It is strongly differentiated and consists of a succession of layers with contrasting textures and compositions. The layering is readily explained by the accumulation of olivine and pyroxene in a lower cumulate layer and by evolution of the liquid composition during downward growth of spinifex-textured rocks within the upper crust. The magmas that erupted to form Fred’s Flow had variable compositions, ranging from 12 to 20 wt% MgO, and phenocryst contents from 0 to 20 vol%. The flow was emplaced in two pulses: a first ~20-m-thick pulse was followed by another, more voluminous but less magnesian pulse that inflated the flow to its present 120 m thickness. Following the second pulse, the flow crystallized in a closed system and differentiated into cumulates containing 30–38 wt% MgO and a residual gabbroic layer with only 6 wt% MgO. The Murphy Well Flow, in contrast, has a remarkably uniform composition throughout. It comprises a 20-m-thick upper layer of fine-grained dendritic olivine with 2–5 vol% amygdales, a 110–120-m-thick intermediate layer of olivine porphyry and a 20–30-m-thick basal layer of olivine orthocumulate. Throughout the flow, MgO contents vary little, from only 30 to 33 wt%, except for the slightly more magnesian basal layer (38–40 wt%). The uniform composition of the flow and the dendritic olivine habits in the upper 20 m point to rapid cooling of a highly magnesian liquid with a composition like that of the bulk of the flow. Under equilibrium conditions, this liquid should have crystallized olivine with the composition Fo94.9, but the most magnesian composition measured by electron microprobe in samples from the flow is Fo92.9. To explain these features, we propose that the parental liquid contained around 32 wt% MgO and 3 wt% H2O. This liquid degassed during the eruption, creating a supercooled liquid that solidified quickly and crystallized olivine with non-equilibrium textures and compositions.
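
The equilibrium olivine composition quoted above follows from olivine-liquid Fe-Mg exchange; the sketch below assumes the Roeder and Emslie (1970) exchange coefficient KD of about 0.30 and an illustrative liquid FeO content (the abstract gives only the MgO content), so it reproduces the style of the calculation rather than the authors' exact numbers.

# A minimal sketch of the olivine-liquid equilibrium calculation implied above, assuming
# the Fe-Mg exchange coefficient KD = (FeO/MgO)_olivine / (FeO/MgO)_liquid ~ 0.30 of
# Roeder & Emslie (1970). The liquid FeO content is an illustrative assumption; the
# abstract gives only the MgO content (~32 wt%).

MW_MGO, MW_FEO = 40.30, 71.85   # molar masses, g/mol

def equilibrium_fo(mgo_wt, feo_wt, kd=0.30):
    """Forsterite content (mol%) of olivine in equilibrium with a liquid of given MgO/FeO (wt%)."""
    fe_mg_liq = (feo_wt / MW_FEO) / (mgo_wt / MW_MGO)   # molar Fe2+/Mg in the liquid
    fe_mg_ol = kd * fe_mg_liq                            # molar Fe2+/Mg in equilibrium olivine
    return 100.0 / (1.0 + fe_mg_ol)

# A liquid with ~32 wt% MgO (and an assumed ~10.5 wt% FeO) gives olivine near Fo95,
# close to the quoted equilibrium value of Fo94.9, versus the measured maximum of Fo92.9.
print(f"Fo = {equilibrium_fo(32.0, 10.5):.1f}")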

Abstract:

Purpose: Astigmatism is an important refractive condition in children. However, the functional impact of uncorrected astigmatism in this population is not well established, particularly with regard to academic performance. This study investigated the impact of simulated bilateral astigmatism on academic-related tasks before and after sustained near work in children. Methods: Twenty visually normal children (mean age: 10.8 ± 0.7 years; 6 males and 14 females) completed a range of standardised academic-related tests with and without 1.50 D of simulated bilateral astigmatism (with both the academic-related tests and the visual condition administered in a randomised order). The simulated astigmatism was induced using a positive cylindrical lens while maintaining a plano spherical equivalent. Performance was assessed before and after 20 minutes of sustained near work, during two separate testing sessions. Academic-related measures included a standardised reading test (the Neale Analysis of Reading Ability), visual information processing tests (the Coding and Symbol Search subtests from the Wechsler Intelligence Scale for Children) and a reading-related eye movement test (the Developmental Eye Movement test). Each participant was systematically assigned either with-the-rule (WTR, axis 180°) or against-the-rule (ATR, axis 90°) simulated astigmatism to evaluate the influence of axis orientation on any decrements in performance. Results: Reading, visual information processing and reading-related eye movement performance were all significantly impaired by both simulated bilateral astigmatism (p<0.001) and sustained near work (p<0.001); however, there was no significant interaction between these factors (p>0.05). Simulated astigmatism led to a reduction of between 5% and 12% in performance across the academic-related outcome measures, but there was no significant effect of the axis (WTR or ATR) of astigmatism (p>0.05). Conclusion: Simulated bilateral astigmatism impaired children’s performance on a range of academic-related outcome measures, irrespective of the orientation of the astigmatism. These findings have implications for the clinical management of non-amblyogenic levels of astigmatism in relation to academic performance in children. Correction of low to moderate levels of astigmatism may improve the functional performance of children in the classroom.

Abstract:

Purpose: To investigate the effect of different levels of refractive blur on real-world driving performance measured under day and nighttime conditions. Methods: Participants included 12 visually normal young adults (mean age = 25.8 ± 5.2 years) who drove an instrumented research vehicle around a 4 km closed-road circuit with three different levels of binocular spherical refractive blur (+0.50 diopter sphere [DS], +1.00 DS, +2.00 DS) compared with a baseline condition. The subjects wore optimal spherocylinder correction and the additional blur lenses were mounted in modified full-field goggles; the order of testing of the blur conditions was randomized. Driving performance was assessed in two different sessions under day and nighttime conditions and included measures of road signs recognized, hazard detection and avoidance, gap detection, lane keeping, sign recognition distance, speed, and time to complete the course. Results: Refractive blur and time of day had significant effects on driving performance (P < 0.05), where increasing blur and nighttime driving reduced performance on all driving tasks except gap judgment and lane keeping. There was also a significant interaction between blur and time of day (P < 0.05), such that the effects of blur were exacerbated under nighttime driving conditions; performance differences were evident even for +0.50 DS blur relative to baseline for some measures. Conclusions: The effects of blur were greatest under nighttime conditions, even for levels of binocular refractive blur as low as +0.50 DS. These results emphasize the importance of accurate and up-to-date refractive correction of even low levels of refractive error when driving at night.

Abstract:

This thesis investigated in detail the physics of the small X-ray fields used in radiotherapy treatments. As a result of this work, the ability to accurately measure dose from these very small X-ray fields has been improved in several ways, including by scientifically quantifying when highly accurate measurements are required, through the introduction of the concept of a very small field, and by the invention of a new detector that responds in the same way in very small fields as in standard fields.

Abstract:

Purpose: To provide a comprehensive overview of research examining the impact of astigmatism on clinical and functional measures of vision, the short and longer term adaptations to astigmatism that occur in the visual system, and the currently available clinical options for the management of patients with astigmatism. Recent findings: The presence of astigmatism can lead to substantial reductions in visual performance in a variety of clinical vision measures and functional visual tasks. Recent evidence demonstrates that astigmatic blur results in short-term adaptations in the visual system that appear to reduce the perceived impact of astigmatism on vision. In the longer term, uncorrected astigmatism in childhood can also significantly impact on visual development, resulting in amblyopia. Astigmatism is also associated with the development of spherical refractive errors. Although the clinical correction of small magnitudes of astigmatism is relatively straightforward, the precise, reliable correction of astigmatism (particularly high astigmatism) can be challenging. A wide variety of refractive corrections are now available for the patient with astigmatism, including spectacle, contact lens and surgical options. Conclusion: Astigmatism is one of the most common refractive errors managed in clinical ophthalmic practice. The significant visual and functional impacts of astigmatism emphasise the importance of its reliable clinical management. With continued improvements in ocular measurement techniques and developments in a range of different refractive correction technologies, the future promises the potential for more precise and comprehensive correction options for astigmatic patients.

Abstract:

Purpose: To examine macular retinal thickness and retinal layer thickness with spectral domain optical coherence tomography (OCT) in a population of children with normal ocular health and minimal refractive errors. Methods: High-resolution macular OCT scans from 196 children aged from 4 to 12 years (mean age 8 ± 2 years) were analysed to determine total retinal thickness and the thickness of 6 different retinal layers across the central 5 mm of the posterior pole. Automated segmentation with manual correction was used to derive retinal thickness values. Results: The mean total retinal thickness in the central 1 mm foveal zone was 255 ± 16 μm, and this increased significantly with age (mean increase of 1.8 μm per year) in childhood (p<0.001). Age-related increases in thickness of some retinal layers were also observed, with changes of highest statistical significance found in the outer retinal layers in the central foveal region (p<0.01). Significant topographical variations in thickness of each of the retinal layers were also observed (p<0.001). Conclusions: Small-magnitude, statistically significant increases in total retinal thickness and retinal layer thickness occur from early childhood to adolescence. The most prominent changes appear to occur in the outer retinal layers of the central fovea.

Abstract:

Purpose: Race appears to be associated with myopiogenesis, with East Asians showing a high prevalence of myopia. Considering structural variations in the eye, it is possible that retinal shapes differ between races. The purpose of this study was to quantify and compare retinal shapes between racial groups using peripheral refraction (PR) and peripheral eye lengths (PEL). Methods: A Shin-Nippon SRW5000 autorefractor and a Haag-Streit Lenstar LS900 biometer measured PR and PEL, respectively, along horizontal (H) and vertical (V) fields out to ±35° in 5° steps in 29 Caucasian (CA), 16 South Asian (SA) and 23 East Asian (EA) young adults (spherical equivalent range +0.75 D to –5.00 D in all groups). Retinal vertex curvature Rv and asphericity Q were determined using two methods: (a) PR (Dunne): the Gullstrand-Emsley eye was modified according to each participant’s intraocular lengths and anterior corneal curvature. Ray-tracing was performed at each angle through the stop, altering corneal asphericity until peripheral astigmatism matched the experimental measurements; retinal curvature, and hence the retinal co-ordinate of intersection with the chief ray, was then altered until the sagittal refraction matched its measurement. (b) PEL: ray-tracing was performed at each angle through the anterior corneal centre of curvature of the Gullstrand-Emsley eye. Ignoring lens refraction, retinal co-ordinates relative to the fovea were determined from the PEL and trigonometry. From the sets of retinal co-ordinates, conic retinal shapes were fitted in terms of Rv and Q. Repeated-measures ANOVAs were conducted on Rv and Q, and post hoc t-tests with Bonferroni correction were used to compare races. Results: In all racial groups, both methods showed greater Rv for the horizontal than for the vertical meridian and greater Rv for myopes than for emmetropes. Rv was greater in EA than in CA (P=0.02), with Rv for SA being intermediate and not significantly different from either CA or EA. The PEL method gave larger Rv than the PR method: PEL, EA vs CA: 87±13 vs 83±11 m⁻¹ (H), 79±13 vs 72±14 m⁻¹ (V); PR, EA vs CA: 79±10 vs 67±10 m⁻¹ (H), 71±17 vs 66±12 m⁻¹ (V). Q did not vary significantly with race. Conclusions: Estimates of Rv, but not of Q, varied significantly with race. The greater Rv found in EA than in CA may be related to the comparatively high prevalence of myopia in many Asian countries.
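
A simplified sketch of the PEL method (b) is given below; the placement of the chief ray through a corneal centre of curvature 7.8 mm behind the corneal vertex and the conic-fitting details are assumptions of mine based on the description above, not the authors' implementation.

# A minimal sketch of the peripheral-eye-length (PEL) approach of method (b), under simplifying
# assumptions of my own: the chief ray at field angle theta passes undeviated through the
# anterior corneal centre of curvature (placed 7.8 mm behind the corneal vertex, an illustrative
# Gullstrand-Emsley value), and the measured eye length locates the retinal point along that ray.
# Retinal coordinates relative to the fovea are then fitted with the conic
# y^2 = 2*r*z - (1+Q)*z^2 to recover vertex curvature Rv = 1/r and asphericity Q.

import numpy as np

RC = 7.8e-3   # assumed corneal-vertex-to-centre-of-curvature distance (m)

def retinal_coordinates(angles_deg, pel_m):
    """Transverse (y) and sag (z) coordinates of retinal points relative to the fovea (metres)."""
    theta = np.radians(np.asarray(angles_deg, dtype=float))
    seg = np.asarray(pel_m, dtype=float) - RC     # centre of curvature to retina along each ray
    axial_seg = seg[np.argmin(np.abs(theta))]     # on-axis value defines the foveal reference
    y = seg * np.sin(theta)                       # transverse coordinate
    z = axial_seg - seg * np.cos(theta)           # sag of the peripheral retina towards the cornea
    return y, z

def fit_conic(y, z):
    """Least-squares fit of y^2 = 2*r*z - (1+Q)*z^2; returns (Rv, Q) with Rv = 1/r in 1/m."""
    A = np.column_stack([2.0 * z, -z**2])
    (r, one_plus_q), *_ = np.linalg.lstsq(A, y**2, rcond=None)
    return 1.0 / r, one_plus_q - 1.0

# Check on synthetic coordinates generated from a known retina (Rv = 83 1/m, Q = 0.25);
# in practice y and z would come from retinal_coordinates() applied to the Lenstar PELs.
r_true, q_true = 1.0 / 83.0, 0.25
y = np.linspace(-0.010, 0.010, 15)                                      # +/-10 mm from the fovea
z = (r_true - np.sqrt(r_true**2 - (1 + q_true) * y**2)) / (1 + q_true)  # conic sag
Rv, Q = fit_conic(y, z)
print(f"Rv = {Rv:.1f} 1/m, Q = {Q:.2f}")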

Abstract:

There is growing interest in the biomechanics of ‘fusionless’ implant constructs used for deformity correction in the thoracic spine; however, there are questions over the comparability of in vitro biomechanical studies from different research groups due to the various methods used for specimen preparation, testing and data collection. The aim of this study was to identify the effect of two key factors on the stiffness of immature bovine thoracic spine motion segments: (i) repeated cyclic loading and (ii) multiple freeze-thaw cycles, to aid in the planning and interpretation of in vitro studies. Two groups of thoracic spine motion segments from 6- to 8-week-old calves were tested in flexion/extension, right/left lateral bending, and right/left axial rotation under moment control. Group A was tested with continuous repeated cyclic loading for 500 cycles, with data recorded at cycles 3, 5, 10, 25, 50, 100, 200, 300, 400 and 500. Group B was tested after each of five freeze-thaw sequences, with data collected from the 10th load cycle in each sequence. Group A: flexion/extension stiffness decreased significantly over the 500 load cycles (-22%; P=0.001), but there was no significant change between the 5th and 200th load cycles. Lateral bending stiffness decreased significantly (-18%; P=0.009) over the 500 load cycles, but there was no significant change in axial rotation stiffness (P=0.137). Group B: there was no significant difference between mean stiffness values over the five freeze-thaw sequences in flexion/extension (P=0.813) and a near-significant reduction in mean stiffness in axial rotation (-6%; P=0.07); however, there was a statistically significant increase in stiffness in lateral bending (+30%; P=0.007). Comparison of in vitro testing results for immature bovine thoracic spine segments between studies can therefore be performed with up to 200 load cycles without significant changes in stiffness. However, when testing protocols require more than 200 cycles, or when repeated freeze-thaw cycles are involved, it is important to account for the effect of cumulative loading and freeze-thaw cycles on spine segment stiffness.

Abstract:

Texture enhancement is an important component of image processing that finds extensive application in science and engineering. The quality of medical images, quantified using imaging texture, plays a significant role in the routine diagnosis performed by medical practitioners. Most image texture enhancement is performed using classical integral-order differential mask operators. Recently, first-order fractional differential operators have been used to enhance images. Experimentation with these methods leads to the conclusion that fractional differential operators not only maintain the low-frequency contour features in the smooth areas of an image, but also nonlinearly enhance the edges and textures corresponding to high-frequency image components. However, whilst these methods perform well in particular cases, they are not routinely useful across all applications. To this end, we apply the second-order Riesz fractional differential operator to improve upon existing approaches to texture enhancement. Compared with the classical integral-order differential mask operators and other first-order fractional differential operators, we find that our new algorithms provide higher signal-to-noise values and superior image quality.
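
As an illustration of the general idea, the numpy sketch below applies a Riesz fractional derivative, whose Fourier symbol is -|ω|^α, as a frequency-domain texture enhancer; it demonstrates the operator only and does not reproduce the specific second-order mask construction or parameter choices of the paper.

# A minimal numpy sketch of frequency-domain texture enhancement with a Riesz fractional
# derivative. The Riesz operator of order alpha has Fourier symbol -|omega|^alpha, so applying
# it and subtracting a scaled copy from the image boosts high-frequency texture while leaving
# smooth regions largely untouched. This illustrates the general operator only; the mask
# construction and parameters of the paper itself are not reproduced here.

import numpy as np

def riesz_fractional_enhance(image, alpha=1.5, weight=0.5):
    """Enhance texture by subtracting `weight` times the Riesz fractional derivative of order alpha."""
    img = image.astype(float)
    fy = np.fft.fftfreq(img.shape[0])[:, None]        # vertical spatial frequencies
    fx = np.fft.fftfreq(img.shape[1])[None, :]        # horizontal spatial frequencies
    omega = 2.0 * np.pi * np.sqrt(fx**2 + fy**2)      # radial frequency |omega|
    symbol = -omega**alpha                            # Fourier symbol of the Riesz fractional derivative
    deriv = np.real(np.fft.ifft2(symbol * np.fft.fft2(img)))
    return img - weight * deriv                       # high-boost style enhancement

# Toy example: a smooth ramp with a fine sinusoidal texture superimposed
y, x = np.mgrid[0:128, 0:128]
image = 0.5 * x / 128.0 + 0.05 * np.sin(2 * np.pi * x / 8.0)
enhanced = riesz_fractional_enhance(image, alpha=1.5, weight=0.5)
print(image.std(), enhanced.std())   # the fine textured component is amplified, raising the contrast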