891 results for “Logical Correction”



Abstract:

The thesis investigates “where were the auditors in asset securitizations”, a criticism levelled at the audit profession before and after the onset of the global financial crisis (GFC). Asset securitizations increase audit complexity and audit risks, which are expected to increase audit fees. Using US bank holding company data from 2003 to 2009, this study examines the association between asset securitization risks and audit fees, and how that association changed during the global financial crisis. The main test is based on an ordinary least squares (OLS) model adapted from the Fields et al. (2004) bank audit fee model. I employ a principal components analysis to address high correlations among asset securitization risks. Individual securitization risks are also tested separately. A suite of sensitivity tests indicates that the results are robust; these include model alterations, sample variations, additional controls, and correction for the securitizer self-selection problem. A partial least squares (PLS) path modelling methodology is introduced as a separate test, which accommodates high intercorrelations, self-selection correction, and sequential order hypotheses in one simultaneous model. The PLS results are consistent with the main results. The study finds significant and positive associations between securitization risks and audit fees. After the commencement of the global financial crisis in 2007, bank failures brought increased scrutiny of the role of audits in relation to asset securitization risks; I therefore expect auditors to have become more sensitive to bank asset securitization risks after the commencement of the crisis. I find that auditors appear to focus on different aspects of asset securitization risks during the crisis, and that they appear to charge banks a GFC premium. Overall, the results support the view that auditors consider asset securitization risks and market changes, and adjust their audit effort and risk considerations accordingly.
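The two-step design summarised above (principal components to condense correlated risk proxies, which then enter an OLS audit-fee regression alongside controls) can be sketched as follows. This is a minimal illustration under assumed data, not the author's code; `audit_fee` and the risk and control column lists are hypothetical names.

```python
# Minimal sketch: PCA on highly correlated securitization-risk proxies,
# with the leading components entering an OLS audit-fee regression.
# All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def audit_fee_pca_ols(df: pd.DataFrame, risk_cols, control_cols, n_pcs=2):
    # Standardise the correlated risk proxies before extracting components.
    risks = StandardScaler().fit_transform(df[risk_cols])
    pcs = PCA(n_components=n_pcs).fit_transform(risks)
    X = df[control_cols].copy()
    for i in range(n_pcs):
        X[f"risk_pc{i + 1}"] = pcs[:, i]
    # Audit fees are commonly modelled in logs in this literature (assumed).
    y = np.log(df["audit_fee"])
    return sm.OLS(y, sm.add_constant(X)).fit()
```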


Abstract:

Health Law in Australia is the country’s leading text in this area and was the first book to deal with health law on a comprehensive national basis. In this important field, which continues to give rise to challenges for society, Health Law in Australia takes a logical, structured approach to explain the breadth of this area of law across all Australian jurisdictions. By covering all the major areas in this diverse field, Health Law in Australia enhances the understanding of the discipline as a whole. Beginning with an exploration of the general principles of health law, including chapters on “Negligence”, “Children and Consent to Medical Treatment”, and “Medical Confidentiality and Patient Privacy”, the book goes on to consider beginning-of-life and end-of-life issues before concluding with chapters on emerging areas in health law, such as biotechnology, genetic technologies and medical research. The contributing authors are national leaders who specialise in these areas of health law and who share with readers the results of their research. Health Law in Australia has been written for both legal and health audiences and is essential reading for undergraduate and postgraduate students, researchers and scholars in the disciplines of law, health and medicine, as well as health and legal practitioners, government departments and bodies in the health area, and private health providers.


Abstract:

Exhaust emissions from motor vehicles vary widely and depend on factors such as engine operating conditions, fuel, age, mileage and service history. A method has been devised to rapidly identify high-polluting vehicles as they travel on the road. The method can monitor emissions from a large number of vehicles in a short time and avoids the need to conduct expensive and time-consuming tests on chassis dynamometers. A sample of the exhaust plume is captured as each vehicle passes a roadside monitoring station, and the pollutant emission factors are calculated from the measured concentrations using carbon dioxide as a tracer. Although similar methods have been used to monitor soot and gaseous mass emissions, to date they have not been used to monitor particle number emissions from a large fleet of vehicles. This is particularly important because epidemiological studies have shown that particle number concentration is an important parameter in determining adverse health effects. The method was applied to measurements of particle number emissions from individual buses in the Brisbane City Council diesel fleet operating on the South-East Busway. Results indicate that the particle number emission factors are gamma-distributed, with a high proportion of the emissions being emitted by a small percentage of the buses. Although most of the high-emitters are the oldest buses in the fleet, there are clear exceptions, with some newer buses emitting as much. We attribute this to their recent service history, particularly improper tuning of the engines. We recommend a targeted correction program as a highly effective measure for mitigating urban environmental pollution.
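The CO2-tracer calculation at the heart of this approach can be sketched as below: the background-subtracted particle number concentration in the captured plume is ratioed to the background-subtracted CO2 concentration, then scaled by the carbon content of the fuel to give particles emitted per kilogram of fuel burned. This is a generic illustration under stated assumptions (an approximate diesel carbon fraction, complete combustion of fuel carbon to CO2, air number density near 25 °C and 1 atm); the function name is hypothetical.

```python
# Sketch of a plume-capture emission factor using CO2 as a tracer.
# Constants are assumptions for diesel fuel, not values from the study.
N_A = 6.022e23          # Avogadro's number, molecules per mole
CARBON_FRACTION = 0.86  # approx. kg of carbon per kg of diesel (assumed)
M_CARBON = 0.012        # kg per mole of carbon

def particle_number_emission_factor(pn_plume, pn_bg, co2_plume_ppm, co2_bg_ppm,
                                    air_number_density=2.46e19):
    """Particles emitted per kg of fuel, from plume and background readings.

    pn_* are particle number concentrations in particles/cm^3;
    co2_*_ppm are CO2 mixing ratios in ppm; air_number_density is the
    number density of air in molecules/cm^3 (~25 C, 1 atm).
    """
    delta_pn = pn_plume - pn_bg
    # Convert the CO2 excess from ppm to moles of CO2 per cm^3 of air.
    delta_co2_mol = (co2_plume_ppm - co2_bg_ppm) * 1e-6 * air_number_density / N_A
    # Moles of fuel carbon per kg of fuel, assumed fully emitted as CO2.
    moles_c_per_kg_fuel = CARBON_FRACTION / M_CARBON
    return (delta_pn / delta_co2_mol) * moles_c_per_kg_fuel
```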


Abstract:

As a concept, the magic circle is in reality just four years old. Whilst often attributed to Johan Huizinga (1955), the modern usage of the term in truth belongs to Katie Salen and Eric Zimmerman. It became established in academia following the publication of “Rules of Play” in 2003. Because of the terminology used, it carries with it unhelpful preconceptions that the game world, or play-space, excludes reality. In this paper, I argue that Salen and Zimmerman (2003) have taken a term used as an example and applied a meaning to it that was never intended, based primarily upon definitions given by other authors, namely Apter (1991) and Sniderman (n.d.). I further argue that the definition itself contains a logical fallacy, which has prevented the full understanding of the definition in later work. Through a study of the literature in Game Theory, and examples of possible issues which could arise in contemporary games, I suggest that the emotions of the play experience continue beyond the play space, and that emotions from the “real world” enter it with the participants. I consider a reprise of the Stanley Milgram Obedience Experiment (2006), and what it tells us about human emotions and the effect that events taking place in a virtual environment can have upon them. I evaluate the opinion, espoused by some authors, that there are different magic circles for different players, and assert that this is not a useful approach to take when studying games, because it prevents the analysis of a game as a single entity. Furthermore, I consider the reasons given by other authors for the existence of the Magic Circle, and I assert that the term “Magic Circle” should be discarded: it has no relevance to contemporary games, and indeed acts as a hindrance to the design and study of games. I conclude that the play space it claims to protect from the courts and other governmental authorities would be better served by the existing concepts of intent, consent, and the commonly accepted principles associated with international travel.


Abstract:

With respect to “shape” marks, the Australian courts appear to have imposed a “break” in the otherwise logical conclusion that registration of a shape which performs a functional purpose, or which goes further and is indistinguishable from the shape of the item or product itself, creates a perpetual monopoly in the manufacture of that product.


Abstract:

It is often said that Australia is a world leader in rates of copyright infringement for entertainment goods. In 2012, the hit television show Game of Thrones was the most downloaded television show over BitTorrent, and estimates suggest that Australians accounted for the largest national share, nearly 10%, of the 3–4 million downloads each week. The season finale in 2013 was downloaded over a million times within 24 hours of its release, and again Australians were the largest bloc of illicit downloaders over BitTorrent, despite our relatively small population. This trend has led the former US Ambassador to Australia to implore Australians to stop 'stealing' digital content, and rightsholders to push for increasing sanctions on copyright infringers. The Australian Government is looking to respond by requiring Internet Service Providers to issue warnings and potentially punish consumers who are alleged by industry groups to have infringed copyright. This is the logical next step in deterring infringement, given that the operators of infringing networks (like The Pirate Bay, for example) are out of regulatory reach. This steady ratcheting up of the strength of copyright, however, comes at a significant cost to user privacy and autonomy, and while the decentralisation of enforcement reduces costs, it also reduces the due process safeguards provided by the judicial process. This article presents qualitative evidence that substantiates a common intuition: one of the major reasons Australians seek out illicit downloads of content like Game of Thrones in such numbers is that it is more difficult to access legitimately in Australia. The geographically segmented way in which copyright is exploited at an international level has given rise to a ‘tyranny of digital distance’, in which Australians have less access to copyright goods than consumers in other countries. Compared to consumers in the US and the EU, Australians pay more for digital goods, have less choice in distribution channels, are exposed to substantial delays in access, and are sometimes denied access completely. In this article we focus our analysis on premium film and television offerings, like Game of Thrones, and, through semi-structured interviews, explore how choices in distribution affect the willingness of Australian consumers to seek out infringing copies of copyright material. Game of Thrones provides an excellent case study through which to frame this analysis: it is both one of the least legally accessible television offerings and one of the most downloaded through filesharing networks in recent times. Our analysis shows that at the same time as rightsholder groups, particularly in the film and television industries, are lobbying for stronger laws to counter illicit distribution, the business practices of their member organisations are counter-productively increasing the incentives for consumers to infringe. The lack of accessibility and the high prices of copyright goods in Australia lead to substantial economic waste. The unmet consumer demand means that Australian consumers are harmed by having less access to information and entertainment goods than consumers in other jurisdictions. The higher rate of infringement that fulfils some of this unmet demand increases enforcement costs for copyright owners and imposes burdens either on our judicial system or on private entities – like ISPs – who may be tasked with enforcing the rights of third parties.
Most worryingly, the lack of convenient and cheap legitimate digital distribution channels risks undermining public support for copyright law. Our research shows that consumers blame rightsholders for failing to meet market demand, and this encourages a social norm that infringing copyright, while illegal, is not morally wrongful. The implications are as simple as they are profound: Australia should not take steps to increase the strength of copyright law at this time. The interests of the public and those of rightsholders align better when there is effective competition in distribution channels and consumers can legitimately get access to content. While foreign rightsholders are seeking enhanced protection for their interests, increasing enforcement is likely to increase their ability to engage in lucrative geographical price discrimination, particularly for premium content. This is only likely to increase the degree to which Australian consumers feel that their interests are not being met and, consequently, to further undermine the legitimacy of copyright law. If consumers are to respect copyright law, increasing sanctions for infringement without enhancing access and competition in legitimate distribution channels could be dangerously counter-productive. We suggest that rightsholders’ best strategy for addressing infringement in Australia at this time is to ensure that Australians can access copyright goods in a lawful manner that is timely, affordable, convenient, and fair.


Abstract:

Purpose: Small field x-ray beam dosimetry is difficult due to a lack of lateral electronic equilibrium, source occlusion, high dose gradients and detector volume averaging. Currently there is no single definitive detector recommended for small field dosimetry. The objective of this work was to evaluate the performance of a new commercial synthetic diamond detector, namely the PTW 60019 microDiamond, for the dosimetry of small x-ray fields as used in stereotactic radiosurgery (SRS). Methods: Small field sizes were defined by BrainLAB circular cones (4–30 mm diameter) on a Novalis Trilogy linear accelerator, using the 6 MV SRS x-ray beam mode for all measurements. Percentage depth doses were measured and compared to those from an IBA SFD diode and a PTW 60012 E diode. Cross profiles were measured and compared to an IBA SFD diode. Field factors, $\Omega_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$, were calculated by Monte Carlo methods using BEAMnrc, and correction factors, $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$, were derived for the PTW 60019 microDiamond detector. Results: For the small fields of 4 to 30 mm diameter, there were dose differences in the PDDs of up to 1.5% when compared to the IBA SFD and PTW 60012 E diode detectors. For the cross profile measurements, the penumbra values varied depending upon the orientation of the detector. The field factors were calculated for these field diameters at a depth of 1.4 cm in water and were within 2.7% of published values for a similar linear accelerator. The correction factors, $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$, were derived for the PTW 60019 microDiamond detector. Conclusions: We conclude that the new PTW 60019 microDiamond detector is generally suitable for relative dosimetry in small 6 MV SRS beams on a Novalis Trilogy linear accelerator equipped with circular cones.
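The notation above corresponds to the small-field formalism of Alfonso et al. (2008), in which the field factor converts absorbed dose to water between the clinical field $f_{clin}$ and the machine-specific reference field $f_{msr}$, and the detector-specific correction factor accounts for the detector's non-ideal response in the small field. As a reference sketch of those relations:

```latex
% Field factor: ratio of absorbed dose to water in the clinical field
% f_clin (beam quality Q_clin) to that in the machine-specific
% reference field f_msr (beam quality Q_msr).
\Omega_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}
    = \frac{D_{w,Q_{clin}}^{f_{clin}}}{D_{w,Q_{msr}}^{f_{msr}}}
    = \frac{M_{Q_{clin}}^{f_{clin}}}{M_{Q_{msr}}^{f_{msr}}}\,
      k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}
% where M denotes the detector reading, so k corrects the raw reading
% ratio for the detector's perturbation of the small field.
```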


Abstract:

Existing planning theories tend to be limited in their analytical scope and often fail to account for the impact of the many interactions between the multitude of stakeholders involved in strategic planning processes. Although many theorists rejected structural–functional approaches from the 1970s, this article argues that many structural–functional concepts remain relevant and useful to planning practitioners. In fact, structural–functional approaches are highly useful and practical when used as a foundation for the systemic analysis of real-world, multi-layered, complex planning systems to support evidence-based governance reform. Such approaches provide a logical and systematic basis for analysing the wider governance of strategic planning systems, one that is grounded in systems theory and complementary to existing theories of complexity and planning. While we do not propose their use as a grand theory of planning, this article discusses how structural–functional concepts and approaches might be applied to underpin a practical analysis of the complex decision-making arrangements that drive planning practice, and to provide the evidence needed to target reform of poorly performing arrangements.


Abstract:

Descriptions of patients' injuries are recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for the mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families: decision tree, probabilistic, neural network, instance-based, ensemble-based and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description are removed. Misspellings are corrected by finding and replacing each misspelt word with a sound-alike word. Meaningful phrases are identified and kept intact, rather than having parts of a phrase removed as stop words. Abbreviations appearing in many variant forms are manually identified and normalised to a single form. Clustering is used to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5,000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterised as high-dimensional and sparse: few features are irrelevant, but many features are correlated with one another. Therefore, matrix factorization techniques such as Singular Value Decomposition (SVD) and Non-Negative Matrix Factorization (NNMF) are used to map the processed feature space to a lower-dimensional feature space, and classifiers are built on the reduced feature space. In experiments, a set of tests is conducted to establish which classification method is best for medical text classification. The Non-Negative Matrix Factorization with Support Vector Machine method achieves 93% precision, which is higher than all the traditional classifiers tested. We also find that TF/IDF weighting, which works well for long text classification, is inferior to binary weighting in short document classification. Another finding is that the top-n terms should be removed in consultation with medical experts, as their removal affects the classification performance.
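A minimal sketch of the best-performing combination reported above, binary term weighting, NNMF dimensionality reduction and a linear Support Vector Machine, is given below using scikit-learn. It illustrates the technique rather than the authors' pipeline; the function name, number of components and commented usage names are hypothetical.

```python
# Sketch: binary term weighting -> NMF topic space -> linear SVM,
# mirroring the NNMF + SVM combination described in the abstract.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def build_injury_text_classifier(n_topics=100):
    return make_pipeline(
        # Binary weighting outperformed TF/IDF on these short documents.
        CountVectorizer(binary=True),
        # NMF keeps the factorization non-negative, suiting count data.
        NMF(n_components=n_topics, init="nndsvd", max_iter=400),
        LinearSVC(),
    )

# Usage (texts: list of narrative injury descriptions; codes: target labels):
# clf = build_injury_text_classifier().fit(texts, codes)
# predicted = clf.predict(new_texts)
```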


Abstract:

Purpose: Astigmatism is an important refractive condition in children. However, the functional impact of uncorrected astigmatism in this population is not well established, particularly with regard to academic performance. This study investigated the impact of simulated bilateral astigmatism on academic-related tasks before and after sustained near work in children. Methods: Twenty visually normal children (mean age: 10.8 ± 0.7 years; 6 males and 14 females) completed a range of standardised academic-related tests with and without 1.50 D of simulated bilateral astigmatism (with both the academic-related tests and the visual condition administered in a randomised order). The simulated astigmatism was induced using a positive cylindrical lens while maintaining a plano spherical equivalent. Performance was assessed before and after 20 minutes of sustained near work, during two separate testing sessions. Academic-related measures included a standardised reading test (the Neale Analysis of Reading Ability), visual information processing tests (the Coding and Symbol Search subtests from the Wechsler Intelligence Scale for Children) and a reading-related eye movement test (the Developmental Eye Movement test). Each participant was systematically assigned either with-the-rule (WTR, axis 180°) or against-the-rule (ATR, axis 90°) simulated astigmatism to evaluate the influence of axis orientation on any decrements in performance. Results: Reading, visual information processing and reading-related eye movement performance were all significantly impaired by both simulated bilateral astigmatism (p<0.001) and sustained near work (p<0.001); however, there was no significant interaction between these factors (p>0.05). Simulated astigmatism led to a reduction of between 5% and 12% in performance across the academic-related outcome measures, but there was no significant effect of the axis (WTR or ATR) of astigmatism (p>0.05). Conclusion: Simulated bilateral astigmatism impaired children’s performance on a range of academic-related outcome measures irrespective of the orientation of the astigmatism. These findings have implications for the clinical management of non-amblyogenic levels of astigmatism in relation to academic performance in children. Correction of low to moderate levels of astigmatism may improve the functional performance of children in the classroom.
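The blur manipulation relies on the standard spherical-equivalent relation SE = S + C/2: to induce a cylinder of power C while holding the spherical equivalent at plano, the cylinder is paired with a sphere of −C/2. A minimal sketch of that arithmetic follows; the function name is hypothetical, not from the study.

```python
# Sketch of the lens combination that simulates astigmatism while
# holding the spherical equivalent at plano (SE = S + C/2 = 0).
def plano_se_astigmatism_lens(cyl_power: float) -> tuple[float, float]:
    """Return (sphere, cylinder) in dioptres for a plano spherical equivalent."""
    sphere = -cyl_power / 2.0
    return sphere, cyl_power

# For the 1.50 D condition used in the study:
# plano_se_astigmatism_lens(1.50) -> (-0.75, 1.50)
```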


Abstract:

Purpose: To investigate the effect of different levels of refractive blur on real-world driving performance measured under day and nighttime conditions. Methods: Participants included 12 visually normal young adults (mean age = 25.8 ± 5.2 years) who drove an instrumented research vehicle around a 4 km closed road circuit with three different levels of binocular spherical refractive blur (+0.50 diopter sphere [DS], +1.00 DS, +2.00 DS) compared with a baseline condition. The subjects wore optimal spherocylinder correction, and the additional blur lenses were mounted in modified full-field goggles; the order of testing of the blur conditions was randomized. Driving performance was assessed in two separate sessions under day and nighttime conditions and included measures of the number of road signs recognized, hazard detection and avoidance, gap detection, lane-keeping, sign recognition distance, speed, and time to complete the course. Results: Refractive blur and time of day had significant effects on driving performance (P < 0.05): increasing blur and nighttime driving reduced performance on all driving tasks except gap detection and lane-keeping. There was also a significant interaction between blur and time of day (P < 0.05), such that the effects of blur were exacerbated under nighttime driving conditions; performance differences were evident even for +0.50 DS blur relative to baseline for some measures. Conclusions: The effects of blur were greatest under nighttime conditions, even for levels of binocular refractive blur as low as +0.50 DS. These results emphasize the importance of accurate and up-to-date refractive correction of even low levels of refractive error when driving at night.


Abstract:

This thesis investigated in detail the physics of the small X-ray fields used in radiotherapy treatments. As a result of this work, the ability to accurately measure dose from very small X-ray fields has been improved in several ways: by introducing the concept of a “very small field”, which scientifically quantifies when highly accurate measurements are required, and by the invention of a new detector that responds the same way in very small fields as in standard fields.


Abstract:

Purpose: To provide a comprehensive overview of research examining the impact of astigmatism on clinical and functional measures of vision, the short- and longer-term adaptations to astigmatism that occur in the visual system, and the currently available clinical options for the management of patients with astigmatism. Recent findings: The presence of astigmatism can lead to substantial reductions in visual performance in a variety of clinical vision measures and functional visual tasks. Recent evidence demonstrates that astigmatic blur results in short-term adaptations in the visual system that appear to reduce the perceived impact of astigmatism on vision. In the longer term, uncorrected astigmatism in childhood can also significantly impact visual development, resulting in amblyopia. Astigmatism is also associated with the development of spherical refractive errors. Although the clinical correction of small magnitudes of astigmatism is relatively straightforward, the precise, reliable correction of astigmatism (particularly high astigmatism) can be challenging. A wide variety of refractive corrections are now available for the patient with astigmatism, including spectacle, contact lens and surgical options. Conclusion: Astigmatism is one of the most common refractive errors managed in clinical ophthalmic practice. The significant visual and functional impacts of astigmatism emphasise the importance of its reliable clinical management. With continued improvements in ocular measurement techniques and developments in a range of different refractive correction technologies, the future promises more precise and comprehensive correction options for astigmatic patients.


Abstract:

Purpose: To examine macular retinal thickness and retinal layer thickness with spectral domain optical coherence tomography (OCT) in a population of children with normal ocular health and minimal refractive errors. Methods: High-resolution macular OCT scans from 196 children aged from 4 to 12 years (mean age 8 ± 2 years) were analysed to determine total retinal thickness and the thickness of six different retinal layers across the central 5 mm of the posterior pole. Automated segmentation with manual correction was used to derive retinal thickness values. Results: The mean total retinal thickness in the central 1 mm foveal zone was 255 ± 16 μm, and this increased significantly with age (a mean increase of 1.8 μm per year) in childhood (p<0.001). Age-related increases in the thickness of some retinal layers were also observed, with the changes of highest statistical significance found in the outer retinal layers of the central foveal region (p<0.01). Significant topographical variations in the thickness of each of the retinal layers were also observed (p<0.001). Conclusions: Small-magnitude, statistically significant increases in total retinal thickness and retinal layer thickness occur from early childhood to adolescence. The most prominent changes appear to occur in the outer retinal layers of the central fovea.


Abstract:

Purpose: Race appears to be associated with myopiogenesis, with East Asians showing a high prevalence of myopia. Given structural variations in the eye, it is possible that retinal shapes differ between races. The purpose of this study was to quantify and compare retinal shapes between racial groups using peripheral refraction (PR) and peripheral eye lengths (PEL). Methods: A Shin-Nippon SRW5000 autorefractor and a Haag-Streit Lenstar LS900 biometer measured PR and PEL, respectively, along horizontal (H) and vertical (V) fields out to ±35° in 5° steps in 29 Caucasian (CA), 16 South Asian (SA) and 23 East Asian (EA) young adults (spherical equivalent range +0.75 D to −5.00 D in all groups). Retinal vertex curvature Rv and asphericity Q were determined using two methods: (a) PR (Dunne): the Gullstrand-Emsley eye was modified according to each participant's intraocular lengths and anterior corneal curvature. Ray-tracing was performed at each angle through the stop, altering corneal asphericity until peripheral astigmatism matched the experimental measurements; retinal curvature, and hence the retinal co-ordinate intersection with the chief ray, was altered until the sagittal refraction matched its measurement. (b) PEL: ray-tracing was performed at each angle through the anterior corneal centre of curvature of the Gullstrand-Emsley eye. Ignoring lens refraction, retinal co-ordinates relative to the fovea were determined from the PEL and trigonometry. From the sets of retinal co-ordinates, conic retinal shapes were fitted in terms of Rv and Q. Repeated-measures ANOVAs were conducted on Rv and Q, and post hoc t-tests with Bonferroni correction were used to compare races. Results: In all racial groups, both methods showed greater Rv for the horizontal than for the vertical meridian, and greater Rv for myopes than for emmetropes. Rv was greater in EA than in CA (P=0.02), with Rv for SA being intermediate and not significantly different from either CA or EA. The PEL method yielded larger Rv than the PR method: PEL, EA vs CA: 87±13 vs 83±11 m⁻¹ (H), 79±13 vs 72±14 m⁻¹ (V); PR, EA vs CA: 79±10 vs 67±10 m⁻¹ (H), 71±17 vs 66±12 m⁻¹ (V). Q did not vary significantly with race. Conclusions: Estimates of Rv, but not of Q, varied significantly with race. The greater Rv found in EA than in CA and the comparatively high prevalence of myopia in many Asian countries may be related.
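The final fitting step, recovering the vertex curvature Rv and asphericity Q from a set of retinal co-ordinates, can be sketched with the standard conicoid equation y² = 2·r₀·z − (1 + Q)·z², where r₀ = 1/Rv is the vertex radius of curvature, z is axial distance from the retinal vertex and y is transverse distance. The sketch below is a generic least-squares illustration, not the study's code; function names and starting values are assumptions.

```python
# Sketch: fit a conic y^2 = 2*r0*z - (1 + Q)*z^2 to retinal co-ordinates,
# then report vertex curvature Rv = 1/r0 and asphericity Q.
import numpy as np
from scipy.optimize import curve_fit

def conic_y_squared(z, r0, Q):
    # Standard conicoid sag relation with vertex radius r0 and asphericity Q.
    return 2.0 * r0 * z - (1.0 + Q) * z**2

def fit_retinal_conic(z, y):
    """z, y: arrays of retinal co-ordinates in metres, vertex at the origin."""
    (r0, Q), _ = curve_fit(conic_y_squared, z, y**2, p0=(0.012, 0.0))
    return 1.0 / r0, Q  # vertex curvature Rv (m^-1) and asphericity Q

# Example with synthetic points on a conic with Rv = 85 m^-1, Q = -0.2:
# z = np.linspace(0.0, 0.004, 20)
# y = np.sqrt(conic_y_squared(z, 1 / 85, -0.2))
# Rv, Q = fit_retinal_conic(z, y)
```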