924 results for Mean Field Analysis


Relevance:

40.00%

Publisher:

Abstract:

In induction machines the tooth-frequency losses due to permeance variation constitute a significant portion of the total loss. In order to predict and estimate these losses it is essential to obtain a clear understanding of the no-load distribution of the air gap magnetic field and the magnitude of flux pulsation in both stator and rotor teeth. The existing theories and methods by which the air gap permeance variation in a doubly slotted structure is calculated are either empirical or restricted. The main objective of this thesis is to obtain a detailed analysis of the no-load air gap magnetic field distribution and of the effect of air gap geometry on the magnitude and waveform of the tooth flux pulsation. The detailed theoretical and experimental analysis of flux distribution presented in this thesis not only leads to a better understanding of the distribution of no-load losses but also provides a theoretical basis for calculating the losses with greater accuracy.
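
To make the permeance-wave idea concrete, the following minimal numerical sketch (not taken from the thesis) represents the doubly slotted air gap permeance as the product of stator and rotor slot-modulation terms, multiplies it by a fundamental MMF wave and inspects the resulting space-harmonic content. The slot numbers, ripple amplitudes and the multiplicative combination are illustrative assumptions only.

```python
import numpy as np

# Illustrative parameters (not taken from the thesis)
S1, S2 = 36, 28          # stator and rotor slot numbers
P0 = 1.0                 # mean specific permeance (normalised)
a1, a2 = 0.15, 0.12      # assumed slot-ripple amplitudes
p = 2                    # pole pairs
f = 50.0                 # supply frequency, Hz
slip = 0.02

theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
t = 0.0                                       # snapshot in time
omega_r = 2.0 * np.pi * f * (1.0 - slip) / p  # rotor mechanical speed

# Doubly slotted air-gap permeance, modelled (simplistically) as the
# product of a stator and a rotor slot-modulation function.
perm = (P0
        * (1.0 + a1 * np.cos(S1 * theta))
        * (1.0 + a2 * np.cos(S2 * (theta - omega_r * t))))

# Fundamental no-load MMF wave
F = np.cos(p * theta - 2.0 * np.pi * f * t)

# Air-gap flux density and its space-harmonic content; the slot harmonics
# appear near orders S1 +/- p and S2 +/- p.
B = F * perm
spectrum = np.abs(np.fft.rfft(B)) / len(theta) * 2.0
for k in np.argsort(spectrum)[::-1][:6]:
    print(f"space harmonic {k}: amplitude {spectrum[k]:.4f}")
```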

Relevance:

40.00%

Publisher:

Abstract:

A method is described which enables the spatial pattern of discrete objects in histological sections of brain tissue to be determined. The method can be applied to cell bodies, sections of blood vessels or the characteristic lesions which develop in the brain of patients with neurodegenerative disorders. The density of the histological feature under study is measured in a series of contiguous sample fields arranged in a grid or transect. Data from adjacent sample fields are added together to provide density data for larger field sizes. A plot of the variance/mean ratio (V/M) of the data versus field size reveals whether the objects are distributed randomly, uniformly or in clusters. If the objects are clustered, the analysis determines whether the clusters are randomly or regularly distributed and the mean size of the clusters. In addition, if two different histological features are clustered, the analysis can determine whether their clusters are in phase, out of phase or unrelated to each other. To illustrate the method, the spatial patterns of senile plaques and neurofibrillary tangles were studied in histological sections of brain tissue from patients with Alzheimer's disease.
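
A minimal sketch of the quadrat-combining V/M calculation described above, assuming counts of objects along a transect of contiguous fields; the counts, block sizes and function name are hypothetical. A V/M ratio near 1 suggests a random (Poisson) pattern, below 1 a uniform pattern, above 1 clustering, and the field size at which V/M peaks gives an indication of mean cluster size.

```python
import numpy as np

def variance_mean_ratios(counts, max_block=8):
    """Variance/mean (V/M) ratio of quadrat counts for increasing field sizes.

    counts: 1-D array of object counts in contiguous sample fields along a transect.
    Adjacent fields are summed in blocks of 1, 2, 4, ... to give larger field sizes,
    as described in the abstract.
    """
    counts = np.asarray(counts, dtype=float)
    results = {}
    block = 1
    while block <= max_block and len(counts) // block >= 2:
        n = (len(counts) // block) * block
        grouped = counts[:n].reshape(-1, block).sum(axis=1)
        results[block] = grouped.var(ddof=1) / grouped.mean()
        block *= 2
    return results

# Hypothetical counts of lesions in 32 contiguous sample fields, with clusters superimposed
rng = np.random.default_rng(0)
demo = rng.poisson(3, 32) + np.repeat([0, 6, 0, 6], 8)
print(variance_mean_ratios(demo))
```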

Relevance:

40.00%

Publisher:

Abstract:

Aeromonas genomes were investigated by restriction digestion of chromosomal DNA with the endonuclease XbaI, separation of the restriction fragments by pulsed field gel electrophoresis (PFGE) and principal components analysis (PCA) of the resulting separation patterns. A. salmonicida salmonicida isolates were unique amongst those investigated: their separation profiles were similar and all were characterised by a distinct absence of bands in the 250 kb region. Principal components analysis represented these strains as a clearly defined homogeneous group separated by insignificant Euclidean distances. However, A. salmonicida achromogenes isolates, in common with those of A. hydrophila and A. sobria, were shown by principal components analysis to be more heterogeneous in nature. Fragments from these isolates were more uniform in size distribution but, as demonstrated by the Euclidean distances obtained through PCA, potentially characteristic of each strain. Furthermore, passaging of Aeromonas isolates through an appropriate host did not greatly modify fragment separation profiles, indicative of the genomic stability of the test aeromonads and of the potential of restriction digestion/PFGE/PCA in Aeromonas typing.
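
The abstract does not give the numerical details, but the analysis can be pictured as PCA applied to a matrix describing each isolate's band pattern. The sketch below assumes a presence/absence coding of fragment-size bins; the matrix values and the two-component projection are invented purely for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.spatial.distance import pdist, squareform

# Hypothetical band-pattern matrix: rows = isolates, columns = fragment-size bins,
# entries = 1 if a band is present in that size bin after XbaI digestion/PFGE, else 0.
patterns = np.array([
    [1, 0, 1, 1, 0, 1, 0, 1],   # similar profiles (homogeneous group)
    [1, 0, 1, 1, 0, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 1, 1],
    [0, 1, 1, 0, 1, 0, 1, 0],   # more heterogeneous profiles
    [0, 1, 0, 1, 1, 0, 1, 1],
    [1, 1, 0, 0, 1, 1, 0, 1],
], dtype=float)

pca = PCA(n_components=2)
scores = pca.fit_transform(patterns)

# Euclidean distances between isolates in the plane of the first two components
dists = squareform(pdist(scores))
print("explained variance ratio:", pca.explained_variance_ratio_)
print("pairwise Euclidean distances:\n", np.round(dists, 2))
```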

Relevance:

40.00%

Publisher:

Abstract:

Plasmid constitutions of Aeromonas salmonicida isolates were characterised by flat-bed and pulsed field gel electrophoresis. Resolution of plasmids by pulsed field gel electrophoresis was greater and more consistent than that achieved by flat-bed gel electrophoresis. The number of plasmids separated by pulsed field gel electrophoresis varied between A. salmonicida isolates, with five being the most common number present in the isolates used in this study. Plasmid profiles were diverse and the reproducibility of the distances migrated facilitated the use of principal components analysis for the characterisation of the isolates. Isolates were grouped according to the number of plasmids supported. Further principal components analysis of groups of isolates supporting five and seven plasmids showed a spatial separation of plasmids based upon distance migrated. Principal components analysis of plasmid profiles and antimicrobial minimum inhibitory concentrations could not be correlated, suggesting that resistance to antimicrobial agents is not associated with either one plasmid or a particular plasmid constitution.
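
The comparison of plasmid profiles with antimicrobial minimum inhibitory concentrations can be sketched in one plausible form: test whether PCA scores of the migration-distance profiles correlate with MIC values. The data, the single-component projection and the choice of a Spearman test are illustrative assumptions, not the study's actual procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import spearmanr

# Hypothetical data: plasmid migration-distance profiles (rows = isolates) and
# minimum inhibitory concentrations (µg/ml) of one antimicrobial for the same isolates.
profiles = np.array([
    [12.1, 34.5, 56.0, 71.2, 88.3],
    [12.3, 34.1, 55.8, 70.9, 87.9],
    [11.8, 36.0, 54.2, 72.5, 89.1],
    [13.0, 33.2, 57.1, 69.8, 86.5],
    [12.5, 35.4, 55.0, 71.0, 88.0],
])
mic = np.array([0.5, 8.0, 1.0, 4.0, 2.0])

# First principal component of the plasmid profiles
pc1 = PCA(n_components=1).fit_transform(profiles).ravel()

rho, p = spearmanr(pc1, mic)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
# A non-significant correlation here would echo the abstract's conclusion that
# antimicrobial resistance is not tied to a particular plasmid constitution.
```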

Relevance:

40.00%

Publisher:

Abstract:

Purpose: To investigate the correlation between tests of visual function and perceived visual ability recorded with a 'quality-of-life' questionnaire for patients with central field loss. Method: 12 females and 7 males (mean age = 53.1 years; range = 23-80 years) with subfoveal neovascular membranes underwent a comprehensive assessment of visual function. Tests included unaided distance vision, high and low contrast distance logMAR visual acuity (VA), Pelli-Robson contrast sensitivity (at 1 m), near logMAR word VA and text reading speed. All tests were done both monocularly and binocularly. The patients also completed a 28-point questionnaire separated into a 'core' section consisting of general questions about perceived visual function and a 'module' section with specific questions on reading function. Results: Step-wise multiple regression analysis was used to determine which visual function tests were correlated with the patients' perceived visual function and to rank them in order of importance. The visual function test that explains most of the variance in both the 'core' score (66%) and the 'module' score (68%) of the questionnaire is low contrast VA in the better eye (P<0.001 in both cases). Further, the 'module' score was also significantly correlated (P<0.01) with distance logMAR VA in both the better and worse eye, and with near logMAR VA in both the better eye and binocularly. Conclusions: The best predictor of both perceived reading ability and of general perceived visual ability in this study is low contrast logMAR VA. The results highlight that distance VA is not the only relevant measure of visual function in relation to a patient's perceived visual performance and should not be considered the sole determinant of surgical or management success.
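
A hedged sketch of a forward step-wise multiple regression of the kind described, implemented with statsmodels; the predictor names, entry criterion (p < 0.05) and the simulated data are assumptions for illustration only, not the study's dataset or software.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(y, X, alpha=0.05):
    """Simple forward step-wise OLS: repeatedly add the predictor with the
    smallest p-value, as long as it is significant at `alpha`."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
            pvals[cand] = model.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
        remaining.remove(best)
    return selected, sm.OLS(y, sm.add_constant(X[selected])).fit()

# Hypothetical data: questionnaire 'core' score and candidate visual function tests for 19 patients
rng = np.random.default_rng(1)
X = pd.DataFrame({
    "low_contrast_VA_better":  rng.normal(0.6, 0.2, 19),
    "high_contrast_VA_better": rng.normal(0.4, 0.2, 19),
    "contrast_sensitivity":    rng.normal(1.2, 0.3, 19),
    "reading_speed":           rng.normal(80, 20, 19),
})
core_score = 40 - 25 * X["low_contrast_VA_better"] + rng.normal(0, 3, 19)

chosen, fit = forward_stepwise(core_score, X)
print("selected predictors:", chosen)
print("variance explained (R^2):", round(fit.rsquared, 2))  # cf. the 66% reported
```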

Relevance:

40.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY WITH PRIOR ARRANGEMENT

Relevance:

40.00%

Publisher:

Abstract:

Predicting future need for water resources has traditionally been, at best, a crude mixture of art and science. This has prevented the evaluation of water need from being carried out in either a consistent or comprehensive manner. This inconsistent and somewhat arbitrary approach to water resources planning led to well publicised premature developments in the 1970s and 1980s, but privatisation of the Water Industry, including the creation of the Office of Water Services and the National Rivers Authority in 1989, turned the tide of resource planning to the point where funding of schemes and their justification by the Regulators could no longer be assumed. Furthermore, considerable areas of uncertainty were beginning to enter the debate and complicate the assessment. It was also no longer appropriate to consider that contingencies would continue to lie solely on the demand side of the equation. An inability to calculate the balance between supply and demand may mean an inability to meet standards of service or, arguably worse, an excessive provision of water resources and excessive costs to customers. The United Kingdom Water Industry Research Limited (UKWIR) Headroom project in 1998 provided a simple methodology for the calculation of planning margins. This methodology, although well received, was not, however, accepted by the Regulators as a tool sufficient to promote resource development. This thesis begins by considering the history of water resource planning in the UK, moving on to discuss events following privatisation of the water industry post-1985. The middle section of the research forms the bulk of the original work and provides a scoping exercise which reveals a catalogue of uncertainties prevalent within the supply-demand balance. Each of these uncertainties is considered in terms of materiality, scope, and whether it can be quantified within a risk analysis package. Many of the areas of uncertainty identified would merit further research. A workable, yet robust, methodology for evaluating the balance between water resources and water demands using a spreadsheet-based risk analysis package is presented. The technique involves statistical sampling and simulation such that samples are taken from input distributions on both the supply and demand side of the equation and the imbalance between supply and demand is calculated in the form of an output distribution. The percentiles of the output distribution represent different standards of service to the customer. The model allows dependencies between distributions to be considered, improved estimates of uncertainty to be assessed, and the impact of uncertain solutions to any imbalance to be calculated directly. The method is considered a significant leap forward in the field of water resource planning.
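
The supply-demand risk analysis described above can be sketched as a simple Monte Carlo simulation: sample the uncertain supply- and demand-side components, form the imbalance, and read standards of service off the percentiles of the output distribution. The distributions, component names and figures below are purely illustrative and not those of the thesis's spreadsheet model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000   # number of Monte Carlo samples

# Hypothetical input distributions (Ml/d); in practice each component of supply
# and demand (deployable output, outage, leakage, per-capita consumption, ...)
# would have its own distribution informed by the scoping of uncertainties.
deployable_output = rng.normal(520.0, 15.0, n)          # available water resources
outage            = rng.triangular(5.0, 10.0, 25.0, n)
demand            = rng.normal(480.0, 20.0, n)
climate_uplift    = rng.uniform(0.0, 15.0, n)           # dependencies could be added
                                                        # via correlated sampling

supply = deployable_output - outage
imbalance = supply - (demand + climate_uplift)          # output distribution

# Percentiles of the output distribution correspond to standards of service:
# e.g. the 5th percentile is the margin exceeded in 95% of simulated years.
for pct in (1, 5, 10, 50):
    print(f"{pct:>2}th percentile of supply-demand balance: "
          f"{np.percentile(imbalance, pct):6.1f} Ml/d")
print("probability of a deficit:", np.mean(imbalance < 0.0))
```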

Relevance:

40.00%

Publisher:

Abstract:

Due to the failure of PRARE, the orbital accuracy of ERS-1 is typically 10-15 cm radially, compared with 3-4 cm for TOPEX/Poseidon. To gain the most from these simultaneous datasets it is necessary to improve the orbital accuracy of ERS-1 so that it is commensurate with that of TOPEX/Poseidon. For the integration of these two datasets it is also necessary to determine the altimeter and sea state biases for each of the satellites. Several models for the sea state bias of ERS-1 are considered by analysis of the ERS-1 single-satellite crossovers. The model adopted treats the sea state bias as a percentage of the significant wave height, namely 5.95%. The removal of ERS-1 orbit error and the recovery of an ERS-1 - TOPEX/Poseidon relative bias are both achieved by analysis of dual-crossover residuals. The gravitational-field-based radial orbit error is modelled by a finite Fourier expansion series with the dominant frequencies determined by analysis of the JGM-2 covariance matrix. Periodic and secular terms to model the errors due to atmospheric density, solar radiation pressure and initial state vector mis-modelling are also solved for. Validation of the dataset unification consists of comparing the mean sea surface topographies and annual variabilities derived from both the corrected and uncorrected ERS-1 orbits with those derived from TOPEX/Poseidon. The global and regional geographically fixed/variable orbit errors are also analysed pre- and post-correction, and a significant reduction is noted. Finally, the use of dual/single-satellite crossovers and repeat pass data for the calibration of ERS-2 with respect to ERS-1 and TOPEX/Poseidon is shown by calculating the ERS-1/2 sea state and relative biases.
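
The removal of the gravitational-field-based radial orbit error can be pictured as a least-squares fit of a finite Fourier expansion to crossover residuals. The sketch below is a toy version with invented residuals and only two retained frequencies; the real analysis selects the dominant frequencies from the JGM-2 covariance matrix and additionally solves for atmospheric-density, radiation-pressure and state-vector terms.

```python
import numpy as np

# Hypothetical dual-crossover residuals (metres) sampled at the argument of
# latitude u of ERS-1 at each crossover location.
rng = np.random.default_rng(7)
u = rng.uniform(0.0, 2.0 * np.pi, 500)
residuals = (0.08 * np.cos(u) - 0.05 * np.sin(u)
             + 0.03 * np.cos(2 * u) + rng.normal(0.0, 0.02, 500))

freqs = [1, 2]   # cycles per revolution retained in the finite Fourier expansion

# Design matrix: bias term plus cos/sin terms at each retained frequency
A = np.column_stack([np.ones_like(u)]
                    + [f(k * u) for k in freqs for f in (np.cos, np.sin)])
coeffs, *_ = np.linalg.lstsq(A, residuals, rcond=None)

orbit_error = A @ coeffs                 # modelled radial orbit error at crossovers
corrected = residuals - orbit_error
print("RMS before correction: %.3f m" % residuals.std())
print("RMS after  correction: %.3f m" % corrected.std())
```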

Relevance:

40.00%

Publisher:

Abstract:

Background - An evaluation of standard automated perimetry (SAP) and short wavelength automated perimetry (SWAP) for the central 10–2 visual field test procedure in patients with age-related macular degeneration (AMD) is presented in order to determine methods of quantifying the central sensitivity loss in patients at various stages of AMD. Methods - 10–2 SAP and SWAP Humphrey visual fields and stereoscopic fundus photographs were collected in 27 eyes of 27 patients with AMD and 22 eyes of 22 normal subjects. Results - Mean Deviation (MD) and Pattern Standard Deviation (PSD) varied significantly with stage of disease in SAP (both p<0.001) and SWAP (both p<0.001), but post hoc analysis revealed overlap of functional values among stages. In SWAP, indices of focal loss were more sensitive in detecting differences between AMD and normal eyes. SWAP defects were greater in depth and area than those in SAP. Central sensitivity (within 1°) changed by -3.9 and -4.9 dB per stage in SAP and SWAP, respectively. Based on defect maps, an AMD Severity Index was derived. Conclusions - Global indices of focal loss were more sensitive in detecting early stage AMD from normal. The SWAP sensitivity decline with advancing stage of AMD was greater than that in SAP. A new AMD Severity Index quantifies visual field defects on a continuous scale. Although not all patients are suitable for SWAP examinations, SWAP is of value as a tool in research studies of visual loss in AMD.
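
For readers unfamiliar with the global indices, the sketch below shows simplified, unweighted analogues of MD and PSD computed from point-wise deviations of a 10-2 field. The Humphrey instrument's own indices apply location-dependent variance weights, so this is illustrative only and the example field is invented.

```python
import numpy as np

def global_indices(deviations):
    """Simplified, unweighted analogues of the Humphrey global indices.

    deviations: total-deviation values (dB) at each 10-2 test location,
    i.e. measured sensitivity minus the age-matched normal value.
    MD  ~ average departure from normal (negative = overall loss).
    PSD ~ non-uniformity of that departure (high = focal loss).
    """
    d = np.asarray(deviations, dtype=float)
    md = d.mean()
    psd = np.sqrt(np.mean((d - md) ** 2))
    return md, psd

# Hypothetical 10-2 field (68 test points) with a deep central defect
field = np.full(68, -1.0)      # mild diffuse depression everywhere
field[:12] = -12.0             # central cluster of deeply depressed points
print("MD = %.1f dB, PSD = %.1f dB" % global_indices(field))
```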

Relevance:

40.00%

Publisher:

Abstract:

Purpose: To determine methods of quantifying the sensitivity loss in the central 10° visual field in a cross section of patients at various stages of age-related macular degeneration (AMD). Methods: Standard and short-wavelength automated perimetry (SAP and SWAP) visual fields were collected using program 10-2 of the Humphrey Field Analyzer in 44 eyes of 27 patients with AMD and 41 eyes of 22 normal subjects. Stereoscopic fundus photographs were graded by two independent observers and the stage of disease determined. Global indices were compared for their ability to delineate the normal visual field from early stages of AMD and to differentiate between stages. Results: Mean Deviation (MD) and Pattern Standard Deviation (PSD) varied significantly with stage of disease in SAP (both p<0.001) and SWAP (both p<0.001), but post-hoc analysis revealed overlap of functional values between stages. The global indices of focal loss, PSD and local spatial variability (LSV), were the most sensitive to differences between normal subjects and early stage AMD patients in SAP and SWAP, respectively. Overall, defects were confined to the central 5°. SWAP defects were consistently greater in depth and area than those in SAP. The region of the 10° field most vulnerable to sensitivity loss with increasing stage of AMD was the central 1°, in which the sensitivity decline was -4.8 dB per stage in SAP and -4.9 dB per stage in SWAP. Based on the pattern deviation defect maps, a severity index of AMD visual field loss was derived. Threshold variability was considerably increased in late stage AMD eyes. Conclusions: Global indices of focal loss were more sensitive in detecting early stage AMD from normal. The sensitivity decline with advancing stage of AMD was greater in SWAP than in SAP; however, the trend was not strong across all stages of disease. The less commonly used index LSV represents a relatively unmanipulated statistical summary measure of focal loss. A new severity index is described which is sensitive to visual field change in AMD, measures visual field defects on a continuous scale and may serve as a useful measure of functional change in AMD in longitudinal studies. Keywords: visual fields • age-related macular degeneration • perimetry
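
The per-stage decline quoted above amounts to the slope of a regression of central sensitivity on disease stage; a minimal sketch with invented values is given below, assuming stages coded 0 (normal) to 4.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical mean sensitivities (dB) of the central 1° points for eyes graded
# at AMD stages 0-4; illustrates estimating a per-stage decline such as the
# -4.8 dB/stage reported for SAP.
stage       = np.array([0, 0, 1, 1, 2, 2, 3, 3, 4, 4])
sensitivity = np.array([33.1, 32.4, 29.0, 28.2, 24.5, 23.1, 19.6, 18.2, 14.0, 13.5])

fit = linregress(stage, sensitivity)
print(f"sensitivity decline: {fit.slope:.1f} dB per stage "
      f"(r = {fit.rvalue:.2f}, p = {fit.pvalue:.3g})")
```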

Relevance:

40.00%

Publisher:

Abstract:

PURPOSE: To validate a new miniaturised, open-field wavefront device which has been developed with the capacity to be attached to an ophthalmic surgical microscope or slit-lamp. SETTING: Solihull Hospital and Aston University, Birmingham, UK. DESIGN: Comparative non-interventional study. METHODS: The dynamic range of the Aston Aberrometer was assessed using a calibrated model eye. The validity of the Aston Aberrometer was assessed against a conventional desk-mounted Shack-Hartmann aberrometer (Topcon KR1W) by measuring the refractive error and higher order aberrations of 75 dilated eyes with both instruments in random order. The Aston Aberrometer measurements were repeated five times to assess intra-session repeatability. Data were converted to vector form for analysis. RESULTS: The Aston Aberrometer had a large dynamic range of at least +21.0 D to -25.0 D. It gave similar measurements to the conventional aberrometer for mean spherical equivalent (mean difference ± 95% confidence interval: 0.02 ± 0.49 D; correlation: r=0.995, p<0.001), astigmatic components (J0: 0.02 ± 0.15 D; r=0.977, p<0.001; J45: 0.03 ± 0.28 D; r=0.666, p<0.001) and higher order aberration RMS (0.02 ± 0.20 D; r=0.620, p<0.001). Intraclass correlation coefficient assessments of intra-session repeatability for the Aston Aberrometer were excellent (spherical equivalent = 1.000, p<0.001; astigmatic components J0 = 0.998, p<0.001, J45 = 0.980, p<0.01; higher order aberration RMS = 0.961, p<0.001). CONCLUSIONS: The Aston Aberrometer gives valid and repeatable measures of refractive error and higher order aberrations over a large range. As it is able to measure continuously, it can provide direct feedback to surgeons during intraocular lens implantation and corneal surgery as to the optical status of the visual system.
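
The "vector form" conversion and the agreement statistics can be sketched as follows, assuming the usual Thibos power-vector decomposition (M, J0, J45) and Bland-Altman-style limits of agreement; the sample refractions below are invented, and the paper's own formulas are not given in the abstract.

```python
import numpy as np

def to_power_vector(sphere, cyl, axis_deg):
    """Convert a sphero-cylindrical refraction to power-vector components (M, J0, J45)
    following the Thibos convention; this is the usual meaning of 'converted to
    vector form', though the abstract does not state the exact formulas used."""
    ax = np.radians(axis_deg)
    M = sphere + cyl / 2.0
    J0 = -(cyl / 2.0) * np.cos(2.0 * ax)
    J45 = -(cyl / 2.0) * np.sin(2.0 * ax)
    return M, J0, J45

def agreement(a, b):
    """Mean difference and 95% limits of agreement between two instruments."""
    d = np.asarray(a) - np.asarray(b)
    return d.mean(), 1.96 * d.std(ddof=1)

# Hypothetical spherical equivalents (D) from the two aberrometers for five eyes
aston  = np.array([-2.25, 0.50, -5.75, 1.25, -0.75])
topcon = np.array([-2.00, 0.75, -5.50, 1.25, -1.00])
bias, loa = agreement(aston, topcon)
print(f"mean difference {bias:+.2f} D, 95% limits of agreement ±{loa:.2f} D")
print(to_power_vector(-2.00, -1.50, 90))   # example conversion
```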

Relevance:

40.00%

Publisher:

Abstract:

With few exceptions (e.g. Fincham & Clark, 2002; Lounsbury, 2002, 2007; Montgomery & Oliver, 2007), we know little about how emerging professions, such as management consulting, professionalize and establish their services as a taken-for-granted element of social life. This is surprising given that professionals have long been recognized as “institutional agents” (DiMaggio & Powell, 1983; Scott, 2008) (see Chapter 17) and professionalization projects have been closely associated with institutionalization (DiMaggio, 1991). Therefore, in this chapter we take a closer look at a specific type of entrepreneurship in PSFs; drawing on the concept of “institutional entrepreneurship” (DiMaggio, 1988; Garud, Hardy, & Maguire, 2007; Hardy & Maguire, 2008) we describe some generic strategies by which proto-professions can enhance their “institutional capital” (Oliver, 1997), that is, their capacity to extract institutionally contingent resources such as legitimacy, reputation, or client relationships from their environment.

Relevance:

40.00%

Publisher:

Abstract:

The concept of plagiarism is not uncommonly associated with the concept of intellectual property, for both historical and legal reasons: the approach to the ownership of 'moral', nonmaterial goods has evolved into the right to individual property, and consequently a need arose to establish a legal framework to cope with the infringement of those rights. The solution to plagiarism therefore falls most often under two categories: ethical and legal. On the ethical side, education and intercultural studies have addressed plagiarism critically, not only as a means to improve academic ethics policies (PlagiarismAdvice.org, 2008), but mainly to demonstrate that, if anything, the concept of plagiarism is far from being universal (Howard & Robillard, 2008). Even if in different ways, Howard (1995) and Scollon (1994, 1995) argued, and Angèlil-Carter (2000) and Pecorari (2008) later emphasised, that the concept of plagiarism cannot be studied on the assumption that one definition is clearly understandable by everyone. Scollon (1994, 1995), for example, claimed that authorship attribution is particularly a problem in non-native writing in English, and so did Pecorari (2008) in her comprehensive analysis of academic plagiarism. If among higher education students plagiarism is often a problem of literacy, with prior, conflicting social discourses that may interfere with academic discourse, as Angèlil-Carter (2000) demonstrates, we then have to aver that a distinction should be made between intentional and inadvertent plagiarism: plagiarism should be prosecuted when intentional, but if it is part of the learning process and results from the plagiarist's unfamiliarity with the text or topic it should be considered 'positive plagiarism' (Howard, 1995: 796) and hence not an offense. Determining the intention behind the instances of plagiarism therefore determines the nature of the disciplinary action adopted. Unfortunately, in order to demonstrate the intention to deceive and charge students with accusations of plagiarism, teachers necessarily have to position themselves as 'plagiarism police', although it has been argued otherwise (Robillard, 2008). Practice demonstrates that in their daily activities teachers find themselves required to command investigative skills and tools that they most often lack. We thus claim that the 'intention to deceive' cannot inevitably be dissociated from plagiarism as a legal issue, even if Garner (2009) asserts that generally plagiarism is immoral but not illegal, and Goldstein (2003) makes the same distinction. However, these claims, and the claim that only cases of copyright infringement tend to go to court, have recently been challenged, mainly by forensic linguists, who have been actively involved in cases of plagiarism. Turell (2008), for instance, demonstrated that plagiarism is often connoted with an illegal appropriation of ideas. Previously, she (Turell, 2004) had demonstrated, by comparison of four translations of Shakespeare's Julius Caesar into Spanish, that linguistic evidence is able to demonstrate instances of plagiarism. This challenge is also reinforced by practice in international organisations, such as the IEEE, to whom plagiarism potentially has 'severe ethical and legal consequences' (IEEE, 2006: 57). What the plagiarism definitions used by publishers and organisations have in common – and what academia usually lacks – is their focus on the legal nature of plagiarism.
We speculate that this is due to the relation they intentionally establish with copyright laws, whereas in education the focus tends to shift from the legal to the ethical aspects. However, the number of plagiarism cases taken to court is very small, and jurisprudence is still being developed on the topic. In countries within the Civil Law tradition, Turell (2008) claims, (forensic) linguists are seldom called upon as expert witnesses in cases of plagiarism, either because plagiarists are rarely taken to court or because there is little tradition of accepting linguistic evidence. In spite of the investigative and evidential potential of forensic linguistics to demonstrate the plagiarist's intention or otherwise, this potential is restricted by the ability to identify a text as being suspect of plagiarism. In an era of such massive textual production, 'policing' plagiarism thus becomes an extraordinarily difficult task without the assistance of plagiarism detection systems. Although plagiarism detection has attracted the attention of computer engineers and software developers for years, a lot of research is still needed. Given the investigative nature of academic plagiarism, plagiarism detection has of necessity to consider not only concepts of education and computational linguistics, but also forensic linguistics, especially if it is to counter claims of being a 'simplistic response' (Robillard & Howard, 2008). In this paper, we use a corpus of essays written by university students who were accused of plagiarism to demonstrate that a forensic linguistic analysis of improper paraphrasing in suspect texts has the potential to identify and provide evidence of intention. A linguistic analysis of the corpus texts shows that the plagiarist acts on the paradigmatic axis to replace relevant lexical items with a related word from the same semantic field, i.e. a synonym, a subordinate, a superordinate, etc. In other words, relevant lexical items were replaced with related, but not identical, ones. Additionally, the analysis demonstrates that the word order is often changed intentionally to disguise the borrowing. On the other hand, the linguistic analysis of linking and explanatory verbs (i.e. referencing verbs) and prepositions shows that these have the potential to discriminate between instances of 'patchwriting' and instances of plagiarism. This research demonstrates that the referencing verbs are borrowed from the original in an attempt to construct the new text cohesively when the plagiarism is inadvertent, and that the plagiarist has made an effort to prevent the reader from identifying the text as plagiarism when it is intentional. In some of these cases, the referencing elements prove able to identify direct quotations and thus 'betray' and denounce plagiarism. Finally, we demonstrate that a forensic linguistic analysis of these verbs is critical to allow detection software to identify them as proper paraphrasing and not – mistakenly and simplistically – as plagiarism.
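
One step of the analysis described above, the detection of paradigmatic substitutions, can be sketched with WordNet lookups: flag word pairs in aligned text where a lexical item has been replaced by a synonym, a hyponym ('subordinate') or a hypernym ('superordinate'). The sentence pair, helper name and use of NLTK's WordNet are illustrative assumptions; the study's actual analysis is manual and considerably richer.

```python
from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

def semantically_related(word_a, word_b):
    """True if word_b is a synonym, hyponym or hypernym of word_a in WordNet."""
    for syn in wn.synsets(word_a):
        related = set(syn.lemma_names())
        for h in syn.hypernyms() + syn.hyponyms():
            related.update(h.lemma_names())
        if word_b.lower() in (w.lower() for w in related):
            return True
    return False

# Hypothetical aligned word sequences from an original and a suspect text
original = "the findings reveal a significant error in the model".split()
suspect  = "the results reveal a considerable mistake in the model".split()

for w1, w2 in zip(original, suspect):
    if w1 != w2 and semantically_related(w1, w2):
        print(f"possible paradigmatic substitution: {w1!r} -> {w2!r}")
```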

Relevance:

40.00%

Publisher:

Abstract:

Objective: The purpose of this study was to examine the effectiveness of a new analysis method of mfVEP objective perimetry in the early detection of glaucomatous visual field defects compared to the gold standard technique. Methods and patients: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspect patients (38 eyes). All subjects underwent two tests in one session: a standard 24-2 visual field test with the Humphrey Field Analyzer and an mfVEP test. Analysis of the mfVEP results was carried out using the new analysis protocol: the hemifield sector analysis protocol. Results: Analysis of the mfVEP showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the three groups (analysis of variance, P<0.001, with 95% confidence intervals of 2.82-2.89 for the normal group, 2.25-2.29 for the glaucoma suspect group and 1.67-1.73 for the glaucoma group). The difference between superior and inferior hemifield sectors and hemi-rings was statistically significant in 11/11 pairs of sectors and hemi-rings in the glaucoma group (t-test, P<0.001), in 5/11 pairs in the glaucoma suspect group (t-test, P<0.01), and in only 1/11 pairs in the normal control group (t-test, P<0.9). The sensitivity and specificity of the hemifield sector analysis protocol in detecting glaucoma were 97% and 86%, respectively, and 89% and 79% in glaucoma suspects. These results showed that the new analysis protocol was able to confirm existing visual field defects detected by standard perimetry, to differentiate between the three study groups with a clear distinction between normal patients and those with suspected glaucoma, and to detect early visual field changes not detected by standard perimetry. In addition, the distinction between normal and glaucoma patients was especially clear and significant using this analysis. Conclusion: The new hemifield sector analysis protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. The protocol provides information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucomatous visual field loss. The intersector analysis protocol can detect early field changes not detected by the standard Humphrey Field Analyzer test. © 2013 Mousa et al, publisher and licensee Dove Medical Press Ltd.
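
A hedged sketch of the kind of computation behind the hemifield comparison: an asymmetry score from paired superior/inferior sector SNRs, and sensitivity/specificity for a chosen classification threshold. The scoring rule, threshold and data below are assumptions for illustration only, not the protocol's actual definition.

```python
import numpy as np

def hemifield_asymmetry(superior_snr, inferior_snr):
    """Mean absolute superior-minus-inferior SNR difference across paired mfVEP sectors.

    Each entry is the signal-to-noise ratio of one sector; paired sectors mirror
    each other across the horizontal midline."""
    return float(np.mean(np.abs(np.asarray(superior_snr) - np.asarray(inferior_snr))))

def sensitivity_specificity(scores, labels, threshold):
    """Sensitivity/specificity of classifying eyes as glaucomatous when the
    asymmetry score exceeds `threshold` (labels: 1 = glaucoma, 0 = normal)."""
    scores, labels = np.asarray(scores), np.asarray(labels)
    pred = scores > threshold
    sens = np.mean(pred[labels == 1])
    spec = np.mean(~pred[labels == 0])
    return sens, spec

# Hypothetical asymmetry scores for 6 normal and 6 glaucomatous eyes
scores = [0.1, 0.2, 0.15, 0.3, 0.25, 0.2, 0.6, 0.8, 0.7, 0.9, 0.5, 0.75]
labels = [0] * 6 + [1] * 6
print(sensitivity_specificity(scores, labels, threshold=0.4))
```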