901 results for high linear


Abstract:

The degradation of high voltage electrical insulation is a prime factor that can significantly influence the reliability and the costs of maintaining high voltage electricity networks. Little is known about the mechanism of localized degradation from corona discharges on the relatively new silicone rubber sheathed composite insulators that are now widely used in high voltage applications. This work focuses on the fundamental principles of electrical corona discharge phenomena to provide further insight into where damaging surface discharges may localize, and examines how these discharges may degrade the silicone rubber material. Although water drop corona has been identified by many authors as a major cause of deterioration of silicone rubber high voltage insulation, until now no thorough studies have been made of this phenomenon. Results are compared from systematic measurements, taken using modern digital instrumentation to simultaneously record the discharge current pulses and visible images associated with corona discharges between metal electrodes, between metal electrodes and water drops, and between water drops on the surface of silicone rubber insulation, over a range of 50 Hz voltages. Visual images of wet electrodes show how water drops can play a part in encouraging flashover, and the first reproducible visual images of water drop corona at the triple junction of water, air, and silicone rubber insulation are presented. A study of the atomic emission spectra of the corona produced by the discharge, from its onset up to and including spark-over, using a high resolution digital spectrometer with a fiber optic probe, provides further understanding of the roles of the active species of atoms and molecules produced by the discharge. These species may be responsible not only for chemical changes of insulator surfaces but may also contribute to the degradation of the metal fittings that support the high voltage insulators. Examples of real insulators and further work specific to the electrical power industry are discussed. A new design concept to prevent or reduce the damaging effects of water drop corona is also presented.

Abstract:

The issue of what an effective high quality / high equity education system might look like remains contested. Indeed, there is more educational commentary on systems that do not achieve this goal (see, for example, Luke & Woods, 2009, for a detailed review of the No Child Left Behind policy initiatives put forward in the United States under the Bush Administration) than there is detailed consideration of what such a system might enact and represent. A long-held critique of sociocultural and critical perspectives in education has been their focus on deconstruction to the supposed detriment of reconstructive work. This critique is less warranted in recent times, based on work in the field, especially the plethora of qualitative research focusing on case studies of ‘best practice’. However, it certainly remains the case that there is more work to be done in investigating the characteristics of a socially just system. This issue of Point and Counterpoint aims to progress such a discussion. Several of the authors call for a reconfiguration of the use of large-scale comparative assessment measures, and all suggest new ways of thinking about quality and equity for school systems. Each of the papers tackles a different aspect of the problem of how to achieve high equity without compromising quality within a large education system. Each takes a reconstructive focus, highlighting ways forward for education systems in Australia and beyond. While each paper investigates different aspects of the issue, the clearly stated objective of seeking to delineate and articulate the characteristics of socially just education is consistent throughout the issue.

Abstract:

This study considers the solution of a class of linear systems related with the fractional Poisson equation (FPE) $(-\nabla^2)^{\alpha/2}\phi = g(x,y)$ with nonhomogeneous boundary conditions on a bounded domain. A numerical approximation to the FPE is derived using a matrix representation of the Laplacian to generate a linear system of equations with its matrix $A$ raised to the fractional power $\alpha/2$. The solution of the linear system then requires the action of the matrix function $f(A) = A^{-\alpha/2}$ on a vector $b$. For large, sparse, and symmetric positive definite matrices, the Lanczos approximation generates $f(A)b \approx \beta_0 V_m f(T_m) e_1$. This method works well when both the analytic grade of $A$ with respect to $b$ and the residual for the linear system are sufficiently small. Memory constraints often require restarting the Lanczos decomposition; however, this is not straightforward in the context of matrix function approximation. In this paper, we use the ideas of thick-restart and adaptive preconditioning for solving linear systems to improve convergence of the Lanczos approximation. We give an error bound for the new method and illustrate its role in solving the FPE. Numerical results are provided to gauge the performance of the proposed method relative to exact analytic solutions.
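As a concrete illustration of the Lanczos step described above, the sketch below computes $f(A)b \approx \beta_0 V_m f(T_m) e_1$ for $f(A) = A^{-\alpha/2}$ using a plain (non-restarted, unpreconditioned) Lanczos iteration; the paper's thick-restart and adaptive preconditioning contributions are not reproduced, and the 1D Laplacian test matrix is an assumption chosen for the example.

```python
import numpy as np

def lanczos_fractional(A, b, m, alpha):
    """Approximate f(A) b = A**(-alpha/2) b with m Lanczos steps (A SPD)."""
    n = b.shape[0]
    V = np.zeros((n, m))
    T = np.zeros((m, m))
    beta0 = np.linalg.norm(b)
    V[:, 0] = b / beta0
    beta = 0.0
    for j in range(m):
        w = A @ V[:, j]
        if j > 0:
            w -= beta * V[:, j - 1]
        a = V[:, j] @ w
        w -= a * V[:, j]
        w -= V[:, :j + 1] @ (V[:, :j + 1].T @ w)  # full reorthogonalization
        T[j, j] = a
        beta = np.linalg.norm(w)
        if j + 1 == m or beta < 1e-12:            # done, or Krylov space exhausted
            m = j + 1
            V, T = V[:, :m], T[:m, :m]
            break
        T[j, j + 1] = T[j + 1, j] = beta
        V[:, j + 1] = w / beta
    # f(T_m) e_1 via eigendecomposition of the small tridiagonal T_m
    evals, U = np.linalg.eigh(T)
    fT_e1 = U @ (evals ** (-alpha / 2) * U[0, :])
    return beta0 * V @ fT_e1                       # beta0 * V_m f(T_m) e_1

# Hypothetical test problem: 1D Laplacian stencil matrix as A
n, alpha = 200, 1.5
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = lanczos_fractional(A, b, m=40, alpha=alpha)

w, Q = np.linalg.eigh(A)                           # exact A**(-alpha/2) b for comparison
x_exact = Q @ (w ** (-alpha / 2) * (Q.T @ b))
print(np.linalg.norm(x - x_exact) / np.linalg.norm(x_exact))
```

The relative error printed at the end shows how well the m-dimensional Krylov approximation captures the action of the fractional matrix power without ever forming $A^{-\alpha/2}$ explicitly.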

Abstract:

Purpose: To assess the repeatability and validity of lens densitometry derived from the Pentacam Scheimpflug imaging system. Setting: Eye Clinic, Queensland University of Technology, Brisbane, Australia. Methods: This prospective cross-sectional study evaluated 1 eye of subjects with or without cataract. Scheimpflug measurements and slitlamp and retroillumination photographs were taken through a dilated pupil. Lenses were graded with the Lens Opacities Classification System III. Intraobserver and interobserver reliability of 3 observers, each performing 3 repeated Scheimpflug lens densitometry measurements, was assessed. Three lens densitometry metrics were evaluated: linear, for which a line is drawn through the visual axis and a mean lens densitometry value given; peak, the point at which lens densitometry is greatest on the densitogram; and 3-dimensional (3D), in which a fixed, circular 3.0 mm area of the lens is selected and a mean lens densitometry value given. Bland and Altman analysis of repeatability for multiple measures was applied; results were reported as the repeatability coefficient and relative repeatability (RR). Results: Twenty eyes were evaluated. Repeatability was high. Overall, interobserver repeatability was marginally lower than intraobserver repeatability. The peak was the least reliable metric (RR 37.31%) and 3D the most reliable (RR 5.88%). Intraobserver and interobserver lens densitometry values in the cataract group were slightly less repeatable than in the noncataract group. Conclusion: The intraobserver and interobserver repeatability of Scheimpflug lens densitometry was high in eyes with cataract and eyes without cataract, which supports the use of automated lens density scoring using the Scheimpflug system evaluated in this study.
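For readers unfamiliar with the repeatability statistics reported here, the sketch below shows one common way to compute a Bland–Altman-style repeatability coefficient from repeated measures. The abstract does not define RR exactly, so the formulas (RC = 1.96·√2·Sw, RR as RC relative to the grand mean) and the simulated data are illustrative assumptions, not the paper's method.

```python
# Hedged sketch: within-subject SD from repeated measures, repeatability
# coefficient RC = 1.96 * sqrt(2) * Sw, and an assumed relative
# repeatability RR = 100 * RC / grand mean.
import numpy as np

def repeatability(x):
    """x: (subjects, repeats) array of lens densitometry readings."""
    sw2 = np.mean(np.var(x, axis=1, ddof=1))  # within-subject variance
    rc = 1.96 * np.sqrt(2.0) * np.sqrt(sw2)   # repeatability coefficient
    rr = 100.0 * rc / np.mean(x)              # relative repeatability, %
    return rc, rr

# Simulated example: 20 eyes, 3 repeated measurements each
rng = np.random.default_rng(0)
true = rng.uniform(8, 15, size=(20, 1))
readings = true + rng.normal(0, 0.3, size=(20, 3))
rc, rr = repeatability(readings)
print(f"RC = {rc:.2f}, RR = {rr:.1f}%")
```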

Abstract:

Purpose: Television viewing time, independent of leisure-time physical activity, has cross-sectional relationships with the metabolic syndrome and its individual components. We examined whether baseline and five-year changes in self-reported television viewing time are associated with changes in continuous biomarkers of cardio-metabolic risk (waist circumference, triglycerides, high-density lipoprotein cholesterol, systolic and diastolic blood pressure, fasting plasma glucose, and a clustered cardio-metabolic risk score) in Australian adults. Methods: AusDiab is a prospective, population-based cohort study with biological, behavioral, and demographic measures collected in 1999–2000 and 2004–2005. Non-institutionalized adults aged ≥25 years were measured at baseline (11,247; 55% of those completing an initial household interview); 6,400 took part in the five-year follow-up biomedical examination, and 3,846 met the inclusion criteria for this analysis. Multiple linear regression analysis was used, and unstandardized B coefficients (95% CIs) are provided. Results: Baseline television viewing time (10 hours/week unit) was not significantly associated with change in any of the biomarkers of cardio-metabolic risk. Increases in television viewing time over five years (10 hours/week unit) were associated with increases in: waist circumference (cm) (men: 0.43 (0.08, 0.78), P = 0.02; women: 0.68 (0.30, 1.05), P <0.001), diastolic blood pressure (mmHg) (women: 0.47 (0.02, 0.92), P = 0.04), and the clustered cardio-metabolic risk score (women: 0.03 (0.01, 0.05), P = 0.007). These associations were independent of baseline television viewing time, of baseline and change in physical activity, and of other potential confounders. Conclusion: These findings indicate that an increase in television viewing time is associated with adverse cardio-metabolic biomarker changes. Further prospective studies using objective measures of several sedentary behaviors are required to confirm causality of the associations found.
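The analysis structure described (five-year change in a biomarker regressed on change in television time per 10 h/week, adjusted for baseline viewing and covariates) can be sketched as follows. The variable names, covariate set, and simulated values are hypothetical stand-ins, not the AusDiab data.

```python
# Hedged sketch of the regression structure on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 3846
tv_base = rng.gamma(2, 5, n)              # baseline TV hours/week (simulated)
tv_change = rng.normal(0, 5, n)           # 5-year change in TV hours/week
age = rng.uniform(25, 75, n)              # illustrative covariate
waist_change = 0.05 * tv_change + rng.normal(0, 4, n)  # cm over 5 years

X = sm.add_constant(np.column_stack([tv_change / 10,   # per 10 h/week unit
                                     tv_base / 10,     # adjust for baseline
                                     age]))
fit = sm.OLS(waist_change, X).fit()
print(fit.params[1], fit.conf_int()[1])   # unstandardized B with 95% CI
```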

Abstract:

To understand the diffusion of high technology products such as PCs, digital cameras, and DVD players, it is necessary to consider the dynamics of successive generations of technology. From the consumer's perspective, these technology changes may manifest themselves as either a new generation product substituting for the old (for instance, digital cameras) or as multiple generations of a single product (for example, PCs). To date, research has been confined to aggregate-level sales models. These models consider the demand relationship between one generation of a product and a successor generation. However, they do not give insights into the disaggregate-level decisions by individual households: whether to adopt the newer generation, and if so, when. This paper makes two contributions. It is the first large-scale empirical study to collect household data for successive generations of technologies in an effort to understand the drivers of adoption. Second, in contrast to traditional analysis in diffusion research that conceptualizes technology substitution as an "adoption of innovation" type process, we propose that, from a consumer's perspective, technology substitution combines elements of both adoption (adopting the new generation technology) and replacement (replacing the generation I product with generation II).

Key Propositions: In some cases, successive generations are clear "substitutes" for the earlier generation (e.g., PCs: Pentium I to II to III). More commonly, the new generation II technology is a "partial substitute" for the existing generation I technology (e.g., DVD players and VCRs). Some consumers will purchase generation II products as substitutes for their generation I product, while other consumers will purchase generation II products as additional products to be used alongside their generation I product. We propose that substitute generation II purchases combine elements of both adoption and replacement, whereas additional generation II purchases are a solely adoption-driven process. Moreover, drawing on adoption theory, consumer innovativeness is the most important consumer characteristic for the adoption timing of new products. Hence, we hypothesize that consumer innovativeness influences the timing of both additional and substitute generation II purchases, but has a stronger impact on additional generation II purchases. We further propose that substitute generation II purchases act partially as a replacement purchase for the generation I product. Thus, we hypothesize that households with older generation I products will make substitute generation II purchases earlier.

Methods: We employ Cox hazard modeling to study factors influencing the timing of a household's adoption of generation II products, with a separate hazard model for additional and substitute purchases (a sketch of this setup follows below). The age of the generation I product is calculated from the most recent household purchase of that product. Control variables include the size and income of the household and the age and education of the decision-maker.

Results and Implications: Our preliminary results confirm both our hypotheses. Consumer innovativeness has a strong influence on both additional purchases and substitute purchases. Also consistent with our hypotheses, the age of the generation I product has a dramatic influence on substitute purchases of VCR/DVD players and a strong influence for PCs/notebooks. Yet, also as hypothesized, there was no influence on additional purchases. This implies that there is a clear distinction between additional and substitute purchases of generation II products, each with different drivers. For substitute purchases, product age is a key driver. Therefore, marketers of high technology products can utilize data on generation I product age (e.g., from warranty or loyalty programs) to target customers who are more likely to make a purchase.
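A minimal sketch of the Cox proportional hazards setup described in the Methods, using the lifelines library on simulated data. The covariates shown (innovativeness, generation I product age) follow the abstract, but the data, column names, and effect sizes are invented for illustration.

```python
# Hedged sketch: time to a substitute generation-II purchase as a Cox model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 200
df = pd.DataFrame({
    "innovativeness": rng.normal(3.5, 1.0, n),          # consumer trait score
    "gen1_age_years": rng.integers(1, 8, n).astype(float),
})
# simulate shorter time-to-purchase for innovative households / older products
risk = 0.5 * df["innovativeness"] + 0.3 * df["gen1_age_years"]
df["months_to_purchase"] = rng.exponential(60 / np.exp(risk - risk.mean()))
df["purchased"] = (df["months_to_purchase"] < 60).astype(int)
df.loc[df["purchased"] == 0, "months_to_purchase"] = 60.0  # right-censored

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_purchase", event_col="purchased")
cph.print_summary()  # hazard ratios for innovativeness and product age
```

In the paper's design, the same model would be fitted twice, once for additional and once for substitute purchases, which is how the differing role of product age across the two purchase types would surface.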

Abstract:

In this paper, we consider a non-linear fractional reaction–subdiffusion process (NFR-SubDP) in which f(u, x, t) is a linear function of u, the function g(u, x, t) satisfies a Lipschitz condition, and $_0D_t^{1-\gamma}$ is the Riemann–Liouville time-fractional partial derivative of order $1-\gamma$. We propose a new, computationally efficient numerical technique to simulate the process. Firstly, the NFR-SubDP is decoupled, which is equivalent to solving a non-linear fractional reaction–subdiffusion equation (NFR-SubDE). Secondly, we propose an implicit numerical method to approximate the NFR-SubDE. Thirdly, the stability and convergence of the method are discussed using a new energy method. Finally, some numerical examples are presented to show the application of the present technique. This method and the supporting theoretical results can also be applied to fractional integro-differential equations.
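The abstract does not reproduce the paper's implicit scheme, but a standard building block for discretizing the Riemann–Liouville operator $_0D_t^{\beta}$ is the Grünwald–Letnikov approximation, sketched below as background. This is not the authors' method; it only illustrates the kind of fractional-derivative discretization such schemes rest on.

```python
# Background sketch: Grünwald-Letnikov approximation of the Riemann-Liouville
# derivative, 0Dt^beta u(t_n) ~ h**(-beta) * sum_k w_k u(t_{n-k}), with
# weights w_0 = 1 and w_k = w_{k-1} * (1 - (beta + 1)/k).
import numpy as np
from scipy.special import gamma

def gl_weights(beta, n):
    w = np.ones(n + 1)
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (1.0 - (beta + 1.0) / k)
    return w

def rl_derivative(u, h, beta):
    """Approximate 0Dt^beta u at each grid point t_n = n*h."""
    n = len(u) - 1
    w = gl_weights(beta, n)
    d = np.zeros_like(u)
    for j in range(n + 1):
        d[j] = h ** (-beta) * np.dot(w[: j + 1], u[j::-1])
    return d

# Check against the exact result for u(t) = t:
# 0Dt^beta t = t**(1-beta) / Gamma(2-beta)
h, beta = 1e-3, 0.4
t = np.arange(0, 1 + h, h)
approx = rl_derivative(t, h, beta)
exact = t ** (1 - beta) / gamma(2 - beta)
print(np.max(np.abs(approx[1:] - exact[1:])))   # O(h) accuracy
```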

Abstract:

The Restrung New Chamber Festival was a practice-led research project that explored the intricacies of musical relationships; specifically, it investigated the relationships between new music ensembles and pop-oriented bands inspired by the new music genre. The festival, held at the Brisbane Powerhouse (28 February – 2 March 2009), comprised 17 diverse groups including the Brodsky Quartet, Topology, Wood, Fourplay and CODA. Restrung used a new and distinctive model which presented new music and syncretic musical genres within an immersive environment, bringing together approaches used in both contemporary classical and popular music festivals and using musical, visual and spatial aspects to engage audiences. Interactivity was encouraged through video and sound installations, workshops and forums. This paper investigates some of the issues surrounding the conception and design of the Restrung model, within the context of an overview of European new music trends. It includes a discussion of curating such an event in a musically sensitive and effective way, and of approaches to identifying new and receptive audiences. As a guide to programming Restrung, I formulated a working definition of new music, further developed through interviews with specialists in Australia and Europe, which is outlined below.

Abstract:

The population Monte Carlo algorithm is an iterative importance sampling scheme for solving static problems. We examine the population Monte Carlo algorithm in a simplified setting, a single step of the general algorithm, and study a fundamental problem that occurs in applying importance sampling to high-dimensional problems. The precision of the computed estimate in the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. We demonstrate the exponential growth of the asymptotic variance with the dimension and show that the optimal covariance matrix for the importance function can be estimated in special cases.
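The dimensional blow-up referred to here can be demonstrated with ordinary importance sampling, which is what a single population Monte Carlo step reduces to. The sketch below uses a Gaussian target and an over-dispersed Gaussian proposal (both illustrative choices, not the paper's setup) and tracks the effective sample size as the dimension grows.

```python
# Sketch: importance weight degeneracy with dimension.
# Target N(0, I_d), proposal N(0, sigma^2 I_d); the weight variance grows
# exponentially in d, so the effective sample size collapses.
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(3)
sigma, n = 1.5, 200_000
for d in (1, 5, 10, 20):
    x = rng.normal(0, sigma, size=(n, d))             # draws from the proposal
    logw = (mvn(np.zeros(d)).logpdf(x)                # log target density
            - mvn(np.zeros(d), sigma**2 * np.eye(d)).logpdf(x))
    w = np.exp(logw - logw.max())                     # stabilized weights
    ess = w.sum() ** 2 / (w ** 2).sum()               # effective sample size
    print(f"d = {d:2d}: ESS = {ess:,.0f} of {n:,}")
```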

Abstract:

While increasing numbers of young high school students engage in part-time work, there is no consensus about its impact on educational outcomes; indeed, this field has seen a dearth of research. The present paper reviews recent research, primarily from Australia and the US, although it is acknowledged that there are considerable contextual differences between the two. Suggestions are presented for how school counsellors can harness students' work experiences to assist in educational and career decision-making.

Abstract:

The results of a numerical investigation into the errors for least squares estimates of function gradients are presented. The underlying algorithm is obtained by constructing a least squares problem using a truncated Taylor expansion. An error bound associated with this method contains in its numerator terms related to the Taylor series remainder, while its denominator contains the smallest singular value of the least squares matrix. Perhaps for this reason the error bounds are often found to be pessimistic by several orders of magnitude. The circumstance under which these poor estimates arise is elucidated and an empirical correction of the theoretical error bounds is conjectured and investigated numerically. This is followed by an indication of how the conjecture is supported by a rigorous argument.
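A minimal sketch of the underlying estimator: truncate the Taylor expansion at first order over a stencil of nearby points and solve for the gradient in the least squares sense. The stencil and test function below are illustrative assumptions; the smallest singular value that enters the error bound's denominator is also reported.

```python
# Sketch of a least squares gradient estimate from a truncated Taylor
# expansion: f(x_i) ~ f(x0) + g . (x_i - x0), solved for g over a stencil.
import numpy as np

def ls_gradient(f, x0, neighbors):
    """Estimate grad f at x0 from function values at neighboring points."""
    D = neighbors - x0                              # displacement matrix
    df = np.array([f(p) for p in neighbors]) - f(x0)
    g, *_ = np.linalg.lstsq(D, df, rcond=None)
    # smallest singular value of D: the error bound's denominator
    smin = np.linalg.svd(D, compute_uv=False)[-1]
    return g, smin

f = lambda p: np.sin(p[0]) * np.exp(p[1])           # hypothetical test function
x0 = np.array([0.3, -0.2])
h = 1e-4
pts = x0 + h * np.array([[1, 0], [0, 1], [-1, 0], [0, -1], [1, 1]])
g, smin = ls_gradient(f, x0, pts)
exact = np.array([np.cos(x0[0]) * np.exp(x0[1]),
                  np.sin(x0[0]) * np.exp(x0[1])])
print(g - exact, smin)
```

Shrinking the stencil spacing h shrinks both the Taylor remainder in the bound's numerator and the smallest singular value in its denominator, which is why the theoretical bound can end up far more pessimistic than the observed error.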

Abstract:

Objectives. To evaluate the performance of the dynamic-area high-speed videokeratoscopy technique in the assessment of tear film surface quality with and without soft contact lenses on the eye. Methods. Retrospective data from a tear film study using basic high-speed videokeratoscopy captured at 25 frames per second (Kopf et al., 2008, J Optom) were used. Eleven subjects had tear film analysis conducted in the morning, midday, and evening on the first and seventh days of one week of no lens wear. Five of the eleven subjects then completed an extra week of hydrogel lens wear followed by a week of silicone hydrogel lens wear. Analysis was performed on a 6-second period of the inter-blink recording. The dynamic-area high-speed videokeratoscopy technique uses the maximum available area of the Placido ring pattern reflected from the tear interface and eliminates regions of disturbance due to shadows from the eyelashes. A value of tear film surface quality was derived using image processing techniques, based on the quality of the reflected ring pattern orientation. Results. The group mean tear film surface quality and the standard deviations for each of the conditions (bare eye, hydrogel lens, and silicone hydrogel lens) showed a much lower coefficient of variation than previous methods (an average reduction of about 92%). Bare eye measurements from the right and left eyes of the eleven individuals showed high correlation (Pearson's r = 0.73, p < 0.05). Repeated measures ANOVA across the 6-second period of measurement in the normal inter-blink period for the bare eye condition showed no statistically significant changes. However, across the 6-second inter-blink period with both contact lenses, statistically significant changes were observed (p < 0.001) for both types of contact lens material. Overall, wearing hydrogel and silicone hydrogel lenses caused the tear film surface quality to worsen compared with the bare eye condition (repeated measures ANOVA, p < 0.0001 for both hydrogel and silicone hydrogel). Conclusions. The results suggest that the dynamic-area method of high-speed videokeratoscopy is able to distinguish and quantify the subtle but systematic worsening of tear film surface quality in the inter-blink interval during contact lens wear. It was also able to clearly show a difference between bare eye and contact lens wearing conditions.

Abstract:

A new method for noninvasive assessment of tear film surface quality (TFSQ) is proposed. The method is based on high-speed videokeratoscopy in which the corneal area used for the analysis is dynamically estimated in a manner that removes the videokeratoscopy interference caused by shadows from the eyelashes but retains the interference related to the poor quality of the precorneal tear film, which is of interest. The separation between the two types of seemingly similar videokeratoscopy interference is achieved by region-based classification, in which the overall noise is first separated from the useful signal (the unaltered videokeratoscopy pattern), followed by a dedicated interference classification algorithm that distinguishes between the two considered interferences. The proposed technique provides a much wider corneal area for the analysis of TFSQ than previously reported techniques. A preliminary study with the proposed technique, carried out for a range of anterior eye conditions, showed effective behavior in terms of noise-to-signal separation and interference classification, as well as consistent TFSQ results. Subsequently, the method proved able not only to discriminate between the bare eye and lens-on-eye conditions but also to show potential to discriminate between the two types of contact lenses.

Abstract:

High-speed videokeratoscopy is an emerging technique that enables study of the corneal surface and tear-film dynamics. Unlike its static predecessor, this new technique results in a very large amount of digital data for which storage needs become significant. We aimed to design a compression technique that would use mathematical functions to parsimoniously fit corneal surface data with a minimum number of coefficients. Since the Zernike polynomial functions that have traditionally been used for modeling corneal surfaces may not necessarily represent given corneal surface data correctly in terms of its optical performance, we introduced the concept of Zernike polynomial-based rational functions. Modeling optimality criteria were employed in terms of both the rms surface error and the point spread function cross-correlation. The parameters of the approximations were estimated using a nonlinear least-squares procedure based on the Levenberg–Marquardt algorithm. A large number of retrospective videokeratoscopic measurements were used to evaluate the performance of the proposed rational-function-based modeling approach. The results indicate that the rational functions almost always outperform the traditional Zernike polynomial approximations with the same number of coefficients.
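A hedged sketch of the fitting procedure: a rational function of basis polynomials estimated by Levenberg–Marquardt nonlinear least squares via scipy. A few radial monomials stand in for the Zernike basis, and a synthetic surface replaces real videokeratoscopic data; the paper's full optimality criteria (rms surface error and point spread function cross-correlation) are not implemented here.

```python
# Hedged sketch: fit a rational function of basis terms with Levenberg-Marquardt.
import numpy as np
from scipy.optimize import least_squares

def basis(r):
    # stand-in basis: 1, r^2, r^4 (rotationally symmetric radial terms,
    # in place of a full Zernike polynomial set)
    return np.stack([np.ones_like(r), r**2, r**4], axis=-1)

def rational(params, B):
    p, q = params[:3], params[3:]
    # denominator fixed to 1 + q . B[:, 1:] so the model is identifiable
    return (B @ p) / (1.0 + B[:, 1:] @ q)

def residuals(params, B, z):
    return rational(params, B) - z

rng = np.random.default_rng(4)
r = np.linspace(0, 1, 200)                          # normalized radial coordinate
B = basis(r)
z_true = (1.0 + 0.8 * r**2) / (1.0 + 0.5 * r**2)    # synthetic "surface"
z = z_true + rng.normal(0, 1e-3, r.size)            # noisy measurements

fit = least_squares(residuals, x0=np.zeros(5), args=(B, z), method="lm")
rms = np.sqrt(np.mean(residuals(fit.x, B, z) ** 2))
print(fit.x, f"rms = {rms:.2e}")
```

The design intuition matches the abstract: for the same coefficient count, letting some coefficients sit in the denominator gives the model sharper behavior than a purely polynomial (numerator-only) expansion.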