Abstract:
The application of object-based approaches to the problem of extracting vegetation information from images requires accurate delineation of individual tree crowns. This paper presents an automated method for individual tree crown detection and delineation that applies a simplified PCNN model in spectral feature space, followed by post-processing using morphological reconstruction. The algorithm was tested on high-resolution multi-spectral aerial images, and the results are compared with two existing image segmentation algorithms. The results demonstrate that our algorithm outperforms the other two solutions, with an average accuracy of 81.8%.
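The simplified PCNN model itself is not reproduced in the abstract. As a rough illustration only, a generic pulse-coupled neural network iteration over an image can be sketched as below; the parameters beta, v_theta and alpha_theta, the 3x3 linking kernel and the wrap-around neighbourhood are illustrative assumptions, not the authors' model:

```python
import numpy as np

def pcnn_segment(F, beta=0.2, v_theta=20.0, alpha_theta=0.2, n_iter=10):
    """Generic simplified-PCNN sketch over a feeding input F (2-D array,
    e.g. one spectral band). Returns the iteration at which each pixel
    first fired; pixels firing together tend to belong to one region."""
    theta = np.full(F.shape, F.max())        # dynamic threshold, starts high
    Y = np.zeros(F.shape)                    # pulse output of previous step
    fire_time = np.zeros(F.shape, dtype=int)
    for t in range(1, n_iter + 1):
        # linking input: pulses in the 3x3 neighbourhood (wrap-around edges)
        L = sum(np.roll(np.roll(Y, dy, 0), dx, 1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)) - Y
        U = F * (1.0 + beta * L)             # internal activity
        Y = (U > theta).astype(float)        # neuron fires when U exceeds theta
        fire_time[(Y > 0) & (fire_time == 0)] = t
        theta = theta * np.exp(-alpha_theta) + v_theta * Y  # decay + refractory boost
    return fire_time
```

Brighter pixels fire earlier as the threshold decays, and the linking term pulls neighbours of firing pixels into the same pulse, which is what makes the firing map usable as a coarse segmentation.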
Abstract:
The effects of particulate matter on the environment and public health have been widely studied in recent years. A number of studies in the medical field have tried to identify the specific effects of particulate exposure on human health, but agreement amongst these studies on the relative importance to health effects of particle size and origin is still lacking. Nevertheless, air quality standards, like epidemiological attention, are moving towards a greater focus on smaller particles. Current air quality standards regulate only the mass of particulate matter less than 10 μm in aerodynamic diameter (PM10) and less than 2.5 μm (PM2.5). The most reliable method for measuring Total Suspended Particles (TSP), PM10, PM2.5 and PM1 is the gravimetric method, since it directly measures PM concentration, guaranteeing effective traceability to international standards. This technique, however, cannot capture short-term intra-day variations in the atmospheric parameters that influence ambient particle concentration and size distribution (emission strengths of particle sources, temperature, relative humidity, wind direction and speed, and mixing height), or in human activity patterns, both of which may vary over periods considerably shorter than 24 hours. A continuous method for measuring the number size distribution and total number concentration in the range 0.014–20 μm is the tandem system formed by a Scanning Mobility Particle Sizer (SMPS) and an Aerodynamic Particle Sizer (APS). In this paper, an uncertainty budget model for the measurement of airborne particle number, surface area and mass size distributions is proposed and applied to several typical aerosol size distributions.
The estimation of such an uncertainty budget presents several difficulties due to (i) the complexity of the measurement chain and (ii) the fact that the SMPS and APS can properly guarantee traceability to the International System of Measurements only in terms of number concentration: the surface area and mass concentrations must be estimated on the basis of separately determined average density and particle morphology.
Keywords: SMPS-APS tandem system, gravimetric reference method, uncertainty budget, ultrafine particles.
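The number-to-surface-and-mass conversion described above can be illustrated under the usual spherical-particle, uniform-density assumption; the density value below is a placeholder, not one taken from the paper:

```python
import numpy as np

def number_to_surface_and_mass(d_um, dN, rho_g_cm3=1.5):
    """Convert a number size distribution into surface-area and mass
    distributions, assuming spherical particles of uniform density.

    d_um      : bin midpoint diameters in micrometres
    dN        : number concentration per bin (particles / cm^3)
    rho_g_cm3 : assumed average particle density (illustrative value)
    Returns (dS, dM): surface area in cm^2/cm^3 and mass in g/cm^3 per bin.
    """
    d_cm = np.asarray(d_um, dtype=float) * 1e-4      # micrometres -> centimetres
    dN = np.asarray(dN, dtype=float)
    dS = np.pi * d_cm**2 * dN                        # sphere surface: pi d^2
    dM = rho_g_cm3 * (np.pi / 6.0) * d_cm**3 * dN    # sphere mass: rho pi d^3 / 6
    return dS, dM
```

Because mass scales with d³, uncertainties in the assumed density and in the large-diameter bins dominate the mass-distribution budget, which is why the abstract singles out density and morphology as the weak link in traceability.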
Abstract:
This study considers the solution of a class of linear systems related to the fractional Poisson equation (FPE) (−∇²)^(α/2) φ = g(x, y) with nonhomogeneous boundary conditions on a bounded domain. A numerical approximation to the FPE is derived using a matrix representation of the Laplacian to generate a linear system of equations with its matrix A raised to the fractional power α/2. The solution of the linear system then requires the action of the matrix function f(A) = A^(−α/2) on a vector b. For large, sparse, symmetric positive definite matrices, the Lanczos approximation generates f(A)b ≈ β₀ V_m f(T_m) e₁. This method works well when both the analytic grade of A with respect to b and the residual for the linear system are sufficiently small. Memory constraints often require restarting the Lanczos decomposition; however, this is not straightforward in the context of matrix function approximation. In this paper, we use the ideas of thick restart and adaptive preconditioning for solving linear systems to improve the convergence of the Lanczos approximation. We give an error bound for the new method and illustrate its role in solving the FPE. Numerical results are provided to gauge the performance of the proposed method relative to exact analytic solutions.
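The basic Lanczos approximation f(A)b ≈ β₀ V_m f(T_m) e₁ referred to above can be sketched as follows. This is the plain, unrestarted iteration without reorthogonalisation, not the thick-restarted, preconditioned variant the paper develops:

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def lanczos_matfunc(A, b, m, alpha):
    """Approximate f(A) b with f(A) = A^(-alpha/2) via the Lanczos relation
    f(A) b ~= beta0 * V_m f(T_m) e_1, for symmetric positive definite A.
    m is the Krylov subspace dimension."""
    n = len(b)
    V = np.zeros((n, m))                     # orthonormal Lanczos basis
    T = np.zeros((m, m))                     # symmetric tridiagonal projection
    beta0 = np.linalg.norm(b)
    V[:, 0] = b / beta0
    beta = 0.0
    for j in range(m):
        w = A @ V[:, j]
        if j > 0:
            w -= beta * V[:, j - 1]
        alpha_j = w @ V[:, j]
        w -= alpha_j * V[:, j]
        T[j, j] = alpha_j
        beta = np.linalg.norm(w)
        if j + 1 < m:
            T[j, j + 1] = T[j + 1, j] = beta
            V[:, j + 1] = w / beta
    fT = fractional_matrix_power(T, -alpha / 2.0)  # f applied to the small matrix
    return beta0 * V @ fT[:, 0]                    # beta0 * V_m f(T_m) e_1
```

Only the small m-by-m matrix T_m ever has f applied to it, which is the point of the method; the memory cost of storing V_m is what motivates the restarting machinery discussed in the paper.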
Abstract:
Anxiety disorders are the most common psychopathology experienced by young people, with up to 18% of adolescents developing an anxiety disorder. The consequences of these disorders, if left untreated, include impaired peer relationships, school absenteeism and self-concept problems. In addition, anxiety disorders may play a causal role in the development of depression in young people, precede eating disorders and predispose adolescents to substance abuse disorders. While the school is often chosen as a place to provide early intervention for this debilitating disorder, the fact that excessive anxiety often goes unrecognised in school, and that young people are reluctant to seek help, makes identifying these adolescents difficult. Even when these young people are identified, there are problems in providing sensitive programs that are not stigmatising to them within a school setting. One method that may engage this adolescent population is cross-age peer tutoring. This paper reports on a small pilot study using the “Worrybusters” program and a cross-age peer tutoring method to engage anxious adolescents. These anxious secondary school students planned activities for teacher-referred anxious primary school students for a term in the high school setting, and then delivered those activities to the younger students weekly during the following term in the primary school. Although the secondary school students' scores on anxiety self-report measures decreased, there were no significant differences in the primary school students' self-reports. However, the primary school parent reports indicated a significant decrease in their child's anxiety.
Abstract:
In the past decade, scholars have proposed a range of terms to describe the relationship between practice and research in the creative arts, including increasingly nuanced definitions of practice-based research, practice-led research and practice-as-research. In this paper, I consider the efficacy of creative practice as method. I use the example of The Ex/Centric Fixations Project, a project in which I have embedded creative practice in a research project, rather than embedding research in a creative project. The Ex/Centric Fixations Project investigates the way spectators interpret human experiences depicted onstage, especially experiences of difference, marginalisation or discrimination. In particular, it investigates the way postmodern performance writing strategies, and the presence of performing bodies to which the experience depicted can be attached, impact on interpretations. It is part of a broader research project which examines the performativity of spectatorship, and intervenes in emergent debates about performance, ethics and spectatorship, in the context of debate about whether live performance is a privileged site for the emergence of an ethical face-to-face encounter with the Other. Using the metaphor of the Möbius strip, I examine the way practice, as a method rather than an output, has informed, influenced and problematised the broader research project.
Abstract:
This paper presents a novel approach to estimating the confidence interval of speaker verification scores. This approach is used to minimise the utterance lengths required to produce a confident verification decision. The confidence estimation method is also extended to address both the problem of high correlation between consecutive frame scores and robustness with very limited training samples. The proposed technique achieves a drastic reduction in the typical data requirements for producing confident decisions in an automatic speaker verification system. When evaluated on the NIST 2005 SRE, the early verification decision method demonstrates that an average of 5–10 seconds of speech is sufficient to produce verification rates approaching those previously achieved using, on average, in excess of 100 seconds of speech.
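The abstract does not give the estimator itself; a generic early-decision rule of this flavour, which stops accumulating frames once a confidence interval on the running mean score clears the decision threshold, might look like the sketch below. The corr_factor correction for correlation between consecutive frame scores is an illustrative stand-in, not the paper's method:

```python
import numpy as np

def early_decision(frame_scores, threshold, z=1.96, corr_factor=4.0, min_frames=50):
    """Accept/reject as soon as a confidence interval on the running mean
    frame score no longer straddles the verification threshold.

    corr_factor : assumed reduction of the effective sample size caused by
                  correlation between consecutive frames (illustrative value).
    Returns (decision, frames_used); decision is None if never confident.
    """
    scores = np.asarray(frame_scores, dtype=float)
    for n in range(min_frames, len(scores) + 1):
        s = scores[:n]
        n_eff = n / corr_factor                       # effective independent frames
        half = z * s.std(ddof=1) / np.sqrt(n_eff)     # half-width of the interval
        if s.mean() - half > threshold:
            return True, n                            # confidently above threshold
        if s.mean() + half < threshold:
            return False, n                           # confidently below threshold
    return None, len(scores)
```

The qualitative behaviour matches the abstract: well-separated scores resolve after a few seconds of frames, while borderline trials keep consuming speech.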
Abstract:
Ceramic membranes are of particular interest in many industrial processes due to their ability to function under extreme conditions while maintaining their chemical and thermal stability. The major structural deficiencies of the conventional fabrication approach are pin-holes and cracks, and the dramatic loss of flux when pore sizes are reduced to enhance selectivity. We overcome these structural deficiencies by constructing a hierarchically structured separation layer on a porous substrate using larger titanate nanofibres and smaller boehmite nanofibres. This yields a radical change in membrane texture. Differences in the porous supports have no substantial influence on the texture of the resulting membranes. Membranes with a top layer of nanofibres spin-coated onto different porous supports have filtration pores of similar size, in the range of 10–100 nm. These membranes are able to effectively filter out species larger than 60 nm at flow rates orders of magnitude greater than those of conventional membranes. The retention can exceed 95%, while maintaining a high flux of about 900 L m⁻² h⁻¹. Calcination after spin-coating creates solid linkages between the fibres, and between the fibres and the substrate, in addition to converting the boehmite into γ-alumina nanofibres. This reveals a new direction in membrane fabrication.
Abstract:
In this paper, we consider a modified anomalous subdiffusion equation with a nonlinear source term for describing processes that become less anomalous as time progresses, through the inclusion of a second fractional time derivative acting on the diffusion term. A new implicit difference method is constructed. Its stability and convergence are discussed using a new energy method. Finally, some numerical examples are given. The numerical results demonstrate the effectiveness of the theoretical analysis.
Abstract:
In this paper, we consider the following non-linear fractional reaction–subdiffusion process (NFR-SubDP): [formula omitted], where f(u, x, t) is a linear function of u, the function g(u, x, t) satisfies the Lipschitz condition, and ₀D_t^(1−γ) is the Riemann–Liouville time-fractional partial derivative of order 1 − γ. We propose a new, computationally efficient numerical technique to simulate the process. Firstly, the NFR-SubDP is decoupled, which is equivalent to solving a non-linear fractional reaction–subdiffusion equation (NFR-SubDE). Secondly, we propose an implicit numerical method to approximate the NFR-SubDE. Thirdly, the stability and convergence of the method are discussed using a new energy method. Finally, some numerical examples are presented to show the application of the present technique. This method and the supporting theoretical results can also be applied to fractional integro-differential equations.
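The implicit scheme itself is not given in the abstract. Discretisations of Riemann–Liouville operators such as ₀D_t^(1−γ) commonly rely on Grünwald–Letnikov weights w_k = (−1)^k C(γ, k), which are generated by a stable recurrence rather than by evaluating binomial coefficients directly. A minimal sketch of the weights (not of the authors' full scheme):

```python
import numpy as np

def gl_weights(gamma, n):
    """Grunwald-Letnikov weights w_k = (-1)^k * binom(gamma, k), k = 0..n,
    via the recurrence w_k = (1 - (gamma + 1)/k) * w_{k-1}.
    They discretise a Riemann-Liouville derivative of order gamma:
    D^gamma u(t) ~= h^(-gamma) * sum_k w_k * u(t - k*h)."""
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = (1.0 - (gamma + 1.0) / k) * w[k - 1]
    return w
```

For gamma = 1 the weights collapse to the classical backward difference [1, −1, 0, …], a useful sanity check; for fractional gamma every past time level contributes, which is the memory effect that makes subdiffusion schemes expensive.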
Abstract:
In this paper we identify elements in Marx's economic and political writings that are relevant to contemporary critical discourse analysis (CDA). We argue that Marx can be seen to be engaging in a form of discourse analysis. We identify the elements in Marx's historical materialist method that support such a perspective, and exemplify these in a longitudinal comparison of Marx's texts.
Abstract:
This thesis is a study of naturally occurring radioactive materials (NORM) activity concentration, gamma dose rate and radon (222Rn) exhalation from the waste streams of large-scale onshore petroleum operations. The types of activities covered included sludge recovery from separation tanks, sludge farming, NORM storage, scaling in oil tubulars, scaling in gas production and sedimentation in produced-water evaporation ponds. Field work was conducted in the arid desert terrain of an operational oil exploration and production region in the Sultanate of Oman. The main radionuclides found were 226Ra and 210Pb (238U series), 228Ra and 228Th (232Th series), and 227Ac (235U series), along with 40K. All activity concentrations were higher than the ambient soil level and varied over several orders of magnitude. Gamma dose rates at a height of 1 m above ground ranged over 0.06–0.43 µSv h⁻¹ for the farm-treated sludge, with an average close to the ambient soil mean of 0.086 ± 0.014 µSv h⁻¹, whereas the untreated sludge gamma dose rates ranged over 0.07–1.78 µSv h⁻¹, with a mean of 0.456 ± 0.303 µSv h⁻¹. The geometric mean of the ambient soil 222Rn exhalation rate for the area surrounding the sludge was … mBq m⁻² s⁻¹. Radon exhalation rates measured in oil waste products were all higher than the ambient soil value and varied over three orders of magnitude. This study produced some unique findings, including: (i) detection of radiotoxic 227Ac in the oil scales and sludge, (ii) the need for a new empirical relation between petroleum sludge activity concentrations and gamma dose rates, and (iii) assessment of the exhalation of 222Rn from oil sludge. Additionally, the study investigated a method to determine the age of oil scale and sludge from the inherent behaviour of the radionuclides, using the 228Ra:226Ra and 228Th:228Ra activity ratios.
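The activity-ratio dating idea can be illustrated for the 228Th:228Ra pair, under the assumption that the deposit initially contained 228Ra but no 228Th (the thesis' actual model may differ). The half-lives are standard nuclide-chart values, and the closed-form ingrowth curve follows from the Bateman equations:

```python
import numpy as np
from scipy.optimize import brentq

# Half-lives in years (standard nuclide-chart values)
T_RA228, T_TH228 = 5.75, 1.912
LAM_RA = np.log(2) / T_RA228     # decay constant of 228Ra
LAM_TH = np.log(2) / T_TH228     # decay constant of 228Th

def th_ra_activity_ratio(t_years):
    """228Th:228Ra activity ratio t years after formation of a deposit that
    initially contains 228Ra but no 228Th (Bateman ingrowth)."""
    d = LAM_TH - LAM_RA
    return (LAM_TH / d) * (1.0 - np.exp(-d * t_years))

def age_from_ratio(ratio):
    """Invert the ingrowth curve numerically; only valid for ratios below
    the secular-equilibrium value LAM_TH / (LAM_TH - LAM_RA)."""
    return brentq(lambda t: th_ra_activity_ratio(t) - ratio, 0.0, 60.0)
```

The ratio starts at zero and saturates near 1.5 after a few decades, so the method is most sensitive for scales and sludges younger than roughly 20 years, after which the curve flattens and ages become indistinguishable.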
Abstract:
This paper discusses a method, Generation in Context, for interrogating theories of music analysis and music perception. Given an analytic theory, the method consists of creating a generative process that implements the theory in reverse. Instead of using the theory to create analyses from scores, the theory is used to generate scores from analyses. Subjective evaluation of the quality of the musical output provides a mechanism for testing the theory in a contextually robust fashion. The method is exploratory, meaning that in addition to testing extant theories it provides a general mechanism for generating new theoretical insights. We outline our initial explorations in the use of generative processes for music research, and we discuss how generative processes provide evidence as to the veracity of theories about how music is experienced, with insights into how these theories may be improved and, concurrently, provide new techniques for music creation. We conclude that Generation in Context will help reveal new perspectives on our understanding of music.
Abstract:
Objectives. To evaluate the performance of the dynamic-area high-speed videokeratoscopy technique in the assessment of tear film surface quality, with and without soft contact lenses on the eye. Methods. Retrospective data from a tear film study using basic high-speed videokeratoscopy, captured at 25 frames per second (Kopf et al., 2008, J Optom), were used. Eleven subjects had tear film analysis conducted in the morning, midday and evening on the first and seventh days of one week of no lens wear. Five of the eleven subjects then completed an extra week of hydrogel lens wear followed by a week of silicone hydrogel lens wear. Analysis was performed on a 6-second period of the inter-blink recording. The dynamic-area high-speed videokeratoscopy technique uses the maximum available area of the Placido ring pattern reflected from the tear interface and eliminates regions of disturbance due to shadows from the eyelashes. A value of tear film surface quality was derived using image processing techniques, based on the quality of the reflected ring-pattern orientation. Results. The group mean tear film surface quality and the standard deviations for each of the conditions (bare eye, hydrogel lens, and silicone hydrogel lens) showed a much lower coefficient of variation than previous methods (an average reduction of about 92%). Bare-eye measurements from the right and left eyes of the eleven individuals showed high correlation (Pearson's r = 0.73, p < 0.05). Repeated-measures ANOVA across the 6-second period of measurement in the normal inter-blink period for the bare eye condition showed no statistically significant changes. However, across the 6-second inter-blink period with contact lenses, statistically significant changes were observed (p < 0.001) for both types of contact lens material.
Overall, wearing hydrogel and silicone hydrogel lenses caused the tear film surface quality to worsen compared with the bare eye condition (repeated measures ANOVA, p < 0.0001 for both hydrogel and silicone hydrogel). Conclusions. The results suggest that the dynamic-area method of high-speed videokeratoscopy was able to distinguish and quantify the subtle, but systematic worsening of tear film surface quality in the inter-blink interval in contact lens wear. It was also able to clearly show a difference between bare eye and contact lens wearing conditions.
Analysis of wide-spaced reinforced concrete masonry shear walls using the explicit finite element method