973 results for Z-scan technique
Abstract:
For decades, marketing and marketing research have been based on a concept of consumer behaviour that is deeply embedded in a linear notion of marketing activities. With increasing regularity, key organising frameworks for marketing and marketing activities are being challenged by academics and practitioners alike. In turn, this has led to the search for new approaches and tools that will help marketers understand the interaction among attitudes, emotions and product/brand choice. More recently, the approach developed by Harvard Professor Gerald Zaltman, referred to as the Zaltman Metaphor Elicitation Technique (ZMET), has gained considerable interest. This paper seeks to demonstrate the effectiveness of this alternative qualitative method using a non-conventional approach, thereby providing a useful contribution to the field of qualitative research.
Abstract:
Fusionless scoliosis surgery is an emerging treatment for idiopathic scoliosis as it offers theoretical advantages over current forms of treatment. Anterior vertebral stapling using a nitinol staple is one such treatment. Despite increasing interest in this technique, little is known about the effects on the spine following insertion, or the mechanism of action of the staple. The aims of this study were threefold: (1) to measure changes in the bending stiffness of a single motion segment following staple insertion, (2) to describe the forces that occur within the staple during spinal movement, and (3) to describe the anatomical changes that occur following staple insertion. Results suggest that staple insertion consistently decreased stiffness in all directions of motion. An explanation for this finding may be found in the outcomes of the strain gauge testing and micro-CT scan. The strain gauge testing showed that, once inserted, the staple tips applied a baseline compressive force to the surrounding trabecular bone and vertebral end-plate. This finding would be consistent with the current belief that the clinical effect of the staples is via unilateral compression of the physis. Interestingly, however, as each specimen progressed through the five cycles of each test, the baseline load on the staple tips gradually decreased, implying that the force at the staple tip-bone interface was decreasing. We believe this most likely occurred as a result of structural damage to the trabecular bone and vertebral end-plate caused by the staple, effectively 'loosening' the staple. This hypothesis is further supported by the findings of the micro-CT scan, whose images show significant trabecular bone and physeal injury around the staple blades. These results suggest that the current hypothesis that stapling modulates growth through physeal compression may be incorrect, and that the effect instead occurs through mechanical disruption of the vertebral growth plate.
Abstract:
The measurement of Cobb angles from radiographs is routine practice in spinal clinics. The technique relies on the use and availability of specialist equipment such as a goniometer, cobbometer or protractor. The aim of this study was to validate the use of an iPhone (Apple Inc.) combined with Tilt Meter Pro software, as compared to a protractor, in the measurement of Cobb angles. Between November 2008 and December 2008, 20 patients were selected at random from the Paediatric Spine Research Group's database. A power calculation indicated that, with n=240 measurements, the study had a 96% chance of detecting a 5-degree difference between groups. All patients had idiopathic scoliosis with a range of curve types and severities. The study found that the iPhone combined with Tilt Meter Pro software offers a faster alternative to the traditional method of Cobb angle measurement. The iPhone also offers a more convenient way of measuring Cobb angles in the outpatient setting, and its intra-observer repeatability is equivalent to that of the protractor.
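As a hedged note on the underlying arithmetic (an assumption about how endplate tilt readings combine, since the app's internals are not described here): the Cobb angle is the angle between the superior endplate of the upper end vertebra and the inferior endplate of the lower end vertebra, so given signed inclinations of the two endplates from the horizontal it is simply the acute difference between them.

def cobb_angle(upper_endplate_tilt_deg, lower_endplate_tilt_deg):
    """Cobb angle from two signed endplate inclinations (degrees from horizontal)."""
    angle = abs(upper_endplate_tilt_deg - lower_endplate_tilt_deg) % 180.0
    return min(angle, 180.0 - angle)

# Hypothetical readings: upper endplate tilted +22 deg, lower endplate tilted -19 deg.
print(cobb_angle(22.0, -19.0))  # 41-degree curve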
Abstract:
Quantitative behaviour analysis requires the classification of behaviour to produce the basic data. In practice, much of this work will be performed by multiple observers, and maximising inter-observer consistency is of particular importance. Another discipline where consistency in classification is vital is biological taxonomy. A classification tool of great utility, the binary key, is designed to simplify the classification decision process and ensure consistent identification of proper categories. We show how this same decision-making tool - the binary key - can be used to promote consistency in the classification of behaviour. The construction of a binary key also ensures that the categories in which behaviour is classified are complete and non-overlapping. We discuss the general principles of design of binary keys, and illustrate their construction and use with a practical example from education research.
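For illustration (the questions and categories below are hypothetical, not taken from the study), a binary key can be represented as a small decision tree in which each internal node poses one yes/no question and each leaf is a behaviour category; because every observation follows exactly one path, the categories are non-overlapping, and a well-built key covers all cases.

from dataclasses import dataclass
from typing import Union

@dataclass
class Leaf:
    category: str

@dataclass
class Question:
    text: str                       # yes/no question posed to the observer
    if_yes: Union["Question", Leaf]
    if_no: Union["Question", Leaf]

def classify(node, answer_fn):
    """Walk the key, answering one question at a time, until a category is reached."""
    while isinstance(node, Question):
        node = node.if_yes if answer_fn(node.text) else node.if_no
    return node.category

# Hypothetical key for classroom behaviour (illustrative categories only).
key = Question(
    "Is the student interacting with another person?",
    if_yes=Question("Is the interaction about the set task?",
                    if_yes=Leaf("on-task collaboration"),
                    if_no=Leaf("off-task social")),
    if_no=Question("Is the student working on the set task?",
                   if_yes=Leaf("on-task individual"),
                   if_no=Leaf("off-task individual")),
)

answers = {"Is the student interacting with another person?": False,
           "Is the student working on the set task?": True}
print(classify(key, lambda q: answers[q]))  # -> "on-task individual"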
Abstract:
With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Owing to the availability of a large number of Web services, finding an appropriate Web service that meets the requirements of the user is a challenge. This warrants an effective and reliable process of Web service discovery, and a considerable body of research has emerged to develop methods that improve the accuracy of discovery in matching the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used in describing the services, as well as the input and output parameters, can lead to more accurate Web service discovery, and appropriate linking of individual matched services should then fully satisfy the user's requirements. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery, through a novel three-phase Web service discovery methodology. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web Services Description Language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms that could not otherwise be found. Sometimes a single Web service is unable to fully satisfy the requirements of the user; in such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is performed in the second phase. Once the feasibility of linking Web services is established, the objective is to provide the user with the best composition of Web services. In this link-analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at minimum traversal cost. The third phase, system integration, integrates the results of the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations of individual and composite Web services to the user. In order to evaluate the performance of the proposed method, extensive experimentation was performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with those of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase-I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion.
Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
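As a hedged sketch of the link-analysis idea (the service names, costs and graph structure below are purely illustrative assumptions, not the system's actual data): Web services become nodes of a graph, edge weights represent the cost of chaining one service's output into another's input, and an all-pairs shortest-path algorithm such as Floyd-Warshall yields the minimum-cost composition between any pair of services.

import math

def floyd_warshall(nodes, edges):
    """All-pairs shortest paths; edges maps (u, v) -> traversal cost."""
    dist = {(u, v): (0 if u == v else edges.get((u, v), math.inf))
            for u in nodes for v in nodes}
    nxt = {(u, v): v for (u, v) in edges}
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if dist[(i, k)] + dist[(k, j)] < dist[(i, j)]:
                    dist[(i, j)] = dist[(i, k)] + dist[(k, j)]
                    nxt[(i, j)] = nxt[(i, k)]
    return dist, nxt

def composition(nxt, u, v):
    """Recover the chain of services from u to v, if one exists."""
    if (u, v) not in nxt:
        return None
    route = [u]
    while u != v:
        u = nxt[(u, v)]
        route.append(u)
    return route

# Hypothetical services and chaining costs.
services = ["GeocodeAddress", "FindHotels", "BookRoom"]
costs = {("GeocodeAddress", "FindHotels"): 1.0, ("FindHotels", "BookRoom"): 2.0}
dist, nxt = floyd_warshall(services, costs)
print(composition(nxt, "GeocodeAddress", "BookRoom"))
# -> ['GeocodeAddress', 'FindHotels', 'BookRoom']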
Abstract:
For certain continuum problems, it is desirable and beneficial to combine two different methods in order to exploit their advantages while evading their disadvantages. In this paper, a bridging transition algorithm is developed for coupling the meshfree method (MM) with the finite element method (FEM). In this coupled method, the meshfree method is used in the sub-domain where high accuracy is required, and the finite element method is employed in the other sub-domains where computational efficiency is the priority. The MM domain and the FEM domain are connected by a transition (bridging) region. A modified variational formulation and the Lagrange multiplier method are used to ensure the compatibility of displacements and their gradients. To improve the computational efficiency and reduce the meshing cost in the transition region, regularly distributed transition particles, which are independent of both the meshfree nodes and the FE nodes, can be inserted into the transition region. The newly developed coupled method is applied to the stress analysis of 2D solids and structures in order to investigate its performance and to study its parameters. Numerical results show that the present coupled method is convergent, accurate and stable. The coupled method has promising potential for practical applications, because it takes advantage of both the meshfree method and FEM while overcoming their shortcomings.
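In schematic terms (an illustrative form, not necessarily the paper's exact formulation), a Lagrange-multiplier coupling of this kind augments the potential energy of the two sub-domains with a constraint term that enforces displacement compatibility over the transition region Γ:

Π*(u_MM, u_FE, λ) = Π_MM(u_MM) + Π_FE(u_FE) + ∫_Γ λ · (u_MM − u_FE) dΓ,

and requiring the stationarity condition δΠ* = 0 recovers both the equilibrium equations in each sub-domain and the compatibility condition u_MM = u_FE on Γ.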
Abstract:
The measurement of Cobb angles on radiographs of patients with spinal deformities is routine practice in spinal clinics. The technique relies on the use and availability of specialist equipment such as a goniometer, cobbometer or protractor. The aim of this study was to validate the use of an iPhone (Apple Inc.) combined with Tilt Meter Pro software, as compared to a protractor, in the measurement of Cobb angles. The iPhone combined with Tilt Meter Pro software offers a faster alternative to the traditional method of Cobb angle measurement and a more convenient way of measuring Cobb angles in the outpatient setting. The intra-observer repeatability of the iPhone is equivalent to that of the protractor in the measurement of Cobb angles.
Abstract:
Bag sampling techniques can be used to temporarily store an aerosol and therefore provide sufficient time to utilize sensitive but slow instrumental techniques for recording detailed particle size distributions. Laboratory-based assessments of the method were conducted to examine size-dependent deposition loss coefficients for aerosols held in Velostat™ bags conforming to a horizontal cylindrical geometry. Deposition losses of NaCl particles in the range of 10 nm to 160 nm were analysed in relation to bag size, storage time and sampling flow rate. The results of this study suggest that the bag sampling method is most useful for moderately short sampling periods of about 5 minutes.
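As an illustrative sketch of how size-dependent deposition loss coefficients are typically estimated (an assumption about the general analysis, not a description of this study's exact procedure): the concentration of a given particle size held in a sealed bag is commonly modelled as decaying exponentially, N(t) = N(0)·exp(−β·t), so β can be recovered from a log-linear fit of concentration against storage time.

import numpy as np

def deposition_loss_coefficient(times_min, concentrations):
    """Estimate beta (1/min) from N(t) = N0 * exp(-beta * t) via a log-linear fit."""
    t = np.asarray(times_min, dtype=float)
    ln_n = np.log(np.asarray(concentrations, dtype=float))
    slope, _intercept = np.polyfit(t, ln_n, 1)
    return -slope

# Illustrative (made-up) decay of one size bin over 5 minutes of storage.
times = [0, 1, 2, 3, 4, 5]                 # minutes
conc = [1000, 890, 800, 710, 630, 565]     # particles per cm^3, hypothetical
print(deposition_loss_coefficient(times, conc))  # ~0.11 per minute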
Abstract:
The problem of impostor dataset selection for GMM-based speaker verification is addressed through the recently proposed data-driven background dataset refinement technique. The SVM-based refinement technique selects from a candidate impostor dataset those examples that are most frequently selected as support vectors when training a set of SVMs on a development corpus. This study demonstrates the versatility of dataset refinement in the task of selecting suitable impostor datasets for use in GMM-based speaker verification. The use of refined Z- and T-norm datasets provided performance gains of 15% in EER in the NIST 2006 SRE over the use of heuristically selected datasets. The refined datasets were shown to generalise well to the unseen data of the NIST 2008 SRE.
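A minimal sketch of the refinement idea described above, assuming each utterance is already represented as a fixed-length feature vector (for example a GMM supervector); the kernel, regularisation and retention fraction are illustrative assumptions rather than the published configuration:

import numpy as np
from sklearn.svm import SVC

def refine_impostor_set(candidate_impostors, dev_speaker_sets, keep_fraction=0.5):
    """Rank candidate impostor examples by how often they become support vectors
    when training one SVM per development speaker (speaker = positive class,
    full candidate impostor set = negative class), and keep the most-selected ones."""
    n_candidates = len(candidate_impostors)
    counts = np.zeros(n_candidates)
    for positives in dev_speaker_sets:
        X = np.vstack([positives, candidate_impostors])
        y = np.concatenate([np.ones(len(positives)), np.zeros(n_candidates)])
        clf = SVC(kernel="linear", C=1.0).fit(X, y)
        # indices of support vectors that came from the impostor block
        impostor_sv = clf.support_[clf.support_ >= len(positives)] - len(positives)
        counts[impostor_sv] += 1
    keep = np.argsort(counts)[::-1][: int(keep_fraction * n_candidates)]
    return candidate_impostors[keep], counts

# Toy usage with random 2-D "supervectors" standing in for real features.
rng = np.random.default_rng(0)
impostors = rng.normal(size=(100, 2))
dev_speakers = [rng.normal(loc=1.0, size=(10, 2)) for _ in range(20)]
refined, counts = refine_impostor_set(impostors, dev_speakers)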
Abstract:
Matrix function approximation is a current focus of worldwide interest and finds application in a variety of areas of applied mathematics and statistics. In this thesis we focus on the approximation of A^(-α/2)b, where A ∈ ℝ^(n×n) is a large, sparse symmetric positive definite matrix and b ∈ ℝ^n is a vector. In particular, we focus on matrix function techniques for sampling from Gaussian Markov random fields in applied statistics and for the solution of fractional-in-space partial differential equations. Gaussian Markov random fields (GMRFs) are multivariate normal random variables characterised by a sparse precision (inverse covariance) matrix. GMRFs are popular models in computational spatial statistics as the sparse structure can be exploited, typically through the use of the sparse Cholesky decomposition, to construct fast sampling methods. It is well known, however, that for sufficiently large problems, iterative methods for solving linear systems outperform direct methods. Fractional-in-space partial differential equations arise in models of processes undergoing anomalous diffusion. Unfortunately, as the fractional Laplacian is a non-local operator, numerical methods based on the direct discretisation of these equations typically require the solution of dense linear systems, which is impractical for fine discretisations. In this thesis, novel applications of Krylov subspace approximations to matrix functions for both of these problems are investigated. Matrix functions arise when sampling from a GMRF by noting that the Cholesky decomposition A = LL^T is, essentially, a 'square root' of the precision matrix A. Therefore, we can replace the usual sampling method, which forms x = L^(-T)z, with x = A^(-1/2)z, where z is a vector of independent and identically distributed standard normal random variables. Similarly, the matrix transfer technique can be used to build solutions to the fractional Poisson equation of the form ϕ_n = A^(-α/2)b, where A is the finite difference approximation to the Laplacian. Hence both applications require the approximation of f(A)b, where f(t) = t^(-α/2) and A is sparse. In this thesis we compare the Lanczos approximation, the shift-and-invert Lanczos approximation, the extended Krylov subspace method, rational approximations and the restarted Lanczos approximation for approximating matrix functions of this form. A number of novel results are presented in this thesis. Firstly, we prove the convergence of the matrix transfer technique for the solution of the fractional Poisson equation and we give conditions under which the finite difference discretisation can be replaced by other methods for discretising the Laplacian. We then investigate a number of methods for approximating matrix functions of the form A^(-α/2)b and investigate stopping criteria for these methods. In particular, we derive a new method for restarting the Lanczos approximation to f(A)b. We then apply these techniques to the problem of sampling from a GMRF and construct a full suite of methods for sampling conditioned on linear constraints and approximating the likelihood. Finally, we consider the problem of sampling from a generalised Matérn random field, which combines our techniques for solving fractional-in-space partial differential equations with our method for sampling from GMRFs.
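As an illustrative sketch (not the thesis's implementation) of the basic Lanczos approximation to f(A)b with f(t) = t^(-1/2), used to draw x = A^(-1/2)z so that x is a sample with precision matrix A; the test matrix, the number of Lanczos steps and the use of full reorthogonalisation are assumptions made for the example:

import numpy as np

def lanczos_f_of_A_times_b(A, b, m, f):
    """Approximate f(A) @ b for symmetric positive definite A using m Lanczos steps:
    f(A)b ~= ||b|| * V_m f(T_m) e_1, where A V_m ~= V_m T_m."""
    n = b.size
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    b_norm = np.linalg.norm(b)
    V[:, 0] = b / b_norm
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w = w - alpha[j] * V[:, j]
        if j > 0:
            w = w - beta[j - 1] * V[:, j - 1]
        w = w - V[:, : j + 1] @ (V[:, : j + 1].T @ w)  # full reorthogonalisation
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, Q = np.linalg.eigh(T)              # T = Q diag(evals) Q^T
    fT_e1 = Q @ (f(evals) * Q[0, :])          # f(T) e_1
    return b_norm * (V @ fT_e1)

# Draw an (approximate) GMRF sample with precision matrix A: x = A^(-1/2) z.
# A is a shifted 1D Laplacian, stored densely here only for illustration.
rng = np.random.default_rng(0)
n = 500
A = (np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1) + 0.1 * np.eye(n))
z = rng.standard_normal(n)
x = lanczos_f_of_A_times_b(A, z, m=50, f=lambda t: t ** -0.5)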
Abstract:
In this paper, we consider a modified anomalous subdiffusion equation with a nonlinear source term, which describes processes that become less anomalous as time progresses through the inclusion of a second fractional time derivative acting on the diffusion term. A new implicit difference method is constructed. The stability and convergence are discussed using a new energy method. Finally, some numerical examples are given. The numerical results demonstrate the effectiveness of the theoretical analysis.
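In schematic form (an assumption based on how such equations are commonly written in the literature, not necessarily the exact equation studied here), a modified anomalous subdiffusion equation with a nonlinear source term reads

∂u(x,t)/∂t = (A ∂^(1−α)/∂t^(1−α) + B ∂^(1−β)/∂t^(1−β)) ∂²u(x,t)/∂x² + f(u, x, t),   0 < α, β < 1,

where the time derivatives of order 1−α and 1−β are of Riemann–Liouville type; it is the second fractional term acting on the diffusion term that allows the process to become less anomalous as time progresses.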