348 results for solid sampling technique


Relevance: 20.00%

Publisher:

Abstract:

Size distributions of expiratory droplets expelled during coughing and speaking, and the velocities of the expiration air jets of healthy volunteers, were measured. Droplet size was measured using the Interferometric Mie Imaging (IMI) technique, while the Particle Image Velocimetry (PIV) technique was used for measuring air velocity. These techniques allowed measurements in close proximity to the mouth and avoided air sampling losses. The average expiration air velocity was 11.7 m/s for coughing and 3.9 m/s for speaking. Under the experimental setting, evaporation and condensation effects had a negligible impact on the measured droplet size. The geometric mean diameter of droplets was 13.5 µm for coughing and 16.0 µm for speaking (counting droplets in the 1 to 100 µm range). The estimated total number of droplets expelled ranged from 947 to 2085 per cough and 112 to 6720 for speaking. The estimated droplet concentrations ranged from 2.4 to 5.2 cm⁻³ per cough and 0.004 to 0.223 cm⁻³ for speaking.

Abstract:

The measurement of Cobb angles from radiographs is routine practice in spinal clinics. The technique relies on the use and availability of specialist equipment such as a goniometer, cobbometer or protractor. The aim of this study was to validate the use of the iPhone (Apple Inc) combined with Tilt Meter Pro software, as compared to a protractor, in the measurement of Cobb angles. Between November 2008 and December 2008, 20 patients were selected at random from the Paediatric Spine Research Group's database. A power calculation indicated that, with n=240 measurements, the study had a 96% chance of detecting a 5 degree difference between groups. All patients had idiopathic scoliosis, with a range of curve types and severities. The study found that the iPhone combined with Tilt Meter Pro software offers a faster alternative to the traditional method of Cobb angle measurement, and a more convenient way of measuring Cobb angles in the outpatient setting. The intra-observer repeatability of the iPhone is equivalent to that of the protractor.
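The reported power figure can be sketched with a standard normal-approximation power formula for detecting a mean difference between two groups. This is an illustrative sketch only: the within-group standard deviation (10 degrees here) and the split of the 240 measurements into two equal groups are assumptions made for the example, not values reported in the study, so the computed power will only roughly resemble the quoted 96%.

```python
from math import erf, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_sample_power(delta: float, sigma: float, n_per_group: int,
                     z_crit: float = 1.959964) -> float:
    """Approximate power of a two-sided, two-sample z-test of means."""
    se = sigma * sqrt(2.0 / n_per_group)   # standard error of the difference
    return norm_cdf(delta / se - z_crit)

# Illustrative numbers only: 240 total measurements as two groups of 120,
# a 5-degree difference to detect, and an ASSUMED SD of 10 degrees.
power = two_sample_power(delta=5.0, sigma=10.0, n_per_group=120)
print(round(power, 2))
```

Raising the assumed standard deviation lowers the computed power, which is why the SD assumption must come from pilot data in a real study design.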

Abstract:

Field and laboratory measurements identified a complex relationship between the odour emission rates provided by the US EPA dynamic emission chamber and the University of New South Wales wind tunnel. Using a range of model compounds in an aqueous odour source, we demonstrate that emission rates derived from the wind tunnel and the flux chamber are a function of the solubility of the materials being emitted, the concentrations of the materials within the liquid, and the aerodynamic conditions within the device (velocity in the wind tunnel, or flushing rate for the flux chamber). The ratio of wind tunnel to flux chamber odour emission rates (OU m⁻² s⁻¹) ranged from about 60:1 to 112:1; the corresponding ratios for the individual model odorants varied from about 40:1 to over 600:1. These results may provide, for the first time, a basis for developing a model that allows an odour emission rate derived from either device to be used for odour dispersion modelling.

Abstract:

Quantitative behaviour analysis requires the classification of behaviour to produce the basic data. In practice, much of this work will be performed by multiple observers, and maximising inter-observer consistency is of particular importance. Another discipline where consistency in classification is vital is biological taxonomy. A classification tool of great utility, the binary key, is designed to simplify the classification decision process and ensure consistent identification of proper categories. We show how this same decision-making tool - the binary key - can be used to promote consistency in the classification of behaviour. The construction of a binary key also ensures that the categories in which behaviour is classified are complete and non-overlapping. We discuss the general principles of design of binary keys, and illustrate their construction and use with a practical example from education research.
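A binary key is a chain of yes/no questions ending in mutually exclusive categories, and its decision process can be sketched directly in code. The questions and behaviour categories below are hypothetical, for illustration only; they are not the education-research example from the paper.

```python
# A minimal sketch of a binary key for behaviour classification.
# Each node is either a category label (leaf) or a (question, yes, no) triple.
KEY = ("Is the student interacting with another person?",
       ("Is the interaction about the task?",
        "on-task collaboration",
        "off-task social"),
       ("Is the student manipulating task materials?",
        "on-task individual work",
        "off-task solitary"))

def classify(node, answers):
    """Walk the key, answering each question from `answers` (question -> bool)."""
    while isinstance(node, tuple):
        question, yes_branch, no_branch = node
        node = yes_branch if answers[question] else no_branch
    return node

observation = {
    "Is the student interacting with another person?": True,
    "Is the interaction about the task?": False,
}
print(classify(KEY, observation))  # -> off-task social
```

Because every observation follows exactly one path through the key, the resulting categories are complete and non-overlapping by construction, which is the property the paper exploits for inter-observer consistency.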

Abstract:

With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Given the large number of available Web services, finding an appropriate Web service that matches the user's requirement is a challenge, which warrants an effective and reliable process of Web service discovery. A considerable body of research has emerged on methods to improve the accuracy of Web service discovery in matching the best service. The process of Web service discovery results in many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used in describing the services, together with their input and output parameters, can lead to more accurate Web service discovery, and appropriate linking of individually matched services can then fully satisfy the user's requirements. This research integrates a semantic model and a data mining technique to enhance the accuracy of Web service discovery, through a novel three-phase Web service discovery methodology. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of the Web Service Description Language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find hidden meanings of the query terms which could not otherwise be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user; in such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase.
Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum traversal cost. The third phase, system integration, combines the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations of individual and composite Web services to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with those of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery; the proposed method outperforms both. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results further show that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase I) and the link analysis (phase II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
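The link analysis step can be sketched with a textbook all-pairs shortest-path algorithm (Floyd-Warshall) on a graph of services. The service names, linking costs and graph below are hypothetical, for illustration only; the thesis's actual cost model is not reproduced here.

```python
# A minimal all-pairs shortest-path sketch (Floyd-Warshall), with Web services
# as graph nodes and hypothetical linking costs as directed edge weights.
INF = float("inf")

def floyd_warshall(nodes, edges):
    """Return dict-of-dicts of minimum traversal costs between all node pairs."""
    dist = {u: {v: (0.0 if u == v else INF) for v in nodes} for u in nodes}
    for (u, v), w in edges.items():
        dist[u][v] = min(dist[u][v], w)
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

services = ["geocode", "weather", "format"]
links = {("geocode", "weather"): 2.0,   # output of geocode feeds weather
         ("weather", "format"): 1.0,
         ("geocode", "format"): 5.0}
cost = floyd_warshall(services, links)
print(cost["geocode"]["format"])  # -> 3.0 (via "weather", cheaper than direct)
```

A composite service is then the cheapest chain of compatible services between the query's inputs and the desired outputs, which is exactly what the minimum-cost path encodes.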

Abstract:

For certain continuum problems, it is desirable and beneficial to combine two different methods in order to exploit their advantages while evading their disadvantages. In this paper, a bridging transition algorithm is developed for combining the meshfree method (MM) with the finite element method (FEM). In this coupled method, the meshfree method is used in the sub-domain where high accuracy is required, and the finite element method is employed in the other sub-domains to improve computational efficiency. The MM domain and the FEM domain are connected by a transition (bridging) region. A modified variational formulation and the Lagrange multiplier method are used to ensure the compatibility of displacements and their gradients. To improve computational efficiency and reduce the meshing cost in the transition region, regularly distributed transition particles, independent of both the meshfree nodes and the FE nodes, can be inserted into the transition region. The newly developed coupled method is applied to the stress analysis of 2D solids and structures in order to investigate its performance and study its parameters. Numerical results show that the present coupled method is convergent, accurate and stable. The coupled method has promising potential for practical applications, because it can take advantage of both the meshfree method and the FEM while overcoming their shortcomings.
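The Lagrange multiplier idea for enforcing interface compatibility can be illustrated on a deliberately tiny 1D analogue: two spring (bar) sub-domains joined at an interface, with a single displacement constraint enforced through a saddle-point system. The stiffnesses, load and constraint are hypothetical; a real MM-FEM bridging scheme couples full 2D discretisations and also constrains gradients.

```python
# A minimal sketch of Lagrange-multiplier coupling between two sub-domains,
# reduced to two 1D spring (bar) elements joined at an interface.
import numpy as np

k_a, k_b, load = 2.0, 1.0, 1.0   # sub-domain stiffnesses and tip load (assumed)

# Free DOFs: u1 (end of sub-domain A), u2 and u3 (sub-domain B); u0 is fixed.
K = np.array([[k_a, 0.0, 0.0],          # sub-domain A after eliminating u0
              [0.0, k_b, -k_b],         # sub-domain B element
              [0.0, -k_b, k_b]])
B = np.array([[1.0, -1.0, 0.0]])        # compatibility constraint: u1 - u2 = 0
f = np.array([0.0, 0.0, load])

# Saddle-point (KKT) system enforcing the interface constraint exactly.
n, m = K.shape[0], B.shape[0]
A = np.block([[K, B.T], [B, np.zeros((m, m))]])
sol = np.linalg.solve(A, np.concatenate([f, np.zeros(m)]))
u, lam = sol[:n], sol[n:]
print(u)    # displacements; u1 == u2 at the interface
print(lam)  # multiplier = interface coupling force
```

The multiplier has a physical meaning here: it is the force the two sub-domains exert on each other at the interface, which is why the constraint is satisfied exactly rather than approximately as in penalty methods.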

Abstract:

Peer to peer systems have been widely used on the internet. However, most peer to peer information systems still lack some important features, for example cross-language IR (information retrieval) and collection selection/fusion features. Cross-language IR is a state-of-the-art research area in the IR research community, and it has not yet been used in any real-world IR system. Cross-language IR gives users the ability to issue a query in one language and receive documents in other languages. In a typical peer to peer environment, users come from multiple countries and their collections are in multiple languages, so cross-language IR can help users find documents more easily. For example, many Chinese researchers search for research papers in both Chinese and English; with cross-language IR, they can issue one query in Chinese and get documents in both languages. The out-of-vocabulary (OOV) problem is one of the key research areas in cross-language information retrieval. In recent years, web mining was shown to be one of the effective approaches to solving this problem. However, how to extract multiword lexical units (MLUs) from web content, and how to select the correct translations from the extracted candidate MLUs, are still two difficult problems in web-mining-based automated translation approaches. Discovering resource descriptions and merging results obtained from remote search engines are two key issues in distributed information retrieval studies. In uncooperative environments, query-based sampling and normalized-score based merging strategies are well-known approaches to these problems. However, such approaches only consider the content of the remote database and do not consider the retrieval performance of the remote search engine. This thesis presents research on building a peer to peer IR system with cross-language IR and an advanced collection-profiling technique for fusion.
In particular, this thesis first presents a new Chinese term measurement and a new Chinese MLU extraction process that work well on small corpora, together with an approach to selecting MLUs more accurately. The thesis then proposes a collection profiling strategy which can discover not only the collection content but also the retrieval performance of the remote search engine. Based on collection profiling, a web-based query classification method and two collection fusion approaches are developed and presented. Our experiments show that the proposed strategies are effective at merging results in uncooperative peer to peer environments. Here, an uncooperative environment is one in which each peer in the system is autonomous: peers are willing to share documents, but they do not share collection statistics. This is a typical peer to peer IR environment. Finally, all these approaches are brought together to build a secure peer to peer multilingual IR system that cooperates through X.509 and an email system.
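Normalized-score based merging can be sketched in a few lines: each peer's raw scores are rescaled to a common range, optionally weighted by a per-peer quality estimate of the kind collection profiling could supply, and then interleaved into one ranking. The peer names, raw scores and weights below are invented for illustration.

```python
# A minimal sketch of normalized-score based result merging for
# uncooperative distributed retrieval.
def min_max_normalize(results):
    """Rescale one peer's raw scores into [0, 1] (min-max normalization)."""
    scores = [s for _, s in results]
    lo, hi = min(scores), max(scores)
    span = (hi - lo) or 1.0
    return {doc: (s - lo) / span for doc, s in results}

def merge(peer_results, peer_weights):
    """Merge per-peer (doc, raw_score) lists into one ranked list of doc ids.

    peer_weights stands in for a collection-profiling estimate of each
    remote engine's retrieval performance.
    """
    merged = {}
    for peer, results in peer_results.items():
        w = peer_weights.get(peer, 1.0)
        for doc, norm in min_max_normalize(results).items():
            merged[doc] = max(merged.get(doc, 0.0), w * norm)
    return sorted(merged, key=merged.get, reverse=True)

peers = {"peer_cn": [("d1", 12.0), ("d2", 3.0)],
         "peer_en": [("d3", 0.9), ("d1", 0.1)]}
weights = {"peer_cn": 0.8, "peer_en": 1.0}
print(merge(peers, weights))  # -> ['d3', 'd1', 'd2']
```

Note the point the thesis makes: the weights here are the only place the remote engine's quality enters, so a content-only profile (all weights equal) cannot distinguish a good engine from a poor one over the same collection.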

Abstract:

An important aspect of designing any product is validation. The virtual design process (VDP) is an alternative to hardware prototyping, in which designs can be analysed without manufacturing physical samples. In recent years, VDPs have been generated mainly for animation or filming applications. This paper proposes a virtual reality design process model for one such application, used as a validation tool. The technique is used to generate a complete design guideline and validation tool for product design. To support the design process of a product, a virtual environment and a VDP method were developed that support validation and an initial design cycle performed by a designer. A car carrier product model is used as the illustration for which the virtual design was generated. The loading and unloading sequence of the prototype model was generated using automated reasoning techniques and was completed by interactively animating the product in the virtual environment before the complete design was built. Using the VDP, critical issues such as loading, unloading, Australian Design Rules (ADR) compliance and clearance analysis were addressed. The process saves time and money in physical sampling and, to a large extent, in complete math generation. Since only schematic models are required, it saves time in math modelling and in handling larger assemblies arising from model complexity. This extension of the VDP to design evaluation is unique, and it was developed and implemented successfully. In this paper, a Toll Logistics and J Smith and Sons car carrier, developed under the author's responsibility, is used to illustrate our approach to generating design validation via the VDP.

Abstract:

The measurement of Cobb angles on radiographs of patients with spinal deformities is routine practice in spinal clinics. The technique relies on the use and availability of specialist equipment such as a goniometer, cobbometer or protractor. The aim of this study was to validate the use of the iPhone (Apple Inc) combined with Tilt Meter Pro software, as compared to a protractor, in the measurement of Cobb angles. The iPhone combined with Tilt Meter Pro software offers a faster alternative to the traditional method of Cobb angle measurement, and a more convenient way of measuring Cobb angles in the outpatient setting. The intra-observer repeatability of the iPhone is equivalent to that of the protractor.

Abstract:

The process of structural health monitoring (SHM) involves monitoring a structure over a period of time using appropriate sensors, extracting damage-sensitive features from the measurements made by those sensors, and analysing these features to determine the current state of the structure. Various techniques are available for structural health monitoring, and acoustic emission (AE) is one technique finding increasing use. Acoustic emission waves are the stress waves generated by the mechanical deformation of materials. AE waves produced inside a structure can be recorded by sensors attached to the surface, and analysis of these recorded signals can locate and assess the extent of damage. This paper describes preliminary studies on the application of the AE technique to health monitoring of bridge structures. Crack initiation or structural damage results in wave propagation in the solid, which can take place in various forms. Propagation of these waves is likely to be affected by the dimensions, surface properties and shape of the specimen, which in turn affects source localization. Various laboratory test results on source localization, using pencil lead break tests, are presented. The results from these tests can be expected to aid in enhancing knowledge of the acoustic emission process and in developing an effective bridge structure diagnostic system.
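Source localization from pencil lead break tests can be illustrated in its simplest 1D form: two surface sensors, an assumed wave speed, and the time difference of arrival (TDOA) between them. The specimen length, sensor positions and wave speed below are illustrative values, not measurements from this study; real bridge members need 2D/3D arrays and measured wave speeds.

```python
# A minimal sketch of 1D acoustic-emission source localization from the
# time difference of arrival at two sensors bounding the source.
def locate_source_1d(x1, x2, t1, t2, wave_speed):
    """Source position between sensors at x1 < x2, from arrival times t1, t2.

    For a source at x with x1 <= x <= x2:
        t1 = t0 + (x - x1)/c,  t2 = t0 + (x2 - x)/c
    so eliminating the unknown emission time t0 gives
        x = (x1 + x2 - c*(t2 - t1)) / 2.
    """
    return (x1 + x2 - wave_speed * (t2 - t1)) / 2.0

# Simulated pencil-lead break at 0.3 m on a 1 m specimen, assumed wave
# speed 5000 m/s; arrival times follow from the distance to each sensor.
c = 5000.0
t1 = 0.3 / c          # sensor at 0.0 m
t2 = 0.7 / c          # sensor at 1.0 m
print(locate_source_1d(0.0, 1.0, t1, t2, c))
```

The same elimination of the unknown emission time underlies multi-sensor localization, where each extra sensor pair adds one TDOA equation and the source position is found by intersection or least squares.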

Abstract:

Do commencing students possess the level of information literacy (IL) knowledge and skills they need to succeed at university? What impact does embedding IL within the engineering and design curriculum have? This paper reports on the self-perception versus the reality of IL knowledge and skills, across a large cohort of first year built environment and engineering students. Acting on the findings of this evaluation, the authors (a team of academic librarians) developed an intensive IL skills program which was integrated into a faculty wide unit. Perceptions, knowledge and skills were re-evaluated at the end of the semester to determine if embedded IL education made a difference. Findings reveal that both the perception and reality of IL skills were significantly and measurably improved.

Abstract:

Matrix function approximation is a current focus of worldwide interest and finds application in a variety of areas of applied mathematics and statistics. In this thesis we focus on the approximation of A^(-α/2)b, where A ∈ ℝ^(n×n) is a large, sparse symmetric positive definite matrix and b ∈ ℝ^n is a vector. In particular, we will focus on matrix function techniques for sampling from Gaussian Markov random fields in applied statistics and for the solution of fractional-in-space partial differential equations. Gaussian Markov random fields (GMRFs) are multivariate normal random variables characterised by a sparse precision (inverse covariance) matrix. GMRFs are popular models in computational spatial statistics as the sparse structure can be exploited, typically through the use of the sparse Cholesky decomposition, to construct fast sampling methods. It is well known, however, that for sufficiently large problems, iterative methods for solving linear systems outperform direct methods. Fractional-in-space partial differential equations arise in models of processes undergoing anomalous diffusion. Unfortunately, as the fractional Laplacian is a non-local operator, numerical methods based on the direct discretisation of these equations typically require the solution of dense linear systems, which is impractical for fine discretisations. In this thesis, novel applications of Krylov subspace approximations to matrix functions for both of these problems are investigated. Matrix functions arise when sampling from a GMRF by noting that the Cholesky decomposition A = LL^T is, essentially, a `square root' of the precision matrix A. Therefore, we can replace the usual sampling method, which forms x = L^(-T)z, with x = A^(-1/2)z, where z is a vector of independent and identically distributed standard normal random variables.
Similarly, the matrix transfer technique can be used to build solutions to the fractional Poisson equation of the form ϕ_n = A^(-α/2)b, where A is the finite difference approximation to the Laplacian. Hence both applications require the approximation of f(A)b, where f(t) = t^(-α/2) and A is sparse. In this thesis we compare the Lanczos approximation, the shift-and-invert Lanczos approximation, the extended Krylov subspace method, rational approximations and the restarted Lanczos approximation for approximating matrix functions of this form. A number of novel results are presented in this thesis. Firstly, we prove the convergence of the matrix transfer technique for the solution of the fractional Poisson equation, and we give conditions under which the finite difference discretisation can be replaced by other methods for discretising the Laplacian. We then investigate a number of methods for approximating matrix functions of the form A^(-α/2)b, together with stopping criteria for these methods. In particular, we derive a new method for restarting the Lanczos approximation to f(A)b. We then apply these techniques to the problem of sampling from a GMRF and construct a full suite of methods for sampling conditioned on linear constraints and approximating the likelihood. Finally, we consider the problem of sampling from a generalised Matérn random field, which combines our techniques for solving fractional-in-space partial differential equations with our method for sampling from GMRFs.
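The plain Lanczos approximation to f(A)b, the baseline among the methods compared above, can be sketched in a few lines for f(t) = t^(-1/2), i.e. x = A^(-1/2)b. The test matrix, the step count m and the full-reorthogonalization step are illustrative choices made here, not the thesis's exact setup.

```python
# A minimal sketch of the Lanczos approximation to f(A) b with
# f(t) = t**(-1/2), as used for GMRF sampling: x = A**(-1/2) z.
import numpy as np

def lanczos_fAb(A, b, m, f):
    """Approximate f(A) b from an m-step Lanczos decomposition of symmetric A."""
    n = len(b)
    V = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w -= V[:, : j + 1] @ (V[:, : j + 1].T @ w)   # full reorthogonalization
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, Q = np.linalg.eigh(T)        # T is small (m x m), so f(T) is cheap
    fT_e1 = Q @ (f(evals) * Q[0, :])    # f(T) e_1 via the eigendecomposition
    return np.linalg.norm(b) * (V @ fT_e1)

# Illustrative SPD test matrix: shifted 1D Laplacian (finite differences).
n = 50
A = 2.5 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.linspace(1.0, 2.0, n)
x = lanczos_fAb(A, b, m=30, f=lambda t: t ** -0.5)
```

Only matrix-vector products with A are needed, which is the point for large sparse precision matrices: no Cholesky factor is ever formed.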

Abstract:

In this paper, we consider a modified anomalous subdiffusion equation with a nonlinear source term, describing processes that become less anomalous as time progresses through the inclusion of a second fractional time derivative acting on the diffusion term. A new implicit difference method is constructed, and its stability and convergence are discussed using a new energy method. Finally, some numerical examples are given; the numerical results demonstrate the effectiveness of the theoretical analysis.