425 results for experimental methodology


Relevance:

30.00%

Publisher:

Abstract:

Listening in language teaching refers to a complex process that allows us to understand spoken language. The current study, conducted in Iran with an experimental design, investigated the effectiveness of teaching listening strategies delivered in the learners' L1 (Persian) and their effect on L2 listening comprehension. Five listening strategies (guessing, making inferences, identifying topics, repetition, and note-taking) were taught over 14 weeks during one semester. Sixty lower-intermediate female participants were drawn from two EFL classrooms in an English language institute. The experimental class (n = 30) completed its classroom listening activities with a methodology that led learners through the five listening strategies in Persian. The same teacher taught the control class (n = 30), which completed the same listening activities without any of these strategies. Pre- and post-listening tests constructed by a group of experts at the institute assessed the effect of the strategy instruction delivered in L1. Results from the post-intervention listening test revealed that listening strategies delivered in L1 led to a statistically significant improvement (t = 10.083) in discrete listening scores compared with the control group.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The purpose of this study was to calculate the mechanical properties of tough-skinned vegetables as part of finite element modelling (FEM) and simulation of tissue damage during mechanical peeling. Design/methodology: Previous studies have examined the mechanical properties of fruits and vegetables; however, tissue behaviour differs under different processing operations. In this study an indentation test was performed on peel, flesh, and unpeeled samples of pumpkin as a tough-skinned vegetable. The test was performed at three loading rates for peel (1.25, 10, and 20 mm/min) and at 20 mm/min for flesh and unpeeled samples. A spherical-end indenter of 8 mm diameter was used for the experimental tests. Samples were prepared from defect-free, ripe pumpkins purchased from local shops in Brisbane, Australia. Humidity and temperature were 20-55% and 20-25 °C, respectively. Findings: Force-deformation and stress-strain curves of the samples were calculated and are shown in the presented figures. The relative contribution (%) of skin to different mechanical properties was computed and compared with data available in the literature. According to the results, peel samples had the highest rupture force (291 N) as well as the highest firmness (1411 N/m). Research limitations/implications: The study focused on one type of tough-skinned vegetable and one variety of pumpkin; further tests would give a better understanding of tissue behaviour. Additionally, examining peel, unpeeled, and flesh samples at different loading speeds would provide more detail on tissue damage during mechanical loading. Originality/value: The mechanical properties of pumpkin tissue were calculated from indentation tests; in particular, the behaviour of peel, flesh, and unpeeled samples was explored, which is a new approach in finite element modelling (FEM) of food processes.
Keywords: finite element modelling (FEM), relative contribution, firmness, toughness, rupture force.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
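The ABC rejection step described above can be illustrated with a minimal sketch. The model below (Poisson counts with a gamma prior) is a stand-in for the paper's Markov process epidemic models, and the names `simulate`, `abc_rejection`, and `utility`, along with the tolerance rule, are illustrative assumptions; the utility is the precision (inverse variance) of the ABC posterior, averaged over prior predictive data sets, with the model simulations pre-computed once per design as the abstract suggests.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n):
    # Hypothetical stand-in model: Poisson counts with rate theta at n
    # observation points (the paper uses Markov process epidemic models).
    return rng.poisson(theta, size=n)

def abc_rejection(thetas, sims, y_obs, tol):
    """Keep prior draws whose pre-computed simulations lie within `tol`
    of the observed data (Euclidean distance on sorted values)."""
    d = np.linalg.norm(np.sort(sims, axis=1) - np.sort(y_obs), axis=1)
    return thetas[d < tol]

def utility(n_obs, n_prior=2000, n_datasets=20):
    """Expected precision (1/variance) of the ABC posterior for a design
    with n_obs observations, averaged over prior predictive data sets."""
    thetas = rng.gamma(2.0, 2.0, size=n_prior)             # prior draws
    sims = np.array([simulate(t, n_obs) for t in thetas])  # pre-computed once
    tol = 2.5 * np.sqrt(n_obs)                             # ad hoc tolerance
    precisions = []
    for _ in range(n_datasets):
        y_obs = simulate(rng.gamma(2.0, 2.0), n_obs)       # prior predictive data
        post = abc_rejection(thetas, sims, y_obs, tol)
        if len(post) > 1:
            precisions.append(1.0 / np.var(post))
    return float(np.mean(precisions))

u_small, u_large = utility(3), utility(10)
```

Comparing `u_small` and `u_large` ranks candidate designs without a single likelihood evaluation, which is the point of combining ABC with the design search.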

Relevance:

30.00%

Publisher:

Abstract:

Although topic detection and tracking techniques have made great progress, most researchers pay little attention to two aspects. First, the construction of a topic model does not take the characteristics of different topics into consideration. Second, the factors that determine the formation and development of hot topics are not analysed further. In order to correctly extract news blog hot topics, this paper views these problems from a new perspective based on the W2T (Wisdom Web of Things) methodology, in which the characteristics of blog users, the context of topic propagation, and information granularity are investigated in a unified way. The motivations and features of blog users are first analysed to understand the characteristics of news blog topics. The context of topic propagation is then decomposed into the blog community, the topic network, and the opinion network. Important factors such as user behaviour patterns, opinion leaders, and network opinion are identified to track the development trends of news blog topics. Moreover, a blog hot topic detection algorithm is proposed, in which news blog hot topics are identified by measuring duration, topic novelty, the attention degree of users, and topic growth. Experimental results show that the proposed method is feasible and effective. These results are also useful for further study of the formation mechanism of opinion leaders in blogspace.
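As a toy illustration of the detection step, the sketch below scores topics on the four quantities the abstract lists (duration, topic novelty, user attention, topic growth). The `Topic` fields, the weights, and the threshold are invented for illustration and are not values from the paper.

```python
from dataclasses import dataclass

@dataclass
class Topic:
    # Per-topic statistics a blog crawler might accumulate (illustrative names).
    duration_days: float   # how long the topic has stayed active
    novelty: float         # 0..1, dissimilarity to previously seen topics
    attention: float       # 0..1, normalised user attention (views, comments)
    growth: float          # 0..1, relative growth in posts per day

def hot_score(t: Topic, w=(0.2, 0.3, 0.3, 0.2)):
    """Weighted combination of the four factors the paper measures.
    The weights are illustrative, not taken from the paper."""
    factors = (min(t.duration_days / 30.0, 1.0), t.novelty, t.attention, t.growth)
    return sum(wi * fi for wi, fi in zip(w, factors))

def detect_hot(topics, threshold=0.5):
    """Return topics whose combined score exceeds a threshold."""
    return [t for t in topics if hot_score(t) > threshold]

breaking = Topic(duration_days=10, novelty=0.9, attention=0.8, growth=0.9)
stale = Topic(duration_days=60, novelty=0.1, attention=0.2, growth=0.0)
hot = detect_hot([breaking, stale])   # only `breaking` passes the threshold
```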

Relevance:

30.00%

Publisher:

Abstract:

Electrostatic discharges have been identified as the most likely cause in a number of fire and explosion incidents with otherwise unexplained ignitions. The lack of data and suitable models for this ignition mechanism creates a void in any analysis that attempts to quantify the importance of static electricity as a credible ignition mechanism. Quantifiable hazard analysis of the risk of ignition by static discharge cannot, therefore, be fully carried out with our current understanding of the phenomenon. The study of electrostatics has been ongoing for a long time; however, it was not until the widespread use of electronics that research was directed at protecting electronic devices from electrostatic discharges. Existing experimental models for electrostatic discharge, developed for the intrinsic safety of electronics, are inadequate for ignition analysis and are typically not supported by theoretical analysis. A preliminary low-voltage simulation and experiment was designed to investigate the characteristics of energy dissipation and provided a basis for a high-voltage investigation. At low voltage, the discharge energy represented about 10% of the initial capacitive energy available, and the energy was dissipated within 10 ns of the start of the discharge. The potential difference is greatest at the initial breakdown, when the largest amount of energy is dissipated. Once the discharge pathway is established, minimal energy is dissipated there, as dissipation becomes strongly influenced by other components and stray resistance in the discharge circuit. From the initial low-voltage simulation work, the importance of the energy dissipation and the characteristics of the discharge were determined. After the preliminary low-voltage work was completed, a high-voltage discharge experiment was designed and fabricated.
Voltage and current measurements were recorded on the discharge circuit, allowing the discharge characteristic to be captured and the energy dissipated in the circuit to be calculated. The discharge energy calculations were consistent with the low-voltage work, with about 30-40% of the initial capacitive energy being dissipated in the resulting high-voltage arc. After the system was characterised and its operation validated, high-voltage ignition energy measurements were conducted on n-pentane evaporating in a 250 cm³ chamber. A series of ignition experiments was conducted to determine the minimum ignition energy of n-pentane. The ignition data were analysed with standard statistical regression methods for tests that return binary (yes/no) outcomes and found to be in agreement with recent publications. The research demonstrates that energy dissipation depends heavily on the circuit configuration, most especially on the discharge circuit's capacitance and resistance. The analysis established a discharge profile for the discharges studied and validates the application of this methodology to further research into different materials and atmospheres, by systematically examining the discharge profiles of test materials under various parameters (e.g., capacitance, inductance, and resistance). Systematic experiments on the discharge characteristics of the spark will also help explain how energy is dissipated in an electrostatic discharge, enabling a better understanding of the ignition characteristics of materials in terms of energy and its dissipation.
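The energy calculation described above, integrating the product of the measured voltage and current traces over the discharge, can be sketched as follows. The traces here are synthetic (an ideal capacitor discharging through a resistor, with illustrative component values), so the recovered fraction of the initial capacitive energy is close to 1 rather than the 30-40% measured for real arcs.

```python
import numpy as np

def discharge_energy(t, v, i):
    """Dissipated energy E = integral of v(t)*i(t) dt, via the trapezoidal rule."""
    p = v * i
    return float(np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(t)))

# Synthetic traces: ideal discharge of a capacitor through a resistor
# (illustrative values, not measurements from the study).
C, V0, R = 100e-12, 10e3, 1e3      # 100 pF charged to 10 kV, through 1 kOhm
tau = R * C                         # RC time constant
t = np.linspace(0.0, 10 * tau, 2000)
i = (V0 / R) * np.exp(-t / tau)     # discharge current
v = i * R                           # voltage across the resistor

E = discharge_energy(t, v, i)
E_initial = 0.5 * C * V0**2         # initial capacitive energy
fraction = E / E_initial            # close to 1 for this ideal circuit
```

With measured traces, the same integral applied to the arc voltage and current isolates the energy dissipated in the gap from that lost in stray circuit resistance.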

Relevance:

30.00%

Publisher:

Abstract:

Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future data set drawn from the prior predictive distribution. Many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature to rapidly obtain samples from the posterior is importance sampling, using the prior as the importance distribution. However, importance sampling will tend to break down if there is a reasonable number of experimental observations and/or the model parameter is high dimensional. In this paper we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution to obtain a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near optimal plasma sampling times which produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
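A minimal sketch of the Laplace-based importance sampling idea follows, using a conjugate normal model in which the Laplace approximation happens to be exact (so the importance weights are nearly uniform and the effective sample size is high); the real setting is the nonlinear pharmacokinetic model described above, and all names and values here are illustrative.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)

# Toy one-parameter model: y ~ Normal(theta, 1), prior theta ~ Normal(0, 5^2).
def log_post(theta, y):
    return stats.norm.logpdf(theta, 0, 5) + stats.norm.logpdf(y, theta, 1).sum()

def laplace_importance(y, n_draws=5000):
    """Laplace approximation at the posterior mode, then importance sampling
    with the Gaussian approximation as the proposal distribution."""
    mode = optimize.minimize_scalar(lambda th: -log_post(th, y)).x
    # Curvature via a numerical second derivative of the log posterior.
    h = 1e-4
    d2 = (log_post(mode + h, y) - 2 * log_post(mode, y) + log_post(mode - h, y)) / h**2
    prop = stats.norm(mode, 1.0 / np.sqrt(-d2))
    th = prop.rvs(n_draws, random_state=rng)
    logw = np.array([log_post(t, y) for t in th]) - prop.logpdf(th)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Posterior mean estimate and effective sample size of the weights.
    return float(np.sum(w * th)), float(1.0 / np.sum(w**2))

y = rng.normal(2.0, 1.0, size=20)
post_mean, ess = laplace_importance(y)
```

In a design search the same routine would be called once per prior predictive data set; using the Laplace fit as the proposal, rather than the prior, keeps the effective sample size high even with many observations.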

Relevance:

30.00%

Publisher:

Abstract:

Spatial organisation of proteins according to their function plays an important role in the specificity of their molecular interactions. Emerging proteomics methods seek to assign proteins to sub-cellular locations by partial separation of organelles and computational analysis of protein abundance distributions among partially separated fractions. Such methods permit simultaneous analysis of unpurified organelles and promise proteome-wide localisation in scenarios wherein perturbation may prompt dynamic re-distribution. Resolving organelles that display similar behavior during a protocol designed to provide partial enrichment represents a possible shortcoming. We employ the Localisation of Organelle Proteins by Isotope Tagging (LOPIT) organelle proteomics platform to demonstrate that combining information from distinct separations of the same material can improve organelle resolution and assignment of proteins to sub-cellular locations. Two previously published experiments, whose distinct gradients are alone unable to fully resolve six known protein-organelle groupings, are subjected to a rigorous analysis to assess protein-organelle association via a contemporary pattern recognition algorithm. Upon straightforward combination of single-gradient data, we observe significant improvement in protein-organelle association via both a non-linear support vector machine algorithm and partial least-squares discriminant analysis. The outcome yields suggestions for further improvements to present organelle proteomics platforms, and a robust analytical methodology via which to associate proteins with sub-cellular organelles.
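The benefit of combining separations can be illustrated with synthetic abundance profiles; the generator and the leave-one-out nearest-centroid classifier below are simple stand-ins for LOPIT fraction data and the paper's SVM / PLS-DA analyses, and all names and values are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def make_gradient(n_per_class, centres, n_fractions=8, noise=0.5):
    """Synthetic protein abundance profiles across gradient fractions;
    each 'organelle' class has a shifted smooth profile plus noise."""
    X, y = [], []
    for label, c in enumerate(centres):
        profile = np.sin(np.linspace(0, np.pi, n_fractions) + c)
        X.append(profile + noise * rng.normal(size=(n_per_class, n_fractions)))
        y += [label] * n_per_class
    return np.vstack(X), np.array(y)

def nearest_centroid_accuracy(X, y):
    """Leave-one-out nearest-centroid accuracy (a simple stand-in for the
    paper's support vector machine and PLS-DA classifiers)."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        cents = [X[mask & (y == k)].mean(axis=0) for k in np.unique(y)]
        d = [np.linalg.norm(X[i] - c) for c in cents]
        correct += int(np.argmin(d) == y[i])
    return correct / len(y)

centres = [0.0, 0.7, 1.4]                                   # three "organelles"
X1, y = make_gradient(40, centres)                          # gradient 1
X2, _ = make_gradient(40, [c + 0.3 for c in centres])       # gradient 2, same proteins
acc_single = nearest_centroid_accuracy(X1, y)
acc_combined = nearest_centroid_accuracy(np.hstack([X1, X2]), y)
```

Concatenating the two fraction profiles per protein is the "straightforward combination of single-gradient data" step; because the noise in the two separations is independent, the combined feature space separates the classes more cleanly.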

Relevance:

30.00%

Publisher:

Abstract:

From the earliest human creative expressions there has been a relationship between art, technology and science. In Western history this relationship is often seen as drawing on the advances in both art and science that occurred during the Renaissance, as captured in the polymath figure of da Vinci. The 20th-century development of computer technology, and the more recent emergence of creative practice-led research as a recognised methodology, has led to a renewed appreciation of the relationship between art, science and technology. This chapter focuses on transdisciplinary practices that bring together arts, science and technology in imaginative ways, showing how such combinations have led to changes in both practice and forms of creative expression for artists and their partners across disciplines. The aim of this chapter is to sketch an outline of the types of transdisciplinary creative research projects that currently signify best practice in the field, with reference to key literature and exemplars drawn from the Australian context.

Relevance:

30.00%

Publisher:

Abstract:

Background: Timely diagnosis and reporting of patient symptoms in hospital emergency departments (ED) is a critical component of health services delivery. However, due to dispersed information resources and a vast amount of manual processing of unstructured information, accurate point-of-care diagnosis is often difficult. Aims: The aim of this research is to report an initial experimental evaluation of a clinician-informed automated method addressing initial misdiagnoses associated with delayed receipt of unstructured radiology reports. Method: A method was developed that resembles clinical reasoning for identifying limb abnormalities. The method consists of a gazetteer of keywords related to radiological findings; it classifies an X-ray report as abnormal if the report contains evidence listed in the gazetteer. A set of 99 narrative reports of radiological findings was sourced from a tertiary hospital. Reports were manually assessed by two clinicians, and discrepancies were adjudicated by a third expert ED clinician; the final manual classification generated by the expert ED clinician was used as the ground truth for empirically evaluating the approach. Results: The automated method, which identifies limb abnormalities by searching for keywords supplied by clinicians, achieved an F-measure of 0.80 and an accuracy of 0.80. Conclusion: While the automated clinician-driven method achieved promising performance, a number of avenues for improvement were identified using advanced natural language processing (NLP) and machine learning techniques.
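A minimal sketch of a gazetteer-based classifier with F-measure scoring follows. The keyword list, the naive fixed-window negation check, and the toy reports are invented for illustration; they are not the study's gazetteer or data.

```python
import re

# Illustrative gazetteer of radiological findings; the study's actual
# keyword list was elicited from ED clinicians.
GAZETTEER = {"fracture", "dislocation", "avulsion", "subluxation", "effusion"}

def is_abnormal(report: str) -> bool:
    """Classify an X-ray report as abnormal if any gazetteer keyword appears,
    ignoring simple negations such as 'no fracture' in a short window."""
    text = report.lower()
    for kw in GAZETTEER:
        for m in re.finditer(r"\b" + kw + r"\b", text):
            window = text[max(0, m.start() - 20):m.start()]
            if not re.search(r"\bno\b|\bwithout\b", window):
                return True
    return False

def f_measure(preds, truths):
    """F1 score from binary predictions against ground-truth labels."""
    tp = sum(p and t for p, t in zip(preds, truths))
    fp = sum(p and not t for p, t in zip(preds, truths))
    fn = sum(not p and t for p, t in zip(preds, truths))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

reports = [
    ("Undisplaced fracture of the distal radius.", True),
    ("No fracture or dislocation seen.", False),
    ("Small joint effusion noted.", True),
    ("Normal alignment, soft tissues unremarkable.", False),
]
preds = [is_abnormal(r) for r, _ in reports]
truths = [t for _, t in reports]
score = f_measure(preds, truths)
```

The fixed-window negation handling is exactly the kind of brittleness the conclusion points at: more robust scope detection is where NLP and machine learning techniques would improve on the gazetteer.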

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose a design that maximises the mean of a utility, where the utility is a function of the posterior distribution, so its estimation requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that are neither analytic nor computable in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well-established method in the literature, together with the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model, the assumed true model of interest, whose likelihood is intractable. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
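The parameter map at the heart of II can be sketched as follows. The "generative" model here is a toy with an easy simulator (each observation is the maximum of five exponential waiting times), the auxiliary summary is the mean log observation (the location parameter of a log-normal auxiliary model), and the map is estimated by regression on simulations, as the abstract describes; all names and the grid are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_generative(theta, n):
    """Toy 'intractable' model: each observation is the maximum of five
    exponential waiting times with rate theta."""
    return rng.exponential(1.0 / theta, size=(n, 5)).max(axis=1)

def fit_auxiliary(y):
    """Auxiliary-model fit: the mean of the log data, i.e. the location
    parameter of a log-normal auxiliary model."""
    return np.log(y).mean()

# Estimate the map phi(theta) from generative to auxiliary parameters by
# simulating across a grid of theta values and regressing.
thetas = np.linspace(0.5, 5.0, 20)
phis = np.array([fit_auxiliary(simulate_generative(t, 500)) for t in thetas])
slope, intercept = np.polyfit(np.log(thetas), phis, deg=1)

def phi_hat(theta):
    """Fitted map; by scaling, phi is linear in log(theta) with slope -1."""
    return slope * np.log(theta) + intercept
```

Once `phi_hat` is in hand, an II posterior for `theta` can be formed through the tractable auxiliary likelihood evaluated at `phi_hat(theta)`, which is what makes utility estimation inside the Müller MCMC scheme affordable.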

Relevance:

30.00%

Publisher:

Abstract:

In recent years, the beauty leaf plant (Calophyllum inophyllum) has been considered a potential second-generation biodiesel source due to its high seed oil content, high fruit production rate, simple cultivation, and ability to grow in a wide range of climatic conditions. However, due to the high free fatty acid (FFA) content of the oil, the potential of this biodiesel feedstock remains unrealised, and little research has been undertaken on it. In this study, transesterification of beauty leaf oil to produce biodiesel was investigated. A two-step conversion method consisting of acid-catalysed pre-esterification and alkali-catalysed transesterification was used. The three main factors that drive the conversion of vegetable oil (triglycerides) to biodiesel (fatty acid methyl ester, FAME) were studied using response surface methodology (RSM) based on a Box-Behnken experimental design. The factors considered were catalyst concentration, methanol-to-oil molar ratio, and reaction temperature. Linear and full quadratic regression models were developed to predict FFA and FAME concentrations and to optimise the reaction conditions. The significance of these factors and their interactions in both stages was determined using analysis of variance (ANOVA). The conditions giving the largest reduction in FFA concentration in the acid-catalysed pre-esterification were a 30:1 methanol-to-oil molar ratio, 10% (w/w) sulfuric acid catalyst loading, and a 75 °C reaction temperature. In the alkali-catalysed transesterification, a 7.5:1 methanol-to-oil molar ratio, 1% (w/w) sodium methoxide catalyst loading, and a 55 °C reaction temperature gave the highest FAME conversion. The good agreement between model outputs and experimental results demonstrates that this methodology may be useful for industrial process optimisation of biodiesel production from beauty leaf oil, and possibly for other industrial processes as well.
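The design-and-regression stage can be sketched as below. The Box-Behnken construction for three coded factors is standard (twelve edge midpoints of the cube plus centre replicates); the response surface coefficients are invented to stand in for the study's measured responses, and the factor names are only labels for the coded variables.

```python
import numpy as np

rng = np.random.default_rng(4)

def design_box_behnken():
    """Box-Behnken design for 3 coded factors (-1..1): edge midpoints of
    the cube plus three centre replicates, 15 runs in total."""
    pts = []
    for i, j in [(0, 1), (0, 2), (1, 2)]:
        for a in (-1, 1):
            for b in (-1, 1):
                p = [0, 0, 0]
                p[i], p[j] = a, b
                pts.append(p)
    pts += [[0, 0, 0]] * 3
    return np.array(pts, dtype=float)

def quad_features(X):
    """Full quadratic model terms: intercept, linear, interactions, squares."""
    x1, x2, x3 = X.T   # e.g. coded catalyst %, methanol:oil ratio, temperature
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

X = design_box_behnken()                                     # 15 runs
true_beta = np.array([90, 3, 2, 1, 0.5, 0, 0, -4, -2, -1])   # made-up surface
y = quad_features(X) @ true_beta + rng.normal(0, 0.5, len(X))

# Fit the full quadratic model and locate the optimum over the coded region.
beta_hat, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
grid = np.array([[a, b, c] for a in np.linspace(-1, 1, 21)
                           for b in np.linspace(-1, 1, 21)
                           for c in np.linspace(-1, 1, 21)])
fitted = quad_features(grid) @ beta_hat
best, best_val = grid[np.argmax(fitted)], float(fitted.max())
```

The 15-run design supports all ten quadratic coefficients, which is why Box-Behnken is the usual choice for three-factor RSM studies like this one; `best` is the coded optimum that would be decoded back to catalyst loading, molar ratio, and temperature.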

Relevance:

30.00%

Publisher:

Abstract:

This work explores the potential of Australian native plants as a source of second-generation biodiesel for internal combustion engine applications. Biodiesels were evaluated from a number of non-edible oil seeds that grow naturally in Queensland, Australia. The quality of the produced biodiesels was investigated by several experimental and numerical methods. The research methodology and numerical model developed in this study can be used for a broad range of biodiesel feedstocks and for the future development of renewable native biodiesel in Australia.

Relevance:

30.00%

Publisher:

Abstract:

'Pars pro toto: Experimental Exhibition Design and Curatorial Paradigms' is situated within the ongoing debate over the conflation of art and curating, and the consequent tension between artistic autonomy and curatorial intervention. This practice-led research project negotiates these polarities using a collaborative and discursive curatorial methodology in the creation of two exhibitions. The two exhibitions, one digital and one primarily physical, investigated how the temporary exhibition can operate as a site for provocation and how the proposed methodology facilitates the relationship between artist and curator within this paradigm, and identified factors that help expand the definition of the contemporary curatorial role.

Relevance:

30.00%

Publisher:

Abstract:

Eleven carotid atherothrombotic plaque samples were harvested from patients. Three highly calcified samples were discarded, while eight yielded results. The elastic properties of the material were estimated by fitting the measured indentation response to finite element simulations. The methodology was refined and its accuracy quantified using a synthetic rubber. The neo-Hookean form of the material model gave a good fit to the measured response of the tissue. The inferred shear modulus μ was found to be in the range 7-100 kPa, with a median value of 11 kPa. A review of published materials data showed a wide range of material properties for human atherothrombotic tissue, and the effects of anisotropy and time dependence in these published results were highlighted. The present measurements were comparable to the static radial compression tests of Lee et al. (1991) [Structure-dependent dynamic behaviour of fibrous caps from human atherosclerotic plaques. Circulation 83, 1764-1770].
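As an illustration of the neo-Hookean model named above, the sketch below fits a shear modulus to synthetic uniaxial stress-stretch data using the closed-form incompressible neo-Hookean relation. The study itself inferred μ by matching indentation tests to finite element simulations, so this is a simplified stand-in; the data and noise level are invented, with μ set to the reported median of 11 kPa.

```python
import numpy as np

def neo_hookean_uniaxial_stress(stretch, mu):
    """Cauchy stress for an incompressible neo-Hookean solid in uniaxial
    loading: sigma = mu * (lambda^2 - 1/lambda)."""
    lam = np.asarray(stretch, dtype=float)
    return mu * (lam**2 - 1.0 / lam)

def fit_mu(stretches, stresses):
    """Least-squares estimate of the shear modulus mu; the model is linear
    in mu, so the fit has a closed form."""
    basis = stretches**2 - 1.0 / stretches
    return float(np.dot(basis, stresses) / np.dot(basis, basis))

# Synthetic check: recover a known mu from noisy stress-stretch data.
rng = np.random.default_rng(5)
mu_true = 11e3                                   # 11 kPa, the median reported
lam = np.linspace(0.8, 1.2, 30)                  # compression through tension
sigma = neo_hookean_uniaxial_stress(lam, mu_true)
sigma_noisy = sigma + rng.normal(0, 100.0, lam.shape)  # 100 Pa noise
mu_hat = fit_mu(lam, sigma_noisy)
```

The single-parameter form is what makes the neo-Hookean model attractive for soft tissue at moderate strains: one modulus summarises the response, at the cost of ignoring the anisotropy and time dependence noted in the review.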