982 results for science experiments


Relevance:

30.00%

Publisher:

Abstract:

Many continuum mechanical models, such as liquid drop models and solid models, have been developed for single living cell biomechanics studies. However, these models do not provide a full account of the behaviour of single living cells, such as swelling behaviour and drag effects. Hence, the porohyperelastic (PHE) model, which can capture those aspects, is a good candidate for studying cell behaviour (chondrocytes in this study). In this research, an FEM model of a single chondrocyte is developed using the PHE model to simulate Atomic Force Microscopy (AFM) experimental results at varying strain rates. This material model is compared with a viscoelastic model to demonstrate the advantages of the PHE model. The results show that the maximum applied force in the PHE model is lower at lower strain rates, because at very high strain rates the mobile fluid does not have enough time to exude, and because the permeability of the membrane is lower than that of the chondrocyte's protoplasm. This behaviour is barely observed in the viscoelastic model. Thus, the PHE model is the better model for cell biomechanics studies.

Relevance:

30.00%

Publisher:

Abstract:

The "standard" procedure for calibrating the Vesuvio eV neutron spectrometer at the ISIS neutron source, which has formed the basis for data analysis over at least the last decade, was recently documented in considerable detail by the instrument's scientists. Additionally, we recently derived analytic expressions for the sensitivity of recoil peak positions with respect to flight-path parameters and presented neutron–proton scattering results that together called into question the validity of the "standard" calibration. These investigations should contribute significantly to the assessment of experimental results obtained with Vesuvio. Here we present new results of neutron–deuteron scattering from D2 in the backscattering angular range (theta > 90 degrees), which are accompanied by a striking energy increase that violates the Impulse Approximation, thus leading unequivocally to the following dilemma: (A) either the "standard" calibration is correct, and the experimental results represent a novel quantum dynamical effect of D that stands in blatant contradiction to conventional theoretical expectations; or (B) the present "standard" calibration procedure is seriously deficient and leads to artificial outcomes. For Case (A), we allude to the topic of attosecond quantum dynamical phenomena and our recent neutron scattering experiments from H2 molecules. For Case (B), we make some suggestions as to how the "standard" calibration could be considerably improved.
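For context, the recoil-peak kinematics at issue can be stated compactly. This is a minimal sketch of the standard Impulse Approximation relations in conventional notation, not the authors' exact sensitivity expressions:

```latex
% Standard Impulse Approximation kinematics (conventional notation).
% E_0, E_1: incident and final neutron energies; m_n: neutron mass;
% \theta: scattering angle; M: mass of the struck nucleus.
\begin{align*}
  \hbar\omega &= E_0 - E_1,\\
  \hbar^2 q^2 &= 2 m_n \left( E_0 + E_1 - 2\sqrt{E_0 E_1}\,\cos\theta \right),\\
  \text{recoil peak:}\quad \hbar\omega &= \frac{\hbar^2 q^2}{2M}.
\end{align*}
```

On an inverse-geometry spectrometer such as Vesuvio, E_1 is fixed by the analyser and E_0 is inferred from the total time of flight t = L_0/v_0 + L_1/v_1, which is why errors in the flight-path parameters L_0 and L_1 shift the apparent recoil peak.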

Relevance:

30.00%

Publisher:

Abstract:

Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph relating prediction error to sample size is provided, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set so as to maximise the determinant of the variance-covariance matrix of the predictions.
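The two augmentation rules described above can be sketched with a Gaussian-process surrogate. This is a minimal illustration only: the toy function, kernel, and candidate grid are stand-ins, not the paper's fire model or settings.

```python
# Sketch of two design-augmentation rules for a computer experiment,
# using a Gaussian-process surrogate of the computer code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x[:, 0]) + x[:, 1] ** 2   # stand-in for the computer code

X = rng.uniform(0, 1, size=(10, 2))                # initial design
y = f(X)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3)).fit(X, y)

candidates = rng.uniform(0, 1, size=(200, 2))      # candidate input points

# Rule 1: add the single candidate with maximum prediction variance.
_, sd = gp.predict(candidates, return_std=True)
x_next = candidates[np.argmax(sd)]

# Rule 2: add a batch of k points maximising the determinant of the
# predictive variance-covariance matrix (greedy search for simplicity).
def greedy_max_det(gp, cand, k):
    chosen, pool = [], list(range(len(cand)))
    for _ in range(k):
        best, best_det = None, -np.inf
        for i in pool:
            _, cov = gp.predict(cand[chosen + [i]], return_cov=True)
            det = np.linalg.det(cov)
            if det > best_det:
                best, best_det = i, det
        chosen.append(best)
        pool.remove(best)
    return cand[chosen]

batch = greedy_max_det(gp, candidates, k=3)
print("max-variance point:", x_next)
print("max-determinant batch:\n", batch)
```

The greedy subset search is one cheap way to approximate the maximum-determinant choice; an exhaustive search over candidate subsets would be exact but combinatorial.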

Relevance:

30.00%

Publisher:

Abstract:

Deterministic computer simulation of physical experiments is now a common technique in science and engineering, since physical experiments are often too time-consuming, too expensive, or impossible to conduct. The use of complex computer models, or codes, in place of physical experiments leads to the study of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using them. In particular, it studies how many runs a computer experiment requires and how an experiment should be augmented, with attention given to the case where the response is a function over time.

Relevance:

30.00%

Publisher:

Abstract:

This research project explores how interdisciplinary art practices can provide ways of questioning and envisaging alternative modes of coexistence between humans and the non-humans who, together, make up the environment. As a practice-led project, it combines a body of creative work (50%) and this exegesis (50%). My interdisciplinary artistic practice appropriates methods and processes from science and engineering and merges them into artistic contexts for critical and poetic ends. By blending pseudo-scientific experimentation with creative strategies such as visual fiction, humour, absurd public performance, and scripted audience participation, my work engages with a range of debates around ecology. This exegesis details the interplay between critical theory relating to these debates, the work of other creative practitioners, and my own evolving artistic practice. By utilising methods and processes drawn from my prior career in water engineering, I present an interdisciplinary synthesis that seeks to promote improved understanding of the causes and consequences of our ecological actions and inactions.

Relevance:

30.00%

Publisher:

Abstract:

A procedure is described for evaluating multiple scattering contributions in deep inelastic neutron scattering (DINS) studies using an inverse-geometry time-of-flight spectrometer. The accuracy of the Monte Carlo code DINSMS, used to calculate the multiple scattering, is tested by comparison with analytic expressions and with experimental data collected from polythene, polycrystalline graphite, and tin samples. It is shown that the Monte Carlo code gives an accurate representation of the measured data and can therefore be used to reliably correct DINS data.
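To make the idea concrete, here is a toy Monte Carlo estimate of single- versus multiple-scattering fractions in a homogeneous slab. Everything here (geometry, mean free path, isotropic re-direction) is invented for illustration; the real DINSMS code tracks energy-dependent cross-sections and the full instrument geometry.

```python
# Toy Monte Carlo: count scattering orders for neutrons crossing a slab.
import numpy as np

rng = np.random.default_rng(1)
MFP = 5.0          # mean free path (cm), invented value
THICK = 1.0        # slab thickness (cm), invented value
N = 100_000        # neutron histories

orders = np.zeros(N, dtype=int)
for i in range(N):
    x, mu, n_scat = 0.0, 1.0, 0      # position, direction cosine, order
    while True:
        x += mu * rng.exponential(MFP)   # free flight to next collision
        if x < 0.0 or x > THICK:         # escaped the slab
            break
        n_scat += 1
        mu = rng.uniform(-1.0, 1.0)      # isotropic re-direction
    orders[i] = n_scat

scattered = orders[orders > 0]
print(f"scattering probability       : {len(scattered) / N:.3f}")
print(f"multiple / total scattered   : {np.mean(scattered > 1):.3f}")
```

The ratio of multiple to total scattering estimated this way is what a correction procedure subtracts from the measured spectra.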

Relevance:

30.00%

Publisher:

Abstract:

In this paper we describe cooperative control algorithms for robots and sensor nodes in an underwater environment. Cooperative navigation is defined as the ability of a coupled system of autonomous robots to pool their resources to achieve long-distance navigation and a larger controllability space. Other useful forms of cooperation in underwater environments include the exchange of information, such as data download and retasking; cooperative localization and tracking; and physical connection (docking) for tasks such as deploying underwater sensor networks, collecting nodes, and rescuing damaged robots. We present experimental results obtained with an underwater system consisting of two very different robots and a number of sensor network modules. We describe the hardware and software architecture of this system, then the various interactions between the robots and sensor nodes and between the two robots, including cooperative navigation. Finally, we describe our experiments with this underwater system and present the data.

Relevance:

30.00%

Publisher:

Abstract:

This article examines manual textual categorisation by human coders, with the hypothesis that the law of total probability may be violated for difficult categories. An empirical evaluation was conducted, using crowdsourcing, to compare a one-step categorisation task with a two-step categorisation task. It was found that the law of total probability was violated. Both quantum and classical probabilistic interpretations of this violation are presented. Further studies are required to resolve whether quantum models are more appropriate for this task.
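The check being described has a simple form: the one-step judgement P(A) should equal the total-probability combination of the two-step judgements. The numbers below are invented placeholders, not the study's data.

```python
# Law-of-total-probability check: two-step vs one-step categorisation.
# In the two-step task coders first judge B (e.g. a broad category),
# then A; in the one-step task they judge A directly.
p_b = 0.60              # P(B) from step one (placeholder)
p_a_given_b = 0.70      # P(A | B) (placeholder)
p_a_given_not_b = 0.20  # P(A | not B) (placeholder)

# Classical prediction from the two-step task:
p_a_two_step = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)

p_a_one_step = 0.55     # P(A) measured directly in the one-step task

print(f"LTP prediction : {p_a_two_step:.3f}")   # 0.500 here
print(f"direct measure : {p_a_one_step:.3f}")
print(f"violation delta: {p_a_one_step - p_a_two_step:+.3f}")
```

A nonzero delta beyond sampling error is the kind of violation the article reports; quantum models accommodate it via interference terms that classical probability lacks.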

Relevance:

30.00%

Publisher:

Abstract:

A new cold-formed and resistance-welded section known as the hollow flange beam (HFB) has recently been developed in Australia. In contrast to the common lateral torsional buckling mode of I-beams, this unique section, comprising two stiff triangular flanges and a slender web, is susceptible to a lateral distortional buckling mode of failure involving lateral deflection, twist, and cross-section change due to web distortion. This lateral distortional buckling behaviour has been shown to significantly reduce the available flexural strength of HFBs. An investigation into the use of transverse web plate stiffeners to improve the lateral buckling capacity of HFBs was carried out using finite element analyses and large-scale experiments. This paper presents the details of the experimental investigation, the results, and the final stiffener arrangement, whereas the details of the finite element analyses are presented in a companion paper at this conference.

Relevance:

30.00%

Publisher:

Abstract:

The hollow flange beam (HFB) is a new cold-formed and resistance-welded section developed in Australia. Due to its unique geometry, comprising two stiff triangular flanges and a slender web, the HFB is susceptible to a lateral-distortional buckling mode of failure involving web distortion. An investigation using finite-element analyses showed that transverse web plate stiffeners effectively eliminated lateral-distortional buckling of HFBs, and thus any associated reduction in flexural capacity. A detailed experimental investigation was then carried out to validate the finite-element results and to improve the stiffener configuration further. This led to the development of a special stiffener that is screw-fastened to the flanges on alternate sides of the web. This paper presents the details of the experimental investigations, the results, and the final stiffener arrangement, whereas the details of the finite-element analyses are presented in a companion paper.

Relevance:

30.00%

Publisher:

Abstract:

In this age of rapidly evolving technology, teachers are encouraged by government, syllabuses, school management, and parents to adopt ICTs. Indeed, it is an expectation that teachers will incorporate technologies into their classroom teaching practices to enhance the learning experiences and outcomes of their students. In the science classroom in particular, a subject that traditionally incorporates hands-on experiments and practicals, the integration of modern technologies should be a major feature. Although myriad studies report on technologies that enhance students' learning outcomes in science, there is a dearth of literature on how teachers go about selecting technologies for use in the science classroom. Teachers can feel ill-prepared to assess the range of available choices and might feel pressured and somewhat overwhelmed by the avalanche of new developments thrust before them in marketing literature and teaching journals. The consequences of bad decisions are costly in terms of money, time, and teacher confidence. Additionally, no research to date has identified which technologies science teachers use on a regular basis, or whether some purchased technologies have proven too problematic, preventing their sustained use and possible wider adoption. The primary aim of this study was to provide research-based guidance to aid teachers' decision-making in choosing technologies for the science classroom. The study unfolded in several phases. The first phase involved survey and interview data from teachers about the technologies they currently use in their science classrooms and the frequency of that use. These data were coded and analysed using the grounded theory of Corbin and Strauss, resulting in the development of the PETTaL model, which captures the salient factors in the data. The model incorporates usability theory from the human–computer interaction literature, as well as education theory and models such as Mishra and Koehler's (2006) TPACK model, where the grounded data indicated these issues. The PETTaL model identifies Power (school management, syllabus, etc.), Environment (classroom/learning setting), Teacher (personal characteristics, experience, epistemology), Technology (usability, versatility, etc.), and Learners (academic ability, diversity, behaviour, etc.) as fields that can impact the use of technology in science classrooms. The PETTaL model was used to create the Predictive Evaluation Tool (PET), designed to assist teachers in choosing technologies, particularly for science teaching and learning. The evolution of the PET was cyclical (employing an agile development methodology), involving repeated testing with in-service and pre-service teachers at each iteration and incorporating their comments in subsequent versions. Once no new suggestions were forthcoming, the PET was tested with eight in-service teachers, and the results showed that the PET outcomes obtained by these experienced teachers concurred with their instinctive evaluations. They felt the PET would be a valuable tool when considering new technology, particularly as a means of communicating perceived value between colleagues, and between budget holders and requestors during the acquisition process. It is hoped that the PET can make the tacit knowledge that experienced teachers acquire about technology use in classrooms explicit to novice teachers.
Additionally, the PET could be used as a research tool to discover a teacher's professional development needs. The outcomes of this study can therefore aid teachers in selecting educationally productive and sustainable new technologies for their science classrooms. The study has produced an instrument for assisting teachers in the decision-making process associated with new technologies, and the instrument is generic in that it can be applied to all subject areas. Further, the study has produced a powerful model that extends the TPACK model, which is currently extensively employed to assess teachers' use of technology in the classroom. The PETTaL model, grounded in the data from this study, responds to calls in the literature for TPACK's further development. As a theoretical model, PETTaL has the potential to serve as a framework for developing a teacher's reflective practice (either self-evaluation or critical evaluation of observed teaching practices), and for formulating a teacher's personal professional development plan. It will be the basis for further studies in this field.

Relevance:

30.00%

Publisher:

Abstract:

The movement of exotic biota into native ecosystems is central to debates about the acclimatisation of plants in the settler colonies of the nineteenth century. For example, plants like lucerne from Europe and sudan grass from South Africa were transferred to Australia to support pastoral economies. The saltbush Atriplex spp. is an anomaly: it, too, eventually became the subject of acclimatisation within its native Australia, because it was also deemed useful to the pastoralists of arid and semi-arid New South Wales. When settlers first came to this part of Australia, however, their initial perception was that the plants were useless. We trace this transformation from the desert 'desperation' plant of early settlement to the 'precious' conservation species of the 1880s onwards, when there were changes in both management strategies and cultural responses to saltbush in Australia. This reconsideration can be seen in scientific assessments and experiments, in the way the plant was commoditised by seed and nursery traders, and in its use as a metaphor in bush poetry to connote a gendered nationalist figure in Saltbush Bill. We argue that the initial settlers, often optimistic about European management techniques, had nothing but contempt for indigenous plants. The later impulse toward the conservation of natives arose from experiences of bitter failure and despair over attempts to impose European methods, which in turn forced a re-evaluation of Australian species.

Relevance:

30.00%

Publisher:

Abstract:

Facial expression recognition (FER) systems must ultimately work on real data in uncontrolled environments, although most research studies have been conducted on lab-based data with posed or evoked facial expressions obtained in pre-set laboratory environments. It is very difficult to obtain data in real-world situations because privacy laws prevent the unauthorized capture and use of video from events such as funerals, birthday parties, and marriages, and it is a challenge to acquire such data on a scale large enough for benchmarking algorithms. Video obtained from TV, movies, or postings on the World Wide Web may also contain 'acted' emotions and facial expressions, yet it may be more 'realistic' than the lab-based data currently used by most researchers. Or is it? One way of testing this is to compare feature distributions and FER performance. This paper describes a database collected from television broadcasts and the World Wide Web, containing a range of the environmental and facial variations expected in real conditions, and uses it to answer this question. A fully automatic system that uses a fusion-based approach for FER on such data is introduced for performance evaluation. Performance improvements arising from the fusion of point-based texture and geometry features, and robustness to image scale variations, are experimentally evaluated on this image and video dataset. Differences in FER performance between lab-based and realistic data, between different feature sets, and between different train-test data splits are investigated.
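As a rough illustration of the fusion idea, the sketch below combines two feature streams at the score level. The classifier choice, fusion weight, and synthetic features are all invented stand-ins, not the paper's texture and geometry pipeline.

```python
# Score-level fusion of two feature sets for expression classification.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, n_tex, n_geo = 600, 59, 10                            # invented dimensions
y = rng.integers(0, 6, size=n)                           # six basic expressions
X_tex = rng.normal(size=(n, n_tex)) + y[:, None] * 0.3   # synthetic "texture" features
X_geo = rng.normal(size=(n, n_geo)) + y[:, None] * 0.2   # synthetic "geometry" features

idx_tr, idx_te = train_test_split(np.arange(n), test_size=0.3, random_state=0)

# One classifier per feature stream.
clf_tex = SVC(probability=True).fit(X_tex[idx_tr], y[idx_tr])
clf_geo = SVC(probability=True).fit(X_geo[idx_tr], y[idx_tr])

# Weighted sum of posterior scores; w is a tunable fusion weight.
w = 0.6
scores = (w * clf_tex.predict_proba(X_tex[idx_te])
          + (1 - w) * clf_geo.predict_proba(X_geo[idx_te]))
pred = clf_tex.classes_[np.argmax(scores, axis=1)]
print("fused accuracy:", np.mean(pred == y[idx_te]))
```

Comparing the fused accuracy against each single-stream classifier on held-out data is the basic experiment behind fusion-based performance claims.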

Relevance:

30.00%

Publisher:

Abstract:

Objectives: The goal of this article is to examine whether the results of the Queensland Community Engagement Trial (QCET), a randomized controlled trial that tested the impact of procedural justice policing on citizen attitudes toward police, were affected by different types of nonresponse bias. Method: We use two methods (the Cochrane and Elffers methods) to explore nonresponse bias. First, we assess the impact of the low response rate by examining the effects of nonresponse group differences between the experimental and control conditions, and of the pooled variance, under different scenarios. Second, we assess the degree to which item response rates are influenced by the control and experimental conditions. Results: Our analysis of the QCET data suggests that our substantive findings are not influenced by the low response rate in the trial. The results are robust even under extreme conditions; statistical significance would be compromised only if the pooled variance were much larger for the nonresponse group and the difference between experimental and control conditions were greatly diminished. We also find no biases in the item response rates across the experimental and control conditions. Conclusion: RCTs that involve field survey responses, like QCET, are potentially compromised by low response rates and by the way item response rates might be influenced by the control or experimental conditions. Our results show that the QCET findings were not sensitive to the overall low response rate across the experimental and control conditions, and that item response rates were not significantly different across the experimental and control groups. Overall, our analysis suggests that the results of QCET are robust and that any biases in the survey responses do not significantly influence the main experimental findings.
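The scenario analysis described has this general shape: re-test the treatment effect while assuming progressively less favourable properties for the nonrespondents. All numbers below are invented placeholders; the actual study applied the Cochrane and Elffers methods to the QCET survey data.

```python
# Sensitivity of a two-group comparison to nonresponse assumptions.
import numpy as np
from scipy import stats

# Observed respondent summaries (invented placeholders).
n_exp, mean_exp, sd_exp = 400, 3.9, 1.0   # experimental condition
n_ctl, mean_ctl, sd_ctl = 420, 3.6, 1.0   # control condition

for shrink in (1.0, 0.75, 0.5):        # assumed attenuation of the effect
    for var_mult in (1.0, 1.5, 2.0):   # assumed inflation of pooled variance
        diff = (mean_exp - mean_ctl) * shrink
        se = np.sqrt(var_mult * (sd_exp**2 / n_exp + sd_ctl**2 / n_ctl))
        z = diff / se
        p = 2 * stats.norm.sf(abs(z))  # two-sided p-value
        print(f"shrink={shrink:.2f}, var x{var_mult:.1f}: z={z:5.2f}, p={p:.4f}")
```

If the effect stays significant across the grid, the finding is robust to nonresponse in the sense the abstract describes; it is lost only where the effect is heavily shrunk and the variance heavily inflated at the same time.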