22 results for Simple methods

in Deakin Research Online - Australia


Relevance:

60.00%

Publisher:

Abstract:

A retrospective assessment of exposure to benzene was carried out for a nested case control study of lympho-haematopoietic cancers, including leukaemia, in the Australian petroleum industry. Each job or task in the industry was assigned a Base Estimate (BE) of exposure derived from task-based personal exposure assessments carried out by the company occupational hygienists. The BEs corresponded to the estimated arithmetic mean exposure to benzene for each job or task and were used in a deterministic algorithm to estimate the exposure of subjects in the study. Nearly all of the data sets underlying the BEs were found to contain some values below the limit of detection (LOD) of the sampling and analytical methods, and some were very heavily censored; up to 95% of the data were below the LOD in some data sets. It was therefore necessary to use a method of calculating the arithmetic mean exposures that took the censored data into account. Three different methods were employed in an attempt to select the most appropriate method for the particular data in the study. A common method is to replace the missing (censored) values with half the detection limit. This method has been recommended for data sets where much of the data are below the limit of detection or where the data are highly skewed, with a geometric standard deviation of 3 or more. Another method, involving replacing the censored data with the limit of detection divided by the square root of 2, has been recommended when relatively few data are below the detection limit or where the data are not highly skewed. A third method that was examined is Cohen's method, which involves mathematical extrapolation of the left-hand tail of the distribution, based on the distribution of the uncensored data, and calculation of the maximum likelihood estimate of the arithmetic mean. When these three methods were applied to the data in this study, it was found that the first two simple methods gave similar results in most cases. Cohen's method, on the other hand, gave results that were generally, but not always, higher than the simpler methods and in some cases gave extremely high and even implausible estimates of the mean. It appears that if the data deviate substantially from a simple log-normal distribution, particularly if high outliers are present, then Cohen's method produces erratic and unreliable estimates. After examining these results, and both the distributions and proportions of censored data, it was decided that the half limit of detection method was most suitable in this particular study.
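As an illustration of the three approaches described above, the following Python sketch (hypothetical data, not the estimators implemented in the study) computes an arithmetic mean from a censored data set using LOD/2 substitution, LOD/√2 substitution, and a maximum-likelihood estimate under an assumed log-normal distribution, in the spirit of Cohen's method.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def mean_substitute(values, lod, divisor=2.0):
    """Arithmetic mean after replacing non-detects with LOD/divisor
    (divisor=2 for the half-LOD rule, sqrt(2) for the LOD/sqrt(2) rule)."""
    filled = np.where(values < lod, lod / divisor, values)
    return filled.mean()

def mean_censored_mle(values, lod):
    """Maximum-likelihood arithmetic mean assuming a log-normal distribution
    left-censored at the LOD (in the spirit of Cohen's method)."""
    log_detects = np.log(values[values >= lod])
    n_censored = int((values < lod).sum())

    def negloglik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)                          # keep sigma positive
        ll = stats.norm.logpdf(log_detects, mu, sigma).sum()
        ll += n_censored * stats.norm.logcdf(np.log(lod), mu, sigma)
        return -ll

    start = [log_detects.mean(), np.log(log_detects.std() + 0.1)]
    mu, log_sigma = minimize(negloglik, start, method="Nelder-Mead").x
    sigma = np.exp(log_sigma)
    return np.exp(mu + 0.5 * sigma ** 2)                   # arithmetic mean of a log-normal

# Hypothetical, heavily censored data set (ppm); any value below LOD = 0.1 is a non-detect
data = np.array([0.05, 0.05, 0.05, 0.05, 0.12, 0.30, 0.80])
print(mean_substitute(data, 0.1))                          # half-LOD rule
print(mean_substitute(data, 0.1, divisor=np.sqrt(2)))      # LOD/sqrt(2) rule
print(mean_censored_mle(data, 0.1))                        # censored log-normal MLE
```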

Relevance:

60.00%

Publisher:

Abstract:

Over the past 10 years or so, confidence intervals have become increasingly recognised in program evaluation and quantitative health measurement generally as the preferred way of reporting the accuracy of statistical estimates. Statisticians have found that the more traditional ways of reporting results - using P-values and hypothesis tests - are often very difficult to interpret and can be misleading. This is particularly the case when sample sizes are small and results are 'negative' (i.e. P > 0.05); in these cases, a confidence interval can communicate much more information about the sample and, by inference, about the population. Despite this trend among statisticians and health promotion evaluators towards the use of confidence intervals, it is surprisingly difficult to find succinct and reasonably simple methods to actually compute a confidence interval. This is particularly the case for proportions or percentages. Much of the data analysed in health promotion are binary or categorical, rather than the quantities and continuous variables often found in laboratories or other branches of science, so there is a need for health promotion evaluators to be able to present confidence intervals for percentages or proportions. However, the most popular statistical analysis computer package among health promotion professionals, SPSS, does not have a routine to compute a simple confidence interval for a proportion! To address this shortcoming, I present in this paper some fairly simple strategies for computing confidence intervals for population percentages, both manually and using the right computer software.
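As an indication of the kind of simple strategies referred to (the paper's own worked procedures are not reproduced here), the following Python sketch computes a 95% confidence interval for a proportion using the Wilson score method and the simple normal-approximation (Wald) method.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score confidence interval for a proportion (z=1.96 gives ~95% coverage)."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

def wald_interval(successes, n, z=1.96):
    """Normal-approximation (Wald) interval; adequate when n*p and n*(1-p) are both large."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# Hypothetical example: 30 of 120 respondents report the behaviour of interest
print(wilson_interval(30, 120))
print(wald_interval(30, 120))
```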

Relevance:

60.00%

Publisher:

Abstract:

The recent discovery of liquid crystalline (LC) behavior of graphene oxide (GO) dispersions in various organic and aqueous media brings added control to the assembly of larger structures using the chemical process approach.[1-3] The LC state can be used to direct the ordered assembly of nanocomponents into macroscopic structures via simple methods like wet-spinning.[3] Here, we developed a scalable fabrication route to produce graphene fibers via a facile continuous wet-spinning method. We develop a solid understanding of the criteria required to correlate processability with LC behavior, aspect ratio and dispersion concentration, to provide a viable platform for spinning of LC GO. We demonstrate a striking result that highlights the importance of GO sheet size and polydispersity in generating wet-spinnable LC GO dispersions from very low spinning dope concentrations (as low as 0.075 wt.%). The new knowledge gained through rheological investigations provides a sound explanation as to why continuous spinning of binder-free GO fibers is enabled by the LC behavior at this very low concentration.

Relevance:

60.00%

Publisher:

Abstract:

This paper presents simple methods for determining the parameters of an interior permanent magnet (IPM) synchronous generator, such as the magnet flux (λM), d-axis inductance (Ld) and q-axis inductance (Lq), which are used to control the wind turbine generator. These methods are simple and do not require any complex theory, signal injection or special equipment. Moreover, a sensorless speed estimator is proposed to estimate the speed of the generator without using a speed sensor. The measured parameters are used in this speed estimator. The elimination of the speed sensor enhances system robustness and reduces the design complexity and system cost for the small-scale wind turbine considered in this paper. The effectiveness of the parameter measurement methods and the sensorless speed estimator is demonstrated by experimental results, which show that the proposed speed estimator using the measured parameters can estimate the generator speed with a small error.
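The abstract does not give the estimator's equations; the sketch below is a hypothetical illustration of one common model-based approach, recovering electrical speed from the steady-state q-axis voltage equation using the measured parameters λM and Ld together with the stator resistance. It is not necessarily the estimator proposed in the paper.

```python
def estimate_speed(v_q, i_d, i_q, R_s, L_d, flux_m):
    """Steady-state electrical speed estimate (rad/s) from the q-axis voltage equation
    of an IPM machine: v_q = R_s*i_q + w*(L_d*i_d + flux_m), so
    w = (v_q - R_s*i_q) / (L_d*i_d + flux_m)."""
    return (v_q - R_s * i_q) / (L_d * i_d + flux_m)

# Hypothetical measured operating point and parameters
print(estimate_speed(v_q=48.0, i_d=-1.2, i_q=6.5, R_s=0.35, L_d=0.008, flux_m=0.12))
```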

Relevance:

30.00%

Publisher:

Abstract:

Morphometric data on 92 Black-eared Miners and 47 Yellow-throated Miners that had been independently sexed using molecular techniques were analysed to investigate size dimorphism between the sexes. We found that both species are sexually dimorphic in size, with males being the larger sex. Discriminant analyses of morphometric data were used to develop a simple method for sexing both species in the hand. Additionally, alula shape was consistent with other methods that we applied for ageing individuals. Sex-specific size differences between Black-eared and Yellow-throated Miners detected here add further support to the contention that they represent different taxa. The application of these sexing and ageing techniques for both species of mallee miner will improve ongoing field management of the endangered Black-eared Miner.
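A minimal sketch of the discriminant-analysis approach to sexing birds in the hand (the measurement variables and values below are hypothetical, not those reported in the study):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical morphometric measurements (mm): wing length, head-bill length, tarsus length
X = np.array([[118, 42.1, 29.5],   # molecularly sexed males
              [121, 43.0, 30.2],
              [119, 42.5, 29.9],
              [110, 39.8, 27.8],   # molecularly sexed females
              [108, 39.2, 27.5],
              [111, 40.1, 28.0]])
y = np.array(["M", "M", "M", "F", "F", "F"])

# Fit a linear discriminant function; its coefficients give a simple field sexing rule
lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[115, 41.0, 29.0]]))   # predicted sex for a new, unsexed bird
```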

Relevance:

30.00%

Publisher:

Abstract:

Background: With an increasing focus on obesity prevention, there is a need for simple, valid tools to assess dietary indicators that may be the targets of intervention programs. The objective of this study was to determine the relative validity of previous day dietary intake using a newly developed parent-proxy questionnaire (EPAQ) for two to five year old children.

Methods: A convenience sample of participants (n = 90) recruited through preschools and the community in Geelong, Australia provided dietary data for their child via the EPAQ and an interviewer-administered 24-hour dietary recall (24-hr recall). Mean food and beverage group servings were compared between the EPAQ and the 24-hr recall, and Spearman rank correlations were computed to examine the association between the two methods.

Results: Mean servings of food/beverage groups were comparable between methods for all groups except water, and significant correlations were found between the servings of food and beverages using the EPAQ and 24-hr recall methods (ranging from 0.57 to 0.88).

Conclusion: The EPAQ is a simple and useful population-level tool for estimating the intake of obesity-related foods and beverages in children aged two to five years. When compared with 24-hour recall data, the EPAQ produced an acceptable level of relative validity and this short survey has application for population monitoring and the evaluation of population-based obesity prevention interventions for young children.
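The validation described in the Methods rests on rank correlations between the two instruments; a minimal sketch with hypothetical paired servings is shown below.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical paired servings (per child) of one food group from the EPAQ and the 24-hr recall
epaq_servings = np.array([2.0, 1.5, 3.0, 0.5, 2.5, 1.0])
recall_servings = np.array([1.8, 1.2, 3.2, 0.8, 2.0, 1.1])

rho, p_value = spearmanr(epaq_servings, recall_servings)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```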

Relevance:

30.00%

Publisher:

Abstract:

One of the results of the surge in interest in the internet is the great increase in availability of pictorial and video data. Web browsers such as Netscape give access to an enormous range of such data. In order to make use of large amounts of pictorial and video data, it is necessary to develop indexing and retrieval methods. Pictorial databases have made great progress recently, to the extent that there are now a number of commercially available products. Video databases are now being researched and developed from a number of different viewpoints. Given a general indexing scheme for video, the next step is to reuse clips in further applications. In this paper we present an initial application for the reuse of video clips. The aim of the system is to resequence video clips for a particular application. We have chosen a well-constrained application for this purpose, the aim being to produce a video tour of a campus between designated start and destination points from a set of indexed video clips. We use clips of a guide entering and leaving buildings on our campus, and when visitors select a start location and a destination, the system will retrieve clips suitable for guiding the visitor along the correct path. The system uses an index of spatial relationships of key objects for the video clips to decide which clips provide the correct sequence of motion around the campus. Although the full power of the indexing notation is unnecessary for this simple problem, the results from this initial implementation indicate that the concept could be applicable to more complex problems.
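The paper's spatial-relationship index notation is not reproduced here; the sketch below illustrates the underlying idea with a simplified, hypothetical index in which each clip is keyed only by its start and end locations, and a breadth-first search assembles a tour between the selected points.

```python
from collections import deque

# Hypothetical clip index: each clip is keyed by (start_location, end_location)
clips = {
    ("library", "union"): "clip_03.mpg",
    ("union", "science_block"): "clip_07.mpg",
    ("science_block", "car_park"): "clip_11.mpg",
}

def tour(start, destination):
    """Breadth-first search over clip endpoints; returns an ordered list of clips
    forming a continuous path from start to destination, or None if no path exists."""
    queue = deque([(start, [])])
    visited = {start}
    while queue:
        location, path = queue.popleft()
        if location == destination:
            return path
        for (a, b), clip in clips.items():
            if a == location and b not in visited:
                visited.add(b)
                queue.append((b, path + [clip]))
    return None

print(tour("library", "car_park"))   # ['clip_03.mpg', 'clip_07.mpg', 'clip_11.mpg']
```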

Relevance:

30.00%

Publisher:

Abstract:

From data generated using 1H NMR titrations, different methodologies to calculate binding constants are compared. The ‘local’ analysis method, which uses only a single isotherm (only one H-bond donor), is compared against the ‘global’ method (which includes many or all H-bond donors). The results indicate that for simple systems both methods are suitable; however, the global approach consistently provides a Ka value with uncertainties up to 30% smaller. For more complex binding, the global analysis method gives much more robust results than the local methods. This study also highlights the need to explore several different binding modes when data do not fit well to a simple 1:1 complexation model and illustrates the need for better methods to estimate uncertainties in supramolecular binding experiments.
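As a sketch of what a global fit involves (hypothetical titration data and a generic 1:1 isotherm, not the authors' fitting code), the example below fits one shared Ka across two observed signals simultaneously.

```python
import numpy as np
from scipy.optimize import least_squares

def bound_fraction(Ka, H0, G0):
    """Fraction of host bound in a 1:1 complex (quadratic solution of the equilibrium)."""
    b = H0 + G0 + 1.0 / Ka
    HG = (b - np.sqrt(b ** 2 - 4.0 * H0 * G0)) / 2.0
    return HG / H0

def global_residuals(params, H0, G0, shifts):
    """Global 1:1 fit: one shared Ka (fitted as log10 Ka to keep it positive) plus a
    (free shift, bound shift) pair for each followed signal."""
    Ka = 10.0 ** params[0]
    frac = bound_fraction(Ka, H0, G0)
    residuals = []
    for i, observed in enumerate(shifts):
        d_free, d_bound = params[1 + 2 * i], params[2 + 2 * i]
        residuals.append(observed - (d_free + (d_bound - d_free) * frac))
    return np.concatenate(residuals)

# Hypothetical titration: 1 mM host, increasing guest, two NH signals followed by 1H NMR
H0 = 0.001
G0 = np.array([0.0, 0.0005, 0.001, 0.002, 0.004, 0.008])
shifts = np.array([[7.10, 7.25, 7.36, 7.50, 7.61, 7.68],
                   [8.05, 8.14, 8.21, 8.30, 8.37, 8.41]])

x0 = [2.7, 7.1, 7.8, 8.05, 8.5]   # initial guesses: log10 Ka, then free/bound shift per signal
fit = least_squares(global_residuals, x0, args=(H0, G0, shifts))
print("Global Ka estimate:", 10.0 ** fit.x[0])
```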

Relevance:

30.00%

Publisher:

Abstract:

Several simple techniques are presented for the identification of the boundaries of chromatographic peaks. These methods provide a significant reduction in the time needed to perform the rapid, automatic calculation of the central peak moments and to evaluate the quality of a separation, while improving the accuracy of the measurements of column efficiencies. It was found that the identification of the peak boundaries as functions of the peak widths and the examination of the slope of the signal-to-noise versus time plot are viable alternatives to a manual determination.
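One simple width-based boundary rule of the kind described (a hypothetical illustration, not necessarily the authors' exact criterion) can be sketched as follows.

```python
import numpy as np

def peak_boundaries(t, y, widths_each_side=6.0):
    """Integration boundaries placed a fixed number of Gaussian sigmas (derived from the
    half-height width) either side of the apex -- one simple width-based rule."""
    apex = int(np.argmax(y))
    half = y[apex] / 2.0
    left = np.where(y[:apex] <= half)[0][-1]          # last point below half-height before the apex
    right = apex + np.where(y[apex:] <= half)[0][0]   # first point below half-height after the apex
    sigma = (t[right] - t[left]) / 2.355              # Gaussian: width at half-height = 2.355*sigma
    return t[apex] - widths_each_side * sigma, t[apex] + widths_each_side * sigma

# Hypothetical chromatogram: Gaussian peak (retention 5 min, sigma 0.1 min) on a flat baseline
t = np.linspace(0, 10, 2001)
y = 100 * np.exp(-0.5 * ((t - 5.0) / 0.1) ** 2)
print(peak_boundaries(t, y))   # roughly (4.4, 5.6) with the default of 6 sigma per side
```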

Relevance:

30.00%

Publisher:

Abstract:

This paper reports on part of a teacher/researcher’s PhD action research study. It explains the complexity of features that social media brings to the teaching and learning process while discussing the simplicity and power of its use. Through the action research cycle, learning programs were designed to take advantage of the unique communicative methods offered by social media and Web 2.0 whilst maintaining the value of face-to-face learning. Students used social media spaces such as blogs, groups and discussion forums, as well as developing their own profiles and avatars to communicate online by making friends, leaving comments and uploading content, which included publishing, peer reviewing and self-assessment. The author argues that, by designing learning that valued and combined the attributes of social media, Web 2.0 and face-to-face teaching, she was able to produce a more student-centred approach, hence developing a ‘hybrid’ learning environment which supported many 21st-century skills.

Relevance:

30.00%

Publisher:

Abstract:

This article reports on part of the author’s PhD action research study. It examines the complexity of features that social media and Web 2.0 offer when combined with face-to-face teaching and learning. Action research was used to help redesign the learning programs of thirteen Middle Years classes over an eighteen-month period. These learning programs took advantage of the unique communicative methods offered by social media and provided spaces such as blogs, groups and discussion forums. Students developed their own identity when working online, made online friends, left comments for peers and uploaded content, which included publishing, peer reviewing and self-assessment. The research highlighted the simplicity of the creation and exchange of user-generated content and interaction while identifying a complex depth behind such interaction. Designing learning programs using social media enabled the students to be active and valued participants in the learning process and supported a ‘hybrid’ learning environment.

Relevance:

30.00%

Publisher:

Abstract:

Architects and designers could readily use a quick and easy tool to determine the solar heat gains of their selected glazing systems for particular orientations, tilts and climate data. Speedy results under variable solar angles and degrees of irradiance would be welcomed by most. Furthermore, a newly proposed program should utilise the outputs of existing glazing tools and their standard information, such as the U-values and Solar Heat Gain Coefficients (SHGCs) generated for numerous glazing configurations by the well-known program WINDOW 6.0 (LBNL, 2001). The results of this tool provide interior glass surface temperature and transmitted solar radiation, which link into the comfort analysis inputs required by the ASHRAE Thermal Comfort Tool V2 (ASHRAE, 2011). This tool is a simple-to-use calculator providing the total solar heat gain of a glazing system exposed to various angles of solar incidence. Given basic climate (solar) data, as well as the orientation of the glazing under consideration, the solar heat gain can be calculated. The calculation incorporates the Solar Heat Gain Coefficient function produced for the glazing system under various angles of solar incidence by WINDOW 6.0 (LBNL, 2001). The significance of this work rests in providing an orientation-based heat transfer calculator through an easy-to-use tool (using Microsoft Excel) for user inputs of climate and Solar Heat Gain Coefficient (WINDOW 6.0) data. We address the factors to be considered, such as the solar position and the incident angles to the horizontal and the window surface, and the fact that the solar heat gain coefficient is a function of the angle of incidence. We also discuss the effect of the diffuse components of radiation from the sky and those from ground surface reflection, which require refinement of the calculation methods. The calculator is implemented in an Excel workbook allowing the user to input a dataset and immediately produce the resulting solar gain. We compare this calculated total solar heat gain with measurements from a test facility described elsewhere in this conference (Luther et al., 2012).
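A minimal sketch of the orientation-based calculation follows; the SHGC curve and input values are hypothetical, and the actual tool uses WINDOW 6.0 output in an Excel workbook rather than Python.

```python
import math

def incidence_angle(solar_altitude, solar_azimuth, surface_azimuth, surface_tilt):
    """Angle of incidence (degrees) of the solar beam on a tilted glazing surface.
    All angles in degrees; azimuths measured clockwise from north."""
    alt, tilt = math.radians(solar_altitude), math.radians(surface_tilt)
    gamma = math.radians(solar_azimuth - surface_azimuth)
    cos_theta = (math.sin(alt) * math.cos(tilt) +
                 math.cos(alt) * math.sin(tilt) * math.cos(gamma))
    return math.degrees(math.acos(max(min(cos_theta, 1.0), -1.0)))

def solar_heat_gain(area, beam_irradiance, diffuse_irradiance, theta, shgc_of_angle, shgc_diffuse):
    """Total solar heat gain (W) through a glazing of a given area: the beam component is
    weighted by the angle-dependent SHGC (e.g. tabulated by WINDOW 6.0), the diffuse
    component (sky plus ground-reflected) by a hemispherically averaged SHGC."""
    beam = beam_irradiance * math.cos(math.radians(theta)) * shgc_of_angle(theta) if theta < 90 else 0.0
    return area * (beam + diffuse_irradiance * shgc_diffuse)

# Hypothetical angle-dependent SHGC curve for a double-glazed unit (angle in degrees)
shgc = lambda theta: 0.6 * (1 - (theta / 90.0) ** 3)
theta = incidence_angle(solar_altitude=35, solar_azimuth=270, surface_azimuth=270, surface_tilt=90)
print(solar_heat_gain(area=2.0, beam_irradiance=600, diffuse_irradiance=150,
                      theta=theta, shgc_of_angle=shgc, shgc_diffuse=0.5))
```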

Relevance:

30.00%

Publisher:

Abstract:

The calculation of the first few moments of elution peaks is necessary to determine the amount of component in the sample (peak area or zeroth moment), the retention factor (first moment), and the column efficiency (second moment). It is a time-consuming and tedious task for the analyst to perform these calculations, so data analysis is generally completed by the data stations associated with modern chromatographs. However, data acquisition software is a black box which provides no information to chromatographers on how their data are treated, and these results are too important to be accepted on blind faith. The location of the peak integration boundaries is most important. In this manuscript, we explore the relationships between the size of the integration area, the relative position of the peak maximum within this area, and the accuracy of the calculated moments. We found that relationships between these parameters do exist and that computers can be programmed with relatively simple routines to automate the extraction of key peak parameters and to select acceptable integration boundaries. It was also found that the most accurate results are obtained when the S/N exceeds 200.
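A minimal sketch of the moment calculations once integration boundaries have been chosen (illustrative only, not the routines evaluated in the manuscript):

```python
import numpy as np

def peak_moments(t, y, t_start, t_end):
    """Zeroth, first and second central moments of an elution peak over the chosen
    integration window, plus the plate count N = m1^2 / m2 derived from them."""
    window = (t >= t_start) & (t <= t_end)
    t_w, y_w = t[window], y[window]
    m0 = np.trapz(y_w, t_w)                          # peak area (zeroth moment)
    m1 = np.trapz(t_w * y_w, t_w) / m0               # retention time (first moment)
    m2 = np.trapz((t_w - m1) ** 2 * y_w, t_w) / m0   # peak variance (second central moment)
    return m0, m1, m2, m1 ** 2 / m2

# Hypothetical Gaussian peak: area ~25, retention 5 min, sigma 0.05 min
t = np.linspace(4, 6, 4001)
y = (25 / (0.05 * np.sqrt(2 * np.pi))) * np.exp(-0.5 * ((t - 5.0) / 0.05) ** 2)
print(peak_moments(t, y, 4.5, 5.5))   # m2 should recover sigma^2 = 0.0025
```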

Relevance:

30.00%

Publisher:

Abstract:

This alternative event for the 2013 iConference is a combination of lightning talks, a demonstration of an assessment technology for knowledge construction in complex domains, and a hands-on exercise in using the tools discussed. The unifying logic for this presentation is that meaningful learning often involves solving challenging and complex problems that allow for multiple solution approaches and a variety of acceptable solutions. While it is important to prepare students to solve such problems, it is difficult to determine the extent to which various interventions and programs are contributing to the development of appropriate problem-solving strategies and attitudes. Simply testing domain knowledge or the ability to solve simple, single-solution problems may not provide support for improving individual student ability or relevant programs and activities. A reliable and robust methodology for assessing the relevant knowledge constructions of students engaged in solving challenging problems is needed, and that is our focus.

Relevance:

30.00%

Publisher:

Abstract:

In order to overcome interfacial incompatibility issues in natural fibre reinforced polymer bio-composites, surface modifications of the natural fibres using complex and environmentally unfriendly chemical methods are necessary. In this paper, we demonstrate that the interfacial properties of cellulose-based bio-composites can be tailored through the surface adsorption of polyethylene glycol (PEG) based amphiphilic block copolymers using a greener alternative methodology. Mixtures of water or water/acetone were used to form amphiphilic emulsions or micro-crystal suspensions of the PEG-based amphiphilic block copolymers, and their deposition from solution onto the cellulosic substrate was carried out by simple dip-coating. The findings of this study provide evidence that, by tuning the amphiphilicity and the type of building blocks attached to the PEG unit, the flexural and dynamic thermo-mechanical properties of cellulose-based bio-composites comprising either polylactide (PLA) or high-density polyethylene (HDPE) as the matrix can be remarkably enhanced. The trends, largely driven by interfacial effects, can be ascribed to the combined action of the hydrophilic and hydrophobic components of these amphiphiles. The nature of the interactions formed across the fibre-matrix interface is discussed. The collective outcome from this study provides a technological template to significantly improve the performance of cellulose-based bio-composite materials.