Abstract:
This chapter discusses a range of issues associated with supporting inquiry and deep reasoning while utilising information and communications technology (ICT). The role of questioning in critical thinking and reflection is considered in the context of scaffolding, and new opportunities for ICT-enabled scaffolding are identified. In particular, why-questioning provides a key point of focus and is presented as an important consideration in the design of systems that not only require cognitive engagement but aim to nurture it. Advances in automated question generation within intelligent tutoring systems are shown to hold promise for both teaching and learning in a range of other applications. While shortening attention spans appear to be a hazard of engaging with digital media, cognitive engagement is presented as something with broader scope than attention span: it is best conceived of as a crucible within which a rich mix of cognitive activities takes place and from which new knowledge is created.
Abstract:
Masonry is one of the most ancient construction materials in the world. Compared to other civil engineering practices, masonry construction is highly labour intensive, which can adversely affect quality and productivity. With a view to improving quality, and in light of the limited skilled labour available in recent times, several innovative masonry construction methods, such as dry stack and thin bed masonry, have been developed. This paper focuses on the thin bed masonry system, which is used in many parts of Europe. The thin bed masonry system utilises a thin layer of polymer-modified mortar connecting accurately dimensioned and/or interlockable units. This assembly process has the potential for an automated panelised construction system in an industry setting, or for adoption on site using less skilled labour, without sacrificing quality. This is because, unlike conventional masonry construction, the thin bed technology uses a thinner mortar (or glue) layer which can be controlled easily through some novel methods described in this paper. Structurally, reduction in the thickness of the mortar joint has beneficial effects; for example, it increases the compressive strength of masonry; in addition, polymer-added glue mortar enhances lateral load capacity relative to conventional masonry. This paper reviews recent research outcomes on the structural characteristics and construction practices of thin bed masonry. Finally, the suitability of thin bed masonry in developing countries, where masonry remains the most common material for residential building construction, is discussed.
Abstract:
The LiteSteel Beam (LSB) is a new cold-formed hollow flange channel section produced using dual electric resistance welding and automated continuous roll-forming technologies. The innovative LSB sections have many beneficial characteristics and are commonly used as flexural members in building construction. However, limited research has been undertaken on the shear behaviour of LSBs. A detailed investigation including both numerical and experimental studies was therefore conducted. Finite element models of LSBs in shear were developed to simulate the nonlinear ultimate strength behaviour of LSBs, including their elastic buckling characteristics, and were validated by comparing their results with experimental test results. The validated finite element models were then used in a detailed parametric study of the shear behaviour of LSBs. The parametric study results showed that the current design rules in cold-formed steel structures design codes are very conservative for the shear design of LSBs. Significant improvements to web shear buckling occurred due to the presence of torsionally rigid rectangular hollow flanges, while considerable post-buckling strength was also observed. This paper therefore proposes improved shear strength design rules for LSBs within the current cold-formed steel code guidelines. It presents the details of the parametric study and the new shear strength equations, which were also developed based on the direct strength method. The proposed shear strength equations have the potential to be used with other conventional cold-formed steel sections such as lipped channel sections.
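The web shear buckling improvements described above stem from the classical elastic plate buckling relation, in which the buckling coefficient k_v captures the restraint offered by the flanges. The sketch below uses only that standard textbook formula with illustrative numbers; it is not the paper's calibrated LSB equations, and the k_v, depth and thickness values are assumptions.

```python
import math

def elastic_shear_buckling_stress(d, t, E=200e3, nu=0.3, k_v=5.34):
    """Classical elastic shear buckling stress of a web plate, in MPa.

    d, t : web depth and thickness (mm); E : Young's modulus (MPa).
    k_v = 5.34 is the long simply supported plate value; torsionally rigid
    hollow flanges provide greater edge restraint, i.e. a higher effective
    k_v, which is one reason LSB webs outperform the standard rules.
    """
    return (k_v * math.pi ** 2 * E) / (12.0 * (1.0 - nu ** 2) * (d / t) ** 2)

# Illustrative slender web: a more slender web buckles at a lower stress.
tau_cr = elastic_shear_buckling_stress(250.0, 2.0)
```

Raising k_v to reflect flange restraint scales the buckling stress linearly, which is how increased edge fixity translates into higher design shear capacity.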
Abstract:
In the era of Web 2.0, huge volumes of consumer reviews are posted to the Internet every day. Manual approaches to detecting and analyzing fake reviews (i.e., spam) are not practical due to the problem of information overload. However, the design and development of automated methods of detecting fake reviews is a challenging research problem. The main reason is that fake reviews are specifically composed to mislead readers, so they may appear the same as legitimate reviews (i.e., ham). As a result, discriminatory features that would enable individual reviews to be classified as spam or ham may not be available. Guided by the design science research methodology, the main contribution of this study is the design and instantiation of novel computational models for detecting fake reviews. In particular, a novel text mining model is developed and integrated into a semantic language model for the detection of untruthful reviews. The models are then evaluated based on a real-world dataset collected from amazon.com. The results of our experiments confirm that the proposed models outperform other well-known baseline models in detecting fake reviews. To the best of our knowledge, the work discussed in this article represents the first successful attempt to apply text mining methods and semantic language models to the detection of fake consumer reviews. A managerial implication of our research is that firms can apply our design artifacts to monitor online consumer reviews to develop effective marketing or product design strategies based on genuine consumer feedback posted to the Internet.
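The study's text mining and semantic language models are not reproduced here, but the classification setting they address can be illustrated with a minimal bag-of-words naive Bayes baseline. The training snippets and labels below are invented for illustration; real fake-review detection is much harder precisely because spam is written to resemble ham.

```python
from collections import Counter
import math

def train(docs):
    """docs: list of (text, label) pairs with labels 'spam' or 'ham'.
    Returns per-label word counts and per-label document counts."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()
    for text, label in docs:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Multinomial naive Bayes with Laplace smoothing."""
    vocab = set(counts["spam"]) | set(counts["ham"])
    scores = {}
    for label in ("spam", "ham"):
        n = sum(counts[label].values())
        score = math.log(totals[label] / sum(totals.values()))
        for w in text.lower().split():
            score += math.log((counts[label][w] + 1) / (n + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)
```

A toy usage: train on a handful of labelled reviews, then score an unseen one; the paper's contribution is exactly that surface features like these are often insufficient, motivating its semantic models.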
Abstract:
This paper presents an overview of NTCIR-9 Cross-lingual Link Discovery (Crosslink) task. The overview includes: the motivation of cross-lingual link discovery; the Crosslink task definition; the run submission specification; the assessment and evaluation framework; the evaluation metrics; and the evaluation results of submitted runs. Cross-lingual link discovery (CLLD) is a way of automatically finding potential links between documents in different languages. The goal of this task is to create a reusable resource for evaluating automated CLLD approaches. The results of this research can be used in building and refining systems for automated link discovery. The task is focused on linking between English source documents and Chinese, Korean, and Japanese target documents.
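Submitted runs in such a task are typically scored against a pool of assessed relevant links. A minimal set-based sketch of that kind of evaluation (precision, recall and F1 over source-target document pairs) follows; the actual NTCIR-9 Crosslink metrics are richer than this (e.g. anchor-level and ranked evaluation), so this is an illustration only.

```python
def link_eval(predicted, relevant):
    """Set-based precision/recall/F1 for discovered cross-lingual links.

    predicted, relevant: sets of (source_doc, target_doc) pairs.
    """
    tp = len(predicted & relevant)          # correctly discovered links
    p = tp / len(predicted) if predicted else 0.0
    r = tp / len(relevant) if relevant else 0.0
    f1 = 2 * p * r / (p + r) if (p + r) else 0.0
    return p, r, f1
```

Because the task's goal is a reusable evaluation resource, the assessed `relevant` set can be held fixed while different CLLD systems supply the `predicted` set.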
Abstract:
The LiteSteel Beam (LSB) is a new cold-formed hollow flange channel section developed by OneSteel Australian Tube Mills using their patented dual electric resistance welding and automated continuous roll-forming process. It has a unique geometry consisting of torsionally rigid rectangular hollow flanges and a relatively slender web. In addition to this unique geometry, the LSB sections also have unique characteristics relating to their stress-strain curves, residual stresses, initial geometric imperfections and hollow flanges that are not encountered in conventional hot-rolled and cold-formed steel channel sections. An experimental study including 20 section moment capacity tests was therefore conducted to investigate the behaviour and strength of LSB flexural members. The presence of inelastic reserve bending capacity in these beams was investigated in detail although the current design rules generally limit the section moment capacities of cold-formed steel members to their first yield moments. The ultimate moment capacities from the tests were compared with the section moment capacities predicted by the current cold-formed and hot-rolled steel design standards. It was found that compact and non-compact LSB sections have greater moment capacities than their first yield moments. The current cold-formed steel design standards were found to be conservative in predicting the section moment capacities of compact and non-compact LSB sections while the hot-rolled steel design standards were able to better predict them. This paper has shown that suitable modifications are needed to the current design rules to allow the inclusion of available inelastic bending capacities of LSBs in design.
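The first yield moment that current cold-formed rules use as a cap is simply the yield stress times the elastic section modulus, and the inelastic reserve found in the tests is the margin of the ultimate moment above it. A minimal sketch, with illustrative numbers that are not LSB section properties, is:

```python
def first_yield_moment(f_y, Z):
    """M_y = f_y * Z, with f_y the yield stress (MPa) and Z the elastic
    section modulus (mm^3); result in N*mm. Current cold-formed design
    rules generally cap section moment capacity at this value."""
    return f_y * Z

def inelastic_reserve_ratio(M_u, M_y):
    """Ratio of ultimate (test) moment to first yield moment; values
    above 1.0 indicate inelastic reserve capacity, as observed for
    compact and non-compact LSB sections."""
    return M_u / M_y

# Illustrative values only (not measured LSB properties):
M_y = first_yield_moment(450.0, 60e3)   # 2.7e7 N*mm = 27 kN*m
ratio = inelastic_reserve_ratio(3.0e7, M_y)
```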
Abstract:
The LiteSteel Beam (LSB) is a new hollow flange channel section developed by OneSteel Australian Tube Mills using its patented dual electric resistance welding and automated continuous roll-forming technologies. The LSB has a unique geometry consisting of torsionally rigid rectangular hollow flanges and a relatively slender web. Its flexural strength for intermediate spans is governed by lateral distortional buckling characterised by simultaneous lateral deflection, twist and web distortion. Recent research on LSBs has mainly focussed on their lateral distortional buckling behaviour under uniform moment conditions. However, in practice, LSB flexural members are subjected to non-uniform moment distributions and load height effects as they are often under transverse loads applied above or below their shear centre. These loading conditions are known to have significant effects on the lateral buckling strength of beams. Many steel design codes have adopted equivalent uniform moment distribution and load height factors based on data for conventional hot-rolled, doubly symmetric I-beams subject to lateral torsional buckling. The non-uniform moment distribution and load height effects of transverse loading on cantilever LSBs, and the suitability of the current design modification factors to include such effects are not known. This paper presents a numerical study based on finite element analyses of the elastic lateral buckling strength of cantilever LSBs subject to transverse loading, and the results. The applicability of the design modification factors from various steel design codes was reviewed, and suitable recommendations are presented for cantilever LSBs subject to transverse loading.
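One concrete example of the equivalent uniform moment factors whose applicability the paper reviews is the AISC-style C_b factor computed from absolute moments at the quarter points of the unbraced segment. The sketch below shows that standard code form only as an illustration; it is not the recommendation the paper makes for cantilever LSBs.

```python
def moment_modification_factor(M_max, M_A, M_B, M_C):
    """AISC-style equivalent uniform moment factor:

        C_b = 12.5*M_max / (2.5*M_max + 3*M_A + 4*M_B + 3*M_C)

    M_max is the largest absolute moment in the unbraced segment;
    M_A, M_B, M_C are absolute moments at the quarter, mid and
    three-quarter points. Derived for doubly symmetric I-beams in
    lateral torsional buckling, hence the question of its suitability
    for LSBs in lateral distortional buckling.
    """
    return 12.5 * M_max / (2.5 * M_max + 3.0 * M_A + 4.0 * M_B + 3.0 * M_C)
```

Under uniform moment the factor reduces to 1.0; a moment gradient raises it above 1.0, increasing the predicted elastic buckling capacity.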
Abstract:
Post-deployment maintenance and evolution can account for up to 75% of the cost of developing a software system. Software refactoring can reduce the costs associated with evolution by improving system quality. Although refactoring can yield benefits, the process includes potentially complex, error-prone, tedious and time-consuming tasks. It is these tasks that automated refactoring tools seek to address. However, although the refactoring process is well-defined, current refactoring tools do not support the full process. To develop better automated refactoring support, we have completed a usability study of software refactoring tools. In the study, we analysed the task of software refactoring using the ISO 9241-11 usability standard and Fitts' List of task allocation. Expanding on this analysis, we reviewed 11 collections of usability guidelines and combined these into a single list of 38 guidelines. From this list, we developed 81 usability requirements for refactoring tools. Using these requirements, the software refactoring tools Eclipse 3.2, Condenser 1.05, RefactorIT 2.5.1, and Eclipse 3.2 with the Simian UI 2.2.12 plugin were studied. Based on the analysis, we have selected a subset of the requirements that can be incorporated into a prototype refactoring tool intended to address the full refactoring process.
Abstract:
The time consuming and labour intensive task of identifying individuals in surveillance video is often challenged by poor resolution and the sheer volume of stored video. Faces or identifying marks such as tattoos are often too coarse for direct matching by machine or human vision. Object tracking and super-resolution can then be combined to facilitate the automated detection and enhancement of areas of interest. The object tracking process enables the automatic detection of people of interest, greatly reducing the amount of data for super-resolution. Smaller regions such as faces can also be tracked. A number of instances of such regions can then be utilized to obtain a super-resolved version for matching. Performance improvement from super-resolution is demonstrated using a face verification task. It is shown that there is a consistent improvement of approximately 7% in verification accuracy, using both Eigenface and Elastic Bunch Graph Matching approaches for automatic face verification, starting from faces with an eye to eye distance of 14 pixels. Visual improvement in image fidelity from super-resolved images over low-resolution and interpolated images is demonstrated on a small database. Current research and future directions in this area are also summarized.
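The enhancement step can be illustrated in its simplest form: averaging several aligned low-resolution observations of a tracked region to suppress noise. Real super-resolution additionally performs sub-pixel registration and reconstruction; this toy sketch assumes the frames are already aligned, and the pixel values are invented.

```python
def fuse_frames(frames):
    """Average several aligned low-resolution observations of the same
    region (e.g. a tracked face across video frames).

    frames: list of equally sized 2-D lists of pixel intensities.
    Returns the per-pixel mean image. This is only the noise-reduction
    component of super-resolution, not the full reconstruction.
    """
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[i][j] for f in frames) / n for j in range(w)]
            for i in range(h)]
```

In the tracking pipeline described above, the object tracker supplies the per-frame regions, so many such observations of one face accumulate automatically before fusion and matching.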
Abstract:
The effects of tumour motion during radiation therapy delivery have been widely investigated. Motion effects have become increasingly important with the introduction of dynamic radiotherapy delivery modalities such as enhanced dynamic wedges (EDWs) and intensity modulated radiation therapy (IMRT), where a dynamically collimated radiation beam is delivered to the moving target, resulting in dose blurring and interplay effects which are a consequence of the combined tumour and beam motion. Prior to this work, reported studies on EDW-based interplay effects were restricted to experimental methods for assessing single-field non-fractionated treatments. In this work, the interplay effects have been investigated for EDW treatments. Single and multiple field treatments have been studied using experimental and Monte Carlo (MC) methods. Initially this work experimentally studies interplay effects for single-field non-fractionated EDW treatments, using radiation dosimetry systems placed on a sinusoidally moving platform. A number of wedge angles (60°, 45° and 15°), field sizes (20 × 20, 10 × 10 and 5 × 5 cm²), amplitudes (10-40 mm in steps of 10 mm) and periods (2 s, 3 s, 4.5 s and 6 s) of tumour motion are analysed (using gamma analysis) for parallel and perpendicular motions (where the tumour and jaw motions are either parallel or perpendicular to each other). For parallel motion it was found that both the amplitude and period of tumour motion affect the interplay; this becomes more prominent where the collimator and tumour speeds become identical. For perpendicular motion the amplitude of tumour motion is the dominant factor, whereas varying the period of tumour motion has no observable effect on the dose distribution. The wedge angle results suggest that the use of a large wedge angle generates greater dose variation for both parallel and perpendicular motions.
The use of a small field size with a large tumour motion results in the loss of the wedged dose distribution for both parallel and perpendicular motion. From these single field measurements, a motion amplitude and period have been identified which show the poorest agreement between the target motion and dynamic delivery, and these are used as the 'worst case motion parameters'. The experimental work is then extended to multiple-field fractionated treatments. Here a number of pre-existing, multiple-field, wedged lung plans are delivered to the radiation dosimetry systems, employing the worst case motion parameters. Moreover a four-field EDW lung plan (using a 4D CT data set) is delivered to the IMRT quality control phantom with a dummy tumour insert over four fractions using the worst case parameters, i.e. 40 mm amplitude and 6 s period. The analysis of the film doses using gamma analysis at 3%-3 mm indicates the non-averaging of the interplay effects for this particular study, with a gamma pass rate of 49%. To enable Monte Carlo modelling of the problem, the DYNJAWS component module (CM) of the BEAMnrc user code, which has recently been introduced to model the dynamic wedges, is validated and automated. DYNJAWS is commissioned for 6 MV and 10 MV photon energies. It is shown that this CM can accurately model the EDWs for a number of wedge angles and field sizes. The dynamic and step-and-shoot modes of the CM are compared for their accuracy in modelling the EDW, and the dynamic mode is shown to be more accurate. The DYNJAWS-specific input file, which specifies the probability of selection of a subfield and the respective jaw coordinates, has been automated; this automation simplifies the generation of the BEAMnrc input files for DYNJAWS. The commissioned DYNJAWS model is then used to study multiple field EDW treatments using MC methods.
The 4D CT data of an IMRT phantom with the dummy tumour is used to produce a set of Monte Carlo simulation phantoms, onto which the delivery of single field and multiple field EDW treatments is simulated. A number of static and motion multiple field EDW plans have been simulated. The comparison of dose volume histograms (DVHs) and gamma volume histograms (GVHs) for four-field EDW treatments (where the collimator and patient motion is in the same direction) using small (15°) and large (60°) wedge angles indicates a greater mismatch between the static and motion cases for the large wedge angle. Finally, to use gel dosimetry as a validation tool, a new technique called the 'zero-scan method' is developed for reading the gel dosimeters with x-ray computed tomography (CT). It has been shown that multiple scans of a gel dosimeter (in this case 360 scans) can be used to reconstruct a zero-scan image. This zero-scan image has a similar precision to an image obtained by averaging the CT images, without the additional dose delivered by the CT scans. In this investigation the interplay effects have been studied for single and multiple field fractionated EDW treatments using experimental and Monte Carlo methods. For the Monte Carlo methods, the DYNJAWS component module of the BEAMnrc code has been validated, automated and then used to study the interplay for multiple field EDW treatments. The zero-scan method, a new gel dosimetry readout technique, has been developed for reading the gel images using x-ray CT without losing precision and accuracy.
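The 3%/3 mm gamma analysis used throughout compares each reference dose point against nearby evaluated points via a combined dose-difference/distance-to-agreement metric, with a point passing when gamma is at most 1. The following is a simplified 1-D, globally normalised sketch of that criterion, not the thesis's implementation; profile values and spacing are invented.

```python
import math

def gamma_index(ref, evaluated, spacing, dd=0.03, dta=3.0):
    """1-D global gamma analysis (default 3%/3 mm).

    ref, evaluated : dose profiles on a common grid; spacing in mm.
    dd : dose-difference criterion as a fraction of the global maximum.
    dta : distance-to-agreement criterion in mm.
    For each reference point, the minimum combined metric over all
    evaluated points is its gamma value.
    """
    norm = max(ref)  # global dose normalisation
    gammas = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_ev in enumerate(evaluated):
            dist = (j - i) * spacing
            ddiff = (d_ev - d_ref) / (dd * norm)
            best = min(best, math.sqrt((dist / dta) ** 2 + ddiff ** 2))
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Fraction of points with gamma <= 1 (the reported pass rate)."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```

A 49% pass rate, as reported for the four-fraction film study, means fewer than half the evaluated points satisfied the combined criterion against the static reference.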
Abstract:
Nutrition interventions in the form of both self-management education and individualised diet therapy are considered essential for the long-term management of type 2 diabetes mellitus (T2DM). The measurement of diet is essential to inform, support and evaluate nutrition interventions in the management of T2DM. Barriers inherent within health care settings and systems limit ongoing access to personnel and resources, while traditional prospective methods of assessing diet are burdensome for the individual and often result in changes in typical intake to facilitate recording. This thesis investigated the inclusion of information and communication technologies (ICT) to overcome limitations of current approaches in the nutritional management of T2DM, in particular the development, trial and evaluation of the Nutricam dietary assessment method (NuDAM), consisting of a mobile phone photo/voice application to assess nutrient intake in a free-living environment with older adults with T2DM.
Study 1: Effectiveness of an automated telephone system in promoting change in dietary intake among adults with T2DM. The effectiveness of an automated telephone system, Telephone-Linked Care (TLC) Diabetes, designed to deliver self-management education was evaluated in terms of promoting dietary change in adults with T2DM and sub-optimal glycaemic control. In this secondary data analysis, independent of the larger randomised controlled trial, complete data were available for 95 adults (59 male; mean age (±SD) = 56.8±8.1 years; mean (±SD) BMI = 34.2±7.0 kg/m²). The treatment effect showed a reduction in total fat of 1.4% and saturated fat of 0.9% of energy intake, body weight of 0.7 kg and waist circumference of 2.0 cm. In addition, a significant increase in the nutrition self-efficacy score of 1.3 (p<0.05) was observed in the TLC group compared to the control group.
The modest trends observed in this study indicate that the TLC Diabetes system does support the adoption of positive nutrition behaviours as a result of diabetes self-management education; however, caution must be applied in the interpretation of results due to the inherent limitations of the dietary assessment method used. The decision to use a closed-list FFQ with known bias may have influenced the accuracy of reporting dietary intake in this instance. This study provided an example of the methodological challenges experienced with measuring changes in absolute diet using a FFQ, and reaffirmed the need for novel prospective assessment methods capable of capturing natural variance in usual intakes.
Study 2: The development and trial of the NuDAM recording protocol. The feasibility of the Nutricam mobile phone photo/voice dietary record was evaluated in 10 adults with T2DM (6 male; age = 64.7±3.8 years; BMI = 33.9±7.0 kg/m²). Intake was recorded over a 3-day period using both Nutricam and a written estimated food record (EFR). Compared to the EFR, the Nutricam device was found to be acceptable among subjects; however, energy intake was under-recorded using Nutricam (-0.6±0.8 MJ/day; p<0.05). Beverages and snacks were the items most frequently not recorded using Nutricam; however, forgotten meals contributed the greatest difference in energy intake between records. In addition, the quality of dietary data recorded using Nutricam was unacceptable for just under one-third of entries. It was concluded that an additional mechanism was necessary to complement dietary information collected via Nutricam. Modifications to the method were made to allow for clarification of Nutricam entries and probing for forgotten foods during a brief phone call to the subject the following morning. The revised recording protocol was evaluated in Study 4.
Study 3: The development and trial of the NuDAM analysis protocol. Part A explored the effect of the type of portion size estimation aid (PSEA) on the error associated with quantifying four portions of 15 single food items contained in photographs. Seventeen dietetic students (1 male; age = 24.7±9.1 years; BMI = 21.1±1.9 kg/m²) estimated all food portions on two occasions: without aids and with aids (food models or reference food photographs). Overall, the use of a PSEA significantly reduced mean (±SD) group error between estimates compared to no aid (-2.5±11.5% vs. 19.0±28.8%; p<0.05). The type of PSEA (i.e. food models vs. reference food photographs) did not have a notable effect on the group estimation error (-6.7±14.9% vs. 1.4±5.9%, respectively; p=0.321). This exploratory study provided evidence that the use of aids in general, rather than the type, was more effective in reducing estimation error. Findings guided the development of the Dietary Estimation and Assessment Tool (DEAT) for use in the analysis of the Nutricam dietary record. Part B evaluated the effect of the DEAT on the error associated with the quantification of two 3-day Nutricam dietary records in a sample of 29 dietetic students (2 males; age = 23.3±5.1 years; BMI = 20.6±1.9 kg/m²). Subjects were randomised into two groups: Group A and Group B. For Record 1, the use of the DEAT (Group A) resulted in a smaller error compared to estimations made without the tool (Group B) (17.7±15.8%/day vs. 34.0±22.6%/day, p=0.331, respectively). In comparison, all subjects used the DEAT to estimate Record 2, with resultant error similar between Groups A and B (21.2±19.2%/day vs. 25.8±13.6%/day; p=0.377, respectively). In general, the moderate estimation error associated with quantifying food items did not translate into clinically significant differences in the nutrient profile of the Nutricam dietary records; only amorphous foods were notably over-estimated in energy content without the use of the DEAT (57 kJ/day vs.
274 kJ/day; p<0.001). A large proportion (89.6%) of the group found the DEAT helpful when quantifying food items contained in the Nutricam dietary records. The use of the DEAT reduced quantification error, minimising any potential effect on the estimation of energy and macronutrient intake.
Study 4: Evaluation of the NuDAM. The accuracy and inter-rater reliability of the NuDAM in assessing energy and macronutrient intake was evaluated in a sample of 10 adults (6 males; age = 61.2±6.9 years; BMI = 31.0±4.5 kg/m²). Intake recorded using both the NuDAM and a weighed food record (WFR) was coded by three dietitians and compared with an objective measure of total energy expenditure (TEE) obtained using the doubly labelled water technique. At the group level, energy intake (EI) was under-reported to a similar extent using both methods, with EI:TEE ratios of 0.76±0.20 for the NuDAM and 0.76±0.17 for the WFR. At the individual level, four subjects reported implausible levels of energy intake using the WFR, compared to three using the NuDAM. Overall, moderate to high correlation coefficients (r=0.57-0.85) were found between the two dietary measures across energy and macronutrients, except fat (r=0.24). High agreement was observed between dietitians for estimates of energy and macronutrient intake derived from both the NuDAM (ICC=0.77-0.99; p<0.001) and the WFR (ICC=0.82-0.99; p<0.001). All subjects preferred using the NuDAM over the WFR to record intake and were willing to use the novel method again over longer recording periods. This research program explored two novel approaches which utilised distinct technologies to aid in the nutritional management of adults with T2DM. In particular, this thesis makes a significant contribution to the evidence base surrounding the use of PhRs through the development, trial and evaluation of a novel mobile phone photo/voice dietary record.
The NuDAM is an extremely promising advancement in the nutritional management of individuals with diabetes and other chronic conditions. Future applications lie in integrating the NuDAM with other technologies to facilitate practice across the remaining stages of the nutrition care process.
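The EI:TEE comparison in Study 4 reduces to a simple ratio, with values well below 1 indicating under-reporting relative to doubly-labelled-water energy expenditure. A minimal sketch follows; the 0.7 plausibility cutoff is an illustrative assumption (Goldberg-style cutoffs are cohort-specific), not the thesis's criterion.

```python
def ei_tee_ratio(energy_intake, tee):
    """Reported energy intake divided by measured total energy
    expenditure (same units, e.g. MJ/day). Ratios well below 1
    indicate under-reporting."""
    return energy_intake / tee

def implausible(ratio, cutoff=0.7):
    """Flag a subject as an implausible reporter.
    The cutoff here is illustrative only; published cutoffs depend
    on the cohort and measurement error model."""
    return ratio < cutoff
```

Applied per subject, this is how counts such as "four implausible reporters with the WFR versus three with the NuDAM" would be tallied.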
Abstract:
The thesis presented in this paper is that the land fraud committed by Matthew Perrin in Queensland and inflicted upon Roger Mildenhall in Western Australia demonstrates the need for urgent procedural reform to the conveyancing process. Should this not occur, then calls to reform the substantive principles of the Torrens system will be heard throughout the jurisdictions that adopt title by registration, particularly in those places where immediate indefeasibility is still the norm. This paper closely examines the factual matrix behind both of these frauds, and asks what steps should have been taken to prevent them occurring. With 2012 bringing Australian legislation embedding a national e-conveyancing system and a new Land Transfer Act for New Zealand, we ask what legislative measures should be introduced to minimise the potential for such fraud. In undertaking this study, we reflect on whether the activities of Perrin and the criminals responsible for stealing Mildenhall's land would have succeeded under the present system for automated registration utilised in New Zealand.
Abstract:
Sustainability, smartness and safety are three core components of a modern transportation system. The objective of this study is to introduce a modern transportation system in the light of a three-pronged '3S' approach: sustainable, smart and safe. In particular, this paper studies the transportation system of Singapore to address how that system is progressing towards a modern transportation system in these three respects. While sustainability targets environmental justice and social equity without compromising economic efficiency, smartness incorporates qualities like automated sensing, processing, decision making and action-taking into the transportation system. Since a system cannot be viable without being safe, the safety component of a modern transportation system aims to minimize the crash risks of all users, including motorists, motorcyclists, pedestrians and bicyclists. Various policy implications and technology applications within the transportation system of Singapore are discussed to illustrate a modern transportation system within the framework of the 3S model.
Abstract:
The temporal variations in CO2, CH4 and N2O fluxes were measured over two consecutive years from February 2007 to March 2009 from a subtropical rainforest in south-eastern Queensland, Australia, using an automated sampling system. A concurrent study using an additional 30 manual chambers examined the spatial variability of emissions distributed across three nearby remnant rainforest sites with similar vegetation and climatic conditions. Interannual variation in fluxes of all gases over the 2 years was minimal, despite large discrepancies in rainfall, whereas a pronounced seasonal variation could only be observed for CO2 fluxes. High infiltration, drainage and subsequent high soil aeration under the rainforest limited N2O loss while promoting substantial CH4 uptake. The average annual N2O loss of 0.5 ± 0.1 kg N2O-N ha−1 over the 2-year measurement period was at the lower end of reported fluxes from rainforest soils. The rainforest soil functioned as a sink for atmospheric CH4 throughout the entire 2-year period, despite periods of substantial rainfall. A clear linear correlation between soil moisture and CH4 uptake was found. Rates of uptake ranged from greater than 15 g CH4-C ha−1 day−1 during extended dry periods to less than 2–5 g CH4-C ha−1 day−1 when soil water content was high. The calculated annual CH4 uptake at the site was 3.65 kg CH4-C ha−1 yr−1. This is amongst the highest reported for rainforest systems, reiterating the ability of aerated subtropical rainforests to act as substantial sinks of CH4. The spatial study showed N2O fluxes almost eight times higher, and CH4 uptake reduced by over one-third, as clay content of the rainforest soil increased from 12% to more than 23%. This demonstrates that for some rainforest ecosystems, soil texture and related water infiltration and drainage capacity constraints may play a more important role in controlling fluxes than either vegetation or seasonal variability.
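The reported linear relationship between soil moisture and CH4 uptake is the kind of association a standard Pearson correlation coefficient quantifies. A self-contained sketch (with invented values, not the site measurements) is:

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length
    sequences; r near -1 would indicate, e.g., CH4 uptake declining
    linearly as soil moisture rises."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Invented illustration: uptake (g CH4-C/ha/day) falling as moisture rises.
moisture = [0.10, 0.20, 0.30, 0.40]
uptake = [15.0, 11.0, 7.0, 3.0]
r = pearson_r(moisture, uptake)
```

With uptake expressed as a positive sink rate, wetter soils giving lower uptake yields a strongly negative r, consistent with the moisture control described above.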