240 results for Automatic tools


Relevance: 20.00%

Abstract:

This article reports on the development of online assessment tools for disengaged youth in flexible learning environments. Sociocultural theories of learning and assessment and Bourdieu’s sociological concepts of capital and exchange were used to design a purpose-built content management system. This design experiment engaged participants in assessment that led to the exchange of self, peer and teacher judgements for credentialing. This collaborative approach required students and teachers to adapt and amend social networking practices for students to submit and judge their own and others’ work using comments, ratings, keywords and tags. Students and teachers refined their evaluative expertise across contexts, and negotiated meanings and values of digital works, which gave rise to revised versions and emergent assessment criteria. By combining social networking tools with sociological models of capital, assessment activities related to students’ digital productions were understood as valuations and judgements within an emergent, negotiable social field of exchange.

Relevance: 20.00%

Abstract:

This thesis explored the development of statistical methods to support the monitoring and improvement of the quality of treatment delivered to patients undergoing coronary angioplasty procedures. To achieve this goal, a suite of outcome measures was identified to characterise performance of the service, statistical tools were developed to monitor the various indicators, and measures to strengthen governance processes were implemented and validated. Although this work focused on the pursuit of these aims in the context of an angioplasty service located at a single clinical site, development of the tools and techniques was undertaken mindful of their potential application to other clinical specialties and a wider, potentially national, scope.
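The abstract does not name the specific monitoring tools developed, but a standard statistical tool for monitoring procedural outcomes such as angioplasty complications is the Bernoulli CUSUM chart. The sketch below is a minimal illustration of that general idea, not the thesis's actual method; the acceptable rate p0, unacceptable rate p1 and threshold are hypothetical values:

```python
import math

def bernoulli_cusum(outcomes, p0=0.05, p1=0.10, threshold=3.0):
    """Return the index at which the CUSUM signals, or None.

    outcomes: iterable of 0 (success) / 1 (complication).
    p0: acceptable complication rate; p1: unacceptable rate.
    """
    w_fail = math.log(p1 / p0)            # weight added per complication
    w_ok = math.log((1 - p1) / (1 - p0))  # (negative) weight per success
    s = 0.0
    for i, y in enumerate(outcomes):
        s = max(0.0, s + (w_fail if y else w_ok))
        if s >= threshold:
            return i                      # chart signals deterioration
    return None
```

In practice the per-case weights are usually risk-adjusted for patient case mix, and the threshold is chosen to balance detection speed against the false-alarm rate.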

Relevance: 20.00%

Abstract:

Worksite wellness efforts can generate enormous health-care savings. Many of the methods available for obtaining health and wellness measures can be confusing and lack clarity; for example, it can be difficult to determine whether measures are appropriate for individuals or for population health. Come along and enjoy a hands-on learning experience about measures and about better understanding health and wellness outcomes from baseline, midway and beyond.

Relevance: 20.00%

Abstract:

This study presents a disturbance attenuation controller for horizontal position stabilisation for hover and automatic landings of a rotary-wing unmanned aerial vehicle (RUAV) operating close to the landing deck in rough seas. Based on a helicopter model representing aerodynamics during the landing phase, a non-linear state feedback H∞ controller is designed to achieve rapid horizontal position tracking in a gusty environment. Practical constraints including flapping dynamics, servo dynamics and time-lag effects are considered. A high-fidelity closed-loop simulation using parameters of the Vario XLC gas-turbine helicopter verifies the performance of the proposed horizontal position controller. The proposed controller not only increases the disturbance attenuation capability of the RUAV, but also enables rapid position response when gusts occur. Comparative studies show that the H∞ controller exhibits performance improvement and can be applied to ship/RUAV landing systems.
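The non-linear H∞ synthesis used in the paper is beyond a short sketch, but the underlying idea — state feedback pulling the horizontal position back against a gust disturbance — can be illustrated on a toy double-integrator model. Everything here (the model, the gains and the gust profile) is an illustrative assumption, not the paper's controller:

```python
def simulate(gusty_steps=10, steps=100, dt=0.1, k=(4.0, 4.0), feedback=True):
    """Simulate horizontal position of a double-integrator 'vehicle'
    hit by a constant gust acceleration for the first few steps.

    Returns (final position, peak |position|)."""
    pos, vel = 0.0, 0.0
    peak = 0.0
    for i in range(steps):
        gust = 1.0 if i < gusty_steps else 0.0          # gust disturbance
        u = -(k[0] * pos + k[1] * vel) if feedback else 0.0
        vel += dt * (u + gust)
        pos += dt * vel
        peak = max(peak, abs(pos))
    return pos, peak

# With feedback the gust-induced drift is bounded and pulled back toward
# zero; without it the position drifts away without bound.
```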

Relevance: 20.00%

Abstract:

This paper presents a novel and practical procedure for estimating the mean deck height to assist in automatic landing operations of a Rotorcraft Unmanned Aerial Vehicle (RUAV) in harsh sea environments. A modified Prony Analysis (PA) procedure is outlined to deal with real-time observations of deck displacement, which involves developing an appropriate dynamic model to approximate real deck motion, with parameters identified using the Forgetting Factor Recursive Least Squares (FFRLS) method. The model order is specified using a proper order-selection criterion based on minimizing the summation of accumulated estimation errors. In addition, a feasible threshold criterion is proposed to separate the dominant components of deck displacement, which results in an accurate instantaneous estimation of the mean deck position. Simulation results demonstrate that the proposed recursive procedure exhibits satisfactory estimation performance when applied to real-time deck displacement measurements, making it well suited for integration into ship-RUAV approach and landing guidance systems.
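The paper's exact deck-motion model and order-selection criterion are not reproduced here, but the FFRLS identification step can be sketched. A pure sinusoidal heave signal y satisfies the second-order recursion y[k] = 2cos(ω)·y[k−1] − y[k−2], so a minimal illustration (with hypothetical forgetting factor and initial covariance) fits that two-parameter model recursively:

```python
import math

def ffrls_ar2(signal, lam=0.99, p0=1e3):
    """Forgetting Factor RLS for the model y[k] = a1*y[k-1] + a2*y[k-2].

    Returns the final parameter estimates (a1, a2)."""
    a1, a2 = 0.0, 0.0
    P = [[p0, 0.0], [0.0, p0]]                  # 2x2 covariance, explicit
    for k in range(2, len(signal)):
        f1, f2 = signal[k - 1], signal[k - 2]   # regressor phi
        # gain K = P*phi / (lam + phi^T P phi)
        pf1 = P[0][0] * f1 + P[0][1] * f2
        pf2 = P[1][0] * f1 + P[1][1] * f2
        denom = lam + f1 * pf1 + f2 * pf2
        k1, k2 = pf1 / denom, pf2 / denom
        err = signal[k] - (a1 * f1 + a2 * f2)   # one-step prediction error
        a1 += k1 * err
        a2 += k2 * err
        # covariance update P = (P - K * phi^T P) / lam
        fp1 = f1 * P[0][0] + f2 * P[1][0]
        fp2 = f1 * P[0][1] + f2 * P[1][1]
        P = [[(P[0][0] - k1 * fp1) / lam, (P[0][1] - k1 * fp2) / lam],
             [(P[1][0] - k2 * fp1) / lam, (P[1][1] - k2 * fp2) / lam]]
    return a1, a2
```

For a noiseless sinusoid the estimates converge to a1 = 2cos(ω) and a2 = −1; the forgetting factor λ < 1 is what lets the estimates track a slowly varying sea state.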

Relevance: 20.00%

Abstract:

Social media are becoming increasingly integrated into political practices around the world. Politicians, citizens and journalists employ new media tools to support and supplement their political goals. This report examines the way in which social media are portrayed as political tools in the Australian mainstream media, in order to establish the relationship between social media and mainstream media in political news reporting. Through close content analysis of 93 articles sampled from the years 2008, 2010 and 2012, we provide longitudinal insight into how the perception of social media as political tools by Australian journalists and news media organisations has changed over time. As the mainstream media remain crucial in framing the public understanding of new technologies and practices, this enhances our understanding of the positioning of social media tools for political communication.

Relevance: 20.00%

Abstract:

Following eco-driving instructions can reduce fuel consumption by 5 to 20% on urban roads with manual cars. The majority of Australian cars have an automatic transmission gearbox. It is therefore of interest to verify whether current eco-driving instructions are efficient for such vehicles. In this pilot study, participants (N=13) drove an instrumented vehicle (Toyota Camry 2007) with an automatic transmission. Fuel consumption of the participants was compared before and after they received simple eco-driving instructions. Participants drove the same vehicle on the same urban route under similar traffic conditions. We found that participants drove at similar speeds during their baseline and eco-friendly drives, and reduced the level of their accelerations and decelerations during eco-driving. Fuel consumption decreased by 7% for the complete drive, but not on the motorway and inclined sections of the study. Gas emissions were estimated with the VT-micro model, and emissions of the studied pollutants (CO2, CO, NOX and HC) were reduced, but no difference was observed for CO2 on the motorway and inclined sections. The difference for the complete lap is 3% for CO2. We have found evidence that simple eco-driving instructions are efficient in the case of automatic transmission in an urban environment, albeit towards the lowest values of the spectrum of fuel consumption reductions reported across eco-driving studies.
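As a small worked example of how the headline figure is obtained, the per-participant percentage reduction can be computed from paired baseline/eco-drive consumption readings. The figures below are hypothetical, chosen only to mirror the reported ~7% reduction, not the study's data:

```python
def mean_fuel_reduction(baseline, eco):
    """Mean per-participant percentage reduction in fuel use.

    baseline, eco: paired consumption readings (e.g. L/100 km)."""
    reductions = [100.0 * (b - e) / b for b, e in zip(baseline, eco)]
    return sum(reductions) / len(reductions)

# Hypothetical per-participant consumption figures (L/100 km):
baseline = [10.0, 9.5, 11.2]
eco      = [9.3,  8.8, 10.4]
```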

Relevance: 20.00%

Abstract:

The candidate gene approach has been a pioneer in the field of genetic epidemiology, identifying risk alleles and their association with clinical traits. With the advent of rapidly changing technology, there has been an explosion of in silico tools available to researchers, giving them fast, efficient resources and reliable strategies for finding causal gene variants for candidate or genome-wide association studies (GWAS). In this review, following a description of candidate gene prioritisation, we summarise the approaches to single nucleotide polymorphism (SNP) prioritisation and discuss the tools available to assess the functional relevance of a risk variant with consideration of its genomic location. The strategy and the tools discussed are applicable to any study investigating genetic risk factors associated with a particular disease. Some of the tools are also applicable to the functional validation of variants relevant to the era of GWAS and next generation sequencing (NGS).

Relevance: 20.00%

Abstract:

The assessment of choroidal thickness from optical coherence tomography (OCT) images of the human choroid is an important clinical and research task, since it provides valuable information regarding the eye’s normal anatomy and physiology, and changes associated with various eye diseases and the development of refractive error. Due to the time-consuming and subjective nature of manual image analysis, there is a need for the development of reliable objective automated methods of image segmentation to derive choroidal thickness measures. However, the detection of the two boundaries which delineate the choroid is a complicated and challenging task, in particular the detection of the outer choroidal boundary, due to a number of issues including: (i) the vascular ocular tissue is non-uniform and rich in non-homogeneous features, and (ii) the boundary can have a low contrast. In this paper, an automatic segmentation technique based on graph-search theory is presented to segment the inner choroidal boundary (ICB) and the outer choroidal boundary (OCB) to obtain the choroid thickness profile from OCT images. Before the segmentation, the B-scan is pre-processed to enhance the two boundaries of interest and to minimize the artifacts produced by surrounding features. The algorithm to detect the ICB is based on a simple edge filter and a directional weighted map penalty, while the algorithm to detect the OCB is based on OCT image enhancement and a dual brightness probability gradient. The method was tested on a large data set of images from a pediatric (1083 B-scans) and an adult (90 B-scans) population, which were previously manually segmented by an experienced observer. The results demonstrate that the proposed method provides robust detection of the boundaries of interest and is a useful tool to extract clinical data.
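The paper's specific edge filters and penalty maps are not reproduced here, but the core graph-search idea — finding a minimum-cost path across the image, where low cost marks likely boundary pixels — can be sketched as column-wise dynamic programming with a smoothness constraint of at most one row of movement per column step:

```python
def min_cost_boundary(cost):
    """Minimum-cost left-to-right path through a 2D cost grid.

    Moves one column right per step, changing row by at most 1.
    Returns the row index chosen in each column."""
    rows, cols = len(cost), len(cost[0])
    dp = [cost[r][0] for r in range(rows)]      # best cost ending at (r, col)
    back = []                                   # backpointers, columns 1..cols-1
    for c in range(1, cols):
        new_dp, ptr = [], []
        for r in range(rows):
            best, pr = min((dp[p], p) for p in (r - 1, r, r + 1)
                           if 0 <= p < rows)
            new_dp.append(best + cost[r][c])
            ptr.append(pr)
        dp = new_dp
        back.append(ptr)
    r = min(range(rows), key=dp.__getitem__)    # cheapest end row
    path = [r]
    for ptr in reversed(back):                  # trace the path back
        r = ptr[r]
        path.append(r)
    return path[::-1]
```

In a real segmenter the cost grid would come from the pre-processed B-scan (e.g. inverted gradient magnitude), so the cheap path hugs the enhanced boundary.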

Relevance: 20.00%

Abstract:

In the recent decision Association for Molecular Pathology v. Myriad Genetics [1], the US Supreme Court held that naturally occurring sequences from human genomic DNA are not patentable subject matter. Only certain complementary DNAs (cDNA), modified sequences and methods to use sequences are potentially patentable. It is likely that this distinction will hold for all DNA sequences, whether animal, plant or microbial [2]. However, it is not clear whether this means that other naturally occurring informational molecules, such as polypeptides (proteins) or polysaccharides, will also be excluded from patents. The decision underscores a pressing need for precise analysis of patents that disclose and reference genetic sequences, especially in the claims. Similarly, data sets, standards compliance and analytical tools must be improved—in particular, data sets and analytical tools must be made openly accessible—in order to provide a basis for effective decision making and policy setting to support biological innovation. Here, we present a web-based platform that allows such data aggregation, analysis and visualization in an open, shareable facility. To demonstrate the potential for the extension of this platform to global patent jurisdictions, we discuss the results of a global survey of patent offices that shows that much progress is still needed in making these data freely available for aggregation in the first place.

Relevance: 20.00%

Abstract:

In this age of rapidly evolving technology, teachers are encouraged to adopt ICTs by government, syllabus, school management, and parents. Indeed, it is an expectation that teachers will incorporate technologies into their classroom teaching practices to enhance the learning experiences and outcomes of their students. In the science classroom in particular, a subject that traditionally incorporates hands-on experiments and practicals, the integration of modern technologies should be a major feature. Although myriad studies report on technologies that enhance students’ learning outcomes in science, there is a dearth of literature on how teachers go about selecting technologies for use in the science classroom. Teachers can feel ill-prepared to assess the range of available choices and might feel pressured and somewhat overwhelmed by the avalanche of new developments thrust before them in marketing literature and teaching journals. The consequences of making bad decisions are costly in terms of money, time and teacher confidence. Additionally, no research to date has identified what technologies science teachers use on a regular basis, or whether some purchased technologies have proven too problematic, preventing their sustained use and possible wider adoption. The primary aim of this study was to provide research-based guidance to teachers to aid their decision-making in choosing technologies for the science classroom. The study unfolded in several phases. The first phase of the project involved survey and interview data from teachers about the technologies they currently use in their science classrooms and the frequency of their use. These data were coded and analysed using the Grounded Theory approach of Corbin and Strauss, and resulted in the development of the PETTaL model, which captures the salient factors in the data.
This model incorporated usability theory from the Human Computer Interaction literature, and education theory and models such as Mishra and Koehler’s (2006) TPACK model, where the grounded data indicated these issues. The PETTaL model identifies Power (school management, syllabus, etc.), Environment (classroom/learning setting), Teacher (personal characteristics, experience, epistemology), Technology (usability, versatility, etc.) and Learners (academic ability, diversity, behaviour, etc.) as fields that can impact the use of technology in science classrooms. The PETTaL model was used to create a Predictive Evaluation Tool (PET): a tool designed to assist teachers in choosing technologies, particularly for science teaching and learning. The evolution of the PET was cyclical (employing agile development methodology), involving repeated testing with in-service and pre-service teachers at each iteration, and incorporating their comments in subsequent versions. Once no new suggestions were forthcoming, the PET was tested with eight in-service teachers, and the results showed that the PET outcomes obtained by (experienced) teachers concurred with their instinctive evaluations. They felt the PET would be a valuable tool when considering new technology, and it would be particularly useful as a means of communicating perceived value between colleagues and between budget holders and requestors during the acquisition process. It is hoped that the PET could make the tacit knowledge acquired by experienced teachers about technology use in classrooms explicit to novice teachers. Additionally, the PET could be used as a research tool to discover a teacher’s professional development needs. Therefore, the outcomes of this study can aid a teacher in the process of selecting educationally productive and sustainable new technology for their science classrooms.
This study has produced an instrument for assisting teachers in the decision-making process associated with the use of new technologies in the science classroom. The instrument is generic in that it can be applied to all subject areas. Further, this study has produced a powerful model that extends the TPACK model, which is currently extensively employed to assess teachers’ use of technology in the classroom. The PETTaL model, grounded in the data from this study, responds to calls in the literature for TPACK’s further development. As a theoretical model, PETTaL has the potential to serve as a framework for the development of a teacher’s reflective practice (either self-evaluation or critical evaluation of observed teaching practices). Additionally, PETTaL has the potential to aid the formulation of a teacher’s personal professional development plan. It will be the basis for further studies in this field.

Relevance: 20.00%

Abstract:

A large number of methods have been published that aim to evaluate various components of multi-view geometry systems. Most of these have focused on the feature extraction, description and matching stages (the visual front end), since geometry computation can be evaluated through simulation. Many data sets are constrained to small-scale or planar scenes that are not challenging to new algorithms, or require special equipment. This paper presents a method for automatically generating geometry ground truth and challenging test cases from high spatio-temporal resolution video. The objective of the system is to enable data collection at any physical scale, in any location and in various parts of the electromagnetic spectrum. The data generation process consists of collecting high resolution video, computing accurate sparse 3D reconstruction, video frame culling and downsampling, and test case selection. The evaluation process consists of applying a test 2-view geometry method to every test case and comparing the results to the ground truth. This system facilitates the evaluation of the whole geometry computation process or any part thereof against data compatible with a realistic application. A collection of example data sets and evaluations is included to demonstrate the range of applications of the proposed system.
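The paper does not specify its exact error metric, but a common way to score a test 2-view geometry method against ground truth is the epipolar constraint residual x2ᵀFx1 of its matches under the ground-truth fundamental matrix. A minimal sketch follows; the pure-sideways-translation F below is an illustrative assumption:

```python
def epipolar_residuals(F, matches):
    """Algebraic epipolar errors |x2^T F x1| for point correspondences.

    F: 3x3 fundamental matrix (nested lists);
    matches: list of ((u1, v1), (u2, v2)) pixel correspondences."""
    errs = []
    for (u1, v1), (u2, v2) in matches:
        x1 = (u1, v1, 1.0)
        x2 = (u2, v2, 1.0)
        Fx1 = [sum(F[i][j] * x1[j] for j in range(3)) for i in range(3)]
        errs.append(abs(sum(x2[i] * Fx1[i] for i in range(3))))
    return errs

# Ground-truth F for a pure sideways camera translation: the y-coordinates
# of a correct match must agree, so correct matches give zero error.
F_true = [[0.0, 0.0, 0.0],
          [0.0, 0.0, -1.0],
          [0.0, 1.0, 0.0]]
```

In practice a geometrically normalised residual such as the Sampson distance is usually preferred over the raw algebraic error, since it approximates reprojection error in pixels.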

Relevance: 20.00%

Abstract:

Background: Appropriate disposition of emergency department (ED) patients with chest pain is dependent on clinical evaluation of risk. A number of chest pain risk stratification tools have been proposed. The aim of this study was to compare the predictive performance for major adverse cardiac events (MACE) using risk assessment tools from the National Heart Foundation of Australia (HFA), the Goldman risk score and the Thrombolysis in Myocardial Infarction risk score (TIMI RS). Methods: This prospective observational study evaluated ED patients aged ≥30 years with non-traumatic chest pain for which no definitive non-ischemic cause was found. Data collected included demographic and clinical information, investigation findings and occurrence of MACE by 30 days. The outcome of interest was the comparative predictive performance of the risk tools for MACE at 30 days, as analyzed by receiver operating characteristic (ROC) curves. Results: Two hundred eighty-one patients were studied; the rate of MACE was 14.1%. The area under the curve (AUC) of the HFA, TIMI RS and Goldman tools for the endpoint of MACE was 0.54, 0.71 and 0.67, respectively, with the difference between the tools in predictive ability for MACE being highly significant [χ2(3) = 67.21, N = 276, p < 0.0001]. Conclusion: The TIMI RS and Goldman tools performed better than the HFA in this undifferentiated ED chest pain population, but selection of cutoffs balancing sensitivity and specificity was problematic. There is an urgent need for validated risk stratification tools specific to the ED chest pain population.
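The reported AUC values can be computed directly from patient-level data without plotting a curve: AUC equals the Mann–Whitney probability that a randomly chosen MACE patient receives a higher risk score than a randomly chosen non-MACE patient. A minimal sketch of that computation (any scores used with it below are hypothetical, not study data):

```python
def auc(pos_scores, neg_scores):
    """AUC as the Mann-Whitney probability that a positive case
    outranks a negative case (ties count as half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

An AUC of 0.5 (like the HFA's 0.54) means the tool ranks MACE patients barely better than chance, which is why cutoff selection for it is problematic.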

Relevance: 20.00%

Abstract:

This work aims at developing a planetary rover capable of acting as an assistant astrobiologist: performing a preliminary analysis of the collected visual images to help make better use of scientists’ time by pointing out the most interesting pieces of data. This paper focuses on the problem of detecting and recognising particular types of stromatolites. Inspired by the processes actual astrobiologists go through in the field when identifying stromatolites, the processes we investigate focus on recognising characteristics associated with biogenicity. The extraction of these characteristics is based on the analysis of geometrical structure, enhanced by passing the images of stromatolites through an edge-detection filter and its Fourier transform, revealing typical spatial frequency patterns. The proposed analysis is performed on both simulated images of stromatolite structures and images of real stromatolites taken in the field by astrobiologists.
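The "edge-detection filter plus Fourier transform" step can be illustrated in one dimension: stromatolite lamination appears as a roughly periodic intensity profile, so differencing the profile and taking its DFT reveals a dominant spatial frequency. This is an illustrative standard-library sketch, not the paper's actual pipeline:

```python
import cmath
import math

def dominant_frequency(profile):
    """Index of the strongest non-DC DFT bin of an edge-filtered profile."""
    # simple edge filter: first difference of the intensity profile
    edges = [profile[i + 1] - profile[i] for i in range(len(profile) - 1)]
    n = len(edges)
    mags = []
    for k in range(1, n // 2 + 1):              # skip DC, positive bins only
        coef = sum(edges[t] * cmath.exp(-2j * math.pi * k * t / n)
                   for t in range(n))
        mags.append((abs(coef), k))
    return max(mags)[1]

# A laminated texture with an 8-pixel layer period, sampled over 65 pixels:
profile = [math.sin(2 * math.pi * i / 8) for i in range(65)]
```

A strong, isolated peak in the spatial-frequency spectrum is the kind of regular-layering signature the biogenicity analysis looks for; in 2D the same idea uses an image gradient and a 2D FFT.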

Relevance: 20.00%

Abstract:

Camera-laser calibration is necessary for many robotics and computer vision applications. However, existing calibration toolboxes still require laborious effort from the operator in order to achieve reliable and accurate results. This paper proposes algorithms that augment two existing trusted calibration methods with automatic extraction of the calibration object from the sensor data. The result is a complete procedure that allows for automatic camera-laser calibration. The first stage of the procedure is automatic camera calibration, which is useful in its own right for many applications. The chessboard extraction algorithm it provides is shown to outperform openly available techniques. The second stage completes the procedure by providing automatic camera-laser calibration. The procedure has been verified by extensive experimental tests, with the proposed algorithms providing a major reduction in the time required from an operator in comparison to manual methods.