964 results for machine-tools
Abstract:
Worksite wellness efforts can generate enormous health-care savings. Many of the methods available to obtain health and wellness measures can be confusing and lack clarity; for example, it can be difficult to understand whether measures are appropriate for individuals or for population health. Come along and enjoy a hands-on learning experience about measures and about better understanding health and wellness outcomes from baseline, midway and beyond.
Abstract:
Social media are becoming increasingly integrated into political practices around the world. Politicians, citizens and journalists employ new media tools to support and supplement their political goals. This report examines how social media are portrayed as political tools in the Australian mainstream media in order to establish the relations between social media and mainstream media in political news reporting. Through close content analysis of 93 articles sampled from the years 2008, 2010 and 2012, we provide a longitudinal insight into how Australian journalists' and news media organisations' perception of social media as political tools has changed over time. As the mainstream media remain crucial in framing the public understanding of new technologies and practices, this enhances our understanding of the positioning of social media tools for political communication.
Abstract:
The candidate gene approach has been a pioneer in the field of genetic epidemiology, identifying risk alleles and their association with clinical traits. With the advent of rapidly changing technology, there has been an explosion of in silico tools available to researchers, giving them fast, efficient resources and reliable strategies for finding causal gene variants in candidate gene or genome-wide association studies (GWAS). In this review, following a description of candidate gene prioritisation, we summarise the approaches to single nucleotide polymorphism (SNP) prioritisation and discuss the tools available to assess the functional relevance of a risk variant with consideration of its genomic location. The strategy and the tools discussed are applicable to any study investigating genetic risk factors associated with a particular disease. Some of the tools are also applicable for the functional validation of variants relevant to the era of GWAS and next-generation sequencing (NGS).
Abstract:
Raven and Song Scope are two automated sound analysis tools based on machine learning techniques for environmental monitoring. Many research works have been conducted with them; however, little or no work has explored or compared their performance. This paper investigates the comparison from six aspects: theory, software interface, ease of use, detection targets, detection accuracy, and potential application. Through this exploration one critical gap is identified: there is no approach that detects both syllables and call structures, since Raven only aims to detect syllables while Song Scope targets call structures. Therefore, a Timed Probabilistic Automata (TPA) system is proposed which first separates syllables and then clusters them into complex call structures.
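As a rough sketch of the two-stage idea behind such a system, the Python example below first segments syllables with a short-time energy threshold and then groups them into candidate call structures using a maximum inter-syllable gap, a timing rule loosely analogous to the transitions of a timed automaton. The function names, thresholds and synthetic signal are assumptions for illustration only and do not reproduce the proposed TPA system.

```python
# Illustrative sketch (not the paper's TPA implementation): a two-stage
# pipeline that first segments syllables by short-time energy, then groups
# them into candidate call structures using inter-syllable gaps.
# All thresholds and parameter names here are assumptions for demonstration.
import numpy as np

def segment_syllables(signal, sr, frame_ms=10, energy_thresh=0.02):
    """Return (start, end) sample indices of high-energy regions (syllables)."""
    frame_len = int(sr * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    energy = (frames ** 2).mean(axis=1)
    active = energy > energy_thresh
    segments, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            segments.append((start * frame_len, i * frame_len))
            start = None
    if start is not None:
        segments.append((start * frame_len, n_frames * frame_len))
    return segments

def group_into_calls(segments, sr, max_gap_s=0.3):
    """Cluster consecutive syllables into a call structure when the silent
    gap between them is shorter than max_gap_s."""
    calls, current = [], []
    for seg in segments:
        if current and (seg[0] - current[-1][1]) / sr > max_gap_s:
            calls.append(current)
            current = []
        current.append(seg)
    if current:
        calls.append(current)
    return calls

# Usage with synthetic data: three noise bursts, the first two close together.
sr = 16000
sig = np.zeros(sr * 2)
sig[1000:4000] = np.random.randn(3000) * 0.5    # syllable 1
sig[6000:9000] = np.random.randn(3000) * 0.5    # syllable 2 (same call)
sig[20000:23000] = np.random.randn(3000) * 0.5  # syllable 3 (new call)
sylls = segment_syllables(sig, sr)
print(group_into_calls(sylls, sr))
```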
Abstract:
Through practice-led research, TESSA SMALLHORN examines the influence of digital technology on the performance space. From the mechanisation of modernist culture to the digitalisation of the present day, technology acts as response material for scenographers investigating the stage as machine. The interactive, real-time tools of digital culture encourage a systems-orientated approach that challenges user and operator alike. This article explores the studio practice and critical theory that were combined to offer a functional model of a digital stage machine.
Abstract:
In the recent decision Association for Molecular Pathology v. Myriad Genetics [1], the US Supreme Court held that naturally occurring sequences from human genomic DNA are not patentable subject matter. Only certain complementary DNAs (cDNA), modified sequences and methods to use sequences are potentially patentable. It is likely that this distinction will hold for all DNA sequences, whether animal, plant or microbial [2]. However, it is not clear whether this means that other naturally occurring informational molecules, such as polypeptides (proteins) or polysaccharides, will also be excluded from patents. The decision underscores a pressing need for precise analysis of patents that disclose and reference genetic sequences, especially in the claims. Similarly, data sets, standards compliance and analytical tools must be improved (in particular, data sets and analytical tools must be made openly accessible) in order to provide a basis for effective decision making and policy setting to support biological innovation. Here, we present a web-based platform that allows such data aggregation, analysis and visualization in an open, shareable facility. To demonstrate the potential for the extension of this platform to global patent jurisdictions, we discuss the results of a global survey of patent offices that shows that much progress is still needed in making these data freely available for aggregation in the first place.
Abstract:
This paper makes a case for thinking about the primary school as a logic machine (apparatus) as a way of thinking about processes of in-school stratification. Firstly, we discuss related literature on in-school stratification in primary schools, particularly as it relates to literacy learning. Secondly, we explain how school reform can be thought about in terms of the idea of the machine or apparatus, in which case the processes of in-school stratification can be mapped as more than simply concerns about school organisation (such as student grouping) but also involve a politics of truth, played out in each school, that constitutes school culture and what counts as ‘good’ pedagogy. Thirdly, the paper focuses specifically on research conducted into primary schools in the northern suburbs of Adelaide, one of the most educationally disadvantaged regions in Australia, as a case study of the relationship between in-school stratification and the reproduction of inequality. We draw from more than 20 years of ethnographic work in primary schools in the northern suburbs of Adelaide and provide a snapshot of a recent attempt to improve literacy achievement in a few northern suburbs public primary schools (the SILA project). The SILA project, through diagnostic reviews, has provided a significant analysis of the challenges facing policy and practice in such challenging school contexts, one that also maps onto existing (inter)national research. These diagnostic reviews said ‘hard things’ that required attention by SILA schools, including:
· an over-reliance on whole-class, low-level, routine tasks and hence a lack of challenge and rigour in the learning tasks offered to students;
· a focus on the 'code breaking' function of language at the expense of richer conceptualisations of literacy that might guide teachers’ understanding of challenging pedagogies;
· the need for substantial shifts in the culture of schools, especially unsettling deficit views of students and their communities;
· a need to provide a more ‘consistent’ approach to teaching literacy across the school;
· a need to focus School Improvement Plans in order to implement a clear focus on literacy learning; and
· a need to sustain professional learning to produce new knowledge and practice.
The paper concludes with suggestions for further research and possible reform projects into the primary school as a logic machine.
Abstract:
In this age of rapidly evolving technology, teachers are encouraged to adopt ICTs by government, syllabus, school management, and parents. Indeed, it is an expectation that teachers will incorporate technologies into their classroom teaching practices to enhance the learning experiences and outcomes of their students. In the science classroom in particular, a subject that traditionally incorporates hands-on experiments and practicals, the integration of modern technologies should be a major feature. Although myriad studies report on technologies that enhance students’ learning outcomes in science, there is a dearth of literature on how teachers go about selecting technologies for use in the science classroom. Teachers can feel ill-prepared to assess the range of available choices and might feel pressured and somewhat overwhelmed by the avalanche of new developments thrust before them in marketing literature and teaching journals. The consequences of making bad decisions are costly in terms of money, time and teacher confidence. Additionally, no research to date has identified what technologies science teachers use on a regular basis, and whether some purchased technologies have proven to be too problematic, preventing their sustained use and possible wider adoption. The primary aim of this study was to provide research-based guidance to teachers to aid their decision-making in choosing technologies for the science classroom. The study unfolded in several phases. The first phase of the project involved survey and interview data from teachers in relation to the technologies they currently use in their science classrooms and the frequency of their use. These data were coded and analysed using the Grounded Theory approach of Corbin and Strauss, resulting in the development of the PETTaL model, which captured the salient factors in the data. This model incorporated usability theory from the Human Computer Interaction literature, and education theory and models such as Mishra and Koehler’s (2006) TPACK model, where the grounded data indicated these issues. The PETTaL model identifies Power (school management, syllabus, etc.), Environment (classroom/learning setting), Teacher (personal characteristics, experience, epistemology), Technology (usability, versatility, etc.) and Learners (academic ability, diversity, behaviour, etc.) as fields that can impact the use of technology in science classrooms. The PETTaL model was used to create a Predictive Evaluation Tool (PET): a tool designed to assist teachers in choosing technologies, particularly for science teaching and learning. The evolution of the PET was cyclical (employing an agile development methodology), involving repeated testing with in-service and pre-service teachers at each iteration and incorporating their comments in subsequent versions. Once no new suggestions were forthcoming, the PET was tested with eight in-service teachers, and the results showed that the PET outcomes obtained by (experienced) teachers concurred with their instinctive evaluations. They felt the PET would be a valuable tool when considering new technology, and that it would be particularly useful as a means of communicating perceived value between colleagues and between budget holders and requestors during the acquisition process. It is hoped that the PET could make the tacit knowledge acquired by experienced teachers about technology use in classrooms explicit to novice teachers.
Additionally, the PET could be used as a research tool to discover a teacher's professional development needs. Therefore, the outcomes of this study can aid a teacher in the process of selecting educationally productive and sustainable new technology for their science classroom. This study has produced an instrument for assisting teachers in the decision-making process associated with the use of new technologies for the science classroom. The instrument is generic in that it can be applied to all subject areas. Further, this study has produced a powerful model that extends the TPACK model, which is currently extensively employed to assess teachers’ use of technology in the classroom. The PETTaL model, grounded in data from this study, responds to the calls in the literature for TPACK’s further development. As a theoretical model, PETTaL has the potential to serve as a framework for the development of a teacher’s reflective practice (either self-evaluation or critical evaluation of observed teaching practices). Additionally, PETTaL has the potential to aid the formulation of a teacher’s personal professional development plan. It will be the basis for further studies in this field.
Abstract:
Background: Appropriate disposition of emergency department (ED) patients with chest pain is dependent on clinical evaluation of risk. A number of chest pain risk stratification tools have been proposed. The aim of this study was to compare the predictive performance for major adverse cardiac events (MACE) of risk assessment tools from the National Heart Foundation of Australia (HFA), the Goldman risk score and the Thrombolysis in Myocardial Infarction risk score (TIMI RS). Methods: This prospective observational study evaluated ED patients aged ≥30 years with non-traumatic chest pain for which no definitive non-ischemic cause was found. Data collected included demographic and clinical information, investigation findings and the occurrence of MACE by 30 days. The outcome of interest was the comparative predictive performance of the risk tools for MACE at 30 days, analysed using receiver operating characteristic (ROC) curves. Results: Two hundred eighty-one patients were studied; the rate of MACE was 14.1%. The area under the curve (AUC) of the HFA, TIMI RS and Goldman tools for the endpoint of MACE was 0.54, 0.71 and 0.67, respectively, with the difference between the tools in predictive ability for MACE being highly significant [χ²(3) = 67.21, N = 276, p < 0.0001]. Conclusion: The TIMI RS and Goldman tools performed better than the HFA in this undifferentiated ED chest pain population, but selection of cutoffs balancing sensitivity and specificity was problematic. There is an urgent need for validated risk stratification tools specific to the ED chest pain population.
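For readers wanting to reproduce this kind of comparison on their own data, the short Python sketch below computes the AUC of several risk scores against a binary 30-day MACE outcome using scikit-learn. The scores and outcomes are synthetic stand-ins, not the study's cohort; only the approximate event rate mirrors the figure reported above.

```python
# Illustrative sketch only (synthetic data, not the study's cohort): comparing
# the discrimination of several risk scores for a binary outcome (30-day MACE)
# by area under the ROC curve.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 281
mace = rng.binomial(1, 0.14, n)              # ~14% event rate, as reported

# Hypothetical risk scores: higher values should indicate higher risk.
# One weakly informative score and two more informative ones.
hfa_score = rng.normal(0, 1, n) + 0.1 * mace
timi_score = rng.normal(0, 1, n) + 0.8 * mace
goldman_score = rng.normal(0, 1, n) + 0.6 * mace

for name, score in [("HFA", hfa_score), ("TIMI RS", timi_score),
                    ("Goldman", goldman_score)]:
    print(f"{name}: AUC = {roc_auc_score(mace, score):.2f}")
```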
Abstract:
Server consolidation using virtualization technology has become an important approach to improving the energy efficiency of data centers, and virtual machine placement is the key problem in server consolidation. In the past few years, many approaches to virtual machine placement have been proposed. However, existing approaches consider only the energy consumed by physical machines and ignore the energy consumed by the communication network in a data center. The energy consumption of the communication network in a data center is not trivial and should therefore be considered in virtual machine placement. In our preliminary research, we proposed a genetic algorithm for a new virtual machine placement problem that considers the energy consumption of both the physical machines and the communication network in a data center. Aiming to improve the performance and efficiency of that genetic algorithm, this paper presents a hybrid genetic algorithm for the energy-efficient virtual machine placement problem. Experimental results show that the hybrid genetic algorithm significantly outperforms the original genetic algorithm and that it is scalable.
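To make the placement objective concrete, here is a minimal, hedged Python sketch of a fitness function that sums a simple linear power model for the physical machines and a traffic-based cost for the network, the two energy terms the abstract says should be optimised together. The power model, parameter values and function names are illustrative assumptions, not the authors' formulation, and a real GA would wrap this fitness in selection, crossover and mutation operators.

```python
# Conceptual sketch (not the authors' algorithm): a VM-placement fitness that
# combines physical-machine energy and communication-network energy.
import itertools

def pm_energy(placement, vm_cpu, pm_idle=150.0, pm_peak=300.0, pm_capacity=100.0):
    """Linear power model: idle power plus utilisation-proportional power
    for every physical machine that hosts at least one VM."""
    load = {}
    for vm, pm in enumerate(placement):
        load[pm] = load.get(pm, 0.0) + vm_cpu[vm]
    return sum(pm_idle + (pm_peak - pm_idle) * min(u / pm_capacity, 1.0)
               for u in load.values())

def network_energy(placement, traffic, cost_per_unit=0.5):
    """Charge energy for traffic between VM pairs placed on different PMs;
    co-located pairs communicate 'for free' in this simplified model."""
    return sum(cost_per_unit * traffic[i][j]
               for i, j in itertools.combinations(range(len(placement)), 2)
               if placement[i] != placement[j])

def fitness(placement, vm_cpu, traffic):
    # Lower total energy means a better individual in the GA population.
    return pm_energy(placement, vm_cpu) + network_energy(placement, traffic)

# Usage: 4 VMs, placement[i] = index of the PM hosting VM i.
vm_cpu = [30, 40, 20, 50]
traffic = [[0, 10, 0, 0],
           [10, 0, 5, 0],
           [0, 5, 0, 8],
           [0, 0, 8, 0]]
print(fitness([0, 0, 1, 1], vm_cpu, traffic))
```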
Abstract:
Due to the health impacts caused by exposure to air pollutants in urban areas, monitoring and forecasting of air quality parameters have become an important topic in atmospheric and environmental research. Knowledge of the dynamics and complexity of air pollutant behaviour has made artificial intelligence models a useful tool for more accurate prediction of pollutant concentrations. This paper focuses on an innovative method of daily air pollution prediction that combines a Support Vector Machine (SVM) as the predictor with Partial Least Squares (PLS) as a data selection tool, based on measured CO concentrations. The CO concentrations of the Rey monitoring station in the south of Tehran, from January 2007 to February 2011, have been used to test the effectiveness of this method. The hourly CO concentrations have been predicted using the SVM and the hybrid PLS–SVM models. Similarly, daily CO concentrations have been predicted based on the aforementioned four years of measured data. Results demonstrate that both models have good prediction ability; however, the hybrid PLS–SVM is more accurate. In the analysis presented in this paper, statistical estimators including the relative mean error, root mean squared error and mean absolute relative error have been employed to compare the performance of the models. It is concluded that the errors decrease after size reduction and that the coefficients of determination increase from 56–81% for the SVM model to 65–85% for the hybrid PLS–SVM model. It was also found that the hybrid PLS–SVM model required less computational time than the SVM model, as expected, supporting the more accurate and faster prediction ability of the hybrid PLS–SVM model.
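As an illustration of the general PLS-then-SVM workflow described above (not the paper's exact configuration, data or parameters), the sketch below uses scikit-learn to compare a plain SVM regressor with one that first projects the inputs onto a few PLS latent components; the synthetic data, component count and SVR settings are assumptions for demonstration.

```python
# Minimal sketch of a PLS -> SVM pipeline on synthetic data: PLS acts purely
# as a dimension-reduction / input-selection step before SVM regression.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, r2_score

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 12))                    # e.g. meteorological inputs
y = 2 * X[:, 0] + X[:, 1] - 0.5 * X[:, 2] + rng.normal(scale=0.3, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Plain SVM vs. PLS-reduced SVM (keep only a few latent components).
svm = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
hybrid = make_pipeline(StandardScaler(), PLSRegression(n_components=3),
                       SVR(C=10.0, epsilon=0.1))

for name, model in [("SVM", svm), ("PLS-SVM", hybrid)]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: MAE={mean_absolute_error(y_te, pred):.3f}, "
          f"R2={r2_score(y_te, pred):.3f}")
```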
Abstract:
This chapter was developed as part of the ‘People, communities and economies of the Lake Eyre Basin’ project. It has been written for communities, government agencies and interface organisations involved in natural resource management (NRM) in the Lake Eyre Basin (LEB). Its purpose is to identify the key factors for successful community engagement processes relevant to the LEB and to present tools and principles for successful engagement. The term ‘interface organisation’ is used here to refer to the diverse range of local and regional organisations (such as Catchment Committees or NRM Regional Bodies) that serve as linkages, or translators, between local communities and the broader Australian and State Governments. Fostering and harnessing effective processes of community engagement has been identified as crucial to building a prosperous future for rural and remote regions in Australia. The chapter presents an overview of the literature on successful community engagement processes for NRM, as well as an overview of the current NRM arrangements in the LEB. The main part of the chapter presents findings from a series of interviews conducted with government liaison officers representing both state and federal organisations who are responsible for coordinating and facilitating regional NRM in the LEB, and with members of LEB communities.
Abstract:
This work is motivated by the need to efficiently machine the edges of ophthalmic polymer lenses for mounting in spectacle or instrument frames. The polymer materials used are required to have suitable optical characteristics, such as a high refractive index and Abbe number, combined with low density and high scratch and impact resistance. Edge surface finish is an important aesthetic consideration; its quality is governed by the material removal operation and the physical properties of the material being processed. The wear behaviour of polymer materials is not as straightforward as that of other materials, owing to their molecular and structural complexity, not to mention their time-dependent properties. Four commercial ophthalmic polymers have been studied in this work using nanoindentation techniques, which are evaluated as tools for probing surface mechanical properties in order to better understand the grinding response of polymer materials.
Abstract:
Many emerging economies are dangling the patent system to stimulate biotechnological innovations, with the ultimate premise that these will improve their economic and social growth. The patent system mandates full disclosure of the patented invention in exchange for a temporary exclusive patent right. Recently, however, patent offices have fallen short of complying with such a mandate, especially for genetic inventions. Most patent offices provide only static information about disclosed patent sequences, and some do not even keep track of the sequence listing data in their own database. The successful partnership of QUT Library and Cambia exemplifies advocacy in Open Access, Open Innovation and User Participation. The library extends its services to various departments within the university, and builds and encourages research networks to complement the skills needed to make a contribution in the real world.
Abstract:
Sugar cane is a major source of food and fuel worldwide. Biotechnology has the potential to improve economically important traits in sugar cane as well as to diversify sugar cane beyond traditional applications such as sucrose production. High levels of transgene expression are key to the success of improving crops through biotechnology. Here we describe new molecular tools that both expand and improve gene expression capabilities in sugar cane. We have identified promoters that can be used to drive high levels of gene expression in the leaf and stem of transgenic sugar cane. One of these promoters, derived from the Cestrum yellow leaf curling virus, drives levels of constitutive transgene expression that are significantly higher than those achieved by the historical benchmark maize polyubiquitin-1 (Zm-Ubi1) promoter. A second promoter, the maize phosphoenolpyruvate carboxylase promoter, was found to be a strong, leaf-preferred promoter that enables levels of expression comparable to Zm-Ubi1 in this organ. Transgene expression was increased approximately 50-fold by gene modification, which included optimising the codon usage of the coding sequence to better suit sugar cane. We also describe a novel dual transcriptional enhancer that increased gene expression from different promoters, boosting expression from Zm-Ubi1 more than eightfold. These molecular tools will be extremely valuable for the improvement of sugar cane through biotechnology.