440 results for Basis path testing
Abstract:
Background: The requirement for dual screening of titles and abstracts to select papers to examine in full text can create a huge workload, not least when the topic is complex and a broad search strategy is required, resulting in a large number of results. An automated system to reduce this burden, while still assuring high accuracy, has the potential to provide huge efficiency savings within the review process.
Objectives: To undertake a direct comparison of manual screening with a semi-automated process (priority screening) using a machine classifier. The research is being carried out as part of the current update of a population-level public health review.
Methods: Authors have hand-selected studies for the review update, in duplicate, using the standard Cochrane Handbook methodology. A retrospective analysis, simulating a quasi-'active learning' process (whereby a classifier is repeatedly retrained on 'manually' labelled data), will be completed using different starting parameters. Tests will be carried out to see how far the choice and size of the training set affect classification performance, i.e. what percentage of papers would need to be manually screened to locate 100% of the papers included by the traditional manual method.
Results: From a search retrieval set of 9555 papers, authors excluded 9494 papers at title/abstract and 52 at full text, leaving 9 papers for inclusion in the review update. The ability of the machine classifier to reduce the percentage of papers that need to be manually screened to identify all the included studies, under different training conditions, will be reported.
Conclusions: The findings of this study will be presented along with an estimate of any efficiency gains for the author team if the screening process can be semi-automated using text mining methodology, together with a discussion of the implications for text mining in screening papers within complex health reviews.
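The retrospective simulation this abstract describes can be sketched as a simple loop: screen a batch, retrain on everything labelled so far, re-rank the unscreened papers, and stop once every known include has been found. The token-overlap "classifier" below is a deliberately naive stand-in (the abstract does not name the actual classifier); the sketch only shows the shape of the simulation and its output metric, the fraction of papers screened before all includes are located.

```python
def score(paper_tokens, include_tokens, exclude_tokens):
    # Naive stand-in classifier: tokens shared with known includes count
    # positively, tokens shared with known excludes count weakly negatively.
    return (sum(1.0 for t in paper_tokens if t in include_tokens)
            - sum(0.1 for t in paper_tokens if t in exclude_tokens))

def simulate_priority_screening(papers, labels, batch=2):
    """papers: list of token sets; labels: parallel list of booleans
    (True = included by the manual reviewers).  Returns the fraction of
    papers 'manually' screened before every include was located."""
    pool = list(range(len(papers)))
    screened, include_tok, exclude_tok = [], set(), set()
    while pool:
        # Re-rank the unscreened papers with the current classifier.
        pool.sort(key=lambda i: score(papers[i], include_tok, exclude_tok),
                  reverse=True)
        for i in pool[:batch]:                  # screen the next batch
            screened.append(i)
            (include_tok if labels[i] else exclude_tok).update(papers[i])
        pool = pool[batch:]
        if sum(labels[i] for i in screened) == sum(labels):
            break                               # all known includes found
    return len(screened) / len(papers)
```

With a real feature representation and classifier the same loop yields the study's headline number: the percentage of the 9555 records that would have had to be screened to recover all 9 includes.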
Abstract:
This paper addresses less recognised factors which influence the diffusion of a particular technology. While an innovation's attributes and performance are paramount, many innovations fail because of external factors which favour an alternative. Drawing on diffusion, lock-in and path-dependency theory, this paper presents a qualitative study of the external factors that influenced the evolution of transportation in the USA. This historical account reveals how one technology and its emergent systems became dominant while other choices were overridden by socio-political, economic and technological interests, including not just the manufacturing and service industries associated with the automobile but also government and market stakeholders. Termed here a large socio-economic regime (LSER), its power in ensuring lock-in and continued path-dependency is shown to pass through three stages, weakening eventually as awareness improves. The study extends to transport trends in China, Korea, Indonesia and Malaysia, all of which show the dominant role of an LSER. As transportation policy is increasingly required to address both demand and environmental concerns, and as innovators search for solutions, this paper presents important knowledge for innovators, marketers and policy makers, for commercial and societal reasons, especially when negative externalities associated with an incumbent transportation technology may lead to market failure.
Abstract:
School curriculum change processes have traditionally been managed internally. However, in Queensland, Australia, as a response to the current high-stakes accountability regime, more and more principals are outsourcing this work to external change agents (ECAs). In 2009, one of the authors (a university lecturer and ECA) developed a curriculum change model (the Controlled Rapid Approach to Curriculum Change (CRACC)), specifically outlining the involvement of an ECA in the initiation phase of a school's curriculum change process. The purpose of this paper is to extend the CRACC model by unpacking the implementation phase, drawing on data from a pilot study of a single school. Interview responses revealed that during the implementation phase, teachers wanted to be kept informed of the wider educational context; use data to constantly track students; relate pedagogical practices to testing practices; share information between departments and professional levels; and own whole-school performance. It is suggested that the findings would be transferable to other school settings and to internal leadership of curriculum change. The paper also strikes a note of concern: do teachers operating in such an accountability regime live their professional lives within this corporate and globalised ideology whether they want to or not?
Abstract:
As negative employee attitudes towards alcohol and other drug (AOD) policies may have serious consequences for organizations, the present study examined demographic and attitudinal dimensions leading to employees' perceptions of AOD policy effectiveness. Survey responses were obtained from 147 employees in an Australian agricultural organization. Three dimensions of attitudes towards AOD policies were examined: knowledge of policy features, attitudes towards testing, and preventative measures such as job design and organizational involvement in community health. Demographic differences were identified, with males and blue-collar employees reporting significantly more negative attitudes towards the AOD policy. Attitude dimensions were stronger predictors of perceptions of policy effectiveness than demographics, and the strongest predictor was preventative measures. This suggests that organizations should do more than design adequate and fair AOD policies; they should take a more holistic approach to AOD impairment by engaging in workplace design to reduce AOD use and by promoting a consistent health message to employees and the community.
Abstract:
Drawing upon an action learning perspective, we hypothesized that a leader’s learning of project leadership skills would be related to facilitative leadership, team reflexivity, and team performance. Secondly, we proposed that new and experienced leaders would differ in the amount they learn from their current and recent experience as project managers, and in the strength of the relationship between their self-reported learning, facilitative leadership, and team reflexivity. We conducted a 1-year longitudinal study of 50 R&D teams, led by 25 new and 25 experienced leaders, with 313 team members and 22 project customers, collecting both quantitative and qualitative data. We found evidence of a significant impact of the leader’s learning on subsequent facilitative leadership and team performance 8 and 12 months later, suggesting a lag between learning leadership skills and translating these skills into leadership behavior. The findings contribute to an understanding of how leaders consolidate their learned experience into facilitative leadership behavior.
Abstract:
This chapter explores the possibility and exigencies of employing hypotheses, or educated guesses, as the basis for ethnographic research design. The authors' goal is to examine whether using hypotheses might provide a path to resolve some of the challenges to knowledge claims produced by ethnographic studies. Through resolution of the putative division between qualitative and quantitative research traditions, it is argued that hypotheses can serve as inferential warrants in qualitative and ethnographic studies.
Abstract:
Map-matching algorithms that utilise road segment connectivity along with other data (i.e. position, speed and heading) in the process of map-matching are normally suitable for high frequency (1 Hz or higher) positioning data from GPS. When applying such map-matching algorithms to low frequency data (such as data from a fleet of private cars, buses or light duty vehicles, or from smartphones), their performance falls to around 70% in terms of correct link identification, especially in urban and suburban road networks. This level of performance may be insufficient for some real-time Intelligent Transport System (ITS) applications and services, such as estimating link travel time and speed from low frequency GPS data. Therefore, this paper develops a new weight-based shortest path and vehicle trajectory aided map-matching (stMM) algorithm that enhances the map-matching of low frequency positioning data on a road map. The well-known A* search algorithm is employed to derive the shortest path between two points while taking into account both link connectivity and turn restrictions at junctions. In the developed stMM algorithm, two additional weights related to the shortest path and vehicle trajectory are considered: one shortest path-based weight relates the distance along the shortest path to the distance along the vehicle trajectory, while the other is associated with the heading difference of the vehicle trajectory. The developed stMM algorithm is tested using a series of real-world datasets of varying frequencies (i.e. 1 s, 5 s, 30 s and 60 s sampling intervals). A high-accuracy integrated navigation system (a high-grade inertial navigation system and a carrier-phase GPS receiver) is used to measure the accuracy of the developed algorithm. The results suggest that the algorithm identifies 98.9% of the links correctly for GPS data with a 30 s sampling interval.
When the shortest-path and vehicle-trajectory information is omitted, the accuracy of the algorithm falls to about 73% in terms of correct link identification. The algorithm can process on average 50 positioning fixes per second, making it suitable for real-time ITS applications and services.
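The backbone of the stMM algorithm, an A* shortest-path search over the road graph, can be sketched as follows. This is a generic textbook A* on a toy node/edge structure with a straight-line-distance heuristic; the real algorithm additionally encodes turn restrictions at junctions and the two trajectory-based weights described above, which are omitted here.

```python
import heapq
import math

def a_star(graph, coords, start, goal):
    """graph: {node: [(neighbour, link length), ...]}; coords: {node: (x, y)}.
    Returns (path, total length) of the shortest path from start to goal,
    using the straight-line distance to the goal as the admissible heuristic."""
    def h(n):
        (x1, y1), (x2, y2) = coords[n], coords[goal]
        return math.hypot(x2 - x1, y2 - y1)

    open_set = [(h(start), 0.0, start, [start])]  # (f = g + h, g, node, path)
    best = {}                                     # cheapest g seen per node
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path, g
        if node in best and best[node] <= g:
            continue                              # stale queue entry
        best[node] = g
        for nxt, length in graph.get(node, []):
            heapq.heappush(open_set,
                           (g + length + h(nxt), g + length, nxt, path + [nxt]))
    return None, math.inf
```

In the stMM setting, the length returned for the path between two consecutive GPS fixes would then be compared against the distance along the vehicle trajectory to form one of the two extra map-matching weights.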
Abstract:
Aromatherapy has been found to have some effectiveness in treating conditions such as postoperative nausea and vomiting; however, unless clinicians are aware of and convinced by this evidence, it is unlikely they will choose to use it with their patients. The aim of this study was to test and modify an existing tool, Martin and Furnham's Beliefs About Aromatherapy Scale, to make it relevant and meaningful for use with a population of nurses and midwives working in an acute hospital setting. A Delphi process was used to modify the tool, which was then tested in a population of nurses and midwives, and exploratory factor analysis was conducted. The modified tool is reliable and valid for measuring beliefs about aromatherapy in this population.
Abstract:
This paper is not about the details of yet another robot control system, but rather the issues surrounding real-world robotic implementation. It is a fact that in order to realise a future where robots co-exist with people in everyday places, we have to pass through a developmental phase that involves some risk. Putting a “Keep Out, Experiment in Progress” sign on the door is no longer possible, since we are now at a level of capability that requires testing over long periods of time in complex, realistic environments that contain people. We all know that controlling the risk is important – a serious accident could set the field back globally – but just as important is convincing others that the risks are known and controlled. In this article, we describe our experience going down this path and show that health and safety assessment for mobile robotics research is still unexplored territory in universities and is often ignored. We hope that the article will make robotics research labs in universities around the world take note of these issues, rather than operating under the radar, and thereby prevent any catastrophic accidents.
Abstract:
Many researchers in the field of civil structural health monitoring have developed and tested their methods on simple to moderately complex laboratory structures such as beams, plates, frames, and trusses. Field work has also been conducted by many researchers and practitioners on more complex operating bridges. Most laboratory structures, however, do not adequately replicate the complexity of truss bridges. This paper presents some preliminary results of experimental modal testing and analysis of the bridge model presented in the companion paper, using the peak picking method, and compares these results with those of a simple numerical model of the structure. Three dominant modes of vibration were experimentally identified below 15 Hz. The mode shapes and the order of the modes matched those of the numerical model; however, the frequencies did not.
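As a rough illustration of the peak picking method named above: candidate natural frequencies are read off as local maxima of a response magnitude spectrum within the band of interest (here, below 15 Hz as in the study). A minimal sketch, assuming the spectrum has already been computed from the measured response:

```python
def peak_pick(freqs, mags, fmax=15.0):
    """Peak-picking modal identification: return the frequencies of local
    maxima in the magnitude spectrum below fmax, i.e. the candidate
    natural frequencies.  freqs and mags are parallel lists."""
    return [freqs[i] for i in range(1, len(mags) - 1)
            if freqs[i] < fmax and mags[i - 1] < mags[i] > mags[i + 1]]
```

Real modal testing would precede this with averaging and windowing of the measured spectra, and follow it with mode-shape extraction from the cross-spectra at each identified peak.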
Abstract:
While the implementation of the IEC 61850 standard has significantly enhanced the performance of communications in electrical substations, it has also increased the complexity of the system. These added elaborations have introduced new challenges in relation to the skills and tools required for the design, testing and maintenance of 61850-compatible substations. This paper describes a practical experience of testing a protection relay using non-conventional test equipment; in addition, it proposes a third-party software technique to reveal the contents of the packets transferred on the substation network. Using this approach, the standard objects can be linked to, and interpreted in terms of, what end users normally see in the IED and test equipment proprietary software programs.
Abstract:
One of the objectives of this study was to evaluate soil testing equipment based on its capability of measuring in-place stiffness or modulus values. As design criteria transition from empirical to mechanistic-empirical, soil test methods and equipment that measure properties such as stiffness and modulus, and how they relate to Florida materials, are needed. Requirements for the selected equipment are that it be portable, cost-effective, reliable, accurate, and repeatable. A second objective is that the selected equipment measure soil properties without the use of nuclear materials. The current device used to measure soil compaction is the nuclear density gauge (NDG). Equipment evaluated in this research included lightweight deflectometers (LWD) from different manufacturers, a dynamic cone penetrometer (DCP), a GeoGauge, a Clegg impact soil tester (CIST), a Briaud compaction device (BCD), and a seismic pavement analyzer (SPA). Evaluations were conducted over ranges of measured densities and moistures. Testing (Phases I and II) was conducted in a test box and test pits. Phase III testing was conducted on materials found on five construction projects located in the Jacksonville, Florida, area. Phase I analyses determined that the GeoGauge had the lowest overall coefficient of variation (COV). In ascending order of COV were the accelerometer-type LWD, the geophone-type LWD, the DCP, the BCD, and the SPA, which had the highest overall COV. As a result, the BCD and the SPA were excluded from Phase II testing. In Phase II, measurements obtained from the selected equipment were compared to the modulus values obtained by the static plate load test (PLT), the resilient modulus (MR) from laboratory testing, and the NDG measurements. To minimize soil and moisture content variability, the single spot testing sequence was developed.
At each location, test results obtained from the portable equipment under evaluation were compared to the values from adjacent NDG, PLT, and laboratory MR measurements. Correlations were developed through statistical analysis. Target values were developed for various soils for verification on similar soils that were field tested in Phase III. The single spot testing sequence was also employed in the Phase III field testing, performed on A-3 and A-2-4 embankments, limerock-stabilized subgrade, limerock base, and graded aggregate base found on Florida Department of Transportation construction projects. The Phase II and Phase III results provided potential trend information for future research—specifically, data collection for in-depth statistical analysis for correlations with the laboratory MR for specific soil types under specific moisture conditions. With the collection of enough data, stronger relationships could be expected between measurements from the portable equipment and the MR values. Based on the statistical analyses and the experience gained from extensive use of the equipment, the combination of the DCP and the LWD was selected for in-place soil testing for compaction control acceptance. Test methods and developmental specifications were written for the DCP and the LWD. The developmental specifications include target values for the compaction control of embankment, subgrade, and base materials.
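The COV statistic used to rank the devices in Phase I is simply the relative standard deviation of repeat readings. A minimal sketch of that ranking step (the device names are from the study, but the readings below are illustrative, not the study's data):

```python
import statistics

def coefficient_of_variation(readings):
    """COV (%) = sample standard deviation / mean * 100 -- the
    repeatability statistic used to rank the devices in Phase I."""
    return statistics.stdev(readings) / statistics.mean(readings) * 100

def rank_by_cov(device_readings):
    """device_readings: {device: [repeat measurements]} -> device names in
    ascending order of COV (most repeatable first)."""
    return sorted(device_readings,
                  key=lambda d: coefficient_of_variation(device_readings[d]))
```

Under this metric a device whose repeat readings cluster tightly around their mean (like the GeoGauge in Phase I) ranks first, regardless of the absolute magnitude of the values it reports.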
Abstract:
Canonical single-stranded DNA-binding proteins (SSBs) from the oligosaccharide/oligonucleotide-binding (OB) domain family are present in all known organisms and are critical for DNA replication, recombination and repair. The SSB from the hyperthermophilic crenarchaeote Sulfolobus solfataricus (SsoSSB) has a ‘simple’ domain organization consisting of a single DNA-binding OB fold coupled to a flexible C-terminal tail, in contrast with other SSBs in this family that incorporate up to four OB domains. Despite the large differences in domain organization within the SSB family, the structure of the OB domain is remarkably similar across all cellular life forms. However, there are significant differences in the molecular mechanism of ssDNA binding. We have determined the structure of the SsoSSB OB domain bound to ssDNA by NMR spectroscopy. We reveal that ssDNA recognition is modulated by base-stacking of three key aromatic residues, in contrast with the OB domains of human RPA and the recently discovered human homologue of SsoSSB, hSSB1. We also demonstrate that SsoSSB binds ssDNA with a footprint of five bases and with a defined binding polarity. These data elucidate the structural basis of DNA binding and shed light on the molecular mechanism by which these ‘simple’ SSBs interact with ssDNA.
Abstract:
First year nursing students commonly find bioscience to be challenging. A Facebook community site was established to support and engage these students. The site was facilitated by virtual peer mentors and the unit coordinator. The high participation rate and the strong recommendation to future students indicated that the site successfully enabled student interaction and engagement with their learning. The students found it to be a readily accessible network and valued the useful resources and learning strategies provided by their peers. The sharing of both learning challenges and successful learning practices can help students build a sense of belonging and an understanding of academic practices and behaviours that can contribute to their learning success at university.
Abstract:
Substation Automation Systems have undergone many transformational changes triggered by improvements in technologies. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems today are largely computerised, we are still undertaking commissioning testing using the same philosophy, as if each signal were hard wired. This is slow and tedious and doesn’t do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it “abstracts” the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks, which are made available at the design, configuration and operational stages of the system. The primary advantage of accessing data using this methodology, rather than the traditional position-based method (such as DNP 3.0), is that generic tools can be created to manipulate data. Self-describing data contains the information that these tools need to manipulate different data types correctly. More importantly, self-describing data makes the interface between programs robust and flexible. This paper proposes that the improved data definitions and methods for dealing with this data within a tightly bound and compliant IEC 61850 Substation Automation System could completely transform how systems are tested, when compared to traditional point-to-point methods.
Using the outcomes of an undergraduate thesis project, we can demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications are made at any stage of the commissioning process. This tool has been implemented in a Java programming environment using an open source IEC 61850 library to facilitate the server-client association with the relay.
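The comparison step the thesis tool performs is essentially a set difference between the data-object references extracted from the relay and those declared in its CID file. The actual tool was written in Java against an open-source IEC 61850 library; the sketch below shows only that comparison logic, in Python, on hypothetical dataset names and object references (the parsing of the SCL/CID file and the client association with the relay are assumed to have happened already).

```python
def compare_datasets(relay, cid):
    """relay / cid: {dataset name: set of data-object references}.
    Returns (missing, unexpected): references the CID file declares but
    the relay lacks, and references the relay exposes but the CID file
    does not.  Two empty dicts mean the relay matches its CID file."""
    missing = {ds: refs - relay.get(ds, set())
               for ds, refs in cid.items() if refs - relay.get(ds, set())}
    unexpected = {ds: refs - cid.get(ds, set())
                  for ds, refs in relay.items() if refs - cid.get(ds, set())}
    return missing, unexpected
```

Running such a check after every commissioning step gives the quick, automatic confirmation described above that no unexpected modifications have been made to the relay's data sets.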