59 results for Data acquisition system
Abstract:
Recently there has been significant interest among researchers and practitioners in the use of Bluetooth as a complementary source of transport data. However, the literature offers only a limited understanding of the Bluetooth MAC Scanner (BMS) based data acquisition process and the properties of the data being collected. This paper first provides insight into the BMS data acquisition process. Thereafter, it uncovers interesting facts from analysis of real BMS data from both motorway and arterial networks in Brisbane, Australia. The knowledge gained helps researchers and practitioners understand the BMS data being collected, which is vital to the development of management and control algorithms using these data.
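A common use of BMS data of the kind this abstract describes is estimating travel times by matching MAC addresses detected at two scanner sites. The sketch below is illustrative only, not taken from the paper; the record layout, function name and the 30-minute matching window are all assumptions.

```python
from datetime import datetime, timedelta

def match_travel_times(upstream, downstream, max_gap=timedelta(minutes=30)):
    """Match MAC detections at two BMS sites and return travel times.

    upstream/downstream: dicts mapping an (anonymised) MAC address to its
    detection timestamp at that scanner. Only MACs seen at both sites,
    in order and within `max_gap`, yield a travel-time sample (seconds).
    """
    times = {}
    for mac, t_up in upstream.items():
        t_down = downstream.get(mac)
        if t_down is not None and timedelta(0) < t_down - t_up <= max_gap:
            times[mac] = (t_down - t_up).total_seconds()
    return times

# Illustrative detections: only "aa:bb" is seen at both scanners.
up = {"aa:bb": datetime(2013, 5, 1, 8, 0, 0),
      "cc:dd": datetime(2013, 5, 1, 8, 1, 0)}
down = {"aa:bb": datetime(2013, 5, 1, 8, 4, 30),
        "ee:ff": datetime(2013, 5, 1, 8, 2, 0)}
print(match_travel_times(up, down))  # {'aa:bb': 270.0}
```

In practice the window and de-duplication of repeated sightings would need tuning to the network being monitored.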
Abstract:
Aerosol mass spectrometers (AMS) are powerful tools for analysing the chemical composition of airborne particles, particularly organic aerosols, which are gaining increasing attention. However, the advantages of AMS in providing on-line data can be outweighed by the difficulties involved in its use in field measurements at multiple sites. In contrast to on-line measurement by AMS, a method involving sample collection on filters followed by subsequent analysis by AMS could significantly broaden the scope of AMS application. We report the application of such an approach to field studies at multiple sites. An AMS was deployed at 5 urban schools to determine the sources of the organic aerosols at the schools directly. PM1 aerosols were also collected on filters at these and 20 other urban schools. The filters were extracted with water and the extract passed through a nebulizer to regenerate the aerosols, which were then analysed by AMS. The mass spectra from the samples collected on filters at the 5 schools were found to have excellent correlations with those obtained directly by AMS, with r2 ranging from 0.89 to 0.98. Filter recoveries varied between the schools from 40% to 115%, possibly indicating that this method provides qualitative rather than quantitative information. The stability of the organic aerosols on Teflon filters was demonstrated by analysing samples stored for up to two years. Application of the procedure to the remaining 20 schools showed that secondary organic aerosols were the main source of aerosols at the majority of the schools. Overall, this procedure provides an accurate representation of the mass spectra of ambient organic aerosols and could facilitate rapid data acquisition at multiple sites where an AMS cannot be deployed for logistical reasons.
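The r2 values of 0.89-0.98 quoted above are squared Pearson correlations between paired mass spectra. A minimal sketch of that computation, with two hypothetical five-channel spectra standing in for the real AMS data:

```python
from math import sqrt

def r_squared(x, y):
    """Squared Pearson correlation between two equal-length mass
    spectra given as lists of signal intensities."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return (cov / (sx * sy)) ** 2

direct = [0.0, 1.0, 4.0, 2.0, 0.5]  # hypothetical on-line AMS spectrum
filt = [0.1, 0.9, 3.8, 2.2, 0.4]    # hypothetical filter-extract spectrum
print(round(r_squared(direct, filt), 3))
```

With real spectra the comparison would be run over the full set of m/z channels after normalisation.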
Abstract:
Cell line array (CMA) and tissue microarray (TMA) technologies are high-throughput methods for analysing both the abundance and distribution of gene expression in a panel of cell lines or multiple tissue specimens in an efficient and cost-effective manner. The process is based on Kononen's method of extracting a cylindrical core of paraffin-embedded donor tissue and inserting it into a recipient paraffin block. Donor tissue from surgically resected paraffin-embedded tissue blocks, frozen needle biopsies or cell line pellets can all be arrayed in the recipient block. The representative area of interest is identified and circled on a haematoxylin and eosin (H&E)-stained section of the donor block. Using a predesigned map showing a precise spacing pattern, a high-density array of up to 1,000 cores of cell pellets and/or donor tissue can be embedded into the recipient block using a tissue arrayer from Beecher Instruments. Depending on the depth of the cell line/tissue removed from the donor block, 100-300 consecutive sections can be cut from each CMA/TMA block. Sections can be stained for in situ detection of protein, DNA or RNA targets using immunohistochemistry (IHC), fluorescence in situ hybridisation (FISH) or mRNA in situ hybridisation (RNA-ISH), respectively. This chapter provides detailed methods for CMA/TMA design, construction and analysis, with in-depth notes on all technical aspects, including tips for dealing with common pitfalls the user may encounter. © Springer Science+Business Media, LLC 2011.
Abstract:
For decades, Supervisory Control and Data Acquisition (SCADA) and Industrial Control Systems (ICS) have used computers to monitor and control physical processes in many critical industries, including electricity generation, gas pipelines, water distribution, waste treatment, communications and transportation. Increasingly, these systems are interconnected with corporate networks via the Internet, leaving them vulnerable and exposed to the same cyber-attack risks as conventional networks. SCADA network services are often viewed as a specialty subject, more relevant to engineers than to standard IT personnel. Educators from two Australian universities have recognised these cultural issues and highlighted the gap between specialists with SCADA systems engineering skills and specialists in network security with an IT background. This paper describes a learning approach designed to help students bridge this gap, gain theoretical knowledge of SCADA systems' vulnerabilities to cyber-attacks via experiential learning, and acquire practical skills through actively participating in hands-on exercises.
Abstract:
This study extends the ‘zero scan’ method for CT imaging of polymer gel dosimeters to include multi-slice acquisitions. Multi-slice CT images consisting of 24 slices of 1.2 mm thickness were acquired of an irradiated polymer gel dosimeter and processed with the zero-scan technique. The results demonstrate that zero-scan based gel readout can be successfully applied to generate a three-dimensional image of the irradiated gel field. Compared to the raw CT images, the processed images and cross-gel profiles demonstrated reduced noise and clear visibility of the penumbral region. Moreover, these improved results further highlight the suitability of this method for volumetric reconstruction with reduced CT data acquisition per slice. This work shows that 3D volumes of irradiated polymer gel dosimeters can be acquired and processed with x-ray CT.
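As I understand the zero-scan idea (this sketch is an assumption about the method, not taken from the paper), each pixel's CT number is fitted linearly across repeated scans and extrapolated back to scan index zero, which suppresses random noise. A per-pixel version in pure Python:

```python
def zero_scan_pixel(values):
    """Least-squares linear fit of one pixel's CT number across repeated
    scans (scan index 1..N), extrapolated back to 'scan zero'.

    Returns the intercept of the fit, a low-noise estimate of the
    pixel value uncontaminated by scan-to-scan noise trends.
    """
    n = len(values)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return mean_y - slope * mean_x  # intercept at x = 0

# Illustrative noisy repeated readings of a single pixel (HU)
readings = [52.1, 51.8, 52.4, 51.9, 52.3]
print(round(zero_scan_pixel(readings), 2))  # 51.95
```

Applied independently to every pixel of every slice, this yields the reduced-noise multi-slice volume the abstract refers to.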
Abstract:
A new era of cyber warfare appeared on the horizon with the discovery and detection of Stuxnet. Allegedly planned, designed and created by the United States and Israel, Stuxnet is considered the first known cyber weapon used to attack an adversary state. Stuxnet's discovery drew considerable attention to the outdated and obsolete security of critical infrastructure. It became very apparent that the electronic devices used to control and operate critical infrastructure, such as programmable logic controllers (PLCs) and supervisory control and data acquisition (SCADA) systems, lack even basic security and protection measures. This is partly because, when these devices were designed, exposing them to the Internet was never envisaged. Now, with this exposure, these devices and systems are considered easy prey for adversaries.
Abstract:
This contribution outlines synchrotron-based X-ray micro-tomography and its potential use in structural geology and rock mechanics. The paper complements several recent reviews of X-ray micro-tomography. We summarize the general approach to data acquisition, post-processing and analysis, and thereby aim to provide an entry point for the interested reader. The paper includes tables listing relevant beamlines, a list of all available imaging techniques, and available free and commercial software packages for data visualization and quantification. We highlight potential applications in a review of relevant literature, including time-resolved experiments and digital rock physics. The paper concludes with a report on ongoing developments and upgrades at synchrotron facilities, to frame the future possibilities for imaging sub-second processes in centimetre-sized samples.
Abstract:
Supervisory Control and Data Acquisition (SCADA) systems are widely used to control critical infrastructure automatically. Capturing and analyzing packet-level traffic flowing through such a network is an essential requirement for problems such as legacy network mapping and fault detection. Working from captured network traffic, we present a simple modeling technique that supports mapping of the SCADA network topology via traffic monitoring. By characterizing atomic network components in terms of their input-output topology and the relationship between their data traffic logs, we show that these modeling primitives have good compositional behaviour, which allows complex networks to be modeled. Finally, the predictions generated by our model are found to be in good agreement with experimentally obtained traffic.
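The compositional idea in this abstract can be illustrated in miniature: model each atomic component by a relationship between its input and output traffic, then compose components so the composite's prediction follows from its parts. The class, names and the fixed packet-ratio model below are all illustrative assumptions, not the paper's actual formalism.

```python
class Component:
    """Atomic network element modelled by a simple relationship
    between input and output packet counts in its traffic logs."""
    def __init__(self, name, ratio):
        self.name = name
        self.ratio = ratio  # output packets emitted per input packet

    def predict(self, packets_in):
        return packets_in * self.ratio

def compose(*components):
    """Series composition: each component's predicted output traffic
    feeds the next, so the per-packet ratios multiply."""
    def predict(packets_in):
        for c in components:
            packets_in = c.predict(packets_in)
        return packets_in
    return predict

rtu = Component("RTU", ratio=2)       # e.g. one poll triggers two replies
gateway = Component("gateway", ratio=1)
pipeline = compose(rtu, gateway)
print(pipeline(100))  # predicted downstream packets for 100 polls -> 200
```

Comparing such predictions against logged traffic is what lets the topology of an unknown legacy network be inferred.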
Abstract:
This paper describes the limitations of using the International Statistical Classification of Diseases and Related Health Problems, Tenth Revision, Australian Modification (ICD-10-AM) to characterise patient harm in hospitals. Limitations were identified during a project that used diagnoses flagged by Victorian coders as hospital-acquired to devise a classification of 144 categories of hospital-acquired diagnoses (the Classification of Hospital Acquired Diagnoses, or CHADx). CHADx is a comprehensive data monitoring system designed to allow hospitals to monitor their complication rates month to month using a standard method. Difficulty in identifying a single event from linear sequences of codes, due to the absence of code linkage, was the major obstacle to developing the classification. Obstetric and perinatal episodes also presented challenges in distinguishing condition onset, that is, whether conditions were present on admission or arose after formal admission to hospital. Used appropriately, the CHADx allows hospitals to identify areas for future patient safety and quality initiatives. The value of timing information and code linkage should be recognised in the planning stages of any future electronic systems.
Abstract:
The 2010 biodiversity target agreed by signatories to the Convention on Biological Diversity directed the attention of conservation professionals toward the development of indicators with which to measure changes in biological diversity at the global scale. We considered why global biodiversity indicators are needed, what characteristics successful global indicators have, and how existing indicators perform. Because monitoring could absorb a large proportion of funds available for conservation, we believe indicators should be linked explicitly to monitoring objectives, and decisions about which monitoring schemes deserve funding should be informed by predictions of the value of such schemes to decision making. We suggest that raising awareness among the public and policy makers, auditing management actions, and informing policy choices are the most important global monitoring objectives. Using four well-developed indicators of biological diversity (extent of forests, coverage of protected areas, Living Planet Index, Red List Index) as examples, we analyzed the characteristics needed for indicators to meet these objectives. We recommend that conservation professionals improve on existing indicators by eliminating spatial biases in data availability, filling gaps in information about ecosystems other than forests, and improving understanding of the way indicators respond to policy changes. Monitoring is not an end in itself, and we believe it is vital that the ultimate objectives of global monitoring of biological diversity inform the development of new indicators. ©2010 Society for Conservation Biology.
Abstract:
IT consumerization is both a major opportunity and a significant challenge for organizations. However, IS research has so far scarcely discussed the implications for IT management. In this paper we address this topic by empirically identifying organizational themes for IT consumerization and conceptually exploring the direct and indirect effects on the business value of IT, IT capabilities, and the IT function. More specifically, based on two case studies, we identify eight organizational themes: consumer IT strategy, policy development and responsibilities, consideration of the private life of employees, user involvement in IT-related processes, individualization, updated IT infrastructure, end-user support, and data and system security. The contributions of this paper are: (1) the identification of organizational themes for IT consumerization; (2) the proposed effects on the business value of IT, IT capabilities and the IT function; and (3) the combination of empirical insights into IT consumerization with managerial theories in the IS discipline.
Abstract:
This case study examines innovative experimentation with mobile and cloud-based technologies, utilising “Guerrilla Research Tactics” (GRT), as a means of covertly retrieving data from the urban fabric. Originally inspired by participatory action research (Kindon et al., 2008) and unobtrusive research methods (Kellehear, 1993), the potential of GRT lies in its innate ability to offer researchers an alternative, creative approach to data acquisition, while simultaneously allowing them to engage with the public, who become active co-creators of knowledge. Its key characteristics are a political agenda, the unexpected and the unconventional, which allow for an interactive, unique and thought-provoking experience for both researcher and participant.
Abstract:
Real-time locating systems (RTLSs) are considered an effective way to identify and track the location of an object in both indoor and outdoor environments. Various RTLSs have been developed and made commercially available in recent years. Research into RTLSs in the construction sector is extensive, and results have been published in many construction-related academic journals over the past decade. A succinct and systematic review of current applications would help academics, researchers and industry practitioners identify existing research deficiencies and therefore future research directions. However, such a review has been lacking to date. This paper provides a framework for understanding RTLS research and development in the construction literature over the last decade. Background information relating to construction RTLS trends, accuracy, deployment, cost, purposes, advantages and limitations is provided. Four major research gaps are identified, and the corresponding research opportunities and directions are highlighted.
Abstract:
This thesis evaluates the security of Supervisory Control and Data Acquisition (SCADA) systems, which are one of the key foundations of many critical infrastructures. Specifically, it examines one of the standardised SCADA protocols, the Distributed Network Protocol Version 3, which attempts to provide a security mechanism ensuring that messages transmitted between devices are adequately secured from rogue applications. To achieve this, the thesis applies formal methods from theoretical computer science to formally analyse the correctness of the protocol.