881 results for Brain sMRI data
Abstract:
Objective: To assess the accuracy of data linkage across the spectrum of emergency care in the absence of a unique patient identifier, and to use the linked data to examine service delivery outcomes in an emergency department setting. Design: Automated data linkage and manual data linkage were compared to determine their relative accuracy. Data were extracted from three separate health information systems: ambulance, ED and hospital inpatients, then linked to provide information about the emergency journey of each patient. The linking was done manually through physical review of records and automatically using a data linking tool (Health Data Integration) developed by the CSIRO. Match rate and quality of the linking were compared. Setting: 10,835 patient presentations to a large, regional teaching hospital ED over a two-month period (August-September 2007). Results: Comparison of the manual and automated linkage outcomes for each pair of linked datasets demonstrated a sensitivity of between 95% and 99%, a specificity of between 75% and 99%, and a positive predictive value of between 88% and 95%. Conclusions: Our results indicate that automated linking provides a sound basis for health service analysis, even in the absence of a unique patient identifier. The use of an automated linking tool yields accurate data suitable for planning and service delivery purposes and enables the data to be linked regularly to examine service delivery outcomes.
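As a rough illustration of the comparison described above, the sketch below computes sensitivity, specificity and positive predictive value for an automated linkage run, treating manual linkage as the reference standard. The record pairs and link decisions are invented for illustration and are not drawn from the study.

```python
# Hypothetical illustration: score automated linkage against manual linkage.
def linkage_metrics(manual_links, auto_links, all_pairs):
    tp = sum(1 for p in all_pairs if p in manual_links and p in auto_links)
    fp = sum(1 for p in all_pairs if p not in manual_links and p in auto_links)
    fn = sum(1 for p in all_pairs if p in manual_links and p not in auto_links)
    tn = sum(1 for p in all_pairs if p not in manual_links and p not in auto_links)
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    return sensitivity, specificity, ppv

# Invented candidate pairs (ambulance record, ED record) and link decisions.
all_pairs = {("A1", "E1"), ("A1", "E2"), ("A2", "E2"), ("A2", "E3"), ("A3", "E3")}
manual = {("A1", "E1"), ("A2", "E2"), ("A3", "E3")}
auto = {("A1", "E1"), ("A2", "E2"), ("A1", "E2")}
print(linkage_metrics(manual, auto, all_pairs))  # sensitivity, specificity, PPV
```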
Abstract:
Road safety is a major concern worldwide, and it improves as road conditions and their effects on crashes are continually investigated. This paper proposes using data mining to incorporate a broader set of road variables for all available crashes with skid resistance values across the Queensland state main road network, in order to understand the relationships among crash, traffic and road variables. It presents a data-mining-based methodology for road asset management data that identifies the road properties contributing disproportionately to crashes. The models demonstrate high accuracy in predicting crashes on roads when these road properties are included. The findings of these models show the relationships among skid resistance, crashes, crash characteristics and other road characteristics such as seal type, seal age, road type, texture depth, lane count, pavement width, rutting, speed limit, traffic rates, intersections, traffic signage and road design.
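A minimal sketch, under assumed data and column names, of the kind of segment-level analysis this methodology implies: road asset attributes are joined with crash records and a simple tree model is fitted. It is not the authors' pipeline, and the attribute names (`skid_resistance`, `seal_age`, `speed_limit`) are hypothetical.

```python
# Illustrative only: join hypothetical road-asset attributes with crash records
# at the segment level, then fit a simple decision tree on the result.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

roads = pd.DataFrame({
    "segment_id": [1, 2, 3, 4],
    "skid_resistance": [0.35, 0.55, 0.42, 0.60],
    "seal_age": [12, 3, 8, 1],
    "speed_limit": [100, 60, 80, 100],
})
crashes = pd.DataFrame({"segment_id": [1, 1, 3]})   # one row per recorded crash

counts = crashes.groupby("segment_id").size().rename("crash_count")
data = roads.join(counts, on="segment_id").fillna({"crash_count": 0})
data["had_crash"] = (data["crash_count"] > 0).astype(int)

features = ["skid_resistance", "seal_age", "speed_limit"]
model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(data[features], data["had_crash"])
print(model.predict(data[features]))                # in-sample check only
```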
Abstract:
Developing safe and sustainable road systems is a common goal in all countries. Applications to assist with road asset management and crash minimization are sought universally. This paper presents a data mining methodology that uses decision trees to model the crash proneness of road segments from available road and crash attributes. The models quantify the concept of crash proneness and demonstrate that road segments with only a few crashes have more in common with non-crash roads than with roads with higher crash counts. This paper also examines ways of dealing with the highly unbalanced data sets encountered in the study.
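One generic way to deal with such a highly unbalanced crash/non-crash split is class weighting when growing the decision tree. The sketch below uses scikit-learn's `class_weight="balanced"` option on synthetic data; it illustrates the general technique, not the specific treatment used in the study.

```python
# Generic handling of a highly unbalanced crash/non-crash split via class weights.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.random((1000, 5))                      # hypothetical road attributes
y = (rng.random(1000) < 0.05).astype(int)      # ~5% of segments are crash-prone

tree = DecisionTreeClassifier(
    max_depth=4,
    class_weight="balanced",                   # up-weights the rare crash class
    random_state=0,
)
tree.fit(X, y)
print("training accuracy:", tree.score(X, y))
```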
Abstract:
It is commonly accepted that wet roads carry a higher risk of crashes than dry roads; however, providing evidence to support this assumption presents some difficulty. This paper presents a data mining case study in which predictive data mining is applied to model the relationship between skid resistance and crashes, searching for discernible differences in the probability of wet and dry road segments having crashes based on skid resistance. The models identify an increased probability of wet road segments having crashes for mid-range skid resistance values.
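A small sketch, on synthetic data, of the style of comparison described: a predictive model is fitted on skid resistance and a wet/dry indicator, and predicted crash probabilities are inspected separately for wet and dry segments. The data-generating assumptions below are invented for illustration.

```python
# Synthetic comparison of predicted crash probability for wet vs dry segments.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
skid = rng.uniform(0.3, 0.7, 2000)             # assumed skid resistance range
wet = rng.integers(0, 2, 2000)                 # 1 = wet-road segment, 0 = dry
# Invented outcome: wet segments with mid-range skid resistance crash more often.
p = 0.05 + 0.10 * wet * np.exp(-((skid - 0.5) ** 2) / 0.01)
crash = (rng.random(2000) < p).astype(int)

model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(np.column_stack([skid, wet]), crash)

grid = np.linspace(0.3, 0.7, 5)
for w in (0, 1):
    probs = model.predict_proba(np.column_stack([grid, np.full(5, w)]))[:, 1]
    print("wet" if w else "dry", np.round(probs, 3))
```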
Abstract:
The Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE) is a research programme that aims to uncover the factors that initiate, hinder and facilitate the process of emergence of new economic activities and organizations. It is widely acknowledged that entrepreneurship is one of the most important forces shaping changes in a country’s economic landscape (Baumol 1968; Birch 1987; Acs 1999). An understanding of the process by which new economic activity and business entities emerge is vital (Gartner 1993; Sarasvathy 2001). An important development in the study of ‘nascent entrepreneurs’ and ‘firms in gestation’ was the Panel Study of Entrepreneurial Dynamics (PSED) (Gartner et al. 2004) and its extensions in Argentina, Canada, Greece, the Netherlands, Norway and Sweden. Yet while PSED I is an important first step towards systematically studying new venture emergence, it represents just the beginning of a stream of nascent venture studies; most notably, PSED II is currently being undertaken in the US (2005–10) (Reynolds and Curtin 2008).
Abstract:
Road crashes cost Australian and world society a significant proportion of GDP, affecting productivity and causing significant suffering for communities and individuals. This paper presents a case study that generates data mining models that contribute to the understanding of road crashes by allowing examination of the role of skid resistance (F60) and other road attributes in road crashes. Predictive data mining algorithms, primarily regression trees, were used to produce road segment crash count models from the road and traffic attributes of crash scenarios. The rules derived from the regression trees provide evidence of the significance of road attributes in contributing to crashes, with a focus on the evaluation of skid resistance.
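The sketch below shows, on synthetic data, how a regression tree can produce segment crash-count predictions and how its rules can be printed for inspection, in the spirit of the rule extraction described above. The features (F60, AADT) and the data-generating process are assumptions for illustration only.

```python
# Synthetic regression-tree model of segment crash counts, with rules printed.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(2)
f60 = rng.uniform(0.3, 0.8, 500)               # assumed skid resistance (F60)
aadt = rng.uniform(500, 20000, 500)            # assumed traffic volume
counts = rng.poisson(0.5 + 3 * (f60 < 0.45) + aadt / 10000)

X = np.column_stack([f60, aadt])
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, counts)
print(export_text(tree, feature_names=["F60", "AADT"]))
```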
Abstract:
This paper describes the characterisation, for airborne use, of the public mobile data communication systems broadly known as 3G. The motivation for this study was to explore how these mature public communication systems could be used for aviation purposes. An experimental system was fitted to a light aircraft to record communication latency, line speed, RF level, packet loss and cell tower identifier. Communications were established using internet protocols and a connection was made to a local server. The aircraft was flown in both remote and populous areas at altitudes up to 8,500 ft in a region located in South East Queensland, Australia. Results show that the average airborne RF levels are better than those on the ground by 21%, in the order of -77 dBm. Latencies were in the order of 500 ms (half the latency of Iridium), with an average downlink speed of 0.48 Mb/s, an average uplink speed of 0.85 Mb/s and a packet loss of 6.5%. The maximum communication range observed was 70 km from a single cell station. The paper also describes possible limitations and the utility of using such a communications architecture for both manned and unmanned aircraft systems.
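For a sense of how such flight-test logs reduce to the reported figures, the sketch below aggregates a few hypothetical in-flight samples into mean RF level, latency, packet loss and downlink speed. The field names and values are made up and not taken from the experiment.

```python
# Reduce a hypothetical in-flight log to summary figures of the kind reported.
from statistics import mean

samples = [
    {"rf_dbm": -75, "latency_ms": 480, "lost": 0, "downlink_mbps": 0.51},
    {"rf_dbm": -79, "latency_ms": 530, "lost": 1, "downlink_mbps": 0.44},
    {"rf_dbm": -77, "latency_ms": 495, "lost": 0, "downlink_mbps": 0.49},
]

print("mean RF (dBm):", mean(s["rf_dbm"] for s in samples))
print("mean latency (ms):", mean(s["latency_ms"] for s in samples))
print("packet loss (%):", 100 * sum(s["lost"] for s in samples) / len(samples))
print("mean downlink (Mb/s):", round(mean(s["downlink_mbps"] for s in samples), 2))
```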
Abstract:
A Wireless Sensor Network (WSN) is a set of sensors that are integrated with a physical environment. These sensors are small in size and capable of sensing physical phenomena and processing them. They communicate in a multihop manner, due to their short radio range, to form an ad hoc network capable of reporting network activities to a data collection sink. Recent advances in WSNs have led to several new promising applications, including habitat monitoring, military target tracking, natural disaster relief, and health monitoring. A current sensor node, such as the MICA2, uses a 16-bit, 8 MHz Texas Instruments MSP430 micro-controller with only 10 KB of RAM, 128 KB of program space and 512 KB of external flash memory to store measurement data, and is powered by two AA batteries. Due to these unique specifications and a lack of tamper-resistant hardware, devising security protocols for WSNs is complex. Previous studies show that data transmission consumes much more energy than computation. Data aggregation can greatly help to reduce this consumption by eliminating redundant data. However, aggregators are under the threat of various types of attacks. Among them, node compromise is usually considered one of the most challenging for the security of WSNs. In a node compromise attack, an adversary physically tampers with a node in order to extract its cryptographic secrets. This attack can be very harmful depending on the security architecture of the network. For example, when an aggregator node is compromised, it is easy for the adversary to change the aggregation result and inject false data into the WSN. The contributions of this thesis to the area of secure data aggregation are manifold. Firstly, we define security for data aggregation in WSNs; in contrast with existing secure data aggregation definitions, the proposed definition covers the unique characteristics of WSNs. Secondly, we analyze the relationship between the security services and adversarial models considered in existing secure data aggregation schemes in order to provide a general framework of required security services. Thirdly, we analyze existing cryptographic-based and reputation-based secure data aggregation schemes; this analysis covers the security services provided by these schemes and their robustness against attacks. Fourthly, we propose a robust reputation-based secure data aggregation scheme for WSNs that minimizes the use of heavy cryptographic mechanisms. The security advantages of this scheme are realized by integrating aggregation functionalities with (i) a reputation system, (ii) estimation theory, and (iii) a change detection mechanism; we show that this combination helps defend against most of the security attacks discussed in this thesis, including the On-Off attack. Finally, we propose a secure key management scheme to distribute essential pairwise and group keys among the sensor nodes. The proposed scheme combines Lamport's reverse hash chain with the usual hash chain to provide both past and future key secrecy. The proposal avoids delivering the whole value of a new group key during a group key update; instead, only half of the value is transmitted from the network manager to the sensor nodes. This way, the compromise of a pairwise key alone does not lead to the compromise of the group key. The new pairwise key in our scheme is determined by Diffie-Hellman based key agreement.
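The sketch below illustrates, in simplified form, the general idea of combining a Lamport-style reverse hash chain with a forward hash chain so that a group key is derived from two halves and the compromise of one value alone does not reveal it. This is a conceptual illustration under assumed details, not the thesis's exact protocol, which additionally relies on Diffie-Hellman based pairwise keys.

```python
# Conceptual sketch only: a reverse (Lamport-style) hash chain plus a forward
# hash chain, with the group key derived from one value of each chain.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# The network manager precomputes the reverse chain from a secret seed and
# publishes its final value as a commitment; earlier values are released later.
n = 10
chain = [h(b"reverse-seed")]
for _ in range(n):
    chain.append(h(chain[-1]))
reverse_chain = list(reversed(chain))      # reverse_chain[0] is the commitment

forward_key = h(b"forward-seed")           # evolves by hashing at each epoch

def group_key(epoch: int, fwd: bytes) -> bytes:
    # Combining both halves means one compromised value alone is not enough.
    return h(reverse_chain[epoch] + fwd)

k1 = group_key(1, forward_key)
forward_key = h(forward_key)               # old forward key is discarded
k2 = group_key(2, forward_key)
print(k1.hex()[:16], k2.hex()[:16])
```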
Abstract:
When an organisation becomes aware that one of its products may pose a safety risk to customers, it must take appropriate action as soon as possible or it can be held liable. The ability to automatically trace potentially dangerous goods through the supply chain would thus help organisations fulfil their legal obligations in a timely and effective manner. Furthermore, product recall legislation requires manufacturers to separately notify various government agencies, the health department and the public about recall incidents. This duplication of effort and paperwork can introduce errors and data inconsistencies. In this paper, we examine traceability and notification requirements in the product recall domain from two perspectives: the activities carried out during the manufacturing and recall processes, and the data collected during the enactment of these processes. We then propose a workflow-based coordination framework to support these data and process requirements.
Abstract:
Monitoring and assessing environmental health is becoming increasingly important as human activity and climate change place greater pressure on global biodiversity. Acoustic sensors provide the ability to collect data passively, objectively and continuously across large areas for extended periods of time. While these factors make acoustic sensors attractive as autonomous data collectors, there are significant issues associated with large-scale data manipulation and analysis. We present our current research into techniques for analysing large volumes of acoustic data effectively and efficiently. We provide an overview of a novel online acoustic environmental workbench and discuss a number of approaches to scaling the analysis of acoustic data: collaborative, manual, automatic and human-in-the-loop analysis.
Comparison of standard image segmentation methods for segmentation of brain tumors from 2D MR images
Abstract:
In the analysis of medical images for computer-aided diagnosis and therapy, segmentation is often required as a preliminary step. Medical image segmentation is a complex and challenging task due to the complex nature of the images. The brain has a particularly complicated structure, and its precise segmentation is very important for detecting tumors, edema and necrotic tissues in order to prescribe appropriate therapy. Magnetic Resonance Imaging (MRI) is an important diagnostic imaging technique used for early detection of abnormal changes in tissues and organs. It possesses good contrast resolution for different tissues and is thus preferred over Computerized Tomography for brain studies. Therefore, the majority of research in medical image segmentation concerns MR images. At the core of this research, a set of MR images was segmented using standard image segmentation techniques to isolate a brain tumor from the other regions of the brain. The resulting images from the different segmentation techniques were then compared with each other and analyzed by professional radiologists to determine which segmentation technique is the most accurate. Experimental results show that Otsu's thresholding method is the most suitable image segmentation method for segmenting a brain tumor from a magnetic resonance image.
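A minimal sketch of Otsu thresholding, the method the study found most suitable, applied here to a synthetic stand-in for a 2D MR slice. Real tumor segmentation would additionally require preprocessing such as skull stripping and post-processing of the mask; the synthetic image below is purely illustrative.

```python
# Otsu thresholding on a synthetic stand-in for a 2D MR slice.
import numpy as np
from skimage import filters

rng = np.random.default_rng(0)
slice_2d = rng.normal(0.3, 0.05, (128, 128))   # dark "brain tissue" background
slice_2d[40:70, 50:85] += 0.4                  # invented hyperintense tumor region

threshold = filters.threshold_otsu(slice_2d)
mask = slice_2d > threshold                    # candidate tumor mask
print("Otsu threshold:", round(float(threshold), 3),
      "| segmented pixels:", int(mask.sum()))
```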
Abstract:
Participatory sensing enables the collection, processing, dissemination and analysis of environmental sensory data by ordinary citizens through mobile devices. Researchers have recognized the potential of participatory sensing and attempted to apply it to many areas. However, participants may submit low-quality, misleading, inaccurate or even malicious data. Therefore, finding a way to improve data quality has become a significant issue. This study proposes using reputation management to classify the gathered data and provide useful information for campaign organizers and data analysts to facilitate their decisions.
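A toy sketch of the reputation idea: each participant's score rises or falls with the judged plausibility of their submissions, and low-reputation contributors can be flagged for the campaign organizer. The scoring rule, reward/penalty values and threshold are invented for illustration and are not the scheme proposed in the study.

```python
# Toy reputation update: reward plausible submissions, penalize implausible ones.
def update_reputation(reputation, plausible, reward=0.05, penalty=0.20):
    rep = reputation + reward if plausible else reputation - penalty
    return min(max(rep, 0.0), 1.0)             # keep the score in [0, 1]

reputations = {"alice": 0.5, "bob": 0.5}       # hypothetical participants
submissions = [("alice", True), ("bob", False), ("alice", True), ("bob", False)]

for user, plausible in submissions:
    reputations[user] = update_reputation(reputations[user], plausible)

flagged = [u for u, r in reputations.items() if r < 0.3]
print(reputations, "| flag for review:", flagged)
```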