257 results for Automatic rule extraction
Abstract:
Automatic labeling of white matter fibres in diffusion-weighted brain MRI is vital for comparing brain integrity and connectivity across populations, but is challenging. Whole brain tractography generates a vast set of fibres throughout the brain, but it is hard to cluster them into anatomically meaningful tracts, due to wide individual variations in the trajectory and shape of white matter pathways. We propose a novel automatic tract labeling algorithm that fuses information from tractography and multiple hand-labeled fibre tract atlases. As streamline tractography can generate a large number of false positive fibres, we developed a top-down approach to extract tracts consistent with known anatomy, based on a distance metric to multiple hand-labeled atlases. Clustering results from different atlases were fused, using a multi-stage fusion scheme. Our "label fusion" method reliably extracted the major tracts from 105-gradient HARDI scans of 100 young normal adults. © 2012 Springer-Verlag.
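A rough illustration of the distance-to-atlas idea, assuming a symmetric mean closest-point distance between streamlines; the metric, function names and rejection threshold below are illustrative choices, not the paper's exact formulation or its label-fusion scheme.

```python
import numpy as np

def mean_closest_point_distance(fibre_a, fibre_b):
    """Symmetric mean closest-point distance between two streamlines,
    each an (N, 3) array of points along the fibre."""
    d = np.linalg.norm(fibre_a[:, None, :] - fibre_b[None, :, :], axis=2)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

def label_by_nearest_atlas_tract(fibre, atlas_tracts, max_distance):
    """Assign a fibre to the closest hand-labelled atlas tract, or reject it
    (return None) when no tract is close enough -- a crude stand-in for the
    top-down filtering of false-positive streamlines."""
    best_label, best_dist = None, np.inf
    for label, exemplars in atlas_tracts.items():
        for exemplar in exemplars:
            dist = mean_closest_point_distance(fibre, exemplar)
            if dist < best_dist:
                best_label, best_dist = label, dist
    return best_label if best_dist <= max_distance else None
```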
Abstract:
An automated method for extracting brain volumes from three commonly acquired three-dimensional (3D) MR images (proton-density, T1-weighted, and T2-weighted) of the human head is described. The procedure is divided into four levels: preprocessing, segmentation, scalp removal, and postprocessing. A user-provided reference point is the sole operator-dependent input required. The method's parameters were first optimized and then fixed and applied to 30 repeat data sets from 15 normal older adult subjects to investigate its reproducibility. Percent differences between total brain volumes (TBVs) for the subjects' repeated data sets ranged from 0.5% to 2.2%. We conclude that the method is both robust and reproducible and has the potential for wide application.
Abstract:
Currently we are facing an overwhelming growth in the number of reliable information sources on the Internet. The quantity of information available to everyone via the Internet is growing dramatically each year [15]. At the same time, the temporal and cognitive resources of human users are not changing, causing a phenomenon of information overload. The World Wide Web is one of the main sources of information for decision makers (reference to my research). However, our studies show that, at least in Poland, decision makers see some important problems when turning to the Internet as a source of decision information. One of the most common obstacles raised is the distribution of relevant information among many sources, and the resulting need to visit different Web sources in order to collect all important content and analyze it. A few research groups have recently turned to the problem of information extraction from the Web [13]. Most effort so far has been directed toward collecting data from dispersed databases accessible via web pages (referred to as data extraction, or information extraction from the Web) and toward understanding natural-language texts by means of fact, entity, and association recognition (referred to as information extraction). Data extraction efforts show some interesting results, but proper integration of web databases is still beyond us. The information extraction field has recently been very successful in retrieving information from natural-language texts, but it still lacks the ability to understand more complex information, which requires the use of common-sense knowledge, discourse analysis and disambiguation techniques.
Abstract:
We present an empirical evaluation and comparison of two content extraction methods in HTML: absolute XPath expressions and relative XPath expressions. We argue that the relative XPath expressions, although not widely used, should be used in preference to absolute XPath expressions in extracting content from human-created Web documents. Evaluation of robustness covers four thousand queries executed on several hundred webpages. We show that in referencing parts of real world dynamic HTML documents, relative XPath expressions are on average significantly more robust than absolute XPath ones.
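To make the contrast concrete, below is a minimal Python/lxml sketch of the two query styles; the HTML fragment and element names are invented for illustration, not drawn from the evaluated webpages.

```python
from lxml import html

page = html.fromstring("""
<html><body>
  <div class="header">site navigation</div>
  <div class="content"><span id="price">19.99</span></div>
</body></html>
""")

# Absolute XPath: anchored at the document root; it breaks as soon as an
# extra <div> is inserted before the content block.
absolute = page.xpath("/html/body/div[2]/span")

# Relative XPath: anchored on a stable attribute, so it tolerates layout changes.
relative = page.xpath("//span[@id='price']")

print(absolute[0].text, relative[0].text)  # 19.99 19.99
```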
Abstract:
Product reviews are the foremost source of information for customers and manufacturers to help them make appropriate purchasing and production decisions. Natural language data is typically very sparse; the most common words are those that do not carry much semantic content, occurrences of any particular content-bearing word are rare, and co-occurrences of these words are rarer still. Mining product aspects, along with the corresponding opinions, is essential for Aspect-Based Opinion Mining (ABOM), which has gained importance as a result of the e-commerce revolution. The need for automatic mining of reviews has therefore reached a peak. In this work, we treat ABOM as a sequence labelling problem and propose a supervised extraction method to identify product aspects and corresponding opinions. We use Conditional Random Fields (CRFs) to solve the extraction problem and propose a feature function to enhance accuracy. The proposed method is evaluated using two different datasets. We also evaluate the effectiveness of the feature function and of the optimisation through multiple experiments.
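A minimal sketch of the sequence-labelling setup, assuming the sklearn-crfsuite package, BIO-style labels and a generic per-token feature function; the paper's actual feature function, datasets and label scheme are not reproduced here.

```python
import sklearn_crfsuite  # pip install sklearn-crfsuite

def token_features(tokens, i):
    """Simple per-token features for aspect/opinion tagging; a stand-in for
    the richer feature function proposed in the paper."""
    word = tokens[i]
    return {
        "word.lower": word.lower(),
        "word.istitle": word.istitle(),
        "suffix3": word[-3:],
        "prev.lower": tokens[i - 1].lower() if i > 0 else "<BOS>",
        "next.lower": tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>",
    }

# One toy review sentence with BIO labels (B-ASP = aspect, B-OPI = opinion).
sentences = [["The", "battery", "life", "is", "excellent"]]
labels = [["O", "B-ASP", "I-ASP", "O", "B-OPI"]]

X = [[token_features(s, i) for i in range(len(s))] for s in sentences]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
crf.fit(X, labels)
print(crf.predict(X))
```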
Abstract:
As critical infrastructure such as transportation hubs continues to grow in complexity, greater importance is placed on monitoring these facilities to ensure their secure and efficient operation. To achieve these goals, technology continues to evolve in response to the needs of various types of infrastructure. To date, however, surveillance technology has been primarily concerned with security, and little attention has been paid to assisting operations and monitoring performance in real time. Consequently, solutions have emerged to provide real-time measurements of queues and crowding in spaces, but they have been installed as system add-ons (rather than making better use of existing infrastructure), resulting in expensive infrastructure outlay for the owner/operator and an overload of surveillance systems that in itself creates further complexity. Given that many critical infrastructure facilities already have camera networks installed, it is far more desirable to better utilise these networks to address operational monitoring as well as security needs. Recently, a growing number of approaches have been proposed to monitor operational aspects such as pedestrian throughput, crowd size and dwell times. In this paper, we explore how these techniques relate to and complement the more commonly seen security analytics, and we show the value that operational analytics can add by demonstrating their performance on airport surveillance data. We explore how multiple analytics and systems can be combined to better leverage the large amount of data that is available, and we discuss the applicability and resulting benefits of the proposed framework for the ongoing operation of airports and airport networks.
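As a small illustration of the operational analytics referred to above, the sketch below derives dwell times and a simple throughput count from hypothetical timestamped track sightings; the log format and zone names are assumptions, not the output of the systems evaluated in the paper.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical output of an existing tracking system:
# (person_id, zone, timestamp) sightings in a monitored space.
sightings = [
    ("p1", "security_queue", datetime(2015, 6, 1, 8, 0, 5)),
    ("p1", "security_queue", datetime(2015, 6, 1, 8, 6, 40)),
    ("p2", "security_queue", datetime(2015, 6, 1, 8, 1, 0)),
    ("p2", "security_queue", datetime(2015, 6, 1, 8, 4, 30)),
]

# Dwell time: last sighting of a person in a zone minus the first.
times = defaultdict(list)
for person, zone, ts in sightings:
    times[(person, zone)].append(ts)
dwell_times = {key: max(v) - min(v) for key, v in times.items()}

# Throughput: distinct people observed passing through the zone.
throughput = len({p for p, z, _ in sightings if z == "security_queue"})

print(dwell_times)
print("throughput:", throughput, "people")
```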
Abstract:
We present a methodology to extract legal norms from regulatory documents for their formalisation and later compliance checking. The need for the methodology is motivated by the shortcomings of existing approaches, in which the rule type and the process aspects relevant to the rules are largely overlooked. The methodology incorporates the well-known IF...THEN structure, extended with the process aspect and rule type, and guides how to properly extract the conditions and logical structure of the legal rules for reasoning and for modelling obligations for compliance checking.
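One possible way to represent such extracted rules in code is sketched below; the field names and enum values are illustrative assumptions about the extended IF...THEN structure, not a schema prescribed by the methodology.

```python
from dataclasses import dataclass
from enum import Enum

class RuleType(Enum):
    OBLIGATION = "obligation"
    PROHIBITION = "prohibition"
    PERMISSION = "permission"

@dataclass
class LegalRule:
    """An IF...THEN rule extended with its rule type and the process aspect
    (e.g. the task, data object or actor the rule constrains)."""
    conditions: list      # the IF part
    conclusion: str       # the THEN part
    rule_type: RuleType
    process_aspect: str   # e.g. "task: collect application documents"
    source: str = ""      # clause of the regulatory document it came from

rule = LegalRule(
    conditions=["the applicant is under 18 years of age"],
    conclusion="written consent of a legal guardian must be obtained",
    rule_type=RuleType.OBLIGATION,
    process_aspect="task: collect application documents",
    source="Section 4(2)",
)
```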
Abstract:
Purpose: Traditional construction planning relies upon the critical path method (CPM) and bar charts. Both of these methods suffer from visualization and timing issues that could be addressed by 4D technology specifically geared to meet the needs of the construction industry. This paper proposes a new construction planning approach based on simulation using a game engine. Design/methodology/approach: A 4D automatic simulation tool was developed and a case study was carried out. The proposed tool was used to simulate and optimize the plans for the installation of a temporary platform for piling in a civil construction project in Hong Kong. The tool simulated the result of the construction process with three variables: 1) equipment, 2) site layout and 3) schedule. Through this, the construction team was able to repeatedly simulate a range of options. Findings: The results indicate that the proposed approach can provide a user-friendly 4D simulation platform for the construction industry. The simulation can also identify the solution being sought by the construction team. The paper also identifies directions for further development of the 4D technology as an aid in construction planning and decision-making. Research limitations/implications: The tests of the tool are limited to a single case study, and further research is needed to test the use of game engines for construction planning in different construction projects to verify its effectiveness. Future research could also explore the use of alternative game engines and compare their performance and results. Originality/value: The authors propose the use of a game engine to simulate the construction process based on resources, working space and construction schedule. The developed tool can be used by end-users without simulation experience.
Abstract:
INTRODUCTION: There is a large range in the reported prevalence of end plate lesions (EPLs), sometimes referred to as Schmorl's nodes, in the general population (3.8-76%). One possible reason for this large range is the difference in definitions used by authors. Previous research has suggested that EPLs may be a primary disturbance of the growth plates that leads to the onset of scoliosis. The aim of this study was to develop a technique to measure the size, prevalence and location of EPLs on Computed Tomography (CT) images of scoliosis patients in a consistent manner. METHODS: A detection algorithm was developed and applied to measure EPLs for five adolescent females with idiopathic scoliosis (average age 15.1 years, average major Cobb angle 60°). In this algorithm, the EPL definition was based on the lesion depth, the distance from the edge of the vertebral body and the gradient of the lesion edge. Existing low-dose CT scans of the patients' spines were segmented semi-automatically to extract 3D vertebral endplate morphology. Manual sectioning of any attachments between posterior elements of adjacent vertebrae and, if necessary, endplates was carried out before the automatic algorithm was used to determine the presence and position of EPLs. RESULTS: EPLs were identified in 15 of the 170 (8.8%) endplates analysed, with an average depth of 3.1 mm. 73% of the EPLs (11/15) were seen in the lumbar spine. A sensitivity study demonstrated that the algorithm was most sensitive to changes in the minimum gradient required at the lesion edge. CONCLUSION: An imaging analysis technique for consistent measurement of the prevalence, location and size of EPLs on CT images has been developed. Although the technique was tested on scoliosis patients, it can be used to analyse other populations without observer errors in EPL definitions.
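A minimal sketch of the kind of thresholding the detection step implies, with placeholder threshold values rather than those calibrated in the study.

```python
def is_end_plate_lesion(depth_mm, dist_from_edge_mm, edge_gradient,
                        min_depth=1.0, min_dist_from_edge=2.0, min_gradient=0.5):
    """Flag a candidate endplate depression as an EPL when it is deep enough,
    sufficiently far from the vertebral body rim, and its edge gradient is
    steep enough. All thresholds here are placeholders, not the study's values."""
    return (depth_mm >= min_depth
            and dist_from_edge_mm >= min_dist_from_edge
            and edge_gradient >= min_gradient)

print(is_end_plate_lesion(depth_mm=3.1, dist_from_edge_mm=4.0, edge_gradient=0.8))
```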
Abstract:
Background: Recently there have been efforts to derive safe, efficient processes to rule out acute coronary syndrome (ACS) in emergency department (ED) chest pain patients. We aimed to prospectively validate an ACS assessment pathway (the 2-Hour Accelerated Diagnostic Protocol to Assess Patients with Chest Pain Symptoms Using Contemporary Troponins as the Only Biomarker (ADAPT) pathway) under pragmatic ED working conditions. Methods: This prospective cohort study included patients with atraumatic chest pain in whom ACS was suspected but who did not have clear evidence of ischaemia on ECG. Thrombolysis in myocardial infarction (TIMI) score and troponin (TnI Ultra) were measured at ED presentation, 2 h later and according to current national recommendations. The primary outcome of interest was the occurrence of major adverse cardiac events (MACE) including prevalent myocardial infarction (MI) at 30 days in the group who had a TIMI score of 0 and had presentation and 2-h TnI assays <99th percentile. Results: Eight hundred and forty patients were studied of whom 177 (21%) had a TIMI score of 0. There were no MI, MACE or revascularization in the per protocol and intention-to-treat 2-h troponin groups (0%, 95% confidence interval (CI) 0% to 4.5% and 0%, 95% CI 0% to 3.8%, respectively). The negative predictive value (NPV) was 100% (95% CI 95.5% to 100%) and 100% (95% CI 96.2% to 100%), respectively. Conclusions: A 2-h accelerated rule-out process for ED chest pain patients using electrocardiography, a TIMI score of 0 and a contemporary sensitive troponin assay accurately identifies a group at very low risk of 30-day MI or MACE.
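For illustration only, the rule-out arm can be expressed as a simple predicate; the troponin cut-off below is a placeholder rather than the assay's actual 99th percentile, and this sketch is not a substitute for the published pathway.

```python
def adapt_low_risk(timi_score, troponin_0h, troponin_2h,
                   troponin_99th_percentile, ischaemic_ecg):
    """True when the patient meets the accelerated rule-out arm described above:
    no ECG ischaemia, a TIMI score of 0, and both the presentation and 2-hour
    troponin results below the assay's 99th percentile."""
    return (not ischaemic_ecg
            and timi_score == 0
            and troponin_0h < troponin_99th_percentile
            and troponin_2h < troponin_99th_percentile)

# Example with placeholder troponin values and cut-off.
print(adapt_low_risk(timi_score=0, troponin_0h=0.010, troponin_2h=0.012,
                     troponin_99th_percentile=0.028, ischaemic_ecg=False))
```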
The new Vancouver Chest Pain Rule using troponin as the only biomarker: An external validation study
Abstract:
Objectives: To externally evaluate the accuracy of the new Vancouver Chest Pain Rule and to assess its diagnostic accuracy using either sensitive or highly sensitive troponin assays. Methods: Prospectively collected data from 2 emergency departments (EDs) in Australia and New Zealand were analysed. Based on the new Vancouver Chest Pain Rule, low-risk patients were identified using electrocardiogram results, cardiac history, nitrate use, age, pain characteristics and troponin results at 2 hours after presentation. The primary outcome was 30-day diagnosis of acute coronary syndrome (ACS), including acute myocardial infarction and unstable angina. Sensitivity, specificity, positive predictive value and negative predictive value were calculated to assess the accuracy of the new Vancouver Chest Pain Rule using either sensitive or highly sensitive troponin assay results. Results: Of the 1635 patients, 20.4% had an ACS diagnosis at 30 days. Using the highly sensitive troponin assay, 212 (13.0%) patients were eligible for early discharge, of whom 3 (1.4%) were diagnosed with ACS. Sensitivity was 99.1% (95% CI 97.4-99.7), specificity was 16.1% (95% CI 14.2-18.2), the positive predictive value was 23.3% (95% CI 21.1-25.5) and the negative predictive value was 98.6% (95% CI 95.9-99.5). The diagnostic accuracy of the rule was similar using the sensitive troponin assay. Conclusions: The new Vancouver Chest Pain Rule should be used for the identification of low-risk patients presenting to EDs with symptoms of possible ACS, and will reduce the proportion of patients requiring lengthy assessment; however, we recommend further outpatient investigation for coronary artery disease in patients identified as low risk.
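The reported accuracy figures follow from a standard 2x2 table; the sketch below recomputes them from counts reconstructed approximately from the proportions quoted above.

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Standard 2x2-table metrics used to report the rule's performance."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Approximate counts: 1635 patients, ~334 with 30-day ACS,
# 212 classified low risk of whom 3 had ACS.
tp, fn = 334 - 3, 3
tn = 212 - 3
fp = 1635 - 334 - tn
print(diagnostic_accuracy(tp, fp, fn, tn))
# -> roughly (0.991, 0.161, 0.233, 0.986), matching the values reported above
```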
Abstract:
A method for the determination of tricyclazole in water using solid phase extraction (SPE) and high performance liquid chromatography (HPLC) with UV detection at 230 nm and a mobile phase of acetonitrile:water (20:80, v/v) was developed. A performance comparison between two types of solid phase sorbents, the C18 sorbent of the Supelclean ENVI-18 cartridge and the styrene-divinylbenzene copolymer sorbent of the Sep-Pak PS2-Plus cartridge, was conducted. The Sep-Pak PS2-Plus cartridges were found more suitable for extracting tricyclazole from water samples than the Supelclean ENVI-18 cartridges. For this cartridge, both methanol and ethyl acetate produced good results. The method was validated with good linearity and a limit of detection of 0.008 µg L-1 for a 500-fold concentration through the SPE procedure. The recoveries of the method were stable at 80% and the precision ranged from 1.1% to 6.0% within the range of fortified concentrations. The validated method was also applied to measure the concentrations of tricyclazole in real paddy water.
Abstract:
Cyclists are among the most vulnerable road users. Many recent interventions have aimed at improving their safety on the road, such as the minimum overtaking distance rule introduced in Queensland in 2014. Smartphones offer excellent opportunities for technical road-safety interventions at a limited cost. Indeed, they have substantial processing power and many embedded sensors that allow a rider's (or driver's) motion, behaviour and environment to be analysed; this is especially relevant for cyclists, as they do not have the space or power allowance found in most motor vehicles. The aim of the study presented in this paper is to assess cyclists' support for a range of new smartphone-based safety technologies. The preliminary results of an online survey of cyclists recruited from Bicycle Queensland and Triathlon Queensland (N=191) are presented. A number of innovative safety systems were assessed, such as automatic logging of incidents without injuries, reporting of dangerous areas via a website/app, automatic notification of emergency services in case of a crash or fall, and advanced navigation apps. A significant part of the survey is dedicated to GoSafeCycle, a cooperative collision prevention app based on motion tracking and Wi-Fi communications developed at CARRS-Q. Results show a marked preference for automatic detection and notification of emergencies (62-70% positive assessment) and GoSafeCycle (61.7% positive assessment), as well as reporting apps (59.1% positive assessment). Such findings are important in the context of the current promotion of active transport and highlight the need for further development of systems supported by the general public.
Abstract:
Frog protection has become increasingly important due to the rapid decline in frog biodiversity. It is therefore valuable to develop new methods for studying this biodiversity. In this paper, a novel feature extraction method based on perceptual wavelet packet decomposition is proposed for classifying frog calls in noisy environments. Pre-processing and syllable segmentation are first applied to the frog call. Then, a spectral peak track is extracted from each syllable where possible. Track duration, dominant frequency and oscillation rate are extracted directly from the track. With the k-means clustering algorithm, the calculated dominant frequencies of all frog species are clustered into k parts, which produces a frequency scale for wavelet packet decomposition. Based on this adaptive frequency scale, wavelet packet decomposition is applied to the frog calls. Using the wavelet packet decomposition coefficients, a new feature set named perceptual wavelet packet decomposition sub-band cepstral coefficients is extracted. Finally, a k-nearest neighbour (k-NN) classifier is used for the classification. The experimental results show that the proposed features can achieve an average classification accuracy of 97.45%, which outperforms syllable features (86.87%) and Mel-frequency cepstral coefficient (MFCC) features (90.80%).
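A minimal sketch of the two machine-learning stages named above, i.e. k-means over dominant frequency to build the adaptive frequency scale and k-NN for the final classification, using scikit-learn and placeholder data rather than the wavelet-derived cepstral features themselves.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical dominant frequencies (Hz) pooled over all species' syllables.
dominant_freqs = np.array([[850.0], [920.0], [1800.0], [2100.0], [3500.0], [3650.0]])

# k-means over dominant frequency yields the adaptive band centres
# ("frequency scale") that drive the wavelet packet decomposition.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(dominant_freqs)
band_centres = np.sort(kmeans.cluster_centers_.ravel())
print("band centres (Hz):", band_centres)

# Final stage: k-NN over per-syllable feature vectors (random placeholders
# standing in for the perceptual wavelet packet sub-band cepstral coefficients).
rng = np.random.default_rng(0)
X = rng.random((20, 12))            # 20 syllables x 12 features
y = rng.integers(0, 4, size=20)     # 4 frog species
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict(X[:2]))
```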
Abstract:
In this report, an artificial neural network (ANN)-based automated emergency landing site selection system for unmanned aerial vehicles (UAVs) and general aviation (GA) is described. The system aims to increase the safety of UAV operation by emulating pilot decision making in emergency landing scenarios, using an ANN to select a safe landing site from the available candidates. The strength of an ANN in modelling complex input relationships makes it well suited to handling the multicriteria decision making (MCDM) process of emergency landing site selection. The ANN operates by identifying the more favorable of two landing sites when provided with an input vector derived from both landing sites' parameters, the aircraft's current state and wind measurements. The system consists of a feed-forward ANN, a pre-processor class which produces ANN input vectors, and a class in charge of creating a ranking of landing site candidates using the ANN. The system was successfully implemented in C++ using the FANN C++ library and ROS. Results obtained from ANN training and from simulations using landing sites randomly generated by a site detection simulator verify the feasibility of an ANN-based automated emergency landing site selection system.
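The pairwise-comparison ranking can be sketched independently of the C++/FANN/ROS implementation named above; in the Python sketch below a hand-written weighted score stands in for the trained network, and all site attributes and weights are hypothetical.

```python
from functools import cmp_to_key

def compare_sites(site_a, site_b, aircraft_state, wind):
    """Stand-in for the trained ANN comparator: return -1 when site_a is the
    more favorable of the pair, +1 otherwise. A fixed weighted score replaces
    the network purely for illustration."""
    def score(site):
        return (2.0 * site["surface_quality"]
                - 1.0 * site["distance_km"]
                - 0.5 * site["obstacle_density"])
    return -1 if score(site_a) >= score(site_b) else 1

def rank_sites(candidates, aircraft_state, wind):
    """Rank landing-site candidates with the pairwise comparator, mirroring
    how the ANN is applied to pairs to build an overall ranking."""
    cmp = lambda a, b: compare_sites(a, b, aircraft_state, wind)
    return sorted(candidates, key=cmp_to_key(cmp))

candidates = [
    {"name": "field_A", "surface_quality": 0.9, "distance_km": 2.0, "obstacle_density": 0.1},
    {"name": "road_B", "surface_quality": 0.6, "distance_km": 0.5, "obstacle_density": 0.4},
]
print([s["name"] for s in rank_sites(candidates, aircraft_state=None, wind=None)])
```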