302 results for Automatic Generation
Abstract:
A long query provides more useful hints for searching relevant documents, but it is likely to introduce noise that affects retrieval performance. To mitigate this adverse effect, it is important to reduce noisy terms and to introduce and boost additional relevant terms. This paper presents a comprehensive framework, called the Aspect Hidden Markov Model (AHMM), which integrates query reduction and expansion for retrieval with long queries. It optimizes the probability distribution of query terms by utilizing intra-query term dependencies as well as the relationships between query terms and words observed in relevance feedback documents. Empirical evaluation on three large-scale TREC collections demonstrates that our approach, which is automatic, achieves salient improvements over various strong baselines, and also reaches performance comparable to a state-of-the-art method based on a user's interactive query term reduction and expansion.
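As a hedged illustration of the general idea behind such term re-weighting (not the AHMM estimation itself), the sketch below boosts query terms that are well supported by pseudo-relevance-feedback documents and damps the rest; the `term_weights` function, the floor parameter, and the toy query and documents are all invented for this example.

```python
# Minimal sketch (not the AHMM itself): re-weight long-query terms by how
# strongly they are supported by pseudo-relevance-feedback documents, so that
# noisy terms are damped and well-supported terms are boosted.
from collections import Counter

def term_weights(query_terms, feedback_docs, floor=0.05):
    """query_terms: list[str]; feedback_docs: list of token lists (hypothetical inputs)."""
    doc_freq = Counter()
    for doc in feedback_docs:
        doc_freq.update(set(doc))
    n_docs = max(len(feedback_docs), 1)
    # Support of each query term = fraction of feedback documents containing it,
    # floored so that no term is removed outright.
    raw = {t: max(doc_freq[t] / n_docs, floor) for t in query_terms}
    total = sum(raw.values())
    return {t: w / total for t, w in raw.items()}  # normalised distribution over query terms

# Example: a verbose query against three toy feedback documents.
query = ["effects", "of", "noise", "on", "retrieval", "performance"]
docs = [["noise", "retrieval"], ["retrieval", "performance"], ["noise", "performance"]]
print(term_weights(query, docs))
```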
Abstract:
This is a discussion of the journal article: "Constructing summary statistics for approximate Bayesian computation: semi-automatic approximate Bayesian computation". The article and discussion have appeared in the Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Abstract:
Distributed generation (DG) resources are commonly used in electric systems, and one of their benefits is the reduction of line losses in radial distribution systems. Studies have shown the importance of appropriate selection of the location and size of DGs. This paper proposes an analytical method for solving the optimal distributed generation placement (ODGP) problem to minimize line losses in radial distribution systems, using a loss sensitivity factor (LSF) based on the bus-injection to branch-current (BIBC) matrix. The proposed method is formulated and tested on 12- and 34-bus radial distribution systems. The classical grid search algorithm based on successive load flows is employed to validate the results. The main advantages of the proposed method over other conventional methods are its robustness and the fact that it does not require calculating and inverting large admittance or Jacobian matrices. As a result, the simulation time and the amount of computer memory required for processing data, especially for large systems, decrease.
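To make the BIBC-based loss calculation concrete, here is a minimal sketch that builds the bus-injection to branch-current matrix of a tiny radial feeder and uses it to obtain branch currents and total line losses; the 4-bus feeder data, the `build_bibc` helper and the constant-current load assumption are illustrative and are not taken from the paper.

```python
# Minimal sketch (simplified, single-phase, constant-current loads): build the
# bus-injection to branch-current (BIBC) matrix of a small radial feeder and use
# it to evaluate branch currents and total line losses.
import numpy as np

# Branch list as (from_bus, to_bus, resistance_ohm); bus 1 is the slack/source.
branches = [(1, 2, 0.10), (2, 3, 0.15), (3, 4, 0.20)]
load_buses = [2, 3, 4]                      # buses with injections
I_load = np.array([0.8, 0.5, 0.6])          # bus injection currents (A), hypothetical

def build_bibc(branches, load_buses):
    """BIBC[k, j] = 1 if branch k carries the injection of load bus j."""
    parent = {to: frm for frm, to, _ in branches}
    branch_index = {to: k for k, (_, to, _) in enumerate(branches)}  # branch named by receiving bus
    bibc = np.zeros((len(branches), len(load_buses)))
    for j, bus in enumerate(load_buses):
        node = bus
        while node in parent:               # walk up to the source, marking traversed branches
            bibc[branch_index[node], j] = 1.0
            node = parent[node]
    return bibc

BIBC = build_bibc(branches, load_buses)
I_branch = BIBC @ I_load                    # branch currents from bus injections
R = np.array([r for _, _, r in branches])
print("BIBC =\n", BIBC)
print("total line loss (W):", np.sum(R * I_branch**2))
```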
Abstract:
We present an approach to automatically de-identify health records. In our approach, personal health information is identified using a Conditional Random Fields machine learning classifier, a large set of linguistic and lexical features, and pattern matching techniques. The identified personal information is then removed from the reports. The de-identification of personal health information is fundamental for the sharing and secondary use of electronic health records, for example for data mining and disease monitoring. The effectiveness of our approach is first evaluated on the 2007 i2b2 Shared Task dataset, a widely adopted dataset for evaluating de-identification techniques. Subsequently, we investigate the robustness of the approach to limited training data, and we study its effectiveness on data of different type and quality by evaluating the approach on scanned pathology reports from an Australian institution. These data contain optical character recognition errors, as well as linguistic conventions that differ from those in the i2b2 dataset, for example different date formats. The findings suggest that our approach is comparable to the best approach from the 2007 i2b2 Shared Task; in addition, the approach is found to be robust to variations in training size, data type and quality in the presence of sufficient training data.
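As a hedged sketch of the pattern-matching component only (the paper's main classifier is a Conditional Random Fields model, which is not reproduced here), the snippet below masks two illustrative categories of personal information with regular expressions; the pattern set and the `deidentify` helper are invented for this example.

```python
# Minimal sketch of the pattern-matching component only. The regexes below are
# illustrative, covering two date formats and a phone-like number; a full
# de-identifier would combine many such patterns with a learned classifier.
import re

PATTERNS = {
    "DATE": re.compile(r"\b(\d{1,2}[/-]\d{1,2}[/-]\d{2,4}|\d{4}-\d{2}-\d{2})\b"),
    "PHONE": re.compile(r"\b\d{4}\s?\d{3}\s?\d{3}\b"),
}

def deidentify(text):
    """Replace matched spans with a category placeholder, e.g. [DATE]."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(deidentify("Reviewed on 12/03/2006; callback 0412 345 678 requested."))
```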
Abstract:
Objective: To develop and evaluate machine learning techniques that identify limb fractures and other abnormalities (e.g. dislocations) from radiology reports. Materials and Methods: 99 free-text reports of limb radiology examinations were acquired from an Australian public hospital. Two clinicians were employed to identify fractures and abnormalities from the reports; a third senior clinician resolved disagreements. These assessors found that, of the 99 reports, 48 referred to fractures or abnormalities of limb structures. Automated methods were then used to extract features from these reports that could be useful for their automatic classification. The Naive Bayes classification algorithm and two implementations of the support vector machine algorithm were formally evaluated using cross-fold validation over the 99 reports. Results: Results show that the Naive Bayes classifier accurately identifies fractures and other abnormalities from the radiology reports. These results were achieved when extracting stemmed token bigram and negation features, as well as using these features in combination with SNOMED CT concepts related to abnormalities and disorders. The latter feature has not been used in previous work that attempted to classify free-text radiology reports. Discussion: Automated classification methods have proven effective at identifying fractures and other abnormalities from radiology reports (F-measure up to 92.31%). Key to the success of these techniques are features such as stemmed token bigrams, negations, and SNOMED CT concepts associated with morphologic abnormalities and disorders. Conclusion: This investigation shows early promising results, and future work will further validate and strengthen the proposed approaches.
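The following sketch shows a classifier in the spirit of the pipeline described above: a Naive Bayes model over token n-gram counts evaluated with cross-validation. It omits stemming, negation detection and the SNOMED CT features, and the four toy reports and their labels are invented.

```python
# Minimal sketch (toy data, no SNOMED CT features): a Naive Bayes classifier
# over token n-gram counts with cross-validation.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

reports = [
    "transverse fracture of the distal radius",
    "no acute fracture or dislocation identified",
    "dislocation of the elbow joint with joint effusion",
    "normal alignment, no bony abnormality seen",
]
labels = [1, 0, 1, 0]   # 1 = fracture/abnormality present

pipeline = make_pipeline(
    CountVectorizer(ngram_range=(1, 2)),   # unigram + bigram token features
    MultinomialNB(),
)
scores = cross_val_score(pipeline, reports, labels, cv=2)
print("cross-validated accuracy:", scores.mean())
```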
Abstract:
Synthetic hydrogels selectively decorated with cell adhesion motifs are rapidly emerging as promising substrates for 3D cell culture. When cells are grown in 3D, they experience potentially more physiologically relevant cell-cell interactions and physical cues compared with traditional 2D cell culture on stiff surfaces. A newly developed polymer based on poly(2-oxazoline)s has been used for the first time to control the attachment of fibroblast cells, and is discussed here for its potential use in 3D cell culture, with a particular focus on cancer cells, towards the ultimate aim of high-throughput screening of anti-cancer therapies. The advantages and limitations of using poly(2-oxazoline) hydrogels are discussed and compared with more established polymers, especially polyethylene glycol (PEG).
Abstract:
Next Generation Sequencing (NGS) has revolutionised molecular biology, resulting in an explosion of data sets and an increasing role in clinical practice. Such applications necessarily require rapid identification of the organism as a prelude to annotation and further analysis. NGS data consist of a substantial number of short sequence reads, given context through downstream assembly and annotation, a process requiring reads consistent with the assumed species or species group. Highly accurate results have been obtained for restricted sets using SVM classifiers, but such methods are difficult to parallelise and success depends on careful attention to feature selection. This work examines the problem at very large scale, using a mix of synthetic and real data with a view to determining the overall structure of the problem and the effectiveness of parallel ensembles of simpler classifiers (principally random forests) in addressing the challenges of large scale genomics.
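A minimal sketch of the kind of simple, easily parallelised classifier discussed above: short reads are represented by k-mer counts and classified with a random forest. The reads, labels and k-mer length are synthetic choices made for this illustration, not data or settings from the study.

```python
# Minimal sketch (synthetic reads): represent each short read by k-mer counts
# and classify with a random forest.
from itertools import product
import numpy as np
from sklearn.ensemble import RandomForestClassifier

K = 3
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]

def kmer_counts(read):
    """Count occurrences of every length-K substring over the ACGT alphabet."""
    counts = dict.fromkeys(KMERS, 0)
    for i in range(len(read) - K + 1):
        kmer = read[i:i + K]
        if kmer in counts:
            counts[kmer] += 1
    return [counts[k] for k in KMERS]

reads = ["ACGTACGTACGT", "ACGTACGAACGT", "GGCCGGCCGGCC", "GGCCGGTCGGCC"]
labels = [0, 0, 1, 1]                      # two hypothetical source organisms
X = np.array([kmer_counts(r) for r in reads])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
print(clf.predict([kmer_counts("ACGTACGTACGA")]))   # most trees should assign organism 0
```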
Abstract:
Platelet-derived microparticles (PMPs), which are produced during platelet activation, contribute to coagulation [1] and bind to traumatized endothelium in an animal model [2]. Such endothelial injury occurs during percutaneous transluminal coronary angioplasty (PTCA), a procedure which restores the diameter of occluded coronary arteries using balloon inflations. However, re-occlusions subsequently develop in 20-25% of patients [3], although this is limited by treatment with anti-platelet glycoprotein IIb/IIIa receptor drugs such as abciximab [4]. Abciximab only partially decreases the need for revascularisation [5], and therefore other mechanisms appear to be involved. As platelet activation occurs during PTCA, it is likely that PMPs are produced and contribute to restenosis. The study population consisted of 113 PTCA patients, of whom 38 received abciximab. Paired peripheral arterial blood samples were obtained from the PTCA sheath: 1) following heparinisation (baseline); and 2) subsequent to all vessel manipulation (post-PTCA). Blood was prepared with an anti-CD61 (glycoprotein IIIa) fluorescence-conjugated antibody to identify PMPs using flow cytometry, and PMP results were expressed as a percentage of all CD61 events. The level of PMPs increased significantly from baseline following PTCA in the group without abciximab (paired t-test, P=0.019). However, there was no significant change in the level of PMPs following PTCA in patients who received abciximab. Baseline clinical characteristics were similar between patient groups, although patients administered abciximab had more complex PTCA procedures, such as increased balloon inflation pressures (ANOVA, P=0.0219). In this study, we have clearly demonstrated that the level of CD61-positive PMPs increased during PTCA. This trend has been demonstrated previously, although a low sample size prevented statistical significance from being attained [6]. The results of our work also demonstrate that there was no increase in PMPs after PTCA with abciximab treatment. The increased PMPs may adhere to traumatized endothelium, contributing to re-occlusion of the arteries, but this remains to be determined. References: (1) Holme PA, Brosstad F, Solum NO. Blood Coagulation and Fibrinolysis. 1995;6:302-310. (2) Merten M, Pakala R, Thiagarajan P, Benedict CR. Circulation. 1999;99:2577-2582. (3) Califf RM. American Heart Journal. 1995;130:680-684. (4) Coller BS, Scudder LE. Blood. 1985;66:1456-1459. (5) Topol EJ, Califf RM, Weisman HF, Ellis SG, Tcheng JE, Worley S, Ivanhoe R, George BS, Fintel D, Weston M, Sigmon K, Anderson KM, Lee KL, Willerson JT, on behalf of the EPIC investigators. Lancet. 1994;343:881-886. (6) Scharf RE, Tomer A, Marzec UM, Teirstein PS, Ruggeri ZM, Harker LA. Arteriosclerosis and Thrombosis. 1992;12:1475-1487.
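The statistical comparison reported above (baseline versus post-PTCA PMP levels) can be reproduced in outline with a paired t-test; in the sketch below the eight paired values are invented for illustration and are not the study's measurements.

```python
# Minimal sketch of the reported comparison: a paired t-test of PMP levels at
# baseline versus post-PTCA. The values below are invented, not the study data.
from scipy import stats

baseline_pmp = [2.1, 1.8, 2.5, 3.0, 2.2, 1.9, 2.7, 2.4]   # % of CD61 events, hypothetical
post_ptca_pmp = [2.6, 2.3, 2.9, 3.4, 2.8, 2.1, 3.1, 2.9]

t_stat, p_value = stats.ttest_rel(baseline_pmp, post_ptca_pmp)
print(f"paired t = {t_stat:.2f}, P = {p_value:.4f}")
```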
Abstract:
This special issue of Networking Science focuses on the Next Generation Network (NGN), which enables the deployment of access-independent services over converged fixed and mobile networks. NGN is a packet-based network and uses the Internet Protocol (IP) to transport various types of traffic (voice, video, data and signalling). NGN facilitates the easy adoption of distributed computing applications by providing high-speed connectivity in a converged networked environment. It also makes end-user devices and applications highly intelligent and efficient by empowering them with programmability and remote configuration options. However, there are a number of important challenges in provisioning next generation network technologies in a converged communication environment. Preliminary challenges include those relating to QoS, switching and routing, management and control, and security, which must be addressed urgently. The consideration of architectural issues in the design and provision of secure services for NGN deserves special attention and hence is the main theme of this special issue.
Abstract:
Given the increased importance of adaptation debates in global climate negotiations, pressure to achieve biodiversity, food and water security through managed landscape-scale adaptation is likely to increase across the globe over the coming decade. In parallel, emerging market-based, terrestrial greenhouse gas abatement (GGA) programs present a real opportunity to secure such adaptation to climate change through enhanced landscape resilience. Australia has an opportunity to take advantage of such programs through the regional planning aspects of its governance arrangements for natural resource management (NRM). This paper explores the reforms necessary to ensure that Australia's regional NRM planning systems are better able to direct the nation's emerging GGA programs towards enhanced landscape adaptation. © 2013 Planning Institute Australia.
Abstract:
In recent years, the trade-off between flexibility and support has become a leading issue in workflow technology. In this paper we show how an imperative modeling approach used to define stable and well-understood processes can be complemented by a modeling approach that enables automatic process adaptation and exploits planning techniques to deal with environmental changes and exceptions that may occur during process execution. To this end, we designed and implemented a Custom Service that allows the Yawl execution environment to delegate the execution of subprocesses and activities to the SmartPM execution environment, which is able to automatically adapt a process to deal with emerging changes and exceptions. We demonstrate the feasibility and validity of the approach by showing the design and execution of an emergency management process defined for train derailments.
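Purely as an illustration of the delegation pattern described above, and explicitly not the actual YAWL Custom Service interface or the SmartPM API, the sketch below shows a host engine handing a subprocess to a second, adaptive engine and collecting its outcome; all class and method names are hypothetical.

```python
# Purely illustrative sketch of the delegation pattern; this is NOT the actual
# YAWL Custom Service interface or the SmartPM API.
class AdaptiveEngineStub:
    """Stands in for an engine that can adapt a subprocess when exceptions arise."""
    def execute(self, subprocess_id, context):
        # A real adaptive engine would plan, monitor and re-plan on exceptions;
        # here we simply report successful completion.
        return {"status": "completed", "subprocess": subprocess_id, "context": context}

class DelegatingService:
    """Receives a work item from the host engine and delegates its execution."""
    def __init__(self, adaptive_engine):
        self.adaptive_engine = adaptive_engine

    def handle_work_item(self, work_item):
        outcome = self.adaptive_engine.execute(
            work_item["subprocess"], work_item.get("context", {})
        )
        return outcome  # reported back to the host engine as the work item outcome

service = DelegatingService(AdaptiveEngineStub())
print(service.handle_work_item({"subprocess": "assess_derailment_site"}))
```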
Abstract:
Molecular orbital calculations have predicted the stability of a range of connectivities on the radical C5H potential surface. The most energetically favorable of these include the linear C4CH geometry and two ring-chain structures, HC2C3 and C2C3H. The corresponding anions are also shown to be theoretically stable, and furthermore, a fourth isomer, C2CHC2, is predicted to be the most stable anion connectivity. These results have motivated experimental efforts. Methodologies for the generation of the non-ring-containing isomeric anions C4CH and C2CHC2 have been developed utilizing negative ion mass spectrometry. The absolute connectivities of the anions have been established using deuterium labeling, charge reversal, and neutralization-reionization techniques. The success of the latter experiment confirms theoretical predictions of the stability of the corresponding neutral species. This is the first reported observation of the neutral C2CHC2 species, which calculations predict to be substantially less stable than the C4CH connectivity but still bound with respect to isomerization processes.
Abstract:
The dicoordinated borinium ion dihydroxyborinium, B(OH)(2)(+), is generated from methyl boronic acid, CH3B(OH)(2), by dissociative electron ionization, and its connectivity is confirmed by collisional activation. Neutralization-reionization (NR) experiments on this ion indicate that the neutral B(OH)(2) radical is a viable species in the gas phase. Both vertical neutralization of B(OH)(2)(+) and reionization of B(OH)(2) in the NR experiment are, however, associated with particularly unfavorable Franck-Condon factors. The differences in adiabatic and vertical electron transfer behavior can be traced back to a particular pi stabilization of the cationic species compared to the sp(2)-type neutral radical. Thermochemical data on several neutral and cationic boron compounds are presented, based on calculations performed at the G2 level of theory.
Abstract:
This paper describes a novel system for the automatic classification of images obtained from Anti-Nuclear Antibody (ANA) pathology tests on Human Epithelial type 2 (HEp-2) cells using the Indirect Immunofluorescence (IIF) protocol. The IIF protocol on HEp-2 cells has been the hallmark method for identifying the presence of ANAs, due to its high sensitivity and the large range of antigens that can be detected. However, it suffers from numerous shortcomings, such as being subjective as well as time and labour intensive. Computer Aided Diagnostic (CAD) systems have been developed to address these problems; they automatically classify a HEp-2 cell image into one of its known patterns (e.g. speckled, homogeneous). Most existing CAD systems use handpicked features to represent a HEp-2 cell image, which may only work in limited scenarios. We propose a novel automatic cell image classification method termed Cell Pyramid Matching (CPM), which is composed of regional histograms of visual words coupled with the Multiple Kernel Learning framework. We present a study of several variations of generating histograms and show the efficacy of the system on two publicly available datasets: the ICPR HEp-2 cell classification contest dataset and the SNPHEp-2 dataset.
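As a hedged sketch of the regional-histogram idea underlying CPM (without the codebook learning or the Multiple Kernel Learning stage), the snippet below builds per-region histograms of precomputed visual-word assignments at two pyramid levels and concatenates them; the word map, codebook size and pyramid levels are hypothetical.

```python
# Minimal sketch of regional histograms of visual words: given a per-patch
# visual-word assignment map (assumed precomputed by some codebook), build one
# histogram per spatial region at two pyramid levels and concatenate them.
import numpy as np

def regional_histograms(word_map, n_words, levels=(1, 2)):
    """word_map: 2D array of visual-word indices; returns concatenated region histograms."""
    feats = []
    for level in levels:                       # level = number of regions per axis
        h_step = word_map.shape[0] // level
        w_step = word_map.shape[1] // level
        for i in range(level):
            for j in range(level):
                region = word_map[i*h_step:(i+1)*h_step, j*w_step:(j+1)*w_step]
                hist = np.bincount(region.ravel(), minlength=n_words).astype(float)
                feats.append(hist / max(hist.sum(), 1.0))   # per-region normalisation
    return np.concatenate(feats)

word_map = np.array([[0, 1, 1, 2],
                     [0, 0, 2, 2],
                     [3, 3, 1, 0],
                     [3, 0, 0, 1]])
print(regional_histograms(word_map, n_words=4).shape)   # (1 + 4 regions) x 4 words = (20,)
```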