987 results for SOURCE EXTRACTION
Abstract:
Several I- and A-type granite and syenite plutons and spatially associated, giant Fe–Ti–V deposit-bearing mafic–ultramafic layered intrusions occur in the Pan–Xi (Panzhihua–Xichang) area within the inner zone of the Emeishan large igneous province (ELIP). These complexes are interpreted to be related to the Emeishan mantle plume. We present LA-ICP-MS and SIMS zircon U–Pb ages and Hf–Nd isotopic compositions for the gabbros, syenites and granites from these complexes. The dating shows that the age of the felsic intrusive magmatism (256.2 ± 3.0 to 259.8 ± 1.6 Ma) is indistinguishable from that of the mafic intrusive magmatism (255.4 ± 3.1 to 259.5 ± 2.7 Ma) and represents the final phase of a continuous magmatic episode that lasted no more than 10 Myr. The upper gabbros in the mafic–ultramafic intrusions are generally more isotopically enriched (lower εNd and εHf) than the middle and lower gabbros, suggesting that the upper gabbros have experienced a higher degree of crustal contamination than the lower gabbros. The significantly positive εHf(t) values of the A-type granites and syenites (+4.9 to +10.8) are higher than those of the upper gabbros of the associated mafic intrusion, showing that they cannot have been derived by fractional crystallization of these bodies. They are, however, identical to those of the mafic enclaves (+7.0 to +11.4) and of the middle and lower gabbros, implying that they are cogenetic. We suggest that they were generated by fractionation of large-volume, plume-related basaltic magmas that ponded deep in the crust. The deep-seated magma chamber erupted in two stages: the first near a density minimum in the basaltic fractionation trend, and the second during the final stage of fractionation, when the magma was a low-density, Fe-poor, Si-rich felsic magma. The basaltic magmas emplaced in the shallow-level magma chambers differentiated to form mafic–ultramafic layered intrusions, accompanied by a small amount of crustal assimilation through roof melting. Evolved A-type granites (syenites and syenodiorites) were produced dominantly by crystallization in the deep crustal magma chamber. In contrast, the I-type granites have negative εNd(t) (−6.3 to −7.5) and εHf(t) (−1.3 to −6.7) values, with two-stage Nd model ages (T_DM2(Nd)) of 1.63–1.67 Ga and Hf model ages (T_DM2(Hf)) of 1.56–1.58 Ga, suggesting that they were mainly derived from partial melting of Mesoproterozoic crust. In combination with previous studies, this study shows that plume activity gave rise not only to reworking of ancient crust but also to significant growth of juvenile crust in the center of the ELIP.
Abstract:
This paper presents the design of μAV, a palm-sized open-source micro quadrotor built on a single printed circuit board. The aim of the micro quadrotor is to provide a lightweight (approximately 86 g) and cheap robotic research platform that can be used for a range of robotic applications, one possibility being a cheap test bed for robotic swarm research. The goal of this paper is to give an overview of the design and capabilities of the micro quadrotor. The platform is equipped with a 9-degree-of-freedom inertial measurement unit and a Gumstix Overo® computer-on-module, which can run the widely used Robot Operating System (ROS) for integration with other research algorithms.
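Because the computer-on-module runs ROS, research code can talk to the vehicle through ordinary ROS topics. Below is a minimal sketch of such a node in Python (rospy); the topic name /imu/data is an assumption, since the actual topic depends on how the flight stack is configured.

```python
#!/usr/bin/env python
# Minimal ROS node sketch: subscribe to the quadrotor's 9-DOF IMU and
# log the body-frame angular rates. The topic name is hypothetical.
import rospy
from sensor_msgs.msg import Imu

def on_imu(msg):
    w = msg.angular_velocity
    rospy.loginfo("gyro: %.3f %.3f %.3f rad/s", w.x, w.y, w.z)

if __name__ == "__main__":
    rospy.init_node("imu_listener")
    rospy.Subscriber("/imu/data", Imu, on_imu)  # assumed topic name
    rospy.spin()
```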
Abstract:
Textual document sets have become an important and rapidly growing information source on the web. Text classification is one of the crucial technologies for information organisation and management, and it has attracted wide attention from researchers in different fields. This paper first introduces many feature selection methods, their implementation algorithms, and applications of text classification. However, the knowledge extracted by current data-mining techniques for text classification contains much noise, which leads to uncertainty in the classification process arising from both knowledge extraction and knowledge usage; more innovative techniques and methods are therefore needed to improve the performance of text classification. Further improving the process of knowledge extraction and the effective utilization of the extracted knowledge remains a critical and challenging step. A Rough Set decision-making approach is proposed that uses Rough Set decision techniques to classify more precisely those textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate the Rough Set concepts and the decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric, named CEI, which is very effective for performance assessment in similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining and other related fields.
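For readers unfamiliar with the Rough Set machinery referred to above, the sketch below computes the standard lower and upper approximations on a toy document table; the attributes, documents and target class are invented for illustration. Documents in the lower approximation certainly belong to the class, while those between the two approximations form the uncertain boundary region that a decision approach like this targets.

```python
# A minimal sketch of Rough Set lower/upper approximations (toy data).

def partition(universe, attrs, table):
    """Group objects that are indiscernible on the chosen attributes."""
    blocks = {}
    for x in universe:
        key = tuple(table[x][a] for a in attrs)
        blocks.setdefault(key, set()).add(x)
    return list(blocks.values())

def approximations(universe, attrs, table, target):
    """Lower: blocks wholly inside the target class; upper: blocks
    that merely overlap it. Their difference is the boundary region."""
    lower, upper = set(), set()
    for block in partition(universe, attrs, table):
        if block <= target:
            lower |= block
        if block & target:
            upper |= block
    return lower, upper

# Toy "documents" described by two binary features.
table = {
    "d1": {"sports": 1, "finance": 0},
    "d2": {"sports": 1, "finance": 0},
    "d3": {"sports": 0, "finance": 1},
}
target = {"d1", "d3"}  # documents labelled relevant
lower, upper = approximations(table, ["sports", "finance"], table, target)
print(sorted(lower))   # ['d3'] -- certainly relevant
print(sorted(upper))   # ['d1', 'd2', 'd3'] -- d1 and d2 are
                       # indiscernible, so both fall in the boundary
```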
Abstract:
In recent years, Web 2.0 has provided considerable facilities for people to create, share and exchange information and ideas. As a result, user-generated content, such as reviews, has exploded. Such data provide a rich source to exploit in order to identify the information associated with specific reviewed items. Opinion mining has been widely used to identify the significant features of items (e.g., cameras) based upon user reviews. Feature extraction is the most critical step in identifying useful information from texts. Most existing approaches only find individual features of a product without revealing the structural relationships that usually exist between the features. In this paper, we propose an approach to extract features and feature relationships, represented as a tree structure called a feature taxonomy, based on frequent patterns and associations between patterns derived from user reviews. The generated feature taxonomy profiles the product at multiple levels and provides more detailed information about it. Our experimental results, based on some popularly used review datasets, show that the proposed approach is able to capture product features and relations effectively.
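The paper's pattern mining and association analysis are richer than can be shown here, but the toy sketch below conveys the core construction: mine frequent feature patterns from review sentences, then attach each pattern to its most general frequent subset, so that broader features ("battery") sit above more specific ones ("battery life"). The sentences and support threshold are invented.

```python
from itertools import combinations

# Toy feature-taxonomy construction from frequent patterns.
sentences = [
    {"battery", "life"}, {"battery", "life"}, {"battery", "charger"},
    {"lens", "zoom"}, {"lens", "zoom"}, {"lens"},
]
min_support = 2

# Brute-force frequent-itemset mining (fine at toy scale).
items = sorted(set().union(*sentences))
frequent = set()
for r in range(1, len(items) + 1):
    for cand in combinations(items, r):
        s = frozenset(cand)
        if sum(1 for sent in sentences if s <= sent) >= min_support:
            frequent.add(s)

# Parent of a pattern = its largest frequent proper subset.
for p in sorted(frequent, key=len):
    subsets = [q for q in frequent if q < p]
    parent = max(subsets, key=len) if subsets else None
    print(set(p), "<-", set(parent) if parent else "ROOT")
```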
Abstract:
As of today, opinion mining has been widely used to identify the strengths and weaknesses of products (e.g., cameras) or services (e.g., services in medical clinics or hospitals) based upon people's feedback, such as user reviews. Feature extraction is a crucial step in opinion mining and is used to collect useful information from user reviews. Most existing approaches only find individual features of a product without the structural relationships that usually exist between the features. In this paper, we propose an approach to extract features and feature relationships, represented as a tree structure called a feature hierarchy, based on frequent patterns and associations between patterns derived from user reviews. The generated feature hierarchy profiles the product at multiple levels and provides more detailed information about it. Our experimental results, based on some popularly used review datasets, show that the proposed feature extraction approach can identify more correct features than the baseline model. Even though the datasets used in the experiment are about cameras, our work can be applied to generate features about a service, such as the services in hospitals or clinics.
Abstract:
Guaranteeing the quality of extracted features that describe relevant knowledge to users or topics is a challenge because of the large number of extracted features. Most existing term-based feature selection methods suffer from extracting noisy features that are irrelevant to user needs. One popular alternative is to extract phrases or n-grams to describe the relevant knowledge; however, extracted n-grams and phrases usually contain a lot of noise. This paper proposes a method for reducing the noise in n-grams. The method first extracts more specific features (terms) to remove noisy features. It then uses an extended random set to accurately weight n-grams based on their distribution in the documents and the distribution of their terms within the n-grams. The proposed approach not only reduces the number of extracted n-grams but also improves performance. Experimental results on the Reuters Corpus Volume 1 (RCV1) data collection and TREC topics show that the proposed method significantly outperforms state-of-the-art methods underpinned by Okapi BM25, tf*idf and Rocchio.
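The abstract does not spell out the extended-random-set formulation, so the snippet below is only a rough illustration of the two signals it combines: how widely an n-gram occurs across documents, and how its constituent terms are distributed over the extracted n-grams. Both the data and the combination rule are invented for illustration.

```python
from collections import Counter

# Toy scoring: n-grams that occur in many documents and are built from
# well-supported terms score high; one-off noisy n-grams score low.
docs = [
    ["machine learning model", "learning model"],
    ["machine learning model", "noisy phrase"],
    ["learning model"],
]
ngrams = sorted({g for d in docs for g in d})
N = len(docs)

# Distribution of each n-gram over documents.
doc_prob = {g: sum(g in d for d in docs) / N for g in ngrams}

# Distribution of individual terms over the extracted n-grams.
term_counts = Counter(t for g in ngrams for t in g.split())
total = sum(term_counts.values())
term_prob = {t: c / total for t, c in term_counts.items()}

weight = {g: doc_prob[g] * sum(term_prob[t] for t in g.split()) / len(g.split())
          for g in ngrams}
for g, w in sorted(weight.items(), key=lambda kv: -kv[1]):
    print(f"{w:.3f}  {g}")   # "noisy phrase" lands at the bottom
```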
Abstract:
Background: Mycobacterium abscessus is a rapidly growing mycobacterium responsible for progressive pulmonary disease and for soft tissue and wound infections. The incidence of disease due to M. abscessus has been increasing in Queensland. In a study of Brisbane drinking water, M. abscessus was isolated from ten different locations. The aim of this study was to compare genotypically the M. abscessus isolates obtained from water with those obtained from human clinical specimens. Methods: Between 2007 and 2009, eleven isolates confirmed as M. abscessus were recovered from potable water; one strain was isolated from a rainwater tank, another from a swimming pool, and two from domestic taps. Seventy-four clinical isolates referred during the same time period were available for comparison using rep-PCR strain typing (DiversiLab). Results: The drinking water isolates formed two clusters with ≥97% genetic similarity (Water patterns 1 and 2). The tank water isolate (WP4), one municipal water isolate (WP3) and the pool isolate (WP5) were distinctly different. Patient isolates formed clusters with all of the water isolates except WP3; the remaining patient isolates were unrelated to the water isolates. Conclusion: The high degree of similarity between strains of M. abscessus from potable water and strains causing infection in humans in the same geographical area strengthens the possibility that drinking water may be the source of infection in these patients.
Abstract:
Social networking sites (SNSs), with their large numbers of users and large information base, seem to be perfect breeding grounds for exploiting the vulnerabilities of people, the weakest link in security. Deceiving, persuading, or influencing people to provide information or to perform an action that will benefit the attacker is known as “social engineering.” While technology-based security has been addressed by research and may be well understood, social engineering is more challenging to understand and manage, especially in new environments such as SNSs, owing to some factors of SNSs that reduce the ability of users to detect the attack and increase the ability of attackers to launch it. This work will contribute to the knowledge of social engineering by presenting the first two conceptual models of social engineering attacks in SNSs. Phase-based and source-based models are presented, along with an intensive and comprehensive overview of different aspects of social engineering threats in SNSs.
Abstract:
Changes in the molecular structure of polymer antioxidants such as hindered amine light stabilisers (HALS) are central to their efficacy in retarding polymer degradation and therefore require careful monitoring during their in-service lifetime. The HALS bis-(1-octyloxy-2,2,6,6-tetramethyl-4-piperidinyl) sebacate (TIN123) and bis-(1,2,2,6,6-pentamethyl-4-piperidinyl) sebacate (TIN292) were formulated in different polymer systems and then exposed to various curing and ageing treatments to simulate in-service use. Samples of these coatings were then analysed directly using liquid extraction surface analysis (LESA) coupled with a triple quadrupole mass spectrometer. Analysis of TIN123 formulated in a cross-linked polyester revealed that the polymer matrix protected TIN123 from undergoing the extensive thermal degradation that would normally occur at 292 °C, specifically changes at the 1- and 4-positions of the piperidine groups. The effect of thermal versus photo-oxidative degradation was also compared for TIN292 formulated in polyacrylate films by monitoring the in situ conversion of N-CH3-substituted piperidines to N-H. The analysis confirmed that UV light was required for the conversion of N-CH3 moieties to N-H, a major pathway in the antioxidant protection of polymers, whereas this conversion was not observed under thermal degradation. The use of tandem mass spectrometric techniques, including precursor-ion scanning, is shown to be highly sensitive and specific for detecting molecular-level changes in HALS compounds and, when coupled with LESA, able to monitor these changes in situ with speed and reproducibility.
Abstract:
To obtain accurate Monte Carlo simulations of small radiation fields, it is important to model the initial source parameters (electron energy and spot size) accurately. However, recent studies have shown that small-field dosimetry correction factors are insensitive to these parameters. The aim of this work is to extend this concept and test whether these parameters affect dose perturbations in general, which is important for detector design and for calculating perturbation correction factors. The EGSnrc C++ user code cavity was used for all simulations. Varying amounts of air between 0 and 2 mm were deliberately introduced upstream of a diode, and the dose perturbation caused by the air was quantified. These simulations were then repeated using a range of initial electron energies (5.5 to 7.0 MeV) and electron spot sizes (0.7 to 2.2 FWHM). The resultant dose perturbations were large: for example, 2 mm of air caused a dose reduction of up to 31% when simulated with a 6 mm field size. However, these values did not vary by more than 2% when simulated across the full range of source parameters tested. If a detector is modified by the introduction of air, one can therefore be confident that the response of the detector will be the same across all similar linear accelerators, and Monte Carlo modelling of each individual machine is not required.
Abstract:
Network coding is a method for achieving channel capacity in networks. The key idea is to allow network routers to linearly mix packets as they traverse the network, so that recipients receive linear combinations of packets. Network-coded systems are vulnerable to pollution attacks, in which a single malicious node floods the network with bad packets and prevents the receiver from decoding correctly. Cryptographic defenses against these attacks are based on homomorphic signatures and MACs. These proposals, however, cannot handle mixing of packets from multiple sources, which is needed to achieve the full benefits of network coding. In this paper we address the integrity of multi-source mixing. We propose a security model for this setting and provide a generic construction.
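To make the linear mixing concrete, here is a minimal sketch of random linear network coding over GF(2): nodes forward XOR combinations of the source packets together with the coefficient vector used, and the receiver recovers the originals by Gaussian elimination once it holds enough independent combinations. Deployed systems typically work over GF(2^8); GF(2) keeps the arithmetic readable. In this picture, a pollution attack is a node injecting a pair whose payload does not match its claimed coefficient vector, which is what the homomorphic signatures and MACs are meant to prevent.

```python
import random

def encode(packets, rng):
    """One coded packet: a random nonzero subset of sources, XORed."""
    k = len(packets)
    coeffs = 0
    while coeffs == 0:
        coeffs = rng.getrandbits(k)
    payload = 0
    for i in range(k):
        if coeffs >> i & 1:
            payload ^= packets[i]
    return coeffs, payload        # (coefficient vector, mixed payload)

def decode(coded, k):
    """Gaussian elimination over GF(2); None until rank k is reached."""
    basis = {}                    # pivot bit -> [coeffs, payload]
    for coeffs, payload in coded:
        row = [coeffs, payload]
        while row[0]:
            pivot = row[0].bit_length() - 1
            if pivot not in basis:
                basis[pivot] = row
                break
            row[0] ^= basis[pivot][0]
            row[1] ^= basis[pivot][1]
    if len(basis) < k:
        return None
    for p in sorted(basis):       # back-substitute to diagonalize
        for q, row in basis.items():
            if q != p and row[0] >> p & 1:
                row[0] ^= basis[p][0]
                row[1] ^= basis[p][1]
    return [basis[i][1] for i in range(k)]

rng = random.Random(7)
packets = [0xDEADBEEF, 0xCAFEF00D, 0x12345678]   # toy 32-bit "packets"
coded, recovered = [], None
while recovered is None:          # collect mixtures until decodable
    coded.append(encode(packets, rng))
    recovered = decode(coded, len(packets))
assert recovered == packets
```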
Abstract:
M. fortuitum is a rapidly growing mycobacterium associated with community-acquired and nosocomial wound, soft tissue, and pulmonary infections. It has been postulated that water is the source of infection, especially in the hospital setting. The aim of this study was to determine whether municipal water may be the source of community-acquired or nosocomial infections in the Brisbane area. Between 2007 and 2009, 20 strains of M. fortuitum were recovered from municipal water, and 53 patient isolates were submitted to the reference laboratory. A wide variation in strain types was identified using repetitive element sequence-based PCR, with 13 clusters of ≥2 indistinguishable isolates and 28 patterns consisting of individual isolates. The clusters could be grouped into seven similar groups (>95% similarity). Municipal water and clinical isolates collected during the same time period and from the same geographical area consisted of different strain types, making municipal water an unlikely source of sporadic human infection.
Abstract:
Two sources of uncertainty in the X-ray computed tomography imaging of polymer gel dosimeters are investigated in this paper. The first is a change in post-irradiation density, which is proportional to the computed tomography signal and is associated with a volume change. The second is reconstruction noise. A simple technique that increases the residual signal-to-noise ratio by almost two orders of magnitude is examined.
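The abstract does not name the noise-suppression technique, so the following is only a generic illustration of attacking reconstruction noise: averaging N independent reconstructions of the same slice leaves the density signal untouched while shrinking uncorrelated noise by a factor of sqrt(N). All numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 1.0    # post-irradiation CT signal (arbitrary units)
sigma = 0.5     # per-reconstruction noise (arbitrary units)
N = 64          # number of repeated reconstructions

scans = signal + sigma * rng.standard_normal((N, 256, 256))
avg = scans.mean(axis=0)          # noise falls as 1/sqrt(N)

print(f"single-scan SNR: {signal / scans[0].std():.1f}")
print(f"averaged SNR:    {signal / avg.std():.1f}  (~{N**0.5:.0f}x gain)")
```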
Abstract:
RATIONALE: Polymer-based surface coatings in outdoor applications experience accelerated degradation due to exposure to solar radiation, oxygen and atmospheric pollutants. These deleterious agents cause undesirable changes to the aesthetic and mechanical properties of the polymer, reducing its lifetime. The use of antioxidants such as hindered amine light stabilisers (HALS) retards these degradative processes; however, the mechanisms of HALS action and polymer degradation are poorly understood. METHODS: Detection of the HALS TINUVIN® 123 (bis(1-octyloxy-2,2,6,6-tetramethyl-4-piperidyl) sebacate) and of polymer degradation products directly from a polyester-based coil coating was achieved by liquid extraction surface analysis (LESA) coupled to a triple quadrupole QTRAP® 5500 mass spectrometer. The detection of TINUVIN® 123 and melamine was confirmed by the characteristic fragmentation pattern observed in LESA-MS/MS spectra, which was identical to that reported for authentic samples. RESULTS: Analysis of an unstabilised coil coating by LESA-MS after exposure to 4 years of outdoor field testing revealed the presence of melamine (1,3,5-triazine-2,4,6-triamine) as a polymer degradation product at elevated levels. Changes to the physical appearance of the coil coating, including powder-like deposits on the coating's surface, were observed to coincide with the melamine deposits and are indicative of the phenomenon known as polymer 'blooming'. CONCLUSIONS: For the first time, in situ detection of analytes from a thermoset polymer coating was accomplished without any sample preparation, providing advantages over traditional extraction-analysis approaches and some contemporary ambient MS methods. Detection of HALS and polymer degradation products such as melamine provides insight into the mechanisms by which degradation occurs and suggests that LESA-MS is a powerful new tool for polymer analysis.
Abstract:
In the current market, extensive software development is taking place and the software industry is thriving. Major software companies have identified source code theft as a major threat to revenue. By inserting an identity-establishing watermark in the source code, a company can prove its ownership of the code. In this paper, we propose a watermarking scheme for C/C++ source code that exploits a restriction of the language: if a function calls another function, the called function must be defined in the code before the caller, unless function pre-declarations are used. We embed the watermark by imposing an ordering on mutually independent functions, enforced by introducing bogus dependencies. Removing these dependencies to erase the watermark requires extensive manual intervention, making the attack infeasible. The scheme is also secure against subtractive and additive attacks. Using our watermarking scheme, an n-bit watermark can be embedded in a program with n independent functions. The scheme has been implemented on several sample programs and the resulting performance changes analyzed.
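As a toy illustration of the ordering idea only (the paper's scheme additionally enforces the chosen order with bogus call dependencies, which is not modelled here), the sketch below encodes each watermark bit in the relative order of one pair of mutually independent functions in the emitted source file; all function bodies are invented.

```python
# Toy: encode watermark bits as the definition order of function pairs.

def embed(pairs, bits):
    """pairs: (def_a, def_b) strings of functions that do not call each
    other; bit 0 keeps the canonical order, bit 1 swaps the pair."""
    out = []
    for (a, b), bit in zip(pairs, bits):
        out.extend([a, b] if bit == 0 else [b, a])
    return "\n\n".join(out)

def extract(source, pairs):
    """Read the watermark back from the definition order."""
    return [0 if source.index(a) < source.index(b) else 1
            for a, b in pairs]

pairs = [
    ("int f0(void) { return 0; }", "int f1(void) { return 1; }"),
    ("int g0(void) { return 0; }", "int g1(void) { return 1; }"),
    ("int h0(void) { return 0; }", "int h1(void) { return 1; }"),
]
watermark = [1, 0, 1]
src = embed(pairs, watermark)
assert extract(src, pairs) == watermark
```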