983 results for Extraction techniques
Abstract:
In this paper we present a description of the role of definitional verbal patterns for the extraction of semantic relations. Several studies show that semantic relations can be extracted from analytic definitions contained in machine-readable dictionaries (MRDs). In addition, definitions found in specialised texts are a good starting point to search for different types of definitions where other semantic relations occur. The extraction of definitional knowledge from specialised corpora represents another interesting approach for the extraction of semantic relations. Here, we present a descriptive analysis of definitional verbal patterns in Spanish and the first steps towards the development of a system for the automatic extraction of definitional knowledge.
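The pattern-based extraction described above can be pictured with a small sketch. This is only an illustration, not the authors' system: the Spanish definitional verbal patterns below ("se define como", "es un/una", "consiste en") are common textbook examples, and the regular expressions are simplified assumptions.

```python
import re

# Illustrative definitional verbal patterns; the actual pattern
# inventory analysed in the paper is richer than this.
PATTERNS = [
    # "X se define como Y" -> term X, definition Y
    re.compile(r"(?P<term>[\w ]+?) se define como (?P<definition>.+)"),
    # "X es un/una Y" -> hyperonymy-style analytic definition
    re.compile(r"(?P<term>[\w ]+?) es una? (?P<definition>.+)"),
    # "X consiste en Y"
    re.compile(r"(?P<term>[\w ]+?) consiste en (?P<definition>.+)"),
]

def extract_definitions(sentence):
    """Return (term, definition) pairs matched by any definitional pattern."""
    hits = []
    for pat in PATTERNS:
        m = pat.search(sentence)
        if m:
            hits.append((m.group("term").strip(), m.group("definition").strip()))
    return hits

print(extract_definitions("La ontología se define como una especificación formal."))
```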
Investigation into Improved Pavement Curing Materials and Techniques: Part 2 - Phase III, March 2003
Abstract:
Appropriate curing is important for concrete to attain its designed properties. This research was conducted to evaluate the effects of different curing materials and methods on pavement properties. At present, spraying a curing compound is a commonly used method in pavement and other concrete structure construction. Three curing compounds were selected for testing. Two different application rates were employed for the white-pigmented liquid curing compounds. The concrete properties of temperature, moisture content, conductivity, and permeability were examined at several test locations. It was found, in this project, that the concrete properties varied with depth. Of the tests conducted (maturity, sorptivity, permeability, and conductivity), conductivity appears to be the best method for evaluating curing effects in the field and bears potential for field application. The results indicated that currently approved curing materials in Iowa, when spread uniformly in a single or double application, provide adequate curing protection and meet the goals of the Iowa Department of Transportation. Experimental curing methods can be compared to this method through the use of conductivity testing to determine their application in the field.
Abstract:
Concrete curing is closely related to cement hydration, microstructure development, and concrete performance. Application of a liquid membrane-forming curing compound is among the most widely used curing methods for concrete pavements and bridge decks. Curing compounds are economical, easy to apply, and maintenance free. However, limited research has been done to investigate the effectiveness of different curing compounds and their application technologies. No reliable standard testing method is available to evaluate the effectiveness of curing, especially for field-cured concrete. The present research investigates the effects of curing compound materials and application technologies on concrete properties, especially the properties of surface concrete. This report presents a literature review of curing technology, with an emphasis on curing compounds, and the experimental results from the first part of this research, the lab investigation. In the lab investigation, three curing compounds were selected and applied to mortar specimens at three different times after casting. Two application methods, single- and double-layer applications, were employed. Moisture content, conductivity, sorptivity, and degree of hydration were measured at different depths of the specimens. Flexural and compressive strength of the specimens were also tested. Statistical analysis was conducted to examine the relationships between these material properties. The research results indicate that application of a curing compound significantly increased moisture content and degree of cement hydration and reduced sorptivity of the near-surface concrete. For given concrete materials and mix proportions, the optimal application time of curing compounds depended primarily upon weather conditions. If a sufficient amount of a high-efficiency-index curing compound was uniformly applied, no double-layer application was necessary.
Among all the test methods applied, the sorptivity test is the most sensitive, providing a good indication of the subtle changes in the microstructure of near-surface concrete caused by different curing materials and application methods. Sorptivity measurements correlate closely with moisture content and degree of hydration. The research results have established a baseline for, and provided insight into, the further development of testing procedures for the evaluation of curing compounds in the field. Recommendations are provided for further field study.
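Sorptivity measurements of the kind discussed above are typically obtained by fitting cumulative water absorption against the square root of elapsed time (i = S·√t + b). A minimal sketch of that fit with synthetic data, assuming plain least squares rather than the procedure of any particular standard:

```python
import math

def sorptivity(times_s, absorption_mm):
    """Estimate sorptivity S (mm/s^0.5) as the least-squares slope of
    cumulative absorption i versus the square root of elapsed time,
    following the usual i = S*sqrt(t) + b model."""
    x = [math.sqrt(t) for t in times_s]
    y = list(absorption_mm)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

# Synthetic absorption data generated from i = 0.1 * sqrt(t):
t = [60, 300, 600, 1200, 1800, 3600]
i = [0.1 * math.sqrt(ti) for ti in t]
print(round(sorptivity(t, i), 4))  # -> 0.1
```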
Abstract:
Standards for the construction of full-depth patches in portland cement concrete pavement usually require replacement of all deteriorated base materials with crushed stone, up to the bottom of the existing pavement layer. In an effort to reduce patch construction time and costs, the Iowa Department of Transportation and the Department of Civil, Construction and Environmental Engineering at Iowa State University studied the use of extra concrete depth as an option for base construction. This report compares the impact of additional concrete patching material depth on the rate of strength gain, the potential for early opening to traffic, patching costs, and long-term patch performance. It also compares those characteristics for early-setting and standard concrete mixes. The results have the potential to change the method of portland cement concrete pavement patch construction in Iowa.
Abstract:
Automatic creation of polarity lexicons is a crucial issue to be solved in order to reduce the time and effort spent in the first steps of Sentiment Analysis. In this paper we present a methodology based on linguistic cues that allows us to automatically discover, extract, and label subjective adjectives that should be collected in a domain-based polarity lexicon. For this purpose, we designed a bootstrapping algorithm that, from a small set of seed polar adjectives, is capable of iteratively identifying, extracting, and annotating positive and negative adjectives. Additionally, the method automatically creates lists of highly subjective elements that change their prior polarity even within the same domain. The proposed algorithm reached a precision of 97.5% for positive adjectives and 71.4% for negative ones in the semantic orientation identification task.
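The bootstrapping idea can be sketched roughly as follows. This is not the authors' algorithm: the specific linguistic cues used here (coordination with "y" preserving polarity, the adversative "pero" reversing it) and the toy corpus are illustrative assumptions.

```python
# Seed polar adjectives and a toy corpus of adjective coordinations.
SEEDS = {"bueno": "pos", "excelente": "pos", "malo": "neg"}

CORPUS = [
    ("bueno", "y", "agradable"),
    ("malo", "y", "horrible"),
    ("excelente", "pero", "caro"),
]

def bootstrap(seeds, corpus, iterations=5):
    """Iteratively propagate polarity labels from known to unknown
    adjectives through coordination cues."""
    lexicon = dict(seeds)
    for _ in range(iterations):
        grew = False
        for a, conj, b in corpus:
            for known, new in ((a, b), (b, a)):
                if known in lexicon and new not in lexicon:
                    pol = lexicon[known]
                    if conj == "pero":  # adversative cue flips polarity
                        pol = "neg" if pol == "pos" else "pos"
                    lexicon[new] = pol
                    grew = True
        if not grew:  # stop once no new adjectives are labelled
            break
    return lexicon

print(bootstrap(SEEDS, CORPUS))
```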
Abstract:
This work briefly analyses the difficulties of adopting the Semantic Web and, in particular, proposes systems for gauging the present level of migration to the different technologies that make up the Semantic Web. It focuses on the presentation and description of two tools, DigiDocSpider and DigiDocMetaEdit, designed with the aim of verifying, evaluating, and promoting its implementation.
Abstract:
Solid-phase extraction (SPE) in tandem with dispersive liquid-liquid microextraction (DLLME) has been developed for the determination of mononitrotoluenes (MNTs) in several aquatic samples using a gas chromatography-flame ionization detection (GC-FID) system. In the hyphenated SPE-DLLME, the MNTs were first extracted from a large volume of aqueous sample (100 mL) onto 500 mg of octadecyl silane (C18) sorbent. After elution of the analytes from the sorbent with acetonitrile, the resulting solution was subjected to the DLLME procedure so that extra preconcentration could be achieved. The parameters influencing the extraction efficiency, such as breakthrough volume, type and volume of the elution solvent (disperser solvent) and extracting solvent, as well as salt addition, were studied and optimized. The calibration curves were linear in the range of 0.5-500 μg/L and the limit of detection for all analytes was found to be 0.2 μg/L. The relative standard deviations (for 0.75 μg/L of MNTs) without an internal standard varied from 2.0 to 6.4% (n=5). The relative recoveries of the well, river, and sea water samples, spiked at a concentration level of 0.75 μg/L of the analytes, were in the range of 85-118%.
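The figures of merit reported above (relative recovery and relative standard deviation) follow standard definitions; a minimal sketch, with hypothetical replicate values for a 0.75 μg/L spike:

```python
import statistics

def relative_recovery(found_spiked, found_blank, spiked_amount):
    """Relative recovery (%) of a spiked sample."""
    return 100 * (found_spiked - found_blank) / spiked_amount

def rsd_percent(replicates):
    """Relative standard deviation (%) of replicate measurements."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate results (ug/L) for a 0.75 ug/L MNT spike:
reps = [0.71, 0.74, 0.70, 0.73, 0.72]
print(round(relative_recovery(0.72, 0.0, 0.75), 1))  # -> 96.0
print(round(rsd_percent(reps), 1))  # -> 2.2
```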
Abstract:
Pavement settlement occurring in and around utility cuts is a common problem, resulting in uneven pavement surfaces, annoyance to drivers, and ultimately, further maintenance. A survey of municipal authorities and field and laboratory investigations were conducted to identify the factors contributing to the settlement of utility cut restorations in pavement sections. Survey responses were received from seven cities across Iowa and indicate that utility cut restorations often last less than two years. Observations made during site inspections showed that backfill material varies from one city to another, backfill lift thickness often exceeds 12 inches, and the backfill material is often placed at bulking moisture contents with no quality control/quality assurance. Laboratory investigation of the backfill materials indicates that, at the field moisture contents encountered, the backfill materials have collapse potentials of up to 35%. Falling weight deflectometer (FWD) deflection data and elevation shots indicate that the maximum deflection in the pavement occurs in the area around the utility cut restoration. The FWD data indicate a zone of influence around the perimeter of the restoration extending two to three feet beyond the trench perimeter. The research team proposes moisture control, the use of 65% relative density in a granular fill, and removing and compacting the native material near the ground surface around the trench. Test sections with geogrid reinforcement were also incorporated. The performance of inspected and proposed utility cuts needs to be monitored for at least two more years.
Abstract:
PURPOSE: To compare examination time with radiologist time and to measure radiation dose of computed tomographic (CT) fluoroscopy, conventional CT, and conventional fluoroscopy as guiding modalities for shoulder CT arthrography. MATERIALS AND METHODS: Glenohumeral injection of contrast material for CT arthrography was performed in 64 consecutive patients (mean age, 32 years; age range, 16-74 years) and was guided with CT fluoroscopy (n = 28), conventional CT (n = 14), or conventional fluoroscopy (n = 22). Room times (arthrography, room change, CT, and total examination times) and radiologist times (time the radiologist spent in the fluoroscopy or CT room) were measured. One-way analysis of variance and Bonferroni-Dunn post hoc tests were performed for comparison of mean times. Mean effective radiation dose was calculated for each method with examination data, phantom measurements, and standard software. RESULTS: Mean total examination time was 28.0 minutes for CT fluoroscopy, 28.6 minutes for conventional CT, and 29.4 minutes for conventional fluoroscopy; mean radiologist time was 9.9 minutes, 10.5 minutes, and 9.0 minutes, respectively. These differences were not statistically significant. Mean effective radiation dose was 0.0015 mSv for conventional fluoroscopy (mean, nine sections), 0.22 mSv for CT fluoroscopy (120 kV; 50 mA; mean, 15 sections), and 0.96 mSv for conventional CT (140 kV; 240 mA; mean, six sections). Effective radiation dose can be reduced to 0.18 mSv for conventional CT by changing imaging parameters to 120 kV and 100 mA. Mean effective radiation dose of the diagnostic CT arthrographic examination (140 kV; 240 mA; mean, 25 sections) was 2.4 mSv. CONCLUSION: CT fluoroscopy and conventional CT are valuable alternative modalities for glenohumeral CT arthrography, as examination and radiologist times are not significantly different.
CT guidance requires a greater radiation dose than does conventional fluoroscopy, but with adequate parameters CT guidance constitutes approximately 8% of the radiation dose.
Abstract:
Ligands and receptors of the TNF superfamily are therapeutically relevant targets in a wide range of human diseases. This chapter describes assays based on ELISA, immunoprecipitation, FACS, and reporter cell lines to monitor interactions of tagged receptors and ligands in both soluble and membrane-bound forms using unified detection techniques. A reporter cell assay that is sensitive to ligand oligomerization can identify ligands with high probability of being active on endogenous receptors. Several assays are also suitable to measure the activity of agonist or antagonist antibodies, or to detect interactions with proteoglycans. Finally, self-interaction of membrane-bound receptors can be evidenced using a FRET-based assay. This panel of methods provides a large degree of flexibility to address questions related to the specificity, activation, or inhibition of TNF-TNF receptor interactions in independent assay systems, but does not substitute for further tests in physiologically relevant conditions.
Abstract:
Objective: To evaluate the safety of performing the traditional and protected techniques for collecting tracheal aspirate and to identify qualitative and quantitative agreement in the results of microbiological cultures between the techniques. Method: Clinical, prospective, comparative, single-blind research. The sample was composed of 54 patients >18 years of age, undergoing invasive mechanical ventilation for a period of ≥48 hours and with suspected ventilator-associated pneumonia. The two techniques were implemented in the same patient, one immediately after the other, in a random order determined by specialized randomization software. Results: No significant events of oxygen desaturation, hemodynamic instability, or tracheobronchial hemorrhage occurred (p<0.05) and, although there were differences in some strains, there was qualitative and quantitative agreement between the techniques (p<0.001). Conclusion: Use of the protected technique provided no advantage over the traditional technique, and execution of both techniques was safe for the patient.
Abstract:
Histoire discursive du « cinéma-vérité ». Techniques, controverses, historiographie (1960-1970) retraces the history of the rise and fall of the "cinéma vérité" label in France, which, between 1960 - when Edgar Morin published his programmatic essay "Pour un nouveau 'cinéma vérité'" in France Observateur - and 1964-65 - when the notion began to lose its popularity - served as the banner of a cinematic movement supposed to renew the relationship between cinema and reality. Some twenty films - such as Chronique d'un été by Jean Rouch and Edgar Morin, Primary by Richard Leacock and Robert Drew, Les Inconnus de la terre and Regard sur la folie by Mario Ruspoli, Hitler, connais pas by Bertrand Blier, Le Chemin de la mauvaise route by Jean Herman, Le Joli Mai by Chris Marker, La Punition by Jean Rouch, and Pour la Suite du monde by Michel Brault and Pierre Perrault - claimed this label or were associated with it by the French press, which devoted hundreds of articles to it. Indeed, the theatrical release of these "films-vérité" sparked virulent controversies in France, questioning the ethics of projects in which the people filmed were supposed to reveal an intimate truth before the camera, the artistic status of these works, and the absence of any marked political commitment by the "cinéastes-vérité" towards the issues raised by their protagonists (for instance the Algerian War, French youth, and international politics). The hypothesis underlying this research is that the film production claiming the "cinéma-vérité" label is characterised by a close correlation between film and discourse about film. On the one hand, because the first half of the decade was marked by numerous encounters between the "cinéastes vérité", critics, and the manufacturers of lightweight cameras and synchronous tape recorders - encounters that helped to accentuate and publicise the dissensions within the movement.
On the other hand, because a distinctive feature of many projects was to include in the film meta-discursive sequences in which participants, directors, or experts debate the success of the shoot. This work shows that the movement's success between 1960 and 1964-65 did not come about despite fierce polemics; on the contrary, many feature films incorporated the controversy within themselves, questioning, on a symbolic level, the abolition of the filter between the film and its spectator. If the films belonging to the "cinéma vérité" movement gave such prominence to confrontation, it is because "truth" was conceived as a dialectical process, emerging from a dynamic of exchanges (between the movement's directors, between the protagonists, between the film and its audience). The internal and public quarrels that punctuated those few years were part of the "cinéma-vérité" apparatus and justify writing the history of this cinematic movement through the discourses it provoked within French cinephilia.
Abstract:
Fungal symbionts commonly occur in plants influencing host growth, physiology, and ecology (Carlile et al., 2001). However, while whole-plant growth responses to biotrophic fungi are readily demonstrated, it has been much more difficult to identify and detect the physiological mechanisms responsible. Previous work on the clonal grass Glyceria striata has revealed that the systemic fungal endophyte Epichloë glyceriae has a positive effect on clonal growth of its host (Pan & Clay, 2002; 2003). The latest study from these authors, in this issue (pp. 467- 475), now suggests that increased carbon movement in hosts infected by E. glyceriae may function as one mechanism by which endophytic fungi could increase plant growth. Given the widespread distribution of both clonal plants and symbiotic fungi, this research will have implications for our understanding of the ecology and evolution of fungus-plant associations in natural communities.
Abstract:
Data mining can be defined as the extraction of previously unknown and potentially useful information from large datasets. The main principle is to devise computer programs that run through databases and automatically seek deterministic patterns. It is applied in different fields, e.g., remote sensing, biometry, and speech recognition, but has seldom been applied to forensic case data. The intrinsic difficulty in using such data lies in its heterogeneity, which comes from the many different sources of information. The aim of this study is to highlight potential uses of pattern recognition that would provide relevant results from a criminal intelligence point of view. The role of data mining within a global crime analysis methodology is to detect all types of structures in a dataset. Once filtered and interpreted, those structures can point to previously unseen criminal activities. The interpretation of patterns for intelligence purposes is the final stage of the process. It allows the researcher to validate the whole methodology and to refine each step if necessary. An application to cutting agents found in illicit drug seizures was performed. A combinatorial approach was taken, using the presence and absence of products. Methods from graph theory were used to extract patterns in data consisting of links between products and the place and date of seizure. A data mining process carried out using graphing techniques is called "graph mining". Patterns were detected that had to be interpreted and compared with preliminary knowledge to establish their relevance. The illicit drug profiling process is actually an intelligence process that uses preliminary illicit drug classes to classify new samples. Methods proposed in this study could be used a priori to compare structures from preliminary and post-detection patterns.
This new knowledge of a repeated structure may provide valuable complementary information to profiling and become a source of intelligence.
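As a rough illustration of the graph mining step, links between seizures and cutting agents can be treated as edges of a graph whose connected components group seizures that share products. The data below are hypothetical, and the study's actual methods and attributes (place, date of seizure) are richer than this sketch.

```python
from collections import defaultdict

# Hypothetical links between seizures and the cutting agents found in them.
LINKS = [
    ("seizure_1", "caffeine"),
    ("seizure_1", "paracetamol"),
    ("seizure_2", "paracetamol"),
    ("seizure_3", "lidocaine"),
]

def connected_components(edges):
    """Group nodes of the seizure-product graph into connected components,
    a basic structure-detection step behind graph mining."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, components = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:  # iterative depth-first traversal
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(adj[n] - comp)
        seen |= comp
        components.append(comp)
    return components

for comp in connected_components(LINKS):
    print(sorted(comp))
```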
Abstract:
In sexual assault cases, autosomal DNA analysis of gynecological swabs is a challenge, as the presence of a large quantity of female material may prevent detection of the male DNA. A solution to this problem is differential DNA extraction, but as there are different protocols, it was decided to test their efficiency on simulated casework samples. Four difficult samples were sent to the nine Swiss laboratories active in forensic genetics. They used their routine protocols to separate the epithelial cell fraction, enriched with the non-sperm DNA, from the sperm fraction. DNA extracts were then sent to the organizing laboratory for analysis. Estimates of the male to female DNA ratio without differential DNA extraction ranged from 1:38 to 1:339, depending on the semen used to prepare the samples. After differential DNA extraction, most of the ratios ranged from 1:12 to 9:1, allowing detection of the male DNA. Compared to direct DNA extraction, cell separation resulted in losses of 94-98% of the male DNA. As expected, more male DNA was generally present in the sperm fraction than in the epithelial cell fraction. However, for about 30% of the samples, the reverse trend was observed. The recovery of male and female DNA varied greatly between laboratories. An experimental design similar to the one used in this study may help with local protocol testing and improvement.
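The ratio and loss figures reported above follow simple arithmetic on quantified male and female DNA; a minimal sketch with hypothetical quantities:

```python
def male_female_ratio(male_ng, female_ng):
    """Express male and female DNA quantities as a ratio normalised to 1."""
    if male_ng >= female_ng:
        return f"{male_ng / female_ng:.0f}:1"
    return f"1:{female_ng / male_ng:.0f}"

def male_dna_loss_percent(male_direct_ng, male_after_separation_ng):
    """Percentage of male DNA lost during differential extraction,
    relative to direct extraction."""
    return 100 * (1 - male_after_separation_ng / male_direct_ng)

# Hypothetical quantification results (ng):
print(male_female_ratio(0.5, 19.0))                 # -> 1:38
print(round(male_dna_loss_percent(10.0, 0.4), 1))   # -> 96.0
```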