60 results for TRACE TECHNIQUES
Analysis and evaluation of techniques for the extraction of classes in the ontology learning process
Abstract:
This paper analyzes and evaluates, in the context of ontology learning, techniques to identify and extract candidate terms for the classes of a taxonomy. In addition, it points out inconsistencies that may occur in the preprocessing of the text corpus and proposes techniques to obtain good candidate terms for taxonomy classes.
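The abstract does not name a specific extraction algorithm, so the following is only a minimal sketch of the simplest family of techniques in this area, frequency-based candidate-term extraction; the stopword list, the frequency threshold and the toy corpus are illustrative assumptions, not the paper's pipeline.

```python
from collections import Counter
import re

# Illustrative stopword list; real pipelines use full linguistic preprocessing.
STOPWORDS = {"the", "a", "an", "of", "in", "and", "to", "is", "are",
             "for", "on", "that", "which", "from", "with"}

def candidate_class_terms(corpus, min_freq=2):
    """Return frequent content words as candidate terms for taxonomy classes.

    A deliberately simple frequency-based extractor; production ontology-
    learning pipelines add POS tagging, multiword-term detection and
    statistical ranking (e.g. TF-IDF or C-value).
    """
    tokens = re.findall(r"[a-z]+", corpus.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS and len(t) > 2)
    return [term for term, freq in counts.most_common() if freq >= min_freq]

docs = ("Ontology learning extracts classes from text. "
        "Candidate classes are terms that recur across the corpus. "
        "Preprocessing of the corpus affects which terms are extracted.")
print(candidate_class_terms(docs))  # e.g. ['classes', 'terms', 'corpus']
```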
Abstract:
The current operational very short-term and short-term quantitative precipitation forecast (QPF) at the Meteorological Service of Catalonia (SMC) is produced by three different methodologies: advection of the radar reflectivity field (ADV); identification, tracking and forecasting of convective structures (CST); and numerical weather prediction (NWP) models using observational data assimilation (radar, satellite, etc.). These precipitation forecasts have different characteristics, lead times and spatial resolutions. The objective of this study is to combine these methods in order to obtain a single, optimized QPF at each lead time. This combination (blending) of the radar forecasts (ADV and CST) and the precipitation forecast from the NWP model is carried out by means of different methodologies according to the prediction horizon. First, in order to take advantage of the rainfall location and intensity from radar observations, a phase-correction technique is applied to the NWP output to derive an additional corrected forecast (MCO). To select the best precipitation estimate in the first and second hours (t+1 h and t+2 h), the information from radar advection (ADV) and the corrected model output (MCO) are mixed using weights that vary dynamically according to indexes that quantify the quality of these predictions. This procedure integrates the skill in rainfall location and patterns given by the advection of the radar reflectivity field with the capacity of NWP models to generate new precipitation areas. From the third hour (t+3 h), as radar-based forecasting generally has low skill, only the quantitative precipitation forecast from the model is used. This blending of different sources of prediction is verified for different types of episodes (convective, moderately convective and stratiform) to obtain a robust methodology that can be implemented in an operational and dynamic way.
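As a rough illustration of the blending logic described above (not the SMC operational code), the sketch below mixes the ADV and MCO fields with dynamically normalised skill weights for the first two lead times and falls back to the NWP field from t+3 h; the function names, the weighting formula and the toy fields are all assumptions.

```python
import numpy as np

def blend_qpf(adv, mco, nwp, lead_time_h, skill_adv, skill_mco):
    """Blend radar-advection (ADV), phase-corrected model (MCO) and raw NWP
    precipitation fields into a single QPF. The normalised-skill weighting
    is an illustrative stand-in for the quality indexes of the abstract."""
    if lead_time_h <= 2:
        # t+1 h and t+2 h: mix ADV and MCO, weighted by their quality indexes
        w_adv = skill_adv / (skill_adv + skill_mco)
        return w_adv * adv + (1.0 - w_adv) * mco
    # From t+3 h radar-based forecasts have low skill: use the NWP field alone
    return nwp

adv = np.array([[2.0, 0.0], [1.5, 0.3]])   # mm/h, radar advection field
mco = np.array([[1.0, 0.4], [1.2, 0.6]])   # mm/h, phase-corrected NWP field
nwp = np.array([[0.8, 0.5], [1.0, 0.7]])   # mm/h, raw NWP field
print(blend_qpf(adv, mco, nwp, lead_time_h=1, skill_adv=0.7, skill_mco=0.3))
```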
Abstract:
The current state of regional and urban science has been much discussed, and a number of studies have speculated on possible future trends in the development of the discipline. However, there has been little empirical analysis of current publication patterns in regional and urban journals. This paper studies the kinds of topics, techniques and data used in articles published in nine top international journals during the 1990s, with the aim of identifying current trends in this research field.
Abstract:
The part proportional to the Euler-Poincaré characteristic of the contribution of spin-2 fields to the gravitational trace anomaly is computed. It is seen to be of the same sign as all the lower-spin contributions, making anomaly cancellation impossible. Subtleties related to Weyl invariance, gauge independence, ghosts, and the counting of degrees of freedom are pointed out.
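For reference, the four-dimensional gravitational trace anomaly has the standard general structure shown below (normalisation and sign conventions vary between references); the coefficient a of the Euler density E_4 is the part proportional to the Euler-Poincaré characteristic discussed in the abstract, since integrating E_4 over a closed four-manifold gives, up to a constant, its Euler-Poincaré characteristic.

```latex
\langle T^{\mu}{}_{\mu} \rangle
  = \frac{1}{(4\pi)^{2}}
    \left( c\, C_{\mu\nu\rho\sigma} C^{\mu\nu\rho\sigma}
         - a\, E_{4} + b\, \Box R \right),
\qquad
E_{4} = R_{\mu\nu\rho\sigma} R^{\mu\nu\rho\sigma}
      - 4\, R_{\mu\nu} R^{\mu\nu} + R^{2}.
```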
Abstract:
It is argued that previous computations of the spin-3/2 anomaly contain spurious contributions, as Weyl invariance is already broken at the classical level. The genuine, gauge-invariant spin-3/2 gravitational trace anomaly is computed here.
Abstract:
A practical activity designed to introduce wavefront coding as a method to extend the depth of field of optical systems is presented. The activity is suitable for advanced undergraduate students, since it combines different topics in optical engineering such as optical system design, aberration theory, Fourier optics, and digital image processing. This paper provides the theoretical background and technical information for performing the experiment. The proposed activity requires students to develop a wide range of skills, since they are expected to work with optical components, including spatial light modulators, and to write scripts to perform some of the calculations.
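To complement the description above, here is a minimal Fourier-optics sketch of wavefront coding, assuming a cubic phase mask and a Fraunhofer (FFT-based) point-spread-function model; the grid size, mask strength and defocus value are arbitrary illustrative choices, not the parameters of the proposed experiment.

```python
import numpy as np

# Wavefront coding in a nutshell: a cubic phase mask exp(i*alpha*(x^3 + y^3))
# placed in the pupil makes the PSF nearly invariant to defocus, so a single
# digital deconvolution step can restore sharp images over an extended depth
# of field.

N = 256                                    # pupil-plane samples (assumed)
x = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(x, x)
aperture = (X**2 + Y**2) <= 1.0            # circular pupil

alpha = 30.0                               # cubic-mask strength, rad (assumed)
defocus = 5.0                              # defocus coefficient, rad (assumed)

# Generalised pupil function: aperture * cubic mask * defocus aberration
P = aperture * np.exp(1j * (alpha * (X**3 + Y**3) + defocus * (X**2 + Y**2)))

# PSF is the squared modulus of the Fourier transform of the pupil function
psf = np.abs(np.fft.fftshift(np.fft.fft2(P)))**2
psf /= psf.sum()
print("peak of normalised PSF:", psf.max())
```

Repeating the last three lines for several defocus values would show the near-invariance of the coded PSF, which is the property the activity asks students to verify.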
Abstract:
Surface topography and light scattering were measured on 15 samples ranging from those having smooth surfaces to others with ground surfaces. The measurement techniques included an atomic force microscope, mechanical and optical profilers, a confocal laser scanning microscope, angle-resolved scattering, and total scattering. The samples included polished and ground fused silica, silicon carbide, sapphire, electroplated gold, and diamond-turned brass. The measurement instruments and techniques had different surface spatial wavelength band limits, so the measured roughnesses were not directly comparable. Two-dimensional power spectral density (PSD) functions were calculated from the digitized measurement data, and rms roughnesses were obtained by integrating the areas under the PSD curves between fixed upper and lower band limits. In this way, roughnesses measured with different instruments and techniques could be directly compared. Some differences between measurement techniques remained in the calculated roughnesses, but these could be explained mostly by surface topographical features, such as isolated particles, that affected the instruments in different ways.
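The band-limited comparison described above can be illustrated in one dimension (the abstract works with two-dimensional PSDs): compute a periodogram PSD from a digitized profile and integrate it only between fixed spatial-frequency limits. The synthetic profile, sampling step and band limits below are assumptions for the sketch.

```python
import numpy as np

def band_limited_rms(profile, dx, f_lo, f_hi):
    """rms roughness obtained by integrating the PSD of a surface profile
    between the spatial-frequency band limits [f_lo, f_hi], so that
    instruments with different bandwidths become directly comparable."""
    n = profile.size
    z = profile - profile.mean()
    freqs = np.fft.rfftfreq(n, d=dx)            # spatial frequencies, 1/m
    psd = (dx / n) * np.abs(np.fft.rfft(z))**2  # one-sided periodogram, m^3
    psd[1:-1] *= 2.0                            # fold in negative frequencies
    band = (freqs >= f_lo) & (freqs <= f_hi)
    df = freqs[1] - freqs[0]
    return np.sqrt(psd[band].sum() * df)        # rms roughness in the band, m

rng = np.random.default_rng(0)
profile = rng.normal(scale=2.0e-9, size=4096)   # synthetic 2 nm rms surface
print(band_limited_rms(profile, dx=1.0e-6, f_lo=1.0e3, f_hi=1.0e5))
```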
Abstract:
Two general approaches are used in the investigation of metal complexation with electroanalytical tools. The first, called hard-modelling, is based on the formulation of a joint physicochemical model for the electrode and complexation processes and on the analytical or numerical solution of that model. Fitting the model parameters to the experimental data then yields the desired information about the complexation process. The second approach, called soft-modelling, is based on the identification of a complexation model from the numerical and statistical analysis of the data, without any prior assumption of a model. This approach, which has been used extensively with spectroscopic data, has very rarely been applied to electrochemical data. In this article we deal with the formulation of a model (hard-modelling) for metal complexation in systems with mixtures of ligands, including macromolecular ligands, and with the application of ...
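As a generic illustration of the hard-modelling step, fitting the parameters of a postulated physicochemical model to experimental data, the sketch below fits a simple 1:1 complexation isotherm by least squares; this is far simpler than the ligand-mixture model formulated in the article, and the data and stability constant are synthetic assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def bound_fraction(ligand, K):
    """1:1 complexation isotherm: fraction of metal bound vs. free ligand
    concentration, with stability constant K (the model to be fitted)."""
    return K * ligand / (1.0 + K * ligand)

# Synthetic 'experimental' titration data for a true K of 500 L/mol
ligand = np.logspace(-4, -1, 12)                       # mol/L
rng = np.random.default_rng(1)
data = bound_fraction(ligand, 500.0) + rng.normal(scale=0.01, size=ligand.size)

# Hard-modelling step: adjust the model parameter to the data
(K_fit,), _ = curve_fit(bound_fraction, ligand, data, p0=[100.0])
print(f"fitted K = {K_fit:.0f} L/mol")                 # close to 500
```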
Abstract:
If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB-design data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data-correction step in NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide the selection of techniques according to the data characteristics identified by visual inspection is provided.
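Of the four procedures compared, the Nonoverlap of All Pairs has a particularly compact definition; the sketch below computes it for an AB design in which the intervention is expected to raise the scores, with invented example data.

```python
from itertools import product

def nap(phase_a, phase_b):
    """Nonoverlap of All Pairs for an AB single-case design. Each (A, B)
    pair scores 1 if the B value exceeds the A value, 0.5 on a tie and 0
    otherwise; NAP is the mean pair score (values near 1 = little overlap)."""
    pairs = list(product(phase_a, phase_b))
    score = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs)
    return score / len(pairs)

baseline = [3, 4, 4, 5, 3]           # phase A (invented data)
treatment = [6, 7, 5, 8, 7, 9]       # phase B (invented data)
print(f"NAP = {nap(baseline, treatment):.2f}")
```

The data-correction step mentioned in the abstract would detrend the series before this computation; it is omitted here for brevity.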
Abstract:
Transmission electron microscopy is a proven technique in the field of cell biology and a very useful tool in biomedical research. Innovation and improvements in equipment, together with the introduction of new technology, have allowed us to improve our knowledge of biological tissues, to visualize structures better, and both to identify and to locate molecules. Of all the types of microscopy exploited to date, electron microscopy is the one with the most advantageous resolution limit, and it is therefore a very efficient technique for deciphering the cell architecture and relating it to function. This chapter aims to provide an overview of the most important techniques that we can apply to a biological sample, tissue or cells, to observe it with an electron microscope, from the most conventional to the latest generation. Processes and concepts are defined, and the advantages and disadvantages of each technique are assessed, along with the image and information that we can obtain by using each one of them.
Abstract:
This Handbook contains a collection of articles describing instrumental techniques used for Materials, Chemical and Biosciences research that are available at the Scientific and Technological Centers of the University of Barcelona (CCiTUB). The CCiTUB are a group of facilities of the UB that provide both the research community and industry with ready access to a wide range of major instrumentation. Together with the latest equipment and technology, the CCiTUB provide expertise in addressing the methodological research needs of the user community, and they also collaborate in R+D+i projects with industry. CCiTUB specialists include technical and Ph.D.-level professional staff members who are actively engaged in methodological research. Detailed information on the centers' resources and activities can be found at the CCiTUB website www.ccit.ub.edu ...
Abstract:
This article summarizes the basic principles of light microscopy, with examples of applications in biomedicine that illustrate the capabilities of the technique.
Abstract:
We analyse the use of the ordered weighted average (OWA) in decision-making, giving special attention to business and economic decision-making problems. We present several aggregation techniques that are very useful for decision-making, such as the Hamming distance, the adequacy coefficient, and the index of maximum and minimum level. We suggest a new approach using immediate weights, that is, using the weighted average and the OWA operator in the same formulation. We further generalize them by using generalized and quasi-arithmetic means. We also analyse the applicability of the OWA operator in business and economics, and we see that it can be used instead of the weighted average. We end the paper with an application to a business multi-person decision-making problem regarding production management.
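A minimal sketch of the OWA operator, together with one common "immediate weights" construction in which the weighted-average weights follow their arguments through the reordering and are fused with the positional OWA weights; the payoffs and weight vectors are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted average: the weights apply to the values after
    sorting them in descending order, so they reward position, not source."""
    return float(np.sort(values)[::-1] @ np.asarray(weights))

def immediate_weights_owa(values, owa_w, wa_w):
    """OWA with immediate weights (one common formulation): each argument
    carries its importance weight through the reordering; that weight is
    multiplied by the positional OWA weight and the products renormalised."""
    order = np.argsort(values)[::-1]
    fused = np.asarray(owa_w) * np.asarray(wa_w)[order]
    fused /= fused.sum()
    return float(np.asarray(values)[order] @ fused)

payoffs = [60.0, 30.0, 50.0, 40.0]      # outcomes of a production plan (toy)
owa_w = [0.4, 0.3, 0.2, 0.1]            # optimistic positional weights
wa_w = [0.1, 0.4, 0.3, 0.2]             # importance of each state (toy)
print(owa(payoffs, owa_w))              # 50.0
print(immediate_weights_owa(payoffs, owa_w, wa_w))
```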