874 results for Computer Supported Cooperative Work (CSCW)
Abstract:
Paper presented at the IX Workshop de Agentes Físicos (WAF'2008), Vigo, 11-12 September 2008.
Abstract:
Paper presented at the X Workshop of Physical Agents, Cáceres, 10-11 September 2009.
Abstract:
This paper analyzes the learning experiences and opinions of a group of undergraduate students in a Robotics course. The contents of the course were taught as a set of seminars; in each seminar, the students acquired interdisciplinary knowledge from computer science, control engineering, electronics and other fields related to Robotics. The aim of the course is for the students to design and implement their own custom robotic solution for a series of tests planned by the teachers. These tests measure the behavior and mechatronic features of the students' robots. Finally, the students' robots face each other in a series of competitions. The low-cost robotic architecture used by the students, the contents of the course, the tests used to compare the students' solutions, and the students' opinions are discussed in detail.
Abstract:
Background: in both Spain and Italy the number of immigrants has increased strongly in the last 20 years, and immigrants currently represent more than 10% of the workforce in each country. The segregation of immigrants into unskilled or risky jobs has negative consequences for their health. The objective of this study is to compare the prevalence of work-related health problems between immigrant and native workers in Italy and Spain. Methods: data come from the Italian Labour Force Survey (n=65 779) and the Spanish Working Conditions Survey (n=11 019), both conducted in 2007. We analyzed the merged datasets to evaluate whether interviewees, both natives and migrants, judge their health to be affected by their working conditions and, if so, by which specific diseases. Migrants were defined as those coming from countries with a Human Development Index lower than 0.85. Logistic regression models were used, with gender, age, and education as adjustment factors. Results: migrants reported skin diseases (Mantel-Haenszel pooled OR=1.49; 95%CI: 0.59-3.74) and, among those employed in the agricultural sector, musculoskeletal problems (Mantel-Haenszel pooled OR=1.16; 95%CI: 0.69-1.96) more frequently than natives; country-specific analyses showed higher risks of musculoskeletal problems among migrants compared to the non-migrant population in Italy (OR=1.17; 95%CI: 0.48-1.59) and of respiratory problems in Spain (OR=2.02; 95%CI: 1.02-4.0). In both countries the risk of psychological stress was predominant among national workers. Conclusions: this collaborative study strengthens the evidence concerning the health of migrant workers in Southern European countries.
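The pooled odds ratios reported above combine the two country strata; as a minimal sketch, the Mantel-Haenszel pooled OR can be computed from stratified 2x2 tables as follows. The counts below are made up for illustration, not the survey data.

```python
# Mantel-Haenszel pooled odds ratio across strata (e.g., the two countries).
# Each table is (a, b, c, d): a = exposed cases, b = exposed non-cases,
# c = unexposed cases, d = unexposed non-cases.
def mh_pooled_or(tables):
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

tables = [(40, 960, 30, 970),   # hypothetical Italian stratum
          (25, 475, 20, 480)]   # hypothetical Spanish stratum
or_mh = mh_pooled_or(tables)
```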
Abstract:
We propose an original method to geoposition an audio/video stream with multiple emitters that are at the same time receivers of the mixed signal. The resulting method is suitable for those cases where a list of positions within a designated area is encoded with a degree of precision adjusted to the visualization capabilities, and it is also easily extensible to support new requirements. The method extends a previously proposed protocol without incurring any performance penalty.
Abstract:
To provide more efficient and flexible alternatives for the applications of secret sharing schemes, this paper describes a threshold sharing scheme based on the exponentiation of matrices over Galois fields. A significant characteristic of the proposed scheme is that each participant keeps only one master secret share, which can be used to reconstruct different group secrets according to their threshold values.
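To illustrate the threshold idea itself (not the paper's matrix-exponentiation construction), here is a minimal sketch of a standard (t, n) scheme, Shamir's polynomial sharing over a prime field: any t shares reconstruct the secret, fewer reveal nothing.

```python
# Illustrative (t, n) threshold sharing -- Shamir's scheme over GF(P).
# This is NOT the matrix-based scheme of the paper, only the threshold concept.
import random

P = 2**61 - 1  # a Mersenne prime defining the field GF(P)

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

shares = make_shares(123456789, t=3, n=5)
```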
Abstract:
This paper presents a new approach to the delineation of local labor markets based on evolutionary computation. The aim of the exercise is to divide a given territory into functional regions based on travel-to-work flows. Such regions are defined so that a high degree of inter-regional separation and of intra-regional integration, both in terms of commuting flows, is guaranteed. Additional requirements include the absence of overlap between delineated regions and the exhaustive coverage of the whole territory. The procedure is based on the maximization of a fitness function that measures aggregate intra-region interaction under constraints of inter-region separation and minimum size. In the experimentation stage, two variations of the fitness function are used, and the process is also applied as a final stage to optimize the results of one of the most successful existing methods, which is used by the British authorities for the delineation of travel-to-work areas (TTWAs). The empirical exercise is conducted using real data for a sufficiently large territory that is considered representative given the density and variety of travel-to-work patterns it embraces. The paper includes a quantitative comparison with traditional alternative methods, an assessment of the performance of the set of operators specifically designed to handle the regionalization problem, and an evaluation of the convergence process. The robustness of the solutions, crucial in a research and policy-making context, is also discussed.
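The evolutionary loop described above can be sketched in miniature: encode a regionalization as an assignment of zones to regions and evolve it to maximize intra-region commuting flow. The toy flow matrix, region count and GA parameters are illustrative, and the paper's separation and minimum-size constraints and richer operators are omitted here.

```python
# Minimal evolutionary sketch for regionalization by commuting flows.
import random

random.seed(0)
N_ZONES, N_REGIONS = 12, 3

# Symmetric toy commuting-flow matrix between zones (made-up data).
flows = [[0 if i == j else random.randint(0, 20) for j in range(N_ZONES)]
         for i in range(N_ZONES)]

def fitness(assignment):
    """Aggregate commuting flow retained inside regions."""
    return sum(flows[i][j]
               for i in range(N_ZONES) for j in range(N_ZONES)
               if assignment[i] == assignment[j])

def mutate(assignment):
    """Move one zone to a random region (the paper uses richer operators)."""
    child = assignment[:]
    child[random.randrange(N_ZONES)] = random.randrange(N_REGIONS)
    return child

# Simple elitist (mu + lambda) loop.
pop = [[random.randrange(N_REGIONS) for _ in range(N_ZONES)] for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
best = max(pop, key=fitness)
```

Without the constraints, the trivial optimum is a single region; in the real method the inter-region separation and minimum-size constraints prevent this collapse.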
Abstract:
In this paper we address two issues. The first is whether the performance of a text summarization method depends on the topic of a document. The second concerns how certain linguistic properties of a text may affect the performance of a number of automatic text summarization methods. To this end we consider semantic analysis methods, such as textual entailment and anaphora resolution, and study how they relate to proper noun, pronoun and noun ratios calculated over original documents grouped into related topics. Given the results obtained, we conclude that, although our first hypothesis is not supported, since no evident relationship was found between the topic of a document and the performance of the methods employed, adapting summarization systems to the linguistic properties of input documents benefits the summarization process.
Abstract:
The need to digitise music scores has led to the development of Optical Music Recognition (OMR) tools. Unfortunately, the performance of these systems is still far from acceptable. This situation forces the user to be involved in the process in order to correct the mistakes made during recognition. However, this correction is performed on the output of the system, so these interventions are not exploited to improve recognition performance. This work sets out the scenario in which human and machine interact to accurately complete the OMR task with the least possible effort for the user.
Abstract:
Over the past decade, the numerical modeling of magnetic field evolution in astrophysical scenarios has become an increasingly important field. In the crystallized crust of neutron stars, the evolution of the magnetic field is governed by the Hall induction equation. In this equation, the relative contribution of the two terms (the Hall term and Ohmic dissipation) varies depending on the local temperature and magnetic field strength. This results in a transition from the purely parabolic character of the equations to the hyperbolic regime as the magnetic Reynolds number increases, which presents severe numerical problems. Up to now, most attempts to study this problem were based on spectral methods, but they failed to represent the transition to large magnetic Reynolds numbers. We present a new code based on upwind finite difference techniques that can handle situations with arbitrarily low magnetic diffusivity and is suitable for studying the formation of sharp current sheets during the evolution. The code is thoroughly tested in different limits and used to illustrate the evolution of the crustal magnetic field of a neutron star in some representative cases. Our code, coupled to cooling codes, can be used to perform long-term simulations of the magneto-thermal evolution of neutron stars.
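The numerical idea can be sketched in one dimension: an advective ("Hall-like") term handled with first-order upwind differences plus explicit Ohmic diffusion, with a CFL-limited time step. The grid size, velocity and diffusivity below are illustrative; the actual code solves the full nonlinear Hall induction equation.

```python
# 1D upwind advection-diffusion sketch: dB/dt + v dB/dx = eta d2B/dx2.
import math

NX, L = 200, 1.0
dx = L / NX
v, eta = 1.0, 1e-3                   # advection speed, low diffusivity
dt = 0.4 * min(dx / abs(v), dx * dx / (2 * eta))   # CFL-limited step

# Initial Gaussian pulse centered at x = 0.3.
B = [math.exp(-((i * dx - 0.3) / 0.05) ** 2) for i in range(NX)]

def step(B):
    Bn = B[:]
    for i in range(NX):
        im, ip = (i - 1) % NX, (i + 1) % NX       # periodic boundaries
        # Upwind: one-sided difference taken from the upstream side.
        adv = (B[i] - B[im]) / dx if v > 0 else (B[ip] - B[i]) / dx
        diff = (B[ip] - 2 * B[i] + B[im]) / dx ** 2
        Bn[i] = B[i] + dt * (-v * adv + eta * diff)
    return Bn

for _ in range(100):
    B = step(B)
```

Upwinding keeps the scheme monotone (no spurious oscillations) even as the diffusivity becomes arbitrarily small, at the price of some numerical smearing.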
Abstract:
We report on an outburst of the high mass X-ray binary 4U 0115+634 with a pulse period of 3.6 s in 2008 March/April as observed with RXTE and INTEGRAL. During the outburst the neutron star’s luminosity varied by a factor of 10 in the 3–50 keV band. In agreement with earlier work we find evidence of five cyclotron resonance scattering features at ~10.7, 21.8, 35.5, 46.7, and 59.7 keV. Previous work had found an anticorrelation between the fundamental cyclotron line energy and the X-ray flux. We show that this apparent anticorrelation is probably due to the unphysical interplay of parameters of the cyclotron line with the continuum models used previously, e.g., the negative and positive exponent power law (NPEX). For this model, we show that cyclotron line modeling erroneously leads to describing part of the exponential cutoff and the continuum variability, and not the cyclotron lines. When the X-ray continuum is modeled with a simple exponentially cutoff power law modified by a Gaussian emission feature around 10 keV, the correlation between the line energy and the flux vanishes, and the line parameters remain virtually constant over the outburst. We therefore conclude that the previously reported anticorrelation is an artifact of the assumptions adopted in the modeling of the continuum.
Abstract:
We present an analysis of a pointed 141 ks Chandra high-resolution transmission gratings observation of the Be X-ray emitting star HD110432, a prominent member of the γ Cas analogs. This observation represents the first high-resolution spectrum taken for this source as well as the longest uninterrupted observation of any γ Cas analog. The Chandra light curve shows a high variability but its analysis fails to detect any coherent periodicity up to a frequency of 0.05 Hz. Hardness ratio versus intensity analyses demonstrate that the relative contributions of the [1.5-3] Å, [3-6] Å, and [6-16] Å energy bands to the total flux change rapidly in the short term. The analysis of the Chandra High Energy Transmission Grating (HETG) spectrum shows that, to correctly describe the spectrum, three model components are needed. Two of those components are optically thin thermal plasmas of different temperatures (kT ≈ 8-9 and 0.2-0.3 keV, respectively) described by the models vmekal or bvapec. The Fe abundance in each of these two components appears equal within the errors and is slightly subsolar with Z ≈ 0.75 Z ☉. The bvapec model better describes the Fe L transitions, although it cannot fit well the Na XI Lyα line at 10.02 Å, which appears to be overabundant. Two different models seem to describe well the third component. One possibility is a third hot optically thin thermal plasma at kT = 16-21 keV with an Fe abundance Z ≈ 0.3 Z ☉, definitely smaller than for the other two thermal components. Furthermore, the bvapec model describes well the Fe K shell transitions because it accounts for the turbulence broadening of the Fe XXV and Fe XXVI lines with a v turb ≈ 1200 km s–1. These two lines, contributed mainly by the hot thermal plasma, are significantly wider than the Fe Kα line whose FWHM < 5 mÅ is not resolved by Chandra. Alternatively, the third component can be described by a power law with a photon index of Γ = 1.56. 
In either case, the Chandra HETG spectrum establishes that each one of these components must be modified by distinct absorption columns. The analysis of a noncontemporaneous 25 ks Suzaku observation shows the presence of a hard tail extending up to at least 33 keV. The Suzaku spectrum is described with the sum of two components: an optically thin thermal plasma at kT ≈ 9 keV and Z ≈ 0.74 Z ☉, and a very hot second plasma with kT ≈ 33 keV or, alternatively, a power law with photon index of Γ = 1.58. In either case, each one of the two components must be affected by different absorption columns. Therefore, the kT = 8-9 keV component is definitely needed while the nature of the harder emission cannot be unambiguously established with the present data sets. The analysis of the Si XIII and S XV He-like triplets present in the Chandra spectrum points to a very dense (ne ~ 1013 cm–3) plasma located either close to the stellar surface (r < 3R *) of the Be star or, alternatively, very close (r ~ 1.5R WD) to the surface of a (hypothetical) white dwarf companion. We argue, however, that the available data support the first scenario.
Abstract:
Nowadays, the use of RGB-D sensors has attracted a lot of research in computer vision and robotics. These sensors, like the Kinect, make it possible to obtain 3D data together with color information. However, their working range is limited to less than 10 meters, making them unsuitable for some robotics applications, like outdoor mapping. In these environments, 3D lasers, working at ranges of 20-80 meters, are better suited, but 3D lasers do not usually provide color information. A simple 2D camera can be used to add color information to the point cloud, but a calibration process between camera and laser must be carried out. In this paper we present a portable calibration system to calibrate any traditional camera with a 3D laser in order to assign color information to the 3D points obtained. Thus, we can exploit laser precision and color information simultaneously. Unlike other techniques that use a three-dimensional body of known dimensions in the calibration process, this system is highly portable because it uses small catadioptrics that can be placed easily in the environment. We use our calibration system in a 3D mapping system, including Simultaneous Localization and Mapping (SLAM), in order to obtain a colored 3D map that can be used in different tasks. We show that an additional problem arises: 2D camera information varies when lighting conditions change, so when we merge 3D point clouds from two different views, several points in a given neighborhood may carry different color information. A new method for color fusion is presented that yields correctly colored maps. The system is tested by applying it to 3D reconstruction.
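Once the camera-laser calibration is known, the coloring step amounts to projecting each 3D laser point through a pinhole camera model and reading off the pixel color. The intrinsics K, the identity extrinsics and the image size below are illustrative values, not the paper's calibration results.

```python
# Color a laser point cloud from a calibrated 2D camera (pinhole model).
W, H = 640, 480
K = [[500.0, 0.0, 320.0],     # fx,  0, cx
     [0.0, 500.0, 240.0],     #  0, fy, cy
     [0.0,   0.0,   1.0]]
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # rotation (calibrated)
t = [0.0, 0.0, 0.0]                                       # translation (calibrated)

def project(p):
    """Map a 3D point in the laser frame to pixel coordinates, or None."""
    x = [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]
    if x[2] <= 0:             # behind the camera
        return None
    u = (K[0][0] * x[0] + K[0][2] * x[2]) / x[2]
    v = (K[1][1] * x[1] + K[1][2] * x[2]) / x[2]
    if 0 <= u < W and 0 <= v < H:
        return int(u), int(v)
    return None

def colorize(cloud, image):
    """Attach the (r, g, b) pixel value to every point that projects into view."""
    colored = []
    for p in cloud:
        uv = project(p)
        if uv is not None:
            colored.append((p, image[uv[1]][uv[0]]))
    return colored
```

Points outside the field of view or behind the camera simply keep no color, which is why merging several views (with the color-fusion step the paper describes) is needed for a fully colored map.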
Abstract:
Paper submitted to the 43rd International Symposium on Robotics (ISR2012), Taipei, Taiwan, Aug. 29-31, 2012.
Abstract:
A parallel algorithm for image noise removal is proposed. The algorithm is based on the peer group concept and uses a fuzzy metric. An optimization study of the use of the CUDA platform to remove impulsive noise with this algorithm is presented. Moreover, an implementation of the algorithm on multi-core platforms using OpenMP is presented. Performance is evaluated in terms of execution time, and the multi-core, GPU and combined implementations are compared. A performance analysis with large images is conducted in order to determine how many pixels to allocate to the CPU and GPU. The observed times show that both devices should be given work, with most of it assigned to the GPU. Results show that parallel implementations of denoising filters on GPUs and multi-cores are highly advisable, and they open the door to using such algorithms for real-time processing.
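The peer group idea can be sketched sequentially on a grayscale image: a pixel is kept if enough neighbours in its window are "close" to it (its peer group); otherwise it is treated as impulsive noise and replaced by the window median. The distance threshold and peer count below are illustrative; the paper uses a fuzzy metric on color images and parallelises this loop with OpenMP/CUDA.

```python
# Sequential sketch of peer-group impulsive-noise filtering (grayscale).
import statistics

def peer_group_filter(img, d=30, min_peers=4):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            # Peers: neighbours within distance d of the centre pixel.
            peers = sum(1 for q in window if abs(q - img[y][x]) <= d) - 1
            if peers < min_peers:              # likely impulsive noise
                out[y][x] = statistics.median(window)
    return out
```

Each output pixel depends only on the input window, so the outer loops are trivially data-parallel, which is what makes the OpenMP/CUDA versions effective.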