92 results for Computer Experiments
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This paper presents the distributed environment for virtual and/or real experiments for underwater robots (DEVRE). This environment is composed of a set of processes running on a local area network spanning three sites: 1) the onboard AUV computer; 2) a surface computer used as a human-machine interface (HMI); and 3) a computer used for simulating the vehicle dynamics and representing the virtual world. The HMI can be transparently linked to the real sensors and actuators, dealing with a real mission. It can also be linked with virtual sensors and virtual actuators, dealing with a virtual mission. The aim of DEVRE is to assist engineers during software development and testing in the lab prior to real experiments.
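The transparency idea described above, where the HMI depends only on an abstract sensor interface so that real and virtual back ends are interchangeable, can be illustrated with a minimal Python sketch (the class and method names here are hypothetical, not DEVRE's actual API):

```python
from abc import ABC, abstractmethod

class DepthSensor(ABC):
    """Common interface the HMI talks to, whether real or simulated."""
    @abstractmethod
    def read(self) -> float: ...

class SimulatedDepthSensor(DepthSensor):
    """Virtual sensor backed by the vehicle-dynamics simulator."""
    def __init__(self, depth: float = 0.0):
        self._depth = depth
    def read(self) -> float:
        return self._depth

class HMI:
    """Depends only on the abstract sensor, so swapping a real
    sensor process for a virtual one is transparent to the HMI."""
    def __init__(self, sensor: DepthSensor):
        self._sensor = sensor
    def display_depth(self) -> str:
        return f"depth: {self._sensor.read():.1f} m"
```

A real-mission build would pass an implementation wrapping the onboard AUV sensor process instead of the simulated one.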
Abstract:
We present a computer-simulation study of the effect of the distribution of energy barriers in an anisotropic magnetic system on the relaxation behavior of the magnetization. While the relaxation law for the magnetization can be approximated in all cases by a time-logarithmic decay, the law for the dependence of the magnetic viscosity on temperature is found to be quite sensitive to the shape of the distribution of barriers. The low-temperature region for the magnetic viscosity never extrapolates to a positive non-null value. Moreover, our computer-simulation results agree reasonably well with some recent relaxation experiments on highly anisotropic single-domain particles.
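The superposition of Arrhenius-type relaxation over a distribution of barriers, which produces the approximately time-logarithmic decay the abstract refers to, can be illustrated with a small numerical sketch (a generic illustration, not the authors' simulation code; energies are measured in units of kT and `tau0` is an arbitrary assumption):

```python
import math

def magnetization(t, barriers, weights, temperature, tau0=1e-9):
    """Superposition of Arrhenius relaxation over a barrier distribution:
    M(t) = sum_i w_i * exp(-t / tau(E_i)),  tau(E) = tau0 * exp(E / kT).
    Barriers and temperature are in the same (arbitrary) energy units;
    the weights describe the barrier distribution and should sum to 1."""
    total = 0.0
    for e, w in zip(barriers, weights):
        tau = tau0 * math.exp(e / temperature)
        total += w * math.exp(-t / tau)
    return total
```

Sampling M(t) at logarithmically spaced times for, say, a flat barrier distribution shows the roughly linear decay in log t; changing the distribution shape changes how the viscosity (the slope) varies with temperature.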
Abstract:
A BASIC computer program (REMOVAL) was developed to perform, in a VAX/VMS environment, all the calculations of the removal method for population size estimation (catch-effort method for closed populations with constant sampling effort). The program follows the maximum-likelihood methodology, checks the failure conditions, applies the appropriate formula, and displays the estimates of population size and catchability, with their standard deviations and coefficients of variation, and two goodness-of-fit statistics with their significance levels. Data from removal experiments on the cyprinodontid fish Aphanius iberus in the Alt Empordà wetlands are used to exemplify the use of the program.
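For the two-sample special case of the removal method, the classical Zippin/Seber closed-form estimator can be written in a few lines. The program itself implements the general maximum-likelihood k-sample case; this sketch only illustrates the idea and the failure condition:

```python
def removal_estimate(c1: int, c2: int):
    """Two-pass removal (Zippin/Seber) estimator for a closed population
    under constant sampling effort: c1 and c2 are the catches of the
    first and second passes. Returns (N_hat, p_hat), or None when the
    failure condition c2 >= c1 holds (catch did not decline)."""
    if c2 >= c1:
        return None  # method fails: no decline between passes
    n_hat = c1 * c1 / (c1 - c2)   # population size estimate
    p_hat = (c1 - c2) / c1        # catchability per pass
    return n_hat, p_hat
```

For example, catches of 100 and then 40 individuals give an estimated population of about 167 with catchability 0.6 per pass.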
Abstract:
The main objective of this work is to apply computer vision techniques to locate and track the limbs of mice within the testing environment used for the optogenetics research of the group at the Neuroscience Institute of Princeton University, New Jersey.
Abstract:
In order to develop applications for visual interpretation of medical images, the early detection and evaluation of microcalcifications in digital mammograms is very important, since their presence is often associated with a high incidence of breast cancers. Accurate classification into benign and malignant groups would help improve diagnostic sensitivity as well as reduce the number of unnecessary biopsies. The challenge here is the selection of useful features to distinguish benign from malignant microcalcifications. Our purpose in this work is to analyse a microcalcification evaluation method based on a set of shape-based features extracted from the digitised mammography. The segmentation of the microcalcifications is performed using a fixed-tolerance region growing method to extract boundaries of calcifications with manually selected seed pixels. Taking into account that shapes and sizes of clustered microcalcifications have been associated with a high risk of carcinoma based on different subjective measures, such as whether or not the calcifications are irregular, linear, vermiform, branched, rounded or ring-like, our efforts were addressed to obtaining a feature set related to shape. The identification of the parameters concerning the malignant character of the microcalcifications was performed on a set of 146 mammograms whose real diagnosis was known in advance from biopsies. This allowed identifying the following shape-based parameters as the relevant ones: number of clusters, number of holes, area, Feret elongation, roughness, and elongation. Further experiments on a set of 70 new mammograms showed that the performance of the classification scheme is close to the mean performance of three expert radiologists, which allows the proposed method to be considered for assisting diagnosis and encourages continuing the investigation by adding new features not only related to shape.
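Fixed-tolerance region growing from a manually selected seed, the segmentation step mentioned above, can be sketched as a breadth-first flood fill (a generic illustration of the technique, not the paper's implementation):

```python
from collections import deque

def region_grow(image, seed, tol):
    """Fixed-tolerance region growing: starting from a manually chosen
    seed pixel, add 4-connected neighbours whose intensity differs from
    the seed intensity by at most `tol`. Returns the set of (row, col)
    pixels in the grown region; its border gives the calcification
    boundary."""
    rows, cols = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]
    region, frontier = {seed}, deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(image[nr][nc] - seed_val) <= tol):
                region.add((nr, nc))
                frontier.append((nr, nc))
    return region
```

Shape features such as area or elongation are then computed from the pixel set or its boundary.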
Abstract:
A brain-computer interface (BCI) is a new communication channel between the human brain and a computer. Applications of BCI systems include the restoration of movement, communication and environmental control. In this study, experiments were carried out in which a BCI system was used to control or navigate virtual environments (VE) by thought alone. BCI experiments for navigation in VR have so far been conducted with both synchronous and asynchronous BCI systems. A synchronous BCI analyzes the EEG patterns in a predefined time window and offers 2 to 3 degrees of freedom.
Abstract:
We discuss how technologies of peer punishment might bias the results that are observed in experiments. A crucial parameter is the “fine-to-fee” ratio, which describes by how much the punished subject's income is reduced relative to the fee the punishing subject has to pay to inflict the punishment. We show that a punishment technology commonly used in experiments embeds a variable fine-to-fee ratio, and that this confounds the empirical findings about why, whom, and how much subjects punish.
Abstract:
We use structural methods to assess equilibrium models of bidding with data from first-price auction experiments. We identify conditions to test the Nash equilibrium models for homogeneous and for heterogeneous constant relative risk aversion when bidders' private valuations are independent and uniformly drawn. The outcomes of our study indicate that behavior may have been affected by the procedure used to conduct the experiments and that the usual Nash equilibrium model for heterogeneous constant relative risk averse bidders does not consistently explain the observed overbidding. From an empirical standpoint, our analysis shows the possible drawbacks of overlooking the homogeneity hypothesis when testing symmetric equilibrium models of bidding, and it puts into perspective the sensitivity of structural inferences to the available information.
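For reference, with valuations i.i.d. uniform on [0, 1] and homogeneous CRRA utility u(x) = x^r, the symmetric Nash equilibrium bid function in a first-price auction has a well-known closed form, sketched here (an illustration of the model class being tested, not the paper's estimation code):

```python
def crra_nash_bid(v: float, n: int, r: float) -> float:
    """Symmetric Nash equilibrium bid in a first-price auction with n
    bidders, valuations i.i.d. uniform on [0, 1], and homogeneous CRRA
    utility u(x) = x**r (r = 1 is risk neutrality, 0 < r < 1 is risk
    aversion). Lower r pushes bids up toward the valuation, which is
    how CRRA models rationalise observed overbidding."""
    return (n - 1) * v / (n - 1 + r)
```

With two risk-neutral bidders the bid is half the valuation; a risk-averse bidder (say r = 0.5) bids strictly more.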
Abstract:
Report for the scientific sojourn carried out at the Music Technology Area (Sound Processing and Control Lab), Faculty of Music, McGill University, Montreal, Canada, from October to December 2005. The aim of this research is to study the singing voice for controlling virtual musical instrument synthesis. It includes analysis and synthesis algorithms based on spectral audio processing. After digitising the acoustic voice signal in the computer, a number of expressive descriptors of the singer are extracted. This process is achieved synchronously, so that all the nuances of the singer's performance are tracked. In a second stage, the extracted parameters are mapped to a sound synthesizer, the so-called digital musical instrument. To achieve this, several tests with music students of the Faculty of Music, McGill University were carried out. These experiments have helped to evaluate the system and to derive new control strategies integrating clarinet synthesis, bass guitar, and visual representation of voice signals.
Abstract:
Report for the scientific sojourn at the Swiss Federal Institute of Technology Zurich, Switzerland, between September and December 2007. In order to make robots useful assistants in our everyday life, the ability to learn and recognize objects is of essential importance. However, object recognition in real scenes is one of the most challenging problems in computer vision, as many difficulties must be dealt with. Furthermore, in mobile robotics a new challenge is added to the list: computational complexity. In a dynamic world, information about the objects in the scene can become obsolete before it is ready to be used if the detection algorithm is not fast enough. Two recent object recognition techniques have achieved notable results: the constellation approach proposed by Lowe and the bag-of-words approach proposed by Nistér and Stewénius. The Lowe constellation approach is the one currently used in the robot localization task of the COGNIRON project. This report is divided into two main sections. The first section briefly reviews the currently used object recognition system, the Lowe approach, and brings to light the drawbacks found for object recognition in the context of indoor mobile robot navigation; the proposed improvements to the algorithm are also described. In the second section, the alternative bag-of-words method is reviewed, along with several experiments conducted to evaluate its performance on our own object databases. Furthermore, some modifications to the original algorithm to make it suitable for object detection in unsegmented images are proposed.
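The core of the bag-of-words step, quantizing each local descriptor to its nearest visual word and summarising the image as a word histogram, can be sketched as follows (a toy illustration with brute-force nearest-neighbour search, not the Nistér–Stewénius vocabulary-tree implementation, which exists precisely to avoid this linear scan):

```python
def bow_histogram(descriptors, vocabulary):
    """Bag-of-words encoding: each local descriptor is assigned to its
    nearest 'visual word' (squared Euclidean distance) and the image is
    summarised as a histogram of word counts, which can then be matched
    against a database far faster than raw descriptor sets."""
    hist = [0] * len(vocabulary)
    for d in descriptors:
        best = min(range(len(vocabulary)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(d, vocabulary[i])))
        hist[best] += 1
    return hist
```

In practice the vocabulary is learned offline by clustering descriptors from training images, and histograms are compared with a weighted distance.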
Abstract:
The main objective of the project is the study, simulation and deployment of a set of applications that provide control over potential problems that may occur on our network. This project is the solution to the problems of detecting failures in the operation of the networking infrastructures available to our clients.
Abstract:
Parallel I/O is a research area of growing importance in High Performance Computing. Although for years it was the bottleneck of parallel computers, nowadays, due to the great increase in computing power, the I/O problem has grown, and the High Performance Computing community considers that work is needed to improve the I/O system of parallel computers in order to meet the demands of scientific applications that use HPC. The configuration of parallel input/output (I/O) has a great influence on performance and availability, so it is important to "analyse parallel I/O configurations to identify the key factors that influence the I/O performance and availability of scientific applications running on a cluster". To analyse I/O configurations, we propose a methodology, consisting of three phases (characterisation, configuration and evaluation), that allows the I/O factors to be identified and their influence evaluated for different I/O configurations. The methodology allows the parallel computer to be analysed at the level of the scientific application, the I/O libraries and the I/O architecture, always from the point of view of I/O. The experiments carried out for different I/O configurations and the results obtained indicate the complexity of analysing the I/O factors and their different degrees of influence on the performance of the I/O system. Finally, future work is described: the design of a model to support the process of configuring the parallel I/O system for scientific applications. In addition, to identify and evaluate the I/O factors associated with availability at the data level, we intend to use the RADIC fault-tolerant architecture.
Abstract:
The project "Laboratori Assistit per Ordinador Mitjançant Eines Ofimàtiques Convencionals" (computer-assisted laboratory using conventional office tools) was carried out at the Faculty of Physics of the Universitat de Barcelona during 2007 and 2008 (biennial project). The main objective of this project is to demonstrate the possibility of using the most common computing tools for carrying out computer-assisted laboratory (CAL) experiments. In particular, we propose the use of Excel together with its macros (Visual Basic for Applications, VBA) in laboratory practicals for subjects in the area of Applied Physics. Excel is a spreadsheet well known and widely used by both teachers and students. In this work we show concrete examples covering the different control and data-acquisition techniques: programming of the serial (RS-232) and parallel ports, and the GPIB interface. These techniques are implemented by means of Excel VBA macros. The remaining programming of the CAL application, the graphical representation and the processing of the data, is done very simply through the ordinary handling of a spreadsheet. Carrying out the project has demonstrated the convenience of this methodology. At present, practically all of the CAL practicals for which the Department of Applied Physics is responsible use programming through the spreadsheet. The response of the students has been very positive. The combination of the characteristics of this tool together with VBA programming has enormous potential and probably represents a simple way to introduce both students and teachers to the world of programming.
Abstract:
With the advent of high performance computing, it is now possible to achieve orders-of-magnitude gains in performance and computational efficiency over conventional computer architectures. This thesis explores the potential of using high performance computing to accelerate whole genome alignment (WGA). A parallel technique is applied to an algorithm for whole genome alignment; this technique is explained, and some experiments were carried out to test it. The technique is based on fair usage of the available resources to execute genome alignment and on how this can be exploited in HPC clusters. This work is a first approximation to whole genome alignment, and it shows the advantages of parallelism as well as some of the drawbacks of our technique. The work describes the resource limitations of current WGA applications when dealing with large quantities of sequences, and proposes a parallel heuristic to distribute the load while ensuring that alignment quality is maintained.
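One simple way to distribute alignment load across cluster nodes, in the spirit of the load-distribution heuristic mentioned above, is greedy static balancing by sequence length (a hypothetical sketch for illustration, not the thesis's actual scheduler, and alignment cost is only roughly proportional to sequence length):

```python
def distribute_sequences(lengths, n_workers):
    """Greedy static load balancing: assign each sequence (longest
    first) to the currently least-loaded worker, approximating an even
    split of alignment work across the cluster nodes. Returns the list
    of sequence indices per worker and the resulting per-worker load."""
    loads = [0] * n_workers
    assignment = [[] for _ in range(n_workers)]
    for idx in sorted(range(len(lengths)), key=lambda i: -lengths[i]):
        w = loads.index(min(loads))   # least-loaded worker so far
        assignment[w].append(idx)
        loads[w] += lengths[idx]
    return assignment, loads
```

Each worker then runs the sequential aligner on its own subset, and the partial results are merged afterwards.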
Abstract:
The article first analyses the state of the art of bandwidth management in educational environments, presenting, on the basis of several previous classifications, proposed solutions and experiences. With the proposal presented, the simulation experiments performed and the tests in real environments seek to verify its correct behaviour, demonstrating its usefulness for managing the bandwidth of the centres.