996 results for Interaction testing


Relevance: 30.00%

Abstract:

In areas of seasonal frost, frost susceptibility, comprising frost heaving during the winter and thaw softening during the spring, is one of the most dangerous phenomena for transportation, road and railway infrastructure. A frost protection layer therefore becomes imperative; its purpose is to prevent frost from penetrating down through the pavement and into the sub-soils. Frost-susceptible soils under a road can cause damage to the road or other structures through frost heave or reduced bearing capacity during the thaw period. "Frost heave" is the term given to the upwards displacement of the ground surface caused by the formation of ice within soils or aggregates (Rempel et al., 2004). Nowadays in Scandinavia, the most common materials used in the frost protection layer of road pavement structures and in the ballast of railway tracks are coarse-grained crushed rock aggregates. The mechanics of the frost heave phenomenon rest on capillary rise, i.e. the interaction between aggregates and water; accordingly, Konrad and Lemieux (2005) suggested that the fraction of material below the 0.063 mm sieve in coarse-grained soils must be controlled so as to reduce the sensitivity to frost heave. The study conducted in this thesis project is divided in two parts: the analysis of the coarse-grained aggregates used in frost protection layers in Norway, and the analysis of the frost heave phenomenon in the laboratory under known boundary conditions, using the most widely used method, the frost heave test, in a "closed system" (without access to water).
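The fines-control idea attributed to Konrad and Lemieux can be sketched as a simple sieve-analysis check. The sample masses and the 3% limit below are illustrative assumptions, not values from the thesis:

```python
# Sieve analysis of a coarse aggregate sample (hypothetical data).
# Sizes in mm; the 0.0 entry is the pan, i.e. material finer than 0.063 mm.
sieves_mm  = [31.5, 16.0, 8.0, 4.0, 2.0, 0.5, 0.063, 0.0]
retained_g = [120.0, 310.0, 280.0, 150.0, 80.0, 40.0, 12.0, 8.0]

total = sum(retained_g)
fines_pct = 100.0 * retained_g[-1] / total  # mass fraction finer than 0.063 mm

FINES_LIMIT_PCT = 3.0  # illustrative screening limit, not a normative value
frost_susceptible = fines_pct > FINES_LIMIT_PCT
print(f"fines: {fines_pct:.2f}% -> susceptible: {frost_susceptible}")
```

With 8 g of fines in a 1000 g sample, the fines fraction is 0.8%, below the assumed limit.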

Relevance: 30.00%

Abstract:

As lightweight and slender structural elements are used more frequently in design, large-scale structures become more flexible and susceptible to excessive vibrations. To ensure the functionality of the structure, the dynamic properties of the occupied structure need to be estimated during the design phase. Traditional analysis methods model occupants simply as additional mass; however, research has shown that human occupants are better modeled as an additional degree of freedom. In the United Kingdom, active and passive crowd models have been proposed by the Joint Working Group (JWG) as a result of a series of analytical and experimental research. The crowd models are expected to yield a more accurate estimate of the dynamic response of the occupied structure. However, experimental testing recently conducted through a graduate student project at Bucknell University indicated that the proposed passive crowd model might not accurately represent the occupants' impact on the structure. The objective of this study is to assess the validity of the crowd models proposed by the JWG by comparing the dynamic properties obtained from experimental testing data with analytical modeling results. The experimental data used in this study were collected by Firman in 2010. The analytical results were obtained by performing a time-history analysis on a finite element model of the occupied structure. The crowd models were created based on the recommendations of the JWG combined with the physical properties of the occupants during the experimental study. SAP2000 was used to create the finite element models and run the analyses; Matlab and ME'scope were used to obtain the dynamic properties of the structure by processing the time-history analysis results from SAP2000.
The results of this study indicate that the active crowd model can quite accurately represent the impact on the structure of occupants standing with bent knees, while the passive crowd model could not properly simulate the dynamic response of the structure when occupants were standing straight or sitting on the structure. Future work related to this study involves improving the passive crowd model and evaluating the crowd models with full-scale structure models and operating data.
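The added-degree-of-freedom premise can be illustrated with a minimal undamped two-DOF model: attaching a crowd mass-spring to a structure splits its natural frequency into two coupled modes. All numbers are hypothetical stand-ins, not the test structure's actual properties:

```python
import math

# Structure idealized as one DOF (ms, ks); passive crowd attached as a
# second mass-spring DOF (mh, kh).  Values are illustrative assumptions.
ms, ks = 5000.0, 5000.0 * (2 * math.pi * 5.0) ** 2   # empty structure: 5.0 Hz
mh, kh = 500.0, 500.0 * (2 * math.pi * 5.5) ** 2     # crowd DOF: 5.5 Hz

# det(K - w^2 M) = 0 for the coupled 2-DOF system reduces to a quadratic in w^2:
#   ms*mh*w^4 - (mh*(ks+kh) + ms*kh)*w^2 + ks*kh = 0
a = ms * mh
b = -(mh * (ks + kh) + ms * kh)
c = ks * kh
disc = math.sqrt(b * b - 4 * a * c)
f_low  = math.sqrt((-b - disc) / (2 * a)) / (2 * math.pi)
f_high = math.sqrt((-b + disc) / (2 * a)) / (2 * math.pi)
print(f_low, f_high)   # the single 5 Hz mode splits into two coupled modes
```

The coupled frequencies bracket the two uncoupled ones, which is exactly the natural-frequency shift this kind of study measures.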

Relevance: 30.00%

Abstract:

Vibration serviceability is a widely recognized design criterion for assembly-type structures, such as stadiums, that are likely to be subjected to rhythmic human-induced excitation. Human-induced excitation of a structure arises from the movement of the occupants, such as walking, running, jumping, or dancing. Vibration serviceability is based on the level of comfort that people have with the vibrations of a structure. Current design guidance uses the natural frequency of the structure to assess vibration serviceability. However, a phenomenon known as human-structure interaction suggests that there is a dynamic interaction between the structure and passive occupants, altering the natural frequency of the system. Human-structure interaction depends on many factors, including the dynamic properties of the structure, the posture of the occupants, and the relative size of the crowd. It is unknown whether the shift in natural frequency due to human-structure interaction is significant enough to warrant consideration in the design process. This study explores the interface of structural and crowd characteristics through experimental testing to determine whether human-structure interaction should be considered because of its potential impact on serviceability assessment. An experimental test structure representing the dynamic properties of a cantilevered stadium structure was designed and constructed. Experimental modal analysis was used to determine the dynamic properties of the test structure when empty and when occupied by up to seven people arranged in different locations and postures. The dynamic properties were compared between the empty and occupied testing configurations and against analytical results from a dynamic crowd model recommended by the Joint Working Group of Europe. Data trends led to the development of a refined dynamic crowd model.
This dynamic model can be used in conjunction with a finite element model of the test structure to estimate the dynamic influence of human-structure interaction for occupants standing with straight knees. In the future, the crowd model will be refined further and can aid in assessing the dynamic properties of in-service stadium structures.

Relevance: 30.00%

Abstract:

We propose robust and efficient tests and estimators for gene-environment/gene-drug interactions in family-based association studies. The methodology is designed for studies in which haplotypes, quantitative phenotypes and complex exposure/treatment variables are analyzed. Using causal inference methodology, we derive family-based association tests and estimators for the genetic main effects and the interactions. The tests and estimators are robust against population admixture and stratification without requiring adjustment for confounding variables. We illustrate the practical relevance of our approach by an application to a COPD study. The data analysis suggests a gene-environment interaction between a SNP in the Serpine gene and smoking status/pack-years of smoking that reduces the FEV1 volume by about 0.02 liter per pack-year of smoking. Simulation studies show that the proposed methodology is sufficiently powered for realistic sample sizes and that it provides valid tests and effect size estimators in the presence of admixture and stratification.
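The reported interaction can be read as the cross-product term of a linear model for FEV1. The sketch below shows how such a term acts; every coefficient other than the reported −0.02 L per pack-year is a hypothetical stand-in:

```python
# Toy gene x environment model in the spirit of the abstract's finding:
#   FEV1 = b0 + b_g*G + b_e*packyears + b_ge*(G * packyears)
# b_ge = -0.02 L per pack-year is the reported interaction size;
# b0, b_g, b_e are invented for illustration.
def fev1_mean(genotype, pack_years,
              b0=3.5, b_g=-0.05, b_e=-0.01, b_ge=-0.02):
    return b0 + b_g * genotype + b_e * pack_years + b_ge * genotype * pack_years

# Difference-of-differences isolates the interaction: the extra FEV1 loss a
# carrier (G=1) suffers per 10 pack-years relative to a non-carrier.
extra_loss = ((fev1_mean(0, 10) - fev1_mean(1, 10))
              - (fev1_mean(0, 0) - fev1_mean(1, 0)))
print(extra_loss)   # -b_ge * 10 = 0.2 L
```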

Relevance: 30.00%

Abstract:

In the realm of computer programming, the experience of writing a program is used to reinforce concepts and evaluate ability. This research uses three case studies to evaluate the introduction of testing through Kolb's Experiential Learning Model (ELM). We then analyze the impact of those testing experiences to determine methods for improving future courses. The first testing experience that students encounter is unit test reports in their early courses. This course demonstrates that automating and improving feedback can provide more ELM iterations. The JUnit Generation (JUG) tool also provided a positive experience for the instructor by reducing the overall workload. Later, undergraduate and graduate students have the opportunity to work together in a multi-role Human-Computer Interaction (HCI) course. The interactions use usability analysis techniques, with graduate students as usability experts and undergraduate students as design engineers. Students gain experience testing the user experience of their product prototypes using methods ranging from heuristic analysis to user testing. From this course, we learned the importance of the instructor's role in the ELM. As more roles were added to the HCI course, a desire arose to deliver more complete, quality-assured software. This inspired the addition of unit testing experiences to the course. However, we learned that significant preparation must be made to apply the ELM when students are resistant. The research presented through these courses was driven by the recognition of a need for testing in a Computer Science curriculum. Our understanding of the ELM suggests the need for student experience when introducing testing concepts. We learned that experiential learning, when appropriately implemented, can provide benefits to the Computer Science classroom. When examined together, these course-based research projects provided insight into building strong testing practices into a curriculum.
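JUG generates JUnit feedback for Java submissions; a Python analogue of the kind of automated unit-test report students receive might look like this (the student function is hypothetical):

```python
import unittest

def student_sum(values):          # stand-in for a student's submitted code
    return sum(values)

class TestStudentSum(unittest.TestCase):
    """The kind of auto-generated check a JUG-like tool would emit."""
    def test_empty(self):
        self.assertEqual(student_sum([]), 0)
    def test_mixed_signs(self):
        self.assertEqual(student_sum([3, -1, 2]), 4)

# Run the suite and collect a result object (the "feedback" step of the ELM
# iteration) instead of exiting the process.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestStudentSum)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("failures:", len(result.failures))
```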

Relevance: 30.00%

Abstract:

Mainstreaming the LforS approach is a challenge due to diverging institutional priorities, customs, and expectations of classically trained staff. A workshop to test LforS theory and practice, and explore how to mainstream it, took place in a concrete context in a rural district of Mozambique, focusing on agricultural, forest and water resources. The evaluation showed that the principles of interaction applied made it possible to link rational knowledge with practical experience through mutual learning and iterative self-reflection. The combination of learning techniques was considered useful; participants called for further opportunities to apply the LforS methodology, proposing next steps.

Relevance: 30.00%

Abstract:

Background: There is evidence that drinking during residential treatment is related to various factors, such as patients' general control beliefs and self-efficacy, as well as to external control of alcohol use by program staff and situations where there is temptation to drink. As alcohol use during treatment has been shown to be associated with the resumption of alcohol use after discharge from residential treatment, we aimed to investigate how these variables are related to alcohol use during abstinence-oriented residential treatment programs for alcohol use disorders (AUD). Methods: In total, 509 patients who entered 1 of 2 residential abstinence-oriented treatment programs for AUD were included in the study. After detoxification, patients completed a standardized diagnostic procedure including interviews and questionnaires. Drinking was assessed by patients' self-report of at least 1 standard drink or by positive breathalyzer testing. The 2 residential programs were categorized as high or low control according to the average number of tests per patient. Results: Regression analysis revealed a significant interaction effect between internal and external control, suggesting that patients with a high internal locus of control and a high frequency of control by staff demonstrated the least alcohol use during treatment (16.7%), while patients with a low internal locus of control in programs with low external control were more likely to use alcohol during treatment (45.9%). No effects were found for self-efficacy and temptation. Conclusions: As alcohol use during treatment is most likely associated with poor treatment outcomes, external control may improve treatment outcomes and particularly support patients with a low internal locus of control, who show the highest risk for alcohol use during treatment. High external control may complement high internal control to improve alcohol use prevention while in treatment.
Key Words: Alcohol Dependence, Alcohol Use, Locus of Control, Alcohol Testing.
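The internal-by-external control interaction is a difference of differences on the log-odds scale. The two reported cell rates (16.7% and 45.9%) come from the abstract; the other two cells are invented purely to complete the 2x2 illustration:

```python
import math

def logit(p):
    """Log-odds of a probability p."""
    return math.log(p / (1 - p))

# Drinking rates by internal locus of control x program external control.
rates = {
    ("high_int", "high_ext"): 0.167,  # reported in the abstract
    ("low_int",  "low_ext"):  0.459,  # reported in the abstract
    ("high_int", "low_ext"):  0.20,   # hypothetical fill-in
    ("low_int",  "high_ext"): 0.20,   # hypothetical fill-in
}

# Interaction = difference of the log-odds differences across the table;
# a value far from zero signals that external control matters more for
# patients with low internal control.
interaction = ((logit(rates[("low_int", "low_ext")]) - logit(rates[("low_int", "high_ext")]))
               - (logit(rates[("high_int", "low_ext")]) - logit(rates[("high_int", "high_ext")])))
print(round(interaction, 3))
```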

Relevance: 30.00%

Abstract:

Hypertension (HT) is mediated by the interaction of many genetic and environmental factors. Previous genome-wide linkage analysis studies have found many loci that show linkage to HT or blood pressure (BP) regulation, but the results have generally been inconsistent. Gene-by-environment interaction is among the reasons that potentially explain these inconsistencies between studies. Here we investigate influences of gene-by-smoking (GxS) interaction on HT and BP in European American (EA), African American (AA) and Mexican American (MA) families from the GENOA study. A variance-component-based method was utilized to perform genome-wide linkage analysis of systolic blood pressure (SBP), diastolic blood pressure (DBP), and HT status, as well as bivariate analysis of SBP and DBP, for smokers, non-smokers, and the combined group. The most significant results were found for SBP in MA. The strongest signal was on chromosome 17q24 (LOD = 4.2), increasing to LOD = 4.7 in bivariate analysis, but there was no evidence of GxS interaction at this locus (p = 0.48). Two signals were identified in only one group: on chromosome 15q26.2 (LOD = 3.37) in non-smokers and chromosome 7q21.11 (LOD = 1.4) in smokers, both of which had strong evidence for GxS interaction (p = 0.00039 and 0.009, respectively). There were also two other signals, one on chromosome 20q12 (LOD = 2.45) in smokers, which became much higher in the combined sample (LOD = 3.53), and one on chromosome 6p22.2 (LOD = 2.06) in non-smokers. Neither peak had very strong evidence for GxS interaction (p = 0.08 and 0.06, respectively). A fine-mapping association study was performed using 200 SNPs in 30 genes located under the linkage signals on chromosomes 15 and 17. Under the chromosome 15 peak, the association analysis identified 6 SNPs accounting for a 7 mmHg increase in SBP in MA non-smokers. For the chromosome 17 linkage peak, the association analysis identified 3 SNPs accounting for a 6 mmHg increase in SBP in MA.
However, none of these SNPs remained significant after correcting for multiple testing, and accounting for them in the linkage analysis produced very small reductions in the linkage signal. The linkage analysis of BP traits considering smoking status produced very interesting signals for SBP in the MA population. The fine-mapping association analysis gave some insight into the contribution of some SNPs to two of the identified signals, but since these SNPs did not remain significant after multiple-testing correction and did not explain the linkage peaks, more work is needed to confirm these exploratory results and identify the culprit variations under these linkage peaks.
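The LOD scores quoted above (e.g. LOD = 4.2 on 17q24) are base-10 logarithms of likelihood ratios. A minimal sketch with hypothetical likelihood values:

```python
import math

def lod(likelihood_linked, likelihood_unlinked):
    """LOD = log10 of the likelihood ratio for linkage vs. no linkage."""
    return math.log10(likelihood_linked / likelihood_unlinked)

# A LOD of 3 means the data are 1000x more likely under linkage than under
# no linkage, the traditional genome-wide significance benchmark.
print(lod(1000.0, 1.0))
```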

Relevance: 30.00%

Abstract:

Triglyceride levels are a component of plasma lipids that are thought to be an important risk factor for coronary heart disease and are influenced by genetic and environmental factors, such as single nucleotide polymorphisms (SNPs), alcohol intake, and smoking. This study used longitudinal data from the Bogalusa Heart Study, a biracial community-based survey of cardiovascular disease risk factors. A sample of 1191 individuals, 4 to 38 years of age, was measured multiple times from 1973 to 2000. The study sample consisted of 730 white and 461 African American participants. Individual growth models were developed in order to assess gene-environment interactions affecting plasma triglycerides over time. After testing for inclusion of significant covariates and interactions, final models, each accounting for the effects of a different SNP, were assessed for fit and normality. After adjustment for all other covariates and interactions, LIPC -514C/T was found to interact with age³, age², and age, and a non-significant interaction of CETP -971G/A genotype with smoking status was found (p = 0.0812). Ever-smokers had higher triglyceride levels than never-smokers, but persons heterozygous at this locus, about half of both races, had higher triglyceride levels after smoking cessation compared to current smokers. Since tobacco products increase free fatty acids circulating in the bloodstream, smoking cessation programs have the potential to ultimately reduce triglyceride levels for many persons. However, due to the effect of smoking cessation on the triglyceride levels of CETP -971G/A heterozygotes, the need for smoking prevention programs is also demonstrated. Both smoking cessation and prevention programs would have a great public health impact on minimizing triglyceride levels and ultimately reducing heart disease.

Relevance: 30.00%

Abstract:

Numerous studies have been carried out to try to better understand the genetic predisposition to cardiovascular disease. Although it is widely believed that multifactorial diseases such as cardiovascular disease result from the effects of many genes acting alone or interacting with other genes, most genetic studies have focused on identifying cardiovascular disease susceptibility genes and usually ignore the effects of gene-gene interactions in the analysis. The current study applies a novel linkage-disequilibrium-based statistic for testing interactions between two linked loci using data from a genome-wide study of cardiovascular disease. A total of 53,394 single nucleotide polymorphisms (SNPs) are tested for pair-wise interactions, and 8,644 interactions are found to be significant with p-values less than 3.5×10⁻¹¹. Results indicate that known cardiovascular disease susceptibility genes tend not to have many significant interactions. One SNP in the CACNG1 (calcium channel, voltage-dependent, gamma subunit 1) gene and one SNP in the IL3RA (interleukin 3 receptor, alpha) gene are found to have the most significant pair-wise interactions. Findings from the current study should be replicated in an independent cohort to eliminate potential false positive results.
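The 3.5×10⁻¹¹ significance threshold is consistent with a Bonferroni correction of α = 0.05 over all pairwise tests among the 53,394 SNPs, which can be checked directly:

```python
import math

# Bonferroni threshold for testing every SNP pair at family-wise alpha = 0.05.
n_snps = 53394
n_pairs = math.comb(n_snps, 2)       # number of pairwise interaction tests
threshold = 0.05 / n_pairs
print(n_pairs, threshold)            # ~1.43e9 pairs, threshold ~3.5e-11
```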

Relevance: 30.00%

Abstract:

Interaction effects are an important scientific interest in many areas of research. A common approach for investigating the interaction effect of two continuous covariates on a response variable is a cross-product term in multiple linear regression. In epidemiological studies, the two-way analysis of variance (ANOVA) type of method has also been utilized to examine the interaction effect by replacing the continuous covariates with their discretized levels. However, the implications of the model assumptions of either approach have not been examined, and statistical validation has focused only on the general method, not specifically on the interaction effect. In this dissertation, we investigated the validity of both approaches based on the mathematical assumptions for non-skewed data. We showed that linear regression may not be an appropriate model when the interaction effect exists because it implies a highly skewed distribution for the response variable. We also showed that the normality and constant variance assumptions required by ANOVA are not satisfied in the model where the continuous covariates are replaced with their discretized levels. Therefore, naïve application of the ANOVA method may lead to an incorrect conclusion. Given the problems identified above, we proposed a novel method, modified from the traditional ANOVA approach, to rigorously evaluate the interaction effect. The analytical expression of the interaction effect was derived based on the conditional distribution of the response variable given the discretized continuous covariates. A testing procedure that combines the p-values from each level of the discretized covariates was developed to test the overall significance of the interaction effect. According to the simulation study, the proposed method is more powerful than least squares regression and the ANOVA method in detecting the interaction effect when data come from a trivariate normal distribution.
The proposed method was applied to a dataset from the National Institute of Neurological Disorders and Stroke (NINDS) tissue plasminogen activator (t-PA) stroke trial, and a baseline age-by-weight interaction effect was found significant in predicting the change from baseline in NIHSS at Month 3 among patients who received t-PA therapy.
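The dissertation combines per-level p-values into one overall test; the abstract does not name the combiner, so the sketch below uses Fisher's method as a standard stand-in, with hypothetical per-level p-values:

```python
import math

def fisher_statistic(p_values):
    """Fisher's combined statistic: -2 * sum(ln p_i), which follows a
    chi-squared distribution with 2k degrees of freedom under the null
    of no interaction at any level."""
    return -2.0 * sum(math.log(p) for p in p_values)

p_per_level = [0.04, 0.20, 0.51]     # hypothetical p-values, one per level
stat = fisher_statistic(p_per_level)
df = 2 * len(p_per_level)
print(stat, df)
```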

Relevance: 30.00%

Abstract:

The purpose of this study was to investigate whether an incongruence between personality characteristics of individuals and concomitant characteristics of health professional training environments on salient dimensions contributes to aspects of mental health. The dimensions examined were practical-theoretical orientation and the degree of structure-unstructure. They were selected for study because they are particularly important attributes of students and of learning environments. It was proposed that when the demand of the environment is disparate from the proclivities of the individual, strain arises. This strain was hypothesized to contribute to anxiety, depression, and subjective distress. Select subscales of the Omnibus Personality Inventory (OPI) were the operationalized measures for the personality component of the dimensions studied. An environmental index was developed to assess students' perceptions of the learning environment on these same dimensions. The Beck Depression Inventory, State-Trait Anxiety Inventory and General Well-Being Schedule measured the outcome variables. A congruence model was employed to determine person-environment (P-E) interaction. Scores on the scales of the OPI and the environmental index were divided into high, medium, and low based on the range of scores. Congruence was defined as a match between the level of personality need and the complementary level of the perception of the environment; alternatively, incongruence was defined as a mismatch between the person and the environment. The congruent category was compared to the incongruent categories by an analysis of variance procedure. Furthermore, analyses of covariance were conducted with perceived supportiveness of the learning environment and life events external to the learning environment as the covariates.
These factors were considered critical influences affecting the outcome measures. One hundred and eighty-five students (49% of the population) at the College of Optometry at the University of Houston participated in the study. Students in all four years of the program were equally represented in the study. However, the sample differed from the total population in representation by sex, marital status, and undergraduate major. The results of the study did not support the hypotheses. Further, after adjusting for perceived supportiveness and life events external to the learning environment, there were no statistically significant differences between the congruent category and the incongruent categories. Means indicated that the study sample experienced significantly lower depression and subjective distress than the normative samples. Results are interpreted in light of their utility for future study design in the investigation of the effects of P-E interaction. The question of the feasibility of testing a P-E interaction model with extant groups is emphasized. Recommendations for subsequent research are proposed in light of the exploratory nature of the methodology.
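The congruence classification described above is a simple matching rule: a person is congruent when their level (high/medium/low) on a dimension matches the level perceived in the environment. A minimal sketch, with assumed cut-off scores:

```python
# Classify a score into high/medium/low; the cut-offs are assumptions for
# illustration, not the study's actual range-based divisions.
def level(score, low_cut=33, high_cut=66):
    if score < low_cut:
        return "low"
    if score > high_cut:
        return "high"
    return "medium"

def congruent(person_score, env_score):
    """Congruence = matching person and environment levels on a dimension."""
    return level(person_score) == level(env_score)

print(congruent(80, 72), congruent(80, 40))   # matched vs mismatched levels
```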

Relevance: 30.00%

Abstract:

The verification and validation activity plays a fundamental role in improving software quality. Determining which are the most effective techniques for carrying out this activity has been an aspiration of experimental software engineering researchers for years. This paper reports a controlled experiment evaluating the effectiveness of two unit testing techniques: the functional testing technique known as equivalence partitioning (EP) and the control-flow structural testing technique known as branch testing (BT). This experiment is a literal replication of Juristo et al. (2013). Both experiments serve the purpose of determining whether the effectiveness of BT and EP varies depending on whether or not the faults are visible to the technique (InScope or OutScope, respectively). We have used the materials, design and procedures of the original experiment, but in order to adapt the experiment to the context we have: (1) reduced the number of studied techniques from 3 to 2; (2) assigned subjects to experimental groups by means of stratified randomization to balance the influence of programming experience; (3) localized the experimental materials; and (4) adapted the training duration. We ran the replication at the Escuela Politécnica del Ejército Sede Latacunga (ESPEL) as part of a software verification & validation course. The experimental subjects were 23 master's degree students. EP is more effective than BT at detecting InScope faults. The session/program and group variables are found to have significant effects. BT is more effective than EP at detecting OutScope faults. The session/program and group variables have no effect in this case. The results of the replication and the original experiment are similar with respect to testing techniques. There are some inconsistencies with respect to the group factor. They can be explained by small sample effects.
The results for the session/program factor are inconsistent for InScope faults. We believe that these differences are due to a combination of the fatigue effect and a technique × program interaction. Although we were able to reproduce the main effects, the changes to the design of the original experiment make it impossible to identify the causes of the discrepancies for sure. We believe that further replications closely resembling the original experiment should be conducted to improve our understanding of the phenomena under study.
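Equivalence partitioning, one of the two techniques compared, derives one test case per input class rather than per code branch. A toy sketch with a hypothetical grading function and illustrative partitions:

```python
# Hypothetical function under test: the 0-100 range and 50-point pass mark
# are invented boundaries for the example.
def grade(score):
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# EP picks one representative per equivalence class:
# invalid-low, valid-fail, valid-pass, invalid-high.
cases = [(-5, ValueError), (30, "fail"), (75, "pass"), (120, ValueError)]
for value, expected in cases:
    try:
        result = grade(value)
    except ValueError:
        result = ValueError
    assert result == expected
print("all EP classes covered")
```

Branch testing would instead derive cases from the function's control-flow edges; the experiment asks which strategy finds which class of faults.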


Relevance: 30.00%

Abstract:

This document is a summary of the Bachelor thesis titled "VHDL-Based System Design of a Cognitive Sensorimotor Loop (CSL) for Haptic Human-Machine Interaction (HMI)" written by Pablo de Miguel Morales, an Electronics Engineering student at the Universidad Politécnica de Madrid (UPM, Madrid, Spain) during an Erasmus+ exchange program at the Beuth Hochschule für Technik (BHT, Berlin, Germany). The tutor of this project is Prof. Dr. Hild. The project was developed inside the Neurobotics Research Laboratory (NRL) in close collaboration with Benjamin Panreck, a member of the NRL, and another exchange student from the UPM, Pablo Gabriel Lezcano. For a deeper comprehension of the content of the thesis, a closer look at the full document is needed, as well as viewing the videos and the VHDL design. In the growing field of automation, a large amount of workforce is dedicated to improving, adapting and designing motor controllers for a wide variety of applications. In the specific field of robotics and other machinery designed to interact with humans or their environment, new needs and technological solutions keep being discovered because the scenario is new and relatively unexplored. The project consisted of three main parts: two VHDL-based systems and one short experiment on haptic perception. Both VHDL systems are based on a Cognitive Sensorimotor Loop (CSL), a control loop designed by the NRL and mainly developed by Prof. Dr. Hild. The main characteristic of the CSL is that it does not use any external sensor to measure the speed or position of the motor; it uses the motor itself. The motor always generates a voltage proportional to its angular speed, so it does not need calibration. This method is energy efficient and simplifies control loops in complex systems.
The first system, named CSL Stay In Touch (SIT), consists of a one-DC-motor system controlled by an FPGA board (Zynq ZYBO 7000) whose aim is to keep contact with any external object that touches its sensing platform, in both directions. Apart from the main behavior, three features (Search Mode, Inertia Mode and Return Mode) have been designed to enhance the haptic interaction experience. Additionally, a VGA screen is also controlled by the FPGA board for monitoring the whole system. This system has been completely developed, tested and improved, and its timing and power consumption properties analyzed. The second system, named CSL Fingerlike Mechanism (FM), consists of a finger-like mechanical system controlled by two DC motors, each controlling one part of the finger. The behavior is similar to the first system but with a more complex structure. This system was optional and not part of the original objectives of the thesis, and it could not be properly finished and tested due to lack of time. The haptic perception experiment was conducted to gain insight into the complexity of human haptic perception in order to apply this knowledge to technological applications. The experiment tested the capability of subjects to recognize different objects and shapes while blindfolded and with their ears covered. Two groups were formed: one had full haptic perception, while the other had to explore the environment with a plastic piece attached to their finger to create a haptic handicap. The conclusion of the thesis was that a haptic system based only on a CSL is not enough to retrieve valuable information from the environment and that other sensors (temperature, pressure, etc.) are needed, but that a CSL-based system is very useful for controlling the force applied by the system when interacting with haptically sensitive surfaces such as skin or tactile screens.
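The sensorless principle behind the CSL, using the motor's back-EMF as its own speed sensor, can be sketched from the DC-motor terminal equation v = i·R + kₑ·ω. The constants below are hypothetical, not the thesis hardware's values:

```python
# Back-EMF-based speed estimation, the sensing idea underlying the CSL.
# Both constants are assumed values for illustration only.
K_E = 0.05   # back-EMF constant in V per rad/s (assumed)
R_W = 1.2    # winding resistance in ohms (assumed)

def estimated_speed(v_terminal, current):
    """Infer angular speed (rad/s) from v = i*R + k_e*w, i.e.
    w = (v - i*R) / k_e, with no external encoder."""
    return (v_terminal - current * R_W) / K_E

print(estimated_speed(6.2, 1.0))   # ~100 rad/s for these assumed constants
```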
This document is a summary of the final degree project titled “VHDL-Based System Design of a Cognitive Sensorimotor Loop (CSL) for Haptic Human-Machine Interaction (HMI)”, written by Pablo de Miguel, a student of Electronics and Communications Engineering at the Universidad Politécnica de Madrid (UPM Madrid, Spain), during an Erasmus+ exchange program at the Beuth Hochschule für Technik (BHT Berlin, Germany). The tutor of this project was Dr. Prof. Hild. The project was developed within the Neurorobotics Research Laboratory (NRL) in close collaboration with Benjamin Panreck (a member of the NRL) and with Pablo Lezcano (another exchange student from the UPM). A full understanding of the work requires a careful reading of the whole document, as well as viewing the videos and analyzing the VHDL design included in the attached CD. In the growing field of automation, a great deal of effort is devoted to improving, adapting and designing motor controllers for a wide range of applications. In the specific field of robotics, or other machinery designed to interact with humans or their environment, new needs and technological solutions keep being developed because of the relatively unexplored new scenario they represent. The project consists of three main parts: two VHDL-based systems and a small experiment on haptic perception. Both VHDL systems are based on the Cognitive Sensorimotor Loop (CSL), a control loop created by the NRL whose main developer has been Dr. Prof. Hild. The CSL is a control loop whose main characteristic is the absence of external sensors to measure the speed or position of the motor, using the motor itself as a sensor. The motor always generates a voltage proportional to its angular speed, so no calibration is needed. This method is energy-efficient and simplifies control loops in complex systems. 
The first system, called CSL Stay In Touch (SIT), consists of a system formed by a DC motor controlled by an FPGA board (Zynq ZYBO 7000) whose goal is to keep contact, in both directions, with any external object that touches its sensing platform. Besides the basic behavior, three modes (Search Mode, Inertia Mode and Return Mode) were designed to improve the interaction. Additionally, control of a VGA screen through the FPGA board was designed for monitoring the whole system. The system has been fully developed, tested and improved, and its timing and power-consumption properties have been analyzed. The second system, called CSL Fingerlike Mechanism (FM), consists of a finger-like mechanism controlled by two DC motors, each driving one phalanx. Its behavior is similar to that of the first system but with a more complex structure. This system was not among the initial objectives of the project and was therefore optional; it could not be fully developed for lack of time. The haptic perception experiment was designed to gain deeper insight into human haptic perception with the goal of applying this knowledge to technological applications. The experiment tested the ability of the subjects to recognize different objects, shapes and textures while deprived of hearing and sight. Two groups were formed: in one, the subjects had full haptic perception, while in the other they had to interact with the objects through a plastic piece in order to create a haptic handicap. The conclusion of the project was that a haptic system based solely on CSL systems is not enough to gather valuable information from the environment and must rely on additional sensors (temperature, pressure, etc.).
In contrast, a CSL-based system is well suited to controlling the force the system applies when interacting with haptically sensitive surfaces such as skin or touch screens.
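The Stay In Touch behavior and its three auxiliary modes described above can be pictured as a small mode controller that reads only the motor's back-EMF. The following Python sketch is purely illustrative: the mode names (Search, Inertia, Return) come from the thesis summary, but the transition logic, the contact threshold, and the function itself are assumptions, not the actual VHDL implementation.

```python
# Hypothetical sketch of one step of an SIT-style mode controller.
# A nonzero back-EMF means an external object is moving the platform,
# which the system interprets as contact. Threshold value is assumed.

TOUCH_THRESHOLD = 0.1  # back-EMF magnitude [V] that counts as contact

def next_mode(mode: str, back_emf: float) -> str:
    """Return the next controller mode given the current mode and the
    back-EMF reading from the latest sensing phase."""
    if abs(back_emf) > TOUCH_THRESHOLD:
        return "contact"   # follow the object that is touching the platform
    if mode == "contact":
        return "inertia"   # contact lost: keep moving briefly (Inertia Mode)
    if mode == "inertia":
        return "return"    # then drift back to a rest position (Return Mode)
    return "search"        # otherwise sweep until something touches (Search Mode)

# A touch pulls any mode into "contact"; losing it walks through the fallbacks.
print(next_mode("search", 0.5))   # touch detected while searching
print(next_mode("contact", 0.0))  # touch released
```

Expressed this way, the whole loop needs only one analog input (the motor terminals during the sensing phase), which is the property the thesis highlights as making the CSL calibration-free and energy-efficient.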