968 results for Visual Basic (Programming Language)
Abstract:
With the growth of new technologies, online tools have become part of everyday life. This has had a particular impact on researchers, since the data obtained from experiments needs to be analyzed, and knowledge of programming has become mandatory even for pure biologists. Hence, VTT developed a new tool, R Executables (REX), a web application designed to provide a graphical interface for biological data analysis tasks such as image analysis, gene expression data analysis, plotting, and disease and control studies, using R functions to produce the results. REX gives biologists an interactive application in which they can enter values directly and run the required analysis with a single click; the program processes the given data in the background and returns results rapidly. With the growth of data and load on the server, however, the interface developed problems: long processing times, a poor GUI, data storage issues, security weaknesses, a minimally interactive user experience, and crashes on large amounts of data. This thesis describes the methods by which these problems were resolved to make REX a better application for the future. The old REX was developed using Python and Django; the new version is implemented with Vaadin, a Java framework for developing web applications whose programming model is essentially Java enriched with ready-made components. Vaadin provides better security, better speed, and a good, interactive interface. In this thesis, a subset of REX's functionality, comprising IST bulk plotting and image segmentation, was selected and implemented in Vaadin. I wrote 662 lines of code, with Vaadin handling the front end while the R language was used for back-end data retrieval, computation, and plotting. The application is structured so that further functionality can be migrated with ease from the old REX. Future development will focus on adding high-throughput screening functions along with gene expression database handling.
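As a rough illustration of the front-end/back-end split the abstract describes, the following minimal sketch shows a Vaadin component forwarding a value to R and displaying the result. It is illustrative only, not the thesis code: it assumes a Vaadin 8 style API and an Rserve daemon reachable through the org.rosuda REngine client, and the R expression is a trivial stand-in for the thesis's segmentation and plotting functions.

```java
import com.vaadin.ui.Button;
import com.vaadin.ui.Label;
import com.vaadin.ui.TextField;
import com.vaadin.ui.VerticalLayout;
import org.rosuda.REngine.REXP;
import org.rosuda.REngine.Rserve.RConnection;

// Hypothetical sketch: a Vaadin view that sends a user-entered value to R
// via Rserve and shows the result. Not the actual REX code.
public class RexAnalysisView extends VerticalLayout {
    public RexAnalysisView() {
        TextField input = new TextField("Threshold");
        Label result = new Label();
        Button run = new Button("Run analysis", click -> {
            try {
                RConnection r = new RConnection(); // assumes a local Rserve daemon
                // Trivial stand-in for a real analysis function on the R side.
                REXP out = r.eval("mean(rnorm(100, mean=" + input.getValue() + "))");
                result.setValue("Result: " + out.asDouble());
                r.close();
            } catch (Exception e) {
                result.setValue("R call failed: " + e.getMessage());
            }
        });
        addComponents(input, run, result);
    }
}
```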
Abstract:
The vast majority of our contemporary society owns a mobile phone, which has resulted in a dramatic rise in the number of networked computers in recent years. Security issues have followed the same trend, and nearly everyone is now affected by them. How could the situation be improved? For software engineers, an obvious answer is to build software with security in mind. The problem with doing so is how to define secure software, or how to measure security. This thesis divides the problem into three research questions. First, how can we measure the security of software? Second, what types of tools are available for measuring security? And finally, what do these tools reveal about the security of software? Measurement tools of this kind are commonly called metrics. This thesis takes the perspective of software engineers in the software design phase. The focus on the design phase means that code-level semantics and programming-language specifics are not discussed in this work; organizational policy, management issues, and the software development process are also out of scope. The first two research questions were studied through a literature review, while the third was studied through case study research. The target of the case study was a Java-based email server, Apache James, whose changelog, security issue history, and source code were available. The research revealed that there is a consensus on the terminology of software security. Security verification activities are commonly divided into evaluation and assurance; the focus of this work was on assurance, which means verifying one's own work. There are 34 metrics available for security measurement, of which five are evaluation metrics and 29 are assurance metrics. We found, however, that the general quality of these metrics was not good. Only three metrics in the design category passed the inspection criteria and could be used in the case study. The metrics claim to give quantitative information on the security of software, but in practice they were limited to comparing different versions of the same software. Apart from being relative, the metrics were unable to detect security issues or point out problems in the design, and interpreting their results was difficult. In conclusion, the general state of software security metrics leaves a lot to be desired. The metrics studied had both theoretical and practical issues and are not suitable for daily engineering workflows. They nevertheless provide a basis for further research, since they point out the areas in which security metrics must improve if verification of security at the design stage is desired.
Abstract:
Dynamic logic is an extension of modal logic originally intended for reasoning about computer programs. The well-known Hoare logic method of proving correctness properties of a computer program can be implemented by exploiting the expressive power of dynamic logic. For a very broad range of languages and applications in program verification, a theorem prover named KIV (Karlsruhe Interactive Verifier) has already been developed, but its high degree of automation and its complexity make it difficult to use for educational purposes. My research is aimed at the design and implementation of a similar interactive theorem prover with educational use as its main design criterion. Since the key purpose of the system is to serve as an educational tool, it is self-explanatory: it explains every step of creating a derivation, i.e., of proving a theorem. The deductive system is implemented in the platform-independent programming language Java. In addition, a very popular combination of the lexical analyzer generator JFlex and the parser generator BYacc/J is used for parsing formulas and programs.
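To make the object language concrete: in dynamic logic, a formula [α]φ states that φ holds after every terminating execution of program α, which is how Hoare triples are encoded. The sketch below shows one way such formulas might be represented as an abstract syntax tree in Java, the kind of structure a JFlex/BYacc/J parser would build. All names are invented for illustration; this is not the prover's actual design.

```java
// Hypothetical AST for dynamic logic formulas; illustrative only.
interface Program {}
record Assign(String var, String expr) implements Program {}    // x := e
record Seq(Program first, Program second) implements Program {} // alpha ; beta

interface Formula {}
record Atom(String name) implements Formula {}
record Not(Formula f) implements Formula {}
record And(Formula left, Formula right) implements Formula {}
// Box modality: [alpha]phi -- phi holds after every run of alpha.
record Box(Program alpha, Formula phi) implements Formula {}
// Diamond modality: <alpha>phi -- some run of alpha ends with phi.
record Diamond(Program alpha, Formula phi) implements Formula {}

class Demo {
    public static void main(String[] args) {
        // Encodes the Hoare-style claim {true} x := 1 {x = 1} as [x := 1](x = 1).
        Formula f = new Box(new Assign("x", "1"), new Atom("x = 1"));
        System.out.println(f);
    }
}
```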
Abstract:
ENGLISH VERSION AVAILABLE FROM THE DEPARTMENT; THESIS CARRIED OUT JOINTLY WITH THE SCHOOL OF COMMUNICATION SCIENCES OF MCGILL UNIVERSITY (DRS. K. STEINHAUER AND J.E. DRURY).
Abstract:
To optimize the in-memory representation of Scheme records in the Gambit compiler, we introduced into it a type-annotation system and vectors holding an abbreviated representation of records. These abbreviated records omit the reference to the type descriptor and the header normally present on every record, and instead use a typing tree spanning the whole memory to recover the vector that holds a reference. These new features are implemented through changes to the Gambit runtime. We introduce new primitives into the language and modify the existing architecture to handle the new data types correctly. The garbage collector must be modified to account for records containing heterogeneous values with irregular alignment, and for references held inside other objects. Management of the typing tree must also be automatic. We then run a series of performance tests to determine whether gains are possible with these new primitives. We observe a major performance improvement in allocation and GC behavior for large typed records and for vectors of records, typed or not. Slight overheads are incurred, however, on field accesses and, in the case of vectors of records, on access to the type descriptor.
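The central trick in this abstract is recovering the type of a header-less record from its address alone, via a typing tree that spans memory. As a loose illustration of the lookup (written in Java rather than the Scheme/C runtime, with invented names), an ordered map keyed by the start address of each allocated region can answer type queries for any interior pointer:

```java
import java.util.Map;
import java.util.TreeMap;

// Loose illustration with invented names: mapping an address back to a
// type descriptor through an ordered "typing tree", in the spirit of the
// header-less records described above. Not the Gambit runtime's code.
class TypingTree {
    private record Region(long end, String typeDescriptor) {}
    private final TreeMap<Long, Region> regions = new TreeMap<>();

    // Declare that addresses [start, end) hold records of the given type.
    void register(long start, long end, String typeDescriptor) {
        regions.put(start, new Region(end, typeDescriptor));
    }

    // Map an interior pointer to its region's type descriptor, if any.
    String descriptorFor(long address) {
        Map.Entry<Long, Region> e = regions.floorEntry(address);
        return (e != null && address < e.getValue().end)
                ? e.getValue().typeDescriptor
                : null;
    }
}
```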
Abstract:
It is now well established that the human brain is equipped with a system of neurons that activates both during the perception and the execution of an action. Mirror neurons, together with the system they form with adjacent structures, known as the mirror neuron system (MNS), have been linked to action understanding and may be involved in high-level social functions such as empathy and imitation. In the wave of speculation linking the MNS to the social domain, dysfunction of this system quickly gained interest as a possible origin of the social impairments of people with Autism Spectrum Disorder (ASD). Nevertheless, the hypothesis that the social dysfunction of ASD rests on an impaired MNS is controversial. Indeed, the studies supporting this hypothesis call on cognitive and social functions that may themselves contribute to atypical results, such as understanding instructions, attending to social stimuli, or performing motor acts. Recently, an auditory mismatch negativity (MMN) protocol using stimuli related to human action has been used to measure MNS activity. This technique seems promising insofar as it requires no attentional or language abilities, is brief, and needs only a minimal electrode montage. The main objective of the first article was to measure the convergent validity of the action-related MMN protocol against the mu rhythm protocol, the one most often used to record mirror activity with electroencephalography (EEG). The stimulation modes were delivered in successive blocks to a group of 12 healthy adults. Since the two techniques effectively modulated fronto-central and central regions respectively but were not correlated, we concluded that they may measure different aspects of the MNS. The main objective of the second article was to measure MNS activity with the action-related MMN protocol in 10 children with ASD and in 12 neurotypical children in the same age range (5-7 years). In the ASD children we found a reversed latency pattern compared with the control group: they processed the control sounds faster than the sounds related to human action, whereas the opposite trend was observed in controls. Moreover, although the two groups differed on the action sounds, they did not differ on the control sounds. As for amplitude, the ASD children were distinguished from the control group by a reduced amplitude for the mouth action sound. Furthermore, the neurophysiological and neuropsychological measures were not correlated. In sum, based on the premise that this MMN protocol may measure MNS activity, this thesis aims to improve knowledge about its use in neurotypical adults and children as well as in children with ASD. It could ultimately serve as a potential biomarker of ASD.
Abstract:
The flexibility of the robot is the key to its success as a viable aid to production. Flexibility can be pursued in two directions. The first is to increase the physical generality of the robot so that it can be easily reconfigured to handle a wide variety of tasks. The second is to increase the robot's ability to interact with its environment so that tasks can still be completed successfully in the presence of uncertainties. Articulated hands are capable of adapting to a wide variety of grasp shapes, hence reducing the need for special tooling. The availability of low-mass, high-bandwidth points close to the manipulated object also offers significant improvements in the control of fine motions. This thesis provides a framework for using articulated hands to perform local manipulation of objects. In particular, it addresses the issues in effecting compliant motions of objects in Cartesian space. The Stanford/JPL hand is used as an example to illustrate a number of concepts, and the examples provide a unified methodology for controlling articulated hands grasping with point contacts. We also present a high-level hand programming system based on the methodologies developed in this thesis. Compliant motion of grasped objects and dexterous manipulations can be easily described in the LISP-based hand programming language.
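For context, "grasping with point contacts" and "compliant motion" have a standard textbook formulation; the following is general background, not necessarily the thesis's own notation. The contact forces \(f\) applied by the fingertips combine through a grasp matrix \(G\) into the net wrench \(w\) on the object, and a compliant controller commands a wrench proportional to the pose error through a programmable stiffness \(K\):

\[
w = G\,f, \qquad w_{\mathrm{cmd}} = K\,(x_d - x),
\]

where \(x_d\) is the desired object pose in Cartesian space and \(x\) the actual pose; finding fingertip forces \(f\) that realize \(w_{\mathrm{cmd}}\) is the core computation in point-contact grasp control.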
Abstract:
The statistical analysis of compositional data should be based on logratios of parts, which are difficult to use correctly in standard statistical packages. For this reason a freeware package, named CoDaPack, was created. This software implements most of the basic statistical methods suitable for compositional data. In this paper we describe the new version of the package, now called CoDaPack3D. It is developed in Visual Basic for Applications (associated with Excel©), Visual Basic, and OpenGL, and it is oriented towards users with minimal computer knowledge, with the aim of being simple and easy to use. This new version includes new graphical output in 2D and 3D; these outputs can be zoomed and, in 3D, rotated. A customization menu is included, and outputs can be saved in JPEG format. This version also offers interactive help, and all dialog windows have been improved to make the package easier to use. To use CoDaPack one opens Excel© and enters the data in a standard spreadsheet, organized as a matrix in which Excel© rows correspond to observations and columns to parts. The user executes macros that return numerical or graphical results. There are two kinds of numerical results, new variables and descriptive statistics, and both appear on the same sheet. Graphical output appears in independent windows. In the present version there are 8 menus, with a total of 38 submenus which, after some dialogue, directly call the corresponding macro. The dialogues ask the user to input the variables and further parameters needed, as well as where to put the results. The web site http://ima.udg.es/CoDaPack hosts this freeware package; only Microsoft Excel© under Microsoft Windows© is required to run the software. Key words: Compositional Data Analysis, Software
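For reference, the logratio transformations underpinning this kind of package are standard in compositional data analysis (the abstract does not enumerate which ones CoDaPack3D offers). For a composition \(\mathbf{x} = (x_1, \dots, x_D)\) of strictly positive parts, the additive and centered logratios are

\[
\operatorname{alr}(\mathbf{x}) = \left(\ln\frac{x_1}{x_D},\ \dots,\ \ln\frac{x_{D-1}}{x_D}\right),
\qquad
\operatorname{clr}(\mathbf{x}) = \left(\ln\frac{x_1}{g(\mathbf{x})},\ \dots,\ \ln\frac{x_D}{g(\mathbf{x})}\right),
\]

where \(g(\mathbf{x}) = (x_1 x_2 \cdots x_D)^{1/D}\) is the geometric mean of the parts. Standard statistics are then computed on the transformed values rather than on the raw proportions.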
Abstract:
In each teaching unit the title is preceded by: Técnico Superior en Desarrollo de Aplicaciones Informáticas, and the labels read: Formación Profesional a Distancia and Ciclo Formativo de Grado Superior.
Abstract:
This paper presents the design and implementation of a mission control system (MCS) for an autonomous underwater vehicle (AUV) based on Petri nets. In the proposed approach the Petri nets are used both to specify and to execute the desired autonomous vehicle mission. The mission is easily described using an imperative programming language called the mission control language (MCL), which formally describes the mission execution thread. A mission control language compiler (MCL-C), able to automatically translate MCL into a Petri net, is described, and a real-time Petri net player that executes the resulting Petri net onboard an AUV is also presented.
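To make the execution model concrete, the sketch below shows the core of Petri net execution: a transition is enabled when every input place holds at least one token, and firing it consumes a token from each input place and produces one in each output place. This is a generic illustration with invented names, not the MCL-C or player code.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch (invented names): marking-based execution of a
// Petri net, the model a player like the one above would run onboard.
class PetriNetPlayer {
    record Transition(String name, List<String> inputs, List<String> outputs) {}

    private final Map<String, Integer> marking = new HashMap<>(); // tokens per place

    void put(String place, int tokens) { marking.merge(place, tokens, Integer::sum); }

    boolean isEnabled(Transition t) {
        return t.inputs().stream().allMatch(p -> marking.getOrDefault(p, 0) > 0);
    }

    // Fire: consume one token per input place, produce one per output place.
    boolean fire(Transition t) {
        if (!isEnabled(t)) return false;
        t.inputs().forEach(p -> marking.merge(p, -1, Integer::sum));
        t.outputs().forEach(p -> marking.merge(p, 1, Integer::sum));
        return true;
    }

    public static void main(String[] args) {
        PetriNetPlayer net = new PetriNetPlayer();
        net.put("missionStart", 1);
        Transition goTo = new Transition("goToWaypoint",
                List.of("missionStart"), List.of("atWaypoint"));
        System.out.println("fired: " + net.fire(goTo) + ", marking: " + net.marking);
    }
}
```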
Abstract:
The collection of Computer Applications course materials. Lectures, labs, additional resources, the lot!
Abstract:
Wednesday 23rd April 2014. Speaker(s): Willi Hasselbring. Organiser: Leslie Carr. Time: 23/04/2014 14:00-15:00. Location: B32/3077. File size: 802Mb.
Abstract: The internal behavior of large-scale software systems cannot be determined on the basis of static (e.g., source code) analysis alone. Kieker provides complementary dynamic analysis capabilities, i.e., monitoring/profiling and analyzing a software system's runtime behavior. Application Performance Monitoring is concerned with continuously observing a software system's performance-specific runtime behavior, including analyses like assessing service level compliance or detecting and diagnosing performance problems. Architecture Discovery is concerned with extracting architectural information from an existing software system, including both structural and behavioral aspects, like identifying architectural entities (e.g., components and classes) and their interactions (e.g., local or remote procedure calls). In addition to the Architecture Discovery of Java systems, Kieker supports Architecture Discovery for other platforms, including legacy systems implemented, for instance, in C#, C++, Visual Basic 6, COBOL or Perl. Thanks to Kieker's extensible architecture it is easy to implement and use custom extensions and plugins. Kieker was designed for continuous monitoring in production systems, inducing only a very low overhead, which has been evaluated in extensive benchmark experiments. Please refer to http://kieker-monitoring.net/ for more information.
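As a rough illustration of what an operation-execution monitoring probe records, the sketch below instruments a method to log its execution time. This is a generic, hand-rolled example, not Kieker's actual API; a real framework would write a structured record to a monitoring stream rather than print it.

```java
// Generic sketch of an operation-execution probe (not Kieker's API):
// entry/exit timestamps are captured so a later analysis stage can
// assess service levels or diagnose performance problems.
class MonitoredService {
    String handleRequest(String input) {
        long start = System.nanoTime();
        try {
            return input.toUpperCase();    // stands in for the business logic
        } finally {
            long durationNs = System.nanoTime() - start;
            // A real monitoring framework would emit a record here;
            // printing stands in for that.
            System.out.printf("handleRequest took %d ns%n", durationNs);
        }
    }
}
```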
Abstract:
An educational computer program for computer-assisted teaching and learning of automatic control, tutorial, didactic, and interactive in character, produced by lecturers of the Escuela Técnica Superior de Béjar and the Facultad de Ciencias de Salamanca. The scientific computing and graphical visualization program MATLAB, widely used and well known in scientific and academic settings, was employed for the teaching and learning of automatic control without requiring the user to program in it. Object-based application programming was carried out in Visual Basic, yielding the softcontrol program. The innovation is an interactive program for teaching and learning automatic control, with worked examples and open exercises to be solved online by the user. It includes a user manual and an installation manual.
Abstract:
It first presents the evolution of engineering degrees in Argentina over the twentieth century. It analyzes materials science and technology around the great discoveries of the 1950s and how these marked the beginning of a strong technological development that had not occurred before and that affected some societies above all. Starting from the curricular guidelines governing metallurgical engineering, the author presents a teacher's point of view on the reforms proposed for the curriculum and puts forward a proposal for teaching by subject areas, reflecting on the role of continuous assessment, through which behaviors, processes, and knowledge are to be verified permanently and systematically. It lists the objectives and didactic orientations of the course Mineralogía y Tratamiento de Minerales, presenting a model of Mineral Technology that addresses the conditions and constraints of mineral processing and how these prepare the future engineer to deal with reality. Finally, it develops a didactic design to be implemented in the Visual Basic programming language; this alternative can be used to assist the teacher as a computer user and to create programs through which the teacher interacts with the students.
Abstract:
The project was carried out at the Escuela Universitaria Politécnica de Valladolid by six lecturers from the Departamento de Expresión Gráfica de la Ingeniería. The goal was to develop new methods and tools that help students better understand plane geometry while also making it easier for teachers to teach it. The working method consisted of a thorough analysis of the difficulties students faced when studying the subject matter, followed by attempts to overcome them by exploiting multimedia technologies, all in an interactive process in permanent contact with the students. Results: the project represents an important advance in how geometric drawing is taught and studied. It allows students to learn autonomously, at their own pace, with much clearer drawings that favor a better perception and understanding of the problem; in addition, a detailed explanation of each construction step gives them, in their personal study, a complete understanding of how the problem unfolds. It was evaluated by students in a practical group and proved to be an effective tool that supports student learning, qualitatively enhances the effectiveness of practical sessions, and reduces failure rates. Materials produced: a multimedia application that reproduces a series of geometric drawing exercises step by step, with a written and spoken explanation of each step, the option to freeze the image at any moment, and freehand drawing on the screen. A user manual for the program was also produced. The following software was used: Autocad, Macromedia Director, and Visual Basic. The same group of authors is currently applying these techniques to the subjects of Descriptive Geometry and Industrial Standardization, and a joint publication of all the work is planned.