887 results for programming interface
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Technological innovations, the development of the internet, and globalization have increased the number and complexity of web applications. As a result, keeping web user interfaces understandable and usable (in terms of ease of use, effectiveness, and satisfaction) is a challenge. As part of this, designing user-intuitive interface signs (i.e., the small elements of a web user interface, e.g., navigational links, command buttons, icons, small images, thumbnails, etc.) is an issue for designers. Interface signs are key elements of web user interfaces because they act as communication artefacts that convey web content and system functionality, and because users interact with systems by means of interface signs. In the light of the above, applying semiotic concepts (semiotics being the study of signs) to web interface signs helps uncover new and important perspectives on web user interface design and evaluation. The thesis focuses mainly on web interface signs and uses semiotics as its background theory. The underlying aim of this thesis is to provide valuable insights for designing and evaluating web user interfaces from a semiotic perspective in order to improve overall web usability. The fundamental research question is formulated as: What do practitioners and researchers need to be aware of from a semiotic perspective when designing or evaluating web user interfaces to improve web usability? From a methodological perspective, the thesis follows a design science research (DSR) approach. A systematic literature review and six empirical studies are carried out in this thesis. The empirical studies were carried out with a total of 74 participants in Finland. The steps of a design science research process were followed when the studies were designed and conducted: (a) problem identification and motivation, (b) definition of the objectives of a solution, (c) design and development, (d) demonstration, (e) evaluation, and (f) communication. The data were collected through observations in a usability testing lab, analytical (expert) inspection, questionnaires, and structured and semi-structured interviews. User behaviour analysis, qualitative analysis, and statistics were used to analyze the study data. The results are summarized as follows and have led to the following contributions. Firstly, the results present the current status of semiotic research in UI design and evaluation and highlight the importance of considering semiotic concepts in UI design and evaluation. Secondly, the thesis explores interface sign ontologies (i.e., the sets of concepts and skills that a user should know to interpret the meaning of interface signs) by providing a set of ontologies used to interpret the meaning of interface signs and a set of features related to ontology mapping in interpreting the meaning of interface signs. Thirdly, the thesis explores the value of integrating semiotic concepts into usability testing. Fourthly, the thesis proposes a semiotic framework (Semiotic Interface sign Design and Evaluation – SIDE) for interface sign design and evaluation in order to make signs intuitive for end users and to improve web usability. The SIDE framework includes a set of determinants and attributes of user-intuitive interface signs, and a set of semiotic heuristics for designing and evaluating interface signs. Finally, the thesis assesses (a) the quality of the SIDE framework in terms of performance metrics (e.g., thoroughness, validity, effectiveness, reliability, etc.)
and (b) the contributions of the SIDE framework from the evaluators’ perspective.
Abstract:
This article reports on the design and characteristics of substrate mimetics in protease-catalyzed reactions. Firstly, the basis of protease-catalyzed peptide synthesis and the general advantages of substrate mimetics over common acyl donor components are described. The binding behavior of these artificial substrates and the mechanism of catalysis are further discussed on the basis of hydrolysis, acyl transfer, protein-ligand docking, and molecular dynamics studies on the trypsin model. The general validity of the substrate mimetic concept is illustrated by the expansion of this strategy to trypsin-like, glutamic acid-specific, and hydrophobic amino acid-specific proteases. Finally, opportunities for combining the substrate mimetic strategy with chemical solid-phase peptide synthesis and for using substrate mimetics in non-peptide organic amide synthesis are presented.
Abstract:
Linguistic modelling is a rather new branch of mathematics that is still undergoing rapid development. It is closely related to fuzzy set theory and fuzzy logic, but knowledge and experience from other fields of mathematics, as well as from other fields of science including linguistics and the behavioral sciences, are also necessary to build appropriate mathematical models. The topic has received considerable attention because it provides tools for the mathematical representation of the most common means of human communication - natural language. Adding a natural language level to mathematical models can provide an interface between the mathematical representation of the modelled system and the user of the model - one that is sufficiently easy to use and understand, yet conveys all the information necessary to avoid misinterpretations. It is, however, not a trivial task, and the link between the linguistic and computational levels of such models has to be established and maintained properly during the whole modelling process. In this thesis, we focus on the relationship between the linguistic and the mathematical level of decision support models. We discuss several important issues concerning the mathematical representation of the meaning of linguistic expressions, their transformation into the language of mathematics, and the retranslation of mathematical outputs back into natural language. In the first part of the thesis, our view of linguistic modelling for decision support is presented and the main guidelines for building linguistic models for real-life decision support, which form the basis of our modelling methodology, are outlined. From the theoretical point of view, the issues of representing the meaning of linguistic terms, computing with these representations, and the retranslation back into the linguistic level (linguistic approximation) are studied in this part of the thesis. We focus on the reasonability of operations with the meanings of linguistic terms, the correspondence between the linguistic and mathematical levels of the models, and the proper presentation of appropriate outputs. We also discuss several issues concerning the ethical aspects of decision support - particularly the loss of meaning due to the transformation of mathematical outputs into natural language and the issue of responsibility for the final decisions. In the second part, several case studies of real-life problems are presented. These provide the background, context, and motivation for the mathematical results and models presented in this part. A linguistic decision support model for disaster management is presented here, formulated as a fuzzy linear programming problem, and a heuristic solution to it is proposed. Uncertainty of outputs, expert knowledge concerning disaster response practice, and the necessity of obtaining outputs that are easy to interpret (and available in very short time) are reflected in the design of the model. Saaty's analytic hierarchy process (AHP) is considered in two case studies - first in the context of the evaluation of works of art, where a weak consistency condition is introduced and an adaptation of AHP for large matrices of preference intensities is presented. The second AHP case study deals with the fuzzified version of AHP and its use for evaluation purposes - particularly the integration of peer review into the evaluation of R&D outputs is considered.
In the context of HR management, we present a fuzzy rule-based evaluation model (academic faculty evaluation is considered) constructed to provide outputs that do not require linguistic approximation and are easily transformed into graphical information. This is achieved by designing a specific form of fuzzy inference. Finally, the last case study is from the area of the humanities - psychological diagnostics is considered and a linguistic fuzzy model for the interpretation of outputs of multidimensional questionnaires is suggested. The issue of the quality of data in mathematical classification models is also studied here. A modification of the receiver operating characteristic (ROC) method is presented to reflect the variable quality of data instances in the validation set during classifier performance assessment. Twelve publications in which the author participated are appended as the third part of this thesis. These summarize the mathematical results and provide closer insight into the practical applications considered in the second part of the thesis.
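As an illustration of how variable instance quality can enter an ROC construction, consider a generic instance-weighted formulation (an illustration only; not necessarily the exact modification proposed in the thesis), where instance i has score s_i, true label y_i, and quality weight w_i:

\[
\mathrm{TPR}(t) = \frac{\sum_{i:\, y_i = 1,\ s_i \ge t} w_i}{\sum_{i:\, y_i = 1} w_i},
\qquad
\mathrm{FPR}(t) = \frac{\sum_{i:\, y_i = 0,\ s_i \ge t} w_i}{\sum_{i:\, y_i = 0} w_i}.
\]

The ordinary ROC curve is recovered when all w_i = 1; down-weighting low-quality instances reduces their influence on the assessed performance.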
Abstract:
This thesis introduced Android as a hardware and application platform and described how the user interface of an Android game application can be kept consistent across different display devices by means of scaling factors and anchoring. The second part of the work discussed simple ways in which the performance of game applications can be improved. Of these, a low-resolution drawing buffer and the hiding of objects that are not visible were selected for more detailed measurements. In the measurements, the selected methods had a considerable effect on the performance of the demo application. The work was restricted to Android programming in Java without external libraries, so that its results can easily be applied in as many different use cases as possible.
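A minimal, self-contained sketch of the scaling-factor and anchoring idea summarized above (the class and method names are illustrative and not taken from the thesis): UI coordinates are designed against a fixed reference resolution and mapped to the actual screen at draw time, while edge-anchored elements are positioned relative to the screen border instead of being placed at scaled absolute coordinates.

// Maps UI coordinates designed for a reference resolution onto the actual
// screen, and anchors elements to screen edges so the layout stays consistent
// across displays of different sizes.
public final class UiScaler {
    private final float scaleX;
    private final float scaleY;

    public UiScaler(int screenWidth, int screenHeight,
                    int referenceWidth, int referenceHeight) {
        this.scaleX = (float) screenWidth / referenceWidth;
        this.scaleY = (float) screenHeight / referenceHeight;
    }

    // Scales a coordinate or size given in reference-resolution units.
    public float toScreenX(float designX) { return designX * scaleX; }
    public float toScreenY(float designY) { return designY * scaleY; }

    // Anchors an element of the given design-space width to the right edge,
    // keeping a design-space margin, regardless of the actual resolution.
    public float anchorRight(float marginDesign, float widthDesign, int screenWidth) {
        return screenWidth - toScreenX(marginDesign + widthDesign);
    }

    public static void main(String[] args) {
        // Designed for 800x480, rendered on a 1920x1080 display.
        UiScaler scaler = new UiScaler(1920, 1080, 800, 480);
        System.out.println("Scaled x:         " + scaler.toScreenX(40f));
        System.out.println("Right-anchored x: " + scaler.anchorRight(10f, 100f, 1920));
    }
}

Designing against a single reference resolution keeps layout code independent of the device, which is what keeps the interface looking consistent across screens of different sizes and aspect ratios.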
Abstract:
The state of the object-oriented programming course at Lappeenranta University of Technology had reached the point where it required changes to provide better learning opportunities and thus better learning outcomes. Based on student feedback, the course was partially dated and ineffective. The components of the course were analysed, the ineffective elements were removed, and new methods were introduced to improve the course. The major changes included the switch from traditional teaching methods to the reverse classroom method and the use of Java as the programming language. The changes were assessed through student feedback, the lecturer's observations, and comparison with previous years. The feedback suggested that the changes were successful; the course received a higher overall grade than before.
Abstract:
This thesis reports investigations into applying the Service Oriented Architecture (SOA) approach to the engineering of multi-platform and multi-device user interfaces. The study has three goals: (1) analyze the present frameworks for developing multi-platform and multi-device applications, (2) extend the principles of SOA to implement a multi-platform and multi-device architectural framework (SOA-MDUI), and (3) apply and validate the proposed framework in the context of a specific application. One of the problems addressed in this ongoing research is the large number of possible combinations when implementing applications on different types of devices. Usually it is necessary to take into account the operating system (OS), the user interface (UI) including its appearance, the programming language (PL), and the architectural style (AS). Our proposed approach extends the principles of SOA using pattern-oriented design and model-driven engineering approaches. Synthesizing the work already done in these domains, this research built and tested an engineering framework linking Model-Driven Architecture (MDA) and SOA approaches to the development of UIs. The study advances the general understanding of engineering, deploying, and managing multi-platform and multi-device user interfaces as a service.
Abstract:
The pipeline for macro- and microarray analyses (PMmA) is a set of scripts with a web interface developed to analyze DNA array data generated by array image quantification software. PMmA is designed for use with single- or double-color array data and works as a pipeline with five classes (data format, normalization, data analysis, clustering, and array maps). It can also be used as a plugin in the BioArray Software Environment, an open-source database for array analysis, or through a local installation of the web service. All scripts in PMmA were developed in the Perl programming language, and the statistical analysis functions were implemented in the R statistical language; consequently, the package is platform-independent software. Our algorithms can correctly select almost 90% of the differentially expressed genes, showing superior performance compared with other methods of analysis. The pipeline software has been applied to public macroarray data of 1536 expressed sequence tags from sugarcane exposed to cold for 3 to 48 h. PMmA identified thirty cold-responsive genes previously unidentified in this public dataset. Fourteen genes were up-regulated, two had variable expression, and the other fourteen were down-regulated in the treatments. These new findings were certainly a consequence of using a superior statistical analysis approach, since the original study did not take into account the dependence of data variability on the average signal intensity of each gene. The web interface, supplementary information, and the package source code are available, free, to non-commercial users at http://ipe.cbmeg.unicamp.br/pub/PMmA.
Abstract:
Emerging technologies of the recent decade have brought new options for cross-platform computer graphics development. This master's thesis examined the possibilities for cross-platform 3D graphics development. All platform-dependent and non-real-time solutions were excluded. WebGL and two different OpenGL-based solutions were assessed via a demo application built with the most recent development tools. The results note the pros and cons of each solution.
Abstract:
With the growth of new technologies, using online tools has become part of everyday life. This has an even greater impact on researchers, as the data obtained from various experiments needs to be analyzed and knowledge of programming has become mandatory even for pure biologists. Hence, VTT developed a new tool, R Executables (REX), a web application designed to provide a graphical interface for biological data functions such as image analysis, gene expression data analysis, plotting, and disease and control studies, which employs R functions to provide results. REX offers biologists an interactive application in which they can directly enter values and run the required analysis with a single click. The program processes the given data in the background and returns results rapidly. Due to the growth of data and the load on the server, the interface had developed problems concerning time consumption, a poor GUI, data storage, security, a minimally interactive user experience, and crashes with large amounts of data. This thesis describes the methods by which these problems were resolved to make REX a better application for the future. The old REX was developed using Python and Django; the new version is implemented with Vaadin. Vaadin is a Java framework for developing web applications: the code is written in Java, and the framework provides rich UI components. Vaadin offers better security, better speed, and a good, interactive interface. In this thesis, a subset of REX functionality was selected, including IST bulk plotting and image segmentation, and implemented using Vaadin. I programmed 662 lines of code, with Vaadin as the front-end handler while the R language was used for back-end data retrieval, computation, and plotting. The application is optimized to allow further functionality to be migrated with ease from the old REX. Future development is focused on including high-throughput screening functions along with gene expression database handling.
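As a hedged illustration of what a Vaadin front end of this kind looks like (a minimal sketch in Vaadin 7/8-style server-side Java; the class name and the runAnalysis() back-end call are hypothetical placeholders, not REX's actual code):

import com.vaadin.server.VaadinRequest;
import com.vaadin.ui.Button;
import com.vaadin.ui.Notification;
import com.vaadin.ui.TextField;
import com.vaadin.ui.UI;
import com.vaadin.ui.VerticalLayout;

// Minimal server-side Vaadin UI: the page layout and its event handling are
// written entirely in Java, which is how Vaadin replaces a Django-style front end.
public class RexSketchUI extends UI {

    @Override
    protected void init(VaadinRequest request) {
        TextField input = new TextField("Dataset identifier");
        Button run = new Button("Run plot");

        // Click events are handled on the server; no hand-written JavaScript.
        run.addClickListener(event -> Notification.show(runAnalysis(input.getValue())));

        setContent(new VerticalLayout(input, run));
    }

    // Placeholder for the back-end call that would delegate to R for
    // data retrieval, computation and plotting.
    private String runAnalysis(String datasetId) {
        return "Plotted " + datasetId;
    }
}

Keeping the layout and the event handling in a single typed Java class is the main structural difference from a template-based Django front end.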
Abstract:
Marketing and finance both face challenges in the constantly changing business environment. Finance is challenged to change its role from cost control to value-adding business partner, while marketing needs to be able to demonstrate its accountability, that is, how it contributes to firm performance. Finance is the key partner for marketing in proving its impact, by helping marketing to measure its actions; by doing so, finance can also emphasize its business partner role. Little research has been conducted on the relationship between marketing and finance departments. The aim of this study is to investigate how the professional differences between marketing and finance and their forms of cooperation affect marketing performance measurement. Literature on the marketing and finance disciplines, their cooperation, the performance implications of their interface, and the roles of marketing performance measurement, performance measurement systems and measures was reviewed. This research was conducted as a qualitative case study among the senior management of marketing and finance in a sporting goods company. The data, collected through semi-structured interviews, participant observation and secondary sources, were described and classified, and connections were drawn. The results of the study show that the nature of the marketing and finance disciplines has many effects on their cooperation and performance measurement. Due to the ambiguous nature of marketing, measuring its performance is still seen as a challenge, but digitalization is helping the measurement. It was indicated that marketing and finance professionals need different skill sets in order to perform their roles effectively, and thus cooperation is needed. Marketing performance needs to be measured with both financial and non-financial measures. Both the marketing and the finance interviewees highlighted the importance of marketing measures over financial measures. Measuring marketing performance comprehensively is seen as a challenge since marketing and finance cooperation is still shaped by cost control and budget management roles rather than performance measurement. We recognized three constraints affecting this cooperation and performance measurement: people, time and software. If marketing and finance developed deeper cooperation, they could create a comprehensive performance measurement system that improves organizational performance.
Abstract:
The perovskite crystal structure is host to many different materials, from insulating to superconducting, providing a diverse range of intrinsic character and complexity. A better fundamental description of these materials in terms of their electronic, optical and magnetic properties undoubtedly precedes an effective realization of their application potential. SmTiO3, a distorted perovskite, has a strongly localized electronic structure and undergoes an antiferromagnetic transition at 50 K in its nominally stoichiometric form. Sr2RuO4 is a layered perovskite superconductor (Tc ≈ 1 K) bearing the same structure as the high-temperature superconductor La2-xSrxCuO4. Polarized reflectance measurements were carried out on both of these materials, revealing several interesting features in the far-infrared range of the spectrum. In the case of SmTiO3, although the material is insulating, evidence indicates the presence of a finite background optical conductivity. As the temperature is lowered through the ordering temperature, a resonance feature appears to narrow and strengthen near 120 cm⁻¹. A nearby phonon mode also appears to couple to this magnetic transition, as revealed by a growing asymmetry in the optical conductivity. Experiments on a doped sample with a greater itinerant character and a lower Néel temperature of 40 K also indicate the presence of this strongly temperature-dependent mode, even at twice the ordering temperature. Although the mode appears to be sensitive to the magnetic transition, it is unclear whether a magnon assignment is appropriate. At the very least, the evidence suggests an interesting interaction between magnetic and electronic excitations. Although Sr2RuO4 is highly anisotropic, it is metallic in three dimensions at low temperatures and reveals its coherent transport in an inter-plane Drude-like component up to the highest temperatures measured (90 K). An extended Drude analysis is used to probe the frequency-dependent scattering character, revealing a peak in both the mass enhancement and the scattering rate near 80 cm⁻¹ and 100 cm⁻¹, respectively. All of these experimental observations appear relatively consistent with a Fermi-liquid picture of charge transport. To supplement the optical measurements, a resistivity station was set up with an event-driven, object-oriented user interface. The program controls a Keithley current source, an HP nano-voltmeter and a switching unit, as well as a LakeShore temperature controller, in order to obtain a plot of the resistivity as a function of temperature. The system allows for resistivity measurements ranging from 4 K to 290 K using an external probe, or between 0.4 K and 295 K using a Helium-3 cryostat. Several materials of known resistivity have confirmed the system to be robust and capable of measuring metallic samples, distinguishing features of several µΩ·cm.
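As an illustration of the kind of measurement loop such a station runs (a hedged sketch only; the instrument interfaces below are hypothetical stand-ins for the actual drivers of the Keithley current source, HP nano-voltmeter and LakeShore controller, not code from the thesis):

// Sketch of a four-probe resistivity-versus-temperature sweep of the kind
// described above, using placeholder instrument interfaces.
public class ResistivitySweep {

    interface CurrentSource { void setCurrent(double amps); }
    interface Voltmeter { double readVoltage(); }
    interface TemperatureController { double readTemperature(); }

    // Four-probe resistivity: rho = (V / I) * (A / l), where A is the sample
    // cross-section and l the spacing between the voltage contacts.
    static double resistivity(double volts, double amps,
                              double crossSectionM2, double contactSpacingM) {
        return (volts / amps) * (crossSectionM2 / contactSpacingM);
    }

    static void sweep(CurrentSource source, Voltmeter meter,
                      TemperatureController cryostat, double amps,
                      double crossSectionM2, double contactSpacingM, int points) {
        source.setCurrent(amps);
        for (int i = 0; i < points; i++) {
            double temperature = cryostat.readTemperature();
            double rho = resistivity(meter.readVoltage(), amps,
                                     crossSectionM2, contactSpacingM);
            // Each (temperature, resistivity) pair becomes one point on the plot.
            System.out.printf("%8.2f K  %12.4e ohm-m%n", temperature, rho);
        }
    }
}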