22 results for Intuitive


Relevance: 10.00%

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field. Digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
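The execution model described above, where a node may fire only once each of its input queues holds the required number of tokens, can be illustrated with a minimal Python sketch; the `Actor` class and the one-queue pipeline below are hypothetical illustrations, not RVC-CAL code:

```python
from collections import deque

class Actor:
    """A dataflow node: communicates only through queues and may fire
    once every input queue holds the required number of tokens."""
    def __init__(self, name, inputs, outputs, consume, fire):
        self.name = name
        self.inputs = inputs      # input queues (deques)
        self.outputs = outputs    # output queues (deques)
        self.consume = consume    # tokens consumed per input queue per firing
        self.fire = fire          # consumed tokens -> list of output tokens

    def can_fire(self):
        return all(len(q) >= n for q, n in zip(self.inputs, self.consume))

    def step(self):
        # consume inputs, compute, produce outputs, independently of other nodes
        taken = [[q.popleft() for _ in range(n)]
                 for q, n in zip(self.inputs, self.consume)]
        for q, token in zip(self.outputs, self.fire(taken)):
            q.append(token)

# Hypothetical one-edge pipeline: a node that doubles each incoming token.
in_q, out_q = deque([1, 2, 3]), deque()
double = Actor("double", [in_q], [out_q], [1], lambda t: [2 * t[0][0]])
while double.can_fire():
    double.step()
print(list(out_q))  # the three input tokens, doubled
```

Because the firing rule inspects only the node's own queues, any number of such actors could be stepped concurrently, which is the explicit parallelism the abstract refers to.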
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which can produce quasi-static schedulers for a wide range of applications.
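As a rough sketch of how static schedule synthesis can be phrased as state-space exploration, the following searches over queue-fill states (a synchronous-dataflow-style abstraction assumed here for simplicity, far simpler than the RVC-CAL models in the thesis) for a firing sequence that returns the network to its initial state, i.e. one period of a repeatable static schedule:

```python
from collections import deque

def find_periodic_schedule(rates, initial, max_len=20):
    """Breadth-first search over queue-fill states. rates maps an actor
    name to (tokens consumed, tokens produced) per queue; the goal is a
    nonempty firing sequence leading back to the initial state."""
    start = tuple(initial)
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, schedule = frontier.popleft()
        for actor, (cons, prod) in rates.items():
            if all(s >= c for s, c in zip(state, cons)):  # firing rule holds
                nxt = tuple(s - c + p for s, c, p in zip(state, cons, prod))
                fired = schedule + [actor]
                if nxt == start:
                    return fired            # one period of a static schedule
                if nxt not in seen and len(fired) < max_len:
                    seen.add(nxt)
                    frontier.append((nxt, fired))
    return None

# Hypothetical network with one queue: A produces 2 tokens, B consumes 1.
rates = {"A": ((0,), (2,)), "B": ((1,), (0,))}
print(find_periodic_schedule(rates, (0,)))  # fire A once, then B twice
```

The "minimal but complete model" point from the abstract shows up here in miniature: only queue fill levels enter the state, everything else is omitted to keep the search space small.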
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.

Relevance: 10.00%

Abstract:

In my dissertation, called Speaking of the unsayable: The circular philosophy of Nicholas of Cusa in his work De coniecturis, I presuppose an internal (conceptual) relation between the personal experience of God of Nicholas of Cusa (1401-64) in 1438, on the one hand, and his philosophy on the other, and I try to describe the precise character of this relation. Referring to the Norwegian scholars Egil Wyller and Viggo Rossvær, I assume that there is a circularity in Cusanus’ philosophy which appears as self-reference (= a sentence refers to itself: A is explained by B and B is explained by A). Wyller finds three phases in the thought of Cusanus (1. De docta ignorantia I-III, 2. De coniecturis I-II, 3. all subsequent works). Rossvær finds it impossible to presuppose distinct phases, as the philosophy of Cusanus continuously develops and remains open to new ideas. As Cusanus, however, treats his experience of God far more consciously in his second work, De coniecturis, than in De docta ignorantia, I find it possible to distinguish between the earlier Cusanus (De docta ignorantia and his earlier works) and the later Cusanus (De coniecturis, about 1444, as well as the following works). In De coniecturis, Cusanus outlines a philosophy of language in which he presents two concepts of necessity, i.e. absolute necessity and logical, or reasonable, necessity. These are interrelated in the sense that the mind, or the self, logically affirms the absolute, or unsayable, necessity, which shows itself in the mind and which the mind affirms conjecturally. The endeavour to understand absolute necessity conceptually implies intuitive (or intellectual) contemplation, or vision (investigatio symbolica), in which the four mental unities (the absolute, the intellectual, the rational and the sensuous) work together according to the rules described in De coniecturis.
In De coniecturis Cusanus clearly turns from a negative concept of the unsayable to a paradigmatic one, which implies that he looks for principles of speaking of the unsayable and presents the idea of a divine language (divinaliter). However, he leaves this idea behind after De coniecturis, although he continues to create new concepts of the unsayable and incomprehensible. The intellectual language of absolute seeing is expressed in the subjunctive, i.e. conditionally. In order to describe the unsayable, Cusanus uses tautologies, the primary one of which is a concept of God, i.e. non aliud est non aliud quam non aliud (the non-other is non-other than the non-other). Wyller considers this the crucial point of the philosophy of Cusanus (De non aliud), described by the latter as the definition of definitions, i.e. the absolute definition. However, this definition is empty regarding its content. It demonstrates that God surpasses the coincidence of opposites (coincidentia oppositorum) and that he is “super-unsayable” (superineffabilis), i.e. beyond what can be conceived or said. Nothing hence prevents us from speaking of him, provided that he is described as unsayable (= the paradigmatic concept of the unsayable). Here the mode of seeing is decisive. Cusanus in this context (and especially in his later literary production) uses modalities which concern possibility and necessity. His aim is to lead any willing reader forward on the way of life (philosophia mentalis). In De coniecturis II he describes the notion of human self-consciousness as the basis of spiritual mutuality, in accordance with the humanistic tradition of his time. I mainly oppose the negatively determined concept of Christian mysticism presented by the German philosopher Kurt Flasch and prefer Burkhard Mojsisch’s presentation of the translogical and conjectural use of language in De coniecturis. In particular, I take account of the Scandinavian research, basically that of Johannes Sløk, Birgit H. Helander, Egil Wyller and Viggo Rossvær, who all consider the personal experience of God described by Cusanus a tacit precondition of his philosophy.

Relevance: 10.00%

Abstract:

Technological innovations, the development of the Internet, and globalization have increased the number and complexity of web applications. As a result, keeping web user interfaces understandable and usable (in terms of ease of use, effectiveness, and satisfaction) is a challenge. As part of this, designing user-intuitive interface signs (i.e., the small elements of a web user interface, e.g., navigational links, command buttons, icons, small images, thumbnails, etc.) is an issue for designers. Interface signs are key elements of web user interfaces because they act as communication artefacts to convey web content and system functionality, and because users interact with systems by means of interface signs. In light of the above, applying semiotic concepts (semiotics being the study of signs) to web interface signs can help discover new and important perspectives on web user interface design and evaluation. The thesis mainly focuses on web interface signs and uses the theory of semiotics as a background theory. The underlying aim of this thesis is to provide valuable insights into designing and evaluating web user interfaces from a semiotic perspective in order to improve overall web usability. The fundamental research question is formulated as follows: What do practitioners and researchers need to be aware of from a semiotic perspective when designing or evaluating web user interfaces to improve web usability? From a methodological perspective, the thesis follows a design science research (DSR) approach. A systematic literature review and six empirical studies are carried out in this thesis. The empirical studies are carried out with a total of 74 participants in Finland. The steps of a design science research process are followed while the studies were designed and conducted; these include (a) problem identification and motivation, (b) definition of the objectives of a solution, (c) design and development, (d) demonstration, (e) evaluation, and (f) communication.
The data is collected using observations in a usability testing lab, analytical (expert) inspection, questionnaires, and structured and semi-structured interviews. User behaviour analysis, qualitative analysis and statistics are used to analyze the study data. The results are summarized as follows and have led to the following contributions. Firstly, the results present the current status of semiotic research in UI design and evaluation and highlight the importance of considering semiotic concepts in UI design and evaluation. Secondly, the thesis explores interface sign ontologies (i.e., the sets of concepts and skills that a user should know in order to interpret the meaning of interface signs) by providing a set of ontologies used to interpret the meaning of interface signs, and by providing a set of features related to ontology mapping in interpreting the meaning of interface signs. Thirdly, the thesis explores the value of integrating semiotic concepts into usability testing. Fourthly, the thesis proposes a semiotic framework (Semiotic Interface sign Design and Evaluation – SIDE) for interface sign design and evaluation in order to make signs intuitive for end users and to improve web usability. The SIDE framework includes a set of determinants and attributes of user-intuitive interface signs, and a set of semiotic heuristics to design and evaluate interface signs. Finally, the thesis assesses (a) the quality of the SIDE framework in terms of performance metrics (e.g., thoroughness, validity, effectiveness, reliability, etc.) and (b) the contributions of the SIDE framework from the evaluators’ perspective.

Relevance: 10.00%

Abstract:

The recent rapid development of biotechnological approaches has enabled the production of large whole-genome-level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering to study and process biological data. The need is also increasing for tools that can be used by biological researchers themselves, who may not have a strong statistical or computational background, which requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.

Relevance: 10.00%

Abstract:

Succinate is a naturally occurring metabolite in an organism’s cells and an industrially important chemical with various applications in the food and pharmaceutical industries. It is also widely used to produce biodegradable plastics, surfactants, detergents, etc. In recent decades, emphasis has been placed on bio-based chemical production via the industrial biotechnology route rather than fossil-based production, considering sustainability and an environmentally friendly economy. In this thesis I present a computational model for in silico metabolic engineering of Saccharomyces cerevisiae for large-scale production of succinate. For metabolic modelling, I have used the OptKnock and OptGene optimization algorithms to identify the reactions to delete from the genome-scale metabolic model of S. cerevisiae in order to overproduce succinate by coupling production with the organism’s growth. Both OptKnock and OptGene proposed numerous straightforward and non-intuitive deletion strategies when a number of constraints, including a growth constraint, were applied to the model. The most interesting strategy identified by both algorithms was the combined deletion of the pyruvate decarboxylase and ubiquinol:ferricytochrome c reductase (respiratory enzyme) reactions, thereby also suggesting anaerobic fermentation of the organism in glucose medium. Such a strategy had never been reported before for growth-coupled succinate production in S. cerevisiae.
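The growth-coupling idea behind OptKnock-style methods can be illustrated with a toy flux-balance model; the network, reaction roles, and the two-LP coupling check below are hypothetical simplifications, not the genome-scale S. cerevisiae model used in the thesis:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix (rows: metabolites A, B, P; columns: reactions).
# r0: -> A (uptake)   r1: A -> B + P   r2: A -> B (bypass)
# r3: B -> biomass    r4: P -> product (export)   r5: P -> B (recycle)
S = np.array([
    [1, -1, -1,  0,  0,  0],   # A
    [0,  1,  1, -1,  0,  1],   # B
    [0,  1,  0,  0, -1, -1],   # P
], dtype=float)
UB = [10.0, 100.0, 100.0, 100.0, 100.0, 100.0]   # uptake limited to 10
GROWTH, PRODUCT = 3, 4

def fba(knockouts=()):
    """Maximize growth; then, with growth fixed at that optimum,
    minimize product flux (the worst-case growth-coupling check)."""
    bounds = [(0.0, 0.0 if j in knockouts else UB[j]) for j in range(6)]
    c = np.zeros(6); c[GROWTH] = -1.0
    opt = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
    growth = -opt.fun
    grow_row = np.zeros((1, 6)); grow_row[0, GROWTH] = 1.0
    c2 = np.zeros(6); c2[PRODUCT] = 1.0
    opt2 = linprog(c2, A_eq=np.vstack([S, grow_row]),
                   b_eq=np.append(np.zeros(3), growth), bounds=bounds)
    return growth, opt2.fun

# Enumerate double deletions (uptake excluded); only the pair removing both
# the bypass and the recycle forces product secretion at maximal growth.
for kos in itertools.combinations(range(1, 6), 2):
    growth, guaranteed = fba(kos)
    if growth > 1e-6 and guaranteed > 1e-6:
        print("knockouts", kos, "growth", round(growth, 2),
              "guaranteed product", round(guaranteed, 2))
```

Note how the single knockouts here do not couple production to growth, while one particular combination does; this mirrors, on a toy scale, why the algorithms in the thesis report combined and non-intuitive deletion strategies.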

Relevance: 10.00%

Abstract:

Interest in small and medium-sized enterprises’ (SMEs) internationalization process is increasing as SMEs’ contribution to GDP grows. The Internet provides an opportunity to offer a variety of services online and to reach market niches worldwide. The overlap of SMEs’ internationalization and online services is the main issue of this research. Most SMEs internationalize according to the intuitive decisions of the company’s CEO and waste limited resources on worthless attempts. The purpose of this research is to define effective approaches to online service internationalization and to the selection of the first international market. The research represents a single holistic case study of a local massive open online courses (MOOCs) platform going global. It considers internationalization costs and the internationalization theories applicable to online services. The research includes a preliminary screening of the markets and an in-depth analysis based on macro parameters of the market and specific characteristics of the customers, together with an expert evaluation of the results. Specific issues such as the GILT (Globalization, Internationalization, Localization and Translation) approach and Internet-enabled internationalization are considered. The research results include recommendations on international market selection methodology for online services and on effective internationalization strategy development.

Relevance: 10.00%

Abstract:

This thesis examines the short-term impact of credit rating announcements on the daily stock returns of 41 European banks indexed in STOXX Europe 600 Banks. The time period of this study is 2002–2015, and the ratings represent long-term issuer ratings provided by S&P, Moody’s and Fitch. Bank ratings are significant for a bank’s operating costs, so it is interesting to investigate how investors react to changes in creditworthiness. The study objective is achieved by conducting an event study. The event study is extended with a cross-sectional linear regression to investigate other potential determinants surrounding rating changes. The research hypotheses and the motivation for additional tests are derived from prior research. The main hypotheses are formed to explore whether rating changes have an effect on stock returns, when this possible reaction occurs, and whether it is asymmetric between upgrades and downgrades. The findings provide evidence that rating announcements have an impact on stock returns in the context of European banks. The results also support the existence of an asymmetry in the capital market reaction to rating upgrades and downgrades. Rating downgrades are associated with statistically significant negative abnormal returns on the event day, although the reaction is rather modest. No statistically significant reaction to rating upgrades is found on the event day. These results hold true for both rating changes and rating watches. No anticipation is observed in the case of rating changes, but there is a statistically significant cumulative negative (positive) price reaction occurring before the event day for negative (positive) watch announcements. The regression provides evidence that the stock price reaction is stronger for rating downgrades occurring within the below-investment-grade class than within the investment-grade class. This is intuitive, as investors are more concerned about their investments in lower-rated companies.
In addition, the price reaction of larger banks is more muted than that of smaller banks in the case of rating downgrades. The reason for this may be that larger banks are usually more widely followed by the public. However, the study results may also provide evidence of the existence of the so-called “too big to fail” subsidy that dampens the negative returns of larger banks.
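A minimal version of the market-model event-study calculation behind such abnormal-return results might look as follows; the window lengths, the synthetic data, and the simple t-statistic are illustrative assumptions, not the thesis’s exact methodology:

```python
import numpy as np

def abnormal_returns(stock, market, est_len, event_len):
    """Fit the market model R = alpha + beta * Rm on the estimation
    window, then compute abnormal returns and their cumulative sum
    (CAR) on the event window, with a naive t-statistic for the CAR."""
    beta, alpha = np.polyfit(market[:est_len], stock[:est_len], 1)
    window = slice(est_len, est_len + event_len)
    ar = stock[window] - (alpha + beta * market[window])
    car = np.cumsum(ar)
    resid = stock[:est_len] - (alpha + beta * market[:est_len])
    se = resid.std(ddof=2) * np.sqrt(event_len)   # iid residuals assumed
    return ar, car, car[-1] / se

# Synthetic check: inject a -5% shock on the event day and recover it.
rng = np.random.default_rng(1)
m = rng.normal(0.0, 0.01, 105)
s = 0.0005 + 1.1 * m + rng.normal(0.0, 0.002, 105)
s[100] -= 0.05                       # downgrade-style negative surprise
ar, car, t_stat = abnormal_returns(s, m, est_len=100, event_len=5)
print(round(car[-1], 3), round(t_stat, 1))  # CAR near -0.05, clearly negative t
```

In an actual study the abnormal returns would be averaged across the sampled banks and event dates before testing, but the per-event calculation is the same.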