910 results for complexity metrics
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Technological innovations, the development of the internet, and globalization have increased the number and complexity of web applications. As a result, keeping web user interfaces understandable and usable (in terms of ease-of-use, effectiveness, and satisfaction) is a challenge. As part of this, designing user-intuitive interface signs (i.e., the small elements of a web user interface, e.g., navigational links, command buttons, icons, small images, thumbnails, etc.) is an issue for designers. Interface signs are key elements of web user interfaces because they act as communication artefacts that convey web content and system functionality, and because users interact with systems by means of interface signs. In light of the above, applying concepts from semiotics (i.e., the study of signs) to web interface signs will help uncover new and important perspectives on web user interface design and evaluation. The thesis focuses mainly on web interface signs and uses the theory of semiotics as a background theory. The underlying aim of this thesis is to provide valuable insights for designing and evaluating web user interfaces from a semiotic perspective in order to improve overall web usability. The fundamental research question is formulated as: What do practitioners and researchers need to be aware of from a semiotic perspective when designing or evaluating web user interfaces to improve web usability? From a methodological perspective, the thesis follows a design science research (DSR) approach. A systematic literature review and six empirical studies are carried out in this thesis. The empirical studies were carried out with a total of 74 participants in Finland. The steps of a design science research process were followed while the studies were designed and conducted; these include (a) problem identification and motivation, (b) definition of the objectives of a solution, (c) design and development, (d) demonstration, (e) evaluation, and (f) communication. The data were collected using observations in a usability testing lab, by analytical (expert) inspection, with questionnaires, and in structured and semi-structured interviews. User behaviour analysis, qualitative analysis, and statistics were used to analyze the study data. The results are summarized as follows and have led to the following contributions. Firstly, the results present the current status of semiotic research in UI design and evaluation and highlight the importance of considering semiotic concepts in UI design and evaluation. Secondly, the thesis explores interface sign ontologies (i.e., the sets of concepts and skills that a user should know to interpret the meaning of interface signs) by providing a set of ontologies used to interpret the meaning of interface signs, and a set of features related to ontology mapping in interpreting the meaning of interface signs. Thirdly, the thesis explores the value of integrating semiotic concepts into usability testing. Fourthly, the thesis proposes a semiotic framework (Semiotic Interface sign Design and Evaluation – SIDE) for interface sign design and evaluation in order to make signs intuitive for end users and to improve web usability. The SIDE framework includes a set of determinants and attributes of user-intuitive interface signs, and a set of semiotic heuristics for designing and evaluating interface signs. Finally, the thesis assesses (a) the quality of the SIDE framework in terms of performance metrics (e.g., thoroughness, validity, effectiveness, and reliability) and (b) the contributions of the SIDE framework from the evaluators' perspective.
Abstract:
In this work, the feasibility of floating-gate technology in analog computing platforms in a scaled-down general-purpose CMOS technology is considered. When the technology is scaled down, the performance of analog circuits tends to degrade because the process parameters are optimized for digital transistors and the scaling involves the reduction of supply voltages. Generally, the challenge in analog circuit design is that all salient design metrics, such as power, area, bandwidth, and accuracy, are interrelated. Furthermore, poor flexibility, i.e., lack of reconfigurability, reuse of IP, etc., can be considered the most severe weakness of analog hardware. On this account, digital calibration schemes are often required for improved performance or yield enhancement, whereas high flexibility/reconfigurability cannot be easily achieved. Here, it is discussed whether it is possible to work around these obstacles by using floating-gate transistors (FGTs), and the problems associated with practical implementation are analyzed. FGT technology is attractive because it is electrically programmable and also features a charge-based built-in non-volatile memory. Apart from being ideal for canceling circuit non-idealities due to process variations, FGTs can also be used as computational or adaptive elements in analog circuits. The nominal gate oxide thickness in deep sub-micron (DSM) processes is too thin to support robust charge retention, and consequently the FGT becomes leaky. In principle, non-leaky FGTs can be implemented in a scaled-down process without any special masks by using "double-oxide" transistors, which are intended to provide devices that operate with higher supply voltages than general-purpose devices. However, in practice the technology scaling poses several challenges, which are addressed in this thesis. To provide a sufficiently wide-ranging survey, six prototype chips of varying complexity were implemented in four different DSM process nodes and investigated from this perspective. The focus is on non-leaky FGTs, but the presented autozeroing floating-gate amplifier (AFGA) demonstrates that leaky FGTs may also find a use. The simplest test structures contain only a few transistors, whereas the most complex experimental chip is an implementation of a spiking neural network (SNN) comprising thousands of active and passive devices. More precisely, it is a fully connected two-layer SNN with 256 FGT synapses, in which the adaptive properties of the FGT are exploited. A compact realization of Spike Timing Dependent Plasticity (STDP) within the SNN is one of the key contributions of this thesis. Finally, the considerations in this thesis extend beyond CMOS to emerging nanodevices. To this end, one promising emerging nanoscale circuit element, the memristor, is reviewed and its applicability to analog processing is considered. Furthermore, it is discussed how FGT technology can be used to prototype computation paradigms compatible with these emerging two-terminal nanoscale devices in a mature and widely available CMOS technology.
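To make the learning rule concrete, here is a minimal sketch of pair-based exponential STDP, the kind of weight update the thesis realizes compactly in FGT synapses. The amplitudes and time constants below are illustrative assumptions, not values from the thesis, and the FGT circuit realization itself is not modeled.

```python
import math

# Pair-based exponential STDP: weight change as a function of the timing
# difference between a presynaptic and a postsynaptic spike.
# All parameter values are illustrative assumptions, not thesis values.
A_PLUS, A_MINUS = 0.01, 0.012      # potentiation/depression amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in ms (assumed)

def stdp_delta_w(t_pre: float, t_post: float) -> float:
    """Weight update for one pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre fires before post: potentiation
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    if dt < 0:    # post fires before pre: depression
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0

print(stdp_delta_w(10.0, 15.0))  # positive: causal pairing strengthens the synapse
print(stdp_delta_w(15.0, 10.0))  # negative: anti-causal pairing weakens it
```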
Abstract:
Physical exercise is associated with parasympathetic withdrawal and increased sympathetic activity resulting in heart rate increase. The rate of post-exercise cardiodeceleration is used as an index of cardiac vagal reactivation. Analysis of heart rate variability (HRV) and complexity can provide useful information about autonomic control of the cardiovascular system. The aim of the present study was to ascertain the association between heart rate decrease after exercise and HRV parameters. Heart rate was monitored in 17 healthy male subjects (mean age: 20 years) during the pre-exercise phase (25 min supine, 5 min standing), during exercise (8 min of the step test with an ascending frequency corresponding to 70% of individual maximal power output) and during the recovery phase (30 min supine). HRV analysis in the time and frequency domains and evaluation of a newly developed complexity measure - sample entropy - were performed on selected segments of heart rate time series. During recovery, heart rate decreased gradually but did not attain pre-exercise values within 30 min after exercise. On the other hand, HRV gradually increased, but did not regain rest values during the study period. Heart rate complexity was slightly reduced after exercise and attained rest values after 30-min recovery. The rate of cardiodeceleration did not correlate with pre-exercise HRV parameters, but positively correlated with HRV measures and sample entropy obtained from the early phases of recovery. In conclusion, the cardiodeceleration rate is independent of HRV measures during the rest period but it is related to early post-exercise recovery HRV measures, confirming a parasympathetic contribution to this phase.
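For illustration, a minimal Python sketch of sample entropy, the complexity measure applied to the heart rate time series above. The tolerance convention r = 0.2 × SD and the synthetic RR-interval data are assumptions for demonstration, not the study's settings.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that two
    sequences similar for m points remain similar for m+1 points.
    Simplified reference implementation; r defaults to 0.2*SD, a common
    convention (assumed here, not taken from the study)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def count_matches(length):
        # All overlapping templates of the given length, compared pairwise
        # with the Chebyshev (max-abs) distance; self-matches are excluded.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Synthetic RR-interval series (ms) standing in for a recording segment.
rr = np.random.default_rng(0).normal(800.0, 50.0, 300)
print(sample_entropy(rr, m=2))
```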
Abstract:
The brain is a complex system, which produces emergent properties such as those associated with activity-dependent plasticity in processes of learning and memory. Therefore, understanding the integrated structures and functions of the brain is well beyond the scope of either superficial or extremely reductionistic approaches. Although a combination of zoom-in and zoom-out strategies is desirable when the brain is studied, constructing the appropriate interfaces to connect all levels of analysis is one of the most difficult challenges of contemporary neuroscience. Is it possible to build appropriate models of brain function and dysfunction with computational tools? Among the best-known brain dysfunctions, epilepsies are neurological syndromes that involve a variety of networks, from widespread anatomical brain circuits to local molecular environments. One logical question would be: are those complex brain networks always producing maladaptive emergent properties compatible with epileptogenic substrates? The present review deals with this question and tries to answer it by illustrating several points from the literature and from our laboratory data, with examples at the behavioral, electrophysiological, cellular, and molecular levels. We conclude that, because the brain is a complex system compatible with the production of emergent properties, including plasticity, its functions should be approached using an integrated view. Concepts such as brain networks, graph theory, neuroinformatics, and e-neuroscience are discussed as new transdisciplinary approaches for dealing with the continuous growth of information about brain physiology and its dysfunctions. The epilepsies are discussed as neurobiological models of complex systems displaying maladaptive plasticity.
Abstract:
Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of a high-energy diet. Observation of core temperature records in humans and rodents reveals time-series characteristics, such as autoreference and stationarity, that are well suited to stochastic analysis. To identify this change, we applied, for the first time, a stochastic autoregressive (AR) time series model, whose concepts match those of the physiological systems involved, to male HFD rats and to age-matched male controls fed standard chow (n=7 per group). By analyzing the recorded temperature time series, we were able to identify when thermal homeostasis was affected by the new diet. The AR model was used to predict the occurrence of thermal homeostasis, and it proved very effective in distinguishing such a physiological disorder. We therefore infer from our results that maximum entropy distribution, as a means of stochastically characterizing temperature time series, may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases, given its ability to detect small variations in the thermal profile.
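As an illustration of the modelling step, a minimal least-squares AR(p) fit in Python; the model order, the synthetic temperature series, and all numeric values are assumptions for demonstration, not those of the study.

```python
import numpy as np

def fit_ar(x, p=2):
    """Least-squares fit of an AR(p) model
    x[t] = c + sum_k a_k * x[t-k] + e[t].
    Minimal sketch; order p=2 is an illustrative choice."""
    x = np.asarray(x, dtype=float)
    # Design matrix: one column per lag k = 1..p, plus an intercept column.
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return coef, resid

# Synthetic core-temperature-like series (baseline + drift + noise),
# standing in for a recorded temperature register; illustrative only.
rng = np.random.default_rng(1)
temp = 37.0 + 0.2 * np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.05, 500)
coef, resid = fit_ar(temp, p=2)
print("intercept and AR coefficients:", coef)
print("residual variance:", resid.var())
```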
Abstract:
The presentation consists of work-in-progress metrics for #digitalkoot, the crowdsourcing project launched by the National Library of Finland.
Abstract:
Sustainability issues in ICT have gathered attention in recent years, and researchers are working on the problem. Sustainability incorporates numerous interconnected aspects, as well as methods to achieve it in ICT, so it is difficult to obtain a general view of the problem without a proper framework; however, a general methodology for such research is missing. In this work, the Biomimicry approach is proposed as a framework for sustainability research and development, as it introduces systematics and forces sustainability aspects to be taken into account. Additionally, the problem of green network measurements for enhancing sustainability in ICT is studied using this approach. The goal is to investigate Biomimicry as a systemic approach for developing sustainable systems, and to apply it to a green network measurements study. A comparative study is performed to examine the Biomimicry approach, and a use case of green network measurement research is presented. The results show that green network measurement can improve sustainability, but only to a limited extent, as it cannot incorporate all relevant aspects; within the Biomimicry approach, two methodologies exist. It can be concluded that Biomimicry is a good framework for developing sustainable systems; nevertheless, a new methodology, incorporating the advantages of the two existing ones, has to be introduced within it.
Abstract:
In much of the previous research in the field of interactive storytelling, the focus has been on the creation of complete systems, followed by evaluation of those systems based on user experience. Less focus has been placed on finding general solutions to problems that manifest in many different types of interactive storytelling systems. The goal of this thesis was to identify candidate metrics that a system could use to predict player behavior or how players experience the story they are presented with, and to put these metrics to an empirical test. The three metrics used were morality, relationships, and conflict. The game used for user testing of the metrics, Regicide, is an interactive storytelling experience created in conjunction with Eero Itkonen. Data collected through user testing, in the form of internal system data and survey answers, were used to evaluate hypotheses for each metric. Of the three chosen metrics, morality performed best in this study. Though further research and refinement may be required, the results were promising and point to the conclusion that user responses to questions of morality are a strong predictor of their choices in similar situations later in the course of an interactive story. A similar examination of user relationships with other characters in the story did not produce promising results, but several methodological problems were recognized, and further research with a better-optimized system may yield different results. On the subject of conflict, several aspects proposed by Ware et al. (2012) were evaluated separately. Results were inconclusive, with the aspect of directness showing the most promise.
Abstract:
Software product metrics aim at measuring the quality of software. Modularity is an essential factor in software quality. In this work, metrics related to modularity, and especially to the cohesion of modules, are considered. The existing metrics are evaluated, and several new alternatives are proposed. The idea of cohesion of modules is that a module or a class should consist of related parts. The closely related principle of coupling says that the relationships between modules should be minimized. First, internal cohesion metrics are considered. The relations that are internal to classes are shown to be useless for quality measurement. Second, we consider external relationships for cohesion. A detailed analysis using design patterns and refactorings confirms that external cohesion is a better quality indicator than internal cohesion. Third, motivated by the successes (and problems) of external cohesion metrics, another kind of metric is proposed that represents the quality of the modularity of software. This metric can be applied to refactorings related to classes, resulting in a refactoring suggestion system. To describe the metrics formally, a notation for programs is developed. Because of the recursive nature of programming languages, the properties of programs are most compactly represented using grammars and formal languages. The tools used for metric calculation are also described.
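As a concrete example of the kind of internal cohesion metric the thesis evaluates (and finds wanting), here is a minimal Python sketch of one well-known internal metric, Chidamber and Kemerer's LCOM, computed from `self` attribute accesses. The thesis develops its own grammar-based notation and metrics, so this is illustrative only.

```python
import ast
import itertools

def lcom(source: str, class_name: str) -> int:
    """Chidamber-Kemerer LCOM for one class: max(P - Q, 0), where P is the
    number of method pairs sharing no instance attribute and Q the number
    of pairs sharing at least one. Minimal sketch using the ast module."""
    tree = ast.parse(source)
    cls = next(n for n in ast.walk(tree)
               if isinstance(n, ast.ClassDef) and n.name == class_name)
    attrs_per_method = {}
    for fn in (n for n in cls.body if isinstance(n, ast.FunctionDef)):
        # Collect every `self.<attr>` access inside the method body.
        attrs_per_method[fn.name] = {
            node.attr for node in ast.walk(fn)
            if isinstance(node, ast.Attribute)
            and isinstance(node.value, ast.Name)
            and node.value.id == "self"
        }
    p = q = 0
    for a, b in itertools.combinations(attrs_per_method.values(), 2):
        if a & b:
            q += 1
        else:
            p += 1
    return max(p - q, 0)

code = """
class Account:
    def deposit(self, x): self.balance += x
    def withdraw(self, x): self.balance -= x
    def audit(self): print(self.log)
"""
print(lcom(code, "Account"))  # 1: deposit/withdraw cohere, audit does not
```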
Abstract:
The effects of a complexly worded counterattitudinal appeal on laypeople's attitudes toward a legal issue were examined, using the Elaboration Likelihood Model (ELM) of persuasion as a theoretical framework. This model states that persuasion can result from the elaboration and scrutiny of the message arguments (i.e., central route processing), or can result from less cognitively effortful strategies, such as relying on source characteristics as a cue to message validity (i.e., peripheral route processing). One hundred and sixty-seven undergraduates (85 men and 81 women) listened to either a low status or high status source deliver a counterattitudinal speech on a legal issue. The speech was designed to contain strong or weak arguments. These arguments were worded in a simple and, therefore, easy to comprehend manner, or in a complex and, therefore, difficult to comprehend manner. Thus, there were three experimental manipulations: argument comprehensibility (easy to comprehend vs. difficult to comprehend), argument strength (weak vs. strong), and source status (low vs. high). After listening to the speech, participants completed a measure of their attitude toward the legal issue, a thought listing task, an argument recall task, manipulation checks, measures of motivation to process the message, and measures of mood. As a result of the failure of the argument strength manipulation, only the effects of the comprehensibility and source status manipulations were tested. There was, however, some evidence of more central route processing in the easy comprehension condition than in the difficult comprehension condition, as predicted. Significant correlations were found between attitude and favourable and unfavourable thoughts about the legal issue with easy to comprehend arguments; whereas there was a correlation only between attitude and favourable thoughts toward the issue with difficult to comprehend arguments, suggesting, perhaps, that central route processing, which involves argument scrutiny and elaboration, occurred under conditions of easy comprehension to a greater extent than under conditions of difficult comprehension. The results also revealed, among other findings, several significant effects of gender. Men had more favourable attitudes toward the legal issue than did women, men recalled more arguments from the speech than did women, men were less frustrated while listening to the speech than were women, and men put more effort into thinking about the message arguments than did women. When the arguments were difficult to comprehend, men had more favourable thoughts and fewer unfavourable thoughts about the legal issue than did women. Men and women may have had different affective responses to the issue of plea bargaining (with women responding more negatively than men), especially in light of a local and controversial plea bargain that occurred around the time of this study. Such pre-existing gender differences may have led to the lower frustration, the greater effort, the greater recall, and more positive attitudes for men than for women. Results from this study suggest that current cognitive models of persuasion may not be very applicable to controversial issues which elicit strong emotional responses. Finally, these data indicate that affective responses, the controversial and emotional nature of the issue, gender, and other individual differences are important considerations when experts are attempting to persuade laypeople toward a counterattitudinal position.
Abstract:
As the complexity of evolutionary design problems grows, so too must the quality of solutions scale to that complexity. In this research, we develop a genetic programming system with individuals encoded as tree-based generative representations to address scalability. This system is capable of multi-objective evaluation using a ranked-sum scoring strategy. We examine Hornby's features and measures of modularity, reuse, and hierarchy in evolutionary design problems. Experiments are carried out using the system to generate three-dimensional forms, and feature characteristics such as modularity, reuse, and hierarchy are analyzed. This work expands on Hornby's by examining a new and more difficult problem domain. The results from these experiments show that individuals encoded with all three features performed best overall. It is also seen that the measures of complexity conform to Hornby's results. Moving forward with only this best-performing encoding, the system was applied to the generation of three-dimensional external building architecture. One objective considered was passive solar performance, in which the system was challenged to generate forms that optimize exposure to the Sun. The results from these and other experiments satisfied the requirements. The system was shown to scale well to the architectural problems studied.
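A minimal sketch of a ranked-sum scoring strategy of the kind named above: each individual is ranked on every objective separately, then scored by the sum of its ranks. The assumption that all objectives are minimized, and the example values, are illustrative; this is not the system's actual evaluation code.

```python
def ranked_sum_scores(population_objectives):
    """Ranked-sum multi-objective scoring: rank the population on each
    objective independently (1 = best), then score each individual by the
    sum of its ranks. Assumes every objective is to be minimized."""
    n = len(population_objectives)
    num_objectives = len(population_objectives[0])
    scores = [0] * n
    for j in range(num_objectives):
        # Sort individual indices by their value on objective j.
        order = sorted(range(n), key=lambda i: population_objectives[i][j])
        for rank, i in enumerate(order, start=1):
            scores[i] += rank
    return scores

# Three individuals, two objectives (both minimized); illustrative values.
objs = [(0.2, 5.0), (0.1, 9.0), (0.4, 7.0)]
print(ranked_sum_scores(objs))  # [3, 4, 5]: lower total rank = better overall
```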
Abstract:
Grapevine winter hardiness is a key factor in vineyard success in many cool climate wine regions. Winter hardiness may be governed by a myriad of factors in addition to extreme weather conditions – e.g., soil factors (texture, chemical composition, moisture, drainage), vine water status, and yield – that are unique to each site. It was hypothesized that winter hardiness would be influenced by certain terroir factors, specifically that vines with low water status [more negative leaf water potential (leaf ψ)] would be more winter hardy than vines with high water status (more positive leaf ψ). Twelve vineyard blocks (six each of Riesling and Cabernet franc) throughout the Niagara Region in Ontario, Canada were chosen. Data were collected during the growing season (soil moisture, leaf ψ), at harvest (yield components, berry composition), and during the winter (bud LT50, bud survival). Interpolation and mapping of the variables were completed using ArcGIS 10.1 (ESRI, Redlands, CA), and statistical analyses (Pearson's correlation, principal component analysis, multilinear regression) were performed using XLSTAT. Clear spatial trends were observed in each vineyard for soil moisture, leaf ψ, yield components, berry composition, and LT50. Both leaf ψ and berry weight could predict the LT50 value, with strong positive correlations observed between LT50 and leaf ψ values in eight of the 12 vineyard blocks. In addition, vineyards in different appellations showed many similarities (Niagara Lakeshore, Lincoln Lakeshore, Four Mile Creek, Beamsville Bench). These results suggest that there is a spatial component to winter injury, as with other aspects of terroir, in the Niagara region.
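As an illustration of the prediction step, a minimal multilinear regression of bud LT50 on leaf ψ and berry weight. The study performed this analysis in XLSTAT on field data; the NumPy sketch below uses synthetic numbers with assumed ranges and coefficients, purely as a stand-in.

```python
import numpy as np

# Synthetic stand-ins for field measurements (illustrative ranges only).
rng = np.random.default_rng(42)
leaf_psi = rng.uniform(-1.4, -0.8, 24)   # leaf water potential, MPa (assumed)
berry_wt = rng.uniform(1.0, 2.0, 24)     # berry weight, g (assumed)
# Synthetic LT50 built from assumed coefficients plus noise, in degrees C.
lt50 = -22.0 + 3.0 * leaf_psi + 1.5 * berry_wt + rng.normal(0, 0.3, 24)

# Ordinary least squares: LT50 ~ intercept + leaf_psi + berry_wt.
X = np.column_stack([np.ones_like(leaf_psi), leaf_psi, berry_wt])
coef, *_ = np.linalg.lstsq(X, lt50, rcond=None)
print("intercept, leaf psi and berry weight coefficients:", coef)
```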