992 results for user click behavior
Abstract:
Matita (Italian for "pencil") is a new interactive theorem prover under development at the University of Bologna. Compared with state-of-the-art proof assistants, Matita presents both traditional and innovative aspects. The underlying calculus of the system, namely the Calculus of (Co)Inductive Constructions (CIC for short), is well known and is used as the basis of another mainstream proof assistant, Coq, with which Matita is to some extent compatible. In the same spirit as several other systems, proof authoring is conducted by the user as a goal-directed proof search, using a script to store textual commands for the system. In the tradition of LCF, the proof language of Matita is procedural and relies on tactics and tacticals to proceed toward proof completion. The interaction paradigm offered to the user is based on the script management technique that underlies the popularity of the Proof General generic interface for interactive theorem provers: while editing a script, the user can move the execution point forward to deliver commands to the system, or backward to retract (or "undo") past commands.

Matita has been developed from scratch over the past 8 years by several members of the Helm research group; this thesis's author is one of those members. Matita is now a full-fledged proof assistant with a library of about 1,000 concepts. Several innovative solutions have spun off from this development effort. This thesis is about the design and implementation of some of those solutions, in particular those relevant to user interaction with theorem provers and to which this thesis's author was a major contributor. Joint work with other members of the research group is pointed out where needed. The main topics discussed in this thesis are briefly summarized below.

Disambiguation. Most activities connected with interactive proving require the user to input mathematical formulae. Since mathematical notation is ambiguous, parsing formulae typeset the way mathematicians like to write them down on paper is a challenging task; a challenge neglected by several theorem provers, which usually prefer to fix an unambiguous input syntax. Exploiting features of the underlying calculus, Matita offers an efficient disambiguation engine that permits typing formulae in familiar mathematical notation.

Step-by-step tacticals. Tacticals are higher-order constructs used in proof scripts to combine tactics together. With tacticals, scripts can be made shorter, more readable, and more resilient to changes. Unfortunately, they are de facto incompatible with state-of-the-art user interfaces based on script management. Such interfaces do not permit positioning the execution point inside complex tacticals, thus introducing a trade-off between the usefulness of structured scripts and tedious big-step execution behavior during script replay. In Matita we break this trade-off with tinycals: an alternative to a subset of LCF tacticals which can be evaluated in a more fine-grained manner.

Extensible yet meaningful notation. Proof assistant users often need to create new mathematical notation to ease the use of new concepts. The framework used in Matita for dealing with extensible notation both accounts for high-quality two-dimensional rendering of formulae (with the expressivity of MathML Presentation) and provides meaningful notation, where presentational fragments are kept synchronized with the semantic representation of terms.
Using our approach, interoperability with other systems can be achieved at the content level, and direct manipulation of formulae acting on their rendered forms is possible too.

Publish/subscribe hints. Automation plays an important role in interactive proving, as users like to delegate tedious proving sub-tasks to decision procedures or external reasoners. Exploiting the Web-friendliness of Matita, we experimented with a broker and a network of web services (called tutors) which can independently try to complete open sub-goals of the proof currently being authored in Matita. The user receives hints from the tutors on how to complete sub-goals and can interactively or automatically apply them to the current proof.

Another innovative aspect of Matita, only marginally touched by this thesis, is the embedded content-based search engine Whelp, which is exploited to various ends, from automatic theorem proving to avoiding duplicate work for the user. We also discuss the (potential) reusability in other systems of the widgets presented in this thesis and how we envisage the evolution of user interfaces for interactive theorem provers in the Web 2.0 era.
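To make the tinycals idea concrete, the following is a minimal sketch with purely illustrative names, written in Python rather than Matita's OCaml: the script is kept as a flat stream of atomic items, so a user interface can place the execution point between any two of them, even inside a sequence that an LCF tactical would execute atomically.

```python
# Minimal, illustrative sketch of step-by-step tactical evaluation in the
# spirit of tinycals; not Matita's actual code or API.

class Engine:
    def __init__(self, goals):
        self.goals = list(goals)          # stack of open goals

    def execute(self, item):
        """Apply one atomic script item and return the remaining goals."""
        kind, payload = item
        if kind == "tactic":              # payload: goal -> list of subgoals
            first, *rest = self.goals
            self.goals = payload(first) + rest
        return self.goals

# Toy tactics: split one goal into two subgoals, or close it outright.
split = lambda goal: [goal + ".1", goal + ".2"]
close = lambda goal: []

engine = Engine(["A and B"])
script = [("tactic", split), ("tactic", close), ("tactic", close)]
for item in script:                       # the UI advances one item at a time,
    print(engine.execute(item))           # pausing "inside" the sequence
```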
Abstract:
Reinforced concrete columns may fail by buckling of the longitudinal reinforcing bars when subjected to earthquake motions. Depending on the hoop stiffness and the length-over-diameter ratio, the instability can be local (between two subsequent hoops) or global (the buckling length comprises several hoop spacings). To gain insight into the topic, an extensive literature review of 19 existing models was carried out, covering different approaches and assumptions that yield different results. Finite element fiber analysis was carried out to study the local buckling behavior with varying length-over-diameter and initial imperfection-over-diameter ratios. The comparison of the analytical results with experimental results shows good agreement until the post-buckling behavior undergoes large deformations. Furthermore, different global buckling analysis cases were run considering the influence of different parameters; for certain hoop stiffnesses and length-over-diameter ratios, local buckling was encountered. A parametric study yields a nondimensional critical stress as a function of a stiffness ratio characterizing the reinforcement configuration.
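As a point of reference for the local buckling problem (a textbook sketch, not the thesis's own parametric model), the classical Euler critical stress for a circular bar of diameter d restrained between two subsequent hoops spaced L apart shows why the length-over-diameter ratio governs:

```latex
% Illustrative classical result, not the thesis's model: Euler buckling of a
% circular bar between hoops, with effective-length factor k set by the hoop
% stiffness.
\[
  \sigma_{cr} = \frac{\pi^2 E}{(kL/r)^2},
  \qquad r = \sqrt{I/A} = \frac{d}{4}
  \quad\Longrightarrow\quad
  \sigma_{cr} = \frac{\pi^2 E}{16\,k^2\,(L/d)^2}.
\]
```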
Abstract:
The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept that has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and his or her social context; from this need the interdisciplinary field called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (the interconnected conditions that occur in an activity) and individualization (the characteristics that distinguish an individual). This shift of focus to the individual's need undermines the rigid linearity of the classical model, which has been overtaken by the "berry picking" model: search terms change thanks to the informational feedback received from the search activity, introducing the concept of evolving search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed: it is now expressed through the generation of the query together with its own context. Indeed, the method was conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the idea of optimizing every IR system led to developing it as a middleware of interaction between the user and the IR system. The system thus has just two possible actions: rewriting the query and reordering the results. These actions are equivalent to those of the approach described by PS, which generally exploits information derived from the analysis of user behavior, while the proposed approach exploits knowledge provided by the user. The thesis goes further and presents a novel assessment procedure, following the "Cranfield paradigm", to evaluate this type of IR system. The results achieved are interesting considering both the effectiveness obtained and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
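The two-action middleware described above can be sketched as follows, with hypothetical function names and a toy knowledge base standing in for the thesis's local knowledge base:

```python
# Sketch of a personalization middleware with exactly two actions: rewrite
# the query using a local knowledge base, then reorder the engine's results.
# Names and the toy knowledge base are illustrative, not the thesis's code.

def rewrite_query(query, knowledge_base):
    """Expand the query with context terms from the local knowledge base."""
    context = knowledge_base.get(query.lower(), [])
    return " ".join([query] + context)

def reorder_results(results, knowledge_base, query):
    """Move results that mention the user's context terms to the front."""
    context = knowledge_base.get(query.lower(), [])
    score = lambda doc: sum(term in doc.lower() for term in context)
    return sorted(results, key=score, reverse=True)

def search_with_middleware(query, engine, knowledge_base):
    hits = engine(rewrite_query(query, knowledge_base))   # action 1: rewrite
    return reorder_results(hits, knowledge_base, query)   # action 2: reorder

# Usage: the user's knowledge base disambiguates "jaguar" toward wildlife.
kb = {"jaguar": ["animal", "wildlife"]}
fake_engine = lambda q: ["Jaguar cars sale", "Jaguar: wildlife of the Amazon"]
print(search_with_middleware("jaguar", fake_engine, kb))
```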
Abstract:
Positron emission tomography (PET) is a powerful, non-invasive imaging technique in nuclear medicine and is, moreover, of growing importance in drug development. To improve the therapeutic index of low-molecular-weight drugs, drug delivery systems are increasingly used. One class of these drug delivery systems are liposomes. A further development of classical liposomes are so-called "stealth" liposomes, which carry a polyethylene glycol (PEG) corona to reduce recognition and clearance. For their (further) development and in vivo evaluation, PET offers the possibility to investigate the effects of structural modifications on the pharmacokinetic properties of such delivery systems. To evaluate novel cholesterol-anchored, linear-hyperbranched polyglycerols (Ch-PEG-hbPG) as sterically stabilizing polymers in liposomes, in this work they were radiolabeled in very high yields with the prosthetic group 18F-TEG-N3 via copper-catalyzed alkyne-azide cycloaddition (CuAAC). For a systematic comparison of the in vivo behavior, a cholesterol-based linear PEG (Ch-PEG) was likewise radiolabeled almost quantitatively with CuAAC. As a third element, the direct labeling of cholesterol with [18F]F- was developed. These three compounds were investigated, first separately as single components and then formulated in liposomes, in animal studies in mice with respect to their initial pharmacokinetics and biodistribution. The novel Ch-PEG-hbPG derivatives showed behavior similar to the known Ch-PEG, with the advantage of multifunctionality on the hyperbranched structures. The liposomal structures with the novel steric stabilization exhibited an increased blood circulation time and advantageous blood-to-liver and blood-to-lung ratios compared to the linearly stabilized analogues. A further class of drug delivery systems are polymeric carrier systems such as pHPMA. Alkyne-functionalized polymers of two different sizes (~12 and 60 kDa) were radiolabeled in very high yields with the prosthetic group 18F-TEG-N3 via CuAAC. Bicyclononyne derivatives of the same sizes were labeled without copper catalysis via strain-promoted alkyne-azide cycloaddition (SPAAC) under microwave assistance and are thus available for in vivo investigation of the influence of the labeling method.
Abstract:
Cognitive Wireless Sensor Networks (CWSNs) are a new paradigm that integrates cognitive features into traditional Wireless Sensor Networks (WSNs) to mitigate important problems such as spectrum occupancy. Security in CWSNs is an important problem because these networks manage critical applications and data. Moreover, the specific constraints of WSNs make the problem even more critical. However, effective solutions have not been implemented yet. Among the specific attacks derived from the new cognitive features, the most studied is the Primary User Emulation (PUE) attack. This paper discusses a new approach, based on anomalous behavior detection and collaboration, to detect the PUE attack in CWSN scenarios. A nonparametric CUSUM algorithm, suitable for low-resource networks like CWSNs, has been used in this work. The algorithm has been tested using a cognitive simulator, yielding important results in this area. For example, the results show that the number of collaborating nodes is the most important parameter for improving the PUE attack detection rate: if 20% of the nodes collaborate, PUE detection reaches 98% with less than 1% false positives.
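The nonparametric CUSUM statistic used in this family of detectors can be sketched in a few lines; the parameters and the deviation feature below are illustrative, not the paper's:

```python
# Minimal sketch of a nonparametric CUSUM detector of the kind applied to
# PUE attacks (illustrative parameters, not the paper's values). The
# statistic accumulates sustained positive deviations of an observed feature
# (e.g. mismatch between the received signal and the expected primary-user
# profile) and raises an alarm once it crosses a threshold.

def cusum(samples, drift=0.5, threshold=3.0):
    """Return the index of the first alarm, or None if no change detected."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + x - drift)   # only sustained positive deviations grow
        if s > threshold:
            return i
    return None

# Usage: deviations stay near zero while the real primary user transmits,
# then jump when an attacker starts emulating it.
normal = [0.1, -0.2, 0.3, 0.0, 0.2]
attack = [1.2, 1.5, 1.1, 1.4, 1.3]
print(cusum(normal + attack))   # alarms a few samples after the attack begins
```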
Abstract:
This paper describes a knowledge-based approach for summarizing and presenting the behavior of hydrologic networks. This approach has been designed for visualizing data from sensors and simulations in the context of emergencies caused by floods. It follows a solution for event summarization that exploits physical properties of the dynamic system to automatically generate summaries of relevant data. The summarized information is presented using different modes such as text, 2D graphics and 3D animations on virtual terrains. The presentation is automatically generated using a hierarchical planner with abstract presentation fragments corresponding to discourse patterns, taking into account the characteristics of the user who receives the information and constraints imposed by the communication devices (mobile phone, computer, fax, etc.). An application following this approach has been developed for a national hydrologic information infrastructure of Spain.
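A toy sketch of hierarchical presentation planning of this kind, with hypothetical fragment names and device budgets (the paper's planner is not reproduced here): abstract fragments decompose into discourse-pattern alternatives, and the planner picks the richest decomposition that fits the communication device.

```python
# Toy hierarchical planner: abstract fragments decompose into smaller
# fragments until primitives remain, filtered by a device "budget".
# All names and costs are hypothetical, for illustration only.

METHODS = {  # abstract fragment -> candidate decompositions (discourse patterns)
    "flood-summary": [["headline", "trend-text", "map-3d"],
                      ["headline", "trend-text"]],        # fallback: text only
}
PRIMITIVE_COST = {"headline": 1, "trend-text": 2, "map-3d": 50}

def plan(task, budget):
    """Expand `task` into primitives whose total cost fits the budget."""
    if task in PRIMITIVE_COST:                      # primitive fragment
        return [task] if PRIMITIVE_COST[task] <= budget else None
    for decomposition in METHODS.get(task, []):     # try richest pattern first
        steps, remaining = [], budget
        for sub in decomposition:
            sub_plan = plan(sub, remaining)
            if sub_plan is None:
                break
            steps += sub_plan
            remaining -= sum(PRIMITIVE_COST[s] for s in sub_plan)
        else:
            return steps
    return None

print(plan("flood-summary", budget=100))  # desktop: text + 3D animation
print(plan("flood-summary", budget=5))    # mobile phone: text only
```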
Abstract:
Modern sensor technologies and simulators applied to large and complex dynamic systems (such as road traffic networks, sets of river channels, etc.) produce large amounts of behavior data that are difficult for users to interpret and analyze. Software tools that generate presentations combining text and graphics can help users understand this data. In this paper we describe the results of our research on automatic multimedia presentation generation (including text, graphics, maps, images, etc.) for interactive exploration of behavior datasets. We designed a novel user interface that combines automatically generated text and graphical resources. We describe the general knowledge-based design of our presentation generation tool. We also present applications that we developed to validate the method, and a comparison with related work.
Abstract:
Web transaction data between Web visitors and Web functionalities usually conveys user task-oriented behavior patterns, and mining this type of click-stream data makes it possible to capture usage pattern information. Nowadays, Web usage mining has become one of the most widely used methods for Web recommendation, which customizes Web content to a user's preferred style. Traditional Web usage mining techniques, such as Web user session or Web page clustering, association rules, and frequent navigational path mining, can only discover usage patterns explicitly. They cannot, however, reveal the underlying navigational activities or identify the latent relationships associated with the patterns among Web users as well as Web pages. In this work, we propose a Web recommendation framework incorporating a Web usage mining technique based on the Probabilistic Latent Semantic Analysis (PLSA) model. The main advantage of this method is that it discovers not only usage-based access patterns but also the underlying latent factors. With the discovered user access patterns, we then present the user with more interesting content via collaborative recommendation. To validate the effectiveness of the proposed approach, we conduct experiments on real-world datasets and make comparisons with some existing traditional techniques. The preliminary experimental results demonstrate the usability of the proposed approach.
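The PLSA model at the core of such a framework factors the user-page co-occurrence probability as P(u,p) = sum over z of P(z)P(u|z)P(p|z) and is typically fitted with EM; a compact illustrative sketch (not the authors' code):

```python
# Compact numpy sketch of PLSA on a user-by-page click matrix, fitted with
# EM. Illustrative only; not the paper's implementation.
import numpy as np

def plsa(counts, n_topics, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n_users, n_pages = counts.shape
    pz = np.full(n_topics, 1.0 / n_topics)                     # P(z)
    puz = rng.random((n_users, n_topics)); puz /= puz.sum(0)   # P(u|z)
    ppz = rng.random((n_pages, n_topics)); ppz /= ppz.sum(0)   # P(p|z)
    for _ in range(n_iter):
        # E-step: responsibilities P(z|u,p), shape (users, pages, topics)
        joint = pz * puz[:, None, :] * ppz[None, :, :]
        post = joint / joint.sum(2, keepdims=True)
        # M-step: re-estimate parameters from expected counts
        weighted = counts[:, :, None] * post
        pz = weighted.sum((0, 1)); pz /= pz.sum()
        puz = weighted.sum(1); puz /= puz.sum(0)
        ppz = weighted.sum(0); ppz /= ppz.sum(0)
    return pz, puz, ppz

# Usage: two latent "tasks" hidden in a tiny click matrix.
clicks = np.array([[5, 4, 0, 0], [4, 5, 0, 0], [0, 0, 5, 4]])
pz, puz, ppz = plsa(clicks, n_topics=2)
print(ppz.round(2))   # the two topics separate the two page groups
```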
Abstract:
Substance use has an effect on an individual's propensity to commit acquisitive crime, with recent studies showing substance users more likely to leave forensic material at a crime scene. An examination of acquisitive crime solved in Northamptonshire, U.K., during 2006 enabled 70 crime scene behavior characteristics to be analyzed for substance-using and non-substance-using offenders. Logistic regression analyses identified statistically significant crime scene behavior predictors that were found to be either present at or absent from the crime scene when the offender was a substance user. The most significant predictors present were indicative of a lack of preparation by the offender, irrational behavior, and a desire to steal high-value, easily disposed of property. The most significant predictors absent from the crime scene were indicative of more planning, preparation, and execution by the offender. Consideration is given to how this crime scene behavior might be used by police investigators to identify offenders.
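The kind of analysis described can be sketched as follows; the binary crime-scene features and the data are hypothetical, not the Northamptonshire dataset:

```python
# Minimal sketch of logistic regression on binary crime-scene behavior
# indicators predicting substance-using offenders. Features and data are
# hypothetical, for illustration only.
from sklearn.linear_model import LogisticRegression

features = ["forced_entry_untidy", "high_value_goods", "premises_searched"]
X = [[1, 1, 0], [1, 1, 0], [1, 0, 0], [0, 0, 1], [0, 1, 1], [0, 0, 1]]
y = [1, 1, 1, 0, 0, 0]          # 1 = substance user, 0 = non-user

model = LogisticRegression().fit(X, y)
for name, coef in zip(features, model.coef_[0]):
    direction = "present at" if coef > 0 else "absent from"
    print(f"{name}: typically {direction} substance-use scenes ({coef:+.2f})")
```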
Abstract:
Using the resistance literature as an underpinning theoretical framework, this chapter analyzes how Web designers, through their daily practices, (i) adopt recursive, adaptive, and resisting behavior regarding the inclusion of social cues online and (ii) shape the socio-technical power relationship between designers and other stakeholders. Five vignettes in the form of case studies with expert individual Web designers are used. Findings point to three types of emerging resistance, namely market-driven resistance, ideological resistance, and functional resistance. In addition, a series of propositions is provided linking the various themes. Furthermore, the authors suggest that stratification in Web designer types is occurring and that resistance offers a novel lens through which to analyze the debate.
Abstract:
In recent years, mobile technology has been one of the major growth areas in computing. Designing the user interface for mobile applications, however, is a very complex undertaking which is made even more challenging by the rapid technological developments in mobile hardware. Mobile human-computer interaction, unlike desktop-based interaction, must be cognizant of a variety of complex contextual factors affecting both users and technology. The Handbook of Research on User Interface Design and Evaluation provides students, researchers, educators, and practitioners with a compendium of research on the key issues surrounding the design and evaluation of mobile user interfaces, such as the physical environment and social context in which a mobile device is being used and the impact of multitasking behavior typically exhibited by mobile-device users. Compiling the expertise of over 150 leading experts from 26 countries, this exemplary reference tool will make an indispensable addition to every library collection.
Abstract:
The ability to analyze, quantify, and forecast epidemic outbreaks is fundamental when devising effective disease containment strategies. Policy makers are faced with the intricate task of drafting realistically implementable policies that strike a balance between risk management and cost. Two major techniques policy makers have at their disposal are epidemic modeling and contact tracing. Models are used to forecast the evolution of the epidemic both globally and regionally, while contact tracing is used to reconstruct the chain of people who have been potentially infected, so that they can be tested, isolated, and treated immediately. However, both techniques might provide limited information, especially during an already advanced crisis when the need for action is urgent. In this paper we propose an alternative approach that goes beyond epidemic modeling and contact tracing and leverages behavioral data generated by mobile carrier networks to evaluate contagion risk on a per-user basis. The individual risk represents the loss incurred by not isolating or treating a specific person, in terms both of how likely that person is to spread the disease and of how many secondary infections he or she will cause. To this aim, we develop a model, named Progmosis, which quantifies this risk based on movement and regionally aggregated statistics about infection rates. We develop and release an open-source tool that calculates this risk based on cellular network events. We simulate a realistic epidemic scenario, based on an Ebola virus outbreak, and find that gradually restricting the mobility of a subset of individuals reduces the number of infected people after 30 days by 24%.
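An illustrative sketch of a per-user risk computed from mobility and regional infection rates follows; the formula and all names are assumptions for illustration, not the actual Progmosis model:

```python
# Hypothetical per-user contagion risk from mobility data: combine how
# likely a user is to be infected, from time spent in regions with known
# infection rates, with how many contacts they meet. Not the Progmosis code.

def infection_probability(time_by_region, region_rates):
    """Probability of having been infected, one independent trial per hour."""
    p_not_infected = 1.0
    for region, hours in time_by_region.items():
        p_not_infected *= (1.0 - region_rates[region]) ** hours
    return 1.0 - p_not_infected

def contagion_risk(time_by_region, region_rates, daily_contacts):
    """Expected secondary infections risked by not isolating this user."""
    return infection_probability(time_by_region, region_rates) * daily_contacts

rates = {"district_a": 0.002, "district_b": 0.0001}   # hourly infection rates
mobility = {"district_a": 30, "district_b": 10}       # hours, from call records
print(contagion_risk(mobility, rates, daily_contacts=25))
```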
Abstract:
Fast-spreading unknown viruses have caused major damage to computer systems upon their initial release. Current detection methods lack the capability to detect unknown viruses quickly enough to avoid mass spreading and damage. This dissertation presents a behavior-based approach to detecting known and unknown viruses based on their attempt to replicate. Replication is the qualifying fundamental characteristic of a virus and is consistently present in all viruses, making this approach applicable to viruses belonging to many classes and executing under several conditions. A form of replication called self-reference replication (SR-replication) has been formalized as one main type of replication, which specifically replicates by modifying or creating other files on a system to include the virus itself. This replication type was used to detect viruses attempting replication by referencing themselves, a necessary step to successfully replicate files. The approach does not require a priori knowledge about known viruses. Detection was accomplished at runtime by monitoring currently executing processes attempting to replicate. Two implementation prototypes of the detection approach, called SRRAT, were created and tested on Microsoft Windows operating systems, focusing on the tracking of user-mode Win32 API system calls and kernel-mode system services. The research results showed SR-replication capable of distinguishing between file-infecting viruses and benign processes with little or no false positives and false negatives.
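The self-reference test at the heart of the approach can be sketched as follows; this is a simplification (the actual SRRAT prototypes hooked user-mode Win32 API calls and kernel-mode system services rather than scanning files after the fact):

```python
# Simplified sketch of the SR-replication check: flag a process as
# replicating when a file it writes comes to contain the bytes of the
# process's own executable image. Illustrative only, not SRRAT itself.

import sys

def is_self_referencing_write(process_image_path, written_file_path):
    """True if the written file embeds the writing process's own image."""
    with open(process_image_path, "rb") as f:
        own_bytes = f.read()
    with open(written_file_path, "rb") as f:
        written = f.read()
    return own_bytes in written      # self-reference: the mark of replication

# Usage: a runtime monitor would run this test on every intercepted
# file-write event; here we just check one suspect file.
if __name__ == "__main__":
    suspect = sys.argv[1]            # path of a freshly written file
    print(is_self_referencing_write(sys.executable, suspect))
```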
Abstract:
The design of interfaces to facilitate user search has become critical for search engines, e-commerce sites, and intranets. This study investigated the use of targeted instructional hints to improve search by measuring the quantitative effects on users' performance and satisfaction. The effects of syntactic, semantic, and exemplar search hints on user behavior were evaluated in an empirical investigation using naturalistic scenarios. Combining the three search hint components, each with two levels of intensity, in a factorial design generated eight search engine interfaces. Eighty participants took part in the study, and each completed six realistic search tasks. Results revealed that the inclusion of search hints improved user effectiveness, efficiency, and confidence when using the search interfaces, but with complex interactions that require specific guidelines for search interface designers. These design guidelines will allow search designers to create more effective interfaces for a variety of search applications.
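The eight interfaces follow from the 2x2x2 factorial combination of the three hint components; a small sketch, with illustrative level names:

```python
# Enumerate the eight interface conditions of a 2x2x2 factorial design over
# the three hint components. Level names are illustrative.
from itertools import product

components = ("syntactic", "semantic", "exemplar")
levels = ("low", "high")

interfaces = list(product(levels, repeat=len(components)))
for i, combo in enumerate(interfaces, start=1):
    settings = ", ".join(f"{c}={l}" for c, l in zip(components, combo))
    print(f"interface {i}: {settings}")   # 2**3 = 8 interfaces in total
```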