878 results for implementation and complexity theory
Abstract:
This paper discusses the advantages of database-backed websites and describes the model for a library website implemented at the University of Nottingham using open source software (PHP and MySQL). As websites continue to grow in size and complexity, it becomes increasingly important to introduce automation to help manage them. It is suggested that a database-backed website offers many advantages over one built from static HTML pages. These include consistency of style and content, the ability to present different views of the same data, devolved editing and enhanced security. The University of Nottingham Library Services website is described, and issues surrounding its design, technological implementation and management are explored.
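Purely as an illustration of the pattern described above — one template drawing content from a database so that style stays consistent, and the same rows being presented as different views — here is a minimal Python/sqlite3 sketch. The real Nottingham site used PHP and MySQL, and the table and column names below are invented for the example:

```python
import sqlite3

# Illustrative schema only: the actual site used PHP/MySQL and a richer data model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (slug TEXT, title TEXT, body TEXT, section TEXT)")
conn.executemany(
    "INSERT INTO pages VALUES (?, ?, ?, ?)",
    [("opening-hours", "Opening Hours", "Mon-Fri 9-5", "services"),
     ("e-journals", "E-Journals", "A-Z list of titles", "collections")],
)

def render_page(slug):
    """One template for every page keeps style consistent across the site."""
    title, body = conn.execute(
        "SELECT title, body FROM pages WHERE slug = ?", (slug,)).fetchone()
    return f"<html><h1>{title}</h1><p>{body}</p></html>"

def render_section_index(section):
    """A second view built from the same rows: a per-section contents list."""
    rows = conn.execute(
        "SELECT slug, title FROM pages WHERE section = ?", (section,)).fetchall()
    items = "".join(f'<li><a href="/{s}">{t}</a></li>' for s, t in rows)
    return f"<ul>{items}</ul>"

print(render_page("opening-hours"))
print(render_section_index("collections"))
```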
Abstract:
An extended formulation of a polyhedron P is a linear description of a polyhedron Q together with a linear map π such that π(Q)=P. These objects are of fundamental importance in polyhedral combinatorics and optimization theory, and the subject of a number of studies. Yannakakis’ factorization theorem (Yannakakis in J Comput Syst Sci 43(3):441–466, 1991) provides a surprising connection between extended formulations and communication complexity, showing that the smallest size of an extended formulation of P equals the nonnegative rank of its slack matrix S. Moreover, Yannakakis also shows that the nonnegative rank of S is at most 2^c, where c is the complexity of any deterministic protocol computing S. In this paper, we show that the latter result can be strengthened when we allow protocols to be randomized. In particular, we prove that the base-2 logarithm of the nonnegative rank of any nonnegative matrix equals the minimum complexity of a randomized communication protocol computing the matrix in expectation. Using Yannakakis’ factorization theorem, this implies that the base-2 logarithm of the smallest size of an extended formulation of a polytope P equals the minimum complexity of a randomized communication protocol computing the slack matrix of P in expectation. We show that allowing randomization in the protocol can be crucial for obtaining small extended formulations. Specifically, we prove that for the spanning tree and perfect matching polytopes, small variance in the protocol forces large size in the extended formulation.
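For reference, the relations described above can be written compactly in standard notation; the symbols xc(P) for the smallest size of an extended formulation, S_P for the slack matrix, and rank_+ for the nonnegative rank are conventional rather than quoted verbatim from the abstract:

```latex
% Yannakakis' factorization theorem: extension complexity equals nonnegative rank
\[ \operatorname{xc}(P) \;=\; \operatorname{rank}_{+}\!\left(S_P\right) \]
% A deterministic protocol of complexity c computing S yields
\[ \operatorname{rank}_{+}(S) \;\le\; 2^{c} \]
% The strengthening proved in this paper (randomized protocols, computation in expectation)
\[ \log_{2} \operatorname{rank}_{+}(S) \;=\; \min\{\, \text{complexity of a randomized protocol computing } S \text{ in expectation} \,\} \]
```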
Abstract:
The challenge of detecting a change in the distribution of data is a sequential decision problem that is relevant to many engineering solutions, including quality control and machine and process monitoring. This dissertation develops techniques for exact solution of change-detection problems with discrete time and discrete observations. Change-detection problems are classified as Bayes or minimax based on the availability of information on the change-time distribution. A Bayes optimal solution uses prior information about the distribution of the change time to minimize the expected cost, whereas a minimax optimal solution minimizes the cost under the worst-case change-time distribution. Both types of problems are addressed. The most important result of the dissertation is the development of a polynomial-time algorithm for the solution of important classes of Markov Bayes change-detection problems. Existing techniques for epsilon-exact solution of partially observable Markov decision processes have complexity exponential in the number of observation symbols. A new algorithm, called constellation induction, exploits the concavity and Lipschitz continuity of the value function, and has complexity polynomial in the number of observation symbols. It is shown that change-detection problems with a geometric change-time distribution and identically- and independently-distributed observations before and after the change are solvable in polynomial time. Also, change-detection problems on hidden Markov models with a fixed number of recurrent states are solvable in polynomial time. A detailed implementation and analysis of the constellation-induction algorithm are provided. Exact solution methods are also established for several types of minimax change-detection problems. Finite-horizon problems with arbitrary observation distributions are modeled as extensive-form games and solved using linear programs. Infinite-horizon problems with linear penalty for detection delay and identically- and independently-distributed observations can be solved in polynomial time via epsilon-optimal parameterization of a cumulative-sum procedure. Finally, the properties of policies for change-detection problems are described and analyzed. Simple classes of formal languages are shown to be sufficient for epsilon-exact solution of change-detection problems, and methods for finding minimally sized policy representations are described.
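As a generic, minimal illustration of the cumulative-sum procedure mentioned above (not the constellation-induction algorithm; the Gaussian pre- and post-change models and the threshold are invented for the example):

```python
import random

# Illustrative pre/post-change models: N(0,1) before the change, N(1,1) after.
MU0, MU1, SIGMA = 0.0, 1.0, 1.0
THRESHOLD = 5.0  # larger threshold -> fewer false alarms, longer detection delay

def log_likelihood_ratio(x):
    """log f1(x)/f0(x) for the two Gaussian models above."""
    return ((x - MU0) ** 2 - (x - MU1) ** 2) / (2 * SIGMA ** 2)

def cusum(observations, threshold=THRESHOLD):
    """Return the first time the CUSUM statistic crosses the threshold, else None."""
    s = 0.0
    for n, x in enumerate(observations, start=1):
        s = max(0.0, s + log_likelihood_ratio(x))
        if s >= threshold:
            return n
    return None

random.seed(0)
change_time = 50
data = [random.gauss(MU0, SIGMA) for _ in range(change_time)] + \
       [random.gauss(MU1, SIGMA) for _ in range(100)]
print("alarm raised at observation", cusum(data))
```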
Abstract:
We present ideas about creating a next generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge with computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats in conjunction with ever larger IT systems urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so why can we not build a similar Intrusion Detection System (IDS) for our computers? Presumably, such systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking and a new 'Danger Theory' (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals and the theory is a topic of hot debate. It is the aim of this research to investigate this correlation and to translate the DT into the realms of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area. Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
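A deliberately toy sketch of the danger-signal idea — raising an alert only when several signals correlate within a window, and 'grounding' it in the sources involved — might look like the following; the signal names, weights and window size are invented and are not part of the project:

```python
from collections import deque

# Hypothetical danger signals a host monitor might emit, with invented weights.
SIGNAL_WEIGHTS = {"cpu_spike": 0.3, "unexpected_syscall": 0.5,
                  "outbound_burst": 0.4, "file_integrity_change": 0.6}
WINDOW = 10       # correlate signals over a sliding window of 10 events
ALERT_LEVEL = 1.0

def correlate(events):
    """Alert only when several danger signals co-occur in the window,
    rather than matching any single event against a self/nonself rule."""
    window = deque(maxlen=WINDOW)
    alerts = []
    for t, signal, source in events:          # (time, signal name, suspected source)
        window.append((t, signal, source))
        score = sum(SIGNAL_WEIGHTS.get(s, 0.0) for _, s, _ in window)
        if score >= ALERT_LEVEL:
            # 'Grounding': the alert is tied to the sources seen in the window.
            alerts.append((t, sorted({src for _, _, src in window})))
            window.clear()
    return alerts

events = [(1, "cpu_spike", "10.0.0.7"), (2, "outbound_burst", "10.0.0.7"),
          (3, "unexpected_syscall", "10.0.0.7")]
print(correlate(events))
```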
Abstract:
This thesis attempts to provide deeper historical and theoretical grounding for sense-making, thereby illustrating its applicability to practical information seeking research. In Chapter One I trace the philosophical origins of Brenda Dervin’s theory known as “sense-making,” reaching beyond current scholarship that locates the origins of sense-making in twentieth-century Phenomenology and Communication theory, and find a rich ontological, epistemological, and etymological heritage that dates back to the Pre-Socratics. After exploring sense-making’s Greek roots, I examine sense-making’s philosophical undercurrents found in Hegel’s Phenomenology of Spirit (1807), where he also returns to the simplicity of the Greeks for his concept of sense. In Chapter Two I explore sense-making methodology and find, in light of the Greek and Hegelian dialectic, a dialogical bridge connecting sense-making’s theory with pragmatic uses. This bridge between Dervin’s situation and use occupies a distinct position in sense-making theory. Moreover, building upon Brenda Dervin’s model of sense-making, I use her metaphors of gap and bridge to discuss the dialectic and dialogic components of sense-making. The purpose of Chapter Three is pragmatic – to gain insight into the online information-seeking needs, experiences, and motivation of first-degree relatives (FDRs) of breast cancer survivors through the lens of sense-making. This research analyses four questions: 1) information-seeking behavior among FDRs of cancer survivors compared to survivors and to undiagnosed, non-related online cancer information seekers in the general population, 2) types of information sought and places where it is sought, 3) barriers or gaps FDRs face in their cancer information quest and their satisfaction rates, and 4) types and degrees of cancer information and resources FDRs want and use in their information search for themselves and other family members. An online survey instrument designed to investigate these questions was developed and pilot tested. Via an email communication, the Susan Love Breast Cancer Research Foundation distributed 322,000 invitations to its membership to complete the survey, and from March 24th to April 5th, 10,692 women agreed to take the survey, with 8,804 volunteers actually completing survey responses. Of the 8,804 completed surveys, 95% of FDRs had searched for cancer information online, and 84% of FDRs used the Internet as a sense-making tool for additional information beyond what they had received from doctors or nurses. FDRs report needing much more information than either survivors or family/friends in ten out of fifteen categories related to breast and ovarian cancer. When searching for cancer information online, FDRs also report higher levels of several of sense-making’s emotional states – uncertainty, confusion, frustration, doubt, and disappointment – than do either survivors or friends and family. The sense-making process has existed in theory and praxis since the early Greeks. In applying sense-making’s theory to a contemporary problem, the survey reveals unaddressed situations and gaps in FDRs’ information search process. FDRs are a highly motivated group of online information seekers whose needs are largely unaddressed as a result of gaps in available online information targeted to their specific needs. Since FDRs represent a quarter of the population, further research addressing their specific online information needs and experiences is necessary.
Abstract:
The problem: Around 300 million people worldwide have asthma and prevalence is increasing. Support for optimal self-management can be effective in improving a range of outcomes and is cost effective, but is underutilised as a treatment strategy. Supporting optimum self-management using digital technology shows promise, but how best to do this is not clear. Aim: The purpose of this project was to explore the potential role of a digital intervention in promoting optimum self-management in adults with asthma. Methods: Following the MRC Guidance on the Development and Evaluation of Complex Interventions, which advocates using theory, evidence, user testing and appropriate modelling and piloting, this project had 3 phases. Phase 1: Examination of the literature to inform phases 2 and 3, using systematic review methods and focussed literature searching. Phase 2: Development of the Living Well with Asthma website. A prototype (paper-based) version of the website was developed iteratively with input from a multidisciplinary expert panel, empirical evidence from the literature (from phase 1), and potential end users via focus groups (adults with asthma and practice nurses). Implementation and behaviour change theories informed this process. The paper-based designs were converted to the website through an iterative user-centred process (think aloud studies with adults with asthma). Participants considered contents, layout, and navigation. Development was agile, with feedback from the think aloud sessions used immediately to inform the design and subsequent think aloud sessions. Phase 3: A pilot randomised controlled trial over 12 weeks to evaluate the feasibility of a Phase III trial of Living Well with Asthma to support self-management. Primary outcomes were 1) recruitment and retention; 2) website use; 3) Asthma Control Questionnaire (ACQ) score change from baseline; 4) Mini Asthma Quality of Life Questionnaire (mini-AQLQ) score change from baseline. Secondary outcomes were patient activation, adherence, lung function, fractional exhaled nitric oxide (FeNO), generic quality of life (EQ-5D), medication use, prescribing and health services contacts. Results: Phase 1: While digital interventions show promise, with some evidence of effectiveness for certain outcomes, participants were poorly characterised, telling us little about the reach of these interventions. The interventions themselves were poorly described, making it impossible to draw definitive conclusions about what worked and what did not. Phase 2: The literature indicated that important aspects to cover in any self-management intervention (digital or not) included: asthma action plans, regular health professional review, trigger avoidance, psychological functioning, self-monitoring, inhaler technique, and goal setting. The website asked users to aim to be symptom free. Key behaviours targeted to achieve this included: optimising medication use (including inhaler technique); attending primary care asthma reviews; using asthma action plans; increasing physical activity levels; and stopping smoking. The website had 11 sections, plus email reminders, which promoted these behaviours. Feedback during think aloud studies was mainly positive, with most changes focussing on clarification of language, order of pages and usability issues mainly relating to navigation difficulties. Phase 3: To achieve our recruitment target, 5383 potential participants were invited, leading to 51 participants randomised (25 to the intervention group). Age range was 16-78 years; 75% were female; 28% were from the most deprived quintile. Nineteen (76%) of the intervention group used the website, for an average of 23 minutes. Non-significant improvements in favour of the intervention group were observed in the ACQ score (-0.36; 95% confidence interval: -0.96, 0.23; p=0.225) and mini-AQLQ score (0.38; -0.13, 0.89; p=0.136). A significant improvement was observed in the activity limitation domain of the mini-AQLQ (0.60; 0.05 to 1.15; p=0.034). Secondary outcomes showed increased patient activation and reduced reliance on reliever medication. There was no significant difference in the remaining secondary outcomes. There were no adverse events. Conclusion: Living Well with Asthma has been shown to be acceptable to potential end users and has potential for effectiveness. This intervention merits further development and subsequent evaluation in a Phase III full-scale RCT.
Abstract:
INTRODUCTION: In common with much of the developed world, Scotland has a severe and well established problem with overweight and obesity in childhood, with recent figures demonstrating that 31% of Scottish children aged 2-15 years were overweight (including obese) in 2014. This problem is more pronounced in socioeconomically disadvantaged groups and in older children across all economic groups (Scottish Health Survey, 2014). Children who are overweight or obese are at increased risk of a number of adverse health outcomes in the short term and throughout their life course (Lobstein and Jackson-Leach, 2006). The Scottish Government tasked all Scottish Health Boards with developing and delivering child healthy weight interventions to clinically overweight or obese children in an attempt to address this health problem. It is therefore imperative to deliver high quality, affordable, appropriately targeted interventions which can make a sustained impact on children’s lifestyles, setting them up for life as healthy weight adults. This research aimed to inform the design, readiness for application and Health Board suitability of an effective primary school-based curricular child healthy weight intervention. METHODS: The process involved in conceptualising a child healthy weight intervention, developing the intervention, planning for implementation and subsequently evaluating it was guided by the PRECEDE-PROCEED Model (Green and Kreuter, 2005) and the Intervention Mapping protocol (Lloyd et al., 2011). RESULTS: The outputs from each stage of the development process were used to formulate a conceptual model for a child healthy weight intervention and then to develop plans for delivery and evaluation. DISCUSSION: The Fit for School conceptual model developed through this process has the theoretical potential to modify the energy balance related behaviours associated with unhealthy weight gain in childhood. It also has the potential to be delivered at Health Board scale within current organisational restrictions.
Abstract:
Doctorate in Management.
Mining and Verification of Temporal Events with Applications in Computer Micro-Architecture Research
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves the computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
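FOLCSL syntax itself is not reproduced in the abstract; the following hypothetical Python sketch only illustrates the underlying idea of checking a first-order-style invariant against an event trace emitted by a simulator (the event names and the trace are invented):

```python
# Hypothetical event trace from a micro-architecture simulator:
# each entry is (cycle, event, instruction_id). Names are illustrative only.
trace = [
    (1, "fetch", 7), (2, "issue", 7), (5, "commit", 7),
    (3, "fetch", 8), (4, "issue", 8),            # instruction 8 never commits
]

def check_invariant(trace):
    """Invariant (in the spirit of a first-order specification):
    for every instruction x, issue(x) implies that commit(x) occurs later."""
    issued, committed = {}, {}
    for cycle, event, inst in trace:
        if event == "issue":
            issued[inst] = cycle
        elif event == "commit":
            committed[inst] = cycle
    return [inst for inst, c in issued.items()
            if inst not in committed or committed[inst] <= c]

violations = check_invariant(trace)
print("invariant holds" if not violations else f"violated for instructions {violations}")
```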
Abstract:
Support Vector Machines (SVMs) are widely used classifiers for detecting physiological patterns in Human-Computer Interaction (HCI). Their success is due to their versatility, robustness and the wide availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameter selection are reported, making it impossible to reproduce the study analysis and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the application of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, by focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics on SVM use in the literature are presented. The suitability of SVM for HCI is discussed and critical comparisons with other classifiers are reported.
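As a minimal, generic example of the kind of reporting the review calls for — stating the kernel, the parameter grid and the cross-validation scheme — here is a scikit-learn sketch on synthetic stand-in features; it is not taken from any of the reviewed studies, and the data are random placeholders for EEG/EMG feature vectors:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                   # stand-in for per-trial EEG/EMG features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # synthetic binary labels

# Report everything needed to reproduce the classifier: kernel, grid, CV scheme.
pipeline = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": ["scale", 0.01, 0.1]}
search = GridSearchCV(pipeline, grid, cv=StratifiedKFold(n_splits=5), scoring="accuracy")
search.fit(X, y)

print("best parameters:", search.best_params_)
print("cross-validated accuracy: %.3f" % search.best_score_)
```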
Abstract:
A Plasma Focus device can confine, in a small region, the plasma generated during the pinch phase. When the plasma is in the pinch condition it creates an environment that produces several kinds of radiation. When the filling gas is nitrogen, a self-collimated, backwardly emitted electron beam, slightly spread by Coulomb repulsion, can be considered one of the most interesting outputs. That beam can be converted into X-ray pulses able to transfer energy at an Ultra-High Dose-Rate (UH-DR), up to 1 Gy per pulse, for clinical applications, research, or industrial purposes. The radiation fields have been studied with the PFMA-3 hosted at the University of Bologna, characterizing the radiation behavior under different operating conditions and working parameters for a proper tuning of this class of devices in clinical applications. The experimental outcomes have been compared with available analytical formalisms as a benchmark, and scaling laws have been proposed. A set of Monte Carlo models has been built with direct and adjoint techniques for an accurate X-ray source characterization and for fast and reliable irradiation planning for patients. By coupling deterministic and Monte Carlo codes, a focusing lens for the charged particles has been designed to obtain a beam suitable for applications such as external radiotherapy or intra-operative radiation therapy. The radiobiological effectiveness of the PF ultra-high dose rate, a FLASH source, has been evaluated by coupling different Monte Carlo codes, estimating the overall level of DNA damage at the multi-cellular and tissue levels while considering the spatial variation effects as well as the radiation field characteristics. The numerical results have been correlated with the experimental outcomes. Finally, ambient dose measurements have been performed for tuning the numerical models and obtaining doses for radiation protection purposes. The PFMA-3 technology has been fully characterized toward clinical implementation and installation in a medical facility.
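As a toy illustration of the Monte Carlo sampling involved — far simpler than the direct and adjoint models used in the work, with an assumed attenuation coefficient and no scattering model — consider this sketch of photon transmission through a water slab:

```python
import math
import random

# Assumed linear attenuation coefficient for water at ~100 keV (illustrative value only).
MU = 0.17          # cm^-1
THICKNESS = 5.0    # cm of water
N_PHOTONS = 100_000

random.seed(1)
transmitted = 0
for _ in range(N_PHOTONS):
    # Sample the free path from an exponential distribution; a photon whose first
    # interaction lies beyond the slab is counted as transmitted (scatter is ignored).
    free_path = random.expovariate(MU)
    if free_path > THICKNESS:
        transmitted += 1

mc_estimate = transmitted / N_PHOTONS
analytic = math.exp(-MU * THICKNESS)
print(f"Monte Carlo transmission: {mc_estimate:.4f}  analytic exp(-mu*t): {analytic:.4f}")
```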
Abstract:
The present Ph.D. thesis proposes three studies on coworking spaces to understand how they foster thriving and organizing in the new world of work. The first study maps and analyzes the thematic structure and evolution of the academic debate that has emerged around coworking spaces in recent years. In doing so, it conducts a science mapping analysis of 351 publications on coworking spaces to detect and visualize key themes in the literature and their co-occurrence with subthemes. The second study proposes an interpretive review of 98 publications from multiple disciplines to shed light on how coworking spaces emerge as sites of organizing for professionals who are not formally connected to one another. It suggests five dimensions that articulate coworking spaces as sites of organizing – ‘materiality,’ ‘temporality,’ ‘affect,’ ‘identity,’ and ‘formalization.’ This study aims to go beyond the community-related understanding of coworking that has characterized most scholarly attention, instead focusing on coworking spaces’ organizational character. The third study investigates what drives thriving at work for remote workers in coworking spaces. In doing so, it acknowledges the potential complex set of interrelationships underpinning thriving at work and mobilizes complexity theory and qualitative comparative analysis to uncover six different, yet equifinal, configurations of antecedents driving remote workers’ thriving in coworking spaces.
Abstract:
In this thesis, the problem of controlling a quadrotor UAV is considered. This is done by presenting an original control system, designed as a combination of Neural Networks and a Disturbance Observer, using a composite learning approach for a second-order system, which is a novel methodology in the literature. After a brief introduction to quadrotors, the concepts needed to understand the controller are presented, such as the main notions of advanced control, the basic structure and design of a Neural Network, and the modeling of a quadrotor and its dynamics. The full simulator, developed in the MATLAB Simulink environment and used throughout the whole thesis, is also shown. For guidance and control purposes, a Sliding Mode Controller, used as a reference, is first introduced, and its theory and implementation in the simulator are illustrated. Finally, the original controller is introduced through its novel formulation and its implementation in the model. The effectiveness and robustness of the two controllers are then proven by extensive simulations under different conditions of external disturbance and faults.
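As a generic illustration of the sliding-mode idea used as the baseline controller — a one-axis double-integrator stand-in rather than the full quadrotor model, with invented gains, boundary layer and disturbance — a minimal sketch:

```python
import math

# Second-order toy plant: x_ddot = u + d(t), standing in for one quadrotor axis.
DT, T_END = 0.001, 5.0
LAMBDA, K = 4.0, 6.0          # sliding-surface slope and switching gain (invented)
PHI = 0.05                    # boundary layer to soften chattering

def disturbance(t):
    return 1.5 * math.sin(2.0 * t)     # bounded external disturbance

def sat(s):
    return max(-1.0, min(1.0, s / PHI))

x, x_dot = 1.0, 0.0            # initial tracking error and its rate (reference = 0)
t = 0.0
while t < T_END:
    s = x_dot + LAMBDA * x                     # sliding surface s = e_dot + lambda*e
    u = -LAMBDA * x_dot - K * sat(s)           # equivalent term + switching term
    x_ddot = u + disturbance(t)                # plant dynamics (Euler integration)
    x_dot += x_ddot * DT
    x += x_dot * DT
    t += DT

print(f"final error {x:.4f}, final rate {x_dot:.4f}")
```

With the switching gain chosen larger than the disturbance bound, the error is driven to a small neighbourhood of zero despite the sinusoidal disturbance, which is the robustness property sliding-mode baselines are typically chosen for.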