965 results for Primary contribution
Abstract:
Campaigners increasingly use online social networking platforms to promote products, ideas and information. A popular method of promoting a product or an idea is to incentivize individuals to evangelize it vigorously by providing referral rewards in the form of discounts, cash backs, or social recognition. Because of budget constraints on scarce resources such as money and manpower, it may not be possible to provide incentives for the entire population, so incentives must be allocated judiciously to appropriate individuals to ensure the highest possible outreach size. We do so by formulating and solving an optimization problem using percolation theory. In particular, we compute the set of individuals to be provided incentives so as to minimize the expected cost while ensuring a given outreach size. We also solve the complementary problem of computing the set of individuals to be incentivized so as to maximize the outreach size for a given cost budget. The optimization problem turns out to be non-trivial: it involves quantities that must be computed by numerically solving a fixed-point equation. Our primary contribution is to show that, for a fairly general cost structure, these optimization problems can be solved by solving a simple linear program. We believe that our approach of using percolation theory to formulate an optimization problem is the first of its kind. (C) 2016 Elsevier B.V. All rights reserved.
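The reduction to a linear program suggests a simple fractional-allocation view. The sketch below is a toy stand-in, not the paper's percolation-based formulation: under an assumed cost structure where each population group has a per-person incentive cost and an expected per-person outreach gain, the fractional relaxation of "minimize cost subject to an outreach target" is solved greedily by cost-effectiveness. All group parameters are illustrative.

```python
def min_cost_allocation(groups, target):
    """Greedy solution of the fractional relaxation: incentivize the most
    cost-effective groups first until the outreach target is met.
    groups: list of (cost_per_person, outreach_per_person, group_size)."""
    # Sort by cost per unit of outreach, cheapest first.
    order = sorted(groups, key=lambda g: g[0] / g[1])
    total_cost, total_outreach = 0.0, 0.0
    for cost, gain, size in order:
        if total_outreach >= target:
            break
        need = (target - total_outreach) / gain   # people still required
        take = min(size, need)                    # fractional take is allowed
        total_cost += take * cost
        total_outreach += take * gain
    return total_cost, total_outreach
```

The budget-constrained variant (maximize outreach for a given cost) is the same greedy run against a cost cap instead of an outreach target.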
Abstract:
The Bayesian perspective of designing for the consequences of hazard is discussed. Structural engineers should be educated in Bayesian theory and its underlying philosophy, and about the centrality to the prediction problem of the predictive distribution. The primary contribution that Bayesianism can make to the debate about extreme possibilities is its clarification of the language of and thinking about risk. Frequentist methodologies are the wrong approach to the decisions that engineers need to make, decisions that involve assessments of abstract future possibilities based on incomplete and abstract information.
Abstract:
We present a model for early vision tasks such as denoising, super-resolution, deblurring, and demosaicing. The model provides a resolution-independent representation of discrete images which admits a truly rotationally invariant prior. The model generalizes several existing approaches: variational methods, finite element methods, and discrete random fields. The primary contribution is a novel energy functional, not previously written down, which combines the discrete measurements from pixels with a continuous-domain world viewed through continuous-domain point-spread functions. The value of the functional is that simple priors (such as total variation and its generalizations) on the continuous-domain world become realistic priors on the sampled images. We show that despite its apparent complexity, optimization of this model depends on just a few computational primitives, which, although tedious to derive, can now be reused in many domains. We define a set of optimization algorithms which greatly overcome the apparent complexity of this model and make its practical application possible. New experimental results include infinite-resolution upsampling, and a method for obtaining subpixel superpixels. © 2012 IEEE.
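As a minimal illustration of the kind of prior mentioned above, the sketch below denoises a 1-D signal by gradient descent on a smoothed total-variation objective. It is a toy stand-in, not the paper's continuous-domain functional; the smoothing constant `eps`, the step size, and the iteration count are assumptions chosen for stability rather than tuned values.

```python
import math

def tv_denoise_1d(y, lam=0.5, step=0.01, iters=300, eps=1e-3):
    """Gradient descent on 0.5*||x - y||^2 + lam * sum_i s(x[i+1] - x[i]),
    where s(d) = sqrt(d^2 + eps) is a smoothed absolute value."""
    x = list(y)
    for _ in range(iters):
        g = [xi - yi for xi, yi in zip(x, y)]      # data-fidelity gradient
        for i in range(len(x) - 1):
            d = x[i + 1] - x[i]
            s = lam * d / math.sqrt(d * d + eps)   # derivative of smoothed |d|
            g[i] -= s
            g[i + 1] += s
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x
```

On a noisy step signal this reduces the total variation of the input while staying close to it, which is the behaviour the prior is meant to encode.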
Abstract:
We present a type system that can effectively facilitate the use of types in capturing invariants in stateful programs that may involve (sophisticated) pointer manipulation. With its root in the recently developed framework Applied Type System (ATS), the type system imposes a level of abstraction on program states by introducing a novel notion of recursive stateful views and then relies on a form of linear logic to reason about such views. We consider the design and then the formalization of the type system to constitute the primary contribution of the paper. In addition, we mention a prototype implementation of the type system and then give a variety of examples that attest to the practicality of programming with recursive stateful views.
Abstract:
Charities need to understand why volunteers choose one brand rather than another in order to attract more volunteers to their organisation. There has been considerable academic interest in understanding why people volunteer generally. However, this research explores the more specific question of why a volunteer chooses one charity brand rather than another. It builds on previous conceptualisations of volunteering as a consumption decision. Seen through the lens of the individual volunteer, it considers the under-researched area of the decision-making process. The research adopts an interpretivist epistemology and subjectivist ontology. Qualitative data were collected through in-depth interviews and analysed using both Means-End Chain (MEC) and Framework Analysis methodology. The primary contribution of the research is to theory: understanding the role of brand in the volunteer decision-making process. It identifies two roles for brand. The first is as a specific reason for choice, an ‘attribute’ of the decision. Through MEC, volunteering for a well-known brand connects directly through to a sense of self, encompassing both self-respect and social recognition by others. All four components of the symbolic consumption construct are found in the data: volunteers choose a well-known brand to say something about themselves. The brand brings credibility and reassurance; it reduces the risk and enables the volunteer to meet their need to make a difference and achieve a sense of accomplishment. The second, closely related role for brand is within the process of making the volunteering decision. Volunteers built up knowledge about the charity brands from a variety of brand touchpoints over time. At the point of decision-making, that brand knowledge and engagement becomes relevant, enabling some to make an automatic choice despite the significant level of commitment being made. The research identifies four types of decision-making behaviour.
The research also makes secondary contributions to MEC methodology and to the non-profit context. It concludes with practical implications for management practice and a rich agenda for future research.
Abstract:
In this paper we discuss the problem of how to discriminate moments of interest in videos or live broadcast shows. The primary contribution is a system which allows users to personalize their programs with previously created media stickers: pieces of content that may be temporarily attached to the original video. We present the system's architecture and implementation, which offer users operators to annotate videos transparently while watching them. We offered a soccer fan the opportunity to add stickers to the video while watching a live match: the user reported both enjoying and being comfortable using the stickers during the match, a relevant result even though the experience was not fully representative.
Abstract:
We present a complete, exact and efficient algorithm for computing the adjacency graph of an arrangement of quadrics (algebraic surfaces of degree 2). This is an important step towards computing the full 3D arrangement. We build on an existing implementation for computing the exact parameterization of the intersection curve of two quadrics. This makes it possible to determine the exact parameter values of the intersection points, to sort them along the curves, and to compute the adjacency graph. We call our implementation complete because it also handles all degenerate cases, such as singular or tangential intersection points. It is exact because it always computes the mathematically correct result. Finally, we call our implementation efficient because it compares well with the only previously implemented approach. Our approach was implemented within the EXACUS project, whose central goal is to develop a prototype of a reliable and powerful CAD geometry kernel. Although we describe the design of our library as prototypical, we place the greatest value on completeness, exactness, efficiency, documentation and reusability. Beyond its direct contribution to EXACUS, the approach presented here, through its particular requirements, also had a substantial influence on fundamental parts of EXACUS. In particular, this work contributed to the generic number-type support and the use of modular methods within EXACUS. In the course of the ongoing integration of EXACUS into CGAL, these parts have already been successfully developed into mature CGAL packages.
Abstract:
The separation of small molecules by capillary electrophoresis is governed by a complex interplay among several physical effects. Until recently, a systematic understanding of how the influence of all of these effects is observed experimentally has remained unclear. The work presented in this thesis involves the use of transient isotachophoretic stacking (tITP) and computer simulation to improve and better understand an in-capillary chemical assay for creatinine. This assay involves the use of electrophoretically mediated micro-analysis (EMMA) to carry out the Jaffé reaction inside a capillary tube. The primary contribution of this work is the elucidation of the role of the length and concentration of the hydroxide plug used to achieve tITP stacking of the product formed by the in-capillary EMMA/Jaffé method. Computer simulation using SIMUL 5.0 predicts that a 3-4 fold gain in sensitivity can be realized by timing the tITP stacking event such that the Jaffé product peak is at its maximum height as that peak is electrophoresing past the detection window. Overall, the length of the hydroxide plug alters the timing of the stacking event, and lower-concentration plugs of hydroxide lead to more rapidly occurring tITP stacking events. Also, the inclusion of intentional tITP stacking in the EMMA/Jaffé method improves the sensitivity of the assay, including at creatinine concentrations within the normal biological range. Ultimately, improvement in assay sensitivity can be rationally designed by using the length and concentration of the hydroxide plug to engineer the timing of the tITP stacking event such that stacking occurs as the Jaffé product is passing the detection window.
Abstract:
This research presents the results of a systematic review based on the collection, reading and analysis of bibliographic sources within a heterogeneous set of 175 studies that form the current bibliographic basis of the W3C document "Cognitive Accessibility User Research" (W3C, 2015a). This bibliographic base is composed of scientific publications drawn from books, articles, conference proceedings and specialized websites, with the particular object of analysis being the search for accessibility guidelines for Web technologies that support the inclusion of people with cognitive disabilities. As part of this research process, the current situation has been surveyed and described, particularly the challenges in the use of information and communication technologies (ICT) for people with learning difficulties or cognitive disabilities such as dyslexia, aphasia, nonverbal learning disorder, ageing-related dementia, attention deficit disorders with or without hyperactivity, autism, Down syndrome and dyscalculia. As the primary contribution of this Master's thesis (TFM), it seeks to draw up a set of criteria that allow an objective evaluation of this topic, with a view to offering a practical and up-to-date approach, presenting the existing guidelines schematically and serving as an orientative synthesis for the accessible design of ICT, in order to provide real support to people with the types of cognitive disability on which this research has focused.
The main result of this study is a set of 36 general guidelines that group the commonalities across the disabilities studied, organized into three categories (text, navigation and general) for easier interpretation and for the management of ICT accessibility for people with cognitive disabilities.
Abstract:
Photoreceptor proteins of the phytochrome family mediate light-induced inhibition of stem (hypocotyl) elongation during the development of photoautotrophy in seedlings. Analyses of overt mutant phenotypes have established the importance of phytochromes A and B (phyA and phyB) in this developmental process, but kinetic information that would augment emerging molecular models of phytochrome signal transduction is absent. We have addressed this deficiency by genetically dissecting phytochrome-response kinetics, after having solved the technical issues that previously limited growth studies of small Arabidopsis seedlings. We show here, with resolution on the order of minutes, that phyA initiated hypocotyl growth inhibition upon the onset of continuous red light. This primary contribution of phyA began to decrease after 3 hr of irradiation, the same time at which immunochemically detectable phyA disappeared and an exclusively phyB-dependent phase of inhibition began. The sequential and coordinated actions of phyA and phyB in red light were not observed in far-red light, which inhibited growth persistently through an exclusively phyA-mediated pathway.
Abstract:
The amplification of demand variation up a supply chain, widely termed 'the Bullwhip Effect', is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies, organisation structure and delays in material and information flow all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exclusive, as subsequent academic and operational studies have shown that orders and/or inventories can exhibit significant variability even if customer demand and lead time are deterministic. This increase in the range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system they were intended to measure. In order to explore this, a plausible abstraction of the operational responses to the Supply Chain Council's SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation the five SCOR Supply Chain Performance Attributes (Reliability, Responsiveness, Flexibility, Cost and Utilisation) were continuously monitored and compared to established targets.
Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) for three independent supply chain roles were made, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand relative to the standard deviation of the downstream demand. Factors employed to build the detailed model include: variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles and operational settings adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes with modelled operational responses in a dynamic discrete event simulation model. This project makes its primary contribution to knowledge by measuring the impact, on supply chain dynamics, of applying a representative performance measurement system.
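The amplification measure described above can be computed directly from the two demand series; a minimal sketch (the sample data are illustrative, not results from the study):

```python
import statistics

def bullwhip_ratio(upstream_orders, downstream_demand):
    """Bullwhip measure used above: variability of the orders a role places
    upstream relative to the variability of the demand it faces downstream."""
    return statistics.stdev(upstream_orders) / statistics.stdev(downstream_demand)
```

A ratio above 1 indicates that the role amplifies demand variation as it passes upstream; below 1, that it attenuates it.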
Abstract:
The thesis aims to define further the biometric correlates in anisometropic eyes in order to provide a structural foundation for propositions concerning the development of ametropia. Biometric data are presented for 40 anisometropes and 40 isometropic controls drawn from Caucasian and Chinese populations. The principal finding was that the main structural correlate of myopia is an increase in axial rather than equatorial dimensions of the posterior globe. This finding has not been previously reported for in vivo work on humans. The computational method described in the thesis is a more accessible method for determination of eye shape than current imaging techniques such as magnetic resonance imaging or laser Doppler interferometry (LDI). Retinal contours derived from LDI and computation were shown to be closely matched. Corneal topography revealed no differences in corneal characteristics in anisometropic eyes, which supports the finding that anisometropia arises from differences in vitreous chamber depth. The corollary to axial expansion in myopia, that is retinal stretch in central regions of the posterior pole, was investigated by measurement of disc-to-fovea distances (DFD) using a scanning laser ophthalmoscope. DFD was found to increase with increased myopia, which demonstrates the primary contribution made by posterior central regions of the globe to axial expansion. The ocular pulse volume and choroidal blood flow, measured with the Ocular Blood Flow Tonograph, were found to be reduced in myopia; the reductions were found to be significantly correlated with vitreous chamber depth. The thesis includes preliminary data on whether the relationship arises from the influx of a blood bolus into eyes of different posterior volumes or represents actual differences in choroidal blood flow. The results presented in this thesis show the utility of computed retinal contour and demonstrate that the structural correlate of myopia is axial rather than equatorial expansion of the vitreous chamber. The technique is suitable for large population studies and its relative simplicity makes it feasible for longitudinal studies on the development of ametropia in, for example, children.
Abstract:
This research project has developed a novel decision support system using Geographical Information Systems and Multi Criteria Decision Analysis and used it to develop and evaluate energy-from-waste policy options. The system was validated by applying it to the UK administrative areas of Cornwall and Warwickshire. Different strategies were defined by the size and number of the facilities, as well as the technology chosen. Using sensitivity analysis on the results from the decision support system, it was found that key decision criteria included those affected by cost, energy efficiency, transport impacts and air/dioxin emissions. The conclusions of this work are that distributed small-scale energy-from-waste facilities score most highly overall and that scale is more important than technology design in determining overall policy impact. This project makes its primary contribution to energy-from-waste planning through its development of a Decision Support System that can assist waste disposal authorities in identifying preferred energy-from-waste options tailored specifically to the socio-geographic characteristics of their jurisdictional areas. The project also highlights the potential of energy-from-waste policies that are seldom given enough attention in the UK, namely those of a smaller-scale and distributed nature that often have technology designed specifically for this market.
Abstract:
Large-scale evacuations are a recurring theme on news channels, whether in response to major natural or manmade disasters. Warning dissemination plays a key part in the success of such large-scale evacuations, and its inadequacy in certain cases has been a 'primary contribution to deaths and injuries' (Hayden et al., 2007). Along with technology-driven 'official' warning channels (e.g. sirens, mass media), unofficial channels (e.g. neighbours, personal contacts, volunteer wardens) have proven to be significant in warning the public of the need to evacuate. Although post-evacuation studies identify the behaviours of evacuees as disseminators of the warning message, there has not been a detailed study that quantifies the effects of such behaviour on warning message dissemination. This paper develops an Agent-Based Simulation (ABS) model of multiple agents (evacuee households) in a hypothetical community to investigate the impact of behaviour as an unofficial channel on overall warning dissemination. Parameters studied include the percentage of people who warn their neighbours, the efficiency of different official warning channels, and the delay time before warning neighbours. Even with a low proportion of people willing to warn their neighbours, the results showed a considerable impact on overall warning dissemination. © 2012 Elsevier B.V. All rights reserved.
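A minimal agent-based sketch of the mechanism described above, with households placed on a ring rather than the paper's community model; all parameter values (official-channel reach, neighbour-warning probability, step count) are illustrative assumptions:

```python
import random

def simulate_warning(n=100, p_official=0.05, p_neighbour=0.4, steps=30, seed=42):
    """Toy warning-dissemination model. Each step, an unwarned household may
    be reached by the official channel (prob p_official); an already-warned
    household may pass the warning to each immediate neighbour on the ring
    (prob p_neighbour). Returns the fraction of households warned."""
    rng = random.Random(seed)
    warned = [False] * n
    for _ in range(steps):
        newly = set()
        for i in range(n):
            if warned[i]:
                # Unofficial channel: try to warn both ring neighbours.
                for j in ((i - 1) % n, (i + 1) % n):
                    if not warned[j] and rng.random() < p_neighbour:
                        newly.add(j)
            elif rng.random() < p_official:
                # Official channel reaches this household directly.
                newly.add(i)
        for i in newly:          # synchronous update at end of step
            warned[i] = True
    return sum(warned) / n
```

Comparing `p_neighbour=0.0` against a modest positive value reproduces the qualitative finding: even a low willingness to warn neighbours noticeably raises overall warning coverage.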
Abstract:
When a task must be executed in a remote or dangerous environment, teleoperation systems may be employed to extend the influence of the human operator. In the case of manipulation tasks, haptic feedback of the forces experienced by the remote (slave) system is often highly useful in improving an operator's ability to perform effectively. In many of these cases (especially teleoperation over the internet and ground-to-space teleoperation), substantial communication latency exists in the control loop and has the strong tendency to cause instability of the system. The first viable solution to this problem in the literature was based on a scattering/wave transformation from transmission line theory. This wave transformation requires the designer to select a wave impedance parameter appropriate to the teleoperation system. It is widely recognized that a small value of wave impedance is well suited to free motion and a large value is preferable for contact tasks. Beyond this basic observation, however, very little guidance exists in the literature regarding the selection of an appropriate value. Moreover, prior research on impedance selection generally fails to account for the fact that in any realistic contact task there will simultaneously exist contact considerations (perpendicular to the surface of contact) and quasi-free-motion considerations (parallel to the surface of contact). The primary contribution of the present work is to introduce an approximate linearized optimum for the choice of wave impedance and to apply this quasi-optimal choice to the Cartesian reality of such a contact task, in which it cannot be expected that a given joint will be either perfectly normal to or perfectly parallel to the motion constraint. The proposed scheme selects a wave impedance matrix that is appropriate to the conditions encountered by the manipulator. 
This choice may be implemented as a static wave impedance value or as a time-varying choice updated according to the instantaneous conditions encountered. A Lyapunov-like analysis is presented demonstrating that time variation in wave impedance will not violate the passivity of the system. Experimental trials, both in simulation and on a haptic feedback device, are presented validating the technique. Consideration is also given to the case of an uncertain environment, in which an a priori impedance choice may not be possible.
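The scattering/wave transformation referred to above has a standard scalar form in the teleoperation literature; the sketch below shows that scalar encode/decode pair with b the wave impedance. The matrix-valued, quasi-optimal impedance choice is the paper's contribution and is not reproduced here.

```python
import math

def wave_encode(velocity, force, b):
    """Map power variables (velocity, force) to wave variables (u, v)
    for transmission over a delayed channel; b is the wave impedance."""
    u = (b * velocity + force) / math.sqrt(2.0 * b)
    v = (b * velocity - force) / math.sqrt(2.0 * b)
    return u, v

def wave_decode(u, v, b):
    """Recover the power variables from the wave variables."""
    velocity = (u + v) / math.sqrt(2.0 * b)
    force = (u - v) * math.sqrt(b / 2.0)
    return velocity, force
```

The transformation is lossless (decode inverts encode) and satisfies the power identity (u^2 - v^2)/2 = force * velocity, which is what lets wave-based schemes preserve passivity regardless of the communication delay.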