895 results for Customer-value based approach
Abstract:
DNA is nowadays swabbed routinely to investigate serious and volume crimes, but research remains scarce when it comes to determining the criteria that may impact the success rate of DNA swabs taken on different surfaces and in different situations. To investigate these criteria in fully operational conditions, the DNA analysis results of 4772 swabs taken by the forensic unit of a police department in Western Switzerland over a 2.5-year period (2012-2014) in volume crime cases were considered. A representative, random sample of 1236 swab analyses was extensively examined and codified according to several criteria, such as whether the swabbing was performed at the scene or in the lab, the zone of the scene where it was performed, the kind of object or surface that was swabbed, whether the target specimen was a touch surface or a biological fluid, and whether the swab targeted a single surface or combined different surfaces. The impact of each criterion and of their combination was assessed with regard to the success rate of DNA analysis, measured through the quality of the resulting profile and whether the profile resulted in a hit in the national database or not. Results show that some situations - such as swabs taken on door and window handles, for instance - have a higher success rate than the average swab. Conversely, other situations lead to a marked decrease in the success rate, which should discourage further analyses of such swabs. Results also confirm that targeting a DNA swab at a single surface is preferable to swabbing different surfaces with the intent of aggregating cells deposited by the offender. Such results assist in predicting the chance that the analysis of a swab taken in a given situation will lead to a positive result. The study could therefore inform an evidence-based approach to decision-making at the crime scene (what to swab or not) and at the triage step (what to analyse or not), thus contributing to saving resources and increasing the efficiency of forensic science efforts.
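As a rough illustration of the kind of per-criterion success-rate tabulation described above, the sketch below groups hypothetical swab records by the surface criterion and computes the share that yielded an interpretable profile or a database hit; the column names and records are invented for the example, not taken from the study's data.

```python
# Minimal sketch (not the study's actual pipeline): per-criterion success
# rates from a table of swab records. Column names and rows are hypothetical.
import pandas as pd

swabs = pd.DataFrame({
    "surface":    ["door_handle", "window_handle", "steering_wheel", "door_handle"],
    "specimen":   ["touch", "touch", "touch", "blood"],
    "profile_ok": [True, True, False, True],   # interpretable DNA profile obtained
    "db_hit":     [True, False, False, True],  # hit in the national database
})

# Mean of the boolean columns = success rate per swabbed surface.
rates = swabs.groupby("surface")[["profile_ok", "db_hit"]].mean()
print(rates)
```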
Abstract:
JXTA is a mature set of open protocols, with more than 10 years of history, that enables the creation and deployment of peer-to-peer (P2P) networks, allowing the execution of services in a distributed manner. Throughout its lifecycle, it has slowly evolved in order to appeal to a broad set of different applications. Part of this evolution includes providing basic security capabilities in its protocols in order to achieve some degree of message privacy and authentication. However, in some contexts, more advanced security requirements should be met, such as anonymity. There are several methods to attain anonymity in generic P2P networks. In this paper, we propose how to adapt a replicated message-based approach to JXTA, by taking advantage of its idiosyncrasies and capabilities.
Abstract:
The purpose of this thesis is to examine how services can be developed and how the voice of the customer can be incorporated into the strategic planning of services. Furthermore, the objective is to investigate the methods of customer need analysis and service bundling. The data is collected from secondary and primary sources by reviewing the existing academic literature and by conducting in-depth interviews and surveys. The main findings of this research indicate that service development in the personal security service industry should be conducted through a formalized process, and that the process should begin with setting the strategic objectives. Moreover, the voice of the customer should be incorporated into all stages of the development process, especially into the front end of the process. Furthermore, the information on customer needs should be gathered in a manner tailored for the purposes of service development.
Abstract:
The paper studied the marketing of automatic fire suppression systems from the perspectives of customer value and institutions. The objective of the study was to investigate the special features of the sales and marketing of fire suppression systems, and to find practical applications for sales and for the lobbying of a new fire suppression technology. The theoretical background of the study lies in the customer value literature and in the concept of institutional entrepreneurship. The research was conducted as an electronic survey of three different groups of respondents: end customers, solution integrators, and re-sellers. From the answers, generalisations were drawn about customer value assessment and about how that value is communicated in the sales and marketing processes of fire suppression systems. In addition, the ways in which the different parties receive information about the systems, and the effects of institutions on their decision making, were observed. The findings of the study support companies that are launching a new safety technology to the market in focusing their marketing, and help them understand the institutional forces that affect a safety-related product.
Abstract:
Wind power is a rapidly developing, low-emission form of energy production. In Finland, the official objective is to increase wind power capacity from the current 1 005 MW up to 3 500–4 000 MW by 2025. By the end of April 2015, the total capacity of all wind power projects being planned in Finland had surpassed 11 000 MW. As the number of projects in Finland is at a record high, an increasing amount of infrastructure is also being planned and constructed. Traditionally, these planning operations are conducted using manual and labor-intensive work methods that are prone to subjectivity. This study introduces a GIS-based methodology for determining optimal paths to support the planning of onshore wind park infrastructure alignment in the Nordanå-Lövböle wind park located on the island of Kemiönsaari in Southwest Finland. The presented methodology utilizes a least-cost path (LCP) algorithm for searching for optimal paths within a high-resolution real-world terrain dataset derived from airborne lidar scannings. In addition, planning data is used to provide a realistic planning framework for the analysis. In order to produce realistic results, the physiographic and planning datasets are standardized and weighted according to qualitative suitability assessments by utilizing methods and practices offered by multi-criteria evaluation (MCE). The results are presented as scenarios corresponding to various planning objectives. Finally, the methodology is documented by using tools of Business Process Management (BPM). The results show that the presented methodology can be effectively used to search for and identify extensive, 20 to 35 kilometers long networks of paths that correspond to certain optimization objectives in the study area. The utilization of high-resolution terrain data produces a more objective and more detailed path alignment plan. This study demonstrates that the presented methodology can be practically applied to support a wind power infrastructure alignment planning process. The six-phase structure of the methodology allows straightforward incorporation of different optimization objectives. The methodology responds well to combining quantitative and qualitative data. Additionally, the careful documentation presents an example of how the methodology can be evaluated and developed as a business process. This thesis also shows that more emphasis on the research of algorithm-based, more objective methods for the planning of infrastructure alignment is desirable, as technological development has only recently started to realize the potential of these computational methods.
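To make the weighted-overlay and least-cost-path step concrete, here is a minimal sketch of the general idea: criterion rasters are standardized, combined with MCE-style weights into a cost surface, and an LCP is traced across it. The rasters, weights and endpoints are placeholders, and scikit-image is used merely as one convenient LCP implementation; none of this reproduces the thesis's actual data or toolchain.

```python
# Illustrative sketch of weighted overlay (MCE) followed by a least-cost path;
# all rasters, weights and endpoints below are made up for the example.
import numpy as np
from skimage.graph import route_through_array

# Standardized criterion rasters on a common grid, values in [0, 1]
# (e.g. slope suitability, land-cover suitability, planning restrictions).
slope      = np.random.rand(200, 200)
land_cover = np.random.rand(200, 200)
planning   = np.random.rand(200, 200)

# Weighted linear combination (weights from a qualitative suitability assessment).
weights = {"slope": 0.5, "land_cover": 0.3, "planning": 0.2}
cost = (weights["slope"] * slope
        + weights["land_cover"] * land_cover
        + weights["planning"] * planning) + 1e-6   # keep costs strictly positive

# Least-cost path between a turbine site and the grid connection point.
start, end = (10, 10), (190, 185)
path, total_cost = route_through_array(cost, start, end,
                                       fully_connected=True, geometric=True)
print(len(path), "cells, accumulated cost:", round(total_cost, 2))
```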
Abstract:
Described herein is the chemoenzymatic total synthesis of several Amaryllidaceae constituents and their unnatural C-1 analogues. A new approach to pancratistatin and related compounds will be discussed along with the completed total syntheses of 7-deoxypancratistatin and trans-dihydrolycoricidine. The evaluation of all new C-1 analogues as cancer cell growth inhibitory agents is described. The enzymatic oxidation of dibromobenzenes by Escherichia coli JM109 (pDTG601) is presented along with the conversion of their metabolites to (-)-conduritol E. An investigation into the steric and functional factors governing the enzymatic dihydroxylation of various benzoates by the same organism is also discussed. The synthetic utility of these metabolites is demonstrated through their conversion to pseudo-sugars, aminocyclitols, and complex bicyclic ring systems. Current work on the total synthesis of some morphine alkaloids is also presented. Highlighted will be the synthesis of several model systems related to the efficient total synthesis of thebaine.
Abstract:
In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions which include normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to Gibbons, Ross and Shanken's mean-variance efficiency tests. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to cast more light on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and a multivariate generalization of the well known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results [over five-year subperiods] show the following: (i) multivariate normality is rejected in most subperiods, (ii) residual checks reveal no significant departures from the multivariate i.i.d. assumption, and (iii) mean-variance efficiency of the market portfolio is not rejected as frequently once the possibility of non-normal errors is allowed for.
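For reference, the Gaussian special case mentioned above corresponds to the Gibbons, Ross and Shanken (1989) statistic; its standard form (the notation is the textbook one, not the paper's) is:

```latex
% Gibbons-Ross-Shanken (1989) mean-variance efficiency statistic, Gaussian case.
% N test assets, T observations, excess returns regressed on the market:
%   r_{it} = \alpha_i + \beta_i r_{mt} + \varepsilon_{it}
\[
  W \;=\; \frac{T - N - 1}{N}\,
          \frac{\hat{\alpha}' \hat{\Sigma}^{-1} \hat{\alpha}}
               {1 + \hat{\mu}_m^2 / \hat{\sigma}_m^2}
  \;\sim\; F_{N,\,T-N-1}
\]
% under H_0: \alpha_1 = \dots = \alpha_N = 0 (the market portfolio is
% mean-variance efficient) and i.i.d. Gaussian errors.
```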
Abstract:
This thesis summarizes the results of studies on a syntax-based approach for translation between Malayalam, one of the Dravidian languages, and English, and also on the development of the major modules in building a prototype machine translation system from Malayalam to English. The development of the system is a pioneering effort for the Malayalam language, unattempted by previous researchers. The computational models chosen for the system are the first of their kind for the Malayalam language. An in-depth study has been carried out on the design of the computational models and data structures needed for the different modules: a morphological analyzer, a parser, a syntactic structure transfer module and a target language sentence generator required for the prototype system. The generation of the list of part-of-speech tags, chunk tags and the hierarchical dependencies among the chunks required for the translation process has also been done. In the development process, the major goals are: (a) accuracy of translation, (b) speed and (c) space. Accuracy-wise, smart tools for handling the transfer grammar and translation standards, including equivalent words, expressions, phrases and styles in the target language, are to be developed. The grammar should be optimized with a view to obtaining a single correct parse and hence a single translated output. Speed-wise, innovative use of corpus analysis, an efficient parsing algorithm, the design of efficient data structures and run-time frequency-based rearrangement of the grammar, which substantially reduces the parsing and generation time, are required. The space requirement also has to be minimised.
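The following toy sketch only illustrates the shape of such a transfer pipeline (analysed source chunks, SOV-to-SVO structure transfer, then target generation); the lexicon, tags and rules are invented placeholders and do not reflect the thesis's actual modules or resources.

```python
# Toy illustration of a transfer-based pipeline
# (morphological analysis -> parsing -> structure transfer -> generation).
# All entries below are invented placeholders.

# Hypothetical analysed Malayalam input: list of (lemma, POS-tag) chunks.
analysed = [("avan", "PRP"), ("pustakam", "NN"), ("vaayikkunnu", "VM")]

# Tiny bilingual lexicon (placeholder entries).
lexicon = {"avan": "he", "pustakam": "book", "vaayikkunnu": "reads"}

def transfer(chunks):
    """Reorder SOV source chunks into the SVO order expected by English."""
    subj, obj, verb = chunks            # assumes a simple S-O-V sentence
    return [subj, verb, obj]

def generate(chunks):
    """Look up target words and add a determiner for bare object nouns."""
    words = []
    for lemma, tag in chunks:
        word = lexicon[lemma]
        if tag == "NN":
            word = "a " + word
        words.append(word)
    return " ".join(words).capitalize() + "."

print(generate(transfer(analysed)))    # -> "He reads a book."
```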
Abstract:
Cache lookup is an integral part of cooperative caching in ad hoc networks. In this paper, we discuss a cooperative caching architecture with a distributed cache lookup protocol which relies on a virtual backbone for locating and accessing data within a cooperative cache. Our proposal consists of two phases: (i) formation of a virtual backbone and (ii) the cache lookup phase. The nodes in a Connected Dominating Set (CDS) form the virtual backbone. The cache lookup protocol makes use of the nodes in the virtual backbone for effective data dissemination and discovery. The idea in this scheme is to reduce the number of nodes involved in the cache lookup process by constructing a CDS that contains a small number of nodes while still having full coverage of the network. We evaluated the effect of various parameter settings on performance metrics such as message overhead, cache hit ratio and average query delay. Compared to previous schemes, the proposed scheme not only reduces message overhead, but also improves the cache hit ratio and reduces the average delay.
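As a flavour of how a small virtual backbone can be built, the sketch below uses a common greedy heuristic for constructing a connected dominating set over a toy adjacency list; it is a generic illustration, not necessarily the CDS construction or lookup protocol used in the paper.

```python
# Sketch of one common greedy heuristic for building a small connected
# dominating set (CDS) to serve as the virtual backbone.

graph = {                      # toy ad hoc network as an adjacency list
    0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4},
    3: {1, 5}, 4: {2, 5}, 5: {3, 4, 6}, 6: {5},
}

def greedy_cds(adj):
    # Start from the highest-degree node, then repeatedly add the neighbour
    # of the current CDS that dominates the most still-uncovered nodes.
    start = max(adj, key=lambda n: len(adj[n]))
    cds = {start}
    covered = {start} | adj[start]
    while covered != set(adj):
        frontier = {n for c in cds for n in adj[c]} - cds
        best = max(frontier, key=lambda n: len((adj[n] | {n}) - covered))
        cds.add(best)
        covered |= adj[best] | {best}
    return cds

backbone = greedy_cds(graph)
print("virtual backbone:", sorted(backbone))
# Cache lookup requests would then be forwarded only along backbone nodes.
```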