Abstract:
Xerox Customer Engagement activity is informed by the "Go To Market" strategy and the "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process-defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to the creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge-based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources; Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements describing how Xerox organises and manages three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it offers a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.
Abstract:
Purpose: To determine whether curve-fitting analysis of the ranked segment distributions of topographic optic nerve head (ONH) parameters, derived using the Heidelberg Retina Tomograph (HRT), provides a more effective statistical descriptor to differentiate the normal from the glaucomatous ONH. Methods: The sample comprised 22 normal control subjects (mean age 66.9 years; S.D. 7.8) and 22 glaucoma patients (mean age 72.1 years; S.D. 6.9) confirmed by reproducible visual field defects on the Humphrey Field Analyser. Three 10°-images of the ONH were obtained using the HRT. The mean topography image was determined, and the HRT software was used to calculate the rim volume, rim area to disc area ratio, normalised rim area to disc area ratio and retinal nerve fibre cross-sectional area for each patient at 10°-sectoral intervals. The values were ranked in descending order, and each ranked-segment curve of ordered values was fitted using the least squares method. Results: There was no difference in disc area between the groups. The group mean cup-disc area ratio was significantly lower in the normal group (0.204 ± 0.16) than in the glaucoma group (0.533 ± 0.083) (p < 0.001). The visual field indices, mean deviation and corrected pattern S.D., were significantly greater (p < 0.001) in the glaucoma group (-9.09 dB ± 3.3 and 7.91 dB ± 3.4, respectively) than in the normal group (-0.15 dB ± 0.9 and 0.95 dB ± 0.8, respectively). Univariate linear regression provided the best overall fit to the ranked segment data. The equation parameters of the regression line, manually applied to the normalised rim area to disc area ratio and rim area to disc area ratio data, correctly classified 100% of normal subjects and glaucoma patients. In this study sample, regression analysis of ranked segment parameters was more effective than conventional ranked segment analysis, which misclassified glaucoma patients in approximately 50% of cases. Further investigation in larger samples will enable the calculation of confidence intervals for normality; these reference standards will then need to be tested on an independent sample to fully validate the technique. Conclusions: Using a curve-fitting approach to fit ranked segment curves retains information relating to the topographic nature of neural loss. Such methodology appears to overcome some of the deficiencies of conventional ranked segment analysis and, subject to validation in larger scale studies, may be of clinical utility for detecting and monitoring glaucomatous damage. © 2007 The College of Optometrists.
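As an illustration of the ranked-segment curve-fitting step described above, the following is a minimal sketch in Python; the input data, segment count and first-degree least-squares fit are assumptions for illustration, not the study's code.

    # Sketch: rank sectoral ONH parameter values in descending order and
    # fit a straight line to the ranked curve by least squares.
    # Input data and segment count (36 sectors of 10 degrees) are hypothetical.
    import numpy as np

    def ranked_segment_fit(sector_values):
        ranked = np.sort(np.asarray(sector_values, dtype=float))[::-1]
        ranks = np.arange(1, ranked.size + 1)
        slope, intercept = np.polyfit(ranks, ranked, deg=1)
        return slope, intercept

The fitted slope and intercept would then serve as the statistical descriptors separating normal from glaucomatous discs.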
Abstract:
Like the genetic algorithm, the evolution strategy is a process of continuous reproduction, trial and selection, in which each new generation is an improvement on the one that went before. This paper presents two different proposals based on the vector space model (VSM), a traditional model in information retrieval (TIR). The first uses an evolution strategy (ES); the second uses the document centroid (DC) in a query expansion technique. Comparison of the results shows the ES technique to be more efficient than the other methods.
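As an illustration of the second proposal, the sketch below shows document-centroid query expansion in a vector space model; the ES variant would instead repeatedly mutate the query vector and keep mutants that improve retrieval. The vector representation and the beta weight are illustrative assumptions.

    # Sketch: document-centroid (DC) query expansion in a VSM.
    # Term vectors are plain {term: weight} dicts; all data hypothetical.
    def centroid(doc_vectors):
        c = {}
        for vec in doc_vectors:
            for term, w in vec.items():
                c[term] = c.get(term, 0.0) + w / len(doc_vectors)
        return c

    def expand_query(query_vec, top_doc_vectors, beta=0.5):
        # Move the query toward the centroid of the top-ranked documents.
        c = centroid(top_doc_vectors)
        return {t: query_vec.get(t, 0.0) + beta * c.get(t, 0.0)
                for t in set(query_vec) | set(c)}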
Abstract:
In this paper a methodology is presented for evaluating the information security of objects under attack that have been processed by compression methods. Two basic parameters for evaluating the information security of objects – TIME and SIZE – are chosen, and the characteristics that bear on their evaluation are analyzed and estimated. A coefficient of information security for an object is proposed, defined as the mean of the coefficients for the parameters TIME and SIZE. The simulation experiments that were carried out identified the methods with the highest coefficient of information security. Assessments and directions for future investigations are proposed.
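Read literally, the proposed combined measure is the arithmetic mean of the two per-parameter coefficients; a minimal sketch follows, where the normalisation of each coefficient and the example values are assumptions, since the abstract does not define them.

    # Sketch: combined information-security coefficient as the mean of the
    # TIME and SIZE coefficients; normalisation of each is assumed, and the
    # method names and values below are hypothetical.
    def security_coefficient(k_time, k_size):
        return (k_time + k_size) / 2.0

    methods = {"method_A": (0.8, 0.6), "method_B": (0.7, 0.9)}
    best = max(methods, key=lambda m: security_coefficient(*methods[m]))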
Abstract:
We propose a new approach to the generation of an alphabet for secret key exchange relying on small variations in the cavity length of an ultra-long fiber laser. This new concept is supported by experimental results showing how the radio-frequency spectrum of the laser can be exploited as a carrier to exchange information. The test bench for our proof of principle is a 50 km-long fiber laser linking two users, Alice and Bob, where each user can randomly add an extra 1 km-long segment of fiber. The choice of laser length is driven by two independent random binary values, which makes the length itself a random variable. The security of key exchange is ensured whenever the two independent random choices lead to the same laser length and, hence, to the same free spectral range.
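A toy simulation of the length-selection logic, under one plausible reading of the scheme in which rounds where exactly one user inserts a segment yield the same 51 km total length and are therefore indistinguishable to an observer; the optics is not modelled and all parameters are illustrative.

    # Sketch: each user independently and randomly chooses whether to add
    # a 1 km fiber segment to the 50 km cavity. All values illustrative.
    import random

    def exchange_round(base_km=50, extra_km=1):
        alice = random.randint(0, 1)  # 1 = Alice inserts her segment
        bob = random.randint(0, 1)    # 1 = Bob inserts his segment
        length = base_km + (alice + bob) * extra_km
        # (1,0) and (0,1) give the same length, hence the same free
        # spectral range: an eavesdropper cannot tell who added the
        # segment, but each user can infer the other's choice from their own.
        usable = (alice != bob)
        key_bit = bob if usable else None  # Alice infers it as 1 - alice
        return usable, key_bit, length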
Abstract:
All information systems have to be protected. As the number of information objects and the number of users increases, the task of protecting an information system becomes more difficult. One of the most difficult problems is the assignment of access rights. This paper describes a graph model of access rights inheritance. The model takes into account the relations and dependencies between different objects and between different users. It can be implemented in information systems controlled by metadata that describes the information objects and the connections between them, such as systems based on the METAS CASE technology.
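A minimal sketch of rights inheritance over an object-dependency graph follows; the graph representation and the union-based inheritance rule are illustrative assumptions, not the METAS model itself.

    # Sketch: propagate a user's access rights along object-dependency
    # edges, so an object inherits the rights of the objects it depends on.
    from collections import deque

    def inherited_rights(direct_rights, children, root):
        # direct_rights: {object: set of explicitly granted rights}
        # children:      {object: [objects that inherit from it]}
        effective = {root: set(direct_rights.get(root, set()))}
        queue = deque([root])
        while queue:
            obj = queue.popleft()
            for child in children.get(obj, []):
                rights = (effective.get(child, set())
                          | effective[obj]
                          | direct_rights.get(child, set()))
                if rights != effective.get(child):
                    effective[child] = rights
                    queue.append(child)
        return effective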
Abstract:
DNA-binding proteins are crucial for various cellular processes and hence have become an important target for both basic research and drug development. With the avalanche of protein sequences generated in the postgenomic age, it is highly desirable to establish an automated method for rapidly and accurately identifying DNA-binding proteins based on their sequence information alone. Because all biological species have developed from a very limited number of ancestral species, it is important to take evolutionary information into account when developing such a high-throughput tool. In view of this, a new predictor was proposed by incorporating evolutionary information into the general form of pseudo amino acid composition via the top-n-gram approach. Comparing the new predictor with existing methods via both the jackknife test and an independent data-set test showed that it outperformed its counterparts. It is anticipated that the new predictor may become a useful vehicle for identifying DNA-binding proteins. It has not escaped our notice that this novel approach of extracting evolutionary information into the formulation of statistical samples can be used to identify many other protein attributes as well.
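A rough sketch of the top-n-gram feature idea as described: for each sequence position, the n residues with the highest evolutionary profile scores form a "word", and word frequencies become the feature vector. The profile format and normalisation here are assumptions, not the paper's exact formulation.

    # Sketch: top-n-gram features from a position-specific profile.
    # profile: list of {amino_acid: score} dicts, one per sequence position.
    from collections import Counter

    def top_n_gram_features(profile, n=1):
        words = []
        for column in profile:
            top = sorted(column, key=column.get, reverse=True)[:n]
            words.append("".join(sorted(top)))
        counts = Counter(words)
        total = sum(counts.values())
        return {w: c / total for w, c in counts.items()}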
Abstract:
The purpose of this article is to evaluate the effectiveness of learning by doing as a practical tool for managing the training of students in "Library Management" at ULSIT, Sofia, Bulgaria, using the project 'Database "Bulgarian Revival Towns"' (CD), financed by the Bulgarian Ministry of Education, Youth and Science (1/D002/144/13.10.2011) and headed by Prof. DSc Ivanka Yankova, which aims to create a new information resource on these towns to serve the needs of scientific research. By participating in building the database's content through searching, selecting and digitizing documents from this period, students also get an opportunity to expand their skills in working effectively in a team, finding interdisciplinary and causal connections between the studied items, objects and subjects and, foremost, to gain practical experience in digitization, information behavior, strategies for information searching, etc. This method achieves good results in the accumulation of sustainable knowledge and generates motivation to work in the library and information professions.
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2016.
Abstract:
A segment selection method controlled by Quality of Experience (QoE) factors for Dynamic Adaptive Streaming over HTTP (DASH) is presented in this paper. Current rate adaptation algorithms aim to eliminate buffer underrun events by significantly reducing the code rate when experiencing pauses in replay. In reality, however, viewers may choose to accept a level of buffer underrun in order to achieve an improved level of picture fidelity, or to accept a degradation in picture fidelity in order to maintain service continuity. The proposed rate adaptation scheme in our work can maximize the user QoE in terms of both continuity and fidelity (picture quality) in DASH applications. It is shown that using this scheme a high level of quality for streaming services, especially at low packet loss rates, can be achieved. Our scheme can also maintain the best trade-off between continuity-based quality and fidelity-based quality by determining proper threshold values for the level of quality intended by clients with different quality requirements. In addition, the integration of the rate adaptation mechanism with the scheduling process is investigated in the context of a mobile communication network, and the related performance is analyzed.
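A simplified sketch of threshold-driven segment selection along these lines follows; the buffer thresholds and bitrate ladder are hypothetical, as the abstract does not give the paper's actual decision rule.

    # Sketch: choose the bitrate of the next DASH segment by trading
    # continuity (buffer level) against fidelity (bitrate).
    BITRATES_KBPS = [500, 1200, 2500, 5000]  # hypothetical ladder

    def select_bitrate(buffer_s, throughput_kbps, low_s=5.0, high_s=15.0):
        if buffer_s < low_s:               # continuity at risk: lowest rate
            return BITRATES_KBPS[0]
        affordable = [b for b in BITRATES_KBPS if b <= throughput_kbps]
        best = affordable[-1] if affordable else BITRATES_KBPS[0]
        if buffer_s > high_s:              # ample buffer: favour fidelity,
            i = BITRATES_KBPS.index(best)  # accepting some underrun risk
            return BITRATES_KBPS[min(i + 1, len(BITRATES_KBPS) - 1)]
        return best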
Abstract:
The analysis summarizes the results of a questionnaire survey carried out by the Department of Logistics and Supply Chain Management of Corvinus University of Budapest (BCE). The fundamental aim of the research was to assess and present the current level of IT support for the logistics, and in particular the distribution logistics, processes of Hungarian companies, together with the development directions expected in this field over the next two to three years. The research systematically covered every subsystem of the logistics information system, and examined the prevalence of different identification solutions, established practice in the use of ERP systems and their individual modules, the IT support of strategic logistics decisions, and the communication techniques in use. Overall, the level of development of logistics information systems in Hungary today is medium; it is important to note, however, that the SME sector lags significantly behind in this field as well. This also means that considerable performance improvement can still be achieved by extending the application of IT tools.
Abstract:
An iterative travel time forecasting scheme, named the Advanced Multilane Prediction based Real-time Fastest Path (AMPRFP) algorithm, is presented in this dissertation. This scheme is derived from the conventional kernel estimator based prediction model by associating the real-time nonlinear impacts caused by neighboring arcs’ traffic patterns with historical traffic behaviors. The AMPRFP algorithm is evaluated by predicting the travel time of congested arcs in the urban area of Jacksonville. Experimental results illustrate that the proposed scheme significantly reduces both the relative mean error (RME) and the root-mean-squared error (RMSE) of the predicted travel time. To obtain high-quality real-time traffic information, which is essential to the performance of the AMPRFP algorithm, a data clean scheme enhanced empirical learning (DCSEEL) algorithm is also introduced. This novel method investigates the correlation between distance and direction in the geometrical map, which is not considered in existing fingerprint localization methods. Specifically, empirical learning methods are applied to minimize the error in the estimated distance, and a direction filter is developed to remove joints that have a negative influence on localization accuracy. Synthetic experiments in urban, suburban and rural environments are designed to evaluate the performance of the DCSEEL algorithm in determining the cellular probe’s position. The results show that the cellular probe’s localization accuracy can be notably improved by the DCSEEL algorithm. Additionally, a new fast correlation technique is developed to overcome the time-efficiency problem of the existing correlation-based floating car data (FCD) technique. The matching process is transformed into a one-dimensional (1-D) curve matching problem, and the Fast Normalized Cross-Correlation (FNCC) algorithm is introduced to supersede the Pearson product-moment correlation coefficient (PMCC) algorithm in order to meet the real-time requirement of the FCD method. The fast correlation technique shows a significant improvement in reducing computational cost without affecting the accuracy of the matching process.
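For the curve-matching step, a minimal sketch of normalized cross-correlation on 1-D curves follows; this is the direct form, whereas a true FNCC additionally precomputes running sums so window means and norms are not recomputed at every offset. Data and names are illustrative.

    # Sketch: slide a 1-D template along a longer curve and score each
    # offset by normalized cross-correlation; returns the best offset.
    import numpy as np

    def ncc_1d(curve, template):
        t = np.asarray(template, dtype=float)
        t = t - t.mean()
        t_norm = np.sqrt((t ** 2).sum())
        scores = []
        for i in range(len(curve) - len(t) + 1):
            w = np.asarray(curve[i:i + len(t)], dtype=float)
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            scores.append(float((w * t).sum() / denom) if denom else 0.0)
        return int(np.argmax(scores)), max(scores)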
Abstract:
Background: Information seeking is an important coping mechanism for dealing with chronic illness. Despite a growing number of mental health websites, there is little understanding of how patients with bipolar disorder use the Internet to seek information. Methods: A 39 question, paper-based, anonymous survey, translated into 12 languages, was completed by 1222 patients in 17 countries as a convenience sample between March 2014 and January 2016. All patients had a diagnosis of bipolar disorder from a psychiatrist. Data were analyzed using descriptive statistics and generalized estimating equations to account for correlated data. Results: 976 (81 % of 1212 valid responses) of the patients used the Internet, and of these 750 (77 %) looked for information on bipolar disorder. When looking online for information, 89 % used a computer rather than a smartphone, and 79 % started with a general search engine. The primary reasons for searching were drug side effects (51 %), to learn anonymously (43 %), and for help coping (39 %). About 1/3 rated their search skills as expert, and 2/3 as basic or intermediate. 59 % preferred a website on mental illness and 33 % preferred Wikipedia. Only 20 % read or participated in online support groups. Most patients (62 %) searched a couple of times a year. Online information seeking helped about 2/3 to cope (41 % of the entire sample). About 2/3 did not discuss Internet findings with their doctor. Conclusion: Online information seeking helps many patients to cope, although alternative information sources remain important. Most patients do not discuss Internet findings with their doctor, and concern remains about the quality of online information, especially related to prescription drugs. Patients may not rate their search skills accurately and may not understand the limitations of online privacy. More patient education about online information searching is needed, and physicians should recommend a few high-quality websites.