935 results for Enterprise application integration (Computer systems)


Relevance:

100.00%

Publisher:

Abstract:

Sales configurators are essential tools for companies that offer complex, case-specifically crafted products to their customers. The most sophisticated of them can design an entire end product on the fly according to given constraints, calculate a price for the offer, and move the order into production. This thesis covers a sales configurator acquisition project in a large industrial company that offers cranes to its customers. The study spans the preliminary stages of a large-scale software purchase project, starting from the specification of the problem domain and ending with the presentation of the most viable software solutions that fulfil the requirements for the new system. The project consists of mapping the usage environment and use cases and collecting the requirements expected of the new system. The collected requirements involve fitting the new sales system into the enterprise application infrastructure, mitigating the risks involved in the project, and specifying new features for the application while preserving all of the valued features of the old sales system currently in use in the company. The collected requirements were presented to a number of sales software vendors, who were asked to provide solution proposals that would fulfil all the demands. All of the received proposals were evaluated to determine the most feasible solutions, and the construction of the evaluation criteria was itself part of the study. The final outcome of this study is a shortlist of the most feasible sales configurator solutions, together with a description of how the software purchase process in large enterprises works and which aspects deserve attention in large projects of a similar kind.

Relevance:

100.00%

Publisher:

Abstract:

Maintaining competitiveness in constantly evolving markets requires the use of up-to-date information systems. One of the most important of these is the organization's data management system, which is used to manage the information consumed and produced in the company. Replacing the data management system in use with a more modern one is a complex process that begins with selecting the new system and continues with its deployment. Deployment includes integrating the necessary legacy applications into the new system and training users in the working practices the new system requires. This Master's thesis examines the deployment of the new ALMA data management system at CTS Engtec Oy, an engineering and consulting company serving the process industry. As part of the work, two applications in use at CTS, CTS Pine and PMMATE, were integrated into the new data management system. In addition, the thesis reviews the concepts related to data management systems.

Relevance:

100.00%

Publisher:

Abstract:

The use of water-sensitive papers is an important tool for assessing the quality of pesticide application on crops, but manual analysis is laborious and time-consuming. This study therefore aimed to evaluate and compare the results obtained from four software programs for spray droplet analysis on different scanned images of water-sensitive papers. After spraying, papers with four droplet deposition patterns (varying droplet spectra and densities) were analyzed manually and by means of the following computer programs: CIR, e-Sprinkle, DepositScan and Conta-Gotas. The volume median diameter, the number median diameter and the number of droplets per target area were studied. There is a strong correlation between the values measured by the different programs and by manual analysis, but the numerical values measured for the same paper differ greatly. It is therefore not advisable to compare results obtained from different programs.
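
For reference, the two medians compared in the abstract can be computed from a list of measured droplet diameters as in the sketch below. This is only a minimal illustration of the definitions, assuming diameters in micrometres; it is not the algorithm used by CIR, e-Sprinkle, DepositScan or Conta-Gotas, and the sample values and target area are hypothetical.

```python
import numpy as np

def median_diameters(diameters_um):
    """Compute the number median diameter (NMD) and the volume median diameter (VMD)
    from a 1-D array of droplet diameters in micrometres."""
    d = np.sort(np.asarray(diameters_um, dtype=float))
    # Number median: half of the droplets are smaller than this diameter.
    nmd = np.median(d)
    # Volume median: the diameter below which 50% of the total spray volume lies.
    vol = d ** 3                              # droplet volume is proportional to d^3
    cum_vol = np.cumsum(vol) / vol.sum()
    vmd = d[np.searchsorted(cum_vol, 0.5)]
    return nmd, vmd

# Hypothetical sample of droplet stains measured on one water-sensitive paper.
sample = [95, 110, 130, 150, 170, 210, 260, 340, 420, 500]
nmd, vmd = median_diameters(sample)
density = len(sample) / 2.0                   # droplets per cm^2, assuming a 2 cm^2 target area
print(f"NMD = {nmd:.0f} um, VMD = {vmd:.0f} um, {density:.1f} droplets/cm^2")
```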

Relevance:

100.00%

Publisher:

Abstract:

Large-scale genome projects have generated a rapidly increasing number of DNA sequences, so the development of computational methods to analyze these sequences rapidly is essential for progress in genomic research. Here we present an automatic annotation system for the preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying some of the Web-accessible resources, and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere over the Internet, or it may be run locally if a large number of sequences are to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Installation and use of annotation systems usually require experience and are time-consuming, but GATO is simple and practical, allowing anyone with basic computing skills to use it without special training. GATO can be downloaded at http://mariwork.iq.usp.br/gato/. The minimum free disk space required is 2 MB.
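
The query-then-cache behaviour described above (annotate via a web resource, keep every result in a local database) can be sketched in a few lines. The sketch below is a Python illustration of that idea only: GATO itself is written in PHP and Perl, and the SERVICE_URL endpoint here is a hypothetical placeholder, not one of the resources GATO actually queries.

```python
import sqlite3
import urllib.request

DB = "annotations.db"
# Hypothetical annotation endpoint, used only to illustrate the pattern.
SERVICE_URL = "https://example.org/annotate?gene={gene}"

def get_annotation(gene: str) -> str:
    """Return the annotation for a gene, querying the remote service only
    when no record for that gene exists in the local cache."""
    con = sqlite3.connect(DB)
    con.execute("CREATE TABLE IF NOT EXISTS annotation (gene TEXT PRIMARY KEY, result TEXT)")
    row = con.execute("SELECT result FROM annotation WHERE gene = ?", (gene,)).fetchone()
    if row:                                   # a previous result kept in the local database
        con.close()
        return row[0]
    with urllib.request.urlopen(SERVICE_URL.format(gene=gene)) as resp:
        result = resp.read().decode()
    con.execute("INSERT INTO annotation (gene, result) VALUES (?, ?)", (gene, result))
    con.commit()
    con.close()
    return result

if __name__ == "__main__":
    print(get_annotation("BRCA1"))
```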

Relevance:

100.00%

Publisher:

Abstract:

Please consult the paper edition of this thesis to read. It is available on the 5th Floor of the Library at Call Number: Z 9999 E38 D56 1992

Relevance:

100.00%

Publisher:

Abstract:

Emotions play a fundamental role in cognitive processes, and particularly in learning tasks. Several neurological studies have shown the interrelation between cognition and emotions. They have also identified several forms of human intelligence other than rational intelligence, among which we single out the one with an emotional dimension, namely emotional intelligence, because of its impact on learning processes. Emotional intelligence is thus a significant factor in academic and professional success. In light of these findings, computer scientists have in turn tried to give an increasingly important place to the emotional factor in computer systems, particularly in those dedicated to learning. Integrating emotional intelligence into these systems, and more precisely into Intelligent Tutoring Systems (ITS), allows them to manage the learner's emotions and thereby improve the learner's performance. In this thesis, our main objective is to develop a learning strategy aimed at promoting and enhancing memorization in children. To reach this objective, we developed an online English course and a virtual tutor that use multimedia resources such as tone of voice, music, images and gestures to elicit emotions in the learner. We conducted an experiment to test the effectiveness of several emotional strategies and to evaluate the impact of the elicited emotions on the learner's capacity to memorize the knowledge to be acquired. The results of this experimental study showed that implicitly inducing emotions in the learner has a significant influence on the learner's performance. They also showed that no single strategy is effective for all learners; the effectiveness of one strategy relative to another depends essentially on the learner's behavioural profile, determined from his or her temperament.

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents an architectural pattern that allows, in an object-oriented context, the use of objects belonging simultaneously to several functional hierarchies. The pattern uses a reasoner based on description logics (semantic web) to classify the objects within the hierarchies. Object creation is simplified by the use of an ORM (Object-Relational Mapper). The pattern makes the effective use of automated reasoning practical in an enterprise-application context. The concepts required to understand the pattern and the tools are presented. The conditions under which the pattern can be applied are discussed, along with some research directions for broadening them. A prototype applying the pattern to a simple case is presented. A methodology accompanies the pattern. Finally, other potential uses of description logics in the same context are discussed.
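
A toy sketch of the pattern's core idea follows: an object is created once (through the ORM in the real pattern) and a reasoner then classifies it into every functional hierarchy whose definition it satisfies. The RuleReasoner below is a deliberately simplified stand-in for a description-logic reasoner, and all class names and conditions are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """Domain object; in the real pattern it would be created and persisted by the ORM."""
    name: str
    price: float
    weight_kg: float
    hierarchies: set = field(default_factory=set)   # filled in by the reasoner

class RuleReasoner:
    """Classifies an object into every hierarchy whose condition it satisfies,
    mimicking what a description-logic reasoner does with defined classes."""
    def __init__(self):
        self.definitions = {
            "PremiumProduct":  lambda o: o.price > 1000,
            "HeavyFreight":    lambda o: o.weight_kg > 50,
            "StandardProduct": lambda o: o.price <= 1000,
        }
    def classify(self, obj: Item) -> None:
        obj.hierarchies = {name for name, cond in self.definitions.items() if cond(obj)}

crane_part = Item("hoist motor", price=4200.0, weight_kg=80.0)
RuleReasoner().classify(crane_part)
print(crane_part.hierarchies)   # {'PremiumProduct', 'HeavyFreight'}: member of two hierarchies at once
```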

Relevance:

100.00%

Publisher:

Abstract:

The main objectives of the present investigation were to evaluate the qualitative and quantitative distribution of the natural cyanobacterial population and its ecobiological properties along the Cochin estuary, and to assess its application in aquaculture systems as a nutritional supplement, given its nutrient-rich biochemical composition and antioxidant potential. This thesis presents a detailed account of the distribution of cyanobacteria in the Cochin estuary; an assessment of the physico-chemical parameters and nutrients of the study site; an evaluation of the effect of physico-chemical parameters on cyanobacterial distribution and abundance; the isolation, identification and culturing of cyanobacteria; the biochemical composition and productivity of cyanobacteria; and an evaluation of the potential of the selected cyanobacteria as antioxidants against ethanol-induced lipid peroxidation. The pH, salinity and nutritional requirements were optimized for low-cost production of the selected cyanobacterial strains. The present study provides an insight into the distribution, abundance, diversity and ecology of the cyanobacteria of the Cochin estuary. From the results, it is evident that the ecological conditions of the Cochin estuary support rich cyanobacterial growth.

Relevance:

100.00%

Publisher:

Abstract:

Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems, or anomalies, arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent the anomalous situations that arise from buffer overflows. The current state of the art in anomaly detection systems is relatively primitive and depends mainly on static code checking to take care of buffer overflow attacks. For protection, stack guards and heap guards are also widely used. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that the Bayesian network over frequency variations responds effectively to induced buffer overflows. It can also help administrators to detect deviations in program flow introduced by errors.
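
The frequency-profile idea can be illustrated with a short sketch: build a normal profile from system-call traces, then score a new trace by how much its call frequencies deviate from that profile. The squared-deviation score and the toy traces below are illustrative choices, not the sequence-set representation or the Bayesian model used in the dissertation.

```python
from collections import Counter

def frequencies(trace):
    """Relative frequency of each system call in one trace."""
    counts = Counter(trace)
    total = len(trace)
    return {call: n / total for call, n in counts.items()}

def build_profile(normal_traces):
    """Average per-call frequency over all normal traces (the normal profile)."""
    profile = Counter()
    for trace in normal_traces:
        for call, f in frequencies(trace).items():
            profile[call] += f / len(normal_traces)
    return dict(profile)

def anomaly_score(trace, profile):
    """Sum of squared deviations between observed and expected call frequencies."""
    freq = frequencies(trace)
    calls = set(freq) | set(profile)
    return sum((freq.get(c, 0.0) - profile.get(c, 0.0)) ** 2 for c in calls)

normal = [["open", "read", "read", "write", "close"],
          ["open", "read", "write", "close"]]
profile = build_profile(normal)
attack = ["open", "read", "exec", "exec", "exec", "mprotect"]   # overflow-like trace
print(f"normal score: {anomaly_score(normal[0], profile):.3f}")
print(f"attack score: {anomaly_score(attack, profile):.3f}")
```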

Relevance:

100.00%

Publisher:

Abstract:

After skin cancer, breast cancer accounts for the second greatest number of cancer diagnoses in women. Currently the etiologies of breast cancer are unknown, and there is no generally accepted therapy for preventing it. Therefore, the best way to improve the prognosis for breast cancer is early detection and treatment. Computer-aided detection (CAD) systems for detecting masses or microcalcifications in mammograms have already been used and proven to be a potentially powerful tool, so radiologists are attracted by the effectiveness of the clinical application of CAD systems. Fractal geometry is well suited for describing the complex physiological structures that defy traditional Euclidean geometry, which is based on smooth shapes. The major contributions of this research include the development of:
• a new fractal feature to accurately classify mammograms into normal and abnormal, the latter being either (i) with masses (benign or malignant) or (ii) with microcalcifications (benign or malignant);
• a novel fast fractal modeling method to identify the presence of microcalcifications by fractal modeling of mammograms and then subtracting the modeled image from the original mammogram.
The performances of these methods were evaluated using standard statistical analysis methods. The results obtained indicate that the developed methods are highly beneficial for assisting radiologists in making diagnostic decisions. The mammograms for the study were obtained from two online databases, namely MIAS (Mammographic Image Analysis Society) and DDSM (Digital Database for Screening Mammography).
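
As an illustration of the kind of fractal feature discussed above, the sketch below estimates the box-counting dimension of a binary region of interest. It is a generic estimator applied to random toy data, not the specific fractal feature or the fast fractal modeling method proposed in the thesis.

```python
import numpy as np

def box_counting_dimension(binary_img):
    """Estimate the fractal (box-counting) dimension of a 2-D binary image."""
    img = np.asarray(binary_img, dtype=bool)
    size = min(img.shape)
    scales = [2 ** k for k in range(1, int(np.log2(size)))]
    counts = []
    for s in scales:
        # Count boxes of side s that contain at least one foreground pixel.
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    # Slope of log(count) versus log(1/scale) gives the dimension estimate.
    slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
toy_roi = rng.random((256, 256)) > 0.7            # stand-in for a thresholded mammogram ROI
print(f"estimated fractal dimension: {box_counting_dimension(toy_roi):.2f}")
```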

Relevance:

100.00%

Publisher:

Abstract:

The application of computer-vision-based quality control has been slowly but steadily gaining importance, mainly because of its speed in producing results and its non-destructive nature of testing. Besides, in food applications it does not contribute to contamination. However, computer vision applications in quality control need appropriate software for image analysis. Even though computer-vision-based quality control has several advantages, its application is limited by the type of work to be done, particularly in the food industries. Selective applications, however, can be highly advantageous and very accurate. Computer-vision-based image analysis could be used for morphometric measurements of fish with the same accuracy as the existing conventional method. The method is non-destructive and non-contaminating, thus providing an advantage in seafood processing. The images can be stored in archives and retrieved at any time for morphometric studies by biologists. Computer vision and subsequent image analysis could also be used in measurements of various food products to assess uniformity of size. One product, namely cutlets, and product ingredients, namely coating materials such as bread crumbs and rava, were selected for the study. Computer-vision-based image analysis was used to measure the length, width and area of cutlets, as well as the width of coating materials such as bread crumbs. Computer imaging and subsequent image analysis can be used very effectively in quality evaluations of product ingredients in food processing: measurement of the width of coating materials could establish the uniformity of particles, or the lack of it. Image analysis was also applied to bacteriological work.
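
A minimal sketch of such a measurement pipeline is shown below using OpenCV: segment the objects in a scanned image and report the length, width and area of each. The file name, the pixels-per-millimetre calibration and the thresholding choice are assumptions for illustration; the thesis does not specify the software pipeline used.

```python
import cv2

PX_PER_MM = 10.0                          # hypothetical calibration from a reference object

img = cv2.imread("cutlets.png")           # hypothetical scanned image of cutlets on a light background
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
# Otsu thresholding, inverted because the objects are assumed darker than the background.
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for i, c in enumerate(contours):
    if cv2.contourArea(c) < 100:          # ignore noise specks
        continue
    (_, _), (w, h), _ = cv2.minAreaRect(c)    # rotated bounding box, dimensions in pixels
    length_mm = max(w, h) / PX_PER_MM
    width_mm = min(w, h) / PX_PER_MM
    area_mm2 = cv2.contourArea(c) / PX_PER_MM ** 2
    print(f"object {i}: length {length_mm:.1f} mm, width {width_mm:.1f} mm, area {area_mm2:.1f} mm^2")
```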

Relevance:

100.00%

Publisher:

Abstract:

In this work, we have mainly achieved the following: 1. we review the main methods used for the computation of the connection and linearization coefficients between orthogonal polynomials of a continuous variable and, using a new approach, solve the duplication problem for these polynomial families; 2. we review the main methods used for the computation of the connection and linearization coefficients of orthogonal polynomials of a discrete variable, and solve the duplication and linearization problems for all orthogonal polynomials of a discrete variable; 3. we propose a method to generate the connection, linearization and duplication coefficients for q-orthogonal polynomials; 4. we propose a unified method to obtain these coefficients in a generic way for orthogonal polynomials on quadratic and q-quadratic lattices. Our algorithmic approach to computing linearization, connection and duplication coefficients is based on the one used by Koepf and Schmersau and on the NaViMa algorithm. Our main technique is to use explicit formulas for structural identities of classical orthogonal polynomial systems. We obtain our results by an application of computer algebra. The major algorithmic tools for our development are Zeilberger's algorithm, the q-Zeilberger algorithm, the Petkovšek-van-Hoeij algorithm, the q-Petkovšek-van-Hoeij algorithm, and Algorithm 2.2, p. 20, of Koepf's book "Hypergeometric Summation" and its q-analogue.
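
For reference, the three problems named above can be stated, for two polynomial families (p_n) and (q_m), in the standard form below; the notation chosen for the coefficients is illustrative.

```latex
% Connection problem: expand one family in terms of another
p_n(x) = \sum_{m=0}^{n} C_m(n)\, q_m(x)

% Linearization problem: expand a product within one family
p_n(x)\, p_m(x) = \sum_{k=0}^{n+m} L_k(n,m)\, p_k(x)

% Duplication problem: expand p_n(ax) back in the same family
p_n(a x) = \sum_{m=0}^{n} D_m(n,a)\, p_m(x)
```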

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a graphical user interface to GRASS GIS, developed with Python and the wxPython graphics library. The GUI provides access to several modules through a graphical interface written in Spanish. Its main purpose is to serve as a teaching tool, which is why it only gives access to a few basic but crucial modules. It also allows the user to organize the elements presented, so as to stress the aspects to be highlighted in a particular working session with the program.
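
A minimal sketch of this kind of interface is shown below: a wxPython window with one button that runs a GRASS module and displays its output. It assumes the script is launched from within a GRASS session so that the module (here g.version) is on the PATH; it is a simplified illustration, not the GUI described in the paper.

```python
import subprocess
import wx

class GrassFrame(wx.Frame):
    """One-button window that runs a GRASS module and shows its output."""
    def __init__(self):
        super().__init__(None, title="Mini GRASS GUI")
        panel = wx.Panel(self)
        self.output = wx.TextCtrl(panel, style=wx.TE_MULTILINE | wx.TE_READONLY)
        button = wx.Button(panel, label="Run g.version")
        button.Bind(wx.EVT_BUTTON, self.on_run)
        sizer = wx.BoxSizer(wx.VERTICAL)
        sizer.Add(button, 0, wx.ALL, 5)
        sizer.Add(self.output, 1, wx.EXPAND | wx.ALL, 5)
        panel.SetSizer(sizer)

    def on_run(self, event):
        # Run the GRASS module as an external command and display whatever it prints.
        result = subprocess.run(["g.version"], capture_output=True, text=True)
        self.output.SetValue(result.stdout or result.stderr)

if __name__ == "__main__":
    app = wx.App()
    GrassFrame().Show()
    app.MainLoop()
```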

Relevance:

100.00%

Publisher:

Abstract:

A talk given as part of the 2nd Workshop Biblioteca UdG, "La Universitat de Girona davant els reptes del nou Espai Europeu d'Educació Superior" (The University of Girona facing the challenges of the new European Higher Education Area), presenting the latest new features of the Millennium system concerning statistics, the URL checker, materials booking and the scheduler.

Relevance:

100.00%

Publisher:

Abstract:

This talk discusses what switching library management systems, from VTLS to Millennium, involved. It summarizes the pre-migration actions, the implementation and training calendar, and the key dates. Finally, some examples of catalogues are given: the University of California and the Universidad Complutense.