854 results for Information Systems Development
Abstract:
This paper focuses on outsourcing vendors, their characteristics and the vendor selection process. It draws on current research and two research studies, one specifically examining outsourcing vendors and the other examining vendor-client issues. We first outline the development of the market for the outsourcing of information technology/information systems services and activities, then detail the characteristics of different types of vendor companies and their competitive positions, before providing a client perspective on the issue of vendor selection.
Abstract:
This paper reviews the literature on the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to preserve the contexts of the transactions whose data were captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules that were used to produce them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the originating OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk semantic gaps between the information captured by OLTP systems and the information recalled through OLAP systems. Literature on modelling business transaction information as facts with context, as part of the modelling of information systems, was reviewed to identify design trends contributing to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on capturing facts with their associated context, encoding facts with contexts into data with business rules, storing and sourcing data together with those business rules, decoding data with business rules back into facts with context, and recalling facts with their associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rule storage for OLTP and OLAP purposes. The proposed design model opens the way for multi-purpose databases and business-rule stores shared by OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data alongside the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as captured by the respective OLTP system.
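As a rough illustration of the co-design idea (a minimal sketch, not the UBIRQ model itself; table names, the sample rule and the sample fact are invented for illustration), each fact can be stored alongside the business rule, and rule version, under which it was captured, so that analytical recall returns each fact together with its original context:

```python
import sqlite3

# Sketch: store each fact together with the business rule (and its version)
# under which it was captured. Names and sample data are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE business_rule (
    rule_id   TEXT,
    version   INTEGER,
    rule_text TEXT,                   -- the rule as applied at capture time
    PRIMARY KEY (rule_id, version)
);
CREATE TABLE fact (
    fact_id  INTEGER PRIMARY KEY,
    value    REAL,
    captured TEXT,                    -- transaction timestamp (context)
    rule_id  TEXT,
    rule_ver INTEGER,
    FOREIGN KEY (rule_id, rule_ver) REFERENCES business_rule (rule_id, version)
);
""")
conn.execute("INSERT INTO business_rule VALUES ('VAT-SE', 2, 'price includes 25% VAT')")
conn.execute("INSERT INTO fact VALUES (1, 125.0, '2024-01-05T10:00:00', 'VAT-SE', 2)")

# OLAP-style recall returns each fact with the rule that produced it, so the
# analytical reading preserves the transaction's original semantics.
for row in conn.execute("""
    SELECT f.value, f.captured, r.rule_text
    FROM fact AS f
    JOIN business_rule AS r ON r.rule_id = f.rule_id AND r.version = f.rule_ver
"""):
    print(row)
```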
Abstract:
ISO19156 Observations and Measurements (O&M) provides a standardised framework for organising information about the collection of observations of the environment. Here we describe the implementation of a specialisation of O&M for environmental data, the Metadata Objects for Linking Environmental Sciences (MOLES3). MOLES3 provides support for organising information about data, and for user navigation around data holdings. The implementation described here, “CEDA-MOLES”, also supports data management functions for the Centre for Environmental Data Archival, CEDA. The previous iteration of MOLES (MOLES2) saw active use over five years before being replaced by CEDA-MOLES in late 2014. During that period important lessons were learnt, both about the information needed and about how to design and maintain the necessary information systems. In this paper we review the problems encountered in MOLES2; how and why CEDA-MOLES was developed and engineered; the migration of information holdings from MOLES2 to CEDA-MOLES; and, finally, provide an early assessment of MOLES3 (as implemented in CEDA-MOLES) and its limitations. Key drivers for the MOLES3 development included the necessity for improved data provenance, for further structured information to support ISO19115 discovery metadata export (for EU INSPIRE compliance), and for appropriate fixed landing pages for Digital Object Identifiers (DOIs) in the presence of evolving datasets. Key lessons learned included the importance of minimising information structure in free text fields, and the necessity to support as much agility in the information infrastructure as possible without compromising maintainability, both for those using the systems internally and externally (e.g. citing into the information infrastructure) and for those responsible for the systems themselves. The migration itself needed to ensure continuity of service and traceability of archived assets.
Abstract:
Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. As predictive performance evaluation is a multidimensional problem, single scalar summaries such as error rate, although quite convenient due to their simplicity, can seldom evaluate all the aspects that a complete and reliable evaluation must consider. For this reason, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing these aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to select a suitable graphical method for a given task, it is crucial to identify each method's strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods in the same framework, we hope to shed some light on deciding which methods are more suitable to use in different situations.
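As a concrete instance of the kind of graphical method such surveys cover (the survey's own examples are not reproduced here; the data below are a toy illustration), an ROC curve traces the trade-off between true and false positive rates as a decision threshold sweeps over classifier scores, instead of collapsing performance into one scalar:

```python
def roc_points(labels, scores):
    """Trace ROC points by sweeping the decision threshold from high to low."""
    ranked = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]                      # threshold above every score
    for _, label in ranked:
        if label:
            tp += 1                            # one more true positive
        else:
            fp += 1                            # one more false positive
        points.append((fp / neg, tp / pos))    # (FPR, TPR) at this threshold
    return points

# Toy data: 1 = positive class, scores are classifier confidences.
print(roc_points([1, 0, 1, 1, 0, 0], [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]))
```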
Abstract:
Policy hierarchies and automated policy refinement are powerful approaches to simplify administration of security services in complex network environments. A crucial issue for the practical use of these approaches is to ensure the validity of the policy hierarchy, i.e. since the policy sets for the lower levels are automatically derived from the abstract policies (defined by the modeller), we must be sure that the derived policies uphold the high-level ones. This paper builds upon previous work on Model-based Management, particularly on the Diagram of Abstract Subsystems approach, and goes further to propose a formal validation approach for the policy hierarchies yielded by the automated policy refinement process. We establish general validation conditions for a multi-layered policy model, i.e. necessary and sufficient conditions that a policy hierarchy must satisfy so that the lower-level policy sets are valid refinements of the higher-level policies according to the criteria of consistency and completeness. Relying upon the validation conditions and upon axioms about the model representativeness, two theorems are proved to ensure compliance between the resulting system behaviour and the abstract policies that are modelled.
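The following toy sketch greatly simplifies the paper's formal machinery, modelling a policy simply as the set of (subject, target, action) triples it authorises: consistency demands that no derived rule exceeds what some abstract policy allows, and completeness demands that every abstract authorisation is realised by some derived rule.

```python
# Policies as sets of authorised (subject, target, action) triples; the
# derived low-level policy set is a list of such triples. Illustrative only.
def is_consistent(abstract, derived):
    """Every derived rule is authorised by at least one abstract policy."""
    return all(any(rule in policy for policy in abstract) for rule in derived)

def is_complete(abstract, derived):
    """Every authorisation granted abstractly is realised by a derived rule."""
    return all(auth in derived for policy in abstract for auth in policy)

abstract_policies = [{("staff", "db", "read"), ("staff", "db", "write")}]
derived_rules = [("staff", "db", "read")]      # output of a refinement step

print(is_consistent(abstract_policies, derived_rules))  # True: nothing exceeds the abstract policy
print(is_complete(abstract_policies, derived_rules))    # False: 'write' was lost in refinement
```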
Abstract:
Most physiological effects of thyroid hormones are mediated by the two thyroid hormone receptor subtypes, TR alpha and TR beta. Several pharmacological effects mediated by TR beta might be beneficial in important medical conditions such as obesity, hypercholesterolemia and diabetes, and selective TR beta activation may elicit these effects while maintaining an acceptable safety profile. To understand the molecular determinants of affinity and subtype selectivity of TR ligands, we have successfully employed a ligand- and structure-guided pharmacophore-based approach to obtain the molecular alignment of a large series of thyromimetics. Statistically reliable three-dimensional quantitative structure-activity relationship (3D-QSAR) and three-dimensional quantitative structure-selectivity relationship (3D-QSSR) models were obtained using the comparative molecular field analysis (CoMFA) method, and visual analyses of the contour maps drew attention to a number of possible opportunities for the development of analogs with improved affinity and selectivity. Furthermore, the 3D-QSSR analysis allowed the identification of a novel and previously unreported halogen bond, bringing new insights into the mechanism of activity and selectivity of thyromimetics.
Abstract:
The glycolytic enzyme glyceraldehyde-3-phosphate dehydrogenase (GAPDH) is an attractive target for the development of novel antitrypanosomatid agents. In the present work, comparative molecular field analysis and comparative molecular similarity index analysis were conducted on a large series of selective inhibitors of trypanosomatid GAPDH. Four statistically significant models were obtained (r² > 0.90 and q² > 0.70), indicating their predictive ability for untested compounds. The models were then used to predict the potency of an external test set, and the predicted values were in good agreement with the experimental results. Molecular modeling studies provided further insight into the structural basis for selective inhibition of trypanosomatid GAPDH.
Abstract:
Schistosomiasis is considered the second most important tropical parasitic disease, with severe socioeconomic consequences for millions of people worldwide. Schistosoma mansoni, one of the causative agents of human schistosomiasis, is unable to synthesize purine nucleotides de novo, which makes the enzymes of the purine salvage pathway important targets for antischistosomal drug development. In the present work, we describe the development of a pharmacophore model for ligands of S. mansoni purine nucleoside phosphorylase (SmPNP) as well as a pharmacophore-based virtual screening approach, which resulted in the identification of three thioxothiazolidinones (1-3) with substantial in vitro inhibitory activity against SmPNP. Synthesis, biochemical evaluation, and structure-activity relationship investigations led to the successful development of a small set of thioxothiazolidinone derivatives harboring a novel chemical scaffold as new competitive inhibitors of SmPNP in the low-micromolar range. Seven compounds were identified with IC50 values below 100 μM. The most potent inhibitors, 7, 10, and 17, with IC50 values of 2, 18, and 38 μM, respectively, could represent new lead compounds for the further development of schistosomiasis therapy.
Abstract:
The widespread use of service-oriented architectures (SOAs) and Web services in commercial software requires the adoption of development techniques that ensure the quality of Web services. Testing techniques and tools play a critical role in achieving the quality of SOA-based systems. Existing techniques and tools for traditional systems are not appropriate for these new systems, making it necessary to develop testing techniques and tools specific to Web services. This article presents new testing techniques to automatically generate a set of test cases and data for Web services. The techniques presented here explore data perturbation of Web service messages with respect to data types, integrity and consistency. To support these techniques, a tool (GenAutoWS) was developed and applied to real problems. (C) 2010 Elsevier Inc. All rights reserved.
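As an illustration of the data perturbation idea (a hedged sketch only: GenAutoWS itself is not specified here, and the field names and mutation sets below are invented), a valid service request can be mutated one field at a time using type-specific boundary, null and wrong-type values:

```python
import copy

def perturb(message):
    """Yield test variants of a request message, mutating one field at a time."""
    mutations = {
        int:   [0, -1, 2**31 - 1, None],        # boundary and null values
        str:   ["", "a" * 10_000, None, 42],    # empty, oversized, null, wrong type
        float: [0.0, -1.0, float("inf"), float("nan")],
    }
    for field, value in message.items():
        for bad in mutations.get(type(value), [None]):
            variant = copy.deepcopy(message)
            variant[field] = bad
            yield field, variant

# A valid baseline request for a hypothetical 'transfer' operation; each
# variant would be serialised into the message sent to the service under test.
baseline = {"account": "ACC-1", "amount": 150.0, "retries": 3}
for field, msg in perturb(baseline):
    print(f"mutated {field!r}: {msg}")
```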
Abstract:
The assessment of routing protocols for mobile wireless networks is a difficult task because of the networks' dynamic behavior and the absence of benchmarks. However, some of these networks, such as intermittent wireless sensor networks, periodic or cyclic networks, and some delay-tolerant networks (DTNs), have more predictable dynamics, as the temporal variations in the network topology can be considered deterministic, which may make them easier to study. Recently, a graph-theoretic model, evolving graphs, was proposed to help capture the dynamic behavior of such networks, in view of the construction of least-cost routing and other algorithms. The algorithms and insights obtained through this model are theoretically very efficient and intriguing. However, there has been no study of the use of such theoretical results in practical situations. Therefore, the objective of our work is to analyze the applicability of evolving graph theory to the construction of efficient routing protocols in realistic scenarios. In this paper, we use the NS2 network simulator to first implement an evolving graph based routing protocol, and then to use it as a benchmark when comparing the four major ad hoc routing protocols (AODV, DSR, OLSR and DSDV). Interestingly, our experiments show that evolving graphs have the potential to be an effective and powerful tool in the development and analysis of algorithms for dynamic networks, at least those with predictable dynamics. In order to make this model widely applicable, however, some practical issues, such as adaptive algorithms, still have to be addressed and incorporated into the model. We also discuss such issues in this paper, as a result of our experience.
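To illustrate the underlying model (a simplified sketch, not the paper's NS2 implementation; the topology below is invented), an evolving graph annotates each edge with the time steps at which it exists, and a valid journey must traverse edges at non-decreasing times; the benchmark routing can then be phrased as a foremost (earliest-arrival) journey computation:

```python
import heapq

def earliest_arrival(edges, source, target):
    """edges maps (u, v) to the sorted time steps at which the link is up."""
    best = {source: 0}
    heap = [(0, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == target:
            return t
        for (a, b), times in edges.items():
            if a != u:
                continue
            up = next((s for s in times if s >= t), None)  # wait for the link
            if up is not None and up + 1 < best.get(b, float("inf")):
                best[b] = up + 1                           # arrive one step later
                heapq.heappush(heap, (up + 1, b))
    return None  # target never reachable

# Link (A,B) exists at t=0 and t=4, (B,C) only at t=2, (A,C) only at t=9.
topology = {("A", "B"): [0, 4], ("B", "C"): [2], ("A", "C"): [9]}
print(earliest_arrival(topology, "A", "C"))  # 3: cross A->B at t=0, wait, cross B->C at t=2
```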
Abstract:
Today many system development projects overrun both budget and schedule. Often this is due to defects in the information systems that could have been prevented. The cost of testing can in some cases be as high as 50% of a project's total cost, and testing is at the same time an important part of development. Testing as such has moved its focus from the software itself and its faults to a wider perspective on whole infrastructures of information systems, where assuring good quality is important. Sogeti in the Netherlands has developed a test method called TMap (Test Management approach) that can be used for structured testing of information systems. TMap has not been used as much as desired at the office in Borlänge. Because Microsoft is releasing a new version of its platform Visual Studio Team System (VSTS 2010), some colleagues at Sogeti in the Netherlands are developing a template to support the use of TMap in VSTS 2010. At the time of writing, the template is still in development. The goal for Sogeti was to find out the differences between the test functionality in VSTS 2008 and 2010. The purpose of this essay, which was to analyze the test process in VSTS 2008 with TMap against the test process in VSTS 2010 together with the template, helped us achieve that goal. The analysis was done from four different angles: the TPI and TMMi models, problem and strength analyses, and a few question formulations. The TPI and TMMi models were used to analyze and evaluate the test process. The analysis showed that there were differences between the two test processes. VSTS 2010 together with the template gave better support for using TMap and performing tests. In VSTS 2010 the test tool Camano is connected to TFS and is intended to make the execution and logging of tests easier. This leads to a test process that is easier to handle and has better support for TMap.
Abstract:
Today’s e-services are complex phenomena consisting of several different e-services linked together. The e-services are provided by IT systems and presented to customers through user interfaces. Within web design research, criteria are laid out for the design of good user interfaces, but one problem is that these analyses are performed without a service focus. This lack of service focus can result in the designed user interfaces conveying indistinct service concepts, especially where several e-services are intertwined with each other. In order to design IT system interfaces, we have to be clear about which e-services are provided by the IT system and how these e-services are related to each other. This paper presents a framework for the analysis of user interfaces in terms of a focused e-service, its service environment, and two types of intertwined e-services: related e-services and interrelating e-services. The analyses are exemplified by an Internet-based e-marketplace. The paper discusses how the framework can be combined with ordinary web design criteria and how it can be used for e-service development.
Abstract:
Many projects fail, and one of the reasons is poor project governance in general, and within the IT industry in particular. In response to criticism of traditional methods in recent years, several lightweight methods known as Agile methods have emerged. Scrum is the best-known Agile method in use today. The method promises good results, but an article in the magazine Computer Sweden (Feb 2009) states that "figures show that nine out of ten Scrum projects fail". The article triggered our interest in finding out which problems specific to Scrum have drawn so much criticism, and we therefore chose to direct our study towards this. The essay aims to investigate whether local IT companies in Borlänge, Headlight and Sogeti, and the state network capacity provider Trafikverket ICT, suffer from the general problems that other Scrum users experience when using the method. This essay focuses on four problem areas: inadequate documentation, reduced efficiency in the work process, reduced efficiency in the work process in large projects, and insufficient support for evaluation. For our study, literature studies and interviews were conducted. Interview series were carried out with eleven people at our case companies. The target group for our interviews was Product Owners (PO), ScrumMasters (SM) and developers. Having completed the study, we can conclude that the general problems experienced by other Scrum users could also be identified at our case companies. The results were confirmed by the collected data and our theoretical framework. In the discussion we present recommendations for avoiding the problems associated with Scrum.
Abstract:
The main aim of this project is to develop an ESES lab on a full-scale system. The solar combisystem used is available most of the time, being used only twice a year for some technical courses. At the moment, there are no other laboratories on combisystems. The experiments were designed to use the system as fully as possible, in order to help students apply the theoretical knowledge from the solar thermal course and to make them more familiar with solar system components. The method adopted to reach this aim was to carry out several test sequences on the system, in order to help formulate, in the end, some educational experiments. A few tests were carried out at the beginning of the project simply to understand the system and to determine whether any additional measuring equipment was required. The level of these test sequences varied from simple energy draw-off or collector loop controller response tests to more complicated ones, such as the use of the ‘collector’ heater to simulate the effect of the solar collector on the system. The test results were compared and verified against theoretical data wherever relevant. The results of the experiment on using the ‘collector’ heater instead of the collector were acceptable. Finally, the lab guide was developed based on the results of these experiments and on the experience gained while conducting them. The lab work covers the theory related to solar systems in general and combisystems in particular.
Abstract:
In this study a case study was conducted at the Transport Agency. The government agency is seeking better alternatives to its current ways of keeping track of important dates stated in legal documents, and wants to explore the possibility of an IT solution for keeping track of those dates. In the case study, a change analysis was conducted on the vehicle type approval process. Qualitative interviews were conducted with a section manager, vehicle type and component approvers, and a system administrator and a system manager at the Transport Agency in Borlänge. The study describes an information environment in an organization in which date-controlled conditions apply, along with the challenges of such an environment. The study also provides a proposal for designing an IT solution for such an environment. The results showed that the information environment, in terms of the legal documents and the websites of the institutions (EU and UNECE) that distribute them, makes it challenging to keep track of dates. The challenge with the websites is that they are limited and complicated in terms of information search. The challenge with the legal documents is that date-controlled conditions apply; in other words, the dates themselves determine which dates and which rules apply and which do not. The legal documents are also numerous and retroactive, refer to each other, and contain regulations for different kinds of areas, which makes it difficult to find the relevant dates. The design proposal that emerged was based on the existing IT environment at the Transport Agency. It was proposed that important key concepts be identified, categorized and realized in a database, searchable on the most important common concepts as denominators, and that the information conform with the E-message. The developed database was based on date, date type, vehicle category, legal document, minor version and other info. A web interface was created in ASP.NET and C# for access to the database, with the conclusion that a web-based IT solution should be considered.
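A minimal sketch of the proposed design (illustrative only: the column names follow the abstract, but the sample row, the regulation number and the use of SQLite are hypothetical stand-ins for the agency's actual environment) could store the key concepts as searchable columns:

```python
import sqlite3

# Illustrative schema: one row per date-controlled condition found in a
# legal document, keyed on the common denominators named in the study.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE tracked_date (
    id               INTEGER PRIMARY KEY,
    date             TEXT,    -- the date to keep track of (ISO 8601)
    date_type        TEXT,    -- e.g. 'entry into force'
    vehicle_category TEXT,    -- common denominator for searching
    legal_document   TEXT,
    minor_version    TEXT,
    other_info       TEXT
)""")
conn.execute(
    "INSERT INTO tracked_date VALUES "
    "(1, '2025-07-01', 'entry into force', 'M1', 'UN R155', '01', NULL)"
)

# The web interface would issue searches like this on the common denominators.
for row in conn.execute(
    "SELECT date, date_type, legal_document FROM tracked_date "
    "WHERE vehicle_category = ? ORDER BY date",
    ("M1",),
):
    print(row)
```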