935 results for Enterprise application integration (Computer systems)
Abstract:
This research was undertaken to determine how the success of multi-organisational enterprise strategy depends on using the correct type of Enterprise Resource Planning (ERP) information system. However, there appears to be a dearth of research on the strategic alignment between ERP systems development and multi-organisational enterprise governance: guidelines and frameworks to assist practitioners in making decisions about multi-organisational collaboration supported by different types of ERP systems are still missing from both theoretical and empirical perspectives. This motivates the present research, which investigates ERP systems development and emerging practices in the management of multi-organisational enterprises (i.e. parts of companies working with parts of other companies to deliver complex product-service systems) and identifies how different ERP systems fit different multi-organisational enterprise structures in order to achieve sustainable competitive success. An empirical inductive study was conducted using a Grounded Theory-based methodological approach, based on successful manufacturing and service companies in the UK and China. This involved an initial pre-study literature review and data collection via 48 semi-structured interviews with 8 companies delivering complex products and services across organisational boundaries whilst adopting ERP systems to support their collaborative business strategies: 4 cases cover the printing, semiconductor manufacturing, and parcel distribution industries in the UK, and 4 cases cover the crane manufacturing, concrete production, and banking industries in China. These cases yielded a set of 29 tentative propositions, which were validated via a questionnaire that received 116 responses from 16 companies. The research has resulted in the consolidation of the validated propositions into a novel concept referred to as the 'Dynamic Enterprise Reference Grid for ERP' (DERG-ERP), which draws on multiple theoretical perspectives.
The core of the DERG-ERP concept is a contingency management framework which indicates that different multi-organisational enterprise paradigms and the ERP information systems supporting them are not the result of different strategies, but are best considered part of a strategic continuum with the same overall business purpose of multi-organisational cooperation. At different times and circumstances in a partnership lifecycle, firms may prefer particular multi-organisational enterprise structures and different types of ERP systems to satisfy business requirements. The DERG-ERP concept thus helps decision makers select, manage and co-develop the most appropriate multi-organisational enterprise strategy and its corresponding ERP systems by drawing on core competence, expected competitiveness, and information systems strategic capabilities as the main contingency factors. Specifically, this research suggests that traditional ERP(I) systems are associated with the Vertically Integrated Enterprise (VIE), whilst ERPII systems can be correlated to Extended Enterprise (EE) requirements and ERPIII systems can best support the operations of the Virtual Enterprise (VE). The contribution of this thesis is threefold. Firstly, this work addresses a gap in the extant literature on the best fit between ERP system types and multi-organisational enterprise structure types, and proposes a new contingency framework, the DERG-ERP, which can be used to explain how and why enterprise managers need to change and adapt their ERP information systems in response to changing business and operational requirements. Secondly, with respect to a priori theoretical models, the new DERG-ERP furthers multi-organisational enterprise management thinking by incorporating information systems strategy, rather than focusing purely on the strategic, structural, and operational aspects of enterprise design and management.
Simultaneously, the DERG-ERP makes theoretical contributions to the current IS Strategy Formulation Model, which does not explicitly address multi-organisational enterprise governance. Thirdly, this research clarifies and emphasises the new concepts and ideas of future ERP systems (referred to as ERPIII) that are inadequately covered in the extant literature. The novel DERG-ERP concept and its elements have also been applied to 8 empirical cases to serve as a practical guide for ERP vendors, information systems managers, and operations managers hoping to grow and sustain their competitive advantage with respect to effective enterprise strategy, enterprise structures, and ERP systems use; referred to in this thesis as the "enterprisation of operations".
Abstract:
This thesis is a study of the performance management of Complex Event Processing (CEP) systems. Since CEP systems have characteristics distinct from other well-studied computer systems, such as batch and online transaction processing systems and database-centric applications, these characteristics introduce new challenges and opportunities for the performance management of CEP systems. Methodologies used to benchmark CEP systems in many performance studies focus on scaling the load injection but do not consider the impact of the functional capabilities of CEP systems. This thesis proposes the approach of evaluating the performance of CEP engines' functional behaviours on events and develops a benchmark platform for CEP systems, CEPBen. The CEPBen benchmark platform is developed to explore the fundamental functional performance of event processing systems: filtering, transformation and event pattern detection. It is also designed to provide a flexible environment for exploring new metrics and influential factors for CEP systems and for evaluating their performance. Studies on factors and new metrics are carried out using the CEPBen benchmark platform on Esper. Different measurement points for response time in the performance management of CEP systems are discussed, and the response time of targeted events is proposed as a quality-of-service metric to be combined with the traditional response time of CEP systems. Maximum query load is proposed as a capacity indicator with respect to the complexity of queries, and the number of live objects in memory is proposed as a performance indicator with respect to memory management. Query depth is studied as a factor that influences CEP system performance.
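The three functional behaviours that the abstract names — filtering, transformation and event pattern detection — can be illustrated with a minimal toy event pipeline. This is an independent sketch, not CEPBen or Esper code; the event structure and predicates are illustrative assumptions.

```python
def filter_events(events, predicate):
    """Filtering: keep only events satisfying a predicate."""
    return [e for e in events if predicate(e)]

def transform_events(events, fn):
    """Transformation: derive a new value from each input event."""
    return [fn(e) for e in events]

def detect_pattern(events, first, second):
    """Pattern detection: find ordered pairs (an A-event followed by a B-event)."""
    matches = []
    pending = None
    for e in events:
        if first(e):
            pending = e
        elif second(e) and pending is not None:
            matches.append((pending, e))
            pending = None
    return matches

# Synthetic event stream: (timestamp, type, value)
stream = [(0.0, "login", "alice"), (0.1, "error", 500),
          (0.2, "login", "bob"), (0.3, "error", 503)]

errors = filter_events(stream, lambda e: e[1] == "error")
codes = transform_events(errors, lambda e: e[2])
pairs = detect_pattern(stream,
                       lambda e: e[1] == "login",
                       lambda e: e[1] == "error")
print(codes)       # [500, 503]
print(len(pairs))  # 2
```

A benchmark in the spirit of the thesis would drive such operations with a controlled event stream and measure the response time of targeted events rather than only the injection throughput.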
Abstract:
The present paper discusses the process of multi-lateral integration of business applications, which requires the construction of a common infrastructure, acquires the format of a service, and frees every participant in the process from individually constructing a private infrastructure.
Abstract:
Enterprise Resource Planning (ERP) systems are software programs designed to integrate the functional requirements and operational information needs of a business. Pressures of competition and entry standards for participation in major manufacturing supply chains are creating greater demand for small business ERP systems. The proliferation of new ERP system offerings adds complexity to the process of identifying the right ERP business software for a small and medium-sized enterprise (SME). The selection of an ERP system is a process in which a faulty conclusion poses a significant risk of failure to SMEs. The literature reveals that there are still very high failure rates in ERP implementation, and that faulty selection processes contribute to this failure rate. However, the literature is devoid of a systematic methodology for the ERP selection process for SMEs. This study provides a methodological approach to selecting the right ERP system for a small or medium-sized enterprise. The study employs Thomann's meta-methodology for methodology development; a survey of SMEs is conducted to inform the development of the methodology, and a case study is employed to test and revise the new methodology. The study shows that a rigorously developed, effective methodology that includes benchmarking experiences has been developed and successfully employed. It is verified that the methodology may be applied to the domain of users it was developed to serve, and that the test results are validated by expert users and stakeholders. Future research should investigate in greater detail the application of meta-methodologies to supplier selection and evaluation processes for services and software; additional research into the purchasing practices of small firms is clearly needed.
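The abstract does not detail Thomann's meta-methodology, but ERP selection methodologies of this kind typically include a criteria-based scoring step. A minimal sketch of such a step follows; the criteria names, weights and ratings are illustrative assumptions, not data from the study.

```python
def score_erp_candidates(candidates, weights):
    """Weighted-sum scoring of ERP candidates.

    candidates: {name: {criterion: rating on a 1-5 scale}}
    weights:    {criterion: weight, summing to 1.0}
    Returns (name, score) pairs sorted best-first.
    """
    scored = {
        name: sum(weights[c] * ratings[c] for c in weights)
        for name, ratings in candidates.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative criteria for a hypothetical SME evaluation:
weights = {"fit": 0.4, "cost": 0.3, "vendor_support": 0.2, "scalability": 0.1}
candidates = {
    "ERP-A": {"fit": 4, "cost": 3, "vendor_support": 5, "scalability": 2},
    "ERP-B": {"fit": 3, "cost": 5, "vendor_support": 3, "scalability": 3},
}
ranking = score_erp_candidates(candidates, weights)
print(ranking[0][0])  # ERP-A
```

In practice, the weights themselves would be elicited from stakeholders and benchmarked against the experiences of comparable firms, which is where a faulty selection process most often goes wrong.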
Abstract:
Automated information system design and implementation is one of the fastest changing aspects of the hospitality industry. During the past several years nothing has increased the professionalism or improved the productivity within the industry more than the application of computer technology. Intuitive software applications, deemed the first step toward making computers more people-literate, object-oriented programming, intended to more accurately model reality, and wireless communications are expected to play a significant role in future technological advancement.
Abstract:
The local area network (LAN) interconnecting computer systems and software can make a significant contribution to the hospitality industry. The author discusses the advantages and disadvantages of such systems.
Abstract:
Understanding habitat selection and movement remains a key question in behavioral ecology. Yet, obtaining a sufficiently high spatiotemporal resolution of the movement paths of organisms remains a major challenge, despite recent technological advances. Observing fine-scale movement and habitat choice decisions in the field can prove difficult and expensive, particularly in expansive habitats such as wetlands. We describe the application of passive integrated transponder (PIT) systems to field enclosures for tracking detailed fish behaviors in an experimental setting. PIT systems have been applied to habitats with clear passageways, at fixed locations, or in controlled laboratory and mesocosm settings, but their use in unconfined habitats and field-based experimental setups remains limited. In an Everglades enclosure, we continuously tracked the movement and habitat use of PIT-tagged centrarchids across three habitats of varying depth and complexity using multiple flatbed antennas for 14 days. Fish used all three habitats, with marked species-specific diel movement patterns across habitats, and short-lived movements that would likely be missed by other tracking techniques. Findings suggest that the application of PIT systems to field enclosures can be an insightful approach for gaining continuous, undisturbed and detailed movement data in unconfined habitats, and for experimentally manipulating both internal and external drivers of these behaviors.
Abstract:
The enterprise management approach provides a holistic view of organizations and their related information systems. In order to cope with globalization, virtualization, and a volatile competitive environment, traditional firms are seeking to reconstruct their organizational structures and establish new IS architectures to transform from single autonomous entities into more open enterprises supported by new Enterprise Resource Planning (ERP) systems. This paper reports on ERP engage-abilities within three different enterprise management patterns based on the theoretical foundations of the "Dynamic Enterprise Reference Grid". An exploratory inductive study in Zoomlion using the narrative research approach has been conducted. This research also delivers a conceptual framework to demonstrate the adoption of ERP in the three enterprise management structures and points to a new architectural type (ERPIII) for operating in the virtual enterprise paradigm. © 2010 Springer-Verlag.
Abstract:
This paper is an overview of the development and application of computer vision for the Structural Health Monitoring (SHM) of bridges. A brief explanation of SHM is provided, followed by a breakdown of the stages of computer vision techniques, separated into laboratory and field trials. Qualitative evaluations and comparisons of these methods are provided, along with proposed guidelines for new vision-based SHM systems.
Abstract:
Monitoring multiple myeloma patients for relapse requires sensitive methods to measure minimal residual disease and to establish a more precise prognosis. The present study aimed to standardize a real-time quantitative polymerase chain reaction (PCR) test for the IgH gene with a JH consensus self-quenched fluorescence reverse primer and a VDJH or DJH allele-specific sense primer (self-quenched PCR). This method was compared with an allele-specific real-time quantitative PCR test for the IgH gene using a TaqMan probe and a JH consensus primer (TaqMan PCR). We studied nine multiple myeloma patients from the Spanish group treated with the MM2000 therapeutic protocol. Self-quenched PCR demonstrated sensitivity of ≥10^-4 or 16 genomes in most cases, efficiency was 1.71 to 2.14, and intra-assay and interassay reproducibilities were 1.18 and 0.75%, respectively. Sensitivity, efficiency, and residual disease detection were similar with both PCR methods. TaqMan PCR failed in one case because of a mutation in the JH primer binding site, whereas self-quenched PCR worked well in this case. In conclusion, self-quenched PCR is a sensitive and reproducible method for quantifying residual disease in multiple myeloma patients; it yields results similar to TaqMan PCR and may be more effective than the latter when somatic mutations are present in the JH intronic primer binding site.
Abstract:
This paper reviews objective assessments of Parkinson's disease (PD) motor symptoms, both cardinal symptoms and dyskinesia, using sensor systems. It surveys the manifestation of PD symptoms, the sensors used for their detection, the types of signals (measures), and the signal processing (data analysis) methods applied to them. The review's findings are summarised in a table listing the devices (sensors), measures and methods used in each reviewed motor symptom assessment study. Among the sensors in the gathered studies, accelerometers and touch screen devices are the most widely used to detect PD symptoms, and among the symptoms, bradykinesia and tremor were the most frequently evaluated. In general, machine learning methods appear promising for this task. PD is a complex disease that requires continuous monitoring and multidimensional symptom analysis. Combining existing technologies to develop new sensor platforms may assist in assessing the overall symptom profile more accurately and so support a better treatment process.
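As a hedged illustration of the kind of signal-processing step such studies apply to accelerometer data, the sketch below estimates the dominant frequency of a synthetic tremor-like signal with a plain discrete Fourier transform (parkinsonian rest tremor typically falls around 4-6 Hz). This is pure-Python illustration code, not a method from any of the reviewed studies, which use richer features and machine-learning classifiers.

```python
import cmath
import math

def dominant_frequency(samples, fs):
    """Return the frequency (Hz) with the largest DFT magnitude,
    excluding the DC bin, for a real-valued signal sampled at fs Hz."""
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):  # positive-frequency bins only
        s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k * fs / n

# Synthetic 5 Hz "tremor" sampled at 100 Hz for 1 second
fs = 100
signal = [math.sin(2 * math.pi * 5 * t / fs) for t in range(fs)]
print(dominant_frequency(signal, fs))  # 5.0
```

A real pipeline would window the signal, use an FFT for speed, and feed spectral features like this one into a classifier alongside amplitude and regularity measures.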
Abstract:
In recent years computer systems have become increasingly complex, and consequently the challenge of protecting these systems has become increasingly difficult. Various techniques have been implemented to counteract the misuse of computer systems in the form of firewalls, antivirus software and intrusion detection systems. The complexity of networks and the dynamic nature of computer systems leave current methods with significant room for improvement. Computer scientists have recently drawn inspiration from mechanisms found in biological systems and, in the context of computer security, have focused on the human immune system (HIS). The human immune system provides an example of a robust, distributed system that provides a high level of protection from constant attacks. By examining the precise mechanisms of the human immune system, it is hoped that this paradigm will improve the performance of real intrusion detection systems. This paper presents an introduction to recent developments in the field of immunology. It discusses the incorporation of a novel immunological paradigm, Danger Theory, and how this concept is inspiring artificial immune systems (AIS). Applications within the context of computer security are outlined with direct reference to the underlying principles of Danger Theory, and finally the current state of intrusion detection systems is discussed and improvements are suggested.
Abstract:
Intrusion Detection Systems (IDSs) provide an important layer of security for computer systems and networks, and are becoming increasingly necessary as reliance on Internet services grows and systems with sensitive data are more commonly open to Internet access. An IDS's responsibility is to detect suspicious or unacceptable system and network activity and to alert a systems administrator to it. The majority of IDSs use a set of signatures that define what suspicious traffic is, and Snort is one popular and actively developed open-source IDS that uses such a set of signatures, known as Snort rules. Our aim is to identify a way in which Snort could be developed further by generalising rules to identify novel attacks. In particular, we attempted to relax and vary the conditions and parameters of current Snort rules, using an approach similar to classic rule learning operators such as generalisation and specialisation. We demonstrate the effectiveness of our approach through experiments with standard datasets and show that we are able to detect previously undetected variants of various attacks. We conclude by discussing the general effectiveness and appropriateness of generalisation in Snort-based IDS rule processing. Keywords: anomaly detection, intrusion detection, Snort, Snort rules
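The generalisation idea can be sketched on a simplified rule representation. This is an illustrative model only, not Snort's actual rule syntax or matching engine; the field names, the port-widening operator, and the example signature are all assumptions made for the sketch.

```python
def matches(rule, packet):
    """Simplified signature match: every rule field must equal the
    packet's field, except 'dport', which is a (lo, hi) range."""
    for field, expected in rule.items():
        if field == "dport":
            lo, hi = expected
            if not (lo <= packet["dport"] <= hi):
                return False
        elif packet.get(field) != expected:
            return False
    return True

def generalise_port(rule, widen=100):
    """Generalisation operator: relax an exact destination port into a
    range, so near-miss variants of a known attack still trigger."""
    lo, hi = rule["dport"]
    new_rule = dict(rule)
    new_rule["dport"] = (max(0, lo - widen), min(65535, hi + widen))
    return new_rule

# A hypothetical signature for an attack targeting port 8080,
# and a variant of the same attack moved to a neighbouring port
rule = {"proto": "tcp", "dport": (8080, 8080), "payload": "EXPLOIT"}
variant = {"proto": "tcp", "dport": 8081, "payload": "EXPLOIT"}

print(matches(rule, variant))                   # False: exact rule misses it
print(matches(generalise_port(rule), variant))  # True: generalised rule hits
```

The trade-off the paper's experiments explore is visible even here: widening a condition catches attack variants but also enlarges the set of benign traffic a rule can flag, so each generalisation has to be evaluated against false-positive cost.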
Abstract:
`Evolution of mylonitic microfabrics' (EMM) is an interactive Filemaker Pro 3.0 application that documents a series of see-through deformation experiments on polycrystalline norcamphor. The application comprises computer animations, graphics and text explanations designed to give students and researchers insight into the interaction and dynamic nature of small-scale, mylonitic processes like intracrystalline glide, dynamic recrystallization and strain localization (microshearing). EMM shows how mylonitic steady state is achieved at different strain rates and temperatures. First, rotational mechanisms like glide-induced vorticity, subgrain rotation recrystallization and rigid-body rotation bring grains' crystal lattices into orientations that are favorable for intracrystalline glide. In a second stage, selective elimination of grains whose lattices are poorly oriented for glide involves grain boundary migration. This strengthens the texture. Temperature and strain rate affect both the relative activity of different strain accommodation mechanisms and the rate of microfabric change. Steady-state microfabrics are characterized by stable texture, grain size and shape-preferred orientations of grains and domains. This involves the cyclical generation and elimination of dynamically recrystallized grains and microshear zones.