799 results for Integrated Information Systems
Abstract:
The essay summarizes the results of a questionnaire survey carried out by the Department of Logistics and Supply Chain Management at Corvinus University of Budapest (BCE). The aim of the survey was to assess and describe the current level of IT support for the logistics, and particularly the distribution logistics, processes of Hungarian companies, together with the development directions expected in this area over the next two to three years. The survey systematically covered every subsystem of the logistics information system, examining the prevalence of different identification technologies, established practice in the use of ERP systems and their individual modules, the IT support of strategic logistics decisions, and the communication techniques in use. Overall, the authors conclude that the logistics information systems of Hungarian companies are at a medium level of development; it is important to note, however, that the SME sector lags significantly behind in this respect. This also means that extending the use of IT tools still offers the potential for considerable performance improvement.
Abstract:
The concept of cloud computing emerged a few years ago and now occupies an ever-growing place in both the professional and scientific literature and in IT practice. This new IT technology is driving the standardization and spread of ERP systems tied to cloud computing services. The authors provide an overview of the current state of cloud computing and of end-user expectations for data processing systems running in the cloud, ERP systems in particular, along with early application experience, drawn mostly from cases in Germany. They also discuss the new selection objectives and criteria for ERP systems that have arisen from the specific possibilities of the cloud environment.
Abstract:
Online learning systems (OLS) have become center stage for corporations and educational institutions as a competitive tool in the knowledge economy. The satisfaction construct has received extensive coverage in the information systems literature as an indicator of effectiveness but has been criticized for lack of validity; the value construct, by contrast, has been largely ignored, although it has a long history in psychology, sociology, and behavioral science. The purpose of this dissertation is to investigate the value and satisfaction constructs in the context of OLS, and their relationship as perceived by learners, as an indicator of the implied effectiveness of OLS. First, a qualitative phase gathered OLS values from learners' focus groups, followed by a pilot phase to refine a proposed instrument and a main phase to validate the survey. Responses were received from 75 students in four focus groups, 141 in the pilot, and 207 in the main survey. Extensive data cleaning and exploratory factor analysis were performed to identify factors of learners' perceived value and satisfaction with OLS. Value-Satisfaction grids and the Learners' Value Index of Satisfaction (LeVIS) were then developed as benchmarking tools for OLS. Moreover, Multicriteria Decision Analysis (MCDA) techniques were employed to impute value from satisfaction scores in order to reduce survey response time. The results yielded four satisfaction and four value factors with high reliability (Cronbach's α). Value and satisfaction were found to have low linear and nonlinear correlations, indicating that they are two distinct, uncorrelated constructs, which is consistent with the literature. The Value-Satisfaction grids and the LeVIS index indicated relatively high effectiveness for technology and support characteristics, relatively low effectiveness for professor characteristics, and average effectiveness for course and learner characteristics. The main contributions of this study include identifying, defining, and articulating the relationship between the value and satisfaction constructs as an assessment of users' implied IS effectiveness, as well as assessing the accuracy of MCDA procedures in predicting value scores, thereby halving the size of the survey questionnaire.
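As a concrete illustration of the reliability and correlation analysis described above, the following minimal Python sketch computes Cronbach's α and the value-satisfaction correlation on hypothetical 5-point Likert responses; the actual items, factor structure, and data of the study are not reproduced here.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical Likert responses for one satisfaction and one value factor.
rng = np.random.default_rng(0)
satisfaction = rng.integers(1, 6, size=(207, 4))
value = rng.integers(1, 6, size=(207, 4))

print("alpha (satisfaction):", round(cronbach_alpha(satisfaction), 3))
# A low Pearson correlation between factor means would support treating
# value and satisfaction as distinct constructs, as the study reports.
r = np.corrcoef(satisfaction.mean(axis=1), value.mean(axis=1))[0, 1]
print("value-satisfaction correlation:", round(r, 3))
```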
Abstract:
Security remains a top priority for organizations as their information systems continue to be plagued by security breaches. This dissertation developed a unique approach, based on a dynamic neural network architecture, to assess the security risks associated with information systems. The risks considered encompass both the production computing environment and the client machine environment, and are expressed as metrics that define how susceptible each environment is to security breaches. The merit of the approach lies in the design and implementation of artificial neural networks to assess the risks in the computing and client machine environments. The datasets used to implement and validate the model were obtained from business organizations through a web survey tool hosted by Microsoft; the site hosted anonymous surveys devised specifically for this dissertation, and Microsoft customers could log in to the website and submit their responses to the questionnaire. This work asserted that security in information systems depends not exclusively on technology but rather on the triumvirate of people, process, and technology; the questionnaire, and consequently the neural network architecture, accounted for all three key factors that affect information systems security. As part of the study, a methodology for developing, training, and validating such a predictive model was devised and successfully deployed, prescribing how to determine the optimal topology, activation function, and associated parameters for this security scenario. Whereas the assessment of the effects of security breaches on information systems has traditionally been post-mortem, this dissertation provides a predictive solution with which organizations can proactively determine how susceptible their environments are to security breaches.
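A minimal sketch of the general idea, not the dissertation's actual architecture: a small feed-forward network trained on hypothetical encoded survey responses to output a breach-susceptibility score. The topology, activation function, and data below are placeholders; the study determines these choices systematically.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical encoded survey responses (people/process/technology items)
# and a 0-1 label indicating whether a breach was reported.
X = rng.random((200, 12))
y = (X[:, :4].mean(axis=1) > 0.5).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with sigmoid activations; purely illustrative.
W1 = rng.normal(scale=0.5, size=(12, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1));  b2 = np.zeros(1)

lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)             # hidden activations
    p = sigmoid(h @ W2 + b2)             # predicted susceptibility score
    grad_out = (p - y) / len(X)          # cross-entropy output gradient
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ grad_out); b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * (X.T @ grad_h);   b1 -= lr * grad_h.sum(axis=0)

print("mean predicted susceptibility:", float(p.mean()))
```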
Abstract:
This research presents several components encompassing the scope of data partitioning and replication management in a distributed GIS database. Modern Geographic Information Systems (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research studies the patterns of geographic raster data processing and proposes algorithms to improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases, in order to achieve high data availability and Quality of Service (QoS) under distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach is proposed for mosaicking digital images of different temporal and spatial characteristics into tiles. This dynamic approach reuses digital images on demand and generates mosaicked tiles only for the required region, according to user requirements such as resolution, temporal range, and target bands, to reduce storage redundancy and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for efficiently acquiring GIS data from external heterogeneous databases and Web services, as well as enhancements to end-user GIS data delivery, automation, and 3D virtual-reality presentation. Vast numbers of computing, network, and storage resources on the Internet sit idle or underutilized; the proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database. The approach has resulted in the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
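The dynamic mosaicking step can be illustrated with a simplified selection routine. The sketch below, with entirely illustrative class and field names, filters a raster catalog by the user's bounding box, temporal range, resolution, and target bands, which is the kind of on-demand reuse the approach describes:

```python
from dataclasses import dataclass

@dataclass
class RasterImage:
    """Metadata for one source image; names are illustrative only."""
    path: str
    bbox: tuple          # (min_x, min_y, max_x, max_y)
    resolution_m: float
    year: int
    bands: frozenset

def covers(img_bbox, query_bbox):
    """True if the two bounding boxes intersect."""
    ax0, ay0, ax1, ay1 = img_bbox
    bx0, by0, bx1, by1 = query_bbox
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def select_for_mosaic(catalog, query_bbox, years, max_res_m, bands):
    """Pick the candidate images a dynamic mosaicking step would read,
    preferring finer resolution so redundant coarse data is skipped."""
    hits = [img for img in catalog
            if covers(img.bbox, query_bbox)
            and years[0] <= img.year <= years[1]
            and img.resolution_m <= max_res_m
            and bands <= img.bands]
    return sorted(hits, key=lambda img: img.resolution_m)

catalog = [
    RasterImage("a.tif", (0, 0, 10, 10), 1.0, 2004, frozenset({"R", "G", "B"})),
    RasterImage("b.tif", (5, 5, 15, 15), 0.5, 2006, frozenset({"R", "G", "B", "NIR"})),
]
print([i.path for i in
       select_for_mosaic(catalog, (4, 4, 8, 8), (2003, 2007), 1.0, frozenset({"R"}))])
```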
Abstract:
The Bahamas is a small island nation that is dealing with the problem of freshwater shortage. All of the country’s freshwater is contained in shallow lens aquifers that are recharged solely by rainfall. The country has been struggling to meet the water demands by employing a combination of over-pumping of aquifers, transport of water by barge between islands, and desalination of sea water. In recent decades, new development on New Providence, where the capital city of Nassau is located, has created a large area of impervious surfaces and thereby a substantial amount of runoff with the result that several of the aquifers are not being recharged. A geodatabase was assembled to assess and estimate the quantity of runoff from these impervious surfaces and potential recharge locations were identified using a combination of Geographic Information Systems (GIS) and remote sensing. This study showed that runoff from impervious surfaces in New Providence represents a large freshwater resource that could potentially be used to recharge the lens aquifers on New Providence.
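As a back-of-the-envelope illustration of the scale of such a resource, a simple runoff-coefficient estimate can be computed as follows; all figures are hypothetical or approximate and are not taken from the study:

```python
# Rough annual runoff volume from an impervious area using a simple
# runoff-coefficient model (the study's exact method is not stated here).
impervious_area_km2 = 50.0   # hypothetical impervious surface on New Providence
annual_rainfall_m = 1.4      # approximate annual rainfall for Nassau
runoff_coefficient = 0.9     # typical for paved and roofed surfaces

area_m2 = impervious_area_km2 * 1e6
runoff_m3_per_year = runoff_coefficient * annual_rainfall_m * area_m2
print(f"Potential recharge water: {runoff_m3_per_year:,.0f} m^3/year")
# 0.9 * 1.4 m * 5.0e7 m^2 = 6.3e7 m^3/year, i.e. roughly 63 billion litres
```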
Abstract:
There is growing popularity in the use of composite indices and rankings for cross-organizational benchmarking. However, little attention has been paid to alternative methods and procedures for computing these indices, or to how the choice of method may affect the resulting indices and rankings. This dissertation developed an approach for assessing composite indices and rankings based on the integration of a number of methods for aggregation, data transformation, and attribute weighting involved in their computation. The integrated model is based on the simulation of composite indices using methods and procedures proposed in the areas of multi-criteria decision making (MCDM) and knowledge discovery in databases (KDD). The approach was automated through an IT artifact that was designed, developed, and evaluated following the framework and guidelines of the design science paradigm of information systems research. This artifact dynamically generates multiple versions of indices and rankings by considering different methodological scenarios according to user-specified parameters. The computerized implementation was done in Visual Basic for Excel 2007. Using different performance measures, the artifact produces a number of Excel outputs for the comparison and assessment of the indices and rankings. To evaluate the efficacy of the artifact and its underlying approach, a full empirical analysis was conducted using the World Bank's Doing Business database for 2010, which includes ten sub-indices (each corresponding to a different area of the business environment and regulation) for 183 countries. The output results, obtained using 115 methodological scenarios for the assessment of this index and its ten sub-indices, indicated that the variability of the component indicators considered in each case influenced the sensitivity of the rankings to the methodological choices. Overall, the results of the multi-method assessment were consistent with the World Bank rankings, except in cases where the indices involved cost indicators measured relative to per capita income, which yielded more sensitive results. Low-income countries exhibited more sensitivity in their rankings, and less agreement between the benchmark rankings and the multi-method rankings, than higher-income country groups.
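The core of the multi-method assessment, generating one index and ranking per combination of normalization, weighting, and aggregation method, can be sketched as follows. The original artifact was implemented in Visual Basic for Excel and covered 115 scenarios; this Python fragment uses hypothetical data and only a handful of illustrative methods:

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical sub-indicator matrix: 10 countries x 4 indicators (higher = better).
scores = rng.random((10, 4))

norms = {
    "minmax": lambda x: (x - x.min(0)) / (x.max(0) - x.min(0)),
    "zscore": lambda x: (x - x.mean(0)) / x.std(0),
}
weights = {"equal": np.full(4, 0.25), "skewed": np.array([0.4, 0.3, 0.2, 0.1])}
aggs = {
    "additive": lambda x, w: x @ w,
    # Clip to keep the geometric mean defined after z-score normalization.
    "geometric": lambda x, w: np.prod(np.clip(x, 1e-9, None) ** w, axis=1),
}

# One ranking per methodological scenario, mirroring the multi-method idea.
for (nn, n), (wn, w), (an, a) in itertools.product(
        norms.items(), weights.items(), aggs.items()):
    index = a(n(scores), w)
    ranking = np.argsort(-index)  # best country first
    print(f"{nn}/{wn}/{an}: top country = {ranking[0]}")
```

Comparing how the top positions move across scenarios is exactly the kind of sensitivity evidence the dissertation reports for the Doing Business rankings.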
Abstract:
The purpose of this study was to determine the effects of a computer-based Integrated Learning System (ILS) model used with adult high school students engaged in mathematics activities. The study examined achievement, attitudinal, and behavioral differences between students completing ILS activities in a traditional, individualized format and students working in cooperative learning groups.
Abstract:
Although there are more than 7,000 properties using lodging yield management systems (LYMSs), practitioners and researchers alike have found it difficult to measure their success. Considerable research was performed in the 1980s to develop success measures for information systems in general. In this work, the author develops success measures specifically for LYMSs.
Abstract:
Many systems and applications continuously produce events. These events record the status of a system and trace its behavior, and by examining them system administrators can check for potential problems. If the temporal dynamics of a system are investigated further, underlying patterns can be discovered, and the uncovered knowledge can be leveraged to predict future system behavior or to mitigate potential risks. Moreover, system administrators can use the temporal patterns to set up event management rules that make the system more intelligent. With the popularity of data mining techniques in recent years, these events have gradually become more and more useful. Despite recent advances in data mining, however, its application to system event mining is still at a rudimentary stage. Most existing work focuses on episode mining or frequent pattern discovery; these methods cannot provide a brief yet comprehensible summary that reveals the valuable information from a high-level perspective, and they offer little actionable knowledge to help system administrators better manage their systems. To make better use of the recorded events, more practical techniques are required. From a data mining perspective, three correlated directions are considered helpful for system management: (1) provide concise yet comprehensive summaries of the running status of the systems; (2) make the systems more intelligent and autonomous; (3) effectively detect abnormal system behaviors. Given the richness of event logs, all these directions can be pursued in a data-driven manner; in this way, the robustness of the systems can be enhanced and the goal of autonomous management approached. This dissertation focuses on the foregoing directions, leveraging temporal mining techniques to facilitate system management. More specifically, three concrete topics are discussed: event summarization, resource demand prediction, and streaming anomaly detection. Besides the theoretical contributions, experimental evaluations are presented to demonstrate the effectiveness and efficacy of the corresponding solutions.
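As one concrete example of the streaming anomaly detection direction, the sketch below flags events whose metric deviates sharply from a sliding-window baseline; it is a generic z-score detector over hypothetical event counts, not the dissertation's actual method:

```python
from collections import deque
import math
import random

class StreamingZScoreDetector:
    """Flag values that deviate strongly from a sliding-window baseline."""

    def __init__(self, window: int = 50, threshold: float = 5.0):
        self.buf = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value: float) -> bool:
        anomalous = False
        if len(self.buf) >= 10:  # wait until a minimal baseline exists
            mean = sum(self.buf) / len(self.buf)
            var = sum((v - mean) ** 2 for v in self.buf) / len(self.buf)
            std = math.sqrt(var) + 1e-9
            anomalous = abs(value - mean) / std > self.threshold
        self.buf.append(value)  # the window always absorbs the new value
        return anomalous

random.seed(0)
# Hypothetical per-minute event counts with one abnormal spike at index 60.
stream = [10 + random.gauss(0, 0.5) for _ in range(60)] + [42.0, 10.3]
detector = StreamingZScoreDetector()
print("anomalies at indices:",
      [i for i, v in enumerate(stream) if detector.observe(v)])
```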
Abstract:
A nuclear waste stream is the complete flow of waste material from origin to treatment facility to final disposal. The objective of this study was to design and develop a Geographic Information Systems (GIS) module, using the Google Maps Application Programming Interface (API), for better visualization of nuclear waste streams that identifies and displays various waste stream parameters. A proper display of parameters enables managers at Department of Energy waste sites to visualize the information needed for proper planning of waste transport. The study also developed an algorithm using quadratic Bézier curves to make the map more understandable and usable. Microsoft Visual Studio 2012 and Microsoft SQL Server 2012 were used for the implementation of the project. The study has shown that the combination of several technologies can successfully provide dynamic mapping functionality. Future work should explore further Google Maps API functionality to enhance the visualization of nuclear waste streams.
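The quadratic Bézier construction is standard: B(t) = (1−t)²P₀ + 2(1−t)tP₁ + t²P₂. The sketch below, an illustration rather than the study's code, bows a straight origin-destination route so that overlapping waste-stream routes remain distinguishable; the sampled points could then be drawn as a map polyline:

```python
def quadratic_bezier(p0, control, p2, steps=20):
    """Sample B(t) = (1-t)^2*P0 + 2(1-t)t*P1 + t^2*P2 for t in [0, 1]."""
    pts = []
    for i in range(steps + 1):
        t = i / steps
        a, b, c = (1 - t) ** 2, 2 * (1 - t) * t, t ** 2
        pts.append((a * p0[0] + b * control[0] + c * p2[0],
                    a * p0[1] + b * control[1] + c * p2[1]))
    return pts

def curved_route(origin, dest, bow=0.2):
    """Bow the origin-destination segment by offsetting the control point
    perpendicular to the midpoint, so routes sharing endpoints stay
    visually distinct on the map."""
    mx, my = (origin[0] + dest[0]) / 2, (origin[1] + dest[1]) / 2
    dx, dy = dest[0] - origin[0], dest[1] - origin[1]
    control = (mx - bow * dy, my + bow * dx)  # offset rotated 90 degrees
    return quadratic_bezier(origin, control, dest)

# Hypothetical site coordinates (longitude, latitude); the resulting point
# list could be rendered as a Polyline via the Google Maps API.
route = curved_route((-106.3, 35.9), (-81.1, 33.3))
print(route[0], route[10], route[-1])
```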
Abstract:
The research described here is supported by the award made by the RCUK Digital Economy programme to the dot.rural Digital Economy Research Hub; award reference: EP/G066051/1.
Abstract:
The chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems together with enterprise 2.0. ERP is characterized as achieving efficient business performance by enabling a standardized business process design, but at a cost in operational flexibility. Enterprise 2.0, by contrast, is claimed to support flexible business process management and thus to incorporate informal and less structured interactions. The traditional view, however, is that efficiency and flexibility are incompatible objectives, pursued separately in different organizational environments. Thus an ERP system whose primary objective is improving efficiency and an enterprise 2.0 system whose primary aim is improving flexibility may represent a contradiction, and adopting them simultaneously may carry a high risk of failure. This chapter uses case study analysis to investigate the combined use of ERP and enterprise 2.0 in a single enterprise aiming to improve both efficiency and flexibility in operations. It provides an in-depth analysis of the combination of ERP with enterprise 2.0, grounded in socio-technical information systems management theory, and summarizes the benefits of combining the two systems and how they could contribute to a new generation of business management that unites formal and informal mechanisms. For example, the multiple sites or informal communities of an enterprise could collaborate efficiently on a common platform with a certain level of standardization, while retaining the flexibility to react agilely to internal and external events.
Abstract:
The protection of cyberspace has become one of the highest security priorities of governments worldwide. The EU is no exception, given its rapidly developing cyber security policy. Since the 1990s, three broad areas of policy interest have emerged: cyber-crime, critical information infrastructures, and cyber-defence. One of the main trends cutting across these areas is the importance that the private sector has come to assume within them. In the area of critical information infrastructure protection in particular, the private sector is seen as a key stakeholder, since it currently operates most of the infrastructures concerned. As a result of this operative capacity, the private sector has come to be understood as the expert in network and information systems (NIS) security, whose knowledge is crucial for the regulation of the field. Adopting a Regulatory Capitalism framework, complemented by insights from Network Governance, we can identify the shifting role of the private sector in this field: from a victim in need of protection in the first phase, to a commercial actor bearing responsibility for ensuring network resilience in the second, to an active policy shaper in the third, participating in the regulation of NIS by providing technical expertise. Drawing on these frameworks, we can better understand how private actors are involved in shaping regulatory responses, as well as why they have been incorporated into these regulatory networks.