761 results for Management information systems.
Abstract:
Despite longstanding concern with the dimensionality of the service quality construct as measured by ServQual and IS-ServQual instruments, variations on the IS-ServQual instrument have remained enduringly prominent in both academic research and practice in the IS field. We explain the continuing popularity of the instrument by the salience of the item set for predicting overall customer satisfaction, suggesting that the preoccupation with the dimensions has been a distraction. The implicit mutual exclusivity of the items suggests a more appropriate conceptualization of IS-ServQual as a formative index. This conceptualization resolves the paradox in IS-ServQual research: how an instrument with such well-known and well-documented weaknesses continues to be highly influential and widely used by academics and practitioners. A formative conceptualization acknowledges and addresses the criticisms of IS-ServQual, while simultaneously explaining its enduring salience by focusing on the items rather than the “dimensions.” By employing an opportunistic sample and adopting the most recent IS-ServQual instrument published in a leading IS journal (virtually any valid IS-ServQual sample in combination with a previously tested instrument variant would suffice for study purposes), we demonstrate that when re-specified as both first-order and second-order formatives, IS-ServQual has good model quality metrics and high predictive power on customer satisfaction. We conclude that this formative specification has greater practical utility and is more defensible theoretically.
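To make the formative re-specification concrete, the following is a minimal sketch, not the authors' PLS-SEM analysis: it treats a handful of hypothetical IS-ServQual items as formative indicators whose weights are estimated from their joint contribution to overall satisfaction, and reports the predictive power (R^2) of the resulting composite. Item names, weights, and data are illustrative assumptions.

```python
# Minimal sketch (not the authors' analysis): treating IS-ServQual items as
# formative indicators of a composite index that predicts overall satisfaction.
# Item names and data are hypothetical; a full analysis would use PLS-SEM.
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical responses to ServQual-style items (1-7 Likert scale).
items = rng.integers(1, 8, size=(n, 5)).astype(float)
satisfaction = items @ np.array([0.4, 0.3, 0.1, 0.1, 0.1]) + rng.normal(0, 1, n)

# Formative logic: item weights are estimated from their joint contribution to
# the outcome (multiple regression), not from inter-item correlations.
X = np.column_stack([np.ones(n), items])
weights, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
index = items @ weights[1:]                     # first-order formative index

# Predictive power of the composite on satisfaction (R^2).
resid = satisfaction - (weights[0] + index)
r2 = 1 - resid.var() / satisfaction.var()
print("item weights:", np.round(weights[1:], 3), " R^2:", round(r2, 3))
```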
Abstract:
The process view concept deploys a partial and temporal representation to adjust the visible view of a business process according to the various perception constraints of users. Process view technology is of practical use for privacy protection and authorization control in process-oriented business management. Owing to complex organizational structures, it is challenging for large companies to accurately specify the diverse perceptions of different users over business processes. To tackle this issue, this article presents a role-based process view model that incorporates role dependencies into process view derivation. Compared to existing process view approaches, ours particularly supports runtime updates to the process view perceivable to a user through specific view merging operations, thereby enabling dynamic tracing of process perception. A series of rules and theorems is established to guarantee the structural consistency and validity of process view transformation. A hypothetical case is presented to illustrate the feasibility of our approach, and a prototype is developed for proof-of-concept purposes.
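The sketch below illustrates the basic intuition, under stated assumptions, and is not the paper's formal model: a role-specific process view is derived by filtering tasks on role permissions, and a view merging operation extends the visible task set at runtime. Task and role names are hypothetical.

```python
# Minimal sketch (assumptions only, not the paper's formal model): derive a
# role-specific process view by filtering tasks on role permissions, and
# update the view at runtime by merging newly permitted tasks.
from dataclasses import dataclass, field

@dataclass
class ProcessView:
    role: str
    visible_tasks: set = field(default_factory=set)

    def merge(self, newly_permitted: set) -> None:
        """Runtime view-merging operation: extend what the role may perceive."""
        self.visible_tasks |= newly_permitted

def derive_view(process_tasks: dict, role: str) -> ProcessView:
    """Keep only tasks whose permission set includes the given role."""
    visible = {t for t, roles in process_tasks.items() if role in roles}
    return ProcessView(role=role, visible_tasks=visible)

# Hypothetical order-handling process: task -> roles allowed to perceive it.
process = {
    "receive_order":  {"clerk", "manager"},
    "check_credit":   {"manager"},
    "ship_goods":     {"clerk", "manager"},
    "approve_refund": {"manager"},
}

view = derive_view(process, "clerk")
print(sorted(view.visible_tasks))      # ['receive_order', 'ship_goods']

# Runtime change: the clerk role is granted perception of credit checks.
view.merge({"check_credit"})
print(sorted(view.visible_tasks))      # ['check_credit', 'receive_order', 'ship_goods']
```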
Abstract:
Background: The aging population is placing increasing demands on surgical services, simultaneously with a decreasing supply of professional labor and a worsening economic situation. Under growing financial constraints, successful operating room management will be one of the key issues in the struggle for technical efficiency. This study focused on several issues affecting operating room efficiency. Materials and methods: The current formal operating room management in Finland and the use of performance metrics and information systems used to support this management were explored using a postal survey. We also studied the feasibility of a wireless patient tracking system as a tool for managing the process. The reliability of the system as well as the accuracy and precision of its automatically recorded time stamps were analyzed. The benefits of a separate anesthesia induction room in a prospective setting were compared with the traditional way of working, where anesthesia is induced in the operating room. Using computer simulation, several models of parallel processing for the operating room were compared with the traditional model with respect to cost-efficiency. Moreover, international differences in operating room times for two common procedures, laparoscopic cholecystectomy and open lung lobectomy, were investigated. Results: The managerial structure of Finnish operating units was not clearly defined. Operating room management information systems were found to be out-of-date, offering little support to online evaluation of the care process. Only about half of the information systems provided information in real time. Operating room performance was most often measured by the number of procedures in a time unit, operating room utilization, and turnover time. The wireless patient tracking system was found to be feasible for hospital use. Automatic documentation of the system facilitated patient flow management by increasing process transparency via more available and accurate data, while lessening work for staff. Any parallel work flow model was more cost-efficient than the traditional way of performing anesthesia induction in the operating room. Mean operating times for two common procedures differed by 50% among eight hospitals in different countries. Conclusions: The structure of daily operative management of an operating room warrants redefinition. Performance measures as well as information systems require updating. Parallel work flows are more cost-efficient than the traditional induction-in-room model.
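As a rough illustration of the parallel-workflow argument (with made-up durations, not the study's data or simulation model), the following sketch compares how long the operating room is blocked per day when induction is performed inside the OR versus overlapped in a separate induction room.

```python
# Illustrative sketch only (durations are hypothetical, not the study's data):
# compare operating-room occupancy per day when anaesthesia induction is done
# inside the OR (traditional) versus in a separate induction room, overlapped
# with the previous case's closing/turnover.
induction, surgery, turnover = 20, 90, 15   # minutes, hypothetical averages
cases_per_day = 5

# Traditional: the OR is blocked for induction + surgery + turnover per case.
traditional_or_time = cases_per_day * (induction + surgery + turnover)

# Parallel induction: induction overlaps with the previous case, so after the
# first case the OR is blocked only for surgery + turnover.
parallel_or_time = induction + cases_per_day * (surgery + turnover)

print("traditional OR minutes:", traditional_or_time)   # 625
print("parallel    OR minutes:", parallel_or_time)      # 545
print("minutes freed per day:", traditional_or_time - parallel_or_time)
```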
Abstract:
The NK model, proposed by Kauffman (1993), is a powerful simulation framework for studying competitive dynamics. It has been applied in several social science fields, for instance organization science. However, like many other simulation methods, the NK model has not received much attention from the Management Information Systems (MIS) discipline. This tutorial therefore introduces the NK model in a simple way and aims to encourage related studies. To demonstrate how the NK model works, the tutorial reproduces several of Levinthal's (1997) experiments. In addition, it attempts to clarify the relationship between the NK model and agent-based modeling (ABM). This relationship can serve as a theoretical basis for further developing the NK model framework for other research scenarios. For example, the tutorial provides an NK model solution for studying the IT value co-creation process by extending the network structure and agent interactions.
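A minimal sketch of the NK landscape idea follows, using a common textbook formulation rather than the tutorial's own code: N binary decision elements, each with K epistatic partners and a random contribution table, explored by single-bit hill climbing in the spirit of Levinthal (1997).

```python
# Minimal NK-landscape sketch (a common textbook formulation; assumptions only):
# N binary decision elements, each element's fitness contribution depends on
# itself and K other elements; an agent adapts by single-bit hill climbing.
import itertools
import random

N, K = 6, 2
random.seed(1)

# For each element, choose K interaction partners and a random contribution table.
partners = [random.sample([j for j in range(N) if j != i], K) for i in range(N)]
tables = [{bits: random.random() for bits in itertools.product((0, 1), repeat=K + 1)}
          for _ in range(N)]

def fitness(config):
    """Average contribution over the N elements given their epistatic partners."""
    return sum(tables[i][(config[i],) + tuple(config[j] for j in partners[i])]
               for i in range(N)) / N

def hill_climb(config):
    """Flip one bit at a time while it improves fitness (local adaptation)."""
    improved = True
    while improved:
        improved = False
        for i in range(N):
            candidate = list(config)
            candidate[i] ^= 1
            if fitness(tuple(candidate)) > fitness(config):
                config, improved = tuple(candidate), True
    return config

start = tuple(random.randint(0, 1) for _ in range(N))
local_peak = hill_climb(start)
print(round(fitness(start), 3), "->", round(fitness(local_peak), 3))
```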
Abstract:
A key trait of Free and Open Source Software (FOSS) development is its distributed nature. Nevertheless, two project-level operations, the fork and the merge of program code, are among the least well understood events in the lifespan of a FOSS project. Some projects have explicitly adopted these operations as the primary means of concurrent development. In this study, we examine the effect of highly distributed software development, as found in the Linux kernel project, on the collection and modelling of software development data. We find that distributed development calls for sophisticated temporal modelling techniques where several versions of the source code tree can exist at once. Attention must be turned towards the methods of quality assurance and peer review that projects employ to manage these parallel source trees. Our analysis indicates that two new metrics, fork rate and merge rate, could be useful for determining the role of distributed version control systems in FOSS projects. The study presents a preliminary data set consisting of version control and mailing list data.
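One hypothetical operationalization of the proposed metrics (not necessarily the study's definition) is sketched below: it counts merge commits (more than one parent) and branch points (commits with more than one child) in a local Git history and normalizes by the total number of commits.

```python
# Hypothetical operationalization (not necessarily the study's definition):
# estimate "merge rate" and "fork rate" from a local Git repository by counting
# merge commits (>1 parent) and branch points (>1 child), over total commits.
import subprocess
from collections import Counter

def fork_and_merge_rate(repo_path="."):
    # `git rev-list --all --parents` prints: <commit> <parent1> [<parent2> ...]
    out = subprocess.run(["git", "rev-list", "--all", "--parents"],
                         cwd=repo_path, capture_output=True, text=True,
                         check=True).stdout
    lines = [l.split() for l in out.splitlines() if l]
    total = len(lines)
    merges = sum(1 for fields in lines if len(fields) > 2)       # >1 parent
    child_count = Counter(p for fields in lines for p in fields[1:])
    forks = sum(1 for c in child_count.values() if c > 1)        # >1 child
    return forks / total, merges / total

if __name__ == "__main__":
    # Requires running inside (or pointing at) a Git working copy.
    fork_rate, merge_rate = fork_and_merge_rate(".")
    print(f"fork rate: {fork_rate:.3f}  merge rate: {merge_rate:.3f}")
```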
Abstract:
This paper reviews the new approaches to business management based on process management and risk management, and their influence on the future of economic and financial information, which will lead to changes accelerated by the most advanced computer applications for accounting and by auditing practice, since in the daily practice of companies, auditing as well as accounting information and information systems already embrace this process-based approach to accounting. Consequently, the evolution of information systems encourages, if not obliges, the business accounting professional to adopt a view of accounting aligned with these new process-based perspectives, giving rise to new computer applications for accounting.
Abstract:
This is a descriptive, qualitative case study whose objective is to analyze the contracting model of a public hospital unit. In the context of the evolution of intergovernmental relations in health, we assessed the degree of correspondence between the hospital services and actions offered in the municipality and the need to implement them within the municipal planning proposal, guided by the Municipal Health Plan. To build the theoretical framework, topics such as the contracting process of the Unified Health System (SUS), the hospital care model in Brazil, health care networks, and interfederative management mechanisms and relations were examined in depth. The municipal and regional scenarios are described to contextualize the implementation of the hospital unit. Public management information systems were studied, including the national registry of health establishments (CNES), hospital information systems, the morbidity and mortality information system, and the municipal and state Health Plans. Finally, the management challenges of implementing the new contract model in the face of financing difficulties are presented. Rethinking the service contracting model is believed to imply ensuring correspondence between health services and the health care outcomes of the user population.
Abstract:
Building on recent developments in mixed methods, we discuss the methodological implications of critical realism (CR) and explore how these can guide dynamic mixed-methods research design in information systems. Specifically, we examine the core ontological assumptions of CR in order to gain some perspective on key epistemological issues such as causation and validity, and we illustrate how these shape our logic of inference in the research process through what is known as retroduction. We demonstrate the value of a CR-led mixed-methods research approach by drawing on a study that examines the impact of ICT adoption in the financial services sector. In doing so, we provide insight into the interplay between qualitative and quantitative methods and the particular value of applying mixed methods guided by CR methodological principles. Our positioning of demi-regularities within the process of retroduction contributes a distinctive development in this regard. We argue that such a research design enables us to better address issues of validity and the development of more robust meta-inferences.
Abstract:
The Xinli mine area of the Sanshandao mine is adjacent to the Bohai Sea, and its main exploitable ore deposit occurs in the undersea rock mass. Since entering production, the mine has been the largest undersea gold mine in China. The mine area faces a latent danger of water bursting, or even sudden seawater inrush. There is so far no mature experience with undersea mining in China. The vein ore deposit is located in the lower wall of a fault; its possible groundwater sources mainly include bittern, Quaternary pore water, and modern seawater. To ensure the safety of undersea mining, surveying the flooding conditions of the ore deposit with proper measures and studying the potential seawater inrush pattern are the key technical problems. Taking the Xinli mine area as a case study, its engineering geological conditions are surveyed in situ, the regional structural pattern and rock mass framework characteristics are identified, the distribution of the structural planes is modeled by a Monte Carlo method, and the connectivity coefficients of rock mass structural planes are calculated. The regional hydro-geological conditions are analyzed; in-situ hydro-geological investigation and sampling are performed in detail; hydrochemistry and isotope testing and groundwater dynamic monitoring are conducted; the recharge, runoff, and discharge conditions are specified; and the sources of flooding are distinguished. Some indices are selected from the testing results to calculate the proportion of each source at individual water discharge points and in the whole water discharge of the Xinli mine area. The temporal and spatial variations of each water source in the overall flooding of the ore deposit are analyzed. Given the particular project conditions of the Xinli mine area, the permeability coefficient tensors of the rock mass are calculated based on a fracture geometry measurement method, and a modified synthetic permeability coefficient is calculated in terms of the connectivity and a few hydraulic test results. The hydro-geological conceptual and mathematical models are established, and the water yield of the mine is predicted using the Visual Modflow code. The distribution of surrounding rock mass deformation and secondary stress is studied by numerical analysis, and the intrinsic mechanism of fault slip caused by excavation of the ore deposit is analyzed. The results show that the development of surrounding rock mass deformation and secondary stress for a vein ore deposit in the lower wall of a fault differs from that in a thick, large ore deposit. The secondary stress caused by excavating a vein ore deposit in the lower wall of a fault is mainly distributed in the upper wall of the fault, and a single surface subsidence center will occur. The influence of the fault on rock mass movement, secondary stress, and hydro-geological structures is analyzed: the secondary stress is blocked by the fault and tensile stress concentrations occur in the rock mass near it, the original water-blocking structure is destroyed and the permeable structure is reconfigured, and the primary structural planes begin to expand and new fissures form, so the permeability of the original permeable structure is greatly enhanced and water bursting is likely to occur. Based on this knowledge, the possible water inrush pattern and location in the Xinli mine area are predicted. Several computer programs are developed using an object-oriented design method on the Visual Studio .NET development platform.
These programs include a Monte Carlo simulation procedure, a joint diagramming procedure, a structural plane connectivity coefficient calculation procedure, a permeability tensor calculation procedure, and a procedure for editing water chemical formulas and calculating water source mixing conditions. A new computer mapping algorithm for joint iso-density diagrams is proposed. Drawing on the powerful spatial data management and graphical functions of Geographic Information Systems, a management information system for the dynamic monitoring data of pit water discharge is established with ArcView.
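As an illustration of how a permeability tensor can be assembled from fracture-set geometry, the sketch below uses a classical parallel-plate (Snow-type) approximation with hypothetical aperture, spacing, and orientation values; the thesis's modified synthetic permeability coefficient additionally corrects for connectivity and hydraulic test results, which this sketch omits.

```python
# Illustrative sketch only: a classical parallel-plate (Snow-type) estimate of
# the rock-mass intrinsic permeability tensor from fracture-set geometry.
# All numerical values are hypothetical.
import numpy as np

def set_normal(dip_deg, dip_dir_deg):
    """Unit normal of a fracture set from dip / dip direction (degrees)."""
    dip, ddir = np.radians(dip_deg), np.radians(dip_dir_deg)
    return np.array([np.sin(dip) * np.sin(ddir),
                     np.sin(dip) * np.cos(ddir),
                     np.cos(dip)])

def permeability_tensor(fracture_sets):
    """Sum b^3/(12 s) * (I - n n^T) over sets: aperture b [m], spacing s [m]."""
    k = np.zeros((3, 3))
    for aperture, spacing, dip, dip_dir in fracture_sets:
        n = set_normal(dip, dip_dir)
        k += (aperture ** 3 / (12.0 * spacing)) * (np.eye(3) - np.outer(n, n))
    return k  # intrinsic permeability [m^2]; multiply by rho*g/mu for conductivity

# Hypothetical fracture sets: (aperture [m], spacing [m], dip, dip direction).
sets = [(2e-4, 0.5, 70.0, 120.0),
        (1e-4, 0.8, 85.0, 210.0)]
print(np.round(permeability_tensor(sets), 12))
```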
Abstract:
The healthcare industry is beginning to appreciate the benefits which can be obtained from using Mobile Health Systems (MHS) at the point-of-care. As a result, healthcare organisations are investing heavily in mobile health initiatives with the expectation that users will employ the system to enhance performance. Despite widespread endorsement and support for the implementation of MHS, empirical evidence surrounding the benefits of MHS remains to be fully established. For MHS to be truly valuable, it is argued that the technological tool must be infused within healthcare practitioners' work practices and used to its full potential in post-adoptive scenarios. Yet, there is a paucity of research focusing on the infusion of MHS by healthcare practitioners. In order to address this gap in the literature, the objective of this study is to explore the determinants and outcomes of MHS infusion by healthcare practitioners. This research study adopts a post-positivist theory-building approach to MHS infusion. Existing literature is utilised to develop a conceptual model by which the research objective is explored. Employing a mixed-method approach, this conceptual model is first advanced through a case study in the UK whereby propositions established from the literature are refined into testable hypotheses. The final phase of this research study involves the collection of empirical data from a Canadian hospital, which supports the refined model and its associated hypotheses. The results from both phases of data collection are employed to develop a model of MHS infusion. The study contributes to IS theory and practice by: (1) developing a model with six determinants (Availability, MHS Self-Efficacy, Time-Criticality, Habit, Technology Trust, and Task Behaviour) and individual performance-related outcomes of MHS infusion (Effectiveness, Efficiency, and Learning), (2) examining undocumented determinants and relationships, (3) identifying prerequisite conditions that both healthcare practitioners and organisations can employ to assist with MHS infusion, (4) developing a taxonomy that provides conceptual refinement of IT infusion, and (5) informing healthcare organisations and vendors as to the performance of MHS in post-adoptive scenarios.
Abstract:
The Data-Information-Knowledge (DIK) chain, also known as the "Information Hierarchy" or "Knowledge Pyramid," is one of the most important models in Information Management and Knowledge Management. In general, the chain has been structured as an architecture in which each element is built upon the element immediately below it; however, there is no consensus on the definition of the elements, nor on the processes that transform an element at one level into an element at the next level. This article reviews the Data-Information-Knowledge chain, examining the most relevant definitions of its elements and of their articulation in the literature in order to synthesize the most common readings. The elements of the DIK chain are analyzed from the perspective of Peirce's semiotics, an approach that allows us to clarify their meanings and to identify the differences, relations, and roles they play in the chain from a pragmatist point of view. Finally, a definition of the DIK chain is proposed, grounded in Peirce's triadic categories of signs and unlimited semiosis, Stamper's levels of sign systems, and Zeleny's metaphors.