991 results for IS function


Relevance:

100.00%

Publisher:

Abstract:

While the IS function has gained widespread attention for over two decades, there is little consensus among information systems (IS) researchers and practitioners on how best to evaluate the IS function's support performance. This paper reports on preliminary findings of a larger research effort that proceeds from a central interest in the importance of evaluating the IS function's support in organisations. This study is the first to re-conceptualise the evaluation of the IS function's support as a multi-dimensional formative construct. We argue that a holistic measure for evaluating the IS function's support should consist of dimensions that together assess the variety of the support functions and the quality of the support services provided to end-users. Thus, the proposed model consists of two halves, "Variety" and "Quality", within which reside seven dimensions. The Variety half includes five dimensions: Training; Documentation; Data-related Support; Software-related Support; and Hardware-related Support. The Quality half includes two dimensions: IS Support Staff and Support Services Performance. The proposed model is derived using a directed content analysis of 83 studies from top IS outlets, employing the characteristics of analytic theory and consistent with formative construct development procedures.

Relevance:

100.00%

Publisher:

Abstract:

With the growing importance of IS for organizations and the continuous stream of new IT developments, the IS function in organizations becomes more critical but is also more challenged. For example, how should the IS function deal with the consumerization of IT, and what is the added value of the IS function when it comes to SaaS and the Cloud? In this paper we argue that IS research is in need of a dynamic perspective on the IS function. The IS function has to become more focused on building and adapting IS capabilities in a changing environment. We discuss that there has so far been an overreliance on the Resource-Based View for understanding the IS function and its capabilities, and we introduce Dynamic Capabilities Theory as an additional theoretical perspective, one that has received only limited attention in the IS literature to date. We present a first conceptualization of the dynamic IS function and discuss IS capabilities frameworks and individual IS capabilities from a dynamic perspective. These initial insights demonstrate the contribution of a dynamic perspective on the IS function itself.

Relevance:

70.00%

Publisher:

Abstract:

In a competitive environment, companies continuously innovate to offer superior services at lower costs. ‘Shared Services’ have been extensively adopted in practice as a means of improving organizational performance. Shared Services are considered most appropriate for support functions and are widely adopted in human resource management, finance and accounting, and more recently for the information systems (IS) function. As computer-based corporate information systems have become de facto the backbone of administrative systems, the technical impediments to sharing have come down dramatically. As this trend continues, CIOs and IT professionals need a deeper understanding of the Shared Services phenomenon. Yet analysis of the IS academic literature reveals that Shared Services, though mentioned in more than 100 articles, has received little in-depth attention. This paper investigates the current status of Shared Services in the IS literature. The authors present a detailed review of literature from the main IS journals and conferences. The paper concludes with a tentative operational definition, a list of the perceived main objectives of Shared Services, and an agenda for related future research.

Relevance:

70.00%

Publisher:

Abstract:

This study is motivated by, and proceeds from, a central interest in the importance of evaluating IS service quality and adopts the IS ZOT SERVQUAL instrument (Kettinger & Lee, 2005) as its core theory base. This study conceptualises IS service quality as a multidimensional formative construct and seeks to answer the main research question: “Is the IS service quality construct valid as a 1st-order formative, 2nd-order formative multidimensional construct?” Additionally, with the aim of validating the IS service quality construct within its nomological net, as in prior service marketing work, Satisfaction was hypothesised as its immediate consequence. To test the above research question, IS service quality and Satisfaction were operationalised in a quantitative survey instrument. Partial least squares (PLS) analysis, employing 219 valid responses, largely evidenced the validity of IS service quality as a multidimensional formative construct. The nomological validity of the IS service quality construct was also evidenced by demonstrating that 55% of the variance in Satisfaction was explained by the multidimensional formative IS service quality construct.
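
A minimal numerical illustration of the kind of variance-explained check reported above, assuming hypothetical indicator data and formative weights; this is a simplified composite-plus-regression sketch, not the study's actual PLS estimation, and every variable name, weight and coefficient is an assumption.

```python
# Simplified sketch (not the study's PLS procedure): score a formative
# "IS service quality" composite from hypothetical indicator data and check
# how much variance in Satisfaction it explains (R^2).
# All indicators, weights and coefficients below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 219                                         # sample size reported above

# Hypothetical indicators for the formative dimensions; weights are assumed.
indicators = rng.normal(size=(n, 4))
weights = np.array([0.35, 0.30, 0.20, 0.15])    # assumed formative weights
service_quality = indicators @ weights          # composite score

# Satisfaction generated as a noisy consequence of the composite, mimicking
# the nomological link hypothesised in the study.
satisfaction = 0.7 * service_quality + rng.normal(scale=0.6, size=n)

model = LinearRegression().fit(service_quality.reshape(-1, 1), satisfaction)
r2 = model.score(service_quality.reshape(-1, 1), satisfaction)
print(f"R^2 of Satisfaction explained by the composite: {r2:.2f}")
```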

Relevance:

70.00%

Publisher:

Abstract:

Shared services have gained significance as an organizational arrangement, in particular for support functions, to reduce costs, increase quality and create new capabilities. The Information Systems (IS) function is amenable to sharing arrangements, and information systems can enable sharing in other functional areas. However, despite being a promising area for IS research, literature on shared services in the IS discipline is scarce and scattered. There is still little consensus on what shared services are. Moreover, a thorough understanding of why shared services are adopted, who is involved, and how things are shared is lacking. In this article, we set out to progress IS research on shared services by establishing a common ground for future research and proposing a research agenda to shape the field, based on an analysis of the IS literature. We present a holistic and inclusive definition, discuss the primacy of economic-strategic objectives so far, and introduce conceptual frameworks for stakeholders and for the notion of sharing. We also provide an overview of the theories and research methods applied. We propose a research agenda that addresses fundamental issues related to objectives, stakeholders, and the notion of sharing, to lay the foundation for taking IS research on shared services forward.

Relevance:

70.00%

Publisher:

Abstract:

IT consumerization is both a major opportunity and a significant challenge for organizations. However, IS research has so far hardly discussed the implications for IT management. In this paper we address this topic by empirically identifying organizational themes for IT consumerization and conceptually exploring the direct and indirect effects on the business value of IT, IT capabilities, and the IT function. More specifically, based on two case studies, we identify eight organizational themes: consumer IT strategy, policy development and responsibilities, consideration of employees' private lives, user involvement in IT-related processes, individualization, updated IT infrastructure, end-user support, and data and system security. The contributions of this paper are: (1) the identification of organizational themes for IT consumerization; (2) the proposed effects on the business value of IT, IT capabilities and the IT function; and (3) the combination of empirical insights into IT consumerization with managerial theories in the IS discipline.

Relevance:

60.00%

Publisher:

Abstract:

This paper illustrates a wavelet-coefficient-based approach, using experiments, to understand the sensitivity of ultrasonic signals to parametric variation of a crack configuration in a metal plate. A PZT patch sensor/actuator system integrated with a metal plate containing a through-thickness crack is used. The proposed approach uses piezoelectric patches, which can both actuate and sense the ultrasonic signals. While this approach leads to more flexibility and reduced cost for larger scalability of the sensor/actuator network, the complexity of the signals increases compared to what is encountered in conventional ultrasonic NDE problems using selective wave modes. A Damage Index (DI) has been introduced, which is a function of the wavelet coefficients. Experiments have been carried out for various crack sizes, crack orientations and band-limited tone-burst signals generated through an FIR filter. For a 1 cm long crack interrogated with a 20 kHz tone-burst signal, the Damage Index (DI) for the horizontal crack orientation increases by about 70% with respect to that for the 135-degree-oriented crack and by about 33% with respect to the vertically oriented crack. The detailed results reported in this paper are a step forward in developing computational schemes for parametric identification of damage using sensor/actuator networks and ultrasonic waves.
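
A minimal sketch of how a wavelet-coefficient-based Damage Index of this kind could be computed with PyWavelets; the abstract does not give the exact DI formula, so the energy-ratio definition, the wavelet family (db4), the decomposition level and the synthetic tone-burst signals below are all assumptions.

```python
# Minimal sketch of a wavelet-coefficient Damage Index (DI).
# The DI is described only as "a function of wavelet coefficients"; the
# energy-ratio formula below is an assumed, illustrative definition.
import numpy as np
import pywt

def wavelet_energy(signal, wavelet="db4", level=4):
    """Total energy of the detail coefficients of a 1-D signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return sum(float(np.sum(c ** 2)) for c in coeffs[1:])   # skip approximation

def damage_index(baseline, damaged, wavelet="db4", level=4):
    """Relative change in detail-coefficient energy (assumed DI definition)."""
    e0 = wavelet_energy(baseline, wavelet, level)
    e1 = wavelet_energy(damaged, wavelet, level)
    return abs(e1 - e0) / e0

# Synthetic 20 kHz tone-burst responses (illustrative only).
fs, f0 = 1_000_000, 20_000                      # sample rate, burst frequency
t = np.arange(0, 2e-3, 1 / fs)
burst = np.sin(2 * np.pi * f0 * t) * np.hanning(t.size)
baseline = burst + 0.01 * np.random.default_rng(1).normal(size=t.size)
damaged = 0.8 * burst + 0.01 * np.random.default_rng(2).normal(size=t.size)

print(f"DI = {damage_index(baseline, damaged):.3f}")
```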

Relevance:

60.00%

Publisher:

Abstract:

Following the Introduction, which surveys the existing literature on technology advances and regulation in telecommunications and on two-sided markets, we address specific issues in the industries of the New Economy, which are characterised by the existence of network effects. We seek to explore how each of these industries works, identify potential market failures and find new solutions at the economic regulation level that promote social welfare. In Chapter 1 we analyze a regulatory issue on access prices and investments in the telecommunications market. The existing literature on access prices and investment has pointed out that networks underinvest under a regime of mandatory access provision with a fixed access price per end-user. We propose a new access pricing rule, the indexation approach, i.e., the access price, per end-user, that network i pays to network j is a function of the investment levels set by both networks. We show that indexation can enhance economic efficiency beyond what is achieved with a fixed access price. In particular, access price indexation can simultaneously induce lower retail prices and higher investment and social welfare compared to fixed access pricing or a regulatory-holidays regime. Furthermore, we provide sufficient conditions under which indexation can implement the socially optimal investment or the Ramsey solution, which would be impossible to obtain under fixed access pricing. Our results contradict the notion that investment efficiency must be sacrificed for gains in pricing efficiency. In Chapter 2 we investigate the effect of regulations that limit advertising airtime on advertising quality and on social welfare. We show, first, that advertising time regulation may reduce the average quality of advertising broadcast on TV networks. Second, an advertising cap may reduce media platforms' and firms' profits, while the net effect on viewers' (subscribers') welfare is ambiguous, because the ad quality reduction resulting from a regulatory cap offsets the subscribers' direct gain from watching fewer ads. We find that if subscribers are sufficiently sensitive to ad quality, i.e., the ad quality reduction outweighs the direct effect of the cap, a cap may reduce social welfare. The welfare results suggest that a regulatory authority trying to increase welfare via regulation of the volume of advertising on TV may also need to regulate advertising quality or, if regulating quality proves impractical, take the effect of advertising quality into consideration. In Chapter 3 we investigate the rules that govern Electronic Payment Networks (EPNs). In EPNs the No-Surcharge Rule (NSR) requires that merchants charge at most the same amount for a payment card transaction as for cash. In this chapter, we analyze a three-party model (consumers, merchants, and a proprietary EPN) with endogenous transaction volumes and heterogeneous merchants' transactional benefits of accepting cards to assess the welfare impacts of the NSR. We show that, if merchants are local monopolists and the network externalities from merchants to cardholders are sufficiently strong, all agents except the EPN will be worse off with the NSR, and therefore the NSR is socially undesirable. The positive role of the NSR in terms of improving retail price efficiency for cardholders is also highlighted.
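
Purely as an illustration of the indexation idea described for Chapter 1, the toy function below makes the per-end-user access price depend on both networks' investment levels; the linear functional form and all coefficients are assumptions, not the thesis' model.

```python
# Toy illustration only: one possible functional form for the "indexation"
# access-pricing rule sketched above, where the per-end-user access price
# that network i pays network j depends on both networks' investments.
# The linear form and the coefficients are assumptions, not the thesis model.
def indexed_access_price(base_price, inv_i, inv_j, beta=0.1):
    """Access price falls as either network invests more (assumed rule)."""
    return max(0.0, base_price - beta * (inv_i + inv_j))

print(indexed_access_price(base_price=2.0, inv_i=3.0, inv_j=5.0))  # 1.2
```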

Relevance:

60.00%

Publisher:

Abstract:

One of the major challenges of MIS activities is the difficulty of measuring the effectiveness of delivered systems. The principal purpose of my research is to explore this field in order to develop an instrument by which to measure such effectiveness. Conceptualisation of Information System (IS) Effectiveness has been substantially framed by DeLone and McLean's (1992) Success Model. But with the innovation in Information Technology (IT) over the past decade, and the constant pressure on IT to improve performance, there is merit in undertaking a fresh appraisal of the issue. This study built on the model of IS Success developed by DeLone and McLean, but was broadened to include related research from the domains of IS, Management and Marketing. This analysis found that an effective IS function is built on three pillars: the systems implemented; the information held and delivered by these systems; and the service provided in support of the IS function. A common foundation for these pillars is the concept of stakeholder needs. In seeking to appreciate the effectiveness of delivered IS applications in relation to the job performance of stakeholders, this research developed an understanding of what quality means in an IT context. I argue that quality is a more useful criterion for effectiveness than the more customary measures of use and user satisfaction. A respecification of the IS Success Model was then proposed. The second phase of the research was to test this model empirically through judgment panels, focus groups and interviews. Results consistently supported the structure and components of the respecified model. Quality was determined to be a multi-dimensional construct, with the key dimensions for the quality of delivered IS differing from those used in research from other disciplines. Empirical work indicated that end-user stakeholders derived their evaluations of quality by internally comparing the perceived performance of delivered IS with their expectations for such performance. A short trial explored whether, when overt measurement of expectations was concurrent with the measurement of perceptions, a more revealing appraisal of delivered IS quality was provided than when perceptions alone were measured. Results revealed a difference between the two measures. Using the New IS Success Model as the foundation, and drawing upon the related theoretical and empirical research, an instrument was developed to measure the quality/effectiveness of delivered IS applications. Four trials of this instrument, QUALIT, are documented. Analysis of results from preliminary trials indicates promise in terms of business value: the instrument is simple to administer and has the capacity to pinpoint areas of weakness. The research related to the respecification of the New IS Success Model and the associated empirical studies, including the development of QUALIT, have both contributed to the development of theory about IS Effectiveness. More precisely, my research has reviewed the components of an information system, the dimensions comprising these components and the indicators of each, and, based upon these findings, formulated an instrument by which to measure the effectiveness of a delivered IS.
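
A minimal sketch of the perceptions-versus-expectations comparison described above, computed as simple gap scores per dimension; the dimension names, scale values and scoring rule are illustrative assumptions and do not reproduce the QUALIT instrument.

```python
# Minimal sketch of gap scoring (perception minus expectation) per quality
# dimension, in the spirit of the comparison described above.
# Dimension names and ratings are illustrative assumptions, not QUALIT items.
perceptions  = {"system quality": 5.1, "information quality": 4.3, "service quality": 3.8}
expectations = {"system quality": 5.5, "information quality": 4.0, "service quality": 4.6}

gaps = {dim: perceptions[dim] - expectations[dim] for dim in perceptions}
for dim, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    flag = "area of weakness" if gap < 0 else "meets/exceeds expectations"
    print(f"{dim}: gap = {gap:+.1f} ({flag})")
```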

Relevance:

60.00%

Publisher:

Abstract:

A three-dimensional, time-dependent hydrodynamic and heat transport model of Lake Binaba, a shallow and small dam reservoir in Ghana, has been developed, emphasizing the simulation of its dynamics and thermal structure. Most numerical studies of temperature dynamics in reservoirs are based on one- or two-dimensional models, which are not applicable to reservoirs characterized by complex flow patterns and unsteady heat exchange between the atmosphere and the water surface. Continuity, momentum and temperature transport equations have been solved. Proper assignment of boundary conditions, especially surface heat fluxes, has been found crucial in simulating the lake's hydrothermal dynamics. The model is based on the Reynolds-Averaged Navier-Stokes equations, using a Boussinesq approach, with a standard k − ε turbulence closure to solve the flow field. The thermal model includes a heat source term that takes into account the short-wave radiation as well as heat convection at the free surface, which is a function of air temperature, wind velocity and the stability conditions of the atmospheric boundary layer over the water surface. The governing equations of the model have been solved with OpenFOAM, an open source, freely available CFD toolbox. At its core, OpenFOAM has a set of efficient C++ modules that are used to build solvers. It uses collocated, polyhedral numerics that can be applied on unstructured meshes and can be easily extended to run in parallel. A new solver has been developed to solve the hydrothermal model of the lake. The simulated temperature was compared against a 15-day field data set. Simulated and measured temperature profiles at the probe locations show reasonable agreement. The model might also be able to compute the total heat storage of water bodies in order to estimate evaporation from the water surface.
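
A minimal sketch of a free-surface heat-flux term of the kind described above, combining short-wave input with a wind- and temperature-dependent sensible-heat exchange under a standard bulk formulation; the coefficients, the neutral-stability assumption and the omission of the latent-heat term are simplifications, not necessarily the parameterisation used in the Lake Binaba model.

```python
# Minimal sketch of a bulk surface heat-flux balance for a water surface:
# absorbed short-wave radiation plus net long-wave plus a wind- and
# temperature-dependent sensible-heat flux. Latent (evaporative) heat is
# omitted for brevity; all coefficients below are assumed, not the paper's.
RHO_AIR = 1.2        # air density, kg/m^3
CP_AIR = 1005.0      # specific heat of air, J/(kg K)
C_H = 1.3e-3         # bulk transfer coefficient (assumed, neutral stability)
ALBEDO = 0.06        # water-surface albedo (assumed)

def net_surface_heat_flux(shortwave, longwave_net, t_air, t_surface, wind):
    """Net heat flux into the water surface, W/m^2 (positive = warming)."""
    sensible = RHO_AIR * CP_AIR * C_H * wind * (t_air - t_surface)
    return shortwave * (1.0 - ALBEDO) + longwave_net + sensible

# Example: mid-day conditions over the reservoir (illustrative values).
print(net_surface_heat_flux(shortwave=850.0, longwave_net=-90.0,
                            t_air=31.0, t_surface=28.5, wind=3.0))
```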

Relevance:

60.00%

Publisher:

Abstract:

This work studied the behaviour of shallow foundations resting on a double-layer system in which the upper layer is cemented. The study consisted of three stages, referred to as the Laboratory Stage, the Numerical Stage and the Field Stage. In the Laboratory Stage, the technical feasibility of using the industrial residues bottom ash and carbide lime to stabilise a residual soil from Botucatu sandstone was verified. The reactivity of the bottom ash with carbide lime, the influence of temperature and curing time on the development of the pozzolanic reactions, and the influence of different residue contents on unconfined compressive strength, splitting tensile strength and durability were studied in order to define an optimum mixture; the environmental impact of using the optimum mixture was also assessed through leaching and solubilisation tests. In the Numerical Stage, the behaviour of shallow foundations resting on the double-layer system was studied using the Finite Element Method. The cemented and uncemented materials were represented by an elastic-plastic model with a Drucker-Prager failure criterion and non-associated flow. A parametric analysis examined the influence of the cemented layer thickness and the foundation diameter, as well as the influence of the parameters of the cemented and uncemented materials, on the load-settlement response of shallow foundations. In the Field Stage, experimental fills were built using the optimum mixture determined in the Laboratory Stage, and plate load tests were carried out on these fills. Analysis of the results obtained in the three stages led to the following conclusions: it is possible to use bottom ash and carbide lime to stabilise the Botucatu residual soil; the behaviour of shallow foundations on cemented soils is controlled by the ratio of cemented layer thickness to foundation diameter; the friction angle and elastic modulus of the cemented layer do not influence the plate load test results; and foundation failure is governed by two progressive mechanisms, which are functions of the tensile stresses generated at the bottom of the cemented layer and of the shear stresses existing just below the edges of the foundation.
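
A minimal sketch of the Drucker-Prager yield function used to represent the cemented and uncemented materials; the matching of the cone parameters to a Mohr-Coulomb cohesion and friction angle shown below follows one common (compressive-meridian) convention and, together with the example stress state, is an assumption rather than the thesis' exact calibration.

```python
# Minimal sketch of the Drucker-Prager yield function f = alpha*I1 + sqrt(J2) - k
# (tension-positive convention, yielding when f >= 0). The matching of
# (alpha, k) to Mohr-Coulomb cohesion and friction angle is one common
# compressive-meridian fit and is assumed here, not the thesis' calibration.
import numpy as np

def drucker_prager_yield(stress, cohesion, friction_angle_deg):
    """Evaluate the Drucker-Prager yield function for a 3x3 stress tensor."""
    phi = np.radians(friction_angle_deg)
    alpha = 2.0 * np.sin(phi) / (np.sqrt(3.0) * (3.0 - np.sin(phi)))
    k = 6.0 * cohesion * np.cos(phi) / (np.sqrt(3.0) * (3.0 - np.sin(phi)))
    i1 = np.trace(stress)
    dev = stress - (i1 / 3.0) * np.eye(3)       # deviatoric part
    j2 = 0.5 * np.sum(dev * dev)
    return alpha * i1 + np.sqrt(j2) - k

# Illustrative stress state in kPa (compression negative); values assumed.
sigma = np.diag([-150.0, -80.0, -80.0])
print(drucker_prager_yield(sigma, cohesion=60.0, friction_angle_deg=35.0))
```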

Relevance:

60.00%

Publisher:

Abstract:

This paper presents optimal rules for monetary policy in Brazil derived from a backward-looking expectation model consisting of a Keynesian IS function and an Augmented Phillips Curve (IS-AS). The IS function displays a high sensitivity of aggregate demand to the real interest rate, and the Phillips Curve is accelerationist. The optimal monetary rules show low interest rate volatility, with reaction coefficients lower than those suggested by Taylor (1993a,b). Reaction functions estimated through ADL and SUR models suggest that monetary policy has not been optimal and has aimed at output rather than inflation stabilization.
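
A stylised sketch of the backward-looking model described above (an interest-rate-sensitive IS curve and an accelerationist Phillips curve) simulated under a Taylor-type rule; all coefficients and starting values are assumptions for illustration, not the paper's estimated or optimal parameters.

```python
# Stylised sketch of a backward-looking IS-AS model under a Taylor-type rule.
# IS: the output gap falls with the lagged real interest rate.
# Phillips curve (accelerationist): inflation drifts with the output gap.
# All coefficients and initial values are assumptions for illustration.
def simulate(periods=20, a=0.8, kappa=0.3, g_pi=0.5, g_y=0.5,
             r_star=4.0, pi_target=4.0):
    y, pi, i = 0.0, 8.0, 12.0          # initial output gap, inflation, rate (%)
    path = []
    for _ in range(periods):
        y = -a * (i - pi - r_star)                            # IS function
        pi = pi + kappa * y                                   # Phillips curve
        i = r_star + pi + g_pi * (pi - pi_target) + g_y * y   # policy rule
        path.append((y, pi, i))
    return path

for y, pi, i in simulate()[:5]:
    print(f"output gap {y:+.2f}  inflation {pi:.2f}  nominal rate {i:.2f}")
```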

Relevance:

60.00%

Publisher:

Abstract:

The formation of paraffin deposits is common in the petroleum industry during the production, transport and treatment stages. It occurs due to changes in thermodynamic variables that alter the solubility of the alkane fractions present in petroleum. Paraffin deposition can cause significant and growing production losses and may even block the flow entirely, preventing production. This process is associated with the liquid-solid phase equilibrium and with the stages of nucleation, growth and agglomeration of the crystals, and it is a function of the intrinsic characteristics of the petroleum and of the temperature and pressure variations during production. Several preventive and corrective methods are used to control paraffin crystallization, such as the use of chemical inhibitors, injection of hot solvents, thermochemical reactions, and mechanical removal; for offshore exploration, however, this expensive problem requires further investigation. Many studies have focused on the Wax Appearance Temperature (WAT) of paraffin, since the crystals formed modify the rheological properties of the oil and cause numerous operational problems. From the determination of the WAT of a system it is possible to state whether or not an oil tends to form organic deposits, making it possible to foresee and prevent wax crystallization problems. The solvent n-paraffin has been widely used as a drilling fluid, raising production costs when it is also used to remove paraffin deposits, so an operational substitute is needed. This study aims to determine the WAT of paraffin and the influence of additives on its reduction, developing paraffin/solvent/surfactant systems that promote wax solubilization. In the first stage of the experiments, crystallization temperatures were established for various paraffin concentrations and different solvents. In the second stage, using a methodology based on the variation of the photoelectric signal, the crystallization temperature of the systems was determined and the effect of additives on the reduction of the WAT was evaluated. The experimental results are expressed as a function of the variations of the photoelectric signal during controlled cooling, introducing and validating this new methodology for determining the WAT, which is relatively simple compared with other methods that require specific, high-cost equipment. From the derivative curves of the results, the critical stages of crystal growth and agglomeration that correspond to the saturation of the system were also identified, indicating flow difficulties due to the increase in density.
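
A minimal sketch of how the WAT could be read off a photoelectric signal recorded during controlled cooling, as described above; the derivative-threshold criterion and the synthetic cooling curve are assumptions that illustrate the idea rather than reproduce the study's procedure.

```python
# Minimal sketch: estimate the Wax Appearance Temperature (WAT) as the
# temperature at which the photoelectric signal starts to drop sharply
# during controlled cooling. The threshold and synthetic data are assumptions.
import numpy as np

def detect_wat(temperature, signal, threshold=0.05):
    """Return the temperature at which the signal first falls steeply on cooling."""
    order = np.argsort(temperature)[::-1]          # order samples from hot to cold
    t, s = temperature[order], signal[order]
    dsdt = np.gradient(s, t)                       # signal change per degree C
    crossing = np.where(dsdt > threshold)[0]       # signal falls as T falls
    return t[crossing[0]] if crossing.size else None

# Synthetic cooling curve: stable signal, then crystals scatter the light.
temp = np.linspace(60.0, 20.0, 200)
sig = np.where(temp > 38.0, 1.0, 1.0 - 0.1 * (38.0 - temp))
sig = sig + np.random.default_rng(3).normal(scale=0.002, size=temp.size)

print(f"Estimated WAT: {detect_wat(temp, sig):.1f} C")
```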

Relevance:

60.00%

Publisher:

Abstract:

In this article, we consider the synthetic control chart with two-stage sampling (SyTS chart) to control the process mean and variance. During the first stage, one item of the sample is inspected; if its value X₁ is close to the target value of the process mean, the sampling is interrupted. Otherwise, the sampling goes on to the second stage, where the remaining items are inspected and the statistic T = Σ(xᵢ − μ₀ + ξσ₀)² is computed, taking into account all items of the sample. The design parameter ξ is a function of X₁. When the statistic T is larger than a specified value, the sample is classified as nonconforming. According to the synthetic procedure, the signal is based on the Conforming Run Length (CRL). The CRL is the number of samples taken from the process since the previous nonconforming sample until the occurrence of the next nonconforming sample. If the CRL is sufficiently small, a signal is generated. A comparative study shows that the SyTS chart and the joint X̄ and S charts with double sampling are very similar in performance. However, from a practical viewpoint, the SyTS chart is more convenient to administer than the joint charts.
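
A minimal sketch of the two-stage sampling and CRL-based signalling logic described above; the classification and signalling rules follow the abstract, while the numeric limits (the stage-1 width, the control limit for T, the CRL threshold) and the data are assumed for illustration, and ξ is held constant even though the paper makes it depend on X₁.

```python
# Minimal sketch of the SyTS chart logic: two-stage sampling classification
# plus the synthetic (Conforming Run Length) signalling rule.
# The limits W, CL and L below are assumed for illustration only.
import numpy as np

MU0, SIGMA0 = 0.0, 1.0   # target mean and in-control standard deviation
XI = 1.0                 # design parameter (constant here; in the paper it depends on X1)
W = 1.0                  # stage-1 "close to target" half-width (assumed)
CL = 12.0                # control limit for the statistic T (assumed)
L = 5                    # CRL threshold for a signal (assumed)

def classify(sample):
    """Return True if the sample is nonconforming."""
    x1 = sample[0]
    if abs(x1 - MU0) <= W * SIGMA0:                   # stage 1: stop sampling
        return False
    t = np.sum((sample - MU0 + XI * SIGMA0) ** 2)     # stage 2: all items
    return t > CL

def monitor(samples):
    """Yield sample indices at which the synthetic chart signals."""
    crl = 0
    for idx, sample in enumerate(samples, start=1):
        crl += 1
        if classify(sample):
            if crl <= L:            # nonconforming samples too close together
                yield idx
            crl = 0

rng = np.random.default_rng(4)
in_control = rng.normal(MU0, SIGMA0, size=(30, 5))
shifted = rng.normal(MU0 + 1.5, SIGMA0, size=(10, 5))   # mean shift after sample 30
print(list(monitor(np.vstack([in_control, shifted]))))
```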