833 results for Information systems and technologies
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge based) integrated systems. The thesis argues from the position that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies was carried out, which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as being essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
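To make the 'classic' metrics mentioned above concrete: McCabe's cyclomatic complexity is V(G) = E - N + 2P for a control-flow graph with E edges, N nodes and P connected components, and Halstead's volume is N·log2(n) over operator/operand counts. A minimal sketch of the two formulas (the counts shown are illustrative, not values from the thesis):

```python
import math

def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
    """McCabe's cyclomatic complexity V(G) = E - N + 2P of a control-flow graph."""
    return edges - nodes + 2 * components

def halstead_volume(n1: int, n2: int, N1: int, N2: int) -> float:
    """Halstead volume V = N * log2(n), where N = N1 + N2 (total operator and
    operand occurrences) and n = n1 + n2 (distinct operators and operands)."""
    return (N1 + N2) * math.log2(n1 + n2)

# Illustrative counts for a small program fragment (hypothetical values):
print(cyclomatic_complexity(edges=9, nodes=8))                 # -> 3
print(round(halstead_volume(n1=10, n2=7, N1=24, N2=19), 1))    # -> 175.8
```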
Abstract:
Information technology is at the centre of today's business environment. The increasing importance of e-commerce and the integration of information systems in all areas of a business mean it is crucial for managers to understand and implement IS (information systems). This major text, now in its second edition, provides the skills and knowledge necessary to choose the right systems, and to develop and manage them effectively. Business Information Systems: Technology, Development and Management assumes no prior knowledge of IS or IT, and emphasises the importance of IS to management decision making. It has a three-part structure: Part One covers hardware and software technologies; Part Two looks at information systems analysis and design; and Part Three describes the strategic management of IS. This successful format allows each section to be studied alongside individual modules, and enables students to focus clearly on specific areas and use the book for more than one course. The book is suitable for college, undergraduate and postgraduate students taking courses with modules in the practical IT skills of selecting, implementing, managing and using BIS. The practical sections are also of use to managers in industry involved in the development and use of IS.
Abstract:
This major text assumes no prior knowledge of IS or IT and builds both business and information systems knowledge to enable the reader to choose the right systems, to develop them and to manage them effectively. The three-part structure of the book covers: an introduction to business information systems; business information systems development; and business information systems management. Suitable for any IS, BIS or MIS course from undergraduate to MBA level within a Business or Computer Science Department.
Abstract:
A comprehensive introduction to the technology, development and management of business information systems. The book assumes no prior knowledge of IS or IT, so that new concepts and terms are defined as clearly as possible, with explanations in the text and definitions at the margin. In this fast-moving area, the book covers both the crucial underpinnings of the subject and the most recent business and technology applications. It is written for students on any IS, BIS or MIS course from undergraduate to postgraduate and MBA level within a Business or Computer Science Department.
Abstract:
Based on a Belief-Action-Outcome framework, we produced a model that shows senior managers' perception of both the antecedents to and the consequences of Green IS adoption by a firm. This conceptual model and its associated hypotheses were empirically tested using a dataset generated from a survey of 405 organizations. The results suggest that coercive pressure influences the attitude toward Green IS adoption, while mimetic pressure does not. In addition, we found that there was a significant relationship between Green IS adoption, attitude, and consideration of future consequences. Finally, we found that only long-term Green IS adoption was positively related to environmental performance. © 2013 Elsevier B.V.
Abstract:
This article investigates (1) whether cross-functional integration within a firm and the use of information systems (IS) that support information sharing with external parties can enhance integration across the supply chain and wider networks, and (2) whether collaboration with customers, suppliers and other external parties leads to increased supply chain performance in terms of new product development and the introduction of new processes. Data from a high-quality survey carried out in Taiwan in 2009 were used, and appropriate econometric models were applied. Results show that the adoption of IS that enhance information sharing is vital not only for effective communication with suppliers and with wider network members, but also has a direct effect on a firm's innovative effort. Cross-functional integration appears to matter only for the introduction of an innovative process. Collaboration with customers and suppliers affected a product's design and its overall features and functionality, respectively. © 2013 Copyright Taylor and Francis Group, LLC.
Abstract:
A comprehensive introduction to the technology, development and management of business information systems. The book assumes no prior knowledge of IS or IT, so that new concepts and terms are defined as clearly as possible, with explanations in the text and definitions at the margin. In this fast-moving area, the book covers both the crucial underpinnings of the subject and the most recent business and technology applications. It is written for students on any IS, BIS or MIS course from undergraduate to postgraduate and MBA level within a Business or Computer Science Department.
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
This article describes the architecture and implementation of a subsystem intended for working with queries and reports in adaptive, dynamically extensible information systems. The main features of the developed approach are application universality, user orientation and the ability to integrate with external information systems. The software implementation is based on a multilevel metadata approach.
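The abstract gives no implementation detail; as a rough illustration of the metadata-driven idea, the hypothetical sketch below describes a report as data (field metadata plus filters) and generates the query from that description instead of hard-coding it:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class FieldMeta:
    name: str      # logical field name exposed to the user
    column: str    # physical column in the underlying store
    label: str     # human-readable caption for the report

@dataclass
class ReportMeta:
    table: str
    fields: List[FieldMeta]
    filters: Dict[str, str] = field(default_factory=dict)

def build_query(report: ReportMeta) -> str:
    """Translate report metadata into a SELECT statement (illustrative only)."""
    cols = ", ".join(f"{f.column} AS {f.name}" for f in report.fields)
    where = " AND ".join(f"{k} = '{v}'" for k, v in report.filters.items())
    return f"SELECT {cols} FROM {report.table}" + (f" WHERE {where}" if where else "")

orders = ReportMeta(
    table="orders",
    fields=[FieldMeta("customer", "cust_name", "Customer"),
            FieldMeta("total", "order_total", "Order total")],
    filters={"status": "open"},
)
print(build_query(orders))
# SELECT cust_name AS customer, order_total AS total FROM orders WHERE status = 'open'
```

Because field and filter definitions are data rather than code, new reports or externally sourced fields can be added by editing metadata, which is the kind of extensibility a multilevel metadata approach aims at.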
Abstract:
Geographic Information Systems (GIS) is an emerging information technology (IT) which promises to have large-scale influence on how spatially distributed resources are managed. It has had applications in the management of issues as diverse as recovering from the disaster of Hurricane Andrew and aiding military operations in Desert Storm. Implementation of GIS systems is an important issue because setting them up involves high costs and substantial time. An important component of the implementation problem is the "meaning" that the different groups of people influencing the implementation give to the technology. The research was based on the theory of the "Social Construction of Knowledge", which assumes knowledge systems are subject to sociological analysis both in usage and in content. An interpretive research approach was adopted to inductively derive a model which explains how the "meanings" of a GIS are socially constructed. The research design entailed a comparative case analysis over two county sites which were using the same GIS for a variety of purposes. A total of 75 in-depth interviews were conducted to elicit interpretations of the GIS. Results indicate that differences in how geographers and data-processors view the technology lead to different implementation patterns in the two sites.
Abstract:
Automated information system design and implementation is one of the fastest-changing aspects of the hospitality industry. During the past several years nothing has increased the professionalism or improved the productivity within the industry more than the application of computer technology. Intuitive software applications, deemed the first step toward making computers more people-literate; object-oriented programming, intended to model reality more accurately; and wireless communications are expected to play a significant role in future technological advancement.
Abstract:
This research analyzed the spatial relationship between a mega-scale fracture network and the occurrence of vegetation in an arid region. High-resolution aerial photographs of Arches National Park, Utah were used for digital image processing. Four sets of large-scale joints were digitized from the rectified color photograph in order to characterize the geospatial properties of the fracture network with the aid of a Geographic Information System. An unsupervised landcover classification was carried out to identify the spatial distribution of vegetation on the fractured outcrop. Results of this study confirm that the WNW-ESE alignment of vegetation is dominantly controlled by the spatial distribution of the systematic joint set, which in turn parallels the regional fold axis. This research provides insight into the spatial heterogeneity inherent to fracture networks, as well as the effects of jointing on the distribution of surface vegetation in desert environments.
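The abstract does not name the classification algorithm; unsupervised landcover classification of this kind is commonly done by clustering pixels on their band values, for example with k-means. A minimal sketch under that assumption (the image array here is synthetic):

```python
import numpy as np
from sklearn.cluster import KMeans

# Stand-in for a georeferenced 3-band aerial photograph, shape (rows, cols, bands).
image = np.random.randint(0, 256, size=(500, 500, 3)).astype(float)

# Cluster spectrally similar pixels into a small number of landcover classes.
pixels = image.reshape(-1, image.shape[-1])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(pixels)

# Reshape back to the image grid; each cell now holds a class id (0-3),
# e.g. vegetation vs. bare outcrop, to be overlaid on the digitized joint sets in a GIS.
classified = labels.reshape(image.shape[:2])
print(np.bincount(labels))  # pixel count per class
```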
Abstract:
The paper investigates how Information Systems (IS) has emerged as the product of inter-disciplinary discourses. The research aim in this study is to better understand diversity in IS research, and the extent to which the diversity of discourse expanded and contracted from 1995 to 2011. Methodologically, we apply a combined citation/co-citation analysis based on the eight Association for Information Systems basket journals and the 22 subject-field classification framework provided by the Association of Business Schools. Our findings suggest that IS is in a state of continuous interaction and competition with other disciplines. General Management lost its dominant position as a reference discipline in IS, giving way to a growing variety of other discourses including Business Strategy, Marketing, and Ethics and Governance, among others. Over time, IS as a field moved from the periphery to a central position during its discursive formation. This supports the notion of IS as a fluid discipline dynamically embracing a diverse range of adjacent reference disciplines, while keeping a degree of continuing interaction with them. Understanding where IS currently stands allows us to better understand and propose fruitful avenues for its development in both academia and practice. © 2013 JIT Palgrave Macmillan. All rights reserved.
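As a rough illustration of the co-citation step (the study's actual procedure is more elaborate), the sketch below counts how often pairs of references are cited together in the same article, which is the raw matrix such analyses typically start from; the reference lists are hypothetical:

```python
from collections import Counter
from itertools import combinations

# Each citing article is represented by the set of references it cites (hypothetical data).
articles = [
    {"Orlikowski1992", "Davis1989", "DiMaggio1983"},
    {"Davis1989", "DiMaggio1983", "Porter1980"},
    {"Orlikowski1992", "Davis1989"},
]

# Co-citation count: number of articles in which a pair of references appears together.
cocitation = Counter()
for refs in articles:
    for a, b in combinations(sorted(refs), 2):
        cocitation[(a, b)] += 1

for pair, count in cocitation.most_common(3):
    print(pair, count)
```

Clustering or mapping such a matrix over successive time windows is what lets a study of this kind trace how reference disciplines gain or lose prominence.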
Abstract:
Veterinary medicines (VMs) from the agricultural industry can enter the environment in a number of ways. This includes direct exposure through aquaculture, accidental spillage and disposal, and indirect entry by leaching from manure or runoff after treatment. Many compounds used in animal treatments have ecotoxic properties that may have chronic or sometimes lethal effects when they come into contact with non-target organisms. VMs enter the environment in mixtures, potentially having additive effects. Traditional ecotoxicology tests are used to determine the lethal and sometimes reproductive effects on freshwater and terrestrial organisms. However, organisms used in ecotoxicology tests can be unrepresentative of the populations that are likely to be exposed to the compound in the environment. Most often the tests are of single-compound toxicity, but mixture effects may be significant and should be included in ecotoxicology testing. This work investigates the use, measured environmental concentrations (MECs) and potential impact of sea lice treatments on salmon farms in Scotland. Alternative methods for ecotoxicology testing, including mixture toxicity, and the use of in silico techniques to predict the chronic impact of VMs on different species of aquatic organisms were also investigated. The Scottish Environmental Protection Agency (SEPA) provided information on the use of five sea lice treatments from 2008 to 2011 on Scottish salmon farms. This information was combined with the recently available data on sediment MECs for the years 2009-2012 provided by SEPA using ArcGIS 10.1. In-depth analysis of these data showed that, from a total of 55 sites, 30 sites had a MEC higher than the maximum allowable concentration (MAC) set out by SEPA for emamectin benzoate and 7 sites had a MEC higher than the MAC for teflubenzuron. A number of sites up to 16 km away from the nearest salmon farm reported as using either emamectin benzoate or teflubenzuron measured positive for the two treatments. There was no relationship between current direction and the distribution of the sea lice treatments, nor was there any evidence for alternative sources of the compounds, e.g. land treatments. The sites that had MECs higher than the MAC could pose a risk to non-target organisms and disrupt the species dynamics of the area. There was evidence that some marine protected sites might be at risk of exposure to these compounds. To complement this work, the acute mixture toxicity of the five sea lice treatments, plus one major metabolite, 3-phenoxybenzoic acid (3PBA), was measured using an assay based on the bioluminescent bacterium Aliivibrio fischeri. When exposed to the five sea lice treatments and 3PBA, A. fischeri showed a response to 3PBA, emamectin benzoate and azamethiphos, as well as to combinations of the three. In order to establish any additive effect of the sea lice treatments, the efficacy of two mixture prediction equations, concentration addition (CA) and independent action (IA), was tested using the results from single-compound dose-response curves. In this instance IA was the more effective prediction method, with a linear regression confidence interval of 82.6% compared with 22.6% for CA. In silico molecular docking was carried out to predict the chronic effects of 15 VMs (including the five used for sea lice control). Molecular docking has been proposed as an alternative screening method for the chronic effects of large animal treatments on non-target organisms.
The oestrogen receptor alpha (ERα) of seven non-target bony fish and of the African clawed frog Xenopus laevis was modelled using SwissModel. These models were then 'docked' to oestradiol, the synthetic oestrogen ethinylestradiol, two known xenoestrogens, dichlorodiphenyltrichloroethane (DDT) and bisphenol A (BPA), the anti-oestrogen breast cancer treatment tamoxifen, and 15 VMs using AutoDock 4. Based on the results of this work, four VMs were identified as being possible xenoestrogens or anti-oestrogens; these were cypermethrin, deltamethrin, fenbendazole and teflubenzuron. Further investigation of these four VMs using in vitro assays has been suggested as future work. A modified recombinant yeast oestrogen screen (YES) was attempted using the cDNA of the ERα of the zebrafish Danio rerio and the rainbow trout Oncorhynchus mykiss. Due to time constraints and difficulties with the cloning protocols, this work could not be completed. Use of such in vitro assays would allow further investigation of the oestrogenic potential of the highlighted VMs. In conclusion, VMs used as sea lice treatments, such as teflubenzuron and emamectin benzoate, may be more persistent and have a wider range in the environment than previously thought. Mixtures of sea lice treatments have been found to persist together in the environment, and effects of these mixtures on the bacterium A. fischeri can be predicted using the IA equation. Finally, molecular docking may be a suitable tool to predict chronic endocrine-disrupting effects and identify varying degrees of impact on the ERα of nine species of aquatic organisms.
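The two mixture models compared above have standard forms: concentration addition sums toxic units, c_i/ECx_i, and predicts the effect level x when the sum reaches 1, while independent action combines the single-compound effects multiplicatively, E_mix = 1 - Π(1 - E_i). A minimal sketch of both (the concentrations and effect fractions are illustrative, not data from the thesis):

```python
import numpy as np

def independent_action(effects):
    """IA: E_mix = 1 - prod(1 - E_i), where E_i is the fractional effect (0-1)
    of compound i at its concentration in the mixture."""
    effects = np.asarray(effects, dtype=float)
    return 1.0 - np.prod(1.0 - effects)

def concentration_addition_toxic_units(concentrations, ecx_values):
    """CA: sum of toxic units c_i / ECx_i; the mixture is predicted to cause
    the effect level x when the sum equals 1."""
    c = np.asarray(concentrations, dtype=float)
    ecx = np.asarray(ecx_values, dtype=float)
    return float(np.sum(c / ecx))

# Illustrative values only.
print(independent_action([0.10, 0.25, 0.05]))                      # -> 0.35875
print(concentration_addition_toxic_units([2.0, 0.5], [8.0, 4.0]))  # -> 0.375 toxic units
```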
Abstract:
In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based but human-understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the "tyranny" of classes. They attribute a number of problems in information modeling to inherent classification – or the disregard for the fact that instances can be conceptualized independently of any class assignment. By faceting instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins both the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotations, but are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling, through the layered information model, can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools like classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schema and the horizons of conventional classification and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve and interoperate in a networked environment, knowledge organization specialists must consider a semantic class independence like the one Parsons and Wand propose for information modeling.
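To make the class/instance separation concrete, the hypothetical sketch below keeps instances as plain property bundles and records class (or facet) membership in a separate layer, so an instance can be reclassified, or belong to several classes at once, without being redefined, in the spirit of the layered model discussed above:

```python
# Layer 1: instances exist independently of any class, as bundles of properties.
instances = {
    "item42": {"title": "Thesaurus of geology terms", "format": "print", "year": 1998},
    "item43": {"title": "Fracture survey dataset", "format": "dataset", "year": 2004},
}

# Layer 2: classification is a separate mapping; membership can change freely
# and an instance may sit in several classes (facets) or in none.
classification = {
    "ControlledVocabulary": {"item42"},
    "GeospatialData": {"item43"},
    "EarthSciences": {"item42", "item43"},
}

def classes_of(instance_id: str) -> set:
    """Return every class the instance is currently assigned to."""
    return {cls for cls, members in classification.items() if instance_id in members}

print(classes_of("item42"))  # {'ControlledVocabulary', 'EarthSciences'} (set order may vary)
```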