30 results for Information Technologies

in the Aston University Research Archive


Relevance:

60.00%

Publisher:

Abstract:

In recent years, the European Union has come to view cyber security, and in particular cyber crime, as one of the most relevant challenges to the completion of its Area of Freedom, Security and Justice. Given European societies’ increased reliance on borderless and decentralized information technologies, this sector of activity has been identified as an easy target for actors such as organised criminals, hacktivists or terrorist networks. Such analysis has been accompanied by EU calls to step up the fight against unlawful online activities, namely through increased cooperation among law enforcement authorities (both national and extra-communitarian), the approximation of legislation, and public-private partnerships. Although EU initiatives in this field have, so far, been characterized by a lack of interconnection and an integrated strategy, there has been, since the mid-2000s, an attempt to develop a more cohesive and coordinated policy. An important part of this policy is connected to the activities of Europol, which have come to assume a central role in the coordination of intelligence gathering and analysis of cyber crime. The European Cybercrime Centre (EC3), which will become operational within Europol in January 2013, is regarded, in particular, as a focal point of the EU’s fight against this phenomenon. Bearing this background in mind, the present article seeks to understand the role of Europol in the development of a European policy to counter the illegal use of the internet. The article pursues this objective by analyzing, through the theoretical lenses of experimental governance, the evolution of this agency’s activities in the area of cyber crime and cyber security, its positioning as an expert in the field, and the consequences for the way this policy is currently developing and is expected to develop in the near future.

Relevance:

60.00%

Publisher:

Abstract:

The recent explosive growth in advanced manufacturing technology (AMT) and continued development of sophisticated information technologies (IT) is expected to have a profound effect on the way we design and operate manufacturing businesses. Furthermore, the escalating capital requirements associated with these developments have significantly increased the level of risk associated with initial design, ongoing development and operation. This dissertation has examined the integration of two key sub-elements of the Computer Integrated Manufacturing (CIM) system, namely the manufacturing facility and the production control system. This research has concentrated on the interactions between production control (MRP) and an AMT-based production facility. The disappointing performance of such systems has been discussed in the context of a number of potential technological and performance incompatibilities between these two elements. It was argued that the design and selection of operating policies for both is the key to successful integration. Furthermore, policy decisions are shown to play an important role in matching the performance of the total system to the demands of the marketplace. It is demonstrated that a holistic approach to policy design must be adopted if successful integration is to be achieved. It is shown that the complexity of the issues resulting from such an approach required the formulation of a structured design methodology. Such a methodology was subsequently developed and discussed. This combined a first-principles approach to the behaviour of system elements with the specification of a detailed holistic model for use in the policy design environment. The methodology aimed to make full use of the 'low inertia' characteristics of AMT, whilst adopting a JIT configuration of MRP and re-coupling the total system to market demands. This dissertation discussed the application of the methodology to an industrial case study and the subsequent design of operational policies. Consequently, a novel approach to production control resulted, a central feature of which was a move toward reduced manual intervention in the MRP processing and scheduling logic, together with increased human involvement and motivation in the management of work-flow on the shopfloor. Experimental results indicated that significant performance advantages would result from the adoption of the recommended policy set.
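
As a concrete illustration of the MRP processing logic referred to above, the sketch below performs standard textbook gross-to-net netting with lot-for-lot ordering and a fixed lead-time offset. It is a generic Python example under those assumptions, not the policy set developed in the dissertation, and all quantities are invented.

```python
# Minimal sketch of textbook MRP gross-to-net logic: period-by-period netting
# with lot-for-lot ordering and a fixed lead-time offset. Illustrative only.

def mrp_netting(gross_reqs, on_hand, scheduled_receipts, lead_time):
    """Return (net_requirements, planned_order_releases) per period."""
    periods = len(gross_reqs)
    net_reqs = [0] * periods
    planned_releases = [0] * periods
    available = on_hand
    for t in range(periods):
        available += scheduled_receipts[t]
        shortfall = gross_reqs[t] - available
        if shortfall > 0:
            net_reqs[t] = shortfall          # quantity that must be made or ordered
            release_period = t - lead_time   # offset the order release by the lead time
            if release_period >= 0:          # releases inside the lead time would be
                planned_releases[release_period] = shortfall  # exception-flagged in practice
            available = 0                    # lot-for-lot: stock exactly covers demand
        else:
            available -= gross_reqs[t]       # demand met from stock
    return net_reqs, planned_releases

# Example: six weekly buckets, 40 units on hand, one scheduled receipt in week 2.
net, releases = mrp_netting(
    gross_reqs=[30, 20, 50, 10, 60, 25],
    on_hand=40,
    scheduled_receipts=[0, 0, 30, 0, 0, 0],
    lead_time=1,
)
print(net)       # net requirements per week
print(releases)  # planned order releases, offset by the lead time
```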

Relevance:

60.00%

Publisher:

Abstract:

Research shows that consumers are readily embracing the Internet to buy products. This paper proposes that, in the case of grocery shopping, this may lead to sub-optimal decisions at the household level. Decisions online about what, where and from whom to buy are normally taken by one individual. In the case of grocery shopping, however, decisions need to be ‘vetted’ by ‘other’ individuals within the household. These ‘household-wide’ decisions influence how information technologies and systems for commerce should be designed and managed for optimum decision making. This paper argues, unlike previous research, that e-grocery retailing is failing to grow to its full potential not solely because of the ‘classical’ hazards and perceived risks associated with doing grocery shopping online, but because e-grocery retailing strategy has failed to acknowledge the micro-household-level specificities that affect decision making. Our exploratory research is based on empirical evidence collected through telephone interviews. We offer an insight into how e-grocery ‘fits’ and is ‘disrupted’ by the reality of day-to-day consumption decision making at the household level. Our main finding is to advocate a more role-neutral, multi-user and multi-technology approach to e-grocery shopping which re-defines the concept of the main shopper/decision maker, thereby reconceptualising the ‘shopping logic’ for grocery products.

Relevance:

60.00%

Publisher:

Abstract:

This paper presents the digital imaging results of a collaborative research project working toward the generation of an on-line interactive digital image database of signs from ancient cuneiform tablets. An important aim of this project is the application of forensic analysis to the cuneiform symbols to identify scribal hands. Cuneiform tablets are amongst the earliest records of written communication, and could be considered one of the original information technologies; an accessible, portable and robust medium for communication across distance and time. The earliest examples are up to 5,000 years old, and the writing technique remained in use for some 3,000 years. Unfortunately, only a small fraction of these tablets can be made available for display in museums, and much important academic work has yet to be performed on the very large numbers of tablets to which there is necessarily restricted access. Our paper will describe the challenges encountered in the 2D image capture of a sample set of tablets held in the British Museum, explaining the motivation for attempting 3D imaging and presenting the results of initial experiments scanning the smaller, more densely inscribed cuneiform tablets. We will also discuss the tractability of 3D digital capture, representation and manipulation, and investigate the requirements for scalable data compression and transmission methods. Additional information can be found on the project website: www.cuneiform.net
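
As an illustration of the scalable representation and transmission question raised above, the sketch below builds a simple resolution pyramid so a tablet image could be transmitted coarse-first and refined on demand. It is a generic Python example; the capture and compression details are not taken from the project.

```python
# Sketch of a resolution pyramid as one scalable representation of a scanned
# tablet image. Pure numpy; the actual capture/compression pipeline is not
# described in the paper, so this is illustrative only.
import numpy as np

def image_pyramid(img, levels):
    """Return a list of progressively half-resolution copies of a 2-D image."""
    pyramid = [img]
    for _ in range(levels):
        h, w = pyramid[-1].shape
        h, w = h - h % 2, w - w % 2                  # trim to even dimensions
        blocks = pyramid[-1][:h, :w].reshape(h // 2, 2, w // 2, 2)
        pyramid.append(blocks.mean(axis=(1, 3)))     # 2x2 box-filter downsample
    return pyramid

scan = np.random.rand(512, 384)      # stand-in for a greyscale tablet scan
for level in image_pyramid(scan, levels=3):
    print(level.shape)               # (512, 384), (256, 192), (128, 96), (64, 48)
```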

Relevance:

60.00%

Publisher:

Abstract:

Purpose: In today's competitive scenario, effective supply chain management is increasingly dependent on third-party logistics (3PL) companies' capabilities and performance. The dissemination of information technology (IT) has contributed to changing the supply chain role of 3PL companies, and IT is considered an important element influencing the performance of modern logistics companies. Therefore, the purpose of this paper is to explore the relationship between IT and 3PLs' performance, assuming that logistics capabilities play a mediating role in this relationship. Design/methodology/approach: Empirical evidence based on a questionnaire survey conducted on a sample of logistics service companies operating in the Italian market was used to test a conceptual resource-based view (RBV) framework linking IT adoption, logistics capabilities and firm performance. Factor analysis and ordinary least squares (OLS) regression analysis have been used to test the hypotheses. The focus of the paper is multidisciplinary in nature; approaches from management of information systems, strategy, logistics and supply chain management have been combined in the analysis. Findings: The results indicate strong relationships among data-gathering technologies, transactional capabilities and firm performance, in terms of both efficiency and effectiveness. Moreover, a positive correlation between enterprise information technologies and 3PL financial performance has been found. Originality/value: The paper successfully uses the concept of logistics capabilities as a mediating factor between IT adoption and firm performance. Objective measures have been proposed for IT adoption and logistics capabilities. Direct and indirect relationships among variables have been successfully tested. © Emerald Group Publishing Limited.
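
The mediating role described here (IT adoption acting on firm performance through logistics capabilities) can be sketched with two OLS regressions in the spirit of the reported analysis. The example below uses synthetic data and hypothetical variable names, not the paper's survey data or exact model.

```python
# Minimal sketch of a mediation-style OLS analysis:
# IT adoption -> logistics capabilities -> firm performance. Synthetic data,
# illustrative effect sizes and variable names.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
it_adoption = rng.normal(size=n)                                   # e.g. data-gathering IT index
capabilities = 0.6 * it_adoption + rng.normal(scale=0.5, size=n)   # mediator
performance = 0.5 * capabilities + 0.1 * it_adoption + rng.normal(scale=0.5, size=n)

# Step 1: IT adoption should predict the mediator (logistics capabilities).
m1 = sm.OLS(capabilities, sm.add_constant(it_adoption)).fit()

# Step 2: regress performance on both; a shrunken direct IT coefficient
# alongside a significant capabilities coefficient indicates mediation.
X = sm.add_constant(np.column_stack([it_adoption, capabilities]))
m2 = sm.OLS(performance, X).fit()

print(m1.params)   # effect of IT on capabilities
print(m2.params)   # direct IT effect vs. capabilities effect on performance
```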

Relevance:

60.00%

Publisher:

Abstract:

As a new medium for questionnaire delivery, the internet has the potential to revolutionise the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability [1, 2]. For instance, delivery is faster, responses are received more quickly, and data collection can be automated or accelerated [1-3]. Online-questionnaires can also provide many capabilities not found in traditional paper-based questionnaires: they can include pop-up instructions and error messages; they can incorporate links; and it is possible to encode difficult skip patterns, making such patterns virtually invisible to respondents.

Like many new technologies, however, online-questionnaires face criticism despite their advantages. Typically, such criticisms focus on the vulnerability of online-questionnaires to the four standard survey error types: namely, coverage, non-response, sampling, and measurement errors. Although, like all survey errors, coverage error (“the result of not allowing all members of the survey population to have an equal or nonzero chance of being sampled for participation in a survey” [2, pg. 9]) also affects traditional survey methods, it is currently exacerbated in online-questionnaires as a result of the digital divide. That said, many developed countries have reported substantial increases in computer and internet access and/or are targeting this as part of their immediate infrastructural development [4, 5]. Indicating that familiarity with information technologies is increasing, these trends suggest that coverage error will rapidly diminish to an acceptable level (for the developed world at least) in the near future, and in so doing, positively reinforce the advantages of online-questionnaire delivery.

The second error type – the non-response error – occurs when individuals fail to respond to the invitation to participate in a survey or abandon a questionnaire before it is completed. Given today’s societal trend towards self-administration [2], the former is inevitable, irrespective of delivery mechanism. Conversely, non-response as a consequence of questionnaire abandonment can be relatively easily addressed. Unlike traditional questionnaires, the delivery mechanism for online-questionnaires makes estimation of questionnaire length and time required for completion difficult, thus increasing the likelihood of abandonment. By incorporating a range of features into the design of an online questionnaire, it is possible to facilitate such estimation – and indeed, to provide respondents with context-sensitive assistance during the response process – and thereby reduce abandonment while eliciting feelings of accomplishment [6].

For online-questionnaires, sampling error (“the result of attempting to survey only some, and not all, of the units in the survey population” [2, pg. 9]) can arise when all but a small portion of the anticipated respondent set is alienated (and so fails to respond) as a result of, for example, disregard for varying connection speeds, bandwidth limitations, browser configurations, monitors, hardware, and user requirements during the questionnaire design process. Similarly, measurement errors (“the result of poor question wording or questions being presented in such a way that inaccurate or uninterpretable answers are obtained” [2, pg. 11]) will lead to respondents becoming confused and frustrated.
Sampling, measurement, and non-response errors are likely to occur when an online-questionnaire is poorly designed. Individuals will answer questions incorrectly, abandon questionnaires, and may ultimately refuse to participate in future surveys; thus, the benefit of online-questionnaire delivery will not be fully realized. To prevent errors of this kind, and their consequences, it is extremely important that practical, comprehensive guidelines exist for the design of online-questionnaires. Many design guidelines exist for paper-based questionnaire design (e.g. [7-14]); the same is not true for the design of online-questionnaires [2, 15, 16]. The research presented in this paper is a first attempt to address this discrepancy. Section 2 describes the derivation of a comprehensive set of guidelines for the design of online-questionnaires and briefly (given space restrictions) outlines the essence of the guidelines themselves.

Although online-questionnaires reduce traditional delivery costs (e.g. paper, mail out, and data entry), set-up costs can be high given the need either to adopt, and acquire training in, questionnaire development software or to secure the services of a web developer. Neither approach, however, guarantees a good questionnaire (often because the person designing the questionnaire lacks relevant knowledge in questionnaire design). Drawing on existing software evaluation techniques [17, 18], we assessed the extent to which current questionnaire development applications support our guidelines; Section 3 describes the framework used for the evaluation, and Section 4 discusses our findings. Finally, Section 5 concludes with a discussion of further work.
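
As a concrete illustration of the length-estimation and progress feedback discussed above as a remedy for abandonment, the sketch below computes an estimated completion time and a context-sensitive progress message from the mix of question types. The per-question timings are assumed values, not figures from the paper or its guidelines.

```python
# Minimal sketch of length/progress feedback for an online questionnaire.
# Per-question timings below are assumptions for demonstration only.

SECONDS_PER_QUESTION = {"closed": 10, "likert": 8, "open": 45}

def estimated_minutes(question_types):
    """Estimate total completion time for a list of question types."""
    total = sum(SECONDS_PER_QUESTION[q] for q in question_types)
    return round(total / 60, 1)

def progress_message(answered, question_types):
    """Context-sensitive status line shown to the respondent."""
    remaining = question_types[answered:]
    pct = 100 * answered // len(question_types)
    return f"{pct}% complete, about {estimated_minutes(remaining)} min remaining"

questionnaire = ["closed"] * 12 + ["likert"] * 20 + ["open"] * 3
print("Estimated length:", estimated_minutes(questionnaire), "minutes")
print(progress_message(answered=16, question_types=questionnaire))
```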

Relevance:

40.00%

Publisher:

Abstract:

Market-level information diffused by print media may contribute to the legitimation of an emerging technology and thus influence the diffusion of competing technological standards. After analyzing more than 10,000 trade media abstracts from the Local Area Networks (LAN) industry published between 1981 and 2000, we found that two types of market-level information, technology information and product availability information, had differential effects on the adoption of competing standards. The significance of these effects depends on the technology's order of entry and suggests that high-tech product managers should make strategic use of market-level information by appropriately focusing the content of their communications. © 2007 Elsevier B.V. All rights reserved.
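
By way of illustration, the sketch below codes trade-media abstracts into the two market-level information types the study distinguishes (technology versus product availability) using simple keyword matching. The keyword lists are assumptions for demonstration, not the authors' coding scheme.

```python
# Illustrative coding of trade-media abstracts into two information types.
# Keyword lists are assumed for demonstration; they are not from the study.

TECHNOLOGY_TERMS = {"standard", "protocol", "specification", "throughput", "architecture"}
AVAILABILITY_TERMS = {"ships", "available", "release", "price", "vendor", "launch"}

def classify_abstract(text):
    """Label an abstract by which information type its vocabulary leans toward."""
    words = set(text.lower().split())
    tech_hits = len(words & TECHNOLOGY_TERMS)
    avail_hits = len(words & AVAILABILITY_TERMS)
    if tech_hits == avail_hits:
        return "mixed/unclear"
    return "technology" if tech_hits > avail_hits else "product availability"

print(classify_abstract("The new token ring specification doubles throughput"))
print(classify_abstract("Vendor ships 10Base-T adapters at a lower price"))
```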

Relevance:

40.00%

Publisher:

Abstract:

As information and communications technology (ICT) involves both traditional capital and knowledge capital, potential spillovers through various mechanisms can occur. We posit that ICT capital may boost productivity growth, not only in the home country, but also in other countries. In this paper, we provide empirical evidence of such spillovers using panel data on 37 countries from 1996 to 2004. Our results support the existence of ICT spillovers across country borders. Furthermore, we find that developing countries could reap more benefits from ICT spillovers than developed countries. This is particularly important for policy decisions regarding national trade liberalization and economic integration. Developing economies that are more open to foreign trade may have an economic advantage and may develop knowledge-intensive activities, which will lead to economic development in the long run.
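
A cross-country panel specification of the kind described, regressing productivity growth on domestic and foreign ICT capital with country and year fixed effects, might be sketched as follows. The data are synthetic and the variable names are assumptions rather than the authors' exact specification.

```python
# Sketch of a cross-country panel regression with country and year fixed
# effects, in the spirit of the spillover analysis described. Synthetic data;
# hypothetical variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
countries, years = [f"c{i}" for i in range(37)], range(1996, 2005)
rows = [(c, y) for c in countries for y in years]
df = pd.DataFrame(rows, columns=["country", "year"])
df["ict_capital"] = rng.normal(size=len(df))
df["foreign_ict"] = rng.normal(size=len(df))          # spillover channel
df["productivity_growth"] = (0.3 * df["ict_capital"]
                             + 0.15 * df["foreign_ict"]
                             + rng.normal(scale=0.5, size=len(df)))

fe = smf.ols("productivity_growth ~ ict_capital + foreign_ict + C(country) + C(year)",
             data=df).fit()
print(fe.params[["ict_capital", "foreign_ict"]])  # a positive foreign_ict term
                                                  # is consistent with spillovers
```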

Relevance:

40.00%

Publisher:

Abstract:

Services-led competitive strategies are critically important to Western manufacturers. This paper contributes to our basic knowledge of such strategies by examining the enabling information and communication technologies that successfully servitized manufacturers appear to be adopting. Although these are preliminary findings from a longer-term research programme, through this paper we seek to offer immediate assistance to manufacturers who wish to understand how they might exploit the servitization movement.

Relevance:

40.00%

Publisher:

Abstract:

Services-led competitive strategies are critically important to western manufacturers. This paper contributes to our foundational knowledge of such strategies by examining the enabling information and communication technologies that successfully servitized manufacturers appear to be adopting. Although these are preliminary findings from a longer-term research programme, through this article we seek to offer immediate assistance to manufacturers who wish to understand how they might exploit the servitization movement.

Relevance:

30.00%

Publisher:

Abstract:

This paper applies Latour’s 1992 translation map as a device to explore the development of, and recent conflict between, two data standards for the exchange of business information – EDIFACT and XBRL. Our research is focussed on France, where EDIFACT is well established and XBRL is just emerging. The alliances supporting both standards are local and global. The French/European EDIFACT is promulgated through the United Nations, while a consortium of national jurisdictions and companies has coalesced around the US-initiated XBRL International (XII). We suggest cultural differences pose a barrier to co-operation between the two networks. Competing data standards create the risk of switching costs. The different technical characteristics of the standards are identified as raising implications for regulators and users. A key concern is the lack of co-ordination of data standard production and the mechanisms regulatory agencies use to choose platforms for electronic data submission.

Relevance:

30.00%

Publisher:

Abstract:

Dementia is one of the greatest contemporary health and social care challenges, and novel approaches to the care of its sufferers are needed. New information and communication technologies (ICT) have the potential to assist those caring for people with dementia, through access to networked information and support, tracking and surveillance. This article reports the views of 34 carers of people with dementia about such new technologies. We also held a group discussion with nine carers for respondent validation. The carers' actual use of new ICT was limited, although they thought a gradual increase in the use of networked technology in dementia care was inevitable but would bypass some carers who saw themselves as too old. Carers expressed a general enthusiasm for the benefits of ICT, but usually not for themselves, and they identified several key challenges, including: establishing an appropriate balance between privacy and autonomy, on the one hand, and maximising safety, on the other; establishing responsibility for and ownership of the equipment, and who bears the costs; the possibility that technological help would mean a loss of valued personal contact; and the possibility that technology would substitute for existing services rather than complement them. For carers and dementia sufferers to be supported, the expanding use of these technologies should be accompanied by intensive debate of the associated issues.

Relevance:

30.00%

Publisher:

Abstract:

The IRDS standard is an international standard produced by the International Organisation for Standardisation (ISO). In this work the process for producing standards in formal standards organisations, for example the ISO, and in more informal bodies, for example the Object Management Group (OMG), is examined. This thesis examines previous models and classifications of standards; these are then combined to produce a new classification. The IRDS standard is then placed in a class in the new model as a reference anticipatory standard. Anticipatory standards are standards which are developed ahead of the technology in order to attempt to guide the market. The diffusion of the IRDS is traced over a period of eleven years. The economic conditions which affect the diffusion of standards are examined, particularly the economic conditions which prevail in compatibility markets such as the IT and ICT markets. Additionally, the consequences of the introduction of gateway or converter devices into a market where a standard has not yet been established are examined. The IRDS standard did not have an installed base and this hindered its diffusion. The thesis concludes that the IRDS standard was overtaken by new developments such as object-oriented technologies and middleware. This was partly because of the slow process of developing standards in traditional organisations, which operate on a consensus basis, and partly because the IRDS standard did not have an installed base. Also, the rise and proliferation of middleware products resulted in exchange mechanisms, rather than repository solutions, becoming dominant. The research method used in this work is a longitudinal study of the development and diffusion of the ISO/IEC IRDS standard. The research is regarded as a single case study and follows an interpretative epistemological point of view.

Relevance:

30.00%

Publisher:

Abstract:

The thesis reports a study into the effect upon organisations of co-operative information systems (CIS) incorporating flexible communications, group support and group working technologies. A review of the literature leads to the development of a model of effect based upon co-operative business tasks. CIS have the potential to change how co-operative business tasks are carried out, and their principal effect (or performance) may therefore be evaluated by determining to what extent they are being employed to perform these tasks. A significant feature of CIS use identified is the extent to which they may be designed to fulfil particular tasks or, by contrast, may be applied creatively by users in an emergent fashion to perform tasks. A research instrument is developed using a survey questionnaire to elicit users' judgements of the extent to which a CIS is employed to fulfil a range of co-operative tasks. This research instrument is applied to a longitudinal study of Novell GroupWise introduction at Northamptonshire County Council, during which qualitative as well as quantitative data were gathered. A method of analysis of questionnaire results using principles from fuzzy mathematics and artificial intelligence is developed and demonstrated. Conclusions from the longitudinal study include the importance of early experiences in setting patterns of use for CIS, the persistence of patterns of use over time, and the dominance of designed usage of the technology over emergent use.
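
One way fuzzy-set principles can be used to summarise users' judgements of the extent to which a CIS fulfils a task is sketched below. The triangular membership functions and linguistic labels are illustrative assumptions, not the analysis method actually developed in the thesis.

```python
# Sketch of fuzzy aggregation of 1-5 questionnaire judgements. The membership
# functions and labels are assumptions for illustration only.
import numpy as np

# Triangular membership functions over a 1-5 judgement scale: (left, peak, right).
FUZZY_SETS = {"low": (1, 1, 3), "medium": (2, 3, 4), "high": (3, 5, 5)}

def membership(x, abc):
    """Degree to which judgement x belongs to a triangular fuzzy set."""
    a, b, c = abc
    if x <= a or x >= c:
        return 1.0 if x == b else 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def aggregate(judgements):
    """Average membership of a set of 1-5 judgements in each linguistic label."""
    return {label: float(np.mean([membership(j, abc) for j in judgements]))
            for label, abc in FUZZY_SETS.items()}

# Judgements from several users on how far the CIS is used for one task.
print(aggregate([4, 5, 3, 4, 2]))   # e.g. {'low': ..., 'medium': ..., 'high': ...}
```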

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes the development of an operational river basin water resources information management system. The river or drainage basin is the fundamental unit of the system, both in the modelling and prediction of hydrological processes and in the monitoring of the effect of catchment management policies. A primary concern of the study is the collection of sufficient, and sufficiently accurate, information to model hydrological processes. Remote sensing, in combination with conventional point-source measurement, can be a valuable source of information, but is often overlooked by hydrologists due to the cost of acquisition and processing. This thesis describes a number of cost-effective methods of acquiring remotely sensed imagery, from airborne video survey to real-time ingestion of meteorological satellite data. Inexpensive micro-computer systems and peripherals are used throughout to process and manipulate the data. Spatial information systems provide a means of integrating these data with topographic and thematic cartographic data, and historical records. For the system to have any real potential, the data must be stored in a readily accessible format and be easily manipulated within the database. The design of efficient man-machine interfaces and the use of software engineering methodologies are therefore included in this thesis as a major part of the design of the system. The use of low-cost technologies, from micro-computers to video cameras, enables the introduction of water resources information management systems into developing countries, where the potential benefits are greatest.