27 results for Software Development
Abstract:
In analysing manufacturing systems, for either design or operational reasons, failure to account for the potentially significant dynamics could produce invalid results. There are many analysis techniques that can be used; however, simulation is unique in its ability to assess detailed, dynamic behaviour. The use of simulation to analyse manufacturing systems would therefore seem appropriate if not essential. Many simulation software products are available, but their ease of use and scope of application vary greatly. This is illustrated at one extreme by simulators, which offer rapid but limited application, and at the other by simulation languages, which are extremely flexible but tedious to code. Given that a typical manufacturing engineer does not possess in-depth programming and simulation skills, the use of simulators rather than simulation languages would seem the more appropriate choice. Whilst simulators offer ease of use, their limited functionality may preclude their use in many applications. The construction of current simulators makes it difficult to amend or extend the functionality of the system to meet new challenges. Some simulators could even become obsolete as users demand modelling functionality that reflects the latest manufacturing system design and operation concepts. This thesis examines the deficiencies in current simulation tools and considers whether they can be overcome by the application of object-oriented principles. Object-oriented techniques have gained in popularity in recent years and are seen as having the potential to overcome many of the problems traditionally associated with software construction. There are a number of key concepts that are exploited in the work described in this thesis: the use of object-oriented techniques to act as a framework for abstracting engineering concepts into a simulation tool, and the ability to reuse and extend object-oriented software. It is argued that current object-oriented simulation tools are deficient and that, in designing such tools, object-oriented techniques should be used not just for the creation of individual simulation objects but for the creation of the complete software. This results in the ability to construct an easy-to-use simulator that is not limited by its initial functionality. The thesis presents the design of an object-oriented data-driven simulator which can be freely extended. Discussion and work is focused on discrete parts manufacture. The system developed retains the ease of use typical of data-driven simulators whilst removing any limitation on its potential range of applications. Reference is given to additions made to the simulator by other developers not involved in the original software development. Particular emphasis is put on the requirements of the manufacturing engineer and the need for the engineer to carry out dynamic evaluations.
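The extensibility claim above can be made concrete with a small sketch. The Python fragment below is purely illustrative (it is not the thesis's actual design, and all class, registry and parameter names are invented): it shows one way a simulator can stay data-driven and easy to use while letting developers outside the original team add new resource types, by registering classes against the type names that appear in model data.

```python
# Illustrative sketch only (not the thesis's design): a data-driven simulator
# core that outside developers can extend by registering new resource classes,
# so added functionality requires no changes to the core.

class Resource:
    """Base class for simulated manufacturing resources."""
    def __init__(self, name, **params):
        self.name = name
        self.params = params

    def process(self, part, now):
        """Return the simulation time at which 'part' leaves this resource."""
        return now + self.params.get("cycle_time", 1.0)

RESOURCE_TYPES = {}  # maps type names used in model data files to classes

def register(type_name):
    """Decorator: make a resource class available to data-driven models."""
    def wrap(cls):
        RESOURCE_TYPES[type_name] = cls
        return cls
    return wrap

@register("machine")
class Machine(Resource):
    pass

# An extension added later, by a developer outside the original team:
@register("conveyor")
class Conveyor(Resource):
    def process(self, part, now):
        return now + self.params["length"] / self.params["speed"]

def build_model(records):
    """Instantiate resources from declarative model data."""
    return [RESOURCE_TYPES[r["type"]](r["name"], **r["params"]) for r in records]

model = build_model([
    {"type": "machine", "name": "M1", "params": {"cycle_time": 2.5}},
    {"type": "conveyor", "name": "C1", "params": {"length": 12.0, "speed": 1.5}},
])
print([res.process(part="p1", now=0.0) for res in model])  # [2.5, 8.0]
```

Note that adding Conveyor changes nothing in the core; only the registry grows, which mirrors the property claimed above of a simulator "not limited by its initial functionality".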
Abstract:
This paper explores the role of transactive memory in enabling knowledge transfer between globally distributed teams. While the information systems literature has recently acknowledged the role transactive memory plays in improving knowledge processes and performance in colocated teams, little is known about its contribution to distributed teams. To contribute to filling this gap, knowledge-transfer challenges and processes between onsite and offshore teams were studied at TATA Consultancy Services. In particular, the paper describes the transfer of knowledge between onsite and offshore teams through encoding, storing and retrieving processes. An in-depth case study of globally distributed software development projects was carried out, and a qualitative, interpretive approach was adopted. The analysis of the case suggests that in order to overcome differences derived from the local contexts of the onsite and offshore teams (e.g. different work routines, methodologies and skills), some specific mechanisms supporting the development of codified and personalized ‘directories’ were introduced. These include the standardization of templates and methodologies across the remote sites as well as frequent teleconferencing sessions and occasional short visits. These mechanisms contributed to the development of the notion of ‘who knows what’ across onsite and offshore teams despite the challenges associated with globally distributed teams, and supported the transfer of knowledge between onsite and offshore teams. The paper concludes by offering theoretical and practical implications.
Abstract:
This book contains 11 carefully revised and selected papers from the 5th Workshop on Global Sourcing, held in Courchevel, France, March 14-17, 2011. They have been gleaned from a vast empirical base brought together by leading researchers in information systems, strategic management, and operations. This volume is intended for use by students, academics, and practitioners interested in the outsourcing and offshoring of information technology and business processes. It offers a review of the key topics in outsourcing and offshoring, populated with practical frameworks that serve as a tool kit for students and managers. The topics discussed combine theoretical and practical insights, and they are extensively illustrated by case studies from client and vendor organizations. Last but not least, the book examines current and future trends in outsourcing and offshoring, paying particular attention to how innovation can be realized in global or outsourced software development environments.
Abstract:
This edited book is intended for use by students, academics and practitioners who take an interest in the outsourcing and offshoring of information technology and business processes. The book offers a review of the key topics in outsourcing and offshoring, populated with practical frameworks that serve as a tool kit for students and managers. The range of topics covered in this book is wide and diverse. Various governance and coordination mechanisms for managing outsourcing relationships are discussed in great depth, and the decision-making processes and considerations regarding sourcing arrangements, including multi-sourcing and cloud services, are examined. Vendors’ capabilities for managing global software development are studied in depth. Clients’ capabilities and issues related to compliance and culture are also discussed in association with various sourcing models. Topics discussed in this book combine theoretical and practical insights regarding challenges that both clients and vendors face. Case studies from client and vendor organizations are used extensively throughout the book. Last but not least, the book examines current and future trends in outsourcing and offshoring, paying particular attention to the centrality of innovation in sourcing arrangements and how innovation can be realized in outsourcing. The book is based on a vast empirical base brought together through years of extensive research by leading researchers in information systems, strategic management and operations.
Abstract:
Technological advancements enable new sourcing models in software development such as cloud computing, software-as-a-service, and crowdsourcing. While the first two are perceived as a re-emergence of older models (e.g., ASP), crowdsourcing is a new model that creates an opportunity for a global workforce to compete with established service providers. Organizations engaging in crowdsourcing need to develop the capabilities to successfully utilize this sourcing model in delivering services to their clients. To explore these capabilities we collected qualitative data from focus groups with crowdsourcing leaders at a large technology organization. New capabilities we identified stem from the need of the traditional service provider to assume a "client" role in the crowdsourcing context, while still acting as a "vendor" in providing services to the end client. This paper expands the research on vendor capabilities and IS outsourcing as well as offers important insights to organizations that are experimenting with, or considering, crowdsourcing.
Abstract:
Magnification can be provided to assist those with visual impairment to make the best use of remaining vision. Electronic transverse magnification of an object was first conceived for use in low vision in the late 1950s, but has developed slowly and is not extensively prescribed because of its relatively high cost and lack of portability. Electronic devices providing transverse magnification have been termed closed-circuit televisions (CCTVs) because of the direct cable link between the camera imaging system and monitor viewing system, but this description generally refers to surveillance devices and does not indicate the provision of features such as magnification and contrast enhancement. Therefore, the term Electronic Vision Enhancement Systems (EVES) is proposed to better distinguish and describe such devices. This paper reviews current knowledge on EVES for the visually impaired in terms of: classification; hardware and software (development of technology, magnification and field-of-view, contrast and image enhancement); user aspects (users and usage, reading speed and duration, and training); and potential future development of EVES. © 2003 The College of Optometrists.
Abstract:
AOSD'03 Practitioner Report. Performance analysis is motivated as an ideal domain for benefiting from the application of Aspect-Oriented (AO) technology. The experience of a ten-week project to apply AO to the performance analysis domain is described. We show how all phases of a performance analyst's activities – initial profiling, problem identification, problem analysis and solution exploration – were candidates for AO technology assistance, some being addressed with more success than others. A Profiling Workbench is described that leverages the capabilities of AspectJ and delivers unique capabilities into the hands of developers exploring caching opportunities.
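Since the workbench builds on AspectJ, a Java extension, any faithful example would be written in AspectJ; the hypothetical Python sketch below is a deliberate stand-in that illustrates the same underlying idea, using a decorator as a crude "aspect" that weaves timing advice around existing functions without editing their bodies – the kind of initial profiling step described above.

```python
# Hedged illustration of the AO idea (a Python decorator standing in for an
# AspectJ aspect): timing "advice" is woven around existing functions without
# modifying their bodies.
import functools
import time

profile_log = []  # (function name, elapsed seconds) records

def profiled(fn):
    """Advice: record wall-clock time around each call to fn."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            profile_log.append((fn.__name__, time.perf_counter() - start))
    return wrapper

@profiled
def expensive_lookup(key):
    time.sleep(0.01)  # stand-in for real work worth caching
    return hash(key)

expensive_lookup("a")
expensive_lookup("a")
for name, elapsed in profile_log:
    print(f"{name}: {elapsed * 1000:.1f} ms")
```

Repeated calls with identical arguments showing up in such a log are exactly the caching opportunities the report's developers were exploring.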
Abstract:
The programme of research examines knowledge workers, their relationships with organisations, and perceptions of management practices through the development of a theoretical model and knowledge worker archetypes. Knowledge worker and non-knowledge worker archetypes were established through an analysis of the extant literature. After an exploratory study of knowledge workers in a small software development company, the archetypes were refined to include occupational classification data and the findings from Study 1. The Knowledge Worker Characteristics Model (KWCM) was developed as a theoretical framework in order to analyse differences between the two archetypes within the IT sector. The KWCM comprises the variables within the job characteristics model, creativity, goal orientation, identification and commitment. In Study 2, a global web-based survey was conducted. There were insufficient non-knowledge worker responses, and therefore a cluster analysis was conducted to interrogate the archetypes further. This demonstrated, unexpectedly, that there were marked differences within the knowledge worker archetype, suggesting the need to granulate it further. The theoretical framework and the archetypes were revised (as programmers and web developers) and the research study was refocused to examine occupational differences within knowledge work. Findings from Study 2 identified that there were significant differences between the archetypes in relation to the KWCM. Nineteen semi-structured interviews were conducted in Study 3 in order to deepen the analysis using qualitative data and to examine perceptions of people management practices. The findings from both studies demonstrate that there were significant differences between the two groups, but also that job challenge, problem solving, intrinsic reward and team identification were of importance to both groups of knowledge workers. This thesis presents an examination of knowledge workers’ perceptions of work, organisations and people management practices in the granulation and differentiation of occupational archetypes.
Abstract:
Objectives: To develop a decision support system (DSS), myGRaCE, that integrates service user (SU) and practitioner expertise about mental health and associated risks of suicide, self-harm, harm to others, self-neglect, and vulnerability. The intention is to help SUs assess and manage their own mental health collaboratively with practitioners. Methods: An iterative process involving interviews, focus groups, and agile software development with 115 SUs, to elicit and implement myGRaCE requirements. Results: Findings highlight shared understanding of mental health risk between SUs and practitioners that can be integrated within a single model. However, important differences were revealed in SUs' preferred process of assessing risks and safety, which are reflected in the distinctive interface, navigation, tool functionality and language developed for myGRaCE. A challenge was how to provide flexible access without overwhelming and confusing users. Conclusion: The methods show that practitioner expertise can be reformulated in a format that simultaneously captures SU expertise, to provide a tool highly valued by SUs. A stepped process adds necessary structure to the assessment, each step with its own feedback and guidance. Practice Implications: The GRiST web-based DSS (www.egrist.org) links and integrates myGRaCE self-assessments with GRiST practitioner assessments for supporting collaborative and self-managed healthcare.
Abstract:
One of the major challenges in measuring efficiency in terms of resources and outcomes is the assessment of the evolution of units over time. Although Data Envelopment Analysis (DEA) has been applied for time series datasets, DEA models, by construction, form the reference set for inefficient units (lambda values) based on their distance from the efficient frontier, that is, in a spatial manner. However, when dealing with temporal datasets, the proximity in time between units should also be taken into account, since it reflects the structural resemblance among time periods of a unit that evolves. In this paper, we propose a two-stage spatiotemporal DEA approach, which captures both the spatial and temporal dimension through a multi-objective programming model. In the first stage, DEA is solved iteratively extracting for each unit only previous DMUs as peers in its reference set. In the second stage, the lambda values derived from the first stage are fed to a Multiobjective Mixed Integer Linear Programming model, which filters peers in the reference set based on weights assigned to the spatial and temporal dimension. The approach is demonstrated on a real-world example drawn from software development.
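The first stage lends itself to a short worked example. The Python sketch below is a reconstruction under assumptions the abstract does not fix (an input-oriented, constant-returns-to-scale envelopment model is assumed, the unit itself is kept in its own reference set so the programme stays feasible, and the data are invented): each DMU is scored only against DMUs observed at its own or earlier periods.

```python
# Minimal sketch (my reconstruction, not the authors' implementation) of the
# first stage: input-oriented DEA where the reference set for the DMU observed
# at period t contains only DMUs from periods <= t.
import numpy as np
from scipy.optimize import linprog

def temporal_dea_score(X, Y, periods, o):
    """Efficiency of DMU o with peers restricted to its own and earlier periods."""
    peers = [j for j in range(len(periods)) if periods[j] <= periods[o]]
    n, m, s = len(peers), X.shape[1], Y.shape[1]
    # Decision vector: [theta, lambda_1..lambda_n]; minimise theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Input rows:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X[peers].T])
    # Output rows: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y[peers].T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # theta in (0, 1]; 1 means efficient against past peers

# Toy data: one software project observed over four periods (1 input, 1 output).
X = np.array([[4.0], [5.0], [3.0], [6.0]])    # e.g. person-months
Y = np.array([[8.0], [12.0], [9.0], [15.0]])  # e.g. delivered features
periods = [1, 2, 3, 4]
print([round(temporal_dea_score(X, Y, periods, o), 3) for o in range(4)])
# [1.0, 1.0, 1.0, 0.833]
```

The lambda values recovered from such programmes are what the paper's second-stage multi-objective MILP then filters, weighting spatial proximity to the frontier against temporal proximity between periods.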
Abstract:
Two classes of software that are notoriously difficult to develop on their own are rapidly merging into one. This will affect every key service that we rely upon in modern society, yet a successful merge is unlikely to be achievable using software development techniques specific to either class. This paper explains the growing demand for software capable of both self-adaptation and high integrity, and advocates the use of a collection of "@runtime" techniques for its development, operation and management. We summarise early research into the development of such techniques, and discuss the remaining work required to overcome the great challenge of self-adaptive high-integrity software. © 2011 ACM.
Abstract:
This paper investigates how existing software engineering techniques can be employed, adapted and integrated for the development of systems of systems. Starting from existing system-of-systems (SoS) studies, we identify computing paradigms and techniques that have the potential to help address the challenges associated with SoS development, and propose an SoS development framework that combines these techniques in a novel way. This framework addresses the development of a class of IT systems of systems characterised by high variability in the types of interactions between their component systems, and by relatively small numbers of such interactions. We describe how the framework supports the dynamic, automated generation of the system interfaces required to achieve these interactions, and present a case study illustrating the development of a data-centre SoS using the new framework.
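As a loose illustration of what "dynamic, automated generation of system interfaces" can look like in code (the paper's actual mechanism is not given in this abstract, and every name below is invented), the Python sketch builds an adapter class at run time from a declared interaction, so two component systems can interact without hand-written glue code.

```python
# Purely illustrative sketch (not the paper's framework): generating a thin
# adapter interface at run time from a declared interaction between two
# component systems of an SoS.
def make_adapter(interaction_name, operations):
    """Build an adapter class whose methods forward the named operations
    to whatever backend a component system supplies."""
    def forward(op):
        def method(self, *args, **kwargs):
            return self.backend.invoke(op, *args, **kwargs)
        method.__name__ = op
        return method
    attrs = {op: forward(op) for op in operations}
    attrs["__init__"] = lambda self, backend: setattr(self, "backend", backend)
    return type(interaction_name + "Adapter", (object,), attrs)

class PrintingBackend:
    """Stand-in for a real component system's invocation endpoint."""
    def invoke(self, op, *args, **kwargs):
        print(f"forwarding {op} with {args}")

# A declared interaction between, say, a monitoring system and a data centre:
MonitoringAdapter = make_adapter("Monitoring", ["read_load", "set_alarm"])
adapter = MonitoringAdapter(PrintingBackend())
adapter.read_load("rack-7")
```

Generating such adapters on demand rather than coding them by hand suits the class of SoS the paper targets: many possible interaction types, but relatively few actual interactions.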