917 results for Workshop papers
Abstract:
This book contains 13 papers from the 7th Workshop on Global Sourcing, held in Val d'Isère, France, on March 11-14, 2013, which were carefully reviewed and selected from 40 submissions. They are based on a vast empirical base brought together by leading researchers in information systems, strategic management, and operations. This volume is intended for students, academics, and practitioners interested in research results and experiences on outsourcing and offshoring of information technology and business processes. The topics discussed represent both client and supplier perspectives on the sourcing of global services, combine theoretical and practical insights regarding the challenges that both clients and vendors face, and include case studies from client and vendor organizations.
Abstract:
This edited book is intended for use by students, academics, and practitioners who take an interest in the outsourcing and offshoring of information technology and business services and processes. The book offers a review of the key topics in outsourcing and offshoring, populated with practical frameworks that serve as a tool kit for practitioners, academics, and students. The range of topics covered is wide and diverse, and represents both client and supplier perspectives on the sourcing of global services. Various aspects related to the decision-making process (e.g., asset transfer), learning mechanisms, and organizational practices for managing outsourcing relationships are discussed in depth. Contemporary sourcing models, including cloud services, are examined. Client dependency on the outsourcing provider and social aspects, such as identity, are discussed in detail. Furthermore, resistance to outsourcing and outsourcing failures are investigated to derive lessons on how to avoid them and improve efficiency. The topics discussed combine theoretical and practical insights regarding challenges that both clients and vendors face, and case studies from client and vendor organizations are used extensively throughout the book. Last but not least, the book examines current and future trends in outsourcing and offshoring, paying particular attention to the centrality of innovation in sourcing arrangements and to how innovation can be realized in outsourcing. The book is based on a vast empirical base brought together through years of extensive research by leading researchers in information systems, strategic management, and operations.
Abstract:
This book contains 11 carefully revised and selected papers from the 5th Workshop on Global Sourcing, held in Courchevel, France, March 14-17, 2011. They have been gleaned from a vast empirical base brought together by leading researchers in information systems, strategic management, and operations. This volume is intended for use by students, academics, and practitioners interested in the outsourcing and offshoring of information technology and business processes. It offers a review of the key topics in outsourcing and offshoring, populated with practical frameworks that serve as a tool kit for students and managers. The topics discussed combine theoretical and practical insights, and they are extensively illustrated by case studies from client and vendor organizations. Last but not least, the book examines current and future trends in outsourcing and offshoring, paying particular attention to how innovation can be realized in global or outsourced software development environments.
Abstract:
This book constitutes the revised selected papers from the 10th Global Sourcing Workshop, held in Val d'Isère, France, in February 2016. The 11 papers presented in this volume were carefully reviewed and selected from 47 submissions. The book offers a review of the key topics in the outsourcing and offshoring of information technology and business services, along with practical frameworks that serve as a tool kit for students and managers. The range of topics covered is wide and diverse, but predominantly focused on how to achieve success in shared services and outsourcing. More specifically, the book examines outsourcing decisions and management practices, giving specific attention to shared services, which have become one of the dominant sourcing models. The topics discussed combine theoretical and practical insights regarding challenges that industry leaders, policy makers, and professionals face or should be concerned with. Case studies from various organizations, industries, and countries, such as the UK, Italy, the Netherlands, Canada, Australia, and Denmark, complete the book.
Abstract:
The trypanosome evolution workshop, a joint meeting of the University of Exeter and the London School of Hygiene and Tropical Medicine, focused on topics relating to trypanosomatid and vector evolution. The meeting, sponsored by The Wellcome Trust, the Special Programme for Research and Training in Tropical Diseases of the World Health Organization, and the British Section of the Society of Protozoologists, brought together an international group of experts who presented papers on a wide range of topics, including parasite and vector phylogenies, molecular methodology, and relevant biogeographical data.
Abstract:
With this document, we provide a compilation of in-depth discussions of some of the most current security issues in distributed systems. The six contributions were collected and presented at the 1st Kassel Student Workshop on Security in Distributed Systems (KaSWoSDS’08). We are pleased to present a collection of papers that not only shed light on the theoretical aspects of their topics but are also accompanied by elaborate practical examples. In Chapter 1, Stephan Opfer discusses viruses, one of the oldest threats to system security; for years there has been an arms race between virus producers and anti-virus software providers, with no end in sight. In Chapter 2, Stefan Triller demonstrates how malicious code can be injected into a target process using a buffer overflow. Websites usually store their data and user information in databases, and, as with buffer overflows, unwary programmers leave open the possibility of SQL injection attacks targeting such databases; Stephan Scheuermann gives us a deeper insight into the mechanisms behind such attacks in Chapter 3. Cross-site scripting (XSS) is a method of inserting malicious code into websites viewed by other users, for example in order to spy on the data of Internet users; Michael Blumenstein explains this issue in Chapter 4. Spoofing subsumes all methods that directly involve taking on a false identity; in Chapter 5, Till Amma shows us different ways this can be done and how it can be prevented. Last but not least, cryptographic methods are used to encode confidential data in such a way that even if it falls into the wrong hands, the culprits cannot decode it. Over the centuries, many different ciphers have been developed, applied, and finally broken; Ilhan Glogic sketches this history in Chapter 6.
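As a minimal illustration of the SQL injection issue mentioned above (a hypothetical Python sketch added here, not code from the workshop papers; the table, column names, and credentials are invented), the snippet below contrasts a query built by string concatenation with a parameterized one:

```python
import sqlite3

# Invented example table and credentials, set up only for the demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login_vulnerable(name, password):
    # Unsafe: user input is concatenated directly into the SQL string,
    # so input such as  ' OR '1'='1  disables the password check.
    query = ("SELECT name FROM users WHERE name = '" + name +
             "' AND password = '" + password + "'")
    return conn.execute(query).fetchall()

def login_parameterized(name, password):
    # Safer: placeholders make the driver treat the input strictly as data.
    query = "SELECT name FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

print(login_vulnerable("alice", "' OR '1'='1"))     # [('alice',)] despite the wrong password
print(login_parameterized("alice", "' OR '1'='1"))  # []
```

The same principle, treating untrusted input as data rather than code, also underlies common defences against the XSS attacks discussed in Chapter 4.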
Abstract:
Recently, major processor manufacturers have announced a dramatic shift in their paradigm for increasing computing power over the coming years. Instead of focusing on faster clock speeds and more powerful single-core CPUs, the trend clearly goes towards multi-core systems. This will also result in a paradigm shift for the development of algorithms for computationally expensive tasks, such as data mining applications. Obviously, work on parallel algorithms is not new per se, but concentrated efforts in the many application domains are still missing. Multi-core systems, as well as clusters of workstations and even large-scale distributed computing infrastructures, provide new opportunities and pose new challenges for the design of parallel and distributed algorithms. Since data mining and machine learning systems rely on high-performance computing, research on the corresponding algorithms must be at the forefront of parallel algorithm research in order to keep pushing data mining and machine learning applications to be more powerful and, especially for the former, interactive. To bring together researchers and practitioners working in this exciting field, a workshop on parallel data mining was organized as part of PKDD/ECML 2006 (Berlin, Germany). The six contributions selected for the program describe various aspects of data mining and machine learning approaches featuring low to high degrees of parallelism. The first contribution addresses the classic problem of distributed association rule mining and focuses on communication efficiency to improve the state of the art. After this, a parallelization technique for speeding up decision tree construction by means of thread-level parallelism for shared-memory systems is presented. The next paper discusses the design of a parallel approach to the frequent subgraph mining problem for distributed-memory systems; this approach is based on a hierarchical communication topology to solve issues related to multi-domain computational environments. The fourth paper describes the combined use and customization of software packages to facilitate top-down parallelism in the tuning of Support Vector Machines (SVMs), and the next contribution presents an interesting idea concerning the parallel training of Conditional Random Fields (CRFs) and motivates their use in labeling sequential data. The last contribution focuses on highly efficient feature selection, describing a parallel algorithm for feature selection from random subsets. Selecting the papers included in this volume would not have been possible without the help of an international Program Committee that provided detailed reviews for each paper. We would also like to thank Matthew Otey, who helped with publicity for the workshop.
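To give a concrete flavour of the last of these approaches, here is a hypothetical Python sketch (not taken from any of the workshop contributions; all names, parameters, and the scoring criterion are invented) of feature selection from random subsets, in which candidate subsets are scored independently and can therefore be evaluated in parallel across worker processes:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

# Synthetic data: only the first three of twenty features carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = X[:, 0] + 0.5 * X[:, 1] - 0.5 * X[:, 2] + 0.1 * rng.normal(size=500)

def score_subset(subset):
    """Score a candidate feature subset by the R^2 of a least-squares fit."""
    Xs = X[:, list(subset)]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    residual = y - Xs @ coef
    return 1.0 - residual.var() / y.var(), subset

def random_subsets(n_subsets, n_features, subset_size, seed=1):
    # Draw candidate subsets at random, without replacement within a subset.
    r = np.random.default_rng(seed)
    return [tuple(sorted(int(i) for i in r.choice(n_features, subset_size, replace=False)))
            for _ in range(n_subsets)]

if __name__ == "__main__":
    candidates = random_subsets(200, X.shape[1], 5)
    # Each subset is scored independently, so the evaluations run in
    # parallel across worker processes with no coordination.
    with ProcessPoolExecutor() as pool:
        best_score, best_subset = max(pool.map(score_subset, candidates))
    print(f"best subset {best_subset} with R^2 = {best_score:.3f}")
```

Because the candidate evaluations share no state, the sketch scales with the number of available cores; the actual workshop contribution will of course differ in both the scoring criterion and the parallel infrastructure.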
Abstract:
Rising sea level is perhaps the most severe consequence of climate warming, as much of the world’s population and infrastructure is located near current sea level (Lemke et al. 2007). A major rise of a metre or more would cause serious problems. Such possibilities have been suggested by Hansen and Sato (2011), who pointed out that sea level was several metres higher than now during the Holsteinian and Eemian interglacials (about 250,000 and 120,000 years ago, respectively), even though the global temperature was then only slightly higher than it is nowadays. It is consequently of the utmost importance to determine whether such a sea level rise could occur and, if so, how fast it might happen. Sea level undergoes considerable changes due to natural processes such as wind, ocean currents, and tidal motions. On longer time scales, sea level is influenced by steric effects (sea water expansion caused by temperature and salinity changes of the ocean) and by eustatic effects caused by changes in ocean mass. Changes in the Earth’s cryosphere, such as the retreat or expansion of glaciers and land ice areas, have been the dominant cause of sea level change during the Earth’s recent history. During the glacial cycles of the last million years, the sea level varied by a large amount, of the order of 100 m. If the Earth’s cryosphere were to disappear completely, the sea level would rise by some 65 m. The scientific papers in the present volume address the different aspects of the Earth’s cryosphere and how changes in the cryosphere affect sea level change. It represents the outcome of the first workshop held within the new ISSI Earth Science Programme. The workshop took place from 22 to 26 March 2010 in Bern, Switzerland, with the objective of providing an in-depth insight into the future of mountain glaciers and the large land ice areas of Antarctica and Greenland, which are exposed to natural and anthropogenic climate influences, and their effects on sea level change. The participants of the workshop are experts in different fields, including meteorology, climatology, oceanography, glaciology, and geodesy; they use advanced space-based observational studies and state-of-the-art numerical modelling.
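As a rough, back-of-the-envelope check on the 65 m figure (an approximation added here for illustration, not a formula from the volume), the eustatic sea-level equivalent of the land ice can be estimated by spreading its meltwater over the ocean surface:

$$
\Delta h_{\text{eustatic}} \approx \frac{\rho_{\text{ice}}\, V_{\text{ice}}}{\rho_{\text{ocean}}\, A_{\text{ocean}}},
$$

where $V_{\text{ice}}$ is the volume of grounded land ice, $\rho_{\text{ice}} \approx 917\ \mathrm{kg\,m^{-3}}$, $\rho_{\text{ocean}} \approx 1027\ \mathrm{kg\,m^{-3}}$, and $A_{\text{ocean}} \approx 3.6 \times 10^{14}\ \mathrm{m^{2}}$ is the ocean surface area. Inserting published estimates of the ice volumes of Antarctica, Greenland and the mountain glaciers, and excluding ice already grounded below sea level, recovers a figure close to the 65 m quoted above; the estimate neglects changes in ocean area and glacio-isostatic adjustment.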
Abstract:
Research must be published; otherwise it will be lost. The most important papers for a researcher to produce are those published in international refereed journals. Good practice in writing papers is something that can be learned. The editorial process involves sending submitted papers to independent experts in the field, usually anonymously, and their comments inform the editor, who decides whether and how to progress with a paper. Much of this is as obscure to experienced researchers as it is to new ones. With forethought and planning, the success rate of getting submitted papers accepted for publication can be increased. Editors and publishers are generally very keen to help people improve their success rate.