889 results for Implementation Process of Licensing


Relevance:

100.00%

Publisher:

Abstract:

The paper presents a process of cellulose thermal degradation with bio-hydrogen generation and zinc nanostructure synthesis. Production of zinc nanowires and zinc nanoflowers was performed by a novel process based on cellulose pyrolysis, volatiles reforming and direct reduction of ZnO. The bio-hydrogen generated in situ promoted the reduction of ZnO, with Zn nanostructures formed by a vapor–solid (VS) route. The cellulose and cellulose/ZnO samples were characterized by thermal analyses (TG/DTG/DTA) and the evolved gases were analyzed by FTIR spectroscopy (TG/FTIR). Hydrogen was detected by TPR (Temperature Programmed Reaction) tests. The results showed that, in the presence of ZnO, the thermal degradation of cellulose produced larger amounts of H2 than pure cellulose. The process was also carried out in a tubular furnace under an N2 atmosphere, at temperatures up to 900 °C and at different heating rates. The nanostructure growth was catalyst-free, required no pressure reduction, and occurred at temperatures lower than those required for the carbothermal reduction of ZnO with fossil carbon. The nanostructures were investigated by X-ray diffraction (XRD), scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDS) and transmission electron microscopy (TEM). The optical properties were investigated by photoluminescence (PL). A mechanism is proposed to explain the synthesis of zinc nanostructures that are crystalline, were obtained without significant re-oxidation, and whose morphologies depend on the heating rate of the process. This route has potential as an industrial process, given its simple operating conditions, the low cost of cellulose, and the importance of bio-hydrogen and nanostructured zinc.
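
For reference, the in-situ reduction step described above is commonly summarized by the overall reactions below; the exact stoichiometry is an assumption added here for clarity and is not spelled out in the abstract.

```latex
% Hydrogen-mediated reduction assumed to drive Zn nanostructure growth:
\mathrm{ZnO_{(s)} + H_{2\,(g)} \longrightarrow Zn_{(g)} + H_2O_{(g)}}
% Conventional carbothermal reduction with fossil carbon, for comparison:
\mathrm{ZnO_{(s)} + C_{(s)} \longrightarrow Zn_{(g)} + CO_{(g)}}
```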

Relevance:

100.00%

Publisher:

Abstract:

The issue addressed in this article is whether, and to what extent, a lawyer has an ethical responsibility to pursue implementation of the remedy in institutional reform litigation. Institutional reform litigation refers to cases in which an individual or class of individuals sues a large organization in order to vindicate constitutional or statutory rights. This article is concerned with cases of the "public law" type, such as school desegregation, prisoners' rights and patients' rights cases, although antitrust, reapportionment and bankruptcy cases, inter alia, can also fall under the rubric of institutional reform. The implementation stage of institutional reform litigation arises after an individual or class of individuals prevails at the liability stage, or pursuant to a settlement, and a court orders the defendant organization to change in order to vindicate the plaintiffs' rights. At that point, the defendant organization, whether it be a prison, mental hospital or school district, usually has the burden of implementing the order. One conclusion drawn is that the ethical duty of the lawyer must always be consistent with the lawyer's "special responsibility for the quality of justice."

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, licensing practices have increased in importance and relevance, driving the widespread diffusion of markets for technologies. Firms are shifting from a tactical to a strategic attitude towards licensing, addressing both business- and corporate-level objectives. The Open Innovation paradigm has been embraced: firms rely more and more on collaboration and external sourcing of knowledge. This new model of innovation requires firms to leverage external technologies to unlock the potential of their internal innovative efforts. In this context, a firm's competitive advantage depends both on its ability to recognize available opportunities inside and outside its boundaries and on its readiness to exploit them in order to fuel its innovation process dynamically. Licensing is one of the ways available to firms to reap the advantages associated with an open attitude in technology strategy. From the licensee's point of view, this implies challenging the so-called not-invented-here syndrome, which affects the more traditional firms that emphasize the myth of internal research and development supremacy. It also entails understanding the so-called cognitive constraints that hinder the perfect functioning of markets for technologies and that are associated with the costs of assimilating, integrating and exploiting external knowledge by recipient firms. My thesis aims at shedding light on new and interesting issues associated with in-licensing activities that have been neglected by the literature on licensing and markets for technologies. The reason for this gap is the "perspective bias" affecting the works within this stream of research: with very few notable exceptions, they have generally been concerned with the so-called licensing dilemma of the licensor (whether to license out or to exploit internally the in-house developed technologies), while neglecting the licensee's perspective. In my opinion, this has left room for improving our understanding of the determinants and conditions affecting licensing-in practices. From the licensee's viewpoint, the licensing strategy deals with the search, integration, assimilation and exploitation of external technologies; as such, it lies at the very heart of the firm's technology strategy. Improving our understanding of this strategy is thus required to assess the full implications of in-licensing decisions, as they shape firms' innovation patterns and the evolution of their technological capabilities. It also allows for understanding the cognitive constraints associated with the not-invented-here syndrome. In recognition of that, the aim of my work is to contribute to the theoretical and empirical literature explaining the determinants of the licensee's behavior, by providing a comprehensive theoretical framework as well as ad-hoc conceptual tools to understand and overcome frictions and to ease the achievement of satisfactory technology transfer agreements in the marketplace. To this end, I investigate licensing-in in three different ways, developed in three research papers. In the first work, I investigate the links between licensing and the patterns of firms' technological search diversification, according to the frameworks of the search literature, resource-based theory and the theory of general purpose technologies. In the second paper, which continues where the first one left off, I analyze the new concept of learning-by-licensing, in terms of the development of new knowledge inside licensee firms (e.g. new patents) some years after the acquisition of the license, according to the dynamic capabilities perspective. Finally, in the third study, I deal with the determinants of the remuneration structure of patent licenses (form and amount), and in particular with the role of the upfront fee from the licensee's perspective. To this end, I combine the insights of two theoretical approaches: agency theory and real options theory.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, an ever-increasing degree of automation has been observed in industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products/services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend towards complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottled water or soda and buy boxed products such as food or cigarettes. Another indication of their complexity is that the consortium of machine producers has estimated that there are around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS, from a functional and behavioural point of view, is still attributable to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the activities inherent to the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing diagnostic information in real time, as a support to machine maintenance operations. The facilities that designers can find directly on the market, in terms of software component libraries, in fact provide adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focussing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. In the field of analog and digital control, by contrast, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very "unstructured", way. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been adopting this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years a considerable growth has been observed in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also handle other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, an important improvement to formal verification of logic control, fault diagnosis and fault-tolerant control results derives from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of formal software verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems that should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
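
As a concrete illustration of the Discrete Event Systems viewpoint mentioned above, the sketch below models a hypothetical actuator as a finite automaton and runs two feasibility queries on event traces. All names, states and events are invented for illustration; they are not the Generalized Actuator or Generalized Device components of the thesis.

```python
# Illustrative sketch only: a minimal discrete-event model of a hypothetical
# actuator, in the spirit of the DES-based approach described above.

class DiscreteEventModel:
    """Finite automaton: states, events, transition map, initial state."""

    def __init__(self, states, events, transitions, initial):
        self.states = set(states)
        self.events = set(events)
        self.transitions = dict(transitions)   # (state, event) -> next state
        self.state = initial

    def step(self, event):
        """Fire one event; raise if the event is not enabled in this state."""
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(f"event '{event}' not enabled in state '{self.state}'")
        self.state = self.transitions[key]
        return self.state

    def accepts(self, trace, initial):
        """Check whether a sequence of events is feasible from a given state."""
        state = initial
        for event in trace:
            if (state, event) not in self.transitions:
                return False
            state = self.transitions[(state, event)]
        return True


# Hypothetical actuator: idle -> moving -> done, with a fault branch.
actuator = DiscreteEventModel(
    states={"idle", "moving", "done", "fault"},
    events={"start", "end", "break", "reset"},
    transitions={
        ("idle", "start"): "moving",
        ("moving", "end"): "done",
        ("moving", "break"): "fault",   # fault event, unobservable in DES terms
        ("fault", "reset"): "idle",
        ("done", "start"): "moving",
    },
    initial="idle",
)

# Simple verification-style queries: is the nominal cycle feasible?
print(actuator.accepts(["start", "end", "start", "end"], initial="idle"))  # True
# A trace that should be rejected (cannot fire 'end' twice in a row):
print(actuator.accepts(["start", "end", "end"], initial="idle"))           # False
```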

Relevance:

100.00%

Publisher:

Abstract:

Ion channels are pore-forming proteins that regulate the flow of ions across biological cell membranes. They are fundamental in generating and regulating the electrical activity of cells in the nervous system and the contraction of muscular cells. Solid-state nanopores are nanometer-scale pores located in electrically insulating membranes; they can be adopted as detectors of specific molecules in electrolytic solutions. Permeation of ions from one electrolytic solution to another, through a protein channel or a synthetic pore, is a process of considerable importance, and realistic analyses of the main dependencies of the ion current on the geometrical and compositional characteristics of these structures are highly required. The project described in this thesis is an effort to improve the understanding of ion channels by devising methods for computer simulation that can predict channel conductance from channel structure. The thesis describes the theory, algorithms and implementation techniques used to develop a novel 3-D numerical simulator of ion channels and synthetic nanopores based on the Brownian Dynamics technique. This numerical simulator could represent a valid tool for the study of protein ion channels and synthetic nanopores, allowing the investigation, at the atomic level, of the complex electrostatic interactions that determine channel conductance and ion selectivity. Moreover, it will provide insights into how parameters such as temperature, applied voltage and pore shape influence ion translocation dynamics. Furthermore, it will help make predictions of the conductance of given channel structures and will provide information such as the electrostatic potential or ionic concentrations throughout the simulation domain, aiding the understanding of ion flow through membrane pores.
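
For orientation, a single Brownian Dynamics update for one ion follows the standard overdamped (Ermak-McCammon) form: a drift term proportional to the systematic force plus a Gaussian random displacement. The sketch below is a minimal illustration with placeholder force and parameter values; it is not the simulator described in the thesis.

```python
# Minimal sketch of one Brownian Dynamics (overdamped Langevin) step for a
# single ion. Force field and numbers are placeholders for illustration.
import numpy as np

kB = 1.380649e-23      # Boltzmann constant, J/K

def bd_step(position, force, diffusion, temperature, dt, rng):
    """Ermak-McCammon update: drift from the systematic force plus a random kick."""
    drift = diffusion * force * dt / (kB * temperature)
    noise = np.sqrt(2.0 * diffusion * dt) * rng.standard_normal(position.shape)
    return position + drift + noise

# Toy usage: a K+ ion in a constant axial field (placeholder force of 2 pN).
rng = np.random.default_rng(0)
pos = np.zeros(3)                     # m
D_K = 1.96e-9                         # m^2/s, bulk diffusion coefficient of K+
F = np.array([0.0, 0.0, 2.0e-12])     # N, placeholder electrostatic force
for _ in range(1000):
    pos = bd_step(pos, F, D_K, 300.0, 2e-12, rng)   # dt = 2 ps
print(pos)
```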

Relevance:

100.00%

Publisher:

Abstract:

The determination of skeletal loading conditions in vivo and their relationship to the health of bone tissues remain open questions. Computational modeling of the musculoskeletal system is the only practicable method providing a valuable approach to muscle and joint loading analyses, although crucial shortcomings limit the translation of computational methods into orthopedic and neurological practice. Growing attention has focused on subject-specific modeling, particularly when pathological musculoskeletal conditions need to be studied. Nevertheless, subject-specific data cannot always be collected in research and clinical practice, and there is a lack of efficient methods and frameworks for building models and incorporating them in simulations of motion. The overall aim of the present PhD thesis was to introduce improvements to state-of-the-art musculoskeletal modeling for the prediction of physiological muscle and joint loads during motion. A threefold goal was articulated as follows: (i) develop state-of-the-art subject-specific models and analyze skeletal load predictions; (ii) analyze the sensitivity of model predictions to relevant musculotendon model parameters and kinematic uncertainties; (iii) design an efficient software framework simplifying the effort-intensive pre-processing phases of subject-specific modeling. The first goal underlined the relevance of subject-specific musculoskeletal modeling for determining physiological skeletal loads during gait, corroborating the choice of fully subject-specific modeling for the analysis of pathological conditions. The second goal characterized the sensitivity of skeletal load predictions to the major musculotendon parameters and kinematic uncertainties, and robust probabilistic methods were applied for methodological and clinical purposes. The last goal produced an efficient software framework for subject-specific modeling and simulation that is practical, user-friendly and effort-effective. Future research aims at the implementation of more accurate models describing lower-limb joint mechanics and musculotendon paths, and at the assessment of an overall scenario of the crucial model parameters affecting skeletal load predictions through probabilistic modeling.
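
The sensitivity analysis in goal (ii) can be pictured with a simple Monte Carlo scheme: sample the uncertain musculotendon parameters and kinematic inputs, propagate them through the load model, and inspect the spread of the predicted load. The sketch below uses an invented surrogate function and assumed uncertainty levels; it only illustrates the probabilistic approach and is not one of the thesis's models.

```python
# Illustrative sketch only: Monte Carlo sensitivity analysis with a placeholder
# load model. joint_load() and its parameters are invented for illustration.
import numpy as np

def joint_load(tendon_slack_length, optimal_fiber_length, hip_flexion_deg):
    """Placeholder surrogate for a musculoskeletal load prediction (arbitrary units)."""
    return (2.5 / tendon_slack_length
            + 1.8 * optimal_fiber_length
            + 0.02 * hip_flexion_deg)

rng = np.random.default_rng(42)
n = 10_000
# Perturb nominal parameters with assumed coefficients of variation / noise.
lts = rng.normal(0.30, 0.30 * 0.05, n)     # tendon slack length, m, 5% CV
lof = rng.normal(0.10, 0.10 * 0.05, n)     # optimal fiber length, m, 5% CV
hip = rng.normal(25.0, 2.0, n)             # hip flexion angle, deg, +/- 2 deg noise

loads = joint_load(lts, lof, hip)
print(f"mean = {loads.mean():.3f}, 5th-95th percentile = "
      f"{np.percentile(loads, 5):.3f} .. {np.percentile(loads, 95):.3f}")
```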

Relevance:

100.00%

Publisher:

Abstract:

Geochemical mapping is a valuable tool for the control of territory that can be used not only in the identification of mineral resources and in geological, agricultural and forestry studies, but also in the monitoring of natural resources, giving solutions to environmental and economic problems. Stream sediments are widely used in the sampling campaigns carried out by governments and research groups worldwide because of their broad representativeness of rocks and soils, their ease of sampling and the possibility of conducting very detailed sampling. In this context, the environmental role of stream sediments provides a good basis for the implementation of environmental management measures. In fact, the composition of river sediments is an important factor in understanding the complex dynamics that develop within catchment basins, and they therefore represent a critical environmental compartment: they can persistently incorporate pollutants after a process of contamination and release them into the biosphere if the environmental conditions change. It is essential to determine whether the concentrations of certain elements, in particular heavy metals, are the result of natural erosion of rocks containing high concentrations of specific elements or are generated as residues of human activities in a given study area. This PhD thesis aims to extract the widest possible spectrum of information from an extensive database on stream sediments of the Romagna rivers. The study involved low- and high-order streams in the mountain and hilly areas, but also the sediments of the floodplain area, where intensive agriculture is active. The geochemical signals recorded by the stream sediments will be interpreted in order to reconstruct the natural variability related to bedrock and soil contributions, the effects of river dynamics and the anomalous sites, and, through the calculation of background values, to evaluate their level of degradation and predict the environmental risk.
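
One widely used way of turning such a database into background values is a robust threshold of the form median + 2*MAD, above which a concentration is flagged as anomalous. The sketch below applies this to a toy set of concentrations; both the choice of method and the numbers are assumptions for illustration, and the thesis may compute background values differently.

```python
# Illustrative sketch only: a robust geochemical background threshold
# (median + 2 * median absolute deviation) applied to toy stream-sediment data.
import numpy as np

def background_threshold(concentrations):
    """Robust upper threshold: median + 2 * median absolute deviation."""
    x = np.asarray(concentrations, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    return med + 2.0 * mad

# Toy Ni concentrations in stream sediments (mg/kg), including two high values.
ni = [22, 25, 31, 28, 24, 27, 26, 30, 29, 85, 23, 26, 120, 28, 25]
thr = background_threshold(ni)
anomalies = [c for c in ni if c > thr]
print(f"threshold = {thr:.1f} mg/kg, anomalous samples = {anomalies}")
```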

Relevance:

100.00%

Publisher:

Abstract:

In the last decade the near-surface mounted (NSM) strengthening technique using carbon fibre reinforced polymers (CFRP) has been increasingly used to improve the load-carrying capacity of concrete members. Compared to externally bonded reinforcement (EBR), the NSM system presents considerable advantages. The technique consists in the insertion of CFRP laminate strips into pre-cut slits opened in the concrete cover of the elements to be strengthened. The CFRP reinforcement is bonded to the concrete with an appropriate groove filler, typically epoxy adhesive or cement grout. Up to now, research efforts have mainly focused on several structural aspects, such as bond behaviour, flexural and/or shear strengthening effectiveness, and the energy dissipation capacity of beam-column joints. In such research works, as well as in field applications, the most widespread adhesives used to bond the reinforcement to concrete are epoxy resins. It is largely accepted that the performance of the whole NSM application strongly depends on the mechanical properties of the epoxy resins, for which proper curing conditions must be assured. Therefore, non-destructive methods that allow monitoring the curing process of epoxy resins in NSM CFRP systems are desirable, in view of obtaining continuous information that can indicate the effectiveness of curing and the expectable bond behaviour of CFRP/adhesive/concrete systems. The experimental research was developed at the Laboratory of the Structural Division of the Civil Engineering Department of the University of Minho in Guimarães, Portugal (LEST). The main objective was to develop and propose a new method for continuous quality control of the curing of epoxy resins applied in NSM CFRP strengthening systems. This objective is pursued through the adaptation of an existing technique, termed EMM-ARM (Elasticity Modulus Monitoring through Ambient Response Method), which was developed for monitoring the early stiffness evolution of cement-based materials. The experimental program was composed of two parts: (i) direct pull-out tests on concrete specimens strengthened with NSM CFRP laminate strips, conducted to assess the evolution of the bond behaviour between CFRP and concrete from early ages; and (ii) EMM-ARM tests, carried out to monitor the progressive stiffness development of the structural adhesive used in CFRP applications. In order to verify the capability of the proposed method for evaluating the elastic modulus of the epoxy, the static E-modulus was determined through tension tests. The results of the two series of tests were then combined and compared to evaluate the possibility of implementing a new method for the continuous monitoring and quality control of NSM CFRP applications.
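
At the core of EMM-ARM is the identification of stiffness from the resonant frequency of a vibrating beam. The sketch below shows the underlying frequency-to-stiffness inversion for a bare cantilever; in the actual method the beam is a composite of mould and curing adhesive, so the adhesive modulus is extracted from the composite flexural stiffness. The numerical values are placeholders, not data from this experimental program.

```python
# Illustrative sketch only: inverting the first cantilever natural frequency
# f1 = (lambda1^2 / 2*pi) * sqrt(EI / (m * L^4)) to recover flexural stiffness EI.
import math

def flexural_stiffness_from_frequency(f1_hz, length_m, mass_per_length):
    """Return EI (N*m^2) of a cantilever from its first resonant frequency."""
    lam1 = 1.8751                      # first cantilever eigenvalue (beta1 * L)
    omega = 2.0 * math.pi * f1_hz
    return (omega / lam1**2) ** 2 * mass_per_length * length_m**4

# Placeholder values: a 250 mm cantilever weighing 0.12 kg/m whose first
# resonant frequency rises from 40 Hz to 70 Hz as the adhesive cures.
for f1 in (40.0, 55.0, 70.0):
    EI = flexural_stiffness_from_frequency(f1, 0.25, 0.12)
    print(f"f1 = {f1:5.1f} Hz  ->  EI = {EI:.3f} N*m^2")
```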

Relevance:

100.00%

Publisher:

Abstract:

The European Community has stressed the importance of achieving a common understanding for dealing with environmental noise through community actions of the Member States. This implies the use of harmonized indicators and of specific information regarding the values of those indicators, the exceedance of limits, and the number of people and dwellings exposed to noise. The D.Lgs. 149/2005, in compliance with the European Directive 2002/49/EC, defines the methodologies, noise indicators and types of output required. In this dissertation the work done for the noise mapping of the highly trafficked roads of the Province of Bologna will be reported. The study accounts for the environmental noise generated by the road infrastructure outside the urban agglomeration of Bologna; roads with more than three million vehicle passages per year will be considered. The process of data collection and validation will be reported, as well as the implementation of the calculation method in the software and the procedure used to create and calibrate the calculation model. Results will be provided as required by the legislation, in the form of maps and tables. Moreover, the results for each road section considered will be combined to gain a general understanding of the situation of the overall studied area. Although understanding the noise levels and the number of people exposed is paramount, it is not sufficient for developing strategies of noise abatement interventions. Thus a further step will be addressed: the creation of priority maps as the basis of action plans for organizing and prioritizing solutions for noise reduction and abatement. Noise reduction measures are reported in a qualitative way in the annex and constitute preliminary research.
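
For reference, the main harmonized indicator required by Directive 2002/49/EC for strategic noise maps is the day-evening-night level, defined (with the default 12 h day, 4 h evening and 8 h night periods) as shown below; the 5 dB and 10 dB terms are the evening and night penalties prescribed by the Directive.

```latex
L_{den} = 10\,\log_{10}\!\left[\frac{1}{24}\left(
    12\cdot 10^{L_{day}/10}
  + 4\cdot 10^{(L_{evening}+5)/10}
  + 8\cdot 10^{(L_{night}+10)/10}\right)\right]
```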

Relevance:

100.00%

Publisher:

Abstract:

This study of the process of language shift and maintenance in the bilingual community of Romanians living in Hungary was based on 40 tape-recorded Romanian sociolinguistic interviews. These were transcribed into computerised form and provide an excellent source of sociolinguistic, contact-linguistic and discourse-analysis data, making it possible to show the effect of internal and external factors on the bilingual speech mode. The main topics considered were the choice of Romanian and Hungarian in community interactions, factors of language choice, code-switching (intra-language and inter-language), reasons for code-switching, the relationship between age and the frequency of code-switching in the interview situation, and the unequal competition of minority and majority languages at school.

Relevance:

100.00%

Publisher:

Abstract:

Background: It is commonly assumed that indigenous medical systems remain strong in developing countries because biomedicine is physically inaccessible or not financially affordable. This paper compares the health-seeking behavior of households from rural Andean communities at a Peruvian and a Bolivian study site. The main research question was whether the increased presence of biomedicine led to a displacement of Andean indigenous medical practices or to coexistence of the two healing traditions.
Methodology: Open-ended interviews and free-listing exercises were conducted between June 2006 and December 2008 with 18 households at each study site. Qualitative identification of households' therapeutic strategies and use of remedies was carried out by means of content analysis of the interview transcriptions and inductive inference. Furthermore, a quantitative assessment of the incidence of culture-bound illnesses in local ethnobiological inventories was performed.
Results: Our findings indicate that the health-seeking behavior of the Andean households in this study is independent of the degree of availability of biomedical facilities in terms of quality of services provided, physical accessibility and financial affordability, except for specific practices such as childbirth. Preference for natural remedies over pharmaceuticals coexists with biomedical healthcare that is both accessible and affordable. Furthermore, our results show that greater access to biomedicine does not lead to lower prevalence of Andean indigenous medical knowledge, as represented by the levels of knowledge about culture-bound illnesses.
Conclusions: The take-home lesson for health policy-makers from this study is that the main obstacle to the use of biomedicine in resource-poor rural areas might not be infrastructural or economic alone. Rather, it may lie in a lack of sufficient recognition by biomedical practitioners of the value and importance of indigenous medical systems. We propose that the implementation of health care in indigenous communities be designed as a process of joint development of complementary knowledge and practices from indigenous and biomedical health traditions.

Relevance:

100.00%

Publisher:

Abstract:

Project-based education and portfolio assessments are at the forefront of educational research. This research follows the implementation of a project-based unit in a high school physics class. Students played the role of an engineering firm that designed, built and tested file-folder bridges. The purpose was to determine whether project-based learning could improve student attitudes toward science and related careers such as engineering. Teams of students presented their work in a portfolio for a final assessment of the process of designing, building and testing their bridges.

Relevance:

100.00%

Publisher:

Abstract:

Deregulation strategies and their regulating effects: the case of the termination of social assistance for rejected asylum seekers in Switzerland. In Switzerland, rejected asylum seekers no longer have any residence rights. In 2003 the Swiss state decided to terminate the social assistance granted until then to people with a non-entry decision on their asylum request. In 2008 the termination of social assistance was expanded to all rejected asylum seekers. Nevertheless, facing the impossibility of deporting them, the Swiss state entitled this group of people to emergency assistance, a basic right stated in the Swiss Federal Constitution. In this context, new structures were established specifically for rejected asylum seekers. These structures had to be set up, financed, controlled, managed and legitimized; for example, collective centres were set up exclusively for rejected asylum seekers. In this talk, I want to analyze the political and bureaucratic process of terminating social assistance for rejected asylum seekers. The exclusion of rejected asylum seekers from social aid was embedded in a wider austerity program of the federal state: the Federal Migration Office had been requested to save money. The main official goal was to reduce the support given to these illegalized people, to reduce any structures that would prolong their stay on Swiss ground, and to set incentives so that they would leave the country on their own. But during the implementation, new regulating effects emerged. Drawing on ethnographic material, I will highlight these "messy procedures" (Sciortino 2004). First, I will analyze the means and goals developed by the federal authorities while conceptualising the termination of social assistance. Second, I will focus on the newly built structures and elaborate on the practices and legitimating strategies of the authorities. As a conclusion, I will analyze the ambivalences of these processes which, in the end, established specific structures for the "unwanted".

Relevance:

100.00%

Publisher:

Abstract:

IS outsourcing projects often fail to achieve their goals. To prevent such failures, managers need to design formal controls that are tailored to the specific contextual demands. However, the dynamic and uncertain nature of IS outsourcing projects makes the design of such specific formal controls at the outset of a project challenging. Hence, the process of translating high-level project goals into specific formal controls becomes crucial for the success or failure of IS outsourcing projects. Based on a comparative case study of four IS outsourcing projects, our study enhances the current understanding of such translation processes and their consequences by developing a process model that explains the success or failure to achieve high-level project goals as an outcome of two distinct translation patterns. This novel process-based explanation of how and why IS outsourcing projects succeed or fail has important implications for control theory and the IS project escalation literature.