901 results for Process Modeling, Collaboration, Distributed Modeling, Collaborative Technology


Relevance:

50.00%

Publisher:

Abstract:

The physical implementation of quantum information processing is one of the major challenges of current research. In the last few years, several theoretical proposals and experimental demonstrations on a small number of qubits have been carried out, but a quantum computing architecture that is straightforwardly scalable, universal, and realizable with state-of-the-art technology is still lacking. In particular, a major ultimate objective is the construction of quantum simulators, yielding massively increased computational power in simulating quantum systems. Here we investigate promising routes towards the actual realization of a quantum computer, based on spin systems. The first one employs molecular nanomagnets with a doublet ground state to encode each qubit and exploits the wide chemical tunability of these systems to obtain the proper topology of inter-qubit interactions. Indeed, recent advances in coordination chemistry allow us to arrange these qubits in chains, with tailored interactions mediated by magnetic linkers. These act as switches of the effective qubit-qubit coupling, thus enabling the implementation of one- and two-qubit gates. Molecular qubits can be controlled either by uniform magnetic pulses or by local electric fields. We introduce here two different schemes for quantum information processing with either global or local control of the inter-qubit interaction and demonstrate the high performance of these platforms by simulating the system time evolution with state-of-the-art parameters. The second architecture we propose is based on a hybrid spin-photon qubit encoding, which exploits the best characteristics of both photons, whose mobility is used to efficiently establish long-range entanglement, and spin systems, which ensure long coherence times. The setup consists of spin ensembles coherently coupled to single photons within superconducting coplanar waveguide resonators.
The tunability of the resonators' frequency is exploited as the only manipulation tool to implement a universal set of quantum gates, by bringing the photons into/out of resonance with the spin transition. The time evolution of the system subject to the pulse sequence used to implement complex quantum algorithms has been simulated by numerically integrating the master equation for the system density matrix, thus including the harmful effects of decoherence. Finally, a scheme to overcome the leakage of information due to inhomogeneous broadening of the spin ensemble is pointed out. Both of the proposed setups are based on state-of-the-art technological achievements. By extensive numerical experiments we show that their performance is remarkably good, even for the implementation of long sequences of gates used to simulate interesting physical models. Therefore, the systems examined here are promising building blocks of future scalable architectures and can be used for proof-of-principle experiments of quantum information processing and quantum simulation.
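The simulation approach described, numerically integrating the master equation for the system density matrix, can be illustrated with a minimal sketch. The single-qubit Hamiltonian, drive strength and dephasing rate below are placeholder values for illustration, not the authors' actual model or parameters:

```python
import numpy as np

# Pauli matrices and a simple single-qubit driving Hamiltonian (assumed values)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

omega = 1.0       # Rabi frequency of the control pulse (assumed)
gamma = 0.05      # pure-dephasing rate (assumed)
H = 0.5 * omega * sx
L = np.sqrt(gamma) * sz          # dephasing jump operator

def lindblad_rhs(rho):
    """Right-hand side of the Lindblad master equation d(rho)/dt."""
    comm = -1j * (H @ rho - rho @ H)
    diss = L @ rho @ L.conj().T - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L)
    return comm + diss

# 4th-order Runge-Kutta integration of the density matrix
rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |0>
dt, steps = 0.01, 1000
for _ in range(steps):
    k1 = lindblad_rhs(rho)
    k2 = lindblad_rhs(rho + 0.5 * dt * k1)
    k3 = lindblad_rhs(rho + 0.5 * dt * k2)
    k4 = lindblad_rhs(rho + dt * k3)
    rho = rho + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

print(np.trace(rho).real)    # trace stays 1: probability is conserved
print(rho[0, 0].real)        # population moved by the drive, damped by dephasing
```

The dissipator term is what lets such simulations include "the harmful effects of decoherence" alongside the coherent gate dynamics.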

Relevance:

50.00%

Publisher:

Abstract:

The data available during the drug discovery process is vast in amount and diverse in nature. To gain useful information from such data, an effective visualisation tool is required. To provide better visualisation facilities to the domain experts (screening scientists, biologists, chemists, etc.), we developed a software tool based on recently developed principled visualisation algorithms such as Generative Topographic Mapping (GTM) and Hierarchical Generative Topographic Mapping (HGTM). The software also supports conventional visualisation techniques such as Principal Component Analysis, NeuroScale, PhiVis, and Locally Linear Embedding (LLE), and provides global and local regression facilities. It supports regression algorithms such as Multilayer Perceptron (MLP), Radial Basis Functions network (RBF), Generalised Linear Models (GLM), Mixture of Experts (MoE), and the newly developed Guided Mixture of Experts (GME). This user manual gives an overview of the purpose of the software tool, highlights some of the issues to be considered when creating a new model, and provides information about how to install and use the tool. The user manual does not require the readers to have familiarity with the algorithms it implements. Basic computing skills are enough to operate the software.
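As a minimal illustration of the kind of projection such a tool performs, the sketch below reduces hypothetical high-dimensional screening data to two dimensions with PCA, the simplest of the listed methods; the data shape and descriptor correlation are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical screening data: 200 compounds x 10 numeric descriptors
X = rng.normal(size=(200, 10))
X[:, 0] = 3 * X[:, 1] + rng.normal(scale=0.1, size=200)  # two correlated descriptors

# PCA via singular value decomposition of the mean-centred data
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ Vt[:2].T           # 2-D coordinates, ready for a scatter plot

var_explained = (S[:2] ** 2).sum() / (S ** 2).sum()
print(proj.shape)              # (200, 2)
print(var_explained)           # fraction of variance kept by the 2-D view
```

Methods such as GTM and HGTM replace this linear projection with a non-linear latent-variable mapping, but the input/output shape of the visualisation step is the same.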

Relevance:

50.00%

Publisher:

Abstract:

Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. miniDVMS v1.8 provides a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain and visual techniques developed in the information visualisation domain. The advantage of this interface is that the user is directly involved in the data mining process. Principled projection methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), are integrated with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, and user interaction facilities, to provide this integrated visual data mining framework. The software also supports conventional visualisation techniques such as principal component analysis (PCA), Neuroscale, and PhiVis. This user manual gives an overview of the purpose of the software tool, highlights some of the issues to be considered when creating a new model, and provides information about how to install and use the tool. The user manual does not require the readers to have familiarity with the algorithms it implements. Basic computing skills are enough to operate the software.

Relevance:

50.00%

Publisher:

Abstract:

In recent years, it has become increasingly common for companies to improve their competitiveness and find new markets by extending their operations through international new product development collaborations involving technology transfer. Technology development, cost reduction and market penetration are seen as the foci in such collaborative operations with the aim being to improve the competitive position of both partners. In this paper, the case of technology transfer through collaborative new product development in the machine tool sector is used to provide a typical example of such partnerships. The paper outlines the links between the operational aspects of collaborations and their strategic objectives. It is based on empirical data collected from the machine tool industries in the UK and China. The evidence includes longitudinal case studies and questionnaire surveys of machine tool manufacturers in both countries. The specific case of BSA Tools Ltd and its Chinese partner the Changcheng Machine Tool Works is used to provide an in-depth example of the operational development of a successful collaboration. The paper concludes that a phased coordination of commercial, technical and strategic interactions between the two partners is essential for such collaborations to work.

Relevance:

50.00%

Publisher:

Abstract:

Recent developments in the new economic geography and the literature on regional innovation systems have emphasised the potentially important role of networking and the characteristics of firms' local operating environment in shaping their innovative activity. Modeling UK, German and Irish plants' investments in R&D, technology transfer and networking, and their effect on the extent and success of plants' innovation activities, casts some doubt on the importance of both of these relationships. In particular, our analysis provides no support for the contention that firms or plants in the UK, Ireland or Germany with more strongly developed external links (collaborative networks or technology transfer) develop greater innovation intensity. However, although inter-firm links also have no effect on the commercial success of plants' innovation activity, intra-group links are important in terms of achieving commercial success. We also find evidence that R&D, technology transfer and networking inputs are substitutes rather than complements in the innovation process, and that there are systematic sectoral and regional influences in the efficiency with which such inputs are translated into innovation outputs. © 2001 Elsevier Science B.V.

Relevance:

50.00%

Publisher:

Abstract:

In recent years, it has become increasingly common for companies to improve their competitiveness and find new markets by extending their operations through international new product development collaborations involving technology transfer. Technology development, cost reduction and market penetration are seen as the foci in such collaborative operations, with the aim being to improve the competitive position of both partners. In this paper, the case of technology transfer through collaborative new product development in the machine tool sector is used to provide a typical example of such partnerships. The research evidence on which the paper is based includes longitudinal case studies and questionnaire surveys of machine tool manufacturers in the UK and China. The specific case of a UK machine tool company and its Chinese partner is used to provide an in-depth example of the operational development of a successful collaboration. The paper concludes that a phased co-ordination of commercial, technical and strategic interactions between the two partners is essential for such collaborations to work. In particular, the need to transfer marketing know-how is emphasised, having been identified as an area of weakness among technology acquirers in China.

Relevance:

50.00%

Publisher:

Abstract:

The investigation of insulation debris generation, transport, and sedimentation becomes more important with regard to reactor safety research for pressurized water reactors and boiling water reactors when considering the long-term behavior of emergency core coolant systems during all types of loss-of-coolant accidents (LOCAs). The insulation debris released near the break during a LOCA incident consists of a mixture of disparate particle populations that varies with size, shape, consistency, and other properties. Some fractions of the released insulation debris can be transported into the reactor sump, where it may perturb/impinge on the emergency core cooling systems. Open questions of generic interest are, for example, the particle load on strainers and corresponding pressure drop, the sedimentation of the insulation debris in a water pool, and its possible resuspension and transport in the sump water flow. A joint research project on such questions is being performed in cooperation with the University of Applied Sciences Zittau/Görlitz. The project deals with the experimental investigation and the development of computational fluid dynamics (CFD) models for the description of particle transport phenomena in coolant flow. While the experiments are performed at the University of Applied Sciences Zittau/Görlitz, the theoretical work is concentrated at Forschungszentrum Dresden-Rossendorf. In the current paper the basic concepts for CFD modeling are described and feasibility studies including the conceptual design of the experiments are presented.
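As a first-order illustration of debris sedimentation in a water pool, the sketch below estimates a Stokes terminal settling velocity for a single small spherical particle; every particle property here is an assumed value for illustration and is not taken from the project's experiments or CFD models:

```python
# Stokes terminal settling velocity of a small sphere in still water —
# an illustrative first estimate for debris sedimentation (all values assumed).
g = 9.81          # gravitational acceleration, m/s^2
d = 5.0e-5        # particle diameter, m (50 µm, assumed)
rho_p = 2600.0    # particle density, kg/m^3 (mineral-wool fragment, assumed)
rho_f = 998.0     # water density, kg/m^3
mu = 1.0e-3       # dynamic viscosity of water, Pa·s

v = g * d**2 * (rho_p - rho_f) / (18 * mu)   # terminal velocity, m/s
Re = rho_f * v * d / mu                      # particle Reynolds number
print(f"settling velocity: {v*1000:.2f} mm/s, Re = {Re:.3f}")
```

Stokes' law only holds for Re << 1, which is why the sketch prints the particle Reynolds number as a validity check; the fibrous, irregular debris described above is exactly the case where such single-sphere estimates break down and full CFD particle-transport models become necessary.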

Relevance:

50.00%

Publisher:

Abstract:

The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high efficiency cyclones for char removal and a disk and doughnut quench column combined with a wet walled electrostatic precipitator, which is directly mounted on top, for liquids collection. In order to aid design and scale-up calculations, detailed mathematical modelling was undertaken of the reaction system, enabling sizes, efficiencies and operating conditions to be determined. Specifically, a modular approach was taken due to the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for the determination of the biomass ablation rate, specification of the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol to be developed, capable of specifying the required reactor and product collection system size for specified biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity between the heated surface and reacting biomass particle of 12.1 m/s. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply. The reactor is capable of being operated at a far higher throughput, but this would require a new feeder and drive motor to be purchased. Modelling showed that the reactor is capable of achieving a throughput of approximately 30 kg/hr.
This is an area that should be considered for the future, as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second. Operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned due to equipment failure. This does not detract from the fact that the reactor and product collection system design was extremely efficient. The maximum total liquid yield was 64.9% on a dry wood fed basis. It is considered that the liquid yield would have been higher had there been sufficient development time to overcome certain operational difficulties, and if longer operating runs had been attempted to offset product losses occurring due to the difficulties in collecting all available product from a large scale collection unit. The liquids collection system was highly efficient: modelling determined a liquid collection efficiency of above 99% on a mass basis. This was validated with a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit, which enabled mass measurement of the condensable product exiting the product collection unit and confirmed that the collection efficiency was in excess of 99% on a mass basis.
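The measured ablation rate suggests how a throughput estimate of the kind mentioned might be formed: surface regression rate times hot-surface contact area times wood density. The sketch below uses the 0.63 mm/s figure from the text, but the contact area and wood density are illustrative assumptions, not the thesis's design values:

```python
# Rough reactor-throughput estimate from the measured ablation rate
# (0.63 mm/s for pine at 525 °C, from the text). Contact area and wood
# density below are assumed for illustration only.
ablation_rate = 0.63e-3      # m/s, surface regression rate of the wood
contact_area = 4.0e-3        # m^2, total particle/hot-surface contact (assumed)
rho_wood = 650.0             # kg/m^3, dry pine (assumed)

throughput = ablation_rate * contact_area * rho_wood * 3600  # kg/hr
print(f"estimated throughput: {throughput:.1f} kg/hr")
```

The estimate scales linearly with contact area, which is why the reactor's modelled capacity (~30 kg/hr) can sit so far above the feeder-limited 2.3 kg/hr actually achieved.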

Relevance:

50.00%

Publisher:

Abstract:

Despite the voluminous studies written about organisational innovation over the last 30-40 years, our understanding of this phenomenon continues to be inconsistent and inconclusive (Wolfe, 1994). An assessment of the theoretical and methodological issues influencing the explanatory utility of many studies has led scholars (e.g. Slappendel, 1996) to re-evaluate the assumptions used to ground studies. Building on these criticisms, the current study contributes to the development of an interactive perspective of organisational innovation. This work contributes empirically and theoretically to an improved understanding of the innovation process and the interaction between the realm of action and the mediating effects of pre-existing contingencies, i.e. social control, economic exchange and the communicability of knowledge (Scarbrough, 1996). Building on recent advances in institutional theory (see Barley, 1986; 1990; Barley and Tolbert, 1997) and critical theory (Morrow, 1994; Sayer, 1992), the study aims to demonstrate, via longitudinal intensive research, the process through which ideas are translated into reality. This is significant because, despite a growing recognition of the implicit link between the strategic conduct of actors and the institutional realm in organisational analysis, there are few examples that theorise and empirically test these connections. By assessing an under-researched example of technology transfer, the government's Teaching Company Scheme (TCS), this project provides a critique of the innovation process that contributes to theory and our appreciation of change in the UK government's premier technology transfer scheme (QR, 1996). Critical moments during the translation of ideas illustrate how elements that are linked to social control, economic exchange and communicability mediate the innovation process. Using analytical categories, i.e. contradiction, slippage and dysfunctionality, these are assessed in relation to the actions (coping strategies) of programme members over a two-year period. Drawing on Giddens' (1995) notion of the duality of structure, this study explores the nature of the relationship between the task environment and the institutional environment, demonstrating how and why knowledge is both an enabler of and a barrier to organisational innovation.

Relevance:

50.00%

Publisher:

Abstract:

This thesis reports the results of research into the connections between transaction attributes and buyer-supplier relationships (BSR) in advanced manufacturing technology (AMT) acquisition and implementation. It also examines the impact of different patterns of BSR on performance. Specifically, it addresses how the three transaction attributes, namely level of complexity, level of asset specificity, and level of uncertainty, affect the relationship between the technology buyer and supplier in AMT acquisition and implementation, and then examines the impact of different patterns of BSR on two aspects of performance, namely technology performance and implementation performance. In understanding these phenomena, the study draws on and integrates the literatures of transaction cost economics theory, buyer-supplier relationships and advanced manufacturing technology as the basis of its theoretical framework and hypothesis development. Data were gathered through a questionnaire survey with 147 responses and seven semi-structured interviews of manufacturing firms in Malaysia. Quantitative data were analysed mainly using the AMOS (Analysis of Moment Structures) package for structural equation modelling and SPSS (Statistical Package for the Social Sciences) for analysis of variance (ANOVA). Data from the interview sessions were used to develop a case study, with the intention of providing a richer and deeper understanding of the subject under investigation and offering triangulation in the research process.
The results of the questionnaire survey indicate that the higher the level of technological specificity and uncertainty, the more firms are likely to engage in a closer relationship with technology suppliers. However, the complexity of the technology being implemented is associated with BSR only because it is associated with the level of uncertainty, which has a direct impact upon BSR. The analysis also provides strong support for the premise that developing strong BSR can lead to improved performance. However, at high levels of the transaction attributes, implementation performance suffers more when firms have weak relationships with technology suppliers than at moderate and low levels. The implications of the study are offered for both academic and practitioner audiences. The thesis closes with a report on its limitations and suggestions for further research that would address some of them.
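The moderation finding, that weak supplier relationships hurt implementation performance most when transaction attributes are high, corresponds to a positive interaction term in a regression. The sketch below illustrates this on simulated data; all coefficients and the data-generating process are invented, not the thesis's SEM or ANOVA results:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
# Invented illustration of a moderation effect: transaction-attribute level (x)
# and relationship strength (m) interact in predicting performance (y).
x = rng.integers(0, 2, n)          # 0 = low, 1 = high transaction attributes
m = rng.integers(0, 2, n)          # 0 = weak, 1 = strong buyer-supplier ties
y = 5 + 1.0 * m - 0.5 * x + 1.5 * x * m + rng.normal(scale=0.5, size=n)

# OLS with an interaction term: y = b0 + b1*m + b2*x + b3*(x*m)
X = np.column_stack([np.ones(n), m, x, x * m])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"interaction coefficient: {b[3]:.2f}")
```

A positive interaction coefficient means strong ties pay off most when transaction attributes are high, which is the qualitative pattern the survey analysis reports.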

Relevance:

50.00%

Publisher:

Abstract:

Innovation is central to the survival and growth of firms, and ultimately to the health of the economies of which they are part. A clear understanding both of the processes by which firms perform innovation and of the benefits which flow from innovation in terms of productivity and growth is therefore essential. This paper demonstrates the use of a conceptual framework and modeling tool, the innovation value chain (IVC), and shows how the IVC approach helps to highlight strengths and weaknesses in the innovation performance of a key group of firms: new technology-based firms. The value of the IVC is demonstrated by showing the key interrelationships in the whole process of innovation, from sourcing knowledge through product and process innovation to performance in terms of the growth and productivity outcomes of different types of innovation. The use of the IVC highlights key complementarities, such as that between internal R&D, external R&D, and other external sources of knowledge. Other important relationships are also highlighted. Skill resources matter throughout the IVC, being positively associated with external knowledge linkages and innovation success, and also having a direct influence on growth independent of the effect on innovation. A key benefit of the IVC approach is therefore its ability to highlight the roles of different factors at various stages of the knowledge-innovation-performance nexus, and to show their indirect as well as direct impact. This in turn permits both managerial and policy implications to be drawn. © 2012 Product Development & Management Association.

Relevance:

50.00%

Publisher:

Abstract:

Dynamically adaptive systems (DASs) are intended to monitor the execution environment and then dynamically adapt their behavior in response to changing environmental conditions. The uncertainty of the execution environment is a major motivation for dynamic adaptation; it is impossible to know at development time all of the possible combinations of environmental conditions that will be encountered. To date, the work performed in requirements engineering for a DAS includes requirements monitoring and reasoning about the correctness of adaptations, where the DAS requirements are assumed to exist. This paper introduces a goal-based modeling approach to develop the requirements for a DAS, while explicitly factoring uncertainty into the process and resulting requirements. We introduce a variation of threat modeling to identify sources of uncertainty and demonstrate how the RELAX specification language can be used to specify more flexible requirements within a goal model to handle the uncertainty. © 2009 Springer Berlin Heidelberg.
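A RELAX operator such as AS CLOSE AS POSSIBLE TO can be read as mapping an observed value to a degree of satisfaction rather than a crisp pass/fail, which is how relaxed requirements tolerate environmental uncertainty. The sketch below is a toy interpretation with an invented goal and thresholds, not the formal RELAX semantics defined in the paper:

```python
# Toy sketch of RELAX-style requirement relaxation (names and thresholds are
# hypothetical): instead of the crisp goal "monitoring interval SHALL be 10 s",
# "AS CLOSE AS POSSIBLE TO 10 s" maps the observed value to a [0, 1]
# satisfaction degree, letting a DAS degrade gracefully under uncertainty.

def as_close_as_possible_to(target: float, tolerance: float):
    """Return a fuzzy-satisfaction function for a relaxed numeric goal."""
    def satisfaction(observed: float) -> float:
        deviation = abs(observed - target)
        return max(0.0, 1.0 - deviation / tolerance)
    return satisfaction

goal = as_close_as_possible_to(target=10.0, tolerance=5.0)
print(goal(10.0))   # exactly on target -> 1.0
print(goal(12.5))   # within tolerance  -> 0.5
print(goal(20.0))   # far off target    -> 0.0
```

A goal model would attach such satisfaction functions to leaf goals whose environmental inputs are uncertain, while leaving invariant goals as crisp constraints.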