27 results for Location-aware process modeling

in Aston University Research Archive


Relevance:

100.00%

Abstract:

Purpose: The purpose of this paper is to describe how the application of systems thinking to designing, managing and improving business processes has resulted in a new and unique holonic-based process modeling methodology known as process-oriented holonic modeling. Design/methodology/approach: The paper describes key systems thinking axioms that are built upon in an overview of the methodology; the techniques are described using an example taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. These were produced in an 18-month project, using an action research approach, to improve quality and process efficiency. Findings: The findings of this research show that this new methodology can support process depiction and improvement in industrial sectors which are characterized by environments of high variety and low volume (e.g. projects such as the design and manufacture of a radar system or a hybrid production process) which do not provide repetitive learning opportunities. In such circumstances, the methodology has not only been able to deliver holonic-based process diagrams but has also been able to transfer strategic vision from top management to middle and operational levels without being reductionistic. Originality/value: This paper will be of interest to organizational analysts looking at large complex projects who require a methodology that does not confine them to thinking reductionistically in "task-breakdown" based approaches. The novel ideas in this paper have a great impact on the way analysts should perceive organizational processes. Future research is applying the methodology in similar environments in other industries. © Emerald Group Publishing Limited.

Relevance:

100.00%

Abstract:

The application of systems thinking to designing, managing, and improving business processes has developed a new "holonic-based" process modeling methodology. The theoretical background and the methodology are described using examples taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. A key point of differentiation attributed to this methodology is that it allows a set of models to be produced without taking a task breakdown approach but instead uses systems thinking and a construct known as the "holon" to build process descriptions as a system of systems (i.e., a holarchy). The process-oriented holonic modeling methodology has been used for total quality management and business process engineering exercises in different industrial sectors and builds models that connect the strategic vision of a company to its operational processes. Exercises have been conducted in response to environmental pressures to make operations align with strategic thinking as well as becoming increasingly agile and efficient. This unique methodology is best applied in environments of high complexity, low volume, and high variety, where repeated learning opportunities are few and far between (e.g., large development projects). © 2007 IEEE.
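As a minimal illustration of the holarchy construct described above (a sketch only; the holon names and the structure are hypothetical, not taken from the methodology), a process can be modeled as a system of systems in which every node is simultaneously a whole and a part:

```python
from dataclasses import dataclass, field

# Each holon is both a whole (it has a name and purpose) and a part
# (it can sit inside a larger holon), so the model is a holarchy
# rather than a task breakdown. All names are illustrative.
@dataclass
class Holon:
    name: str
    parts: list["Holon"] = field(default_factory=list)

    def depth(self) -> int:
        """Number of holonic levels below and including this holon."""
        return 1 + max((p.depth() for p in self.parts), default=0)

enterprise = Holon("Deliver capital goods", [
    Holon("Design radar system", [Holon("Model requirements")]),
    Holon("Manufacture", [Holon("Fabricate"), Holon("Assemble")]),
])
```

Because each sub-holon is itself a complete `Holon`, strategic-level and operational-level descriptions share one structure instead of being connected by decomposition alone.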

Relevance:

100.00%

Abstract:

The concept of mobility, related to technology in particular, has evolved dramatically over the last two decades, including: (i) hardware ranging from Walkmans to iPods, laptops to netbooks, and PDAs to 3G mobile phones; (ii) software supporting multiple audio and video formats, driven by ubiquitous mobile wireless access, WiMAX, automation such as radio-frequency ID tracking, and location-aware services. Against a background of increasing budget deficits, along with the imperative for efficiency gains, leveraging ICT and the promise of mobility for work-related tasks, in a public-administration context in emerging markets, points to multiple possible paths. M-government transition involves not only technological change and adoption to deliver government services differently (e.g. 24/7, error-free, anywhere, to the same standards), but also the design of digital strategies including possibly competing m-government models, the re-shaping of cultural practices, the creation of m-policies and legislation, the structuring of m-services architecture, and progress regarding m-governance. While many emerging countries are already offering e-government services and are gearing up for further m-government activities, little is actually known about the resistance that is encountered, as a reflection of civil servants' current standing, before any further macro-strategies are deployed. Drawing on the resistance and mobility literature, this chapter investigates how civil servants, in an emerging country's technological environment, react to and resist the influence of m-government transition through their everyday practice. The findings point to four main types of resistance, namely: (i) functional resistance; (ii) ideological resistance; (iii) market-driven resistance; and (iv) geographical resistance. Policy implications are discussed in the specific context of emerging markets. © 2011, IGI Global.

Relevance:

100.00%

Abstract:

This paper analyses the effect of corruption on Multinational Enterprises' (MNEs) incentives to undertake FDI in a particular country. We contribute to the existing literature by modelling the relationship between corruption and FDI using both parametric and non-parametric methods. We report that the impact of corruption on FDI stock differs across quantiles of the FDI stock distribution, a characteristic that could not be captured in previous studies which used only parametric methods. After controlling for the location selection process of MNEs and other host country characteristics, the results from both parametric and non-parametric analyses offer some support for the ‘helping-hand’ role of corruption.
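A toy illustration of why quantile-level analysis matters, not the paper's estimator: with simulated data, a covariate's effect on the outcome distribution can be positive at low quantiles yet negative at high ones, a pattern that a single mean regression would average away. All distributions and numbers below are illustrative:

```python
import random

random.seed(0)

# Two simulated outcome distributions differing in location and spread,
# standing in for "low corruption" vs "high corruption" groups.
low_corruption  = [random.lognormvariate(0.0, 1.0) for _ in range(50_000)]
high_corruption = [random.lognormvariate(0.2, 0.5) for _ in range(50_000)]

def quantile(xs, q):
    """Empirical q-quantile by sorting (fine for a sketch)."""
    xs = sorted(xs)
    return xs[int(q * (len(xs) - 1))]

# The "effect" at each quantile: difference between group quantiles.
effects = {q: quantile(high_corruption, q) - quantile(low_corruption, q)
           for q in (0.1, 0.5, 0.9)}
```

Here the group difference is positive at the 0.1 quantile but negative at the 0.9 quantile, the kind of sign reversal only a quantile-by-quantile comparison reveals.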

Relevance:

100.00%

Abstract:

The production of recombinant therapeutic proteins is an active area of research in drug development. These bio-therapeutic drugs target nearly 150 disease states and promise to bring better treatments to patients. However, if new bio-therapeutics are to be made more accessible and affordable, improvements in production performance and optimization of processes are necessary. A major challenge lies in controlling the effect of process conditions on production of intact functional proteins. To achieve this, improved tools are needed for bio-processing. For example, implementation of process modeling and high-throughput technologies can be used to achieve quality by design, leading to improvements in productivity. Commercially, the most sought-after targets are secreted proteins due to the ease of handling in downstream procedures. This chapter outlines different approaches for production and optimization of secreted proteins in the host Pichia pastoris. © 2012 Springer Science+Business Media, LLC.

Relevance:

40.00%

Abstract:

Recent developments in the new economic geography and the literature on regional innovation systems have emphasised the potentially important role of networking and the characteristics of firms' local operating environment in shaping their innovative activity. Modeling UK, German and Irish plants' investments in R&D, technology transfer and networking, and their effect on the extent and success of plants' innovation activities, casts some doubt on the importance of both of these relationships. In particular, our analysis provides no support for the contention that firms or plants in the UK, Ireland or Germany with more strongly developed external links (collaborative networks or technology transfer) develop greater innovation intensity. However, although inter-firm links also have no effect on the commercial success of plants' innovation activity, intra-group links are important in terms of achieving commercial success. We also find evidence that R&D, technology transfer and networking inputs are substitutes rather than complements in the innovation process, and that there are systematic sectoral and regional influences in the efficiency with which such inputs are translated into innovation outputs. © 2001 Elsevier Science B.V.

Relevance:

30.00%

Abstract:

The data available during the drug discovery process is vast in amount and diverse in nature. To gain useful information from such data, an effective visualisation tool is required. To provide better visualisation facilities to domain experts (screening scientists, biologists, chemists, etc.), we developed software based on recently developed principled visualisation algorithms such as Generative Topographic Mapping (GTM) and Hierarchical Generative Topographic Mapping (HGTM). The software also supports conventional visualisation techniques such as Principal Component Analysis, NeuroScale, PhiVis, and Locally Linear Embedding (LLE), and provides global and local regression facilities. It supports regression algorithms such as the Multilayer Perceptron (MLP), Radial Basis Function networks (RBF), Generalised Linear Models (GLM), Mixture of Experts (MoE), and the newly developed Guided Mixture of Experts (GME). This user manual gives an overview of the purpose of the software tool, highlights some of the issues to take care of when creating a new model, and provides information about how to install and use the tool. The user manual does not require the readers to have familiarity with the algorithms it implements. Basic computing skills are enough to operate the software.
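For reference, Principal Component Analysis, one of the conventional techniques the tool supports, can be sketched in a few lines. The data is synthetic and this is not the tool's implementation, only an illustration of projecting high-dimensional points to a 2-D visualisation space:

```python
import numpy as np

# Synthetic 5-D data in which two directions carry most of the variance.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 0] *= 5.0      # dominant direction
X[:, 1] *= 2.0      # second direction

# PCA via SVD of the centred data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                      # 2-D projection for plotting

# Fraction of total variance retained by the two plotted components.
explained = (S[:2] ** 2).sum() / (S ** 2).sum()
```

The rows of `Z` are the coordinates a screening scientist would see on a 2-D scatter plot; `explained` indicates how faithful that flat view is.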

Relevance:

30.00%

Abstract:

Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. miniDVMS v1.8 provides a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualisation domain. The advantage of this interface is that the user is directly involved in the data mining process. Principled projection methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), are integrated with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, and user interaction facilities, to provide this integrated visual data mining framework. The software also supports conventional visualisation techniques such as principal component analysis (PCA), NeuroScale, and PhiVis. This user manual gives an overview of the purpose of the software tool, highlights some of the issues to take care of when creating a new model, and provides information about how to install and use the tool. The user manual does not require the readers to have familiarity with the algorithms it implements. Basic computing skills are enough to operate the software.

Relevance:

30.00%

Abstract:

This paper develops and applies an integrated multiple criteria decision making approach to optimize the facility location-allocation problem in the contemporary customer-driven supply chain. Unlike traditional optimization techniques, the proposed approach, combining the analytic hierarchy process (AHP) and the goal programming (GP) model, considers both quantitative and qualitative factors, and also aims at maximizing the benefits of both deliverer and customers. In the integrated approach, the AHP is used first to determine the relative importance weightings, or priorities, of alternative locations with respect to both deliverer-oriented and customer-oriented criteria. Then the GP model, incorporating the constraints of system, resource, and AHP priority, is formulated to select the best locations for setting up the warehouses without exceeding the limited available resources. In this paper, a real case study is used to demonstrate how the integrated approach can be applied to deal with the facility location-allocation problem, and it is shown that the integrated approach outperforms the traditional cost-based approach.
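The AHP weighting step can be sketched as follows. The pairwise comparison matrix for three candidate locations is hypothetical, and the geometric-mean method used here is one common way to approximate the AHP priority vector, not necessarily the one used in the paper:

```python
import math

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three
# candidate warehouse locations; values are illustrative only.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
n = len(A)

# Geometric-mean approximation of the AHP priority vector.
geo = [math.prod(row) ** (1 / n) for row in A]
weights = [g / sum(geo) for g in geo]

# Consistency check: lambda_max, consistency index, consistency ratio.
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lam_max = sum(Aw[i] / weights[i] for i in range(n)) / n
CR = (lam_max - n) / (n - 1) / 0.58   # 0.58 = random index for n = 3
```

The resulting `weights` would then enter the GP model as priority coefficients; a consistency ratio below 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent.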

Relevance:

30.00%

Abstract:

There is now substantial evidence that locational and agglomeration influences can have a significant positive effect on innovation performance. Networking and boundary-spanning activities are also increasingly recognised as important contributors to innovation success. In this article we attempt to discover whether these factors are associated: in particular, is there any link between plant location, agglomeration effects and the extent of outsourcing in the innovation process? Using data for a large sample of UK and German manufacturing plants, we find that organisational and strategic factors play a much greater and more consistent role than locational influences in shaping the level of outsourcing in the innovation process. Strategic approaches to outsourcing may also benefit plants in obtaining economies of scope in the management or governance of outsourcing within the innovation process.

Relevance:

30.00%

Abstract:

Stochastic differential equations arise naturally in a range of contexts, from financial to environmental modeling. Current solution methods are limited in their representation of the posterior process in the presence of data. In this work, we present a novel Gaussian process approximation to the posterior measure over paths for a general class of stochastic differential equations in the presence of observations. The method is applied to two simple problems: the Ornstein-Uhlenbeck process, for which the exact solution is known and can be used for comparison, and the double-well system, for which standard approaches such as the ensemble Kalman smoother fail to provide a satisfactory result. Experiments show that our variational approximation is viable and that the results are very promising, as the variational approximate solution outperforms standard Gaussian process regression for non-Gaussian Markov processes.
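A minimal sketch of the Ornstein-Uhlenbeck benchmark mentioned above (not the paper's variational method): simulating dx = -θx dt + σ dW by Euler-Maruyama and comparing the empirical mean at t = 1 with the known exact conditional mean x₀ exp(-θt). All parameters are illustrative:

```python
import math
import random

# Ornstein-Uhlenbeck SDE: dx = -theta * x dt + sigma dW.
theta, sigma, x0 = 2.0, 0.5, 1.0
dt, n_steps, n_paths = 0.01, 100, 2000
random.seed(0)

# Euler-Maruyama: x_{k+1} = x_k - theta*x_k*dt + sigma*sqrt(dt)*N(0,1).
end_vals = []
for _ in range(n_paths):
    x = x0
    for _ in range(n_steps):
        x += -theta * x * dt + sigma * math.sqrt(dt) * random.gauss(0, 1)
    end_vals.append(x)

t = n_steps * dt                          # t = 1.0
empirical_mean = sum(end_vals) / n_paths
exact_mean = x0 * math.exp(-theta * t)    # known OU conditional mean
```

It is exactly this availability of a closed-form solution that makes the OU process a natural first test case for any approximate inference scheme over SDE paths.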

Relevance:

30.00%

Abstract:

The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high-efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial-scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high-efficiency cyclones for char removal, and a disk-and-doughnut quench column combined with a wet-walled electrostatic precipitator, which is directly mounted on top, for liquids collection. In order to aid design and scale-up calculations, detailed mathematical modelling was undertaken of the reaction system, enabling sizes, efficiencies and operating conditions to be determined. Specifically, a modular approach was taken due to the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for the determination of the biomass ablation rate, specification of the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol to be developed, capable of specifying the required reactor and product collection system size for specified biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity between the heated surface and reacting biomass particle of 12.1 m/s. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply. The reactor is capable of being operated at a far higher throughput, but this would require a new feeder and drive motor to be purchased. Modelling showed that the reactor is capable of achieving a throughput of approximately 30 kg/hr. This is an area that should be considered in the future, as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second. Operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned due to equipment failure. This does not detract from the fact that the reactor and product collection system design was extremely efficient. The maximum total liquid yield was 64.9% on a dry-wood-fed basis. It is considered that the liquid yield would have been higher had there been sufficient development time to overcome certain operational difficulties, and if longer operating runs had been attempted to offset product losses occurring due to the difficulties in collecting all available product from a large-scale collection unit. The liquids collection system was highly efficient, and modelling determined a liquid collection efficiency of above 99% on a mass basis. This was validated by a dry-ice/acetone condenser and a cotton-wool filter downstream of the collection unit, which enabled mass measurement of the condensable product exiting the product collection unit and confirmed that the collection efficiency was in excess of 99% on a mass basis.

Relevance:

30.00%

Abstract:

A survey of the existing state of the art of turbine blade manufacture highlights two operations that have not been automated: the loading of a turbine blade into an encapsulation die, and the removal of a machined blade from the encapsulation block. The automation of blade decapsulation has not been pursued. In order to develop a system to automate the loading of an encapsulation die, a prototype mechanical handling robot has been designed together with a computer-controlled encapsulation die. The robot has been designed as a mechanical handling robot of cylindrical geometry, suitable for use in a circular work cell. It is the prototype for a production model to be called 'The Cybermate'. The prototype robot is mechanically complete, but due to unforeseen circumstances the robot control system is not available (the development of the control system did not form a part of this project), hence it has not been possible to fully test and assess the robot's mechanical design. Robot loading of the encapsulation die has thus been simulated. The research work with regard to the encapsulation die has focused on the development of computer-controlled, hydraulically actuated location pins. Such pins compensate for the inherent positional inaccuracy of the loading robot and reproduce the dexterity of the human operator. Each pin comprises a miniature hydraulic cylinder, controlled by a standard bidirectional flow control valve. Precision positional control is obtained through pulsing of the valves under software control, with positional feedback from an 8-bit transducer. A test rig comprising one hydraulic location pin together with an opposing spring-loaded pin has demonstrated that such a pin arrangement can be controlled with a repeatability of ±0.00045". In addition, this test rig has demonstrated that such a pin arrangement can be used to gauge and compensate for the dimensional error of the component held between the pins, by offsetting the pin datum positions to allow for the component error. A gauging repeatability of ±0.00015" was demonstrated. This work has led to the design and manufacture of an encapsulation die comprising ten such pins and the associated computer software. All aspects of the control software except blade gauging and positional data storage have been demonstrated. Work is now required to achieve the accuracy of control demonstrated by the single-pin test rig with each of the ten pins in the encapsulation die. This would allow trials of the complete loading cycle to take place.
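The pulsed-valve positioning scheme can be sketched as a simple software loop. The plant response, gain, pulse cap and setpoint below are all hypothetical, intended only to show proportional pulsing against 8-bit transducer feedback, not the thesis's actual controller:

```python
# Hypothetical setpoint on the 8-bit position transducer (0-255 counts).
target_counts = 180
position = 0.0          # pin position, expressed in transducer counts

for _ in range(200):
    reading = int(position) & 0xFF      # quantised 8-bit feedback
    error = target_counts - reading
    if error == 0:
        break
    # Pulse the bidirectional flow control valve: pulse length is
    # proportional to the error, capped to limit overshoot.
    pulse = max(-5, min(5, error))
    position += 0.8 * pulse             # hypothetical plant response per pulse

final_error = target_counts - (int(position) & 0xFF)
```

The cap on the pulse length plays the role of the software-limited valve opening time; shrinking pulses near the setpoint is what lets a coarse on/off valve reach a fine repeatability.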

Relevance:

30.00%

Abstract:

The process of manufacturing system design frequently includes modeling, and usually this means applying a technique such as discrete event simulation (DES). However, the computer tools currently available to apply this technique enable only a superficial representation of the people who operate within the systems. This is a serious limitation because the performance of people remains central to the competitiveness of many manufacturing enterprises. Therefore, this paper explores the use of probability density functions to represent the variation of worker activity times within DES models.
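A minimal sketch of the idea, assuming a lognormal density and illustrative parameters (the paper does not prescribe a particular distribution): drawing worker activity times from a probability density function rather than treating them as a fixed duration:

```python
import math
import random

random.seed(1)

# Nominal task time and an illustrative spread for worker variability.
mean_time = 5.0                             # minutes
sigma = 0.3
mu = math.log(mean_time) - sigma**2 / 2     # chosen so E[T] = mean_time

def activity_time():
    """Draw one worker activity duration from the lognormal pdf."""
    return random.lognormvariate(mu, sigma)

# Compare a stochastic workload against the fixed-duration assumption.
n_jobs = 10_000
stochastic_makespan = sum(activity_time() for _ in range(n_jobs))
deterministic_makespan = n_jobs * mean_time
```

Over many jobs the two totals agree on average, but the stochastic draws expose run-to-run variation, queueing effects and tail behaviour that a single fixed activity time hides from a DES model.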

Relevance:

30.00%

Abstract:

Dynamically adaptive systems (DASs) are intended to monitor the execution environment and then dynamically adapt their behavior in response to changing environmental conditions. The uncertainty of the execution environment is a major motivation for dynamic adaptation; it is impossible to know at development time all of the possible combinations of environmental conditions that will be encountered. To date, the work performed in requirements engineering for a DAS includes requirements monitoring and reasoning about the correctness of adaptations, where the DAS requirements are assumed to exist. This paper introduces a goal-based modeling approach to develop the requirements for a DAS, while explicitly factoring uncertainty into the process and resulting requirements. We introduce a variation of threat modeling to identify sources of uncertainty and demonstrate how the RELAX specification language can be used to specify more flexible requirements within a goal model to handle the uncertainty. © 2009 Springer Berlin Heidelberg.