965 results for well-structured transition systems
Abstract:
Discrete event simulation is a popular aid for manufacturing system design; however, in application this technique can sometimes be unnecessarily complex. This paper is concerned with applying an alternative technique to manufacturing system design which may well provide an efficient form of rough-cut analysis. This technique is System Dynamics, and the work described in this paper has incorporated the principles of this technique into a computer-based modelling tool tailored to manufacturing system design. The paper first explores the principles of System Dynamics and how they differ from Discrete Event Simulation. The opportunity for System Dynamics is then examined, which leads to defining the capabilities that a suitable tool would need. This specification is then transformed into a computer modelling tool, which is assessed by applying it to model an engine production facility. http://www.worldscientific.com/doi/abs/10.1142/S0219686703000228
Abstract:
Despite high expectations, the industrial take-up of Semantic Web technologies in developing services and applications has been slower than expected. One of the main reasons is that many legacy systems have been developed without considering the potential of the Web in integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a tedious and expensive process, which carries a significant risk of failure. There is an urgent need to provide strategies allowing the migration of legacy systems to Semantic Web Services platforms, and also tools to support such strategies. In this paper we propose a methodology and its tool support for transitioning these applications to Semantic Web Services, allowing users to migrate their applications to Semantic Web Services platforms automatically or semi-automatically. The transition of the GATE system is used as a case study. © 2009 IOS Press and the authors. All rights reserved.
Abstract:
Internally heated fluids are found across the nuclear fuel cycle. In certain situations the motion of the fluid is driven by the decay heat (e.g. corium melt pools in severe accidents, the shutdown of liquid metal reactors, molten salts and the passive control of light water reactors) as well as by normal operation (e.g. intermediate waste storage and Generation IV reactor designs). This can, in the long term, affect reactor vessel integrity or lead to localized hot spots and the accumulation of solid wastes that may prompt local increases in activity. Two approaches to the modeling of internally heated convection are presented here. These are based on numerical analysis using codes developed in-house and on simulations using widely available computational fluid dynamics solvers. Open and closed fluid layers of various aspect ratios, at around the transition between conduction and convection, are considered. We determine the optimum domain aspect ratios (1:7:7 up to 1:24:24 for open systems and 5:5:1, 1:10:10 and 1:20:20 for closed systems), mesh resolutions and turbulence models required to accurately and efficiently capture the convection structures that evolve when perturbing the conductive state of the fluid layer. Note that the open and closed fluid layers we study here are bounded by a conducting surface over an insulating surface. Conclusions are drawn on the influence of the periodic boundary conditions on the flow patterns observed. We have also examined the stability of the nonlinear solutions that we found, with the aim of identifying the bifurcation sequence of these solutions en route to turbulence.
Abstract:
The paper discusses both the complementary factors and the contradictions of adopting ERP-based systems alongside Enterprise 2.0. ERP is well known for its efficient business process management; it is equally well known for the high failure rate of its implementations. According to [1], ERP systems can achieve efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. Enterprise 2.0, by contrast, supports flexible business process management and informal, less structured interactions [3],[4],[21]. Traditional researchers claimed that efficiency and flexibility may seem incompatible in that they are different business objectives and may exist in different organizational environments. However, this paper breaks with that norm by combining ERP and Enterprise 2.0 in a single enterprise to improve both efficient and flexible operations simultaneously. Based on multiple case studies, four cases presented different attitudes toward the use of ERP systems and enterprise social systems. Drawing on socio-technical theory, the paper presents an in-depth analysis of the benefits of combining ERP with Enterprise 2.0 for these firms.
Abstract:
Studying the transition from a linearly stable coherent laminar state to a highly disordered state of turbulence is conceptually and technically challenging, and of great interest because all pipe and channel flows are of that type. In optics, understanding how a system loses coherence, as spatial size or the strength of excitation increases, is a fundamental problem of practical importance. Here, we report our studies of a fibre laser that operates in both laminar and turbulent regimes. We show that the laminar phase is analogous to a one-dimensional coherent condensate and the onset of turbulence is due to the loss of spatial coherence. Our investigations suggest that the laminar-turbulent transition in the laser is due to condensate destruction by clustering dark and grey solitons. This finding could prove valuable for the design of coherent optical devices as well as systems operating far from thermodynamic equilibrium. © 2013 Macmillan Publishers Limited.
Abstract:
The Product Service Systems (PSS), servitization, and Service Science literature continues to grow as organisations seek to protect and improve their competitive position. The potential of technology applications to deliver service delivery systems, facilitated by the ability to make real-time decisions based upon ‘in the field’ performance, is also significant. Research identifies four key questions to be addressed. Namely: how far along the servitization continuum should the organisation go in a single strategic step? Does the organisation have the structure and infrastructure to support this transition? What level of condition monitoring should it employ? Is the product positioned correctly in the value chain to adopt condition monitoring technology? Strategy consists of three dimensions, namely content, context, and process. The literature relating to PSS, servitization, and strategy discusses these concepts relative to content and context, but none of it offers a process for delivering an aligned strategy for a service delivery system enabled by condition-based management. This paper presents a tested iterative strategy formulation methodology which is the result of a structured development programme.
Abstract:
The chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems alongside Enterprise 2.0. ERP is well known for its efficient business process management. Enterprise 2.0 supports flexible business process management and informal, less structured interactions. Traditional studies indicate that efficiency and flexibility may seem incompatible because they are different business objectives and may exist in different organizational environments. However, the chapter breaks with traditional norms by combining ERP and Enterprise 2.0 in a single enterprise to improve both efficient and flexible operations simultaneously. Based on multiple case studies, the chapter analyzes the benefits and risks of combining ERP with Enterprise 2.0 from the process, organization, and people paradigms. © 2013 by IGI Global.
Abstract:
Transition P Systems are a parallel and distributed computational model based on the notion of the cellular membrane structure. Each membrane determines a region that encloses a multiset of objects and evolution rules. Transition P Systems evolve through transitions between two consecutive configurations, which are determined by the membrane structure and the multisets present inside the membranes. Transitions between two consecutive configurations are carried out by an exhaustive, non-deterministic and parallel application of evolution rules. However, to establish the rules to be applied, the useful, applicable and active rules must be calculated first. Hence, the computation of useful evolution rules is critical for the efficiency of the whole evolution process, because it is performed in parallel inside each membrane in every evolution step. This work defines usefulness states through an exhaustive analysis of the P system for every membrane and for every possible configuration of the membrane structure during the computation. Moreover, this analysis can be done statically; therefore membranes only have to check their usefulness states to obtain their set of useful rules during execution.
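The useful/applicable distinction in the abstract above can be sketched with a toy model. In this hypothetical Python sketch (the `Rule` class and function names are illustrative, not taken from the paper), a rule is useful when every membrane it targets still exists in the structure, and applicable when its antecedent multiset is contained in the membrane's contents:

```python
from collections import Counter

class Rule:
    """Hypothetical evolution rule: consumes an antecedent multiset and
    sends its consequent to a set of target membranes (labels)."""
    def __init__(self, antecedent, targets):
        self.antecedent = Counter(antecedent)  # objects consumed
        self.targets = set(targets)            # membranes the consequent addresses

def useful_rules(rules, child_membranes):
    """A rule is useful if every membrane it targets currently exists."""
    return [r for r in rules if r.targets <= child_membranes]

def applicable_rules(rules, contents):
    """A rule is applicable if its antecedent is contained in the multiset."""
    return [r for r in rules
            if all(contents[obj] >= n for obj, n in r.antecedent.items())]

# A rule targeting an existing membrane, and one targeting a dissolved one:
rules = [Rule("aab", {2}), Rule("c", {3})]
active = applicable_rules(useful_rules(rules, {2}), Counter("aaabb"))
```

Checking a precomputed usefulness state at run time, as the abstract proposes, would replace the `useful_rules` scan with a table lookup.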
Abstract:
Transition P-systems are based on biological membranes and try to emulate cell behavior and its evolution due to the presence of chemical elements. These systems perform computation through transitions between two consecutive configurations, each of which consists of an m-tuple of the multisets present at any moment in the m regions of the system. The transition between two configurations is performed using the evolution rules also present in each region. Among the main characteristics of Transition P-systems are massive parallelism and non-determinism. This work is part of a very large project and aims to design a hardware circuit that can remarkably improve the process involved in the evolution of a membrane. Processing in biological cells exhibits two different levels of parallelism: the first, obviously, is the evolution of each cell inside the whole set, and the second is the application of the rules inside one membrane. This paper presents an evolution of the work done previously and includes an improvement that uses massive parallelism for the transition between two states. To achieve this, the initial set of rules is transformed into a new set consisting of all their possible combinations, and each combination is treated as a new rule (the participants' antecedents are added to generate a new multiset), turning a single rule application into a form of parallelism, in the sense that several rules are applied at the same time. We present a circuit that is able to process this kind of rules and to decode the result, taking advantage of the potential hardware offers for implementing P Systems compared with previously proposed sequential solutions.
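The rule-combination step described above can be sketched as follows. This is a hypothetical illustration (names are mine, not the paper's) of turning every non-empty combination of the original rules into a new rule whose antecedent is the multiset sum of its members, so that applying one combined rule amounts to applying several original rules simultaneously:

```python
from collections import Counter
from itertools import combinations

def combine_rules(rules):
    """Expand a list of rule antecedents (Counters) into all non-empty
    combinations; each combination becomes one new rule whose antecedent
    is the multiset sum of the participants' antecedents."""
    combined = []
    for r in range(1, len(rules) + 1):
        for subset in combinations(range(len(rules)), r):
            antecedent = Counter()
            for i in subset:
                antecedent += rules[i]     # add participant antecedents
            combined.append((subset, antecedent))
    return combined

# 3 rules yield 2**3 - 1 = 7 combined rules:
combined = combine_rules([Counter("a"), Counter("b"), Counter("ab")])
```

The exponential growth of the combined set is the price paid for letting a single hardware match step stand in for several parallel rule applications.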
Abstract:
In the field of Transition P systems implementation, it is very important to determine in advance how long the application of evolution rules in membranes takes. Moreover, having time estimations of rules application in membranes makes it possible to take important decisions related to hardware/software architecture design. The work presented here introduces an algorithm for applying active evolution rules in Transition P systems, which is based on active rules elimination. The algorithm meets the requirements of being non-deterministic and massively parallel and, more importantly, it is time-bounded, because it depends only on the number of membrane evolution rules.
Abstract:
Transition P Systems are a parallel and distributed computational model based on the notion of the cellular membrane structure. Each membrane determines a region that encloses a multiset of objects and evolution rules. Transition P Systems evolve through transitions between two consecutive configurations, which are determined by the membrane structure and the multisets present inside the membranes. Transitions between two consecutive configurations are carried out by an exhaustive, non-deterministic and parallel application of the subset of active evolution rules inside each membrane of the P system. However, to establish the subset of active evolution rules, the useful and applicable rules must be calculated first. Hence, the computation of the subset of applicable evolution rules is critical for the efficiency of the whole evolution process, because it is performed in parallel inside each membrane in every evolution step. The work presented here shows the advantages of incorporating decision trees into the evolution rules applicability algorithm. To this end, we present the formalizations necessary to treat this as a classification problem, the method for automatically generating the required decision tree, and the new applicability algorithm based on it.
Abstract:
P systems, or Membrane Computing, are a type of distributed, massively parallel and non-deterministic system based on biological membranes. They are inspired by the way cells process chemical compounds, energy and information. These systems perform a computation through transitions between two consecutive configurations. As is well known in membrane computing, a configuration consists of an m-tuple of the multisets present at any moment in the m regions that the system has at that moment. Transitions between two configurations are performed by applying the evolution rules present in each region of the system in a non-deterministic, maximally parallel manner. This work is part of a broader line of research whose final objective is to implement a hardware system that evolves as a transition P-system does. To achieve this objective, the generic system has been divided into several stages, each addressing specific concerns. In this paper we develop the stage responsible for applying the active rules. Different algorithms exist for counting the number of times the active rules are applied. Here we present an algorithm with improved aspects: the number of iterations necessary to reach the final values is smaller than when each rule is applied step by step. Hence, the whole process requires fewer steps and therefore finishes in a shorter time.
Abstract:
Transition P systems are computational models based on basic features of biological membranes and the observation of biochemical processes. In these models, membranes contain multisets of objects, which evolve according to given evolution rules. In the field of Transition P systems implementation, the need has been identified to determine how long the application of active evolution rules in membranes will take. In addition, having time estimations of rules application makes it possible to take important decisions related to hardware/software architecture design. In this paper we propose a new evolution rules application algorithm oriented towards the implementation of Transition P systems. The developed algorithm is sequential and has linear complexity in the number of evolution rules. Moreover, it achieves the smallest execution times compared with the preceding algorithms. The algorithm is therefore very appropriate for the implementation of Transition P systems in sequential devices.
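The general idea of a sequential pass that is linear in the number of rules can be sketched as below. This is only an illustration under my own assumptions, not the paper's exact algorithm: each rule (represented as a Counter giving how many copies of each object its antecedent consumes) is visited once, in a random order, and applied a maximal number of times in a single step:

```python
import random
from collections import Counter

def max_applications(rule, contents):
    """Largest k such that the antecedent fits k times in the contents."""
    return min(contents[obj] // n for obj, n in rule.items())

def apply_rules_once(rules, contents):
    """One sequential pass over the rules: cost is linear in len(rules).
    Returns the leftover contents and how often each rule was applied."""
    contents = Counter(contents)
    applied = {}
    order = list(enumerate(rules))
    random.shuffle(order)                  # non-deterministic visiting order
    for i, rule in order:
        k = max_applications(rule, contents)
        for obj, n in rule.items():
            contents[obj] -= k * n         # consume the antecedent k times
        applied[i] = k
    return contents, applied
```

Because each rule is handled exactly once, the running time is bounded by the number of rules, which is the property that makes such algorithms predictable enough for hardware/software architecture decisions.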
Abstract:
We use advanced statistical tools of time-series analysis to characterize the dynamical complexity of the transition to optical wave turbulence in a fiber laser. Ordinal analysis and the horizontal visibility graph applied to the experimentally measured laser output intensity reveal the presence of temporal correlations during the transition from the laminar to the turbulent lasing regimes. Both methods unveil coherent structures with well-defined time scales and strong correlations, both in the timing of the laser pulses and in their peak intensities. Our approach is generic and may be used in other complex systems that undergo similar transitions involving the generation of extreme fluctuations.
Abstract:
OBJECTIVES: To understand older adults' experiences of moving into extra care housing which offers enrichment activities alongside social and healthcare support. DESIGN: A longitudinal study was conducted which adopted a phenomenological approach to data generation and analysis. METHODS: Semi-structured interviews were conducted in the first 18 months of living in extra care housing. Interpretative phenomenological analysis was used because its commitment to idiography enabled an in-depth analysis of the subjective lived experience of moving into extra care housing. Themes generated inductively were examined against an existential-phenomenological theory of well-being. RESULTS: Learning to live in an extra care community showed negotiating new relationships was not straightforward; maintaining friendships outside the community became more difficult as capacity declined. In springboard for opportunity/confinement, living in extra care provided new opportunities for social engagement and a restored sense of self. Over time horizons began to shrink as incapacities grew. Seeking care illustrated reticence to seek care, due to embarrassment and a sense of duty to one's partner. Becoming aged presented an ontological challenge. Nevertheless, some showed a readiness for death, a sense of homecoming. CONCLUSIONS: An authentic later life was possible, but residents required emotional and social support to live through the transition and challenges of becoming aged. Enhancement activities boosted residents' quality of life, but the range of activities could be extended to cater better for quieter, smaller scale events within the community; volunteer activity facilitators could be used here. Peer mentoring may help build new relationships and opportunities for interactive stimulation. Acknowledging the importance of feeling (empathic imagination) in caregiving may help staff and residents relate better to each other, thus helping individuals to become ontologically secure and live well to the end.