Abstract:
Successful identification and exploitation of opportunities have been an area of interest to many entrepreneurship researchers. Since Shane and Venkataraman's seminal work (e.g. Shane and Venkataraman, 2000; Shane, 2000), several scholars have theorised on how firms identify, nurture and develop opportunities. The majority of this literature has been devoted to understanding how entrepreneurs search for new applications of their technological base or discover opportunities based on prior knowledge (Zahra, 2008; Sarasvathy et al., 2003). In particular, knowledge about potential customer needs and problems that may present opportunities is vital (Webb et al., 2010). Whereas the role of prior knowledge of customer problems (Shane, 2003; Shepherd and DeTienne, 2005) and of positioning oneself in a so-called knowledge corridor (Fiet, 1996) has been researched, the role of opportunity characteristics, and their interaction with customer-related mechanisms that facilitate or hinder opportunity identification, has received scant attention.
Abstract:
The improvement and optimization of business processes are among the top priorities in an organization. Although process analysis methods are mature today, business analysts and stakeholders are still hampered by communication issues: analysts cannot effectively elicit accurate business requirements from stakeholders, and stakeholders are often confused by the analytic results offered by analysts. We argue that using a virtual world to model a business process can benefit these communication activities. We believe that virtual worlds can serve as an efficient model-view approach, improving comprehension of business requirements and analytic results, as well as enabling the validation of business plans. A healthcare case study is provided as an instance of the approach, illustrating how intuitive it can be. As an exploratory paper, this work is intended to encourage further investigation of research topics in the interdisciplinary area of information systems, visualization and multi-user virtual worlds.
Abstract:
Process modeling is an important design practice in organizational improvement projects. In this paper, we examine the design of business process diagrams in contexts where novice analysts only have basic design tools such as paper and pencils available, and little to no understanding of formalized modeling approaches. Based on a quasi-experimental study with 89 BPM students, we identify five distinct process design archetypes ranging from textual to hybrid and graphical representation forms. We examine the quality of the designs and identify which representation formats enable an analyst to articulate business rules, states, events, activities, temporal and geospatial information in a process model. We found that the quality of the process designs decreases with the increased use of graphics and that hybrid designs featuring appropriate text labels and abstract graphical forms appear well-suited to describe business processes. We further examine how process design preferences predict formalized process modeling ability. Our research has implications for practical process design work in industry as well as for academic curricula on process design.
Abstract:
Using complex event rules for capturing dependencies between business processes is an emerging trend in enterprise information systems. In previous work we have identified a set of requirements for event extensions for business process modeling languages. This paper introduces a graphical language for modeling composite events in business processes, namely BEMN, that fulfills all of these requirements. These include event conjunction, disjunction and inhibition, as well as cardinality of events, whose graphical expression can be factored into flow-oriented process modeling and event rule modeling. Formal semantics for the language are provided.
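As a purely illustrative aside (BEMN itself is a graphical notation; the class names and evaluation model below are assumptions made for this sketch, not part of the language), composite event rules built from conjunction, disjunction and inhibition can be represented and evaluated over a set of observed event types roughly as follows:

```python
from dataclasses import dataclass

# Minimal sketch of composite event rules: conjunction, disjunction and
# inhibition evaluated against the set of event types observed so far.
# (Cardinality constraints would additionally require counting occurrences.)

class Rule:
    def matches(self, observed: set) -> bool:
        raise NotImplementedError

@dataclass
class Event(Rule):
    name: str
    def matches(self, observed):
        return self.name in observed

@dataclass
class And(Rule):                 # event conjunction: all parts occurred
    parts: list
    def matches(self, observed):
        return all(p.matches(observed) for p in self.parts)

@dataclass
class Or(Rule):                  # event disjunction: at least one part occurred
    parts: list
    def matches(self, observed):
        return any(p.matches(observed) for p in self.parts)

@dataclass
class Not(Rule):                 # inhibition: the inhibiting event is absent
    part: Rule
    def matches(self, observed):
        return not self.part.matches(observed)

# "order AND (payment OR voucher) AND NOT cancellation"
rule = And([Event("order"), Or([Event("payment"), Event("voucher")]), Not(Event("cancel"))])
print(rule.matches({"order", "payment"}))   # True
print(rule.matches({"order", "cancel"}))    # False
```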
Abstract:
Flow-oriented process modeling languages have a long tradition in the area of Business Process Management and are widely used for capturing activities together with their behavioral and data dependencies. Individual events were introduced for triggering process instantiation and activities. However, real-world business cases drive the need to also cover complex event patterns, as they are known in the field of Complex Event Processing. This paper therefore puts forward a catalog of requirements for handling complex events in process models, which can be used as a reference framework for assessing process definition languages and systems. An assessment of BPEL and BPMN is provided.
Abstract:
The crosstalk between fibroblasts and keratinocytes is a vital component of the wound healing process, and involves the activity of a number of growth factors and cytokines. In this work, we develop a mathematical model of this crosstalk in order to elucidate the effects of these interactions on the regeneration of collagen in a wound that heals by second intention. We consider the role of four components that strongly affect this process: transforming growth factor-beta, platelet-derived growth factor, interleukin-1 and keratinocyte growth factor. The impact of this network of interactions on the degradation of an initial fibrin clot, as well as its subsequent replacement by a matrix that is mainly comprised of collagen, is described through an eight-component system of nonlinear partial differential equations. Numerical results, obtained in a two-dimensional domain, highlight key aspects of this multifarious process such as reepithelialisation. The model is shown to reproduce many of the important features of normal wound healing. In addition, we use the model to simulate the treatment of two pathological cases: chronic hypoxia, which can lead to chronic wounds; and prolonged inflammation, which has been shown to lead to hypertrophic scarring. We find that our model predictions are qualitatively in agreement with previously reported observations, and provide an alternative pathway for gaining insight into this complex biological process.
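For orientation only, a generic component of such an eight-component reaction-diffusion system can be written as below; the symbols and functional forms are assumptions for illustration, not the authors' exact equations.

```latex
\frac{\partial u_i}{\partial t}
  = \underbrace{D_i \nabla^2 u_i}_{\text{diffusion}}
  + \underbrace{f_i(u_1, \dots, u_8)}_{\text{production, interaction and decay}},
  \qquad i = 1, \dots, 8,
```

where each $u_i$ is the density or concentration of one species (for example fibroblasts, keratinocytes, fibrin, collagen, TGF-beta, PDGF, IL-1 or KGF), $D_i$ is its diffusivity, and the nonlinear terms $f_i$ encode the growth-factor-mediated crosstalk described above.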
Abstract:
A Multimodal Seaport Container Terminal (MSCT) is a complex system which requires careful planning and control in order to operate efficiently. It consists of a number of subsystems that require optimisation of the operations within them, as well as synchronisation of machines and containers between the various subsystems. Inefficiency in the terminal can delay ships from their scheduled timetables, as well as cause delays in delivering containers to their inland destinations, both of which can be very costly to their operators. The purpose of this PhD thesis is to use Operations Research methodologies to optimise and synchronise these subsystems as an integrated application. An initial model is developed for the overall MSCT; however, due to the large number of assumptions that had to be made, as well as other issues, it is found to be too inaccurate and infeasible for practical use. Instead, a method is proposed of developing models for each subsystem that can then be integrated with each other. Mathematical models are developed for the Storage Area System (SAS) and the Intra-terminal Transportation System (ITTS). The SAS deals with the movement and assignment of containers to stacks within the storage area, both when they arrive and when they are rehandled to retrieve containers below them. The ITTS deals with scheduling the movement of containers and machines between the storage areas and other sections of the terminal, such as the berth and road/rail terminals. Various constructive heuristics are explored and compared for these models to produce good initial solutions for large-sized problems, which are otherwise impractical to solve by exact methods. These initial solutions are further improved through the use of an innovative hyper-heuristic algorithm that integrates the SAS and ITTS solutions and optimises them through meta-heuristic techniques. The method by which the two models interact as an integrated system is discussed, as well as how this method can be extended to the other subsystems of the MSCT.
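To give a flavour of what a constructive heuristic for the SAS might look like (a minimal sketch under assumed data and rules, not one of the thesis models), a greedy rule can place each arriving container on the stack where it is least likely to block a container that departs earlier:

```python
# Minimal illustrative sketch (hypothetical data and rules, not the thesis
# models): greedily assign an arriving container to a storage stack, avoiding
# stacks whose top container departs earlier than the new one, since the new
# container would then have to be rehandled later.

MAX_HEIGHT = 4   # assumed maximum stack height

def assign(container_departure, stacks):
    """stacks: list of stacks, each a list of departure times, bottom to top."""
    best, best_penalty = None, None
    for idx, stack in enumerate(stacks):
        if len(stack) >= MAX_HEIGHT:
            continue
        # Penalty 1 if the current top departs before the new container
        # (a future rehandle), then prefer lower stacks as a tie-breaker.
        blocking = 1 if stack and stack[-1] < container_departure else 0
        penalty = (blocking, len(stack))
        if best_penalty is None or penalty < best_penalty:
            best, best_penalty = idx, penalty
    if best is None:
        raise RuntimeError("no stack has free capacity")
    stacks[best].append(container_departure)
    return best

stacks = [[5, 9], [3], []]           # departure days of already stored containers
print(assign(7, stacks), stacks)     # the empty stack is chosen: no rehandle risk
```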
Abstract:
The study shows an alternative solution to existing efforts at solving the problem of how to centrally manage and synchronise users' Multiple Profiles (MP) across multiple discrete social networks. Most social network users hold more than one social network account and utilise them in different ways depending on the digital context (Iannella, 2009a). They may, for example, enjoy friendly chat on Facebook, professional discussion on LinkedIn, and health information exchange on PatientsLikeMe. Many web users therefore need to manage disparate profiles across many distributed online sources, and maintaining these profiles is cumbersome, time-consuming, inefficient, and may lead to lost opportunity. In this thesis the researcher proposes a framework for the management of a user's multiple online social network profiles. A demonstrator, called Multiple Profile Manager (MPM), will be showcased to illustrate how effective the framework will be. The MPM will achieve the required profile management and synchronisation using a free, open, decentralized social networking platform (OSW) that was proposed by the Vodafone Group in 2010. The proposed MPM will enable a user to create and manage an integrated profile (IP) and share/synchronise this profile with all their social networks. The necessary protocols to support the prototype are also proposed by the researcher. The MPM protocol specification defines an Extensible Messaging and Presence Protocol (XMPP) extension for sharing vCard and social network account information between the MPM Server, MPM Client, and social network sites (SNSs). The writer of this thesis adopted a research approach and a number of use cases for the implementation of the project. The use cases were created to capture the functional requirements of the MPM and to describe the interactions between users and the MPM. In the research a development process was followed in establishing the prototype and related protocols. The use cases were subsequently used to illustrate the prototype via screenshots of the MPM client interfaces. The use cases also played a role in evaluating the outcomes of the research, such as the framework, prototype, and related protocols. An innovative application of this project is in the area of public health informatics. The researcher utilised the prototype to examine how the framework might benefit patients and physicians. The framework can greatly enhance health information management for patients and, more importantly, offer physicians a more comprehensive personal health overview of patients. This will give a more complete picture of the patient's background than is currently available and will prove helpful in providing the right treatment. The MPM prototype and related protocols have high application value as they can be integrated into the real OSW platform and so serve users in the modern digital world. They also provide online users with a real platform for centrally storing their complete profile data, efficiently managing their personal information, and synchronising the overall complete profile with each of the discrete profiles stored in their different social network sites.
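For illustration only, an XMPP <iq> stanza carrying vCard data together with a hypothetical list of social network accounts might be assembled as sketched below; the vCard layout follows the standard vcard-temp extension (XEP-0054), while the MPM-specific namespace and fields are assumptions, not the thesis specification.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch only: an XMPP <iq> stanza carrying vCard data plus a
# hypothetical MPM extension listing the social network accounts to synchronise.
# The vCard layout follows the vcard-temp extension (XEP-0054); the MPM
# namespace and fields are assumptions, not the thesis specification.

iq = ET.Element("iq", attrib={"type": "set", "id": "mpm-1",
                              "to": "mpm.server.example.org"})
vcard = ET.SubElement(iq, "vCard", attrib={"xmlns": "vcard-temp"})
ET.SubElement(vcard, "FN").text = "Alice Example"
ET.SubElement(vcard, "EMAIL").text = "alice@example.org"

accounts = ET.SubElement(iq, "accounts", attrib={"xmlns": "urn:example:mpm:0"})
ET.SubElement(accounts, "account", attrib={"site": "facebook"}).text = "alice.fb"
ET.SubElement(accounts, "account", attrib={"site": "linkedin"}).text = "alice-li"

print(ET.tostring(iq, encoding="unicode"))
```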
Abstract:
Exploiting wind energy is one possible way to extend flight duration for Unmanned Aerial Vehicles. Wind energy can also be used to minimise energy consumption along a planned path. In this paper, we consider uncertain, time-varying wind fields and plan a path through them. A Gaussian distribution is used to characterise the uncertainty in the time-varying wind fields. We use a Markov Decision Process to plan a path based upon this Gaussian uncertainty. Simulation results are presented that compare the direct line of flight between the start and target points with our planned path, in terms of energy consumption and time of travel. The result is a robust path obtained by selecting the most visited cells while sampling the Gaussian distribution of the wind field in each cell.
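A minimal sketch of this planning idea is given below; the grid size, airspeed, wind statistics and the traversal-cost proxy are all assumptions for illustration, not the paper's parameters. Each cell's wind is treated as a Gaussian, the expected cost of each one-cell move is estimated by Monte Carlo sampling, and value iteration is run over the resulting Markov Decision Process.

```python
import numpy as np

# Minimal sketch (hypothetical grid, airspeed, wind statistics and cost proxy,
# not the paper's setup): model each cell's wind as a Gaussian, estimate the
# expected cost of every one-cell move by Monte Carlo sampling, and run value
# iteration over the resulting Markov Decision Process.

rng = np.random.default_rng(0)
GRID = (10, 10)
GOAL = (9, 9)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]            # one-cell moves
AIRSPEED = 10.0                                         # assumed airspeed (m/s)
N_SAMPLES = 200                                         # wind samples per estimate

# Hypothetical time-averaged wind field: per-cell mean and std of both components.
wind_mean = rng.uniform(-3.0, 3.0, size=GRID + (2,))
wind_std = rng.uniform(0.5, 1.5, size=GRID + (2,))

def expected_cost(cell, action):
    """Expected traversal-time proxy for one move, sampling the Gaussian wind."""
    wind = rng.normal(wind_mean[cell], wind_std[cell], size=(N_SAMPLES, 2))
    tailwind = wind @ np.asarray(action, dtype=float)   # wind component along the move
    ground_speed = np.maximum(AIRSPEED + tailwind, 1e-3)
    return float(np.mean(1.0 / ground_speed))           # time ~ energy at constant power

# Precompute expected one-step costs, then run value iteration.
cost = {(i, j, a): expected_cost((i, j), ACTIONS[a])
        for i in range(GRID[0]) for j in range(GRID[1]) for a in range(len(ACTIONS))}

V = np.zeros(GRID)
policy = np.full(GRID, -1)    # policy[i, j]: index of the best move from cell (i, j)
for _ in range(500):
    V_new = V.copy()
    for i in range(GRID[0]):
        for j in range(GRID[1]):
            if (i, j) == GOAL:
                continue
            best = np.inf
            for a, (di, dj) in enumerate(ACTIONS):
                ni, nj = i + di, j + dj
                if 0 <= ni < GRID[0] and 0 <= nj < GRID[1]:
                    value = cost[(i, j, a)] + V[ni, nj]
                    if value < best:
                        best, policy[i, j] = value, a
            V_new[i, j] = best
    if np.max(np.abs(V_new - V)) < 1e-9:
        break
    V = V_new
```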
Abstract:
The concept of local accumulation time (LAT) was introduced by Berezhkovskii and coworkers in 2010–2011 to give a finite measure of the time required for the transient solution of a reaction-diffusion equation to approach the steady-state solution (Biophys J. 99, L59 (2010); Phys Rev E. 83, 051906 (2011)). Such a measure is referred to as a critical time. Here, we show that LAT is, in fact, identical to the concept of mean action time (MAT) that was first introduced by McNabb in 1991 (IMA J Appl Math. 47, 193 (1991)). Although McNabb's initial argument was motivated by considering the mean particle lifetime (MPLT) for a linear death process, he applied the ideas to study diffusion. We extend the work of these authors by deriving expressions for the MAT for a general one-dimensional linear advection-diffusion-reaction problem. Using a combination of continuum and discrete approaches, we show that MAT and MPLT are equivalent for certain uniform-to-uniform transitions; these results provide a practical interpretation for MAT, by directly linking the stochastic microscopic processes to a meaningful macroscopic timescale. We find that for more general transitions, the equivalence between MAT and MPLT does not hold. Unlike other critical time definitions, we show that it is possible to evaluate the MAT without solving the underlying partial differential equation (PDE). This makes MAT a simple and attractive quantity for practical situations. Finally, our work explores the accuracy of certain approximations derived using the MAT, showing that useful approximations for nonlinear kinetic processes can be obtained, again without treating the governing PDE directly.
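For reference, a commonly used formulation of the mean action time is sketched below; the notation is assumed here, following McNabb-style definitions, and need not match the paper's exact symbols. For a solution $c(x,t)$ that evolves from an initial state $c_0(x)$ to a steady state $c_\infty(x)$, define

```latex
F(x,t) = 1 - \frac{c(x,t) - c_\infty(x)}{c_0(x) - c_\infty(x)},
```

which, for a monotone transition, increases from 0 to 1 and can be treated as a cumulative distribution function in $t$. The mean action time is then its mean,

```latex
T(x) = \int_0^\infty t \, \frac{\partial F(x,t)}{\partial t} \, \mathrm{d}t
     = \int_0^\infty \bigl[ 1 - F(x,t) \bigr] \, \mathrm{d}t,
```

which explains why $T(x)$ can often be computed from the governing equation without solving for the full transient solution.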
Abstract:
This book provides a much-needed international dimension on the payoffs of information technology investments. The bulk of the research on the impact of information technology investments has been undertaken in developed economies, mainly the United States. This research provides an alternative dimension: a developing-country perspective on how information technology investments impact organizations. Secondly, there has been much debate and controversy about how information technology investment payoffs should be measured. This research uses an innovative two-stage model, proposing that information technology investments first impact processes, and that improvements in those processes then impact performance. In doing so, it considers sectors of information technology investment rather than treating it as a single aggregate. Finally, almost all prior studies in this area have considered only the tangible impact of information technology investments. This research proposes that the benefits can be better understood only by looking at both the tangible and intangible benefits.
Abstract:
Current conceptualizations of organizational processes consider them as internally optimized yet static systems. Still, turbulence in the contextual environment of a firm often leads to adaptation requirements that these processes are unable to fulfill. Based on a multiple case study of the core processes of two large organizations, we offer an extended conceptualization of business processes as complex adaptive systems. This conceptualization can enable firms to optimize business processes by analyzing operations in different contexts and by examining the complex interaction between external, contextual elements and internal agent schemata. From this analysis, we discuss how information technology can play a vital role in achieving this goal by providing discovery, analysis, and automation support. We detail implications for research and practice.
Abstract:
Many corporations and individuals realize that environmental sustainability is an urgent problem to address. In this chapter, we contribute to the emerging academic discussion by proposing two innovative approaches for engaging in the development of environmentally sustainable business processes. First, we describe an extended process modeling approach for capturing and documenting the carbon dioxide emissions produced during the execution of a business process. For illustration, we apply this approach to the case of a government Shared Service provider. Second, we introduce an analysis method for measuring the carbon dioxide emissions produced during the execution of a business process. To illustrate this approach, we apply it in the real-life case of a European airport and show how this information can be leveraged in the re-design of "green" business processes.
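A minimal sketch of the kind of calculation such an analysis method might perform is shown below; the activity data and emission factors are invented for illustration and are not the chapter's figures.

```python
# Minimal sketch (invented data model and factors, not the chapter's method):
# estimate the CO2 attributable to one execution of a business process by
# summing, per activity, resource consumption multiplied by an emission factor.

# Hypothetical activity log for one process instance:
# (activity name, electricity used in kWh, distance driven in km).
process_instance = [
    ("check in passenger", 0.2, 0.0),
    ("transfer baggage",   0.5, 1.2),
    ("fuel aircraft",      3.0, 0.4),
]

# Assumed emission factors in kg CO2 per unit; real factors vary by energy source.
KG_CO2_PER_KWH = 0.5
KG_CO2_PER_KM = 0.15

def instance_emissions(activities):
    """Total kg of CO2 attributed to one process instance."""
    return sum(kwh * KG_CO2_PER_KWH + km * KG_CO2_PER_KM
               for _name, kwh, km in activities)

print(f"Estimated emissions: {instance_emissions(process_instance):.2f} kg CO2")
```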
Abstract:
Bacterially mediated iron redox cycling exerts a strong influence on groundwater geochemistry, but few studies have investigated iron biogeochemical processes in coastal alluvial aquifers from a microbiological viewpoint. The shallow alluvial aquifer located adjacent to the Poona estuary on the subtropical Southeast Queensland coast represents a redox-stratified system where iron biogeochemical cycling potentially affects water quality. Using a 300 m transect of monitoring wells perpendicular to the estuary, we examined groundwater physico-chemical conditions and the occurrence of cultivable bacterial populations involved in iron (and manganese, sulfur) redox reactions in this aquifer. Results showed slightly acidic and near-neutral pH, suboxic conditions and an abundance of dissolved iron, consisting primarily of iron(II), in the majority of wells. The highest level of dissolved iron(III) was found in a well proximal to the estuary, most likely as a result of iron curtain effects due to tidal intrusion. A number of cultivable, (an)aerobic bacterial populations capable of diverse carbon, iron, or sulfur metabolism coexisted in groundwater redox transition zones. Our findings indicated that aerobic, heterotrophic respiration and bacterially mediated iron/sulfur redox reactions were integral to carbon cycling in the aquifer. High abundances of dissolved iron and cultivable iron and sulfur bacterial populations in estuary-adjacent aquifers have implications for iron transport to marine waters. This study demonstrated bacterially mediated iron redox cycling and associated biogeochemical processes in subtropical coastal groundwaters using culture-based methods.
Abstract:
Ocean processes are complex and have high variability in both time and space. Thus, ocean scientists must collect data over long time periods to obtain a synoptic view of ocean processes and resolve their spatiotemporal variability. One way to perform these persistent observations is to utilise an autonomous vehicle that can remain on deployment for long time periods. However, such vehicles are generally underactuated and slow moving. A challenge for persistent monitoring with these vehicles is dealing with currents while executing a prescribed path or mission. Here we present a path planning method for persistent monitoring that exploits ocean currents to increase navigational accuracy and reduce energy consumption.
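As a rough illustration of why exploiting currents reduces energy consumption (the cost model below is an assumption, not the authors' planner), the thrust a slow-moving vehicle must supply corresponds to the difference between its desired ground velocity and the local current, so a path leg aligned with the current requires much less effort than one opposing it:

```python
import numpy as np

# Illustrative sketch (assumed cost model, not the authors' planner): the thrust
# an underactuated vehicle must supply equals its desired ground velocity minus
# the local current, and an energy proxy is the squared thrust times duration.

def leg_energy(start, end, current, speed=0.5):
    """Energy proxy for one straight path leg traversed at fixed ground speed (m/s)."""
    start, end, current = (np.asarray(v, dtype=float) for v in (start, end, current))
    leg = end - start
    distance = np.linalg.norm(leg)
    ground_velocity = speed * leg / distance
    through_water = ground_velocity - current      # velocity the thrusters must produce
    duration = distance / speed
    return float(np.dot(through_water, through_water) * duration)

# A leg aligned with a favourable current costs far less than the same leg against it.
print(leg_energy((0, 0), (100, 0), current=(0.3, 0)))    # with the current
print(leg_energy((0, 0), (100, 0), current=(-0.3, 0)))   # against the current
```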