139 results for Discrete Time Branching Processes
Abstract:
Vendors provide reference process models as consolidated, off-the-shelf solutions to capture best practices in a given industry domain. Customers can then adapt these models to suit their specific requirements. Traditional process flexibility approaches facilitate this operation, but do not fully address it, as they do not sufficiently take into account controlled change guided by vendors' reference models. This tension between the customer's freedom to adapt reference models and the ability to incorporate vendor-initiated reference model changes with relatively low effort thus needs to be carefully balanced. This paper introduces process extensibility as a new paradigm for customizing reference processes and managing their evolution over time. Process extensibility mandates a clear recognition of the different responsibilities and interests of reference model vendors and consumers, and is concerned with keeping the effort of customer-side reference model adaptations low while allowing sufficient room for model change.
Abstract:
Principal Topic Although corporate entrepreneurship is of vital importance for long-term firm survival and growth (Zahra and Covin, 1995), researchers still struggle with understanding how to manage corporate entrepreneurship activities. Corporate entrepreneurship consists of three parts: innovation, venturing, and renewal processes (Guth and Ginsberg, 1990). Innovation refers to the development of new products, venturing to the creation of new businesses, and renewal to redefining existing businesses (Sharma and Chrisman, 1999; Verbeke et al., 2007). Although there are many studies focusing on one of these aspects (cf. Burgelman, 1985; Huff et al., 1992), it is very difficult to compare the outcomes of these studies due to differences in contexts, measures, and methodologies. This is a significant gap in our understanding of CE: because firms engage in all three aspects of CE, it is important to compare managerial and organizational antecedents of innovation, venturing, and renewal processes, since factors that enhance venturing activities may simultaneously inhibit renewal activities. The limited studies that did empirically compare the individual dimensions (cf. Zahra, 1996; Zahra et al., 2000; Yiu and Lau, 2008; Yiu et al., 2007) generally failed to provide a systematic explanation for the potentially different effects of organizational antecedents on innovation, venturing, and renewal. With this study we aim to investigate the different effects of structural separation and social capital on corporate entrepreneurship activities. Access to existing knowledge and the development of new knowledge have been deemed of critical importance in CE activities (Floyd and Wooldridge, 1999; Covin and Miles, 2007; Katila and Ahuja, 2002). Developing new knowledge can be facilitated by structurally separating corporate entrepreneurial units from mainstream units (cf. Burgelman, 1983; Hill and Rothaermel, 2003; O'Reilly and Tushman, 2004).
Existing knowledge and resources are available through networks of social relationships, defined as social capital (Nahapiet and Ghoshal, 1998; Yiu and Lau, 2008). Although social capital has primarily been studied at the organizational level, it might be equally important at the top management level (Belliveau et al., 1996). However, little is known about the joint effects on corporate entrepreneurship of structural separation and integrative mechanisms that provide access to social capital. Could these integrative mechanisms, for example, connect the separated units to facilitate both knowledge creation and sharing? Do these effects differ for innovation, venturing, and renewal processes? Are the effects different for organizational versus top management team integration mechanisms? Corporate entrepreneurship activities have, for example, been suggested to take place at different levels. Whereas innovation is suggested to be a more bottom-up process, strategic renewal is a more top-down process (Floyd and Lane, 2000; Volberda et al., 2001). Corporate venturing is also a more bottom-up process, but due to the greater resource commitments required relative to innovation, ventures need to be approved by top management (Burgelman, 1983). As such, we will explore the following key research question in this paper: How do social capital and structural separation at the organizational and TMT levels differentially influence innovation, venturing, and renewal processes? Methodology/Key Propositions We investigated our hypotheses on a final sample of 240 companies in a variety of industries in the Netherlands. All our measures were validated in previous studies. We targeted a second respondent in each firm to reduce problems with single-rater data (James et al., 1984). We separated the measurement of the independent and the dependent variables into two surveys to create a one-year time lag and reduce potential common method bias (Podsakoff et al., 2003).
Results and Implications Consistent with our hypotheses, our results show that configurations of structural separation and integrative mechanisms have different effects on the three aspects of corporate entrepreneurship. Innovation was affected by organizational-level mechanisms, renewal by integrative mechanisms at the top management team level, and venturing by mechanisms at both levels. Surprisingly, our results indicated that integrative mechanisms at the top management team level had negative effects on corporate entrepreneurship activities. We believe this paper makes two significant contributions. First, we provide more insight into the effects of ambidextrous organizational forms (i.e. combinations of differentiation and integration mechanisms) on venturing, innovation and renewal processes. Our findings show that more valuable insights can be gained by comparing the individual parts of corporate entrepreneurship instead of focusing on the whole. Second, we deliver insights into how management can create a facilitative organizational context for these corporate entrepreneurship activities.
Abstract:
Team games conceptualized as dynamical systems engender a view of emergent decision-making behaviour under constraints, although specific effects of instructional and body-scaling constraints have yet to be verified empirically. For this purpose, we studied the effects of task and individual constraints on decision-making processes in basketball. Eleven experienced female players performed 350 trials in 1 vs. 1 sub-phases of basketball in which an attacker tried to perturb the stable state of a dyad formed with a defender (i.e. break the symmetry). In Experiment 1, specific instructions (neutral, risk taking or conservative) were manipulated to observe effects on emergent behaviour of the dyadic system. When attacking players were given conservative instructions, time to cross court mid-line and variability of the attacker's trajectory were significantly greater. In Experiment 2, body-scaling of participants was manipulated by creating dyads with different height relations. When attackers were considerably taller than defenders, there were fewer occurrences of symmetry-breaking. When attackers were considerably shorter than defenders, time to cross court mid-line was significantly shorter than when dyads were composed of athletes of similar height or when attackers were considerably taller than defenders. The data exemplify how interacting task and individual constraints can influence emergent decision-making processes in team ball games.
Abstract:
Elaborated Intrusion theory (Kavanagh, Andrade, & May, 2005) distinguishes between unconscious, associative processes as the precursors of desire, and controlled processes of cognitive elaboration that lead to conscious sensory images of the target of desire and associated affect. We argue that these mental images play a key role in motivating human behavior. Consciousness is functional in that it allows competing goals to be compared and evaluated. The role of effortful cognitive processes in desire helps to explain the different time courses of craving and physiological withdrawal.
Abstract:
The Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE) is the largest study of new firm formation that has ever been undertaken in Australia. CAUSEE follows the development of several samples of new and emerging firms over time. In this report we focus on the drivers of outcomes – in terms of reaching an operational stage vs. terminating the effort – of 493 randomly selected nascent firms whose founders have been comprehensively interviewed on two occasions, 12 months apart. We investigate the outcome effects of three groups of variables: characteristics of the venture; resources used in the start-up process; and characteristics of the start-up process itself.
Abstract:
This paper examines a sequence of asynchronous interaction on the photosharing website, Flickr. In responding to a call for a focus on the performative aspects of online annotation (Wolff & Neuwirth, 2001), we outline and apply an interaction order approach to identify temporal and cultural aspects of the setting that provide for commonality and sharing. In particular, we study the interaction as a feature of a synthetic situation (Knorr Cetina, 2009) focusing on the requirements of maintaining a sense of an ongoing discussion online. Our analysis suggests that the rhetorical system of the Flickr environment, its appropriation by participants as a context for bounded activities, and displays of commonality, affiliation, and shared access provide for a common sense of participation in a time envelope. This, in turn, is argued to be central to new processes of consociation (Schutz, 1967; Zhao, 2004) occurring in the life world of Web 2.0 environments.
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We will take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment; that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX) and long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We will pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models will be employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA will be provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second order, seem to underestimate the long memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
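The q = 2 special case of MF-DFA described above (standard detrended fluctuation analysis) is compact enough to sketch. The following is an illustrative implementation, not the thesis's own code: the scale grid, detrending order and the white-noise example are arbitrary choices, and a white-noise input has no memory, so its exponent should sit near 0.5.

```python
import numpy as np

def dfa_exponent(x, scales=None, order=1):
    """Standard DFA (the q = 2 case of MF-DFA): slope of log F(s) vs log s."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                     # integrated profile
    if scales is None:
        scales = np.unique(
            np.logspace(np.log10(10), np.log10(len(x) // 4), 10).astype(int))
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        sq_resid = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)  # local polynomial trend
            sq_resid.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(sq_resid)))    # fluctuation function F(s)
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rng = np.random.default_rng(0)
h = dfa_exponent(rng.standard_normal(4096))         # expected near 0.5 for white noise
```

Replacing the second moment in `sq_resid` aggregation with a general q-th moment gives the multifractal generalisation used in the thesis.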
Abstract:
Industrial applications of the simulated-moving-bed (SMB) chromatographic technology have brought an emergent demand to improve the SMB process operation for higher efficiency and better robustness. Improved process modelling and more-efficient model computation will pave a path to meet this demand. However, the SMB unit operation exhibits complex dynamics, leading to challenges in SMB process modelling and model computation. One of the significant problems is how to quickly obtain the steady state of an SMB process model, as process metrics at the steady state are critical for process design and real-time control. The conventional computation method, which solves the process model cycle by cycle and takes the solution only when a cyclic steady state is reached after a certain number of switchings, is computationally expensive. Adopting the concept of quasi-envelope (QE), this work treats the SMB operation as a pseudo-oscillatory process because of its large number of successive switchings. An innovative QE computation scheme is then developed to quickly obtain the steady-state solution of an SMB model from an arbitrary initial condition. The QE computation scheme allows larger steps to be taken for predicting the slow change of the starting state within each switching. Combined with the wavelet-based technique, this scheme is demonstrated to be effective and efficient for an SMB sugar separation process. Investigations are also carried out on when the computation scheme should be activated and how the convergence of the scheme is affected by a variable step size.
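The QE scheme itself is beyond the scope of a sketch, but the core idea above (treating the switch-to-switch state map as a slowly converging fixed-point problem and extrapolating the slow drift rather than iterating cycle by cycle) can be illustrated with Aitken's delta-squared acceleration. The linear "switch map" below is an invented toy stand-in, not an SMB model:

```python
import numpy as np

def iterate_to_steady_state(switch_map, x0, tol=1e-10, max_iter=100_000):
    """Conventional approach: apply the switch-to-switch map until it converges."""
    x = x0
    for n in range(max_iter):
        x_new = switch_map(x)
        if np.linalg.norm(x_new - x) < tol:
            return x_new, n + 1
        x = x_new
    return x, max_iter

def aitken_steady_state(switch_map, x0, tol=1e-10, max_iter=100_000):
    """Accelerated approach: Aitken delta-squared extrapolation of the slow drift."""
    x = x0
    for n in range(max_iter):
        x1, x2 = switch_map(x), switch_map(switch_map(x))
        denom = x2 - 2 * x1 + x
        safe = np.where(np.abs(denom) > 1e-12, denom, 1.0)
        # extrapolate where the denominator is healthy, else fall back to iteration
        x_new = np.where(np.abs(denom) > 1e-12, x - (x1 - x) ** 2 / safe, x2)
        if np.linalg.norm(x_new - x) < tol:
            return x_new, n + 1
        x = x_new
    return x, max_iter

# toy stand-in for the switch map: a slow linear contraction toward a fixed point
A = np.diag([0.99, 0.9])
b = np.array([1.0, 2.0])
switch = lambda x: A @ x + b                 # fixed point is (I - A)^-1 b = [100, 20]

x_slow, n_slow = iterate_to_steady_state(switch, np.zeros(2))
x_fast, n_fast = aitken_steady_state(switch, np.zeros(2))
```

For this exactly geometric toy sequence the extrapolation lands on the fixed point almost immediately, whereas plain iteration needs thousands of "switchings"; the QE method exploits the same slow, smooth drift but with a more sophisticated envelope integrator.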
Abstract:
Road and highway infrastructure provides the backbone for a nation's economic growth. The wide dispersion of Australia's population and its resource boom, coupled with improved living standards and growing societal expectations, call for continuing development and improvement of road infrastructure under current local, state and federal government policies and strategic plans. As road infrastructure projects involve huge resources and mechanisms, achieving sustainability not only on economic scales but also through environmental and social responsibility becomes a crucial issue. While sustainability is a logical link to infrastructure development, a literature review and consultation with the industry found that there is a lack of common understanding of what constitutes sustainability in the infrastructure context. Its priorities are often interpreted differently among multiple stakeholders. For road infrastructure projects, which typically span long periods of time, achieving tangible sustainability outcomes during the lifecycle of development remains a formidable task. Sustainable development initiatives often remain ideological, confined to macro-level policies and broad-based concepts. There was little elaboration, and few exemplar cases, on how these policies and concepts can be translated into practical decision-making during project implementation. In contrast, there seemed to be over-commitment to research and development of sustainability assessment methods and tools. Between the two positions, there is a perception-reality gap and mismatch, specifically on how to enhance sustainability deliverables during infrastructure project delivery. A review of past research in this industry sector also found that little has been done to promote sustainable road infrastructure development; this has wide and varied potential impacts.
This research identified the common perceptions and expectations held by different stakeholders towards achieving sustainability in road and highway infrastructure projects. Face-to-face interviews with selected representatives of these stakeholders were carried out in order to select, categorize, confirm and prioritize a list of sustainability performance targets identified through literature and past research. A Delphi study was conducted with the assistance of a panel of senior industry professionals and academic experts, which further considered the interrelationships and influence of the sustainability indicators, and identified critical sustainability indicators under ten critical sustainability criteria (Environmental, Health & Safety, Resource Utilization & Management, Social & Cultural, Economic, Public Governance & Community Engagement, Relations Management, Engineering, Institutional and Project Management). This highlighted critical sustainability issues that needed to be addressed at the project level. Accordingly, exemplar highway development projects were used as case studies to elicit solutions for the critical issues. Through the identification and integration of the different perceptions and priority needs of the stakeholders, as well as key sustainability indicators and solutions for critical issues, a set of decision-making guidelines was developed to promote and drive consistent sustainability deliverables in road infrastructure projects.
Abstract:
In this work, we investigate an alternative bootstrap approach based on a result of Ramsey [F.L. Ramsey, Characterization of the partial autocorrelation function, Ann. Statist. 2 (1974), pp. 1296-1301] and on the Durbin-Levinson algorithm to obtain a surrogate series from linear Gaussian processes with long-range dependence. We compare this bootstrap method with other existing procedures in a wide Monte Carlo experiment by estimating, parametrically and semi-parametrically, the memory parameter d. We consider Gaussian and non-Gaussian processes to assess the robustness of the method to deviations from normality. The approach is also useful for estimating confidence intervals for the memory parameter d, improving the coverage level of the interval.
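The Durbin-Levinson recursion at the heart of this approach converts an autocovariance sequence into one-step prediction coefficients and partial autocorrelations, from which a Gaussian series can be simulated exactly in innovations form. The sketch below is a generic illustration of that recursion, not the paper's bootstrap procedure; the AR(1)-like autocovariance used at the end is an arbitrary example.

```python
import numpy as np

def durbin_levinson_simulate(acov, n, rng):
    """Exactly simulate a zero-mean Gaussian series with autocovariance acov
    via the Durbin-Levinson recursion (innovations form)."""
    x = np.empty(n)
    phi = np.zeros(n)                   # current one-step prediction coefficients
    v = acov[0]                         # one-step prediction error variance
    x[0] = np.sqrt(v) * rng.standard_normal()
    for t in range(1, n):
        prev = phi[:t - 1].copy()
        # partial autocorrelation at lag t (Durbin-Levinson update)
        kappa = (acov[t] - prev @ acov[t - 1:0:-1]) / v
        phi[:t - 1] = prev - kappa * prev[::-1]
        phi[t - 1] = kappa
        v *= 1.0 - kappa ** 2
        # conditional mean of x_t given x_0, ..., x_{t-1}
        mean = phi[:t] @ x[t - 1::-1]
        x[t] = mean + np.sqrt(v) * rng.standard_normal()
    return x

# example: AR(1)-like autocovariance gamma(k) = 0.7**k
rng = np.random.default_rng(1)
acov = 0.7 ** np.arange(500)
x = durbin_levinson_simulate(acov, 500, rng)
r1 = np.corrcoef(x[:-1], x[1:])[0, 1]   # sample lag-1 autocorrelation, near 0.7
```

In the bootstrap setting, a sample autocovariance (or the partial autocorrelations themselves, per Ramsey's characterisation) would replace the known `acov` to generate surrogate series.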
Abstract:
The effective atomic number is widely employed in radiation studies, particularly for the characterisation of interaction processes in dosimeters, biological tissues and substitute materials. Gel dosimeters are unique in that they comprise both the phantom and dosimeter material. In this work, effective atomic numbers for total and partial electron interaction processes have been calculated for the first time for a Fricke gel dosimeter, five hypoxic and nine normoxic polymer gel dosimeters. A range of biological materials are also presented for comparison. The spectrum of energies studied spans 10 keV to 100 MeV, over which the effective atomic number varies by 30 %. The effective atomic numbers of gels match those of soft tissue closely over the full energy range studied; greater disparities exist at higher energies but are typically within 4 %.
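The abstract above computes energy-dependent effective atomic numbers from electron interaction cross sections; that calculation is beyond a sketch, but the classical energy-independent estimate (the Mayneord power-law formula) conveys the idea. The exponent m = 3.5 and the water composition below are textbook illustration values, not the paper's method or data.

```python
def z_eff_power_law(composition, m=3.5):
    """Mayneord power-law effective atomic number.

    composition: list of (Z, atomic_mass, mass_fraction) tuples.
    Weights each element by its fractional contribution of electrons.
    """
    electrons = [(z, w * z / a) for z, a, w in composition]  # electrons per gram
    total = sum(e for _, e in electrons)
    return sum((e / total) * z ** m for z, e in electrons) ** (1.0 / m)

# water: hydrogen and oxygen mass fractions
water = [(1, 1.008, 0.1119), (8, 15.999, 0.8881)]
zeff_water = z_eff_power_law(water)   # roughly 7.5 with m = 3.5
```

Energy-dependent definitions, as used for the gel dosimeters here, replace the single empirical exponent with cross-section ratios evaluated at each energy, which is why the reported value varies by about 30 % across 10 keV to 100 MeV.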
Abstract:
Cell invasion involves a population of cells which are motile and proliferative. Traditional discrete models of proliferation involve agents depositing daughter agents on nearest-neighbor lattice sites. Motivated by time-lapse images of cell invasion, we propose and analyze two new discrete proliferation models in the context of an exclusion process with an undirected motility mechanism. These discrete models are related to a family of reaction-diffusion equations and can be used to make predictions over a range of scales appropriate for interpreting experimental data. The new proliferation mechanisms are biologically relevant and mathematically convenient as the continuum-discrete relationship is more robust for the new proliferation mechanisms relative to traditional approaches.
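A minimal version of the traditional mechanism the abstract refers to (an exclusion process with undirected motility, where daughters are deposited on nearest-neighbour sites) can be simulated directly. This sketch is a generic 1D illustration with invented parameter values, not the paper's two new proliferation models:

```python
import numpy as np

def simulate_exclusion(L=200, n0=20, pm=1.0, pp=0.05, steps=200, seed=0):
    """1D exclusion process with undirected motility and proliferation.

    Agents attempt an unbiased move with probability pm and attempt to
    deposit a daughter on a random nearest-neighbour site with probability
    pp; any move or birth onto an occupied site is aborted (exclusion)."""
    rng = np.random.default_rng(seed)
    lattice = np.zeros(L, dtype=bool)
    lattice[rng.choice(L, size=n0, replace=False)] = True
    densities = [lattice.mean()]
    for _ in range(steps):
        # random sequential update over the agents present at the step start
        for site in rng.permutation(np.flatnonzero(lattice)):
            if not lattice[site]:
                continue                              # site vacated earlier this step
            if rng.random() < pm:                     # motility event
                nbr = (site + rng.choice([-1, 1])) % L  # periodic boundary
                if not lattice[nbr]:
                    lattice[site], lattice[nbr] = False, True
                    site = nbr
            if rng.random() < pp:                     # proliferation event
                nbr = (site + rng.choice([-1, 1])) % L
                if not lattice[nbr]:
                    lattice[nbr] = True               # daughter agent
        densities.append(lattice.mean())
    return np.array(densities)

dens = simulate_exclusion()   # density grows toward the carrying capacity of 1
```

Averaging such simulations over many realisations and taking the continuum limit is what yields the related reaction-diffusion (logistic growth plus diffusion) description.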
Abstract:
Exclusion processes on a regular lattice are used to model many biological and physical systems at a discrete level. The average properties of an exclusion process may be described by a continuum model given by a partial differential equation. We combine a general class of contact interactions with an exclusion process. We determine that many different types of contact interactions at the agent level always give rise to a nonlinear diffusion equation, with a vast variety of diffusion functions D(C). We find that these functions may be dependent on the chosen lattice and the defined neighborhood of the contact interactions. Mild to moderate contact interaction strength generally results in good agreement between discrete and continuum models, while strong interactions often show discrepancies between the two, particularly when D(C) takes on negative values. We present a measure to predict the goodness of fit between the discrete and continuum models, and thus the validity of the continuum description of a motile, contact-interacting population of agents. This work has implications for modeling cell motility and interpreting cell motility assays, giving the ability to incorporate biologically realistic cell-cell interactions and develop global measures of discrete microscopic data.
Abstract:
Using sculpture and drawing as my primary methods of investigation, this research explores ways of shifting the emphasis of my creative visual arts practice from object to process whilst still maintaining a primacy of material outcomes. My motivation was to locate ways of developing a sustained practice shaped as much by new works, as by a creative flow between works. I imagined a practice where a logic of structure within discrete forms and a logic of the broader practice might be developed as mutually informed processes. Using basic structural components of multiple wooden curves and linear modes of deployment – in both sculptures and drawings – I have identified both emergence theory and the image of rhizomic growth (Deleuze and Guattari, 1987) as theoretically integral to this imagining of a creative practice, both in terms of critiquing and developing works. Whilst I adopt a formalist approach for this exegesis, the emergence and rhizome models allow it to work as a critique of movement, of becoming and changing, rather than merely a formalism of static structure. In these models, therefore, I have identified a formal approach that can be applied not only to objects, but to practice over time. The thorough reading and application of these ontological models (emergence and rhizome) to visual arts practice, in terms of processes, objects and changes, is the primary contribution of this thesis. The works that form the major component of the research develop, reflect and embody these notions of movement and change.
Abstract:
This paper explores principles of contemporary aesthetics to suggest a basis for determining qualitative outcomes of artistic works in two contexts: the arts industry and the academy setting of practice-led research. Commonly articulated measures of quality—creativity and innovation—are questioned as mere rhetoric if not framed in specific ways in the two discrete settings. The paper also interrogates generally held assumptions that a longer time to develop work and greater periods of self-reflexivity will produce higher calibre artistic outcomes. The unease produced by apparent differences in qualitative outcomes between art works created in an industry setting and those created through practice-led research is analysed through three interconnected framing devices: intention, contextual parameters and criteria for evaluation, in conjunction with the relationships between the art work, the artist and the audience/viewer/listener. Common and differentiated criteria in the two contexts are explored, leading to the conclusion that innovation is more likely to be revealed in the end product in an industry context whereas in practice-led research it may be in the methodological processes of creating the work. While identifying and acknowledging that the two contexts encourage and produce distinctive qualitative artistic outcomes, both of value to the arts and the academy, the paper recommends ways in which closer formal liaison between industry artists and practice-led artists and supervisors might occur in order to ensure ongoing mutual influence and relevance.