987 results for Continuous Processes
Abstract:
Evolutionary processes acting at the expanding margins of a species' range are still poorly understood. Genetic drift is considered prevalent in marginal populations, and the maintenance of genetic diversity during recolonization might seem puzzling. To investigate such processes, a fine-scale investigation of 219 individuals was performed within a population of Biscutella laevigata (Brassicaceae), located at the leading edge of its range. The survey used amplified fragment length polymorphisms (AFLPs). As commonly reported across the whole species distribution range, individual density and genetic diversity decreased along the local axis of recolonization of this expanding population, highlighting the enduring effect of the historical colonization on present-day diversity. The self-incompatibility system of the plant may have prevented local inbreeding in newly founded patches and sustained genetic diversity by ensuring gene flow from established populations. Within the more continuously populated region, spatial analysis of genetic structure revealed restricted gene flow among individuals. The distribution of genotypes formed a mosaic of relatively homogeneous patches within the continuous population. This pattern could be explained by a history of expansion by long-distance dispersal followed by fine-scale diffusion (that is, a stratified dispersal combination). The secondary contact among expanding patches apparently led to admixture among differentiated genotypes where they met (that is, a reshuffling effect). This type of dynamics could explain the maintenance of genetic diversity during recolonization.
Abstract:
OBJECTIVE: To develop continuing education guidelines for the development of nursing management competencies together with the members of the Center of Nursing Continuing Education of Parana. METHOD: A qualitative study outlined by the action research method, with a sample of 16 nurses. Data collection was carried out in three stages and the data were analyzed using the thematic analysis technique. RESULTS: It was possible to discuss the demands and difficulties in developing nursing management competencies in hospital organizations and to collectively design a guideline. CONCLUSION: The action research contributed to the production of knowledge, confirming the need for and the importance of changing educational and evaluation processes, based on methodologies and instruments for professional development in accordance with human resource policies and contemporary organizational policies.
Abstract:
A dynamical model based on a continuous addition of colored shot noises is presented. The resulting process is colored and non-Gaussian. A general expression for the characteristic function of the process is obtained, which, after a scaling assumption, takes on a form that is the basis of the results derived in the rest of the paper. One of these is an expansion for the cumulants, which are all finite, subject to mild conditions on the functions defining the process. This is in contrast with the Lévy distribution (which can be obtained from our model in certain limits), which has no finite moments. The evaluation of the spectral density and the form of the probability density function in the tails of the distribution shows that the model exhibits a power-law spectrum and long tails in a natural way. A careful analysis of the characteristic function shows that it may be separated into a part representing a Lévy process together with another part representing the deviation of our model from the Lévy process. This
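As a rough numerical illustration of a process built by continuously adding colored shot noises, the sketch below superposes exponentially decaying pulses arriving at Poisson times and inspects the periodogram of the sum. The pulse shape, the arrival rate and the lognormal amplitude law are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

# Sketch (not the paper's exact model): superpose shot-noise pulses with an
# exponentially decaying, i.e. "colored", response arriving at Poisson times.
# Pulse shape, rate and amplitude distribution are illustrative assumptions.
rng = np.random.default_rng(0)
T, dt = 1000.0, 0.01                   # observation window and grid step
t = np.arange(0.0, T, dt)
rate, tau = 0.5, 2.0                   # pulse arrival rate and correlation time

n_pulses = rng.poisson(rate * T)
arrivals = rng.uniform(0.0, T, n_pulses)
amplitudes = rng.lognormal(0.0, 1.0, n_pulses)    # all moments finite

x = np.zeros_like(t)
for t0, a in zip(arrivals, amplitudes):
    mask = t >= t0
    x[mask] += a * np.exp(-(t[mask] - t0) / tau)  # colored (exponential) pulse

# Crude periodogram of the summed process to inspect its spectral density.
freq = np.fft.rfftfreq(t.size, dt)
psd = np.abs(np.fft.rfft(x - x.mean()))**2 * dt / t.size
for f, s in zip(freq[1:6], psd[1:6]):
    print(f"f = {f:.3f}  S(f) = {s:.3g}")
```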
Abstract:
Preface. The starting point for this work, and eventually the subject of the whole thesis, was the question: how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, has made them the models of choice for many theoretical constructions and practical applications. At the same time, estimating the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem stems from the variance process, which is not observable. Several estimation methodologies deal with the estimation of latent variables. One appeared particularly interesting: the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function, which, in contrast to the other methods, requires neither discretization nor simulation of the process. However, the procedure had been derived only for stochastic volatility models without jumps, and it therefore became the subject of my research. This thesis consists of three parts, each written as an independent and self-contained article. At the same time, the questions answered by the second and third parts of this work arise naturally from the issues investigated and the results obtained in the first one. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and the variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function of stochastic volatility jump-diffusion models. The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equations are relevant for modelling returns of the S&P500 index, which was chosen as a general representative of the stock asset class. Hence, the next question is: which jump process to use to model returns of the S&P500. The decision about the jump process in the framework of affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, either a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P500 index by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either an exponential or a double-exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the performance of the Continuous ECF estimator on simulated data had already arisen when the first estimation results of the first chapter were obtained. In the absence of a benchmark or any other ground for comparison, there is no way to be sure that our parameter estimates and the true parameters of the models coincide.
The conclusion of the second chapter provides one more reason to perform that kind of test. Thus, the third part of this thesis concentrates on the estimation of parameters of stochastic volatility jump-diffusion models on the basis of asset price time series simulated from various "true" parameter sets. The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of recovering the true parameters, and the third chapter shows that the estimator indeed has this ability. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question arises naturally: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used for its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure. In practice, however, this relationship is not so straightforward, owing to increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, unconditional characteristic function. As a result, the preference for one or the other depends on the model to be estimated; thus, in some cases the computational effort can be reduced without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of estimators with bi- and three-dimensional unconditional characteristic functions on simulated data. It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, owing to the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for estimating the parameters of stochastic volatility jump-diffusion models.
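To make the idea behind a characteristic-function estimator concrete, the sketch below fits parameters by minimizing the distance between the empirical characteristic function of a return series and a model characteristic function over a grid of arguments. It is a deliberately simplified stand-in: the thesis works with the closed-form joint unconditional characteristic function of affine stochastic volatility jump-diffusion models, whereas here a plain normal model, simulated returns, the argument grid and an unweighted least-squares distance are all assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative ECF-style estimation: match the empirical characteristic
# function of a return series to a model CF over a grid of arguments u.
# The normal model phi(u) = exp(i*u*mu - 0.5*sigma^2*u^2) is a stand-in for
# the closed-form CF of an affine SV jump-diffusion model.
rng = np.random.default_rng(1)
returns = rng.normal(0.0005, 0.01, 2500)           # stand-in daily return series

u = np.linspace(-200, 200, 81)                      # CF argument grid (assumed)
ecf = np.exp(1j * np.outer(u, returns)).mean(axis=1)

def model_cf(u, mu, sigma):
    return np.exp(1j * u * mu - 0.5 * sigma**2 * u**2)

def objective(theta):
    mu, log_sigma = theta
    diff = ecf - model_cf(u, mu, np.exp(log_sigma))
    return np.sum(np.abs(diff)**2)                  # unweighted L2 distance (assumed)

res = minimize(objective, x0=[0.0, np.log(0.02)], method="Nelder-Mead")
print("estimated mu, sigma:", res.x[0], np.exp(res.x[1]))
```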
Abstract:
The continuous-time random walk (CTRW) formalism can be adapted to encompass stochastic processes with memory. In this paper we will show how the random combination of two different unbiased CTRWs can give rise to a process with clear drift, if one of them is a CTRW with memory. If one identifies the other one as noise, the effect can be thought of as a kind of stochastic resonance. The ultimate origin of this phenomenon is the same as that of the Parrondo paradox in game theory.
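Since the abstract attributes the drift's ultimate origin to the same mechanism as the Parrondo paradox, a quick way to see the effect numerically is the classic game-theory illustration below (not the paper's CTRW construction): two games that each lose on their own yet win when randomly alternated. The value of eps and the round counts are illustrative assumptions.

```python
import numpy as np

# Classic Parrondo illustration (game-theory analogue, not the paper's CTRW
# construction): games A and B each lose on their own, yet randomly
# alternating them produces a positive drift.
rng = np.random.default_rng(2)
eps = 0.005

def play(strategy, n_rounds=100_000):
    capital = 0
    for _ in range(n_rounds):
        game = strategy if strategy in ("A", "B") else rng.choice(["A", "B"])
        if game == "A":
            p_win = 0.5 - eps                        # slightly unfair coin
        else:                                        # game B "remembers" the capital
            p_win = (0.1 - eps) if capital % 3 == 0 else (0.75 - eps)
        capital += 1 if rng.random() < p_win else -1
    return capital

print("A alone:     ", play("A"))     # loses on average
print("B alone:     ", play("B"))     # loses on average
print("random A/B:  ", play("mix"))   # wins on average
```

Running it, both individual games drift downward over the long run while the random mixture drifts upward, which is the qualitative signature described above for the combined walks.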
Abstract:
Flood effectiveness observations imply that two families of processes describe the formation of debris flow volume. One is related to the rainfall-erosion relationship and can be seen as a gradual process, and one is related to additional geological/geotechnical events, hereafter named extraordinary events. In order to discuss the hypothesis of the coexistence of two modes of volume formation, several methodologies are applied. Firstly, classical approaches consisting of relating volume to catchment characteristics are considered. These approaches raise questions about the quality of the data rather than providing answers concerning the controlling processes. Secondly, we consider statistical approaches (the distribution of the cumulative number of events and cluster analysis), and these suggest the possibility of having two distinct families of processes. However, the quantitative evaluation of the threshold differs from the one obtained from the first approach, although all approaches agree on the coexistence of two families of events. Thirdly, a conceptual model is built exploring how and why debris flow volume in alpine catchments changes with time. Depending on the initial condition (sediment production), the model shows that large debris flows (i.e. with large volume) are observed in the initial period, before a steady state is reached; during this second period, debris flow volumes as large as those observed in the initial period are not observed again. Integrating the results of the three approaches, two case studies are presented showing: (1) the possibility of observing in a catchment large volumes that will never occur again owing to a drastic decrease in sediment availability, supporting their difference from gradual erosion processes; (2) that following a rejuvenation of the sediment storage (by a rock avalanche), the magnitude-frequency relationship of a torrent can be differentiated into two phases, an initial one with large and frequent debris flows and a later one with less intense and less frequent debris flows, supporting the results of the conceptual model. Although the results obtained cannot identify a clear threshold between the two families of processes, they show that some debris flows can be seen as pulses of sediment differing from what would be expected from gradual erosion.
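As a toy illustration of the cluster-analysis step mentioned above, the sketch below splits a synthetic inventory of debris-flow volumes (on a log scale) into two groups and reads off a boundary between a "gradual" family and an "extraordinary" family. The volumes, the cluster count and the threshold rule are assumptions for illustration only, not the paper's data or method.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustrative cluster analysis on synthetic debris-flow volumes (m^3):
# separate "ordinary" and "extraordinary" events on a log scale.
rng = np.random.default_rng(5)
gradual = rng.lognormal(mean=8.0, sigma=0.7, size=180)    # ordinary volumes
extreme = rng.lognormal(mean=11.0, sigma=0.5, size=20)    # extraordinary volumes
volumes = np.concatenate([gradual, extreme])

log_v = np.log10(volumes).reshape(-1, 1)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(log_v)

# Boundary between families: geometric midpoint between the two clusters.
big_label = labels[np.argmax(volumes)]
upper = volumes[labels == big_label]
lower = volumes[labels != big_label]
print(f"suggested threshold ~ {np.sqrt(lower.max() * upper.min()):.0f} m^3")
```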
Abstract:
Background: The aim of this study was to evaluate how hospital capacity was managed, focusing on standardizing the admission and discharge processes. Methods: This study was set in a 900-bed university-affiliated hospital of the National Health Service, near Barcelona (Spain). It is a cross-sectional study of a set of interventions that were gradually implemented between April and December 2008, mainly focused on standardizing the admission and discharge processes to improve patient flow. Primary administrative data were obtained from the 2007 and 2009 Hospital Databases. Main outcome measures were the median length of stay, the percentage of planned discharges, the number of surgery cancellations and the median number of delayed emergency admissions at 8:00 am. For the bivariate statistical analysis, we used a chi-squared test for linear trend for qualitative variables, and a Wilcoxon signed-rank test and a Mann–Whitney test for non-normal continuous variables. Results: The median global length of stay was 8.56 days in 2007 and 7.93 days in 2009 (p<0.051). The percentage of patients admitted on the same day as surgery increased from 64.87% in 2007 to 86.01% in 2009 (p<0.05). The number of interventions cancelled due to lack of beds was 216 in 2007 and 42 in 2009. The percentage of planned discharges went from 43.05% in 2007 to 86.01% in 2009 (p<0.01). The median number of emergency patients waiting for an in-hospital bed at 8:00 am was 5 in 2007 and 3 in 2009 (p<0.01). Conclusions: Standardization of admission and discharge processes is largely in our control, and there is a significant opportunity to create important benefits by increasing bed capacity and hospital throughput.
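For readers who want to reproduce the style of bivariate analysis described, a minimal sketch with made-up numbers is shown below: a Mann-Whitney test for a skewed continuous outcome (length of stay) and a chi-squared test on a 2x2 table of planned versus unplanned discharges. The counts and distributions are illustrative, not the hospital's data, and a plain chi-squared test stands in for the chi-squared test for linear trend.

```python
import numpy as np
from scipy import stats

# Illustrative bivariate analysis on synthetic data.
rng = np.random.default_rng(3)
los_2007 = rng.lognormal(mean=2.1, sigma=0.6, size=500)   # skewed stays, 2007
los_2009 = rng.lognormal(mean=2.0, sigma=0.6, size=500)   # skewed stays, 2009

u_stat, p_mw = stats.mannwhitneyu(los_2007, los_2009, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.0f}, p = {p_mw:.4f}")

# 2x2 table: planned vs. unplanned discharges per year (illustrative counts).
table = np.array([[430, 570],    # 2007: planned, unplanned
                  [860, 140]])   # 2009: planned, unplanned
chi2, p_chi, dof, _ = stats.chi2_contingency(table)
print(f"chi-squared = {chi2:.1f}, dof = {dof}, p = {p_chi:.4f}")
```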
Abstract:
The objective was to identify processes to modify in order to reduce snow plow accidents. All [Iowa] D.O.T. snow plow accidents that occurred in calendar years 1992 and 1993 were reviewed.
Abstract:
Motivation: Hormone pathway interactions are crucial in shaping plant development, such as the synergism between the auxin and brassinosteroid pathways in cell elongation. Both hormone pathways have been characterized in detail, revealing several feedback loops. The complexity of this network, combined with a shortage of kinetic data, renders its quantitative analysis virtually impossible at present. Results: As a first step towards overcoming these obstacles, we analyzed the network using a Boolean logic approach to build models of auxin and brassinosteroid signaling, and their interaction. To compare these discrete dynamic models across conditions, we transformed them into qualitative continuous systems, which predict network component states more accurately and can accommodate kinetic data as they become available. To this end, we developed an extension for the SQUAD software, allowing semi-quantitative analysis of network states. Contrasting the developmental output depending on cell type-specific modulators enabled us to identify the most parsimonious model, which explains initially paradoxical mutant phenotypes and reveals a novel physiological feature.
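To illustrate the general idea of transforming a discrete Boolean model into a qualitative continuous one, the sketch below replaces two toy Boolean rules with sigmoidal ODEs and integrates them to a steady state. The two-node wiring, the fuzzy AND/OR inputs and the simple sigmoid are illustrative stand-ins, not the published auxin-brassinosteroid network or SQUAD's exact transformation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy conversion of Boolean rules into a qualitative continuous model, in the
# spirit of SQUAD-style semi-quantitative analysis.  Network and sigmoid are
# illustrative assumptions.
def sigmoid(w, h=10.0):
    """Smooth, normalized version of a Boolean input w in [0, 1]."""
    return 1.0 / (1.0 + np.exp(-h * (w - 0.5)))

def rhs(t, x, auxin=1.0, br=1.0, gamma=1.0):
    target, growth = x
    # Boolean rules: target = auxin OR br ; growth = target AND br
    w_target = max(auxin, br)          # fuzzy OR
    w_growth = min(target, br)         # fuzzy AND
    return [sigmoid(w_target) - gamma * target,
            sigmoid(w_growth) - gamma * growth]

sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0])
print("steady state (target, growth):", sol.y[:, -1])
```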
Abstract:
In this paper we consider a stochastic process that may experience random reset events which suddenly bring the system back to the starting value, and we analyze the relevant statistical magnitudes. We focus our attention on monotonic continuous-time random walks with a constant drift: the process increases between the reset events, either through the random jumps or through the action of the deterministic drift. As a result of these combined factors, interesting properties emerge, such as the existence (for any drift strength) of a stationary transition probability density function, or the ability of the model to reproduce power-law-like behavior. General formulas for two extreme statistics, the survival probability and the mean exit time, are also derived. To corroborate the results of the paper in an independent way, Monte Carlo methods were used; these numerical estimations are in full agreement with the analytical predictions.
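A minimal Monte Carlo sketch of the kind of process described, a monotonic walk with constant drift and positive jumps that is reset to its starting value at Poisson reset events, is given below; the drift, the rates and the exponential jump law are illustrative assumptions rather than the paper's parameter values.

```python
import numpy as np

# Monte Carlo sketch: constant drift plus positive jumps at Poisson times,
# reset to the starting value at independent Poisson reset events.
rng = np.random.default_rng(4)
drift = 0.5          # deterministic growth rate (assumed)
jump_rate = 1.0      # Poisson rate of upward jumps (assumed)
reset_rate = 0.2     # Poisson rate of reset events (assumed)
t_obs = 50.0         # observe the process after a long time
n_paths = 20_000

samples = np.empty(n_paths)
for i in range(n_paths):
    t, x = 0.0, 0.0
    while True:
        # time to the next event (jump or reset, whichever comes first)
        dt = rng.exponential(1.0 / (jump_rate + reset_rate))
        if t + dt > t_obs:
            x += drift * (t_obs - t)
            break
        t += dt
        x += drift * dt
        if rng.random() < reset_rate / (jump_rate + reset_rate):
            x = 0.0                        # reset to the starting value
        else:
            x += rng.exponential(1.0)      # positive jump (exponential, assumed)
    samples[i] = x

print("stationary mean ~", samples.mean())
print("95th percentile ~", np.quantile(samples, 0.95))
```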
Abstract:
Scientific studies dealing specifically with references do not seem to exist. However, the utilization of references is an important practice for many companies involved in industrial marketing. The purpose of the study is to increase the understanding of the utilization of references in international industrial marketing in order to contribute to the development of a theory of reference behavior. Specifically, the study explores the modes of reference usage in industry, the factors affecting a supplier's reference behavior, and the question of how references are actually utilized. Due to the explorative nature of the study, a research design was followed in which theory and empirical studies alternated. An Exploratory Framework was developed to guide a pilot case study that resulted in Framework 1. Results of the pilot study guided an expanded literature review that was used to develop first a Structural Framework and a Process Framework, which were combined in Framework 2. Then, the second empirical phase of the case study was conducted in the same (pilot) case company. In this phase, Decision Systems Analysis (DSA) was used as the analysis method. The DSA procedure consists of three interviewing waves: initial interviews, reinterviews, and validating interviews. Four reference decision processes were identified, described and analyzed in the form of flowchart descriptions. The flowchart descriptions were used to explore new constructs and to develop new propositions refining Framework 2 further. The quality of the study was ascertained by many actions in both empirical parts of the study. The construct validity of the study was ensured by using multiple sources of evidence and by asking the key informant to review the pilot case report. The DSA method itself includes procedures assuring validity. Because a single case study was chosen, external validity was not pursued. High reliability was pursued through detailed documentation and thorough reporting of evidence. It was concluded that the core of the concept of reference is a customer relationship, regardless of the concrete forms a reference might take in its utilization. Depending on various contingencies, references might have various tasks within four roles: increasing 1) the efficiency of sales and sales management, 2) the efficiency of the business, 3) the effectiveness of marketing activities, and 4) the effectiveness of establishing, maintaining and enhancing customer relationships. Thus, references have not only external but also internal tasks. A supplier's reference behavior might be affected by many hierarchical conditions. Additionally, the empirical study showed that the supplier can utilize its references as a continuous, all-pervasive decision-making process through various practices. The process includes both individual and unstructured decision-making subprocesses. The proposed concept of reference can be used to guide a reference policy, which is recommendable for companies for which the utilization of references is important. The significance of the study is threefold: proposing the concept of reference, developing a framework of a supplier's reference behavior and its short-term process of utilizing references, and conceptually structuring an unstructured phenomenon that is important in industrial marketing into four roles.
Abstract:
Today's organizations must have the ability to react to rapid changes in the market. These rapid changes create pressure to continuously find new, efficient ways to organize work practices. Increased competition requires businesses to become more effective, to pay attention to the quality of management, and to make people understand their work's impact on the final result. The fundamentals of continuous improvement are the systematic and agile tackling of identified individual process constraints and the fact that nothing ultimately improves without changes. Successful continuous improvement requires management commitment, education, implementation, measurement, recognition and regeneration. These ingredients form the foundation both for breakthrough projects and for small-step ongoing improvement activities. One part of an organization's management system is the set of quality tools, which provide systematic methodologies for identifying problems, defining their root causes, finding solutions, gathering and sorting data, supporting decision making, implementing changes, and many other management tasks. Organizational change management includes processes and tools for managing people through an organization-level change. These tools include a structured approach which can be used for the effective transition of organizations through change. When combined with an understanding of the change management of individuals, these tools provide a framework for managing people in change.
Abstract:
The aim of this study was to create a Balanced Scorecard for the DigiCup solution. The first goal was to create process descriptions for a few critical processes. The second goal was to define appropriate measures, based on a customer survey and following the Balanced Scorecard process description, to manage the critical success factors. The overall goal of this study was to create a performance measurement system for the solution that guides the operation towards continuous improvement. This study was conducted using both qualitative and quantitative methods, and the analysis was done using a case study method. The material was gathered from the current customers, the management and the employees using structured, semi-structured and open group and individual interviews. The current customers were divided into retailers and direct customers of the DigiCup solution. The questions the customers were asked concerned information about the interviewee, the company, business strategy, the market, a satisfaction survey and future requirements. The management defined the strategy and took part in specifying the perspectives, objectives and measurements for the Balanced Scorecard of the DigiCup solution. The employees participated in choosing the metrics. The material consisted of sixteen interviews in total. At the beginning of the study, the product development, order-delivery and printing processes were chosen as the critical processes of the DigiCup solution. These processes were already concentrated on in the literature review, which sought to identify their characteristics as well as the critical success factors and the appropriate measurements that could be utilized when creating the Balanced Scorecard for the DigiCup solution according to the customer survey. Appropriate perspectives, objectives and measurements were found for the DigiCup solution. The chosen measures work as a basis for the development of an IT reporting tool. As a conclusion, it can be stated that in a new business, where the objectives change according to the development phase the company is in, the measurements should be updated often enough.
Abstract:
This thesis was part of a lean adaptation project started at the Outotec Lappeenranta factory in early 2013. The purpose of this thesis was to develop and propose lean tools that could be used in daily management, visual management and continuous improvement. This thesis took an “outsider's” view and, as such, did not study the current processes deeply. As a result of this thesis, two different Daily Management boards were designed, one for parallel processes and one for sequential processes. In addition, methods for continuous improvement and daily task accountability were framed, and standard work for the leaders was outlined. The tools presented in this thesis are general tools which support work in a lean environment. They are visual and, if used correctly, they provide a basis from which continuous improvement can be done. Lean philosophy emphasizes a deep understanding of the current situation, and it would be against lean principles to blindly implement anything developed “on the outside”. The tools presented should be reviewed and modified further by the people working on the factory floor.