972 results for Random processes


Relevance: 20.00%

Abstract:

Analyzing and redesigning business processes is a complex task, which requires the collaboration of multiple actors. Current approaches focus on collaborative modeling workshops where process stakeholders verbally contribute their perspective on a process while modeling experts translate their contributions and integrate them into a model using traditional input devices. Limiting participants to verbal contributions affects not only the outcome of collaboration but also the collaboration itself. We created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. We are currently conducting a study that aims to assess the impact of CubeBPM on collaboration and modeling performance. Initial results presented in this paper indicate that the setting helped participants to become more active in collaboration.


Relevance: 20.00%

Abstract:

Analyzing and redesigning business processes is a complex task, which requires the collaboration of multiple actors. Current approaches focus on workshops where process stakeholders, together with modeling experts, create a graphical visualization of a process in a model. Within these workshops, stakeholders are mostly limited to verbal contributions, which are integrated into a process model by a modeling expert using traditional input devices. This limitation negatively affects the collaboration outcome and also the perception of the collaboration itself. To overcome this problem, we created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. Using this system for collaborative modeling, we expect to provide a more effective collaboration environment and thus improve modeling performance and collaboration.

Relevance: 20.00%

Abstract:

The thermal degradation processes of two sulfur polymers, poly(xylylene sulfide) (PXM) and poly(xylylene disulfide) (PXD), were investigated in parallel by direct pyrolysis mass spectrometry (DPMS) and flash pyrolysis GC/MS (Py-GC/MS). Thermogravimetric data showed that these polymers decompose in two separate steps, in the temperature ranges of 250-280 and 600-650 °C, leaving a high amount of residue (about 50% at 800 °C). The pyrolysis products detected by DPMS in the first degradation step of PXM and PXD were terminated by three types of end groups, -CH3, -CH2SH, and -CH=S, originating from thermal cleavage reactions involving a series of homolytic chain scissions followed by hydrogen transfer reactions, generating several oligomers containing some intact xylylene sulfide repeating units. The presence of pyrolysis compounds containing stilbene-like units in the first degradation step was also observed. Their formation has been accounted for by a parallel cleavage involving the elimination of H2S from the PXM main chains. These unsaturated units can undergo cross-linking at higher temperatures, producing the high amount of char residue observed. The thermal degradation compounds detected by DPMS in the second decomposition step, at about 600-650 °C, consisted of condensed aromatic molecules containing dihydrophenanthrene and phenanthrene units. These compounds might be generated from the polymer chains containing stilbene units by isomerization and dehydrogenation reactions. The pyrolysis products obtained in the Py-GC/MS of PXM and PXD at 610 °C are almost identical. The relative abundances in the pyrolysate and the spectral properties of the main pyrolysis products were found to be in generally good agreement with those obtained by DPMS. Polycyclic aromatic hydrocarbons (PAHs) were also detected by Py-GC/MS, but in minor amounts with respect to DPMS.
This apparent discrepancy is due to the simultaneous detection of PAHs together with all pyrolysis products in Py-GC/MS, whereas in DPMS they were detected in the second thermal degradation step, without the greater part of the pyrolysis compounds generated in the first degradation step. The DPMS and Py-GC/MS experiments thus provided complementary data on the degradation of PXM and PXD and therefore allowed the unequivocal formulation of the thermal degradation mechanism for these sulfur-containing polymers.

Relevance: 20.00%

Abstract:

Part I (Manjunath et al., 1994, Chem. Engng Sci. 49, 1451-1463) of this paper showed that the random particle numbers and size distributions in precipitation processes in very small drops obtained by stochastic simulation techniques deviate substantially from the predictions of conventional population balance. The foregoing problem is considered in this paper in terms of a mean field approximation obtained by applying a first-order closure to an unclosed set of mean field equations presented in Part I. The mean field approximation consists of two mutually coupled partial differential equations featuring (i) the probability distribution for residual supersaturation and (ii) the mean number density of particles for each size and supersaturation from which all average properties and fluctuations can be calculated. The mean field equations have been solved by finite difference methods for (i) crystallization and (ii) precipitation of a metal hydroxide both occurring in a single drop of specified initial supersaturation. The results for the average number of particles, average residual supersaturation, the average size distribution, and fluctuations about the average values have been compared with those obtained by stochastic simulation techniques and by population balance. This comparison shows that the mean field predictions are substantially superior to those of population balance as judged by the close proximity of results from the former to those from stochastic simulations. The agreement is excellent for broad initial supersaturations at short times but deteriorates progressively at larger times. For steep initial supersaturation distributions, predictions of the mean field theory are not satisfactory thus calling for higher-order approximations. The merit of the mean field approximation over stochastic simulation lies in its potential to reduce expensive computation times involved in simulation. 
More effective computational techniques could not only enhance this advantage of the mean field approximation but also make it possible to use higher-order approximations eliminating the constraints under which the stochastic dynamics of the process can be predicted accurately.
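The gap between stochastic simulation and deterministic mean predictions for small drops can be illustrated with a toy model (entirely hypothetical, and far simpler than the paper's coupled mean field equations): particles nucleate at a rate proportional to the residual supersaturation, and each birth consumes a fixed amount of it.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy single-drop nucleation model (illustrative only):
# particles nucleate at rate k*s; each birth consumes ds of supersaturation.
k, ds, s0, T = 5.0, 0.1, 1.0, 10.0

def simulate():
    """Gillespie-style simulation of the random particle count in one drop."""
    t, s, n = 0.0, s0, 0
    while s > 0:
        t += rng.exponential(1.0 / (k * s))   # waiting time to next nucleation
        if t > T:
            break
        n += 1
        s = max(s - ds, 0.0)                  # supersaturation depletion
    return n

counts = [simulate() for _ in range(500)]
mean_n, var_n = np.mean(counts), np.var(counts)
print(mean_n, var_n)
```

Averaging over many drops recovers a smooth mean, but individual drops show the number fluctuations that a population balance for the mean alone cannot capture.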

Relevance: 20.00%

Abstract:

We consider a linear system with Markovian switching that is perturbed by Gaussian-type noise. If the linear system is mean-square stable, we show that under certain conditions the perturbed system is also stable. We also show that, under certain conditions, the linear system with Markovian switching can be stabilized by such a noisy perturbation.
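A minimal sketch of this kind of system, with made-up matrices and switching rates: a two-mode linear system whose mode follows a continuous-time Markov chain, perturbed by multiplicative Gaussian noise and integrated by Euler-Maruyama.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-mode system: the chain switches between A0 and A1.
A = [np.array([[-1.0, 0.5], [0.0, -1.0]]),
     np.array([[-0.5, 0.0], [0.3, -0.8]])]
Q = np.array([[-2.0, 2.0], [1.0, -1.0]])   # generator of the Markov chain
sigma = 0.1                                 # multiplicative noise intensity
dt, T = 1e-3, 5.0

def simulate(x0, mode=0):
    """Euler-Maruyama path of dx = A_{r(t)} x dt + sigma * x dW."""
    x, r = np.array(x0, float), mode
    for _ in range(int(T / dt)):
        if rng.random() < -Q[r, r] * dt:    # switch with prob ~ rate * dt
            r = 1 - r
        x = x + A[r] @ x * dt + sigma * x * np.sqrt(dt) * rng.standard_normal(2)
    return x

# average squared norm over a few paths; it shrinks when the system is stable
ms = np.mean([np.sum(simulate([1.0, 1.0]) ** 2) for _ in range(20)])
print(ms)
```

Both modes here are individually stable, so the switched and noise-perturbed system remains mean-square stable in this example.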

Relevance: 20.00%

Abstract:

The development of techniques for scaling up classifiers so that they can be applied to problems with large datasets of training examples is one of the objectives of data mining. Recently, AdaBoost has become popular in the machine learning community thanks to its promising results across a variety of applications. However, training AdaBoost on large datasets is a major problem, especially when the dimensionality of the data is very high. This paper discusses the effect of high dimensionality on the training process of AdaBoost. Two preprocessing options to reduce dimensionality, namely principal component analysis and random projection, are briefly examined. Random projection, subject to a probabilistic length-preserving transformation, is explored further as a computationally light preprocessing step. The experimental results obtained demonstrate the effectiveness of the proposed training process for handling high-dimensional large datasets.
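The length-preserving property that makes random projection attractive as a light preprocessing step can be sketched as follows (the dimensions and the Gaussian projection matrix are illustrative choices, not necessarily those of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

n, d, k = 50, 2000, 200            # samples, original dim, projected dim
X = rng.standard_normal((n, d))

# Gaussian random projection scaled by 1/sqrt(k): approximately
# length-preserving (Johnson-Lindenstrauss), so distances survive.
R = rng.standard_normal((d, k)) / np.sqrt(k)
Xp = X @ R

# compare one pairwise distance before and after projection
d_orig = np.linalg.norm(X[0] - X[1])
d_proj = np.linalg.norm(Xp[0] - Xp[1])
ratio = d_proj / d_orig
print(ratio)
```

Because distances are approximately preserved, a classifier trained on the k-dimensional projections sees essentially the same geometry at a fraction of the cost of the original d-dimensional data.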

Relevance: 20.00%

Abstract:

The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program for Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, achieved by a system of software filters and random number generators. The model represents neither the neurological mechanism responsible for generating the EEG, nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid ‘partial’ statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user, the same selected parameters always producing the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the ‘stationary EEG’. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, enabling at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given.
The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
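The filter-bank idea, stationary white noise shaped by rational transfer functions and summed per frequency band, can be sketched as follows (the resonator parameters and band gains are illustrative, not Zetterberg's actual coefficients):

```python
import numpy as np

rng = np.random.default_rng(42)
fs, dur = 128, 25                 # sample rate (Hz), duration (s)
n = fs * dur

def band_component(f0, r, gain):
    """AR(2) resonator driven by white noise: poles at r*exp(+-i*2*pi*f0/fs)."""
    a1, a2 = 2 * r * np.cos(2 * np.pi * f0 / fs), -r * r
    w = rng.standard_normal(n)
    y = np.zeros(n)
    for t in range(2, n):
        y[t] = a1 * y[t - 1] + a2 * y[t - 2] + w[t]
    return gain * y / np.std(y)   # scale to the requested band power

# delta (~2 Hz), alpha (~10 Hz), beta (~20 Hz); gains set the power mix
eeg = band_component(2, 0.95, 1.0) + band_component(10, 0.97, 0.8) \
      + band_component(20, 0.95, 0.4)
print(eeg.shape)
```

Fixing the seed reproduces the same record, mirroring the paper's point that the same selected parameters always produce the same statistical output.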

Relevance: 20.00%

Abstract:

There is a need for better understanding of pharmaceutical processes and for new ideas to develop traditional pharmaceutical powder manufacturing procedures. Process analytical technology (PAT) has been developed to improve understanding of processes and to establish methods to monitor and control them. The interest is in maintaining and even improving the whole manufacturing process and the final products in real time. Process understanding can be a foundation for innovation and continuous improvement in pharmaceutical development and manufacturing. New methods are needed to increase the quality and safety of the final products faster and more efficiently than ever before. Real-time process monitoring demands tools which enable fast and noninvasive measurements with sufficient accuracy. Traditional quality control methods have been laborious and time consuming, and they are performed off-line, i.e. the analysis is removed from the process area. Vibrational spectroscopic methods are responding to this challenge, and their utilisation has increased considerably during the past few years. In addition, other methods such as colour analysis can be utilised in noninvasive real-time process monitoring. In this study three pharmaceutical processes were investigated: drying, mixing and tabletting; in addition, tablet properties were evaluated. Real-time monitoring was performed with NIR and Raman spectroscopies, colour analysis and particle size analysis, and compression data during tabletting were evaluated using mathematical modelling. These methods were suitable for real-time monitoring of pharmaceutical unit operations and increase the knowledge of the critical parameters in the processes and the phenomena occurring during operations. They can improve our process understanding and therefore, finally, enhance the quality of the final products.

Relevance: 20.00%

Abstract:

This is the fourth TAProViz workshop being run at the 13th International Conference on Business Process Management (BPM). The intention this year is to consolidate the results of the previous successful workshops by further developing this important topic and identifying the key research topics of interest to the BPM visualization community. Towards this goal, the workshop topics were extended to human-computer interaction and related domains. Submitted papers were evaluated by at least three program committee members, in a double-blind manner, on the basis of significance, originality, technical quality and exposition. Three full papers and one position paper were accepted for presentation at the workshop. In addition, we invited a keynote speaker, Jakob Pinggera, a postdoctoral researcher at the Business Process Management Research Cluster at the University of Innsbruck, Austria.

Relevance: 20.00%

Abstract:

Through an analysis using the transfer function of a pinhole camera, the multiple imaging characteristics of photographic diffusers described by Grover and Tremblay [Appl. Opt. 21, 4500 (1982)] are studied. It is found that only one pinhole diameter satisfies the optimum imaging condition for best contrast transfer at any desired spatial frequency. A simple method of generating random pinhole arrays with a controlled pinhole diameter is described. These pinhole arrays are then used to generate high-frequency sinusoidal gratings from a coarse grid. The contrast in the final gratings is found to be reasonably high.
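One naive way to build a binary random pinhole array with a controlled pinhole diameter is sketched below; this is purely illustrative (the paper's photographic generation method is not reproduced here), with grid size, hole count, and radius chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(5)

# Binary mask: disks of fixed radius at random centres (overlaps allowed).
size, n_holes, radius = 256, 40, 3
mask = np.zeros((size, size), bool)
yy, xx = np.mgrid[0:size, 0:size]
for _ in range(n_holes):
    cy, cx = rng.integers(radius, size - radius, 2)   # keep disks in bounds
    mask |= (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
print(mask.sum())
```

Keeping the radius fixed while randomizing only the centres is what gives every pinhole the same diameter, the property the paper identifies as critical for contrast transfer.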

Relevance: 20.00%

Abstract:

The splitting techniques for Markov chains developed by Nummelin (1978a) and Athreya and Ney (1978b) are used to derive an imbedded renewal process in Wold's point process with Markov-correlated intervals. This leads to a simple proof of renewal theorems for such processes. In particular, a key renewal theorem is proved, from which analogues to both Blackwell's and Breiman's forms of the renewal theorem can be deduced.

Relevance: 20.00%

Abstract:

The anharmonic oscillator under combined sinusoidal and white noise excitation is studied using the Gaussian closure approximation. The mean response and the steady-state variance of the system are obtained by the WKBJ approximation and also from the Fokker-Planck equation. The multiple steady-state solutions are obtained and their stability analysis is presented. Numerical results are obtained for a particular set of system parameters. The theoretical results are compared with a digital simulation study to bring out the usefulness of the present approximate theory.
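A digital simulation of this kind of system can be sketched with a Duffing-type oscillator under combined sinusoidal and white noise excitation (all parameter values here are made up for illustration, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative anharmonic oscillator:
#   x'' + 2*zeta*x' + x + eps*x**3 = F*cos(omega*t) + sigma*xi(t)
zeta, eps, F, omega, sigma = 0.1, 0.5, 0.3, 1.0, 0.2
dt, T = 1e-3, 50.0
steps = int(T / dt)

x, v = 0.0, 0.0
xs = np.empty(steps)
for k in range(steps):
    t = k * dt
    a = -2 * zeta * v - x - eps * x ** 3 + F * np.cos(omega * t)
    v += a * dt + sigma * np.sqrt(dt) * rng.standard_normal()  # Euler-Maruyama
    x += v * dt
    xs[k] = x

# crude steady-state variance estimate over the second half of the run
var = np.var(xs[steps // 2:])
print(var)
```

Statistics such as this sample variance are what a closure-based approximate theory would be checked against.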

Relevance: 20.00%

Abstract:

Pseudo-marginal methods such as the grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms have been introduced in the literature as an approach to performing Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but less theoretical support. In this paper we propose to use Gaussian processes (GPs) to accelerate the GIMH method, whilst using a short pilot run of MCWM to train the GP. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility model and a gene network model.
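The core pseudo-marginal idea, replacing the likelihood in the Metropolis-Hastings ratio with an unbiased Monte Carlo estimate that is kept together with the chain state (as in GIMH), can be sketched on a toy latent-variable model; the model and all settings below are invented for illustration, and no GP acceleration is shown.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: y ~ N(theta + z, 1) with latent z ~ N(0, 1).
# The intractable-likelihood stand-in is an average over N simulated latents.
y_obs = 1.5
N = 50

def lik_hat(theta):
    """Unbiased Monte Carlo estimate of p(y_obs | theta)."""
    z = rng.standard_normal(N)
    return np.mean(np.exp(-0.5 * (y_obs - theta - z) ** 2) / np.sqrt(2 * np.pi))

theta, L = 0.0, lik_hat(0.0)
chain = []
for _ in range(2000):
    prop = theta + 0.5 * rng.standard_normal()
    Lp = lik_hat(prop)
    # flat prior assumed, so the ratio is just the likelihood-estimate ratio
    if rng.random() < Lp / L:
        theta, L = prop, Lp    # recycling the estimate is what makes GIMH exact
    chain.append(theta)

est = np.mean(chain[500:])
print(est)
```

Re-estimating the likelihood at the current state on every iteration instead (rather than recycling it) would give the MCWM variant, which mixes better but loses the exactness guarantee.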