593 results for stochastic load factor
Abstract:
Process models in organizational collections are typically created by the same team and according to the same conventions. As such, these models share many characteristic features, such as size range and the type and frequency of errors. In most cases only small samples of these collections are available, due, for example, to the sensitive information they contain. Because of their small size, these samples may not accurately represent the characteristics of the originating collection. This paper deals with the problem of constructing collections of process models, in the form of Petri nets, from small samples of a collection, so as to accurately estimate the characteristics of that collection. Given a small sample of process models drawn from a real-life collection, we mine a set of generation parameters that we use to generate arbitrarily large collections that exhibit the same characteristics as the original collection. In this way we can estimate the characteristics of the original collection on the generated collections. We extensively evaluate the quality of our technique on various sample datasets drawn from both research and industry.
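The abstract does not specify which generation parameters are mined; as a minimal sketch of the general idea, the snippet below fits a hypothetical size distribution (here lognormal) to a small sample of model sizes, generates an arbitrarily large synthetic collection, and estimates a collection characteristic from it. The choice of distribution, the sample values, and all names are illustrative assumptions, not the paper's method.

```python
# Sketch: estimate a collection characteristic from a generated collection.
# The lognormal size model and all values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

# Small sample of model sizes (number of nodes per Petri net), e.g. 20 models.
sample_sizes = np.array([12, 18, 25, 9, 31, 14, 22, 17, 40, 11,
                         16, 28, 13, 21, 35, 10, 19, 24, 15, 27])

# Mine "generation parameters": fit a lognormal to the observed sizes.
log_sizes = np.log(sample_sizes)
mu, sigma = log_sizes.mean(), log_sizes.std(ddof=1)

# Generate an arbitrarily large synthetic collection of sizes.
generated_sizes = rng.lognormal(mean=mu, sigma=sigma, size=100_000)

# Estimate a characteristic of the original collection, e.g. the
# fraction of "large" models (more than 30 nodes).
print("sample estimate:   ", np.mean(sample_sizes > 30))
print("generated estimate:", np.mean(generated_sizes > 30))
```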
Abstract:
This paper proposes the use of the Bayes factor as a distance metric for speaker segmentation within a speaker diarization system. The proposed approach uses a pair of constant-sized sliding windows and computes the value of the Bayes factor between the adjacent windows over the entire audio. Results obtained on the 2002 Rich Transcription Evaluation dataset show improved segmentation performance compared to previous approaches reported in the literature that use the Generalized Likelihood Ratio. When applied in a speaker diarization system, this approach yields a 5.1% relative improvement in the overall Diarization Error Rate compared to the baseline.
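The exact Bayes-factor computation depends on prior choices not given in the abstract; as context, the sketch below computes the Generalized Likelihood Ratio baseline that the paper compares against, for two adjacent windows of feature vectors each modelled as a full-covariance Gaussian. The window size, feature dimension, and toy data are illustrative assumptions.

```python
# Sketch: GLR change score between two adjacent feature windows, each
# modelled as a single full-covariance Gaussian (the baseline approach).
import numpy as np

def _logdet_cov(frames: np.ndarray) -> float:
    """Log-determinant of the ML covariance of a window of frames."""
    cov = np.cov(frames, rowvar=False, bias=True)
    sign, logdet = np.linalg.slogdet(cov)
    return logdet

def glr(x: np.ndarray, y: np.ndarray) -> float:
    """GLR score between windows x and y (higher = more likely a speaker
    change). x and y have shape (n_frames, n_features)."""
    z = np.vstack([x, y])
    n_x, n_y = len(x), len(y)
    return 0.5 * ((n_x + n_y) * _logdet_cov(z)
                  - n_x * _logdet_cov(x)
                  - n_y * _logdet_cov(y))

# Toy usage: slide two 100-frame windows over 13-dimensional features.
rng = np.random.default_rng(0)
feats = rng.normal(size=(1000, 13))
feats[500:] += 2.0  # simulated speaker change at frame 500
scores = [glr(feats[t-100:t], feats[t:t+100]) for t in range(100, 900)]
print("peak change score near frame", 100 + int(np.argmax(scores)))
```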
Abstract:
Luxury is a quality that is difficult to define, as the historical concept of luxury appears to be both dynamic and culturally specific. The everyday definition explains a ‘luxury’ in relation to a necessity: a luxury (product or service) is something that consumers want rather than need. However, the growth of global markets has seen a boom in what are now referred to as ‘luxury brands’. This branding of products as luxury has changed the way consumers understand luxury goods and services. In their attempt to characterize a luxury brand, Fionda and Moore, in their article “The Anatomy of a Luxury Brand”, summarize a range of critical conditions in addition to product branding, “... including product and design attributes of quality, craftsmanship and innovative, creative and unique products” (Fionda & Moore, 2009). For the purposes of discussing fashion design, however, quality and craftsmanship are inseparable, while creativity and innovation exist under different conditions. The terms ‘creative’ and ‘innovative’ are often used interchangeably and are connected with most descriptions of the design process, in many cases defining ‘design’ and ‘fashion’ themselves. Christian Marxt and Fredrik Hacklin identify this condition in their paper “Design, product development, innovation: all the same in the end?” (Marxt & Hacklin, 2005) and suggest that design communities should be aware that the distinction between these terms, once quite definitive, is narrowing to the point where they will mean the same thing. For theory building in the discipline this could pose significant problems. Brett Richards (2003) distinguishes innovation from creativity in that innovation aims to transform and implement rather than simply explore and invent. Considering this distinction, particularly in relation to luxury branding, may affect the way design can contribute to changing how luxury fashion goods are perceived in a polarised fashion market, namely by suggesting that ‘luxury’ is what consumers need rather than the ‘pile it high, sell it cheap’ fashion that the current market dynamic would indicate they want. This paper explores the role of innovation as a key contributing factor in luxury concepts, in particular the relationship between innovation and creativity, the conditions that enable innovation, the role of craftsmanship in innovation, and design innovation in relation to luxury fashion products. An argument is presented that technological innovation is a common factor in the development of luxury fashion products and that the connection between designer and maker will play an important role in the development of luxury fashion goods for a sustainable fashion industry.
Abstract:
Dynamic load sharing can be defined as a measure of the ability of a heavy-vehicle multi-axle group to equalise load across its wheels under typical travel conditions, i.e. in the dynamic sense, at typical travel speeds and operating conditions of that vehicle. Various attempts have been made to quantify the ability of heavy vehicles to equalise the load across their wheels during travel. One of these was the concept of the load sharing coefficient (LSC). Other metrics, such as the dynamic load coefficient (DLC), peak dynamic wheel force (PDWF) and dynamic impact force (DIF), have been used to compare one heavy-vehicle suspension with another for potential road damage. This paper compares these metrics and determines a relationship between DLC and LSC, with a sensitivity analysis of this relationship. The shortcomings of the presently available metrics are discussed and a new metric, the dynamic load equalisation (DLE) measure, is proposed.
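As a minimal sketch, assuming the conventional definitions (DLC as the coefficient of variation of a wheel-force signal; LSC as a wheel's mean share of the axle-group load relative to an equal split), the two metrics the paper compares can be computed from wheel-force time series as follows. The synthetic signals and all parameter values are placeholders, not measured data.

```python
# Sketch: DLC and LSC from wheel-force time series, assuming the
# conventional definitions (not the paper's new DLE measure).
import numpy as np

def dlc(force: np.ndarray) -> float:
    """Dynamic load coefficient: std of the wheel force over its mean."""
    return force.std(ddof=1) / force.mean()

def lsc(forces: np.ndarray) -> np.ndarray:
    """Load sharing coefficients for an axle group.
    forces has shape (n_wheels, n_samples); LSC_i is the mean force on
    wheel i divided by the nominal equal share of the group load."""
    mean_per_wheel = forces.mean(axis=1)
    nominal_share = mean_per_wheel.sum() / len(mean_per_wheel)
    return mean_per_wheel / nominal_share

# Synthetic example: a tri-axle group, ~50 kN static load per wheel,
# with dynamic variation and a deliberate imbalance on wheel 0.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 1000)
base = 50e3 + 5e3 * np.sin(2 * np.pi * 3 * t)          # 3 Hz body bounce
forces = np.vstack([base * s + rng.normal(0, 2e3, t.size)
                    for s in (1.10, 0.95, 0.95)])
print("DLC per wheel:", [round(dlc(f), 3) for f in forces])
print("LSC per wheel:", np.round(lsc(forces), 3))
```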
Abstract:
The study investigated the effect on learning of four different instructional formats used to teach assembly procedures. Cognitive load and spatial information processing theories were used to generate the instructional material. The first group received a physical model to study, the second an isometric drawing, the third an isometric drawing plus a model, and the fourth an orthographic drawing. Forty secondary school students were presented with the four different instructional formats and subsequently tested on an assembly task. The findings suggested that the model format, which only required encoding of an already constructed three-dimensional representation, caused less extraneous cognitive load than the isometric and orthographic formats. No significant difference was found between the model and the isometric-plus-model formats on any measure, because 80% of the students in the isometric-plus-model condition chose to use the model only. The model format also did not differ significantly from the other groups in the total time taken to complete the assembly, in the number of correctly assembled pieces, or in the time spent studying the tasks. However, the model group had significantly more correctly completed models and required fewer extra looks than the other groups.
Abstract:
Cognitive load theory was used to generate a series of three experiments investigating the effects of various worked-example formats on learning orthographic projection. Experiments 1 and 2 compared the benefits of presenting problems, conventional worked examples incorporating only the final 2-D and 3-D representations, and modified worked examples with several intermediate stages of rotation between the 2-D and 3-D representations. Modified worked examples proved superior to conventional worked examples without intermediate stages, while conventional worked examples were, in turn, superior to problems. Experiment 3 investigated the consequences of varying the number and location of the intermediate stages in the rotation trajectory and found three stages to be superior to one; a single intermediate stage was superior when placed nearer the 2-D than the 3-D end of the trajectory. It was concluded that (a) orthographic projection is learned best using worked examples with several intermediate stages, and (b) a linear relation between angle of rotation and problem difficulty does not hold for orthographic projection material. Cognitive load theory could be used to suggest the ideal location of the intermediate stages.
Abstract:
Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic and amino acid sequences, and the results of these investigations in turn inspire further biological research. These biological sequences, which may be considered multiscale sequences, have specific features whose characterisation requires more refined methods. This project studies some of these biological challenges with multiscale analysis methods and a stochastic modelling approach.

The first part of the thesis aims to cluster unknown proteins and classify their families as well as their structural classes. A development in proteomic analysis concerns the determination of protein functions, and the first step in this development is to classify proteins and predict their families. This motivates us to study unknown proteins from specific families and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies and link them to simulate unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins.

The second part of the thesis explores the relationships of proteins based on a layered comparison of their components. Many methods are based on protein homology, because resemblance at the protein sequence level normally indicates similarity of functions and structures; however, some proteins have similar functions despite low sequence identity. We consider protein sequences at a detailed level to investigate the problem of protein comparison. The comparison is based on the empirical mode decomposition (EMD), with protein sequences represented by their intrinsic mode functions, and a measure of similarity is introduced with a new cross-correlation formula. The similarity results show that the EMD is useful for detecting functional relationships between proteins.

The third part of the thesis investigates the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As the investigation of genome-wide gene expression has become a focus of genomic analysis, researchers have tried for many years to understand the mechanisms of the yeast genome, yet how cells control gene expression still needs further investigation. We use a stochastic differential equation to model the expression profile of a target gene and modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and the estimated transcriptional rate is also calculated with information from five possible transcriptional regulators. Some regulators of these target genes are verified against the related literature. With these results, we construct a transcriptional regulatory network for genes from the yeast Saccharomyces cerevisiae. The construction of this transcriptional regulatory network is useful for uncovering further mechanisms of the yeast cell cycle.
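The abstract does not give the thesis's actual equations; as a minimal sketch of the third part's approach, the snippet below simulates a hypothetical stochastic differential equation for a target gene's expression, with regulator influence weighted by a Gaussian membership function and integrated by Euler–Maruyama discretisation. The model form, parameters, and regulator signals are all illustrative assumptions.

```python
# Sketch: Euler-Maruyama simulation of a hypothetical SDE for a target
# gene, dx = (sum_j w_j * mu(r_j) - lam * x) dt + sigma dW, where mu is
# a Gaussian membership function. All parameters are illustrative.
import numpy as np

def gaussian_membership(r: np.ndarray, c: float = 0.5, s: float = 0.2):
    """Gaussian membership of regulator expression r around centre c."""
    return np.exp(-((r - c) ** 2) / (2 * s ** 2))

rng = np.random.default_rng(7)
dt, n_steps, n_reg = 0.01, 2000, 5
lam, sigma = 0.8, 0.05                     # decay rate, noise intensity
w = rng.uniform(0.1, 0.5, size=n_reg)      # regulator weights

# Hypothetical regulator expression profiles (e.g. cell-cycle periodic).
t = np.arange(n_steps) * dt
regulators = 0.5 + 0.4 * np.sin(2 * np.pi * t[None, :] / 5
                                + rng.uniform(0, 2 * np.pi, (n_reg, 1)))

x = np.empty(n_steps)
x[0] = 0.2
for k in range(n_steps - 1):
    drift = w @ gaussian_membership(regulators[:, k]) - lam * x[k]
    x[k + 1] = x[k] + drift * dt + sigma * np.sqrt(dt) * rng.normal()

print("simulated expression range:", round(x.min(), 3), "to", round(x.max(), 3))
```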
Abstract:
Hydrogels, which are three-dimensional crosslinked hydrophilic polymers, have been widely used and studied as vehicles for drug delivery due to their good biocompatibility. Traditional methods of loading therapeutic proteins into hydrogels have some disadvantages: the biological activity of drugs or proteins can be compromised during the polymerization process, or the loading process can be very time-consuming. Therefore, different loading methods have been investigated. Based on the theory of electrophoresis, an electrochemical gradient can be used to transport proteins into hydrogels, so an electrophoretic method was used to load protein in this study.

Chemically and radiation-crosslinked polyacrylamide was used to set up the model system for loading protein electrophoretically into hydrogels. Different methods of preparing the polymers were studied and showed the effect of the crosslinker (bisacrylamide) concentration on the protein loading and release behaviour. The mechanism of protein release from the hydrogels was anomalous diffusion (i.e. the process was non-Fickian). The UV-Vis spectra of proteins before and after reduction show that the bioactivities of the proteins were maintained after release from the hydrogel.

Due to concern about the cytotoxicity of residual monomer in polyacrylamide, poly(2-hydroxyethyl methacrylate) (pHEMA) was used as the second material tested. In order to control the pore size, a polyethylene glycol (PEG) porogen was introduced into the pHEMA. The hydrogel disintegrated after immersion in water, indicating that the swelling forces exceeded the strength of the material. In order to understand the cause of the disintegration, several different crosslinker concentrations and preparation methods were studied; however, the disintegration still occurred after immersion in water, principally due to osmotic forces. A hydrogel suitable for drug delivery needs to be biocompatible and also robust. Therefore, an approach to improving the mechanical properties of the porogen-containing pHEMA hydrogel by introducing an inter-penetrating network (IPN) into the hydrogel system was researched. A double network was formed by introducing further HEMA solution into the system by both electrophoresis and slow diffusion. Raman spectroscopy was used to observe the diffusion of HEMA into the hydrogel prior to further crosslinking by γ-irradiation. The protein loading and release behaviour of the hydrogel showing enhanced mechanical properties was also studied.

Biocompatibility is a very important factor for the biomedical application of hydrogels. The different hydrogels were studied for their biocompatibility on both a three-dimensional human skin equivalent (HSE) model and an HSE wound model, and showed no detrimental effect on keratinocyte cells. These results indicate that the hydrogels show good biocompatibility in both models. Given advantages such as the ability to absorb and deliver proteins or drugs, these hydrogels have the potential to be used as topical materials for wound healing and other biomedical applications.
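The abstract classifies the release as anomalous (non-Fickian) diffusion without naming a model; the conventional way to make this classification is to fit the Korsmeyer–Peppas power law to the early portion of the release curve. This is standard background, not necessarily the analysis used in the thesis:

```latex
% Korsmeyer-Peppas power law for release from a polymer matrix:
%   M_t / M_inf : fraction of protein released at time t
%   k           : kinetic constant,  n : release exponent
% For a thin film, n = 0.5 indicates Fickian diffusion and
% 0.5 < n < 1 indicates anomalous (non-Fickian) transport.
\[
  \frac{M_t}{M_\infty} = k\,t^{\,n}, \qquad \frac{M_t}{M_\infty} \le 0.6
\]
```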
Abstract:
Focusing on the conditions with which an optimization problem may comply, so-called convergence conditions are proposed, and subsequently a stochastic optimization algorithm, named the DSZ algorithm, is presented to deal with both unconstrained and constrained optimization. The principle is discussed in terms of a theoretical model of the DSZ algorithm, from which a practical model is derived. The efficiency of the practical model is demonstrated by comparison with similar algorithms, such as Enhanced simulated annealing (ESA), Monte Carlo simulated annealing (MCS), Sniffer Global Optimization (SGO), Directed Tabu Search (DTS) and the Genetic Algorithm (GA), on a set of well-known unconstrained and constrained optimization test cases. Further attention is given to strategies for optimizing high-dimensional unconstrained problems with the DSZ algorithm.
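The abstract does not describe DSZ's update rule, so it cannot be reproduced here; as a minimal sketch of the kind of comparison reported, the snippet below runs a generic simulated-annealing baseline (a stand-in for the ESA/MCS competitors, not DSZ itself) on the Rosenbrock function, one of the standard unconstrained test cases.

```python
# Sketch: a generic simulated-annealing baseline (a stand-in for the
# ESA/MCS competitors, not the DSZ algorithm) on the Rosenbrock function.
import numpy as np

def rosenbrock(x: np.ndarray) -> float:
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2
                        + (1.0 - x[:-1]) ** 2))

def anneal(f, x0, n_iter=50_000, t0=1.0, cooling=0.9999, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x, fx, t = x0.copy(), f(x0), t0
    best_x, best_f = x.copy(), fx
    for _ in range(n_iter):
        cand = x + rng.normal(scale=step, size=x.shape)
        fc = f(cand)
        # Accept downhill moves always, uphill moves with Boltzmann prob.
        if fc < fx or rng.random() < np.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x.copy(), fx
        t *= cooling
    return best_x, best_f

x_best, f_best = anneal(rosenbrock, x0=np.full(5, -2.0))
print("best f:", round(f_best, 4), "at x ~", np.round(x_best, 2))
```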
Abstract:
We study the regret of optimal strategies for online convex optimization games. Using von Neumann's minimax theorem, we show that the optimal regret in this adversarial setting is closely related to the behavior of the empirical minimization algorithm in a stochastic process setting: it is equal to the maximum, over joint distributions of the adversary's action sequence, of the difference between a sum of minimal expected losses and the minimal empirical loss. We show that the optimal regret has a natural geometric interpretation, since it can be viewed as the gap in Jensen's inequality for a concave functional (the minimizer over the player's actions of expected loss) defined on a set of probability distributions. We use this expression to obtain upper and lower bounds on the regret of an optimal strategy for a variety of online learning problems. Our method provides upper bounds without the need to construct a learning algorithm; the lower bounds provide explicit optimal strategies for the adversary.
Peter L. Bartlett, Alexander Rakhlin
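One way to render the abstract's verbal description in symbols, with player actions a in a set A, adversary moves x_t, and loss function ℓ (the notation here is assumed for illustration, not taken from the paper):

```latex
% Minimax regret as described in the abstract: the maximum over joint
% distributions P of the adversary's sequence (x_1, ..., x_n) of the
% difference between a sum of minimal conditional expected losses and
% the minimal empirical loss.
\[
  V_n \;=\; \sup_{P}\; \mathbb{E}_{x_{1:n}\sim P}\!\left[
    \sum_{t=1}^{n} \inf_{a\in\mathcal{A}}
      \mathbb{E}\!\left[\ell(a, x_t)\,\middle|\,x_{1:t-1}\right]
    \;-\; \inf_{a\in\mathcal{A}} \sum_{t=1}^{n} \ell(a, x_t)
  \right]
\]
```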
Abstract:
Maximum-likelihood estimates of the parameters of stochastic differential equations are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed-form expression for the transitional probability density function of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This article provides a critical evaluation of the various estimation techniques. Special attention is given to the ease of implementation and comparative performance of the procedures when estimating the parameters of the Cox–Ingersoll–Ross and Ornstein–Uhlenbeck equations.
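As a minimal sketch of exact maximum likelihood in the one case where it is easy: the Ornstein–Uhlenbeck process dX_t = kappa(theta - X_t)dt + sigma dW_t has a closed-form Gaussian transition density, so its log-likelihood can be maximised directly. The simulated data and starting values below are illustrative.

```python
# Sketch: exact MLE for the Ornstein-Uhlenbeck process, which (unlike
# many SDEs) has a closed-form Gaussian transition density:
#   X_{t+d} | X_t ~ N(theta + (X_t - theta) e^{-kappa d},
#                     sigma^2 (1 - e^{-2 kappa d}) / (2 kappa))
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x, dt):
    kappa, theta, sigma = params
    if kappa <= 0 or sigma <= 0:
        return np.inf
    e = np.exp(-kappa * dt)
    mean = theta + (x[:-1] - theta) * e
    var = sigma ** 2 * (1 - e ** 2) / (2 * kappa)
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid ** 2 / var)

# Simulate an OU path with known parameters, then recover them by MLE.
rng = np.random.default_rng(3)
kappa_true, theta_true, sigma_true, dt, n = 2.0, 1.0, 0.3, 0.01, 20_000
x = np.empty(n); x[0] = theta_true
e = np.exp(-kappa_true * dt)
sd = sigma_true * np.sqrt((1 - e ** 2) / (2 * kappa_true))
for k in range(n - 1):
    x[k + 1] = theta_true + (x[k] - theta_true) * e + sd * rng.normal()

res = minimize(neg_loglik, x0=[1.0, 0.5, 0.5], args=(x, dt),
               method="Nelder-Mead")
print("estimated (kappa, theta, sigma):", np.round(res.x, 3))
```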