1000 results for bottleneck analysis
Abstract:
In any manufacturing system, many factors affect and limit the capacity of the overall system. This thesis presents a study of how to improve production capacity in a Finnish company (Viljavuuspalvelu Oy) using methods such as bottleneck analysis, Overall Equipment Effectiveness (OEE) and Just-in-Time production. Four analysis methods were applied to detect the bottleneck machine at Viljavuuspalvelu Oy. The results show that the bottleneck machine constraining production in the industrial area is the grinding machine, while the bottleneck machine in the laboratory section is the photometry machine. In addition, the Overall Equipment Effectiveness (OEE) of the entire system was calculated and found to be 35.75%. Two ways of increasing the OEE were then studied: either the total output of the company should reach 1254 samples/shift to achieve an OEE of around 85%, which is considered world class, or the ideal run rate should be 1.45 pieces/minute. Several realistic measures based on the findings of this thesis were also proposed to increase the OEE, and with one of them the OEE increased to 62.59%. Finally, how Just-in-Time production could be implemented at Viljavuuspalvelu Oy was examined.
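For readers unfamiliar with how an OEE figure like 35.75% is composed, the sketch below shows the standard Availability x Performance x Quality decomposition; the shift data are illustrative placeholders and not Viljavuuspalvelu Oy's actual figures (only the 1.45 pieces/minute ideal run rate is taken from the abstract).

```python
# Minimal OEE sketch using the standard decomposition; numbers are hypothetical.
planned_time_min = 480        # one shift, minutes (hypothetical)
downtime_min = 60             # unplanned stops (hypothetical)
ideal_run_rate = 1.45         # pieces per minute (value discussed in the abstract)
total_count = 500             # samples actually produced (hypothetical)
good_count = 490              # samples with no rework or rejects (hypothetical)

run_time = planned_time_min - downtime_min
availability = run_time / planned_time_min
performance = total_count / (ideal_run_rate * run_time)   # actual vs. ideal output
quality = good_count / total_count

oee = availability * performance * quality
print(f"OEE = {oee:.2%}")     # ~85% is commonly cited as world class
```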
Abstract:
In this research summary, we provide a novel look at the entrepreneurial profile of the UK in an international context. We use a new method, the Global Entrepreneurship and Development Index (GEDI), to identify the entrepreneurial strengths and weaknesses of the UK economy, as well as potential bottlenecks that hold back the performance of the UK relative to other advanced economies. We perform a Penalty for Bottleneck analysis to identify the bottlenecks in the UK's entrepreneurial profile, and we also explore optimal resource allocation for the UK's National Systems of Entrepreneurship policy.
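To illustrate the penalty-for-bottleneck idea, the sketch below applies an exponential penalty function of the form often associated with the GEDI methodology; both the exact functional form and the pillar scores are assumptions made purely for illustration, not the study's actual data or formula.

```python
import math

# Hypothetical normalized pillar scores for one economy (0..1); not real GEDI data.
pillars = {"opportunity_perception": 0.9, "risk_capital": 0.4, "networking": 0.7}

def penalize_for_bottleneck(scores):
    """Pull every pillar towards the weakest one (assumed exponential penalty:
    y_min + (1 - exp(-(y_i - y_min)))), so the bottleneck caps overall performance."""
    y_min = min(scores.values())
    return {k: y_min + (1 - math.exp(-(v - y_min))) for k, v in scores.items()}

print(penalize_for_bottleneck(pillars))   # strong pillars are discounted towards 0.4
```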
Abstract:
This thesis investigates the strategy implementation process of enterprises, a process which has received little academic attention compared with the rich research tradition on strategy formation. Strategy implementation is viewed as a process ensuring that the strategies of an organisation are realised fully and quickly, yet with constant consideration of changing circumstances. The aim of this study is to provide a framework for identifying, analysing and removing the strategy implementation bottleneck of an organisation and thus for intensifying its strategy process. The study opens by specifying the concept, tasks and key actors of the strategy implementation process; in particular, arguments for the critical implementation role of top management are provided. In order to facilitate the analysis and synthesis of the core findings of a scattered doctrine, six characteristic approaches to the strategy implementation phenomenon are identified and compared. The Bottleneck Framework is introduced as an instrument for arranging potential strategy realisation problems, prioritising an organisation's implementation obstacles and focusing the improvement measures accordingly. The SUCCESS Framework is introduced as a mnemonic for the seven critical factors to be taken into account when promoting strategy implementation. Both frameworks are empirically tested by applying them to a real strategy implementation intensification process in an international, industrial, group-structured case enterprise.
High throughput, high resolution selection of polymorphic microsatellite loci for multiplex analysis
Abstract:
Background: Large-scale genetic profiling, mapping and genetic association studies require access to a series of well-characterised and polymorphic microsatellite markers with distinct and broad allele ranges. Selection of complementary microsatellite markers with non-overlapping allele ranges has historically proved to be a bottleneck in the development of multiplex microsatellite assays. The characterisation process for each microsatellite locus can be laborious and costly given the need for numerous locus-specific fluorescent primers. Results: Here, we describe a simple and inexpensive approach to selecting useful microsatellite markers. The system is based on the pooling of multiple unlabelled PCR amplicons and their subsequent ligation into a standard cloning vector. A second round of amplification, using generic labelled primers targeting the vector and unlabelled locus-specific primers targeting the microsatellite flanking region, yields allelic profiles that are representative of all individuals contained within the pool. The suitability of various DNA pool sizes was then tested for this purpose. DNA template pools containing between 8 and 96 individuals were assessed for the determination of allele ranges of individual microsatellite markers across a broad population. This helped resolve the balance between using pools that are large enough to allow the detection of many alleles and the risk of including so many individuals in a pool that rare alleles are over-diluted and do not appear in the pooled microsatellite profile. Pools of DNA from 12 individuals allowed the reliable detection of all alleles present in the pool. Conclusion: The use of generic vector-specific fluorescent primers and unlabelled locus-specific primers provides a high-resolution, rapid and inexpensive approach for the selection of highly polymorphic microsatellite loci that possess non-overlapping allele ranges for use in large-scale multiplex assays.
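The pool-size trade-off described above can be made concrete with a back-of-the-envelope calculation: in a pool of n diploid individuals, an allele at population frequency p must both be sampled into the pool and remain a detectable fraction of the 2n gene copies. The sketch below is only an illustration of that reasoning under an assumed detection threshold, not the authors' analysis.

```python
# Illustrative pool-size reasoning: sampling vs. dilution of a rare allele.
def pool_stats(p, n, detection_fraction=0.05):
    """p: allele frequency, n: diploid individuals per pool,
    detection_fraction: assumed minimum share of the pooled signal
    needed for a peak to be visible (hypothetical threshold)."""
    copies = 2 * n
    prob_sampled = 1 - (1 - p) ** copies          # allele present in the pool at all
    min_share = 1 / copies                        # its share if present as a single copy
    detectable = min_share >= detection_fraction  # single copy still above threshold?
    return prob_sampled, min_share, detectable

for n in (8, 12, 24, 48, 96):
    sampled, share, ok = pool_stats(p=0.05, n=n)
    print(f"n={n:3d}: P(sampled)={sampled:.2f}, single-copy share={share:.3f}, detectable={ok}")
```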
Abstract:
The South American fur seal, Arctocephalus australis, was one of the earliest otariid seals to be exploited by humans: at least 6000 years ago on the Atlantic coast and 4000 years ago on the Pacific coast of South America. More than 750,000 fur seals were killed in Uruguay up to 1991. However, a climatological phenomenon, the severe 1997-1998 El Niño Southern Oscillation (ENSO), was responsible for the decline of 72% of the Peruvian fur seal population through starvation, as a consequence of warming sea-surface temperatures and reduced primary productivity. Currently, there is no precise information on global population size or on the species' conservation status. The present study includes the first bottleneck test for the Pacific and Atlantic populations of A. australis, based on the analysis of seven microsatellite loci. A genetic bottleneck compromises the evolutionary potential of a population to respond to environmental changes. The outlook becomes even more alarming given current global warming models that predict stronger and more frequent ENSO events in the future. Our analysis found moderate support for deviation from neutrality-equilibrium in the Pacific population of fur seals and none in the Atlantic population. This difference between populations reflects different demographic histories and is consistent with a greater reduction in population size in the Pacific. Such an event could be a result of the synergistic effects of recurrent ENSO events and anthropogenic impacts (sealing and prey overfishing) on this population.
Abstract:
Monitoring genetic diversity is fundamental in a restocking program. The genetic diversity of pacu, Piaractus mesopotamicus (Holmberg, 1887), was evaluated at two fish farming stations in Andirá, Paraná, Brazil, used in the restocking program of the Paranapanema River. Six microsatellite loci were amplified to evaluate 60 fin samples. Broodstock B showed a higher number of alleles and higher heterozygosity (alleles: 22; Ho: 0.628) than broodstock A (alleles: 21; Ho: 0.600). Low-frequency alleles were observed in both stocks. Positive inbreeding coefficients at loci Pme2 (stock A: FIS = 0.30; stock B: FIS = 0.20), Pme5 (stock B: FIS = 0.15), Pme14 (stock A: FIS = 0.07) and Pme28 (stock A: FIS = 0.24; stock B: FIS = 0.20) indicated a deficiency of heterozygotes. A null allele was detected at locus Pme2. Negative estimates at loci Pme4 (stock A: FIS = -0.43; stock B: FIS = -0.37), Pme5 (stock A: FIS = -0.11), Pme14 (stock B: FIS = -0.15) and Pme32 (stock A: FIS = -0.93; stock B: FIS = -0.60) were indicative of an excess of heterozygotes. Linkage disequilibrium and low allelic richness were evident only in stock A. Nei's genetic diversity was high in both stocks. Genetic distance (0.085) and identity (0.918) showed similarity between the stocks, reflecting a possible common origin. Of the total genetic variance, 6.05% was due to differences between the stocks. A recent bottleneck effect was observed in both stocks. The results indicated high genetic diversity in the broodstocks and low genetic differentiation between them, caused by the reproductive management of the fish farms, the reduction in population size and the genetic exchange between the farms.
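For context, the inbreeding coefficients quoted above follow the standard per-locus definition FIS = 1 - Ho/He; the sketch below spells out that definition with made-up heterozygosity values, not the study's data.

```python
def f_is(ho, he):
    """Wright's per-locus inbreeding coefficient: FIS = 1 - Ho/He.
    Positive -> heterozygote deficit; negative -> heterozygote excess."""
    return 1 - ho / he

# Hypothetical observed/expected heterozygosities, for illustration only.
print(f_is(ho=0.45, he=0.64))   # positive: fewer heterozygotes than expected
print(f_is(ho=0.70, he=0.50))   # negative: more heterozygotes than expected
```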
Abstract:
Background: The quasispecies composition of hepatitis C virus (HCV) could have important implications for viral persistence and response to interferon-based therapy. The complete NS5A region was analyzed to evaluate whether the composition of NS5A quasispecies of HCV 1a/1b is related to responsiveness to combined pegylated interferon (PEG-IFN) and ribavirin therapy. Methods: Viral RNA was isolated from serum samples collected before, during and after treatment from sustained virological responder (SVR), non-responder (NR) and end-of-treatment responder (ETR) patients. The NS5A region was amplified, cloned and sequenced. Six hundred and ninety full-length NS5A sequences were analyzed. Results: This study provides evidence that lower nucleotide diversity of the NS5A region pre-therapy is associated with viral clearance. Analysis of NR and ETR samples across time points showed that the genetic diversity of the populations tends to decrease over time. The post-therapy population of ETRs presented a greater genetic distance from baseline, probably due to the bottleneck phenomenon observed for those patients at the end of treatment. The viral effective population of those patients also showed a strong decrease after therapy. In contrast, NRs demonstrated continuous variation or stability of effective populations and genetic diversity over time, which did not seem to be related to therapy. Phylogenetic relationships among the complete NS5A sequences obtained from patients did not demonstrate clustering associated with specific response patterns. However, distinctive clustering of pre-/post-therapy sequences was observed. In addition, the evolution of quasispecies over time was subject to purifying or relaxed purifying selection. Codons 157 (P03), 182 and 440 (P42), and 62 and 404 (P44) were found to be under positive selective pressure, but this could not be related to therapy. Conclusion: These results confirm the hypothesis that a relationship exists between NS5A heterogeneity and response to therapy in patients chronically infected with hepatitis C. © 2013 Jardim et al.; licensee BioMed Central Ltd.
Abstract:
Mortality factors that act sequentially through the demographic transitions from seed to sapling may have critical effects on recruitment success. Understanding how habitat heterogeneity influences the causal factors that limit propagule establishment in natural populations is central to assessing these demographic bottlenecks and their consequences. Bamboos often influence forest structure and dynamics and are a major factor in generating landscape complexity and habitat heterogeneity in tropical forests. To understand how patch heterogeneity influences plant recruitment, we studied critical establishment stages during early recruitment of Euterpe edulis, Sloanea guianensis and Virola bicuhyba in bamboo and non-bamboo stands in the Brazilian Atlantic forest. We combined observational studies of seed rain and seedling emergence with seed addition experiments to evaluate the transition probabilities among regeneration stages within bamboo and non-bamboo stands. The relative importance of each mortality factor was evaluated by determining how the loss of propagules affected stage-specific recruitment success. Our results revealed that the seed addition treatment significantly increased seedling survivorship for all three species. E. edulis seedling survival probability increased in the addition treatment in both stand types. However, for S. guianensis and V. bicuhyba this effect depended strongly on artificially protecting the seeds, as both species experienced increased seed and seedling losses due to post-dispersal seed predators and herbivores. Propagules of all three species had a greater probability of reaching subsequent recruitment stages when protected. The recruitment of the large-seeded V. bicuhyba and E. edulis appears to be much more limited by post-dispersal factors than by dispersal limitation, whereas the small-seeded S. guianensis showed an even stronger effect of post-dispersal factors, causing recruitment collapse in some situations. We demonstrated that E. edulis, S. guianensis and V. bicuhyba are especially susceptible to predation during early compared with later establishment stages, and this early-stage mortality can be more crucial than stand differences as a determinant of successful regeneration. Among-species differences in the relative importance of dispersal vs. establishment limitation are mediated by variability in species' responses to patch heterogeneity. Thus, bamboo effects on the early recruitment of non-bamboo species are patchy and species-specific, with successional bamboo patches exerting a far-reaching influence on the heterogeneity of plant species composition and abundance. © 2012 Perspectives in Plant Ecology, Evolution and Systematics.
Abstract:
Franches-Montagnes is the only native horse breed in Switzerland; special efforts should therefore be made to ensure its survival. The objectives of this study were to characterize the structure of this population as well as its genetic variability using pedigree data, conformation traits and molecular markers. The study also examined whether this population is composed of a heavy-type and a light-type subpopulation. Extended pedigree records of 3-year-old stallions (n = 68) and mares (n = 108) were available. Evaluations of body conformation traits as well as pedigree data and molecular markers did not support the two-subpopulation hypothesis. The generation interval ranged from 7.8 to 9.3 years. The complete generation equivalent was high (>12). The number of effective ancestors varied between 18.9 and 20.1, with 50% of the genetic variability attributed to seven of them. The genetic contribution of Warmblood horses ranged from 36% to 42% and that of Coldblood horses from 4% to 6%. The average inbreeding coefficient reached 6%. The inbreeding effective population size was 114.5 when the average yearly increase of the inbreeding coefficient since 1910 was used. Our results suggest that bottleneck situations occurred because of selection of a small number of sire lines. Promotion of planned matings between less related parents is recommended in order to avoid a reduction in genetic diversity.
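The "inbreeding effective population size" quoted above is conventionally obtained from the per-generation rate of inbreeding via Ne ≈ 1/(2ΔF); the sketch below shows that relation with placeholder inputs rather than the study's actual rates.

```python
# Effective population size from the rate of inbreeding (standard relation Ne = 1/(2*dF)).
def effective_size(delta_f_per_year, generation_interval_years):
    delta_f_per_generation = delta_f_per_year * generation_interval_years
    return 1 / (2 * delta_f_per_generation)

# Placeholder inputs (hypothetical), chosen only to show the arithmetic.
print(effective_size(delta_f_per_year=0.0005, generation_interval_years=8.5))  # ~118
```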
Abstract:
This thesis develops high-performance real-time signal processing modules for direction of arrival (DOA) estimation in localization systems. It proposes highly parallel algorithms for subspace decomposition and polynomial rooting, which are otherwise traditionally implemented with sequential algorithms. The proposed algorithms address the emerging need for real-time localization in a wide range of applications. As the antenna array size increases, the complexity of the signal processing algorithms increases, making it increasingly difficult to satisfy real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that offer considerable improvement over traditional algorithms, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are two computationally complex steps and act as the bottleneck to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single instruction multiple data (SIMD) hardware or application-specific integrated circuits (ASICs), which offer a large number of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable and easy to implement. First, the thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations needed to converge to the correct singular values, thus coming closer to real-time performance. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable operating frequency. The system was developed with the objective of achieving high throughput. Various modern cores available in FPGAs were used to maximize performance, and these modules are described in detail. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial's complex dynamics were exploited to derive this polynomial rooting method. The technique exhibits parallelism and converges to the desired root within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time that the complex dynamics of the root-MUSIC polynomial have been analyzed to propose such an algorithm. In all, the thesis addresses two major bottlenecks in a direction of arrival estimation system by providing simple, high-throughput, parallel algorithms.
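As a point of reference for the polynomial-rooting step, the sketch below shows a plain, sequential Newton iteration for one root of a complex-coefficient polynomial; it is only the textbook method under generic assumptions, not the parallel root-MUSIC-specific scheme proposed in the thesis.

```python
# Textbook Newton iteration for one root of a complex-coefficient polynomial.
def horner(coeffs, z):
    """Evaluate p(z) and p'(z); coeffs are ordered from highest degree down."""
    p, dp = coeffs[0], 0j
    for c in coeffs[1:]:
        dp = dp * z + p
        p = p * z + c
    return p, dp

def newton_root(coeffs, z0, tol=1e-12, max_iter=100):
    z = z0
    for _ in range(max_iter):
        p, dp = horner(coeffs, z)
        if abs(dp) == 0:          # avoid division by a zero derivative
            break
        step = p / dp
        z -= step
        if abs(step) < tol:       # converged: last correction is negligible
            break
    return z

# Example: one root of z^2 + 1 (roots at +/- i); the starting point is a guess.
print(newton_root([1, 0, 1], z0=0.5 + 0.5j))
```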
Abstract:
Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, in which the NGS data are processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks in which the NGS sequences can be processed independently of one another. We also provide the elastream package, which supports the use of this scheme with individual analysis programs or with workflow systems. The experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
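To make the streaming idea concrete, here is a minimal producer/consumer sketch in which chunks of sequence data are processed as they arrive instead of waiting for the full transfer to finish; it is a generic illustration of the overlap between transfer and computation, not the elastream package's actual interface.

```python
# Generic overlap of "transfer" and "processing" via a bounded queue; not the elastream API.
import queue
import threading

def receiver(chunks, buf):
    """Stands in for the network transfer: puts chunks into the buffer as they arrive."""
    for chunk in chunks:
        buf.put(chunk)
    buf.put(None)                      # sentinel: transfer finished

def worker(buf, results):
    """Processes each chunk independently (the class of tasks targeted by the scheme)."""
    while True:
        chunk = buf.get()
        if chunk is None:
            break
        results.append(len(chunk))     # placeholder analysis: count bases per chunk

incoming = ["ACGT" * 1000, "GGCA" * 500, "TTAA" * 2000]   # hypothetical read batches
buf, results = queue.Queue(maxsize=4), []
t = threading.Thread(target=receiver, args=(incoming, buf))
t.start()
worker(buf, results)                   # processing proceeds while "transfer" continues
t.join()
print(results)
```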
Abstract:
In population studies, most current methods focus on identifying one outcome-related SNP at a time by testing for differences of genotype frequencies between disease and healthy groups or among different population groups. However, testing a great number of SNPs simultaneously raises a multiple-testing problem and will give false-positive results. Although this problem can be dealt with effectively through several approaches such as Bonferroni correction, permutation testing and false discovery rates, patterns of joint effects of several genes, each with a weak effect, might not be detectable. With the availability of high-throughput genotyping technology, searching for multiple scattered SNPs over the whole genome and modeling their joint effect on the target variable has become possible. Exhaustive search of all SNP subsets is computationally infeasible for millions of SNPs in a genome-wide study. Several effective feature selection methods combined with classification functions have been proposed to search for an optimal SNP subset in large data sets where the number of feature SNPs far exceeds the number of observations.

In this study, we took two steps to achieve this goal. First, we selected 1000 SNPs through an effective filter method and then performed feature selection wrapped around a classifier to identify an optimal SNP subset for predicting disease. We also developed a novel classification method, the sequential information bottleneck (sIB) method, wrapped inside different search algorithms to identify an optimal subset of SNPs for classifying the outcome variable. This new method was compared with classical linear discriminant analysis in terms of classification performance. Finally, we performed a chi-square test to examine the relationship between each SNP and disease from another point of view.

In general, our results show that filtering features using the harmonic mean of sensitivity and specificity (HMSS) through linear discriminant analysis (LDA) is better than using LDA training accuracy or mutual information in our study. Our results also demonstrate that exhaustive search of a small subset with one SNP, two SNPs or a 3-SNP subset based on the best 100 composite 2-SNPs can find an optimal subset, and that further inclusion of more SNPs through a heuristic algorithm does not always increase the performance of the SNP subsets. Although sequential forward floating selection can be applied to prevent the nesting effect of forward selection, it does not always outperform the latter, due to overfitting from evaluating more complex subset states.

Our results also indicate that HMSS, as a criterion for evaluating the classification ability of a function, can be used on imbalanced data without modifying the original dataset, in contrast to classification accuracy. Our four studies suggest that the sequential information bottleneck (sIB), a new unsupervised technique, can be adopted to predict the outcome, and its ability to detect the target status is superior to that of traditional LDA in this study.

From our results, the best test probability-HMSS for predicting CVD, stroke, CAD and psoriasis through sIB is 0.59406, 0.641815, 0.645315 and 0.678658, respectively. In terms of group prediction accuracy, the highest test accuracy of sIB for diagnosing a normal status among controls can reach 0.708999, 0.863216, 0.639918 and 0.850275, respectively, in the four studies if the test accuracy among cases is required to be not less than 0.4. On the other hand, the highest test accuracy of sIB for diagnosing a disease among cases can reach 0.748644, 0.789916, 0.705701 and 0.749436, respectively, in the four studies if the test accuracy among controls is required to be at least 0.4.

A further genome-wide association study using the chi-square test shows that no significant SNPs are detected at the cut-off level 9.09451E-08 in the Framingham Heart Study of CVD. Results from the WTCCC study detect only two significant SNPs associated with CAD. In the genome-wide study of psoriasis, most of the top 20 SNP markers with impressive classification accuracy are also significantly associated with the disease by the chi-square test at the cut-off value 1.11E-07.

Although our classification methods can achieve high accuracy in this study, complete descriptions of those classification results (95% confidence intervals or statistical tests of differences) require more cost-effective methods or an efficient computing system, neither of which can currently be accomplished in our genome-wide study. We should also note that the purpose of this study is to identify subsets of SNPs with high prediction ability, and SNPs with good discriminant power are not necessarily causal markers for the disease.
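Since HMSS is used throughout as the selection and evaluation criterion, the sketch below spells out its definition as the harmonic mean of sensitivity and specificity; the confusion-matrix counts are invented for illustration and are not taken from the study.

```python
# Harmonic mean of sensitivity and specificity (HMSS), useful for imbalanced data.
def hmss(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)    # true positive rate among cases
    specificity = tn / (tn + fp)    # true negative rate among controls
    return 2 * sensitivity * specificity / (sensitivity + specificity)

# Hypothetical confusion-matrix counts, for illustration only.
print(hmss(tp=40, fn=60, tn=850, fp=50))   # low sensitivity drags the score down
```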
Abstract:
The evolution of the television market is being led by 3DTV technology, and this tendency is expected to accelerate over the next few years according to expert forecasts. However, 3DTV delivery over broadcast networks is not currently developed enough and acts as a bottleneck for the complete deployment of the technology. Thus, increasing interest is devoted to stereo 3DTV formats compatible with current HDTV video equipment and infrastructure, as they may greatly encourage 3D acceptance. In this paper, different subsampling schemes for HDTV-compatible transmission of both progressive and interlaced stereo 3DTV are studied and compared. The frequency characteristics and preserved frequency content of each scheme are analyzed, and a simple interpolation filter is specially designed. Finally, the advantages and disadvantages of the different schemes and filters are evaluated through quality testing on several progressive and interlaced video sequences.
Abstract:
This doctoral thesis, entitled "Contribution to the analysis, design and assessment of compact antenna test ranges at millimeter wavelengths", aims to deepen the knowledge of a particular antenna measurement system: the compact range operating in the millimeter-wavelength frequency bands. The thesis has been developed at the Radiation Group (GR), an antenna laboratory belonging to the Signals, Systems and Radiocommunications department (SSR) of the Technical University of Madrid (UPM). The Radiation Group has extensive experience in antenna measurements and currently runs four facilities operating in different configurations: a Gregorian compact antenna test range, a spherical near-field range, a planar near-field range and a semi-anechoic arch system. The research work performed for this thesis contributes to the knowledge of the first of these configurations at higher frequencies, beyond the microwave region where the Radiation Group already offers customer-level performance. To reach this high-level purpose, a set of scientific tasks was carried out sequentially; these are described succinctly in the following paragraphs. The first step was a state-of-the-art review. The study of the scientific literature covered measurement practices in compact antenna test ranges together with the particularities of millimeter-wavelength technologies. The joint study of both fields of knowledge converged, where these measurement facilities are concerned, on a series of technological challenges that become serious bottlenecks at different stages: analysis, design and assessment. Second, after this overview, the focus was set on electromagnetic analysis algorithms. These formulations make it possible to study certain electromagnetic features of interest, such as the field distribution phase or the stray signals of particular structures when they interact with sources of electromagnetic waves. Properly operated, a CATR facility features collimation optics that are large in terms of wavelengths. Accordingly, the electromagnetic analysis tasks introduce a large number of mathematical unknowns that grow with frequency, following polynomial-order laws that depend on the algorithm used. In particular, the optics configuration of interest here is the reflection-type serrated-edge collimator. The analysis of these devices requires flexible handling of almost arbitrary scattering geometries, and this flexibility becomes the core of the algorithm's ability to support the subsequent design tasks. The thesis' contribution to this field consists of reaching a formulation that is powerful both in dealing with various analysis geometries and in computational terms. Two algorithms were developed. While based on the same hybridization principle, they achieve different orders of physical accuracy at different computational cost. Their CATR design capabilities were inter-compared, reaching both qualitative and quantitative conclusions on their scope. Third, interest shifted from analysis and design tasks towards range assessment. Millimeter wavelengths imply strict mechanical tolerances and fine setup adjustment. In addition, the large number of unknowns already faced in the analysis stage also appears in the in-chamber field-probing stage.

The natural decrease in the dynamic range available from semiconductor millimeter-wave sources additionally requires longer integration times at each probing point. These peculiarities greatly increase the difficulty of performing assessment processes in CATR facilities beyond microwave frequencies. The bottleneck becomes so tight that it compromises range characterization beyond a certain limit frequency, which typically lies in the lowest segment of millimeter wavelengths, whereas the value of range assessment lies, on the contrary, towards the highest segment. This thesis contributes to this technological scenario by developing quiet-zone probing techniques that achieve substantial data-reduction ratios. As a side benefit, they increase the robustness of the results to noise, which is a virtual increase in the setup's available dynamic range. Fourth, the issue of the environmental sensitivity of millimeter wavelengths was addressed. The drift of electromagnetic experiments due to the dependence of the results on the surrounding environment is well known. At millimeter wavelengths, this relegates many practices that are routine at microwave frequencies to the experimental stage. In particular, the evolution of the atmosphere within acceptable conditioning bounds results in drift phenomena that completely mask the experimental results. The contribution of this thesis in this respect consists of electrically modeling the indoor atmosphere of a CATR as a function of the environmental variables that affect the range's performance. A simple model was developed that relates high-level phenomena, such as feed-probe phase drift, to low-level quantities that are easy to sample: relative humidity and temperature. With this model, environmental compensation can be performed and chamber conditioning is effectively extended towards higher frequencies. In summary, the purpose of this thesis is to advance the knowledge of compact antenna test ranges at millimeter wavelengths. This knowledge is developed through the sequential stages of a CATR's conception, from early low-level electromagnetic analysis to the assessment of an operational facility, each stage presenting bottleneck phenomena that currently compromise antenna measurement practice at millimeter wavelengths.