10 results for Asymptotic Variance of Estimate

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance: 100.00%

Abstract:

In many industrial applications, accurate and fast surface reconstruction is essential for quality control. Variation in surface finishing parameters, such as surface roughness, can reflect defects in a manufacturing process, non-optimal product operational efficiency, and reduced life expectancy of the product. This thesis considers the reconstruction and analysis of high-frequency variation, that is, roughness, on planar surfaces. Standard roughness measures in industry are calculated from surface topography. A fast and non-contact way to obtain surface topography is to apply photometric stereo to estimate surface gradients and to reconstruct the surface by integrating the gradient fields. Alternatively, visual methods, such as statistical measures, fractal dimension, and distance transforms, can be used to characterize surface roughness directly from gray-scale images. In this thesis, the accuracy of distance transforms, statistical measures, and fractal dimension is evaluated in the estimation of surface roughness from gray-scale images and topographies. The results are contrasted with standard industry roughness measures. In distance transforms, the key idea is that distance values calculated along a highly varying surface are greater than distances calculated along a smoother surface. Statistical measures and fractal dimension are common surface roughness measures. In the experiments, the skewness and variance of the brightness distribution, fractal dimension, and distance transforms exhibited strong linear correlations with standard industry roughness measures. One of the key strengths of the photometric stereo method is the acquisition of higher-frequency variation of surfaces. In this thesis, the reconstruction of planar high-frequency varying surfaces is studied in the presence of imaging noise and blur. Two Wiener filter-based methods are proposed, of which one is optimal in the sense of surface power spectral density, given the spectral properties of the imaging noise and blur. Experiments show that the proposed methods preserve the inherent high-frequency variation in the reconstructed surfaces, whereas traditional reconstruction methods typically handle incorrect measurements by smoothing, which dampens the high-frequency variation.
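As an illustrative sketch of the photometric stereo step described above, the single-pixel example below recovers surface gradients from three intensity measurements under a Lambertian model (the light directions, albedo, and normal are assumed test values, not data from the thesis):

```python
import numpy as np

# Single-pixel Lambertian photometric stereo: with three known light
# directions L (rows) and measured intensities I, the scaled normal
# solves L @ (rho * n) = I. All numbers below are assumed test values.
L = np.array([[0.0, 0.0, 1.0],
              [0.7, 0.0, 0.714],
              [0.0, 0.7, 0.714]])           # roughly unit light directions
true_n = np.array([0.1, -0.2, 1.0])
true_n = true_n / np.linalg.norm(true_n)    # ground-truth unit normal
rho = 0.8                                   # assumed albedo
I = rho * (L @ true_n)                      # simulated intensities

b = np.linalg.solve(L, I)                   # recovers rho * n
n = b / np.linalg.norm(b)                   # unit normal (albedo removed)
p, q = -n[0] / n[2], -n[1] / n[2]           # gradients dz/dx and dz/dy
```

In a full pipeline, p and q would be computed per pixel and the gradient fields integrated into a height map, which is where the Wiener filtering of the thesis would enter.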

Abstract:

Raw measurement data does not always immediately convey useful information, but applying statistical analysis tools to the measurement data can improve the situation. Data analysis can offer benefits such as extracting meaningful insight from the dataset, basing critical decisions on the findings, and ruling out human bias through proper statistical treatment. In this thesis we analyze data from an industrial mineral processing plant with the aim of studying the possibility of forecasting the quality of the final product, given by one variable, with a model based on the other variables. For the study, mathematical tools such as Qlucore Omics Explorer (QOE) and Sparse Bayesian regression (SB) are used. Linear regression is then used to build a model based on a subset of variables that have the most significant weights in the SB model. The results obtained from QOE show that the variable representing the desired final product does not correlate with the other variables. For SB and linear regression, the results show that both models built on 1-day averaged data seriously underestimate the variance of the true data, whereas the two models built on 1-month averaged data are reliable and able to explain a larger proportion of the variability in the available data, making them suitable for prediction purposes. However, no single model fits the whole available dataset well; it is therefore proposed as future work to build piecewise nonlinear regression models if the same dataset is used, or to have the plant provide another dataset collected in a more systematic fashion than the present data.
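The variance-underestimation effect reported above can be illustrated on synthetic data: a linear model fitted to noisy observations reproduces only the explained part of the variance, so its predictions are much less spread out than the data (the signal strength and noise level below are assumed values, not the plant's data):

```python
import numpy as np

# Weak linear signal buried in noise (illustrative stand-in data)
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 0.5 * x + rng.normal(size=500)   # weak signal, substantial noise

a, b = np.polyfit(x, y, 1)           # ordinary least-squares fit
y_hat = a * x + b

# The prediction variance is only about R^2 of the data variance,
# mirroring the underestimation seen with the 1-day averaged models.
ratio = y_hat.var() / y.var()
```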

Abstract:

In the present work, the bifurcational behaviour of the solutions of the Rayleigh equation and of the corresponding spatially distributed system is analysed. The conditions for oscillatory and monotonic loss of stability are obtained. In the case of oscillatory loss of stability, the linear spectral problem is analysed. For the nonlinear problem, recurrent formulas for the general term of the asymptotic approximation of the self-oscillations are found, and the stability of the periodic mode is analysed. The Lyapunov-Schmidt method is used for the asymptotic approximation. The correlation between periodic solutions of the ODE and the PDE is investigated. The influence of diffusion on the frequency of the self-oscillations is analysed. Several numerical experiments are performed to support the theoretical findings.
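A minimal numerical check of the self-oscillations can be sketched as follows: the Rayleigh equation x'' - mu*(1 - x'^2)*x' + x = 0 is written as a first-order system and integrated with a fixed-step RK4 scheme (mu, the step size, and the horizon are assumed values, not those of the thesis):

```python
# Rayleigh equation as a first-order system (x, v = x')
def rayleigh_rhs(state, mu):
    x, v = state
    return (v, mu * (1.0 - v * v) * v - x)

def rk4_step(f, state, dt, mu):
    k1 = f(state, mu)
    k2 = f((state[0] + 0.5 * dt * k1[0], state[1] + 0.5 * dt * k1[1]), mu)
    k3 = f((state[0] + 0.5 * dt * k2[0], state[1] + 0.5 * dt * k2[1]), mu)
    k4 = f((state[0] + dt * k3[0], state[1] + dt * k3[1]), mu)
    return (state[0] + dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0,
            state[1] + dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0)

mu, dt = 0.3, 0.01
state = (0.01, 0.0)          # small perturbation of the zero equilibrium
amp = 0.0
for i in range(20000):       # integrate to t = 200
    state = rk4_step(rayleigh_rhs, state, dt, mu)
    if i >= 15000:           # record the amplitude after transients decay
        amp = max(amp, abs(state[0]))
# amp stays bounded away from zero: the equilibrium has lost stability
# and a periodic mode (self-oscillation) has taken over.
```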

Abstract:

In this thesis, the bifurcational behavior of the solutions of the Langford system is analysed. The equilibria of the Langford system are found, and their stability is discussed. The conditions for loss of stability are found, and the periodic solution of the system is approximated. We consider three types of boundary conditions for the spatially distributed Langford system: Neumann conditions, Dirichlet conditions, and Neumann conditions with the additional requirement of zero average. We apply the Lyapunov-Schmidt method to the spatially distributed Langford system to obtain an asymptotic approximation of the periodic mode, and we analyse the influence of diffusion on the behavior of the self-oscillations. We also perform numerical experiments and compare them with the analytical results.
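The Hopf-type loss of stability described above can be illustrated with the radial part of the Hopf normal form (a stand-in for the mechanism, not the Langford equations themselves): for mu > 0 the equilibrium r = 0 is unstable and the flow settles on a periodic orbit of radius sqrt(mu).

```python
# Radial flow dr/dt = mu*r - r**3 of the Hopf normal form
# (illustrative stand-in; mu, dt, and the horizon are assumed values)
def radial_flow(r, mu, dt=0.001, steps=60000):
    for _ in range(steps):
        r += dt * (mu * r - r ** 3)   # forward Euler on the radial part
    return r

mu = 0.25
r_limit = radial_flow(0.01, mu)       # small perturbation grows to the cycle
```

A small perturbation of the equilibrium grows and converges to r = sqrt(mu) = 0.5, the amplitude of the periodic mode.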

Abstract:

The main objective of this research is to estimate and characterize heterogeneous mass transfer coefficients in bench- and pilot-scale fluidized bed processes by means of computational fluid dynamics (CFD). A further objective is to benchmark the heterogeneous mass transfer coefficients predicted by fine-grid Eulerian CFD simulations against empirical data presented in the scientific literature. First, a fine-grid two-dimensional Eulerian CFD model with a solid and a gas phase is designed. The model is applied to transient two-dimensional simulations of char combustion in small-scale bubbling and turbulent fluidized beds. The same approach is used to simulate a novel fluidized bed energy conversion process developed for carbon capture: chemical looping combustion operated with a gaseous fuel. To analyze the results of the CFD simulations, two one-dimensional fluidized bed models are formulated. The single-phase and bubble-emulsion models are applied to derive the average gas-bed and interphase mass transfer coefficients, respectively. In the analysis, the effects of various fluidized bed operation parameters, such as fluidization velocity, particle and bubble diameter, reactor size, and chemical kinetics, on the heterogeneous mass transfer coefficients in the lower fluidized bed are evaluated extensively. The analysis shows that the fine-grid Eulerian CFD model can predict the heterogeneous mass transfer coefficients quantitatively with acceptable accuracy. Qualitatively, the CFD-based research of fluidized bed processes revealed several new scientific results, such as parametrical relationships. The huge variance of seven orders of magnitude within the bed Sherwood numbers presented in the literature can be explained by the change of controlling mechanisms in the overall heterogeneous mass transfer process as the process conditions vary. The research opens new process-specific insights into reactive fluidized bed processes, such as a strong mass transfer control over the heterogeneous reaction rate, a dominance of interphase mass transfer in fine-particle fluidized beds, and a strong chemical-kinetic dependence of the average gas-bed mass transfer. The obtained mass transfer coefficients can be applied in fluidized bed models used for various engineering design, reactor scale-up, and process research tasks, and they consequently improve the prediction accuracy of the performance of fluidized bed processes.
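For orientation, a mass transfer coefficient and Sherwood number of the kind discussed above can be sketched with a classic single-particle correlation (a Froessling-type relation, Sh = 2 + 0.6*Re^0.5*Sc^(1/3)); this is a textbook estimate, not the CFD-derived coefficients of the thesis, and all property values below are assumed order-of-magnitude numbers:

```python
import math

def mass_transfer_coefficient(d_p, u, rho, mu, D):
    Re = rho * u * d_p / mu          # particle Reynolds number
    Sc = mu / (rho * D)              # Schmidt number
    Sh = 2.0 + 0.6 * math.sqrt(Re) * Sc ** (1.0 / 3.0)
    return Sh * D / d_p              # mass transfer coefficient k [m/s]

# Assumed gas-phase values: 300 um particle, 0.5 m/s slip velocity
k = mass_transfer_coefficient(d_p=300e-6, u=0.5, rho=0.4, mu=4e-5, D=1e-4)
```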

Abstract:

The objective of this Master's thesis was to chart the chemically best practices in the chemical recovery process of pulp mills. No chemical knowledge related to the liquor cycle was found in the literature, so the best practices were sought through equipment-level solutions. The theoretical part also discusses the sources of non-process elements in the liquor cycle, their accumulation, and their removal from the process. The work was carried out for Stora Enso's mills located in Finland. In the experimental part, key figures were calculated and a PCA analysis was performed based on material obtained from the mills. The scope of the study is the liquor cycle from the dissolving tank to the lime kiln. The Kaukopää mill operates two liquor lines in its liquor cycle, while the other mills have only one. Based on the obtained key figures, the mills were given scores from zero to five: five points for the mill with the best value and zero points for those far from the best value. Fourteen key figures were selected; scored in this manner, the best performer was Kaukopää liquor line 1 and the worst was Kemijärvi. It is noteworthy that Kaukopää liquor line 2 ranked only fourth, even though the two liquor lines share common feed tanks for white liquor and lime mud handling. The PCA analysis showed that using monthly averages smooths the results compared with an analysis based on daily values. The analysis indicates that the liquor cycle is not fully under control at the mills, which appears as large movements of the objects in the plots. The daily-value analysis was performed only for the Oulu mill, selecting one-month periods representing both a good and a bad run. Differences between the good and the bad period can be seen in the explanatory power of the models: the model for the good period has a higher degree of explanation than that for the bad period. The placement of the objects in the plots also differs: during the good period the objects form small clusters, whereas in the bad period they are scattered into individual values and the differences between days are large.
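The PCA step used above can be sketched on synthetic stand-in data (200 "daily" observations of 5 correlated process variables; the real mill data is not available here), including the daily-value versus monthly-average comparison:

```python
import numpy as np

# Synthetic stand-in for correlated mill process variables
rng = np.random.default_rng(1)
daily = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))

def pca_explained(data):
    # Center the data, eigendecompose its covariance matrix (basic PCA),
    # and return the explained-variance ratio of each component.
    X = data - data.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]
    return eigvals / eigvals.sum()

# Monthly averaging (20-day blocks) smooths the series, as in the
# thesis' comparison of daily values and monthly averages.
monthly = daily.reshape(10, 20, 5).mean(axis=1)
ratio_daily = pca_explained(daily)
ratio_monthly = pca_explained(monthly)
```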

Abstract:

The integration of equipment cabinets consists of connecting modules and cables into model-specific assemblies. This assembly process is order-driven and carried out model by model as one-off assembly. The difficulty of the integration work and the assembly time vary strongly between models. Combined with labor turnover, this creates a challenging environment for developing production from both a quality and a capacity perspective. This thesis investigates whether these can be improved by dividing the production process into smaller stages that are easier to balance and to learn. Applying an assembly line to order-driven production requires tolerating larger deviations than a traditional paced assembly line. Because of the widely differing work times and the large model variation, the line cannot be managed as systematically as with work stages of equal length. Achieving efficient production on such a line requires the ability to plan the work sequence and to simulate it. In this thesis, simulation is used to evaluate the performance of the assembly line under stochastic demand. The model was built using product manufacturing times, which were divided model by model into all possible work tasks. These tasks were then balanced across the different workstations. The goal of the balancing was to minimize the strong dispersion in the duration of the work tasks, which the randomness of model demand amplifies. Based on the simulations, a simplified rule for forming the work sequence was created. The modeling aimed to maximize production efficiency while minimizing both work-in-progress and lead time. After the most efficient alternative was found, the suitability of the assembly line for equipment cabinet integration was assessed.
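The balancing step can be sketched with a simple greedy heuristic: assign task durations to stations so that the heaviest station load stays small (longest-processing-time-first; the task times below are made up for illustration, not real cabinet data):

```python
import heapq

def balance(task_times, n_stations):
    # Min-heap of (current load, station index)
    loads = [(0.0, i) for i in range(n_stations)]
    heapq.heapify(loads)
    assignment = [[] for _ in range(n_stations)]
    for t in sorted(task_times, reverse=True):   # longest tasks first
        load, i = heapq.heappop(loads)           # least-loaded station
        assignment[i].append(t)
        heapq.heappush(loads, (load + t, i))
    return assignment

tasks = [7, 5, 4, 4, 3, 3, 2, 1, 1]          # hypothetical task durations
stations = balance(tasks, 3)
cycle_time = max(sum(s) for s in stations)   # the line is paced by this
```

With these durations the heuristic reaches a perfectly balanced load of 10 per station; with the strongly varying, demand-dependent task times of the thesis, such a static balance is exactly what breaks down, motivating the simulation study.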

Abstract:

A trade-off between return and risk plays a central role in financial economics. The intertemporal capital asset pricing model (ICAPM) proposed by Merton (1973) provides a neoclassical theory for expected returns on risky assets. The model assumes that risk-averse investors (seeking to maximize their expected utility of lifetime consumption) demand compensation for bearing systematic market risk and the risk of unfavorable shifts in the investment opportunity set. Although the ICAPM postulates a positive relation between the conditional expected market return and its conditional variance, the empirical evidence on the sign of the risk-return trade-off is conflicting. In contrast, autocorrelation in stock returns is one of the most consistent and robust findings in empirical finance. While autocorrelation is often interpreted as a violation of market efficiency, it can also reflect factors such as market microstructure or time-varying risk premia. This doctoral thesis investigates a relation between the mixed risk-return trade-off results and autocorrelation in stock returns. The results suggest that, in the case of the US stock market, the relative contribution of the risk-return trade-off and autocorrelation in explaining the aggregate return fluctuates with volatility. This effect is then shown to be even more pronounced in the case of emerging stock markets. During high-volatility periods, expected returns can be described using rational (intertemporal) investors acting to maximize their expected utility. During low-volatility periods, market-wide persistence in returns increases, leading to a failure of traditional equilibrium-model descriptions for expected returns. Consistent with this finding, traditional models yield conflicting evidence concerning the sign of the risk-return trade-off. The changing relevance of the risk-return trade-off and autocorrelation can be explained by heterogeneous agents or, more generally, by the inadequacy of the neoclassical view on asset pricing with unboundedly rational investors and perfect market efficiency. In the latter case, the empirical results imply that the neoclassical view is valid only under certain market conditions. This offers an economic explanation as to why it has been so difficult to detect a positive trade-off between the conditional mean and variance of the aggregate stock return. The results highlight the importance, especially in the case of emerging stock markets, of accounting for both the risk-return trade-off and autocorrelation in applications that require estimates for expected returns.
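The return autocorrelation that drives the argument above can be measured with the usual sample estimator; the sketch below applies it to a synthetic AR(1) return series (the persistence parameter phi and noise scale are illustrative, not the thesis' market data):

```python
import numpy as np

# Synthetic "returns" with mild persistence (illustrative only)
rng = np.random.default_rng(7)
n, phi = 2000, 0.15
eps = rng.normal(scale=0.01, size=n)
r = np.empty(n)
r[0] = eps[0]
for t in range(1, n):
    r[t] = phi * r[t - 1] + eps[t]   # AR(1): autocorrelated returns

def autocorr(x, lag=1):
    # Sample autocorrelation at the given lag
    x = x - x.mean()
    return float((x[:-lag] * x[lag:]).sum() / (x * x).sum())

rho1 = autocorr(r)   # sits near phi for a long sample
```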

Abstract:

Identification of low-dimensional structures and main sources of variation from multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. Thus, the objective of this thesis is to develop computationally efficient and theoretically justified methods for solving such problems. Most of the thesis is based on a statistical model where ridges of the density estimated from the data are considered as relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically by using Gaussian kernels. This allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data. The statistical model and the ridge-finding methods are adapted to two different applications. The first is the extraction of curvilinear structures from noisy data mixed with background clutter. The second is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most of the earlier approaches are inadequate. Examples include identification of faults from seismic data and identification of filaments from cosmological data. Applicability of the nonlinear PCA to climate analysis and reconstruction of periodic patterns from noisy time series data are also demonstrated. Other contributions of the thesis include the development of an efficient semidefinite optimization method for embedding graphs into Euclidean space. The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but also has potential applications in graph theory and various areas of physics, chemistry, and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated when the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, which is a typical problem in visual object tracking.
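A minimal sketch of projecting a point onto a feature of a Gaussian kernel density is mean-shift ascent to a mode (modes are the 0-dimensional case of the ridges the thesis projects onto; this is the classic fixed-point iteration, not the trust region Newton method of the thesis, and the data points and bandwidth are illustrative):

```python
import math

# Two well-separated 1-D clusters and a Gaussian kernel bandwidth
data = [0.9, 1.0, 1.1, 3.9, 4.0, 4.1]
h = 0.5

def mean_shift(x, points, h, iters=100):
    # Repeatedly move x to the kernel-weighted mean of the data,
    # which climbs the kernel density estimate toward a mode.
    for _ in range(iters):
        w = [math.exp(-0.5 * ((x - p) / h) ** 2) for p in points]
        x = sum(wi * pi for wi, pi in zip(w, points)) / sum(w)
    return x

mode = mean_shift(0.5, data, h)   # climbs to the cluster near 1.0
```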