651 results for SMOOTHING SPLINE
Abstract:
The aim of this thesis is to develop the short-term demand forecasting process at VAASAN Oy, where some of the products are manufactured on the basis of demand forecasts. The impossibility of holding stock, owing to the nature of the products, the high delivery-reliability target, and the large number of forecasts required place great demands on the demand forecasting process. The theoretical part of the thesis discusses the need for demand forecasting, the uses of forecasts, and demand forecasting methods. Demand forecasting alone, however, does not lead to an outcome that is optimal for the supply chain; comprehensive demand management is needed. Demand management is a process whose goal is to balance supply chain capabilities and customer requirements with each other as efficiently as possible. The study examined forecasts produced in the company with the exponential smoothing method over a three-month period, as well as the adjustments the forecasters made to them. Based on the study, the optimal exponential smoothing constant (alpha) is 0.6. The forecasters' adjustments to the statistical forecasts improved forecast accuracy and were particularly effective in minimizing delivery shortages. In addition, the work produced a number of tools for the forecasters that ease and automate daily routines.
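The abstract reports an optimal smoothing constant of 0.6. A minimal illustrative sketch of simple exponential smoothing with that alpha (not the company's actual implementation; the demand series below is hypothetical):

```python
def exponential_smoothing(series, alpha=0.6):
    """Simple exponential smoothing: each forecast is a weighted blend of
    the latest observation and the previous forecast."""
    forecasts = [series[0]]  # initialize with the first observation
    for demand in series[1:]:
        forecasts.append(alpha * demand + (1 - alpha) * forecasts[-1])
    return forecasts

# Hypothetical weekly demand figures for one product
weekly_demand = [120, 135, 128, 150, 142, 160]
print(exponential_smoothing(weekly_demand, alpha=0.6))
```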
Abstract:
This thesis examined fatigue strength design values for welded plate joints. The design values for the fatigue strength of the welds were determined with a 2D FEM program applying linear elastic fracture mechanics. From the results of the fracture mechanics calculations, FAT classes corresponding to the nominal stress fatigue design method were derived for different joint geometries and loading types, taking into account the structural stress perpendicular to the weld. The geometries of the studied joints generally deviated from the tabulated cases included in design standards and guidelines. The calculations took into account the flank angle between the weld and the base material, the weld toe radii, and incomplete weld penetration. Variation in loading type was studied by changing the bending proportion of the structural stress and the relative magnitudes of the crossing loads in load-carrying X-joints. Fatigue strengths were determined for the load-case-specific membrane and bending stresses as well as for the averages of these stress distributions. The FAT classes obtained in this work can be used for corresponding geometries and loadings and, by interpolation, also for intermediate values between the results. The methods used in this work can improve the accuracy of the nominal stress design method and extend it to joints outside the tabulated cases. The results present FAT classes for T-, X- and butt joints and for their different load combinations.
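For orientation (standard nominal-stress fatigue design convention, not a result of this thesis): a FAT class denotes the characteristic stress range at two million cycles, and the S-N curve is an inverse power law, so the life at another stress range follows as

```latex
% Conventional S-N relation for welded joints (slope m, typically m = 3)
% \Delta\sigma_{\mathrm{FAT}} = characteristic stress range at N = 2\times 10^{6} cycles
\Delta\sigma^{m}\, N = \Delta\sigma_{\mathrm{FAT}}^{m}\cdot 2\times 10^{6}
\quad\Longrightarrow\quad
N = 2\times 10^{6}\left(\frac{\Delta\sigma_{\mathrm{FAT}}}{\Delta\sigma}\right)^{m}
```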
Abstract:
Mass-produced paper electronics (large area organic printed electronics on paper-based substrates, “throw-away electronics”) has the potential to introduce the use of flexible electronic applications in everyday life. While paper manufacturing and printing have a long history, they were not developed with electronic applications in mind. Modifications to paper substrates and printing processes are required in order to obtain working electronic devices. This should be done while maintaining the high throughput of conventional printing techniques and the low cost and recyclability of paper. An understanding of the interactions between the functional materials, the printing process and the substrate is required for successful manufacturing of advanced devices on paper. Based on this understanding, a recyclable, multilayer-coated paper-based substrate that combines adequate barrier and printability properties for printed electronics and sensor applications was developed in this work. In this multilayer structure, a thin top-coating consisting of mineral pigments is coated on top of a dispersion-coated barrier layer. The top-coating provides well-controlled sorption properties through controlled thickness and porosity, thus making it possible to optimize the printability of functional materials. The penetration of ink solvents and functional materials stops at the barrier layer, which not only improves the performance of the functional material but also eliminates potential fiber swelling and de-bonding that can occur when the solvents are allowed to penetrate into the base paper. The multilayer-coated paper under consideration in the current work consists of a pre-coating and a smoothing layer on which the barrier layer is deposited. Coated fine paper may also be used directly as base paper, ensuring a smooth base for the barrier layer. The top layer is thin and smooth, consisting of mineral pigments such as kaolin, precipitated calcium carbonate, silica or blends of these. All the materials in the coating structure have been chosen in order to maintain the recyclability and sustainability of the substrate. The substrate can be coated in steps, sequentially layer by layer, which requires detailed understanding and tuning of the wetting properties and topography of the barrier layer versus the surface tension of the top-coating. A cost-competitive method for industrial-scale production is the curtain coating technique, which allows extremely thin top-coatings to be applied simultaneously with a closed and sealed barrier layer. An understanding of the interactions of functional materials formulated and applied on paper as inks makes it possible to create a paper-based substrate that can be used to manufacture printed electronics-based devices and sensors on paper. The multitude of functional materials and their complex interactions make it challenging to draw general conclusions in this topic area. Inevitably, the results become partially specific to the device chosen and the materials needed in its manufacturing. Based on the results, it is clear that for inks based on dissolved or small-size functional materials, a barrier layer is beneficial and ensures the functionality of the printed material in a device. The required active barrier lifetime depends on the solvents or analytes used and their volatility. High aspect ratio mineral pigments, which create tortuous pathways and physical barriers within the barrier layer, limit the penetration of solvents used in functional inks.
The surface pore volume and pore size can be optimized for a given printing process and ink through the choice of pigment type and coating layer thickness. However, when manufacturing multilayer functional devices, such as transistors, which consist of several printed layers, compromises have to be made. For example, while a thick and porous top-coating is preferable for printing source and drain electrodes with a silver particle ink, a thinner and less absorbing surface is required to form a functional semiconducting layer. With the multilayer coating structure concept developed in this work, it was possible to make the paper substrate suitable for printed functionality. The possibility of printing functional devices such as transistors, sensors and pixels in a roll-to-roll process on paper is demonstrated, which may enable the use of paper in disposable “one-time use” or “throwaway” electronics and sensors, such as lab-on-strip devices for various analyses, consumer packages equipped with product quality sensors, or remote tracking devices.
Abstract:
The purpose of this thesis was to study the design of demand forecasting processes and the management of demand. The literature review surveyed different demand management processes as well as forecasting methods and techniques. The role of the bullwhip effect in the supply chain was also identified, together with ways to manage it through information sharing. The empirical part of the study first describes the current situation and the challenges in the case company. A new way to handle demand is then introduced, based on target budget creation, and it is shown how information sharing for 5 products and a few customers would bring benefits to the company. A new S&OP process, and the organization for it, were also created within this study.
Abstract:
In this thesis, the suitability of different trackers for finger tracking in high-speed videos was studied. Tracked finger trajectories from the videos were post-processed and analysed using various filtering and smoothing methods. Position derivatives of the trajectories, i.e. speed and acceleration, were extracted for the purposes of hand motion analysis. Overall, two methods, Kernelized Correlation Filters and Spatio-Temporal Context Learning tracking, performed better than the others in the tests. Both achieved high accuracy on the selected high-speed videos and also allowed real-time processing, being able to process over 500 frames per second. In addition, the results showed that different filtering methods can be applied to produce more appropriate velocity and acceleration curves calculated from the tracking data. Local Regression filtering and the Unscented Kalman Smoother gave the best results in the tests. Furthermore, the results show that the tracking and filtering methods are suitable for high-speed hand tracking and trajectory-data post-processing.
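As an illustrative sketch of the post-processing step (not the thesis code): positions tracked at a known high frame rate can be smoothed and then differentiated to obtain speed and acceleration. Savitzky-Golay filtering is used here as a stand-in for the local-regression-style smoothing mentioned above; the trajectory, frame rate, and filter settings are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter

fps = 500.0                      # assumed high-speed camera frame rate
t = np.arange(0, 1, 1 / fps)     # one second of footage
# Synthetic fingertip x/y trajectory with measurement noise
x = 100 * np.sin(2 * np.pi * t) + np.random.normal(0, 0.5, t.size)
y = 50 * t**2 + np.random.normal(0, 0.5, t.size)

# Smooth the raw tracker output before differentiating
x_s = savgol_filter(x, window_length=31, polyorder=3)
y_s = savgol_filter(y, window_length=31, polyorder=3)

# Finite-difference derivatives give velocity and acceleration
vx, vy = np.gradient(x_s, 1 / fps), np.gradient(y_s, 1 / fps)
speed = np.hypot(vx, vy)
ax, ay = np.gradient(vx, 1 / fps), np.gradient(vy, 1 / fps)
acceleration = np.hypot(ax, ay)
```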
Abstract:
The break point of the curve of blood lactate vs exercise load has been called anaerobic threshold (AT) and is considered to be an important indicator of endurance exercise capacity in human subjects. There are few studies of AT determination in animals. We describe a protocol for AT determination by the "lactate minimum test" in rats during swimming exercise. The test is based on the premise that during an incremental exercise test, and after a bout of maximal exercise, blood lactate decreases to a minimum and then increases again. This minimum value indicates the intensity of the AT. Adult male (90 days) Wistar rats adapted to swimming for 2 weeks were used. The initial state of lactic acidosis was obtained by making the animals jump into the water and swim while carrying a load equivalent to 50% of body weight for 6 min (30-s exercise interrupted by a 30-s rest). After a 9-min rest, blood was collected and the incremental swimming test was started. The test consisted of swimming while supporting loads of 4.5, 5.0, 5.5, 6.0 and 7.0% of body weight. Each exercise load lasted 5 min and was followed by a 30-s rest during which blood samples were taken. The blood lactate minimum was determined from a zero-gradient tangent to a spline function fitting the blood lactate vs workload curve. AT was estimated to be 4.95 ± 0.10% of body weight while interpolated blood lactate was 7.17 ± 0.16 mmol/l. These results suggest the application of AT determination in animal studies concerning metabolism during exercise.
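A minimal sketch of the curve-fitting step described above, using hypothetical lactate readings: a smoothing spline is fitted to blood lactate versus workload and its minimum (the zero-gradient point) is taken as the AT estimate.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical blood lactate (mmol/l) measured at each swimming load (% body weight)
loads = np.array([4.5, 5.0, 5.5, 6.0, 7.0])
lactate = np.array([7.9, 7.2, 7.3, 7.8, 9.1])

# Fit a smoothing spline to the blood lactate vs workload curve
spline = UnivariateSpline(loads, lactate, k=3, s=0.05)

# Locate the zero-gradient point (lactate minimum) on a dense grid
grid = np.linspace(loads.min(), loads.max(), 1000)
fitted = spline(grid)
at_load = grid[np.argmin(fitted)]
at_lactate = fitted.min()
print(f"Estimated anaerobic threshold: {at_load:.2f}% body weight, "
      f"blood lactate {at_lactate:.2f} mmol/l")
```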
Abstract:
An increase in daily mortality from myocardial infarction has been observed in association with meteorological factors and air pollution in several cities in the world, mainly in the northern hemisphere. The objective of the present study was to analyze the independent effects of environmental variables on daily counts of death from myocardial infarction in a subtropical region in South America. We used the robust Poisson regression to investigate associations between weather (temperature, humidity and barometric pressure), air pollution (sulfur dioxide, carbon monoxide, and inhalable particulate), and the daily death counts attributed to myocardial infarction in the city of São Paulo in Brazil, where 12,007 fatal events were observed from 1996 to 1998. The model was adjusted in a linear fashion for relative humidity and day-of-week, while nonparametric smoothing factors were used for seasonal trend and temperature. We found a significant association of daily temperature with deaths due to myocardial infarction (P < 0.001), with the lowest mortality being observed at temperatures between 21.6 and 22.6ºC. Relative humidity appeared to exert a protective effect. Sulfur dioxide concentrations correlated linearly with myocardial infarction deaths, increasing the number of fatal events by 3.4% (relative risk of 1.03; 95% confidence interval = 1.02-1.05) for each 10 µg/m³ increase. In conclusion, this study provides evidence of important associations between daily temperature and air pollution and mortality from myocardial infarction in a subtropical region, even after a comprehensive control for confounding factors.
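A hedged sketch of the kind of model described above (a plain Poisson GLM rather than the authors' robust specification): daily death counts are regressed on a spline smoother for temperature and for long-term trend, with linear terms for humidity and day of week. The column names, file name, and degrees of freedom are assumptions.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Assumed daily data: deaths, temperature, humidity, day_of_week, time (day index)
df = pd.read_csv("sao_paulo_mi_daily.csv")  # hypothetical file

# Poisson regression with cubic regression spline terms (patsy's cr) for the
# nonparametric components and linear/categorical terms for the rest
model = smf.glm(
    "deaths ~ cr(temperature, df=4) + cr(time, df=7) + humidity + C(day_of_week)",
    data=df,
    family=sm.families.Poisson(),
).fit()
print(model.summary())
```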
Abstract:
Time series analysis can be categorized into three different approaches: classical, Box-Jenkins, and state space. The classical approach provides the foundation for the analysis, while the Box-Jenkins approach is an improvement on the classical approach and deals with stationary time series. The state space approach allows time-variant factors and covers a broader area of time series analysis. This thesis focuses on the parameter identifiability of different parameter estimation methods, such as LSQ, Yule-Walker, and MLE, which are used in the above time series analysis approaches. The Kalman filter method and smoothing techniques are also integrated with the state space approach and the MLE method to estimate parameters while allowing them to change over time. Parameter estimation is carried out repeatedly and combined with MCMC in order to inspect how well the different estimation methods can identify the optimal model parameters. Identification is performed in both a probabilistic and a general sense, and the results are compared in order to study and present identifiability in a more informative way.
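As an illustrative sketch under simulated data (not the thesis code): Yule-Walker and maximum likelihood estimates of an AR(2) model can be compared side by side, where the MLE route in statsmodels runs through the state-space/Kalman-filter machinery mentioned above.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker
from statsmodels.tsa.arima.model import ARIMA

# Simulate an AR(2) process as stand-in data
rng = np.random.default_rng(0)
n, phi = 1000, np.array([0.6, -0.3])
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi[0] * y[t - 1] + phi[1] * y[t - 2] + rng.normal()

# Yule-Walker estimates (moment conditions on the autocovariances)
rho, sigma = yule_walker(y, order=2, method="mle")
print("Yule-Walker AR coefficients:", rho)

# Maximum likelihood via the state-space representation (Kalman filter)
res = ARIMA(y, order=(2, 0, 0), trend="n").fit()
print("State-space MLE parameters:", res.params)
```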
Abstract:
Type 2 diabetes increases the risk of cardiovascular mortality, and these patients, even without previous myocardial infarction, run a risk of fatal coronary heart disease similar to that of non-diabetic patients surviving myocardial infarction. There is evidence showing that particulate matter air pollution is associated with increases in cardiopulmonary morbidity and mortality. The present study was carried out to evaluate the effect of diabetes mellitus on the association of air pollution with cardiovascular emergency room visits in a tertiary referral hospital in the city of São Paulo. Using a time-series approach and adopting generalized linear Poisson regression models, we assessed the effect of daily variations in PM10, CO, NO2, SO2, and O3 on the daily number of emergency room visits for cardiovascular diseases in diabetic and non-diabetic patients from 2001 to 2003. A semi-parametric smoother (natural spline) was adopted to control for long-term and seasonal trends, and linear terms were used for weather variables. In this period, 45,000 cardiovascular emergency room visits were registered. An interquartile-range increase (8.0 µg/m³) in the 2-day moving average of SO2 was associated with 7.0% (95%CI: 4.0-11.0) and 20.0% (95%CI: 5.0-44.0) increases in cardiovascular disease emergency room visits by the non-diabetic and diabetic groups, respectively. These data indicate that air pollution causes an increase in cardiovascular emergency room visits, and that diabetic patients are extremely susceptible to the adverse effects of air pollution on their health conditions.
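For clarity on how such percentage increases are read off a Poisson model (a generic note, not the paper's derivation): with coefficient beta for a pollutant, the relative risk for an increase of size delta-x is

```latex
\mathrm{RR} = e^{\beta\,\Delta x},\qquad
\%\ \text{increase} = \bigl(e^{\beta\,\Delta x}-1\bigr)\times 100 .
% E.g., a 7\% increase for \Delta x = 8.0\ \mu\mathrm{g/m^3} of SO_2 implies
% \beta \approx \ln(1.07)/8.0 \approx 0.0085 per \mu\mathrm{g/m^3}.
```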
Abstract:
The combination of near-infrared (NIR) spectroscopy and multivariate calibration (partial least squares - PLS) for the determination of the total protein content of green coffee samples was investigated. The total protein contents were first determined using the Kjeldahl reference method, and regression models were then built from the near-infrared spectra of the green coffee samples. A total of 159 spectra of the green coffee samples were collected using a diffuse reflectance accessory over the spectral range of 4500 to 10000 cm-1. The original NIR spectra were subjected to different transformations and mathematical pre-treatments, such as the Kubelka-Munk transformation, multiplicative scatter correction (MSC), smoothing (SPLINE), first derivative, moving average, and variance scaling of the data. The proposed analytical method allowed direct, non-destructive determination, providing fast results without the consumption of chemical reagents, thereby preserving the environment. The proposed method gave results with good predictive ability for total protein content, with mean errors below 6.7%.
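A minimal sketch of the chemometric workflow described above (not the authors' code): spectra are pre-treated (here Savitzky-Golay smoothing with a first derivative stands in for the smoothing and derivative steps) and a PLS regression is fitted against the Kjeldahl reference values. File names, the component count, and filter settings are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: NIR spectra (159 samples x wavenumber channels), y: Kjeldahl protein contents
X = np.load("coffee_nir_spectra.npy")       # hypothetical file
y = np.load("coffee_protein_kjeldahl.npy")  # hypothetical file

# Smoothing plus first derivative as spectral pre-treatment
X_prep = savgol_filter(X, window_length=15, polyorder=2, deriv=1, axis=1)

# PLS regression; the number of latent variables would normally be tuned by CV
pls = PLSRegression(n_components=8)
y_cv = cross_val_predict(pls, X_prep, y, cv=10).ravel()
rel_err = np.mean(np.abs(y_cv - y) / y) * 100
print(f"Mean relative cross-validation error: {rel_err:.1f}%")
```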
Abstract:
Near-infrared (NIR) spectroscopy was used to determine the moisture content of green coffee samples. Regression models were built using the partial least squares (PLS) method with different data pre-treatments and 157 NIR spectra collected from coffee samples using a diffuse reflectance accessory in the region between 4500 and 10000 cm-1. The original spectra underwent different transformations and mathematical pre-treatments, such as the Kubelka-Munk transformation, multiplicative scatter correction (MSC), SPLINE smoothing and moving average, and the data were scaled by variance. The regression model allowed the moisture content of the green coffee samples to be determined with a standard error of calibration (SEC) of 0.569 g/100 g, a standard error of validation of 0.298 g/100 g, correlation coefficients (r) of 0.712 and 0.818 for calibration and validation, respectively, and a mean relative error of 4.1% for the validation samples.
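For reference (common chemometric conventions; the exact degrees-of-freedom corrections vary between texts, so this is a sketch rather than the authors' formula), the calibration and validation errors reported above are typically computed as

```latex
\mathrm{SEC}=\sqrt{\frac{1}{n_c}\sum_{i=1}^{n_c}\bigl(\hat{y}_i-y_i\bigr)^{2}},
\qquad
\mathrm{SEP}=\sqrt{\frac{1}{n_v}\sum_{i=1}^{n_v}\bigl(\hat{y}_i-y_i\bigr)^{2}}
% n_c and n_v are the numbers of calibration and validation samples; some texts
% subtract the number of PLS factors from n_c in the SEC denominator.
```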
Abstract:
This paper aims at evaluating the conduct of monetary policy after the adoption of inflation targeting. The formation of the Selic rate is modeled by estimating a reaction function of the BCB. The results show an excessive degree of interest rate smoothing and a high equilibrium interest rate. This evidence supports the belief that the formation of the Selic rate is governed by conservative behavior. The conservative conduct of monetary policy is related to two distinct features of the BCB's reaction function: i) the great weight of the autoregressive components; and, chiefly, ii) a very high level of the equilibrium interest rate. The main conclusion is that, all else remaining unchanged, the interest rate would hardly be reduced in a satisfactory way. Massive and chronic deflation would be needed for the Selic rate to reach a reasonable level, closer to rates in the rest of the world. This points to the need for a debate on the adequacy of the current stabilization strategy.
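A generic reaction function of the kind typically estimated in this literature (a schematic, not the exact specification estimated in the paper) makes the roles of the smoothing and equilibrium terms explicit:

```latex
i_t = \rho\, i_{t-1} + (1-\rho)\bigl[\, i^{*} + \alpha\,(\pi_t - \pi^{*}) + \beta\, y_t \bigr] + \varepsilon_t
% \rho: degree of interest rate smoothing (weight on the autoregressive component)
% i^{*}: equilibrium nominal interest rate, \pi^{*}: inflation target, y_t: output gap
```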
Abstract:
The focus of the paper is the nonparametric estimation of an instrumental regression function P defined by conditional moment restrictions stemming from a structural econometric model: E[Y-P(Z)|W]=0, involving endogenous variables Y and Z and instruments W. The function P is the solution of an ill-posed inverse problem, and we propose an estimation procedure based on Tikhonov regularization. The paper analyses identification and overidentification of this model and presents asymptotic properties of the estimated nonparametric instrumental regression function.
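In operator notation (standard in this literature, stated here as a sketch rather than as the paper's exact estimator), the moment condition defines an ill-posed equation T P = r, with (T\phi)(w) = E[\phi(Z)|W=w] and r(w) = E[Y|W=w], and the Tikhonov-regularized solution is

```latex
\hat{P}_{\alpha} = \bigl(\alpha I + \hat{T}^{*}\hat{T}\bigr)^{-1}\hat{T}^{*}\hat{r},
\qquad \alpha > 0,
% where \hat{T}, \hat{T}^{*} and \hat{r} are nonparametric (e.g., kernel) estimates of
% the conditional expectation operator, its adjoint, and E[Y|W], and the
% regularization parameter \alpha tends to zero at a suitable rate with the sample size.
```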
Abstract:
We characterize the solution to a model of consumption smoothing using financing under non-commitment and savings. We show that, under certain conditions, these two different instruments complement each other perfectly. If the rate of time preference is equal to the interest rate on savings, perfect smoothing can be achieved in finite time. We also show that, when random revenues are generated by periodic investments in capital through a concave production function, the level of smoothing achieved through financial contracts can influence the productive investment efficiency. As long as financial contracts cannot achieve perfect smoothing, productive investment will be used as a complementary smoothing device.
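The perfect-smoothing claim can be read off the standard consumption Euler equation (a textbook condition, included here for clarity rather than taken from the paper):

```latex
u'(c_t) = \beta\,(1+r)\,\mathbb{E}_t\!\left[u'(c_{t+1})\right]
% With \beta(1+r)=1 (rate of time preference equal to the interest rate) and a strictly
% concave u, marginal utility is a martingale; in a deterministic setting this implies
% constant consumption, i.e., perfect smoothing.
```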
Abstract:
This paper develops a model where the value of the monetary policy instrument is selected by a heterogeneous committee engaged in a dynamic voting game. Committee members differ in their institutional power and, in certain states of nature, they also differ in their preferred instrument value. Preference heterogeneity and concern for the future interact to generate decisions that are dynamically inefficient and inertial around the previously-agreed instrument value. This model endogenously generates autocorrelation in the policy variable and provides an explanation for the empirical observation that the nominal interest rate under the central bank's control is infrequently adjusted.