550 results for SMOOTHING SPLINES


Relevance:

10.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 68T01, 62H30, 32C09.

Relevance:

10.00%

Publisher:

Abstract:

Technology changes rapidly, continuously providing more computing options and making economic and other transactions easier. However, the introduction of new technology "pushes" old Information and Communication Technology (ICT) products out of use. E-waste is defined as the quantity of ICT products no longer in use; it is a bivariate function of the quantities sold and of the probability that a given quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions in order to compute obsolete computer quantities. To provide robust forecasts, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend model, (v) Level model, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing, was applied, selecting for each country the model that gave the best in-sample results in terms of minimum error indices (Mean Absolute Error and Mean Square Error). As new technology does not diffuse through all regions of the world at the same speed, owing to different socio-economic factors, the lifespan distribution, which gives the probability that a certain quantity of computers will be considered obsolete, is not adequately modeled in the literature. The time horizon for the forecasted quantities is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, owing to decreasing computer lifespans and increasing sales.
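
A minimal sketch of the model-selection idea described above, with hypothetical sales figures and only two of the listed candidates (simple exponential smoothing and a refitted linear trend), scored by in-sample MAE and MSE; it is not the paper's implementation.

```python
# Illustrative sketch (not the paper's code): in-sample comparison of two simple
# forecasting models on a hypothetical annual PC-sales series, scored by MAE and MSE.
import numpy as np

sales = np.array([12.0, 14.5, 17.2, 21.0, 24.8, 27.5, 29.1, 30.0])  # hypothetical units (millions)

def ses_fit(y, alpha):
    """One-step-ahead fits from simple exponential smoothing."""
    fit = np.empty_like(y)
    level = y[0]
    for t, obs in enumerate(y):
        fit[t] = level                               # forecast made before seeing y[t]
        level = alpha * obs + (1 - alpha) * level
    return fit

def trend_fit(y):
    """One-step-ahead fits from a linear trend refitted on the available history."""
    fit = np.full_like(y, np.nan)
    for t in range(2, len(y)):
        slope, intercept = np.polyfit(np.arange(t), y[:t], 1)
        fit[t] = intercept + slope * t
    return fit

for name, fit in [("SES(alpha=0.6)", ses_fit(sales, 0.6)), ("Trend", trend_fit(sales))]:
    err = sales - fit
    err = err[~np.isnan(err)]
    print(name, "MAE=%.3f" % np.mean(np.abs(err)), "MSE=%.3f" % np.mean(err ** 2))
```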

Relevance:

10.00%

Publisher:

Abstract:

This paper analyses the most important mechanisms of the Hungarian economy using a medium-sized quarterly macroeconomic model developed jointly by the Economic Policy Department of the Ministry of Finance and the Institute of Economics of the Hungarian Academy of Sciences. After introducing the fundamental principles of the modelling and the building blocks of the model, the authors present, within a scenario analysis, the effects of the main factors behind the macroeconomic and budgetary processes. The sources of uncertainty, defined in a broad sense, are categorized in three groups: changes in the external environment (e.g. the exchange rate), uncertainties in the behaviour of economic agents (e.g. in the speed of wage adjustment or the extent of consumption smoothing), and economic policy decisions (e.g. an increase in public sector wages). The macroeconomic consequences of these uncertainties are shown not to be independent of each other; for instance, the effects of an exchange rate shock are influenced by the speed of wage adjustment.

Relevance:

10.00%

Publisher:

Abstract:

The accurate and reliable estimation of travel time based on point detector data is needed to support Intelligent Transportation System (ITS) applications. It has been found that the quality of travel time estimation is a function of the method used in the estimation and varies for different traffic conditions. In this study, two hybrid on-line travel time estimation models, and their corresponding off-line methods, were developed to achieve better estimation performance under various traffic conditions, including recurrent congestion and incidents. The first model combines the Mid-Point method, which is a speed-based method, with a traffic flow-based method. The second model integrates two speed-based methods: the Mid-Point method and the Minimum Speed method. In both models, the switch between travel time estimation methods is based on the congestion level and queue status automatically identified by clustering analysis. During incident conditions with rapidly changing queue lengths, shock wave analysis-based refinements are applied for on-line estimation to capture the fast queue propagation and recovery. Travel time estimates obtained from existing speed-based methods, traffic flow-based methods, and the models developed were tested using both simulation and real-world data. The results indicate that all tested methods performed at an acceptable level during periods of low congestion. However, their performances vary with an increase in congestion. Comparisons with other estimation methods also show that the developed hybrid models perform well in all cases. Further comparisons between the on-line and off-line travel time estimation methods reveal that off-line methods perform significantly better only during fast-changing congested conditions, such as during incidents. The impacts of major influential factors on the performance of travel time estimation, including data preprocessing procedures, detector errors, detector spacing, frequency of travel time updates to traveler information devices, travel time link length, and posted travel time range, were investigated in this study. The results show that these factors have more significant impacts on the estimation accuracy and reliability under congested conditions than during uncongested conditions. For the incident conditions, the estimation quality improves with the use of a short rolling period for data smoothing, more accurate detector data, and frequent travel time updates.
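
The switching logic below is only a rough illustration of the hybrid idea, with hypothetical detector positions, speeds and a fixed 35 mph threshold standing in for the clustering-based congestion identification described in the study; the shock wave refinements are omitted.

```python
# Illustrative sketch, not the study's implementation: Mid-Point travel time over a
# corridor of point detectors, with a simple congestion-based switch to a minimum-speed
# estimate. Detector positions, speeds and the threshold are hypothetical.
import numpy as np

positions = np.array([0.0, 0.5, 1.1, 1.8, 2.5])     # detector locations along the corridor (miles)
speeds = np.array([58.0, 45.0, 22.0, 18.0, 40.0])    # spot speeds at the detectors (mph)

def midpoint_travel_time(pos, spd):
    """Mid-Point method: each detector's speed is applied from the midpoint with the
    previous detector to the midpoint with the next one (half-segments at the ends)."""
    mids = (pos[:-1] + pos[1:]) / 2.0
    bounds = np.concatenate(([pos[0]], mids, [pos[-1]]))
    lengths = np.diff(bounds)                        # miles covered at each detector's speed
    return float(np.sum(lengths / spd))              # hours

def min_speed_travel_time(pos, spd):
    """Minimum Speed method (simplified): the whole corridor at the lowest detected speed."""
    return (pos[-1] - pos[0]) / float(spd.min())

congested = speeds.mean() < 35.0                     # crude stand-in for the clustering-based switch
tt = min_speed_travel_time(positions, speeds) if congested else midpoint_travel_time(positions, speeds)
print("estimated corridor travel time: %.1f minutes" % (tt * 60.0))
```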

Relevance:

10.00%

Publisher:

Abstract:

With the growing demand for data traffic in third-generation (3G) networks, mobile operators have tried to focus infrastructure resources on the places where they identify the greatest need. These targeted investments aim to maintain quality of service, especially in dense urban areas. The WCDMA-HSPA parameters Rx Power, RSCP (Received Signal Code Power), Ec/Io (energy per chip over interference) and transmission rate (throughput) at the physical layer are analyzed. In this work, time-series prediction is performed for an HSPA network. The parameter values were collected on a fully operational network through a drive test in Natal (RN), a state capital in northeastern Brazil. The models used for time-series prediction were Simple Exponential Smoothing, Holt, additive Holt-Winters and multiplicative Holt-Winters. The objective of the predictions is to determine which model generates the best forecasts of the WCDMA-HSPA network parameters.
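
A brief sketch, assuming a synthetic throughput series and the statsmodels implementations of the four candidate models, of how the in-sample comparison could be carried out; the data and seasonal period are illustrative, not the collected drive-test measurements.

```python
# A minimal sketch (assumed data, statsmodels-based models, not the authors' code):
# fit the four candidate models to a throughput series and compare in-sample MSE.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import SimpleExpSmoothing, Holt, ExponentialSmoothing

rng = np.random.default_rng(0)
t = np.arange(120)
y = pd.Series(2000 + 5 * t + 300 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 80, t.size))

candidates = {
    "SES": SimpleExpSmoothing(y).fit(),
    "Holt": Holt(y).fit(),
    "Holt-Winters additive": ExponentialSmoothing(
        y, trend="add", seasonal="add", seasonal_periods=24).fit(),
    "Holt-Winters multiplicative": ExponentialSmoothing(
        y, trend="add", seasonal="mul", seasonal_periods=24).fit(),
}

for name, res in candidates.items():
    mse = np.mean((y - res.fittedvalues) ** 2)
    print(f"{name:30s} in-sample MSE = {mse:10.1f}")

best = min(candidates, key=lambda k: np.mean((y - candidates[k].fittedvalues) ** 2))
print("best model:", best, "-> 10-step forecast:", candidates[best].forecast(10).round(1).tolist())
```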

Relevance:

10.00%

Publisher:

Abstract:

This work was developed with the objective of proposing a simple, fast and versatile methodological routine using near-infrared (NIR) spectroscopy combined with multivariate analysis for determining the ash, moisture, protein and total lipid contents of the gray shrimp (Litopenaeus vannamei). These are conventionally determined gravimetrically after ashing at 550 °C (ash), gravimetrically after drying at 105 °C (moisture), gravimetrically after Soxhlet extraction (lipids), and volumetrically after Kjeldahl digestion and distillation (protein). First, spectra were collected from 63 samples of processed, boiled shrimp of the species Litopenaeus vannamei. Then, the determinations by the conventional standard methods were carried out. The mean-centered spectra underwent multiplicative scatter correction, 15-point Savitzky-Golay smoothing and first-derivative transformation; the noisy region was eliminated, and the working range was 1100.36 to 2502.37 nm. The PLS model for predicting ash showed R 0.9471, RMSEC 0.1017 and RMSEP 0.1548; for moisture, R 0.9241, RMSEC 2.5483 and RMSEP 4.1979; for protein, R 0.9201, RMSEC 1.9391 and RMSEP 2.7066; and for lipids, R 0.8801, RMSEC 0.2827 and RMSEP 0.2329. The relative errors between the reference methods and NIR were therefore small and satisfactory. These results are an excellent indication that NIR can be used for these analyses, which is quite advantageous: the conventional techniques are time consuming, consume large amounts of reagents and involve several professionals, whereas, once the methodology is validated, NIR reduces the analysis time to a few minutes, saves reagents, generates no waste and is non-destructive.
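
A short sketch of the preprocessing and modelling chain described above (multiplicative scatter correction, 15-point Savitzky-Golay smoothing, first derivative, PLS), using randomly generated spectra and reference values rather than the study's data; the number of wavelengths, components and the train/validation split are assumptions.

```python
# Illustrative sketch under assumed data shapes (not the study's code): MSC,
# Savitzky-Golay smoothing with first derivative, then a PLS regression model.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(63, 700))                     # 63 spectra x 700 hypothetical wavelengths
y = rng.normal(loc=2.0, scale=0.3, size=63)        # reference values, e.g. ash content (%)

def msc(spectra):
    """Multiplicative scatter correction against the mean spectrum."""
    ref = spectra.mean(axis=0)
    corrected = np.empty_like(spectra)
    for i, s in enumerate(spectra):
        slope, intercept = np.polyfit(ref, s, 1)   # regress each spectrum on the reference
        corrected[i] = (s - intercept) / slope
    return corrected

X_pre = savgol_filter(msc(X), window_length=15, polyorder=2, deriv=1, axis=1)

X_cal, X_val, y_cal, y_val = train_test_split(X_pre, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=8).fit(X_cal, y_cal)

rmsec = np.sqrt(np.mean((y_cal - pls.predict(X_cal).ravel()) ** 2))
rmsep = np.sqrt(np.mean((y_val - pls.predict(X_val).ravel()) ** 2))
print(f"RMSEC = {rmsec:.4f}, RMSEP = {rmsep:.4f}")
```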

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, a numerical program has been developed to simulate the wave-induced ship motions in the time domain. Wave-body interactions have been studied for various ships and floating bodies through forced motion and free motion simulations in a wide range of wave frequencies. A three-dimensional Rankine panel method is applied to solve the boundary value problem for the wave-body interactions. The velocity potentials and normal velocities on the boundaries are obtained in the time domain by solving the mixed boundary integral equations in relation to the source and dipole distributions. The hydrodynamic forces are calculated by the integration of the instantaneous hydrodynamic pressures over the body surface. The equations of ship motion are solved simultaneously with the boundary value problem for each time step. The wave elevation is computed by applying the linear free surface conditions. A numerical damping zone is adopted to absorb the outgoing waves in order to satisfy the radiation condition for the truncated free surface. A numerical filter is applied on the free surface for the smoothing of the wave elevation. Good convergence has been reached for both forced motion simulations and free motion simulations. The computed added-mass and damping coefficients, wave exciting forces, and motion responses for ships and floating bodies are in good agreement with the numerical results from other programs and experimental data.
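
A minimal sketch of free-surface smoothing, not the thesis's actual filter: a five-point weighted pass with the classical (-1, 4, 10, 4, -1)/16 stencil applied to a sampled wave elevation to suppress saw-tooth noise; the grid and noise model are assumptions.

```python
# A minimal sketch: a five-point smoothing filter applied to free-surface elevation
# samples to damp saw-tooth instabilities. The weights are one classical choice and
# may differ from the filter actually used in the thesis.
import numpy as np

def smooth_free_surface(eta):
    """Apply a 5-point smoothing stencil to interior nodes, leaving the two
    nodes at each end of the truncated free surface unchanged."""
    w = np.array([-1.0, 4.0, 10.0, 4.0, -1.0]) / 16.0
    out = eta.copy()
    for j in range(2, len(eta) - 2):
        out[j] = np.dot(w, eta[j - 2: j + 3])
    return out

x = np.linspace(0.0, 20.0, 201)
eta = 0.5 * np.sin(0.8 * x)                               # smooth reference elevation
eta_noisy = eta + 0.02 * (-1.0) ** np.arange(x.size)      # saw-tooth numerical noise
print("max error before filtering:", np.abs(eta_noisy - eta).max())
print("max error after filtering: ", np.abs(smooth_free_surface(eta_noisy) - eta).max())
```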

Relevance:

10.00%

Publisher:

Abstract:

Technological evolution and the growing use of computer graphics in many fields are attracting more and more people to the world of 3D modelling. Modelling software, however, is often unsuitable for inexperienced users, mainly because of unintuitive navigation and modelling commands. From the point of view of human-computer interaction, these programs face a major obstacle: the relationship between 2D input devices (such as the mouse) and the manipulation of a 3D scene. The project presented in this thesis is a Blender addon that allows the Leap Motion device to be used as an aid to surface modelling in computer graphics. The goal of this thesis was to design and implement a user-friendly interface between Leap and Blender, so that the sensors of the former can be used to facilitate and extend the navigation and modelling commands of the latter. The addon developed for Blender implements the concept of LAM (Leap Aided Modelling), extending Blender's features for selecting, moving and editing objects in the scene, manipulating the user view, and modelling Non-Uniform Rational B-Spline (NURBS) curves and surfaces. These extensions were created to make operations otherwise driven exclusively by mouse and keyboard faster and simpler.
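
A minimal, hypothetical sketch of what a Blender operator in such an addon can look like (a standard bpy addon skeleton, runnable only inside Blender); the Leap Motion reading and the actual LAM commands are not reproduced here, so the offset is passed in as an ordinary property.

```python
# Hypothetical sketch of a Blender addon skeleton, not the thesis's actual code:
# a single operator that moves the active object by an offset which, in the real
# addon, would come from Leap Motion palm tracking.
bl_info = {"name": "LAM sketch", "blender": (2, 80, 0), "category": "Object"}

import bpy

class OBJECT_OT_lam_translate(bpy.types.Operator):
    """Translate the active object by a given offset (placeholder for Leap input)."""
    bl_idname = "object.lam_translate"
    bl_label = "LAM Translate"

    offset: bpy.props.FloatVectorProperty(name="Offset", default=(0.0, 0.0, 0.0))

    def execute(self, context):
        obj = context.active_object
        if obj is None:
            return {'CANCELLED'}
        obj.location.x += self.offset[0]
        obj.location.y += self.offset[1]
        obj.location.z += self.offset[2]
        return {'FINISHED'}

def register():
    bpy.utils.register_class(OBJECT_OT_lam_translate)

def unregister():
    bpy.utils.unregister_class(OBJECT_OT_lam_translate)
```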

Relevance:

10.00%

Publisher:

Abstract:

We provide new evidence on sea surface temperature (SST) variations and paleoceanographic/paleoenvironmental changes over the past 1500 years for the north Aegean Sea (NE Mediterranean). The reconstructions are based on multiproxy analyses obtained from the high-resolution (decadal to multi-decadal) marine record M2 retrieved from the Athos basin. Reconstructed SSTs show an increase from ca. 850 to 950 AD and from ca. 1100 to 1300 AD. A cooling phase of almost 1.5 °C is observed from ca. 1600 AD to 1700 AD. This seems to have been the starting point of a continuous SST warming trend that lasted until the end of the reconstructed period, interrupted by two prominent cooling events at 1832 ± 15 AD and 1995 ± 1 AD. Application of adaptive kernel smoothing suggests that the current warming in the reconstructed SSTs of the north Aegean might be unprecedented in the context of the past 1500 years. Internal variability in atmospheric/oceanic circulation systems, as well as external forcings such as solar radiation and volcanic activity, could have affected temperature variations in the north Aegean Sea over the past 1500 years. The marked temperature drop of approximately 2 °C at 1832 ± 15 AD could be related to the 1809 AD 'unknown' and the 1815 AD Tambora volcanic eruptions. Paleoenvironmental proxy indices of the M2 record show enhanced riverine/continental inputs in the northern Aegean after ca. 1450 AD. The paleoclimatic evidence derived from the M2 record is combined with a socio-environmental study of the history of the north Aegean region. We show that the cultivation of temperature-sensitive crops, i.e. walnut, vine and olive, co-occurred with stable and warmer temperatures, while its end coincided with a significant episode of cooler temperatures. Periods of agricultural growth in Macedonia coincide with periods of warmer and more stable SSTs, but further exploration is required in order to identify the causal links behind the observed phenomena. The Black Death likely caused major changes in agricultural activity in the north Aegean region, as reflected in the pollen data from land sites of Macedonia and the M2 proxy reconstructions. Finally, we conclude that the early modern peaks in mountain vegetation in the Rhodope and Macedonia highlands, visible also in the M2 record, were very likely climate-driven.
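
For illustration only: the study applies an adaptive kernel, but the basic operation can be sketched with a fixed-bandwidth Gaussian (Nadaraya-Watson) smoother on a synthetic decadal SST series; the bandwidth and the data are assumptions.

```python
# Simplified sketch of kernel smoothing of a reconstructed SST series (fixed-bandwidth
# Gaussian smoother; the paper's adaptive variant is not reproduced here).
import numpy as np

def gaussian_kernel_smooth(t, y, t_eval, bandwidth):
    """Nadaraya-Watson estimate of y at the points t_eval."""
    out = np.empty(len(t_eval), dtype=float)
    for i, te in enumerate(t_eval):
        w = np.exp(-0.5 * ((t - te) / bandwidth) ** 2)
        out[i] = np.sum(w * y) / np.sum(w)
    return out

years = np.arange(500, 2001, 10)                        # ~decadal samples, 500-2000 AD
rng = np.random.default_rng(2)
sst = 16.0 + 0.6 * np.sin((years - 500) / 300.0) + rng.normal(0, 0.4, years.size)
smoothed = gaussian_kernel_smooth(years, sst, years, bandwidth=50.0)
print("raw std: %.2f °C, smoothed std: %.2f °C" % (sst.std(), smoothed.std()))
```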

Relevance:

10.00%

Publisher:

Abstract:

Chemical stratigraphy, the study of the variation of chemical elements within sedimentary sequences, has gradually become an established tool in the research and correlation of global geologic events. In this paper, 87Sr/86Sr ratios of the Triassic marine carbonates (Muschelkalk facies) of the southeast Iberian Ranges, Iberian Peninsula, are presented and the representative Sr-isotope curve is constructed for the upper Ladinian interval. The studied stratigraphic succession is 102 meters thick, continuous, and well preserved. Previous paleontological data from macro- and microfossils (ammonites, bivalves, foraminifera and conodonts) and palynological assemblages suggest a Fassanian-Longobardian age (Late Ladinian). Although diagenetic minerals are present in small amounts, the elemental data of the bulk carbonate samples, especially the Sr contents, show a major variation that probably reflects palaeoenvironmental changes. The 87Sr/86Sr curve rises from 0.707649 near the base of the section to 0.707741, then declines rapidly to 0.707624, with a final rise to 0.70787 in the upper part. The data up to meter 80 of the studied succession broadly agree with 87Sr/86Sr ratios of sequences of similar age and complement those data. Moreover, the sequence-stratigraphic framework and its key surfaces, which are difficult to recognise on the basis of facies analysis alone, are characterised by combining the variations of the Ca, Mg, Mn, Sr and CaCO3 contents.

Relevance:

10.00%

Publisher:

Abstract:

This research paper presents a five-step algorithm to generate tool paths for machining Free-form / Irregular Contoured Surface(s) (FICS) by adopting the STEP-NC (AP-238) format. In the first step, a parametrized CAD model with FICS is created or imported in the UG-NX 6.0 CAD package. The second step recognizes the features and calculates a Closeness Index (CI) by comparing them with B-Spline / Bezier surfaces. The third step utilizes the CI and extracts the necessary data to formulate the blending functions for the identified features. In the fourth step, Z-level 5-axis tool paths are generated using flat and ball end mill cutters. Finally, in the fifth step, the tool paths are integrated with the STEP-NC format and validated. All these steps are discussed and explained through a validated industrial component.
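
The Closeness Index itself is defined in the paper and is not reproduced here; the sketch below shows one plausible ingredient, evaluating a bicubic Bezier patch with de Casteljau's algorithm and measuring the RMS deviation of sampled surface points from it, under an assumed control net.

```python
# Illustrative sketch only (not the paper's CI definition): evaluate a bicubic Bezier
# patch and compute the RMS deviation of sampled surface points as a crude closeness measure.
import numpy as np

def de_casteljau(points, t):
    """Evaluate a Bezier curve defined by control points (n x dim) at parameter t."""
    pts = points.astype(float).copy()
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def bezier_patch(ctrl, u, v):
    """Evaluate a bicubic Bezier patch; ctrl has shape (4, 4, 3)."""
    rows = np.array([de_casteljau(ctrl[i], u) for i in range(4)])
    return de_casteljau(rows, v)

# Hypothetical 4x4 control net of a gently curved patch
cx, cy = np.meshgrid(np.linspace(0, 3, 4), np.linspace(0, 3, 4), indexing="ij")
ctrl = np.dstack([cx, cy, 0.3 * np.sin(cx) * np.cos(cy)])

# Sampled "CAD surface" points at known parameters, with a small artificial deviation
uv = [(u, v) for u in np.linspace(0, 1, 5) for v in np.linspace(0, 1, 5)]
samples = np.array([bezier_patch(ctrl, u, v) + [0.0, 0.0, 0.01] for u, v in uv])

rms = np.sqrt(np.mean([np.sum((s - bezier_patch(ctrl, u, v)) ** 2)
                       for s, (u, v) in zip(samples, uv)]))
print("RMS deviation from the Bezier patch: %.4f (one possible closeness measure)" % rms)
```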

Relevance:

10.00%

Publisher:

Abstract:

This research paper presents work on feature recognition, tool path data generation and integration with STEP-NC (AP-238 format) for features having Free-form / Irregular Contoured Surface(s) (FICS). Initially, the FICS features are modelled or imported in the UG CAD package and a closeness index is generated by comparing the FICS features with basic B-Spline / Bezier curves and surfaces. Then blending functions are calculated using the convolution theorem. Based on the blending functions, contour offset tool paths are generated and simulated for a 5-axis milling environment. Finally, the tool path (CL) data is integrated with the STEP-NC (AP-238) format. The tool path algorithm and STEP-NC data are tested with various industrial parts through an automated UFUNC plugin.
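
A simplified sketch of the contour-offset idea with an assumed elliptical contour; real CAM offsetting must also handle self-intersections, gouge checking and the cutter geometry in 3D, none of which is attempted here.

```python
# Simplified sketch (assumed geometry, not the paper's algorithm): offset a closed
# planar contour outward along averaged vertex normals, the basic idea behind
# contour-offset tool paths.
import numpy as np

def offset_contour(pts, d):
    """Offset a closed 2D polyline (n x 2, counter-clockwise) outward by distance d."""
    prev_edges = pts - np.roll(pts, 1, axis=0)
    next_edges = np.roll(pts, -1, axis=0) - pts

    def normals(edges):
        n = np.stack([edges[:, 1], -edges[:, 0]], axis=1)    # outward for a CCW contour
        return n / np.linalg.norm(n, axis=1, keepdims=True)

    n = normals(prev_edges) + normals(next_edges)
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    return pts + d * n

theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
contour = np.stack([np.cos(theta), 0.6 * np.sin(theta)], axis=1)  # an elliptical contour
for pass_no, dist in enumerate([0.1, 0.2, 0.3], start=1):
    path = offset_contour(contour, dist)
    closed = np.vstack([path, path[:1]])
    length = np.sum(np.linalg.norm(np.diff(closed, axis=0), axis=1))
    print(f"pass {pass_no}: offset {dist:.1f}, tool path length {length:.3f}")
```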

Relevance:

10.00%

Publisher:

Abstract:

Estimates of HIV prevalence are important for policy in order to establish the health status of a country's population and to evaluate the effectiveness of population-based interventions and campaigns. However, participation rates in testing for surveillance conducted as part of household surveys, on which many of these estimates are based, can be low. HIV positive individuals may be less likely to participate because they fear disclosure, in which case estimates obtained using conventional approaches to deal with missing data, such as imputation-based methods, will be biased. We develop a Heckman-type simultaneous equation approach which accounts for non-ignorable selection, but unlike previous implementations, allows for spatial dependence and does not impose a homogeneous selection process on all respondents. In addition, our framework addresses the issue of separation, where for instance some factors are severely unbalanced and highly predictive of the response, which would ordinarily prevent model convergence. Estimation is carried out within a penalized likelihood framework where smoothing is achieved using a parametrization of the smoothing criterion which makes estimation more stable and efficient. We provide the software for straightforward implementation of the proposed approach, and apply our methodology to estimating national and sub-national HIV prevalence in Swaziland, Zimbabwe and Zambia.
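
A small self-contained sketch of penalized-likelihood smoothing for a Gaussian response, using a truncated-power basis with a ridge-type penalty; the selection model, spatial dependence and the stabilising parametrisation of the smoothing criterion used in the paper are not reproduced.

```python
# Illustrative sketch of penalized smoothing (Gaussian response only, not the paper's
# Heckman-type selection framework): minimise ||y - B beta||^2 + lam * beta' D beta.
import numpy as np

def penalized_spline_fit(x, y, knots, lam):
    """Fit y ~ f(x) with a truncated-power cubic basis and a quadratic penalty."""
    B = np.column_stack([np.ones_like(x), x, x ** 2, x ** 3] +
                        [np.clip(x - k, 0, None) ** 3 for k in knots])
    D = np.diag([0.0] * 4 + [1.0] * len(knots))      # penalise only the knot terms
    beta = np.linalg.solve(B.T @ B + lam * D, B.T @ y)
    return B @ beta

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)
knots = np.linspace(0.05, 0.95, 15)

for lam in (0.01, 1.0, 100.0):
    fhat = penalized_spline_fit(x, y, knots, lam)
    print(f"lambda={lam:7.2f}  residual SD={np.std(y - fhat):.3f}")
```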

Relevance:

10.00%

Publisher:

Abstract:

Costs related to inventory usually account for a significant share of a company's total assets. Despite this, companies generally pay little attention to them, even though the benefits of effective inventory management are obvious: less tied-up capital, increased customer satisfaction and a better working environment. Permobil AB, Timrå is in an intense period of revenue growth, and the production unit is aiming to increase output by 30% over the next two years. To make this possible, the company has to improve the way it distributes and handles material. The purpose of the study is to provide useful information and concrete proposals for action, so that the company can build a strategy for an effective and sustainable inventory-management solution. Alternative methods are suggested in order to reach a more nuanced perception of the different articles and how they should be managed. The Analytic Hierarchy Process (AHP) was used to let specially selected persons decide the criteria by which an article should be valued; the criteria they agreed on were annual volume value, lead time, frequency rate and purchase price. The other proposed method was a two-dimensional model in which annual volume value and frequency determine the class in which an article is placed. Both methods resulted in significant changes compared with the current solution. For the spare-part inventory, different forecast methods were tested and compared with the current solution. It turned out that the current forecast method performed worse than both moving average and exponential smoothing with trend. The small sample of ten randomly chosen articles is not large enough to reject the current solution, but the result is still reason enough for the company to check the quality of its forecasts.
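
A brief sketch of the AHP weighting step with a hypothetical pairwise-comparison matrix (not the judgements actually collected at the company): criterion weights are taken from the principal eigenvector and checked with Saaty's consistency ratio.

```python
# Illustrative sketch: AHP criterion weights from a hypothetical pairwise-comparison matrix.
import numpy as np

criteria = ["annual volume value", "lead time", "frequency rate", "purchase price"]
# A[i, j] = how much more important criterion i is than j (Saaty's 1-9 scale), assumed values
A = np.array([[1.0, 3.0, 2.0, 5.0],
              [1/3, 1.0, 1/2, 2.0],
              [1/2, 2.0, 1.0, 3.0],
              [1/5, 1/2, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # normalised criterion weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
cr = ci / 0.90                                 # Saaty's random index for n=4 is 0.90
for c, wi in zip(criteria, w):
    print(f"{c:20s} weight = {wi:.3f}")
print(f"consistency ratio = {cr:.3f} (values below 0.10 are usually acceptable)")
```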

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Abnormalities in serum phosphorus, calcium and parathyroid hormone (PTH) have been associated with poor survival in haemodialysis patients. This COSMOS (Current management Of Secondary hyperparathyroidism: a Multicentre Observational Study) analysis assesses the association of high and low serum phosphorus, calcium and PTH with the relative risk of mortality. Furthermore, the impact of changes in these parameters on the relative risk of mortality throughout the 3-year follow-up has been investigated. METHODS: COSMOS is a 3-year, multicentre, open-cohort, prospective study carried out in 6797 adult chronic haemodialysis patients randomly selected from 20 European countries. RESULTS: Using Cox proportional hazards regression models and penalized splines analysis, it was found that both high and low serum phosphorus, calcium and PTH were associated with a higher risk of mortality. The serum values associated with the minimum relative risk of mortality were 4.4 mg/dL for serum phosphorus, 8.8 mg/dL for serum calcium and 398 pg/mL for serum PTH. Taking these values as a base, the ranges associated with the lowest mortality risk were 3.6-5.2 mg/dL for serum phosphorus, 7.9-9.5 mg/dL for serum calcium and 168-674 pg/mL for serum PTH. Decreases in serum phosphorus and calcium and increases in serum PTH in patients with baseline values of >5.2 mg/dL (phosphorus), >9.5 mg/dL (calcium) and <168 pg/mL (PTH), respectively, were associated with improved survival. CONCLUSIONS: COSMOS provides evidence of the association of serum phosphorus, calcium and PTH with mortality, and suggests survival benefits of controlling chronic kidney disease-mineral and bone disorder biochemical parameters in CKD5D patients.
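
An illustrative sketch with simulated data, not the COSMOS dataset: a Cox proportional hazards fit (via lifelines) in which serum phosphorus enters through a small spline basis, so the estimated log-hazard can be U-shaped with a minimum near 4.4 mg/dL; the knots, sample size and effect shape are assumptions.

```python
# Sketch only: simulated haemodialysis-like data with a U-shaped phosphorus effect,
# fitted with a ridge-penalised Cox model on a spline-expanded covariate.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 3000
phos = rng.normal(4.8, 1.2, n).clip(1.5, 9.0)            # serum phosphorus, mg/dL
log_hr = 0.25 * (phos - 4.4) ** 2                         # assumed U-shaped true effect
time = rng.exponential(scale=5.0 / np.exp(log_hr))        # survival times (years)
event = (time < 3.0).astype(int)                          # administrative censoring at 3 years
time = np.minimum(time, 3.0)

# Truncated-power spline basis for phosphorus (knot positions are illustrative choices)
knots = [3.5, 4.4, 5.5]
df = pd.DataFrame({"time": time, "event": event, "phos": phos, "phos2": phos ** 2})
for i, k in enumerate(knots):
    df[f"phos_k{i}"] = np.clip(phos - k, 0, None) ** 2

cph = CoxPHFitter(penalizer=0.1)                          # ridge penalty on the coefficients
cph.fit(df, duration_col="time", event_col="event")
print(cph.params_.round(3))
```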