545 results for Smoothing


Relevance: 10.00%

Abstract:

We propose and investigate a method for the stable determination of a harmonic function from knowledge of its value and its normal derivative on a part of the boundary of the (bounded) solution domain (Cauchy problem). We reformulate the Cauchy problem as an operator equation on the boundary using the Dirichlet-to-Neumann map. To discretize the obtained operator, we modify and employ a method denoted as Classic II given in [J. Helsing, Faster convergence and higher accuracy for the Dirichlet–Neumann map, J. Comput. Phys. 228 (2009), pp. 2578–2586, Section 3], which is based on Fredholm integral equations and Nyström discretization schemes. Then, for stability reasons, to solve the discretized integral equation we use the method of smoothing projection introduced in [J. Helsing and B.T. Johansson, Fast reconstruction of harmonic functions from Cauchy data using integral equation techniques, Inverse Probl. Sci. Eng. 18 (2010), pp. 381–399, Section 7], which makes it possible to solve the discretized operator equation in a stable way with minor computational cost and high accuracy. With this approach, for sufficiently smooth Cauchy data, the normal derivative can also be accurately computed on the part of the boundary where no data is initially given.
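The Nyström idea underlying the discretization above — replace the integral with a quadrature rule and collocate at the quadrature nodes — can be sketched for a generic second-kind Fredholm equation. This is a minimal illustration with a hypothetical separable kernel and the trapezoid rule, not Helsing's Classic II scheme:

```python
# Nystrom discretization of a second-kind Fredholm equation
#   u(x) - int_0^1 K(x,t) u(t) dt = f(x)
# Quadrature turns it into the linear system (I - W K) u = f at the nodes.
# Toy kernel K(x,t) = x*t with f(x) = 2x/3, whose exact solution is u(x) = x.

def gauss_solve(A, b):
    # Plain Gaussian elimination with partial pivoting (fine for small dense systems).
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def nystrom_solve(kernel, f, n=101):
    h = 1.0 / (n - 1)
    nodes = [i * h for i in range(n)]
    weights = [h * (0.5 if i in (0, n - 1) else 1.0) for i in range(n)]  # trapezoid rule
    # Assemble A = I - W K and the right-hand side, then solve at the nodes.
    A = [[(1.0 if i == j else 0.0) - weights[j] * kernel(nodes[i], nodes[j])
          for j in range(n)] for i in range(n)]
    return nodes, gauss_solve(A, [f(x) for x in nodes])

nodes, u = nystrom_solve(lambda x, t: x * t, lambda x: 2.0 * x / 3.0)
```

With the trapezoid rule the discrete solution matches the exact solution u(x) = x up to the quadrature error; higher-order rules (as in the cited work) improve the accuracy accordingly.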

Relevance: 10.00%

Abstract:

This paper addresses the preliminary processing of colour images, namely interference suppression (local artefacts and noise) and extraction of the object from the background at the stage preceding contour extraction. It was long considered inadmissible to apply smoothing when segmenting via boundary extraction, but the methods described and the results obtained demonstrate that using noise-suppression methods is expedient.
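Why mild smoothing can help rather than harm boundary extraction is easy to see on a toy one-dimensional signal (hypothetical data, a step edge with a single noise spike): the gradient of the raw signal peaks at the noise, while the gradient of the smoothed signal peaks at the true edge.

```python
# On a noisy step edge, the raw gradient peaks at a noise spike, while the
# gradient of the smoothed signal peaks at the true boundary.

def box_smooth(signal, passes=2):
    out = signal[:]
    for _ in range(passes):
        out = [out[0]] + [(out[i - 1] + out[i] + out[i + 1]) / 3.0
                          for i in range(1, len(out) - 1)] + [out[-1]]
    return out

def gradient(signal):
    # Central differences; zero at the ends.
    return [0.0] + [(signal[i + 1] - signal[i - 1]) / 2.0
                    for i in range(1, len(signal) - 1)] + [0.0]

raw = [0.0] * 10 + [1.0] * 10   # step edge between indices 9 and 10
raw[5] = 1.2                    # isolated noise spike
smoothed = box_smooth(raw)

g_raw, g_smooth = gradient(raw), gradient(smoothed)
peak_raw = max(range(len(raw)), key=lambda i: abs(g_raw[i]))
peak_smooth = max(range(len(raw)), key=lambda i: abs(g_smooth[i]))
```

Here `peak_raw` lands next to the spike, whereas `peak_smooth` lands at the step — the behaviour the abstract argues for.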

Relevance: 10.00%

Abstract:

Accurate measurement of the intervertebral kinematics of the cervical spine can support the diagnosis of widespread diseases related to neck pain, such as chronic whiplash dysfunction, arthritis, and segmental degeneration. The natural inaccessibility of the spine, its complex anatomy, and the small range of motion permit only limited measurement in vivo. Low-dose X-ray fluoroscopy allows time-continuous screening of the cervical spine during the patient's spontaneous motion. To obtain accurate motion measurements, each vertebra was tracked by means of image processing along a sequence of radiographic images. To obtain a time-continuous representation of motion and to reduce noise in the experimental data, smoothing spline interpolation was used. Estimates of intervertebral motion for the cervical segments were obtained by processing the patient's fluoroscopic sequence; the intervertebral angle, the displacement, and the instantaneous centre of rotation were computed. The RMS fitting error was about 0.2 degrees for rotation and 0.2 mm for displacement. © 2013 Paolo Bifulco et al.
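The penalty idea behind smoothing splines can be sketched in its simplest discrete form, the Whittaker smoother: minimise data fidelity plus the squared second differences of the fitted curve. This is an illustrative analogue, not the study's actual spline implementation, and the angle values are hypothetical:

```python
# Discrete analogue of a smoothing spline (Whittaker smoother):
#   minimise  sum (z_i - y_i)^2  +  lam * sum (z_{i-1} - 2 z_i + z_{i+1})^2,
# i.e. solve (I + lam * D^T D) z = y with D the second-difference matrix.

def whittaker_smooth(y, lam):
    n = len(y)
    D = [[0.0] * n for _ in range(n - 2)]
    for i in range(n - 2):
        D[i][i], D[i][i + 1], D[i][i + 2] = 1.0, -2.0, 1.0
    # A = I + lam * D^T D  (symmetric positive definite).
    A = [[(1.0 if i == j else 0.0) +
          lam * sum(D[k][i] * D[k][j] for k in range(n - 2))
          for j in range(n)] for i in range(n)]
    # Gaussian elimination without pivoting (safe for an SPD matrix).
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    z = [0.0] * n
    for r in range(n - 1, -1, -1):
        z[r] = (M[r][n] - sum(M[r][k] * z[k] for k in range(r + 1, n))) / M[r][r]
    return z

# A noisy intervertebral-angle-like series (hypothetical values, degrees).
angles = [0.0, 0.9, 2.1, 2.8, 4.2, 4.9, 6.1, 6.8, 8.2, 9.1]
smoothed = whittaker_smooth(angles, lam=5.0)
```

With lam = 0 the data are returned unchanged; as lam grows the fit approaches a straight line, which is exactly the bias-variance trade-off the smoothing parameter of a spline controls.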

Relevance: 10.00%

Abstract:

2002 Mathematics Subject Classification: 62M10.

Relevance: 10.00%

Abstract:

2000 Mathematics Subject Classification: 68T01, 62H30, 32C09.

Relevance: 10.00%

Abstract:

Technology changes rapidly over the years, continuously providing more computing options and making economic and other transactions easier. However, the introduction of new technology pushes old Information and Communication Technology (ICT) products out of use. E-waste is defined as the quantity of ICT products no longer in use; it is a bivariate function of the quantities sold and the probability that a given quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions in order to compute obsolete computer quantities. To provide robust forecasts, a selection of forecasting models - namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend, (v) Level, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing - was applied, selecting for each country the model with the lowest in-sample error indices (Mean Absolute Error and Mean Squared Error). As new technology does not diffuse through all regions of the world at the same speed, owing to different socio-economic factors, the lifespan distribution - which gives the probability that a certain quantity of computers is considered obsolete - is not adequately modelled in the literature. The time horizon for the forecasts is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, due to decreasing computer lifespans and increasing sales.
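The per-country selection rule — fit several candidates and keep the one with the lowest in-sample error index — can be sketched with two of the simplest candidates from the list above (a level model and a trend model standing in for the paper's seven); the sales figures are hypothetical:

```python
# Pick, per series, the candidate whose in-sample one-step forecasts minimise
# the Mean Absolute Error.  "level" forecasts the last value; "trend" adds the
# last observed change.

def one_step_forecasts(series, model):
    if model == "level":
        return [series[i - 1] for i in range(1, len(series))]
    if model == "trend":
        return [series[i - 1] + (series[i - 1] - series[i - 2] if i >= 2 else 0.0)
                for i in range(1, len(series))]
    raise ValueError(model)

def mae(actual, forecast):
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def best_model(series, candidates=("level", "trend")):
    scores = {m: mae(series[1:], one_step_forecasts(series, m)) for m in candidates}
    return min(scores, key=scores.get), scores

# Hypothetical cumulative computer sales (millions of units), rising steadily.
sales = [10.0, 12.0, 14.1, 16.0, 18.2, 20.1, 22.0]
chosen, scores = best_model(sales)
```

On a steadily rising series the trend model wins, as its MAE is a fraction of the level model's; the same comparison, run per country over the full candidate set, is what selects the forecasting model in the paper.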

Relevance: 10.00%

Abstract:

This paper analyses the most important mechanisms of the Hungarian economy using a medium-sized quarterly macroeconomic model developed jointly by the Economic Policy Department of the Ministry of Finance and the Institute of Economics of the Hungarian Academy of Sciences. After introducing the fundamental principles of the modelling and the building blocks of the model, the authors present, within a scenario analysis, the effects of the main factors behind the macroeconomic and budgetary processes. The sources of uncertainty - defined in a broad sense - are categorized in three groups: changes in the external environment (e.g. the exchange rate), uncertainties in the behaviour of economic agents (e.g. in the speed of wage adjustment or the extent of consumption smoothing), and economic policy decisions (e.g. an increase in public sector wages). The macroeconomic consequences of these uncertainties are shown not to be independent of each other; for instance, the effects of an exchange rate shock are influenced by the speed of wage adjustment.

Relevance: 10.00%

Abstract:

The accurate and reliable estimation of travel time based on point detector data is needed to support Intelligent Transportation System (ITS) applications. It has been found that the quality of travel time estimation is a function of the method used in the estimation and varies for different traffic conditions. In this study, two hybrid on-line travel time estimation models, and their corresponding off-line methods, were developed to achieve better estimation performance under various traffic conditions, including recurrent congestion and incidents. The first model combines the Mid-Point method, which is a speed-based method, with a traffic flow-based method. The second model integrates two speed-based methods: the Mid-Point method and the Minimum Speed method. In both models, the switch between travel time estimation methods is based on the congestion level and queue status automatically identified by clustering analysis. During incident conditions with rapidly changing queue lengths, shock wave analysis-based refinements are applied for on-line estimation to capture the fast queue propagation and recovery. Travel time estimates obtained from existing speed-based methods, traffic flow-based methods, and the models developed were tested using both simulation and real-world data. The results indicate that all tested methods performed at an acceptable level during periods of low congestion. However, their performances vary with an increase in congestion. Comparisons with other estimation methods also show that the developed hybrid models perform well in all cases. Further comparisons between the on-line and off-line travel time estimation methods reveal that off-line methods perform significantly better only during fast-changing congested conditions, such as during incidents. 
The impacts of major influential factors on the performance of travel time estimation, including data preprocessing procedures, detector errors, detector spacing, frequency of travel time updates to traveler information devices, travel time link length, and posted travel time range, were investigated in this study. The results show that these factors have more significant impacts on the estimation accuracy and reliability under congested conditions than during uncongested conditions. For the incident conditions, the estimation quality improves with the use of a short rolling period for data smoothing, more accurate detector data, and frequent travel time updates.
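The speed-based baseline underlying the first hybrid model, the Mid-Point method, is simple to state: each point detector's speed is assumed to hold over the half of the link nearest to it. A minimal sketch (hypothetical link and speeds; not the paper's hybrid switching logic):

```python
# Mid-Point travel time estimation: link travel time is the sum of half-link
# lengths divided by the corresponding detector speeds.

def midpoint_travel_time(link_length_km, upstream_speed_kmh, downstream_speed_kmh):
    half = link_length_km / 2.0
    # Convert hours to seconds.
    return 3600.0 * (half / upstream_speed_kmh + half / downstream_speed_kmh)

# 1 km link: free flow vs a congested downstream half.
free_flow = midpoint_travel_time(1.0, 90.0, 90.0)   # uniform 90 km/h
congested = midpoint_travel_time(1.0, 90.0, 30.0)   # queue on the downstream half
```

Under uniform speed the estimate reduces to distance over speed; when the downstream detector sits in a queue, the estimate rises accordingly — and it is exactly in fast-changing queue conditions that the paper switches methods and applies shock-wave refinements.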

Relevance: 10.00%

Abstract:

With the growing demand for data traffic in third-generation (3G) networks, mobile operators have sought to concentrate infrastructure resources where they identify the greatest need. These investments aim to maintain quality of service, especially in dense urban areas. The WCDMA-HSPA physical-layer parameters Rx Power, RSCP (Received Signal Code Power), Ec/Io (energy per chip over interference), and throughput are analysed. In this work, time-series prediction for an HSPA network is performed. The parameter values were collected on a fully operational network through a drive test in Natal-RN, a state capital in northeastern Brazil. The models used for time-series prediction were Simple Exponential Smoothing, Holt, additive Holt-Winters, and multiplicative Holt-Winters. The objective of the predictions is to determine which model generates the best forecasts of the WCDMA-HSPA network parameters.
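The two non-seasonal models in the comparison can be written in a few lines each: Simple Exponential Smoothing tracks only a level, while Holt's method adds a trend term. A minimal sketch with hypothetical throughput values:

```python
# Simple Exponential Smoothing and Holt's linear method in minimal form.

def ses_forecast(series, alpha):
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level  # SES gives a flat h-step-ahead forecast

def holt_forecast(series, alpha, beta, horizon=1):
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend

# Hypothetical throughput samples (Mbit/s), deliberately linear.
throughput = [1.2, 1.5, 1.8, 2.1, 2.4]
```

On a trending series SES lags behind while Holt extrapolates the slope; the Holt-Winters variants named in the abstract extend Holt with an additive or multiplicative seasonal component.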

Relevance: 10.00%

Abstract:

This work proposes a simple, fast, and versatile methodological routine using near-infrared (NIR) spectroscopy combined with multivariate analysis to determine the ash, moisture, protein, and total lipid contents of the gray shrimp (Litopenaeus vannamei). Conventionally, ash is determined gravimetrically after ashing at 550 °C, moisture gravimetrically after drying at 105 °C, lipids volumetrically after Soxhlet extraction, and protein by Kjeldahl digestion and distillation. Spectra of 63 samples of processed boiled shrimp of the species Litopenaeus vannamei were first collected; the determinations by the conventional standard methods were then carried out. The mean-centred spectra underwent multiplicative scatter correction, 15-point Savitzky-Golay smoothing, and a first-derivative transformation; the noisy region was eliminated, leaving a working range of 1100.36 to 2502.37 nm. The PLS models yielded, for ash, R = 0.9471, RMSEC = 0.1017, and RMSEP = 0.1548; for moisture, R = 0.9241, RMSEC = 2.5483, and RMSEP = 4.1979; for protein, R = 0.9201, RMSEC = 1.9391, and RMSEP = 2.7066; and for lipids, R = 0.8801, RMSEC = 0.2827, and RMSEP = 0.2329. The relative errors between the reference methods and NIR were thus small and satisfactory. These results are an excellent indication that NIR can be used for these analyses, which is quite advantageous: the conventional techniques are time-consuming, consume large amounts of reagents, and involve several professionals, whereas, once the methodology is validated, NIR reduces the analysis to a few minutes, saving reagents and time, generating no waste, and being non-destructive.
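Savitzky-Golay smoothing, used above with a 15-point window, fits a low-order polynomial in a sliding window, which reduces to a fixed convolution. The same idea at minimal size is the classic 5-point quadratic filter with weights (-3, 12, 17, 12, -3)/35:

```python
# 5-point quadratic Savitzky-Golay smoothing as a fixed convolution.
# Because the window fit is quadratic, any locally quadratic signal (such as a
# smooth absorption band) passes through essentially unchanged, unlike with a
# plain moving average.

SG5 = [-3.0, 12.0, 17.0, 12.0, -3.0]

def savgol5(signal):
    smoothed = signal[:2]  # keep the two edge points unfiltered in this sketch
    for i in range(2, len(signal) - 2):
        smoothed.append(sum(c * signal[i + k - 2] for k, c in enumerate(SG5)) / 35.0)
    smoothed += signal[-2:]
    return smoothed

# A parabola is reproduced exactly at the interior points.
parabola = [0.1 * t * t for t in range(10)]
```

This peak-shape preservation is why Savitzky-Golay filtering is the standard pre-treatment for spectra before derivative transforms.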

Relevance: 10.00%

Abstract:

In this thesis, a numerical program has been developed to simulate the wave-induced ship motions in the time domain. Wave-body interactions have been studied for various ships and floating bodies through forced motion and free motion simulations in a wide range of wave frequencies. A three-dimensional Rankine panel method is applied to solve the boundary value problem for the wave-body interactions. The velocity potentials and normal velocities on the boundaries are obtained in the time domain by solving the mixed boundary integral equations in relation to the source and dipole distributions. The hydrodynamic forces are calculated by the integration of the instantaneous hydrodynamic pressures over the body surface. The equations of ship motion are solved simultaneously with the boundary value problem for each time step. The wave elevation is computed by applying the linear free surface conditions. A numerical damping zone is adopted to absorb the outgoing waves in order to satisfy the radiation condition for the truncated free surface. A numerical filter is applied on the free surface for the smoothing of the wave elevation. Good convergence has been reached for both forced motion simulations and free motion simulations. The computed added-mass and damping coefficients, wave exciting forces, and motion responses for ships and floating bodies are in good agreement with the numerical results from other programs and experimental data.
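The numerical filter applied to the free-surface elevation can be sketched with a five-point smoothing formula of the kind commonly used in time-domain panel codes (weights assumed here to be the Longuet-Higgins & Cokelet-type (-1, 4, 10, 4, -1)/16; the thesis's actual filter may differ):

```python
# Five-point free-surface smoothing filter.  It annihilates the sawtooth
# (odd-even) mode that destabilises the time stepping, while passing linear
# elevation profiles exactly.

def filter_free_surface(eta):
    out = eta[:2]  # end points left unfiltered in this sketch
    for j in range(2, len(eta) - 2):
        out.append((-eta[j - 2] + 4 * eta[j - 1] + 10 * eta[j]
                    + 4 * eta[j + 1] - eta[j + 2]) / 16.0)
    out += eta[-2:]
    return out

sawtooth = [(-1.0) ** j for j in range(12)]   # pure odd-even instability
ramp = [0.5 * j for j in range(12)]           # smooth (linear) profile
```

The filter maps the sawtooth to zero at every interior node while leaving the linear profile untouched — exactly the selectivity needed to stabilise the free-surface time stepping without distorting the resolved waves.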

Relevance: 10.00%

Abstract:

We provide new evidence on sea surface temperature (SST) variations and paleoceanographic/paleoenvironmental changes over the past 1500 years for the north Aegean Sea (NE Mediterranean). The reconstructions are based on multiproxy analyses of the high-resolution (decadal to multi-decadal) marine record M2, retrieved from the Athos basin. Reconstructed SSTs show an increase from ca. 850 to 950 AD and from ca. 1100 to 1300 AD. A cooling phase of almost 1.5 °C is observed from ca. 1600 to 1700 AD. This seems to have been the starting point of a continuous SST warming trend that persisted until the end of the reconstructed period, interrupted by two prominent cooling events at 1832 ± 15 AD and 1995 ± 1 AD. Application of an adaptive kernel smoothing suggests that the current warming in the reconstructed SSTs of the north Aegean might be unprecedented in the context of the past 1500 years. Internal variability in atmospheric/oceanic circulation systems as well as external forcing such as solar radiation and volcanic activity could have affected temperature variations in the north Aegean Sea over this period. The marked temperature drop of approximately 2 °C at 1832 ± 15 AD could be related to the 1809 AD 'unknown' and the 1815 AD Tambora volcanic eruptions. Paleoenvironmental proxy-indices of the M2 record show enhanced riverine/continental inputs in the northern Aegean after ca. 1450 AD. The paleoclimatic evidence derived from the M2 record is combined with a socio-environmental study of the history of the north Aegean region. We show that the cultivation of temperature-sensitive crops, i.e. walnut, vine and olive, co-occurred with stable and warmer temperatures, while its end coincided with a significant episode of cooler temperatures. Periods of agricultural growth in Macedonia coincide with periods of warmer and more stable SSTs, but further exploration is required to identify the causal links behind the observed phenomena. The Black Death likely caused major changes in agricultural activity in the north Aegean region, as reflected in the pollen data from land sites of Macedonia and in the M2 proxy-reconstructions. Finally, we conclude that the early modern peaks in mountain vegetation in the Rhodope and Macedonia highlands, visible also in the M2 record, were very likely climate-driven.
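Kernel smoothing of the kind applied to the SST reconstruction can be sketched in its fixed-bandwidth form (the study uses an adaptive variant, and the anomaly values below are hypothetical):

```python
import math

# Nadaraya-Watson kernel smoothing: each smoothed value is a Gaussian-weighted
# average of neighbouring observations, with bandwidth h in the units of the
# time axis (here, years).

def kernel_smooth(times, values, h):
    smoothed = []
    for t in times:
        weights = [math.exp(-0.5 * ((t - ti) / h) ** 2) for ti in times]
        total = sum(weights)
        smoothed.append(sum(w * v for w, v in zip(weights, values)) / total)
    return smoothed

# Hypothetical decadal SST anomalies (deg C).
years = list(range(1800, 1900, 10))
ssts = [0.1, 0.0, -0.2, -0.1, 0.0, 0.2, 0.1, 0.3, 0.4, 0.5]
trend = kernel_smooth(years, ssts, h=30.0)
```

Larger bandwidths average over longer intervals and suppress decadal wiggles; an adaptive version, as used in the study, lets the bandwidth vary with the local data density.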

Relevance: 10.00%

Abstract:

Chemical stratigraphy, the study of the variation of chemical elements within sedimentary sequences, has gradually become an established tool in the research and correlation of global geologic events. In this paper, 87Sr/86Sr ratios of the Triassic marine carbonates (Muschelkalk facies) of the southeast Iberian Ranges, Iberian Peninsula, are presented, and the representative Sr-isotopic curve is constructed for the upper Ladinian interval. The studied stratigraphic succession is 102 meters thick, continuous, and well preserved. Previous paleontological data from macro- and microfossil assemblages (ammonites, bivalves, foraminifera, conodonts) and palynological assemblages suggest a Fassanian-Longobardian age (Late Ladinian). Although diagenetic minerals are present in small amounts, the elemental content of the bulk carbonate samples, especially the Sr content, shows a major variation that probably reflects palaeoenvironmental changes. The 87Sr/86Sr curve rises from 0.707649 near the base of the section to 0.707741, then declines rapidly to 0.707624, with a final rise to 0.70787 in the upper part. The data up to meter 80 of the studied succession broadly agree with 87Sr/86Sr ratios of sequences of similar age and complement those data. Moreover, the sequence stratigraphic framework and its key surfaces, which are difficult to recognise from facies analysis alone, are characterised by combining variations of the Ca, Mg, Mn, Sr and CaCO3 contents.

Relevance: 10.00%

Abstract:

Estimates of HIV prevalence are important for policy in order to establish the health status of a country's population and to evaluate the effectiveness of population-based interventions and campaigns. However, participation rates in testing for surveillance conducted as part of household surveys, on which many of these estimates are based, can be low. HIV positive individuals may be less likely to participate because they fear disclosure, in which case estimates obtained using conventional approaches to deal with missing data, such as imputation-based methods, will be biased. We develop a Heckman-type simultaneous equation approach which accounts for non-ignorable selection, but unlike previous implementations, allows for spatial dependence and does not impose a homogeneous selection process on all respondents. In addition, our framework addresses the issue of separation, where for instance some factors are severely unbalanced and highly predictive of the response, which would ordinarily prevent model convergence. Estimation is carried out within a penalized likelihood framework where smoothing is achieved using a parametrization of the smoothing criterion which makes estimation more stable and efficient. We provide the software for straightforward implementation of the proposed approach, and apply our methodology to estimating national and sub-national HIV prevalence in Swaziland, Zimbabwe and Zambia.
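The penalised-likelihood idea that stabilises the estimation can be shown in its simplest form: a one-parameter logistic log-likelihood with a ridge penalty, maximised by Newton's method. This is purely illustrative of how a penalty regularises the fit — it is not the authors' selection model, and the data are toy values:

```python
import math

# Penalised logistic log-likelihood (one slope, no intercept):
#   l(beta) = sum_i [ y_i*x_i*beta - log(1 + exp(x_i*beta)) ] - lam * beta^2
# maximised by Newton's method.  The ridge penalty shrinks the estimate and
# keeps the optimisation well conditioned, the role smoothing penalties play
# in the richer model of the paper.

def penalised_logit(xs, ys, lam, iters=50):
    beta = 0.0
    for _ in range(iters):
        ps = [1.0 / (1.0 + math.exp(-beta * x)) for x in xs]
        grad = sum(x * (y - p) for x, y, p in zip(xs, ys, ps)) - 2.0 * lam * beta
        hess = sum(x * x * p * (1 - p) for x, p in zip(xs, ps)) + 2.0 * lam
        beta += grad / hess  # Newton step (hess > 0, so always well defined)
    return beta

xs = [-2.0, -1.0, -0.5, 1.0, 2.0]
ys = [0, 0, 1, 1, 1]           # the x = -0.5 case keeps the MLE finite
b_mle = penalised_logit(xs, ys, lam=0.0)
b_pen = penalised_logit(xs, ys, lam=5.0)
```

With severely unbalanced (near-separable) predictors, such as those the abstract mentions, the unpenalised estimate diverges; the penalty term is what keeps it finite and the iterations stable.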

Relevance: 10.00%

Abstract:

Inventory-related costs usually represent a significant share of a company's total assets. Despite this, companies generally pay little attention to them, even though the benefits of effective inventory management are obvious in terms of less tied-up capital, increased customer satisfaction, and a better working environment. Permobil AB, Timrå, is in an intense period of revenue growth: the production unit is aiming for a 30 % increase in output over the next two years. To make this possible, the company has to improve the way it distributes and handles material. The purpose of this study is to provide useful information and concrete proposals for action, so that the company can build a strategy for effective and sustainable inventory management. Alternative forecasting methods are suggested in order to reach a more nuanced view of the different articles and of how they should be managed. The Analytic Hierarchy Process (AHP) was used to let specially selected persons decide the criteria by which articles should be valued; the criteria they agreed on were annual volume value, lead time, frequency rate, and purchase price. The other proposed method was a two-dimensional model in which annual volume value and frequency determine the class in which an article is placed. Both methods resulted in significant changes compared with the current solution. For the spare-part inventory, different forecasting methods were tested and compared with the current solution. The current forecasting method turned out to perform worse than both a moving average and exponential smoothing with trend. The small sample of ten random articles is not large enough to reject the current solution, but the result is still reason enough for the company to monitor the quality of its forecasts.
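The two-dimensional model — class assignment from annual volume value and frequency — can be sketched as a small lookup. The thresholds, article names, and figures below are hypothetical; the study derived its actual criteria through AHP sessions:

```python
# Two-dimensional article classification: rank each article on annual volume
# value and on pick frequency, then let the better rank on either axis decide
# the class.  Thresholds and article data are illustrative only.

def classify(annual_volume_value, frequency,
             value_cut=(100_000, 10_000), freq_cut=(500, 50)):
    value_rank = 0 if annual_volume_value >= value_cut[0] else \
                 1 if annual_volume_value >= value_cut[1] else 2
    freq_rank = 0 if frequency >= freq_cut[0] else \
                1 if frequency >= freq_cut[1] else 2
    # The best rank on either axis dominates: a cheap but very frequently
    # picked article still deserves class-A handling.
    return "ABC"[min(value_rank, freq_rank)]

articles = {
    "joystick module": (250_000, 120),  # high value, medium frequency
    "seat cushion":    (40_000, 900),   # medium value, high frequency
    "battery pack":    (60_000, 80),    # medium value, medium frequency
    "spare caster":    (5_000, 30),     # low value, low frequency
}
classes = {name: classify(v, f) for name, (v, f) in articles.items()}
```

Compared with a one-dimensional ABC split on volume value alone, this assignment promotes high-frequency articles such as the seat cushion, which is exactly the more nuanced view of the assortment the study argues for.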