920 results for Estimation Of Distribution Algorithms
Abstract:
The paper investigates whether Hungarian monetary policy has taken country risk into account in its interest-rate decisions and, if so, how. The answer is sought with the most common tool for analysing a country's monetary policy: estimating Taylor rules that describe it. The estimation was carried out with several risk indicators and several types of Taylor rules. As a sensitivity analysis, measures of inflation and the output gap other than those in the baseline specification were also employed. The results show that the interest-rate decisions of the National Bank of Hungary are well described by a flexible inflation targeting regime: in the Taylor rules, the deviation of inflation from its target plays a significant role, and in some of the rules the output gap is also significant. The decision-makers also considered country risk and responded to an increase in it by raising interest rates. Inserting country risk into the Taylor rule can substantially improve the rule's fit when an appropriate risk measure is chosen.
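The kind of specification estimated here can be written schematically as a smoothed interest-rate rule augmented with a risk term. The form below is an illustrative sketch in our own notation (the coefficient names and the smoothing structure are assumptions, not the paper's reported equation):

```latex
% Risk-augmented Taylor rule with interest-rate smoothing (illustrative form):
i_t = \rho\, i_{t-1} + (1-\rho)\left[\alpha + \beta(\pi_t - \pi^{*}) + \gamma\, x_t + \delta\, \mathit{risk}_t\right] + \varepsilon_t
% i_t: policy rate; \pi_t - \pi^{*}: deviation of inflation from target;
% x_t: output gap; risk_t: country-risk measure. The finding that rates
% rise with country risk corresponds to \delta > 0.
```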
Abstract:
The aim of this paper is to give an analytical overview of the operational risks of supply chains, especially those arising from the outsourcing of distribution. Based on a literature review, the first part explores the reasons behind the growing risk exposure of supply chains and briefly presents the possible steps of corporate risk management in this area. Drawing on semi-structured, in-depth qualitative interviews, the second part summarizes and systematizes the risks associated with outsourcing distribution, surveys the related risk-management options, and presents the risk-prevention alternatives applied by the companies interviewed.
Abstract:
The market model is the most frequently estimated model in financial economics and has proven extremely useful in the estimation of systematic risk. In this era of rapid globalization of financial markets there has been a substantial increase in cross listings of stocks in foreign and regional capital markets. As many as a third to a half of the stocks on some major exchanges are foreign listed. The multiple listing of stocks has major implications for the estimation of systematic risk. The traditional method of estimating the market model using data from only one market leads to misleading estimates of beta. This study demonstrates that the estimator for systematic risk, and the methodology itself, changes when stocks are listed in multiple markets. General expressions are developed to obtain the estimator of global beta under a variety of assumptions about the error terms of the market models for different capital markets. The assumptions pertain both to the volatilities of the abnormal returns in each market and to the relationship between the markets. Explicit expressions are derived for the estimation of global systematic risk (beta) when the returns are homoscedastic, and also under different heteroscedastic conditions within and/or between markets. These results for the estimation of global beta are further extended to the case where the return generating process follows an autoregressive scheme.
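A minimal sketch of the setting: the market model in each market where the stock is listed, sharing a common global beta pooled across markets. The pooled estimator below is written in our own notation under the simplest (homoscedastic, uncorrelated) error assumptions; the study derives analogous expressions for the heteroscedastic and autoregressive cases.

```latex
% Market model in market k (k = 1, ..., K markets where the stock is listed):
R^{(k)}_{i,t} = \alpha^{(k)}_i + \beta_g\, R^{(k)}_{m,t} + \varepsilon^{(k)}_t
% Pooled (global) estimator under homoscedastic, uncorrelated errors:
\hat{\beta}_g =
\frac{\sum_{k=1}^{K}\sum_{t}\left(R^{(k)}_{m,t}-\bar{R}^{(k)}_{m}\right)\left(R^{(k)}_{i,t}-\bar{R}^{(k)}_{i}\right)}
     {\sum_{k=1}^{K}\sum_{t}\left(R^{(k)}_{m,t}-\bar{R}^{(k)}_{m}\right)^{2}}
% With heteroscedastic errors across markets, each market's sums are
% weighted by 1/\sigma_k^2, i.e., the GLS analogue of the expression above.
```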
Abstract:
Annual average daily traffic (AADT) is important information for many transportation planning, design, operation, and maintenance activities, as well as for the allocation of highway funds. Many studies have attempted AADT estimation using the factor approach, regression analysis, time series, and artificial neural networks. However, these methods are unable to account for the spatially variable influence of independent variables on the dependent variable, even though it is well known that spatial context is important to many transportation problems, including AADT estimation. In this study, applications of geographically weighted regression (GWR) methods to estimating AADT were investigated. The GWR based methods considered the influence of correlations among the variables over space and the spatial non-stationarity of the variables. A GWR model allows different relationships between the dependent and independent variables to exist at different points in space. In other words, model parameters vary from location to location, and the locally linear regression parameters at a point are affected more by observations near that point than by observations farther away. The study area was Broward County, Florida. Broward County lies on the Atlantic coast between Palm Beach and Miami-Dade counties. In this study, a total of 67 variables were considered as potential AADT predictors, and six variables (lanes, speed, regional accessibility, direct access, density of roadway length, and density of seasonal households) were selected to develop the models. To investigate the predictive powers of the various AADT predictors over space, statistics including local r-square, local parameter estimates, and local errors were examined and mapped. The local variations in relationships among parameters were investigated, measured, and mapped to assess the usefulness of GWR methods. The results indicated that the GWR models were able to better explain the variation in the data and to predict AADT with smaller errors than ordinary linear regression models for the same dataset. Additionally, GWR was able to model the spatial non-stationarity in the data, i.e., the spatially varying relationship between AADT and its predictors, which cannot be modeled in ordinary linear regression.
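A minimal sketch of the GWR fitting step described above: one locally weighted least-squares fit per location, with weights that decay with distance. The Gaussian fixed-bandwidth kernel, the variable names, and the synthetic data are illustrative assumptions, not the study's specification.

```python
import numpy as np

def gwr_fit(X, y, coords, bandwidth):
    """Geographically weighted regression: one local weighted least-squares
    fit per observation point, using a Gaussian distance-decay kernel.
    X: (n, p) predictors (e.g., lanes, speed, accessibility); y: (n,) AADT;
    coords: (n, 2) point locations; bandwidth: kernel bandwidth (map units)."""
    n, p = X.shape
    Xd = np.column_stack([np.ones(n), X])          # add intercept column
    local_betas = np.empty((n, p + 1))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)  # distances to point i
        w = np.exp(-0.5 * (d / bandwidth) ** 2)         # Gaussian kernel weights
        W = np.diag(w)
        # Local weighted least squares: beta_i = (X'WX)^-1 X'Wy
        local_betas[i] = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
    return local_betas  # one coefficient vector per location

# Illustrative use on synthetic data with a spatially varying coefficient
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(50, 2))
X = rng.normal(size=(50, 2))
y = 3 + coords[:, 0] * X[:, 0] - X[:, 1] + rng.normal(size=50)
betas = gwr_fit(X, y, coords, bandwidth=2.0)
```

Mapping each column of `betas` over the study area is what yields the local parameter estimates and local r-square surfaces the study examines.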
Abstract:
With the rapid globalization and integration of world capital markets, more and more stocks are listed in multiple markets. With multi-listed stocks, the traditional measure of systematic risk, the domestic beta, is not appropriate since it contains information from only one market. Prakash et al. (1993) developed a technique, the global beta, to capture information from the multiple markets in which the stocks are listed. In this study, global betas as well as domestic betas are obtained for 704 multi-listed stocks from 59 world equity markets. Welch tests show that domestic betas are not equal across markets; therefore, the global beta is more appropriate in a global investment setting. The traditional Capital Asset Pricing Model (CAPM) is also tested with respect to both domestic beta and global beta. The results generally support a positive relationship between stock returns and global beta, while tending to reject this relationship between stock returns and domestic beta. Further tests of the international CAPM with domestic beta and global beta strengthen this conclusion.
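The CAPM test reported here is conventionally run as a two-pass procedure: betas are estimated in time series, then average returns are regressed on them cross-sectionally. The form below is that standard setup in our notation, stated as an assumption about the study's procedure rather than its exact specification:

```latex
% Second-pass cross-sectional regression of the two-pass CAPM test:
\bar{R}_i = \lambda_0 + \lambda_1\, \hat{\beta}_i + u_i
% CAPM support requires a significantly positive risk premium \lambda_1.
% The study's results correspond to \lambda_1 > 0 when \hat{\beta}_i is
% the global beta, but not reliably when it is the domestic beta.
```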
Abstract:
Accurate knowledge of the time since death, or postmortem interval (PMI), has enormous legal, criminological, and psychological impact. In this study, an investigation was made to determine whether the relationship between the degradation of the human cardiac structural protein Cardiac Troponin T and PMI could be used as an indicator of time since death, thus providing a rapid, high-resolution, sensitive, and automated methodology for the determination of PMI. The use of Cardiac Troponin T (cTnT), a protein found in heart tissue, as a selective marker for cardiac muscle damage has shown great promise in the determination of PMI. An optimized conventional immunoassay method was developed to quantify intact and fragmented cTnT. A small sample of cardiac tissue, which is less affected than other tissues by external factors, was taken, homogenized, extracted with magnetic microparticles, separated by SDS-PAGE, and visualized by Western blot, probing with a monoclonal antibody against cTnT, followed by labeling and detection with available scanners. This conventional immunoassay provides proper detection and quantitation of the cTnT protein in cardiac tissue as a complex matrix; however, it does not provide the analyst with immediate results. Therefore, a competitive separation method using capillary electrophoresis with laser-induced fluorescence (CE-LIF) was developed to study the interaction between the human cTnT protein and a monoclonal anti-Troponin T antibody. Analysis of the results revealed a linear relationship between the percentage of degraded cTnT and the log of the PMI, indicating that intact cTnT can be detected in human heart tissue up to 10 days postmortem at room temperature and beyond two weeks at 4°C. The data presented demonstrate that this technique can provide an extended time range during which PMI can be estimated more accurately than with currently used methods, and that it represents a major advance in time-of-death determination through a fast, reliable, semi-quantitative measurement of a biochemical marker from an organ protected from outside factors.
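The reported calibration is log-linear in time. Schematically, with fitted constants a and b (our notation; the study reports the relationship, not these symbols):

```latex
% Log-linear calibration between cTnT degradation and time since death:
\%\,\mathrm{cTnT\ degraded} = a + b \log(\mathrm{PMI})
% Inverting the fitted line turns a measured degradation fraction into a
% PMI estimate:
\widehat{\mathrm{PMI}} = \exp\!\left(\frac{\%\,\mathrm{degraded} - a}{b}\right)
```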
Abstract:
This study focuses on explicitly quantifying the sediment budget of deeply incised ravines in the lower Le Sueur River watershed in southern Minnesota. High-rate gully-erosion equations, along with the Universal Soil Loss Equation (USLE), were implemented in a numerical modeling approach based on a time-integration of the sediment balance equations. The model estimates the rates of ravine width and depth change and the amount of sediment periodically flushed from the ravines. Components of the sediment budget of the ravines were simulated with the model, and the results suggest that the ravine walls are the major sediment source in the ravines. A sensitivity analysis revealed that the erodibility coefficients of the gully bed and wall, the local slope angle, and the Manning's coefficient are the key parameters controlling the rate of sediment production. Recommendations to guide further monitoring efforts in the watershed and more detailed modeling approaches are highlighted as a result of this modeling effort.
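For reference, the USLE term used for the upland contribution has its standard multiplicative form, and the time-integration rests on a sediment balance; the balance notation below is ours, a schematic rather than the study's equations:

```latex
% Universal Soil Loss Equation (standard form):
A = R \cdot K \cdot LS \cdot C \cdot P
% A: average annual soil loss per unit area; R: rainfall erosivity;
% K: soil erodibility; LS: slope length-steepness factor;
% C: cover management; P: support practice.
% Schematic sediment balance integrated through time (notation assumed):
\frac{dS}{dt} = Q_{\mathrm{walls}} + Q_{\mathrm{bed}} + Q_{\mathrm{upland}} - Q_{\mathrm{out}}
```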
Abstract:
Taylor Slough is one of the natural freshwater contributors to Florida Bay through a network of microtidal creeks crossing the Everglades Mangrove Ecotone Region (EMER). The ecological function of the EMER is critical, since it mediates freshwater and nutrient inputs and controls water quality in eastern Florida Bay. Furthermore, this region is vulnerable to changing hydrodynamics and nutrient loadings resulting from upstream freshwater management practices proposed by the Comprehensive Everglades Restoration Plan (CERP), currently the largest wetland restoration project in the USA. Despite the hydrological importance of Taylor Slough in the water budget of Florida Bay, there are no fine-scale (~1 km²) hydrodynamic models of this system that can be used as a tool to evaluate potential changes in water flow, salinity, and water quality. Taylor River is one of the major creeks draining Taylor Slough freshwater into Florida Bay. We performed a water budget analysis for the Taylor River area based on long-term hydrologic data (1999–2007), supplemented by hydrodynamic modeling with a MIKE FLOOD (DHI, http://dhigroup.com/) model to evaluate groundwater and overland water discharges. The seasonal hydrologic characteristics are very distinctive (the average Taylor River wet- vs. dry-season outflow ratio was 6 to 1 during 1999–2006), with pronounced interannual variability of flow. The water budget shows a net dominance of through-flow in the tidal mixing zone, while local precipitation and evapotranspiration play only a secondary role, at least in the wet season. During the dry season, the tidal flood reaches the upstream boundary of the study area on approximately 80 days per year on average. The groundwater field measurements indicate a mostly upward-oriented leakage, which possibly equals the evapotranspiration term. The model results suggest that groundwater contributes importantly to water salinity in the EMER. The model performance is satisfactory during the dry season, when surface flow in the area is confined to the Taylor River channel. The model also provided guidance on the importance of capturing the overland flow component, which enters the area as sheet flow during the rainy season. Overall, the modeling approach is suitable for reaching a better understanding of the water budget of the mangrove region. However, more detailed field data are needed to confirm model predictions by further calibrating the overland flow parameters.
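The water budget analysis balances the usual storage and flux terms for the Taylor River control volume; written schematically in our notation (the paper reports the terms, not this exact form):

```latex
% Schematic water budget over the control volume (notation assumed):
\frac{\Delta S}{\Delta t} = P - ET + Q_{\mathrm{in}} - Q_{\mathrm{out}} + G
% P: precipitation; ET: evapotranspiration; Q_in, Q_out: surface through-flow;
% G: net groundwater leakage (measured as mostly upward, possibly offsetting ET).
% The finding of through-flow dominance means Q_in and Q_out are the leading terms.
```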
Abstract:
This paper addresses the issues hotel operators face in identifying effective means of allocating rooms through various electronic channels of distribution. Relying upon the theory of coercive isomorphism, a think tank was constructed to identify and define the electronic channels of distribution currently being utilized in the hotel industry. Through two full-day focus groups consisting of key hotel executives and industry practitioners, distribution channels were identified, as were the challenges and solutions associated with each.
Abstract:
Tall buildings are wind-sensitive structures and can experience high wind-induced effects. Aerodynamic boundary layer wind tunnel testing has been the most commonly used method for estimating wind effects on tall buildings. Design wind effects on tall buildings are estimated through analytical processing of the data obtained from aerodynamic wind tunnel tests. Even though it is widely agreed that the data obtained from wind tunnel testing are fairly reliable, the post-test analytical procedures are still argued to have remarkable uncertainties. This research work assessed in detail the uncertainties occurring at different stages of the post-test analytical procedures and suggests improved techniques for reducing them. Results of the study showed that traditionally used simplifying approximations, particularly in the frequency domain approach, can cause significant uncertainties in estimating aerodynamic wind-induced responses. Based on the identified shortcomings, a more accurate dual aerodynamic data analysis framework that works in both the frequency and time domains was developed. The comprehensive analysis framework allows modal, resultant, and peak values of various wind-induced responses of a tall building to be estimated more accurately. Estimating design wind effects on tall buildings also requires synthesizing the wind tunnel data with local climatological data for the study site. A novel copula-based approach was developed for accurately synthesizing aerodynamic and climatological data, upon investigating the causes of significant uncertainties in currently used synthesizing techniques. The improvement of the new approach over existing techniques is also illustrated with a case study on a 50-story building. Finally, a practical dynamic optimization approach is suggested for tuning the structural properties of tall buildings toward optimum performance against wind loads with fewer design iterations.
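The frequency domain approach whose approximations are examined typically computes the resonant response variance from the generalized force spectrum through the mechanical admittance; schematically, in the standard random-vibration form (our notation, not the dissertation's):

```latex
% Variance of a modal response from the generalized force spectrum:
\sigma_r^2 = \int_0^{\infty} |H(f)|^2\, S_F(f)\, df,
\qquad
|H(f)|^2 = \frac{1}{K^2\left[\left(1-(f/f_0)^2\right)^2 + \left(2\zeta f/f_0\right)^2\right]}
% f_0: natural frequency; \zeta: damping ratio; K: generalized stiffness.
% A peak response is then commonly estimated as g\,\sigma_r with a peak factor g.
```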
Abstract:
Water-alternating-gas (WAG) is an enhanced oil recovery method combining the improved macroscopic sweep of water flooding with the improved microscopic displacement of gas injection. The optimal design of the WAG parameters is usually based on numerical reservoir simulation via trial and error, limited by the reservoir engineer's availability. Employing optimisation techniques can guide the simulation runs and reduce the number of function evaluations. In this study, robust evolutionary algorithms are utilized to optimise hydrocarbon WAG performance in the E-segment of the Norne field. The first objective function is the net present value (NPV), and two global semi-random search strategies, a genetic algorithm (GA) and particle swarm optimisation (PSO), are tested on case studies with different numbers of controlling variables, sampled from the set of water and gas injection rates, bottom-hole pressures of the oil production wells, cycle ratio, cycle time, composition of the injected hydrocarbon gas (miscible/immiscible WAG), and the total WAG period. In progressive experiments, the number of decision-making variables is increased, raising the problem complexity while potentially improving the efficacy of the WAG process. The second objective function is the incremental recovery factor (IRF) within a fixed total WAG simulation time, optimised using the same algorithms. The results from the two optimisation techniques are analysed, and their performance, convergence speed, and the quality of the optimal solutions found in multiple trials are compared for each experiment. The distinctions between the optimal WAG parameters resulting from NPV and oil recovery optimisation are also examined. This is the first known work optimising over this complete set of WAG variables, and the first use of PSO to optimise a WAG project at the field scale. Compared to the reference cases, the best overall values of the objective functions found by GA and PSO were 13.8% and 14.2% higher, respectively, when NPV was optimised over all the above variables, and 14.2% and 16.2% higher, respectively, when IRF was optimised.
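As a sketch of the PSO side of the comparison: a minimal particle swarm maximizing a black-box objective. The stand-in `npv(x)` function, the bounds, the swarm size, and the inertia/acceleration coefficients are illustrative assumptions, not the study's settings; in the study each evaluation is a full reservoir simulation run.

```python
import numpy as np

def pso_maximize(objective, bounds, n_particles=20, n_iters=100, seed=0):
    """Minimal particle swarm optimisation for a black-box objective
    (e.g., NPV or IRF returned by a reservoir simulation of a WAG design).
    bounds: (d, 2) array of [low, high] per decision variable, such as
    injection rates, BHPs, cycle ratio/time, gas composition, WAG period."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    d = len(bounds)
    x = rng.uniform(lo, hi, size=(n_particles, d))   # particle positions
    v = np.zeros((n_particles, d))                   # particle velocities
    pbest = x.copy()                                 # personal bests
    pbest_val = np.array([objective(p) for p in x])
    g = pbest[np.argmax(pbest_val)].copy()           # global best
    w, c1, c2 = 0.7, 1.5, 1.5                        # inertia, cognitive, social
    for _ in range(n_iters):
        r1 = rng.random((n_particles, d))
        r2 = rng.random((n_particles, d))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                   # respect variable bounds
        vals = np.array([objective(p) for p in x])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmax(pbest_val)].copy()
    return g, pbest_val.max()

# Toy stand-in for the simulator-evaluated objective (hypothetical function):
npv = lambda x: -np.sum((x - 0.3) ** 2)
best_x, best_val = pso_maximize(npv, bounds=np.tile([0.0, 1.0], (5, 1)))
```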
Abstract:
The quantitative diatom analysis of 218 surface sediment samples recovered in the Atlantic and western Indian sectors of the Southern Ocean is used to define a reference database for paleotemperature estimation from diatom assemblages using the Imbrie and Kipp transfer function method. The criteria that justify the exclusion of samples and species from the raw data set in order to define a reference database are outlined and discussed. Sensitivity tests with eight data sets were carried out to evaluate the effects of overall dominance of single species, different methods of species abundance ranking, and no-analog conditions (e.g., Eucampia antarctica) on the estimated paleotemperatures. The defined transfer functions were applied to a sediment core from the northern Antarctic zone. Overall dominance of Fragilariopsis kerguelensis in the diatom assemblages resulted in a close affinity between the paleotemperature curve and the relative abundance pattern of this species downcore. Logarithmic conversion of the counting data, applied along with other ranking methods to compensate for the dominance of F. kerguelensis, yielded the best statistical results. A reliable diatom transfer function for future paleotemperature estimations is presented.
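The Imbrie and Kipp method reduces the samples-by-species abundance matrix by Q-mode factor analysis and then regresses modern temperatures on the sample factor loadings; schematically, in our notation (a sketch of the method's structure, not this study's fitted equations):

```latex
% Q-mode factor decomposition of the (samples x species) abundance matrix:
U \approx B\,F
% B: (samples x m) varimax factor loadings; F: (m x species) factor scores.
% Calibration regression against modern sea-surface temperature:
T_i = k_0 + \sum_{j=1}^{m} k_j\, b_{ij} + e_i
% Downcore samples are projected onto F to obtain their loadings b_{ij};
% the fitted regression then yields the paleotemperature estimate.
```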