877 results for Acceleration data structure
Abstract:
The purpose of this paper is to carry out a quantitative and qualitative analysis of foreign citizens who may participate in the Swedish labor market (referred to in the text as 'immigrants'). The research covers the period 1973-2005 and gives predicted figures for the immigrant population, its age and gender structure, and its educational attainment in 2010. To cope with data on immigrants from different countries, the population was divided into six groups. The main chapter is divided into two parts. The first part specifies the division of immigrants into groups by country of origin according to geographical, ethnic, economic and historical criteria. Brief characteristics, geographic position, and descriptions of dynamics and structure are given for each group; a historical review explains rapid changes in the immigrant population. Statistical models for describing and estimating the future population are given. The second part specifies the education and qualification level of the immigrants according to international and Swedish standards. Models for estimating the age and gender structure, level of education and professional orientation of immigrants in the different groups are given. Inferences are made regarding the ethnic, gender and education structure of immigrants, and the distribution of immigrants among Swedish counties is given. The discussion part presents the results of the research and offers perspectives for the future, along with a brief evaluation of the role of immigrants in the Swedish labor market.
Abstract:
In this research the 3DVAR data assimilation scheme is implemented in the numerical model DIVAST in order to optimize the performance of the numerical model by selecting an appropriate turbulence scheme and tuning its parameters. Two turbulence closure schemes, the Prandtl mixing length model and the two-equation k-ε model, were incorporated into DIVAST and examined with respect to their universality of application, complexity of solution, computational efficiency and numerical stability. A square harbour with one symmetrical entrance subject to tide-induced flows was selected to investigate the structure of turbulent flows. The experimental part of the research was conducted in a tidal basin. A significant advantage of such a laboratory experiment is a fully controlled environment in which domain setup and forcing are user-defined. The research shows that the Prandtl mixing length model and the two-equation k-ε model, with default parameterization predefined according to literature recommendations, overestimate eddy viscosity, which in turn results in a significant underestimation of velocity magnitudes in the harbour. Assimilation of the model-predicted velocity and laboratory observations significantly improves model predictions for both turbulence models by adjusting modelled flows in the harbour to match de-errored observations. 3DVAR also makes it possible to identify and quantify shortcomings of the numerical model. Such comprehensive analysis gives an optimal solution from which numerical model parameters can be estimated. The process of turbulence model optimization by reparameterization and tuning towards the optimal state led to new constants that may potentially be applied to complex turbulent flows, such as rapidly developing flows or recirculating flows.
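For reference, a minimal sketch of the standard 3DVAR cost function and the Prandtl mixing-length eddy viscosity assumed here (the thesis's exact operators, error covariances and mixing-length definition are not given in the abstract):

\[
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\top}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
 + \tfrac{1}{2}\big(\mathbf{y}-H(\mathbf{x})\big)^{\top}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big),
\qquad
\nu_t = l_m^2 \left|\frac{\partial u}{\partial y}\right|,
\]

where \(\mathbf{x}_b\) is the background (model) state, \(\mathbf{y}\) the laboratory observations, \(H\) the observation operator, \(\mathbf{B}\) and \(\mathbf{R}\) the background and observation error covariances, and \(l_m\) the mixing length targeted by the parameter tuning.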
Numerical Simulation Of Sediment Transport And Bed Morphology Around A Hydraulic Structure On A River
Abstract:
Scour around hydraulic structures is a critical problem in hydraulic engineering. Underprediction of scour depth may lead to costly failures of the structure, while overprediction might result in unnecessary costs. Unfortunately, up-to-date empirical scour prediction formulas are based on laboratory experiments that are not always able to reproduce field conditions due to the complicated geometry of rivers and the temporal and spatial scales of a physical model. However, computational fluid dynamics (CFD) tools can operate with real field dimensions and operating conditions to predict sediment scour around hydraulic structures. In Korea, after completion of the Four Major Rivers Restoration Project, several new weirs were built across the Han, Nakdong, Geum and Yeongsan Rivers. Consequently, sediment deposition and bed erosion around such structures have become a major issue in these four rivers. In this study, the open source CFD software package TELEMAC-MASCARET is applied to simulate sediment transport and bed morphology around Gangjeong weir, the largest multipurpose weir built on the Nakdong River. The real bathymetry of the river and the geometry of the weir have been implemented in the numerical model. The numerical simulation is carried out with a real hydrograph at the upstream boundary. The bed morphology obtained from the numerical results has been validated against field observation data, and the maximum simulated scour depth is compared with the results obtained from the empirical formulas of Hoffmans. Agreement between numerical computations, observed data and empirical formulas is judged to be satisfactory on all major comparisons. The outcome of this study not only points out the locations where deposition and erosion might take place depending on the weir gate operation, but also analyzes the mechanism of formation and evolution of scour holes after the weir gates.
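As background (not stated in the abstract itself), morphodynamic modules of this kind typically evolve the bed by solving the Exner sediment mass balance coupled to the hydrodynamics; a minimal form is

\[
(1-\lambda_p)\,\frac{\partial z_b}{\partial t} + \nabla\cdot\mathbf{q}_b = 0,
\]

where \(z_b\) is the bed elevation, \(\lambda_p\) the bed porosity and \(\mathbf{q}_b\) the bed-load transport rate supplied by an empirical closure.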
Abstract:
The Short-term Water Information and Forecasting Tools (SWIFT) is a suite of tools for flood and short-term streamflow forecasting, consisting of a collection of hydrologic model components and utilities. Catchments are modeled using conceptual subareas and a node-link structure for channel routing. The tools comprise modules for calibration, model state updating, output error correction, ensemble runs and data assimilation. Given the combinatorial nature of the modelling experiments and the sub-daily time steps typically used for simulations, the volume of model configurations and time series data is substantial and its management is not trivial. SWIFT is currently used mostly for research purposes but has also been used operationally, with intersecting but significantly different requirements. Early versions of SWIFT used mostly ad-hoc text files handled via Fortran code, with limited use of netCDF for time series data. The configuration and data handling modules have since been redesigned. The model configuration now follows a design in which the data model is decoupled from the on-disk persistence mechanism. For research purposes the preferred on-disk format is JSON, to leverage numerous software libraries in a variety of languages, while retaining the legacy option of custom tab-separated text formats when that is the preferred access arrangement for the researcher. By decoupling the data model from data persistence, it is much easier to interchangeably use, for instance, relational databases to provide stricter provenance and audit-trail capabilities in an operational flood forecasting context. For the time series data, given the volume and required throughput, text-based formats are usually inadequate. A schema derived from the CF conventions has been designed to handle time series efficiently for SWIFT.
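A minimal sketch of the decoupling described above, in Python and for illustration only (SWIFT itself is not written this way; all class and function names here are hypothetical):

import json
from abc import ABC, abstractmethod
from dataclasses import dataclass, asdict

@dataclass
class CatchmentConfig:
    # In-memory data model: knows nothing about how it is stored.
    name: str
    subareas: list
    node_links: list

class ConfigStore(ABC):
    # Persistence interface; JSON, tab-separated text or a relational
    # database backend can all implement it without changing the model.
    @abstractmethod
    def save(self, cfg: CatchmentConfig) -> None: ...
    @abstractmethod
    def load(self) -> CatchmentConfig: ...

class JsonConfigStore(ConfigStore):
    def __init__(self, path: str):
        self.path = path
    def save(self, cfg: CatchmentConfig) -> None:
        with open(self.path, "w") as f:
            json.dump(asdict(cfg), f, indent=2)
    def load(self) -> CatchmentConfig:
        with open(self.path) as f:
            return CatchmentConfig(**json.load(f))

# Usage: a research workflow persists to JSON; an operational deployment
# could swap in a database-backed ConfigStore without touching the model.
store = JsonConfigStore("catchment.json")
store.save(CatchmentConfig("demo", subareas=["a1"], node_links=[["a1", "outlet"]]))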
Abstract:
Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, managing risk, and studying monetary policy implications. In turn, dynamic term structure models, equipped with stronger economic structure, have mainly been adopted to price derivatives and explain empirical stylized facts. In this paper, we combine flavors of these two classes of models to test whether no-arbitrage restrictions affect forecasting. We construct cross-sectional (allowing arbitrage) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve overall smaller biases and root mean square errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding return premia indicates that the superior performance of the no-arbitrage versions is due to a better identification of the bond risk premium.
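For concreteness, the two evaluation metrics mentioned are the usual out-of-sample bias and root mean square error of the yield forecasts (standard definitions, not quoted from the paper):

\[
\text{Bias}(\tau,h) = \frac{1}{N}\sum_{t=1}^{N}\big(\hat{y}_{t+h}(\tau) - y_{t+h}(\tau)\big),
\qquad
\text{RMSE}(\tau,h) = \sqrt{\frac{1}{N}\sum_{t=1}^{N}\big(\hat{y}_{t+h}(\tau) - y_{t+h}(\tau)\big)^{2}},
\]

where \(\hat{y}_{t+h}(\tau)\) is the model forecast of the yield with maturity \(\tau\) at horizon \(h\).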
Abstract:
This thesis uses the information contained in international prices to identify parameters of trade models under imperfect competition, thereby allowing inference about export behavior, about the gains from trade of trade liberalization, and about the variety of goods produced domestically. First, we investigate the long-run exchange-rate pass-through to the prices charged by Brazilian exporters. The focus on the long run makes it possible to control for the effects of short-run price rigidity, so that incomplete pass-through is evidence of imperfect competition with flexible prices. Second, we compute the gains from trade of new varieties of imported goods based on estimates of disaggregated elasticities of substitution. Finally, we qualify the emphasis of the trade literature on efficiency gains rather than variety gains, showing that the variety of domestically produced goods expands after trade liberalization provided that firms have a margin of decision over intermediate goods or over the skill composition of the workforce.
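As an illustration of the first exercise (a generic specification, not the thesis's exact equation), long-run exchange-rate pass-through is typically read off a regression of export prices on the exchange rate:

\[
\ln p_t = \alpha + \beta \,\ln e_t + \gamma' \mathbf{z}_t + \varepsilon_t,
\]

where \(p_t\) is the exporter's price, \(e_t\) the nominal exchange rate, \(\mathbf{z}_t\) cost and demand controls, and \(\beta\) the long-run pass-through elasticity; \(\beta < 1\) (incomplete pass-through) is the signature of imperfect competition under flexible prices.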
Abstract:
This paper argues that changes in the returns to occupational tasks have contributed to changes in the wage distribution over the last three decades. Using Current Population Survey (CPS) data, we first show that the 1990s polarization of wages is explained by changes in wage setting between and within occupations, which are well captured by task measures linked to technological change and offshorability. Using a decomposition based on Firpo, Fortin, and Lemieux (2009), we find that technological change and deunionization played a central role in the 1980s and 1990s, while offshorability became an important factor from the 1990s onwards.
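For reference, the decomposition cited builds on recentered influence function (RIF) regressions; for the \(\tau\)-th wage quantile \(q_\tau\) the RIF is (standard formula, not specific to this paper)

\[
\mathrm{RIF}(y; q_\tau) = q_\tau + \frac{\tau - \mathbf{1}\{y \le q_\tau\}}{f_Y(q_\tau)},
\]

where \(f_Y\) is the marginal density of wages; regressing this object on task measures, union status and other covariates yields the between- and within-occupation contributions described above.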
Abstract:
This paper investigates whether there is evidence of structural change in the Brazilian term structure of interest rates. Multivariate cointegration techniques are used to verify this evidence. Two econometric models are estimated. The first one is a Vector Autoregressive Model with Error Correction Mechanism (VECM) with smooth transition in the deterministic coefficients (Ripatti and Saikkonen [25]). The second one is a VECM with abrupt structural change formulated by Hansen [13]. Two datasets were analysed. The first one contains nominal interest rates with maturities of up to three years; the second focuses on maturities of up to one year. The first dataset covers the sample period from 1995 to 2010 and the second from 1998 to 2010. The frequency is monthly. The estimated models suggest the existence of structural change in the Brazilian term structure. It was possible to document the existence of multiple regimes using both techniques for both databases. The risk premium for different spreads varied considerably during the earliest period of both samples and seemed to converge to stable and lower values at the end of the sample period. Long-term risk premiums seemed to converge to international standards, although the Brazilian term structure is still subject to liquidity problems for longer maturities.
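For reference, the baseline VECM underlying both specifications takes the standard form (the smooth-transition and Hansen variants modify the deterministic terms and allow the parameters to shift across regimes):

\[
\Delta \mathbf{y}_t = \boldsymbol{\alpha}\boldsymbol{\beta}'\mathbf{y}_{t-1} + \sum_{i=1}^{k-1}\boldsymbol{\Gamma}_i \,\Delta \mathbf{y}_{t-i} + \boldsymbol{\mu} + \boldsymbol{\varepsilon}_t,
\]

where \(\mathbf{y}_t\) stacks the interest rates, \(\boldsymbol{\beta}\) contains the cointegrating vectors and \(\boldsymbol{\alpha}\) the adjustment coefficients.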
Abstract:
The systemic financial crisis that started in 2008 in the United States had severe effects on economic activity and required the bailout of financial institutions with taxpayers' money. It also gave rise to demands for a stronger regulatory framework in order to avoid another threat to the financial market. The Dodd-Frank Act was proposed and approved in the United States in the aftermath of the crisis and brought, among many other features, the creation of the Financial Stability Oversight Council and tougher inspection of financial institutions with assets above 50 billion dollars. The objective of this work is to study the causal effect of the Dodd-Frank Act on the behavior of the treatment group subject to monitoring by the Financial Stability Oversight Council (financial institutions with assets above 50 billion dollars) regarding capital and compensation structure, in comparison with the group that was not treated. We use data from Compustat, and our empirical strategy is the Regression Discontinuity Design, not usually applied in the banking literature but very useful for the present work, since it allows us to compare the treatment group and the non-treatment group in the year of the enactment of the law (2010). No change in behavior was observed for the capital structure. In the compensation schemes, however, a decrease was found in the item "other compensation" for CEOs and CFOs. We also performed a robustness check by running a placebo test on the variables in the year before the law was enacted. No significance was found, which supports the conclusion that our main results were caused by the enactment of the DFA.
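As an illustration (a generic local-linear specification, not necessarily the exact one estimated in the paper), the regression discontinuity design compares institutions just above and just below the 50-billion-dollar asset threshold \(c\):

\[
y_i = \alpha + \tau D_i + \beta\,(A_i - c) + \gamma\, D_i (A_i - c) + \varepsilon_i,
\qquad D_i = \mathbf{1}\{A_i \ge c\},
\]

where \(A_i\) is the institution's assets, \(y_i\) a capital-structure or compensation outcome, and \(\tau\) the causal effect of falling under FSOC monitoring, estimated within a bandwidth around \(c\).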
Abstract:
This paper analyzes the demand and cost structure of the French market of academic journals, taking into account its intermediary role between researchers, who are both producers and consumers of knowledge. This two-sidedness echoes problems already observed in electronic markets (payment card systems, video game consoles, etc.), such as the chicken-and-egg problem: readers will not buy a journal if they do not expect its articles to be academically relevant, and researchers, who live under the mantra "publish or perish", will not submit to a journal with either limited public reach or a weak reputation. After merging several databases, we estimate an aggregated nested logit demand system combined simultaneously with a cost function. We identify the structural parameters of this market and find that price elasticities of demand are quite large and margins relatively low, indicating that this industry faces competitive constraints.
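For reference, aggregated nested logit demand systems of this type are usually estimated from the Berry-style inversion (standard form, assumed here rather than quoted from the paper):

\[
\ln s_{j} - \ln s_{0} = \mathbf{x}_j'\boldsymbol{\beta} - \alpha p_j + \sigma \ln s_{j|g} + \xi_j,
\]

where \(s_j\) is the journal's market share, \(s_0\) the outside-good share, \(s_{j|g}\) the share within its nest (e.g. its discipline), \(p_j\) the subscription price, and \(\sigma \in [0,1)\) the nesting parameter; the cost function is estimated jointly to recover margins.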
Abstract:
Industrial companies in developing countries are facing rapid growth, and this requires having in place the best organizational processes to cope with market demand. Sales forecasting, as a tool aligned with the general strategy of the company, needs to be as accurate as possible in order to achieve the sales targets, by making the right information available for purchasing, planning and control of production, and ultimately meeting the generated demand on time and as required. This dissertation uses a single case study of the Brazilian subsidiary of an international explosives company, Maxam, which is experiencing high growth in sales and therefore faces the challenge of adapting its structure and processes to the rapid growth expected. Diverse sales forecasting techniques were analyzed to compare the current monthly sales forecast, based on the sales representatives' market knowledge, with forecasts based on the analysis of historical sales data. The findings show how the combination of qualitative and quantitative forecasts, through a combined forecast that brings together the sales force's knowledge of client demand and time series analysis, improves the accuracy of the company's sales forecast.
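A minimal sketch of such a forecast combination (generic weighting, not the dissertation's specific scheme):

\[
\hat{y}^{\text{comb}}_{t+1} = w\,\hat{y}^{\text{qual}}_{t+1} + (1-w)\,\hat{y}^{\text{quant}}_{t+1},
\qquad 0 \le w \le 1,
\]

where \(\hat{y}^{\text{qual}}\) is the sales-force judgmental forecast, \(\hat{y}^{\text{quant}}\) the time-series forecast, and \(w\) a weight chosen, for instance, to minimize historical forecast error.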
Abstract:
This paper investigates the role of the consumption-wealth ratio in predicting future stock returns through a panel approach. We follow the theoretical framework proposed by Lettau and Ludvigson (2001), in which a model derived from a nonlinear consumer's budget constraint is used to establish the link between the consumption-wealth ratio and stock returns. Using the G7's quarterly aggregate and financial data ranging from the first quarter of 1981 to the first quarter of 2014, we set up an unbalanced panel that we use both for estimating the parameters of the cointegrating residual of the shared trend among consumption, asset wealth and labor income, cay, and for performing in-sample and out-of-sample forecasting regressions. Due to the panel structure, we propose methodologies for estimating cay and producing forecasts that differ from the one applied by Lettau and Ludvigson (2001). The results indicate that cay is in fact a strong and robust predictor of future stock returns at intermediate and long horizons, but performs poorly in predicting one- or two-quarter-ahead stock returns.
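For reference, cay is the estimated cointegrating residual of log consumption, asset wealth and labor income (standard Lettau-Ludvigson notation; the panel estimation details are left to the paper):

\[
\widehat{cay}_t = c_t - \hat{\beta}_a a_t - \hat{\beta}_y y_t,
\]

and the predictive regressions take the form \(r_{t+h} = \alpha + \gamma\,\widehat{cay}_t + u_{t+h}\) for stock returns \(r_{t+h}\) at horizon \(h\).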
Abstract:
This thesis provides information on the grouping structure, survival, abundance, dive characteristics and habitat preferences of short-finned pilot whales occurring in the oceanic archipelago of Madeira (Portugal, NE Atlantic), based on data collected between 2001 and 2011, and contributes to their conservation. Photo-identification methods and genetic analyses demonstrated that there is a large degree of variability in site fidelity, including resident, regular visitor and transient whales, and that they may not be genetically isolated. It is proposed that the pilot whales encountered in Madeira belong to a single population encompassing several clans, possibly three clans of island-associated (i.e. resident and regular visitor) whales and others of transients, each containing two to three matrilineal pods. Mark-recapture methods estimated that the island-associated community is composed of fewer than 150 individuals, that their survival rate is within the range of other long-lived cetacean species, and that around 300 whales of different residency patterns use the southern area of the island of Madeira from mid-summer to mid-autumn. No significant trend was observed between years. Time-depth recorders deployed on adult whales during daytime revealed that they spend over ¾ of their time at the surface, that they have a low diving rate, and that transient whales also forage during their passage. The analyses of visual data collected from nautical and aerial line-transect surveys indicate a core/preferred habitat area in the south-east of the island of Madeira. That area is used for resting, socializing, foraging, breeding, calving and birthing. Thus, that area should be considered an important habitat for this species, at least seasonally (during autumn) when the species is more abundant, and included in conservation plans. No direct threat requiring urgent measures was identified, although the impact of activities such as whale-watching and marine traffic should be assessed.
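As background, photo-identification mark-recapture abundance estimates rest on recapture proportions; the simplest closed-population (Lincoln-Petersen) form, shown only as an illustration since a study spanning 2001-2011 would rely on more elaborate open-population models, is

\[
\hat{N} = \frac{n_1 n_2}{m_2},
\]

where \(n_1\) is the number of individuals identified on a first sampling occasion, \(n_2\) the number on a second occasion, and \(m_2\) the number photographed on both.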
Abstract:
To provide data for conservation, selection, and expansion programs of buffalo herds, this study evaluated the history of a population of Murrah buffaloes based on population structure and the effect of inbreeding on accumulated 305-d milk yield (MY), fat yield (FY), protein yield (PY), mozzarella production (MProd), and somatic cell score (SCS). The usefulness of including the individual inbreeding coefficient (F) or the individual increase in inbreeding (Delta F) in the model to describe inbreeding depression was evaluated. Pedigree information from 8,054 animals born between 1976 and 2008 and 4,497 lactation records obtained from 12 herds were used. The realized effective population size was 40.10 +/- 1.27, and the mean F of the entire population was 2.14%. The ratio between the number of founders and ancestors demonstrated the existence of a bottleneck in the pedigree of this population, which may contribute to a reduction of genetic diversity. The effect of F on MY, FY, PY, MProd, and SCS was -1.005 kg, -0.299 kg, -0.246 kg, -1.201 kg, and -0.002 units, and the effect of Delta F transformed to equivalent F (%) for a mean of 2.57 equivalent generations was -4.287 kg, -0.581 kg, -0.383 kg, -2.001 kg, and -0.007 units, respectively. The inbreeding depression observed may have important economic repercussions for production systems. Delta F can be considered the better of the two indicators of inbreeding depression due to its properties that prevent underestimation of this effect. A mating system designed to avoid inbreeding may be applied to this population to maintain genetic diversity.
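For reference, the individual increase in inbreeding and the realized effective population size reported above are typically computed as (standard pedigree-analysis formulas, assumed here rather than quoted from the paper):

\[
\Delta F_i = 1 - \left(1 - F_i\right)^{1/(t_i - 1)},
\qquad
\hat{N}_e = \frac{1}{2\,\overline{\Delta F}},
\]

where \(F_i\) is the inbreeding coefficient of animal \(i\), \(t_i\) its number of equivalent complete generations, and \(\overline{\Delta F}\) the mean individual increase in inbreeding.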
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)