Abstract:
Chemical studies of superheavy elements require fast and efficient techniques, due to short half-lives and low production rates of the investigated nuclides. Here, we advocate for using a tubular flow reactor for assessing the thermal stability of the Sg carbonyl complex – Sg(CO)6. The experimental setup was tested with Mo and W carbonyl complexes, as their properties are established and supported by theoretical predictions. The suggested approach proved to be effective in discriminating between the thermal stabilities of Mo(CO)6 and W(CO)6. Therefore, an experimental verification of the predicted Sg–CO bond dissociation energy seems to be feasible by applying this technique. By investigating the effect of 104,105Mo beta-decay on the formation of 104,105Tc carbonyl complex, we estimated the lower reaction time limit for the metal carbonyl synthesis in the gas phase to be more than 100 ms. We examined further the influence of the wall material of the recoil chamber, the carrier gas composition, the gas flow rate, and the pressure on the production yield of 104Mo(CO)6, so that the future stability tests with Sg(CO)6 can be optimized accordingly.
Abstract:
This paper considers the aggregate performance of the banking industry, applying a modified and extended dynamic decomposition of bank return on equity. The aggregate performance of any industry depends on the underlying microeconomic dynamics within that industry: adjustments within banks, reallocations between banks, entry of new banks, and exit of existing banks. Bailey, Hulten, and Campbell (1992) and Haltiwanger (1997) develop dynamic decompositions of industry performance. We extend those analyses to derive an ideal decomposition that includes their decomposition as one component. We also extend the decomposition to consider geography, implementing it on a state-by-state basis and linking that geographic decomposition back to the national level. We then consider how deregulation of geographic restrictions on bank activity affects the components of the state-level dynamic decomposition, controlling for competition and the state of the economy within each state and employing fixed- and random-effects estimation for a panel database across the fifty states and the District of Columbia from 1976 to 2000.
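The kind of dynamic decomposition referred to above can be sketched as follows (a minimal illustration with hypothetical bank shares and ROE figures, not the paper's extended "ideal" decomposition): the change in share-weighted industry ROE splits exactly into within, between, cross, entry and exit components.

```python
def bhc_decompose(prev, curr):
    """Decompose the change in share-weighted industry performance
    into within, between, cross, entry and exit components.
    `prev`/`curr` map bank -> (market share, ROE); shares in each
    period are assumed to sum to 1."""
    P_prev = sum(s * p for s, p in prev.values())
    P_curr = sum(s * p for s, p in curr.values())
    stay = prev.keys() & curr.keys()
    # continuing banks: performance change at initial shares
    within = sum(prev[b][0] * (curr[b][1] - prev[b][1]) for b in stay)
    # continuing banks: share reallocation toward (initially) strong banks
    between = sum((curr[b][0] - prev[b][0]) * (prev[b][1] - P_prev) for b in stay)
    # interaction of share and performance changes
    cross = sum((curr[b][0] - prev[b][0]) * (curr[b][1] - prev[b][1]) for b in stay)
    # entrants and exiters, measured relative to the initial aggregate
    entry = sum(curr[b][0] * (curr[b][1] - P_prev) for b in curr.keys() - prev.keys())
    exit_ = -sum(prev[b][0] * (prev[b][1] - P_prev) for b in prev.keys() - curr.keys())
    return {"within": within, "between": between, "cross": cross,
            "entry": entry, "exit": exit_, "total": P_curr - P_prev}
```

Because shares sum to one in each period, the five components add up exactly to the total change, which is what makes decompositions of this family attractive for tracking industry performance.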
Abstract:
The aggregate performance of the banking industry depends on the underlying microlevel dynamics within that industry: adjustments within banks, reallocations between banks, entries of new banks, and exits of existing banks. This paper develops a generalized ideal dynamic decomposition and applies it to the return on equity of foreign and domestic commercial banks in Korea from 1994 to 2000. The sample corresponds to the Asian financial crisis and the final stages of a long process of deregulation and privatization in the Korean banking industry. Our findings reveal that the overall performance of Korean banks largely reflects individual bank efficiencies, except immediately after the Asian financial crisis, when restructuring played a more important role in average bank performance. Moreover, Korean regional banks started the restructuring process about one year before the Korean nationwide banks. Foreign bank performance, however, largely reflected individual bank efficiencies, even immediately after the Asian financial crisis.
Abstract:
Strategies are compared for the development of a linear regression model with stochastic (multivariate normal) regressor variables and the subsequent assessment of its predictive ability. Bias and mean squared error of four estimators of predictive performance are evaluated in samples simulated from 32 population correlation matrices. Models including all of the available predictors are compared with those obtained using selected subsets. The subset selection procedures investigated include two stopping rules, Cp and Sp, each combined with an 'all possible subsets' or 'forward selection' of variables. The estimators of performance utilized include parametric (MSEPm) and non-parametric (PRESS) assessments in the entire sample, and two data-splitting estimates restricted to a random or balanced (Snee's DUPLEX) 'validation' half sample. The simulations were performed as a designed experiment, with population correlation matrices representing a broad range of data structures. The techniques examined for subset selection do not generally result in improved predictions relative to the full model. Approaches using 'forward selection' result in slightly smaller prediction errors and less biased estimators of predictive accuracy than 'all possible subsets' approaches, but no differences are detected between the performances of Cp and Sp. In every case, prediction errors of models obtained by subset selection in either of the half splits exceed those obtained using all predictors and the entire sample. Only the random split estimator is conditionally (on β) unbiased; however, MSEPm is unbiased on average and PRESS is nearly so in unselected (fixed form) models. When subset selection techniques are used, MSEPm and PRESS always underestimate prediction errors, by as much as 27 percent (on average) in small samples.
Despite their bias, the mean squared errors (MSE) of these estimators are at least 30 percent less than that of the unbiased random split estimator. The DUPLEX split estimator suffers from large MSE as well as bias, and seems of little value within the context of stochastic regressor variables. To maximize predictive accuracy while retaining a reliable estimate of that accuracy, it is recommended that the entire sample be used for model development and a leave-one-out statistic (e.g. PRESS) be used for assessment.
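The leave-one-out statistic recommended above can be computed without refitting the model n times: for ordinary least squares, the leave-one-out residual equals the ordinary residual divided by one minus the corresponding hat-matrix diagonal. A minimal sketch (illustrative only, not the study's code):

```python
import numpy as np

def press_statistic(X, y):
    """PRESS via the hat-matrix shortcut: the leave-one-out residual
    for OLS is e_i / (1 - h_ii), so no refitting is required."""
    X1 = np.column_stack([np.ones(len(y)), X])   # add intercept column
    H = X1 @ np.linalg.pinv(X1.T @ X1) @ X1.T    # hat (projection) matrix
    resid = y - H @ y                            # ordinary residuals
    loo_resid = resid / (1.0 - np.diag(H))       # leave-one-out residuals
    return float(np.sum(loo_resid ** 2))
```

Equivalently, one could fit the model n times, each time omitting one observation and predicting it from the rest; the shortcut gives the identical sum of squared prediction errors at a fraction of the cost.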
Abstract:
The purpose of this research is to develop a new statistical method to determine the minimum set of rows (R) in an R x C contingency table of discrete data that explains the dependence of observations. The statistical power of the method will be empirically determined by computer simulation to judge its efficiency relative to presently existing methods. The method will be applied to data on DNA fragment length variation at six VNTR loci in over 72 populations from five major racial groups of humans (total sample size over 15,000 individuals, with each sample having at least 50 individuals). DNA fragment lengths grouped in bins will form the basis for studying inter-population DNA variation within the racial groups. Where such variations are significant, the method will provide a rigorous re-binning procedure for forensic computation of DNA profile frequencies that takes into account intra-racial DNA variation among populations.
Abstract:
Antarctic terrestrial ecosystems have poorly developed soils and currently experience one of the greatest rates of climate warming on the globe. We investigated the responsiveness of organic matter decomposition in Maritime Antarctic terrestrial ecosystems to climate change, using two study sites in the Antarctic Peninsula region (Anchorage Island, 67°S; Signy Island, 61°S), and contrasted the responses found with those at the cool temperate Falkland Islands (52°S). Our approach consisted of two complementary methods: (1) Laboratory measurements of decomposition at different temperatures (2, 6 and 10 °C) of plant material and soil organic matter from all three locations. (2) Field measurements at all three locations on the decomposition of soil organic matter, plant material and cellulose, both under natural conditions and under experimental warming (about 0.8 °C) achieved using open top chambers. Higher temperatures led to higher organic matter breakdown in the laboratory studies, indicating that decomposition in Maritime Antarctic terrestrial ecosystems is likely to increase with increasing soil temperatures. However, both laboratory and field studies showed that decomposition was more strongly influenced by local substratum characteristics (especially soil N availability) and plant functional type composition than by large-scale temperature differences. The very small responsiveness of organic matter decomposition in the field (experimental temperature increase <1 °C) compared with the laboratory (experimental increases of 4 or 8 °C) shows that substantial warming is required before significant effects can be detected.
Abstract:
Article published in Int Fam Plan Perspect. 2003 Sep;29(3):112-20
Abstract:
Sediment samples and hydrographic conditions were studied at 28 stations around Iceland. At these sites, Conductivity-Temperature-Depth (CTD) casts were conducted to collect hydrographic data, and multicorer casts were conducted to collect data on sediment characteristics, including grain size distribution, carbon and nitrogen concentration, and chloroplastic pigment concentration. A total of 14 environmental predictors were used to model sediment characteristics around Iceland across regional geographic space. Two approaches were used: Multivariate Adaptive Regression Splines (MARS) and randomForest regression models. randomForest outperformed MARS in predicting grain size distribution. MARS models had a greater tendency to over- and underpredict sediment values in areas outside the environmental envelope defined by the training dataset. We provide the first GIS layers on sediment characteristics around Iceland, which can be used as predictors in future models. Although the models performed well, more samples, especially from the shelf areas, will be needed to improve them in the future.
Abstract:
We propose a method for the decomposition of inequality changes based on panel data regression. The method is an efficient way to quantify the contributions of variables to changes of the Theil T index while satisfying the property of uniform addition. We illustrate the method using prefectural data from Japan for the period 1955 to 1998. Japan experienced a diminishing of regional income disparity during the years of high economic growth from 1955 to 1973. After estimating production functions using panel data for prefectures in Japan, we apply the new decomposition approach to identify each production factor’s contributions to the changes of per capita income inequality among prefectures. The decomposition results show that total factor productivity (residual) growth, population change (migration), and public capital stock growth contributed to the diminishing of per capita income disparity.
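The Theil T index underlying the decomposition, and the uniform-addition property mentioned above (adding the same absolute amount to every income reduces measured inequality), can be illustrated with a small sketch. The incomes below are hypothetical, not the paper's prefectural data:

```python
import numpy as np

def theil_t(income):
    """Theil T index: T = (1/n) * sum((y_i / mu) * ln(y_i / mu)),
    where mu is mean income. Zero for perfectly equal incomes."""
    y = np.asarray(income, dtype=float)
    r = y / y.mean()
    return float(np.mean(r * np.log(r)))
```

Because the index is scale-invariant but not translation-invariant, a uniform addition shrinks relative differences and therefore lowers T, which is the property the proposed decomposition is designed to respect.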
Abstract:
Ozone stomatal fluxes were modeled over a 3-year period following different approaches for a commercial variety of durum wheat (Triticum durum Desf. cv. Camacho) at the phenological stage of anthesis. All models performed in the same range, although not all of them afforded equally significant results. Nevertheless, all of them suggest that stomatal conductance accounts for the main share of ozone deposition fluxes. A new modeling approach was tested, based on a 3-D architectural model of the wheat canopy, and fairly accurate results were obtained. Plant species-specific measurements, as well as measurements of stomatal conductance and environmental parameters, were required. The proposed method of calculating ozone stomatal fluxes (FO(3_3-D)) from experimental gs data, modeled as a function of certain environmental parameters in conjunction with the YPLANT model, appears adequate: it provides realistic estimates of the canopy FO(3_3-D) and integrates, rather than neglects, the contribution of the lower leaves relative to the flag leaf, although further development of the model is needed.
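At leaf level, stomatal O3 flux is commonly computed as the product of stomatal conductance to O3 and the O3 concentration, with conductance measured for water vapour converted using the O3/H2O molecular diffusivity ratio (approximately 0.663). The sketch below is a generic simplification under that assumption, not this study's 3-D canopy model:

```python
# Approximate ratio of molecular diffusivities D_O3 / D_H2O (assumed value)
O3_TO_H2O_DIFFUSIVITY_RATIO = 0.663

def stomatal_o3_flux(gs_h2o, o3_conc):
    """Leaf-level stomatal O3 flux (nmol m-2 s-1) from stomatal
    conductance to water vapour gs_h2o (mol m-2 s-1) and O3
    concentration o3_conc (nmol mol-1): F_O3 = gs_O3 * [O3]."""
    gs_o3 = gs_h2o * O3_TO_H2O_DIFFUSIVITY_RATIO
    return gs_o3 * o3_conc
```

A canopy-scale estimate would then sum such leaf-level fluxes over the leaf area of each layer, which is where an architectural model like YPLANT contributes the per-leaf light and conductance inputs.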
Abstract:
Software architectural evaluation is a key discipline used to identify, at early stages of real-time system (RTS) development, the problems that may arise during its operation. Typical mechanisms supporting concurrency, such as semaphores, mutexes or monitors, usually lead to concurrency problems at execution time that are difficult to identify, reproduce and solve. For this reason, it is crucial to understand the root causes of these problems and to provide support to identify and mitigate them at early stages of the system lifecycle. This paper presents the results of a research effort oriented to the development of a tool called 'Deadlock Risk Evaluation of Architectural Models' (DREAM) to assess deadlock risk in architectural models of an RTS. A particular architectural style, Pipelines of Processes in Object-Oriented Architectures–UML (PPOOA), was used to represent platform-independent models of an RTS architecture, supported by the PPOOA–Visio tool. We validated the technique presented here by using several case studies related to RTS development and comparing our results with those from other deadlock detection approaches supported by different tools. Here we present two of these case studies, one related to avionics and the other to planetary exploration robotics. Copyright © 2011 John Wiley & Sons, Ltd.
Abstract:
We present a method for the static resource usage analysis of MiniZinc models. The analysis can infer upper bounds on the usage that a MiniZinc model will make of some resources, such as the number of constraints of a given type (equality, disequality, global constraints, etc.), the number of variables (search variables or temporary variables), or the size of the expressions before calling the solver. These bounds are obtained from the models independently of the concrete input data (the instance data) and are in general functions of the sizes of such data. In our approach, MiniZinc models are translated into Ciao programs, which are then analysed by the CiaoPP system. CiaoPP includes a parametric analysis framework for resource usage in which the user can define resources and express the resource usage of library procedures (and certain program constructs) by means of a language of assertions. We present the approach and report on a preliminary implementation, which shows the feasibility of the approach and provides encouraging results.
Abstract:
We present two approaches to clustering dialogue-based information obtained by the speech understanding module and the dialogue manager of a spoken dialogue system. The purpose is to estimate a language model related to each cluster and use these models to dynamically modify the model of the speech recognizer at each dialogue turn. In the first approach we build the cluster tree using local decisions based on a Maximum Normalized Mutual Information criterion. In the second we take global decisions, based on the optimization of the global perplexity of the combination of the cluster-related LMs. Our experiments show a relative reduction of the word error rate of 15.17%, which helps to improve the performance of the understanding and dialogue manager modules.
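The idea of linearly combining cluster-related LMs and scoring the combination by perplexity can be sketched with smoothed unigram models (a toy illustration; the system's actual LMs and optimization are of course richer):

```python
import math
from collections import Counter

def unigram_lm(tokens, vocab, alpha=1.0):
    # Laplace-smoothed unigram probabilities over a fixed vocabulary
    counts = Counter(tokens)
    total = len(tokens) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}

def interpolate(lms, weights):
    # linear combination of cluster-related LMs (weights sum to 1)
    vocab = lms[0].keys()
    return {w: sum(wt * lm[w] for wt, lm in zip(weights, lms)) for w in vocab}

def perplexity(lm, tokens):
    # exp of the average negative log-probability of the tokens
    logp = sum(math.log(lm[w]) for w in tokens)
    return math.exp(-logp / len(tokens))
```

A global optimization in this setting would search the interpolation weights that minimize the perplexity of the combined model on held-out dialogue turns.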
Abstract:
There is evidence that the climate is changing and that the change is now influenced and accelerated by the increase of CO2 in the atmosphere due to human combustion of fuels. Such 'climate change' is on the policy agenda at the global level, with the aim of understanding and reducing its causes and mitigating its consequences. In most countries and international bodies (the UN, e.g. Rio de Janeiro 1992, the OECD, the EC, etc.), efforts and debates have been directed at understanding the possible causes, predicting the future evolution of some conditioning variables, and carrying out studies to counteract the effects or to delay their negative evolution. The Kyoto Protocol of 1997 set international targets for CO2 emissions, but it was partial and not followed by, for example, the USA and China, and in Durban 2011 the ineffectiveness of humanity on such global challenges became evident. Amid all this, no global model has yet been elaborated that can help to choose the best alternative among the feasible ones, to elaborate strategies and to evaluate costs, and the authors propose to enter that frame of study. As in all natural, technological and social changes, the best-prepared countries will cope best and recover most rapidly. The alternatives will not be the same in all geographic areas, but the model must help us make the appropriate decision. It is essential to know which areas are most sensitive to the negative effects of climate change, the parameters to take into account for its evaluation, and the comprehensive plans to deal with it. The objective of this paper is to elaborate a mathematical decision-support model that will allow the development and evaluation of alternatives for adaptation to climate change in different communities in Europe and Latin America, mainly in areas especially vulnerable to climate change, considering all the intervening factors.
The models will consider criteria of a physical type (meteorological, edaphic, water resources), land use (agricultural, forestry, mining, industrial, urban, tourism, livestock), economic (income, costs, benefits, infrastructure), social (population), political (implementation, legislation), educational (educational programs, diffusion) and environmental, at the present moment and in the future. The intention is to obtain tools to aid in reaching a realistic position on these challenges, which are an important part of the problems humanity will face in the coming decades.