86 results for Statistics, Promoted Ignition, ASTM Standard, Metals Combustion
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This paper deals with the impact of "early" nineteenth-century globalization (c.1815-1860) on foreign trade in the Southern Cone (SC). Most of the evidence is drawn from bilateral trades between Britain and the SC, at a time when Britain was the main commercial partner of the new republics. The main conclusion is that early globalization had a positive impact on foreign trade in the SC, owing to three factors: the SC's terms of trade improved during this period; the SC's per capita consumption of textiles (the main manufacture traded on world markets at that time) increased substantially, at a time when clothing was one of the main items of SC household budgets; and British merchants brought with them capital, shipping, and insurance, and also facilitated the formation of vast global networks, which further promoted the SC's exports to a wider range of outlets.
Abstract:
The low-temperature isothermal magnetization curves, M(H), of SmCo4 and Fe3Tb thin films are studied according to the two-dimensional correlated spin-glass model of Chudnovsky. We have calculated the magnetization law in the approach to saturation and shown that the M(H) data fit the theory well at both high and low fields. In our fitting procedure we used three different correlation functions; the Gaussian-decay correlation function fits the experimental data well for both samples.
Abstract:
The directional consistency and skew-symmetry statistics have been proposed as global measurements of social reciprocity. Although both measures can be useful for quantifying social reciprocity, researchers need to know whether these estimators are biased in order to assess descriptive results properly. That is, if estimators are biased, researchers should compare actual values with expected values under the specified null hypothesis. Furthermore, standard errors are needed to enable suitable assessment of discrepancies between actual and expected values. This paper aims to derive some exact and approximate expressions in order to obtain bias and standard error values for both estimators for round-robin designs, although the results can also be extended to other reciprocal designs.
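As a point of reference for the quantities involved, the following sketch computes the usual textbook forms of the two statistics for a toy round-robin sociomatrix; the exact estimators, bias expressions, and standard errors derived in the paper are not reproduced here, and the matrix is invented for illustration.

```python
import numpy as np

def dc_and_skew_symmetry(X):
    """Hedged sketch: global directional consistency (DC) and skew-symmetry
    statistics for a round-robin sociomatrix X (diagonal ignored).

    DC  = (H - L) / (H + L), with H/L the larger/smaller frequency in each dyad.
    Phi = tr(K'K) / tr(X'X), where K = (X - X') / 2 is the skew-symmetric part.
    These are the usual textbook forms; the paper's exact estimators may differ.
    """
    X = np.asarray(X, dtype=float)
    np.fill_diagonal(X, 0.0)                     # self-interactions are undefined
    upper = np.triu_indices_from(X, k=1)
    xij, xji = X[upper], X.T[upper]
    H = np.maximum(xij, xji).sum()
    L = np.minimum(xij, xji).sum()
    dc = (H - L) / (H + L)
    K = (X - X.T) / 2.0                          # skew-symmetric component
    phi = np.trace(K.T @ K) / np.trace(X.T @ X)  # proportion of skew-symmetry
    return dc, phi

# toy 4-actor round-robin interaction matrix
X = [[0, 5, 2, 1],
     [1, 0, 3, 4],
     [2, 2, 0, 0],
     [3, 1, 1, 0]]
print(dc_and_skew_symmetry(X))
```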
Abstract:
Study carried out during a stay at the Universität Karlsruhe between January and May 2007. Data structure libraries define interfaces and implement fundamental algorithms and data structures. One example is the Standard Template Library (STL), which is part of the C++ programming language. Within the framework of a thesis, work is being done to obtain more efficient and/or more versatile implementations of some STL components. To do so, algorithm engineering techniques are used: in particular, knowledge from the algorithmics community is integrated and existing technology is taken into consideration. The work during the stay was framed within the development of the Multi-Core STL (MCSTL). The MCSTL is a parallel implementation of the STL for multi-core machines. Multi-core machines are currently the only type of machine available on the market; therefore, even if the parallelism obtained is not optimal, it is preferable to having the processors sit idle, since the trend is for the number of processors per computer to increase.
Abstract:
Empirical studies on the determinants of industrial location typically use variables measured at the available administrative level (municipalities, counties, etc.). However, this amounts to assuming that the effects these determinants may have on the location process do not extend beyond the geographical limits of the selected site. We address the validity of this assumption by comparing results from standard count data models with those obtained by calculating the geographical scope of the spatially varying explanatory variables using a wide range of distances and alternative spatial autocorrelation measures. Our results reject the usual practice of using administrative records as covariates without making some kind of spatial correction.
Keywords: industrial location, count data models, spatial statistics. JEL classification: C25, C52, R11, R30.
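To make the idea of "geographical scope" concrete, here is a minimal sketch that contrasts a standard Poisson count model using a covariate measured at the administrative unit with the same model using that covariate averaged over a distance band; the data, variable names (firm_births, employment), and cutoffs are invented, and the paper's actual spatial autocorrelation measures are not reproduced.

```python
# Hedged sketch: standard Poisson count model vs. a distance-band version of its covariate.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
coords = rng.uniform(0, 100, size=(n, 2))           # municipality centroids (km), toy data
employment = rng.gamma(2.0, 50.0, size=n)           # local covariate, toy data
firm_births = rng.poisson(1 + employment / 100.0)   # toy count outcome

def distance_band_average(values, coords, cutoff_km):
    """Average a covariate over all units within cutoff_km, including the unit itself."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    W = (d <= cutoff_km).astype(float)
    return W @ values / W.sum(axis=1)

for cutoff in (0.0, 10.0, 25.0):                     # 0 km = purely local measurement
    x = employment if cutoff == 0 else distance_band_average(employment, coords, cutoff)
    model = sm.GLM(firm_births, sm.add_constant(x), family=sm.families.Poisson()).fit()
    print(f"cutoff {cutoff:5.1f} km  beta = {model.params[1]:.4f}  AIC = {model.aic:.1f}")
```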
Abstract:
This paper presents an initial attempt to tackle the notoriously "tricky" points encountered when dealing with energy accounting, and then illustrates how such a system of accounting can be used to assess the metabolic changes of societies. The paper is divided into four main sections. The first three present a general discussion of the main issues encountered when conducting energy analyses. The last section then combines this heuristic approach with its quantitative formalization for the analysis of possible energy scenarios. Section one covers the broader issue of how to account for the relevant categories used when accounting for Joules of energy, emphasizing the clear distinction between Primary Energy Sources (PES), the exploited physical entities from which usable energy forms (energy carriers) are derived, and Energy Carriers (EC), the actual useful energy that is transmitted for the appropriate end uses within a society. Section two sheds light on the concept of Energy Return on Investment (EROI). Here it is emphasized that a certain amount of energy carriers must already be available in order to extract/exploit Primary Energy Sources and thereby generate a net supply of energy carriers, and that the current level of intense energy supply has only been possible because of the heavy use of, and dependence on, fossil energy. Section three follows up on the discussion of EROI, indicating that a single numeric indicator such as an output/input ratio is not sufficient to assess the performance of energetic systems. Rather, an integrated approach is needed that incorporates (i) how large the net supply of Joules of EC can be, given an amount of extracted PES (the external constraints); (ii) how much EC needs to be invested to extract an amount of PES; and (iii) the power level that both processes require in order to succeed. Section four ultimately puts the theoretical concepts into play, assessing how the metabolic performance of societies can be accounted for within this analytical framework.
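As a purely illustrative complement to the PES/EC distinction and the integrated reading of EROI described above, the following sketch reports a net supply of energy carriers, the classic output/input ratio, and the power levels of both processes; the function and all figures are invented and are not the paper's accounting scheme.

```python
# Hedged sketch of the integrated view: report (i) the net supply of energy carriers (EC),
# (ii) the EC invested in exploiting primary energy sources (PES), and (iii) the power
# level of both processes, rather than a single output/input ratio. Numbers are illustrative.
def energy_system_profile(gross_ec_joules, ec_invested_joules, hours):
    net_supply = gross_ec_joules - ec_invested_joules          # (i) net EC delivered
    eroi = gross_ec_joules / ec_invested_joules                # classic output/input ratio
    power_gross = gross_ec_joules / (hours * 3600)             # (iii) W, supply side
    power_invested = ec_invested_joules / (hours * 3600)       # (iii) W, investment side
    return {"net_supply_J": net_supply, "EROI": eroi,
            "power_gross_W": power_gross, "power_invested_W": power_invested}

print(energy_system_profile(gross_ec_joules=5.0e12, ec_invested_joules=5.0e11, hours=8760))
```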
Application of standard and refined heat balance integral methods to one-dimensional Stefan problems
Abstract:
The work in this paper concerns the study of conventional and refined heat balance integral methods for a number of phase change problems. These include standard test problems, both with one and two phase changes, which have exact solutions to enable us to test the accuracy of the approximate solutions. We also consider situations where no analytical solution is available and compare these to numerical solutions. It is popular to use a quadratic profile as an approximation of the temperature, but we show that a cubic profile, seldom considered in the literature, is far more accurate in most circumstances. In addition, the refined integral method can give greater improvement still and we develop a variation on this method which turns out to be optimal in some cases. We assess which integral method is better for various problems, showing that it is largely dependent on the specified boundary conditions.
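For the standard one-phase test problem with a fixed boundary temperature, the heat balance integral method can be written down in a few lines; the sketch below uses one common textbook (Goodman-type) formulation with a quadratic profile and compares the resulting front coefficient with the exact Neumann solution. It is an assumption-laden illustration, not the exact schemes or refined method evaluated in the paper.

```python
# Hedged sketch: quadratic-profile heat balance integral method for the one-phase Stefan
# problem T_t = T_xx, T(0,t)=1, T(s,t)=0, beta*s'(t) = -T_x(s,t), where beta is the ratio
# of latent to sensible heat in the Stefan condition. The exact solution has s(t) = 2*lam*sqrt(t).
import numpy as np
from scipy.special import erf
from scipy.optimize import brentq

def hbim_quadratic_lambda(beta):
    """Front coefficient from T ~ a*(1-x/s) + b*(1-x/s)^2 with a+b=1 and a^2 = 2*b*beta."""
    a = -beta + np.sqrt(beta**2 + 2.0 * beta)
    b = 1.0 - a
    return np.sqrt((a + 2.0 * b) / (2.0 * (a / 2.0 + b / 3.0 + beta)))

def exact_lambda(beta):
    """Neumann solution: sqrt(pi)*lam*exp(lam^2)*erf(lam) = 1/beta."""
    f = lambda lam: np.sqrt(np.pi) * lam * np.exp(lam**2) * erf(lam) - 1.0 / beta
    return brentq(f, 1e-6, 5.0)

for beta in (0.5, 1.0, 2.0, 5.0):   # illustrative values of the latent/sensible heat ratio
    lam_h, lam_e = hbim_quadratic_lambda(beta), exact_lambda(beta)
    print(f"beta={beta:4.1f}  HBIM lambda={lam_h:.4f}  exact lambda={lam_e:.4f}  "
          f"rel. error={(lam_h - lam_e) / lam_e:+.2%}")
```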
Abstract:
Pensions, together with savings and investments during active life, are key elements of retirement planning. Personal choices about the standard of living, bequests, and the replacement ratio of the pension with respect to the last salary must be considered. This research contributes to financial planning by helping to quantify long-term care economic needs. We estimate life expectancy from retirement age onwards. The economic cost of care per unit of service is linked to the expected time of needed care and the intensity of required services. The expected individual cost of long-term care from the onset of dependence is estimated separately for men and women. Assumptions on the mortality of dependent people compared to the general population are introduced. Parameters defining eligibility for various forms of coverage by the universal public social care of the welfare system are addressed. The impact of the intensity of social services on individual predictions is assessed, and partial coverage by standard private insurance products is also explored. Data were collected by the Spanish Institute of Statistics in two surveys conducted on the general Spanish population in 1999 and 2008. Official mortality records and life table trends were used to create realistic scenarios for longevity. We find empirical evidence that the public long-term care system in Spain effectively mitigates the risk of incurring huge lifetime costs. We also find that the most vulnerable categories are citizens with moderate disabilities who do not qualify for public social care support. In the Spanish case, the trends between 1999 and 2008 need to be further explored.
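The expected individual cost of long-term care described above is, at its core, a discounted sum over the time spent in dependence; the toy sketch below makes that arithmetic explicit with invented survival probabilities and costs rather than the Spanish survey data used in the study.

```python
# Hedged sketch: expected cost of care from the onset of dependence, summing the probability
# of still being alive and dependent each year times the annual cost of services, discounted
# back to the onset. All rates and costs are illustrative.
def expected_ltc_cost(annual_cost, survival_probs, discount_rate=0.02):
    """survival_probs[t] = probability a dependent person survives t full years after onset."""
    return sum(p * annual_cost / (1.0 + discount_rate) ** t
               for t, p in enumerate(survival_probs))

# toy survival curve for a dependent person (mortality loaded with respect to the general
# population) and an illustrative cost of 12,000 monetary units per year of care
survival = [1.00, 0.85, 0.70, 0.55, 0.40, 0.27, 0.16, 0.08, 0.03]
print(f"expected cost of care: {expected_ltc_cost(12000.0, survival):,.0f}")
```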
Abstract:
The EU promotes its norms and principles, such as human rights, in third countries as well. This paper conceptualizes the EU as a normative power and presents its human rights policy together with some alternative interpretations of human rights. It examines whether, and at what price, the EU should promote human rights in China, taking into account several points of conflict, and whether it can fulfil its role as a normative power in the light of various constraints. Finally, it analyses what this implies for the limited realization of the EU's original aim and what an optimized human rights policy might look like.
Abstract:
In this work we discuss the use of the standard model for the calculation of the solvency capital requirement (SCR) when the company aims to use parameters specific to the experience of its own portfolio. In particular, this analysis focuses on the formula presented in the latest quantitative impact study (CEIOPS, 2010) for non-life underwriting premium and reserve risk. One of the keys of the standard model for premium and reserve risk is the correlation matrix between lines of business. We present how this correlation matrix could be estimated from a quantitative perspective, as well as the possibility of using a credibility model to estimate it by merging the qualitative and quantitative perspectives.
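For context, the sketch below shows the usual standard-formula aggregation in which per-line standard deviations are combined through the correlation matrix between lines of business and then turned into a capital charge with a QIS5-style function; the three lines of business, volumes, sigmas, and correlations are invented, and the credibility estimation proposed in the paper is not reproduced.

```python
# Hedged sketch of the standard-formula aggregation for non-life premium and reserve risk.
import numpy as np
from scipy.stats import norm

def aggregate_sigma(sigmas, volumes, corr):
    """Overall sigma = sqrt(sum_ij corr_ij * sigma_i*V_i * sigma_j*V_j) / sum(V)."""
    sv = np.asarray(sigmas) * np.asarray(volumes)
    return np.sqrt(sv @ np.asarray(corr) @ sv) / np.sum(volumes)

def nonlife_charge(sigma, quantile=0.995):
    """QIS5-style function rho(sigma), approximately 3*sigma for small sigma (lognormal assumption)."""
    q = norm.ppf(quantile)
    return np.exp(q * np.sqrt(np.log(1.0 + sigma**2))) / np.sqrt(1.0 + sigma**2) - 1.0

sigmas  = [0.10, 0.08, 0.15]              # per-LoB combined premium/reserve sigmas (toy values)
volumes = [120.0, 80.0, 40.0]             # premium/reserve volume measures (toy values)
corr    = [[1.00, 0.25, 0.25],
           [0.25, 1.00, 0.50],
           [0.25, 0.50, 1.00]]            # illustrative correlation matrix between LoBs

sigma_nl = aggregate_sigma(sigmas, volumes, corr)
print(f"aggregated sigma = {sigma_nl:.4f}, premium & reserve charge = "
      f"{nonlife_charge(sigma_nl) * sum(volumes):.2f}")
```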
Abstract:
A parts-based model is a parametrization of an object class using a collection of landmarks that follow the object structure. Matching parts-based models is one of the problems where pairwise Conditional Random Fields have been successfully applied, the main reason for their effectiveness being tractable inference and learning due to the simplicity of the graphs involved, usually trees. However, these models do not consider possible statistical patterns among sets of landmarks, and thus they suffer from using overly myopic information. To overcome this limitation, we propose a novel structure based on hierarchical Conditional Random Fields, which we explain in the first part of this report. We build a hierarchy of combinations of landmarks, where matching is performed taking the whole hierarchy into account. To preserve tractable inference we effectively sample the label set. We test our method on facial feature selection and human pose estimation on two challenging datasets: Buffy and MultiPIE. In the second part of this report, we present a novel approach to multiple kernel combination that relies on stacked classification. This method can be used to evaluate the landmarks of the parts-based model approach. Our method is based on combining the responses of a set of independent classifiers, one for each individual kernel. Unlike earlier approaches that linearly combine kernel responses, our approach uses them as inputs to another set of classifiers. We show that we outperform state-of-the-art methods on most of the standard benchmark datasets.
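A minimal sketch of the stacking idea in the second part, under the assumption that "responses" means SVM decision values: one classifier is trained per kernel and a second-level classifier is trained on their out-of-fold responses; the kernels, data, and classifiers here are toy choices, not the ones used in the work.

```python
# Hedged sketch: kernel combination by stacking rather than by a linear combination of kernels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict, train_test_split
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

base = {name: SVC(kernel=name, gamma="scale")
        for name in ("linear", "rbf", "poly")}           # one classifier per kernel

# out-of-fold responses on the training set (avoids leakage into the stacker) ...
Z_tr = np.column_stack([cross_val_predict(clf, X_tr, y_tr, cv=5, method="decision_function")
                        for clf in base.values()])
# ... and ordinary responses on the test set from models refit on all training data
Z_te = np.column_stack([clf.fit(X_tr, y_tr).decision_function(X_te)
                        for clf in base.values()])

stacker = LogisticRegression().fit(Z_tr, y_tr)            # second-level classifier
print("stacked accuracy:", stacker.score(Z_te, y_te))
```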
Abstract:
The statistical analysis of compositional data should be carried out using logratios of parts, which are difficult to use correctly in standard statistical packages. For this reason a freeware package, named CoDaPack, was created. This software implements most of the basic statistical methods suitable for compositional data. In this paper we describe the new version of the package, now called CoDaPack3D. It is developed in Visual Basic for Applications (associated with Excel©), Visual Basic and OpenGL, and it is oriented towards users with a minimum knowledge of computers, with the aim of being simple and easy to use. This new version includes new graphical output in 2D and 3D. These outputs can be zoomed and, in 3D, rotated. A customization menu is also included, and outputs can be saved in jpeg format. This new version also includes an interactive help, and all dialog windows have been improved in order to facilitate their use. To use CoDaPack one has to open Excel© and introduce the data in a standard spreadsheet. These should be organized as a matrix where Excel© rows correspond to the observations and columns to the parts. The user executes macros that return numerical or graphical results. There are two kinds of numerical results, new variables and descriptive statistics, and both appear on the same sheet. Graphical output appears in independent windows. In the present version there are 8 menus, with a total of 38 submenus which, after some dialogue, directly call the corresponding macro. The dialogues ask the user to input the variables and further parameters needed, as well as where to put the results. The web site http://ima.udg.es/CoDaPack contains this freeware package; only Microsoft Excel© under Microsoft Windows© is required to run the software. Key words: compositional data analysis, software.
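For readers unfamiliar with logratios, the sketch below computes the centred and additive logratio transformations that packages such as CoDaPack implement, written here in plain NumPy rather than through CoDaPack itself; the composition is invented.

```python
# Hedged sketch of standard logratio transformations for compositional data
# (rows = observations, columns = parts).
import numpy as np

def closure(X):
    """Rescale each row so the parts sum to 1."""
    X = np.asarray(X, dtype=float)
    return X / X.sum(axis=1, keepdims=True)

def clr(X):
    """Centred logratio: log of each part over the geometric mean of its row."""
    L = np.log(closure(X))
    return L - L.mean(axis=1, keepdims=True)

def alr(X, ref=-1):
    """Additive logratio with respect to a reference part (last column by default)."""
    L = np.log(closure(X))
    return np.delete(L - L[:, [ref]], ref, axis=1)

# three observations with four parts (e.g. percentages); any positive scale works
X = [[48.0, 22.0, 20.0, 10.0],
     [35.0, 40.0, 15.0, 10.0],
     [10.0, 10.0, 30.0, 50.0]]
print(clr(X))
print(alr(X))
```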
Abstract:
This project focuses mainly on the non-coherent detector of a GPS receiver. In order to characterize the detection process of a receiver, the statistics involved need to be known. For conventional non-coherent detectors, second-order statistics are fully involved. The performance provided by second-order statistics, expressed through the ROC, is fairly good, although in some situations it may not be the best. This project attempts to reproduce the detection process by means of first-order statistics as an alternative to the already known and implemented second-order statistics. To achieve this, expressions based on the Central Limit Theorem and on Edgeworth series are used as good approximations. Finally, both the conventional statistics and the proposed statistics are compared, in terms of the ROC, in order to determine which non-coherent detector offers better performance in each situation.
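As a baseline for the comparison described above, the conventional square-law non-coherent detector has an exactly known ROC, sketched below from chi-square distributions; the first-order-statistics detector based on the Central Limit Theorem and Edgeworth series is not reproduced here, and the accumulation count and SNR are invented.

```python
# Hedged sketch of the ROC of a conventional square-law non-coherent detector: the decision
# statistic is the sum of |I+jQ|^2 over N non-coherent accumulations with unit-variance noise
# on each of I and Q, so it is central chi-square with 2N dof under H0 and non-central
# chi-square under H1.
import numpy as np
from scipy.stats import chi2, ncx2

def roc(N, snr_db, pfa_grid=np.logspace(-8, 0, 200)):
    """(Pfa, Pd) for N accumulations; snr_db is |s|^2 relative to the unit per-component noise variance."""
    nc = N * 10.0 ** (snr_db / 10.0)                    # non-centrality parameter
    thresholds = chi2.isf(pfa_grid, df=2 * N)           # thresholds matching each Pfa
    return pfa_grid, ncx2.sf(thresholds, df=2 * N, nc=nc)

pfa, pd = roc(N=10, snr_db=3.0)
for p, d in zip(pfa[::50], pd[::50]):
    print(f"Pfa = {p:.1e}  Pd = {d:.3f}")
```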
Abstract:
Given an observed test statistic and its degrees of freedom, one may compute the observed P value with most statistical packages. It is unknown to what extent test statistics and P values are congruent in published medical papers. Methods: We checked the congruence of statistical results reported in all the papers of volumes 409–412 of Nature (2001) and a random sample of 63 results from volumes 322–323 of BMJ (2001). We also tested whether the frequencies of the last digit of a sample of 610 test statistics deviated from a uniform distribution (i.e., equally probable digits). Results: 11.6% (21 of 181) and 11.1% (7 of 63) of the statistical results published in Nature and BMJ respectively during 2001 were incongruent, probably mostly due to rounding, transcription, or typesetting errors. At least one such error appeared in 38% and 25% of the papers of Nature and BMJ, respectively. In 12% of the cases, the significance level might change by one or more orders of magnitude. The frequencies of the last digit of statistics deviated from the uniform distribution and suggested digit preference in rounding and reporting. Conclusions: This incongruence of test statistics and P values is another example that statistical practice is generally poor, even in the most renowned scientific journals, and that the quality of papers should be more carefully controlled and valued.
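The congruence check itself is straightforward to automate: recompute the P value from the reported statistic and degrees of freedom and flag discrepancies beyond rounding, as in the sketch below with invented entries (the tolerance of 0.005 is an arbitrary choice, not the criterion used in the study).

```python
# Hedged sketch: recompute P values from reported test statistics and flag incongruent results.
from scipy import stats

def recomputed_p(kind, statistic, df):
    if kind == "t":
        return 2 * stats.t.sf(abs(statistic), df)          # two-sided t test
    if kind == "chi2":
        return stats.chi2.sf(statistic, df)
    if kind == "F":
        return stats.f.sf(statistic, *df)                   # df = (df1, df2)
    raise ValueError(kind)

reported = [                     # (kind, statistic, df, reported P); illustrative values only
    ("t",    2.10,  28,      0.045),
    ("chi2", 11.07, 5,       0.05),
    ("F",    3.32,  (2, 57), 0.004),   # e.g. a transcription error: one order of magnitude off
]

for kind, stat, df, p_rep in reported:
    p_new = recomputed_p(kind, stat, df)
    flag = "OK" if abs(p_new - p_rep) < 0.005 else "INCONGRUENT?"
    print(f"{kind:>4} = {stat:<6} df = {df}  reported P = {p_rep}  recomputed P = {p_new:.4f}  {flag}")
```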
Abstract:
This manuscript reports the study of carbon-halide bond cleavage in 4-nitrobenzyl halides, paying special attention to the iodide and fluoride derivatives. The electrochemical reduction mechanism has been elucidated for both compounds by means of cyclic voltammetry and controlled-potential electrolysis. In the case of 4-nitrobenzyl iodide, a first one-electron irreversible wave leads to the corresponding 4-nitrobenzyl radical and iodide. In the case of 4-nitrobenzyl fluoride, however, a first one-electron reversible wave appears at -1.02 V vs. SCE, followed by a one-electron irreversible wave. In this second electron-transfer process the cleavage of the C-F bond takes place, so the bond cleavage reaction occurs at the dianion level. Elucidating and understanding these electrochemical reduction mechanisms yields important thermodynamic and kinetic data that help in the understanding of C-X bond cleavage. This type of bond dissociation reaction is involved in metabolic pathways of the human body.