810 results for Mètodes de simulació


Relevance:

10.00%

Publisher:

Abstract:

The analysis of the significance of both the content and the discourse generated by images is approached with qualitative methods. The interpretation of the manifest and latent data of an image is the result of the observer's interaction with the image and is affected by knowledge of the context, the symbolic interpretation made of it, and the social, historical and cultural environment in which both the observer and the image itself are immersed.

Relevance:

10.00%

Publisher:

Abstract:

In a context of distance education and virtual users, training in the use of the services and resources of a university library requires adapting the formats traditionally offered by on-site libraries to this new environment with new and innovative methods. In this regard, the UOC Library has developed new channels of communication and new formats to promote its services and to train users in their use. The common characteristic of most of them is that they are 'distance training' delivered through a 'Virtual Campus'. The paper analyses these different options, as well as the problems and advantages presented by the various training methods carried out by university libraries in virtual environments.

Relevance:

10.00%

Publisher:

Abstract:

This presentation aims to make understandable the use and application context of two webometric techniques, log analysis and Google Analytics, which currently coexist in the Virtual Library of the UOC. It first provides a comprehensive introduction to webometrics and then analyses the case of the UOC's Virtual Library, focusing on the assimilation of these techniques and the considerations underlying their use, and covering in a holistic way the process of data gathering, processing and exploitation. Finally, guidelines are also provided for the interpretation of the metric variables obtained.
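
As an illustration of the log-analysis side of such a study, the following is a minimal sketch (not taken from the presentation) of counting page views per resource from a web server access log in the common "combined" format; the file name access.log and the path filter are assumptions.

```python
import re
from collections import Counter

# Regex for the Apache/Nginx "combined" log format:
# host ident user [date] "METHOD path HTTP/x.y" status size "referer" "agent"
LOG_LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

def page_views(log_path: str, prefix: str = "/") -> Counter:
    """Count successful GET requests per path under a given prefix."""
    views = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if not m:
                continue  # skip malformed lines
            if m["method"] == "GET" and m["status"].startswith("2") \
                    and m["path"].startswith(prefix):
                views[m["path"]] += 1
    return views

if __name__ == "__main__":
    # access.log is a hypothetical file name
    for path, n in page_views("access.log").most_common(10):
        print(f"{n:8d}  {path}")
```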

Relevance:

10.00%

Publisher:

Abstract:

The high demand for energy and the problems associated with fossil fuels and other energy resources highlight the need to develop clean, sustainable and economically viable energy-generation technologies. In this field, dye-sensitized solar cells (DSSC), which convert solar energy into electrical energy, are a feasible solution. Moreover, the development of methods to build these cells at low temperature would make it possible to fabricate them on plastic substrates, which would add value and allow continuous, fast production with low cost, both economically and in terms of environmental impact. This work presents the development of a low-temperature (140 ºC) production method for DSSCs, with an efficiency of 5.9% on FTO/glass substrates. This value is higher than most of the efficiencies reported in the literature for DSSCs built at low temperature. The mesoporous layers that form the DSSC electrodes are deposited by doctor blade from a paste composed of 4-8 nm TiO2 nanoparticles coated with 3,6,9-trioxadecanoic acid and 20-25 nm Degussa P25 nanoparticles, using only water and ethanol as solvents. A treatment at 140 ºC removes the organic matter from the surface of the 4-8 nm nanoparticles and binds them to the Degussa P25 particles, yielding sintered mesoporous layers 6 μm thick. In addition, a post-treatment using hexafluorotitanic(IV) acid produces a slight increase in efficiency. Furthermore, obtaining TiO2 thin films on plastic substrates is a topical issue because of the lack of low-temperature deposition methods. In this context, crystalline TiO2 nanoparticles surface-modified with lauryl gallate and with 3,6,9-trioxadecanoic acid have been synthesized through environmentally friendly processes. The nanoparticles can be dispersed in organic solvents and in water, respectively, giving stable, easy-to-handle suspensions that can be used as precursors to obtain crystalline TiO2 thin films at low temperature. In particular, the thin films formed by TiO2 nanoparticles modified with 3,6,9-trioxadecanoic acid have been used as the blocking layer in the DSSCs built at low temperature.

Relevance:

10.00%

Publisher:

Abstract:

This study examines how structural determinants influence intermediary factors of child health inequities and how they operate through the communities where children live. In particular, we explore individual-, family- and community-level characteristics associated with a composite indicator that quantitatively measures intermediary determinants of early childhood health in Colombia. We use data from the 2010 Colombian Demographic and Health Survey (DHS). Adopting the conceptual framework of the Commission on Social Determinants of Health (CSDH), three dimensions related to child health are represented in the index: behavioural factors, psychosocial factors and the health system. In order to generate the weights of the variables and take into account the discrete nature of the data, principal component analysis (PCA) using polychoric correlations is employed in the index construction. Weighted multilevel models are used to examine community effects. The results show that the effect of household SES is attenuated when community characteristics are included, indicating the importance that the level of community development may have in mediating individual and family characteristics. The findings indicate that there is significant between-community variance in the intermediary determinants of child health, especially for those determinants linked to the health system, even after controlling for individual, family and community characteristics. These results likely reflect that, whilst the community context can exert a greater influence on intermediary factors linked directly to health, in the case of psychosocial factors and parents' behaviours the family context can be more important. This underlines the importance of distinguishing between community and family intervention programmes.
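
By way of illustration, the following is a minimal sketch, not taken from the paper, of building a composite index from the first principal component of a correlation matrix. It assumes the polychoric correlation matrix R has already been estimated with a dedicated routine (estimating polychoric correlations is outside the scope of the sketch), and the data are invented stand-ins for standardized indicator variables.

```python
import numpy as np

def pca_index(R: np.ndarray, X: np.ndarray) -> np.ndarray:
    """First-principal-component index from a (polychoric) correlation matrix R
    and a matrix X of standardized indicators (one row per child)."""
    eigvals, eigvecs = np.linalg.eigh(R)          # eigh: R is symmetric
    w = eigvecs[:, np.argmax(eigvals)]            # loadings of the first component
    w = w if w.sum() >= 0 else -w                 # fix the sign for interpretability
    return X @ w                                  # index score per child

# Toy example with 3 intermediary-determinant indicators (hypothetical data)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                     # stand-in for standardized indicators
R = np.corrcoef(X, rowvar=False)                  # stand-in for a polychoric matrix
scores = pca_index(R, X)
print(scores[:5])
```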

Relevance:

10.00%

Publisher:

Abstract:

In an earlier investigation (Burger et al., 2000) five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and the evaluation of the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcano-genetic, a hydro-hydrothermal and an ultra-basic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultra-basic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used and the cosine-theta coefficient as similarity measure. During the last decade considerable progress in compositional data analysis was made and many case studies were published using new tools for exploratory analysis of these data. Therefore it makes sense to check whether the application of suitable data transformations, reduction of the D-part simplex to two or three factors and visual interpretation of the factor scores would lead to a revision of earlier results and to answers to open questions. In this paper we follow the lines of a paper of R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transformation of the components and visualization of the factor scores in a spatial context: the compositional factors will be plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process.
Keywords: compositional data analysis, biplot, deep sea sediments
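
As a minimal sketch of the kind of workflow described (not taken from the paper), the following applies an ilr-transformation to a compositional data matrix and computes factor scores by PCA; the scores could then be plotted against core depth. The geochemical data are invented, and the ilr basis is the Helmert-type one provided by SciPy.

```python
import numpy as np
from scipy.linalg import helmert

def clr(X):
    """Centred log-ratio transform of compositions (rows of X, strictly positive)."""
    L = np.log(X)
    return L - L.mean(axis=1, keepdims=True)

def ilr(X):
    """Isometric log-ratio transform using a Helmert-type orthonormal basis."""
    D = X.shape[1]
    H = helmert(D, full=False)        # (D-1, D), rows orthonormal and orthogonal to ones
    return clr(X) @ H.T

# Toy core: 50 samples, 5 geochemical parts, closed to 100% (invented data)
rng = np.random.default_rng(1)
X = rng.dirichlet(np.ones(5), size=50) * 100
Z = ilr(X)                            # (50, 4) real coordinates

# PCA of the ilr coordinates; score columns can be plotted versus depth
Zc = Z - Z.mean(axis=0)
U, s, Vt = np.linalg.svd(Zc, full_matrices=False)
scores = U * s                        # factor scores, one row per core sample
print(scores[:3, :2])
```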

Relevance:

10.00%

Publisher:

Abstract:

In order to obtain a high-resolution Pleistocene stratigraphy, eleven continuously cored boreholes, 100 to 220 m deep, were drilled in the northern part of the Po Plain by Regione Lombardia in the last five years. Quantitative provenance analysis (QPA, Weltje and von Eynatten, 2004) of Pleistocene sands was carried out using multivariate statistical analysis (principal component analysis, PCA, and similarity analysis) on an integrated data set, including high-resolution bulk petrography and heavy-mineral analyses of the Pleistocene sands and of 250 major and minor modern rivers draining the southern flank of the Alps from West to East (Garzanti et al, 2004; 2006). Prior to the onset of major Alpine glaciations, metamorphic and quartzofeldspathic detritus from the Western and Central Alps was carried from the axial belt to the Po basin longitudinally, parallel to the Southalpine belt, by a trunk river (Vezzoli and Garzanti, 2008). This scenario rapidly changed during marine isotope stage 22 (0.87 Ma), with the onset of the first major Pleistocene glaciation in the Alps (Muttoni et al, 2003). PCA and similarity analysis of core samples show that the longitudinal trunk river at this time was shifted southward by the rapid southward and westward progradation of transverse alluvial river systems fed from the Central and Southern Alps. Sediments were transported southward by braided river systems, and glacial sediments carried by Alpine valley glaciers invaded the alluvial plain.
Keywords: detrital modes; modern sands; provenance; principal component analysis; similarity; Canberra distance; palaeodrainage
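
The following is a minimal sketch, not taken from the paper, of the similarity-analysis step: Canberra distances between core-sample compositions and modern-river reference compositions, computed with SciPy. The arrays are invented stand-ins for petrographic and heavy-mineral modes.

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(2)

# Invented compositional modes (rows sum to 1): 30 core samples, 8 categories
cores = rng.dirichlet(np.ones(8), size=30)
# Invented reference modes of 5 modern rivers draining the Alps
rivers = rng.dirichlet(np.ones(8), size=5)

# Canberra distance between every core sample and every river signature
D = cdist(cores, rivers, metric="canberra")    # shape (30, 5)

# Assign each core sample to its most similar modern-river end member
best = D.argmin(axis=1)
print(best[:10])
```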

Relevance:

10.00%

Publisher:

Abstract:

The paper analyses the regional flows of domestic tourism that took place in Spain in the year 2000, contributing to the state of knowledge on tourism required by authorities and private firms when faced with decision making, for example for regional infrastructure planning. Although tourism is one of the main income-generating economic activities in Spain, domestic tourism has received little attention in the literature compared to inbound tourism. The paper uses, among other tools, gravity models and concentration indices to analyse the regional concentration of both domestic demand and supply, tourism flows among regions, and the causes that may explain the observed flows and attractiveness between regions. Among the most remarkable results are the high regional concentration of demand and supply, and the role of population and regional income as explanatory variables. Also remarkable are the attractiveness of the own region and of neighbouring ones, and the finding that domestic tourism may be acting as a regional income-redistributing activity.
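
As a minimal sketch of the gravity-model idea mentioned above (not the paper's actual specification), the following fits a log-linear gravity equation by ordinary least squares on invented data, with flows explained by origin population, destination income and distance, and then computes a simple Herfindahl concentration index; all variable names and figures are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 17 * 16                                     # invented origin-destination pairs

# Invented regressors (in logs): origin population, destination income, distance
log_pop_o = rng.normal(size=n)
log_inc_d = rng.normal(size=n)
log_dist = rng.normal(size=n)

# Invented flows generated from a known gravity relation plus noise
log_flow = 1.0 + 0.8 * log_pop_o + 0.6 * log_inc_d - 1.2 * log_dist \
           + rng.normal(scale=0.3, size=n)

# OLS estimation of log F_ij = a + b*log P_i + c*log Y_j + d*log d_ij
X = np.column_stack([np.ones(n), log_pop_o, log_inc_d, log_dist])
beta, *_ = np.linalg.lstsq(X, log_flow, rcond=None)
print("intercept, population, income, distance elasticities:", np.round(beta, 2))

# A simple concentration index for demand: Herfindahl over regional shares
shares = rng.dirichlet(np.ones(17))             # invented regional demand shares
print("Herfindahl index:", float((shares ** 2).sum()))
```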

Relevance:

10.00%

Publisher:

Abstract:

Starting with logratio biplots for compositional data, which are based on the principle of subcompositional coherence, and then adding weights, as in correspondence analysis, we rediscover Lewi's spectral map and many connections to analyses of two-way tables of non-negative data. Thanks to the weighting, the method also achieves the property of distributional equivalence.
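
A minimal sketch, not taken from the paper, of one common formulation of weighted log-ratio analysis (the spectral map): the log-transformed table is double-centred with row and column masses and decomposed by a weighted SVD; the two-way table is invented.

```python
import numpy as np

def spectral_map(N: np.ndarray):
    """Weighted log-ratio analysis (spectral map) of a strictly positive table N.
    Returns row and column principal coordinates for the first two dimensions."""
    P = N / N.sum()
    r = P.sum(axis=1)                              # row weights (masses)
    c = P.sum(axis=0)                              # column weights (masses)
    L = np.log(P)
    L = L - (L @ c)[:, None]                       # subtract weighted row means
    L = L - (r @ L)[None, :]                       # subtract weighted column means
    S = np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :]
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    F = U[:, :2] * s[:2] / np.sqrt(r)[:, None]     # row principal coordinates
    G = Vt.T[:, :2] * s[:2] / np.sqrt(c)[:, None]  # column principal coordinates
    return F, G

# Toy non-negative two-way table (invented)
N = np.array([[12.0, 5.0, 3.0],
              [ 4.0, 9.0, 7.0],
              [ 6.0, 2.0, 11.0],
              [ 8.0, 6.0, 4.0]])
F, G = spectral_map(N)
print(F.round(3))
```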

Relevance:

10.00%

Publisher:

Abstract:

Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop, through standard methodology such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained, together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
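
As a minimal sketch of a compositional singular value decomposition of the staying-in-the-simplex kind (not the paper's own code), the following takes the SVD of centred clr-transformed compositions and maps a rank-2 approximation back into the simplex; the compositions are invented.

```python
import numpy as np

def closure(X):
    """Rescale positive rows so that each sums to 1."""
    return X / X.sum(axis=1, keepdims=True)

def clr(X):
    L = np.log(X)
    return L - L.mean(axis=1, keepdims=True)

def clr_inv(Z):
    return closure(np.exp(Z))

rng = np.random.default_rng(4)
X = rng.dirichlet(np.ones(6), size=40)          # invented 6-part compositions

Z = clr(X)
Zc = Z - Z.mean(axis=0)                         # centring: perturbation by the inverse
                                                # of the geometric centre of the data
U, s, Vt = np.linalg.svd(Zc, full_matrices=False)

# Rank-2 approximation of the data set, expressed back in the simplex
Z2 = (U[:, :2] * s[:2]) @ Vt[:2, :]
X2 = clr_inv(Z2 + Z.mean(axis=0))
print(np.abs(X - X2).max())                     # reconstruction error of the rank-2 fit
```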

Relevance:

10.00%

Publisher:

Abstract:

The use of perturbation and power transformation operations permits the investigation of linear processes in the simplex as in a vector space. When the investigated geochemical processes can be constrained by the use of a well-known starting point, the eigenvectors of the covariance matrix of a non-centred principal component analysis make it possible to model compositional changes with respect to a reference point. The results obtained for the chemistry of water collected in the River Arno (central-northern Italy) have opened new perspectives for considering relative changes of the analysed variables and for hypothesising the relative effect of the different physical-chemical processes at work, thus laying the basis for quantitative modelling.
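
A minimal sketch of that idea, not taken from the paper: each composition is perturbed by the inverse of a reference composition (the Aitchison difference), and a non-centred PCA of the clr-transformed differences is obtained from the eigenvectors of the raw cross-product matrix; the water analyses and the reference point are invented.

```python
import numpy as np

def closure(X):
    return X / X.sum(axis=-1, keepdims=True)

def clr(X):
    L = np.log(X)
    return L - L.mean(axis=-1, keepdims=True)

def perturb_diff(X, ref):
    """Aitchison difference x (-) ref = C(x / ref), applied row-wise."""
    return closure(X / ref)

rng = np.random.default_rng(5)
X = rng.dirichlet(np.ones(5), size=60)                    # invented 5-part water analyses
ref = closure(np.array([0.30, 0.25, 0.20, 0.15, 0.10]))   # invented reference composition

Z = clr(perturb_diff(X, ref))                   # changes relative to the reference point
# Non-centred PCA: eigen-decomposition of the raw cross-product (no mean removal)
C = Z.T @ Z / Z.shape[0]
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
print("leading eigenvector (clr space):", eigvecs[:, order[0]].round(3))
```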

Relevance:

10.00%

Publisher:

Abstract:

Kriging is an interpolation technique whose optimality criteria are based on normality assumptions either for observed or for transformed data. This is the case of normal, lognormal and multigaussian kriging. When kriging is applied to transformed scores, optimality of the obtained estimators becomes a cumbersome concept: back-transformed optimal interpolations in transformed scores are not optimal in the original sample space, and vice-versa. This lack of compatible criteria of optimality induces a variety of problems in both point and block estimates. For instance, lognormal kriging, widely used to interpolate positive variables, has no straightforward way to build consistent and optimal confidence intervals for estimates. These problems are ultimately linked to the assumed space structure of the data support: for instance, positive values, when modelled with lognormal distributions, are assumed to be embedded in the whole real space, with the usual real space structure and Lebesgue measure.
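
A minimal numeric sketch, not from the paper, of the back-transformation issue it refers to: for a lognormal variable, simply exponentiating an optimal estimate of the log-scale mean underestimates the original-scale mean, and the usual lognormal correction exp(mu + sigma^2/2) is needed; all numbers are simulated.

```python
import numpy as np

rng = np.random.default_rng(6)
mu, sigma = 1.0, 0.8

# Simulated positive variable with a lognormal distribution
z = rng.lognormal(mean=mu, sigma=sigma, size=100_000)

y = np.log(z)
mu_hat, var_hat = y.mean(), y.var(ddof=1)        # optimal estimates on the log scale

naive = np.exp(mu_hat)                           # naive back-transform (biased low)
corrected = np.exp(mu_hat + var_hat / 2)         # lognormal mean correction
print(f"true mean      {np.exp(mu + sigma**2 / 2):.3f}")
print(f"sample mean    {z.mean():.3f}")
print(f"naive exp(mu)  {naive:.3f}")
print(f"corrected      {corrected:.3f}")
```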

Relevance:

10.00%

Publisher:

Abstract:

The low levels of unemployment recorded in the UK in recent years are widely cited as evidence of the country's improved economic performance, and the apparent convergence of unemployment rates across the country's regions is used to suggest that the longstanding divide in living standards between the relatively prosperous 'south' and the more depressed 'north' has been substantially narrowed. Dissenters from these conclusions have drawn attention to the greatly increased extent of non-employment (around a quarter of the UK's working-age population are not in employment) and the marked regional dimension in its distribution across the country. Amongst these dissenters it is generally agreed that non-employment is concentrated amongst older males previously employed in the now very much smaller 'heavy' industries (e.g. coal, steel, shipbuilding). This paper uses the tools of compositional data analysis to provide a much richer picture of non-employment, one which challenges the conventional wisdom about UK labour market performance as well as the dissenters' view of the nature of the problem. It is shown that, associated with the striking 'north/south' divide in non-employment rates, there is a statistically significant relationship between the size of the non-employment rate and the composition of non-employment. Specifically, it is shown that the share of unemployment in non-employment is negatively correlated with the overall non-employment rate: in regions where the non-employment rate is high, the share of unemployment is relatively low. So the unemployment rate is not a very reliable indicator of regional disparities in labour market performance. Even more importantly from a policy viewpoint, a significant positive relationship is found between the size of the non-employment rate and the share of those not employed through reason of sickness or disability, and it seems (contrary to the dissenters) that this connection is just as strong for women as it is for men.
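
A minimal sketch, not the paper's analysis, of how such a relationship can be checked compositionally: the log-ratio of the unemployment share within non-employment against the remaining categories is correlated with the overall non-employment rate, here on invented regional data.

```python
import numpy as np

rng = np.random.default_rng(7)
n_regions = 12

# Invented regional data: overall non-employment rate and its 3-part composition
# (unemployed, sick/disabled, other inactive), each row closed to 1
rate = rng.uniform(0.15, 0.35, size=n_regions)
comp = rng.dirichlet(np.ones(3), size=n_regions)

# Log-ratio of the unemployment share against the geometric mean of the other parts
logratio = np.log(comp[:, 0] / np.sqrt(comp[:, 1] * comp[:, 2]))

# Correlation between the non-employment rate and the unemployment log-ratio
r = np.corrcoef(rate, logratio)[0, 1]
print(f"correlation: {r:.2f}")
```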

Relevance:

10.00%

Publisher:

Abstract:

In human population genetics, routine applications of principal component techniques are often required. Population biologists make widespread use of certain discrete classifications of human samples into haplotypes, the monophyletic units of phylogenetic trees constructed from several single nucleotide bimorphisms hierarchically ordered. Compositional frequencies of the haplotypes are recorded within the different samples. Principal component techniques are then required as a dimension-reducing strategy to bring the dimension of the problem to a manageable level, say two, to allow for graphical analysis. Population biologists at large are not aware of the special features of compositional data and normally make use of the crude covariance of compositional relative frequencies to construct principal components. In this short note we present our experience with using traditional linear principal components or compositional principal components based on logratios, with reference to a specific dataset.
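
A minimal sketch, not from the note, contrasting the two approaches on invented haplotype frequencies: PCA of the raw relative frequencies versus PCA of their centred log-ratios (a small zero-replacement value is assumed where a haplotype is effectively absent).

```python
import numpy as np

def pca_scores(M, k=2):
    """First k principal component scores of the rows of M."""
    Mc = M - M.mean(axis=0)
    U, s, Vt = np.linalg.svd(Mc, full_matrices=False)
    return U[:, :k] * s[:k]

rng = np.random.default_rng(8)
freq = rng.dirichlet(np.ones(7) * 0.8, size=25)    # invented haplotype frequencies

# Crude approach: PCA directly on the relative frequencies
crude = pca_scores(freq)

# Compositional approach: PCA on centred log-ratios, with simple zero replacement
eps = 1e-5
f = np.clip(freq, eps, None)
f = f / f.sum(axis=1, keepdims=True)
clr = np.log(f) - np.log(f).mean(axis=1, keepdims=True)
compositional = pca_scores(clr)

print(crude[:3].round(3))
print(compositional[:3].round(3))
```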

Relevance:

10.00%

Publisher:

Abstract:

The main instrument used in psychological measurement is the self-report questionnaire. One of its major drawbacks, however, is its susceptibility to response biases. A known strategy to control these biases has been the use of so-called ipsative items. Ipsative items are items that require the respondent to make between-scale comparisons within each item. The selected option determines to which scale the weight of the answer is attributed. Consequently, in questionnaires consisting only of ipsative items every respondent is allotted an equal amount, i.e. the total score, which each can distribute differently over the scales. Therefore this type of response format yields data that can be considered compositional from its inception. Methodologically oriented psychologists have heavily criticized this type of item format, since the resulting data are also marked by the associated unfavourable statistical properties. Nevertheless, clinicians have kept using these questionnaires to their satisfaction. This investigation therefore aims to evaluate both positions and addresses the similarities and differences between the two data collection methods. The ultimate objective is to formulate a guideline on when to use which type of item format. The comparison is based on data obtained with both an ipsative and a normative version of three psychological questionnaires, which were administered to 502 first-year students in psychology according to a balanced within-subjects design. Previous research only compared the direct ipsative scale scores with the derived ipsative scale scores. The use of compositional data analysis techniques also enables one to compare derived normative score ratios with direct normative score ratios. The addition of the second comparison not only offers the advantage of a better-balanced research strategy; in principle it also allows for parametric testing in the evaluation.