Abstract:
The aim of our study is to assess the evolution of the flexibility of the posterior kinetic chain in schoolchildren aged 5 to 11, and to identify the age groups in which specific programmes are needed to improve it. All the schoolchildren underwent the same assessments: the sit-and-reach test, anthropometric measurements (height and weight), and the Minnesota questionnaire on leisure-time energy expenditure. The results show that the flexibility of the schoolchildren studied decreases progressively as age increases. An age interval around 9 years appears to mark a change of trend in the schoolchildren's flexibility: from that age onwards the degree of flexibility decreases significantly, and the decline is more marked among boys, who therefore require a more specific intervention.
Abstract:
Material throughput is a means of measuring the so-called social metabolism, or physical dimensions of a society’s consumption, and can be taken as an indirect and approximate indicator of sustainability. Material flow accounting can be used to test the dematerialisation hypothesis, the idea that technological progress causes a decrease in total material used (strong dematerialisation) or material used per monetary unit of output (weak dematerialisation). This paper sets out the results of a material flow analysis for Spain for the period from 1980 to 2000. The analysis reveals that neither strong nor weak dematerialisation took place during the period analysed. Although the population did not increase considerably, materials mobilised by the Spanish economy (DMI) increased by 85% in absolute terms, surpassing GDP growth. In addition, Spain became more dependent on external trade in physical terms. In fact, its imports are more than twice the amount of its exports in terms of weight.
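The dematerialisation test described above reduces to two comparisons, which the toy function below encodes. The DMI index of 185 mirrors the 85% increase reported for Spain; the GDP index of 170 is a hypothetical value chosen only to illustrate DMI growth outpacing GDP growth, since the abstract gives no GDP figure.

```python
def dematerialisation(dmi_start, dmi_end, gdp_start, gdp_end):
    """Classify a period: 'strong' if total material use fell,
    'weak' if material intensity (DMI/GDP) fell, else 'none'."""
    if dmi_end < dmi_start:
        return "strong"
    if dmi_end / gdp_end < dmi_start / gdp_start:
        return "weak"
    return "none"

# Index numbers (1980 = 100); the GDP index 170 is hypothetical.
print(dematerialisation(100, 185, 100, 170))  # -> none
```

With DMI rising faster than GDP, intensity rises, so neither strong nor weak dematerialisation is found, matching the paper's conclusion.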
Abstract:
Research project carried out by a secondary school student and awarded a CIRIT Prize for fostering the scientific spirit among young people in 2005. This study of DNA aims to provide an introduction to the use of computational mathematics in DNA research. Its objectives are, on the one hand, to understand what DNA is and what its mechanisms of duplication and transmission of genetic information are, as well as the role of other molecules involved in this process; it also examines which cellular processes human beings have been able to copy or imitate. Building on this introduction, the study looks at what is specifically meant by DNA computing, some of the mathematical problems that have been solved with it, and some applications of DNA in other fields. The research leads to several conclusions. First, DNA is an excellent candidate for performing mathematical calculations. Second, although this work does not solve computationally hard problems, it demonstrates the capacity of DNA molecules to solve problems. Third, the interest shown by major computing companies makes it more plausible that computers running on DNA molecules may exist in the future. Finally, it shows that mathematics, computer science and biology are three interrelated fields. To lighten the seriousness of the work somewhat, it ends by describing a way of setting DNA chains to music, and presents some results, such as the melodies associated with the 24 human chromosomes and those corresponding to 29 proteins.
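As a playful companion to the closing remark about setting DNA chains to music: the abstract does not describe the project's actual base-to-pitch scheme, so the mapping below is entirely hypothetical, a minimal sketch of the idea.

```python
# Hypothetical base-to-note mapping; the project's real scheme is unknown.
NOTE_OF = {"A": "C4", "C": "E4", "G": "G4", "T": "B4"}

def dna_to_notes(sequence):
    """Translate a DNA string into a list of note names,
    skipping any character that is not one of the four bases."""
    return [NOTE_OF[b] for b in sequence.upper() if b in NOTE_OF]

print(dna_to_notes("GATTACA"))  # -> ['G4', 'C4', 'B4', 'B4', 'C4', 'E4', 'C4']
```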
Abstract:
We present a computer-assisted analysis of combinatorial properties of the Cayley graphs of certain finitely generated groups: given a group with a finite set of generators, we study the density of the corresponding Cayley graph, that is, the least upper bound for the average vertex degree (= number of adjacent edges) over all finite subgraphs. It is known that an m-generated group is amenable if and only if the density of the corresponding Cayley graph equals 2m. We test amenable and non-amenable groups, as well as groups for which amenability is unknown. In the latter class we focus on Richard Thompson's group F.
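To illustrate the density notion concretely (this is my own toy example, not the authors' code), the sketch below computes the average vertex degree of balls in the Cayley graph of Z^2 with the standard generators. Since Z^2 is amenable and 2-generated, the averages should approach 2m = 4 as the balls grow.

```python
from itertools import product

def ball_z2(r):
    """Vertices of the radius-r ball in the Cayley graph of Z^2 with
    standard generators (the word metric is the L1 metric)."""
    return {(x, y) for x, y in product(range(-r, r + 1), repeat=2)
            if abs(x) + abs(y) <= r}

def average_degree(vertices):
    """Average vertex degree of the subgraph induced on `vertices`."""
    gens = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    # Counts each undirected edge twice, which is exactly the degree sum.
    degree_sum = sum(1 for (x, y) in vertices for (dx, dy) in gens
                     if (x + dx, y + dy) in vertices)
    return degree_sum / len(vertices)

for r in (5, 20, 80):
    print(r, round(average_degree(ball_z2(r)), 3))  # climbs toward 4
```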
Abstract:
This note describes ParallelKnoppix, a bootable CD that allows econometricians with average knowledge of computers to create and begin using a high performance computing cluster for parallel computing in very little time. The computers used may be heterogeneous machines, and clusters of up to 200 nodes are supported. When the cluster is shut down, all machines are in their original state, so their temporary use in the cluster does not interfere with their normal uses. An example shows how a Monte Carlo study of a bootstrap test procedure may be done in parallel. Using a cluster of 20 nodes, the example runs approximately 20 times faster than it does on a single computer.
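The Monte Carlo example in this abstract is embarrassingly parallel, and the generic pattern can be sketched with Python's standard multiprocessing module (this is my own illustration, unrelated to ParallelKnoppix itself, and a toy z-test stands in for the paper's bootstrap test procedure): replications are split across worker processes and the rejection counts are pooled.

```python
import math
import random
from multiprocessing import Pool

N_OBS = 30     # sample size per replication
N_REPS = 2000  # total Monte Carlo replications

def one_replication(seed):
    """One replication under H0 (mean zero): return 1 if the toy
    two-sided z-test rejects at the nominal 5% level, else 0."""
    rng = random.Random(seed)
    sample = [rng.gauss(0.0, 1.0) for _ in range(N_OBS)]
    z = (sum(sample) / N_OBS) / (1.0 / math.sqrt(N_OBS))
    return 1 if abs(z) > 1.96 else 0

def rejection_rate(n_workers=4):
    """Split replications across worker processes and pool the results."""
    with Pool(n_workers) as pool:
        rejects = pool.map(one_replication, range(N_REPS))
    return sum(rejects) / N_REPS

if __name__ == "__main__":
    print(rejection_rate())  # should be close to the nominal 0.05
```

Because each replication is seeded independently, the estimate is identical whatever the number of workers; only the wall-clock time changes.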
Abstract:
Pérez-Castrillo and Wettstein (2002) and Veszteg (2004) propose the use of a multibidding mechanism for situations where agents have to choose a common project. Examples are decisions involving public goods (or public "bads"). We report experimental results that test the practical tractability and effectiveness of the multibidding mechanism in environments where agents hold private information about their valuations of the projects. The mechanism performed quite well in the laboratory: it delivered the ex post efficient outcome in roughly three quarters of the cases across treatments; moreover, most subjects formed their bids in accordance with the theoretical bidding behavior.
Abstract:
We report on a series of experiments that test the effects of an uncertain supply on the formation of bids and prices in sequential first-price auctions with independent private values and unit demands. Supply is uncertain when buyers do not know the exact number of units to be sold (i.e., the length of the sequence). Although we observe non-monotone behavior when supply is certain, as well as substantial overbidding, the data qualitatively support our predicted price trends and the risk-neutral Nash equilibrium model of bidding for the last stage of a sequence, whether or not supply is certain. Our study shows that behavior in these markets changes significantly in the presence of an uncertain supply, and that the change can be explained by assuming that bidders form pessimistic beliefs about the occurrence of another stage.
Abstract:
We use structural methods to assess equilibrium models of bidding with data from first-price auction experiments. We identify conditions under which the Nash equilibrium models for homogeneous and for heterogeneous constant relative risk aversion can be tested when bidders' private valuations are independently and uniformly drawn. The outcomes of our study indicate that behavior may have been affected by the procedure used to conduct the experiments, and that the usual Nash equilibrium model for heterogeneous constant relative risk averse bidders does not consistently explain the observed overbidding. From an empirical standpoint, our analysis shows the possible drawbacks of overlooking the homogeneity hypothesis when testing symmetric equilibrium models of bidding, and it puts in perspective the sensitivity of structural inferences to the available information.
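For readers unfamiliar with the benchmark these auction studies build on, the snippet below evaluates the textbook symmetric equilibrium bid function for a first-price auction with values i.i.d. uniform on [0, 1] and constant relative risk aversion. It is a standard benchmark, not the paper's structural estimator.

```python
def crra_bid(v, n, r=1.0):
    """Symmetric Nash equilibrium bid with n bidders, values i.i.d.
    uniform on [0, 1], and CRRA utility u(x) = x**r (r = 1 is risk
    neutrality; r < 1 is risk aversion).  Textbook benchmark only."""
    return (n - 1) / (n - 1 + r) * v

# Risk aversion (r < 1) pushes bids up toward the value:
for r in (1.0, 0.5):
    print(r, crra_bid(0.8, n=4, r=r))
```

Under risk neutrality the bid shades the value by the factor (n-1)/n; risk aversion shrinks the shading, which is one standard explanation for the overbidding seen in the laboratory.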
Abstract:
When two candidates of different quality compete in a one-dimensional policy space, the equilibrium outcomes are asymmetric and do not correspond to the median. There are three main effects. First, the better candidate adopts more centrist policies than the worse candidate. Second, the equilibrium is statistical, in the sense that it predicts a probability distribution over outcomes rather than a single degenerate outcome. Third, the equilibrium varies systematically with the level of uncertainty about the location of the median voter. We test these three predictions using laboratory experiments, and find strong support for all three. We also observe some biases and show that they can be explained by quantal response equilibrium.
Abstract:
The Hausman (1978) test is based on the vector of differences of two estimators. It is usually assumed that one of the estimators is fully efficient, since this simplifies calculation of the test statistic. However, this assumption limits the applicability of the test, since widely used estimators such as the generalized method of moments (GMM) or quasi maximum likelihood (QML) are often not fully efficient. This paper shows that the test may easily be implemented, using well-known methods, when neither estimator is efficient. To illustrate, we present both simulation results as well as empirical results for utilization of health care services.
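A minimal sketch of the general idea, under my own simplifying assumptions (two scalar estimators, paired bootstrap): because Var(b1 − b2) is estimated directly rather than as the difference of the two variances, neither estimator needs to be efficient. This is one of the "well-known methods" one might use, not necessarily the paper's exact procedure.

```python
import random

def ols_slope(xs, ys):
    """OLS slope of y on x (with intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return num / sum((x - mx) ** 2 for x in xs)

def ratio_slope(xs, ys):
    """Method-of-moments slope sum(y)/sum(x); consistent when the
    intercept is zero, but not efficient."""
    return sum(ys) / sum(xs)

def hausman(xs, ys, n_boot=500, seed=1):
    """Hausman-type statistic for the difference of two scalar
    estimators, with Var(b1 - b2) estimated by a paired bootstrap."""
    rng = random.Random(seed)
    n = len(xs)
    d_hat = ols_slope(xs, ys) - ratio_slope(xs, ys)
    diffs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        bx = [xs[i] for i in idx]
        by = [ys[i] for i in idx]
        diffs.append(ols_slope(bx, by) - ratio_slope(bx, by))
    m = sum(diffs) / n_boot
    var = sum((d - m) ** 2 for d in diffs) / (n_boot - 1)
    return d_hat ** 2 / var  # compare with the chi-square(1) 5% value, 3.84

# Hypothetical data under the null model y = 2x + noise (no intercept),
# where both estimators are consistent:
rng = random.Random(7)
xs = [rng.uniform(1.0, 2.0) for _ in range(200)]
ys = [2.0 * x + rng.gauss(0.0, 0.5) for x in xs]
print(hausman(xs, ys))
```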
Abstract:
Expectations are central to behaviour. Despite the existence of subjective expectations data, the standard approach is to ignore them, posit a model of behaviour, and infer expectations from realisations. In the context of income models, we reveal the informational gain obtained from using both a canonical model and subjective expectations data. We propose a test for this informational gain, and illustrate our approach with an application to the problem of measuring income risk.
Abstract:
Empirical evidence on the effectiveness of R&D subsidies to firms has produced mixed results so far. One possible explanation is that firms and project selection rules may be quite heterogeneous both across agencies and across industries, leading to different outcomes in terms of the induced additional private effort. Here we focus on the participation stage. Using a sample of Spanish firms, we test for differences across agencies and industries. Our results suggest that firms in the same industry face different hurdles to participate in different agencies’ programs, that participation patterns may reflect a combination of agency goals, and that patterns differ across high-tech and low-tech industries.
Abstract:
At the end of the nineteenth century, Marshall described concentrations of small and medium-sized enterprises specialised in a specific production activity in certain districts of some English industrial cities. Building on his contribution, Italian scholars have paid particular attention to the local production system that Marshall termed the industrial district. In other countries, different but related territorial models, such as the milieu or geographical industrial clusters, have played a central role. Recently, these models have been extended to non-industrial fields such as culture, rural activities and tourism. In this paper, we explore the extension of these territorial models to the study of tourist activities in Italy, using a framework that can easily be applied to other countries or regions. The paper is divided into five sections. In the first, we review the territorial models applied to the tourism industry. In the second, we construct a tourist filiere and apply a methodology for identifying local systems through GIS tools; a taxonomy of the Italian Tourist Local Systems is then presented. In the third, we discuss the sources of competitiveness of these Tourist Local Systems. In the fourth, we test a spatial econometrics model on different kinds of Italian Tourist Local Systems (rural systems, arts cities, tourist districts) in order to measure external economies and territorial networks. Finally, conclusions and policy implications are presented.
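The spatial-econometrics step in the fourth section rests on measuring spatial dependence between local systems. As an illustrative sketch (not the model estimated in the paper), the function below computes Moran's I, the standard spatial autocorrelation statistic, from a list of regional values and a spatial weights matrix; the data at the end are hypothetical.

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation statistic.  `weights[i][j]`
    is the spatial weight between units i and j (zero diagonal);
    positive I means similar values cluster among neighbours."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    s0 = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# Four hypothetical regions on a line; adjacent regions share weight 1.
W = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
print(morans_i([1.0, 2.0, 3.0, 4.0], W))  # positive: neighbours are similar
```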
Search for graviton production in scenarios with extra dimensions at the Tevatron with the CDF detector
Abstract:
Extra dimensions offer a solution to the hierarchy problem of the forces and propose an explanation for the apparent weakness of gravity compared with the other forces. Using simulated data and data provided by the CDF detector, located at the Tevatron accelerator at Fermilab (Chicago, USA), a search for extra dimensions and graviton production was carried out.