7 results for fisher
in Helda - Digital Repository of University of Helsinki
Abstract:
The objective was to measure productivity growth and its components in Finnish agriculture, especially in dairy farming. A further objective was to compare different methods and models - both parametric (stochastic frontier analysis) and non-parametric (data envelopment analysis) - in estimating the components of productivity growth, and to assess the sensitivity of the results to the different approaches. The parametric approach was also applied to the investigation of various aspects of heterogeneity. A common feature of the first three of the five articles is that they concentrate empirically on technical change, technical efficiency change and the scale effect, mainly on the basis of decompositions of the Malmquist productivity index. The last two articles explore an intermediate route between the Fisher and Malmquist productivity indices and develop a detailed but meaningful decomposition for the Fisher index, together with empirical applications. Distance functions play a central role in the decomposition of both the Malmquist and the Fisher productivity index. Three panel data sets from the 1990s were used in the study. A common feature of all the data is that they cover the periods before and after Finnish EU accession. Another common feature is that the analysis concentrates mainly on dairy farms or their roughage production systems. Productivity growth on Finnish dairy farms was relatively slow in the 1990s: approximately one percent per year, regardless of the method used. Despite considerable annual variation, productivity growth seems to have accelerated towards the end of the period, with a slowdown in the mid-1990s at the time of EU accession. No clear immediate effects of EU accession on technical efficiency could be observed. Technical change was the main contributor to productivity growth on dairy farms. However, average technical efficiency often showed a declining trend, meaning that deviations from the best-practice frontier increased over time; this suggests different paths of adjustment at the farm level. Different methods nevertheless provide somewhat different results, especially for the sub-components of productivity growth. In most analyses of dairy farms the scale effect on productivity growth was minor. A positive scale effect would be important for improving the competitiveness of Finnish agriculture through increasing farm size. The small effect may be related to the structure of agriculture and to the allocation of investments to specific groups of farms during the research period; it may also indicate that the utilization of scale economies faces particular constraints in Finnish conditions. However, the analysis of a sample of all farm types suggested a more considerable scale effect than the analysis of dairy farms.
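For orientation, a common output-oriented form of the Malmquist index and its split into efficiency change and technical change (a standard textbook decomposition, not necessarily the exact variant used in the articles) can be written with period-t and period-t+1 output distance functions as:

\[
M_o(x^t, y^t, x^{t+1}, y^{t+1})
  = \underbrace{\frac{D_o^{t+1}(x^{t+1}, y^{t+1})}{D_o^{t}(x^{t}, y^{t})}}_{\text{efficiency change}}
    \times
    \underbrace{\left[
      \frac{D_o^{t}(x^{t+1}, y^{t+1})}{D_o^{t+1}(x^{t+1}, y^{t+1})}
      \cdot
      \frac{D_o^{t}(x^{t}, y^{t})}{D_o^{t+1}(x^{t}, y^{t})}
    \right]^{1/2}}_{\text{technical change}}
\]

Under variable returns to scale the efficiency-change term is typically split further into pure efficiency change and scale-efficiency change, the latter corresponding to the scale effect referred to above.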
Abstract:
Analyzing statistical dependencies is a fundamental problem in all empirical science. Dependencies help us understand causes and effects, create new scientific theories, and invent cures to problems. Nowadays, large amounts of data are available, but efficient computational tools for analyzing the data are missing. In this research, we develop efficient algorithms for a commonly occurring search problem - searching for the statistically most significant dependency rules in binary data. We consider dependency rules of the form X->A or X->not A, where X is a set of positive-valued attributes and A is a single attribute. Such rules describe which factors either increase or decrease the probability of the consequent A. A classical example is genetic and environmental factors, which can either cause or prevent a disease. The emphasis in this research is that the discovered dependencies should be genuine - i.e. they should also hold in future data. This is an important distinction from traditional association rules, which - in spite of their name and a similar appearance to dependency rules - do not necessarily represent statistical dependencies at all, or represent only spurious connections that occur by chance. Therefore, the principal objective is to search for rules with statistical significance measures. Another important objective is to search only for non-redundant rules, which express the real causes of the dependence without any incidental extra factors. Extra factors add no new information on the dependence; they can only blur it and make it less accurate in future data. The problem is computationally very demanding, because the number of all possible rules increases exponentially with the number of attributes. In addition, neither statistical dependency nor statistical significance is a monotonic property, which means that traditional pruning techniques do not work. As a solution, we first derive the mathematical basis for pruning the search space with any well-behaving statistical significance measure. The mathematical theory is complemented by a new algorithmic invention, which enables an efficient search without any heuristic restrictions. The resulting algorithm can be used to search for both positive and negative dependencies with any commonly used statistical measure, such as Fisher's exact test, the chi-squared measure, mutual information, and z scores. According to our experiments, the algorithm scales well, especially with Fisher's exact test. It can easily handle even the densest data sets with 10000-20000 attributes. Still, the results are globally optimal, which is a remarkable improvement over existing solutions. In practice, this means that the user does not have to worry whether the dependencies hold in future data or whether the data still contains better, but undiscovered, dependencies.
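As an illustration of the significance measure only (this is not the thesis's search algorithm, and the function and variable names below are hypothetical), a single candidate rule X->A can be scored on binary data with Fisher's exact test roughly as follows:

    import numpy as np
    from scipy.stats import fisher_exact

    def rule_p_value(data, x_cols, a_col):
        # data: binary matrix, rows = records, columns = attributes
        x_holds = data[:, x_cols].all(axis=1)   # records where every attribute in X is 1
        a_holds = data[:, a_col].astype(bool)   # records where the consequent A is 1
        table = [
            [int(np.sum(x_holds & a_holds)),  int(np.sum(x_holds & ~a_holds))],
            [int(np.sum(~x_holds & a_holds)), int(np.sum(~x_holds & ~a_holds))],
        ]
        # one-sided test: is A over-represented among the records where X holds?
        _, p = fisher_exact(table, alternative="greater")
        return p

    # toy example: 8 records, 3 attributes; rule {0, 1} -> 2
    data = np.array([[1, 1, 1], [1, 1, 1], [1, 1, 0], [1, 0, 0],
                     [0, 1, 0], [0, 0, 0], [0, 0, 1], [0, 0, 0]])
    print(rule_p_value(data, [0, 1], 2))

A full search would enumerate candidate attribute sets X and prune them with bounds on the chosen measure, which is precisely the hard part addressed by the thesis.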
Abstract:
Mika K. T. Pajunen's doctoral dissertation "Towards 'a real reunion'?" – Archbishop Aleksi Lehtonen's efforts for closer relations with the Church of England 1945–1951 is a study in general church history of the relations between the Church of England and the Evangelical Lutheran Church of Finland during Aleksi Lehtonen's archiepiscopate 1945–1951. The relations are examined from three perspectives: ecumenical, political, and church-political. The research period begins with the aftermath of the Rev. H.M. Waddams's visit to Finland in December 1944 and ends with Archbishop Lehtonen's death at Easter 1951. The development of church relations was punctuated by numerous visits, which reflected the change in the Church of England's attitude from wartime sympathy towards the Soviet Union to the completely opposite stance of the Cold War era. In Finland the English visitors met the highest leadership of both the church and society. Both countries were willing to support good church relations when the situation allowed, though not very systematically. The Evangelical Lutheran Church of Finland used its good relations with the Church of England to gain support and understanding for its church and society against the perceived Soviet threat, especially during the "years of danger" 1944–1948. The Church of England wanted to support its Finnish sister church, but took care that its support would not cause more harm than good in relation to the Soviet Union. Post-war ecumenical reconstruction brought the churches closer together. Lehtonen sought to continue the negotiations of the 1930s, which had achieved eucharistic hospitality between the churches, towards full communion. He was motivated both by evangelical-catholic theology and by the aim of supporting his country's and church's western connections. This challenged the ecumenical policy of the Church of England, which, instead of the Church of Finland, sought to continue negotiations with the Lutheran churches of Denmark, Norway, and Iceland, which did not yet have an official ecumenical agreement with the Church of England. Despite Lehtonen's efforts, the Church of England chose to let the Finnish situation mature: by this it meant the spread of the historic episcopate, the touchstone of the relations, throughout the Church of Finland before it would consider itself ready to proceed towards full communion. In both churches there was a small core group of enthusiastic church leaders who hoped for closer relations. The English friends of Finland were motivated by the need to help Finland in a difficult political situation. In Finland, Archbishop Lehtonen supported the high-church liturgical movement, which had close ties to Anglicanism but was opposed by conservative pietists. General opinion in the Church of Finland mainly sided with the pietist position, to which Anglicanism appeared theologically both too catholic and too reformed. Church relations settled down after the 1948 Lambeth Conference, which encouraged the Anglican churches to accept the recommendations of the 1930s negotiations aiming at closer church relations. Lehtonen seemed content with this. At the same time, the ecumenical reconstruction that had supported closer church relations came to an end. Lehtonen continued to promote closer relations, but his enthusiasm waned along with his declining health. As an indication of the narrowness of Lehtonen's line, after his death none of the bishops of the Evangelical Lutheran Church of Finland was found to continue his active pro-Anglican policy.
Abstract:
A detailed study is presented of the expected performance of the ATLAS detector. The reconstruction of tracks, leptons, photons, missing energy and jets is investigated, together with the performance of b-tagging and the trigger. The physics potential for a variety of interesting physics processes, within the Standard Model and beyond, is examined. The study comprises a series of notes based on simulations of the detector and physics processes, with particular emphasis given to the data expected from the first years of operation of the LHC at CERN.
Abstract:
The study measures the development of productivity in piglet production on ProAgria pig farms with farm business accounting records in 2003–2008. Productivity is measured with the Fisher productivity index, which is decomposed into technical, allocative, and scale efficiency as well as technological change and a price effect. Measured with a productivity index aggregated over the whole data set, productivity grew by a total of 14.3% over five years, corresponding to an annual growth rate of 2.7%. The producers' average productivity index gives almost the same result: according to it, productivity grows by a total of 14.7%, or 2.8% per year. Improvement in scale efficiency is found to be the most important source of productivity growth: scale efficiency improves by 1.6% per year measured in aggregate and by 2.1% per year on average across farms. Improvement in technical efficiency is another factor contributing to productivity growth during the study period; by both measures the increase averages 1.4% per year. Allocative efficiency declines slightly: by 0.1% per year in aggregate and by 0.4% per year on average. Technological change during the study period is slightly negative, on average -0.1% per year, although annual variation is strong. Price changes have had little effect on the level of productivity, as the annual changes in the price effect remain below half a percent in every year and the average annual change is -0.1%. A key factor promoting productivity growth appears to have been the growth in farm size, which has improved structural efficiency. The fact that technological change remained negative, however, means that the best observed level of productivity did not rise at all.
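For reference (this states only the standard definition of the index, not the decomposition used in the study), the Fisher quantity index is the geometric mean of the Laspeyres and Paasche indices, and the Fisher productivity index is the ratio of the Fisher output quantity index to the Fisher input quantity index:

\[
Q_F = \sqrt{Q_L \, Q_P}
    = \sqrt{\frac{\sum_i p_i^0 q_i^1}{\sum_i p_i^0 q_i^0}
            \cdot
            \frac{\sum_i p_i^1 q_i^1}{\sum_i p_i^1 q_i^0}},
\qquad
\mathit{TFP}_F = \frac{Q_F^{\text{output}}}{Q_F^{\text{input}}}
\]

where \(p_i^t\) and \(q_i^t\) denote the price and quantity of item \(i\) in period \(t\in\{0,1\}\).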
Improving outcome of childhood bacterial meningitis by simplified treatment: Experience from Angola
Abstract:
Background
Acute bacterial meningitis (BM) continues to be an important cause of childhood mortality and morbidity, especially in developing countries. Prognostic scales and the identification of risk factors for adverse outcome both aid in assessing disease severity. New antimicrobial agents or adjunctive treatments - except for oral glycerol - have essentially failed to improve BM prognosis. A retrospective observational analysis found paracetamol beneficial in adult bacteraemic patients, and some experts recommend slow β-lactam infusion. We examined these treatments in a prospective, double-blind, placebo-controlled clinical trial.
Patients and methods
A retrospective analysis included 555 children treated for BM in 2004 in the infectious disease ward of the Paediatric Hospital of Luanda, Angola. Our prospective study randomised 723 children into four groups, to receive a combination of cefotaxime as infusion or as boluses every 6 hours for the first 24 hours and oral paracetamol or placebo for 48 hours. The primary endpoints were 1) death or severe neurological sequelae (SeNeSe), and 2) deafness.
Results
In the retrospective study, the mortality of children who received a blood transfusion was 23% (30 of 128) vs. 39% (109 of 282) without blood transfusion (p=0.004). In the prospective study, 272 (38%) of the children died. Of the 451 survivors, 68 (15%) showed SeNeSe, and 12% (45 of 374) were deaf. Whereas no difference between treatment groups was observable in the primary endpoints, early mortality in the infusion-paracetamol group was lower, with the difference (Fisher's exact test) from the other groups at 24, 48, and 72 hours being significant (p=0.041, 0.0005, and 0.005, respectively). Prognostic factors for adverse outcome were impaired consciousness, dyspnoea, seizures, delayed presentation, and absence of electricity at home (Simple Luanda Scale, SLS); the Bayesian Luanda Scale (BLS) also included abnormally low or high blood glucose.
Conclusions
New studies are warranted concerning the possible beneficial effect of blood transfusion, concerning longer treatment with cefotaxime infusion and oral paracetamol, and to validate our simple prognostic scales.
Abstract:
Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census had developed a sampling design for the Current Population Survey (CPS) in the 1940s. A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, published in a memoir in 1774 which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which was represented by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. In 1894, at the International Statistical Institute meeting, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, still prevailing, was that the sample should be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science. He revolutionized the theory of statistics. In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw repeated samples from the same population and to assume that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.