29 results for Schur-Cohn criterion



Abstract:

This thesis describes methods for the reliable identification of hadronically decaying tau leptons in the search for heavy Higgs bosons of the minimal supersymmetric standard model of particle physics (MSSM). The identification of hadronic tau lepton decays, i.e. tau-jets, is applied to the gg->bbH, H->tautau and gg->tbH+, H+->taunu processes to be searched for in the CMS experiment at the CERN Large Hadron Collider. Of all the event selections applied in these final states, the tau-jet identification is the single most important selection criterion for separating the tiny Higgs boson signal from the large number of background events. The tau-jet identification is studied with methods based on a signature of a low charged track multiplicity, the containment of the decay products within a narrow cone, an isolated electromagnetic energy deposition, a non-zero tau lepton flight path, the absence of electrons, muons, and neutral hadrons in the decay signature, and a relatively small tau lepton mass compared to the mass of most hadrons. Furthermore, in the H+->taunu channel, helicity correlations are exploited to separate the signal tau jets from those originating from W->taunu decays. Since many of these identification methods rely on the reconstruction of charged particle tracks, the systematic uncertainties resulting from the mechanical tolerances of the tracking sensor positions are estimated with care. The tau-jet identification and other standard selection methods are applied to the search for the heavy neutral and charged Higgs bosons in the H->tautau and H+->taunu decay channels. For the H+->taunu channel, the tau-jet identification is redone and optimized with a more recent and more detailed event simulation than previously used in the CMS experiment. Both decay channels are found to be very promising for the discovery of the heavy MSSM Higgs bosons. The Higgs boson(s), whose existence has not yet been experimentally verified, are part of the standard model and its most popular extensions. They are a manifestation of a mechanism which breaks the electroweak symmetry and generates masses for particles. Since the H->tautau and H+->taunu decay channels are important for the discovery of the Higgs bosons in a large region of the permitted parameter space, the analysis described in this thesis serves as a probe of the properties of the microcosm of particles and their interactions at energy scales beyond the standard model of particle physics.


Abstract:

Biopower, Otherness and Women's Agency in Assisted Reproduction. This sociological study analyses how, why and with what kind of consequences assisted reproductive technologies (ART) have become the primary technology for governing infertility in Finland, both at the level of individuals and of society. The phenomenon is construed as one of the strategies of Foucauldian biopower, since ART are political techniques of the beginning of life par excellence: they are used to prepare the bodies of certain types of women to create a certain kind of life, i.e. certain kinds of children. Moreover, ART are interpreted as gendered control techniques with which the pure, and at the same time prevailing, social order symbolised by the female body is maintained by naming and excluding otherness, that is, unsuitable mother candidates and children. Finally, the study considers how the agency, or subjectivity, of women experiencing infertility and seeking treatment appears in the prevailing context of ART. The introduction of IVF-based reproductive technologies to Finland and the treatment practices of the early 1990s were studied on the basis of a clinic questionnaire, interviews with medical doctors and articles in the Medical Journal Duodecim from 1969 to 2000. The opinions of treatment providers were studied by conducting thematic interviews with fertility doctors in 1993. The experiences of women who had received treatment or experienced infertility were studied by means of a survey in 1994 and by analysing the content of messages in an online discussion forum in 2000. On the basis of the doctor interviews, a significant criterion for choosing mother candidates turned out to be the woman's vitality and her mental and physical health, which are considered prerequisites for the vitality of the child to be born. Hierarchies concerning children also became evident. While people normally conceive their children on their own, this is what people experiencing infertility try to do as well. In the era of ART, the primary child is the parents' own genetic child, a secondary option for Finnish parents is a genetically Finnish child conceived with donated Finnish gametes or embryos, and the last option is an adopted child of foreign origin. Women's agency mainly appears in their way of using ART as a technology of the self for exercising self-control over their own nature, which helps them to prepare their bodies for pregnancy in co-operation with a fertility doctor. Women's creative free agency exceeding governance appeared as a distinctive use of language with which they created shared meanings for their infertility experience, their own individual and group identity, and a distinctive reality. ART are highly political techniques, as they have the potential to change the ways of having children and to shape life. Therefore, further sociological research on them is important and needed. Key words: practices of assisted reproduction, women's agency, biopower, vital politics of the beginning of life, otherness


Abstract:

The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance, and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology based on generalizations that are invariant and stable. An invariant generalization continues to hold or be valid under a special kind of change, called an intervention, that alters the values of its variables. Whether a generalization remains invariant under such interventions is the criterion that determines whether it is explanatory. A generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns a kind of generality that has to do with whether a generalization continues to hold across possible background conditions: the more stable a generalization, the less dependent it is on background conditions to remain true. Although it is invariance rather than stability of generalizations that furnishes us with explanatory generalizations, stability has an important function in this context of explanations, namely, it furnishes us with the extrapolability and reliability of scientific explanations. I also discuss non-empirical investigations of models that I call robustness and sensitivity analyses. I call sensitivity analyses those investigations in which a single model is studied with regard to its stability conditions by making changes and variations to the values of the model's parameters. As a general definition of robustness analyses I propose investigations of variations in the modeling assumptions of different models of the same phenomenon, in which the focus is on whether they produce similar or convergent results. Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out the reasons why they do so: they show which conditions or assumptions the results of the models depend on. Key words: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
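
A toy illustration, not taken from the dissertation: two simple population-growth models (hypothetical choices here) make the distinction concrete. Sensitivity analysis varies one model's own parameter values and checks whether its prediction is stable; robustness analysis compares structurally different models of the same phenomenon for convergent results.

    # Toy illustration (not from the dissertation) of the two kinds of analysis.
    import math

    def logistic(r, K, n0=10.0, steps=200):
        """Discrete logistic growth: n_{t+1} = n_t + r*n_t*(1 - n_t/K)."""
        n = n0
        for _ in range(steps):
            n += r * n * (1 - n / K)
        return n

    def ricker(r, K, n0=10.0, steps=200):
        """Ricker growth: n_{t+1} = n_t * exp(r*(1 - n_t/K))."""
        n = n0
        for _ in range(steps):
            n *= math.exp(r * (1 - n / K))
        return n

    # Sensitivity analysis: vary the logistic model's own parameter r and see how
    # strongly the predicted equilibrium depends on it (its stability conditions).
    sensitivity = {r: round(logistic(r, K=100.0), 2) for r in (0.1, 0.5, 1.0, 1.5)}

    # Robustness analysis: do two structurally different models of the same
    # phenomenon produce similar (convergent) results under shared assumptions?
    robustness = {m.__name__: round(m(0.5, 100.0), 2) for m in (logistic, ricker)}

    print(sensitivity)   # equilibrium near K for all tested r values
    print(robustness)    # both models converge to the same equilibrium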


Abstract:

Based on the Aristotelian criterion referred to as 'abductio', Peirce suggests a method of hypothetical inference which operates in a different way than the deductive and inductive methods. "Abduction is nothing but guessing" (Peirce, 7.219). This principle is of extreme value for the study of our understanding of mathematical self-similarity in both of its typical presentations: relative or absolute. In the first case, abduction captures the quantitative/qualitative relationships of a self-similar object or process; in the second case, abduction makes the statistical treatment of self-similarity understandable, 'guessing' the continuation of geometric features to infinity through the use of a systematic stereotype (for instance, the assumption that the general shape of the Sierpiński triangle continues identically into its particular shapes). The metaphor coined by Peirce, of an exact map containing the same exact map (a map of itself), is not only the most important precedent of Mandelbrot's problem of measuring the boundary of a continuous irregular surface with a logarithmic ruler, but also remains a useful abstraction for conceptualising relative and absolute self-similarity and its mechanisms of implementation. It is also useful for explaining some of the most basic geometric ontologies as mental constructions: the notion of the infinite convergence of points in the corners of a triangle, or the intuition behind defining two parallel straight lines as two lines in a plane that 'never' intersect.
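
As a side note not drawn from the thesis itself, the self-similarity alluded to here has a standard compact formalisation: a figure composed of N copies of itself, each scaled by a factor r, has similarity dimension D, and Mandelbrot's logarithmic-ruler measurement of an irregular boundary follows Richardson's relation for the measured length L at ruler size \varepsilon:

    D = \frac{\log N}{\log (1/r)}, \qquad
    D_{\text{Sierpinski}} = \frac{\log 3}{\log 2} \approx 1.585, \qquad
    L(\varepsilon) \propto \varepsilon^{\,1-D}.

The Sierpiński triangle consists of N = 3 copies scaled by r = 1/2, which is the precise sense in which its general shape 'continues identically' into its parts.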


Abstract:

The aim of this work was to examine how breathing, swallowing and voicing are affected in different laryngeal disorders. For this purpose, we examined four patient groups: patients who had undergone total laryngectomy, anterior cervical decompression (ACD), or injection laryngoplasty with autologous fascia (ILAF), and patients with dyspnea during exercise. We studied the problems and benefits related to the automatic speech valve used for the rehabilitation of speech in laryngectomized patients. The device was given to 14 totally laryngectomized patients who used the traditional valve especially well. The usefulness of voice and the intelligibility of speech were assessed by speech pathologists. The results demonstrated better performance with the traditional valve in both dimensions. Most of the patients considered the automatic valve a helpful additional device, but because of heavier breathing and the greater effort needed for speech production, it was not suitable as the sole device in speech rehabilitation. Dysphonia and dysphagia are known complications of ACD. These symptoms are caused by the stretching of tissue during the surgery, but their extent and the recovery from them were not well known before our study. We studied two patient groups: an early group of 50 patients who were examined immediately before and after the surgery, and a late group of 64 patients who were examined 3–9 months postoperatively. Altogether, 60% reported dysphonia and 69% dysphagia immediately after the operation. Even though dysphagia and dysphonia often appeared after surgery, permanent problems seldom occurred. Six (12%) cases of transient and two (3%) cases of permanent vocal cord paresis were detected. In our third study, the long-term results of ILAF in 43 patients with unilateral vocal cord paralysis were examined. The mean follow-up was 5.8 years (range 3–10). Perceptual evaluation demonstrated improved voice quality, and videostroboscopy revealed complete or partial glottal closure in 83% of the patients. Fascia proved to be a stable injection material with good vocal results. In our final study, we developed a new diagnostic method for exertional laryngeal dyspnea by combining a cardiovascular exercise test with simultaneous fiberoptic observation of the larynx. With this method it is possible to visualize paradoxical closure of the vocal cords during inspiration, which is a diagnostic criterion for vocal cord dysfunction (VCD). We examined 30 patients referred to our hospital because of suspected exercise-induced vocal cord dysfunction (EIVCD). Twenty-seven of the thirty patients were able to perform the test. Dyspnea was induced in 15 patients, of whom five had EIVCD and four a high suspicion of EIVCD. With our test it is possible to establish an accurate diagnosis of exertional laryngeal dyspnea. Moreover, the frequently observed unnecessary use of asthma drugs among these patients can be avoided.


Abstract:

The aim of the study was to explore the importance of leadership criteria at the leader and subordinate levels of the insurance industry in Finland. The overall purpose of the thesis is tackled and analyzed from two perspectives: by examining the importance of the leadership criteria and the leadership style of Finnish insurance business leaders and their subordinates, and by examining the opinions of insurance business leaders regarding leadership criteria in two culturally different countries, the US and Finland. The thesis consists of three published articles that scrutinise the focal phenomena both theoretically and empirically. The main results of the study do not lend support to the existence of a universal model of leadership criteria in the insurance business. In fact, any such model seems to be based more on the particular organizational and cultural circumstances of the country in question. The leadership criteria seem to be quite stable irrespective of the comparatively short research period (3–5 years) and of hierarchical level (subordinate/leader). Leaders have major difficulties in changing their leadership style; in fact, in order to bring about an efficient organizational change in a company, the leader has to be replaced. The cultural dimensions (cooperation and monitoring) identified by Finnish subordinates were mostly in line with those of their managers, although the subordinates placed more emphasis on the monitoring of employees, which from their point of view could be seen as another element of managers' optimizing/efficiency requirements. In the Finnish surveys, a strong emphasis on cooperation and mutual trust became apparent among both subordinates and managers. The basic problem remains how to emphasize and balance these in practice in such a way that both parties are happy to work together on a common basis. The American surveys suggest, hypothetically, that in a soft market period (buyer's market) managers employ a more relationship-oriented leadership style and, correspondingly, adapt their leadership style to a more task-oriented approach in a hard market phase (seller's market). To improve business performance, Finnish insurance managers could probably concentrate more on task-oriented items such as reviewing, budgeting, monitoring and goal-orientation. The study also suggests that the social safety net of the European welfare-state ideology has so far shielded the culture-specific sense of social responsibility of Finnish managers from the hazards of free competition and globalization.


Abstract:

The first-line medication for mild to moderate Alzheimer's disease (AD) is based on cholinesterase inhibitors, which prolong the effect of the neurotransmitter acetylcholine in cholinergic nerve synapses and thereby relieve the symptoms of the disease. Implications of the involvement of cholinesterases in disease-modifying processes have increased interest in this research area. The drug discovery and development process is long and expensive: it takes on average 13.5 years and costs approximately 0.9 billion US dollars. Drug attrition in the clinical phases is common for several reasons, e.g. poor bioavailability of compounds leading to low efficacy, or toxic effects. Thus, improvements in the early drug discovery process are needed to create highly potent, non-toxic compounds with predicted drug-like properties. Nature has been a good source for the discovery of new medicines, accounting for around half of the new drugs approved for the market during the last three decades. These compounds are direct isolates from nature, their synthetic derivatives, or natural mimics. Synthetic chemistry is an alternative way to produce compounds for drug discovery purposes. Both sources have pros and cons. The screening of new bioactive compounds in vitro is based on assaying compound libraries against targets. The assay set-up has to be adapted and validated for each screen to produce high-quality data. Depending on the size of the library, miniaturization and automation are often required to reduce solvent and compound amounts and to speed up the process. In this contribution, natural extract, natural pure compound and synthetic compound libraries were assessed as sources of new bioactive compounds. The libraries were screened primarily for acetylcholinesterase inhibitory effects and secondarily for butyrylcholinesterase inhibitory effects. To screen the libraries, two assays were evaluated as screening tools and adapted to be compatible with the special features of each library. The assays were validated to produce high-quality data. Cholinesterase inhibitors with various potencies and selectivities were found in both the natural product and the synthetic compound libraries, which indicates that the two sources complement each other. It is acknowledged that natural compounds differ structurally from compounds in synthetic compound libraries, which further supports the view of complementarity, especially if a high diversity of structures is the criterion for selecting compounds for a library.


Abstract:

The research question of this thesis was how knowledge can be managed with information systems. Information systems can support but not replace knowledge management. Systems can mainly store epistemic organisational knowledge included in content, and process data and information. Certain value can be achieved by adding communication technology to the systems; communication itself, however, cannot be managed. A new layer between communication and manageable information was named knowformation. The knowledge management literature was surveyed, together with notions of information from philosophy, physics, communication theory, and information systems science. Positivism, post-positivism, and critical theory were studied, but knowformation in extended organisational memory appeared to be socially constructed. The memory management model of an extended enterprise (M3.exe) and the knowformation concept were the findings of iterative case studies covering data, information and knowledge management systems. The cases varied from groups to an extended organisation. The systems were investigated, and administrators, users (knowledge workers) and managers were interviewed. The model building required alternative sets of data, information and knowledge instead of the traditional pyramid, and the explicit-tacit dichotomy was also reconsidered. As human knowledge is the final aim of all data and information in the systems, the distinction between the management of information and the management of people was harmonised. Information systems were classified as the core of organisational memory. The content of the systems lies, in practice, between communication and presentation. Firstly, the epistemic criterion of knowledge is required neither in the knowledge management literature nor of the content of the systems. Secondly, systems deal mostly with containers, and the knowledge management literature with applied knowledge. The construction of reality based on system content and communication also supports the knowformation concept. Knowformation belongs to the memory management model of an extended enterprise (M3.exe), which is divided into horizontal and vertical key dimensions. Vertically, processes deal with content that can be managed, whereas communication can only be supported, mainly by infrastructure. Horizontally, the right-hand side of the model contains systems and the left-hand side content, which should be independent of each other. A strategy based on the model was defined.


Abstract:

The magnetically induced currents in organic monoring and multiring molecules, in Möbius-shaped molecules and in inorganic all-metal molecules have been investigated by means of the gauge-including magnetically induced currents (GIMIC) method. With the GIMIC method, ring-current strengths and ring-current density distributions can be calculated; for open-shell molecules, the spin current can also be obtained. The ring-current pathways and ring-current strengths can be used to understand the magnetic resonance properties of molecules, to identify indirectly the effect of non-bonded interactions on NMR chemical shifts, to design new molecules with tailored properties, and to discuss molecular aromaticity. In the thesis, the magnetic criterion for aromaticity has been adopted: a molecule that sustains a net diatropic ring current might be aromatic, a molecule that sustains a net paratropic ring current might be antiaromatic, and if the net current is zero, the molecule is nonaromatic. The electronic structure of the investigated molecules has been resolved by quantum chemical methods. The magnetically induced currents have been calculated with the GIMIC method at the density-functional theory (DFT) level, as well as at the self-consistent-field Hartree-Fock (SCF-HF), second-order Møller-Plesset perturbation theory (MP2) and coupled-cluster singles and doubles (CCSD) levels of theory. For closed-shell molecules, accurate ring-current strengths can be obtained at a reasonable computational cost at the DFT level and with rather small basis sets. For open-shell molecules, it is shown that correlated methods such as MP2 and CCSD might be needed to obtain reliable charge and spin currents, and basis-set convergence has to be checked by performing calculations with large enough basis sets. The results discussed in the thesis have been published in eight papers. In addition, some previously unpublished results on the ring currents in the endohedral fullerene Sc3C2@C80 and in coronene are presented. It is shown that dynamical effects should be taken into account when modelling the magnetic resonance parameters of endohedral metallofullerenes such as Sc3C2@C80. The ring-current strengths in a series of nano-sized hydrocarbon rings are related to static polarizabilities and to 1H nuclear magnetic resonance (NMR) shieldings. In a case study on the possible aromaticity of a Möbius-shaped [16]annulene we found that, according to the magnetic criterion, the molecule is nonaromatic. The applicability of the GIMIC method for assigning the aromatic character of molecules was confirmed in a study of the ring currents in simple monocyclic aromatic, homoaromatic, antiaromatic, and nonaromatic hydrocarbons. Case studies on nanorings, hexaphyrins and [n]cycloparaphenylenes show that explicit calculations are needed to unravel the ring-current delocalization pathways in complex multiring molecules. The open-shell implementation of GIMIC was applied in studies of the charge and spin currents in single-ring and two-ring molecules with open shells. The aromaticity predictions made on the basis of the GIMIC results are compared to other aromaticity criteria such as 1H NMR shieldings and shifts, electric polarizabilities and bond-length alternation, as well as to predictions provided by the traditional Hückel (4n+2) rule and its more recent extensions that account for Möbius-twisted molecules and for molecules with open shells.


Abstract:

Purpose – This research paper studies how the strategy of repositioning enables marketers to communicate CSR as their brand's differentiating factor. It aims at understanding how consumer perceptions can be managed to generate brand value through corporate brand repositioning when CSR is the differentiating factor. The purpose of this paper is to answer the following research question: How can consumer perceptions be managed to generate brand value through corporate brand repositioning when CSR is the differentiating factor? The two research objectives were: 1. to build a model that describes the different components of consumer perceptions involved in the generation of brand value through repositioning when CSR is the differentiating factor; 2. to identify the most critical components for the generation of brand value during the process of corporate brand repositioning in the context of the case company, IKEA. Design/methodology/approach – This paper is based on a literature review covering the logic of brand value generation, repositioning strategy and consumer perceptions connected to CSR activities. A key concept of positioning theory, the brand's differentiating factor, was explored. Previous studies have concluded that the desirability of the differentiating factor largely determines the level of brand value creation for the target customers. The criterion of desirability is based on three dimensions: relevance, distinctiveness and believability. A model was built in terms of these desirability dimensions. This paper takes a case study approach in which the predefined theoretical framework is tested using IKEA as the case company. When developing insights into the multifaceted nature of brand perceptions, personal interviews and individual probing are vital, as they enable the interviewees to reflect on their feelings and perceptions in their own words. This is why the data collection was based on means-end questioning. Qualitative interviews were conducted with 12 consumers. Findings – The paper highlights five critical components that may determine whether IKEA will fail in its repositioning efforts. The majority of the critical components involved believability perceptions. Hence, according to the findings, establishing credibility and trustworthiness for the brand in the context of CSR seems primary. The most critical components identified within the believability dimension were: providing proof of responsible codes of conduct via specific and concrete CSR actions, connecting the company's products to the social cause, and building a linkage between the initial and the new positioning while also weakening the old positioning. Originality/value – Marketers' obligation is to prepare the company for future demands. Companies all over the globe have recognized the durable trend of responsibility and sustainability, and consumers' worry about the environmental and social impact of modern lifestyles is growing. This is why corporate social responsibility (CSR) provides brands with an important source of differentiation and strength in the future. The strategy of repositioning enables marketers to communicate CSR as their brand's differentiating factor. This study aimed at understanding how consumer perceptions can be managed to generate brand value through corporate brand repositioning when CSR is the differentiating factor.


Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census in the 1940s had developed a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, which were published in a memoir in 1774 that is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which was depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. In 1894, at the International Statistical Institute meeting, the Norwegian Anders Kiaer presented the idea of the Representative Method of drawing samples, the idea being that the sample should be a miniature of the population; this idea is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics and introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw repeated samples from the same population and to assume that the population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling, which gave the central idea for the statisticians at the U.S. Census Bureau to develop the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.


Abstract:

The relationship between the Orthodox Churches and the World Council of Churches (WCC) reached a crisis just before the 8th Assembly of the WCC in Harare, Zimbabwe in 1998. The Special Commission on Orthodox Participation in the WCC (SC), inaugurated in Harare, worked during the period 1999–2002 to solve the crisis and to secure Orthodox participation in the WCC. The purpose of this study is: 1) to clarify the theological motives for the inauguration of the SC and the theological argumentation of the Orthodox criticism; 2) to write a reliable history and analysis of the SC; 3) to outline the theological argumentation that structures the debate; and 4) to investigate the ecclesiological questions that arise from the SC material. The study spans the years 1998 to 2006, from the WCC Harare Assembly to the Porto Alegre Assembly; hence, the initiation and immediate reception of the Special Commission are included in the study. The sources of this study are all the material produced by and for the SC. The method employed is systematic analysis. The focus of the study is on theological argumentation; the historical context and the political motives that played a part in Orthodox-WCC relations are not discussed in detail. The study shows how the initial, specific and individual Orthodox concerns developed into a profound ecclesiological discussion and also led to concrete changes in WCC practices, the best known of which is the change to decision-making by consensus. The Final Report of the SC contains five main themes, namely ecclesiology, decision-making, worship/common prayer, membership and representation, and social and ethical issues. The main achievement of the SC was that it secured Orthodox membership in the WCC. The ecclesiological conclusions made in the Final Report are twofold. On the one hand, it confirms that the very act of belonging to the WCC means a commitment to discuss the relationship between a church and the churches. The SC recommended that baptism should be added as a criterion for membership in the WCC, and that the member churches should continue to work towards the mutual recognition of each other's baptism. These elements strengthen the ecclesiological character of the WCC. On the other hand, when the Final Report discusses common prayer, the ecclesiological conclusions are much more cautious, and the ecclesiological neutrality of the WCC is emphasized several times. The SC repeatedly emphasized that the WCC is a fellowship of churches. The concept of koinonia, which has otherwise been important in recent ecclesiological discussions, was not much applied by the SC. A comparison of the results of the SC with parallel ecclesiological documents of the WCC (The Nature and Mission of the Church, Called to Be the One Church) shows that they all acknowledge the different ecclesiological starting points of the member churches and, following from that, a variety of legitimate views on the relation of the Church to the churches. Despite the change from preserving koinonia to promises of eschatological koinonia, all the documents affirm that the goal of the ecumenical movement is still full, visible unity.


Abstract:

Modelling of the energy balance is part of the development work related to the KarjaKompassi project. The aim of this thesis was to develop mathematical models that predict the energy balance of dairy cows in advance and that utilize data obtained during the production period. The explanatory variables were diet, feed, milk yield, test-day milking, live weight and body condition score data. The data were collected from 12 feeding experiments conducted in Finland, lasting 8–28 weeks of lactation and starting immediately after calving. Of the 344 dairy cows included, one quarter were Friesian and the rest Ayrshire. The main data set for multiparous cows contained 2647 observations (experiment × cow × lactation week) and that for first-lactation cows 1070. The data were analysed using the Mixed procedure of the SAS software, and outliers were removed with Tukey's method. Correlation analysis was used to examine the relationships between energy balance and the explanatory variables. Energy balance was modelled by regression analysis. The effect of day of lactation on energy balance was described with five different functions. Cow within experiment was included in the model as a random effect. The fit of the model to the data was assessed by means of the residual error, the coefficient of determination and the Bayesian information criterion. The best models were tested on an independent data set. The effect of day of lactation on energy balance was well described by the Ali-Schaeffer function, which was used as the basic model. In all energy balance models, the variation increased from lactation week 12 onwards as the number of observations decreased and the energy balance turned positive. Of the variables available before calving, the concentrate proportion of the diet and the concentrate intake index improved the coefficient of determination and reduced the residual error. The success of feeding can be monitored with models that include milk yield, milk fat content and the fat-to-protein ratio, or energy-corrected milk (ECM). Standardizing ECM reduced the residual error of the model. Live weight and body condition score were weak predictors. The models can be used in planning and monitoring feeding at the herd level, but they are not suitable for predicting the energy balance of an individual cow.
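
As an illustration only (the exact parameterisation used in the thesis is not given in the abstract), the Ali-Schaeffer lactation function is commonly written as a linear combination of a scaled day-in-milk term, its square, and two logarithmic terms. A minimal sketch of using it as a regression basis for energy balance might look like the following, where the coefficients and the standard lactation length are placeholders to be estimated from data.

    # Illustrative sketch only: Ali-Schaeffer lactation-curve terms used as a
    # regression basis for energy balance as a function of day of lactation.
    # The coefficients and the standard lactation length are placeholders; the
    # thesis abstract does not specify the exact parameterisation it used.
    import math

    def ali_schaeffer_terms(day, std_length=305.0):
        """Return the Ali-Schaeffer basis terms for a given day of lactation."""
        x = day / std_length
        w = math.log(std_length / day)
        return (1.0, x, x * x, w, w * w)

    def energy_balance(day, coeffs):
        """Linear combination of the basis terms with fitted coefficients."""
        return sum(c * t for c, t in zip(coeffs, ali_schaeffer_terms(day)))

    # Hypothetical coefficients, e.g. as they might come from a mixed-model fit:
    coeffs = (24.0, 4.0, -1.0, -14.0, -0.5)
    for d in (7, 30, 84, 150):   # roughly lactation weeks 1, 4, 12 and 21
        print(d, round(energy_balance(d, coeffs), 1))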


Abstract:

We propose an efficient and parameter-free scoring criterion, the factorized conditional log-likelihood (fCLL), for learning Bayesian network classifiers. The proposed score is an approximation of the conditional log-likelihood criterion. The approximation is devised in order to guarantee decomposability over the network structure, as well as efficient estimation of the optimal parameters, achieving the same time and space complexity as the traditional log-likelihood scoring criterion. The resulting criterion has an information-theoretic interpretation based on interaction information, which exhibits its discriminative nature. To evaluate the performance of the proposed criterion, we present an empirical comparison with state-of-the-art classifiers. Results on a large suite of benchmark data sets from the UCI repository show that fCLL-trained classifiers achieve at least as good accuracy as the best compared classifiers, using significantly less computational resources.
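
The fCLL formula itself is not reproduced in this abstract. As a rough illustration of what decomposability buys, the sketch below (with hypothetical variable names) scores a discrete Bayesian network classifier structure with the ordinary log-likelihood criterion, which decomposes into independent per-family terms computed from counts; a local change to the structure therefore only requires re-scoring the families it touches.

    # Minimal sketch of a decomposable scoring criterion (ordinary log-likelihood,
    # not the fCLL approximation itself): the score of a candidate structure is a
    # sum of independent family terms, each computed from count statistics.
    import math
    from collections import Counter

    def family_loglik(data, child, parents):
        """Maximum-likelihood log-likelihood contribution of one family."""
        joint = Counter(tuple(row[p] for p in parents) + (row[child],) for row in data)
        marg = Counter(tuple(row[p] for p in parents) for row in data)
        return sum(n * math.log(n / marg[key[:-1]]) for key, n in joint.items())

    def score(data, structure):
        """structure maps each variable to the tuple of its parents."""
        return sum(family_loglik(data, v, ps) for v, ps in structure.items())

    # Toy usage with hypothetical variables (C = class, X1/X2 = attributes):
    data = [{"C": 0, "X1": 0, "X2": 1}, {"C": 1, "X1": 1, "X2": 1},
            {"C": 0, "X1": 0, "X2": 0}, {"C": 1, "X1": 1, "X2": 0}]
    naive_bayes = {"C": (), "X1": ("C",), "X2": ("C",)}
    tan_like    = {"C": (), "X1": ("C",), "X2": ("C", "X1")}
    print(score(data, naive_bayes), score(data, tan_like))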