977 results for statistical techniques
Abstract:
The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ™ PIIINP RIA and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques. Decision limits were determined using a specificity of 99.99% and an allowance for uncertainty because of the finite sample size. The revised Immunotech IGF-I - Orion P-III-NP assay combination decision limit did not change significantly following the addition of the new samples. The new decision limits are applied to currently available non-radioisotopic assays to measure IGF-I and P-III-NP in elite athletes, which should allow wider flexibility to implement the GH-2000 marker test for GH misuse while providing some resilience against manufacturer withdrawal or change of assays. Copyright © 2015 John Wiley & Sons, Ltd.
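The decision-limit construction described above (a 99.99% specificity threshold with an allowance for uncertainty from the finite sample size) can be sketched as a one-sided normal tolerance limit. This is an illustrative reconstruction under a normality assumption, not the actual GH-2000 score formula; the function name and confidence level are mine.

```python
import numpy as np
from scipy import stats

def decision_limit(scores, specificity=0.9999, confidence=0.95):
    """Illustrative one-sided upper tolerance limit for a marker score.

    Assumes approximately normal scores. The factor k is the exact
    one-sided normal tolerance factor based on the noncentral t
    distribution; it inflates the plain z-quantile to allow for the
    uncertainty of estimating the mean and SD from only n samples.
    """
    x = np.asarray(scores, dtype=float)
    n = x.size
    z = stats.norm.ppf(specificity)                      # plain quantile (~3.72 at 99.99%)
    k = stats.nct.ppf(confidence, df=n - 1, nc=z * np.sqrt(n)) / np.sqrt(n)
    return x.mean() + k * x.std(ddof=1)
```

As the sample grows, k shrinks toward the plain z-quantile, which is why adding more athlete samples tightens (and here barely changed) the limit.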
Abstract:
The modern technological ability to handle large amounts of information confronts the chemist with the necessity to re-evaluate the statistical tools they routinely use. Multivariate statistics furnishes theoretical bases for analyzing systems involving large numbers of variables. The mathematical calculations required for these systems are no longer an obstacle, owing to statistical packages that offer multivariate analysis options. Here, basic concepts of two multivariate statistical techniques that have received broad acceptance for treating chemical data, principal component analysis and hierarchical cluster analysis, are discussed.
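A minimal sketch of the two techniques the abstract introduces: SVD-based principal component analysis and Ward-linkage hierarchical clustering, applied to toy chemical data (the data, sizes and function names are illustrative):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def pca(X, n_components=2):
    """PCA via SVD of the column-centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T            # sample scores on the PCs
    explained = s**2 / np.sum(s**2)              # variance proportion per PC
    return scores, explained[:n_components]

# toy chemical data: 10 samples, 4 strongly correlated variables
rng = np.random.default_rng(1)
base = rng.normal(size=(10, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(10, 1)) for _ in range(4)])

scores, explained = pca(X)
# hierarchical cluster analysis: Ward linkage, cut into 2 groups
groups = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
```

With four near-duplicate variables, the first component absorbs almost all the variance, which is exactly the dimensionality reduction the abstract advocates.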
Abstract:
Water quality was monitored at the upper course of the Rio das Velhas, a major tributary of the São Francisco basin located in the state of Minas Gerais, over an extension of 108 km from its source up to the limits of the Sabara district. Monitoring was done at 37 different sites over a period of 2 years (2003-2004) for 39 parameters. Multivariate statistical techniques were applied to interpret the large water-quality data set and to establish an optimal long-term monitoring network. Cluster analysis separated the sampling sites into groups of similarity and also indicated which stations were mutually correlated and recommended for removal from the monitoring network. Principal component analysis identified four components responsible for the data structure, explaining 80% of the total variance of the data. The principal parameters are attributed to mining activities and domestic sewage. Significant data reduction was achieved.
Abstract:
The study of the spatial variability of soil and plant attributes, or precision agriculture, a technique that aims at the rational use of natural resources, is expanding commercially in Brazil. Nevertheless, there is a lack of mathematical analysis that supports the correlation of these independent variables and their interactions with productivity, identifying scientific standards that are technologically applicable. The aim of this study was to identify patterns of soil variability according to eleven physical and seven chemical indicators in an agricultural area. Two multivariate techniques were used: hierarchical cluster analysis (HCA) and principal component analysis (PCA). According to the HCA, the area was divided into five management zones: zone 1 with 2.87 ha, zone 2 with 0.8 ha, zone 3 with 1.84 ha, zone 4 with 1.33 ha and zone 5 with 2.76 ha. By PCA, the most important variables within each zone were identified: V% for zone 1, CTC in zone 2, levels of H+Al in zone 4, and sand content and altitude in zone 5. Zone 3 was classified as an intermediate zone with characteristics of all the others. According to the results, it is concluded that it is possible to separate samples with the same patterns of variability into groups (management zones) by means of multivariate statistical techniques.
Abstract:
Statistical analyses of measurements that can be described by statistical models are of essence in astronomy and in scientific inquiry in general. The sensitivity of such analyses, modelling approaches, and the consequent predictions is sometimes highly dependent on the exact techniques applied, and improvements therein can result in significantly better understanding of the observed system of interest. Particularly, optimising the sensitivity of statistical techniques in detecting the faint signatures of low-mass planets orbiting nearby stars is, together with improvements in instrumentation, essential in estimating the properties of the population of such planets, and in the race to detect Earth-analogs, i.e. planets that could support liquid water and, perhaps, life on their surfaces. We review the developments in Bayesian statistical techniques applicable to detections of planets orbiting nearby stars and to astronomical data analysis problems in general. We also discuss these techniques and demonstrate their usefulness by using various examples and detailed descriptions of the respective mathematics involved. We demonstrate the practical aspects of Bayesian statistical techniques by describing several algorithms and numerical techniques, as well as theoretical constructions, in the estimation of model parameters and in hypothesis testing. We also apply these algorithms to Doppler measurements of nearby stars to show how they can be used in practice to obtain as much information from the noisy data as possible. Bayesian statistical techniques are powerful tools in analysing and interpreting noisy data and should be preferred in practice whenever computational limitations are not too restrictive.
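A minimal sketch of the kind of Bayesian parameter estimation described, using a random-walk Metropolis sampler on a toy one-parameter problem (a constant velocity offset in noisy "Doppler-like" data). The sampler settings and toy data are my assumptions, not the algorithms of the review:

```python
import numpy as np

def metropolis(log_post, theta0, n_steps=5000, step=0.5, seed=0):
    """Minimal random-walk Metropolis sampler for a 1-D parameter."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    theta, lp = theta0, log_post(theta0)
    for i in range(n_steps):
        prop = theta + step * rng.normal()       # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# toy data: true offset 3.0, unit Gaussian noise, flat prior on the offset
rng = np.random.default_rng(42)
data = 3.0 + rng.normal(size=200)
log_post = lambda mu: -0.5 * np.sum((data - mu) ** 2)
chain = metropolis(log_post, theta0=0.0)
posterior_mean = chain[1000:].mean()             # discard burn-in
```

The retained chain approximates the posterior of the offset; its mean and spread directly give the parameter estimate and its uncertainty, which is the core practical appeal the abstract argues for.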
Abstract:
Cooked ham is considered a high-value product due to the quality of its raw material. Although its consumption is still low in Brazil, it is increasing due to the rising purchasing power of sectors of the population. This study aimed to assess the microbiological, physicochemical, rheological, and sensory quality of cooked hams (n=11) marketed in Brazil. All samples showed microbiological results within the standards established by Brazilian legislation. Eight of the eleven samples studied met all the legal requirements; two samples violated the standards due to the addition of starch; one sample had lower protein content than the minimum required, and another one had sodium content higher than that stated on the label. The use of Hierarchical Cluster Analysis allowed the agglomeration of the samples into three groups with distinct quality traits and with significant differences in moisture content, chromaticity, syneresis, and heating and freezing loss. Principal Component Analysis showed that the samples which correlated to higher sensory acceptance regarding flavor and overall acceptability were those with higher moisture, protein, fat, and luminosity values. This study confirmed the efficacy of multivariate statistical techniques in assessing the quality of commercial cooked hams and in indicating the physicochemical parameters associated with the perception of product quality.
Abstract:
This paper explores the cognitive functions of the Reality Status Evaluation (RSE) system in our experiences of narrative mediated messages (NMM) (fictional, narrative, audio-visual one-way input and moving picture messages), such as fictional TV programs and films. We regard reality in mediated experiences as a special mental and emotional construction and a multi-dimensional concept. We argue that viewers' reality sense in NMM is influenced by many factors, with "real-on" as the default value. Some of these factors function as primary mental processes, including the content realism factors of those messages such as Factuality (F), Social Realism (SR), Life Relevance (LR), and Perceptual Realism - involvement (PR), which would have direct impacts on reality evaluations. Other factors, such as Narrative Meaning (NM), Emotional Responses, and the personality trait Absorption (AB), influence the reality evaluations directly or through the mediation of these main dimensions. I designed a questionnaire to study this theoretical construction. I developed items to form scales and sub-scales measuring viewers' subjective experiences of reality evaluations and these factors. Pertinent statistical techniques, such as internal consistency and factorial analysis, were employed to make revisions and improve the quality of the questionnaire. In the formal experiment, after viewing two short films, which were selected as high or low narrative structure messages from previous experiments, participants were required to answer the questionnaire, the Absorption questionnaire, and the SAM (Self-Assessment Manikin, measuring immediate emotional responses). Results were analyzed using EQS structural equation modeling (SEM) and discussed in terms of latent relations among these subjective factors in mediated experience. The present results supported most of my theoretical hypotheses.
In NMM, three main factors, or dimensions, could be extracted from viewers' subjective reality evaluations: Social Realism (combined with Factuality), Life Relevance and Perceptual Realism. I designed two ways to assess viewers' understanding of narrative meanings in mediated messages, questionnaire (NM-Q) and rating (NM-R) measurement, and their significant influences on reality evaluations were supported in the final EQS models. Particularly in high story structure messages, the effect of Narrative Meaning (NM) can rarely be explained by these dimensions of reality evaluations alone. Also, Empathy seems to play a more important role in RSE of low story structure messages. I also focused on two other factors pertinent to RSE in NMM: the personality trait Absorption, and Emotional Responses (including two dimensions: Valence and Intensity). Final model results partly supported my theoretical hypotheses about the relationships among Absorption (AB), Social Realism (SR) and Life Relevance (LR), and about the immediate impact of Emotional Responses on Perceptual Realism (PR).
Abstract:
Falls among the elderly are a major problem. It is therefore not surprising that identifying the factors that increase fall risk has attracted so much attention. Frailer seniors who need support to live in the community have nevertheless remained the poor relation of this research, although, more recently, Quebec authorities have made them a priority intervention target. Prospective observational studies are particularly suited to studying risk factors for falls in the elderly. Optimal identification of these factors is complicated, however, by the fact that exposure to risk factors can vary during follow-up and that the same individual can experience more than one event. Twenty years ago, researchers tried to raise their peers' awareness of this, but their efforts were in vain. These considerations are still largely ignored today, with analyses focusing on the proportion of people who fell or on the time to the first fall, thereby discarding a substantial amount of relevant information. In this thesis, we examine the methods in use and propose an extension of the Cox proportional hazards model. We illustrate this method with a study of risk factors potentially associated with falls in a group of 959 elderly people receiving public home-support services. We compare the results obtained with the Wei, Lin and Weissfeld method to those obtained with other methods, including conventional logistic regression, grouped logistic regression, negative binomial regression, and Andersen and Gill regression. The investigation is characterized by repeated measurements of risk factors at participants' homes and by monthly telephone follow-ups to document the occurrence of falls.
The exposure factors studied, whether fixed or time-varying, include sociodemographic characteristics, body mass index, nutritional risk, alcohol consumption, home environmental hazards, gait and balance, and medication use. Almost all (99.6%) of the users had at least one high-risk factor. Exposure to multiple risks was widespread, with an average of 2.7 distinct high-risk factors per participant. Factors statistically associated with fall risk included male sex, the younger age brackets, a history of previous falls, a low score on the Berg balance scale, a low body mass index, benzodiazepine use, the number of hazards present in the home, and living in a private seniors' residence. Our results reveal, however, that the usual methods for analysing risk factors for falls (and, in some cases, for falls requiring medical attention) create appreciable biases. The biases in the association measures considered stem from how exposure and outcome are measured and defined, as well as from how the statistical methods of analysis account for them. A final part, as innovative as it is distinct in the nature of the statistical tools used, completes the work. In it we identify profiles of seniors at risk of becoming recurrent fallers, i.e. those who experienced at least two falls in the six months following their initial assessment. A classification and regression tree analysis coupled with a survival analysis revealed five distinct profiles, with relative risks ranging from 0.7 to 5.1.
Living in a seniors' residence, having a history of multiple falls or balance problems, and consuming alcohol are the main factors associated with an increased probability of falling early and of becoming a recurrent faller. Whether in terms of screening for fall risk factors or of the population targeted, this thesis contributes to knowledge on a highly topical public health theme. We encourage researchers interested in identifying fall risk factors among the elderly to use the Wei, Lin and Weissfeld statistical method, because it accounts for time-varying exposures and recurrent events. Further research will also be needed to determine the best screening test for a given risk factor in this clientele.
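The counting-process data layout that Andersen-Gill and Wei-Lin-Weissfeld style models require (intervals with start/stop times, an event indicator, and time-varying covariates) can be sketched as follows. The participant, fall times, and the benzodiazepine switch day are hypothetical, purely to show the transformation:

```python
import pandas as pd

# Hypothetical follow-up for one participant: fall times (days) and a
# time-varying exposure (benzodiazepine use) that switches on at day 90.
falls = [30, 120]            # two recurrent fall events
follow_up = 180              # end of follow-up, in days
exposure_change = 90

def counting_process_rows(falls, follow_up, change_day):
    """Split follow-up into (start, stop, event, benzo) intervals, the
    layout recurrent-event Cox models use instead of time-to-first-fall."""
    cuts = sorted(set(falls + [change_day, follow_up]))
    rows, start = [], 0
    for stop in cuts:
        rows.append({"start": start, "stop": stop,
                     "event": int(stop in falls),       # fall ends this interval?
                     "benzo": int(start >= change_day)})  # exposure during interval
        start = stop
    return pd.DataFrame(rows)

df = counting_process_rows(falls, follow_up, exposure_change)
```

Both falls and both exposure states are retained as separate rows, which is precisely the information that proportion-fallen or time-to-first-fall analyses discard.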
Abstract:
The present work is an attempt to understand the characteristics of the upper troposphere and lower stratosphere (UT/LS) over the Asian summer monsoon region, more specifically over the Indian subcontinent. Three important parameters are considered: zonal wind, temperature and ozone over the UT/LS of the Asian summer monsoon region. A detailed study is made of the interannual variability and characteristics of these parameters during the Indian summer monsoon period. Monthly values of zonal wind and temperature from the NCEP/NCAR reanalysis for the period 1960-2002 are used for the present study. Daily overpass total ozone data for 12 Indian stations (from low to high latitudes) from the TOMS Nimbus 7 satellite for the period 1979 to 1992 were also used to understand the total ozone variation over the Indian region. The study reveals that if the QBO phase in the stratosphere is easterly or weakly westerly, the respective monsoon is found to be dry or below normal. On the other hand, if the phase is westerly or weakly easterly, the respective Indian summer monsoon is noted as a wet year. This connection between stratospheric QBO phases and the Indian summer monsoon gives more insight into long-term predictions of Indian summer monsoon rainfall. Wavelet analysis and EOF methods are the two advanced statistical techniques used in the present study to extract more information from the zonal wind, from smaller-scale to larger-scale variability, over the Asian summer monsoon region. The interannual variability of temperature at different stratospheric and tropospheric levels over the Asian summer monsoon region has been studied. An attempt has been made to understand the total ozone characteristics and their interannual variability over 12 Indian stations spread from southern to northern latitudes. Finally, it is found that the upper troposphere and lower stratosphere contribute significantly to monsoon variability and climate change.
It is also observed that there exists a link between the stratospheric QBO and the Indian summer monsoon.
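EOF analysis of the kind used here amounts to a singular value decomposition of the anomaly field. A minimal sketch on a synthetic zonal-wind field with one dominant oscillating mode (the field, grid, and the 28-month period are invented for illustration, the latter loosely evoking the QBO):

```python
import numpy as np

def eof(field):
    """EOF analysis of a (time x space) anomaly field via SVD.

    Returns spatial patterns (EOFs), principal-component time series,
    and the fraction of variance explained by each mode."""
    anom = field - field.mean(axis=0)            # remove the time mean
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    pcs = U * s                                  # amplitude time series per mode
    variance = s**2 / np.sum(s**2)
    return Vt, pcs, variance

# synthetic zonal-wind anomalies: 120 months x 50 grid points,
# a single ~28-month oscillation plus weak noise
rng = np.random.default_rng(3)
t = np.arange(120)
pattern = np.sin(np.linspace(0, np.pi, 50))
field = np.outer(np.sin(2 * np.pi * t / 28), pattern) \
        + 0.1 * rng.normal(size=(120, 50))
eofs, pcs, var = eof(field)
```

The first mode recovers nearly all the variance, and its PC time series carries the oscillation; real-data modes would be interpreted against known signals such as the QBO.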
Abstract:
Information and communication technologies are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people and through networks of people has always taken place, but ICT has radically changed the magnitude of this exchange, and thus factors such as the timeliness of information and information dissemination patterns have become more important than ever. Since information and knowledge are so vital for all-round human development, libraries and institutions that manage these resources are indeed invaluable. So, Library and Information Centres have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, the library provides services based on different types of documents such as manuscripts, printed, digital, etc. At the same time, the acquisition, access, processing, service, etc. of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in providing assistance in navigating and analyzing tremendous amounts of knowledge with a variety of digital tools. Thus, modern libraries are increasingly being re-defined as places to get unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It was identified that even though information resources are flooding the world over and several technologies have emerged to manage the situation for providing effective services to their clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum level. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the provided services.
There are many good examples the world over of the application of ICTs in libraries for the maximization of services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study was targeted to look into how effectively the modern ICTs have been adopted in our libraries for maximizing the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this. Data was collected from library users, viz. student as well as faculty users, library professionals and university librarians, using structured questionnaires. This has been supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, review of literature, etc. Personal observations of the organizational set-up, management practices, functions, facilities, resources, utilization of information resources and facilities by the users, etc. of the university libraries in Kerala have been made. Statistical techniques like percentage, mean, weighted mean, standard deviation, correlation, trend analysis, etc. have been used to analyse the data. All the libraries could exploit only a very few possibilities of modern ICTs, and hence they could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services. Because of this, the users as well as the professionals are dissatisfied.
Functional effectiveness in the acquisition, access and processing of information resources in various formats, development and maintenance of OPAC and WebOPAC, digital document delivery to remote users, Web-based clearing of library counter services and resources, development of full-text databases, digital libraries and institutional repositories, consortia-based operations for e-journals and databases, user education and information literacy, professional development with stress on ICTs, network administration and website maintenance, marketing of information, etc. are the major areas that need special attention to improve the situation. Finance, the knowledge level of ICTs among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, the prevailing educational set-up and social environment in the state, etc. are some of the major hurdles to reaping the maximum possibilities of ICTs in the university libraries of Kerala. The principles of Business Process Re-engineering were found suitable for effective application in re-structuring and re-defining the operations and service system of the libraries. Most of the conventional departments or divisions in the university libraries were functioning as watertight compartments, and their existing management system was too rigid to adopt the principles of change management. Hence, a thorough re-structuring of the divisions was indicated. Consortia-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users in the main campuses and off-campuses of the universities, affiliated colleges and remote stations. A uniform staff policy similar to that prevailing in CSIR, DRDO, ISRO, etc. has been proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education, and the integrated and planned development of school, college, research and public library systems, etc. were also justified for reaping the maximum benefits of the modern ICTs.
Abstract:
Information communication technology (ICT) has invariably brought about fundamental changes in the way in which libraries gather, preserve and disseminate information. The study was carried out with the aim of estimating and comparing the information seeking behaviour (ISB) of the academics of two prominent universities of Kerala in the context of advancements achieved through ICT. The study was motivated by the fast-changing scenario of libraries with the proliferation of many high-tech products and services. The main purpose of the study was to identify the chief source of information of the academics, and also to examine the academics' preferences regarding the form and format of information sources. The study also tries to estimate the adequacy of the resources and services currently provided by the libraries. The questionnaire was the central instrument for data collection. An almost-census method was adopted for data collection, engaging various methods and tools for eliciting data. The total population of the study was 957, out of which the questionnaire was distributed to 859 academics; 646 academics responded to the survey, of which 564 were sound responses. Data was coded and analysed using Statistical Package for the Social Sciences (SPSS) software and also with the help of the Microsoft Excel package. Various statistical techniques were engaged to analyse the data. A paradigm shift is evident from the fact that academics push themselves towards information on the internet, i.e. they prefer electronic sources to traditional sources, and this shift is coupled with e-seeking of information. The study reveals that the ISB of the academics is influenced primarily by personal factors, and comparative analysis shows that the ISB of the academics is similar in both universities. The productivity of the academics was tested to dig up any relation with their ISB, and it is found that the productivity of the academics is extensively related to their ISB.
The study also reveals that the users of the library are satisfied with the services provided but not with the sources, and in conjunction, the study also recommends ways and means to improve the existing library system.
Abstract:
The service quality of any sector has two major aspects, namely technical and functional. Technical quality can be attained by maintaining technical specifications as decided by the organization. Functional quality refers to the manner in which service is delivered to the customer, which can be assessed from customer feedback. A field survey was conducted based on the management tool SERVQUAL, by designing 28 constructs under 7 dimensions of service quality. Stratified sampling techniques were used to get 336 valid responses, and the gap scores between expectations and perceptions were analyzed using statistical techniques to identify the weakest dimension. To assess the technical aspect of availability, six months of live outage data of base transceiver stations were collected. Statistical and exploratory techniques were used to model the network performance. The failure patterns were modeled as competing risk models, and the probability distributions of service outage and restoration were parameterized. Since the availability of a network is a function of the reliability and maintainability of the network elements, any service provider who wishes to keep up their service level agreements on availability should be aware of the variability of these elements and the effects of their interactions. The availability variations were studied by designing a discrete-time event simulation model with probabilistic input parameters. The probabilistic distribution parameters derived from the live data analysis were used to design experiments to define the availability domain of the network under consideration. The availability domain can be used as a reference for planning and implementing maintenance activities. A new metric is proposed which incorporates a consistency index along with key service parameters and can be used to compare the performance of different service providers.
The developed tool can be used for reliability analysis of mobile communication systems and assumes greater significance in the wake of the mobile number portability facility. It is also possible to have a relative measure of the effectiveness of different service providers.
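The discrete-event availability simulation described can be sketched as an alternating-renewal process. The exponential time-to-failure and lognormal repair-time choices below are common modelling assumptions, not necessarily the distributions fitted from the study's live data:

```python
import numpy as np

def simulate_availability(mtbf, mttr, horizon=10_000.0, seed=0):
    """Alternating-renewal sketch of one base transceiver station:
    exponential up-times (mean mtbf) alternate with lognormal repair
    times (median mttr); returns the fraction of time the station is up."""
    rng = np.random.default_rng(seed)
    t, up_time = 0.0, 0.0
    while t < horizon:
        up = rng.exponential(mtbf)
        up_time += min(up, horizon - t)          # clip the last up-interval
        t += up
        if t >= horizon:
            break
        t += rng.lognormal(mean=np.log(mttr), sigma=0.5)  # repair duration
    return up_time / horizon

availability = simulate_availability(mtbf=500.0, mttr=4.0)
```

Re-running this over a grid of failure and repair parameters is one way to map out the "availability domain" the abstract refers to.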
Abstract:
Man's concern with environmental deterioration is one of the major reasons for the increased interest in marine and estuarine microbes. Microbes form an important link in biogeochemical cycling, and their cycling activities often determine to a large measure the potential productivity of an ecosystem. In the recycling of nutrients in the estuary, bacteria and fungi therefore play a particularly significant role. The allochthonous plant materials contain biopolymers such as cellulose, lignin, humus, etc., that are difficult to degrade into simpler substances. The fungi have the ability to degrade these substances, thereby making them available for cycling within the system. The present study is devoted to finding the composition and the activity of the fungal populations of the Cochin backwater. For convenience the thesis is divided into eight chapters. The opening chapter briefly reviews the literature and projects the importance of the work and the main objectives. The second chapter discusses the materials and methods. In the third chapter the systematics and taxonomy of estuarine yeasts are examined in detail, since this information is scarcely available for our waters. The general ecological aspects of the yeasts and filamentous fungi in the area of study are examined in the fourth chapter using appropriate statistical techniques. A special reference to the fungi in a small mangrove ecosystem is attempted in the fifth chapter. The biochemical studies are discussed in the sixth chapter, and the penultimate chapter provides an overall discussion. In the last chapter the summary of the work is presented.
Abstract:
The present thesis envisages a hydrometeorological study of Kerala State in relation to the Western Ghats, using various statistical techniques and the water balance concepts first developed by Thornthwaite. The first chapter of the thesis gives a general introduction where the purpose and scope of the study are given. Chapter II discusses the importance of hydrometeorological studies in general, and of water balance in particular, in planning for the overall development of any region. Chapter III consists of the presentation of various geographical features of Kerala. An introduction to the physiography of the Western Ghats and detailed hydroclimatic studies of the Western Ghats region, which include analysis of rainfall and the study of water balance elements, form Chapter IV. In Chapter V, a detailed hydrometeorological study of Kerala State is made. Discussion of the results of the study and suggestions for optimum utilization of the available water resources for the overall development of the Western Ghats region in general, and Kerala in particular, are made in Chapter VI.
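Thornthwaite's water-balance method starts from his empirical potential-evapotranspiration (PET) formula. A minimal sketch, omitting the day-length/latitude correction factor the full method applies, with purely illustrative warm-humid monthly temperatures:

```python
def thornthwaite_pet(monthly_temp_c):
    """Monthly potential evapotranspiration (mm) by Thornthwaite's
    formula, without the day-length/latitude correction factor.

    I is the annual heat index; a is Thornthwaite's cubic in I."""
    I = sum((t / 5.0) ** 1.514 for t in monthly_temp_c if t > 0)
    a = 6.75e-7 * I**3 - 7.71e-5 * I**2 + 1.792e-2 * I + 0.49239
    return [16.0 * (10.0 * t / I) ** a if t > 0 else 0.0
            for t in monthly_temp_c]

# illustrative monthly mean temperatures (deg C) for a warm, humid region
temps = [26, 27, 28, 29, 28, 26, 25, 25, 26, 26, 26, 26]
pet = thornthwaite_pet(temps)
```

Comparing monthly PET against rainfall then yields the water-balance elements (surplus, deficit, storage change) analysed in Chapters IV and V.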
Abstract:
Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and of the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop, through standard methodology such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained, together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
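Two of the simplex tools mentioned, the perturbation operation and a log-ratio transform (here the centred log-ratio, clr, on which compositional singular value decompositions are built), can be sketched as follows; the three-part composition is an invented stand-in for a row of major-oxide data:

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform of a composition (parts summing to 1):
    log of each part over the geometric mean of all parts."""
    g = np.exp(np.mean(np.log(x)))
    return np.log(x / g)

def perturb(x, p):
    """Aitchison perturbation: component-wise product, re-closed to sum 1."""
    y = np.asarray(x, dtype=float) * np.asarray(p, dtype=float)
    return y / y.sum()

comp = np.array([0.6, 0.3, 0.1])      # illustrative 3-part composition
z = clr(comp)                          # clr coordinates sum to zero
back = np.exp(z) / np.exp(z).sum()     # inverse clr recovers the composition
```

Because clr coordinates live in a zero-sum hyperplane, ordinary multivariate tools (including the SVD) can be applied there and mapped back, which is the essence of the staying-in-the-simplex approach.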