932 results for methods of analysis
Abstract:
Biorefining is a promising field of study that offers many opportunities for building a successful business unit with respect to sustainability. The thesis focuses on the following key objective: identifying a competitive biorefinery production process for small and medium segments of the chemical and forest industries in Finland. The scope of the research covers selected biorefinery operations in Finland and the use of hemicellulose as a raw material. Identifying the types of biorefineries and their important technical and process characteristics gives an advantage in a company's competitive analysis. The study takes a practical approach to scientific methods of market and company research with the help of the Quality Function Deployment and House of Quality tools. The thesis's findings provide an expert-oriented application of the House of Quality, identify the correlations among crucial biorefinery technical and design characteristics, and show their effect on the competitive behaviour of a company. The theoretical background builds a picture of the problematic issues within the field and offers possible scientific solutions. The analysis of the biorefinery market and of company operations gives the research its practical orientation. The results of the research can be used in subsequent investigations in the field and may be applied as an analytic and strategic tool for company management.
Abstract:
In this study, methods of media literacy instruction, including analytic activities, production activities, and a combination of the two, were compared to determine their influence on grade 8 students' knowledge, attitudes, and behaviours towards commercials. The findings showed that media literacy instruction does improve media literacy skills. Specifically, activities that included an analytic component, alone or combined with production, were significantly better than activities that included only a production component. Participants who completed analytic or combined analytic and production activities were able to discern media-related terms, target audience, selling techniques, social values, and stereotypes in commercials better than participants who completed only production activities. The research findings also revealed obstacles to teaching media literacy. In analytic activities, the difficulties included locating suitable resources, addressing the competition from commercials, encouraging written reflection, recognizing social values, and discussing racial stereotypes. In production activities, the difficulties were positioning recording stations, managing group work, organizing ideas, filming the footage, computer issues, and scheduling time. Strategies to overcome these obstacles are described.
Abstract:
Behavioural researchers commonly use single-subject designs to evaluate the effects of a given treatment. Several different methods of data analysis are used, each with its own set of methodological strengths and limitations. Visual inspection, which assesses variability, level, and trend both within and between conditions, is a commonly used method of analysing data (Cooper, Heron, & Heward, 2007). In an attempt to quantify treatment outcomes, researchers developed two further methods for analysing data: Percentage of Non-overlapping Data points (PND) and Percentage of data points Exceeding the Median (PEM). The purpose of the present study is to compare and contrast the use of Hierarchical Linear Modelling (HLM), PND, and PEM in single-subject research. The present study used 39 behaviours across 17 participants to compare treatment outcomes of a group cognitive behavioural therapy program, using PND, PEM, and HLM, on three response classes of obsessive-compulsive behaviour in children with Autism Spectrum Disorder. Findings suggest that PEM and HLM complement each other, and both add invaluable information to the overall treatment results. Future research should consider using both PEM and HLM when analysing single-subject designs, specifically grouped data with variability.
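The two overlap indices named in this abstract have simple definitions and can be sketched in a few lines of Python. This is a minimal illustration, not the study's actual analysis code; the toy data and the convention that a decrease in behaviour counts as improvement are assumptions.

```python
import statistics

def pnd(baseline, treatment, improvement="decrease"):
    """Percentage of Non-overlapping Data: share of treatment points
    beyond the most extreme baseline point (below the baseline minimum
    when improvement means a decrease in the behaviour)."""
    if improvement == "decrease":
        extreme = min(baseline)
        hits = sum(1 for x in treatment if x < extreme)
    else:
        extreme = max(baseline)
        hits = sum(1 for x in treatment if x > extreme)
    return 100.0 * hits / len(treatment)

def pem(baseline, treatment, improvement="decrease"):
    """Percentage of data points Exceeding the baseline Median."""
    med = statistics.median(baseline)
    if improvement == "decrease":
        hits = sum(1 for x in treatment if x < med)
    else:
        hits = sum(1 for x in treatment if x > med)
    return 100.0 * hits / len(treatment)

baseline = [8, 9, 7, 10, 9]       # hypothetical baseline observations
treatment = [6, 5, 7, 4, 3, 5]    # hypothetical treatment observations
print(pnd(baseline, treatment))   # share of treatment points below min(baseline)
print(pem(baseline, treatment))   # share of treatment points below median(baseline)
```

PEM's use of the median rather than the most extreme baseline point is what makes it more robust to baseline outliers, one reason the study finds it pairs well with HLM.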
Abstract:
McCausland (2004a) describes a new theory of random consumer demand. Theoretically consistent random demand can be represented by a "regular" "L-utility" function on the consumption set X. The present paper is about Bayesian inference for regular L-utility functions. We express prior and posterior uncertainty in terms of distributions over the infinite-dimensional parameter set of a flexible functional form. We propose a class of proper priors on the parameter set. The priors are flexible, in the sense that they put positive probability in the neighborhood of any L-utility function that is regular on a large subset bar(X) of X; and regular, in the sense that they assign zero probability to the set of L-utility functions that are irregular on bar(X). We propose methods of Bayesian inference for an environment with indivisible goods, leaving the more difficult case of infinitely divisible goods for another paper. We analyse individual choice data from a consumer experiment described in Harbaugh et al. (2001).
Abstract:
Gait analysis has recently emerged as one of the most important medical domains. Marker-based systems are the methods most favoured for human movement assessment and gait analysis; however, these systems require specific equipment and expertise, and are cumbersome, costly, and difficult to use. Many recent approaches based on computer vision have been developed to reduce the cost of motion capture systems while ensuring high-accuracy results. In this thesis, we present our new low-cost gait analysis system, composed of two monocular video cameras placed on the left and right sides of a treadmill. A 2D model of each half of the human skeleton is reconstructed from each view based on dynamic colour segmentation, and gait analysis is then performed on these two models. Validation against a state-of-the-art vision-based motion capture system (using the Microsoft Kinect) and against ground truth (with markers) was carried out to demonstrate the robustness and effectiveness of our system. The mean error of the human skeleton model estimation with respect to ground truth, our method vs. the Kinect, is very promising: the joint angles of the thighs (6.29° vs. 9.68°), lower legs (7.68° vs. 11.47°), and feet (6.14° vs. 13.63°), and the stride length (6.14 cm vs. 13.63 cm), are better and more stable than those of the Kinect, while the system maintains an accuracy quite close to the Kinect for the upper arms (7.29° vs. 6.12°), forearms (8.33° vs. 8.04°), and torso (8.69° vs. 6.47°).
Based on the skeleton model obtained by each method, we conducted a symmetry study on different joints (elbow, knee, and ankle), applying each method to three different subjects, to see which method distinguishes the symmetry/asymmetry characteristics of gait more effectively. In our test, our system measured a maximum knee angle of 8.97° and 13.86° for normal and asymmetric walks respectively, while the Kinect gave 10.58° and 11.94°. Compared with the ground truth, 7.64° and 14.34°, our system showed greater accuracy and greater discriminating power between the two cases.
Abstract:
Three-dimensional (3D) composites are strong contenders for structural applications in fields such as the aerospace, aircraft, and automotive industries, where multidirectional thermal and mechanical stresses exist. The presence of reinforcement along the thickness direction in 3D composites increases the through-thickness stiffness and strength properties. The 3D preforms can be manufactured with numerous complex architecture variations to meet the needs of specific applications. For hot-structure applications, Carbon-Carbon (C-C) composites are generally used, and their property variation with respect to temperature is essential for carrying out the design of hot structures. The thermomechanical behaviour of 3D composites is not fully understood and reported. The present study deals with a methodology for finding the thermomechanical properties of 3D woven, 3D 4-axis braided, and 3D 5-axis braided composites by analytical modelling from Representative Unit Cells (RUCs), based on constitutive equations for 3D composites. High-temperature unidirectional (UD) Carbon-Carbon material properties have been evaluated using analytical methods, viz., the Composite Cylinder Assemblage model and the Method of Cells, based on experiments carried out on Carbon-Carbon fabric composite over a temperature range of 300 K to 2800 K. These properties have been used for evaluating the 3D composite properties. From among the existing solution sequences for 3D composites, the "3D Composite Strength Model" has been identified as the most suitable method. Software has been developed in MATLAB to generate the material properties of the RUCs of 3D composites. Correlation of the analytically determined properties with test results available in the literature has been established. Parametric studies on the variation of all the thermomechanical constants for different 3D preforms of Carbon-Carbon material have been carried out, and selection criteria have been formulated for their application to hot structures. A procedure for the structural design of hot structures made of 3D Carbon-Carbon composites has been established through numerical investigations on a Nosecap. Nonlinear transient thermal and nonlinear transient thermo-structural analyses of the Nosecap have been carried out using the finite element software NASTRAN. Failure indices have been established for the identified preforms; the identification of a suitable 3D composite based on parametric studies on strength properties, and the recommendation of this material for the Nosecap of an RLV based on structural performance, have been carried out in this study. Based on the 3D failure theory, the best preform for the Nosecap has been identified as the 4-axis 15° braided composite.
Abstract:
The classical methods of analysing time series by the Box-Jenkins approach assume that the observed series fluctuates around changing levels with constant variance; that is, the time series is assumed to be homoscedastic. Financial time series, however, exhibit heteroscedasticity in the sense that they possess non-constant conditional variance given the past observations. The analysis of financial time series therefore requires the modelling of such variances, which may depend on some time-dependent factors or on their own past values. This has led to the introduction of several classes of models to study the behaviour of financial time series; see Taylor (1986), Tsay (2005), and Rachev et al. (2007). The class of models used to describe the evolution of conditional variances is referred to as stochastic volatility models. The stochastic models available to analyse the conditional variances are based on either normal or log-normal distributions. One of the objectives of the present study is to explore the possibility of employing some non-Gaussian distributions to model the volatility sequences and then to study the behaviour of the resulting return series. This led us to work on the related problem of statistical inference, which is the main contribution of the thesis.
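As a point of reference for the model class this abstract describes, the canonical log-normal stochastic volatility model can be simulated in a few lines. This is a sketch with illustrative parameter values; the thesis itself studies non-Gaussian alternatives, which are not shown here.

```python
import math
import random

def simulate_sv(n, mu=-1.0, phi=0.95, sigma=0.2, seed=42):
    """Simulate the canonical log-normal stochastic volatility model:
        h_t = mu + phi * (h_{t-1} - mu) + sigma * eta_t   (log-variance)
        r_t = exp(h_t / 2) * eps_t                        (return)
    with independent standard-normal eta_t and eps_t."""
    rng = random.Random(seed)
    h = mu  # start the log-variance at its stationary mean
    returns, log_vars = [], []
    for _ in range(n):
        h = mu + phi * (h - mu) + sigma * rng.gauss(0.0, 1.0)
        returns.append(math.exp(h / 2.0) * rng.gauss(0.0, 1.0))
        log_vars.append(h)
    return returns, log_vars

r, h = simulate_sv(1000)
```

The persistence parameter phi close to 1 produces the volatility clustering that makes such series heteroscedastic even though each return is conditionally Gaussian.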
Abstract:
A class exercise in analysing qualitative data, based on the use of a set of transcripts and augmented by videos from a web site. Discussion covers not only how the data are coded, but also interview bias and dimensions of analysis. Designed as an introduction.
Abstract:
In this theme you will work through a series of texts and activities and reflect on your view of research and the process of analysis of data and information. Most activities are supported by textual or audio material and are there to stimulate your thinking in a given area. The purpose of this theme is to help you gain a general overview of the main approaches to research design. Although the theme comprises two main sections, one on quantitative research and the other on qualitative research, this is purely to guide your study. The two approaches may be viewed as being part of a continuum with many research studies now incorporating elements of both styles. Eventually you will need to choose a research approach or methodology that will be practical, relevant, appropriate, ethical, of good quality and effective for the research idea or question that you have in mind.
Abstract:
Structure is an important physical feature of the soil that is associated with water movement, the soil atmosphere, microorganism activity and nutrient uptake. A soil without any obvious organisation of its components is known as apedal, and this state can have marked effects on several soil processes. Accurate maps of topsoil and subsoil structure are desirable for a wide range of models that aim to predict erosion, solute transport, or flow of water through the soil. Such maps would also be useful to precision farmers when deciding how to apply nutrients and pesticides in a site-specific way, and to target subsoiling and soil structure stabilization procedures. Typically, soil structure is inferred from bulk density or penetrometer resistance measurements and, more recently, from soil resistivity and conductivity surveys. Measuring the former is both time-consuming and costly, whereas observations by the latter methods can be made automatically and swiftly using a vehicle-mounted penetrometer or resistivity and conductivity sensors. The results of each of these methods, however, are affected by other soil properties, in particular moisture content at the time of sampling, texture, and the presence of stones. Traditional methods of observing soil structure identify the type of ped and its degree of development. Methods of ranking such observations from good to poor for different soil textures have been developed. Indicator variograms can be computed for each category or rank of structure, and these can be summed to give the sum of indicator variograms (SIV). Observations of the topsoil and subsoil structure were made at four field sites where the soil had developed on different parent materials. The observations were ranked by four methods, and the indicator variograms and the sum of indicator variograms were computed and modelled for each method of ranking.
The individual indicators were then kriged with the parameters of the appropriate indicator variogram model to map the probability of encountering soil with the structure represented by that indicator. The model parameters of the SIVs for each ranking system were used with the data to krige the soil structure classes, and the results are compared with those for the individual indicators. The relations between maps of soil structure and selected wavebands from aerial photographs are examined as a basis for planning surveys of soil structure.
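The indicator-variogram computation at the heart of this method can be sketched as follows. This is a deliberately simplified 1-D, regular-grid version; real soil surveys are two-dimensional and irregularly spaced, and the ranks below are invented for illustration.

```python
def indicator_variogram(ranks, threshold, max_lag):
    """Empirical indicator semivariogram for regularly spaced 1-D data.
    The indicator is 1 where the structure rank is <= threshold, else 0;
    gamma(h) is half the mean squared indicator difference at lag h."""
    z = [1 if r <= threshold else 0 for r in ranks]
    gamma = {}
    for lag in range(1, max_lag + 1):
        diffs = [(z[i] - z[i + lag]) ** 2 for i in range(len(z) - lag)]
        gamma[lag] = 0.5 * sum(diffs) / len(diffs)
    return gamma

# Hypothetical structure ranks along a transect (1 = good ... 4 = poor).
ranks = [1, 1, 2, 3, 3, 2, 1, 4, 4, 3, 2, 1]
print(indicator_variogram(ranks, threshold=2, max_lag=3))
```

Summing such variograms over the indicator thresholds of a ranking system gives the SIV the abstract refers to; its fitted model parameters then drive the kriging.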
Abstract:
The Representative Soil Sampling Scheme of England and Wales has recorded information on the soil of agricultural land in England and Wales since 1969. It is a valuable source of information about the soil in the context of monitoring for sustainable agricultural development. Changes in soil nutrient status and pH were examined over the period 1971-2001. Several methods of statistical analysis were applied to data from the surveys during this period. The main focus here is on the data for 1971, 1981, 1991 and 2001. The results of examining change over time in general show that levels of potassium in the soil have increased, those of magnesium have remained fairly constant, those of phosphorus have declined and pH has changed little. Future sampling needs have been assessed in the context of monitoring, to determine the mean at a given level of confidence and tolerable error and to detect change in the mean over time at these same levels over periods of 5 and 10 years. The results of a non-hierarchical multivariate classification suggest that England and Wales could be stratified to optimize future sampling and analysis. To monitor soil quality and health more generally than for agriculture, more of the country should be sampled and a wider range of properties recorded.
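The assessment of future sampling needs, determining the mean at a given confidence level and tolerable error, rests on the standard sample-size formula n = (z·s/e)². A minimal sketch follows; the soil-pH figures used in the example are hypothetical, not taken from the survey.

```python
import math

def samples_needed(sd, tolerable_error, z=1.96):
    """Number of samples needed to estimate a mean to within
    +/- tolerable_error at the confidence level implied by z
    (z = 1.96 for 95%), assuming simple random sampling and a
    known standard deviation sd."""
    return math.ceil((z * sd / tolerable_error) ** 2)

# Hypothetical: estimate mean soil pH (sd ~ 0.6) to within 0.1 units.
print(samples_needed(0.6, 0.1))
```

Stratifying the country, as the multivariate classification in the abstract suggests, reduces the effective sd within each stratum and hence the number of samples each stratum requires.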
Abstract:
This review article addresses recent advances in the analysis of foods and food components by capillary electrophoresis (CE). CE has found application to a number of important areas of food analysis, including quantitative chemical analysis of food additives, biochemical analysis of protein composition, and others. The speed, resolution and simplicity of CE, combined with low operating costs, make the technique an attractive option for the development of improved methods of food analysis for the new millennium.
Abstract:
Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can either be because they represent what the author believes a paper is about not what it actually is, or because they include keyphrases which are more classificatory than explanatory e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes two possible solutions that examine the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. Using three different freely available thesauri, the work undertaken examines two different methods of producing keywords and compares the outcomes across multiple strands in the timeline. The primary method explores taking n-grams of the source document phrases, and examining the synonyms of these, while the secondary considers grouping outputs by their synonyms. The experiments undertaken show the primary method produces good results and that the secondary method produces both good results and potential for future work. In addition, the different qualities of the thesauri are examined and it is concluded that the more entries in a thesaurus, the better it is likely to perform. The age of the thesaurus or the size of each entry does not correlate to performance.
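The primary method's first step, taking n-grams of the source document's phrases as candidate keyphrases, can be sketched in a few lines; the subsequent synonym lookup against a thesaurus is omitted here.

```python
def ngrams(text, n):
    """All word-level n-grams of a text: the candidate phrases whose
    synonyms would then be examined against a thesaurus (not shown)."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

print(ngrams("knowledge discovery in databases", 2))
# → ['knowledge discovery', 'discovery in', 'in databases']
```

A real pipeline would also filter stop-word-only candidates such as 'discovery in' before the thesaurus stage.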
Abstract:
Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics, rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three classes, not mutually exclusive, consisting of best-subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction. The first is a best-subset selection method based on the Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.
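The basic rejection-ABC loop that these dimension-reduction methods plug into can be sketched as follows. This is a toy normal-mean example using the sample mean as the (here sufficient) summary statistic; the tolerance, prior, and sample sizes are illustrative choices, not from the article.

```python
import math
import random

def abc_rejection(observed_stats, simulate, prior_sample, n_draws, eps):
    """Rejection ABC: draw theta from the prior, simulate data, and keep
    theta when the summary statistics fall within eps of the observed
    summaries (Euclidean distance)."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        stats = simulate(theta)
        if math.dist(observed_stats, stats) <= eps:
            accepted.append(theta)
    return accepted

# Toy problem: infer the mean of a normal with known sd = 1.
rng = random.Random(0)
true_mean, n = 2.0, 50
data = [rng.gauss(true_mean, 1.0) for _ in range(n)]
obs = [sum(data) / n]  # 1-D summary: the sample mean

post = abc_rejection(
    observed_stats=obs,
    simulate=lambda th: [sum(rng.gauss(th, 1.0) for _ in range(n)) / n],
    prior_sample=lambda: rng.uniform(-5.0, 5.0),
    n_draws=20000,
    eps=0.1,
)
print(len(post), sum(post) / len(post))
```

With a high-dimensional summary vector, the acceptance rate at a fixed eps collapses, which is exactly why the dimension-reduction methods reviewed in the article matter.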
Abstract:
Peptides have been proposed to function in intracellular signaling within the cytosol. Although cytosolic peptides are considered to be highly unstable, a large number of peptides have been detected in mouse brain and other biological samples. In the present study, we evaluated the peptidome of three diverse cell lines: SH-SY5Y, MCF7, and HEK293 cells. A comparison of the peptidomes revealed considerable overlap in the identity of the peptides found in each cell line. The majority of the observed peptides are not derived from the most abundant or least stable proteins in the cell, and approximately half of the cellular peptides correspond to the N- or C-termini of the precursor proteins. Cleavage site analysis revealed a preference for hydrophobic residues in the P1 position. Quantitative peptidomic analysis indicated that the levels of most cellular peptides are not altered in response to elevated intracellular calcium, suggesting that calpain is not responsible for their production. The similarity of the peptidomes of the three cell lines and the lack of correlation with the predicted cellular degradome implies the selective formation or retention of these peptides, consistent with the hypothesis that they are functional in the cells.
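The cleavage-site tabulation described in this abstract amounts to counting the residue at the P1 position, the residue immediately N-terminal to the cleaved bond. A minimal sketch with invented toy sequences (the real analysis works over the full set of identified peptides and precursors):

```python
from collections import Counter

def p1_frequencies(precursors_and_cleavages):
    """Count the residue at the P1 position (immediately N-terminal to
    each cleavage site) across a set of precursor sequences.
    Input: (sequence, cleavage_index) pairs, where the bond is cleaved
    after sequence[cleavage_index - 1]."""
    counts = Counter()
    for seq, idx in precursors_and_cleavages:
        counts[seq[idx - 1]] += 1
    return counts

# Hypothetical toy data: (precursor sequence, cleavage position).
data = [("MKLLVF", 3), ("AGFWLD", 4), ("PLLKRF", 3)]
print(p1_frequencies(data))
```

An enrichment of hydrophobic residues (L, F, W, ...) in such counts, relative to their background frequency in the proteome, is the kind of preference the study reports.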