970 results for Univariate Analysis box-jenkins methodology


Relevance: 30.00%

Abstract:

Limited information is available regarding the methodology required to characterize hashish seizures and to assess the presence or absence of a chemical link between two seizures. This casework report presents the methodology applied to assess whether two different police seizures came from the same block before that block was split. The chemical signature was extracted using GC-MS analysis, and the implemented methodology consists of a study of intra- and inter-variability distributions, based on measuring the similarity of chemical profiles across a number of hashish seizures by means of the Pearson correlation coefficient. Different statistical scenarios (i.e., combinations of data pretreatment techniques and selections of target compounds) were tested to find the most discriminating one. Seven compounds showing high discrimination capability were selected, and a specific statistical data pretreatment was applied to them. Based on the results, the statistical model built for comparing the hashish seizures leads to low error rates. The implemented methodology is therefore suitable for the chemical profiling of hashish seizures.
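
As a rough illustration of the comparison metric described above, the sketch below computes a Pearson correlation between two pretreated GC-MS profiles and compares it to a linkage threshold. The compound intensities, the pretreatment (total-sum normalisation followed by a square-root transform) and the threshold are illustrative assumptions, not the values or pretreatment selected in the paper.

```python
import numpy as np
from scipy.stats import pearsonr

# Illustrative GC-MS peak areas for seven hypothetical target compounds
# (not the compounds or values used in the paper).
seizure_a = np.array([12.4, 8.1, 3.3, 15.0, 6.7, 2.2, 9.8])
seizure_b = np.array([11.9, 8.5, 3.0, 14.2, 7.1, 2.5, 10.3])

def pretreat(profile):
    # Example pretreatment: normalise to the profile total, then square-root
    # transform (one common choice; the paper selected its own pretreatment).
    return np.sqrt(profile / profile.sum())

r, _ = pearsonr(pretreat(seizure_a), pretreat(seizure_b))
print(f"Pearson correlation between profiles: {r:.3f}")

# A link would be asserted if r falls above a cut-off placed between the
# inter-variability and intra-variability distributions (value illustrative).
THRESHOLD = 0.99
print("linked" if r > THRESHOLD else "not linked")
```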

Relevance: 30.00%

Abstract:

Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Many approaches currently allow one to derive probability statements relating to a population proportion, but questions of how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here draws on methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address concepts such as the (net) value of sample information, the (expected) value of sample information and the (expected) decision loss. All of these aspects relate directly to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples. The graphical devices invoked here also serve to support the discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
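
A minimal sketch of the kind of probability statement referred to above: the posterior probability, under a Beta prior and Binomial sampling, that the proportion of items in a seizure containing an illegal substance exceeds a threshold. The prior, the counts and the threshold are illustrative assumptions, not figures from the paper.

```python
from scipy.stats import beta

# Beta(1, 1) prior on the proportion of items containing an illegal
# substance (uniform prior; illustrative choice).
a_prior, b_prior = 1, 1

# Illustrative sample: 8 items inspected, all 8 found positive.
n, k = 8, 8

# Posterior is Beta(a_prior + k, b_prior + n - k).
posterior = beta(a_prior + k, b_prior + n - k)

# Probability statement: P(proportion > 0.5 | data).
threshold = 0.5
prob = posterior.sf(threshold)
print(f"P(proportion > {threshold}) = {prob:.4f}")
```

The decision-theoretic extension discussed in the paper would then attach losses to decisions about the proportion or the sample size and compare expected losses, rather than reporting the posterior probability alone.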

Relevance: 30.00%

Abstract:

This paper examines the proper use of dimensions and curve fitting practices, elaborating on Georgescu-Roegen's economic methodology in relation to the three main concerns of his epistemological orientation. Section 2 introduces two critical issues concerning dimensions and curve fitting practices in economics in view of Georgescu-Roegen's economic methodology. Section 3 deals with the logarithmic function (ln z) and shows that z must be a dimensionless pure number, otherwise it is nonsensical. Several unfortunate examples of this analytical error are presented, including macroeconomic data analysis conducted by a representative figure in the field. Section 4 deals with the standard Cobb-Douglas function: it is shown that no operational meaning can be obtained for capital or labor within the Cobb-Douglas function. Section 4 also deals with economists' "curve fitting fetishism". Section 5 concludes the paper with several epistemological issues relating to dimensions and curve fitting practices in economics.
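
The dimensional point about ln z can be made explicit with a standard series argument (elementary mathematics, not quoted from the paper):

```latex
\ln z = (z-1) - \tfrac{1}{2}(z-1)^2 + \tfrac{1}{3}(z-1)^3 - \cdots , \qquad |z-1| < 1 .
```

If z carried a unit, the successive terms would have the dimensions of z, z^2, z^3, and so on, and could not be added; the sum is only meaningful when z is a pure number, for example a ratio z = Y/Y_0 of two quantities measured in the same unit.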

Relevance: 30.00%

Abstract:

One feature of the modern nutrition transition is the growing consumption of animal proteins. The most common approach in the quantitative analysis of this change has been the study of averages of food consumption, but this kind of analysis is incomplete without knowledge of the number of consumers, and data about consumers are not usually published in historical statistics. This article introduces a methodological approach for reconstructing consumer populations, based on assumptions about the diffusion process of foodstuffs and on modeling consumption patterns with a log-normal distribution. The estimation process is illustrated with the specific case of milk consumption in Spain between 1925 and 1981. The results fit quite well with other available data and indirect sources, showing that this dietary change was a slow and late process. The reconstruction of consumer populations could shed new light on the study of nutrition transitions.
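
One possible reading of the reconstruction idea, sketched below under explicit assumptions: individual consumption is modelled as log-normal, and people below a minimal intake are counted as non-consumers. The mean, coefficient of variation, threshold and population size are hypothetical values for illustration, not the figures estimated in the article.

```python
import numpy as np
from scipy.stats import lognorm

# Hypothetical inputs (not values from the article): average per-capita milk
# consumption from historical statistics, an assumed coefficient of variation
# for the log-normal model, a minimal intake defining a "consumer", and a
# population size.
per_capita_mean = 40.0   # litres/year, averaged over the whole population
cv = 1.2                 # assumed dispersion of individual consumption
min_intake = 5.0         # litres/year, illustrative consumer threshold
population = 24_000_000  # hypothetical population size

# Convert mean and CV into the (mu, sigma) parameters of ln X.
sigma = np.sqrt(np.log(1 + cv**2))
mu = np.log(per_capita_mean) - sigma**2 / 2
dist = lognorm(s=sigma, scale=np.exp(mu))

# Share of the population above the threshold = estimated share of consumers.
share_consumers = dist.sf(min_intake)
print(f"Estimated share of consumers: {share_consumers:.1%}")
print(f"Estimated number of consumers: {share_consumers * population:,.0f}")
```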

Relevance: 30.00%

Abstract:

This paper presents an analysis of motor vehicle insurance claims relating to vehicle damage and to associated medical expenses. We use univariate severity distributions estimated with parametric and non-parametric methods, implemented using the statistical package R. The parametric analysis is limited to the estimation of normal and lognormal distributions for each of the two claim types. The non-parametric analysis involves kernel density estimation; we illustrate the benefits of applying transformations to the data prior to employing kernel-based methods, using a log-transformation and an optimal transformation, chosen from a class of transformations, that produces symmetry in the data. The central aim of this paper is to provide educators with material that can be used in the classroom to teach statistical estimation methods, goodness-of-fit analysis and, importantly, statistical computing in the context of insurance and risk management. To this end, the Appendix includes all the R code used in the analysis so that readers, both students and educators, can fully explore the techniques described.
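
The paper's workflow is implemented in R (see its Appendix); a rough Python analogue of the two estimation steps, using simulated claim severities in place of the real data, might look like this:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated vehicle-damage claim severities (the paper uses real claim data).
claims = rng.lognormal(mean=8.0, sigma=1.1, size=500)

# Parametric step: fit a lognormal by maximum likelihood (location fixed at 0).
shape, loc, scale = stats.lognorm.fit(claims, floc=0)
print(f"Fitted lognormal: sigma={shape:.3f}, median={scale:.1f}")

# Non-parametric step: kernel density estimation on the log scale.
# Estimating the density of log(claims) and transforming back usually behaves
# better for heavily skewed data than applying the kernel on the raw scale.
kde = stats.gaussian_kde(np.log(claims))

def density_original_scale(x):
    # Change of variables: f_X(x) = f_log(log x) / x for x > 0.
    x = np.asarray(x, dtype=float)
    return kde(np.log(x)) / x

grid = np.linspace(claims.min(), np.quantile(claims, 0.99), 200)
print(density_original_scale(grid)[:5])
```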

Relevance: 30.00%

Abstract:

Purpose - The purpose of this paper is to document the outcome of a global, three-year-long supply chain improvement initiative at a multi-national producer of branded sporting goods that is transforming from a holding structure into an integrated company. The case company comprises seven internationally well-known sport brands, which form a diverse set of independent sub-cases to which the same supply chain metrics and change project approach were applied to improve supply chain performance. Design/methodology/approach - Using an in-depth case study and statistical analysis, the paper analyzes across the brands how supply chain complexity (SKU count), supply chain type (make or buy) and seasonality affect the completeness and punctuality of deliveries, and inventory, as the change project progresses. Findings - Results show that a reduction in supply chain complexity improves delivery performance but has no impact on inventory. Supply chain type has no impact on service level, but brands with in-house production are better at improving inventory than those with outsourced production. Non-seasonal business units improve service faster than seasonal ones, yet there is no impact on inventory. Research limitations/implications - The longitudinal data used for the analysis are biased by the general business trend, yet the rich data from different cases and three years of data collection enable generalization to a certain level. Practical implications - The in-depth case study serves as an example for other companies of how to initiate a supply chain improvement project across business units with tangible results. Originality/value - The seven sub-cases, with their different characteristics, to which the same improvement initiative was applied set a unique ground for longitudinal analysis of supply chain complexity, type and seasonality.

Relevance: 30.00%

Abstract:

In this paper we address the complexity of the analysis of water use in relation to the issue of sustainability. The flows of water on our planet represent a complex reality that can be studied using many different perceptions and narratives referring to different scales and dimensions of analysis. For this reason, a quantitative analysis of water use has to be based on analytical methods that are semantically open: they must be able to define what we mean by the term "water" when crossing different scales of analysis. We propose a definition of water as a resource that reflects the many services it provides to humans and ecosystems. We argue that water can fulfil so many of them because the element has many characteristics that allow the resource to be labelled with different attributes, depending on the end use (such as being drinkable). Since the services for humans and the functions for ecosystems associated with water flows are defined on different but interconnected scales, it is necessary to organize the assessment of water use across different hierarchical levels. In order to do so, we define how to approach the study of water use within societal metabolism, proposing a Water Metabolism organized in three levels: the societal level, the ecosystem level and the global level. The possible end uses we distinguish for society are personal/physiological use, household use and economic use. Organizing the study of water use across all these levels increases the usefulness of the quantitative analysis and the possibility of finding relevant and comparable results. To achieve this, we adapted a method developed for multi-level, multi-scale analysis - the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) approach - to the analysis of water metabolism. We discuss the peculiar analytical identity that water shows within multi-scale metabolic studies: water represents a flow-element when considering the metabolism of social systems (at a small scale, when describing the water metabolism inside society) and a fund-element when considering the metabolism of ecosystems (at a larger scale, when describing the water metabolism outside society). The theoretical analysis is illustrated using two cases, which characterize the metabolic patterns of water use of a productive system in Catalonia and a water management policy in the Andarax River Basin in Andalusia.

Relevance: 30.00%

Abstract:

Network analysis naturally relies on graph theory and, more particularly, on the use of node and edge metrics to identify the salient properties of graphs. When building visual maps of networks, these metrics are turned into useful visual cues or are used interactively, for instance to filter out parts of a graph while querying it. Over the years, analysts from different application domains have designed metrics to serve specific needs. Network science is an inherently cross-disciplinary field, which leads to the publication of metrics with similar goals; different names and descriptions often mask the similarity between two metrics that originated in different fields. Here, we study a set of graph metrics and compare their relative values and behaviors in an effort to survey their potential contributions to the spatial analysis of networks.
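
A small illustration of the kind of comparison described: compute several standard node metrics on the same graph and measure how strongly they agree. The graph, the metric set and the rank-correlation measure are illustrative choices, not those surveyed by the authors.

```python
import networkx as nx
from scipy.stats import spearmanr

# Illustrative graph (a small-world network stands in for real data).
G = nx.watts_strogatz_graph(n=200, k=6, p=0.1, seed=1)

# Three node metrics that are often proposed independently in different fields.
metrics = {
    "degree": dict(G.degree()),
    "betweenness": nx.betweenness_centrality(G),
    "closeness": nx.closeness_centrality(G),
}

nodes = list(G.nodes())
names = list(metrics)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        a = [metrics[names[i]][v] for v in nodes]
        b = [metrics[names[j]][v] for v in nodes]
        rho, _ = spearmanr(a, b)
        print(f"{names[i]} vs {names[j]}: Spearman rho = {rho:.2f}")
```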

Relevance: 30.00%

Abstract:

The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]; earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step rely on those of the previous one; however, the methodology is not linear but a cyclic, iterative progression for obtaining knowledge about an event. The preliminary analysis is a pre-evaluation phase in which the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step covers reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. This methodology provides a sound basis for extending the use of images as evidence and, more generally, as clues in investigation and crime reconstruction processes.

Relevance: 30.00%

Abstract:

An optimally cross-linked peptidoglycan requires both transglycosylation and transpeptidation, provided by class A and class B penicillin-binding proteins (PBPs). Streptococcus gordonii possesses three class A PBPs (PBPs 1A, 1B, and 2A) and two class B PBPs (PBPs 2B and 2X) that are important for penicillin resistance. High-level resistance (MIC ≥ 2 μg/ml) requires mutations in class B PBPs. However, although unmutated, class A PBPs are critical to facilitate resistance development (M. Haenni and P. Moreillon, Antimicrob. Agents Chemother. 50:4053-4061, 2006); thus, their overexpression might be important for withstanding the drug. Here, we determined the promoter regions of the S. gordonii PBPs and compared them to those of other streptococci. The extended -10 box was highly conserved and complied with a sigma(A)-type promoter consensus sequence. In contrast, the -35 box was poorly conserved, leaving the possibility of differential PBP regulation. Gene expression in a penicillin-susceptible parent (MIC, 0.008 μg/ml) and a high-level-resistant mutant (MIC, 2 μg/ml) was monitored using luciferase fusions. In the absence of penicillin, all PBPs were constitutively expressed, but their expression was globally increased (1.5 to 2 times) in the resistant mutant. In the presence of penicillin, class A PBPs were specifically overexpressed both in the parent (PBP 2A) and in the resistant mutant (PBPs 1A and 2A). By increasing transglycosylation, class A PBPs could promote peptidoglycan stability when transpeptidation is inhibited by penicillin. Since penicillin-related induction of class A PBPs occurred in both susceptible and resistant cells, such a mutation-independent facilitating mechanism could be operative at each step of resistance development.

Relevance: 30.00%

Abstract:

1.6 STRUCTURE OF THIS THESIS
- Chapter 1 presents the motivations of this dissertation by illustrating two gaps in the current body of knowledge that are worth filling, describes the research problem addressed by this thesis, and presents the research methodology used to achieve this goal.
- Chapter 2 reviews the existing literature, showing that environment analysis is a vital strategic task, that it should be supported by adapted information systems, and that there is thus a need for a conceptual model of the environment that provides a reference framework for better integrating the various existing methods, together with a more formal definition of the various aspects to support the development of suitable tools.
- Chapter 3 proposes a conceptual model that specifies the various environmental aspects relevant for strategic decision making and how they relate to each other, and defines them in a more formal way better suited to information systems development.
- Chapter 4 is dedicated to evaluating the proposed model on the basis of its application to a concrete environment, to assess its suitability for describing the current conditions and potential evolution of a real environment and to get an idea of its usefulness.
- Chapter 5 goes a step further by assembling a toolbox of methods that can be used to analyze the various environmental aspects put forward by the model, and by providing more detailed specifications for a number of them to show how the model can facilitate their implementation as software tools.
- Chapter 6 describes a prototype of a strategic decision support tool that allows the analysis of some aspects of the environment that are not well supported by existing tools, namely the relationships between multiple actors and issues. The usefulness of this prototype is evaluated on the basis of its application to a concrete environment.
- Chapter 7 concludes the thesis by summarizing its various contributions and proposing further interesting research directions.

Relevance: 30.00%

Abstract:

For doping control, analyses of samples are generally achieved in two steps: a rapid screening and, in the case of a positive result, a confirmatory analysis. A two-step methodology based on ultra-high-pressure liquid chromatography coupled to quadrupole time-of-flight mass spectrometry (UHPLC-QTOF-MS) was developed to screen and confirm 103 doping agents from various classes (e.g., beta-blockers, stimulants, diuretics, and narcotics). The screening method was presented in a previous article as Part I (i.e., Fast analysis of doping agents in urine by ultra-high-pressure liquid chromatography-quadrupole time-of-flight mass spectrometry. Part I: screening analysis). For the confirmatory method, basic, neutral and acidic compounds were extracted by a dedicated solid-phase extraction (SPE) in a 96-well plate format and detected by MS in tandem mode to obtain precursor and characteristic product ions. The mass accuracy and the elemental composition of precursor and product ions were used for compound identification. After validation, including determination of matrix effects, the method was considered reliable for confirming suspect results without ambiguity according to the positivity criteria established by the World Anti-Doping Agency (WADA). Moreover, an isocratic method was developed to separate ephedrine from its isomer pseudoephedrine, and cathine from phenylpropanolamine, in a single run, which allowed their direct quantification in urine.

Relevance: 30.00%

Abstract:

Laser desorption ionisation mass spectrometry (LDI-MS) has been demonstrated to be an excellent analytical method for the forensic analysis of inks on a questioned document. The ink can be analysed directly on its substrate (paper), which offers a fast method of analysis, as sample preparation is kept to a minimum and, more importantly, damage to the document is minimised. LDI-MS has also previously been reported to provide a high power of discrimination in the statistical comparison of ink samples and has the potential to be introduced as part of routine ink analysis. This paper examines the methodology further and statistically evaluates the reproducibility and the influence of paper on black gel pen ink LDI-MS spectra by comparing spectra of three different black gel pen inks on three different paper substrates. Although generally minimal, the influences of sample homogeneity and paper type were found to be sample dependent. This should be taken into account to avoid the risk of false differentiation of black gel pen ink samples. Other statistical approaches, such as principal component analysis (PCA), proved to be a good alternative to correlation coefficients for the comparison of whole mass spectra.
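
A sketch of the PCA-based comparison mentioned in the last sentence, using made-up intensity vectors in place of real LDI-MS spectra; the binning, scaling and number of components are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Made-up data: 9 spectra (3 inks x 3 paper substrates), each binned into
# 300 m/z channels. Real LDI-MS spectra would be preprocessed and binned
# before this step.
base_inks = rng.random((3, 300))
spectra = np.vstack([
    base_inks[i] + 0.05 * rng.standard_normal(300)  # paper/replicate noise
    for i in range(3) for _ in range(3)
])
labels = [f"ink{i + 1}" for i in range(3) for _ in range(3)]

# Standardise each m/z channel, then project onto the first two components.
X = StandardScaler().fit_transform(spectra)
scores = PCA(n_components=2).fit_transform(X)

for lab, (pc1, pc2) in zip(labels, scores):
    print(f"{lab}: PC1={pc1:+.2f}  PC2={pc2:+.2f}")
# Spectra of the same ink should cluster together regardless of paper,
# which is the behaviour the paper evaluates.
```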

Relevance: 30.00%

Abstract:

In this paper we look at how web-based social software can be used to carry out qualitative data analysis of online peer-to-peer learning experiences. Specifically, we propose to use Cohere, a web-based social sense-making tool, to observe, track, annotate and visualize discussion group activities in online courses. We define a specific methodology for data observation and structuring, and present the results of an analysis of peer interactions conducted in a discussion forum in a real case study of a P2PU course. Finally, we discuss how network visualization and analysis can be used to gain a better understanding of the peer-to-peer learning experience; to this end, we provide preliminary insights into the social, dialogical and conceptual connections generated within one online discussion group.
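
A minimal sketch of how discussion-forum interactions could be turned into a network for the kind of analysis described; the reply records and the indicators chosen are hypothetical and do not reflect Cohere's actual data model.

```python
import networkx as nx

# Hypothetical reply records from one discussion thread:
# (author_of_reply, author_replied_to).
replies = [
    ("ana", "ben"), ("carl", "ana"), ("ben", "ana"),
    ("dana", "carl"), ("ana", "dana"), ("ben", "carl"),
]

# Directed graph of who replied to whom, weighted by reply count.
G = nx.DiGraph()
for src, dst in replies:
    if G.has_edge(src, dst):
        G[src][dst]["weight"] += 1
    else:
        G.add_edge(src, dst, weight=1)

# Simple indicators of participation and dialogue structure.
print("replies received:", dict(G.in_degree()))
print("replies sent:    ", dict(G.out_degree()))
print("density:", round(nx.density(G), 2))
```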

Relevance: 30.00%

Abstract:

Today's approach to anti-doping is mostly centered on the judicial process, despite also pursuing the further goals of detecting, reducing, solving and/or preventing doping. Similar to decision-making in law enforcement, which feeds on Forensic Intelligence, anti-doping might benefit significantly from a more extensive gathering of knowledge. Forensic Intelligence might bring a broader logical dimension to the interpretation of data on doping activities, enabling a more future-oriented and comprehensive approach instead of the traditional case-based and reactive process. Information coming from a variety of sources related to doping, whether directly or potentially, would feed an organized memory that provides real-time intelligence on the size, seriousness and evolution of the phenomenon. Because of the complexity of doping, integrating analytical chemical results and longitudinal monitoring of biomarkers with physiological, epidemiological, sociological or circumstantial information might provide a logical framework enabling fit-for-purpose decision-making. Anti-Doping Intelligence might therefore prove efficient at providing a more proactive response to any potential or emerging doping phenomenon, or at addressing existing problems with innovative actions and/or policies. This approach might prove useful for detecting, neutralizing, disrupting and/or preventing organized doping or the trafficking of doping agents, as well as helping to refine the targeting of athletes or teams. In addition, such an intelligence-led methodology would serve to address doping offenses in the absence of adverse analytical chemical evidence.