970 results for Univariate Analysis box-jenkins methodology
Abstract:
Recent theoretical models of economic growth have emphasised the role of external effects on the accumulation of factors of production. Although most of the literature has considered externalities across firms within a region, in this paper we go a step further and consider the possibility that these externalities cross the borders of regional economies. We assess the role of these external effects in explaining growth and economic convergence. We present a simple growth model that includes externalities across economies, and develop a methodology for testing their existence and estimating their strength. In our view, spatial econometrics is naturally suited to an empirical examination of these externalities. We obtain evidence of significant externalities across both Spanish and European regions.
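The cross-region externality term at the heart of such spatial econometric models is the spatial lag: a weighted average of neighbouring regions' outcomes. A minimal sketch, with a hypothetical contiguity matrix and data (not the paper's specification), assuming no region is isolated:

```python
import numpy as np

def spatial_lag(W, y):
    """Row-standardize a binary contiguity matrix W and return Wy,
    each region's average of its neighbours' values -- the term used
    to carry externalities across regional borders.
    Assumes every region has at least one neighbour (no zero rows)."""
    W = np.asarray(W, float)
    Wn = W / W.sum(axis=1, keepdims=True)   # row-standardization
    return Wn @ np.asarray(y, float)

# hypothetical 3-region example: region 0 borders regions 1 and 2
W = [[0, 1, 1],
     [1, 0, 0],
     [1, 0, 0]]
growth = [1.0, 2.0, 3.0]
lagged = spatial_lag(W, growth)   # region 0 gets (2.0 + 3.0) / 2 = 2.5
```

In a spatial lag regression this term enters as an extra regressor, so its coefficient measures the strength of the cross-economy externality.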
Abstract:
Today's approach to anti-doping is centered mostly on the judicial process, even though it pursues the further goals of detecting, reducing, resolving and/or preventing doping. Just as decision-making in law enforcement draws on Forensic Intelligence, anti-doping might benefit significantly from a more extensive gathering of knowledge. Forensic Intelligence might bring a broader logical dimension to the interpretation of data on doping activities, enabling a more future-oriented and comprehensive approach instead of the traditional case-based and reactive process. Information coming from a variety of sources related to doping, whether directly or potentially, would feed an organized memory providing real-time intelligence on the size, seriousness and evolution of the phenomenon. Given the complexity of doping, integrating analytical chemistry results and longitudinal monitoring of biomarkers with physiological, epidemiological, sociological or circumstantial information might provide a logical framework enabling fit-for-purpose decision-making. Anti-Doping Intelligence might therefore offer a more proactive response to any potential or emerging doping phenomenon, and address existing problems with innovative actions and/or policies. This approach might prove useful for detecting, neutralizing, disrupting and/or preventing organized doping or the trafficking of doping agents, as well as for refining the targeting of athletes or teams. In addition, such an intelligence-led methodology would serve to address doping offenses in the absence of adverse analytical chemical evidence.
Abstract:
In this paper we test the hysteresis versus natural rate hypothesis on the unemployment rates of the new EU member states, using unit root tests that account for the presence of level shifts. As a by-product, the analysis yields an estimate of the NAIRU from a univariate point of view. The paper also focuses on the precision of these NAIRU estimates, studying the two sources of inaccuracy that derive from the estimation of the break points and of the autoregressive parameters. The results point to the existence of up to four structural breaks in the transition countries' NAIRU, which can be associated with institutional changes implementing market-oriented reforms. Moreover, the degree of persistence in unemployment varies dramatically among the individual countries, depending on the stage reached in the transition process.
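The kind of unit root test the paper relies on can be sketched with a stripped-down Dickey-Fuller regression that adds a level-shift dummy. This is an illustrative simplification, not the authors' exact procedure: it uses no augmentation lags and takes the break date as known, whereas the paper estimates the break points.

```python
import numpy as np

def df_t_with_level_shift(y, break_idx):
    """t-statistic on the lagged level in the regression
    dy_t = a + b*D_t + rho*y_{t-1} + e_t, where D_t is a level-shift
    dummy switching on at break_idx. Strongly negative values reject
    a unit root (hysteresis) in favour of a (shifting) natural rate.
    Simplifications: no lag augmentation, break date assumed known."""
    y = np.asarray(y, float)
    dy = np.diff(y)                      # dy[i] = y[i+1] - y[i]
    lag = y[:-1]                         # lagged level y[i]
    dummy = (np.arange(len(dy)) >= break_idx).astype(float)
    X = np.column_stack([np.ones_like(lag), dummy, lag])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[2] / np.sqrt(cov[2, 2])

rng = np.random.default_rng(0)
e = rng.standard_normal(300)
# stationary series around a shifting mean vs. a pure random walk
stationary = np.where(np.arange(300) < 150, 5.0, 8.0) + e
random_walk = np.cumsum(e)
t_shift = df_t_with_level_shift(stationary, break_idx=150)  # strongly negative
t_rw = df_t_with_level_shift(random_walk, break_idx=150)    # much closer to zero
```

Without the dummy, the level shift would masquerade as persistence, which is exactly why break-aware tests matter for the hysteresis question.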
Abstract:
Drosophila melanogaster is a model organism instrumental for numerous biological studies. The compound eye of this insect consists of some eight hundred individual ommatidia or facets, ca. 15 µm in cross-section. Each ommatidium contains eighteen cells, including four cone cells secreting the lens material (cornea). High-resolution imaging of the cornea of different insects has demonstrated that each lens is covered by nipple arrays, small outgrowths of ca. 200 nm in diameter. Here we utilize, for the first time, atomic force microscopy (AFM) to investigate the nipple arrays of the Drosophila lens, achieving an unprecedented visualization of the architecture of these nanostructures. We find by Fourier analysis that the nipple arrays of Drosophila are disordered, and that their seemingly ordered appearance is a consequence of dense packing of the nipples. In contrast, Fourier analysis confirms the visibly ordered nature of the eye microstructures, the individual lenses. This is different in the frizzled mutants of Drosophila, where both Fourier analysis and optical imaging detect disorder in lens packing. AFM reveals intercalations of lens material between individual lenses in frizzled mutants, providing an explanation for this disorder. In contrast, the nanostructures of the mutant lens show the same organization as in wild-type flies. Thus, frizzled mutants display abnormal organization of the corneal micro-, but not nano-structures. At the same time, the nipples of the mutant flies are shorter than those of the wild type. We also analyze the corneal surface of glossy-appearing eyes overexpressing Wingless, the lipoprotein ligand of Frizzled receptors, and find catastrophic aberrations in the nipple arrays, providing experimental evidence in favor of the major anti-reflective function of these insect eye nanostructures.
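The ordered-versus-disordered distinction drawn from Fourier analysis can be sketched on synthetic point patterns. The patterns and the "peakiness" metric below are illustrative assumptions, not the study's actual pipeline: a periodic lattice concentrates spectral power in sharp peaks, while a disordered packing spreads it out.

```python
import numpy as np

def spectral_peakiness(img):
    """Ratio of the strongest non-DC peak of the 2-D power spectrum
    to the mean spectral power: high for periodic (ordered) textures
    such as a lens lattice, low for disordered packings such as
    jumbled nipple arrays."""
    F = np.abs(np.fft.fft2(img - img.mean())) ** 2
    F[0, 0] = 0.0                      # discard the DC component
    return F.max() / F.mean()

n = 64
yy, xx = np.mgrid[0:n, 0:n]
ordered = ((xx % 8 == 0) & (yy % 8 == 0)).astype(float)   # perfect lattice
rng = np.random.default_rng(1)
disordered = np.zeros((n, n))
pts = rng.integers(0, n, size=(64, 2))
disordered[pts[:, 0], pts[:, 1]] = 1.0                    # random packing
p_ordered = spectral_peakiness(ordered)
p_disordered = spectral_peakiness(disordered)
```

Densely packed but disordered points still look quasi-regular to the eye, yet their spectrum shows a diffuse ring rather than discrete peaks, which is the signature the abstract describes.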
The combination of the easily tractable genetic model organism and robust AFM analysis represents a novel methodology to analyze development and architecture of these surface formations.
Abstract:
Generally, medical textbooks concentrate almost exclusively on explaining methodology for analyzing fixed measures, taken at a single moment; nevertheless, the evolution of the measurements and the correct interpretation of missing values are very important and can sometimes provide the key to the results obtained. Thus, time series analysis and spectral analysis (the analysis of time series in the frequency domain) can be regarded as appropriate tools for this kind of study. In this work, the frequency of the pulsatile secretion of luteinizing hormone (LH), which regulates the fertile life of women, was analyzed in order to determine the existence of significant frequencies by means of Fourier analysis. Detecting the frequencies at which the pulsatile secretion of LH takes place is a rather difficult question, due to the presence of random errors in measurement and sampling; that is, pulsatile secretions of small amplitude go undetected and are disregarded. In physiology it is accepted that cyclical patterns exist in the secretion of LH; the results of this research confirm this pattern and determine its frequency, presented in the periodograms corresponding to each studied cycle. The obtained results can be used as a key pattern for choosing future sampling frequencies, in order to "catch" the significant peaks of luteinizing hormone and act in time for the fertility treatment of women.
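The periodogram computation at the core of such an analysis can be sketched on a synthetic pulsatile series. The 90-minute pulse period and 10-minute sampling interval below are illustrative assumptions, not the paper's data:

```python
import numpy as np

def periodogram(x, dt):
    """Plain FFT periodogram: spectral power at each non-negative
    frequency for a series sampled every dt time units (minutes here)."""
    x = np.asarray(x, float) - np.mean(x)     # remove the mean (DC term)
    power = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=dt)
    return freqs, power

# hypothetical LH-like series: one pulse cycle every 90 min over 24 h,
# sampled every 10 min
t = np.arange(0, 1440, 10.0)
series = 5.0 + np.sin(2 * np.pi * t / 90.0)
freqs, power = periodogram(series, dt=10.0)
k = np.argmax(power[1:]) + 1                  # skip the zero-frequency bin
dominant_period = 1.0 / freqs[k]              # ~ 90 minutes
```

The recovered dominant period is what fixes the sampling frequency needed to catch the significant peaks: by the Nyquist criterion, samples must come at least twice per pulse cycle.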
Abstract:
Whether for investigative or intelligence purposes, crime analysts often face the need to analyse the spatiotemporal distribution of crimes or of traces left by suspects. This article presents a visualisation methodology supporting recurrent practical analytical tasks, such as the detection of crime series or the analysis of traces left by digital devices like mobile phones or GPS units. The proposed approach has led to the development of a dedicated tool that has proven its effectiveness in real inquiries and intelligence practice. It supports a more fluent visual analysis of the collected data and may provide critical clues to support police operations, as exemplified by the presented case studies.
Abstract:
Large animal models are an important resource for the understanding of human disease and for evaluating the applicability of new therapies to human patients. For many diseases, such as cone dystrophy, research effort is hampered by the lack of such models. Lentiviral transgenesis is a methodology broadly applicable to animals from many different species. When combined with the expression of a dominant mutant protein, this technology offers an attractive approach to generating new large animal models in a heterogeneous background. We adopted this strategy to mimic the phenotype diversity encountered in humans, generating a cohort of pigs modelling cone dystrophy by expressing a dominant mutant allele of the guanylate cyclase 2D (GUCY2D) gene. Sixty percent of the piglets were transgenic, with mutant GUCY2D mRNA detected in the retina of all animals tested. Functional impairment of vision was observed among the transgenic pigs at 3 months of age, with a follow-up at 1 year indicating a subsequently slower progression of the phenotype. Abnormal retina morphology, notably among the cone photoreceptor cell population, was observed exclusively amongst the transgenic animals. Of particular note, these transgenic animals were characterized by a range in the severity of the phenotype, reflecting the human clinical situation. We demonstrate that a transgenic approach using lentiviral vectors offers a powerful tool for large animal model development. Not only is the efficiency of transgenesis higher than with conventional transgenic methodology, but this technique also produces a heterogeneous cohort of transgenic animals that mimics the genetic variation encountered in human patients.
Abstract:
A second collaborative exercise on RNA/DNA co-analysis for body fluid identification and STR profiling was organized by the European DNA Profiling Group (EDNAP). Six human blood stains, two blood dilution series (5-0.001 μl blood) and, optionally, bona fide or mock casework samples of human or non-human origin were analyzed by the participating laboratories using an RNA/DNA co-extraction or solely RNA extraction method. Two novel mRNA multiplexes were used for the identification of blood: a highly sensitive duplex (HBA, HBB) and a moderately sensitive pentaplex (ALAS2, CD3G, ANK1, SPTB and PBGD). The laboratories used different chemistries and instrumentation. All of the 18 participating laboratories were able to successfully isolate and detect mRNA in dried blood stains. Thirteen laboratories simultaneously extracted RNA and DNA from individual stains and were able to use mRNA profiling to confirm the presence of blood and to obtain autosomal STR profiles from the blood stain donors. Positive identification of blood and good-quality DNA profiles were also obtained from old and compromised casework samples. The method proved to be reproducible and sensitive across different analysis strategies. The results of this collaborative exercise involving an RNA/DNA co-extraction strategy support the potential use of an mRNA-based system for the identification of blood in forensic casework that is compatible with current DNA analysis methodology.
Abstract:
Evidence exists that many natural phenomena are described better as fractals. Although fractals are very useful for describing nature, it is also appropriate to review the concept of the random fractal in finance. Given the extraordinary importance of Brownian motion in physics, chemistry and biology, we consider its generalization, the fractional Brownian motion introduced by Mandelbrot. The main goal of this work is to analyse the existence of long-range dependence in instantaneous forward rates of different financial markets. Concretely, we perform an empirical analysis of the Spanish, Mexican and U.S. interbank interest rates. We work with three time series of daily data corresponding to 1-day operations, from 28th March 1996 to 21st May 2002. From among all the existing tests on this matter, we apply the methodology proposed in Taqqu, Teverovsky and Willinger (1995).
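One of the estimators surveyed by Taqqu, Teverovsky and Willinger (1995), the aggregated-variance method, can be sketched as follows. For a series with Hurst exponent H, the variance of block means of size m scales as m^(2H-2), so H is read off a log-log regression; the white-noise check below is illustrative, not the paper's data.

```python
import numpy as np

def hurst_aggvar(x, block_sizes=(10, 20, 50, 100, 200)):
    """Aggregated-variance estimate of the Hurst exponent H:
    Var(block means of size m) ~ m**(2H - 2), so regressing
    log-variance on log-m gives slope 2H - 2. H > 0.5 signals
    long-range dependence; H = 0.5 is short memory."""
    x = np.asarray(x, float)
    logm, logv = [], []
    for m in block_sizes:
        k = len(x) // m                         # number of full blocks
        means = x[:k * m].reshape(k, m).mean(axis=1)
        logm.append(np.log(m))
        logv.append(np.log(means.var()))
    slope = np.polyfit(logm, logv, 1)[0]
    return 1.0 + slope / 2.0

rng = np.random.default_rng(0)
H_wn = hurst_aggvar(rng.standard_normal(50000))   # white noise: H near 0.5
```

Applied to forward-rate series, an estimate of H significantly above 0.5 would be the evidence of long-range dependence the paper looks for; Taqqu et al. recommend cross-checking several estimators, since each has its own biases.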
Abstract:
Production flow analysis (PFA) is a well-established methodology for transforming a traditional functional layout into a product-oriented layout. The method uses part routings to find natural clusters of workstations, forming production cells able to complete parts and components swiftly with simplified material flow. Once implemented, the scheduling system is based on period batch control, aiming to establish fixed planning, production and delivery cycles for the whole production unit. PFA is traditionally applied to job-shops with functional layouts; after reorganization into groups, lead times shorten, quality improves and motivation among personnel rises. Several papers have documented this, yet no research has studied its application to service operations management. This paper aims to show, with real cases, that PFA can be applied not only to job-shop and assembly operations, but also to back-office and service processes. The cases clearly show that PFA reduces non-value-adding operations, introduces flow by evening out bottlenecks, and diminishes process variability, all of which contribute to efficient operations management.
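The routing-based clustering step at the core of PFA is classically illustrated with King's rank order clustering on a machine-by-part incidence matrix. The sketch below uses a hypothetical routing matrix, and ROC is one textbook option rather than necessarily the authors' exact procedure:

```python
import numpy as np

def rank_order_clustering(a, max_iter=20):
    """King's rank order clustering: repeatedly sort rows (machines)
    and columns (parts) by the binary value of their 0/1 entries until
    the ordering is stable, which surfaces block-diagonal cells.
    Uses power-of-two weights, so it is meant for small demo matrices."""
    a = np.asarray(a)
    rows, cols = np.arange(a.shape[0]), np.arange(a.shape[1])
    for _ in range(max_iter):
        m = a[np.ix_(rows, cols)]
        w_col = 2 ** np.arange(m.shape[1] - 1, -1, -1)
        r_ord = np.argsort(-(m * w_col).sum(axis=1), kind="stable")
        m = m[r_ord]
        w_row = 2 ** np.arange(m.shape[0] - 1, -1, -1)
        c_ord = np.argsort(-(m * w_row[:, None]).sum(axis=0), kind="stable")
        if (r_ord == np.arange(len(rows))).all() and \
           (c_ord == np.arange(len(cols))).all():
            break                               # ordering is stable
        rows, cols = rows[r_ord], cols[c_ord]
    return rows, cols

# hypothetical routings: machines 0 and 2 process parts 0 and 2,
# machines 1 and 3 process parts 1 and 3
routings = np.array([[1, 0, 1, 0],
                     [0, 1, 0, 1],
                     [1, 0, 1, 0],
                     [0, 1, 0, 1]])
rows, cols = rank_order_clustering(routings)
cells = routings[np.ix_(rows, cols)]   # block-diagonal after reordering
```

Each diagonal block is a candidate production cell: a group of machines that can complete its family of parts with minimal traffic to other groups.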
Abstract:
For years, specifications have focused on the water-to-cementitious materials ratio (w/cm) and strength of concrete, despite the majority of the volume of a concrete mixture consisting of aggregate. An aggregate distribution of roughly 60% coarse aggregate and 40% fine aggregate, regardless of gradation and availability of aggregates, has been used as the norm for a concrete pavement mixture. Efforts to reduce costs and improve the sustainability of concrete mixtures have pushed owners to pay closer attention to mixtures with a well-graded aggregate particle distribution. In general, workability depends on many variables that are independent of gradation, such as paste volume and viscosity, and aggregate shape and texture. A better understanding of how the properties of aggregates affect the workability of concrete is needed. The effects of aggregate characteristics on concrete properties, such as the ability to be vibrated, strength, and resistivity, were investigated using mixtures in which the paste content and the w/cm were held constant. The results showed that different aggregate proportions, nominal maximum aggregate sizes, and combinations of different aggregates all had an impact on performance in the strength, slump, and box tests.
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but did not prove to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformations and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for extreme-value modeling through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening data classification. Within the classification methods, probabilistic neural networks (PNN) showed themselves better adapted to high-threshold categorization and to automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no single prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to accomplish efficient indoor radon decision-making.
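Among the exploratory tools mentioned, K Nearest Neighbors is the simplest to sketch. The coordinates and values below are hypothetical, and the brute-force distance search is only meant for small datasets:

```python
import numpy as np

def knn_predict(coords, values, targets, k=3):
    """KNN spatial prediction: each target location receives the mean
    of the k closest observed values (brute-force Euclidean distances)."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    preds = []
    for p in np.atleast_2d(np.asarray(targets, float)):
        d = np.linalg.norm(coords - p, axis=1)   # distance to every observation
        preds.append(values[np.argsort(d)[:k]].mean())
    return np.array(preds)

# hypothetical measurement locations and radon-like values
obs_xy = [[0, 0], [0, 1], [1, 0], [1, 1]]
obs_val = [40.0, 60.0, 80.0, 100.0]
pred = knn_predict(obs_xy, obs_val, [[0.1, 0.1]], k=1)   # nearest is (0, 0)
```

The choice of k plays the same role as the neighborhood definition in variography-based methods, which is why the study compares such tools across several spatial scales.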
Abstract:
The Agenda 21 for the Geneva region is the result of a broad consultation process including all local actors. Article 12 stipulates that "the State facilitates possible synergies between economic activities in order to minimize their environmental impacts", thus opening the way for Industrial Ecology (IE) and Industrial Symbiosis (IS). An Advisory Board for Industrial Ecology and Industrial Symbiosis implementation was established in 2002, involving the relevant government agencies. Regulatory and technical conditions for IS are studied in the Swiss context. The results reveal that the Swiss law on waste does not hinder by-product exchanges. Methodological and technical factors, including geographic, qualitative, quantitative and economic aspects, are detailed. The competition with waste operators in a highly developed recycling system is also tackled. The IS project develops an empirical and systematic method for detecting and implementing by-product synergies between industrial actors disseminated throughout the Geneva region. A database management tool for the treatment of input-output analysis data and GIS tools for detecting potential industrial partners are constantly improved. Potential symbioses for 17 flows (including energy, water and material flows) are currently being studied for implementation.
Abstract:
The reliable and objective assessment of chronic disease state has been, and still is, a very significant challenge in clinical medicine. An essential feature of human behavior related to health status, functional capacity, and quality of life is physical activity during daily life. A common way to assess physical activity is to measure the quantity of body movement. Since human activity is controlled by various factors, both extrinsic and intrinsic to the body, quantitative parameters provide only a partial assessment and do not allow for a clear distinction between normal and abnormal activity. In this paper, we propose a methodology for the analysis of human activity patterns based on the definition of different physical activity time series together with the appropriate analysis methods. The temporal pattern of postures, movements, and transitions between postures was quantified using fractal analysis and symbolic dynamics statistics. The derived nonlinear metrics were able to discriminate patterns of daily activity generated from healthy and chronic-pain states.
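The symbolic-dynamics side of such an analysis can be sketched with a word-entropy statistic on a coded posture sequence. The posture codes and the two synthetic sequences below are illustrative assumptions, not the paper's data or its exact metric:

```python
import numpy as np
from collections import Counter

def word_entropy(symbols, word_len=3):
    """Symbolic-dynamics statistic: Shannon entropy (bits) of the
    distribution of overlapping words of length word_len in a symbol
    sequence. Regular activity patterns give low entropy; erratic
    ones approach the maximum, word_len * log2(alphabet size)."""
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = np.array(list(Counter(words).values()), float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# hypothetical posture codes: 0 = sitting, 1 = standing, 2 = walking
regular = [0, 1, 2] * 100                        # strict, cyclic routine
rng = np.random.default_rng(0)
erratic = rng.integers(0, 3, size=300).tolist()  # pattern-free activity
```

The point of such nonlinear metrics is exactly what the abstract argues: two sequences can contain the same total amount of movement yet differ sharply in the temporal organization that the entropy captures.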