975 results for Instrumental analysis
Abstract:
One of the most influential statements in the anomie theory tradition has been Merton’s argument that the volume of instrumental property crime should be higher where there is a greater imbalance between the degree of commitment to monetary success goals and the degree of commitment to legitimate means of pursuing such goals. Contemporary anomie theories stimulated by Merton’s perspective, most notably Messner and Rosenfeld’s institutional anomie theory (IAT), have expanded the scope conditions by emphasizing lethal criminal violence as an outcome to which anomie theory is highly relevant, and virtually all contemporary empirical studies have focused on applying the perspective to explaining spatial variation in homicide rates. In the present paper, we argue that current explications of Merton’s theory and IAT have not adequately conveyed the relevance of the core features of the anomie perspective to lethal violence. We propose an expanded anomie model in which an unbalanced pecuniary value system – the core causal variable in Merton’s theory and IAT – translates into higher levels of homicide primarily in indirect ways by increasing levels of firearm prevalence, drug market activity, and property crime, and by enhancing the degree to which these factors stimulate lethal outcomes. Using aggregate-level data collected during the mid-to-late 1970s for a sample of relatively large social aggregates within the U.S., we find a significant effect on homicide rates of an interaction term reflecting high levels of commitment to monetary success goals and low levels of commitment to legitimate means. Virtually all of this effect is accounted for by higher levels of property crime and drug market activity that occur in areas with an unbalanced pecuniary value system. Our analysis also reveals that property crime is more apt to lead to homicide under conditions of high structural disadvantage.
These and other findings underscore the potential value of elaborating the anomie perspective to explicitly account for lethal violence.
Abstract:
The meteorological circumstances that led to the Blizzard of March 1888 that hit New York are analysed in Version 2 of the “Twentieth Century Reanalysis” (20CR). The potential of this dataset for studying historical extreme events has not yet been fully explored. A detailed analysis of 20CR data alongside other data sources (including historical instrumental data and weather maps) for historical extremes such as the March 1888 blizzard may give insights into the limitations of 20CR. We find that 20CR reproduces the circulation pattern as well as the temperature development very well. Regarding the absolute values of variables such as snowfall or minimum and maximum surface pressure, there is an underestimation of the observed extremes, which may be due to the low spatial resolution of 20CR and the fact that only the ensemble mean is considered. Despite this drawback, the dataset allows us to gain new information due to its complete spatial and temporal coverage.
Abstract:
The Genesis mission Solar Wind Concentrator was built to enhance fluences of solar wind by an average of 20x over the 2.3 years that the mission exposed substrates to the solar wind. The Concentrator targets survived the hard landing upon return to Earth and were used to determine the isotopic composition of solar-wind—and hence solar—oxygen and nitrogen. Here we report on the flight operation of the instrument and on simulations of its performance. Concentration and fractionation patterns obtained from simulations are given for He, Li, N, O, Ne, Mg, Si, S, and Ar in SiC targets, and are compared with measured concentrations and isotope ratios for the noble gases. Carbon is also modeled for a Si target. Predicted differences in instrumental fractionation between elements are discussed. Additionally, as the Concentrator was designed only for ions ≤22 AMU, implications of analyzing elements as heavy as argon are discussed. Post-flight simulations of instrumental fractionation as a function of radial position on the targets incorporate solar-wind velocity and angular distributions measured in flight, and predict fractionation patterns for various elements and isotopes of interest. A tighter angular distribution, mostly due to better spacecraft spin stability than assumed in pre-flight modeling, results in a steeper isotopic fractionation gradient between the center and the perimeter of the targets. Using the distribution of solar-wind velocities encountered during flight, which are higher than those used in pre-flight modeling, results in elemental abundance patterns slightly less peaked at the center. Mean fractionations trend with atomic mass, with differences relative to the measured isotopes of neon of +4.1±0.9 ‰/amu for Li, between -0.4 and +2.8 ‰/amu for C, +1.9±0.7‰/amu for N, +1.3±0.4 ‰/amu for O, -7.5±0.4 ‰/amu for Mg, -8.9±0.6 ‰/amu for Si, and -22.0±0.7 ‰/amu for S (uncertainties reflect Monte Carlo statistics). 
The slopes of the fractionation trends depend to first order only on the relative differential mass ratio, Δ m/ m. This article and a companion paper (Reisenfeld et al. 2012, this issue) provide post-flight information necessary for the analysis of the Genesis solar wind samples, and thus serve to complement the Space Science Review volume, The Genesis Mission (v. 105, 2003).
Abstract:
This thesis project is motivated by the problem of using observational data to draw inferences about a causal relationship in observational epidemiology research when controlled randomization is not applicable. The instrumental variable (IV) method is one statistical tool for overcoming this problem. A Mendelian randomization study uses genetic variants as IVs in genetic association studies. In this thesis, the IV method, as well as standard logistic and linear regression models, is used to investigate the causal association between risk of pancreatic cancer and the circulating levels of soluble receptor for advanced glycation end-products (sRAGE). Higher levels of serum sRAGE were found to be associated with a lower risk of pancreatic cancer in a previous observational study (255 cases and 485 controls). However, such a novel association may be biased by unknown confounding factors. In a case-control study, we aimed to use the IV approach to confirm or refute this observation in a subset of study subjects for whom genotyping data were available (178 cases and 177 controls). A two-stage IV analysis using generalized method of moments-structural mean models (GMM-SMM) was conducted and the relative risk (RR) was calculated. In the first-stage analysis, we found that the single nucleotide polymorphism (SNP) rs2070600 of the receptor for advanced glycation end-products (AGER) gene meets all three general assumptions for a genetic IV in examining the causal association between sRAGE and risk of pancreatic cancer. The variant allele of SNP rs2070600 of the AGER gene was associated with lower levels of sRAGE, and it was neither associated with risk of pancreatic cancer nor with the confounding factors. It was a potentially strong IV (F statistic = 29.2). However, in the second-stage analysis, the GMM-SMM model failed to converge due to non-concavity, probably because of the small sample size.
Therefore, the IV analysis could not support the causality of the association between serum sRAGE levels and risk of pancreatic cancer. Nevertheless, these analyses suggest that rs2070600 was a potentially good genetic IV for testing the causality between the risk of pancreatic cancer and sRAGE levels. A larger sample size is required to conduct a credible IV analysis.
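The core IV idea described in this abstract can be sketched with simulated data. The following is an illustrative single-instrument ratio (Wald) estimator in plain Python, not the thesis's GMM-SMM analysis; the variable names, effect sizes, and sample size are all invented for the example.

```python
import random

random.seed(0)
n = 20_000

# Hypothetical setup: Z is a genetic variant (genotype coded 0/1/2),
# X a biomarker exposure (playing the role of sRAGE), Y a continuous
# outcome, and U an unmeasured confounder that biases the naive estimate.
Z = [random.choice([0, 1, 2]) for _ in range(n)]
U = [random.gauss(0, 1) for _ in range(n)]                 # hidden confounder
X = [0.5 * z - 1.0 * u + random.gauss(0, 1) for z, u in zip(Z, U)]
Y = [0.3 * x + 2.0 * u + random.gauss(0, 1) for x, u in zip(X, U)]  # true effect 0.3

def cov(a, b):
    """Sample covariance of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

beta_naive = cov(X, Y) / cov(X, X)  # ordinary regression slope: biased by U
beta_iv = cov(Z, Y) / cov(Z, X)     # Wald/IV ratio: U cancels if Z is a valid IV
```

Because Z affects Y only through X and is independent of U, the IV ratio recovers a value near the true effect of 0.3, while the naive slope is pulled strongly negative by the confounder; this is exactly the bias an IV analysis is meant to remove.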
Abstract:
Non-failure analysis aims at inferring that predicate calls in a program will never fail. This type of information has many applications in functional/logic programming. It is essential for determining lower bounds on the computational cost of calls, useful in the context of program parallelization, instrumental in partial evaluation and other program transformations, and has also been used in query optimization. In this paper, we re-cast the non-failure analysis proposed by Debray et al. as an abstract interpretation, which not only allows us to investigate it from a standard and well-understood theoretical framework, but also has several practical advantages. It allows us to incorporate non-failure analysis into a standard, generic abstract interpretation engine. The analysis thus benefits from the fixpoint propagation algorithm, which leads to improved information propagation. Also, the analysis takes advantage of the multi-variance of the generic engine, so that it is now able to infer separate non-failure information for different call patterns. Moreover, the implementation is simpler, and allows us to perform non-failure and covering analyses alongside other analyses, such as those for modes and types, in the same framework. Finally, besides the precision improvements and the additional simplicity, our implementation (in the Ciao/CiaoPP multiparadigm programming system) also shows better efficiency.
Abstract:
The studies carried out so far to determine the measurement quality of geodetic instruments have been aimed primarily at angle and distance measurements. However, in recent years the use of GNSS (Global Navigation Satellite System) equipment has become widespread in geomatic applications without an established methodology for obtaining the calibration correction and its uncertainty for this equipment. The purpose of this Thesis is to establish the requirements that a network must meet to be considered a Standard Network with metrological traceability, as well as the methodology for the verification and calibration of GNSS instruments in such standard networks. To this end, a technical calibration procedure for GNSS equipment has been designed and developed in which the contributions to the measurement uncertainty are defined. The procedure, which has been applied in different networks for different equipment, has allowed the expanded uncertainty of such equipment to be determined following the recommendations of the Guide to the Expression of Uncertainty in Measurement of the Joint Committee for Guides in Metrology. In addition, the three-dimensional coordinates of the bases that constitute the networks considered in the investigation have been determined by satellite observation techniques, and simulations have been developed for different values of the experimental standard deviations of the fixed points used in the least-squares adjustment of the vectors or baselines. The results have shown the importance of knowing the experimental standard deviations when calculating the uncertainties of the three-dimensional coordinates of the bases.
Based on high-quality technical studies and observations previously carried out in these networks, an exhaustive analysis has been performed that has made it possible to determine the requirements a standard network must meet. In addition, technical calibration procedures have been designed that allow the expanded measurement uncertainty to be calculated for geodetic instruments that provide angles and electromagnetically measured distances, since these are the instruments that disseminate metrological traceability to the standard networks used for the verification and calibration of GNSS equipment. As a result, it has been possible to determine local calibration corrections for high-accuracy GNSS equipment in standard networks.
In this Thesis, the uncertainty of the calibration correction has been obtained using two different methodologies: the first applies the law of propagation of uncertainty, while the second applies the propagation of distributions using the Monte Carlo method. The analysis of the results confirms the validity of both methodologies for determining the calibration uncertainty of GNSS equipment.
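The two uncertainty methodologies compared in this abstract can be sketched for a deliberately trivial one-line measurement model. The distances and standard uncertainties below are invented, and a real GNSS calibration model involves many more input quantities; the point is only that the analytic GUM propagation and the Monte Carlo propagation of distributions agree.

```python
import math
import random

random.seed(1)

# Invented example: a calibration correction c = d_ref - d_meas, where the
# reference distance and the measured distance each carry a standard
# uncertainty (values in metres, chosen purely for illustration).
d_ref, u_ref = 100.0000, 0.0005
d_meas, u_meas = 100.0012, 0.0020

# GUM law of propagation: for c = d_ref - d_meas (unit sensitivities),
# the combined standard uncertainty is the root sum of squares.
u_analytic = math.sqrt(u_ref**2 + u_meas**2)

# Monte Carlo propagation of distributions: sample the inputs, evaluate the
# model for each draw, and take the standard deviation of the results.
N = 100_000
samples = [(d_ref + random.gauss(0, u_ref)) - (d_meas + random.gauss(0, u_meas))
           for _ in range(N)]
mean_c = sum(samples) / N
u_mc = math.sqrt(sum((s - mean_c) ** 2 for s in samples) / (N - 1))
```

For a linear model with Gaussian inputs the two methods must coincide (up to Monte Carlo noise); the Monte Carlo route becomes the more robust choice when the measurement model is nonlinear or the input distributions are non-Gaussian.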
Abstract:
Abstract interpretation has been widely used for the analysis of object-oriented languages and, more precisely, Java source and bytecode. However, while most of the existing work deals with the problem of finding expressive abstract domains that track accurately the characteristics of a particular concrete property, the underlying fixpoint algorithms have received comparatively less attention. In fact, many existing (abstract interpretation based) fixpoint algorithms rely on relatively inefficient techniques to solve inter-procedural call graphs or are specific and tied to particular analyses. We argue that the design of an efficient fixpoint algorithm is pivotal to support the analysis of large programs. In this paper we introduce a novel algorithm for analysis of Java bytecode which includes a number of optimizations in order to reduce the number of iterations. Also, the algorithm is parametric in the sense that it is independent of the abstract domain used and it can be applied to different domains as "plug-ins". It is also incremental in the sense that, if desired, analysis data can be saved so that only a reduced amount of reanalysis is needed after a small program change, which can be instrumental for large programs. The algorithm is also multivariant and flow-sensitive. Finally, another interesting characteristic of the algorithm is that it is based on a program transformation, prior to the analysis, that results in a highly uniform representation of all the features in the language and therefore simplifies analysis. Detailed descriptions of decompilation solutions are provided and discussed with an example.
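The domain-parametric fixpoint iteration this abstract describes can be illustrated with a minimal worklist solver. This is a hypothetical sketch, not the paper's algorithm: the abstract domain here is a toy sign analysis over a three-node dataflow graph, and swapping out `transfer` and `join` is what "plugging in" a different domain means.

```python
from collections import deque

# Tiny dataflow graph with a loop between b and c (all names invented).
succs = {"a": ["b"], "b": ["c"], "c": ["b"]}
preds = {"a": [], "b": ["a", "c"], "c": ["b"]}

BOTTOM = frozenset()  # least element of the powerset-of-signs domain

def join(x, y):
    """Least upper bound in the domain (set union for a powerset domain)."""
    return x | y

def transfer(node, inp):
    """Node-local abstract semantics (purely illustrative)."""
    if node == "a":
        return frozenset({"+"})        # a := some positive constant
    if node == "b":
        return inp | frozenset({"-"})  # b may negate its input
    return inp                         # c := b

state = {n: BOTTOM for n in succs}
work = deque(succs)                    # worklist of nodes to (re)process
while work:
    n = work.popleft()
    inp = BOTTOM
    for p in preds[n]:                 # join the outputs of all predecessors
        inp = join(inp, state[p])
    out = transfer(n, inp)
    if out != state[n]:                # value grew: reschedule dependents
        state[n] = out
        work.extend(s for s in succs[n] if s not in work)
```

The loop terminates because the domain is finite and `transfer` is monotone, so each node's value can only grow a bounded number of times; the optimizations the paper proposes concern reducing how often nodes are re-enqueued, which this sketch does not attempt.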
Abstract:
The consumption of melon (Cucumis melo L.) was, until a few years ago, regional, seasonal and without commercial interest. Recent commercial changes and worldwide transportation have changed this situation. Melons from 3 different ripeness stages at harvest and 7 cold storage periods have been analysed by destructive and non-destructive tests. Chemical, physical, mechanical (non-destructive impact, compression, skin puncture and Magness-Taylor) and sensory tests were carried out in order to select the best test to assess quality and to determine the optimal ripeness stage at harvest. Analysis of variance and Principal Component Analysis were performed to study the data. The mechanical properties based on non-destructive impact and compression can be used to monitor cold storage evolution. They can also be used at harvest to segregate the highest ripeness stage (41 days after anthesis, DAA) from the less ripe stages (34 and 28 DAA). Only 34 and 41 DAA reached a sensory evaluation above 50 on a scale from 0 to 100.
Abstract:
Mealiness is a negative attribute of sensory texture, characterised by a lack of juiciness without variation of total water content in the tissues. In peaches, mealiness is also known as "woolliness" and "leatheriness". This internal disorder is characterised by a lack of juiciness and flavour. In peaches, it is associated with internal browning near the stone and an incapacity to ripen despite an externally ripe appearance. Woolliness is associated with inadequate cold storage and is considered a physiological disorder that appears in stone fruits when unbalanced pectolytic enzyme activity occurs during storage (Kailasapathy and Melton, 1992). Many attempts have been made to identify and measure mealiness and woolliness in fruits. The texture of a food product is composed of a wide spectrum of sensory attributes. The consumer defines texture by integrating all the sensory attributes simultaneously. However, an instrument assesses one or several parameters related to a fraction of the texture spectrum (Kramer, 1973). The complexity of sensory analysis by means of trained panels to assess the quality of some production processes supports the attempt to estimate texture characteristics by instrumental means. Some studies have been carried out comparing sensory and instrumental methods to assess mealiness and woolliness. The current study is centred on the analysis and evaluation of woolliness in peaches and is part of the European project FAIR CT95 0302 "Mealiness in fruits: consumer perception and means for detection". The main objective of this study was to develop procedures to detect woolly peaches by sensory and by instrumental means, as well as to compare both measuring procedures.
Abstract:
Fractal antennas have been proposed to improve the bandwidth of resonant structures and optical antennas. Their multiband characteristics are of interest in radiofrequency and microwave technologies. In this contribution we link the geometry of the current paths built into the fractal antenna with the spectral response. We have seen that the actual currents flowing through the structure are not limited to the portion of the fractal that should be geometrically linked with the signal. This fact strongly depends on the design of the fractal and how the different scales are arranged within the antenna. Some ideas involving materials that could actively respond to the incoming radiation could help to spectrally select the response of the multiband design.
Abstract:
Brazil is one of the world's largest honey producers, with production based mainly on the rearing of the exotic species Apis mellifera. Honey production by Apis mellifera is about 10 times higher than that of stingless bee species; however, honey from native bees has a higher commercial value. Although little explored, stingless bee honey attracts interest from the cosmetics and natural medicine industries. Its production is a tool with great potential to add economic value to Brazilian ecosystems, especially forest ecosystems, in a sustainable way and with a lower potential for trace-contaminant influence. The chemical quality of honey is an important commercial requirement, especially for honey destined for export. For example, in 2006 the European Union decided to suspend imports of honey produced in Brazil, claiming that the country's guidelines for residue control and product quality were not equivalent to those of the bloc. Given the potential for sustainable commercial production of honey from native Brazilian bees and the lack of knowledge about possible residues in its composition, especially trace elements, the main objective of this work was to characterise the chemical-element composition of stingless bee honey, compare it with that of Apis mellifera, and verify possible variations caused by the environment. This study investigated the chemical composition of stingless bee honeys from five Brazilian states: Bahia, Minas Gerais, Rio Grande do Norte, Santa Catarina and São Paulo, comprising a total of 70 hives of different species: Melipona quadrifasciata, Melipona scutelaris, Melipona mandacaia, Melipona capixaba, Melipona rufiventris, Melipona compressipes, Melipona bicolor, Nannotrigona testaceicornis, Tetragona clavipes, Tetragonisca angustula and Scaptotrigona sp.
Pollen, the main source of minerals for the hive, and the bees themselves were also collected for composition studies and correlation with the honeys. Instrumental neutron activation analysis allowed the determination of Br, Ca, Co, Cs, Fe, La, Na, Rb, Sc and Zn in the honeys; Br, Ca, Co, Cs, Fe, K, La, Na, Rb, Sc, Se and Zn in the pollen samples; and As, Br, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Se and Zn in the bees. Honeys from bees of the subtribe Trigonina showed higher concentrations of the alkali elements. High K/Na ratios were observed in the honey and pollen samples. Pollen proved to be a major source of P and Se. Chemometric analyses indicated that honeys and bees are good indicators of anthropogenic activities. Arsenic appeared in bees collected in areas of greater anthropogenic activity. As a result, this study has demonstrated the nutraceutical potential of stingless bee honey and pollen and the potential of native bees as tools for assessing environmental quality. Proximity to anthropogenic activities proved to be a decisive factor for higher As concentrations in bees.
Abstract:
This work presents a forensic analysis of buildings affected by mining subsidence, based on deformation data obtained by Differential Interferometry (DInSAR). The test site is La Union village (Murcia, SE Spain), where subsidence was triggered in an industrial area by the collapse of abandoned underground mine workings in 1998. In the first part of this work the study area was introduced, describing the spatial and temporal evolution of ground subsidence through the elaboration of a crack map of the buildings located within the affected area. In the second part, the evolution of the most significant cracks found in the most damaged buildings was monitored using biaxial extensometric units and inclinometers. This article describes the work performed in the third part, where DInSAR processing of satellite radar data available between 1998 and 2008 has made it possible to determine the spatial and temporal evolution of the deformation of all the buildings in the study area over a period for which no continuous in situ instrumental data are available. Additionally, the comparison of these results with the forensic data gathered in the 2005–2008 period reveals a coincidence between damaged buildings, buildings where extensometers registered significant crack movements, and building deformation estimated from radar data. As a result, it has been demonstrated that integrating DInSAR data into forensic analysis methodologies significantly improves the assessment of damage to buildings affected by mining subsidence.
Abstract:
Previous editions edited by W. W. Scott.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06