158 results for baselines


Relevance:

10.00%

Publisher:

Abstract:

Water management institutions have serious difficulty grasping the reality they are meant to act upon, because they start from abstract, homogeneous assumptions that bear little resemblance to the complex, conflict-ridden and heterogeneous local problems they are intended to improve. This creates a problematic gap between public policies and concrete problems. This paper discusses the dominant institutional theoretical frameworks, pointing out their limitations and implications. Based on a critical review of the literature on the subject, the article establishes baselines and a set of analytical dimensions that, from a different perspective, contribute to the formation of an institutional regime supporting the sustainable reproduction of water resources.

Relevance:

10.00%

Publisher:

Abstract:

In this study, ICESat altimetry data are used to provide precise lake elevations for the Tibetan Plateau (TP) during 2003-2009. Of the 261 lakes examined, ICESat data are available for 111: 74 lakes with ICESat footprints spanning 4-7 years and 37 lakes with footprints spanning 1-3 years. This is the first time precise lake-elevation data have been provided for these 111 lakes. The ICESat elevations can serve as baselines for future changes in lake levels as well as for changes during the 2003-2009 period. Of the 74 lakes (56 of them salt lakes) examined, 62 (84%) of all lakes and 50 (89%) of the salt lakes show a tendency of lake-level increase. The mean rate of lake-level increase is 0.23 m/year for the 56 salt lakes and 0.27 m/year for the 50 salt lakes with rising levels. The largest rate of lake-level increase found in this study (0.80 m/year) is at lake Cedo Caka. The 74 lakes are grouped into four subareas based on geographical location and the tendency of lake-level change. Three of the four subareas show increased lake levels. The mean lake-level change rates for subareas I, II, III, IV and the entire TP are 0.12, 0.26, 0.19, -0.11 and 0.2 m/year, respectively. These recent increases in lake level, particularly for such a high percentage of salt lakes, support accelerated glacier melting due to global warming as the most likely cause.
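The per-lake change rates above come from linear trends fitted to yearly lake elevations. As a minimal sketch of how such a rate can be estimated, assuming hypothetical yearly mean elevations (not values from the study):

```python
import numpy as np

# Hypothetical yearly mean lake elevations (m) from ICESat footprints, 2003-2009.
years = np.array([2003, 2004, 2005, 2006, 2007, 2008, 2009], dtype=float)
elev = np.array([4420.10, 4420.35, 4420.58, 4420.80, 4421.02, 4421.30, 4421.55])

# Least-squares linear trend: the slope is the lake-level change rate in m/year.
rate, intercept = np.polyfit(years, elev, 1)
print(f"lake-level change rate: {rate:.3f} m/year")
```

A positive slope of roughly the magnitude reported for the salt lakes (a few tenths of a metre per year) would indicate a rising lake level.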

Relevance:

10.00%

Publisher:

Abstract:

Globally, areas categorically known to be free of human visitation are rare, but still exist in Antarctica. Such areas may be among the most pristine locations remaining on Earth and, therefore, be valuable as baselines for future comparisons with localities impacted by human activities, and as sites preserved for scientific research using increasingly sophisticated future technologies. Nevertheless, unvisited areas are becoming increasingly rare as the human footprint expands in Antarctica. Therefore, an understanding of historical and contemporary levels of visitation at locations across Antarctica is essential to a) estimate likely cumulative environmental impact, b) identify regions that may have been impacted by non-native species introductions, and c) inform the future designation of protected areas under the Antarctic Treaty System. Currently, records of Antarctic tourist visits exist, but little detailed information is readily available on the spatial and temporal distribution of national governmental programme activities in Antarctica. Here we describe methods to fulfil this need. Using information within field reports and archive and science databases pertaining to the activities of the United Kingdom as an illustration, we describe the history and trends in its operational footprint in the Antarctic Peninsula since c. 1944. Based on this illustration, we suggest that these methodologies could productively be applied more generally.

Relevance:

10.00%

Publisher:

Abstract:

Interlinking text documents with Linked Open Data enables the Web of Data to be used as background knowledge within document-oriented applications such as search and faceted browsing. As a step towards interconnecting the Web of Documents with the Web of Data, we developed DBpedia Spotlight, a system for automatically annotating text documents with DBpedia URIs. DBpedia Spotlight allows users to configure the annotations to their specific needs through the DBpedia Ontology and quality measures such as prominence, topical pertinence, contextual ambiguity and disambiguation confidence. We compare our approach with the state of the art in disambiguation, and evaluate our results in light of three baselines and six publicly available annotation systems, demonstrating the competitiveness of our system. DBpedia Spotlight is shared as open source and deployed as a Web Service freely available for public use.
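The quality measures named above (prominence, topical pertinence, contextual ambiguity, disambiguation confidence) combine a popularity prior with contextual similarity. The toy sketch below is not the Spotlight implementation; it only illustrates the general idea of ranking candidate DBpedia URIs by mixing a hypothetical prominence prior with bag-of-words cosine similarity to the mention context:

```python
import math
from collections import Counter

def cosine(a, b):
    # Cosine similarity between two bag-of-words Counters.
    common = set(a) & set(b)
    num = sum(a[w] * b[w] for w in common)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def rank_candidates(mention_context, candidates):
    # candidates: list of (uri, prominence_prior, context_words).
    # The score mixes a prominence prior with contextual similarity,
    # loosely echoing the quality measures the paper names.
    ctx = Counter(mention_context.lower().split())
    scored = [(prior * (0.1 + cosine(ctx, Counter(words))), uri)
              for uri, prior, words in candidates]
    return [uri for score, uri in sorted(scored, reverse=True)]

# Hypothetical candidates for the ambiguous surface form "Washington":
cands = [
    ("dbpedia:Washington,_D.C.", 0.6,
     ["capital", "city", "united", "states", "government"]),
    ("dbpedia:George_Washington", 0.4,
     ["president", "general", "revolution", "united", "states"]),
]
best = rank_candidates("the president spoke in the united states capital city", cands)[0]
print(best)
```

In practice, the deployed system exposes its confidence threshold as a user-configurable parameter, so downstream applications can trade precision against recall.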

Relevance:

10.00%

Publisher:

Abstract:

The geoid, defined as the equipotential surface that best fits (in the least-squares sense) mean sea level at a particular epoch, is the surface used as the reference for orthometric heights. Given an equipotential reference surface serving as a precise height datum, or local geoid, orthometric heights can be determined efficiently from the ellipsoidal heights provided by the Global Navigation Satellite System (GNSS). One of the important unsolved problems in geodesy is the lack of a global height datum of adequate precision (Sjoberg, 2011). In the absence of a global height datum that would yield absolute values of the geoid undulation with the required precision, geopotential models must be used as an alternative. The recently published EGM2008 model incorporates marked improvements in its three data sources; it contains additional coefficients up to degree 2190 and order 2159 and offers a substantial gain in accuracy (Pavlis et al., 2008). When gravity values and high-quality digital terrain models (DTMs) are available for a given region, it is possible to obtain geopotential surface models that are more precise and of higher resolution than the global models.
The National Geodetic Survey (NGS) of the United States has been developing geoid models for the conterminous United States and its territories since the 1990s, but Puerto Rico and the U.S. Virgin Islands have lagged behind in obtaining higher-precision results from these regional geoid models. The regional geopotential model currently in force for Puerto Rico and the U.S. Virgin Islands is GEOID12A (Roman and Weston, 2012). Given this need, and given the uncertainty about how a geoid model developed exclusively from local gravity data would behave, we set out to develop a gravimetric geoid model to serve as the reference system for orthometric heights. To develop a gravimetric geoid model for the island of Puerto Rico, it was necessary to implement a methodology for analysing and validating the existing terrestrial gravity data. Using altimetric validation with geographic information systems and mathematical validation by collocation with the Gravsoft suite (Tscherning et al., 1994) in its Python version (Nielsen et al., 2012), it was possible to validate 1673 free-air anomaly observations out of a total of 1894 obtained from the database of the Bureau Gravimétrique International (BGI). Applying these methodologies yielded a reliable gravity-anomaly database that can be used for a wide range of applications in science and engineering. Given the low density of existing gravity data, an alternative method was needed to densify the existing free-air anomaly values. Following the methodology proposed by Jekeli et al. (2009b), free-air anomalies were determined from a DTM. These anomalies were adjusted using the validated free-air anomalies and, after a least-squares fit by geographical zone, a uniform grid of DTM-derived free-air anomalies was obtained. After applying topographic corrections, determining the indirect effect of the terrain and removing the contribution of the EGM2008 geopotential model, a grid of residual anomalies was obtained. These residual anomalies were used to determine the gravimetric geoid using several techniques, among them the planar approximation of the Stokes function and the modifications of the Stokes kernel proposed by Wong and Gore (1969), Vanicek and Kleusberg (1987) and Featherstone et al. (1998).
Once the various gravimetric geoid models had been determined, they had to be validated. For this purpose a set of permanent stations of the levelling network of the Puerto Rico Vertical Datum of 2002 (PRVD02) was used, for which ellipsoidal heights and elevations had been published. In the absence of orthometric heights at these stations, elevations obtained from first-order levelling were used to determine the geometric geoid undulations (Roman et al., 2013). After establishing a total of 990 baselines, two analyses were performed to assess the 'precision' of the geoid models. The first analysis examined the differences between increments of the geometric geoid undulation and increments of the geoid undulation from the various models (the gravimetric models, EGM2008 and GEOID12A) as a function of the distance between validation stations; the model with the Stokes-kernel modification proposed by Wong and Gore showed the best 'precision' in 91.1% of the segments analysed. The second analysis considered all 990 baselines and compared the same undulation increments across models; the most 'precise' model was again the geoid with the Wong and Gore kernel modification. In this analysis, the Wong and Gore gravimetric geoid model showed a 'precision' of 0.027 m, compared with 0.031 m for EGM2008 and 0.057 m for the regional model GEOID12A. We conclude that the methodology presented here is adequate, since it produced a gravimetric geoid model that is more 'precise' than the available geopotential models, surpassing even the global geopotential model EGM2008.
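Two standard relations underpin this workflow: the orthometric height obtained from a GNSS ellipsoidal height and a geoid undulation, and the free-air anomaly with the conventional 0.3086 mGal/m gradient. A minimal sketch with illustrative values (none taken from the Puerto Rico data set):

```python
# Standard relations (a sketch; values are purely illustrative):
#   orthometric height:  H = h - N       (ellipsoidal height minus geoid undulation)
#   free-air anomaly:    dg_FA = g_obs - gamma + 0.3086 * h   (mGal, h in metres)

def orthometric_height(h_ellipsoidal_m, geoid_undulation_m):
    # Height above the geoid from a GNSS height and a geoid model value.
    return h_ellipsoidal_m - geoid_undulation_m

def free_air_anomaly_mgal(g_obs_mgal, gamma_mgal, h_m):
    # 0.3086 mGal/m is the standard free-air gradient.
    return g_obs_mgal - gamma_mgal + 0.3086 * h_m

H = orthometric_height(25.40, -42.10)        # GNSS height 25.40 m, N = -42.10 m
dg = free_air_anomaly_mgal(978950.0, 979000.0, 200.0)
print(H, dg)
```

These are the quantities being compared in the validation step: increments of N from levelling plus GNSS (the geometric geoid) against increments of N from each candidate model.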

Relevance:

10.00%

Publisher:

Abstract:

One of the barriers to applying Structural Health Monitoring (SHM) techniques based on guided elastic waves (GLW) to aircraft is the pernicious influence of environmental and operational conditions (EOC). This thesis studies that influence and its compensation, focusing on variations in load state and temperature. The compensation is based on Artificial Neural Networks (ANN) working on experimental data processed with the Chirplet Transform. Changes in geometry and material properties relative to the structure's initial state (damage) alter the guided-wave waveform, producing what is called the damage-sensitive feature (DSF). Signal-processing techniques can seek a relationship between these variations and damage; this is the essence of SHM. However, variations in the EOC also change the acquired GLW data (the DSF), causing errors in damage-diagnosis algorithms, because the signatures of damage and of the EOC in the DSF are of the same order. It is therefore necessary to quantify and compensate the effect of the EOC on the GLW. Several methodologies exist for compensating EOC effects, such as Optimal Baseline Selection (OBS) and Baseline Signal Stretching (BSS), but they are used almost exclusively for thermal compensation. The method proposed in this thesis combines experimental data analysis, as in OBS, with ANN models that replace the physical modelling required by BSS. The experimental analysis applies the Chirplet Transform (CT) to extract the EOC signature from the DSF. With this information, obtained under a range of EOC, an ANN is trained.
The ANN then acts as an interpolator of baselines of the undamaged structure, generating reference information for any EOC. Comparing real DSF measurements against the values simulated by the ANN isolates the damage signature in the DSF, enabling damage diagnosis. This scheme has been applied and verified, under a range of EOC, for a one-dimensional structure with a single damage path and for a structure representative of an aircraft fuselage, with curvature and multiple stiffening elements, subjected to a complex load state and containing multiple damage paths. The EOC effects were studied in detail in the one-dimensional structure and generalised to the fuselage, demonstrating the method's independence from the structural configuration and from the type of sensors used for GLW data acquisition. Moreover, the methodology can be used for the simultaneous compensation of any measurable set of EOC affecting guided-wave data acquisition. The main result of this thesis, among others, is the CT-ANN methodology for EOC compensation in guided-wave-based SHM techniques for damage diagnosis.
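The baseline-interpolation idea can be sketched in a few lines. The thesis trains an ANN on Chirplet Transform features; in the sketch below a polynomial fit over temperature stands in for the trained network, purely to illustrate how subtracting an interpolated undamaged baseline isolates a damage signature:

```python
import numpy as np

# "Undamaged" training data: a damage-sensitive feature (e.g. a time-of-flight
# value) drifts smoothly with temperature. Values are synthetic, for illustration.
temps = np.linspace(10, 50, 21)
dsf_undamaged = 100.0 + 0.05 * temps + 0.001 * temps**2

# Train the baseline interpolator on undamaged measurements
# (a polynomial here; an ANN in the thesis).
coeffs = np.polyfit(temps, dsf_undamaged, 2)

# New measurement at an unseen temperature, with an added damage signature.
t_new, damage_shift = 33.7, 0.8
measured = 100.0 + 0.05 * t_new + 0.001 * t_new**2 + damage_shift

# Compensated residual: measurement minus interpolated baseline ~ damage signature.
residual = measured - np.polyval(coeffs, t_new)
print(f"residual: {residual:.3f}")
```

The key property, as in the thesis, is that the interpolator generates a reference for any EOC value, not only those present in the baseline library.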

Relevance:

10.00%

Publisher:

Abstract:

To test the significance of ultrafast protein folding signals (≪1 msec), we studied cytochrome c (Cyt c) and two Cyt c fragments with major C-terminal segments deleted. The fragments remain unfolded under all conditions and so could be used to define the unfolded baselines for protein fluorescence and circular dichroism (CD) as a function of denaturant concentration. When diluted from high to low denaturant in kinetic folding experiments, the fragments readjust to their new baseline values in a “burst phase” within the mixing dead time. The fragment burst phase reflects a contraction of the polypeptide from a more extended unfolded condition at high denaturant to a more contracted unfolded condition in the poorer, low denaturant solvent. Holo Cyt c exhibits fluorescence and CD burst phase signals that are essentially identical to the fragment signals over the whole range of final denaturant concentrations, evidently reflecting the same solvent-dependent, relatively nonspecific contraction and not the formation of a specific folding intermediate. The significance of fast folding signals in Cyt c and other proteins is discussed in relation to the hypothesis of an initial rate-limiting search-nucleation-collapse step in protein folding [Sosnick, T. R., Mayne, L. & Englander, S. W. (1996) Proteins Struct. Funct. Genet. 24, 413–426].

Relevance:

10.00%

Publisher:

Abstract:

GPS is a system that makes it possible to determine the position of any point on the Earth's surface from which signals from at least four satellites can be received. It consists of a constellation of orbiting satellites that transmit information to GPS receivers on Earth. The collected data, consisting of geographic coordinates and times, are then processed by the receiver itself. In surveying, several operating modes can be adopted, chiefly the static and rapid-static methods, while detail surveys mainly use the kinematic method. GPS can be used for both planimetric and height surveys. In this thesis it was decided to evaluate mainly the use of GPS for heighting in static mode. To do so, a comparison over short distances was carried out between height measurements, first performed with classical surveying instruments and subsequently with the GPS system. The site chosen for this evaluation was the roof terrace of the School of Engineering building in Bologna, on which two reference antennas (SLITTA and UNBO) with GPS receivers are installed, allowing a test baseline to be established. Since the distance between the two antennas is very small, these can be described as short baselines. The GPS system provides data whose precision is affected far less than classical surveying techniques by the propagation of variance with increasing distance, but at the same time it cannot achieve millimetre-level precision even over very short distances. The two survey methodologies and the results obtained from each are described. The data are then analysed and re-processed with software, and only at the end is the comparison carried out, to assess the precision of GPS against the classical surveying instrumentation.
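As a small illustration of the short-baseline notion, the chord length between two receivers can be computed from their Earth-Centred Earth-Fixed (ECEF) coordinates; the coordinates below are hypothetical, not the real SLITTA/UNBO positions:

```python
import math

def baseline_length(ecef_a, ecef_b):
    # Straight-line (chord) distance, in metres, between two GNSS antennas
    # given their ECEF coordinates in metres.
    return math.dist(ecef_a, ecef_b)

# Hypothetical ECEF positions a few metres apart (illustrative values only):
slitta = (4467230.00, 927480.00, 4468850.00)
unbo   = (4467233.00, 927484.00, 4468850.00)
print(f"baseline: {baseline_length(slitta, unbo):.3f} m")
```

For baselines this short, most error sources common to both receivers (orbit, clock, atmospheric delays) cancel almost completely in differential processing, which is what makes the test configuration attractive.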

Relevance:

10.00%

Publisher:

Abstract:

Project management techniques used for control and monitoring throughout a project's life cycle, in their most traditional form, are applied with the triple constraint (scope, time and cost) in mind, seeking full compliance with the baselines established in the planning phases, whether the projects are in Information Technology (IT) or of any other nature. The performance of IT projects can suffer greater impacts owing to their high complexity and intangibility. Combined with the growing body of sustainability studies applied to various project areas, including IT projects, this work's central theme is to identify how IT companies apply sustainability indicators in their information technology project management processes. The general objective is to verify the use of sustainability indicators in IT projects at companies providing information technology services; the specific objectives are: (i) to analyse the existence of sustainability indicators used in IT projects; (ii) to understand the relationship between sustainability indicators and IT projects; and (iii) to verify the contributions of using sustainability indicators in IT projects. The study takes a qualitative, exploratory approach, carried out through a multiple-case study of companies providing Information Technology services.
It was concluded that the organisations do apply sustainability indicators at the organisational level in the management of IT projects; however, none of the cases examined shows indicators specific to IT projects. Such indicators appear only as part of a general set, applied with greater or lesser intensity depending on the nature of the information technology project.

Relevance:

10.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

10.00%

Publisher:

Abstract:

Objective: To evaluate the protective eyewear promotion (PEP) project, a comprehensive educational strategy to increase the use of appropriate protective eyewear by squash players. Methods: An ecological study design was used. Four squash venues in one playing association were randomly chosen to receive PEP, and four in another association maintained usual practice and hence formed a control group. The primary evaluation measurements were surveys of cross-sectional samples of players carried out before and after the intervention. The surveys investigated players' knowledge, behaviours, and attitudes associated with the use of protective eyewear. The post-intervention survey also determined players' exposure to PEP. Univariate and multivariate analyses were undertaken to describe differences at PEP venues from pre- to post-intervention and to compare these with the control venues. Results: The PEP players had 2.4 times the odds (95% confidence interval, 1.3 to 4.2) of wearing appropriate eyewear compared with control-group players post-intervention, relative to the groups' pre-intervention baselines. Components of PEP, such as stickers and posters and the availability and prominent positioning of the project eyewear, were found to contribute to players adopting favourable eyewear behaviours. Conclusions: Components of the PEP intervention were shown to be effective. The true measure of success will be the sustainability and dissemination of the project, of favourable eyewear behaviours, and of evidence of the prevention of eye injuries long into the future.
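The reported effect (odds ratio 2.4, 95% CI 1.3 to 4.2) follows the standard 2×2-table formulas. A minimal sketch with hypothetical counts (not the study's data), using the log-odds standard error for the confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # 2x2 table with hypothetical counts:
    #   a = intervention players wearing eyewear, b = intervention not wearing,
    #   c = control players wearing,              d = control not wearing.
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf method.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(60, 40, 35, 65)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A confidence interval excluding 1.0, as in the study, indicates a statistically significant association between the intervention and eyewear use.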

Relevance:

10.00%

Publisher:

Abstract:

Sediments, mosses and algae, collected from lake catchments of the Larsemann Hills, East Antarctica, were analysed to establish baseline levels of trace metals (Ag, As, Cd, Co, Cr, Cu, Ni, Sb, Pb, Se, V and Zn) and to quantify the extent of trace-metal pollution in the area. Both impacted and non-impacted sites were included in the study. Four different leaching solutions (1 M MgCl2, 1 M CH3COONH4, 1 M NH4NO3, and 0.3 N HCl) were tested on the fine fraction (< 63 μm) of the sediments to extract the mobile fraction of trace metals derived from human impact and from weathering of basement lithologies. Results of these tests indicate that dilute HCl partly dissolves primary minerals present in the sediment, thus leading to an overestimate of the mobile trace-metal fraction. Concentrations of trace metals released using the other three procedures indicate negligible levels of anthropogenic contribution to the trace-metal budget. Data derived from this study and a thorough characterisation of the site allowed the authors to define natural baseline levels of trace metals in sediments, mosses and algae, and their spatial variability across the area. The results show that, with a few notable exceptions, human activities at the research stations have contributed negligible levels (lower than natural variability) of trace metals to the Larsemann Hills ecosystem. This study further demonstrates that anthropogenic sources of trace metals can be correctly identified and quantified only if natural baselines, their variability, and the processes controlling the mobility of trace metals in the ecosystem have been fully characterised.