930 results for digital work environment
Abstract:
In the current context of the information society, new communication technologies, and specifically the Internet, have acquired considerable importance across all social sectors and are forcing a rethinking of mass communication. The features offered by digital media (multimediality, hypertextuality, interactivity) entail a new way of structuring and reading information. News outlets and journalists must therefore update how they produce and transmit information, and users must learn new ways of reading to adapt to the new communication paradigm. At the same time, the welfare society in which we live increasingly demands that general-interest media, and digital media in particular, cover health topics. These media have become principal sources of this kind of information, second only to the physician, for the autonomous management of one's own health. It is from this convergence of digital media, journalism, and health, from the need to explore the current situation, and from the desire to contribute knowledge that improves the professional practice of journalism, that the work presented in this report was developed. The project is a retrospective, descriptive analysis of the health news produced over one month by the online newspapers published in Catalan: El Periódico de Catalunya, Avui, and LaMalla.net.
Abstract:
This paper analyses the accessibility problems posed by scientific articles published in digital format, focusing on the ease of use of their content with respect to the form in which they are published (irrespective of the retrieval system). The two most widely used formats for the publication of scientific articles in digital form, HTML and PDF, are analysed, examining reader performance in relation to the presence of contents lists and internal or linked tables. The study involved two groups: 30 blind subjects, all JAWS users, contacted through the ONCE Foundation, and 30 sighted subjects, lecturers in the Department of Librarianship and Documentation of the University of Barcelona. The results show that locating data in tables is easier in HTML documents when a contents list linked to the tables is included, and that including complete tables in the body of the HTML document facilitates reading by blind users. At the methodological level, this work contributes two novelties with respect to the existing literature on usability studies with blind people: it examines the usability of the PDF format, and it is a quantitative usability test. The latter hinders comparison with the majority of published articles.
Abstract:
The Rebuild Iowa Agriculture and Environment Task Force respectfully submits its report to the Rebuild Iowa Advisory Commission (RIAC) for consideration of the impacts of the tornadoes, storms, high winds, and flooding affecting Iowa’s agriculture sector and environment. The Task Force was required to address very complex and multi-faceted issues. Understanding that there were a broad range of immediate concerns, as well as critical issues that need to be addressed in the future, the Task Force structured its work in two sessions. To better address the issues and priorities of the Task Force, this report categorizes the issues as agriculture, conservation, environment, and livestock.
Abstract:
Centrally located in America's upper Midwest, Iowa lies in the heart of a 12-state region that will have installed an average of 2,701 MW per year through 2014. In 2009 alone, this region, which is within one day's delivery of Iowa, installed turbines valued at $7.8 billion! Once you understand how this exploding growth in the market intersects with the supply chain established by the more than 250 Iowa companies already providing components and services to wind energy manufacturers, you have an outstanding picture of exactly why all major wind manufacturing components are made in Iowa.
Abstract:
Despite numerous discussions, workshops, reviews and reports about the responsible development of nanotechnology, information describing the health and environmental risks of engineered nanoparticles or nanomaterials is severely lacking and thus insufficient for a rigorous risk assessment of their use. However, since preliminary scientific evaluations indicate reasonable suspicions that activities involving nanomaterials might have damaging effects on human health, the precautionary principle must be applied. Public and private institutions as well as industries have a duty to adopt preventive and protective measures proportionate to the risk intensity and the desired level of protection. In this work, we present a practical, 'user-friendly' procedure for university-wide safety and health management of nanomaterials, developed as a multi-stakeholder effort (government, accident insurance, researchers and experts in occupational safety and health). The process starts with a schematic decision tree that classifies the nano laboratory into three hazard classes, similar to a control banding approach (from Nano 3, the highest hazard class, to Nano 1, the lowest). Classifying laboratories into risk classes would require considering actual or potential exposure to the nanomaterial as well as statistical data on the health effects of exposure. Because these data (as well as exposure limits for each individual material) are not available, risk classes could not be determined. For each hazard level we then provide a list of required risk mitigation measures (technical, organizational and personal). The target 'users' of this safety and health methodology are researchers and safety officers. They can rapidly assess the precautionary hazard class of their activities and the corresponding adequate safety and health measures. We have succeeded in convincing scientists engaged in nano-activities that adequate safety measures and management promote innovation and discovery by ensuring a safe environment, even in the case of very novel products. The proposed measures are seen not as constraints but as a support to their research. This methodology is being implemented at the Ecole Polytechnique de Lausanne in over 100 research labs dealing with nanomaterials. In our opinion, it would be useful to other research and academic institutions as well. [Authors]
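A minimal sketch of how such a control-banding decision tree could be encoded; the questions, thresholds and measure lists below are illustrative assumptions, not the actual EPFL scheme:

```python
# Illustrative control-banding classifier: assigns a lab to a
# precautionary hazard class (Nano 3 = highest, Nano 1 = lowest)
# and returns the corresponding mitigation measures.
# The questions, thresholds and measure lists are hypothetical.

from dataclasses import dataclass

@dataclass
class LabActivity:
    handles_free_nanopowders: bool   # unbound, dispersible particles?
    quantity_grams_per_week: float
    work_in_closed_system: bool      # e.g. glovebox, sealed reactor

MEASURES = {
    "Nano 3": ["dedicated ventilated enclosure", "FFP3 respirator",
               "access restriction", "waste inactivation protocol"],
    "Nano 2": ["fume hood for all handling", "nitrile gloves, lab coat",
               "labelling of containers"],
    "Nano 1": ["standard laboratory hygiene", "basic training"],
}

def classify(lab: LabActivity) -> str:
    """Walk a simple decision tree from most to least hazardous band."""
    if (lab.handles_free_nanopowders
            and lab.quantity_grams_per_week > 1
            and not lab.work_in_closed_system):
        return "Nano 3"
    if lab.handles_free_nanopowders:
        return "Nano 2"
    return "Nano 1"

lab = LabActivity(True, 5.0, False)
band = classify(lab)
print(band, "->", MEASURES[band])
```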
Abstract:
The object of this study was to compare the protective action of a new barrier cream (Excipial Protect, Spirig Pharma AG, Egerkingen, Switzerland) with that of its vehicle in the context of hand irritation among apprentice hairdressers caused by repeated shampooing and exposure to hair-care products. This was a double-blind cross-over study comparing Excipial Protect (containing aluminium chlorohydrate 5% as the active ingredient) against its vehicle alone. The efficacy of the creams was evaluated taking into account: (1) clinical scores assigned by the researchers, (2) biometric measurements, and (3) the subjective opinions of the subjects. An analysis of variance was performed considering order of application, degree of atopy, and reported number of shampoos. We observed very little difference in efficacy between the protective cream and its vehicle. The presence of aluminium chlorohydrate in the protective cream was, however, shown to have a positive effect against work-related irritation. To the participants, the cosmetic qualities of the creams seemed as important as their actual protective and hydrating properties, an important factor in compliance.
Abstract:
The region of greatest variability on soil maps lies along the edges of their polygons, causing disagreement among pedologists about the appropriate description of soil classes at these locations. The objective of this work was to propose a data pre-processing strategy for digital soil mapping (DSM). Soil polygons on a training map were shrunk by 100 and 160 m, which prevented covariates located near the edges of the soil classes from being used by the Decision Tree (DT) models. Three DT models, derived from eight predictive covariates related to relief and organism factors and sampled on the original polygons of a soil map and on the polygons shrunk by 100 and 160 m, were used to predict soil classes. The DT model derived from observations 160 m away from the polygon edges of the original map is less complex and has better predictive performance.
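A compact sketch of this pre-processing step, assuming the training map is available as polygons in a projected (metric) CRS; the file and field names are placeholders:

```python
# Shrink soil-map polygons by an inward buffer so training samples
# for the decision tree avoid the uncertain polygon edges.
# "soil_map.shp" and the metric-CRS assumption are placeholders.

import geopandas as gpd

soil = gpd.read_file("soil_map.shp")           # training soil map
assert soil.crs and soil.crs.is_projected      # buffers need metric units

for dist in (100, 160):                        # shrink distances in metres
    shrunk = soil.copy()
    shrunk["geometry"] = shrunk.geometry.buffer(-dist)   # negative buffer
    shrunk = shrunk[~shrunk.geometry.is_empty]  # drop polygons that vanish
    shrunk.to_file(f"soil_map_shrunk_{dist}m.shp")
```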
Abstract:
Digital information generates the possibility of a high degree of redundancy in the data available for fitting the predictive models used in Digital Soil Mapping (DSM). Among these models, the Decision Tree (DT) technique has been increasingly applied owing to its capacity to deal with large datasets. The purpose of this study was to evaluate the impact of the data volume used to generate DT models on the quality of soil maps. An area of 889.33 km² was chosen in the northern region of the State of Rio Grande do Sul. The soil-landscape relationship was obtained from field reconnaissance of the study area and the delineation of the units on the 1:50,000 topographic map. Six predictive covariates linked to the soil-forming factors relief and organisms, together with data sets of 1, 3, 5, 10, 15, 20 and 25 % of the total data volume, were used to generate the predictive DT models in the data mining program Waikato Environment for Knowledge Analysis (WEKA). Sample densities below 5 % resulted in models with a lower power to capture the complexity of the spatial distribution of the soils in the study area. The trade-off between the data volume to be handled and the predictive capacity of the models was best for samples between 5 and 15 %. For the models based on these sample densities, the collected field data indicated a predictive mapping accuracy of close to 70 %.
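The study itself used WEKA; purely as an illustration, the same sample-density experiment can be sketched in Python with scikit-learn (the data file and column names are hypothetical):

```python
# Train decision-tree models on increasing fractions of the covariate
# stack and track accuracy, mimicking the 1-25 % density experiment.
# "covariates.csv" and its "soil_class" column are placeholder assumptions.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

data = pd.read_csv("covariates.csv")       # one row per grid cell
X, y = data.drop(columns="soil_class"), data["soil_class"]

for frac in (0.01, 0.03, 0.05, 0.10, 0.15, 0.20, 0.25):
    # stratified subsample of the full data volume
    X_sub, _, y_sub, _ = train_test_split(
        X, y, train_size=frac, stratify=y, random_state=0)
    model = DecisionTreeClassifier(random_state=0).fit(X_sub, y_sub)
    print(f"{frac:.0%}: accuracy on full grid = "
          f"{accuracy_score(y, model.predict(X)):.2f}")
```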
Abstract:
Digital soil mapping makes it possible to predict patterns of soil occurrence based on reference areas and on data mining techniques that model soil-landscape associations. The objectives of this work were to produce a digital soil map by applying data mining techniques to geomorphometric and geological variables, based on reference areas, and to test the reliability of this map through field validation with different sampling schemes. The mapping was carried out on the Botucatu sheet (SF-22-Z-B-VI-3), using the 1:50,000 Dois Córregos and São Pedro sheets as reference areas. Relief and geology variables associated with the soil map units of the reference areas composed the training data matrix. The matrix was analyzed with the PART decision tree algorithm of the Weka software (Waikato Environment for Knowledge Analysis), which creates classification rules. These rules were applied to the geomorphometric and geological data of the Botucatu sheet to predict soil map units. Field validation of the digital maps was performed by transect sampling in one map unit of the São Pedro sheet and by stratified random sampling on the Botucatu sheet. The evaluation of the map unit on the São Pedro sheet showed reliabilities of 83 and 66 %, respectively, for the digital soil map and for the traditional soil map with a simplified legend. Although rules were generated for all soil map units of the training areas, not all map units were predicted on the Botucatu sheet, a result of the differences in relief and geology between the training and mapping areas. Field validation of the digital map of the Botucatu sheet showed an overall accuracy of 52 %, compatible with low-intensity reconnaissance surveys, and a kappa of 0.41, indicating good quality. More extensive map units generated more rules, better reproducing the soil-relief patterns in the area to be mapped. Transect validation on the São Pedro sheet indicated that the digital map was compatible with high-intensity reconnaissance surveys and that the traditional map, after simplification of its legend, was compatible with low-intensity reconnaissance surveys. Training the algorithm on maps rather than on point observations reduced the accuracy of the digital soil map of the Botucatu sheet by 14 %. Stratified random sampling by Latin hypercube is appropriate for mapping projects with an extensive database, allowing the map to be evaluated as a whole and making fieldwork more efficient. Transect sampling is suited to evaluating the purity of individual map units; it does not require a detailed database and allows the study of soil-landscape associations along pedosequences.
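For reference, the two validation statistics reported above (overall accuracy and kappa) follow directly from the confusion matrix of predicted versus observed map units; a minimal sketch with illustrative counts:

```python
# Overall accuracy and Cohen's kappa from a confusion matrix of
# predicted vs. observed soil map units (the counts are illustrative).

import numpy as np

cm = np.array([[30,  5,  2],    # rows: observed unit, cols: predicted
               [ 6, 22,  4],
               [ 3,  7, 21]])

n = cm.sum()
po = np.trace(cm) / n                         # overall (observed) accuracy
pe = (cm.sum(0) * cm.sum(1)).sum() / n**2     # chance agreement
kappa = (po - pe) / (1 - pe)
print(f"overall accuracy = {po:.2f}, kappa = {kappa:.2f}")
```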
Abstract:
General Summary Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. This thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, the last three chapters, is related to economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might enhance productivity. The last chapter is not about how to better understand the world but how to measure it, and it was just a great pleasure to work on. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much, and how fast, did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize the results in Google Earth. A short summary of each of the five chapters is provided below. The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH, comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE, comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank, and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries, which can be classified into 29 Southern and 19 Northern countries, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and of similar magnitude. However, when looking at world trade, the effects become very small because of the high share of North-North trade, for which we have no a priori expectations about the signs of these effects. Popular fears about the trade effects of differences in environmental regulations may therefore be exaggerated. The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". First we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labor that vary across countries, periods and manufacturing sectors. We then use these original data (covering 31 developed and 31 developing countries) to decompose worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition).
We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes. Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. In a first experiment, we next construct a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (neglecting price effects) to compute a static first-order trade effect. This effect increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, it is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible level of SO2 emissions. These hypothetical levels of emissions are obtained by reallocating labour across sectors within each country (under country-employment and world industry-production constraints). Using linear programming techniques, we show that emissions are 90% lower than in the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings of this chapter agree with those of chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", is a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework of Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension. This allows us to write present productivity formally as a function of past productivity and other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Dynamic panel techniques allow us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. Using data at the sectoral level, it appears that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions.
First, it provides new estimates of orders of magnitudes for the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
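The center-of-gravity computation in the final chapter applies the physical center of mass to geo-located output; a minimal sketch of the idea, with made-up city coordinates and weights:

```python
# Economic center of gravity: weighted center of mass of city
# locations on the sphere, projected back to the surface.
# The three (lat, lon, weight) rows are purely illustrative.

import numpy as np

cities = np.array([[48.9,   2.3, 3.0],   # latitude, longitude, weight
                   [40.7, -74.0, 4.0],
                   [31.2, 121.5, 5.0]])

lat = np.radians(cities[:, 0])
lon = np.radians(cities[:, 1])
w = cities[:, 2]

# unit position vectors on the sphere
xyz = np.column_stack([np.cos(lat) * np.cos(lon),
                       np.cos(lat) * np.sin(lon),
                       np.sin(lat)])
cg = (xyz * w[:, None]).sum(axis=0) / w.sum()   # center of mass (inside Earth)
cg /= np.linalg.norm(cg)                        # project back to the surface

print("lat = %.1f deg, lon = %.1f deg" %
      (np.degrees(np.arcsin(cg[2])), np.degrees(np.arctan2(cg[1], cg[0]))))
```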
Abstract:
Soil properties have an enormous impact on the economic and environmental aspects of agricultural production. Quantitative relationships between soil properties and the factors that influence their variability are the basis of digital soil mapping. The predictive models of soil properties evaluated in this work are statistical (multiple linear regression, MLR) and geostatistical (ordinary kriging and co-kriging). The study was conducted in the municipality of Bom Jardim, RJ, using a soil database with 208 sampling points. Predictive models were evaluated for the sand, silt and clay fractions, pH in water and organic carbon at six depths, according to the specifications of the global digital soil mapping consortium (GlobalSoilMap). Continuous covariates and categorical predictors were used and their contributions to the models assessed. Only the environmental covariates elevation, aspect, stream power index (SPI), soil wetness index (SWI), normalized difference vegetation index (NDVI), and the b3/b2 band ratio were significantly correlated with soil properties. The predictive models had a mean coefficient of determination of 0.21. The best results were obtained with the geostatistical models, where the highest coefficient of determination, 0.43, was obtained for the sand fraction at depths of 60 to 100 cm. A sparse data set of soil properties can thus explain only part of the spatial variation of these properties in digital mapping. The results may be related to the sampling density and to the quantity and quality of the environmental covariates and predictive models used.
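As an illustration of the two model families compared, a minimal sketch; the kriging step uses the pykrige package here, and the sample file and covariate names are assumptions:

```python
# Compare a multiple linear regression (MLR) on environmental
# covariates with ordinary kriging of the same soil property.
# "samples.csv" and its columns (x, y, clay, elevation, ndvi, swi)
# are hypothetical placeholders.

import pandas as pd
from sklearn.linear_model import LinearRegression
from pykrige.ok import OrdinaryKriging

pts = pd.read_csv("samples.csv")               # field observations

# MLR: soil property as a linear function of terrain/spectral covariates
covs = ["elevation", "ndvi", "swi"]
mlr = LinearRegression().fit(pts[covs], pts["clay"])
print("MLR R^2:", mlr.score(pts[covs], pts["clay"]))

# Ordinary kriging: purely spatial interpolation of the same property
ok = OrdinaryKriging(pts["x"], pts["y"], pts["clay"],
                     variogram_model="spherical")
z_pred, z_var = ok.execute("points", pts["x"].values, pts["y"].values)
```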
Abstract:
ABSTRACT In recent years, geotechnologies such as remote and proximal sensing, together with attributes derived from digital elevation models, have proven very useful for describing soil variability. However, these information sources are rarely used together. Therefore, a methodology for assessing and spatializing soil classes using information obtained from remote/proximal sensing, GIS and expert knowledge was applied and evaluated. Two study areas in the State of São Paulo, Brazil, totaling approximately 28,000 ha, were used for this work. First, in one area (area 1), conventional soil mapping was carried out, and from the soil classes found, patterns were derived from the following information: a) spectral information (the shape of features and the absorption intensity of spectral curves in the 350-2,500 nm range) of soil samples collected at specific points in the area (according to each soil type); b) equations for determining chemical and physical soil properties, obtained by relating the laboratory results for chemical and physical attributes to the spectral data; c) supervised classification of Landsat TM 5 images, in order to detect changes in soil particle size (soil texture); d) relationships between relief attributes and soil classes. The patterns thus obtained were then applied in area 2 to derive the soil classification, but within a GIS (ArcGIS). Finally, a conventional soil map of area 2 was produced and compared with the digital map, i.e. the one obtained using only the predetermined patterns. The proposed methodology achieved 79 % accuracy at the first categorical level of the Soil Classification System, 60 % accuracy at the second level, and proved less useful at the third level (37 % accuracy).
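A minimal sketch of step (c), the supervised image classification, with a random forest standing in for the unspecified classifier; file names and label conventions are placeholders:

```python
# Supervised classification of a Landsat TM band stack to map
# soil-texture patterns; file names and labels are placeholders,
# and the random forest is a stand-in for the classifier used.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

bands = np.load("landsat_tm5_stack.npy")    # shape: (n_bands, rows, cols)
labels = np.load("training_labels.npy")     # 0 = unlabelled, 1..k = class

X = bands.reshape(bands.shape[0], -1).T     # one row of features per pixel
train = labels.ravel() > 0                  # pixels with training labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X[train], labels.ravel()[train])
texture_map = clf.predict(X).reshape(labels.shape)
```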
Abstract:
Objective Exposure to bioaerosols in the occupational environment of sawmills may be associated with a wide range of health effects, in particular respiratory impairment, allergy and organic dust toxic syndrome. The objective of the study was to assess the frequency of respiratory and general medical symptoms and their relation to bioaerosol exposure. Method Twelve sawmills in the French-speaking part of Switzerland were investigated, and the relationship between levels of bioaerosols (wood dust, airborne bacteria, airborne fungi and endotoxins), medical symptoms and impaired lung function was explored. A health questionnaire was distributed to 111 sawmill workers. Results The concentration of airborne fungi exceeded the limit recommended by the Swiss National Insurance (SUVA) in all twelve sawmills. This elevated fungal level significantly influenced the occurrence of bronchial syndrome (defined by cough and expectoration). No other health effects (irritations or respiratory effects) could be associated with the measured exposures. We observed that junior workers showed significantly more irritation syndrome (defined by itching/running nose, snoring and itching/red eyes) than senior workers. Lung function tests were influenced neither by bioaerosol levels nor by dust exposure levels. Conclusion The results suggest that occupational exposure to wood dust in Swiss sawmills does not promote a clinically relevant decline in lung function. However, the occurrence of bronchial syndrome is strongly influenced by airborne fungi levels. [Authors]
Abstract:
This work compares the detector performance and image quality of the new Kodak Min-R EV mammography screen-film system with those of the Fuji CR Profect detector and of other current mammography screen-film systems from Agfa, Fuji and Kodak. Basic image quality parameters (MTF, NPS, NEQ and DQE) were evaluated for a 28 kV Mo/Mo beam (HVL = 0.646 mm Al) at different mAs exposure settings. Compared with the other screen-film systems, the new Kodak Min-R EV detector has the highest contrast and a low intrinsic noise level, giving better NEQ and DQE results, especially at high optical density. The properties of the new mammography film thus approach those of a fine mammography detector, especially in the low frequency range. Screen-film systems provide the best resolution. The presampling MTF of the digital detector has a value of 15% at the Nyquist frequency and, owing to the spread of the laser beam, a smaller pixel size would not significantly improve the detector resolution. The dual-collection reading technology significantly increases the low-frequency DQE of the Fuji CR system, which can now compete with the most efficient mammography screen-film systems.
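For reference, the figures of merit named here are linked by a standard relation; with q the incident photon fluence and NNPS the normalized noise power spectrum, one common formulation is:

```latex
\mathrm{NEQ}(f) = \frac{\mathrm{MTF}^{2}(f)}{\mathrm{NNPS}(f)},
\qquad
\mathrm{DQE}(f) = \frac{\mathrm{NEQ}(f)}{q}
               = \frac{\mathrm{MTF}^{2}(f)}{q\,\mathrm{NNPS}(f)}
```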
Abstract:
The goal of this work is to develop a method for objectively comparing the performance of a digital and a screen-film mammography system in terms of image quality. The method takes into account the dynamic range of the image detector, the detection of high- and low-contrast structures, the visualisation of the images and the observer response. A test object, designed to represent a compressed breast, was constructed from various tissue-equivalent materials ranging from purely adipose to purely glandular composition. Different areas within the test object permitted the evaluation of low- and high-contrast detection, spatial resolution and image noise. All the images (digital and conventional) were captured with a CCD camera so as to include the visualisation process in the image quality assessment. A mathematical model observer (non-prewhitening matched filter), which calculates the detectability of high- and low-contrast structures from spatial resolution, noise and contrast, was used to compare the two technologies. Our results show that, for a given patient dose, the detection of high- and low-contrast structures is significantly better with the digital system than with the conventional screen-film system studied. The method of using a test object with a large range of tissue compositions, combined with a camera, to compare conventional and digital imaging modalities can be applied to other radiological imaging techniques. In particular, it could be used to optimise the process of radiographic reading of soft-copy images.
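The non-prewhitening (NPW) matched filter mentioned here combines resolution, noise and contrast into a single detectability index; one standard frequency-domain formulation (an assumption, since the paper's exact expression is not given; here ΔS is the expected signal-difference spectrum and NPS the noise power spectrum) is:

```latex
{d'}_{\mathrm{NPW}}^{2} =
\frac{\left[\,\int \lvert \Delta S(f)\rvert^{2}\,\mathrm{MTF}^{2}(f)\,df\,\right]^{2}}
     {\int \lvert \Delta S(f)\rvert^{2}\,\mathrm{MTF}^{2}(f)\,\mathrm{NPS}(f)\,df}
```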