998 results for Surface Web


Relevance:

70.00%

Publisher:

Abstract:

In the twenty-first century, where the new Information and Communication Technologies are the order of the day, phenomena that traditionally belonged to less virtual environments now occur online, reaching previously unstudied age groups in which gender differentiation is evident. Given the ease of access and connection to the Internet, many of us have sufficient tools to post extreme emotions, as well as suicidal ideations, on social networks. However, there are deeper locations, unknown to some users, such as the Deep Web (and its Tor browser), which allow the user complete anonymity. Hence the need arises to build a corpus of suicide-related messages and messages expressing deep emotions, so as to analyze its lexicon through computational language processing, with a prior categorization of the results, in order to foster the creation of programs that detect these manifestations and play a preventive role.
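As a sketch of the lexical analysis the proposed corpus would support, the following Python snippet counts hits from a small category-labelled lexicon per message; the lexicon entries, category names, and example message are invented placeholders, not corpus data.

```python
# Hypothetical category-labelled lexicon; real entries would come from
# the categorized corpus described in the abstract.
from collections import Counter
import re

lexicon = {"hopeless": "despair", "goodbye": "farewell", "alone": "isolation"}

def categorize(message: str) -> Counter:
    """Count lexicon categories appearing in a message."""
    tokens = re.findall(r"[a-záéíóúñü]+", message.lower())
    return Counter(lexicon[t] for t in tokens if t in lexicon)

print(categorize("I feel so alone and hopeless tonight"))
# Counter({'isolation': 1, 'despair': 1})
```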

Relevance:

70.00%

Publisher:

Abstract:

Work based on the report for the course “Sociologia das Novas Tecnologias de Informação” (Sociology of the New Information Technologies), within the Integrated Master's in Industrial Engineering and Management at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa, in 2015-16. The work was supervised by Prof. António Brandão Moniz of the Departamento de Ciências Sociais Aplicadas (DCSA) at the same faculty.

Relevance:

60.00%

Publisher:

Abstract:

Directed research project (travail dirigé) presented to the Faculté des études supérieures et postdoctorales in partial fulfilment of the requirements for the degree of Master of Science (M.Sc.) in Criminology, Criminalistics and Information option.

Relevance:

30.00%

Publisher:

Abstract:

The solution structure of robustoxin, the lethal neurotoxin from the Sydney funnel-web spider Atrax robustus, has been determined from 2D 1H NMR data. Robustoxin is a polypeptide of 42 residues cross-linked by four disulphide bonds, the connectivities of which were determined from NMR data and trial structure calculations to be 1-15, 8-20, 14-31 and 16-42 (a 1-4/2-6/3-7/5-8 pattern). The structure consists of a small three-stranded, anti-parallel beta-sheet and a series of interlocking gamma-turns at the C-terminus. It also contains a cystine knot, thus placing it in the inhibitor cystine knot motif family of structures, which includes the omega-conotoxins and a number of plant and animal toxins and protease inhibitors. Robustoxin contains three distinct charged patches on its surface, and an extended loop that includes several aromatic and non-polar residues. Both of these structural features may play a role in its binding to the voltage-gated sodium channel. (C) 1997 Federation of European Biochemical Societies.
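As an aside, the 1-4/2-6/3-7/5-8 notation can be reproduced from the residue numbers quoted above by ranking the eight half-cystines in sequence order; a minimal Python sketch:

```python
# Disulphide bonds by residue number, as reported in the abstract.
bonds = [(1, 15), (8, 20), (14, 31), (16, 42)]

# The knot notation indexes cysteines by their order in the sequence,
# not by residue number.
order = {pos: i + 1 for i, pos in enumerate(sorted(p for b in bonds for p in b))}

pattern = "/".join(f"{order[a]}-{order[b]}" for a, b in bonds)
print(pattern)  # -> 1-4/2-6/3-7/5-8
```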

Relevance:

30.00%

Publisher:

Abstract:

This work is devoted to the problem of reconstructing the basis weight structure of the paper web with black-box techniques. The data analyzed come from a real paper machine and were collected by an off-line scanner. The principal mathematical tool used in this work is Autoregressive Moving Average (ARMA) modelling. When coupled with the Discrete Fourier Transform (DFT), it gives a very flexible and interesting tool for analyzing properties of the paper web. Both ARMA and DFT are used independently to represent the given signal in a simplified version of our algorithm, but the final goal is to combine the two. The Ljung-Box Q-statistic lack-of-fit test, combined with the Root Mean Squared Error coefficient, gives a tool to separate significant signals from noise.
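A minimal sketch of the ARMA-plus-Ljung-Box workflow described above, using statsmodels on a synthetic signal; the model order, lag choice, and test data are assumptions, not the thesis material.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

# Synthetic stand-in for a scanned basis-weight profile.
rng = np.random.default_rng(0)
t = np.arange(1024)
signal = np.sin(2 * np.pi * t / 64) + 0.5 * rng.standard_normal(t.size)

# ARMA(2, 1) fitted as ARIMA with d = 0.
fit = ARIMA(signal, order=(2, 0, 1)).fit()

# Ljung-Box lack-of-fit test on the residuals: small p-values mean
# autocorrelation remains, i.e. significant structure was missed.
print(acorr_ljungbox(fit.resid, lags=[20]))
print("RMSE:", np.sqrt(np.mean(fit.resid ** 2)))
```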

Relevance:

30.00%

Publisher:

Abstract:

Various strength properties of paper are measured to indicate how well it resists breaks in a paper machine or in printing presses. The most often measured properties are dry tensile strength and dry tear strength. However, in many situations where paper breaks, it is not dry; for example, in web breaks after wet pressing the dry matter content can be around 45%. Thus, wet-web strength is often a more critical paper property than dry strength. Both wet and dry strength properties of the samples were measured with an L&W tensile tester. Originally, this device was not designed for measuring wet-web tensile strength, so a new procedure for handling wet samples was developed. The method was tested with never-dried pine kraft pulp. The effect of different strength additives on the wet-web and dry paper tensile strength was studied. The polymers used in this experiment were an aqueous solution of a cationic polyamidoamine-epichlorohydrin resin (PAE), a cationic hydrophilised polyisocyanate, and a cationic polyvinylamine (PVAm). Of the three chemicals used, only cationic PAE considerably increased the wet-web strength. However, at constant solids content, all of the chemicals decreased the wet-web tensile strength. Since all of the chemicals increased the solids content, it can be concluded that they act as drainage aids rather than as wet-web strength additives. Of the chemicals, only PVAm increased the dry strength; the other two even decreased it. As the chemicals were used in strongly diluted form and were added to the pulp slurry rather than applied to the surface of the paper sheets, the density of the samples did not change. It should also be noted that all of these chemicals are mainly used to improve wet strength after the web has been dried.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis was to identify the effects of different factors on the tension and tension relaxation of wet paper webs after high-speed straining. The study was motivated by the plausible connection, shown by previous studies, between the mechanical properties of the wet web and its runnability on paper machines. The mechanical properties of wet paper were examined using a fast tensile test rig with a strain rate of 1000%/s. Most of the tests were carried out with laboratory handsheets, but samples from a pilot paper machine were also used. The tension relaxation of paper was evaluated as the tension remaining after 0.475 s of relaxation (residual tension). The tensile and relaxation properties of wet webs were found to be strongly dependent on the quality and amount of fines. At low fines content, the tensile strength and residual tension of wet paper were mainly determined by the mechanical interactions between fibres at their contact points. As fines strengthen the mechanical interaction in the network, the fibre properties also become important. Fibre deformations caused by the mechanical treatment of pulp were shown to reduce the mechanical properties of both dry and wet paper; however, the effect was significantly greater for wet paper. An increase of filler content from 10% to 25% greatly reduced the tensile strength of dry paper, but did not significantly impair wet-web tensile strength or residual tension. Increased filler content was shown to increase the dryness of the wet web after the press section, which partly compensates for the reduction of fibrous material in the web. It is also plausible that fillers increase entanglement friction between fibres, which is beneficial for wet-web strength. Different contaminants present in the white water during sheet formation lowered the surface tension and increased dryness after wet pressing. The addition of these contaminants reduced the tensile strength of the dry paper. This reduction could not be explained by the reduced surface tension, but rather by the tendency of the contaminants to interfere with inter-fibre bonding. Moreover, wet-web strength was not affected by the changes in the surface tension of the white water or by possible changes in the hydrophilicity of the fibres caused by the contaminants. Spraying different polymers on the wet paper before wet pressing had a significant effect on both dry and wet-web tensile strength, whereas the elastic modulus and residual tension of the wet web were essentially unaffected. We suggest that the increase in dry and wet paper strength may arise from molecular-level interactions between these chemicals and the fibres. The most significant increases in dry and wet paper strength were achieved with a dual application of anionic and cationic polymers. Furthermore, selectively adding papermaking chemicals to different fibre fractions (as opposed to adding them to the whole pulp) improved the wet-web mechanical properties and the drainage of the pulp suspension.
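To make the residual-tension metric concrete, a small sketch that reads the tension remaining 0.475 s into relaxation off a tension-time trace; the trace itself is synthetic, not measured data.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 2001)               # seconds after straining stops
tension = 150.0 * np.exp(-t / 0.4) + 80.0     # N/m, toy relaxation curve

residual = float(np.interp(0.475, t, tension))
print(f"residual tension after 0.475 s: {residual:.1f} N/m")
```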

Relevance:

30.00%

Publisher:

Abstract:

Sea surface temperature (SST) measurements are required by operational ocean and atmospheric forecasting systems to constrain modeled upper ocean circulation and thermal structure. The Global Ocean Data Assimilation Experiment (GODAE) High Resolution SST Pilot Project (GHRSST-PP) was initiated to address these needs by coordinating the provision of accurate, high-resolution, SST products for the global domain. The pilot project is now complete, but activities continue within the Group for High Resolution SST (GHRSST). The pilot project focused on harmonizing diverse satellite and in situ data streams that were indexed, processed, quality controlled, analyzed, and documented within a Regional/Global Task Sharing (R/GTS) framework implemented in an internationally distributed manner. Data with meaningful error estimates developed within GHRSST are provided by services within R/GTS. Currently, several terabytes of data are processed at international centers daily, creating more than 25 gigabytes of product. Ensemble SST analyses together with anomaly SST outputs are generated each day, providing confidence in SST analyses via diagnostic outputs. Diagnostic data sets are generated and Web interfaces are provided to monitor the quality of observation and analysis products. GHRSST research and development projects continue to tackle problems of instrument calibration, algorithm development, diurnal variability, skin temperature deviation, and validation/verification of GHRSST products. GHRSST also works closely with applications and users, providing a forum for discussion and feedback between SST users and producers on a regular basis. All data within the GHRSST R/GTS framework are freely available. This paper reviews the progress of GHRSST-PP, highlighting achievements that have been fundamental to the success of the pilot project.
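A hedged sketch of what a daily ensemble SST analysis with anomaly output might look like numerically; the grids, climatology, and product count are illustrative assumptions, not GHRSST specifics.

```python
import numpy as np

# Stack of gridded SST analyses from different centres (kelvin),
# shape (n_products, lat, lon); random stand-ins here.
rng = np.random.default_rng(1)
analyses = 293.0 + rng.standard_normal((5, 180, 360))

ensemble_median = np.median(analyses, axis=0)  # robust central estimate
spread = analyses.std(axis=0)                  # inter-product uncertainty

climatology = np.full((180, 360), 292.5)       # assumed reference field
anomaly = ensemble_median - climatology
print(anomaly.mean(), spread.mean())
```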

Relevance:

30.00%

Publisher:

Abstract:

The use of Geographic Information Systems (GIS) has become very important in fields where the detailed and precise study of earth surface features is required. Applications in environmental protection are one such example, requiring GIS tools for analysis and decision-making by managers and the communities involved in protected areas. In this specific field, a remaining challenge is to build a GIS that can be fed with data dynamically, allowing researchers and other agents to retrieve current, up-to-date information. In some cases, data are acquired in several ways and come from different sources. To address this problem, a set of tools was implemented that includes a model for spatial data treatment on the Web. The research issues involved start with the feeding and processing of environmental monitoring data collected in loco, such as biotic and geological variables, and finish with the presentation of all information on the Web. For this dynamic processing, tools were developed that make MapServer more flexible and dynamic, allowing data to be uploaded by the appropriate users. Furthermore, a module was developed that uses interpolation to support spatial data analysis. A complex application that validated this research feeds the system with data from coral reef regions in the northeast of Brazil. The system was implemented following the interactivity concept of the AJAX model, substantially improving access to information and serving as an essential mechanism for tracking events in environmental monitoring.
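The abstract does not name the interpolation method, so as an assumption the sketch below uses inverse distance weighting (IDW), a common choice for scattered environmental observations.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at query points."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Scattered measurements, e.g. a biotic variable at reef stations.
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
obs = np.array([10.0, 12.0, 11.0, 15.0])
grid = np.array([[0.5, 0.5], [0.25, 0.75]])
print(idw(stations, obs, grid))
```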

Relevance:

30.00%

Publisher:

Abstract:

[ES] The aim of this project was to develop a Web 2.0 application capable of managing routes regardless of their type or other characteristics. The user of the application built from this project can create routes and modify them. For visualization, a 3D environment was chosen that displays the globe realistically: Google Earth, in its browser plug-in version. The variety of routes that can be created is very wide, for example: the stages of the Vuelta Ciclista a España, the course of a sailing competition, the route Columbus followed on his first voyage to America, or the caminos reales of the Canary Islands. Routes are created by moving the mouse and clicking on the surface of the globe. The application displays a line through the indicated points as the representation of the route. For the routes created, the user can add a series of multimedia elements that provide a better understanding of those routes. The application allows photos, web videos, documents and 3D elements to be added. When the user chooses to display a previously created route to which one of these elements has been added, it is shown in the browser. Besides managing routes, the user can also manage places of interest in a simple way. The user can add locations of various kinds to a library of elements for better access to favourite places. The places of interest created appear on the globe as an icon together with their name.
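A clicked route like the ones described is naturally serialized as KML, which the Google Earth plug-in can display; the sketch below is a hedged illustration (coordinates and file name invented), not the project's actual code.

```python
# Longitude/latitude pairs collected as the user clicks on the globe.
coords = [(-15.41, 28.10), (-15.43, 28.12), (-15.45, 28.15)]

coord_str = " ".join(f"{lon},{lat},0" for lon, lat in coords)
kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Example route</name>
    <LineString>
      <tessellate>1</tessellate>
      <coordinates>{coord_str}</coordinates>
    </LineString>
  </Placemark>
</kml>"""

with open("route.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```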

Relevance:

30.00%

Publisher:

Abstract:

Routine bridge inspections require labor-intensive and highly subjective visual interpretation to determine bridge deck surface condition. Light Detection and Ranging (LiDAR), a relatively new class of survey instrument, has become a popular and increasingly used technology for providing as-built and inventory data in civil applications. While an increasing number of private and governmental agencies possess terrestrial and mobile LiDAR systems, an understanding of the technology's capabilities and potential applications continues to evolve. LiDAR is a line-of-sight instrument and, as such, care must be taken when establishing scan locations and resolution to allow the capture of data at an adequate resolution for defining features that contribute to the analysis of bridge deck surface condition. Information such as the location, area, and volume of spalling on deck surfaces, undersides, and support columns can be derived from properly collected LiDAR point clouds. The LiDAR point clouds contain quantitative surface condition information, enabling more accurate structural health monitoring. LiDAR scans were collected at three study bridges, each of which displayed a varying degree of degradation. A variety of commercially available analysis tools and an independently developed algorithm written in ArcGIS Python (ArcPy) were used to locate and quantify surface defects such as the location, volume, and area of spalls. The results were visually and numerically displayed in a user-friendly, web-based decision support tool integrating prior bridge condition metrics for comparison. LiDAR data processing procedures, along with the strengths and limitations of point clouds for defining features useful for assessing bridge deck condition, are discussed. Point cloud density and incidence angle are two attributes that must be managed carefully to ensure the data collected are of high quality and useful for bridge condition evaluation. When collected properly, LiDAR data can be analyzed to provide a useful data set from which to derive bridge deck condition information.
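A schematic sketch of spall quantification from a deck-surface point cloud: fit a reference plane, flag points well below it, and bin them onto a grid for area and volume. The thresholds, grid size, and function name are assumptions; the study itself used commercial tools and an ArcPy routine.

```python
import numpy as np

def detect_spalls(points, depth_thresh=0.01, cell=0.05):
    """points: (n, 3) array of x, y, z in metres on the deck surface."""
    xy, z = points[:, :2], points[:, 2]
    # Least-squares plane z = a*x + b*y + c as the intact deck reference.
    A = np.c_[xy, np.ones(z.size)]
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    depth = A @ coeffs - z                  # positive where surface is low
    spall = depth > depth_thresh
    # Rough area: count occupied grid cells; rough volume: mean depth * area.
    cells = {tuple(np.floor(p / cell).astype(int)) for p in xy[spall]}
    area = len(cells) * cell ** 2
    volume = float(depth[spall].mean()) * area if cells else 0.0
    return area, volume
```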

Relevance:

30.00%

Publisher:

Abstract:

Off-site effects of soil erosion are becoming increasingly important, particularly the pollution of surface waters. In order to develop environmentally efficient and cost-effective mitigation options, it is essential to identify areas that bear both a high erosion risk and high connectivity to surface waters. This paper introduces a simple risk-assessment tool that allows the delineation of potential critical source areas (CSA) of sediment input into surface waters across the agricultural areas of Switzerland. The basis is the erosion risk map with a 2 m resolution (ERM2) and the drainage network, which is extended by drained roads, farm tracks, and slope depressions. The probability of hydrological and sedimentological connectivity is assessed by combining soil erosion risk and the extended drainage network with flow distance calculation. A GIS environment with multiple-flow accumulation algorithms is used for routing runoff generation and flow pathways. The result is a high-resolution connectivity map of the agricultural area of Switzerland (888,050 ha). Fifty-five percent of the computed agricultural area is potentially connected with surface waters; 45% is not connected. Surprisingly, the larger part, 34% (62% of the connected area), is indirectly connected with surface waters through drained roads, and only 21% is directly connected. The reason is the topographic complexity and patchiness of the landscape due to a dense road and drainage network. A total of 24% of the connected area (13% of the computed agricultural area) is rated with a high connectivity probability. On these CSA, adapted land use is recommended, supported by vegetated buffer strips that intercept sediment load. Even areas that are far away from open water bodies can be indirectly connected and need to be included in the planning of mitigation measures. The connectivity map presented is thus an important decision-making tool for policy-makers and extension services. The map is published on the web and is thus available for application.
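A toy sketch of the rating idea: combine an erosion-risk raster with distance to the extended drainage network to flag potential CSA. The thresholds and the simple distance proxy are assumptions; the study used multiple-flow accumulation routing in a GIS environment.

```python
import numpy as np

rng = np.random.default_rng(2)
erosion_risk = rng.random((100, 100))                         # 0..1 risk raster
dist_to_drain = np.abs(np.indices((100, 100))[1] - 50) * 2.0  # m, toy proxy

connected = dist_to_drain < 100.0           # near the extended drainage network
csa = connected & (erosion_risk > 0.7)      # high risk and connected

print(f"{connected.mean():.0%} connected, {csa.mean():.0%} rated as CSA")
```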

Relevance:

30.00%

Publisher:

Abstract:

The physicochemical properties of the sea surface microlayer (SML), i.e. the boundary layer between the air and the sea, and its impact on air-sea exchange processes have been investigated for decades. However, a detailed description of these processes remains incomplete. In order to obtain a better chemical characterization of the SML, in a case study three pairs of SML and corresponding bulk-water samples were taken in the southern Baltic Sea. The samples were analyzed for dissolved organic carbon and dissolved total nitrogen, as well as for several organic nitrogen-containing compounds and carbohydrates, namely aliphatic amines, dissolved free amino acids, dissolved free monosaccharides, sugar alcohols, and monosaccharide anhydrates. To this end, suitable analytical procedures for desalting and enrichment were established. All aliphatic amines and the majority of the investigated amino acids (11 out of 18) were found in the samples, with average concentrations between 53 ng/l and 1574 ng/l. The concentrations of carbohydrates were slightly higher, averaging 2900 ng/l. Calculation of the enrichment factor (EF) between the sea surface microlayer and the bulk water showed that dissolved total nitrogen was more enriched in the SML (EF: 1.1 and 1.2) than dissolved organic carbon (EF: 1.0 and 1.1). The nitrogen-containing organic compounds were generally found to be enriched in the SML (EF: 1.9-9.2), whereas dissolved carbohydrates were not enriched or were even depleted (EF: 0.7-1.2). Although the investigated compounds contributed on average only 0.3% to the dissolved organic carbon and 0.4% to the total dissolved nitrogen fraction, these results underline the importance of single-compound analysis for determining SML structure and function and the SML's potential for transferring compounds into the atmosphere.
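The enrichment factor used above is simply the ratio of a compound's SML concentration to its bulk-water concentration (EF > 1 means enrichment, EF < 1 depletion); a one-function sketch with invented numbers:

```python
def enrichment_factor(c_sml: float, c_bulk: float) -> float:
    """EF = concentration in the surface microlayer / concentration in bulk."""
    return c_sml / c_bulk

# Illustrative values only, not the measured Baltic Sea data.
print(enrichment_factor(1574.0, 830.0))  # ~1.9, i.e. enriched in the SML
```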