952 results for "Data-stream balancing"

Relevance: 30.00%

Publisher:

Abstract:
The development of biomonitoring programs based on the macroinvertebrate community requires an understanding of species distribution patterns, as well as of the responses of the community to anthropogenic stressors. In this study, 49 metrics were tested as potential means of assessing the condition of 29 first- and second-order streams located in areas of differing land use in São Paulo State, Brazil. Of the sampled streams, 15 were in well-preserved regions of the Atlantic Forest, 5 were among sugarcane cultivations, 5 were in areas of pasture, and 4 were among eucalyptus plantations. The metrics were assessed against the following criteria: (1) predictable response to the impact of human activity; (2) highest taxonomic resolution; and (3) operational and theoretical simplicity. We found that 18 metrics were correlated with the environmental and spatial predictors used, and seven of these satisfied the selection criteria and are thus candidates for inclusion in a multimetric system to assess low-order streams in São Paulo State. These metrics include family richness; Ephemeroptera, Plecoptera and Trichoptera (EPT) richness; proportion of Megaloptera and Hirudinea; proportion of EPT; Shannon diversity index at genus level; and an adapted Biological Monitoring Working Party (BMWP) biotic index.
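Two of the candidate metrics named above are simple to compute from taxon counts. A minimal sketch, using hypothetical genus abundances (the names and numbers are illustrative, not data from the study):

```python
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxon abundances."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical sample: genus -> (order, abundance)
sample = {
    "Baetis":      ("Ephemeroptera", 12),
    "Perla":       ("Plecoptera",     3),
    "Hydropsyche": ("Trichoptera",    7),
    "Chironomus":  ("Diptera",       20),
}
H = shannon_diversity([n for _, n in sample.values()])

# EPT richness: number of distinct taxa in the three sensitive orders
EPT_ORDERS = {"Ephemeroptera", "Plecoptera", "Trichoptera"}
ept_richness = sum(1 for order, _ in sample.values() if order in EPT_ORDERS)
```

The dominance of the tolerant Diptera genus pulls H' well below its maximum (ln 4 for four equally abundant taxa), which is the kind of signal a multimetric index aggregates.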

Master's degree in Electrical Engineering – Electrical Power Systems

Dissertation submitted for the degree of Master in Informatics Engineering

A large share of an organization's time is spent on activities that create no value of any kind. Such activities are considered waste, since they consume resources and time (transport, inspections, adjustments, storage of materials, problem solving, among many others), driving up the cost of the products offered. The term Lean Thinking was first used in 1996 by Womack and Jones, who presented it as a management philosophy whose main objective is to reduce waste in a production process. Reducing waste increases quality and shortens processing times, and consequently lowers production costs. This is the foundation of the present document, whose objective is to create and develop a simulation game in which several Lean tools can be applied. The simulation game continues the research and theoretical study of an Erasmus student and is part of an international project of the Lean Learning Academy (LLA). A pen-assembly production process was created to resemble as closely as possible those found in companies, with all the accessories needed for the simulation to run fully, such as assembly instructions, control procedures and production orders, so that the data and the difficulties encountered could later be analysed and the Lean tools applied. Although several Lean tools are discussed in this work, the following were explored in greater detail: Value Stream Mapping (VSM); Single Minute Exchange of Die (SMED); and line balancing. To make the content and the advantages of these three Lean tools clear, they were applied and simulated, giving their study a practical component for easier understanding and faster learning.
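The line-balancing tool mentioned in the abstract rests on two standard quantities, takt time and balance efficiency. A minimal sketch; the station times, shift length and demand below are made-up numbers, not figures from the simulation game:

```python
def takt_time(available_seconds, demand_units):
    """Takt time: the pace at which units must leave the line to meet demand."""
    return available_seconds / demand_units

def balance_efficiency(station_times, cycle_time):
    """Share of paid station time that is actual work: sum(t_i) / (n * CT)."""
    return sum(station_times) / (len(station_times) * cycle_time)

# Hypothetical pen-assembly line: four stations, work content in seconds
stations = [25, 30, 20, 28]
ct = max(stations)                      # the bottleneck station sets the cycle time
takt = takt_time(7.5 * 3600, 900)      # 7.5 h shift, 900 pens demanded: 30 s/unit
eff = balance_efficiency(stations, ct)  # 103 / (4 * 30)
```

Here the bottleneck cycle time exactly equals the takt time, so the line can just meet demand, while the efficiency below 100% quantifies the idle time a rebalancing exercise would try to recover.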

This dissertation proposes a system capable of bridging the gap between legislative documents in PDF format and legislative documents in open formats. The main objective is to map the knowledge present in these documents so as to represent the collection as linked information. The system comprises several components responsible for executing the three proposed phases: data extraction, knowledge organization, and information access. The first phase proposes an approach to extracting structure, text and entities from PDF documents in order to obtain the desired information, according to user-supplied parameters. This approach uses two different extraction methods, corresponding to the two phases of document processing: document analysis and document understanding. The criterion used to group text objects is the font used by those objects, as defined in the PDF content stream. The approach is divided into three parts: document analysis, document understanding, and conjunction. The first part handles the extraction of text segments using a geometric approach, producing a list of the document's text lines; the second groups the text objects according to the stipulated criterion, producing an XML document with the result of that extraction; the third and final part joins the results of the two previous parts and applies structural and logical rules to obtain the final XML document. The second phase proposes an ontology in the legal domain capable of organizing the information produced by the extraction process of the first phase; it is also responsible for indexing the text of the documents. The proposed ontology has three characteristics: it is small, interoperable, and shareable. The first characteristic reflects the fact that the ontology does not focus on a detailed description of the concepts involved, proposing instead a more abstract description of the entities present; the second stems from the need for interoperability with other ontologies in the legal domain, as well as with the standard ontologies in general use; the third is defined so that knowledge expressed according to the proposed ontology is independent of factors such as country, language, or jurisdiction. The third phase addresses the access to and reuse of the knowledge by users external to the system, through the development of a Web Service. This component provides access to the information by exposing a group of resources to external actors who wish to consult it. The Web Service follows the REST architecture. An Android mobile application was also developed to provide visualizations of the information requests. The end result is a system capable of transforming collections of PDF documents into open-format collections that can be accessed and reused by other users. This system responds directly to the needs of the open-data community and of governments, which hold many collections of this kind without the ability to reason over the information they contain, transforming it into data that citizens and professionals can visualize and use.
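The font-based grouping criterion described above can be sketched as follows. The tuple representation of text objects and the font names are assumptions for illustration only, not the dissertation's actual data model:

```python
from collections import defaultdict

# Hypothetical text objects extracted from a PDF content stream:
# (text, font name, font size)
text_objects = [
    ("Article 1",        "Helvetica-Bold", 12),
    ("All citizens ...", "Helvetica",      10),
    ("Article 2",        "Helvetica-Bold", 12),
    ("The law ...",      "Helvetica",      10),
]

def group_by_font(objects):
    """Group text objects that share the same (font, size) pair."""
    groups = defaultdict(list)
    for text, font, size in objects:
        groups[(font, size)].append(text)
    return dict(groups)

groups = group_by_font(text_objects)
```

Grouping by font separates headings (bold, larger) from body text, which is what lets the later conjunction step apply structural rules to build the final XML.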

Work presented within the scope of the Master's programme in Informatics Engineering, as a partial requirement for the degree of Master in Informatics Engineering.

Master's degree in Mechanical Engineering – Industrial Management Specialization

Trophic relationships in fish communities are affected by the availability of resources, which in turn is affected by spatial and temporal variations throughout the year. The aims of this study were to characterize the diet of A. tetramerus in a streamlet in the north of Brazil and compare its composition in different hydrological seasons (wet and dry). Collections were performed every two months from October 2011 to September 2012 with the aid of seine nets, hand nets and fishing traps in a streamlet located in the Machado River drainage basin in the state of Rondônia. Most of the specimens collected were quite small (< 40 mm) and had empty stomachs. Our results showed that A. tetramerus feeds on a wide variety of items of plant origin, such as algae, seeds and leaves, as well as items of animal origin, including bryozoans, crustaceans, fish scales, terrestrial insects and detritus. The data also indicated higher consumption of aquatic insects than of other food items, suggesting a primarily insect-based diet. Items of plant and allochthonous origin were consumed more in the wet season than in the dry season, but there were no seasonal differences in the consumption of animal and autochthonous items.

This paper analyses the relationship between mesohabitat and aquatic oligochaete species in the Galharada Stream (Campos do Jordão State Park, state of São Paulo, Brazil). Between August 2005 and May 2006, a total of 192 samples were obtained in areas of four different mesohabitats: riffle leaf litter (RL), pool leaf litter (PL), pool sediment (PS) and interstitial sediment from rocky beds in riffle areas (IS). In the mesohabitats sampled, 2007 specimens were identified, belonging to two families (Naididae and Enchytraeidae). Among the oligochaetes identified, Naididae was represented by six genera (Allonais, Chaetogaster, Nais, Pristina, Aulodrilus and Limnodrilus). Principal components analysis (PCA) revealed that the first two axes explained 85.1% of the total variance of the data. Limnodrilus hoffmeisteri Claparède, 1862 and Aulodrilus limnobius Bretscher, 1899 were associated with the pool areas (PL and PS). Most species of the genera Pristina and Nais showed an apparent affinity with the riffle mesohabitats. The Indicator Species Analysis (IndVal) revealed that Nais communis Piguet, 1906, Pristina leidyi Smith, 1896 and Pristina (Pristinella) jenkinae (Stephenson, 1931) are indicative of the RL mesohabitat, while the family Enchytraeidae was considered indicative of the PL mesohabitat.
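The IndVal statistic used above combines a species' specificity to a habitat with its fidelity there. A minimal sketch of Dufrêne and Legendre's formulation, with made-up abundances rather than data from the study:

```python
def indval(sites_by_habitat):
    """Indicator value of one species per habitat:
    IndVal = 100 * specificity * fidelity, where specificity is the habitat's
    share of the species' mean abundance and fidelity is the fraction of the
    habitat's sites where the species occurs."""
    means = {h: sum(v) / len(v) for h, v in sites_by_habitat.items()}
    total = sum(means.values())
    values = {}
    for habitat, abundances in sites_by_habitat.items():
        specificity = means[habitat] / total if total else 0.0
        fidelity = sum(1 for a in abundances if a > 0) / len(abundances)
        values[habitat] = 100.0 * specificity * fidelity
    return values

# Hypothetical abundances of one species at four sites per mesohabitat
vals = indval({"RL": [4, 6, 5, 0], "PL": [0, 1, 0, 0]})
```

A species concentrated in, and frequent within, one mesohabitat scores high there (the maximum of 100 means exclusive and always present), which is how the RL and PL indicator taxa above were identified.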

There are far-reaching conceptual similarities between bi-static surface georadar and post-stack, "zero-offset" seismic reflection data, which are expressed in largely identical processing flows. One important difference is, however, that standard deconvolution algorithms routinely used to enhance the vertical resolution of seismic data are notoriously problematic or even detrimental to the overall signal quality when applied to surface georadar data. We have explored various options for alleviating this problem and have tested them on a geologically well-constrained surface georadar dataset. Standard stochastic and direct deterministic deconvolution approaches proved to be largely unsatisfactory. While least-squares-type deterministic deconvolution showed some promise, the inherent uncertainties involved in estimating the source wavelet introduced some artificial "ringiness". In contrast, we found spectral balancing approaches to be effective, practical and robust means for enhancing the vertical resolution of surface georadar data, particularly, but not exclusively, in the uppermost part of the georadar section, which is notoriously plagued by the interference of the direct air- and groundwaves. For the data considered in this study, it can be argued that band-limited spectral blueing may provide somewhat better results than standard band-limited spectral whitening, particularly in the uppermost part of the section affected by the interference of the air- and groundwaves. Interestingly, this finding is consistent with the fact that the amplitude spectrum resulting from least-squares-type deterministic deconvolution is characterized by a systematic enhancement of higher frequencies at the expense of lower frequencies and hence is blue rather than white. It is also consistent with increasing evidence that spectral "blueness" is a seemingly universal, albeit enigmatic, property of the distribution of reflection coefficients in the Earth.
Our results therefore indicate that spectral balancing techniques in general and spectral blueing in particular represent simple, yet effective means of enhancing the vertical resolution of surface georadar data and, in many cases, could turn out to be a preferable alternative to standard deconvolution approaches.
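Spectral whitening, the baseline against which the blueing result above is compared, amounts to flattening the amplitude spectrum while preserving the phase spectrum. A minimal pure-Python sketch on a made-up trace; a real implementation would use an FFT, restrict the flattening to the usable signal band, and apply a gentle high-frequency ramp instead of a flat target to obtain blueing:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for a short illustrative trace)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def whiten(trace, eps=1e-12):
    """Normalize every spectral amplitude to ~1 while keeping the phase."""
    return idft([z / (abs(z) + eps) for z in dft(trace)])

trace = [0.0, 1.0, 0.5, -0.3, 0.1, 0.0, 0.0, 0.0]  # hypothetical short trace
flat = whiten(trace)
amps = [abs(z) for z in dft(flat)]  # amplitude spectrum after whitening
```

Because only amplitudes are rescaled, reflection timing (carried by the phase) is untouched, which is why the technique sharpens the section without the wavelet-estimation uncertainty of deterministic deconvolution.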

Marine mammals are often reported to possess reduced variation of major histocompatibility complex (MHC) genes compared with their terrestrial counterparts. We evaluated diversity at two MHC class II B genes, DQB and DRB, in the New Zealand sea lion (Phocarctos hookeri, NZSL) a species that has suffered high mortality owing to bacterial epizootics, using Sanger sequencing and haplotype reconstruction, together with next-generation sequencing. Despite this species' prolonged history of small population size and highly restricted distribution, we demonstrate extensive diversity at MHC DRB with 26 alleles, whereas MHC DQB is dimorphic. We identify four DRB codons, predicted to be involved in antigen binding, that are evolving under adaptive evolution. Our data suggest diversity at DRB may be maintained by balancing selection, consistent with the role of this locus as an antigen-binding region and the species' recent history of mass mortality during a series of bacterial epizootics. Phylogenetic analyses of DQB and DRB sequences from pinnipeds and other carnivores revealed significant allelic diversity, but little phylogenetic depth or structure among pinniped alleles; thus, we could neither confirm nor refute the possibility of trans-species polymorphism in this group. The phylogenetic pattern observed however, suggests some significant evolutionary constraint on these loci in the recent past, with the pattern consistent with that expected following an epizootic event. These data may help further elucidate some of the genetic factors underlying the unusually high susceptibility to bacterial infection of the threatened NZSL, and help us to better understand the extent and pattern of MHC diversity in pinnipeds.

One of the most important issues in portland cement concrete pavement research today is surface characteristics. The issue is one of balancing surface texture construction with the need for durability, skid resistance, and noise reduction. The National Concrete Pavement Technology Center at Iowa State University, in conjunction with the Federal Highway Administration, American Concrete Pavement Association, International Grinding and Grooving Association, Iowa Highway Research Board, and other states, have entered into a three-part National Surface Characteristics Program to resolve the balancing problem. As a portion of Part 2, this report documents the construction of 18 separate pavement surfaces for use in the first level of testing for the national project. It identifies the testing to be done and the limitations observed in the construction process. The results of the actual tests will be included in the subsequent national study reports.

Intensification of agricultural production without sound management and regulation can lead to severe environmental problems, as in western Santa Catarina State, Brazil, where intensive swine production has caused large accumulations of manure and consequently water pollution. Natural resource scientists are asked by decision-makers for advice on management and regulatory decisions. Distributed environmental models are useful tools, since they can be used to explore the consequences of various management practices. However, in many areas of the world, quantitative data for model calibration and validation are lacking. The data-intensive distributed environmental model AgNPS was applied in a data-poor environment, the upper catchment (2,520 ha) of the Ariranhazinho River, near the city of Seara, in Santa Catarina State. Steps included data preparation, cell size selection, sensitivity analysis, model calibration and application to different management scenarios. The model was calibrated based on a best guess for model parameters and on a pragmatic sensitivity analysis. The parameters were adjusted to match model outputs (runoff volume, peak runoff rate and sediment concentration) closely with the sparse observed data. A modelling grid-cell resolution of 150 m gave results that were both appropriate and computationally tractable. The rainfall-runoff response of the AgNPS model was calibrated using three separate rainfall ranges (< 25, 25-60, > 60 mm). Predicted sediment concentrations were consistently six to ten times higher than observed, probably due to sediment trapping along vegetated channel banks. Predicted N and P concentrations in stream water ranged from just below to well above regulatory norms. Expert knowledge of the area, in addition to experience reported in the literature, was able to compensate in part for limited calibration data. Several scenarios (actual, recommended and excessive manure applications, and point-source pollution from swine operations) could be compared by the model, using a relative ranking rather than quantitative predictions.

Nanotechnology has been heralded as a "revolution" in science, for two reasons: first, because of its revolutionary view of the way in which chemicals and elements, such as gold and silver, behave, compared to the traditional scientific understanding of their properties; second, because these discoveries, as applied to commerce, can transform daily life through consumer products ranging from suntan lotions and cosmetics, food packaging, and paints and coatings for cars, housing and fabrics, to medicine and thousands of industrial processes. Beneficial consumer use of nanotechnologies, already in the stream of commerce, improves coatings on inks and paints in everything from food packaging to cars. Additionally, "nanomedicine" offers the promise of diagnosis and treatment at the molecular level in order to detect and treat presymptomatic disease, or to rebuild neurons in Alzheimer's and Parkinson's disease. There is a possibility that severe complications such as stroke or heart attack may be avoided by means of prophylactic treatment of people at risk, and bone regeneration may keep many people active who never expected rehabilitation. Miniaturisation of diagnostic equipment can also reduce the amount of sampling materials required for testing and medical surveillance. Miraculous developments that sound like science fiction to those who eagerly anticipate these medical products, combined with the emerging commercial impact of nanotechnology applications to consumer products, will reshape civil society permanently. Thus, everyone within the jurisdiction of the Council of Europe is an end-user of nanotechnology, even without realising that nanotechnology has touched daily life.

This report presents the results of work zone field data analyzed on interstate highways in Missouri to determine the mean breakdown and queue-discharge flow rates as measures of capacity. Several days of traffic data collected at a work zone near Pacific, Missouri, with a speed limit of 50 mph, were analyzed in both the eastbound and westbound directions. A total of eleven breakdown events were identified using average speed profiles, and the traffic flows prior to and after the onset of congestion were studied. Breakdown flow rates ranged from 1194 to 1404 vphpl, with an average of 1295 vphpl, and a mean queue-discharge rate of 1072 vphpl was determined. Mean queue discharge, as used by the Highway Capacity Manual 2000 (HCM), was found to be 1199 pcphpl, well below the HCM's average capacity of 1600 pcphpl. This reduced capacity at the site is attributable mainly to the narrower lane width and the high percentage of heavy vehicles, around 25%, in the traffic stream. The difference found between mean breakdown flow (1295 vphpl) and queue-discharge flow (1072 vphpl) has been observed widely, and is due to reduced traffic flow once traffic breaks down and queues start to form. The Missouri DOT currently uses a spreadsheet for work zone planning applications that assumes the same values for breakdown and mean queue-discharge flow rates. This study proposes that breakdown flow rates should be used to forecast the onset of congestion, whereas mean queue-discharge flow rates should be used to estimate delays under congested conditions. Hence, it is recommended that the spreadsheet be refined accordingly.
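The vehicles-per-hour to passenger-cars-per-hour conversion behind the 1072 vphpl and 1199 pcphpl figures uses the HCM heavy-vehicle adjustment. A minimal sketch; the passenger-car equivalent E_T below is an assumed level-terrain value, so the result only approximates the report's number:

```python
def heavy_vehicle_factor(p_heavy, e_t):
    """HCM heavy-vehicle adjustment f_HV = 1 / (1 + P_T * (E_T - 1))."""
    return 1.0 / (1.0 + p_heavy * (e_t - 1.0))

def vph_to_pcph(vphpl, p_heavy, e_t):
    """Convert a vehicles/hour/lane flow to passenger cars/hour/lane."""
    return vphpl / heavy_vehicle_factor(p_heavy, e_t)

# ~25% heavy vehicles as observed at the site; E_T = 1.5 is an assumption here
queue_discharge_pc = vph_to_pcph(1072, 0.25, 1.5)  # close to the reported 1199 pcphpl
```

With a quarter of the stream being trucks, each counted as one and a half passenger cars, the vehicle flow scales up by 12.5%, which is why the pcphpl figure exceeds the vphpl figure.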