903 results for Data-driven Methods


Relevance:

80.00%

Publisher:

Abstract:

This paper approaches strategy in business management, aiming to identify and outline the interests and commitment of stakeholders in the strategic management of resources for the production and implementation of wind turbine equipment at a Brazilian wind power company, and to verify whether the internal and external results deriving from such activities were sustainable, taking as main references seminal publications and periodicals relevant to the research topic that discuss Resource Theory, Stakeholder Theory and Sustainability. An analysis was carried out to assess how stakeholders, beyond the temporal context, intermediated the composition, development and management of the organization's resources, as well as the social, environmental and economic results obtained from resource management in the production and supply of wind turbines to a wind power plant located in the State of Ceará, in order to show that, in Brazil, sustainability can be an important source of competitive advantage that creates value for shareholders and the community (Hart & Milstein, 2003). The strategy applied was a qualitative investigation using a single case study, which allowed a thorough examination of an active organization in the Brazilian wind power industry and of the resources used in the production and implementation of the wind turbines supplied to a wind power plant in Ceará. Based on content analysis and the triangulation principle, three qualitative data collection methods were applied to identify and characterize stakeholders' interest and commitment in the resource management of the organization: in-depth semi-structured interviews with managers at the tactical-strategic level and with analysts of the nine activities of the organization's value chain; analysis of public internal and external documents; and analysis of audio-visual material.
In addition, to identify the internal and external economic, social and environmental results of the implementation and supply of wind turbines to the wind power plant in Ceará, semi-structured interviews were also carried out with residents of the region. Results showed that BNDES (the Brazilian Development Bank) and the organization's head office were the stakeholders who exerted the strongest influence on the resources related to the production and implementation of the wind turbines at the Trairi Wind Plant in Ceará. Concerning the organization's resources, at the current stage of the Brazilian wind industry, although the brand, reliability and reputation of the organization under study were valuable, rare, hard-to-imitate resources exploited by the organization, it was noticed that, contrary to the RBV, they did not actually represent a source of competitive advantage. For the local community, the social, economic and environmental results of the wind turbine implementation were more positive than negative, despite the fact that the productive process caused negative environmental impacts, such as the high CO2 emissions from transporting wind turbine components to the Trairi Wind Power Plant.

Relevance:

80.00%

Publisher:

Abstract:

The synthetic control (SC) method has recently been proposed as an alternative method to estimate treatment effects in comparative case studies. Abadie et al. [2010] and Abadie et al. [2015] argue that one of the advantages of the SC method is that it imposes a data-driven process to select the comparison units, providing more transparency and less discretionary power to the researcher. However, an important limitation of the SC method is that it does not provide clear guidance on the choice of predictor variables used to estimate the SC weights. We show that this lack of specific guidance provides significant opportunities for the researcher to search for specifications with statistically significant results, undermining one of the main advantages of the method. Considering six alternative specifications commonly used in SC applications, we calculate in Monte Carlo simulations the probability of finding a statistically significant result at 5% in at least one specification. We find that this probability can be as high as 13% (23% for a 10% significance test) when there are 12 pre-intervention periods and decays slowly with the number of pre-intervention periods. With 230 pre-intervention periods, this probability is still around 10% (18% for a 10% significance test). We show that the specification that uses the average pre-treatment outcome values to estimate the weights performed particularly badly in our simulations. However, the specification-searching problem remains relevant even when we do not consider this specification. We also show that this specification-searching problem is relevant in simulations with real datasets looking at placebo interventions in the Current Population Survey (CPS). In order to mitigate this problem, we propose a criterion to select among different SC specifications based on the prediction error of each specification in placebo estimations.
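The multiple-testing intuition behind this abstract can be sketched numerically: if a researcher tries k specifications and reports any that turns out significant, the chance of at least one false positive grows with k. The sketch below is a deliberate simplification, not the paper's simulation design: it assumes the placebo p-values of the six specifications are independent and uniform under the null (the correlated SC specifications studied in the paper give smaller probabilities, such as the reported 13%).

```python
import random

def prob_any_significant(n_specs, alpha=0.05, n_sims=100_000, seed=0):
    """Monte Carlo estimate of the chance that at least one of
    n_specs placebo tests is significant at level alpha, assuming
    independent tests (an upper-bound-style illustration only)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        # under the null, each placebo p-value is uniform on [0, 1]
        if any(rng.random() < alpha for _ in range(n_specs)):
            hits += 1
    return hits / n_sims

# closed form for independent tests: 1 - (1 - alpha)^k, about 0.265 for k = 6
upper_bound = 1 - (1 - 0.05) ** 6
```

For six independent tests the closed form gives roughly 26.5%, which is why the 13% reported for correlated SC specifications is still a substantial inflation over the nominal 5%.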

Relevance:

80.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

80.00%

Publisher:

Abstract:

In systems that combine the outputs of classification methods (combination systems), such as ensembles and multi-agent systems, one of the main constraints is that the base components (classifiers or agents) should be diverse among themselves. In other words, there is clearly no accuracy gain in a system composed of a set of identical base components. One way of increasing diversity is through the use of feature selection or data distribution methods in combination systems. In this work, we investigate the impact of using data distribution methods among the components of combination systems. In this investigation, different data distribution methods are used, and the combination systems are analysed under several different configurations. The aim of this analysis is to detect which combination systems are more suitable for feature distribution among their components.
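The feature-distribution idea above can be illustrated with a minimal sketch, under the assumption (not stated in the abstract) that the two basic strategies are a disjoint split of the feature set and an overlapping random-subspace-style assignment; the function name and parameters are hypothetical, for illustration only.

```python
import random

def distribute_features(n_features, n_components, overlap=False, seed=0):
    """Assign feature indices to ensemble components.
    A disjoint split forces each component to see different inputs,
    promoting diversity; an overlapping assignment (random-subspace
    style) lets components share some features."""
    rng = random.Random(seed)
    idx = list(range(n_features))
    rng.shuffle(idx)
    if not overlap:
        # disjoint: each component takes every n_components-th feature
        return [idx[i::n_components] for i in range(n_components)]
    # overlapping: each component samples half of the features
    k = max(1, n_features // 2)
    return [sorted(rng.sample(idx, k)) for _ in range(n_components)]
```

Each base classifier would then be trained only on its assigned feature subset, and the components' predictions combined, e.g., by majority vote.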

Relevance:

80.00%

Publisher:

Abstract:

Restricted breeding seasons used in beef cattle produce censored data for reproduction traits measured with regard to these seasons. Adequate methods must be used to analyze these data. The objective of this paper was to compare three approaches for evaluating sexual precocity in Nellore cattle. The final data set contained 6699 records of age at first conception (AFC14) (in days) and of heifer pregnancy (HP14) (binary) obtained from females exposed to bulls for the first time at about 14 months of age. Records of females that did not calve in the year following exposure to a sire were considered censored (77.5% of the total). The models used to obtain genetic parameters and expected progeny differences (EPDs) were a Weibull mixed model and a censored linear model for AFC14 and a threshold model for HP14. The mean heritabilities obtained were 0.76 and 0.44, respectively, for the survival and censored linear models (for AFC14), and 0.58 for HP14. Rank and Pearson correlations varied (in absolute values) from 0.54 to 0.99 (considering different percentages of sires selected), indicating moderate changes in the classification. Considering survival analysis as the best selection criterion (the one that would result in the best response to selection), it was observed that selection for HP14 would lead to a more significant decrease in selection response than selection for AFC14 analysed by the censored linear model, whose results were very similar to those of the survival analysis.

Relevance:

80.00%

Publisher:

Abstract:

This paper presents an application of AMMI (Additive Main effects and Multiplicative Interaction) models for a thorough study of the effect of the genotype-by-environment interaction in multi-environment experiments with balanced data. Two cross-validation methods are presented, together with an improvement of these methods through the correction of the eigenvalues, which are rearranged by isotonic regression. A comparative study between these methods is carried out with real data. The results show that the EASTMENT & KRZANOWSKI (1982) method selects a more parsimonious model and that, when this method is improved with the correction of the eigenvalues, the number of components is not modified. The GABRIEL (2002) method selects a large number of terms to retain in the model, and when this method is improved by the correction of the eigenvalues, the number of terms diminishes. Therefore, the improvement of these methods through the correction of the eigenvalues brings a great benefit, from a practical point of view, for the analyst of data from multi-environment trials, since the selection of the number of multiplicative terms represents a gain in terms of the number of blocks (or replicates) when the AMMI model is used instead of the complete model.
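The additive part of an AMMI analysis can be sketched briefly: the genotype-by-environment mean table is double-centred (grand mean and additive main effects removed), and the remaining interaction matrix is what the multiplicative terms, obtained by singular value decomposition, then approximate. The sketch below shows only this double-centring step, under illustrative names; it is not the authors' procedure for eigenvalue correction or cross-validation.

```python
def interaction_residuals(table):
    """Double-centre a genotype x environment mean table: remove the
    grand mean and the additive row (genotype) and column (environment)
    main effects, leaving the GxE interaction matrix that AMMI
    decomposes into multiplicative terms via SVD."""
    g, e = len(table), len(table[0])
    grand = sum(sum(row) for row in table) / (g * e)
    row_m = [sum(row) / e for row in table]
    col_m = [sum(table[i][j] for i in range(g)) / g for j in range(e)]
    return [[table[i][j] - row_m[i] - col_m[j] + grand
             for j in range(e)] for i in range(g)]
```

By construction, every row and column of the residual matrix sums to zero, and a purely additive table yields a zero interaction matrix, i.e., no multiplicative terms are needed.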

Relevance:

80.00%

Publisher:

Abstract:

The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities of up to 10^34 cm^-2 s^-1 (10^27 cm^-2 s^-1). At the core of the CMS detector sits a high-magnetic-field and large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystals electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudorapidity coverage to high values (|η| ≤ 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12,500 t.

Relevance:

80.00%

Publisher:

Abstract:

Commissioning studies of the CMS hadron calorimeter have identified sporadic uncharacteristic noise and a small number of malfunctioning calorimeter channels. Algorithms have been developed to identify and address these problems in the data. The methods have been tested on cosmic ray muon data, calorimeter noise data, and single beam data collected with CMS in 2008. The noise rejection algorithms can be applied to LHC collision data at the trigger level or in the offline analysis. The application of the algorithms at the trigger level is shown to remove 90% of noise events with fake missing transverse energy above 100 GeV, which is sufficient for the CMS physics trigger operation. © 2010 IOP Publishing Ltd and SISSA.

Relevance:

80.00%

Publisher:

Abstract:

In Brazil, due to the breeding season used for Thoroughbreds, reproductive data are normally truncated, since breeders try to obtain animals born at the beginning of the breeding season in order to exploit their competitive advantages (more developed, mature and trained animals) over animals born later in the same breeding season. Suitable methods should be used to analyze these data. This paper therefore aims to compare three methodologies, restricted maximum likelihood using MTDFREML, Bayesian analysis without censored data using the MTGSAM software, and Bayesian analysis with censored data using the LMCD software, to evaluate age at first conception in Thoroughbred mares, in order to verify its impact on the choice of stallions during selection. The database contained 3509 records of age at first conception (months) for Thoroughbred mares. The heritability estimates were 0.23, 0.30 and 0.0926 (log scale) for MTDFREML, MTGSAM and LMCD, respectively. Considering all animals in the pedigree (6713), rank correlations varied from 0.91 to 0.99. When only stallions were considered (656), they varied from 0.48 to 0.99 (considering different percentages of selected males) between evaluation methods. The largest changes in the general classification were observed when LMCD was compared with the other two methods. As the linear censored model is the most suitable for analysing traits with censored data, it was observed that the censoring information would lead to the choice of different animals during the selection process, when compared with the other two methodologies.

Relevance:

80.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

80.00%

Publisher:

Abstract:

Graduate Program in Veterinary Surgery - FCAV

Relevance:

80.00%

Publisher:

Abstract:

Speech processing has become a technology increasingly based on the automatic modelling of large amounts of data. Thus, the success of research in this area is directly linked to the existence of public-domain corpora and other specific resources, such as a phonetic dictionary. In Brazil, unlike what happens for the English language, for example, there is currently no public-domain Automatic Speech Recognition (ASR) system for Brazilian Portuguese with support for large vocabularies. Against this background, the main objective of this work is to discuss efforts within the FalaBrasil initiative [1], created by the Signal Processing Laboratory (LaPS) at UFPA, presenting research and software in the area of ASR for Brazilian Portuguese. More specifically, this work discusses the implementation of a large-vocabulary speech recognition system for Brazilian Portuguese, using the HTK toolkit based on hidden Markov models (HMM), and the creation of a grapheme-to-phoneme conversion module using machine learning techniques.

Relevance:

80.00%

Publisher:

Abstract:

The Common Reflection Surface (CRS) stack method was originally introduced as a data-driven method to simulate zero-offset sections from 2-D pre-stack seismic reflection data acquired along a straight acquisition line. The method is based on a second-order hyperbolic traveltime approximation parameterized with three kinematic wavefield attributes. In land data, topographic effects play an important role in seismic data processing and imaging, and this feature has recently been taken into account by the CRS method. In this work we present a review of the CRS traveltime approximations that consider smooth and rugged topography. In addition, we also review the Multifocusing traveltime approximation for the case of rugged topography. By means of a simple synthetic example, we finally provide the first comparisons between the different traveltime expressions.

Relevance:

80.00%

Publisher:

Abstract:

We present two methods for interpreting potential-field data, applied to hydrocarbon exploration. The first uses aeromagnetic data to estimate the boundary, in the horizontal plane, between the continental and the oceanic crust. This method is based on the existence of magnetic geological features exclusive to the continental crust, so that estimates of the terminations of these features are used as estimates of the limits of the continental crust. To this end, the aeromagnetic anomaly signal in the region of the shelf, slope and continental rise is amplified by the downward analytic continuation operator, using two implementations: the equivalent-layer principle and the Dirichlet boundary condition. The greatest computational burden in computing the downward-continued field lies in solving a large linear system of equations. This computational effort is minimized through windowed processing and the use of the conjugate gradient method to solve the system of equations. Since the downward continuation operation is unstable, we stabilize the solution using Tikhonov's first-order regularizing functional. Tests on synthetic aeromagnetic data contaminated with pseudo-random Gaussian noise showed the efficiency of both implementations in enhancing the terminations of the magnetic features exclusive to the continental crust, allowing the delineation of its boundary with the oceanic crust. We applied the methodology, in both implementations, to real aeromagnetic data from two regions of the Brazilian coast: Foz do Amazonas and the Jequitinhonha Basin. The second method delineates, simultaneously, the basement topography of a sedimentary basin and the geometry of salt structures contained in the sedimentary package.
The interpretive models consist of a set of juxtaposed vertical two-dimensional prisms for the sedimentary package and of two-dimensional prisms with polygonal vertical cross-sections for the salt structures. We stabilize the solution by incorporating geometric characteristics of the basement relief and of the salt structures compatible with the geological environment, through global-smoothness, weighted-smoothness and mass-concentration-along-preferential-directions stabilizers, as well as inequality constraints on the parameters. We applied the method to synthetic gravity data produced by 2D sources simulating intracratonic and marginal sedimentary basins, with sediment density varying with depth according to a hyperbolic law and hosting salt domes and pillows. The results showed that the method has the potential to delineate, simultaneously, the geometries of salt pillows and domes as well as discontinuous basement reliefs. We also applied the method to real data along two gravity profiles over the Campos and Jequitinhonha Basins and obtained interpretations compatible with the geology of the area.
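The conjugate gradient step mentioned in the first method can be illustrated with a minimal, matrix-free sketch (a generic CG solver, not the authors' implementation): the Tikhonov-regularized normal equations are symmetric positive-definite, which is exactly the setting where CG applies, and only the action of the matrix on a vector is needed.

```python
def conjugate_gradient(matvec, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A, given only the
    action v -> A v (matvec). Avoiding an explicit matrix is what keeps
    large downward-continuation systems tractable."""
    n = len(b)
    x = list(x0) if x0 else [0.0] * n
    r = [b[i] - a for i, a in enumerate(matvec(x))]  # initial residual
    p = list(r)                                      # search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        if rs ** 0.5 < tol:
            break
        Ap = matvec(p)
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x
```

In exact arithmetic, CG converges in at most n iterations for an n x n system; in practice it is stopped much earlier, which, combined with windowed processing, bounds the cost per window.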