699 results for: data warehouse tuning aggregato business intelligence performance


Relevance:

100.00%

Publisher:

Abstract:

RothC and Century are two of the most widely used soil organic matter (SOM) models. However, there are few examples of specific parameterisation of these models for environmental conditions in East Africa. The aim of this study was therefore to evaluate the ability of RothC and Century to estimate changes in soil organic carbon (SOC) resulting from varying land use/management practices under the climate and soil conditions found in Kenya. The study used climate, soil and crop data from a long-term experiment (1976-2001) carried out at the Kabete site of the Kenya National Agricultural Research Laboratories (NARL, located in a semi-humid region) and data from a 13-year experiment carried out at Machang'a (Embu District, located in a semi-arid region). The NARL experiment included various fertiliser (0, 60 and 120 kg of N and P2O5 ha(-1)), farmyard manure (FYM; 5 and 10 t ha(-1)) and plant residue treatments, in a variety of combinations. The Machang'a experiment involved a fertiliser (51 kg N ha(-1)) and an FYM (0, 5 and 10 t ha(-1)) treatment, with both monocropping and intercropping. At Kabete both models showed a fair to good fit to measured data, although Century simulations for treatments with high levels of FYM were better than those without. At the Machang'a site with monocrops, both models showed a fair to good fit to measured data for all treatments. However, the fit of both models (especially RothC) to measured data for intercropping treatments at Machang'a was much poorer. Further model development for intercrop systems is recommended. Both models can be useful tools for soil C predictions, provided time series of measured soil C and crop production data are available for validating model performance against local or regional agricultural crops.
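
The abstract does not say which goodness-of-fit statistics were used; as a minimal sketch of the kind of model evaluation described, the snippet below computes two measures commonly reported when comparing simulated and measured SOC (root mean square error and Nash-Sutcliffe modelling efficiency), using made-up numbers rather than the Kabete or Machang'a data.

```python
import numpy as np

# Hypothetical measured and simulated soil organic carbon (t C ha^-1) for one
# treatment over several sampling years -- placeholder values, not study data.
measured  = np.array([23.1, 21.8, 20.9, 20.2, 19.6, 19.1])
simulated = np.array([22.7, 22.0, 21.2, 20.5, 19.9, 19.4])

# Root mean square error: average distance between model output and observations.
rmse = np.sqrt(np.mean((simulated - measured) ** 2))

# Nash-Sutcliffe modelling efficiency: 1 is a perfect fit, 0 means the model
# does no better than the mean of the observations.
ef = 1 - np.sum((measured - simulated) ** 2) / np.sum((measured - measured.mean()) ** 2)

print(f"RMSE = {rmse:.2f} t C ha^-1, modelling efficiency = {ef:.2f}")
```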

Relevance:

100.00%

Publisher:

Abstract:

We seek to address formally the question raised by Gardner (2003) in his Elmhirst lecture as to the direction of causality between agricultural value added per worker and Gross Domestic Product (GDP) per capita. Using the Granger causality test on the panel data analyzed by Gardner for 85 countries, we find overwhelming evidence supporting the conclusion that agricultural value added is the causal variable in developing countries, while the direction of causality in developed countries is unclear. We also further examine the use of the Granger causality test on integrated data and provide evidence that the small-sample performance of the test can be improved through the use of the bootstrap.
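
The abstract does not detail the bootstrap procedure; the sketch below illustrates the general idea for a single pair of series: an F test of Granger non-causality whose null distribution is approximated by a residual bootstrap of the restricted model. The series, lag length and replication count are illustrative, not the paper's panel setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def lags_of(v, p):
    """Matrix with columns v[t-1], ..., v[t-p] for rows t = p .. len(v)-1."""
    n = len(v)
    return np.column_stack([v[p - k: n - k] for k in range(1, p + 1)])

def granger_f(y, x, p=2):
    """F statistic of H0: lags of x add nothing to an AR(p) model of y."""
    n = len(y)
    target = y[p:]
    X_r = np.column_stack([np.ones(n - p), lags_of(y, p)])      # restricted model
    X_u = np.column_stack([X_r, lags_of(x, p)])                 # adds lags of x
    rss_r = np.sum((target - X_r @ np.linalg.lstsq(X_r, target, rcond=None)[0]) ** 2)
    rss_u = np.sum((target - X_u @ np.linalg.lstsq(X_u, target, rcond=None)[0]) ** 2)
    return ((rss_r - rss_u) / p) / (rss_u / (len(target) - X_u.shape[1]))

def bootstrap_pvalue(y, x, p=2, reps=499):
    """Residual bootstrap: regenerate y under the null (no effect of x) and
    compare the observed F statistic with its bootstrap distribution."""
    n = len(y)
    target = y[p:]
    X_r = np.column_stack([np.ones(n - p), lags_of(y, p)])
    beta = np.linalg.lstsq(X_r, target, rcond=None)[0]
    resid = target - X_r @ beta
    resid -= resid.mean()
    f_obs = granger_f(y, x, p)
    f_null = np.empty(reps)
    for r in range(reps):
        e = rng.choice(resid, size=n - p, replace=True)
        y_star = np.empty(n)
        y_star[:p] = y[:p]
        for t in range(p, n):
            y_star[t] = beta[0] + beta[1:] @ y_star[t - p: t][::-1] + e[t - p]
        f_null[r] = granger_f(y_star, x, p)
    return (1 + np.sum(f_null >= f_obs)) / (reps + 1)

# Small synthetic example in which x leads y by one period.
T = 50
x = np.cumsum(rng.standard_normal(T))            # a persistent regressor
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = 0.3 * y[t - 1] + 0.5 * x[t - 1] + rng.standard_normal()

print("bootstrap p-value for 'x Granger-causes y':", bootstrap_pvalue(y, x))
```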

Relevance:

100.00%

Publisher:

Abstract:

We have combined several key sample preparation steps for the use of a liquid matrix system to provide high analytical sensitivity in automated ultraviolet matrix-assisted laser desorption/ionisation mass spectrometry (UV-MALDI-MS). This new sample preparation protocol employs a matrix mixture based on the glycerol matrix mixture described by Sze et al. (J. Am. Soc. Mass Spectrom. 1998, 9, 166-174). The low-femtomole sensitivity achievable with this new preparation protocol enables proteomic analysis of protein digests comparable to solid-state matrix systems. For automated data acquisition and analysis, the MALDI performance of this liquid matrix surpasses that of the conventional solid-state MALDI matrices. Besides the inherent general advantages of liquid samples for automated sample preparation and data acquisition, the use of the presented liquid matrix significantly reduces the extent of unspecific ion signals in peptide mass fingerprints compared with typically used solid matrices, such as 2,5-dihydroxybenzoic acid (DHB) or alpha-cyano-hydroxycinnamic acid (CHCA). In particular, matrix and low-mass ion signals and ion signals resulting from cation adduct formation are dramatically reduced. Consequently, the confidence level of protein identification by peptide mass mapping of in-solution and in-gel digests is generally higher.

Relevance:

100.00%

Publisher:

Abstract:

The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data and a data warehouse. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and second, how the grid can be used to transfer analysis programs to the primary repositories. The latter is an important and challenging aspect of P-found because the data volumes involved are too large to be centralised. The grid technologies we are developing with the P-found system will allow new, large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling new scientific discoveries.

Relevance:

100.00%

Publisher:

Abstract:

The Antarctic Peninsula region is currently undergoing rapid environmental change, resulting in the thinning, acceleration and recession of glaciers and the sequential collapse of ice shelves. It is important to view these changes in the context of long-term palaeoenvironmental complexity and to understand the key processes controlling ice sheet growth and recession. In addition, numerical ice sheet models require detailed geological data for tuning and testing. Therefore, this paper systematically and holistically reviews published geological evidence for Antarctic Peninsula Ice Sheet variability for each key locality throughout the Cenozoic, and brings together the prevailing consensus on the extent, character and behaviour of the glaciations of the Antarctic Peninsula region. Major contributions include a downloadable database of 186 terrestrial and marine calibrated dates; an original reconstruction of the ice sheet at the Last Glacial Maximum (LGM); and a new series of isochrones detailing ice sheet retreat following the LGM. Glaciation of Antarctica was initiated around the Eocene/Oligocene transition in East Antarctica. Palaeogene records of Antarctic Peninsula glaciation are primarily restricted to King George Island, where glacigenic sediments provide a record of early East Antarctic glaciations, but with modification of far-travelled erratics by local South Shetland Island ice caps. Evidence for Neogene glaciation is derived primarily from King George Island and James Ross Island, where glaciovolcanic strata indicate that ice thicknesses reached 500–850 m during glacials. This suggests that the Antarctic Peninsula Ice Sheet draped, rather than drowned, the topography. Marine geophysical investigations indicate multiple ice sheet advances during this time. Seismic profiling of continental shelf-slope deposits indicates up to ten large advances of the Antarctic Peninsula Ice Sheet during the Early Pleistocene, when the ice sheet was dominated by 40 kyr cycles. Glacials became more pronounced, reaching the continental shelf edge, and of longer duration during the Middle Pleistocene. During the Late Pleistocene, repeated glacials reached the shelf edge, but ice shelves inhibited iceberg rafting. The LGM occurred at 18 ka BP, after which transitional glaciomarine sediments on the continental shelf indicate ice-sheet retreat. The continental shelf contains large bathymetric troughs, which were repeatedly occupied by large ice streams during Pleistocene glaciations. Retreat after the LGM was episodic in the Weddell Sea, with multiple readvances and changes in ice-flow direction, but rapid in the Bellingshausen Sea. The late Holocene Epoch was characterised by repeated fluctuations in palaeoenvironmental conditions, with associated glacial readvances. However, these fluctuations have been overtaken by rapid warming and ice-shelf collapse during the twentieth century.

Relevance:

100.00%

Publisher:

Abstract:

Combining data on structural characteristics and economic performance for a large sample of Italian firms with data on their exporting and importing activity, we uncover evidence supporting recent theories of firm heterogeneity and international trade, together with some new facts. In particular, we find that importing is associated with substantial firm heterogeneity. First, we document that trade is more concentrated than employment and sales, and show that importing is even more concentrated than exporting, both within sectors and along the sector- and country-extensive margins. Second, while confirming that firms involved in both activities are the best performers, we also find that firms involved only in importing perform better than those involved only in exporting. Our evidence suggests that there is a strong self-selection effect in the case of importers, and that the performance premia of internationalised firms correlate relatively more with the degree of geographical and sectoral diversification of imports.
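
As an illustration of what "trade is more concentrated than employment" means operationally, the sketch below computes the share of each variable accounted for by the largest 1% of firms in a synthetic firm-level data set; the data and this particular concentration measure are assumptions for illustration, not the Italian micro-data or the paper's exact statistics.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical firm-level data: heavy-tailed trade values so that a few firms
# account for most of the totals (illustrative numbers only).
n = 5_000
firms = pd.DataFrame({
    "employment": rng.lognormal(mean=3.0, sigma=1.0, size=n),
    "exports":    rng.lognormal(mean=2.0, sigma=2.0, size=n),
    "imports":    rng.lognormal(mean=1.5, sigma=2.3, size=n),
})

def top_share(values: pd.Series, top_frac: float = 0.01) -> float:
    """Share of the total accounted for by the largest top_frac of firms."""
    k = max(1, int(round(top_frac * len(values))))
    return values.nlargest(k).sum() / values.sum()

for col in firms.columns:
    print(f"top 1% share of {col}: {top_share(firms[col]):.2f}")
```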

Relevance:

100.00%

Publisher:

Abstract:

The integration of nanostructured films containing biomolecules with silicon-based technologies is a promising route towards miniaturized biosensors that exhibit high sensitivity and selectivity. A challenge, however, is to avoid cross talk among sensing units in an array with multiple sensors located in a small area. In this letter, we describe an array of 16 sensing units of a light-addressable potentiometric sensor (LAPS), made with layer-by-layer (LbL) films of a poly(amidoamine) dendrimer (PAMAM) and single-walled carbon nanotubes (SWNTs), coated with a layer of the enzyme penicillinase. A visual inspection of the data from constant-current measurements with liquid samples containing distinct concentrations of penicillin, glucose, or a buffer indicated possible cross talk between units that contained penicillinase and those that did not. With the use of multidimensional data projection techniques, normally employed in information visualization methods, we managed to distinguish the results from the modified LAPS, even in cases where the units were adjacent to each other. Furthermore, the plots generated with the interactive document map (IDMAP) projection technique enabled the distinction of different concentrations of penicillin, from 5 mmol L(-1) down to 0.5 mmol L(-1). Data visualization also confirmed the enhanced performance of the sensing units containing carbon nanotubes, consistent with the analysis of results for LAPS sensors. The use of visual analytics, as with projection methods, may be essential for handling the large amounts of data generated in multiple sensor arrays in order to achieve high performance in miniaturized systems.
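
IDMAP itself is not available in mainstream Python libraries, so the sketch below uses classical multidimensional scaling from scikit-learn as a stand-in to show the general idea of projecting multi-unit sensor responses into two dimensions and checking whether samples of different penicillin concentrations separate; all response values are synthetic.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(2)

# Hypothetical responses of a 16-unit sensor array: one row per liquid sample,
# one column per sensing unit (synthetic numbers, not the LAPS measurements).
concentrations = [0.5, 1.0, 2.0, 5.0]            # mmol L^-1 of penicillin
samples, labels = [], []
for c in concentrations:
    for _ in range(10):                           # replicate measurements
        base = np.log(c) * np.linspace(0.5, 1.5, 16)   # unit-dependent sensitivity
        samples.append(base + 0.05 * rng.standard_normal(16))
        labels.append(c)

X = np.array(samples)

# Project the 16-dimensional responses to 2-D; samples of the same concentration
# should form separate clusters if the array discriminates between them.
proj = MDS(n_components=2, random_state=0).fit_transform(X)
for c in concentrations:
    pts = proj[np.array(labels) == c]
    print(f"{c} mmol/L -> cluster centre {pts.mean(axis=0).round(2)}")
```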

Relevance:

100.00%

Publisher:

Abstract:

The thesis aims to determine the optimum trigger speed for Vehicle Activated Signs (VAS) and to study the effectiveness of the VAS trigger speed on drivers' behaviour. Vehicle activated signs (VAS) are speed warning signs that are activated by an individual vehicle when the driver exceeds a speed threshold. The threshold that triggers the VAS is commonly based on the vehicle's speed and is accordingly called the trigger speed. At present, the trigger speed is usually set to a constant value and does not take into account that an optimal trigger speed might exist, even though the choice of trigger speed significantly affects driver behaviour. To fulfil the aims of this thesis, systematic vehicle speed data were collected in field experiments using Doppler radar. Calibration methods for the radar used in the experiments were developed and evaluated to provide accurate data. The calibration was carried out in two ways, through data cleaning and through data reconstruction; the data-cleaning calibration performed better than the calibration based on reconstructed data. To study the effect of the trigger speed on driver behaviour, the collected data were analysed with both descriptive and inferential statistics. Both showed that a change in trigger speed affected the mean vehicle speed and the standard deviation of vehicle speeds. When the trigger speed was set near the speed limit, the standard deviation was high. Therefore, the choice of trigger speed cannot be based solely on the speed limit at the proposed VAS location. Optimal trigger speeds for VAS were not considered in previous studies, nor was the relationship between the trigger value and its consequences under different conditions clearly stated. The finding of this thesis is that the optimal trigger speed should be based primarily on lowering the standard deviation rather than on lowering the mean speed of vehicles. Furthermore, the optimal trigger speed should be set near the 85th percentile speed, with the goal of lowering the standard deviation.
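
As a small illustration of the quantities the thesis bases its recommendation on, the sketch below computes the mean speed, the standard deviation and the 85th percentile speed (the suggested trigger speed) from a set of spot speeds; the speed values are synthetic, not the field data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical spot speeds (km/h) recorded by the roadside radar at one site;
# placeholder values, not the thesis data set.
speeds = rng.normal(loc=52.0, scale=6.0, size=2_000)

mean_speed = speeds.mean()
std_speed = speeds.std(ddof=1)
p85 = np.percentile(speeds, 85)          # candidate trigger speed per the thesis

print(f"mean = {mean_speed:.1f} km/h, std = {std_speed:.1f} km/h, "
      f"85th percentile = {p85:.1f} km/h")
```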

Relevance:

100.00%

Publisher:

Abstract:

The aim of this paper is to evaluate the performance of two divergent methods for delineating commuting regions, also called labour market areas, in a situation where the base spatial units differ greatly in size as a result of an irregular population distribution. Commuting patterns in Sweden have been analyzed with geographical information system technology by delineating commuting regions using two regionalization methods. One, a rule-based method, uses one-way commuting flows to delineate local labour market areas in a top-down procedure based on the selection of predefined employment centres. The other, the interaction-based Intramax analysis, uses two-way flows in a bottom-up procedure based on numerical taxonomy principles. A comparison of these methods exposes a number of strengths and weaknesses. The same data source has been used for both methods. The performance of both methods has been evaluated for the country as a whole using resident employed population, self-containment levels and job ratios as criteria. A more detailed evaluation has been carried out for the Goteborg metropolitan area by comparing regional patterns with the commuting fields of a number of urban centres in this area. It is concluded that both methods could benefit from the inclusion of additional control measures to identify improper allocations of municipalities.
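
To make the evaluation criteria concrete, the sketch below computes self-containment levels and job ratios for a candidate grouping of spatial units from a commuting flow matrix; the matrix, the grouping and the exact definitions are illustrative assumptions, not the Swedish data or the paper's precise formulas.

```python
import numpy as np

# Hypothetical commuting flow matrix for 4 municipalities grouped into two
# regions: flows[i, j] = number of people living in i and working in j.
# (Illustrative numbers only, not the Swedish register data.)
flows = np.array([
    [800, 150,  20,  10],
    [200, 600,  30,  15],
    [ 10,  25, 500, 120],
    [  5,  20, 180, 400],
])
regions = {"A": [0, 1], "B": [2, 3]}   # one candidate regionalisation

for name, members in regions.items():
    m = np.array(members)
    live = flows[m, :].sum()                   # employed residents of the region
    work = flows[:, m].sum()                   # jobs located in the region
    internal = flows[np.ix_(m, m)].sum()       # live and work inside the region
    supply_sc = internal / live                # supply-side self-containment
    demand_sc = internal / work                # demand-side self-containment
    job_ratio = work / live
    print(f"region {name}: self-containment {supply_sc:.2f}/{demand_sc:.2f}, "
          f"job ratio {job_ratio:.2f}")
```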

Relevance:

100.00%

Publisher:

Abstract:

Credit analysis for companies is, in general, nothing new, especially when one considers this practice in the banking sector. However, in the telephony segment, which until a few years ago faced a shortage of this service, credit analysis probably did not receive due importance in the management of the business. In this context, the present work aims to evaluate the current Corporate Credit Policy of Telet S/A, which determines that credit is analysed only for corporate customers activating more than 10 lines. The study evaluated the delinquency rates of this customer segment so that the differences between customers whose credit is assessed and those whose credit is not could be observed. The data were extracted from the Business Intelligence (BI) system available at Telet, and the period studied ran from January 2001 to April 2002. To achieve the proposed objective, the delinquency indices were evaluated across four management groupings: billing cycles, customer tenure in the base, billing regions and payment method. The delinquency indices were studied to determine whether the differences between the analysed means were statistically significant, thereby indicating which segmentations showed the greatest and the smallest similarities in payment behaviour should the criterion be changed. As a general result of this study, the delinquency indices of customers who activate up to 10 lines are significantly higher than the overall delinquency indices, indicating that changing the analysis criterion would likely reduce financial losses for Telet.
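
As an illustration of the kind of comparison described (whether delinquency differs significantly between customers whose credit is and is not analysed), the sketch below runs a two-proportion z test on hypothetical counts; the figures and the choice of test are assumptions, not Telet's data or the study's actual method.

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts (not Telet's actual figures): delinquent accounts and
# total accounts for customers activating up to 10 lines (no credit analysis)
# versus more than 10 lines (credit analysed).
delinquent = np.array([420, 180])
total      = np.array([3000, 2500])

stat, pvalue = proportions_ztest(count=delinquent, nobs=total)
rates = delinquent / total
print(f"delinquency rates: {rates[0]:.1%} vs {rates[1]:.1%}, "
      f"z = {stat:.2f}, p = {pvalue:.4f}")
```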

Relevance:

100.00%

Publisher:

Abstract:

Business Intelligence systems and Corporate Social Responsibility (CSR), considered separately, are key contemporary issues that have attracted growing interest and a substantial increase in the number of studies. Taken together, however, they are practically absent from the literature. To begin filling this gap, the contribution of this study is a theoretical model for the design of Business Intelligence systems, more specifically for the phase of identifying information needs, that integrates the concept of corporate social responsibility, which thereby becomes part of the set of relevant information to be managed by companies. The research method is Grounded Theory, conducted in five organisations recognised for their commitment to sustainability. The research question of how to integrate the management of CSR-related information with traditional performance indicators in the design of Business Intelligence systems led us to a theoretical model based on two axes, which we call Institutional Context and Indicators in Perspective. On the one hand, incorporating sustainability into business strategy depends fundamentally on organisation-related variables, which were identified in the axis called Institutional Context. On the other hand, the Indicators in Perspective axis addresses how to categorise performance indicators so that the company's management and strategy can be evaluated and analysed in a single model that integrates not only the social and environmental dimensions but also traditional business dimensions. Finally, this multidimensional structure for integrating economic, social and environmental indicators proved to be the final stage in a process that leads to a truly sustainable organisation.

Relevance:

100.00%

Publisher:

Abstract:

This study aimed to identify how the specific characteristics of public organisations influence the implementation of Process Management within them. To this end, a review of the theoretical literature was conducted, mainly to outline the issues relevant to understanding the differences between public and private administration that affect the implementation of Process Management in those organisations, and to support the preparation of a structured interview script. Building on this stage, field research was carried out consisting of eight interviews with prominent Brazilian specialists from academia, from companies or institutions that work with Process Management, from consultancies in this area and/or professionals with international certification in processes, as well as with experience in implementing Process Management in organisations of the Federal Direct Public Administration. The results of these interviews were analysed, consolidated and examined in the light of the positions of the authors in the theoretical literature. In the perception of the specialists consulted, the lower maturity of public organisations in measuring results and monitoring performance makes it harder to demonstrate tangible results from the adoption of Process Management. As a consequence, the people involved find it difficult to perceive the gains they can obtain from it, a situation that greatly hinders mobilisation efforts and commitment to the implementation of Process Management. In addition, the interviewees consider that the attitude and profile of civil servants, combined with other specific characteristics of public organisations (such as career stability; the lack of recognition and reward mechanisms and of rigorous performance appraisal; a culture of excessive documentation and control, together with the dysfunctions of bureaucracy; and discontinuity of management due to political influences) can also undermine engagement, people's motivation for this form of management, and the identification and implementation of continuous improvements in organisational processes, which are intrinsic activities of Process Management.

Relevance:

100.00%

Publisher:

Abstract:

This work evaluates how effective knowledge disclosure is in attenuating negative institutional reactions caused by the uncertainties created by firms' new strategies in response to novel technologies. The empirical setting is an era of technological ferment: the introduction of voice over internet protocol (VoIP) in the USA in the early 2000s, a technology that led to the convergence of the wireline telecommunications and cable television industries. The Institutional Brokers' Estimate System (also known as the I/B/E/S system) was used to capture the reactions of securities analysts, a source revealed to be an important driver of institutional pressure on firms' strategies. To assess knowledge disclosure, a coding technique and an established content analysis framework were used to quantitatively measure the non-numerical, unstructured data in transcripts of business events that occurred at that time. Finally, several binary response models were estimated to assess the effect of knowledge disclosure on the probability of a positive institutional reaction. The findings are that the odds of a favourable institutional reaction increase when a specific kind of knowledge is disclosed. It can be concluded that knowledge disclosure can serve as a weapon in situations of technological change, attenuating adverse institutional reactions to companies' strategies.
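
The paper's exact specification is not given in the abstract; the sketch below shows a generic binary response (logit) model of the probability of a favourable analyst reaction as a function of a coded knowledge-disclosure measure, using simulated data and an illustrative control variable.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Hypothetical event-level data (not the I/B/E/S sample): one row per business
# event, with a coded measure of how much of a given kind of knowledge was
# disclosed and a binary indicator of a favourable analyst reaction.
n = 300
knowledge_disclosed = rng.uniform(0, 1, size=n)      # coded disclosure intensity
firm_size = rng.normal(0, 1, size=n)                 # an illustrative control
logit_index = -0.5 + 2.0 * knowledge_disclosed + 0.3 * firm_size
positive_reaction = rng.binomial(1, 1 / (1 + np.exp(-logit_index)))

X = sm.add_constant(np.column_stack([knowledge_disclosed, firm_size]))
model = sm.Logit(positive_reaction, X).fit(disp=0)

# Odds ratio for disclosure: how the odds of a favourable reaction scale with
# one unit more of disclosed knowledge.
print(model.summary2().tables[1])
print("odds ratio for disclosure:", np.exp(model.params[1]))
```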