84 results for Data warehouse


Relevance: 20.00%

Abstract:

This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid dividing the Earth into geographic regions is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multidimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
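The Jensen–Shannon divergence used for the region-by-region comparison can be sketched in a few lines of NumPy (a minimal implementation; the per-cell event counts below are hypothetical, not the paper's data):

```python
import numpy as np

def jensen_shannon(p, q, base=2):
    """Jensen-Shannon divergence between two discrete distributions
    (bounded by 1 for base-2 logarithms, and zero iff p == q)."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    m = 0.5 * (p + q)

    def kl(a, b):
        nz = a > 0  # 0 * log(0) is taken as 0
        return np.sum(a[nz] * np.log(a[nz] / b[nz])) / np.log(base)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical per-cell event counts for two grid regions (not the paper's data)
region_a = [10, 30, 25, 35]
region_b = [12, 28, 30, 30]
print(jensen_shannon(region_a, region_b))
```

Because it is symmetric and bounded, the divergence can serve directly as the single-parameter dissimilarity fed to hierarchical clustering or multidimensional scaling.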

Relevance: 20.00%

Abstract:

Complex industrial plants exhibit multiple interactions among their smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries and cause a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power law (PL) distributions. For early years, the data reveal double-PL behavior, while, for more recent periods, a single PL fits the experimental data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data, and multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. The classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
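Fitting a single PL to such fatality statistics is typically done with the maximum-likelihood estimator for the exponent. The sketch below assumes the continuous power-law form (Clauset et al. style) and uses a synthetic sample, not the accident data:

```python
import numpy as np

def powerlaw_alpha_mle(x, xmin=1.0):
    """Maximum-likelihood estimate of the exponent of a continuous power law
    p(x) ~ x**(-alpha) for x >= xmin:  alpha = 1 + n / sum(log(x_i / xmin))."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + x.size / np.sum(np.log(x / xmin))

# Synthetic sample drawn from a power law with alpha = 2.5 (not accident data),
# generated by inverse-transform sampling: x = xmin * (1 - u)**(-1/(alpha - 1))
rng = np.random.default_rng(0)
u = rng.random(5000)
sample = 1.0 * (1.0 - u) ** (-1.0 / (2.5 - 1.0))
print(powerlaw_alpha_mle(sample))  # close to 2.5
```

A double-PL regime would show up as two distinct slopes when the same estimator is applied above two different `xmin` thresholds.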

Relevance: 20.00%

Abstract:

Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test hardware/products under development. Although this is an attractive solution (low cost, and an easy and fast way to carry out some course work), it has major disadvantages. As everything is currently done with/in a computer, students are losing the "feel" for the real values of physical magnitudes. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithmics, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in most cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that can be used by different courses in which computers are part of the teaching/learning process, giving students a more realistic feel by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are either connected to a central server that students can access over an Ethernet protocol, or connected directly to the student's computer/laptop. These sensors use the available communication ports, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB).
Since a central server is used, students are encouraged to use the sensor readings in their different courses and, consequently, in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools. This allows sensors that are not available at a given school to be used by fetching the values from other sites that share them. Another remark is that students in more advanced years, with (theoretically) more know-how, can use the courses that have some affinity with electronics development to build new sensor modules and expand the framework further. The resulting solution is low cost, simple to develop, and flexible, allowing the same materials to be reused in several courses and bringing real-world data into students' computer work.
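A course exercise consuming the server's readings might look like the sketch below. The plain-text line format and its field order are assumptions made for illustration, not the framework's actual protocol:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Reading:
    sensor: str
    timestamp: datetime
    value: float
    unit: str

def parse_reading(line: str) -> Reading:
    """Parse one plain-text line in the assumed format:
    <sensor>,<ISO-8601 timestamp>,<value>,<unit>"""
    sensor, ts, value, unit = line.strip().split(",")
    return Reading(sensor, datetime.fromisoformat(ts), float(value), unit)

# A student could then load a morning of room-temperature readings into a
# plain list for a spreadsheet or numerical-analysis exercise:
lines = [
    "temperature,2015-05-13T10:00:00,21.5,C",
    "temperature,2015-05-13T11:00:00,22.1,C",
]
temps = [parse_reading(ln).value for ln in lines]
print(temps)  # [21.5, 22.1]
```

The same parser works whether the lines arrive over a socket from the central server or from a file dumped by a locally attached sensor.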

Relevance: 20.00%

Abstract:

Demo at the Workshop on ns-3 (WNS3 2015), 13-14 May 2015, Castelldefels, Spain.

Relevance: 20.00%

Abstract:

Data Mining (DM) methods are increasingly being used for prediction with time series data, in addition to traditional statistical approaches. This paper presents a literature review of the use of DM with time series data, focusing on short-term stock prediction, an area that has been attracting a great deal of attention from researchers in the field. The main contribution of this paper is to provide an outline of the use of DM with time series data, using mainly examples related to short-term stock prediction, which is important for a better understanding of the field. Some of the main trends and open issues are also introduced.
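A typical first step when applying DM methods to a time series is to recast one-step-ahead prediction as a supervised learning problem via a sliding window. A minimal sketch (the price series is made up for illustration):

```python
import numpy as np

def sliding_window(series, n_lags):
    """Recast a univariate series as a supervised dataset:
    X[i] holds the n_lags values that precede the target y[i]."""
    s = np.asarray(series, dtype=float)
    X = np.stack([s[i:i + n_lags] for i in range(len(s) - n_lags)])
    y = s[n_lags:]
    return X, y

# Made-up closing prices, purely for illustration
prices = [10.0, 10.2, 10.1, 10.4, 10.3, 10.6]
X, y = sliding_window(prices, 3)
print(X[0], y[0])  # the first three prices are the features for the fourth
```

Any standard DM learner (regression tree, neural network, SVM) can then be trained on `(X, y)`; the window length `n_lags` is a modeling choice.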

Relevance: 20.00%

Abstract:

New arguments are presented proving that successive (repeated) measurements have a memory and actually remember each other. Recognizing this peculiarity can essentially change the existing paradigm associated with conventional observation of the behavior of different complex systems, and leads towards the application of an intermediate model (IM). This IM can provide a very accurate fit of the measured data in terms of Prony's decomposition. This decomposition, in turn, contains a small set of fitting parameters relative to the number of initial data points, and allows comparing measured data in cases where a "best fit" model based on specific physical principles is absent. As an example, we consider two X-ray diffractometers (defined in the paper as A ("cheap") and B ("expensive")) that are used, after proper calibration, to measure the same substance (corundum, α-Al2O3). The amplitude-frequency response (AFR) obtained in the frame of Prony's decomposition can be used to compare the spectra recorded by the (A) and (B) X-ray diffractometers (XRDs) for calibration and other practical purposes. We also prove that the Fourier decomposition can be adapted to an "ideal" experiment without memory, while Prony's decomposition corresponds to a real measurement and, in this case, can be fitted in the frame of the IM. New statistical parameters describing the properties of experimental equipment (irrespective of their internal "filling") are found. The suggested approach is rather general and can be used for the calibration and comparison of different complex dynamical systems for practical purposes.
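Prony's decomposition can be sketched with the classic two-stage procedure: linear prediction for the poles, then least squares for the amplitudes. This is a generic textbook implementation run on a synthetic signal, not the paper's IM fitting code or measured XRD data:

```python
import numpy as np

def prony(x, p):
    """Classic Prony's method: fit x[n] ~ sum_k a_k * z_k**n using p exponentials.
    Stage 1 solves the linear-prediction equations for the characteristic
    polynomial, stage 2 roots it for the poles z_k, and stage 3 solves a
    Vandermonde least-squares system for the amplitudes a_k."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    # Stage 1: x[n] + c1*x[n-1] + ... + cp*x[n-p] = 0 for n = p..N-1
    A = np.column_stack([x[p - 1 - k : N - 1 - k] for k in range(p)])
    c = np.linalg.lstsq(A, -x[p:], rcond=None)[0]
    # Stage 2: poles are the roots of the prediction polynomial
    z = np.roots(np.concatenate(([1.0], c)))
    # Stage 3: amplitudes from V[n, k] = z_k**n by least squares
    V = np.vander(z, N, increasing=True).T
    a = np.linalg.lstsq(V, x, rcond=None)[0]
    return a, z

# Synthetic signal built from two known damped exponentials
n = np.arange(20)
signal = 2.0 * 0.9**n + 0.5 * (-0.7)**n
amps, poles = prony(signal, 2)
print(np.round(poles.real, 6))  # recovers the poles 0.9 and -0.7
print(np.round(amps.real, 6))   # recovers the amplitudes 2.0 and 0.5
```

The small set of `(a_k, z_k)` pairs is exactly the compact parameterization the abstract refers to: two spectra can be compared through these parameters instead of point by point.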

Relevance: 20.00%

Abstract:

Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed, due to interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources even more difficult and a challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
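The idea of combining a performance-deviation estimator with a scheduling algorithm can be sketched as a greedy, score-based placement. The estimator, weights, and power proxy below are toy assumptions made for illustration, not the paper's actual mechanism:

```python
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    cpu_free: float            # fraction of CPU still unallocated
    net_free: float            # fraction of NIC bandwidth still unallocated
    vms: list = field(default_factory=list)

def deviation_estimate(host, vm_kind):
    """Toy performance-deviation estimator (an assumption, not the paper's
    model): penalize co-locating workloads that stress the same resource."""
    same = sum(1 for k in host.vms if k == vm_kind)
    return 0.1 * same

def place(hosts, vm_kind, cpu, net, w_perf=0.7, w_power=0.3):
    """Greedy placement: among feasible hosts, minimize a weighted sum of
    estimated interference and a power proxy (preferring already-used hosts
    consolidates load so idle servers can stay asleep)."""
    feasible = [h for h in hosts if h.cpu_free >= cpu and h.net_free >= net]
    def score(h):
        power = 1.0 if not h.vms else 0.0   # waking an idle host costs energy
        return w_perf * deviation_estimate(h, vm_kind) + w_power * power
    best = min(feasible, key=score)
    best.cpu_free -= cpu
    best.net_free -= net
    best.vms.append(vm_kind)
    return best.name

hosts = [Host("h1", 1.0, 1.0), Host("h2", 1.0, 1.0)]
print(place(hosts, "cpu", 0.3, 0.1))  # first VM lands on an empty host
print(place(hosts, "net", 0.3, 0.1))  # a net-bound VM co-locates with the
                                      # CPU-bound one: no interference penalty
```

Tuning `w_perf` against `w_power` is precisely the interference-versus-energy trade-off the abstract describes.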

Relevance: 20.00%

Abstract:

Logistics, seen as an integrating perspective among business partners with the common goal of providing the customer with added value and differentiating features over competitors, contributes greatly to keeping companies competitive in today's increasingly flexible globalized market. Through good management of critical business processes, good location of materials of any kind (finished products, raw materials, or work in progress), and through transport, logistics creates temporal and differentiating utility. Indeed, logistics can play a fundamental role in providing added value by making available, on time, the services that customers need or expect. Within the theme of warehouse management, this project consisted of a study of picking operations with the aim of optimizing the picking processes in the warehouse of the logistics operator AR - Serviços de Logística, located in Ribeirão, Vila Nova de Famalicão. The initial work involved surveying how the picking process operates in the company and then comparing it against the technologies and procedures currently available on the market. Based on the results obtained, it was possible to define and implement metrics aligned with the strategic and operational goals of the logistics operator. The solutions also included improving the warehouse management system (WMS) application, re-evaluating the previously established indicators, and acquiring equipment to automate picking operations and locations. The records and information related to the core modules are stored and processed in the database supporting the application, contributing to the continuous improvement of the company's logistics procedures and of its relationship with stakeholders in the overall business strategy with the logistics operator.
Finally, it was possible to compare the results obtained in real operation against the estimates calculated and defined during the implementation and development phase.

Relevance: 20.00%

Abstract:

This dissertation was carried out in collaboration with the Monteiro, Ribas business group, and its main objectives were to assess the best available techniques (BAT) for industrial refrigeration and for emissions from storage. The first objective covered all Monteiro, Ribas facilities, while the second focused on Monteiro, Ribas, Embalagens Flexíveis, S.A. To meet these objectives, a survey of the best available techniques presented in the respective reference documents was first carried out. The techniques suited to the conditions and facilities under study were then selected, and an assessment was performed to verify the degree of implementation of the measures suggested in the BREF (Best Available Techniques Reference Document). Regarding industrial refrigeration systems, almost all the measures referenced in the respective reference document were found to be implemented. This is because the refrigeration systems in the Monteiro, Ribas industrial complex are relatively recent: they were installed in 2012 and feature a modern, highly efficient design. As for the storage of hazardous chemicals, the facility under study showed some non-conformities, since most of the techniques mentioned in the BREF are not implemented. It was therefore necessary to carry out an environmental risk assessment, using the methodology proposed by the Spanish standard UNE 150008:2008 - Environmental Risk Analysis and Assessment. Several risk scenarios were formulated and the risks for Monteiro, Ribas Embalagens Flexíveis S.A. were quantified, and they were assessed as moderate to high.
Finally, some risk prevention and mitigation measures that the facility should apply were suggested: for example, the hazardous-waste yard should be equipped with spill-containment kits (absorbent material), emergency procedures, and safety data sheets, and the fire extinguisher should be placed in a clearly visible location. For transporting hazardous waste to that yard, the use of portable spill-containment basins and spill-containment kits is advisable. As for the hazardous chemicals warehouse, it is recommended that it be redesigned taking into account the BAT presented in subchapter 5.2.3 of this dissertation.