962 results for data complexity
Abstract:
Accepted at the 13th IEEE Symposium on Embedded Systems for Real-Time Multimedia (ESTIMedia 2015), Amsterdam, Netherlands.
Abstract:
Nowadays, data centers are large energy consumers, and this consumption is expected to increase further in the coming years given the growth of cloud services. A large portion of this power consumption is due to the control of the physical parameters of the data center (such as temperature and humidity). However, these physical parameters are tightly coupled with computation, even more so in upcoming data centers, where the location of workloads can vary substantially, for example because workloads are moved within the cloud infrastructure hosted in the data center. Managing the physical and compute infrastructure of a large data center is therefore an embodiment of a Cyber-Physical System (CPS). In this paper, we describe a data collection and distribution architecture that enables gathering the physical parameters of a large data center at very high temporal and spatial resolution. We believe this is an important characteristic for building more accurate heat-flow models of the data center and, with them, finding opportunities to optimize energy consumption. A high-resolution picture of data center conditions also enables minimizing local hot-spots, performing more accurate predictive maintenance (failures in infrastructure equipment can be detected more promptly) and more accurate billing. We detail this architecture and define the structure of the underlying messaging system used to collect and distribute the data. Finally, we present the results of a preliminary study of a typical data center radio environment.
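The messaging layer such an architecture relies on could tag each measurement with its sensor identity, location and timestamp; a minimal sketch in Python (all field names are illustrative assumptions, not the paper's actual message structure):

```python
import json
import time

def make_reading(sensor_id, rack, value, unit):
    """Package one physical measurement as a JSON message.

    The field names (sensor_id, rack, ts, value, unit) are
    hypothetical; the paper defines its own message structure.
    """
    return json.dumps({
        "sensor_id": sensor_id,
        "rack": rack,        # spatial position inside the data center
        "ts": time.time(),   # measurement timestamp
        "value": value,
        "unit": unit,
    })

msg = make_reading("t-017", "row3/rack12", 24.6, "C")
decoded = json.loads(msg)
```

Messages like this could then be published on a topic per rack or per sensor type, letting consumers subscribe only to the spatial region they model.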
Abstract:
27th Euromicro Conference on Real-Time Systems (ECRTS 2015), Lund, Sweden.
Abstract:
This study stems from a concern with the knowledge, attitudes, beliefs and practices regarding the use of ionizing radiation for diagnostic purposes, and with the awareness of all the actors involved (physicians, technicians, physicists, patients and those responsible for Public Health) of the radiation levels delivered in Computed Tomography (CT) examinations. This is of particular importance for Public Health, since it is necessary to foster practices that promote, audit and guarantee radiological and dosimetric quality control in Radiology services at the national level. To this end, and in the context of studies already carried out at European Union level, namely the "European Guidelines on Quality Criteria for Computed Tomography" (1999), we propose guidelines for studies that, in a first phase, compare results with those obtained by Quality Control (QC) mechanisms and make adjustments where necessary, and, in a second phase, implement a systematic framework for the periodic assessment of radiation dose levels per CT examination and for the monitoring of these data. Accordingly, we propose a national study involving the public, private and university hospital networks, following the methodology used in previous studies in other European countries: the CT equipment in each hospital institution is selected, and information about it is gathered through questionnaires. Data will be collected on the patient, the equipment and the image-acquisition parameters, in order to identify the diagnostic reference levels (DRL) for CT in the Portuguese context. A pilot study was carried out in an EPE institution; the values obtained are not significant, nor can they be taken as predictive, given the small sample size. Nevertheless, they suggest the existence of parameters that can be changed and thereby vary the radiation dose used. As mentioned, this study is expected to provide the basis for establishing DRLs for CT in Portugal.
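National DRL surveys commonly set the reference level at the 75th percentile of the dose distribution collected per examination type; a minimal sketch of that step (the CTDIvol values below are invented for illustration, and the 75th-percentile convention is an assumption about standard survey practice, not stated in the abstract):

```python
def percentile_75(values):
    """75th percentile by linear interpolation between order statistics."""
    s = sorted(values)
    k = 0.75 * (len(s) - 1)
    lo = int(k)
    frac = k - lo
    if lo + 1 < len(s):
        return s[lo] + frac * (s[lo + 1] - s[lo])
    return s[lo]

# Hypothetical CTDIvol values (mGy) collected for one CT examination type
ctdi_vol = [8.2, 10.5, 9.1, 12.0, 7.8, 11.3, 9.9, 10.1]
drl = percentile_75(ctdi_vol)
```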
Abstract:
Advances in technology have produced ever more intricate industrial systems, such as nuclear power plants, chemical plants and petroleum platforms. Such complex plants exhibit multiple interactions among smaller units and human operators, raising the potential for disastrous failures that can propagate across subsystem boundaries. This paper analyzes industrial accident data series from the perspective of statistical physics and dynamical systems. Global data are collected from the Emergency Events Database (EM-DAT) for the period from 1903 to 2012. The statistical distributions of the number of fatalities caused by industrial accidents reveal Power Law (PL) behavior. We analyze the evolution of the PL parameters over time and observe a remarkable increase in the PL exponent in recent years. PL behavior allows prediction by extrapolation over a wide range of scales. In a complementary line of thought, we compare the data using appropriate indices and apply different visualization techniques to correlate events and to extract relationships among industrial accidents. This study contributes to a better understanding of the complexity of modern industrial accidents and their governing principles.
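The PL exponent of such heavy-tailed fatality distributions is commonly estimated by maximum likelihood (the Hill estimator); a sketch under that assumption, using a synthetic power-law sample rather than EM-DAT data:

```python
import math
import random

def hill_alpha(data, x_min):
    """MLE of the power-law exponent for a continuous power law
    p(x) ~ x^(-alpha), x >= x_min:
        alpha = 1 + n / sum(ln(x_i / x_min))."""
    tail = [x for x in data if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

# Synthetic sample with known exponent, via inverse-transform sampling:
# for F(x) = 1 - (x / x_min)^(-(alpha-1)), x = x_min * (1-U)^(-1/(alpha-1))
random.seed(0)
alpha_true = 2.5
sample = [(1 - random.random()) ** (-1 / (alpha_true - 1)) for _ in range(20000)]
alpha_hat = hill_alpha(sample, x_min=1.0)
```

With 20,000 points the estimate recovers the true exponent to within a few percent; on real accident data one would also have to choose x_min carefully.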
Abstract:
This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid, dividing the Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multi-dimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
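The entropy and Jensen–Shannon divergence used for these comparisons can be computed directly from the per-region event distributions; a minimal sketch of the classical (non-fractional) measures, with invented distributions:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence:
    JSD(p, q) = H(m) - (H(p) + H(q)) / 2, with m = (p + q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return entropy(m) - (entropy(p) + entropy(q)) / 2

# Invented event distributions for two geographic regions
p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
d = js_divergence(p, q)
```

JSD is symmetric and bounded (0 to 1 bit), which is what makes it usable as a single pairwise index feeding the clustering and scaling steps.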
Abstract:
Currently, owing to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test hardware or a product under development. Although this is an attractive solution (a low-cost, easy and fast way to carry out coursework), it has major disadvantages. Since everything is done with, and in, a computer, students are losing the "feel" for the real values of physical magnitudes. In engineering studies, for instance, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in most cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Likewise, when using spreadsheets to build graphics, instead of a random table students could use a real dataset based, for instance, on the room temperature and its variation throughout the day. In this work we present a framework with a simple interface that can be used by the different courses in which computers support the teaching/learning process, giving students a more realistic feeling through real data. The framework is based on a set of low-cost sensors for different physical magnitudes (e.g. temperature, light, wind speed) that are either connected to a central server, which students access over an Ethernet protocol, or connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB).
Since a central server is used, students are encouraged to use the sensor values in their different courses and, consequently, in different types of software: numerical analysis tools, spreadsheets, or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between schools: sensors that are not available at a given school can be used by getting the values from other places that share them. Moreover, students in the more advanced years, with (theoretically) more know-how, can use the courses related to electronics development to build new sensor boards and expand the framework further. The final solution is very attractive: low cost, simple to develop, and flexible, since the same materials serve several courses while bringing real-world data into the students' computer work.
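A course exercise might pull readings from the central sensor server roughly as follows; the JSON field names are assumptions for illustration, and the network call is stubbed with a canned reply so the sketch is self-contained:

```python
import json

def parse_readings(payload):
    """Extract (timestamp, value) pairs from the server's JSON reply.

    The layout (top-level "readings" list with "ts"/"value" fields)
    is hypothetical; the framework defines its own wire format.
    """
    doc = json.loads(payload)
    return [(r["ts"], r["value"]) for r in doc["readings"]]

# Canned reply standing in for, e.g., an HTTP GET to the sensor server
reply = ('{"sensor": "room-temp", "unit": "C", '
         '"readings": [{"ts": 0, "value": 21.5}, {"ts": 60, "value": 21.9}]}')
series = parse_readings(reply)
```

From here the pairs can be dropped into a spreadsheet, a numerical analysis tool, or any programming assignment that needs a real dataset.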
Abstract:
Demo at the Workshop on ns-3 (WNS3 2015), 13-14 May 2015, Castelldefels, Spain.
Abstract:
Data Mining (DM) methods are increasingly used for prediction with time series data, in addition to traditional statistical approaches. This paper presents a literature review of the use of DM with time series data, focusing on short-term stock prediction, an area that has been attracting a great deal of attention from researchers in the field. The main contribution of this paper is to provide an outline of the use of DM with time series data, using mainly examples related to short-term stock prediction, which is important for a better understanding of the field. Some of the main trends and open issues are also introduced.
Abstract:
Motivation: Auditing is not merely a collection of technical tasks but also a programmatic idea circulating in the organizational environment, an idea which promises a certain style of control and organizational transparency (Power, 1998, p. 122). Performance appraisal within public organizations aims to promote this organizational transparency and to foster learning and improvement for both employees and the organization. However, we suggest that behind its stated intentions there are other goals tied to performance appraisal that can be seen as components of discipline and surveillance systems designed to make the employee a "knowable, calculable and administrative object" (Miller and Rose, 1990, p. 5). Objective: In Portuguese public organizations, performance appraisal follows SIADAP (Performance Appraisal System for Public Administration). The objective of this study is to capture whether employees of public organizations (appraisers and appraisees) perceive SIADAP as an appraisal model that promotes equity, learning and improvement, or merely as an instrument of control by which they feel dominated and watched over. Method: We developed an in-depth qualitative case study using semi-structured interviews with appraisers and their subordinates in the administrative department of a university institute of Medicine. The participants' discourse was analyzed through a Foucauldian theoretical framework. Prior to the qualitative data collection, we gathered quantitative data with a questionnaire measuring employees' (dis)satisfaction with the whole appraisal system. Findings: Although some key points of Foucault's perspective were identified, his framework revealed some limitations in capturing the full complexity of performance appraisal.
Qualitative data revealed a marked tendency in the discourse of appraisers and their subordinates to regard SIADAP as an instrument that introduces political rationalities and limits employees' career progression. Contribution: This study brings a critical perspective and new insights into performance appraisal in Portuguese public administration. It is an original contribution to human resource management in public administration and, primarily, to the auditing of performance appraisal systems.
Abstract:
Indiscipline and violence in the school context have been a growing social concern and a topic widely discussed from several perspectives. Given their complexity and the breadth of their implications, notably school failure and the associated psychosocial and individual consequences, behavioural problems increasingly demand an effective response from the school community, whose daily functioning they disrupt. The accumulation of disciplinary proceedings shows the ineffectiveness of punitive systems, so an effective alternative is needed. In this work we present a preventive behavioural intervention programme that has proven effective in several school contexts. Proactive and scientifically grounded, the PBIS (Positive Behavioral Interventions and Supports) system draws on principles of Positive Psychology and on empirical data, and offers an operational framework adaptable to any school. The case study presented describes tier-two and tier-three interventions which, even without the implementation of the base tier one, showed very positive results; there is thus evidence that introducing these systems would benefit schools in the daily, constant and worrying fight against indiscipline.
Abstract:
Internship report submitted in fulfilment of the requirements for the Master's degree in Novos Media e Práticas Web (New Media and Web Practices)
Abstract:
Twenty-four whole blood and serum samples were drawn from an eight-year-old heart transplant child during a 36-month follow-up. EBV serology was positive for VCA-IgM and IgG, and negative for EBNA-IgG, at the age of five, when the child presented with signs and symptoms suggestive of acute infectious mononucleosis. After 14 months, the serological parameters were positive VCA-IgG and EBNA-IgG, and negative VCA-IgM. This serological pattern has been maintained since then, even during episodes suggestive of EBV reactivation. PCR amplified a specific DNA fragment from the EBV gp220 (detection limit of 100 viral copies). All twenty-four whole blood samples yielded positive results by PCR, while 12 out of 24 serum samples were positive. We aimed to analyze whether detection of EBV-DNA in serum samples by PCR was associated with overt disease, as indicated by the need for antiviral treatment and hospitalization. Statistical analysis showed agreement between the two parameters, as evidenced by the Kappa test (value 0.750; p < 0.001). We conclude that detection of EBV-DNA in serum samples of immunosuppressed patients might be used as a laboratory marker of active EBV disease when Real-Time PCR or another quantitative method is not available.
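The agreement reported via the Kappa test can be reproduced from a 2x2 contingency table of the two binary indicators; a sketch with invented counts (not the study's data):

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 contingency table
        [[a, b],    rows: indicator 1 (+ / -)
         [c, d]]    cols: indicator 2 (+ / -)
    kappa = (po - pe) / (1 - pe), where po is observed agreement
    and pe is agreement expected by chance from the marginals."""
    n = a + b + c + d
    po = (a + d) / n
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / (n * n)
    return (po - pe) / (1 - pe)

# Invented counts: 10 both-positive, 12 both-negative, 2 disagreements
kappa = cohens_kappa(10, 1, 1, 12)
```

Values above about 0.6 are conventionally read as substantial agreement, which is how a kappa of 0.750 supports using serum EBV-DNA as a disease marker.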
Abstract:
Dissertation submitted for the degree of Master in Engenharia Física (Physics Engineering)
Abstract:
Forty Echinococcus isolates from sheep and cattle in Southern Brazil were genetically analysed in order to obtain further data on the presence of different taxa of the Echinococcus granulosus complex. Differentiation was done using a PCR technique and sequencing of the mitochondrial cytochrome c oxidase subunit 1 (CO1) gene. Most samples (38) could be allocated to the sheep strain (G1) of E. granulosus, while two samples belonged to E. ortleppi, previously known as the cattle strain (G5) of E. granulosus. Because the latter taxon has a shorter prepatent period in dogs, these records have important implications for the design of control measures in this endemic region.