91 results for data capture


Relevance:

20.00%

Publisher:

Abstract:

Many organizations around the world operate facilities of this kind; in Portugal, one example is Portugal Telecom, which recently inaugurated its Data Center in Covilhã. Developing a Data Center therefore demands a very careful design which, among other aspects, must ensure the security of the information and the safety of the facilities themselves, particularly with regard to fire safety.

Relevance:

20.00%

Publisher:

Abstract:

In-network storage of data in wireless sensor networks helps to reduce communications inside the network and favours data aggregation. In this paper, we consider the use of n out of m codes and data dispersal in combination with in-network storage. In particular, we provide an abstract model of in-network storage to show how n out of m codes can be used, and we discuss how this can be achieved in five case studies. We also define a model aimed at evaluating the probability of correct data encoding and decoding, and we exploit this model and simulations to show how, in the case studies, the parameters of the n out of m codes and of the network should be configured in order to achieve correct data encoding and decoding with high probability.
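A minimal sketch of the kind of reliability computation such a model involves, under a simplifying assumption not stated in the abstract: each of the m dispersed fragments survives independently with probability p, and the object is decodable whenever at least n fragments remain.

```python
# Minimal sketch (not the paper's exact model): probability that an object encoded
# with an (n, m) code can be decoded, assuming each of the m fragments survives
# independently with probability p (any n of the m fragments suffice).
from math import comb

def decoding_probability(n: int, m: int, p: float) -> float:
    """P(at least n out of m independent fragments are available)."""
    return sum(comb(m, k) * p**k * (1 - p)**(m - k) for k in range(n, m + 1))

if __name__ == "__main__":
    # Example: 4-out-of-8 code, each storage node available with probability 0.9.
    print(f"P(decodable) = {decoding_probability(4, 8, 0.9):.4f}")
```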

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, data centers are large energy consumers, and this trend is expected to increase in the coming years given the growth of cloud services. A large portion of this power consumption is due to the control of the physical parameters of the data center (such as temperature and humidity). However, these physical parameters are tightly coupled with computations, and even more so in upcoming data centers, where the location of workloads can vary substantially due, for example, to workloads being moved within the cloud infrastructure hosted in the data center. Therefore, managing the physical and compute infrastructure of a large data center is an embodiment of a Cyber-Physical System (CPS). In this paper, we describe a data collection and distribution architecture that enables gathering physical parameters of a large data center at a very high temporal and spatial resolution of the sensor measurements. We believe this is an important characteristic to enable more accurate heat-flow models of the data center and, with them, to find opportunities to optimize energy consumption. Having a high-resolution picture of the data center conditions also enables minimizing local hot spots, performing more accurate predictive maintenance (failures in infrastructure equipment can be detected more promptly), and producing more accurate billing. We detail this architecture and define the structure of the underlying messaging system used to collect and distribute the data. Finally, we show the results of a preliminary study of a typical data center radio environment.
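The abstract does not publish the message schema of the collection and distribution layer, so the sketch below is purely illustrative: the field names (rack, slot, metric, value, timestamp) are assumptions meant to show how readings could be tagged with spatial and temporal information before being handed to the messaging system.

```python
# Illustrative sketch only: assumed message structure for one sensor reading.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    rack: str         # physical location: rack identifier (spatial resolution)
    slot: int         # position within the rack
    metric: str       # e.g. "temperature_C" or "humidity_pct"
    value: float
    timestamp: float  # seconds since epoch (temporal resolution)

def encode(reading: SensorReading) -> bytes:
    """Serialize a reading for the collection/distribution messaging layer."""
    return json.dumps(asdict(reading)).encode("utf-8")

if __name__ == "__main__":
    r = SensorReading(rack="R12", slot=7, metric="temperature_C",
                      value=24.6, timestamp=time.time())
    print(encode(r))
```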

Relevance:

20.00%

Publisher:

Abstract:

This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid, dividing the Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare the real-world data. Hierarchical clustering and multidimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
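For reference, the two classical measures named above can be computed as follows on histogram-like event counts; this is a generic illustration with toy data, not the paper's generalized (fractional) variants.

```python
# Shannon entropy and Jensen-Shannon divergence of normalized event-count histograms.
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def kl_divergence(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

def jensen_shannon(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Toy example: event counts of two geographic grid cells over the study period.
cell_a = [120, 30, 5, 1]
cell_b = [80, 60, 10, 2]
print(shannon_entropy(cell_a), jensen_shannon(cell_a, cell_b))
```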

Relevance:

20.00%

Publisher:

Abstract:

Complex industrial plants exhibit multiple interactions among their smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries and cause a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power-law (PL) distributions. For earlier years, the data reveal a double PL behavior, while, for more recent time periods, a single PL fits the experimental data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data, and multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, with the advantage of yielding a single parameter to express relationships between the data. The classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
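The abstract does not detail its PL fitting procedure, so the snippet below only illustrates one common way to estimate a power-law tail exponent, by least squares on the empirical complementary CDF in log-log coordinates, using synthetic data in place of the accident series.

```python
# Sketch of a common power-law (PL) tail fit, P(X >= x) ~ x^(-alpha), in log-log space.
import numpy as np

def fit_power_law_tail(samples):
    x = np.sort(np.asarray(samples, dtype=float))
    # Empirical complementary CDF: P(X >= x_i)
    ccdf = 1.0 - np.arange(len(x)) / len(x)
    slope, intercept = np.polyfit(np.log(x), np.log(ccdf), 1)
    return -slope  # estimated PL exponent alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic Pareto-distributed "fatality counts" with true exponent 1.5.
    data = rng.pareto(1.5, 5000) + 1.0
    print(f"estimated alpha = {fit_power_law_tail(data):.2f}")
```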

Relevance:

20.00%

Publisher:

Abstract:

Currently, due to the widespread use of computers and the Internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to prove theoretical results or to test a product or piece of hardware under development. Although this is an interesting solution (a low-cost, easy and fast way to carry out some course work), it also has major disadvantages. As everything is currently being done with or in a computer, students are losing the “feel” for the real values of the magnitudes involved. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, but real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that can be used by different courses in which computers are part of the teaching/learning process, giving students a more realistic feel by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes (e.g., temperature, light, wind speed), which are either connected to a central server that students can access over Ethernet or connected directly to the student's computer/laptop. These sensors use the available communication ports, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use the sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the Internet, these tools can also be shared between different schools. This allows sensors that are not available at a given school to be used by fetching the values from other places that share them. Another remark is that students in more advanced years, with (theoretically) more know-how, can use the courses related to electronics development to build new sensor modules and expand the framework further. The final solution is very interesting: low cost, simple to develop, and flexible in its resources, since the same materials can be used in several courses, bringing real-world data into students' computer work.
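The abstract does not specify the wire protocol between student computers and the central sensor server, so the sketch below is hypothetical: the host name, port and plain-text request format are assumptions meant only to show how a course exercise could fetch a real sensor value over Ethernet.

```python
# Hypothetical client sketch: fetch the latest value of one sensor from an assumed
# central server using a simple plain-text request over TCP.
import socket

def read_sensor(host: str, port: int, sensor_id: str) -> float:
    """Request the latest value of one sensor from the (assumed) central server."""
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall(f"GET {sensor_id}\n".encode("ascii"))
        reply = s.recv(1024).decode("ascii").strip()
    return float(reply)

# Example use inside a course exercise (hypothetical address and sensor name):
# temperature = read_sensor("sensors.school.example", 5000, "room_temperature")
```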

Relevance:

20.00%

Publisher:

Abstract:

Demo in Workshop on ns-3 (WNS3 2015), 13-14 May 2015, Castelldefels, Spain.

Relevance:

20.00%

Publisher:

Abstract:

Data Mining (DM) methods are increasingly being used for prediction with time series data, in addition to traditional statistical approaches. This paper presents a literature review of the use of DM with time series data, focusing on short-term stock prediction, an area that has been attracting a great deal of attention from researchers in the field. The main contribution of this paper is to provide an outline of the use of DM with time series data, using mainly examples related to short-term stock prediction. This is important for a better understanding of the field. Some of the main trends and open issues are also introduced.
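As a generic illustration of the kind of preprocessing common in DM approaches to short-term prediction (not a method from the surveyed literature), a price series is often turned into lagged feature vectors before being handed to a learner:

```python
# Build a supervised dataset from a univariate series using a sliding window of lags.
import numpy as np

def make_lagged_dataset(prices, n_lags=5):
    """Return (X, y): each row of X holds the previous n_lags prices, y the next price."""
    prices = np.asarray(prices, dtype=float)
    X = np.array([prices[i:i + n_lags] for i in range(len(prices) - n_lags)])
    y = prices[n_lags:]
    return X, y

if __name__ == "__main__":
    series = np.cumsum(np.random.default_rng(1).normal(size=100)) + 100  # toy "prices"
    X, y = make_lagged_dataset(series, n_lags=5)
    print(X.shape, y.shape)  # (95, 5) (95,)
```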

Relevance:

20.00%

Publisher:

Abstract:

Motivation: Auditing is not merely a collection of technical tasks but also a programmatic idea circulating in the organizational environment, an idea which promises a certain style of control and organizational transparency (Power, 1998, p. 122). Performance appraisal within public organizations aims to promote this organizational transparency and to promote learning and improvement processes, both for employees and for the organization. However, we suggest that behind its stated intentions there are other goals tied to performance appraisal that could be seen as components of discipline and surveillance systems that make the employee a “knowable, calculable and administrative object” (Miller and Rose, 1990, p. 5). Objective: In Portuguese public organizations, performance appraisal follows the SIADAP (Performance Appraisal System for Public Administration). The objective of this study is to capture whether employees of public organizations (appraisers and appraisees) perceive the performance appraisal system (SIADAP) as an appraisal model that promotes equity, learning and improvement, or merely as an instrument of control by which they feel dominated and watched over. Method: We developed an in-depth qualitative case study using semi-structured interviews with appraisers and their subordinates in the administrative department of a university institute of Medicine. The discourse of the participants was analyzed theoretically based on a Foucauldian framework. Prior to the qualitative data collection, we collected quantitative data with a questionnaire to measure employees' (dis)satisfaction with the whole appraisal system. Findings: Although some key points of Foucault's perspective were identified, the framework revealed some limitations in capturing the full complexity of performance appraisal. The qualitative data revealed a significant tendency in the discourses of appraisers and their subordinates to consider SIADAP an instrument that aims to introduce political rationalities and limits to employees' career promotions. Contribution: This study brings a critical perspective and new insights on performance appraisal in Portuguese public administration. It is an original contribution to the management of human resources in public administration and, primarily, to the auditing of performance appraisal systems.

Relevance:

20.00%

Publisher:

Abstract:

Every year forest fires consume large areas, being a major concern in many countries, such as Australia, the United States and the Mediterranean Basin European countries (e.g., Portugal, Spain, Italy and Greece). Understanding the patterns of such events, in terms of size and spatiotemporal distribution, may help to take measures beforehand in view of possible hazards and to decide on strategies for fire prevention, detection and suppression. Traditional statistical tools have been used to study forest fires. Nevertheless, those tools might not be able to capture the main features of the fires' complex dynamics and to model fire behaviour [1]. Forest fire size-frequency distributions unveil long-range correlations and long-memory characteristics, which are typical of fractional-order systems [2]. Those complex correlations are characterized by self-similarity and the absence of a characteristic length scale, meaning that forest fires exhibit power-law (PL) behaviour. Forest fires have also been shown to exhibit time-clustering phenomena, with timescales of the order of a few days [3]. In this paper, we study forest fires from the perspective of dynamical systems and fractional calculus (FC). Public-domain forest fire catalogues, containing data on events that occurred in Portugal in the period from 1980 to 2011, are considered. The data are analysed on an annual basis, modelling the occurrences as sequences of Dirac impulses. The frequency spectra of such signals are determined using Fourier transforms and approximated through PL trendlines. The PL parameters are then used to unveil the fractional-order dynamics characteristics of the data. To complement the analysis, correlation indices are used to compare and find possible relationships among the data. It is shown that this approach can be useful for exposing hidden patterns not captured by traditional tools.
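A sketch of that processing pipeline under assumed details (the dates below are synthetic; the paper uses real catalogue data): yearly occurrences are modelled as a train of unit Dirac impulses on a daily grid, the amplitude spectrum is obtained with an FFT, and a PL trendline |F(f)| ~ c * f^q is fitted in log-log coordinates.

```python
# Impulse-train model of one year of occurrences, FFT spectrum, and PL trendline fit.
import numpy as np

days_in_year = 365
signal = np.zeros(days_in_year)
fire_days = np.random.default_rng(7).choice(days_in_year, size=40, replace=False)
signal[fire_days] = 1.0  # one unit impulse per occurrence day

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(days_in_year, d=1.0)

# Fit log|F(f)| = log(c) + q*log(f), skipping the DC component (f = 0).
mask = freqs > 0
q, log_c = np.polyfit(np.log(freqs[mask]), np.log(spectrum[mask] + 1e-12), 1)
print(f"PL exponent q = {q:.3f}")
```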

Relevance:

20.00%

Publisher:

Abstract:

Residents tend to have high expectations about the benefits of hosting a mega-event. So, it was not surprising that the nomination of Guimarães, Portugal, as the 2012 European Capital of Culture (2012 ECOC) raised great expectations in the local community regarding its socio-economic and cultural benefits. The present research was designed to examine Guimarães residents' perceptions of the impacts of hosting the 2012 ECOC at two different points in time, pre- and post-event, trying to capture the evolution of the residents' evaluation of those impacts. To collect the data, two surveys were administered to Guimarães residents, one in the pre-event phase, in 2011, and another in the post-event phase, in 2013. This approach is rarely applied to Portuguese data, and it is the first time it has been applied to a Portuguese European Capital of Culture. After a factor analysis, the results of t-tests indicate that there were significant differences (p<0.05) between the pre- and post-2012 ECOC samples on two positive impact factors (Community benefits and Residents' benefits) and one negative impact factor (Economic, social and environmental costs). Respondents also showed a negative perception of the impacts in all dimensions, except Changes in habits of Guimarães residents.

Relevance:

20.00%

Publisher:

Abstract:

Context: Some chemicals used in consumer products or manufacturing (e.g., plastics, pesticides) have estrogenic activities; these xenoestrogens (XEs) may affect immune responses and have recently emerged as new risk factors for obesity and cardiovascular disease. However, the extent and health impact of chronic exposure of the general population to XEs are still unknown. Objective: The objective of the study was to investigate the levels of XEs in plasma and adipose tissue (AT) depots in a sample of pre- and postmenopausal obese women undergoing bariatric surgery, and their cardiometabolic impact in the obese state. Design and Participants: We evaluated XE levels in plasma and in visceral and subcutaneous AT samples of Portuguese obese (body mass index ≥ 35 kg/m²) women undergoing bariatric surgery. Association with metabolic parameters and 10-year cardiovascular disease risk was assessed according to menopausal status (73 pre- and 48 postmenopausal). Levels of XEs were determined by gas chromatography with electron-capture detection. Anthropometric and biochemical data were collected prior to surgery. Adipocyte size was determined on tissue sections obtained during surgery. Results: Our data show that XEs are pervasive in this obese population. The distribution of individual XEs and the concentration of total XEs differed between plasma, visceral AT and subcutaneous AT, and the pattern of accumulation differed between pre- and postmenopausal women. Significant associations between XE levels and metabolic and inflammatory parameters were found. In premenopausal women, XEs in plasma seem to be a predictor of 10-year cardiovascular disease risk. Conclusions: Our findings point toward a different distribution of XEs between plasma and AT in pre- and postmenopausal women, and reveal an association between XEs and the development of metabolic abnormalities in obese premenopausal women.

Relevance:

20.00%

Publisher:

Abstract:

New arguments proving that successive (repeated) measurements have a memory and actually remember each other are presented. The recognition of this peculiarity can essentially change the existing paradigm associated with conventional observation of the behavior of different complex systems and lead towards the application of an intermediate model (IM). This IM can provide a very accurate fit of the measured data in terms of Prony's decomposition. This decomposition, in turn, contains a small set of fitting parameters relative to the number of initial data points and allows comparing measured data in cases where a “best fit” model based on specific physical principles is absent. As an example, we consider two X-ray diffractometers (defined in the paper as A (“cheap”) and B (“expensive”)) that are used, after proper calibration, for measuring the same substance (corundum α-Al2O3). The amplitude-frequency response (AFR) obtained in the frame of Prony's decomposition can be used for comparison of the spectra recorded by the (A) and (B) X-ray diffractometers (XRDs) for calibration and other practical purposes. We also prove that the Fourier decomposition can be adapted to an “ideal” experiment without memory, while Prony's decomposition corresponds to real measurements and, in this case, can be fitted in the frame of the IM. New statistical parameters describing the properties of experimental equipment (irrespective of its internal “filling”) are found. The suggested approach is rather general and can be used for the calibration and comparison of different complex dynamical systems for practical purposes.
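A rough sketch of classical Prony decomposition (fitting a sum of p damped complex exponentials to uniformly sampled data) is given below; the paper's intermediate model and its amplitude-frequency response are more elaborate, so this only illustrates the basic idea on a toy signal.

```python
# Classical Prony fit: linear prediction, characteristic-polynomial roots, amplitudes.
import numpy as np

def prony(x, p):
    """Return exponents z_k and amplitudes h_k such that x[n] ~ sum_k h_k * z_k**n."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    # 1) Linear prediction: x[n] = -a_1 x[n-1] - ... - a_p x[n-p]  (least squares).
    A = np.column_stack([x[p - 1 - i:N - 1 - i] for i in range(p)])
    a = np.linalg.lstsq(A, -x[p:], rcond=None)[0]
    # 2) Roots of the characteristic polynomial give the exponentials z_k.
    z = np.roots(np.concatenate(([1.0], a)))
    # 3) Amplitudes h_k from a second least-squares (Vandermonde) problem.
    V = np.vander(z, N, increasing=True).T  # V[n, k] = z_k**n
    h = np.linalg.lstsq(V, x, rcond=None)[0]
    return z, h

if __name__ == "__main__":
    n = np.arange(64)
    true = 1.5 * 0.95**n * np.cos(0.4 * n)  # toy damped oscillation (two exponentials)
    z, h = prony(true, p=2)
    approx = (np.vander(z, len(n), increasing=True).T @ h).real
    print("max abs error:", np.max(np.abs(true - approx)))
```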

Relevance:

20.00%

Publisher:

Abstract:

Eruca sativa (rocket salad) is intensely consumed all over the world, and this work was undertaken to evaluate the antioxidant status and the environmental contamination (positive and negative nutritional contributions) of the leaves and stems of this vegetable. The antioxidant capacity of rocket salad was assessed by means of optical methods, namely the total phenolic content (TPC), the reducing power assay and the DPPH radical scavenging activity. The extent of the environmental contamination was assessed through the quantification of thirteen organochlorine pesticides (OCPs) using gas chromatography coupled with electron-capture detection (GC-ECD), with compound confirmation by gas chromatography tandem mass spectrometry (GC-MS/MS). The OCP residues were extracted using the Quick, Easy, Cheap, Effective, Rugged and Safe (QuEChERS) methodology. The results demonstrated that leaves presented more antioxidant activity than stems, with leaves containing six times more polyphenolic compounds than stems. Regarding OCP occurrence, the average recoveries obtained at the three levels tested (40, 60 and 80 µg kg−1) ranged from 55% to 149%, with a relative standard deviation of 11% (except for hexachlorobenzene). Three vegetable samples were collected from supermarkets and analysed following this procedure. The data indicated that only one sample reached 16.21 µg kg−1 of β-hexachlorocyclohexane, confirmed by GC-MS/MS, showing QuEChERS to be a good choice for OCP extraction. Furthermore, consuming the leaves guarantees higher levels of antioxidants than the stems.

Relevance:

20.00%

Publisher:

Abstract:

Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as the optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed, due to interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources an even more difficult and challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
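The abstract does not publish the estimator or the scheduling algorithm, so the sketch below is an assumed simplification only: candidate hosts are scored by a weighted sum of a toy interference estimate (same-type co-hosting) and their current power draw, to illustrate how interference- and power-awareness could be combined in a placement decision.

```python
# Illustrative interference- and power-aware placement sketch (assumed scoring rule).
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    cpu_intensive_vms: int   # co-hosted CPU-bound VMs
    net_intensive_vms: int   # co-hosted network-bound VMs
    power_watts: float       # current power draw

def estimated_deviation(host: Host, vm_type: str) -> float:
    """Toy performance-deviation estimate: interference grows with same-type co-hosting."""
    same = host.cpu_intensive_vms if vm_type == "cpu" else host.net_intensive_vms
    return 0.05 * same  # assumed 5% slowdown per same-type neighbour

def place_vm(hosts, vm_type: str, w_perf: float = 0.7, w_power: float = 0.3) -> Host:
    """Pick the host minimising a weighted sum of interference and power cost."""
    def score(h: Host) -> float:
        return w_perf * estimated_deviation(h, vm_type) + w_power * (h.power_watts / 1000)
    return min(hosts, key=score)

hosts = [Host("h1", 4, 1, 350.0), Host("h2", 1, 3, 280.0)]
print(place_vm(hosts, "cpu").name)  # -> "h2" under these toy numbers
```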