955 results for Application level


Relevance:

60.00%

Abstract:

Cloud data centres are critical business infrastructures and among the fastest growing service providers. Detecting anomalies in Cloud data centre operation is vital. Given the vast complexity of the data centre system software stack, applications and workloads, anomaly detection is a challenging endeavour. Current tools for detecting anomalies often use machine learning techniques, application instance behaviours or system metrics distribution, which are complex to implement in Cloud computing environments as they require training, access to application-level data and complex processing. This paper presents LADT, a lightweight anomaly detection tool for Cloud data centres that uses rigorous correlation of system metrics, implemented by an efficient correlation algorithm without the need for training or complex infrastructure setup. LADT is based on the hypothesis that, in an anomaly-free system, metrics from data centre host nodes and virtual machines (VMs) are strongly correlated. An anomaly is detected whenever correlation drops below a threshold value. We demonstrate and evaluate LADT in a Cloud environment, showing that the host node's I/O operations per second (IOPS) are strongly correlated with the aggregated VM IOPS, but that this correlation vanishes when an application stresses the disk, indicating a node-level anomaly.
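The correlation test at the heart of LADT can be illustrated with a short sketch. The Pearson formula is standard, but the sample window and the 0.8 threshold below are illustrative assumptions, not values taken from the paper:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

def node_anomalous(host_iops, vm_iops_total, threshold=0.8):
    """Flag a node-level anomaly when host IOPS stop tracking the IOPS
    aggregated over the node's VMs (LADT's correlation hypothesis)."""
    return pearson(host_iops, vm_iops_total) < threshold

# Healthy node: host I/O mirrors the VMs' aggregate I/O.
healthy = node_anomalous([100, 120, 140, 160], [95, 118, 139, 158])   # False
# Stressed disk: host I/O decouples from the VMs' aggregate I/O.
stressed = node_anomalous([100, 120, 140, 160], [150, 90, 160, 80])   # True
```

In a real deployment the series would be sliding windows of recent samples, and the threshold would be tuned per metric.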

Relevance:

60.00%

Abstract:

In this paper we address the real-time capabilities of P-NET, a multi-master fieldbus standard based on a virtual token-passing scheme. We show how P-NET's medium access control (MAC) protocol is able to guarantee a bounded access time for message requests. We then propose a model for implementing fixed priority-based dispatching mechanisms at each master's application level. In this way, we diminish the impact of the first-come-first-served (FCFS) policy that P-NET uses at the data link layer. The proposed model raises several issues well known within the real-time systems community: message release jitter; pre-run-time schedulability analysis in non-pre-emptive contexts; and non-independence of tasks at the application level. We identify these issues in the proposed model and show how results available for priority-based task dispatching can be adapted to encompass priority-based message dispatching in P-NET networks.
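The application-level dispatching idea can be sketched as a priority queue that is consulted whenever the master holds the virtual token, so that only the highest-priority pending request ever reaches the FCFS data-link queue. This is an illustrative sketch, not the paper's formal model; the class and method names are invented for the example:

```python
import heapq
import itertools

class PriorityDispatcher:
    """Application-level dispatcher placed above P-NET's FCFS data link
    layer: outgoing message requests are queued by fixed priority, and
    only the most urgent one is released when the token arrives."""

    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # FIFO tie-break within a priority

    def submit(self, priority, message):
        """Queue a message request; lower number = higher priority."""
        heapq.heappush(self._heap, (priority, next(self._seq), message))

    def on_token(self):
        """Called when this master holds the virtual token; returns the
        highest-priority pending message, or None if idle."""
        if self._heap:
            return heapq.heappop(self._heap)[2]
        return None
```

The non-pre-emptive blocking the paper analyzes shows up here naturally: once `on_token` has released a message, a later high-priority submission must wait for the next token visit.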

Relevance:

60.00%

Abstract:

Collaborative software is usually thought of as providing audio-video conferencing services, application/desktop sharing, and access to large content repositories. However, mobile device usage is characterized by users carrying out short and intermittent tasks, sometimes referred to as 'micro-tasking'. Micro-collaborations are not well supported by traditional groupware systems, and the work in this paper seeks to address this. Mico is a system that provides a set of application-level peer-to-peer services for the ad-hoc formation and facilitation of collaborative groups across a diverse mobile device domain. The system builds on the Java ME bindings of the JXTA P2P protocols, and is designed to use the lowest common denominators required for collaboration across varying degrees of mobile device capability. To demonstrate how our platform facilitates application development, we built an exemplary set of demonstration applications and include code examples to illustrate the ease and speed afforded when developing collaborative software with Mico.

Relevance:

60.00%

Abstract:

Body Sensor Networks (BSNs) have recently been introduced for the remote monitoring of human activities in a broad range of application domains, such as health care, emergency management, fitness and behaviour surveillance. BSNs can be deployed in a community of people and can generate large amounts of contextual data that require a scalable approach to storage, processing and analysis. Cloud computing can provide a flexible storage and processing infrastructure to perform both online and offline analysis of the data streams generated in BSNs. This paper proposes BodyCloud, a SaaS approach for community BSNs that supports the development and deployment of Cloud-assisted BSN applications. BodyCloud is a multi-tier, application-level architecture that integrates a Cloud computing platform with BSN data stream middleware. BodyCloud provides programming abstractions that allow the rapid development of community BSN applications. This work describes the general architecture of the proposed approach and presents a case study on the real-time monitoring and analysis of the cardiac data streams of many individuals.

Relevance:

60.00%

Abstract:

The increasing number of attacks on computer networks has been addressed by adding resources directly to the active routing equipment of these networks. In this context, firewalls have become essential elements in controlling the flow of packets into and out of a network. With the advent of intrusion detection systems (IDS), efforts have been made to incorporate IDS packet filtering into the standard filtering of traditional firewalls. This integration adds IDS functions (such as signature-based filtering, until then a passive element) to the functions already present in the firewall. Despite the efficiency this incorporation brings in blocking attacks with known signatures, filtering at the application level naturally delays the analyzed packets, and the machine resources it demands can reduce the machine's capacity to filter the remaining packets. This work presents models that address this problem by re-routing packets for analysis by a sub-network with specific filters. The proposed implementation of this model aims to reduce the performance problem and to open space for scenarios in which other non-conventional filtering solutions (spam blocking, P2P traffic control/blocking, etc.) can be inserted into the filtering sub-network, without overloading the main firewall of a corporate network.
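The re-routing model can be caricatured in a few lines: the main firewall keeps a light fast path and hands only selected traffic classes to dedicated filter nodes in the sub-network. The port numbers and filter-node names below are illustrative assumptions, not part of the work's design:

```python
# Traffic classes that need deep (application-level) inspection are
# diverted to specialized nodes in the filtering sub-network; everything
# else stays on the main firewall's fast path. Ports/names are examples.
DEEP_INSPECTION_PORTS = {
    25: "spam-filter-node",      # SMTP -> spam blocking
    6881: "p2p-filter-node",     # BitTorrent -> P2P traffic control
}

def route(packet):
    """Return the next hop for a packet described by a dict with a
    'dst_port' key."""
    filter_node = DEEP_INSPECTION_PORTS.get(packet["dst_port"])
    return filter_node if filter_node else "main-firewall-fast-path"
```

The point of the design is visible in the sketch: adding a new unconventional filter means adding one entry to the dispatch table, not more load on the main firewall.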

Relevance:

60.00%

Abstract:

Introduction: Rheumatic fever (RF), a systemic illness that may occur following Group A beta-hemolytic streptococcal (GABHS) pharyngitis in children, is a major problem in countries with limited resources. Because of its long track record and low cost, an injection of benzathine penicillin G (BPG) suspension every 3 or 4 weeks has been used as secondary prophylaxis. Despite its excellent in vitro efficacy, the inability of BPG to eradicate GABHS has been frequently reported. Areas covered: This work reviews the possible causes of failure, as well as the inconvenience of the current prophylactic treatment of acute RF, and suggests a new pharmacotherapeutic system that could replace the current one. Expert opinion: RF is a major problem mainly affecting countries with limited resources and could be considered a neglected disease. The dose regimen using BPG suspension results in failures, which could be avoided by the use of nanocarrier-based systems. To meet this ultimate goal, the research should move from the laboratory scale to industrial and clinical application. This research should aim to produce a pharmaceutical dosage form that will be commercially available, consumed by and affordable for patients. However, health, environmental and socioeconomic hazards should be considered.

Relevance:

60.00%

Abstract:

This study aimed to verify whether chicks from eggs injected with ascorbic acid and subjected to thermal stress would have higher immunity than chicks incubated at thermoneutrality without injection of ascorbic acid. The parameters evaluated were the effect of temperature on oxygen saturation of hemoglobin, glucose, erythrocyte count, hematocrit and hemoglobin values of newly hatched male chicks, hatched from eggs injected with ascorbic acid (AA) and subjected to thermal stress during incubation. The experimental design was completely randomized in a 5 (application levels of ascorbic acid) x 2 (incubation temperatures) factorial scheme. The data were subjected to analysis of variance using the General Linear Model (GLM) procedure of SAS®. For erythrocyte count, hematocrit and hemoglobin values, there was a significant interaction (p < 0.05) between in-egg treatments and incubation temperatures. Analyzing the interactions for these parameters, it was observed that the application of 0% ascorbic acid in the egg minimized the effect of heat stress when compared with the treatment without injection. The application of ascorbic acid levels in eggs incubated under heat stress failed to maximize the immunity of newly hatched chicks. It is assumed that the increased liquid in the amniotic fluid of the embryos injected with water favored a lower heat conductance in these embryos, thus helping their development with respect to immunity. Considering that hemoglobin is related to the transport of gases, these data suggest that increasing the concentration of the inoculated AA solution may influence the respiratory rates of eggs.

Relevance:

60.00%

Abstract:

This study aimed to verify whether chicks from eggs injected with ascorbic acid and subjected to heat stress would show changes in acid-base balance compared to chicks incubated at thermoneutrality without injection of ascorbic acid. The parameters evaluated were blood partial pressures of carbon dioxide and oxygen, base excess, total carbon dioxide, concentrations of sodium, potassium, ionized calcium and bicarbonate, and pH of newly hatched male chicks, hatched from eggs injected with ascorbic acid (AA) and subjected to heat stress during incubation. The experimental design was completely randomized in a 5 (application levels of ascorbic acid) x 2 (incubation temperatures) factorial scheme. The data were subjected to analysis of variance using the General Linear Model (GLM) procedure of SAS®. For blood pH, a significant interaction (p < 0.05) was observed between in-egg treatments and incubation temperatures. For the other parameters, no significant effects of AA level or incubation temperature were found. Unfolding the interaction for pH, it was observed that chicks from eggs injected with 6% ascorbic acid and subjected to heat stress during incubation had a higher pH value than those incubated at the thermoneutral temperature (p < 0.05). Therefore, it is suggested that incubation of eggs at high temperature (39°C) can alter the metabolic rate of these embryos.

Relevance:

60.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Abstract:

Background: The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make it feasible we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular, ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however, it lacks support for representing clinical and socio-demographic information.

Results: We have implemented an extension of Chado - the Clinical Module - to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: a data level, to store the data; a semantic level, to integrate and standardize the data by the use of ontologies; an application level, to manage clinical databases, ontologies and the data integration process; and a web interface level, to allow interaction between the user and the system. The Clinical Module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database; clinical and demographic data, as well as biomaterial data, were obtained from patients with tumors of the head and neck. We implemented the IPTrans tool, a complete environment for data migration, which comprises: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as other applications.

Conclusions: Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information from different "omics" technologies with patients' clinical and socio-demographic data. Such a framework should offer flexibility, compression and robustness. The experiments performed on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
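The Entity-Attribute-Value layout underlying the Clinical Module can be sketched with an in-memory SQLite database. The table and column names here are illustrative only, not Chado's actual schema; the point is that new clinical attributes become rows, not new columns:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (id INTEGER PRIMARY KEY);
-- In a real EAV design each attribute would reference an ontology term.
CREATE TABLE attribute (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE patient_value (
    patient_id   INTEGER REFERENCES patient(id),
    attribute_id INTEGER REFERENCES attribute(id),
    value        TEXT
);
""")
conn.execute("INSERT INTO patient (id) VALUES (1)")
conn.executemany("INSERT INTO attribute (id, name) VALUES (?, ?)",
                 [(1, "tumor_site"), (2, "smoking_status")])
conn.executemany("INSERT INTO patient_value VALUES (?, ?, ?)",
                 [(1, 1, "larynx"), (1, 2, "former smoker")])

# Pivot one patient's EAV rows back into attribute/value pairs.
rows = conn.execute("""
    SELECT a.name, v.value
    FROM patient_value v JOIN attribute a ON a.id = v.attribute_id
    WHERE v.patient_id = 1 ORDER BY a.name
""").fetchall()
```

This flexibility is what lets a legacy database with an arbitrary set of clinical fields be migrated without schema changes, at the cost of the pivoting join shown in the query.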

Relevance:

60.00%

Abstract:

The wide diffusion of cheap, small, and portable sensors integrated in an unprecedentedly large variety of devices, and the availability of almost ubiquitous Internet connectivity, make it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and promptly analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality-of-service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing through a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.

Relevance:

60.00%

Abstract:

The effectiveness of interim relief, understood as timely, adequate and full protection, has been the guiding thread of the evolution of administrative justice, which, over a period of more than a century, thanks to the work of case law and legal scholarship, has developed into a true judicial process. A recent landmark, and at the same time a symbol of this evolution, is certainly the Code of Administrative Procedure enacted by Legislative Decree No. 104 of 2 July 2010. The research reported in this contribution began at the same time as the new text entered into force, and was therefore also an opportunity to observe its first applications in case law. In particular, rather than a mere survey of its long series of articles, the Code was read in the light of a present-day weighing not only of the principle of effectiveness, but also of the principle of instrumentality that traditionally links the interim phase to the merits phase. The results of the research reveal the legislator's intention to confirm this instrumental relationship, in order to counter an uncontrolled drift towards interim relief with sometimes irreversible effects, as had occurred in judicial practice, while at the same time showing the intention to extend the scope of interim protection. Looking at what interim protection has become today, one observes a strengthening of the conformative effects typical of judgments on the merits, now extended to the interim phase. Judges, though aware that interim protection is not a response based on full cognition but on summary cognition, nonetheless intend to guarantee timely and effective protection, including through particular procedural techniques such as the remand, to which ample space is devoted within the research. In its final part, still looking globally at the effects of interim protection, the research focuses on the moment of enforcement and therefore on the compliance proceedings (giudizio di ottemperanza).

Relevance:

60.00%

Abstract:

In this study, we evaluated the potential use of entomopathogenic nematodes as a control for the beetle Aethina tumida Murray (Coleoptera: Nitidulidae). In particular, we conducted 1) four screening bioassays to determine nematode (seven species, 10 total strains tested) and application level effects on A. tumida larvae and pupae, 2) a generational persistence bioassay to determine whether single inoculations with nematodes would control multiple generations of A. tumida larvae in treated soil, and 3) a field bioassay to determine whether the nematodes would remain efficacious in the field. In the screening bioassays, nematode efficacy varied significantly by the nematode tested and the infective juvenile (IJ) level at which it was applied. Although nematode virulence was moderate in screening bioassays 1-3 (0-68% A. tumida mortality), A. tumida mortality approached higher levels in screening bioassay 4 (nearly 100% after 39 d), suggesting the suitability of some of the tested nematodes as field controls for A. tumida. In the generational persistence bioassay, Steinernema riobrave Cabanillas, Poinar & Raulston (strain 7-12) and Heterorhabditis indica Poinar, Karunakar & David provided adequate A. tumida control for 19 wk after a single soil inoculation (76-94% mortality in A. tumida pupae). In the field bioassay, the same two nematode species also showed high virulence toward pupating A. tumida (88-100% mortality). Our data suggest that nematode use may be an integral component of an integrated pest management scheme aimed at reducing A. tumida populations in bee colonies to tolerable levels.

Relevance:

60.00%

Abstract:

Obesity is becoming an epidemic phenomenon in most developed countries. The fundamental cause of obesity and overweight is an energy imbalance between calories consumed and calories expended. It is essential to monitor everyday food intake for obesity prevention and management. Existing dietary assessment methods usually require manual recording and recall of food types and portions, and the accuracy of the results largely relies on many uncertain factors such as the user's memory, food knowledge, and portion estimation. As a result, accuracy is often compromised. Accurate and convenient dietary assessment methods are still lacking and are needed by both the general population and the research community. In this thesis, an automatic food intake assessment method using the cameras and inertial measurement units (IMUs) of smartphones was developed to help people foster a healthy lifestyle. With this method, users use their smartphones before and after a meal to capture images or videos of the meal. The smartphone recognizes the food items, calculates the volume of the food consumed, and reports the results to the user. The technical objective is to explore the feasibility of image-based food recognition and image-based volume estimation. This thesis comprises five publications that address four specific goals: (1) to develop a prototype system with existing methods in order to review the literature, find its drawbacks, and explore the feasibility of developing novel methods; (2) based on the prototype system, to investigate new food classification methods that improve recognition accuracy to a field application level; (3) to design indexing methods for a large-scale image database to facilitate the development of new food image recognition and retrieval algorithms; and (4) to develop novel, convenient and accurate food volume estimation methods using only smartphones with cameras and IMUs. A prototype system was implemented to review existing methods. An image feature detector and descriptor were developed, and a nearest neighbor classifier was implemented to classify food items. A credit card marker method was introduced for metric-scale 3D reconstruction and volume calculation. To increase recognition accuracy, novel multi-view food recognition algorithms were developed to recognize regular-shape food items. To further increase the accuracy and make the algorithm applicable to arbitrary food items, new food features and new classifiers were designed. The efficiency of the algorithm was increased by developing a novel image indexing method for the large-scale image database. Finally, the volume calculation was enhanced by reducing the marker and introducing IMUs. Sensor fusion techniques combining measurements from cameras and IMUs were explored to infer the metric scale of the 3D model as well as to reduce noise from these sensors.
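The nearest-neighbor classification step can be illustrated in a few lines. The three-dimensional feature vectors and food labels below are toy stand-ins for the image descriptors developed in the thesis, and the function name is invented for the example:

```python
def nearest_neighbor(query, labeled_features):
    """Classify a feature vector by the label of its closest training
    vector under squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(labeled_features, key=lambda lf: dist2(query, lf[1]))[0]

# Toy training set: (label, feature vector) pairs.
training = [
    ("apple",  (0.9, 0.2, 0.1)),
    ("banana", (0.8, 0.8, 0.1)),
    ("rice",   (0.9, 0.9, 0.9)),
]

label = nearest_neighbor((0.85, 0.25, 0.12), training)  # closest to "apple"
```

The thesis's later contributions can be read against this baseline: better features change what goes into the vectors, and the image indexing work replaces the linear `min` scan with sublinear lookup in a large database.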

Relevance:

60.00%

Abstract:

The widespread use of wireless-enabled devices and the increasing capabilities of wireless technologies have promoted multimedia content access and sharing among users. However, the quality perceived by users still depends on multiple factors such as video characteristics, device capabilities, and link quality. While video characteristics include temporal and spatial complexity as well as coding complexity, one of the most important device characteristics is battery lifetime. There is a need to assess how these aspects interact and how they impact overall user satisfaction. This paper advances previous work by proposing and validating a flexible framework, named EViTEQ, to be applied in real testbeds to satisfy the requirements of performance assessment. EViTEQ is able to measure network interface energy consumption with high precision, while being completely technology independent and assessing application-level quality of experience. The results obtained in the testbed show the relevance of combined multi-criteria measurement approaches, leading to a superior evaluation of perceived end-user satisfaction.