952 results for "Multicast application level"

Relevance: 80.00%

Abstract:

The market for solder paste materials in the electronic manufacturing and assembly sector is very large and consists of material and equipment suppliers and end users. These materials are used to bond electronic components (such as flip-chip, CSP and BGA) to printed circuit boards (PCBs) across a range of dimensions, where the solder interconnects can be on the order of 0.05 mm to 5 mm in size. The non-Newtonian flow properties exhibited by solder pastes during their manufacture and printing/deposition phases have been of practical concern to surface mount engineers and researchers for many years. The printing of paste materials through very small stencil apertures is known to lead to increased stencil clogging and incomplete transfer of paste to the substrate pads. At these very narrow aperture sizes, the paste rheology and particle-wall interactions become crucial for consistent paste withdrawal. These non-Newtonian effects must be understood so that new paste formulations can be optimised for consistent printing. The focus of the study reported in this paper is the characterisation of the rheological properties of solder pastes and flux mediums, and the evaluation of the effect of these properties on the pastes' printing performance at the flip-chip assembly application level. Solder pastes are known to exhibit thixotropic behaviour, which is recognised by the decrease in the apparent viscosity of the paste material with time when it is subjected to a constant shear rate. The proper characterisation of this time-dependent rheological behaviour of solder pastes is crucial for establishing the relationships between the pastes' structure and flow behaviour, and for correlating the physical parameters with paste printing performance. In this paper, we present a number of methods which have been developed for characterising the time-dependent and non-Newtonian rheological behaviour of solder pastes and flux mediums as a function of shear rate. We also present results of the study of the rheology of the solder pastes and flux mediums using the structural kinetic modelling approach, which postulates that the network structure of solder pastes breaks down irreversibly under shear, leading to time- and shear-dependent changes in the flow properties. Our results show that, for the solder pastes used in the study, the rate and extent of thixotropy were generally found to increase with increasing shear rate. The technique demonstrated in this study has wide utility for R&D personnel involved in new paste formulation, for implementing quality control procedures used in solder paste manufacture and packaging, and for qualifying new flip-chip assembly lines.
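
A worked illustration of the structural-kinetics idea described above: under a constant shear rate, a first-order breakdown model lets a thixotropic rate constant be estimated from viscosity-versus-time data. This is a minimal sketch assuming synthetic data and a simple first-order model; it is not the authors' exact formulation, and every parameter value is illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order_breakdown(t, eta_inf, eta_0, k):
    # Apparent viscosity decays from eta_0 towards an equilibrium value
    # eta_inf under constant shear, with structural breakdown rate k.
    return eta_inf + (eta_0 - eta_inf) * np.exp(-k * t)

# Synthetic viscosity-vs-time data at one constant shear rate (illustrative).
t = np.linspace(0, 300, 60)                      # seconds
rng = np.random.default_rng(0)
eta = first_order_breakdown(t, 180.0, 450.0, 0.02) + rng.normal(0, 5, t.size)

popt, _ = curve_fit(first_order_breakdown, t, eta, p0=(100.0, 400.0, 0.01))
print("eta_inf=%.1f Pa.s, eta_0=%.1f Pa.s, k=%.4f 1/s" % tuple(popt))
```

Repeating such a fit at several shear rates exposes the trend reported above: the breakdown rate constant, and hence the extent of thixotropy, grows with the applied shear rate.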

Relevance: 80.00%

Abstract:

Future digital signal processing (DSP) systems must provide robustness at the algorithm and application levels to the reliability issues that come with implementations in modern semiconductor process technologies. In this paper, we address this issue by investigating the impact of unreliable memories on general DSP systems. In particular, we propose a novel framework to characterize the effects of unreliable memories, which enables us to devise novel methods to mitigate the associated performance loss. We propose to deploy specifically designed data representations, which can substantially improve system reliability compared to the conventional data representations used in digital integrated circuits, such as 2's-complement or sign-magnitude number formats. To demonstrate the efficacy of the proposed framework, we analyze the impact of unreliable memories on coded communication systems, and we show that deploying optimized data representations substantially improves the error-rate performance of such systems.
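
The sensitivity of a number format to memory errors can be illustrated with a simple Monte Carlo experiment: flip one random bit in each stored word and measure the resulting numerical error. This sketch only contrasts the two conventional formats named above; it does not reproduce the paper's optimized representations, and the word width and trial count are arbitrary.

```python
import random

BITS = 8

def flip_bit(word, pos):
    return word ^ (1 << pos)

def to_twos_complement(v):
    return v & ((1 << BITS) - 1)

def from_twos_complement(word):
    return word - (1 << BITS) if word & (1 << (BITS - 1)) else word

def to_sign_magnitude(v):
    return (1 << (BITS - 1)) | -v if v < 0 else v

def from_sign_magnitude(word):
    mag = word & ((1 << (BITS - 1)) - 1)
    return -mag if word & (1 << (BITS - 1)) else mag

def mean_flip_error(encode, decode, trials=100_000):
    # Average absolute value error caused by a single random bit flip.
    rng = random.Random(0)
    total = 0.0
    for _ in range(trials):
        v = rng.randint(-(1 << (BITS - 1)) + 1, (1 << (BITS - 1)) - 1)
        word = flip_bit(encode(v), rng.randrange(BITS))
        total += abs(decode(word) - v)
    return total / trials

print("2's-complement :", mean_flip_error(to_twos_complement, from_twos_complement))
print("sign-magnitude :", mean_flip_error(to_sign_magnitude, from_sign_magnitude))
```

The intuition: in 2's complement a flip of the most significant bit always costs 2^(BITS-1), whereas in sign-magnitude a sign-bit flip costs only twice the stored magnitude, so the two formats degrade differently under the same raw bit-error rate.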

Relevance: 80.00%

Abstract:

Increasingly large amounts of data are stored in the main memory of data centre servers. However, DRAM-based memory is a major consumer of energy and is unlikely to scale in the future. Various byte-addressable non-volatile memory (NVM) technologies promise high density and near-zero static energy; however, they suffer from increased latency and increased dynamic energy consumption.

This paper proposes to leverage a hybrid memory architecture, consisting of both DRAM and NVM, through novel application-level data management policies that decide whether to place data on DRAM or NVM. We analyze modern column-oriented and key-value data stores and demonstrate the feasibility of application-level data management. Cycle-accurate simulation confirms that our methodology reduces energy consumption with the least performance degradation compared to current state-of-the-art hardware or OS approaches. Moreover, we use our techniques to apportion DRAM and NVM memory sizes for these workloads.
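
A minimal sketch of what an application-level placement policy of this kind might look like: the application tracks per-object access counts and greedily keeps the hottest objects within a limited DRAM budget, spilling the rest to NVM. The class, threshold, and greedy heuristic are all illustrative assumptions, not the paper's actual policies.

```python
from collections import defaultdict

class HybridPlacer:
    """Toy application-level placement policy: objects accessed more often
    than a threshold are placed on DRAM; colder objects go to NVM."""

    def __init__(self, dram_capacity, hot_threshold=100):
        self.dram_capacity = dram_capacity
        self.hot_threshold = hot_threshold
        self.accesses = defaultdict(int)
        self.sizes = {}

    def record_access(self, obj_id, size):
        self.accesses[obj_id] += 1
        self.sizes[obj_id] = size

    def plan(self):
        # Greedy: hottest objects first, until DRAM capacity is exhausted.
        placement, used = {}, 0
        for obj_id, count in sorted(self.accesses.items(), key=lambda kv: -kv[1]):
            size = self.sizes[obj_id]
            if count >= self.hot_threshold and used + size <= self.dram_capacity:
                placement[obj_id] = "DRAM"
                used += size
            else:
                placement[obj_id] = "NVM"
        return placement

placer = HybridPlacer(dram_capacity=4096)
for _ in range(150):
    placer.record_access("hot_index", size=1024)
placer.record_access("cold_log", size=2048)
print(placer.plan())   # {'hot_index': 'DRAM', 'cold_log': 'NVM'}
```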

Relevance: 80.00%

Abstract:

Cloud data centres are critical business infrastructures and among the fastest-growing service providers. Detecting anomalies in Cloud data centre operation is therefore vital. Given the vast complexity of the data centre system software stack, applications and workloads, anomaly detection is a challenging endeavour. Current tools for detecting anomalies often use machine learning techniques, application instance behaviours or system metrics distribution, which are complex to implement in Cloud computing environments, as they require training, access to application-level data and complex processing. This paper presents LADT, a lightweight anomaly detection tool for Cloud data centres that uses rigorous correlation of system metrics, implemented by an efficient correlation algorithm, without the need for training or a complex infrastructure set-up. LADT is based on the hypothesis that, in an anomaly-free system, metrics from data centre host nodes and virtual machines (VMs) are strongly correlated. An anomaly is detected whenever the correlation drops below a threshold value. We demonstrate and evaluate LADT in a Cloud environment, showing that host node I/O operations per second (IOPS) are strongly correlated with the aggregated VM IOPS, but that this correlation vanishes when an application stresses the disk, indicating a node-level anomaly.
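
The core detection rule lends itself to a compact sketch: compute the Pearson correlation between host IOPS and aggregated VM IOPS over a sliding window, and flag windows where it falls below a threshold. This follows the hypothesis stated above, but the window length and threshold are illustrative assumptions, not LADT's tuned values.

```python
import numpy as np

def detect_anomalies(host_iops, vm_iops_per_vm, window=30, threshold=0.6):
    """Flag windows where the correlation between host IOPS and the
    aggregated VM IOPS drops below the threshold."""
    agg_vm = np.sum(vm_iops_per_vm, axis=0)      # aggregate across all VMs
    anomalies = []
    for start in range(0, len(host_iops) - window + 1):
        h = np.asarray(host_iops[start:start + window])
        v = agg_vm[start:start + window]
        r = np.corrcoef(h, v)[0, 1]
        if np.isnan(r) or r < threshold:         # constant window counts too
            anomalies.append((start, r))
    return anomalies
```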

Relevance: 80.00%

Abstract:

In this paper we address the real-time capabilities of P-NET, a multi-master fieldbus standard based on a virtual token-passing scheme. We show how P-NET's medium access control (MAC) protocol is able to guarantee a bounded access time for message requests. We then propose a model for implementing fixed-priority-based dispatching mechanisms at each master's application level. In this way, we diminish the impact of the first-come-first-served (FCFS) policy that P-NET uses at the data link layer. The proposed model raises several issues well known within the real-time systems community: message release jitter; pre-run-time schedulability analysis in non-pre-emptive contexts; and non-independence of tasks at the application level. We identify these issues in the proposed model and show how results available for priority-based task dispatching can be adapted to encompass priority-based message dispatching in P-NET networks.
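
The application-level dispatching idea can be sketched as a priority queue sitting above the FCFS data link layer: requests are ordered by fixed priority, and one message is released each time the virtual token reaches the master. This is a schematic sketch of the mechanism described above, with invented class and method names; it omits the schedulability analysis entirely.

```python
import heapq
import itertools

class PriorityMessageDispatcher:
    """Fixed-priority message dispatching at a P-NET master's application
    level (illustrative): requests queue by priority instead of FCFS and
    are handed to the data link layer when the virtual token arrives."""

    def __init__(self):
        self._queue = []
        self._seq = itertools.count()   # FIFO tie-break within a priority

    def submit(self, priority, message):
        # Lower numeric value = higher priority.
        heapq.heappush(self._queue, (priority, next(self._seq), message))

    def on_token_arrival(self):
        # Called when this master holds the virtual token; returns the
        # highest-priority pending message, or None if the queue is empty.
        if self._queue:
            return heapq.heappop(self._queue)[2]
        return None
```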

Relevance: 80.00%

Abstract:

Collaborative software is usually thought of as providing audio-video conferencing services, application/desktop sharing, and access to large content repositories. However, mobile device usage is characterized by users carrying out short and intermittent tasks, sometimes referred to as 'micro-tasking'. Micro-collaborations are not well supported by traditional groupware systems, and the work in this paper seeks to address this. Mico is a system that provides a set of application-level peer-to-peer services for the ad-hoc formation and facilitation of collaborative groups across a diverse mobile device domain. The system builds on the Java ME bindings of the JXTA P2P protocols, and is designed to use the lowest common denominators required for collaboration between varying degrees of mobile device capability. To demonstrate how our platform facilitates application development, we built a set of demonstration applications and include code examples to illustrate the ease and speed afforded when developing collaborative software with Mico.
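
The abstract does not reproduce the paper's code examples, but the ad-hoc group formation it describes can be sketched generically: peers join a named group and exchange messages within it. This stand-in uses a plain in-memory Python class purely for illustration; Mico itself builds on the Java ME JXTA bindings, and none of these names come from its API.

```python
class PeerGroup:
    """Minimal in-memory stand-in for an ad-hoc collaborative group."""

    def __init__(self, name):
        self.name = name
        self.members = {}           # peer_id -> message handler

    def join(self, peer_id, on_message):
        self.members[peer_id] = on_message

    def publish(self, sender_id, payload):
        # Deliver to every member except the sender.
        for peer_id, handler in self.members.items():
            if peer_id != sender_id:
                handler(sender_id, payload)

group = PeerGroup("shared-notes")
group.join("phone-a", lambda src, msg: print(f"phone-a got {msg!r} from {src}"))
group.join("phone-b", lambda src, msg: print(f"phone-b got {msg!r} from {src}"))
group.publish("phone-a", "meet at 3pm")
```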

Relevance: 80.00%

Abstract:

Body Sensor Networks (BSNs) have recently been introduced for the remote monitoring of human activities in a broad range of application domains, such as health care, emergency management, fitness and behaviour surveillance. BSNs can be deployed in a community of people and can generate large amounts of contextual data that require a scalable approach to storage, processing and analysis. Cloud computing can provide a flexible storage and processing infrastructure to perform both online and offline analysis of the data streams generated in BSNs. This paper proposes BodyCloud, a SaaS approach for community BSNs that supports the development and deployment of Cloud-assisted BSN applications. BodyCloud is a multi-tier application-level architecture that integrates a Cloud computing platform and BSN data-stream middleware. BodyCloud provides programming abstractions that allow the rapid development of community BSN applications. This work describes the general architecture of the proposed approach and presents a case study on the real-time monitoring and analysis of the cardiac data streams of many individuals.
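
As a flavour of the kind of per-user analysis the case study describes, the sketch below scans one cardiac stream for out-of-range heart rates. It is a toy stand-in for an analysis step in a BodyCloud-style SaaS tier: the function name, thresholds, and data are all invented for illustration.

```python
import statistics

def analyse_cardiac_stream(samples, rate_low=50, rate_high=120):
    # Per-user summary plus out-of-range beats flagged for escalation;
    # thresholds are illustrative, not clinical guidance.
    mean_hr = statistics.fmean(samples)
    alerts = [(i, hr) for i, hr in enumerate(samples)
              if not rate_low <= hr <= rate_high]
    return {"mean_hr": round(mean_hr, 1), "alerts": alerts}

# One user's heart-rate stream (beats per minute) as delivered by middleware.
print(analyse_cardiac_stream([72, 75, 71, 130, 74, 45, 73]))
```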

Relevance: 80.00%

Abstract:

The increasing number of attacks on computer networks has been countered by adding resources directly to the active routing equipment of these networks. In this context, firewalls have become essential elements in controlling the flow of packets into and out of a network. With the advent of intrusion detection systems (IDS), efforts have been made to incorporate packet filtering based on signatures into traditional firewalls. This integration adds IDS functions (such as signature-based filtering, until then a passive element) to the functions already present in the firewall. Although this incorporation is effective at blocking attacks with known signatures, application-level filtering introduces a natural delay in the analysed packets and can degrade the machine's ability to filter the remaining packets, because this level of filtering demands substantial machine resources. This work presents models that address this problem by re-routing packets for analysis to a sub-network with specific filters. The proposed implementation of this model aims to reduce the performance problem and to open space for scenarios in which other non-conventional filtering solutions (spam blocking, P2P traffic control/blocking, etc.) can be inserted into the filtering sub-network, without overloading the main firewall of a corporate network.
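
The re-routing model can be sketched as a classification step at the main firewall: packets matching a rule are diverted to the node in the filtering sub-network that runs the corresponding heavyweight filter, while everything else takes the fast path. The rule set, packet fields, and node names below are illustrative assumptions, not the paper's exact model.

```python
# Each rule maps a match predicate to a specialised node in the
# filtering sub-network; the main firewall only classifies and forwards.
FILTER_SUBNET_RULES = [
    ("smtp", lambda pkt: pkt["dst_port"] == 25),            # spam filtering
    ("p2p",  lambda pkt: pkt["dst_port"] in (6881, 6889)),  # P2P control
    ("ids",  lambda pkt: pkt["flags"] == "suspicious"),     # signature IDS
]

def route(pkt):
    """Return the next hop: a filtering node, or the fast-path forward."""
    for filter_node, matches in FILTER_SUBNET_RULES:
        if matches(pkt):
            return f"filter-subnet/{filter_node}"
    return "forward"

print(route({"dst_port": 25, "flags": "normal"}))    # -> filter-subnet/smtp
print(route({"dst_port": 443, "flags": "normal"}))   # -> forward
```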

Relevance: 80.00%

Abstract:

Introduction: Rheumatic fever (RF), a systemic illness that may occur following Group A beta-hemolytic streptococcal (GABHS) pharyngitis in children, is a major problem in countries with limited resources. Because of its long track record and low cost, an injection of benzathine penicillin G (BPG) suspension every 3 or 4 weeks has been used as secondary prophylaxis. Despite its excellent in vitro efficacy, the inability of BPG to eradicate GABHS has been frequently reported. Areas covered: This work reviews the possible causes of failure, as well as the inconvenience of the current prophylactic treatment of acute RF, and suggests a new pharmacotherapeutic system that could replace the current one. Expert opinion: RF is a major problem concerning mainly countries with limited resources and could be considered a neglected disease. The dose regimen using BPG suspension results in failures, which could be avoided by the use of nanocarrier-based systems. To meet this ultimate goal, the research should be transposed from the laboratory scale to the industrial and clinical application level. This research should be conducted to produce a pharmaceutical dosage form that is commercially available and affordable for patients. However, health, environmental and socioeconomic hazards should be considered.

Relevance: 80.00%

Abstract:

This study aimed to verify whether chicks from eggs injected with ascorbic acid and subjected to thermal stress would have higher immunity than chicks incubated at thermoneutrality without ascorbic acid injection. The parameters evaluated were oxygen saturation of hemoglobin, glucose, erythrocyte count, hematocrit and hemoglobin values of newly hatched male chicks, hatched from eggs injected with ascorbic acid (AA) and subjected to thermal stress during incubation. The experimental design was completely randomized in a 5 x 2 factorial scheme (five ascorbic acid application levels x two incubation temperatures). The data were subjected to analysis of variance using the General Linear Model (GLM) procedure of SAS®. For the erythrocyte count, hematocrit and hemoglobin values, there was a significant interaction (p < 0.05) between in-egg treatments and incubation temperatures. Analyzing the interactions for these parameters, it was observed that the application of 0% ascorbic acid (i.e., water) in the egg minimized the effect of heat stress when compared with the treatment without injection. The application of ascorbic acid in eggs incubated under heat stress failed to maximize the immunity of newly hatched chicks. It is assumed that the increased volume of amniotic fluid in the embryos injected with water favoured a lower heat conductance for these embryos, thus helping their development with respect to immunity. Considering that hemoglobin is related to the transport of gases, these data suggest that increasing the concentration of the inoculated AA solution may influence the respiratory rates of the eggs.
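
For readers unfamiliar with the design, the 5 x 2 factorial ANOVA described above can be reproduced in outline with statsmodels standing in for the SAS GLM procedure. The data frame below is entirely fabricated for illustration; only the factor structure (five AA levels x two temperatures, with replication) follows the study.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Two fabricated replicates per cell of the 5 x 2 factorial (illustrative).
df = pd.DataFrame({
    "aa_level": ["0%", "0%", "2%", "2%", "4%", "4%", "6%", "6%", "8%", "8%"] * 2,
    "temperature": ["thermoneutral"] * 10 + ["heat_stress"] * 10,
    "hematocrit": [29.1, 30.2, 28.7, 29.5, 30.0, 29.3, 28.9, 29.8, 29.4, 30.1,
                   27.2, 26.8, 28.1, 27.5, 27.9, 28.3, 26.9, 27.7, 28.0, 27.4],
})

# Main effects of AA level and temperature, plus their interaction.
model = ols("hematocrit ~ C(aa_level) * C(temperature)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```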

Relevance: 80.00%

Abstract:

This study aimed to verify whether chicks from eggs injected with ascorbic acid and subjected to heat stress would show changes in acid-base balance compared to chicks incubated at thermoneutrality without ascorbic acid injection. The parameters evaluated were the blood partial pressures of carbon dioxide and oxygen, base excess, total carbon dioxide, the concentrations of sodium, potassium, ionized calcium and bicarbonate, and the pH of newly hatched male chicks, hatched from eggs injected with ascorbic acid (AA) and subjected to heat stress during incubation. The experimental design was completely randomized in a 5 x 2 factorial scheme (five ascorbic acid application levels x two incubation temperatures). The data were subjected to analysis of variance using the General Linear Model (GLM) procedure of SAS®. For blood pH, a significant interaction (p < 0.05) was observed between in-egg treatments and incubation temperatures. For the other parameters, there were no significant effects (p > 0.05) of AA level or incubation temperature. Analyzing the interaction for pH, it was observed that chicks from eggs injected with 6% ascorbic acid and subjected to heat stress during incubation had a higher pH value than those incubated at the thermoneutral temperature (p < 0.05). Therefore, it is suggested that the incubation of eggs at high temperature (39°C) can alter the metabolic rate of these embryos.

Relevance: 80.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 80.00%

Abstract:

Background: The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make it feasible, we need computational methods to handle the large amount of information that arises from bench to bedside, and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular, ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however, it lacks support for representing clinical and socio-demographic information.

Results: We have implemented an extension of Chado, the Clinical Module, to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: the data level, to store the data; the semantic level, to integrate and standardize the data through the use of ontologies; the application level, to manage clinical databases, ontologies and the data integration process; and the web interface level, to allow interaction between the user and the system. The Clinical Module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from an actual clinical research database. Clinical and demographic data, as well as biomaterial data, were obtained from patients with head and neck tumors. We implemented the IPTrans tool, a complete environment for data migration, which comprises: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as to other applications.

Conclusions: Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information, nor are they integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information from different "omics" technologies with patients' clinical and socio-demographic data. Such a framework should offer flexibility, compression and robustness. The experiments carried out on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
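
Since the Clinical Module is built on the Entity-Attribute-Value model, a minimal sketch helps show why that model suits heterogeneous clinical data: new attributes become rows, not schema changes. The table layout, attribute names, and ontology terms below are illustrative inventions, not Chado's actual schema.

```python
import sqlite3

# Toy EAV layout: attributes are data, optionally bound to ontology terms.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE attribute (id INTEGER PRIMARY KEY, name TEXT, ontology_term TEXT);
CREATE TABLE patient_value (
    patient_id INTEGER,
    attribute_id INTEGER REFERENCES attribute(id),
    value TEXT
);
""")
db.executemany("INSERT INTO attribute VALUES (?, ?, ?)", [
    (1, "tumor_site", "NCIT:C12419"),       # ontology terms are illustrative
    (2, "smoking_status", "NCIT:C19796"),
])
db.executemany("INSERT INTO patient_value VALUES (?, ?, ?)", [
    (101, 1, "larynx"), (101, 2, "former smoker"),
])

# Adding a new clinical attribute later needs only INSERTs, no ALTER TABLE.
for row in db.execute("""
    SELECT p.patient_id, a.name, p.value
    FROM patient_value p JOIN attribute a ON a.id = p.attribute_id
"""):
    print(row)
```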

Relevance: 80.00%

Abstract:

The wide diffusion of cheap, small, portable sensors integrated into an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly analyzed in a timely fashion, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of domains such as entertainment, health care, and energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, by performing a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
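
The notion of QoS-differentiated stream processing can be illustrated with a toy dispatcher that always drains higher QoS classes first under load. This is a sketch of the general idea only; the class names and ordering scheme are invented here, and Quasit's actual QoS model is far richer.

```python
import heapq

class QosStreamDispatcher:
    """Toy QoS-differentiated dispatch: each flow carries a QoS class,
    and higher classes are always drained first under load."""

    QOS_ORDER = {"critical": 0, "standard": 1, "best_effort": 2}

    def __init__(self):
        self._heap, self._seq = [], 0

    def enqueue(self, qos_class, item):
        # The sequence number preserves FIFO order within a QoS class.
        heapq.heappush(self._heap, (self.QOS_ORDER[qos_class], self._seq, item))
        self._seq += 1

    def drain(self):
        while self._heap:
            yield heapq.heappop(self._heap)[2]

d = QosStreamDispatcher()
d.enqueue("best_effort", "ambient-temp")
d.enqueue("critical", "ecg-sample")
print(list(d.drain()))   # ['ecg-sample', 'ambient-temp']
```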

Relevance: 80.00%

Abstract:

The effectiveness of interim protection, understood as timely, adequate and full protection, has been the guiding principle in the evolution of Italian administrative justice, which, over a period of more than a century, thanks to the work of case law and legal scholarship, has come to be structured as a true judicial process. A recent landmark of this evolution, and at the same time its symbol, is the Code of Administrative Procedure enacted by Legislative Decree No. 104 of 2 July 2010. The research reported here began when the new text entered into force, and was therefore also an opportunity to observe its first applications in case law. In particular, the Code was read not through a mere survey of its long articulated text, but in the light of a present-day weighing not only of the principle of effectiveness but also of the principle of instrumentality that traditionally links the interim phase to the merits phase. The results of the research show the legislator's intention to confirm this instrumental relationship, in order to counter an uncontrolled drift towards interim measures with effects that are at times irreversible, as has occurred in judicial practice; at the same time, they reveal the intention to extend the scope of interim protection. Looking at what interim protection has become today, one observes a strengthening of its conformative effects, typical of judgments on the merits but now extended to the interim phase. The courts, while aware that interim protection is not a response based on full cognition but on summary cognition, nevertheless intend to guarantee timely and effective protection, including through particular procedural techniques such as the remand, to which ample space is devoted within the research. In its final part, again taking a global view of the effects of interim protection, the research focuses on the moment of enforcement and thus on the compliance proceedings (giudizio di ottemperanza).