30 results for Error-Free Transformations
Abstract:
Modelling the fundamental performance limits of wireless sensor networks (WSNs) is of paramount importance to understand the behaviour of WSNs under worst-case conditions and to make the appropriate design choices. In that direction, this paper contributes a methodology for modelling cluster-tree WSNs with a mobile sink. We propose closed-form recurrent expressions for computing the worst-case end-to-end delays, buffering and bandwidth requirements across any source-destination path in the cluster tree, assuming an error-free channel. We show how to apply our theoretical results to the specific case of IEEE 802.15.4/ZigBee WSNs. Finally, we demonstrate the validity and analyse the accuracy of our methodology through a comprehensive experimental study.
Abstract:
The attached document is the post-print version (the version corrected by the publisher).
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator that models market players and simulates their operation in the market. Market players are entities with specific characteristics and objectives, making their own decisions and interacting with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. The method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict electricity market prices, and analyses its forecasting error patterns. By recognizing the occurrence of such patterns, the method predicts the expected error of the next forecast and uses it to adjust the forecast itself. The goal is to bring the forecast closer to the real value, reducing the forecasting error.
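The correction step this abstract describes can be sketched as follows; the function names and the simple moving-average error model are illustrative assumptions, not MASCEM's actual method.

```python
# Sketch of forecast correction via error-pattern analysis (illustrative,
# not MASCEM's actual implementation): estimate the next forecasting error
# from recent errors and use it to adjust the raw forecast.

def predict_error(past_errors, window=3):
    """Estimate the next error as the mean of the last `window` errors."""
    recent = past_errors[-window:]
    return sum(recent) / len(recent)

def adjusted_forecast(raw_forecast, past_errors):
    """Shift the raw forecast by the expected error (error = real - forecast)."""
    return raw_forecast + predict_error(past_errors)

# Example: past forecasts have been consistently ~2 units below the real price.
errors = [2.1, 1.9, 2.0]                 # real - forecast, last 3 periods
print(adjusted_forecast(50.0, errors))   # -> 52.0
```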
Abstract:
In real optimization problems, the analytical expression of the objective function is usually unknown, as are its derivatives, or they are too complex to use. In these cases it becomes essential to use optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods, also known as derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, the choice of the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a bi-objective problem, in which a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work, we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behaviour of our algorithm through some examples. The proposed methods were implemented in Java.
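The bi-objective acceptance rule underlying filter methods can be sketched as follows; the data structures and the plain dominance test are a minimal illustration, not the authors' Java implementation.

```python
# Minimal sketch of a filter acceptance test (illustrative): a trial point
# (f, h), where f is the objective value and h the aggregated constraint
# violation, is accepted only if no stored filter entry dominates it.

def dominates(a, b):
    """Entry a = (f_a, h_a) dominates b = (f_b, h_b) when it is at least
    as good in both the objective and the constraint violation."""
    return a[0] <= b[0] and a[1] <= b[1]

def acceptable(filter_entries, candidate):
    """A candidate is acceptable if no filter entry dominates it."""
    return not any(dominates(e, candidate) for e in filter_entries)

def add_to_filter(filter_entries, candidate):
    """Insert the candidate and drop entries it dominates."""
    kept = [e for e in filter_entries if not dominates(candidate, e)]
    kept.append(candidate)
    return kept

flt = [(10.0, 0.5), (12.0, 0.1)]
print(acceptable(flt, (11.0, 0.3)))   # neither entry dominates it -> True
print(acceptable(flt, (13.0, 0.6)))   # (10.0, 0.5) dominates it -> False
```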
Abstract:
The filter method is a technique for solving nonlinear programming problems. The filter algorithm has two phases in each iteration: the first reduces a measure of infeasibility, while the second reduces the objective function value. In real optimization problems, the objective function is usually not differentiable or its derivatives are unknown. In these cases it becomes essential to use optimization methods where the calculation of the derivatives or the verification of their existence is not necessary: direct search methods, or derivative-free methods, are examples of such techniques. In this work we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. This method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers.
Abstract:
Two chromatographic methods, gas chromatography with flame ionization detection (GC–FID) and liquid chromatography with ultraviolet detection (LC–UV), were used to determine furfuryl alcohol in several kinds of foundry resins, after application of an optimised extraction procedure. The GC method developed was applicable regardless of the resin type. Analysis by LC was suitable only for furanic resins; interferences present in the phenolic resins did not allow proper quantification by LC. Both methods gave accurate and precise results. Recoveries were >94%; relative standard deviations were ≤7% and ≤0.3% for the GC and LC methods, respectively. Good agreement between the two methods was found (relative deviations ≤3%).
Abstract:
Formaldehyde is a toxic component present in foundry resins. Its quantification is important for the characterisation of the resin (kind and degradation) as well as for the evaluation of free contaminants present in wastes generated by the foundry industry. The complexity of the matrices considered suggests the need for separative techniques. The method developed for the identification and quantification of formaldehyde in foundry resins is based on the determination of free carbonyl compounds by derivatization with 2,4-dinitrophenylhydrazine (DNPH), adapted to the considered matrices and using liquid chromatography (LC) with UV detection. Formaldehyde determinations in several foundry resins gave precise results: mean recovery and R.S.D. were >95% and 5%, respectively. Analyses by the hydroxylamine reference method gave comparable results. The results showed that the hydroxylamine reference method is applicable only to a specific kind of resin, while the developed method performs well for all the studied resins.
Abstract:
Phenol is a toxic compound present in a wide variety of foundry resins. Its quantification is important for the characterization of the resins as well as for the evaluation of free contaminants present in foundry wastes. Two chromatographic methods, liquid chromatography with ultraviolet detection (LC-UV) and gas chromatography with flame ionization detection (GC-FID), were developed for the analysis of free phenol in several foundry resins after a simple extraction procedure (30 min). Both chromatographic methods were suitable for the determination of phenol in the studied furanic and phenolic resins, showing good selectivity, accuracy (recovery 99–100%; relative deviations <5%) and precision (coefficients of variation <6%). The ASTM reference method was found to be useful only in the analysis of phenolic resins, while the LC and GC methods were applicable to all the studied resins. The developed methods reduce the time of analysis from 3.5 hours to about 30 min and can readily be used in routine quality control laboratories.
Abstract:
Celiac disease (CD) is an autoimmune enteropathy characterized by an inappropriate T-cell-mediated immune response to the ingestion of certain dietary cereal proteins in genetically susceptible individuals. The disorder has environmental, genetic, and immunological components. CD has a prevalence of up to 1% in populations of European ancestry, yet a high percentage of cases remain underdiagnosed. Diagnosis and treatment should occur early, since untreated disease causes growth retardation and atypical symptoms, such as infertility or neurological disorders. The diagnostic criteria for CD, which require endoscopy with small bowel biopsy, have been changing over the last few decades, especially due to the advent of serological tests with higher sensitivity and specificity. Serological markers can be very useful to rule out clinically suspicious cases and also to help monitor patients after adherence to a gluten-free diet. Since the current treatment consists of a life-long gluten-free diet, which leads to significant clinical and histological improvement, the standardization of an assay to unequivocally assess gluten in gluten-free foodstuffs is of major importance.
Abstract:
In this paper, we characterize two power indices introduced in [1] using two different modifications of the monotonicity property first stated in [2]. The sets of properties are easily comparable with each other and with previous characterizations of other power indices.
Abstract:
This communication presents a novel kind of silicon nanomaterial: freestanding Si nanowire arrays (Si NWAs), which are readily synthesized by one-step, template-free electro-deoxidation of SiO2 in molten CaCl2. The self-assembling growth process of this material is also investigated in a preliminary way.
Abstract:
The TEM family of enzymes has had a crucial impact on the pharmaceutical industry due to its important role in antibiotic resistance. Even with the latest technologies in structural biology and genomics, no 3D structure of a TEM-1/antibiotic complex prior to acylation is known. Therefore, the understanding of these enzymes' capability to acylate antibiotics is based on the uncomplexed macromolecular structure of the protein. In this work, molecular docking, molecular dynamics simulations, and relative free energy calculations were applied in order to obtain a comprehensive and thorough analysis of the TEM-1/ampicillin and TEM-1/amoxicillin complexes. We describe the complexes and analyze the effect of ligand binding on the overall structure. We clearly demonstrate that the key residues involved in the stability of the ligand (hot-spots) vary with the nature of the ligand. Structural effects such as (i) the distances between interfacial residues (Ser70−Oγ and Lys73−Nζ, Lys73−Nζ and Ser130−Oγ, and Ser70−Oγ and Ser130−Oγ), (ii) side-chain rotamer variation (Tyr105 and Glu240), and (iii) the presence of conserved waters can also be influenced by ligand binding. This study supports the hypothesis that TEM-1 undergoes structural modifications upon ligand binding.
Abstract:
We consider reliable communications in Body Area Networks (BANs), where a set of nodes placed on the human body are connected using wireless links. In order to keep the Specific Absorption Rate (SAR) as low as possible for health safety reasons, these networks operate in a low transmit power regime, which, however, is known to be error-prone. It has been observed that the fluctuations of the Received Signal Strength (RSS) at the nodes of a BAN on a moving person show certain regularities, and that the magnitude of these fluctuations is significant (5 - 20 dB). In this paper, we present BANMAC, a MAC protocol that monitors and predicts the channel fluctuations and schedules transmissions opportunistically when the RSS is likely to be higher. The MAC protocol is capable of providing differentiated service and resolves co-channel interference when multiple BANs are co-located. We report the design and implementation details of BANMAC integrated with the IEEE 802.15.4 protocol stack. We present experimental data showing that the packet loss rate (PLR) of BANMAC is significantly lower than that of the IEEE 802.15.4 MAC. For comparable PLR, the power consumption of BANMAC is also significantly lower than that of IEEE 802.15.4. For co-located networks, the convergence time to find a conflict-free channel allocation was approximately 1 s for the centralized coordination mechanism and approximately 4 s for the distributed one.
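The opportunistic-scheduling idea can be sketched in a few lines; the periodic-phase predictor and the threshold value below are illustrative assumptions, not BANMAC's actual algorithm.

```python
# Illustrative sketch (not BANMAC's algorithm): exploit the regularity of
# RSS fluctuations on a moving body by predicting the next RSS sample from
# the same phase of the previous gait period, and transmit only when the
# predicted RSS clears a threshold.

def predict_rss(samples, period):
    """Predict the next sample as the value observed one period ago."""
    return samples[-period]

def schedule_tx(samples, period, threshold_dbm):
    """Transmit opportunistically when the predicted RSS is high enough."""
    return predict_rss(samples, period) >= threshold_dbm

rss = [-90, -75, -88, -91, -74, -89]   # periodic fluctuations, period = 3
print(schedule_tx(rss, 3, -80.0))      # predicted -91 dBm -> False (defer)
```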
Abstract:
It is widely assumed that scheduling real-time tasks becomes more difficult as their deadlines get shorter. With shorter deadlines, however, tasks potentially compete less with each other for processors, and this can produce more contention-free slots, at which the number of competing tasks is smaller than or equal to the number of available processors. This paper presents a policy (called the CF policy) that utilizes such contention-free slots effectively. The policy can be employed by any work-conserving, preemptive scheduling algorithm, and we show that any algorithm extended with this policy dominates the original algorithm in terms of schedulability. We also present improved schedulability tests for algorithms that employ this policy, based on the observation that interference from tasks is reduced when their executions are postponed to contention-free slots. Finally, using the properties of the CF policy, we derive the counter-intuitive claim that shortening task deadlines can help improve the schedulability of task systems. We present heuristics that effectively reduce task deadlines for better schedulability without performing any exhaustive search.
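The defining condition of a contention-free slot can be stated directly in code; the per-slot job counts and the two-processor setup are assumptions made for illustration, not data from the paper.

```python
# Illustrative check for "contention-free" slots: a slot is contention-free
# when the number of jobs competing for execution in it is at most the
# number of available processors.

def contention_free_slots(active_jobs_per_slot, num_processors):
    """Return the indices of slots where competing jobs <= processors."""
    return [t for t, n in enumerate(active_jobs_per_slot)
            if n <= num_processors]

# Number of jobs competing in each time slot on a 2-processor platform.
competing = [3, 2, 1, 4, 2, 0]
print(contention_free_slots(competing, 2))  # -> [1, 2, 4, 5]
```

In such slots every competing job can run, so work postponed into them (as the CF policy does) suffers no extra interference.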
Abstract:
The recently standardized IEEE 802.15.4/ZigBee protocol stack offers great potential for ubiquitous and pervasive computing, namely for Wireless Sensor Networks (WSNs). However, there are still some open and ambiguous issues that make its practical use a challenging task. One of those issues is how to build a synchronized multi-hop cluster-tree network, which is quite suitable for QoS support in WSNs. In fact, the current IEEE 802.15.4/ZigBee specifications restrict synchronization in the beacon-enabled mode (by the generation of periodic beacon frames) to star-based networks, while multi-hop networking is supported using the peer-to-peer mesh topology, but with no synchronization. Even though both specifications mention the possible use of cluster-tree topologies, which combine multi-hop and synchronization features, the description of how to effectively construct such a network topology is missing. This paper tackles this problem, clarifies the ambiguities regarding the use of the cluster-tree topology, and proposes two collision-free beacon frame scheduling schemes. We strongly believe that the results provided in this paper represent a significant step towards the practical and efficient use of IEEE 802.15.4/ZigBee cluster-tree networks.
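One simple way to make beacon frames collision-free is time division: give each router a distinct start offset so active periods never overlap. The sketch below illustrates that idea only; it is not one of the two scheduling schemes the paper proposes, and all names and units are assumptions.

```python
# Illustrative time-division beacon scheduling (not the paper's schemes):
# stagger each router's active period inside the shared beacon interval so
# no two beacon frames (or active periods) can collide.

def assign_beacon_offsets(num_routers, active_period, beacon_interval):
    """Return a distinct start offset per router, or raise if the active
    periods cannot all fit inside one beacon interval."""
    if num_routers * active_period > beacon_interval:
        raise ValueError("not schedulable: active periods exceed the interval")
    return [i * active_period for i in range(num_routers)]

# Three routers, each active for 10 time units within a 60-unit interval.
print(assign_beacon_offsets(3, 10, 60))  # -> [0, 10, 20]
```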