937 results for rank-based procedure
Abstract:
AIM: This work presents detailed experimental performance results from tests executed in the hospital environment for Health Monitoring for All (HM4All), a remote vital signs monitoring system based on a ZigBee® (ZigBee Alliance, San Ramon, CA) body sensor network (BSN). MATERIALS AND METHODS: Tests involved the use of six electrocardiogram (ECG) sensors operating in two different modes: the ECG mode involved the transmission of ECG waveform data and heart rate (HR) values to the ZigBee coordinator, whereas the HR mode included only the transmission of HR values. In the absence of hidden nodes, a non-beacon-enabled star network composed of sensing devices working on ECG mode kept the delivery ratio (DR) at 100%. RESULTS: When the network topology was changed to a 2-hop tree, the performance degraded slightly, resulting in an average DR of 98.56%. Although these performance outcomes may seem satisfactory, further investigation demonstrated that individual sensing devices went through transitory periods with low DR. Other tests have shown that ZigBee BSNs are highly susceptible to collisions owing to hidden nodes. Nevertheless, these tests have also shown that these networks can achieve high reliability if the amount of traffic is kept low. Contrary to what is typically shown in scientific articles and in manufacturers' documentation, the test outcomes presented in this article include temporal graphs of the DR achieved by each wireless sensor device. CONCLUSIONS: The test procedure and the approach used to represent its outcomes, which allow the identification of undesirable transitory periods of low reliability due to contention between devices, constitute the main contribution of this work.
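Not part of the paper's tooling, but as an illustration of the metric involved: the delivery ratio over a window of packets is simply the fraction delivered, and plotting it per device over consecutive windows is what exposes the transitory low-DR periods the abstract mentions. A minimal sketch (all names and data hypothetical):

```python
# Sketch (not the authors' software): per-device delivery ratio (DR)
# computed over consecutive windows of sequence numbers, so that short
# periods of heavy loss become visible in a temporal plot.

def delivery_ratio_series(sent, received, window):
    """Fraction of packets delivered per fixed-size window.

    sent     -- list of sequence numbers transmitted by one sensor
    received -- set of sequence numbers seen at the ZigBee coordinator
    window   -- number of consecutive packets per window
    """
    series = []
    for start in range(0, len(sent), window):
        chunk = sent[start:start + window]
        if not chunk:
            break
        delivered = sum(1 for seq in chunk if seq in received)
        series.append(delivered / len(chunk))
    return series

# Toy example: 10 packets, packets 4 and 5 lost during a contention burst.
sent = list(range(10))
received = set(sent) - {4, 5}
dr = delivery_ratio_series(sent, received, window=5)
```

An average over the whole run would hide the burst; the windowed series keeps it visible, which is the point the abstract makes about temporal graphs.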
Abstract:
Novel ionic liquids containing ampicillin as an active pharmaceutical ingredient anion were prepared with good yields by using a new, efficient synthetic procedure based on the neutralization of a moderately basic ammonia solution of ampicillin with different organic cation hydroxides. The relevant physical and thermal properties of these novel ionic liquids based on ampicillin were also evaluated.
Abstract:
This paper aims to study the relationships between the chromosomal DNA sequences of twenty species. We propose a methodology combining DNA-based word frequency histograms, correlation methods, and an MDS technique to visualize the structural information underlying chromosomes (CRs) and species. Four statistical measures are tested (Minkowski, Cosine, Pearson product-moment, and Kendall τ rank correlations) to analyze the information content of 421 nuclear CRs from twenty species. The proposed methodology is built on mathematical tools and allows the analysis and visualization of very large amounts of stream data, like DNA sequences, with almost no assumptions other than the predefined DNA “word length.” This methodology is able to produce comprehensible three-dimensional visualizations of CR clustering and related spatial and structural patterns. The results of the four test correlation scenarios show that the high-level information clusterings produced by the MDS tool are qualitatively similar, with small variations due to the characteristics of each correlation method, and that the clusterings are a consequence of the input data and not artifacts of the methods.
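As an illustration of the kind of pipeline described (details assumed, not taken from the paper): build a k-mer ("word") frequency histogram per sequence, turn a Pearson correlation into a distance, and embed the distance matrix with classical MDS:

```python
# Sketch under assumed details: word-frequency histograms, Pearson
# correlation distance, and classical (Torgerson) MDS embedding.
import numpy as np
from itertools import product

def word_histogram(seq, k):
    """Normalized frequency histogram over all 4**k DNA words of length k."""
    words = ["".join(w) for w in product("ACGT", repeat=k)]
    index = {w: i for i, w in enumerate(words)}
    counts = np.zeros(len(words))
    for i in range(len(seq) - k + 1):
        w = seq[i:i + k]
        if w in index:
            counts[index[w]] += 1
    total = counts.sum()
    return counts / total if total else counts

def correlation_distance(h1, h2):
    """1 - Pearson product-moment correlation between two histograms."""
    return 1.0 - np.corrcoef(h1, h2)[0, 1]

def classical_mds(dist, dims=3):
    """Classical MDS: double-center the squared distances, then eigendecompose."""
    n = dist.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (dist ** 2) @ J             # Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dims]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

# Toy sequences: the first two are near-identical, the third is unrelated.
seqs = ["ACGTACGTACGT", "ACGTACGAACGT", "TTTTGGGGCCCC"]
H = [word_histogram(s, k=2) for s in seqs]
D = np.array([[correlation_distance(a, b) for b in H] for a in H])
coords = classical_mds(D, dims=2)
```

With real chromosomes the histograms have 4^k bins for the chosen word length k, and the MDS coordinates are what gets plotted in three dimensions.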
Abstract:
Model updating methods often neglect the fact that all physical structures are damped. Such a simplification relies on the structural modelling approach, although it compromises the accuracy of the predictions of the structural dynamic behaviour. In the present work, the authors address the problem of finite element (FE) model updating based on measured frequency response functions (FRFs), considering damping. The proposed procedure is based upon the complex experimental data, which contain information related to the damped FE model parameters, and presents the advantage of requiring no prior knowledge about the damping matrix structure or its content, demanding only the definition of the damping type. Numerical simulations are performed in order to establish the applicability of the proposed damped FE model updating technique, and its results are discussed in terms of the correlation between the simulated experimental complex FRFs and those obtained from the updated FE model.
Abstract:
A copper(II) chiral aza-bis(oxazoline) homogeneous catalyst (CuazaBox) was anchored onto the external surface of MCM-22 and ITQ-2 structures, as well as encapsulated into hierarchical MCM-22. The transition metal complex loading onto the porous solids was determined by ICP-AES, and the materials were also characterized by elemental analysis (C, N, H, S), FTIR, XPS, TG and low-temperature N₂ adsorption isotherms. The materials were tested as heterogeneous catalysts in the benchmark reaction of cyclopropanation of styrene to check the effect of the immobilization procedure on the catalytic parameters, as well as on their reuse over several catalytic cycles. The CuazaBox catalysts anchored onto the external surface of the MCM-22 and ITQ-2 materials were more active and enantioselective in the cyclopropanation of styrene than the corresponding homogeneous-phase reaction run under similar experimental conditions. This is due to the propylation of the acidic aza-Box nitrogen. HMCM-22 was nevertheless the best heterogeneous catalyst. Encapsulation of CuazaBox in post-synthesis-modified MCM-22 materials led to low activities and enantioselectivities, but a reversal of the stereochemical course of the reaction was observed, probably due to a confinement effect. (C) 2013 Elsevier Inc. All rights reserved.
Abstract:
OBJECTIVE: To estimate the spatial intensity of urban violence events using wavelet-based methods and emergency room data. METHODS: Information on victims attended at the emergency room of a public hospital in the city of São Paulo, Southeastern Brazil, from January 1, 2002 to January 11, 2003 was obtained from hospital records. The spatial distribution of 3,540 events was recorded, and a uniform random procedure was used to allocate records with incomplete addresses. Point processes and the wavelet analysis technique were used to estimate the spatial intensity, defined as the expected number of events per unit area. RESULTS: Of all georeferenced points, 59% were accidents and 40% were assaults. There is a non-homogeneous spatial distribution of the events, with high concentration in two districts and along three large avenues in the southern area of the city of São Paulo. CONCLUSIONS: Hospital records combined with methodological tools to estimate the intensity of events are useful to study urban violence. The wavelet analysis is useful in the computation of the expected number of events and their respective confidence bands for any sub-region and, consequently, in the specification of risk estimates that could be used in decision-making processes for public policies.
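A deliberately crude sketch of the general idea (not the paper's estimator, and with synthetic data): bin event coordinates into a grid of counts and take a one-level 2-D Haar approximation as a smoothed intensity surface, i.e. expected events per cell:

```python
# Sketch only: grid-binned event counts smoothed by a one-level 2-D Haar
# approximation. A real point-process estimator would use a full wavelet
# decomposition with thresholding and confidence bands.
import numpy as np

def bin_events(xs, ys, grid=8, extent=1.0):
    """Count events falling in each cell of a grid x grid lattice."""
    counts = np.zeros((grid, grid))
    for x, y in zip(xs, ys):
        i = min(int(y / extent * grid), grid - 1)
        j = min(int(x / extent * grid), grid - 1)
        counts[i, j] += 1
    return counts

def haar_smooth(counts):
    """One-level Haar approximation: 2x2 block means, upsampled back."""
    approx = counts.reshape(counts.shape[0] // 2, 2,
                            counts.shape[1] // 2, 2).mean(axis=(1, 3))
    return np.kron(approx, np.ones((2, 2)))

rng = np.random.default_rng(0)
# Toy hotspot: most events clustered in the lower-left quadrant.
xs = np.concatenate([rng.uniform(0, 0.3, 80), rng.uniform(0, 1, 20)])
ys = np.concatenate([rng.uniform(0, 0.3, 80), rng.uniform(0, 1, 20)])
intensity = haar_smooth(bin_events(xs, ys))
```

The smoothing preserves the total event count while flattening cell-to-cell noise, so concentrated districts stand out against the background rate.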
Abstract:
Ascorbic acid is found in many food samples. Its clinical and technological importance demands an easy-to-use, rapid, robust and inexpensive method of analysis. For this purpose, this work proposes a new flow procedure based on the oxidation of ascorbic acid by periodate. A new potentiometric periodate sensor was constructed to monitor this reaction. The selective membranes were of PVC with porphyrin-based sensing systems and a lipophilic cation as additive. The sensor displayed a near-Nernstian response for periodate over 1.0×10⁻²–6.0×10⁻⁶ M, with an anionic slope of 73.9 ± 0.9 mV decade⁻¹. It was pH independent in acidic media and presented good selectivity features towards several inorganic anions. The flow set-up operated in double-channel mode, carrying a 5.0×10⁻⁴ M IO₄⁻ solution and a suitable buffer; these were mixed in a 50-cm reaction coil. The overall flow rate was 7 ml min⁻¹ and the injection volume 70 µl. Under these conditions, a linear behaviour against concentration was observed for 17.7–194.0 µg ml⁻¹, presenting a slope of 0.169 mV (mg/l)⁻¹, a reproducibility of ±1.1 mV (n = 5), and a sampling rate of ~96 samples h⁻¹. The proposed method was applied to the analysis of beverages and pharmaceuticals.
Abstract:
The major lipid components of foods are usually analyzed by individual methodologies, using diverse extractive procedures for each class. A simple and fast extractive procedure was devised for the sequential analysis of vitamin E, cholesterol, fatty acids, and total fat estimation in seafood, reducing analysis time and organic solvent consumption. Several liquid/liquid-based extractive methodologies using chlorinated and non-chlorinated organic solvents were tested. The extract obtained is used for vitamin E quantification (normal-phase HPLC with fluorescence detection), total cholesterol (normal-phase HPLC with UV detection), and fatty acid profiling and total fat estimation (GC-FID), all accomplished in <40 min. The final methodology presents an adequate linearity range and sensitivity for tocopherol and cholesterol, with intra- and inter-day precisions (RSD) from 3 to 11% for all the components. The developed methodology was applied to diverse seafood samples with positive outcomes, making it a very attractive technique for routine analyses in laboratories with standard equipment in the food quality control field.
Abstract:
The present work describes the optimization of a short-term assay, based on the inhibition of the esterase activity of the alga Pseudokirchneriella subcapitata, in a microplate format. The optimization of the staining procedure showed that incubation of the algal cells with 20 μmol L⁻¹ fluorescein diacetate (FDA) for 40 min allowed discrimination between metabolically active and inactive cells. The short-term assay was tested using Cu as toxicant. For this purpose, algal cells, in the exponential or stationary phase of growth, were exposed to the heavy metal under growing conditions. After 3 or 6 h, cells were subsequently stained with FDA, using the optimized procedure. For Cu, the 3- and 6-h EC50 values, based on the inhibition of the esterase activity of algal cells in the exponential phase of growth, were 209 and 130 μg L⁻¹, respectively. P. subcapitata cells in the stationary phase of growth displayed higher effective concentration values than those observed in the exponential phase. The 3- and 6-h EC50 values for Cu, for cells in the stationary phase, were 443 and 268 μg L⁻¹, respectively. This short-term microplate assay proved to provide a rapid endpoint for testing toxicity using the alga P. subcapitata. The small volume required, the simplicity of the assay (no washing steps), and the automatic reading of the fluorescence make the assay particularly well suited for the evaluation of the toxicity of a high number of environmental samples.
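For illustration only (hypothetical data and helper names, not the paper's protocol): inhibition is the fractional loss of signal relative to an unexposed control, and an EC50 can be read off an inhibition curve, here by simple linear interpolation between bracketing concentrations:

```python
# Sketch: fractional inhibition from fluorescence readings and an EC50
# estimate by linear interpolation. Real assays typically fit a
# dose-response model instead; the data below are invented.

def inhibition(control_signal, sample_signal):
    """Fractional inhibition of esterase activity relative to the control."""
    return 1.0 - sample_signal / control_signal

def ec50(concentrations, inhibitions):
    """Concentration giving 50% inhibition, by linear interpolation.

    Assumes concentrations are sorted ascending and inhibition crosses 0.5.
    """
    for (c0, i0), (c1, i1) in zip(zip(concentrations, inhibitions),
                                  zip(concentrations[1:], inhibitions[1:])):
        if i0 <= 0.5 <= i1:
            return c0 + (0.5 - i0) * (c1 - c0) / (i1 - i0)
    raise ValueError("inhibition never crosses 50%")

# Toy Cu exposure curve (hypothetical): concentration in µg/L.
conc = [50, 100, 200, 400]
inhib = [0.10, 0.30, 0.48, 0.70]
cu_ec50 = ec50(conc, inhib)   # falls between 200 and 400 µg/L
```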
Abstract:
The problem of uncertainty propagation in composite laminate structures is studied. An approach based on the optimal design of composite structures to achieve a target reliability level is proposed. Using the Uniform Design Method (UDM), a set of design points is generated over a design domain centred at the mean values of the random variables, aimed at studying the variability of the design space. The most critical Tsai number, the structural reliability index and the sensitivities are obtained for each UDM design point, using the maximum load obtained from the optimal design search. Using the UDM design points as input/output patterns, an Artificial Neural Network (ANN) is developed based on supervised evolutionary learning. Finally, using the developed ANN, a Monte Carlo simulation procedure is implemented and the variability of the structural response is studied through global sensitivity analysis (GSA). The GSA is based on the first-order Sobol indices and relative sensitivities. An appropriate GSA algorithm aiming to obtain the Sobol indices is proposed. The most important sources of uncertainty are identified.
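As a sketch of the GSA step under assumed details: first-order Sobol indices estimated by Monte Carlo pick-freeze sampling (Jansen's estimator), applied here to a toy additive model rather than the laminate ANN surrogate:

```python
# Sketch: first-order Sobol indices S_i = Var(E[Y|x_i]) / Var(Y) estimated
# from two independent sample matrices A and B (pick-freeze scheme).
import numpy as np

def first_order_sobol(model, n_vars, n_samples=20000, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n_samples, n_vars))
    B = rng.uniform(size=(n_samples, n_vars))
    yA = model(A)
    yB = model(B)
    var_y = np.concatenate([yA, yB]).var()
    indices = []
    for i in range(n_vars):
        ABi = A.copy()
        ABi[:, i] = B[:, i]            # yB and model(ABi) share only input i
        yABi = model(ABi)
        # Jansen estimator: S_i = 1 - E[(yB - yABi)^2] / (2 Var[Y])
        indices.append(1.0 - 0.5 * np.mean((yB - yABi) ** 2) / var_y)
    return np.array(indices)

# Toy additive model: x0 dominates the output variance.
def model(X):
    return 3.0 * X[:, 0] + 0.5 * X[:, 1]

S = first_order_sobol(model, n_vars=2)
```

For this additive model the exact values are S₀ = (9/12)/(9.25/12) ≈ 0.97 and S₁ ≈ 0.03, so the estimate directly ranks x0 as the dominant source of uncertainty, which is the role the indices play in the study above.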
Abstract:
Engineering Education includes not only teaching theoretical fundamental concepts but also their verification during practical lessons in laboratories. The usual strategies to carry out this action are frequently based on Problem-Based Learning, starting from a given state and proceeding forward to a target state. The possibility or the effectiveness of this procedure depends on previous states and on whether the present state was caused by or resulted from earlier ones. This often happens in engineering education when the achieved results do not match the desired ones, e.g. when programming code is being developed or when the cause of the wrong behavior of an electronic circuit is being identified. It is thus important to also prepare students to proceed in the reverse way, i.e. given a start state, to generate the explanation or even the principles that underlie it. Later on, this sort of skill will be important, for instance, to a doctor taking a patient's history or to an engineer discovering the source of a malfunction. This learning methodology presents pedagogical advantages besides the enhanced preparation of students for their future work. The work presented in this document describes an automation project developed by a group of students in an engineering polytechnic school laboratory. The main objective was to improve the performance of a Braille machine. However, in a scenario of Reverse Problem-Based Learning, students had first to discover and characterize the entire machine's function before being allowed (and being able) to propose a solution for the existing problem.
Abstract:
Weblabs are spreading their influence in Science and Engineering (S&E) courses, providing a way to remotely conduct real experiments. Typically, they are implemented through different architectures and infrastructures supported by Instruments and Modules (I&Ms) able to be remotely controlled and observed. Besides the lack of a standard solution for implementing weblabs, their reconfiguration is limited to a setup procedure that enables interconnecting a set of preselected I&Ms into an Experiment Under Test (EUT). Moreover, those I&Ms cannot be replicated or shared by different weblab infrastructures, since they are usually based on hardware platforms. Thus, to overcome these limitations, this paper proposes a standard solution that uses I&Ms embedded into Field-Programmable Gate Array (FPGA) devices. An architecture based on the IEEE 1451.0 Std. is presented, supported by an FPGA-based weblab infrastructure able to be remotely reconfigured with I&Ms, described through standard Hardware Description Language (HDL) files, using a Reconfiguration Tool (RecTool).
Abstract:
One of the most important measures to prevent wild forest fires is the use of prescribed and controlled burning actions, as they reduce the available fuel mass. The impact of these management activities on soil physical and chemical properties varies according to the type of both soil and vegetation. Decisions in forest management plans are often based on the results obtained from soil-monitoring campaigns. Those campaigns are often labour-intensive and expensive. In this paper we have successfully used the multivariate statistical technique Robust Principal Component Analysis (ROBPCA) to investigate the effectiveness of the sampling procedure for two different methodologies, in order to reflect on the possibility of simplifying and reducing the sample collection process and its auxiliary laboratory analysis work, towards a cost-effective and competent forest soil characterization.
Abstract:
A bi-enzymatic biosensor (LACC–TYR–AuNPs–CS/GPE) for carbamates was prepared in a single step by electrodeposition of a hybrid film onto a graphene-doped carbon paste electrode (GPE). Graphene and the gold nanoparticles (AuNPs) were morphologically characterized by transmission electron microscopy, X-ray photoelectron spectroscopy, dynamic light scattering and laser Doppler velocimetry. The electrodeposited hybrid film was composed of laccase (LACC), tyrosinase (TYR) and AuNPs entrapped in a chitosan (CS) polymeric matrix. Experimental parameters, namely graphene redox state, AuNPs:CS ratio, enzyme concentration, pH and inhibition time, were evaluated. LACC–TYR–AuNPs–CS/GPE exhibited an improved Michaelis–Menten kinetic constant (26.9 ± 0.5 M) when compared with LACC–AuNPs–CS/GPE (37.8 ± 0.2 M) and TYR–AuNPs–CS/GPE (52.3 ± 0.4 M). Using 4-aminophenol as substrate at pH 5.5, the device presented wide linear ranges, low detection limits (1.68×10⁻⁹ ± 1.18×10⁻¹⁰ to 2.15×10⁻⁷ ± 3.41×10⁻⁹ M), high accuracy, sensitivity (1.13×10⁶ ± 8.11×10⁴ to 2.19×10⁸ ± 2.51×10⁷ % inhibition M⁻¹), repeatability (1.2–5.8% RSD), reproducibility (3.2–6.5% RSD) and stability (ca. twenty days) to determine carbaryl, formetanate hydrochloride, propoxur and ziram in citrus fruits based on their inhibitory capacity on polyphenol oxidase activity. Recoveries at two fortified levels ranged from 93.8 ± 0.3% (lemon) to 97.8 ± 0.3% (orange). Glucose, citric acid and ascorbic acid do not interfere significantly in the electroanalysis. The proposed electroanalytical procedure can be a promising tool for food safety control.
Abstract:
Hard real-time multiprocessor scheduling has seen, in recent years, the flourishing of semi-partitioned scheduling algorithms. This category of scheduling schemes combines elements of partitioned and global scheduling for the purposes of achieving efficient utilization of the system’s processing resources with strong schedulability guarantees and with low dispatching overheads. The sub-class of slot-based “task-splitting” scheduling algorithms, in particular, offers very good trade-offs between schedulability guarantees (in the form of high utilization bounds) and the number of preemptions/migrations involved. However, so far no unified schedulability theory existed for such algorithms; each one was formulated with its own accompanying analysis. This article changes this fragmented landscape by formulating a more unified schedulability theory covering the two state-of-the-art slot-based semi-partitioned algorithms, S-EKG and NPS-F (both fixed job-priority based). This new theory is based on exact schedulability tests, thus also overcoming many sources of pessimism in existing analyses. In turn, since schedulability testing guides the task assignment under the schemes in consideration, we also formulate an improved task assignment procedure. As the other main contribution of this article, and as a response to the fact that many unrealistic assumptions present in the original theory tend to undermine the theoretical potential of such scheduling schemes, we identified and modelled into the new analysis all overheads incurred by the algorithms in consideration. The outcome is a new overhead-aware schedulability analysis that permits increased efficiency and reliability. The merits of this new theory are evaluated by an extensive set of experiments.
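The interplay between a schedulability test and task assignment can be illustrated with a much simpler scheme than S-EKG/NPS-F (which additionally split tasks across processors in time slots and use exact tests): a first-fit assignment that accepts a task on a processor only while a utilization cap, standing in for the schedulability test, still holds:

```python
# Illustrative sketch only: generic first-fit partitioned assignment under
# a per-processor utilization cap. The cap plays the role of the
# schedulability test that guides assignment; tasks that fit nowhere would
# be candidates for splitting in a semi-partitioned scheme.

def first_fit_assign(utilizations, n_procs, cap=0.75):
    """Assign task utilizations to processors, first-fit, under a cap.

    Returns (assignment, loads): for each task, the processor index or
    None if no processor can accept it; and the resulting per-processor
    utilization.
    """
    loads = [0.0] * n_procs
    assignment = []
    for u in utilizations:
        for p in range(n_procs):
            if loads[p] + u <= cap:
                loads[p] += u
                assignment.append(p)
                break
        else:
            assignment.append(None)   # would need splitting / more processors
    return assignment, loads

# Hypothetical task set: per-task utilizations on a 2-processor platform.
tasks = [0.5, 0.4, 0.3, 0.2, 0.1]
assignment, load = first_fit_assign(tasks, n_procs=2)
```

The last task fits on neither processor under the cap; semi-partitioned schemes recover exactly this lost capacity by splitting such tasks across processors within time slots, at the cost of extra preemptions/migrations, which is the trade-off the article analyses.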