852 results for Initial data problem
Abstract:
Objective: To evaluate the effectiveness of the Gram stain in the initial diagnosis of the etiologic agent of peritonitis in continuous ambulatory peritoneal dialysis (CAPD). Design: Retrospective study analyzing the sensitivity (S), specificity (SS), positive predictive value (+PV), and negative predictive value (-PV) of the Gram stain relative to the results of cultures in 149 episodes of peritonitis in CAPD. The data were analyzed in two studies: the first considered only the cases in which a single agent was detected by Gram stain (Study 1); the second evaluated only the cases with two agents in the Gram stain (Study 2). Setting: Dialysis Unit and Laboratory of Microbiology of a tertiary medical center. Patients: Sixty-three patients on regular CAPD who presented one or more episodes of peritonitis from May 1992 to May 1995. Results: The positivity of the Gram stain was 93.2% and the sensitivity was 95.7%. The values of S, SS, +PV, and -PV were, respectively, 94.9%, 53.5%, 68.3%, and 90.9% for gram-positive cocci, and 83.3%, 98.8%, 95.2%, and 95.6% for gram-negative bacilli. The association of gram-positive cocci plus gram-negative bacilli was predictive of growth of both in 6.8%, growth of gram-positive cocci in 13.7%, and growth of gram-negative bacilli in 72.5%. Conclusions: The Gram stain is a method of great value in the initial diagnosis of the etiologic agent of peritonitis in CAPD, especially for gram-negative bacilli.
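The four diagnostic metrics used in this abstract follow directly from a 2x2 confusion matrix of stain result versus culture result. The sketch below computes them; the counts in the example are hypothetical, chosen only to roughly reproduce the gram-positive cocci percentages reported above (they are not the study's raw data, which the abstract does not give).

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, +PV and -PV from a 2x2 confusion matrix
    (Gram stain result vs. culture result as the reference standard)."""
    sensitivity = tp / (tp + fn)   # S: stain-positive among culture-positive
    specificity = tn / (tn + fp)   # SS: stain-negative among culture-negative
    ppv = tp / (tp + fp)           # +PV: culture-positive among stain-positive
    npv = tn / (tn + fn)           # -PV: culture-negative among stain-negative
    return sensitivity, specificity, ppv, npv

# Hypothetical counts that approximate the reported gram-positive figures:
s, ss, ppv, npv = diagnostic_metrics(tp=75, fp=35, fn=4, tn=40)
```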
Abstract:
Despite the enormous amount of research involving this molecule, hemoglobin remains a prototype for allosteric models and new conformations. Functional studies carried out on Hemoglobin-I from the South American catfish Liposarcus anisitsi [1] suggest the existence of conformational states beyond those already described for human hemoglobin, which could be confirmed crystallographically. The present work represents the initial steps towards that goal.
Abstract:
In this paper, the concept of the Matching Parallelepiped (MP) is presented. It is shown that the volume of the MP can be used as an additional measure of 'distance' between a pair of candidate points in a matching algorithm based on Relaxation Labeling (RL). The volume of the MP is related to the epipolar geometry, and this measure acts as an epipolar constraint in the RL process, reducing the effort of the matching algorithm, since it is not necessary to explicitly determine the equations of the epipolar lines or to compute the distance of a candidate point to each epipolar line. As the Relative Orientation (RO) parameters are unknown at the beginning of the process, an initial matching based on gradients, intensities, and correlation is obtained. Based on this set of labeled points, the RO is determined and the epipolar constraint is included in the algorithm. The results obtained show that the proposed approach is suitable for determining feature-point matches with simultaneous estimation of camera orientation parameters, even in cases where the optical axes of the pair are not parallel.
Abstract:
Minimization of a differentiable function subject to box constraints is proposed as a strategy to solve the generalized nonlinear complementarity problem (GNCP) defined on a polyhedral cone. It is not necessary to calculate projections, which complicate and sometimes even preclude the implementation of algorithms for solving these kinds of problems. Theoretical results relating stationary points of the minimized function to solutions of the GNCP are presented. Perturbations of the GNCP are also considered, and results are obtained on the resolution of GNCPs under very general assumptions on the data. These theoretical results show that local methods for box-constrained optimization applied to the associated problem are efficient tools for solving the GNCP. Numerical experiments that encourage the use of this approach are presented.
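The general idea of replacing a complementarity problem by a smooth minimization can be illustrated on the simplest case, a linear complementarity problem. The sketch below is not the paper's box-constrained reformulation; it uses the related, well-known Fischer-Burmeister merit function, whose global minima are exactly the complementarity solutions, minimized here by plain gradient descent with a numeric gradient. The instance (M, q) is a toy example chosen so the solution is known.

```python
import math

def fb(a, b):
    """Fischer-Burmeister function: zero iff a >= 0, b >= 0 and a*b = 0."""
    return math.hypot(a, b) - a - b

def merit(x, F):
    """Squared FB merit; zero exactly at complementarity solutions."""
    Fx = F(x)
    return 0.5 * sum(fb(xi, fi) ** 2 for xi, fi in zip(x, Fx))

def solve_ncp(F, x0, lr=0.05, iters=4000, h=1e-6):
    """Minimize the FB merit by gradient descent (central-difference gradient)."""
    x = list(x0)
    for _ in range(iters):
        g = []
        for i in range(len(x)):
            xp = x[:]; xp[i] += h
            xm = x[:]; xm[i] -= h
            g.append((merit(xp, F) - merit(xm, F)) / (2 * h))
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Toy LCP: F(x) = M x + q with M = [[2, 1], [1, 2]], q = [-1, -1].
# The unique solution is x = (1/3, 1/3), where F(x) = 0.
F = lambda x: [2 * x[0] + x[1] - 1, x[0] + 2 * x[1] - 1]
x = solve_ncp(F, [0.5, 0.5])
```

Because M is positive definite here, every stationary point of the merit is a solution, so the local method suffices, which mirrors the abstract's point that local box-constrained methods become efficient tools once the right reformulation is chosen.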
Abstract:
Until mid-2006, SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent to GDP 2, ground-based validations of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, such as a large cloud-dependent offset occurring at northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0. In parallel, the calibration of SCIAMACHY radiometric data was upgraded. Before the operational switch-on of SGP 3.0 and the public release of upgraded SCIAMACHY NO2 data, we investigated the accuracy of the algorithm transfer: (a) by checking the consistency of SGP 3.0 with the prototype algorithms; and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement with respect to the previous processor, IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although their quantitative impact on SGP 3.0 vertical columns is not significant.
Abstract:
This paper proposes a cluster partitioning technique to calculate improved upper bounds on the optimal solution of maximal covering location problems. Given a covering distance, a graph is built whose vertices are the potential facility locations, with an edge connecting each pair of facilities that serve the same client. Coupling constraints, corresponding to some edges of this graph, are identified and relaxed in a Lagrangean fashion, resulting in disconnected subgraphs representing smaller subproblems that are computationally easier to solve by exact methods. The proposed technique is compared to the classical approach, using real data and instances from the available literature. © 2010 Edson Luiz França Senne et al.
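The decomposition step described above is mechanical once the coupling edges are chosen: dropping them from the facility graph and extracting connected components yields the independent subproblems. A minimal sketch with union-find (the instance and the choice of relaxed edge are illustrative, not from the paper):

```python
def components(n, edges, relaxed):
    """Connected components of the facility graph after the coupling edges
    in `relaxed` are dropped (the Lagrangean relaxation step)."""
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    for u, v in edges:
        if (u, v) in relaxed or (v, u) in relaxed:
            continue                         # relaxed edge: do not merge
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# 5 facilities in a chain; edge (1, 2) is identified as coupling and relaxed,
# leaving two independent subproblems.
subproblems = components(5, [(0, 1), (1, 2), (2, 3), (3, 4)], {(1, 2)})
```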
Abstract:
In this paper, a framework based on the decomposition of the first-order optimality conditions is described and applied to solve the Probabilistic Power Flow (PPF) problem in a coordinated but decentralized way, in the context of multi-area power systems. The purpose of the decomposition framework is to solve the problem through an iterative process of solving smaller subproblems, each associated with one area of the power system. This strategy allows the probabilistic analysis of the variables of interest in a particular area without explicit knowledge of the network data of the other interconnected areas; it is only necessary to exchange border information related to the tie-lines between areas. An efficient method for probabilistic analysis, considering uncertainty in n system loads, is applied. The proposal is to use a particular case of the point estimate method, known as the Two-Point Estimate Method (TPM), rather than the traditional approach based on Monte Carlo simulation. The main feature of the TPM is that it requires solving only 2n power flows to obtain the behavior of any random variable. An iterative coordination algorithm between areas is also presented. This algorithm solves the multi-area PPF problem in a decentralized way, ensures the independent operation of each area, and integrates the decomposition framework and the TPM appropriately. The IEEE RTS-96 system is used to show the operation and effectiveness of the proposed approach, and Monte Carlo simulations are used to validate the results. © 2011 IEEE.
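The "2n evaluations" structure of the point estimate method is easy to see on a toy model. The sketch below implements one common 2n-point scheme (Hong's concentrations for symmetric, zero-skewness inputs: each variable shifted to mu_i +/- sqrt(n)*sigma_i, weight 1/(2n)); the abstract does not specify which TPM variant the paper uses, and the "power flow" here is just a sum of uncertain loads, for which the scheme reproduces the exact mean and variance.

```python
import math

def two_point_estimate(h, mu, sigma):
    """2n-point estimate of the mean and variance of h(X) for independent,
    symmetric inputs: evaluate h with one variable at a time shifted to
    mu_i +/- sqrt(n)*sigma_i, each evaluation weighted 1/(2n)."""
    n = len(mu)
    w = 1.0 / (2 * n)
    m1 = m2 = 0.0                    # running first and second raw moments
    for i in range(n):
        for sign in (+1.0, -1.0):
            x = list(mu)
            x[i] = mu[i] + sign * math.sqrt(n) * sigma[i]
            y = h(x)
            m1 += w * y
            m2 += w * y * y
    return m1, m2 - m1 * m1          # estimated mean and variance of h(X)

# Toy "power flow": total demand Y = sum of 3 uncertain loads.
# Only 2 * 3 = 6 evaluations of h are needed.
mean, var = two_point_estimate(sum, [10.0, 20.0, 30.0], [1.0, 2.0, 3.0])
```

For this linear h the exact answers are mean 60 and variance 1 + 4 + 9 = 14, which the six evaluations recover; for a nonlinear power-flow mapping the result is an approximation, which is the trade-off against Monte Carlo mentioned above.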
Abstract:
This paper proposes a tabu search approach to solve the Synchronized and Integrated Two-Level Lot Sizing and Scheduling Problem (SITLSP). It is a real-world problem, often found in soft drink companies, where the production process has two integrated levels with decisions concerning raw material storage and soft drink bottling. Lot sizing and scheduling of raw materials in tanks and of products on bottling lines must be determined simultaneously. Real data provided by a soft drink company are used to make comparisons with a previous genetic algorithm. Computational results demonstrate that tabu search outperformed the genetic algorithm on all instances. Copyright 2011 ACM.
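The tabu search mechanism itself is generic: take the best non-tabu neighbor each iteration (even if worse than the current solution) and forbid reversing recent moves for a few iterations. The sketch below is a minimal skeleton on a toy binary objective, not the paper's SITLSP neighborhood; the bit-flip moves, tenure, and target are all illustrative.

```python
from collections import deque

def tabu_search(f, x0, tenure=2, iters=200):
    """Minimal tabu search over bit vectors: best non-tabu single bit flip
    per iteration; a flipped position stays tabu for `tenure` iterations."""
    x = list(x0)
    best, best_val = x[:], f(x)
    tabu = deque(maxlen=tenure)          # short-term memory of recent moves
    for _ in range(iters):
        move, move_val = None, None
        for i in range(len(x)):
            if i in tabu:
                continue
            x[i] ^= 1                    # tentatively flip bit i
            v = f(x)
            x[i] ^= 1                    # undo
            if move_val is None or v < move_val:
                move, move_val = i, v
        x[move] ^= 1                     # accept best non-tabu move
        tabu.append(move)
        if move_val < best_val:
            best, best_val = x[:], move_val
    return best, best_val

# Toy objective: Hamming distance to the target pattern 10110.
target = [1, 0, 1, 1, 0]
f = lambda x: sum(a != b for a, b in zip(x, target))
sol, val = tabu_search(f, [0, 0, 0, 0, 0])
```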
Abstract:
The Capacitated Arc Routing Problem (CARP) is a well-known NP-hard combinatorial optimization problem where, given an undirected graph, the objective is to find a minimum cost set of tours servicing a subset of required edges under vehicle capacity constraints. There are numerous applications for the CARP, such as street sweeping, garbage collection, mail delivery, school bus routing, and meter reading. A Greedy Randomized Adaptive Search Procedure (GRASP) with Path-Relinking (PR) is proposed and compared with other successful CARP metaheuristics. Some features of this GRASP with PR are: (i) reactive parameter tuning, where the parameter value is stochastically selected, biased in favor of those values that historically produced the best solutions on average; (ii) a statistical filter, which discards initial solutions that are unlikely to improve the incumbent best solution; (iii) infeasible local search, where high-quality though infeasible solutions are used to explore the boundary between the feasible and infeasible regions of the solution space; (iv) evolutionary PR, a recent trend in which the pool of elite solutions is progressively improved by successive relinking of pairs of elite solutions. Computational tests were conducted using a set of 81 instances, and the results reveal that the GRASP is very competitive, achieving the best overall deviation from lower bounds and the highest number of best solutions found. © 2011 Elsevier Ltd. All rights reserved.
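The GRASP core (greedy randomized construction via a restricted candidate list, followed by local search, repeated over many starts) is independent of the CARP specifics. The sketch below shows that skeleton on a toy weighted set-cover instance rather than on arc routing, and omits path-relinking entirely; the instance, the alpha value, and the redundancy-removal local search are all illustrative choices, not the paper's components.

```python
import random

def grasp_set_cover(universe, sets, costs, alpha=0.3, iters=50, seed=1):
    """Minimal GRASP: randomized greedy construction with a restricted
    candidate list (RCL), then a redundancy-removal local search."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        uncovered, chosen = set(universe), []
        while uncovered:
            # greedy score: cost per newly covered element
            scores = {i: costs[i] / len(sets[i] & uncovered)
                      for i in range(len(sets)) if sets[i] & uncovered}
            lo, hi = min(scores.values()), max(scores.values())
            rcl = [i for i, s in scores.items() if s <= lo + alpha * (hi - lo)]
            pick = rng.choice(rcl)           # randomized choice within the RCL
            chosen.append(pick)
            uncovered -= sets[pick]
        # local search: drop sets that became redundant, costliest first
        for i in sorted(chosen, key=lambda i: -costs[i]):
            rest = [j for j in chosen if j != i]
            if set().union(*(sets[j] for j in rest)) >= set(universe):
                chosen = rest
        cost = sum(costs[i] for i in chosen)
        if cost < best_cost:
            best, best_cost = chosen, cost
    return best, best_cost

U = {1, 2, 3, 4, 5}
S = [{1, 2, 3}, {3, 4}, {4, 5}, {1, 2, 3, 4, 5}]
c = [3.0, 2.0, 2.0, 10.0]
sol, cost = grasp_set_cover(U, S, c)
```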
Abstract:
The central and western portion of São Paulo State has large areas of sugar cane plantations, and due to the growing demand for biofuels, production is increasing every year. During the harvest period, some plantation areas are burnt a few hours before the manual cutting, causing significant quantities of biomass burning aerosol to be injected into the atmosphere. During August 2010, a field campaign was carried out in Ourinhos, situated in the south-western region of São Paulo State. A 2-channel Raman Lidar system and two meteorological S-band Doppler radars were used to identify and quantify the biomass burning plumes. In addition, CALIPSO satellite observations were used to compare the aerosol optical properties detected in that region with those retrieved by the Raman Lidar system. Although the campaign yielded 30 days of measurements, this paper focuses on a single case study, in which aerosols released from nearby sugar cane fires were detected by the Lidar system during a CALIPSO overpass. The meteorological radar, installed in Bauru, approximately 110 km northeast of the experimental site, recorded echoes (dense smoke comprising aerosols) from several fires occurring close to the Raman Lidar system, which also detected an intense load of aerosol in the atmosphere. HYSPLIT model forward trajectories strongly indicated that both instruments measured the same air mass parcels; this was corroborated by the Lidar Ratio values from the 532 nm elastic and 607 nm Raman N2 channel analyses, while data retrieved from CALIPSO indicated the predominance of aerosol from biomass burning sources. © 2011 SPIE.
Abstract:
Detecting misbehavior (such as the transmission of false information) in vehicular ad hoc networks (VANETs) is a very important problem with a wide range of implications, including safety-related and congestion avoidance applications. We discuss several limitations of existing misbehavior detection schemes (MDS) designed for VANETs. Most MDS are concerned with the detection of malicious nodes. In most situations, however, vehicles send wrong information for selfish reasons of their owners, e.g. to gain access to a particular lane. It is therefore more important to detect false information than to identify misbehaving nodes. We introduce the concept of data-centric misbehavior detection and propose algorithms that detect false alert messages and misbehaving nodes by observing their actions after the alert messages are sent out. With the data-centric MDS, each node can decide whether received information is correct or false. The decision is based on the consistency of recent messages and new alerts with reported and estimated vehicle positions. No voting or majority decision is needed, making our MDS resilient to Sybil attacks. After misbehavior is detected, we do not revoke all the secret credentials of misbehaving nodes, as is done in most schemes. Instead, we impose fines on misbehaving nodes (administered by the certification authority), discouraging them from acting selfishly. This reduces the computation and communication costs involved in revoking all the secret credentials of misbehaving nodes. © 2011 IEEE.
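The "observe actions after the alert" idea can be made concrete with a deliberately simplified consistency check, not the paper's actual algorithm: a vehicle that reported an accident or sudden stop should remain near the reported location, so sustained movement in its subsequent position beacons contradicts the alert. The beacon format and speed threshold below are illustrative assumptions.

```python
def alert_is_false(beacons, speed_threshold=2.0):
    """Data-centric sketch: judge a stop/accident alert false if the sender's
    subsequent beacons (t, x, y) show it kept moving above the threshold
    (units here are seconds and meters, so the threshold is in m/s)."""
    for (t0, x0, y0), (t1, x1, y1) in zip(beacons, beacons[1:]):
        speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / (t1 - t0)
        if speed > speed_threshold:
            return True    # sender kept moving: behavior contradicts the alert
    return False

# A sender reports a crash but its position then advances at 30 m/s:
moving = [(0, 0, 0), (1, 30, 0), (2, 60, 0)]
# Another sender's beacons show it essentially stationary, as claimed:
stopped = [(0, 0, 0), (1, 0.5, 0), (2, 1.0, 0)]
```

Note that the verdict depends only on the sender's own observable behavior, not on votes from other nodes, which is what gives the data-centric approach its resilience to Sybil attacks.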
Abstract:
In the present paper, a study is made to find an algorithm that can calculate coplanar orbital maneuvers for an artificial satellite. The idea is to find a method fast enough to be combined with onboard orbit determination using GPS data collected from a receiver located in the satellite. After a search of the literature, three algorithms were selected for testing. Preliminary studies show that one of them (the so-called Minimum Delta-V Lambert Problem) has several advantages over the other two, both in accuracy and in processing time. This algorithm is therefore implemented and tested numerically in combination with the orbit determination procedure. Some adjustments are made to the algorithm in the present paper to allow its use in real-time onboard applications. Considering the whole maneuver, first a simplified and compact algorithm is used to estimate, in real time and onboard, the artificial satellite's orbit using the GPS measurements. Using the estimated orbit as the initial one and the final desired orbit (from the mission specification) as the target, a coplanar bi-impulsive maneuver that minimizes fuel consumption is calculated. Two kinds of maneuvers are performed: one varying only the semi-major axis, and the other varying the semi-major axis and the eccentricity of the orbit simultaneously. The possibility of restrictions on the locations where the impulses may be applied is included, as well as the possibility of controlling the trade-off between processing time and solution accuracy. These are the two main reasons to recommend this method for the proposed application.
Abstract:
Semi-supervised learning is applied to classification problems where only a small portion of the data items is labeled. In these cases, the reliability of the labels is a crucial factor, because mislabeled items may propagate wrong labels to a large portion of, or even the entire, data set. This paper addresses this problem by presenting a graph-based (network-based) semi-supervised learning method specifically designed to handle data sets with mislabeled samples. The method uses teams of walking particles, with competitive and cooperative behavior, for label propagation in the network constructed from the input data set. The proposed model is nature-inspired and incorporates features that make it robust to a considerable amount of mislabeled data items. Computer simulations show the performance of the method in the presence of different percentages of mislabeled data, in networks of different sizes and average node degrees. Importantly, these simulations reveal the existence of critical points in the mislabeled subset size, below which the network is free of wrong-label contamination, but above which the mislabeled samples start to propagate their labels to the rest of the network. Moreover, numerical comparisons have been made between the proposed method and other representative graph-based semi-supervised learning methods, using both artificial and real-world data sets. Interestingly, the advantage of the proposed method over the others grows as the percentage of mislabeled samples increases. © 2012 IEEE.
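The baseline mechanism the particle-competition model improves on is plain graph-based label propagation, which also makes the contamination risk visible: every node simply adopts the majority label of its neighbors, so a mislabeled seed spreads exactly like a correct one. The sketch below is this standard baseline, not the paper's particle method; the graph and seeds are illustrative.

```python
def propagate_labels(adj, labels, iters=50):
    """Iterative label propagation on a graph: each unlabeled node takes the
    majority label among its labeled neighbors; seed nodes stay fixed."""
    current = dict(labels)               # node -> label, starting from seeds
    for _ in range(iters):
        updated = dict(current)
        for node, neigh in adj.items():
            if node in labels:           # seeds never change
                continue
            votes = {}
            for n in neigh:
                if n in current:
                    votes[current[n]] = votes.get(current[n], 0) + 1
            if votes:
                # deterministic tie-break via sorted label order
                updated[node] = max(sorted(votes), key=votes.get)
        current = updated
    return current

# Two loosely connected triangles with one labeled seed in each:
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
result = propagate_labels(adj, {0: "A", 5: "B"})
```

If the seed at node 5 were mislabeled, the whole right-hand triangle would inherit the wrong label, which is precisely the failure mode whose critical point the abstract's simulations characterize.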