886 results for Search-based technique
Abstract:
Nurse rostering is a complex scheduling problem that affects hospital personnel on a daily basis all over the world. This paper presents a new component-based approach with adaptive perturbations for a nurse scheduling problem arising at a major UK hospital. The main idea behind this technique is to decompose a schedule into its components (i.e. the allocated shift pattern of each nurse), and then mimic a natural evolutionary process on these components to iteratively deliver better schedules. The worthiness of all components in the schedule has to be continuously demonstrated in order for them to remain there. This demonstration employs a dynamic evaluation function which evaluates how well each component contributes towards the final objective. Two perturbation steps are then applied: the first perturbation eliminates a number of components that are deemed not worthy to stay in the current schedule; the second perturbation may also throw out, with a low probability, some worthy components. The eliminated components are replenished with new ones generated by a set of constructive heuristics based on local optimality criteria. Computational results using 52 data instances demonstrate the applicability of the proposed approach to solving real-world problems.
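As an illustration only (not the authors' implementation), the following Python sketch mirrors the component-based loop described above. The names evaluate, build_pattern and shift_patterns, the worst-quartile elimination rule, the escape probability and the use of summed component scores as the overall objective are all assumptions made for the example.

```python
import random

def adaptive_perturbation_schedule(nurses, shift_patterns, evaluate, build_pattern,
                                   iterations=1000, escape_prob=0.05):
    """Illustrative sketch of the component-based loop with adaptive perturbations.

    `evaluate(schedule, nurse)` is assumed to score how well one component (the
    nurse's allocated shift pattern) contributes to the overall objective, and
    `build_pattern(schedule, nurse)` is assumed to be a constructive heuristic
    proposing a new, locally good pattern.  Both are problem-specific.
    """
    # Initial schedule: one shift pattern (one component) per nurse.
    schedule = {n: random.choice(shift_patterns[n]) for n in nurses}
    best = dict(schedule)

    for _ in range(iterations):
        scores = {n: evaluate(schedule, n) for n in nurses}
        threshold = sorted(scores.values())[len(nurses) // 4]   # worst quartile

        for n in nurses:
            unworthy = scores[n] <= threshold                   # first perturbation
            unlucky = random.random() < escape_prob             # second perturbation
            if unworthy or unlucky:
                schedule[n] = build_pattern(schedule, n)        # replenish component

        # Keep the best schedule seen so far (summed component scores as a proxy).
        if sum(evaluate(schedule, n) for n in nurses) > sum(evaluate(best, n) for n in nurses):
            best = dict(schedule)
    return best
```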
Abstract:
We evaluated the performance of a novel procedure for segmenting mammograms and detecting clustered microcalcifications in two types of image sets obtained by digitizing mammograms with either a laser scanner or a conventional "optical" scanner. Specific regions forming the digital mammograms, in which clustered microcalcifications did or did not appear, were identified and selected. A remarkable increase in image intensity was noticed in the images from the optical scanner compared with the original mammograms. A procedure based on a polynomial correction was developed to compensate for the differences between the characteristic curves of the scanners and those of the films. The processing scheme was applied to both sets, before and after the polynomial correction. The results clearly indicated the influence of mammogram digitization on the performance of processing schemes intended to detect microcalcifications. Without the polynomial intensity correction, the image processing techniques applied to mammograms digitized by both scanners resulted in better sensitivity in detecting microcalcifications in the images from the laser scanner. However, when the polynomial correction was applied to the images from the optical scanner, no differences in performance were observed between the two types of images. (C) 2008 SPIE and IS&T [DOI: 10.1117/1.3013544]
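As a hedged sketch of the kind of polynomial intensity correction described above (not the paper's actual calibration), the following maps scanner intensities back onto the film characteristic curve; the calibration values and the polynomial degree are invented for illustration.

```python
import numpy as np

# Calibration pairs: reference (film) intensities and the intensities the optical
# scanner reports for the same targets.  Values are illustrative only.
film_response = np.array([0.10, 0.25, 0.45, 0.60, 0.78, 0.90])
scanner_response = np.array([0.22, 0.41, 0.63, 0.75, 0.88, 0.95])

# Fit a polynomial that maps scanner intensities back onto the film curve.
degree = 3  # assumed; the abstract does not state the polynomial order
coeffs = np.polyfit(scanner_response, film_response, degree)
correct = np.poly1d(coeffs)

def correct_image(img):
    """Apply the fitted intensity correction pixel-wise to a normalized image."""
    return np.clip(correct(img), 0.0, 1.0)
```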
Abstract:
Context. HD 181231 is a B5IVe star, which has been observed with the CoRoT satellite for ~5 consecutive months and simultaneously from the ground in spectroscopy and spectropolarimetry. Aims. By analysing these data, we aim to detect and characterize as many pulsation frequencies as possible, and to search for the presence of beating effects possibly at the origin of the Be phenomenon. Our results will also provide a basis for seismic modelling. Methods. The fundamental parameters of the star are determined from spectral fitting and from the study of the circumstellar emission. The CoRoT photometric data and ground-based spectroscopy are analysed using several Fourier techniques: CLEAN-NG, PASPER, and TISAFT, as well as a time-frequency technique. A search for a magnetic field is performed by applying the LSD technique to the spectropolarimetric data. Results. We find that HD 181231 is a B5IVe star seen with an inclination of ~45 degrees. No magnetic field is detected in its photosphere. We detect at least 10 independent significant frequencies of variation among the 54 detected frequencies, interpreted in terms of non-radial pulsation modes and rotation. Two longer-term variations are also detected: one at ~14 days resulting from a beating effect between the two main frequencies of short-term variations, the other at ~116 days due either to a beating of frequencies or to a zonal pulsation mode. Conclusions. Our analysis of the CoRoT light curve and ground-based spectroscopic data of HD 181231 has led to the determination of the fundamental and pulsational parameters of the star, including beating effects. This will allow precise seismic modelling of this star.
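The ~14-day term follows from the elementary beating relation P_beat = 1/|f1 - f2|. A small numerical illustration follows; the two frequencies are invented so that their beat period is about 14 days and are not the star's measured values.

```python
import numpy as np

# Two close short-term frequencies (cycles/day); illustrative values only.
f1, f2 = 0.695, 0.624
beat_period = 1.0 / abs(f1 - f2)          # about 14 days

# The sum of two sinusoids of nearby frequency shows a slow envelope whose
# period equals the beat period computed above.
t = np.linspace(0.0, 60.0, 5000)          # days
signal = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
envelope = 2.0 * np.abs(np.cos(np.pi * (f1 - f2) * t))
print(f"beat period = {beat_period:.1f} days")
```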
Abstract:
The aim of this paper was to study a method based on the gas production technique to measure the biological effects of tannins on rumen fermentation. Six feeds were used as fermentation substrates in a semi-automated gas method: feed A, aroeira (Astronium urundeuva); feed B, jurema preta (Mimosa hostilis); feed C, sorghum grains (Sorghum bicolor); feed D, Tifton-85 (Cynodon sp.); and two others prepared by mixing 450 g sorghum leaves, 450 g concentrate (maize and soybean meal) and 100 g of either acacia (Acacia mearnsii) tannin extract (feed E) or quebracho (Schinopsis lorentzii) tannin extract (feed F) per kg (w:w). Three assays were carried out to standardize the bioassay for tannins. The first assay compared two binding agents (polyethylene glycol, PEG, and polyvinyl polypyrrolidone, PVPP) to attenuate the tannin effects. The complex formed by PEG and tannins proved to be more stable than that formed by PVPP and tannins. In the second assay, PEG was therefore used as the binding agent, and levels of PEG (0, 500, 750, 1000 and 1250 mg/g DM) were evaluated to minimize the tannin effect. All tested levels of PEG produced a usable response, but the best response was obtained with the dose of 1000 mg/g DM. Using this dose of PEG, the final assay tested three compounds (tannic acid, quebracho extract and acacia extract) to establish a curve of biologically equivalent tannin effect. For this, five levels of each compound were added to 1 g of a standard feed (lucerne hay). The equivalent effect was not directly related to the chemical analysis for tannins, showing that different sources of tannins have different activities or reactivities. The curves of biological equivalence can provide information about tannin reactivity, and their use seems to be important as an additional factor alongside chemical analysis. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
A phase-only encryption/decryption scheme with the readout based on the zeroth-order phase-contrast technique (ZOPCT), without the use of a phase-changing plate on the Fourier plane of an optical system based on the 4f optical correlator, is proposed. The encryption of a gray-level image is achieved by multiplying the phase distribution obtained directly from the gray-level image by a random phase distribution. The robustness of the encoding is assured by the nonlinearity intrinsic to the proposed phase-contrast method and by the random phase distribution used in the encryption process. The experimental system has been implemented with liquid-crystal spatial modulators to generate phase-encrypted masks and a decrypting key. The advantage of this method is that the gray-level information can be easily recovered from the decrypted phase-only mask by applying the ZOPCT. An analysis of this decryption method against brute-force attacks was also performed. (C) 2009 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.3223629]
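A purely numerical sketch of the encryption/decryption principle (not the optical implementation): the gray-level image is encoded as a phase-only mask, scrambled by a random phase key, and unscrambled with the conjugate key. Here the gray levels are read back by taking the phase angle directly, whereas in the paper this readout is performed optically with the ZOPCT.

```python
import numpy as np

rng = np.random.default_rng(0)

def encrypt(gray_img):
    """Encode a normalized gray-level image as a phase-only mask and scramble it."""
    phase = np.exp(1j * np.pi * gray_img)                        # phase-only image
    key = np.exp(1j * 2 * np.pi * rng.random(gray_img.shape))    # random phase key
    return phase * key, key

def decrypt(encrypted, key):
    """Remove the random phase and recover the gray levels numerically
    (the optical system would read them out with the ZOPCT instead)."""
    phase_only = encrypted * np.conj(key)
    return np.angle(phase_only) / np.pi

img = rng.random((64, 64))          # toy gray-level image in [0, 1)
enc, key = encrypt(img)
rec = decrypt(enc, key)
assert np.allclose(rec, img, atol=1e-9)
```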
Abstract:
A general, fast wavelet-based adaptive collocation method is formulated for heat and mass transfer problems involving a steep moving profile of the dependent variable. The grid adaptation technique is based on the sparse point representation (SPR). The method is applied and tested for the case of a gas–solid non-catalytic reaction in a porous solid at high Thiele modulus. Accurate and convergent steep profiles are obtained for Thiele moduli as large as 100 in the slab case and are found to match the analytical solution.
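A minimal sketch of SPR-style grid adaptation, assuming interpolating-wavelet details obtained by mid-point interpolation on a dyadic grid; the threshold, number of levels and test profile below are illustrative, not the paper's settings.

```python
import numpy as np

def sparse_point_representation(f, levels=10, eps=1e-3):
    """Keep a dyadic grid point on [0, 1] only when its detail coefficient
    (the difference between f at the point and the value predicted by linear
    interpolation from the next coarser level) exceeds the threshold eps."""
    keep = {0.0, 1.0}                            # coarsest grid is always kept
    for j in range(1, levels + 1):
        h = 1.0 / 2 ** j
        for k in range(1, 2 ** j, 2):            # points that are new at level j
            x = k * h
            predicted = 0.5 * (f(x - h) + f(x + h))
            if abs(f(x) - predicted) > eps:
                keep.add(x)
    return np.array(sorted(keep))

# Example: a steep front like the high-Thiele-modulus reaction profile;
# the adapted grid clusters its points around x = 0.3 where the profile is steep.
front = lambda x: np.tanh(100.0 * (x - 0.3))
grid = sparse_point_representation(front)
```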
Abstract:
The image reconstruction using the EIT (Electrical Impedance Tomography) technique is a nonlinear and ill-posed inverse problem which demands a powerful direct or iterative method. A typical approach for solving the problem is to minimize an error functional using an iterative method. In this case, an initial solution close enough to the global minimum is mandatory to ensure convergence to the correct minimum in an appropriate time interval. The aim of this paper is to present a new, simple and low-cost technique (quadrant-searching) to reduce the search space and consequently to obtain an initial solution of the inverse problem of EIT. This technique calculates the error functional for four different contrast distributions, placing a large prospective inclusion in each of the four quadrants of the domain. By comparing the four values of the error functional, it is possible to draw conclusions about the internal electric contrast. To this end, we first performed tests to assess the accuracy of the BEM (Boundary Element Method) when applied to the direct problem of EIT and to verify the behavior of the error functional surface in the search space. Finally, numerical tests were performed to verify the new technique.
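A sketch of the quadrant-searching idea, assuming a square domain discretized as an n-by-n contrast map and a placeholder forward_model standing in for the BEM solution of the direct problem; all names and values are illustrative.

```python
import numpy as np

def quadrant_search(forward_model, measured, background=1.0, inclusion=2.0, n=32):
    """Build four trial contrast maps, each with one large inclusion filling a
    quadrant of the domain, evaluate the error functional for each and return
    the quadrant whose trial distribution best fits the measured boundary data.
    `forward_model(contrast)` is assumed to return simulated boundary voltages."""
    half = n // 2
    quadrants = {"NW": (slice(0, half), slice(0, half)),
                 "NE": (slice(0, half), slice(half, n)),
                 "SW": (slice(half, n), slice(0, half)),
                 "SE": (slice(half, n), slice(half, n))}
    errors = {}
    for name, (rows, cols) in quadrants.items():
        contrast = np.full((n, n), background)
        contrast[rows, cols] = inclusion
        simulated = forward_model(contrast)
        errors[name] = float(np.sum((simulated - measured) ** 2))   # error functional
    best = min(errors, key=errors.get)
    return best, errors
```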
Abstract:
Formal Concept Analysis is an unsupervised machine learning technique that has successfully been applied to document organisation by considering documents as objects and keywords as attributes. The basic algorithms of Formal Concept Analysis then allow an intelligent information retrieval system to cluster documents according to keyword views. This paper investigates the scalability of this idea. In particular, we present the results of applying spatial data structures to large datasets in Formal Concept Analysis. Our experiments are motivated by the application of Formal Concept Analysis to a virtual filesystem [11,17,15], in particular the libferris [1] Semantic File System. This paper presents customizations to an RD-Tree index structure based on the Generalized Index Search Tree (GiST) to better support the application of Formal Concept Analysis to large data sources.
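For readers unfamiliar with the basics, the following toy sketch shows how formal concepts arise from a documents-by-keywords context. It uses a naive enumeration with no RD-Tree/GiST indexing, and the context is made up for the example.

```python
from itertools import combinations

# Toy formal context: documents (objects) x keywords (attributes).
context = {
    "doc1": {"search", "tree"},
    "doc2": {"search", "index"},
    "doc3": {"tree", "index"},
}

def common_attributes(docs):
    """Attributes shared by every document in docs (all attributes if docs is empty)."""
    if not docs:
        return set().union(*context.values())
    return set.intersection(*(context[d] for d in docs))

def common_objects(attrs):
    """Documents whose keyword sets contain every attribute in attrs."""
    return {d for d, kw in context.items() if attrs <= kw}

# Naive enumeration of all formal concepts: closed (extent, intent) pairs.
concepts = set()
for r in range(len(context) + 1):
    for docs in combinations(context, r):
        intent = frozenset(common_attributes(set(docs)))
        extent = frozenset(common_objects(intent))
        concepts.add((extent, intent))

# Each concept clusters the documents sharing exactly the keywords in its intent.
for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))
```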
Abstract:
Objective: To test the feasibility of an evidence-based clinical literature search service to help answer general practitioners' (GPs') clinical questions. Design: Two search services supplied GPs who submitted questions with the best available empirical evidence to answer those questions. The GPs provided feedback on the value of the service, and the concordance of answers from the two search services was assessed. Setting: Two literature search services (Queensland and Victoria), operating for nine months from February 1999. Main outcome measures: Use of the service; time taken to locate answers; availability of evidence; value of the service to GPs; and consistency of answers from the two services. Results: 58 GPs asked 160 questions (29 asked one, 11 asked five or more). The questions concerned treatment (65%), aetiology (17%), prognosis (13%), and diagnosis (5%). Answering a question took a mean of 3 hours 32 minutes of personnel time (95% CI, 2.67-3.97); nine questions took longer than 10 hours each to answer, the longest taking 23 hours 30 minutes. Evidence of suitable quality to provide a sound answer was available for 126 (79%) questions. Feedback data for 84 (53%) questions, provided by 42 GPs, showed that they appreciated the service and that asking the questions changed clinical care. There were many minor differences between the answers from the two centres, and substantial differences in the evidence found for 4/14 questions. However, the conclusions reached were largely similar, with no or only minor differences for all questions. Conclusions: It is feasible to provide a literature search service, but further assessment is needed to establish its cost-effectiveness.
Abstract:
This paper presents a new and efficient methodology for distribution network reconfiguration integrated with optimal power flow (OPF), based on a Benders decomposition approach. The objective is to minimize power losses and balance load among feeders, subject to constraints: branch capacity limits, minimum and maximum power limits of substations or distributed generators, minimum deviation of bus voltages, and radial optimal operation of the network. The Generalized Benders decomposition algorithm is applied to solve the problem. The formulation is decomposed into two stages. The first stage is the Master problem, formulated as a mixed-integer non-linear programming problem, which determines the radial topology of the distribution network. The second stage is the Slave problem, formulated as a non-linear programming problem, which checks the feasibility of the Master problem solution by means of an OPF and provides the information used to formulate the linear Benders cuts that connect both problems. The model is programmed in GAMS. The effectiveness of the proposal is demonstrated through two examples taken from the literature.
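A minimal sketch of the two-stage iteration structure described above. Here solve_master (the mixed-integer non-linear Master choosing a radial topology) and solve_slave (the OPF returning a cost and a Benders cut) are placeholder callables, since the actual models are written in GAMS.

```python
def benders_loop(solve_master, solve_slave, max_iter=50, tol=1e-4):
    """Generic Generalized Benders iteration: the master proposes a topology
    (integer decisions) and a lower bound, the slave runs an OPF for that
    topology and returns its cost together with a Benders cut, and the cut is
    added to the master until the upper and lower bounds meet."""
    cuts = []
    upper, lower = float("inf"), float("-inf")
    best_topology = None
    for _ in range(max_iter):
        topology, lower = solve_master(cuts)       # relaxed master: lower bound
        cost, cut = solve_slave(topology)          # OPF: cost and Benders cut
        if cost < upper:
            upper, best_topology = cost, topology
        if upper - lower <= tol:                   # bounds have converged
            break
        cuts.append(cut)                           # tighten the master
    return best_topology, upper
```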
Abstract:
Model updating methods often neglect the fact that all physical structures are damped. Such a simplification eases the structural modelling, although it compromises the accuracy of the predictions of the structural dynamic behaviour. In the present work, the authors address the problem of finite element (FE) model updating based on measured frequency response functions (FRFs), considering damping. The proposed procedure is based upon the complex experimental data, which contain information related to the damped FE model parameters, and has the advantage of requiring no prior knowledge about the structure or content of the damping matrix, demanding only the definition of the damping type. Numerical simulations are performed in order to establish the applicability of the proposed damped FE model updating technique, and its results are discussed in terms of the correlation between the simulated experimental complex FRFs and those obtained from the updated FE model.
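A hedged sketch of FRF-based updating on a toy damped single-degree-of-freedom model, assuming viscous damping and synthetic "measured" data. It only illustrates how the complex FRF residual drives the parameter update; it is not the authors' FE procedure.

```python
import numpy as np
from scipy.optimize import least_squares

omega = np.linspace(1.0, 200.0, 400)      # frequencies of the measured FRF (rad/s)
m = 1.0                                   # known mass

def frf(params, w):
    """Receptance FRF of a viscously damped single-DOF model (illustrative)."""
    k, c = params
    return 1.0 / (k - m * w**2 + 1j * c * w)

# Synthetic "experimental" complex FRF, standing in for measured data.
h_meas = frf([1.0e4, 5.0], omega)

def residual(params):
    """Stack real and imaginary parts so the complex FRF data drive the update."""
    diff = frf(params, omega) - h_meas
    return np.concatenate([diff.real, diff.imag])

updated = least_squares(residual, x0=[8.0e3, 1.0], bounds=([0.0, 0.0], [np.inf, np.inf]))
k_upd, c_upd = updated.x                  # recovers approximately 1e4 and 5.0
```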
Abstract:
An electrocardiogram (ECG) monitoring system deals with several challenges related to noise sources. The main goal of this work was the study of adaptive signal processing algorithms for ECG noise reduction applied to real signals. This document presents an adaptive filtering technique based on the Least Mean Square (LMS) algorithm to remove the artefacts introduced into the ECG signal by electromyographic (EMG) activity and power-line noise. Real noise signals were used in the experiments, mainly to observe the difference between real and simulated noise sources. Very good results were obtained, owing to the noise-removal capability of this technique.
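A standard LMS noise-cancellation sketch of the kind described, assuming a reference input correlated with the EMG/power-line noise; the tap count and step size are illustrative.

```python
import numpy as np

def lms_noise_canceller(primary, reference, n_taps=32, mu=0.01):
    """LMS adaptive noise cancellation: `primary` is the noisy ECG (signal plus
    noise) and `reference` is correlated with the noise only (e.g. a power-line
    or EMG reference).  The filter learns to predict the noise from the
    reference; the error signal returned is the cleaned ECG estimate."""
    w = np.zeros(n_taps)
    cleaned = np.zeros(len(primary))
    for i in range(n_taps, len(primary)):
        x = reference[i - n_taps:i][::-1]     # most recent reference samples
        noise_estimate = w @ x
        e = primary[i] - noise_estimate       # error = cleaned ECG sample
        w += 2.0 * mu * e * x                 # LMS weight update
        cleaned[i] = e
    return cleaned
```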
Abstract:
The advent of Wireless Sensor Network (WSN) technologies is paving the way for a panoply of new ubiquitous computing applications, some of them with critical requirements. In the ART-WiSe framework, we are designing a two-tiered communication architecture for supporting real-time and reliable communications in WSNs. Within this context, we have been developing a test-bed application for testing, validating and demonstrating our theoretical findings: a search&rescue/pursuit-evasion application. Basically, a WSN deployment is used to detect, localize and track a target robot, and a station controls a rescuer/pursuer robot until it gets close enough to the target robot. This paper describes how this application was engineered, particularly focusing on the implementation of the localization mechanism.
Abstract:
Constrained and unconstrained nonlinear optimization problems often appear in many engineering areas. In some of these cases derivative-based optimization methods cannot be used, because the objective function is unknown, too complex, or non-smooth; Direct Search Methods may then be the most suitable optimization methods. An Application Programming Interface (API) including some of these methods was implemented using Java technology. This API can be accessed either by applications running on the same computer where it is installed or remotely, through a LAN or the Internet, using web services. From the engineering point of view, the information needed from the API is the solution to the provided problem. From the point of view of researchers in optimization methods, however, the solution alone is not enough: additional information about the iterative process is also useful, such as the number of iterations, the value of the solution at each iteration, and the stopping criterion. This paper presents the features added to the API to give users access to the iterative-process data.
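As an illustration of the iterative-process data discussed above, the following simple derivative-free method (a compass/coordinate search, not necessarily one of the methods included in the API) records the iteration count, the incumbent value at each iteration and the stopping criterion that ended the run.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Direct search that polls the 2n coordinate directions, shrinking the
    step when no improvement is found, and records per-iteration data."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    history = [(0, fx)]
    stop_reason = "max_iter reached"
    for it in range(1, max_iter + 1):
        improved = False
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5                        # shrink the poll stencil
            if step < tol:
                stop_reason = "step size below tolerance"
                break
        history.append((it, fx))
    return {"solution": x, "value": fx, "iterations": history, "stop": stop_reason}

# Usage: result = compass_search(lambda v: (v[0] - 1.0)**2 + abs(v[1]), [5.0, -3.0])
```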
Abstract:
A clinical trial involving 80 patients of both sexes, aged 15 to 55, with chronic intestinal or hepatointestinal schistosomiasis mansoni, was carried out to evaluate the therapeutic efficacy of different dose regimens of praziquantel. The patients were randomly allocated into four groups with an equal number of cases and were treated with one of the following dosages: 60 mg/kg for 1 day; 60 mg/kg daily for 2 days; 60 mg/kg daily for 3 days; and 30 mg/kg daily for 6 days. The assessment of parasitological cure was based on the quantitative oogram technique through rectal mucosa biopsies, which were undertaken prior to, as well as 1, 2, 4 and 6 months post-treatment. Concurrently, stool examinations according to the qualitative Hoffman, Pons & Janer (HPJ) and the quantitative Kato-Katz (K-K) methods were also performed. The best tolerability was observed with 30 mg/kg daily for 6 days, whereas the highest incidence of side-effects (mainly dizziness and nausea) was found with 60 mg/kg daily for 3 days. No serious adverse drug reaction occurred. The cure rates achieved were: 25% with 60 mg/kg for 1 day; 60% with 60 mg/kg daily for 2 days; 89.5% with 60 mg/kg daily for 3 days; and 90% with 30 mg/kg daily for 6 days. At the same time there was a decrease of 64%, 73%, 87% and 84%, respectively, in the median number of viable S. mansoni ova per gram of tissue. Thus, a very clear direct correlation between dose and effect could be seen. The corresponding cure rates according to stool examinations were 39%, 80%, 100% and 95% by HPJ, and 89%, 100%, 100% and 100% by K-K. This discrepancy in results amongst the three parasitological methods is certainly due to their unequal accuracy. In fact, when the number of viable eggs per gram of tissue fell below 5,000, the difference in the percentage of false negative findings between HPJ (28%) and K-K (80%) became significant. When this number dropped to less than 2,000, the percentage of false negative results obtained with HPJ (49%) became significant in relation to the oogram as well. In conclusion, praziquantel has proven to be a highly efficacious agent against S. mansoni infections. If administered at a total dose of 180 mg/kg divided over either 3 or 6 days, it yields a 90% cure rate. Possibly, 100% could be reached by increasing the total dose to 240 mg/kg. Furthermore, it was confirmed that the quantitative oogram technique is the most reliable parasitological method for evaluating the efficacy of new drugs in schistosomiasis mansoni.