942 results for "Constrained optimization problems"
Abstract:
An optimised version of the Quick, Easy, Cheap, Effective, Rugged and Safe (QuEChERS) method for the simultaneous determination of 14 organochlorine pesticides in carrots was developed using gas chromatography with an electron-capture detector (GC-ECD) and confirmation by gas chromatography tandem mass spectrometry (GC-MS/MS). A citrate-buffered version of QuEChERS was applied for the extraction of the organochlorine pesticides, and for the extract clean-up, primary secondary amine, octadecyl-bonded silica (C18), magnesium sulphate (MgSO4) and graphitized carbon black were used as sorbents. The GC-ECD determination of the target compounds was achieved in less than 20 min. The limits of detection were below the EU maximum residue limits (MRLs) for carrots (10–50 μg kg−1), while only the limit of quantification for hexachlorobenzene (HCB) exceeded 10 μg kg−1. The introduction of a sonication step was shown to improve the recoveries. The overall average recoveries in carrots, at the four tested levels (60, 80, 100 and 140 μg kg−1), ranged from 66 to 111% with relative standard deviations in the range of 2–15% (n=3) for all analytes, with the exception of HCB. The method was applied to the analysis of 21 carrot samples from different Portuguese regions; β-HCH was the pesticide most frequently found, at concentrations ranging from below the limit of quantification to 14.6 μg kg−1. Only one sample had a pesticide residue (β-HCH) above the MRL, at 14.6 μg kg−1. This methodology combines the advantages of both QuEChERS and GC-ECD, producing a very rapid, sensitive and reliable procedure suitable for routine analytical laboratories.
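The recovery and precision figures quoted above (average recoveries of 66–111%, RSDs of 2–15% at n=3) come from standard spike/recovery arithmetic, which can be sketched as follows; the replicate values below are illustrative, not the study's data.

```python
from statistics import mean, stdev

def recovery_pct(measured, spiked):
    """Average recovery (%) of replicate measurements at one spike level."""
    return 100.0 * mean(measured) / spiked

def rsd_pct(measured):
    """Relative standard deviation (%) of the replicates."""
    return 100.0 * stdev(measured) / mean(measured)

# Three replicates (n=3) at a 100 ug/kg spike level, as in the study design.
replicates = [92.0, 98.0, 95.0]  # measured concentrations, ug/kg
avg_rec = recovery_pct(replicates, 100.0)  # average recovery, %
precision = rsd_pct(replicates)            # RSD, %
```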
Abstract:
The present work describes the optimization of a short-term assay, based on the inhibition of the esterase activity of the alga Pseudokirchneriella subcapitata, in a microplate format. Optimization of the staining procedure showed that incubating the algal cells with 20 μmol L−1 fluorescein diacetate (FDA) for 40 min allowed discrimination between metabolically active and inactive cells. The short-term assay was tested using Cu as the toxicant. For this purpose, algal cells in the exponential or stationary phase of growth were exposed to the heavy metal under growth conditions. After 3 or 6 h, cells were stained with FDA using the optimized procedure. For Cu, the 3- and 6-h EC50 values, based on the inhibition of the esterase activity of algal cells in the exponential phase of growth, were 209 and 130 μg L−1, respectively. P. subcapitata cells in the stationary phase of growth displayed higher effective concentration values than those observed in the exponential phase. The 3- and 6-h EC50 values for Cu, for cells in the stationary phase, were 443 and 268 μg L−1, respectively. This short-term microplate assay proved to be a rapid endpoint for testing toxicity with the alga P. subcapitata. The small volume required, the simplicity of the assay (no washing steps) and the automatic reading of the fluorescence make it particularly well suited for evaluating the toxicity of a large number of environmental samples.
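The EC50 values reported above summarize a concentration-response curve. As a minimal sketch (not the fitting procedure used in the study), an EC50 can be estimated by log-linear interpolation between the two tested concentrations that bracket 50% inhibition; the concentrations and responses below are hypothetical.

```python
import math

def ec50_interpolated(concs, inhibition):
    """EC50 by linear interpolation of % inhibition vs log10(concentration),
    between the two points bracketing 50% inhibition."""
    for (c1, i1), (c2, i2) in zip(zip(concs, inhibition),
                                  zip(concs[1:], inhibition[1:])):
        if i1 <= 50.0 <= i2:
            l1, l2 = math.log10(c1), math.log10(c2)
            frac = (50.0 - i1) / (i2 - i1)
            return 10 ** (l1 + frac * (l2 - l1))
    raise ValueError("50% inhibition not bracketed by the data")

# Illustrative Cu concentrations (ug/L) and esterase-inhibition responses (%).
concs = [50, 100, 200, 400, 800]
inhib = [10, 30, 48, 65, 85]
ec50 = ec50_interpolated(concs, inhib)  # falls between 200 and 400 ug/L
```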
Abstract:
Competitive electricity markets have arisen as a result of power-sector restructuring and power-system deregulation. The players participating in competitive electricity markets must define strategies and make decisions using all the available information and business opportunities.
Abstract:
In this study, efforts were made to put forward an integrated recycling approach for the thermoset-based glass fibre reinforced polymer (GFRP) rejects derived from the pultrusion manufacturing industry. Both the recycling process and the development of a new cost-effective end-use application for the recyclates were considered. For this purpose, i) among the several recycling techniques available for thermoset-based composite materials, the most suitable one for the envisaged application was selected (mechanical recycling); and ii) experimental work was carried out to assess the added value of the obtained recyclates as aggregate and reinforcement replacements in concrete-polymer composite materials. The potential recycling solution was assessed through the mechanical behaviour of the resultant GFRP-waste-modified concrete-polymer composites relative to the unmodified materials. In the mix-design process of the new GFRP-waste-based composite material, the recyclate content and size grade, and the effect of incorporating an adhesion promoter, were considered as material factors and systematically tested within reasonable ranges. The optimization of the modified formulations was supported by the Fuzzy Boolean Nets methodology, which identified the balance of material parameters that maximizes both the flexural and compressive strengths of the final composite. Compared to related end-use applications of GFRP wastes in cementitious concrete materials, the proposed solution overcomes some of the problems found, namely the possible incompatibilities arising from the alkali-silica reaction and the decrease in mechanical properties due to the high water-cement ratio required to achieve the desired workability. The results obtained were very promising towards a globally cost-effective waste-management solution for GFRP industrial wastes and end-of-life products, leading to a more sustainable composite-materials industry.
Abstract:
The main goal of this paper is to analyze the behavior of nonmonotone hybrid tabu search approaches when solving systems of nonlinear inequalities and equalities through the global optimization of an appropriate merit function. The algorithm combines global and local searches and uses a nonmonotone reduction of the merit function to choose the local search. Relaxing this condition aims to call the local search more often and reduces the overall computational effort. Two variants of a perturbed pattern search method are implemented as the local search. An experimental study involving a variety of problems available in the literature is presented.
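The merit-function formulation described above can be sketched as follows: equalities h(x) = 0 and inequalities g(x) <= 0 are folded into a single nonnegative function whose global minimizers solve the system, and a nonmonotone rule accepts trial points that improve on the worst of the last few merit values rather than the most recent one. The function and parameter names are illustrative, not the paper's.

```python
def merit(x, eqs, ineqs):
    """Merit function: sum of squared equality residuals plus squared
    inequality violations; its global minima (value 0) solve the system."""
    return (sum(h(x) ** 2 for h in eqs)
            + sum(max(0.0, g(x)) ** 2 for g in ineqs))

def nonmonotone_accept(trial, history, memory=5):
    """Nonmonotone acceptance: compare the trial merit value against the
    worst of the last `memory` values, not just the most recent one."""
    return trial < max(history[-memory:])

# Toy system: x0 + x1 - 3 = 0  together with  x0 - 1 <= 0
eqs = [lambda x: x[0] + x[1] - 3.0]
ineqs = [lambda x: x[0] - 1.0]
m_at_solution = merit([1.0, 2.0], eqs, ineqs)
# 3.0 is worse than the latest value (2.6) but beats the worst (4.0):
accepted = nonmonotone_accept(3.0, [2.5, 4.0, 2.6])
```

The relaxed acceptance rule is what lets the local search run more often, as the abstract notes, since trial points need not improve monotonically.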
Abstract:
Adsorption-based separation operations have gained importance in recent years, especially with the development of techniques for simulating moving beds in columns, such as Simulated Moving Bed (SMB) chromatography. This technology was developed in the early 1960s as an alternative to the True Moving Bed (TMB) process, in order to solve several of the problems associated with the movement of the solid phase that are common in these countercurrent chromatographic separation methods. SMB technology has been widely used at industrial scale, mainly in the petrochemical and sugar-processing industries and, more recently, in the pharmaceutical and fine-chemicals industries. In recent decades, the growing interest in SMB technology, a result of its high yield and efficient solvent consumption, has led to the formulation of different, so-called non-conventional, operating modes that produce more flexible units, capable of increasing separation performance and further widening the technology's range of application. One of the most studied and implemented examples is the Varicol process, in which the ports are moved asynchronously. In this context, the present work focuses on the simulation, analysis and evaluation of SMB technology for two distinct separation cases: the separation of a fructose-glucose mixture and the separation of a racemic mixture of pindolol. For both cases, two operating modes of the SMB unit were considered and compared: the conventional mode and the Varicol mode. Both separation cases were thus implemented and simulated in the Aspen Chromatography process simulator, using two distinct SMB units (a conventional SMB and a Varicol SMB).
For the separation of the fructose-glucose mixture, two approaches were used to model the conventional SMB unit: that of a true moving bed (TMB model) and that of a real simulated moving bed (SMB model). For the separation of the racemic mixture of pindolol, only the SMB model was considered. In the case of the fructose-glucose separation, both the conventional and Varicol SMB units were further optimized with the aim of increasing their productivities. The optimization was carried out by applying a design-of-experiments procedure, in which the experiments were planned, conducted and subsequently examined through analysis of variance (ANOVA). The statistical analysis allowed the levels of the control factors to be selected so as to obtain better results for both SMB units.
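The ANOVA step of the design-of-experiments procedure boils down to comparing between-level and within-level variability. A minimal one-way version, with hypothetical productivity responses at three levels of a single control factor, might look like this:

```python
from statistics import mean

def one_way_anova_F(groups):
    """F statistic of a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    k = len(groups)                      # number of factor levels
    n = sum(len(g) for g in groups)      # total observations
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Productivity-like responses at three levels of one control factor
# (illustrative numbers, not the study's data).
F = one_way_anova_F([[10.0, 11.0, 10.5],
                     [12.0, 12.5, 13.0],
                     [15.0, 14.5, 15.5]])
```

A large F relative to the critical value indicates that the factor level significantly affects the response, which is how the significant control factors and their best levels were selected.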
Abstract:
A recent and comprehensive review of the use of race and ethnicity in research addressing health disparities in epidemiology and public health is provided. First, the theoretical basis on which race and ethnicity differ is described, drawing on previous work in anthropology, social science and public health. Second, a review is presented of 280 articles published from 2009 to 2011 in high-impact-factor journals in public health and epidemiology. An analytical grid enabled the examination of conceptual, theoretical and methodological questions related to the use of both concepts. The majority of the articles reviewed were grounded in a theoretical framework and provided interpretations from various models. However, key problems identified include a) a failure by researchers to differentiate between the concepts of race and ethnicity; b) an inappropriate use of racial categories to ascribe ethnicity; c) a lack of transparency in the methods used to assess both concepts; and d) a failure to address the limits associated with the construction of racial or ethnic taxonomies and their use. In conclusion, future studies examining health disparities should clearly establish the distinction between race and ethnicity, develop theoretically driven research and address specific questions about the relationships between race, ethnicity and health. It is argued that one way to think about ethnicity, race and health is to dichotomize research into two sets of questions about the relationship between human diversity and health.
Abstract:
Securing group communication in wireless sensor networks has recently been extensively investigated. Many works have addressed this issue, considering the grouping concept in different ways. In this paper, we consider a group as a set of nodes sensing the same data type, and we propose an efficient secure group communication scheme guaranteeing secure group management and secure group key distribution. The proposed scheme (RiSeG) is based on a logical ring architecture, which alleviates the group controller's task in updating the group key. The scheme also provides backward and forward secrecy, addresses the node compromise attack, and gives a solution to detect and eliminate compromised nodes. The security analysis and performance evaluation show that the proposed scheme is secure, highly efficient and lightweight. A comparison with the logical key hierarchy is performed to prove the efficiency of RiSeG's rekeying process. Finally, we present the implementation details of RiSeG on top of TelosB sensor nodes to demonstrate its feasibility.
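As a toy illustration of the logical-ring idea (not the actual RiSeG protocol), the group controller can push a new group key around the ring, each hop protected by the pairwise key shared with that node; XOR stands in for a real cipher here, and all names are hypothetical.

```python
import secrets

def xor_bytes(a, b):
    # Toy stand-in for symmetric encryption/decryption (XOR with a shared key).
    return bytes(x ^ y for x, y in zip(a, b))

def ring_rekey(pairwise_keys, new_group_key):
    """Send a new group key around a logical ring: the message to each hop
    is encrypted with that hop's pairwise key, and the node decrypts it with
    the same key. Returns the key each node ends up holding."""
    held = []
    for k in pairwise_keys:                 # controller -> node0 -> node1 -> ...
        ct = xor_bytes(new_group_key, k)    # encrypt for this hop
        held.append(xor_bytes(ct, k))       # node decrypts with its pairwise key
    return held

group_key = secrets.token_bytes(16)
pairwise = [secrets.token_bytes(16) for _ in range(5)]
received = ring_rekey(pairwise, group_key)
```

The point of the ring topology, as the abstract notes, is that the controller need not unicast to every member itself; rekeying cost is spread along the ring.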
Abstract:
This paper proposes a stochastic mixed-integer linear approach to deal with a short-term unit commitment problem under uncertainty in a deregulated electricity market that includes day-ahead bidding and bilateral contracts. The proposed approach considers the typical operating constraints on the thermal units and a spinning reserve. The uncertainty is due to the electricity prices, which are modeled by a scenario set, keeping the computation tractable. Moreover, emission allowances are considered so that environmental constraints can be taken into account. A case study is presented to illustrate the usefulness of the proposed approach, and an assessment of the cost of the spinning reserve is obtained by comparing the situations with and without spinning reserve.
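The scenario-set treatment of price uncertainty amounts to weighting each outcome by its probability. A minimal sketch, with hypothetical prices and a fixed two-period schedule (omitting the commitment variables and emission constraints of the full model), is:

```python
def expected_profit(scenarios, schedule_output, fuel_cost_per_mwh):
    """Expected profit of a fixed generation schedule over a set of
    price scenarios, each a (probability, price-per-MWh) pair."""
    energy = sum(schedule_output)  # total MWh over the horizon
    return sum(p * (price - fuel_cost_per_mwh) * energy
               for p, price in scenarios)

# Three equiprobable day-ahead price scenarios (EUR/MWh), hypothetical.
scenarios = [(1 / 3, 40.0), (1 / 3, 55.0), (1 / 3, 70.0)]
profit = expected_profit(scenarios, [100.0, 120.0], fuel_cost_per_mwh=30.0)
```

In the full mixed-integer model the schedule itself is a decision variable, and the solver maximizes this expectation subject to the unit and reserve constraints.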
Abstract:
Consider a single processor and a software system. The software system comprises components and interfaces, where each component has an associated interface and each component comprises a set of constrained-deadline sporadic tasks. A scheduling algorithm (called the global scheduler) determines at each instant which component is active. The active component uses another scheduling algorithm (called the local scheduler) to determine which task is selected for execution on the processor. The interface of a component makes certain information about the component visible to other components; the interfaces of all components are used for schedulability analysis. We address the problem of generating an interface for a component based on the tasks inside it. We desire to (i) incur only a small loss in schedulability analysis due to the interface and (ii) ensure that the amount of space (counted in bits) of the interface is small, so that the interface hides as many details of the component as possible. We present an algorithm for generating such an interface.
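The information such an interface must summarize is, for constrained-deadline sporadic tasks, captured by the demand bound function, which a compact interface (e.g. a budget/period pair) then upper-bounds; the paper's own generation algorithm is not reproduced here. A sketch of the demand bound function, with hypothetical task parameters (C, D, T):

```python
import math

def dbf(tasks, t):
    """Demand bound function: worst-case execution demand of
    constrained-deadline sporadic tasks (C, D, T) that must complete
    within any interval of length t."""
    demand = 0.0
    for C, D, T in tasks:
        if t >= D:
            demand += (math.floor((t - D) / T) + 1) * C
    return demand

tasks = [(1.0, 4.0, 5.0), (2.0, 7.0, 10.0)]  # (C, D, T), hypothetical
d10 = dbf(tasks, 10.0)  # demand in any window of length 10
```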
Abstract:
We present a 12*(1+|R|/(4m))-speed algorithm for scheduling constrained-deadline sporadic real-time tasks on a multiprocessor comprising m processors, where a task may request one of |R| sequentially reusable shared resources.
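The speedup bound in the abstract is a simple closed form; for example, with m = 4 processors and |R| = 8 shared resources it evaluates to 12*(1 + 8/16) = 18.

```python
from fractions import Fraction

def speed_factor(num_resources, m):
    """Speed bound 12*(1 + |R|/(4m)) from the abstract: processors this
    much faster suffice for the algorithm to meet all deadlines."""
    return 12 * (1 + Fraction(num_resources, 4 * m))

s = speed_factor(8, 4)  # |R| = 8 shared resources, m = 4 processors
```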
Abstract:
Cluster scheduling and collision avoidance are crucial issues in large-scale cluster-tree Wireless Sensor Networks (WSNs). This paper presents a methodology that provides a Time Division Cluster Scheduling (TDCS) mechanism for a cluster-tree WSN, based on the cyclic extension of the RCPS/TC (Resource Constrained Project Scheduling with Temporal Constraints) problem and assuming bounded communication errors. The objective is to meet all end-to-end deadlines of a predefined set of time-bounded data flows while minimizing the energy consumption of the nodes by setting the TDCS period as long as possible. Since each cluster is active only once during the period, the end-to-end delay of a given flow may span several periods when there are flows in the opposite direction. The scheduling tool enables system designers to efficiently configure all required parameters of IEEE 802.15.4/ZigBee beacon-enabled cluster-tree WSNs at network design time. The performance evaluation of the scheduling tool shows that problems with dozens of nodes can be solved using optimal solvers.
Abstract:
Compositional real-time scheduling clearly requires that "normal" real-time scheduling challenges are addressed, but challenges intrinsic to compositionality must be addressed as well, in particular: (i) how should interfaces be described? and (ii) how should numerical values be assigned to the parameters constituting the interfaces? The real-time systems community has traditionally used narrow interfaces for describing a component (for example, a utilization/bandwidth-like metric and the distribution of this bandwidth in time). In this paper, we introduce the concept of the competitive ratio of an interface and show that typical narrow interfaces cause poor performance for scheduling constrained-deadline sporadic tasks (the competitive ratio is infinite). Therefore, we explore more expressive interfaces, in particular a class called medium-wide interfaces. For this class, we propose an interface type and show how the parameters of the interface should be selected. We also prove that this interface is 8-competitive.
Abstract:
In distributed soft real-time systems, maximizing the aggregate quality of service (QoS) is a typical system-wide goal, and addressing the problem through distributed optimization is challenging. Subtasks are subject to unpredictable failures in many practical environments, which makes the problem much harder. In this paper, we present a robust optimization framework for maximizing the aggregate QoS in the presence of random failures. We introduce the notion of K-failure to bound the effect of random failures on schedulability. Using this notion, we define the concept of K-robustness, which quantifies the degree of robustness of the QoS guarantee in a probabilistic sense. The parameter K helps to trade off achievable QoS against robustness. The proposed framework produces optimal solutions through distributed computations on the basis of Lagrangian duality, and we present some implementation techniques. Our simulation results show that the proposed framework can probabilistically guarantee sub-optimal QoS that remains feasible even in the presence of random failures.
Abstract:
The foot and the ankle are small structures commonly affected by disorders, and their complex anatomy represents a significant diagnostic challenge. SPECT/CT image fusion can provide the anatomical and bone-structure information missing from functional imaging, which is particularly useful for increasing the diagnostic certainty of bone pathology. However, due to the duration of SPECT acquisition, a patient's involuntary movements may lead to misalignment between the SPECT and CT images. Patient motion can be reduced using a dedicated patient support. We aimed to design an ankle- and foot-immobilizing device and to measure its efficacy in improving image fusion. Methods: We enrolled 20 patients undergoing distal lower-limb SPECT/CT of the ankle and the foot with and without a foot holder. The misalignment between SPECT and CT images was computed by manually measuring 14 fiducial markers chosen among anatomical landmarks also visible on bone scintigraphy. Analysis of variance was performed for statistical analysis. Results: The absolute average difference obtained without and with the support was 5.1±5.2 mm (mean±SD) and 3.1±2.7 mm, respectively, a significant difference (p<0.001). Conclusion: The introduction of the foot holder significantly decreases misalignment between SPECT and CT images, which may be clinically relevant for the precise localization of foot and ankle pathology.
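The reported 5.1±5.2 mm and 3.1±2.7 mm figures are the mean±SD of the absolute marker offsets; the computation, with illustrative measurements (not the study's data), is simply:

```python
from statistics import mean, stdev

def summarize(errors_mm):
    """Mean and sample SD of absolute fiducial misalignments (mm)."""
    return mean(errors_mm), stdev(errors_mm)

# Illustrative absolute SPECT-to-CT marker offsets (mm).
without_holder = [2.0, 9.5, 4.0, 1.5, 8.0]
with_holder = [1.5, 4.0, 3.0, 2.5, 4.5]
m_wo, sd_wo = summarize(without_holder)
m_w, sd_w = summarize(with_holder)
```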