996 results for standard batch algorithms
Abstract:
We investigate the sensitivity of a Markov model whose states and transition probabilities are obtained from clustering a molecular dynamics trajectory. We have examined a 500 ns molecular dynamics trajectory of the peptide valine-proline-alanine-leucine in explicit water. The sensitivity is quantified by varying the boundaries of the clusters and investigating the resulting variation in transition probabilities and the average transition time between states. In this way, we mimic the effect of using different clustering algorithms. It is found that, in terms of the investigated quantities, the peptide dynamics described by the Markov model is sensitive to the clustering; in particular, the average transition times are found to vary by up to 46%. Moreover, inclusion of nonphysical, sparsely populated clusters can lead to serious errors of up to 814%. In the investigation, the time step used in the transition matrix is determined by the minimum time scale on which the system behaves approximately as a Markov process; this time step is found to be about 100 ps. It is concluded that the description of peptide dynamics with transition matrices should be performed with care, and that using standard clustering algorithms to obtain states and transition probabilities may not always produce reliable results.
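As a rough illustration of the kind of analysis described above, the sketch below estimates a row-stochastic transition matrix from an already-discretized (clustered) trajectory at a chosen lag time and reads off an average residence time. It is a minimal Python sketch, not the authors' code: the toy two-state trajectory and the lag of 10 frames (standing in for the ~100 ps time step) are invented for illustration.

```python
import numpy as np

def transition_matrix(labels, lag):
    """Row-stochastic transition matrix estimated from a discrete trajectory.

    labels : 1-D array of cluster indices (one per saved frame)
    lag    : lag time in frames (e.g. 100 ps divided by the saving interval)
    """
    n = labels.max() + 1
    counts = np.zeros((n, n))
    for i, j in zip(labels[:-lag], labels[lag:]):
        counts[i, j] += 1
    # normalise each row; rows with no observations stay at zero
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# toy example: a two-state trajectory that flips state in ~2% of frames
rng = np.random.default_rng(0)
traj = np.cumsum(rng.random(5000) < 0.02) % 2
T = transition_matrix(traj, lag=10)
print(T)
# rough mean residence time in state 0 (in units of the lag): 1 / (1 - T[0, 0])
print(1.0 / (1.0 - T[0, 0]))
```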
Abstract:
Continuous-flow generation of α-diazosulfoxides results in a two- to three-fold increase in yield and decreased reaction times compared to standard batch synthesis methods. These high-yielding reactions are enabled by flowing through a bed of polystyrene-supported base (PS-DBU or PS-NMe2) with highly controlled residence times. This engineered solution allows the α-diazosulfoxides to be synthesized rapidly while limiting exposure of the products to basic reaction conditions, which have been found to cause rapid decomposition. In addition to improved yields, this approach has the added advantages of easier processing, an improved safety profile, and scale-up potential.
Abstract:
The central interest of this thesis is to understand how public action drives the formation and transformation of tourist destinations. The research is based on the premise that public actions result from a process of mediation among state and non-state actors regarded as important in a sector, who interact in order to make their interests and world views prevail over those of the others. The case of Porto de Galinhas beach, in Pernambuco, the locus of this investigation, allowed the analysis of a multiplicity of actors in the formulation and implementation of local actions for the development of tourism between 1970 and 2010, and made it possible to understand how the referential underlying those interventions was constructed. This qualitative thesis takes as its theoretical support the cognitive approach to public policy analysis developed in France, whose main exponents are Bruno Jobert and Pierre Muller. This choice was motivated by the approach's emphasis on the cognitive and normative dimensions of policy, aspects still little explored in Brazilian public policy studies. Documental, bibliographic and field research were used as data sources for the (re)constitution of the formation and transformation of the site, and content analysis and documental analysis were applied as analysis techniques. To trace the referential of public action, the study began by characterizing the boundaries of the tourism sector and the images created by its main international body, the World Tourism Organization, whose meeting minutes revealed guidelines for member countries, including Brazil, which make up the global-sectorial referential of the sector. The analysis of the evolution of tourism in the country showed that Brazilian public policies underwent organizational transformations over the years, indicating changes in the referential that guided the interventions. These guidelines and transformations were identified in the construction of the tourist destination of Porto de Galinhas, whose data were systematized and presented in four historical periods, in which the values, norms, algorithms, images and the important mediators were discussed. It was revealed that the State played different roles in local tourism over the decades analyzed. From the 1990s onwards, however, new actors, especially local hoteliers, entered the formulation and implementation of the policies developed. Through their association, the hoteliers established a position of leadership in the local tourism sector, which allowed them to assert their hegemony and advance their own interests. The leadership acquired by one group of actors in the case of Porto de Galinhas does not mean that disputes within the industry were neutralized, but rather that there is a cognitive framework within which the actors involved confront one another. Despite the advances achieved through the work of the mediators in recent decades, which resulted in the expansion and diversification of tourism in the area and in the consolidation of the beach as a tourist destination of national prominence, the destination's competitive position remains unstable, given a situation of social and environmental unsustainability.
Abstract:
The application of multivariate calibration techniques to multicomponent analysis by UV-VIS molecular absorption spectrometry is a powerful tool for the simultaneous determination of several chemical species. However, when this methodology is carried out manually, it is slow and laborious, consumes large amounts of reagents and samples, is susceptible to contamination and has a high operational cost. To overcome these drawbacks, a flow-batch analyser is proposed in this work. This analyser was developed for the automatic preparation of standard calibration and test (or validation) mixtures. It was applied to the simultaneous determination of Cu2+, Mn2+ and Zn2+ in polyvitaminic and polymineral pharmaceutical formulations, using 4-(2-pyridylazo)resorcinol as reagent and a UV-VIS spectrophotometer with a photodiode array detector. The results obtained with the proposed system are in good agreement with those obtained by flame atomic absorption spectrometry, which was employed as the reference method. With the proposed analyser, the preparation of calibration and test mixtures can be accomplished in about four hours, whereas the manual procedure requires at least two days. Moreover, it consumes smaller amounts of reagents and samples than the manual procedure. After the preparation of the calibration and test mixtures, a throughput of 60 samples h-1 can be achieved with the proposed flow-batch analyser.
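For illustration, the sketch below applies one common multivariate calibration technique (partial least squares regression) to synthetic, strongly overlapping spectra of three components. It is a minimal sketch under invented assumptions: the Gaussian "pure-component" bands, the mixture design and the noise level are placeholders, not data from the flow-batch analyser.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavelengths = np.linspace(400, 700, 151)          # nm

def band(center, width):
    """Gaussian absorption band standing in for a pure-component spectrum."""
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# hypothetical, strongly overlapping spectra for three metal complexes
pure = np.vstack([band(510, 30), band(530, 35), band(495, 25)])

# calibration mixtures: known concentrations, Beer-Lambert mixing plus noise
C_cal = rng.uniform(0.1, 1.0, size=(25, 3))
A_cal = C_cal @ pure + rng.normal(0, 0.002, size=(25, wavelengths.size))

model = PLSRegression(n_components=3).fit(A_cal, C_cal)

# a "test" mixture of known composition, to check the prediction
c_true = np.array([[0.4, 0.7, 0.2]])
a_test = c_true @ pure + rng.normal(0, 0.002, size=(1, wavelengths.size))
print(model.predict(a_test))   # should be close to c_true
```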
Comparison of Temporal and Standard Independent Component Analysis (ICA) Algorithms for EEG Analysis
Abstract:
The standard reference clinical score quantifying average Parkinson's disease (PD) symptom severity is the Unified Parkinson's Disease Rating Scale (UPDRS). At present, UPDRS is determined by the subjective clinical evaluation of the patient's ability to adequately cope with a range of tasks. In this study, we extend recent findings that UPDRS can be objectively assessed to clinically useful accuracy using simple, self-administered speech tests, without requiring the patient's physical presence in the clinic. We apply a wide range of known speech signal processing algorithms to a large database (approx. 6000 recordings from 42 PD patients, recruited to a six-month, multi-centre trial) and propose a number of novel, nonlinear signal processing algorithms which reveal pathological characteristics in PD more accurately than existing approaches. Robust feature selection algorithms select the optimal subset of these algorithms, which is fed into non-parametric regression and classification algorithms, mapping the signal processing algorithm outputs to UPDRS. We demonstrate rapid, accurate replication of the UPDRS assessment with clinically useful accuracy (about 2 UPDRS points difference from the clinicians' estimates, p < 0.001). This study supports the viability of frequent, remote, cost-effective, objective, accurate UPDRS telemonitoring based on self-administered speech tests. This technology could facilitate large-scale clinical trials into novel PD treatments.
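A minimal sketch of the general pipeline described above (speech features, feature selection, non-parametric regression, UPDRS estimate) is given below. The synthetic feature matrix, the particular selector (univariate F-test) and regressor (random forest) are illustrative choices only, not the dysphonia measures or the robust selection algorithms used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)

# synthetic stand-in for speech features: 600 recordings x 40 features,
# where only a few features actually carry UPDRS-related signal
X = rng.normal(size=(600, 40))
updrs = 30 + 5 * X[:, 0] - 3 * X[:, 3] + 2 * X[:, 7] + rng.normal(0, 2, 600)

# feature selection feeding a non-parametric regressor, as in the pipeline
# described above (the specific selector/regressor choices are ours)
pipeline = make_pipeline(
    SelectKBest(f_regression, k=10),
    RandomForestRegressor(n_estimators=200, random_state=0),
)

mae = -cross_val_score(pipeline, X, updrs, cv=5,
                       scoring="neg_mean_absolute_error")
print(f"cross-validated MAE: {mae.mean():.2f} UPDRS points")
```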
Abstract:
Fermentation processes, as objects of modelling and high-quality control, are characterized by interdependent, time-varying process variables, which lead to non-linear models with a very complex structure. For this reason, conventional optimization methods cannot provide a satisfactory solution. As an alternative, genetic algorithms, as stochastic global optimization methods, can be applied to overcome these limitations. Genetic algorithms offer robustness and the ability to reach a global minimum, which makes them suitable and practical for parameter identification of fermentation models. Different types of genetic algorithms, namely simple, modified and multi-population ones, have been applied and compared for the estimation of the parameters of a nonlinear dynamic model of fed-batch cultivation of S. cerevisiae.
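As a toy illustration of GA-based parameter identification, the sketch below fits the two parameters of a simple Monod batch-growth model to noisy synthetic data with a plain real-coded genetic algorithm. The model, the GA operators and all numerical settings are illustrative assumptions; they are not the fed-batch S. cerevisiae model or the simple/modified/multi-population variants compared in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(mu_max, Ks, t_end=10.0, dt=0.05, X0=0.1, S0=10.0, Yxs=0.5):
    """Euler integration of a toy Monod batch-growth model (a stand-in for
    the full fed-batch cultivation model)."""
    X, S, xs = X0, S0, []
    for _ in np.arange(0.0, t_end, dt):
        mu = mu_max * S / (Ks + S)
        X, S = X + mu * X * dt, max(S - mu * X / Yxs * dt, 0.0)
        xs.append(X)
    return np.array(xs)

# "measured" biomass generated with known parameters plus noise
true = (0.4, 0.5)
data = simulate(*true) + rng.normal(0, 0.05, 200)

def fitness(p):
    return -np.mean((simulate(*p) - data) ** 2)   # higher is better

# a plain real-coded GA: tournament selection, blend crossover, Gaussian mutation
pop = rng.uniform([0.05, 0.05], [1.0, 5.0], size=(40, 2))
for _ in range(60):
    scores = np.array([fitness(p) for p in pop])
    new = [pop[scores.argmax()]]                      # elitism
    while len(new) < len(pop):
        a, b = pop[rng.choice(len(pop), 2)], pop[rng.choice(len(pop), 2)]
        pa = a[np.argmax([fitness(p) for p in a])]    # tournament of two
        pb = b[np.argmax([fitness(p) for p in b])]
        w = rng.random()
        child = w * pa + (1 - w) * pb                 # blend crossover
        child += rng.normal(0, 0.02, 2)               # Gaussian mutation
        new.append(np.clip(child, 0.01, 5.0))
    pop = np.array(new)

best = pop[np.argmax([fitness(p) for p in pop])]
print("estimated (mu_max, Ks):", best, "true:", true)
```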
Abstract:
This research is motivated by a practical application observed at a printed circuit board (PCB) manufacturing facility. After assembly, the PCBs (or jobs) are tested in environmental stress screening (ESS) chambers (or batch processing machines) to detect early failures. Several PCBs can be tested simultaneously as long as the total size of all PCBs in the batch does not exceed the chamber capacity. PCBs from different production lines arrive dynamically to a queue in front of a set of identical ESS chambers, where they are grouped into batches for testing. Each line delivers PCBs that vary in size and require different testing (or processing) times. Once a batch is formed, its processing time is the longest processing time among the PCBs in the batch, and its ready time is given by the PCB arriving last to the batch. ESS chambers are expensive and constitute a bottleneck; consequently, the makespan has to be minimized. A mixed-integer formulation is proposed for the problem under study and compared to a recently published formulation. The proposed formulation is better in terms of the number of decision variables, linear constraints and run time. A procedure to compute a lower bound is proposed. For sparse problems (i.e. when job ready times are widely dispersed), the lower bounds are close to the optimum. The problem under study is NP-hard. Consequently, five heuristics, two metaheuristics (simulated annealing (SA) and a greedy randomized adaptive search procedure (GRASP)), and a decomposition approach (column generation) are proposed, especially to solve problem instances that require prohibitively long run times when a commercial solver is used. An extensive experimental study was conducted to evaluate the different solution approaches in terms of solution quality and run time. The decomposition approach improved the lower bounds (or linear relaxation solution) of the mixed-integer formulation. At least one of the proposed heuristics outperforms the Modified Delay heuristic from the literature. For sparse problems, almost all the heuristics report solutions close to the optimum. GRASP outperforms SA at a higher computational cost. The proposed approaches are viable to implement, as the run time is very short.
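To make the batching problem concrete, the sketch below runs a generic greedy heuristic on a single chamber: jobs are released over time, each batch is filled with the longest-processing compatible jobs that fit the capacity, and the batch processing time is that of the longest job in it. This is an illustrative heuristic only, not one of the five heuristics, the metaheuristics or the column-generation approach proposed in the dissertation, and it assumes a single chamber and that every job fits the chamber on its own.

```python
from dataclasses import dataclass

@dataclass
class Job:
    ready: float   # arrival time
    size: float    # chamber capacity it consumes
    proc: float    # required testing time

def greedy_batches(jobs, capacity):
    """Greedy batching on a single chamber: wait for the earliest pending job,
    then fill the batch with the longest-processing available jobs that fit."""
    pending = sorted(jobs, key=lambda j: j.ready)
    t = 0.0
    while pending:
        t = max(t, pending[0].ready)                       # wait for first job
        avail = [j for j in pending if j.ready <= t]
        avail.sort(key=lambda j: j.proc, reverse=True)     # longest first
        batch, used = [], 0.0
        for j in avail:
            if used + j.size <= capacity:
                batch.append(j)
                used += j.size
        t += max(j.proc for j in batch)                    # batch processing time
        for j in batch:
            pending.remove(j)
    return t                                               # makespan

jobs = [Job(0, 3, 5), Job(1, 4, 2), Job(2, 2, 7), Job(6, 5, 3), Job(6, 3, 4)]
print("makespan:", greedy_batches(jobs, capacity=8))
```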
Biased Random-Key Genetic Algorithms for the Winner Determination Problem in Combinatorial Auctions
Abstract:
In this paper, we address the problem of picking a subset of bids in a general combinatorial auction so as to maximize the overall profit using the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that uses solutions of intermediate linear programming relaxations of an exact mixed-integer linear programming model as initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed-integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, mainly for large-scale auctions.
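The core of a biased random-key genetic algorithm is a decoder that turns a vector of random keys into a feasible solution. The sketch below shows one plausible decoder for the winner determination problem (accept bids in key order whenever their items are still free) inside a bare-bones BRKGA loop on a toy auction. The auction instance, population sizes and crossover bias are invented, and the LP-relaxation warm-start initialization described above is not shown.

```python
import numpy as np

rng = np.random.default_rng(4)

# a tiny combinatorial auction: each bid = (set of items, price offered)
bids = [({0, 1}, 6.0), ({1, 2}, 7.0), ({2, 3}, 5.0), ({0, 3}, 6.5), ({4}, 2.0)]

def decode(keys):
    """Random-key decoder: consider bids in increasing key order and accept a
    bid whenever its items do not clash with bids already accepted."""
    taken, profit = set(), 0.0
    for b in np.argsort(keys):
        items, price = bids[b]
        if not items & taken:
            taken |= items
            profit += price
    return profit

# a bare-bones BRKGA loop: elite kept, mutants injected, biased crossover
pop = rng.random((30, len(bids)))
for _ in range(100):
    fit = np.array([decode(k) for k in pop])
    order = np.argsort(-fit)
    elite, rest = pop[order[:6]], pop[order[6:]]
    mutants = rng.random((4, len(bids)))
    children = []
    for _ in range(len(pop) - len(elite) - len(mutants)):
        e = elite[rng.integers(len(elite))]
        o = rest[rng.integers(len(rest))]
        mask = rng.random(len(bids)) < 0.7        # bias toward the elite parent
        children.append(np.where(mask, e, o))
    pop = np.vstack([elite, mutants, np.array(children)])

print("best profit found:", max(decode(k) for k in pop))
```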
Abstract:
BP Refinery (Bulwer Island) Ltd (BP), located on the eastern Australian coast, is currently undergoing a major expansion as part of the Queensland Clean Fuels Project. The associated wastewater treatment plant upgrade will provide a better quality of treated effluent than is currently possible with the existing infrastructure, and one of a sufficiently high standard to meet not only the requirements of the imposed environmental legislation but also BP's environmental objectives. A number of challenges were faced when considering the upgrade, particularly cost constraints and limited plot space, highly variable wastewater, toxicity issues, and limited available hydraulic head. Sequencing Batch Reactor (SBR) technology was chosen for the lagoon upgrade for the following reasons: SBR technology allowed a retrofit of the existing earthen lagoon without the need for any substantial additional concrete structures; a dual lagoon system allowed partial treatment of wastewaters during construction; SBRs give substantial process flexibility; SBRs make it possible to modify process parameters easily without any physical modifications; and the approach offers significant cost benefits. This paper presents the background to this application and an outline of laboratory studies carried out on the wastewater, and details the full-scale design issues and methods for providing a cost-effective, efficient treatment system using the existing lagoon system.
The use of non-standard CT conversion ramps for Monte Carlo verification of 6 MV prostate IMRT plans
Abstract:
Monte Carlo (MC) dose calculation algorithms have been widely used to verify the accuracy of intensity-modulated radiotherapy (IMRT) dose distributions computed by conventional algorithms, owing to their ability to precisely account for the effects of tissue inhomogeneities and multileaf collimator characteristics. The two approaches differ, however, in how dose is calculated and reported: whereas dose from conventional methods is traditionally computed and reported as the water-equivalent dose (Dw), MC dose algorithms calculate and report dose to medium (Dm). In order to compare the two methods consistently, the conversion of MC Dm into Dw is therefore necessary. This study aims to assess the effect of applying the conversion of MC-based Dm distributions to Dw for prostate IMRT plans generated for 6 MV photon beams. MC phantoms were created from the patient CT images using three different ramps to convert CT numbers into material and mass density: a conventional four-material ramp (CTCREATE) and two simplified CT conversion ramps: (1) air and water with variable densities and (2) air and water with unit density. MC simulations were performed using the BEAMnrc code for the treatment head simulation and the DOSXYZnrc code for the patient dose calculation. The conversion of Dm to Dw by scaling with the stopping-power ratios of water to medium was also performed as a post-MC calculation step. The comparison of MC dose distributions calculated in conventional and simplified (water with variable densities) phantoms showed that the effect of material composition on dose-volume histograms (DVH) was less than 1% for soft tissue and about 2.5% near and inside bone structures. The effect of material density on the DVH was less than 1% for all tissues, as shown by comparing the MC distributions computed in the two simplified water phantoms. Additionally, MC dose distributions were compared with the predictions from an Eclipse treatment planning system (TPS), which employed a pencil beam convolution (PBC) algorithm with Modified Batho Power Law heterogeneity correction. Eclipse PBC and MC calculations (conventional and simplified phantoms) agreed well (<1%) for soft tissues. For the femoral heads, differences of up to 3% were observed between the DVH for Eclipse PBC and MC calculated in conventional phantoms. The use of the CT conversion ramp of water with variable densities for MC simulations showed no dose discrepancies (0.5%) with the PBC algorithm. Moreover, converting Dm to Dw using mass stopping-power ratios resulted in a significant shift (up to 6%) in the DVH for the femoral heads compared to the Eclipse PBC one. Our results show that, for prostate IMRT plans delivered with 6 MV photon beams, no conversion of MC dose from medium to water using stopping-power ratios is needed. In contrast, MC dose calculation using water with variable density may be a simple way to avoid the problems found with the dose conversion method based on the stopping-power ratio.
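The Dm-to-Dw post-processing step mentioned above amounts to a voxel-wise rescaling of the MC dose grid by the water-to-medium mass stopping-power ratio of each voxel's material. The sketch below shows the mechanics only; the material map and the ratio values are placeholders, not ICRU data or the values used in this work.

```python
import numpy as np

# Voxel-wise conversion D_w = D_m * (S/rho)^water_medium, applied as a
# post-processing step to an MC dose grid.  The ratios below are placeholders.
SPR_WATER_TO_MEDIUM = {
    "air": 1.00,
    "soft_tissue": 1.01,
    "bone": 1.10,
}

def dose_medium_to_water(dose_m, material_map):
    """Scale a dose-to-medium grid voxel by voxel with the water-to-medium
    mass stopping-power ratio of the material assigned to each voxel."""
    ratios = np.vectorize(SPR_WATER_TO_MEDIUM.get)(material_map)
    return dose_m * ratios

# toy 2x3 dose grid and material map (e.g. produced by a CT conversion ramp)
dose_m = np.array([[1.0, 2.0, 2.0],
                   [1.5, 2.5, 0.5]])
materials = np.array([["soft_tissue", "soft_tissue", "bone"],
                      ["soft_tissue", "bone", "air"]])
print(dose_medium_to_water(dose_m, materials))
```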
Abstract:
Multiprocessors, particularly in the form of multicores, are becoming standard building blocks for executing reliable software. But their use for applications with hard real-time requirements is non-trivial. Well-known real-time scheduling algorithms from the uniprocessor context (Rate-Monotonic [1] or Earliest-Deadline-First [1]) do not perform well on multiprocessors. For this reason the scientific community in the area of real-time systems has produced new algorithms specifically for multiprocessors. In the meantime, a proposal [2] exists for extending the Ada language with new basic constructs which can be used for implementing new real-time scheduling algorithms; the family of task-splitting algorithms, emphasized in that proposal [2], is one of them. Consequently, assessing whether existing task-splitting multiprocessor scheduling algorithms can be implemented with these constructs is paramount. In this paper we present a list of state-of-the-art task-splitting multiprocessor scheduling algorithms and, for each of them, we present detailed Ada code that uses the new constructs.
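For readers unfamiliar with the idea, the sketch below illustrates task splitting at the level of utilization assignment: tasks are packed onto processors first-fit, and a task whose utilization does not fit on the current processor has its leftover share carried over to the next one. It is written in Python purely for illustration; the paper's contribution is Ada code built on the proposed language constructs, which is not reproduced here.

```python
# First-fit assignment with task splitting: when a task's utilization does not
# fit on the current processor, the leftover share is carried to the next one.
# A schematic illustration of the task-splitting idea only.

def split_assign(utilizations, n_procs, capacity=1.0):
    """Return, per processor, a list of (task_id, utilization_share) pairs."""
    procs = [[] for _ in range(n_procs)]
    p, free = 0, capacity
    for tid, u in enumerate(utilizations):
        while u > 1e-12:
            if p >= n_procs:
                raise ValueError("task set does not fit on the platform")
            share = min(u, free)
            procs[p].append((tid, share))
            u -= share
            free -= share
            if free <= 1e-12:            # processor full: move on, task splits
                p, free = p + 1, capacity
    return procs

# five tasks on three unit-capacity processors; some tasks end up split
for cpu, load in enumerate(split_assign([0.6, 0.5, 0.4, 0.7, 0.3], 3)):
    print(f"CPU {cpu}: {load}")
```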
Abstract:
A MATLAB/SIMULINK-based simulator was employed for studies concerning the control of baker's yeast fed-batch fermentation. Four control algorithms were implemented and compared: the classical PID control, two discrete versions (the modified velocity and position algorithms), and a fuzzy control law. The simulation package proved to be an efficient tool for simulating and testing control strategies for this nonlinear process.
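As an illustration of one of the discrete algorithms mentioned above, the sketch below implements a velocity-form (incremental) PID loop in Python and closes it around a toy first-order plant. The gains, sampling time and plant are invented for the example; they are not the baker's yeast fed-batch model or the MATLAB/SIMULINK package used in the study.

```python
# Discrete velocity-form PID (one of the algorithm families compared above),
# shown in Python rather than MATLAB/SIMULINK; the gains and the first-order
# plant below are illustrative, not the baker's yeast fed-batch model.

Kp, Ki, Kd, dt = 1.2, 0.8, 0.05, 0.1
setpoint, tau = 1.0, 2.0            # toy first-order plant: y' = (-y + u)/tau

y, u, e1, e2 = 0.0, 0.0, 0.0, 0.0   # plant output, control signal, past errors
for k in range(100):                # 10 s of simulated closed loop
    e = setpoint - y
    # velocity (incremental) form: only the change in u is computed each step
    du = Kp * (e - e1) + Ki * dt * e + Kd * (e - 2 * e1 + e2) / dt
    u += du
    e2, e1 = e1, e
    y += dt * (-y + u) / tau        # forward-Euler step of the plant

print(f"output after 10 s: {y:.3f} (setpoint {setpoint})")
```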